Also see:
The quest to build a better AI tutor — from hechingerreport.org by Jill Barshay
Researchers make progress with an older ed tech idea: personalized practice
One promising idea has less to do with how an AI tutor explains concepts and more with what it asks students to practice next.
A team at the University of Pennsylvania, which included some AI skeptics, recently tested this approach in a study of close to 800 Taiwanese high school students learning Python programming. All the students used the same AI tutor, which was designed not to give away answers.
But there was one key difference. Half the students were randomly assigned to a fixed sequence of practice problems, progressing from easy to hard. The other half received a personalized sequence, with the AI tutor continuously adjusting the difficulty of each problem based on how the student was performing and interacting with the chatbot.
The idea is based on what educators call the “zone of proximal development.” When problems are too easy, students get bored. When they’re too hard, students get frustrated. The goal is to keep students in a sweet spot: challenged, but not overwhelmed.
The researchers found that students in the personalized group did better on a final exam than students in the fixed problem group. The difference was characterized as the equivalent of 6 to 9 months of additional schooling, an eye-catching claim for an after-school online course that lasted only five months.
…
To address this, Chung’s team combined a large language model with a separate machine-learning algorithm that analyzes how students interact with the online course platform — how they answer the practice questions, how many times they revise or edit their coding, and the quality of their conversations with the chatbot — and uses that information to decide which problem to serve up next.
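The mechanism described above can be sketched in a few lines. This is an illustrative sketch only: the study's actual model, features, and weights are not public, so every name and number below is an assumption.

```python
# Hypothetical sketch of an adaptive problem-difficulty selector.
# The signals (answer correctness, code revisions, chat quality) mirror
# the ones the article mentions; the weights and thresholds are invented.
from dataclasses import dataclass

@dataclass
class StudentState:
    correct_rate: float = 0.5   # share of recent answers that were correct
    revisions: int = 0          # how often the student revises their code
    chat_quality: float = 0.5   # 0..1 score for chatbot conversation quality
    difficulty: float = 0.5     # current problem difficulty, 0 (easy) to 1 (hard)

def next_difficulty(s: StudentState) -> float:
    """Nudge difficulty to keep the student in the 'zone of proximal
    development': harder when they are cruising, easier when they struggle."""
    # Combine interaction signals into one mastery estimate (assumed weights).
    mastery = (0.6 * s.correct_rate
               + 0.2 * s.chat_quality
               + 0.2 * max(0.0, 1 - s.revisions / 10))
    if mastery > 0.8:        # too easy -> raise difficulty
        step = 0.1
    elif mastery < 0.4:      # too hard -> lower difficulty
        step = -0.1
    else:                    # sweet spot -> hold steady
        step = 0.0
    return min(1.0, max(0.0, s.difficulty + step))
```

For example, a student answering 90% of recent problems correctly would see the next problem's difficulty tick upward, while one answering 10% correctly after many revisions would get an easier one.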
From DSC:
I have been proposing that the AI-based learning platform of the future will be constantly doing this — every single day. It will know what the in-demand skills are — at any given moment in time. It will then be able to direct you to resources that will help you gain those skills. Though in my vision, the system is querying actual/open job descriptions, not analyzing learning data from enterprise learners. Perhaps I should add that to the vision.
Coursera’s Job Skills Report 2026: Top skills for your students — from coursera.org
The Job Skills Report 2026 analyzes learning data from more than 6 million enterprise learners to identify the future job skills organizations need most. It’s designed for HR and L&D leaders; data, IT, and software & product development leaders; higher education administrators; and government agencies seeking actionable insights on workforce skills trends and AI-driven transformation.
…
Drawing on data from 6 million enterprise learners across nearly 7,000 organizations, the Job Skills Report 2026 guides you through the skills reshaping the global economy. This year’s analysis spans Data, IT, and Software & Product Development—and the Generative AI skills becoming essential for every role.
Here is Chris Martin’s posting on LinkedIn.com:
Here is Dominik Mate Kovacs’ posting on LinkedIn.com:
The AI ‘hivemind’: Why so many student essays sound alike — from hechingerreport.org by Jill Barshay
A study of more than 70 large language models found similar answers to brainstorming and creative writing prompts
The answers were frequently indistinguishable across different models by different companies that have different architectures and use different training data. The metaphors, imagery, word choices, sentence structures — even punctuation — often converged. Jiang’s team called this phenomenon “inter-model homogeneity” and quantified the overlaps and similarities. To drive the point home, Jiang titled her paper, the “Artificial Hivemind.” The study won a best paper award at the annual conference on Neural Information Processing Systems in December 2025, one of the premier gatherings for AI research.
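To make "quantified the overlaps and similarities" concrete, here is a minimal sketch of one way inter-model homogeneity could be measured: collect each model's answer to the same prompt and average the pairwise textual similarity. The paper's actual metrics are more sophisticated; `difflib`'s ratio is used here purely for illustration.

```python
# Illustrative only: average pairwise similarity of model answers to one prompt.
# A score near 1.0 means the models' answers have converged ("hivemind").
from difflib import SequenceMatcher
from itertools import combinations

def inter_model_similarity(answers: list[str]) -> float:
    """Mean pairwise similarity (0..1) across answers from different models."""
    pairs = list(combinations(answers, 2))
    if not pairs:
        return 0.0
    return sum(SequenceMatcher(None, a, b).ratio() for a, b in pairs) / len(pairs)

# Hypothetical answers from three different models to one creative prompt:
answers = [
    "The city hummed like a living thing beneath the neon rain.",
    "The city hummed like a living thing under the neon rain.",
    "Beneath the neon rain, the city hummed like a living thing.",
]
print(inter_model_similarity(answers))  # a high score signals convergence
```

Run across many prompts and many model pairs, a consistently high average is the kind of pattern the study reports.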
AI Has No Moral Compass. Do You? — from michelleweise.substack.com by Michelle Weise & Dana Walsh
Why the Age of AI Demands We Take Character Formation Seriously
Here’s something to chew on:
Anthropic, the company behind Claude — a chatbot used by 30 million users per month — has exactly one person (whom we know of) working on AI ethics. One. A young Scottish philosopher is doing the vital work of training a large language model to discern right from wrong.
I don’t say this to shame Anthropic. In fact, Anthropic appears to be the only company (that we know of) being explicit about the moral foundations and reasoning of its chatbot. Hundreds of millions of users worldwide are leveraging tools from other LLMs that do not appear to have an explicit moral compass being cultivated from within.
I raise this because this is yet another example of where we are: extraordinary technical power advancing without an equally strong moral infrastructure to support it.
Why do we keep producing people who are skilled but not wise?
Here is Pradnya’s posting out on LinkedIn.com:
From DSC…note these excerpts from Pradnya’s posting:
Pradnya links to a page out at ParadisoSolutions.com. Check out some of the functionality this AI-powered system provides:
U.S. Department of Labor Defines 5 Key Areas of AI Literacy — from campustechnology.com by Rhea Kelly
Key Takeaways
L&D Global Sentiment Survey 2026 — from linkedin.com by Donald H. Taylor
“But what’s happening right now is exponential.” — from linkedin.com by Josh Cavalier
Excerpt:
I need to be honest with you. I’ve been running experiments this week with Claude Code and Opus 4.6, and we have reached the precipice in the collapse of time required to produce high-quality text-based ID outputs.
This includes performance consulting reports, learning needs analyses, action mapping, scripts, storyboards, facilitator guides, rubrics, and technical specs.
I just mapped the entire performance consulting process into a multimodal AI integration architecture (diagram image). Every phase. Entry and contracting. Performance analysis. Cause analysis. Solution design. Implementation. Evaluation. Thirty files. System specifications for each. The next step is to vet out each “skill” with an expert performance consultant.
Then I attempted a learning output: an 8-module course built with a cognitive scaffold that moves beyond content delivery to facilitate deliberate practice, meaning-making, and guided reflection within the learner’s own context.
The result:
AI and human-centered learning — from linkedin.com by Patrick Blessinger
Democratizing opportunities
AI-driven adaptive learning tools can adjust instruction in real time. These tools have the potential to provide a more personalized learning experience, but only if used properly.
The California State University system uses ChatGPT Edu (OpenAI, 2025). Students use it for AI-assisted tutoring, study aids, and writing support. These resources provide 24/7 availability of subject-matter expertise tailored to students’ learning needs. It is not a replacement for professors. Rather, it extends the reach of mentorship by reducing access barriers.
However, we must proceed with intellectual humility and ethical responsibility. Even though AI can customize messages, it cannot replace the encouragement of a teacher or professor, or the social and emotional aspects of learning. It’s at the intersection of humanistic values and knowledge development that education must find its balance.
Jim VandeHei’s note to his kids: Blunt AI talk — from axios.com by CEO Jim VandeHei
Axios CEO Jim VandeHei wrote this note to his wife, Autumn, and their three kids. She suggested sharing it more broadly since so many families are wrestling with how to think and talk about AI. So here it is …
Dear Family:
I want to put to words what I’m hearing, seeing, thinking and writing about AI.
I’m not trying to frighten you. And I know your opinions range from wonderment to worry. That’s natural and OK. Our species isn’t wired for change of this speed or scale.
All of you must figure out how to master AI for any specific job or internship you hold or take. You’d be jeopardizing your future careers by not figuring out how to use AI to amplify and improve your work. You’d be wise to replace social media scrolling with LLM testing.
Be the very best at using AI for your gig.
Also see:
Also relevant/see:
From Rooms to Ecosystems: When Connection Becomes the Catalyst
Some gatherings change not just in size, but in meaning. What started as a small, intentional space to celebrate partners has grown into a moment that reflects how an entire ecosystem has matured. Each year, the room fills with more leaders, more relationships, and more shared language about what learning can look like when people are genuinely connected. It is less about an event on the calendar and more about what it represents: an education community that knows each other, trusts each other, and keeps showing up.
That kind of connection did not happen by accident. Through efforts like Get on the Bus, hosted by the Ewing Marion Kauffman Foundation, networking for education leaders has shifted from transactional to relational. Students lead. Stories anchor the work. Conversations happen across tables, sectors, and roles. System leaders, intermediaries, industry partners, and civic organizations are not passing business cards. They are building shared understanding and social capital that lasts long after the room clears.
This week’s newsletter carries that same energy. You will find examples of learning that travels beyond buildings, leadership conversations grounded in real tensions, and models that reflect what becomes possible when ecosystems are aligned. When people feel connected to one another and to a common purpose, the work gets clearer, stronger, and more human. That sense of belonging is not just powerful. It is foundational to what comes next.
Town Hall Recap: What’s Next in Learning 2026 — from gettingsmart.com by Tom Vander Ark, Nate McClennen, Shawnee Caruthers, Victoria Andrews
As we enter 2026, the Getting Smart team is diving deep into the convergence of human potential and technological opportunity. Our annual Town Hall isn’t just a forecast—it’s a roadmap for the year ahead. We will explore how human-centered AI is reshaping pedagogy, the power of participation, and the new realities of educational leadership. Join us as we define the new dispositions for future-ready educators and discover how to build meaningful, personalized pathways for every student.
Farewell to Traditional Universities | What AI Has in Store for Education
Premiered Jan 16, 2026
Description:
What if the biggest change in education isn’t a new app… but the end of the university monopoly on credibility?
Jensen Huang has framed AI as a platform shift—an industrial revolution that turns intelligence into infrastructure. And when intelligence becomes cheap, personal, and always available, education stops being a place you go… and becomes a system that follows you. The question isn’t whether universities will disappear. The question is whether the old model—high cost, slow updates, one-size-fits-all—can survive a world where every student can have a private tutor, a lab partner, and a curriculum designer on demand.
This video explores what AI has in store for education—and why traditional universities may need to reinvent themselves fast.
In this video you’ll discover:
From DSC:
There appears to be another, similar video with a different date and length. So I’m including that other recording here as well:
The End of Universities as We Know Them: What AI Is Bringing
Premiered Jan 27, 2026
What if universities don’t “disappear”… but lose their monopoly on learning, credentials, and opportunity?
AI is turning education into something radically different: personal, instant, adaptive, and always available. When every student can have a 24/7 tutor, a writing coach, a coding partner, and a study plan designed specifically for them, the old model—one professor, one curriculum, one pace for everyone—starts to look outdated. And the biggest disruption isn’t the classroom. It’s the credential. Because in an AI world, proof of skill can become more valuable than a piece of paper.
This video explores the end of universities as we know them: what AI is bringing, what will break, what will survive, and what replaces the traditional path.
In this video you’ll discover:
Global list of over 100 L&D conferences in 2026 — from donaldhtaylor.co.uk by Don Taylor
I’m a firm believer in conferences. This isn’t just because I have chaired the Learning Technologies Conference in London since 2000. It’s because they are invaluable in sustaining our community. So many in Learning and Development work alone or in small teams that building and maintaining personal contacts is crucial. For a number of years, I have kept a personal list of the Learning and Development conferences running internationally. This year, I thought it would be helpful to share it.
AI and the Work of Centers for Teaching and Learning — from derekbruff.org by Derek Bruff
In the same way, when I approach any kind of educational technology, I’m looking for tools that can be responsive to my pedagogical aims. The pedagogy should drive the technology use, not the other way around.
How to Design with AI in 2026 (based on the most compelling research published in 2025). — from linkedin.com by Dr. Philippa Hardman
How Your Learners *Actually* Learn with AI — from drphilippahardman.substack.com by Dr. Philippa Hardman
What 37.5 million AI chats show us about how learners use AI at the end of 2025 — and what this means for how we design & deliver learning experiences in 2026
Last week, Microsoft released a similar analysis of a whopping 37.5 million Copilot conversations. These conversations took place on the platform from January to September 2025, providing us with a window into whether and how AI use in general — and AI use among learners specifically — evolved in 2025.
Microsoft’s mass behavioural data gives us a detailed, global glimpse into what learners are actually doing across devices, times of day and contexts. The picture that emerges is pretty clear and largely consistent with what OpenAI told us back in the summer:
AI isn’t functioning primarily as an “answers machine”: the majority of us use AI as a tool to personalise and differentiate generic learning experiences and – ultimately – to augment human learning.
Let’s dive in!
Learners don’t “decide” to use AI anymore. They assume it’s there, like search, like spellcheck, like calculators. The question has shifted from “should I use this?” to “how do I use this effectively?”
8 AI Agents Every HR Leader Needs To Know In 2026 — from forbes.com by Bernard Marr
So where do you start? There are many agentic tools and platforms for AI tasks on the market, and the most effective approach is to focus on practical, high-impact workflows. So here, I’ll look at some of the most compelling use cases, as well as provide an overview of the tools that can help you quickly deliver tangible wins.
…
Some of the strongest opportunities in HR include: