Also see:
The quest to build a better AI tutor — from hechingerreport.org by Jill Barshay
Researchers make progress with an older ed tech idea: personalized practice
One promising idea has less to do with how an AI tutor explains concepts and more with what it asks students to practice next.
A team at the University of Pennsylvania, which included some AI skeptics, recently tested this approach in a study of close to 800 Taiwanese high school students learning Python programming. All the students used the same AI tutor, which was designed not to give away answers.
But there was one key difference. Half the students were randomly assigned to a fixed sequence of practice problems, progressing from easy to hard. The other half received a personalized sequence with the AI tutor continuously adjusting the difficulty of each problem based on how the student was performing and interacting with the chatbot.
The idea is based on what educators call the “zone of proximal development.” When problems are too easy, students get bored. When they’re too hard, students get frustrated. The goal is to keep students in a sweet spot: challenged, but not overwhelmed.
The researchers found that students in the personalized group did better on a final exam than students in the fixed problem group. The difference was characterized as the equivalent of 6 to 9 months of additional schooling, an eye-catching claim for an after-school online course that lasted only five months.
…
To address this, Chung’s team combined a large language model with a separate machine-learning algorithm that analyzes how students interact with the online course platform — how they answer the practice questions, how many times they revise or edit their coding, and the quality of their conversations with the chatbot — and uses that information to decide which problem to serve up next.
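The article does not describe the team's actual algorithm, but the core idea of nudging problem difficulty toward a target success rate can be sketched in a few lines. All feature names and thresholds below are illustrative assumptions, not details from the study:

```python
# Hypothetical sketch of adaptive problem sequencing: keep the learner in the
# "zone of proximal development" by adjusting difficulty toward a target
# success rate, while treating heavy code editing as productive struggle.

def next_difficulty(current: int, recent_correct: list[bool],
                    edit_count: int, target_rate: float = 0.7) -> int:
    """Return the difficulty level (1-10) for the next practice problem."""
    if not recent_correct:
        return current
    success_rate = sum(recent_correct) / len(recent_correct)
    # Many revisions suggest the student is already working hard at this level.
    struggling = edit_count > 5
    if success_rate > target_rate and not struggling:
        return min(current + 1, 10)   # too easy -> step up
    if success_rate < target_rate - 0.2:
        return max(current - 1, 1)    # too hard -> step down
    return current                    # in the sweet spot: stay put

# Example: 4 of the last 5 problems solved, with few edits -> raise difficulty.
print(next_difficulty(5, [True, True, True, True, False], edit_count=2))  # prints 6
```

A production system would weigh many more signals (including the quality of the student's chatbot conversations, as the article notes), but the control loop is the same: measure, compare to the sweet spot, adjust.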
Legal AI Access at 83%, But Trust Issues Remain — from artificiallawyer.com
A new survey of over 200 inhouse and law firm leaders provides solid evidence that while AI tools are now ‘standard’ across our sector, trust in AI outputs fundamentally drives usage, along with ROI – and vice versa.
The data, from ALSP Factor, shows that 83% had ‘broad AI access’, up from 61% in 2025. That in itself is a very positive development, telling us legal AI is becoming ubiquitous for commercial lawyers, with around 54% using such tools ‘often’.
Is the eLearning authoring tool dead? — from linkedin.com by Melissa Milloway & Tim Slade
…which links to the video below:
From DSC:
Postings/articles such as the one below make me ask: are we not shooting ourselves in the foot with AI and recent college graduates? If the bottom rungs continue to disappear, internships and apprenticeships can only go so far. There aren’t enough of them — especially valuable ones. So as this article points out, there will be threats to the long-term health of our talent pipelines unless we can take steps to thwart those impacts — and to do so fairly soon.
To me…vocational training and jobs are looking better all the time — i.e., plumbers, carpenters, electricians, mechanics, and more.
Can New Graduates Compete With AI? — from builtin.com by Richard Johnson
The increasing adoption of AI automation is compressing early-career jobs. How should new graduates get a foothold in the economy now?
Summary: AI is hollowing out entry-level roles by automating routine tasks, eliminating a rung on the career ladder. New graduates face intense competition and a rising skill floor. While firms gain short-term productivity, they risk a long-term talent shortage by eliminating junior training grounds.
Conversations about AI have covered all grounds: hype, fear and slop. But while some roll their eyes at yet another automation headline, soon-to-be graduates are watching the labor market with a very different level of urgency. They’re entering a world where the old paradox of needing experience to get experience is colliding with a new reality: AI is absorbing the standardized, routine tasks that once defined entry-level work. The result isn’t just a shift in job descriptions or skill requirements, but rather a structural reshaping of the career pipeline.
Entry-level workers face an outsized disruption to their long-term career trajectories. They have the least buffer to adapt given their lack of relevant job market experience and heightened financial pressure to secure a job quickly with the student-debt repayment periods for recent graduates looming.
Momentum early in one’s career matters, and the first job on a resume shapes future compensation bands and opportunities. It also serves as a signal for perceived specialization or, at minimum, interest. Losing that foothold has compounding effects on the rest of one’s career.
Also relevant/see:
New Anthropic Institute to Study Risks and Economic Effects of Advanced AI — from campustechnology.com by John K. Waters
Key Takeaways
Here is Chris Martin’s posting on LinkedIn.com:
Here is Dominik Mate Kovacs’ posting on LinkedIn.com:
The AI ‘hivemind’: Why so many student essays sound alike — from hechingerreport.org by Jill Barshay
A study of more than 70 large language models found similar answers to brainstorming and creative writing prompts
The answers were frequently indistinguishable across different models by different companies that have different architectures and use different training data. The metaphors, imagery, word choices, sentence structures — even punctuation — often converged. Jiang’s team called this phenomenon “inter-model homogeneity” and quantified the overlaps and similarities. To drive the point home, Jiang titled her paper the “Artificial Hivemind.” The study won a best paper award at the annual conference on Neural Information Processing Systems in December 2025, one of the premier gatherings for AI research.
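The paper uses its own metrics to quantify these overlaps; as a rough illustration of the idea, here is a minimal sketch that scores pairwise lexical overlap between two model answers using word-bigram Jaccard similarity (an assumption for demonstration purposes, not the study's actual method):

```python
# Toy measure of "inter-model homogeneity": how many word bigrams two
# models' answers to the same prompt share, as a fraction of all bigrams.

def word_ngrams(text: str, n: int = 2) -> set[tuple[str, ...]]:
    """Break text into a set of lowercase word n-grams."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str, n: int = 2) -> float:
    """Shared-bigram fraction: 0 = completely different, 1 = identical."""
    ga, gb = word_ngrams(a, n), word_ngrams(b, n)
    if not ga and not gb:
        return 1.0
    return len(ga & gb) / len(ga | gb)

# Two hypothetical answers to the same creative-writing prompt:
answer_a = "time is a river that carries us forward"
answer_b = "time is a river that never stops flowing"
print(round(jaccard(answer_a, answer_b), 2))  # prints 0.4
```

High scores across many model pairs and many prompts would be the signature of a "hivemind": different systems converging on the same phrasings.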
AI Has No Moral Compass. Do You? — from michelleweise.substack.com by Michelle Weise & Dana Walsh
Why the Age of AI Demands We Take Character Formation Seriously
Here’s something to chew on:
Anthropic, the company behind Claude — a chatbot with 30 million monthly users — has exactly one person (whom we know of) working on AI ethics. One. A young Scottish philosopher is doing the vital work of training a large language model to discern right from wrong.
I don’t say this to shame Anthropic. In fact, Anthropic appears to be the only company (that we know of) being explicit about the moral foundations and reasoning of its chatbot. Hundreds of millions of users worldwide are leveraging tools from other LLMs that do not appear to have an explicit moral compass being cultivated from within.
I raise this because this is yet another example of where we are: extraordinary technical power advancing without an equally strong moral infrastructure to support it.
Why do we keep producing people who are skilled but not wise?
The Future of College in an AI World — from linkedin.com by Jeff Selingo
In today’s issue: The tension over AI in higher ed; application inflation continues and testing is back; what’s the future of the original classroom technology, the learning management system.
Hundreds of higher ed and industry leaders gathered Tuesday for a summit on AI and the future of learning at the University of Michigan.
Conversations like the one we had at Michigan this week are necessary, but the action rarely matches the ambition.
AI Music Generators: Teaching With These Catchy AI Tools — from techlearning.com by Erik Ofgang
AI music generators are getting better and better, and there are more applications in the classroom as a result.
Are All AI Music Generators More Or Less The Same?
No. After experimenting with a few free ones, I found a wide range of quality with the same prompts.
Gemini is the only one I’d currently recommend. It’s user-friendly but limited and only creates 30-second clips. Other music generators could potentially outperform Gemini with prompt adjustments. The ones I tried did better with the instrumentals but struggled more with the lyrics, and that kind of defeated the purpose of the tool for me.
ChatDOC: Teaching With The AI Summarizing Tool — from techlearning.com by Erik Ofgang
ChatDOC lets users turn any PDF into an AI chatbot that can summarize the text, answer questions, and generate quizzes.
What Is ChatDOC?
ChatDOC is an AI designed to help users interact with PDFs of various types, be they research papers, short stories, or chapters from larger works. Users upload a PDF and then have the opportunity to “chat” with that document — that is, converse with a chatbot that bases its answers on the uploaded text.
ChatDOC can perform tasks such as providing a short summary, searching for specific terms, explaining the overall theme of a work of literature, or unpacking the science in a research paper.
Other similar tools are out there, but ChatDOC is definitely one of the better PDF readers I’ve used. Its free version is quick and easy to use, and delivers on its promise of providing an AI that can discuss a given document with users and even quiz them on it.
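Under the hood, chat-with-a-document tools like this ground the chatbot's answers by first retrieving the passage of the document most relevant to the question. ChatDOC's actual pipeline (likely embeddings plus an LLM) is not public; the stdlib-only sketch below illustrates just the retrieval idea with simple keyword overlap:

```python
# Toy retrieval step behind "chat with a PDF": split the document into
# fixed-size chunks and return the chunk that shares the most words with
# the user's question. Real tools use embeddings; this only shows grounding.

def best_chunk(document: str, question: str, chunk_size: int = 40) -> str:
    """Return the document chunk most relevant to the question."""
    words = document.split()
    chunks = [" ".join(words[i:i + chunk_size])
              for i in range(0, len(words), chunk_size)]
    q_terms = set(question.lower().split())

    def score(chunk: str) -> int:
        # Count question words that appear in this chunk.
        return len(q_terms & set(chunk.lower().split()))

    return max(chunks, key=score)

doc = ("Photosynthesis converts light energy into chemical energy. "
       "Mitochondria produce ATP through cellular respiration.")
print(best_chunk(doc, "how do cells produce ATP", chunk_size=8))
```

The retrieved chunk is then handed to the language model as context, which is why such tools can answer questions about a document the model never saw in training.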
From AI access to workforce readiness — from chieflearningofficer.com by Johnny Hamilton, Amy Stratbucker, & Brad Bigelow
Is your workforce using the right tool with an outdated mindset and playbook? Why old playbooks fall short — and what learning leaders must do next.
The leadership opportunity
Organizations do not need to predict every future AI capability. They need systems that allow people to explore with curiosity, practice safely, reflect deeply and adapt continuously — starting with what they already have and extending as capabilities evolve.
For CLOs, this is a moment to lead from the center of change — designing workforce readiness that keeps pace with accelerating technology while making work more rewarding for employees and more valuable for the organization. That is how AI moves from the promise of transformation to demonstrated readiness and, ultimately, from promise to performance.
Addendums on 3/19/26:
How to Build Practice-Based Learning Activities with AI — from drphilippahardman.substack.com by Dr Philippa Hardman
Four evidence-based methods for designing, building & deploying active learning activities with your favourite LLM
Most L&D teams are using AI to make content faster. The real opportunity is using it as a practice engine.
The Synthesia 2026 AI in L&D Report found that the fastest-growing areas of planned AI adoption aren’t in content creation — they’re in assessments and simulations (36%), adaptive pathways (33%), and AI tutors (29%). In other words: L&D teams are starting to realise that the most powerful use of AI isn’t producing learning materials. It’s creating environments where learners actually practise.
And you can build these right now — no dev team, no custom platform, no code. Each method below includes a prompt you can paste into your preferred AI tool to generate a working interactive prototype: a self-contained practice activity with a briefing screen, a live AI interaction, and a debrief — all running in the browser, ready to share with stakeholders or deploy to learners.
OpenAI Adds Interactive Math and Science Learning Tools to ChatGPT — from campustechnology.com by Rhea Kelly
Key Takeaways
Teach Smarter with AI — from wondertools.substack.com by Jeremy Caplan and Lance Eaton
10 tested strategies from two educators who actually use them
I recently talked with Lance Eaton, Senior Associate Director of AI and Teaching & Learning at Northeastern University and writer of AI + Education = Simplified. We traded ideas about what’s actually working. We came up with 10 specific, practical ways anyone who teaches, coaches, or leads can put AI to work.
Watch the full conversation above, or read highlights below.
Beyond Audio Summaries: How to Use NotebookLM to *Actually* Design Better Learning — from drphilippahardman.substack.com by Dr. Philippa Hardman
Five methods to maximise the value of NotebookLM’s features
In practice, what makes NotebookLM different for learning designers is four things:
…
5 Evidence-Based Methods NotebookLM Operationalises…
Shadow AI Isn’t a Threat: It’s a Signal — from campustechnology.com by Damien Eversmann
Unofficial AI use on campus reveals more about institutional gaps than misbehavior.
Key Takeaways
How L&D Can Lead in the Age of AI Even If Your Company’s Not Ready — from learningguild.com
How to lead even when your company doesn’t allow AI
Even if your corporation isn’t ready for AI, you can still research tools personally to stay ahead of the curve, so when organizational restrictions lift, you are ready to use AI for learning right away. Here are some tools you can test at home if they’re restricted in your workplace:
The Higher Ed Playbook for AI Affordability — from campustechnology.com by Jason Dunn-Potter
Key Takeaways
Claude is quietly becoming the go-to AI tool for learning designers. Here’s a 101 guide. — from Linkedin.com by Dr. Philippa Hardman
L&D Global Sentiment Survey 2026 — from linkedin.com by Donald H. Taylor
“But what’s happening right now is exponential.” — from linkedin.com by Josh Cavalier
Excerpt:
I need to be honest with you. I’ve been running experiments this week with Claude Code and Opus 4.6, and we have reached the precipice in the collapse of time required to produce high-quality text-based ID outputs.
This includes performance consulting reports, learning needs analyses, action mapping, scripts, storyboards, facilitator guides, rubrics, and technical specs.
I just mapped the entire performance consulting process into a multimodal AI integration architecture (diagram image). Every phase. Entry and contracting. Performance analysis. Cause analysis. Solution design. Implementation. Evaluation. Thirty files. System specifications for each. The next step is to vet out each “skill” with an expert performance consultant.
Then I attempted a learning output: an 8-module course built with a cognitive scaffold that moves beyond content delivery to facilitate deliberate practice, meaning-making, and guided reflection within the learner’s own context.
The result:
AI and human-centered learning — from linkedin.com by Patrick Blessinger
Democratizing opportunities
AI-driven adaptive learning tools can adjust instruction in real time. These tools have the potential to provide a more personalized learning experience, but only if used properly.
The California State University system uses ChatGPT Edu (OpenAI, 2025). Students use it for AI-assisted tutoring, study aids, and writing support. These resources provide 24/7 availability of subject-matter expertise tailored to students’ learning needs. It is not a replacement for professors. Rather, it extends the reach of mentorship by reducing access barriers.
However, we must proceed with intellectual humility and ethical responsibility. Even though AI can customize messages, it cannot replace the encouragement of a teacher or professor, or the social and emotional aspects of learning. It’s at the intersection of humanistic values and knowledge development that education must find its balance.
The Campus AI Crisis — by Jeffrey Selingo; via Ryan Craig
Young graduates can’t find jobs. Colleges know they have to do something. But what?
Only now are colleges realizing that the implications of AI are much greater and are already outrunning their institutional ability to respond. As schools struggle to update their curricula and classroom policies, they also confront a deeper problem: the suddenly enormous gap between what they say a degree is for and what the labor market now demands. In that mismatch, students are left to absorb the risk. Alina McMahon and millions of other Gen-Zers like her are caught in a muddled in-between moment: colleges only just beginning to think about how to adapt and redefine their mission in the post-AI world, and a job market that’s changing much, much faster.
“Colleges and universities face an existential issue before them,” said Ryan Craig, author of Apprentice Nation and managing director of a firm that invests in new educational models. “They need to figure out how to integrate relevant, in-field, and hopefully paid work experience for every student, and hopefully multiple experiences before they graduate.”
Kling 3.0 just launched. The best video model yet. — from heatherbcooper.substack.com by Heather Cooper
& workflows from Imagine Art 1.5 pro, Pixverse Real-Time Video & Genspark
In today’s edition:
Kling 3.0: Everyone a Director
Kling just dropped version 3.0, and it’s a legitimate leap forward for AI video production (Kling is the GOAT). After spending early access time testing the new capabilities, I can confirm this is the most significant update to video generation tools I’ve seen in months.
Key highlights:
Jim VandeHei’s note to his kids: Blunt AI talk — from axios.com by CEO Jim VandeHei
Axios CEO Jim VandeHei wrote this note to his wife, Autumn, and their three kids. She suggested sharing it more broadly since so many families are wrestling with how to think and talk about AI. So here it is …
Dear Family:
I want to put to words what I’m hearing, seeing, thinking and writing about AI.
I’m not trying to frighten you. And I know your opinions range from wonderment to worry. That’s natural and OK. Our species isn’t wired for change of this speed or scale.
All of you must figure out how to master AI for any specific job or internship you hold or take. You’d be jeopardizing your future careers by not figuring out how to use AI to amplify and improve your work. You’d be wise to replace social media scrolling with LLM testing.
Be the very best at using AI for your gig.
Also see:
Also relevant/see:
From Rooms to Ecosystems: When Connection Becomes the Catalyst
Some gatherings change not just in size, but in meaning. What started as a small, intentional space to celebrate partners has grown into a moment that reflects how an entire ecosystem has matured. Each year, the room fills with more leaders, more relationships, and more shared language about what learning can look like when people are genuinely connected. It is less about an event on the calendar and more about what it represents: an education community that knows each other, trusts each other, and keeps showing up.
That kind of connection did not happen by accident. Through efforts like Get on the Bus, hosted by the Ewing Marion Kauffman Foundation, networking for education leaders has shifted from transactional to relational. Students lead. Stories anchor the work. Conversations happen across tables, sectors, and roles. System leaders, intermediaries, industry partners, and civic organizations are not passing business cards. They are building shared understanding and social capital that lasts long after the room clears.
This week’s newsletter carries that same energy. You will find examples of learning that travels beyond buildings, leadership conversations grounded in real tensions, and models that reflect what becomes possible when ecosystems are aligned. When people feel connected to one another and to a common purpose, the work gets clearer, stronger, and more human. That sense of belonging is not just powerful. It is foundational to what comes next.
Town Hall Recap: What’s Next in Learning 2026 — from gettingsmart.com by Tom Vander Ark, Nate McClennen, Shawnee Caruthers, Victoria Andrews
As we enter 2026, the Getting Smart team is diving deep into the convergence of human potential and technological opportunity. Our annual Town Hall isn’t just a forecast—it’s a roadmap for the year ahead. We will explore how human-centered AI is reshaping pedagogy, the power of participation, and the new realities of educational leadership. Join us as we define the new dispositions for future-ready educators and discover how to build meaningful, personalized pathways for every student.