More than a quarter of private colleges are at risk of closing, new projection shows — from hechingerreport.org by Jon Marcus
As one Vermont college finishes its last semester, an estimated 442 others may be in trouble

A new estimate projects that 442 of the nation’s 1,700 private, nonprofit four-year colleges and universities, with a combined 670,000 students, are at risk of closing or having to merge within the next 10 years.

More than 120 institutions are at the very highest risk, according to the forecast, by Huron Consulting Group, which analyzed enrollment trends, tuition revenue, assets, debt, cash on hand and other measures. Many are, like Vermont's Sterling College, small and rural.

“We have too many seats. We have too many classrooms,” said Peter Stokes, a managing director at Huron. “So over the coming five to 10 years, this shakeout is going to take place.” 

 

The quest to build a better AI tutor — from hechingerreport.org by Jill Barshay
Researchers make progress with an older ed tech idea: personalized practice

One promising idea has less to do with how an AI tutor explains concepts and more to do with what it asks students to practice next.

A team at the University of Pennsylvania, which included some AI skeptics, recently tested this approach in a study of close to 800 Taiwanese high school students learning Python programming. All the students used the same AI tutor, which was designed not to give away answers.

But there was one key difference. Half the students were randomly assigned to a fixed sequence of practice problems, progressing from easy to hard. The other half received a personalized sequence with the AI tutor continuously adjusting the difficulty of each problem based on how the student was performing and interacting with the chatbot.

The idea is based on what educators call the “zone of proximal development.” When problems are too easy, students get bored. When they’re too hard, students get frustrated. The goal is to keep students in a sweet spot: challenged, but not overwhelmed.

The researchers found that students in the personalized group did better on a final exam than students in the fixed problem group. The difference was characterized as the equivalent of 6 to 9 months of additional schooling, an eye-catching claim for an after-school online course that lasted only five months.

To address this, Chung’s team combined a large language model with a separate machine-learning algorithm that analyzes how students interact with the online course platform — how they answer the practice questions, how many times they revise or edit their coding, and the quality of their conversations with the chatbot — and uses that information to decide which problem to serve up next.
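In spirit, that kind of adjustment loop might look like the sketch below. This is purely illustrative: the class names, thresholds, and mastery heuristic are all hypothetical, not the Penn team's actual algorithm, which combined a large language model with a separate machine-learning model.

```python
# Hypothetical sketch of adaptive problem sequencing: estimate mastery from
# recent interactions, then nudge difficulty to keep the learner in the
# "zone of proximal development" (challenged, but not overwhelmed).

from dataclasses import dataclass, field

@dataclass
class LearnerState:
    recent_correct: list = field(default_factory=list)  # 1 = solved, 0 = missed
    edit_counts: list = field(default_factory=list)     # code revisions per problem

    def mastery(self) -> float:
        """Rough mastery estimate: recent accuracy, discounted by heavy editing."""
        if not self.recent_correct:
            return 0.5  # no data yet: assume middling mastery
        accuracy = sum(self.recent_correct) / len(self.recent_correct)
        avg_edits = sum(self.edit_counts) / max(len(self.edit_counts), 1)
        return max(0.0, min(1.0, accuracy - 0.05 * avg_edits))

def next_difficulty(state: LearnerState, current: int, levels: int = 10) -> int:
    """Step difficulty up when mastery is high (bored), down when low (frustrated)."""
    m = state.mastery()
    if m > 0.8:
        return min(levels, current + 1)
    if m < 0.4:
        return max(1, current - 1)
    return current  # in the sweet spot: stay at this level

# A learner who has been solving problems with few revisions gets a harder one:
state = LearnerState(recent_correct=[1, 1, 1, 1], edit_counts=[1, 0, 2, 1])
print(next_difficulty(state, current=5))
```

The key design idea the study tests is exactly this feedback loop: the signal for choosing the next problem comes from how the student interacts with the platform, not just whether the last answer was right.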

 

From DSC:
I have been proposing that the AI-based learning platform of the future will be constantly doing this — every single day. It will know what the in-demand skills are — at any given moment in time. It will then be able to direct you to resources that will help you gain those skills. Though in my vision, the system is querying actual/open job descriptions, not analyzing learning data from enterprise learners. Perhaps I should add that to the vision.


Coursera’s Job Skills Report 2026: Top skills for your students — from coursera.org

The Job Skills Report 2026 analyzes learning data from more than 6 million enterprise learners to identify the future job skills organizations need most. It’s designed for HR and L&D leaders; data, IT, and software & product development leaders; higher education administrators; and government agencies seeking actionable insights on workforce skills trends and AI-driven transformation.

Drawing on data from 6 million enterprise learners across nearly 7,000 organizations, the Job Skills Report 2026 guides you through the skills reshaping the global economy. This year’s analysis spans Data, IT, and Software & Product Development—and the Generative AI skills becoming essential for every role.

 

Here is Chris Martin’s posting on LinkedIn.com:


Here is Dominik Mate Kovacs’ posting on LinkedIn.com:


The AI ‘hivemind’: Why so many student essays sound alike — from hechingerreport.org by Jill Barshay
A study of more than 70 large language models found similar answers to brainstorming and creative writing prompts

The answers were frequently indistinguishable across different models by different companies that have different architectures and use different training data. The metaphors, imagery, word choices, sentence structures — even punctuation — often converged. Jiang’s team called this phenomenon “inter-model homogeneity” and quantified the overlaps and similarities. To drive the point home, Jiang titled her paper the “Artificial Hivemind.” The study won a best paper award at the annual conference on Neural Information Processing Systems in December 2025, one of the premier gatherings for AI research.
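The kind of quantification the team performed can be illustrated with a toy metric. The paper's actual measures are more sophisticated; the sample answers and the word-overlap (Jaccard) similarity below are purely illustrative of how one might score convergence across model outputs.

```python
# Illustrative only: quantify "inter-model homogeneity" by averaging pairwise
# word-set overlap (Jaccard similarity) between different models' answers to
# the same prompt. Higher scores mean the models converged on similar wording.

from itertools import combinations

def jaccard(a: str, b: str) -> float:
    """Word-set overlap between two texts: 1.0 = identical vocabulary."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def mean_pairwise_similarity(answers: list[str]) -> float:
    """Average similarity across all pairs of model answers for one prompt."""
    pairs = list(combinations(answers, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Hypothetical answers from three different models to one creative prompt:
answers = [
    "the city slept beneath a blanket of silver fog",
    "the city slept beneath a quilt of silver mist",
    "beneath silver fog the sleeping city waited",
]
print(round(mean_pairwise_similarity(answers), 2))
```

Running the same scoring over many prompts and many model pairs, and finding consistently high averages, is the shape of the evidence behind the "hivemind" claim.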


AI Has No Moral Compass. Do You? — from michelleweise.substack.com by Michelle Weise & Dana Walsh
Why the Age of AI Demands We Take Character Formation Seriously

Here’s something to chew on:

Anthropic, the company behind Claude — a chatbot used by 30 million users per month — has exactly one person (whom we know of) working on AI ethics. One. A young Scottish philosopher is doing the vital work of training a large language model to discern right from wrong.

I don’t say this to shame Anthropic. In fact, Anthropic appears to be the only company (that we know of) being explicit about the moral foundations and reasoning of its chatbot. Hundreds of millions of users worldwide are leveraging tools from other LLMs that do not appear to have an explicit moral compass being cultivated from within.

I raise this because this is yet another example of where we are: extraordinary technical power advancing without an equally strong moral infrastructure to support it.

Why do we keep producing people who are skilled but not wise?

 

The Future of College in an AI World — from linkedin.com by Jeff Selingo
In today’s issue: The tension over AI in higher ed; application inflation continues and testing is back; what’s the future of the original classroom technology, the learning management system. 


Hundreds of higher ed and industry leaders gathered Tuesday for a summit on AI and the future of learning at the University of Michigan.

Conversations like the one we had at Michigan this week are necessary, but the action rarely matches the ambition.

  • We say the humanities are the operating system of an AI world, yet students and parents don’t believe it. They’re voting with their feet toward STEM, business, and narrowly tailored majors they believe will lead to a job.
  • Meanwhile, colleges are quietly eliminating the very humanities degrees the panelists were championing, employers are cutting the entry rungs off the career ladder for new graduates, and as Podium Education co-founder Christopher Parrish reminded us yesterday, there’s a yawning gap between demand for experience and the internships that actually exist.


AI Music Generators: Teaching With These Catchy AI Tools — from techlearning.com by Erik Ofgang
AI music generators are getting better and better, and there are more applications in the classroom as a result.

Are All AI Music Generators More Or Less The Same?
No. After experimenting with several free ones, I found a wide range of quality with the same prompts.

Gemini is the only one I’d currently recommend. It’s user-friendly but limited and only creates 30-second clips. Other music generators could potentially outperform Gemini with prompt adjustments. The ones I tried did better with the instrumentals but struggled more with the lyrics, and that kind of defeated the purpose of the tool for me.


ChatDOC: Teaching With The AI Summarizing Tool — from techlearning.com by Erik Ofgang
ChatDOC lets users turn any PDF into an AI chatbot that can summarize the text, answer questions, and generate quizzes.

What Is ChatDOC?
ChatDOC is an AI designed to help users interact with PDFs of various types, be it research papers, short stories, or chapters from larger works. Users upload a PDF and then have the opportunity to “chat” with that document, that is, speak with a chatbot that bases its answers on the uploaded text.

ChatDOC can perform tasks such as providing a short summary, searching for specific terms, explaining the overall theme if it’s a work of literature, or unpacking the science in a research paper.

Other similar tools are out there, but ChatDOC is definitely one of the better PDF readers I’ve used. Its free version is quick and easy to use, and delivers on its promise of providing an AI that can discuss a given document with users and even quiz them on it.


From AI access to workforce readiness — from chieflearningofficer.com by Johnny Hamilton, Amy Stratbucker, & Brad Bigelow
Is your workforce using the right tool with an outdated mindset and playbook? Why old playbooks fall short — and what learning leaders must do next.

The leadership opportunity
Organizations do not need to predict every future AI capability. They need systems that allow people to explore with curiosity, practice safely, reflect deeply and adapt continuously — starting with what they already have and extending as capabilities evolve.

For CLOs, this is a moment to lead from the center of change — designing workforce readiness that keeps pace with accelerating technology while making work more rewarding for employees and more valuable for the organization. That is how AI moves from the promise of transformation to demonstrated readiness and, ultimately, from promise to performance.


Addendums on 3/19/26:
How to Build Practice-Based Learning Activities with AI — from drphilippahardman.substack.com by Dr Philippa Hardman
Four evidence-based methods for designing, building & deploying active learning activities with your favourite LLM

Most L&D teams are using AI to make content faster. The real opportunity is using it as a practice engine.

The Synthesia 2026 AI in L&D Report found that the fastest-growing areas of planned AI adoption aren’t in content creation — they’re in assessments and simulations (36%), adaptive pathways (33%), and AI tutors (29%). In other words: L&D teams are starting to realise that the most powerful use of AI isn’t producing learning materials. It’s creating environments where learners actually practise.

And you can build these right now — no dev team, no custom platform, no code. Each method below includes a prompt you can paste into your preferred AI tool to generate a working interactive prototype: a self-contained practice activity with a briefing screen, a live AI interaction, and a debrief — all running in the browser, ready to share with stakeholders or deploy to learners.

OpenAI Adds Interactive Math and Science Learning Tools to ChatGPT — from campustechnology.com by Rhea Kelly

Key Takeaways

  • ChatGPT adds interactive learning tools: OpenAI introduced interactive math and science visualizations that allow users to explore formulas, variables, and relationships in real time.
  • The tool currently covers over 70 core math and science topics and is aimed initially at high school and college-level learners.
  • Users can adjust variables, manipulate formulas, and immediately see how changes affect graphs and outcomes.
 

Americans’ retirement accounts – and hardship withdrawals – hit new highs. Here’s what to know — from weforum.org by Spencer Feingold

  • Last year, US retirement account balances rose at double-digit rates, driven by strong market performance and steady contributions.
  • At the same time, hardship withdrawals increased, highlighting growing short-term financial stress.
  • The trend underscores the importance of financial education and resilience to support long-term retirement security.

From DSC:
I’m hoping that we are doing a better job in the United States on educating our youth on investing, saving, and developing better legal knowledge (i.e., the need for wills, estate planning, trusts, etc.).

 

 
 

“But what’s happening right now is exponential.” — from linkedin.com by Josh Cavalier

Excerpt:

I need to be honest with you. I’ve been running experiments this week with Claude Code and Opus 4.6, and we have reached the precipice in the collapse of time required to produce high-quality text-based ID outputs.

This includes performance consulting reports, learning needs analyses, action mapping, scripts, storyboards, facilitator guides, rubrics, and technical specs.

I just mapped the entire performance consulting process into a multimodal AI integration architecture (diagram image). Every phase. Entry and contracting. Performance analysis. Cause analysis. Solution design. Implementation. Evaluation. Thirty files. System specifications for each. The next step is to vet out each “skill” with an expert performance consultant.

Then I attempted a learning output: an 8-module course built with a cognitive scaffold that moves beyond content delivery to facilitate deliberate practice, meaning-making, and guided reflection within the learner’s own context.

The result:



AI and human-centered learning — from linkedin.com by Patrick Blessinger

Democratizing opportunities

AI-based adaptive learning tools can adjust instruction in real time. These tools have the potential to provide a more personalized learning experience, but only if used properly.

The California State University system uses ChatGPT Edu (OpenAI, 2025). Students use it for AI-assisted tutoring, study aids, and writing support. These resources provide 24/7 availability of subject-matter expertise tailored to students’ learning needs. It is not a replacement for professors. Rather, it extends the reach of mentorship by reducing access barriers.

However, we must proceed with intellectual humility and ethical responsibility. Even though AI can customize messages, it cannot replace the encouragement of a teacher or professor, or the social and emotional aspects of learning. It’s at the intersection of humanistic values and knowledge development that education must find its balance.

 

Something Big Is Happening — from shumer.dev by Matt Shumer; see below from the BIG Questions Institute, where I got this article from

I’ve spent six years building an AI startup and investing in the space. I live in this world. And I’m writing this for the people in my life who don’t… my family, my friends, the people I care about who keep asking me “so what’s the deal with AI?” and getting an answer that doesn’t do justice to what’s actually happening. I keep giving them the polite version. The cocktail-party version. Because the honest version sounds like I’ve lost my mind. And for a while, I told myself that was a good enough reason to keep what’s truly happening to myself. But the gap between what I’ve been saying and what is actually happening has gotten far too big. The people I care about deserve to hear what is coming, even if it sounds crazy.


They’ve now done it. And they’re moving on to everything else.

The experience that tech workers have had over the past year, of watching AI go from “helpful tool” to “does my job better than I do”, is the experience everyone else is about to have. Law, finance, medicine, accounting, consulting, writing, design, analysis, customer service. Not in ten years. The people building these systems say one to five years. Some say less. And given what I’ve seen in just the last couple of months, I think “less” is more likely.

The models available today are unrecognizable from what existed even six months ago. The debate about whether AI is “really getting better” or “hitting a wall” — which has been going on for over a year — is over. It’s done. Anyone still making that argument either hasn’t used the current models, has an incentive to downplay what’s happening, or is evaluating based on an experience from 2024 that is no longer relevant. I don’t say that to be dismissive. I say it because the gap between public perception and current reality is now enormous, and that gap is dangerous… because it’s preventing people from preparing.


What “Something Big Is Happening” Means for Schools — from/by the BIG Questions Institute
Matt Shumer’s newsletter post Something Big is Happening was read over 80 million times within a week of its publication on February 9.

Still, it’s worth reading Shumer’s post. Given the claims and warnings in Something Big Is Happening (and countless other articles), how would you truly, honestly respond to these questions:

  • What will the purpose of school be in 5 years?
  • What are we doing now that we must leave behind right away?
  • What can we leave behind gradually?
  • What does rigor look like in this AI-powered world?
  • Does our strategy look like making adjustments at the margins or are we preparing our students for a fundamental shift?
  • What is our definition of success? How do the implications of AI and jobs (and other important forces, from geopolitical shifts and climate change, to mental health needs and shifting generational values) impact the outcomes we prioritize? What is the story of success we want to pass on to our students and wider community?
 

Farewell to Traditional Universities | What AI Has in Store for Education

Premiered Jan 16, 2026

Description:

What if the biggest change in education isn’t a new app… but the end of the university monopoly on credibility?

Jensen Huang has framed AI as a platform shift—an industrial revolution that turns intelligence into infrastructure. And when intelligence becomes cheap, personal, and always available, education stops being a place you go… and becomes a system that follows you. The question isn’t whether universities will disappear. The question is whether the old model—high cost, slow updates, one-size-fits-all—can survive a world where every student can have a private tutor, a lab partner, and a curriculum designer on demand.

This video explores what AI has in store for education—and why traditional universities may need to reinvent themselves fast.

In this video you’ll discover:

  • How AI tutors could deliver personalized learning at scale
  • Why credentials may shift from “degrees” to proof-of-skill portfolios
  • What happens when the “middle” of studying becomes automated
  • How universities could evolve: research hubs, networks, and high-trust credentialing
  • The risks: cheating, dependency, bias, and widening inequality
  • The 3 skills that become priceless when information is everywhere: judgment, curiosity, and responsibility

From DSC:
There appears to be another, similar video, but with a different date and length. So I’m including this other recording here as well:


The End of Universities as We Know Them: What AI Is Bringing

Premiered Jan 27, 2026

What if universities don’t “disappear”… but lose their monopoly on learning, credentials, and opportunity?

AI is turning education into something radically different: personal, instant, adaptive, and always available. When every student can have a 24/7 tutor, a writing coach, a coding partner, and a study plan designed specifically for them, the old model—one professor, one curriculum, one pace for everyone—starts to look outdated. And the biggest disruption isn’t the classroom. It’s the credential. Because in an AI world, proof of skill can become more valuable than a piece of paper.

This video explores the end of universities as we know them: what AI is bringing, what will break, what will survive, and what replaces the traditional path.

In this video you’ll discover:

  • Why AI tutoring could outperform one-size-fits-all lectures
  • How “degrees” may shift into skill proof: portfolios, projects, and verified competency
  • What happens when the “middle” of studying becomes automated
  • How universities may evolve: research hubs, networks, high-trust credentialing
  • The dark side: cheating, dependency, inequality, and biased evaluation
  • The new advantage: judgment, creativity, and responsibility in a world of instant answers
 

The Learning and Employment Records (LER) Report for 2026: Building the infrastructure between learning and work — from smartresume.com; with thanks to Paul Fain for this resource

Executive Summary (excerpt)

This report documents a clear transition now underway: LERs are moving from small experiments to systems people and organizations expect to rely on. Adoption remains early and uneven, but the forces reshaping the ecosystem are no longer speculative. Federal policy signals, state planning cycles, standards maturation, and employer behavior are aligning in ways that suggest 2026 will mark a shift from exploration to execution.

Across interviews with federal leaders, state CIOs, standards bodies, and ecosystem builders, a consistent theme emerged: the traditional model—where institutions control learning and employment records—no longer fits how people move through education and work. In its place, a new model is being actively designed—one in which individuals hold portable, verifiable records that systems can trust without centralizing control.

Most states are not yet operating this way. But planning timelines, RFP language, and federal signals indicate that many will begin building toward this model in early 2026.

As the ecosystem matures, another insight becomes unavoidable: records alone are not enough. Value emerges only when trusted records can be interpreted through shared skill languages, reused across contexts, and embedded into the systems and marketplaces where decisions are made.

Learning and Employment Records are not a product category. They are a data layer—one that reshapes how learning, work, and opportunity connect over time.

This report is written for anyone seeking to understand how LERs are beginning to move from concept to practice. Whether readers are new to the space or actively exploring implementation, the report focuses on observable signals, emerging patterns, and the practical conditions required to move from experimentation toward durable infrastructure.
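The core of the "data layer" idea above is that an individual can carry a record any verifier can check without a central registry lookup. Here is a minimal sketch of that property. Real LER standards (W3C Verifiable Credentials, 1EdTech Open Badges 3.0) use public-key signatures and richer schemas; the shared-secret HMAC, field names, and issuer below are illustrative only.

```python
# Minimal sketch of a portable, verifiable learning record: the issuer signs
# the record, so any verifier holding the key can confirm integrity directly.
# Real systems use asymmetric key pairs; HMAC keeps this example self-contained.

import hashlib
import hmac
import json

ISSUER_KEY = b"demo-shared-secret"  # hypothetical; real issuers use key pairs

def issue_record(record: dict, key: bytes = ISSUER_KEY) -> dict:
    """Attach a signature so the record can travel with the learner."""
    payload = json.dumps(record, sort_keys=True).encode()
    sig = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"record": record, "signature": sig}

def verify_record(signed: dict, key: bytes = ISSUER_KEY) -> bool:
    """Verify integrity locally — no central registry lookup needed."""
    payload = json.dumps(signed["record"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["signature"])

credential = issue_record({
    "learner": "Jane Doe",                      # hypothetical learner
    "skill": "Python programming",
    "issuer": "Example Community College",      # hypothetical issuer
})
print(verify_record(credential))  # an untampered record verifies
```

The point of the sketch is the trust model, not the crypto: verification happens wherever the record travels, which is what lets LERs function as infrastructure rather than as one vendor's product.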

 

“The building blocks for a global, interoperable skills ecosystem are already in place. As education and workforce alignment accelerates, the path toward trusted, machine-readable credentials is clear. The next phase depends on credentials that carry value across institutions, industries, states, and borders; credentials that move with learners wherever their education and careers take them. The question now isn’t whether to act, but how quickly we move.”

– Curtiss Barnes, Chief Executive Officer, 1EdTech

 


The above item was from Paul Fain’s recent posting, which includes the following excerpt:

SmartResume just published a guide for making sense of this rapidly expanding landscape. The LER Ecosystem Report was produced in partnership with AACRAO, Credential Engine, 1EdTech, HR Open Standards, and the U.S. Chamber of Commerce Foundation. It was based on interviews and feedback gathered over three years from 100+ leaders across education, workforce, government, standards bodies, and tech providers.

The tools are available now to create the sort of interoperable ecosystem that can make talent marketplaces a reality, the report argues. Meanwhile, federal policy moves and bipartisan attention to LERs are accelerating action at the state level.

“For state leaders, this creates a practical inflection point,” says the report. “LERs are shifting from an innovation discussion to an infrastructure planning conversation.”

 
 
 
 


From DSC:
One of my sisters shared this piece with me. She is very concerned about our society’s use of technology — whether it relates to our youth’s use of social media or the relentless pressure to be first in all things AI. As she was a teacher (at the middle school level) for 37 years, I greatly appreciate her viewpoints. She keeps me grounded in some of the negatives of technology. It’s important for us to listen to each other.


 
© 2025 | Daniel Christian