This $10K AI School Promises to Future-Proof Your Career — from builtin.com by Matthew Urwin
Khan Academy, TED and ETS are starting a new program to equip students and professionals with the skills to thrive in an increasingly AI-driven economy. Here’s what you need to know.

Summary: The Khan TED Institute is a higher-education program that will teach students and workers how to use AI through interactive learning. The program’s AI-centric curriculum is an unproven approach, though, casting doubt on whether it will actually improve learning outcomes and career prospects.

Higher education might be on the verge of a radical overhaul to bring it up to speed in the age of artificial intelligence. At the TED2026 conference, Khan Academy, TED and ETS announced that they’re partnering to establish the Khan TED Institute — a new program that reorients the college curriculum around AI. By joining forces, the education technology trio aims to develop an alternative to traditional universities that better tracks student progress, teaches more relevant skills and provides a more personalized learning experience.

Accessibility is another major tenet of the Khan TED Institute. Its virtual nature allows anyone with an internet connection to participate in the program and makes it easier for students to move at their preferred pace. And because its curriculum prioritizes competency over course credits, advanced learners can complete the program in a shorter period. Time isn’t the only thing students can save on, either: The Institute promises a bachelor’s degree for less than $10,000, offering a much more affordable alternative to the typical four-year degree. 


 

From DSC:
Faculty senates don’t do well with this pace of change. But to their credit, few organizations can.

 

When anyone can build a course, the real job is deciding which ones shouldn’t exist — from drphilippahardman.substack.com by Dr. Philippa Hardman
Why deciding is the only L&D skill AI can’t replace.

The biggest AI risk that L&D faces isn’t that it gets left behind: it’s that we build more — and flood the organisation with meh-quality content nobody needed in the first place.

In this post, I’ll make the case that:

  • The L&D job has just split in two — and most of us are still working on the wrong half.
  • There’s a new operating model coming for the role, and it’s already running inside a lot of the companies you’ve heard of.
  • The smartest critique of everything I’m about to argue comes from Ethan Mollick — and I think he’s half right.

The question we’ve been asking for the last two years — “how do I get faster at building?” — was the wrong one.

The real question is: can I look at fifteen AI-generated learning assets and decide which three are worth scaling — and put my name to that decision?

 
 

6 Reasons Universities Are Building Media Labs Now — from edtechmagazine.com by Brad Grimes
Digital production centers help institutions close the gap between academic training and professional practice.

Higher education is undergoing a significant transformation in how it prepares the next generation of media professionals. Across the country, universities are investing in state-of-the-art media labs — facilities built not around traditional classroom instruction, but around the tools, workflows and collaborative environments that define today’s professional production landscape. These spaces represent a fundamental rethinking of what it means to train students for careers in film, animation, gaming and digital storytelling.

 
 

Why Sal Khan’s AI revolution hasn’t happened yet, according to Sal Khan — from chalkbeat.org by Matt Barnum

Three years ago, as Khan Academy founder Sal Khan rolled out an AI-powered tutoring chatbot, he predicted a revolution in learning.

So far, the revolution hasn’t happened, he acknowledges.

“For a lot of students, it was a non-event,” Khan told me recently about his eponymous chatbot, Khanmigo. “They just didn’t use it much.”

Khan gives this analogy: Imagine he walked into a class, sat in the back of the room, and waited for students to seek out help. “Some will; most won’t,” he said. That’s been the experience with AI tutoring, he said. It doesn’t necessarily make students motivated to learn or fill in gaps in knowledge needed to ask questions.

“AI is going to help,” said Khan of this reimagined Khan Academy. “But I think our biggest lever is really investing in the human systems.”

 

The Role of Faculty in the University of the Future — from er.educause.edu by Tanya Gamby, David Kil, Rachel Koblic, Paul LeBlanc, Mihnea Moldoveanu, and George Siemens
In the age of AI, the true future of higher education lies not in replacing faculty but in freeing them to do what only humans can—build meaningful relationships, cultivate wisdom, and guide students through the ethical and intellectual challenges machines cannot navigate.

Today, the work of knowledge transfer is often done better, faster, with more precision, and more patiently by AI. These systems can provide nonjudgmental, individualized learning opportunities twenty-four hours a day, seven days a week. Think of AI as a “genius teaching assistant” who assumes much of the work of basic knowledge transfer, unlocking learning when students get stuck and providing real-time assessment. Such a genius TA would offer faculty dashboards that update student progress, flag those who are struggling, and recommend targeted interventions. These tasks free faculty to focus on building genuine relationships with students, using the classroom to foster human skills, and curating community. This may be the great gift of AI to education. But it requires a profound reimagining of faculty roles—perhaps the single biggest hurdle to reimagining higher education, and equally its greatest opportunity.

A concerned faculty member might hear all this and conclude they are becoming obsolete. The opposite is true. The evolution of faculty roles demands more—not less—of what makes a great teacher.

This means intervening in high-impact moments when the genius TA has not unlocked learning; curating class time to lift students from knowing material to applying it in contexts that require critical thinking, judgment, and discernment; and cultivating the human skills that will be most prized in the age of AI: effective communication, constructive dialogue, empathy, creativity, and professional disposition. Most importantly, it means building genuine relationships with students—that make them feel like they matter—the kind that fuels transformation.


From DSC:
A quick comment on one of the sentences in the article, which asserts:

Centers for teaching and learning, which have long supported faculty development at many institutions, will be among the busiest places on campus in the years ahead.

I would change the words “will be” to “should be”:

Centers for teaching and learning, which have long supported faculty development at many institutions, should be among the busiest places on campus in the years ahead.

For that statement to be true, centers for teaching and learning need to be well-versed in the tools and pedagogies involved, plus in learning science. Those centers need to have credibility for faculty members to value their services. And that’s just it, isn’t it? The faculty members need to see those centers for teaching and learning as having something that they lack…that they need assistance with. Otherwise, if such centers are just viewed as superfluous, nothing much will change.

Also, my experience has been that if those centers for teaching and learning are in an IT group/department, they should be moved to the academic side of the house instead. Many faculty members don’t value people from IT enough to make changes in how they teach — no matter how qualified those people are. They view those people as “IT” only.


You might also be interested in the other articles in that series:


 

The “Cognitive Offloading” Paradox — from drphilippahardman.substack.com by Dr. Philippa Hardman
New research shows that offloading learning tasks to AI can improve – rather than erode – human thinking and learning

The Rise of the “Offloading Paradox”
In March 2026, the International Journal of Educational Technology in Higher Education published a study that went beyond the question “does offloading hurt?” and asked a harder one: when students form genuine partnerships with AI — treating it as an intellectual collaborator rather than a passive tool — what actually happens to the way they think and learn? Specifically, do two cognitive responses — critical evaluation of AI outputs (what the researchers call cognitive vigilance) and strategic delegation to AI (cognitive offloading) — compete with each other, or can they coexist?

Based on previous research, Wang and Zhang hypothesised that cognitive offloading would hurt transformative learning. They expected the familiar story: delegation reduces cognitive struggle, struggle is where learning happens, therefore delegation undermines learning.

The study — 912 students across China, Europe, and the United States, using a three-wave time-lagged survey design that measured partnership orientation first, cognitive strategies two weeks later, and learning outcomes two weeks after that — found something more interesting than a simple reversal.

 

Make learning accessible to all in higher education — from The Times Higher Education

When accessibility is placed at the heart of teaching and learning, rather than treated as a bolt-on, every student benefits. This week’s spotlight guide offers advice on designing universally accessible learning, in-person and online. Find out how to ease the burden of disability disclosure with universal design for learning, better support neurodivergent students and students with hearing or vision issues, design more accessible assessments and ensure digital tools work for all.

 

 

“Learning ecosystems begin with people.” — Getting Smart


ASU/GSV Summit

There’s something about walking into a space like the ASU+GSV Summit that feels a little like stepping into a living, breathing idea. You hear fragments of possibility in passing conversations, see it in the way people lean in a little closer during sessions, feel it in the quiet moments when something lands and you know it’s going to stay with you. This year, what lingered wasn’t just the talk of innovation; it was a deeper pull toward something more human. A reminder that before we build better systems, we have to create better conditions for dreaming. And there’s a kind of quiet joy that emerges when educators find each other in that work, when ideas connect, and you can feel the bridges across networks and ecosystems getting stronger in real time.

And dreaming is not a given. It requires space, safety, and adults who understand the weight of what they’re holding. The most powerful moments weren’t about what we can do for learners, but how we show up with them. Adults who are still learning, still stretching, still willing to have their thinking reshaped are the ones who make room for young people to imagine beyond what they’ve seen. That kind of space doesn’t happen by accident. It’s protected. It’s intentional. It’s built by people who know their non-negotiables, who draw clear lines around dignity and belonging so learners can take risks without fear of losing themselves in the process.

Across conversations on pathways, experience, and AI, there was a steady undercurrent. Knowledge alone isn’t carrying the day anymore. Young people need chances to test, to try, to wrestle with ideas in real contexts. That’s where wisdom starts to take shape. AI showed up as a partner in that work, not the main character, but a tool that can expand thinking when used well. Still, the heartbeat of it all is human. It’s the relationships, the networks, the shared belief that we don’t have to do this alone. When adults come together to learn, to challenge each other, and to build something bigger than their own corner, they create the kind of ecosystems where young people don’t just prepare for the future, they begin to shape it.


Also from Getting Smart:

 
 

What the Future of Learning Looks Like in the Era of AI — from the Center for Academic Innovation at the University of Michigan, by Sean Corp

AI & the Future of Learning Summit brings industry, education leaders together to discuss higher education’s opportunity to lead, what students need, and what partnerships are possible

As artificial intelligence rapidly reshapes the nature of work and learning, speakers at the University of Michigan’s AI & the Future of Learning Summit delivered a clear message: higher education must take a leading role in defining what comes next.

One CEO of a leading educational technology company put it like this: “The only bad thing would be universities standing still.”

Universities must embrace their roles as providers of continuous, lifelong learning that evolves alongside technological change. 


This shift is already affecting early-career pathways. Employers are placing greater emphasis on experience, while traditional entry-level roles are becoming less accessible. There is often a gap between what a credential represents and the expectations of employers.

That gap is particularly evident in access to internships. Chris Parrish, co-founder and president of Podium, noted that millions of students compete for a limited number of internships each year, making it increasingly difficult to gain the experience employers demand.

“If you miss out on an internship, you’re twice as likely to be unemployed,” Parrish said. 

 

The Course Is Dying as the Unit of Learning — from drphilippahardman.substack.com by Dr Philippa Hardman
Here’s why, and what’s replacing it

What the Bleeding Edge Looks Like in Practice
So what does “the new stack” actually look like when organisations lean into this? Here are four real patterns already in play.

  • Engineering: from engine courses to in-workflow AI coaching.
  • Product development: from courses to craft-specific agents.
  • Compliance: from annual course to nudge systems.
  • Enablement systems, not catalogues.

 

The quest to build a better AI tutor — from hechingerreport.org by Jill Barshay
Researchers make progress with an older ed tech idea: personalized practice

One promising idea has less to do with how an AI tutor explains concepts and more with what it asks students to practice next.

A team at the University of Pennsylvania, which included some AI skeptics, recently tested this approach in a study of close to 800 Taiwanese high school students learning Python programming. All the students used the same AI tutor, which was designed not to give away answers.

But there was one key difference. Half the students were randomly assigned to a fixed sequence of practice problems, progressing from easy to hard. The other half received a personalized sequence with the AI tutor continuously adjusting the difficulty of each problem based on how the student was performing and interacting with the chatbot.

The idea is based on what educators call the “zone of proximal development.” When problems are too easy, students get bored. When they’re too hard, students get frustrated. The goal is to keep students in a sweet spot: challenged, but not overwhelmed.

The researchers found that students in the personalized group did better on a final exam than students in the fixed problem group. The difference was characterized as the equivalent of 6 to 9 months of additional schooling, an eye-catching claim for an after-school online course that lasted only five months.

To address this, Chung’s team combined a large language model with a separate machine-learning algorithm that analyzes how students interact with the online course platform — how they answer the practice questions, how many times they revise or edit their coding, and the quality of their conversations with the chatbot — and uses that information to decide which problem to serve up next.
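The adaptive loop described above can be sketched roughly as follows. This is a toy illustration of difficulty adjustment in the spirit of the “zone of proximal development,” not the Penn team’s actual system (which pairs a large language model with a separate machine-learning algorithm over richer interaction data); every function name, rate, and threshold here is an assumption for illustration only:

```python
# Toy sketch of adaptive problem sequencing: serve each problem slightly
# above the student's estimated skill, then revise that estimate from
# whether they solved it. All values here are illustrative assumptions.

def update_skill(skill, difficulty, correct, rate=0.1):
    """Nudge the skill estimate up after a success, down after a miss,
    in proportion to how far the problem sat from the estimate."""
    if correct:
        return skill + rate * max(0.0, difficulty - skill + 1.0)
    return skill - rate * max(0.0, skill - difficulty + 1.0)

def next_difficulty(skill, target_gap=0.5):
    """Pick a problem a little above current skill: challenging enough
    to avoid boredom, not so hard as to frustrate."""
    return skill + target_gap

# Simulated session: difficulty rises with successes, eases after a miss.
skill = 1.0
served = []
for correct in [True, True, False, True]:
    d = next_difficulty(skill)
    skill = update_skill(skill, d, correct)
    served.append(round(d, 2))

print(served)
```

Running the simulated session, the served difficulties climb while the student keeps answering correctly and drop back after the miss, which is the “sweet spot” behavior the article describes; a production system would replace these hand-set rules with a learned model of the student’s interactions.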

 

The Most Obvious Fix in Education — from michelleweise.substack.com by Michelle Weise
The No-Brainer Nobody’s Doing 

We know what better learning looks like. We have known for a while.

Real problems. Real roles. Built-in conflict. Conditions that simulate the messiness of actual work. Reflection that asks not just what did you do but who are you becoming? These are not radical ideas. They are not untested theories. The research is clear, employers are asking for exactly this, and students consistently report that the closest they got to real work was the most valuable part of their education.

So why aren’t universities doing more of it?

That is the question worth sitting with — because the gap between what we know and what we do is not a knowledge problem. It is a design problem, an incentive problem, and if we’re being candid, a courage problem.

Because in the meantime, learners are paying the price. They graduate credentialed but untested. They enter labor markets that want proof of performance and experience, not transcripts. They lack the networks, the exposure, and the scar tissue that comes from navigating real work.


Also relevant, see:

The Apprenticeship (R)Evolution — from insidehighered.com by Sara Weissman and Colleen Flaherty
Once synonymous with hard hats and tool belts, apprenticeships are branching into health care, artificial intelligence, business services, advanced manufacturing and more.

Such programs also challenge stereotypes about apprenticeships—namely that they’re only in construction, an earn-and-learn catchall for traditionally apprenticeable occupations such as bricklayer, plumber, carpenter and electrician. In integrating robotics, automation, machining and logistics, the manufacturing development program is a bridge to understanding how apprenticeships are evolving to support some of the nation’s fastest-growing industries. These include advanced manufacturing, but also health care, information technology and other business services.

 
© 2025 | Daniel Christian