Transform Public Speaking with Yoodli: Your AI Coach — from rdene915.com by Paula Johnson

Yoodli is an AI tool designed to help users improve their public speaking skills. It analyzes your speech in real-time or after a recording and gives you feedback on things like:

    • Filler words (“um,” “like,” “you know”)
    • Pacing (Are you sprinting or sedating your audience?)
    • Word choice and sentence complexity
    • Eye contact and body language (with video)
    • And yes, even your “uhhh” to actual word ratio

Yoodli gives you a transcript and a confidence score, plus suggestions that range from helpful to brutally honest. It’s basically Simon Cowell with AI ethics and a smiley face interface.


[What’s] going on with AI and education? — from theneuron.ai by Grant Harvey
With students and teachers alike using AI, schools are facing an “assessment crisis” where the line between tool and cheating has blurred, forcing a shift away from a broken knowledge economy toward a new focus on building human judgment through strategic struggle.

What to do about it: The future belongs to the “judgment economy,” where knowledge is commoditized but taste, agency, and learning velocity become the new human moats. Use the “Struggle-First” principle: wrestle with problems for 20-30 minutes before turning to AI, then use AI as a sparring partner (not a ghostwriter) to deepen understanding. The goal isn’t to avoid AI, but to strategically choose when to embrace “desirable difficulties” that build genuine expertise versus when to leverage AI for efficiency.

The Alpha-School Program in brief:

    • Students complete core academics in just 2 hours using AI tutors, freeing up 4+ hours for life skills, passion projects, and real-world experiences.
    • The school claims students learn at least 2x faster than their peers in traditional school.
    • The top 20% of students show 6.5x growth. Classes score in the top 1-2% nationally across the board.
    • Claims are based on NWEA’s Measures of Academic Progress (MAP) assessments… with data only available to the school. Hmm…

Austen Allred shared a story about the school, which put it on our radar.


Featured Report:  Teaching for Tomorrow: Unlocking Six Weeks a Year With AI — from gallup.com

In the latest installment of Gallup and the Walton Family Foundation’s research on education, K-12 teachers reveal how AI tools are transforming their workloads, instructional quality and classroom optimism. The report finds that 60% of teachers used an AI tool during the 2024–25 school year. Weekly AI users report reclaiming nearly six hours per week — equivalent to six weeks per year — which they reinvest in more personalized instruction, deeper student feedback and better parent communication.
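(A quick back-of-the-envelope check on that equivalence, using illustrative assumptions rather than figures from the report: six hours a week across a roughly 40-week school year is about 240 hours, which works out to six 40-hour work weeks.)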

Despite this emerging “AI dividend,” adoption is uneven: 40% of teachers aren’t using AI at all, and only 19% report their school has a formal AI policy. Teachers with access to policies and support save significantly more time.

Educators also say AI improves their work. Most report higher-quality lesson plans, assessments and student feedback. And teachers who regularly use AI are more optimistic about its benefits for student engagement and accessibility — mirroring themes from the Voices of Gen Z: How American Youth View and Use Artificial Intelligence report, which found students hesitant but curious about AI’s classroom role. As AI tools grow more embedded in education, both teachers and students will need the training and support to use them effectively.

Also see:

  • 2-Hour Learning
    • What if children could crush academics in 2 hours, 2x faster? 
    • What if children could get back their most valuable resource, which is time?
    • What if children could pursue the things they want during their afternoons and develop life skills?

Amira Learning: Teaching With The AI-Powered Reading Tool — from techlearning.com by Erik Ofgang
Amira Learning is a research-backed AI reading tutor that incorporates the science of reading into its features.

What Is Amira Learning?
Amira Learning’s system is built upon research led by Jack Mostow, a professor at Carnegie Mellon who helped pioneer AI literacy education. Amira uses Claude AI to power its AI features, but these features differ from those of many other AI tools on the market. Instead of focusing on chat and generative response, Amira’s key feature is its advanced speech recognition and natural language processing capabilities, which allow the app to “hear” when a student is struggling and tailor suggestions to that student’s particular mistakes.

Though it’s not meant to replace a teacher, Amira provides real-time feedback and also helps teachers pinpoint where a student is struggling. For these reasons, Amira Learning is a favorite of education scientists and advocates for science of reading-based literacy instruction. The tool is currently used by more than 4 million students across the U.S. and around the world.


 

“Using AI Right Now: A Quick Guide” [Mollick] + other items re: AI in our learning ecosystems

Thoughts on thinking — from dcurt.is by Dustin Curtis

Intellectual rigor comes from the journey: the dead ends, the uncertainty, and the internal debate. Skip that, and you might still get the insight–but you’ll have lost the infrastructure for meaningful understanding. Learning by reading LLM output is cheap. Real exercise for your mind comes from building the output yourself.

The irony is that I now know more than I ever would have before AI. But I feel slightly dumber. A bit more dull. LLMs give me finished thoughts, polished and convincing, but none of the intellectual growth that comes from developing them myself. 


Using AI Right Now: A Quick Guide — from oneusefulthing.org by Ethan Mollick
Which AIs to use, and how to use them

Every few months I put together a guide on which AI system to use. Since I last wrote my guide, however, there has been a subtle but important shift in how the major AI products work. Increasingly, it isn’t about the best model, it is about the best overall system for most people. The good news is that picking an AI is easier than ever and you have three excellent choices. The challenge is that these systems are getting really complex to understand. I am going to try and help a bit with both.

First, the easy stuff.

Which AI to Use
For most people who want to use AI seriously, you should pick one of three systems: Claude from Anthropic, Google’s Gemini, and OpenAI’s ChatGPT.

Also see:


Student Voice, Socratic AI, and the Art of Weaving a Quote — from elmartinsen.substack.com by Eric Lars Martinsen
How a custom bot helps students turn source quotes into personal insight—and share it with others

This summer, I tried something new in my fully online, asynchronous college writing course. These classes have no Zoom sessions. No in-person check-ins. Just students, Canvas, and a lot of thoughtful design behind the scenes.

One activity I created was called QuoteWeaver—a PlayLab bot that helps students do more than just insert a quote into their writing.

Try it here

It’s a structured, reflective activity that mimics something closer to an in-person 1:1 conference or a small group quote workshop—but in an asynchronous format, available anytime. In other words, it’s using AI not to speed students up, but to slow them down.

The bot begins with a single quote that the student has found through their own research. From there, it acts like a patient writing coach, asking open-ended, Socratic questions such as:

  • What made this quote stand out to you?
  • How would you explain it in your own words?
  • What assumptions or values does the author seem to hold?
  • How does this quote deepen your understanding of your topic?

It doesn’t move on too quickly. In fact, it often rephrases and repeats, nudging the student to go a layer deeper.
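For readers curious what sits behind a bot like this, here is a minimal sketch of a Socratic quote-workshop loop. It is written under stated assumptions rather than as PlayLab’s actual implementation: the ask_llm helper and the prompt wording are hypothetical placeholders for whichever chat-completion service a bot like QuoteWeaver actually uses.

```python
# A minimal sketch of a Socratic "quote workshop" bot in the spirit of QuoteWeaver.
# NOTE: ask_llm() is a hypothetical stand-in for a real chat-completion call
# (PlayLab, OpenAI, Anthropic, etc.); here it simply cycles through canned
# question stems so the loop runs end to end.

from itertools import cycle

SYSTEM_PROMPT = (
    "You are a patient writing coach. The student shares one quote from their "
    "research. Ask ONE open-ended, Socratic question at a time. Never write the "
    "analysis for them; if an answer is thin, rephrase and nudge a layer deeper."
)

_QUESTION_STEMS = cycle([
    "What made this quote stand out to you?",
    "How would you explain it in your own words?",
    "What assumptions or values does the author seem to hold?",
    "How does this quote deepen your understanding of your topic?",
])

def ask_llm(messages: list[dict]) -> str:
    """Hypothetical placeholder for the model call that would read the full
    conversation and decide what to ask next."""
    return next(_QUESTION_STEMS)

def quote_workshop(quote: str, turns: int = 4) -> list[dict]:
    """Run a short Socratic dialogue about one student-supplied quote and
    return the transcript (which could later be annotated or assessed)."""
    messages = [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": f"Here is my quote: {quote}"},
    ]
    for _ in range(turns):
        question = ask_llm(messages)        # the bot only ever asks questions
        print(f"Coach: {question}")
        reply = input("Student: ")          # the student does the actual thinking
        messages.append({"role": "assistant", "content": question})
        messages.append({"role": "user", "content": reply})
    return messages
```

The design choice worth noticing is that the loop never produces analysis, only questions; the slowing-down the author describes lives entirely in the prompt and the turn structure.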


The Disappearance of the Unclear Question — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
New Piece for UNESCO Education Futures

On [6/13/25], UNESCO published a piece I co-authored with Victoria Livingstone at Johns Hopkins University Press. It’s called The Disappearance of the Unclear Question, and it’s part of the ongoing UNESCO Education Futures series – an initiative I appreciate for its thoughtfulness and depth on questions of generative AI and the future of learning.

Our piece raises a small but important red flag. Generative AI is changing how students approach academic questions, and one unexpected side effect is that unclear questions – for centuries a trademark of deep thinking – may be beginning to disappear. Not because they lack value, but because they don’t always work well with generative AI. Quietly and unintentionally, students (and teachers) may find themselves gradually avoiding them altogether.

Of course, that would be a mistake.

We’re not arguing against using generative AI in education. Quite the opposite. But we do propose that higher education needs a two-phase mindset when working with this technology: one that recognizes what AI is good at, and one that insists on preserving the ambiguity and friction that learning actually requires to be successful.




Leveraging GenAI to Transform a Traditional Instructional Video into Engaging Short Video Lectures — from er.educause.edu by Hua Zheng

By leveraging generative artificial intelligence to convert lengthy instructional videos into micro-lectures, educators can enhance efficiency while delivering more engaging and personalized learning experiences.


This AI Model Never Stops Learning — from link.wired.com by Will Knight

Researchers at Massachusetts Institute of Technology (MIT) have now devised a way for LLMs to keep improving by tweaking their own parameters in response to useful new information.

The work is a step toward building artificial intelligence models that learn continually—a long-standing goal of the field and something that will be crucial if machines are to ever more faithfully mimic human intelligence. In the meantime, it could give us chatbots and other AI tools that are better able to incorporate new information including a user’s interests and preferences.

The MIT scheme, called Self Adapting Language Models (SEAL), involves having an LLM learn to generate its own synthetic training data and update procedure based on the input it receives.
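To picture the loop Wired is describing, here is a rough, heavily simplified sketch. It is not the MIT team’s SEAL code; every function body below is a trivial placeholder standing in for real model calls and weight updates.

```python
# A rough, simplified sketch of the self-adapting loop described above.
# This is NOT the MIT SEAL implementation; all bodies are trivial placeholders.

def generate_self_edits(model: dict, new_information: str) -> list[str]:
    # In SEAL, the LLM itself writes synthetic training passages (restatements,
    # implications, Q&A pairs) about the new input. Placeholder version:
    return [
        f"Restatement: {new_information}",
        f"Q: What new fact did we just learn? A: {new_information}",
    ]

def finetune(model: dict, passages: list[str]) -> dict:
    # Stands in for a small parameter update on the self-generated passages.
    model["memorized"] = model.get("memorized", []) + passages
    return model

def evaluate(model: dict, probe_question: str) -> float:
    # Stands in for scoring the updated model on downstream questions; the real
    # method uses that score as the signal for learning how to self-edit.
    return float(any(probe_question.lower() in p.lower()
                     for p in model.get("memorized", [])))

def self_adapt(model: dict, new_information: str, probe_question: str) -> float:
    passages = generate_self_edits(model, new_information)  # 1. write own study notes
    model = finetune(model, passages)                        # 2. update own parameters
    return evaluate(model, probe_question)                   # 3. check whether it helped

# Example: self_adapt({}, "The user prefers worked examples over lectures.", "worked examples")
```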


Edu-Snippets — from scienceoflearning.substack.com by Nidhi Sachdeva and Jim Hewitt
Why knowledge matters in the age of AI; What happens to learners’ neural activity with prolonged use of LLMs for writing

Highlights:

  • Offloading knowledge to Artificial Intelligence (AI) weakens memory, disrupts memory formation, and erodes the deep thinking our brains need to learn.
  • Prolonged use of ChatGPT in writing lowers neural engagement, impairs memory recall, and accumulates cognitive debt that isn’t easily reversed.
 

“The AI-enhanced learning ecosystem” [Jennings] + other items re: AI in our learning ecosystems

The AI-enhanced learning ecosystem: A case study in collaborative innovation — from chieflearningofficer.com by Kevin Jennings
How artificial intelligence can serve as a tool and collaborative partner in reimagining content development and management.

Learning and development professionals face unprecedented challenges in today’s rapidly evolving business landscape. According to LinkedIn’s 2025 Workplace Learning Report, 67 percent of L&D professionals report being “maxed out” on capacity, while 66 percent have experienced budget reductions in the past year.

Despite these constraints, 87 percent agree their organizations need to develop employees faster to keep pace with business demands. These statistics paint a clear picture of the pressure L&D teams face: do more, with less, faster.

This article explores how one L&D leader’s strategic partnership with artificial intelligence transformed these persistent challenges into opportunities, creating a responsive learning ecosystem that addresses the modern demands of rapid product evolution and diverse audience needs. With 71 percent of L&D professionals now identifying AI as a high or very high priority for their learning strategy, this case study demonstrates how AI can serve not merely as a tool but as a collaborative partner in reimagining content development and management.


How we use GenAI and AR to improve students’ design skills — from timeshighereducation.com by Antonio Juarez, Lesly Pliego and Jordi Rábago who are professors of architecture at Monterrey Institute of Technology in Mexico; Tomas Pachajoa is a professor of architecture at the El Bosque University in Colombia; & Carlos Hinrichsen and Marietta Castro are educators at San Sebastián University in Chile.
Guidance on using generative AI and augmented reality to enhance student creativity, spatial awareness and interdisciplinary collaboration

Blend traditional skills development with AI use
For subjects that require students to develop drawing and modelling skills, have students create initial design sketches or models manually to ensure they practise these skills. Then, introduce GenAI tools such as Midjourney, Leonardo AI and ChatGPT to help students explore new ideas based on their original concepts. Using AI at this stage broadens their creative horizons and introduces innovative perspectives, which are crucial in a rapidly evolving creative industry.

Provide step-by-step tutorials, including both written guides and video demonstrations, to illustrate how initial sketches can be effectively translated into AI-generated concepts. Offer example prompts to demonstrate diverse design possibilities and help students build confidence using GenAI.

Integrating generative AI and AR consistently enhanced student engagement, creativity and spatial understanding on our course. 


How Texas is Preparing Higher Education for AI — from the74million.org by Kate McGee
TX colleges are thinking about how to prepare students for a changing workforce and an already overburdened faculty for new challenges in classrooms.

“It doesn’t matter if you enter the health industry, banking, oil and gas, or national security enterprises like we have here in San Antonio,” Eighmy told The Texas Tribune. “Everybody’s asking for competency around AI.”

It’s one of the reasons the public university, which serves 34,000 students, announced earlier this year that it is creating a new college dedicated to AI, cyber security, computing and data science. The new college, which is still in the planning phase, would be one of the first of its kind in the country. UTSA wants to launch the new college by fall 2025.

But many state higher education leaders are thinking beyond that. As AI becomes a part of everyday life in new, unpredictable ways, universities across Texas and the country are also starting to consider how to ensure faculty are keeping up with the new technology and students are ready to use it when they enter the workforce.


In the Room Where It Happens: Generative AI Policy Creation in Higher Education — from er.educause.edu by Esther Brandon, Lance Eaton, Dana Gavin, and Allison Papini

To develop a robust policy for generative artificial intelligence use in higher education, institutional leaders must first create “a room” where diverse perspectives are welcome and included in the process.


Q&A: Artificial Intelligence in Education and What Lies Ahead — from usnews.com by Sarah Wood
Research indicates that AI is becoming an essential skill to learn for students to succeed in the workplace.

Q: How do you expect to see AI embraced more in the future in college and the workplace?
I do believe it’s going to become a permanent fixture for multiple reasons. I think the national security imperative associated with AI as a result of competing against other nations is going to drive a lot of energy and support for AI education. We also see shifts across every field and discipline regarding the usage of AI beyond college. We see this in a broad array of fields, including health care and the field of law. I think it’s here to stay and I think that means we’re going to see AI literacy being taught at most colleges and universities, and more faculty leveraging AI to help improve the quality of their instruction. I feel like we’re just at the beginning of a transition. In fact, I often describe our current moment as the ‘Ask Jeeves’ phase of the growth of AI. There’s a lot of change still ahead of us. AI, for better or worse, it’s here to stay.




AI-Generated Podcasts Outperform Textbooks in Landmark Education Study — from linkedin.com by David Borish

A new study from Drexel University and Google has demonstrated that AI-generated educational podcasts can significantly enhance both student engagement and learning outcomes compared to traditional textbooks. The research, involving 180 college students across the United States, represents one of the first systematic investigations into how artificial intelligence can transform educational content delivery in real-time.


What can we do about generative AI in our teaching?  — from linkedin.com by Kristina Peterson

So what can we do?

  • Interrogate the Process: We can ask ourselves if we built in enough checkpoints. Steps that can’t be faked. Things like quick writes, question floods, in-person feedback, revision logs.
  • Reframe AI: We can let students use AI as a partner. We can show them how to prompt better, revise harder, and build from it rather than submit it. Show them the difference between using a tool and being used by one.
  • Design Assignments for Curiosity, Not Compliance: Even the best of our assignments need to adapt. Mine needs more checkpoints, more reflective questions along the way, more explanation of why my students made the choices they did.

Teachers Are Not OK — from 404media.co by Jason Koebler

The response from teachers and university professors was overwhelming. In my entire career, I’ve rarely gotten so many email responses to a single article, and I have never gotten so many thoughtful and comprehensive responses.

One thing is clear: teachers are not OK.

In addition, universities are contracting with companies like Microsoft, Adobe, and Google for digital services, and those companies are constantly pushing their AI tools. So a student might hear “don’t use generative AI” from a prof but then log on to the university’s Microsoft suite, which then suggests using Copilot to sum up readings or help draft writing. It’s inconsistent and confusing.

I am sick to my stomach as I write this because I’ve spent 20 years developing a pedagogy that’s about wrestling with big ideas through writing and discussion, and that whole project has been evaporated by for-profit corporations who built their systems on stolen work. It’s demoralizing.

 

These parents are ‘unschooling’ their kids. What does that mean? — from usatoday.com by Adrianna Rodriguez

“My goal for them is for them to love learning,” Franco said. “It’s realizing you can educate your child beyond the school model.”

Some parents say their children are thriving in the unschooling environment, fueling their confidence and desire to learn.

But not all students find success in unschooling. Some former students say the lack of structure and accountability can lead to educational neglect if parents don’t have the resources to make it work. Some kids who were unschooled feel they were left unprepared for adulthood and had fewer career opportunities.


What Is ‘Unschooling’ and Why Are More Parents Doing It? — from bckonline.com by Tiffany Silva

Unschooling is a growing alternative education movement where children learn through life experiences instead of traditional classroom instruction. As more parents seek personalized and flexible learning paths, unschooling is gaining popularity across the U.S. and here’s what you need to know!

So, just what exactly is unschooling? Well, unschooling is a form of homeschooling that breaks away from the idea of following a set curriculum. Instead, it centers the child’s interests, passions, and pace.

The belief is that learning doesn’t have to be separate from life because it is life. Unschooling functions on the premise that when kids are given the freedom to explore, they develop deep, authentic understanding and a lifelong love of learning.

 

AI & Schools: 4 Ways Artificial Intelligence Can Help Students — from the74million.org by W. Ian O’Byrne
AI creates potential for more personalized learning

I am a literacy educator and researcher, and here are four ways I believe these kinds of systems can be used to help students learn.

  1. Differentiated instruction
  2. Intelligent textbooks
  3. Improved assessment
  4. Personalized learning


5 Skills Kids (and Adults) Need in an AI World — from oreilly.com by Raffi Krikorian
Hint: Coding Isn’t One of Them

Five Essential Skills Kids Need (More than Coding)
I’m not saying we shouldn’t teach kids to code. It’s a useful skill. But these are the five true foundations that will serve them regardless of how technology evolves.

  1. Loving the journey, not just the destination
  2. Being a question-asker, not just an answer-getter
  3. Trying, failing, and trying differently
  4. Seeing the whole picture
  5. Walking in others’ shoes

The AI moment is now: Are teachers and students ready? — from iblnews.org

Day of AI Australia held a panel discussion on 20 May 2025, hosted by Dr Sebastian Sequoiah-Grayson (Senior Lecturer in the School of Computer Science and Engineering, UNSW Sydney), with panel members Katie Ford (Industry Executive – Higher Education at Microsoft), Tamara Templeton (Primary School Teacher, Townsville), Sarina Wilson (Teaching and Learning Coordinator – Emerging Technology at NSW Department of Education) and Professor Didar Zowghi (Senior Principal Research Scientist at CSIRO’s Data61).


Teachers using AI tools more regularly, survey finds — from iblnews.org

As many students face criticism and punishment for using artificial intelligence tools like ChatGPT for assignments, new reporting shows that many instructors are increasingly using those same programs.


Addendum on 5/28/25:

A Museum of Real Use: The Field Guide to Effective AI Use — from mikekentz.substack.com by Mike Kentz
Six Educators Annotate Their Real AI Use—and a Method Emerges for Benchmarking the Chats

Our next challenge is to self-analyze and develop meaningful benchmarks for AI use across contexts. This research exhibit aims to take the first major step in that direction.

With the right approach, a transcript becomes something else:

  • A window into student decision-making
  • A record of how understanding evolves
  • A conversation that can be interpreted and assessed
  • An opportunity to evaluate content understanding

This week, I’m excited to share something that brings that idea into practice.

Over time, I imagine a future where annotated transcripts are collected and curated. Schools and universities could draw from a shared library of real examples—not polished templates, but genuine conversations that show process, reflection, and revision. These transcripts would live not as static samples but as evolving benchmarks.

This Field Guide is the first move in that direction.


 

Find Your Next Great Job with AI — from wondertools.substack.com by Jeremy Caplan

1. Explore career directions

Recommended tool: Google’s Career Dreamer

What it is: A career visualization tool. See a map of professional fields related to your interests. (See video demo below)

How to use it: Start by typing in a current or previous role, or a type of job that interests you, using up to five words. Then optionally add the name of an organization or industry.

The free service then confirms job activities of interest and shows you a variety of related career paths. Pick one at a time to explore. You can then browse current job openings, refining the search based on location, company size, or other factors you care about.

Example: I’m not job hunting, but I tested out the service by typing in “journalist, writer and educator” as roles and then “journalism and education” as my industries of interest.

Why it’s useful: I appreciate that Career Dreamer not only suggests a range of relevant fields, but also summarizes what a typical day in those jobs might be like. It also suggests skills you’ll develop and other jobs that might follow on that career path.

Next step: After exploring potential career paths and looking at available jobs, you can jump into Gemini — Google’s equivalent of ChatGPT — for further career planning.


From DSC:
This is the type of functionality that will be woven into the powerful, global, Artificial Intelligence (AI)-based, next-generation, lifelong learning platform that I’ve been tracking. AI will be constantly used to determine which skills are marketable and how to get those skills. The platform will feature personalized recommendations and help a person brainstorm about potential right turns in their career path.


 

MOOC-Style Skills Training — from the-job.beehiiv.com by Paul Fain
WGU and tech companies use Open edX for flexible online learning. Could community colleges be next?

Open Source for Affordable Online Reach
The online titan Western Governors University is experimenting with an open-source learning platform. So are Verizon and the Indian government. And the platform’s leaders want to help community colleges take the plunge on competency-based education.

The Open edX platform inherently supports self-paced learning and offers several features that make it a good fit for competency-based education and skills-forward learning, says Stephanie Khurana, Axim’s CEO.

“Flexible modalities and a focus on competence instead of time spent learning improves access and affordability for learners who balance work and life responsibilities alongside their education,” she says.

“Plus, being open source means institutions and organizations can collaborate to build and share CBE-specific tools and features,” she says, “which could lower costs and speed up innovation across the field.”

Axim thinks Open edX’s ability to scale affordably can support community colleges in reaching working learners across an underserved market. 

 

AI agents arrive in US classrooms — from zdnet.com by Radhika Rajkumar
Kira AI’s personalized learning platform is currently being implemented in Tennessee schools. How will it change education?

AI for education is a new but rapidly expanding field. Can it support student outcomes and help teachers avoid burnout?

On Wednesday, AI education company Kira launched a “fully AI-native learning platform” for K-12 education, complete with agents to assist teachers with repetitive tasks. The platform hosts assignments, analyzes progress data, offers administrative assistance, helps build lesson plans and quizzes, and more.

“Unlike traditional tools that merely layer AI onto existing platforms, Kira integrates artificial intelligence directly into every educational workflow — from lesson planning and instruction to grading, intervention, and reporting,” the release explains. “This enables schools to improve student outcomes, streamline operations, and provide personalized support at scale.”

Also relevant/see:

Coursera Founder Andrew Ng’s New Venture Brings A.I. to K–12 Classrooms — from observer.com by Victor Dey
Andrew Ng’s Kira Learning uses A.I. agents to transform K–12 education with tools for teachers, students and administrators.

“Teachers today are overloaded with repetitive tasks. A.I. agents can change that, and free up their time to give more personalized help to students,” Ng said in a statement.

Kira was co-founded by Andrea Pasinetti and Jagriti Agrawal, both longtime collaborators of Ng. The platform embeds A.I. directly into lesson planning, instruction, grading and reporting. Teachers can instantly generate standards-aligned lesson plans, monitor student progress in real time and receive automated intervention strategies when a student falls behind.

Students, in turn, receive on-demand tutoring tailored to their learning styles. A.I. agents adapt to each student’s pace and mastery level, while grading is automated with instant feedback—giving educators time to focus on teaching.


‘Using GenAI is easier than asking my supervisor for support’ — from timeshighereducation.com
Doctoral researchers are turning to generative AI to assist in their research. How are they using it, and how can supervisors and candidates have frank discussions about using it responsibly?

Generative AI is increasingly the proverbial elephant in the supervisory room. As supervisors, you may be concerned about whether your doctoral researchers are using GenAI. It can be a tricky topic to broach, especially when you may not feel confident in understanding the technology yourself.

While the potential impact of GenAI use among undergraduate and postgraduate taught students, especially, is well discussed (and it is increasingly accepted that students and staff need to become “AI literate”), doctoral researchers often slip through the cracks in institutional guidance and policymaking.


AI as a Thought Partner in Higher Education — from er.educause.edu by Brian Basgen

When used thoughtfully and transparently, generative artificial intelligence can augment creativity and challenge assumptions, making it an excellent tool for exploring and developing ideas.

The glaring contrast between the perceived ubiquity of GenAI and its actual use also reveals fundamental challenges associated with the practical application of these tools. This article explores two key questions about GenAI to address common misconceptions and encourage broader adoption and more effective use of these tools in higher education.


AI for Automation or Augmentation of L&D? — from drphilippahardman.substack.com by Dr. Philippa Hardman
An audio summary of my Learning Technologies talk

Like many of you, I spent the first part of this week at Learning Technologies in London, where I was lucky enough to present a session on the current state of AI and L&D.

In this week’s blog post, I summarise what I covered and share an audio summary of my paper for you to check out.


Bridging the AI Trust Gap — from chronicle.com by Ian Wilhelm, Derek Bruff, Gemma Garcia, and Lee Rainie

In a 2024 Chronicle survey, 86 percent of administrators agreed with the statement: “Generative artificial intelligence tools offer an opportunity for higher education to improve how it educates, operates, and conducts research.” In contrast, just 55 percent of faculty agreed, showing the stark divisions between faculty and administrative perspectives on adopting AI.

Among many faculty members, a prevalent distrust of AI persists — and for valid reasons. How will it impact in-class instruction? What does the popularity of generative AI tools portend for the development of critical thinking skills for Gen-Z students? How can institutions, at the administrative level, develop policies to safeguard against students using these technologies as tools for cheating?

Given this increasing ‘trust gap,’ how can faculty and administrators work together to preserve academic integrity as AI seeps into all areas of academia, from research to the classroom?

Join us for “Bridging the AI Trust Gap,” an extended, 75-minute Virtual Forum exploring the trust gap on campus about AI, the contours of the differences, and what should be done about it.

 

Another ‘shock’ is coming for American jobs — from washingtonpost.com by Heather Long. DSC: This is a gifted article
Millions of workers will need to shift careers. Our country is unprepared.

The United States is on the cusp of a massive economic shift due to AI, and it’s likely to cause greater change than anything President Donald Trump does in his second term. Much good can come from AI, but the country is unprepared to grapple with the need for millions — or perhaps tens of millions — of workers to shift jobs and entire careers.

“There’s a massive risk that entry-level, white-collar work could get automated. What does that do to career ladders?” asked Molly Kinder, a fellow at the Brookings Institution. Her research has found the jobs of marketing analysts are five times as likely to be replaced as those of marketing managers, and sales representative jobs are three times as likely to be replaced as those of sales managers.

Young people working in these jobs will need to be retrained, but it will be hard for them to invest in new career paths. Consider that many college graduates already carry a lot of debt (an average of about $30,000 for those who took student loans). What’s more, the U.S. unemployment insurance system covers only about 57 percent of unemployed workers and replaces only a modest amount of someone’s pay.

From DSC:
This is another reason why I think this vision here is at least a part of our future. We need shorter, less expensive credentials.

  • People don’t have the time to get degrees that take 2+ years to complete (after they have already gone through college once).
  • They don’t want to come out with more debt on their backs.
  • With inflation going back up, they won’t have as much money anyway.
  • Also, they may already have enough debt on their backs.
 

From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?



 

Reflections on “Are You Ready for the AI University? Everything is about to change.” [Latham]

Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham

Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.

What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.

Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance and then, guided by human professors as they sort through the material, help AI understand the structure of the discipline, and then develop lectures, videos, supporting documentation, and assessments.

In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If you need tutoring, an AI bot will be ready to help any time, day or night. Similarly, if you are going on a trip and wish to take an exam on the plane, a student will be able to log on and complete the AI-designed and administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.

Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom. 

From DSC:
I had a very difficult time deciding which excerpts to include. There were so many more excerpts for us to think about with this solid article. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.

Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him and, as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of what higher education may look like. Scott asserts that some significant and upcoming impacts will be experienced by faculty members, doctoral students, and graduate/teaching assistants (and Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.

There are a few places where Scott and I differ.

  • The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned their 5-year, $100M ed push last year and one of the key conclusions was that people want to learn from — and with — other people:

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY

By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they would be able to do.

  • Also, I would use the term learning preferences where Scott uses the term learning styles.

Scott also mentions:

“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”

It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).

That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.

So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!


Addendum later on 4/10/25:

U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain

Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.

The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.

The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.

 

Uplimit raises stakes in corporate learning with suite of AI agents that can train thousands of employees simultaneously — from venturebeat.com by Michael Nuñez

Uplimit unveiled a suite of AI-powered learning agents today designed to help companies rapidly upskill employees while dramatically reducing administrative burdens traditionally associated with corporate training.

The San Francisco-based company announced three sets of purpose-built AI agents that promise to change how enterprises approach learning and development: skill-building agents, program management agents, and teaching assistant agents. The technology aims to address the growing skills gap as AI advances faster than most workforces can adapt.

“There is an unprecedented need for continuous learning—at a scale and speed traditional systems were never built to handle,” said Julia Stiglitz, CEO and co-founder of Uplimit, in an interview with VentureBeat. “The companies best positioned to thrive aren’t choosing between AI and their people—they’re investing in both.”


Introducing Claude for Education — from anthropic.com

Today we’re launching Claude for Education, a specialized version of Claude tailored for higher education institutions. This initiative equips universities to develop and implement AI-enabled approaches across teaching, learning, and administration—ensuring educators and students play a key role in actively shaping AI’s role in society.

As part of announcing Claude for Education, we’re introducing:

  1. Learning mode: A new Claude experience that guides students’ reasoning process rather than providing answers, helping develop critical thinking skills
  2. University-wide Claude availability: Full campus access agreements with Northeastern University, London School of Economics and Political Science (LSE), and Champlain College, making Claude available to all students
  3. Academic partnerships: Joining Internet2 and working with Instructure to embed AI into teaching & learning with Canvas LMS
  4. Student programs: A new Claude Campus Ambassadors program along with an initiative offering API credits for student projects

A comment on this from The Rundown AI:

Why it matters: Education continues to grapple with AI, but Anthropic is flipping the script by making the tech a partner in developing critical thinking rather than an answer engine. While the controversy over its use likely isn’t going away, this generation of students will have access to the most personalized, high-quality learning tools ever.


Should College Graduates Be AI Literate? — from chronicle.com by Beth McMurtrie (behind a paywall)
More institutions are saying yes. Persuading professors is only the first barrier they face.

Last fall one of Jacqueline Fajardo’s students came to her office, eager to tell her about an AI tool that was helping him learn general chemistry. Had she heard of Google NotebookLM? He had been using it for half a semester in her honors course. He confidently showed her how he could type in the learning outcomes she posted for each class and the tool would produce explanations and study guides. It even created a podcast based on an academic paper he had uploaded. He did not feel it was important to take detailed notes in class because the AI tool was able to summarize the key points of her lectures.


Showing Up for the Future: Why Educators Can’t Sit Out the AI Conversation — from marcwatkins.substack.com with a guest post from Lew Ludwig

The Risk of Disengagement
Let’s be honest: most of us aren’t jumping headfirst into AI. At many of our institutions, it’s not a gold rush—it’s a quiet standoff. But the group I worry most about isn’t the early adopters. It’s the faculty who’ve decided to opt out altogether.

That choice often comes from a place of care. Concerns about data privacy, climate impact, exploitative labor, and the ethics of using large language models are real—and important. But choosing not to engage at all, even on ethical grounds, doesn’t remove us from the system. It just removes our voices from the conversation.

And without those voices, we risk letting others—those with very different priorities—make the decisions that shape what AI looks like in our classrooms, on our campuses, and in our broader culture of learning.



Turbocharge Your Professional Development with AI — from learningguild.com by Dr. RK Prasad

You’ve just mastered a few new eLearning authoring tools, and now AI is knocking on the door, offering to do your job faster, smarter, and without needing coffee breaks. Should you be worried? Or excited?

If you’re a Learning and Development (L&D) professional today, AI is more than just a buzzword—it’s transforming the way we design, deliver, and measure corporate training. But here’s the good news: AI isn’t here to replace you. It’s here to make you better at what you do.

The challenge is to harness its potential to build digital-ready talent, not just within your organization but within yourself.

Let’s explore how AI is reshaping L&D strategies and how you can leverage it for professional development.


5 Recent AI Notables — from automatedteach.com by Graham Clay

1. OpenAI’s New Image Generator
What Happened: OpenAI integrated a much more powerful image generator directly into GPT-4o, making it the default image creator in ChatGPT. Unlike previous image models, this one excels at accurately rendering text in images, precise visualization of diagrams/charts, and multi-turn image refinement through conversation.

Why It’s Big: For educators, this represents a significant advancement in creating educational visuals, infographics, diagrams, and other instructional materials with unprecedented accuracy and control. It’s not perfect, but you can now quickly generate custom illustrations that accurately display mathematical equations, chemical formulas, or process workflows — previously a significant hurdle in digital content creation — without requiring graphic design expertise or expensive software. This capability dramatically reduces the time between conceptualizing a visual aid and implementing it in course materials.


The 4 AI modes that will supercharge your workflow — from aiwithallie.beehiiv.com by Allie K. Miller
The framework most people and companies won’t discover until 2026


 

Who does need college anymore? About that book title … — from Education Design Lab

As you may know, Lab founder Kathleen deLaski just published a book with a provocative title: Who Needs College Anymore? Imagining a Future Where Degrees Won’t Matter.

Kathleen is asked about the title in every media interview, before and since the Feb. 25 book release. “It has generated a lot of questions,” she said in our recent book chat. “I tell people to focus on the word, ‘who.’ Who needs college anymore? That’s in keeping with the design thinking frame, where you look at the needs of individuals and what needs are not being met.”

In the same conversation, Kathleen reminded us that only 38% of American adults have a four-year degree. “We never talk about the path to the American dream for the rest of folks,” she said. “We currently are not supporting the other really interesting pathways to financial sustainability — apprenticeships, short-term credentials. And that’s really why I wrote the book, to push the conversation around the 62% of who we call New Majority Learners at the Lab, the people for whom college was not designed.” Watch the full clip

She distills the point into one sentence in this SmartBrief essay:  “The new paradigm is a ‘yes and’ paradigm that embraces college and/or other pathways instead of college or bust.”

What can colleges do moving forward?
In this excellent Q&A with Inside Higher Ed, Kathleen shares her No. 1 suggestion: “College needs to be designed as a stepladder approach, where people can come in and out of it as they need, and at the very least, they can build earnings power along the way to help afford a degree program.”

In her Hechinger Report essay, Kathleen lists four more steps colleges can take to meet the demand for more choices, including “affordability must rule.”

From white-collar apprenticeships and micro-credential programs at local community colleges to online bootcamps, self-instruction using YouTube, and more—students are forging alternative paths to GREAT high-paying jobs. (source)

 

Building an AI-Ready Workforce: A look at College Student ChatGPT Adoption in the US — from cdn.openai.com

One finding from our student survey that stood out to us: Many college and university students are teaching themselves and their friends about AI without waiting for their institutions to provide formal AI education or clear policies about the technology’s use. The education ecosystem is in an important moment of exploration and learning, but the rapid adoption by students across the country who haven’t received formalized instruction in how and when to use the technology creates disparities in AI access and knowledge.

The enclosed snapshot of how young people are using ChatGPT provides insight into the state of AI use among America’s college-aged students. We also include actionable proposals to help address adoption gaps. We hope these insights and proposals can inform research and policy conversation across the nation’s education ecosystem about how to achieve outcomes that support our students, our workforce, and the economy. By improving literacy, expanding access, and implementing clear policies, policymakers and educators can better integrate AI into our educational infrastructure and ensure that our workforce is ready to both sustain and benefit from our future with AI.

Leah Belsky | Vice President, Education | OpenAI

 

Top student use cases of ChatGPT: learning and tutoring, writing help, miscellaneous questions, and programming help

 

AI in K12: Today’s Breakthroughs and Tomorrow’s Possibilities (webinar)
How AI is Transforming Classrooms Today and What’s Next


Audio-Based Learning 4.0 — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new & powerful way to leverage AI for learning?

At the end of all of this my reflection is that the research paints a pretty exciting picture – audio-based learning isn’t just effective, it’s got some unique superpowers when it comes to boosting comprehension, ramping up engagement, and delivering feedback that really connects with learners.

While audio has been massively under-used as a mode of learning, especially compared to video and text, we’re at an interesting turning point where AI tools are making it easier than ever to tap into audio’s potential as a pedagogical tool.

What’s super interesting is how solid the research backing audio’s effectiveness is, and how well it converges with these new AI capabilities.

From DSC:
I’ve noticed that I don’t learn as well via audio-only events. It can help if visuals are also provided, but I have to watch the cognitive load. My processing can start to get overloaded — to the point that I have to close my eyes and just listen sometimes. But there are people I know who love to listen to audiobooks and prefer to learn that way. They can devour content and process/remember it all. Audio is a nice change of pace at times, but oftentimes I prefer visuals and reading. It needs to be absolutely quiet if I’m tackling some new information/learning.


In Conversation With… Ashton Cousineau — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new video series exploring how L&D professionals are working with AI on the ground



The Learning Research Digest vol. 28 — from learningsciencedigest.substack.com by Dr. Philippa Hardman

Hot Off the Research Press This Month:

  • AI-Infused Learning Design – A structured approach to AI-enhanced assignments using a three-step model for AI integration.
  • Mathematical Dance and Creativity in STEAM – Using AI-powered motion capture to translate dance movements into mathematical models.
  • AI-Generated Instructional Videos – How adaptive AI-powered video learning enhances problem-solving and knowledge retention.
  • Immersive Language Learning with XR & AI – A new framework for integrating AI-driven conversational agents with Extended Reality (XR) for task-based language learning.
  • Decision-Making in Learning Design – A scoping review on how instructional designers navigate complex instructional choices and make data-driven decisions.
  • Interactive E-Books and Engagement – Examining the impact of interactive digital books on student motivation, comprehension, and cognitive engagement.
  • Elevating Practitioner Voices in Instructional Design – A new initiative to amplify instructional designers’ contributions to research and innovation.

Deep Reasoning, Agentic AI & the Continued Rise of Specialised AI Research & Tools for Education — from learningfuturesdigest.substack.com by Dr. Philippa Hardman

Here’s a quick teaser of key developments in the world of AI & learning this month:

  • DeepSeek R-1, OpenAI’s Deep Research & Perplexity’s ‘Deep Research’ are the latest additions to a growing number of “reasoning models” with interesting implications for evidence-based learning design & development.
  • The U.S. Education Dept release an AI Toolkit and a fresh policy roadmap enabling the adoption of AI use in schools.
  • Anthropic Release “Agentic Claude”, another AI agent that clicks, scrolls, and can even successfully complete e-learning courses…
  • Oxford University Announce the AIEOU Hub, a research-backed lab to support research on, and implementation of, AI in education.
  • “AI Agents Everywhere”: A Forbes peek at how agentic AI will handle the “boring bits” of classroom life.
  • [Bias klaxon!] Epiphany AI: My own research leads to the creation of a specialised, “pedagogy first” AI co-pilot for instructional design marking the continued growth of specialised AI tools designed for specific industries and workflows.

AI is the Perfect Teaching Assistant for Any Educator — from unite.ai by Navi Azaria, CPO at Kaltura

Through my work with leading educational institutions at Kaltura, I’ve seen firsthand how AI agents are rapidly becoming indispensable. These agents alleviate the mounting burdens on educators and provide new generations of tech-savvy students with accessible, personalized learning, giving teachers the support they need to give their students the personalized attention and engagement they deserve.


Learning HQ — from ai-disruptor-hq.notion.site

This HQ includes all of my AI guides, organized by tool/platform. This list is updated each time a new one is released, and outdated guides are removed/replaced over time.



How AI Is Reshaping Teachers’ Jobs — from edweek.org

Artificial intelligence is poised to fundamentally change the job of teaching. AI-powered tools can shave hours off the amount of time teachers spend grading, lesson-planning, and creating materials. AI can also enrich the lessons they deliver in the classroom and help them meet the varied needs of all students. And it can even help bolster teachers’ own professional growth and development.

Despite all the promise of AI, though, experts still urge caution as the technology continues to evolve. Ethical questions and practical concerns are bubbling to the surface, and not all teachers feel prepared to effectively and safely use AI.

In this special report, see how early-adopter teachers are using AI tools to transform their daily work, tackle some of the roadblocks to expanded use of the technology, and understand what’s on the horizon for the teaching profession in the age of artificial intelligence.

 
© 2025 | Daniel Christian