Midoo AI Launches the World’s First AI Language Learning Agent, Redefining How People Learn Languages — from morningstar.com

SINGAPORE, Sept. 3, 2025 /PRNewswire/ — Today, Midoo AI proudly announces the launch of the world’s first AI language learning agent, a groundbreaking innovation set to transform language education forever.

For decades, language learning has pursued one ultimate goal: true personalization. Traditional tools offered smart recommendations, gamified challenges, and pre-written role-play scripts—but real personalization remained out of reach. Midoo AI changes that. Here is the launch video of Midoo AI.

Imagine a learning experience that evolves with you in real time. A system that doesn’t rely on static courses or scripts but creates a dynamic, one-of-a-kind language world tailored entirely to your needs. This is the power of Midoo’s Dynamic Generation technology.

“Midoo is not just a language-learning tool,” said Yvonne, co-founder of Midoo AI. “It’s a living agent that senses your needs, adapts instantly, and shapes an experience that’s warm, personal, and alive. Learning is no longer one-size-fits-all—now, it’s yours and yours alone.”


Midoo AI Review: Meet the First AI Language Learning Agent — from autogpt.net

Language learning apps have traditionally focused on exercises, quizzes, and progress tracking. Midoo AI introduces a different approach. Instead of presenting itself as a course provider, it acts as an intelligent learning agent that builds, adapts, and sustains a learner’s journey.

This review examines how Midoo AI operates, its feature set, and what makes it distinct from other AI-powered tutors.

Midoo AI in Context: Purpose and Position
Midoo AI is not structured around distributing lessons or modules. Its core purpose is to provide an agent-like partner that adapts in real time. Where many platforms ask learners to select a “level” or “topic,” Midoo instead begins by analyzing goals, usage context, and error patterns. The result is less about consuming predesigned units and more about co-constructing a pathway.


AI Isn’t Replacing Teachers — It’s Helping Us Teach Better — from rdene915.com by guest author Matthew Mawn

Turning Time Saved Into Better Learning
AI can save teachers time, but what can that time be used for (besides taking a breath)? For most of us, it means redirecting energy into the parts of teaching that made us want to pursue this profession in the first place: connecting with our students and helping them grow academically.

Differentiation
Every classroom has students with different readiness levels, language needs, and learning preferences. AI tools like Diffit or MagicSchool can instantly create multiple versions of a passage or assignment, differentiated by grade level, complexity, or language. This allows every student to engage with the same core concept, moving together as one cohesive class. Instead of spending an evening retyping and rephrasing, teachers can review and tweak AI drafts in minutes, ready for the next lesson.


Mass Intelligence — from oneusefulthing.org by Ethan Mollick
From GPT-5 to nano banana: everyone is getting access to powerful AI

When a billion people have access to advanced AI, we’ve entered what we might call the era of Mass Intelligence. Every institution we have — schools, hospitals, courts, companies, governments — was built for a world where intelligence was scarce and expensive. Now every profession, every institution, every community has to figure out how to thrive with Mass Intelligence. How do we harness a billion people using AI while managing the chaos that comes with it? How do we rebuild trust when anyone can fabricate anything? How do we preserve what’s valuable about human expertise while democratizing access to knowledge?


AI Is the Cognitive Layer. Schools Still Think It’s a Study Tool. — from stefanbauschard.substack.com by Stefan Bauschard

By the time today’s 9th graders and college freshmen enter the workforce, the most disruptive waves of AGI and robotics may already be embedded in parts of society.

What replaces the old system will not simply be a more digital version of the same thing. Structurally, schools may move away from rigid age-groupings, fixed schedules, and subject silos. Instead, learning could become more fluid, personalized, and interdisciplinary—organized around problems, projects, and human development rather than discrete facts or standardized assessments.

AI tutors and mentors will allow for pacing that adapts to each student, freeing teachers to focus more on guidance, relationships, and high-level facilitation. Classrooms may feel less like miniature factories and more like collaborative studios, labs, or even homes—spaces for exploring meaning and building capacity, not just delivering content.

If students are no longer the default source of action, then we need to teach them to:

    • Design agents,
    • Collaborate with agents,
    • Align agentic systems with human values,
    • And most of all, retain moral and civic agency in a world where machines act on our behalf.

We are no longer educating students to be just doers.
We must now educate them to be judges, designers, and stewards of agency.


Meet Your New AI Tutor — from wondertools.substack.com by Jeremy Caplan
Try new learning modes in ChatGPT, Claude, and Gemini

AI assistants are now more than simple answer machines. ChatGPT’s new Study Mode, Claude’s Learning Mode, and Gemini’s Guided Learning represent a significant shift. Instead of just providing answers, these free tools act as adaptive, 24/7 personal tutors.



AI Tools for Instructional Design (September, 2025) — from drphilh.gumroad.com by Dr Philippa Hardman

That’s why, in preparation for my next bootcamp which kicks off September 8th 2025, I’ve just completed a full refresh of my list of the most powerful & popular AI tools for Instructional Designers, complete with tips on how to get the most from each tool.

The list has been created using my own experience + the experience of hundreds of Instructional Designers who I work with every week.

It contains the 50 most powerful AI tools for instructional design available right now, along with tips on how to optimise their benefits while mitigating their risks.


Addendums on 9/4/25:


AI Companies Roll Out Educational Tools — from insidehighered.com by Ray Schroeder
This fall, Google, Anthropic and OpenAI are rolling out powerful new AI tools for students and educators, each taking a different path to shape the future of learning.



Rethinking My List of Essential Job Skills in the Age of AI — from michellekassorla.substack.com by Michelle Kassorla

So here’s the new list of essential skills I think my students will need when they are employed to work with AI five years from now:

  1. They can follow directions, analyze outcomes, and adapt to change when needed.
  2. They can write or edit AI output to capture a unique voice and appropriate tone in sync with an audience’s needs.
  3. They have a deep understanding of one or more content areas of a particular profession, business, or industry, so they can easily identify factual errors.
  4. They have a strong commitment to exploration, a flexible mindset, and a broad understanding of AI literacy.
  5. They are resilient and critical thinkers, ready to question results and demand better answers.
  6. They are problem solvers.

And, of course, here is a new rubric built on those skills:


 

CrashCourse on YouTube — via Matt Tower’s The EdSheet Vol. 18

Description:
At Crash Course, we believe that high-quality educational videos should be available to everyone for free! Subscribe for weekly videos from our current courses! The Crash Course team has produced more than 50 courses on a wide variety of subjects, ranging from the humanities to sciences and so much more! We also recently teamed up with Arizona State University to bring you more courses on the Study Hall channel.

And as Matt stated:


From DSC:
I wasn’t familiar with this “channel” — but I like their mission to help people learn…very inexpensively! Along these lines, I, too, pray for the world’s learning ecosystems — especially those belonging to children.


 

Bringing the best of AI to college students for free — from blog.google by Sundar Pichai

Millions of college students around the world are getting ready to start classes. To help make the school year even better, we’re making our most advanced AI tools available to them for free, including our new Guided Learning mode. We’re also providing $1 billion to support AI education and job training programs and research in the U.S. This includes making our AI and career training free for every college student in America through our AI for Education Accelerator — over 100 colleges and universities have already signed up.

Guided Learning: from answers to understanding
AI can broaden knowledge and expand access to it in powerful ways, helping anyone, anywhere learn anything in the way that works best for them. It’s not about just getting an answer, but deepening understanding and building critical thinking skills along the way. That opportunity is why we built Guided Learning, a new mode in Gemini that acts as a learning companion guiding you with questions and step-by-step support instead of just giving you the answer. We worked closely with students, educators, researchers and learning experts to make sure it’s helpful for understanding new concepts and is backed by learning science.




 

BREAKING: Google introduces Guided Learning — from aieducation.substack.com by Claire Zau
Some thoughts on what could make Google’s AI tutor stand out

Another major AI lab just launched “education mode.”

Google introduced Guided Learning in Gemini, transforming it into a personalized learning companion designed to help you move from quick answers to real understanding.

Instead of immediately spitting out solutions, it:

  • Asks probing, open-ended questions
  • Walks learners through step-by-step reasoning
  • Adapts explanations to the learner’s level
  • Uses visuals, videos, diagrams, and quizzes to reinforce concepts

This Socratic-style tutor rollout follows closely behind similar announcements like OpenAI’s Study Mode (last week) and Anthropic’s Claude for Education (April 2025).


How Sci-Fi Taught Me to Embrace AI in My Classroom — from edsurge.com by Dan Clark

I’m not too naive to understand that, no matter how we present it, some students will always be tempted by “the dark side” of AI. What I also believe is that the future of AI in education is not decided. It will be decided by how we, as educators, embrace or demonize it in our classrooms.

My argument is that setting guidelines and talking to our students honestly about the pitfalls and amazing benefits that AI offers us as researchers and learners will define it for the coming generations.

Can AI be the next calculator? Something that, yes, changes the way we teach and learn, but not necessarily for the worse? If we want it to be, yes.

How it is used, and more importantly, how AI is perceived by our students, can be influenced by educators. We have to first learn how AI can be used as a force for good. If we continue to let the dominant voice be that AI is the Terminator of education and critical thinking, then that will be the fate we have made for ourselves.


AI Tools for Strategy and Research – GT #32 — from goodtools.substack.com by Robin Good
Getting expert advice, how to do deep research with AI, prompt strategy, comparing different AIs side-by-side, creating mini-apps and an AI Agent that can critically analyze any social media channel

In this issue, discover AI tools for:

  • Getting Expert Advice
  • Doing Deep Research with AI
  • Improving Your AI Prompt Strategy
  • Comparing Results from Different AIs
  • Creating an AI Agent for Social Media Analysis
  • Summarizing YouTube Videos
  • Creating Mini-Apps with AI
  • Tasting an Award-Winning AI Short Film

GPT-Building, Agentic Workflow Design & Intelligent Content Curation — from drphilippahardman.substack.com by Dr. Philippa Hardman
What 3 recent job ads reveal about the changing nature of Instructional Design

In this week’s blog post, I’ll share my take on how the instructional design role is evolving and discuss what this means for our day-to-day work and the key skills it requires.

With this in mind, I’ve been keeping a close eye on open instructional design roles and, in the last 3 months, have noticed the emergence of a new flavour of instructional designer: the so-called “Generative AI Instructional Designer.”

Let’s deep dive into three explicitly AI-focused instructional design positions that have popped up in the last quarter. Each one illuminates a different aspect of how the role is changing—and together, they paint a picture of where our profession is likely heading.

Designers who evolve into prompt engineers, agent builders, and strategic AI advisors will capture the new premium. Those who cling to traditional tool-centric roles may find themselves increasingly sidelined—or automated out of relevance.


Google to Spend $1B on AI Training in Higher Ed — from insidehighered.com by Katherine Knott

Google’s parent company announced Wednesday (8/6/25) that it’s planning to spend $1 billion over the next three years to help colleges teach and train students about artificial intelligence.

Google is joining other AI companies, including OpenAI and Anthropic, in investing in AI training in higher education. All three companies have rolled out new tools aimed at supporting “deeper learning” among students and made their AI platforms available to certain students for free.


5 Predictions for How AI Will Impact Community Colleges — from pistis4edu.substack.com by Feng Hou

Based on current technology capabilities, adoption patterns, and the mission of community colleges, here are five well-supported predictions for AI’s impact in the coming years.

  1. Universal AI Tutor Access
  2. AI as Active Teacher
  3. Personalized Learning Pathways
  4. Interactive Multimodal Learning
  5. Value-Centric Education in an AI-Abundant World

 

One-size-fits-all learning is about to become completely obsolete. — from linkedin.com by Allie Miller


AI in the University: From Generative Assistant to Autonomous Agent This Fall — from insidehighered.com
This fall we are moving into the agentic generation of artificial intelligence.

“Where generative AI creates, agentic AI acts.” That’s how my trusted assistant, Gemini 2.5 Pro deep research, describes the difference.

Agents, unlike generative tools, create and perform multistep goals with minimal human supervision. The essential difference is found in its proactive nature. Rather than waiting for a specific, step-by-step command, agentic systems take a high-level objective and independently create and execute a plan to achieve that goal. This triggers a continuous, iterative workflow that is much like a cognitive loop. The typical agentic process involves six key steps, as described by Nvidia:
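The plan-and-execute loop described above can be sketched in a few lines. This is a generic illustration of the cognitive loop, not Nvidia's published six-step breakdown; the `plan`, `execute`, and `goal_satisfied` helpers are toy stand-ins for what would be LLM and tool calls in a real agent.

```python
# A minimal sketch of an agentic "cognitive loop": take a high-level
# goal, decompose it into steps, execute each step, and check whether
# the goal is satisfied. All helpers here are illustrative stand-ins.

def plan(goal: str) -> list[str]:
    # A real agent would ask an LLM to decompose the goal.
    return [f"research: {goal}", f"draft: {goal}", f"review: {goal}"]

def execute(step: str) -> str:
    # A real agent would call tools (search, code, APIs) here.
    return f"done({step})"

def goal_satisfied(results: list[str]) -> bool:
    # A real agent would evaluate results against the original goal.
    return len(results) >= 3

def run_agent(goal: str, max_iterations: int = 5) -> list[str]:
    results: list[str] = []
    for _ in range(max_iterations):          # iterative loop
        for step in plan(goal):              # plan
            results.append(execute(step))    # act
        if goal_satisfied(results):          # reflect / evaluate
            break
    return results

print(run_agent("summarize course feedback"))
```

The proactive character the article describes lives in that outer loop: no step-by-step human commands, just a goal and an iterate-until-done cycle.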


AI in Education Podcast — from aipodcast.education by Dan Bowen and Ray Fleming


The State of AI in Education 2025: Key Findings from a National Survey — from Carnegie Learning

Our 2025 national survey of over 650 respondents across 49 states and Puerto Rico reveals both encouraging trends and important challenges. While AI adoption and optimism are growing, concerns about cheating, privacy, and the need for training persist.

Despite these challenges, I’m inspired by the resilience and adaptability of educators. You are the true game-changers in your students’ growth, and we’re honored to support this vital work.

This report reflects both where we are today and where we’re headed with AI. More importantly, it reflects your experiences, insights, and leadership in shaping the future of education.


Instructure and OpenAI Announce Global Partnership to Embed AI Learning Experiences within Canvas — from instructure.com

This groundbreaking collaboration represents a transformative step forward in education technology and will begin with, but is not limited to, an effort between Instructure and OpenAI to enhance the Canvas experience by embedding OpenAI’s next-generation AI technology into the platform.

IgniteAI, announced earlier today, establishes Instructure’s future-ready, open ecosystem with agentic support as the AI landscape continues to evolve. This partnership with OpenAI exemplifies this bold vision for AI in education. Instructure’s strategic approach to AI emphasizes the enhancement of connections within an educational ecosystem comprising over 1,100 edtech partners and leading LLM providers.

“We’re committed to delivering next-generation LMS technologies designed with an open ecosystem that empowers educators and learners to adapt and thrive in a rapidly changing world,” said Steve Daly, CEO of Instructure. “This collaboration with OpenAI showcases our ambitious vision: creating a future-ready ecosystem that fosters meaningful learning and achievement at every stage of education. This is a significant step forward for the education community as we continuously amplify the learning experience and improve student outcomes.”


Faculty Latest Targets of Big Tech’s AI-ification of Higher Ed — from insidehighered.com by Kathryn Palmer
A new partnership between OpenAI and Instructure will embed generative AI in Canvas. It may make grading easier, but faculty are skeptical it will enhance teaching and learning.

The two companies, which have not disclosed the value of the deal, are also working together to embed large language models into Canvas through a feature called IgniteAI. It will work with an institution’s existing enterprise subscription to LLMs such as Anthropic’s Claude or OpenAI’s ChatGPT, allowing instructors to create custom LLM-enabled assignments. They’ll be able to tell the model how to interact with students—and even evaluate those interactions—and what it should look for to assess student learning. According to Instructure, any student information submitted through Canvas will remain private and won’t be shared with OpenAI.

Faculty Unsurprised, Skeptical
Few faculty were surprised by the Canvas-OpenAI partnership announcement, though many are reserving judgment until they see how the first year of using it works in practice.


 

From DSC:
In looking at MyNextChapter.ai, THIS TYPE OF FUNCTIONALITY — an AI-based chatbot talking with you about good fits for a future job — is the kind of thing that could work well in this type of vision/learning platform. The AI asks you relevant career-oriented questions, comes up with some potential job fits, and then gives you resources about how to gain those skills, who to talk with, organizations to join, next steps to get your foot in the door somewhere, etc.

The next gen learning platform would provide links to online-based courses, blogs, people’s names on LinkedIn, courses from L&D organizations or from institutions of higher education or from other entities/places to obtain those skills (similar to the “Action Plan” below from MyNextChapter.ai).

 

On blogging (again) — by Martin Weller

I also pondered what functions blogging has provided for me over the years.

  • Continuity – as an individual you persist across multiple organisations, roles and jobs. Although I stayed in one institution, I had many roles and the blog wasn’t associated with one specific project. Now I have left it continues.
  • Holistic – you can blog about one topic, but over time I think some personality will creep in. You are not just one thing, you have a personal life, tastes, interests etc which will all feed into what you do. A blog allows this more rounded representation.
  • Experimentation – there is relatively low cost and risk for much of it (this may not be the case for many people online, we need to acknowledge), so you can try things, and if they don’t work, so what? Also you can try formats that conventional outlets might not be appropriate for.
  • Development – the blog has been both an intentional and unintentional vehicle for working up ideas, documenting the process and getting feedback, which have led to more substantial outputs, such as books, project proposals and papers. Most importantly though it has been the means through which I have continually developed writing.
  • Connecting – particularly in those halcyon early days, it was a good way of finding others, working on ideas together, sharing something of yourself. A lot of my career related personal friendships have resulted from blogging.
  • Publicity – I became at one point (the OU crisis of 2018) something of a public voice of the OU, and have often used the blog for projects such as GO-GN.

That’s not a bad return for a lil’ ol’ blog. I couldn’t say the same for academic journals.

 

AI is rewiring how we learn, and it’s a game-changer for L&D — from chieflearningofficer.com by Josh Bersin
As AI becomes central to learner engagement, L&D leaders are being urged to fundamentally rethink corporate training, says global industry analyst Josh Bersin.

What are people really doing with ChatGPT? They’re learning. They’re asking questions, getting immediate answers, digging deeper, analyzing information and ultimately making themselves more productive. So, one could argue that simply by shifting to a “learn by inquiry” model, we may triple our value to the business.

From my experience, there are two main learning models in this industry. The first is “what you need to know”—linear or prescriptive things that every employee needs to understand about the company, its products and their role. This kind of content is well handled by existing L&D models.

The second, and far more important, is “what you’d like to know”—questions, curiosities and explorations about how the company works, what customers truly need and how we can each go further in our careers. Thanks to AI, this kind of learning is now explosive and transformative.

Imagine a sales rep who loses a deal. Naturally, they may ask, “What could I have done to be more successful?” A well-designed AI-powered learning system would take that question, give the employee an initial answer and chat with the individual to dig into the problem.

The system would then surface relevant sales training material and recommend videos, tips or case studies for help. And the employee, assuming they like the experience, would likely keep exploring until they feel they’ve learned what they need.

This “curiosity-based” learning is now possible, and its benefits extend far beyond traditional training.
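At its very simplest, the system Bersin describes could be approximated by matching an employee's question against a library of training assets. The catalog entries and keyword scoring below are invented for illustration; a production system would use an LLM or embeddings rather than raw keyword overlap.

```python
import re

# Toy sketch of "curiosity-based" content surfacing: score each asset
# in a hypothetical training catalog by keyword overlap with the
# learner's question, and return the best matches.

CATALOG = {
    "Handling pricing objections": {"deal", "pricing", "objection", "negotiation"},
    "Discovery call fundamentals": {"discovery", "questions", "customer", "needs"},
    "Closing techniques case study": {"deal", "close", "successful", "win"},
}

def recommend(question: str, top_n: int = 2) -> list[str]:
    # Tokenize the question, then rank catalog items by overlap.
    words = set(re.findall(r"[a-z]+", question.lower()))
    ranked = sorted(
        CATALOG.items(),
        key=lambda item: len(item[1] & words),
        reverse=True,
    )
    return [title for title, keywords in ranked if keywords & words][:top_n]

print(recommend("What could I have done to make this deal successful?"))
```

The sales-rep example maps directly: the lost-deal question surfaces the closing-techniques material first, because it shares the most vocabulary with the learner's own framing of the problem.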

 

“Using AI Right Now: A Quick Guide” [Mollick] + other items re: AI in our learning ecosystems

Thoughts on thinking — from dcurt.is by Dustin Curtis

Intellectual rigor comes from the journey: the dead ends, the uncertainty, and the internal debate. Skip that, and you might still get the insight–but you’ll have lost the infrastructure for meaningful understanding. Learning by reading LLM output is cheap. Real exercise for your mind comes from building the output yourself.

The irony is that I now know more than I ever would have before AI. But I feel slightly dumber. A bit more dull. LLMs give me finished thoughts, polished and convincing, but none of the intellectual growth that comes from developing them myself. 


Using AI Right Now: A Quick Guide — from oneusefulthing.org by Ethan Mollick
Which AIs to use, and how to use them

Every few months I put together a guide on which AI system to use. Since I last wrote my guide, however, there has been a subtle but important shift in how the major AI products work. Increasingly, it isn’t about the best model, it is about the best overall system for most people. The good news is that picking an AI is easier than ever and you have three excellent choices. The challenge is that these systems are getting really complex to understand. I am going to try and help a bit with both.

First, the easy stuff.

Which AI to Use
For most people who want to use AI seriously, you should pick one of three systems: Claude from Anthropic, Google’s Gemini, and OpenAI’s ChatGPT.

Also see:


Student Voice, Socratic AI, and the Art of Weaving a Quote — from elmartinsen.substack.com by Eric Lars Martinsen
How a custom bot helps students turn source quotes into personal insight—and share it with others

This summer, I tried something new in my fully online, asynchronous college writing course. These classes have no Zoom sessions. No in-person check-ins. Just students, Canvas, and a lot of thoughtful design behind the scenes.

One activity I created was called QuoteWeaver—a PlayLab bot that helps students do more than just insert a quote into their writing.

Try it here

It’s a structured, reflective activity that mimics something closer to an in-person 1:1 conference or a small group quote workshop—but in an asynchronous format, available anytime. In other words, it’s using AI not to speed students up, but to slow them down.

The bot begins with a single quote that the student has found through their own research. From there, it acts like a patient writing coach, asking open-ended, Socratic questions such as:

  • What made this quote stand out to you?
  • How would you explain it in your own words?
  • What assumptions or values does the author seem to hold?
  • How does this quote deepen your understanding of your topic?
It doesn’t move on too quickly. In fact, it often rephrases and repeats, nudging the student to go a layer deeper.


The Disappearance of the Unclear Question — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
New Piece for UNESCO Education Futures

On [6/13/25], UNESCO published a piece I co-authored with Victoria Livingstone at Johns Hopkins University Press. It’s called The Disappearance of the Unclear Question, and it’s part of the ongoing UNESCO Education Futures series – an initiative I appreciate for its thoughtfulness and depth on questions of generative AI and the future of learning.

Our piece raises a small but important red flag. Generative AI is changing how students approach academic questions, and one unexpected side effect is that unclear questions – for centuries a trademark of deep thinking – may be beginning to disappear. Not because they lack value, but because they don’t always work well with generative AI. Quietly and unintentionally, students (and teachers) may find themselves gradually avoiding them altogether.

Of course, that would be a mistake.

We’re not arguing against using generative AI in education. Quite the opposite. But we do propose that higher education needs a two-phase mindset when working with this technology: one that recognizes what AI is good at, and one that insists on preserving the ambiguity and friction that learning actually requires to be successful.




Leveraging GenAI to Transform a Traditional Instructional Video into Engaging Short Video Lectures — from er.educause.edu by Hua Zheng

By leveraging generative artificial intelligence to convert lengthy instructional videos into micro-lectures, educators can enhance efficiency while delivering more engaging and personalized learning experiences.


This AI Model Never Stops Learning — from link.wired.com by Will Knight

Researchers at Massachusetts Institute of Technology (MIT) have now devised a way for LLMs to keep improving by tweaking their own parameters in response to useful new information.

The work is a step toward building artificial intelligence models that learn continually—a long-standing goal of the field and something that will be crucial if machines are to ever more faithfully mimic human intelligence. In the meantime, it could give us chatbots and other AI tools that are better able to incorporate new information including a user’s interests and preferences.

The MIT scheme, called Self-Adapting Language Models (SEAL), involves having an LLM learn to generate its own synthetic training data and update procedure based on the input it receives.
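The shape of that idea — a model proposing edits to itself and keeping the ones that help — can be caricatured with a toy loop. Everything below (the "model" as a dict of weights, the candidate edits, the evaluation function) is invented to show the structure of such a procedure, not MIT's actual SEAL method, which operates on real LLM parameters and self-generated training data.

```python
import random

# Toy caricature of a self-adapting loop: the "model" proposes a
# candidate update to itself, applies it, and keeps it only if
# held-out performance improves. Here the model is just a dict of
# weights and "performance" is a made-up scoring function.

def evaluate(model: dict[str, float]) -> float:
    # Stand-in for held-out evaluation: reward weights near 1.0.
    return -sum((w - 1.0) ** 2 for w in model.values())

def propose_edit(model, rng):
    # Stand-in for the model generating its own update: nudge one weight.
    key = rng.choice(list(model))
    return key, model[key] + rng.uniform(-0.5, 0.5)

def self_adapt(model, steps=200, seed=0):
    rng = random.Random(seed)
    for _ in range(steps):
        key, new_value = propose_edit(model, rng)
        candidate = {**model, key: new_value}
        if evaluate(candidate) > evaluate(model):  # keep only helpful edits
            model = candidate
    return model

print(self_adapt({"a": 0.0, "b": 2.0}))
```

The continual-learning point carries over even in this cartoon: the update rule is part of the loop itself, so the system keeps absorbing "useful new information" rather than staying frozen after deployment.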


Edu-Snippets — from scienceoflearning.substack.com by Nidhi Sachdeva and Jim Hewitt
Why knowledge matters in the age of AI; What happens to learners’ neural activity with prolonged use of LLMs for writing

Highlights:

  • Offloading knowledge to Artificial Intelligence (AI) weakens memory, disrupts memory formation, and erodes the deep thinking our brains need to learn.
  • Prolonged use of ChatGPT in writing lowers neural engagement, impairs memory recall, and accumulates cognitive debt that isn’t easily reversed.
 

“The AI-enhanced learning ecosystem” [Jennings] + other items re: AI in our learning ecosystems

The AI-enhanced learning ecosystem: A case study in collaborative innovation — from chieflearningofficer.com by Kevin Jennings
How artificial intelligence can serve as a tool and collaborative partner in reimagining content development and management.

Learning and development professionals face unprecedented challenges in today’s rapidly evolving business landscape. According to LinkedIn’s 2025 Workplace Learning Report, 67 percent of L&D professionals report being “maxed out” on capacity, while 66 percent have experienced budget reductions in the past year.

Despite these constraints, 87 percent agree their organizations need to develop employees faster to keep pace with business demands. These statistics paint a clear picture of the pressure L&D teams face: do more, with less, faster.

This article explores how one L&D leader’s strategic partnership with artificial intelligence transformed these persistent challenges into opportunities, creating a responsive learning ecosystem that addresses the modern demands of rapid product evolution and diverse audience needs. With 71 percent of L&D professionals now identifying AI as a high or very high priority for their learning strategy, this case study demonstrates how AI can serve not merely as a tool but as a collaborative partner in reimagining content development and management.


How we use GenAI and AR to improve students’ design skills — from timeshighereducation.com by Antonio Juarez, Lesly Pliego and Jordi Rábago who are professors of architecture at Monterrey Institute of Technology in Mexico; Tomas Pachajoa is a professor of architecture at the El Bosque University in Colombia; & Carlos Hinrichsen and Marietta Castro are educators at San Sebastián University in Chile.
Guidance on using generative AI and augmented reality to enhance student creativity, spatial awareness and interdisciplinary collaboration

Blend traditional skills development with AI use
For subjects that require students to develop drawing and modelling skills, have students create initial design sketches or models manually to ensure they practise these skills. Then, introduce GenAI tools such as Midjourney, Leonardo AI and ChatGPT to help students explore new ideas based on their original concepts. Using AI at this stage broadens their creative horizons and introduces innovative perspectives, which are crucial in a rapidly evolving creative industry.

Provide step-by-step tutorials, including both written guides and video demonstrations, to illustrate how initial sketches can be effectively translated into AI-generated concepts. Offer example prompts to demonstrate diverse design possibilities and help students build confidence using GenAI.

Integrating generative AI and AR consistently enhanced student engagement, creativity and spatial understanding on our course. 


How Texas is Preparing Higher Education for AI — from the74million.org by Kate McGee
TX colleges are thinking about how to prepare students for a changing workforce and an already overburdened faculty for new challenges in classrooms.

“It doesn’t matter if you enter the health industry, banking, oil and gas, or national security enterprises like we have here in San Antonio,” Eighmy told The Texas Tribune. “Everybody’s asking for competency around AI.”

It’s one of the reasons the public university, which serves 34,000 students, announced earlier this year that it is creating a new college dedicated to AI, cyber security, computing and data science. The new college, which is still in the planning phase, would be one of the first of its kind in the country. UTSA wants to launch the new college by fall 2025.

But many state higher education leaders are thinking beyond that. As AI becomes a part of everyday life in new, unpredictable ways, universities across Texas and the country are also starting to consider how to ensure faculty are keeping up with the new technology and students are ready to use it when they enter the workforce.


In the Room Where It Happens: Generative AI Policy Creation in Higher Education — from er.educause.edu by Esther Brandon, Lance Eaton, Dana Gavin, and Allison Papini

To develop a robust policy for generative artificial intelligence use in higher education, institutional leaders must first create “a room” where diverse perspectives are welcome and included in the process.


Q&A: Artificial Intelligence in Education and What Lies Ahead — from usnews.com by Sarah Wood
Research indicates that AI is becoming an essential skill to learn for students to succeed in the workplace.

Q: How do you expect to see AI embraced more in the future in college and the workplace?
I do believe it’s going to become a permanent fixture for multiple reasons. I think the national security imperative associated with AI as a result of competing against other nations is going to drive a lot of energy and support for AI education. We also see shifts across every field and discipline regarding the usage of AI beyond college. We see this in a broad array of fields, including health care and the field of law. I think it’s here to stay and I think that means we’re going to see AI literacy being taught at most colleges and universities, and more faculty leveraging AI to help improve the quality of their instruction. I feel like we’re just at the beginning of a transition. In fact, I often describe our current moment as the ‘Ask Jeeves’ phase of the growth of AI. There’s a lot of change still ahead of us. AI, for better or worse, is here to stay.




AI-Generated Podcasts Outperform Textbooks in Landmark Education Study — from linkedin.com by David Borish

A new study from Drexel University and Google has demonstrated that AI-generated educational podcasts can significantly enhance both student engagement and learning outcomes compared to traditional textbooks. The research, involving 180 college students across the United States, represents one of the first systematic investigations into how artificial intelligence can transform educational content delivery in real-time.


What can we do about generative AI in our teaching?  — from linkedin.com by Kristina Peterson

So what can we do?

  • Interrogate the Process: We can ask ourselves whether we built in enough checkpoints: steps that can’t be faked, such as quick writes, question floods, in-person feedback, and revision logs.
  • Reframe AI: We can let students use AI as a partner. We can show them how to prompt better, revise harder, and build from it rather than submit it. Show them the difference between using a tool and being used by one.
  • Design Assignments for Curiosity, Not Compliance: Even the best of our assignments need to adapt. Mine needs more checkpoints, more reflective questions along the way, more explanation of why my students made the choices they did.

Teachers Are Not OK — from 404media.co by Jason Koebler

The response from teachers and university professors was overwhelming. In my entire career, I’ve rarely gotten so many email responses to a single article, and I have never gotten so many thoughtful and comprehensive responses.

One thing is clear: teachers are not OK.

In addition, universities are contracting with companies like Microsoft, Adobe, and Google for digital services, and those companies are constantly pushing their AI tools. So a student might hear “don’t use generative AI” from a prof but then log on to the university’s Microsoft suite, which then suggests using Copilot to sum up readings or help draft writing. It’s inconsistent and confusing.

I am sick to my stomach as I write this because I’ve spent 20 years developing a pedagogy that’s about wrestling with big ideas through writing and discussion, and that whole project has been evaporated by for-profit corporations who built their systems on stolen work. It’s demoralizing.

 

Another ‘shock’ is coming for American jobs — from washingtonpost.com by Heather Long. DSC: This is a gifted article
Millions of workers will need to shift careers. Our country is unprepared.

The United States is on the cusp of a massive economic shift due to AI, and it’s likely to cause greater change than anything President Donald Trump does in his second term. Much good can come from AI, but the country is unprepared to grapple with the need for millions — or perhaps tens of millions — of workers to shift jobs and entire careers.

“There’s a massive risk that entry-level, white-collar work could get automated. What does that do to career ladders?” asked Molly Kinder, a fellow at the Brookings Institution. Her research has found the jobs of marketing analysts are five times as likely to be replaced as those of marketing managers, and sales representative jobs are three times as likely to be replaced as those of sales managers.

Young people working in these jobs will need to be retrained, but it will be hard for them to invest in new career paths. Consider that many college graduates already carry a lot of debt (an average of about $30,000 for those who took student loans). What’s more, the U.S. unemployment insurance system covers only about 57 percent of unemployed workers and replaces only a modest amount of someone’s pay.

From DSC:
This is another reason why I think this vision here is at least a part of our future. We need shorter, less expensive credentials.

  • People don’t have the time to get degrees that take 2+ years to complete (after they have already gone through college once).
  • They don’t want to come out with more debt on their backs, and many already carry enough.
  • With inflation going back up, they won’t have as much money anyway.
 

From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?



 

Reflections on “Are You Ready for the AI University? Everything is about to change.” [Latham]

Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham

Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.

What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.

Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance and then, guided by human professors as they sort through the material, help AI understand the structure of the discipline, and then develop lectures, videos, supporting documentation, and assessments.

In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If you need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student going on a trip who wishes to take an exam on the plane will be able to log on and complete the AI-designed and administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.

Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom. 

From DSC:
I had a very difficult time deciding which excerpts to include, as there were so many more in this solid article worth thinking about. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.

Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him; as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of higher education. Scott asserts that some significant and upcoming impacts will be experienced by faculty members, doctoral students, and graduate/teaching assistants (and Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.

There are a few places where Scott and I differ.

  • The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned their 5-year, $100M ed push last year and one of the key conclusions was that people want to learn from — and with — other people:

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY

By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they would do.

  • Also, I would use the term learning preferences where Scott uses the term learning styles.

Scott also mentions:

“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”

It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).

That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.

So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!


Addendum later on 4/10/25:

U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain

Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.

The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.

The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.

 

Uplimit raises stakes in corporate learning with suite of AI agents that can train thousands of employees simultaneously — from venturebeat.com by Michael Nuñez

Uplimit unveiled a suite of AI-powered learning agents today designed to help companies rapidly upskill employees while dramatically reducing administrative burdens traditionally associated with corporate training.

The San Francisco-based company announced three sets of purpose-built AI agents that promise to change how enterprises approach learning and development: skill-building agents, program management agents, and teaching assistant agents. The technology aims to address the growing skills gap as AI advances faster than most workforces can adapt.

“There is an unprecedented need for continuous learning—at a scale and speed traditional systems were never built to handle,” said Julia Stiglitz, CEO and co-founder of Uplimit, in an interview with VentureBeat. “The companies best positioned to thrive aren’t choosing between AI and their people—they’re investing in both.”


Introducing Claude for Education — from anthropic.com

Today we’re launching Claude for Education, a specialized version of Claude tailored for higher education institutions. This initiative equips universities to develop and implement AI-enabled approaches across teaching, learning, and administration—ensuring educators and students play a key role in actively shaping AI’s role in society.

As part of announcing Claude for Education, we’re introducing:

  1. Learning mode: A new Claude experience that guides students’ reasoning process rather than providing answers, helping develop critical thinking skills
  2. University-wide Claude availability: Full campus access agreements with Northeastern University, London School of Economics and Political Science (LSE), and Champlain College, making Claude available to all students
  3. Academic partnerships: Joining Internet2 and working with Instructure to embed AI into teaching & learning with Canvas LMS
  4. Student programs: A new Claude Campus Ambassadors program along with an initiative offering API credits for student projects

A comment on this from The Rundown AI:

Why it matters: Education continues to grapple with AI, but Anthropic is flipping the script by making the tech a partner in developing critical thinking rather than an answer engine. While the controversy over its use likely isn’t going away, this generation of students will have access to the most personalized, high-quality learning tools ever.


Should College Graduates Be AI Literate? — from chronicle.com by Beth McMurtrie (behind a paywall)
More institutions are saying yes. Persuading professors is only the first barrier they face.

Last fall one of Jacqueline Fajardo’s students came to her office, eager to tell her about an AI tool that was helping him learn general chemistry. Had she heard of Google NotebookLM? He had been using it for half a semester in her honors course. He confidently showed her how he could type in the learning outcomes she posted for each class and the tool would produce explanations and study guides. It even created a podcast based on an academic paper he had uploaded. He did not feel it was important to take detailed notes in class because the AI tool was able to summarize the key points of her lectures.


Showing Up for the Future: Why Educators Can’t Sit Out the AI Conversation — from marcwatkins.substack.com with a guest post from Lew Ludwig

The Risk of Disengagement
Let’s be honest: most of us aren’t jumping headfirst into AI. At many of our institutions, it’s not a gold rush—it’s a quiet standoff. But the group I worry most about isn’t the early adopters. It’s the faculty who’ve decided to opt out altogether.

That choice often comes from a place of care. Concerns about data privacy, climate impact, exploitative labor, and the ethics of using large language models are real—and important. But choosing not to engage at all, even on ethical grounds, doesn’t remove us from the system. It just removes our voices from the conversation.

And without those voices, we risk letting others—those with very different priorities—make the decisions that shape what AI looks like in our classrooms, on our campuses, and in our broader culture of learning.



Turbocharge Your Professional Development with AI — from learningguild.com by Dr. RK Prasad

You’ve just mastered a few new eLearning authoring tools, and now AI is knocking on the door, offering to do your job faster, smarter, and without needing coffee breaks. Should you be worried? Or excited?

If you’re a Learning and Development (L&D) professional today, AI is more than just a buzzword—it’s transforming the way we design, deliver, and measure corporate training. But here’s the good news: AI isn’t here to replace you. It’s here to make you better at what you do.

The challenge is to harness its potential to build digital-ready talent, not just within your organization but within yourself.

Let’s explore how AI is reshaping L&D strategies and how you can leverage it for professional development.


5 Recent AI Notables — from automatedteach.com by Graham Clay

1. OpenAI’s New Image Generator
What Happened: OpenAI integrated a much more powerful image generator directly into GPT-4o, making it the default image creator in ChatGPT. Unlike previous image models, this one excels at accurately rendering text in images, precise visualization of diagrams/charts, and multi-turn image refinement through conversation.

Why It’s Big: For educators, this represents a significant advancement in creating educational visuals, infographics, diagrams, and other instructional materials with unprecedented accuracy and control. It’s not perfect, but you can now quickly generate custom illustrations that accurately display mathematical equations, chemical formulas, or process workflows — previously a significant hurdle in digital content creation — without requiring graphic design expertise or expensive software. This capability dramatically reduces the time between conceptualizing a visual aid and implementing it in course materials.


The 4 AI modes that will supercharge your workflow — from aiwithallie.beehiiv.com by Allie K. Miller
The framework most people and companies won’t discover until 2026


 

Investigating Informal Learning with Technology — from learningguild.com by Katie Belle (Curry) Nelson

Informal learning is having a moment right now, and it’s about time.

As learning professionals, we can often get caught up in designing, developing, and implementing formal learning experiences, which can cause informal learning to fall to the wayside and easily be overlooked. However, informal learning experiences can have major, long-term effects on learning and business outcomes, so finding creative ways to track them can be valuable for L&D departments.

Start small
These three methods are small steps toward understanding the informal learning environment and its impact on your organization. The reassuring thing about informal learning is that you can start small and incorporate more methods later, because informal learning is always taking place. Start with one area and begin to explore what you can find out about the content your learners want to know more about, how they are learning, and how others in the organization are solving problems.

 
© 2025 | Daniel Christian