Thoughts on thinking — from dcurt.is by Dustin Curtis
Intellectual rigor comes from the journey: the dead ends, the uncertainty, and the internal debate. Skip that, and you might still get the insight, but you’ll have lost the infrastructure for meaningful understanding. Learning by reading LLM output is cheap. Real exercise for your mind comes from building the output yourself.
The irony is that I now know more than I ever would have before AI. But I feel slightly dumber. A bit more dull. LLMs give me finished thoughts, polished and convincing, but none of the intellectual growth that comes from developing them myself.
Using AI Right Now: A Quick Guide — from oneusefulthing.org by Ethan Mollick
Which AIs to use, and how to use them
Every few months I put together a guide on which AI system to use. Since I last wrote my guide, however, there has been a subtle but important shift in how the major AI products work. Increasingly, it isn’t about the best model, it is about the best overall system for most people. The good news is that picking an AI is easier than ever and you have three excellent choices. The challenge is that these systems are getting really complex to understand. I am going to try and help a bit with both.
First, the easy stuff.
Which AI to Use
For most people who want to use AI seriously, you should pick one of three systems: Claude from Anthropic, Google’s Gemini, and OpenAI’s ChatGPT.
Also see:
Student Voice, Socratic AI, and the Art of Weaving a Quote — from elmartinsen.substack.com by Eric Lars Martinsen
How a custom bot helps students turn source quotes into personal insight—and share it with others
This summer, I tried something new in my fully online, asynchronous college writing course. These classes have no Zoom sessions. No in-person check-ins. Just students, Canvas, and a lot of thoughtful design behind the scenes.
One activity I created was called QuoteWeaver—a PlayLab bot that helps students do more than just insert a quote into their writing.
It’s a structured, reflective activity that mimics something closer to an in-person 1:1 conference or a small group quote workshop—but in an asynchronous format, available anytime. In other words, it’s using AI not to speed students up, but to slow them down.
…
The bot begins with a single quote that the student has found through their own research. From there, it acts like a patient writing coach, asking open-ended, Socratic questions such as:
What made this quote stand out to you?
How would you explain it in your own words?
What assumptions or values does the author seem to hold?
How does this quote deepen your understanding of your topic?
It doesn’t move on too quickly. In fact, it often rephrases and repeats, nudging the student to go a layer deeper.
The Disappearance of the Unclear Question — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
New Piece for UNESCO Education Futures
On [6/13/25], UNESCO published a piece I co-authored with Victoria Livingstone at Johns Hopkins University Press. It’s called The Disappearance of the Unclear Question, and it’s part of the ongoing UNESCO Education Futures series – an initiative I appreciate for its thoughtfulness and depth on questions of generative AI and the future of learning.
Our piece raises a small but important red flag. Generative AI is changing how students approach academic questions, and one unexpected side effect is that unclear questions – for centuries a trademark of deep thinking – may be beginning to disappear. Not because they lack value, but because they don’t always work well with generative AI. Quietly and unintentionally, students (and teachers) may find themselves gradually avoiding them altogether.
Of course, that would be a mistake.
We’re not arguing against using generative AI in education. Quite the opposite. But we do propose that higher education needs a two-phase mindset when working with this technology: one that recognizes what AI is good at, and one that insists on preserving the ambiguity and friction that learning actually requires to be successful.
Leveraging GenAI to Transform a Traditional Instructional Video into Engaging Short Video Lectures — from er.educause.edu by Hua Zheng
By leveraging generative artificial intelligence to convert lengthy instructional videos into micro-lectures, educators can enhance efficiency while delivering more engaging and personalized learning experiences.
This AI Model Never Stops Learning — from link.wired.com by Will Knight
Researchers at Massachusetts Institute of Technology (MIT) have now devised a way for LLMs to keep improving by tweaking their own parameters in response to useful new information.
The work is a step toward building artificial intelligence models that learn continually—a long-standing goal of the field and something that will be crucial if machines are to ever more faithfully mimic human intelligence. In the meantime, it could give us chatbots and other AI tools that are better able to incorporate new information including a user’s interests and preferences.
The MIT scheme, called Self Adapting Language Models (SEAL), involves having an LLM learn to generate its own synthetic training data and update procedure based on the input it receives.
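The description above amounts to an outer loop: the model turns incoming information into its own synthetic training examples, then updates itself on them. The toy Python below sketches only that loop shape; the function names and the dict-of-facts "model" are stand-ins of mine, not the MIT implementation (a real SEAL step fine-tunes the LLM's weights and uses reinforcement learning to score how useful each self-generated update was).

```python
# A toy sketch of the SEAL-style loop described above -- not the MIT code.
# Here the "model" is just a dict mapping remembered strings to strengths;
# the real system generates synthetic text and fine-tunes LLM parameters.

def generate_self_edit(model, new_info):
    """Hypothetical stand-in: the model restates new information as
    synthetic training examples (here, trivial paraphrase variants)."""
    return [new_info, new_info.lower(), f"fact: {new_info}"]

def apply_update(model, self_edit):
    """Stand-in for a fine-tuning step: strengthen each synthetic example."""
    for example in self_edit:
        model[example] = model.get(example, 0) + 1
    return model

def seal_step(model, new_info):
    """One outer-loop step: generate synthetic data, then update on it."""
    self_edit = generate_self_edit(model, new_info)
    return apply_update(model, self_edit)

model = {}
model = seal_step(model, "Paris is the capital of France")
print(len(model))  # prints 3: the model absorbed three synthetic examples
```

The point of the sketch is the division of labor: the same model both proposes its own training data and is changed by it, which is what distinguishes SEAL from ordinary fine-tuning on externally supplied data.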
Edu-Snippets — from scienceoflearning.substack.com by Nidhi Sachdeva and Jim Hewitt
Why knowledge matters in the age of AI; What happens to learners’ neural activity with prolonged use of LLMs for writing
Highlights:
- Offloading knowledge to Artificial Intelligence (AI) weakens memory, disrupts memory formation, and erodes the deep thinking our brains need to learn.
- Prolonged use of ChatGPT in writing lowers neural engagement, impairs memory recall, and accumulates cognitive debt that isn’t easily reversed.
The Memory Paradox: Why Our Brains Need Knowledge in an Age of AI — from papers.ssrn.com by Barbara Oakley, Michael Johnston, Kenzen Chen, Eulho Jung, and Terrence Sejnowski; via George Siemens
Abstract
In an era of generative AI and ubiquitous digital tools, human memory faces a paradox: the more we offload knowledge to external aids, the less we exercise and develop our own cognitive capacities. This chapter offers the first neuroscience-based explanation for the observed reversal of the Flynn Effect—the recent decline in IQ scores in developed countries—linking this downturn to shifts in educational practices and the rise of cognitive offloading via AI and digital tools. Drawing on insights from neuroscience, cognitive psychology, and learning theory, we explain how underuse of the brain’s declarative and procedural memory systems undermines reasoning, impedes learning, and diminishes productivity. We critique contemporary pedagogical models that downplay memorization and basic knowledge, showing how these trends erode long-term fluency and mental flexibility. Finally, we outline policy implications for education, workforce development, and the responsible integration of AI, advocating strategies that harness technology as a complement to – rather than a replacement for – robust human knowledge.
Keywords
cognitive offloading, memory, neuroscience of learning, declarative memory, procedural memory, generative AI, Flynn Effect, education reform, schemata, digital tools, cognitive load, cognitive architecture, reinforcement learning, basal ganglia, working memory, retrieval practice, schema theory, manifolds
AI & Schools: 4 Ways Artificial Intelligence Can Help Students — from the74million.org by W. Ian O’Byrne
AI creates potential for more personalized learning
I am a literacy educator and researcher, and here are four ways I believe these kinds of systems can be used to help students learn.
- Differentiated instruction
- Intelligent textbooks
- Improved assessment
- Personalized learning
5 Skills Kids (and Adults) Need in an AI World — from oreilly.com by Raffi Krikorian
Hint: Coding Isn’t One of Them
Five Essential Skills Kids Need (More than Coding)
I’m not saying we shouldn’t teach kids to code. It’s a useful skill. But these are the five true foundations that will serve them regardless of how technology evolves.
- Loving the journey, not just the destination
- Being a question-asker, not just an answer-getter
- Trying, failing, and trying differently
- Seeing the whole picture
- Walking in others’ shoes
The AI moment is now: Are teachers and students ready? — from iblnews.org
Day of AI Australia hosted a panel discussion on 20 May, 2025. Hosted by Dr Sebastian Sequoiah-Grayson (Senior Lecturer in the School of Computer Science and Engineering, UNSW Sydney) with panel members Katie Ford (Industry Executive – Higher Education at Microsoft), Tamara Templeton (Primary School Teacher, Townsville), Sarina Wilson (Teaching and Learning Coordinator – Emerging Technology at NSW Department of Education) and Professor Didar Zowghi (Senior Principal Research Scientist at CSIRO’s Data61).
Teachers using AI tools more regularly, survey finds — from iblnews.org
As many students face criticism and punishment for using artificial intelligence tools like ChatGPT for assignments, new reporting shows that many instructors are increasingly using those same programs.
Addendum on 5/28/25:
A Museum of Real Use: The Field Guide to Effective AI Use — from mikekentz.substack.com by Mike Kentz
Six Educators Annotate Their Real AI Use—and a Method Emerges for Benchmarking the Chats
Our next challenge is to self-analyze and develop meaningful benchmarks for AI use across contexts. This research exhibit aims to take the first major step in that direction.
With the right approach, a transcript becomes something else:
- A window into student decision-making
- A record of how understanding evolves
- A conversation that can be interpreted and assessed
- An opportunity to evaluate content understanding
This week, I’m excited to share something that brings that idea into practice.
Over time, I imagine a future where annotated transcripts are collected and curated. Schools and universities could draw from a shared library of real examples—not polished templates, but genuine conversations that show process, reflection, and revision. These transcripts would live not as static samples but as evolving benchmarks.
This Field Guide is the first move in that direction.
‘What I learned when students walked out of my AI class’ — from timeshighereducation.com by Chris Hogg
Chris Hogg found the question of using AI to create art troubled his students deeply. Here’s how the moment led to deeper understanding for both student and educator
Teaching AI can be as thrilling as it is challenging. This became clear one day when three students walked out of my class, visibly upset. They later explained their frustration: after spending years learning their creative skills, they were disheartened to see AI effortlessly outperform them in the blink of an eye.
This moment stuck with me – not because it was unexpected, but because it encapsulates the paradoxical relationship we all seem to have with AI. As both an educator and a creative, I find myself asking: how do we engage with this powerful tool without losing ourselves in the process? This is the story of how I turned moments of resistance into opportunities for deeper understanding.
In the AI era, how do we battle cognitive laziness in students? — from timeshighereducation.com by Sean McMinn
With the latest AI technology now able to handle complex problem-solving processes, will students risk losing their own cognitive engagement? Metacognitive scaffolding could be the answer, writes Sean McMinn
The concern about cognitive laziness seems to be backed by Anthropic’s report that students use AI tools like Claude primarily for creating (39.8 per cent) and analysing (30.2 per cent) tasks, both considered higher-order cognitive functions according to Bloom’s Taxonomy. While these tasks align well with advanced educational objectives, they also pose a risk: students may increasingly delegate critical thinking and complex cognitive processes directly to AI, risking a reduction in their own cognitive engagement and skill development.
Make Instructional Design Fun Again with AI Agents — from drphilippahardman.substack.com by Dr. Philippa Hardman
A special edition practical guide to selecting & building AI agents for instructional design and L&D
Exactly how we do this has been less clear, but — fuelled by the rise of so-called “Agentic AI” — more and more instructional designers ask me: “What exactly can I delegate to AI agents, and how do I start?”
In this week’s post, I share my thoughts on exactly what instructional design tasks can be delegated to AI agents, and provide a step-by-step approach to building and testing your first AI agent.
Here’s a sneak peek…
AI Personality Matters: Why Claude Doesn’t Give Unsolicited Advice (And Why You Should Care) — from mikekentz.substack.com by Mike Kentz
First in a four-part series exploring the subtle yet profound differences between AI systems and their impact on human cognition
After providing Claude with several prompts of context about my creative writing project, I requested feedback on one of my novel chapters. The AI provided thoughtful analysis with pros and cons, as expected. But then I noticed what wasn’t there: the customary offer to rewrite my chapter.
…
Without Claude’s prompting, I found myself in an unexpected moment of metacognition. When faced with improvement suggestions but no offer to implement them, I had to consciously ask myself: “Do I actually want AI to rewrite this section?” The answer surprised me – no, I wanted to revise it myself, incorporating the insights while maintaining my voice and process.
The contrast was striking. With ChatGPT, accepting its offer to rewrite felt like a passive, almost innocent act – as if I were just saying “yes” to a helpful assistant. But with Claude, requesting a rewrite required deliberate action. Typing out the request felt like a more conscious surrender of creative agency.
Also re: metacognition and AI, see:
In the AI era, how do we battle cognitive laziness in students? — from timeshighereducation.com by Sean McMinn
With the latest AI technology now able to handle complex problem-solving processes, will students risk losing their own cognitive engagement? Metacognitive scaffolding could be the answer, writes Sean McMinn
By prompting students to articulate their cognitive processes, such tools reinforce the internalisation of self-regulated learning strategies essential for navigating AI-augmented environments.
EDUCAUSE Panel Highlights Practical Uses for AI in Higher Ed — from govtech.com by Abby Sourwine
A webinar this week featuring panelists from the education, private and nonprofit sectors attested to how institutions are applying generative artificial intelligence to advising, admissions, research and IT.
Many higher education leaders have expressed hope about the potential of artificial intelligence but uncertainty about where to implement it safely and effectively. According to a webinar Tuesday hosted by EDUCAUSE, “Unlocking AI’s Potential in Higher Education,” their answer may be “almost everywhere.”
Panelists at the event, including Kaskaskia College CIO George Kriss, Canyon GBS founder and CEO Joe Licata and Austin Laird, a senior program officer at the Gates Foundation, said generative AI can help colleges and universities meet increasing demands for personalization, timely communication and human-to-human connections throughout an institution, from advising to research to IT support.
Partly Cloudy with a Chance of Chatbots — from derekbruff.org by Derek Bruff
Here are the predictions, our votes, and some commentary:
- “By 2028, at least half of large universities will embed an AI ‘copilot’ inside their LMS that can draft content, quizzes, and rubrics on demand.” The group leaned toward yes on this one, in part because it was easy to see LMS vendors building this feature in as a default.
- “Discipline-specific ‘digital tutors’ (LLM chatbots trained on course materials) will handle at least 30% of routine student questions in gateway courses.” We leaned toward yes on this one, too, which is why some of us are exploring these tools today. We would like to know how to use them well (or how to avoid their use) when they are commonly available.
- “Adaptive e-texts whose examples, difficulty, and media personalize in real time via AI will outsell static digital textbooks in the U.S. market.” We leaned toward no on this one, in part because the textbook market and what students want from textbooks have historically been slow to change. I remember offering my students a digital version of my statistics textbook maybe 6-7 years ago, and most students opted to print the whole thing out on paper like it was 1983.
- “AI text detectors will be largely abandoned as unreliable, shifting assessment design toward oral, studio, or project-based ‘AI-resilient’ tasks.” We leaned toward yes on this. I have some concerns about oral assessments (they certainly privilege some students over others), but more authentic assignments seem like what higher ed needs in the face of AI. Ted Underwood recently suggested a version of this: “projects that attempt genuinely new things, which remain hard even with AI assistance.” See his post and the replies for some good discussion on this idea.
- “AI will produce multimodal accessibility layers (live translation, alt-text, sign-language avatars) for most lecture videos without human editing.” We leaned toward yes on this one, too. This seems like another case where something will be provided by default, although my podcast transcripts are AI-generated and still need editing from me, so we’re not there quite yet.
‘We Have to Really Rethink the Purpose of Education’
The Ezra Klein Show
Description: I honestly don’t know how I should be educating my kids. A.I. has raised a lot of questions for schools. Teachers have had to adapt to the most ingenious cheating technology ever devised. But for me, the deeper question is: What should schools be teaching at all? A.I. is going to make the future look very different. How do you prepare kids for a world you can’t predict?
And if we can offload more and more tasks to generative A.I., what’s left for the human mind to do?
Rebecca Winthrop is the director of the Center for Universal Education at the Brookings Institution. She is also an author, with Jenny Anderson, of “The Disengaged Teen: Helping Kids Learn Better, Feel Better, and Live Better.” We discuss how A.I. is transforming what it means to work and be educated, and how our use of A.I. could revive — or undermine — American schools.
2025 EDUCAUSE Horizon Report | Teaching and Learning Edition — from library.educause.edu
Higher education is in a period of massive transformation and uncertainty. Not only are current events impacting how institutions operate, but technological advancements—particularly in AI and virtual reality—are reshaping how students engage with content, how cognition is understood, and how learning itself is documented and valued.
Our newly released 2025 EDUCAUSE Horizon Report | Teaching and Learning Edition captures the spirit of this transformation and how you can respond with confidence through the lens of emerging trends, key technologies and practices, and scenario-based foresight.
AI agents arrive in US classrooms — from zdnet.com by Radhika Rajkumar
Kira AI’s personalized learning platform is currently being implemented in Tennessee schools. How will it change education?
AI for education is a new but rapidly expanding field. Can it support student outcomes and help teachers avoid burnout?
On Wednesday, AI education company Kira launched a “fully AI-native learning platform” for K-12 education, complete with agents to assist teachers with repetitive tasks. The platform hosts assignments, analyzes progress data, offers administrative assistance, helps build lesson plans and quizzes, and more.
“Unlike traditional tools that merely layer AI onto existing platforms, Kira integrates artificial intelligence directly into every educational workflow — from lesson planning and instruction to grading, intervention, and reporting,” the release explains. “This enables schools to improve student outcomes, streamline operations, and provide personalized support at scale.”
Also relevant/see:
Coursera Founder Andrew Ng’s New Venture Brings A.I. to K–12 Classrooms — from observer.com by Victor Dey
Andrew Ng’s Kira Learning uses A.I. agents to transform K–12 education with tools for teachers, students and administrators.
“Teachers today are overloaded with repetitive tasks. A.I. agents can change that, and free up their time to give more personalized help to students,” Ng said in a statement.
Kira was co-founded by Andrea Pasinetti and Jagriti Agrawal, both longtime collaborators of Ng. The platform embeds A.I. directly into lesson planning, instruction, grading and reporting. Teachers can instantly generate standards-aligned lesson plans, monitor student progress in real time and receive automated intervention strategies when a student falls behind.
Students, in turn, receive on-demand tutoring tailored to their learning styles. A.I. agents adapt to each student’s pace and mastery level, while grading is automated with instant feedback—giving educators time to focus on teaching.
‘Using GenAI is easier than asking my supervisor for support’ — from timeshighereducation.com
Doctoral researchers are turning to generative AI to assist in their research. How are they using it, and how can supervisors and candidates have frank discussions about using it responsibly?
Generative AI is increasingly the proverbial elephant in the supervisory room. As supervisors, you may be concerned about whether your doctoral researchers are using GenAI. It can be a tricky topic to broach, especially when you may not feel confident in understanding the technology yourself.
While the potential impact of GenAI use among undergraduate and postgraduate taught students, especially, is well discussed (and it is increasingly accepted that students and staff need to become “AI literate”), doctoral researchers often slip through the cracks in institutional guidance and policymaking.
AI as a Thought Partner in Higher Education — from er.educause.edu by Brian Basgen
When used thoughtfully and transparently, generative artificial intelligence can augment creativity and challenge assumptions, making it an excellent tool for exploring and developing ideas.
…
The glaring contrast between the perceived ubiquity of GenAI and its actual use also reveals fundamental challenges associated with the practical application of these tools. This article explores two key questions about GenAI to address common misconceptions and encourage broader adoption and more effective use of these tools in higher education.
AI for Automation or Augmentation of L&D? — from drphilippahardman.substack.com by Dr. Philippa Hardman
An audio summary of my Learning Technologies talk
Like many of you, I spent the first part of this week at Learning Technologies in London, where I was lucky enough to present a session on the current state of AI and L&D.
In this week’s blog post, I summarise what I covered and share an audio summary of my paper for you to check out.
Bridging the AI Trust Gap — from chronicle.com by Ian Wilhelm, Derek Bruff, Gemma Garcia, and Lee Rainie
In a 2024 Chronicle survey, 86 percent of administrators agreed with the statement: “Generative artificial intelligence tools offer an opportunity for higher education to improve how it educates, operates, and conducts research.” In contrast, just 55 percent of faculty agreed, showing the stark divisions between faculty and administrative perspectives on adopting AI.
Among many faculty members, a prevalent distrust of AI persists — and for valid reasons. How will it impact in-class instruction? What does the popularity of generative AI tools portend for the development of critical thinking skills for Gen-Z students? How can institutions, at the administrative level, develop policies to safeguard against students using these technologies as tools for cheating?
Given this increasing ‘trust gap,’ how can faculty and administrators work together to preserve academic integrity as AI seeps into all areas of academia, from research to the classroom?
Join us for “Bridging the AI Trust Gap,” an extended, 75-minute Virtual Forum exploring the trust gap on campus about AI, the contours of the differences, and what should be done about it.
Undergraduate Degree Earners for Academic Year 2023-24 — from nscresearchcenter.org; via Ryan Craig
The number of learners earning certificates continued its record-breaking growth, reaching a new 10-year high for the third consecutive year. Both first-time certificate earners (+12.6%, +41,500) and those with a prior award returning to earn a certificate (+8.0%, +11,500) saw significant increases.
In contrast, both bachelor’s degree and associate degree earners declined for the third consecutive year. Fewer students earned an associate degree this year than in any of the last ten years, and bachelor’s degree earners declined to their lowest level since 2015-16. As a result of ongoing certificate growth and associate and bachelor’s decline, the proportion of first-time completers who earn a certificate has risen from about 1 in 9 (11.3%) in 2014-15 to about 1 in 7 (15.4%) in 2023-24.
The 2023-24 academic year marks the first time that certificate completers aged 24 and younger outnumbered those 25 and older. Certificate completers 18-20 years old grew by 19,400 (17.8%) and those under 18 (likely dual enrolled high school students) grew by 7,100 (27.2%) in 2023-24.
Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham
Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.
…
What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.
…
Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance and then, guided by human professors as they sort through the material, come to understand the structure of the discipline and develop lectures, videos, supporting documentation, and assessments.
…
In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If students need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student going on a trip who wishes to take an exam on the plane will be able to log on and complete the AI-designed and administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.
Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom.
From DSC:
I had a very difficult time deciding which excerpts to include; there was so much more in this solid article to think about. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.
Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him and, as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of what higher education may look like. Scott asserts that some significant and upcoming impacts will be experienced by faculty members, doctoral students, and graduate/teaching assistants (and Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.
There are a few places where Scott and I differ.
- The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned their 5-year, $100M ed push last year and one of the key conclusions was that people want to learn from — and with — other people:
To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.”
Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”
— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY
By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they could do.
- Also, I would use the term learning preferences where Scott uses the term learning styles.
Scott also mentions:
“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”
It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).
That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.
So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!
Addendum later on 4/10/25:
U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain
Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.
The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.
The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.
8 Weeks Left to Prepare Students for the AI-Enhanced Workplace — from insidehighered.com by Ray Schroeder
We are down to the final weeks left to fully prepare students for entry into the AI-enhanced workplace. Are your students ready?
The urgent task facing those of us who teach and advise students, whether they be degree program or certificate seeking, is to ensure that they are prepared to enter (or re-enter) the workplace with skills and knowledge that are relevant to 2025 and beyond. One of the first skills to cultivate is an understanding of what kinds of services this emerging technology can provide to enhance the worker’s productivity and value to the institution or corporation.
…
Given that short period of time, coupled with the need to cover the scheduled information in the syllabus, I recommend that we consider merging AI use into authentic assignments and assessments, supplementary modules, and other resources to prepare for AI.
Learning Design in the Era of Agentic AI — from drphilippahardman.substack.com by Dr Philippa Hardman
Aka, how to design online async learning experiences that learners can’t afford to delegate to AI agents
The point I put forward was that the problem is not AI’s ability to complete online async courses, but that online async courses deliver so little value to our learners that they delegate their completion to AI.
The harsh reality is that this is not an AI problem — it is a learning design problem.
However, this realisation presents us with an opportunity that, on the whole, we seem keen to embrace. Rather than seeking out ways to block AI agents, we seem largely to agree that we should use this as a moment to reimagine online async learning itself.
8 Schools Innovating With Google AI — Here’s What They’re Doing — from forbes.com by Dan Fitzpatrick
While fears of AI replacing educators swirl in the public consciousness, a cohort of pioneering institutions is demonstrating a far more nuanced reality. These eight universities and schools aren’t just experimenting with AI, they’re fundamentally reshaping their educational ecosystems. From personalized learning in K-12 to advanced research in higher education, these institutions are leveraging Google’s AI to empower students, enhance teaching, and streamline operations.
Essential AI tools for better work — from wondertools.substack.com by Jeremy Caplan
My favorite tactics for making the most of AI — a podcast conversation
AI tools I consistently rely on (areas covered below)
- Research and analysis
- Communication efficiency
- Multimedia creation
AI tactics that work surprisingly well
1. Reverse interviews
Instead of just querying the AI, have it interview you. Give it a little context about what you’re focusing on and what you’re interested in, and then ask it to interview you to elicit your own insights.
This approach helps extract knowledge from yourself, not just from the AI. Sometimes we need that guide to pull ideas out of ourselves.
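The reverse-interview tactic boils down to a prompt that flips the usual roles. Here is a minimal sketch of how such a prompt could be composed; the function name, wording, and fields are illustrative assumptions, not something prescribed in the article, and the resulting string would be sent to whichever chat-based AI you use.

```python
# A minimal sketch of a "reverse interview" prompt template.
# The template wording and parameter names are hypothetical
# illustrations, not taken from the article.

def build_reverse_interview_prompt(topic: str, goal: str) -> str:
    """Compose a prompt that asks the model to interview the user."""
    return (
        "You are an interviewer, not an answer engine.\n"
        f"Context: I am working on {topic}. My goal is to {goal}.\n"
        "Ask me one focused question at a time to draw out my own "
        "insights. Do not answer for me; follow up on what I say."
    )

prompt = build_reverse_interview_prompt(
    topic="a newsletter about AI in education",
    goal="find my most original angle",
)
print(prompt)
```

The key design choice is the explicit constraint ("do not answer for me"), which keeps the model in the interviewer role instead of drifting back into generating finished answers.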
AI in K12: Today’s Breakthroughs and Tomorrow’s Possibilities (webinar)
How AI is Transforming Classrooms Today and What’s Next
Audio-Based Learning 4.0 — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new & powerful way to leverage AI for learning?
At the end of all of this my reflection is that the research paints a pretty exciting picture – audio-based learning isn’t just effective, it’s got some unique superpowers when it comes to boosting comprehension, ramping up engagement, and delivering feedback that really connects with learners.
While audio has been massively under-used as a mode of learning, especially compared to video and text, we’re at an interesting turning point where AI tools are making it easier than ever to tap into audio’s potential as a pedagogical tool.
What’s super interesting is how solid the research backing audio’s effectiveness is, and how well it converges with these new AI capabilities.
From DSC:
I’ve noticed that I don’t learn as well via audio-only events. It can help if visuals are also provided, but I have to watch the cognitive load. My processing can start to get overloaded — to the point that I have to close my eyes and just listen sometimes. But there are people I know who love to listen to audiobooks and prefer to learn that way. They can devour content and process/remember it all. Audio is a nice change of pace at times, but oftentimes I prefer visuals and reading. It needs to be absolutely quiet if I’m tackling some new information/learning.
In Conversation With… Ashton Cousineau — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new video series exploring how L&D professionals are working with AI on the ground
The Learning Research Digest vol. 28 — from learningsciencedigest.substack.com by Dr. Philippa Hardman
Hot Off the Research Press This Month:
- AI-Infused Learning Design – A structured approach to AI-enhanced assignments using a three-step model for AI integration.
- Mathematical Dance and Creativity in STEAM – Using AI-powered motion capture to translate dance movements into mathematical models.
- AI-Generated Instructional Videos – How adaptive AI-powered video learning enhances problem-solving and knowledge retention.
- Immersive Language Learning with XR & AI – A new framework for integrating AI-driven conversational agents with Extended Reality (XR) for task-based language learning.
- Decision-Making in Learning Design – A scoping review on how instructional designers navigate complex instructional choices and make data-driven decisions.
- Interactive E-Books and Engagement – Examining the impact of interactive digital books on student motivation, comprehension, and cognitive engagement.
- Elevating Practitioner Voices in Instructional Design – A new initiative to amplify instructional designers’ contributions to research and innovation.
Deep Reasoning, Agentic AI & the Continued Rise of Specialised AI Research & Tools for Education — from learningfuturesdigest.substack.com by Dr. Philippa Hardman
Here’s a quick teaser of key developments in the world of AI & learning this month:
- DeepSeek R-1, OpenAI’s Deep Research & Perplexity’s ‘Deep Research’ are the latest additions to a growing number of “reasoning models” with interesting implications for evidence-based learning design & development.
- The U.S. Education Dept released an AI Toolkit and a fresh policy roadmap supporting the adoption of AI in schools.
- Anthropic released “Agentic Claude”, another AI agent that clicks, scrolls, and can even successfully complete e-learning courses…
- Oxford University announced the AIEOU Hub, a research-backed lab to support research and implementation of AI in education.
- “AI Agents Everywhere”: A Forbes peek at how agentic AI will handle the “boring bits” of classroom life.
- [Bias klaxon!] Epiphany AI: My own research leads to the creation of a specialised, “pedagogy first” AI co-pilot for instructional design marking the continued growth of specialised AI tools designed for specific industries and workflows.
AI is the Perfect Teaching Assistant for Any Educator — from unite.ai by Navi Azaria, CPO at Kaltura
Through my work with leading educational institutions at Kaltura, I’ve seen firsthand how AI agents are rapidly becoming indispensable. These agents alleviate the mounting burdens on educators and provide new generations of tech-savvy students with accessible, personalized learning, giving teachers the support they need to give their students the personalized attention and engagement they deserve.
Learning HQ — from ai-disruptor-hq.notion.site
This HQ includes all of my AI guides, organized by tool/platform. This list is updated each time a new one is released, and outdated guides are removed/replaced over time.
How AI Is Reshaping Teachers’ Jobs — from edweek.org
Artificial intelligence is poised to fundamentally change the job of teaching. AI-powered tools can shave hours off the amount of time teachers spend grading, lesson-planning, and creating materials. AI can also enrich the lessons they deliver in the classroom and help them meet the varied needs of all students. And it can even help bolster teachers’ own professional growth and development.
Despite all the promise of AI, though, experts still urge caution as the technology continues to evolve. Ethical questions and practical concerns are bubbling to the surface, and not all teachers feel prepared to effectively and safely use AI.
In this special report, see how early-adopter teachers are using AI tools to transform their daily work, tackle some of the roadblocks to expanded use of the technology, and understand what’s on the horizon for the teaching profession in the age of artificial intelligence.
How to Make Learning as Addictive as Social Media | Duolingo’s Luis Von Ahn | TED — from youtube.com; via Kamil Banc at AI Adopter
When technologist Luis von Ahn was building the popular language-learning platform Duolingo, he faced a big problem: Could an app designed to teach you something ever compete with addictive platforms like Instagram and TikTok? He explains how Duolingo harnesses the psychological techniques of social media and mobile games to get you excited to learn — all while spreading access to education across the world.
6% of Faculty Feel Supported on AI?! — from automatedteach.com by Graham Clay
Plus, a webinar on building AI tutors this Friday.
The Digital Education Council just released their Global AI Faculty Survey of 1,681 faculty members from 52 institutions across 28 countries, and the findings are eye-opening. (Click here if you missed their analogous survey of students.)
While 86% of faculty see themselves using AI in their future teaching [p. 21], only 6% strongly agree that their institutions have provided sufficient resources to develop their AI literacy [p. 35].
This is a concerning gap between the recognized power of AI and institutional support, and it’s a clear signal about where higher education needs to focus in 2025.
Speaking with faculty about AI around the world, I’ve seen this firsthand. But let’s dig into the survey’s findings.
…
Why the gap? Well, one explanation is that faculty lack institutional support.
The survey reveals that…
- 80% of faculty don’t find their institutional AI guidelines comprehensive [p. 32]
- 80% say their institutions haven’t made clear how AI can be used in teaching [p. 33]
- The top barrier to AI adoption, at 40%? “I don’t have time or resources to explore AI” [p. 9]
- The second-highest barrier, at 38%? “I am not sure how to use AI in my teaching” [p. 9]
From DSC:
I was in a teaching and learning group for 10+ years (and in several edtech-related positions before that). We had senior staff there, but we were mainly called upon for edtech, instructional technology, learning spaces, or LMS-related tasks and questions. Though we could have brought a lot of value to the pedagogical table, the vast majority of the faculty wanted to talk to other faculty members. Our group’s hard-earned — and expensive — expertise didn’t count. We ourselves were teaching classes…but not enough to be on par with the faculty members (at least in their minds). They didn’t seek us out. Perhaps we should have gone door to door, but we didn’t have the resources to do that.
Book groups were effective when the T&L group met with faculty members to discuss things. The discussions were productive. And in those groups, we DID have a seat at the pedagogical table.
But I’m not going to jump on the “we don’t have enough support” bandwagon. Faculty members seek out other faculty members. In many cases, if you aren’t faculty, you don’t count.
So if I were still working and I was in a leadership position, I would sponsor some book study groups with faculty and personnel from teaching and learning centers. Topics for those books could be:
- What AI is
- What these technologies can offer
- What the LMS vendors are doing in this regard
- Ideas on how to use AI in one’s teaching
The Rise of the Heretical Leader — from ditchthattextbook.com; a guest post by Dan Fitzpatrick
Now is the time for visionary leadership in education. The era of artificial intelligence is reshaping the demands on education systems. Rigid policies, outdated curricula, and reliance on obsolete metrics are failing students. A recent survey from Resume Genius found that graduates lack skills in communication, collaboration, and critical thinking. Consequently, there is a growing trend in companies hiring candidates based on skills instead of traditional education or work experience. This underscores the urgent need for educational leaders to prioritize adaptability and innovation in their systems. Educational leaders must embrace a transformative approach to keep pace.
…
[Heretical leaders] bring courage, empathy, and strategic thinking to reimagine education’s potential. Here are their defining characteristics:
- Visionary Thinking: They identify bold, innovative paths to progress.
- Courage to Act: These leaders take calculated risks to overcome resistance and inertia.
- Relentless Curiosity: They challenge assumptions and seek better alternatives.
- Empathy for Stakeholders: Understanding the personal impact of change allows them to lead with compassion.
- Strategic Disruption: Their deliberate actions ensure systemic improvements.
These qualities enable Heretical leaders to reframe challenges as opportunities and drive meaningful change.
From DSC:
Readers of this blog will recognize that I believe visionary leadership is extremely important — in all areas of our society, but especially within our learning ecosystems. Vision trumps data, at least in my mind. There are times when data can be used to support a vision, but having a powerful vision is more lasting and impactful than relying on data to drive the organization.
So while I’d vote for a term other than “heretical leaders,” I get what Dan is saying and I agree with him. Such leaders are going against the grain. They are swimming upstream. They are espousing perspectives that others often don’t buy into (at least initially or for some time).
Such were the leaders who introduced online learning into the K-16 educational systems back in the late ’90s and into the next two+ decades. The growth of online-based learning continues and has helped educate millions of people. Those leaders and the people who worked for such endeavors were going against the grain.
We haven’t seen the end point of online-based learning. I think it will become even more powerful and impactful when AI is used to determine which jobs are opening up and which skills those jobs require, and then to provide a list of sources where one can obtain that knowledge and develop those skills. People will be key in this vision. But so will AI and personalized learning. It will be a collaborative effort.
By the way, I am NOT advocating for using AI to outsource our thinking. Also, having basic facts and background knowledge in a domain is critically important, especially to use AI effectively. But we should be teaching students about AI (as we learn more about it ourselves). We should be working collaboratively with our students to understand how best to use AI. It’s their futures at stake.