CrashCourse on YouTube — via Matt Tower’s The EdSheet Vol. 18

Description:
At Crash Course, we believe that high-quality educational videos should be available to everyone for free! Subscribe for weekly videos from our current courses! The Crash Course team has produced more than 50 courses on a wide variety of subjects, ranging from the humanities to sciences and so much more! We also recently teamed up with Arizona State University to bring you more courses on the Study Hall channel.

And as Matt stated:


From DSC:
I wasn’t familiar with this “channel” — but I like their mission to help people learn…very inexpensively! Along these lines, I, too, pray for the world’s learning ecosystems — especially those belonging to children.


 

The future of L&D is here, and it’s powered by AI. — from linkedin.com by Josh Cavalier


4 Ways I Use AI to Think Better — from wondertools.substack.com by Jeremy Caplan
How AI helps me learn, decide, and create

Learn something new.
Map out a personalized curriculum

Try this: Give an AI assistant context about what you want to learn, why, and how.

  • Detail your rationale and motivation, which may impact your approach.
  • Note your current knowledge or skill level, ideally with examples.

Summarize your learning preferences

  • Note whether you prefer to read, listen to, or watch learning materials.
  • Mention if you like quizzes, drills, or exercises you can do while commuting or during a break at work.
  • If you appreciate learning games, task your AI assistant with generating one for you, using its coding capabilities detailed below.
  • Ask for specific book, textbook, article, or learning path recommendations using the Web search or Deep Research capabilities of Perplexity, ChatGPT, Gemini, or Claude. They can also summarize research literature about effective learning tactics.
  • If you need a human learning partner, ask for guidance on finding one or language you can use in reaching out.
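As a sketch, the checklist above can be folded into a reusable prompt template before you paste it into an AI assistant. The function and field names here are illustrative, not from the article:

```python
def build_learning_prompt(topic, rationale, current_level, preferences, constraints=""):
    """Assemble a curriculum-request prompt from the checklist above."""
    parts = [
        f"I want to learn: {topic}.",
        f"Why: {rationale}",                     # rationale and motivation
        f"My current level: {current_level}",    # ideally with examples
        "My learning preferences: " + "; ".join(preferences),
    ]
    if constraints:
        parts.append(f"Constraints: {constraints}")
    parts.append("Please map out a personalized curriculum with milestones, "
                 "recommended resources, and practice exercises.")
    return "\n".join(parts)

prompt = build_learning_prompt(
    topic="conversational Spanish",
    rationale="an upcoming work assignment in Madrid",
    current_level="beginner; I know maybe 50 words",
    preferences=["short videos", "quizzes I can do while commuting"],
)
print(prompt)
```

The point of templating it is consistency: each of the bullets above becomes a named field you fill in once, rather than details you remember (or forget) each time you start a new chat.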

The Ends of Tests: Possibilities for Transformative Assessment and Learning with Generative AI


GPT-5 for Instructional Designers — from drphilippahardman.substack.com by Dr Philippa Hardman
10 Hacks to Work Smarter & Safer with OpenAI’s Latest Model

The TLDR is that as Instructional Designers, we can’t afford to miss some of the very real benefits of GPT-5’s potential, but we also can’t ensure our professional standards or learner outcomes if we blindly accept its outputs without due testing and validation.

For this reason, I decided to synthesise the latest GPT-5 research—from OpenAI’s technical documentation to independent security audits to real-world user testing—into 10 essential reality checks for using GPT-5 as an Instructional Designer.

These aren’t theoretical exercises; they’re practical tests designed to help you safely unlock GPT-5’s benefits while identifying and mitigating its most well-documented limitations.


Grammarly launches new specialist AI agents providing personalized assistance for students — from edtechinnovationhub.com by Rachel Lawler
Grammarly, an AI communication tool, has announced the launch of eight new specialized AI agents. The new assistants can support specific writing challenges such as finding credible sources and checking originality. 

Students will now be offered “responsible AI support” through Grammarly, with the eight new agents:

  • Reader Reactions agent …
  • AI Grader agent …
  • Citation Finder agent …
  • Expert Review agent …
  • Proofreader agent …
  • AI Detector agent …
  • Plagiarism Checker agent …
  • Paraphraser agent …


Why Perplexity AI Is My Go-To Research Tool as a Higher Education CIO — from mikekentz.substack.com; a guest post from Michael Lyons, CIO at MassBay Community College

While I regularly use tools like ChatGPT, Grammarly, Microsoft Copilot, and even YouTube Premium (I would cancel Netflix before this), Perplexity has earned a top spot in my toolkit. It blends AI and real-time web search into one seamless, research-driven platform that saves time and improves the quality of information I rely on every day.

 

Bringing the best of AI to college students for free — from blog.google by Sundar Pichai

Millions of college students around the world are getting ready to start classes. To help make the school year even better, we’re making our most advanced AI tools available to them for free, including our new Guided Learning mode. We’re also providing $1 billion to support AI education and job training programs and research in the U.S. This includes making our AI and career training free for every college student in America through our AI for Education Accelerator — over 100 colleges and universities have already signed up.

Guided Learning: from answers to understanding
AI can broaden knowledge and expand access to it in powerful ways, helping anyone, anywhere learn anything in the way that works best for them. It’s not about just getting an answer, but deepening understanding and building critical thinking skills along the way. That opportunity is why we built Guided Learning, a new mode in Gemini that acts as a learning companion guiding you with questions and step-by-step support instead of just giving you the answer. We worked closely with students, educators, researchers and learning experts to make sure it’s helpful for understanding new concepts and is backed by learning science.




 

BREAKING: Google introduces Guided Learning — from aieducation.substack.com by Claire Zau
Some thoughts on what could make Google’s AI tutor stand out

Another major AI lab just launched “education mode.”

Google introduced Guided Learning in Gemini, transforming it into a personalized learning companion designed to help you move from quick answers to real understanding.

Instead of immediately spitting out solutions, it:

  • Asks probing, open-ended questions
  • Walks learners through step-by-step reasoning
  • Adapts explanations to the learner’s level
  • Uses visuals, videos, diagrams, and quizzes to reinforce concepts

This Socratic-style tutor rollout follows closely behind similar announcements like OpenAI’s Study Mode (last week) and Anthropic’s Claude for Education (April 2025).


How Sci-Fi Taught Me to Embrace AI in My Classroom — from edsurge.com by Dan Clark

I’m not too naive to understand that, no matter how we present it, some students will always be tempted by “the dark side” of AI. What I also believe is that the future of AI in education is not decided. It will be decided by how we, as educators, embrace or demonize it in our classrooms.

My argument is that setting guidelines and talking to our students honestly about the pitfalls and amazing benefits that AI offers us as researchers and learners will define it for the coming generations.

Can AI be the next calculator? Something that, yes, changes the way we teach and learn, but not necessarily for the worse? If we want it to be, yes.

How it is used, and more importantly, how AI is perceived by our students, can be influenced by educators. We have to first learn how AI can be used as a force for good. If we continue to let the dominant voice be that AI is the Terminator of education and critical thinking, then that will be the fate we have made for ourselves.


AI Tools for Strategy and Research – GT #32 — from goodtools.substack.com by Robin Good
Getting expert advice, how to do deep research with AI, prompt strategy, comparing different AIs side-by-side, creating mini-apps and an AI Agent that can critically analyze any social media channel

In this issue, discover AI tools for:

  • Getting Expert Advice
  • Doing Deep Research with AI
  • Improving Your AI Prompt Strategy
  • Comparing Results from Different AIs
  • Creating an AI Agent for Social Media Analysis
  • Summarizing YouTube Videos
  • Creating Mini-Apps with AI
  • Tasting an Award-Winning AI Short Film

GPT-Building, Agentic Workflow Design & Intelligent Content Curation — from drphilippahardman.substack.com by Dr. Philippa Hardman
What 3 recent job ads reveal about the changing nature of Instructional Design

In this week’s blog post, I’ll share my take on how the instructional design role is evolving and discuss what this means for our day-to-day work and the key skills it requires.

With this in mind, I’ve been keeping a close eye on open instructional design roles and, in the last 3 months, have noticed the emergence of a new flavour of instructional designer: the so-called “Generative AI Instructional Designer.”

Let’s deep dive into three explicitly AI-focused instructional design positions that have popped up in the last quarter. Each one illuminates a different aspect of how the role is changing—and together, they paint a picture of where our profession is likely heading.

Designers who evolve into prompt engineers, agent builders, and strategic AI advisors will capture the new premium. Those who cling to traditional tool-centric roles may find themselves increasingly sidelined—or automated out of relevance.


Google to Spend $1B on AI Training in Higher Ed — from insidehighered.com by Katherine Knott

Google’s parent company announced Wednesday (8/6/25) that it’s planning to spend $1 billion over the next three years to help colleges teach and train students about artificial intelligence.

Google is joining other AI companies, including OpenAI and Anthropic, in investing in AI training in higher education. All three companies have rolled out new tools aimed at supporting “deeper learning” among students and made their AI platforms available to certain students for free.


5 Predictions for How AI Will Impact Community Colleges — from pistis4edu.substack.com by Feng Hou

Based on current technology capabilities, adoption patterns, and the mission of community colleges, here are five well-supported predictions for AI’s impact in the coming years.

  1. Universal AI Tutor Access
  2. AI as Active Teacher
  3. Personalized Learning Pathways
  4. Interactive Multimodal Learning
  5. Value-Centric Education in an AI-Abundant World

 

One-size-fits-all learning is about to become completely obsolete. — from linkedin.com by Allie Miller


AI in the University: From Generative Assistant to Autonomous Agent This Fall — from insidehighered.com
This fall we are moving into the agentic generation of artificial intelligence.

“Where generative AI creates, agentic AI acts.” That’s how my trusted assistant, Gemini 2.5 Pro deep research, describes the difference.

Agents, unlike generative tools, plan and carry out multistep goals with minimal human supervision. The essential difference is their proactive nature. Rather than waiting for a specific, step-by-step command, agentic systems take a high-level objective and independently create and execute a plan to achieve that goal. This triggers a continuous, iterative workflow that works much like a cognitive loop. The typical agentic process involves six key steps, as described by Nvidia:
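Nvidia’s six-step list itself isn’t excerpted above, but the iterative “cognitive loop” the article describes can be sketched generically. This is an illustrative toy, not Nvidia’s framework; the function names are placeholders:

```python
def run_agent(objective, plan_fn, execute_fn, evaluate_fn, max_iterations=5):
    """Minimal agentic loop: plan from a high-level objective, act,
    then reflect and re-plan until the goal is judged complete."""
    history = []
    for _ in range(max_iterations):
        plan = plan_fn(objective, history)     # decompose the goal into a next step
        result = execute_fn(plan)              # act, without step-by-step human input
        history.append((plan, result))
        if evaluate_fn(objective, result):     # self-check: objective met?
            return result
    return history[-1][1]                      # best effort after the iteration budget

# Toy usage: the "objective" is to reach 10 by repeatedly incrementing.
done = run_agent(
    objective=10,
    plan_fn=lambda goal, hist: (hist[-1][1] if hist else 0) + 1,
    execute_fn=lambda plan: plan,
    evaluate_fn=lambda goal, result: result >= goal,
    max_iterations=20,
)
print(done)  # 10
```

The contrast with generative AI is visible in the structure: a generative call would be a single `execute_fn` invocation, while the agentic version wraps planning, action, and self-evaluation in a loop that keeps running until the objective is satisfied.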


AI in Education Podcast — from aipodcast.education by Dan Bowen and Ray Fleming


The State of AI in Education 2025 Key Findings from a National Survey — from Carnegie Learning

Our 2025 national survey of over 650 respondents across 49 states and Puerto Rico reveals both encouraging trends and important challenges. While AI adoption and optimism are growing, concerns about cheating, privacy, and the need for training persist.

Despite these challenges, I’m inspired by the resilience and adaptability of educators. You are the true game-changers in your students’ growth, and we’re honored to support this vital work.

This report reflects both where we are today and where we’re headed with AI. More importantly, it reflects your experiences, insights, and leadership in shaping the future of education.


Instructure and OpenAI Announce Global Partnership to Embed AI Learning Experiences within Canvas — from instructure.com

This groundbreaking collaboration represents a transformative step forward in education technology and will begin with, but is not limited to, an effort between Instructure and OpenAI to enhance the Canvas experience by embedding OpenAI’s next-generation AI technology into the platform.

IgniteAI, announced earlier today, establishes Instructure’s future-ready, open ecosystem with agentic support as the AI landscape continues to evolve. This partnership with OpenAI exemplifies this bold vision for AI in education. Instructure’s strategic approach to AI emphasizes the enhancement of connections within an educational ecosystem comprising over 1,100 edtech partners and leading LLM providers.

“We’re committed to delivering next-generation LMS technologies designed with an open ecosystem that empowers educators and learners to adapt and thrive in a rapidly changing world,” said Steve Daly, CEO of Instructure. “This collaboration with OpenAI showcases our ambitious vision: creating a future-ready ecosystem that fosters meaningful learning and achievement at every stage of education. This is a significant step forward for the education community as we continuously amplify the learning experience and improve student outcomes.”


Faculty Latest Targets of Big Tech’s AI-ification of Higher Ed — from insidehighered.com by Kathryn Palmer
A new partnership between OpenAI and Instructure will embed generative AI in Canvas. It may make grading easier, but faculty are skeptical it will enhance teaching and learning.

The two companies, which have not disclosed the value of the deal, are also working together to embed large language models into Canvas through a feature called IgniteAI. It will work with an institution’s existing enterprise subscription to LLMs such as Anthropic’s Claude or OpenAI’s ChatGPT, allowing instructors to create custom LLM-enabled assignments. They’ll be able to tell the model how to interact with students—and even evaluate those interactions—and what it should look for to assess student learning. According to Instructure, any student information submitted through Canvas will remain private and won’t be shared with OpenAI.

Faculty Unsurprised, Skeptical
Few faculty were surprised by the Canvas-OpenAI partnership announcement, though many are reserving judgment until they see how the first year of using it works in practice.


 

BREAKING: OpenAI Releases Study Mode — from aieducation.substack.com by Claire Zau
What’s New, What Works, and What’s Still Missing

What is Study Mode?
Study Mode is OpenAI’s take on a smarter study partner – a version of the ChatGPT experience designed to guide users through problems with Socratic prompts, scaffolded reasoning, and adaptive feedback (instead of just handing over the answer).

Built with input from learning scientists, pedagogy experts, and educators, it was also shaped by direct feedback from college students. While Study Mode is designed with college students in mind, it’s meant for anyone who wants a more learning-focused, hands-on experience across a wide range of subjects and skill levels.

Who can access it? And how?
Starting July 29, Study Mode is available to users on Free, Plus, Pro, and Team plans. It will roll out to ChatGPT Edu users in the coming weeks.


ChatGPT became your tutor — from theneurondaily.com by Grant Harvey
PLUS: NotebookLM has video now & GPT-4o-level AI runs on a laptop

Here’s how it works: instead of asking “What’s 2+2?” and getting “4,” study mode asks questions like “What do you think happens when you add these numbers?” and “Can you walk me through your thinking?” It’s like having a patient tutor who won’t let you off the hook that easily.

The key features include:

  • Socratic questioning: It guides you with hints and follow-up questions rather than direct answers.
  • Scaffolded responses: Information broken into digestible chunks that build on each other.
  • Personalized support: Adjusts difficulty based on your skill level and previous conversations.
  • Knowledge checks: Built-in quizzes and feedback to make sure concepts actually stick.
  • Toggle flexibility: Switch study mode on and off mid-conversation depending on your goals.

Try study mode yourself by selecting “Study and learn” from tools in ChatGPT and asking a question.


Introducing study mode — from openai.com
A new way to learn in ChatGPT that offers step by step guidance instead of quick answers.

[On 7/29/25, we introduced] study mode in ChatGPT—a learning experience that helps you work through problems step by step instead of just getting an answer. Starting today, it’s available to logged-in users on Free, Plus, Pro, and Team plans, with availability in ChatGPT Edu coming in the next few weeks.

ChatGPT is becoming one of the most widely used learning tools in the world. Students turn to it to work through challenging homework problems, prepare for exams, and explore new concepts. But its use in education has also raised an important question: how do we ensure it is used to support real learning, and doesn’t just offer solutions without helping students make sense of them?

We’ve built study mode to help answer this question. When students engage with study mode, they’re met with guiding questions that calibrate responses to their objective and skill level to help them build deeper understanding. Study mode is designed to be engaging and interactive, and to help students learn something—not just finish something.


 

Blood in the Instructional Design Machine? — from drphilippahardman.substack.com by Dr. Philippa Hardman
The reality of AI, job degradation & the likely future of Instructional Design

This raises a very important, perhaps even existential question for our profession: do these tools free a designer from the mind-numbing drudgery of content conversion (the “augmented human”)? Or do they automate the core expertise of the learning professional’s role, e.g. selecting instructional strategies, structuring narratives and designing a learning flow, in the process reducing the ID’s role to simply finding the source file and pushing a button (the “inverted centaur”)?

The stated aspiration of these tool builders seems to be a future where AI means that the instructional designer’s value shifts decisively from production to strategy. Their stated goal is to handle the heavy lifting of content generation, allowing the human ID to provide the indispensable context, creativity, and pedagogical judgment that AI cannot replicate.

However, the risk of these tools lies in how we use them, and the “inverted centaur” model remains deeply potent and possible. In an organisation that prioritises cost above all, these same tools can be used to justify reducing the ID role to the functional drudgery of inputting a PDF and supervising the machine.

The key to this paradox lies in a crucial data point: spending on outside products and services has jumped a dramatic 23% to $12.4 billion. 

This signals a fundamental shift: companies are reallocating funds from large internal teams toward specialised consultants and advanced learning technologies like AI. L&D is not being de-funded; it is being re-engineered.

 

New Lightcast Report: AI Skills Command 28% Salary Premium as Demand Shifts Beyond Tech Industry — from lightcast.io; via Paul Fain
First-of-its-kind analysis reveals specific AI skills employers need most, enabling targeted workforce training strategies across all career areas

July 23, 2025 – Lightcast, the global leader in labor market intelligence, today released “Beyond the Buzz: Developing the AI Skills Employers Actually Need,” a comprehensive analysis revealing that artificial intelligence has fundamentally transformed hiring patterns across the world of work. The report, based on analysis of over 1.3 billion job postings, shows that job postings including AI skills offer 28% higher salaries—nearly $18,000 more per year—than those without such capabilities.

More importantly, the research analyzes specific skills based on their growth across job postings, their importance in the workforce, and their exposure to AI. This shows exactly which AI skills create value in which contexts, solving the critical challenge facing educators and workforce development leaders: moving beyond vague “AI literacy” to precise, targeted training that delivers measurable results.


Also via Paul Fain:


Despite growing awareness, however, participation in skill development is limited. In 2024, less than half of U.S. employees (45%) participated in training or education to build new skills for their current job. About one in three employees (32%) who are hoping to move into a new role within the next year strongly agree that they have the skills needed to be exceptional in that role.

 

Building a learning ecosystem that drives business results — from chieflearningofficer.com by Nick Romanowski
How SAX combined adaptive e-learning and experiential workshops to accelerate capability development and impact the bottom line.

At SAX, we know that to succeed in today’s market, we need professionals who can learn quickly, apply that learning effectively and continuously adapt as client needs evolve.

Yet traditional training methods were no longer enough. Our firm faced familiar challenges: helping staff meet continuing professional education requirements efficiently, uncovering knowledge gaps to guide development and building a more capable, more client-ready workforce.

We found our solution in a flipped learning model that blends adaptive e-learning with live, experiential workshops. The results were transformative. We accelerated CPE credit completion by more than 50 percent, reclaimed 173 billable hours and equipped our people with deeper capabilities.

Here’s how we did it, and what we learned along the way.

Blend technology and human touch: Adaptive e-learning addresses individual knowledge gaps efficiently. Live workshops enable skill development through practice and feedback. Together, they drive both learning efficiency and behavior change.

 

PODCAST: Did AI “break” school? Or will it “fix” it? …and if so, what can we do about it? — from theneurondaily.com by Corey Noles, Grant Harvey, & Matthew Robinson

In Episode 5 of The Neuron Podcast, Corey Noles and Grant Harvey tackle the education crisis head-on. We explore the viral UCLA “CheatGPT” controversy, MIT’s concerning brain study, and innovative solutions like Alpha School’s 2-hour learning model. Plus, we break down OpenAI’s new $10M teacher training initiative and share practical tips for using AI to enhance learning rather than shortcut it. Whether you’re a student, teacher, or parent, you’ll leave with actionable insights on the future of education.

 

AI is rewiring how we learn, and it’s a game-changer for L&D — from chieflearningofficer.com by Josh Bersin
As AI becomes central to learner engagement, L&D leaders are being urged to fundamentally rethink corporate training, says global industry analyst Josh Bersin.

What are people really doing with ChatGPT? They’re learning. They’re asking questions, getting immediate answers, digging deeper, analyzing information and ultimately making themselves more productive. So, one could argue that simply by shifting to a “learn by inquiry” model, we may triple our value to the business.

From my experience, there are two main learning models in this industry. The first is “what you need to know”—linear or prescriptive things that every employee needs to understand about the company, its products and their role. This kind of content is well handled by existing L&D models.

The second, and far more important, is “what you’d like to know”—questions, curiosities and explorations about how the company works, what customers truly need and how we can each go further in our careers. Thanks to AI, this kind of learning is now explosive and transformative.

Imagine a sales rep who loses a deal. Naturally, they may ask, “What could I have done to be more successful?” A well-designed AI-powered learning system would take that question, give the employee an initial answer and chat with the individual to dig into the problem.

The system would then surface relevant sales training material and recommend videos, tips or case studies for help. And the employee, assuming they like the experience, would likely keep exploring until they feel they’ve learned what they need.

This “curiosity-based” learning is now possible, and its benefits extend far beyond traditional training.

 


Tech check: Innovation in motion: How AI is rewiring L&D workflows — from chieflearningofficer.com by Gabrielle Pike
AI isn’t here to replace us. It’s here to level us up.

For today’s chief learning officer, the days of just rolling out compliance training are long gone. In 2025, learning and development leaders are architects of innovation, crafting ecosystems that are agile, automated and AI-infused. This quarter’s Tech Check invites us to pause, assess and get strategic about where tech is taking us. Because the goal isn’t more tools—it’s smarter, more human learning systems that scale with the business.

Sections include:

  • The state of AI in L&D: Hype vs. reality
  • AI in design: From static content to dynamic experiences
  • AI in development: Redefining production workflows
  • Strategic questions CLOs should be asking
  • Future forward: What’s next?
  • Closing thought

American Federation of Teachers (AFT) to Launch National Academy for AI Instruction with Microsoft, OpenAI, Anthropic and United Federation of Teachers — from aft.org

NEW YORK – The AFT, alongside the United Federation of Teachers and lead partner Microsoft Corp., founding partner OpenAI, and Anthropic, announced the launch of the National Academy for AI Instruction today. The groundbreaking $23 million education initiative will provide access to free AI training and curriculum for all 1.8 million members of the AFT, starting with K-12 educators. It will be based at a state-of-the-art bricks-and-mortar Manhattan facility designed to transform how artificial intelligence is taught and integrated into classrooms across the United States.

The academy will help address the gap in structured, accessible AI training and provide a national model for AI-integrated curriculum and teaching that puts educators in the driver’s seat.


Students Are Anxious about the Future with A.I. Their Parents Are, Too. — from educationnext.org by Michael B. Horn
The fast-growing technology is pushing families to rethink the value of college

In an era when the college-going rate of high school graduates has dropped from an all-time high of 70 percent in 2016 to roughly 62 percent now, AI seems to be heightening the anxieties about the value of college.

According to the survey, two-thirds of parents say AI is impacting their view of the value of college. Thirty-seven percent of parents indicate they are now scrutinizing college’s “career-placement outcomes”; 36 percent say they are looking at a college’s “AI-skills curriculum,” while 35 percent respond that a “human-skills emphasis” is important to them.

This echoes what I increasingly hear from college leadership: Parents and students demand to see a difference between what they are getting from a college and what they could be “learning from AI.”


This next item on LinkedIn is compliments of Ray Schroeder:



How to Prepare Students for a Fast-Moving (AI)World — from rdene915.com by Dr. Rachelle Dené Poth

Preparing for a Future-Ready Classroom
Here are the core components I focus on to prepare students:

1. Unleash Creativity and Problem-Solving.
2. Weave in AI and Computational Thinking.
3. Cultivate Resilience and Adaptability.


AI Is Reshaping Learning Roles—Here’s How to Future-Proof Your Team — from onlinelearningconsortium.org by Jennifer Mathes, Ph.D., CEO, Online Learning Consortium; via Robert Gibson on LinkedIn

Culture matters here. Organizations that foster psychological safety—where experimentation is welcomed and mistakes are treated as learning—are making the most progress. When leaders model curiosity, share what they’re trying, and invite open dialogue, teams follow suit. Small tests become shared wins. Shared wins build momentum.

Career development must be part of this equation. As roles evolve, people will need pathways forward. Some will shift into new specialties. Others may leave familiar roles for entirely new ones. Making space for that evolution—through upskilling, mobility, and mentorship—shows your people that you’re not just investing in AI, you’re investing in them.

And above all, people need transparency. Teams don’t expect perfection. But they do need clarity. They need to understand what’s changing, why it matters, and how they’ll be supported through it. That kind of trust-building communication is the foundation for any successful change.

These shifts may play out differently across sectors—but the core leadership questions will likely be similar.

AI marks a turning point—not just for technology, but for how we prepare our people to lead through disruption and shape the future of learning.




 

2025 Learning System Top Picks — from elearninfo247.com by Craig Weiss

Who is leading the pack? Who is setting themselves apart here in the mid-year?

Are they an LMS? LMS/LXP? Talent Development System? Mentoring? Learning Platform?

Something else?

Are they solely customer training/education, mentoring, or coaching? Are they focused only on employees? Are they an amalgamation of all or some?

Well, they cut across the board – hence, they slide under the “Learning Systems” umbrella, which is under the bigger umbrella term – “Learning Technology.”

Categories: L&D-specific, Combo (L&D and Training, think internal/external audiences), and Customer Training/Education (this means customer education, which some vendors use to mean the same as customer training).

 

“Using AI Right Now: A Quick Guide” [Mollick] + other items re: AI in our learning ecosystems

Thoughts on thinking — from dcurt.is by Dustin Curtis

Intellectual rigor comes from the journey: the dead ends, the uncertainty, and the internal debate. Skip that, and you might still get the insight–but you’ll have lost the infrastructure for meaningful understanding. Learning by reading LLM output is cheap. Real exercise for your mind comes from building the output yourself.

The irony is that I now know more than I ever would have before AI. But I feel slightly dumber. A bit more dull. LLMs give me finished thoughts, polished and convincing, but none of the intellectual growth that comes from developing them myself. 


Using AI Right Now: A Quick Guide — from oneusefulthing.org by Ethan Mollick
Which AIs to use, and how to use them

Every few months I put together a guide on which AI system to use. Since I last wrote my guide, however, there has been a subtle but important shift in how the major AI products work. Increasingly, it isn’t about the best model, it is about the best overall system for most people. The good news is that picking an AI is easier than ever and you have three excellent choices. The challenge is that these systems are getting really complex to understand. I am going to try and help a bit with both.

First, the easy stuff.

Which AI to Use
For most people who want to use AI seriously, you should pick one of three systems: Claude from Anthropic, Google’s Gemini, and OpenAI’s ChatGPT.

Also see:


Student Voice, Socratic AI, and the Art of Weaving a Quote — from elmartinsen.substack.com by Eric Lars Martinsen
How a custom bot helps students turn source quotes into personal insight—and share it with others

This summer, I tried something new in my fully online, asynchronous college writing course. These classes have no Zoom sessions. No in-person check-ins. Just students, Canvas, and a lot of thoughtful design behind the scenes.

One activity I created was called QuoteWeaver—a PlayLab bot that helps students do more than just insert a quote into their writing.

Try it here

It’s a structured, reflective activity that mimics something closer to an in-person 1:1 conference or a small group quote workshop—but in an asynchronous format, available anytime. In other words, it’s using AI not to speed students up, but to slow them down.

The bot begins with a single quote that the student has found through their own research. From there, it acts like a patient writing coach, asking open-ended, Socratic questions such as:

  • What made this quote stand out to you?
  • How would you explain it in your own words?
  • What assumptions or values does the author seem to hold?
  • How does this quote deepen your understanding of your topic?

It doesn’t move on too quickly. In fact, it often rephrases and repeats, nudging the student to go a layer deeper.


The Disappearance of the Unclear Question — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
New Piece for UNESCO Education Futures

On [6/13/25], UNESCO published a piece I co-authored with Victoria Livingstone at Johns Hopkins University Press. It’s called The Disappearance of the Unclear Question, and it’s part of the ongoing UNESCO Education Futures series – an initiative I appreciate for its thoughtfulness and depth on questions of generative AI and the future of learning.

Our piece raises a small but important red flag. Generative AI is changing how students approach academic questions, and one unexpected side effect is that unclear questions – for centuries a trademark of deep thinking – may be beginning to disappear. Not because they lack value, but because they don’t always work well with generative AI. Quietly and unintentionally, students (and teachers) may find themselves gradually avoiding them altogether.

Of course, that would be a mistake.

We’re not arguing against using generative AI in education. Quite the opposite. But we do propose that higher education needs a two-phase mindset when working with this technology: one that recognizes what AI is good at, and one that insists on preserving the ambiguity and friction that learning actually requires to be successful.




Leveraging GenAI to Transform a Traditional Instructional Video into Engaging Short Video Lectures — from er.educause.edu by Hua Zheng

By leveraging generative artificial intelligence to convert lengthy instructional videos into micro-lectures, educators can enhance efficiency while delivering more engaging and personalized learning experiences.


This AI Model Never Stops Learning — from link.wired.com by Will Knight

Researchers at Massachusetts Institute of Technology (MIT) have now devised a way for LLMs to keep improving by tweaking their own parameters in response to useful new information.

The work is a step toward building artificial intelligence models that learn continually—a long-standing goal of the field and something that will be crucial if machines are to ever more faithfully mimic human intelligence. In the meantime, it could give us chatbots and other AI tools that are better able to incorporate new information including a user’s interests and preferences.

The MIT scheme, called Self Adapting Language Models (SEAL), involves having an LLM learn to generate its own synthetic training data and update procedure based on the input it receives.
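In very rough outline, that loop — generate your own training data from new input, update on it, keep the update only if it helps — can be sketched as follows. This is a schematic reduction of the article’s description, not MIT’s actual method; every function name here is a placeholder:

```python
def seal_style_update(model, new_input, generate_synthetic, finetune, evaluate):
    """Schematic of one self-adaptation step in the spirit of SEAL."""
    # 1. The model proposes its own training data ("self-edits") from the new input.
    synthetic_examples = generate_synthetic(model, new_input)
    # 2. A candidate model is produced by updating on that self-generated data.
    candidate = finetune(model, synthetic_examples)
    # 3. Keep the update only if measured performance improves.
    return candidate if evaluate(candidate) > evaluate(model) else model

# Toy demonstration with a stand-in "model" (a set of remembered facts):
model = {"facts": set()}
updated = seal_style_update(
    model,
    new_input="Paris is the capital of France",
    generate_synthetic=lambda m, x: [x, x.lower()],      # paraphrase-like self-edits
    finetune=lambda m, data: {"facts": m["facts"] | set(data)},
    evaluate=lambda m: len(m["facts"]),                  # crude proxy for recall
)
print(len(updated["facts"]))  # 2
```

The interesting design choice the sketch preserves is step 1: rather than waiting for curated training data, the model itself decides what to practice on, which is what distinguishes continual self-adaptation from ordinary fine-tuning.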


Edu-Snippets — from scienceoflearning.substack.com by Nidhi Sachdeva and Jim Hewitt
Why knowledge matters in the age of AI; What happens to learners’ neural activity with prolonged use of LLMs for writing

Highlights:

  • Offloading knowledge to Artificial Intelligence (AI) weakens memory, disrupts memory formation, and erodes the deep thinking our brains need to learn.
  • Prolonged use of ChatGPT in writing lowers neural engagement, impairs memory recall, and accumulates cognitive debt that isn’t easily reversed.
 
 
© 2025 | Daniel Christian