ChatGPT: the world’s most influential teacher — from drphilippahardman.substack.com by Dr. Philippa Hardman; emphasis DSC New research shows that millions of us are “learning with AI” every week: what does this mean for how (and how well) humans learn?
This week, an important piece of research landed that confirms the gravity of AI’s role in the learning process. The TLDR is that learning is now a mainstream use case for ChatGPT; around 10.2% of all ChatGPT messages (that’s ~2BN messages sent by over 700 million users per week) are requests for help with learning.
The research shows that about 10.2% of all messages are tutoring/teaching, and within the “Practical Guidance” category, tutoring is 36%. “Asking” interactions are growing faster than “Doing” and are rated higher quality by users. Younger people contribute a huge share of messages, and growth is fastest in low- and middle-income countries (How People Use ChatGPT, 2025).
If AI is already acting as a global tutor, the question isn’t “will people learn with AI?”—they already are. The real question we need to ask is: what does great learning actually look like, and how should AI evolve to support it? That’s where decades of learning science help us separate “feels like learning” from “actually gaining new knowledge and skills”.
A day in the life: The next 25 years

A learner wakes up. Their AI-powered learning coach welcomes them, drawing their attention to their progress and helping them structure their approach to the day. A notification reminds them of an upcoming interview and suggests reflections to add to their learning portfolio.
Rather than a static gradebook, their portfolio is a dynamic, living record, curated by the student, validated by mentors in both industry and education, and enriched through co-creation with maturing modes of AI. It tells a story through essays, code, music, prototypes, journal reflections, and team collaborations. These artifacts are not “submitted”; they are published, shared, and linked to verifiable learning outcomes.
And when it’s time to move, to a new institution, a new job, or a new goal, their data goes with them, immutable, portable, verifiable, and meaningful.
From DSC: And I would add to that last solid sentence that the learner/student/employee will be able to control who can access this information. Anyway, some solid reflections here from Lev.
I know a lot of readers will disagree with this, and the timeline feels aggressive (the future always arrives more slowly than pundits expect) but I think the overall premise is sound: “The concept of a tipping point in education – where AI surpasses traditional schools as the dominant learning medium – is increasingly plausible based on current trends, technological advancements, and expert analyses.”
The Rundown: In this tutorial, you will learn how to combine NotebookLM with ChatGPT to master any subject faster, turning dense PDFs into interactive study materials with summaries, quizzes, and video explanations.
Step-by-step:
Go to notebooklm.google.com, click the “+” button, and upload your PDF study material (works best with textbooks or technical documents)
Choose your output mode: Summary for a quick overview, Mind Map for visual connections, or Video Overview for a podcast-style explainer with visuals
Generate a Study Guide under Reports — get Q&A sets, short-answer questions, essay prompts, and glossaries of key terms automatically
Take your PDF to ChatGPT and prompt: “Read this chapter by chapter and highlight confusing parts” or “Quiz me on the most important concepts”
Combine both tools: Use NotebookLM for quick context and interactive guides, then ChatGPT to clarify tricky parts and go deeper.

Pro Tip: If your source is an EPUB or audiobook, convert it to PDF before uploading. Both NotebookLM and ChatGPT handle PDFs best.
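For readers who prefer scripting the ChatGPT quiz step, the same prompts can be assembled programmatically before pasting them into the chat. This is a minimal sketch, assuming you have already extracted the PDF’s text with a tool of your choice; the function name and the chapter-splitting heuristic are illustrative, not part of the original tutorial.

```python
def build_quiz_prompts(text: str, marker: str = "Chapter") -> list[str]:
    """Split extracted PDF text on a chapter marker and build one
    quiz prompt per chapter, mirroring the tutorial's ChatGPT step."""
    # Naive split: assumes chapters are introduced by a word like "Chapter".
    chapters = [c.strip() for c in text.split(marker) if c.strip()]
    prompts = []
    for i, chapter in enumerate(chapters, start=1):
        prompts.append(
            f"Read the following chapter and (1) highlight confusing parts, "
            f"(2) quiz me on the most important concepts.\n\n"
            f"{marker} {i}:\n{chapter}"
        )
    return prompts

sample = "Chapter 1 Cells divide by mitosis. Chapter 2 DNA encodes proteins."
for prompt in build_quiz_prompts(sample):
    print(prompt[:60])
```

Each returned string is a self-contained prompt, so one chapter at a time can be pasted into ChatGPT without exceeding context limits.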
Claude can now create and edit Excel spreadsheets, documents, PowerPoint slide decks, and PDFs directly in Claude.ai and the desktop app. This transforms how you work with Claude—instead of only receiving text responses or in-app artifacts, you can describe what you need, upload relevant data, and get ready-to-use files in return.
Also see:
Microsoft to lessen reliance on OpenAI by buying AI from rival Anthropic — from techcrunch.com by Rebecca Bellan
Microsoft will pay to use Anthropic’s AI in Office 365 apps, The Information reports, citing two sources. The move means that Anthropic’s tech will help power new features in Word, Excel, Outlook, and PowerPoint alongside OpenAI’s, marking the end of Microsoft’s previous reliance solely on the ChatGPT maker for its productivity suite. Microsoft’s move to diversify its AI partnerships comes amid a growing rift with OpenAI, which has pursued its own infrastructure projects as well as a potential LinkedIn competitor.
In this episode of Unfixed, we talk with Ray Schroeder—Senior Fellow at UPCEA and Professor Emeritus at the University of Illinois Springfield—about Artificial General Intelligence (AGI) and what it means for the future of higher education. While most of academia is still grappling with ChatGPT and basic AI tools, Schroeder is thinking ahead to AI agents, human displacement, and AGI’s existential implications for teaching, learning, and the university itself. We explore why AGI is so controversial, what institutions should be doing now to prepare, and how we can respond responsibly—even while we’re already overwhelmed.
Data from the State of AI and Instructional Design Report revealed that 95.3% of the instructional designers interviewed use AI in their daily work [1]. And over 85% of this AI use occurs during the design and development process.
These figures showcase the immense impact AI is already having on the instructional design world.
If you’re an L&D professional still on the fence about adding AI to your workflow or an AI convert looking for the next best tools, keep reading.
This guide breaks down 5 of the top AI tools for instructional designers in 2025, so you can streamline your development processes and build better training faster.
But before we dive into the tools of the trade, let’s address the elephant in the room:
GRAND RAPIDS, MI — A new course at Grand Rapids Community College aims to help students learn about artificial intelligence by using the technology to solve real-world business problems.
…
In a release, the college said its grant application was supported by 20 local businesses, including Gentex, TwistThink and the Grand Rapids Public Museum. The businesses have pledged to work with students who will use business data to develop an AI project such as a chatbot that interacts with customers, or a program that automates social media posts or summarizes customer data.
“This rapidly emerging technology can transform the way businesses process data and information,” Kristi Haik, dean of GRCC’s School of Science, Technology, Engineering and Mathematics, said in a statement. “We want to help our local business partners understand and apply the technology. We also want to create real experiences for our students so they enter the workforce with demonstrated competence in AI applications.”
As Patrick Bailey said on LinkedIn about this article:
Nice to see a pedagogy that’s setting a forward movement rather than focusing on what could go wrong with AI in a curriculum.
As a 30 year observer and participant, it seems to me that previous technology platform shifts like SaaS and mobile did not fundamentally change the LMS. AI is different. We’re standing at the precipice of LMS 2.0, where the branding change from Course Management System to Learning Management System will finally live up to its name. Unlike SaaS or mobile, AI represents a technology platform shift that will transform the way participants interact with learning systems – and with it, the nature of the LMS itself.
Given the transformational potential of AI, it is useful to set the context and think about how we got here, especially on this 30th anniversary of the LMS.
Where AI is disruptive is in its ability to introduce a whole new set of capabilities that are best described as personalized learning services. AI offers a new value proposition to the LMS, roughly the set of capabilities currently being developed in the AI Tutor / agentic TA segment. These new capabilities are so valuable given their impact on learning that I predict they will become the services with greatest engagement within a school or university’s “enterprise” instructional platform.
In this way, by LMS paradigm shift, I specifically mean a shift from buyers valuing the product on its course-centric and course management capabilities, to valuing it on its learner-centric and personalized learning capabilities.
This anthology reveals how the integration of AI in education poses profound philosophical, pedagogical, ethical and political questions. As this global AI ecosystem evolves and becomes increasingly ubiquitous, UNESCO and its partners have a shared responsibility to lead the global discourse towards an equity- and justice-centred agenda. The volume highlights three areas in which UNESCO will continue to convene and lead a global commons for dialog and action particularly in areas on AI futures, policy and practice innovation, and experimentation.
As guardian of ethical, equitable, human-centred AI in education.
As thought leader in reimagining curriculum and pedagogy.
As a platform for engaging pluralistic and contested dialogues.
AI, copyright and the classroom: what higher education needs to know — from timeshighereducation.com by Cayce Myers As artificial intelligence reshapes teaching and research, one legal principle remains at the heart of our work: copyright. Understanding its implications isn’t just about compliance – it’s about protecting academic integrity, intellectual property and the future of knowledge creation. Cayce Myers explains
Why It Matters A decade from now, we won’t say “AI changed schools.” We’ll say: this was the year schools began to change what it means to be human, augmented by AI.
This transformation isn’t about efficiency alone. It’s about dignity, creativity, and discovery, and connecting education more directly to human flourishing. The industrial age gave us schools to produce cookie-cutter workers. The digital age gave us knowledge anywhere, anytime. The AI age—beginning now—gives us back what matters most: the chance for every learner to become infinitely capable.
This fall may look like any other—bells ringing, rows of desks—but beneath the surface, education has begun its greatest transformation since the one-room schoolhouse.
Transactional and transformational leaderships’ combined impact on AI and trust

Given the volatile times we live in, a leader may find themselves in a situation where they know how they will use AI, but they are not entirely clear on the goals and journey. In a teaching context, students can be given scenarios where they must lead a team, including autonomous AI agents, to achieve goals. They can then analyse the situations and decide what leadership styles to apply and how to build trust in their human team members. Educators can illustrate this decision-making process using a table (see above).
They may need to combine transactional leadership with transformational leadership, for example. Transactional leadership focuses on planning, communicating tasks clearly and an exchange of value. This works well with both humans and automated AI agents.
Real, capability-building learning requires three key elements: content, context and conversation.
The Rise Of AI Agents: Teaching At Scale
The generative AI revolution is often framed in terms of efficiency: faster content creation, automated processes and streamlined workflows. But in the world of L&D, its most transformative potential lies elsewhere: the ability to scale great teaching.
AI gives us the means to replicate the role of an effective teacher across an entire organization. Specifically, AI agents—purpose-built systems that understand, adapt and interact in meaningful, context-aware ways—can make this possible. These tools understand a learner’s role, skill level and goals, then tailor guidance to their specific challenges and adapt dynamically over time. They also reinforce learning continuously, nudging progress and supporting application in the flow of work.
More than simply sharing knowledge, an AI agent can help learners apply it and improve with every interaction. For example, a sales manager can use a learning agent to simulate tough customer scenarios, receive instant feedback based on company best practices and reinforce key techniques. A new hire in the product department could get guidance on the features and on how to communicate value clearly in a roadmap meeting.
In short, AI agents bring together the three essential elements of capability building, not in a one-size-fits-all curriculum but on demand and personalized for every learner. While, obviously, this technology shouldn’t replace human expertise, it can be an effective tool for removing bottlenecks and unlocking effective learning at scale.
SINGAPORE Sept. 3, 2025 /PRNewswire/ — Today, Midoo AI proudly announces the launch of the world’s first AI language learning agent, a groundbreaking innovation set to transform language education forever.
For decades, language learning has pursued one ultimate goal: true personalization. Traditional tools offered smart recommendations, gamified challenges, and pre-written role-play scripts—but real personalization remained out of reach. Midoo AI changes that. Here is the launch video of Midoo AI.
Imagine a learning experience that evolves with you in real time. A system that doesn’t rely on static courses or scripts but creates a dynamic, one-of-a-kind language world tailored entirely to your needs. This is the power of Midoo’s Dynamic Generation technology.
“Midoo is not just a language-learning tool,” said Yvonne, co-founder of Midoo AI. “It’s a living agent that senses your needs, adapts instantly, and shapes an experience that’s warm, personal, and alive. Learning is no longer one-size-fits-all—now, it’s yours and yours alone.”
Language learning apps have traditionally focused on exercises, quizzes, and progress tracking. Midoo AI introduces a different approach. Instead of presenting itself as a course provider, it acts as an intelligent learning agent that builds, adapts, and sustains a learner’s journey.
This review examines how Midoo AI operates, its feature set, and what makes it distinct from other AI-powered tutors.
Midoo AI in Context: Purpose and Position
Midoo AI is not structured around distributing lessons or modules. Its core purpose is to provide an agent-like partner that adapts in real time. Where many platforms ask learners to select a “level” or “topic,” Midoo instead begins by analyzing goals, usage context, and error patterns. The result is less about consuming predesigned units and more about co-constructing a pathway.
Turning Time Saved Into Better Learning
AI can save teachers time, but what can that time be used for (besides taking a breath)? For most of us, it means redirecting energy into the parts of teaching that made us want to pursue this profession in the first place: connecting with our students and helping them grow academically.
Differentiation

Every classroom has students with different readiness levels, language needs, and learning preferences. AI tools like Diffit or MagicSchool can instantly create multiple versions of a passage or assignment, differentiated by grade level, complexity, or language. This allows every student to engage with the same core concept, moving together as one cohesive class. Instead of spending an evening retyping and rephrasing, teachers can review and tweak AI drafts in minutes, ready for the next lesson.
Mass Intelligence — from oneusefulthing.org by Ethan Mollick From GPT-5 to nano banana: everyone is getting access to powerful AI
When a billion people have access to advanced AI, we’ve entered what we might call the era of Mass Intelligence. Every institution we have — schools, hospitals, courts, companies, governments — was built for a world where intelligence was scarce and expensive. Now every profession, every institution, every community has to figure out how to thrive with Mass Intelligence. How do we harness a billion people using AI while managing the chaos that comes with it? How do we rebuild trust when anyone can fabricate anything? How do we preserve what’s valuable about human expertise while democratizing access to knowledge?
By the time today’s 9th graders and college freshmen enter the workforce, the most disruptive waves of AGI and robotics may already be embedded into parts of society.
What replaces the old system will not simply be a more digital version of the same thing. Structurally, schools may move away from rigid age-groupings, fixed schedules, and subject silos. Instead, learning could become more fluid, personalized, and interdisciplinary—organized around problems, projects, and human development rather than discrete facts or standardized assessments.
AI tutors and mentors will allow for pacing that adapts to each student, freeing teachers to focus more on guidance, relationships, and high-level facilitation. Classrooms may feel less like miniature factories and more like collaborative studios, labs, or even homes—spaces for exploring meaning and building capacity, not just delivering content.
…
If students are no longer the default source of action, then we need to teach them to:
Design agents,
Collaborate with agents,
Align agentic systems with human values,
And most of all, retain moral and civic agency in a world where machines act on our behalf.
We are no longer educating students to be just doers.
We must now educate them to be judges, designers, and stewards of agency.
Meet Your New AI Tutor — from wondertools.substack.com by Jeremy Caplan Try new learning modes in ChatGPT, Claude, and Gemini
AI assistants are now more than simple answer machines. ChatGPT’s new Study Mode, Claude’s Learning Mode, and Gemini’s Guided Learning represent a significant shift. Instead of just providing answers, these free tools act as adaptive, 24/7 personal tutors.
That’s why, in preparation for my next bootcamp which kicks off September 8th 2025, I’ve just completed a full refresh of my list of the most powerful & popular AI tools for Instructional Designers, complete with tips on how to get the most from each tool.
The list has been created using my own experience + the experience of hundreds of Instructional Designers who I work with every week.
It contains the 50 most powerful AI tools for instructional design available right now, along with tips on how to optimise their benefits while mitigating their risks.
Addendums on 9/4/25:
AI Companies Roll Out Educational Tools — from insidehighered.com by Ray Schroeder This fall, Google, Anthropic and OpenAI are rolling out powerful new AI tools for students and educators, each taking a different path to shape the future of learning.
So here’s the new list of essential skills I think my students will need when they are employed to work with AI five years from now:
They can follow directions, analyze outcomes, and adapt to change when needed.
They can write or edit AI output to capture a unique voice and appropriate tone in sync with an audience’s needs.
They have a deep understanding of one or more content areas of a particular profession, business, or industry, so they can easily identify factual errors.
They have a strong commitment to exploration, a flexible mindset, and a broad understanding of AI literacy.
They are resilient and critical thinkers, ready to question results and demand better answers.
They are problem solvers.
And, of course, here is a new rubric built on those skills:
Learn something new. Map out a personalized curriculum
Try this: Give an AI assistant context about what you want to learn, why, and how.
Detail your rationale and motivation, which may impact your approach.
Note your current knowledge or skill level, ideally with examples.
Summarize your learning preferences
Note whether you prefer to read, listen to, or watch learning materials.
Mention if you like quizzes, drills, or exercises you can do while commuting or during a break at work.
If you appreciate learning games, task your AI assistant with generating one for you, using its coding capabilities detailed below.
Ask for specific book, textbook, article, or learning path recommendations using the Web search or Deep Research capabilities of Perplexity, ChatGPT, Gemini or Claude. They can also summarize research literature about effective learning tactics.
If you need a human learning partner, ask for guidance on finding one or language you can use in reaching out.
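The checklist above can be rolled into a single reusable prompt. The sketch below is one illustrative way to do that in Python; the function name, field names, and wording are assumptions for the sake of the example, not a prescribed template.

```python
def curriculum_prompt(goal, why, current_level, preferences, constraints=""):
    """Assemble a personalized-curriculum request covering the points above:
    what you want to learn, why, your starting point, and how you like to learn."""
    parts = [
        f"I want to learn: {goal}.",
        f"My motivation: {why}.",
        f"My current level (with examples): {current_level}.",
        f"Learning preferences: {preferences}.",
    ]
    if constraints:
        parts.append(f"Constraints: {constraints}.")
    parts.append(
        "Please map out a step-by-step curriculum with recommended books, "
        "articles, and exercises, plus quizzes I can do during short breaks."
    )
    # One paragraph per field keeps the request easy for the assistant to parse.
    return "\n".join(parts)

print(curriculum_prompt(
    goal="conversational Spanish",
    why="a work relocation to Madrid in six months",
    current_level="beginner; I know greetings and numbers",
    preferences="audio I can use while commuting, plus short daily drills",
))
```

The same assembled prompt can be pasted into Perplexity, ChatGPT, Gemini, or Claude, and extended with a request for Deep Research on effective learning tactics.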
GPT-5 for Instructional Designers — from drphilippahardman.substack.com by Dr Philippa Hardman 10 Hacks to Work Smarter & Safer with OpenAI’s Latest Model
The TLDR is that as Instructional Designers, we can’t afford to miss some of the very real benefits of GPT-5’s potential, but we also can’t ensure our professional standards or learner outcomes if we blindly accept its outputs without due testing and validation.
For this reason, I decided to synthesise the latest GPT-5 research—from OpenAI’s technical documentation to independent security audits to real-world user testing—into 10 essential reality checks for using GPT-5 as an Instructional Designer.
These aren’t theoretical exercises; they’re practical tests designed to help you safely unlock GPT-5’s benefits while identifying and mitigating its most well-documented limitations.
While I regularly use tools like ChatGPT, Grammarly, Microsoft Copilot, and even YouTube Premium (I would cancel Netflix before this), Perplexity has earned a top spot in my toolkit. It blends AI and real-time web search into one seamless, research-driven platform that saves time and improves the quality of information I rely on every day.
Thomson Reuters President and CEO Steve Hasker believes the legal profession is experiencing “the biggest disruption … in its history” due to generative and agentic artificial intelligence, fundamentally rewriting how legal work products are created for the first time in more than 300 years.
Speaking to legal technology reporters during ILTACON, the International Legal Technology Association’s annual conference, Hasker outlined his company’s ambitious goal to become “the most innovative company” in the legal tech sector while navigating what he described as unprecedented technological change affecting a profession that has remained largely unchanged since its origins in London tea houses centuries ago.
The winning team comprised students of computer science, law, psychology and physics. They developed a privacy-first legal assistant powered by AI that helps people understand their legal rights without needing to navigate dense legal language.
Law Schools

These insights have particular urgency for legal education. Indeed, most of Cowen’s criticisms and suggested changes need to be front and center for law school leaders. It’s naïve to think that law students and lawyers aren’t going to use GenAI tools in virtually every aspect of their professional and personal lives. Rather than avoiding the subject or, worse yet, trying to stop use of these tools, law schools should make GenAI tools a fundamental part of research, writing and drafting training.
They need to focus not on memorization but on the critical thinking skills beginning lawyers used to get in the on-the-job, guild-type training system. As I discussed, that training came from repetitive and often tedious work that developed experienced lawyers who could recognize patterns and solutions based on exposure to similar situations. But much of that repetitive and tedious work may go away in a GenAI world.
The Role of Adjunct Professors

But to do this, law schools need to better partner with actual practicing lawyers who can serve as adjunct professors. Law schools need to do away with the notion that adjuncts are second-class teachers.
As someone who has covered legal tech for 30 years, I cannot remember there ever being a time such as this, when the energy and excitement are raging, driven by generative AI and a new era of innovation and new ideas of the possible.
…
But this year was different. Wandering the exhibit hall, getting product briefings from vendors, talking with attendees, it was impossible to ignore the fundamental shift happening in legal technology. Gen AI isn’t just creating new products – it is spawning entirely new categories of products that truly are reshaping how legal work gets done.
Agentic AI is the buzzword of 2025 and agentic systems were everywhere at ILTACON, promising to streamline workflows across all areas of legal practice. But, perhaps more importantly, these tools are also beginning to address the business side of running a law practice – from client intake and billing to marketing and practice management. The scope of transformation is now beginning to extend beyond the practice of law into the business of law. … Largely missing from this gathering were solo practitioners, small firm lawyers, legal aid organizations, and access-to-justice advocates – the very people who stand to benefit most from the democratizing potential of AI.
…
However, now more than ever, the innovations we are seeing in legal tech have the power to level the playing field, to give smaller practitioners access to tools and capabilities that were once prohibitively expensive. If these technologies remain priced for and marketed primarily to Big Law, we will have succeeded only in widening the justice gap rather than closing it.
Thanks to breakthroughs in artificial intelligence – particularly in semantic search, multimodal models, and natural language processing – new legaltech solutions are emerging to streamline and accelerate deposition review. What once took hours or days of manual analysis now can be accomplished in minutes, with greater accuracy and efficiency than possible with manual review.
Historically, lawyers have been slow adopters of emerging technologies, and with good reason. Legal work is high stakes, deeply rooted in precedent, and built on individual judgment. AI, especially the new generation of agentic AI (systems that not only generate output but initiate tasks, make decisions, and operate semi-autonomously), represents a fundamental shift in how legal work gets done. This shift naturally leads to caution as it challenges long-held assumptions about lawyer workflows and several aspects of their role in the legal process.
The path forward is not to push harder or faster, but smarter. Firms need to take a structured approach that builds trust through transparency, context, training, and measurement of success. This article provides a five-part playbook for law firm leaders navigating AI change management, especially in environments where skepticism is high and reputational risk is even higher.
This year’s ILTACON in Washington was heavy on AI, but the conversation with vendors has shifted. Legal IT Insider’s briefings weren’t about potential use cases or speculative roadmaps. Instead, they focused on how AI is now being embedded into the tools lawyers use every day — and, crucially, how those tools are starting to talk to each other.
Taken together, they point to an inflection point, where agentic workflows, data integration, and open ecosystems define the agenda. But it’s important amidst the latest buzzwords to remember that agents are only as good as the tools they have to work with, and AI only as good as its underlying data. Also, as we talk about autonomous AI, end users are still struggling with cloud implementations and infrastructure challenges, and need vendors to be business partners that help them to make progress at speed.
Harvey’s roadmap is all about expanding its surface area — connecting to systems like iManage, LexisNexis, and more recently publishing giant Wolters Kluwer — so that a lawyer can issue a single query and get synthesised, contextualised answers directly within their workflow. Weinberg said: “What we’re trying to do is get all of the surface area of all of the context that a lawyer needs to complete a task and we’re expanding the product surface so you can enter a search, search all resources, and apply that to the document automatically.”
The common thread: no one is talking about AI in isolation anymore. It’s about orchestration — pulling together multiple data sources into a workflow that actually reflects how lawyers practice.
Why You Can’t Afford to Wait to Adopt AI Tools that have Plaintiffs Moving Faster than Ever

Just as photocopiers shifted law firm operations in the early 1970s and cloud computing transformed legal document management in the early 2000s, AI automation tools are altering the current legal landscape—enabling litigation teams to instantly structure unstructured data, zero in on key arguments in seconds, and save hundreds (if not thousands) of hours of manual work.
The Floor, Not the Ceiling

Smart firms need to flip their entire approach. Instead of dictating which AI tools lawyers must use, leadership should set a floor for acceptable use and then get out of the way.
The floor is simple: no free versions for client work. Free tools are free because users are the product. Client data becomes training data. Confidentiality gets compromised. The firm loses any ability to audit or control how information flows. This isn’t about control; it’s about professional responsibility.
But setting the floor is only the first step. Firms must provide paid, enterprise versions of AI tools that lawyers actually want to use. Not some expensive legal tech platform that promises AI features but delivers complicated workflows. Real AI tools. The same ones lawyers are already using secretly, but with enterprise security, data protection, and proper access controls.
Education must be practical and continuous. Single training sessions don’t work. AI tools evolve weekly. New capabilities emerge constantly. Lawyers need ongoing support to experiment, learn, and share discoveries. This means regular workshops, internal forums for sharing prompts and techniques, and recognition for innovative uses.
The education investment pays off immediately. Lawyers who understand AI use it more effectively. They catch its mistakes. They know when to verify outputs. They develop specialized prompts for legal work. They become force multipliers, not just for themselves but for their entire teams.
As a new academic year begins, many instructors, trainers, and program leaders are bracing for familiar challenges—keeping learners engaged, making complex material accessible, and preparing students for real-world application.
But there’s a quiet shift happening in classrooms and online courses everywhere.
This fall, it’s not the syllabus that’s guiding the learning experience—it’s the conversation between the learner and an AI tool.
From bootcamp to bust: How AI is upending the software development industry — from reuters.com by Anna Tong; via Paul Fain Coding bootcamps have been a mainstay in Silicon Valley for more than a decade. Now, as AI eliminates the kind of entry-level roles for which they trained people, they’re disappearing.
Coding bootcamps have been a Silicon Valley mainstay for over a decade, offering an important pathway for non-traditional candidates to get six-figure engineering jobs. But coding bootcamp operators, students and investors tell Reuters that this path is rapidly disappearing, thanks in large part to AI.
“Coding bootcamps were already on their way out, but AI has been the nail in the coffin,” said Allison Baum Gates, a general partner at venture capital fund SemperVirens, who was an early employee at bootcamp pioneer General Assembly.
Gates said bootcamps were already in decline due to market saturation, evolving employer demand and market forces like growth in international hiring.
Millions of college students around the world are getting ready to start classes. To help make the school year even better, we’re making our most advanced AI tools available to them for free, including our new Guided Learning mode. We’re also providing $1 billion to support AI education and job training programs and research in the U.S. This includes making our AI and career training free for every college student in America through our AI for Education Accelerator — over 100 colleges and universities have already signed up.
… Guided Learning: from answers to understanding
AI can broaden knowledge and expand access to it in powerful ways, helping anyone, anywhere learn anything in the way that works best for them. It’s not about just getting an answer, but deepening understanding and building critical thinking skills along the way. That opportunity is why we built Guided Learning, a new mode in Gemini that acts as a learning companion guiding you with questions and step-by-step support instead of just giving you the answer. We worked closely with students, educators, researchers and learning experts to make sure it’s helpful for understanding new concepts and is backed by learning science.
Another major AI lab just launched “education mode.”
Google introduced Guided Learning in Gemini, transforming it into a personalized learning companion designed to help you move from quick answers to real understanding.
Instead of immediately spitting out solutions, it:
Asks probing, open-ended questions
Walks learners through step-by-step reasoning
Adapts explanations to the learner’s level
Uses visuals, videos, diagrams, and quizzes to reinforce concepts
I’m not so naive as to think that, no matter how we present it, some students won’t be tempted by “the dark side” of AI. What I also believe is that the future of AI in education is not yet decided. It will be decided by how we, as educators, embrace or demonize it in our classrooms.
My argument is that setting guidelines and talking to our students honestly about the pitfalls and amazing benefits that AI offers us as researchers and learners will define it for the coming generations.
Can AI be the next calculator? Something that, yes, changes the way we teach and learn, but not necessarily for the worse? If we want it to be, yes.
How AI is used, and more importantly, how it is perceived by our students, can be influenced by educators. We have to first learn how AI can be used as a force for good. If we continue to let the dominant voice be that AI is the Terminator of education and critical thinking, then that will be the fate we have made for ourselves.
AI Tools for Strategy and Research – GT #32 — from goodtools.substack.com by Robin Good Getting expert advice, how to do deep research with AI, prompt strategy, comparing different AIs side-by-side, creating mini-apps and an AI Agent that can critically analyze any social media channel
In this week’s blog post, I’ll share my take on how the instructional design role is evolving and discuss what this means for our day-to-day work and the key skills it requires.
…
With this in mind, I’ve been keeping a close eye on open instructional design roles and, in the last 3 months, have noticed the emergence of a new flavour of instructional designer: the so-called “Generative AI Instructional Designer.”
Let’s take a deep dive into three explicitly AI-focused instructional design positions that have popped up in the last quarter. Each one illuminates a different aspect of how the role is changing—and together, they paint a picture of where our profession is likely heading.
Designers who evolve into prompt engineers, agent builders, and strategic AI advisors will capture the new premium. Those who cling to traditional tool-centric roles may find themselves increasingly sidelined—or automated out of relevance.
Google’s parent company announced Wednesday (8/6/25) that it’s planning to spend $1 billion over the next three years to help colleges teach and train students about artificial intelligence.
Google is joining other AI companies, including OpenAI and Anthropic, in investing in AI training in higher education. All three companies have rolled out new tools aimed at supporting “deeper learning” among students and made their AI platforms available to certain students for free.
Based on current technology capabilities, adoption patterns, and the mission of community colleges, here are five well-supported predictions for AI’s impact in the coming years.
“Where generative AI creates, agentic AI acts.” That’s how my trusted assistant, Gemini 2.5 Pro deep research, describes the difference.
…
Agents, unlike generative tools, plan and carry out multistep goals with minimal human supervision. The essential difference lies in their proactive nature. Rather than waiting for a specific, step-by-step command, agentic systems take a high-level objective and independently create and execute a plan to achieve that goal. This triggers a continuous, iterative workflow that is much like a cognitive loop. The typical agentic process involves six key steps, as described by Nvidia:
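That plan-then-act loop can be sketched, very loosely, in a few lines of Python. This is a hypothetical illustration only: `plan`, `execute`, and `agent` are made-up stand-ins for what would be model calls and tool invocations in a real agentic system, not any actual Nvidia or vendor API.

```python
def plan(goal):
    # Decompose a high-level objective into ordered sub-tasks.
    # (Stubbed; a real agent would use an LLM to generate this plan.)
    return [f"research {goal}", f"draft {goal}", f"review {goal}"]

def execute(task):
    # Stand-in for a tool call or model invocation.
    return f"done: {task}"

def agent(goal):
    # The cognitive loop: take a goal, plan, then act step by step,
    # recording each outcome so later iterations can build on it.
    results = []
    for task in plan(goal):
        outcome = execute(task)
        results.append(outcome)
    return results

print(agent("quarterly report"))
# prints ['done: research quarterly report', 'done: draft quarterly report', 'done: review quarterly report']
```

The key contrast with a generative tool is visible in the loop itself: the user supplies only the goal, and the system decides and performs the intermediate steps.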
Our 2025 national survey of over 650 respondents across 49 states and Puerto Rico reveals both encouraging trends and important challenges. While AI adoption and optimism are growing, concerns about cheating, privacy, and the need for training persist.
Despite these challenges, I’m inspired by the resilience and adaptability of educators. You are the true game-changers in your students’ growth, and we’re honored to support this vital work.
This report reflects both where we are today and where we’re headed with AI. More importantly, it reflects your experiences, insights, and leadership in shaping the future of education.
This groundbreaking collaboration represents a transformative step forward in education technology and will begin with, but is not limited to, an effort between Instructure and OpenAI to enhance the Canvas experience by embedding OpenAI’s next-generation AI technology into the platform.
IgniteAI, announced earlier today, establishes Instructure’s future-ready, open ecosystem with agentic support as the AI landscape continues to evolve. This partnership with OpenAI exemplifies this bold vision for AI in education. Instructure’s strategic approach to AI emphasizes the enhancement of connections within an educational ecosystem comprising over 1,100 edtech partners and leading LLM providers.
“We’re committed to delivering next-generation LMS technologies designed with an open ecosystem that empowers educators and learners to adapt and thrive in a rapidly changing world,” said Steve Daly, CEO of Instructure. “This collaboration with OpenAI showcases our ambitious vision: creating a future-ready ecosystem that fosters meaningful learning and achievement at every stage of education. This is a significant step forward for the education community as we continuously amplify the learning experience and improve student outcomes.”
Faculty Latest Targets of Big Tech’s AI-ification of Higher Ed — from insidehighered.com by Kathryn Palmer A new partnership between OpenAI and Instructure will embed generative AI in Canvas. It may make grading easier, but faculty are skeptical it will enhance teaching and learning.
The two companies, which have not disclosed the value of the deal, are also working together to embed large language models into Canvas through a feature called IgniteAI. It will work with an institution’s existing enterprise subscription to LLMs such as Anthropic’s Claude or OpenAI’s ChatGPT, allowing instructors to create custom LLM-enabled assignments. They’ll be able to tell the model how to interact with students—and even evaluate those interactions—and what it should look for to assess student learning. According to Instructure, any student information submitted through Canvas will remain private and won’t be shared with OpenAI.
… Faculty Unsurprised, Skeptical
Few faculty were surprised by the Canvas-OpenAI partnership announcement, though many are reserving judgment until they see how the first year of using it works in practice.
AI and Higher Ed: An Impending Collapse — from insidehighered.com by Robert Niebuhr; via George Siemens; I also think George’s excerpt (see below) gets right to the point. Universities’ rush to embrace AI will lead to an untenable outcome, Robert Niebuhr writes.
Herein lies the trap. If students learn how to use AI to complete assignments and faculty use AI to design courses, assignments, and grade student work, then what is the value of higher education? How long until people dismiss the degree as an absurdly overpriced piece of paper? How long until that trickles down and influences our economic and cultural output? Simply put, can we afford a scenario where students pretend to learn and we pretend to teach them?
This next report doesn’t look too good for traditional institutions of higher education either:
For the first time in modern history, a bachelor’s degree is no longer a reliable path to professional employment. Recent graduates face rising unemployment and widespread underemployment as structural—not cyclical—forces reshape entry-level work. This new report identifies four interlocking drivers: an AI-powered “Expertise Upheaval” eliminating many junior tasks, a post-pandemic shift to lean staffing and risk-averse hiring, AI acting as an accelerant to these changes, and a growing graduate glut. As a result, young degree holders are uniquely seeing their prospects deteriorate – even as the rest of the economy remains robust. Read the full report to explore the data behind these trends.
What is Study Mode?
Study Mode is OpenAI’s take on a smarter study partner – a version of the ChatGPT experience designed to guide users through problems with Socratic prompts, scaffolded reasoning, and adaptive feedback (instead of just handing over the answer).
Built with input from learning scientists, pedagogy experts, and educators, it was also shaped by direct feedback from college students. While Study Mode is designed with college students in mind, it’s meant for anyone who wants a more learning-focused, hands-on experience across a wide range of subjects and skill levels.
Who can access it? And how?
Starting July 29, Study Mode is available to users on Free, Plus, Pro, and Team plans. It will roll out to ChatGPT Edu users in the coming weeks.
ChatGPT became your tutor — from theneurondaily.com by Grant Harvey PLUS: NotebookLM has video now & GPT 4o-level AI runs on laptop
Here’s how it works: instead of asking “What’s 2+2?” and getting “4,” study mode asks questions like “What do you think happens when you add these numbers?” and “Can you walk me through your thinking?” It’s like having a patient tutor who won’t let you off the hook that easily.
The key features include:
Socratic questioning: It guides you with hints and follow-up questions rather than direct answers.
Scaffolded responses: Information broken into digestible chunks that build on each other.
Personalized support: Adjusts difficulty based on your skill level and previous conversations.
Knowledge checks: Built-in quizzes and feedback to make sure concepts actually stick.
Toggle flexibility: Switch study mode on and off mid-conversation depending on your goals.
Try study mode yourself by selecting “Study and learn” from tools in ChatGPT and asking a question.
Introducing study mode — from openai.com A new way to learn in ChatGPT that offers step-by-step guidance instead of quick answers.
[On 7/29/25, we introduced] study mode in ChatGPT—a learning experience that helps you work through problems step by step instead of just getting an answer. Starting today, it’s available to logged-in users on Free, Plus, Pro, and Team, with availability in ChatGPT Edu coming in the next few weeks.
ChatGPT is becoming one of the most widely used learning tools in the world. Students turn to it to work through challenging homework problems, prepare for exams, and explore new concepts. But its use in education has also raised an important question: how do we ensure it is used to support real learning, and doesn’t just offer solutions without helping students make sense of them?
We’ve built study mode to help answer this question. When students engage with study mode, they’re met with guiding questions that calibrate responses to their objective and skill level to help them build deeper understanding. Study mode is designed to be engaging and interactive, and to help students learn something—not just finish something.
“We know that true learning takes friction. It takes struggle,” said Mills. “You have to engage with the materials, and if students offload all of that work to a tool like ChatGPT, they will not learn those skills and they will not gain that critical thinking. That said, when ChatGPT is used correctly as a learning assistant and as a tutor, the results are powerful.”
Given that 40 percent of ChatGPT users are under the age of 24—and that learning is the platform’s number one use case, according to Mills—the need to fine-tune guardrails is becoming increasingly urgent. Pew Research reports that twice as many teens now use ChatGPT for schoolwork compared to 2023, with nearly one-third of teen respondents saying it’s acceptable to use the tool to solve math problems.
In response, Mills said OpenAI is actively researching what appropriate A.I. use in education looks like, with plans to share that guidance widely and rapidly with educators around the world.
From DSC: In looking at MyNextChapter.ai — THIS TYPE OF FUNCTIONALITY of an AI-based chatbot talking to you re: good fits for a future job — is the kind of thing that could work well in this type of vision/learning platform. The AI asks you relevant career-oriented questions, comes up with some potential job fits, and then gives you resources about how to gain those skills, who to talk with, organizations to join, next steps to get your foot in the door somewhere, etc.
The next-gen learning platform would provide links to online-based courses, blogs, people’s names on LinkedIn, and courses from L&D organizations, institutions of higher education, or other entities/places to obtain those skills (similar to the “Action Plan” below from MyNextChapter.ai).