AI agents arrive in US classrooms — from zdnet.com by Radhika Rajkumar
Kira AI’s personalized learning platform is currently being implemented in Tennessee schools. How will it change education?

AI for education is a new but rapidly expanding field. Can it support student outcomes and help teachers avoid burnout?

On Wednesday, AI education company Kira launched a “fully AI-native learning platform” for K-12 education, complete with agents to assist teachers with repetitive tasks. The platform hosts assignments, analyzes progress data, offers administrative assistance, helps build lesson plans and quizzes, and more.

“Unlike traditional tools that merely layer AI onto existing platforms, Kira integrates artificial intelligence directly into every educational workflow — from lesson planning and instruction to grading, intervention, and reporting,” the release explains. “This enables schools to improve student outcomes, streamline operations, and provide personalized support at scale.”

Also relevant/see:

Coursera Founder Andrew Ng’s New Venture Brings A.I. to K–12 Classrooms — from observer.com by Victor Dey
Andrew Ng’s Kira Learning uses A.I. agents to transform K–12 education with tools for teachers, students and administrators.

“Teachers today are overloaded with repetitive tasks. A.I. agents can change that, and free up their time to give more personalized help to students,” Ng said in a statement.

Kira was co-founded by Andrea Pasinetti and Jagriti Agrawal, both longtime collaborators of Ng. The platform embeds A.I. directly into lesson planning, instruction, grading and reporting. Teachers can instantly generate standards-aligned lesson plans, monitor student progress in real time and receive automated intervention strategies when a student falls behind.

Students, in turn, receive on-demand tutoring tailored to their learning styles. A.I. agents adapt to each student’s pace and mastery level, while grading is automated with instant feedback—giving educators time to focus on teaching.


‘Using GenAI is easier than asking my supervisor for support’ — from timeshighereducation.com
Doctoral researchers are turning to generative AI to assist in their research. How are they using it, and how can supervisors and candidates have frank discussions about using it responsibly?

Generative AI is increasingly the proverbial elephant in the supervisory room. As supervisors, you may be concerned about whether your doctoral researchers are using GenAI. It can be a tricky topic to broach, especially when you may not feel confident in understanding the technology yourself.

While the potential impact of GenAI use among undergraduate and postgraduate taught students, especially, is well discussed (and it is increasingly accepted that students and staff need to become “AI literate”), doctoral researchers often slip through the cracks in institutional guidance and policymaking.


AI as a Thought Partner in Higher Education — from er.educause.edu by Brian Basgen

When used thoughtfully and transparently, generative artificial intelligence can augment creativity and challenge assumptions, making it an excellent tool for exploring and developing ideas.

The glaring contrast between the perceived ubiquity of GenAI and its actual use also reveals fundamental challenges associated with the practical application of these tools. This article explores two key questions about GenAI to address common misconceptions and encourage broader adoption and more effective use of these tools in higher education.


AI for Automation or Augmentation of L&D? — from drphilippahardman.substack.com by Dr. Philippa Hardman
An audio summary of my Learning Technologies talk

Like many of you, I spent the first part of this week at Learning Technologies in London, where I was lucky enough to present a session on the current state of AI and L&D.

In this week’s blog post, I summarise what I covered and share an audio summary of my paper for you to check out.


Bridging the AI Trust Gap — from chronicle.com by Ian Wilhelm, Derek Bruff, Gemma Garcia, and Lee Rainie

In a 2024 Chronicle survey, 86 percent of administrators agreed with the statement: “Generative artificial intelligence tools offer an opportunity for higher education to improve how it educates, operates, and conducts research.” In contrast, just 55 percent of faculty agreed, showing the stark divisions between faculty and administrative perspectives on adopting AI.

Among many faculty members, a prevalent distrust of AI persists — and for valid reasons. How will it impact in-class instruction? What does the popularity of generative AI tools portend for the development of critical thinking skills for Gen-Z students? How can institutions, at the administrative level, develop policies to safeguard against students using these technologies as tools for cheating?

Given this increasing ‘trust gap,’ how can faculty and administrators work together to preserve academic integrity as AI seeps into all areas of academia, from research to the classroom?

Join us for “Bridging the AI Trust Gap,” an extended, 75-minute Virtual Forum exploring the trust gap on campus about AI, the contours of the differences, and what should be done about it.

 

Higher Ed Institutions Rely Less on OPMs While Increasingly Hiring Fee-For-Service Models — from iblnews.org

A market report from Validated Insights released this month notes that fewer colleges and universities are hiring external online program management (OPM) companies to develop their courses.

For 2024, higher education institutions launched only 81 new partnerships with OPMs —  a drop of 42% and the lowest number since 2016.

The report showed that institutions increasingly pay OPMs a fee-for-service instead of following a revenue-sharing model with big service bundles and profit splits.

Experts say revenue-sharing models, which critics denounce as predatory arrangements, incentivize service providers to use aggressive recruiting tactics to increase enrollments and maximize tuition revenue.

According to the report, fee-for-service has become the dominant business model for OPMs.


6 Online Edtech Professional Learning Communities & Resources for Teachers — from techlearning.com by Stephanie Smith Budhai, Ph.D.
These resources can help provide training, best practices, and advice for using digital tools such as Canva, Curipod, Kahoot!, and more

While school-led professional development can be helpful, there are online professional learning communities on various edtech websites that can be leveraged. Also, some of these community spaces offer the chance to monetize your work.

Here is a summary of six online edtech professional learning spaces.

 

What does ‘age appropriate’ AI literacy look like in higher education? — from timeshighereducation.com by Fun Siong Lim
As AI literacy becomes an essential work skill, universities need to move beyond developing these competencies at ‘primary school’ level in their students. Here, Fun Siong Lim reflects on frameworks to support higher-order AI literacies

Like platforms developed at other universities, Project NALA offers a front-end interface (known as the builder) for faculty to create their own learning assistant. An idea we have is to open the builder up to students to allow them to create their own GenAI assistant as part of our AI literacy curriculum. As they design, configure and test their own assistant, they will learn firsthand how generative AI works. They get to test performance-enhancement approaches beyond prompt engineering, such as grounding the learning assistant with curated materials (retrieval-augmented generation) and advanced ideas such as incorporating knowledge graphs.

They should have the opportunity to analyse, evaluate and create responsible AI solutions. Offering students the opportunity to build their own AI assistants could be a way forward to develop these much-needed skills.
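A minimal sketch can make the "grounding with curated materials" idea concrete. The Python below is not Project NALA's builder; the course snippets, the TF-IDF retriever, and the prompt template are illustrative assumptions, and the assembled prompt would simply be handed to whatever model the builder wraps.

```python
# Minimal RAG sketch: retrieve the most relevant curated snippets for a
# student question and assemble a grounded prompt. Illustrative only --
# not Project NALA's implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical curated course materials supplied by the instructor
documents = [
    "Photosynthesis converts light energy into chemical energy stored in glucose.",
    "Cellular respiration breaks down glucose to release ATP for the cell.",
    "The Calvin cycle fixes carbon dioxide into organic molecules in the stroma.",
]

question = "How does a plant turn sunlight into usable energy?"

# Retrieve: rank documents by cosine similarity to the question
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(documents + [question])
scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
top_docs = [documents[i] for i in scores.argsort()[::-1][:2]]

# Augment: ground the assistant's prompt in the retrieved snippets
prompt = (
    "Answer the student's question using ONLY the course materials below.\n\n"
    "Course materials:\n- " + "\n- ".join(top_docs) +
    f"\n\nStudent question: {question}"
)
print(prompt)  # this grounded prompt would be sent to the underlying LLM
```

Swapping the TF-IDF retriever for embeddings, or layering a knowledge graph over the documents, corresponds to the "performance-enhancement approaches beyond prompt engineering" mentioned above.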


How to Use ChatGPT 4o’s Update to Turn Key Insights Into Clear Infographics (Prompts Included) — from evakeiffenheim.substack.com by Eva Keiffenheim
This 3-step workflow helps you break down books, reports, or slide-decks into professional visuals that accelerate understanding.

This article shows you how to find core ideas, prompt GPT-4o for a design brief, and generate clean, professional images that stick. These aren’t vague “creative visuals”—they’re structured for learning, memory, and action.

If you’re a lifelong learner, educator, creator, or just someone who wants to work smarter, this process is for you.

You’ll spend less time re-reading and more time understanding. And maybe—just maybe—you’ll build ideas that not only click in your brain, but also stick in someone else’s.
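The article describes the workflow inside the ChatGPT interface; for readers who would rather script it, the sketch below mirrors the same three steps (core ideas, design brief, image) against the OpenAI API. The model names, prompts, and file path are assumptions for illustration, not Eva Keiffenheim's exact setup.

```python
# Sketch of the 3-step infographic workflow via the OpenAI API.
# Model names ("gpt-4o", "gpt-image-1") and the source file are assumptions.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment
source_text = open("report.txt").read()  # hypothetical source document

# Step 1: extract the core ideas
ideas = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user",
               "content": f"List the 3-5 core ideas of this text:\n\n{source_text}"}],
).choices[0].message.content

# Step 2: turn the ideas into a design brief for an infographic
brief = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user",
               "content": "Write a one-paragraph infographic design brief "
                          f"(layout, hierarchy, labels) for these ideas:\n\n{ideas}"}],
).choices[0].message.content

# Step 3: generate the infographic image from the brief
image = client.images.generate(model="gpt-image-1", prompt=brief, size="1024x1024")
print(image.data[0].b64_json[:80], "...")  # base64-encoded image data
```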


SchoolAI Secures $25 Million to Help Teachers and Schools Reach Every Student — from globenewswire.com
 The Classroom Experience platform gives every teacher and student their own AI tools for personalized learning

SchoolAI’s Classroom Experience platform combines AI assistants for teachers that help with classroom preparation and other administrative work, and Spaces–personalized AI tutors, games, and lessons that can adapt to each student’s unique learning style and interests. Together, these tools give teachers actionable insights into how students are doing, and how the teacher can deliver targeted support when it matters most.

“Teachers and schools are navigating hard challenges with shrinking budgets, teacher shortages, growing class sizes, and ongoing recovery from pandemic-related learning gaps,” said Caleb Hicks, founder and CEO of SchoolAI. “It’s harder than ever to understand how every student is really doing. Teachers deserve powerful tools to help extend their impact, not add to their workload. This funding helps us double down on connecting the dots for teachers and students, and later this year, bringing school administrators and parents at home onto the platform as well.”


AI in Education, Part 3: Looking Ahead – The Future of AI in Learning — from rdene915.com by Dr. Rachelle Dené Poth

In the first and second parts of my AI series, I focused on where we see AI in classrooms. Benefits range from personalized learning and accessibility tools to AI-driven grading and support from a teaching assistant. In Part 2, I chose to focus on some of the important considerations related to ethics that must be part of the conversation. Schools need to focus on data privacy, bias, overreliance, and the equity divide. For this last part of the current AI series, I wanted to focus on the future. Where do we go from here?


Anthropic Education Report: How University Students Use Claude — from anthropic.com

The key findings from our Education Report are:

  • STEM students are early adopters of AI tools like Claude, with Computer Science students particularly overrepresented (accounting for 36.8% of students’ conversations while comprising only 5.4% of U.S. degrees). In contrast, Business, Health, and Humanities students show lower adoption rates relative to their enrollment numbers.
  • We identified four patterns by which students interact with AI, each of which was present in our data at approximately equal rates (each 23-29% of conversations): Direct Problem Solving, Direct Output Creation, Collaborative Problem Solving, and Collaborative Output Creation.
  • Students primarily use AI systems for creating (using information to learn something new) and analyzing (taking apart the known and identifying relationships), such as creating coding projects or analyzing law concepts. This aligns with higher-order cognitive functions on Bloom’s Taxonomy. This raises questions about ensuring students don’t offload critical cognitive tasks to AI systems.

From the Kuali Days 2025 Conference: A CEO’s View of Planning for AI — from campustechnology.com by Mary Grush
A Conversation with Joel Dehlin

How can a company serving higher education navigate the changes AI brings to the ed tech marketplace? What will customers expect in this dynamic? Here, CT talks with Kuali CEO Joel Dehlin, who shared his company’s AI strategies in a featured plenary session, “Sneak Peek of AI in Kuali Build,” at Kuali Days 2025 in Anaheim.


How students can use generative AI — from aliciabankhofer.substack.com by Alicia Bankhofer
Part 4 of 4 in my series on Teaching and Learning in the AI Age

This article is the culmination of a series exploring AI’s impact on education.

Part 1: What Educators Need outlined essential AI literacy skills for teachers, emphasizing the need to move beyond basic ChatGPT exploration to understand the full spectrum of AI tools available in education.

Part 2: What Students Need addressed how students require clear guidance to use AI safely, ethically, and responsibly, with emphasis on developing critical thinking skills alongside AI literacy.

Part 3: How Educators Can Use GenAI presented ten practical use cases for teachers, from creating differentiated resources to designing assessments, demonstrating how AI can reclaim 5-7 hours weekly for meaningful student interactions.

Part 4: How Students Can Use GenAI (this article) provides frameworks for guiding student AI use based on Joscha Falck’s dimensions: learning about, with, through, despite, and without AI.


Mapping a Multidimensional Framework for GenAI in Education — from er.educause.edu by Patricia Turner
Prompting careful dialogue through incisive questions can help chart a course through the ongoing storm of artificial intelligence.

The goal of this framework is to help faculty, educational developers, instructional designers, administrators, and others in higher education engage in productive discussions about the use of GenAI in teaching and learning. As others have noted, theoretical frameworks will need to be accompanied by research and teaching practice, each reinforcing and reshaping the others to create understandings that will inform the development of approaches to GenAI that are both ethical and maximally beneficial, while mitigating potential harms to those who engage with it.


Instructional Design Isn’t Dying — It’s Specialising — from drphilippahardman.substack.com by Dr. Philippa Hardman
Aka, how AI is impacting role & purpose of Instructional Design

Together, these developments have revealed something important: despite widespread anxiety, the instructional design role isn’t dying—it’s specialising.

What we’re witnessing isn’t the automation of instructional design and the death of the instructional designer, but rather the evolution of the ID role into multiple distinct professional pathways.

The generalist “full stack” instructional designer is slowly but decisively fracturing into specialised roles that reflect both the capabilities of generative AI and the strategic imperatives facing modern organisations.

In this week’s blog post, I’ll share what I’ve learned about how our field is transforming, and what it likely means for you and your career path.

Those instructional designers who cling to traditional generalist models risk being replaced, but those who embrace specialisation, data fluency, and AI collaboration will excel and lead the next evolution of the field. Similarly, those businesses that continue to view L&D as a cost centre and focus on automating content delivery will be outperformed, while those that invest in building agile, AI-enabled learning ecosystems will drive measurable performance gains and secure their competitive advantage.


Adding AI to Every Step in Your eLearning Design Workflow — from learningguild.com by George Hanshaw

We know that eLearning is a staple of training and development. The expectations of the learners are higher than ever: They expect a dynamic, interactive, and personalized learning experience. As instructional designers, we are tasked with meeting these expectations by creating engaging and effective learning solutions.

The integration of Artificial Intelligence (AI) into our eLearning design process is a game-changer that can significantly enhance the quality and efficiency of our work.

Whether you use ADDIE or rapid prototyping, AI fits into every aspect of your workflow. By integrating AI, you can ensure a more efficient and effective design process that adapts to the unique needs of your learners. This not only saves time and resources but also significantly enhances the overall learning experience. We will explore the needs analysis and the general design process.

 

The following resource was from Roberto Ferraro:

Micromanagement — from psychsafety.com by Jade Garratt

Psychological Safety and Micromanagement
Those who have followed our work at Psych Safety for a while will know that we believe exploring not just what to do – the behaviours and practices that support psychological safety – but also what to avoid can be hugely valuable. Understanding the behaviours that damage psychological safety, what not to do, and even what not to say can help us build better workplaces.

There are many behaviours that damage psychological safety, and one that almost always comes up in our workshops when discussing cultures of fear is micromanagement. So we thought it was time we explored micromanagement in more detail, considering how and why it damages psychological safety and what we can do instead.

Micromanagement is a particular approach to leadership where a manager exhibits overly controlling behaviours or an excessive and inappropriate focus on minor details. They might scrutinise their team’s work closely, insist on checking work, refrain from delegating, and limit the autonomy people need to do their jobs well. It can also manifest as an authoritarian leadership style, where decision-making is centralised (back to themselves) and employees have little say in their work.


From DSC:
I was fortunate not to have a manager who was a micromanager until my very last boss/supervisor of my career. But it was that particular manager who made me call it quits and leave that track. She demeaned me in front of others and was extremely directive and controlling. She wanted constant check-ins and progress reports. And I could go on and on here.

But suffice it to say that after having worked for several decades, that kind of manager was not what I was looking for. Neither would you be. By the way…my previous boss — at the same place — and I achieved a great deal in a very short time. She taught me a lot and was a great administrator, designer, professor, mentor, and friend. But that boss was moved to a different role as upper management/leadership changed. Then the micromanagement began after I reported to a different supervisor.

Anyway, don’t be a micromanager. If you are a recent graduate or are coming up on your graduation from college, learn that lesson now. No one likes to work for a micromanager. No one. It can make your employees’ lives miserable and do damage to their mental health, their enjoyment (or lack thereof) of work, and several other things that this article mentions. Instead, respect your employees. Trust your employees. Let them do their thing. See what they might need, then help meet those needs. Then get out of their way.


 



2025 EDUCAUSE Students and Technology Report: Shaping the Future of Higher Education Through Technology, Flexibility, and Well-Being — from library.educause.edu

The student experience in higher education is continually evolving, influenced by technological advancements, shifting student needs and expectations, evolving workforce demands, and broadening sociocultural forces. In this year’s report, we examine six critical aspects of student experiences in higher education, providing insights into how institutions can adapt to meet student needs and enhance their learning experience and preparation for the workforce:

  • Satisfaction with Technology-Related Services and Supports
  • Modality Preferences
  • Hybrid Learning Experiences
  • Generative AI in the Classroom
  • Workforce Preparation
  • Accessibility and Mental Health

DSC: Shame on higher ed for not preparing students for the workplace (see below). You’re doing your students wrong…again. Not only do you continue to heap a load of debt on their backs, but you’re also continuing to not get them ready for the workplace. So don’t be surprised if eventually you’re replaced by a variety of alternatives that students will flock towards.

 

DSC: And students don’t have a clue as to what awaits them in the workplace — they rank AI-powered tools and technologies at an incredibly low 3%. Yeah, right. You’ll find out. Here’s but one example from one discipline/field of work –> Thomson Reuters Survey: Over 95% of Legal Professionals Expect Gen AI to Become Central to Workflow Within Five Years


Figure 15. Competency Areas Expected to Be Important for Career

 

From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?



 

Reflections on “Are You Ready for the AI University? Everything is about to change.” [Latham]

Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham

Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.

What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.

Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance; human professors, guiding it as it sorts through the material, would help the AI understand the structure of the discipline and then develop lectures, videos, supporting documentation, and assessments.

In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If you need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student who is going on a trip and wishes to take an exam on the plane will be able to log on and complete the AI-designed and administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.

Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom. 

From DSC:
I had a very difficult time deciding which excerpts to include; there was so much more in this solid article to think about. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.

Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him; as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of higher education as we know it. Scott asserts that faculty members, doctoral students, and graduate/teaching assistants (and, I would add, Teaching & Learning Centers and IT Departments) will experience significant impacts. But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.

There are a few places where Scott and I differ.

  • The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned its 5-year, $100M ed push last year, and one of the key conclusions was that people want to learn from — and with — other people:

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY

By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they would be able to do.

  • Also, I would use the term learning preferences where Scott uses the term learning styles.

Scott also mentions:

“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”

It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).

That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.

So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!


Addendum later on 4/10/25:

U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain

Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.

The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.

The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.
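The "never giving away answers" behavior described above is usually implemented as a system instruction wrapped around the underlying model. The sketch below is not the U-M/Google Virtual TA; it is a minimal illustration of that design choice using Google's public Gemini SDK, with an assumed model name and a hypothetical tutoring prompt.

```python
# Minimal sketch of a "never give away the answer" tutoring assistant.
# This is NOT the U-M / Google Virtual TA -- just an illustration of the
# design choice using the public google-genai SDK (model name assumed).
from google import genai
from google.genai import types

client = genai.Client()  # expects GOOGLE_API_KEY in the environment

TUTOR_INSTRUCTION = (
    "You are a teaching assistant for an introductory statistics course. "
    "Never state the final answer to a graded problem. Instead, ask guiding "
    "questions, point to relevant concepts, and check the student's reasoning "
    "one step at a time."
)

response = client.models.generate_content(
    model="gemini-2.0-flash",  # assumed model identifier
    contents="What is the p-value for problem 3 on this week's homework?",
    config=types.GenerateContentConfig(system_instruction=TUTOR_INSTRUCTION),
)
print(response.text)  # expected: guiding questions, not the p-value itself
```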

 




Students and folks looking for work may want to check out:

Also relevant/see:


 

Introducing NextGenAI: A consortium to advance research and education with AI — from openai.com; via Claire Zau
OpenAI commits $50M in funding and tools to leading institutions.

Today, we’re launching NextGenAI, a first-of-its-kind consortium with 15 leading research institutions dedicated to using AI to accelerate research breakthroughs and transform education.

AI has the power to drive progress in research and education—but only when people have the right tools to harness it. That’s why OpenAI is committing $50M in research grants, compute funding, and API access to support students, educators, and researchers advancing the frontiers of knowledge.

Uniting institutions across the U.S. and abroad, NextGenAI aims to catalyze progress at a rate faster than any one institution would alone. This initiative is built not only to fuel the next generation of discoveries, but also to prepare the next generation to shape AI’s future.


 ‘I want him to be prepared’: why parents are teaching their gen Alpha kids to use AI — from theguardian.com by Aaron Mok; via Claire Zau
As AI grows increasingly prevalent, some are showing their children tools from ChatGPT to Dall-E to learn and bond

“My goal isn’t to make him a generative AI wizard,” White said. “It’s to give him a foundation for using AI to be creative, build, explore perspectives and enrich his learning.”

White is part of a growing number of parents teaching their young children how to use AI chatbots so they are prepared to deploy the tools responsibly as personal assistants for school, work and daily life when they’re older.

 

Nvidia helps launch AI platform for teaching American Sign Language — from venturebeat.com by Dean Takahashi; via Claire Zau

Nvidia has unveiled a new AI platform for teaching people how to use American Sign Language to help bridge communication gaps.

The Signs platform is creating a validated dataset for sign language learners and developers of ASL-based AI applications.

Nvidia, the American Society for Deaf Children and creative agency Hello Monday are helping close this gap with Signs, an interactive web platform built to support ASL learning and the development of accessible AI applications.


Using Gen AI to Design, Implement, and Assess PBL — from gettingsmart.com by David Ross

Key Points

  • Generative AI can significantly reduce the time and effort required in designing PBL by providing tools for research, brainstorming, and organization.
  • AI tools can assist educators in managing project implementation and assessment, providing formative feedback and organizing resources efficiently.

I usually conclude blogs with some pithy words, but this time I’ll turn the microphone over to Rachel Harcrow, a high school English/Language Arts teacher at Young Women’s College Prep Charter School of Rochester, NY: “After years of struggling to call myself a PBL practitioner, I finally feel comfortable saying I am, thanks to the power of Gen AI,” Harcrow told me. “Initial ideas now turn into fully fledged high-quality project plans in minutes that I can refine, giving me the space and energy to focus on what truly matters: My students.”


AI Resources for District Leaders — from techlearning.com by Steve Baule
Educational leaders aiming to effectively integrate generative AI into their schools should consider several key resources

To truly harness the transformative power of generative AI in education, district leaders must navigate a landscape rich with resources and opportunities. By delving into state and national guidelines, exploring successful case studies, utilizing innovative planning tools, and engaging in professional development, educational leaders can craft robust implementation plans. These plans can then assist in integrating AI seamlessly into their schools and elevate the learning experience to new heights.


Anthropic brings ‘extended thinking’ to Claude, which can solve complex physics problems with 96.5% accuracy — from rdworldonline.com by Brian Buntz

Anthropic, a favorite frontier AI lab among many coders and genAI power users, has unveiled Claude 3.7 Sonnet, its first “hybrid reasoning” AI model. It is capable of both near-instant answers and in-depth, step-by-step reasoning within a single system.

Users can toggle an extended thinking mode in which the model self-reflects before answering, considerably improving performance on complex tasks like math, physics, and coding. In early testing by the author, the model largely succeeded in creating a Python program (related to unsupervised learning) that was close to 1,000 lines long and ran without error on the first or second try, including the unsupervised machine learning task shown below:
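For readers who want to see what "toggling" extended thinking looks like in code, here is a minimal sketch against Anthropic's Messages API. The model identifier and token budgets are assumptions; check Anthropic's current documentation before relying on them.

```python
# Minimal sketch: enabling Claude's extended thinking mode via the API.
# Model name and budgets are assumptions -- verify against current docs.
import anthropic

client = anthropic.Anthropic()  # expects ANTHROPIC_API_KEY in the environment

response = client.messages.create(
    model="claude-3-7-sonnet-20250219",   # assumed model identifier
    max_tokens=4096,                      # must exceed the thinking budget
    thinking={"type": "enabled", "budget_tokens": 2048},
    messages=[{
        "role": "user",
        "content": "A ball is thrown upward at 12 m/s. How high does it rise?",
    }],
)

# The response interleaves "thinking" blocks (the self-reflection) with
# the final "text" blocks that would be shown to the user.
for block in response.content:
    if block.type == "thinking":
        print("[thinking]", block.thinking[:200], "...")
    elif block.type == "text":
        print("[answer]", block.text)
```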


New Tools. Old Complaints. Why AI Won’t Kill Education or Fix it  — from coolcatteacher.com by Vicki Davis; via Stephen Downes

AI won’t kill education. But will it kill learning? The challenge isn’t AI itself—it’s whether students can still think for themselves when the answers are always one click away.

Wait. Before you go, let me ask you one thing.
AI has opportunities to help learning. But it also won’t fix it. The real question isn’t whether students can use AI—but whether they’re still learning without it.

Whether the learning is happening between the ears.

And so much of what we teach in schools isn’t the answers on a test. It answers questions like “What is my purpose in life?” “How do I make friends?” and “How can I help my team be stronger?” Questions that aren’t asked on a test but are essential to living a good life. These questions aren’t answered between the ears but within the heart.

That, my friends, is what teaching has always been about.

The heart.

And the heart of the matter is that we have new challenges, but these are old complaints, complaints since the beginning of time and teaching. And in those days, you didn’t need kids just to be able to talk about how to build a fire; they had to make one themselves. Their lives depended on it.

And these days, we need to build another kind of fire: a fire that sparks the joy of learning, the joy of the opportunities that await us, sparked by some of the most powerful tools ever invented. Kids don’t just need to be able to talk about making a difference; they need to know how to build a better world tomorrow. Our lives depend on it.


How Debating Skills Can Help Us In The Fight Against AI — from adigaskell.org by Adi Gaskell

Debating skills have a range of benefits in the workplace, from helping to improve our communication to bolstering our critical thinking skills. Research from the University of Mississippi suggests it might also help us in the battle with AI in the workplace.

We can often assume that debate teaches us nothing more than how to argue our point, but in order to do this, we have to understand both our own take on a subject and that of our opponent. This allows us to see both sides of any issue we happen to be debating.

“Even though AI has offered a shortcut through the writing process, it actually still is important to be able to write and speak and think on your own,” the researchers explain. “That’s what the focus of this research is: how debate engenders those aspects of being able to write and speak and study and research on your own.”

 

Assistive tech in your classroom: A practical guide — from understood.org by Andrew M.I. Lee, JD
Assistive technology (AT) refers to tools that let people with differences work around challenges. They make tasks and activities accessible at school, work, and home. Learn how AT apps and software can help with reading, writing, math, and more.

People who learn and think differently can use technology to help work around their challenges. This is called assistive technology (AT). AT helps people with disabilities learn, communicate, or function better. It can be as high-tech as a computer, or as low-tech as a pencil grip. It’s a type of accommodation that involves tools.

Assistive technology has two parts: devices (the actual tools people use) and services (the support to choose and use the tools).

Students who struggle with learning can use AT to help with subjects like reading, writing, and math. AT can also help kids and adults with the tasks of daily life. And many adults use these tools on the job, too.

 

AI in K12: Today’s Breakthroughs and Tomorrow’s Possibilities (webinar)
How AI is Transforming Classrooms Today and What’s Next


Audio-Based Learning 4.0 — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new & powerful way to leverage AI for learning?

At the end of all of this my reflection is that the research paints a pretty exciting picture – audio-based learning isn’t just effective, it’s got some unique superpowers when it comes to boosting comprehension, ramping up engagement, and delivering feedback that really connects with learners.

While audio has been massively under-used as a mode of learning, especially compared to video and text, we’re at an interesting turning point where AI tools are making it easier than ever to tap into audio’s potential as a pedagogical tool.

What’s super interesting is how solid the research backing audio’s effectiveness is, and how well it converges with these new AI capabilities.

From DSC:
I’ve noticed that I don’t learn as well via audio-only events. It can help if visuals are also provided, but I have to watch the cognitive load. My processing can start to get overloaded — to the point that I have to close my eyes and just listen sometimes. But there are people I know who love to listen to audiobooks and prefer to learn that way. They can devour content and process/remember it all. Audio is a nice change of pace at times, but oftentimes I prefer visuals and reading. It needs to be absolutely quiet if I’m tackling some new information/learning.


In Conversation With… Ashton Cousineau — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new video series exploring how L&D professionals are working with AI on the ground



The Learning Research Digest vol. 28 — from learningsciencedigest.substack.com by Dr. Philippa Hardman

Hot Off the Research Press This Month:

  • AI-Infused Learning Design – A structured approach to AI-enhanced assignments using a three-step model for AI integration.
  • Mathematical Dance and Creativity in STEAM – Using AI-powered motion capture to translate dance movements into mathematical models.
  • AI-Generated Instructional Videos – How adaptive AI-powered video learning enhances problem-solving and knowledge retention.
  • Immersive Language Learning with XR & AI – A new framework for integrating AI-driven conversational agents with Extended Reality (XR) for task-based language learning.
  • Decision-Making in Learning Design – A scoping review on how instructional designers navigate complex instructional choices and make data-driven decisions.
  • Interactive E-Books and Engagement – Examining the impact of interactive digital books on student motivation, comprehension, and cognitive engagement.
  • Elevating Practitioner Voices in Instructional Design – A new initiative to amplify instructional designers’ contributions to research and innovation.

Deep Reasoning, Agentic AI & the Continued Rise of Specialised AI Research & Tools for Education — from learningfuturesdigest.substack.com by Dr. Philippa Hardman

Here’s a quick teaser of key developments in the world of AI & learning this month:

  • DeepSeek R1, OpenAI’s Deep Research & Perplexity’s ‘Deep Research’ are the latest additions to a growing number of “reasoning models,” with interesting implications for evidence-based learning design & development.
  • The U.S. Education Dept releases an AI Toolkit and a fresh policy roadmap enabling the adoption of AI in schools.
  • Anthropic releases “Agentic Claude,” another AI agent that clicks, scrolls, and can even successfully complete e-learning courses…
  • Oxford University announces the AIEOU Hub, a research-backed lab to support research and implementation of AI in education.
  • “AI Agents Everywhere”: A Forbes peek at how agentic AI will handle the “boring bits” of classroom life.
  • [Bias klaxon!] Epiphany AI: My own research leads to the creation of a specialised, “pedagogy first” AI co-pilot for instructional design, marking the continued growth of specialised AI tools designed for specific industries and workflows.

AI is the Perfect Teaching Assistant for Any Educator — from unite.ai by Navi Azaria, CPO at Kaltura

Through my work with leading educational institutions at Kaltura, I’ve seen firsthand how AI agents are rapidly becoming indispensable. These agents alleviate the mounting burdens on educators and provide new generations of tech-savvy students with accessible, personalized learning, giving teachers the support they need to give their students the personalized attention and engagement they deserve.


Learning HQ — from ai-disruptor-hq.notion.site

This HQ includes all of my AI guides, organized by tool/platform. This list is updated each time a new one is released, and outdated guides are removed/replaced over time.



How AI Is Reshaping Teachers’ Jobs — from edweek.org

Artificial intelligence is poised to fundamentally change the job of teaching. AI-powered tools can shave hours off the amount of time teachers spend grading, lesson-planning, and creating materials. AI can also enrich the lessons they deliver in the classroom and help them meet the varied needs of all students. And it can even help bolster teachers’ own professional growth and development.

Despite all the promise of AI, though, experts still urge caution as the technology continues to evolve. Ethical questions and practical concerns are bubbling to the surface, and not all teachers feel prepared to effectively and safely use AI.

In this special report, see how early-adopter teachers are using AI tools to transform their daily work, tackle some of the roadblocks to expanded use of the technology, and understand what’s on the horizon for the teaching profession in the age of artificial intelligence.

 

2025 EDUCAUSE AI Landscape Study: Into the Digital AI Divide — from library.educause.edu

The higher education community continues to grapple with questions related to using artificial intelligence (AI) in learning and work. In support of these efforts, we present the 2025 EDUCAUSE AI Landscape Study, summarizing our community’s sentiments and experiences related to strategy and leadership, policies and guidelines, use cases, the higher education workforce, and the institutional digital divide.

 
 

DeepSeek: How China’s AI Breakthrough Could Revolutionize Educational Technology — from nickpotkalitsky.substack.com by Nick Potkalitsky
Can DeepSeek’s 90% efficiency boost make AI accessible to every school?

The most revolutionary aspect of DeepSeek for education isn’t just its cost—it’s the combination of open-source accessibility and local deployment capabilities. As Azeem Azhar notes, “R-1 is open-source. Anyone can download and run it on their own hardware. I have R1-8b (the second smallest model) running on my Mac Mini at home.”
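To make "running it on your own hardware" concrete: a distilled R1 model can be served locally by a runtime such as Ollama and queried over its local HTTP API, roughly as sketched below. The model tag and endpoint follow Ollama's defaults and are assumptions here, not details from Potkalitsky's article.

```python
# Sketch: querying a locally served DeepSeek-R1 distilled model via
# Ollama's default HTTP API (after e.g. `ollama pull deepseek-r1:8b`).
# Model tag and endpoint are assumptions based on Ollama's conventions.
import requests

response = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "deepseek-r1:8b",
        "messages": [{
            "role": "user",
            "content": "Explain photosynthesis to a 9th grader in three sentences.",
        }],
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["message"]["content"])
```

Because no student data ever leaves the machine, this kind of local deployment is also what makes the privacy and cost arguments in the article plausible for schools.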

Real-time Learning Enhancement

  • AI tutoring networks that collaborate to optimize individual learning paths
  • Immediate, multi-perspective feedback on student work
  • Continuous assessment and curriculum adaptation

The question isn’t whether this technology will transform education—it’s how quickly institutions can adapt to a world where advanced AI capabilities are finally within reach of every classroom.


Over 100 AI Tools for Teachers — from educatorstechnology.com by Med Kharbach, PhD

I know through your feedback on my social media and blog posts that several of you have legitimate concerns about the impact of AI in education, especially those related to data privacy, academic dishonesty, AI dependence, loss of creativity and critical thinking, plagiarism, to mention a few. While these concerns are valid and deserve careful consideration, it’s also important to explore the potential benefits AI can bring when used thoughtfully.

Tools such as ChatGPT and Claude are like smart research assistants that are available 24/7 to support you with all kinds of tasks from drafting detailed lesson plans, creating differentiated materials, generating classroom activities, to summarizing and simplifying complex topics. Likewise, students can use them to enhance their learning by, for instance, brainstorming ideas for research projects, generating constructive feedback on assignments, practicing problem-solving in a guided way, and much more.

The point here is that AI is here to stay and expand, and we better learn how to use it thoughtfully and responsibly rather than avoid it out of fear or skepticism.


Beth’s posting links to:

 


Derek’s posting on LinkedIn


From Theory to Practice: How Generative AI is Redefining Instructional Materials — from edtechinsiders.substack.com by Alex Sarlin
Top trends and insights from The Edtech Insiders Generative AI Map research process about how Generative AI is transforming Instructional Materials

As part of our updates to the Edtech Insiders Generative AI Map, we’re excited to release a new mini market map and article deep dive on Generative AI tools that are specifically designed for Instructional Materials use cases.

In our database, the Instructional Materials use case category encompasses tools that:

  • Assist educators by streamlining lesson planning, curriculum development, and content customization
  • Enable educators or students to transform materials into alternative formats, such as videos, podcasts, or other interactive media, in addition to leveraging gaming principles or immersive VR to enhance engagement
  • Empower educators or students to transform text, video, slides or other source material into study aids like study guides, flashcards, practice tests, or graphic organizers
  • Engage students through interactive lessons featuring historical figures, authors, or fictional characters
  • Customize curriculum to individual needs or pedagogical approaches
  • Empower educators or students to quickly create online learning assets and courses

On a somewhat-related note, also see:


 
© 2025 | Daniel Christian