AI agents arrive in US classrooms — from zdnet.com by Radhika Rajkumar
Kira AI’s personalized learning platform is currently being implemented in Tennessee schools. How will it change education?

AI for education is a new but rapidly expanding field. Can it support student outcomes and help teachers avoid burnout?

On Wednesday, AI education company Kira launched a “fully AI-native learning platform” for K-12 education, complete with agents to assist teachers with repetitive tasks. The platform hosts assignments, analyzes progress data, offers administrative assistance, helps build lesson plans and quizzes, and more.

“Unlike traditional tools that merely layer AI onto existing platforms, Kira integrates artificial intelligence directly into every educational workflow — from lesson planning and instruction to grading, intervention, and reporting,” the release explains. “This enables schools to improve student outcomes, streamline operations, and provide personalized support at scale.”

Also relevant/see:

Coursera Founder Andrew Ng’s New Venture Brings A.I. to K–12 Classrooms — from observer.com by Victor Dey
Andrew Ng’s Kira Learning uses A.I. agents to transform K–12 education with tools for teachers, students and administrators.

“Teachers today are overloaded with repetitive tasks. A.I. agents can change that, and free up their time to give more personalized help to students,” Ng said in a statement.

Kira was co-founded by Andrea Pasinetti and Jagriti Agrawal, both longtime collaborators of Ng. The platform embeds A.I. directly into lesson planning, instruction, grading and reporting. Teachers can instantly generate standards-aligned lesson plans, monitor student progress in real time and receive automated intervention strategies when a student falls behind.

Students, in turn, receive on-demand tutoring tailored to their learning styles. A.I. agents adapt to each student’s pace and mastery level, while grading is automated with instant feedback—giving educators time to focus on teaching.


‘Using GenAI is easier than asking my supervisor for support’ — from timeshighereducation.com
Doctoral researchers are turning to generative AI to assist in their research. How are they using it, and how can supervisors and candidates have frank discussions about using it responsibly?

Generative AI is increasingly the proverbial elephant in the supervisory room. As supervisors, you may be concerned about whether your doctoral researchers are using GenAI. It can be a tricky topic to broach, especially when you may not feel confident in understanding the technology yourself.

While the potential impact of GenAI use among undergraduate and postgraduate taught students, especially, is well discussed (and it is increasingly accepted that students and staff need to become “AI literate”), doctoral researchers often slip through the cracks in institutional guidance and policymaking.


AI as a Thought Partner in Higher Education — from er.educause.edu by Brian Basgen

When used thoughtfully and transparently, generative artificial intelligence can augment creativity and challenge assumptions, making it an excellent tool for exploring and developing ideas.

The glaring contrast between the perceived ubiquity of GenAI and its actual use also reveals fundamental challenges associated with the practical application of these tools. This article explores two key questions about GenAI to address common misconceptions and encourage broader adoption and more effective use of these tools in higher education.


AI for Automation or Augmentation of L&D? — from drphilippahardman.substack.com by Dr. Philippa Hardman
An audio summary of my Learning Technologies talk

Like many of you, I spent the first part of this week at Learning Technologies in London, where I was lucky enough to present a session on the current state of AI and L&D.

In this week’s blog post, I summarise what I covered and share an audio summary of my paper for you to check out.


Bridging the AI Trust Gap — from chronicle.com by Ian Wilhelm, Derek Bruff, Gemma Garcia, and Lee Rainie

In a 2024 Chronicle survey, 86 percent of administrators agreed with the statement: “Generative artificial intelligence tools offer an opportunity for higher education to improve how it educates, operates, and conducts research.” In contrast, just 55 percent of faculty agreed, showing the stark divisions between faculty and administrative perspectives on adopting AI.

Among many faculty members, a prevalent distrust of AI persists — and for valid reasons. How will it impact in-class instruction? What does the popularity of generative AI tools portend for the development of critical thinking skills for Gen-Z students? How can institutions, at the administrative level, develop policies to safeguard against students using these technologies as tools for cheating?

Given this increasing ‘trust gap,’ how can faculty and administrators work together to preserve academic integrity as AI seeps into all areas of academia, from research to the classroom?

Join us for “Bridging the AI Trust Gap,” an extended, 75-minute Virtual Forum exploring the trust gap on campus about AI, the contours of the differences, and what should be done about it.

 

Culminating Art Projects That Boost Students’ Confidence — from edutopia.org by Mary Beth Hertz
At the end of the year, high school students enjoy the opportunity to create a final product dictated by their own interests.


Boosting Engagement by Taking Math Outdoors — from edutopia.org by Sandy Vorensky
Bringing elementary students outside for math lessons provides a welcome change of pace and a chance for new activities.


Using a School Mural Project to Showcase Students’ Growth — from edutopia.org by Gloria Sevilla
Step-by-step instructions from an elementary school educator whose annual mural assignment is displayed at the spring open house.


How to Help Students Avoid Procrastinating — from edutopia.org by Sarah Kesty
A simple strategy can help students map out their assignments in manageable chunks so they can stay on top of their work.

Long-term projects and assignments present a unique challenge for many students, requiring several layers of executive function skills, like planning and time management, to be able to manage steps over an extended period of time. Much to our frustration, students may procrastinate or avoid working on an assignment when it seems overwhelming. This can lead to late, missing, or incomplete work, or it can push students into a stressful all-nighter, as they complete an assignment designed to take weeks in the span of just a few hours.

An effective way to address the challenges of overwhelm and procrastination—and a way that requires only a tweak to your teaching instead of another task on your plate—is to teach students to “scan and plan.” A scan and plan happens when an assignment is introduced, usually one that takes more than a few steps. Teachers fold the scan and plan approach into the assignment’s announcement to the class.

 

Higher Ed Institutions Rely Less on OPMs While Increasingly Hiring Fee-For-Service Models — from iblnews.org

A market report from Validated Insights released this month notes that fewer colleges and universities hire external online program management (OPM) companies to develop their courses.

For 2024, higher education institutions launched only 81 new partnerships with OPMs —  a drop of 42% and the lowest number since 2016.

The report showed that institutions increasingly pay OPMs a fee-for-service instead of following a revenue-sharing model with big service bundles and profit splits.

Experts say revenue-sharing models, which critics denounce as predatory arrangements, incentivize service providers to use aggressive recruiting tactics to increase enrollments and maximize tuition revenue.

According to the report, fee-for-service has become the dominant business model for OPMs.


6 Online Edtech Professional Learning Communities & Resources for Teachers — from techlearning.com by Stephanie Smith Budhai, Ph.D.
These resources can help provide training, best practices, and advice for using digital tools such as Canva, Curipod, Kahoot!, and more

While school-led professional development can be helpful, there are online professional learning communities on various edtech websites that can be leveraged. Also, some of these community spaces offer the chance to monetize your work.

Here is a summary of six online edtech professional learning spaces.

 

4 ways community colleges can boost workforce development — from highereddive.com by Natalie Schwartz
Higher education leaders at this week’s ASU+GSV Summit gave advice for how two-year institutions can boost the economic mobility of their students.

SAN DIEGO — How can community colleges deliver economic mobility to their students?

College leaders at this week’s ASU+GSV Summit, an annual education and technology conference, got a glimpse into that answer as they heard how community colleges are building support from business and industry and strengthening workforce development.

These types of initiatives may be helping to boost public perception of the value of community colleges vs. four-year institutions.

 



2025 EDUCAUSE Students and Technology Report: Shaping the Future of Higher Education Through Technology, Flexibility, and Well-Being — from library.educause.edu

The student experience in higher education is continually evolving, influenced by technological advancements, shifting student needs and expectations, evolving workforce demands, and broadening sociocultural forces. In this year’s report, we examine six critical aspects of student experiences in higher education, providing insights into how institutions can adapt to meet student needs and enhance their learning experience and preparation for the workforce:

  • Satisfaction with Technology-Related Services and Supports
  • Modality Preferences
  • Hybrid Learning Experiences
  • Generative AI in the Classroom
  • Workforce Preparation
  • Accessibility and Mental Health

DSC: Shame on higher ed for not preparing students for the workplace (see below). You’re doing your students wrong…again. Not only do you continue to heap a load of debt on their backs, but you’re also continuing to not get them ready for the workplace. So don’t be surprised if eventually you’re replaced by a variety of alternatives that students will flock towards.

 

DSC: And students don’t have a clue as to what awaits them in the workplace — they rank AI-powered tools and technologies at an incredibly low score of only 3%. Yeah, right. You’ll find out. Here’s but one example from one discipline/field of work –> Thomson Reuters Survey: Over 95% of Legal Professionals Expect Gen AI to Become Central to Workflow Within Five Years


Figure 15. Competency Areas Expected to Be Important for Career

 

From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?



 

Do I Need a Degree in Instructional Design? It Depends. — from teamedforlearning.com

It’s a common question for those considering a career in instructional design: Do I need a degree to land a job? The answer? It depends.

Hiring managers aren’t just looking for a degree—they want proof that you have the knowledge, skills, and abilities to succeed. In fact, most employers focus on 3 key factors when assessing candidates. You typically need at least 2 of these to be considered:

  1. A Credential – A degree or certification in instructional design, learning experience design, or a related field.
  2. Relevant Work Experience – Hands-on experience designing and developing learning solutions.
  3. Proof of Abilities – A strong portfolio showcasing eLearning modules, course designs, or learning strategies.

The good news? You don’t have to spend years earning a degree to break into the field. If you’re resourceful, you can fast-track your way in through volunteer projects, contract work, and portfolio building.

Whether you’re a recent graduate, a career changer, or a working professional looking for your next opportunity, focusing on these key factors can help you stand out and get hired.

 

Reflections on “Are You Ready for the AI University? Everything is about to change.” [Latham]

Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham

Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.

What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.

Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance; human professors would then guide it as they sort through the material, help it understand the structure of the discipline, and then develop lectures, videos, supporting documentation, and assessments.

In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If students need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student who is going on a trip and wishes to take an exam on the plane will be able to log on and complete the AI-designed and -administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.

Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom. 

From DSC:
I had a very difficult time deciding which excerpts to include. There were so many more excerpts for us to think about with this solid article. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.

Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him and, as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of what higher education may look like. Scott asserts that some significant and upcoming impacts will be experienced by faculty members, doctoral students, and graduate/teaching assistants (and Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.

There are a few places where Scott and I differ.

  • The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned their 5-year, $100M ed push last year and one of the key conclusions was that people want to learn from — and with — other people:

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY

By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they would do.

  • Also, I would use the term learning preferences where Scott uses the term learning styles.

Scott also mentions:

“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”

It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).

That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.

So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!


Addendum later on 4/10/25:

U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain

Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.

The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.

The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.

 

Uplimit raises stakes in corporate learning with suite of AI agents that can train thousands of employees simultaneously — from venturebeat.com by Michael Nuñez

Uplimit unveiled a suite of AI-powered learning agents today designed to help companies rapidly upskill employees while dramatically reducing administrative burdens traditionally associated with corporate training.

The San Francisco-based company announced three sets of purpose-built AI agents that promise to change how enterprises approach learning and development: skill-building agents, program management agents, and teaching assistant agents. The technology aims to address the growing skills gap as AI advances faster than most workforces can adapt.

“There is an unprecedented need for continuous learning—at a scale and speed traditional systems were never built to handle,” said Julia Stiglitz, CEO and co-founder of Uplimit, in an interview with VentureBeat. “The companies best positioned to thrive aren’t choosing between AI and their people—they’re investing in both.”


Introducing Claude for Education — from anthropic.com

Today we’re launching Claude for Education, a specialized version of Claude tailored for higher education institutions. This initiative equips universities to develop and implement AI-enabled approaches across teaching, learning, and administration—ensuring educators and students play a key role in actively shaping AI’s role in society.

As part of announcing Claude for Education, we’re introducing:

  1. Learning mode: A new Claude experience that guides students’ reasoning process rather than providing answers, helping develop critical thinking skills
  2. University-wide Claude availability: Full campus access agreements with Northeastern University, London School of Economics and Political Science (LSE), and Champlain College, making Claude available to all students
  3. Academic partnerships: Joining Internet2 and working with Instructure to embed AI into teaching & learning with Canvas LMS
  4. Student programs: A new Claude Campus Ambassadors program along with an initiative offering API credits for student projects

A comment on this from The Rundown AI:

Why it matters: Education continues to grapple with AI, but Anthropic is flipping the script by making the tech a partner in developing critical thinking rather than an answer engine. While the controversy over its use likely isn’t going away, this generation of students will have access to the most personalized, high-quality learning tools ever.


Should College Graduates Be AI Literate? — from chronicle.com by Beth McMurtrie (behind a paywall)
More institutions are saying yes. Persuading professors is only the first barrier they face.

Last fall one of Jacqueline Fajardo’s students came to her office, eager to tell her about an AI tool that was helping him learn general chemistry. Had she heard of Google NotebookLM? He had been using it for half a semester in her honors course. He confidently showed her how he could type in the learning outcomes she posted for each class and the tool would produce explanations and study guides. It even created a podcast based on an academic paper he had uploaded. He did not feel it was important to take detailed notes in class because the AI tool was able to summarize the key points of her lectures.


Showing Up for the Future: Why Educators Can’t Sit Out the AI Conversation — from marcwatkins.substack.com with a guest post from Lew Ludwig

The Risk of Disengagement
Let’s be honest: most of us aren’t jumping headfirst into AI. At many of our institutions, it’s not a gold rush—it’s a quiet standoff. But the group I worry most about isn’t the early adopters. It’s the faculty who’ve decided to opt out altogether.

That choice often comes from a place of care. Concerns about data privacy, climate impact, exploitative labor, and the ethics of using large language models are real—and important. But choosing not to engage at all, even on ethical grounds, doesn’t remove us from the system. It just removes our voices from the conversation.

And without those voices, we risk letting others—those with very different priorities—make the decisions that shape what AI looks like in our classrooms, on our campuses, and in our broader culture of learning.



Turbocharge Your Professional Development with AI — from learningguild.com by Dr. RK Prasad

You’ve just mastered a few new eLearning authoring tools, and now AI is knocking on the door, offering to do your job faster, smarter, and without needing coffee breaks. Should you be worried? Or excited?

If you’re a Learning and Development (L&D) professional today, AI is more than just a buzzword—it’s transforming the way we design, deliver, and measure corporate training. But here’s the good news: AI isn’t here to replace you. It’s here to make you better at what you do.

The challenge is to harness its potential to build digital-ready talent, not just within your organization but within yourself.

Let’s explore how AI is reshaping L&D strategies and how you can leverage it for professional development.


5 Recent AI Notables — from automatedteach.com by Graham Clay

1. OpenAI’s New Image Generator
What Happened: OpenAI integrated a much more powerful image generator directly into GPT-4o, making it the default image creator in ChatGPT. Unlike previous image models, this one excels at accurately rendering text in images, precise visualization of diagrams/charts, and multi-turn image refinement through conversation.

Why It’s Big: For educators, this represents a significant advancement in creating educational visuals, infographics, diagrams, and other instructional materials with unprecedented accuracy and control. It’s not perfect, but you can now quickly generate custom illustrations that accurately display mathematical equations, chemical formulas, or process workflows — previously a significant hurdle in digital content creation — without requiring graphic design expertise or expensive software. This capability dramatically reduces the time between conceptualizing a visual aid and implementing it in course materials.


The 4 AI modes that will supercharge your workflow — from aiwithallie.beehiiv.com by Allie K. Miller
The framework most people and companies won’t discover until 2026


 

Outsourcing Thought: The Hidden Cost of Letting AI Think for You — from linkedin.com by Robert Atkinson

I’ve watched it unfold in real time. A student submits a flawless coding assignment or a beautifully written essay—clean syntax, sharp logic, polished prose. But when I ask them to explain their thinking, they hesitate. They can’t trace their reasoning or walk me through the process. The output is strong, but the understanding is shallow. As a professor, I’ve seen this pattern grow more common: AI-assisted work that looks impressive on the surface but reveals a troubling absence of cognitive depth underneath.

This article is written with my students in mind—but it’s meant for anyone navigating learning, teaching, or thinking in the age of artificial intelligence. Whether you’re a student, educator, or professional, the question is the same: What happens to the brain when we stop doing our own thinking?

We are standing at a pivotal moment. With just a few prompts, generative AI can produce essays, solve complex coding problems, and summarize ideas in seconds. It feels efficient. It feels like progress. But from a cognitive neuroscience perspective, that convenience comes at a hidden cost: the gradual erosion of the neural processes that support reasoning, creativity, and long-term learning.

 

7 ways to use ChatGPT’s new image AI — from wondertools.substack.com by Jeremy Caplan
Transform your ideas into strong visuals


  • Cartoons
  • Infographics
  • Posters
  • …plus several more

 

AI in Education Survey: What UK and US Educators Think in 2025 — from twinkl.com
As artificial intelligence (AI) continues to shape the world around us, Twinkl conducted a large-scale survey between January 15th and January 22nd to explore its impact on the education sector, as well as the work lives of teachers across the UK and the USA.

Teachers’ use of AI for work continues to rise
Twinkl’s survey asked teachers whether they were currently using AI for work purposes. Comparing these findings to similar surveys over recent years shows the use of AI tools by teachers has seen a significant increase across both the UK and USA.

  • According to two UK surveys by the National Literacy Trust – 30% of teachers used generative AI in 2023 and nearly half (47.7%) in 2024. Twinkl’s survey indicates that AI adoption continues to rise rapidly, with 60% of UK educators currently integrating it into their work lives in 2025.
  • Similarly, with 62% of US teachers currently using AI for work, uptake appears to have risen greatly in the past 12 months, with just 25% saying they were leveraging the new technology in the 2023-24 school year according to a RAND report.
  • Teachers are using AI more for work than in their personal lives: In the UK, personal usage drops to 43% (from 60% at school).  In the US, 52% are using AI for non-work purposes (versus 62% in education settings).

    60% of UK teachers and 62% of US teachers use AI in their work life in 2025.

 

What trauma-informed practice is not — from timeshighereducation.com by Kate Cantrell, India Bryce, and Jessica Gildersleeve from The University of Southern Queensland
Before trauma-informed care can be the norm across all areas of the university, academic and professional staff need to understand what it is. Here, three academics debunk myths and demystify best practice

Recently, we conducted focus groups at our university to better ascertain how academics, administrators and student support staff perceive the purpose and value of trauma-informed practice, and how they perceive their capacity to contribute to organisational change.

We discovered that while most staff were united on the importance of trauma-informed care, several myths persist about what trauma-informed practice is (and is not). Some academic staff, for example, conflated teaching about trauma with trauma-informed teaching, confused trigger warnings with trigger points and, perhaps most alarmingly – given the prevalence of trauma exposure and risk among university students – misjudged trauma-informed practice as “the business of psychologists” rather than educators.

 




Students and folks looking for work may want to check out:

Also relevant/see:


 

The Third Horizon of Learning: Shifting Beyond the Industrial Model — from gettingsmart.com by Sujata Bhatt & Mason Pashia

Over 24 blog posts, we have sketched a bold vision of what this next horizon of education looks like in action and highlighted the many innovators working to bring it to life. These pioneers are building new models that prioritize human development, relationships, and real-world relevance as most valuable. They are forging partnerships, designing and adopting transformative technologies, developing new assessment methods, and more. These shifts transform the lived experiences of young people and serve the needs of families and communities. In short, they are delivering authentic learning experiences that better address the demands of today’s economy, society, and learners.

We’ve aggregated our findings from this blog series and turned it into an H3 Publication. Inside, you’ll find our key transformation takeaways for school designers and system leaders, as well as a full list of the contributing authors. Thank you to all of the contributors, including LearnerStudio for sponsoring the series and Sujata Bhatt at Incubate Learning for authorship, editing and curation support throughout the entirety of the series and publication.

 
© 2025 | Daniel Christian