AI agents arrive in US classrooms — from zdnet.com by Radhika Rajkumar
Kira AI’s personalized learning platform is being rolled out in Tennessee schools. How will it change education?
AI for education is a new but rapidly expanding field. Can it support student outcomes and help teachers avoid burnout?
On Wednesday, AI education company Kira launched a “fully AI-native learning platform” for K–12 education, complete with agents to assist teachers with repetitive tasks. The platform hosts assignments, analyzes progress data, offers administrative assistance, helps build lesson plans and quizzes, and more.
“Unlike traditional tools that merely layer AI onto existing platforms, Kira integrates artificial intelligence directly into every educational workflow — from lesson planning and instruction to grading, intervention, and reporting,” the release explains. “This enables schools to improve student outcomes, streamline operations, and provide personalized support at scale.”
Also relevant/see:
Coursera Founder Andrew Ng’s New Venture Brings A.I. to K–12 Classrooms — from observer.com by Victor Dey
Andrew Ng’s Kira Learning uses A.I. agents to transform K–12 education with tools for teachers, students and administrators.
“Teachers today are overloaded with repetitive tasks. A.I. agents can change that, and free up their time to give more personalized help to students,” Ng said in a statement.
Kira was co-founded by Andrea Pasinetti and Jagriti Agrawal, both longtime collaborators of Ng. The platform embeds A.I. directly into lesson planning, instruction, grading and reporting. Teachers can instantly generate standards-aligned lesson plans, monitor student progress in real time and receive automated intervention strategies when a student falls behind.
Students, in turn, receive on-demand tutoring tailored to their learning styles. A.I. agents adapt to each student’s pace and mastery level, while grading is automated with instant feedback — giving educators time to focus on teaching.
‘Using GenAI is easier than asking my supervisor for support’ — from timeshighereducation.com
Doctoral researchers are turning to generative AI to assist in their research. How are they using it, and how can supervisors and candidates have frank discussions about using it responsibly?
Generative AI is increasingly the proverbial elephant in the supervisory room. As supervisors, you may be concerned about whether your doctoral researchers are using GenAI. It can be a tricky topic to broach, especially if you do not feel confident in your own understanding of the technology.
While the potential impact of GenAI use is well discussed for undergraduate and postgraduate taught students in particular (and it is increasingly accepted that students and staff need to become “AI literate”), doctoral researchers often slip through the cracks in institutional guidance and policymaking.
AI as a Thought Partner in Higher Education — from er.educause.edu by Brian Basgen
When used thoughtfully and transparently, generative artificial intelligence can augment creativity and challenge assumptions, making it an excellent tool for exploring and developing ideas.
…
The glaring contrast between the perceived ubiquity of GenAI and its actual use also reveals fundamental challenges associated with the practical application of these tools. This article explores two key questions about GenAI to address common misconceptions and encourage broader adoption and more effective use of these tools in higher education.
AI for Automation or Augmentation of L&D? — from drphilippahardman.substack.com by Dr. Philippa Hardman
An audio summary of my Learning Technologies talk
Like many of you, I spent the first part of this week at Learning Technologies in London, where I was lucky enough to present a session on the current state of AI and L&D.
In this week’s blog post, I summarise what I covered and share an audio summary of my paper for you to check out.
Bridging the AI Trust Gap — from chronicle.com by Ian Wilhelm, Derek Bruff, Gemma Garcia, and Lee Rainie
In a 2024 Chronicle survey, 86 percent of administrators agreed with the statement: “Generative artificial intelligence tools offer an opportunity for higher education to improve how it educates, operates, and conducts research.” In contrast, just 55 percent of faculty agreed, revealing a stark divide between faculty and administrative perspectives on adopting AI.
Among many faculty members, a prevalent distrust of AI persists — and for valid reasons. How will it impact in-class instruction? What does the popularity of generative AI tools portend for the development of critical thinking skills among Gen-Z students? How can institutions, at the administrative level, develop policies to safeguard against students using these technologies as tools for cheating?
Given this widening ‘trust gap,’ how can faculty and administrators work together to preserve academic integrity as AI seeps into every area of academia, from research to the classroom?
Join us for “Bridging the AI Trust Gap,” an extended, 75-minute Virtual Forum exploring the AI trust gap on campus, the contours of the disagreement, and what should be done about it.