Values in the wild: Discovering and analyzing values in real-world language model interactions — from anthropic.com

In the latest research paper from Anthropic’s Societal Impacts team, we describe a practical way we’ve developed to observe Claude’s values—and provide the first large-scale results on how Claude expresses those values during real-world conversations. We also provide an open dataset for researchers to run further analysis of the values and how often they arise in conversations.

Per The Rundown AI:

Why it matters: AI is increasingly shaping real-world decisions and relationships, making an understanding of its actual values more crucial than ever. This study also moves the alignment discussion toward more concrete observations, revealing that an AI’s morals and values may be contextual and situational rather than fixed.

Also from Anthropic, see:

Anthropic Education Report: How University Students Use Claude


Adobe Firefly: The next evolution of creative AI is here — from blog.adobe.com

In just under two years, Adobe Firefly has revolutionized the creative industry and generated more than 22 billion assets worldwide. Today at Adobe MAX London, we’re unveiling the latest release of Firefly, which unifies AI-powered tools for image, video, audio, and vector generation into a single, cohesive platform and introduces many new capabilities.

The new Firefly features enhanced models, improved ideation capabilities, expanded creative options, and unprecedented control. This update builds on earlier momentum when we introduced the Firefly web app and expanded into video and audio with Generate Video, Translate Video, and Translate Audio features.

Per The Rundown AI (here):

Why it matters: OpenAI’s recent image generator and other rivals have shaken up creative workflows, but Adobe’s IP-safe focus and the addition of competing models to Firefly let professionals stay in their established suite of tools — keeping users in the ecosystem while still giving them the flexibility to tap other models’ strengths.

 

How People Are Really Using Gen AI in 2025 — from hbr.org by Marc Zao-Sanders


Here’s why you shouldn’t let AI run your company — from theneurondaily.com by Grant Harvey; emphasis DSC

When “vibe-coding” goes wrong… or, a parable about why you shouldn’t “vibe” your entire company.
Cursor, an AI-powered coding tool that many developers love to hate, face-planted spectacularly yesterday when its own AI support bot went off-script and fabricated a company policy, leading to a complete user revolt.

Here’s the short version:

  • A bug locked Cursor users out when switching devices.
  • Instead of human help, Cursor’s AI support bot confidently told users this was a new policy (it wasn’t).
  • No human checked the replies—big mistake.
  • The fake news spread, and devs canceled subscriptions en masse.
  • A Reddit thread about it got mysteriously nuked, fueling suspicion.

The reality? Just a bug, plus a bot hallucination… doing maximum damage.

Why it matters: This is what we’d call “vibe-companying”—blindly trusting AI with critical functions without human oversight.

Think about it like this: this was JUST a startup. If more big corporations continue to lay off entire departments and replace them with AI, these already byzantine companies will become increasingly opaque, unaccountable systems where no one, human or AI, fully understands what’s happening or who’s responsible.

Our take? Kafka dude has it right. We need to pay attention to WHAT we’re actually automating. Because automating more bureaucracy at scale, with agents we increasingly don’t understand or don’t double check, can potentially make companies less intelligent—and harder to fix when things inevitably go wrong.


 

 

What does ‘age appropriate’ AI literacy look like in higher education? — from timeshighereducation.com by Fun Siong Lim
As AI literacy becomes an essential work skill, universities need to move beyond developing these competencies at ‘primary school’ level in their students. Here, Fun Siong Lim reflects on frameworks to support higher-order AI literacies.

Like platforms developed at other universities, Project NALA offers a front-end interface (known as the builder) for faculty to create their own learning assistant. An idea we have is to open the builder up to students to allow them to create their own GenAI assistant as part of our AI literacy curriculum. As they design, configure and test their own assistant, they will learn firsthand how generative AI works. They get to test performance-enhancement approaches beyond prompt engineering, such as grounding the learning assistant with curated materials (retrieval-augmented generation) and advanced ideas such as incorporating knowledge graphs.
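
To make the “grounding with curated materials” idea concrete, here is a minimal retrieval-augmented generation sketch in Python. It is an illustration only, not Project NALA’s implementation: the sample course snippets, the TF-IDF retrieval step, and the prompt template are all assumptions chosen to keep the example self-contained and runnable.

```python
# Minimal RAG-style grounding sketch (illustrative only; not Project NALA's builder).
# It retrieves the most relevant curated snippets and prepends them to the prompt,
# so the assistant answers from vetted material rather than from memory alone.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical curated course materials supplied by the instructor.
curated_materials = [
    "Retrieval-augmented generation grounds model answers in supplied documents.",
    "Bloom's taxonomy orders cognitive skills from remembering up to creating.",
    "Knowledge graphs represent concepts as nodes and relationships as edges.",
]

vectorizer = TfidfVectorizer().fit(curated_materials)
doc_vectors = vectorizer.transform(curated_materials)

def build_grounded_prompt(question: str, top_k: int = 2) -> str:
    """Return a prompt that embeds the top-k most relevant curated snippets."""
    scores = cosine_similarity(vectorizer.transform([question]), doc_vectors)[0]
    best = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:top_k]
    context = "\n".join(f"- {curated_materials[i]}" for i in best)
    return (
        "Answer using ONLY the course materials below. "
        "If they do not cover the question, say so.\n"
        f"Course materials:\n{context}\n\nStudent question: {question}"
    )

print(build_grounded_prompt("What does retrieval-augmented generation do?"))
```

Extending a toy like this is where students hit the real design questions (what to curate, how to chunk it, and what to do when retrieval misses), which is exactly the kind of firsthand learning the article describes.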

They should have the opportunity to analyse, evaluate and create responsible AI solutions. Offering students the opportunity to build their own AI assistants could be a way forward to develop these much-needed skills.


How to Use ChatGPT 4o’s Update to Turn Key Insights Into Clear Infographics (Prompts Included) — from evakeiffenheim.substack.com by Eva Keiffenheim
This 3-step workflow helps you break down books, reports, or slide-decks into professional visuals that accelerate understanding.

This article shows you how to find core ideas, prompt GPT-4o for a design brief, and generate clean, professional images that stick. These aren’t vague “creative visuals”—they’re structured for learning, memory, and action.
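
As a rough illustration of the middle step (prompting for a design brief), here is a minimal Python sketch using the official openai package. The article’s workflow runs inside ChatGPT rather than the API, so the model name, prompt wording, and sample ideas below are assumptions for illustration only.

```python
# Illustrative sketch of step 2 of the workflow: turning extracted core ideas into a
# design brief for an infographic. Assumes the official `openai` Python package and an
# OPENAI_API_KEY environment variable; the article itself does this inside ChatGPT.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

core_ideas = [
    "Spaced repetition beats cramming for long-term retention.",
    "Retrieval practice strengthens memory more than re-reading.",
    "Interleaving topics improves transfer to new problems.",
]

brief_request = (
    "You are an information designer. Turn the following core ideas into a design "
    "brief for a single infographic: specify a title, a visual hierarchy, one layout "
    "metaphor, a short caption per idea, and a suggested color/style.\n\n"
    + "\n".join(f"- {idea}" for idea in core_ideas)
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": brief_request}],
)

design_brief = response.choices[0].message.content
print(design_brief)
# The resulting brief can then be pasted into ChatGPT's image generation
# (or an image model of your choice) to produce the final infographic.
```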

If you’re a lifelong learner, educator, creator, or just someone who wants to work smarter, this process is for you.

You’ll spend less time re-reading and more time understanding. And maybe—just maybe—you’ll build ideas that not only click in your brain, but also stick in someone else’s.


SchoolAI Secures $25 Million to Help Teachers and Schools Reach Every Student — from globenewswire.com
 The Classroom Experience platform gives every teacher and student their own AI tools for personalized learning

SchoolAI’s Classroom Experience platform combines AI assistants for teachers that help with classroom preparation and other administrative work, and Spaces: personalized AI tutors, games, and lessons that can adapt to each student’s unique learning style and interests. Together, these tools give teachers actionable insights into how students are doing and how to deliver targeted support when it matters most.

“Teachers and schools are navigating hard challenges with shrinking budgets, teacher shortages, growing class sizes, and ongoing recovery from pandemic-related learning gaps,” said Caleb Hicks, founder and CEO of SchoolAI. “It’s harder than ever to understand how every student is really doing. Teachers deserve powerful tools to help extend their impact, not add to their workload. This funding helps us double down on connecting the dots for teachers and students, and later this year, bringing school administrators and parents at home onto the platform as well.”


AI in Education, Part 3: Looking Ahead – The Future of AI in Learning — from rdene915.com by Dr. Rachelle Dené Poth

In the first and second parts of my AI series, I focused on where we see AI in classrooms. Benefits range from personalized learning and accessibility tools to AI-driven grading and teaching-assistant-style support. In Part 2, I turned to some of the important ethical considerations that must be part of the conversation: schools need to attend to data privacy, bias, overreliance, and the equity divide. For this last part in the current series, I want to look ahead to the future. Where do we go from here?


Anthropic Education Report: How University Students Use Claude — from anthropic.com

The key findings from our Education Report are:

  • STEM students are early adopters of AI tools like Claude, with Computer Science students particularly overrepresented (accounting for 36.8% of students’ conversations while comprising only 5.4% of U.S. degrees). In contrast, Business, Health, and Humanities students show lower adoption rates relative to their enrollment numbers.
  • We identified four patterns by which students interact with AI, each of which was present in our data at approximately equal rates (23-29% of conversations): Direct Problem Solving, Direct Output Creation, Collaborative Problem Solving, and Collaborative Output Creation.
  • Students primarily use AI systems for creating (using information to produce something new) and analyzing (taking apart the known and identifying relationships), such as creating coding projects or analyzing law concepts. These map to higher-order cognitive functions in Bloom’s Taxonomy, which raises questions about ensuring students don’t offload critical cognitive tasks to AI systems.

From the Kuali Days 2025 Conference: A CEO’s View of Planning for AI — from campustechnology.com by Mary Grush
A Conversation with Joel Dehlin

How can a company serving higher education navigate the changes AI brings to the ed tech marketplace? What will customers expect in this dynamic? Here, CT talks with Kuali CEO Joel Dehlin, who shared his company’s AI strategies in a featured plenary session, “Sneak Peek of AI in Kuali Build,” at Kuali Days 2025 in Anaheim.


How students can use generative AI — from aliciabankhofer.substack.com by Alicia Bankhofer
Part 4 of 4 in my series on Teaching and Learning in the AI Age

This article is the culmination of a series exploring AI’s impact on education.

Part 1: What Educators Need outlined essential AI literacy skills for teachers, emphasizing the need to move beyond basic ChatGPT exploration to understand the full spectrum of AI tools available in education.

Part 2: What Students Need addressed how students require clear guidance to use AI safely, ethically, and responsibly, with emphasis on developing critical thinking skills alongside AI literacy.

Part 3: How Educators Can Use GenAI presented ten practical use cases for teachers, from creating differentiated resources to designing assessments, demonstrating how AI can reclaim 5-7 hours weekly for meaningful student interactions.

Part 4: How Students Can Use GenAI (this article) provides frameworks for guiding student AI use based on Joscha Falck’s dimensions: learning about, with, through, despite, and without AI.


Mapping a Multidimensional Framework for GenAI in Education — from er.educause.edu by Patricia Turner
Prompting careful dialogue through incisive questions can help chart a course through the ongoing storm of artificial intelligence.

The goal of this framework is to help faculty, educational developers, instructional designers, administrators, and others in higher education engage in productive discussions about the use of GenAI in teaching and learning. As others have noted, theoretical frameworks will need to be accompanied by research and teaching practice, each reinforcing and reshaping the others to create understandings that will inform the development of approaches to GenAI that are both ethical and maximally beneficial, while mitigating potential harms to those who engage with it.


Instructional Design Isn’t Dying — It’s Specialising — from drphilippahardman.substack.com by Dr. Philippa Hardman
Aka, how AI is impacting the role & purpose of Instructional Design

Together, these developments have revealed something important: despite widespread anxiety, the instructional design role isn’t dying—it’s specialising.

What we’re witnessing isn’t the automation of instructional design and the death of the instructional designer, but rather the evolution of the ID role into multiple distinct professional pathways.

The generalist “full stack” instructional designer is slowly but decisively fracturing into specialised roles that reflect both the capabilities of generative AI and the strategic imperatives facing modern organisations.

In this week’s blog post, I’ll share what I’ve learned about how our field is transforming, and what it likely means for you and your career path.

Those instructional designers who cling to traditional generalist models risk being replaced, but those who embrace specialisation, data fluency, and AI collaboration will excel and lead the next evolution of the field. Similarly, those businesses that continue to view L&D as a cost centre and focus on automating content delivery will be outperformed, while those that invest in building agile, AI-enabled learning ecosystems will drive measurable performance gains and secure their competitive advantage.


Adding AI to Every Step in Your eLearning Design Workflow — from learningguild.com by George Hanshaw

We know that eLearning is a staple of training and development. The expectations of the learners are higher than ever: They expect a dynamic, interactive, and personalized learning experience. As instructional designers, we are tasked with meeting these expectations by creating engaging and effective learning solutions.

The integration of Artificial Intelligence (AI) into our eLearning design process is a game-changer that can significantly enhance the quality and efficiency of our work.

No matter if you use ADDIE or rapid prototyping, AI has a fit in every aspect of your workflow. By integrating AI, you can ensure a more efficient and effective design process that adapts to the unique needs of your learners. This not only saves time and resources but also significantly enhances the overall learning experience. We will explore the needs analysis and the general design process.

 

From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?



 

Reflections on “Are You Ready for the AI University? Everything is about to change.” [Latham]

Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham

Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.

What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.

Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance; human professors would then guide it as it sorts through the material, helping it understand the structure of the discipline and then develop lectures, videos, supporting documentation, and assessments.

In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If you need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student going on a trip who wishes to take an exam on the plane will be able to log on and complete the AI-designed and administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.

Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom. 

From DSC:
I had a very difficult time deciding which excerpts to include; there was so much more in this solid article to think about. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.

Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him and, as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of what higher education may look like. Scott asserts that some significant and upcoming impacts will be experienced by faculty members, doctoral students, and graduate/teaching assistants (and Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.

There are a few places where Scott and I differ.

  • The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned its 5-year, $100M education push last year, and one of the key conclusions was that people want to learn from — and with — other people:

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY

By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they would do.

  • Also, I would use the term learning preferences where Scott uses the term learning styles.

Scott also mentions:

“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”

It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).

That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.

So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!


Addendum later on 4/10/25:

U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain

Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.

The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can customize to their specific lessons and teaching styles.

The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.
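
For readers curious how a “never gives away answers” behavior is typically expressed, here is a minimal sketch using Google’s google-genai Python SDK (the pilot runs on Gemini, but its actual prompts and configuration are not public). The model name, tutoring instruction, and guardrail wording below are assumptions for illustration, not the Ross School / Google Public Sector implementation.

```python
# Minimal sketch of a "never give away the answer" tutoring behavior, expressed as a
# system instruction. Uses Google's google-genai SDK since the pilot runs on Gemini;
# the model name, prompt wording, and guardrail are illustrative assumptions only.
from google import genai
from google.genai import types

TUTOR_INSTRUCTION = (
    "You are a virtual teaching assistant. Never state the final answer directly. "
    "Instead, ask one guiding question at a time, point to the relevant concept, "
    "and have the student attempt each step before you confirm or correct it."
)

client = genai.Client()  # expects an API key in the environment (e.g. GOOGLE_API_KEY)

def tutor_reply(student_message: str) -> str:
    """Return a Socratic-style reply that nudges the student without solving for them."""
    response = client.models.generate_content(
        model="gemini-2.0-flash",
        contents=student_message,
        config=types.GenerateContentConfig(system_instruction=TUTOR_INSTRUCTION),
    )
    return response.text

print(tutor_reply("What's the NPV of a $1,000 cash flow in 3 years at 8%?"))
```

In practice, a production Virtual TA would layer course-specific content, conversation history, and instructor controls on top of a simple instruction like this.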

 

The 2025 AI Index Report — from Stanford University’s Human-Centered Artificial Intelligence Lab (hai.stanford.edu); item via The Neuron

Top Takeaways

  1. AI performance on demanding benchmarks continues to improve.
  2. AI is increasingly embedded in everyday life.
  3. Business is all in on AI, fueling record investment and usage, as research continues to show strong productivity impacts.
  4. The U.S. still leads in producing top AI models—but China is closing the performance gap.
  5. The responsible AI ecosystem evolves—unevenly.
  6. Global AI optimism is rising—but deep regional divides remain.
  7. …and several more

Also see:

The Neuron’s take on this:

So, what should you do? You really need to start trying out these AI tools. They’re getting cheaper and better, and they can genuinely help save time or make work easier—ignoring them is like ignoring smartphones ten years ago.

Just keep two big things in mind:

  1. Making the next super-smart AI costs a crazy amount of money and uses tons of power (seriously, they’re buying nuclear plants and pushing coal again!).
  2. Companies are still figuring out how to make AI perfectly safe and fair—cause it still makes mistakes.

So, use the tools, find what helps you, but don’t trust them completely.

We’re building this plane mid-flight, and Stanford’s report card is just another confirmation that we desperately need better safety checks before we hit major turbulence.


Addendum on 4/16:

 

The 2025 ABA Techshow Startup Alley Pitch Competition Ended In A Tie – Here Are The Winners — from lawnext.com by Bob Ambrogi

This year, two startups ended up with an equal number of votes for the top spot:

  • Case Crafter, a company from Norway that helps legal professionals build compelling visual timelines based on case files and evidence.
  • Querious, a product that provides attorneys with real-time insights during client conversations into legal issues, relevant content, and suggested questions and follow-ups.


AI academy gives law students a head start on legal tech, says OBA innovator — from canadianlawyermag.com by Branislav Urosevic

The Ontario Bar Association has recently launched a hands-on AI learning platform tailored for lawyers. Called the AI Academy, the initiative is designed to help legal professionals explore, experiment with, and adopt AI tools relevant to their practice.

Colin Lachance, OBA’s innovator-in-residence and the lead designer of the platform, says that although the AI Academy was built for practising lawyers, it is also well-suited for law students.


 

Job hunting and hiring in the age of AI: Where did all the humans go? — from washingtonpost.com by Taylor Telford
The proliferation of artificial intelligence tools and overreliance on software such as ChatGPT is making the job market increasingly surreal.

The speedy embrace of AI tools meant to make job hunting and hiring more efficient is causing headaches and sowing distrust in these processes, people on both sides of the equation say. While companies embrace AI recruiters and application scanning systems, many job seekers are trying to boost their odds with software that generates application materials, optimizes them for AI and applies to hundreds of jobs in minutes.

Meanwhile, recruiters and hiring managers are fielding more applicants than they can keep up with, yet contend that finding real, qualified workers amid the bots, cheaters and deepfakes is only getting tougher as candidates use AI to write their cover letters, bluff their way through interviews and even hide their identities.

“I’m pro-AI in the sense that it allows you to do things that were impossible before … but it is being misused wildly,” Freire said. The problem is “when you let it do the thinking for you, it goes from a superpower to a crutch very easily.”

 

Outsourcing Thought: The Hidden Cost of Letting AI Think for You — from linkedin.com by Robert Atkinson

I’ve watched it unfold in real time. A student submits a flawless coding assignment or a beautifully written essay—clean syntax, sharp logic, polished prose. But when I ask them to explain their thinking, they hesitate. They can’t trace their reasoning or walk me through the process. The output is strong, but the understanding is shallow. As a professor, I’ve seen this pattern grow more common: AI-assisted work that looks impressive on the surface but reveals a troubling absence of cognitive depth underneath.

This article is written with my students in mind—but it’s meant for anyone navigating learning, teaching, or thinking in the age of artificial intelligence. Whether you’re a student, educator, or professional, the question is the same: What happens to the brain when we stop doing our own thinking?

We are standing at a pivotal moment. With just a few prompts, generative AI can produce essays, solve complex coding problems, and summarize ideas in seconds. It feels efficient. It feels like progress. But from a cognitive neuroscience perspective, that convenience comes at a hidden cost: the gradual erosion of the neural processes that support reasoning, creativity, and long-term learning.

 

AI in Education Survey: What UK and US Educators Think in 2025 — from twinkl.com
As artificial intelligence (AI) continues to shape the world around us, Twinkl conducted a large-scale survey between January 15th and January 22nd to explore its impact on the education sector, as well as the work lives of teachers across the UK and the USA.

Teachers’ use of AI for work continues to rise
Twinkl’s survey asked teachers whether they were currently using AI for work purposes. Comparing these findings to similar surveys over recent years shows the use of AI tools by teachers has seen a significant increase across both the UK and USA.

  • According to two UK surveys by the National Literacy Trust, 30% of teachers used generative AI in 2023 and nearly half (47.7%) in 2024. Twinkl’s survey indicates that AI adoption continues to rise rapidly, with 60% of UK educators currently integrating it into their work lives in 2025.
  • Similarly, with 62% of US teachers currently using AI for work, uptake appears to have risen greatly in the past 12 months, with just 25% saying they were leveraging the new technology in the 2023-24 school year according to a RAND report.
  • Teachers are using AI more for work than in their personal lives: In the UK, personal usage drops to 43% (from 60% at school).  In the US, 52% are using AI for non-work purposes (versus 62% in education settings).

    60% of UK teachers and 62% of US teachers use AI in their work life in 2025.

 




Students and folks looking for work may want to check out:

Also relevant/see:


 

AI Can’t Fix Bad Learning — from nafez.substack.com by Nafez Dakkak
Why pedagogy and good learning design still come first, and why faster isn’t always better.

I’ve followed Dr. Philippa Hardman’s work for years, and every time I engage with her work, I find it both refreshing and deeply grounded.

As one of the leading voices in learning design, Philippa has been able to cut through the noise and focus on what truly matters: designing learning experiences that actually work.

In an era where AI promises speed and scale, Philippa is making a different argument: faster isn’t always better. As the creator of Epiphany AI—the Figma for learning designers—Philippa is focused on closing the gap between what great learning design should look like and what’s actually being delivered.

While many AI tools optimize for the average, she believes the future belongs to those who can leverage AI without compromising on expertise or quality. Philippa wants learning designers to be more ambitious, using AI to achieve what wasn’t possible before.

In this conversation, we explore why pedagogy must lead technology, how the return on expertise is only increasing in an AI-driven world, and why building faster doesn’t always mean building better.

An excerpted graphic:




Pearson, AWS Collaborate to Enhance AI-Powered Learning Functionality — from cloudwars.com

Pearson, the global educational publisher, and AWS have expanded their existing partnership to enhance AI-driven learning. AWS will help Pearson to deliver AI-powered lesson generation and more for educators, support workforce skilling initiatives, and continue an ongoing collaboration with Pearson VUE for AWS certification.


 

Who does need college anymore? About that book title … — from Education Design Lab

As you may know, Lab founder Kathleen deLaski just published a book with a provocative title: Who Needs College Anymore? Imagining a Future Where Degrees Won’t Matter.

Kathleen is asked about the title in every media interview, before and since the Feb. 25 book release. “It has generated a lot of questions,” she said in our recent book chat. “I tell people to focus on the word, ‘who.’ Who needs college anymore? That’s in keeping with the design thinking frame, where you look at the needs of individuals and what needs are not being met.”

In the same conversation, Kathleen reminded us that only 38% of American adults have a four-year degree. “We never talk about the path to the American dream for the rest of folks,” she said. “We currently are not supporting the other really interesting pathways to financial sustainability — apprenticeships, short-term credentials. And that’s really why I wrote the book, to push the conversation around the 62% of who we call New Majority Learners at the Lab, the people for whom college was not designed.” Watch the full clip

She distills the point into one sentence in this SmartBrief essay:  “The new paradigm is a ‘yes and’ paradigm that embraces college and/or other pathways instead of college or bust.”

What can colleges do moving forward?
In this excellent Q&A with Inside Higher Ed, Kathleen shares her No. 1 suggestion: “College needs to be designed as a stepladder approach, where people can come in and out of it as they need, and at the very least, they can build earnings power along the way to help afford a degree program.”

In her Hechinger Report essay, Kathleen lists four more steps colleges can take to meet the demand for more choices, including “affordability must rule.”

From white-collar apprenticeships and micro-credential programs at local community colleges to online bootcamps, self-instruction using YouTube, and more—students are forging alternative paths to GREAT high-paying jobs. (source)

 

The $100 billion disruption: How AI is reshaping legal tech — from americanbazaaronline.com by Rohan Hundia and Rajesh Mehta

The Size of the Problem: Judicial Backlog and Inefficiencies
India has a massive backlog of more than 47 million pending cases, with civil litigation itself averaging 1,445 days in resolution. In the United States, federal courts dispose of nearly 400,000 cases a year, and complex litigations take years to complete. Artificial intelligence-driven case law research, contract automation, and predictive analytics will cut legal research times by 90%, contract drafting fees by 60%, and hasten case settlements, potentially saving billions of dollars in legal costs.

This is not just an evolution—it is a permanent change toward data-driven jurisprudence, with AI supplementing human capabilities, speeding up delivery of justice, and extending access to legal services. The AI revolution for legal tech is not on its way; it is already under way, dismantling inefficiencies and transforming the legal world in real time.


Scaling and Improving Legal Tech Projects — from legaltalknetwork.com by Taylor Sartor, Luigi Bai, David Gray, and Cat Moon

Legal tech innovators discuss how they are working to scale and improve their successful projects on Talk Justice. FosterPower and Legal Aid Content Intelligence (LACI) leverage technology to make high-quality legal information available to people for free online. Both also received Technology Initiative Grants (TIG) from the Legal Services Corporation to launch their projects. Then, in 2024 they were both selected for a different TIG, called the Sustainability, Enhancement and Adoption (SEA) grant. This funding supports TIG projects that have demonstrated excellent results as they improve their tools and work to increase uptake.

 

AI in K12: Today’s Breakthroughs and Tomorrow’s Possibilities (webinar)
How AI is Transforming Classrooms Today and What’s Next


Audio-Based Learning 4.0 — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new & powerful way to leverage AI for learning?

At the end of all of this my reflection is that the research paints a pretty exciting picture – audio-based learning isn’t just effective, it’s got some unique superpowers when it comes to boosting comprehension, ramping up engagement, and delivering feedback that really connects with learners.

While audio has been massively under-used as a mode of learning, especially compared to video and text, we’re at an interesting turning point where AI tools are making it easier than ever to tap into audio’s potential as a pedagogical tool.

What’s super interesting is how solid the research backing audio’s effectiveness is, and how well it’s converging with these new AI capabilities.

From DSC:
I’ve noticed that I don’t learn as well via audio-only events. It can help if visuals are also provided, but I have to watch the cognitive load. My processing can start to get overloaded — to the point that I sometimes have to close my eyes and just listen. But there are people I know who love to listen to audiobooks and prefer to learn that way; they can devour content and process/remember it all. Audio is a nice change of pace at times, but I often prefer visuals and reading. It needs to be absolutely quiet if I’m tackling some new information/learning.


In Conversation With… Ashton Cousineau — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new video series exploring how L&D professionals are working with AI on the ground



The Learning Research Digest vol. 28 — from learningsciencedigest.substack.com by Dr. Philippa Hardman

Hot Off the Research Press This Month:

  • AI-Infused Learning Design – A structured approach to AI-enhanced assignments using a three-step model for AI integration.
  • Mathematical Dance and Creativity in STEAM – Using AI-powered motion capture to translate dance movements into mathematical models.
  • AI-Generated Instructional Videos – How adaptive AI-powered video learning enhances problem-solving and knowledge retention.
  • Immersive Language Learning with XR & AI – A new framework for integrating AI-driven conversational agents with Extended Reality (XR) for task-based language learning.
  • Decision-Making in Learning Design – A scoping review on how instructional designers navigate complex instructional choices and make data-driven decisions.
  • Interactive E-Books and Engagement – Examining the impact of interactive digital books on student motivation, comprehension, and cognitive engagement.
  • Elevating Practitioner Voices in Instructional Design – A new initiative to amplify instructional designers’ contributions to research and innovation.

Deep Reasoning, Agentic AI & the Continued Rise of Specialised AI Research & Tools for Education — from learningfuturesdigest.substack.com by Dr. Philippa Hardman

Here’s a quick teaser of key developments in the world of AI & learning this month:

  • DeepSeek R1, OpenAI’s Deep Research & Perplexity’s ‘Deep Research’ are the latest additions to a growing number of “reasoning models” with interesting implications for evidence-based learning design & development.
  • The U.S. Education Dept release an AI Toolkit and a fresh policy roadmap enabling the adoption of AI use in schools.
  • Anthropic Release “Agentic Claude”, another AI agent that clicks, scrolls, and can even successfully complete e-learning courses…
  • Oxford University Announce the AIEOU Hub, a research-backed lab to support research on and implementation of AI in education.
  • “AI Agents Everywhere”: A Forbes peek at how agentic AI will handle the “boring bits” of classroom life.
  • [Bias klaxon!] Epiphany AI: My own research leads to the creation of a specialised, “pedagogy first” AI co-pilot for instructional design, marking the continued growth of specialised AI tools designed for specific industries and workflows.

AI is the Perfect Teaching Assistant for Any Educator — from unite.ai by Navi Azaria, CPO at Kaltura

Through my work with leading educational institutions at Kaltura, I’ve seen firsthand how AI agents are rapidly becoming indispensable. These agents alleviate the mounting burdens on educators and provide new generations of tech-savvy students with accessible, personalized learning, giving teachers the support they need to give their students the personalized attention and engagement they deserve.


Learning HQ — from ai-disruptor-hq.notion.site

This HQ includes all of my AI guides, organized by tool/platform. This list is updated each time a new one is released, and outdated guides are removed/replaced over time.



How AI Is Reshaping Teachers’ Jobs — from edweek.org

Artificial intelligence is poised to fundamentally change the job of teaching. AI-powered tools can shave hours off the amount of time teachers spend grading, lesson-planning, and creating materials. AI can also enrich the lessons they deliver in the classroom and help them meet the varied needs of all students. And it can even help bolster teachers’ own professional growth and development.

Despite all the promise of AI, though, experts still urge caution as the technology continues to evolve. Ethical questions and practical concerns are bubbling to the surface, and not all teachers feel prepared to effectively and safely use AI.

In this special report, see how early-adopter teachers are using AI tools to transform their daily work, tackle some of the roadblocks to expanded use of the technology, and understand what’s on the horizon for the teaching profession in the age of artificial intelligence.

 
© 2025 | Daniel Christian