5 Tech Strategies to Enhance Student-Led Learning — from edutopia.org by Rachelle Dené Poth
While technology has potential to distract students, it can also boost engagement and help them actively demonstrate their learning.

Over the years, I have learned that engagement doesn’t happen simply by adding technology. It increases when we give students more ownership by designing experiences that allow them to build, collaborate, reflect, and teach one another. Depending on how we use it, technology can either amplify engagement or distract from it; it can build students’ confidence in learning, but it can also lead to passivity. When technology amplifies students’ voice, choice, and ownership in learning, their engagement naturally increases.

Here are five strategies and some digital tools that can be used across grade levels and content areas to boost student engagement, build confidence, foster collaboration, and support meaningful learning experiences.


Project-Based Learning (PBL)
Implementing a PBL Design Challenge in Your School — from edutopia.org by Lisa Beck & Kim Mishkin
A weeklong, schoolwide project-based learning challenge encourages students to try to tackle meaningful problems.

For the past five years, Hudson Lab School (HLS), a K–8 progressive school committed to project-based learning (PBL), has kicked off each school year with an exciting tradition: Design Challenge Week. In five days, students take on a real-world problem, explore each phase of the design process, and present what they created and learned to an authentic audience. Design Challenge Week introduces concepts that students will revisit all year and offers a model for how any educational setting could experiment with PBL on a smaller scale. Even short, well-designed challenges can lead to deeply engaged learning experiences.


How to Give Students Directions They Actually Understand — from edutopia.org by Mary Davenport
Making small changes in your instructions can have a significant impact on students’ understanding and engagement.

No more than a minute after you’ve provided instruction on the day’s targeted content and given students directions for their next task, some brave soul utters the line that brings tired teachers to their knees: “What are we supposed to be doing?”

None of us want this. As teachers, we all want students to fully understand what they’re supposed to be doing so that they can be successful as they do it.

Good news: A few small changes in how we give directions can be the lever that boosts student understanding and engagement.

 

Law Firm AI Adoption: So Many Choices — from abovethelaw.com by Stephen Embry
Firms need to recognize reality, define what their legal professionals need, and then determine how to adopt and govern the use of AI tools.

It’s tough to be a law firm managing partner in the age of AI. So many choices, so little time. It’s like the proverbial kid in the candy store who has so many choices that they either can’t pick out anything or reach for too much. We see evidence of the first option in 8am’s recent outstanding Legal Industry Report, authored by Niki Black.

8am’s Legal Industry Report
One thing that stood out in the report was the discrepancy between individual legal professionals’ use of AI and what firms are doing when it comes to AI adoption and guidance. Almost 75% of respondents said they were using general-purpose AI tools like ChatGPT and Claude for work purposes. That’s pretty significant.


Legalweek: It’s time to re-engineer how legal work is delivered — from legaltechnology.com by Caroline Hill

AI for good
While much of the focus has been on the risks of AI going wrong, it is only fair to mention the conversations I had about using AI for good. Two in particular stand out.

The first is the news from Everlaw that its Everlaw for Good Program has, over the past year, supported more than 675 active cases across 235 organisations, and expanded its support to a growing network of non-profit organisations.

The program extends Everlaw’s technology to organisations working to advance access to justice. In a recent survey by Everlaw, 88% of legal aid professionals said they are optimistic about AI’s potential to help narrow the justice gap.

“Mission-driven organizations are increasingly handling complex investigations and litigation with limited resources,” said Joanne Sprague, head of Everlaw for Good. “Expanding access to powerful, easy-to-use technology helps level the playing field so these teams can uncover critical evidence, take on more complex matters, and yield stronger results for the communities they serve.”


LawNext on Location: Visiting Everlaw’s Headquarters For A Conversation with AJ Shankar, Founder and CEO — from lawnext.com by Bob Ambrogi

The bulk of our conversation focuses on generative AI, and how Everlaw has approached it differently than much of the market. Rather than bolting on a chatbot, AJ says, Everlaw embedded AI deliberately throughout the platform — document summarization, coding suggestions, deposition analysis, fact extraction — always grounding responses in the actual documents at hand and citing sources so users can verify the work. The December launch of Deep Dive, which lets litigators pose a question and get a synthesized, cited answer drawn from an entire document corpus in about a minute, is the feature AJ calls a “new era” for discovery — one he genuinely believes represents a categorical shift.

 

How to Get Consistent, On-Brand Course Images from Any AI Image Tool — from drphilippahardman.substack.com by Dr. Philippa Hardman
A 3-step workflow that works every time — whatever AI tool you’re using

Most designers try to describe their way to an image. That’s the wrong approach. The goal is to show the tool the world it should be working in, then give it the minimum it needs to place your subject inside that world.

Every long, over-specified prompt is a sign that your visual inputs aren’t doing enough work.

The fix is a 3-step process that gives you superpowers in AI image generation…


How AI Could Transform, or Replace, the LMS — from futureupodcast.com by Jeff Selingo, Michael Horn, and Matthew Pittinsky

Tuesday, March 10, 2026 – For 30 years now, colleges have relied on the Learning Management System, or LMS, as a key portal for professors and students to teach and learn. It’s a tool that has helped colleges adapt to online learning and bring digital tools to classroom teaching. But generative AI seems poised to disrupt the LMS. And it’s unclear whether the LMS will evolve—or be replaced altogether. For this episode, Jeff and Michael talk with a pioneer of the technology, Matthew Pittinsky, about the lessons of past moments of tech disruption like the smartphone and cloud computing and about what could be different this time. This episode is made with support from Ascendium Education Group.


Gemini, Explained — from wondertools.substack.com by Jeremy Caplan
5 features worth your time — tested and compared

Google’s AI, Gemini, has quickly become one of the AI tools I rely on most. It builds dashboards and creates remarkable infographics. It spins out comprehensive research reports in minutes that would once have taken days to assemble.

It’s improving every month. On March 13, Google announced Ask Maps, so you can query Gemini about things like “Which nearby tennis courts are open with lights so I can play tonight?” On March 10, Gemini added new integrations to build, summarize, and analyze your Google Docs, Sheets, and Slides.

In today’s post below: catch up on the Gemini features worth your time, candid comparisons with other AI tools, and answers to the questions I hear most.


How we’re reimagining Maps with Gemini — from blog.google
Ask Maps answers your real-world questions with a conversation, and Immersive Navigation makes your route more intuitive.

Today, Google Maps is fundamentally changing what a map can do. By bringing together the world’s freshest map with our most capable Gemini models, we’re transforming exploration into a simple conversation and making driving more intuitive than ever with our biggest navigation upgrade in over a decade.

Ask anything about any place
We’re introducing Ask Maps, a new conversational experience that answers complex, real-world questions a map could never answer before. Now you can ask for things like, “My phone is dying — where can I charge it without having to wait in a long line for coffee?” or “Is there a public tennis court with lights on that I can play at tonight?” Previously, finding this information meant lots of research and sifting through reviews. But now, you can just tap the “Ask Maps” button and get your questions answered conversationally, with a customized map to help you visualize your options.

 

Cinematic Prompting Without IP — from heatherbcooper.substack.com by Heather Cooper
Stop saying “Blade Runner” style.

Beginner Prompt Structure
If you’re new to prompting, start with this framework:
[Subject] + [Description] + [Setting] + [Lighting] + [Style/Medium]

The advanced framework adds three layers:
[Lens] + [Subject + Action] + [Environment + Atmosphere] + [Lighting + Colour] + [Mood/Emotion] + [Technical Detail]
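As an illustration only, the beginner framework can be thought of as filling five slots and joining them into one prompt. The slot values in this sketch are hypothetical examples, not drawn from the article:

```python
# A minimal sketch of the beginner prompt framework:
# [Subject] + [Description] + [Setting] + [Lighting] + [Style/Medium]
# All slot values below are made-up examples for illustration.
slots = {
    "subject": "a lighthouse keeper",
    "description": "weathered, wearing a yellow raincoat",
    "setting": "on a rocky coastline at dusk",
    "lighting": "soft golden-hour light",
    "style_medium": "oil painting",
}

# Join the filled slots into a single comma-separated prompt string.
prompt = ", ".join(slots.values())
print(prompt)
```

The advanced framework works the same way, just with more slots (lens, atmosphere, mood, technical detail) prepended and appended in the order given above.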

 
 

Something Big Is Happening — from shumer.dev by Matt Shumer; see the related piece below from the BIG Questions Institute, where I found this article

I’ve spent six years building an AI startup and investing in the space. I live in this world. And I’m writing this for the people in my life who don’t… my family, my friends, the people I care about who keep asking me “so what’s the deal with AI?” and getting an answer that doesn’t do justice to what’s actually happening. I keep giving them the polite version. The cocktail-party version. Because the honest version sounds like I’ve lost my mind. And for a while, I told myself that was a good enough reason to keep what’s truly happening to myself. But the gap between what I’ve been saying and what is actually happening has gotten far too big. The people I care about deserve to hear what is coming, even if it sounds crazy.


They’ve now done it. And they’re moving on to everything else.

The experience that tech workers have had over the past year, of watching AI go from “helpful tool” to “does my job better than I do”, is the experience everyone else is about to have. Law, finance, medicine, accounting, consulting, writing, design, analysis, customer service. Not in ten years. The people building these systems say one to five years. Some say less. And given what I’ve seen in just the last couple of months, I think “less” is more likely.

The models available today are unrecognizable from what existed even six months ago. The debate about whether AI is “really getting better” or “hitting a wall” — which has been going on for over a year — is over. It’s done. Anyone still making that argument either hasn’t used the current models, has an incentive to downplay what’s happening, or is evaluating based on an experience from 2024 that is no longer relevant. I don’t say that to be dismissive. I say it because the gap between public perception and current reality is now enormous, and that gap is dangerous… because it’s preventing people from preparing.


What “Something Big Is Happening” Means for Schools — from/by the BIG Questions Institute
Matt Shumer’s newsletter post Something Big Is Happening was read more than 80 million times within a week of its publication on February 9.

Still, it’s worth reading Shumer’s post. Given the claims and warnings in Something Big Is Happening (and countless other articles), how would you truly, honestly respond to these questions:

  • What will the purpose of school be in 5 years?
  • What are we doing now that we must leave behind right away?
  • What can we leave behind gradually?
  • What does rigor look like in this AI-powered world?
  • Does our strategy look like making adjustments at the margins or are we preparing our students for a fundamental shift?
  • What is our definition of success? How do the implications of AI and jobs (and other important forces, from geopolitical shifts and climate change, to mental health needs and shifting generational values) impact the outcomes we prioritize? What is the story of success we want to pass on to our students and wider community?
 

Claude Code Puts Tech Workers on Notice — from builtin.com by Matthew Urwin
Anthropic is flexing its new and improved Claude Code, which used vibe coding to build the company’s latest tool, Cowork. The feat has inspired both excitement and angst within the tech world as the future of work continues to grow more uncertain.

Summary:
Anthropic is becoming the leader in enterprise artificial intelligence, thanks to upgrades made to Claude Code. The coding tool practically built Anthropic’s Cowork product — sparking both excitement around the possibilities of vibe coding and fears around the job outlook of tech workers.

 

The Campus AI Crisis — by Jeffrey Selingo; via Ryan Craig
Young graduates can’t find jobs. Colleges know they have to do something. But what?

Only now are colleges realizing that the implications of AI are much greater and are already outrunning their institutional ability to respond. As schools struggle to update their curricula and classroom policies, they also confront a deeper problem: the suddenly enormous gap between what they say a degree is for and what the labor market now demands. In that mismatch, students are left to absorb the risk. Alina McMahon and millions of other Gen-Zers like her are caught in a muddled in-between moment: colleges only just beginning to think about how to adapt and redefine their mission in the post-AI world, and a job market that’s changing much, much faster.

“Colleges and universities face an existential issue before them,” said Ryan Craig, author of Apprentice Nation and managing director of a firm that invests in new educational models. “They need to figure out how to integrate relevant, in-field, and hopefully paid work experience for every student, and hopefully multiple experiences before they graduate.”

 

Kling 3.0 just launched. The best video model yet. — from heatherbcooper.substack.com by Heather Cooper
& workflows from Imagine Art 1.5 pro, Pixverse Real-Time Video & Genspark

In today’s edition:

  • Kling 3.0: Everyone a Director
  • Character consistency, native audio, 15-second generations & first results
  • Image & Video Prompts
  • Imagine Art 1.5 Pro, Genspark AI Workspace 2.0 & PixVerse Real-Time Video Workflows

Kling 3.0: Everyone a Director
Kling just dropped version 3.0, and it’s a legitimate leap forward for AI video production (Kling is the GOAT). After spending early access time testing the new capabilities, I can confirm this is the most significant update to video generation tools I’ve seen in months.

Key highlights:

  • Character & Element Consistency:
  • Flexible Video Production:
  • Native Audio with Dialogue & Singing:
  • Enhanced Image Generation:
  • Professional Output:
 

Jim VandeHei’s note to his kids: Blunt AI talk — from axios.com by CEO Jim VandeHei
Axios CEO Jim VandeHei wrote this note to his wife, Autumn, and their three kids. She suggested sharing it more broadly since so many families are wrestling with how to think and talk about AI. So here it is …

Dear Family:
I want to put into words what I’m hearing, seeing, thinking and writing about AI.

  • Simply put, I’m now certain it will upend your work and life in ways more profound than the internet or possibly electricity. This will hit in months, not years.
  • The changes will be fast, wide, radical, disorienting and scary. No one will avoid its reach.

I’m not trying to frighten you. And I know your opinions range from wonderment to worry. That’s natural and OK. Our species isn’t wired for change of this speed or scale.

  • My conversations with the CEOs and builders of these LLMs, as well as my own deep experimentation with AI, have shaken and stirred me in ways I never imagined.

All of you must figure out how to master AI for any specific job or internship you hold or take. You’d be jeopardizing your future careers by not figuring out how to use AI to amplify and improve your work. You’d be wise to replace social media scrolling with LLM testing.

Be the very best at using AI for your gig.

more here.



 

The Essential Retrieval Practice Handbook — from edutopia.org
Retrieval practice is one of the most effective ways to strengthen learning. Here’s a collection of our best resources to use in your classroom today.
January 29, 2026


Also see:

What is retrieval practice? — from retrievalpractice.org

When we think about learning, we typically focus on getting information into students’ heads. What if, instead, we focus on getting information out of students’ heads?


 

Farewell to Traditional Universities | What AI Has in Store for Education

Premiered Jan 16, 2026

Description:

What if the biggest change in education isn’t a new app… but the end of the university monopoly on credibility?

Jensen Huang has framed AI as a platform shift—an industrial revolution that turns intelligence into infrastructure. And when intelligence becomes cheap, personal, and always available, education stops being a place you go… and becomes a system that follows you. The question isn’t whether universities will disappear. The question is whether the old model—high cost, slow updates, one-size-fits-all—can survive a world where every student can have a private tutor, a lab partner, and a curriculum designer on demand.

This video explores what AI has in store for education—and why traditional universities may need to reinvent themselves fast.

In this video you’ll discover:

  • How AI tutors could deliver personalized learning at scale
  • Why credentials may shift from “degrees” to proof-of-skill portfolios
  • What happens when the “middle” of studying becomes automated
  • How universities could evolve: research hubs, networks, and high-trust credentialing
  • The risks: cheating, dependency, bias, and widening inequality
  • The 3 skills that become priceless when information is everywhere: judgment, curiosity, and responsibility

From DSC:
There appears to be another, similar video with a different date and length, so I’m including that recording here as well:


The End of Universities as We Know Them: What AI Is Bringing

Premiered Jan 27, 2026

What if universities don’t “disappear”… but lose their monopoly on learning, credentials, and opportunity?

AI is turning education into something radically different: personal, instant, adaptive, and always available. When every student can have a 24/7 tutor, a writing coach, a coding partner, and a study plan designed specifically for them, the old model—one professor, one curriculum, one pace for everyone—starts to look outdated. And the biggest disruption isn’t the classroom. It’s the credential. Because in an AI world, proof of skill can become more valuable than a piece of paper.

This video explores the end of universities as we know them: what AI is bringing, what will break, what will survive, and what replaces the traditional path.

In this video you’ll discover:

  • Why AI tutoring could outperform one-size-fits-all lectures
  • How “degrees” may shift into skill proof: portfolios, projects, and verified competency
  • What happens when the “middle” of studying becomes automated
  • How universities may evolve: research hubs, networks, high-trust credentialing
  • The dark side: cheating, dependency, inequality, and biased evaluation
  • The new advantage: judgment, creativity, and responsibility in a world of instant answers
 
 

AI and the Work of Centers for Teaching and Learning — from derekbruff.org by Derek Bruff

  • Penelope Adams Moon suggested that instead [of] framing a workshop around “How can we integrate AI into the work of teaching?” we should ask “Given what we know about learning, how might AI be useful?” I love that reframing, and I think it connects to the students’ requests for more AI knowhow. Students have a lot of options for learning: working with their instructor, collaborating with peers, surfing YouTube for explainer videos, university-provided social annotation platforms, and, yes, using AI as a kind of tutor. I think our job (collectively) isn’t just to teach students how to use AI (as they’re requesting) but also to help them figure out when and how AI is helpful for their learning. That’s highly dependent on the student and the learning task! I wrote about this kind of metacognition on my blog.

In the same way, when I approach any kind of educational technology, I’m looking for tools that can be responsive to my pedagogical aims. The pedagogy should drive the technology use, not the other way around.

 
 
© 2025 | Daniel Christian