Osgoode’s new simulation-based learning tool aims to merge ethical and practical legal skills — from canadianlawyermag.com by Tim Wilbur
The designer speaks about his vision for redefining legal education through an innovative platform

For Maharg, the disconnection between legal education and the real world starkly contrasted with what he expected law school to be. “I thought rather naively…this would be a really interesting experience…linked to lawyers and what lawyers are doing in society…Far from it. It was solidly academic, so uninteresting, and I thought it’s got to be better than this.”

These frustrations inspired his work on simulation-based education, which seeks to produce “client-ready” lawyers and professionals who reflect deeply on their future roles. Maharg recently worked as a consultant with Osgoode Professional Development at Osgoode Hall Law School to design a platform that eschews many of the assumptions about legal education to deliver practical skills with real-world scenarios.

Osgoode’s SIMPLE platform – short for “simulated professional learning environment” – integrates case management systems and simulation engines to immerse students in practical scenarios.

“It’s actually to get them thinking hard about what they do when they act as lawyers and what they will do when they become lawyers…putting it into values and an ethical framework, as well as making it highly intensively practical,” Maharg says.


And speaking of legal training, also see:

AI in law firms should be a training tool, not a threat, for young lawyers — from canadianlawyermag.com by Tim Wilbur
Tech should free associates for deeper learning, not remove them from the process

AI is rapidly transforming legal practice. Today, tools handle document review and legal research at a pace unimaginable just a few years ago. As recent Canadian Lawyer reporting shows, legal AI adoption is outpacing expectations, especially among in-house teams, and is fundamentally reshaping how legal services are delivered.

Crucially, though, AI should not replace associates. Instead, it should relieve them of repetitive tasks and allow them to focus on developing judgment, client management, and strategic thinking. As I’ve previously discussed regarding the risks of banning AI in court, the future of law depends on blending technological fluency with the human skills clients value most.


Also, the following relates to legaltech as well:

Agentic AI in Legaltech: Proceed with Supervision! — from directory.lawnext.com by Ken Crutchfield
Semi-autonomous agents can transform work if leaders maintain oversight

The term autonomous agents should raise some concern. I believe semi-autonomous agents is a better term. Do we really want fully autonomous agents that learn and interact independently and find their own ways to accomplish tasks?

We live in a world full of cybersecurity risks. Bad actors will think of ways to use agents. Even well-intentioned systems could mishandle a task without proper guardrails.

Legal professionals will want to thoughtfully equip their agent technology with controlled access to the right services. Agents must be supervised, and training must be required for those using or benefiting from agents. Legal professionals will also want to expand the scope of AI Governance to include the oversight of agents.

Agentic AI will require supervision. Human review of Generative AI output is essential. Stating the obvious may be necessary, especially with agents. Controls, human review, and human monitoring must be part of the design and the requirements for any project. Leadership should not leave this to the IT department alone.

 

15 Quick (and Mighty) Retrieval Practices — from edutopia.org by Daniel Leonard
From concept maps to flash cards to Pictionary, these activities help students reflect on—and remember—what they’ve learned.

But to genuinely commit information to long-term memory, there’s no replacement for active retrieval—the effortful practice of recalling information from memory, unaided by external sources like notes or the textbook. “Studying this way is mentally difficult,” Willingham acknowledged, “but it’s really, really good for memory.”

From low-stakes quizzes to review games to flash cards, there are a variety of effective retrieval practices that teachers can implement in class or recommend that students try at home. Drawing from a wide range of research, we compiled this list of 15 actionable retrieval practices.


And speaking of cognitive science, also see:

‘Cognitive Science,’ All the Rage in British Schools, Fails to Register in U.S. — from the74million.org by Greg Toppo
Educators blame this ‘reverse Beatles effect’ on America’s decentralized system and grad schools that are often hostile to research.

When Zach Groshell zoomed in as a guest on a longstanding British education podcast last March, a co-host began the interview by telling listeners he was “very well-known over in the U.S.”

Groshell, a former Seattle-area fourth-grade teacher, had to laugh: “Nobody knows me here in the U.S.,” he said in an interview.

But in Britain, lots of teachers know his name. An in-demand speaker at education conferences, he flies to London “as frequently as I can” to discuss Just Tell Them, his 2024 book on explicit instruction. Over the past year, Groshell has appeared virtually about once a month and has made two personal appearances at events across England.

The reason? A discipline known as cognitive science. Born in the U.S., it relies on decades of research on how kids learn to guide teachers in the classroom, and is at the root of several effective reforms, including the Science of Reading.

 

I Teach Creative Writing. This Is What A.I. Is Doing to Students. — from nytimes.com by Meghan O’Rourke; this is a gifted article.

We need a coherent approach grounded in understanding how the technology works, where it is going and what it will be used for.

From DSC:
I almost feel like Meghan should write the words “this week” or “this month” after the above sentence. Whew! Things are moving fast.

For example, we’re now starting to see more agents hitting the scene — software that can DO things. But that can open up a can of worms too. 

Students know the ground has shifted — and that the world outside the university expects them to shift with it. A.I. will be part of their lives regardless of whether we approve. Few issues expose the campus cultural gap as starkly as this one.

From DSC:
Universities and colleges have little choice but to integrate AI into their programs and offerings. There’s enough pressure on institutions of traditional higher education to prove their worth/value. Students and their families want a solid ROI. Students know that they are going to need AI-related skills (see the link immediately below for example), or they are going to be left out of the competitive job search process.

A relevant resource here:

 

From DSC:
In looking at MyNextChapter.ai, THIS TYPE OF FUNCTIONALITY – an AI-based chatbot talking with you about good fits for a future job – is the kind of thing that could work well in this type of vision/learning platform. The AI asks you relevant career-oriented questions, comes up with some potential job fits, and then gives you resources about how to gain those skills, who to talk with, organizations to join, next steps to get your foot in the door somewhere, etc.

The next gen learning platform would provide links to online-based courses, blogs, peoples’ names on LinkedIn, courses from L&D organizations or from institutions of higher education or from other entities/places to obtain those skills (similar to the “Action Plan” below from MyNextChapter.ai).

 

Teach business students to write like executives — from timeshighereducation.com by José Ignacio Sordo Galarza
Many business students struggle to communicate with impact. Teach them to pitch ideas on a single page to build clarity, confidence and work-ready communication skills

Many undergraduate business students transition into the workforce equipped with communication habits that, while effective in academic settings, prove ineffective in professional environments. At university, students are trained to write for professors, not executives. This becomes problematic in the workplace where lengthy reports and academic jargon often obscure rather than clarify intent. Employers seek ideas they can absorb in seconds. This is where the one-pager – a single-page, high-impact document that helps students develop clarity of thought, concise expression and strategic communication – proves effective.


Also from Times Higher Education, see:


Is the dissertation dead? If so, what are the alternatives? — from timeshighereducation.com by Rushana Khusainova, Sarah Sholl, & Patrick Harte
Dissertation alternatives, such as capstone projects and applied group-based projects, could better prepare graduates for their future careers. Discover what these might look like

The traditional dissertation, a longstanding pillar of higher education, is facing increasing scrutiny. Concerns about its relevance to contemporary career paths, limitations in fostering practical skills and the changing nature of knowledge production in the GenAI age have fuelled discussions about its continued efficacy. So, is the dissertation dead?

The dissertation is facing a number of challenges. It can be perceived as having little relevance to career aspirations in increasingly competitive job markets. According to The Future of Jobs Report 2025 by the World Economic Forum, employers demand and indeed prioritise skills such as collaborative problem-solving in diverse and complex contexts, which a dissertation might not demonstrate.

 

 

PODCAST: Did AI “break” school? Or will it “fix” it? …and if so, what can we do about it? — from theneurondaily.com by Corey Noles, Grant Harvey, & Matthew Robinson

In Episode 5 of The Neuron Podcast, Corey Noles and Grant Harvey tackle the education crisis head-on. We explore the viral UCLA “CheatGPT” controversy, MIT’s concerning brain study, and innovative solutions like Alpha School’s 2-hour learning model. Plus, we break down OpenAI’s new $10M teacher training initiative and share practical tips for using AI to enhance learning rather than shortcut it. Whether you’re a student, teacher, or parent, you’ll leave with actionable insights on the future of education.

 


Tech check: Innovation in motion: How AI is rewiring L&D workflows — from chieflearningofficer.com by Gabrielle Pike
AI isn’t here to replace us. It’s here to level us up.

For today’s chief learning officer, the days of just rolling out compliance training are long gone. In 2025, learning and development leaders are architects of innovation, crafting ecosystems that are agile, automated and AI-infused. This quarter’s Tech Check invites us to pause, assess and get strategic about where tech is taking us. Because the goal isn’t more tools—it’s smarter, more human learning systems that scale with the business.

Sections include:

  • The state of AI in L&D: Hype vs. reality
  • AI in design: From static content to dynamic experiences
  • AI in development: Redefining production workflows
  • Strategic questions CLOs should be asking
  • Future forward: What’s next?
  • Closing thought

American Federation of Teachers (AFT) to Launch National Academy for AI Instruction with Microsoft, OpenAI, Anthropic and United Federation of Teachers — from aft.org

NEW YORK – The AFT, alongside the United Federation of Teachers and lead partner Microsoft Corp., founding partner OpenAI, and Anthropic, announced the launch of the National Academy for AI Instruction today. The groundbreaking $23 million education initiative will provide access to free AI training and curriculum for all 1.8 million members of the AFT, starting with K-12 educators. It will be based at a state-of-the-art bricks-and-mortar Manhattan facility designed to transform how artificial intelligence is taught and integrated into classrooms across the United States.

The academy will help address the gap in structured, accessible AI training and provide a national model for AI-integrated curriculum and teaching that puts educators in the driver’s seat.


Students Are Anxious about the Future with A.I. Their Parents Are, Too. — from educationnext.org by Michael B. Horn
The fast-growing technology is pushing families to rethink the value of college

In an era when the college-going rate of high school graduates has dropped from an all-time high of 70 percent in 2016 to roughly 62 percent now, AI seems to be heightening the anxieties about the value of college.

According to the survey, two-thirds of parents say AI is impacting their view of the value of college. Thirty-seven percent of parents indicate they are now scrutinizing college’s “career-placement outcomes”; 36 percent say they are looking at a college’s “AI-skills curriculum,” while 35 percent respond that a “human-skills emphasis” is important to them.

This echoes what I increasingly hear from college leadership: Parents and students demand to see a difference between what they are getting from a college and what they could be “learning from AI.”


This next item on LinkedIn is compliments of Ray Schroeder:



How to Prepare Students for a Fast-Moving (AI) World — from rdene915.com by Dr. Rachelle Dené Poth

Preparing for a Future-Ready Classroom
Here are the core components I focus on to prepare students:

1. Unleash Creativity and Problem-Solving.
2. Weave in AI and Computational Thinking.
3. Cultivate Resilience and Adaptability.


AI Is Reshaping Learning Roles—Here’s How to Future-Proof Your Team — from onlinelearningconsortium.org by Jennifer Mathes, Ph.D., CEO, Online Learning Consortium; via Robert Gibson on LinkedIn

Culture matters here. Organizations that foster psychological safety—where experimentation is welcomed and mistakes are treated as learning—are making the most progress. When leaders model curiosity, share what they’re trying, and invite open dialogue, teams follow suit. Small tests become shared wins. Shared wins build momentum.

Career development must be part of this equation. As roles evolve, people will need pathways forward. Some will shift into new specialties. Others may leave familiar roles for entirely new ones. Making space for that evolution—through upskilling, mobility, and mentorship—shows your people that you’re not just investing in AI, you’re investing in them.

And above all, people need transparency. Teams don’t expect perfection. But they do need clarity. They need to understand what’s changing, why it matters, and how they’ll be supported through it. That kind of trust-building communication is the foundation for any successful change.

These shifts may play out differently across sectors—but the core leadership questions will likely be similar.

AI marks a turning point—not just for technology, but for how we prepare our people to lead through disruption and shape the future of learning.




 
 

Is graduate employability a core university priority? — from timeshighereducation.com by Katherine Emms and Andrea Laczik
Universities, once judged primarily on the quality of their academic outcomes, are now also expected to prepare students for the workplace. Here’s how higher education is adapting to changing pressures

A clear, deliberate shift in priorities is under way. Embedding employability is central to an Edge Foundation report, carried out in collaboration with UCL’s Institute of Education, which looks at how English universities are responding. By placing employability at the centre of their strategies – not just for professional courses but across all disciplines – the two universities analysed in the research show how they aim to prepare students for the labour market. Although the employability strategy is signed off by the universities’ senior leaders, the research showed that realising it must be understood and executed by staff at all levels and across departments. Offering insights into industry pathways and building relevant skills is complex work, involving curriculum development, student-centred teaching, careers support, partnership work and employer engagement.


Every student can benefit from an entrepreneurial mindset — from timeshighereducation.com by Nicolas Klotz
To develop the next generation of entrepreneurs, universities need to nurture the right mindset in students of all disciplines. Follow these tips to embed entrepreneurial education

This shift demands a radical rethink of how we approach entrepreneurial mindset in higher education. Not as a specialism for a niche group of business students but as a core competency that every student, in every discipline, can benefit from.

At my university, we’ve spent the past several years re-engineering how we embed entrepreneurship into daily student life and learning.

What we’ve learned could help other institutions, especially smaller or resource-constrained ones, adapt to this new landscape.

The first step is recognising that entrepreneurship is not only about launching start-ups for profit. It’s about nurturing a mindset that values initiative, problem-solving, resilience and creative risk-taking. Employers increasingly want these traits, whether the student is applying for a traditional job or proposing their own venture.


Build foundations for university-industry partnerships in 90 days — from timeshighereducation.com by Raul Villamarin Rodriguez and Hemachandran K
Graduate employability could be transformed through systematic integration of industry partnerships. This practical guide offers a framework for change in Indian universities

The most effective transformation strategy for Indian universities lies in systematic industry integration that moves beyond superficial partnerships and towards deep curriculum collaboration. Rather than hoping market alignment will occur naturally, institutions must reverse-engineer academic programmes from verified industry needs.

Our six-month implementation at Woxsen University demonstrates this framework’s practical effectiveness, achieving more than 130 industry partnerships, 100 per cent faculty participation in transformation training, and 75 per cent of students receiving industry-validated credentials with significantly improved employment outcomes.


 

How Do You Teach Computer Science in the A.I. Era? — from nytimes.com by Steve Lohr; with thanks to Ryan Craig for this resource
Universities across the country are scrambling to understand the implications of generative A.I.’s transformation of technology.

The future of computer science education, Dr. Maher said, is likely to focus less on coding and more on computational thinking and A.I. literacy. Computational thinking involves breaking down problems into smaller tasks, developing step-by-step solutions and using data to reach evidence-based conclusions.

A.I. literacy is an understanding — at varying depths for students at different levels — of how A.I. works, how to use it responsibly and how it is affecting society. Nurturing informed skepticism, she said, should be a goal.

At Carnegie Mellon, as faculty members prepare for their gathering, Dr. Cortina said his own view was that the coursework should include instruction in the traditional basics of computing and A.I. principles, followed by plenty of hands-on experience designing software using the new tools.

“We think that’s where it’s going,” he said. “But do we need a more profound change in the curriculum?”

 

 

Get yourself unstuck: overthinking is boring and perfectionism is a trap — from timeshighereducation.com by David Thompson
The work looks flawless, the student seems fine. But underneath, perfectionism is doing damage. David Thompson unpacks what educators can do to help high-performing students navigate the pressure to succeed and move from stuck to started

That’s why I encourage imperfection, messiness and play and build these ideas into how I teach.

These moments don’t come from big breakthroughs. They come from removing pressure and replacing it with permission.

 
 
 

“Using AI Right Now: A Quick Guide” [Mollick] + other items re: AI in our learning ecosystems

Thoughts on thinking — from dcurt.is by Dustin Curtis

Intellectual rigor comes from the journey: the dead ends, the uncertainty, and the internal debate. Skip that, and you might still get the insight – but you’ll have lost the infrastructure for meaningful understanding. Learning by reading LLM output is cheap. Real exercise for your mind comes from building the output yourself.

The irony is that I now know more than I ever would have before AI. But I feel slightly dumber. A bit more dull. LLMs give me finished thoughts, polished and convincing, but none of the intellectual growth that comes from developing them myself. 


Using AI Right Now: A Quick Guide — from oneusefulthing.org by Ethan Mollick
Which AIs to use, and how to use them

Every few months I put together a guide on which AI system to use. Since I last wrote my guide, however, there has been a subtle but important shift in how the major AI products work. Increasingly, it isn’t about the best model, it is about the best overall system for most people. The good news is that picking an AI is easier than ever and you have three excellent choices. The challenge is that these systems are getting really complex to understand. I am going to try and help a bit with both.

First, the easy stuff.

Which AI to Use
For most people who want to use AI seriously, you should pick one of three systems: Claude from Anthropic, Google’s Gemini, and OpenAI’s ChatGPT.

Also see:


Student Voice, Socratic AI, and the Art of Weaving a Quote — from elmartinsen.substack.com by Eric Lars Martinsen
How a custom bot helps students turn source quotes into personal insight—and share it with others

This summer, I tried something new in my fully online, asynchronous college writing course. These classes have no Zoom sessions. No in-person check-ins. Just students, Canvas, and a lot of thoughtful design behind the scenes.

One activity I created was called QuoteWeaver—a PlayLab bot that helps students do more than just insert a quote into their writing.

Try it here

It’s a structured, reflective activity that mimics something closer to an in-person 1:1 conference or a small group quote workshop—but in an asynchronous format, available anytime. In other words, it’s using AI not to speed students up, but to slow them down.

The bot begins with a single quote that the student has found through their own research. From there, it acts like a patient writing coach, asking open-ended, Socratic questions such as:

  • What made this quote stand out to you?
  • How would you explain it in your own words?
  • What assumptions or values does the author seem to hold?
  • How does this quote deepen your understanding of your topic?
It doesn’t move on too quickly. In fact, it often rephrases and repeats, nudging the student to go a layer deeper.
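That “don’t move on too quickly” behavior can be sketched as a tiny state machine over the four prompts quoted above. This is purely illustrative – the function names and the crude word-count heuristic are invented for this sketch; the real QuoteWeaver bot uses an LLM, not a word count, to judge whether an answer has gone deep enough:

```python
# Hypothetical sketch of a Socratic bot that repeats and rephrases
# rather than rushing ahead. A word-count check stands in for the
# LLM's judgment of whether the student's answer shows real depth.

SOCRATIC_PROMPTS = [
    "What made this quote stand out to you?",
    "How would you explain it in your own words?",
    "What assumptions or values does the author seem to hold?",
    "How does this quote deepen your understanding of your topic?",
]

def next_prompt(step: int, last_answer: str) -> tuple[int, str]:
    """Return (new_step, prompt). Advance only when the answer shows
    some depth; otherwise rephrase and repeat the current question."""
    if len(last_answer.split()) < 8:  # shallow answer: stay on this question
        return step, "Let's stay with this a moment. " + SOCRATIC_PROMPTS[step]
    new_step = min(step + 1, len(SOCRATIC_PROMPTS) - 1)
    return new_step, SOCRATIC_PROMPTS[new_step]

# A shallow answer repeats the question; a fuller one moves forward.
step, prompt = next_prompt(0, "It was interesting.")
assert step == 0 and prompt.endswith(SOCRATIC_PROMPTS[0])
```

The design point is that the loop only ever moves one step at a time, and never past the last question – slowing students down, as Martinsen puts it, rather than speeding them up.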


The Disappearance of the Unclear Question — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
New Piece for UNESCO Education Futures

On June 13, 2025, UNESCO published a piece I co-authored with Victoria Livingstone at Johns Hopkins University Press. It’s called The Disappearance of the Unclear Question, and it’s part of the ongoing UNESCO Education Futures series – an initiative I appreciate for its thoughtfulness and depth on questions of generative AI and the future of learning.

Our piece raises a small but important red flag. Generative AI is changing how students approach academic questions, and one unexpected side effect is that unclear questions – for centuries a trademark of deep thinking – may be beginning to disappear. Not because they lack value, but because they don’t always work well with generative AI. Quietly and unintentionally, students (and teachers) may find themselves gradually avoiding them altogether.

Of course, that would be a mistake.

We’re not arguing against using generative AI in education. Quite the opposite. But we do propose that higher education needs a two-phase mindset when working with this technology: one that recognizes what AI is good at, and one that insists on preserving the ambiguity and friction that learning actually requires to be successful.




Leveraging GenAI to Transform a Traditional Instructional Video into Engaging Short Video Lectures — from er.educause.edu by Hua Zheng

By leveraging generative artificial intelligence to convert lengthy instructional videos into micro-lectures, educators can enhance efficiency while delivering more engaging and personalized learning experiences.


This AI Model Never Stops Learning — from link.wired.com by Will Knight

Researchers at Massachusetts Institute of Technology (MIT) have now devised a way for LLMs to keep improving by tweaking their own parameters in response to useful new information.

The work is a step toward building artificial intelligence models that learn continually—a long-standing goal of the field and something that will be crucial if machines are to ever more faithfully mimic human intelligence. In the meantime, it could give us chatbots and other AI tools that are better able to incorporate new information including a user’s interests and preferences.

The MIT scheme, called Self Adapting Language Models (SEAL), involves having an LLM learn to generate its own synthetic training data and update procedure based on the input it receives.


Edu-Snippets — from scienceoflearning.substack.com by Nidhi Sachdeva and Jim Hewitt
Why knowledge matters in the age of AI; What happens to learners’ neural activity with prolonged use of LLMs for writing

Highlights:

  • Offloading knowledge to Artificial Intelligence (AI) weakens memory, disrupts memory formation, and erodes the deep thinking our brains need to learn.
  • Prolonged use of ChatGPT in writing lowers neural engagement, impairs memory recall, and accumulates cognitive debt that isn’t easily reversed.
 
© 2025 | Daniel Christian