AI agents arrive in US classrooms — from zdnet.com by Radhika Rajkumar
Kira AI’s personalized learning platform is currently being implemented in Tennessee schools. How will it change education?

AI for education is a new but rapidly expanding field. Can it support student outcomes and help teachers avoid burnout?

On Wednesday, AI education company Kira launched a “fully AI-native learning platform” for K-12 education, complete with agents to assist teachers with repetitive tasks. The platform hosts assignments, analyzes progress data, offers administrative assistance, helps build lesson plans and quizzes, and more.

“Unlike traditional tools that merely layer AI onto existing platforms, Kira integrates artificial intelligence directly into every educational workflow — from lesson planning and instruction to grading, intervention, and reporting,” the release explains. “This enables schools to improve student outcomes, streamline operations, and provide personalized support at scale.”

Also relevant/see:

Coursera Founder Andrew Ng’s New Venture Brings A.I. to K–12 Classrooms — from observer.com by Victor Dey
Andrew Ng’s Kira Learning uses A.I. agents to transform K–12 education with tools for teachers, students and administrators.

“Teachers today are overloaded with repetitive tasks. A.I. agents can change that, and free up their time to give more personalized help to students,” Ng said in a statement.

Kira was co-founded by Andrea Pasinetti and Jagriti Agrawal, both longtime collaborators of Ng. The platform embeds A.I. directly into lesson planning, instruction, grading and reporting. Teachers can instantly generate standards-aligned lesson plans, monitor student progress in real time and receive automated intervention strategies when a student falls behind.

Students, in turn, receive on-demand tutoring tailored to their learning styles. A.I. agents adapt to each student’s pace and mastery level, while grading is automated with instant feedback—giving educators time to focus on teaching.


‘Using GenAI is easier than asking my supervisor for support’ — from timeshighereducation.com
Doctoral researchers are turning to generative AI to assist in their research. How are they using it, and how can supervisors and candidates have frank discussions about using it responsibly?

Generative AI is increasingly the proverbial elephant in the supervisory room. As supervisors, you may be concerned about whether your doctoral researchers are using GenAI. It can be a tricky topic to broach, especially when you may not feel confident in understanding the technology yourself.

While the potential impact of GenAI use among undergraduate and postgraduate taught students, especially, is well discussed (and it is increasingly accepted that students and staff need to become “AI literate”), doctoral researchers often slip through the cracks in institutional guidance and policymaking.


AI as a Thought Partner in Higher Education — from er.educause.edu by Brian Basgen

When used thoughtfully and transparently, generative artificial intelligence can augment creativity and challenge assumptions, making it an excellent tool for exploring and developing ideas.

The glaring contrast between the perceived ubiquity of GenAI and its actual use also reveals fundamental challenges associated with the practical application of these tools. This article explores two key questions about GenAI to address common misconceptions and encourage broader adoption and more effective use of these tools in higher education.


AI for Automation or Augmentation of L&D? — from drphilippahardman.substack.com by Dr. Philippa Hardman
An audio summary of my Learning Technologies talk

Like many of you, I spent the first part of this week at Learning Technologies in London, where I was lucky enough to present a session on the current state of AI and L&D.

In this week’s blog post, I summarise what I covered and share an audio summary of my paper for you to check out.


Bridging the AI Trust Gap — from chronicle.com by Ian Wilhelm, Derek Bruff, Gemma Garcia, and Lee Rainie

In a 2024 Chronicle survey, 86 percent of administrators agreed with the statement: “Generative artificial intelligence tools offer an opportunity for higher education to improve how it educates, operates, and conducts research.” In contrast, just 55 percent of faculty agreed, showing the stark divisions between faculty and administrative perspectives on adopting AI.

Among many faculty members, a prevalent distrust of AI persists — and for valid reasons. How will it impact in-class instruction? What does the popularity of generative AI tools portend for the development of critical thinking skills for Gen-Z students? How can institutions, at the administrative level, develop policies to safeguard against students using these technologies as tools for cheating?

Given this increasing ‘trust gap,’ how can faculty and administrators work together to preserve academic integrity as AI seeps into all areas of academia, from research to the classroom?

Join us for “Bridging the AI Trust Gap,” an extended, 75-minute Virtual Forum exploring the trust gap on campus about AI, the contours of the differences, and what should be done about it.

 

4 ways community colleges can boost workforce development — from highereddive.com by Natalie Schwartz
Higher education leaders at this week’s ASU+GSV Summit gave advice for how two-year institutions can boost the economic mobility of their students.

SAN DIEGO — How can community colleges deliver economic mobility to their students?

College leaders at this week’s ASU+GSV Summit, an annual education and technology conference, got a glimpse into that answer as they heard how community colleges are building support from business and industry and strengthening workforce development.

These types of initiatives may be helping to boost public perception of the value of community colleges vs. four-year institutions.

 

What does ‘age appropriate’ AI literacy look like in higher education? — from timeshighereducation.com by Fun Siong Lim
As AI literacy becomes an essential work skill, universities need to move beyond developing these competencies at ‘primary school’ level in their students. Here, Fun Siong Lim reflects on frameworks to support higher-order AI literacies

Like platforms developed at other universities, Project NALA offers a front-end interface (known as the builder) for faculty to create their own learning assistant. An idea we have is to open the builder up to students to allow them to create their own GenAI assistant as part of our AI literacy curriculum. As they design, configure and test their own assistant, they will learn firsthand how generative AI works. They get to test performance-enhancement approaches beyond prompt engineering, such as grounding the learning assistant with curated materials (retrieval-augmented generation) and advanced ideas such as incorporating knowledge graphs.

They should have the opportunity to analyse, evaluate and create responsible AI solutions. Offering students the opportunity to build their own AI assistants could be a way forward to develop these much-needed skills.
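The "grounding with curated materials (retrieval-augmented generation)" idea mentioned above can be sketched in a few lines. This is a minimal illustration only, not Project NALA's implementation: all function names here are hypothetical, and a real system would use embedding-based retrieval and a hosted LLM API rather than the naive word-overlap ranking shown below.

```python
# Minimal sketch of grounding an assistant with curated materials (RAG).
# Hypothetical helper names; real platforms use embeddings + an LLM API.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank curated course materials by naive word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_grounded_prompt(query: str, documents: list[str]) -> str:
    """Prepend retrieved materials so the model answers from them, not from memory."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this course material:\n{context}\n\nQuestion: {query}"

docs = [
    "Photosynthesis converts light energy into chemical energy.",
    "Mitosis is the process of cell division.",
]
print(build_grounded_prompt("How does photosynthesis work", docs))
```

A student building such an assistant would quickly see why retrieval quality matters: swap in an off-topic corpus and the grounded prompt degrades, which is exactly the kind of firsthand lesson the curriculum described above is after.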


How to Use ChatGPT 4o’s Update to Turn Key Insights Into Clear Infographics (Prompts Included) — from evakeiffenheim.substack.com by Eva Keiffenheim
This 3-step workflow helps you break down books, reports, or slide-decks into professional visuals that accelerate understanding.

This article shows you how to find core ideas, prompt GPT-4o for a design brief, and generate clean, professional images that stick. These aren’t vague “creative visuals”—they’re structured for learning, memory, and action.

If you’re a lifelong learner, educator, creator, or just someone who wants to work smarter, this process is for you.

You’ll spend less time re-reading and more time understanding. And maybe—just maybe—you’ll build ideas that not only click in your brain, but also stick in someone else’s.


SchoolAI Secures $25 Million to Help Teachers and Schools Reach Every Student — from globenewswire.com
 The Classroom Experience platform gives every teacher and student their own AI tools for personalized learning

SchoolAI’s Classroom Experience platform combines AI assistants for teachers that help with classroom preparation and other administrative work, and Spaces–personalized AI tutors, games, and lessons that can adapt to each student’s unique learning style and interests. Together, these tools give teachers actionable insights into how students are doing, and how the teacher can deliver targeted support when it matters most.

“Teachers and schools are navigating hard challenges with shrinking budgets, teacher shortages, growing class sizes, and ongoing recovery from pandemic-related learning gaps,” said Caleb Hicks, founder and CEO of SchoolAI. “It’s harder than ever to understand how every student is really doing. Teachers deserve powerful tools to help extend their impact, not add to their workload. This funding helps us double down on connecting the dots for teachers and students, and later this year, bringing school administrators and parents at home onto the platform as well.”


AI in Education, Part 3: Looking Ahead – The Future of AI in Learning — from rdene915.com by Dr. Rachelle Dené Poth

In the first and second parts of my AI series, I focused on where we see AI in classrooms. Benefits range from personalized learning and accessibility tools to AI-driven grading and teaching-assistant support. In Part 2, I chose to focus on some of the important ethical considerations that must be part of the conversation. Schools need to focus on data privacy, bias, overreliance, and the equity divide. For this last part in the current AI series, I wanted to focus on the future. Where do we go from here?


Anthropic Education Report: How University Students Use Claude — from anthropic.com

The key findings from our Education Report are:

  • STEM students are early adopters of AI tools like Claude, with Computer Science students particularly overrepresented (accounting for 36.8% of students’ conversations while comprising only 5.4% of U.S. degrees). In contrast, Business, Health, and Humanities students show lower adoption rates relative to their enrollment numbers.
  • We identified four patterns by which students interact with AI, each of which was present in our data at approximately equal rates (23-29% of conversations): Direct Problem Solving, Direct Output Creation, Collaborative Problem Solving, and Collaborative Output Creation.
  • Students primarily use AI systems for creating (using information to learn something new) and analyzing (taking apart the known and identifying relationships), such as creating coding projects or analyzing law concepts. This aligns with higher-order cognitive functions on Bloom’s Taxonomy. This raises questions about ensuring students don’t offload critical cognitive tasks to AI systems.

From the Kuali Days 2025 Conference: A CEO’s View of Planning for AI — from campustechnology.com by Mary Grush
A Conversation with Joel Dehlin

How can a company serving higher education navigate the changes AI brings to the ed tech marketplace? What will customers expect in this dynamic? Here, CT talks with Kuali CEO Joel Dehlin, who shared his company’s AI strategies in a featured plenary session, “Sneak Peek of AI in Kuali Build,” at Kuali Days 2025 in Anaheim.


How students can use generative AI — from aliciabankhofer.substack.com by Alicia Bankhofer
Part 4 of 4 in my series on Teaching and Learning in the AI Age

This article is the culmination of a series exploring AI’s impact on education.

Part 1: What Educators Need outlined essential AI literacy skills for teachers, emphasizing the need to move beyond basic ChatGPT exploration to understand the full spectrum of AI tools available in education.

Part 2: What Students Need addressed how students require clear guidance to use AI safely, ethically, and responsibly, with emphasis on developing critical thinking skills alongside AI literacy.

Part 3: How Educators Can Use GenAI presented ten practical use cases for teachers, from creating differentiated resources to designing assessments, demonstrating how AI can reclaim 5-7 hours weekly for meaningful student interactions.

Part 4: How Students Can Use GenAI (this article) provides frameworks for guiding student AI use based on Joscha Falck’s dimensions: learning about, with, through, despite, and without AI.


Mapping a Multidimensional Framework for GenAI in Education — from er.educause.edu by Patricia Turner
Prompting careful dialogue through incisive questions can help chart a course through the ongoing storm of artificial intelligence.

The goal of this framework is to help faculty, educational developers, instructional designers, administrators, and others in higher education engage in productive discussions about the use of GenAI in teaching and learning. As others have noted, theoretical frameworks will need to be accompanied by research and teaching practice, each reinforcing and reshaping the others to create understandings that will inform the development of approaches to GenAI that are both ethical and maximally beneficial, while mitigating potential harms to those who engage with it.


Instructional Design Isn’t Dying — It’s Specialising — from drphilippahardman.substack.com by Dr. Philippa Hardman
Aka, how AI is impacting role & purpose of Instructional Design

Together, these developments have revealed something important: despite widespread anxiety, the instructional design role isn’t dying—it’s specialising.

What we’re witnessing isn’t the automation of instructional design and the death of the instructional designer, but rather the evolution of the ID role into multiple distinct professional pathways.

The generalist “full stack” instructional designer is slowly but decisively fracturing into specialised roles that reflect both the capabilities of generative AI and the strategic imperatives facing modern organisations.

In this week’s blog post, I’ll share what I’ve learned about how our field is transforming, and what it likely means for you and your career path.

Those instructional designers who cling to traditional generalist models risk being replaced, but those who embrace specialisation, data fluency, and AI collaboration will excel and lead the next evolution of the field. Similarly, those businesses that continue to view L&D as a cost centre and focus on automating content delivery will be outperformed, while those that invest in building agile, AI-enabled learning ecosystems will drive measurable performance gains and secure their competitive advantage.


Adding AI to Every Step in Your eLearning Design Workflow — from learningguild.com by George Hanshaw

We know that eLearning is a staple of training and development. The expectations of the learners are higher than ever: They expect a dynamic, interactive, and personalized learning experience. As instructional designers, we are tasked with meeting these expectations by creating engaging and effective learning solutions.

The integration of Artificial Intelligence (AI) into our eLearning design process is a game-changer that can significantly enhance the quality and efficiency of our work.

Whether you use ADDIE or rapid prototyping, AI has a fit in every aspect of your workflow. By integrating AI, you can ensure a more efficient and effective design process that adapts to the unique needs of your learners. This not only saves time and resources but also significantly enhances the overall learning experience. We will explore the needs analysis and the general design process.

 



2025 EDUCAUSE Students and Technology Report: Shaping the Future of Higher Education Through Technology, Flexibility, and Well-Being — from library.educause.edu

The student experience in higher education is continually evolving, influenced by technological advancements, shifting student needs and expectations, evolving workforce demands, and broadening sociocultural forces. In this year’s report, we examine six critical aspects of student experiences in higher education, providing insights into how institutions can adapt to meet student needs and enhance their learning experience and preparation for the workforce:

  • Satisfaction with Technology-Related Services and Supports
  • Modality Preferences
  • Hybrid Learning Experiences
  • Generative AI in the Classroom
  • Workforce Preparation
  • Accessibility and Mental Health

DSC: Shame on higher ed for not preparing students for the workplace (see below). You’re doing your students wrong…again. Not only do you continue to heap a load of debt on their backs, but you’re also continuing to not get them ready for the workplace. So don’t be surprised if eventually you’re replaced by a variety of alternatives that students will flock towards.

 

DSC: And students don’t have a clue as to what awaits them in the workplace — they see AI-powered tools and technologies at an incredibly low score of only 3%. Yeh, right. You’ll find out. Here’s but one example from one discipline/field of work –> Thomson Reuters Survey: Over 95% of Legal Professionals Expect Gen AI to Become Central to Workflow Within Five Years


Figure 15. Competency Areas Expected to Be Important for Career

 

From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?



 

Do I Need a Degree in Instructional Design? It Depends. — from teamedforlearning.com

It’s a common question for those considering a career in instructional design: Do I need a degree to land a job? The answer? It depends.

Hiring managers aren’t just looking for a degree—they want proof that you have the knowledge, skills, and abilities to succeed. In fact, most employers focus on 3 key factors when assessing candidates. You typically need at least 2 of these to be considered:

  1. A Credential – A degree or certification in instructional design, learning experience design, or a related field.
  2. Relevant Work Experience – Hands-on experience designing and developing learning solutions.
  3. Proof of Abilities – A strong portfolio showcasing eLearning modules, course designs, or learning strategies.

The good news? You don’t have to spend years earning a degree to break into the field. If you’re resourceful, you can fast-track your way in through volunteer projects, contract work, and portfolio building.

Whether you’re a recent graduate, a career changer, or a working professional looking for your next opportunity, focusing on these key factors can help you stand out and get hired.

 

Reflections on “Are You Ready for the AI University? Everything is about to change.” [Latham]

Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham

Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.

What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.

Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance; then, guided by human professors sorting through the material, it could come to understand the structure of the discipline and go on to develop lectures, videos, supporting documentation, and assessments.

In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If you need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student going on a trip who wishes to take an exam on the plane will be able to log on and complete the AI-designed and administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.

Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom. 

From DSC:
I had a very difficult time deciding which excerpts to include; this solid article offered so many more worth thinking about. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.

Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him, and as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of higher education. Scott asserts that some significant impacts are coming for faculty members, doctoral students, and graduate/teaching assistants (and Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.

There are a few places where Scott and I differ.

  • The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned their 5-year, $100M ed push last year and one of the key conclusions was that people want to learn from — and with — other people:

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY

By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they would be able to do.

  • Also, I would use the term learning preferences where Scott uses the term learning styles.

Scott also mentions:

“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”

It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).

That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.

So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!


Addendum later on 4/10/25:

U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain

Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.

The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.

The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.

 

The 2025 AI Index Report — from Stanford University’s Human-Centered Artificial Intelligence Lab (hai.stanford.edu); item via The Neuron

Top Takeaways

  1. AI performance on demanding benchmarks continues to improve.
  2. AI is increasingly embedded in everyday life.
  3. Business is all in on AI, fueling record investment and usage, as research continues to show strong productivity impacts.
  4. The U.S. still leads in producing top AI models—but China is closing the performance gap.
  5. The responsible AI ecosystem evolves—unevenly.
  6. Global AI optimism is rising—but deep regional divides remain.
  7. …and several more

Also see:

The Neuron’s take on this:

So, what should you do? You really need to start trying out these AI tools. They’re getting cheaper and better, and they can genuinely help save time or make work easier—ignoring them is like ignoring smartphones ten years ago.

Just keep two big things in mind:

  1. Making the next super-smart AI costs a crazy amount of money and uses tons of power (seriously, they’re buying nuclear plants and pushing coal again!).
  2. Companies are still figuring out how to make AI perfectly safe and fair—cause it still makes mistakes.

So, use the tools, find what helps you, but don’t trust them completely.

We’re building this plane mid-flight, and Stanford’s report card is just another confirmation that we desperately need better safety checks before we hit major turbulence.


Addendum on 4/16:

 

From DSC:
I value our constitutional democracy and I want to help preserve it. If you are an American, I encourage you to do the same. I’m not interested in living under an authoritarian government. The founders of this great nation developed an important document that integrated a system of checks and balances between the legislative, judicial, and executive branches of government. The rule of law was important then, and it should still be important now. That’s why I’m posting the following two items.


Several Hundred Law Professors File Amicus Brief Defending Biglaw Firms Against Trump’s Executive Order Attacks — from jdjournal.com by Maria Lenin Laus

In an unprecedented show of solidarity, over 300 law professors from leading American law schools have filed an amicus brief condemning former President Donald Trump’s executive orders targeting major law firms. The professors argue that the orders—issued in retaliation for the firms’ clients, diversity initiatives, and legal work opposing Trump policies—represent a dangerous abuse of executive power and a direct violation of constitutional protections.

The amicus brief, filed in support of the law firms’ challenge, was signed by professors from nearly every top-tier U.S. law school, including Harvard, Yale, Stanford, Columbia, NYU, and the University of Chicago. The professors argue that Trump’s orders:

  • Violate the First Amendment by penalizing firms for the viewpoints they express through advocacy and representation;
  • Undermine the rule of law by discouraging legal professionals from taking on controversial or unpopular clients;
  • Set a dangerous precedent for political retaliation against attorneys and the institutions of justice.

Law school deans around the country react to Trump’s undercutting the legal foundations/principles of our nation — from linkedin.com by Georgetown University Law Center

“We write to reaffirm basic principles: The government should not punish lawyers and law firms for the clients they represent, absent specific findings that such representation was illegal or unethical. Punishing lawyers for their representation and advocacy violates the First Amendment and undermines the Sixth Amendment.

We thus speak as legal educators, responsible for training the next generation of lawyers, in condemning any government efforts to punish lawyers or their firms based on the identity of their clients or for their zealous lawful and ethical advocacy.”


For related postings, also see:

President’s Third Term Talk Defies Constitution and Tests Democracy — from nytimes.com by Peter Baker (DSC: This is a free/gifted article for you.)
The 22nd Amendment is clear: President Trump has to give up his office after his second term. But his refusal to accept that underscores how far he is willing to consider going to consolidate power.

“This is in my mind a culmination of what he has already started, which is a methodical effort to destabilize and undermine our democracy so that he can assume much greater power,” Representative Daniel Goldman, Democrat of New York and lead counsel during Mr. Trump’s first impeachment, said in an interview.

“A lot of people are not talking about it because it’s not the most pressing issue of that particular day,” he said on Friday as stock markets were plunging in reaction to Mr. Trump’s newly declared trade war. But an attack on democracy, he added, “is actually in motion and people need to recognize that it is not hypothetical or speculative anymore.”

Mr. Trump’s autocratic tendencies and disregard for constitutional norms are well documented. In this second term alone, he has already sought to overrule birthright citizenship embedded in the 14th Amendment, effectively co-opted the power of Congress to determine what money will be spent or agencies closed, purged the uniformed leadership of the armed forces to enforce greater personal loyalty and punished dissent in academia, the news media, the legal profession and the federal bureaucracy.

BigLaw gives up on its future — from jordanfurlong.substack.com by Jordan Furlong
By putting business ahead of the rule of law when faced with assaults on their independence, many large US law firms have tarnished their reputations. Tomorrow’s lawyers could make them pay the price.

Two young Skadden associates, Rachel Cohen and Brenna Trout Frey, resigned from the firm, the former before its deal with Trump and the latter afterwards. “If my employer cannot stand up for the rule of law,” wrote Ms. Frey, “then I cannot ethically continue to work for them.” There might’ve been other public resignations I haven’t seen, but I’m confident there have been private ones from both firms, as well as intense efforts by other associates to find positions elsewhere.

This is the risk these firms are taking: It matters to young lawyers when their law firms fail to defend the rule of law. And it matters especially to young lawyers who are women and members of visible minorities when law firms jettison their vaunted diversity, equity, and inclusion programs under pressure from the government.

 

Outsourcing Thought: The Hidden Cost of Letting AI Think for You — from linkedin.com by Robert Atkinson

I’ve watched it unfold in real time. A student submits a flawless coding assignment or a beautifully written essay—clean syntax, sharp logic, polished prose. But when I ask them to explain their thinking, they hesitate. They can’t trace their reasoning or walk me through the process. The output is strong, but the understanding is shallow. As a professor, I’ve seen this pattern grow more common: AI-assisted work that looks impressive on the surface but reveals a troubling absence of cognitive depth underneath.

This article is written with my students in mind—but it’s meant for anyone navigating learning, teaching, or thinking in the age of artificial intelligence. Whether you’re a student, educator, or professional, the question is the same: What happens to the brain when we stop doing our own thinking?

We are standing at a pivotal moment. With just a few prompts, generative AI can produce essays, solve complex coding problems, and summarize ideas in seconds. It feels efficient. It feels like progress. But from a cognitive neuroscience perspective, that convenience comes at a hidden cost: the gradual erosion of the neural processes that support reasoning, creativity, and long-term learning.

 

What trauma-informed practice is not — from timeshighereducation.com by Kate Cantrell, India Bryce, and Jessica Gildersleeve from The University of Southern Queensland
Before trauma-informed care can be the norm across all areas of the university, academic and professional staff need to understand what it is. Here, three academics debunk myths and demystify best practice

Recently, we conducted focus groups at our university to better ascertain how academics, administrators and student support staff perceive the purpose and value of trauma-informed practice, and how they perceive their capacity to contribute to organisational change.

We discovered that while most staff were united on the importance of trauma-informed care, several myths persist about what trauma-informed practice is (and is not). Some academic staff, for example, conflated teaching about trauma with trauma-informed teaching, confused trigger warnings with trigger points and, perhaps most alarmingly – given the prevalence of trauma exposure and risk among university students – misjudged trauma-informed practice as “the business of psychologists” rather than educators.

 




 

8 Weeks Left to Prepare Students for the AI-Enhanced Workplace — from insidehighered.com by Ray Schroeder
We are down to the final weeks to fully prepare students for entry into the AI-enhanced workplace. Are your students ready?

The urgent task facing those of us who teach and advise students, whether they be degree program or certificate seeking, is to ensure that they are prepared to enter (or re-enter) the workplace with skills and knowledge that are relevant to 2025 and beyond. One of the first skills to cultivate is an understanding of what kinds of services this emerging technology can provide to enhance the worker’s productivity and value to the institution or corporation.

Given that short period of time, and the need to cover the material already scheduled in the syllabus, I recommend that we consider weaving AI use into authentic assignments and assessments, supplementary modules, and other resources that prepare students for AI.


Learning Design in the Era of Agentic AI — from drphilippahardman.substack.com by Dr Philippa Hardman
Aka, how to design online async learning experiences that learners can’t afford to delegate to AI agents

The point I put forward was that the problem is not AI’s ability to complete online async courses, but that online async courses deliver so little value to our learners that they delegate their completion to AI.

The harsh reality is that this is not an AI problem — it is a learning design problem.

However, this realisation presents us with an opportunity that, on the whole, we seem keen to embrace. Rather than seeking out ways to block AI agents, we seem largely to agree that we should use this as a moment to reimagine online async learning itself.



8 Schools Innovating With Google AI — Here’s What They’re Doing — from forbes.com by Dan Fitzpatrick

While fears of AI replacing educators swirl in the public consciousness, a cohort of pioneering institutions is demonstrating a far more nuanced reality. These eight universities and schools aren’t just experimenting with AI; they’re fundamentally reshaping their educational ecosystems. From personalized learning in K-12 to advanced research in higher education, these institutions are leveraging Google’s AI to empower students, enhance teaching, and streamline operations.


Essential AI tools for better work — from wondertools.substack.com by Jeremy Caplan
My favorite tactics for making the most of AI — a podcast conversation

AI tools I consistently rely on (areas covered below)

  • Research and analysis
  • Communication efficiency
  • Multimedia creation

AI tactics that work surprisingly well

1. Reverse interviews
Instead of just querying the AI, have it interview you. “Give it a little context and what you’re focusing on and what you’re interested in, and then you ask it to interview you to elicit your own insights.”

This approach helps extract knowledge from yourself, not just from the AI. Sometimes we need that guide to pull ideas out of ourselves.
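The reverse-interview tactic boils down to a prompt template. A minimal sketch in Python, purely illustrative — the wording, function name, and example topic are my assumptions, not from the podcast conversation:

```python
# A hedged sketch of the "reverse interview" tactic: instead of querying
# the AI, compose a prompt that asks it to interview you.

def build_reverse_interview_prompt(context: str, focus: str) -> str:
    """Compose a prompt asking the AI to interview the user,
    eliciting the user's own insights rather than the AI's."""
    return (
        "You are an interviewer. Here is my context: " + context + ". "
        "I am currently focusing on: " + focus + ". "
        "Ask me one question at a time to elicit my own insights; "
        "wait for my answer before asking the next question."
    )

# Hypothetical usage: paste the result into any chat assistant.
prompt = build_reverse_interview_prompt(
    context="I write a weekly newsletter about AI in education",
    focus="how teachers can save time with AI agents",
)
print(prompt)
```

The key design choice is in the last two instructions: asking for one question at a time keeps the AI in the interviewer role instead of letting it slip back into answering.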

 
 

Introducing NextGenAI: A consortium to advance research and education with AI — from openai.com; via Claire Zau
OpenAI commits $50M in funding and tools to leading institutions.

Today, we’re launching NextGenAI, a first-of-its-kind consortium with 15 leading research institutions dedicated to using AI to accelerate research breakthroughs and transform education.

AI has the power to drive progress in research and education—but only when people have the right tools to harness it. That’s why OpenAI is committing $50M in research grants, compute funding, and API access to support students, educators, and researchers advancing the frontiers of knowledge.

Uniting institutions across the U.S. and abroad, NextGenAI aims to catalyze progress at a rate faster than any one institution would alone. This initiative is built not only to fuel the next generation of discoveries, but also to prepare the next generation to shape AI’s future.


‘I want him to be prepared’: why parents are teaching their gen Alpha kids to use AI — from theguardian.com by Aaron Mok; via Claire Zau
As AI grows increasingly prevalent, some are showing their children tools from ChatGPT to Dall-E to learn and bond

“My goal isn’t to make him a generative AI wizard,” White said. “It’s to give him a foundation for using AI to be creative, build, explore perspectives and enrich his learning.”

White is part of a growing number of parents teaching their young children how to use AI chatbots so they are prepared to deploy the tools responsibly as personal assistants for school, work and daily life when they’re older.

 
© 2025 | Daniel Christian