From DSC:
Postings and articles like the one below make me ask: are we not shooting ourselves in the foot with AI and recent college graduates? If the bottom rungs continue to disappear, internships and apprenticeships can only go so far. There aren’t enough of them — especially valuable ones. So as this article points out, there will be threats to the long-term health of our talent pipelines unless we can take steps to thwart those impacts — and to do so fairly soon.

To me…vocational training and jobs are looking better all the time — i.e., plumbers, carpenters, electricians, mechanics, and more.


Can New Graduates Compete With AI? — from builtin.com by Richard Johnson
The increasing adoption of AI automation is compressing early-career jobs. How should new graduates get a foothold in the economy now?

Summary: AI is hollowing out entry-level roles by automating routine tasks, eliminating a rung on the career ladder. New graduates face intense competition and a rising skill floor. While firms gain short-term productivity, they risk a long-term talent shortage by eliminating junior training grounds.

Conversations about AI have covered all the ground: hype, fear, and slop. But while some roll their eyes at yet another automation headline, soon-to-be graduates are watching the labor market with a very different level of urgency. They’re entering a world where the old paradox of needing experience to get experience is colliding with a new reality: AI is absorbing the standardized, routine tasks that once defined entry-level work. The result isn’t just a shift in job descriptions or skill requirements, but rather a structural reshaping of the career pipeline.

Entry-level workers face an outsized disruption to their long-term career trajectories. They have the least buffer to adapt, given their lack of job-market experience and the heightened financial pressure to secure work quickly as student-debt repayment periods loom.

Momentum early in one’s career matters, and the first job on a resume shapes future compensation bands and opportunities. It also serves as a signal of perceived specialization or, at minimum, interest. Losing that foothold has compounding effects on one’s career trajectory.


Also relevant/see:

New Anthropic Institute to Study Risks and Economic Effects of Advanced AI — from campustechnology.com by John K. Waters

Key Takeaways

  • Anthropic has launched the Anthropic Institute, a new research effort focused on the biggest societal challenges posed by more powerful AI systems.
  • The institute will study how advanced AI could affect the economy, the legal system, public safety, and broader social outcomes.
  • Anthropic co-founder Jack Clark will lead the institute in a new role as the company’s head of public benefit.
  • The new unit brings together Anthropic’s existing red-teaming, societal impacts, and economic research work, while adding new hires and new research areas.
 

Across the divide: reimagining faculty-staff collaboration in higher education — from timeshighereducation.com by Saskia van de Gevel
Academic units do best when they harness different viewpoints – from field scientists and curriculum designers to extension professionals – to drive innovation and relevance. Saskia van de Gevel offers proactive advice

Universities are not sustained by individual leaders or isolated units. They are sustained by teams of people who bring different kinds of expertise to a shared mission. When faculty and professional staff collaborate as genuine partners – aligned around outcomes, clear about roles and committed to mutual respect – institutions become more resilient, innovative and effective.

Also from timeshighereducation.com, see:

Again, we don’t send them 200 CVs. We might send 20, but they’re meticulously shortlisted. The employer saves time, the student feels they are being taken seriously and trust builds quickly on both sides.

And because we work closely with employers, we learn something universities often struggle to find out early enough: what the market is asking for now.

What academics need to know: we can’t do this without you
If I could say one thing to academic colleagues anywhere, it’s that employability can’t sit next to the curriculum. It has to live with it.

 

The Future of College in an AI World — from linkedin.com by Jeff Selingo
In today’s issue: The tension over AI in higher ed; application inflation continues and testing is back; what’s the future of the original classroom technology, the learning management system. 


Hundreds of higher ed and industry leaders gathered Tuesday for a summit on AI and the future of learning at the University of Michigan.

Conversations like the one we had at Michigan this week are necessary, but the action rarely matches the ambition.

  • We say the humanities are the operating system of an AI world, yet students and parents don’t believe it. They’re voting with their feet toward STEM, business, and narrowly tailored majors they believe will lead to a job.
  • Meanwhile, colleges are quietly eliminating the very humanities degrees the panelists were championing, employers are cutting the entry rungs off the career ladder for new graduates, and as Podium Education co-founder Christopher Parrish reminded us yesterday, there’s a yawning gap between demand for experience and the internships that actually exist.


AI Music Generators: Teaching With These Catchy AI Tools — from techlearning.com by Erik Ofgang
AI music generators are getting better and better, and there are more applications in the classroom as a result.

Are All AI Music Generators More Or Less The Same?
No. After experimenting with a few free ones, I found a wide range of quality with the same prompts.

Gemini is the only one I’d currently recommend. It’s user-friendly but limited and only creates 30-second clips. Other music generators could potentially outperform Gemini with prompt adjustments. The ones I tried did better with the instrumentals but struggled more with the lyrics, and that kind of defeated the purpose of the tool for me.


ChatDOC: Teaching With The AI Summarizing Tool — from techlearning.com by Erik Ofgang
ChatDOC lets users turn any PDF into an AI chatbot that can summarize the text, answer questions, and generate quizzes.

What Is ChatDOC?
ChatDOC is an AI tool designed to help users interact with PDFs of various types, be it research papers, short stories, or chapters from larger works. Users upload a PDF and then have the opportunity to “chat” with that document — that is, converse with a chatbot that bases its answers on the uploaded text.

ChatDOC can perform tasks such as providing a short summary, searching for specific terms, explaining the overall theme of a work of literature, or unpacking the science in a research paper.

Other similar tools are out there, but ChatDOC is definitely one of the better PDF readers I’ve used. Its free version is quick and easy to use, and it delivers on its promise of providing an AI that can discuss a given document with users and even quiz them on it.
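Under the hood, tools in this category typically follow a retrieval-augmented generation pattern: split the PDF’s text into chunks, find the chunks most relevant to the user’s question, and hand only those to a language model as grounding context. Here is a minimal sketch of that retrieval step — illustrative only, not ChatDOC’s actual implementation, and using crude word-overlap scoring in place of real embedding similarity:

```python
def chunk_text(text, size=40):
    """Split extracted PDF text into overlapping chunks of `size` words."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size // 2)]

def score(chunk, question):
    """Crude relevance score: how many question words appear in the chunk.
    (A real tool would use embedding similarity instead.)"""
    q_words = set(question.lower().split())
    return sum(1 for w in chunk.lower().split() if w in q_words)

def build_prompt(document_text, question, top_k=2):
    """Pick the most relevant chunks and assemble a grounded prompt for an LLM."""
    chunks = chunk_text(document_text)
    best = sorted(chunks, key=lambda c: score(c, question), reverse=True)[:top_k]
    context = "\n---\n".join(best)
    return f"Answer using ONLY this excerpt:\n{context}\n\nQuestion: {question}"
```

Because the model only sees the retrieved excerpt, its answers stay anchored to the uploaded document rather than its general training data — which is what makes the “quiz me on this PDF” use case trustworthy.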


From AI access to workforce readiness — from chieflearningofficer.com by Johnny Hamilton, Amy Stratbucker, & Brad Bigelow
Is your workforce using the right tool with an outdated mindset and playbook? Why old playbooks fall short — and what learning leaders must do next.

The leadership opportunity
Organizations do not need to predict every future AI capability. They need systems that allow people to explore with curiosity, practice safely, reflect deeply and adapt continuously — starting with what they already have and extending as capabilities evolve.

For CLOs, this is a moment to lead from the center of change — designing workforce readiness that keeps pace with accelerating technology while making work more rewarding for employees and more valuable for the organization. That is how AI moves from the promise of transformation to demonstrated readiness and, ultimately, from promise to performance.


Addendums on 3/19/26:
How to Build Practice-Based Learning Activities with AI — from drphilippahardman.substack.com by Dr Philippa Hardman
Four evidence-based methods for designing, building & deploying active learning activities with your favourite LLM

Most L&D teams are using AI to make content faster. The real opportunity is using it as a practice engine.

The Synthesia 2026 AI in L&D Report found that the fastest-growing areas of planned AI adoption aren’t in content creation — they’re in assessments and simulations (36%), adaptive pathways (33%), and AI tutors (29%). In other words: L&D teams are starting to realise that the most powerful use of AI isn’t producing learning materials. It’s creating environments where learners actually practise.

And you can build these right now — no dev team, no custom platform, no code. Each method below includes a prompt you can paste into your preferred AI tool to generate a working interactive prototype: a self-contained practice activity with a briefing screen, a live AI interaction, and a debrief — all running in the browser, ready to share with stakeholders or deploy to learners.

OpenAI Adds Interactive Math and Science Learning Tools to ChatGPT — from campustechnology.com by Rhea Kelly

Key Takeaways

  • ChatGPT adds interactive learning tools: OpenAI introduced interactive math and science visualizations that allow users to explore formulas, variables, and relationships in real time.
  • The tool currently covers over 70 core math and science topics and is aimed initially at high school and college-level learners.
  • Users can adjust variables, manipulate formulas, and immediately see how changes affect graphs and outcomes.
 

2026 Survey of College and University Presidents — from insidehighered.com, Liaison, & Jenzabar
Download and explore exclusive insights from the 2026 Survey of College and University Presidents to see how these campus leaders are responding to financial volatility, political interference, rapid advances in AI, and where they believe the biggest risks and opportunities lie as they look toward 2030.

In this year’s survey, presidents share perspectives on:

  • How presidents assess the second Trump administration’s impact on higher education
  • Which emerging or evolving educational models they plan to add or expand in the coming years
  • How effective they believe higher education has been in shaping national conversations about AI
  • The issues presidents expect will have the greatest impact on higher education by 2030

 

 

Teach Smarter with AI — from wondertools.substack.com by Jeremy Caplan and Lance Eaton
10 tested strategies from two educators who actually use them

I recently talked with Lance Eaton, Senior Associate Director of AI and Teaching & Learning at Northeastern University and writer of AI + Education = Simplified. We traded ideas about what’s actually working. We came up with 10 specific, practical ways anyone who teaches, coaches, or leads can put AI to work.

Watch the full conversation above, or read highlights below.


Beyond Audio Summaries: How to Use NotebookLM to *Actually* Design Better Learning — from drphilippahardman.substack.com by Dr. Philippa Hardman
Five methods to maximise the value of NotebookLM’s features

In practice, what makes NotebookLM different for learning designers is four things:

  • Answers grounded in your sources (with citations):
  • Source toggling:
  • Multi-format studio & multi-source summaries:
  • Persistent workspace:


5 Evidence-Based Methods NotebookLM Operationalises…


Shadow AI Isn’t a Threat: It’s a Signal — from campustechnology.com by Damien Eversmann
Unofficial AI use on campus reveals more about institutional gaps than misbehavior.

Key Takeaways

  • Shadow AI is widespread in higher education: Faculty, researchers, students, and staff are using AI tools outside official IT channels, including consumer platforms and public cloud services that may involve sensitive data.
  • Unauthorized AI use creates data, compliance, and cost risks: Consumer AI tools may store or reuse user data, while uncoordinated adoption drives redundant licenses, unpredictable cloud costs, and weaker security oversight.
  • Institutions are shifting from restriction to enablement: Some campuses are making approved paths easier by offering ready-to-use research environments, campus-managed AI tools, clear guidance on data and vendors, and streamlined approval processes.

How L&D Can Lead in the Age of AI Even If Your Company’s Not Ready — from learningguild.com

How to lead even when your company doesn’t allow AI
Even if your corporation isn’t ready for AI, you can still research tools personally to stay ahead of the curve, so when organizational restrictions lift, you are ready to use AI for learning right away. Here are some tools you can test at home if they’re restricted in your workplace:

  • Content generation – Start testing text-based tools to get a taste of how AI can accelerate content creation. Then take it to the next level by exploring tools that generate voices, music, and sound effects.
  • AI coaching tools – Have AI pose as a co-worker or customer to get a taste of what it’s like to use it as a conversation coach. Next, use the voice and video capabilities in an app like ChatGPT to explore how AI can coach someone through tasks.
  • In-the-flow learning assistants – Test turning documents into a conversational avatar and interacting with it to see how it feels. Then think about how the technology could potentially transform static content into dynamic learning experiences for employees.
  • Vibe-coded simulations – Experiment with this technology by creating a simple, fun game. Afterwards, brainstorm some ideas on how it could quickly create simulations for your learners in the future.

The Higher Ed Playbook for AI Affordability — from campustechnology.com by Jason Dunn-Potter

Key Takeaways

  • Affordable AI adoption focuses on evolving existing systems: Universities are embedding AI into current devices, workflows, and legacy systems rather than rebuilding infrastructure or investing in new data centers.
  • Edge AI reduces costs and improves access: Running AI models on local devices or networks lowers cloud processing costs, enhances security, and supports learning use cases such as tutoring, translation, transcription, and adaptive learning.
  • Enterprise integration and governance drive impact: Institutions are applying AI across admissions, advising, facilities, and research workflows, supported by shared resource hubs, data governance, AI literacy, and outcome-driven implementation.
 

The Campus AI Crisis — by Jeffrey Selingo; via Ryan Craig
Young graduates can’t find jobs. Colleges know they have to do something. But what?

Only now are colleges realizing that the implications of AI are much greater and are already outrunning their institutional ability to respond. As schools struggle to update their curricula and classroom policies, they also confront a deeper problem: the suddenly enormous gap between what they say a degree is for and what the labor market now demands. In that mismatch, students are left to absorb the risk. Alina McMahon and millions of other Gen-Zers like her are caught in a muddled in-between moment: colleges only just beginning to think about how to adapt and redefine their mission in the post-AI world, and a job market that’s changing much, much faster.

“Colleges and universities face an existential issue before them,” said Ryan Craig, author of Apprentice Nation and managing director of a firm that invests in new educational models. “They need to figure out how to integrate relevant, in-field, and hopefully paid work experience for every student, and hopefully multiple experiences before they graduate.”

 

Jim VandeHei’s note to his kids: Blunt AI talk — from axios.com by CEO Jim VandeHei
Axios CEO Jim VandeHei wrote this note to his wife, Autumn, and their three kids. She suggested sharing it more broadly since so many families are wrestling with how to think and talk about AI. So here it is …

Dear Family:
I want to put to words what I’m hearing, seeing, thinking and writing about AI.

  • Simply put, I’m now certain it will upend your work and life in ways more profound than the internet or possibly electricity. This will hit in months, not years.
  • The changes will be fast, wide, radical, disorienting and scary. No one will avoid its reach.

I’m not trying to frighten you. And I know your opinions range from wonderment to worry. That’s natural and OK. Our species isn’t wired for change of this speed or scale.

  • My conversations with the CEOs and builders of these LLMs, as well as my own deep experimentation with AI, have shaken and stirred me in ways I never imagined.

All of you must figure out how to master AI for any specific job or internship you hold or take. You’d be jeopardizing your future careers by not figuring out how to use AI to amplify and improve your work. You’d be wise to replace social media scrolling with LLM testing.

Be the very best at using AI for your gig.

more here.


Also see:


Also relevant/see:

 

The Essential Retrieval Practice Handbook — from edutopia.org
Retrieval practice is one of the most effective ways to strengthen learning. Here’s a collection of our best resources to use in your classroom today.
January 29, 2026


Also see:

What is retrieval practice? — from retrievalpractice.org

When we think about learning, we typically focus on getting information into students’ heads. What if, instead, we focus on getting information out of students’ heads?


 

The Learning and Employment Records (LER) Report for 2026: Building the infrastructure between learning and work — from smartresume.com; with thanks to Paul Fain for this resource

Executive Summary (excerpt)

This report documents a clear transition now underway: LERs are moving from small experiments to systems people and organizations expect to rely on. Adoption remains early and uneven, but the forces reshaping the ecosystem are no longer speculative. Federal policy signals, state planning cycles, standards maturation, and employer behavior are aligning in ways that suggest 2026 will mark a shift from exploration to execution.

Across interviews with federal leaders, state CIOs, standards bodies, and ecosystem builders, a consistent theme emerged: the traditional model—where institutions control learning and employment records—no longer fits how people move through education and work. In its place, a new model is being actively designed—one in which individuals hold portable, verifiable records that systems can trust without centralizing control.

Most states are not yet operating this way. But planning timelines, RFP language, and federal signals indicate that many will begin building toward this model in early 2026.

As the ecosystem matures, another insight becomes unavoidable: records alone are not enough. Value emerges only when trusted records can be interpreted through shared skill languages, reused across contexts, and embedded into the systems and marketplaces where decisions are made.

Learning and Employment Records are not a product category. They are a data layer—one that reshapes how learning, work, and opportunity connect over time.

This report is written for anyone seeking to understand how LERs are beginning to move from concept to practice. Whether readers are new to the space or actively exploring implementation, the report focuses on observable signals, emerging patterns, and the practical conditions required to move from experimentation toward durable infrastructure.
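To make “portable, verifiable records” concrete: most LER efforts build on open standards such as the W3C Verifiable Credentials data model and 1EdTech’s Open Badges. A minimal sketch of what one machine-readable credential might look like follows — the issuer, learner, and skill values here are invented for illustration, and a real credential would also carry a cryptographic “proof” block (a digital signature), which is omitted:

```python
import json

# A hypothetical learner record shaped loosely after the W3C Verifiable
# Credentials data model. All identifiers and names below are invented.
credential = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential", "OpenBadgeCredential"],
    "issuer": {"id": "did:example:state-university", "name": "Example State University"},
    "validFrom": "2026-01-15T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:learner-42",
        "achievement": {
            "name": "Data Analysis Fundamentals",
            "alignment": [{"targetName": "Statistical reasoning"}],
        },
    },
    # A real credential would include a "proof" section here, so any verifier
    # can check authenticity without contacting the issuing institution.
}

print(json.dumps(credential, indent=2))
```

The point of the report’s “data layer” framing is visible in this shape: any system (an applicant tracking system, a state workforce portal, another institution) can parse and verify the same record without a custom integration with the issuer.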

 

“The building blocks for a global, interoperable skills ecosystem are already in place. As education and workforce alignment accelerates, the path toward trusted, machine-readable credentials is clear. The next phase depends on credentials that carry value across institutions, industries, states, and borders; credentials that move with learners wherever their education and careers take them. The question now isn’t whether to act, but how quickly we move.”

– Curtiss Barnes, Chief Executive Officer, 1EdTech

 


The above item was from Paul Fain’s recent posting, which includes the following excerpt:

SmartResume just published a guide for making sense of this rapidly expanding landscape. The LER Ecosystem Report was produced in partnership with AACRAO, Credential Engine, 1EdTech, HR Open Standards, and the U.S. Chamber of Commerce Foundation. It was based on interviews and feedback gathered over three years from 100+ leaders across education, workforce, government, standards bodies, and tech providers.

The tools are available now to create the sort of interoperable ecosystem that can make talent marketplaces a reality, the report argues. Meanwhile, federal policy moves and bipartisan attention to LERs are accelerating action at the state level.

“For state leaders, this creates a practical inflection point,” says the report. “LERs are shifting from an innovation discussion to an infrastructure planning conversation.”

 

AI and the Work of Centers for Teaching and Learning — from derekbruff.org by Derek Bruff

  • Penelope Adams Moon suggested that instead [of] framing a workshop around “How can we integrate AI into the work of teaching?” we should ask “Given what we know about learning, how might AI be useful?” I love that reframing, and I think it connects to the students’ requests for more AI knowhow. Students have a lot of options for learning: working with their instructor, collaborating with peers, surfing YouTube for explainer videos, university-provided social annotation platforms, and, yes, using AI as a kind of tutor. I think our job (collectively) isn’t just to teach students how to use AI (as they’re requesting) but also to help them figure out when and how AI is helpful for their learning. That’s highly dependent on the student and the learning task! I wrote about this kind of metacognition on my blog.

In the same way, when I approach any kind of educational technology, I’m looking for tools that can be responsive to my pedagogical aims. The pedagogy should drive the technology use, not the other way around.

 

AI Is Quietly Rewiring the ADDIE Model (In a Good Way) — from drphilippahardman.substack.com by Dr. Philippa Hardman
The traditional ADDIE workflow isn’t dead, but it is evolving

The real story isn’t what AI can produce — it’s how it changes the decisions we make at every stage of instructional design.

After working with thousands of instructional designers on my bootcamp, I’ve learned something counterintuitive: the best teams aren’t the ones with the fanciest AI tools — they’re the ones who know when to use which mode—and when to use none at all.

Once you recognise that, you start to see instructional design differently — not as a linear process, but as a series of decision loops where AI plays distinct roles.

In this post, I show you the 3 modes of AI that actually matter in instructional design — and map them across every phase of ADDIE so you know exactly when to let AI run, and when to slow down and think.


Also see:

Generative AI for Course Design: Writing Effective Prompts for Multiple Choice Question Development — from onlineteaching.umich.edu by Hedieh Najafi

In higher education, developing strong multiple-choice questions can be a time-intensive part of the course design process. Developing such items requires subject-matter expertise and assessment literacy, and for faculty and designers who are creating and producing online courses, it can be difficult to find the capacity to craft quality multiple-choice questions.

At the University of Michigan Center for Academic Innovation, learning experience designers are using generative artificial intelligence to streamline the multiple-choice question development process and help ameliorate this issue. In this article, I summarize one of our projects that explored effective prompting strategies to develop multiple-choice questions with ChatGPT for our open course portfolio. We examined how structured prompting can improve the quality of AI-generated assessments, producing relevant comprehension and recall items and options that include plausible distractors.

Achieving this goal enables us to develop several ungraded practice opportunities, preparing learners for their graded assessments while also freeing up more time for course instructors and designers.

 
 

How Your Learners *Actually* Learn with AI — from drphilippahardman.substack.com by Dr. Philippa Hardman
What 37.5 million AI chats show us about how learners use AI at the end of 2025 — and what this means for how we design & deliver learning experiences in 2026

Last week, Microsoft released a similar analysis of a whopping 37.5 million Copilot conversations. These conversations took place on the platform from January to September 2025, providing a window into whether and how AI use in general (and among learners specifically) has evolved in 2025.

Microsoft’s mass behavioural data gives us a detailed, global glimpse into what learners are actually doing across devices, times of day, and contexts. The picture that emerges is pretty clear and largely consistent with what OpenAI told us back in the summer:

AI isn’t functioning primarily as an “answers machine”: the majority of us use AI as a tool to personalise and differentiate generic learning experiences and – ultimately – to augment human learning.

Let’s dive in!

Learners don’t “decide” to use AI anymore. They assume it’s there, like search, like spellcheck, like calculators. The question has shifted from “should I use this?” to “how do I use this effectively?”


8 AI Agents Every HR Leader Needs To Know In 2026 — from forbes.com by Bernard Marr

So where do you start? There are many agentic tools and platforms for AI tasks on the market, and the most effective approach is to focus on practical, high-impact workflows. So here, I’ll look at some of the most compelling use cases, as well as provide an overview of the tools that can help you quickly deliver tangible wins.

Some of the strongest opportunities in HR include:

  • Workforce management, administering job satisfaction surveys, monitoring and tracking performance targets, scheduling interventions, and managing staff benefits, medical leave, and holiday entitlement.
  • Recruitment screening, automatically generating and posting job descriptions, filtering candidates, ranking applicants against defined criteria, identifying the strongest matches, and scheduling interviews.
  • Employee onboarding, issuing new hires with contracts and paperwork, guiding them to onboarding and training resources, tracking compliance and completion rates, answering routine enquiries, and escalating complex cases to human HR specialists.
  • Training and development, identifying skills gaps, providing self-service access to upskilling and reskilling opportunities, creating personalized learning pathways aligned with roles and career goals, and tracking progress toward completion.

 

 

Making the case for arts and humanities — from timeshighereducation.com by campus contributors, Eliza Compton
The arts and humanities are often dismissed as an unaffordable luxury, when these disciplines underpin vital human skills such as critical thinking, creativity and communication. This collection explores many ways in which arts and humanities can be harnessed for the benefit of all – students, universities and wider society

Yet, amid the threat of AI-driven automation in the workforce, fierce competition for entry-level jobs, and complex global problems such as climate change, the skills that humanities disciplines are built upon are vital. These skills – such as critical thinking, communication and creativity – are also key to universities’ capacity to share knowledge with industry, policymakers and the public. When it comes to understanding how high-tech solutions can best be applied in the real world, often the barriers are not technical but human, as low vaccine take-ups show.

These human skills are not unique to disciplines such as history, philosophy, literature, linguistics, performance and visual arts, of course. The need for deep thinking and analysis across all areas of academic enquiry is embedded in interdisciplinarity and STEAM initiatives, which integrate science, technology, mathematics and engineering with arts and humanities.

At their core, the arts and humanities interrogate what makes us human and how we understand and communicate with the world. In this collection, contributors from around the globe articulate the value that these disciplines bring to students, industry, government and society, when taught and designed effectively. It also considers how arts-based research can drive discovery, the role of interdisciplinarity in teaching and research, and how humanities-led expertise supports sustainability and inclusion.

 

 

AI working competency is now a graduation requirement at Purdue [Pacton] + other items re: AI in our learning ecosystems


AI Has Landed in Education: Now What? — from learningfuturesdigest.substack.com by Dr. Philippa Hardman

Here’s what’s shaped the AI-education landscape in the last month:

  • The AI Speed Trap is [still] here: AI adoption in L&D is basically won (87%)—but it’s being used to ship faster, not learn better (84% prioritising speed), scaling “more of the same” at pace.
  • AI tutors risk a “pedagogy of passivity”: emerging evidence suggests tutoring bots can reduce cognitive friction and pull learners down the ICAP spectrum—away from interactive/constructive learning toward efficient consumption.
  • Singapore + India are building what the West lacks: they’re treating AI as national learning infrastructure—for resilience (Singapore) and access + language inclusion (India)—while Western systems remain fragmented and reactive.
  • Agentic AI is the next pivot: early signs show a shift from AI as a content engine to AI as a learning partner—with UConn using agents to remove barriers so learners can participate more fully in shared learning.
  • Moodle’s AI stance sends two big signals: the traditional learning ecosystem is fragmenting, and the concept of “user sovereignty” over AI is emerging.

Four strategies for implementing custom AIs that help students learn, not outsource — from educational-innovation.sydney.edu.au by Kria Coleman, Matthew Clemson, Laura Crocco and Samantha Clarke; via Derek Bruff

For Cogniti to be taken seriously, it needs to be woven into the structure of your unit and its delivery, both in class and on Canvas, rather than left on the side. This article shares practical strategies for implementing Cogniti in your teaching so that students:

  • understand the context and purpose of the agent,
  • know how to interact with it effectively,
  • perceive its value as a learning tool over any other available AI chatbots, and
  • engage in reflection and feedback.

In this post, we share four strategies to help introduce and integrate Cogniti in your teaching so that students understand their context, interact effectively, and see their value as customised learning companions.


Collection: Teaching with Custom AI Chatbots — from teaching.virginia.edu; via Derek Bruff
The default behaviors of popular AI chatbots don’t always align with our teaching goals. This collection explores approaches to designing AI chatbots for particular pedagogical purposes.




 
© 2025 | Daniel Christian