“OpenAI’s Atlas: the End of Online Learning—or Just the Beginning?” [Hardman] + other items re: AI in our LE’s

OpenAI’s Atlas: the End of Online Learning—or Just the Beginning? — from drphilippahardman.substack.com by Dr. Philippa Hardman

My take is this: in all of the anxiety lies a crucial and long-overdue opportunity to deliver better learning experiences. Precisely because Atlas perceives the same context in the same moment as you, it can transform learning into a process aligned with core neuro-scientific principles—including active retrieval, guided attention, adaptive feedback and context-dependent memory formation.

Perhaps in Atlas we have a browser that for the first time isn’t just a portal to information, but one which can become a co-participant in active cognitive engagement—enabling iterative practice, reflective thinking, and real-time scaffolding as you move through challenges and ideas online.

With this in mind, I put together 10 use cases for Atlas for you to try for yourself.

6. Retrieval Practice
What: Pulling information from memory drives retention better than re-reading.
Why: Practice testing delivers medium-to-large effects (Adesope et al., 2017).
Try: Open a document with your previous notes. Ask Atlas for a mixed activity set: “Quiz me on the Krebs cycle—give me a near-miss, high-stretch MCQ, then a fill-in-the-blank, then ask me to explain it to a teen.”
Atlas uses its browser memory to generate targeted questions from your actual study materials, supporting spaced, varied retrieval.
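The "spaced, varied retrieval" idea can be sketched in a few lines of code. This is an illustration only (not an Atlas feature): a minimal Leitner-box scheduler in Python, where a correctly answered card moves to a box with a longer review interval and a missed card drops back to box 1. Card contents and intervals are my own example values.

```python
from datetime import date, timedelta

# Review intervals (in days) for Leitner boxes 1..4.
INTERVALS = {1: 1, 2: 3, 3: 7, 4: 14}

def review(card, correct, today):
    """Reschedule a card after one retrieval attempt.

    A correct answer promotes the card to the next box (longer gap);
    a miss demotes it to box 1 for review the next day.
    """
    if correct:
        card["box"] = min(card["box"] + 1, max(INTERVALS))
    else:
        card["box"] = 1
    card["due"] = today + timedelta(days=INTERVALS[card["box"]])
    return card

card = {"question": "Where does the Krebs cycle occur?", "box": 1,
        "due": date(2025, 1, 1)}
card = review(card, correct=True, today=date(2025, 1, 1))
print(card["box"], card["due"])  # promoted to box 2, due three days later
```

The same schedule logic is what commercial spaced-repetition tools elaborate on; the point here is only that "spaced" retrieval means the gap grows with each successful recall.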




From DSC:
A quick comment. I appreciate these ideas and approaches from Katarzyna and Rita. That said, someone will need to make sure the AI models/platforms/tools are given up-to-date information and updated instructions — i.e., any new procedures, steps to take, etc. Perhaps I’m missing the boat here, but an internal AI platform is only as useful as the currency of the information and instructions it can access.


 

Ground-level Impacts of the Changing Landscape of Higher Education — from onedtech.philhillaa.com by Glenda Morgan; emphasis DSC
Evidence from the Virginia Community College System

In that spirit, in this post I examine a report from Virginia’s Joint Legislative Audit and Review Commission (JLARC) on Virginia’s Community Colleges and the changing higher-education landscape. The report offers a rich view of how several major issues are evolving at the institutional level over time, an instructive case study in big changes and their implications.

Its empirical depth also prompts broader questions we should ask across higher education.

  • What does the shift toward career education and short-term training mean for institutional costs and funding?
  • How do we deliver effective student supports as enrollment moves online?
  • As demand shifts away from on-campus learning, do physical campuses need to get smaller?
  • Are we seeing a generalizable movement from academic programs to CTE to short-term options? If so, what does that imply for how community colleges are staffed and funded?
  • As online learning becomes a larger, permanent share of enrollment, do student services need a true bimodal redesign, built to serve both online and on-campus students effectively? Evidence suggests this urgent question is not being addressed, especially in cash-strapped community colleges.
  • As online learning grows, what happens to physical campuses? Improving space utilization likely means downsizing, which carries other implications. Campuses are community anchors, even for online students—so finding the right balance deserves serious debate.
 

Agentic AI and the New Era of Corporate Learning for 2026 — from hrmorning.com by Carol Warner

That gap creates compliance risk and wasted investment. It leaves HR leaders with a critical question: How do you measure and validate real learning when AI is doing the work for employees?

Designing Training That AI Can’t Fake
Employees often find static slide decks and multiple-choice quizzes tedious, while AI can breeze through them. If employees would rather let AI take training for them, it’s a red flag about the content itself.

One of the biggest risks with agentic AI is disengagement. When AI can complete a task for employees, their incentive to engage disappears unless they understand why the skill matters, Rashid explains. Personalization and context are critical. Training should clearly connect to what employees value most – career mobility, advancement, and staying relevant in a fast-changing market.

Nearly half of executives believe today’s skills will expire within two years, making continuous learning essential for job security and growth. To make training engaging, Rashid recommends:

  • Delivering content in formats employees already consume – short videos, mobile-first modules, interactive simulations, or micro-podcasts that fit naturally into workflows. For frontline workers, this might mean replacing traditional desktop training with mobile content that integrates into their workday.
  • Aligning learning with tangible outcomes, like career opportunities or new responsibilities.
  • Layering in recognition, such as digital badges, leaderboards, or team shout-outs, to reinforce motivation and progress.

Microsoft 365 Copilot AI agents reach a new milestone — is teamwork about to change? — from windowscentral.com by Adam Hales
Microsoft expands Copilot with collaborative agents in Teams, SharePoint and more to boost productivity and reshape teamwork.

Microsoft is pitching a recent shift of AI agents in Microsoft Teams as more than just smarter assistance. Instead, these agents are built to behave like human teammates inside familiar apps such as Teams, SharePoint, and Viva Engage. They can set up meeting agendas, keep files in order, and even step in to guide community discussions when things drift off track.

Unlike tools such as ChatGPT or Claude, which mostly wait for prompts, Microsoft’s agents are designed to take initiative. They can chase up unfinished work, highlight items that still need decisions, and keep projects moving forward. By drawing on Microsoft Graph, they also bring in the right files, past decisions, and context to make their suggestions more useful.



Chris Dede’s comments on LinkedIn re: Aibrary

As an advisor to Aibrary, I am impressed with their educational philosophy, which is based both on theory and on empirical research findings. Aibrary is an innovative approach to self-directed learning that complements academic resources. Expanding our historic conceptions of books, libraries, and lifelong learning to new models enabled by emerging technologies is central to empowering all of us to shape our future.

Also see:

Aibrary.ai


Why AI literacy must come before policy — from timeshighereducation.com by Kathryn MacCallum and David Parsons
When developing rules and guidelines around the uses of artificial intelligence, the first question to ask is whether the university policymakers and staff responsible for implementing them truly understand how learners can meet the expectations they set

Literacy first, guidelines second, policy third
For students to respond appropriately to policies, they need to be given supportive guidelines that enact these policies. Further, to apply these guidelines, they need a level of AI literacy that gives them the knowledge, skills and understanding required to support responsible use of AI. Therefore, if we want AI to enhance education rather than undermine it, we must build literacy first, then create supportive guidelines. Good policy can then follow.


AI training becomes mandatory at more US law schools — from reuters.com by Karen Sloan and Sara Merken

Sept 22 (Reuters) – At orientation last month, 375 new Fordham Law students were handed two summaries of rapper Drake’s defamation lawsuit against his rival Kendrick Lamar’s record label — one written by a law professor, the other by ChatGPT.

The students guessed which was which, then dissected the artificial intelligence chatbot’s version for accuracy and nuance, finding that it included some irrelevant facts.

The exercise was part of the first-ever AI session for incoming students at the Manhattan law school, one of at least eight law schools now incorporating AI training for first-year students in orientation, legal research and writing courses, or through mandatory standalone classes.

 

Digital Accessibility with Amy Lomellini — from intentionalteaching.buzzsprout.com by Derek Bruff

In this episode, we explore why digital accessibility can be so important to the student experience. My guest is Amy Lomellini, director of accessibility at Anthology, the company that makes the learning management system Blackboard. Amy teaches educational technology as an adjunct at Boise State University, and she facilitates courses on digital accessibility for the Online Learning Consortium. In our conversation, we talk about the importance of digital accessibility to students, moving away from the traditional disclosure-accommodation paradigm, AI as an assistive technology, and lots more.

 

From EdTech to TechEd: The next chapter in learning’s evolution — from linkedin.com by Lev Gonick

A day in the life: The next 25 years
A learner wakes up. Their AI-powered learning coach welcomes them, drawing their attention to their progress and helping them structure their approach to the day.  A notification reminds them of an upcoming interview and suggests reflections to add to their learning portfolio.

Rather than a static gradebook, their portfolio is a dynamic, living record, curated by the student, validated by mentors in both industry and education, and enriched through co-creation with maturing modes of AI. It tells a story through essays, code, music, prototypes, journal reflections, and team collaborations. These artifacts are not “submitted”; they are published, shared, and linked to verifiable learning outcomes.

And when it’s time to move, to a new institution, a new job, or a new goal, their data goes with them, immutable, portable, verifiable, and meaningful.

From DSC:
And I would add to that last solid sentence that the learner/student/employee will be able to control who can access this information. Anyway, some solid reflections here from Lev.


AI Could Surpass Schools for Academic Learning in 5-10 Years — from downes.ca with commentary from Stephen Downes

I know a lot of readers will disagree with this, and the timeline feels aggressive (the future always arrives more slowly than pundits expect) but I think the overall premise is sound: “The concept of a tipping point in education – where AI surpasses traditional schools as the dominant learning medium – is increasingly plausible based on current trends, technological advancements, and expert analyses.”


The world’s first AI cabinet member — from therundown.ai by Zach Mink, Rowan Cheung, Shubham Sharma, Joey Liu & Jennifer Mossalgue

The Rundown: In this tutorial, you will learn how to combine NotebookLM with ChatGPT to master any subject faster, turning dense PDFs into interactive study materials with summaries, quizzes, and video explanations.

Step-by-step:

  1. Go to notebooklm.google.com, click the “+” button, and upload your PDF study material (works best with textbooks or technical documents)
  2. Choose your output mode: Summary for a quick overview, Mind Map for visual connections, or Video Overview for a podcast-style explainer with visuals
  3. Generate a Study Guide under Reports — get Q&A sets, short-answer questions, essay prompts, and glossaries of key terms automatically
  4. Take your PDF to ChatGPT and prompt: “Read this chapter by chapter and highlight confusing parts” or “Quiz me on the most important concepts”
  5. Combine both tools: Use NotebookLM for quick context and interactive guides, then ChatGPT to clarify tricky parts and go deeper

Pro Tip: If your source is in EPUB or audiobook, convert it to PDF before uploading. Both NotebookLM and ChatGPT handle PDFs best.
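For the EPUB-to-PDF conversion mentioned above, one common route (assuming you have pandoc and a PDF engine such as a TeX installation; the filename is hypothetical) is:

```shell
# Convert an EPUB ebook to PDF so NotebookLM/ChatGPT can ingest it.
# Requires pandoc plus a PDF engine (e.g., pdflatex) on your PATH.
pandoc my-textbook.epub -o my-textbook.pdf
```

Any other EPUB-to-PDF converter works equally well; the tools only need a clean PDF on the other end.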

Claude can now create and edit files — from anthropic.com

Claude can now create and edit Excel spreadsheets, documents, PowerPoint slide decks, and PDFs directly in Claude.ai and the desktop app. This transforms how you work with Claude—instead of only receiving text responses or in-app artifacts, you can describe what you need, upload relevant data, and get ready-to-use files in return.

Also see:

  • Microsoft to lessen reliance on OpenAI by buying AI from rival Anthropic — from techcrunch.com by Rebecca Bellan
    Microsoft will pay to use Anthropic’s AI in Office 365 apps, The Information reports, citing two sources. The move means that Anthropic’s tech will help power new features in Word, Excel, Outlook, and PowerPoint alongside OpenAI’s, marking the end of Microsoft’s previous reliance solely on the ChatGPT maker for its productivity suite. Microsoft’s move to diversify its AI partnerships comes amid a growing rift with OpenAI, which has pursued its own infrastructure projects as well as a potential LinkedIn competitor.

Ep. 11 AGI and the Future of Higher Ed: Talking with Ray Schroeder

In this episode of Unfixed, we talk with Ray Schroeder—Senior Fellow at UPCEA and Professor Emeritus at the University of Illinois Springfield—about Artificial General Intelligence (AGI) and what it means for the future of higher education. While most of academia is still grappling with ChatGPT and basic AI tools, Schroeder is thinking ahead to AI agents, human displacement, and AGI’s existential implications for teaching, learning, and the university itself. We explore why AGI is so controversial, what institutions should be doing now to prepare, and how we can respond responsibly—even while we’re already overwhelmed.


Best AI Tools for Instructional Designers — from blog.cathy-moore.com by Cathy Moore

Data from the State of AI and Instructional Design Report revealed that 95.3% of the instructional designers interviewed use AI in their daily work [1]. And over 85% of this AI use occurs during the design and development process.

These figures showcase the immense impact AI is already having on the instructional design world.

If you’re an L&D professional still on the fence about adding AI to your workflow or an AI convert looking for the next best tools, keep reading.

This guide breaks down 5 of the top AI tools for instructional designers in 2025, so you can streamline your development processes and build better training faster.

But before we dive into the tools of the trade, let’s address the elephant in the room:




3 Human Skills That Make You Irreplaceable in an AI World — from gettingsmart.com by Tom Vander Ark and Mason Pashia

Key Points

  • Update learner profiles to emphasize curiosity, curation, and connectivity, ensuring students develop irreplaceable human skills.
  • Integrate real-world learning experiences and mastery-based assessments to foster agency, purpose, and motivation in students.
 

Miro and GenAI as drivers of online student engagement — from timeshighereducation.com by Jaime Eduardo Moncada Garibay
A set of practical strategies for transforming passive online student participation into visible, measurable and purposeful engagement through the use of Miro, enhanced by GenAI

To address this challenge, I shifted my focus from requesting participation to designing it. This strategic change led me to integrate Miro, a visual digital workspace, into my classes. Miro enables real-time visualisation and co-creation of ideas, whether individually or in teams.

The transition from passive attendance to active engagement in online classes requires deliberate instructional design. Tools such as Miro, enhanced by GenAI, enable educators to create structured, visually rich learning environments in which participation is both expected and documented.

While technology provides templates, frames, timers and voting features, its real pedagogical value emerges through intentional facilitation, where the educator’s role shifts from delivering content to orchestrating collaborative, purposeful learning experiences.


Benchmarking Online Education with Bruce Etter and Julie Uranis — from buzzsprout.com by Derek Bruff

Here are some that stood out to me:

  • In the past, it was typical for faculty to teach online courses as an “overload” of some kind, but BOnES data show that 92% of online programs feature courses taught as part of faculty members’ standard teaching responsibilities. Online teaching has become one of multiple modalities in which faculty teach regularly.
  • Three-quarters of chief online officers surveyed said they plan to have a greater market share of online enrollments in the future, but only 23% said their current marketing is better than their competitors’. The rising tide of online enrollments won’t lift all boats; some institutions will fare better than others.
  • Staffing at online education units is growing, with the median staff size increasing from 15 last year to 20 this year. Julie pointed out that successful online education requires investment of resources. You might not need as many buildings as onsite education does, but you need people and you need technology.


 

The 2025 Changing Landscape of Online Education (CHLOE) 10 Report — from qualitymatters.org; emphasis below from DSC

Notable findings from the 73-page report include: 

  • Online Interest Surges Across Student Populations: …
  • Institutional Preparedness Falters Amid Rising Demand: Despite accelerating demand, institutional readiness has stagnated—or regressed—in key areas.
  • The Online Education Marketplace Is Increasingly Competitive: …
  • Alternative Credentials Take Center Stage: …
  • AI Integration Lacks Strategic Coordination: …

Just 28% of faculty are considered fully prepared for online course design, and 45% for teaching. Alarmingly, only 28% of institutions report having fully developed academic continuity plans for future emergency pivots to online.


Also relevant, see:


Great Expectations, Fragile Foundations — from onedtech.philhillaa.com by Glenda Morgan
Lessons about growth from the CHLOE & BOnES reports

Cultural resistance remains strong. Many [Chief Online Learning Officers] COLOs say faculty and deans still believe in-person learning is “just better,” creating headwinds even for modest online growth. As one respondent at a four-year institution with a large online presence put it:

Supportive departments [that] see the value in online may have very different levels of responsiveness compared to academic departments [that] are begrudgingly online. There is definitely a growing belief that students “should” be on-ground and are only choosing online because it’s easy/ convenient. Never mind the very real and growing population of nontraditional learners who can only take online classes, and the very real and growing population of traditional-aged learners who prefer online classes; many faculty/deans take a paternalistic, “we know what’s best” approach.


Ultimately, what we need is not just more ambition but better ambition. Ambition rooted in a realistic understanding of institutional capacity, a shared strategic vision, investments in policy and infrastructure, and a culture that supports online learning as a core part of the academic mission, not an auxiliary one. It’s time we talked about what it really takes to grow online learning, and where ambition needs to be matched by structure.

From DSC:
Yup. Culture is at the breakfast table again…boy, those strategies taste good.

I’d like to take some of this report — like the graphic below — and share it with former faculty members and members of a couple of my past job families’ leadership. They strongly disagreed with us when we tried to advocate for the development of online-based learning/programs at our organizations…but we were right. We were right all along. And we were LEADING all along. No doubt about it — even if the leadership at the time said that we weren’t leading.

The cultures of those organizations hurt us at the time. But our cultivating work eventually led to the development of online programs — unfortunately, after our groups were disbanded, they had to outsource those programs to OPMs.


Arizona State University — with its dramatic growth in online-based enrollments.

 

Digital Accessibility in 2025: A Screen Reader User’s Honest Take — from blog.usablenet.com by Michael Taylor

In this post, part of the UsableNet 25th anniversary series, I’m taking a look at where things stand in 2025. I’ll discuss the areas that have improved—such as online shopping, banking, and social media—and the ones that still make it challenging to perform basic tasks, including travel, healthcare, and mobile apps. I hope that by sharing what works and what doesn’t, I can help paint a clearer picture of the digital world as it stands today.


Why EAA Compliance and Legal Trends Are Shaping Accessibility in 2025 — from blog.usablenet.com by Jason Taylor

On June 28, 2025, the European Accessibility Act (EAA) officially became enforceable across the European Union. This law requires digital products and services—including websites, mobile apps, e-commerce platforms, and software to meet the defined accessibility standards outlined in EN 301 549, which aligns with the WCAG 2.1 Level AA.

Companies that serve EU consumers must be able to demonstrate that accessibility is built into the design, development, testing, and maintenance of their digital products and services.

This milestone also arrives as UsableNet celebrates 25 years of accessibility leadership—a moment to reflect on how far we’ve come and what digital teams must do next.

 

“Using AI Right Now: A Quick Guide” [Molnick] + other items re: AI in our learning ecosystems

Thoughts on thinking — from dcurt.is by Dustin Curtis

Intellectual rigor comes from the journey: the dead ends, the uncertainty, and the internal debate. Skip that, and you might still get the insight–but you’ll have lost the infrastructure for meaningful understanding. Learning by reading LLM output is cheap. Real exercise for your mind comes from building the output yourself.

The irony is that I now know more than I ever would have before AI. But I feel slightly dumber. A bit more dull. LLMs give me finished thoughts, polished and convincing, but none of the intellectual growth that comes from developing them myself. 


Using AI Right Now: A Quick Guide — from oneusefulthing.org by Ethan Mollick
Which AIs to use, and how to use them

Every few months I put together a guide on which AI system to use. Since I last wrote my guide, however, there has been a subtle but important shift in how the major AI products work. Increasingly, it isn’t about the best model, it is about the best overall system for most people. The good news is that picking an AI is easier than ever and you have three excellent choices. The challenge is that these systems are getting really complex to understand. I am going to try and help a bit with both.

First, the easy stuff.

Which AI to Use
For most people who want to use AI seriously, you should pick one of three systems: Claude from Anthropic, Google’s Gemini, and OpenAI’s ChatGPT.

Also see:


Student Voice, Socratic AI, and the Art of Weaving a Quote — from elmartinsen.substack.com by Eric Lars Martinsen
How a custom bot helps students turn source quotes into personal insight—and share it with others

This summer, I tried something new in my fully online, asynchronous college writing course. These classes have no Zoom sessions. No in-person check-ins. Just students, Canvas, and a lot of thoughtful design behind the scenes.

One activity I created was called QuoteWeaver—a PlayLab bot that helps students do more than just insert a quote into their writing.

Try it here

It’s a structured, reflective activity that mimics something closer to an in-person 1:1 conference or a small group quote workshop—but in an asynchronous format, available anytime. In other words, it’s using AI not to speed students up, but to slow them down.

The bot begins with a single quote that the student has found through their own research. From there, it acts like a patient writing coach, asking open-ended, Socratic questions such as:

What made this quote stand out to you?
How would you explain it in your own words?
What assumptions or values does the author seem to hold?
How does this quote deepen your understanding of your topic?
It doesn’t move on too quickly. In fact, it often rephrases and repeats, nudging the student to go a layer deeper.


The Disappearance of the Unclear Question — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
New Piece for UNESCO Education Futures

On [6/13/25], UNESCO published a piece I co-authored with Victoria Livingstone at Johns Hopkins University Press. It’s called The Disappearance of the Unclear Question, and it’s part of the ongoing UNESCO Education Futures series – an initiative I appreciate for its thoughtfulness and depth on questions of generative AI and the future of learning.

Our piece raises a small but important red flag. Generative AI is changing how students approach academic questions, and one unexpected side effect is that unclear questions – for centuries a trademark of deep thinking – may be beginning to disappear. Not because they lack value, but because they don’t always work well with generative AI. Quietly and unintentionally, students (and teachers) may find themselves gradually avoiding them altogether.

Of course, that would be a mistake.

We’re not arguing against using generative AI in education. Quite the opposite. But we do propose that higher education needs a two-phase mindset when working with this technology: one that recognizes what AI is good at, and one that insists on preserving the ambiguity and friction that learning actually requires to be successful.




Leveraging GenAI to Transform a Traditional Instructional Video into Engaging Short Video Lectures — from er.educause.edu by Hua Zheng

By leveraging generative artificial intelligence to convert lengthy instructional videos into micro-lectures, educators can enhance efficiency while delivering more engaging and personalized learning experiences.


This AI Model Never Stops Learning — from link.wired.com by Will Knight

Researchers at Massachusetts Institute of Technology (MIT) have now devised a way for LLMs to keep improving by tweaking their own parameters in response to useful new information.

The work is a step toward building artificial intelligence models that learn continually—a long-standing goal of the field and something that will be crucial if machines are to ever more faithfully mimic human intelligence. In the meantime, it could give us chatbots and other AI tools that are better able to incorporate new information including a user’s interests and preferences.

The MIT scheme, called Self Adapting Language Models (SEAL), involves having an LLM learn to generate its own synthetic training data and update procedure based on the input it receives.
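The self-edit loop described there can be caricatured in a few lines. This is a toy analogy under my own assumptions, not the actual SEAL training procedure (which proposes edits and updates real model weights): a stand-in "model" that, given new text, generates its own synthetic training pairs and then applies an update step with them.

```python
def generate_self_edits(passage):
    """Stand-in for the LLM proposing synthetic training data:
    naive (question, answer) pairs mined from 'X is Y' sentences."""
    pairs = []
    for sentence in passage.split("."):
        if " is " in sentence:
            subject, _, rest = sentence.strip().partition(" is ")
            pairs.append((f"What is {subject}?", rest.strip()))
    return pairs

def apply_update(memory, pairs):
    """Stand-in for the parameter update: absorb the synthetic pairs."""
    for question, answer in pairs:
        memory[question] = answer
    return memory

memory = {}
passage = "SEAL is a self-adapting scheme. The Krebs cycle is a series of reactions."
memory = apply_update(memory, generate_self_edits(passage))
print(memory["What is SEAL?"])  # the "model" now answers from its own self-edits
```

The real scheme replaces both stubs with the LLM itself: the model writes the training data and the update recipe, and gradient steps do the absorbing.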


Edu-Snippets — from scienceoflearning.substack.com by Nidhi Sachdeva and Jim Hewitt
Why knowledge matters in the age of AI; What happens to learners’ neural activity with prolonged use of LLMs for writing

Highlights:

  • Offloading knowledge to Artificial Intelligence (AI) weakens memory, disrupts memory formation, and erodes the deep thinking our brains need to learn.
  • Prolonged use of ChatGPT in writing lowers neural engagement, impairs memory recall, and accumulates cognitive debt that isn’t easily reversed.
 

“The AI-enhanced learning ecosystem” [Jennings] + other items re: AI in our learning ecosystems

The AI-enhanced learning ecosystem: A case study in collaborative innovation — from chieflearningofficer.com by Kevin Jennings
How artificial intelligence can serve as a tool and collaborative partner in reimagining content development and management.

Learning and development professionals face unprecedented challenges in today’s rapidly evolving business landscape. According to LinkedIn’s 2025 Workplace Learning Report, 67 percent of L&D professionals report being “maxed out” on capacity, while 66 percent have experienced budget reductions in the past year.

Despite these constraints, 87 percent agree their organizations need to develop employees faster to keep pace with business demands. These statistics paint a clear picture of the pressure L&D teams face: do more, with less, faster.

This article explores how one L&D leader’s strategic partnership with artificial intelligence transformed these persistent challenges into opportunities, creating a responsive learning ecosystem that addresses the modern demands of rapid product evolution and diverse audience needs. With 71 percent of L&D professionals now identifying AI as a high or very high priority for their learning strategy, this case study demonstrates how AI can serve not merely as a tool but as a collaborative partner in reimagining content development and management.


How we use GenAI and AR to improve students’ design skills — from timeshighereducation.com by Antonio Juarez, Lesly Pliego and Jordi Rábago who are professors of architecture at Monterrey Institute of Technology in Mexico; Tomas Pachajoa is a professor of architecture at the El Bosque University in Colombia; & Carlos Hinrichsen and Marietta Castro are educators at San Sebastián University in Chile.
Guidance on using generative AI and augmented reality to enhance student creativity, spatial awareness and interdisciplinary collaboration

Blend traditional skills development with AI use
For subjects that require students to develop drawing and modelling skills, have students create initial design sketches or models manually to ensure they practise these skills. Then, introduce GenAI tools such as Midjourney, Leonardo AI and ChatGPT to help students explore new ideas based on their original concepts. Using AI at this stage broadens their creative horizons and introduces innovative perspectives, which are crucial in a rapidly evolving creative industry.

Provide step-by-step tutorials, including both written guides and video demonstrations, to illustrate how initial sketches can be effectively translated into AI-generated concepts. Offer example prompts to demonstrate diverse design possibilities and help students build confidence using GenAI.

Integrating generative AI and AR consistently enhanced student engagement, creativity and spatial understanding on our course. 


How Texas is Preparing Higher Education for AI — from the74million.org by Kate McGee
TX colleges are thinking about how to prepare students for a changing workforce and an already overburdened faculty for new challenges in classrooms.

“It doesn’t matter if you enter the health industry, banking, oil and gas, or national security enterprises like we have here in San Antonio,” Eighmy told The Texas Tribune. “Everybody’s asking for competency around AI.”

It’s one of the reasons the public university, which serves 34,000 students, announced earlier this year that it is creating a new college dedicated to AI, cyber security, computing and data science. The new college, which is still in the planning phase, would be one of the first of its kind in the country. UTSA wants to launch the new college by fall 2025.

But many state higher education leaders are thinking beyond that. As AI becomes a part of everyday life in new, unpredictable ways, universities across Texas and the country are also starting to consider how to ensure faculty are keeping up with the new technology and students are ready to use it when they enter the workforce.


In the Room Where It Happens: Generative AI Policy Creation in Higher Education — from er.educause.edu by Esther Brandon, Lance Eaton, Dana Gavin, and Allison Papini

To develop a robust policy for generative artificial intelligence use in higher education, institutional leaders must first create “a room” where diverse perspectives are welcome and included in the process.


Q&A: Artificial Intelligence in Education and What Lies Ahead — from usnews.com by Sarah Wood
Research indicates that AI is becoming an essential skill to learn for students to succeed in the workplace.

Q: How do you expect to see AI embraced more in the future in college and the workplace?
I do believe it’s going to become a permanent fixture for multiple reasons. I think the national security imperative associated with AI as a result of competing against other nations is going to drive a lot of energy and support for AI education. We also see shifts across every field and discipline regarding the usage of AI beyond college. We see this in a broad array of fields, including health care and the field of law. I think it’s here to stay and I think that means we’re going to see AI literacy being taught at most colleges and universities, and more faculty leveraging AI to help improve the quality of their instruction. I feel like we’re just at the beginning of a transition. In fact, I often describe our current moment as the ‘Ask Jeeves’ phase of the growth of AI. There’s a lot of change still ahead of us. AI, for better or worse, it’s here to stay.




AI-Generated Podcasts Outperform Textbooks in Landmark Education Study — from linkedin.com by David Borish

A new study from Drexel University and Google has demonstrated that AI-generated educational podcasts can significantly enhance both student engagement and learning outcomes compared to traditional textbooks. The research, involving 180 college students across the United States, represents one of the first systematic investigations into how artificial intelligence can transform educational content delivery in real-time.


What can we do about generative AI in our teaching?  — from linkedin.com by Kristina Peterson

So what can we do?

  • Interrogate the Process: We can ask ourselves if we built in enough checkpoints. Steps that can’t be faked. Things like quick writes, question floods, in-person feedback, revision logs.
  • Reframe AI: We can let students use AI as a partner. We can show them how to prompt better, revise harder, and build from it rather than submit it. Show them the difference between using a tool and being used by one.
  • Design Assignments for Curiosity, Not Compliance: Even the best of our assignments need to adapt. Mine needs more checkpoints, more reflective questions along the way, more explanation of why my students made the choices they did.

Teachers Are Not OK — from 404media.co by Jason Koebler

The response from teachers and university professors was overwhelming. In my entire career, I’ve rarely gotten so many email responses to a single article, and I have never gotten so many thoughtful and comprehensive responses.

One thing is clear: teachers are not OK.

In addition, universities are contracting with companies like Microsoft, Adobe, and Google for digital services, and those companies are constantly pushing their AI tools. So a student might hear “don’t use generative AI” from a prof but then log on to the university’s Microsoft suite, which then suggests using Copilot to sum up readings or help draft writing. It’s inconsistent and confusing.

I am sick to my stomach as I write this because I’ve spent 20 years developing a pedagogy that’s about wrestling with big ideas through writing and discussion, and that whole project has been evaporated by for-profit corporations who built their systems on stolen work. It’s demoralizing.

 


Also relevant/see:


Report: 93% of Students Believe Gen AI Training Belongs in Degree Programs — from campustechnology.com by Rhea Kelly

The vast majority of today’s college students — 93% — believe generative AI training should be included in degree programs, according to a recent Coursera report. What’s more, 86% of students consider gen AI the most crucial technical skill for career preparation, prioritizing it above in-demand skills such as data strategy and software development. And 94% agree that microcredentials help build the essential skills they need to achieve career success.

For its Microcredentials Impact Report 2025, Coursera surveyed more than 1,200 learners and 1,000 employers around the globe to better understand the demand for microcredentials and their impact on workforce readiness and hiring trends.


1 in 4 employers say they’ll eliminate degree requirements by year’s end — from hrdive.com by Carolyn Crist
Companies that recently removed degree requirements reported a surge in applications, a more diverse applicant pool and the ability to offer lower salaries.

A quarter of employers surveyed said they will remove bachelor’s degree requirements for some roles by the end of 2025, according to a May 20 report from Resume Templates.

In addition, 7 in 10 hiring managers said their company looks at relevant experience over a bachelor’s degree while making hiring decisions.

In the survey of 1,000 hiring managers, 84% of companies that recently removed degree requirements said it has been a successful move. Companies without degree requirements also reported a surge in applications, a more diverse applicant pool and the ability to offer lower salaries.


Why AI literacy is now a core competency in education — from weforum.org by Tanya Milberg

  • Education systems must go beyond digital literacy and embrace AI literacy as a core educational priority.
  • A new AI Literacy Framework (AILit) aims to empower learners to navigate an AI-integrated world with confidence and purpose.
  • Here’s what you need to know about the AILit Framework – and how to get involved in making it a success.

Also from Allison Salisbury, see:

 

MOOC-Style Skills Training — from the-job.beehiiv.com by Paul Fain
WGU and tech companies use Open edX for flexible online learning. Could community colleges be next?

Open Source for Affordable Online Reach
The online titan Western Governors University is experimenting with an open-source learning platform. So are Verizon and the Indian government. And the platform’s leaders want to help community colleges take the plunge on competency-based education.

The Open edX platform inherently supports self-paced learning and offers several features that make it a good fit for competency-based education and skills-forward learning, says Stephanie Khurana, Axim’s CEO.

“Flexible modalities and a focus on competence instead of time spent learning improves access and affordability for learners who balance work and life responsibilities alongside their education,” she says.

“Plus, being open source means institutions and organizations can collaborate to build and share CBE-specific tools and features,” she says, “which could lower costs and speed up innovation across the field.”

Axim thinks Open edX’s ability to scale affordably can support community colleges in reaching working learners across an underserved market. 

 

Higher Ed Institutions Rely Less on OPMs While Increasingly Hiring Fee-For-Service Models — from iblnews.org

A market report from Validated Insights released this month notes that fewer colleges and universities hire external online program management (OPM) companies to develop their courses.

For 2024, higher education institutions launched only 81 new partnerships with OPMs — a drop of 42% and the lowest number since 2016.

The report showed that institutions increasingly pay OPMs a fee-for-service instead of following a revenue-sharing model with big service bundles and profit splits.

Experts say revenue-sharing models, which critics denounce as predatory arrangements, incentivize service providers to use aggressive recruiting tactics to increase enrollments and maximize tuition revenue.

According to the report, fee-for-service has become the dominant business model for OPMs.


6 Online Edtech Professional Learning Communities & Resources for Teachers — from techlearning.com by Stephanie Smith Budhai, Ph.D.
These resources can help provide training, best practices, and advice for using digital tools such as Canva, Curipod, Kahoot!, and more

While school-led professional development can be helpful, there are online professional learning communities on various edtech websites that can be leveraged. Also, some of these community spaces offer the chance to monetize your work.

Here is a summary of six online edtech professional learning spaces.

 



2025 EDUCAUSE Students and Technology Report: Shaping the Future of Higher Education Through Technology, Flexibility, and Well-Being — from library.educause.edu

The student experience in higher education is continually evolving, influenced by technological advancements, shifting student needs and expectations, evolving workforce demands, and broadening sociocultural forces. In this year’s report, we examine six critical aspects of student experiences in higher education, providing insights into how institutions can adapt to meet student needs and enhance their learning experience and preparation for the workforce:

  • Satisfaction with Technology-Related Services and Supports
  • Modality Preferences
  • Hybrid Learning Experiences
  • Generative AI in the Classroom
  • Workforce Preparation
  • Accessibility and Mental Health

DSC: Shame on higher ed for not preparing students for the workplace (see below). You’re doing your students wrong…again. Not only do you continue to heap a load of debt on their backs, but you’re also continuing to not get them ready for the workplace. So don’t be surprised if eventually you’re replaced by a variety of alternatives that students will flock towards.

 

DSC: And students don’t have a clue as to what awaits them in the workplace — they rank AI-powered tools and technologies at an incredibly low 3%. Yeah, right. You’ll find out. Here’s but one example from one discipline/field of work –> Thomson Reuters Survey: Over 95% of Legal Professionals Expect Gen AI to Become Central to Workflow Within Five Years


Figure 15. Competency Areas Expected to Be Important for Career

 

From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?



 
© 2025 | Daniel Christian