From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?

Reflections on “Are You Ready for the AI University? Everything is about to change.” [Latham]

Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham

Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.

What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.

Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance; then, guided by human professors who sort through the material and help it understand the structure of the discipline, the AI could develop lectures, videos, supporting documentation, and assessments.

In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If you need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student who is going on a trip and wishes to take an exam on the plane will be able to log on and complete the AI-designed and AI-administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.

Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom. 

From DSC:
I had a very difficult time deciding which excerpts to include; this solid article offered so many more worth thinking about. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.

Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him, and as a futurist I have to consider the potential scenarios that Scott lays out for AI’s creative destruction of higher education. Scott asserts that significant impacts are coming soon for faculty members, doctoral students, and graduate/teaching assistants (and for Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.

There are a few places where Scott and I differ.

  • The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned their 5-year, $100M ed push last year and one of the key conclusions was that people want to learn from — and with — other people:

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY

By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they would do.

  • Also, I would use the term learning preferences where Scott uses the term learning styles.

Scott also mentions:

“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”

It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).

That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.

So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!


Addendum later on 4/10/25:

U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain

Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.

The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.

The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.

 

Uplimit raises stakes in corporate learning with suite of AI agents that can train thousands of employees simultaneously — from venturebeat.com by Michael Nuñez

Uplimit unveiled a suite of AI-powered learning agents today designed to help companies rapidly upskill employees while dramatically reducing administrative burdens traditionally associated with corporate training.

The San Francisco-based company announced three sets of purpose-built AI agents that promise to change how enterprises approach learning and development: skill-building agents, program management agents, and teaching assistant agents. The technology aims to address the growing skills gap as AI advances faster than most workforces can adapt.

“There is an unprecedented need for continuous learning—at a scale and speed traditional systems were never built to handle,” said Julia Stiglitz, CEO and co-founder of Uplimit, in an interview with VentureBeat. “The companies best positioned to thrive aren’t choosing between AI and their people—they’re investing in both.”


Introducing Claude for Education — from anthropic.com

Today we’re launching Claude for Education, a specialized version of Claude tailored for higher education institutions. This initiative equips universities to develop and implement AI-enabled approaches across teaching, learning, and administration—ensuring educators and students play a key role in actively shaping AI’s role in society.

As part of announcing Claude for Education, we’re introducing:

  1. Learning mode: A new Claude experience that guides students’ reasoning process rather than providing answers, helping develop critical thinking skills
  2. University-wide Claude availability: Full campus access agreements with Northeastern University, London School of Economics and Political Science (LSE), and Champlain College, making Claude available to all students
  3. Academic partnerships: Joining Internet2 and working with Instructure to embed AI into teaching & learning with Canvas LMS
  4. Student programs: A new Claude Campus Ambassadors program along with an initiative offering API credits for student projects

A comment on this from The Rundown AI:

Why it matters: Education continues to grapple with AI, but Anthropic is flipping the script by making the tech a partner in developing critical thinking rather than an answer engine. While the controversy over its use likely isn’t going away, this generation of students will have access to the most personalized, high-quality learning tools ever.


Should College Graduates Be AI Literate? — from chronicle.com by Beth McMurtrie (behind a paywall)
More institutions are saying yes. Persuading professors is only the first barrier they face.

Last fall one of Jacqueline Fajardo’s students came to her office, eager to tell her about an AI tool that was helping him learn general chemistry. Had she heard of Google NotebookLM? He had been using it for half a semester in her honors course. He confidently showed her how he could type in the learning outcomes she posted for each class and the tool would produce explanations and study guides. It even created a podcast based on an academic paper he had uploaded. He did not feel it was important to take detailed notes in class because the AI tool was able to summarize the key points of her lectures.


Showing Up for the Future: Why Educators Can’t Sit Out the AI Conversation — from marcwatkins.substack.com with a guest post from Lew Ludwig

The Risk of Disengagement
Let’s be honest: most of us aren’t jumping headfirst into AI. At many of our institutions, it’s not a gold rush—it’s a quiet standoff. But the group I worry most about isn’t the early adopters. It’s the faculty who’ve decided to opt out altogether.

That choice often comes from a place of care. Concerns about data privacy, climate impact, exploitative labor, and the ethics of using large language models are real—and important. But choosing not to engage at all, even on ethical grounds, doesn’t remove us from the system. It just removes our voices from the conversation.

And without those voices, we risk letting others—those with very different priorities—make the decisions that shape what AI looks like in our classrooms, on our campuses, and in our broader culture of learning.



Turbocharge Your Professional Development with AI — from learningguild.com by Dr. RK Prasad

You’ve just mastered a few new eLearning authoring tools, and now AI is knocking on the door, offering to do your job faster, smarter, and without needing coffee breaks. Should you be worried? Or excited?

If you’re a Learning and Development (L&D) professional today, AI is more than just a buzzword—it’s transforming the way we design, deliver, and measure corporate training. But here’s the good news: AI isn’t here to replace you. It’s here to make you better at what you do.

The challenge is to harness its potential to build digital-ready talent, not just within your organization but within yourself.

Let’s explore how AI is reshaping L&D strategies and how you can leverage it for professional development.


5 Recent AI Notables — from automatedteach.com by Graham Clay

1. OpenAI’s New Image Generator
What Happened: OpenAI integrated a much more powerful image generator directly into GPT-4o, making it the default image creator in ChatGPT. Unlike previous image models, this one excels at accurately rendering text in images, precise visualization of diagrams/charts, and multi-turn image refinement through conversation.

Why It’s Big: For educators, this represents a significant advancement in creating educational visuals, infographics, diagrams, and other instructional materials with unprecedented accuracy and control. It’s not perfect, but you can now quickly generate custom illustrations that accurately display mathematical equations, chemical formulas, or process workflows — previously a significant hurdle in digital content creation — without requiring graphic design expertise or expensive software. This capability dramatically reduces the time between conceptualizing a visual aid and implementing it in course materials.


The 4 AI modes that will supercharge your workflow — from aiwithallie.beehiiv.com by Allie K. Miller
The framework most people and companies won’t discover until 2026


 

Investigating Informal Learning with Technology — from learningguild.com by Katie Belle (Curry) Nelson

Informal learning is having a moment right now, and it’s about time.

As learning professionals, we can often get caught up in designing, developing, and implementing formal learning experiences, which can cause informal learning to fall by the wayside and be easily overlooked. However, informal learning experiences can have major, long-term effects on learning and business outcomes, so finding creative ways to track them can be valuable for L&D departments.

Start small
These three methods are small steps toward understanding the informal learning environment and its impact on your organization. The reassuring thing about informal learning is that you can start small and incorporate more methods later, because informal learning is always taking place. Start with one area and begin to explore what you can find out about the content your learners want to know more about, how they are learning about things, and how others in the organization are solving problems.

 

It’s the end of work as we knew it
and I feel…

powerless to fight the technology that we pioneered
nostalgic for a world that moved on without us
after decades of paying our dues
for a payday that never came
…so yeah
not exactly fine.


The Gen X Career Meltdown — from nytimes.com by Steven Kurutz (DSC: This is a gifted article for you)
Just when they should be at their peak, experienced workers in creative fields find that their skills are all but obsolete.

If you entered media or image-making in the ’90s — magazine publishing, newspaper journalism, photography, graphic design, advertising, music, film, TV — there’s a good chance that you are now doing something else for work. That’s because those industries have shrunk or transformed themselves radically, shutting out those whose skills were once in high demand.

“I am having conversations every day with people whose careers are sort of over,” said Chris Wilcha, a 53-year-old film and TV director in Los Angeles.

Talk with people in their late 40s and 50s who once imagined they would be able to achieve great heights — or at least a solid career while flexing their creative muscles — and you are likely to hear about the photographer whose work dried up, the designer who can’t get hired or the magazine journalist who isn’t doing much of anything.

In the wake of the influencers comes another threat, artificial intelligence, which seems likely to replace many of the remaining Gen X copywriters, photographers and designers. By 2030, ad agencies in the United States will lose 32,000 jobs, or 7.5 percent of the industry’s work force, to the technology, according to the research firm Forrester.


From DSC:
This article reminds me of how tough it is to navigate change in our lives. For me, it was often because I was working with technologies. Being a technologist can be difficult, especially as one gets older and faces age discrimination in a variety of industries. You need to pick the right technologies and directions that will last (for me it was email, videoconferencing, the Internet, online-based education/training, discovering/implementing instructional technologies, and becoming a futurist).

For you younger folks out there — especially students within K-16 — aim to develop a perspective and a skillset that is all about adapting to change. You will likely need to reinvent yourself and/or pick up new skills over your working years. You are most assuredly required to be a lifelong learner now. That’s why I have been pushing for school systems to be more concerned with providing more choice and control to students — so that students actually like school and enjoy learning about new things.


Blind Spot on AI — from the-job.beehiiv.com by Paul Fain
Office tasks are being automated now, but nobody has answers on how education and worker upskilling should change.

Students and workers will need help adjusting to a labor market that appears to be on the verge of a historic disruption as many business processes are automated. Yet job projections and policy ideas are sorely lacking.

The benefits of agentic AI are already clear for a wide range of organizations, including small nonprofits like CareerVillage. But the ability to automate a broad range of business processes means that education programs and skills training for knowledge workers will need to change. And as Chung writes in a must-read essay, we have a blind spot with predicting the impacts of agentic AI on the labor market.

“Without robust projections,” he writes, “policymakers, businesses, and educators won’t be able to come to terms with how rapidly we need to start this upskilling.”

 

Learning as a Learning Professional: Unlock Hidden Opportunities — from learningguild.com by Will Thalheimer

As learning professionals, we help others grow—but how well are we developing ourselves? And does it really matter? Absolutely! In this article, I’ll explore why mastering the art of learning is crucial for our success and share strategies that go beyond traditional professional development.

Why learning matters for us
We need to be strong learners because our work demands broad expertise. We must understand the learning sciences, instructional design, project management, technology, evaluation, organizational dynamics, and business strategy. We also need to navigate a sea of learning frameworks, approaches, and models.


Also from learningguild.com, see:

Microlearning: The Key to Capturing Modern Learners’ Attention — by Sergiy Movchan

This shift in how we consume and process information is challenging traditional learning methods, which are finding it increasingly difficult to keep learners’ attention.

Microlearning is a bridge to the attention of today’s learners, delivering complex topics in short, manageable pieces. Whether it’s a five-minute video, a quick quiz, or a short lesson, microlearning makes it easier for students to stay engaged. Microlearning often holds learners’ attention better and for longer compared to standard learning methods.

Typical low completion rates clearly show the need for innovative approaches to content delivery and student engagement. Microlearning offers the answer to this need.

Cultivating Creativity as an L&D Professional — by Katie Belle (Curry) Nelson

Instructional designers and learning professionals are creative by nature. We are called upon to be creative with technology like Articulate, Camtasia, or Captivate. More often than we would like, organizations, red tape, and clients require us to be creative with timelines and budgets. Being creative is a core qualification and requirement of our work. So, what do we do when we feel like the creative river has run to a trickle or dried up entirely?

 

The Learning & Development Global Sentiment Survey 2025 — from donaldhtaylor.co.uk by Don Taylor

The L&D Global Sentiment Survey, now in its 12th year, once again asked two key questions of L&D professionals worldwide:

  • What will be hot in workplace learning in 2025?
  • What are your L&D challenges in 2025?

For the obligatory question on what they considered ‘hot’ topics, respondents voted for one to three of 15 suggested options, plus a free text ‘Other’ option. Over 3,000 voters participated from nearly 100 countries. 85% shared their challenges for 2025.

The results show more interest in AI, a renewed focus on showing the value of L&D, and some signs of greater maturity around our understanding of AI in L&D.


 

Your AI Writing Partner: The 30-Day Book Framework — from aidisruptor.ai by Alex McFarland and Kamil Banc
How to Turn Your “Someday” Manuscript into a “Shipped” Project Using AI-Powered Prompts

With that out of the way, I prefer Claude.ai for writing. For larger projects like a book, create a Claude Project to keep all context in one place.

  • Copy [the following] prompts into a document
  • Use them in sequence as you write
  • Adjust the word counts and specifics as needed
  • Keep your responses for reference
  • Use the same prompt template for similar sections to maintain consistency

Each prompt builds on the previous one, creating a systematic approach to helping you write your book.


Using NotebookLM to Boost College Reading Comprehension — from michellekassorla.substack.com by Michelle Kassorla and Eugenia Novokshanova
This semester, we are using NotebookLM to help our students comprehend and engage with scholarly texts

We were looking hard for a new tool when Google released NotebookLM. Not only does Google allow unfettered use of this amazing tool, it is also a much better tool for the work we require in our courses. So, this semester, we have scrapped our “old” tools and added NotebookLM as the primary tool for our English Composition II courses (and we hope, fervently, that Google won’t decide to severely limit its free tier before this semester ends!)

If you know next-to-nothing about NotebookLM, that’s OK. What follows is the specific lesson we present to our students. We hope this will help you understand all you need to know about NotebookLM, and how to successfully integrate the tool into your own teaching this semester.


Leadership & Generative AI: Hard-Earned Lessons That Matter — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
Actionable Advice for Higher Education Leaders in 2025

After two years of working closely with leadership in multiple institutions, and delivering countless workshops, I’ve seen one thing repeatedly: the biggest challenge isn’t the technology itself, but how we lead through it. Here is some of my best advice to help you navigate generative AI with clarity and confidence:

  1. Break your own AI policies before you implement them.
  2. Fund your failures.
  3. Resist the pilot program. …
  4. Host Anti-Tech Tech Talks
  5. …+ several more tips

While generative AI in higher education obviously involves new technology, it’s much more about adopting a curious and human-centric approach in your institution and communities. It’s about empowering learners in new, human-oriented and innovative ways. It is, in a nutshell, about people adapting to new ways of doing things.



Maria Anderson responded to Clay’s posting with this idea:

Here’s an idea: […] the teacher can use the [most advanced] AI tool to generate a complete solution to “the problem” — whatever that is — and demonstrate how to do that in class. Give all the students access to the document with the results.

And then grade the students on a comprehensive follow-up activity/presentation of executing that solution (no notes, no more than 10 words on a slide). So the students all have access to the same deep AI result, but have to show they comprehend and can iterate on that result.



Grammarly just made it easier to prove the sources of your text in Google Docs — from zdnet.com by Jack Wallen
If you want to be diligent about proving your sources within Google Documents, Grammarly has a new feature you’ll want to use.

In this age of distrust, misinformation, and skepticism, you may wonder how to demonstrate your sources within a Google Document. Did you type it yourself, copy and paste it from a browser-based source, copy and paste it from an unknown source, or did it come from generative AI?

You may not think this is an important clarification, but if writing is a critical part of your livelihood or life, you will definitely want to demonstrate your sources.

That’s where the new Grammarly feature comes in.

The new feature is called Authorship, and according to Grammarly, “Grammarly Authorship is a set of features that helps users demonstrate their sources of text in a Google doc. When you activate Authorship within Google Docs, it proactively tracks the writing process as you write.”


AI Agents Are Coming to Higher Education — from govtech.com
AI agents are customizable tools with more decision-making power than chatbots. They have the potential to automate more tasks, and some schools have implemented them for administrative and educational purposes.

Custom GPTs are on the rise in education. Google’s version, Gemini Gems, includes a premade version called Learning Coach, and Microsoft announced last week a new agent addition to Copilot featuring use cases at educational institutions.


Generative Artificial Intelligence and Education: A Brief Ethical Reflection on Autonomy — from er.educause.edu by Vicki Strunk and James Willis
Given the widespread impacts of generative AI, looking at this technology through the lens of autonomy can help equip students for the workplaces of the present and of the future, while ensuring academic integrity for both students and instructors.

The principle of autonomy stresses that we should be free agents who can govern ourselves and who are able to make our own choices. This principle applies to AI in higher education because it raises serious questions about how, when, and whether AI should be used in varying contexts. Although we have only begun asking questions related to autonomy and many more remain to be asked, we hope that this serves as a starting place to consider the uses of AI in higher education.


AI Is Unavoidable, Not Inevitable — from marcwatkins.substack.com by Marc Watkins

I had the privilege of moderating a discussion between Josh Eyler and Robert Cummings about the future of AI in education at the University of Mississippi’s recent AI Winter Institute for Teachers. I work alongside both in faculty development here at the University of Mississippi. Josh’s position on AI sparked a great deal of debate on social media:

To make my position clear about the current AI in education discourse I want to highlight several things under an umbrella of “it’s very complicated.”

Most importantly, we all deserve some grace here. Dealing with generative AI in education isn’t something any of us asked for. It isn’t normal. It isn’t fixable by purchasing a tool or telling faculty to simply ‘prefer not to’ use AI. It is and will remain unavoidable for virtually every discipline taught at our institutions.

If one good thing happens because of generative AI let it be that it helps us clearly see how truly complicated our existing relationships with machines are now. As painful as this moment is, it might be what we need to help prepare us for a future where machines that mimic reasoning and human emotion refuse to be ignored.


“AI tutoring shows stunning results.”
See the article below.


From chalkboards to chatbots: Transforming learning in Nigeria, one prompt at a time — from blogs.worldbank.org by Martín E. De Simone, Federico Tiberti, Wuraola Mosuro, Federico Manolio, Maria Barron, and Eliot Dikoru

Learning gains were striking
The learning improvements were striking—about 0.3 standard deviations. To put this into perspective, this is equivalent to nearly two years of typical learning in just six weeks. When we compared these results to a database of education interventions studied through randomized controlled trials in the developing world, our program outperformed 80% of them, including some of the most cost-effective strategies like structured pedagogy and teaching at the right level. This achievement is particularly remarkable given the short duration of the program and the likelihood that our evaluation design underestimated the true impact.
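As a back-of-the-envelope illustration of the arithmetic behind that claim: a gain measured in “standard deviations” is a standardized effect size (Cohen’s d), and the “years of learning” framing divides it by an assumed rate of progress per school year. The sketch below is my own, not from the study, and the 0.15 SD-per-year benchmark is a hypothetical number chosen purely to show how the conversion works:

```python
import statistics

def cohens_d(treatment, control):
    """Standardized mean difference between two groups of test scores."""
    n_t, n_c = len(treatment), len(control)
    mean_diff = statistics.fmean(treatment) - statistics.fmean(control)
    # Pooled (sample) standard deviation across both groups
    pooled_var = ((n_t - 1) * statistics.variance(treatment)
                  + (n_c - 1) * statistics.variance(control)) / (n_t + n_c - 2)
    return mean_diff / pooled_var ** 0.5

# The World Bank team reports a gain of roughly 0.3 standard deviations.
effect_size_sd = 0.3

# Hypothetical benchmark: suppose students in comparable settings gain
# about 0.15 SD on such tests per school year (an assumption for
# illustration, not a figure from the study).
sd_per_school_year = 0.15

years_of_learning = effect_size_sd / sd_per_school_year
print(f"{effect_size_sd} SD ≈ {years_of_learning:.0f} years of typical learning")
```

Under those assumed numbers, the 0.3 SD gain works out to about two years of typical learning, which is the kind of conversion behind the “nearly two years in just six weeks” framing.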

Our evaluation demonstrates the transformative potential of generative AI in classrooms, especially in developing contexts. To our knowledge, this is the first study to assess the impact of generative AI as a virtual tutor in such settings, building on promising evidence from other contexts and formats; for example, on AI in coding classes, AI and learning in one school in Turkey, teaching math with AI (an example through WhatsApp in Ghana), and AI as a homework tutor.

Comments on this article from The Rundown AI:

Why it matters: This represents one of the first rigorous studies showing major real-world impacts in a developing nation. The key appears to be using AI as a complement to teachers rather than a replacement — and results suggest that AI tutoring could help address the global learning crisis, particularly in regions with teacher shortages.


Other items re: AI in our learning ecosystems:

  • Will AI revolutionise marking? — from timeshighereducation.com by Rohim Mohammed
    Artificial intelligence has the potential to improve speed, consistency and detail in feedback for educators grading students’ assignments, writes Rohim Mohammed. Here he lists the pros and cons based on his experience
  • Marty the Robot: Your Classroom’s AI Companion — from rdene915.com by Dr. Rachelle Dené Poth
  • Generative Artificial Intelligence: Cautiously Recognizing Educational Opportunities — from scholarlyteacher.com by Todd Zakrajsek, University of North Carolina at Chapel Hill
  • Personal AI — from michelleweise.substack.com by Dr. Michelle Weise
    “Personalized” Doesn’t Have To Be a Buzzword
    Today, however, is a different kind of moment. GenAI is now rapidly evolving to the point where we may be able to imagine a new way forward. We can begin to imagine solutions truly tailored for each of us as individuals, our own personal AI (pAI). pAI could unify various silos of information to construct far richer and more holistic and dynamic views of ourselves as long-life learners. A pAI could become our own personal career navigator, skills coach, and storytelling agent. Three particular areas emerge when we think about tapping into the richness of our own data:

    • Personalized Learning Pathways & Dynamic Skill Assessment: …
    • Storytelling for Employers:…
    • Ongoing Mentorship and Feedback: …
  • Speak — a language learning app — via The Neuron

 

The Rise of the Heretical Leader — from ditchthattextbook.com; a guest post by Dan Fitzpatrick

Now is the time for visionary leadership in education. The era of artificial intelligence is reshaping the demands on education systems. Rigid policies, outdated curricula, and reliance on obsolete metrics are failing students. A recent survey from Resume Genius found that graduates lack skills in communication, collaboration, and critical thinking. Consequently, there is a growing trend in companies hiring candidates based on skills instead of traditional education or work experience. This underscores the urgent need for educational leaders to prioritize adaptability and innovation in their systems. Educational leaders must embrace a transformative approach to keep pace.

[Heretical leaders] bring courage, empathy, and strategic thinking to reimagine education’s potential. Here are their defining characteristics:

  • Visionary Thinking: They identify bold, innovative paths to progress.
  • Courage to Act: These leaders take calculated risks to overcome resistance and inertia.
  • Relentless Curiosity: They challenge assumptions and seek better alternatives.
  • Empathy for Stakeholders: Understanding the personal impact of change allows them to lead with compassion.
  • Strategic Disruption: Their deliberate actions ensure systemic improvements.

These qualities enable heretical leaders to reframe challenges as opportunities and drive meaningful change.

From DSC:
Readers of this blog will recognize that I believe visionary leadership is extremely important — in all areas of our society, but especially within our learning ecosystems. Vision trumps data, at least in my mind. There are times when data can be used to support a vision, but having a powerful vision is more lasting and impactful than relying on data to drive the organization.

So while I’d vote for a term other than “heretical leaders,” I get what Dan is saying and I agree with him. Such leaders are going against the grain. They are swimming upstream. They are espousing perspectives that others often don’t buy into (at least initially, or for some time).

Such were the leaders who introduced online learning into K-16 educational systems in the late ’90s and over the following two-plus decades. Online-based learning continues to grow and has helped educate millions of people. Those leaders, and the people who worked on such endeavors, were going against the grain.

We haven’t seen the end point of online-based learning. I think it will become even more powerful and impactful when AI is used to identify which jobs are opening up and which skills those jobs require, and then to provide a listing of sources where one can obtain that knowledge and develop those skills. People will be key in this vision. But so will AI and personalized learning. It will be a collaborative effort.

By the way, I am NOT advocating for using AI to outsource our thinking. Also, having basic facts and background knowledge in a domain is critically important, especially to use AI effectively. But we should be teaching students about AI (as we learn more about it ourselves). We should be working collaboratively with our students to understand how best to use AI. It’s their futures at stake.


 

The number of 18-year-olds is about to drop sharply, packing a wallop for colleges — and the economy — from hechingerreport.org by Jon Marcus
America is about to go over the ‘demographic cliff’

That’s because the current class of high school seniors is the last before a long decline begins in the number of 18-year-olds — the traditional age of students when they enter college.

This so-called demographic cliff has been predicted ever since Americans started having fewer babies at the advent of the Great Recession around the end of 2007 — a falling birth rate that has not recovered since, except for a slight blip after the Covid-19 pandemic, according to the Centers for Disease Control.

Demographers say it will finally arrive in the fall of this year. That’s when recruiting offices will begin to confront the long-anticipated drop-off in the number of applicants from among the next class of high school seniors.

“A few hundred thousand per year might not sound like a lot,” Strohl said. “But multiply that by a decade and it has a big impact.”

From DSC:
I remember seeing graphics about this demographic cliff over a decade ago, so institutions of traditional higher education have seen this coming for many years now (the article references this as well). But it’s still important, and the ramifications could be significant for many colleges and universities (for students, faculty, staff, and administrators).

  • Will there be new business models?
  • More lifelong learning models?
  • Additions to the curricula?

I sure hope so.


Higher Ed’s Governance Problem — from chronicle.com by Brian Rosenberg; via Ryan Craig
Boards are bloated and ineffectual.

According to the Association of Governing Boards of Universities and Colleges, the average size of a private nonprofit college or university board is 28 (larger than a major-league baseball roster), though boards of elite colleges tend to skew even larger: closer to 40, according to a study done by McKinsey.

By way of comparison, the average size of the board of directors of a publicly traded company in the United States is nine. If that seems too “corporate,” consider that the average size of the board of a nonprofit health-care institution is 13…

Still, anyone who studies organizational effectiveness would tell you that college and university boards are much too large, as would almost any college or university president when speaking off the record. Getting 12 people to spend significant time studying serious challenges and then reaching consensus about how to tackle those challenges is a heavy lift. Doing this with 25 or 35 or 45 people is close to impossible.


From Google ads to NFL sponsorships: Colleges throw billions at marketing themselves to attract students — from hechingerreport.org by Jon Marcus
Marketing and branding are getting big budgets and advertising is setting new records

In fact, the sum is small compared to what other colleges and universities are investing in advertising, marketing and promotion, which has been steadily rising and is on track this year to be nearly double what it was last year.

Among the reasons are a steep ongoing decline in enrollment, made worse by the pandemic, and increasing competition from online providers and others.

“Private schools in particular are acutely conscious of the demographics in this country. They’re competing for students, and marketing is how you have to do that.”

John Garvey, president, Catholic University


From DSC:
And for you students out there, check this sound advice out!

© 2025 | Daniel Christian