From GPTs (pt. 3) — from theneurondaily.com by Noah Edelman

BTW, here are a few GPTs worth checking out today:

  • ConvertAnything—convert images, audio, videos, PDFs, files, & more.
  • editGPT—edit any writing (like Grammarly inside ChatGPT).
  • Grimoire—a coding assistant that helps you build anything!

Some notes from Dan Fitzpatrick – The AI Educator:

Custom GPT Bots:

  • These could help create interactive learning assistants aligned with curricula.
  • They can be easily created with natural language programming.
  • Important to note: users must have a paid ChatGPT Plus account.

Custom GPT Store:

  • Marketplace for sharing and accessing educational GPT tools created by other teachers.
  • A store could offer access to specialised tools for diverse learning needs.
  • A store could enhance teaching strategies when accessing proven, effective GPT applications.

From DSC:
I appreciate Dan’s potential menu of options for a child’s education:

Monday AM: Sports club
Monday PM: Synthesis Online School AI Tutor
Tuesday AM: Music Lesson
Tuesday PM: Synthesis Online School Group Work
Wednesday AM: Drama Rehearsal
Wednesday PM: Synthesis Online School AI Tutor
Thursday AM: Volunteer work
Thursday PM: Private study
Friday AM: Work experience
Friday PM: Work experience

Our daughter has special learning needs and this is very similar to what she is doing. 

Also, Dan has a couple of videos out at Google for Education.



Tuesday’s AI Ten for Educators (November 14) — from stefanbauschard.substack.com by Stefan Bauschard
Ten AI developments for educators to be aware of

Two boxes. In my May Cottesmore presentation, I put up two boxes:

(a) Box 1 — How educators can use AI to do what they do now (lesson plans, quizzes, tests, vocabulary lists, etc.)

(b) Box 2 — How the education system needs to change because, in the near future (sort of already), everyone is going to have multiple AIs working with them all day, and the premium on intelligence, especially “knowledge-based” intelligence, is going to decline rapidly. It’s hard to think that significant changes in the education system won’t be needed to accommodate that change.

There is a lot of focus on preparing educators to work in Box 1, which is important, if for no other reason than that they can see the power of even the current but limited technologies, but the hard questions are starting to be about Box 2. I encourage you to start those conversations, as the “ed tech” companies already are, and they’ll be happy to provide the answers and the services if you don’t want to.

Practical suggestions: Two AI teams in your institution. Team 1 works on Box 1 and Team 2 works on Box 2.

 

A future-facing minister, a young inventor and a shared vision: An AI tutor for every student — from news.microsoft.com by Chris Welsch

The Ministry of Education and Pativada see what has become known as the U.A.E. AI Tutor as a way to provide students with 24/7 assistance as well as help level the playing field for those families who cannot afford a private tutor. At the same time, the AI Tutor would be an aid to teachers, they say. “We see it as a tool that will support our teachers,” says Aljughaiman. “This is a supplement to classroom learning.”

If everything goes according to plan, every student in the United Arab Emirates’ school system will have a personal AI tutor, one that fits in their pocket.

It’s a story that involves an element of coincidence, a forward-looking education minister and a tech team led by a chief executive officer who still lives at home with his parents.

In February 2023, the U.A.E.’s education minister, His Excellency Dr. Ahmad Belhoul Al Falasi, announced that the ministry was embracing AI technology and pursuing the idea of an AI tutor to help Emirati students succeed. And he also announced that the speech he presented had been written by ChatGPT. “We should not demonize AI,” he said at the time.



Fostering deep learning in humans and amplifying our intelligence in an AI World — from stefanbauschard.substack.com by Stefan Bauschard
A free 288-page report on advancements in AI and related technology, their effects on education, and our practical support for AI-amplified human deep learning

Six weeks ago, Dr. Sabba Quidwai and I stumbled upon an idea: to compare the deep learning revolution in computer science to the mostly lacking deep learning efforts in education (Mehta & Fine). I started writing, and as these things often go with me, I thought there were many other things that would be useful to think through and for educators to know, and we ended up with this 288-page report.

***

Here’s an abstract from that report:

This report looks at the growing gap between the attention paid to the development of intelligence in machines and humans. While computer scientists have made great strides in developing human intelligence capacities in machines using deep learning technologies, including the abilities of machines to learn on their own, a significant part of the education system has not kept up with developing the intelligence capabilities in people that will enable them to succeed in the 21st century. Instead of fully embracing pedagogical methods that place primary emphasis on promoting collaboration, critical thinking, communication, creativity, and self-learning through experiential, interdisciplinary approaches grounded in human deep learning and combined with current technologies, a substantial portion of the educational system continues to heavily rely on traditional instructional methods and goals. These methods and goals prioritize knowledge acquisition and organization, areas in which machines already perform substantially better than people.

Also from Stefan Bauschard, see:

  • Debating in the World of AI
    Performative assessment, learning to collaborate with humans and machines, and developing special human qualities

13 Nuggets of AI Wisdom for Higher Education Leaders — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
Actionable AI Guidance for Higher Education Leaders

Incentivize faculty AI innovation with AI. 

Invest in people first, then technology. 

On teaching, learning, and assessment. AI has captured the attention of all institutional stakeholders. Capitalize on that attention to reimagine pedagogy and evaluation. Rethink lectures, examinations, and assignments to align with workforce needs. Consider incorporating Problem-Based Learning, building portfolios and proof of work, and conducting oral exams. And use AI to provide individualized support and assess real-world skills.

Actively engage students.


Some thoughts from George Siemens re: AI:

Sensemaking, AI, and Learning (SAIL), a regular look at how AI is impacting learning.

Our education system has a uni-dimensional focus: learning things. Of course, we say we care about developing the whole learner, but the metrics that matter (grades, transcripts) and that underpin the education system are largely focused on teaching students things that have long been Google-able and are now increasingly doable by AI. Developments in AI matter in ways that call into question large parts of what happens in our universities. This is not a statement that people don’t need to learn core concepts and skills. My point is that the fulcrum of learning has shifted. Knowing things will continue to matter less and less going forward as AI improves its capabilities. We’ll need to start intentionally developing broader and broader attributes of learners: metacognition, wellness, affect, social engagement, etc. Education will continue to shift toward human skills and away from primary assessment of knowledge gains disconnected from skills, practice, and ways of being.


AI, the Next Chapter for College Librarians — from insidehighered.com by Lauren Coffey
Librarians have lived through the disruptions of fax machines, websites and Wikipedia, and now they are bracing to do it again as artificial intelligence tools go mainstream: “Maybe it’s our time to shine.”

A few months after ChatGPT launched last fall, faculty and students at Northwestern University had many questions about the building wave of new artificial intelligence tools. So they turned to a familiar source of help: the library.

“At the time it was seen as a research and citation problem, so that led them to us,” said Michelle Guittar, head of instruction and curriculum support at Northwestern University Libraries.

In response, Guittar, along with librarian Jeanette Moss, created a landing page in April, “Using AI Tools in Your Research.” At the time, the university itself had yet to put together a comprehensive resource page.


From Dr. Nick Jackson’s recent post on LinkedIn: 

Last night the Digitech team of junior and senior teachers from Scotch College Adelaide showcased their 2023 experiments, innovations, successes, and failures with technology in education. Accompanied by student digital leaders, we saw the following:

  • AI used for language learning, where avatars can help with accents
  • Motion-capture suits being used in media studies
  • AI used in assessment and automatic grading of work
  • AR used in design technology
  • VR used for immersive Junior school experiences
  • A teacher’s AI toolkit that has changed teaching practice and workflow
  • AR and the EyeJack app used by students to create dynamic artwork
  • VR use in careers education in Senior school
  • How ethics around AI is taught to Junior school students from Year 1
  • Experiments with MyStudyWorks

Almost an Agent: What GPTs can do — from oneusefulthing.org by Ethan Mollick

What would a real AI agent look like? A simple agent that writes academic papers would, after being given a dataset and a field of study, read about how to compose a good paper, analyze the data, conduct a literature review, generate hypotheses, test them, and then write up the results, all without intervention. You put in a request, you get a Word document that contains a draft of an academic paper.

A process kind of like this one:
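As a rough, purely illustrative stand-in for that process, here is a minimal Python sketch of such a hands-off pipeline. The llm() helper is a hypothetical placeholder (not any vendor's actual API), and the prompt chain is an assumption about how such an agent might be wired, not Mollick's specific process:

```python
# Illustrative sketch of a hands-off "paper-writing agent" pipeline.
# llm() is a hypothetical stand-in; swap in a real chat-model client to use it.

def llm(prompt: str) -> str:
    """Placeholder for a chat-model call; returns canned text so the sketch runs."""
    return f"[model output for: {prompt[:60]}...]"

def write_paper(dataset_summary: str, field: str) -> str:
    guidance = llm(f"Summarize how to structure a strong {field} paper.")
    review = llm(f"Write a short literature review for a {field} study on: {dataset_summary}")
    hypotheses = llm(f"Given this data: {dataset_summary}, propose testable hypotheses.")
    results = llm(f"Describe analyses that would test these hypotheses: {hypotheses}")
    return llm(
        "Draft an academic paper.\n"
        f"Structure guidance: {guidance}\n"
        f"Literature review: {review}\n"
        f"Hypotheses and results: {hypotheses} {results}"
    )

print(write_paper("survey of 2,000 teachers on AI use", "education research"))
```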


What I Learned From an Experiment to Apply Generative AI to My Data Course — from edsurge.com by Wendy Castillo

As an educator, I have a duty to remain informed about the latest developments in generative AI, not only to ensure learning is happening, but to stay on top of what tools exist, what benefits and limitations they have, and most importantly, how students might be using them.

However, it’s also important to acknowledge that the quality of work students can now produce calls for higher expectations and potential adjustments to grading practices. The baseline is no longer zero; it is AI. And the upper limit of what humans can achieve with these new capabilities remains an unknown frontier.


Artificial Intelligence in Higher Education: Trick or Treat? — from tytonpartners.com by Kristen Fox and Catherine Shaw

Two components of AI — generative AI and predictive AI

 

What happens to teaching after Covid? — from chronicle.com by Beth McMurtrie

It’s an era many instructors would like to put behind them: black boxes on Zoom screens, muffled discussions behind masks, students struggling to stay engaged. But how much more challenging would teaching during the pandemic have been if colleges did not have experts on staff to help with the transition? On many campuses, teaching-center directors, instructional designers, educational technologists, and others worked alongside professors to explore learning-management systems, master video technology, and rethink what and how they teach.

A new book out this month, Higher Education Beyond Covid: New Teaching Paradigms and Promise, explores this period through the stories of campus teaching and learning centers. Their experiences reflect successes and failures, and what higher education could learn as it plans for the future.

Beth also mentioned/linked to:


How to hold difficult discussions online — from chronicle.com by Beckie Supiano

As usual, our readers were full of suggestions. Kathryn Schild, the lead instructional designer in faculty development and instructional support at the University of Alaska at Anchorage, shared a guide she’s compiled on holding asynchronous discussions, which includes a section on difficult topics.

In an email, Schild also pulled out a few ideas she thought were particularly relevant to Le’s question, including:

  • Set the ground rules as a class. One way to do this is to share your draft rules in a collaborative document and ask students to annotate it and add suggestions.
  • Plan to hold fewer difficult discussions than in a face-to-face class, and work on quality over quantity. This could include multiweek discussions, where you spiral through the same issue with fresh perspectives as the class learns new approaches.
  • Start with relationship-building interactions in the first few weeks, such as introductions, low-stakes group assignments, or peer feedback.
 


Teaching writing in the age of AI — from the Future of Learning (a Hechinger Report newsletter) by Javeria Salman

ChatGPT can produce a perfectly serviceable writing “product,” she said. But writing isn’t a product per se — it’s a tool for thinking, for organizing ideas, she said.

“ChatGPT and other text-based tools can’t think for us,” she said. “There’s still things to learn when it comes to writing because writing is a form of figuring out what you think.”

When students could contrast their own writing to ChatGPT’s more generic version, Levine said, they were able to “understand what their own voice is and what it does.”




Grammarly’s new generative AI feature learns your style — and applies it to any text — from techcrunch.com by Kyle Wiggers; via Tom Barrett

But what about text? Should — and if so, how should — writers be recognized and remunerated for AI-generated works that mimic their voices?

Those are questions that are likely to be raised by a feature in Grammarly, the cloud-based typing assistant, that’s scheduled to launch by the end of the year for subscribers to Grammarly’s business tier. Called “Personalized voice detection and application,” the feature automatically detects a person’s unique writing style and creates a “voice profile” that can rewrite any text in the person’s style.


Is AI Quietly Weaving the Fabric of a Global Classroom Renaissance? — from medium.com by Robert the Robot
In a world constantly buzzing with innovation, a silent revolution is unfolding within the sanctuaries of learning—our classrooms.

From bustling metropolises to serene hamlets, schools across the globe are greeting a new companion—Artificial Intelligence (AI). This companion promises to redefine the essence of education, making learning a journey tailored to each child’s unique abilities.

The advent of AI in education is akin to a gentle breeze, subtly transforming the academic landscape. Picture a classroom where each child, with their distinct capabilities and pace, embarks on a personalized learning path. AI morphs this vision into reality, crafting a personalized educational landscape that celebrates the unique potential harbored within every learner.


AI Books for Educators — from aiadvisoryboards.wordpress.com by Barbara Anna Zielonka

Books have always held a special place in my heart. As an avid reader and AI enthusiast, I have curated a list of books on artificial intelligence specifically tailored for educators. These books delve into the realms of AI, exploring its applications, ethical considerations, and its impact on education. Share your suggestions and let me know which books you would like to see included on this list.


SAIL: ELAI recordings, AI Safety, Near term AI/learning — by George Siemens

We held our fourth online Empowering Learners for the Age of AI conference last week. We sold out at 1500 people (a Whova and budget limit). The recordings/playlist from the conference can now be accessed here.

 

60+ Ideas for ChatGPT Assignments — from stars.library.ucf.edu by Kevin Yee, Kirby Whittington, Erin Doggette, and Laurie Uttich

60+ ideas for using ChatGPT in your assignments today


Artificial intelligence is disrupting higher education — from itweb.co.za by Rennie Naidoo; via GSV
Traditional contact universities need to adapt faster and find creative ways of exploring and exploiting AI, or lose their dominant position.

Higher education professionals have a responsibility to shape AI as a force for good.


Introducing Canva’s biggest education launch — from canva.com
We’re thrilled to unveil our biggest education product launch ever. Today, we’re introducing a whole new suite of products that turn Canva into the all-in-one classroom tool educators have been waiting for.

Also see Canva for Education.
Create and personalize lesson plans, infographics, posters, video, and more. 100% free for teachers and students at eligible schools.


ChatGPT and generative AI: 25 applications to support student engagement — from timeshighereducation.com by Seb Dianati and Suman Laudari
In the fourth part of their series looking at 100 ways to use ChatGPT in higher education, Seb Dianati and Suman Laudari share 25 prompts for the AI tool to boost student engagement


There are two ways to use ChatGPT — from theneurondaily.com

  1. Type to it.
  2. Talk to it (new).


Since then, we’ve looked to it for a variety of real-world business advice. For example, Prof. Ethan Mollick posted a great guide to using ChatGPT-4 with voice as a negotiation instructor.

In a similar fashion, you can consult ChatGPT with voice for feedback on:

  • Job interviews.
  • Team meetings.
  • Business presentations.



Via The Rundown: Google is using AI to analyze the company’s Maps data and suggest adjustments to traffic light timing — aiming to cut driver waits, stops, and emissions.


Google Pixel’s face-altering photo tool sparks AI manipulation debate — from bbc.com by Darren Waters

The camera never lies. Except, of course, it does – and seemingly more often with each passing day.
In the age of the smartphone, digital edits on the fly to improve photos have become commonplace, from boosting colours to tweaking light levels.

Now, a new breed of smartphone tools powered by artificial intelligence (AI) is adding to the debate about what it means to photograph reality.

Google’s latest smartphones released last week, the Pixel 8 and Pixel 8 Pro, go a step further than devices from other companies. They are using AI to help alter people’s expressions in photographs.



From Digital Native to AI-Empowered: Learning in the Age of Artificial Intelligence — from campustechnology.com by Kim Round
The upcoming generation of learners will enter higher education empowered by AI. How can institutions best serve these learners and prepare them for the workplace of the future?

Dr. Chris Dede, of Harvard University and Co-PI of the National AI Institute for Adult Learning and Online Education, spoke about the differences between knowledge and wisdom in AI-human interactions in a keynote address at the 2022 Empowering Learners for the Age of AI conference. He drew a parallel between Star Trek: The Next Generation characters Data and Picard during complex problem-solving: While Data offers the knowledge and information, Captain Picard offers the wisdom and context from a leadership mantle, and determines its relevance, timing, and application.


The Near-term Impact of Generative AI on Education, in One Sentence — from opencontent.org by David Wiley

This “decreasing obstacles” framing turned out to be helpful in thinking about generative AI. When the time came, my answer to the panel question, “how would you summarize the impact generative AI is going to have on education?” was this:

“Generative AI greatly reduces the degree to which access to expertise is an obstacle to education.”

We haven’t even started to unpack the implications of this notion yet, but hopefully just naming it will give the conversation focus, give people something to disagree with, and help the conversation progress more quickly.


How to Make an AI-Generated Film — from heatherbcooper.substack.com by Heather Cooper
Plus, Midjourney finally has a new upscale tool!


Eureka! NVIDIA Research Breakthrough Puts New Spin on Robot Learning — from blogs.nvidia.com by Angie Lee
AI agent uses LLMs to automatically generate reward algorithms to train robots to accomplish complex tasks.

From DSC:
I’m not excited about this, as I can’t help but wonder…how long before the militaries of the world introduce this into their warfare schemes and strategies?


The 93 Questions Schools Should Ask About AI — from edweek.org by Alyson Klein

The toolkit recommends schools consider:

  • Purpose: How can AI help achieve educational goals?
  • Compliance: How does AI fit with existing policies?
  • Knowledge: How can schools advance AI Literacy?
  • Balance: What are the benefits and risks of AI?
  • Integrity: How does AI fit into policies on things like cheating?
  • Agency: How can humans stay in the loop on AI?
  • Evaluation: How can schools regularly assess the impact of AI?
 
 

The Learning & Employment Records (LER) Ecosystem Map — with thanks to Melanie Booth on LinkedIn for this resource
Driving Opportunity and Equity Through Learning & Employment Records

The Learning & Employment Records (LER) Ecosystem Map

Imagine A World Where…

  • Everyone is empowered to access learning and earning opportunities based on what they know and can do, whether those skills and abilities are obtained through degrees, work experiences, or independent learning.
  • People can capture and communicate the skills and competencies they’ve acquired across their entire learning journey — from education, experience and service — with more ease, confidence, and clarity than a traditional resume.
  • Learners and earners control their information and can curate their skills to take advantage of every opportunity they are truly qualified to pursue, opening up pathways that help address systemic inequities.
  • Employers can tap into a wider talent pool and better match applicants to opportunities with verifiable credentials that represent skills, competencies, and achievements.

This is the world that we believe can be created by Learning and Employment Records (LERs), i.e., digital records of learning and work experiences that are linked to and controlled by learners and earners. An interoperable, well-governed LER ecosystem has the potential to transform the future of work so that it is more equitable, efficient, and effective for everyone involved — individuals, training and education providers, employers, and policymakers.
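No particular data format is specified above; purely as an illustration of what a learner-controlled record might contain (field names are assumptions for this sketch, not any published LER or verifiable-credential standard), a minimal record could look like this:

```python
# Illustrative only: a minimal learner-controlled learning/employment record.
# Field names are assumptions for this sketch, not a published LER standard.
from dataclasses import dataclass, field

@dataclass
class LearningEmploymentRecord:
    holder: str                      # the learner/earner who controls the record
    issuer: str                      # school, employer, or training provider
    achievement: str                 # degree, credential, or demonstrated skill
    skills: list[str] = field(default_factory=list)
    evidence_url: str | None = None  # optional link to proof of work
    issuer_verified: bool = False    # whether the issuer has attested to the record

record = LearningEmploymentRecord(
    holder="Jordan Example",
    issuer="Example Community College",
    achievement="Data Analytics Certificate",
    skills=["SQL", "data visualization"],
)
print(record)
```

An interoperable ecosystem would hinge less on any one schema than on issuers, learners, and employers agreeing on how records like this are exchanged and verified.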


Also per Melanie Booth, see:

 

Thinking with Colleagues: AI in Education — from campustechnology.com by Mary Grush
A Q&A with Ellen Wagner

Wagner herself recently relied on the power of collegial conversations to probe the question: What’s on the minds of educators as they make ready for the growing influence of AI in higher education? CT asked her for some takeaways from the process.

We are in the very early days of seeing how AI is going to affect education. Some of us are going to need to stay focused on the basic research to test hypotheses. Others are going to dive into laboratory “sandboxes” to see if we can build some new applications and tools for ourselves. Still others will continue to scan newsletters like ProductHunt every day to see what kinds of things people are working on. It’s going to be hard to keep up, to filter out the noise on our own. That’s one reason why thinking with colleagues is so very important.

Mary and Ellen linked to “What Is Top of Mind for Higher Education Leaders about AI?” — from northcoasteduvisory.com. Below are some excerpts from those notes:

We are interested in how K-12 education will change in terms of foundational learning. With in-class, active learning designs, will younger students do a lot more intensive building of foundational writing and critical thinking skills before they get to college?

  1. The Human in the Loop: AI is built using math: think of applied statistics on steroids. Humans will be needed more than ever to manage, review and evaluate the validity and reliability of results. Curation will be essential.
  2. We will need to generate ideas about how to address AI factors such as privacy, equity, bias, copyright, intellectual property, accessibility, and scalability.
  3. Have other institutions experimented with AI detection, and/or held off on emerging tools related to this? We have just recently adjusted guidance and paused some tools in this area, given the massive inaccuracies in detection (and related downstream issues in faculty-elevated conduct cases).

Even though we learn repeatedly that innovation has a lot to do with effective project management and a solid message that helps people understand what they can do to implement change, people really need innovation to be more exciting and visionary than that.  This is the place where we all need to help each other stay the course of change. 


Along these lines, also see:


What people ask me most. Also, some answers. — from oneusefulthing.org by Ethan Mollick
A FAQ of sorts

I have been talking to a lot of people about Generative AI, from teachers to business executives to artists to people actually building LLMs. In these conversations, a few key questions and themes keep coming up over and over again. Many of those questions are more informed by viral news articles about AI than about the real thing, so I thought I would try to answer a few of the most common, to the best of my ability.

I can’t blame people for asking because, for whatever reason, the companies actually building and releasing Large Language Models often seem allergic to providing any sort of documentation or tutorial besides technical notes. I was given much better documentation for the generic garden hose I bought on Amazon than for the immensely powerful AI tools being released by the world’s largest companies. So, it is no surprise that rumor has been the way that people learn about AI capabilities.

Currently, there are only really three AIs to consider: (1) OpenAI’s GPT-4 (which you can get access to with a Plus subscription or via Microsoft Bing in creative mode, for free), (2) Google’s Bard (free), or (3) Anthropic’s Claude 2 (free, but paid mode gets you faster access). As of today, GPT-4 is the clear leader, Claude 2 is second best (but can handle longer documents), and Google trails, but that will likely change very soon when Google updates its model, which is rumored to be happening in the near future.

 

The Public Is Giving Up on Higher Ed — from chronicle.com by Michael D. Smith
Our current system isn’t working for society. Digital alternatives can change that.

Excerpts:

I fear that we in the academy are willfully ignoring this problem. Bring up student-loan debt and you’ll hear that it’s the government’s fault. Bring up online learning and you’ll hear that it is — and always will be — inferior to in-person education. Bring up exclusionary admissions practices and you’ll hear something close to, “Well, the poor can attend community colleges.”

On one hand, our defensiveness is natural. Change is hard, and technological change that risks making traditional parts of our sector obsolete is even harder. “A professor must have an incentive to adopt new technology,” a tenured colleague recently told me regarding online learning. “Innovation adoption will occur one funeral at a time.”

But while our defense of the status quo is understandable, maybe we should ask whether it’s ethical, given what we know about the injustice inherent in our current system. I believe a happier future for all involved — faculty, administrators, and students — is within reach, but requires we stop reflexively protecting our deeply flawed system. How can we do that? We could start by embracing three fundamental principles.

1. Digitization will change higher education.

2. We should want to embrace this change.

3. We have a way to embrace this change.


 

 

US Higher Education Needs a Revolution. What’s Holding It Back? — from bloomberg.com by Tyler Cowen
Not only do professors need to change how they teach, but universities need to change how they evaluate them.

When the revolution in higher education finally arrives, how will we know? I have a simple metric: When universities change how they measure faculty work time. Using this yardstick, the US system remains very far from a fundamental transformation.

But today’s education system is dynamic, and needs to become even more so. There is already the internet, YouTube, and a flurry of potential innovations coming from AI. If professors really are a society’s best minds, shouldn’t they be working to improve the entire educational process, not just punching the equivalent of a time clock at a university?

Such a change would require giving them credit for innovations, which in turn would require a broader conception of their responsibilities. 


Citing Significant Budget Deficits, Several Colleges Face Cuts — from insidehighered.com by Doug Lederman
The affected institutions include Christian Brothers, Delta State, Lane Community College, Miami University, St. Norbert and Shepherd.

Numerous colleges and universities, public and private, announced in recent days that they face significant budget deficits that will require cuts to programs and employees.

Many of the institutions appear to have been motivated by fall enrollment numbers that did not meet their expectations, in most cases representing a failure to recover from record low enrollments during the pandemic. Others cited the lingering effects on enrollment and budgets from COVID-19, exacerbated by the end of federal relief funds.


How universities can adopt a lifelong learning mindset: Lifelong learning that will last — from timeshighereducation.com by various authors
How the traditional university degree can be reimagined as a lifelong educational journey, enabling students to upskill and reskill throughout their lives

The rapid evolution of the workplace and changing skills demands are driving calls for better lifelong learning provision. For universities, this means re-examining traditional teaching practices and course design to ensure that students can benefit from continuing education throughout their careers. It requires more flexible, accessible, bite-sized learning that can be completed in tandem with other professional and personal commitments. But how can this be offered in a coherent, joined-up way without sacrificing quality? From Moocs to microcredentials, these resources offer advice and insight into how lifelong learning opportunities can be developed and improved for future generations.


The College Backlash Is Going Too Far — from theatlantic.com by David Deming; via Matthew Tower who also expresses his concerns re: this article from The Chronicle
Getting a four-year degree is still a good investment. 

American higher education certainly has its problems. But the bad vibes around college threaten to obscure an important economic reality: Most young people are still far better off with a four-year college degree than without one.

Historically, analysis of higher education’s value tends to focus on the so-called college wage premium. That premium has always been massive—college graduates earn much more than people without a degree, on average—but it doesn’t take into account the cost of getting a degree. So the St. Louis Fed researchers devised a new metric, the college wealth premium, to try to get a more complete picture.

But the long-term value of a bachelor’s degree is much greater than it initially appears. If a college professor or pundit tries to convince you otherwise, ask them what they would choose for their own children.

From DSC:
David’s last quote here is powerful and likely true. But that doesn’t mean that we should disregard trying to get the cost of obtaining a degree down by 50% or more. There are still way too many people struggling with student loans — and they have been for DECADES. And others will be joining these same financial struggles — again, for DECADES to come.


Johns Hopkins aims to address teacher shortage with new master’s residency option — from hub.jhu.edu ; via Matthew Tower

The School of Education’s TeachingWell program will provide professional, financial support for applicants looking to start long-term careers in teaching

Students in TeachingWell will earn the Master of Education for Teaching Professionals in four semesters at Johns Hopkins and gain Maryland state teacher certification along with real-world teaching experience—all made stronger by ongoing mentoring, life design, and teacher wellness programs through the university.

“We will focus on teacher well-being and life-design skills that address burnout and mental health concerns that are forcing too many teachers out of the profession,” says Mary Ellen Beaty-O’Ferrall, associate professor at the School of Education and faculty director of TeachingWell. “We want teachers with staying power—effective and financially stable educators with strong personal well-being.”


How to Build Stackable Credentials — from insidehighered.com by Lindsay Daugherty , Peter Nguyen , Jonah Kushner and Peter Riley Bahr
Five actions states and colleges are taking.

Stackable credentials are a top priority for many states and colleges these days. The term can be used to mean different things, from college efforts to embed short-term credentials into their degree programs to larger-scale efforts to rethink the way credentialing is done through alternative approaches, like skills badges. The goals of these initiatives are twofold: (1) to ensure individuals can get credit for a range of different learning experiences and better integrate these different types of learning, and (2) to better align our education and training systems with workforce needs, which often require reskilling through training and credentials below the bachelor’s degree level.

 

As AI Chatbots Rise, More Educators Look to Oral Exams — With High-Tech Twist — from edsurge.com by Jeffrey R. Young

To use Sherpa, an instructor first uploads the reading they’ve assigned, or they can have the student upload a paper they’ve written. Then the tool asks a series of questions about the text (either questions input by the instructor or generated by the AI) to test the student’s grasp of key concepts. The software gives the instructor the choice of whether they want the tool to record audio and video of the conversation, or just audio.

The tool then uses AI to transcribe the audio from each student’s recording and flags areas where the student answer seemed off point. Teachers can review the recording or transcript of the conversation and look at what Sherpa flagged as trouble to evaluate the student’s response.
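The excerpt doesn’t describe how Sherpa decides an answer seems off point. As a hedged illustration only (cosine similarity over TF-IDF vectors and the 0.15 threshold are assumptions, not Sherpa’s actual method), the flagging step could be sketched like this:

```python
# Illustrative only: flag answer segments with low lexical overlap with the source text.
# The method (TF-IDF + cosine similarity) and the threshold are assumptions, not Sherpa's.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def flag_off_point(source_text: str, answer_segments: list[str], threshold: float = 0.15):
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform([source_text] + answer_segments)
    similarities = cosine_similarity(matrix[0:1], matrix[1:]).flatten()
    return [
        (segment, round(float(score), 3))
        for segment, score in zip(answer_segments, similarities)
        if score < threshold
    ]

flags = flag_off_point(
    "The assigned chapter argues that photosynthesis converts light energy into chemical energy.",
    ["Plants turn light into chemical energy.", "My favorite sport is basketball."],
)
print(flags)  # expect only the off-topic segment to be flagged
```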

 

The Enemy Within: Former College Presidents Offer Warnings — from forbes-com.cdn.ampproject.org by David Rosowsky; via Robert Gibson on LinkedIn

Excerpt (emphasis DSC):

Brian Mitchell, former president of Bucknell University and Washington & Jefferson College, draws on his experience to offer insight in his newest Forbes contribution. He also offers a stern warning: “Boards, administrators, and faculty must wake up to the new realities they now face… the faculty can no longer live in a world that no longer exists… institutional change will happen at a speed to which they are unaccustomed and potentially unwilling to accept.” President Mitchell then goes on to offer some immediate steps that can be taken. Perhaps the most important is to “abandon the approach to governance where trustees are updated in their periodic board meetings.”

Incremental change is possible, but transformational change may not be.

Therein lies the conundrum about which Rosenberg writes in his new book. Higher ed’s own systems are inhibiting needed transformational change.

Also just published was the book, “Whatever It Is, I’m Against It: Resistance to Change in Higher Education” by Brian Rosenberg, former president of Macalester College. Articles on Rosenberg’s observations, analysis, and cautions have appeared this month in both The Chronicle of Higher Education and Inside Higher Ed, the two leading higher education publications in the US.


Addendum on 10/6/23:

Higher Education as Its Own Worst Enemy — from insidehighered.com/ by Susan H. Greenberg
In a wide-ranging discussion about his new book, Brian Rosenberg explains how shared governance, tenure and other practices stifle change on college campuses.

He argues that the institutions designed to foster critical inquiry and the open exchange of ideas are themselves staunchly resistant to both. 

The other would be some serious thinking about pedagogy and how students learn. Because the research is there if people were willing to take it seriously and think about ways of providing an education that is not quite as reliant upon lots of faculty with Ph.D.s. Is that easy to do? No, but it is something that I think there should at least begin to be some serious discussions about.

Shared governance is one of those things that if you ask any college president off the record, they’ll probably express their frustration, then they’ll go back to their campus and wax poetic about the wonders of shared governance, because that’s what they have to do to survive.

 

Student Use Cases for AI: Start by Sharing These Guidelines with Your Class — from hbsp.harvard.edu by Ethan Mollick and Lilach Mollick

To help you explore some of the ways students can use this disruptive new technology to improve their learning—while making your job easier and more effective—we’ve written a series of articles that examine the following student use cases:

  1. AI as feedback generator
  2. AI as personal tutor
  3. AI as team coach
  4. AI as learner

Recap: Teaching in the Age of AI (What’s Working, What’s Not) — from celt.olemiss.edu by Derek Bruff, visiting associate director

Earlier this week, CETL and AIG hosted a discussion among UM faculty and other instructors about teaching and AI this fall semester. We wanted to know what was working when it came to policies and assignments that responded to generative AI technologies like ChatGPT, Google Bard, Midjourney, DALL-E, and more. We were also interested in hearing what wasn’t working, as well as questions and concerns that the university community had about teaching and AI.


Teaching: Want your students to be skeptical of ChatGPT? Try this. — from chronicle.com by Beth McMurtrie

Then, in class he put them into groups where they worked together to generate a 500-word essay on “Why I Write” entirely through ChatGPT. Each group had complete freedom in how they chose to use the tool. The key: They were asked to evaluate their essay on how well it offered a personal perspective and demonstrated a critical reading of the piece. Weiss also graded each ChatGPT-written essay and included an explanation of why he came up with that particular grade.

After that, the students were asked to record their observations on the experiment on the discussion board. Then they came together again as a class to discuss the experiment.

Weiss shared some of his students’ comments with me (with their approval). Here are a few:


2023 EDUCAUSE Horizon Action Plan: Generative AI — from library.educause.edu by Jenay Robert and Nicole Muscanell

Asked to describe the state of generative AI that they would like to see in higher education 10 years from now, panelists collaboratively constructed their preferred future.


Will Teachers Listen to Feedback From AI? Researchers Are Betting on It — from edsurge.com by Olina Banerji

Julie York, a computer science and media teacher at South Portland High School in Maine, was scouring the internet for discussion tools for her class when she found TeachFX. An AI tool that takes recorded audio from a classroom and turns it into data about who talked and for how long, it seemed like a cool way for York to discuss issues of data privacy, consent and bias with her students. But York soon realized that TeachFX was meant for much more.

York found that TeachFX listened to her very carefully, and generated a detailed feedback report on her specific teaching style. York was hooked, in part because she says her school administration simply doesn’t have the time to observe teachers while tending to several other pressing concerns.

“I rarely ever get feedback on my teaching style. This was giving me 100 percent quantifiable data on how many questions I asked and how often I asked them in a 90-minute class,” York says. “It’s not a rubric. It’s a reflection.”

TeachFX is easy to use, York says. It’s as simple as switching on a recording device.

But TeachFX, she adds, is focused not on her students’ achievements, but instead on her performance as a teacher.
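TeachFX’s internals aren’t described in the piece, but the headline metric (who talked, and for how long) is straightforward to picture. A minimal sketch, assuming you already have a diarized transcript (segments labeled with a speaker and start/end times; the diarization itself would come from a separate speech-processing step), might look like this:

```python
# Illustrative only: compute talk-time shares from a pre-diarized class recording.
# Each segment is (speaker_label, start_seconds, end_seconds).
from collections import defaultdict

def talk_time_shares(segments: list[tuple[str, float, float]]) -> dict[str, float]:
    totals: dict[str, float] = defaultdict(float)
    for speaker, start, end in segments:
        totals[speaker] += max(0.0, end - start)
    grand_total = sum(totals.values()) or 1.0
    return {speaker: seconds / grand_total for speaker, seconds in totals.items()}

# A 90-minute class boiled down to three segments, for illustration.
shares = talk_time_shares([
    ("teacher", 0, 1800),
    ("students", 1800, 2400),
    ("teacher", 2400, 5400),
])
print({speaker: f"{share:.0%}" for speaker, share in shares.items()})
```

A report like the one York describes would layer question counts and timing on top of numbers like these.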


ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds — from the74million.org by Mark Keierleber
While educators worry that students are using generative AI to cheat, a new report finds students are turning to the tool more for personal problems.

Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat.


Embracing weirdness: What it means to use AI as a (writing) tool — from oneusefulthing.org by Ethan Mollick
AI is strange. We need to learn to use it.

But LLMs are not Google replacements, or thesauruses or grammar checkers. Instead, they are capable of so much more weird and useful help.


Diving Deep into AI: Navigating the L&D Landscape — from learningguild.com by Markus Bernhardt

The prospect of AI-powered, tailored, on-demand learning and performance support is exhilarating: It starts with traditional digital learning made into fully adaptive learning experiences, which would adjust to strengths and weaknesses for each individual learner. The possibilities extend all the way through to simulations and augmented reality, an environment to put into practice knowledge and skills, whether as individuals or working in a team simulation. The possibilities are immense.

Thanks to generative AI, such visions are transitioning from fiction to reality.


Video: Unleashing the Power of AI in L&D — from drphilippahardman.substack.com by Dr. Philippa Hardman
An exclusive video walkthrough of my keynote at Sweden’s national L&D conference this week

Highlights

  • The wicked problem of L&D: last year, $371 billion was spent on workplace training globally, but only 12% of employees apply what they learn in the workplace
  • An innovative approach to L&D: when Mastery Learning is used to design & deliver workplace training, the rate of “transfer” (i.e. behaviour change & application) is 67%
  • AI 101: quick summary of classification, generative and interactive AI and its uses in L&D
  • The impact of AI: my initial research shows that AI has the potential to scale Mastery Learning and, in the process:
    • reduce the “time to training design” by 94% > faster
    • reduce the cost of training design by 92% > cheaper
    • increase the quality of learning design & delivery by 96% > better
  • Research also shows that the vast majority of workplaces are using AI only to “oil the machine” rather than innovate and improve our processes & practices
  • Practical tips: how to get started on your AI journey in your company, and a glimpse of what L&D roles might look like in a post-AI world

 

Preparing Students for the AI-Enhanced Workforce — from insidehighered.com by Ray Schroeder
Our graduating and certificate-completing students need documented generative AI skills, and they need them now.

The common adage repeated again and again is that AI will not take your job; a person with AI skills will replace you. The learners we are teaching this fall who will be entering, re-entering or seeking advancement in the workforce at the end of the year or in the spring must become demonstrably skilled in using generative AI. The vast majority of white-collar jobs will demand the efficiencies and flexibilities defined by generative AI now and in the future. As higher education institutions, we will be called upon to document and validate generative AI skills.


AI image generators: 10 tools, 10 classroom uses — from ditchthattextbook.com by Matt Miller



A Majority of New Teachers Aren’t Prepared to Teach With Technology. What’s the Fix? — from edweek.org by Alyson Klein

Think all incoming teachers have a natural facility with technology just because most are digital natives? Think again.

Teacher preparation programs have a long way to go in preparing prospective educators to teach with technology, according to a report released September 12 by the International Society for Technology in Education, a nonprofit.

In fact, more than half of incoming teachers—56 percent—lack confidence in using learning technology prior to entering the classroom, according to survey data included with the report.


5 Actual Use Cases of AI in Education: Newsletter #68 — from transcend.substack.com by Alberto Arenaza
What areas has AI truly impacted educators, learners & workers?

  1. AI Copilot for educators, managers and leaders
  2. Flipped classroom chatbots
  3. AI to assess complex answers
  4. AI as a language learning tool
  5. AI to brainstorm ideas

AI-Powered Higher Ed — from drphilippahardman.substack.com by  Dr. Philippa Hardman
What a House of Commons round table discussion tells us about how AI will impact the purpose of higher education

In this week’s blog post I’ll summarise the discussion and share what we agreed would be the most likely new model of assessment in HE in the post-AI world.

But this in turn raises a bigger question: why do people go to university, and what is the role of higher education in the twenty-first century? Is it to create the workforce of the future? Or an institution for developing deep and original domain expertise? Can and should it be both?


How To Develop Computational Thinkers — from iste.org by Jorge Valenzuela

In my previous position with Richmond Public Schools, we chose to dive in with computational thinking, programming and coding, in that order. I recommend building computational thinking (CT) competency first by helping students recognize and apply the four elements of CT to familiar problems/situations. Computational thinking should come first because it’s the highest order of problem-solving, is a cross-curricular skill and is understandable to both machines and humans. Here are the four components of CT and how to help students understand them.
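The four elements referenced above are conventionally given as decomposition, pattern recognition, abstraction, and algorithm design; Valenzuela’s article may label them differently. As a small, hedged classroom-style illustration, here is a familiar task with comments marking where each element shows up:

```python
# Illustrative only: a familiar task (averaging quiz scores) annotated with
# the four conventional computational-thinking elements.

def average_quiz_scores(gradebook: dict[str, list[float]]) -> dict[str, float]:
    # Decomposition: split "report class performance" into one subproblem per student.
    averages: dict[str, float] = {}
    for student, scores in gradebook.items():
        # Pattern recognition: every student's average is computed the same way.
        # Abstraction: ignore quiz dates and topics; keep only the scores.
        averages[student] = sum(scores) / len(scores) if scores else 0.0
    # Algorithm design: the loop above is a repeatable, step-by-step procedure.
    return averages

print(average_quiz_scores({"Ada": [90.0, 95.0], "Grace": [88.0, 92.0, 79.0]}))
```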

 

Next, The Future of Work is… Intersections — from linkedin.com by Gary A. Bolles; via Roberto Ferraro

So much of the way that we think about education and work is organized into silos. Sure, that’s one way to ensure a depth of knowledge in a field and to encourage learners to develop mastery. But it also leads to domains with strict boundaries. Colleges are typically organized into school sub-domains, managed like fiefdoms, with strict rules for professors who can teach in different schools.

Yet it’s at the intersections of seemingly-disparate domains where breakthrough innovation can occur.

Maybe intersections bring a greater chance of future work opportunity, because that young person can increase their focus in one arena or another as they discover new options for work — and because this is what meaningful work in the future is going to look like.

From DSC:
This posting strikes me as an endorsement for interdisciplinary degrees. I agree with much of this. It’s just hard to find the right combination of disciplines. But I suppose that depends upon the individual student and what he/she is passionate or curious about.


Speaking of the future of work, also see:

Centaurs and Cyborgs on the Jagged Frontier — from oneusefulthing.org by Ethan Mollick
I think we have an answer on whether AIs will reshape work…

A lot of people have been asking if AI is really a big deal for the future of work. We have a new paper that strongly suggests the answer is YES.

Consultants using AI finished 12.2% more tasks on average, completed tasks 25.1% more quickly, and produced 40% higher quality results than those without. Those are some very big impacts. Now, let’s add in the nuance.

 
© 2025 | Daniel Christian