Beyond the Hype: Taking a 50 Year Lens to the Impact of AI on Learning — from nafez.substack.com by Nafez Dakkak and Chris Dede
How do we make sure LLMs are not “digital duct tape”?

[Per Chris Dede] We often think of the product of teaching as the outcome (e.g. an essay, a drawing, etc.). The essence of education, in my view, lies not in the products or outcomes of learning but in the journey itself. The artifact is just a symbol that you’ve taken the journey.

The process of learning — the exploration, challenges, and personal growth that occur along the way — is where the real value lies. For instance, the act of writing an essay is valuable not merely for the final product but for the intellectual journey it represents. It forces you to improve and organize your thinking on a subject.

This distinction becomes important with the rise of generative AI, because it uniquely allows us to produce these artifacts without taking the journey.

As I’ve argued previously, I am worried that all this hype around LLMs renders them a “type of digital duct-tape to hold together an obsolete industrial-era educational system”. 


Speaking of AI in our learning ecosystems, also see:


On Building an AI Policy for Teaching & Learning — by Lance Eaton
How students drove the development of a policy for students and faculty

Well, last month, the policy was finally approved by our Faculty Curriculum Committee and we can finally share the final version: AI Usage Policy. College Unbound also created (all-human, no AI used) a press release with the policy and some of the details.

To ensure you see this:

  • Usage Guidelines for AI Generative Tools at College Unbound
    These guidelines were created and reviewed by College Unbound students in Spring 2023 with the support of Lance Eaton, Director of Faculty Development & Innovation.  The students include S. Fast, K. Linder-Bey, Veronica Machado, Erica Maddox, Suleima L., Lora Roy.

ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds — from psypost.org by Eric W. Dolan

A recent study has found that scientific citations generated by ChatGPT often do not correspond to real academic work. The study, published in the Canadian Psychological Association’s Mind Pad, found that “false citation rates” across various psychology subfields ranged from 6% to 60%. Surprisingly, these fabricated citations feature elements such as legitimate researchers’ names and properly formatted digital object identifiers (DOIs), which could easily mislead both students and researchers.

MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate — such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals.
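
From DSC:
If you want to guard against this in your own work, one low-tech safeguard is to check whether a citation's DOI actually resolves. Below is a minimal sketch (my own illustration, not something from the study) that looks each DOI up against the public Crossref REST API using Python's requests library. A DOI that resolves is not proof the citation is accurate, since the authors, title, and journal still need to match, but a DOI that does not resolve is a red flag. The example entries are placeholders.

```python
# Minimal sketch: flag AI-generated citations whose DOIs do not resolve in Crossref.
# A resolving DOI is not proof the citation is accurate, but a non-resolving DOI
# is a strong hint that the reference may be hallucinated.
import requests

def doi_exists(doi: str) -> bool:
    """Return True if Crossref knows about this DOI."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    return resp.status_code == 200

# Example DOIs (placeholders): the first is a real paper, the second is invented.
citations = {
    "10.1037/0003-066X.55.1.68": "Ryan & Deci (2000), American Psychologist",
    "10.1234/made.up.2023.001": "A hypothetical AI-generated reference",
}

for doi, label in citations.items():
    verdict = "resolves in Crossref" if doi_exists(doi) else "does NOT resolve (check carefully)"
    print(f"{label}: {verdict}")
```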

 

AI RESOURCES AND TEACHING (Kent State University) — from aiadvisoryboards.wordpress.com

Kent State University’s AI Resources and Teaching page offers valuable resources for educators interested in incorporating artificial intelligence (AI) into their teaching practices. The university recognizes that the rapid emergence of AI tools presents both challenges and opportunities in higher education.

The AI Resources and Teaching page provides educators with information and guidance on various AI tools and their responsible use within and beyond the classroom. The page covers different areas of AI application, including language generation, visuals, videos, music, information extraction, quantitative analysis, and AI syllabus language examples.


A Cautionary AI Tale: Why IBM’s Dazzling Watson Supercomputer Made a Lousy Tutor — from the74million.org by Greg Toppo
With a new race underway to create the next teaching chatbot, IBM’s abandoned 5-year, $100M ed push offers lessons about AI’s promise and its limits.

For all its jaw-dropping power, Watson the computer overlord was a weak teacher. It couldn’t engage or motivate kids, inspire them to reach new heights or even keep them focused on the material — all qualities of the best mentors.

It’s a finding with some resonance to our current moment of AI-inspired doomscrolling about the future of humanity in a world of ascendant machines. “There are some things AI is actually very good for,” Nitta said, “but it’s not great as a replacement for humans.”

His five-year journey to essentially a dead-end could also prove instructive as ChatGPT and other programs like it fuel a renewed, multimillion-dollar experiment to, in essence, prove him wrong.

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

From DSC:
This is why the vision that I’ve been tracking and working on has always said that HUMAN BEINGS will be necessary — they are key to realizing this vision. Along these lines, here’s a relevant quote:

Another crucial component of a new learning theory for the age of AI would be the cultivation of “blended intelligence.” This concept recognizes that the future of learning and work will involve the seamless integration of human and machine capabilities, and that learners must develop the skills and strategies needed to effectively collaborate with AI systems. Rather than viewing AI as a threat to human intelligence, a blended intelligence approach seeks to harness the complementary strengths of humans and machines, creating a symbiotic relationship that enhances the potential of both.

Per Alexander “Sasha” Sidorkin, Head of the National Institute on AI in Society at California State University Sacramento.

 

AWS, Educause partner on generative AI readiness tool — from edscoop.com by Skylar Rispens
Amazon Web Services and the nonprofit Educause announced a new tool designed to help higher education institutions gauge their readiness to adopt generative artificial intelligence.

Amazon Web Services and the nonprofit Educause on Monday announced they’ve teamed up to develop a tool that assesses how ready higher education institutions are to adopt generative artificial intelligence.

Through a series of curated questions about institutional strategy, governance, capacity and expertise, AWS and Educause claim their assessment can point to ways that operations can be improved before generative AI is adopted to support students and staff.

“Generative AI will transform how educators engage students inside and outside the classroom, with personalized education and accessible experiences that provide increased student support and drive better learning outcomes,” Kim Majerus, vice president of global education and U.S. state and local government at AWS, said in a press release. “This assessment is a practical tool to help colleges and universities prepare their institutions to maximize this technology and support students throughout their higher ed journey.”


Speaking of AI and our learning ecosystems, also see:

Gen Z Wants AI Skills And Businesses Want Workers Who Can Apply AI: Higher Education Can Help — from forbes.com by Bruce Dahlgren

At a moment when the value of higher education has come under increasing scrutiny, institutions around the world can be exactly what learners and employers both need. To meet the needs of a rapidly changing job market and equip learners with the technical and ethical direction needed to thrive, institutions should familiarize students with the use of AI and nurture the innately human skills needed to apply it ethically. Failing to do so can create enormous risk for higher education, business and society.

What is AI literacy?
To effectively utilize generative AI, learners will need to grasp the appropriate use cases for these tools, understand when their use presents significant downside risk, and learn to recognize abuse to separate fact from fiction. AI literacy is a deeply human capacity. The critical thinking and communication skills required are muscles that need repeated training to be developed and maintained.

 

The New Academic Arms Race | Competition over amenities is over. The next battleground is technology. — from chronicle.com by Jeffrey J. Selingo

Now, after the pandemic, with the value of the bachelor’s degree foremost in the minds of students and families, a new academic arms race is emerging. This one is centered around academic innovation. The winners will be those institutions that in the decade ahead better apply technology in teaching and learning and develop different approaches to credentialing.

Sure, technology is often seen as plumbing on campuses — as long as it works, we don’t worry about it. And rarely do prospective students on a tour ever ask about academic innovations like extended reality or microcredentials. Campus tours prefer to show off the bells and whistles of residential life within dorms and dining halls.

That’s too bad.

The problem is not a lack of learners, but rather a lack of alignment in what colleges offer to a generation of learners surrounded by Amazon, Netflix, and Instagram, where they can stream entertainment and music anytime, anywhere.

From DSC:
When I worked for Calvin (then College, now University) from 2007-2017, that’s exactly how technologies and the entire IT Department were viewed — as infrastructure providers. We were not viewed as being able to enhance the core business/offerings of the institution. We weren’t relevant in that area. In fact, the IT Department was shoved down in the basement of the library. Our Teaching & Learning Digital Studio was sidelined in a part of the library where few students went to. The Digitial Studio’s marketing efforts didn’t help much, as faculty members didn’t offer assignments that called for multimedia-based deliverables. It was a very tough and steep hill to climb.

Also, the Presidents and Provosts over the last couple of decades (though not the current ones) didn’t think much of online-based learning, and the top administrators dissed the Internet’s ability to provide 24/7, worldwide conversations and learning. They missed the biggest thing to come along in education in 500 years (since the invention of the printing press). Our Teaching & Learning Group provided leadership by starting a Calvin Online pilot. We had 13-14 courses built, and inquiries were coming in from Christian-based high schools for dual-enrollment scenarios, but when it came time for the College to make a decision, it never happened. The topic/vote never made it to the floor of the Faculty Senate. The faculty and administration missed an enormous opportunity.

When Calvin College became Calvin University in 2019, they were forced to offer online-based classes. Had they supported our T&L Group’s efforts back in the early to mid-2010s, they would have dovetailed very nicely into offering more courses to working adults. They would have built up the internal expertise to offer these courses/programs. But the culture of the college put a stop to online-based learning at that time. I’m sure they now regret that decision (as they’ve had to outsource many things, and they now offer numerous online-based courses and even entire programs — most likely at a high cost).

My how times have changed.


For another item re: higher education at the 30,000-foot level, see:


Lifelong Learning Models for a Changing Higher Ed Marketplace — from changinghighered.com by Dr. Drumm McNaughton and Amrit Ahluwalia
Exploring the transformation of higher education into lifelong learning hubs for workforce development, with innovative models and continuing education’s role.

Higher education is undergoing transformational change to redefine its role as a facilitator of lifelong learning and workforce development. In this 200th episode of Changing Higher Ed, host Dr. Drumm McNaughton and guest Amrit Ahluwalia, incoming Executive Director for Continuing Studies at Western University, explore innovative models positioning universities as sustainable hubs for socioeconomic mobility.

The Consumer-Driven Educational Landscape
Over 60% of today’s jobs will be redefined by 2025, driving demand for continuous upskilling and reskilling to meet evolving workforce needs. However, higher education’s traditional model of imparting specific knowledge through multi-year degrees is hugely misaligned with this reality.

Soaring education costs have fueled a consumer mindset shift, with learners demanding a clear return on investment directly aligned with their career goals. The expectation is to see immediate skills application and professional impact from their educational investments, not just long-term outcomes years after completion.


 

How Generative AI Owns Higher Education. Now What? — from forbes.com by Steve Andriole

Excerpt (emphasis DSC):

What about course videos? Professors can create them (by lecturing into a camera for several hours, hopefully in different clothes) from the readings, from their interpretations of the readings, from their own case experiences – from anything they like. But now professors can direct the creation of the videos by talking – actually describing – to a CustomGPT about what they’d like the video to communicate, with their own or another image. Wait. What? They can make a video by talking to a CustomGPT and even select the image they want the “actor” to use? Yes. They can also add a British accent and insert some (GenAI-developed) jokes into the videos if they like. All this and much more is now possible. This means that a professor can specify how long the video should be, what sources should be consulted, and describe the demeanor the professor wants the video to project.

From DSC:
Though I wasn’t crazy about the clickbait type of title here, I still thought that the article was solid and thought-provoking. It contained several good ideas for using AI.
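
To make the scenario above a bit more concrete, here is a rough sketch of the “describe what you want” step expressed as an API call rather than a CustomGPT conversation. The model name, helper function, and prompt wording are my own illustrative choices, and this only drafts the narration script; rendering the video and the avatar would be separate steps with other tools.

```python
# Rough sketch (illustrative only): a professor describes the video they want,
# and a chat model drafts the narration script. Rendering the actual video or
# avatar would be a separate step with a different tool.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def draft_video_script(topic: str, minutes: int, sources: list[str], demeanor: str) -> str:
    request = (
        f"Draft a narration script for a {minutes}-minute course video on {topic}. "
        f"Base the content only on these readings: {', '.join(sources)}. "
        f"Demeanor: {demeanor}. Work in one light, relevant joke."
    )
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model choice
        messages=[
            {"role": "system", "content": "You are an instructional video scriptwriter."},
            {"role": "user", "content": request},
        ],
    )
    return response.choices[0].message.content

print(draft_video_script(
    topic="diffusion of innovations",
    minutes=8,
    sources=["Rogers (2003), chapters 1-2"],
    demeanor="warm, conversational, with a British accent noted for the voice actor",
))
```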


Excerpt from a recent EdSurge Higher Ed newsletter:


There are darker metaphors though — ones that focus on the hazards for humanity of the tech. Some professors worry that AI bots are simply replacing hired essay-writers for many students, doing work for a student that they can then pass off as their own (and doing it for free).

From DSC:
Hmmm…the use of essay writers was around long before AI became mainstream within higher education. So we already had a serious problem where students didn’t see the why in what they were being asked to do. Some students still aren’t sold on the why of the work in the first place. The situation seems to involve ethics, yes, but it also seems to say that we haven’t sold students on the benefits of putting in the work. Students seem to be saying I don’t care about this stuff…I just need the degree so I can exit stage left.

My main point: The issue didn’t start with AI…it started long before that.

And somewhat relevant here, also see:

I Have Bigger Fish to Fry: Why K12 Education is Not Thinking About AI — from medium.com by Maurie Beasley, M.Ed. (Edited by Jim Beasley)

This financial stagnation is occurring as we face a multitude of escalating challenges. These challenges include, but are in no way limited to, chronic absenteeism, widespread student mental health issues, critical staff shortages, rampant classroom behavior issues, a palpable sense of apathy for education in students, and even, I dare say, hatred towards education among parents and policymakers.

Our current focus is on keeping our heads above water, ensuring our students’ safety and mental well-being, and simply keeping our schools staffed and our doors open.


Meet Ed: Ed is an educational friend designed to help students reach their limitless potential. — from lausd.org (Los Angeles School District, the second largest in the U.S.)

What is Ed?
An easy-to-understand learning platform designed by Los Angeles Unified to increase student achievement. It offers personalized guidance and resources to students and families 24/7 in over 100 languages.

Ed is an easy-to-understand learning platform designed by Los Angeles Unified to increase student achievement.

Also relevant/see:

  • Los Angeles Unified Bets Big on ‘Ed,’ an AI Tool for Students — by Lauraine Langreo
    The Los Angeles Unified School District has launched an AI-powered learning tool that will serve as a “personal assistant” to students and their parents. The tool, named “Ed,” can provide students from the nation’s second-largest district information about their grades, attendance, upcoming tests, and suggested resources to help them improve their academic skills on their own time, Superintendent Alberto Carvalho announced March 20. Students can also use the app to find social-emotional-learning resources, see what’s for lunch, and determine when their bus will arrive.

Could OpenAI’s Sora be a big deal for elementary school kids? — from futureofbeinghuman.com by Andrew Maynard
Despite all the challenges it comes with, AI-generated video could unleash the creativity of young children and provide insights into their inner worlds – if it’s developed and used responsibly

Like many others, I’m concerned about the challenges that come with hyper-realistic AI-generated video. From deep fakes and disinformation to blurring the lines between fact and fiction, generative AI video is calling into question what we can trust, and what we cannot.

And yet despite all the issues the technology is raising, it also holds quite incredible potential, including as a learning and development tool — as long as we develop and use it responsibly.

I was reminded of this a few days back while watching the latest videos from OpenAI created by their AI video engine Sora — including the one below generated from the prompt “an elephant made of leaves running in the jungle”

What struck me while watching this — perhaps more than any of the other videos OpenAI has been posting on its TikTok channel — is the potential Sora has for translating the incredibly creative but often hard to articulate ideas someone may have in their head, into something others can experience.


Can AI Aid the Early Education Workforce? — from edsurge.com by Emily Tate Sullivan
During a panel at SXSW EDU 2024, early education leaders discussed the potential of AI to support and empower the adults who help our nation’s youngest children.

While the vast majority of the conversations about AI in education have centered on K-12 and higher education, few have considered the potential of this innovation in early care and education settings.

At the conference, a panel of early education leaders gathered to do just that, in a session exploring the potential of AI to support and empower the adults who help our nation’s youngest children, titled, “ChatECE: How AI Could Aid the Early Educator Workforce.”

Hau shared that K-12 educators are using the technology to improve efficiency in a number of ways, including to draft individualized education programs (IEPs), create templates for communicating with parents and administrators, and in some cases, to support building lesson plans.


From EIEIO…Seasons Of Change

Again, we’ve never seen change happen as fast as it’s happening.


Enhancing World Language Instruction With AI Image Generators — from edutopia.org by Rachel Paparone
By crafting an AI prompt in the target language to create an image, students can get immediate feedback on their communication skills.

Educators are, perhaps rightfully so, cautious about incorporating AI in their classrooms. With thoughtful implementation, however, AI image generators, with their ability to use any language, can provide powerful ways for students to engage with the target language and increase their proficiency.
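
From DSC:
Paparone doesn’t prescribe a particular tool, but here is a minimal sketch of the workflow she describes, assuming the OpenAI Images API as the generator (an illustrative choice on my part). The student writes the prompt entirely in the target language, and whether the resulting image matches what they intended becomes immediate, visual feedback on their communication.

```python
# Minimal sketch of the workflow: a student describes a scene in the target
# language, an image model renders it, and mismatches between intention and
# image become feedback on the student's language use.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

# A French learner's prompt: "A grey cat sleeps on a red chair in front of a
# large window while it snows outside."
student_prompt = (
    "Un chat gris dort sur une chaise rouge devant une grande fenêtre, "
    "pendant qu'il neige dehors."
)

result = client.images.generate(
    model="dall-e-3",        # illustrative model choice
    prompt=student_prompt,   # the prompt stays in the target language
    size="1024x1024",
    n=1,
)

print(result.data[0].url)  # share the image and compare it to what the student intended
```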


AI in the Classroom: A Teacher’s Toolkit for Transformation — from esheninger.blogspot.com by Eric Sheninger

While AI offers numerous benefits, it’s crucial to remember that it is a tool to empower educators, not replace them. The human connection between teacher and student remains central to fostering creativity, critical thinking, and social-emotional development. The role of teachers will shift towards becoming facilitators, curators, and mentors who guide students through personalized learning journeys. By harnessing the power of AI, educators can create dynamic and effective classrooms that cater to each student’s individual needs. This paves the way for a more engaging and enriching learning experience that empowers students to thrive.


Teachers Are Using AI to Create New Worlds, Help Students with Homework, and Teach English — from themarkup.org by Ross Teixeira; via Matthew Tower
Around the world, these seven teachers are making AI work for them and their students

In this article, seven teachers across the world share their insights on AI tools for educators. You will hear a host of varied opinions and perspectives on everything from whether AI could hasten the decline of learning foreign languages to whether AI-generated lesson plans are an infringement on teachers’ rights. A common theme emerged from those we spoke with: just as the internet changed education, AI tools are here to stay, and it is prudent for teachers to adapt.


Teachers Desperately Need AI Training. How Many Are Getting It? — from edweek.org by Lauraine Langreo

Even though it’s been more than a year since ChatGPT made a big splash in the K-12 world, many teachers say they are still not receiving any training on using artificial intelligence tools in the classroom.

More than 7 in 10 teachers said they haven’t received any professional development on using AI in the classroom, according to a nationally representative EdWeek Research Center survey of 953 educators, including 553 teachers, conducted between Jan. 31 and March 4.

From DSC:
This article mentioned the following resource:

Artificial Intelligence Explorations for Educators — from iste.org


 

Which AI should I use? Superpowers and the State of Play — by Ethan Mollick
And then there were three

For over a year, GPT-4 was the dominant AI model, clearly much smarter than any of the other LLM systems available. That situation has changed in the last month. There are now three GPT-4 class models, each powering its own chatbot: GPT-4 (accessible through ChatGPT Plus or Microsoft’s Copilot), Anthropic’s Claude 3 Opus, and Google’s Gemini Advanced.

Where we stand
We are in a brief period in the AI era where there are now multiple leading models, but none has yet definitively beaten the GPT-4 benchmark set over a year ago. While this may represent a plateau in AI abilities, I believe this is likely to change in the coming months as, at some point, models like GPT-5 and Gemini 2.0 will be released. In the meantime, you should be using a GPT-4 class model and using it often enough to learn what it does well. You can’t go wrong with any of them, pick a favorite and use it…

From DSC:
Here’s a powerful quote from Ethan:

In fact, in my new book I postulate that you haven’t really experienced AI until you have had three sleepless nights of existential anxiety, after which you can start to be productive again.


Using AI for Immersive Educational Experiences — from automatedteach.com by Graham Clay
Realistic video brings course content to life but requires AI literacy.

For us, I think the biggest promise of AI tools like Sora — that can create video with ease — is that they lower the cost of immersive educational experiences. This increases the availability of these experiences, expanding their reach to student populations who wouldn’t otherwise have them, whether due to time, distance, or expense.

Consider the profound impact on a history class, where students are transported to California during the gold rush through hyperrealistic video sequences. This vivifies the historical content and cultivates a deeper connection with the material.

In fact, OpenAI has already demonstrated the promise of this sort of use case, with a very simple prompt producing impressive results…


The Empathy Illusion: How AI Agents Could Manipulate Students — from marcwatkins.substack.com by Marc Watkins

Take this scenario. A student misses a class and, within twenty minutes, receives a series of texts and even a voicemail from a very concerned and empathic-sounding voice wanting to know what’s going on. Of course, the text is entirely generated, and the voice is synthetic as well, but the student likely doesn’t know this. To them, communication isn’t something as easy to miss or brush off as an email. It sounds like someone who cares is talking to them.

But let’s say that isn’t enough. By that evening, the student still hadn’t logged into their email or checked the LMS. The AI’s strategic reasoning is communicating with the predictive AI and analyzing the pattern of behavior against students who succeed or fail vs. students who are ill. The AI tracks the student’s movements on campus, monitors their social media usage, and deduces the student isn’t ill and is blowing off class.

The AI agent resumes communication with the student. But this time, the strategic AI adopts a different persona, not the kind and empathetic persona used for the initial contact, but a stern, matter-of-fact one. The student’s phone buzzes with alerts that talk about scholarships being lost, teachers being notified, etc. The AI anticipates the excuses the student will use and presents evidence tracking the student’s behavior to show they are not sick.


Not so much focused on learning ecosystems, but still worth mentioning:

The top 100 Gen AI Consumer Apps — from a16z.com / andreessen horowitz by Olivia Moore


 

 

GTC March 2024 Keynote with NVIDIA CEO Jensen Huang






 

This week in 5 numbers: Another faith-based college plans to close — by Natalie Schwartz
We’re rounding up some of our top recent stories, from Notre Dame College’s planned closure to Valparaiso’s potential academic cuts.

BY THE NUMBERS

  • 1,444
    The number of students who were enrolled at Notre Dame College in fall 2022, down 37% from 2014. The Roman Catholic college recently said it would close after the spring term, citing declining enrollment, along with rising costs and significant debt.
  • 28
    The number of academic programs that Valparaiso University may eliminate. Eric Johnson, the Indiana institution’s provost, said it offers too many majors, minors and graduate degrees in relation to its enrollment.

A couple of other items re: higher education that caught my eye were:

Universities Expect to Use More Tech in Future Classrooms—but Don’t Know How — from insidehighered.com by Lauren Coffey

University administrators see the need to implement education technology in their classrooms but are at a loss regarding how to do so, according to a new report.

The College Innovation Network released its first CIN Administrator EdTech survey today, which revealed that more than half (53 percent) of the 214 administrators surveyed do not feel extremely confident in choosing effective ed-tech products for their institutions.

“While administrators are excited about offering new ed-tech tools, they are lacking knowledge and data to help them make informed decisions that benefit students and faculty,” Omid Fotuhi, director of learning and innovation at WGU Labs, which funds the network, said in a statement.

From DSC:
I always appreciated our cross-disciplinary team at Calvin (then College). As we looked at enhancing our learning spaces, we had input from the Teaching & Learning Group, IT, A/V, the academic side of the house, and facilities. It was definitely a team-based approach. (As I think about it, it would have been helpful to have more channels for student feedback as well.)


Per Jeff Selingo:

Optionality. In my keynote, I pointed out that the academic calendar and credit hour in higher ed are like “shelf space” on the old television schedule that has been upended by streaming. In much the same way, we need similar optionality to meet the challenges of higher ed right now: from how students access learning (in-person, hybrid, online), to credentials (certificates, degrees), to how those experiences stack together for lifelong learning.

Culture in institutions. The common thread throughout the conference was how the culture of institutions (both universities and governments) needs to change so our structures and practices can evolve. Too many people in higher ed right now are employing a scarcity mindset and seeing every change as a zero-sum game. If you’re not happy about the present, as many attendees suggested, you’re not going to be excited about the future.

 

Programs, Services, and More: A Map of CTL Tactics — from derekbruff.org

My colleagues and I at the Center for Excellence in Teaching and Learning (CETL) have been reading and discussing Mary C. Wright’s new book Centers for Teaching and Learning: The New Landscape in Higher Education (Johns Hopkins University Press, 2023). Wright identified all the centers for teaching and learning (CTLs) in the United States and then did a content analysis of their websites to see what they were all about. For someone like me, who has spent his career working in CTLs, Wright’s work is a fascinating look at my own field and how it represents itself through mission statements, listings of programs and services, and annual reports.

Also from Derek, see:

Recap: Study skills, flipped learning, and more at spring STEM teaching lunches — from umcetl.substack.com
With the final spring STEM teaching lunch coming up on March 4th, here’s a recap of what you missed at the first two lunches.

    • February 8th – Helping students learn how to learn
    • February 20th – Reconsidering class time through flipped learning
 

The Transformative Trends Reshaping Higher Education in 2024 — from evolllution.com by Janet Spriggs; via Amrit Ahluwalia on LinkedIn

  • Artificial Intelligence: Embrace It or Fall Behind
  • Reassessing Value: Tackling Confidence and ROI in Higher Education
  • Innovating for the Future: Adapting to Changing Needs
  • Fostering Strategic Partnerships: Collaboration for Progress
  • Leadership Matters: Driving Innovation and Inclusivity
 

Top 6 Use Cases of Generative AI in Education in 2024 — from research.aimultiple.com by Cem Dilmegani

Use cases included:

  1. Personalized Lessons
  2. Course Design
  3. Content Creation for Courses
  4. Data Privacy Protection for Analytical Models
  5. Restoring Old Learning Materials
  6. Tutoring

The Next Phase of AI in Education at the U.S. Department of Education — from medium.com by Office of Ed Tech

Why are we doing this work?
Over the past two years, the U.S. Department of Education has been committed to maintaining an ongoing conversation with educators, students, researchers, developers — and the educational community at large — related to the continuous progress of Artificial Intelligence (AI) development and its implications for teaching and learning.

Many educators are seeking resources clarifying what AI is and how it will impact their work and their students. Similarly, developers of educational technology (“edtech”) products seek guidance on what guardrails exist that can support their efforts. After the release of our May 2023 report Artificial Intelligence and the Future of Teaching and Learning, we heard the desire for more.


2024 EDUCAUSE AI Landscape Study — from library.educause.edu by Jenay Robert

Moving from reaction to action, higher education stakeholders are currently exploring the opportunities afforded by AI for teaching, learning, and work while maintaining a sense of caution for the vast array of risks AI-powered technologies pose. To aid in these efforts, we present this inaugural EDUCAUSE AI Landscape Study, in which we summarize the higher education community’s current sentiments and experiences related to strategic planning and readiness, policies and procedures, workforce, and the future of AI in higher education.


AI Update for K-16 Administrators: More People Need to Step-Up and Take the AI Bull By the Horns — from stefanbauschard.substack.com by Stefan Bauschard
AI capabilities are way beyond what most schools are aware of, and they will transform education and society over the next few years.

Educational administrators should not worry about every AI development but should instead focus on the big picture, as those big-picture changes will change the entire world and the educational system.

AI and related technologies (robotics, synthetic biology, and brain-computer interfaces) will continue to impact society and the entire educational system over the next 10 years. This impact on the system will be greater than anything that has happened over the last 100 years, including COVID-19, as COVID-19 eventually ended and the disruptive force of these technologies will only continue to develop.

AI is the bull in the china shop, redefining the world and the educational system. Students writing a paper with AI is barely a poke in the educational world relative to what is starting to happen (active AI teachers and tutors; AI assessment; AI glasses; immersive learning environments; young students able to start their own business with AI tools; AIs replacing and changing jobs; deep voice and video fakes; intelligence leveling; individualized instruction; interactive and highly intelligent computers; computers that can act autonomously; and more).


 

 

Digital Learning Pulse Survey Reveals Higher-Ed Unprepared for Expected Impact of AI — from prnewswire.com by Cengage
Research illustrates that while GenAI could ease ongoing challenges in education, just 1 in 5 say their school is ready

WASHINGTON, Feb. 6, 2024 /PRNewswire/ — While more than three-quarters of higher-education trustees (83%), faculty (81%) and administrators (76%) agree that generative artificial intelligence (GenAI) will noticeably change their institutions in the next five years, community college trustees are more optimistic than their faculty and administrator counterparts about readiness: 37% of trustees say their organization is prepared for the coming change, compared to just 16% of faculty and 11% of administrator respondents.

Those findings are from the 2023-2024 Digital Learning Pulse Survey conducted by Cengage and Bay View Analytics with support from the Association of Community College Trustees (ACCT), the Association of College and University Educators (ACUE), College Pulse and the United States Distance Learning Association (USDLA) to understand the attitudes and concerns of higher education instructors and leadership.

From DSC:
It takes time to understand what a given technology brings to the table…let alone a slew of emerging technologies under the artificial intelligence (AI) umbrella. It’s hard enough when the technology is fairly well established and not changing all the time. But it’s extremely difficult when significant change occurs almost daily.

The limited staff within the teaching & learning centers out there need time to research and learn about the relevant technologies and how to apply those techs to instructional design. The already stretched-thin faculty members need time to learn about those techs as well — and if and how they want to apply them. It takes time and research and effort.

Provosts, deans, presidents, and such need time to learn things as well.

Bottom line: We need to have realistic expectations here.


AI Adoption in Corporate L&D — from drphilippahardman.substack.com by Dr. Philippa Hardman
Where we are, and the importance of use cases in enabling change

At the end of last year, O’Reilly Media published a comprehensive report on the adoption and impact of generative AI within enterprises.

The headline of the report is that we’ve never seen a technology adopted in enterprise as fast as generative AI. As of November 2023, two-thirds (67%) of survey respondents reported that their companies are using generative AI.

However, the vast majority of AI adopters in enterprise are still in the early stages; they’re experimenting at the edges, rather than making larger-scale, strategic decisions on how to leverage AI to accelerate our progress towards org goals and visions.

The single biggest hurdle to AI adoption in large corporates is a lack of appropriate use cases.

 

The Teaching and Learning Workforce in Higher Education, 2024 — from library.educause.edu by Nicole Muscanell


Opinion: Higher-Ed Trends to Watch in 2024 — from govtech.com by Jim A. Jorstad
If the recent past is any indication, higher education this year is likely to see financial stress, online learning, a crisis of faith in leadership, emerging tech such as AI and VR, cybersecurity threats, and a desperate need for skilled IT staff.

 “We’re in the early stages of creating a new paradigm for personalized assessment and learning; it’s critical for moving the field forward … It’s supporting teachers in the classroom to personalize their teaching by using AI to provide feedback for individual learners and pointing in the direction where students can go.”


PROOF POINTS: Most college kids are taking at least one class online, even long after campuses reopened — from hechingerreport.org by Jill Barshay
Shift to online classes and degrees is a response to declining enrollment

The pandemic not only disrupted education temporarily; it also triggered permanent changes. One that is quietly taking place at colleges and universities is a major, expedited shift to online learning. Even after campuses reopened and the health threat diminished, colleges and universities continued to offer more online courses and added more online degrees and programs. Some brick-and-mortar schools even switched to online only.


College Affordability Helped Drive Rise in State Support for Higher Ed — from chronicle.com by Sonel Cutler

State support for higher education saw a significant jump this year, rising more than 10 percent from 2023 — even though the share of that money provided by the federal government dropped 50 percent.

That’s according to the annual Grapevine report released Thursday by the State Higher Education Executive Officers Association, or SHEEO. The data reflect a continued upward trajectory for state investment in higher education, with a 36.5-percent increase in support nationally over the last five years, not adjusted for inflation.


 

 

Augment teaching with AI – this teacher has it sussed… — from donaldclarkplanb.blogspot.com by Donald Clark

Emphasis (emphasis DSC):

You’re a teacher who wants to integrate AI into your teaching. What do you do? I often get asked how should I start with AI in my school or University. This, I think, is one answer.

Continuity with teaching
One school has got this exactly right in my opinion. Meredith Joy Morris has implemented ChatGPT into the teaching process. The teacher does their thing and the chatbot picks up where the teacher stops, augmenting and scaling the teaching and learning process, passing the baton to the learners who carry on. This gives the learner a more personalised experience, encouraging independent learning by using the undoubted engagement that 1:1 dialogue provides.

There’s no way any teacher can provide this carry on support with even a handful of students, never mind a class of 30 or a course with 100. Teaching here is ‘extended’ and ‘scaled’ by AI. The feedback from the students was extremely positive.


Reflections on Teaching in the AI Age — by Jeffrey Watson

The transition which AI forces me to make is no longer to evaluate writings, but to evaluate writers. I am accustomed to grading essays impersonally with an objective rubric, treating the text as distinct from the author and commenting only on the features of the text. I need to transition to evaluating students a bit more holistically, as philosophers – to follow along with them in the early stages of the writing process, to ask them to present their ideas orally in conversation or in front of their peers, to push them to develop the intellectual virtues that they will need if they are not going to be mastered by the algorithms seeking to manipulate them. That’s the sort of development I’ve meant to encourage all along, not paragraph construction and citation formatting. If my grading practices incentivize outsourcing to a machine intelligence, I need to change my grading practices.


4 AI Imperatives for Higher Education in 2024 — from campustechnology.com by Rhea Kelly

[Bryan Alexander] There’s a crying need for faculty and staff professional development about generative AI. The topic is complicated and fast moving. Already the people I know who are seriously offering such support are massively overscheduled. Digital materials are popular. Books are lagging but will gradually surface. I hope we see more academics lead more professional development offerings.

For an academic institution to take emerging AI seriously it might have to set up a new body. Present organizational nodes are not necessarily a good fit.


A Technologist Spent Years Building an AI Chatbot Tutor. He Decided It Can’t Be Done. — from edsurge.com by Jeffrey R. Young
Is there a better metaphor than ‘tutor’ for what generative AI can do to help students and teachers?

When Satya Nitta worked at IBM, he and a team of colleagues took on a bold assignment: Use the latest in artificial intelligence to build a new kind of personal digital tutor.

This was before ChatGPT existed, and fewer people were talking about the wonders of AI. But Nitta was working with what was perhaps the highest-profile AI system at the time, IBM’s Watson. That AI tool had pulled off some big wins, including beating humans on the Jeopardy quiz show in 2011.

Nitta says he was optimistic that Watson could power a generalized tutor, but he knew the task would be extremely difficult. “I remember telling IBM top brass that this is going to be a 25-year journey,” he recently told EdSurge.


Teachers stan AI in education–but need more support — from eschoolnews.com by Laura Ascione

What are the advantages of AI in education?
Canva’s study found 78 percent of teachers are interested in using AI education tools, but their experience with the technology remains limited, with 93 percent indicating they know “a little” or “nothing” about it – though this lack of experience hasn’t stopped teachers from quickly discovering and considering its benefits:

  • 60 percent of teachers agree it has given them ideas to boost student productivity
  • 59 percent of teachers agree it has cultivated more ways for their students to be creative
  • 56 percent of teachers agree it has made their lives easier

When looking at the ways teachers are already using generative artificial intelligence, the most common uses were:

  • Creating teaching materials (43 percent)
  • Collaborative creativity/co-creation (39 percent)
  • Translating text (36 percent)
  • Brainstorming and generating ideas (35 percent)

The next grand challenge for AI — from ted.com by Jim Fan


The State of Washington Embraces AI for Public Schools — from synthedia.substack.com by Bret Kinsella; via Tom Barrett
Educational institutions may be warming up to generative AI

Washington state issued new guidelines for K-12 public schools last week based on the principle of “embracing a human-centered approach to AI,” which also embraces the use of AI in the education process. The state’s Superintendent of Public Instruction, Chris Reykdal, commented in a letter accompanying the new guidelines:


New education features to help teachers save time and support students — by Shantanu Sinha

Giving educators time back to invest in themselves and their students
Boost productivity and creativity with Duet AI: Educators can get fresh ideas and save time using generative AI across Workspace apps. With Duet AI, they can get help drafting lesson plans in Docs, creating images in Slides, building project plans in Sheets and more — all with control over their data.

 

OpenAI announces first partnership with a university — from cnbc.com by Hayden Field

Key Points:

  • OpenAI on Thursday announced its first partnership with a higher education institution.
  • Starting in February, Arizona State University will have full access to ChatGPT Enterprise and plans to use it for coursework, tutoring, research and more.
  • The partnership has been in the works for at least six months.
  • ASU plans to build a personalized AI tutor for students, allow students to create AI avatars for study help and broaden the university’s prompt engineering course.

A new collaboration with OpenAI charts the future of AI in higher education — from news.asu.edu

The collaboration between ASU and OpenAI brings the advanced capabilities of ChatGPT Enterprise into higher education, setting a new precedent for how universities enhance learning, creativity and student outcomes.

“ASU recognizes that augmented and artificial intelligence systems are here to stay, and we are optimistic about their ability to become incredible tools that help students to learn, learn more quickly and understand subjects more thoroughly,” ASU President Michael M. Crow said. “Our collaboration with OpenAI reflects our philosophy and our commitment to participating directly in the responsible evolution of AI learning technologies.”


AI <> Academia — from drphilippahardman.substack.com by Dr. Philippa Hardman
What might emerge from ASU’s pioneering partnership with OpenAI?

Phil’s Wish List #2: Smart Curriculum Development
ChatGPT assists in creating and updating course curricula, based on both student data and emerging domain and pedagogical research on the topic.

Output: Using AI, it will be possible to review course content and make data-informed, automated recommendations based on the latest pedagogical and domain-specific research.

Potential Impact: increased dynamism and relevance in course content and reduced administrative lift for academics.
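
From DSC:
Nothing like this exists as an off-the-shelf feature yet, but here is a speculative sketch of what a first pass at this wish-list item might look like with today’s tooling. The file names, model choice, and prompt are hypothetical placeholders; the real work would be assembling trustworthy student data and current pedagogical research to feed in.

```python
# Speculative sketch of "smart curriculum development": feed a syllabus plus a
# summary of student outcomes to a chat model and ask for data-informed updates.
# File names and model choice are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

with open("intro_stats_syllabus.md") as f:        # hypothetical course outline
    syllabus = f.read()
with open("cohort_outcomes_2024.txt") as f:       # hypothetical outcomes summary
    outcomes = f.read()

review = client.chat.completions.create(
    model="gpt-4",  # illustrative
    messages=[
        {"role": "system", "content": (
            "You are a curriculum reviewer. Recommend concrete updates to the course, "
            "flag modules where student outcomes suggest a redesign, and note which "
            "part of the data motivates each recommendation.")},
        {"role": "user", "content": f"Syllabus:\n{syllabus}\n\nCohort outcomes:\n{outcomes}"},
    ],
)

print(review.choices[0].message.content)
```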


A full list of AI ideas from AI-for-Education.org

You can filter by category, by ‘What does it do?’, by AI tool or search for keywords.


Navigating the new normal: Adapting in the age of AI and hybrid work models — from chieflearningofficer.com by Dr. Kylie Ensrud

Unlike traditional leadership, adaptable leadership is not bound by rigid rules and protocols. Instead, it thrives on flexibility. Adaptable leaders are willing to experiment, make course corrections, and pivot when necessary. Adaptable leadership is about flexibility, resilience and a willingness to embrace change. It embodies several key principles that redefine the role of leaders in organizations:

  1. Embracing uncertainty

Adaptable leaders understand that uncertainty is the new norm. They do not shy away from ambiguity but instead, see it as an opportunity for growth and innovation. They encourage a culture of experimentation and learning from failure.

  2. Empowering teams

Instead of dictating every move, adaptable leaders empower their teams to take ownership of their work. They foster an environment of trust and collaboration, enabling individuals to contribute their unique perspectives and skills.

  3. Continuous learning

Adaptable leaders are lifelong learners. They are constantly seeking new knowledge, stay informed about industry trends and encourage their teams to do the same. They understand that knowledge is a dynamic asset that must be constantly updated.


Major AI in Education Related Developments this week — from stefanbauschard.substack.com by Stefan Bauschard
ASU integrates with ChatGPT, K-12 AI integrations, Agents & the Rabbit, Uruguay, Meta and AGI, Rethinking curriculum

“The greatest risk is leaving school curriculum unchanged when the entire world is changing.”
Hadi Partovi, founder of Code.org; angel investor in Facebook, Dropbox, Airbnb, and Uber

Tutorbots in college. On a more limited scale, Georgia State University, Morgan State University, and the University of Central Florida are piloting a project using chatbots to support students in foundational math and English courses.


Pioneering AI-Driven Instructional Design in Small College Settings — from campustechnology.com by Gopu Kiron
For institutions that lack the budget or staff expertise to utilize instructional design principles in online course development, generative AI may offer a way forward.

Unfortunately, smaller colleges — arguably the institutions whose students are likely to benefit the most from ID enhancements — frequently find themselves excluded from authentically engaging in the ID arena due to tight budgets, limited faculty online course design expertise, and the lack of ID-specific staff roles. Despite this, recent developments in generative AI may offer these institutions a low-cost, tactical avenue to compete with more established players.


Google’s new AI solves math olympiad problems — from bensbites.beehiiv.com

There’s a new AI from Google DeepMind called AlphaGeometry that totally nails solving super hard geometry problems. We’re talking problems so tough only math geniuses who compete in the International Mathematical Olympiad can figure them out.


 
© 2024 | Daniel Christian