
2024 EDUCAUSE Horizon Report® Teaching and Learning Edition

Trends
As a first activity, we asked the Horizon panelists to provide input on the macro trends they believe are going to shape the future of postsecondary teaching and learning and to provide observable evidence for those trends. To ensure an expansive view of the larger trends serving as context for institutions of higher education, panelists provided input across five trend categories: social, technological, economic, environmental, and political. Given the widespread impacts of emerging AI technologies on higher education, we are also including in this year’s report a list of “honorary trends” focused on AI. After several rounds of voting, the panelists selected the following trends as the most important:

 


Information Age vs Generation Age Technologies for Learning — from opencontent.org by David Wiley

Remember (emphasis DSC)

  • the internet eliminated time and place as barriers to education, and
  • generative AI eliminates access to expertise as a barrier to education.

Just as instructional designs had to be updated to account for all the changes in affordances of online learning, they will need to be dramatically updated again to account for the new affordances of generative AI.


The Curious Educator’s Guide to AI | Strategies and Exercises for Meaningful Use in Higher Ed  — from ecampusontario.pressbooks.pub by Kyle Mackie and Erin Aspenlieder; via Stephen Downes

This guide is designed to help educators and researchers better understand the evolving role of Artificial Intelligence (AI) in higher education. This openly-licensed resource contains strategies and exercises to help foster an understanding of AI’s potential benefits and challenges. We start with a foundational approach, providing you with prompts on aligning AI with your curiosities and goals.

The middle section of this guide encourages you to explore AI tools and offers some insights into potential applications in teaching and research. Along with exposure to the tools, we’ll discuss when and how to effectively build AI into your practice.

The final section of this guide includes strategies for evaluating and reflecting on your use of AI. Throughout, we aim to promote use that is effective, responsible, and aligned with your educational objectives. We hope this resource will be a helpful guide in making informed and strategic decisions about using AI-powered tools to enhance teaching and learning and research.


Annual Provosts’ Survey Shows Need for AI Policies, Worries Over Campus Speech — from insidehighered.com by Ryan Quinn
Many institutions are not yet prepared to help their faculty members and students navigate artificial intelligence. That’s just one of multiple findings from Inside Higher Ed’s annual survey of chief academic officers.

Only about one in seven provosts said their colleges or universities had reviewed the curriculum to ensure it will prepare students for AI in their careers. Thuswaldner said that number needs to rise. “AI is here to stay, and we cannot put our heads in the sand,” he said. “Our world will be completely dominated by AI and, at this point, we ain’t seen nothing yet.”


Is GenAI in education more of a Blackberry or iPhone? — from futureofbeinghuman.com by Andrew Maynard
There’s been a rush to incorporate generative AI into every aspect of education, from K-12 to university courses. But is the technology mature enough to support the tools that rely on it?

In other words, it’s going to mean investing in concepts, not products.

This, to me, is at the heart of an “iPhone mindset” as opposed to a “Blackberry mindset” when it comes to AI in education — an approach that avoids hard wiring in constantly changing technologies, and that builds experimentation and innovation into the very DNA of learning.

For all my concerns here though, maybe there is something to being inspired by the Blackberry/iPhone analogy — not as a playbook for developing and using AI in education, but as a mindset that embraces innovation while avoiding becoming locked in to apps that are detrimentally unreliable and that ultimately lead to dead ends.


Do teachers spot AI? Evaluating the detectability of AI-generated texts among student essays — from sciencedirect.com by Johanna Fleckenstein, Jennifer Meyer, Thorben Jansen, Stefan D. Keller, Olaf Köller, and Jens Möller

Highlights

  • Randomized-controlled experiments investigating novice and experienced teachers’ ability to identify AI-generated texts.
  • Generative AI can simulate student essay writing in a way that is undetectable for teachers.
  • Teachers are overconfident in their source identification.
  • AI-generated essays tend to be assessed more positively than student-written texts.

Can Using a Grammar Checker Set Off AI-Detection Software? — from edsurge.com by Jeffrey R. Young
A college student says she was falsely accused of cheating, and her story has gone viral. Where is the line between acceptable help and cheating with AI?


Use artificial intelligence to get your students thinking critically — from timeshighereducation.com by Urbi Ghosh
When crafting online courses, teaching critical thinking skills is crucial. Urbi Ghosh shows how generative AI can shape the way educators approach this.


ChatGPT shaming is a thing – and it shouldn’t be — from futureofbeinghuman.com by Andrew Maynard
There’s a growing tension between early and creative adopters of text based generative AI and those who equate its use with cheating. And when this leads to shaming, it’s a problem.

Excerpt (emphasis DSC):

This will sound familiar to anyone who’s incorporating generative AI into their professional workflows. But there are still many people who haven’t used apps like ChatGPT, are largely unaware of what they do, and are suspicious of them. And yet they’ve nevertheless developed strong opinions around how they should and should not be used.

From DSC:
Yes…that sounds like how many faculty members viewed online learning, even though they had never taught online before.

 

The Digital Transformation Journey: Lessons For Lawyers Embracing AI — from abovethelaw.com by Olga V. Mack
The journey from the days of leather-bound law books to the digital age — and now toward an AI-driven future — offers valuable lessons for embracing change.

No One Will Miss The ‘Good Old Days’
I have yet to meet a lawyer nostalgic for the days of manually updating law reports or sifting through stacks of books for a single precedent. The convenience, speed, and breadth of digital research tools have made the practice of law more efficient and effective. As we move further into the AI era, the enhancements in predictive analytics, document automation, and legal research will make the “good old days” of even the early digital age seem quaint. The efficiencies and capabilities AI brings to the table are likely to become just as indispensable as online databases are today.

The Way We ‘Law’ Will Change For The Better
The ultimate goal of integrating AI into legal practice isn’t just to replace old methods with new ones; it’s to enhance our ability to serve justice, increase access to legal services, and improve the quality of our work. AI promises to automate mundane tasks, predict legal outcomes with greater accuracy, and unearth insights from vast data. These advancements will free us to focus more on the nuanced, human aspects of law — strategy, empathy, and ethical judgment.


AI to Help Double Legal Tech Market Over Five Years, Gartner Says — from news.bloomberglaw.com by Isabel Gottlieb (behind a paywall)

  • Tech to take up a bigger share of in-house legal spend
  • Generative AI boom has much longer to run

The legal tech market will expand to $50 billion by 2027, driven by the generative artificial intelligence boom, according to an analysis by market research firm Gartner Inc.

That growth, up from about $23 billion in 2022, will be driven by continued law firm spending on AI legal tech, as well as in-house departments allocating more of their overall budgets to technology, said Chris Audet, chief of research in Gartner’s legal, risk and compliance leaders practice. The market size prediction, released publicly on Thursday, comes from a late-2023 analysis for Gartner clients, and the 2022 market size comes from …


Legal Tech Market To See Huge Lift Off Thanks to GenAI — from digit.fyi by Elizabeth Greenberg

The global legal technology market has grown significantly in recent years and generative AI (GenAI) will accelerate this growth, meaning the market will reach $50 billion in value by 2027, according to Gartner.

“GenAI has huge potential for bringing more automation to the legal space,” said Chris Audet, chief of research in the Gartner for legal, risk & compliance leaders practice.

“Rapid GenAI developments, and the widespread availability of consumer tools such as OpenAI’s ChatGPT and Google’s Bard, will quickly increase the number of established legal technology use cases, in turn creating growing market conditions for an increasing number of legal-focused tools.”

“New technologies can fundamentally change the way legal organizations do business, and GenAI has enormous potential to do this,” an analyst at Gartner said.


Revolutionizing Legal Tech in 48 Hours — from law.stanford.edu by Monica Schreiber
At CodeX Hackathon, SLS Students Help Create Award-Winning AI Tools to Help Veterans and Streamline M&A

Disabled veterans seeking to file claims with the Veterans Administration are faced with multiple hurdles and reams of paperwork. Many vets resort to paying third-party companies thousands of dollars to help them with the process.

What if there were a way to streamline the claims process—to condense burdensome information gathering and data inputting into a linear, simplified set of tasks guided by a chatbot? How long would it take to roll out a tool that could accomplish that?

The answer: about 48 hours—at least for an interdisciplinary team of students from Stanford University’s schools of Law, Business, and Computer Science collaborating feverishly during CodeX’s Large Language Model (LLM) Hackathon held recently on campus.


What If Your Law Firm Had A Blank Page For Legal Tech? — from artificiallawyer.com

If law firms had a blank page for legal technology and innovation, what would they do?

While organisations across all sectors are getting to grips with the opportunities and risks posed by genAI, forward-thinking law firm leaders are considering what it means for their businesses – today, tomorrow, and the day after tomorrow.

But some firms remain constrained by yesterday, due to legacy processes, ways of working and mindsets. To create the conditions for change, firms need to adopt a ‘blank page’ approach and review all areas of their businesses by asking: if we were starting afresh, how would we design the organisation to future-proof it to achieve transformative growth with genAI at the core?

From DSC:
This sentence reminds me of the power of culture:

But some firms remain constrained by yesterday, due to legacy processes, ways of working and mindsets.


Fresh Voices on Legal Tech with Sarah Glassmeyer — from legaltalknetwork.com by Dennis Kennedy, Tom Mighell, and Sarah Glassmeyer

What if, instead of tech competence being this scary, overwhelming thing, we showed lawyers how to engage with technology in a more lighthearted, even playful, way? The reality is—tech competency doesn’t have an endpoint, but the process of continuous learning shouldn’t be dull and confusing. Sarah Glassmeyer joins Dennis and Tom to talk about her perspectives on technology education for attorneys, the latest trends in the legal tech world and new AI developments, and growing your knowledge of technology by building on small skills, one at a time.

 


How Legal Technology Can Add Value to an M&A Practice — from lexology.com

Following is a primer on some of the A.I.-driven legal technologies, from contract review and automated due-diligence solutions to deal collaboration and closing-management tools, that can drive productivity and efficiency during the four phases of an M&A transaction, as well as enhance market insight and client service.

 

The Verge | What’s Next With AI | February 2024 | Consumer Survey

Microsoft AI creates talking deepfakes from single photo — from inavateonthenet.net


The Great Hall – where now with AI? It is not ‘Human Connection V Innovative Technology’ but ‘Human Connection + Innovative Technology’ — from donaldclarkplanb.blogspot.com by Donald Clark

The theme of the day was Human Connection V Innovative Technology. I see this a lot at conferences, setting up the human connection (social) against the machine (AI). I think this is ALL wrong. It is, and has always been a dialectic, human connection (social) PLUS the machine. Everyone had a smartphone, most use it for work, comms and social media. The binary between human and tech has long disappeared. 


Techno-Social Engineering: Why the Future May Not Be Human, TikTok’s Powerful ForYou Algorithm, & More — by Misha Da Vinci

Things to consider as you dive into this edition:

  • As we increasingly depend on technology, how is it changing us?
  • In the interaction between humans and technology, who is adapting to whom?
  • Is the technology being built for humans, or are we being changed to fit into tech systems?
  • As time passes, will we become more like robots or the AI models we use?
  • Over the next 30 years, as we increasingly interact with technology, who or what will we become?

 

Description:

I recently created an AI version of myself—REID AI—and recorded a Q&A to see how this digital twin might challenge me in new ways. The video avatar is generated by Hour One, its voice was created by Eleven Labs, and its persona—the way that REID AI formulates responses—is generated from a custom chatbot built on GPT-4 that was trained on my books, speeches, podcasts and other content that I’ve produced over the last few decades. I decided to interview it to test its capability and how closely its responses match—and test—my thinking. Then, REID AI asked me some questions on AI and technology. I thought I would hate this, but I’ve actually ended up finding the whole experience interesting and thought-provoking.
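
From a technical standpoint, the post only names the building blocks (an Hour One avatar, an Eleven Labs voice, and a GPT-4-based persona grounded in Reid Hoffman’s own writing); the actual implementation isn’t published. As a rough illustration of just the text-persona layer, here is a minimal sketch that grounds a chat model in a handful of source excerpts. It assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable; the excerpts, the prompt wording, and the ask_twin helper are hypothetical stand-ins, and a real system would likely add retrieval over a much larger corpus plus the voice and video layers.

```python
# Minimal sketch of a "persona" chatbot grounded in someone's own writing.
# Assumes the OpenAI Python SDK (v1.x) and OPENAI_API_KEY in the environment.
# The excerpts and prompt wording are hypothetical stand-ins.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical corpus: short excerpts from the person's books, talks, and posts.
excerpts = [
    "Excerpt from a book chapter on networks and opportunity...",
    "Excerpt from a podcast transcript on AI and entrepreneurship...",
    "Excerpt from a commencement speech on taking intelligent risks...",
]

system_prompt = (
    "You are a digital twin of the author whose writing appears below. "
    "Answer in their voice, draw only on ideas found in the excerpts, "
    "and say so when the excerpts don't cover a question.\n\n"
    + "\n\n---\n\n".join(excerpts)
)

def ask_twin(question: str) -> str:
    """Send one question to the persona and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
        temperature=0.7,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_twin("What do you see as the biggest risk of building a digital twin?"))
```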


From DSC:
This ability to ask questions of a digital twin is very interesting when you think about it in terms of “interviewing” a historical figure. I believe character.ai provides this kind of thing, but I haven’t used it much.


 

Instructors as Innovators: a Future-focused Approach to New AI Learning Opportunities, With Prompts — from papers.ssrn.com by Ethan R. Mollick and Lilach Mollick

Abstract

This paper explores how instructors can leverage generative AI to create personalized learning experiences for students that transform teaching and learning. We present a range of AI-based exercises that enable novel forms of practice and application, including simulations, mentoring, coaching, and co-creation. For each type of exercise, we provide prompts that instructors can customize, along with guidance on classroom implementation, assessment, and risks to consider. We also provide blueprints, prompts that help instructors create their own original prompts. Instructors can leverage their content and pedagogical expertise to design these experiences, putting them in the role of builders and innovators. We argue that this instructor-driven approach has the potential to democratize the development of educational technology by enabling individual instructors to create AI exercises and tools tailored to their students’ needs. While the exercises in this paper are a starting point, not definitive solutions, they demonstrate AI’s potential to expand what is possible in teaching and learning.

 

The AI Tools in Education Database — from aitoolsdirectory.notion.site; via George Siemens

Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent AI tools in education and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.


Another Workshop for Faculty and Staff — from aiedusimplified.substack.com by Lance Eaton
A recent workshop with some adjustments.

The day started out with a short talk about AI (slides). Some of it is my usual schtick where I do a bit of Q&A with folks around myths and misunderstandings of generative AI in order to establish some common ground. These are often useful both in setting the tone and giving folks a sense of how I come to explore generative AI: with a mixture of humor, concern, curiosity, and of course, cat pics.

From there, we launched into a series of mini-workshops where folks had time to first play around with some previously created prompts around teaching and learning before moving on to prompts for administrative work. The prompts and other support materials are in this Workshop Resource Document. The goal was to just get them into using one or more AI tools with some useful prompts so they can learn more about their capabilities.


The Edtech Insiders Rundown of ASU+GSV 2024 — from edtechinsiders.substack.com by Sarah Morin, Alex Sarlin, and Ben Kornell
And more on Edtech Insiders+, upcoming events, Gauth, AI Reading Tutors, The Artificial Intelligence Interdisciplinary Institute, and TeachAI Policy Resources

Alex Sarlin

4. Everyone is Edtech Now
This year, in addition to investors, entrepreneurs, educators, school leaders, university admins, non-profits, publishers, and operators from countless edtech startups and incumbents, there were some serious big tech companies in attendance like Meta, Google, OpenAI, Microsoft, Amazon, Tiktok, and Canva. Additionally, a horde of management consultancies, workforce organizations, mental health orgs, and filmmakers were in attendance.

Edtech continues to expand as an industry category and everyone is getting involved.


Ep 18 | Rethinking Education, Lessons to Unlearn, Become a Generalist, & More — Ana Lorena Fábrega — from mishadavinci.substack.com by Misha da Vinci

It was such a delight to chat with Ana. She’s brilliant and passionate, a talented educator, and an advocate for better ways of learning for children and adults. We cover ways to transform schools so that students get real-world skills, learn resilience and how to embrace challenges, and are prepared for an unpredictable future. And we go hard on why we must keep learning no matter our age, become generalists, and leverage technology in order to adapt to the fast-changing world.

Misha also featured an item re: the future of schooling and it contained this graphic:


Texas is replacing thousands of human exam graders with AI — from theverge.com by Jess Weatherbed

The Texas Tribune reports an “automated scoring engine” that utilizes natural language processing — the technology that enables chatbots like OpenAI’s ChatGPT to understand and communicate with users — is being rolled out by the Texas Education Agency (TEA) to grade open-ended questions on the State of Texas Assessments of Academic Readiness (STAAR) exams. The agency is expecting the system to save $15–20 million per year by reducing the need for temporary human scorers, with plans to hire under 2,000 graders this year compared to the 6,000 required in 2023.
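
The Tribune report doesn’t describe how TEA’s scoring engine actually works, so the snippet below is only a hedged illustration of the general idea behind NLP-based scoring of open-ended answers: compare a student response to human-scored reference answers and assign the score of the closest match. It uses scikit-learn’s TF-IDF vectorizer and cosine similarity; the reference answers, score values, and score_response helper are hypothetical, and a production engine would be far more sophisticated.

```python
# Illustrative baseline for NLP-based scoring of open-ended responses:
# score a student answer by its TF-IDF cosine similarity to human-scored
# reference answers. Requires scikit-learn; the sample answers and score
# values are hypothetical.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical reference answers already scored by human raters (0-2 points).
reference_answers = {
    "Photosynthesis converts sunlight, water, and carbon dioxide into glucose and oxygen.": 2,
    "Plants use sunlight to make their own food.": 1,
    "Plants grow in the sun.": 0,
}

def score_response(student_answer: str) -> int:
    """Return the score of the most similar human-scored reference answer."""
    references = list(reference_answers)
    vectorizer = TfidfVectorizer().fit(references + [student_answer])
    ref_matrix = vectorizer.transform(references)
    student_vec = vectorizer.transform([student_answer])
    similarities = cosine_similarity(student_vec, ref_matrix).ravel()
    return reference_answers[references[similarities.argmax()]]

print(score_response("Plants turn sunlight and CO2 into sugar and release oxygen."))
```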


Debating About AI: An Easy Path to AI Awareness and Basic Literacy — from stefanbauschard.substack.com by Stefan Bauschard
If you are an organization committed to AI literacy, consider sponsoring some debate topics and/or debates next year and expose thousands of students to AI literacy.

Resolved: Teachers should integrate generative AI in their teaching and learning.

The topic is simple but raises an issue that students can connect with.

While helping my students prepare and judging debates, I saw students demonstrate an understanding of many key issues and controversies.

These included—

*AI writing assessment/grading
*Bias
*Bullying
*Cognitive load
*Costs of AI systems
*Declining test scores
*Deep fakes
*Differentiation
*Energy consumption
*Hallucinations
*Human-to-human connection
*Inequality and inequity in access
*Neurodiversity
*Personalized learning
*Privacy
*Regulation (lack thereof)
*The future of work and unemployment
*Saving teachers time
*Soft skills
*Standardized testing
*Student engagement
*Teacher awareness and AI training; training resource trade-offs
*Teacher crowd-out
*Transparency and explainability
*Writing detectors (students had an exaggerated sense of the workability of these tools).

 

AI RESOURCES AND TEACHING (Kent State University) — from aiadvisoryboards.wordpress.com

AI Resources and Teaching | Kent State University offers valuable resources for educators interested in incorporating artificial intelligence (AI) into their teaching practices. The university recognizes that the rapid emergence of AI tools presents both challenges and opportunities in higher education.

The AI Resources and Teaching page provides educators with information and guidance on various AI tools and their responsible use within and beyond the classroom. The page covers different areas of AI application, including language generation, visuals, videos, music, information extraction, quantitative analysis, and AI syllabus language examples.


A Cautionary AI Tale: Why IBM’s Dazzling Watson Supercomputer Made a Lousy Tutor — from the74million.org by Greg Toppo
With a new race underway to create the next teaching chatbot, IBM’s abandoned 5-year, $100M ed push offers lessons about AI’s promise and its limits.

For all its jaw-dropping power, Watson the computer overlord was a weak teacher. It couldn’t engage or motivate kids, inspire them to reach new heights or even keep them focused on the material — all qualities of the best mentors.

It’s a finding with some resonance to our current moment of AI-inspired doomscrolling about the future of humanity in a world of ascendant machines. “There are some things AI is actually very good for,” Nitta said, “but it’s not great as a replacement for humans.”

His five-year journey to essentially a dead-end could also prove instructive as ChatGPT and other programs like it fuel a renewed, multimillion-dollar experiment to, in essence, prove him wrong.

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

From DSC:
This is why the vision that I’ve been tracking and working on has always said that HUMAN BEINGS will be necessary — they are key to realizing this vision. Along these lines, here’s a relevant quote:

Another crucial component of a new learning theory for the age of AI would be the cultivation of “blended intelligence.” This concept recognizes that the future of learning and work will involve the seamless integration of human and machine capabilities, and that learners must develop the skills and strategies needed to effectively collaborate with AI systems. Rather than viewing AI as a threat to human intelligence, a blended intelligence approach seeks to harness the complementary strengths of humans and machines, creating a symbiotic relationship that enhances the potential of both.

Per Alexander “Sasha” Sidorkin, Head of the National Institute on AI in Society at California State University Sacramento.

 

Making your campus neurodivergent friendly — from timeshighereducation.com
How to create a university where neurodivergent staff and students feel welcome and thrive in the classroom, in the lab and throughout campus

Neurodivergent students and staff think about, interact with and see the world differently from their neurotypical peers and colleagues. Universities that adopt inclusive practices to welcome people with ADHD, autism, dyslexia, dyspraxia and other disabilities to campus also foster their distinct strengths and talents in the classroom, labs, boardrooms and social spaces. This collection of resources offers advice for teachers, researchers, PhD supervisors and administrators for supporting neurodiversity in higher education.


Some Colleges Will Soon Charge $100,000 a Year. How Did This Happen? — from nytimes.com by Ron Lieber; via Ryan Craig
Some Vanderbilt students will have $100,000 in total expenses for the 2024-25 school year. The school doesn’t really want to talk about it.

It was only a matter of time before a college would have the nerve to quote its cost of attendance at nearly $100,000 a year. This spring, we’re catching our first glimpse of it.

One letter to a newly admitted Vanderbilt University engineering student showed an all-in price — room, board, personal expenses, a high-octane laptop — of $98,426. A student making three trips home to Los Angeles or London from the Nashville campus during the year could hit six figures.

This eye-popping sum is an anomaly. Only a tiny fraction of college-going students will pay anything close to this anytime soon, and about 35 percent of Vanderbilt students — those who get neither need-based nor merit aid — pay the full list price.

But a few dozen other colleges and universities that reject the vast majority of applicants will probably arrive at this threshold within a few years. Their willingness to cross it raises two questions for anyone shopping for college: How did this happen, and can it possibly be worth it?


‘Running Out of Road’ for FAFSA Completion — from insidehighered.com by Liam Knox
The number of students who filled out the federal aid form is down nearly 30 percent. The ramifications for access and enrollment could be devastating.

And that’s probably an optimistic estimate, said Bill DeBaun, NCAN’s senior director of data and strategic initiatives; if the pace of completion doesn’t pick up, the decline could be closer to 700,000 students. That could translate to up to a 4 percent drop in college-goers come fall, DeBaun said, which would be the largest enrollment drop since the COVID-19 pandemic—and one that’s likely to be made up primarily of low-income and first-generation students.


Study: Nearly 40 Percent of Students Started, Never Finished College — from insidehighered.com by Kathryn Palmer
Federal researchers followed the post-secondary outcomes of 23,000 students for 12 years. 

Only 60 percent of students who enrolled in college earned a degree or credential within eight years of graduating high school.

That’s one of the biggest takeaways from a new report the National Center for Education Statistics released Monday that analyzed the enrollment, completion and financial aid outcomes of students.

The researchers tracked the postsecondary educational outcomes of roughly 23,000 students beginning in 2009, when they were freshmen in high school, through 2021, when the cohort was eight years out from graduating high school.


Race to the Finish | The rise of faster bachelor’s degrees raises the question: What is college for? — from chronicle.com by Kelly Field; from Jeff Selingo

Taken together, the two recent decisions illustrate a blurring of the lines between the two- and four-year sectors that is taking place not just in Idaho, but nationwide, as colleges struggle to overcome enrollment declines and skepticism about the value of a bachelor’s degree.

“It’s pretty clear that higher education is in a funk,” said Robert M. Zemsky, a University of Pennsylvania professor, who has been advocating for three-year programs for more than 15 years. “There’s a sense that we have to do something to make the product better, more relevant, and less costly to students.”


Excerpt from Next — from/by Jeff Selingo

Bottom line: While critics of a shorter degree see it as a lesser replacement for the four-year baccalaureate degree, advocates see it as another option for students who might not be interested in college at a time when enrollment is falling.

  • “We need to use this opportunity to redesign and do things better,” Carrell said. “That means that we all need to stay curious. We need to be a learning enterprise…and learn from the evidence we produce.”

Job-Ready on Day One — from the-job.beehiiv.com by Paul Fain

The U.S. faces a serious shortage of workers in the skilled trades—fields like HVAC, plumbing, electrical, solar, and construction. And those labor gaps are likely to widen as the federal government spends billions on infrastructure projects.

Employers in these industries are desperate for hires, says Doug Donovan, the founder and CEO of Interplay Learning. Yet the “challenge is not employer demand for workers,” he says, “but rather ensuring that learners learn about skilled trades careers and pursue them.”

The Austin-based Interplay offers online and VR training for workers in the skilled trades. The company was founded in 2016 with a focus on upskilling the hands-on worker. Even before the pandemic exacerbated labor shortages, Donovan says companies in these trades needed to hire workers who didn’t have all the skills required for jobs.

Interplay’s online courses and 3D, interactive simulations get close to what a learner is going to see on the job, says Donovan. “We aren’t trying to replace hands-on, instructor-led training,” he says. “We are trying to deliver tools that enhance that hands-on time or make it more efficient.”


 

 

The University Student’s Guide To Ethical AI Use  — from studocu.com; with thanks to Jervise Penton at 6XD Media Group for this resource

This comprehensive guide offers:

  • Up-to-date statistics on the current state of AI in universities, how institutions and students are currently using artificial intelligence
  • An overview of popular AI tools used in universities and their limitations as study tools
  • Tips on how to ethically use AI and how to maximize its capabilities for students
  • Existing punishments and penalties for cheating with AI
  • A checklist of questions to ask yourself, before, during, and after an assignment to ensure ethical use

Some of the key facts you might find interesting are:

  • The total value of AI in education is estimated to reach $53.68 billion by the end of 2032.
  • 68% of students say using AI has impacted their academic performance positively.
  • Educators using AI tools say the technology helps speed up their grading process by as much as 75%.
 

Corporate Learning Is Boring — But It Doesn’t Have to Be — from hbr.org by Duncan Wardle; via GSV

Summary:
Most corporate learnings aren’t cutting it. Almost 60% of employees say they’re interested in upskilling and training, but 57% of workers also say they’re already pursuing training outside of work. The author, the former Head of Innovation and Creativity at Disney, argues that creativity is the missing piece to make upskilling engaging and effective. From his experience, he shares four strategies to unlock creativity in trainings: 1) Encourage “What if?”, 2) respond “How else?” to challenges, 3) give people time to think by encouraging playfulness, and 4) make training a game.

 

[Report] The Top 100 AI for Work – April 2024 — from flexos.work; with thanks to Daan van Rossum for this resource
AI is helping us work up to 41% more effectively, according to recent Bain research. We review the platforms to consider for ourselves and our teams.

Following our AI Top 150, we spent the past few weeks analyzing data on the top AI platforms for work. This report shares key insights, including the AI tools you should consider adopting to work smarter, not harder.

While there is understandable concern about AI in the work context, the platforms in this list paint a different picture. It shows a future of work where people can do what humans are best suited for while offloading repetitive, digital tasks to AI.

This will fuel the notion that it’s not AI that takes your job but a supercharged human with an army of AI tools and agents. This should be a call to action for every working person and business leader reading this.

 

Assessment of Student Learning Is Broken — from insidehighered.com by Zach Justus and Nik Janos
And generative AI is the thing that broke it, Zach Justus and Nik Janos write.

Generative artificial intelligence (AI) has broken higher education assessment. This has implications from the classroom to institutional accreditation. We are advocating for a one-year pause on assessment requirements from institutions and accreditation bodies.

Implications and Options
The data we are collecting right now are literally worthless. These same trends implicate all data gathered from December 2022 through the present. So, for instance, if you are conducting a five-year program review for institutional accreditation you should separate the data from before the fall 2022 term and evaluate it independently. Whether you are evaluating writing, STEM outputs, coding, or anything else, you are now looking at some combination of student/AI work. This will get even more confounding as AI tools become more powerful and are integrated into our existing production platforms like Microsoft Office and Google Workspace.

The burden of adapting to artificial intelligence has fallen to faculty, but we are not positioned or equipped to lead these conversations across stakeholder groups.


7 TIPS TO AUTOMATE YOUR CLASSROOM WITH AI — from classtechtips.com by Dr. Monica Burns

 

 

The New Academic Arms Race | Competition over amenities is over. The next battleground is technology. — from chronicle.com by Jeffrey J. Selingo

Now, after the pandemic, with the value of the bachelor’s degree foremost in the minds of students and families, a new academic arms race is emerging. This one is centered around academic innovation. The winners will be those institutions that in the decade ahead better apply technology in teaching and learning and develop different approaches to credentialing.

Sure, technology is often seen as plumbing on campuses — as long as it works, we don’t worry about it. And rarely do prospective students on a tour ever ask about academic innovations like extended reality or microcredentials. Campus tours prefer to show off the bells and whistles of residential life within dorms and dining halls.

That’s too bad.

The problem is not a lack of learners, but rather a lack of alignment in what colleges offer to a generation of learners surrounded by Amazon, Netflix, and Instagram, where they can stream entertainment and music anytime, anywhere.

From DSC:
When I worked for Calvin (then College, now University) from 2007 to 2017, that’s exactly how technologies and the entire IT Department were viewed — as infrastructure providers. We were not viewed as being able to enhance the core business/offerings of the institution. We weren’t relevant in that area. In fact, the IT Department was shoved down in the basement of the library. Our Teaching & Learning Digital Studio was sidelined in a part of the library where few students went. The Digital Studio’s marketing efforts didn’t help much, as faculty members didn’t offer assignments that called for multimedia-based deliverables. It was a very tough and steep hill to climb.

Also the Presidents and Provosts over the last couple of decades (not currently though) didn’t think much of online-based learning, and the top administrators dissed the Internet’s ability to provide 24/7 worldwide conversations and learning. They missed the biggest thing to come along in education in 500 years (since the invention of the printing press). Our Teaching & Learning Group provided leadership by starting a Calvin Online pilot. We had 13-14 courses built and inquiries from Christian-based high schools were coming in for dual enrollment scenarios, but when it came time for the College to make a decision, it never happened. The topic/vote never made it to the floor of the Faculty Senate. The faculty and administration missed an enormous opportunity.

When Calvin College became Calvin University in 2019, they were forced to offer online-based classes. Had they supported our T&L Group’s efforts back in the early to mid-2010s, they would have dovetailed very nicely into offering more courses to working adults. They would have built up the internal expertise to offer these courses/programs. But the culture of the college put a stop to online-based learning at that time. They now regret that decision, I’m sure (as they’ve had to outsource many things, and they now offer numerous online-based courses and even entire programs — at a high cost, most likely).

My how times have changed.


For another item re: higher education at the 30,000-foot level, see:


Lifelong Learning Models for a Changing Higher Ed Marketplace — from changinghighered.com by Dr. Drumm McNaughton and Amrit Ahluwalia
Exploring the transformation of higher education into lifelong learning hubs for workforce development, with innovative models and continuing education’s role.

Higher education is undergoing transformational change to redefine its role as a facilitator of lifelong learning and workforce development. In this 200th episode of Changing Higher Ed, host Dr. Drumm McNaughton and guest Amrit Ahluwalia, incoming Executive Director for Continuing Studies at Western University, explore innovative models positioning universities as sustainable hubs for socioeconomic mobility.

The Consumer-Driven Educational Landscape
Over 60% of today’s jobs will be redefined by 2025, driving demand for continuous upskilling and reskilling to meet evolving workforce needs. However, higher education’s traditional model of imparting specific knowledge through multi-year degrees is hugely misaligned with this reality.

Soaring education costs have fueled a consumer mindset shift, with learners demanding a clear return on investment directly aligned with their career goals. The expectation is to see immediate skills application and professional impact from their educational investments, not just long-term outcomes years after completion.


 

How Generative AI Owns Higher Education. Now What? — from forbes.co by Steve Andriole

Excerpt (emphasis DSC):

What about course videos? Professors can create them (by lecturing into a camera for several hours, hopefully in different clothes) from the readings, from their interpretations of the readings, from their own case experiences – from anything they like. But now professors can direct the creation of the videos by talking – actually describing – to a CustomGPT about what they’d like the video to communicate, with their own or another image. Wait. What? They can make a video by talking to a CustomGPT and even select the image they want the “actor” to use? Yes. They can also add a British accent and insert some (GenAI-developed) jokes into the videos if they like. All this and much more is now possible. This means that a professor can specify how long the video should be, what sources should be consulted and describe the demeanor the professor wants the video to project.

From DSC:
Though I wasn’t crazy about the clickbait type of title here, I still thought that the article was solid and thought-provoking. It contained several good ideas for using AI.


Excerpt from a recent EdSurge Higher Ed newsletter:


There are darker metaphors though — ones that focus on the hazards for humanity of the tech. Some professors worry that AI bots are simply replacing hired essay-writers for many students, doing work for a student that they can then pass off as their own (and doing it for free).

From DSC:
Hmmm…the use of essay writers was around long before AI became mainstream within higher education. So we already had a serious problem where students didn’t see the why in what they were being asked to do. Some students still aren’t sold on the why of the work in the first place. The situation seems to involve ethics, yes, but it also seems to say that we haven’t sold students on the benefits of putting in the work. Students seem to be saying I don’t care about this stuff…I just need the degree so I can exit stage left.

My main point: The issue didn’t start with AI…it started long before that.

And somewhat relevant here, also see:

I Have Bigger Fish to Fry: Why K12 Education is Not Thinking About AI — from medium.com by Maurie Beasley, M.Ed. (Edited by Jim Beasley)

This financial stagnation is occurring as we face a multitude of escalating challenges. These challenges include, but are in no way limited to, chronic absenteeism, widespread student mental health issues, critical staff shortages, rampant classroom behavior issues, a palpable sense of apathy for education in students, and even, I dare say, hatred towards education among parents and policymakers.

Our current focus is on keeping our heads above water, ensuring our students’ safety and mental well-being, and simply keeping our schools staffed and our doors open.


Meet Ed: Ed is an educational friend designed to help students reach their limitless potential. — from lausd.org (Los Angeles School District, the second largest in the U.S.)

What is Ed?
An easy-to-understand learning platform designed by Los Angeles Unified to increase student achievement. It offers personalized guidance and resources to students and families 24/7 in over 100 languages.


Also relevant/see:

  • Los Angeles Unified Bets Big on ‘Ed,’ an AI Tool for Students — by Lauraine Langreo
    The Los Angeles Unified School District has launched an AI-powered learning tool that will serve as a “personal assistant” to students and their parents. The tool, named “Ed,” can provide students from the nation’s second-largest district information about their grades, attendance, upcoming tests, and suggested resources to help them improve their academic skills on their own time, Superintendent Alberto Carvalho announced March 20. Students can also use the app to find social-emotional-learning resources, see what’s for lunch, and determine when their bus will arrive.

Could OpenAI’s Sora be a big deal for elementary school kids? — from futureofbeinghuman.com by Andrew Maynard
Despite all the challenges it comes with, AI-generated video could unleash the creativity of young children and provide insights into their inner worlds – if it’s developed and used responsibly

Like many others, I’m concerned about the challenges that come with hyper-realistic AI-generated video. From deep fakes and disinformation to blurring the lines between fact and fiction, generative AI video is calling into question what we can trust, and what we cannot.

And yet despite all the issues the technology is raising, it also holds quite incredible potential, including as a learning and development tool — as long as we develop and use it responsibly.

I was reminded of this a few days back while watching the latest videos from OpenAI created by their AI video engine Sora — including the one below generated from the prompt “an elephant made of leaves running in the jungle.”

What struck me while watching this — perhaps more than any of the other videos OpenAI has been posting on its TikTok channel — is the potential Sora has for translating the incredibly creative but often hard to articulate ideas someone may have in their head, into something others can experience.


Can AI Aid the Early Education Workforce? — from edsurge.com by Emily Tate Sullivan
During a panel at SXSW EDU 2024, early education leaders discussed the potential of AI to support and empower the adults who help our nation’s youngest children.

While the vast majority of the conversations about AI in education have centered on K-12 and higher education, few have considered the potential of this innovation in early care and education settings.

At the conference, a panel of early education leaders gathered to do just that, in a session exploring the potential of AI to support and empower the adults who help our nation’s youngest children, titled, “ChatECE: How AI Could Aid the Early Educator Workforce.”

Hau shared that K-12 educators are using the technology to improve efficiency in a number of ways, including to draft individualized education programs (IEPs), create templates for communicating with parents and administrators, and in some cases, to support building lesson plans.


From EIEIO…Seasons Of Change

Again, we’ve never seen change happen as fast as it’s happening.


Enhancing World Language Instruction With AI Image Generators — from eduoptia.org by Rachel Paparone
By crafting an AI prompt in the target language to create an image, students can get immediate feedback on their communication skills.

Educators are, perhaps rightfully so, cautious about incorporating AI in their classrooms. With thoughtful implementation, however, AI image generators, with their ability to use any language, can provide powerful ways for students to engage with the target language and increase their proficiency.
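
The article above describes the exercise in prose without naming a specific generator, so here is a small, hedged sketch of what the workflow could look like in code, assuming OpenAI’s image API as the generator. The French sentence is a hypothetical student prompt; the point is that the returned image gives the student immediate, visual feedback on whether their target-language description said what they meant.

```python
# Sketch of the classroom exercise: a student writes an image prompt in the
# target language and compares the generated image to what they meant to say.
# Assumes the OpenAI Python SDK (v1.x) and OPENAI_API_KEY in the environment;
# the student prompt is a hypothetical example in French.

from openai import OpenAI

client = OpenAI()

# Hypothetical student prompt: "A black cat sleeping on a red chair in front
# of a big window."
student_prompt = "Un chat noir qui dort sur une chaise rouge devant une grande fenêtre."

result = client.images.generate(
    model="dall-e-3",
    prompt=student_prompt,
    n=1,
    size="1024x1024",
)

# The student and teacher compare this image against the intended meaning.
print(result.data[0].url)
```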


AI in the Classroom: A Teacher’s Toolkit for Transformation — from esheninger.blogspot.com by Eric Sheninger

While AI offers numerous benefits, it’s crucial to remember that it is a tool to empower educators, not replace them. The human connection between teacher and student remains central to fostering creativity, critical thinking, and social-emotional development. The role of teachers will shift towards becoming facilitators, curators, and mentors who guide students through personalized learning journeys. By harnessing the power of AI, educators can create dynamic and effective classrooms that cater to each student’s individual needs. This paves the way for a more engaging and enriching learning experience that empowers students to thrive.


Teachers Are Using AI to Create New Worlds, Help Students with Homework, and Teach English — from themarkup.org by Ross Teixeira; via Matthew Tower
Around the world, these seven teachers are making AI work for them and their students

In this article, seven teachers across the world share their insights on AI tools for educators. You will hear a host of varied opinions and perspectives on everything from whether AI could hasten the decline of learning foreign languages to whether AI-generated lesson plans are an infringement on teachers’ rights. A common theme emerged from those we spoke with: just as the internet changed education, AI tools are here to stay, and it is prudent for teachers to adapt.


Teachers Desperately Need AI Training. How Many Are Getting It? — from edweek.org by Lauraine Langreo

Even though it’s been more than a year since ChatGPT made a big splash in the K-12 world, many teachers say they are still not receiving any training on using artificial intelligence tools in the classroom.

More than 7 in 10 teachers said they haven’t received any professional development on using AI in the classroom, according to a nationally representative EdWeek Research Center survey of 953 educators, including 553 teachers, conducted between Jan. 31 and March 4.

From DSC:
This article mentioned the following resource:

Artificial Intelligence Explorations for Educators — from iste.org


 
© 2024 | Daniel Christian