The AI Tools in Education Database — from aitoolsdirectory.notion.site; via George Siemens

Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent AI tools in education and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.


Another Workshop for Faculty and Staff — from aiedusimplified.substack.com by Lance Eaton
A recent workshop with some adjustments.

The day started out with a short talk about AI (slides). Some of it is my usual schtick where I do a bit of Q&A with folks around myths and misunderstandings of generative AI in order to establish some common ground. These are often useful both in setting the tone and giving folks a sense of how I come to explore generative AI: with a mixture of humor, concern, curiosity, and of course, cat pics.

From there, we launched into a series of mini-workshops where folks had time to first play around with some previously created prompts around teaching and learning before moving on to prompts for administrative work. The prompts and other support materials are in this Workshop Resource Document. The goal was to just get them using one or more AI tools with some useful prompts so they can learn more about their capabilities.


The Edtech Insiders Rundown of ASU+GSV 2024 — from edtechinsiders.substack.com by Sarah Morin, Alex Sarlin, and Ben Kornell
And more on Edtech Insiders+, upcoming events, Gauth, AI Reading Tutors, The Artificial Intelligence Interdisciplinary Institute, and TeachAI Policy Resources

Alex Sarlin

4. Everyone is Edtech Now
This year, in addition to investors, entrepreneurs, educators, school leaders, university admins, non-profits, publishers, and operators from countless edtech startups and incumbents, there were some serious big tech companies in attendance, like Meta, Google, OpenAI, Microsoft, Amazon, TikTok, and Canva. A horde of management consultancies, workforce organizations, mental health orgs, and filmmakers also showed up.

Edtech continues to expand as an industry category and everyone is getting involved.


Ep 18 | Rethinking Education, Lessons to Unlearn, Become a Generalist, & More — Ana Lorena Fábrega — from mishadavinci.substack.com by Misha da Vinci

It was such a delight to chat with Ana. She’s brilliant and passionate, a talented educator, and an advocate for better ways of learning for children and adults. We cover ways to transform schools so that students get real-world skills, learn resilience and how to embrace challenges, and are prepared for an unpredictable future. And we go hard on why we must keep learning no matter our age, become generalists, and leverage technology in order to adapt to the fast-changing world.

Misha also featured an item re: the future of schooling and it contained this graphic:


Texas is replacing thousands of human exam graders with AI — from theverge.com by Jess Weatherbed

The Texas Tribune reports that an “automated scoring engine” that utilizes natural language processing — the technology that enables chatbots like OpenAI’s ChatGPT to understand and communicate with users — is being rolled out by the Texas Education Agency (TEA) to grade open-ended questions on the State of Texas Assessments of Academic Readiness (STAAR) exams. The agency expects the system to save $15–20 million per year by reducing the need for temporary human scorers, with plans to hire under 2,000 graders this year compared to the 6,000 required in 2023.
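The TEA has not published the scoring engine's internals, so the sketch below is only a rough illustration of one common NLP approach to scoring open-ended responses: compare a student's answer to human-scored exemplar answers using sentence-embedding similarity. The model name, the exemplar answers, and the nearest-exemplar scoring rule are all assumptions made for this example, not a description of the STAAR system.

```python
# Illustrative sketch only: score an open-ended answer by semantic similarity
# to exemplar responses that human graders have already scored. This is NOT
# the TEA/STAAR engine; it just shows one common NLP technique for the task.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical exemplar answers, each previously scored by a human grader.
exemplars = {
    2: "The author uses the storm to show how the family grows closer under pressure.",
    1: "The storm matters because things change for the family after it.",
    0: "I liked the story about the storm.",
}

def score_response(student_answer: str) -> int:
    """Return the point value of the exemplar most similar to the student answer."""
    answer_emb = model.encode(student_answer, convert_to_tensor=True)
    best_points, best_sim = 0, -1.0
    for points, exemplar in exemplars.items():
        exemplar_emb = model.encode(exemplar, convert_to_tensor=True)
        sim = util.cos_sim(answer_emb, exemplar_emb).item()
        if sim > best_sim:
            best_points, best_sim = points, sim
    return best_points

print(score_response("The storm forces the family to rely on each other, which brings them together."))
```

A real scoring engine would also need calibration against large sets of human-scored responses and routine human audits; similarity to exemplars is only the simplest version of the idea.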


Debating About AI: An Easy Path to AI Awareness and Basic Literacy — from stefanbauschard.substack.com by Stefan Bauschard
If you are an organization committed to AI literacy, consider sponsoring some debate topics and/or debates next year to expose thousands of students to AI literacy.

Resolved: Teachers should integrate generative AI in their teaching and learning.

The topic is simple but raises an issue that students can connect with.

While helping my students prepare and judging debates, I saw students demonstrate an understanding of many key issues and controversies.

These included:

  • AI writing assessment/grading
  • Bias
  • Bullying
  • Cognitive load
  • Costs of AI systems
  • Declining test scores
  • Deep fakes
  • Differentiation
  • Energy consumption
  • Hallucinations
  • Human-to-human connection
  • Inequality and inequity in access
  • Neurodiversity
  • Personalized learning
  • Privacy
  • Regulation (lack thereof)
  • The future of work and unemployment
  • Saving teachers time
  • Soft skills
  • Standardized testing
  • Student engagement
  • Teacher awareness and AI training; training resource trade-offs
  • Teacher crowd-out
  • Transparency and explainability
  • Writing detectors (students had an exaggerated sense of the workability of these tools).

 

Beyond the Hype: Taking a 50 Year Lens to the Impact of AI on Learning — from nafez.substack.com by Nafez Dakkak and Chris Dede
How do we make sure LLMs are not “digital duct tape”?

[Per Chris Dede] We often think of the product of teaching as the outcome (e.g. an essay, a drawing, etc.). The essence of education, in my view, lies not in the products or outcomes of learning but in the journey itself. The artifact is just a symbol that you’ve taken the journey.

The process of learning — the exploration, challenges, and personal growth that occur along the way — is where the real value lies. For instance, the act of writing an essay is valuable not merely for the final product but for the intellectual journey it represents. It forces you to improve and organize your thinking on a subject.

This distinction becomes important with the rise of generative AI, because it uniquely allows us to produce these artifacts without taking the journey.

As I’ve argued previously, I am worried that all this hype around LLMs renders them a “type of digital duct-tape to hold together an obsolete industrial-era educational system”. 


Speaking of AI in our learning ecosystems, also see:


On Building an AI Policy for Teaching & Learning — by Lance Eaton
How students drove the development of a policy for students and faculty

Well, last month, the policy was finally approved by our Faculty Curriculum Committee and we can finally share the final version: AI Usage Policy. College Unbound also created (all-human, no AI used) a press release with the policy and some of the details.

To ensure you see this:

  • Usage Guidelines for AI Generative Tools at College Unbound
    These guidelines were created and reviewed by College Unbound students in Spring 2023 with the support of Lance Eaton, Director of Faculty Development & Innovation. The students include S. Fast, K. Linder-Bey, Veronica Machado, Erica Maddox, Suleima L., and Lora Roy.

ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds — from psypost.org by Eric W. Dolan

A recent study has found that scientific citations generated by ChatGPT often do not correspond to real academic work. The study, published in the Canadian Psychological Association’s Mind Pad, found that “false citation rates” across various psychology subfields ranged from 6% to 60%. Surprisingly, these fabricated citations feature elements such as legitimate researchers’ names and properly formatted digital object identifiers (DOIs), which could easily mislead both students and researchers.

MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate — such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals.
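A practical first screen for the problem described above is to check whether each cited DOI actually resolves in a registry such as Crossref. The sketch below uses the public Crossref REST API; the example DOI strings are placeholders, and a DOI that resolves is only a necessary check, since fabricated citations sometimes borrow real DOIs from unrelated papers.

```python
# Minimal sketch: flag reference-list DOIs that do not resolve in Crossref.
# A 404 is a strong signal the citation is fabricated; a 200 is not proof of
# a genuine citation, because hallucinated references can reuse real DOIs.
import requests

def doi_resolves(doi: str) -> bool:
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    return resp.status_code == 200

# Placeholder DOI strings for illustration; substitute the DOIs from the
# reference list you want to screen.
dois_to_check = [
    "10.1000/example-doi-from-a-reference-list",
    "10.1037/made-up.2023.12345",
]

for doi in dois_to_check:
    status = "resolves" if doi_resolves(doi) else "NOT FOUND: verify by hand"
    print(f"{doi}: {status}")
```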

 

AI RESOURCES AND TEACHING (Kent State University) — from aiadvisoryboards.wordpress.com

AI Resources and Teaching | Kent State University offers valuable resources for educators interested in incorporating artificial intelligence (AI) into their teaching practices. The university recognizes that the rapid emergence of AI tools presents both challenges and opportunities in higher education.

The AI Resources and Teaching page provides educators with information and guidance on various AI tools and their responsible use within and beyond the classroom. The page covers different areas of AI application, including language generation, visuals, videos, music, information extraction, quantitative analysis, and AI syllabus language examples.


A Cautionary AI Tale: Why IBM’s Dazzling Watson Supercomputer Made a Lousy Tutor — from the74million.org by Greg Toppo
With a new race underway to create the next teaching chatbot, IBM’s abandoned 5-year, $100M ed push offers lessons about AI’s promise and its limits.

For all its jaw-dropping power, Watson the computer overlord was a weak teacher. It couldn’t engage or motivate kids, inspire them to reach new heights or even keep them focused on the material — all qualities of the best mentors.

It’s a finding with some resonance to our current moment of AI-inspired doomscrolling about the future of humanity in a world of ascendant machines. “There are some things AI is actually very good for,” Nitta said, “but it’s not great as a replacement for humans.”

His five-year journey to essentially a dead-end could also prove instructive as ChatGPT and other programs like it fuel a renewed, multimillion-dollar experiment to, in essence, prove him wrong.

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

From DSC:
This is why the vision that I’ve been tracking and working on has always said that HUMAN BEINGS will be necessary — they are key to realizing this vision. Along these lines, here’s a relevant quote:

Another crucial component of a new learning theory for the age of AI would be the cultivation of “blended intelligence.” This concept recognizes that the future of learning and work will involve the seamless integration of human and machine capabilities, and that learners must develop the skills and strategies needed to effectively collaborate with AI systems. Rather than viewing AI as a threat to human intelligence, a blended intelligence approach seeks to harness the complementary strengths of humans and machines, creating a symbiotic relationship that enhances the potential of both.

Per Alexander “Sasha” Sidorkin, Head of the National Institute on AI in Society at California State University Sacramento.

 

AWS, Educause partner on generative AI readiness tool — from edscoop.com by Skylar Rispens
Amazon Web Services and the nonprofit Educause announced a new tool designed to help higher education institutions gauge their readiness to adopt generative artificial intelligence.

Amazon Web Services and the nonprofit Educause on Monday announced they’ve teamed up to develop a tool that assesses how ready higher education institutions are to adopt generative artificial intelligence.

Through a series of curated questions about institutional strategy, governance, capacity and expertise, AWS and Educause claim their assessment can point to ways that operations can be improved before generative AI is adopted to support students and staff.

“Generative AI will transform how educators engage students inside and outside the classroom, with personalized education and accessible experiences that provide increased student support and drive better learning outcomes,” Kim Majerus, vice president of global education and U.S. state and local government at AWS, said in a press release. “This assessment is a practical tool to help colleges and universities prepare their institutions to maximize this technology and support students throughout their higher ed journey.”


Speaking of AI and our learning ecosystems, also see:

Gen Z Wants AI Skills And Businesses Want Workers Who Can Apply AI: Higher Education Can Help — from forbes.com by Bruce Dahlgren

At a moment when the value of higher education has come under increasing scrutiny, institutions around the world can be exactly what learners and employers both need. To meet the needs of a rapidly changing job market and equip learners with the technical and ethical direction needed to thrive, institutions should familiarize students with the use of AI and nurture the innately human skills needed to apply it ethically. Failing to do so can create enormous risk for higher education, business and society.

What is AI literacy?
To effectively utilize generative AI, learners will need to grasp the appropriate use cases for these tools, understand when their use presents significant downside risk, and learn to recognize abuse to separate fact from fiction. AI literacy is a deeply human capacity. The critical thinking and communication skills required are muscles that need repeated training to be developed and maintained.

 

The University Student’s Guide To Ethical AI Use  — from studocu.com; with thanks to Jervise Penton at 6XD Media Group for this resource

This comprehensive guide offers:

  • Up-to-date statistics on the current state of AI in universities and how institutions and students are currently using artificial intelligence
  • An overview of popular AI tools used in universities and their limitations as study tools
  • Tips on how to use AI ethically and how students can maximize its capabilities
  • Existing punishments and penalties for cheating with AI
  • A checklist of questions to ask yourself before, during, and after an assignment to ensure ethical use

Some of the key facts you might find interesting are:

  • The total value of AI in education is estimated to reach $53.68 billion by the end of 2032.
  • 68% of students say using AI has impacted their academic performance positively.
  • Educators using AI tools say the technology helps speed up their grading process by as much as 75%.
 

Assessment of Student Learning Is Broken — from insidehighered.com by Zach Justus and Nik Janos
And generative AI is the thing that broke it, Zach Justus and Nik Janos write.

Generative artificial intelligence (AI) has broken higher education assessment. This has implications from the classroom to institutional accreditation. We are advocating for a one-year pause on assessment requirements from institutions and accreditation bodies.

Implications and Options
The data we are collecting right now are literally worthless. These same trends implicate all data gathered from December 2022 through the present. So, for instance, if you are conducting a five-year program review for institutional accreditation you should separate the data from before the fall 2022 term and evaluate it independently. Whether you are evaluating writing, STEM outputs, coding, or anything else, you are now looking at some combination of student/AI work. This will get even more confounding as AI tools become more powerful and are integrated into our existing production platforms like Microsoft Office and Google Workspace.

The burden of adapting to artificial intelligence has fallen to faculty, but we are not positioned or equipped to lead these conversations across stakeholder groups.


7 TIPS TO AUTOMATE YOUR CLASSROOM WITH AI — from classtechtips.com by Dr. Monica Burns

 

 

How Generative AI Owns Higher Education. Now What? — from forbes.com by Steve Andriole

Excerpt (emphasis DSC):

What about course videos? Professors can create them (by lecturing into a camera for several hours, hopefully in different clothes) from the readings, from their interpretations of the readings, from their own case experiences – from anything they like. But now professors can direct the creation of the videos by talking – actually describing – to a CustomGPT about what they’d like the video to communicate, using their own image or another one. Wait. What? They can make a video by talking to a CustomGPT and even select the image they want the “actor” to use? Yes. They can also add a British accent and insert some (GenAI-developed) jokes into the videos if they like. All this and much more is now possible. This means that a professor can specify how long the video should be, what sources should be consulted, and describe the demeanor the professor wants the video to project.

From DSC:
Though I wasn’t crazy about the clickbait-style title, I still thought the article was solid and thought-provoking. It contained several good ideas for using AI.


Excerpt from a recent EdSurge Higher Ed newsletter:


There are darker metaphors though — ones that focus on the hazards for humanity of the tech. Some professors worry that AI bots are simply replacing hired essay-writers for many students, doing work for a student that they can then pass off as their own (and doing it for free).

From DSC:
Hmmm…the use of essay writers was around long before AI became mainstream within higher education. So we already had a serious problem: students didn’t see the why in what they were being asked to do, and some still aren’t sold on it. The situation involves ethics, yes, but it also suggests we haven’t sold students on the benefits of putting in the work. Students seem to be saying, “I don’t care about this stuff…I just need the degree so I can exit stage left.”

My main point: The issue didn’t start with AI…it started long before that.

And somewhat relevant here, also see:

I Have Bigger Fish to Fry: Why K12 Education is Not Thinking About AI — from medium.com by Maurie Beasley, M.Ed. (Edited by Jim Beasley)

This financial stagnation is occurring as we face a multitude of escalating challenges. These challenges include, but are in no way limited to, chronic absenteeism, widespread student mental health issues, critical staff shortages, rampant classroom behavior issues, a palpable sense of apathy toward education among students, and even, I dare say, hatred toward education among parents and policymakers.

Our current focus is on keeping our heads above water, ensuring our students’ safety and mental well-being, and simply keeping our schools staffed and our doors open.


Meet Ed: Ed is an educational friend designed to help students reach their limitless potential. — from lausd.org (Los Angeles Unified School District, the second largest in the U.S.)

What is Ed?
An easy-to-understand learning platform designed by Los Angeles Unified to increase student achievement. It offers personalized guidance and resources to students and families 24/7 in over 100 languages.


Also relevant/see:

  • Los Angeles Unified Bets Big on ‘Ed,’ an AI Tool for Students — by Lauraine Langreo
    The Los Angeles Unified School District has launched an AI-powered learning tool that will serve as a “personal assistant” to students and their parents. The tool, named “Ed,” can provide students from the nation’s second-largest district information about their grades, attendance, upcoming tests, and suggested resources to help them improve their academic skills on their own time, Superintendent Alberto Carvalho announced March 20. Students can also use the app to find social-emotional-learning resources, see what’s for lunch, and determine when their bus will arrive.

Could OpenAI’s Sora be a big deal for elementary school kids? — from futureofbeinghuman.com by Andrew Maynard
Despite all the challenges it comes with, AI-generated video could unleash the creativity of young children and provide insights into their inner worlds – if it’s developed and used responsibly

Like many others, I’m concerned about the challenges that come with hyper-realistic AI-generated video. From deep fakes and disinformation to blurring the lines between fact and fiction, generative AI video is calling into question what we can trust, and what we cannot.

And yet despite all the issues the technology is raising, it also holds quite incredible potential, including as a learning and development tool — as long as we develop and use it responsibly.

I was reminded of this a few days back while watching the latest videos from OpenAI created by their AI video engine Sora — including the one below generated from the prompt “an elephant made of leaves running in the jungle.”

What struck me while watching this — perhaps more than any of the other videos OpenAI has been posting on its TikTok channel — is the potential Sora has for translating the incredibly creative but often hard-to-articulate ideas someone may have in their head into something others can experience.


Can AI Aid the Early Education Workforce? — from edsurge.com by Emily Tate Sullivan
During a panel at SXSW EDU 2024, early education leaders discussed the potential of AI to support and empower the adults who help our nation’s youngest children.

While the vast majority of the conversations about AI in education have centered on K-12 and higher education, few have considered the potential of this innovation in early care and education settings.

At the conference, a panel of early education leaders gathered to do just that, in a session exploring the potential of AI to support and empower the adults who help our nation’s youngest children, titled, “ChatECE: How AI Could Aid the Early Educator Workforce.”

Hau shared that K-12 educators are using the technology to improve efficiency in a number of ways, including to draft individualized education programs (IEPs), create templates for communicating with parents and administrators, and in some cases, to support building lesson plans.


From EIEIO…Seasons Of Change

Again, we’ve never seen change happen as fast as it’s happening.


Enhancing World Language Instruction With AI Image Generators — from edutopia.org by Rachel Paparone
By crafting an AI prompt in the target language to create an image, students can get immediate feedback on their communication skills.

Educators are, perhaps rightfully so, cautious about incorporating AI in their classrooms. With thoughtful implementation, however, AI image generators, with their ability to use any language, can provide powerful ways for students to engage with the target language and increase their proficiency.
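As a concrete illustration of the exercise described above, the sketch below sends a student-written prompt in the target language to an image model and returns a picture the class can compare with what the student meant to say. The choice of tool (OpenAI's image API) and the model name are assumptions for the sake of the example; the article does not prescribe a particular generator.

```python
# Hypothetical classroom helper: render a student's target-language description
# as an image so the student can see whether their words communicated the scene
# they intended. Tool and model are illustrative choices, not the article's.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY to be set in the environment

# Example prompt a Spanish learner might write.
student_prompt = "Un gato negro duerme sobre una mesa roja junto a una ventana abierta."

response = client.images.generate(
    model="dall-e-3",
    prompt=student_prompt,
    size="1024x1024",
    n=1,
)

print("Image to review with the student:", response.data[0].url)
```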


AI in the Classroom: A Teacher’s Toolkit for Transformation — from esheninger.blogspot.com by Eric Sheninger

While AI offers numerous benefits, it’s crucial to remember that it is a tool to empower educators, not replace them. The human connection between teacher and student remains central to fostering creativity, critical thinking, and social-emotional development. The role of teachers will shift towards becoming facilitators, curators, and mentors who guide students through personalized learning journeys. By harnessing the power of AI, educators can create dynamic and effective classrooms that cater to each student’s individual needs. This paves the way for a more engaging and enriching learning experience that empowers students to thrive.


Teachers Are Using AI to Create New Worlds, Help Students with Homework, and Teach English — from themarkup.org by Ross Teixeira; via Matthew Tower
Around the world, these seven teachers are making AI work for them and their students

In this article, seven teachers across the world share their insights on AI tools for educators. You will hear a host of varied opinions and perspectives on everything from whether AI could hasten the decline of learning foreign languages to whether AI-generated lesson plans are an infringement on teachers’ rights. A common theme emerged from those we spoke with: just as the internet changed education, AI tools are here to stay, and it is prudent for teachers to adapt.


Teachers Desperately Need AI Training. How Many Are Getting It? — from edweek.org by Lauraine Langreo

Even though it’s been more than a year since ChatGPT made a big splash in the K-12 world, many teachers say they are still not receiving any training on using artificial intelligence tools in the classroom.

More than 7 in 10 teachers said they haven’t received any professional development on using AI in the classroom, according to a nationally representative EdWeek Research Center survey of 953 educators, including 553 teachers, conducted between Jan. 31 and March 4.

From DSC:
This article mentioned the following resource:

Artificial Intelligence Explorations for Educators — from iste.org


 

The $340 Billion Corporate Learning Industry Is Poised For Disruption — from joshbersin.com by Josh Bersin

What if, for example, the corporate learning system knew who you were and you could simply ask it a question and it would generate an answer, a series of resources, and a dynamic set of learning objects for you to consume? In some cases you’ll take the answer and run. In other cases you’ll pore through the content. And in other cases you’ll browse through the course and take the time to learn what you need.

And suppose all this happened in a totally personalized way, so that you didn’t see a “standard course” but a course tailored to your existing level of knowledge?

This is what AI is going to bring us. And yes, it’s already happening today.

 

How to Make the Dream of Education Equity (or Most of It) a Reality — from nataliewexler.substack.com by Natalie Wexler
Studies on the effects of tutoring–by humans or computers–point to ways to improve regular classroom instruction.

One problem, of course, is that it’s prohibitively expensive to hire a tutor for every average or struggling student, or even one for every two or three of them. This was the two-sigma “problem” that Bloom alluded to in the title of his essay: how can the massive benefits of tutoring possibly be scaled up? Both Khan and Zuckerberg have argued that the answer is to have computers, maybe powered by artificial intelligence, serve as tutors instead of humans.

From DSC:
I’m hoping that AI-backed learning platforms WILL help many people of all ages and backgrounds. But I realize — and appreciate what Natalie is saying here as well — that human beings are needed in the learning process (especially at younger ages). 

But without the human element, that’s unlikely to be enough. Students are more likely to work hard to please a teacher than to please a computer.

Natalie goes on to talk about training all teachers in cognitive science — a solid idea for sure. That’s what I was trying to get at with this graphic:

We need to take more of the research from learning science and apply it in our learning spaces.

But I’m not as hopeful that all teachers will get trained in cognitive science…it should have happened (in the Schools of Education and in the K12 learning ecosystem at large) by now. Perhaps it will happen, given enough time.

And with more homeschooling and blended programs of education occurring, that idea gets stretched even further. 

K-12 Hybrid Schooling Is in High Demand — from realcleareducation.com by Keri D. Ingraham (emphasis below from DSC); via GSV

Parents are looking for a different kind of education for their children. A 2024 poll of parents reveals that 72% are considering, 63% are searching for, and 44% have selected a new K-12 school option for their children over the past few years. So, what type of education are they seeking?

Additional polling data reveals that 49% of parents would prefer their child learn from home at least one day a week. While 10% want full-time homeschooling, the remaining 39% of parents desire their child to learn at home one to four days a week, with the remaining days attending school on-campus. Another parent poll released this month indicates that an astonishing 64% of parents would enroll their child in a hybrid school if they were looking for a new school.

 

GTC March 2024 Keynote with NVIDIA CEO Jensen Huang






 


[Report] Generative AI Top 150: The World’s Most Used AI Tools (Feb 2024) — from flexos.work by Daan van Rossum
FlexOS.work surveyed Generative AI platforms to reveal which get used most. While ChatGPT reigns supreme, countless AI platforms are used by millions.

As the FlexOS research study “Generative AI at Work” concluded based on a survey amongst knowledge workers, ChatGPT reigns supreme.

2. AI Tool Usage is Way Higher Than People Expect – Beating Netflix, Pinterest, Twitch.
As measured by data analysis platform Similarweb based on global web traffic tracking, the AI tools in this list generate over 3 billion monthly visits.

With 1.67 billion visits, ChatGPT represents over half of this traffic and is already bigger than Netflix, Microsoft, Pinterest, Twitch, and The New York Times.



Artificial Intelligence Act: MEPs adopt landmark law — from europarl.europa.eu

  • Safeguards on general purpose artificial intelligence
  • Limits on the use of biometric identification systems by law enforcement
  • Bans on social scoring and AI used to manipulate or exploit user vulnerabilities
  • Right of consumers to launch complaints and receive meaningful explanations


The untargeted scraping of facial images from CCTV footage to create facial recognition databases will be banned.


A New Surge in Power Use Is Threatening U.S. Climate Goals — from nytimes.com by Brad Plumer and Nadja Popovich
A boom in data centers and factories is straining electric grids and propping up fossil fuels.

Something unusual is happening in America. Demand for electricity, which has stayed largely flat for two decades, has begun to surge.

Over the past year, electric utilities have nearly doubled their forecasts of how much additional power they’ll need by 2028 as they confront an unexpected explosion in the number of data centers, an abrupt resurgence in manufacturing driven by new federal laws, and millions of electric vehicles being plugged in.


OpenAI and the Fierce AI Industry Debate Over Open Source — from bloomberg.com by Rachel Metz

The tumult could seem like a distraction from the startup’s seemingly unending march toward AI advancement. But the tension, and the latest debate with Musk, illuminates a central question for OpenAI, along with the tech world at large as it’s increasingly consumed by artificial intelligence: Just how open should an AI company be?

The meaning of the word “open” in “OpenAI” seems to be a particular sticking point for both sides — something that you might think sounds, on the surface, pretty clear. But actual definitions are both complex and controversial.


Researchers develop AI-driven tool for near real-time cancer surveillance — from medicalxpress.com by Mark Alewine; via The Rundown AI
Artificial intelligence has delivered a major win for pathologists and researchers in the fight for improved cancer treatments and diagnoses.

In partnership with the National Cancer Institute, or NCI, researchers from the Department of Energy’s Oak Ridge National Laboratory and Louisiana State University developed a long-sequenced AI transformer capable of processing millions of pathology reports to provide experts researching cancer diagnoses and management with exponentially more accurate information on cancer reporting.


 

A Notre Dame Senior’s Perspective on AI in the Classroom — from learning.nd.edu — by Sarah Ochocki; via Derek Bruff on LinkedIn

At this moment, as a college student trying to navigate the messy, fast-developing, and varied world of generative AI, I feel more confused than ever. I think most of us can share that feeling. There’s no roadmap on how to use AI in education, and there aren’t the typical years of proof to show something works. However, this promising new tool is sitting in front of us, and we would be foolish to not use it or talk about it.

I’ve used it to help me understand sample code I was viewing, rather than mindlessly trying to copy what I was trying to learn from. I’ve also used it to help prepare for a debate, practicing making counterarguments to the points it came up with.

AI alone cannot teach something; there needs to be critical interaction with the responses we are given. However, this is something that is true of any form of education. I could sit in a lecture for hours a week, but if I don’t do the homework or critically engage with the material, I don’t expect to learn anything.


A Map of Generative AI for Education — from medium.com by Laurence Holt; via GSV
An update to our map of the current state-of-the-art


Last ones (for now):


Survey: K-12 Students Want More Guidance on Using AI — from govtech.com by Lauraine Langreo
Research from the nonprofit National 4-H Council found that most 9- to 17-year-olds have an idea of what AI is and what it can do, but most would like help from adults in learning how to use different AI tools.

“Preparing young people for the workforce of the future means ensuring that they have a solid understanding of these new technologies that are reshaping our world,” Jill Bramble, the president and CEO of the National 4-H Council, said in a press release.

AI School Guidance Document Toolkit, with Free Comprehensive Review — from stefanbauschard.substack.com by Stefan Bauschard and Dr. Sabba Quidwai

 

AI Is Becoming an Integral Part of the Instructional Design Process — from drphilippahardman.substack.com by Dr. Philippa Hardman
What I learned from 150 interviews with instructional designers

1. AI already plays a central part in the instructional design process
A whopping 95.3% of the instructional designers interviewed said they use AI in their day-to-day work. Those who don’t use AI cite access or permission issues as the primary reason they haven’t integrated it into their process.

2. AI is predominantly used at the design and development stages of the instructional design process 
When mapped to the ADDIE process, the breakdown of use cases goes as follows:

  • Analysis: 5.5% of use cases
  • Design: 32.1%
  • Development: 53.2%
  • Implementation: 1.8%
  • Evaluation: 7.3%

Speaking of AI in our learning ecosystems, also see:


Will AI Use in Schools Increase Next Year? 56 Percent of Educators Say Yes — from edweek.org by Alyson Klein

The majority of educators expect use of artificial intelligence tools will increase in their school or district over the next year, according to an EdWeek Research Center survey.

Applying Multimodal AI to L&D — from learningguild.com by Sarah Clark
We’re just starting to see Multimodal AI systems hit the spotlight. Unlike the text-based and image-generation AI tools we’ve seen before, multimodal systems can absorb and generate content in multiple formats – text, image, video, audio, etc.

 

 

Using Generative AI throughout the Institution — from aiedusimplified.substack.com by Lance Eaton
8 lightning talks on generative AI and how to use it throughout higher education


The magic of AI to help educators save time — from magicschool.ai; via Mrs. Kendall Sajdak


Getting Better Results out of Generative AI — from aiedusimplified.substack.com by Lance Eaton
The prompt to use before you prompt generative AI

Last month, I discussed a GPT that I had created to enhance prompts. Since then, I have been actively using my Prompt Enhancer GPT to get much more effective outputs. Last week, I did a series of mini-talks on generative AI in different parts of higher education (faculty development, human resources, grants, executive leadership, etc.) and structured them as “5 tips.” I included a final bonus tip in all of them, a tip that many attendees later told me was probably the most useful one, especially because you can only access the Prompt Enhancer GPT if you are paying for ChatGPT.
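Eaton's Prompt Enhancer GPT lives inside ChatGPT, so it cannot be shared as code, but the underlying pattern (ask a model to rewrite a rough prompt before you actually run it) can be sketched generically. Everything below, from the system prompt wording to the model name, is an assumption made for illustration rather than a copy of his GPT.

```python
# Generic sketch of the "prompt before you prompt" pattern: have the model
# improve a rough prompt first, then use the improved prompt for the real task.
from openai import OpenAI

client = OpenAI()

ENHANCER_INSTRUCTIONS = (
    "You are a prompt engineer. Rewrite the user's rough prompt so it specifies "
    "the role the AI should play, the audience, the desired format and length, "
    "and any constraints. Return only the improved prompt."
)

def enhance(rough_prompt: str, model: str = "gpt-4o-mini") -> str:
    resp = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": ENHANCER_INSTRUCTIONS},
            {"role": "user", "content": rough_prompt},
        ],
    )
    return resp.choices[0].message.content

improved = enhance("Write an email to faculty about our new AI policy.")
print(improved)  # feed `improved` back to the model as the actual prompt
```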


Exploring the Opportunities and Challenges with Generative AI — from er.educause.edu by Veronica Diaz

Effectively integrating generative AI into higher education requires policy development, cross-functional engagement, ethical principles, risk assessments, collaboration with other institutions, and an exploration of diverse use cases.


Creating Guidelines for the Use of Gen AI Across Campus — from campustechnology.com by Rhea Kelly
The University of Kentucky has taken a transdisciplinary approach to developing guidelines and recommendations around generative AI, incorporating input from stakeholders across all areas of the institution. Here, the director of UK’s Center for the Enhancement of Learning and Teaching breaks down the structure and thinking behind that process.

That resulted in a set of instructional guidelines that we released in August of 2023 and updated in December of 2023. We’re also looking at guidelines for researchers at UK, and we’re currently in the process of working with our colleagues in the healthcare enterprise, UK Healthcare, to comb through the additional complexities of this technology in clinical care and to offer guidance and recommendations around those issues.


From Mean Drafts to Keen Emails — from automatedteach.com by Graham Clay

My experiences match the results of the above studies. The second study cited above found that 83% of those students who haven’t used AI tools are “not interested in using them,” so it is no surprise that many students have little awareness of their nature. The third study cited above found that, “apart from 12% of students identifying as daily users,” most students’ use cases were “relatively unsophisticated,” like summarizing or paraphrasing text.

For those of us in the AI-curious bubble, we need to continually work to stay current, but we also need to recognize that what we take to be “common knowledge” is far from common outside of the bubble.


What do superintendents need to know about artificial intelligence? — from k12dive.com by Roger Riddell
District leaders shared strategies and advice on ethics, responsible use, and the technology’s limitations at the National Conference on Education.

Despite general familiarity, however, technical knowledge shouldn’t be assumed for district leaders or others in the school community. For instance, it’s critical that any materials related to AI not be written in “techy talk” so they can be clearly understood, said Ann McMullan, project director for the Consortium for School Networking’s EmpowerED Superintendents Initiative.

To that end, CoSN, a nonprofit that promotes technological innovation in K-12, has released an array of AI resources to help superintendents stay ahead of the curve, including a one-page explainer that details definitions and guidelines to keep in mind as schools work with the emerging technology.


 

Announcing the 2024 GSV 150: The Top Growth Companies in Digital Learning & Workforce Skills — from prnewswire.com with information provided by ASU+GSV Summit

“The world is adapting to seismic shifts from generative AI,” says Luben Pampoulov, Partner at GSV Ventures. “AI co-pilots, AI tutors, AI content generators—AI is ubiquitous, and differentiation is increasingly critical. This is an impressive group of EdTech companies that are leveraging AI and driving positive outcomes for learners and society.”

Workforce Learning comprises 34% of the list, K-12 29%, Higher Education 24%, Adult Consumer Learning 10%, and Early Childhood 3%. Additionally, 21% of the companies stretch across two or more “Pre-K to Gray” categories. A broader move towards profitability is also evident: the collective gross and EBITDA margin score of the 2024 cohort increased 5% compared to 2023.

See the list at https://www.asugsvsummit.com/gsv-150

Selected from 2,000+ companies around the world based on revenue scale, revenue growth, user reach, geographic diversification, and margin profile, this impressive group is reaching an estimated 3 billion people and generating an estimated $23 billion in revenue.

 
© 2024 | Daniel Christian