The AI Tools in Education Database — from aitoolsdirectory.notion.site; via George Siemens

Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent tools and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.


Another Workshop for Faculty and Staff — from aiedusimplified.substack.com by Lance Eaton
A recent workshop with some adjustments.

The day started out with a short talk about AI (slides). Some of it is my usual schtick where I do a bit of Q&A with folks around myths and misunderstandings of generative AI in order to establish some common ground. These are often useful both in setting the tone and giving folks a sense of how I come to explore generative AI: with a mixture of humor, concern, curiosity, and of course, cat pics.

From there, we launched into a series of mini-workshops where folks had time to first play around with some previously created prompts around teaching and learning before moving on to prompts for administrative work. The prompts and other support materials are in this Workshop Resource Document. The goal was to get them using one or more AI tools with some useful prompts so they could learn more about the tools’ capabilities.


The Edtech Insiders Rundown of ASU+GSV 2024 — from edtechinsiders.substack.com by Sarah Morin, Alex Sarlin, and Ben Kornell
And more on Edtech Insiders+, upcoming events, Gauth, AI Reading Tutors, The Artificial Intelligence Interdisciplinary Institute, and TeachAI Policy Resources

Alex Sarlin

4. Everyone is Edtech Now
This year, in addition to investors, entrepreneurs, educators, school leaders, university admins, non-profits, publishers, and operators from countless edtech startups and incumbents, there were some serious big tech companies in attendance, like Meta, Google, OpenAI, Microsoft, Amazon, TikTok, and Canva, along with a horde of management consultancies, workforce organizations, mental health orgs, and filmmakers.

Edtech continues to expand as an industry category and everyone is getting involved.


Ep 18 | Rethinking Education, Lessons to Unlearn, Become a Generalist, & More — Ana Lorena Fábrega — from mishadavinci.substack.com by Misha da Vinci

It was such a delight to chat with Ana. She’s brilliant and passionate, a talented educator, and an advocate for better ways of learning for children and adults. We cover ways to transform schools so that students get real-world skills, learn resilience and how to embrace challenges, and are prepared for an unpredictable future. And we go hard on why we must keep learning no matter our age, become generalists, and leverage technology in order to adapt to the fast-changing world.

Misha also featured an item re: the future of schooling and it contained this graphic:


Texas is replacing thousands of human exam graders with AI — from theverge.com by Jess Weatherbed

The Texas Tribune reports that an “automated scoring engine” that utilizes natural language processing — the technology that enables chatbots like OpenAI’s ChatGPT to understand and communicate with users — is being rolled out by the Texas Education Agency (TEA) to grade open-ended questions on the State of Texas Assessments of Academic Readiness (STAAR) exams. The agency expects the system to save $15–20 million per year by reducing the need for temporary human scorers, with plans to hire under 2,000 graders this year compared to the 6,000 required in 2023.
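The article doesn’t describe TEA’s engine in technical detail, but a common baseline for scoring open-ended responses with NLP is to compare a student’s answer against human-scored rubric exemplars using sentence embeddings. The sketch below is a minimal illustration of that general idea, assuming the open-source sentence-transformers library and hypothetical exemplar answers; it is not TEA’s actual system.

```python
# Minimal sketch of embedding-based short-answer scoring (illustrative only;
# not the TEA engine). Assumes the open-source sentence-transformers library.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedding model

# Hypothetical rubric exemplars, keyed by score point.
exemplars = {
    2: "The author uses the storm as a symbol of the character's inner conflict.",
    1: "The storm shows that something bad is happening in the story.",
    0: "The story has a storm in it.",
}

def score_response(student_answer: str) -> int:
    """Return the score point whose exemplar is most similar to the answer."""
    answer_vec = model.encode(student_answer, convert_to_tensor=True)
    best_score, best_sim = 0, float("-inf")
    for points, exemplar in exemplars.items():
        exemplar_vec = model.encode(exemplar, convert_to_tensor=True)
        sim = util.cos_sim(answer_vec, exemplar_vec).item()
        if sim > best_sim:
            best_score, best_sim = points, sim
    return best_score

print(score_response("The thunderstorm mirrors how conflicted the main character feels."))
```

Production scoring engines are typically trained on large sets of human-scored responses and route low-confidence answers back to human scorers, which fits the article’s note that TEA still plans to hire some human graders this year.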


Debating About AI: An Easy Path to AI Awareness and Basic Literacy — from stefanbauschard.substack.com by Stefan Bauschard
If you are an organization committed to AI literacy, consider sponsoring some debate topics and/or debates next year and expose thousands of students to AI literacy.

Resolved: Teachers should integrate generative AI in their teaching and learning.

The topic is simple but raises an issue that students can connect with.

While helping my students prepare and judging debates, I saw students demonstrate an understanding of many key issues and controversies.

These included—

• AI writing assessment/grading
• Bias
• Bullying
• Cognitive load
• Costs of AI systems
• Declining test scores
• Deep fakes
• Differentiation
• Energy consumption
• Hallucinations
• Human-to-human connection
• Inequality and inequity in access
• Neurodiversity
• Personalized learning
• Privacy
• Regulation (lack thereof)
• The future of work and unemployment
• Saving teachers time
• Soft skills
• Standardized testing
• Teacher awareness and AI training; training resource trade-offs
• Teacher crowd-out
• Transparency and explainability
• Writing detectors (students had an exaggerated sense of the workability of these tools).

 

Beyond the Hype: Taking a 50 Year Lens to the Impact of AI on Learning — from nafez.substack.com by Nafez Dakkak and Chris Dede
How do we make sure LLMs are not “digital duct tape”?

[Per Chris Dede] We often think of the product of teaching as the outcome (e.g. an essay, a drawing, etc.). The essence of education, in my view, lies not in the products or outcomes of learning but in the journey itself. The artifact is just a symbol that you’ve taken the journey.

The process of learning — the exploration, challenges, and personal growth that occur along the way — is where the real value lies. For instance, the act of writing an essay is valuable not merely for the final product but for the intellectual journey it represents. It forces you to improve and organize your thinking on a subject.

This distinction becomes important with the rise of generative AI, because it uniquely allows us to produce these artifacts without taking the journey.

As I’ve argued previously, I am worried that all this hype around LLMs renders them a “type of digital duct-tape to hold together an obsolete industrial-era educational system”. 


Speaking of AI in our learning ecosystems, also see:


On Building an AI Policy for Teaching & Learning — by Lance Eaton
How students drove the development of a policy for students and faculty

Well, last month the policy was finally approved by our Faculty Curriculum Committee, and we can now share the final version: AI Usage Policy. College Unbound also created (all-human, no AI used) a press release with the policy and some of the details.

To ensure you see this:

  • Usage Guidelines for AI Generative Tools at College Unbound
    These guidelines were created and reviewed by College Unbound students in Spring 2023 with the support of Lance Eaton, Director of Faculty Development & Innovation.  The students include S. Fast, K. Linder-Bey, Veronica Machado, Erica Maddox, Suleima L., Lora Roy.

ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds — from psypost.org by Eric W. Dolan

A recent study has found that scientific citations generated by ChatGPT often do not correspond to real academic work. The study, published in the Canadian Psychological Association’s Mind Pad, found that “false citation rates” across various psychology subfields ranged from 6% to 60%. Surprisingly, these fabricated citations feature elements such as legitimate researchers’ names and properly formatted digital object identifiers (DOIs), which could easily mislead both students and researchers.

MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate — such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals.
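One practical takeaway is that fabricated references can often be caught mechanically: a citation’s DOI may be properly formatted yet resolve to nothing, or to a different paper than the one claimed. Below is a minimal sketch of that check against the public Crossref REST API; the DOI and title in the example are placeholders, not citations from the study.

```python
# Quick sanity check for a suspect citation: does the DOI resolve in Crossref,
# and does the registered title match what the chatbot claimed?
import requests

def check_doi(doi: str, claimed_title: str) -> str:
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    if resp.status_code != 200:
        return "DOI does not resolve; likely fabricated"
    record = resp.json()["message"]
    registered_title = (record.get("title") or ["<no title on record>"])[0]
    if claimed_title.lower() not in registered_title.lower():
        return f"DOI exists but points to a different work: {registered_title!r}"
    return "DOI resolves and the title matches"

# Hypothetical example, not taken from the study:
print(check_doi("10.1000/fake-doi-12345", "Imaginary effects of chatbots on memory"))
```

A dead DOI or a title mismatch doesn’t prove fabrication on its own (Crossref doesn’t index every journal), but it is a fast first filter before searching for the source manually.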

 

Forbes 2024 AI 50 List: Top Artificial Intelligence Startups  — from forbes.com by Kenrick Cai

The artificial intelligence sector has never been more competitive. Forbes received some 1,900 submissions this year, more than double last year’s count. Applicants do not pay a fee to be considered and are judged for their business promise and technical usage of AI through a quantitative algorithm and qualitative judging panels. Companies are encouraged to share data on diversity, and our list aims to promote a more equitable startup ecosystem. But disparities remain sharp in the industry. Only 12 companies have women cofounders, five of whom serve as CEO, the same count as last year. For more, see our full package of coverage, including a detailed explanation of the list methodology, videos and analyses on trends in AI.


Adobe Previews Breakthrough AI Innovations to Advance Professional Video Workflows Within Adobe Premiere Pro — from news.adobe.com

  • New Generative AI video tools coming to Premiere Pro this year will streamline workflows and unlock new creative possibilities, from extending a shot to adding or removing objects in a scene
  • Adobe is developing a video model for Firefly, which will power video and audio editing workflows in Premiere Pro and enable anyone to create and ideate
  • Adobe previews early explorations of bringing third-party generative AI models from OpenAI, Pika Labs and Runway directly into Premiere Pro, making it easy for customers to draw on the strengths of different models within the powerful workflows they use every day
  • AI-powered audio workflows in Premiere Pro are now generally available, making audio editing faster, easier and more intuitive

Also relevant see:




 

AI RESOURCES AND TEACHING (Kent State University) — from aiadvisoryboards.wordpress.com

Kent State University’s AI Resources and Teaching page offers valuable resources for educators interested in incorporating artificial intelligence (AI) into their teaching practices. The university recognizes that the rapid emergence of AI tools presents both challenges and opportunities in higher education.

The AI Resources and Teaching page provides educators with information and guidance on various AI tools and their responsible use within and beyond the classroom. The page covers different areas of AI application, including language generation, visuals, videos, music, information extraction, quantitative analysis, and AI syllabus language examples.


A Cautionary AI Tale: Why IBM’s Dazzling Watson Supercomputer Made a Lousy Tutor — from the74million.org by Greg Toppo
With a new race underway to create the next teaching chatbot, IBM’s abandoned 5-year, $100M ed push offers lessons about AI’s promise and its limits.

For all its jaw-dropping power, Watson the computer overlord was a weak teacher. It couldn’t engage or motivate kids, inspire them to reach new heights or even keep them focused on the material — all qualities of the best mentors.

It’s a finding with some resonance to our current moment of AI-inspired doomscrolling about the future of humanity in a world of ascendant machines. “There are some things AI is actually very good for,” Nitta said, “but it’s not great as a replacement for humans.”

His five-year journey to essentially a dead-end could also prove instructive as ChatGPT and other programs like it fuel a renewed, multimillion-dollar experiment to, in essence, prove him wrong.

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

From DSC:
This is why the vision that I’ve been tracking and working on has always said that HUMAN BEINGS will be necessary — they are key to realizing this vision. Along these lines, here’s a relevant quote:

Another crucial component of a new learning theory for the age of AI would be the cultivation of “blended intelligence.” This concept recognizes that the future of learning and work will involve the seamless integration of human and machine capabilities, and that learners must develop the skills and strategies needed to effectively collaborate with AI systems. Rather than viewing AI as a threat to human intelligence, a blended intelligence approach seeks to harness the complementary strengths of humans and machines, creating a symbiotic relationship that enhances the potential of both.

Per Alexander “Sasha” Sidorkin, Head of the National Institute on AI in Society at California State University Sacramento.

 

AWS, Educause partner on generative AI readiness tool — from edscoop.com by Skylar Rispens
Amazon Web Services and the nonprofit Educause announced a new tool designed to help higher education institutions gauge their readiness to adopt generative artificial intelligence.

Amazon Web Services and the nonprofit Educause on Monday announced they’ve teamed up to develop a tool that assesses how ready higher education institutions are to adopt generative artificial intelligence.

Through a series of curated questions about institutional strategy, governance, capacity and expertise, AWS and Educause claim their assessment can point to ways that operations can be improved before generative AI is adopted to support students and staff.

“Generative AI will transform how educators engage students inside and outside the classroom, with personalized education and accessible experiences that provide increased student support and drive better learning outcomes,” Kim Majerus, vice president of global education and U.S. state and local government at AWS, said in a press release. “This assessment is a practical tool to help colleges and universities prepare their institutions to maximize this technology and support students throughout their higher ed journey.”


Speaking of AI and our learning ecosystems, also see:

Gen Z Wants AI Skills And Businesses Want Workers Who Can Apply AI: Higher Education Can Help — from forbes.com by Bruce Dahlgren

At a moment when the value of higher education has come under increasing scrutiny, institutions around the world can be exactly what learners and employers both need. To meet the needs of a rapidly changing job market and equip learners with the technical and ethical direction needed to thrive, institutions should familiarize students with the use of AI and nurture the innately human skills needed to apply it ethically. Failing to do so can create enormous risk for higher education, business and society.

What is AI literacy?
To effectively utilize generative AI, learners will need to grasp the appropriate use cases for these tools, understand when their use presents significant downside risk, and learn to recognize abuse to separate fact from fiction. AI literacy is a deeply human capacity. The critical thinking and communication skills required are muscles that need repeated training to be developed and maintained.

 

Colin Levy Discusses His New Book The Legal Tech Ecosystem & the Skills Needed to Succeed in Legal Tech — from tlpodcast.com by Chad Main

In the latest episode, legal tech guru and Head of Legal at contract lifecycle management company Malbek, Colin Levy, discusses his journey into legal tech and insights from his new book “The Legal Tech Ecosystem”. His book is a plainly written look into the legal tech field, emphasizing practical tools over AI hype and underscoring the importance of adaptability, risk-taking, and continuous learning in this evolving industry.

Also see:


Virtual Legal Advising: Mastering Business and Property Matters Online — from ventsmagazine.com by Abdus Subhan

Digital transformation has dominated every industry, and the legal industry has not been left behind. Virtual law, or providing legal services through online platforms, has emerged as a vital resource for individuals and businesses alike. This article explores the idea of online professional legal advice, focusing on business and property matters. It serves as a thorough guide to navigating legal issues in these domains with the aid of virtual law.


 

 

Assessment of Student Learning Is Broken — from insidehighered.com by Zach Justus and Nik Janos
And generative AI is the thing that broke it, Zach Justus and Nik Janos write.

Generative artificial intelligence (AI) has broken higher education assessment. This has implications from the classroom to institutional accreditation. We are advocating for a one-year pause on assessment requirements from institutions and accreditation bodies.

Implications and Options
The data we are collecting right now are literally worthless. These same trends implicate all data gathered from December 2022 through the present. So, for instance, if you are conducting a five-year program review for institutional accreditation you should separate the data from before the fall 2022 term and evaluate it independently. Whether you are evaluating writing, STEM outputs, coding, or anything else, you are now looking at some combination of student/AI work. This will get even more confounding as AI tools become more powerful and are integrated into our existing production platforms like Microsoft Office and Google Workspace.

The burden of adapting to artificial intelligence has fallen to faculty, but we are not positioned or equipped to lead these conversations across stakeholder groups.


7 TIPS TO AUTOMATE YOUR CLASSROOM WITH AI — from classtechtips.com by Dr. Monica Burns

 

 

Dr Abigail Rekas, Lawyer & Lecturer at the School of Law, University of Galway

Abigail is a lecturer on two of the Law micro-credentials at University of Galway – Lawyering Technology & Innovation and Law & Analytics. Micro-credentials are short, flexible courses designed to fit around your busy life! They are designed in collaboration with industry to meet specific skills needs and are accredited by leading Irish universities.

Visit: universityofgalway.ie/courses/micro-credentials/


The Implications of Generative AI: From the Delivery of Legal Services to the Delivery of Justice — from iaals.du.edu

The potential for AI’s impact is broad, as it has the ability to impact every aspect of human life, from home to work. It will impact our relationships to everything and everyone in our world. The implications for generative AI on the legal system, from how we deliver legal services to how we deliver justice, will be just as far reaching.

[N]ow we face the latest technological frontier: artificial intelligence (AI).… Law professors report with both awe and angst that AI apparently can earn Bs on law school assignments and even pass the bar exam. Legal research may soon be unimaginable without it. AI obviously has great potential to dramatically increase access to key information for lawyers and non-lawyers alike. But just as obviously it risks invading privacy interests and dehumanizing the law.

When you can no longer sell the time it takes to achieve a client’s outcome, then you must sell the outcome itself and the client’s experience of getting there. That completely changes the dynamics of what law firms are all about.


Preparing the Next Generation of Tech-Ready Lawyers — from news.gsu.edu
Legal Analytics and Innovation Initiative Gives Students a Competitive Advantage

Georgia State University College of Law faculty understand this need and designed the Legal Analytics & Innovation Initiative (LAII) to equip students with the competitive skills desired by law firms and other companies that align with the emerging technological environment.

“As faculty, we realized we need to be forward-thinking about incorporating technology into our curriculum. Students must understand new areas of law that arise from or are significantly altered by technological advances, like cybersecurity, privacy and AI. They also must understand how these advances change the practice of law,” said Kris Niedringhaus, associate dean for Law Library, Information Services, Legal Technology & Innovation.


The Imperative Of Identifying Use Cases In Legal Tech: A Guiding Light For Innovation In The Age Of AI — from abovethelaw.com by Olga V. Mack
In the quest to integrate AI and legal technology into legal practice, use cases are not just important but indispensable.

As the legal profession continues to navigate the waters of digital transformation, the importance of use cases stands as a beacon guiding the journey. They are the litmus test for the practical value of technology, ensuring that innovations not only dazzle with potential but also deliver tangible benefits. In the quest to integrate AI and legal technology into legal practice, use cases are not just important but indispensable.

The future of legal tech is not about technology for technology’s sake. It’s about thoughtful, purpose-driven innovation that enhances the practice of law, improves client outcomes, and upholds the principles of justice. Use cases are the roadmap for this future, charting a course for technology that is meaningful, impactful, and aligned with the noble pursuit of law.

 

The New Academic Arms Race | Competition over amenities is over. The next battleground is technology. — from chronicle.com by Jeffrey J. Selingo

Now, after the pandemic, with the value of the bachelor’s degree foremost in the minds of students and families, a new academic arms race is emerging. This one is centered around academic innovation. The winners will be those institutions that in the decade ahead better apply technology in teaching and learning and develop different approaches to credentialing.

Sure, technology is often seen as plumbing on campuses — as long as it works, we don’t worry about it. And rarely do prospective students on a tour ever ask about academic innovations like extended reality or microcredentials. Campus tours prefer to show off the bells and whistles of residential life within dorms and dining halls.

That’s too bad.

The problem is not a lack of learners, but rather a lack of alignment in what colleges offer to a generation of learners surrounded by Amazon, Netflix, and Instagram, where they can stream entertainment and music anytime, anywhere.

From DSC:
When I worked for Calvin (then College, now University) from 2007 to 2017, that’s exactly how technologies and the entire IT Department were viewed — as infrastructure providers. We were not viewed as being able to enhance the core business/offerings of the institution. We weren’t relevant in that area. In fact, the IT Department was shoved down in the basement of the library. Our Teaching & Learning Digital Studio was sidelined in a part of the library where few students went. The Digital Studio’s marketing efforts didn’t help much, as faculty members didn’t offer assignments that called for multimedia-based deliverables. It was a very tough and steep hill to climb.

Also the Presidents and Provosts over the last couple of decades (not currently though) didn’t think much of online-based learning, and the top administrators dissed the Internet’s ability to provide 24/7 worldwide conversations and learning. They missed the biggest thing to come along in education in 500 years (since the invention of the printing press). Our Teaching & Learning Group provided leadership by starting a Calvin Online pilot. We had 13-14 courses built and inquiries from Christian-based high schools were coming in for dual enrollment scenarios, but when it came time for the College to make a decision, it never happened. The topic/vote never made it to the floor of the Faculty Senate. The faculty and administration missed an enormous opportunity.

When Calvin College became Calvin University in 2019, they were forced to offer online-based classes. Had they supported our T&L Group’s efforts back in the early to mid-2010s, they would have dovetailed very nicely into offering more courses to working adults. They would have built up the internal expertise to offer these courses/programs. But the culture of the college put a stop to online-based learning at that time. They now regret that decision, I’m sure (as they’ve had to outsource many things, and they now offer numerous online-based courses and even entire programs — at a high cost most likely).

My how times have changed.


For another item re: higher education at the 30,000-foot level, see:


Lifelong Learning Models for a Changing Higher Ed Marketplace — from changinghighered.com by Dr. Drumm McNaughton and Amrit Ahluwalia
Exploring the transformation of higher education into lifelong learning hubs for workforce development, with innovative models and continuing education’s role.

Higher education is undergoing transformational change to redefine its role as a facilitator of lifelong learning and workforce development. In this 200th episode of Changing Higher Ed, host Dr. Drumm McNaughton and guest Amrit Ahluwalia, incoming Executive Director for Continuing Studies at Western University, explore innovative models positioning universities as sustainable hubs for socioeconomic mobility.

The Consumer-Driven Educational Landscape
Over 60% of today’s jobs will be redefined by 2025, driving demand for continuous upskilling and reskilling to meet evolving workforce needs. However, higher education’s traditional model of imparting specific knowledge through multi-year degrees is hugely misaligned with this reality.

Soaring education costs have fueled a consumer mindset shift, with learners demanding a clear return on investment directly aligned with their career goals. The expectation is to see immediate skills application and professional impact from their educational investments, not just long-term outcomes years after completion.


 

Chatbot prompts to help support student creativity — from classtechtips.com by Dr. Monica Burns

Do you use chatbots as a thought partner and collaborator? This year, I’ve spent time with educators all across the country to lead workshops on using Generative AI for instructional planning. If you want to use AI to support student creativity in your classroom, I have a handful of prompts that are ready for you to try.

Today on the blog, we’ll look at prompts to help support student creativity. You can tailor them to your instructional environment.


Speaking of prompting, also see:


 

 

How Generative AI Owns Higher Education. Now What? — from forbes.com by Steve Andriole

Excerpt (emphasis DSC):

What about course videos? Professors can create them (by lecturing into a camera for several hours, hopefully in different clothes) from the readings, from their interpretations of the readings, from their own case experiences – from anything they like. But now professors can direct the creation of the videos by talking – actually describing – to a CustomGPT about what they’d like the video to communicate with their or another image. Wait. What? They can make a video by talking to a CustomGPT and even select the image they want the “actor” to use? Yes. They can also add a British accent and insert some (GenAI-developed) jokes into the videos if they like. All this and much more is now possible. This means that a professor can specify how long the video should be, what sources should be consulted, and describe the demeanor the professor wants the video to project.

From DSC:
Though I wasn’t crazy about the clickbait type of title here, I still thought that the article was solid and thought-provoking. It contained several good ideas for using AI.


Excerpt from a recent EdSurge Higher Ed newsletter:


There are darker metaphors though — ones that focus on the hazards for humanity of the tech. Some professors worry that AI bots are simply replacing hired essay-writers for many students, doing work for a student that they can then pass off as their own (and doing it for free).

From DSC:
Hmmm…the use of essay writers was around long before AI became mainstream within higher education. So we already had a serious problem where students didn’t see the why in what they were being asked to do. Some students still aren’t sold on the why of the work in the first place. The situation seems to involve ethics, yes, but it also seems to say that we haven’t sold students on the benefits of putting in the work. Students seem to be saying I don’t care about this stuff…I just need the degree so I can exit stage left.

My main point: The issue didn’t start with AI…it started long before that.

And somewhat relevant here, also see:

I Have Bigger Fish to Fry: Why K12 Education is Not Thinking About AI — from medium.com by Maurie Beasley, M.Ed. (Edited by Jim Beasley)

This financial stagnation is occurring as we face a multitude of escalating challenges. These challenges include, but are in no way limited to, chronic absenteeism, widespread student mental health issues, critical staff shortages, rampant classroom behavior issues, a palpable sense of apathy for education in students, and even, I dare say, hatred towards education among parents and policymakers.

Our current focus is on keeping our heads above water, ensuring our students’ safety and mental well-being, and simply keeping our schools staffed and our doors open.


Meet Ed: Ed is an educational friend designed to help students reach their limitless potential. — from lausd.org (Los Angeles Unified School District, the second largest in the U.S.)

What is Ed?
An easy-to-understand learning platform designed by Los Angeles Unified to increase student achievement. It offers personalized guidance and resources to students and families 24/7 in over 100 languages.


Also relevant/see:

  • Los Angeles Unified Bets Big on ‘Ed,’ an AI Tool for Students — by Lauraine Langreo
    The Los Angeles Unified School District has launched an AI-powered learning tool that will serve as a “personal assistant” to students and their parents. The tool, named “Ed,” can provide students from the nation’s second-largest district information about their grades, attendance, upcoming tests, and suggested resources to help them improve their academic skills on their own time, Superintendent Alberto Carvalho announced March 20. Students can also use the app to find social-emotional-learning resources, see what’s for lunch, and determine when their bus will arrive.

Could OpenAI’s Sora be a big deal for elementary school kids? — from futureofbeinghuman.com by Andrew Maynard
Despite all the challenges it comes with, AI-generated video could unleash the creativity of young children and provide insights into their inner worlds – if it’s developed and used responsibly

Like many others, I’m concerned about the challenges that come with hyper-realistic AI-generated video. From deep fakes and disinformation to blurring the lines between fact and fiction, generative AI video is calling into question what we can trust, and what we cannot.

And yet despite all the issues the technology is raising, it also holds quite incredible potential, including as a learning and development tool — as long as we develop and use it responsibly.

I was reminded of this a few days back while watching the latest videos from OpenAI created by their AI video engine Sora — including the one below generated from the prompt “an elephant made of leaves running in the jungle”

What struck me while watching this — perhaps more than any of the other videos OpenAI has been posting on its TikTok channel — is the potential Sora has for translating the incredibly creative but often hard-to-articulate ideas someone may have in their head into something others can experience.


Can AI Aid the Early Education Workforce? — from edsurge.com by Emily Tate Sullivan
During a panel at SXSW EDU 2024, early education leaders discussed the potential of AI to support and empower the adults who help our nation’s youngest children.

While the vast majority of the conversations about AI in education have centered on K-12 and higher education, few have considered the potential of this innovation in early care and education settings.

At the conference, a panel of early education leaders gathered to do just that, in a session exploring the potential of AI to support and empower the adults who help our nation’s youngest children, titled, “ChatECE: How AI Could Aid the Early Educator Workforce.”

Hau shared that K-12 educators are using the technology to improve efficiency in a number of ways, including to draft individualized education programs (IEPs), create templates for communicating with parents and administrators, and in some cases, to support building lesson plans.


From EIEIO…Seasons Of Change

Again, we’ve never seen change happen as fast as it’s happening.


Enhancing World Language Instruction With AI Image Generators — from edutopia.org by Rachel Paparone
By crafting an AI prompt in the target language to create an image, students can get immediate feedback on their communication skills.

Educators are, perhaps rightfully so, cautious about incorporating AI in their classrooms. With thoughtful implementation, however, AI image generators, with their ability to use any language, can provide powerful ways for students to engage with the target language and increase their proficiency.
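As a concrete illustration of the workflow Paparone describes, a student’s target-language sentence can be sent directly to an image model, and the resulting picture becomes the feedback. The sketch below shows one possible setup using OpenAI’s Python SDK and DALL·E 3; the prompt, model choice, and feedback step are illustrative assumptions, not details from the article.

```python
# Sketch: generate an image from a student's French prompt so the picture itself
# shows whether the sentence communicated what the student intended.
# Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

student_prompt_fr = "Un chat gris qui lit un livre sous un grand arbre en automne"

result = client.images.generate(
    model="dall-e-3",          # one possible image model; any image generator would work
    prompt=student_prompt_fr,  # the target-language sentence is the prompt itself
    size="1024x1024",
    n=1,
)

print("Image URL:", result.data[0].url)
# If the image shows a dog, or no book, the student gets immediate, concrete
# feedback that the vocabulary or word order didn't say what they meant.
```

The design point is that the model never needs to grade anything; the mismatch between the intended scene and the generated one is the feedback.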


AI in the Classroom: A Teacher’s Toolkit for Transformation — from esheninger.blogspot.com by Eric Sheninger

While AI offers numerous benefits, it’s crucial to remember that it is a tool to empower educators, not replace them. The human connection between teacher and student remains central to fostering creativity, critical thinking, and social-emotional development. The role of teachers will shift towards becoming facilitators, curators, and mentors who guide students through personalized learning journeys. By harnessing the power of AI, educators can create dynamic and effective classrooms that cater to each student’s individual needs. This paves the way for a more engaging and enriching learning experience that empowers students to thrive.


Teachers Are Using AI to Create New Worlds, Help Students with Homework, and Teach English — from themarkup.org by Ross Teixeira; via Matthew Tower
Around the world, these seven teachers are making AI work for them and their students

In this article, seven teachers across the world share their insights on AI tools for educators. You will hear a host of varied opinions and perspectives on everything from whether AI could hasten the decline of learning foreign languages to whether AI-generated lesson plans are an infringement on teachers’ rights. A common theme emerged from those we spoke with: just as the internet changed education, AI tools are here to stay, and it is prudent for teachers to adapt.


Teachers Desperately Need AI Training. How Many Are Getting It? — from edweek.org by Lauraine Langreo

Even though it’s been more than a year since ChatGPT made a big splash in the K-12 world, many teachers say they are still not receiving any training on using artificial intelligence tools in the classroom.

More than 7 in 10 teachers said they haven’t received any professional development on using AI in the classroom, according to a nationally representative EdWeek Research Center survey of 953 educators, including 553 teachers, conducted between Jan. 31 and March 4.

From DSC:
This article mentioned the following resource:

Artificial Intelligence Explorations for Educators — from iste.org


 

How to Make the Dream of Education Equity (or Most of It) a Reality — from nataliewexler.substack.com by Natalie Wexler
Studies on the effects of tutoring–by humans or computers–point to ways to improve regular classroom instruction.

One problem, of course, is that it’s prohibitively expensive to hire a tutor for every average or struggling student, or even one for every two or three of them. This was the two-sigma “problem” that Bloom alluded to in the title of his essay: how can the massive benefits of tutoring possibly be scaled up? Both Khan and Zuckerberg have argued that the answer is to have computers, maybe powered by artificial intelligence, serve as tutors instead of humans.

From DSC:
I’m hoping that AI-backed learning platforms WILL help many people of all ages and backgrounds. But I realize — and appreciate what Natalie is saying here as well — that human beings are needed in the learning process (especially at younger ages). 

But without the human element, that’s unlikely to be enough. Students are more likely to work hard to please a teacher than to please a computer.

Natalie goes on to talk about training all teachers in cognitive science — a solid idea for sure. That’s what I was trying to get at with this graphic:

We need to take more of the research from learning science and apply it in our learning spaces.

But I’m not as hopeful about all teachers getting trained in cognitive science…as it should have happened (in the Schools of Education and in the K12 learning ecosystem at large) by now. Perhaps it will happen, given enough time.

And with more homeschooling and blended programs of education occurring, that idea gets stretched even further. 

K-12 Hybrid Schooling Is in High Demand — from realcleareducation.com by Keri D. Ingraham (emphasis below from DSC); via GSV

Parents are looking for a different kind of education for their children. A 2024 poll of parents reveals that 72% are considering, 63% are searching for, and 44% have selected a new K-12 school option for their children over the past few years. So, what type of education are they seeking?

Additional polling data reveals that 49% of parents would prefer their child learn from home at least one day a week. While 10% want full-time homeschooling, the remaining 39% of parents desire their child to learn at home one to four days a week, with the remaining days attending school on-campus. Another parent poll released this month found that an astonishing 64% of parents indicated that if they were looking for a new school for their child, they would enroll him or her in a hybrid school.

 

Which AI should I use? Superpowers and the State of Play — by Ethan Mollick
And then there were three

For over a year, GPT-4 was the dominant AI model, clearly much smarter than any of the other LLM systems available. That situation has changed in the last month; there are now three GPT-4 class models, all powering their own chatbots: GPT-4 (accessible through ChatGPT Plus or Microsoft’s CoPilot), Anthropic’s Claude 3 Opus, and Google’s Gemini Advanced.

Where we stand
We are in a brief period in the AI era where there are now multiple leading models, but none has yet definitively beaten the GPT-4 benchmark set over a year ago. While this may represent a plateau in AI abilities, I believe this is likely to change in the coming months as, at some point, models like GPT-5 and Gemini 2.0 will be released. In the meantime, you should be using a GPT-4 class model and using it often enough to learn what it does well. You can’t go wrong with any of them, pick a favorite and use it…

From DSC:
Here’s a powerful quote from Ethan:

In fact, in my new book I postulate that you haven’t really experienced AI until you have had three sleepless nights of existential anxiety, after which you can start to be productive again.


Using AI for Immersive Educational Experiences — from automatedteach.com by Graham Clay
Realistic video brings course content to life but requires AI literacy.

For us, I think the biggest promise of AI tools like Sora — that can create video with ease — is that they lower the cost of immersive educational experiences. This increases the availability of these experiences, expanding their reach to student populations who wouldn’t otherwise have them, whether due to time, distance, or expense.

Consider the profound impact on a history class, where students are transported to California during the gold rush through hyperrealistic video sequences. This vivifies the historical content and cultivates a deeper connection with the material.

In fact, OpenAI has already demonstrated the promise of this sort of use case, with a very simple prompt producing impressive results…


The Empathy Illusion: How AI Agents Could Manipulate Students — from marcwatkins.substack.com by Marc Watkins

Take this scenario. A student misses a class and, within twenty minutes, receives a series of texts and even a voicemail from a very concerned and empathic-sounding voice wanting to know what’s going on. Of course, the text is entirely generated, and the voice is synthetic as well, but the student likely doesn’t know this. To them, communication isn’t something as easy to miss or brush off as an email. It sounds like someone who cares is talking to them.

But let’s say that isn’t enough. By that evening, the student still hadn’t logged into their email or checked the LMS. The AI’s strategic reasoning is communicating with the predictive AI and analyzing the pattern of behavior against students who succeed or fail vs. students who are ill. The AI tracks the student’s movements on campus, monitors their social media usage, and deduces the student isn’t ill and is blowing off class.

The AI agent resumes communication with the student. But this time, the strategic AI adopts a different persona, not the kind and empathetic persona used for the initial contact, but a stern, matter-of-fact one. The student’s phone buzzes with alerts that talk about scholarships being lost, teachers being notified, etc. The AI anticipates the excuses the student will use and presents evidence tracking the student’s behavior to show they are not sick.


Not so much focused on learning ecosystems, but still worth mentioning:

The top 100 Gen AI Consumer Apps — from a16z.com / andreessen horowitz by Olivia Moore


 

 

GTC March 2024 Keynote with NVIDIA CEO Jensen Huang


Also relevant/see:




 

The 2024 Lawdragon 100 Leading AI & Legal Tech Advisors — from lawdragon.com by Katrina Dewey

These librarians, entrepreneurs, lawyers and technologists built the world where artificial intelligence threatens to upend life and law as we know it – and are now at the forefront of the battles raging within.

To create this first-of-its-kind guide, we cast a wide net with dozens of leaders in this area, took submissions, and consulted with some of the most esteemed gurus in legal tech. We also researched the cases most likely to have the biggest impact on AI, unearthing the dozen or so top trial lawyers tapped to lead the battles. Many of them bring copyright or IP backgrounds, and more than a few are Bay Area based. Those denoted with an asterisk are members of our Hall of Fame.


Free Legal Research Startup descrybe.ai Now Has AI Summaries of All State Supreme and Appellate Opinions — from lawnext.com by Bob Ambrogi

descrybe.ai, a year-old legal research startup focused on using artificial intelligence to provide free and easy access to court opinions, has completed its goal of creating AI-generated summaries of all available state supreme and appellate court opinions from throughout the United States.

descrybe.ai describes its mission as democratizing access to legal information and leveling the playing field in legal research, particularly for smaller-firm lawyers, journalists, and members of the public.


 
© 2024 | Daniel Christian