What: We’re taking the first steps in Bard’s ability to understand YouTube videos. For example, if you’re looking for videos on how to make olive oil cake, you can now also ask how many eggs the recipe in the first video requires.
Why: We’ve heard you want deeper engagement with YouTube videos. So we’re expanding the YouTube Extension to understand some video content so you can have a richer conversation with Bard about it.
I am not sure who said it first, but there are only two ways to react to exponential change: too early or too late. Today’s AIs are flawed and limited in many ways. While that restricts what AI can do, the capabilities of AI are increasing exponentially, both in terms of the models themselves and the tools these models can use. It might seem too early to consider changing an organization to accommodate AI, but I think that there is a strong possibility that it will quickly become too late.
From DSC: Readers of this blog have seen the following graphic for several years now, and there is no question that we are in a time of exponential change. Arguing otherwise has only grown harder over that time.
Nvidia’s results surpassed analysts’ projections for revenue and income in the fiscal fourth quarter.
Demand for Nvidia’s graphics processing units has been exceeding supply, thanks to the rise of generative artificial intelligence.
Nvidia announced the GH200 GPU during the quarter.
Here’s how the company did, compared to the consensus among analysts surveyed by LSEG, formerly known as Refinitiv:
Earnings: $4.02 per share, adjusted, vs. $3.37 per share expected
Revenue: $18.12 billion, vs. $16.18 billion expected
Nvidia’s revenue grew 206% year over year during the quarter ending Oct. 29, according to a statement. Net income, at $9.24 billion, or $3.71 per share, was up from $680 million, or 27 cents per share, in the same quarter a year ago.
DC: Anyone surprised? This is why the U.S. doesn’t want high-powered chips going to China. History repeats itself…again. The ways of the world/power continue on.
Pentagon’s AI initiatives accelerate hard decisions on lethal autonomous weapons https://t.co/PTDmJugiE2
From DSC: As I’ve long stated on the Learning from the Living [Class]Room vision, we are heading toward a new AI-empowered learning platform — where humans play a critically important role in making this new learning ecosystem work.
Along these lines, I ran into this site out on X/Twitter. We’ll see how this unfolds, but it will be an interesting space to watch.
From DSC: This future learning platform will also focus on developing skills and competencies. Along those lines, see:
Scale for Skills-First — from the-job.beehiiv.com by Paul Fain
An ed-tech giant’s ambitious moves into digital credentialing and learner records.
A Digital Canvas for Skills
Instructure was a player in the skills and credentials space before its recent acquisition of Parchment, a digital transcript company. But that $800M move made many observers wonder if Instructure can develop digital records of skills that learners, colleges, and employers might actually use broadly.
…
Ultimately, he says, the CLR approach will allow students to bring these various learning types into a coherent format for employers.
Instructure seeks a leadership role in working with other organizations to establish common standards for credentials and learner records, to help create consistency. The company collaborates closely with 1EdTech. And last month it helped launch the 1EdTech TrustEd Microcredential Coalition, which aims to increase quality and trust in digital credentials.
It’s an era many instructors would like to put behind them: black boxes on Zoom screens, muffled discussions behind masks, students struggling to stay engaged. But how much more challenging would teaching during the pandemic have been if colleges did not have experts on staff to help with the transition? On many campuses, teaching-center directors, instructional designers, educational technologists, and others worked alongside professors to explore learning-management systems, master video technology, and rethink what and how they teach.
A new book out this month, Higher Education Beyond Covid: New Teaching Paradigms and Promise, explores this period through the stories of campus teaching and learning centers. Their experiences reflect successes and failures, and what higher education could learn as it plans for the future.
As usual, our readers were full of suggestions. Kathryn Schild, the lead instructional designer in faculty development and instructional support at the University of Alaska at Anchorage, shared a guide she’s compiled on holding asynchronous discussions, which includes a section on difficult topics.
In an email, Schild also pulled out a few ideas she thought were particularly relevant to Le’s question, including:
Set the ground rules as a class. One way to do this is to share your draft rules in a collaborative document and ask students to annotate it and add suggestions.
Plan to hold fewer difficult discussions than in a face-to-face class, and work on quality over quantity. This could include multiweek discussions, where you spiral through the same issue with fresh perspectives as the class learns new approaches.
Start with relationship-building interactions in the first few weeks, such as introductions, low-stakes group assignments, or peer feedback.
What does the future of education LOOK like? — from stefanbauschard.substack.com by Stefan Bauschard
Diverse students receive instruction from robots in an online classroom that mimics the current structure of education.
Last week, Matt Barnum reported in Chalkbeat that the Chan Zuckerberg Initiative is laying off dozens of staff members and pivoting away from the personalized learning platform they have funded since 2015 with somewhere near $100M.
…
I have tried to illustrate as often as my subscribers will tolerate that students don’t particularly enjoy learning alone with laptops within social spaces like classrooms. That learning fails to answer their questions about their social identity. It contributes to their feelings of alienation and disbelonging. I find this case easy to make but hard to prove. Maybe we just haven’t done personalized learning right? Maybe Summit just needed to include generative AI chatbots in their platform?
What is far easier to prove, or rather to disprove, is the idea that “whole class instruction must feel impersonal to students,” that “whole class instruction must necessarily fail to meet the needs of individual students.”
From DSC: I appreciate Dan’s comments here (as highlighted above) as they are helpful in my thoughts regarding the Learning from the Living [Class] Room vision. They seem to be echoed here by Jeppe Klitgaard Stricker when he says:
Personalized learning paths can be great, but they also entail a potential abolishment or unintended dissolution of learning communities and belonging.
Perhaps this powerful, global, Artificial Intelligence (AI)-backed, next-generation, lifelong learning platform of the future will be more focused on postsecondary students and experiences — but not so much for the K12 learning ecosystem.
But the school systems I’ve seen here in Michigan (USA) represent systems that address a majority of the class only. These one-size-fits-all systems don’t work for many students who need extra help and/or who are gifted students. The trains move fast. Good luck if you can’t keep up with the pace.
But if K-12’ers are involved in a future learning platform, the platform needs to address what Dan’s saying. It must address students’ questions about their social identity and not contribute to their feelings of alienation and disbelonging. It needs to support communities of practice and learning communities.
To use Sherpa, an instructor first uploads the reading they’ve assigned, or they can have the student upload a paper they’ve written. Then the tool asks a series of questions about the text (either questions input by the instructor or generated by the AI) to test the student’s grasp of key concepts. The software gives the instructor the choice of whether they want the tool to record audio and video of the conversation, or just audio.
The tool then uses AI to transcribe the audio from each student’s recording and flags areas where the student answer seemed off point. Teachers can review the recording or transcript of the conversation and look at what Sherpa flagged as trouble to evaluate the student’s response.
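Sherpa’s internals aren’t public, but the general shape of the flagging step it describes (transcribe an answer, then measure how well it covers the instructor’s key concepts) can be sketched in a few lines. Everything below, from the function names to the 30% cutoff and the keyword-overlap heuristic, is a hypothetical illustration rather than Sherpa’s actual method:

```python
# Hypothetical sketch of the flagging step described above, NOT Sherpa's
# actual method: take a transcript of a student's answer, measure how much
# of the instructor's key vocabulary it covers, and flag low coverage.

PUNCT = ".,;:!?\"'()"

def tokenize(text):
    """Lowercase word tokens with surrounding punctuation stripped."""
    return {w.strip(PUNCT).lower() for w in text.split() if w.strip(PUNCT)}

def flag_off_point(key_concepts, transcript, threshold=0.3):
    """True when the transcript covers too few of the key concepts.

    `threshold` is an arbitrary illustrative cutoff: flag answers that
    mention less than 30% of the instructor-supplied concept words.
    """
    concepts = tokenize(key_concepts)
    if not concepts:
        return False
    coverage = len(concepts & tokenize(transcript)) / len(concepts)
    return coverage < threshold

# Example: key concepts an instructor might enter for one question.
key = "chlorophyll light energy glucose carbon dioxide"
good = "Plants use chlorophyll to capture light energy and make glucose from carbon dioxide."
weak = "I think plants just grow when you water them."
print(flag_off_point(key, good))  # False: on point
print(flag_off_point(key, weak))  # True: flagged for the teacher to review
```

A production tool would presumably use semantic similarity from a language model rather than raw keyword overlap, but the shape of the pipeline (transcript in, per-question flags out for the teacher to review) is the same.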
AI Meets Med School — from insidehighered.com by Lauren Coffey
Adding to academia’s AI embrace, two institutions in the University of Texas system are jointly offering a medical degree paired with a master’s in artificial intelligence.
The University of Texas at San Antonio has launched a dual-degree program combining medical school with a master’s in artificial intelligence.
Several universities across the nation have begun integrating AI into medical practice. Medical schools at the University of Florida, the University of Illinois, the University of Alabama at Birmingham and Stanford and Harvard Universities all offer variations of a certificate in AI in medicine that is largely geared toward existing professionals.
“I think schools are looking at, ‘How do we integrate and teach the uses of AI?’” Dr. Whelan said. “And in general, when there is an innovation, you want to integrate it into the curriculum at the right pace.”
Speaking of emerging technologies and med school, also see:
How to stop AI deepfakes from sinking society — and science — from nature.com by Nicola Jones; via The Neuron
Deceptive videos and images created using generative AI could sway elections, crash stock markets and ruin reputations. Researchers are developing methods to limit their harm.
48+ hours since GPT-4V started rolling out for Plus and enterprise users.
With just under 10 acquisitions in the last 5 years, PowerSchool has been active in transforming itself from a student information systems company to an integrated education company that works across the day and lifecycle of K–12 students and educators. What’s more, the company turned heads in June with its announcement that it was partnering with Microsoft to integrate AI into its PowerSchool Performance Matters and PowerSchool LearningNav products to empower educators in delivering transformative personalized-learning pathways for students.
As readers of this series know, I’ve developed a six-session design/build workshop series for learning design teams to create an AI Learning Design Assistant (ALDA). In my last post in this series, I provided an elaborate ChatGPT prompt that can be used as a rapid prototype that everyone can try out and experiment with.1 In this post, I’d like to focus on how to address the challenges of AI literacy effectively and equitably.
Countries worldwide are designing and implementing AI governance legislation commensurate to the velocity and variety of proliferating AI-powered technologies. Legislative efforts include the development of comprehensive legislation, focused legislation for specific use cases, and voluntary guidelines and standards.
This tracker identifies legislative policy and related developments in a subset of jurisdictions. It is not globally comprehensive, nor does it include all AI initiatives within each jurisdiction, given the rapid and widespread policymaking in this space. This tracker offers brief commentary on the wider AI context in specific jurisdictions, and lists index rankings provided by Tortoise Media, the first index to benchmark nations on their levels of investment, innovation and implementation of AI.
“AI is real”
JPMorgan CEO Jamie Dimon says artificial intelligence will be part of “every single process,” adding it’s already “doing all the equity hedging for us” https://t.co/EtsTbiME1a pic.twitter.com/J9YD4slOpv
Part 1: October 16 | 3:00–4:30 p.m. ET
Part 2: October 19 | 3:00–4:30 p.m. ET
Part 3: October 26 | 3:00–4:30 p.m. ET
Part 4: October 30 | 3:00–4:30 p.m. ET
Welcome to The Future of Education with Michael B. Horn. In this insightful episode, Michael gains perspective on mapping AI’s role in education from Jacob Klein, a Product Consultant at Oko Labs, and Laurence Holt, an Entrepreneur In Residence at the XQ Institute. Together, they peer into the burgeoning world of AI in education, analyzing its potential, risks, and roadmap for integrating it seamlessly into learning environments.
Student Use Cases for AI: Start by Sharing These Guidelines with Your Class — from hbsp.harvard.edu by Ethan Mollick and Lilach Mollick
To help you explore some of the ways students can use this disruptive new technology to improve their learning—while making your job easier and more effective—we’ve written a series of articles that examine the following student use cases:
Earlier this week, CETL and AIG hosted a discussion among UM faculty and other instructors about teaching and AI this fall semester. We wanted to know what was working when it came to policies and assignments that responded to generative AI technologies like ChatGPT, Google Bard, Midjourney, DALL-E, and more. We were also interested in hearing what wasn’t working, as well as questions and concerns that the university community had about teaching and AI.
Then, in class he put them into groups where they worked together to generate a 500-word essay on “Why I Write” entirely through ChatGPT. Each group had complete freedom in how they chose to use the tool. The key: They were asked to evaluate their essay on how well it offered a personal perspective and demonstrated a critical reading of the piece. Weiss also graded each ChatGPT-written essay and included an explanation of why he came up with that particular grade.
After that, the students were asked to record their observations on the experiment on the discussion board. Then they came together again as a class to discuss the experiment.
Weiss shared some of his students’ comments with me (with their approval). Here are a few:
Asked to describe the state of generative AI that they would like to see in higher education 10 years from now, panelists collaboratively constructed their preferred future.
Julie York, a computer science and media teacher at South Portland High School in Maine, was scouring the internet for discussion tools for her class when she found TeachFX. An AI tool that takes recorded audio from a classroom and turns it into data about who talked and for how long, it seemed like a cool way for York to discuss issues of data privacy, consent and bias with her students. But York soon realized that TeachFX was meant for much more.
York found that TeachFX listened to her very carefully, and generated a detailed feedback report on her specific teaching style. York was hooked, in part because she says her school administration simply doesn’t have the time to observe teachers while tending to several other pressing concerns.
“I rarely ever get feedback on my teaching style. This was giving me 100 percent quantifiable data on how many questions I asked and how often I asked them in a 90-minute class,” York says. “It’s not a rubric. It’s a reflection.”
TeachFX is easy to use, York says. It’s as simple as switching on a recording device.
…
But TeachFX, she adds, is focused not on her students’ achievements, but instead on her performance as a teacher.
ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds — from the74million.org by Mark Keierleber
While educators worry that students are using generative AI to cheat, a new report finds students are turning to the tool more for personal problems.
Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.
Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat.
The prospect of AI-powered, tailored, on-demand learning and performance support is exhilarating: It starts with traditional digital learning made into fully adaptive learning experiences, which would adjust to strengths and weaknesses for each individual learner. The possibilities extend all the way through to simulations and augmented reality, an environment to put into practice knowledge and skills, whether as individuals or working in a team simulation. The possibilities are immense.
Thanks to generative AI, such visions are transitioning from fiction to reality.
Video: Unleashing the Power of AI in L&D — from drphilippahardman.substack.com by Dr. Philippa Hardman
An exclusive video walkthrough of my keynote at Sweden’s national L&D conference this week
Highlights
The wicked problem of L&D: last year, $371 billion was spent on workplace training globally, but only 12% of employees apply what they learn in the workplace
An innovative approach to L&D: when Mastery Learning is used to design & deliver workplace training, the rate of “transfer” (i.e. behaviour change & application) is 67%
AI 101: quick summary of classification, generative and interactive AI and its uses in L&D
The impact of AI: my initial research shows that AI has the potential to scale Mastery Learning and, in the process:
reduce the “time to training design” by 94% > faster
reduce the cost of training design by 92% > cheaper
increase the quality of learning design & delivery by 96% > better
Research also shows that the vast majority of workplaces are using AI only to “oil the machine” rather than innovate and improve our processes & practices
Practical tips: how to get started on your AI journey in your company, and a glimpse of what L&D roles might look like in a post-AI world
Edsoma is an app that uses an AI reading assistant to help people learn or improve their reading and communication.
…
For now, the company is targeting users in kindergarten through fourth grade based on the content that it has today. Wallgren noted that Edsoma’s technology will work right through into university, and he has ambitions to become the No. 1 literacy resource in the United States.
Outschool, the online learning platform that offers kid-friendly academic and interest-based classes, announced today the launch of its AI Teaching Assistant, a tool for tutors to generate progress reports for their students. The platform — mainly popular for its small group class offerings — also revealed that it’s venturing into one-on-one tutoring, putting it in direct competition with companies like Varsity Tutors, Tutor.com and Preply.
But this fall, a few educators are quietly charting a different course they believe could change everything: At least two groups are pushing to create new AI chatbots that would offer teachers unlimited access to sometimes confusing and often paywalled peer-reviewed research on the topics that most bedevil them.
Their aspiration is to offer new tools that are more focused and helpful than wide-ranging ones like ChatGPT, which tends to stumble over research questions with competing findings. Like many juveniles faced with questions they can’t answer, it has a frustrating tendency to make things up.
Then, I turned Custom Instructions on and gave it the same starting prompt. Here are a few of the responses (emphases are mine):
“Purpose: What’s the primary goal of the course? Is it to augment your existing offerings at Every, serve as a standalone product, or something else?”
“Content Depth: Given your love for detailed, intellectual writing, how in-depth will you go on each topic?”
“Guest Contributions: Do you plan to involve guest experts, perhaps even some of your writing heroes or team members like Kate Lee and Evan Armstrong?”
“Time Investment: Given that you’re already grappling with being too opportunistic and dutiful, how will you allocate time for this without sidelining your creative work or burdening yourself?”
AI for comics — from aiandacademia.substack.com by Bryan Alexander
Exploring an entertaining new Hugging Face application
As generative AI grows, more uses appear. Beyond text and images, other domains for creation pop up as companies, groups, and individuals try out new functions.
Today’s case in point is AI Comic Factory from Hugging Face.* This service will generate a couple of comic book pages based on your text input. It gives you choices of comic style and page layout as well.
Items from Tom Barrett’s Promptcraft: AI for a better learning ecosystem
How many languages do you speak? Thanks to AI, that number could be as many as seven. Los Angeles-based AI video platform HeyGen has launched a new tool that clones your voice from a video and translates what you’re saying into seven different languages. If that wasn’t enough, it also syncs your lips to your new voice so the final clip looks (and sounds) as realistic as possible.
Microsoft and Project Gutenberg have used AI technologies to create more than 5,000 free audiobooks with high-quality synthetic voices.
For the project, the researchers combined advances in machine learning, automatic text selection (which texts are read aloud, which are not), and natural-sounding speech synthesis systems.
What if you could have a conversation with your notes? That question has consumed a corner of the internet recently, as companies like Dropbox, Box, Notion, and others have built generative AI tools that let you interact with and create new things from the data you already have in their systems.
Google’s version of this is called NotebookLM. It’s an AI-powered research tool that is meant to help you organize and interact with your own notes.
That got me to thinking…
What if the presenter/teacher/professor/trainer/preacher provided a set of notes for the AI to compare to the readers’ notes?
That way, the AI could see the discrepancies between what the presenter wanted their audience to learn/hear and what was actually being learned/heard. In a sort of digital Socratic Method, the AI could then generate some leading questions to get the audience member to check their thinking/understanding of the topic.
The end result would be that the main points were properly communicated/learned/received.
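That comparison could start very simply: list which of the presenter’s main points never show up in a learner’s notes, then turn each gap into a leading question. A minimal sketch of the idea, with hypothetical names throughout; a real system would use an LLM for both the matching and the question generation, and the substring check and question template here are simple stand-ins:

```python
# Purely illustrative sketch of the note-comparison idea above; no existing
# product is being described. Find presenter points absent from a learner's
# notes, then phrase each gap as a Socratic-style leading question.

def missing_points(presenter_points, learner_notes):
    """Presenter points that never appear in the learner's notes."""
    notes = learner_notes.lower()
    return [p for p in presenter_points if p.lower() not in notes]

def leading_questions(points):
    """Turn each uncovered point into a simple leading question
    (a placeholder for what a language model would generate)."""
    return [f"What did the presenter say about {p}?" for p in points]

presenter = ["spaced repetition", "retrieval practice", "interleaving"]
notes = "Key idea: reviewing at intervals (spaced repetition) beats cramming."

gaps = missing_points(presenter, notes)
print(gaps)  # ['retrieval practice', 'interleaving']
print(leading_questions(gaps))
```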
… Right now, it’s really just a prototype, but a small team inside the company has been trying to figure out what an AI notebook might look like.
The single narrative education system is no longer working.
Its main limitation is its inability to honor young people as the dynamic individuals that they are.
New models of teaching and learning need to be designed to center on the student, not the teacher.
When the opportunity arises to implement learning that uses immersive technology, ask yourself if the learning you are designing passes the Ready Player One Test:
Does it allow learners to immerse themselves in environments that would be too expensive or dangerous to experience otherwise?
Can the learning be personalized by the student?
Is it regenerative?
Does it allow for learning to happen non-linearly, at any time and place?
So, as educators, mentors, and guides to our future generations, we must ask ourselves three pivotal questions:
What value do we offer to our students?
What value will they need to offer to the world?
How are we preparing them to offer that value?
The answers to these questions are crucial, and they will redefine the trajectory of our education system.
We need to create an environment that encourages curiosity, embraces failure as a learning opportunity, and celebrates diversity. We need to teach our students how to learn, how to ask the right questions, and how to think for themselves.
Leveraging ChatGPT for learning is the most meaningful skill this year for lifelong learners. But it’s too hard to find resources to master it.
As a learning science nerd, I’ve explored hundreds of prompts over the past months. Most of the advice doesn’t go beyond text summaries and multiple-choice testing.
That’s why I’ve created this article — it merges learning science with prompt writing to help you learn anything faster.
Midjourney AI Art for Teachers (for any kind of teacher, not just Art Teachers) — from The AI Educator on YouTube by Dan Fitzpatrick
From DSC: This is a very nice, clearly illustrated, free video to get started with the Midjourney (text-to-image) app. Nice work Dan!
In the new-normal of generative AI, how does one articulate the value of academic integrity? This blog presents my current response in about 2,500 words; a complete answer could fill a sizable book.
Massive amounts of misinformation are disseminated about generative AI, so the first part of my discussion clarifies what large language models (ChatGPT and its counterparts) can currently do and what they cannot accomplish at this point in time. The second part describes ways in which generative AI can be misused as a means of learning; unfortunately, many people are now advocating for these mistaken applications to education. The third part describes ways in which large language models (LLMs), used well, may substantially improve learning and education. I close with a plea for a robust, informed public discussion about these topics and issues.
Many of the more than a dozen teachers TIME interviewed for this story argue that the way to get kids to care is to proactively use ChatGPT in the classroom.
…
Some of those creative ideas are already in effect at Peninsula High School in Gig Harbor, about an hour from Seattle. In Erin Rossing’s precalculus class, a student got ChatGPT to generate a rap about vectors and trigonometry in the style of Kanye West, while geometry students used the program to write mathematical proofs in the style of raps, which they performed in a classroom competition. In Kara Beloate’s English-Language Arts class, she allowed students reading Shakespeare’s Othello to use ChatGPT to translate lines into modern English to help them understand the text, so that they could spend class time discussing the plot and themes.
I found that other developed countries share concerns about students cheating but are moving quickly to use AI to personalize education, enhance language lessons and help teachers with mundane tasks, such as grading. Some of these countries are in the early stages of training teachers to use AI and developing curriculum standards for what students should know and be able to do with the technology.
Several countries began positioning themselves several years ago to invest in AI in education in order to compete in the fourth industrial revolution.
AI in Education — from educationnext.org by John Bailey
The leap into a new era of machine intelligence carries risks and challenges, but also plenty of promise
In the realm of education, this technology will influence how students learn, how teachers work, and ultimately how we structure our education system. Some educators and leaders look forward to these changes with great enthusiasm. Sal Khan, founder of Khan Academy, went so far as to say in a TED talk that AI has the potential to effect “probably the biggest positive transformation that education has ever seen.” But others warn that AI will enable the spread of misinformation, facilitate cheating in school and college, kill whatever vestiges of individual privacy remain, and cause massive job loss. The challenge is to harness the positive potential while avoiding or mitigating the harm.
Generative AI and education futures — from ucl.ac.uk
Video highlights from Professor Mike Sharples’ keynote address at the 2023 UCL Education Conference, which explored opportunities to prosper with AI as a part of education.
Bringing AI Literacy to High Schools — by Nikki Goth Itoi
Stanford education researchers collaborated with teachers to develop classroom-ready AI resources for high school instructors across subject areas.
To address these two imperatives, all high schools need access to basic AI tools and training. Yet the reality is that many underserved schools in low-income areas lack the bandwidth, skills, and confidence to guide their students through an AI-powered world. And if the pattern continues, AI will only worsen existing inequities. With this concern top of mind, plus initial funding from the McCoy Ethics Center, Lee began recruiting some graduate students and high school teachers to explore how to give more people equal footing in the AI space.