This AI App Can Solve Your Math Homework, Steps Included — from link.wired.com by Will Knight

Right now, high schoolers and college students around the country are experimenting with free smartphone apps that help complete their math homework using generative AI. One of the most popular options on campus is the Gauth app, with millions of downloads. It’s owned by ByteDance, which is also TikTok’s parent company.

The Gauth app first launched in 2019 with a primary focus on mathematics, but soon expanded to other subjects as well, like chemistry and physics. It’s grown in relevance, and neared the top of smartphone download lists earlier this year for the education category. Students seem to love it. With hundreds of thousands of primarily positive reviews, Gauth has a favorable 4.8 star rating in the Apple App Store and Google Play Store.

All students have to do after downloading the app is point their smartphone at a homework problem, printed or handwritten, and then make sure any relevant information is inside the image crop. Then Gauth’s AI model generates a step-by-step guide, often with the correct answer.

From DSC:
I do hesitate to post this though, as I’ve seen numerous postings re: the dubious quality of AI as it relates to giving correct answers to math-related problems – or whether using AI-based tools helps or hurts the learning process. The situation seems to be getting better, but as I understand it, we still have some progress to make in this area of mathematics.


Redefining Creativity in the Age of AI — from gettingsmart.com by David Ross

Key Points

  • Educational leaders must reconsider the definition of creativity, taking into account how generative AI tools can be used to produce novel and impactful creative work, similar to how film editors compile various elements into a cohesive, creative whole.
  • Generative AI democratizes innovation by allowing all students to become creators, expanding access to creative processes that were previously limited and fostering a broader inclusion of diverse talents and ideas in education.


AI-Powered Instructional Design at ASU — from drphilippahardman.substack.com by Dr. Philippa Hardman
How ASU’s Collaboration with OpenAI is Reshaping the Role of Instructional Designers

The developments and experiments at ASU provide a fascinating window into two things:

    1. How the world is reimagining learning in the age of AI;
    2. How the role of the instructional designer is changing in the age of AI.

In this week’s blog post, I’ll provide a summary of how faculty, staff and students at ASU are starting to reimagine education in the age of AI, and explore what this means for the instructional designers who work there.


PhysicsWallah’s ‘Alakh AI’ is Making Education Accessible to Millions in India — from analyticsindiamag.com by Siddharth Jindal

India’s ed-tech unicorn PhysicsWallah is using OpenAI’s GPT-4o to make education accessible to millions of students in India. Recently, the company launched a suite of AI products to ensure that students in Tier 2 & 3 cities can access high-quality education without depending solely on their enrolled institutions, as 85% of their enrollment comes from these areas.

Last year, AIM broke the news of PhysicsWallah introducing ‘Alakh AI’, its suite of generative AI tools, which was eventually launched at the end of December 2023. It quickly gained traction, amassing over 1.5 million users within two months of its release.


 

When A.I.’s Output Is a Threat to A.I. Itself — from nytimes.com by Aatish Bhatia
As A.I.-generated data becomes harder to detect, it’s increasingly likely to be ingested by future A.I., leading to worse results.

All this A.I.-generated information can make it harder for us to know what’s real. And it also poses a problem for A.I. companies. As they trawl the web for new data to train their next models on — an increasingly challenging task — they’re likely to ingest some of their own A.I.-generated content, creating an unintentional feedback loop in which what was once the output from one A.I. becomes the input for another.

In the long run, this cycle may pose a threat to A.I. itself. Research has shown that when generative A.I. is trained on a lot of its own output, it can get a lot worse.
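
The degradation described above is often called "model collapse." As a hedged, deliberately oversimplified illustration (a toy Gaussian stands in for a generative model; nothing here reproduces the research the article cites), the following sketch shows how repeatedly fitting a model to its own output erodes the variety of the original data.

```python
# Toy illustration only (not the methodology behind the article above):
# repeatedly fit a very simple "model" (a Gaussian) to its own synthetic output
# and watch how much of the original variety survives after many generations.

import numpy as np

rng = np.random.default_rng(42)
sample_size = 10                           # small on purpose, to exaggerate the effect
data = rng.normal(0.0, 1.0, sample_size)   # "human" data, generation 0

for generation in range(1, 41):
    mu, sigma = data.mean(), data.std()        # "train" on whatever data we have
    data = rng.normal(mu, sigma, sample_size)  # next generation sees only model output
    if generation % 10 == 0:
        print(f"generation {generation:2d}: fitted std = {sigma:.4f}")

# With each pass, estimation error compounds and the fitted spread tends to
# drift toward zero: a stripped-down analogue of the degradation researchers
# observe when generative models train heavily on their own output.
```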


Per The Rundown AI:

The Rundown: Elon Musk’s xAI just launched “Colossus”, the world’s most powerful AI cluster powered by a whopping 100,000 Nvidia H100 GPUs, which was built in just 122 days and is planned to double in size soon.

Why it matters: xAI’s Grok 2 recently caught up to OpenAI’s GPT-4 in record time, and was trained on only around 15,000 GPUs. With now more than six times that amount in production, the xAI team and future versions of Grok are going to put a significant amount of pressure on OpenAI, Google, and others to deliver.


Google Meet’s automatic AI note-taking is here — from theverge.com by Joanna Nelius
Starting [on 8/28/24], some Google Workspace customers can have Google Meet be their personal note-taker.

Google Meet’s newest AI-powered feature, “take notes for me,” has started rolling out today to Google Workspace customers with the Gemini Enterprise, Gemini Education Premium, or AI Meetings & Messaging add-ons. It’s similar to Meet’s transcription tool, only instead of automatically transcribing what everyone says, it summarizes what everyone talked about. Google first announced this feature at its 2023 Cloud Next conference.


The World’s Call Center Capital Is Gripped by AI Fever — and Fear — from bloomberg.com by Saritha Rai [behind a paywall]
The experiences of staff in the Philippines’ outsourcing industry are a preview of the challenges and choices coming soon to white-collar workers around the globe.


[Claude] Artifacts are now generally available — from anthropic.com

[On 8/27/24], we’re making Artifacts available for all Claude.ai users across our Free, Pro, and Team plans. And now, you can create and view Artifacts on our iOS and Android apps.

Artifacts turn conversations with Claude into a more creative and collaborative experience. With Artifacts, you have a dedicated window to instantly see, iterate, and build on the work you create with Claude. Since launching as a feature preview in June, users have created tens of millions of Artifacts.


MIT's AI Risk Repository -- a comprehensive database of risks from AI systems

What are the risks from Artificial Intelligence?
A comprehensive living database of over 700 AI risks categorized by their cause and risk domain.

What is the AI Risk Repository?
The AI Risk Repository has three parts:

  • The AI Risk Database captures 700+ risks extracted from 43 existing frameworks, with quotes and page numbers.
  • The Causal Taxonomy of AI Risks classifies how, when, and why these risks occur.
  • The Domain Taxonomy of AI Risks classifies these risks into seven domains (e.g., “Misinformation”) and 23 subdomains (e.g., “False or misleading information”).

California lawmakers approve legislation to ban deepfakes, protect workers and regulate AI — from newsday.com by The Associated Press

SACRAMENTO, Calif. — California lawmakers approved a host of proposals this week aiming to regulate the artificial intelligence industry, combat deepfakes and protect workers from exploitation by the rapidly evolving technology.

Per Oncely:

The Details:

  • Combatting Deepfakes: New laws to restrict election-related deepfakes and deepfake pornography, especially of minors, requiring social media to remove such content promptly.
  • Setting Safety Guardrails: California is poised to set comprehensive safety standards for AI, including transparency in AI model training and pre-emptive safety protocols.
  • Protecting Workers: Legislation to prevent the replacement of workers, like voice actors and call center employees, with AI technologies.

New in Gemini: Custom Gems and improved image generation with Imagen 3 — from blog.google
The ability to create custom Gems is coming to Gemini Advanced subscribers, and updated image generation capabilities with our latest Imagen 3 model are coming to everyone.

We have new features rolling out [starting on 8/28/24] that we previewed at Google I/O. Gems, a new feature that lets you customize Gemini to create your own personal AI experts on any topic you want, are now available for Gemini Advanced, Business and Enterprise users. And our new image generation model, Imagen 3, will be rolling out across Gemini, Gemini Advanced, Business and Enterprise in the coming days.


Cut the Chatter, Here Comes Agentic AI — from trendmicro.com

Major AI players caught heat in August over big bills and weak returns on AI investments, but it would be premature to think AI has failed to deliver. The real question is what’s next, and if industry buzz and pop-sci pontification hold any clues, the answer isn’t “more chatbots”, it’s agentic AI.

Agentic AI transforms the user experience from application-oriented information synthesis to goal-oriented problem solving. It’s what people have always thought AI would do—and while it’s not here yet, its horizon is getting closer every day.

In this issue of AI Pulse, we take a deep dive into agentic AI, what’s required to make it a reality, and how to prevent ‘self-thinking’ AI agents from potentially going rogue.

Citing AWS guidance, ZDNET counts six different potential types of AI agents (a minimal code sketch of the simplest type follows the list):

    • Simple reflex agents for tasks like resetting passwords
    • Model-based reflex agents for pro vs. con decision making
    • Goal-/rule-based agents that compare options and select the most efficient pathways
    • Utility-based agents that compare for value
    • Learning agents
    • Hierarchical agents that manage and assign subtasks to other agents
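
As a concrete anchor for the simplest category above, here is a minimal, hypothetical sketch of a simple reflex agent of the password-reset variety. The rules, event wording, and helper functions are illustrative assumptions, not anything from AWS's or ZDNET's materials.

```python
# A minimal sketch of a "simple reflex agent": it maps the current percept
# directly to an action via condition-action rules, with no memory or planning.
# Event names and actions below are illustrative placeholders.

def reset_password(user_id: str) -> str:
    # Placeholder for whatever your identity system would actually do.
    return f"Password reset link sent to user {user_id}."

def escalate_to_human(user_id: str) -> str:
    return f"Ticket opened for user {user_id}; a human will follow up."

# Condition-action rules: percept keyword -> handler
RULES = {
    "forgot password": reset_password,
    "account locked": reset_password,
    "billing dispute": escalate_to_human,
}

def simple_reflex_agent(percept: str, user_id: str) -> str:
    """Pick the first rule whose condition appears in the percept."""
    text = percept.lower()
    for condition, action in RULES.items():
        if condition in text:
            return action(user_id)
    return escalate_to_human(user_id)  # default when no rule matches

if __name__ == "__main__":
    print(simple_reflex_agent("I forgot password and can't log in", "u123"))
```

The later categories in the list layer memory, goal comparison, utility scoring, learning, and delegation on top of this same percept-to-action loop.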

Ask Claude: Amazon turns to Anthropic’s AI for Alexa revamp — from reuters.com by Greg Bensinger

Summary:

  • Amazon developing new version of Alexa with generative AI
  • Retailer hopes to generate revenue by charging for its use
  • Concerns about in-house AI prompt Amazon to turn to Anthropic’s Claude, sources say
  • Amazon says it uses many different technologies to power Alexa

Alibaba releases new AI model Qwen2-VL that can analyze videos more than 20 minutes long — from venturebeat.com by Carl Franzen


Hobbyists discover how to insert custom fonts into AI-generated images — from arstechnica.com by Benj Edwards
Like adding custom art styles or characters, in-world typefaces come to Flux.


200 million people use ChatGPT every week – up from 100 million last fall, says OpenAI — from zdnet.com by Sabrina Ortiz
Nearly two years after launching, ChatGPT continues to draw new users. Here’s why.

 

AI agents are the future, and a lot is at stake — from forbes.com by Skip Sanzeri

What An Agent Is
Agents are computer programs that can autonomously perform tasks, make decisions and interact with humans or other computers. There are many different types of agents, and they are designed to achieve specific goals spanning our lives and nearly every industry, making them an integral and unstoppable part of our future.

Learning: AI agents will transform education by providing personalized learning experiences such as one-to-one tutoring. ChatGPT and other large language models (LLMs) are providing access to all digital knowledge now. An “agent” would act as a more personalized version of an LLM.

The hacking and control of an AI agent could lead to disastrous consequences, affecting privacy, security, the economy and societal stability. Proactive and comprehensive security strategies are essential to mitigate these risks in the future.

 

What Students Want: Key Results from DEC Global AI Student Survey 2024 — from digitaleducationcouncil.com by Digital Education Council

  • 86% of students globally are regularly using AI in their studies, with 54% of them using AI on a weekly basis, the recent Digital Education Council Global AI Student Survey found.
  • ChatGPT was found to be the most widely used AI tool, with 66% of students using it, and over 2 in 3 students reported using AI for information searching.
  • Despite their high rates of AI usage, 1 in 2 students do not feel AI ready. 58% reported that they do not feel they have sufficient AI knowledge and skills, and 48% do not feel adequately prepared for an AI-enabled workplace.

Chatting with WEF about ChatGPT in the classroom — from futureofbeinghuman.com by Andrew Maynard
A short video on generative AI in education from the World Economic Forum


The Post-AI Instructional Designer — from drphilippahardman.substack.com by Dr. Philippa Hardman
How the ID role is changing, and what this means for your key skills, roles & responsibilities

Specifically, the study revealed that teachers who reported most productivity gains were those who used AI not just for creating outputs (like quizzes or worksheets) but also for seeking input on their ideas, decisions and strategies.

Those who engaged with AI as a thought partner throughout their workflow, using it to generate ideas, define problems, refine approaches, develop strategies and gain confidence in their decisions gained significantly more from their collaboration with AI than those who only delegated functional tasks to AI.  


Leveraging Generative AI for Inclusive Excellence in Higher Education — from er.educause.edu by Lorna Gonzalez, Kristi O’Neil-Gonzalez, Megan Eberhardt-Alstot, Michael McGarry and Georgia Van Tyne
Drawing from three lenses of inclusion, this article considers how to leverage generative AI as part of a constellation of mission-centered inclusive practices in higher education.

The hype and hesitation about generative artificial intelligence (AI) diffusion have led some colleges and universities to take a wait-and-see approach. However, AI integration does not need to be an either/or proposition where its use is either embraced or restricted or its adoption aimed at replacing or outright rejecting existing institutional functions and practices. Educators, educational leaders, and others considering academic applications for emerging technologies should consider ways in which generative AI can complement or augment mission-focused practices, such as those aimed at accessibility, diversity, equity, and inclusion. Drawing from three lenses of inclusion—accessibility, identity, and epistemology—this article offers practical suggestions and considerations that educators can deploy now. It also presents an imperative for higher education leaders to partner toward an infrastructure that enables inclusive practices in light of AI diffusion.

An example way to leverage AI:

How to Leverage AI for Identity Inclusion
Educators can use the following strategies to intentionally design instructional content with identity inclusion in mind.

  • Provide a GPT or AI assistant with upcoming lesson content (e.g., lecture materials or assignment instructions) and ask it to provide feedback (e.g., troublesome vocabulary, difficult concepts, or complementary activities) from certain perspectives. Begin with a single perspective (e.g., first-time, first-year student), but layer in more to build complexity as you interact with the GPT output.
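
One way to operationalize the strategy above is a short script against an LLM API. The sketch below uses the OpenAI Python SDK as one possibility; the model name, persona, and lesson text are placeholders, and any chat-capable model or assistant platform your institution has approved could be swapped in.

```python
# Sketch: ask a model to review lesson content from a specific student perspective.
# Requires: pip install openai, and an OPENAI_API_KEY in the environment.
# The model name and prompt wording are illustrative assumptions.

from openai import OpenAI

client = OpenAI()

lesson_content = """
Week 3 lecture outline: introduction to statistical significance,
p-values, and confidence intervals. Homework: interpret two published studies.
"""

perspective = "a first-time, first-year student who is also an English-language learner"

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; use whatever model your institution has approved
    messages=[
        {
            "role": "system",
            "content": (
                "You review instructional materials from the perspective of "
                f"{perspective}. Flag troublesome vocabulary, difficult concepts, "
                "and suggest one complementary activity."
            ),
        },
        {"role": "user", "content": lesson_content},
    ],
)

print(response.choices[0].message.content)
```

Layering in additional perspectives, as the bullet suggests, is then a matter of re-running the call with a different perspective string and comparing the feedback.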

Gen AI’s next inflection point: From employee experimentation to organizational transformation — from mckinsey.com by Charlotte Relyea, Dana Maor, and Sandra Durth with Jan Bouly
As many employees adopt generative AI at work, companies struggle to follow suit. To capture value from current momentum, businesses must transform their processes, structures, and approach to talent.

To harness employees’ enthusiasm and stay ahead, companies need a holistic approach to transforming how the whole organization works with gen AI; the technology alone won’t create value.

Our research shows that early adopters prioritize talent and the human side of gen AI more than other companies (Exhibit 3). Our survey shows that nearly two-thirds of them have a clear view of their talent gaps and a strategy to close them, compared with just 25 percent of the experimenters. Early adopters focus heavily on upskilling and reskilling as a critical part of their talent strategies, as hiring alone isn’t enough to close gaps and outsourcing can hinder strategic-skills development. Finally, 40 percent of early-adopter respondents say their organizations provide extensive support to encourage employee adoption, versus 9 percent of experimenter respondents.


7 Ways to Use AI Music in Your Classroom — from classtechtips.com by Monica Burns


Change blindness — from oneusefulthing.org by Ethan Mollick
21 months later

I don’t think anyone is completely certain about where AI is going, but we do know that things have changed very quickly, as the examples in this post have hopefully demonstrated. If this rate of change continues, the world will look very different in another 21 months. The only way to know is to live through it.


My AI Breakthrough — from mgblog.org by Miguel Guhlin

Over the subsequent weeks, I’ve made other adjustments, but that first one came from working through these questions and steps:

  1. What are you doing?
  2. Why are you doing it that way?
  3. How could you change that workflow with AI?
  4. Applying the AI to the workflow, then asking, “Is this what I was aiming for? How can I improve the prompt to get closer?”
  5. Documenting what worked (or didn’t). Re-doing the work with AI to see what happened, and asking again, “Did this work?”

So, something that took me WEEKS of hard work, and in some cases I found impossible, was made easy. Like, instead of weeks, it takes 10 minutes. The hard part? Building the prompt to do what I want, fine-tuning it to get the result. But that doesn’t take as long now.

 

One thing often happens at keynotes and conferences. It surprised me…. — from donaldclarkplanb.blogspot.com by Donald Clark

AI is welcomed by those with dyslexia, and other learning issues, helping to mitigate some of the challenges associated with reading, writing, and processing information. Those who want to ban AI want to destroy the very thing that has helped most on accessibility. Here are 10 ways dyslexics, and others with issues around text-based learning, can use AI to support their daily activities and learning.

    • Text-to-Speech & Speech-to-Text Tools…
    • Grammar and Spelling Assistants…
    • Comprehension Tools…
    • Visual and Multisensory Tools…
    • …and more

Let’s Make a Movie Teaser With AI — from whytryai.com by Daniel Nest
How to use free generative AI tools to make a teaser trailer.

Here are the steps and the free tools we can use for each.

  1. Brainstorm ideas & flesh out the concept.
    1. Claude 3.5 Sonnet
    2. Google Gemini 1.5 Pro
    3. …or any other free LLM
  2. Create starting frames for each scene.
    1. FLUX.1 Pro
    2. Ideogram
    3. …or any other free text-to-image model
  3. Bring the images to life.
    1. Kling AI
    2. Luma Dream Machine
    3. Runway Gen-2
  4. Generate the soundtrack.
    1. Udio
    2. Suno
  5. Add sound effects.
    1. ElevenLabs Sound Effects
    2. ElevenLabs VideoToSoundEffects
    3. Meta Audiobox
  6. Put everything together.
    1. Microsoft Clipchamp
    2. DaVinci Resolve
    3. …or any other free video editing tool.

Here we go.


Is AI in Schools Promising or Overhyped? Potentially Both, New Reports Suggest — from the74million.org by Greg Toppo; via Claire Zau
One urges educators to prep for an artificial intelligence boom. The other warns that it could all go awry. Together, they offer a reality check.

Are U.S. public schools lagging behind other countries like Singapore and South Korea in preparing teachers and students for the boom of generative artificial intelligence? Or are our educators bumbling into AI half-blind, putting students’ learning at risk?

Or is it, perhaps, both?

Two new reports, coincidentally released on the same day last week, offer markedly different visions of the emerging field: One argues that schools need forward-thinking policies for equitable distribution of AI across urban, suburban and rural communities. The other suggests they need something more basic: a bracing primer on what AI is and isn’t, what it’s good for and how it can all go horribly wrong.


Bite-Size AI Content for Faculty and Staff — from aiedusimplified.substack.com by Lance Eaton
Another two 5-tips videos for faculty and my latest use case: creating FAQs!

I had an opportunity recently to do more of my 15-minute lightning talks. You can see my lightning talks from late winter in this post, or can see all of them on my YouTube channel. These two talks were focused on faculty in particular.


Also from Lance, see:


AI in Education: Leading a Paradigm Shift — from gettingsmart.com by Dr. Tyler Thigpen

Despite possible drawbacks, an exciting wondering has been—What if AI was a tipping point helping us finally move away from a standardized, grade-locked, ranking-forced, batched-processing learning model based on the make believe idea of “the average man” to a learning model that meets every child where they are at and helps them grow from there?

I get that change is indescribably hard and there are risks. But the integration of AI in education isn’t a trend. It’s a paradigm shift that requires careful consideration, ongoing reflection, and a commitment to one’s core values. AI presents us with an opportunity—possibly an unprecedented one—to transform teaching and learning, making it more personalized, efficient, and impactful. How might we seize the opportunity boldly?


California and NVIDIA Partner to Bring AI to Schools, Workplaces — from govtech.com by Abby Sourwine
The latest step in Gov. Gavin Newsom’s plans to integrate AI into public operations across California is a partnership with NVIDIA intended to tailor college courses and professional development to industry needs.

California Gov. Gavin Newsom and tech company NVIDIA joined forces last week to bring generative AI (GenAI) to community colleges and public agencies across the state. The California Community Colleges Chancellor’s Office (CCCCO), NVIDIA and the governor all signed a memorandum of understanding (MOU) outlining how each partner can contribute to education and workforce development, with the goal of driving innovation across industries and boosting their economic growth.


Listen to anything on the go with the highest-quality voices — from elevenlabs.io; via The Neuron
The ElevenLabs Reader App narrates articles, PDFs, ePubs, newsletters, or any other text content. Simply choose a voice from our expansive library, upload your content, and listen on the go.

Per The Neuron

Some cool use cases:

  • Judy Garland can teach you biology while walking to class.
  • James Dean can narrate your steamy romance novel.
  • Sir Laurence Olivier can read you today’s newsletter—just paste the web link and enjoy!

Why it’s important: ElevenLabs shared how major YouTubers are using its dubbing services to expand their content into new regions with voices that actually sound like them (thanks to ElevenLabs’ ability to clone voices).
Oh, and BTW, it’s estimated that up to 20% of the population may have dyslexia. So providing people an option to listen to (instead of read) content, in their own language, wherever they go online can only help increase engagement and communication.


How Generative AI Improves Parent Engagement in K–12 Schools — from edtechmagazine.com by Alexander Slagg
With its ability to automate and personalize communication, generative artificial intelligence is the ideal technological fix for strengthening parent involvement in students’ education.

As generative AI tools populate the education marketplace, the technology’s ability to automate complex, labor-intensive tasks and efficiently personalize communication may finally offer overwhelmed teachers a way to effectively improve parent engagement.

These personalized engagement activities for students and their families can include local events, certification classes and recommendations for books and videos. “Family Feed might suggest courses, such as an Adobe certification,” explains Jackson. “We have over 14,000 courses that we have vetted and can recommend. And we have books and video recommendations for students as well.”

Including personalized student information and an engagement opportunity makes it much easier for parents to directly participate in their children’s education.


Will AI Shrink Disparities in Schools, or Widen Them? — from edsurge.com by Daniel Mollenkamp
Experts predict new tools could boost teaching efficiency — or create an “underclass of students” taught largely through screens.

 

UC Berkeley Law School To Offer Advanced Law Degree Focused On AI — from forbes.com by Michael T. Nietzel; via Greg Lambert

The University of California, Berkeley School of Law has announced that it will offer what it’s calling “the first-ever law degree with a focus on artificial intelligence (AI).” The new AI-focused Master of Laws (LL.M.) program is scheduled to launch in summer 2025.

The program, which will award an AI Law and Regulation certificate for students enrolled in UC Berkeley Law’s LL.M. executive track, is designed for working professionals and can be completed over two summers or through remote study combined with one summer on campus.


Also relevant, see:

Training AI to Mentor Like a Partner: Insights from Dr. Megan Ma — from geeklawblog.com

This week on The Geek in Review, we discuss the future of legal technology with Dr. Megan Ma, a distinguished research fellow and Associate Director of the Stanford Program in Law, Science, and Technology at the Stanford Center for Legal Informatics, also known as Codex. Dr. Ma’s groundbreaking work in integrating generative AI into legal applications takes center stage as she shares her insights on translating legal knowledge into code and the implications of human-machine collaboration in the legal field.

 

College Writing Centers Worry AI Could Replace Them — from edsurge.com by Maggie Hicks
Those who run the centers argue that they could be a hub for teaching AI literacy.

But as generative AI tools like ChatGPT sweep into mainstream business tools, promising to draft properly formatted text from simple prompts and the click of a button, new questions are rising about what role writing centers should play — or whether they will be needed in the future.

Writing centers need to find a balance between introducing AI into the writing process and keeping the human support that every writer needs, argues Anna Mills, an English instructor at the College of Marin.

AI can serve as a supplement to a human tutor, Mills says. She encourages her students to use MyEssayFeedback, an AI tool that critiques the organization of an essay, the quality of evidence a student has included to support their thesis or the tone of the writing. Such tools can also evaluate research questions or review a student’s writing based on the rubric for the assignment, she says.

 

Gemini makes your mobile device a powerful AI assistant — from blog.google
Gemini Live is available today to Advanced subscribers, along with conversational overlay on Android and even more connected apps.

Rolling out today: Gemini Live <– Google swoops in before OpenAI can get their Voice Mode out there
Gemini Live is a mobile conversational experience that lets you have free-flowing conversations with Gemini. Want to brainstorm potential jobs that are well-suited to your skillset or degree? Go Live with Gemini and ask about them. You can even interrupt mid-response to dive deeper on a particular point, or pause a conversation and come back to it later. It’s like having a sidekick in your pocket who you can chat with about new ideas or practice with for an important conversation.

Gemini Live is also available hands-free: You can keep talking with the Gemini app in the background or when your phone is locked, so you can carry on your conversation on the go, just like you might on a regular phone call. Gemini Live begins rolling out today in English to our Gemini Advanced subscribers on Android phones, and in the coming weeks will expand to iOS and more languages.

To make speaking to Gemini feel even more natural, we’re introducing 10 new voices to choose from, so you can pick the tone and style that works best for you.


Per the Rundown AI:
Why it matters: Real-time voice is slowly shifting AI from a tool we text/prompt with, to an intelligence that we collaborate, learn, consult, and grow with. As the world’s anticipation for OpenAI’s unreleased products grows, Google has swooped in to steal the spotlight as the first to lead widespread advanced AI voice rollouts.

Beyond Social Media: Schmidt Predicts AI’s Earth-Shaking Impact — from wallstreetpit.com
The next wave of AI is coming, and if Schmidt is correct, it will reshape our world in ways we are only beginning to imagine.

In a recent Q&A session at Stanford, Eric Schmidt, former CEO and Chairman of search giant Google, offered a compelling vision of the near future in artificial intelligence. His predictions, both exciting and sobering, paint a picture of a world on the brink of a technological revolution that could dwarf the impact of social media.

Schmidt highlighted three key advancements that he believes will converge to create this transformative wave: very large context windows, agents, and text-to-action capabilities. These developments, according to Schmidt, are not just incremental improvements but game-changers that could reshape our interaction with technology and the world at large.



The rise of multimodal AI agents — from 11onze.cat
Technology companies are investing large amounts of money in creating new multimodal artificial intelligence models and algorithms that can learn, reason and make decisions autonomously after collecting and analysing data.

The future of multimodal agents
In practical terms, a multimodal AI agent can, for example, analyse a text while processing an image, spoken language, or an audio clip to give a more complete and accurate response, both through voice and text. This opens up new possibilities in various fields: from education and healthcare to e-commerce and customer service.
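
To make the kind of request described above concrete, here is a hedged sketch that sends text and an image together to a multimodal model through the OpenAI API; the model name and image URL are placeholders, and other multimodal APIs (Gemini, Claude) follow a similar pattern. Audio would be handled by transcribing the clip first or by using a speech-capable endpoint.

```python
# Sketch: combine an image and a text question in one request to a multimodal model.
# Requires: pip install openai, OPENAI_API_KEY set. Model name and URL are placeholders.

from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder multimodal model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what this chart shows and any anomalies."},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/sales-chart.png"},
                },
            ],
        }
    ],
)

print(response.choices[0].message.content)
```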


AI Change Management: 41 Tactics to Use (August 2024) — from flexos.work by Daan van Rossum
Future-proof companies are investing in driving AI adoption, but many don’t know where to start. The experts recommend these 41 tips for AI change management.

As Matt Kropp told me in our interview, BCG has a 10-20-70 rule for AI at work:

  • 10% is the LLM or algorithm
  • 20% is the software layer around it (like ChatGPT)
  • 70% is the human factor

This 70% is exactly why change management is key in driving AI adoption.

But where do you start?

As I coach leaders at companies like Apple, Toyota, Amazon, L’Oréal, and Gartner in our Lead with AI program, I know that’s the question on everyone’s minds.

I don’t believe in gatekeeping this information, so here are 41 principles and tactics I share with our community members looking for winning AI change management principles.


 

How Generative AI will change what lawyers do — from jordanfurlong.substack.com by Jordan Furlong
As we enter the Age of Accessible Law, a wave of new demand is coming our way — but AI will meet most of the surge. What will be left for lawyers? Just the most valuable and irreplaceable role in law.

AI can already provide actionable professional advice; within the next ten years, if it takes that long, I believe it will offer acceptable legal advice. No one really wants “AI courts,” but soon enough, we’ll have AI-enabled mediation and arbitration, which will have a much greater impact on everyday dispute resolution.

I think it’s dangerous to assume that AI will never be able to do something that lawyers now do. “Never” is a very long time. And AI doesn’t need to replicate the complete arsenal of the most gifted lawyer out there. If a Legal AI can replicate 80% of what a middling lawyer can do, for 10% of the cost, in 1% of the time, that’s all the revolution you’ll need.

From DSC:
It is my sincere hope that AI will open up the floodgates to FAR greater Access to Justice (A2J) in the future.


It’s the Battle of the AI Legal Assistants, As LexisNexis Unveils Its New Protégé and Thomson Reuters Rolls Out CoCounsel 2.0 — from lawnext.com by Bob Ambrogi

It’s not quite BattleBots, but competitors LexisNexis and Thomson Reuters both made significant announcements today involving the development of generative AI legal assistants within their products.

Thomson Reuters, which last year acquired the CoCounsel legal assistant originally developed by Casetext, and which later announced plans to deploy it throughout its product lines, today unveiled what it says is the “supercharged” CoCounsel 2.0.

Meanwhile, LexisNexis said today it is rolling out the commercial preview version of its Protégé Legal AI Assistant, which it describes as a “substantial leap forward in personalized generative AI that will transform legal work.” It is part of the launch of the third generation of Lexis+ AI, the AI-driven legal research platform the company launched last year.


Thomson Reuters Launches CoCounsel 2.0 — from abovethelaw.com by Joe Patrice
New release promises results three times faster than the last version.

It seems like just last year we were talking about CoCounsel 1.0, the generative AI product launched by Casetext and then swiftly acquired by Thomson Reuters. That’s because it was just last year. Since then, Thomson Reuters has worked to marry Casetext’s tool with TR’s treasure trove of data.

It’s not an easy task. A lot of the legal AI conversation glosses over how constructing these tools requires a radical confrontation with the lawyers’ mind. Why do attorneys do what they do every day? Are there seemingly “inefficient” steps that actually serve a purpose? Does an AI “answer” advance the workflow or hinder the research alchemy? As recently as April, Thomson Reuters was busy hyping the fruits of its efforts to get ahead of these challenges.


Though this next item is not necessarily related to legaltech, it’s still relevant to the legal realm:

A Law Degree Is No Sure Thing — from cew.georgetown.edu
Some Law School Graduates Earn Top Dollar, but Many Do Not

Summary
Is law school worth it? A Juris Doctor (JD) offers high median earnings and a substantial earnings boost relative to a bachelor’s degree in the humanities or social sciences—two of the more common fields of study that lawyers pursue as undergraduate students. However, graduates of most law schools carry substantial student loan debt, which dims the financial returns associated with a JD.

A Law Degree Is No Sure Thing: Some Law School Graduates Earn Top Dollar, but Many Do Not finds that the return on investment (ROI) in earnings and career outcomes varies widely across law schools. The median earnings net of debt payments are $72,000 four years after graduation for all law school graduates, but exceed $200,000 at seven law schools. By comparison, graduates of 33 law schools earn less than $55,000 net of debt payments four years after graduation.

From DSC:
A former boss’ husband was starting up a local public defender’s office in Michigan and needed to hire over two dozen people. The salaries were in the $40Ks, she said. This surprised me greatly, as I thought all lawyers were bringing in the big bucks. This is not the case, clearly. Many lawyers do not make the big bucks, as this report shows:

…graduates of 33 law schools earn less than $55,000 net of debt payments four years after graduation.


Also relevant/see:

 

From DSC:
The above item is simply excellent!!! I love it!



Also relevant/see:

3 new Chrome AI features for even more helpful browsing — from blog.google by Parisa Tabriz
See how Chrome’s new AI features, including Google Lens for desktop and Tab compare, can help you get things done more easily on the web.


On speaking to AI — from oneusefulthing.org by Ethan Mollick
Voice changes a lot of things

So, let’s talk about ChatGPT’s new Advanced Voice mode and the new AI-powered Siri. They are not just different approaches to talking to AI. In many ways, they represent the divide between two philosophies of AI – Copilots versus Agents, small models versus large ones, specialists versus generalists.


Your guide to AI – August 2024 — from nathanbenaich.substack.com by Nathan Benaich and Alex Chalmers


Microsoft says OpenAI is now a competitor in AI and search — from cnbc.com by Jordan Novet

Key Points

  • Microsoft’s annually updated list of competitors now includes OpenAI, a long-term strategic partner.
  • The change comes days after OpenAI announced a prototype of a search engine.
  • Microsoft has reportedly invested $13 billion into OpenAI.


Excerpt from Graham Clay:

1. Flux, an open-source text-to-image creator that is comparable to industry leaders like Midjourney, was released by Black Forest Labs (the “original team” behind Stable Diffusion). It is capable of generating high-quality text in images (there are tons of educational use cases). You can play with it on their demo page, on Poe, or by running it on your own computer (tutorial here).

Other items re: Flux:

How to FLUX  — from heatherbcooper.substack.com by Heather Cooper
Where to use FLUX online & full tutorial to create a sleek ad in minutes


Also from Heather Cooper:

Introducing FLUX: Open-Source text to image model

FLUX… has been EVERYWHERE this week, as I’m sure you have seen. Developed by Black Forest Labs, it is an open-source image generation model that’s gaining attention for its ability to rival leading models like Midjourney, DALL·E 3, and SDXL.

What sets FLUX apart is its blend of creative freedom, precision, and accessibility—it’s available across multiple platforms and can be run locally.

Why FLUX Matters
FLUX’s open-source nature makes it accessible to a broad audience, from hobbyists to professionals.

It offers advanced multimodal and parallel diffusion transformer technology, delivering high visual quality, strong prompt adherence, and diverse outputs.

It’s available in 3 models:
FLUX.1 [pro]: A high-performance, commercial image synthesis model.
FLUX.1 [dev]: An open-weight, non-commercial variant of FLUX.1 [pro].
FLUX.1 [schnell]: A faster, distilled version of FLUX.1, operating up to 10x quicker.
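
For readers curious about the "run locally" option mentioned above, here is a minimal sketch using Hugging Face's diffusers library with the openly available FLUX.1 [schnell] weights. It assumes a recent diffusers release with Flux support, access to the model on the Hugging Face Hub, and either a capable GPU or the CPU-offload fallback shown below.

```python
# Sketch: local text-to-image generation with FLUX.1 [schnell] via diffusers.
# Assumes: pip install torch diffusers transformers accelerate sentencepiece
# and that you have access to the model weights on the Hugging Face Hub.

import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # trades speed for lower GPU memory use

image = pipe(
    prompt="A chalkboard that reads 'Welcome back, class of 2025!' in neat handwriting",
    num_inference_steps=4,       # schnell is distilled for few-step generation
    guidance_scale=0.0,          # schnell does not use classifier-free guidance
    max_sequence_length=256,
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]

image.save("flux_schnell_example.png")
```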

Daily Digest: Huge (in)Flux of AI videos. — from bensbites.beehiiv.com
PLUS: Review of ChatGPT’s advanced voice mode.

  1. During the weekend, image models made a comeback. Recently released Flux models can create realistic images with near-perfect text—straight from the model, without much patchwork. To get the party going, people are putting these images into video generation models to create pretty trippy videos. I can’t identify half of them as AI, and they’ll only get better. See this tutorial on how to create a video ad for your product.

 


7 not only cool but handy use cases of new Claude — from techthatmatters.beehiiv.com by Harsh Makadia

  1. Data visualization
  2. Infographic
  3. Copy the UI of a website
  4. …and more

Achieving Human Level Competitive Robot Table Tennis — from sites.google.com

 

What Students Want When It Comes To AI — from onedtech.philhillaa.com by Glenda Morgan
The Digital Education Council Global AI Student Survey 2024

The Digital Education Council (DEC) this week released the results of a global survey of student opinions on AI. It’s a large survey with nearly 4,000 respondents conducted across 16 countries, but more importantly, it asks some interesting questions. There are many surveys about AI out there right now, but this one stands out. I’m going to go into some depth here, as the entire survey report is worth reading.



AI is forcing a teaching and learning evolution — from eschoolnews.com by Laura Ascione
AI and technology tools are leading to innovative student learning–along with classroom, school, and district efficiency

Key findings from the 2024 K-12 Educator + AI Survey, which was conducted by Hanover Research, include:

  • Teachers are using AI to personalize and improve student learning, not just run classrooms more efficiently, but challenges remain
  • While post-pandemic challenges persist, the increased use of technology is viewed positively by most teachers and administrators
  • …and more

From DSC:
I wonder…how will the use of AI in education square with the issues of using smartphones/laptops within the classrooms? See:

  • Why Schools Are Racing to Ban Student Phones — from nytimes.com by Natasha Singer; via GSV
    As the new school year starts, a wave of new laws that aim to curb distracted learning is taking effect in Indiana, Louisiana and other states.

A three-part series from Dr. Philippa Hardman:

Part 1: Writing Learning Objectives  
The Results Part 1: Writing Learning Objectives

In this week’s post I will dive into the results from task 1: writing learning objectives. Stay tuned over the next two weeks to see all of the results.

Part 2: Selecting Instructional Strategies.
The Results Part 2: Selecting an Instructional Strategy

Welcome back to our three-part series exploring the impact of AI on instructional design.

This week, we’re tackling a second task and a crucial aspect of instructional design: selecting instructional strategies. The ability to select appropriate instructional strategies to achieve intended objectives is a mission-critical skill for any instructional designer. So, can AI help us do a good job of it? Let’s find out!

Part 3: How Close is AI to Replacing Instructional Designers?
The Results Part 3: Creating a Course Outline

Today, we’re diving into what many consider to be the role-defining task of the instructional designer: creating a course design outline.


ChatGPT Cheat Sheet for Instructional Designers! — from Alexandra Choy Youatt EdD

Instructional Designers!
Whether you’re new to the field or a seasoned expert, this comprehensive guide will help you leverage AI to create more engaging and effective learning experiences.

What’s Inside?
Roles and Tasks: Tailored prompts for various instructional design roles and tasks.
Formats: Different formats to present your work, from training plans to rubrics.
Learning Models: Guidance on using the ADDIE model and various pedagogical strategies.
Engagement Tips: Techniques for online engagement and collaboration.
Specific Tips: Industry certifications, work-based learning, safety protocols, and more.

Who Can Benefit?
Corporate Trainers
Curriculum Developers
E-Learning Specialists
Instructional Technologists
Learning Experience Designers
And many more!

ChatGPT Cheat Sheet | Instructional Designer


5 AI Tools I Use Every Day (as a Busy Student) — from theaigirl.substack.com by Diana Dovgopol
AI tools that I use every day to boost my productivity.
#1 Gamma
#2 Perplexity
#3 Cockatoo

I use this AI tool almost every day as well. Since I’m still a master’s student at university, I have to attend lectures and seminars, which are always in English or German, neither of which is my native language. With the help of Cockatoo, I create transcripts of the lectures and/or translations into my language. This means I don’t have to take notes in class and then manually translate them afterward. All I need to do is record the lecture audio on any device or directly in Cockatoo, upload it, and the audio and text are ready for me. (A rough open-source sketch of this record-transcribe-translate workflow appears after the list below.)

…and more
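
Cockatoo itself is a commercial app, but as a rough open-source stand-in for the record-transcribe-translate workflow described in the Cockatoo item above, here is a sketch using OpenAI's Whisper model. The model size and file name are illustrative, and Whisper's built-in translation goes to English only, so translating into other languages would require an extra step.

```python
# Sketch: transcribe a recorded lecture and translate it to English with Whisper.
# Assumes: pip install openai-whisper (plus ffmpeg installed on the system),
# and a local recording named lecture.mp3.

import whisper

model = whisper.load_model("small")  # larger models are slower but more accurate

# Plain transcription in the lecture's original language (e.g., German)
transcript = model.transcribe("lecture.mp3")
print(transcript["text"][:500])

# Whisper can also translate the speech directly into English text
translation = model.transcribe("lecture.mp3", task="translate")
print(translation["text"][:500])
```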


Students Worry Overemphasis on AI Could Devalue Education — from insidehighered.com by Juliette Rowsell
Report stresses that AI is “new standard” and universities need to better communicate policies to learners.

Rising use of AI in higher education could cause students to question the quality and value of education they receive, a report warns.

This year’s Digital Education Council Global AI Student Survey, of more than 3,800 students from 16 countries, found that more than half (55 percent) believed overuse of AI within teaching devalued education, and 52 percent said it negatively impacted their academic performance.

Despite this, significant numbers of students admitted to using such technology. Some 86 percent said they “regularly” used programs such as ChatGPT in their studies, 54 percent said they used it on a weekly basis, and 24 percent said they used it to write a first draft of a submission.

Higher Ed Leadership Is Excited About AI – But Investment Is Lacking — from forbes.com by Vinay Bhaskara

As corporate America races to integrate AI into its core operations, higher education finds itself in a precarious position. I conducted a survey of 63 university leaders revealing that while higher ed leaders recognize AI’s transformative potential, they’re struggling to turn that recognition into action.

This struggle is familiar for higher education — gifted with the mission of educating America’s youth but plagued with a myriad of operational and financial struggles, higher ed institutions often lag behind their corporate peers in technology adoption. In recent years, this gap has become threateningly large. In an era of declining enrollments and shifting demographics, closing this gap could be key to institutional survival and success.

The survey results paint a clear picture of inconsistency: 86% of higher ed leaders see AI as a “massive opportunity,” yet only 21% believe their institutions are prepared for it. This disconnect isn’t just a minor inconsistency – it’s a strategic vulnerability in an era of declining enrollments and shifting demographics.


(Generative) AI Isn’t Going Anywhere but Up — from stefanbauschard.substack.com by Stefan Bauschard
“Hype” claims are nonsense.

There has been a lot of talk recently about an “AI Bubble.” Supposedly, the industry, or at least the generative AI subset of it, will collapse. This is known as the “Generative AI Bubble.” A bubble — a broad one or a generative one — is nonsense. These are the reasons we will continue to see massive growth in AI.


AI Readiness: Prepare Your Workforce to Embrace the Future — from learningguild.com by Danielle Wallace

Artificial Intelligence (AI) is revolutionizing industries, enhancing efficiency, and unlocking new opportunities. To thrive in this landscape, organizations need to be ready to embrace AI not just technologically but also culturally.

Learning leaders play a crucial role in preparing employees to adapt and excel in an AI-driven workplace. Transforming into an AI-empowered organization requires more than just technological adoption; it demands a shift in organizational mindset. This guide delves into how learning leaders can support this transition by fostering the right mindset attributes in employees.


Claude AI for eLearning Developers — from learningguild.com by Bill Brandon

Claude is fast, produces grammatically correct text, and outputs easy-to-read articles, emails, blog posts, summaries, and analyses. Take some time to try it out. If you worry about plagiarism and text scraping, put the results through Grammarly’s plagiarism checker (I did not use Claude for this article, but I did send the text through Grammarly).


Survey: Top Teacher Uses of AI in the Classroom — from thejournal.com by Rhea Kelly

A new report from Cambium Learning Group outlines the top ways educators are using artificial intelligence to manage their classrooms and support student learning. Conducted by Hanover Research, the 2024 K-12 Educator + AI Survey polled 482 teachers and administrators at schools and districts that are actively using AI in the classroom.

More than half of survey respondents (56%) reported that they are leveraging AI to create personalized learning experiences for students. Other uses included providing real-time performance tracking and feedback (cited by 52% of respondents), helping students with critical thinking skills (50%), proofreading writing (47%), and lesson planning (44%).

On the administrator side, top uses of AI included interpreting/analyzing student data (61%), managing student records (56%), and managing professional development (56%).


Addendum on 8/14/24:

 


ChatGPT Voice Mode Is Here: Will It Revolutionize AI Communication?


Advanced Voice Mode – FAQ — from help.openai.com
Learn more about our Advanced Voice capabilities.

Advanced Voice Mode on ChatGPT features more natural, real-time conversations that pick up on and respond with emotion and non-verbal cues.

Advanced Voice Mode on ChatGPT is currently in a limited alpha. Please note that it may make mistakes, and access and rate limits are subject to change.


From DSC:
Think about the impacts/ramifications of global, virtual, real-time language translations!!! This type of technology will create very powerful, new affordances in our learning ecosystems — as well as in business communications, with the various governments across the globe, and more!

 

 

Welcome to the Digital Writing Lab -- Supporting teachers to develop and empower digitally literate citizens.

Digital Writing Lab

About this Project

The Digital Writing Lab is a key component of the Australian national Teaching Digital Writing project, which runs from 2022-2025.

This stage of the broader project involves academic and secondary English teacher collaboration to explore how teachers are conceptualising the teaching of digital writing and what further supports they may need.

Previous stages of the project included archival research reviewing materials related to digital writing in Australia’s National Textbook Collection, and a national survey of secondary English teachers. You can find out more about the whole project via the project blog.

Who runs the project?

Project Lead Lucinda McKnight is an Associate Professor and Australian Research Council (ARC) DECRA Fellow researching how English teachers can connect the teaching of writing to contemporary media and students’ lifeworlds.

She is working with Leon Furze, who holds the doctoral scholarship attached to this project, and Chris Zomer, the project Research Fellow. The project is located in the Research for Educational Impact (REDI) centre at Deakin University, Melbourne.


Teaching Digital Writing is a research project about English today.

 

For college students—and for higher ed itself—AI is a required course — from forbes.com by Jamie Merisotis

Some of the nation’s biggest tech companies have announced efforts to reskill people to avoid job losses caused by artificial intelligence, even as they work to perfect the technology that could eliminate millions of those jobs.

It’s fair to ask, however: What should college students and prospective students, weighing their choices and possible time and financial expenses, think of this?

The news this spring was encouraging for people seeking to reinvent their careers to grab middle-class jobs and a shot at economic security.

 


Addressing Special Education Needs With Custom AI Solutions — from teachthought.com
AI can offer many opportunities to create more inclusive and effective learning experiences for students with diverse learning profiles.

For too long, students with learning disabilities have struggled to navigate a traditional education system that often fails to meet their unique needs. But what if technology could help bridge the gap, offering personalized support and unlocking the full potential of every learner?

Artificial intelligence (AI) is emerging as a powerful ally in special education, offering many opportunities to create more inclusive and effective learning experiences for students with diverse learning profiles.



 



11 Summer AI Developments Important to Educators — from stefanbauschard.substack.com by Stefan Bauschard
Equity demands that we help students prepare to thrive in an AI-World

*SearchGPT
*Smaller & on-device (phones, glasses) AI models
*AI TAs
*Access barriers decline, equity barriers grow
*Claude Artifacts and Projects
*Agents, and Agent Teams of a million+
*Humanoid robots & self-driving cars
*AI Curricular integration
*Huge video and video-segmentation gains
*Writing Detectors — The final blow
*AI Unemployment, Student AI anxiety, and forward-thinking approaches
*Alternative assessments


Academic Fracking: When Publishers Sell Scholars Work to AI — from aiedusimplified.substack.com by Lance Eaton
Further discussion of publisher practices selling scholars’ work to AI companies

Last week, I explored AI and academic publishing in response to an article that came out a few weeks ago about a deal Taylor & Francis made to sell their books to Microsoft and one other AI company (unnamed) for a boatload of money.

Since then, two more pieces have been widely shared, including this piece from Inside Higher Ed by Kathryn Palmer (for which I was interviewed and in which I am mentioned) and this piece from the Chronicle of Higher Ed by Christa Dutton. Both pieces try to cover the different sides: talking to authors, scanning the commentary online, finding some experts to consult, and talking to the publishers. It’s one of those things that can feel really important and yet probably matters only to the very small number of folks who find themselves thinking about academic publishing, scholarly communication, and generative AI.


At the Crossroads of Innovation: Embracing AI to Foster Deep Learning in the College Classroom — from er.educause.edu by Dan Sarofian-Butin
AI is here to stay. How can we, as educators, accept this change and use it to help our students learn?

The Way Forward
So now what?

In one respect, we already have a partial answer. Over the last thirty years, there has been a dramatic shift from a teaching-centered to a learning-centered education model. High-impact practices, such as service learning, undergraduate research, and living-learning communities, are common and embraced because they help students see the real-world connections of what they are learning and make learning personal.

Therefore, I believe we must double down on a learning-centered model in the age of AI.

The first step is to fully and enthusiastically embrace AI.

The second step is to find the “jagged technological frontier” of using AI in the college classroom.




Futures Thinking in Education — from gettingsmart.com by Getting Smart Staff

Key Points

  • Educators should leverage these tools to prepare for rapid changes driven by technology, climate, and social dynamics.
  • Cultivating empathy for future generations can help educators design more impactful and forward-thinking educational practices.
 

Per the Rundown AI:

Why it matters: AI is slowly shifting from a tool we text/prompt with, to an intelligence that we collaborate, learn, and grow with. Advanced Voice Mode’s ability to understand and respond to emotions in real-time convos could also have huge use cases in everything from customer service to mental health support.

Also relevant/see:


Creators to Have Personalized AI Assistants, Meta CEO Mark Zuckerberg Tells NVIDIA CEO Jensen Huang — from blogs.nvidia.com by Brian Caulfield
Zuckerberg and Huang explore the transformative potential of open source AI, the launch of AI Studio, and exchange leather jackets at SIGGRAPH 2024.

“Every single restaurant, every single website will probably, in the future, have these AIs …” Huang said.

“…just like every business has an email address and a website and a social media account, I think, in the future, every business is going to have an AI,” Zuckerberg responded.

More broadly, the advancement of AI across a broad ecosystem promises to supercharge human productivity, for example, by giving every human on earth a digital assistant — or assistants — that they can interact with quickly and fluidly, allowing people to live richer lives.

Also related/see:


From DSC:
Today was a MUCH better day for Nvidia, however (up 12.81%). But it’s been very volatile in the last several weeks — as people and institutions ask where the ROIs are going to come from.






9 compelling reasons to learn how to use AI Chatbots — from interestingengineering.com by Atharva Gosavi
AI Chatbots are conversational agents that can act on your behalf and converse with humans – a futuristic novelty that is already getting people excited about their potential to improve efficiency.

7. Accessibility and inclusivity
Chatbots can be designed to support multiple languages and accessibility needs, making services more inclusive. They can cater to users with disabilities by providing voice interaction capabilities and simplifying access to information. Understanding how to develop inclusive chatbots can help you contribute to making technology more accessible to everyone, a crucial aspect in today’s diverse society. (A minimal sketch of a multilingual chatbot appears after this list.)

8. Future-proofing your skills
AI and automation are the future of work. Knowing how to build AI chatbots is a great way to future-proof your skill set, and given the rising trajectory of AI, it’ll be an in-demand skill in the market in the years to come. Staying ahead of technological trends is a great way to ensure you remain relevant and competitive in the job market.
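
As a deliberately tiny illustration of point 7 above, here is a hedged sketch of a chatbot loop that replies in whatever language the user writes in and keeps answers screen-reader friendly. The model name and system prompt are assumptions; voice interaction would sit on top of this via separate speech-to-text and text-to-speech services.

```python
# Sketch: a minimal multilingual chatbot loop.
# Requires: pip install openai, OPENAI_API_KEY set. Model name is a placeholder.

from openai import OpenAI

client = OpenAI()

messages = [
    {
        "role": "system",
        "content": (
            "You are a helpful support assistant. Always reply in the same "
            "language the user writes in, using short, plain sentences that "
            "are easy for a screen reader to read aloud."
        ),
    }
]

print("Type a message (or 'quit' to exit).")
while True:
    user_input = input("> ")
    if user_input.strip().lower() == "quit":
        break
    messages.append({"role": "user", "content": user_input})
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    print(answer)
```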


Top 7 generative AI use cases for business — from cio.com by Grant Gross
Advanced chatbots, digital assistants, and coding helpers seem to be some of the sweet spots for gen AI use so far in business.

Many AI experts say the current use cases for generative AI are just the tip of the iceberg. More use cases will present themselves as gen AIs get more powerful and users get more creative with their experiments.

However, a handful of gen AI use cases are already bubbling up. Here’s a look at the most popular and promising.

 
© 2024 | Daniel Christian