Tech Trends 2025 — from deloitte.com by Deloitte Insights
In Deloitte’s 16th annual Tech Trends report, AI is the common thread of nearly every trend. Moving forward, it will be part of the substructure of everything we do.

We propose that the future of technology isn’t so much about more AI as it is about ubiquitous AI. We expect that, going forward, AI will become so fundamentally woven into the fabric of our lives that it’s everywhere, and so foundational that we stop noticing it.

AI will eventually follow a similar path, becoming so ubiquitous that it will be a part of the unseen substructure of everything we do, and we eventually won’t even know it’s there. It will quietly hum along in the background, optimizing traffic in our cities, personalizing our health care, and creating adaptive and accessible learning paths in education. We won’t “use” AI. We’ll just experience a world where things work smarter, faster, and more intuitively—like magic, but grounded in algorithms. We expect that it will provide a foundation for business and personal growth while also adapting and sustaining itself over time.

Nowhere is this AI-infused future more evident than in this year’s Tech Trends report, which each year explores emerging trends across the six macro forces of information technology (figure 1). Half of the trends that we’ve chronicled are elevating forces—interaction, information, and computation—that underpin innovation and growth. The other half—the grounding forces of the business of technology, cyber and trust, and core modernization—help enterprises seamlessly operate while they grow.

 

Where to start with AI agents: An introduction for COOs — from fortune.com by Ganesh Ayyar

Picture your enterprise as a living ecosystem, where surging market demand instantly informs staffing decisions, where a new vendor’s onboarding optimizes your emissions metrics, where rising customer engagement reveals product opportunities. Now imagine if your systems could see these connections too! This is the promise of AI agents — an intelligent network that thinks, learns, and works across your entire enterprise.

Today, organizations operate in artificial silos. Tomorrow, they could be fluid and responsive. The transformation has already begun. The question is: will your company lead it?

The journey to agent-enabled operations starts with clarity on business objectives. Leaders should begin by mapping their business’s critical processes. The most pressing opportunities often lie where cross-functional handoffs create friction or where high-value activities are slowed by system fragmentation. These pain points become the natural starting points for your agent deployment strategy.


Create podcasts in minutes — from elevenlabs.io by Eleven Labs
Now anyone can be a podcast producer


Top AI tools for business — from theneuron.ai


This week in AI: 3D from images, video tools, and more — from heatherbcooper.substack.com by Heather Cooper
From 3D worlds to consistent characters, explore this week’s AI trends

Another busy AI news week, so I organized it into categories:

  • Image to 3D
  • AI Video
  • AI Image Models & Tools
  • AI Assistants / LLMs
  • AI Creative Workflow: Luma AI Boards

Want to speak Italian? Microsoft AI can make it sound like you do. — this is a gifted article from The Washington Post
A new AI-powered interpreter is expected to simulate speakers’ voices in different languages during Microsoft Teams meetings.

Artificial intelligence has already proved that it can sound like a human, impersonate individuals and even produce recordings of someone speaking different languages. Now, a new feature from Microsoft will allow video meeting attendees to hear speakers “talk” in a different language with help from AI.


What Is Agentic AI?  — from blogs.nvidia.com by Erik Pounds
Agentic AI uses sophisticated reasoning and iterative planning to autonomously solve complex, multi-step problems.

The next frontier of artificial intelligence is agentic AI, which uses sophisticated reasoning and iterative planning to autonomously solve complex, multi-step problems. And it’s set to enhance productivity and operations across industries.

Agentic AI systems ingest vast amounts of data from multiple sources to independently analyze challenges, develop strategies and execute tasks like supply chain optimization, cybersecurity vulnerability analysis and helping doctors with time-consuming tasks.
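
To make that "ingest, analyze, plan, execute" loop concrete, here is a minimal sketch of an agent cycle in Python. It is illustrative only: the perceive/plan/act helpers are hypothetical stubs standing in for real data sources, a reasoning model, and external systems, not NVIDIA's implementation or any particular product.

```python
# Minimal illustrative agent loop: perceive -> plan -> act -> observe.
# Every helper below is a hypothetical stub, not a real product API.
from dataclasses import dataclass, field

@dataclass
class AgentState:
    goal: str
    observations: list = field(default_factory=list)
    done: bool = False

def perceive(state: AgentState) -> str:
    """Gather data from whatever sources the agent monitors (stubbed)."""
    return f"latest data relevant to: {state.goal}"

def plan(state: AgentState, observation: str) -> str:
    """Ask a reasoning model (stubbed) to propose the next action."""
    return f"next action, given {len(state.observations)} prior observations"

def act(action: str) -> str:
    """Execute the action against external systems (stubbed)."""
    return f"result of executing: {action}"

def run_agent(goal: str, max_steps: int = 5) -> AgentState:
    state = AgentState(goal=goal)
    for _ in range(max_steps):
        observation = perceive(state)
        action = plan(state, observation)
        result = act(action)
        state.observations.append((observation, action, result))
        # A real agent would let the model judge when the goal is met;
        # here we simply stop after three cycles.
        state.done = len(state.observations) >= 3
        if state.done:
            break
    return state

if __name__ == "__main__":
    final = run_agent("flag likely supply chain delays for next week")
    print(f"steps taken: {len(final.observations)}, done: {final.done}")
```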


 

What Students Are Saying About Teachers Using A.I. to Grade — from nytimes.com by The Learning Network; via Claire Zau
Teenagers and educators weigh in on a recent question from The Ethicist.

Is it unethical for teachers to use artificial intelligence to grade papers if they have forbidden their students from using it for their assignments?

That was the question a teacher asked Kwame Anthony Appiah in a recent edition of The Ethicist. We posed it to students to get their take on the debate, and asked them their thoughts on teachers using A.I. in general.

While our Student Opinion questions are usually reserved for teenagers, we also heard from a few educators about how they are — or aren’t — using A.I. in the classroom. We’ve included some of their answers, as well.


OpenAI wants to pair online courses with chatbots — from techcrunch.com by Kyle Wiggers; via James DeVaney on LinkedIn

If OpenAI has its way, the next online course you take might have a chatbot component.

Speaking at a fireside on Monday hosted by Coeus Collective, Siya Raj Purohit, a member of OpenAI’s go-to-market team for education, said that OpenAI might explore ways to let e-learning instructors create custom “GPTs” that tie into online curriculums.

“What I’m hoping is going to happen is that professors are going to create custom GPTs for the public and let people engage with content in a lifelong manner,” Purohit said. “It’s not part of the current work that we’re doing, but it’s definitely on the roadmap.”


15 Times to use AI, and 5 Not to — from oneusefulthing.org by Ethan Mollick
Notes on the Practical Wisdom of AI Use

There are several types of work where AI can be particularly useful, given the current capabilities and limitations of LLMs. Though this list is based in science, it draws even more from experience. Like any form of wisdom, using AI well requires holding opposing ideas in mind: it can be transformative yet must be approached with skepticism, powerful yet prone to subtle failures, essential for some tasks yet actively harmful for others. I also want to caveat that you shouldn’t take this list too seriously except as inspiration – you know your own situation best, and local knowledge matters more than any general principles. With all that out of the way, below are several types of tasks where AI can be especially useful, given current capabilities—and some scenarios where you should remain wary.


Learning About Google Learn About: What Educators Need To Know — from techlearning.com by Ray Bendici
Google’s experimental Learn About platform is designed to create an AI-guided learning experience

Google Learn About is a new experimental AI-driven platform that provides digestible and in-depth knowledge about various topics and showcases it all in an educational context. Described by Google as a “conversational learning companion,” it is essentially a Wikipedia-style chatbot/search engine, and then some.

In addition to having a variety of already-created topics and leading questions (in areas such as history, arts, culture, biology, and physics), the tool allows you to enter prompts using either text or an image. It then provides a general overview/answer and suggests additional questions, topics, and more to explore in regard to the initial subject.

The idea for student use is that the AI can help guide a deeper learning process rather than just provide static answers.


What OpenAI’s PD for Teachers Does—and Doesn’t—Do — from edweek.org by Olina Banerji
What’s the first thing that teachers dipping their toes into generative artificial intelligence should do?

They should start with the basics, according to OpenAI, the creator of ChatGPT and one of the world’s most prominent artificial intelligence research companies. Last month, the company launched an hour-long, self-paced online course for K-12 teachers about the definition, use, and harms of generative AI in the classroom. It was launched in collaboration with Common Sense Media, a national nonprofit that rates and reviews a wide range of digital content for its age appropriateness.

…the above article links to:

ChatGPT Foundations for K–12 Educators — from commonsense.org

This course introduces you to the basics of artificial intelligence, generative AI, ChatGPT, and how to use ChatGPT safely and effectively. From decoding the jargon to responsible use, this course will help you level up your understanding of AI and ChatGPT so that you can use tools like this safely and with a clear purpose.

Learning outcomes:

  • Understand what ChatGPT is and how it works.
  • Demonstrate ways to use ChatGPT to support your teaching practices.
  • Implement best practices for applying responsible AI principles in a school setting.

Takeaways From Google’s Learning in the AI Era Event — from edtechinsiders.substack.com by Sarah Morin, Alex Sarlin, and Ben Kornell
Highlights from Our Day at Google + Behind-the-Scenes Interviews Coming Soon!

  1. NotebookLM: The Start of an AI Operating System
  2. Google is Serious About AI and Learning
  3. Google’s LearnLM Now Available in AI Studio
  4. Collaboration is King
  5. If You Give a Teacher a Ferrari

Rapid Responses to AI — from the-job.beehiiv.com by Paul Fain
Top experts call for better data and more short-term training as tech transforms jobs.

AI could displace middle-skill workers and widen the wealth gap, says landmark study, which calls for better data and more investment in continuing education to help workers make career pivots.

Ensuring That AI Helps Workers
Artificial intelligence has emerged as a general purpose technology with sweeping implications for the workforce and education. While it’s impossible to precisely predict the scope and timing of looming changes to the labor market, the U.S. should build its capacity to rapidly detect and respond to AI developments.
That’s the big-ticket framing of a broad new report from the National Academies of Sciences, Engineering, and Medicine. Congress requested the study, tapping an all-star committee of experts to assess the current and future impact of AI on the workforce.

“In contemplating what the future holds, one must approach predictions with humility,” the study says…

“AI could accelerate occupational polarization,” the committee said, “by automating more nonroutine tasks and increasing the demand for elite expertise while displacing middle-skill workers.”

The Kicker: “The education and workforce ecosystem has a responsibility to be intentional with how we value humans in an AI-powered world and design jobs and systems around that,” says Hsieh.


AI Predators: What Schools Should Know and Do — from techlearning.com by Erik Ofgang
AI is increasingly being used by predators to connect with underage students online. Yasmin London, global online safety expert at Qoria and a former member of the New South Wales Police Force in Australia, shares steps educators can take to protect students.

The threat from AI for students goes well beyond cheating, says Yasmin London, global online safety expert at Qoria and a former member of the New South Wales Police Force in Australia.

Increasingly at U.S. schools and beyond, AI is being used by predators to manipulate children. Students are also using AI to generate inappropriate images of other classmates or staff members. For a recent report, Qoria, a company that specializes in child digital safety and wellbeing products, surveyed 600 schools across North America, the UK, Australia, and New Zealand.


Why We Undervalue Ideas and Overvalue Writing — from aiczar.blogspot.com by Alexander “Sasha” Sidorkin

A student submits a paper that fails to impress stylistically yet approaches a worn topic from an angle no one has tried before. The grade lands at B minus, and the student learns to be less original next time. This pattern reveals a deep bias in higher education: ideas lose to writing every time.

This bias carries serious equity implications. Students from disadvantaged backgrounds, including first-generation college students, English language learners, and those from under-resourced schools, often arrive with rich intellectual perspectives but struggle with academic writing conventions. Their ideas – shaped by unique life experiences and cultural viewpoints – get buried under red ink marking grammatical errors and awkward transitions. We systematically undervalue their intellectual contributions simply because they do not arrive in standard academic packaging.


Google Scholar’s New AI Outline Tool Explained By Its Founder — from techlearning.com by Erik Ofgang
Google Scholar PDF reader uses Gemini AI to read research papers. The AI model creates direct links to the paper’s citations and a digital outline that summarizes the different sections of the paper.

Google Scholar has entered the AI revolution. Google Scholar PDF reader now utilizes generative AI powered by Google’s Gemini AI tool to create interactive outlines of research papers and provide direct links to sources within the paper. This is designed to make reading the relevant parts of the research paper more efficient, says Anurag Acharya, who co-founded Google Scholar on November 18, 2004, twenty years ago last month.


The Four Most Powerful AI Use Cases in Instructional Design Right Now — from drphilippahardman.substack.com by Dr. Philippa Hardman
Insights from ~300 instructional designers who have taken my AI & Learning Design bootcamp this year

  1. AI-Powered Analysis: Creating Detailed Learner Personas…
  2. AI-Powered Design: Optimising Instructional Strategies…
  3. AI-Powered Development & Implementation: Quality Assurance…
  4. AI-Powered Evaluation: Predictive Impact Assessment…

How Are New AI Tools Changing ‘Learning Analytics’? — from edsurge.com by Jeffrey R. Young
For a field that has been working to learn from the data trails students leave in online systems, generative AI brings new promises — and new challenges.

In other words, with just a few simple instructions to ChatGPT, the chatbot can classify vast amounts of student work and turn it into numbers that educators can quickly analyze.

Findings from learning analytics research are also being used to help train new generative AI-powered tutoring systems.

Another big application is in assessment, says Pardos, the Berkeley professor. Specifically, new AI tools can be used to improve how educators measure and grade a student’s progress through course materials. The hope is that new AI tools will allow for replacing many multiple-choice exercises in online textbooks with fill-in-the-blank or essay questions.
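
A rough sketch of what "turning student work into numbers" can look like in practice: batch-classifying short responses with a general-purpose model and tallying the labels. This is not the researchers' actual pipeline; the rubric labels, prompt, and model name are assumptions, and the call follows the OpenAI Python SDK's chat-completions pattern.

```python
# Illustrative only: classify short student responses into rubric labels,
# then aggregate the labels into counts an instructor can scan quickly.
# The rubric, prompt, and model name are assumptions, not a published setup.
from collections import Counter
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment
LABELS = ["misconception", "partial understanding", "solid understanding"]

def classify(response_text: str) -> str:
    result = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any capable chat model would do
        messages=[
            {"role": "system",
             "content": "Classify the student response into exactly one of: "
                        + ", ".join(LABELS) + ". Reply with the label only."},
            {"role": "user", "content": response_text},
        ],
    )
    label = result.choices[0].message.content.strip().lower()
    return label if label in LABELS else "unclassified"

student_responses = [
    "Photosynthesis happens at night when plants breathe in oxygen.",
    "Plants use sunlight to turn CO2 and water into glucose and oxygen.",
]

counts = Counter(classify(r) for r in student_responses)
print(counts)  # e.g. Counter({'misconception': 1, 'solid understanding': 1})
```

As with any automated scoring, a human should spot-check the labels before they feed into grades or dashboards.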


Increasing AI Fluency Among Enterprise Employees, Senior Management & Executives — from learningguild.com by Bill Brandon

This article attempts, in these early days, to provide some specific guidelines for AI curriculum planning in enterprise organizations.

The two reports identified in the first paragraph help to answer an important question. What can enterprise L&D teams do to improve AI fluency in their organizations?

You might be surprised by how many software products have added AI features. Examples (to name a few) are productivity software (Microsoft 365 and Google Workspace); customer relationship management (Salesforce and Hubspot); human resources (Workday and Talentsoft); marketing and advertising (Adobe Marketing Cloud and Hootsuite); and communication and collaboration (Slack and Zoom). Look for more under those categories in software review sites.

 
 

The Edtech Insiders Generative AI Map — from edtechinsiders.substack.com by Ben Kornell, Alex Sarlin, Sarah Morin, and Laurence Holt
A market map and database featuring 60+ use cases for GenAI in education and 300+ GenAI powered education tools.


A Student’s Guide to Writing with ChatGPT — from openai.com

Used thoughtfully, ChatGPT can be a powerful tool to help students develop skills of rigorous thinking and clear writing, assisting them in thinking through ideas, mastering complex concepts, and getting feedback on drafts.

There are also ways to use ChatGPT that are counterproductive to learning—like generating an essay instead of writing it oneself, which deprives students of the opportunity to practice, improve their skills, and grapple with the material.

For students committed to becoming better writers and thinkers, here are some ways to use ChatGPT to engage more deeply with the learning process.


Community Colleges Are Rolling Out AI Programs—With a Boost from Big Tech — from workshift.org by Colleen Connolly

The Big Idea: As employers increasingly seek out applicants with AI skills, community colleges are well-positioned to train up the workforce. Partnerships with tech companies, like the AI Incubator Network, are helping some colleges get the resources and funding they need to overhaul programs and create new AI-focused ones.

Along these lines also see:

Practical AI Training — from the-job.beehiiv.com by Paul Fain
Community colleges get help from Big Tech to prepare students for applied AI roles at smaller companies.

Miami Dade and other two-year colleges try to be nimble by offering training for AI-related jobs while focusing on local employers. Also, Intel’s business struggles while the two-year sector wonders if Republicans will cut funds for semiconductor production.


Can One AI Agent Do Everything? How To Redesign Jobs for AI? HR Expertise And A Big Future for L&D. — from joshbersin.com by Josh Bersin

Here’s the AI summary, which is pretty good.

In this conversation, Josh Bersin discusses the evolving landscape of AI platforms, particularly focusing on Microsoft’s positioning and the challenges of creating a universal AI agent. He delves into the complexities of government efficiency, emphasizing the institutional challenges faced in re-engineering government operations.

The conversation also highlights the automation of work tasks and the need for businesses to decompose job functions for better efficiency.

Bersin stresses the importance of expertise in HR, advocating for a shift towards full stack professionals who possess a broad understanding of various HR functions.

Finally, he addresses the impending disruption in Learning and Development (L&D) due to AI advancements, predicting a significant transformation in how L&D professionals will manage knowledge and skills.


 

 

“The Value of Doing Things: What AI Agents Mean for Teachers” — from nickpotkalitsky.substack.com by guest author Jason Gulya, Professor of English and Applied Media at Berkeley College in New York City

AI Agents make me nervous. Really nervous.

I wish they didn’t.

I wish I could write that the last two years have made me more confident, more self-assured that AI is here to augment workers rather than replace them.

But I can’t.

I wish I could write that I know where schools and colleges will end up. I wish I could say that AI Agents will help us get where we need to be.

But I can’t.

At this point, today, I’m at a loss. I’m not sure where the rise of AI agents will take us, in terms of how we work and learn. I’m in the question-asking part of my journey. I have few answers.

So, let’s talk about where (I think) AI Agents will take education. And who knows? Maybe as I write I’ll come up with something more concrete.

It’s worth a shot, right?

From DSC: 
I completely agree with Jason’s following assertion:

A good portion of AI advancement will come down to employee replacement. And AI Agents push companies towards that. 

THAT’s where/what the ROI will be for corporations. They will recoup their investments in the headcount area, and likely in other areas as well (product design, marketing campaigns, engineering-related items, and more). But how much time it takes to get there is a big question mark.

One last quote here…it’s too good not to include:

Behind these questions lies a more abstract, more philosophical one: what is the relationship between thinking and doing in a world of AI Agents and other kinds of automation?


How Good are Claude, ChatGPT & Gemini at Instructional Design? — from drphilippahardman.substack.com by Dr Philippa Hardman
A test of AI’s Instruction Design skills in theory & in practice

By examining models across three AI families—Claude, ChatGPT, and Gemini—I’ve started to identify each model’s strengths, limitations, and typical pitfalls.

Spoiler: my findings underscore that until we have specialised, fine-tuned AI copilots for instructional design, we should be cautious about relying on general-purpose models and ensure expert oversight in all ID tasks.


From DSC — I’m going to (have Nick) say this again:
I simply asked my students to use AI to brainstorm their own learning objectives. No restrictions. No predetermined pathways. Just pure exploration. The results? Astonishing.

Students began mapping out research directions I’d never considered. They created dialogue spaces with AI that looked more like intellectual partnerships than simple query-response patterns. 


The Digital Literacy Quest: Become an AI Hero — from gamma.app

From DSC:
I have not gone through all of these online-based materials, but I like what they are trying to get at:

  • Confidence with AI
    Students gain practical skills and confidence in using AI tools effectively.
  • Ethical Navigation
    Learn to navigate the ethical landscape of AI with integrity and responsibility. Make informed decisions about AI usage.
  • Mastering Essential Skills
    Develop critical thinking and problem-solving skills in the context of AI.

 


Expanding access to the Gemini app for teen students in education — from workspaceupdates.googleblog.com

Google Workspace for Education admins can now turn on the Gemini app with added data protection as an additional service for their teen users (ages 13+ or the applicable age in your country) in the following languages and countries. With added data protection, chats are not reviewed by human reviewers or otherwise used to improve AI models. The Gemini app will be a core service in the coming weeks for Education Standard and Plus users, including teens.


5 Essential Questions Educators Have About AI  — from edsurge.com by Annie Ning

Recently, I spoke with several teachers regarding their primary questions and reflections on using AI in teaching and learning. Their thought-provoking responses challenge us to consider not only what AI can do but what it means for meaningful and equitable learning environments. Keeping in mind these reflections, we can better understand how we move forward toward meaningful AI integration in education.


FrontierMath: A Benchmark for Evaluating Advanced Mathematical Reasoning in AI — from epoch.ai
FrontierMath presents hundreds of unpublished, expert-level mathematics problems that specialists spend days solving. It offers an ongoing measure of AI progress on complex mathematical reasoning.

We’re introducing FrontierMath, a benchmark of hundreds of original, expert-crafted mathematics problems designed to evaluate advanced reasoning capabilities in AI systems. These problems span major branches of modern mathematics—from computational number theory to abstract algebraic geometry—and typically require hours or days for expert mathematicians to solve.


Rising demand for AI courses in UK universities shows 453% growth as students adapt to an AI-driven job market — from edtechinnovationhub.com

The demand for artificial intelligence courses in UK universities has surged dramatically over the past five years, with enrollments increasing by 453%, according to a recent study by Currys, a UK tech retailer.

The study, which analyzed UK university admissions data and surveyed current students and recent graduates, reveals how the growing influence of AI is shaping students’ educational choices and career paths.

This growth reflects the broader trend of AI integration across industries, creating new opportunities while transforming traditional roles. With AI’s influence on career prospects rising, students and graduates are increasingly drawn to AI-related courses to stay competitive in a rapidly changing job market.

 

2025 EDUCAUSE Top 10: Restoring Trust — from educause.edu

Higher education has a trust problem. In the past ten years, the share of Americans who are confident in higher education has dropped from 57 percent to 36 percent.

Colleges and universities need to show that they understand and care about students, faculty, staff, and community members, AND they need to work efficiently and effectively.

Technology leaders can help. The 2025 EDUCAUSE Top 10 describes how higher education technology and data leaders and professionals can help to restore trust in the sector by building competent and caring institutions and, through radical collaboration, leverage the fulcrum of leadership to maintain balance between the two.


 

AI-governed robots can easily be hacked — from theaivalley.com by Barsee
PLUS: Sam Altman’s new company “World” introduced…

In a groundbreaking study, researchers from Penn Engineering showed how AI-powered robots can be manipulated to ignore safety protocols, allowing them to perform harmful actions despite normally rejecting dangerous task requests.

What did they find?

  • Researchers found previously unknown security vulnerabilities in AI-governed robots and are working to address these issues to ensure the safe use of large language models (LLMs) in robotics.
  • Their newly developed algorithm, RoboPAIR, reportedly achieved a 100% jailbreak rate by bypassing the safety protocols on three different AI robotic systems in a few days.
  • Using RoboPAIR, researchers were able to manipulate test robots into performing harmful actions, like bomb detonation and blocking emergency exits, simply by changing how they phrased their commands.

Why does it matter?

This research highlights the importance of spotting weaknesses in AI systems to improve their safety, allowing us to test and train them to prevent potential harm.

From DSC:
Great! Just what we wanted to hear. But does it surprise anyone? Even so…we move forward at warp speed.


From DSC:
So, given the above item, does the next item make you a bit nervous as well? I saw someone on Twitter/X exclaim, “What could go wrong?”  I can’t say I didn’t feel the same way.

Introducing computer use, a new Claude 3.5 Sonnet, and Claude 3.5 Haiku — from anthropic.com

We’re also introducing a groundbreaking new capability in public beta: computer use. Available today on the API, developers can direct Claude to use computers the way people do—by looking at a screen, moving a cursor, clicking buttons, and typing text. Claude 3.5 Sonnet is the first frontier AI model to offer computer use in public beta. At this stage, it is still experimental—at times cumbersome and error-prone. We’re releasing computer use early for feedback from developers, and expect the capability to improve rapidly over time.
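
For developers wondering what "directing Claude to use a computer" looks like at the API level, here is a minimal request sketch. The tool type, beta flag, and model string reflect Anthropic's launch-time beta documentation and may have changed since, so treat the specifics as assumptions and check the current docs.

```python
# Minimal request sketch for Anthropic's "computer use" public beta.
# Tool type, beta flag, and model string reflect the launch-time docs
# (late 2024) and may have changed; verify against current documentation.
import anthropic

client = anthropic.Anthropic()  # expects ANTHROPIC_API_KEY in the environment

response = client.beta.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[
        {
            "type": "computer_20241022",
            "name": "computer",
            "display_width_px": 1024,
            "display_height_px": 768,
        },
    ],
    messages=[{
        "role": "user",
        "content": "Open the spreadsheet on the desktop and read cell A1.",
    }],
    betas=["computer-use-2024-10-22"],
)

# The model does not click anything itself: it returns tool_use blocks
# (e.g., "take a screenshot", "move the mouse to x,y"). Your own loop must
# execute those actions in a sandboxed environment, send the results back
# as tool_result messages, and repeat until the task is finished.
print(response.content)
```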

Per The Rundown AI:

The Rundown: Anthropic just introduced a new capability called ‘computer use’, alongside upgraded versions of its AI models, which enables Claude to interact with computers by viewing screens, typing, moving cursors, and executing commands.

Why it matters: While many hoped for Opus 3.5, Anthropic’s Sonnet and Haiku upgrades pack a serious punch. Plus, with the new computer use embedded right into its foundation models, Anthropic just sent a warning shot to tons of automation startups—even if the capabilities aren’t earth-shattering… yet.

Also related/see:

  • What is Anthropic’s AI Computer Use? — from ai-supremacy.com by Michael Spencer
    Task automation, AI at the intersection of coding and AI agents take on new frenzied importance heading into 2025 for the commercialization of Generative AI.
  • New Claude, Who Dis? — from theneurondaily.com
    Anthropic just dropped two new Claude models…oh, and Claude can now use your computer.
  • When you give a Claude a mouse — from oneusefulthing.org by Ethan Mollick
    Some quick impressions of an actual agent

Introducing Act-One — from runwayml.com
A new way to generate expressive character performances using simple video inputs.

Per Lore by Nathan Lands:

What makes Act-One special? It can capture the soul of an actor’s performance using nothing but a simple video recording. No fancy motion capture equipment, no complex face rigging, no army of animators required. Just point a camera at someone acting, and watch as their exact expressions, micro-movements, and emotional nuances get transferred to an AI-generated character.

Think about what this means for creators: you could shoot an entire movie with multiple characters using just one actor and a basic camera setup. The same performance can drive characters with completely different proportions and looks, while maintaining the authentic emotional delivery of the original performance. We’re witnessing the democratization of animation tools that used to require millions in budget and years of specialized training.



Google to buy nuclear power for AI datacentres in ‘world first’ deal — from theguardian.com
Tech company orders six or seven small nuclear reactors from California’s Kairos Power

Google has signed a “world first” deal to buy energy from a fleet of mini nuclear reactors to generate the power needed for the rise in use of artificial intelligence.

The US tech corporation has ordered six or seven small nuclear reactors (SMRs) from California’s Kairos Power, with the first due to be completed by 2030 and the remainder by 2035.



ChatGPT Topped 3 Billion Visits in September — from similarweb.com

After the extreme peak and summer slump of 2023, ChatGPT has been setting new traffic highs since May

ChatGPT has been topping its web traffic records for months now, with September 2024 traffic up 112% year-over-year (YoY) to 3.1 billion visits, according to Similarweb estimates. That’s a change from last year, when traffic to the site went through a boom-and-bust cycle.


Crazy “AI Army” — from aisecret.us

Also from aisecret.us, see World’s First Nuclear Power Deal For AI Data Centers

Google has made a historic agreement to buy energy from a group of small nuclear reactors (SMRs) from Kairos Power in California. This is the first nuclear power deal specifically for AI data centers in the world.


New updates to help creators build community, drive business, & express creativity on YouTube — from support.google.com

Hey creators!
Made on YouTube 2024 is here and we’ve announced a lot of updates that aim to give everyone the opportunity to build engaging communities, drive sustainable businesses, and express creativity on our platform.

Below is a roundup with key info – feel free to upvote the announcements that you’re most excited about and subscribe to this post to get updates on these features! We’re looking forward to another year of innovating with our global community. It’s a future full of opportunities, and it’s all Made on YouTube!


New autonomous agents scale your team like never before — from blogs.microsoft.com

Today, we’re announcing new agentic capabilities that will accelerate these gains and bring AI-first business process to every organization.

  • First, the ability to create autonomous agents with Copilot Studio will be in public preview next month.
  • Second, we’re introducing ten new autonomous agents in Dynamics 365 to build capacity for every sales, service, finance and supply chain team.

10 Daily AI Use Cases for Business Leaders— from flexos.work by Daan van Rossum
While AI is becoming more powerful by the day, business leaders still wonder why and where to apply it today. I take you through 10 critical use cases where AI should take over your work or partner with you.


Multi-Modal AI: Video Creation Simplified — from heatherbcooper.substack.com by Heather Cooper

Emerging Multi-Modal AI Video Creation Platforms
The rise of multi-modal AI platforms has revolutionized content creation, allowing users to research, write, and generate images in one app. Now, a new wave of platforms is extending these capabilities to video creation and editing.

Multi-modal video platforms combine various AI tools for tasks like writing, transcription, text-to-voice conversion, image-to-video generation, and lip-syncing. These platforms leverage open-source models like FLUX and LivePortrait, along with APIs from services such as ElevenLabs, Luma AI, and Gen-3.
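
To see how those pieces might hang together, here is a hedged, purely conceptual pipeline sketch. Every function is a hypothetical placeholder standing in for whichever service a given platform actually calls (a text model for the script, a TTS service, an image model, an image-to-video and lip-sync model); none of the names correspond to real APIs.

```python
# Conceptual sketch of a multi-modal video pipeline. Every function is a
# hypothetical placeholder for a real model or API call (script LLM, TTS,
# image generation, image-to-video, lip-sync); file outputs are stubbed.
from pathlib import Path

def write_script(topic: str) -> str:
    """Placeholder: call a text model to draft narration."""
    return f"A 60-second explainer about {topic}."

def synthesize_voice(script: str, out: Path) -> Path:
    """Placeholder: call a text-to-speech service."""
    out.write_bytes(b"")  # stub audio file
    return out

def generate_keyframe(prompt: str, out: Path) -> Path:
    """Placeholder: call an image model for the presenter/scene."""
    out.write_bytes(b"")  # stub image file
    return out

def animate(keyframe: Path, audio: Path, out: Path) -> Path:
    """Placeholder: image-to-video generation plus lip-sync."""
    out.write_bytes(b"")  # stub video file
    return out

def make_video(topic: str, workdir: Path = Path("out")) -> Path:
    workdir.mkdir(exist_ok=True)
    script = write_script(topic)
    audio = synthesize_voice(script, workdir / "voice.mp3")
    frame = generate_keyframe(f"presenter explaining {topic}", workdir / "frame.png")
    return animate(frame, audio, workdir / "clip.mp4")

if __name__ == "__main__":
    print(make_video("how multi-modal AI video tools work"))
```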


AI Medical Imagery Model Offers Fast, Cost-Efficient Expert Analysis — from developer.nvidia.com/

 

The Tutoring Revolution — from educationnext.org by Holly Korbey
More families are seeking one-on-one help for their kids. What does that tell us about 21st-century education?

Recent research suggests that the number of students seeking help with academics is growing, and that over the last couple of decades, more families have been turning to tutoring for that help.

What the Future Holds
Digital tech has made private tutoring more accessible, more efficient, and more affordable. Students whose families can’t afford to pay $75 an hour at an in-person center can now log on from home to access a variety of online tutors, including Outschool, Wyzant, and Anchorbridge, and often find someone who can cater to their specific skills and needs—someone who can offer help in French to a student with ADHD, for example. Online tutoring is less expensive than in-person programs. Khan Academy’s Khanmigo chatbot can be a student’s virtual AI tutor, no Zoom meeting required, for $4 a month, and nonprofits like Learn to Be work with homeless shelters and community centers to give virtual reading and math tutoring free to kids who can’t afford it and often might need it the most.

 

Employers Say Students Need AI Skills. What If Students Don’t Want Them? — from insidehighered.com by Ashley Mowreader
Colleges and universities are considering new ways to incorporate generative AI into teaching and learning, but not every student is on board with the tech yet. Experts weigh in on the necessity of AI in career preparation and higher education’s role in preparing students for jobs of the future.

Among the 5,025-plus survey respondents, around 2 percent (n=93) provided free responses to the question on AI policy and use in the classroom. Over half (55) of those responses were flat-out refusals to engage with AI. A few said they don’t know how to use AI or are not familiar with the tool, which impacts their ability to apply appropriate use to coursework.

But as generative AI becomes more ingrained into the workplace and higher education, a growing number of professors and industry experts believe this will be something all students need, in their classes and in their lives beyond academia.

From DSC:
I used to teach a Foundations of Information Technology class. Some of the students didn’t want to be there as they began the class, as it was a required class for non-CS majors. But after seeing what various applications and technologies could do for them, a good portion of those same folks changed their minds. But not all. Some students (2% sounds about right) asserted that they would never use technologies in their futures. Good luck with that, I thought to myself. There’s hardly a job out there that doesn’t use some sort of technology.

And I still think that today — if not more so. If students want good jobs, they will need to learn how to use AI-based tools and technologies. I’m not sure there’s much of a choice. And I don’t think there’s much of a choice for the rest of us either — whether we’re still working or not. 

So in looking at the title of the article — “Employers Say Students Need AI Skills. What If Students Don’t Want Them?” — those of us who have spent any time working within the world of business already know the answer.

#Reinvent #Skills #StayingRelevant #Surviving #Workplace + several other categories/tags apply.


For those folks who have tried AI:

Skills: However, genAI may also be helpful in building skills to retain a job or secure a new one. People who had used genAI tools were more than twice as likely to think that these tools could help them learn new skills that may be useful at work or in locating a new job. Specifically, among those who had not used genAI tools, 23 percent believed that these tools might help them learn new skills, whereas 50 percent of those who had used the tools thought they might be helpful in acquiring useful skills (a highly statistically significant difference, after controlling for demographic traits).

Source: Federal Reserve Bank of New York

 

FlexOS’ Stay Ahead Edition #43 — from flexos.work

People started discussing what they could do with NotebookLM after Google launched the audio overview, where you can listen to two hosts talking in-depth about the documents you upload. Here’s what it can do:

  • Summarization: Automatically generate summaries of uploaded documents, highlighting key topics and suggesting relevant questions.
  • Question Answering: Users can ask NotebookLM questions about their uploaded documents, and answers will be provided based on the information contained within them.
  • Idea Generation: NotebookLM can assist with brainstorming and developing new ideas.
  • Source Grounding: A big plus against AI chatbot hallucination, NotebookLM allows users to ground the responses in specific documents they choose.
  • …plus several other items

The posting also lists several ideas to try with NotebookLM such as:

Idea 2: Study Companion

  • Upload all your course materials and ask NotebookLM to turn them into Question-and-Answer format, a glossary, or a study guide.
  • Get a breakdown of the course materials to understand them better.

Google’s NotebookLM: A Game-Changer for Education and Beyond — from ai-supremacy.com by Michael Spencer and Nick Potkalitsky
AI Tools: Breaking down Google’s latest AI tool and its implications for education.

“Google’s AI note-taking app NotebookLM can now explain complex topics to you out loud”

With more immersive text-to-video and audio products soon available and the rise of apps like Suno AI, how we “experience” Generative AI is also changing from a chatbot of 2 years ago, to a more multi-modal educational journey. The AI tools on the research and curation side are also starting to reflect these advancements.


Meet Google NotebookLM: 10 things to know for educators — from ditchthattextbook.com by Matt Miller

1. Upload a variety of sources for NotebookLM to use. 
You can use …

  • websites
  • PDF files
  • links to websites
  • any text you’ve copied
  • Google Docs and Slides
  • even Markdown

You can’t link it to YouTube videos, but you can copy/paste the transcript (and maybe type a little context about the YouTube video before pasting the transcript).

2. Ask it to create resources.
3. Create an audio summary.
4. Chat with your sources.
5. Save (almost) everything. 


NotebookLM summarizes my dissertation — from darcynorman.net by D’Arcy Norman, PhD

I finally tried out Google’s newly-announced NotebookLM generative AI application. It provides a set of LLM-powered tools to summarize documents. I fed it my dissertation, and am surprised at how useful the output would be.

The most impressive tool creates a podcast episode, complete with dual hosts in conversation about the document. First – these are AI-generated hosts. Synthetic voices, speaking for synthetic hosts. And holy moly is it effective. Second – although I’d initially thought the conversational summary would be a dumb gimmick, it is surprisingly powerful.


4 Tips for Designing AI-Resistant Assessments — from techlearning.com by Steve Baule and Erin Carter
As AI continues to evolve, instructors must modify their approach by designing meaningful, rigorous assessments.

As instructors work through revising assessments to be resistant to generation by AI tools with little student input, they should consider the following principles:

  • Incorporate personal experiences and local content into assignments
  • Ask students for multi-modal deliverables
  • Assess the developmental benchmarks for assignments and transition assignments further up Bloom’s Taxonomy
  • Consider real-time and oral assignments

Google CEO Sundar Pichai announces $120M fund for global AI education — from techcrunch.com by Anthony Ha

He added that he wants to avoid a global “AI divide” and that Google is creating a $120 million Global AI Opportunity Fund through which it will “make AI education and training available in communities around the world” in partnership with local nonprofits and NGOs.


Educators discuss the state of creativity in an AI world — from gettingsmart.com by Joe & Kristin Merrill, LaKeshia Brooks, Dominique’ Harbour, Erika Sandstrom

Key Points

  • AI allows for a more personalized learning experience, enabling students to explore creative ideas without traditional classroom limitations.
  • The focus of technology integration should be on how the tool is used within lessons, not just the tool itself.

Addendum on 9/27/24:

Google’s NotebookLM enhances AI note-taking with YouTube, audio file sources, sharable audio discussions — from techcrunch.com by Jagmeet Singh

Google on Thursday announced new updates to its AI note-taking and research assistant, NotebookLM, allowing users to get summaries of YouTube videos and audio files and even create sharable AI-generated audio discussions

NotebookLM adds audio and YouTube support, plus easier sharing of Audio Overviews — from blog.google

 

10 Ways I Use LLMs like ChatGPT as a Professor — from automatedteach.com by Graham Clay
ChatGPT-4o, Gemini 1.5 Pro, Claude 3.5 Sonnet, custom GPTs – you name it, I use it. Here’s how…

Excerpt:

  1. To plan lessons (especially activities)
  2. To create course content (especially quizzes)
  3. To tutor my students
  4. To grade faster and give better feedback
  5. To draft grant applications
  6. Plus 5 other items
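
As one hedged illustration of the second item above (creating quiz content), here is what a scripted version might look like. The topic, prompt wording, and requested JSON shape are invented for this example; the call follows the OpenAI Python SDK's chat-completions pattern, and every generated item still needs instructor review.

```python
# Illustrative sketch: drafting multiple-choice quiz items with an LLM.
# Topic, prompt, and output format are invented for this example.
import json
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

prompt = (
    "Write 3 multiple-choice questions on confounding variables in "
    "observational studies. Return JSON only: a list of objects with keys "
    "'question', 'choices' (4 strings), and 'answer_index'."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any capable chat model would do
    messages=[{"role": "user", "content": prompt}],
)

raw = response.choices[0].message.content
try:
    quiz = json.loads(raw)
    for item in quiz:
        print(item["question"])
except (json.JSONDecodeError, KeyError, TypeError):
    # Models don't always return clean JSON; a real workflow would re-prompt,
    # use a structured-output feature, or fall back to manual cleanup.
    print(raw)
```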

From Caution to Calcification to Creativity: Reanimating Education with AI’s Frankenstein Potential — from nickpotkalitsky.substack.com by Nick Potkalitsky
A Critical Analysis of AI-Assisted Lesson Planning: Evaluating Efficacy and Pedagogical Implications

Excerpt (emphasis DSC):

As we navigate the rapidly evolving landscape of artificial intelligence in education, a troubling trend has emerged. What began as cautious skepticism has calcified into rigid opposition. The discourse surrounding AI in classrooms has shifted from empirical critique to categorical rejection, creating a chasm between the potential of AI and its practical implementation in education.

This hardening of attitudes comes at a significant cost. While educators and policymakers debate, students find themselves caught in the crossfire. They lack safe, guided access to AI tools that are increasingly ubiquitous in the world beyond school walls. In the absence of formal instruction, many are teaching themselves to use these tools, often in less than productive ways. Others live in a state of constant anxiety, fearing accusations of AI reliance in their work. These are just a few symptoms of an overarching educational culture that has become resistant to change, even as the world around it transforms at an unprecedented pace.

Yet, as this calcification sets in, I find myself in a curious position: the more I thoughtfully integrate AI into my teaching practice, the more I witness its potential to enhance and transform education


NotebookLM and Google’s Multimodal Vision for AI-Powered Learning Tools — from marcwatkins.substack.com by Marc Watkins

A Variety of Use Cases

  • Create an Interactive Syllabus
  • Presentation Deep Dive: Upload Your Slides
  • Note Taking: Turn Your Chalkboard into a Digital Canvas
  • Explore a Reading or Series of Readings
  • Help Navigating Feedback
  • Portfolio Building Blocks

Must-Have Competencies and Skills in Our New AI World: A Synthesis for Educational Reform — from er.educause.edu by Fawzi BenMessaoud
The transformative impact of artificial intelligence on educational systems calls for a comprehensive reform to prepare future generations for an AI-integrated world.

The urgency to integrate AI competencies into education is about preparing students not just to adapt to inevitable changes but to lead the charge in shaping an AI-augmented world. It’s about equipping them to ask the right questions, innovate responsibly, and navigate the ethical quandaries that come with such power.

AI in education should augment and complement their aptitude and expertise, to personalize and optimize the learning experience, and to support lifelong learning and development. AI in education should be a national priority and a collaborative effort among all stakeholders, to ensure that AI is designed and deployed in an ethical, equitable, and inclusive way that respects the diversity and dignity of all learners and educators and that promotes the common good and social justice. AI in education should be about the production of AI, not just the consumption of AI, meaning that learners and educators should have the opportunity to learn about AI, to participate in its creation and evaluation, and to shape its impact and direction.

 

From DSC:
Anyone who is involved in putting on conferences should at least be aware that this kind of thing is now possible!!! Check out the following posting from Adobe (with help from Tata Consultancy Services (TCS)).


From impossible to POSSIBLE: Tata Consultancy Services uses Adobe Firefly generative AI and Acrobat AI Assistant to turn hours of work into minutes — from blog.adobe.com

This year, the organizers — innovative industry event company Beyond Ordinary Events — turned to Tata Consultancy Services (TCS) to make the impossible “possible.” Leveraging Adobe generative AI technology across products like Adobe Premiere Pro and Acrobat, they distilled hours of video content in minutes, delivering timely dispatches to thousands of attendees throughout the conference.

For POSSIBLE ’24, Muche had an idea for a daily dispatch summarizing each day’s sessions so attendees wouldn’t miss a single insight. But timing would be critical. The dispatch needed to reach attendees shortly after sessions ended to fuel discussions over dinner and carry the excitement over to the next day.

The workflow started in Adobe Premiere Pro, with the writer opening a recording of each session and using the Speech to Text feature to automatically generate a transcript. They saved the transcript as a PDF file and opened it in Adobe Acrobat Pro. Then, using Adobe Acrobat AI Assistant, the writer asked for a session summary.

It was that fast and easy. In less than four minutes, one person turned a 30-minute session into an accurate, useful summary ready for review and publication.

By taking advantage of templates, the designer then added each AI-enabled summary to the newsletter in minutes. With just two people and generative AI technology, TCS accomplished the impossible — for the first time delivering an informative, polished newsletter to all 3,500 conference attendees just hours after the last session of the day.

 

AI agents are the future, and a lot is at stake — from forbes.com by Skip Sanzeri

What An Agent Is
Agents are computer programs that can autonomously perform tasks, make decisions and interact with humans or other computers. There are many different types of agents, and they are designed to achieve specific goals spanning our lives and nearly every industry, making them an integral and unstoppable part of our future.

Learning: AI agents will transform education by providing personalized learning experiences such as one-to-one tutoring. ChatGPT and other large language models (LLMs) are providing access to all digital knowledge now. An “agent” would act as a more personalized version of an LLM.

The hacking and control of an AI agent could lead to disastrous consequences, affecting privacy, security, the economy and societal stability. Proactive and comprehensive security strategies are essential to mitigate these risks in the future.

 

What Students Want: Key Results from DEC Global AI Student Survey 2024 — from digitaleducationcouncil.com by Digital Education Council

  • 86% of students globally are regularly using AI in their studies, with 54% of them using AI on a weekly basis, the recent Digital Education Council Global AI Student Survey found.
  • ChatGPT was found to be the most widely used AI tool, with 66% of students using it, and over 2 in 3 students reported using AI for information searching.
  • Despite their high rates of AI usage, 1 in 2 students do not feel AI ready. 58% reported that they do not feel that they had sufficient AI knowledge and skills, and 48% do not feel adequately prepared for an AI-enabled workplace.

Chatting with WEF about ChatGPT in the classroom — from futureofbeinghuman.com by Andrew Maynard
A short video on generative AI in education from the World Economic Forum


The Post-AI Instructional Designer — from drphilippahardman.substack.com by Dr. Philippa Hardman
How the ID role is changing, and what this means for your key skills, roles & responsibilities

Specifically, the study revealed that teachers who reported most productivity gains were those who used AI not just for creating outputs (like quizzes or worksheets) but also for seeking input on their ideas, decisions and strategies.

Those who engaged with AI as a thought partner throughout their workflow, using it to generate ideas, define problems, refine approaches, develop strategies and gain confidence in their decisions gained significantly more from their collaboration with AI than those who only delegated functional tasks to AI.  


Leveraging Generative AI for Inclusive Excellence in Higher Education — from er.educause.edu by Lorna Gonzalez, Kristi O’Neil-Gonzalez, Megan Eberhardt-Alstot, Michael McGarry and Georgia Van Tyne
Drawing from three lenses of inclusion, this article considers how to leverage generative AI as part of a constellation of mission-centered inclusive practices in higher education.

The hype and hesitation about generative artificial intelligence (AI) diffusion have led some colleges and universities to take a wait-and-see approach. However, AI integration does not need to be an either/or proposition where its use is either embraced or restricted or its adoption aimed at replacing or outright rejecting existing institutional functions and practices. Educators, educational leaders, and others considering academic applications for emerging technologies should consider ways in which generative AI can complement or augment mission-focused practices, such as those aimed at accessibility, diversity, equity, and inclusion. Drawing from three lenses of inclusion—accessibility, identity, and epistemology—this article offers practical suggestions and considerations that educators can deploy now. It also presents an imperative for higher education leaders to partner toward an infrastructure that enables inclusive practices in light of AI diffusion.

One example of how to leverage AI:

How to Leverage AI for Identity Inclusion
Educators can use the following strategies to intentionally design instructional content with identity inclusion in mind.

  • Provide a GPT or AI assistant with upcoming lesson content (e.g., lecture materials or assignment instructions) and ask it to provide feedback (e.g., troublesome vocabulary, difficult concepts, or complementary activities) from certain perspectives. Begin with a single perspective (e.g., first-time, first-year student), but layer in more to build complexity as you interact with the GPT output.
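
A hedged sketch of what that strategy could look like through an API rather than a chat window. The perspective description and lesson excerpt are invented, and the call uses the OpenAI chat-completions pattern as a generic stand-in for whatever GPT or AI assistant an instructor actually has access to.

```python
# Illustrative sketch: ask a model to review lesson content from one learner
# perspective. The perspective and lesson text are invented examples; the
# call is a generic stand-in, not a specific institution's assistant.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

lesson_excerpt = (
    "Week 3 lecture notes: eigenvalues, eigenvectors, and diagonalization, "
    "with a problem set due Friday."
)
perspective = "a first-time, first-year student who has not seen proofs before"

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any capable chat model would do
    messages=[
        {"role": "system",
         "content": "Review the lesson materials from the perspective of "
                    f"{perspective}. Flag troublesome vocabulary and difficult "
                    "concepts, and suggest one complementary activity."},
        {"role": "user", "content": lesson_excerpt},
    ],
)

print(response.choices[0].message.content)
# To layer in complexity, repeat the call with a different 'perspective'
# string and compare the feedback across perspectives.
```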

Gen AI’s next inflection point: From employee experimentation to organizational transformation — from mckinsey.com by Charlotte Relyea, Dana Maor, and Sandra Durth with Jan Bouly
As many employees adopt generative AI at work, companies struggle to follow suit. To capture value from current momentum, businesses must transform their processes, structures, and approach to talent.

To harness employees’ enthusiasm and stay ahead, companies need a holistic approach to transforming how the whole organization works with gen AI; the technology alone won’t create value.

Our research shows that early adopters prioritize talent and the human side of gen AI more than other companies (Exhibit 3). Our survey shows that nearly two-thirds of them have a clear view of their talent gaps and a strategy to close them, compared with just 25 percent of the experimenters. Early adopters focus heavily on upskilling and reskilling as a critical part of their talent strategies, as hiring alone isn’t enough to close gaps and outsourcing can hinder strategic-skills development. Finally, 40 percent of early-adopter respondents say their organizations provide extensive support to encourage employee adoption, versus 9 percent of experimenter respondents.


7 Ways to Use AI Music in Your Classroom — from classtechtips.com by Monica Burns


Change blindness — from oneusefulthing.org by Ethan Mollick
21 months later

I don’t think anyone is completely certain about where AI is going, but we do know that things have changed very quickly, as the examples in this post have hopefully demonstrated. If this rate of change continues, the world will look very different in another 21 months. The only way to know is to live through it.


My AI Breakthrough — from mgblog.org by Miguel Guhlin

Over the subsequent weeks, I’ve made other adjustments, but that first one came from asking myself:

  1. What are you doing?
  2. Why are you doing it that way?
  3. How could you change that workflow with AI?
  4. Applying the AI to the workflow, then asking, “Is this what I was aiming for? How can I improve the prompt to get closer?”
  5. Documenting what worked (or didn’t). Re-doing the work with AI to see what happened, and asking again, “Did this work?”

So, something that took me WEEKS of hard work, and in some cases I found impossible, was made easy. Like, instead of weeks, it takes 10 minutes. The hard part? Building the prompt to do what I want, fine-tuning it to get the result. But that doesn’t take as long now.

 
© 2025 | Daniel Christian