Here are some striking numbers from Mary Meeker’s AI Trends report, which show how artificial intelligence is being adopted faster than any technology the world has ever seen.
AI took only three years to reach 50% user adoption in the US; mobile internet took six years, desktop internet took 12 years, while PCs took 20 years.
ChatGPT reached 100 million users in only two months and 800 million in 17 months, compared with Netflix (10 years to reach 100 million), Instagram (2.5 years) and TikTok (nine months).
ChatGPT hit 365 billion annual searches in two years (2024) vs. Google’s 11 years (2009)—ChatGPT 5.5x faster than Google.
Above via Mary Meeker’s AI Trend-Analysis — from getsuperintel.com by Kim “Chubby” Isenberg
How AI’s rapid rise, efficiency race, and talent shifts are reshaping the future.
The TLDR
Mary Meeker’s new AI trends report highlights an explosive rise in global AI usage, surging model efficiency, and mounting pressure on infrastructure and talent. The shift is clear: AI is no longer experimental—it’s becoming foundational, and those who optimize for speed, scale, and specialization will lead the next wave of innovation.
The Rundown: Meta aims to release tools that eliminate humans from the advertising process by 2026, according to a report from the WSJ — developing an AI that can create ads for Facebook and Instagram using just a product image and budget.
The details:
Companies would submit product images and budgets, letting AI craft the text and visuals, select target audiences, and manage campaign placement.
The system will be able to create personalized ads that can adapt in real-time, like a car spot featuring mountains vs. an urban street based on user location.
The push would target smaller companies lacking dedicated marketing staff, promising professional-grade advertising without agency fees or specialized marketing skills.
Advertising is a core part of Mark Zuckerberg’s AI strategy and already accounts for 97% of Meta’s annual revenue.
Why it matters: We’re already seeing AI transform advertising through image, video, and text, but Zuck’s vision takes the process entirely out of human hands. With so much marketing flowing through FB and IG, a successful system would be a major disruptor — particularly for small brands that just want results without the hassle.
Learning and development professionals face unprecedented challenges in today’s rapidly evolving business landscape. According to LinkedIn’s 2025 Workplace Learning Report, 67 percent of L&D professionals report being “maxed out” on capacity, while 66 percent have experienced budget reductions in the past year.
Despite these constraints, 87 percent agree their organizations need to develop employees faster to keep pace with business demands. These statistics paint a clear picture of the pressure L&D teams face: do more, with less, faster.
This article explores how one L&D leader’s strategic partnership with artificial intelligence transformed these persistent challenges into opportunities, creating a responsive learning ecosystem that addresses the modern demands of rapid product evolution and diverse audience needs. With 71 percent of L&D professionals now identifying AI as a high or very high priority for their learning strategy, this case study demonstrates how AI can serve not merely as a tool but as a collaborative partner in reimagining content development and management.
How we use GenAI and AR to improve students’ design skills — from timeshighereducation.com by Antonio Juarez, Lesly Pliego and Jordi Rábago who are professors of architecture at Monterrey Institute of Technology in Mexico; Tomas Pachajoa is a professor of architecture at the El Bosque University in Colombia; & Carlos Hinrichsen and Marietta Castro are educators at San Sebastián University in Chile.
Guidance on using generative AI and augmented reality to enhance student creativity, spatial awareness and interdisciplinary collaboration
Blend traditional skills development with AI use
For subjects that require students to develop drawing and modelling skills, have students create initial design sketches or models manually to ensure they practise these skills. Then, introduce GenAI tools such as Midjourney, Leonardo AI and ChatGPT to help students explore new ideas based on their original concepts. Using AI at this stage broadens their creative horizons and introduces innovative perspectives, which are crucial in a rapidly evolving creative industry.
Provide step-by-step tutorials, including both written guides and video demonstrations, to illustrate how initial sketches can be effectively translated into AI-generated concepts. Offer example prompts to demonstrate diverse design possibilities and help students build confidence using GenAI.
Integrating generative AI and AR consistently enhanced student engagement, creativity and spatial understanding on our course.
How Texas is Preparing Higher Education for AI — from the74million.org by Kate McGee
TX colleges are thinking about how to prepare students for a changing workforce, and how to prepare an already overburdened faculty for new challenges in classrooms.
“It doesn’t matter if you enter the health industry, banking, oil and gas, or national security enterprises like we have here in San Antonio,” Eighmy told The Texas Tribune. “Everybody’s asking for competency around AI.”
It’s one of the reasons the public university, which serves 34,000 students, announced earlier this year that it is creating a new college dedicated to AI, cyber security, computing and data science. The new college, which is still in the planning phase, would be one of the first of its kind in the country. UTSA wants to launch the new college by fall 2025.
But many state higher education leaders are thinking beyond that. As AI becomes a part of everyday life in new, unpredictable ways, universities across Texas and the country are also starting to consider how to ensure faculty are keeping up with the new technology and students are ready to use it when they enter the workforce.
To develop a robust policy for generative artificial intelligence use in higher education, institutional leaders must first create “a room” where diverse perspectives are welcome and included in the process.
Q: How do you expect to see AI embraced more in the future in college and the workplace?
I do believe it’s going to become a permanent fixture for multiple reasons. I think the national security imperative associated with AI as a result of competing against other nations is going to drive a lot of energy and support for AI education. We also see shifts across every field and discipline regarding the usage of AI beyond college. We see this in a broad array of fields, including health care and the field of law. I think it’s here to stay and I think that means we’re going to see AI literacy being taught at most colleges and universities, and more faculty leveraging AI to help improve the quality of their instruction. I feel like we’re just at the beginning of a transition. In fact, I often describe our current moment as the ‘Ask Jeeves’ phase of the growth of AI. There’s a lot of change still ahead of us. AI, for better or worse, it’s here to stay.
A new study from Drexel University and Google has demonstrated that AI-generated educational podcasts can significantly enhance both student engagement and learning outcomes compared to traditional textbooks. The research, involving 180 college students across the United States, represents one of the first systematic investigations into how artificial intelligence can transform educational content delivery in real-time.
Interrogate the Process: We can ask ourselves if we built in enough checkpoints. Steps that can’t be faked. Things like quick writes, question floods, in-person feedback, revision logs.
Reframe AI: We can let students use AI as a partner. We can show them how to prompt better, revise harder, and build from it rather than submit it. Show them the difference between using a tool and being used by one.
Design Assignments for Curiosity, Not Compliance: Even the best of our assignments need to adapt. Mine needs more checkpoints, more reflective questions along the way, more explanation of why my students made the choices they did.
The response from teachers and university professors was overwhelming. In my entire career, I’ve rarely gotten so many email responses to a single article, and I have never gotten so many thoughtful and comprehensive responses.
One thing is clear: teachers are not OK.
…
In addition, universities are contracting with companies like Microsoft, Adobe, and Google for digital services, and those companies are constantly pushing their AI tools. So a student might hear “don’t use generative AI” from a prof but then log on to the university’s Microsoft suite, which then suggests using Copilot to sum up readings or help draft writing. It’s inconsistent and confusing.
I am sick to my stomach as I write this because I’ve spent 20 years developing a pedagogy that’s about wrestling with big ideas through writing and discussion, and that whole project has been evaporated by for-profit corporations who built their systems on stolen work. It’s demoralizing.
Get the 2025 Student Guide to Artificial Intelligence — from studentguidetoai.org
This guide is made available under a Creative Commons license by Elon University and the American Association of Colleges and Universities (AAC&U).
Agentic AI is taking these already huge strides even further. Rather than simply asking a question and receiving an answer, an AI agent can assess your current level of understanding and tailor a reply to help you learn. They can also help you come up with a timetable and personalized lesson plan to make you feel as though you have a one-on-one instructor walking you through the process. If your goal is to learn to speak a new language, for example, an agent might map out a plan starting with basic vocabulary and pronunciation exercises, then progress to simple conversations, grammar rules and finally, real-world listening and speaking practice.
…
For instance, if you’re an entrepreneur looking to sharpen your leadership skills, an AI agent might suggest a mix of foundational books, insightful TED Talks and case studies on high-performing executives. If you’re aiming to master data analysis, it might point you toward hands-on coding exercises, interactive tutorials and real-world datasets to practice with.
The beauty of AI-driven learning is that it’s adaptive. As you gain proficiency, your AI coach can shift its recommendations, challenge you with new concepts and even simulate real-world scenarios to deepen your understanding.
Ironically, the very technology feared by workers can also be leveraged to help them. Rather than requiring expensive external training programs or lengthy in-person workshops, AI agents can deliver personalized, on-demand learning paths tailored to each employee’s role, skill level, and career aspirations. Given that 68% of employees find today’s workplace training to be overly “one-size-fits-all,” an AI-driven approach will not only cut costs and save time but will be more effective.
This is one reason why I don’t see AI-embedded classrooms and AI-free classrooms as opposite poles. The bone of contention here is not whether we can cultivate AI-free moments in the classroom, but how long those moments are actually sustainable.
Can we sustain those AI-free moments for an hour? A class session? Longer?
…
Here’s what I think will happen. As AI becomes embedded in society at large, the sustainability of imposed AI-free learning spaces will get tested. Hard. I think it’ll become more and more difficult (though maybe not impossible) to impose AI-free learning spaces on students.
However, consensual and hybrid AI-free learning spaces will continue to have a lot of value. I can imagine classes where students opt into an AI-free space. Or they’ll even create and maintain those spaces.
Duolingo’s AI Revolution — from drphilippahardman.substack.com by Dr. Philippa Hardman
What 148 AI-Generated Courses Tell Us About the Future of Instructional Design & Human Learning
Last week, Duolingo announced an unprecedented expansion: 148 new language courses created using generative AI, effectively doubling their content library in just one year. This represents a seismic shift in how learning content is created — a process that previously took the company 12 years for their first 100 courses.
As CEO Luis von Ahn stated in the announcement, “This is a great example of how generative AI can directly benefit our learners… allowing us to scale at unprecedented speed and quality.”
In this week’s blog, I’ll dissect exactly how Duolingo has reimagined instructional design through AI, what this means for the learner experience, and most importantly, what it tells us about the future of our profession.
Medical education is experiencing a quiet revolution—one that’s not taking place in lecture theatres or textbooks, but with headsets and holograms. At the heart of this revolution are Mixed Reality (MR) AI Agents, a new generation of devices that combine the immersive depth of mixed reality with the flexibility of artificial intelligence. These technologies are not mere flashy gadgets; they’re revolutionising the way medical students interact with complicated content, rehearse clinical skills, and prepare for real-world situations. By combining digital simulations with the physical world, MR AI Agents are redefining what it means to learn medicine in the 21st century.
4 Reasons To Use Claude AI to Teach — from techlearning.com by Erik Ofgang
Features that make Claude AI appealing to educators include a focus on privacy and conversational style.
After experimenting using Claude AI on various teaching exercises, from generating quizzes to tutoring and offering writing suggestions, I found that it’s not perfect, but I think it behaves favorably compared to other AI tools in general, with an easy-to-use interface and some unique features that make it particularly suited for use in education.
The student experience in higher education is continually evolving, influenced by technological advancements, shifting student needs and expectations, evolving workforce demands, and broadening sociocultural forces. In this year’s report, we examine six critical aspects of student experiences in higher education, providing insights into how institutions can adapt to meet student needs and enhance their learning experience and preparation for the workforce:
Satisfaction with Technology-Related Services and Supports
Modality Preferences
Hybrid Learning Experiences
Generative AI in the Classroom
Workforce Preparation
Accessibility and Mental Health
DSC: Shame on higher ed for not preparing students for the workplace (see below). You’re doing your students wrong…again. Not only do you continue to heap a load of debt on their backs, but you’re also failing to get them ready for the workplace. So don’t be surprised if eventually you’re replaced by a variety of alternatives that students will flock towards.
From DSC:
After seeing Sam’s posting below, I can’t help but wonder:
How might the memory of an AI over time impact the ability to offer much more personalized learning?
How will that kind of memory positively impact a person’s learning-related profile?
Which learning-related agents get called upon?
Which learning-related preferences does a person have while learning about something new?
Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?
we have greatly improved memory in chatgpt–it can now reference all your past conversations!
this is a surprisingly great feature imo, and it points at something we are excited about: ai systems that get to know you over your life, and become extremely useful and personalized.
Starting today, memory in ChatGPT can now reference all of your past chats to provide more personalized responses, drawing on your preferences and interests to make it even more helpful for writing, getting advice, learning, and beyond.
Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.
…
What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.
…
Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance and then, guided by human professors as they sort through the material, help AI understand the structure of the discipline, and then develop lectures, videos, supporting documentation, and assessments.
…
In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured. Or the AI bot will find a similar lecture from another professor at another accredited university. If you need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student going on a trip who wishes to take an exam on the plane will be able to log on and complete the AI-designed and administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.
Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom.
From DSC: I had a very difficult time deciding which excerpts to include. There were so many more excerpts for us to think about with this solid article. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.
Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him and, as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of what higher education may look like. Scott asserts that some significant and upcoming impacts will be experienced by faculty members, doctoral students, and graduate/teaching assistants (and Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.
There are a few places where Scott and I differ.
The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned their 5-year, $100M ed push last year and one of the key conclusions was that people want to learn from — and with — other people:
To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.”
Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”
— Satya Nitta, a longtime computer researcher at
IBM’s Watson Research Center in Yorktown Heights, NY
By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping they would be able to do.
Also, I would use the term learning preferences where Scott uses the term learning styles.
Scott also mentions:
“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”
It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).
That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.
So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!
Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.
The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.
The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.
From DSC: Look out Google, Amazon, and others! Nvidia is putting the pedal to the metal in terms of being innovative and visionary! They are leaving the likes of Apple in the dust.
The top talent out there is likely to go to Nvidia for a while. Engineers, programmers/software architects, network architects, product designers, data specialists, AI researchers, developers of robotics and autonomous vehicles, R&D specialists, computer vision specialists, natural language processing experts, and many more types of positions will be flocking to Nvidia to work for a company that has already changed the world and will likely continue to do so for years to come.
NVIDIA just shook the AI and Robotic world at NVIDIA GTC 2025.
CEO Jensen Huang announced jaw-dropping breakthroughs.
Here are the top 11 key highlights you can’t afford to miss: (wait till you see no 3)
For enterprises, NVIDIA unveiled DGX Spark and DGX Station—Jensen’s vision of AI-era computing, bringing NVIDIA’s powerful Blackwell chip directly to your desk.
Nvidia Bets Big on Synthetic Data — from wired.com by Lauren Goode
Nvidia has acquired synthetic data startup Gretel to bolster the AI training data used by the chip maker’s customers and developers.
Nvidia, xAI to Join BlackRock and Microsoft’s $30 Billion AI Infrastructure Fund — from investopedia.com by Aaron McDade
Nvidia and xAI are joining BlackRock and Microsoft in an AI infrastructure group seeking $30 billion in funding. The group was first announced in September as BlackRock and Microsoft sought to fund new data centers to power AI products.
AI Super Bowl. Hi everyone. This week, 20,000 engineers, scientists, industry executives, and yours truly descended upon San Jose, Calif. for Nvidia’s annual GTC developers’ conference, which has been dubbed the “Super Bowl of AI.”
Imagine with me for a moment: Training is no longer confined to scheduled sessions in a classroom, an online module or even a microlearning you click to activate during your workflow. Imagine training being delivered because the system senses what you are doing and provides instructions and job aids without you having to take an action.
The rapid evolution of artificial intelligence (AI) and wearable technology has made it easier than ever to seamlessly integrate learning directly into the workflow. Smart glasses, earpieces, and other advanced devices are redefining how employees gain knowledge and skills by delivering microlearning moments precisely when and where they are needed.
AI plays a crucial role in this transformation by sensing the optimal moment to deliver the training through augmented reality (AR).
Kennelly and Geraffo are part of a small team at their school in Denver, DSST: College View High School, that is participating in the School Teams AI Collaborative, a year-long pilot initiative in which more than 80 educators from 19 traditional public and charter schools across the country are experimenting with and evaluating AI-enabled instruction to improve teaching and learning.
The goal is for some of AI’s earliest adopters in education to band together, share ideas and eventually help lead the way on what they and their colleagues around the U.S. could do with the emerging technology.
“Pretty early on we thought it was going to be a massive failure,” says Kennelly of last semester’s project. “But it became a huge hit. Students loved it. They were like, ‘I ran to second period to build this thing.’”
As writing instructors, we have a choice in how we frame AI for our students. I invite you to:
Experiment with AI as a conversation partner yourself before introducing it to students
Design assignments that leverage AI’s strengths as a thought partner rather than trying to “AI-proof” your existing assignments
Explicitly teach students how to engage in productive dialogue with AI—how to ask good questions, challenge AI’s assumptions, and use it to refine rather than replace their thinking
Share your experiences, both positive and negative, with colleagues to build our collective understanding of effective AI integration
Nvidia has unveiled a new AI platform for teaching people how to use American Sign Language to help bridge communication gaps.
The Signs platform is creating a validated dataset for sign language learners and developers of ASL-based AI applications.
…
Nvidia, the American Society for Deaf Children and creative agency Hello Monday are helping close this gap with Signs, an interactive web platform built to support ASL learning and the development of accessible AI applications.
Generative AI can significantly reduce the time and effort required in designing PBL by providing tools for research, brainstorming, and organization.
AI tools can assist educators in managing project implementation and assessment, providing formative feedback and organizing resources efficiently.
I usually conclude blogs with some pithy words, but this time I’ll turn the microphone over to Rachel Harcrow, a high school English/Language Arts teacher at Young Women’s College Prep Charter School of Rochester, NY: “After years of struggling to call myself a PBL practitioner, I finally feel comfortable saying I am, thanks to the power of Gen AI,” Harcrow told me. “Initial ideas now turn into fully fledged high-quality project plans in minutes that I can refine, giving me the space and energy to focus on what truly matters: My students.”
AI Resources for District Leaders — from techlearning.com by Steve Baule
Educational leaders aiming to effectively integrate generative AI into their schools should consider several key resources
To truly harness the transformative power of generative AI in education, district leaders must navigate a landscape rich with resources and opportunities. By delving into state and national guidelines, exploring successful case studies, utilizing innovative planning tools, and engaging in professional development, educational leaders can craft robust implementation plans. These plans can then assist in integrating AI seamlessly into their schools and elevate the learning experience to new heights.
Anthropic, a favorite frontier AI lab among many coders and genAI power users has unveiled Claude 3.7 Sonnet, its first “hybrid reasoning” AI model. It is capable of both near-instant answers and in-depth, step-by-step reasoning within a single system.
Users can toggle an extended thinking mode in which the model self-reflects before answering, considerably improving performance on complex tasks like math, physics and coding. In early testing by the author, the model largely succeeded in generating Python programs (related to unsupervised learning) that were close to 1,000 lines long and ran without error on the first or second try, including the unsupervised machine learning task shown below:
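The author’s original 1,000-line program is not reproduced here. As a stand-in, here is a minimal sketch of the kind of unsupervised learning task being described (a simple k-means clustering of synthetic data); the dataset, library choices, and parameters are illustrative assumptions, not the code the model actually generated.

```python
# Minimal illustrative sketch of an unsupervised learning task (k-means clustering).
# NOT the author's generated code; the data and parameters are assumptions.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Create synthetic, unlabeled data with three latent groups.
X, _ = make_blobs(n_samples=300, centers=3, cluster_std=1.2, random_state=42)

# Fit k-means: the algorithm discovers structure without any labels.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
labels = kmeans.fit_predict(X)

# Check how well separated the discovered clusters are.
print("Cluster centers:\n", kmeans.cluster_centers_)
print(f"Silhouette score: {silhouette_score(X, labels):.3f}")
```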
AI won’t kill education. But will it kill learning? The challenge isn’t AI itself—it’s whether students can still think for themselves when the answers are always one click away.
…
Wait. Before you go, let me ask you one thing. AI has opportunities to help learning. But it also won’t fix it. The real question isn’t whether students can use AI—but whether they’re still learning without it.
Whether the learning is happening between the ears.
And so much of what we teach in schools isn’t the answers on a test. It answers questions like “What is my purpose in life?” “How do I make friends?” and “How can I help my team be stronger?” Questions that aren’t asked on a test but are essential to living a good life. These questions aren’t answered between the ears but within the heart.
That, my friends, is what teaching has always been about.
The heart.
And the heart of the matter is we have new challenges, but these are old complaints. Complaints since the beginning of time and teaching. And in those days, kids didn’t just need to be able to talk about how to build a fire; they had to make one themselves. Their lives depended on it.
And these days, we need to build another kind of fire. A fire that sparks the joy of learning. The joy of the opportunities that await us, sparked by some of the most powerful tools ever invented. Kids need to do more than just talk about making a difference; they need to know how to build a better world tomorrow. Our lives depend on it.
Debating skills have a range of benefits in the workplace, from helping to improve our communication to bolstering our critical thinking skills. Research from the University of Mississippi suggests it might also help us in the battle with AI in the workplace.
We can often assume that debate teaches us nothing more than how to argue our point, but in order to do this, we have to understand both our own take on a subject and that of our opponent. This allows us to see both sides of any issue we happen to be debating.
“Even though AI has offered a shortcut through the writing process, it actually still is important to be able to write and speak and think on your own,” the researchers explain. “That’s what the focus of this research is: how debate engenders those aspects of being able to write and speak and study and research on your own.”
The most revolutionary aspect of DeepSeek for education isn’t just its cost—it’s the combination of open-source accessibility and local deployment capabilities. As Azeem Azhar notes, “R-1 is open-source. Anyone can download and run it on their own hardware. I have R1-8b (the second smallest model) running on my Mac Mini at home.”
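To make the local-deployment point concrete, here is a hedged sketch of querying an open-weight model running entirely on your own machine. It assumes a local Ollama server with a distilled DeepSeek R1 model already pulled (for example, via `ollama pull deepseek-r1:8b`); the model tag, prompt, and endpoint details are assumptions for illustration, not details from the article.

```python
# Hedged sketch: query a locally running open-weight model through Ollama's
# local REST API. Assumes Ollama is installed and a DeepSeek R1 distill has
# been pulled beforehand; the model name and prompt are illustrative assumptions.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1:8b",
        "prompt": "Explain photosynthesis to a ninth grader in three sentences.",
        "stream": False,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```

Because everything runs locally, no student data leaves the machine, which is part of why local deployment matters for classrooms.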
…
Real-time Learning Enhancement
AI tutoring networks that collaborate to optimize individual learning paths
Immediate, multi-perspective feedback on student work
Continuous assessment and curriculum adaptation
The question isn’t whether this technology will transform education—it’s how quickly institutions can adapt to a world where advanced AI capabilities are finally within reach of every classroom.
I know through your feedback on my social media and blog posts that several of you have legitimate concerns about the impact of AI in education, especially those related to data privacy, academic dishonesty, AI dependence, loss of creativity and critical thinking, and plagiarism, to name a few. While these concerns are valid and deserve careful consideration, it’s also important to explore the potential benefits AI can bring when used thoughtfully.
Tools such as ChatGPT and Claude are like smart research assistants that are available 24/7 to support you with all kinds of tasks from drafting detailed lesson plans, creating differentiated materials, generating classroom activities, to summarizing and simplifying complex topics. Likewise, students can use them to enhance their learning by, for instance, brainstorming ideas for research projects, generating constructive feedback on assignments, practicing problem-solving in a guided way, and much more.
The point here is that AI is here to stay and expand, and we better learn how to use it thoughtfully and responsibly rather than avoid it out of fear or skepticism.
As part of our updates to the Edtech Insiders Generative AI Map, we’re excited to release a new mini market map and article deep dive on Generative AI tools that are specifically designed for Instructional Materials use cases.
In our database, the Instructional Materials use case category encompasses tools that:
Assist educators by streamlining lesson planning, curriculum development, and content customization
Enable educators or students to transform materials into alternative formats, such as videos, podcasts, or other interactive media, in addition to leveraging gaming principles or immersive VR to enhance engagement
Empower educators or students to transform text, video, slides or other source material into study aids like study guides, flashcards, practice tests, or graphic organizers
Engage students through interactive lessons featuring historical figures, authors, or fictional characters
Customize curriculum to individual needs or pedagogical approaches
Empower educators or students to quickly create online learning assets and courses
That’s why, today, the question I’m asking is: How best can we proactively guide AI’s use in higher education and shape its impact on our students, faculty and institution? The answer to that broad, strategic question lies in pursuing four objectives that, I believe, are relevant for many colleges and universities.
Learning to use business software is different from learning to think. But if the software is sufficiently complex, how different is it really? What if AI’s primary impact on education isn’t in the classroom, but rather shifting the locus of learning to outside the classroom?
Instead of sitting in a classroom listening to a teacher, high school and college students could be assigned real work and learn from that work. Students could be matched with employers or specific projects provided by or derived from employers, then do the work on the same software used in the enterprise. As AI-powered digital adoption platforms (DAPs) become increasingly powerful, they have the potential to transform real or simulated work into educational best practice for students only a few years away from seeking full-time employment.
If DAPs take us in this direction, four implications come to mind….
In this week’s blog post, I share a summary of five recent studies on the impact of Gen AI on learning to bring you right up to date.
… Implications for Educators and Developers
For Educators:
Combine ChatGPT with Structured Activities: …
Use ChatGPT as a Supplement, Not a Replacement:…
Promote Self-Reflection and Evaluation:
For Developers:
Reimagine AI for Reflection-First Design: …
Develop Tools that Foster Critical Thinking: …
Integrate Adaptive Support: …
Assessing the GenAI process, not the output — from timeshighereducation.com by Paul McDermott, Leoni Palmer, and Rosemary Norton
A framework for building AI literacy in a literature-review-type assessment
In this resource, we outline our advice for implementing an approach that opens AI use up to our students through a strategy of assessing the process rather than outputs.
To start with, we recommend identifying learning outcomes for your students that can be achieved in collaboration with AI.
What’s New: The Updated Edtech Insiders Generative AI Map — from edtechinsiders.substack.com by Sarah Morin, Alex Sarlin, and Ben Kornell
A major expansion on our previously released market map, use case database, and AI tool company directory.
Tutorial: 4 Ways to Use LearnLM as a Professor — from automatedteach.com by Graham Clay
Create better assessments, improve instructions and feedback, and tutor your students with this fine-tuned version of Gemini.
I cover how to use LearnLM
to create sophisticated assessments that promote learning
to develop clearer and more effective assignment instructions
to provide more constructive feedback on student work, and
to support student learning through guided tutoring
With that out of the way, I prefer Claude.ai for writing. For larger projects like a book, create a Claude Project to keep all context in one place.
Copy [the following] prompts into a document
Use them in sequence as you write
Adjust the word counts and specifics as needed
Keep your responses for reference
Use the same prompt template for similar sections to maintain consistency
Each prompt builds on the previous one, creating a systematic approach to helping you write your book.
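As an illustration of that kind of reusable template, here is one hedged example of a section prompt you might keep in your document and fill in for each new section; the wording, placeholders, and word counts are my assumptions, not the article’s actual prompts.

```python
# Hedged example of a reusable section-prompt template for a book-writing
# workflow like the one described above. Wording and placeholders are
# illustrative assumptions, not the article's actual prompts.
SECTION_PROMPT = """You are helping me draft my book, "{book_title}".
Context so far: {summary_of_previous_sections}

Write the section "{section_title}" in roughly {word_count} words.
Match the tone of earlier sections, use concrete examples, and end with a
one-paragraph transition into the next section, "{next_section_title}"."""

# Fill in the placeholders for each section to keep the structure consistent.
print(SECTION_PROMPT.format(
    book_title="Working Title",
    summary_of_previous_sections="Chapters 1-2 introduced the core argument...",
    section_title="Why Checklists Fail",
    word_count=800,
    next_section_title="Designing Better Defaults",
))
```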
Using NotebookLM to Boost College Reading Comprehension — from michellekassorla.substack.com by Michelle Kassorla and Eugenia Novokshanova
This semester, we are using NotebookLM to help our students comprehend and engage with scholarly texts
We were looking hard for a new tool when Google released NotebookLM. Not only does Google allow unfettered use of this amazing tool, it is also a much better tool for the work we require in our courses. So, this semester, we have scrapped our “old” tools and added NotebookLM as the primary tool for our English Composition II courses (and we hope, fervently, that Google won’t decide to severely limit its free tier before this semester ends!)
If you know next-to-nothing about NotebookLM, that’s OK. What follows is the specific lesson we present to our students. We hope this will help you understand all you need to know about NotebookLM, and how to successfully integrate the tool into your own teaching this semester.
After two years of working closely with leadership in multiple institutions, and delivering countless workshops, I’ve seen one thing repeatedly: the biggest challenge isn’t the technology itself, but how we lead through it. Here is some of my best advice to help you navigate generative AI with clarity and confidence:
Break your own AI policies before you implement them. …
Fund your failures. …
Resist the pilot program. …
Host Anti-Tech Tech Talks …
…+ several more tips
While generative AI in higher education obviously involves new technology, it’s much more about adopting a curious and human-centric approach in your institution and communities. It’s about empowering learners in new, human-oriented and innovative ways. It is, in a nutshell, about people adapting to new ways of doing things.
Maria Anderson responded to Clay’s posting with this idea:
Here’s an idea: […] the teacher can use the [most advanced] AI tool to generate a complete solution to “the problem” — whatever that is — and demonstrate how to do that in class. Give all the students access to the document with the results.
And then grade the students on a comprehensive followup activity / presentation of executing that solution (no notes, no more than 10 words on a slide). So the students all have access to the same deep AI result, but have to show they comprehend and can iterate on that result.
In this age of distrust, misinformation, and skepticism, you may wonder how to demonstrate your sources within a Google Document. Did you type it yourself, copy and paste it from a browser-based source, copy and paste it from an unknown source, or did it come from generative AI?
You may not think this is an important clarification, but if writing is a critical part of your livelihood or life, you will definitely want to demonstrate your sources.
That’s where the new Grammarly feature comes in.
The new feature is called Authorship, and according to Grammarly, “Grammarly Authorship is a set of features that helps users demonstrate their sources of text in a Google doc. When you activate Authorship within Google Docs, it proactively tracks the writing process as you write.”
AI Agents Are Coming to Higher Education — from govtech.com
AI agents are customizable tools with more decision-making power than chatbots. They have the potential to automate more tasks, and some schools have implemented them for administrative and educational purposes.
Custom GPTs are on the rise in education. Google’s version, Gemini Gems, includes a premade version called Learning Coach, and Microsoft announced last week a new agent addition to Copilot featuring use cases at educational institutions.
Generative Artificial Intelligence and Education: A Brief Ethical Reflection on Autonomy — from er.educause.edu by Vicki Strunk and James Willis
Given the widespread impacts of generative AI, looking at this technology through the lens of autonomy can help equip students for the workplaces of the present and of the future, while ensuring academic integrity for both students and instructors.
The principle of autonomy stresses that we should be free agents who can govern ourselves and who are able to make our own choices. This principle applies to AI in higher education because it raises serious questions about how, when, and whether AI should be used in varying contexts. Although we have only begun asking questions related to autonomy and many more remain to be asked, we hope that this serves as a starting place to consider the uses of AI in higher education.
NVIDIA’s Apple moment?! — from theneurondaily.com by Noah Edelman and Grant Harvey
PLUS: How to level up your AI workflows for 2025…
NVIDIA wants to put an AI supercomputer on your desk (and it only costs $3,000). … And last night at CES 2025, Jensen Huang announced phase two of this plan: Project DIGITS, a $3K personal AI supercomputer that runs 200B parameter models from your desk. Guess we now know why Apple recently developed an NVIDIA allergy…
… But NVIDIA doesn’t just want its “Apple PC moment”… it also wants its OpenAI moment. NVIDIA also announced Cosmos, a platform for building physical AI (think: robots and self-driving cars)—which Jensen Huang calls “the ChatGPT moment for robotics.”
NVIDIA is bringing AI from the cloud to personal devices and enterprises, covering all computing needs from developers to ordinary users.
At CES 2025, which opened this morning, NVIDIA founder and CEO Jensen Huang delivered a milestone keynote speech, revealing the future of AI and computing. From the core token concept of generative AI to the launch of the new Blackwell architecture GPU, and the AI-driven digital future, this speech will profoundly impact the entire industry from a cross-disciplinary perspective.
From DSC: I’m posting this next item (involving Samsung) as it relates to how TVs continue to change within our living rooms. AI is finding its way into our TVs…the ramifications of this remain to be seen.
The Rundown: Samsung revealed its new “AI for All” tagline at CES 2025, introducing a comprehensive suite of new AI features and products across its entire ecosystem — including new AI-powered TVs, appliances, PCs, and more.
The details:
Vision AI brings features like real-time translation, the ability to adapt to user preferences, AI upscaling, and instant content summaries to Samsung TVs.
Several of Samsung’s new Smart TVs will have Microsoft Copilot built in, and the company also teased a potential AI partnership with Google.
Samsung also announced the new line of Galaxy Book5 AI PCs, with new capabilities like AI-powered search and photo editing.
AI is also being infused into Samsung’s laundry appliances, art frames, home security equipment, and other devices within its SmartThings ecosystem.
Why it matters: Samsung’s web of products are getting the AI treatment — and we’re about to be surrounded by AI-infused appliances in every aspect of our lives. The edge will be the ability to sync it all together under one central hub, which could position Samsung as the go-to for the inevitable transition from smart to AI-powered homes.
***
“Samsung sees TVs not as one-directional devices for passive consumption but as interactive, intelligent partners that adapt to your needs,” said SW Yong, President and Head of Visual Display Business at Samsung Electronics. “With Samsung Vision AI, we’re reimagining what screens can do, connecting entertainment, personalization, and lifestyle solutions into one seamless experience to simplify your life.” — from Samsung
The following framework I offer for defining, understanding, and preparing for agentic AI blends foundational work in computer science with insights from cognitive psychology and speculative philosophy. Each of the seven levels represents a step-change in technology, capability, and autonomy. The framework expresses increasing opportunities to innovate, thrive, and transform in a data-fueled and AI-driven digital economy.
The Rise of AI Agents and Data-Driven Decisions — from devprojournal.com by Mike Monocello
Fueled by generative AI and machine learning advancements, we’re witnessing a paradigm shift in how businesses operate and make decisions.
AI Agents Enhance Generative AI’s Impact Burley Kawasaki, Global VP of Product Marketing and Strategy at Creatio, predicts a significant leap forward in generative AI. “In 2025, AI agents will take generative AI to the next level by moving beyond content creation to active participation in daily business operations,” he says. “These agents, capable of partial or full autonomy, will handle tasks like scheduling, lead qualification, and customer follow-ups, seamlessly integrating into workflows. Rather than replacing generative AI, they will enhance its utility by transforming insights into immediate, actionable outcomes.”
Everyone’s talking about the potential of AI agents in 2025 (and don’t get me wrong, it’s really significant), but there’s a crucial detail that keeps getting overlooked: the gap between current capabilities and practical reliability.
Here’s the reality check that most predictions miss: AI agents currently operate at about 80% accuracy (according to Microsoft’s AI CEO). Sounds impressive, right? But here’s the thing – for businesses and users to actually trust these systems with meaningful tasks, we need 99% reliability. That’s not just a 19% gap – it’s the difference between an interesting tech demo and a business-critical tool.
This matters because it completely changes how we should think about AI agents in 2025. While major players like Microsoft, Google, and Amazon are pouring billions into development, they’re all facing the same fundamental challenge – making them work reliably enough that you can actually trust them with your business processes.
Think about it this way: Would you trust an assistant who gets things wrong 20% of the time? Probably not. But would you trust one who makes a mistake only 1% of the time, especially if they could handle repetitive tasks across your entire workflow? That’s a completely different conversation.
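A quick back-of-the-envelope illustration of why that gap matters (my own arithmetic, not from the article): per-step accuracy compounds across a multi-step workflow, so the distance between 80% and 99% grows with every additional action an agent takes.

```python
# Illustrative arithmetic (not from the article): how per-step accuracy
# compounds over a multi-step agent workflow.
for per_step_accuracy in (0.80, 0.99):
    for steps in (1, 5, 10):
        p_success = per_step_accuracy ** steps
        print(f"accuracy={per_step_accuracy:.2f}, steps={steps:>2}: "
              f"whole-workflow success = {p_success:.1%}")
```

Under this simple model, a ten-step workflow completes correctly only about 11% of the time at 80% per-step accuracy, versus roughly 90% of the time at 99%.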
In the tech world, we like to label periods as the year of (insert milestone here). This past year (2024) was a year of broader experimentation in AI and, of course, agentic use cases.
As 2025 opens, VentureBeat spoke to industry analysts and IT decision-makers to see what the year might bring. For many, 2025 will be the year of agents, when all the pilot programs, experiments and new AI use cases converge into something resembling a return on investment.
In addition, the experts VentureBeat spoke to see 2025 as the year AI orchestration will play a bigger role in the enterprise. Organizations plan to make management of AI applications and agents much more straightforward.
Here are some themes we expect to see more in 2025.
AI agents take charge
Jérémy Grandillon, CEO of TC9 – AI Allbound Agency, said “Today, AI can do a lot, but we don’t trust it to take actions on our behalf. This will change in 2025. Be ready to ask your AI assistant to book a Uber ride for you.” Start small with one agent handling one task. Build up to an army.
“If 2024 was agents everywhere, then 2025 will be about bringing those agents together in networks and systems,” said Nicholas Holland, vice president of AI at Hubspot. “Micro agents working together to accomplish larger bodies of work, and marketplaces where humans can ‘hire’ agents to work alongside them in hybrid teams. Before long, we’ll be saying, ‘there’s an agent for that.'”
… Voice becomes default
Stop typing and start talking. Adam Biddlecombe, head of brand at Mindstream, predicts a shift in how we interact with AI. “2025 will be the year that people start talking with AI,” he said. “The majority of people interact with ChatGPT and other tools in the text format, and a lot of emphasis is put on prompting skills.”
Biddlecombe believes, “With Apple’s ChatGPT integration for Siri, millions of people will start talking to ChatGPT. This will make AI so much more accessible and people will start to use it for very simple queries.”
Get ready for the next wave of advancements in AI. AGI arrives early, AI agents take charge, and voice becomes the norm. Video creation gets easy, AI embeds everywhere, and one-person billion-dollar companies emerge.
To better understand the types of roles that AI is impacting, ZoomInfo’s research team looked to its proprietary database of professional contacts for answers. The platform, which detects more than 1.5 million personnel changes per day, revealed a dramatic increase in AI-related job titles since 2022. With a 200% increase in two years, the data paints a vivid picture of how AI technology is reshaping the workforce.
Why does this shift in AI titles matter for every industry?
Ever since a revolutionary new version of ChatGPT became operable in late 2022, educators have faced several complex challenges as they learn how to navigate artificial intelligence systems.
…
Education Week produced a significant amount of coverage in 2024 exploring these and other critical questions involving the understanding and use of AI.
Here are the five most popular stories that Education Week published in 2024 about AI in schools.
Dr. Lodge said there are five key areas the higher education sector needs to address to adapt to the use of AI:
1. Teach ‘people’ skills as well as tech skills
2. Help all students use new tech
3. Prepare students for the jobs of the future
4. Learn to make sense of complex information
5. Universities to lead the tech change