DC: I’m not necessarily recommending this, but the next two items point out how the use of agents continues to move forward:

The Future is Here: Visa Announces New Era of Commerce Featuring AI

  • Global leader brings its trusted brand and powerful network to enable payments with new technologies
  • Launches new innovations and partnerships to drive flexibility, security and acceptance

SAN FRANCISCO–(BUSINESS WIRE)–The future of commerce is on display at the Visa Global Product Drop with powerful AI-enabled advancements allowing consumers to find and buy with AI plus the introduction of new strategic partnerships and product innovations.

Also related/see:

Find and Buy with AI: Visa Unveils New Era of Commerce — from businesswire.com

  • Collaborates with Anthropic, IBM, Microsoft, Mistral AI, OpenAI, Perplexity, Samsung, Stripe and more
  • Will make shopping experiences more personal, more secure and more convenient as they become powered by AI

Introduced [on April 30th] at the Visa Global Product Drop, Visa Intelligent Commerce enables AI to find and buy. It is a groundbreaking new initiative that opens Visa’s payment network to the developers and engineers building the foundational AI agents transforming commerce.


AI agents are the new buyers. How can you market to them? — from aiwithallie.beehiiv.com by Allie Miller
You’re optimizing for people. But the next buyers are bots.

In today’s newsletter, I’m unpacking why your next major buyers won’t be people at all. They’ll be AI agents, and your brand might already be invisible to them. We’ll dig into why traditional marketing strategies are breaking down in the age of autonomous AI shoppers, what “AI optimization” (AIO) really means, and the practical steps you can take right now to make sure your business stays visible and competitive as the new digital gatekeepers take over more digital tasks.

AI platforms and AI agents—the digital assistants that browse and actually do things powered by models like GPT-4o, Claude 3.7 Sonnet, and Gemini 2.5 Pro—are increasingly becoming the gatekeepers between your business and potential customers.

“AI is the new front door to your business for millions of consumers.”

The 40-Point (ish) AI Agent Marketing Playbook 
Here’s the longer list. I went ahead and broke these into four categories so you can more easily assign owners: Content, Structure & Design, Technical & Dev, and AI Strategy & Testing. I look forward to seeing how this space, and by extension my advice, changes in the coming months.
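One concrete tactic in the "Technical & Dev" bucket is publishing machine-readable product data that agents can parse reliably. A minimal sketch of that idea, assuming a schema.org JSON-LD block is what an agent would look for (the product details below are invented, and this is not taken from Miller's playbook):

```python
import json

# Sketch: emit schema.org JSON-LD so AI agents can read structured product data.
# The product details are invented for illustration.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Course: Intro to Data Analysis",
    "description": "Self-paced online course with graded projects.",
    "offers": {
        "@type": "Offer",
        "price": "199.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# A site would embed this tag in the page <head> alongside the human-readable copy.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```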


Microsoft CEO says up to 30% of the company’s code was written by AI — from techcrunch.com by Maxwell Zeff

During a fireside chat with Meta CEO Mark Zuckerberg at Meta’s LlamaCon conference on Tuesday, Microsoft CEO Satya Nadella said that 20% to 30% of code inside the company’s repositories was “written by software” — meaning AI.


The Top 100 Gen AI Consumer Apps — from a16z.com

In just six months, the consumer AI landscape has been redrawn. Some products surged, others stalled, and a few unexpected players rewrote the leaderboard overnight. Deepseek rocketed from obscurity to a leading ChatGPT challenger. AI video models advanced from experimental to fairly dependable (at least for short clips!). And so-called “vibe coding” is changing who can create with AI, not just who can use it. The competition is tighter, the stakes are higher, and the winners aren’t just launching, they’re sticking.

We turned to the data to answer: Which AI apps are people actively using? What’s actually making money, beyond being popular? And which tools are moving beyond curiosity-driven dabbling to become daily staples?

This is the fourth installment of the Top 100 Gen AI Consumer Apps, our bi-annual ranking of the top 50 AI-first web products (by unique monthly visits, per Similarweb) and top 50 AI-first mobile apps (by monthly active users, per Sensor Tower). Since our last report in August 2024, 17 new companies have entered the rankings of top AI-first web products.


Deep Research with AI: 9 Ways to Get Started — from wondertools.substack.com by Jeremy Caplan
Practical strategies for thorough, citation-rich AI research

The AI search landscape is transforming at breakneck speed. New “Deep Research” tools from ChatGPT, Gemini and Perplexity autonomously search and gather information from dozens — even hundreds — of sites, then analyze and synthesize it to produce comprehensive reports. While a human might take days or weeks to produce these 30-page citation-backed reports, AI Deep Research reports are ready in minutes.

What’s in this post

    • Examples of each report type I generated for my research, so you can form your own impressions.
    • Tips on why & how to use Deep Research and how to craft effective queries.
    • Comparison of key features and strengths/limitations of the top platforms

AI Agents Are Here—So Are the Threats: Unit 42 Unveils the Top 10 AI Agent Security Risks — from marktechpost.com

As AI agents transition from experimental systems to production-scale applications, their growing autonomy introduces novel security challenges. In a comprehensive new report, “AI Agents Are Here. So Are the Threats,” Palo Alto Networks’ Unit 42 reveals how today’s agentic architectures—despite their innovation—are vulnerable to a wide range of attacks, most of which stem not from the frameworks themselves, but from the way agents are designed, deployed, and connected to external tools.

To evaluate the breadth of these risks, Unit 42 researchers constructed two functionally identical AI agents—one built using CrewAI and the other with AutoGen. Despite architectural differences, both systems exhibited the same vulnerabilities, confirming that the underlying issues are not framework-specific. Instead, the threats arise from misconfigurations, insecure prompt design, and insufficiently hardened tool integrations—issues that transcend implementation choices.
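To make the “insufficiently hardened tool integrations” point concrete, here is a small illustration (my own sketch, not code from the Unit 42 report, and not specific to CrewAI or AutoGen) contrasting a tool that passes agent-supplied text straight to a shell with one that validates against an allowlist:

```python
import shlex
import subprocess

def unsafe_run(agent_supplied: str) -> str:
    # Vulnerable: whatever the agent produces (possibly via prompt injection)
    # is handed to a shell verbatim.
    return subprocess.run(agent_supplied, shell=True,
                          capture_output=True, text=True).stdout

ALLOWED_COMMANDS = {"ls", "cat", "grep"}

def hardened_run(agent_supplied: str) -> str:
    # Hardened: parse the command, enforce an allowlist, and avoid the shell.
    parts = shlex.split(agent_supplied)
    if not parts or parts[0] not in ALLOWED_COMMANDS:
        raise PermissionError(f"command not allowed: {agent_supplied!r}")
    return subprocess.run(parts, capture_output=True, text=True).stdout
```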


LLMs Can Learn Complex Math from Just One Example: Researchers from University of Washington, Microsoft, and USC Unlock the Power of 1-Shot Reinforcement Learning with Verifiable Reward — from marktechpost.com by Sana Hassan


 

 

What does ‘age appropriate’ AI literacy look like in higher education? — from timeshighereducation.com by Fun Siong Lim
As AI literacy becomes an essential work skill, universities need to move beyond developing these competencies at ‘primary school’ level in their students. Here, Fun Siong Lim reflects on frameworks to support higher-order AI literacies

Like platforms developed at other universities, Project NALA offers a front-end interface (known as the builder) for faculty to create their own learning assistant. An idea we have is to open the builder up to students to allow them to create their own GenAI assistant as part of our AI literacy curriculum. As they design, configure and test their own assistant, they will learn firsthand how generative AI works. They get to test performance-enhancement approaches beyond prompt engineering, such as grounding the learning assistant with curated materials (retrieval-augmented generation) and advanced ideas such as incorporating knowledge graphs.

They should have the opportunity to analyse, evaluate and create responsible AI solutions. Offering students the opportunity to build their own AI assistants could be a way forward to develop these much-needed skills.
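The grounding step Lim describes (retrieval-augmented generation over curated materials) can be sketched in a few lines. This is an assumption about how a student-built assistant might work, not Project NALA's implementation; a real builder would swap the toy keyword retriever for embeddings and a vector store:

```python
import re

# Toy RAG sketch: retrieve the most relevant curated passage, then build a
# grounded prompt for the assistant's language model.
CURATED_MATERIALS = [
    "Week 1 notes: supervised learning maps labeled inputs to outputs.",
    "Week 2 notes: overfitting is reduced by regularization and more data.",
]

def tokenize(text: str) -> set[str]:
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question: str, k: int = 1) -> list[str]:
    # Rank passages by simple word overlap with the question.
    q_words = tokenize(question)
    ranked = sorted(CURATED_MATERIALS,
                    key=lambda p: len(q_words & tokenize(p)),
                    reverse=True)
    return ranked[:k]

def build_prompt(question: str) -> str:
    context = "\n".join(retrieve(question))
    return ("Answer using only the course material below.\n"
            f"Material:\n{context}\n\n"
            f"Question: {question}")

print(build_prompt("How do I reduce overfitting?"))
# The resulting grounded prompt would then be sent to the underlying model.
```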


How to Use ChatGPT 4o’s Update to Turn Key Insights Into Clear Infographics (Prompts Included) — from evakeiffenheim.substack.com by Eva Keiffenheim
This 3-step workflow helps you break down books, reports, or slide-decks into professional visuals that accelerate understanding.

This article shows you how to find core ideas, prompt GPT-4o for a design brief, and generate clean, professional images that stick. These aren’t vague “creative visuals”—they’re structured for learning, memory, and action.

If you’re a lifelong learner, educator, creator, or just someone who wants to work smarter, this process is for you.

You’ll spend less time re-reading and more time understanding. And maybe—just maybe—you’ll build ideas that not only click in your brain, but also stick in someone else’s.
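The middle step of that workflow, turning a handful of key insights into a design brief for the image model, is mostly careful prompt assembly. A rough sketch under my own assumptions (this is not Keiffenheim's template, and generate_image() is a hypothetical placeholder for whichever image tool you use):

```python
# Sketch: compose a design-brief prompt from extracted key insights.
def design_brief(insights: list[str], audience: str) -> str:
    bullets = "\n".join(f"- {idea}" for idea in insights)
    return (
        "Create a single clean infographic.\n"
        f"Audience: {audience}\n"
        "Key ideas to visualize, in this order:\n"
        f"{bullets}\n"
        "Constraints: large readable labels, one accent color, minimal clutter."
    )

brief = design_brief(
    ["Spaced repetition beats cramming",
     "Retrieval practice strengthens memory"],
    audience="first-year university students",
)
print(brief)
# generate_image(brief)  # hypothetical call to your image model of choice
```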


SchoolAI Secures $25 Million to Help Teachers and Schools Reach Every Student — from globenewswire.com
 The Classroom Experience platform gives every teacher and student their own AI tools for personalized learning

SchoolAI’s Classroom Experience platform combines AI assistants for teachers that help with classroom preparation and other administrative work, and Spaces–personalized AI tutors, games, and lessons that can adapt to each student’s unique learning style and interests. Together, these tools give teachers actionable insights into how students are doing, and how the teacher can deliver targeted support when it matters most.

“Teachers and schools are navigating hard challenges with shrinking budgets, teacher shortages, growing class sizes, and ongoing recovery from pandemic-related learning gaps,” said Caleb Hicks, founder and CEO of SchoolAI. “It’s harder than ever to understand how every student is really doing. Teachers deserve powerful tools to help extend their impact, not add to their workload. This funding helps us double down on connecting the dots for teachers and students, and later this year, bringing school administrators and parents at home onto the platform as well.”


AI in Education, Part 3: Looking Ahead – The Future of AI in Learning — from rdene915.com by Dr. Rachelle Dené Poth

In the first and second parts of my AI series, I focused on where we see AI in classrooms. Benefits range from personalized learning and accessibility tools to AI-driven grading and teaching-assistant support. In Part 2, I chose to focus on some of the important considerations related to ethics that must be part of the conversation. Schools need to focus on data privacy, bias, overreliance, and the equity divide. For this last part in the current AI series, I wanted to focus on the future. Where do we go from here?


Anthropic Education Report: How University Students Use Claude — from anthropic.com

The key findings from our Education Report are:

  • STEM students are early adopters of AI tools like Claude, with Computer Science students particularly overrepresented (accounting for 36.8% of students’ conversations while comprising only 5.4% of U.S. degrees). In contrast, Business, Health, and Humanities students show lower adoption rates relative to their enrollment numbers.
  • We identified four patterns by which students interact with AI, each of which were present in our data at approximately equal rates (each 23-29% of conversations): Direct Problem Solving, Direct Output Creation, Collaborative Problem Solving, and Collaborative Output Creation.
  • Students primarily use AI systems for creating (using information to learn something new) and analyzing (taking apart the known and identifying relationships), such as creating coding projects or analyzing law concepts. This aligns with higher-order cognitive functions on Bloom’s Taxonomy. This raises questions about ensuring students don’t offload critical cognitive tasks to AI systems.

From the Kuali Days 2025 Conference: A CEO’s View of Planning for AI — from campustechnology.com by Mary Grush
A Conversation with Joel Dehlin

How can a company serving higher education navigate the changes AI brings to the ed tech marketplace? What will customers expect in this dynamic? Here, CT talks with Kuali CEO Joel Dehlin, who shared his company’s AI strategies in a featured plenary session, “Sneak Peek of AI in Kuali Build,” at Kuali Days 2025 in Anaheim.


How students can use generative AI — from aliciabankhofer.substack.com by Alicia Bankhofer
Part 4 of 4 in my series on Teaching and Learning in the AI Age

This article is the culmination of a series exploring AI’s impact on education.

Part 1: What Educators Need outlined essential AI literacy skills for teachers, emphasizing the need to move beyond basic ChatGPT exploration to understand the full spectrum of AI tools available in education.

Part 2: What Students Need addressed how students require clear guidance to use AI safely, ethically, and responsibly, with emphasis on developing critical thinking skills alongside AI literacy.

Part 3: How Educators Can Use GenAI presented ten practical use cases for teachers, from creating differentiated resources to designing assessments, demonstrating how AI can reclaim 5-7 hours weekly for meaningful student interactions.

Part 4: How Students Can Use GenAI (this article) provides frameworks for guiding student AI use based on Joscha Falck’s dimensions: learning about, with, through, despite, and without AI.


Mapping a Multidimensional Framework for GenAI in Education — from er.educause.edu by Patricia Turner
Prompting careful dialogue through incisive questions can help chart a course through the ongoing storm of artificial intelligence.

The goal of this framework is to help faculty, educational developers, instructional designers, administrators, and others in higher education engage in productive discussions about the use of GenAI in teaching and learning. As others have noted, theoretical frameworks will need to be accompanied by research and teaching practice, each reinforcing and reshaping the others to create understandings that will inform the development of approaches to GenAI that are both ethical and maximally beneficial, while mitigating potential harms to those who engage with it.


Instructional Design Isn’t Dying — It’s Specialising — from drphilippahardman.substack.com by Dr. Philippa Hardman
Aka, how AI is impacting role & purpose of Instructional Design

Together, these developments have revealed something important: despite widespread anxiety, the instructional design role isn’t dying—it’s specialising.

What we’re witnessing isn’t the automation of instructional design and the death of the instructional designer, but rather the evolution of the ID role into multiple distinct professional pathways.

The generalist “full stack” instructional designer is slowly but decisively fracturing into specialised roles that reflect both the capabilities of generative AI and the strategic imperatives facing modern organisations.

In this week’s blog post, I’ll share what I’ve learned about how our field is transforming, and what it likely means for you and your career path.

Those instructional designers who cling to traditional generalist models risk being replaced, but those who embrace specialisation, data fluency, and AI collaboration will excel and lead the next evolution of the field. Similarly, those businesses that continue to view L&D as a cost centre and focus on automating content delivery will be outperformed, while those that invest in building agile, AI-enabled learning ecosystems will drive measurable performance gains and secure their competitive advantage.


Adding AI to Every Step in Your eLearning Design Workflow — from learningguild.com by George Hanshaw

We know that eLearning is a staple of training and development. The expectations of the learners are higher than ever: They expect a dynamic, interactive, and personalized learning experience. As instructional designers, we are tasked with meeting these expectations by creating engaging and effective learning solutions.

The integration of Artificial Intelligence (AI) into our eLearning design process is a game-changer that can significantly enhance the quality and efficiency of our work.

No matter if you use ADDIE or rapid prototyping, AI has a fit in every aspect of your workflow. By integrating AI, you can ensure a more efficient and effective design process that adapts to the unique needs of your learners. This not only saves time and resources but also significantly enhances the overall learning experience. We will explore the needs analysis and the general design process.

 

From DSC:
After seeing Sam’s posting below, I can’t help but wonder:

  • How might the memory of an AI over time impact the ability to offer much more personalized learning?
  • How will that kind of memory positively impact a person’s learning-related profile?
  • Which learning-related agents get called upon?
  • Which learning-related preferences does a person have while learning about something new?
  • Which methods have worked best in the past for that individual? Which methods didn’t work so well with him or her?



 

Reflections on “Are You Ready for the AI University? Everything is about to change.” [Latham]

Are You Ready for the AI University? Everything is about to change. — from chronicle.com by Scott Latham

Over the course of the next 10 years, AI-powered institutions will rise in the rankings. US News & World Report will factor a college’s AI capabilities into its calculations. Accrediting agencies will assess the degree of AI integration into pedagogy, research, and student life. Corporations will want to partner with universities that have demonstrated AI prowess. In short, we will see the emergence of the AI haves and have-nots.

What’s happening in higher education today has a name: creative destruction. The economist Joseph Schumpeter coined the term in 1942 to describe how innovation can transform industries. That typically happens when an industry has both a dysfunctional cost structure and a declining value proposition. Both are true of higher education.

Out of the gate, professors will work with technologists to get AI up to speed on specific disciplines and pedagogy. For example, AI could be “fed” course material on Greek history or finance and then, guided by human professors as they sort through the material, come to understand the structure of the discipline and then develop lectures, videos, supporting documentation, and assessments.

In the near future, if a student misses class, they will be able to watch a recording that an AI bot captured, or the AI bot will find a similar lecture from another professor at another accredited university. If students need tutoring, an AI bot will be ready to help any time, day or night. Similarly, a student going on a trip who wishes to take an exam on the plane will be able to log on and complete the AI-designed and administered exam. Students will no longer be bound by a rigid class schedule. Instead, they will set the schedule that works for them.

Early and mid-career professors who hope to survive will need to adapt and learn how to work with AI. They will need to immerse themselves in research on AI and pedagogy and understand its effect on the classroom. 

From DSC:
I had a very difficult time deciding which excerpts to include. There were so many more excerpts for us to think about with this solid article. While I don’t agree with several things in it, EVERY professor, president, dean, and administrator working within higher education today needs to read this article and seriously consider what Scott Latham is saying.

Change is already here, but according to Scott, we haven’t seen anything yet. I agree with him and, as a futurist, one has to consider the potential scenarios that Scott lays out for AI’s creative destruction of what higher education may look like. Scott asserts that some significant and upcoming impacts will be experienced by faculty members, doctoral students, and graduate/teaching assistants (and Teaching & Learning Centers and IT Departments, I would add). But he doesn’t stop there. He brings in presidents, deans, and other members of the leadership teams out there.

There are a few places where Scott and I differ.

  • The foremost one is the importance of the human element — i.e., the human faculty member and students’ learning preferences. I think many (most?) students and lifelong learners will want to learn from a human being. IBM abandoned their 5-year, $100M ed push last year and one of the key conclusions was that people want to learn from — and with — other people:

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

— Satya Nitta, a longtime computer researcher at IBM’s Watson Research Center in Yorktown Heights, NY

By the way, it isn’t easy for me to write this, as I wanted AI and other related technologies to be able to do just what IBM was hoping it would be able to do.

  • Also, I would use the term learning preferences where Scott uses the term learning styles.

Scott also mentions:

“In addition, faculty members will need to become technologists as much as scholars. They will need to train AI in how to help them build lectures, assessments, and fine-tune their classroom materials. Further training will be needed when AI first delivers a course.”

It has been my experience from working with faculty members for over 20 years that not all faculty members want to become technologists. They may not have the time, interest, and/or aptitude to become one (and vice versa for technologists who likely won’t become faculty members).

That all said, Scott relays many things that I have reflected upon and relayed for years now via this Learning Ecosystems blog and also via The Learning from the Living [AI-Based Class] Room vision — the use of AI to offer personalized and job-relevant learning, the rising costs of higher education, the development of new learning-related offerings and credentials at far less expensive prices, the need to provide new business models and emerging technologies that are devoted more to lifelong learning, plus several other things.

So this article is definitely worth your time to read, especially if you are working in higher education or are considering a career therein!


Addendum later on 4/10/25:

U-M’s Ross School of Business, Google Public Sector launch virtual teaching assistant pilot program — from news.umich.edu by Jeff Karoub; via Paul Fain

Google Public Sector and the University of Michigan’s Ross School of Business have launched an advanced Virtual Teaching Assistant pilot program aimed at improving personalized learning and enlightening educators on artificial intelligence in the classroom.

The AI technology, aided by Google’s Gemini chatbot, provides students with all-hours access to support and self-directed learning. The Virtual TA represents the next generation of educational chatbots, serving as a sophisticated AI learning assistant that instructors can use to modify their specific lessons and teaching styles.

The Virtual TA facilitates self-paced learning for students, provides on-demand explanations of complex course concepts, guides them through problem-solving, and acts as a practice partner. It’s designed to foster critical thinking by never giving away answers, ensuring students actively work toward solutions.
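The “never giving away answers” behavior is largely a matter of the instructions wrapped around the underlying model. A hedged sketch of what such an instruction could look like (my own wording, not the actual Virtual TA configuration; ask_model() is a hypothetical placeholder for the Gemini call the pilot relies on):

```python
# Sketch: a Socratic-tutor instruction that guides rather than answers.
SOCRATIC_SYSTEM_PROMPT = """\
You are a course teaching assistant. Never state the final answer.
Ask one probing question at a time, point to the relevant concept, and ask
the student to attempt the next step before you continue.
If the student is stuck twice in a row, offer a small hint, not the solution."""

def build_messages(student_question: str) -> list[dict]:
    return [
        {"role": "system", "content": SOCRATIC_SYSTEM_PROMPT},
        {"role": "user", "content": student_question},
    ]

messages = build_messages("Why is my NPV calculation coming out negative?")
# reply = ask_model(messages)  # hypothetical call to the underlying model
print(messages[0]["content"])
```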

 

The 2025 AI Index Report — from Stanford University’s Human-Centered Artificial Intelligence Lab (hai.stanford.edu); item via The Neuron

Top Takeaways

  1. AI performance on demanding benchmarks continues to improve.
  2. AI is increasingly embedded in everyday life.
  3. Business is all in on AI, fueling record investment and usage, as research continues to show strong productivity impacts.
  4. The U.S. still leads in producing top AI models—but China is closing the performance gap.
  5. The responsible AI ecosystem evolves—unevenly.
  6. Global AI optimism is rising—but deep regional divides remain.
  7. …and several more

Also see:

The Neuron’s take on this:

So, what should you do? You really need to start trying out these AI tools. They’re getting cheaper and better, and they can genuinely help save time or make work easier—ignoring them is like ignoring smartphones ten years ago.

Just keep two big things in mind:

  1. Making the next super-smart AI costs a crazy amount of money and uses tons of power (seriously, they’re buying nuclear plants and pushing coal again!).
  2. Companies are still figuring out how to make AI perfectly safe and fair—cause it still makes mistakes.

So, use the tools, find what helps you, but don’t trust them completely.

We’re building this plane mid-flight, and Stanford’s report card is just another confirmation that we desperately need better safety checks before we hit major turbulence.


Addendum on 4/16:

 

The 2025 ABA Techshow Startup Alley Pitch Competition Ended In A Tie – Here Are The Winners — from lawnext.com by Bob Ambrogi

This year, two startups ended up with an equal number of votes for the top spot:

  • Case Crafter, a company from Norway that helps legal professionals build compelling visual timelines based on case files and evidence.
  • Querious, a product that provides attorneys with real-time insights during client conversations into legal issues, relevant content, and suggested questions and follow-ups.


AI academy gives law students a head start on legal tech, says OBA innovator — from canadianlawyermag.com by Branislav Urosevic

The Ontario Bar Association has recently launched a hands-on AI learning platform tailored for lawyers. Called the AI Academy, the initiative is designed to help legal professionals explore, experiment with, and adopt AI tools relevant to their practice.

Colin Lachance, OBA’s innovator-in-residence and the lead designer of the platform, says that although the AI Academy was built for practising lawyers, it is also well-suited for law students.


 

Uplimit raises stakes in corporate learning with suite of AI agents that can train thousands of employees simultaneously — from venturebeat.com by Michael Nuñez

Uplimit unveiled a suite of AI-powered learning agents today designed to help companies rapidly upskill employees while dramatically reducing administrative burdens traditionally associated with corporate training.

The San Francisco-based company announced three sets of purpose-built AI agents that promise to change how enterprises approach learning and development: skill-building agents, program management agents, and teaching assistant agents. The technology aims to address the growing skills gap as AI advances faster than most workforces can adapt.

“There is an unprecedented need for continuous learning—at a scale and speed traditional systems were never built to handle,” said Julia Stiglitz, CEO and co-founder of Uplimit, in an interview with VentureBeat. “The companies best positioned to thrive aren’t choosing between AI and their people—they’re investing in both.”


Introducing Claude for Education — from anthropic.com

Today we’re launching Claude for Education, a specialized version of Claude tailored for higher education institutions. This initiative equips universities to develop and implement AI-enabled approaches across teaching, learning, and administration—ensuring educators and students play a key role in actively shaping AI’s role in society.

As part of announcing Claude for Education, we’re introducing:

  1. Learning mode: A new Claude experience that guides students’ reasoning process rather than providing answers, helping develop critical thinking skills
  2. University-wide Claude availability: Full campus access agreements with Northeastern University, London School of Economics and Political Science (LSE), and Champlain College, making Claude available to all students
  3. Academic partnerships: Joining Internet2 and working with Instructure to embed AI into teaching & learning with Canvas LMS
  4. Student programs: A new Claude Campus Ambassadors program along with an initiative offering API credits for student projects

A comment on this from The Rundown AI:

Why it matters: Education continues to grapple with AI, but Anthropic is flipping the script by making the tech a partner in developing critical thinking rather than an answer engine. While the controversy over its use likely isn’t going away, this generation of students will have access to the most personalized, high-quality learning tools ever.


Should College Graduates Be AI Literate? — from chronicle.com by Beth McMurtrie (behind a paywall)
More institutions are saying yes. Persuading professors is only the first barrier they face.

Last fall one of Jacqueline Fajardo’s students came to her office, eager to tell her about an AI tool that was helping him learn general chemistry. Had she heard of Google NotebookLM? He had been using it for half a semester in her honors course. He confidently showed her how he could type in the learning outcomes she posted for each class and the tool would produce explanations and study guides. It even created a podcast based on an academic paper he had uploaded. He did not feel it was important to take detailed notes in class because the AI tool was able to summarize the key points of her lectures.


Showing Up for the Future: Why Educators Can’t Sit Out the AI Conversation — from marcwatkins.substack.com with a guest post from Lew Ludwig

The Risk of Disengagement
Let’s be honest: most of us aren’t jumping headfirst into AI. At many of our institutions, it’s not a gold rush—it’s a quiet standoff. But the group I worry most about isn’t the early adopters. It’s the faculty who’ve decided to opt out altogether.

That choice often comes from a place of care. Concerns about data privacy, climate impact, exploitative labor, and the ethics of using large language models are real—and important. But choosing not to engage at all, even on ethical grounds, doesn’t remove us from the system. It just removes our voices from the conversation.

And without those voices, we risk letting others—those with very different priorities—make the decisions that shape what AI looks like in our classrooms, on our campuses, and in our broader culture of learning.



Turbocharge Your Professional Development with AI — from learningguild.com by Dr. RK Prasad

You’ve just mastered a few new eLearning authoring tools, and now AI is knocking on the door, offering to do your job faster, smarter, and without needing coffee breaks. Should you be worried? Or excited?

If you’re a Learning and Development (L&D) professional today, AI is more than just a buzzword—it’s transforming the way we design, deliver, and measure corporate training. But here’s the good news: AI isn’t here to replace you. It’s here to make you better at what you do.

The challenge is to harness its potential to build digital-ready talent, not just within your organization but within yourself.

Let’s explore how AI is reshaping L&D strategies and how you can leverage it for professional development.


5 Recent AI Notables — from automatedteach.com by Graham Clay

1. OpenAI’s New Image Generator
What Happened: OpenAI integrated a much more powerful image generator directly into GPT-4o, making it the default image creator in ChatGPT. Unlike previous image models, this one excels at accurately rendering text in images, precise visualization of diagrams/charts, and multi-turn image refinement through conversation.

Why It’s Big: For educators, this represents a significant advancement in creating educational visuals, infographics, diagrams, and other instructional materials with unprecedented accuracy and control. It’s not perfect, but you can now quickly generate custom illustrations that accurately display mathematical equations, chemical formulas, or process workflows — previously a significant hurdle in digital content creation — without requiring graphic design expertise or expensive software. This capability dramatically reduces the time between conceptualizing a visual aid and implementing it in course materials.


The 4 AI modes that will supercharge your workflow — from aiwithallie.beehiiv.com by Allie K. Miller
The framework most people and companies won’t discover until 2026


 






 

8 Weeks Left to Prepare Students for the AI-Enhanced Workplace — from insidehighered.com by Ray Schroeder
We are down to the final weeks left to fully prepare students for entry into the AI-enhanced workplace. Are your students ready?

The urgent task facing those of us who teach and advise students, whether they be degree program or certificate seeking, is to ensure that they are prepared to enter (or re-enter) the workplace with skills and knowledge that are relevant to 2025 and beyond. One of the first skills to cultivate is an understanding of what kinds of services this emerging technology can provide to enhance the worker’s productivity and value to the institution or corporation.

Given that short period of time, coupled with the need to cover the scheduled information in the syllabus, I recommend that we consider merging AI use into authentic assignments and assessments, supplementary modules, and other resources to prepare for AI.


Learning Design in the Era of Agentic AI — from drphilippahardman.substack.com by Dr Philippa Hardman
Aka, how to design online async learning experiences that learners can’t afford to delegate to AI agents

The point I put forward was that the problem is not AI’s ability to complete online async courses, but that online async courses deliver so little value to our learners that they delegate their completion to AI.

The harsh reality is that this is not an AI problem — it is a learning design problem.

However, this realisation presents us with an opportunity which, on the whole, we seem keen to embrace. Rather than seeking out ways to block AI agents, we seem largely to agree that we should use this as a moment to reimagine online async learning itself.



8 Schools Innovating With Google AI — Here’s What They’re Doing — from forbes.com by Dan Fitzpatrick

While fears of AI replacing educators swirl in the public consciousness, a cohort of pioneering institutions is demonstrating a far more nuanced reality. These eight universities and schools aren’t just experimenting with AI, they’re fundamentally reshaping their educational ecosystems. From personalized learning in K-12 to advanced research in higher education, these institutions are leveraging Google’s AI to empower students, enhance teaching, and streamline operations.


Essential AI tools for better work — from wondertools.substack.com by Jeremy Caplan
My favorite tactics for making the most of AI — a podcast conversation

AI tools I consistently rely on (areas covered mentioned below)

  • Research and analysis
  • Communication efficiency
  • Multimedia creation

AI tactics that work surprisingly well 

1. Reverse interviews
Instead of just querying AI, have it interview you. “Get the AI to interview you, rather than interviewing it. Give it a little context about what you’re focusing on and what you’re interested in, and then ask it to interview you to elicit your own insights.”

This approach helps extract knowledge from yourself, not just from the AI. Sometimes we need that guide to pull ideas out of ourselves.
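A reverse interview is mostly a role-flipping instruction. A small sketch (my own wording, not Caplan's exact prompt; ask_model() is again a hypothetical placeholder):

```python
# Sketch: build a prompt that asks the AI to interview you instead of answering.
def reverse_interview_prompt(topic: str, goal: str) -> str:
    return (
        f"I am working on: {topic}. My goal is: {goal}.\n"
        "Do not give advice yet. Interview me instead: ask one question at a "
        "time to draw out my own insights, wait for my answer, then ask the "
        "next question. After five questions, summarize what you heard."
    )

prompt = reverse_interview_prompt(
    topic="a newsletter about AI tools for educators",
    goal="decide which three sections to keep",
)
print(prompt)
# ask_model(prompt)  # hypothetical placeholder
```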

 

AI Can’t Fix Bad Learning — from nafez.substack.com by Nafez Dakkak
Why pedagogy and good learning design still come first, and why faster isn’t always better.

I’ve followed Dr. Philippa Hardman’s work for years, and every time I engage with her work, I find it both refreshing and deeply grounded.

As one of the leading voices in learning design, Philippa has been able to cut through the noise and focus on what truly matters: designing learning experiences that actually work.

In an era where AI promises speed and scale, Philippa is making a different argument: faster isn’t always better. As the creator of Epiphany AI—a Figma for learning designers—Philippa is focused on closing the gap between what great learning design should look like and what’s actually being delivered.

While many AI tools optimize for the average, she believes the future belongs to those who can leverage AI without compromising on expertise or quality. Philippa wants learning designers to be more ambitious using AI to achieve what wasn’t possible before.

In this conversation, we explore why pedagogy must lead technology, how the return on expertise is only increasing in an AI-driven world, and why building faster doesn’t always mean building better.

An excerpted graphic:




Pearson, AWS Collaborate to Enhance AI-Powered Learning Functionality — from cloudwars.com

Pearson, the global educational publisher, and AWS have expanded their existing partnership to enhance AI-driven learning. AWS will help Pearson to deliver AI-powered lesson generation and more for educators, support workforce skilling initiatives, and continue an ongoing collaboration with Pearson VUE for AWS certification.


 

From DSC:
Look out Google, Amazon, and others! Nvidia is putting the pedal to the metal in terms of being innovative and visionary! They are leaving the likes of Apple in the dust.

The top talent out there is likely to go to Nvidia for a while. Engineers, programmers/software architects, network architects, product designers, data specialists, AI researchers, developers of robotics and autonomous vehicles, R&D specialists, computer vision specialists, natural language processing experts, and many other professionals will be flocking to Nvidia to work for a company that has already changed the world and will likely continue to do so for years to come.



NVIDIA’s AI Superbowl — from theneurondaily.com by Noah and Grant
PLUS: Prompt tips to make AI writing more natural

That’s despite a flood of new announcements (here’s a 16 min video recap), which included:

  1. A new architecture for massive AI data centers (now called “AI factories”).
  2. A physics engine for robot training built with Disney and DeepMind.
  3. A partnership with GM to develop next-gen vehicles, factories and robots.
  4. A new Blackwell chip with “Dynamo” software that makes AI reasoning 40x faster than previous generations.
  5. A new “Rubin” chip slated for 2026 and a “Feynman” chip set for 2028.

For enterprises, NVIDIA unveiled DGX Spark and DGX Station—Jensen’s vision of AI-era computing, bringing NVIDIA’s powerful Blackwell chip directly to your desk.


Nvidia Bets Big on Synthetic Data — from wired.com by Lauren Goode
Nvidia has acquired synthetic data startup Gretel to bolster the AI training data used by the chip maker’s customers and developers.


Nvidia, xAI to Join BlackRock and Microsoft’s $30 Billion AI Infrastructure Fund — from investopedia.com by Aaron McDade
Nvidia and xAI are joining BlackRock and Microsoft in an AI infrastructure group seeking $30 billion in funding. The group was first announced in September as BlackRock and Microsoft sought to fund new data centers to power AI products.



Nvidia CEO Jensen Huang says we’ll soon see 1 million GPU data centers visible from space — from finance.yahoo.com by Daniel Howley
Nvidia CEO Jensen Huang says the company is preparing for 1 million GPU data centers.


Nvidia stock stems losses as GTC leaves Wall Street analysts ‘comfortable with long term AI demand’ — from finance.yahoo.com by Laura Bratton
Nvidia stock reversed direction after a two-day slide that saw shares lose 5% as the AI chipmaker’s annual GTC event failed to excite investors amid a broader market downturn.


Microsoft, Google, and Oracle Deepen Nvidia Partnerships. This Stock Got the Biggest GTC Boost. — from barrons.com by Adam Clark and Elsa Ohlen


The 4 Big Surprises from Nvidia’s ‘Super Bowl of AI’ GTC Keynote — from barrons.com by Tae Kim; behind a paywall

AI Super Bowl. Hi everyone. This week, 20,000 engineers, scientists, industry executives, and yours truly descended upon San Jose, Calif. for Nvidia’s annual GTC developers’ conference, which has been dubbed the “Super Bowl of AI.”


 

Blind Spot on AI — from the-job.beehiiv.com by Paul Fain
Office tasks are being automated now, but nobody has answers on how education and worker upskilling should change.

Students and workers will need help adjusting to a labor market that appears to be on the verge of a historic disruption as many business processes are automated. Yet job projections and policy ideas are sorely lacking.

The benefits of agentic AI are already clear for a wide range of organizations, including small nonprofits like CareerVillage. But the ability to automate a broad range of business processes means that education programs and skills training for knowledge workers will need to change. And as Chung writes in a must-read essay, we have a blind spot with predicting the impacts of agentic AI on the labor market.

“Without robust projections,” he writes, “policymakers, businesses, and educators won’t be able to come to terms with how rapidly we need to start this upskilling.”

 

5 Legal Tech Trends Set to Impact Law Firms in 2025 — from programminginsider.com by Marc Berman

The legal industry is experiencing swift changes, with technology becoming an ever more crucial factor in its evolution. As law firms respond to shifting client demands and regulatory changes, the pace of change is accelerating. Embracing legal tech is no longer just an advantage; it’s a necessity.

According to a Forbes report, 66% of legal leaders acknowledge this trend and intend to boost their investments in legal tech moving forward. From artificial intelligence streamlining workflows to cloud computing enabling globalized legal services, the legal landscape is undergoing a digital revolution.

In this article, we’ll explore five key legal tech trends that will define how law firms operate in 2025.


GenAI, Legal Ops, and The Future of Law Firms: A Wake-Up Call? — from echlawcrossroads.com by Stephen Embry

A new study from the Blickstein Group reveals some disturbing trends for law firms that represent businesses, particularly large ones. The study is entitled “Legal Service Delivery in the Age of AI” and was done jointly by FTI Technologies, a consulting group, and Blickstein. It looks at law department legal operations.

The Findings

GenAI Use by Legal Ops Personnel

The responses reflect a bullish view of what GenAI can do in the legal marketplace but also demonstrate GenAI has a ways to go:

  • Almost 80% of the respondents think that GenAI will become an “essential part of the legal profession.”
  • 81% believe GenAI will drive improved efficiencies.
  • Despite this belief, only some 30% have plans to purchase GenAI tools. For 81%, the primary reason for obtaining and using GenAI tools is the efficiencies these tools bring.
  • 52% say their GenAI strategy is either not as sophisticated as they would like or nonexistent.

The biggest barriers to the use of GenAI among legal ops professionals are cost, security concerns, and the lack of skilled personnel available to them.


Voting Is Closed, Results Are In: Here are the 15 Legal Tech Startups Selected for the 2025 Startup Alley at ABA TECHSHOW — from lawnext.com by Bob Ambrogi

Voting has now closed and your votes have been tallied to pick the 15 legal tech startups that will get to participate as finalists in the ninth-annual Startup Alley at ABA TECHSHOW 2025, taking place April 2-5 in Chicago.

These 15 finalists will face off in an opening-night pitch competition that is the opening event of TECHSHOW, with the conference’s attendees voting at the conclusion of the pitches to pick the top winners.


Balancing innovation and ethics: Applying generative AI in legal work — from legal.thomsonreuters.com

Generative artificial intelligence (GenAI) has brought a new wave of opportunities to the legal profession, opening doors to greater efficiency and innovation. Its rapid development has also raised questions about its integration within the legal industry. As legal professionals are presented with more options for adopting new technologies, they now face the important task of understanding how GenAI can be seamlessly — and ethically — incorporated into their daily operations.


Emerging Trends in Court Reporting for 2025: Legal Technology and Advantages for Law Firms — from jdsupra.com

The court reporting industry is evolving rapidly, propelled by technological advancements and the increasing demand for efficiency in the legal sector. For 2025, trends such as artificial intelligence (AI), real-time transcription technologies, and data-driven tools are reshaping how legal professionals work. Here’s an overview of these emerging trends and five reasons law firms should embrace these advancements.


 

Like it or not, AI is learning how to influence you — from venturebeat.com by Louis Rosenberg

Unfortunately, without regulatory protections, we humans will likely become the objective that AI agents are tasked with optimizing.

I am most concerned about the conversational agents that will engage us in friendly dialog throughout our daily lives. They will speak to us through photorealistic avatars on our PCs and phones and soon, through AI-powered glasses that will guide us through our days. Unless there are clear restrictions, these agents will be designed to conversationally probe us for information so they can characterize our temperaments, tendencies, personalities and desires, and use those traits to maximize their persuasive impact when working to sell us products, pitch us services or convince us to believe misinformation.

 
 
© 2025 | Daniel Christian