GRCC students to use AI to help businesses solve ‘real world’ challenges in new course — from mlive.com by Brian McVicar; via Patrick Bailey on LinkedIn

GRAND RAPIDS, MI — A new course at Grand Rapids Community College aims to help students learn about artificial intelligence by using the technology to solve real-world business problems.

In a release, the college said its grant application was supported by 20 local businesses, including Gentex, TwistThink and the Grand Rapids Public Museum. The businesses have pledged to work with students who will use business data to develop an AI project such as a chatbot that interacts with customers, or a program that automates social media posts or summarizes customer data.

“This rapidly emerging technology can transform the way businesses process data and information,” Kristi Haik, dean of GRCC’s School of Science, Technology, Engineering and Mathematics, said in a statement. “We want to help our local business partners understand and apply the technology. We also want to create real experiences for our students so they enter the workforce with demonstrated competence in AI applications.”

As Patrick Bailey said on LinkedIn about this article:

Nice to see a pedagogy that’s setting a forward movement rather than focusing on what could go wrong with AI in a curriculum.


Forecast for Learning and Earning in 2025-2026 report — from pages.asugsvsummit.com by Jennifer Lee and Claire Zau

In this look ahead at the future of learning and work, we aim to define:

  • Major thematic observations
  • What makes this moment an inflection point
  • Key predictions (and their precedent)
  • Short- and long-term projected impacts


The LMS at 30: From Course Management to Learning Management (At Last) — from onedtech.philhillaa.com; a guest post from Matthew Pittinsky, Ph.D.

As a 30-year observer and participant, I believe that previous technology platform shifts like SaaS and mobile did not fundamentally change the LMS. AI is different. We’re standing at the precipice of LMS 2.0, where the branding change from Course Management System to Learning Management System will finally live up to its name. Unlike SaaS or mobile, AI represents a technology platform shift that will transform the way participants interact with learning systems – and with it, the nature of the LMS itself.

Given the transformational potential of AI, it is useful to set the context and think about how we got here, especially on this 30th anniversary of the LMS.

LMS at 30 Part 2: Learning Management in the AI Era — from onedtech.philhillaa.com; a guest post from Matthew Pittinsky, Ph.D.

Where AI is disruptive is in its ability to introduce a whole new set of capabilities that are best described as personalized learning services. AI offers a new value proposition to the LMS, roughly the set of capabilities currently being developed in the AI Tutor / agentic TA segment. These new capabilities are so valuable given their impact on learning that I predict they will become the services with greatest engagement within a school or university’s “enterprise” instructional platform.

In this way, by LMS paradigm shift, I specifically mean a shift from buyers valuing the product on its course-centric and course management capabilities, to valuing it on its learner-centric and personalized learning capabilities.


AI and the future of education: disruptions, dilemmas and directions — from unesdoc.unesco.org

This anthology reveals how the integration of AI in education poses profound philosophical, pedagogical, ethical and political questions. As this global AI ecosystem evolves and becomes increasingly ubiquitous, UNESCO and its partners have a shared responsibility to lead the global discourse towards an equity- and justice-centred agenda. The volume highlights three areas in which UNESCO will continue to convene and lead a global commons for dialogue and action, particularly around AI futures, policy and practice innovation, and experimentation.

  1. As guardian of ethical, equitable, human-centred AI in education
  2. As thought leader in reimagining curriculum and pedagogy
  3. As a platform for engaging pluralistic and contested dialogues

AI, copyright and the classroom: what higher education needs to know — from timeshighereducation.com by Cayce Myers
As artificial intelligence reshapes teaching and research, one legal principle remains at the heart of our work: copyright. Understanding its implications isn’t just about compliance – it’s about protecting academic integrity, intellectual property and the future of knowledge creation. Cayce Myers explains


The School Year We Finally Notice “The Change” — from americanstogether.substack.com by Jason Palmer

Why It Matters
A decade from now, we won’t say “AI changed schools.” We’ll say: this was the year schools began to change what it means to be human, augmented by AI.

This transformation isn’t about efficiency alone. It’s about dignity, creativity, and discovery, and connecting education more directly to human flourishing. The industrial age gave us schools to produce cookie-cutter workers. The digital age gave us knowledge anywhere, anytime. The AI age—beginning now—gives us back what matters most: the chance for every learner to become infinitely capable.

This fall may look like any other—bells ringing, rows of desks—but beneath the surface, education has begun its greatest transformation since the one-room schoolhouse.


How should universities teach leadership now that teams include humans and autonomous AI agents? — from timeshighereducation.com by Alex Zarifis
Trust and leadership style are emerging as key aspects of teambuilding in the age of AI. Here are ways to integrate these considerations with technology in teaching

Transactional and transformational leaderships’ combined impact on AI and trust
Given the volatile times we live in, a leader may find themselves in a situation where they know how they will use AI, but they are not entirely clear on the goals and journey. In a teaching context, students can be given scenarios where they must lead a team, including autonomous AI agents, to achieve goals. They can then analyse the situations and decide what leadership styles to apply and how to build trust in their human team members. Educators can illustrate this decision-making process using a table.

They may need to combine transactional leadership with transformational leadership, for example. Transactional leadership focuses on planning, communicating tasks clearly and an exchange of value. This works well with both humans and automated AI agents.

 

Higher ed’s ‘hunker-down mindset’ — from open-campus-dispatch.beehiiv.com by Colleen Murphy
A tight housing market and a fragile job market mean those working in higher ed have fewer options than ever.

Faculty and administrators could be just as constrained by the golden handcuffs of a 2% interest rate as everybody else. That makes them less likely to move for a new job, Kelchen said, especially since they’re unlikely to get the type of salary increase they’d need to offset more pricey mortgage payments. Plus, even finding an affordable house in the first place could be a challenge right now.

All of this contributes to what Kelchen called a “hunker-down mindset” in higher ed.

“Even if the institutions are giving out pay raises, the pay raises aren’t matching housing costs,” Kelchen said. “And then that creates a pressure to stay.”

While that might seem like a “first-world problem,” it also affects college and university staff members, Kelchen told me. Often the only way for staff members to make more money is to move universities — there aren’t the same in-house growth opportunities as there are for faculty. But that’s easier said than done.

 

Midoo AI Launches the World’s First AI Language Learning Agent, Redefining How People Learn Languages — from morningstar.com

SINGAPORE, Sept. 3, 2025 /PRNewswire/ — Today, Midoo AI proudly announces the launch of the world’s first AI language learning agent, a groundbreaking innovation set to transform language education forever.

For decades, language learning has pursued one ultimate goal: true personalization. Traditional tools offered smart recommendations, gamified challenges, and pre-written role-play scripts—but real personalization remained out of reach. Midoo AI changes that. Here is the launch video of Midoo AI.

Imagine a learning experience that evolves with you in real time. A system that doesn’t rely on static courses or scripts but creates a dynamic, one-of-a-kind language world tailored entirely to your needs. This is the power of Midoo’s Dynamic Generation technology.

“Midoo is not just a language-learning tool,” said Yvonne, co-founder of Midoo AI. “It’s a living agent that senses your needs, adapts instantly, and shapes an experience that’s warm, personal, and alive. Learning is no longer one-size-fits-all—now, it’s yours and yours alone.”


Midoo AI Review: Meet the First AI Language Learning Agent — from autogpt.net

Language learning apps have traditionally focused on exercises, quizzes, and progress tracking. Midoo AI introduces a different approach. Instead of presenting itself as a course provider, it acts as an intelligent learning agent that builds, adapts, and sustains a learner’s journey.

This review examines how Midoo AI operates, its feature set, and what makes it distinct from other AI-powered tutors.

Midoo AI in Context: Purpose and Position
Midoo AI is not structured around distributing lessons or modules. Its core purpose is to provide an agent-like partner that adapts in real time. Where many platforms ask learners to select a “level” or “topic,” Midoo instead begins by analyzing goals, usage context, and error patterns. The result is less about consuming predesigned units and more about co-constructing a pathway.


AI Isn’t Replacing Teachers — It’s Helping Us Teach Better — from rdene915.com by guest author Matthew Mawn

Turning Time Saved Into Better Learning
AI can save teachers time, but what can that time be used for (besides taking a breath)? For most of us, it means redirecting energy into the parts of teaching that made us want to pursue this profession in the first place: connecting with our students and helping them grow academically.

Differentiation
Every classroom has students with different readiness levels, language needs, and learning preferences. AI tools like Diffit or MagicSchool can instantly create multiple versions of a passage or assignment, differentiated by grade level, complexity, or language. This allows every student to engage with the same core concept, moving together as one cohesive class. Instead of spending an evening retyping and rephrasing, teachers can review and tweak AI drafts in minutes, ready for the next lesson.


Mass Intelligence — from oneusefulthing.org by Ethan Mollick
From GPT-5 to nano banana: everyone is getting access to powerful AI

When a billion people have access to advanced AI, we’ve entered what we might call the era of Mass Intelligence. Every institution we have — schools, hospitals, courts, companies, governments — was built for a world where intelligence was scarce and expensive. Now every profession, every institution, every community has to figure out how to thrive with Mass Intelligence. How do we harness a billion people using AI while managing the chaos that comes with it? How do we rebuild trust when anyone can fabricate anything? How do we preserve what’s valuable about human expertise while democratizing access to knowledge?


AI Is the Cognitive Layer. Schools Still Think It’s a Study Tool. — from stefanbauschard.substack.com by Stefan Bauschard

By the time today’s 9th graders and college freshmen enter the workforce, the most disruptive waves of AGI and robotics may already be embedded into parts of society.

What replaces the old system will not simply be a more digital version of the same thing. Structurally, schools may move away from rigid age-groupings, fixed schedules, and subject silos. Instead, learning could become more fluid, personalized, and interdisciplinary—organized around problems, projects, and human development rather than discrete facts or standardized assessments.

AI tutors and mentors will allow for pacing that adapts to each student, freeing teachers to focus more on guidance, relationships, and high-level facilitation. Classrooms may feel less like miniature factories and more like collaborative studios, labs, or even homes—spaces for exploring meaning and building capacity, not just delivering content.

If students are no longer the default source of action, then we need to teach them to:

    • Design agents,
    • Collaborate with agents,
    • Align agentic systems with human values,
    • And most of all, retain moral and civic agency in a world where machines act on our behalf.

We are no longer educating students to be just doers.
We must now educate them to be judges, designers, and stewards of agency.


Meet Your New AI Tutor — from wondertools.substack.com by Jeremy Caplan
Try new learning modes in ChatGPT, Claude, and Gemini

AI assistants are now more than simple answer machines. ChatGPT’s new Study Mode, Claude’s Learning Mode, and Gemini’s Guided Learning represent a significant shift. Instead of just providing answers, these free tools act as adaptive, 24/7 personal tutors.



AI Tools for Instructional Design (September, 2025) — from drphilh.gumroad.com by Dr Philippa Hardman

That’s why, in preparation for my next bootcamp, which kicks off September 8th, 2025, I’ve just completed a full refresh of my list of the most powerful & popular AI tools for Instructional Designers, complete with tips on how to get the most from each tool.

The list has been created using my own experience + the experience of hundreds of Instructional Designers who I work with every week.

It contains the 50 most powerful AI tools for instructional design available right now, along with tips on how to optimise their benefits while mitigating their risks.


Addendums on 9/4/25:


AI Companies Roll Out Educational Tools — from insidehighered.com by Ray Schroeder
This fall, Google, Anthropic and OpenAI are rolling out powerful new AI tools for students and educators, each taking a different path to shape the future of learning.



Rethinking My List of Essential Job Skills in the Age of AI — from michellekassorla.substack.com by Michelle Kassorla

So here’s the new list of essential skills I think my students will need when they are employed to work with AI five years from now:

  1. They can follow directions, analyze outcomes, and adapt to change when needed.
  2. They can write or edit AI output to capture a unique voice and appropriate tone in sync with an audience’s needs.
  3. They have a deep understanding of one or more content areas of a particular profession, business, or industry, so they can easily identify factual errors.
  4. They have a strong commitment to exploration, a flexible mindset, and a broad understanding of AI literacy.
  5. They are resilient and critical thinkers, ready to question results and demand better answers.
  6. They are problem solvers.

And, of course, here is a new rubric built on those skills:


 

Anthropic Education Report: How educators use Claude — from anthropic.com

We find that:

Educators use AI in and out of the classroom
Educators’ uses range from developing course materials and writing grant proposals to academic advising and managing administrative tasks like admissions and financial planning.

Educators aren’t just using chatbots; they’re building their own custom tools with AI
Faculty are using Claude Artifacts to create interactive educational materials, such as chemistry simulations, automated grading rubrics, and data visualization dashboards.

Educators tend to automate the drudgery while staying in the loop for everything else
Tasks requiring significant context, creativity, or direct student interaction—like designing lessons, advising students, and writing grant proposals—are where educators are more likely to use AI as an enhancement. In contrast, routine administrative work such as financial management and record-keeping is more automation-heavy.

Some educators are automating grading; others are deeply opposed
In our Claude.ai data, faculty used AI for grading and evaluation less frequently than other uses, but when they did, 48.9% of the time they used it in an automation-heavy way (where the AI directly performs the task). That’s despite educator concerns about automating assessment tasks, as well as our surveyed faculty rating it as the area where they felt AI was least effective.

 

The 2025 Changing Landscape of Online Education (CHLOE) 10 Report — from qualitymatters.org; emphasis below from DSC

Notable findings from the 73-page report include: 

  • Online Interest Surges Across Student Populations: …
  • Institutional Preparedness Falters Amid Rising Demand: Despite accelerating demand, institutional readiness has stagnated—or regressed—in key areas.
  • The Online Education Marketplace Is Increasingly Competitive: …
  • Alternative Credentials Take Center Stage: …
  • AI Integration Lacks Strategic Coordination: …

Just 28% of faculty are considered fully prepared for online course design, and 45% for teaching. Alarmingly, only 28% of institutions report having fully developed academic continuity plans for future emergency pivots to online.


Also relevant, see:


Great Expectations, Fragile Foundations — from onedtech.philhillaa.com by Glenda Morgan
Lessons about growth from the CHLOE & BOnES reports

Cultural resistance remains strong. Many [Chief Online Learning Officers] COLOs say faculty and deans still believe in-person learning is “just better,” creating headwinds even for modest online growth. As one respondent at a four-year institution with a large online presence put it:

Supportive departments [that] see the value in online may have very different levels of responsiveness compared to academic departments [that] are begrudgingly online. There is definitely a growing belief that students “should” be on-ground and are only choosing online because it’s easy/convenient. Never mind the very real and growing population of nontraditional learners who can only take online classes, and the very real and growing population of traditional-aged learners who prefer online classes; many faculty/deans take a paternalistic, “we know what’s best” approach.


Ultimately, what we need is not just more ambition but better ambition. Ambition rooted in a realistic understanding of institutional capacity, a shared strategic vision, investments in policy and infrastructure, and a culture that supports online learning as a core part of the academic mission, not an auxiliary one. It’s time we talked about what it really takes to grow online learning, and where ambition needs to be matched by structure.

From DSC:
Yup. Culture is at the breakfast table again…boy, those strategies taste good.

I’d like to take some of this report — like the graphic below — and share it with former faculty members and the leadership of a couple of organizations where I used to work. They strongly disagreed with us when we tried to advocate for the development of online-based learning/programs at our organizations…but we were right. We were right all along. And we were LEADING all along. No doubt about it — even if the leadership at the time said that we weren’t leading.

The cultures of those organizations hurt us at the time. But our cultivating work eventually led to the development of online programs — unfortunately, after our groups were disbanded, they had to outsource those programs to OPMs.


Arizona State University — with its dramatic growth in online-based enrollments.

 
 
 

Thomson Reuters CEO: Legal Profession Faces “Biggest Disruption in Its History” From AI — from lawnext.com by Bob Ambrogi

Thomson Reuters President and CEO Steve Hasker believes the legal profession is experiencing “the biggest disruption … in its history” due to generative and agentic artificial intelligence, fundamentally rewriting how legal work products are created for the first time in more than 300 years.

Speaking to legal technology reporters during ILTACON, the International Legal Technology Association’s annual conference, Hasker outlined his company’s ambitious goal to become “the most innovative company” in the legal tech sector while navigating what he described as unprecedented technological change affecting a profession that has remained largely unchanged since its origins in London tea houses centuries ago.


Legal tech hackathon challenges students to rethink access to justice — from the Centre for Innovation and Entrepreneurship, Auckland Law School
In a 24-hour sprint, student teams designed innovative tools to make legal and social support more accessible.

The winning team comprised students of computer science, law, psychology and physics. They developed a privacy-first legal assistant powered by AI that helps people understand their legal rights without needing to navigate dense legal language.


Teaching How To ‘Think Like a Lawyer’ Revisited — from abovethelaw.com by Stephen Embry
GenAI gives the concept of training law students to think like a lawyer a whole new meaning.

Law Schools
These insights have particular urgency for legal education. Indeed, most of Cowen’s criticisms and suggested changes need to be front and center for law school leaders. It’s naïve to think that law students and lawyers aren’t going to use GenAI tools in virtually every aspect of their professional and personal lives. Rather than avoiding the subject or, worse yet, trying to stop the use of these tools, law schools should make GenAI tools a fundamental part of research, writing and drafting training.

They need to focus not on memorization but on the critical thinking skills beginning lawyers used to acquire in the on-the-job, guild-style training system. As I discussed, that training came from repetitive and often tedious work that developed experienced lawyers who could recognize patterns and solutions based on their exposure to similar situations. But much of that repetitive and tedious work may go away in a GenAI world.

The Role of Adjunct Professors
But to do this, law schools need to better partner with actual practicing lawyers who can serve as adjunct professors. Law schools need to do away with the notion that adjuncts are second-class teachers.


It’s a New Dawn In Legal Tech: From Woodstock to ILTACON (And Beyond) — from lawnext.com by Bob Ambrogi

As someone who has covered legal tech for 30 years, I cannot remember there ever being a time such as this, when the energy and excitement are raging, driven by generative AI and a new era of innovation and new ideas of the possible.

But this year was different. Wandering the exhibit hall, getting product briefings from vendors, talking with attendees, it was impossible to ignore the fundamental shift happening in legal technology. Gen AI isn’t just creating new products – it is spawning entirely new categories of products that truly are reshaping how legal work gets done.

Agentic AI is the buzzword of 2025 and agentic systems were everywhere at ILTACON, promising to streamline workflows across all areas of legal practice. But, perhaps more importantly, these tools are also beginning to address the business side of running a law practice – from client intake and billing to marketing and practice management. The scope of transformation is now beginning to extend beyond the practice of law into the business of law.

Largely missing from this gathering were solo practitioners, small firm lawyers, legal aid organizations, and access-to-justice advocates – the very people who stand to benefit most from the democratizing potential of AI.

However, now more than ever, the innovations we are seeing in legal tech have the power to level the playing field, to give smaller practitioners access to tools and capabilities that were once prohibitively expensive. If these technologies remain priced for and marketed primarily to Big Law, we will have succeeded only in widening the justice gap rather than closing it.


How AI is Transforming Deposition Review: A LegalTech Q&A — from jdsupra.com

Thanks to breakthroughs in artificial intelligence – particularly in semantic search, multimodal models, and natural language processing – new legaltech solutions are emerging to streamline and accelerate deposition review. What once took hours or days of manual analysis now can be accomplished in minutes, with greater accuracy and efficiency than possible with manual review.


From Skepticism to Trust: A Playbook for AI Change Management in Law Firms — from jdsupra.com by Scott Cohen

Historically, lawyers have been slow adopters of emerging technologies, and with good reason. Legal work is high stakes, deeply rooted in precedent, and built on individual judgment. AI, especially the new generation of agentic AI (systems that not only generate output but initiate tasks, make decisions, and operate semi-autonomously), represents a fundamental shift in how legal work gets done. This shift naturally leads to caution as it challenges long-held assumptions about lawyer workflows and several aspects of their role in the legal process.

The path forward is not to push harder or faster, but smarter. Firms need to take a structured approach that builds trust through transparency, context, training, and measurement of success. This article provides a five-part playbook for law firm leaders navigating AI change management, especially in environments where skepticism is high and reputational risk is even higher.


ILTACON 2025: The vendor briefings – Agents, ecosystems and the next stage of maturity — from legaltechnology.com by Caroline Hill

This year’s ILTACON in Washington was heavy on AI, but the conversation with vendors has shifted. Legal IT Insider’s briefings weren’t about potential use cases or speculative roadmaps. Instead, they focused on how AI is now being embedded into the tools lawyers use every day — and, crucially, how those tools are starting to talk to each other.

Taken together, they point to an inflection point, where agentic workflows, data integration, and open ecosystems define the agenda. But it’s important amidst the latest buzzwords to remember that agents are only as good as the tools they have to work with, and AI only as good as its underlying data. Also, as we talk about autonomous AI, end users are still struggling with cloud implementations and infrastructure challenges, and need vendors to be business partners that help them to make progress at speed.

Harvey’s roadmap is all about expanding its surface area — connecting to systems like iManage, LexisNexis, and more recently publishing giant Wolters Kluwer — so that a lawyer can issue a single query and get synthesised, contextualised answers directly within their workflow. Weinberg said: “What we’re trying to do is get all of the surface area of all of the context that a lawyer needs to complete a task and we’re expanding the product surface so you can enter a search, search all resources, and apply that to the document automatically.” 

The common thread: no one is talking about AI in isolation anymore. It’s about orchestration — pulling together multiple data sources into a workflow that actually reflects how lawyers practice. 


5 Pitfalls Of Delaying Automation In High-Volume Litigation And Claims — from jdsupra.com

Why You Can’t Afford to Wait to Adopt AI Tools that have Plaintiffs Moving Faster than Ever
Just as photocopiers shifted law firm operations in the early 1970s and cloud computing transformed legal document management in the early 2000s, AI automation tools are altering the current legal landscape—enabling litigation teams to instantly structure unstructured data, zero in on key arguments in seconds, and save hundreds (if not thousands) of hours of manual work.


Your Firm’s AI Policy Probably Sucks: Why Law Firms Need Education, Not Rules — from jdsupra.com

The Floor, Not the Ceiling
Smart firms need to flip their entire approach. Instead of dictating which AI tools lawyers must use, leadership should set a floor for acceptable use and then get out of the way.

The floor is simple: no free versions for client work. Free tools are free because users are the product. Client data becomes training data. Confidentiality gets compromised. The firm loses any ability to audit or control how information flows. This isn’t about control; it’s about professional responsibility.

But setting the floor is only the first step. Firms must provide paid, enterprise versions of AI tools that lawyers actually want to use. Not some expensive legal tech platform that promises AI features but delivers complicated workflows. Real AI tools. The same ones lawyers are already using secretly, but with enterprise security, data protection, and proper access controls.

Education must be practical and continuous. Single training sessions don’t work. AI tools evolve weekly. New capabilities emerge constantly. Lawyers need ongoing support to experiment, learn, and share discoveries. This means regular workshops, internal forums for sharing prompts and techniques, and recognition for innovative uses.

The education investment pays off immediately. Lawyers who understand AI use it more effectively. They catch its mistakes. They know when to verify outputs. They develop specialized prompts for legal work. They become force multipliers, not just for themselves but for their entire teams.

 

What next for EDI? Protecting equality of opportunity in HE — from timeshighereducation.com by Laura Duckett
As equity, diversity and inclusion practices face mounting political and cultural challenges, this guide includes strategies from academics around the world on preserving fair access and opportunity for all

As many in this guide explain, hostility to efforts to create fairer, more inclusive and diverse institutions of higher education runs a lot deeper than the latest US presidential agenda, and it cannot be dismissed as a momentary political spike. Yet the continued need for EDI (or DEI, as it is called in America) work to address historic and systemic injustice is clear from the data. In the US, Black, Hispanic, Latino, Native American and Pacific Islander people are under-represented in university student and staff populations. Students from these groups also have worse academic outcomes.

In the UK, only 1 per cent of professors are Black, women remain under-represented on the higher rungs of the academic ladder and the attainment gap between students from minoritised backgrounds and their white counterparts remains stubbornly evident across the higher education sector.

While not all EDI work has proved successful, significant progress has been made on widening participation in higher education and building more inclusive universities in which students and academics can thrive.

This guide shares lessons from academics on navigating increasingly choppy waters relating to EDI, addressing misconceptions about the work and its core ambitions, strategies for allyship, anti-racism and inclusion and how to champion EDI through your teaching and institutional culture.

 


Back to School in the AI Era: Why Students Are Rewriting Your Lesson Plan — from linkedin.com by Hailey Wilson

As a new academic year begins, many instructors, trainers, and program leaders are bracing for familiar challenges—keeping learners engaged, making complex material accessible, and preparing students for real-world application.

But there’s a quiet shift happening in classrooms and online courses everywhere.

This fall, it’s not the syllabus that’s guiding the learning experience—it’s the conversation between the learner and an AI tool.


From bootcamp to bust: How AI is upending the software development industry — from reuters.com by Anna Tong; via Paul Fain
Coding bootcamps have been a mainstay in Silicon Valley for more than a decade. Now, as AI eliminates the kind of entry-level roles for which they trained people, they’re disappearing.

Coding bootcamps have been a Silicon Valley mainstay for over a decade, offering an important pathway for non-traditional candidates to get six-figure engineering jobs. But coding bootcamp operators, students and investors tell Reuters that this path is rapidly disappearing, thanks in large part to AI.

“Coding bootcamps were already on their way out, but AI has been the nail in the coffin,” said Allison Baum Gates, a general partner at venture capital fund SemperVirens, who was an early employee at bootcamp pioneer General Assembly.

Gates said bootcamps were already in decline due to market saturation, evolving employer demand and market forces like growth in international hiring.

 

BREAKING: Google introduces Guided Learning — from aieducation.substack.com by Claire Zau
Some thoughts on what could make Google’s AI tutor stand out

Another major AI lab just launched “education mode.”

Google introduced Guided Learning in Gemini, transforming it into a personalized learning companion designed to help you move from quick answers to real understanding.

Instead of immediately spitting out solutions, it:

  • Asks probing, open-ended questions
  • Walks learners through step-by-step reasoning
  • Adapts explanations to the learner’s level
  • Uses visuals, videos, diagrams, and quizzes to reinforce concepts

This Socratic-style tutor rollout follows closely behind similar announcements such as OpenAI’s Study Mode (last week) and Anthropic’s Claude for Education (April 2025).


How Sci-Fi Taught Me to Embrace AI in My Classroom — from edsurge.com by Dan Clark

I’m not too naive to understand that, no matter how we present it, some students will always be tempted by “the dark side” of AI. What I also believe is that the future of AI in education is not decided. It will be decided by how we, as educators, embrace or demonize it in our classrooms.

My argument is that setting guidelines and talking to our students honestly about the pitfalls and amazing benefits that AI offers us as researchers and learners will define it for the coming generations.

Can AI be the next calculator? Something that, yes, changes the way we teach and learn, but not necessarily for the worse? If we want it to be, yes.

How it is used, and more importantly, how AI is perceived by our students, can be influenced by educators. We have to first learn how AI can be used as a force for good. If we continue to let the dominant voice be that AI is the Terminator of education and critical thinking, then that will be the fate we have made for ourselves.


AI Tools for Strategy and Research – GT #32 — from goodtools.substack.com by Robin Good
Getting expert advice, how to do deep research with AI, prompt strategy, comparing different AIs side-by-side, creating mini-apps and an AI Agent that can critically analyze any social media channel

In this issue, discover AI tools for:

  • Getting Expert Advice
  • Doing Deep Research with AI
  • Improving Your AI Prompt Strategy
  • Comparing Results from Different AIs
  • Creating an AI Agent for Social Media Analysis
  • Summarizing YouTube Videos
  • Creating Mini-Apps with AI
  • Tasting an Award-Winning AI Short Film

GPT-Building, Agentic Workflow Design & Intelligent Content Curation — from drphilippahardman.substack.com by Dr. Philippa Hardman
What 3 recent job ads reveal about the changing nature of Instructional Design

In this week’s blog post, I’ll share my take on how the instructional design role is evolving and discuss what this means for our day-to-day work and the key skills it requires.

With this in mind, I’ve been keeping a close eye on open instructional design roles and, in the last 3 months, have noticed the emergence of a new flavour of instructional designer: the so-called “Generative AI Instructional Designer.”

Let’s deep dive into three explicitly AI-focused instructional design positions that have popped up in the last quarter. Each one illuminates a different aspect of how the role is changing—and together, they paint a picture of where our profession is likely heading.

Designers who evolve into prompt engineers, agent builders, and strategic AI advisors will capture the new premium. Those who cling to traditional tool-centric roles may find themselves increasingly sidelined—or automated out of relevance.


Google to Spend $1B on AI Training in Higher Ed — from insidehighered.com by Katherine Knott

Google’s parent company announced Wednesday (8/6/25) that it’s planning to spend $1 billion over the next three years to help colleges teach and train students about artificial intelligence.

Google is joining other AI companies, including OpenAI and Anthropic, in investing in AI training in higher education. All three companies have rolled out new tools aimed at supporting “deeper learning” among students and made their AI platforms available to certain students for free.


5 Predictions for How AI Will Impact Community Colleges — from pistis4edu.substack.com by Feng Hou

Based on current technology capabilities, adoption patterns, and the mission of community colleges, here are five well-supported predictions for AI’s impact in the coming years.

  1. Universal AI Tutor Access
  2. AI as Active Teacher
  3. Personalized Learning Pathways
  4. Interactive Multimodal Learning
  5. Value-Centric Education in an AI-Abundant World

 

AI and Higher Ed: An Impending Collapse — from insidehighered.com by Robert Niebuhr; via George Siemens; I also think George’s excerpt (see below) gets right to the point.
Universities’ rush to embrace AI will lead to an untenable outcome, Robert Niebuhr writes.

Herein lies the trap. If students learn how to use AI to complete assignments and faculty use AI to design courses, assignments, and grade student work, then what is the value of higher education? How long until people dismiss the degree as an absurdly overpriced piece of paper? How long until that trickles down and influences our economic and cultural output? Simply put, can we afford a scenario where students pretend to learn and we pretend to teach them?


This next report doesn’t look too good for traditional institutions of higher education either:


No Country for Young Grads — from burningglassinstitute.org

For the first time in modern history, a bachelor’s degree is no longer a reliable path to professional employment. Recent graduates face rising unemployment and widespread underemployment as structural—not cyclical—forces reshape entry-level work. This new report identifies four interlocking drivers: an AI-powered “Expertise Upheaval” eliminating many junior tasks, a post-pandemic shift to lean staffing and risk-averse hiring, AI acting as an accelerant to these changes, and a growing graduate glut. As a result, young degree holders are uniquely seeing their prospects deteriorate, even as the rest of the economy remains robust. Read the full report to explore the data behind these trends.

The above article was via Brandon Busteed on LinkedIn.

“Using AI Right Now: A Quick Guide” [Mollick] + other items re: AI in our learning ecosystems

Thoughts on thinking — from dcurt.is by Dustin Curtis

Intellectual rigor comes from the journey: the dead ends, the uncertainty, and the internal debate. Skip that, and you might still get the insight, but you’ll have lost the infrastructure for meaningful understanding. Learning by reading LLM output is cheap. Real exercise for your mind comes from building the output yourself.

The irony is that I now know more than I ever would have before AI. But I feel slightly dumber. A bit more dull. LLMs give me finished thoughts, polished and convincing, but none of the intellectual growth that comes from developing them myself. 


Using AI Right Now: A Quick Guide — from oneusefulthing.org by Ethan Mollick
Which AIs to use, and how to use them

Every few months I put together a guide on which AI system to use. Since I last wrote my guide, however, there has been a subtle but important shift in how the major AI products work. Increasingly, it isn’t about the best model, it is about the best overall system for most people. The good news is that picking an AI is easier than ever and you have three excellent choices. The challenge is that these systems are getting really complex to understand. I am going to try and help a bit with both.

First, the easy stuff.

Which AI to Use
For most people who want to use AI seriously, you should pick one of three systems: Claude from Anthropic, Google’s Gemini, and OpenAI’s ChatGPT.



Student Voice, Socratic AI, and the Art of Weaving a Quote — from elmartinsen.substack.com by Eric Lars Martinsen
How a custom bot helps students turn source quotes into personal insight—and share it with others

This summer, I tried something new in my fully online, asynchronous college writing course. These classes have no Zoom sessions. No in-person check-ins. Just students, Canvas, and a lot of thoughtful design behind the scenes.

One activity I created was called QuoteWeaver—a PlayLab bot that helps students do more than just insert a quote into their writing.

Try it here

It’s a structured, reflective activity that mimics something closer to an in-person 1:1 conference or a small group quote workshop—but in an asynchronous format, available anytime. In other words, it’s using AI not to speed students up, but to slow them down.

The bot begins with a single quote that the student has found through their own research. From there, it acts like a patient writing coach, asking open-ended, Socratic questions such as:

  • What made this quote stand out to you?
  • How would you explain it in your own words?
  • What assumptions or values does the author seem to hold?
  • How does this quote deepen your understanding of your topic?

It doesn’t move on too quickly. In fact, it often rephrases and repeats, nudging the student to go a layer deeper.
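The rephrase-and-repeat behavior described above can be sketched in a few lines. This is a hypothetical illustration, not the actual PlayLab bot: the real QuoteWeaver is driven by an LLM, which is stubbed out here with a simple reply-length heuristic (`is_shallow`), and the question texts are the ones quoted above.

```python
# Hypothetical sketch of a QuoteWeaver-style Socratic question flow.
# The real bot uses an LLM to judge replies; here a length heuristic
# stands in so the control flow is runnable on its own.

QUESTIONS = [
    "What made this quote stand out to you?",
    "How would you explain it in your own words?",
    "What assumptions or values does the author seem to hold?",
    "How does this quote deepen your understanding of your topic?",
]

def is_shallow(reply: str) -> bool:
    """Stand-in heuristic: a very short reply earns a rephrased repeat."""
    return len(reply.split()) < 8

def next_prompt(step: int, reply: str):
    """Return (next_step, prompt), repeating rather than moving on
    too quickly when the student's reply stays at the surface."""
    if is_shallow(reply):
        return step, "Let's stay here a moment. " + QUESTIONS[step]
    if step + 1 < len(QUESTIONS):
        return step + 1, QUESTIONS[step + 1]
    return step + 1, "Now weave the quote into your own paragraph."

step, prompt = next_prompt(0, "It sounded smart.")
print(prompt)  # repeats question 0 with a nudge to go deeper
```

The design choice worth noting is that the loop only advances on substantive replies, which is what makes the activity slow students down rather than speed them up.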


The Disappearance of the Unclear Question — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
New Piece for UNESCO Education Futures

On [6/13/25], UNESCO published a piece I co-authored with Victoria Livingstone at Johns Hopkins University Press. It’s called The Disappearance of the Unclear Question, and it’s part of the ongoing UNESCO Education Futures series – an initiative I appreciate for its thoughtfulness and depth on questions of generative AI and the future of learning.

Our piece raises a small but important red flag. Generative AI is changing how students approach academic questions, and one unexpected side effect is that unclear questions – for centuries a trademark of deep thinking – may be beginning to disappear. Not because they lack value, but because they don’t always work well with generative AI. Quietly and unintentionally, students (and teachers) may find themselves gradually avoiding them altogether.

Of course, that would be a mistake.

We’re not arguing against using generative AI in education. Quite the opposite. But we do propose that higher education needs a two-phase mindset when working with this technology: one that recognizes what AI is good at, and one that insists on preserving the ambiguity and friction that learning actually requires to be successful.




Leveraging GenAI to Transform a Traditional Instructional Video into Engaging Short Video Lectures — from er.educause.edu by Hua Zheng

By leveraging generative artificial intelligence to convert lengthy instructional videos into micro-lectures, educators can enhance efficiency while delivering more engaging and personalized learning experiences.


This AI Model Never Stops Learning — from link.wired.com by Will Knight

Researchers at Massachusetts Institute of Technology (MIT) have now devised a way for LLMs to keep improving by tweaking their own parameters in response to useful new information.

The work is a step toward building artificial intelligence models that learn continually—a long-standing goal of the field and something that will be crucial if machines are to ever more faithfully mimic human intelligence. In the meantime, it could give us chatbots and other AI tools that are better able to incorporate new information including a user’s interests and preferences.

The MIT scheme, called Self-Adapting Language Models (SEAL), involves having an LLM learn to generate its own synthetic training data and update procedure based on the input it receives.


Edu-Snippets — from scienceoflearning.substack.com by Nidhi Sachdeva and Jim Hewitt
Why knowledge matters in the age of AI; What happens to learners’ neural activity with prolonged use of LLMs for writing

Highlights:

  • Offloading knowledge to Artificial Intelligence (AI) weakens memory, disrupts memory formation, and erodes the deep thinking our brains need to learn.
  • Prolonged use of ChatGPT in writing lowers neural engagement, impairs memory recall, and accumulates cognitive debt that isn’t easily reversed.
 
© 2025 | Daniel Christian