The Library as a Learning Campus
Many microschool founders are wrestling with the same core challenge: how do you provide students with enriching, hands-on experiences when you’re working with a small team and a lean budget? Erin’s answer is deceptively simple — walk through the library’s front door.
Modern public libraries are far more than book repositories. Most educators walk past an entire ecosystem of free resources without realizing what’s available. Need printing, computers, or digital tools? Libraries offer them at little or no cost. Looking for hands-on science programming? Many branches host makerspaces and science stations built for exactly that kind of exploration. Need a space to hold a small class, workshop, or seminar? Bookable collaboration rooms are often just a phone call away.
Beyond the physical infrastructure, libraries frequently offer life skills programming — resume writing, financial literacy, job readiness — that can support the families surrounding a microschool, not just its students. And in some branches, social workers are embedded on site, providing the kind of wraparound support that few microschools could ever access on their own.
Libraries are also deeply invested in expanding their community reach. A microschool brings exactly the kind of engaged, mission-driven partnership that many branches are actively seeking. The relationship benefits both sides from day one.
Across the divide: reimagining faculty-staff collaboration in higher education — from timeshighereducation.com by Saskia van de Gevel
Academic units do best when they harness different viewpoints – from field scientists and curriculum designers to extension professionals – to drive innovation and relevance. Saskia van de Gevel offers proactive advice.
Universities are not sustained by individual leaders or isolated units. They are sustained by teams of people who bring different kinds of expertise to a shared mission. When faculty and professional staff collaborate as genuine partners – aligned around outcomes, clear about roles and committed to mutual respect – institutions become more resilient, innovative and effective.
Also from timeshighereducation.com, see:
The five levels of learning designer support — from timeshighereducation.com by Daniel Searson
Learning designers and academics may have different expectations when it comes to collaborating on course design. Here’s how a five-point scale can help.
How employability teams can strengthen academic programmes — from timeshighereducation.com by Hanene Duprat
Working like recruitment partners, rather than just career advisers, can help align teaching with industry needs, writes Hanene Duprat
Excerpts:
Again, we don’t send them 200 CVs. We might send 20, but they’re meticulously shortlisted. The employer saves time, the student feels they are being taken seriously and trust builds quickly on both sides.
And because we work closely with employers, we learn something universities often struggle to find out early enough: what the market is asking for now.
What academics need to know: we can’t do this without you
If I could say one thing to academic colleagues anywhere, it’s that employability can’t sit next to the curriculum. It has to live with it.
The benefits of engaging third space practitioners in curriculum development — from timeshighereducation.com by Steve Briggs
Third space practitioners are often overlooked in the curriculum development process, to everyone’s detriment. Here’s a look at the viewpoints they can offer and how to engage them better.
Jim VandeHei’s note to his kids: Blunt AI talk — from axios.com by Jim VandeHei
Axios CEO Jim VandeHei wrote this note to his wife, Autumn, and their three kids. She suggested sharing it more broadly since so many families are wrestling with how to think and talk about AI. So here it is …
Dear Family: I want to put to words what I’m hearing, seeing, thinking and writing about AI.
Simply put, I’m now certain it will upend your work and life in ways more profound than the internet or possibly electricity. This will hit in months, not years.
The changes will be fast, wide, radical, disorienting and scary. No one will avoid its reach.
I’m not trying to frighten you. And I know your opinions range from wonderment to worry. That’s natural and OK. Our species isn’t wired for change of this speed or scale.
My conversations with the CEOs and builders of these LLMs, as well as my own deep experimentation with AI, have shaken and stirred me in ways I never imagined.
All of you must figure out how to master AI for any specific job or internship you hold or take. You’d be jeopardizing your future careers by not figuring out how to use AI to amplify and improve your work. You’d be wise to replace social media scrolling with LLM testing.
Penelope Adams Moon suggested that instead [of] framing a workshop around “How can we integrate AI into the work of teaching?” we should ask “Given what we know about learning, how might AI be useful?” I love that reframing, and I think it connects to the students’ requests for more AI knowhow. Students have a lot of options for learning: working with their instructor, collaborating with peers, surfing YouTube for explainer videos, university-provided social annotation platforms, and, yes, using AI as a kind of tutor. I think our job (collectively) isn’t just to teach students how to use AI (as they’re requesting) but also to help them figure out when and how AI is helpful for their learning. That’s highly dependent on the student and the learning task! I wrote about this kind of metacognition on my blog.
In the same way, when I approach any kind of educational technology, I’m looking for tools that can be responsive to my pedagogical aims. The pedagogy should drive the technology use, not the other way around.
How Your Learners *Actually* Learn with AI — from drphilippahardman.substack.com by Dr. Philippa Hardman
What 37.5 million AI chats show us about how learners use AI at the end of 2025 — and what this means for how we design & deliver learning experiences in 2026
Last week, Microsoft released a similar analysis of a whopping 37.5 million Copilot conversations. These conversations took place on the platform from January to September 2025, providing us with a window into whether and how AI use in general — and AI use among learners specifically — has evolved in 2025.
Microsoft’s mass behavioural data gives us a detailed, global glimpse into what learners are actually doing across devices, times of day and contexts. The picture that emerges is pretty clear and largely consistent with what OpenAI told us back in the summer:
AI isn’t functioning primarily as an “answers machine”: the majority of us use AI as a tool to personalise and differentiate generic learning experiences and – ultimately – to augment human learning.
Let’s dive in!
Learners don’t “decide” to use AI anymore. They assume it’s there, like search, like spellcheck, like calculators. The question has shifted from “should I use this?” to “how do I use this effectively?”
So where do you start? There are many agentic tools and platforms for AI tasks on the market, and the most effective approach is to focus on practical, high-impact workflows. So here, I’ll look at some of the most compelling use cases, as well as provide an overview of the tools that can help you quickly deliver tangible wins.
…
Some of the strongest opportunities in HR include:
Workforce management, administering job satisfaction surveys, monitoring and tracking performance targets, scheduling interventions, and managing staff benefits, medical leave, and holiday entitlement.
Recruitment screening, automatically generating and posting job descriptions, filtering candidates, ranking applicants against defined criteria, identifying the strongest matches, and scheduling interviews.
Employee onboarding, issuing new hires with contracts and paperwork, guiding them to onboarding and training resources, tracking compliance and completion rates, answering routine enquiries, and escalating complex cases to human HR specialists.
Training and development, identifying skills gaps, providing self-service access to upskilling and reskilling opportunities, creating personalized learning pathways aligned with roles and career goals, and tracking progress toward completion.
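To make the recruitment-screening bullet above a bit more concrete, here is a minimal sketch of what “ranking applicants against defined criteria” amounts to under the hood. The criteria, weights, and scores are entirely hypothetical, and no real HR platform or API is involved:

```python
# Hypothetical sketch: ranking applicants against defined criteria.
# Criteria names, weights, and scores are illustrative only.

def rank_applicants(applicants, weights):
    """Score each applicant against weighted criteria and sort best-first."""
    def score(applicant):
        return sum(weights[c] * applicant.get(c, 0) for c in weights)
    return sorted(applicants, key=score, reverse=True)

applicants = [
    {"name": "A", "years_experience": 5, "skills_match": 0.8},
    {"name": "B", "years_experience": 2, "skills_match": 0.9},
]
weights = {"years_experience": 0.1, "skills_match": 1.0}

shortlist = rank_applicants(applicants, weights)
# "A" scores 0.1*5 + 1.0*0.8 = 1.3; "B" scores 0.1*2 + 1.0*0.9 = 1.1,
# so "A" is ranked first.
```

In practice, an agentic tool would extract the scores from CVs and job descriptions; the point is only that the ranking step itself is a simple weighted comparison that can be audited and tuned.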
Here’s what’s shaped the AI-education landscape in the last month:
The AI Speed Trap is [still] here: AI adoption in L&D is basically won (87%)—but it’s being used to ship faster, not learn better (84% prioritising speed), scaling “more of the same” at pace.
AI tutors risk a “pedagogy of passivity”: emerging evidence suggests tutoring bots can reduce cognitive friction and pull learners down the ICAP spectrum—away from interactive/constructive learning toward efficient consumption.
Singapore + India are building what the West lacks: they’re treating AI as national learning infrastructure—for resilience (Singapore) and access + language inclusion (India)—while Western systems remain fragmented and reactive.
Agentic AI is the next pivot: early signs show a shift from AI as a content engine to AI as a learning partner—with UConn using agents to remove barriers so learners can participate more fully in shared learning.
Moodle’s AI stance sends two big signals: the traditional learning ecosystem is fragmenting, and the concept of “user sovereignty” over AI is emerging.
For Cogniti to be taken seriously, it needs to be woven into the structure of your unit and its delivery, both in class and on Canvas, rather than left on the side. This article shares practical strategies for implementing Cogniti in your teaching so that students:
understand the context and purpose of the agent,
know how to interact with it effectively,
perceive its value as a learning tool over any other available AI chatbots, and
engage in reflection and feedback.
In this post, we share four strategies to help introduce and integrate Cogniti in your teaching so that students understand their context, interact effectively, and see their value as customised learning companions.
Collection: Teaching with Custom AI Chatbots — from teaching.virginia.edu; via Derek Bruff
The default behaviors of popular AI chatbots don’t always align with our teaching goals. This collection explores approaches to designing AI chatbots for particular pedagogical purposes.
While it’s true that Nano Banana generates better infographics than other AI models, the conversation has so far massively under-sold what’s actually different and valuable about this tool for those of us who design learning experiences.
What this means for our workflow:
Instead of the traditional “commission → wait → tweak → approve → repeat” cycle, Nano Banana enables an iterative, rapid-cycle design process where you can:
Sketch an idea and see it refined in minutes.
Test multiple visual metaphors for the same concept without re-briefing a designer.
Build 10-image storyboards with perfect consistency by specifying the constraints once, not manually editing each frame.
Implement evidence-based strategies (contrasting cases, worked examples, observational learning) that are usually too labour-intensive to produce at scale.
This shift—from “image generation as decoration” to “image generation as instructional scaffolding”—is what makes Nano Banana uniquely useful for the 10 evidence-based strategies below.
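As a rough illustration of “specifying the constraints once”, here is a tiny prompt-templating sketch: one fixed style block is reused for every storyboard frame, so consistency comes from the template rather than from manual editing. The style string and frame descriptions are made up, and no image-generation API is called:

```python
# Hypothetical sketch: a consistent multi-frame storyboard prompt set.
# The shared style constraints are stated once and reused for every frame.

STYLE = "flat vector illustration, consistent character design, warm palette"

def storyboard_prompts(frames, style=STYLE):
    """Combine one fixed style block with per-frame content descriptions."""
    return [f"Frame {i + 1}: {content}. Style: {style}"
            for i, content in enumerate(frames)]

frames = ["learner opens the app", "learner attempts a quiz", "feedback appears"]
prompts = storyboard_prompts(frames)
# Every prompt ends with the same style constraint, keeping frames consistent.
```

The same pattern scales to a 10-image storyboard: change the style string once and every frame inherits the change.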
So this year, I’ve been exploring new ways that AI can help support students with disabilities—students on IEPs, learning plans, or 504s—and, honestly, it’s changing the way I think about differentiation in general.
As a quick note, a lot of what I’m finding applies just as well to English language learners or really to any students. One of the big ideas behind Universal Design for Learning (UDL) is that accommodations and strategies designed for students with disabilities are often just good teaching practices. When we plan instruction that’s accessible to the widest possible range of learners, everyone benefits. For example, UDL encourages explaining things in multiple modes—written, visual, auditory, kinesthetic—because people access information differently. I hear students say they’re “visual learners,” but I think everyone is a visual learner, and an auditory learner, and a kinesthetic learner. The more ways we present information, the more likely it is to stick.
So, with that in mind, here are four ways I’ve been using AI to differentiate instruction for students with disabilities (and, really, everyone else too):
What I’ve tried to do is bring together genuinely useful AI tools that I know are already making a difference.
For colleagues wanting to explore further, I’m sharing the list exactly as it appears in the table, including website links, grouped by category below. Please do check it out, as along with links to all of the resources, I’ve also written a brief summary explaining what each of the different tools do and how they can help.
Last week, I wrapped up Dr Philippa Hardman’s intensive bootcamp on AI in learning design. Four conversations, countless iterations, and more than a few humbling moments later – here’s what I am left thinking about.
An aside: Google is working on a new vision for textbooks that can be easily differentiated, building on the success of NotebookLM. You can get on the waiting list for that tool by going to LearnYourWay.withgoogle.com.
… Nano Banana Pro
Sticking with the Google tools for now, Nano Banana Pro (which you can use for free in Google AI Studio) is doing something that everyone has been waiting a long time for: it adds correct text to images.
The simple act of remembering is the crux of how we navigate the world: it shapes our experiences, informs our decisions, and helps us anticipate what comes next. For AI agents like Comet Assistant, that continuity leads to a more powerful, personalized experience.
Today we are announcing new personalization features to remember your preferences, interests, and conversations. Perplexity now synthesizes them automatically like memory, for valuable context on relevant tasks. Answers are smarter, faster, and more personalized, no matter how you work.
From DSC : This should be important as we look at learning-related applications for AI.
For the last three days, my Substack has been in the top “Rising in Education” list. I realize this is based on a hugely flawed metric, but it still feels good.
I just completed nearly 60,000 miles of travel across Europe, Asia, and the Middle East, meeting with hundreds of companies to discuss their AI strategies. While every company’s maturity is different, one thing is clear: AI as a business tool has arrived; it’s real and the use cases are growing.
A new survey by Wharton shows that 46% of business leaders use Gen AI daily and 80% use it weekly. And among these users, 72% are measuring ROI and 74% report a positive return. HR, by the way, is the #3 department in use cases, only slightly behind IT and Finance.
What are companies getting out of all this? Productivity. The #1 use case, by far, is what we call “stage 1” usage – individual productivity.
From DSC: Josh writes: “Many of our large clients are now implementing AI-native learning systems and seeing 30-40% reduction in staff with vast improvements in workforce enablement.”
While I get the appeal (and ROI) from management’s and shareholders’ perspective, this represents a growing concern for employment and people’s ability to earn a living.
And while I highly respect Josh and his work through the years, I disagree that we’re over the problems with AI and how people are using it:
Two years ago the NYT was trying to frighten us with stories of AI acting as a romance partner. Well, those stories are over, and thanks to a trillion dollars (literally) of capital investment in infrastructure, engineering, and power plants, this stuff is reasonably safe.
Those stories are just beginning…they’re not close to being over.
So let’s imagine a world where there’s no separation between learning and assessment: it’s ongoing. There’s always assessment, always learning, and they’re tied together. Then we can ask: what is the role of the human in that world? What is it that AI can’t do?
…
Imagine something like that in higher ed. There could be tutoring or skill-based work happening outside of class, and then relationship-based work happening inside of class, whether online, in person, or some hybrid mix.
The aspects of learning that don’t require relational context could be handled by AI, while the human parts remain intact. For example, I teach strategy and strategic management. I teach people how to talk with one another about the operation and function of a business. I can help students learn to be open to new ideas, recognize when someone pushes back out of fear of losing power, or draw from my own experience in leading a business and making future-oriented decisions.
But the technical parts, such as frameworks like SWOT analysis and the mechanics of comparing alternative viewpoints in a boardroom, could be managed through simulations or reports that receive immediate feedback from AI. The relational aspects, the human mentoring, would still happen with me as their instructor.
My take is this: in all of the anxiety lies a crucial and long-overdue opportunity to deliver better learning experiences. Precisely because Atlas perceives the same context in the same moment as you, it can transform learning into a process aligned with core neuroscientific principles—including active retrieval, guided attention, adaptive feedback and context-dependent memory formation.
Perhaps in Atlas we have a browser that for the first time isn’t just a portal to information, but one which can become a co-participant in active cognitive engagement—enabling iterative practice, reflective thinking, and real-time scaffolding as you move through challenges and ideas online.
With this in mind, I put together 10 use cases for Atlas for you to try for yourself.
…
6. Retrieval Practice
What: Pulling information from memory drives retention better than re-reading. Why: Practice testing delivers medium-to-large effects (Adesope et al., 2017). Try: Open a document with your previous notes. Ask Atlas for a mixed activity set: “Quiz me on the Krebs cycle—give me a near-miss, high-stretch MCQ, then a fill-in-the-blank, then ask me to explain it to a teen.” Atlas uses its browser memory to generate targeted questions from your actual study materials, supporting spaced, varied retrieval.
From DSC: A quick comment. I appreciate these ideas and approaches from Katarzyna and Rita. I do think that someone will need to make sure the AI models/platforms/tools are given up-to-date information and instructions — i.e., any new procedures, steps to take, etc. Perhaps I’m missing the boat here, but an internal AI platform is going to need ongoing access to current information and instructions.
From DSC: Stephen has some solid reflections and asks some excellent questions in this posting, including:
The question is: how do we optimize an AI to support learning? Will one model be enough? Or do we need different models for different learners in different scenarios?
A More Human University: The Role of AI in Learning — from er.educause.edu by Robert Placido
Far from heralding the collapse of higher education, artificial intelligence offers a transformative opportunity to scale meaningful, individualized learning experiences across diverse classrooms.
The narrative surrounding artificial intelligence (AI) in higher education is often grim. We hear dire predictions of an “impending collapse,” fueled by fears of rampant cheating, the erosion of critical thinking, and the obsolescence of the human educator. This dystopian view, however, is a failure of imagination. It mistakes the death rattle of an outdated pedagogical model for the death of learning itself. The truth is far more hopeful: AI is not an asteroid coming for higher education. It is a catalyst that can finally empower us to solve our oldest, most intractable problem: the inability to scale deep, engaged, and truly personalized learning.
Increasing the rate of scientific progress is a core part of Anthropic’s public benefit mission.
We are focused on building the tools to allow researchers to make new discoveries – and eventually, to allow AI models to make these discoveries autonomously.
Until recently, scientists typically used Claude for individual tasks, like writing code for statistical analysis or summarizing papers. Pharmaceutical companies and others in industry also use it for tasks across the rest of their business, like sales, to fund new research. Now, our goal is to make Claude capable of supporting the entire process, from early discovery through to translation and commercialization.
To do this, we’re rolling out several improvements that aim to make Claude a better partner for those who work in the life sciences, including researchers, clinical coordinators, and regulatory affairs managers.
AI as an access tool for neurodiverse and international staff — from timeshighereducation.com by Vanessa Mar-Molinero
Used transparently and ethically, GenAI can level the playing field and lower the cognitive load of repetitive tasks for admin staff, student support and teachers.
Where AI helps without cutting academic corners
When framed as accessibility and quality enhancement, AI can support staff to complete standard tasks with less friction. However, while it supports clarity, consistency and inclusion, generative AI (GenAI) does not replace disciplinary expertise, ethical judgement or the teacher–student relationship. These are ways it can be put to effective use:
The Sleep of Liberal Arts Produces AI — from aiedusimplified.substack.com by Lance Eaton, Ph.D.
A keynote at the AI and the Liberal Arts Symposium conference
This past weekend, I had the honor of being the keynote speaker at a really fantastic conference, the AI and the Liberal Arts Symposium at Connecticut College. I had shared a bit about this before in my interview with Lori Looney. It was an incredible conference, thoughtfully composed, with a lot of things to chew on and think about.
It was also an entirely brand new talk in a slightly different context from many of my other talks and workshops. It was something I had to build entirely from the ground up. It reminded me in some ways of last year’s “What If GenAI Is a Nothingburger”.
It was a real challenge and one I’ve been working on and off for months, trying to figure out the right balance. It’s a work I feel proud of because of the balancing act I try to navigate. So, as always, it’s here for others to read and engage with. And, of course, here is the slide deck as well (with CC license).
Combining two strategies—spacing and retrieval practice—is key to success in learning, says Shana Carpenter.
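As a toy illustration of combining the two strategies, here is a sketch of an expanding review schedule for spaced retrieval practice. The starting gap and growth factor are illustrative choices, not values drawn from Carpenter’s research:

```python
# Toy sketch of an expanding review schedule for spaced retrieval practice.
# The first gap and growth factor below are illustrative assumptions.

def review_days(n_reviews, first_gap=1, factor=2):
    """Return cumulative day offsets for n_reviews with expanding gaps."""
    days, gap, day = [], first_gap, 0
    for _ in range(n_reviews):
        day += gap
        days.append(day)
        gap *= factor
    return days

# With first_gap=1 and factor=2, reviews land on days 1, 3, 7, and 15:
print(review_days(4))
```

Each scheduled review would then be a retrieval attempt (a quiz or free recall), so the spacing and the testing effect work together rather than in isolation.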
On a somewhat related note (i.e., for Instructional Designers, teachers, faculty members, T&L staff members), also see:
Fresh Approaches to Instructional Design — from edutopia.org by Sara Furnival
An educator with 20-plus years of experience shares ideas for crafting creative and energizing lessons.
ChatGPT: the world’s most influential teacher — from drphilippahardman.substack.com by Dr. Philippa Hardman; emphasis DSC
New research shows that millions of us are “learning with AI” every week: what does this mean for how (and how well) humans learn?
This week, an important piece of research landed that confirms the gravity of AI’s role in the learning process. The TLDR is that learning is now a mainstream use case for ChatGPT; around 10.2% of all ChatGPT messages (that’s ~2BN messages sent by over 7 million users per week) are requests for help with learning.
The research shows that about 10.2% of all messages are tutoring/teaching, and within the “Practical Guidance” category, tutoring is 36%. “Asking” interactions are growing faster than “Doing” and are rated higher quality by users. Younger people contribute a huge share of messages, and growth is fastest in low- and middle-income countries (How People Use ChatGPT, 2025).
If AI is already acting as a global tutor, the question isn’t “will people learn with AI?”—they already are. The real question we need to ask is: what does great learning actually look like, and how should AI evolve to support it? That’s where decades of learning science help us separate “feels like learning” from “actually gaining new knowledge and skills”.
Millions of college students around the world are getting ready to start classes. To help make the school year even better, we’re making our most advanced AI tools available to them for free, including our new Guided Learning mode. We’re also providing $1 billion to support AI education and job training programs and research in the U.S. This includes making our AI and career training free for every college student in America through our AI for Education Accelerator — over 100 colleges and universities have already signed up.
… Guided Learning: from answers to understanding
AI can broaden knowledge and expand access to it in powerful ways, helping anyone, anywhere learn anything in the way that works best for them. It’s not about just getting an answer, but deepening understanding and building critical thinking skills along the way. That opportunity is why we built Guided Learning, a new mode in Gemini that acts as a learning companion guiding you with questions and step-by-step support instead of just giving you the answer. We worked closely with students, educators, researchers and learning experts to make sure it’s helpful for understanding new concepts and is backed by learning science.
What is Study Mode?
Study Mode is OpenAI’s take on a smarter study partner – a version of the ChatGPT experience designed to guide users through problems with Socratic prompts, scaffolded reasoning, and adaptive feedback (instead of just handing over the answer).
Built with input from learning scientists, pedagogy experts, and educators, it was also shaped by direct feedback from college students. While Study Mode is designed with college students in mind, it’s meant for anyone who wants a more learning-focused, hands-on experience across a wide range of subjects and skill levels.
Who can access it? And how?
Starting July 29, Study Mode is available to users on Free, Plus, Pro, and Team plans. It will roll out to ChatGPT Edu users in the coming weeks.
ChatGPT became your tutor — from theneurondaily.com by Grant Harvey
PLUS: NotebookLM has video now & GPT-4o-level AI runs on laptop
Here’s how it works: instead of asking “What’s 2+2?” and getting “4,” study mode asks questions like “What do you think happens when you add these numbers?” and “Can you walk me through your thinking?” It’s like having a patient tutor who won’t let you off the hook that easily.
The key features include:
Socratic questioning: It guides you with hints and follow-up questions rather than direct answers.
Scaffolded responses: Information broken into digestible chunks that build on each other.
Personalized support: Adjusts difficulty based on your skill level and previous conversations.
Knowledge checks: Built-in quizzes and feedback to make sure concepts actually stick.
Toggle flexibility: Switch study mode on and off mid-conversation depending on your goals.
Try study mode yourself by selecting “Study and learn” from tools in ChatGPT and asking a question.
Introducing study mode — from openai.com
A new way to learn in ChatGPT that offers step-by-step guidance instead of quick answers.
[On 7/29/25, we introduced] study mode in ChatGPT—a learning experience that helps you work through problems step by step instead of just getting an answer. Starting today, it’s available to logged in users on Free, Plus, Pro, Team, with availability in ChatGPT Edu coming in the next few weeks.
ChatGPT is becoming one of the most widely used learning tools in the world. Students turn to it to work through challenging homework problems, prepare for exams, and explore new concepts. But its use in education has also raised an important question: how do we ensure it is used to support real learning, and doesn’t just offer solutions without helping students make sense of them?
We’ve built study mode to help answer this question. When students engage with study mode, they’re met with guiding questions that calibrate responses to their objective and skill level to help them build deeper understanding. Study mode is designed to be engaging and interactive, and to help students learn something—not just finish something.