AI Tutors: Hype or Hope for Education? — from educationnext.org by John Bailey and John Warner
In a new book, Sal Khan touts the potential of artificial intelligence to address lagging student achievement. Our authors weigh in.

In Salman Khan’s new book, Brave New Words: How AI Will Revolutionize Education (and Why That’s a Good Thing) (Viking, 2024), the Khan Academy founder predicts that AI will transform education by providing every student with a virtual personalized tutor at an affordable cost. Is Khan right? Is radically improved achievement for all students within reach at last? If so, what sorts of changes should we expect to see, and when? If not, what will hold back the AI revolution that Khan foresees? John Bailey, a visiting fellow at the American Enterprise Institute, endorses Khan’s vision and explains the profound impact that AI technology is already making in education. John Warner, a columnist for the Chicago Tribune and former editor for McSweeney’s Internet Tendency, makes the case that all the hype about AI tutoring is, as Macbeth quips, full of sound and fury, signifying nothing.

 
 

2024: The State of Generative AI in the Enterprise — from menlovc.com (Menlo Ventures)
The enterprise AI landscape is being rewritten in real time. As pilots give way to production, we surveyed 600 U.S. enterprise IT decision-makers to reveal the emerging winners and losers.

This spike in spending reflects a wave of organizational optimism; 72% of decision-makers anticipate broader adoption of generative AI tools in the near future. This confidence isn’t just speculative—generative AI tools are already deeply embedded in the daily work of professionals, from programmers to healthcare providers.

Despite this positive outlook and increasing investment, many decision-makers are still figuring out what will and won’t work for their businesses. More than a third of our survey respondents do not have a clear vision for how generative AI will be implemented across their organizations. This doesn’t mean they’re investing without direction; it simply underscores that we’re still in the early stages of a large-scale transformation. Enterprise leaders are just beginning to grasp the profound impact generative AI will have on their organizations.


Business spending on AI surged 500% this year to $13.8 billion, says Menlo Ventures — from cnbc.com by Hayden Field

Key Points

  • Business spending on generative AI surged 500% this year, hitting $13.8 billion — up from just $2.3 billion in 2023, according to data from Menlo Ventures released Wednesday.
  • OpenAI ceded market share in enterprise AI, declining from 50% to 34%, per the report.
  • Amazon-backed Anthropic doubled its market share from 12% to 24%.

Microsoft quietly assembles the largest AI agent ecosystem—and no one else is close — from venturebeat.com by Matt Marshall

Microsoft has quietly built the largest enterprise AI agent ecosystem, with over 100,000 organizations creating or editing AI agents through its Copilot Studio since launch – a milestone that positions the company ahead in one of enterprise tech’s most closely watched and exciting segments.

The rapid adoption comes as Microsoft significantly expands its agent capabilities. At its Ignite conference [that started on 11/19/24], the company announced it will allow enterprises to use any of the 1,800 large language models (LLMs) in the Azure catalog within these agents – a significant move beyond its exclusive reliance on OpenAI’s models. The company also unveiled autonomous agents that can work independently, detecting events and orchestrating complex workflows with minimal human oversight.


Now Hear This: World’s Most Flexible Sound Machine Debuts — from blogs.nvidia.com
Using text and audio as inputs, a new generative AI model from NVIDIA can create any combination of music, voices and sounds.

Along these lines, also see:


AI Agents Versus Human Agency: 4 Ways To Navigate Our AI-Driven World — from forbes.com by Cornelia C. Walther

To understand the implications of AI agents, it’s useful to clarify the distinctions between AI, generative AI, and AI agents and explore the opportunities and risks they present to our autonomy, relationships, and decision-making.

AI Agents: These are specialized applications of AI designed to perform tasks or simulate interactions. AI agents can be categorized into:

    • Tool Agents…
    • Simulation Agents…

While generative AI creates outputs from prompts, AI agents use AI to act with intention, whether to assist (tool agents) or emulate (simulation agents). The latter’s ability to mirror human thought and action offers fascinating possibilities — and raises significant risks.

 

Skill-Based Training: Embrace the Benefits; Stay Wary of the Hype — from learningguild.com by Paige Yousey

1. Direct job relevance
One of the biggest draws of skill-based training is its direct relevance to employees’ daily roles. By focusing on teaching job-specific skills, this approach helps workers feel immediately empowered to apply what they learn, leading to a quick payoff for both the individual and the organization. Yet, while this tight focus is a major benefit, it’s important to consider some potential drawbacks that could arise from an overly narrow approach.

Be wary of:

  • Overly Narrow Focus: Highly specialized training might leave employees with little room to apply their skills to broader challenges, limiting versatility and growth potential.
  • Risk of Obsolescence: Skills can quickly become outdated, especially in fast-evolving industries. L&D leaders should aim for regular updates to maintain relevance.
  • Neglect of Soft Skills: While technical skills are crucial, ignoring soft skills like communication and problem-solving may lead to a lack of balanced competency.

2. Enhanced job performance…
3. Addresses skill gaps…

…and several more areas to consider


Another item from Paige Yousey

5 Key EdTech Innovations to Watch — from learningguild.com by Paige Yousey

AI-driven course design

Strengths

  • Content creation and updates: AI streamlines the creation of training materials by identifying resource gaps and generating tailored content, while also refreshing existing materials based on industry trends and employee feedback to maintain relevance.
  • Data-driven insights: AI tools provide valuable analytics that inform course development and instructional strategies, helping learning designers identify effective practices and improve overall learning outcomes.
  • Efficiency: Automating repetitive tasks, such as learner assessments and administrative duties, enables L&D professionals to concentrate on developing impactful training programs and fostering learner engagement.

Concerns

  • Limited understanding of context: AI may struggle to understand the specific educational context or the unique needs of diverse learner populations, potentially hindering effectiveness.
  • Oversimplification of learning: AI may reduce complex educational concepts to simple metrics or algorithms, oversimplifying the learning process and neglecting deeper cognitive development.
  • Resistance to change: Learning leaders may face resistance from staff who are skeptical about integrating AI into their training practices.

Also from the Learning Guild, see:

Use Twine to Easily Create Engaging, Immersive Scenario-Based Learning — from learningguild.com by Bill Brandon

Scenario-based learning immerses learners in realistic scenarios that mimic real-world challenges they might face in their roles. These learning experiences are highly relevant and relatable. SBL is active learning. Instead of passively consuming information, learners actively engage with the content by making decisions and solving problems within the scenario. This approach enhances critical thinking and decision-making skills.

SBL can be more effective when storytelling techniques create a narrative that guides learners through the scenario to maintain engagement and make the learning memorable. Learners receive immediate feedback on their decisions and learn from their mistakes, and reflection can deepen their understanding. Branching scenarios simulate complex decision-making and show the outcomes of different actions, with each learner choice leading the scenario down a different path.
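As a rough illustration of what a branching scenario boils down to under the hood, here is a minimal sketch in plain Python rather than Twine (which expresses the same structure as visually linked passages); the scenario text, choice labels, and node names below are invented for illustration only.

```python
# Minimal branching-scenario sketch: each node carries a bit of narrative,
# the choices available to the learner, and where each choice leads.
scenario = {
    "start": {
        "text": "A client emails that a deliverable is late and they are upset.",
        "choices": {
            "Apologize and send a revised timeline": "recover",
            "Forward the email to your manager without replying": "escalate",
        },
    },
    "recover": {
        "text": "The client appreciates the transparency and agrees to the new date.",
        "choices": {},
    },
    "escalate": {
        "text": "The client feels ignored and raises the issue with your director.",
        "choices": {},
    },
}

def run(node_id="start"):
    """Walk the scenario from a node, prompting the learner at each branch."""
    node = scenario[node_id]
    print(node["text"])
    if not node["choices"]:
        return  # end of this branch; the outcome text is the feedback
    options = list(node["choices"].items())
    for i, (label, _) in enumerate(options, start=1):
        print(f"  {i}. {label}")
    picked = int(input("Choose a number: ")) - 1
    run(options[picked][1])

if __name__ == "__main__":
    run()
```

Twine builds the equivalent structure visually and exports it as playable HTML, which is why it lowers the barrier for instructional designers who would rather not maintain code like this by hand.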

Embrace the Future: Why L&D Leaders Should Prioritize AI Digital Literacy — from learningguild.com by Dr. Erica McCaig

The role of L&D leaders in AI digital literacy
For L&D leaders, developing AI digital literacy within an organization requires a well-structured curriculum and development plan that equips employees with the knowledge, skills, and ethical grounding needed to thrive in an AI-augmented workplace. This curriculum should encompass a range of competencies that enhance technical understanding and foster a mindset ready for innovation and responsible use of AI. Key areas to focus on include:

  • Understanding AI Fundamentals: …
  • Proficiency with AI Tools: …
  • Ethical Considerations: …
  • Cultivating Critical Thinking: …
 

How to use NotebookLM for personalized knowledge synthesis — from ai-supremacy.com by Michael Spencer and Alex McFarland
Two powerful workflows that unlock everything else. Intro: Golden Age of AI Tools and AI agent frameworks begins in 2025.

What is Google’s Learn About?
Google’s new AI tool, Learn About, is designed as a conversational learning companion that adapts to individual learning needs and curiosity. It allows users to explore various topics by entering questions, uploading images or documents, or selecting from curated topics. The tool aims to provide personalized responses tailored to the user’s knowledge level, making it user-friendly and engaging for learners of all ages.

Is Generative AI leading to a new take on Educational technology? It certainly appears promising heading into 2025.

The Learn About tool utilizes the LearnLM AI model, which is grounded in educational research and focuses on how people learn. Google insists that unlike traditional chatbots, it emphasizes interactive and visual elements in its responses, enhancing the educational experience. For instance, when asked about complex topics like the size of the universe, Learn About not only provides factual information but also includes related content, vocabulary building tools, and contextual explanations to deepen understanding.

 

What DICE does in this posting will be available 24x7x365 in the future [Christian]

From DSC:
First of all, when you look at the following posting:


What Top Tech Skills Should You Learn for 2025? — from dice.com by Nick Kolakowski


…you will see that they outline which skills you should consider mastering in 2025 if you want to stay on top of the latest career opportunities. They then provide more information about each skill, how to apply it, and WHERE to get those skills.

I assert that in the future, people will be able to see this information on a 24x7x365 basis.

  • Which jobs are in demand?
  • What skills do I need to do those jobs?
  • WHERE do I get/develop those skills?


And that last part (about WHERE to develop those skills) will pull from many different institutions, people, companies, etc.

BUT PEOPLE are the key! Oftentimes, we need to — and prefer to — learn with others!


 

A Code-Red Leadership Crisis: A Wake-Up Call for Talent Development — from learningguild.com by Dr. Arika Pierce Williams

This company’s experience offers three crucial lessons for other organizational leaders who may be contemplating cutting or reducing talent development investments in their 2025 budgets to focus on “growth.”

  1. Leadership development isn’t a luxury – it’s a strategic imperative…
  2. Succession planning must be an ongoing process, not a reactive measure…
  3. The cost of developing leaders is far less than the cost of not having them when you need them most…

Also from The Learning Guild, see:

5 Key EdTech Innovations to Watch — from learningguild.com by Paige Yousey

  1. AI-driven course design
  2. Hyper-personalized content curation
  3. Immersive scenario-based training
  4. Smart chatbots
  5. Wearable devices
 

“The Value of Doing Things: What AI Agents Mean for Teachers” — from nickpotkalitsky.substack.com by guest author Jason Gulya, Professor of English and Applied Media at Berkeley College in New York City

AI Agents make me nervous. Really nervous.

I wish they didn’t.

I wish I could write that the last two years have made me more confident, more self-assured that AI is here to augment workers rather than replace them.

But I can’t.

I wish I could write that I know where schools and colleges will end up. I wish I could say that AI Agents will help us get where we need to be.

But I can’t.

At this point, today, I’m at a loss. I’m not sure where the rise of AI agents will take us, in terms of how we work and learn. I’m in the question-asking part of my journey. I have few answers.

So, let’s talk about where (I think) AI Agents will take education. And who knows? Maybe as I write I’ll come up with something more concrete.

It’s worth a shot, right?

From DSC: 
I completely agree with Jason’s following assertion:

A good portion of AI advancement will come down to employee replacement. And AI Agents push companies towards that. 

THAT’s where/what the ROI will be for corporations. They will recoup their investments in the headcount area, and likely in other areas as well (product design, marketing campaigns, engineering-related items, and more). But how long it takes to get there is a big question mark.

One last quote here…it’s too good not to include:

Behind these questions lies a more abstract, more philosophical one: what is the relationship between thinking and doing in a world of AI Agents and other kinds of automation?


How Good are Claude, ChatGPT & Gemini at Instructional Design? — from drphilippahardman.substack.com by Dr Philippa Hardman
A test of AI’s Instructional Design skills in theory & in practice

By examining models across three AI families—Claude, ChatGPT, and Gemini—I’ve started to identify each model’s strengths, limitations, and typical pitfalls.

Spoiler: my findings underscore that until we have specialised, fine-tuned AI copilots for instructional design, we should be cautious about relying on general-purpose models and ensure expert oversight in all ID tasks.


From DSC — I’m going to (have Nick) say this again:
I simply asked my students to use AI to brainstorm their own learning objectives. No restrictions. No predetermined pathways. Just pure exploration. The results? Astonishing.

Students began mapping out research directions I’d never considered. They created dialogue spaces with AI that looked more like intellectual partnerships than simple query-response patterns. 


The Digital Literacy Quest: Become an AI Hero — from gamma.app

From DSC:
I have not gone through all of these online-based materials, but I like what they are trying to get at:

  • Confidence with AI
    Students gain practical skills and confidence in using AI tools effectively.
  • Ethical Navigation
    Learn to navigate the ethical landscape of AI with integrity and responsibility. Make informed decisions about AI usage.
  • Mastering Essential Skills
    Develop critical thinking and problem-solving skills in the context of AI.

 


Expanding access to the Gemini app for teen students in education — from workspaceupdates.googleblog.com

Google Workspace for Education admins can now turn on the Gemini app with added data protection as an additional service for their teen users (ages 13+ or the applicable age in your country) in the following languages and countries. With added data protection, chats are not reviewed by human reviewers or otherwise used to improve AI models. The Gemini app will be a core service in the coming weeks for Education Standard and Plus users, including teens.


5 Essential Questions Educators Have About AI  — from edsurge.com by Annie Ning

Recently, I spoke with several teachers regarding their primary questions and reflections on using AI in teaching and learning. Their thought-provoking responses challenge us to consider not only what AI can do but what it means for meaningful and equitable learning environments. Keeping these reflections in mind, we can better understand how to move forward toward meaningful AI integration in education.


FrontierMath: A Benchmark for Evaluating Advanced Mathematical Reasoning in AI — from epoch.ai
FrontierMath presents hundreds of unpublished, expert-level mathematics problems that specialists spend days solving. It offers an ongoing measure of AI progress in complex mathematical reasoning.

We’re introducing FrontierMath, a benchmark of hundreds of original, expert-crafted mathematics problems designed to evaluate advanced reasoning capabilities in AI systems. These problems span major branches of modern mathematics—from computational number theory to abstract algebraic geometry—and typically require hours or days for expert mathematicians to solve.


Rising demand for AI courses in UK universities shows 453% growth as students adapt to an AI-driven job market — from edtechinnovationhub.com

The demand for artificial intelligence courses in UK universities has surged dramatically over the past five years, with enrollments increasing by 453%, according to a recent study by Currys, a UK tech retailer.

The study, which analyzed UK university admissions data and surveyed current students and recent graduates, reveals how the growing influence of AI is shaping students’ educational choices and career paths.

This growth reflects the broader trend of AI integration across industries, creating new opportunities while transforming traditional roles. With AI’s influence on career prospects rising, students and graduates are increasingly drawn to AI-related courses to stay competitive in a rapidly changing job market.

 

It’s The End Of The Legal Industry As We Know It — from artificiallawyer.com by Richard Tromans

It’s the end of the legal industry as we know it and I feel fine. I really do.

The legal industry as we know it is already over. The seismic event that triggered this evolutionary shift happened in November 2022. There’s no going back to a pre-genAI world. Change, incremental or otherwise, will be unstoppable. The only question is: at what pace will this change happen?

It’s clear that substantive change at the heart of the legal economy may take a long time – and we should never underestimate the challenge of overturning decades of deeply embedded cultural practices – but at least it has begun.


AI: The New Legal Powerhouse — Why Lawyers Should Befriend The Machine To Stay Ahead — from today.westlaw.com

(October 24, 2024) – Jeremy Glaser and Sharzaad Borna of Mintz discuss waves of change in the legal profession brought on by AI, in areas such as billing, the work of support staff and junior associates, and ethics.

The dual nature of AI — excitement and fear
AI is evolving at lightning speed, sparking both wonder and worry. As it transforms industries and our daily lives, we are caught between the thrill of innovation and the jitters of uncertainty. Will AI elevate the human experience or just leave us in the dust? How will it impact our careers, privacy and sense of security?

Just as we witnessed with the rise of the internet — and later, social media — AI is poised to redefine how we work and live, bringing a mix of optimism and apprehension. While we grapple with AI’s implications, our clients expect us to lead the charge in leveraging it for their benefit.

However, this shift also means more competition for fewer entry-level jobs. Law schools will play a key role in helping students become more marketable by offering courses on AI tools and technology. Graduates with AI literacy will have an edge over their peers, as firms increasingly value associates who can collaborate effectively with AI tools.


Will YOU use ChatGPT voice mode to lie to your family? Brainyacts #244 — from thebrainyacts.beehiiv.com by Sam Douthit, Aristotle Jones, and Derek Warzel.

Small Law’s Secret Weapon: AI Courtroom Mock Battles — this excerpt is by Brainyacts author Josh Kubicki
As many of you know, this semester my law students have the opportunity to write the lead memo for this newsletter, each tackling issues that they believe are both timely and intriguing for our readers. This week’s essay presents a fascinating experiment conducted by three students who explored how small law firms might leverage ChatGPT in a safe, effective manner. They set up ChatGPT to simulate a mock courtroom, even assigning it the persona of a Seventh Circuit Court judge to stage a courtroom dialogue. It’s an insightful take on how adaptable technology like ChatGPT can offer unique advantages to smaller practices. They share other ideas as well. Enjoy!

The following excerpt was written by Sam Douthit, Aristotle Jones, and Derek Warzel.

One exciting example is a “Courtroom Persona AI” tool, which could let solo practitioners simulate mock trials and practice arguments with AI that mimics specific judges, local courtroom customs, or procedural quirks. Small firms, with their deep understanding of local courts and judicial styles, could take full advantage of this tool to prepare more accurate and relevant arguments. Unlike big firms that have to spread resources across jurisdictions, solo and small firms could use this AI-driven feedback to tailor their strategies closely to local court dynamics, making their preparations sharper and more strategic. Plus, not all solo or small firms have someone to practice with or bounce their ideas off of. For these practitioners, it’s a chance to level up their trial preparation without needing large teams or costly mock trials, gaining a practical edge where it counts most.

Some lawyers have already started to test this out, like the mock trial tested out here. One oversimplified and quick way to try this out is using the ChatGPT app.
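For readers who want to try the same idea outside the ChatGPT app, here is a minimal sketch using OpenAI’s Python SDK; the model name, the wording of the judge persona, and the simple turn-by-turn loop are illustrative assumptions, not the students’ actual setup.

```python
# Minimal "Courtroom Persona AI" sketch: a chat loop in which the model is
# asked to role-play an appellate judge hearing a mock oral argument.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

judge_persona = (
    "You are role-playing a Seventh Circuit Court of Appeals judge presiding "
    "over a mock oral argument. Ask pointed questions about jurisdiction, "
    "standard of review, and controlling precedent. Keep each response brief."
)

messages = [{"role": "system", "content": judge_persona}]

print("Mock courtroom session. Type 'quit' to end.")
while True:
    argument = input("\nCounsel: ")
    if argument.strip().lower() == "quit":
        break
    messages.append({"role": "user", "content": argument})
    response = client.chat.completions.create(
        model="gpt-4o",  # any capable chat model could be substituted
        messages=messages,
    )
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    print(f"\nJudge: {reply}")
```

A fuller version would front-load the case facts, the procedural posture, and perhaps excerpts from the briefs into the system prompt, and would keep the usual confidentiality caveats in mind before pasting real client material into any third-party tool.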


The Human in AI-Assisted Dispute Resolution — from jdsupra.com by Epiq

Accountability for Legal Outputs
AI is set to replace some of the dispute resolution work formerly done by lawyers. This work includes summarising documents, drafting legal contracts and filings, using generative AI to produce arbitration submissions for an oral hearing, and, in the not-too-distant future, ingesting transcripts from hearings and comparing them to the documentary record to spot inconsistencies.

As Pendell put it, “There’s quite a bit of lawyering going on there.” So, what’s left for humans?

The common feature in all those examples is that humans must make the judgement call. Lawyers won’t just turn over a first draft of an AI-generated contract or filing to another party or court. The driving factor is that law is still a regulated profession, and regulators will hold humans accountable.

The idea that young lawyers must do routine, menial work as a rite of passage needs to be updated. Today’s AI tools put lawyers at the top of an accountability chain, allowing them to practice law using judgement and strategy as they supervise the work of AI. 


Small law firms embracing AI as they move away from hourly billing — from legalfutures.co.uk by Neil Rose

Small law firms have embraced artificial intelligence (AI), with document drafting or automation the most popular application, according to new research.

The survey also found expectations of a continued move away from hourly billing to fixed fees.

Legal technology provider Clio commissioned UK-specific research from Censuswide as an adjunct to its annual US-focused Legal Trends report, polling 500 solicitors, 82% of whom worked at firms with 20 lawyers or fewer.

Some 96% of them reported that their firms have adopted AI into their processes in some way – 56% of them said it was widespread or universal – while 62% anticipated an increase in AI usage over the next 12 months.

 

Is Generative AI and ChatGPT healthy for Students? — from ai-supremacy.com by Michael Spencer and Nick Potkalitsky
Beyond Text Generation: How AI Ignites Student Discovery and Deep Thinking, according to firsthand experiences of Teachers and AI researchers like Nick Potkalitsky.

After two years of intensive experimentation with AI in education, I am witnessing something amazing unfolding before my eyes. While much of the world fixates on AI’s generative capabilities—its ability to create essays, stories, and code—my students have discovered something far more powerful: exploratory AI, a dynamic partner in investigation and critique that’s transforming how they think.

They’ve moved beyond the initial fascination with AI-generated content to something far more sophisticated: using AI as an exploratory tool for investigation, interrogation, and intellectual discovery.

Instead of the much-feared “shutdown” of critical thinking, we’re witnessing something extraordinary: the emergence of what I call “generative thinking”—a dynamic process where students learn to expand, reshape, and evolve their ideas through meaningful exploration with AI tools. Here I consciously reposition the term “generative” as a process of human origination, although one ultimately spurred on by machine input.


A Road Map for Leveraging AI at a Smaller Institution — from er.educause.edu by Dave Weil and Jill Forrester
Smaller institutions and others may not have the staffing and resources needed to explore and take advantage of developments in artificial intelligence (AI) on their campuses. This article provides a roadmap to help institutions with more limited resources advance AI use on their campuses.

The following activities can help smaller institutions better understand AI and lay a solid foundation that will allow them to benefit from it.

  1. Understand the impact…
  2. Understand the different types of AI tools…
  3. Focus on institutional data and knowledge repositories…

Smaller institutions do not need to fear being left behind in the wake of rapid advancements in AI technologies and tools. By thinking intentionally about how AI will impact the institution, becoming familiar with the different types of AI tools, and establishing a strong data and analytics infrastructure, institutions can establish the groundwork for AI success. The five fundamental activities of coordinating, learning, planning and governing, implementing, and reviewing and refining can help smaller institutions make progress on their journey to use AI tools to gain efficiencies and improve students’ experiences and outcomes while keeping true to their institutional missions and values.

Also from Educause, see:


AI school opens – learners are not good or bad but fast and slow — from donaldclarkplanb.blogspot.com by Donald Clark

That is what they are doing here. Lesson plans focus on learners rather than the traditional teacher-centric model. Assessing prior strengths and weaknesses, personalising to focus more on weaknesses and less on things already known or mastered. It’s adaptive, personalised learning. The idea that everyone should learn at exactly the same pace, within the same timescale, is slightly ridiculous; it is driven by the need to timetable a one-to-many classroom model.

For the first time in the history of our species we have technology that performs some of the tasks of teaching. We have reached a pivot point where this can be tried and tested. My feeling is that we’ll see a lot more of this, as parents and general teachers can delegate a lot of the exposition and teaching of the subject to the technology. We may just see a breakthrough that transforms education.


Agentic AI Named Top Tech Trend for 2025 — from campustechnology.com by David Ramel

Agentic AI will be the top tech trend for 2025, according to research firm Gartner. The term describes autonomous machine “agents” that move beyond query-and-response generative chatbots to do enterprise-related tasks without human guidance.

More realistic challenges that the firm has listed elsewhere include:

    • Agentic AI proliferating without governance or tracking;
    • Agentic AI making decisions that are not trustworthy;
    • Agentic AI relying on low-quality data;
    • Employee resistance; and
    • Agentic-AI-driven cyberattacks enabling “smart malware.”

Also from campustechnology.com, see:


Three items from edcircuit.com:


All or nothing at Educause24 — from onedtech.philhillaa.com by Kevin Kelly
Looking for specific solutions at the conference exhibit hall, with an educator focus

Here are some notable trends:

  • Alignment with campus policies: …
  • Choose your own AI adventure: …
  • Integrate AI throughout a workflow: …
  • Moving from prompt engineering to bot building: …
  • More complex problem-solving: …


Not all AI news is good news. In particular, AI has exacerbated the problem of fraudulent enrollment – i.e., rogue actors who use fake or stolen identities to steal financial aid funding, with no intention of completing coursework.

The consequences are very real, including financial aid funding going to criminal enterprises, enrollment estimates getting dramatically skewed, and legitimate students being blocked from registering for classes that appear “full” due to large numbers of fraudulent enrollments.


 

 

How Legal Education Must Evolve In The Age Of AI: Insights From An In-House Legal Innovator — from abovethelaw.com by Olga Mack
Traditional legal education has remained largely unchanged for decades, focusing heavily on theoretical knowledge and case law analysis.

As we stand on the brink of a new era defined by artificial intelligence (AI) and data-driven decision-making, the question arises: How should legal education adapt to prepare the next generation of lawyers for the challenges ahead?

Here are three unconventional, actionable insights from our conversation that highlight the need for a radical rethinking of legal education.

  1. Integrate AI Education Into Every Aspect Of Legal Training…
  2. Adopt A ‘Technology-Agnostic’ Approach To AI Training…
  3. Redefine Success In Legal Education To Include Technological Proficiency…
 



Google’s worst nightmare just became reality — from aidisruptor.ai by Alex McFarland
OpenAI just launched an all-out assault on traditional search engines.

Google’s worst nightmare just became reality. OpenAI didn’t just add search to ChatGPT – they’ve launched an all-out assault on traditional search engines.

It’s the beginning of the end for search as we know it.

Let’s be clear about what’s happening: OpenAI is fundamentally changing how we’ll interact with information online. While Google has spent 25 years optimizing for ad revenue and delivering pages of blue links, OpenAI is building what users actually need – instant, synthesized answers from current sources.

The rollout is calculated and aggressive: ChatGPT Plus and Team subscribers get immediate access, followed by Enterprise and Education users in weeks, and free users in the coming months. This staged approach is about systematically dismantling Google’s search dominance.




Open for AI: India Tech Leaders Build AI Factories for Economic Transformation — from blogs.nvidia.com
Yotta Data Services, Tata Communications, E2E Networks and Netweb are among the providers building and offering NVIDIA-accelerated infrastructure and software, with deployments expected to double by year’s end.


 

AI Tutors Double Rates of Learning in Less Learning Time — from drphilippahardman.substack.com by Dr. Philippa Hardman
Inside Harvard’s new groundbreaking study

Conclusion
This Harvard study provides robust evidence that AI tutoring, when thoughtfully designed, can significantly enhance learning outcomes. The combination of doubled learning gains, increased engagement, and reduced time to competency suggests we’re seeing just the beginning of AI’s potential in education and that its potential is significant.

If this data is anything to go by, and if we – as humans – are open and willing to act on it, it’s possible AI will have a significant, and for some deeply positive, impact on how we design and deliver learning experiences.

That said, as we look forward, the question shouldn’t just be “how can AI enhance current educational methods?” but also “how might AI transform the very nature of learning itself?” With continued research and careful implementation, we could be moving toward an era of education that’s not only more effective but also more accessible than ever before.


Three Quick Examples of Teaching with and about Generative AI — from derekbruff.org by Derek Bruff

  • Text-to-Podcast.
  • Assigning Students to Groups.
  • AI Acceptable Use Scale.

Also from Derek’s blog, see:


From Mike Sharples on LinkedIn: 


ChatGPT’s free voice wizard — from wondertools.substack.com by Jeremy Caplan
How and why to try the new Advanced Voice Mode

7 surprisingly practical ways to use voice AI
Opening up ChatGPT’s Advanced Voice Mode (AVM) is like conjuring a tutor eager to help with whatever simple — or crazy — query you throw at it. Talking is more fluid and engaging than typing, especially if you’re out and about. It’s not a substitute for human expertise, but AVM provides valuable machine intelligence.

  • Get a virtual museum tour. …
  • Chat with historical figures….
  • Practice languages. …
  • Explore books. …
  • Others…


Though not AI-related, this is along the lines of edtech:


…which links to:

 

Half of Higher Ed Institutions Now Use AI for Outcomes Tracking, But Most Lag in Implementing Comprehensive Learner Records — from prnewswire.com; via GSV

SALT LAKE CITY, Oct. 22, 2024 /PRNewswire/ — Instructure, the leading learning ecosystem, and UPCEA, the online and professional education association, announced the results of a survey on whether institutions are leveraging AI to improve learner outcomes and manage records, along with the specific ways these tools are being utilized. Overall, the study revealed that interest in the potential of these technologies is far outpacing adoption. Most respondents are heavily involved in developing learner experiences and tracking outcomes, though nearly half report their institutions have yet to adopt AI-driven tools for these purposes. The research also found that only three percent of institutions have implemented Comprehensive Learner Records (CLRs), which provide a complete overview of an individual’s lifelong learning experiences.


New Survey Says U.S. Teachers Colleges Lag on AI Training. Here are 4 Takeaways — from the74million.org; via GSV
Most preservice teachers’ programs lack policies on using AI, CRPE finds, and are likely unready to teach future educators about the field.

In the nearly two years since generative artificial intelligence burst into public consciousness, U.S. schools of education have not kept pace with the rapid changes in the field, a new report suggests.

Only a handful of teacher training programs are moving quickly enough to equip new K-12 teachers with a grasp of AI fundamentals — and fewer still are helping future teachers grapple with larger issues of ethics and what students need to know to thrive in an economy dominated by the technology.

The report, from the Center on Reinventing Public Education, a think tank at Arizona State University, tapped leaders at more than 500 U.S. education schools, asking how their faculty and preservice teachers are learning about AI. Through surveys and interviews, researchers found that just one in four institutions now incorporates training on innovative teaching methods that use AI. Most lack policies on using AI tools, suggesting that they probably won’t be ready to teach future educators about the intricacies of the field anytime soon.



The 5 Secret Hats Teachers are Wearing Right Now (Thanks to AI!) — from aliciabankhofer.substack.com by Alicia Bankhofer
New, unanticipated roles for educators sitting in the same boat

As beta testers, we’re shaping the tools of tomorrow. As researchers, we’re pioneering new pedagogical approaches. As ethical guardians, we’re ensuring that AI enhances rather than compromises the educational experience. As curators, we’re guiding students through the wealth of information AI provides. And as learners ourselves, we’re staying at the forefront of educational innovation.


 

 


Articulate AI & the “Buttonification” of Instructional Design — from drphilippahardman.substack.com by Dr. Philippa Hardman
A new trend in AI-UX, and its implications for Instructional Design

1. Using AI to Scale Exceptional Instructional Design Practice
Imagine a buttonification system that doesn’t just automate tasks, but scales best practices in instructional design:

  • Evidence-Based Design Button…
  • Learner-Centered Objectives Generator…
  • Engagement Optimiser…

2. Surfacing AI’s Instructional Design Thinking
Instead of hiding AI’s decision-making process, what if we built an AI system which invites instructional designers to probe, question, and learn from an expert-trained AI?

  • Explain This Design…
  • Show Me Alternatives…
  • Challenge My Assumptions…
  • Learning Science Insights…

By reimagining the role of AI in this way, we would…


Recapping OpenAI’s Education Forum — from marcwatkins.substack.com by Marc Watkins

OpenAI’s Education Forum was eye-opening for a number of reasons, but the one that stood out the most was Leah Belsky acknowledging what many of us in education had known for nearly two years—the majority of the active weekly users of ChatGPT are students. OpenAI has internal analytics that track upticks in usage during the fall and then drop-offs in the spring. Later that evening, OpenAI’s new CFO, Sarah Friar, further drove the point home with an anecdote about usage in the Philippines jumping nearly 90% at the start of the school year.

I had hoped to gain greater insight into OpenAI’s business model and how it related to education, but the Forum left me with more questions than answers. What app has the majority of users active 8 to 9 months out of the year and dormant for the holidays and summer breaks? What business model gives away free access and only converts 1 out of every 20-25 users to paid users? These were the initial thoughts that I hoped the Forum would address. But those questions, along with some deeper and arguably more critical ones, were skimmed over to drive home the main message of the Forum—Universities have to rapidly adopt AI and become AI-enabled institutions.


Off-Loading in the Age of Generative AI — from insidehighered.com by James DeVaney

As we embrace these technologies, we must also consider the experiences we need to discover and maintain our connections—and our humanity. In a world increasingly shaped by AI, I find myself asking: What are the experiences that define us, and how do they influence the relationships we build, both professionally and personally?

This concept of “off-loading” has become central to my thinking. In simple terms, off-loading is the act of delegating tasks to AI that we would otherwise do ourselves. As AI systems advance, we’re increasingly confronted with a question: Which tasks should we off-load to AI?

 
© 2024 | Daniel Christian