The California State University system has partnered with OpenAI to launch the largest deployment of AI in higher education to date.
The CSU system, which serves nearly 500,000 students across 23 campuses, has announced plans to integrate ChatGPT Edu, an education-focused version of OpenAI’s chatbot, into its curriculum and operations. The rollout, which includes tens of thousands of faculty and staff, represents the most significant AI deployment within a single educational institution globally.
“We’re still in the early stages of AI adoption in education, and it is critical that the entire ecosystem—education systems, technologists, educators, and governments—work together to ensure that all students globally have access to AI and develop the skills to use it responsibly.”
— Leah Belsky, VP and general manager of education at OpenAI
As you read through these use cases, you’ll notice that each one addresses multiple tasks from our list above.
1. Researching a topic for a lesson
2. Creating tasks for practice
3. Creating sample answers
4. Generating ideas
5. Designing lesson plans
6. Creating tests
7. Using AI in virtual classrooms
8. Creating images
9. Creating worksheets
10. Correcting work and giving feedback
When technologist Luis von Ahn was building the popular language-learning platform Duolingo, he faced a big problem: Could an app designed to teach you something ever compete with addictive platforms like Instagram and TikTok? He explains how Duolingo harnesses the psychological techniques of social media and mobile games to get you excited to learn — all while spreading access to education across the world.
1. A Focus On Client Experience And Technology-Driven Client Services
2. Evolution Of Pricing Models In Legal Services
3. Cloud Computing, Remote Work, Globalization And Cross-Border Legal Services
4. Legal Analytics And Data-Driven Decision Making
5. Automation Of Routine Legal Tasks
6. Integration Of Artificial Intelligence
7. AI In Mergers And Acquisitions
8. Cybersecurity And Data Privacy
This guide explores the top legal tech jobs in demand, key skills for success, hiring trends, and future predictions for the legal industry. Whether you’re a lawyer, law student, IT professional, or business leader, this article will help you navigate the shifting terrain of legal tech careers.
… Top Legal Tech Hiring Trends for 2025
1. Law Firms Are Prioritizing Tech Skills
Over 65% of law firms are hiring legal tech experts over traditional attorneys.
AI implementation, automation, and analytics skills are now must-haves.
2. In-House Legal Teams Are Expanding Legal Tech Roles
77% of corporate legal teams say tech expertise is now mandatory.
More companies are investing in contract automation and legal AI tools.
3. Law Schools Are Adding Legal Tech Courses
Institutions like Harvard and Stanford now offer AI and legal tech curriculums.
Graduates with legal tech skills gain a competitive advantage.
Contract reviews and negotiations are the bread-and-butter work of many corporate lawyers, but artificial intelligence (AI) promises to transform every aspect of the legal profession. Legaltech start-up Ivo, which is today announcing a $16 million Series A funding round, wants to make manual contract work a thing of the past.
“We help in-house legal teams to red-line and negotiate contract agreements more quickly and easily,” explains Min-Kyu Jung, CEO and co-founder of Ivo. “It’s a challenge that couldn’t be solved well by AI until relatively recently, but the evolution of generative AI has made it possible.”
The most revolutionary aspect of DeepSeek for education isn’t just its cost—it’s the combination of open-source accessibility and local deployment capabilities. As Azeem Azhar notes, “R-1 is open-source. Anyone can download and run it on their own hardware. I have R1-8b (the second smallest model) running on my Mac Mini at home.”
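To make the local-deployment point concrete, here is a minimal sketch (not Azhar’s setup, just an illustration) of querying a distilled R1 model running entirely on your own machine. It assumes Ollama is installed and the deepseek-r1:8b tag has been pulled; Ollama serves an OpenAI-compatible API on localhost, so the standard Python client works unchanged.

```python
# Minimal sketch: chatting with a locally hosted DeepSeek R1 distilled model.
# Assumes Ollama is running and the model has been pulled beforehand:
#   ollama pull deepseek-r1:8b
# Ollama exposes an OpenAI-compatible endpoint on localhost:11434.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local server; nothing leaves the machine
    api_key="ollama",                      # placeholder; Ollama ignores the key
)

response = client.chat.completions.create(
    model="deepseek-r1:8b",  # the distilled 8B model mentioned above
    messages=[
        {"role": "system", "content": "You are a patient tutor for high-school algebra."},
        {"role": "user", "content": "Explain why dividing by zero is undefined."},
    ],
)

print(response.choices[0].message.content)
```

The point of the sketch is simply that the same chat interface works whether the model is hosted by a vendor or on a Mac Mini under a teacher’s desk, which is what makes the open-source release interesting for cost- and privacy-sensitive schools.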
…
Real-time Learning Enhancement
AI tutoring networks that collaborate to optimize individual learning paths
Immediate, multi-perspective feedback on student work
Continuous assessment and curriculum adaptation
The question isn’t whether this technology will transform education—it’s how quickly institutions can adapt to a world where advanced AI capabilities are finally within reach of every classroom.
I know through your feedback on my social media and blog posts that several of you have legitimate concerns about the impact of AI in education, especially concerns related to data privacy, academic dishonesty, AI dependence, loss of creativity and critical thinking, and plagiarism, to mention a few. While these concerns are valid and deserve careful consideration, it’s also important to explore the potential benefits AI can bring when used thoughtfully.
Tools such as ChatGPT and Claude are like smart research assistants that are available 24/7 to support you with all kinds of tasks, from drafting detailed lesson plans, creating differentiated materials, and generating classroom activities to summarizing and simplifying complex topics. Likewise, students can use them to enhance their learning by, for instance, brainstorming ideas for research projects, generating constructive feedback on assignments, practicing problem-solving in a guided way, and much more.
The point here is that AI is here to stay and expand, and we had better learn how to use it thoughtfully and responsibly rather than avoid it out of fear or skepticism.
As part of our updates to the Edtech Insiders Generative AI Map, we’re excited to release a new mini market map and article deep dive on Generative AI tools that are specifically designed for Instructional Materials use cases.
In our database, the Instructional Materials use case category encompasses tools that:
Assist educators by streamlining lesson planning, curriculum development, and content customization
Enable educators or students to transform materials into alternative formats, such as videos, podcasts, or other interactive media, in addition to leveraging gaming principles or immersive VR to enhance engagement
Empower educators or students to transform text, video, slides or other source material into study aids like study guides, flashcards, practice tests, or graphic organizers
Engage students through interactive lessons featuring historical figures, authors, or fictional characters
Customize curriculum to individual needs or pedagogical approaches
Empower educators or students to quickly create online learning assets and courses
That’s why, today, the question I’m asking is: How best can we proactively guide AI’s use in higher education and shape its impact on our students, faculty and institution? The answer to that broad, strategic question lies in pursuing four objectives that, I believe, are relevant for many colleges and universities.
Learning to use business software is different from learning to think. But if the software is sufficiently complex, how different is it really? What if AI’s primary impact on education isn’t in the classroom, but rather shifting the locus of learning to outside the classroom?
Instead of sitting in a classroom listening to a teacher, high school and college students could be assigned real work and learn from that work. Students could be matched with employers or specific projects provided by or derived from employers, then do the work on the same software used in the enterprise. As AI-powered digital adoption platforms (DAPs) become increasingly powerful, they have the potential to transform real or simulated work into educational best practice for students only a few years away from seeking full-time employment.
If DAPs take us in this direction, four implications come to mind….
In this week’s blog post, I share a summary of five recent studies on the impact of Gen AI on learning to bring you right up to date.
… Implications for Educators and Developers
For Educators:
Combine ChatGPT with Structured Activities: …
Use ChatGPT as a Supplement, Not a Replacement:…
Promote Self-Reflection and Evaluation:
For Developers:
Reimagine AI for Reflection-First Design: …
Develop Tools that Foster Critical Thinking: …
Integrate Adaptive Support: …
Assessing the GenAI process, not the output — from timeshighereducation.com by Paul McDermott, Leoni Palmer, and Rosemary Norton
A framework for building AI literacy in a literature-review-type assessment
In this resource, we outline our advice for implementing an approach that opens AI use up to our students through a strategy of assessing the process rather than outputs.
To start with, we recommend identifying learning outcomes for your students that can be achieved in collaboration with AI.
What’s New: The Updated Edtech Insiders Generative AI Map — from edtechinsiders.substack.com by Sarah Morin, Alex Sarlin, and Ben Kornell
A major expansion on our previously released market map, use case database, and AI tool company directory.
Tutorial: 4 Ways to Use LearnLM as a Professor — from automatedteach.com by Graham Clay
Create better assessments, improve instructions and feedback, and tutor your students with this fine-tuned version of Gemini.
I cover how to use LearnLM:
to create sophisticated assessments that promote learning
to develop clearer and more effective assignment instructions
to provide more constructive feedback on student work, and
to support student learning through guided tutoring
With that out of the way, I prefer Claude.ai for writing. For larger projects like a book, create a Claude Project to keep all context in one place.
Copy [the following] prompts into a document
Use them in sequence as you write
Adjust the word counts and specifics as needed
Keep your responses for reference
Use the same prompt template for similar sections to maintain consistency
Each prompt builds on the previous one, creating a systematic approach to helping you write your book.
Using NotebookLM to Boost College Reading Comprehension — from michellekassorla.substack.com by Michelle Kassorla and Eugenia Novokshanova
This semester, we are using NotebookLM to help our students comprehend and engage with scholarly texts.
We were looking hard for a new tool when Google released NotebookLM. Not only does Google allow unfettered use of this amazing tool, it is also a much better tool for the work we require in our courses. So, this semester, we have scrapped our “old” tools and added NotebookLM as the primary tool for our English Composition II courses (and we hope, fervently, that Google won’t decide to severely limit its free tier before this semester ends!)
If you know next-to-nothing about NotebookLM, that’s OK. What follows is the specific lesson we present to our students. We hope this will help you understand all you need to know about NotebookLM, and how to successfully integrate the tool into your own teaching this semester.
After two years of working closely with leadership in multiple institutions, and delivering countless workshops, I’ve seen one thing repeatedly: the biggest challenge isn’t the technology itself, but how we lead through it. Here is some of my best advice to help you navigate generative AI with clarity and confidence:
Break your own AI policies before you implement them. …
Fund your failures. …
Resist the pilot program. …
Host Anti-Tech Tech Talks …
…+ several more tips
While generative AI in higher education obviously involves new technology, it’s much more about adopting a curious and human-centric approach in your institution and communities. It’s about empowering learners in new, human-oriented and innovative ways. It is, in a nutshell, about people adapting to new ways of doing things.
Maria Anderson responded to Clay’s posting with this idea:
Here’s an idea: […] the teacher can use the [most advanced] AI tool to generate a complete solution to “the problem” — whatever that is — and demonstrate how to do that in class. Give all the students access to the document with the results.
And then grade the students on a comprehensive follow-up activity/presentation of executing that solution (no notes, no more than 10 words on a slide). So the students all have access to the same deep AI result, but have to show they comprehend and can iterate on that result.
In this age of distrust, misinformation, and skepticism, you may wonder how to demonstrate your sources within a Google Document. Did you type it yourself, copy and paste it from a browser-based source, copy and paste it from an unknown source, or did it come from generative AI?
You may not think this is an important clarification, but if writing is a critical part of your livelihood or life, you will definitely want to demonstrate your sources.
That’s where the new Grammarly feature comes in.
The new feature is called Authorship, and according to Grammarly, “Grammarly Authorship is a set of features that helps users demonstrate their sources of text in a Google doc. When you activate Authorship within Google Docs, it proactively tracks the writing process as you write.”
AI Agents Are Coming to Higher Education — from govtech.com
AI agents are customizable tools with more decision-making power than chatbots. They have the potential to automate more tasks, and some schools have implemented them for administrative and educational purposes.
Custom GPTs are on the rise in education. Google’s version, Gemini Gems, includes a premade version called Learning Coach, and Microsoft announced last week a new agent addition to Copilot featuring use cases at educational institutions.
Generative Artificial Intelligence and Education: A Brief Ethical Reflection on Autonomy — from er.educause.edu by Vicki Strunk and James Willis
Given the widespread impacts of generative AI, looking at this technology through the lens of autonomy can help equip students for the workplaces of the present and of the future, while ensuring academic integrity for both students and instructors.
The principle of autonomy stresses that we should be free agents who can govern ourselves and who are able to make our own choices. This principle applies to AI in higher education because it raises serious questions about how, when, and whether AI should be used in varying contexts. Although we have only begun asking questions related to autonomy and many more remain to be asked, we hope that this serves as a starting place to consider the uses of AI in higher education.
I had the privilege of moderating a discussion between Josh Eyler and Robert Cummings about the future of AI in education at the University of Mississippi’s recent AI Winter Institute for Teachers. I work alongside both in faculty development here at the University of Mississippi. Josh’s position on AI sparked a great deal of debate on social media:
…
To make my position clear about the current AI in education discourse I want to highlight several things under an umbrella of “it’s very complicated.”
Most importantly, we all deserve some grace here. Dealing with generative AI in education isn’t something any of us asked for. It isn’t normal. It isn’t fixable by purchasing a tool or telling faculty to simply ‘prefer not to’ use AI. It is and will remain unavoidable for virtually every discipline taught at our institutions.
If one good thing happens because of generative AI let it be that it helps us clearly see how truly complicated our existing relationships with machines are now. As painful as this moment is, it might be what we need to help prepare us for a future where machines that mimic reasoning and human emotion refuse to be ignored.
“AI tutoring shows stunning results.” See the article below.
Learning gains were striking
The learning improvements were striking—about 0.3 standard deviations. To put this into perspective, this is equivalent to nearly two years of typical learning in just six weeks. When we compared these results to a database of education interventions studied through randomized controlled trials in the developing world, our program outperformed 80% of them, including some of the most cost-effective strategies like structured pedagogy and teaching at the right level. This achievement is particularly remarkable given the short duration of the program and the likelihood that our evaluation design underestimated the true impact.
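For readers less familiar with effect sizes, a quick, hedged sketch of the arithmetic behind that claim: the 0.3 figure is a standardized mean difference, and converting it into “years of learning” requires a benchmark for typical annual gains that the excerpt does not state, so the conversion below uses an assumed range purely for illustration.

```latex
% Standardized effect size: difference in group means over the pooled standard deviation
d = \frac{\bar{x}_{\mathrm{treatment}} - \bar{x}_{\mathrm{control}}}{s_{\mathrm{pooled}}} \approx 0.3

% Illustrative conversion only, assuming typical annual gains of roughly 0.15--0.2 SD
% (an assumed benchmark, not taken from the article):
\frac{0.3~\mathrm{SD}}{0.15\text{--}0.2~\mathrm{SD/year}} \approx 1.5\text{--}2~\text{years of typical learning}
```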
… Our evaluation demonstrates the transformative potential of generative AI in classrooms, especially in developing contexts. To our knowledge, this is the first study to assess the impact of generative AI as a virtual tutor in such settings, building on promising evidence from other contexts and formats; for example, on AI in coding classes, AI and learning in one school in Turkey, teaching math with AI (an example through WhatsApp in Ghana), and AI as a homework tutor.
Why it matters: This represents one of the first rigorous studies showing major real-world impacts in a developing nation. The key appears to be using AI as a complement to teachers rather than a replacement — and results suggest that AI tutoring could help address the global learning crisis, particularly in regions with teacher shortages.
Other items re: AI in our learning ecosystems:
Will AI revolutionise marking? — from timeshighereducation.com by Rohim Mohammed
Artificial intelligence has the potential to improve speed, consistency and detail in feedback for educators grading students’ assignments, writes Rohim Mohammed. Here he lists the pros and cons based on his experience.
Personal AI — from michelleweise.substack.com by Dr. Michelle Weise
“Personalized” Doesn’t Have To Be a Buzzword
Today, however, is a different kind of moment. GenAI is now rapidly evolving to the point where we may be able to imagine a new way forward. We can begin to imagine solutions truly tailored for each of us as individuals, our own personal AI (pAI). pAI could unify various silos of information to construct far richer and more holistic and dynamic views of ourselves as long-life learners. A pAI could become our own personal career navigator, skills coach, and storytelling agent. Three particular areas emerge when we think about tapping into the richness of our own data:
Today we’re rolling out a beta version of tasks—a new way to ask ChatGPT to do things for you at a future time.
Whether it’s one-time reminders or recurring actions, tell ChatGPT what you need and when, and it will automatically take care of it.
OpenAI is launching a new beta feature in ChatGPT called Tasks that lets users schedule future actions and reminders.
The feature, which is rolling out to Plus, Team, and Pro subscribers starting today, is an attempt to make the chatbot into something closer to a traditional digital assistant — think Google Assistant or Siri but with ChatGPT’s more advanced language capabilities.
The Rundown: OpenAI is rolling out Tasks, a new ChatGPT beta feature that allows users to schedule reminders and recurring actions, marking the company’s first step into agentic AI capabilities.
… Why it matters: While reminders aren’t groundbreaking, Tasks lays the groundwork for incorporating agentic abilities into ChatGPT, which will likely gain value once integrated with other features like tool or computer use. With ‘Operator’ also rumored to be coming this month, all signs are pointing towards 2025 being the year of the AI agent.
J.P. Morgan Healthcare Conference—NVIDIA today announced new partnerships to transform the $10 trillion healthcare and life sciences industry by accelerating drug discovery, enhancing genomic research and pioneering advanced healthcare services with agentic and generative AI.
The convergence of AI, accelerated computing and biological data is turning healthcare into the largest technology industry. Healthcare leaders IQVIA, Illumina and Mayo Clinic, as well as Arc Institute, are using the latest NVIDIA technologies to develop solutions that will help advance human health.
These solutions include AI agents that can speed clinical trials by reducing administrative burden, AI models that learn from biology instruments to advance drug discovery and digital pathology, and physical AI robots for surgery, patient monitoring and operations. AI agents, AI instruments and AI robots will help address the $3 trillion of operations dedicated to supporting industry growth and create an AI factory opportunity in the hundreds of billions of dollars.
True progress in transforming health care will require solutions across the political, scientific and medical sectors. But new forms of artificial intelligence have the potential to help. Innovators are racing to deploy AI technologies to make health care more effective, equitable and humane.
AI could spot cancer early, design lifesaving drugs, assist doctors in surgery and even peer into people’s futures to predict and prevent disease. The potential to help people live longer, healthier lives is vast. But physicians and researchers must overcome a legion of challenges to harness AI’s potential.
The U.S. Department of Health and Human Services has issued its HHS Artificial Intelligence Strategic Plan, which the agency says will “set in motion a coordinated public-private approach to improving the quality, safety, efficiency, accessibility, equitability and outcomes in health and human services through the innovative, safe, and responsible use of AI.”
How Journalism Will Adapt in the Age of AI — from bloomberg.com by John Micklethwait
The news business is facing its next enormous challenge. Here are eight reasons to be both optimistic and paranoid.
AI promises to get under the hood of our industry — to change the way we write and edit stories. It will challenge us, just like it is challenging other knowledge workers like lawyers, scriptwriters and accountants.
…
Most journalists love AI when it helps them uncover Iranian oil smuggling. Investigative journalism is not hard to sell to a newsroom. The second example is a little harder. Over the past month we have started testing AI-driven summaries for some longer stories on the Bloomberg Terminal.
The software reads the story and produces three bullet points. Customers like it — they can quickly see what any story is about. Journalists are more suspicious. Reporters worry that people will just read the summary rather than their story.
…
So, looking into our laboratory, what do I think will happen in the Age of AI? Here are eight predictions.
Nvidia CEO Jensen Huang’s recent statement that “IT will become the HR of AI agents” continues to spark debate about IT’s evolving role in managing AI systems. As AI tools become integral, IT teams will take on tasks like training and optimising AI agents, blending technical and HR responsibilities. So, how should organisations respond to this transformation?
An Arizona charter school will use AI instead of human teachers for two hours a day on academic lessons.
The AI will customize lessons in real-time to match each student’s needs.
The company has only tested this idea at private schools before but claims it hugely increases student academic success.
One school in Arizona is trying out a new educational model built around AI and a two-hour school day. When Arizona’s Unbound Academy opens, the only teachers will be artificial intelligence algorithms, a perfect utopia or dystopia depending on your point of view.
In order to encourage and facilitate debate on key controversies related to AI, I put together this free 130+ page guide to the main arguments and ideas related to the controversies.
Universities need to step up their AGI game — from futureofbeinghuman.com by Andrew Maynard
As Sam Altman and others push toward a future where AI changes everything, universities need to decide if they’re going to be leaders or bystanders in helping society navigate advanced AI transitions.
And because of this, I think there’s a unique opportunity for universities (research universities in particular) to up their game and play a leadership role in navigating the coming advanced AI transition.
Of course, there are already a number of respected university-based initiatives that are working on parts of the challenge. Stanford HAI (Human-centered Artificial Intelligence) is one that stands out, as does the Leverhulme Centre for the Future of Intelligence at the University of Cambridge, and the Center for Governance of AI at the University of Oxford. But these and other initiatives are barely scratching the surface of what is needed to help successfully navigate advanced AI transitions.
If universities are to be leaders rather than bystanders in ensuring human flourishing in an age of AI, there’s an urgent need for bolder and more creative forward-looking initiatives that support research, teaching, thought leadership, and knowledge mobilization, at the intersection of advanced AI and all aspects of what it means to thrive and grow as a species.
Ever since a revolutionary new chatbot, ChatGPT, became widely available in late 2022, educators have faced several complex challenges as they learn how to navigate artificial intelligence systems.
…
Education Week produced a significant amount of coverage in 2024 exploring these and other critical questions involving the understanding and use of AI.
Here are the five most popular stories that Education Week published in 2024 about AI in schools.
Dr. Lodge said there are five key areas the higher education sector needs to address to adapt to the use of AI:
1. Teach ‘people’ skills as well as tech skills
2. Help all students use new tech
3. Prepare students for the jobs of the future
4. Learn to make sense of complex information
5. Universities to lead the tech change
Checking the Pulse: The Impact of AI on Everyday Lives
So, what exactly did our users have to say about how AI transformed their lives this year?
Top 2024 Developments in AI
Video Generation…
AI Employees…
Open Source Advancements…
Getting ready for 2025: your AI team members (Gift lesson 3/3) — from flexos.com by Daan van Rossum
And that’s why today, I’ll tell you exactly which AI tools I’ve recommended for the top 5 use cases to almost 200 business leaders who took the Lead with AI course.
1. Email Management: Simplifying Communication with AI
Microsoft Copilot for Outlook. …
Gemini AI for Gmail. …
Grammarly. …
2. Meeting Management: Maximize Your Time
Otter.ai. …
Copilot for Microsoft Teams. …
Other AI Meeting Assistants: Zoom AI Companion, Granola, and Fathom
3. Research: Streamlining Information Gathering
ChatGPT. …
Perplexity. …
Consensus. …
…plus several more items and tools that were mentioned by Daan.
“I mean, that’s what I’ll always want for my own children and, frankly, for anyone’s children,” Khan said. “And the hope here is that we can use artificial intelligence and other technologies to amplify what a teacher can do so they can spend more time standing next to a student, figuring them out, having a person-to-person connection.”
…
“After a week you start to realize, like, how you can use it,” Brockman said. “That’s been one of the really important things about working with Sal and his team, to really figure out what’s the right way to sort of bring this to parents and to teachers and to classrooms and to do that in a way…so that the students really learn and aren’t just, you know, asking for the answers and that the parents can have oversight and the teachers can be involved in that process.”
More than 100 colleges and high schools are turning to a new AI tool called Nectir, allowing teachers to create a personalized learning partner that’s trained on their syllabi, textbooks, and assignments to help students with anything from questions related to their coursework to essay writing assistance and even future career guidance.
…
With Nectir, teachers can create an AI assistant tailored to their specific needs, whether for a single class, a department, or the entire campus. There are various personalization options available, enabling teachers to establish clear boundaries for the AI’s interactions, such as programming the assistant to assist only with certain subjects or responding in a way that aligns with their teaching style.
“It’ll really be that customized learning partner. Every single conversation that a student has with any of their assistants will then be fed into that student profile for them to be able to see based on what the AI thinks, what should I be doing next, not only in my educational journey, but in my career journey,” Ghai said.
How Will AI Influence Higher Ed in 2025? — from insidehighered.com by Kathryn Palmer
No one knows for sure, but Inside Higher Ed asked seven experts for their predictions.
As the technology continues to evolve at a rapid pace, no one knows for sure how AI will influence higher education in 2025. But several experts offered Inside Higher Ed their predictions—and some guidance—for how colleges and universities will have to navigate AI’s potential in the new year.
In the short term, A.I. will help teachers create lesson plans, find illustrative examples and generate quizzes tailored to each student. Customized problem sets will serve as tools to combat cheating while A.I. provides instant feedback.
…
In the longer term, it’s possible to imagine a world where A.I. can ingest rich learner data and create personalized learning paths for students, all within a curriculum established by the teacher. Teachers can continue to be deeply involved in fostering student discussions, guiding group projects and engaging their students, while A.I. handles grading and uses the Socratic method to help students discover answers on their own. Teachers provide encouragement and one-on-one support when needed, using their newfound availability to give students some extra care.
Let’s be clear: A.I. will never replace the human touch that is so vital to education. No algorithm can replicate the empathy, creativity and passion a teacher brings to the classroom. But A.I. can certainly amplify those qualities. It can be our co-pilot, our chief of staff helping us extend our reach and improve our effectiveness.
Today, I want to reflect on two recent OpenAI developments that highlight this evolution: their belated publication of advice for students on integrating AI into writing workflows, and last week’s launch of the full o1 Pro version. When OpenAI released their student writing guide, there were plenty of snarky comments about how this guidance arrives almost a year after they thoroughly disrupted the educational landscape. Fair enough – I took my own side swipes initially. But let’s look at what they’re actually advising, because the details matter more than the timing.
Tutoring programs exploded in the last five years as states and school districts searched for ways to counter plummeting achievement during COVID. But the cost of providing supplemental instruction to tens of millions of students can be eye-watering, even as the results seem to taper off as programs serve more students.
That’s where artificial intelligence could prove a decisive advantage. A report circulated in October by the National Student Support Accelerator found that an AI-powered tutoring assistant significantly improved the performance of hundreds of tutors by prompting them with new ways to explain concepts to students. With the help of the tool, dubbed Tutor CoPilot, students assigned to the weakest tutors began posting academic results nearly equal to those assigned to the strongest. And the cost to run the program was just $20 per pupil.
Faculty must have the time and support necessary to come to terms with this new technology and that requires us to change how we view professional development in higher education and K-12. We cannot treat generative AI as a one-off problem that can be solved by a workshop, an invited talk, or a course policy discussion. Generative AI in education has to be viewed as a continuum. Faculty need a myriad of support options each semester:
Course buyouts
Fellowships
Learning communities
Reading groups
AI Institutes and workshops
Funding to explore the scholarship of teaching and learning around generative AI
Education leaders should focus on integrating AI literacy, civic education, and work-based learning to equip students for future challenges and opportunities.
Building social capital and personalized learning environments will be crucial for student success in a world increasingly influenced by AI and decentralized power structures.
In this episode of My EdTech Life, I had the pleasure of interviewing Mike Kentz and Nick Potkalitsky, PhD, to discuss their new book, AI in Education: The K-12 Roadmap to Teacher-Led Transformation. We dive into the transformative power of AI in education, exploring its potential for personalization, its impact on traditional teaching practices, and the critical need for teacher-driven experimentation.
Striking a Balance: Navigating the Ethical Dilemmas of AI in Higher Education — from er.educause.edu by Katalin Wargo and Brier Anderson
Navigating the complexities of artificial intelligence (AI) while upholding ethical standards requires a balanced approach that considers the benefits and risks of AI adoption.
As artificial intelligence (AI) continues to transform the world—including higher education—the need for responsible use has never been more critical. While AI holds immense potential to enhance teaching and learning, ethical considerations around social inequity, environmental concerns, and dehumanization continue to emerge. College and university centers for teaching and learning (CTLs), tasked with supporting faculty in best instructional practices, face growing pressure to take a balanced approach to adopting new technologies. This challenge is compounded by an unpredictable and rapidly evolving landscape. New AI tools surface almost daily. With each new tool, the educational possibilities and challenges increase exponentially. Keeping up is virtually impossible for CTLs, which historically have been institutional hubs for innovation. In fact, as of this writing, the “There’s an AI for That” website indicates that there are 23,208 AIs for 15,636 tasks for 4,875 jobs—with all three numbers increasing daily.
To support college and university faculty and, by extension, learners in navigating the complexities of AI integration while upholding ethical standards, CTLs must prioritize a balanced approach that considers the benefits and risks of AI adoption. Teaching and learning professionals need to expand their resources and support pathways beyond those solely targeting how to leverage AI or mitigate academic integrity violations. They need to make a concerted effort to promote critical AI literacy, grapple with issues of social inequity, examine the environmental impact of AI technologies, and promote human-centered design principles.1
We’re truly spoiled for choice when it comes to AI learning tools.
In principle, any free LLM can become an endlessly patient tutor or an interactive course-maker.
If that’s not enough, tools like NotebookLM’s “Audio Overviews” and ElevenLabs’ GenFM can turn practically any material into a breezy podcast.
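To illustrate how little it takes to turn a general-purpose LLM into that “endlessly patient tutor,” here is a minimal sketch. The system prompt does most of the work; the model name and endpoint are placeholders, and any OpenAI-compatible chat API, hosted or local, could be swapped in.

```python
# Minimal sketch of a patient-tutor chat loop on top of a general-purpose LLM.
# Assumes the openai package is installed and OPENAI_API_KEY is set;
# point base_url at a local server (e.g., Ollama) to run the same loop offline.
from openai import OpenAI

TUTOR_PROMPT = (
    "You are a patient tutor. Never give the final answer outright. "
    "Ask one guiding question at a time, check the student's reasoning, "
    "and only reveal a full solution after three genuine attempts."
)

client = OpenAI()
history = [{"role": "system", "content": TUTOR_PROMPT}]

while True:
    question = input("Student: ")
    if not question.strip():
        break  # blank line ends the session
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat-capable model works
        messages=history,
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print(f"Tutor: {answer}\n")
```

Keeping the full history in the request is what makes the tutor feel like one continuous conversation rather than a series of disconnected answers.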
But what if you’re looking to explore new topics in a way that’s more interactive than vanilla chatbots and more open-ended than source-grounded NotebookLM?
Well, then you might want to give one of these free-to-try learning tools a go.
Picture your enterprise as a living ecosystem, where surging market demand instantly informs staffing decisions, where a new vendor’s onboarding optimizes your emissions metrics, where rising customer engagement reveals product opportunities. Now imagine if your systems could see these connections too! This is the promise of AI agents — an intelligent network that thinks, learns, and works across your entire enterprise.
Today, organizations operate in artificial silos. Tomorrow, they could be fluid and responsive. The transformation has already begun. The question is: will your company lead it?
The journey to agent-enabled operations starts with clarity on business objectives. Leaders should begin by mapping their business’s critical processes. The most pressing opportunities often lie where cross-functional handoffs create friction or where high-value activities are slowed by system fragmentation. These pain points become the natural starting points for your agent deployment strategy.
Artificial intelligence has already proved that it can sound like a human, impersonate individuals and even produce recordings of someone speaking different languages. Now, a new feature from Microsoft will allow video meeting attendees to hear speakers “talk” in a different language with help from AI.
What Is Agentic AI? — from blogs.nvidia.com by Erik Pounds
Agentic AI uses sophisticated reasoning and iterative planning to autonomously solve complex, multi-step problems.
The next frontier of artificial intelligence is agentic AI, which uses sophisticated reasoning and iterative planning to autonomously solve complex, multi-step problems. And it’s set to enhance productivity and operations across industries.
Agentic AI systems ingest vast amounts of data from multiple sources to independently analyze challenges, develop strategies and execute tasks like supply chain optimization, cybersecurity vulnerability analysis and helping doctors with time-consuming tasks.