Agentic AI and the New Era of Corporate Learning for 2026 — from hrmorning.com by Carol Warner

That gap creates compliance risk and wasted investment. It leaves HR leaders with a critical question: How do you measure and validate real learning when AI is doing the work for employees?

Designing Training That AI Can’t Fake
Employees often find static slide decks and multiple-choice quizzes tedious, while AI can breeze through them. If employees would rather let AI take training for them, it’s a red flag about the content itself.

One of the biggest risks with agentic AI is disengagement. When AI can complete a task for employees, their incentive to engage disappears unless they understand why the skill matters, Rashid explains. Personalization and context are critical. Training should clearly connect to what employees value most – career mobility, advancement, and staying relevant in a fast-changing market.

Nearly half of executives believe today’s skills will expire within two years, making continuous learning essential for job security and growth. To make training engaging, Rashid recommends:

  • Delivering content in formats employees already consume – short videos, mobile-first modules, interactive simulations, or micro-podcasts that fit naturally into workflows. For frontline workers, this might mean replacing traditional desktop training with mobile content that integrates into their workday.
  • Aligning learning with tangible outcomes, like career opportunities or new responsibilities.
  • Layering in recognition, such as digital badges, leaderboards, or team shout-outs, to reinforce motivation and progress.

Microsoft 365 Copilot AI agents reach a new milestone — is teamwork about to change? — from windowscentral.com by Adam Hales
Microsoft expands Copilot with collaborative agents in Teams, SharePoint and more to boost productivity and reshape teamwork.

Microsoft is pitching a recent shift of AI agents in Microsoft Teams as more than just smarter assistance. Instead, these agents are built to behave like human teammates inside familiar apps such as Teams, SharePoint, and Viva Engage. They can set up meeting agendas, keep files in order, and even step in to guide community discussions when things drift off track.

Unlike tools such as ChatGPT or Claude, which mostly wait for prompts, Microsoft’s agents are designed to take initiative. They can chase up unfinished work, highlight items that still need decisions, and keep projects moving forward. By drawing on Microsoft Graph, they also bring in the right files, past decisions, and context to make their suggestions more useful.



Chris Dede’s comments on LinkedIn re: Aibrary

As an advisor to Aibrary, I am impressed with their educational philosophy, which is based both on theory and on empirical research findings. Aibrary is an innovative approach to self-directed learning that complements academic resources. Expanding our historic conceptions of books, libraries, and lifelong learning to new models enabled by emerging technologies is central to empowering all of us to shape our future.

Also see:

Aibrary.ai


Why AI literacy must come before policy — from timeshighereducation.com by Kathryn MacCallum and David Parsons
When developing rules and guidelines around the uses of artificial intelligence, the first question to ask is whether the university policymakers and staff responsible for implementing them truly understand how learners can meet the expectations they set

Literacy first, guidelines second, policy third
For students to respond appropriately to policies, they need to be given supportive guidelines that enact these policies. Further, to apply these guidelines, they need a level of AI literacy that gives them the knowledge, skills and understanding required to support responsible use of AI. Therefore, if we want AI to enhance education rather than undermine it, we must build literacy first, then create supportive guidelines. Good policy can then follow.


AI training becomes mandatory at more US law schools — from reuters.com by Karen Sloan and Sara Merken

Sept 22 (Reuters) – At orientation last month, 375 new Fordham Law students were handed two summaries of rapper Drake’s defamation lawsuit against his rival Kendrick Lamar’s record label — one written by a law professor, the other by ChatGPT.

The students guessed which was which, then dissected the artificial intelligence chatbot’s version for accuracy and nuance, finding that it included some irrelevant facts.

The exercise was part of the first-ever AI session for incoming students at the Manhattan law school, one of at least eight law schools now incorporating AI training for first-year students in orientation, legal research and writing courses, or through mandatory standalone classes.

 


Tech check: Innovation in motion: How AI is rewiring L&D workflows — from chieflearningofficer.com by Gabrielle Pike
AI isn’t here to replace us. It’s here to level us up.

For today’s chief learning officer, the days of just rolling out compliance training are long gone. In 2025, learning and development leaders are architects of innovation, crafting ecosystems that are agile, automated and AI-infused. This quarter’s Tech Check invites us to pause, assess and get strategic about where tech is taking us. Because the goal isn’t more tools—it’s smarter, more human learning systems that scale with the business.

Sections include:

  • The state of AI in L&D: Hype vs. reality
  • AI in design: From static content to dynamic experiences
  • AI in development: Redefining production workflows
  • Strategic questions CLOs should be asking
  • Future forward: What’s next?
  • Closing thought

American Federation of Teachers (AFT) to Launch National Academy for AI Instruction with Microsoft, OpenAI, Anthropic and United Federation of Teachers — from aft.org

NEW YORK – The AFT, alongside the United Federation of Teachers and lead partner Microsoft Corp., founding partner OpenAI, and Anthropic, announced the launch of the National Academy for AI Instruction today. The groundbreaking $23 million education initiative will provide access to free AI training and curriculum for all 1.8 million members of the AFT, starting with K-12 educators. It will be based at a state-of-the-art bricks-and-mortar Manhattan facility designed to transform how artificial intelligence is taught and integrated into classrooms across the United States.

The academy will help address the gap in structured, accessible AI training and provide a national model for AI-integrated curriculum and teaching that puts educators in the driver’s seat.


Students Are Anxious about the Future with A.I. Their Parents Are, Too. — from educationnext.org by Michael B. Horn
The fast-growing technology is pushing families to rethink the value of college

In an era when the college-going rate of high school graduates has dropped from an all-time high of 70 percent in 2016 to roughly 62 percent now, AI seems to be heightening the anxieties about the value of college.

According to the survey, two-thirds of parents say AI is impacting their view of the value of college. Thirty-seven percent of parents indicate they are now scrutinizing a college’s “career-placement outcomes”; 36 percent say they are looking at a college’s “AI-skills curriculum,” while 35 percent respond that a “human-skills emphasis” is important to them.

This echoes what I increasingly hear from college leadership: Parents and students demand to see a difference between what they are getting from a college and what they could be “learning from AI.”


This next item on LinkedIn is compliments of Ray Schroeder:



How to Prepare Students for a Fast-Moving (AI) World — from rdene915.com by Dr. Rachelle Dené Poth

Preparing for a Future-Ready Classroom
Here are the core components I focus on to prepare students:

1. Unleash Creativity and Problem-Solving.
2. Weave in AI and Computational Thinking.
3. Cultivate Resilience and Adaptability.


AI Is Reshaping Learning Roles—Here’s How to Future-Proof Your Team — from onlinelearningconsortium.org by Jennifer Mathes, Ph.D., CEO, Online Learning Consortium; via Robert Gibson on LinkedIn

Culture matters here. Organizations that foster psychological safety—where experimentation is welcomed and mistakes are treated as learning—are making the most progress. When leaders model curiosity, share what they’re trying, and invite open dialogue, teams follow suit. Small tests become shared wins. Shared wins build momentum.

Career development must be part of this equation. As roles evolve, people will need pathways forward. Some will shift into new specialties. Others may leave familiar roles for entirely new ones. Making space for that evolution—through upskilling, mobility, and mentorship—shows your people that you’re not just investing in AI, you’re investing in them.

And above all, people need transparency. Teams don’t expect perfection. But they do need clarity. They need to understand what’s changing, why it matters, and how they’ll be supported through it. That kind of trust-building communication is the foundation for any successful change.

These shifts may play out differently across sectors—but the core leadership questions will likely be similar.

AI marks a turning point—not just for technology, but for how we prepare our people to lead through disruption and shape the future of learning.




 

2025: The Year the Frontier Firm Is Born — from Microsoft

We are entering a new reality—one in which AI can reason and solve problems in remarkable ways. This intelligence on tap will rewrite the rules of business and transform knowledge work as we know it. Organizations today must navigate the challenge of preparing for an AI-enhanced future, where AI agents will gain increasing levels of capability over time that humans will need to harness as they redesign their business. Human ambition, creativity, and ingenuity will continue to create new economic value and opportunity as we redefine work and workflows.

As a result, a new organizational blueprint is emerging, one that blends machine intelligence with human judgment, building systems that are AI-operated but human-led. Like the Industrial Revolution and the internet era, this transformation will take decades to reach its full promise and involve broad technological, societal, and economic change.

To help leaders understand how knowledge work will evolve, Microsoft analyzed survey data from 31,000 workers across 31 countries, LinkedIn labor market trends, and trillions of Microsoft 365 productivity signals. We also spoke with AI-native startups, academics, economists, scientists, and thought leaders to explore what work could become. The data and insights point to the emergence of an entirely new organization, a Frontier Firm that looks markedly different from those we know today. Structured around on-demand intelligence and powered by “hybrid” teams of humans + agents, these companies scale rapidly, operate with agility, and generate value faster.

Frontier Firms are already taking shape, and within the next 2–5 years we expect that every organization will be on their journey to becoming one. 82% of leaders say this is a pivotal year to rethink key aspects of strategy and operations, and 81% say they expect agents to be moderately or extensively integrated into their company’s AI strategy in the next 12–18 months. Adoption is accelerating: 24% of leaders say their companies have already deployed AI organization-wide, while just 12% remain in pilot mode.

The time to act is now. The question for every leader and employee is: how will you adapt?


On a somewhat related note, also see:

Exclusive: Anthropic warns fully AI employees are a year away — from axios.com by Sam Sabin

Anthropic expects AI-powered virtual employees to begin roaming corporate networks in the next year, the company’s top security leader told Axios in an interview this week.

Why it matters: Managing those AI identities will require companies to reassess their cybersecurity strategies or risk exposing their networks to major security breaches.

The big picture: Virtual employees could be the next AI innovation hotbed, Jason Clinton, the company’s chief information security officer, told Axios.

 

Organizing Teams for Continuous Learning: A Complete Guide — from intelligenthq.com

In today’s fast-paced business world, continuous learning has become a vital element for both individual and organizational growth. Teams that foster a culture of learning remain adaptable, innovative, and competitive. However, simply encouraging learning isn’t enough; the way teams are structured and supported plays a huge role in achieving long-term success. In this guide, we’ll explore how to effectively organize teams for continuous learning, leveraging tools, strategies, and best practices.

 

How Generative AI Is Shaping the Future of Law: Challenges and Trends in the Legal Profession — from thomsonreuters.com by Raghu Ramanathan

With this in mind, Thomson Reuters and Lexpert hosted a panel featuring law firm leaders and industry experts discussing the challenges and trends around the use of generative AI in the legal profession. Below are insights from an engaging and informative discussion.

Sections included:

  • Lawyers are excited to implement generative AI solutions
  • Unfounded concerns about robot lawyers
  • Changing billing practices and elevating services
  • Managing and mitigating risks

Adopting Legal Technology Responsibly — from lexology.com by Sacha Kirk

Here are fundamental principles to guide the process:

  1. Start with a Needs Assessment…
  2. Engage Stakeholders Early…
  3. Choose Scalable Solutions…
  4. Prioritise Security and Compliance…
  5. Plan for Change Management…

Modernizing Legal Workflows: The Role Of AI, Automation, And Strategic Partnerships — from abovethelaw.com by Scott Angelo, Jared Gullbergh, Nancy Griffing, and Michael Owen Hill
A roadmap for law firms.  

Angelo added, “We really doubled down on AI because it was just so new — not just to the legal industry, but to the world.” Under his leadership, Buchanan’s efforts to embrace AI have garnered significant attention, earning the firm recognition as one of the “Best of the Best for Generative AI” in the 2024 BTI “Leading Edge Law Firms” survey.

This acknowledgment reflects more than ambition; it highlights the firm’s ability to translate innovative ideas into actionable results. By focusing on collaboration and leveraging technology to address client demands, Buchanan has set a benchmark for what is possible in legal technology innovation.

The collective team followed these essential steps for app development:

  • Identify and Prioritize Use Cases…
  • Define App Requirements…
  • Leverage Pre-Built Studio Apps and Templates…
  • Incorporate AI and Automation…
  • Test and Iterate…
  • Deploy and Train…
  • Measure Success…

Navigating Generative AI in Legal Practice — from linkedin.com by Colin Levy

The rise of artificial intelligence (AI), particularly generative AI, has introduced transformative potential to legal practice. For in-house counsel, managing legal risk while driving operational efficiency increasingly involves navigating AI’s opportunities and challenges. While AI offers remarkable tools for automation and data-driven decision-making, it is essential to approach these tools as complementary to human judgment, not replacements. Effective AI adoption requires balancing its efficiencies with a commitment to ethical, nuanced legal practice.

Here are a few ways in which this arises:

 

How to Secure Your 2025 Legal Tech — from americanbar.org by Rachel Bailey

Summary

  • With firms increasingly open to AI tools, now is an exciting time to do some blue-sky thinking about your firm’s technology as a whole.
  • This is a chance for teams to envision the future of their firm’s technology landscape and make bold choices that align with long-term goals.
  • Learn six tips that will improve your odds of approval for your legal tech budget.

Also relevant, see:


Why Technology-Driven Law Firms Are Poised For Long-Term Success — from forbes.com by Daniel Farrar

Client expectations have shifted significantly in today’s technology-driven world. Quick communication and greater transparency are now a priority for clients throughout the entire case life cycle. This growing demand for tech-enhanced processes comes not only from clients but also from staff, and is set to rise even further as more advances become available.

I see the shift to cloud-based digital systems, especially for small and midsized law firms, as evening the playing field by providing access to robust tools that can aid legal services. Here are some examples of how legal professionals are leveraging tech every day…


Just 10% of law firms have a GenAI policy, new Thomson Reuters report shows — from legaltechnology.com by Caroline Hill

Just 10% of law firms and 21% of corporate legal teams have now implemented policies to guide their organisation’s use of generative AI, according to a report out today (2 December) from Thomson Reuters.


AI & Law Symposium: Students Exploring Innovation, Challenges, and Legal Implications of a Technological Revolution — from allard.ubc.ca

Artificial Intelligence (AI) has been rapidly deployed around the world in a growing number of sectors, offering unprecedented opportunities while raising profound legal and ethical questions. This symposium will explore the transformative power of AI, focusing on its benefits, limitations, and the legal challenges it poses.

AI’s ability to revolutionize sectors such as healthcare, law, and business holds immense potential, from improving efficiency and access to services, to providing new tools for analysis and decision-making. However, the deployment of AI also introduces significant risks, including bias, privacy concerns, and ethical dilemmas that challenge existing legal and regulatory frameworks. As AI technologies continue to evolve, it is crucial to assess their implications critically to ensure responsible and equitable development.


The role of legal teams in creating AI ethics guardrails — from legaldive.com by Catherine Dawson
For organizations to balance the benefits of artificial intelligence with its risk, it’s important for counsel to develop policy on data governance and privacy.


How Legal Aid and Tech Collaboration Can Bridge the Justice Gap — from law.com by Kelli Raker and Maya Markovich
“Technology, when thoughtfully developed and implemented, has the potential to expand access to legal services significantly,” write Kelli Raker and Maya Markovich.

Challenges and Concerns
Despite the potential benefits, legal aid organizations face several hurdles in working with new technologies:

1. Funding and incentives: Most funding for legal aid is tied to direct legal representation, leaving little room for investment in general case management or exploration of innovative service delivery methods to exponentially scale impact.

2. Jurisdictional inconsistency: The lack of a unified court system or standardized forms across regions makes it challenging to develop accurate and widely applicable tech solutions in certain types of matters.

3. Organizational capacity: Many legal aid organizations lack the time and resources to thoroughly evaluate new tech offerings or collaboration opportunities or identify internal workflows and areas of unmet need with the highest chance for impact.

4. Data privacy and security: Legal aid providers need assurance that tech protects client data and avoids misuse of sensitive information.

5. Ethical considerations: There’s significant concern about the accuracy of information produced by consumer-facing technology and the potential for inadvertent unauthorized practice of law.

 

The Musician’s Rule and GenAI in Education — from opencontent.org by David Wiley

We have to provide instructors the support they need to leverage educational technologies like generative AI effectively in the service of learning. Given the amount of benefit that could accrue to students if powerful tools like generative AI were used effectively by instructors, it seems unethical not to provide instructors with professional development that helps them better understand how learning occurs and what effective teaching looks like. Without more training and support for instructors, the amount of student learning higher education will collectively “leave on the table” will only increase as generative AI gets more and more capable. And that’s a problem.

From DSC:
As is often the case, David put together a solid posting here. A few comments/reflections on it:

  • I agree that more training/professional development is needed, especially regarding generative AI. This would help achieve a far greater ROI and impact.
  • The pace of change makes it difficult to see where the sand is settling…and thus what to focus on
  • The Teaching & Learning Groups out there are also trying to learn and grow in their knowledge (so that they can train others)
  • The administrators out there are also trying to figure out what all of this generative AI stuff is all about; and so are the faculty members. It takes time for educational technologies’ impact to roll out and be integrated into how people teach.
  • As we’re talking about multiple disciplines here, I think we need more team-based content creation and delivery.
  • There needs to be more research on how best to use AI — again, it would be helpful if the sand settled a bit first, so as not to waste time and $$. But then that research needs to be piped into the classrooms far better.

We need to take more of the research from learning science and apply it in our learning spaces.

 

OpenAI announces first partnership with a university — from cnbc.com by Hayden Field

Key Points:

  • OpenAI on Thursday announced its first partnership with a higher education institution.
  • Starting in February, Arizona State University will have full access to ChatGPT Enterprise and plans to use it for coursework, tutoring, research and more.
  • The partnership has been in the works for at least six months.
  • ASU plans to build a personalized AI tutor for students, allow students to create AI avatars for study help and broaden the university’s prompt engineering course.

A new collaboration with OpenAI charts the future of AI in higher education — from news.asu.edu

The collaboration between ASU and OpenAI brings the advanced capabilities of ChatGPT Enterprise into higher education, setting a new precedent for how universities enhance learning, creativity and student outcomes.

“ASU recognizes that augmented and artificial intelligence systems are here to stay, and we are optimistic about their ability to become incredible tools that help students to learn, learn more quickly and understand subjects more thoroughly,” ASU President Michael M. Crow said. “Our collaboration with OpenAI reflects our philosophy and our commitment to participating directly in the responsible evolution of AI learning technologies.”


AI <> Academia — from drphilippahardman.substack.com by Dr. Philippa Hardman
What might emerge from ASU’s pioneering partnership with OpenAI?

Phil’s Wish List #2: Smart Curriculum Development
ChatGPT assists in creating and updating course curricula, based on both student data and emerging domain and pedagogical research on the topic.

Output: using AI, it will be possible to review course content and make data-informed, automated recommendations based on the latest pedagogical and domain-specific research.

Potential Impact: increased dynamism and relevance in course content and reduced administrative lift for academics.
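
From DSC:
To make Phil’s wish a bit more concrete, here is a minimal sketch of what an automated curriculum-review step could look like. It is purely illustrative: it assumes the OpenAI Python SDK and an API key, the model name is an assumption, and the course outline is a made-up placeholder. It is not ASU’s or OpenAI’s actual implementation.

```python
# Illustrative sketch only -- not ASU's or OpenAI's actual system.
# Assumes the openai Python package (v1+) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

course_outline = """Week 1: Introduction to statistics
Week 2: Descriptive statistics
Week 3: Probability basics"""  # hypothetical placeholder content

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name; substitute whatever is available
    messages=[
        {"role": "system",
         "content": "You review university course outlines and suggest "
                    "updates grounded in current pedagogical research."},
        {"role": "user",
         "content": "Review this outline and recommend specific, "
                    "evidence-informed updates:\n\n" + course_outline},
    ],
)

print(response.choices[0].message.content)  # draft recommendations for a human to vet
```

The point of the sketch is the workflow, not the code: the AI drafts the recommendations, and an academic still reviews them and decides what actually changes.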



A full list of AI ideas from AI-for-Education.org

You can filter by category, by ‘What does it do?’, by AI tool or search for keywords.


Navigating the new normal: Adapting in the age of AI and hybrid work models — from chieflearningofficer.com by Dr. Kylie Ensrud

Unlike traditional leadership, adaptable leadership is not bound by rigid rules and protocols. Instead, it thrives on flexibility. Adaptable leaders are willing to experiment, make course corrections, and pivot when necessary. Adaptable leadership is about flexibility, resilience and a willingness to embrace change. It embodies several key principles that redefine the role of leaders in organizations:

  1. Embracing uncertainty

Adaptable leaders understand that uncertainty is the new norm. They do not shy away from ambiguity but instead, see it as an opportunity for growth and innovation. They encourage a culture of experimentation and learning from failure.

  2. Empowering teams

Instead of dictating every move, adaptable leaders empower their teams to take ownership of their work. They foster an environment of trust and collaboration, enabling individuals to contribute their unique perspectives and skills.

  3. Continuous learning

Adaptable leaders are lifelong learners. They are constantly seeking new knowledge, stay informed about industry trends and encourage their teams to do the same. They understand that knowledge is a dynamic asset that must be constantly updated.


Major AI in Education Related Developments this week — from stefanbauschard.substack.com by Stefan Bauschard
ASU integrates with ChatGPT, K-12 AI integrations, Agents & the Rabbit, Uruguay, Meta and AGI, Rethinking curriculum

“The greatest risk is leaving school curriculum unchanged when the entire world is changing.”
Hadi Partovi, founder of Code.org, angel investor in Facebook, Dropbox, Airbnb, Uber

Tutorbots in college. On a more limited scale, Georgia State University, Morgan State University, and the University of Central Florida are piloting a project using chatbots to support students in foundational math and English courses.


Pioneering AI-Driven Instructional Design in Small College Settings — from campustechnology.com by Gopu Kiron
For institutions that lack the budget or staff expertise to utilize instructional design principles in online course development, generative AI may offer a way forward.

Unfortunately, smaller colleges — arguably the institutions whose students are likely to benefit the most from ID enhancements — frequently find themselves excluded from authentically engaging in the ID arena due to tight budgets, limited faculty online course design expertise, and the lack of ID-specific staff roles. Despite this, recent developments in generative AI may offer these institutions a low-cost, tactical avenue to compete with more established players.


Google’s new AI solves math olympiad problems — from bensbites.beehiiv.com

There’s a new AI from Google DeepMind called AlphaGeometry that totally nails solving super hard geometry problems. We’re talking problems so tough only math geniuses who compete in the International Mathematical Olympiad can figure them out.


 

Where a developing, new kind of learning ecosystem is likely headed [Christian]

From DSC:
As I’ve long stated on the Learning from the Living [Class]Room vision, we are heading toward a new AI-empowered learning platform — where humans play a critically important role in making this new learning ecosystem work.

Along these lines, I ran into this site out on X/Twitter. We’ll see how this unfolds, but it will be an interesting space to watch.

Project Chiron’s vision for education: “Every child will soon have a super-intelligent AI teacher by their side. We want to make sure they instill a love of learning in children.”


From DSC:
This future learning platform will also focus on developing skills and competencies. Along those lines, see:

Scale for Skills-First — from the-job.beehiiv.com by Paul Fain
An ed-tech giant’s ambitious moves into digital credentialing and learner records.

A Digital Canvas for Skills
Instructure was a player in the skills and credentials space before its recent acquisition of Parchment, a digital transcript company. But that $800M move made many observers wonder if Instructure can develop digital records of skills that learners, colleges, and employers might actually use broadly.

Ultimately, he says, the CLR approach will allow students to bring these various learning types into a coherent format for employers.

Instructure seeks a leadership role in working with other organizations to establish common standards for credentials and learner records, to help create consistency. The company collaborates closely with 1EdTech. And last month it helped launch the 1EdTech TrustEd Microcredential Coalition, which aims to increase quality and trust in digital credentials.

Paul also links to 1EdTech’s page regarding the Comprehensive Learner Record
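
From DSC:
As a purely hypothetical illustration of what a “coherent format” for skills and credentials might look like, here is a small sketch of a learner-record entry. The field names are invented for illustration; they do not follow the actual 1EdTech Comprehensive Learner Record or Open Badges specifications.

```python
# Hypothetical illustration only -- invented field names, not the 1EdTech CLR schema.
from dataclasses import dataclass, field, asdict
from datetime import date
import json


@dataclass
class CredentialEntry:
    """One achievement a learner could share with an employer."""
    title: str
    issuer: str
    awarded_on: date
    skills: list[str] = field(default_factory=list)
    evidence_url: str | None = None


record = [
    CredentialEntry(
        title="Intro to Data Analysis",
        issuer="Example State University",          # hypothetical issuer
        awarded_on=date(2024, 5, 10),
        skills=["spreadsheets", "descriptive statistics"],
        evidence_url="https://example.edu/evidence/123",  # placeholder URL
    ),
]

# Serialize to JSON so different systems could, in principle, exchange the record.
print(json.dumps([asdict(c) for c in record], default=str, indent=2))
```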

 

Regional Colleges Saw Biggest Application Gains After Tuition Resets — from insidehighered.com by Kathryn Palmer
A new report compared post-reset application growth at nationally known and regional institutions. 

Dozens of colleges and universities have dropped their sticker prices for tuition over the past decade, even as research has shown that tuition resets have a nominal influence on long-term enrollment increases. But a report released this week shows that regional colleges were more likely than nationally known institutions to see increases in applications after a reset.

“Students are more focused now on return on investment than they used to be,” said Devon McGee, a principal at Kennedy & Company, the higher education consulting firm that produced the report. Compared to bigger-name colleges, “A lot of these regional institutions are great liberal arts–type institutions, but they are less associated—fairly or unfairly—with preparing students for a job.”


Why hybrid learning needs hybrid faculties — from timeshighereducation.com by An Jacobs & Norma Rossi
Online courses should be integrated into everyday faculty functions to improve remote and in-person classes as well as the overall student experience


 

From DSC:
I thought this was a really good idea from Dan Pontefract: “Why Experienced Employees Should Write Letters to New Team Members”

Excerpt:

Regardless of their age, an individual who is fresh to the team is given between five and ten pieces of advice from a more seasoned employee in the form of an email or letter. These tidbits of knowledge are what these seasoned professionals wish they had known when they first joined.

This is more than just a welcome; it’s a guide, a primer, offering an insider’s view of the organization and fostering a sense of camaraderie from the very beginning.

 

Law Firms Are Recruiting More AI Experts as Clients Demand ‘More for Less’ — from bloomberg.com by Irina Anghel
Data scientists, software engineers among roles being sought | Legal services seen as vulnerable to ChatGPT-type software

Excerpt (emphasis DSC):

Chatbots, data scientists, software engineers. As clients demand more for less, law firms are hiring growing numbers of staff who’ve studied technology not tort law to try and stand out from their rivals.

Law firms are advertising for experts in artificial intelligence “more than ever before,” says Chris Tart-Roberts, head of the legal technology practice at Macfarlanes, describing a trend he says began about six months ago.


Legal is the industry with the second-highest potential for automation



AI Will Threaten Law Firm Jobs, But Innovators Will Thrive — from law.com

Excerpts:

What You Need to Know

  • Law firm leaders and consultants are unsure of how AI use will ultimately impact the legal workforce.
  • Consultants are advising law firms and attorneys alike to adapt to the use of generative AI, viewing this as an opportunity for attorneys to learn new skills and for law firms to take a look at their business models.

Split between foreseeing job cuts and opportunities to introduce new skills and additional efficiencies into the office, firm leaders and consultants remain uncertain about the impact of artificial intelligence on the legal workforce.

However, one thing is certain: law firms and attorneys need to adapt and learn how to integrate this new technology in their business models, according to consultants. 


AI Lawyer — A personal AI lawyer at your fingertips — from ailawyer.pro


From DSC:
I hope that we will see a lot more of this kind of thing!
I’m counting on it.


Revolutionize Your Legal Education with Law School AI — from law-school-ai.vercel.app
Your Ultimate Study Partner

Are you overwhelmed by countless cases, complex legal concepts, and endless readings? Law School AI is here to help. Our cutting-edge AI chatbot is designed to provide law students with an accessible, efficient, and engaging way to learn the law. Our chatbot simplifies complex legal topics, delivers personalized study guidance, and answers your questions in real-time – making your law school journey a whole lot easier.


Job title of the future: metaverse lawyer — from technologyreview.com by Amanda Smith
Madaline Zannes’s virtual offices come with breakout rooms, an art gallery… and a bar.

Excerpt:

Lot #651 on Somnium Space belongs to Zannes Law, a Toronto-based law firm. In this seven-level metaverse office, principal lawyer Madaline Zannes conducts private consultations with clients, meets people wandering in with legal questions, hosts conferences, and gives guest lectures. Zannes says that her metaverse office allows for a more immersive, imaginative client experience. She hired a custom metaverse builder to create the space from scratch—with breakout rooms, presentation stages, offices to rent, an art gallery, and a rooftop bar.


A Literal Generative AI Discussion: How AI Could Reshape Law — from geeklawblog.com by Greg Lambert

Excerpt:

Greg spoke with an AI guest named Justis for this episode. Justis, powered by OpenAI’s GPT-4, was able to have a natural conversation with Greg and provide insightful perspectives on the use of generative AI in the legal industry, specifically in law firms.

In the first part of their discussion, Justis gave an overview of the legal industry’s interest in and uncertainty around adopting generative AI. While many law firm leaders recognize its potential, some are unsure of how it fits into legal work or worry about risks. Justis pointed to examples of firms exploring AI and said letting lawyers experiment with the tools could help identify use cases.


Robots aren’t representing us in court but here are 7 legal tech startups transforming the legal system — from tech.eu by Cate Lawrence
Legal tech startups are stepping up to the bar, using tech such as AI, teleoperations, and apps to bring justice to more people than ever before. This increases efficiency, reduces delays, and lowers costs, expanding legal access.


Putting Humans First: Solving Real-Life Problems With Legal Innovation — from abovethelaw.com by Olga Mack
Placing the end-user at the heart of the process allows innovators to identify pain points and create solutions that directly address the unique needs and challenges individuals and businesses face.

 

Professors Plan Summer AI Upskilling, With or Without Support — from insidehighered.com by Susan D’Agostino
Academics seeking respite from the fire hose of AI information and hot takes launch summer workshops. But many of the grass-roots efforts fall short of meeting demand.

Excerpt:

In these summer faculty AI workshops, some plan to take their first tentative steps in redesigning assignments to recognize the AI-infused landscape. Others expect to evolve their in-progress teaching-with-AI practices. At some colleges, full-time staff will deliver the workshops or pay participants for professional development time. But some offerings are grassroots efforts delivered by faculty volunteers attended by participants on their own time. Even so, many worry that the efforts will fall short of meeting demand.

From DSC:
We aren’t used to this pace of change. It will take time for faculty members — as well as Instructional Designers, Instructional Technologists, Faculty Developers, Learning Experience Designers, Librarians, and others — to learn more about AI and its implications for teaching and learning. Faculty are learning. Staff are learning. Students are learning. Grace is needed. And faculty/staff modeling what it is to learn themselves is a good thing for students to see as well.


Also relevant/see:

It takes a village… Reflections on sustainable learning design — from The Educationalist (educationalist.substack.com) by Alexandra Mihai

Excerpts:

This can be done first and foremost through collaboration, bringing more people to the table, in a meaningful workflow, whereby they can make the best use of their expertise. Moreover, we need to take a step back and keep the big picture in mind, if we want to provide our students with a valuable experience.

This is all about creating and nurturing partnerships. Thinking in an inclusive way about who is at the table when we design our courses and our programmes and who we are currently missing. Generally speaking, the main actors involved should be: teaching staff, learning design professionals (under all their various names) and students. Yes, students. Although we are designing for their learning, they are all too often not part of the process.

In order to yield results, collaborative practice needs to be embedded in the institutional fabric, and this takes time. Building silos happens fast; breaking them is a long-term process. Creating a culture of dialogue, with clear and replicable processes, is key to making collaborative learning design work.

From DSC:
To me, Alexandra is addressing the topic of using teams to design, develop, and teach/offer courses. This is where a variety of skills and specialties can come together to produce an excellent learning experience. No one individual has all of the necessary skills — nor the necessary time. No way.

 

It takes a village… Reflections on sustainable learning design – from educationalist.substack.com; The Educationalist by Alexandra Mihai

Excerpt:

For the purpose of this article I want to look at learning design in a more holistic way, as a practice that takes place at institutional level. Because we are actually not designing the learning, we are designing for learning. It’s all about an ecosystem with many variable components, including people, institutions, pedagogy, disciplinary content, technology. Some of them more controllable or predictable, some of them less so. So learning design is (should be!) all about being adaptive, iterative, empathic, but also efficient, sustainable (from different points of view, I will come back to that later), scalable.

 


AI in Education — from gettingsmart.com

Some core takeaways for school leaders are:

  • AI has the potential to personalize learning experiences for students, improve student outcomes, and reduce the administrative burden on educators.
  • Addressing issues of data privacy, bias, and equity is crucial for responsible AI integration in education.
  • Collaboration between educators and AI developers is important to ensure that AI tools align with educational goals and values.
  • Professional development for educators is essential to effectively integrate AI tools in the classroom.
 