As part of the 78th United Nations General Assembly in 2023, IBM announced that it will train, for free, 2 million learners worldwide in artificial intelligence within the next three years, building on its existing commitment to skill 30 million learners by 2030.
The free program, called IBM SkillsBuild, will use IBM’s career-building platforms to partner with universities around the world to develop and make available new generative AI courses, with a significant focus on adult learners in underrepresented communities.
Consider Marco Argenti, CIO at Goldman Sachs, who told me in a recent interview that the leading global investment banking, securities, and investment management firm has, nearly a year after ChatGPT was released, put exactly zero generative AI use cases into production. Instead, the company is “deeply into experimentation” and has a “high bar” of expectation before deployment. Certainly this is a highly regulated company, so careful deployment must always be the norm. But Goldman Sachs is also far from new to implementing AI-driven tools, yet it is still treading slowly and carefully.
Last week, Matt Barnum reported in Chalkbeat that the Chan Zuckerberg Initiative is laying off dozens of staff members and pivoting away from the personalized learning platform they have funded since 2015 with somewhere near $100M.
…
I have tried to illustrate as often as my subscribers will tolerate that students don’t particularly enjoy learning alone with laptops within social spaces like classrooms. That learning fails to answer their questions about their social identity. It contributes to their feelings of alienation and disbelonging. I find this case easy to make but hard to prove. Maybe we just haven’t done personalized learning right? Maybe Summit just needed to include generative AI chatbots in their platform?
What is far easier to prove, or rather to disprove, is the idea that “whole class instruction must feel impersonal to students,” that “whole class instruction must necessarily fail to meet the needs of individual students.”
From DSC: I appreciate Dan’s comments here (as highlighted above) as they are helpful in my thoughts regarding the Learning from the Living [Class] Room vision. They seem to be echoed here by Jeppe Klitgaard Stricker when he says:
Personalized learning paths can be great, but they also entail a potential abolishment or unintended dissolution of learning communities and belonging.
Perhaps this powerful, global, Artificial Intelligence (AI)-backed, next-generation, lifelong learning platform of the future will be more focused on postsecondary students and experiences — but not so much for the K12 learning ecosystem.
But the school systems I’ve seen here in Michigan (USA) address only the majority of the class. These one-size-fits-all systems don’t work for the many students who need extra help and/or who are gifted. The trains move fast. Good luck if you can’t keep up with the pace.
But if K-12’ers are involved in a future learning platform, the platform needs to address what Dan’s saying. It must address students’ questions about their social identity and not contribute to their feelings of alienation and disbelonging. It needs to support communities of practice and learning communities.
New York City Public Schools will launch an Artificial Intelligence Policy Lab to guide the nation’s largest school district’s approach to this rapidly evolving technology.
Kevin McCullen, an associate professor of computer science at the State University of New York at Plattsburgh, teaches a freshman seminar about AI and robotics. As part of the course, students read Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots, by John Markoff. McCullen had the students work in groups to outline and summarize the first three chapters. Then he showed them what ChatGPT had produced in an outline.
“Their version and ChatGPT’s version seemed to be from two different books,” McCullen wrote. “ChatGPT’s version was essentially a ‘laundry list’ of events. Their version was narratives of what they found interesting. The students had focused on what the story was telling them, while ChatGPT focused on who did what in what year.” The chatbot also introduced false information, such as wrong chapter names.
The students, he wrote, found the writing “soulless.”
In the Wild West of generative AI, educators and institutions are working out how best to use the technology for learning. How can institutions define AI guidelines that allow for experimentation while providing students with consistent guidance on appropriate use of AI tools?
To find out, we spoke with Dr. Cristi Ford, vice president of academic affairs at D2L. With more than two decades of educational experience in nonprofit, higher education, and K-12 institutions, Ford works with D2L’s institutional partners to elevate best practices in teaching, learning, and student support. Here, she shares her advice on setting and communicating AI policies that are consistent and future-ready.
“If we want to use AI to improve education, we need more teachers at the table,” said Avery Pan, Class Companion co-founder and CEO. “Class Companion is designed by teachers, for teachers, to harness the most sophisticated AI and improve their classroom experience. Developing technologies specifically for teachers is imperative to supporting our next generation of students and education system.”
7 Questions on Generative AI in Learning Design — from campustechnology.com by Rhea Kelly
Open LMS Adoption and Education Specialist Michael Vaughn on the challenges and possibilities of using artificial intelligence to move teaching and learning forward.
The potential for artificial intelligence tools to speed up course design could be an attractive prospect for overworked faculty and spread-thin instructional designers. Generative AI can shine, for example, in tasks such as reworking assessment question sets, writing course outlines and learning objectives, and generating subtitles for audio and video clips. The key, says Michael Vaughn, adoption and education specialist at learning platform Open LMS, is treating AI like an intern who can be guided and molded along the way, and whose work is then vetted by a human expert.
We spoke with Vaughn about how best to utilize generative AI in learning design, ethical issues to consider, and how to formulate an institution-wide policy that can guide AI use today and in the future.
PXU City HS has no physical site. Its 83 students create custom programs, choosing from a menu of some 500 options drawn from Phoenix Union High School District’s bricks-and-mortar schools, its online-only program, internships, jobs, college classes, and career training programs.
But in the process, it became clear just how many high school-aged students were working, caring for siblings, filling in for their parents or significantly behind — or ahead and bored — academically.
…
If PXU City works as well for all its students as it does for Dominguez, he adds, every high school in the district ought to throw away the bell schedule and offer a truly personalized education.
There are 10 areas of study students choose from: architectural design, business administration/logistics-distribution, computer science, construction technology, cybersecurity, diesel/auto technology, energy technologies, human and social services, teaching and training, and sports medicine.
Students pick a job within the program they’re working toward, but that can change, Cordia said, noting that there are hundreds of possible jobs within the automotive program.
The school received more than 1,000 applications from interested students, he said, calling it “very humbling.”
Addendum that also involves changes within the K12 learning ecosystem:
We’re standing on the cusp of artificially generated content that could theoretically never end. According to futurist Sinéad Bovell, “Generative artificial intelligence also means that say we don’t want a movie or a series to end. It doesn’t have to, you could use AI to continue to generate more episodes and other sequels and have this kind of ongoing storyline.”
If we take this logic further, we could also see hyper-personalized content that’s created just for us. Imagine getting an AI-generated album from your favourite artist every week. Or a brand new movie starring actors who are no longer alive, like a new romcom with Marilyn Monroe and Frank Sinatra.
While this sounds like a compelling proposition for consumers, it’s mostly bad news for actors, writers, and other professionals working in the media industry. Hollywood studios are already investing heavily in generative AI, and many professionals working in the industry are afraid of losing their jobs.
So, as educators, mentors, and guides to our future generations, we must ask ourselves three pivotal questions:
What value do we offer to our students?
What value will they need to offer to the world?
How are we preparing them to offer that value?
The answers to these questions are crucial, and they will redefine the trajectory of our education system.
We need to create an environment that encourages curiosity, embraces failure as a learning opportunity, and celebrates diversity. We need to teach our students how to learn, how to ask the right questions, and how to think for themselves.
Leveraging ChatGPT for learning is the most meaningful skill this year for lifelong learners. But it’s too hard to find resources to master it.
As a learning science nerd, I’ve explored hundreds of prompts over the past months. Most of the advice doesn’t go beyond text summaries and multiple-choice testing.
That’s why I’ve created this article — it merges learning science with prompt writing to help you learn anything faster.
Midjourney AI Art for Teachers (for any kind of teacher, not just Art Teachers) — from The AI Educator on YouTube by Dan Fitzpatrick
From DSC: This is a very nice, clearly illustrated, free video to get started with the Midjourney (text-to-image) app. Nice work Dan!
In the new normal of generative AI, how does one articulate the value of academic integrity? This blog presents my current response in about 2,500 words; a complete answer could fill a sizable book.
Massive amounts of misinformation are disseminated about generative AI, so the first part of my discussion clarifies what large language models (ChatGPT and its counterparts) can currently do and what they cannot accomplish at this point in time. The second part describes ways in which generative AI can be misused as a means of learning; unfortunately, many people are now advocating for these mistaken applications to education. The third part describes ways in which large language models (LLMs), used well, may substantially improve learning and education. I close with a plea for a robust, informed public discussion about these topics and issues.
Many of the more than a dozen teachers TIME interviewed for this story argue that the way to get kids to care is to proactively use ChatGPT in the classroom.
…
Some of those creative ideas are already in effect at Peninsula High School in Gig Harbor, about an hour from Seattle. In Erin Rossing’s precalculus class, a student got ChatGPT to generate a rap about vectors and trigonometry in the style of Kanye West, while geometry students used the program to write mathematical proofs in the style of raps, which they performed in a classroom competition. In Kara Beloate’s English-Language Arts class, she allowed students reading Shakespeare’s Othello to use ChatGPT to translate lines into modern English to help them understand the text, so that they could spend class time discussing the plot and themes.
I found that other developed countries share concerns about students cheating but are moving quickly to use AI to personalize education, enhance language lessons and help teachers with mundane tasks, such as grading. Some of these countries are in the early stages of training teachers to use AI and developing curriculum standards for what students should know and be able to do with the technology.
Several countries began positioning themselves years ago to invest in AI in education in order to compete in the fourth industrial revolution.
AI in Education — from educationnext.org by John Bailey
The leap into a new era of machine intelligence carries risks and challenges, but also plenty of promise
In the realm of education, this technology will influence how students learn, how teachers work, and ultimately how we structure our education system. Some educators and leaders look forward to these changes with great enthusiasm. Sal Khan, founder of Khan Academy, went so far as to say in a TED talk that AI has the potential to effect “probably the biggest positive transformation that education has ever seen.” But others warn that AI will enable the spread of misinformation, facilitate cheating in school and college, kill whatever vestiges of individual privacy remain, and cause massive job loss. The challenge is to harness the positive potential while avoiding or mitigating the harm.
Generative AI and education futures — from ucl.ac.uk
Video highlights from Professor Mike Sharples’ keynote address at the 2023 UCL Education Conference, which explored opportunities to prosper with AI as a part of education.
Bringing AI Literacy to High Schools — by Nikki Goth Itoi
Stanford education researchers collaborated with teachers to develop classroom-ready AI resources for high school instructors across subject areas.
To address these two imperatives, all high schools need access to basic AI tools and training. Yet the reality is that many underserved schools in low-income areas lack the bandwidth, skills, and confidence to guide their students through an AI-powered world. And if the pattern continues, AI will only worsen existing inequities. With this concern top of mind plus initial funding from the McCoy Ethics Center, Lee began recruiting some graduate students and high school teachers to explore how to give more people equal footing in the AI space.
The urgency to fill existing and prospective positions with digital talent and to upskill those already in the workforce are among the reasons why leading companies have boldly assessed and transformed their enterprise talent management strategies. Some key initiatives leading companies are undertaking include:
Direct involvement by the C-Suite in the formulation of the enterprise talent strategy and lifecycle;
A paradigmatic hiring shift from diplomas to skills;
Increased investment in upskilling and career advancement to promote retention and to identify high-performers early on;
Targeted collaboration with universities focused on training in areas of existing and projected talent supply and demand;
Promoting a learning-for-life mindset, encouraging creative thinking and cross-cultural collaboration, and forging a culture that values these and other humanistic qualities; and
Collaborating with other companies to create joint solutions for fulfilling skill demand.
Consider this comparison: In more passive online learning, a participant will learn primarily by listening, watching and observing. Conversely, in an interactive model, the participant will be expected to engage with a story or situation by being asked to make choices that will show potential consequences.
…
Here are some of the elements that, when combined, make interactive learning especially effective:
After a successful career as a recording artist, David “TC” Ellis created Studio 4 in St. Paul to spot budding music stars.
It became a hangout spot for creative young people, most of whom had “dropped out of school due to boredom and a sense that school wasn’t relevant to their lives and dreams.”
Ellis and colleagues then opened the High School for Recording Arts in 1998.
Post-AI Assessment Design — from drphilippahardman.substack.com by Dr. Philippa Hardman
A simple, three-step guide on how to design assessments in a post-AI world
Excerpt:
Step 1: Write Inquiry-Based Objectives
Inquiry-based objectives focus not just on the acquisition of knowledge but also on the development of skills and behaviours, like critical thinking, problem-solving, collaboration and research skills.
They do this by requiring learners not just to recall or “describe back” concepts that are delivered via text, lecture or video. Instead, inquiry-based objectives require learners to construct their own understanding through the process of investigation, analysis and questioning.
Just for a minute, consider how education would change if the following were true:
AIs “hallucinated” less than humans
AIs could write in our own voices
AIs could accurately do math
AIs understood the unique academic (and eventually developmental) needs of each student and adapted instruction to that student
AIs could teach anything any student wanted or needed to know at any time of day or night
AIs could do this at a fraction of the cost of a human teacher or professor
Fall 2026 is three years away. Do you have a three-year plan? Perhaps you should scrap it and write a new one (or at least realize that your current one cannot survive). If you run an academic institution in 2026 the same way you ran it in 2022, you might as well run it like you would have in 1920. If you run an academic institution in 2030 (or any year when AI surpasses human intelligence) the same way you ran it in 2022, you might as well run it like you would have in 1820. AIs will become more intelligent than us, perhaps in 10-20 years (LeCun), though there could be unanticipated breakthroughs that lower the time frame to a few years or less (Bengio); it’s just a question of when, not “if.”
On one creative use of AI — from aiandacademia.substack.com by Bryan Alexander
A new practice with pedagogical possibilities
Excerpt:
Look at those material items again. The voiceover? Written by an AI and turned into audio by software. The images? Created by human prompts in Midjourney. The music is, I think, human created. And the idea came from a discussion between a human and an AI?
…
How might this play out in a college or university class?
Imagine assignments which require students to craft such a video. Start from film, media studies, or computer science classes. Students work through a process:
I continue to try to imagine ways generative AI can impact teaching and learning, including learning materials like textbooks. Earlier this week I started wondering – what if, in the future, educators didn’t write textbooks at all? What if, instead, we only wrote structured collections of highly crafted prompts? Instead of reading a static textbook in a linear fashion, the learner would use the prompts to interact with a large language model. These prompts could help learners ask for things like:
overviews and in-depth explanations of specific topics in a specific sequence,
examples that the learner finds personally relevant and interesting,
interactive practice – including open-ended exercises – with immediate, corrective feedback,
the structure of the relationships between ideas and concepts,
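One way to picture such a “book of prompts” is as plain structured data rather than prose. The sketch below is purely hypothetical: the entry names, fields, and topics are invented for illustration, and the rendered string is what a learner’s app would send to a large language model on their behalf.

```python
# Hypothetical sketch: a "textbook" as a structured collection of
# highly crafted prompt templates, keyed by the kind of request.
PROMPT_BOOK = {
    "overview": "Give a {depth} overview of {topic}, assuming the reader "
                "already understands {prior_topic}.",
    "example": "Give an example of {topic} drawn from {interest}.",
    "practice": "Pose one open-ended exercise on {topic}; after I answer, "
                "give immediate corrective feedback.",
}

def build_prompt(kind: str, **fields: str) -> str:
    """Render one prompt from the collection; the result would be sent
    to a large language model rather than read as static text."""
    return PROMPT_BOOK[kind].format(**fields)

# A learner who likes basketball asks for a personally relevant example:
print(build_prompt("example", topic="ratios", interest="basketball"))
# Give an example of ratios drawn from basketball.
```

Because the “book” is data, the same template can yield a different, personally relevant interaction for every learner, which is the core of the idea above.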
Designed for K12 and Higher-Ed Educators & Administrators, this conference aims to provide a platform for educators, administrators, AI experts, students, parents, and EdTech leaders to discuss the impact of AI on education, address current challenges and potentials, share their perspectives and experiences, and explore innovative solutions. A special emphasis will be placed on including students’ voices in the conversation, highlighting their unique experiences and insights as the primary beneficiaries of these educational transformations.
The use of generative AI in K-12 settings is complex and still in its infancy. We need to consider how these tools can enhance student creativity, improve writing skills, and be transparent with students about how generative AI works so they can better understand its limitations. As with any new tech, our students will be exposed to it, and it is our task as educators to help them navigate this new territory as well-informed, curious explorers.
The education ministry has emphasized the need for students to understand artificial intelligence in new guidelines released Tuesday, setting out how generative AI can be integrated into schools and the precautions needed to address associated risks.
Students should comprehend the characteristics of AI, including its advantages and disadvantages, with the latter including personal information leakages and copyright infringement, before they use it, according to the guidelines. They explicitly state that passing off reports, essays or any other works produced by AI as one’s own is inappropriate.
Thanks to the rapid development of artificial intelligence tools like Dall-E and ChatGPT, my brother-in-law has been wrestling with low-level anxiety: Is it a good idea to steer his son down this path when AI threatens to devalue the work of creatives? Will there be a job for someone with that skill set in 10 years? He’s unsure. But instead of burying his head in the sand, he’s doing what any tech-savvy parent would do: He’s teaching his son how to use AI.
In recent months the family has picked up subscriptions to AI services. Now, in addition to drawing and sculpting and making movies and video games, my nephew is creating the monsters of his dreams with Midjourney, a generative AI tool that uses language prompts to produce images.
To bridge this knowledge gap, I decided to make a quick little dictionary of AI terms specifically tailored for educators worldwide. Initially created for my own benefit, I’ve reworked my own AI Dictionary for Educators and expanded it to help my fellow teachers embrace the advancements AI brings to education.
— Daniel Christian (he/him/his) (@dchristian5) June 23, 2023
On giving AI eyes and ears — from oneusefulthing.org by Ethan Mollick
AI can listen and see, with bigger implications than we might realize.
Excerpt:
But even this is just the beginning, and new modes of using AI are appearing, which further increases their capabilities. I want to show you some examples of this emerging world, which I think will soon introduce a new wave of AI use cases, and accompanying disruption.
We need to recognize that these capabilities will continue to grow, and AI will be able to play a more active role in the real world by observing and listening. The implications are likely to be profound, and we should start thinking through both the huge benefits and major concerns today.
Even though generative AI is a new thing, it doesn’t change why students cheat. They’ve always cheated for the same reason: They don’t find the work meaningful, and they don’t think they can achieve it to their satisfaction. So we need to design assessments that students find meaning in.
Tricia Bertram Gallant
Caught off guard by AI — from chronicle.com by Beth McMurtrie and Beckie Supiano
Professors scrambled to react to ChatGPT this spring — and started planning for the fall
Excerpt:
Is it cheating to use AI to brainstorm, or should that distinction be reserved for writing that you pretend is yours? Should AI be banned from the classroom, or is that irresponsible, given how quickly it is seeping into everyday life? Should a student caught cheating with AI be punished because they passed work off as their own, or given a second chance, especially if different professors have different rules and students aren’t always sure what use is appropriate?
…OpenAI built tool use right into the GPT API with an update called function calling. It’s a little like a child’s ability to ask their parents to help them with a task that they know they can’t do on their own. Except in this case, instead of parents, GPT can call out to external code, databases, or other APIs when it needs to.
Each function in function calling represents a tool that a GPT model can use when necessary, and GPT gets to decide which ones it wants to use and when. This instantly upgrades GPT capabilities—not because it can now do every task perfectly—but because it now knows how to ask for what it wants and get it.
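A minimal sketch of that loop, under stated assumptions: the tool name `get_course_grade`, its schema, and the sample data are all invented for illustration, and `simulated_call` stands in for the `function_call` object a real GPT API response would carry (`response.choices[0].message.function_call`).

```python
import json

# A hypothetical local "tool" the model may ask to use.
def get_course_grade(student_id: str) -> dict:
    grades = {"s-101": "A-", "s-102": "B+"}  # stand-in data
    return {"student_id": student_id, "grade": grades.get(student_id, "unknown")}

# JSON-schema description passed to the API via the `functions`
# parameter, so the model knows the tool exists and how to call it.
functions = [{
    "name": "get_course_grade",
    "description": "Look up a student's current course grade",
    "parameters": {
        "type": "object",
        "properties": {"student_id": {"type": "string"}},
        "required": ["student_id"],
    },
}]

def dispatch(function_call: dict) -> str:
    """Run the tool the model asked for; the JSON string returned here
    would be fed back to the model as a `function`-role message."""
    name = function_call["name"]
    args = json.loads(function_call["arguments"])
    if name == "get_course_grade":
        return json.dumps(get_course_grade(**args))
    raise ValueError(f"unknown function: {name}")

# In a real exchange this dict would come from the assistant's reply;
# here it is hard-coded to keep the sketch self-contained.
simulated_call = {"name": "get_course_grade",
                  "arguments": '{"student_id": "s-101"}'}
print(dispatch(simulated_call))
```

The key design point is that the model never executes anything itself: it only emits a name and JSON arguments, and your code decides whether and how to run the tool.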
How ChatGPT can help disrupt assessment overload — from timeshighereducation.com by David Carless
Advances in AI are not necessarily the enemy – in fact, they should prompt long overdue consideration of assessment types and frequency, says David Carless
Excerpt:
Reducing the assessment burden could support trust in students as individuals wanting to produce worthwhile, original work. Indeed, students can be co-opted as partners in designing their own assessment tasks, so they can produce something meaningful to them.
A strategic reduction in quantity of assessment would also facilitate a refocusing of assessment priorities on deep understanding more than just performance and carries potential to enhance feedback processes.
If we were to tackle assessment overload in these ways, it opens up various possibilities. Most significantly there is potential to revitalise feedback so that it becomes a core part of a learning cycle rather than an adjunct at its end. End-of-semester, product-oriented feedback, which comes after grades have already been awarded, fails to encourage the iterative loops and spirals typical of productive learning.
Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent AI tools in education and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.
These claims conjure up the rosiest of images: human resource departments and their robot buddies solving discrimination in workplace hiring. It seems plausible, in theory, that AI could root out unconscious bias, but a growing body of research shows the opposite may be more likely.
…
Companies’ use of AI didn’t come out of nowhere: For example, automated applicant tracking systems have been used in hiring for decades. That means if you’ve applied for a job, your resume and cover letter were likely scanned by an automated system. You probably heard from a chatbot at some point in the process. Your interview might have been automatically scheduled and later even assessed by AI.
From DSC:
Here was my reflection on this:
DC: Along these lines, I wonder if Applicant Tracking Systems cause us to become like typecast actors and actresses — only thought of for certain roles. Pigeonholed.
— Daniel Christian (he/him/his) (@dchristian5) June 23, 2023
In June, ResumeBuilder.com surveyed more than 1,000 employees who are involved in hiring processes at their workplaces to find out about their companies’ use of AI interviews.
The results:
43% of companies already have or plan to adopt AI interviews by 2024
Two-thirds of this group believe AI interviews will increase hiring efficiency
15% say that AI will be used to make decisions on candidates without any human input
More than half believe AI will eventually replace human hiring managers
Watch OpenAI CEO Sam Altman on the Future of AI — from bloomberg.com
Sam Altman, CEO & Co-Founder, OpenAI discusses the explosive rise of OpenAI and its products and what an AI-laced future can look like with Bloomberg’s Emily Chang at the Bloomberg Technology Summit.
The implementation of generative AI within these products will dramatically improve educators’ ability to deliver personalized learning to students at scale by enabling the application of personalized assessments and learning pathways based on individual student needs and learning goals. K-12 educators will also benefit from access to OpenAI technology…
After chronicling 160+ AI tools (which is surely only a small fraction of the total), we’re seeing a few clear patterns among the tools that have come out so far- here are 10 categories that are jumping out!
“I don’t usually get worked up about announcements but I see promise in JFF’s plans for a new Center for Artificial Intelligence & the Future of Work, in no small part because the organization bridges higher ed, K-12 education, employers, and policymakers.”
BOSTON, June 14, 2023 — Jobs for the Future (JFF), a national nonprofit that drives transformation in the U.S. education and workforce systems, today announced the launch of its new Center for Artificial Intelligence & the Future of Work. This center will play an integral role in JFF’s mission and newly announced 10-year North Star goal to help 75 million people facing systemic barriers to advancement work in quality jobs. As AI’s explosive growth reshapes every aspect of how we learn, work, and live, this new center will serve as a nexus of collaboration among stakeholders from every part of the education-to-career ecosystem to explore the most promising opportunities—and profound challenges—of AI’s potential to advance an accessible and equitable future of learning and work.
OpenAI Considers ‘App Store’ For ChatGPT — from searchenginejournal.com; with thanks to Barsee at AI Valley for this resource
OpenAI explores launching an ‘app store’ for AI models, potentially challenging current partners and expanding customer reach.
Highlights:
OpenAI considers launching an ‘app store’ for customized AI chatbots.
This move could create competition with current partners and extend OpenAI’s customer reach.
Early interest from companies like Aquant and Khan Academy shows potential, but product development and market positioning challenges remain.
The rise of artificial intelligence, especially generative AI, boosts productivity in content creation: text, code, images, and increasingly video.
Here are six preliminary conclusions about the nature of work and learning.
Wonder Tools: AI to try — from wondertools.substack.com by Jeremy Caplan
9 playful little ways to explore AI
Excerpt:
Create a personalized children’s story | Schrodi
Collaborate with AI on a free customized, illustrated story for someone special. Give your story’s hero a name, pick a genre (e.g. comedy, thriller), choose an illustration style (e.g. watercolor, 3D animation) and provide a prompt to shape a simple story. You can even suggest a moral. After a minute, download a full-color PDF to share. Or print it and read your new mini picture book aloud.
Generate a quiz | Piggy
Put in a link, a topic, or some text and you’ll get a quiz you can share, featuring multiple-choice or true-false questions. Example: try this quick entrepreneurship quiz Piggy generated for me.
Q: How will generative AI impact teaching and learning in the near and long term?
Baker Stein: One-on-one tutoring at scale is finally being unlocked for learners around the world. This type of quality education is no longer only available to students with the means to hire a private tutor. I’m also particularly excited to see how educators make use of generative AI tools to create courses much faster and likely at a higher quality with increased personalization for each student or even by experimenting with new technologies like extended reality. Professors will be able to put their time toward high-impact activities like mentoring, researching and office hours instead of tedious course-creation tasks. This helps open up the capacity for educators to iterate on their courses faster to keep pace with industry and global changes that may impact their field of study.
Another important use case is how generative AI can serve as a great equalizer for students when it comes to writing, especially second language learners.
Why it matters: The best AI assistants will be the ones that require the least prompting. They’ll get to know who you are, what you need, and your modus operandi. Profiles are a good starting point, but we believe the game-changer will be larger context windows (that’s nerd-speak for the amount of context ChatGPT can handle).
From DSC: And how about taking this a step further and remembering — or being able to access — our constantly updated Cloud-Based Learning Profiles?
My hypothesis and research suggest that as bar associations and the ABA begin to recognize the on-going systemic issues of high-cost legal education, growing legal deserts (where no lawyer serves a given population), on-going and pervasive access to justice issues, and a public that is already weary of the legal system – alternative options that are already in play might become more supported.
What might that look like?
The combination of AI-assisted education with traditional legal apprenticeships has the potential to create a rich, flexible, and engaging learning environment. Here are three scenarios that might illustrate what such a combination could look like:
Scenario One – Personalized Curriculum Development
Scenario Two – On-Demand Tutoring and Mentoring
Scenario Three – AI-assisted Peer Networks and Collaborative Learning:
We know that there are challenges – a threat to human jobs, the potential implications for cyber security and data theft, or perhaps even an existential threat to humanity as a whole. But we certainly don’t yet have a full understanding of all of the implications. In fact, a World Economic Forum report recently stated that organizations “may currently underappreciate AI-related risks,” with just four percent of leaders considering the risk level to be “significant.”
A survey carried out by analysts Baker McKenzie concluded that many C-level leaders are over-confident in their assessments of organizational preparedness in relation to AI. In particular, it exposed concerns about the potential implications of biased data when used to make HR decisions.
AI & lawyer training: How law firms can embrace hybrid learning & development — from thomsonreuters.com
A big part of law firms’ successful adaptation to the increased use of ChatGPT and other forms of generative AI may depend upon how firmly they embrace online learning & development tools designed for hybrid work environments
Excerpt:
As law firms move forward in their use of advanced artificial intelligence such as ChatGPT and other forms of generative AI, their success may hinge upon how they approach lawyer training and development and what tools they enlist for the process.
One of the tools that some law firms use to deliver a new, multi-modal learning environment is an online, video-based learning platform, Hotshot, that delivers more than 250 on-demand courses on corporate, litigation, and business skills.
Ian Nelson, co-founder of Hotshot, says he has seen a dramatic change in how law firms are approaching learning & development (L&D) in the decade or so that Hotshot has been active. He believes the biggest change is that 10 years ago, firms hadn’t yet embraced the need to focus on training and development.
From DSC: Heads up law schools. Are you seeing/hearing this!?
Are we moving more towards a lifelong learning model within law schools?
If not, shouldn’t we be doing that?
Are LLM programs expanding quickly enough? Is more needed?