A pilot project designed to test the potential of artificial intelligence tools at McCarthy Tétrault LLP showed that certain types of applications for the legal profession seemed to work better than others, panellists told attendees at the recent Canadian Lawyer Legal Tech Summit.
“I would say that the results were mixed,” said David Cohen, senior director of client service delivery for the firm. During the panel, moderated by University of Calgary assistant professor Gideon Christian, Cohen spoke about a pilot of about 40 lawyers from different practices at the firm who used an AI platform with only public data.
“The group [testing the platform] said it needs to get better before we start using this for research,” he said. However, he said, when it came to tasks like generating documents, reviewing 100-page cases “and summarizing and analyzing them,” the AI platforms did much better.
To help with this, our Client Success Team have summarised the eight key legal technology trends in the market, as well as the themes discussed at recent legal technology events and conferences including the British Legal Technology Forum and iManage ConnectLive Virtual 2023, both of which we were proud to sponsor.
On a somewhat related note, also see:
Designing the Law Office of the Future — from workdesign.com by Deborah Nemeth Deborah Nemeth of SmithGroup shares how inspiration from higher education and hospitality can help inform the next evolution of the law office.
Merlyn Mind, an AI-powered digital assistant platform, announced the launch of a suite of large language models (LLMs) specifically tailored for the education sector under an open-source license.
Designing courses in an age of AI — from teachinginhighered.com by Maria Andersen Maria Andersen shares about designing courses in an age of artificial intelligence (AI) on episode 469 of the Teaching in Higher Ed podcast.
With generative AI, we have an incredible acceleration of change happening.
— Daniel Christian (he/him/his) (@dchristian5) June 23, 2023
On giving AI eyes and ears — from oneusefulthing.org by Ethan Mollick AI can listen and see, with bigger implications than we might realize.
Excerpt:
But even this is just the beginning, and new modes of using AI are appearing, which further increases their capabilities. I want to show you some examples of this emerging world, which I think will soon introduce a new wave of AI use cases, and accompanying disruption.
We need to recognize that these capabilities will continue to grow, and AI will be able to play a more active role in the real world by observing and listening. The implications are likely to be profound, and we should start thinking through both the huge benefits and major concerns today.
Even though generative AI is a new thing, it doesn’t change why students cheat. They’ve always cheated for the same reason: They don’t find the work meaningful, and they don’t think they can achieve it to their satisfaction. So we need to design assessments that students find meaning in.
Tricia Bertram Gallant
Caught off guard by AI — from chronicle.com by Beth McMurtrie and Beckie Supiano Professors scrambled to react to ChatGPT this spring — and started planning for the fall
Excerpt:
Is it cheating to use AI to brainstorm, or should that distinction be reserved for writing that you pretend is yours? Should AI be banned from the classroom, or is that irresponsible, given how quickly it is seeping into everyday life? Should a student caught cheating with AI be punished because they passed work off as their own, or given a second chance, especially if different professors have different rules and students aren’t always sure what use is appropriate?
…OpenAI built tool use right into the GPT API with an update called function calling. It’s a little like a child’s ability to ask their parents to help them with a task that they know they can’t do on their own. Except in this case, instead of parents, GPT can call out to external code, databases, or other APIs when it needs to.
Each function in function calling represents a tool that a GPT model can use when necessary, and GPT gets to decide which ones it wants to use and when. This instantly upgrades GPT capabilities—not because it can now do every task perfectly—but because it now knows how to ask for what it wants and get it.
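To make that flow concrete, here is a minimal sketch of function calling against the 2023-era openai Python SDK. The get_weather() helper, its JSON schema, and the specific model name are illustrative assumptions, not details from the excerpt above.

```python
# A minimal sketch of OpenAI function calling, assuming the 2023-era
# openai Python SDK (v0.27+) and an OPENAI_API_KEY in the environment.
# The get_weather() helper, its schema, and the model name are illustrative.
import json
import openai

def get_weather(city: str) -> str:
    # Stand-in for the external code/API that GPT cannot run on its own.
    return json.dumps({"city": city, "forecast": "sunny", "temp_c": 24})

# Describe the tool so the model knows it exists and how to ask for it.
functions = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

messages = [{"role": "user", "content": "What's the weather in Toronto?"}]
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    functions=functions,
    function_call="auto",   # the model decides whether it needs the tool
)
reply = response["choices"][0]["message"]

if reply.get("function_call"):
    # The model "asked for help": run the function and hand the result back.
    args = json.loads(reply["function_call"]["arguments"])
    messages.append(reply)
    messages.append({
        "role": "function",
        "name": "get_weather",
        "content": get_weather(**args),
    })
    final = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages,
    )
    print(final["choices"][0]["message"]["content"])
else:
    print(reply["content"])
```

Note that the model never executes the tool itself: it returns a structured request naming the function and its arguments, our code runs it, and the result goes back to the model for the final answer, which is exactly the "asking for what it wants" behavior described above.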
How ChatGPT can help disrupt assessment overload — from timeshighereducation.com by David Carless Advances in AI are not necessarily the enemy – in fact, they should prompt long overdue consideration of assessment types and frequency, says David Carless
Excerpt:
Reducing the assessment burden could support trust in students as individuals wanting to produce worthwhile, original work. Indeed, students can be co-opted as partners in designing their own assessment tasks, so they can produce something meaningful to them.
A strategic reduction in quantity of assessment would also facilitate a refocusing of assessment priorities on deep understanding more than just performance and carries potential to enhance feedback processes.
If we were to tackle assessment overload in these ways, it opens up various possibilities. Most significantly there is potential to revitalise feedback so that it becomes a core part of a learning cycle rather than an adjunct at its end. End-of-semester, product-oriented feedback, which comes after grades have already been awarded, fails to encourage the iterative loops and spirals typical of productive learning.
Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent AI tools in education and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.
These claims conjure up the rosiest of images: human resource departments and their robot buddies solving discrimination in workplace hiring. It seems plausible, in theory, that AI could root out unconscious bias, but a growing body of research shows the opposite may be more likely.
…
Companies’ use of AI didn’t come out of nowhere: For example, automated applicant tracking systems have been used in hiring for decades. That means if you’ve applied for a job, your resume and cover letter were likely scanned by an automated system. You probably heard from a chatbot at some point in the process. Your interview might have been automatically scheduled and later even assessed by AI.
From DSC:
Here was my reflection on this:
DC: Along these lines, I wonder if Applicant Tracking Systems cause us to become like typecast actors and actresses — only thought of for certain roles. Pigeonholed.
— Daniel Christian (he/him/his) (@dchristian5) June 23, 2023
In June, ResumeBuilder.com surveyed more than 1,000 employees who are involved in hiring processes at their workplaces to find out about their companies’ use of AI interviews.
The results:
43% of companies already have or plan to adopt AI interviews by 2024
Two-thirds of this group believe AI interviews will increase hiring efficiency
15% say that AI will be used to make decisions on candidates without any human input
More than half believe AI will eventually replace human hiring managers
Watch OpenAI CEO Sam Altman on the Future of AI — from bloomberg.com Sam Altman, CEO & Co-Founder, OpenAI discusses the explosive rise of OpenAI and its products and what an AI-laced future can look like with Bloomberg’s Emily Chang at the Bloomberg Technology Summit.
The implementation of generative AI within these products will dramatically improve educators’ ability to deliver personalized learning to students at scale by enabling the application of personalized assessments and learning pathways based on individual student needs and learning goals. K-12 educators will also benefit from access to OpenAI technology…
After chronicling 160+ AI tools (which is surely only a small fraction of the total), we’re seeing a few clear patterns among the tools that have come out so far: here are 10 categories that are jumping out!
“I don’t usually get worked up about announcements but I see promise in JFF’s plans for a new Center for Artificial Intelligence & the Future of Work, in no small part because the organization bridges higher ed, K-12 education, employers, and policymakers.”
BOSTON June 14, 2023 — Jobs for the Future (JFF), a national nonprofit that drives transformation in the U.S. education and workforce systems, today announced the launch of its new Center for Artificial Intelligence & the Future of Work. This center will play an integral role in JFF’s mission and newly announced 10-year North Star goal to help 75 million people facing systemic barriers to advancement work in quality jobs. As AI’s explosive growth reshapes every aspect of how we learn, work, and live, this new center will serve as a nexus of collaboration among stakeholders from every part of the education-to-career ecosystem to explore the most promising opportunities—and profound challenges—of AI’s potential to advance an accessible and equitable future of learning and work.
OpenAI Considers ‘App Store’ For ChatGPT — from searchenginejournal.com; with thanks to Barsee at AI Valley for this resource OpenAI explores launching an ‘app store’ for AI models, potentially challenging current partners and expanding customer reach.
Highlights:
OpenAI considers launching an ‘app store’ for customized AI chatbots.
This move could create competition with current partners and extend OpenAI’s customer reach.
Early interest from companies like Aquant and Khan Academy shows potential, but product development and market positioning challenges remain.
The rise of artificial intelligence, especially generative AI, boosts productivity in content creation–text, code, images and increasingly video.
Here are six preliminary conclusions about the nature of work and learning.
Wonder Tools: AI to try — from wondertools.substack.com by Jeremy Caplan 9 playful little ways to explore AI
Excerpt:
Create a personalized children’s story | Schrodi Collaborate with AI on a free customized, illustrated story for someone special. Give your story’s hero a name, pick a genre (e.g. comedy, thriller), choose an illustration style (e.g. watercolor, 3D animation) and provide a prompt to shape a simple story. You can even suggest a moral. After a minute, download a full-color PDF to share. Or print it and read your new mini picture book aloud.
Generate a quiz | Piggy Put in a link, a topic, or some text and you’ll get a quiz you can share, featuring multiple-choice or true-false questions. Example: try this quick entrepreneurship quiz Piggy generated for me.
Q: How will generative AI impact teaching and learning in the near and long term?
Baker Stein: One-on-one tutoring at scale is finally being unlocked for learners around the world. This type of quality education is no longer only available to students with the means to hire a private tutor. I’m also particularly excited to see how educators make use of generative AI tools to create courses much faster and likely at a higher quality with increased personalization for each student or even by experimenting with new technologies like extended reality. Professors will be able to put their time toward high-impact activities like mentoring, researching and office hours instead of tedious course-creation tasks. This helps open up the capacity for educators to iterate on their courses faster to keep pace with industry and global changes that may impact their field of study.
Another important use case is how generative AI can serve as a great equalizer for students when it comes to writing, especially second language learners.
Why it matters: The best AI assistants will be the ones that require the least prompting. They’ll get to know who you are, what you need, and your modus operandi. Profiles are a good starting point, but we believe the game-changer will be larger context windows (that’s nerd-speak for the amount of context ChatGPT can handle).
From DSC: And how about taking this a step further and remembering — or being able to access — our constantly updated Cloud-Based Learning Profiles?
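As a rough illustration of what a “context window” means in practice, here is a minimal sketch that trims a conversation history to a fixed token budget using the tiktoken library; the model name and the 4,096-token budget are illustrative assumptions. Anything that does not fit inside the window simply falls out of the model’s working memory, which is why larger windows (or persistent profiles) would let an assistant retain more about you.

```python
# A minimal sketch of trimming chat history to fit a context window.
# Assumes the tiktoken library; model name and token budget are illustrative.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
CONTEXT_BUDGET = 4096  # rough token budget for one request

def trim_history(messages, budget=CONTEXT_BUDGET):
    """Keep only the most recent messages that fit inside the budget."""
    kept, used = [], 0
    for msg in reversed(messages):               # walk newest to oldest
        tokens = len(enc.encode(msg["content"]))
        if used + tokens > budget:
            break                                # older context is dropped
        kept.insert(0, msg)
        used += tokens
    return kept

history = [
    {"role": "user", "content": "My name is Dana and I prefer short answers."},
    {"role": "assistant", "content": "Got it, Dana."},
    {"role": "user", "content": "Summarize today's reading for me."},
]
print(trim_history(history))
```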
My hypothesis and research suggest that as bar associations and the ABA begin to recognize the on-going systemic issues of high-cost legal education, growing legal deserts (where no lawyer serves a given population), on-going and pervasive access to justice issues, and a public that is already weary of the legal system – alternative options that are already in play might become more supported.
What might that look like?
The combination of AI-assisted education with traditional legal apprenticeships has the potential to create a rich, flexible, and engaging learning environment. Here are three scenarios that might illustrate what such a combination could look like:
Scenario One – Personalized Curriculum Development
Scenario Two – On-Demand Tutoring and Mentoring
Scenario Three – AI-assisted Peer Networks and Collaborative Learning:
We know that there are challenges – a threat to human jobs, the potential implications for cyber security and data theft, or perhaps even an existential threat to humanity as a whole. But we certainly don’t yet have a full understanding of all of the implications. In fact, a World Economic Forum report recently stated that organizations “may currently underappreciate AI-related risks,” with just four percent of leaders considering the risk level to be “significant.”
A survey carried out by analysts Baker McKenzie concluded that many C-level leaders are over-confident in their assessments of organizational preparedness in relation to AI. In particular, it exposed concerns about the potential implications of biased data when used to make HR decisions.
AI & lawyer training: How law firms can embrace hybrid learning & development — thomsonreuters.com A big part of law firms’ successful adaptation to the increased use of ChatGPT and other forms of generative AI may depend upon how firmly they embrace online learning & development tools designed for hybrid work environments
Excerpt:
As law firms move forward in their use of advanced artificial intelligence such as ChatGPT and other forms of generative AI, their success may hinge upon how they approach lawyer training and development and what tools they enlist for the process.
One of the tools that some law firms use to deliver a new, multi-modal learning environment is an online, video-based learning platform, Hotshot, that delivers more than 250 on-demand courses on corporate, litigation, and business skills.
Ian Nelson, co-founder of Hotshot, says he has seen a dramatic change in how law firms are approaching learning & development (L&D) in the decade or so that Hotshot has been active. He believes the biggest change is that 10 years ago, firms hadn’t yet embraced the need to focus on training and development.
From DSC: Heads up law schools. Are you seeing/hearing this!?
Are we moving more towards a lifelong learning model within law schools?
If not, shouldn’t we be doing that?
Are LLM programs expanding quickly enough? Is more needed?
Could this immersive AR experience revolutionize the culinary arts?
Earlier this month, the popular culinary livestreaming network Kittch announced that it is partnering with American technology company Qualcomm to create hands-free cooking experiences accessible via AR glasses.
The generative AI announcements are coming fast and furious these days, but one of the biggest in terms of sheer dollar commitments just landed: Accenture, the global professional services and consulting giant, today announced it will invest $3 billion (with a “b”!) in AI over the next three years in building out its team of AI professionals and AI-focused solutions for its clients.
“There is unprecedented interest in all areas of AI, and the substantial investment we are making in our Data & AI practice will help our clients move from interest to action to value, and in a responsible way with clear business cases,” said Julie Sweet, Accenture’s chairwoman and CEO.
Also related/see:
Artificial intelligence creates 40,000 new roles at Accenture — from computerweekly.com by Karl Flinders Accenture is planning to add thousands of AI experts to its workforce as part of a $3bn investment in its data and artificial intelligence practice
Why leaders need to evolve alongside generative AI — from fastcompany.com by Kelsey Behringer Even if you’re not an educator, you should not be sitting on the sidelines watching the generative AI conversation being had around you—hop in.
Excerpts (emphasis DSC):
Leaders should be careful to watch and support education right now. At the end of the day, the students sitting in K-12 and college classrooms are going to be future CPAs, lawyers, writers, and teachers. If you are parenting a child, you have skin in the game. If you use professional services, you have skin in the game. When it comes to education, we all have skin in the game. … Students need to master fundamental skills like editing, questioning, researching, and verifying claims before they can use generative AI exceptionally well.
[On 6/15/23, I joined] colleagues from OpenAI, Google, Microsoft, Stanford, Harvard and others at the first meeting of the GenAI Summit. Our shared goal [was] to help educate universities & schools in Europe about the impact of Generative AI on their work.
…how can we effectively communicate to education professionals that generative AI will enhance their work rather than replace them?
A recent controlled study found that ChatGPT can help professionals increase their efficiency in routine tasks by ~35%. If we keep in mind that the productivity gains brought by the steam engine in the nineteenth century were ~25%, this is huge.
As educators, we should embrace the power of ChatGPT to automate the repetitive tasks which we’ve been distracted by for decades. Lesson planning, content creation, assessment design, grading and feedback – generative AI can help us to do all of these things faster than ever before, freeing us up to focus on where we bring most value for our students.
SAN FRANCISCO, June 15 (Reuters) – Alphabet Inc (GOOGL.O) is cautioning employees about how they use chatbots, including its own Bard, at the same time as it markets the program around the world, four people familiar with the matter told Reuters.
The Google parent has advised employees not to enter its confidential materials into AI chatbots, the people said and the company confirmed, citing long-standing policy on safeguarding information.
Adobe Firefly for the Enterprise — Dream Bigger with Adobe Firefly. Dream it, type it, see it with Firefly, our creative generative AI engine. Now in Photoshop (beta), Illustrator, Adobe Express, and on the web.
“With Vision Pro, you’re no longer limited by a display,” Apple CEO Tim Cook said, introducing the new headset at WWDC 2023. Unlike earlier mixed reality reports, the system is far more focused on augmented reality than virtual. The company refers to this new paradigm as “spatial computing.”
“This is the first Apple product you look through and not at.” – Tim Cook
And with those famous words, Apple announced a new era of consumer tech.
Apple’s new headset will operate on VisionOS – its new operating system – and will work with existing iOS and iPad apps. The new OS is created specifically for spatial computing — the blend of digital content into real space.
Vision Pro is controlled through hand gestures, eye movements and your voice (parts of it assisted by AI). You can use apps, change their size, capture photos and videos and more.
From DSC: Time will tell what happens with this new operating system and with this type of platform. I’m impressed with the engineering — as Apple wants me to be — but I doubt that this will become mainstream for quite some time yet. Also, I wonder what Steve Jobs would think of this…? Would he say that people would be willing to wear this headset (for long? at all?)? What about Jony Ive?
I’m sure the offered experiences will be excellent. But I won’t be buying one, as it’s waaaaaaaaay too expensive.
From DSC: I also wanted to highlight the item below, which Barsee also mentioned above, as it will likely hit the world of education and training as well:
Last night, Jensen Huang of NVIDIA gave his very first live keynote in 4-years.
The most show-stopping moment from the event was when he showed off the real-time AI in video games. A human speaks, the NPC responds, in real time and the dialogue was generated with AI on the fly. pic.twitter.com/TDoUM1zSiy
Changed by Our Journey: Engaging Students through Simulive Learning — from er.educause.edu by Lisa Lenze and Megan Costello In this article, an instructor explains how she took an alternative approach to teaching—simulive learning—and discusses the benefits that have extended to her in-person classrooms.
Excerpts:
Mustering courage, Costello devised a novel way to (1) share the course at times other than when it was regularly scheduled and (2) fully engage with her students in the chat channel during the scheduled class meeting time. Her solution, which she calls simulive learning, required her to record her lectures and watch them with her students. (Courageous, indeed!)
Below, Costello and I discuss what simulive learning looks like, how it works, and how Costello has taken her version of remote synchronous teaching forward into current semesters.
Megan Costello: I took a different approach to remote synchronous online learning at the start of the pandemic. Instead of using traditional videoconferencing software to hold class, I prerecorded, edited, and uploaded videos of my lectures to a streaming website. This website allowed me to specify a time and date to broadcast my lectures to my students. Because the lectures were already prepared, I could watch and participate in the chat with my students as we encountered the materials together during the scheduled class time. I drove conversations in chat, asked questions, and got students engaged as we covered materials for the day. The students had my full attention.
Professors Plan Summer AI Upskilling, With or Without Support — from insidehighered.com by Susan D’Agostino Academics seeking respite from the fire hose of AI information and hot takes launch summer workshops. But many of the grass-roots efforts fall short of meeting demand.
Excerpt:
In these summer faculty AI workshops, some plan to take their first tentative steps in redesigning assignments to recognize the AI-infused landscape. Others expect to evolve their in-progress teaching-with-AI practices. At some colleges, full-time staff will deliver the workshops or pay participants for professional development time. But some offerings are grassroots efforts delivered by faculty volunteers attended by participants on their own time. Even so, many worry that the efforts will fall short of meeting demand.
From DSC: We aren’t used to this pace of change. It will take time for faculty members — as well as Instructional Designers, Instructional Technologists, Faculty Developers, Learning Experience Designers, Librarians, and others — to learn more about AI and its implications for teaching and learning. Faculty are learning. Staff are learning. Students are learning. Grace is needed. And faculty/staff modeling what it is to learn themselves is a good thing for students to see as well.
This can be done first and foremost through collaboration, bringing more people to the table, in a meaningful workflow, whereby they can make the best use of their expertise. Moreover, we need to take a step back and keep the big picture in mind, if we want to provide our students with a valuable experience.
…
This is all about creating and nurturing partnerships. Thinking in an inclusive way about who is at the table when we design our courses and our programmes and who we are currently missing. Generally speaking, the main actors involved should be: teaching staff, learning design professionals (under all their various names) and students. Yes, students. Although we are designing for their learning, they are all too often not part of the process.
In order to yield results, collaborative practice needs to be embedded in the institutional fabric, and this takes time. Building silos happens fast; breaking them is a long-term process. Creating a culture of dialogue, with clear and replicable processes, is key to making collaborative learning design work.
From DSC: To me, Alexandra is addressing the topic of using teams to design, develop, and teach/offer courses. This is where a variety of skills and specialties can come together to produce an excellent learning experience. No one individual has all of the necessary skills — nor the necessary time. No way.
Bill Gates says AI is poised to destroy search engines and Amazon — from futurism.com by Victor Tangermann Who will win the AI [competition]? (DSC: I substituted the word competition here, as that’s what it is. It’s not a war, it’s a part of America’s way of doing business.)
“Whoever wins the personal agent, that’s the big thing, because you will never go to a search site again, you will never go to a productivity site, you’ll never go to Amazon again,” Gates said during a Goldman Sachs event on AI in San Francisco this week, as quoted by CNBC.
These AI assistants could “read the stuff you don’t have time to read,” he said, allowing users to get to information without having to use a search engine like Google.
The online learning platform edX introduced two new tools on Friday based on OpenAI’s ChatGPT technology: an edX plugin for ChatGPT and a learning assistant embedded in the edX platform, called Xpert.
According to the company, its plugin will enable ChatGPT Plus subscribers to discover educational programs and explore learning content such as videos and quizzes across edX’s library of 4,200 courses.
Bing is now the default search for ChatGPT — from theverge.com by Tom Warren; via superhuman.beehiiv.com The close partnership between Microsoft and OpenAI leads to plug-in interoperability and search defaults.
Excerpt:
OpenAI will start using Bing as the default search experience for ChatGPT. The new search functionality will be rolling out to ChatGPT Plus users today and will be enabled for all free ChatGPT users soon through a plug-in in ChatGPT.
Students with mobility challenges may find it easier to use generative AI tools — such as ChatGPT or Elicit — to help them conduct research if that means they can avoid a trip to the library.
Students who have trouble navigating conversations — such as those along the autism spectrum — could use these tools for “social scripting.” In that scenario, they might ask ChatGPT to give them three ways to start a conversation with classmates about a group project.
Students who have trouble organizing their thoughts might benefit from asking a generative AI tool to suggest an opening paragraph for an essay they’re working on — not to plagiarize, but to help them get over “the terror of the blank page,” says Karen Costa, a faculty-development facilitator who, among other things, focuses on teaching, learning, and living with ADHD. “AI can help build momentum.”
ChatGPT is good at productive repetition. That is a practice most teachers use anyway to reinforce learning. But AI can take that to the next level by allowing students who have trouble processing information to repeatedly generate examples, definitions, questions, and scenarios of concepts they are learning.
It’s not all on you to figure this out and have all the answers. Partner with your students and explore this together.