Why it matters: The best AI assistants will be the ones that require the least prompting. They’ll get to know who you are, what you need, and your modus operandi. Profiles are a good starting point, but we believe the game-changer will be larger context windows (that’s nerd-speak for the amount of context ChatGPT can handle).
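One rough way to picture a context window: the model can only attend to a fixed budget of tokens, so anything beyond that budget (usually the oldest turns of a conversation) gets dropped. A minimal sketch of that trimming, assuming hypothetical helper names and using a crude whitespace word count in place of a real tokenizer:

```python
def approx_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: count whitespace-separated words.
    return len(text.split())

def trim_to_window(turns: list[str], max_tokens: int) -> list[str]:
    """Keep only the most recent turns that fit inside the token budget."""
    kept: list[str] = []
    used = 0
    for turn in reversed(turns):       # walk newest-first
        cost = approx_tokens(turn)
        if used + cost > max_tokens:
            break                      # older turns fall out of the window
        kept.append(turn)
        used += cost
    return list(reversed(kept))        # restore chronological order

history = [
    "My name is Dana and I teach 7th grade.",
    "Please draft a quiz on fractions.",
    "Make question 3 harder.",
]
print(trim_to_window(history, max_tokens=12))
```

A larger context window simply raises `max_tokens`, so more of who you are and what you've asked for stays visible to the model — which is exactly why bigger windows reduce the need for repeated prompting.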
From DSC: And how about taking this a step further and remembering — or being able to access — our constantly updated Cloud-Based Learning Profiles?
My hypothesis and research suggest that as bar associations and the ABA begin to recognize the ongoing systemic issues of high-cost legal education, growing legal deserts (where no lawyer serves a given population), pervasive access-to-justice problems, and a public that is already weary of the legal system, alternative options that are already in play might gain more support.
What might that look like?
The combination of AI-assisted education with traditional legal apprenticeships has the potential to create a rich, flexible, and engaging learning environment. Here are three scenarios that might illustrate what such a combination could look like:
Scenario One – Personalized Curriculum Development
Scenario Two – On-Demand Tutoring and Mentoring
Scenario Three – AI-assisted Peer Networks and Collaborative Learning
We know that there are challenges – a threat to human jobs, the potential implications for cyber security and data theft, or perhaps even an existential threat to humanity as a whole. But we certainly don’t yet have a full understanding of all of the implications. In fact, a World Economic Forum report recently stated that organizations “may currently underappreciate AI-related risks,” with just four percent of leaders considering the risk level to be “significant.”
A survey carried out by analysts Baker McKenzie concluded that many C-level leaders are over-confident in their assessments of organizational preparedness in relation to AI. In particular, it exposed concerns about the potential implications of biased data when used to make HR decisions.
AI & lawyer training: How law firms can embrace hybrid learning & development — thomsonreuters.com A big part of law firms’ successful adaptation to the increased use of ChatGPT and other forms of generative AI may depend upon how firmly they embrace online learning & development tools designed for hybrid work environments
Excerpt:
As law firms move forward in their use of advanced artificial intelligence such as ChatGPT and other forms of generative AI, their success may hinge upon how they approach lawyer training and development and what tools they enlist for the process.
One of the tools that some law firms use to deliver a new, multi-modal learning environment is an online, video-based learning platform, Hotshot, that delivers more than 250 on-demand courses on corporate, litigation, and business skills.
Ian Nelson, co-founder of Hotshot, says he has seen a dramatic change in how law firms are approaching learning & development (L&D) in the decade or so that Hotshot has been active. He believes the biggest change is that 10 years ago, firms hadn’t yet embraced the need to focus on training and development.
From DSC: Heads up law schools. Are you seeing/hearing this!?
Are we moving more towards a lifelong learning model within law schools?
If not, shouldn’t we be doing that?
Are LLM programs expanding quickly enough? Is more needed?
In March, I reported a pair of stories from Jackson, Miss., where the school district is paying for unlicensed classroom aides to go back to school and get their master’s degrees.
In April, I told the story of a remarkable idea: A new high school in San Antonio dedicated entirely to training high-schoolers in the art and science of good teaching.
From DSC: I would add a few more items:
Significantly reduce the impact of legislators on K-12. Before even starting to draft or debate any legislation that would affect schools, each legislator voting on it must first spend at least ___ week(s) observing in some of the schools that would be impacted.
Instead, turn over more control and power to the students, teachers, K12 administrators, parents, and school boards.
Provide more choice, more control as each student can handle it.
Stop the one-size-fits-all system. Instead, use AI-based systems to provide more personalized learning.
Develop more hybrid programs — but this time I’m talking mixing what we’ve known as public education with homeschooling and smaller learning pods. Let’s expand what’s included when we discuss “learning spaces.”
Strive for a love of learning — vs. competition and developing students who merely game the system
Support makerspaces, entrepreneurship, and experiments
Speaking of experiments, I would recommend developing more bold experiments outside of the current systems.
Along the lines of potential solutions/visions, see:
Foremost among them is this: Despite all the fancy models and white papers about which levers to pull to transform a system, system transformation almost never happens by changing the fundamental tenets of the system itself. Instead, it comes from replacing the system with a brand-new system.
To start to understand why, consider the complicated system in which public schools find themselves. As Thomas Arnett explained, they are one part of a vast value network of federal, state, and local regulators, voters and taxpayers, parents and students, teachers, administrators, unions, curriculum providers, school vendors, public infrastructure, higher education institutions, and more.
New ideas, programs, or entities that don’t fit into these processes, priorities, and cost structures are simply not plug-compatible into that value network. They consequently get rejected, tossed to the fringe, or altered to meet the needs of the existing actors in the value network.
How might we develop and deploy beneficial, safe artificial general intelligence for humanity? Reid and Aria are joined by Sam Altman, the CEO of OpenAI, and Greg Brockman, OpenAI co-founder and president. Sam and Greg trace their journey—from articulating their mission to early company projects and decisions to scaling and sharing GPT-4 with the world. They also explore the transformative impact artificial intelligence can have on other industries, like energy, medicine, education, and law. Plus, GPT-4 offers a poetic perspective on a piece of code.
Let’s look at some ideas for how law schools could use AI tools like Khanmigo or ChatGPT to support lectures, assignments, and discussions, or use plagiarism detection software to maintain academic integrity.
In particular, we’re betting on four trends for AI and L&D.
Rapid content production
Personalized content
Detailed, continuous feedback
Learner-driven exploration
In a world where only 7 percent of the global population has a college degree, and as many as three quarters of workers don’t feel equipped to learn the digital skills their employers will need in the future, this is the conversation people need to have.
…
Taken together, these trends will change the cost structure of education and give learning practitioners new superpowers. Learners of all backgrounds will be able to access quality content on any topic and receive the ongoing support they need to master new skills. Even small L&D teams will be able to create programs that have both deep and broad impact across their organizations.
Generative AI is set to play a pivotal role in the transformation of educational technologies and assisted learning. Its ability to personalize learning experiences, power intelligent tutoring systems, generate engaging content, facilitate collaboration, and assist in assessment and grading will significantly benefit both students and educators.
With today’s advancements in generative AI, that vision of personalized learning may not be far off from reality. We spoke with Dr. Kim Round, associate dean of the Western Governors University School of Education, about the potential of technologies like ChatGPT for learning, the need for AI literacy skills, why learning experience designers have a leg up on AI prompt engineering, and more. And get ready for more Star Trek references, because the parallels between AI and Sci Fi are futile to resist.
NVIDIA today introduced a wave of cutting-edge AI research that will enable developers and artists to bring their ideas to life — whether still or moving, in 2D or 3D, hyperrealistic or fantastical.
Around 20 NVIDIA Research papers advancing generative AI and neural graphics — including collaborations with over a dozen universities in the U.S., Europe and Israel — are headed to SIGGRAPH 2023, the premier computer graphics conference, taking place Aug. 6-10 in Los Angeles.
The papers include generative AI models that turn text into personalized images; inverse rendering tools that transform still images into 3D objects; neural physics models that use AI to simulate complex 3D elements with stunning realism; and neural rendering models that unlock new capabilities for generating real-time, AI-powered visual details.
Also relevant to the item from Nvidia (above), see:
Across all socioeconomic and racial groups, Americans want an education system that goes beyond college preparation and delivers practical skills for every learner, based on their own needs, goals and vision for the future.
We believe that this can be achieved by making the future of learning more personalized, focused on the needs of individual learners, with success measured by progress and proficiency instead of point-in-time test scores.
Change is hard, but we expect our students to take risks and fail every day. We should ask no less of ourselves.
The Future of Teaching and Learning
Artificial intelligence (AI) has taken the world by storm, with new AI-powered tools such as ChatGPT opening up new opportunities in higher education for content creation, communication, and learning, while also raising new concerns about the misuses and overreach of technology. Our shared humanity has also become a key focal point within higher education, as faculty and leaders continue to wrestle with understanding and meeting the diverse needs of students and to find ways of cultivating institutional communities that support student well-being and belonging.
For this year’s teaching and learning Horizon Report, then, our panelists’ discussions oscillated between these seemingly polar ideas: the supplanting of human activity with powerful new technological capabilities, and the need for more humanity at the center of everything we do. This report summarizes the results of those discussions and serves as one vantage point on where our future may be headed.
A few current categories of AI in Edtech particularly jump out:
Teacher Productivity and Joy: Tools to make educators’ lives easier (and more fun?) by removing some of the more rote tasks of teaching, like lesson planning (we counted at least 8 different tools for lesson planning), resource curation and data collection.
Personalization and Learning Delivery: Tools to tailor instruction to the particular interests, learning preferences and preferred media consumption of students. This includes tools that convert text to video, video to text, text to comic books, Youtube to notes, and many more.
Study and Course Creation Tools: Tools for learners to automatically make quizzes, flashcards, notes or summaries of material, or even to automatically create full courses from a search term.
AI Tutors, Chatbots and Teachers: There will be no shortage of conversational AI “copilots” (which may take many guises) to support students in almost any learning context. Many Edtech companies launched their own during the conference. Possible differentiators here could be personality, safety, privacy, access to a proprietary or specific data set, or bots built on proprietary LLMs.
Simplifying Complex Processes: One of the most inspiring conversations of the conference for me was with Tiffany Green, founder of Uprooted Academy, about how AI can and should be used to remove bureaucratic barriers to college for underrepresented students (for example, used to autofill FAFSA forms, College Applications, to search for schools and access materials, etc). This is not the only complex bureaucratic process in education.
Educational LLMs: The race is on to create usable large language models for education that are safe, private, appropriate and classroom-ready. Merlyn Mind is working on this, and companies that make LLMs are sprouting up in other sectors…
This week I spent a few days at the ASU/GSV conference and ran into 7,000 educators, entrepreneurs, and corporate training people who had gone CRAZY for AI.
No, I’m not kidding. This community, which includes people like training managers, community college leaders, educators, and policymakers, is absolutely freaked out about ChatGPT, Large Language Models, and all sorts of issues with AI. Now don’t get me wrong: I’m a huge fan of this. But the frenzy is unprecedented: this is bigger than the excitement at the launch of the iPhone.
Second, the L&D market is about to get disrupted like never before. I had two interactive sessions with about 200 L&D leaders and I essentially heard the same thing over and over. What is going to happen to our jobs when these Generative AI tools start automatically building content, assessments, teaching guides, rubrics, videos, and simulations in seconds?
The answer is pretty clear: you’re going to get disrupted. I’m not saying that L&D teams need to worry about their careers, but it’s very clear to me they’re going to have to swim upstream in a big hurry. As with all new technologies, it’s time for learning leaders to get to know these tools, understand how they work, and start to experiment with them as fast as they can.
Speaking of the ASU+GSV Summit, see this posting from Michael Moe:
Last week, the 14th annual ASU+GSV Summit hosted over 7,000 leaders from 70+ countries, as well as over 900 of the world’s most innovative EdTech companies. Below are some of our favorite speeches from this year’s Summit…
High-quality tutoring is one of the most effective educational interventions we have – but we need both humans and technology for it to work. In a standing-room-only session, GSE Professor Susanna Loeb, a faculty lead at the Stanford Accelerator for Learning, spoke alongside school district superintendents on the value of high-impact tutoring. The most important factors in effective tutoring, she said, are (1) the tutor has data on specific areas where the student needs support, (2) the tutor has high-quality materials and training, and (3) there is a positive, trusting relationship between the tutor and student. New technologies, including AI, can make the first and second elements much easier – but they will never be able to replace human adults in the relational piece, which is crucial to student engagement and motivation.
ChatGPT, Bing Chat, Google’s Bard—AI is infiltrating the lives of billions.
The 1% who understand it will run the world.
Here’s a list of key terms to jumpstart your learning:
Being “good at prompting” is a temporary state of affairs. The current AI systems are already very good at figuring out your intent, and they are getting better. Prompting is not going to be that important for that much longer. In fact, it already isn’t in GPT-4 and Bing. If you want to do something with AI, just ask it to help you do the thing. “I want to write a novel, what do you need to know to help me?” will get you surprisingly far.
…
The best way to use AI systems is not to craft the perfect prompt, but rather to use them interactively. Try asking for something. Then ask the AI to modify or adjust its output. Work with the AI, rather than trying to issue a single command that does everything you want. The more you experiment, the better off you are. Just use the AI a lot, and it will make a big difference – a lesson my class learned as they worked with the AI to create essays.
From DSC: Agreed –> “Being ‘good at prompting’ is a temporary state of affairs.” The user interfaces that are appearing (and will continue to appear) will help greatly in this regard.
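The interactive, back-and-forth workflow described above maps directly onto how chat-style APIs are typically structured: each request carries the full running message history, so a follow-up like “make it shorter” is interpreted in context. A minimal sketch of that loop, where `call_model` is a hypothetical stand-in for a real API call:

```python
from typing import Callable

Message = dict[str, str]  # e.g. {"role": "user", "content": "..."}

def converse(call_model: Callable[[list[Message]], str],
             prompts: list[str]) -> list[Message]:
    """Send each prompt along with the full running history, so the
    model can refine earlier answers instead of starting from scratch."""
    history: list[Message] = []
    for prompt in prompts:
        history.append({"role": "user", "content": prompt})
        reply = call_model(history)    # the model sees the whole conversation
        history.append({"role": "assistant", "content": reply})
    return history

# A fake model for illustration: it just reports how many messages it saw.
fake_model = lambda history: f"(reply after {len(history)} messages)"
log = converse(fake_model, ["Draft an outline.", "Now tighten section 2."])
```

The point of the design is that nothing about the second prompt has to restate the first: the accumulated history is the context, which is why iterating beats crafting one perfect prompt.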
From DSC: Bizarre…at least for me in late April of 2023:
FaceTiming live with AI… This app came across the @ElunaAI Discord and I was very impressed with its responsiveness, natural expression and language, etc…
Feels like the beginning of another massive wave in consumer AI products.
The rise of AI-generated music has ignited legal and ethical debates, with record labels invoking copyright law to remove AI-generated songs from platforms like YouTube.
Tech companies like Google face a conundrum: should they take down AI-generated content, and if so, on what grounds?
Some artists, like Grimes, are embracing the change, proposing new revenue-sharing models and utilizing blockchain-based smart contracts for royalties.
The future of AI-generated music presents both challenges and opportunities, with the potential to create new platforms and genres, democratize the industry, and redefine artist compensation.
The Need for AI PD — from techlearning.com by Erik Ofgang Educators need training on how to effectively incorporate artificial intelligence into their teaching practice, says Lance Key, an award-winning educator.
“School never was fun for me,” he says, hoping that as an educator he could change that with his students. “I wanted to make learning fun.” This ‘learning should be fun’ philosophy is at the heart of the approach he advises educators take when it comes to AI.
At its 11th annual conference in 2023, educational company Coursera announced it is adding ChatGPT-powered interactive ed tech tools to its learning platform, including a generative AI coach for students and an AI course-building tool for teachers. It will also add machine learning-powered translation, expanded VR immersive learning experiences, and more.
Coursera Coach will give learners a ChatGPT virtual coach to answer questions, give feedback, summarize video lectures and other materials, give career advice, and prepare them for job interviews. This feature will be available in the coming months.
From DSC: Yes…it will be very interesting to see how tools and platforms interact from this time forth. The term “integration” will take a massive step forward, at least in my mind.
This year’s Global Sentiment Survey – the tenth – paints a picture that is both familiar and unusual. In our 2020 survey report, we noted that ‘Data dominates this year’s survey’. It does so again this year, with nearly 4,000 respondents showing a strong interest in AI, Skills-based talent management and Learning analytics (in positions #2, #3 and #4), all of which rely on data. The table is topped by Reskilling/upskilling, in the #1 spot for the third year running.
Learning happens throughout life and is not isolated to the K-12 or higher education sectors. Yet, often, validations of learning only happen in these specific areas. The system of evaluation based on courses, grades, and credit serves as a poor proxy for communicating skills given the variation in course content, grade inflation, and inclusion of participation and extra credit within course grades.
Credentialed learning provides a way to accurately document human capability for all learners throughout their lives. A lifetime credentialed learning ecosystem provides better granularity around learning, better documentation of the learning, and more relevance for both the credential recipient and reviewer. This improves the match between individuals and higher education and/or employment, while also providing a clearer and more accurate lifetime learning pathway.
With a fully-credentialed system, individuals can own well-documented evidence of a lifetime of learning and choose what and when to share this data. This technology enables every learner to have more opportunities for finding the best career match without today’s existing barriers around cost, access, and proxies.
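As a thought experiment, a learner-owned credential record with selective sharing might look something like the sketch below. All field and class names here are hypothetical illustrations, not an existing standard:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Credential:
    skill: str          # the capability being attested
    issuer: str         # who verified it (school, employer, apprenticeship, ...)
    earned_on: date
    evidence_url: str   # link to the documented work behind the credential

@dataclass
class LearnerWallet:
    """The learner owns the full record and decides what to share, and when."""
    credentials: list[Credential] = field(default_factory=list)

    def share(self, skills: set[str]) -> list[Credential]:
        # Disclose only the credentials the learner chooses to reveal.
        return [c for c in self.credentials if c.skill in skills]

wallet = LearnerWallet([
    Credential("data analysis", "Community College X",
               date(2021, 5, 1), "https://example.org/1"),
    Credential("contract drafting", "Apprenticeship Y",
               date(2023, 2, 10), "https://example.org/2"),
])
shared = wallet.share({"data analysis"})
```

The design choice worth noticing is that the `share` method puts disclosure under the learner's control — the granular, issuer-backed records exist regardless, but what a reviewer sees is always the learner's selection.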
Addendum on 4/28/23 — speaking of credentials:
First Rung — from the-job.beehiiv.com by Paul Fain New research shows stacking credentials pays off for low-income learners.
Stacking credentials pays off for many low-income students, new research finds, but only if learners move up the education ladder. Also, Kansas is hoping a new grant program will attract more companies to participate in microinternships.
Imagine introducing high-quality AI tutors into the flipped classroom model. These AI-powered systems have the potential to significantly enhance the learning experience for students and make flipped classrooms even more effective. They provide personalized learning, where AI tutors can tailor instruction to each student’s unique needs while continually adjusting content based on performance. This means that students can engage with the content at home more effectively, ensuring they come to class better prepared and ready to dive into hands-on activities or discussions.
With AI tutors taking care of some of the content delivery outside of class, teachers can devote more time to fostering meaningful interactions with their students during class. They can also use insights from the AI tutors to identify areas where students might need extra support or guidance, enabling them to provide more personalized and effective instruction. And with AI assistance, they can design better active learning opportunities in class to make sure the learning sticks.
Also relevant/see:
ChatGPT is going to change education, not destroy it — from technologyreview.com by Will Douglas Heaven The narrative around cheating students doesn’t tell the whole story. Meet the teachers who think generative AI could actually make learning better.
Advanced chatbots could be used as powerful classroom aids that make lessons more interactive, teach students media literacy, generate personalized lesson plans, save teachers time on admin, and more.
What the Past Can Teach Us About the Future of AI and Education — from campustechnology.com by Dr. David Wiley Current attitudes toward generative AI hearken back to early skepticism about the impact of the internet on education. Both then and now, technology has created challenges but also opportunities that can’t be ignored.