1. AI already plays a central part in the instructional design process
A whopping 95.3% of the instructional designers interviewed said they use AI in their day-to-day work. Those who don’t use AI cite access or permission issues as the primary reason they haven’t integrated AI into their process.
2. AI is predominantly used at the design and development stages of the instructional design process
When mapped to the ADDIE process, the breakdown of use cases goes as follows:
Analysis: 5.5% of use cases
Design: 32.1%
Development: 53.2%
Implementation: 1.8%
Evaluation: 7.3%
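To make the "design and development dominate" finding concrete, here is a minimal sketch in Python that restates the survey's percentages (copied from the breakdown above) as data; the hairline shortfall from 100% is rounding in the source.

```python
# The survey's ADDIE-stage breakdown of AI use cases, restated as data.
# Percentages are taken directly from the report cited above.
addie_use_cases = {
    "Analysis": 5.5,
    "Design": 32.1,
    "Development": 53.2,
    "Implementation": 1.8,
    "Evaluation": 7.3,
}

total = sum(addie_use_cases.values())
design_and_development = addie_use_cases["Design"] + addie_use_cases["Development"]

print(f"Total: {total:.1f}%")                                  # 99.9% (rounding)
print(f"Design + Development: {design_and_development:.1f}%")  # 85.3%
```

Design and development together account for roughly 85% of reported use cases, which is the basis for the second finding below.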
Speaking of AI in our learning ecosystems, also see:
The majority of educators expect use of artificial intelligence tools will increase in their school or district over the next year, according to an EdWeek Research Center survey.
Applying Multimodal AI to L&D — from learningguild.com by Sarah Clark
We’re just starting to see multimodal AI systems hit the spotlight. Unlike the text-based and image-generation AI tools we’ve seen before, multimodal systems can absorb and generate content in multiple formats – text, image, video, audio, etc.
The early vibrations of AI have already been shaking the newsroom. One downside of the new technology surfaced at CNET and Sports Illustrated, where editors let AI run amok with disastrous results. Elsewhere in news media, AI is already writing headlines, managing paywalls to increase subscriptions, performing transcriptions, turning stories into audio feeds, discovering emerging stories, fact checking, copy editing and more.
Felix M. Simon, a doctoral candidate at Oxford, recently published a white paper about AI’s journalistic future that eclipses many early studies. Swinging a bat from a crouch that is neither doomer nor utopian, Simon heralds both the downsides and the promise of AI’s introduction into the newsroom and the publisher’s suite.
Unlike earlier technological revolutions, AI is poised to change the business at every level. It will become — if it isn’t already — the beginning of most story assignments and will become, for some, the new assignment editor. Used effectively, it promises to make news more accurate and timely. Used frivolously, it will spawn an ocean of spam. Wherever the production and distribution of news can be automated or made “smarter,” AI will surely step up. But the future has not yet been written, Simon counsels. AI in the newsroom will be only as bad or good as its developers and users make it.
We propose EMO, an expressive audio-driven portrait-video generation framework. Given a single reference image and vocal audio (e.g., talking or singing), our method can generate vocal avatar videos with expressive facial expressions and varied head poses; moreover, it can generate videos of any duration, depending on the length of the input audio.
New experimental work from Adobe Research is set to change how people create and edit custom audio and music. An early-stage generative AI music generation and editing tool, Project Music GenAI Control allows creators to generate music from text prompts, and then have fine-grained control to edit that audio for their precise needs.
“With Project Music GenAI Control, generative AI becomes your co-creator. It helps people craft music for their projects, whether they’re broadcasters, or podcasters, or anyone else who needs audio that’s just the right mood, tone, and length,” says Nicholas Bryan, Senior Research Scientist at Adobe Research and one of the creators of the technologies.
There’s a lot going on in the world of generative AI, but maybe the biggest is the increasing number of copyright lawsuits being filed against AI companies like OpenAI and Stability AI. So for this episode, we brought on Verge features editor Sarah Jeong, who’s a former lawyer just like me, and we’re going to talk about those cases and the main defense the AI companies are relying on in those copyright cases: an idea called fair use.
The FCC’s war on robocalls has gained a new weapon in its arsenal with the declaration of AI-generated voices as “artificial” and therefore definitely against the law when used in automated calling scams. It may not stop the flood of fake Joe Bidens that will almost certainly trouble our phones this election season, but it won’t hurt, either.
The new rule, contemplated for months and telegraphed last week, isn’t actually a new rule — the FCC can’t just invent them with no due process. Robocalls are just a new term for something largely already prohibited under the Telephone Consumer Protection Act: artificial and pre-recorded messages being sent out willy-nilly to every number in the phone book (something that still existed when they drafted the law).
EIEIO…Chips Ahoy! — from dashmedia.co by Michael Moe, Brent Peus, and Owen Ritz
Here Come the AI Worms — from wired.com by Matt Burgess
Security researchers created an AI worm in a test environment that can automatically spread between generative AI agents—potentially stealing data and sending spam emails along the way.
Now, in a demonstration of the risks of connected, autonomous AI ecosystems, a group of researchers has created what they claim is one of the first generative AI worms—one that can spread from one system to another, potentially stealing data or deploying malware in the process. “It basically means that now you have the ability to conduct or to perform a new kind of cyberattack that hasn’t been seen before,” says Ben Nassi, a Cornell Tech researcher behind the work.
Last month, I discussed a GPT that I had created for enhancing prompts. Since then, I have been actively using my Prompt Enhancer GPT to produce much more effective outputs. Last week, I did a series of mini-talks on generative AI in different parts of higher education (faculty development, human resources, grants, executive leadership, etc.) and structured each as “5 tips.” I included a final bonus tip in all of them—a tip that, I heard from many afterwards, was probably the most useful—especially because you can only access the Prompt Enhancer GPT if you are paying for ChatGPT.
Effectively integrating generative AI into higher education requires policy development, cross-functional engagement, ethical principles, risk assessments, collaboration with other institutions, and an exploration of diverse use cases.
Creating Guidelines for the Use of Gen AI Across Campus — from campustechnology.com by Rhea Kelly
The University of Kentucky has taken a transdisciplinary approach to developing guidelines and recommendations around generative AI, incorporating input from stakeholders across all areas of the institution. Here, the director of UK’s Center for the Enhancement of Learning and Teaching breaks down the structure and thinking behind that process.
That resulted in a set of instructional guidelines that we released in August of 2023 and updated in December of 2023. We’re also looking at guidelines for researchers at UK, and we’re currently in the process of working with our colleagues in the healthcare enterprise, UK Healthcare, to comb through the additional complexities of this technology in clinical care and to offer guidance and recommendations around those issues.
My experiences match with the results of the above studies. The second study cited above found that 83% of those students who haven’t used AI tools are “not interested in using them,” so it is no surprise that many students have little awareness of their nature. The third study cited above found that, “apart from 12% of students identifying as daily users,” most students’ use cases were “relatively unsophisticated” like summarizing or paraphrasing text.
For those of us in the AI-curious bubble, we need to continually work to stay current, but we also need to recognize that what we take to be “common knowledge” is far from common outside of the bubble.
Despite general familiarity, however, technical knowledge shouldn’t be assumed for district leaders or others in the school community. For instance, it’s critical that any materials related to AI not be written in “techy talk” so they can be clearly understood, said Ann McMullan, project director for the Consortium for School Networking’s EmpowerED Superintendents Initiative.
To that end, CoSN, a nonprofit that promotes technological innovation in K-12, has released an array of AI resources to help superintendents stay ahead of the curve, including a one-page explainer that details definitions and guidelines to keep in mind as schools work with the emerging technology.
Last month, OpenAI chief executive Sam Altman finally admitted what researchers have been saying for years — that the artificial intelligence (AI) industry is heading for an energy crisis. It’s an unusual admission. At the World Economic Forum’s annual meeting in Davos, Switzerland, Altman warned that the next wave of generative AI systems will consume vastly more power than expected, and that energy systems will struggle to cope. “There’s no way to get there without a breakthrough,” he said.
I’m glad he said it. I’ve seen consistent downplaying and denial about the AI industry’s environmental costs since I started publishing about them in 2018. Altman’s admission has got researchers, regulators and industry titans talking about the environmental impact of generative AI.
Yesterday, Nvidia reported $22.1 billion in revenue for the fourth quarter of fiscal 2024 (ending January 31, 2024), easily topping Wall Street’s expectations. Revenue grew 265% from a year ago, thanks to the explosive growth of generative AI.
…
He also repeated a notion about “sovereign AI”: the idea that countries protect their users’ data, and companies their employees’ data, by keeping the large language models within the borders of the country or the company for safety purposes.
BREAKING: Adobe has created a new 50-person AI research org called CAVA (Co-Creation for Audio, Video, & Animation).
I can’t help but wonder whether OpenAI’s Sora has been a wake-up call for Adobe to formalize and accelerate its video and multimodal creation efforts.
Nvidia is building a new type of data centre called an “AI factory.” Every company—biotech, self-driving, manufacturing, etc.—will need an AI factory.
Jensen is looking forward to foundational robotics and state space models. According to him, foundational robotics could have a breakthrough next year.
The crunch for Nvidia GPUs is here to stay. It won’t be able to catch up on supply this year, and probably not next year, either.
A new generation of GPUs called Blackwell is coming out, and the performance of Blackwell is off the charts.
Nvidia’s business is now roughly 70% inference and 30% training, meaning AI is getting into users’ hands.
From DSC: This first item is related to the legal field being able to deal with today’s issues:
The Best Online Law School Programs (2024) — from abovethelaw.com by Staci Zaretsky
A tasty little rankings treat before the full Princeton Review best law schools ranking is released.
Several law schools now offer online JD programs that have become as rigorous as their on-campus counterparts. For many JD candidates, an online law degree might even be the smarter choice. Online programs offer flexibility, affordability, access to innovative technologies, students from a diversity of career backgrounds, and global opportunities.
Voila! Feast your eyes upon the Best Online JD Programs at Law School for 2024 (in alphabetical order):
Mitchell Hamline School of Law – Hybrid J.D.
Monterey College of Law – Hybrid Online J.D.
Purdue Global Law School – Online J.D.
Southwestern Law School – Online J.D.
Syracuse University – J.D. Interactive
University of Dayton School of Law – Online Hybrid J.D.
University of New Hampshire – Hybrid J.D.
DSC: FINALLY!!! Online learning hits law schools (at least in a limited fashion)!!! Maybe there’s hope yet for the American Bar Association and for America’s legal system to be able to deal with the emerging technologies — and the issues presented therein — in the 21st century!!! Because if we can’t even get caught up to where numerous institutions of higher education were back at the turn of this century, we don’t have as much hope in the legal field being able to address things like AI, XR, cryptocurrency, blockchain, and more.
Advocate, advise, and accompany — from jordanfurlong.substack.com by Jordan Furlong
These are the three essential roles lawyers will play in the post-AI era. We need to start preparing legal education, lawyer licensing, and law practices to adapt.
Consider this scenario:
Ten years from now, Generative AI has proven capable of a stunning range of legal activities. Not only can it accurately write legal documents, conduct legal research, and apply law to facts, it can reliably oversee legal document production, handle contract negotiations, monitor regulatory compliance, render legal opinions, and much more. Lawyers are no longer needed to carry out these previously billable tasks or even to double-check the AI’s performance. Tasks that once occupied 80% of lawyers’ billable time have been automated.
What are the chances this scenario unfolds within the next ten years? You can decide that likelihood for yourself, but I think anything above 1% represents the potential for major disruption to the legal profession.
Also from Jordan, see:
Could Generative AI help solve the law’s access dilemma?
Legal regulators have failed to strike the right balance between accessibility and quality in legal services. It’s just possible that Gen AI could open the door to an entirely new solution.
Drawing on Colin Levy’s insights, we have gathered the top five strategies that legal professionals can implement, together with technology, to excel in this transformational era.
Going further than just legal spend, analytics on volume of work and diversity metrics can help legal teams make the business case they need to drive important initiatives and decisions forward. And a key differentiator of top-performing companies is the ability to get all of this data in one place, which is why Mitratech was thrilled to unveil PlatoBI, an embedded analytics platform powered by Snowflake, earlier this year with several exciting AI and Analytic enhancements.
Also worrisome is that AI will be used as a crutch that short-circuits learning. Some people look for shortcuts. What effect will AI have on that learning process and its results, both for students and for lawyers who use AI to draft documents and conduct research?
SAN JOSE, Calif. – [On 2/20/24], Adobe (Nasdaq:ADBE) introduced AI Assistant in beta, a new generative AI-powered conversational engine in Reader and Acrobat.
…
Simply open Reader or Acrobat and start working with the new capabilities, including:
AI Assistant: AI Assistant recommends questions based on a PDF’s content and answers questions about what’s in the document – all through an intuitive conversational interface.
Generative summary: Get a quick understanding of the content inside long documents with short overviews in easy-to-read formats.
Intelligent citations: Adobe’s custom attribution engine and proprietary AI generate citations so customers can easily verify the source of AI Assistant’s answers.
Easy navigation:
Formatted output:
Respect for customer data:
Beyond PDF: Customers can use AI Assistant with all kinds of document formats (Word, PowerPoint, meeting transcripts, etc.)
Essential skills to thrive with Sora AI
The realm of video editing is no longer just about cutting and splicing.
A video editor should learn a diverse set of skills to earn money, such as:
Prompt Writing
Software Mastery
Problem-solving skills
Collaboration and communication skills
Creative storytelling and visual aesthetics
Invest in those skills that give you a competitive edge.
The text file that runs the internet — from theverge.com by David Pierce For decades, robots.txt governed the behavior of web crawlers. But as unscrupulous AI companies seek out more and more data, the basic social contract of the web is falling apart.
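Since robots.txt is just a plain-text convention, it is easy to see how an AI-crawler opt-out works in practice. Below is a minimal sketch using Python’s standard urllib.robotparser; the file contents and URL are hypothetical examples, though GPTBot is the user-agent OpenAI has documented for its crawler.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: welcome ordinary crawlers, but ask
# OpenAI's documented crawler (GPTBot) not to fetch anything.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The protocol is purely advisory: well-behaved crawlers check these
# rules before fetching, but nothing technically enforces them.
print(parser.can_fetch("GPTBot", "https://example.com/article"))    # False
print(parser.can_fetch("Googlebot", "https://example.com/article")) # True
```

That advisory nature is exactly the “social contract” the article describes: the rules only work as long as crawlers choose to honor them.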
Using AI, the team was able to plow through 32.6 million possible battery materials in 80 hours, a task the team estimates would have taken them 20 years to do.
Text to video via OpenAI’s Sora. (I had taken this screenshot on the 15th, but am posting it now.)
We’re teaching AI to understand and simulate the physical world in motion, with the goal of training models that help people solve problems that require real-world interaction.
Introducing Sora, our text-to-video model. Sora can generate videos up to a minute long while maintaining visual quality and adherence to the user’s prompt.
At the University of Pennsylvania, undergraduate students in its school of engineering will soon be able to study for a bachelor of science degree in artificial intelligence.
What can one do with an AI degree? The University of Pennsylvania says students will be able to apply the skills they learn in school to build responsible AI tools, develop materials for emerging chips and hardware, and create AI-driven breakthroughs in healthcare through new antibiotics, among other things.
Google on Monday announced plans to help train people in Europe with skills in artificial intelligence, the latest tech giant to invest in preparing workers and economies amid the disruption brought on by technologies they are racing to develop.
The acceleration of AI deployments has gotten so absurdly out of hand that a draft post I started a week ago about a new development is now out of date.
… The Pace is Out of Control
A mere week since Ultra 1.0’s announcement, Google has now introduced us to Ultra 1.5, a model they are clearly positioning to be the leader in the field. Here is the full technical report for Gemini Ultra 1.5, and what it can do is stunning.
[St. Louis, MO, February 14, 2024] – In a bold move that counters the conventions of more traditional schools, Maryville University has unveiled a substantial $21 million multi-year investment in artificial intelligence (AI) and cutting-edge technologies. This groundbreaking initiative is set to transform the higher education experience to be powered by the latest technology to support student success and a five-star experience for thousands of students both on-campus and online.
“The world is adapting to seismic shifts from generative AI,” says Luben Pampoulov, Partner at GSV Ventures. “AI co-pilots, AI tutors, AI content generators—AI is ubiquitous, and differentiation is increasingly critical. This is an impressive group of EdTech companies that are leveraging AI and driving positive outcomes for learners and society.”
Workforce Learning comprises 34% of the list, K-12 29%, Higher Education 24%, Adult Consumer Learning 10%, and Early Childhood 3%. Additionally, 21% of the companies stretch across two or more “Pre-K to Gray” categories. A broader move towards profitability is also evident: the collective gross and EBITDA margin score of the 2024 cohort increased 5% compared to 2023.
Selected from 2,000+ companies around the world based on revenue scale, revenue growth, user reach, geographic diversification, and margin profile, this impressive group is reaching an estimated 3 billion people and generating an estimated $23 billion in revenue.
From DSC: This would be huge for all of our learning ecosystems, as the learning agents could remember where a particular student or employee is at in terms of their learning curve for a particular topic.
In the realm of technological advancements, artificial intelligence (AI) stands out as a beacon of immeasurable potential, yet also as a source of existential angst when considering that AI might already be beyond our ability to control.
His research underscores a chilling truth: our current understanding and control of AI are woefully inadequate, posing a threat that could either lead to unprecedented prosperity or catastrophic extinction.
From DSC: This next item is for actors, actresses, and voiceover specialists:
Turn your voice into passive income — from elevenlabs.io; via Ben’s Bites
Are you a professional voice actor? Sign up and share your voice today to start earning rewards every time it’s used.
This paper presents a groundbreaking comparison between Large Language Models (LLMs) and traditional legal contract reviewers—Junior Lawyers and Legal Process Outsourcers (LPOs). We dissect whether LLMs can outperform humans in accuracy, speed, and cost-efficiency during contract review. Our empirical analysis benchmarks LLMs against a ground truth set by Senior Lawyers, uncovering that advanced models match or exceed human accuracy in determining legal issues. In speed, LLMs complete reviews in mere seconds, eclipsing the hours required by their human counterparts. Cost-wise, LLMs operate at a fraction of the price, offering a staggering 99.97 percent reduction in cost over traditional methods. These results are not just statistics—they signal a seismic shift in legal practice. LLMs stand poised to disrupt the legal industry, enhancing accessibility and efficiency of legal services. Our research asserts that the era of LLM dominance in legal contract review is upon us, challenging the status quo and calling for a reimagined future of legal workflows.
Technology can help you solve problems in your law firm, connect with clients, save time and money, and so much more. But, how do you know what tech will work best for your practice? Molly Ranns and JoAnn Hathaway talk with Colin Levy about understanding and utilizing technology, common mistakes in choosing new tools, and ways to overcome tech-related fear and anxiety.
Every time we attend Legalweek, we have a unique opportunity to tap into the collective knowledge of hundreds of legal professionals. This year at Legalweek 2024 we talked with peers in a wide variety of roles, from litigation support professionals and lawyers to partners and heads of innovation. Throughout the sessions and discussions, we started to notice a few common themes, interesting trends, and helpful insights.
When it comes to generative AI at school and work, Gen Z says: Bring it on.
Why it matters: While some workers are fearful or ambivalent about how ChatGPT, DALL-E and their ilk will affect their jobs, many college students and newly minted grads think it can give them a career edge.
So-called “AI natives” who are studying the technology in school may have a leg up on older “digital natives” — the same way that “digital native” millennials smugly bested their “digital immigrant” elders.
Why are we doing this work? Over the past two years, the U.S. Department of Education has been committed to maintaining an ongoing conversation with educators, students, researchers, developers — and the educational community at large — related to the continuous progress of Artificial Intelligence (AI) development and its implications for teaching and learning.
Many educators are seeking resources clarifying what AI is and how it will impact their work and their students. Similarly, developers of educational technology (“edtech”) products seek guidance on what guardrails exist that can support their efforts. After the release of our May 2023 report Artificial Intelligence and the Future of Teaching and Learning, we heard the desire for more.
Moving from reaction to action, higher education stakeholders are currently exploring the opportunities afforded by AI for teaching, learning, and work while maintaining a sense of caution for the vast array of risks AI-powered technologies pose. To aid in these efforts, we present this inaugural EDUCAUSE AI Landscape Study, in which we summarize the higher education community’s current sentiments and experiences related to strategic planning and readiness, policies and procedures, workforce, and the future of AI in higher education.
Educational administrators should not worry about every AI development but should instead focus on the big picture, as those big-picture changes will reshape the entire world and the educational system.
AI and related technologies (robotics, synthetic biology, and brain-computer interfaces) will continue to impact society and the entire educational system over the next 10 years. This impact on the system will be greater than anything that has happened over the last 100 years, including COVID-19: whereas COVID-19 eventually ended, the disruptive force of these technologies will only continue to develop.
AI is the bull in the china shop, redefining the world and the educational system. Students writing a paper with AI is barely a poke in the educational world relative to what is starting to happen (active AI teachers and tutors; AI assessment; AI glasses; immersive learning environments; young students able to start their own businesses with AI tools; AIs replacing and changing jobs; deep voice and video fakes; intelligence leveling; individualized instruction; interactive and highly intelligent computers; computers that can act autonomously; and more).
WASHINGTON, Feb. 6, 2024 /PRNewswire/ — While roughly three-quarters or more of higher-education trustees (83%), faculty (81%) and administrators (76%) agree that generative artificial intelligence (GenAI) will noticeably change their institutions in the next five years, community college trustees are more optimistic than their community college counterparts, with 37% saying their organization is prepared for the coming change, compared to just 16% of faculty and 11% of administrator respondents.
Those findings are from the 2023-2024 Digital Learning Pulse Survey conducted by Cengage and Bay View Analytics with support from the Association of Community College Trustees (ACCT), the Association of College and University Educators (ACUE), College Pulse and the United States Distance Learning Association (USDLA) to understand the attitudes and concerns of higher education instructors and leadership.
From DSC: It takes time to understand what a given technology brings to the table…let alone a slew of emerging technologies under the artificial intelligence (AI) umbrella. It’s hard enough when the technology is fairly well established and not changing all the time. But it’s extremely difficult when significant change occurs almost daily.
The limited staff within the teaching & learning centers out there need time to research and learn about the relevant technologies and how to apply them to instructional design. The already-stretched-thin faculty members need time to learn about those technologies as well — and to decide whether and how they want to apply them. It takes time and research and effort.
Provosts, deans, presidents, and such need time to learn things as well.
Bottom line: We need to have realistic expectations here.
AI Adoption in Corporate L&D — from drphilippahardman.substack.com by Dr. Philippa Hardman
Where we are, and the importance of use cases in enabling change
The headline of the report is that we’ve never seen a technology adopted in enterprise as fast as generative AI. As of November 2023, two-thirds (67%) of survey respondents reported that their companies are using generative AI.
However, the vast majority of AI adopters in enterprise are still in the early stages; they’re experimenting at the edges, rather than making larger-scale, strategic decisions on how to leverage AI to accelerate our progress towards org goals and visions.
The single biggest hurdle to AI adoption in large corporates is a lack of appropriate use cases.