LLMs Will Make Creating the Content Infrastructure Significantly Easier, Faster, and Cheaper
LLMs will dramatically increase the speed of creating the informational resources that comprise the content infrastructure. Of course, these drafts will need to be reviewed and improved, as is the case with all first drafts, to ensure accuracy and timeliness. But it appears that LLMs can get us roughly 80% of the way to reasonable first drafts orders of magnitude faster, eliminating much of the expense involved in this part of the process. Here’s an example of what I’m talking about. Imagine you’re an SME who has been tasked with writing the content for an introductory economics textbook. (The following examples are from ChatGPT.)
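To make that workflow concrete, here is a minimal sketch of how an SME or instructional designer might programmatically request first-draft sections from an LLM. It assumes the OpenAI text-completions REST endpoint as it existed in early 2023; the API key, model name, prompt wording, and topic list are all placeholders, and every draft returned would still need expert review for accuracy and timeliness.

```python
# A minimal sketch (not a production tool): ask an LLM for first-draft
# textbook sections that a subject matter expert then reviews and revises.
# Assumes the OpenAI completions REST endpoint circa early 2023; the API key,
# model name, prompt, and topic list are placeholders.
import requests

API_KEY = "YOUR_OPENAI_API_KEY"  # hypothetical placeholder
ENDPOINT = "https://api.openai.com/v1/completions"

def draft_section(topic: str) -> str:
    """Request a rough first draft for one textbook section."""
    prompt = (
        "Write a clear, 500-word introductory economics textbook section on "
        f"'{topic}' for first-year undergraduates. Include one worked example."
    )
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "text-davinci-003", "prompt": prompt,
              "max_tokens": 900, "temperature": 0.5},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["text"].strip()

# The SME still owns accuracy and timeliness: every draft goes to human review.
for topic in ["Supply and demand", "Opportunity cost", "Price elasticity"]:
    print(f"--- DRAFT: {topic} ---")
    print(draft_section(topic))
```

The point is not the particular API but the division of labor: the model produces the rough 80%, and the SME spends their time on the remaining 20% of review, correction, and judgment.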
Speaking of instructional design (ID) and higher education, this is also relevant:
Infinite AI Interns for Everybody — from wired.com by Matt Clifford; via Sam DeBrule These assistants won’t just ease the workload; they’ll unleash a wave of entrepreneurship.
Excerpt:
Excitingly, though, there’s also a new generation of startups that are demonstrating that you don’t need a billion-dollar budget to get to the cutting edge of AI. Take Midjourney or Stability AI, applications which produce results that rival DALL-E, or Causaly (disclosure: I’m an investor), which allows scientists to find new causal relationships in life sciences with natural language questions. Then there is a growing list of new AI startups with impressive backers and more general ambitions, like Anthropic (an AI safety and research firm), Conjecture (which seeks to keep damaging factors such as racial bias out of AI), and Keen Technologies, which was founded by computer science legend John Carmack.
Just as the advent of the internet gave every startup a vastly scalable distribution engine, the era of AI superpowers will give every startup a vastly scalable production engine.
Recent advances in artificial intelligence (AI) are promising great things for learning. The potential here is impressive, but there also exist many questions and insecurities around deploying AI technology for learning: What can AI do? Where is it best utilized? What are the limits? And particularly: What does that leave for the instructional designer and other human roles in learning, such as coaching and training?
We want to suggest that these developments are for the benefit of everyone—from organizational development strategy devised in the C-suite, via content creation/curation by instructional designers, right through to the learners, as well as coaches and trainers who work with the learners.
A learning ecosystem is composed of people, tools, technologies, content, processes, culture, strategies, and any other resource that helps one learn. Learning ecosystems can be at an individual level as well as at an organizational level.
Some example components:
Subject Matter Experts (SMEs) such as faculty, staff, teachers, trainers, parents, coaches, directors, and others
Fellow employees
L&D/Training professionals
Managers
Instructional Designers
Librarians
Consultants
Types of learning
Active learning
Adult learning
PreK-12 education
Training/corporate learning
Vocational learning
Experiential learning
Competency-based learning
Self-directed learning (i.e., heutagogy)
Mobile learning
Online learning
Face-to-face-based learning
Hybrid/blended learning
HyFlex-based learning
Game-based learning
XR-based learning (AR, MR, and VR)
Informal learning
Formal learning
Lifelong learning
Microlearning
Personalized/customized learning
Play-based learning
Cloud-based learning apps
Coaching & mentoring
Peer feedback
Job aids/performance tools and other on-demand content
Websites
Conferences
Professional development
Professional organizations
Social networking
Social media – Twitter, LinkedIn, Facebook/Meta, and others
Communities of practice
Artificial Intelligence (AI) — including ChatGPT, learning agents, and learner profiles
In spite of our tendency to break things down into tidy time frames, like a new year or academic semester, change constantly turns over the status quo, especially in the world of technology, where disruptive innovation can evolve rapidly from the fringe to the mainstream.
“At ASU’s Enterprise Technology, we work in spaces where technology is not just revolutionizing higher education, but the world at large,” said Lev Gonick, chief information officer at Arizona State University. “We strive to be proactive, not reactive, to new paradigms changing the ways in which we work, learn and thrive.”
Thus, the top higher education technology trends to watch out for in 2023 include Artificial Intelligence (AI), Virtual Reality (VR), Augmented Reality (AR), Digital Twins, the Metaverse (including digital avatars and NFT art for use in the Metaverse and other Web3-based virtual environments), Internet of Things (IoT), Blockchain, Cloud, Gamification, and Chatbots. These technologies will support the expansion of the Digital Transformation of higher education going forward.
DAVOS, Switzerland—Microsoft Corp. plans to incorporate artificial-intelligence tools like ChatGPT into all of its products and make them available as platforms for other businesses to build on, Chief Executive Satya Nadella said.
It’s only a matter of time before LMS providers like Canvas and Anthology do the same. That is really going to change the complexion of online learning.
Microsoft is holding a lot of great cards in the AI game, especially ChatGPT, but Google also has a great hand; in fact, they have a bird in the hand:
Sparrow, from DeepMind, is likely to launch soon. Their aim is to trump ChatGPT by having a chatbot that is more useful and reduces the risk of unsafe and inappropriate answers. In the released paper, they also indicate that it will have moral constraints. Smart move.
Hassabis has promised some sort of release in 2023. Their goal is to reduce wrong and invented information by linking the chatbot to Google Search and Google Scholar for citations.
The Art of ChatGPT Prompting: A Guide to Crafting Clear and Effective Prompts.
This free e-book acts as a useful guide for beginners.
Collection of ChatGPT Resources — Use ChatGPT in Google Docs, in WhatsApp, as a desktop app, with your voice, or in other ways with this running list of tools.
Awesome ChatGPT prompts
Dozens of clever pre-written prompts you can use to initiate your own conversations with ChatGPT to get it to reply as a fallacy finder or a journal reviewer or whatever else.
Writing for Renegades – Co-writing with AI
This free 17-page resource has writing exercises you can try with ChatGPT. It also includes interesting nuggets, like Wycliffe A. Hill’s 1936 attempt at writing automation, Plot Genie.
We often see the battle between technology and humans as a zero-sum game. And that’s how much of the discussion about ChatGPT is being framed now. Like many others who have been experimenting with ChatGPT in recent weeks, I find that a lot of the output depends on the input. In other words, the better the human question, the better the ChatGPT answer.
So instead of seeing ourselves competing with technology, we should find ways to complement it and view ChatGPT as a tool that assists us in collecting information and in writing drafts.
If we reframe the threat, think about how much time could be freed up to read, to think, and to write.
As many have noted, including Michael Horn on the Class Disrupted podcast he co-hosts, ChatGPT is to writing what calculators were once to math and other STEM disciplines.
GPT in Higher Education — from insidehighered.com by Ray Schroeder ChatGPT has caught our attention in higher education. What will it mean in 2023?
Excerpt:
Founder and CEO at Moodle Martin Dougiamas writes in Open Ed Tech that as educators, we must recognize that artificial general intelligence will become ubiquitous. “In short, we need to embrace that AI is going to be a huge part of our lives when creating anything. There is no gain in banning it or avoiding it. It’s actually easier (and better) to use this moment to restructure our education processes to be useful and appropriate in today’s environment (which is full of opportunities).”
Who, at your institution, is examining the impact of AI, and in particular GPT, upon the curriculum? Are instructional designers working with instructors in revising syllabi and embedding AI applications into the course offerings? What can you do to ensure that your university is preparing learners for the future rather than the past?
Ray Schroeder
ChatGPT Advice Academics Can Use Now — from insidehighered.com by Susan D’Agostino To harness the potential and avert the risks of OpenAI’s new chat bot, academics should think a few years out, invite students into the conversation and—most of all—experiment, not panic.
At schools including George Washington University in Washington, D.C., Rutgers University in New Brunswick, New Jersey, and Appalachian State University in Boone, North Carolina, professors are phasing out take-home, open-book assignments — which became a dominant method of assessment in the pandemic but now seem vulnerable to chatbots. They are instead opting for in-class assignments, handwritten papers, group work and oral exams.
Gone are prompts like “write five pages about this or that.” Some professors are instead crafting questions that they hope will be too clever for chatbots and asking students to write about their own lives and current events.
Why Banning ChatGPT in Class Is a Mistake — from campustechnology.com by Thomas Mennella Artificial intelligence can be a valuable learning tool, if used in the right context. Here are ways to embrace ChatGPT and encourage students to think critically about the content it produces.
Well, it was bound to happen. Anytime you have a phenomenon as disruptive as generative AI, you can expect lawsuits.
Case in point: the lawsuit recently filed by Getty Images against Stability AI, highlighting the ongoing legal challenges posed by the use of AI in the creative industries. But it’s not the only lawsuit recently filed; see, for example, “Now artists sue AI image generation tools Stable Diffusion, Midjourney over copyright” from The Indian Express.
14 Technology Predictions for Higher Education in 2023 — from campustechnology.com by Rhea Kelly How will technologies and practices like artificial intelligence, predictive analytics, digital transformation, and change management impact colleges and universities this year? Here’s what the experts told us.
Excerpt:
In an open call on LinkedIn, we asked higher education and ed tech industry leaders to forecast the most important trends to watch in the coming year. Their responses reflect both the challenges on the horizon — persistent cyber attacks, the disruptive force of emerging technologies, failures in project management — and the opportunities that technology brings to better serve students and support the institutional mission. Here are 14 predictions to help steer your technology efforts in 2023.
Here’s the list of sources: https://t.co/fJd4rh8kLy. The larger resource area at https://t.co/bN7CReGIEC has sample ChatGPT essays, strategies for mitigating harm, and questions for teachers to ask as well as a listserv.
— Anna Mills, amills@mastodon.oeru.org, she/her (@EnglishOER) January 11, 2023
Microsoft is reportedly eyeing a $10 billion investment in OpenAI, the startup that created the viral chatbot ChatGPT, and is planning to integrate it into Office products and Bing search. The tech giant has already invested at least $1 billion into OpenAI. Some of these features might be rolling out as early as March, according to The Information.
This is a big deal. If successful, it will bring powerful AI tools to the masses. So what would ChatGPT-powered Microsoft products look like? We asked Microsoft and OpenAI. Neither was willing to answer our questions on how they plan to integrate AI-powered products into Microsoft’s tools, even though work must be well underway to do so. However, we do know enough to make some informed, intelligent guesses. Hint: it’s probably good news if, like me, you find creating PowerPoint presentations and answering emails boring.
I have maintained for several years, including in a book, ‘AI for Learning’, that AI is the technology of the age and will change everything. This is unfolding as we speak, but it is interesting to ask who the winners are likely to be.
People who have heard of GPT-3 / ChatGPT, and are vaguely following the advances in machine learning, large language models, and image generators. Also people who care about making the web a flourishing social and intellectual space.
That dark forest is about to expand. Large Language Models (LLMs) that can instantly generate coherent swaths of human-like text have just joined the party.
It is in this uncertain climate that Hassabis agrees to a rare interview, to issue a stark warning about his growing concerns. “I would advocate not moving fast and breaking things.”
…
“When it comes to very powerful technologies—and obviously AI is going to be one of the most powerful ever—we need to be careful,” he says. “Not everybody is thinking about those things. It’s like experimentalists, many of whom don’t realize they’re holding dangerous material.” Worse still, Hassabis points out, we are the guinea pigs.
Demis Hassabis
Excerpt (emphasis DSC):
Hassabis says these efforts are just the beginning. He and his colleagues have been working toward a much grander ambition: creating artificial general intelligence, or AGI, by building machines that can think, learn, and be set to solve humanity’s toughest problems. Today’s AI is narrow, brittle, and often not very intelligent at all. But AGI, Hassabis believes, will be an “epoch-defining” technology—like the harnessing of electricity—that will change the very fabric of human life. If he’s right, it could earn him a place in history that would relegate the namesakes of his meeting rooms to mere footnotes.
But with AI’s promise also comes peril. In recent months, researchers building an AI system to design new drugs revealed that their tool could be easily repurposed to make deadly new chemicals. A separate AI model trained to spew out toxic hate speech went viral, exemplifying the risk to vulnerable communities online. And inside AI labs around the world, policy experts were grappling with near-term questions like what to do when an AI has the potential to be commandeered by rogue states to mount widespread hacking campaigns or infer state-level nuclear secrets.
Headteachers and university lecturers have expressed concerns that ChatGPT, which can provide convincing human-sounding answers to exam questions, could spark a wave of cheating in homework and exam coursework.
Now, the bot’s makers, San Francisco-based OpenAI, are trying to counter the risk by “watermarking” the bot’s output and making plagiarism easier to spot.
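OpenAI has not published how such a watermark would work, but to make the idea concrete, here is a toy sketch of one approach discussed in the research literature: seed a pseudorandom split of the vocabulary with the previous token, bias generation toward the “green” half, and let a detector count how often tokens land in the green list. Everything in this sketch (the hashing, the vocabulary split, the detection threshold) is illustrative rather than a description of OpenAI’s actual method.

```python
# Toy illustration of a "green list" text watermark (an idea from the research
# literature, NOT OpenAI's unpublished method). The previous token seeds a
# pseudorandom split of the vocabulary; generation favors the "green" half,
# and a detector counts how often tokens land in that half.
import hashlib
import random

def green_list(prev_token: str, vocab: list) -> set:
    """Deterministically mark half the vocabulary as 'green', seeded by prev_token."""
    seed = int(hashlib.sha256(prev_token.encode("utf-8")).hexdigest(), 16)
    rng = random.Random(seed)
    shuffled = list(vocab)
    rng.shuffle(shuffled)
    return set(shuffled[: len(shuffled) // 2])

def green_fraction(tokens: list, vocab: list) -> float:
    """Fraction of tokens that fall in the green list of their predecessor."""
    if len(tokens) < 2:
        return 0.0
    hits = sum(1 for prev, tok in zip(tokens, tokens[1:])
               if tok in green_list(prev, vocab))
    return hits / (len(tokens) - 1)

# Ordinary human text should score near 0.5; watermarked text (where the
# generator preferred green tokens) scores much higher, which is the signal
# a detector would look for over a long enough passage.
```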
Students need now, more than ever, to understand how to navigate a world in which artificial intelligence is increasingly woven into everyday life. It’s a world that they, ultimately, will shape.
We hail from two professional fields that have an outsize interest in this debate. Joanne is a veteran journalist and editor deeply concerned about the potential for plagiarism and misinformation. Rebecca is a public health expert focused on artificial intelligence, who champions equitable adoption of new technologies.
We are also mother and daughter. Our dinner-table conversations have become a microcosm of the argument around ChatGPT, weighing its very real dangers against its equally real promise. Yet we both firmly believe that a blanket ban is a missed opportunity.
ChatGPT: Threat or Menace? — from insidehighered.com by Steven Mintz Are fears about generative AI warranted?
The rapid pace of change is driven by a “perfect storm” of factors, including the falling cost of computing power, the rise of data-driven decision-making, and the increasing availability of new technologies. “The speed of current breakthroughs has no historical precedent,” concluded Andrew Doxsey, co-founder of Libra Incentix, in an interview. “Unlike previous technological revolutions, the Fourth Industrial Revolution is evolving exponentially rather than linearly. Furthermore, it disrupts almost every industry worldwide.”
An updated version of the AI chatbot ChatGPT was recently released to the public.
I got the chatbot to write cover letters for real jobs and asked hiring managers what they thought.
The managers said they would’ve given me a call but that the letters lacked personality.
I mentor a young lad with poor literacy skills who is starting a landscaping business. He struggles to communicate with clients in a professional manner.
I created a GPT3-powered Gmail account to which he sends a message. It responds with the text to send to the client. pic.twitter.com/nlFX9Yx6wR
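The tweet doesn’t share code, but a small helper along these lines is easy to imagine. The sketch below is a hypothetical reconstruction, not the author’s actual setup: it reads unread messages from a dedicated mailbox over IMAP, asks a GPT-3-era completion model to rewrite each note in a professional tone, and emails the polished text back over SMTP. The account details, prompt, and model name are all assumptions.

```python
# Hypothetical sketch of a "GPT-3-powered mailbox": informal notes in,
# professionally worded client messages back out. Not the tweet author's
# actual implementation; account details, prompt, and model are placeholders.
import email
import imaplib
import smtplib
from email.message import EmailMessage

import openai  # pre-1.0 OpenAI Python client, as was current in early 2023

openai.api_key = "YOUR_OPENAI_API_KEY"   # placeholder
ACCOUNT = "rewriter@example.com"         # hypothetical dedicated mailbox
PASSWORD = "app-specific-password"       # placeholder

def professionalise(draft: str) -> str:
    """Rewrite an informal note as a short, polite message to a client."""
    resp = openai.Completion.create(
        model="text-davinci-003",
        prompt=("Rewrite the following note as a short, polite, professional "
                "message to a landscaping client:\n\n" + draft),
        max_tokens=300,
        temperature=0.3,
    )
    return resp["choices"][0]["text"].strip()

def plain_text_body(msg: email.message.Message) -> str:
    """Pull the plain-text part out of an email message."""
    if msg.is_multipart():
        for part in msg.walk():
            if part.get_content_type() == "text/plain":
                return part.get_payload(decode=True).decode("utf-8", errors="ignore")
        return ""
    return msg.get_payload(decode=True).decode("utf-8", errors="ignore")

def reply_to_unread() -> None:
    """Check the inbox once and reply to each unread note with a rewritten version."""
    imap = imaplib.IMAP4_SSL("imap.gmail.com")
    imap.login(ACCOUNT, PASSWORD)
    imap.select("INBOX")
    _, data = imap.search(None, "UNSEEN")
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        reply = EmailMessage()
        reply["From"] = ACCOUNT
        reply["To"] = msg["From"]
        reply["Subject"] = "Re: " + (msg["Subject"] or "your message")
        reply.set_content(professionalise(plain_text_body(msg)))
        with smtplib.SMTP_SSL("smtp.gmail.com", 465) as smtp:
            smtp.login(ACCOUNT, PASSWORD)
            smtp.send_message(reply)
    imap.logout()
```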
AI to help realize your dream career — The Unschooler mentor helps you understand what you need to do to achieve your dream career. You can select one of six broad areas of expertise: science, people, tech, info, art, and business. The platform will then ask questions related to your future career.
It also has some other useful features. Unschooler keeps track of your skills by adding them to a skill map that’s unique to you. You can also ask it to expand on the information it has already given you. This is done by selecting the text and clicking one of the buttons (more, example, how to, or explain) or a question mark icon that defines the selected text. There’s also a mobile app that analyzes text from pictures and explains tasks or concepts.
From DSC: This integration of AI is part of the vision that I’ve been tracking at:
AI-powered results will be both highly confident and often wrong; this dangerous combination of inconsistent accuracy with high authority and assertiveness will be the long final mile to overcome.
The defensibility of these AI capabilities as stand-alone companies will rely on data moats, privacy preferences for consumers and enterprises, developer ecosystems, and GTM advantages. (still brewing, but let’s discuss)
…
As I suggested in Edition 1, ChatGPT has done to writing what the calculator did to arithmetic. But what other implications can we expect here?
The return of the Socratic method, at scale and on-demand…
The art and science of prompt engineering…
The bar for teaching will rise, as traditional research for paper-writing and memorization become antiquated ways of building knowledge.
CES is more than just a neon-drenched show-and-tell session for the world’s biggest tech manufacturers. More and more, it’s also a place where companies showcase innovations that could truly make the world a better place — and at CES 2023, this type of tech was on full display. We saw everything from accessibility-minded PS5 controllers to pedal-powered smart desks. But of all the amazing innovations on display this year, these three impressed us the most…
AI already impacts education and will continue to do so, along with every other sector.
Innovative education leaders have an opportunity to build the foundation for the most personalized learning system we have ever seen.
Action
Education leaders need to consider these possible futures now. There is no doubt that K-12 and higher ed learners will be using these tools immediately. It is not a question of preventing “AI plagiarism” (if such a thing could exist), but a question of how to modify teaching to take advantage of these new tools.
From DSC: They go on to list some solid ideas and experiments to try out — both for students and for teachers. Thanks Nate and Rachelle!
From DSC: A few items re: ChatGPT — some in favor of using it, and others against its use (or at least in favor of limiting it).
How About We Put Learning at the Center? — from insidehighered.com by John Warner The ongoing freak-out about ChatGPT sent me back to considering the fundamentals.
Excerpt:
So, when people express concern that students will use ChatGPT to complete their assignments, I understand the concern, but what I don’t understand is why this concern is so often channeled into discussions about how to police student behavior, rather than using this as an opportunity to examine the kind of work we actually ask students (and faculty) to do around learning.
If ChatGPT can do the things we ask students to do in order to demonstrate learning, it seems possible to me that those things should’ve been questioned a long time ago. It’s why I continue to believe this technology is an opportunity for reinvention, precisely because it is a threat to the status quo.
Top AI conference bans use of ChatGPT and AI language tools to write academic papers — from theverge.com by James Vincent; with thanks to Anna Mills for this resource AI tools can be used to ‘edit’ and ‘polish’ authors’ work, say the conference organizers, but text ‘produced entirely’ by AI is not allowed. This raises the question: where do you draw the line between editing and writing?
Excerpt:
The International Conference on Machine Learning (ICML) announced the policy earlier this week, stating, “Papers that include text generated from a large-scale language model (LLM) such as ChatGPT are prohibited unless the produced text is presented as a part of the paper’s experimental analysis.” The news sparked widespread discussion on social media, with AI academics and researchers both defending and criticizing the policy. The conference’s organizers responded by publishing a longer statement explaining their thinking. (The ICML responded to requests from The Verge for comment by directing us to this same statement.)
I am protective of my organic writing process, but when it’s time to edit, ChatGPT sometimes helps me notice which sentences aren’t quite put together as they should be. When I ask it why it suggested a revision, it sometimes describes a word choice nuance I hadn’t thought of.
— Anna Mills, amills@mastodon.oeru.org, she/her (@EnglishOER) January 5, 2023
Instead, I want to discuss the opportunity provided by AI, because it can help us teach in new ways. The very things that make AI scary for educators — its tendency to make up facts, its lack of nuance, and its ability to make excellent student essays — can be used to make education better.
This isn’t for some future theoretical version of AI. You can create assignments, right now, using ChatGPT, that will help stretch students in new ways. We wrote a paper with the instructions. You can read it here, but I also want to summarize our suggestions. These are obviously not the only ways to use AI to educate, but they solve some of the hardest problems in education, and you can start experimenting with them right now.
New York City students and teachers can no longer access ChatGPT — the new artificial intelligence-powered chatbot that generates stunningly cogent and lifelike writing — on education department devices or internet networks, agency officials confirmed Tuesday.
SINGAPORE – Teachers in Singapore say they will likely have to move from assignments requiring regurgitation to those that require greater critical thinking, to stay ahead in the fight against plagiarism.
This comes on the back of the rise of ChatGPT, an intelligent chatbot that is able to spin essays and solve mathematical equations in seconds.
ChatGPT Is Not Ready to Teach Geometry (Yet) — from educationnext.org by Paul T. von Hippel The viral chatbot is often wrong, but never in doubt. Educators need to tread carefully.
Excerpt:
Can ChatGPT provide feedback and answer questions about math in a more tailored and natural way? The answer, for the time being, is no. Although ChatGPT can talk about math superficially, it doesn’t “understand” math with real depth. It cannot correct mathematical misconceptions; it often introduces misconceptions of its own; and it sometimes makes inexplicable mathematical errors that a basic spreadsheet or hand calculator wouldn’t make.