How Workers Rise — from the-job.beehiiv.com by Paul Fain
A look forward at skills-based hiring and AI’s impacts on education and work.

Impacts of AI: Fuller is optimistic about companies making serious progress on skills-based hiring over the next five to 10 years. AI will help drive that transformation, he says, by creating the data to better understand the skills associated with jobs.

The technology will allow for a more accurate matching of skills and experiences, says Fuller, and for companies to “not rely on proxies like degrees or grade point averages or even the proxy of what someone currently makes or how fast they’ve gotten promoted on their résumé.”

Change is coming soon, Fuller predicts, particularly as AI’s impacts accelerate. And the disruption will affect wealthier Americans who’ve been spared during previous shifts in the labor market.

The Kicker: “When people in bedroom suburbs are losing their six-figure jobs, that changes politics,” Fuller says. “That changes the way people are viewing things like equity and where that leads. It’s certainly going to put a lot of pressure on the way the system has worked.”

 

 

More Chief Online Learning Officers Step Up to Senior Leadership Roles 
In 2024, I think we will see more Chief Online Learning Officers (COLOs) take on significant roles and projects at institutions.

In recent years, we have seen many COLOs accept provost positions. The typical provost career path that runs up through the faculty ranks does not adequately prepare leaders for the digital transformation occurring in postsecondary education.

As we’ve seen with the professionalization of the COLO role in general, these same leaders proved to be incredibly valuable during the pandemic due to their unique skills: part academic, part entrepreneur, part technologist, COLOs are unique in higher education. They sit at the epicenter of teaching, learning, technology, and sustainability. As institutions evolve, look for more online and professional continuing education leaders to take on more senior roles on campuses.

Julie Uranis, Senior Vice President, Online and Strategic Initiatives, UPCEA

 

What value do you offer? — from linkedin.com by Dan Fitzpatrick — The AI Educator

Excerpt (emphasis DSC): 

So, as educators, mentors, and guides to our future generations, we must ask ourselves three pivotal questions:

  1. What value do we offer to our students?
  2. What value will they need to offer to the world?
  3. How are we preparing them to offer that value?

The answers to these questions are crucial, and they will redefine the trajectory of our education system.

We need to create an environment that encourages curiosity, embraces failure as a learning opportunity, and celebrates diversity. We need to teach our students how to learn, how to ask the right questions, and how to think for themselves.


AI 101 for Teachers



5 Little-Known ChatGPT Prompts to Learn Anything Faster — from medium.com by Eva Keiffenheim
Including templates you can copy.

Leveraging ChatGPT for learning is the most meaningful skill this year for lifelong learners. But good resources for mastering it are hard to find.

As a learning science nerd, I’ve explored hundreds of prompts over the past months. Most of the advice doesn’t go beyond text summaries and multiple-choice testing.

That’s why I’ve created this article — it merges learning science with prompt writing to help you learn anything faster.
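The pairing of learning science with prompt writing that the article describes can be made concrete with a reusable template. The sketch below is my own illustrative example (the wording and the `build_prompt` helper are not from the article); it wraps one well-known technique, the Feynman technique of explaining a concept in your own words, in a prompt that can be pasted into any chatbot.

```python
# An illustrative learning-science prompt template (hypothetical example,
# not taken from the article). The idea: instead of asking for a summary,
# ask the model to critique your own explanation of a topic.

FEYNMAN_PROMPT = (
    "I am learning about {topic}. I will explain it in my own words below. "
    "Identify gaps or errors in my explanation, then ask me one follow-up "
    "question that tests whether I truly understand.\n\n"
    "My explanation: {explanation}"
)

def build_prompt(topic, explanation):
    """Fill the template so the result can be pasted into any chatbot."""
    return FEYNMAN_PROMPT.format(topic=topic, explanation=explanation)

prompt = build_prompt("backpropagation", "It adjusts weights using gradients.")
print(prompt.startswith("I am learning about backpropagation"))  # True
```

The same pattern extends to other techniques the learning-science literature favors, such as retrieval practice ("quiz me on…") or elaborative interrogation ("ask me why…"), by swapping in a different template string.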


From DSC:
This is a very nice, clearly illustrated, free video to get started with the Midjourney (text-to-image) app. Nice work Dan!

Also see Dan’s
AI Generated Immersive Learning Series


What is Academic Integrity in the Era of Generative Artificial Intelligence? — from silverliningforlearning.org by Chris Dede

In the new-normal of generative AI, how does one articulate the value of academic integrity? This blog presents my current response in about 2,500 words; a complete answer could fill a sizable book.

Massive amounts of misinformation are disseminated about generative AI, so the first part of my discussion clarifies what large language models (ChatGPT and its counterparts) can currently do and what they cannot accomplish at this point in time. The second part describes ways in which generative AI can be misused as a means of learning; unfortunately, many people are now advocating for these mistaken applications to education. The third part describes ways in which large language models (LLMs), used well, may substantially improve learning and education. I close with a plea for a robust, informed public discussion about these topics and issues.


Dr. Chris Dede and the Necessity of Training Students and Faculty to Improve Their Human Judgment and Work Properly with AIs — from stefanbauschard.substack.com by Stefan Bauschard
We need to stop using test-driven curricula that train students to listen and to compete against machines, a competition they cannot win. Instead, we need to help them augment their judgment.


The Creative Ways Teachers Are Using ChatGPT in the Classroom — from time.com by Olivia B. Waxman

Many of the more than a dozen teachers TIME interviewed for this story argue that the way to get kids to care is to proactively use ChatGPT in the classroom.

Some of those creative ideas are already in effect at Peninsula High School in Gig Harbor, about an hour from Seattle. In Erin Rossing’s precalculus class, a student got ChatGPT to generate a rap about vectors and trigonometry in the style of Kanye West, while geometry students used the program to write mathematical proofs in the style of raps, which they performed in a classroom competition. In Kara Beloate’s English-Language Arts class, she allowed students reading Shakespeare’s Othello to use ChatGPT to translate lines into modern English to help them understand the text, so that they could spend class time discussing the plot and themes.


AI in Higher Education: Aiding Students’ Academic Journey — from td.org by J. Chris Brown

Topics/sections include:

Automatic Grading and Assessment
AI-Assisted Student Support Services
Intelligent Tutoring Systems
AI Can Help Both Students and Teachers


Shockwaves & Innovations: How Nations Worldwide Are Dealing with AI in Education — from the74million.org by Robin Lake
Lake: Other countries are quickly adopting artificial intelligence in schools. Lessons from Singapore, South Korea, India, China, Finland and Japan.

I found that other developed countries share concerns about students cheating but are moving quickly to use AI to personalize education, enhance language lessons and help teachers with mundane tasks, such as grading. Some of these countries are in the early stages of training teachers to use AI and developing curriculum standards for what students should know and be able to do with the technology.

Several countries began positioning themselves several years ago to invest in AI in education in order to compete in the fourth industrial revolution.


AI in Education — from educationnext.org by John Bailey
The leap into a new era of machine intelligence carries risks and challenges, but also plenty of promise

In the realm of education, this technology will influence how students learn, how teachers work, and ultimately how we structure our education system. Some educators and leaders look forward to these changes with great enthusiasm. Sal Khan, founder of Khan Academy, went so far as to say in a TED talk that AI has the potential to effect “probably the biggest positive transformation that education has ever seen.” But others warn that AI will enable the spread of misinformation, facilitate cheating in school and college, kill whatever vestiges of individual privacy remain, and cause massive job loss. The challenge is to harness the positive potential while avoiding or mitigating the harm.


Generative AI and education futures — from ucl.ac.uk
Video highlights from Professor Mike Sharples’ keynote address at the 2023 UCL Education Conference, which explored opportunities to prosper with AI as a part of education.


Bringing AI Literacy to High Schools — by Nikki Goth Itoi
Stanford education researchers collaborated with teachers to develop classroom-ready AI resources for high school instructors across subject areas.

To address these two imperatives, all high schools need access to basic AI tools and training. Yet the reality is that many underserved schools in low-income areas lack the bandwidth, skills, and confidence to guide their students through an AI-powered world. And if the pattern continues, AI will only worsen existing inequities. With this concern top of mind plus initial funding from the McCoy Ethics Center, Lee began recruiting some graduate students and high school teachers to explore how to give more people equal footing in the AI space.


 

Navigating the Future of Learning in a Digitally-Disrupted World — from thinklearningstudio.org by Russell Cailey

Are we on the frontier of unveiling an unseen revolution in education? The hypothesis is that this quiet upheaval’s importance is far more significant than we imagine. As our world adjusts, restructures, and emerges from a year which launched an era of mass AI, so too does a new academic year dawn for many – with hope and enthusiasm about new roles, titles, or simply just a new mindset. Concealed from sight, however, I believe a significant transformative wave has started and will begin to reshape our education systems and push us into a new stage of innovative teaching practice whether we desire it or not. The risk and hope is that the quiet revolution remains outside the regulator’s and ministries’ purview, which could risk a dangerous fragmentation of education policy and practice, divorced from the actualities of the world ‘in and outside school’.

“This goal can be achieved through continued support for introducing more new areas of study, such as ‘foresight and futures’, in the high school classroom.”


Four directions for assessment redesign in the age of generative AI — from timeshighereducation.com by Julia Chen
The rise of generative AI has led universities to rethink how learning is quantified. Julia Chen offers four options for assessment redesign that can be applied across disciplines

Direction 1: From written description to multimodal explanation and application

Direction 2: From literature review alone to referencing lectures

Direction 3: From presentation of ideas to defence of views

Direction 4: From working alone to student-staff partnership




15 Inspirational Voices in the Space Between AI and Education — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
Get Inspired for AI and The Future of Education.

If you are just back from vacation and still not quite sure what to do about AI, let me assure you that you are not the only one. My advice for you today is this: fill your LinkedIn feed and/or inbox with ideas, inspirational writing, and commentary on AI. This will get you up to speed quickly and is a great way to stay informed on the newest developments you need to be aware of.

My personal recommendation for you is to check out these bright people who are all very active on LinkedIn and/or have a newsletter worth paying attention to. I have kept the list fairly short – only 15 people – in order to make it as easy as possible for you to begin exploring.


Universities say AI cheats can’t be beaten, moving away from attempts to block AI (Australia) — from abc.net.au by Jake Evans

Key points:

  • Universities have warned against banning AI technologies in academia
  • Several say AI cheating in tests will be too difficult to stop, and it is more practical to change assessment methods
  • The sector says the entire nature of teaching will have to change to ensure students continue to effectively learn

aieducator.tools


Navigating A World of Generative AI: Suggestions for Educators — from nextlevellab.gse.harvard.edu by Lydia Cao and Chris Dede

Understanding the nature of generative AI is crucial for educators to navigate the evolving landscape of teaching and learning. In a new report from the Next Level Lab, Lydia Cao and Chris Dede reflect on the role of generative AI in learning and how this pushes us to reconceptualize our visions of effective education. Though there are concerns of plagiarism and replacement of human jobs, Cao and Dede argue that a more productive way forward is for educators to focus on demystifying AI, emphasizing the learning process over the final product, honoring learner agency, orchestrating multiple sources of motivation, cultivating skills that AI cannot easily replicate, and fostering intelligence augmentation (IA) through building human-AI partnerships.



20 ChatGPT Prompts for ELA Teachers — from classtechtips.com by Dr. Monica Burns

Have you used chatbots to save time this school year? ChatGPT and generative artificial intelligence (AI) have changed the way I think about instructional planning. Today on the blog, I have a selection of ChatGPT prompts for ELA teachers.

You can use chatbots to tackle tedious tasks, gather ideas, and even support your work to meet the needs of every student. In my recent quick reference guide published by ISTE and ASCD, Using AI Chatbots to Enhance Planning and Instruction, I explore this topic. You can also find 50 more prompts for educators in this free ebook.


Professors Craft Courses on ChatGPT With ChatGPT — from insidehighered.com by Lauren Coffey
While some institutions are banning the use of the new AI tool, others are leaning into its use and offering courses dedicated solely to navigating the new technology.

Maynard and Jules White of Vanderbilt University are among a small number of professors launching courses focused solely on teaching students across disciplines to better navigate AI and ChatGPT.

The offerings go beyond institutions flexing their innovation skills—the faculty behind these courses view them as imperative to ensure students are prepared for ever-changing workforce needs.


GPT-4 can already pass freshman year at Harvard | professors need to adapt to their students’ new reality — fast — from chronicle.com by Maya Bodnick (an undergraduate at Harvard University, studying government)

A. A. A-. B. B-. Pass.

That’s a solid report card for a freshman in college, a respectable 3.57 GPA. I recently finished my freshman year at Harvard, but those grades aren’t mine — they’re GPT-4’s.

Three weeks ago, I asked seven Harvard professors and teaching assistants to grade essays written by GPT-4 in response to a prompt assigned in their class. Most of these essays were major assignments which counted for about one-quarter to one-third of students’ grades in the class. (I’ve listed the professors or preceptors for all of these classes, but some of the essays were graded by TAs.)

Here are the prompts with links to the essays, the names of instructors, and the grades each essay received…

The impact that AI is having on liberal-arts homework is indicative of the AI threat to the career fields that liberal-arts majors tend to enter. So maybe what we should really be focused on isn’t, “How do we make liberal-arts homework better?” but rather, “What are jobs going to look like over the next 10–20 years, and how do we prepare students to succeed in that world?”



The great assessment rethink — from timeshighereducation.com
How to measure learning and protect academic integrity in the age of ChatGPT

Items from Times Higher Education re: redesigning assessment

 

New York Times sues AI — theneurondaily.com


In Hollywood, writers and actors are on strike, due in part to AI-related concerns.


From DSC:
And while some are asking about other industries’/individuals’ data, Bryan Alexander asks this about academics:

While I’m here, also see Bryan’s posting –> How colleges and universities are responding to AI now


AI’s Coming Constitutional Convention — from thebrainyacts.beehiiv.com

For centuries, constitutional conventions have been pivotal moments in history for codifying the rights and responsibilities that shape civilized societies. As artificial intelligence rapidly grows more powerful, the AI community faces a similar historic inflection point. The time has come to draft a “Constitutional Convention for AI” – to proactively encode principles that will steer these transformative technologies toward justice, empowerment, and human flourishing.

AI promises immense benefits, from curing diseases to unlocking clean energy. But uncontrolled, it poses existential dangers that could undermine human autonomy and dignity. Lawyers understand well that mere regulations or oversight are no match for a determined bad actor. Fundamental principles must be woven into the very fabric of the system.

 

Why L&D should be at the forefront of the AI revolution — from managementtoday.co.uk by Bill Borrows
AI means that 50% of all employees will need reskilling or upskilling by 2025, according to the World Economic Forum.

Excerpt (emphasis DSC):

Ernst & Young dug a little deeper. “Today’s disruptive working landscape requires organisations to largely restructure the way they are doing work,” they noted in a bulletin in March this year. “Time now spent on tasks will be equally divided between people and machines. For these reasons, workforce roles will change and so do the skills needed to perform them.”

The World Economic Forum has pointed to this global skills gap and estimates that, while 85 million jobs will be displaced, 50% of all employees will need reskilling and/or upskilling by 2025. This, it almost goes without saying, will require Learning and Development departments to do the heavy-lifting in this initial transformational phase but also in an on-going capacity.

“And that’s the big problem,” says Hardman. “2025 is only two and a half years away and the three pillars of L&D – knowledge transference, knowledge reinforcement and knowledge assessment – are crumbling. They have been unchanged for decades and are now, faced by revolutionary change, no longer fit for purpose.”


ChatGPT is the shakeup education needs — from eschoolnews.com by Joshua Sine
As technology evolves, industries must evolve alongside it, and education is no exception–especially when students heavily and regularly rely on edtech

Key points:

  • Education must evolve along with technology–students will expect it
  • Embracing new technologies helps education leverage adaptive technology that engages student interest
  • See related article: AI tools are set to impact tutoring in a big way


Welcome to the new surreal. How AI-generated video is changing film. — from technologyreview.com by Will Douglas Heaven
Exclusive: Watch the world premiere of the AI-generated short film The Frost.




The Future Of Education – Disruption Caused By AI And ChatGPT: Artificial Intelligence Series 3 Of 5 — from forbes.com by Nicole Serena Silver
Here are some ways AI can be introduced at various age levels


 

From DSC:
As Rob Toews points out in his recent article out at Forbes.com, we had better hope that the Taiwan Semiconductor Manufacturing Company (TSMC) builds out the capacity to make chips in various countries. Why? Because:

The following statement is utterly ludicrous. It is also true. The world’s most important advanced technology is nearly all produced in a single facility.

What’s more, that facility is located in one of the most geopolitically fraught areas on earth—an area in which many analysts believe that war is inevitable within the decade.

The future of artificial intelligence hangs in the balance.

The Taiwan Semiconductor Manufacturing Company (TSMC) makes ***all of the world’s advanced AI chips.*** Most importantly, this means Nvidia’s GPUs; it also includes the AI chips from Google, AMD, Amazon, Microsoft, Cerebras, SambaNova, Untether and every other credible competitor.

— from The Geopolitics Of AI Chips Will Define The Future Of AI
out at Forbes.com by Rob Toews

Little surprise, then, that Time Magazine described TSMC
as “the world’s most important company that you’ve
probably never heard of.”

 


From DSC:
If that facility were actually the only one and something happened to it, look at how many things would be impacted as of early May 2023!


 

Examples of generative AI models

 

From DSC:
I don’t think all students hate AI. My guess is that a lot of them like AI and are very intrigued by it. The next generation is starting to see its potential — for good and/or for ill.

One of the comments (from the above item) said to check out the following video.  I saw one (or both?) of these people on a recent 60 Minutes piece as well.


Speaking of AI, also see:

 

EdTech Is Going Crazy For AI — from joshbersin.com by Josh Bersin

Excerpts:

This week I spent a few days at the ASU/GSV conference and ran into 7,000 educators, entrepreneurs, and corporate training people who had gone CRAZY for AI.

No, I’m not kidding. This community, which is made up of people like training managers, community college leaders, educators, and policymakers, is absolutely freaked out about ChatGPT, Large Language Models, and all sorts of issues with AI. Now don’t get me wrong: I’m a huge fan of this. But the frenzy is unprecedented: this is bigger than the excitement at the launch of the iPhone.

Second, the L&D market is about to get disrupted like never before. I had two interactive sessions with about 200 L&D leaders and I essentially heard the same thing over and over. What is going to happen to our jobs when these Generative AI tools start automatically building content, assessments, teaching guides, rubrics, videos, and simulations in seconds?

The answer is pretty clear: you’re going to get disrupted. I’m not saying that L&D teams need to worry about their careers, but it’s very clear to me they’re going to have to swim upstream in a big hurry. As with all new technologies, it’s time for learning leaders to get to know these tools, understand how they work, and start to experiment with them as fast as you can.


Speaking of the ASU+GSV Summit, see this posting from Michael Moe:

EIEIO…Brave New World
By: Michael Moe, CFA, Brent Peus, Owen Ritz

Excerpt:

Last week, the 14th annual ASU+GSV Summit hosted over 7,000 leaders from 70+ companies as well as over 900 of the world’s most innovative EdTech companies. Below are some of our favorite speeches from this year’s Summit…

***

Also see:

Imagining what’s possible in lifelong learning: Six insights from Stanford scholars at ASU+GSV — from acceleratelearning.stanford.edu by Isabel Sacks

Excerpt:

High-quality tutoring is one of the most effective educational interventions we have – but we need both humans and technology for it to work. In a standing-room-only session, GSE Professor Susanna Loeb, a faculty lead at the Stanford Accelerator for Learning, spoke alongside school district superintendents on the value of high-impact tutoring. The most important factors in effective tutoring, she said, are (1) the tutor has data on specific areas where the student needs support, (2) the tutor has high-quality materials and training, and (3) there is a positive, trusting relationship between the tutor and student. New technologies, including AI, can make the first and second elements much easier – but they will never be able to replace human adults in the relational piece, which is crucial to student engagement and motivation.



A guide to prompting AI (for what it is worth) — from oneusefulthing.org by Ethan Mollick
A little bit of magic, but mostly just practice

Excerpt (emphasis DSC):

Being “good at prompting” is a temporary state of affairs. The current AI systems are already very good at figuring out your intent, and they are getting better. Prompting is not going to be that important for that much longer. In fact, it already isn’t in GPT-4 and Bing. If you want to do something with AI, just ask it to help you do the thing. “I want to write a novel, what do you need to know to help me?” will get you surprisingly far.

The best way to use AI systems is not to craft the perfect prompt, but rather to use it interactively. Try asking for something. Then ask the AI to modify or adjust its output. Work with the AI, rather than trying to issue a single command that does everything you want. The more you experiment, the better off you are. Just use the AI a lot, and it will make a big difference – a lesson my class learned as they worked with the AI to create essays.
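Mollick's advice to "work with the AI, rather than trying to issue a single command," amounts to keeping a running conversation and refining it turn by turn. The sketch below shows that loop as conversation state; the `ask` function is a hypothetical stand-in for whatever chat-model API you use, and the message format (`role`/`content` dictionaries) is an assumption modeled on common chat APIs, not something specified in the article.

```python
# A minimal sketch of iterative prompting: keep the whole conversation
# history and append each refinement, rather than crafting one perfect
# prompt. `ask` is a placeholder for a real chat-model API call.

def ask(messages):
    # Placeholder: a real implementation would send `messages` to a
    # chat-model API and return the assistant's reply text.
    return f"[model reply to: {messages[-1]['content']}]"

def refine(history, request):
    """Append a request, get a reply, and keep both in the history."""
    history.append({"role": "user", "content": request})
    reply = ask(history)
    history.append({"role": "assistant", "content": reply})
    return reply

history = [{"role": "system", "content": "You are a helpful writing coach."}]
refine(history, "I want to write a novel. What do you need to know to help me?")
refine(history, "Make your questions more specific to science fiction.")

# The history now holds the full back-and-forth, so each new request
# builds on everything said before: one system turn plus two
# user/assistant exchanges.
print(len(history))  # 5
```

The design point is that the model's "memory" of the exchange lives entirely in the accumulated message list, which is why interactive refinement works: every adjustment is interpreted against the full prior context.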

From DSC:
Agreed –> “Being ‘good at prompting’ is a temporary state of affairs.” The user interfaces that are appearing, and will appear, should help greatly in this regard.


From DSC:
Bizarre…at least for me in late April of 2023:


Excerpt from Lore Issue #28: Drake, Grimes, and The Future of AI Music — from lore.com

Here’s a summary of what you need to know:

  • The rise of AI-generated music has ignited legal and ethical debates, with record labels invoking copyright law to remove AI-generated songs from platforms like YouTube.
  • Tech companies like Google face a conundrum: should they take down AI-generated content, and if so, on what grounds?
  • Some artists, like Grimes, are embracing the change, proposing new revenue-sharing models and utilizing blockchain-based smart contracts for royalties.
  • The future of AI-generated music presents both challenges and opportunities, with the potential to create new platforms and genres, democratize the industry, and redefine artist compensation.

The Need for AI PD — from techlearning.com by Erik Ofgang
Educators need training on how to effectively incorporate artificial intelligence into their teaching practice, says Lance Key, an award-winning educator.

“School never was fun for me,” he says, hoping that as an educator he could change that with his students. “I wanted to make learning fun.”  This ‘learning should be fun’ philosophy is at the heart of the approach he advises educators take when it comes to AI. 


Coursera Adds ChatGPT-Powered Learning Tools — from campustechnology.com by Kate Lucariello

Excerpt:

At its 11th annual conference in 2023, educational company Coursera announced it is adding ChatGPT-powered interactive ed tech tools to its learning platform, including a generative AI coach for students and an AI course-building tool for teachers. It will also add machine learning-powered translation, expanded VR immersive learning experiences, and more.

Coursera Coach will give learners a ChatGPT virtual coach to answer questions, give feedback, summarize video lectures and other materials, give career advice, and prepare them for job interviews. This feature will be available in the coming months.

From DSC:
Yes…it will be very interesting to see how tools and platforms interact from this time forth. The term “integration” will take a massive step forward, at least in my mind.


 

From DSC:
Regarding the core curricula of colleges and universities…

For decades now, faculty members have taught what they wanted to teach and what interested them. They taught what they wanted to research vs. what the wider marketplace/workplace needed. They were not responsive to the needs of the workplace — nor to the needs of their students!

And this situation has been all the more compounded by the increasing costs of obtaining a degree plus the exponential pace of change. We weren’t doing a good job before this exponential pace of change started taking place — and now it’s (almost?) impossible to keep up.

The bottom line on the article below: ***It’s sales.***

Therefore, it’s about what you are selling — and at what price. The story hasn’t changed much. The narrative (i.e., the curricula and more) is pretty much the same thing that’s been sold for years.

But the days of faculty members teaching whatever they wanted to are over, or significantly waning.

Faculty members, faculty senates, provosts, presidents, and accreditors are reaping what they’ve sown.

The questions are now:

  • Will new seeds be sown?
  • Will new crops arise in the future?
  • Will there be new narratives?
  • Will institutions be able to reinvent themselves (one potential example here)? Or will their cultures not allow such significant change to take place? Will alternatives to institutions of traditional higher education continue to pick up steam?

A Profession on the Edge — from chronicle.com by Eric Hoover
Why enrollment leaders are wearing down, burning out, and leaving jobs they once loved.

Excerpts:

Similar stories are echoing throughout the hallways of higher education. Vice presidents for enrollment, as well as admissions deans and directors, are wearing down, burning out, and leaving jobs they once loved. Though there’s no way to compile a chart quantifying the churn, industry insiders describe it as significant. “We’re at an inflection point,” says Rick Clark, executive director of undergraduate admission at Georgia Tech. “There have always been people leaving the field, but not in the numbers we’re seeing now.”

Some are being shoved out the door by presidents and boards. Some are resigning out of exhaustion, frustration, and disillusionment. And some who once sought top-level positions are rethinking their ambitions. “The pressures have ratcheted up tenfold,” says Angel B. Pérez, chief executive of the National Association for College Admission Counseling, known as NACAC. “I talk with someone each week who’s either leaving the field or considering leaving.”


From DSC:
This quote points to what I’m trying to address here:

Dahlstrom and other veterans of the field say they’ve experienced something especially disquieting: an erosion of faith in the transformational power of higher education. Though she sought a career in admissions to help students, her disillusionment grew after taking on a leadership role. She became less confident that she was equipped to effect positive changes, at her institution or beyond, especially when it came to the challenge of expanding college access in a nation of socioeconomic disparities: “I felt like a cog in a huge machine that’s not working, yet continues to grind while only small, temporary fixes are made.”

 

From DSC:
Before we get to Scott Belsky’s article, here’s an interesting/related item from Tobi Lutke:


Our World Shaken, Not Stirred: Synthetic entertainment, hybrid social experiences, syncing ourselves with apps, and more. — from implications.com by Scott Belsky
Things will get weird. And exciting.

Excerpts:

Recent advances in technology will shake the pot of culture and our day-to-day experiences. Examples? A new era of synthetic entertainment will emerge, online social dynamics will become “hybrid experiences” where AI personas are equal players, and we will sync ourselves with applications as opposed to using applications.

A new era of synthetic entertainment will emerge as the world’s video archives – as well as actors’ bodies and voices – will be used to train models. Expect sequels made without actor participation, a new era of AI-outfitted creative economy participants, a deluge of imaginative media that would have been cost prohibitive, and copyright wars and legislation.

Unauthorized sequels, spin-offs, some amazing stuff, and a legal dumpster fire: Now let’s shift beyond Hollywood to the fast-growing long tail of prosumer-made entertainment. This is where entirely new genres of entertainment will emerge, including the unauthorized sequels and spinoffs that I expect we will start seeing.


Also relevant/see:

Digital storytelling with generative AI: notes on the appearance of #AICinema — from bryanalexander.org by Bryan Alexander

Excerpt:

This is how I viewed a fascinating article about the so-called #AICinema movement.  Benj Edwards describes this nascent current and interviews one of its practitioners, Julie Wieland.  It’s a great example of people creating small stories using tech – in this case, generative AI, specifically the image creator Midjourney.

Bryan links to:

Artists astound with AI-generated film stills from a parallel universe — from arstechnica.com by Benj Edwards
A Q&A with “synthographer” Julie Wieland on the #aicinema movement.

An AI-generated image from an #aicinema still series called Vinyl Vengeance by Julie Wieland, created using Midjourney.


From DSC:
How will text-to-video impact the Learning and Development world? Teaching and learning? Those people communicating within communities of practice? Those creating presentations and/or offering webinars?

Hmmm…should be interesting!


 

ANALYSIS: ‘Microcredentials’ poised to disrupt higher ed as degrees lose relevance to employers — from campusreform.org by Shelby Kearns; with thanks to Ray Schroeder on LinkedIn for this resource

Key points:

  • Survey respondents are demonstrating confidence in microcredentials – online training programs that take no more than six months to complete – as four-year degree programs often overlook job training.
  • ‘Grade inflation and efforts to help everyone … attend college make it harder for employers to differentiate among applicants.’
 


Also from Julie Sobowale, see:

  • Law’s AI revolution is here — from nationalmagazine.ca
    At least this much we know. Firms need to develop a strategy around language models.

Also re: legaltech, see:

  • Pioneers and Pathfinders: Richard Susskind — from seyfarth.com by J. Stephen Poor
    In our conversation, Richard discusses the ways we should all be thinking about legal innovation, the challenges of training lawyers for the future, and the qualifications of those likely to develop breakthrough technologies in law, as well as his own journey and how he became interested in AI as an undergraduate student.

Also re: legaltech:

There is an elephant in the room that is rarely discussed. Who owns the IP of AI-generated content?

 

Law has a magic wand now — from jordanfurlong.substack.com by Jordan Furlong
Some people think Large Language Models will transform the practice of law. I think it’s bigger than that.

Excerpts:

ChatGPT4 can also do things that only lawyers (used to be able to) do. It can look up and summarize a court decision, analyze and apply sections of copyright law, and generate a statement of claim for breach of contract.

What happens when you introduce a magic wand into the legal market? On the buyer side, you reduce by a staggering degree the volume of tasks that you need to pay lawyers (whether in-house or outside counsel) to perform. It won’t happen overnight: Developing, testing, revising, approving, and installing these sorts of systems in corporations will take time. But once that’s done, the beauty of LLMs like ChatGPT4 is that they are not expert systems. Anyone can use them. Anyone will.

But I can’t shake the feeling that someday, we’ll divide the history of legal services into “Before GPT4” and “After GPT4.” I think it’s that big.


From DSC:
Jordan mentions: “Some people think Large Language Models will transform the practice of law. I think it’s bigger than that.”

I agree with Jordan. It most assuredly IS bigger than that. AI will profoundly impact many industries/disciplines. The legal sector is but one of them. Education is another. People’s expectations are now changing — and the “ramification wheels” are now in motion.

I take the position that many others have as well (at least as of this point in time): AI will supplement humans’ capabilities and activities. But those who know how to use AI-driven apps will outcompete those who don’t.

 

ChatGPT is Everywhere — from chronicle.com by Beth McMurtrie
Love it or hate it, academics can’t ignore the already pervasive technology.

Excerpt:

Many academics see these tools as a danger to authentic learning, fearing that students will take shortcuts to avoid the difficulty of coming up with original ideas, organizing their thoughts, or demonstrating their knowledge. Ask ChatGPT to write a few paragraphs, for example, on how Jean Piaget’s theories on childhood development apply to our age of anxiety and it can do that.

Other professors are enthusiastic, or at least intrigued, by the possibility of incorporating generative AI into academic life. Those same tools can help students — and professors — brainstorm, kick-start an essay, explain a confusing idea, and smooth out awkward first drafts. Equally important, these faculty members argue, is their responsibility to prepare students for a world in which these technologies will be incorporated into everyday life, helping to produce everything from a professional email to a legal contract.

“Artificial-intelligence tools present the greatest creative disruption to learning that we’ve seen in my lifetime.”

Sarah Eaton, associate professor of education at the University of Calgary



Artificial intelligence and academic integrity, post-plagiarism — from universityworldnews.com by Sarah Elaine Eaton; with thanks to Robert Gibson out on LinkedIn for the resource

Excerpt:

The use of artificial intelligence tools does not automatically constitute academic dishonesty. It depends how the tools are used. For example, apps such as ChatGPT can be used to help reluctant writers generate a rough draft that they can then revise and update.

Used in this way, the technology can help students learn. The text can also be used to help students learn the skills of fact-checking and critical thinking, since the outputs from ChatGPT often contain factual errors.

When students use tools or other people to complete homework on their behalf, that is considered a form of academic dishonesty because the students are no longer learning the material themselves. The key point is that it is the students, and not the technology, that is to blame when students choose to have someone – or something – do their homework for them.

There is a difference between using technology to help students learn or to help them cheat. The same technology can be used for both purposes.

From DSC:
These couple of sentences…

In the age of post-plagiarism, humans use artificial intelligence apps to enhance and elevate creative outputs as a normal part of everyday life. We will soon be unable to detect where the human written text ends and where the robot writing begins, as the outputs of both become intertwined and indistinguishable.

…reminded me of what’s been happening within the filmmaking world for years (i.e., such as in Star Wars, Jurassic Park, and many others). It’s often hard to tell what’s real and what’s been generated by a computer.
 
© 2024 | Daniel Christian