Thinking with Colleagues: AI in Education — from campustechnology.com by Mary Grush
A Q&A with Ellen Wagner

Wagner herself recently relied on the power of collegial conversations to probe the question: What’s on the minds of educators as they make ready for the growing influence of AI in higher education? CT asked her for some takeaways from the process.

We are in the very early days of seeing how AI is going to affect education. Some of us are going to need to stay focused on the basic research to test hypotheses. Others are going to dive into laboratory “sandboxes” to see if we can build some new applications and tools for ourselves. Still others will continue to scan newsletters like ProductHunt every day to see what kinds of things people are working on. It’s going to be hard to keep up, to filter out the noise on our own. That’s one reason why thinking with colleagues is so very important.

Mary and Ellen linked to “What Is Top of Mind for Higher Education Leaders about AI?” — from northcoasteduvisory.com. Below are some excerpts from those notes:

We are interested in how K-12 education will change in terms of foundational learning. With in-class, active learning designs, will younger students do a lot more intensive building of foundational writing and critical thinking skills before they get to college?

  1. The Human in the Loop: AI is built using math: think of applied statistics on steroids. Humans will be needed more than ever to manage, review and evaluate the validity and reliability of results. Curation will be essential.
  2. We will need to generate ideas about how to address AI factors such as privacy, equity, bias, copyright, intellectual property, accessibility, and scalability.
  3. Have other institutions experimented with AI detection, and/or held off on emerging tools in this area? We recently adjusted our guidance and paused some of these tools, given the massive inaccuracies in detection (and the related downstream issues in faculty-elevated conduct cases).

Even though we learn repeatedly that innovation has a lot to do with effective project management and a solid message that helps people understand what they can do to implement change, people really need innovation to be more exciting and visionary than that.  This is the place where we all need to help each other stay the course of change. 


Along these lines, also see:


What people ask me most. Also, some answers. — from oneusefulthing.org by Ethan Mollick
A FAQ of sorts

I have been talking to a lot of people about Generative AI, from teachers to business executives to artists to people actually building LLMs. In these conversations, a few key questions and themes keep coming up over and over again. Many of those questions are more informed by viral news articles about AI than about the real thing, so I thought I would try to answer a few of the most common, to the best of my ability.

I can’t blame people for asking because, for whatever reason, the companies actually building and releasing Large Language Models often seem allergic to providing any sort of documentation or tutorial besides technical notes. I was given much better documentation for the generic garden hose I bought on Amazon than for the immensely powerful AI tools being released by the world’s largest companies. So, it is no surprise that rumor has been the way that people learn about AI capabilities.

Currently, there are only really three AIs to consider: (1) OpenAI’s GPT-4 (which you can get access to with a Plus subscription or via Microsoft Bing in creative mode, for free), (2) Google’s Bard (free), or (3) Anthropic’s Claude 2 (free, but paid mode gets you faster access). As of today, GPT-4 is the clear leader, Claude 2 is second best (but can handle longer documents), and Google trails, but that will likely change very soon when Google updates its model, which is rumored to be happening in the near future.

 

The Public Is Giving Up on Higher Ed — from chronicle.com by Michael D. Smith
Our current system isn’t working for society. Digital alternatives can change that.

Excerpts:

I fear that we in the academy are willfully ignoring this problem. Bring up student-loan debt and you’ll hear that it’s the government’s fault. Bring up online learning and you’ll hear that it is — and always will be — inferior to in-person education. Bring up exclusionary admissions practices and you’ll hear something close to, “Well, the poor can attend community colleges.”

On one hand, our defensiveness is natural. Change is hard, and technological change that risks making traditional parts of our sector obsolete is even harder. “A professor must have an incentive to adopt new technology,” a tenured colleague recently told me regarding online learning. “Innovation adoption will occur one funeral at a time.”

But while our defense of the status quo is understandable, maybe we should ask whether it’s ethical, given what we know about the injustice inherent in our current system. I believe a happier future for all involved — faculty, administrators, and students — is within reach, but requires we stop reflexively protecting our deeply flawed system. How can we do that? We could start by embracing three fundamental principles.

1. Digitization will change higher education.

2. We should want to embrace this change.

3. We have a way to embrace this change.


 

 

180 Degree Turn: NYC District Goes From Banning ChatGPT to Exploring AI’s Potential — from edweek.org by Alyson Klein (behind paywall)

New York City Public Schools will launch an Artificial Intelligence Policy Lab to guide the nation’s largest school district’s approach to this rapidly evolving technology.


The Leader’s Blindspot: How to Prepare for the Real Future — from preview.mailerlite.io by the AIEducator
The Commonly Held Belief: AI Will Automate Only Boring, Repetitive Tasks First

The Days of Task-Based Views on AI Are Numbered
The winds of change are sweeping across the educational landscape (emphasis DSC):

  1. Multifaceted AI: AI technologies are not one-trick ponies; they are evolving into complex systems that can handle a variety of tasks.
  2. Rising Expectations: As technology becomes integral to our lives, the expectations for personalised, efficient education are soaring.
  3. Skill Transformation: Future job markets will demand a different skill set, one that is symbiotic with AI capabilities.

Teaching: How to help students better understand generative AI — from chronicle.com by Beth McMurtrie
Beth describes ways professors have used ChatGPT to bolster critical thinking in writing-intensive courses

Kevin McCullen, an associate professor of computer science at the State University of New York at Plattsburgh, teaches a freshman seminar about AI and robotics. As part of the course, students read Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots, by John Markoff. McCullen had the students work in groups to outline and summarize the first three chapters. Then he showed them what ChatGPT had produced in an outline.

“Their version and ChatGPT’s version seemed to be from two different books,” McCullen wrote. “ChatGPT’s version was essentially a ‘laundry list’ of events. Their version was narratives of what they found interesting. The students had focused on what the story was telling them, while ChatGPT focused on who did what in what year.” The chatbot also introduced false information, such as wrong chapter names.

The students, he wrote, found the writing “soulless.”


7 Questions with Dr. Cristi Ford, VP of Academic Affairs at D2L — from campustechnology.com by Rhea Kelly

In the Wild West of generative AI, educators and institutions are working out how best to use the technology for learning. How can institutions define AI guidelines that allow for experimentation while providing students with consistent guidance on appropriate use of AI tools?

To find out, we spoke with Dr. Cristi Ford, vice president of academic affairs at D2L. With more than two decades of educational experience in nonprofit, higher education, and K-12 institutions, Ford works with D2L’s institutional partners to elevate best practices in teaching, learning, and student support. Here, she shares her advice on setting and communicating AI policies that are consistent and future-ready.


AI Platform Built by Teachers, for Teachers, Class Companion Raises $4 Million to Tap Into the Power of Practice — from prweb.com

“If we want to use AI to improve education, we need more teachers at the table,” said Avery Pan, Class Companion co-founder and CEO. “Class Companion is designed by teachers, for teachers, to harness the most sophisticated AI and improve their classroom experience. Developing technologies specifically for teachers is imperative to supporting our next generation of students and education system.”


7 Questions on Generative AI in Learning Design — from campustechnology.com by Rhea Kelly
Open LMS Adoption and Education Specialist Michael Vaughn on the challenges and possibilities of using artificial intelligence to move teaching and learning forward.

The potential for artificial intelligence tools to speed up course design could be an attractive prospect for overworked faculty and spread-thin instructional designers. Generative AI can shine, for example, in tasks such as reworking assessment question sets, writing course outlines and learning objectives, and generating subtitles for audio and video clips. The key, says Michael Vaughn, adoption and education specialist at learning platform Open LMS, is treating AI like an intern who can be guided and molded along the way, and whose work is then vetted by a human expert.

We spoke with Vaughn about how best to utilize generative AI in learning design, ethical issues to consider, and how to formulate an institution-wide policy that can guide AI use today and in the future.


10 Ways Technology Leaders Can Step Up and Into the Generative AI Discussion in Higher Ed — from er.educause.edu by Lance Eaton and Stan Waddell

  1. Offer Short Primers on Generative AI
  2. Explain How to Get Started
  3. Suggest Best Practices for Engaging with Generative AI
  4. Give Recommendations for Different Groups
  5. Recommend Tools
  6. Explain the Closed vs. Open-Source Divide
  7. Avoid Pitfalls
  8. Conduct Workshops and Events
  9. Spot the Fake
  10. Provide Proper Guidance on the Limitations of AI Detectors


 

As AI Chatbots Rise, More Educators Look to Oral Exams — With High-Tech Twist — from edsurge.com by Jeffrey R. Young

To use Sherpa, an instructor first uploads the reading they’ve assigned, or they can have the student upload a paper they’ve written. Then the tool asks a series of questions about the text (either questions input by the instructor or generated by the AI) to test the student’s grasp of key concepts. The software gives the instructor the choice of whether they want the tool to record audio and video of the conversation, or just audio.

The tool then uses AI to transcribe the audio from each student’s recording and flags areas where the student answer seemed off point. Teachers can review the recording or transcript of the conversation and look at what Sherpa flagged as trouble to evaluate the student’s response.
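For readers curious what that flagging step might look like under the hood, here is a minimal, purely illustrative sketch in Python. The function name, the 0.5 threshold, and the keyword-overlap heuristic are all my own assumptions for illustration; Sherpa's actual method is not documented here and presumably relies on an LLM rather than simple string matching.

```python
# Hypothetical sketch (NOT Sherpa's actual implementation) of flagging a
# transcribed student answer that seems off point: count how many of the
# instructor's key concepts appear in the answer, and flag the answer for
# review when coverage falls below a threshold.

def flag_off_point(transcript: str, key_concepts: list[str], threshold: float = 0.5) -> bool:
    """Return True if the answer covers fewer than `threshold` of the concepts."""
    words = set(transcript.lower().split())
    hits = sum(1 for concept in key_concepts if concept.lower() in words)
    coverage = hits / len(key_concepts) if key_concepts else 1.0
    return coverage < threshold

answer = "The author argues that robots and humans must find common ground."
concepts = ["robots", "humans", "automation", "labor"]
print(flag_off_point(answer, concepts))  # -> False (2 of 4 concepts covered, meeting the 0.5 threshold)
```

A real system would need to handle paraphrase and multi-word concepts, which is exactly where an LLM-based comparison would replace this naive keyword check.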

 



AI Meets Med School — from insidehighered.com by Lauren Coffey
Adding to academia’s AI embrace, two institutions in the University of Texas system are jointly offering a medical degree paired with a master’s in artificial intelligence.

Doctor AI

The University of Texas at San Antonio has launched a dual-degree program combining medical school with a master’s in artificial intelligence.

Several universities across the nation have begun integrating AI into medical practice. Medical schools at the University of Florida, the University of Illinois, the University of Alabama at Birmingham and Stanford and Harvard Universities all offer variations of a certificate in AI in medicine that is largely geared toward existing professionals.

“I think schools are looking at, ‘How do we integrate and teach the uses of AI?’” Dr. Whelan said. “And in general, when there is an innovation, you want to integrate it into the curriculum at the right pace.”

Speaking of emerging technologies and med school, also see:


Though not necessarily edu-related, this was interesting to me and hopefully will be to some profs and/or students out there:


How to stop AI deepfakes from sinking society — and science — from nature.com by Nicola Jones; via The Neuron
Deceptive videos and images created using generative AI could sway elections, crash stock markets and ruin reputations. Researchers are developing methods to limit their harm.





Exploring the Impact of AI in Education with PowerSchool’s CEO & Chief Product Officer — from michaelbhorn.substack.com by Michael B. Horn

With just under 10 acquisitions in the last 5 years, PowerSchool has been active in transforming itself from a student information systems company to an integrated education company that works across the day and lifecycle of K–12 students and educators. What’s more, the company turned heads in June with its announcement that it was partnering with Microsoft to integrate AI into its PowerSchool Performance Matters and PowerSchool LearningNav products to empower educators in delivering transformative personalized-learning pathways for students.


AI Learning Design Workshop: The Trickiness of AI Bootcamps and the Digital Divide — from eliterate.us by Michael Feldstein

As readers of this series know, I’ve developed a six-session design/build workshop series for learning design teams to create an AI Learning Design Assistant (ALDA). In my last post in this series, I provided an elaborate ChatGPT prompt that can be used as a rapid prototype that everyone can try out and experiment with. In this post, I’d like to focus on how to address the challenges of AI literacy effectively and equitably.


Global AI Legislation Tracker — from iapp.org; via Tom Barrett

Countries worldwide are designing and implementing AI governance legislation commensurate to the velocity and variety of proliferating AI-powered technologies. Legislative efforts include the development of comprehensive legislation, focused legislation for specific use cases, and voluntary guidelines and standards.

This tracker identifies legislative policy and related developments in a subset of jurisdictions. It is not globally comprehensive, nor does it include all AI initiatives within each jurisdiction, given the rapid and widespread policymaking in this space. This tracker offers brief commentary on the wider AI context in specific jurisdictions, and lists index rankings provided by Tortoise Media, the first index to benchmark nations on their levels of investment, innovation and implementation of AI.





Learning Lab | ChatGPT in Higher Education: Exploring Use Cases and Designing Prompts — from events.educause.edu; via Robert Gibson on LinkedIn

Part 1: October 16 | 3:00–4:30 p.m. ET
Part 2: October 19 | 3:00–4:30 p.m. ET
Part 3: October 26 | 3:00–4:30 p.m. ET
Part 4: October 30 | 3:00–4:30 p.m. ET


Mapping AI’s Role in Education: Pioneering the Path to the Future — from marketscale.com by Michael B. Horn, Jacob Klein, and Laurence Holt

Welcome to The Future of Education with Michael B. Horn. In this insightful episode, Michael gains perspective on mapping AI’s role in education from Jacob Klein, a Product Consultant at Oko Labs, and Laurence Holt, an Entrepreneur In Residence at the XQ Institute. Together, they peer into the burgeoning world of AI in education, analyzing its potential, risks, and roadmap for integrating it seamlessly into learning environments.


Ten Wild Ways People Are Using ChatGPT’s New Vision Feature — from newsweek.com by Meghan Roos; via Superhuman

Below are 10 creative ways ChatGPT users are making use of this new vision feature.


 

The Enemy Within: Former College Presidents Offer Warnings — from forbes-com.cdn.ampproject.org by David Rosowsky; via Robert Gibson on LinkedIn

Excerpt (emphasis DSC):

Brian Mitchell, former president of Bucknell University and Washington & Jefferson College, draws on his experience to offer insight in his newest Forbes contribution. He also offers a stern warning: “Boards, administrators, and faculty must wake up to the new realities they now face… the faculty can no longer live in a world that no longer exists… institutional change will happen at a speed to which they are unaccustomed and potentially unwilling to accept.” President Mitchell then goes on to offer some immediate steps that can be taken. Perhaps the most important is to “abandon the approach to governance where trustees are updated in their periodic board meetings.”

Also just published was the book “Whatever It Is, I’m Against It: Resistance to Change in Higher Education” by Brian Rosenberg, former president of Macalester College. Articles on Rosenberg’s observations, analysis, and cautions have appeared this month in both The Chronicle of Higher Education and Inside Higher Ed, the two leading higher education publications in the US.

Incremental change is possible, but transformational change may not be. Therein lies the conundrum about which Rosenberg writes: higher ed’s own systems are inhibiting the transformational change it needs.


Addendum on 10/6/23:

Higher Education as Its Own Worst Enemy — from insidehighered.com/ by Susan H. Greenberg
In a wide-ranging discussion about his new book, Brian Rosenberg explains how shared governance, tenure and other practices stifle change on college campuses.

He argues that the institutions designed to foster critical inquiry and the open exchange of ideas are themselves staunchly resistant to both. 

The other would be some serious thinking about pedagogy and how students learn. Because the research is there if people were willing to take it seriously and think about ways of providing an education that is not quite as reliant upon lots of faculty with Ph.D.s. Is that easy to do? No, but it is something that I think there should at least begin to be some serious discussions about.

Shared governance is one of those things that if you ask any college president off the record, they’ll probably express their frustration, then they’ll go back to their campus and wax poetic about the wonders of shared governance, because that’s what they have to do to survive.

 

Student Use Cases for AI: Start by Sharing These Guidelines with Your Class — from hbsp.harvard.edu by Ethan Mollick and Lilach Mollick

To help you explore some of the ways students can use this disruptive new technology to improve their learning—while making your job easier and more effective—we’ve written a series of articles that examine the following student use cases:

  1. AI as feedback generator
  2. AI as personal tutor
  3. AI as team coach
  4. AI as learner

Recap: Teaching in the Age of AI (What’s Working, What’s Not) — from celt.olemiss.edu by Derek Bruff, visiting associate director

Earlier this week, CETL and AIG hosted a discussion among UM faculty and other instructors about teaching and AI this fall semester. We wanted to know what was working when it came to policies and assignments that responded to generative AI technologies like ChatGPT, Google Bard, Midjourney, DALL-E, and more. We were also interested in hearing what wasn’t working, as well as questions and concerns that the university community had about teaching and AI.


Teaching: Want your students to be skeptical of ChatGPT? Try this. — from chronicle.com by Beth McMurtrie

Then, in class he put them into groups where they worked together to generate a 500-word essay on “Why I Write” entirely through ChatGPT. Each group had complete freedom in how they chose to use the tool. The key: They were asked to evaluate their essay on how well it offered a personal perspective and demonstrated a critical reading of the piece. Weiss also graded each ChatGPT-written essay and included an explanation of why he came up with that particular grade.

After that, the students were asked to record their observations on the experiment on the discussion board. Then they came together again as a class to discuss the experiment.

Weiss shared some of his students’ comments with me (with their approval). Here are a few:


2023 EDUCAUSE Horizon Action Plan: Generative AI — from library.educause.edu by Jenay Robert and Nicole Muscanell

Asked to describe the state of generative AI that they would like to see in higher education 10 years from now, panelists collaboratively constructed their preferred future.


Will Teachers Listen to Feedback From AI? Researchers Are Betting on It — from edsurge.com by Olina Banerji

Julie York, a computer science and media teacher at South Portland High School in Maine, was scouring the internet for discussion tools for her class when she found TeachFX. An AI tool that takes recorded audio from a classroom and turns it into data about who talked and for how long, it seemed like a cool way for York to discuss issues of data privacy, consent and bias with her students. But York soon realized that TeachFX was meant for much more.

York found that TeachFX listened to her very carefully, and generated a detailed feedback report on her specific teaching style. York was hooked, in part because she says her school administration simply doesn’t have the time to observe teachers while tending to several other pressing concerns.

“I rarely ever get feedback on my teaching style. This was giving me 100 percent quantifiable data on how many questions I asked and how often I asked them in a 90-minute class,” York says. “It’s not a rubric. It’s a reflection.”

TeachFX is easy to use, York says. It’s as simple as switching on a recording device.

But TeachFX, she adds, is focused not on her students’ achievements, but instead on her performance as a teacher.
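The kind of quantifiable report York describes (talk time per speaker, question counts over a class period) can be illustrated with a toy example. The segment format and field names below are invented for this sketch; the real product derives its metrics from classroom audio, not pre-labeled text.

```python
# Illustrative sketch of the metrics York describes TeachFX reporting:
# total talk time per speaker and the number of teacher questions.
# The diarized-segment data structure is invented for this example.

from collections import defaultdict

segments = [
    {"speaker": "teacher", "seconds": 42, "text": "What does the author mean here?"},
    {"speaker": "student", "seconds": 15, "text": "Maybe that change is gradual."},
    {"speaker": "teacher", "seconds": 30, "text": "Good. Can anyone add to that?"},
]

talk_time = defaultdict(int)   # seconds of talk per speaker
questions = 0                  # count of teacher questions
for seg in segments:
    talk_time[seg["speaker"]] += seg["seconds"]
    if seg["speaker"] == "teacher" and "?" in seg["text"]:
        questions += 1

print(dict(talk_time))  # {'teacher': 72, 'student': 15}
print(questions)        # 2
```

Even this crude tally shows why the feedback feels like "a reflection" rather than a rubric: the numbers describe the teacher's own behavior, not student performance.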


ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds — from the74million.org by Mark Keierleber
While educators worry that students are using generative AI to cheat, a new report finds students are turning to the tool more for personal problems.

Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat.


Embracing weirdness: What it means to use AI as a (writing) tool — from oneusefulthing.org by Ethan Mollick
AI is strange. We need to learn to use it.

But LLMs are not Google replacements, or thesauruses or grammar checkers. Instead, they are capable of so much more weird and useful help.


Diving Deep into AI: Navigating the L&D Landscape — from learningguild.com by Markus Bernhardt

The prospect of AI-powered, tailored, on-demand learning and performance support is exhilarating: It starts with traditional digital learning made into fully adaptive learning experiences, which would adjust to strengths and weaknesses for each individual learner. The possibilities extend all the way through to simulations and augmented reality, an environment to put into practice knowledge and skills, whether as individuals or working in a team simulation. The possibilities are immense.

Thanks to generative AI, such visions are transitioning from fiction to reality.


Video: Unleashing the Power of AI in L&D — from drphilippahardman.substack.com by Dr. Philippa Hardman
An exclusive video walkthrough of my keynote at Sweden’s national L&D conference this week

Highlights

  • The wicked problem of L&D: last year, $371 billion was spent on workplace training globally, but only 12% of employees apply what they learn in the workplace
  • An innovative approach to L&D: when Mastery Learning is used to design & deliver workplace training, the rate of “transfer” (i.e. behaviour change & application) is 67%
  • AI 101: quick summary of classification, generative and interactive AI and its uses in L&D
  • The impact of AI: my initial research shows that AI has the potential to scale Mastery Learning and, in the process:
    • reduce the “time to training design” by 94% > faster
    • reduce the cost of training design by 92% > cheaper
    • increase the quality of learning design & delivery by 96% > better
  • Research also shows that the vast majority of workplaces are using AI only to “oil the machine” rather than innovate and improve our processes & practices
  • Practical tips: how to get started on your AI journey in your company, and a glimpse of what L&D roles might look like in a post-AI world

 

The next wave of AI will be interactive — from joinsuperhuman.ai by Zain Kahn
ALSO: AI startups raise over $500 million

Google DeepMind cofounder Mustafa Suleyman thinks that generative AI is a passing phase, and that interactive AI is the next big thing. Suleyman called the transformation “a profound moment” in the history of technology.

Suleyman divided AI’s evolution into 3 waves:

  1. Classification: Training computers to classify various types of data like images and text.
  2. Generative: The current wave, which takes input data to generate new data. ChatGPT is the best example of this.
  3. Interactive: The next wave, where an AI will be capable of communicating and operating autonomously.

“Think of it as autonomous software that can talk to other apps to get things done.”

From DSC:
Though I find this a generally positive thing, the above sentence makes me exclaim, “No, nothing could possibly go wrong there.”


 

Preparing Students for the AI-Enhanced Workforce — from insidehighered.com by Ray Schroeder
Our graduating and certificate-completing students need documented generative AI skills, and they need them now.

The common adage repeated again and again is that AI will not take your job; a person with AI skills will replace you. The learners we are teaching this fall who will be entering, re-entering or seeking advancement in the workforce at the end of the year or in the spring must become demonstrably skilled in using generative AI. The vast majority of white-collar jobs will demand the efficiencies and flexibilities defined by generative AI now and in the future. As higher education institutions, we will be called upon to document and validate generative AI skills.


AI image generators: 10 tools, 10 classroom uses — from ditchthattextbook.com by Matt Miller



A Majority of New Teachers Aren’t Prepared to Teach With Technology. What’s the Fix? — from edweek.org by Alyson Klein

Think all incoming teachers have a natural facility with technology just because most are digital natives? Think again.

Teacher preparation programs have a long way to go in preparing prospective educators to teach with technology, according to a report released September 12 by the International Society for Technology in Education, a nonprofit.

In fact, more than half of incoming teachers—56 percent—lack confidence in using learning technology prior to entering the classroom, according to survey data included with the report.


5 Actual Use Cases of AI in Education: Newsletter #68 — from transcend.substack.com by Alberto Arenaza
What areas has AI truly impacted educators, learners & workers?

  1. AI Copilot for educators, managers and leaders
  2. Flipped Classroom Chatbots
  3. AI to assess complex answers
  4. AI as a language learning tool
  5. AI to brainstorm ideas

AI-Powered Higher Ed — from drphilippahardman.substack.com by Dr. Philippa Hardman
What a House of Commons round table discussion tells us about how AI will impact the purpose of higher education

In this week’s blog post I’ll summarise the discussion and share what we agreed would be the most likely new model of assessment in HE in the post-AI world.

But this in turn raises a bigger question: why do people go to university, and what is the role of higher education in the twenty-first century? Is it to create the workforce of the future? Or an institution for developing deep and original domain expertise? Can and should it be both?


How To Develop Computational Thinkers — from iste.org by Jorge Valenzuela

In my previous position with Richmond Public Schools, we chose to dive in with computational thinking, programming and coding, in that order. I recommend building computational thinking (CT) competency first by helping students recognize and apply the four elements of CT to familiar problems/situations. Computational thinking should come first because it’s the highest order of problem-solving, is a cross-curricular skill and is understandable to both machines and humans. Here are the four components of CT and how to help students understand them.

 

Generative A.I. + Law – Background, Applications and Use Cases Including GPT-4 Passes the Bar Exam — from speakerdeck.com by Professor Daniel Martin Katz

 

 

 


Also relevant/see:

AI-Powered Virtual Legal Assistants Transform Client Services — from abovethelaw.com by Olga V. Mack
They can respond more succinctly than ever to answer client questions, triage incoming requests, provide details, and trigger automated workflows that ensure lawyers handle legal issues efficiently and effectively.

Artificial Intelligence in Law: How AI Can Reshape the Legal Industry — from jdsupra.com

 

10 Free AI Tools for Graphic Designing — from medium.com by Qz Ruslan

With the advancements in Artificial Intelligence (AI), designers now have access to a wide array of free AI-powered tools that streamline their creative process, enhance productivity, and add a touch of uniqueness to their designs. In this article, we will explore ten free AI tool websites for graphic design that have revolutionized the way designers approach their craft.


Generative Art in Motion — from heatherbcooper.substack.com by Heather Cooper
Animation and video tools create an explosion of creative expression


World’s first AI cinema opening in Auckland to make all your Matrix fantasies come true — from stuff.co.nz by Jonny Mahon-Heap
Review: My HyperCinema experience was futuristic, sleek – and slightly insane as I became the star of my own show.


AI That Alters Voice and Imagery in Political Ads Will Require Disclosure on Google and YouTube — from usnews.com by Associated Press
Political ads using artificial intelligence on Google and YouTube must soon be accompanied by a prominent disclosure if imagery or sounds have been synthetically altered

AI-generated election ads on YouTube and other Google platforms that alter people or events must include a clear disclaimer located somewhere that users are likely to notice, the company said in an update this week to its political content policy.


 

Future of Work Report AI at Work — from economicgraph.linkedin.com; via Superhuman

The intersection of AI and the world of work: Not only are job postings increasing, but we’re seeing more LinkedIn members around the globe adding AI skills to their profiles than ever before. We’ve seen a 21x increase in the share of global English-language job postings that mention new AI technologies such as GPT or ChatGPT since November 2022. In June 2023, the number of AI-skilled members was 9x larger than in January 2016, globally.

The state of play of Generative AI (GAI) in the workforce: GAI technologies, including ChatGPT, are poised to change the way we work. In fact, 47% of US executives believe that using generative AI will increase productivity, and 92% agree that people skills are more important than ever. This means jobs won’t necessarily go away, but they will change, as will the skills needed to do them.

Also relevant/see:

The Working Future: More Human, Not Less — from bain.com
It’s time to change how we think about work

Contents

  • Introduction
  • Motivations for Work Are Changing
  • Beliefs about What Makes a “Good Job” Are Diverging
  • Automation Is Helping to Rehumanize Work
  • Technological Change Is Blurring the Boundaries of the Firm
  • Young Workers Are Increasingly Overwhelmed
  • Rehumanizing Work: The Journey Ahead
 

OpenAI angles to put ChatGPT in classrooms with special tutor prompts — from techcrunch.com by Devin Coldewey

Taking the bull by the horns, the company has proposed a few ways for teachers to put the system to use… outside its usual role as “research assistant” for procrastinating students.

Teaching with AI — a guide from OpenAI


Q2 Earnings Roundup – EdTech Generative AI — from aieducation.substack.com by Claire Zau
A roundup of LLM and AI discussions from Q2 EdTech Earnings

In this piece, we’ll be breaking down how a few of edtech’s most important companies are thinking about AI developments.

  • Duolingo
  • PowerSchool
  • Coursera
  • Docebo
  • Instructure
  • Nerdy
 

A First Look at Teaching Preferences since the Pandemic — from library.educause.edu by Muscanell

2023 Faculty & Technology Report: A First Look at Teaching Preferences since the Pandemic

This is the first faculty research conducted by EDUCAUSE since 2019. Since then, the higher education landscape has been through a lot, including COVID-19, fluctuations in enrollment and public funding, and the rapid adoption of multiple instructional modalities and new technologies. In this report, we describe the findings of the research in four key areas:

  • Modality preferences and the impacts of teaching in non-preferred modes
  • Experiences teaching online and hybrid courses
  • Technology and digital availability of course components
  • Types of support needed and utilized for teaching

From DSC:
Polling faculty members for their feedback is less relevant to the future of higher education than addressing the needs and wants of the students and parents who are paying the bills. What faculty members want to post online matters less than what students want and need to see online.


From DSC:
These are more fringe responses — versus overhauling pricing, updating curricula, providing more opportunities to try out jobs before investing in a degree, and/or better rewarding the adjunct faculty members who do the majority of the teaching on many campuses.


Online college enrollment is on the rise: What brings students to virtual campuses? — from digitaljournal.com by Jill Jaracz and Emma Rubin; via GSV

Before the pandemic, online learning programs were typically for people going back to school to augment or change their career or pursuing a graduate degree to enhance their career while they work. That attitude is shifting as students juggle learning with jobs, family responsibilities, and commutes. In California, 4 in 5 community college classes were in person before the pandemic. By 2021, just 1 in 4 were in person, while 65% were online, according to the California Community Colleges Chancellor’s Office.

Younger students are also opting for online classes. EducationDynamics found in 2023 that the largest share of students pursuing undergraduate or graduate degrees online is 35 or younger. That said, 35% of students pursuing online undergraduate degrees are between…


 

From DSC: If this is true, how will we meet this type of demand?!?

RESKILLING NEEDED FOR 40% OF WORKFORCE BECAUSE OF AI, REPORT FROM IBM SAYS — from staffingindustry.com; via GSV

Generative AI will require skills upgrades for workers, according to a report from IBM based on a survey of executives from around the world. One finding: Business leaders say 40% of their workforces will need to reskill as AI and automation are implemented over the next three years. That could translate to 1.4 billion people in the global workforce who require upskilling, according to the company.
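The two figures quoted above are consistent with each other; a quick back-of-the-envelope check (the ~3.5 billion workforce figure is implied, not stated in the excerpt):

```python
# Sanity check of the IBM report's numbers: if 40% of the workforce
# needs reskilling and that equals 1.4 billion people, the implied
# global workforce is about 3.5 billion.
reskill_share = 0.40
reskilled = 1.4e9  # people, per the report
implied_workforce = reskilled / reskill_share
print(f"{implied_workforce / 1e9:.1f} billion")  # 3.5 billion
```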

 
© 2024 | Daniel Christian