[Report] Generative AI Top 150: The World’s Most Used AI Tools (Feb 2024) — from flexos.work by Daan van Rossum
FlexOS.work surveyed Generative AI platforms to reveal which get used most. While ChatGPT reigns supreme, countless AI platforms are used by millions.

As the FlexOS research study “Generative AI at Work” concluded based on a survey amongst knowledge workers, ChatGPT reigns supreme.

2. AI Tool Usage is Way Higher Than People Expect – Beating Netflix, Pinterest, Twitch.
As measured by data analysis platform Similarweb based on global web traffic tracking, the AI tools in this list generate over 3 billion monthly visits.

With 1.67 billion visits, ChatGPT represents over half of this traffic and is already bigger than Netflix, Microsoft, Pinterest, Twitch, and The New York Times.



Artificial Intelligence Act: MEPs adopt landmark law — from europarl.europa.eu

  • Safeguards on general purpose artificial intelligence
  • Limits on the use of biometric identification systems by law enforcement
  • Bans on social scoring and AI used to manipulate or exploit user vulnerabilities
  • Right of consumers to launch complaints and receive meaningful explanations


The untargeted scraping of facial images from CCTV footage to create facial recognition databases will be banned.


A New Surge in Power Use Is Threatening U.S. Climate Goals — from nytimes.com by Brad Plumer and Nadja Popovich
A boom in data centers and factories is straining electric grids and propping up fossil fuels.

Something unusual is happening in America. Demand for electricity, which has stayed largely flat for two decades, has begun to surge.

Over the past year, electric utilities have nearly doubled their forecasts of how much additional power they’ll need by 2028 as they confront an unexpected explosion in the number of data centers, an abrupt resurgence in manufacturing driven by new federal laws, and millions of electric vehicles being plugged in.


OpenAI and the Fierce AI Industry Debate Over Open Source — from bloomberg.com by Rachel Metz

The tumult could seem like a distraction from the startup’s seemingly unending march toward AI advancement. But the tension, and the latest debate with Musk, illuminates a central question for OpenAI, along with the tech world at large as it’s increasingly consumed by artificial intelligence: Just how open should an AI company be?

The meaning of the word “open” in “OpenAI” seems to be a particular sticking point for both sides — something that you might think sounds, on the surface, pretty clear. But actual definitions are both complex and controversial.


Researchers develop AI-driven tool for near real-time cancer surveillance — from medicalxpress.com by Mark Alewine; via The Rundown AI
Artificial intelligence has delivered a major win for pathologists and researchers in the fight for improved cancer treatments and diagnoses.

In partnership with the National Cancer Institute, or NCI, researchers from the Department of Energy’s Oak Ridge National Laboratory and Louisiana State University developed a long-sequenced AI transformer capable of processing millions of pathology reports to provide experts researching cancer diagnoses and management with exponentially more accurate information on cancer reporting.


 

AI University for UK? — from donaldclarkplanb.blogspot.com by Donald Clark

Tertiary Education in the UK needs a fresh idea. What we need is an initiative on the same scale as The Open University, kicked off over 50 years ago.

It is clear that an educational vision is needed and I think the best starting point is that outlined and executed by Paul LeBlanc at SNHU. It is substantial, well articulated and has worked in what has become the largest University in the US.

It would be based on the competence model, with a focus on skills shortages. Here’s a starter with 25 ideas, a manifesto of sorts, based on lessons learnt from other successful models:

  1. Non-traditional students in terms of age and background
  2. Quick and easy application process
  3. Personalised learning using AI
  4. Multimodal from the start
  5. A full range of summarisation, self-assessment creation, and dialogue tools
  6. Focus on generative learning using AI
  7. …and Donald lists many more (ending at #25)
 

60+ Ideas for ChatGPT Assignments — from stars.library.ucf.edu by Kevin Yee, Kirby Whittington, Erin Doggette, and Laurie Uttich

60+ ideas for using ChatGPT in your assignments today


Artificial intelligence is disrupting higher education — from itweb.co.za by Rennie Naidoo; via GSV
Traditional contact universities need to adapt faster and find creative ways of exploring and exploiting AI, or lose their dominant position.

Higher education professionals have a responsibility to shape AI as a force for good.


Introducing Canva’s biggest education launch — from canva.com
We’re thrilled to unveil our biggest education product launch ever. Today, we’re introducing a whole new suite of products that turn Canva into the all-in-one classroom tool educators have been waiting for.

Also see Canva for Education.
Create and personalize lesson plans, infographics, posters, video, and more. 100% free for teachers and students at eligible schools.


ChatGPT and generative AI: 25 applications to support student engagement — from timeshighereducation.com by Seb Dianati and Suman Laudari
In the fourth part of their series looking at 100 ways to use ChatGPT in higher education, Seb Dianati and Suman Laudari share 25 prompts for the AI tool to boost student engagement


There are two ways to use ChatGPT — from theneurondaily.com

  1. Type to it.
  2. Talk to it (new).


Since then, we’ve looked to it for a variety of real-world business advice. For example, Prof Ethan Mollick posted a great guide using ChatGPT-4 with voice as a negotiation instructor.

In a similar fashion, you can consult ChatGPT with voice for feedback on:

  • Job interviews.
  • Team meetings.
  • Business presentations.



Via The Rundown: Google is using AI to analyze the company’s Maps data and suggest adjustments to traffic light timing — aiming to cut driver waits, stops, and emissions.


Google Pixel’s face-altering photo tool sparks AI manipulation debate — from bbc.com by Darren Waters

The camera never lies. Except, of course, it does – and seemingly more often with each passing day.
In the age of the smartphone, digital edits on the fly to improve photos have become commonplace, from boosting colours to tweaking light levels.

Now, a new breed of smartphone tools powered by artificial intelligence (AI) are adding to the debate about what it means to photograph reality.

Google’s latest smartphones released last week, the Pixel 8 and Pixel 8 Pro, go a step further than devices from other companies. They are using AI to help alter people’s expressions in photographs.



From Digital Native to AI-Empowered: Learning in the Age of Artificial Intelligence — from campustechnology.com by Kim Round
The upcoming generation of learners will enter higher education empowered by AI. How can institutions best serve these learners and prepare them for the workplace of the future?

Dr. Chris Dede, of Harvard University and Co-PI of the National AI Institute for Adult Learning and Online Education, spoke about the differences between knowledge and wisdom in AI-human interactions in a keynote address at the 2022 Empowering Learners for the Age of AI conference. He drew a parallel between Star Trek: The Next Generation characters Data and Picard during complex problem-solving: While Data offers the knowledge and information, Captain Picard offers the wisdom and context from a leadership mantle, and determines its relevance, timing, and application.


The Near-term Impact of Generative AI on Education, in One Sentence — from opencontent.org by David Wiley

This “decreasing obstacles” framing turned out to be helpful in thinking about generative AI. When the time came, my answer to the panel question, “how would you summarize the impact generative AI is going to have on education?” was this:

“Generative AI greatly reduces the degree to which access to expertise is an obstacle to education.”

We haven’t even started to unpack the implications of this notion yet, but hopefully just naming it will give the conversation focus, give people something to disagree with, and help the conversation progress more quickly.


How to Make an AI-Generated Film — from heatherbcooper.substack.com by Heather Cooper
Plus, Midjourney finally has a new upscale tool!


Eureka! NVIDIA Research Breakthrough Puts New Spin on Robot Learning — from blogs.nvidia.com by Angie Lee
AI agent uses LLMs to automatically generate reward algorithms to train robots to accomplish complex tasks.

From DSC:
I’m not excited about this, as I can’t help but wonder…how long before the militaries of the world introduce this into their warfare schemes and strategies?


The 93 Questions Schools Should Ask About AI — from edweek.org by Alyson Klein

The toolkit recommends schools consider:

  • Purpose: How can AI help achieve educational goals?
  • Compliance: How does AI fit with existing policies?
  • Knowledge: How can schools advance AI Literacy?
  • Balance: What are the benefits and risks of AI?
  • Integrity: How does AI fit into policies on things like cheating?
  • Agency: How can humans stay in the loop on AI?
  • Evaluation: How can schools regularly assess the impact of AI?
 

WHAT WAS GARY MARCUS THINKING, IN THAT INTERVIEW WITH GEOFF HINTON? — from linkedin.com by Stephen Downes

Background (emphasis DSC): 60 Minutes did an interview with ‘the Godfather of AI’, Geoffrey Hinton. In response, Gary Marcus wrote a column in which he inserted his own set of responses into the transcript, as though he were a panel participant. Neat idea. So, of course, I’m stealing it, and in what follows, I insert my own comments as I join the 60 Minutes panel with Geoffrey Hinton and Gary Marcus.

Usually I put everyone else’s text in italics, but for this post I’ll put it all in normal font, to keep the format consistent.

Godfather of Artificial Intelligence Geoffrey Hinton on the promise, risks of advanced AI


OpenAI’s Revenue Skyrockets to $1.3 Billion Annualized Rate — from maginative.com by Chris McKay
This means the company is generating over $100 million per month—a 30% increase from just this past summer.

OpenAI, the company behind the viral conversational AI ChatGPT, is experiencing explosive revenue growth. The Information reports that CEO Sam Altman told the staff this week that OpenAI’s revenue is now crossing $1.3 billion on an annualized basis. This means the company is generating over $100 million per month—a 30% increase from just this past summer.

Since the launch of a paid version of ChatGPT in February, OpenAI’s financial growth has been nothing short of meteoric. Additionally, in August, the company announced the launch of ChatGPT Enterprise, a commercial version of its popular conversational AI chatbot aimed at business users.

For comparison, OpenAI’s total revenue for all of 2022 was just $28 million. The launch of ChatGPT has turbocharged OpenAI’s business, positioning it as a bellwether for demand for generative AI.



From 10/13:


New ways to get inspired with generative AI in Search — from blog.google
We’re testing new ways to get more done right from Search, like the ability to generate imagery with AI or creating the first draft of something you need to write.

 

Thinking with Colleagues: AI in Education — from campustechnology.com by Mary Grush
A Q&A with Ellen Wagner

Wagner herself recently relied on the power of collegial conversations to probe the question: What’s on the minds of educators as they make ready for the growing influence of AI in higher education? CT asked her for some takeaways from the process.

We are in the very early days of seeing how AI is going to affect education. Some of us are going to need to stay focused on the basic research to test hypotheses. Others are going to dive into laboratory “sandboxes” to see if we can build some new applications and tools for ourselves. Still others will continue to scan newsletters like ProductHunt every day to see what kinds of things people are working on. It’s going to be hard to keep up, to filter out the noise on our own. That’s one reason why thinking with colleagues is so very important.

Mary and Ellen linked to “What Is Top of Mind for Higher Education Leaders about AI?” — from northcoasteduvisory.com. Below are some excerpts from those notes:

We are interested in how K-12 education will change in terms of foundational learning. With in-class, active learning designs, will younger students do a lot more intensive building of foundational writing and critical thinking skills before they get to college?

  1. The Human in the Loop: AI is built using math: think of applied statistics on steroids. Humans will be needed more than ever to manage, review and evaluate the validity and reliability of results. Curation will be essential.
  2. We will need to generate ideas about how to address AI factors such as privacy, equity, bias, copyright, intellectual property, accessibility, and scalability.
  3. Have other institutions experimented with AI detection and/or held off on emerging tools related to this? We have just recently adjusted guidance and paused some tools related to this, given the massive inaccuracies in detection (and related downstream issues in faculty-elevated conduct cases).

Even though we learn repeatedly that innovation has a lot to do with effective project management and a solid message that helps people understand what they can do to implement change, people really need innovation to be more exciting and visionary than that.  This is the place where we all need to help each other stay the course of change. 


Along these lines, also see:


What people ask me most. Also, some answers. — from oneusefulthing.org by Ethan Mollick
A FAQ of sorts

I have been talking to a lot of people about Generative AI, from teachers to business executives to artists to people actually building LLMs. In these conversations, a few key questions and themes keep coming up over and over again. Many of those questions are more informed by viral news articles about AI than about the real thing, so I thought I would try to answer a few of the most common, to the best of my ability.

I can’t blame people for asking because, for whatever reason, the companies actually building and releasing Large Language Models often seem allergic to providing any sort of documentation or tutorial besides technical notes. I was given much better documentation for the generic garden hose I bought on Amazon than for the immensely powerful AI tools being released by the world’s largest companies. So, it is no surprise that rumor has been the way that people learn about AI capabilities.

Currently, there are only really three AIs to consider: (1) OpenAI’s GPT-4 (which you can get access to with a Plus subscription or via Microsoft Bing in creative mode, for free), (2) Google’s Bard (free), or (3) Anthropic’s Claude 2 (free, but paid mode gets you faster access). As of today, GPT-4 is the clear leader, Claude 2 is second best (but can handle longer documents), and Google trails, but that will likely change very soon when Google updates its model, which is rumored to be happening in the near future.

 

The MIT Press announces the Open Encyclopedia of Cognitive Science, a paradigm shift in open-access reference works — from mitpress.mit.edu
The Open Encyclopedia of Cognitive Science will equip readers with essential tools to grapple with the profound implications of cognition and intelligence in today’s society

OECS’s articles will not only establish a shared understanding of foundational concepts, but also showcase cutting-edge debates and introduce core subfields, central concepts, significant phenomena, and key methodologies.

 

 

The economic potential of generative AI — from mckinsey.com; via Superhuman



On giving AI eyes and ears — from oneusefulthing.org by Ethan Mollick
AI can listen and see, with bigger implications than we might realize.

Excerpt:

But even this is just the beginning, and new modes of using AI are appearing, which further increases their capabilities. I want to show you some examples of this emerging world, which I think will soon introduce a new wave of AI use cases, and accompanying disruption.

We need to recognize that these capabilities will continue to grow, and AI will be able to play a more active role in the real world by observing and listening. The implications are likely to be profound, and we should start thinking through both the huge benefits and major concerns today.

Ethan Mollick


5 Steps to Transforming Images into Videos Using AI Tools — from heatherbcooper.substack.com by Heather Cooper
A simple guide to layering AI tools for quick video creation



‘Nobody wins in an academic-integrity arms race’ — from chronicle.com by Ian Wilhelm
How artificial intelligence is changing the way colleges think about cheating

Even though generative AI is a new thing, it doesn’t change why students cheat. They’ve always cheated for the same reason: They don’t find the work meaningful, and they don’t think they can achieve it to their satisfaction. So we need to design assessments that students find meaning in. 

Tricia Bertram Gallant


Caught off guard by AI — from chronicle.com by Beth McMurtrie and Beckie Supiano
Professors scrambled to react to ChatGPT this spring — and started planning for the fall

Excerpt:

Is it cheating to use AI to brainstorm, or should that distinction be reserved for writing that you pretend is yours? Should AI be banned from the classroom, or is that irresponsible, given how quickly it is seeping into everyday life? Should a student caught cheating with AI be punished because they passed work off as their own, or given a second chance, especially if different professors have different rules and students aren’t always sure what use is appropriate?


GPT-4 Can Use Tools Now—That’s a Big Deal — from every.to by Dan Shipper; resource via Sam DeBrule
What “function calling” is, how it works, and what it means

Excerpt:

…OpenAI built tool use right into the GPT API with an update called function calling. It’s a little like a child’s ability to ask their parents to help them with a task that they know they can’t do on their own. Except in this case, instead of parents, GPT can call out to external code, databases, or other APIs when it needs to.

Each function in function calling represents a tool that a GPT model can use when necessary, and GPT gets to decide which ones it wants to use and when. This instantly upgrades GPT capabilities—not because it can now do every task perfectly—but because it now knows how to ask for what it wants and get it.
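To make the pattern concrete, here is a minimal, hypothetical sketch of a function-calling request, written against the OpenAI Python SDK as it looked when the feature launched in mid-2023 (the pre-1.0 openai package); the weather function, model snapshot, and prompt are illustrative placeholders, not anything taken from the article:

```python
import json
import openai  # pre-1.0 OpenAI Python SDK; assumes OPENAI_API_KEY is set in the environment

# One "function" (tool) the model is allowed to request. The JSON schema tells GPT
# what the tool does and what arguments it expects.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. Boston"}
            },
            "required": ["city"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-4-0613",  # the mid-2023 snapshot that introduced function calling
    messages=[{"role": "user", "content": "What's the weather like in Boston right now?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether it needs the tool
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model never runs the tool itself; it asks for it, and our code executes it.
    name = message["function_call"]["name"]
    args = json.loads(message["function_call"]["arguments"])
    print(f"Model requested {name} with arguments {args}")
else:
    print(message["content"])
```

In other words, the model's job is only to decide that it needs a tool and to hand back well-formed arguments; actually running the tool and returning its result stays in ordinary application code.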


How ChatGPT can help disrupt assessment overload — from timeshighereducation.com by David Carless
Advances in AI are not necessarily the enemy – in fact, they should prompt long overdue consideration of assessment types and frequency, says David Carless

Excerpt:

Reducing the assessment burden could support trust in students as individuals wanting to produce worthwhile, original work. Indeed, students can be co-opted as partners in designing their own assessment tasks, so they can produce something meaningful to them.

A strategic reduction in quantity of assessment would also facilitate a refocusing of assessment priorities on deep understanding more than just performance and carries potential to enhance feedback processes.

If we were to tackle assessment overload in these ways, it opens up various possibilities. Most significantly there is potential to revitalise feedback so that it becomes a core part of a learning cycle rather than an adjunct at its end. End-of-semester, product-oriented feedback, which comes after grades have already been awarded, fails to encourage the iterative loops and spirals typical of productive learning.


The full 12 uses are here: https://edgeoflearning.com/your-new-teaching-superpower-ai-tools/


The AI Tools in Education Database — from kiwi-path-612.notion.site by EdTech Insiders

Excerpt (emphasis DSC):

Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent AI tools in education and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.



Time for Class 2023 Study finds students are earlier adopters of generative AI tools than faculty, and majority (69%) of learners prefer hybrid, blended or online course formats — from globenewswire.com by Tyton Partners



AI Could Prevent Hiring Bias — Unless It Makes It Worse — from nerdwallet.com by Anna Helhoski
Advocates say AI can eliminate human biases in hiring. Skeptics point out that AI tools are trained by … humans.

Excerpt:

These claims conjure up the rosiest of images: human resource departments and their robot buddies solving discrimination in workplace hiring. It seems plausible, in theory, that AI could root out unconscious bias, but a growing body of research shows the opposite may be more likely.

Companies’ use of AI didn’t come out of nowhere: For example, automated applicant tracking systems have been used in hiring for decades. That means if you’ve applied for a job, your resume and cover letter were likely scanned by an automated system. You probably heard from a chatbot at some point in the process. Your interview might have been automatically scheduled and later even assessed by AI.

From DSC:
Here was my reflection on this:


Also related to AI in hiring, see:

4 in 10 Companies Will Be Using AI Interviews by 2024 — from resumebuilder.com

In June, ResumeBuilder.com surveyed more than 1,000 employees who are involved in hiring processes at their workplaces to find out about their companies’ use of AI interviews.

The results:

  • 43% of companies already have or plan to adopt AI interviews by 2024
  • Two-thirds of this group believe AI interviews will increase hiring efficiency
  • 15% say that AI will be used to make decisions on candidates without any human input
  • More than half believe AI will eventually replace human hiring managers

Watch OpenAI CEO Sam Altman on the Future of AI — from bloomberg.com
Sam Altman, CEO & Co-Founder, OpenAI discusses the explosive rise of OpenAI and its products and what an AI-laced future can look like with Bloomberg’s Emily Chang at the Bloomberg Technology Summit.



PowerSchool Announces Collaboration with Microsoft Azure OpenAI Service to Provide Personalized Learning at Scale in K-12 Education — from powerschool.com
Large-scale language models integrated within PowerSchool Performance Matters and PowerSchool LearningNav products will empower educators in delivering transformative personalized learning pathways

The implementation of generative AI within these products will dramatically improve educators’ ability to deliver personalized learning to students at scale by enabling the application of personalized assessments and learning pathways based on individual student needs and learning goals. K-12 educators will also benefit from access to OpenAI technology…



FETC 2023 Virtual Roundtable: How AI Will Transform K-12 Education

AI could be the great equalizer!

Holly Clark



 

The Cambrian Explosion of AI Edtech Is Here — from edtechinsiders.substack.com by Alex Sarlin, Sarah Morin, and Ben Kornell

Excerpt:

Our AI in Edtech Takeaways

After chronicling 160+ AI tools (which is surely only a small fraction of the total), we’re seeing a few clear patterns among the tools that have come out so far — here are 10 categories that are jumping out!

  • Virtual Teaching Assistants
  • Virtual Tutors
  • AI-Powered Study Tools
  • Educational Content Creation
  • Educational Search
  • Auto-generated Learning Paths
  • AI-Powered Research
  • Speak to Characters
  • Grammar and Writing
  • AI Cheating Detection

 


Ready or not, AI is here — from chronicle.com’s The Edge, by Goldie Blumenstyk

Excerpt:

“I don’t usually get worked up about announcements but I see promise in JFF’s plans for a new Center for Artificial Intelligence & the Future of Work, in no small part because the organization bridges higher ed, K-12 education, employers, and policymakers.”

Goldie Blumenstyk

Goldie’s article links to:

Jobs for the Future Launches New Center for Artificial Intelligence & the Future of Work — from archive.jff.org
Center launches as JFF releases preliminary survey data which finds a majority of workers feel they need new skills and training to prepare for AI’s future impact.

Excerpt:

BOSTON June 14, 2023 — Jobs for the Future (JFF), a national nonprofit that drives transformation in the U.S. education and workforce systems, today announced the launch of its new Center for Artificial Intelligence & the Future of Work. This center will play an integral role in JFF’s mission and newly announced 10-year North Star goal to help 75 million people facing systemic barriers to advancement work in quality jobs. As AI’s explosive growth reshapes every aspect of how we learn, work, and live, this new center will serve as a nexus of collaboration among stakeholders from every part of the education-to-career ecosystem to explore the most promising opportunities—and profound challenges—of AI’s potential to advance an accessible and equitable future of learning and work.

 


OpenAI Considers ‘App Store’ For ChatGPT — from searchenginejournal.com; with thanks to Barsee at AI Valley for this resource
OpenAI explores launching an ‘app store’ for AI models, potentially challenging current partners and expanding customer reach.

Highlights:

  • OpenAI considers launching an ‘app store’ for customized AI chatbots.
  • This move could create competition with current partners and extend OpenAI’s customer reach.
  • Early interest from companies like Aquant and Khan Academy shows potential, but product development and market positioning challenges remain.

The Rise of AI: New Rules for Super T Professionals and Next Steps for EdLeaders — from gettingsmart.com by Tom Vander Ark

Key Points

  • The rise of artificial intelligence, especially generative AI, boosts productivity in content creation–text, code, images and increasingly video.
  • Here are six preliminary conclusions about the nature of work and learning.

Wonder Tools: AI to try — from wondertools.substack.com by Jeremy Caplan
9 playful little ways to explore AI

Excerpt:

  1. Create a personalized children’s story | Schrodi
    Collaborate with AI on a free customized, illustrated story for someone special. Give your story’s hero a name, pick a genre (e.g. comedy, thriller), choose an illustration style (e.g. watercolor, 3d animation) and provide a prompt to shape a simple story. You can even suggest a moral. After a minute, download a full-color PDF to share. Or print it and read your new mini picture book aloud.
  2. Generate a quiz | Piggy
    Put in a link, a topic, or some text and you’ll get a quiz you can share, featuring multiple-choice or true-false questions. Example: try this quick entrepreneurship quiz Piggy generated for me.

 


3 Questions for Coursera About Generative AI in Education — from insidehighered.com by Joshua Kim
How this tech will change the learning experience, course creation and more.

Excerpt (emphasis DSC):

Q: How will generative AI impact teaching and learning in the near and long term?

Baker Stein: One-on-one tutoring at scale is finally being unlocked for learners around the world. This type of quality education is no longer only available to students with the means to hire a private tutor. I’m also particularly excited to see how educators make use of generative AI tools to create courses much faster and likely at a higher quality with increased personalization for each student or even by experimenting with new technologies like extended reality. Professors will be able to put their time toward high-impact activities like mentoring, researching and office hours instead of tedious course-creation tasks. This helps open up the capacity for educators to iterate on their courses faster to keep pace with industry and global changes that may impact their field of study.

Another important use case is how generative AI can serve as a great equalizer for students when it comes to writing, especially second language learners.

 

Accenture announces jaw-dropping $3 billion investment in AI — from venturebeat.com by Carl Franzen; via Superhuman

Excerpt:

The generative AI announcements are coming fast and furious these days, but among the biggest in terms of sheer dollar commitments just landed: Accenture, the global professional services and consulting giant, today announced it will invest $3 billion (with a “b”!) in AI over the next three years in building out its team of AI professionals and AI-focused solutions for its clients.

“There is unprecedented interest in all areas of AI, and the substantial investment we are making in our Data & AI practice will help our clients move from interest to action to value, and in a responsible way with clear business cases,” said Julie Sweet, Accenture’s chairwoman and CEO.

Also related/see:

Artificial intelligence creates 40,000 new roles at Accenture — from computerweekly.com by Karl Flinders
Accenture is planning to add thousands of AI experts to its workforce as part of a $3bn investment in its data and artificial intelligence practice

Why leaders need to evolve alongside generative AI — from fastcompany.com by Kelsey Behringer
Even if you’re not an educator, you should not be sitting on the sidelines watching the generative AI conversation being had around you—hop in.

Excerpts (emphasis DSC):

Leaders should be careful to watch and support education right now. At the end of the day, the students sitting in K-12 and college classrooms are going to be future CPAs, lawyers, writers, and teachers. If you are parenting a child, you have skin in the game. If you use professional services, you have skin in the game. When it comes to education, we all have skin in the game.

Students need to master fundamental skills like editing, questioning, researching, and verifying claims before they can use generative AI exceptionally well.

GenAI & Education: Enhancement, not Replacement — from drphilippahardman.substack.com by Dr. Philippa Hardman
How to co-exist in the age of automation

Excerpts (emphasis DSC):

[On 6/15/23, I joined] colleagues from OpenAI, Google, Microsoft, Stanford, Harvard and others at the first meeting of the GenAI Summit. Our shared goal [was] to help educate universities & schools in Europe about the impact of Generative AI on their work.

how can we effectively communicate to education professionals that generative AI will enhance their work rather than replace them?

A recent controlled study found that ChatGPT can help professionals increase their efficiency in routine tasks by ~35%. If we keep in mind that the productivity gains brought by the steam engine in the nineteenth century were ~25%, this is huge.

As educators, we should embrace the power of ChatGPT to automate the repetitive tasks which we’ve been distracted by for decades. Lesson planning, content creation, assessment design, grading and feedback – generative AI can help us to do all of these things faster than ever before, freeing us up to focus on where we bring most value for our students.

Google, one of AI’s biggest backers, warns own staff about chatbots — from reuters.com by Jeffrey Dastin and Anna Tong

Excerpt:

SAN FRANCISCO, June 15 (Reuters) – Alphabet Inc (GOOGL.O) is cautioning employees about how they use chatbots, including its own Bard, at the same time as it markets the program around the world, four people familiar with the matter told Reuters.

The Google parent has advised employees not to enter its confidential materials into AI chatbots, the people said and the company confirmed, citing long-standing policy on safeguarding information.

The economic potential of generative AI: The next productivity frontier — from mckinsey.com
Generative AI’s impact on productivity could add trillions of dollars in value to the global economy—and the era is just beginning.



Preparing for the Classrooms and Workplaces of the Future: Generative AI in edX — from campustechnology.com by Mary Grush
A Q&A with Anant Agarwal


Adobe Firefly for the Enterprise — Dream Bigger with Adobe Firefly.
Dream it, type it, see it with Firefly, our creative generative AI engine. Now in Photoshop (beta), Illustrator, Adobe Express, and on the web.


Apple Vision Pro, Higher Education and the Next 10 Years — from insidehighered.com by Joshua Kim
How this technology will play out in our world over the next decade.



Zoom can now give you AI summaries of the meetings you’ve missed — from theverge.com by Emma Roth


Mercedes-Benz Is Adding ChatGPT to Cars for AI Voice Commands — from decrypt.co by Jason Nelson; via Superhuman
The luxury automaker is set to integrate OpenAI’s ChatGPT chatbot into its Mercedes-Benz User Experience (MBUX) feature in the U.S.


 



Introducing the ChatGPT app for iOS — from openai.com
The ChatGPT app syncs your conversations, supports voice input, and brings our latest model improvements to your fingertips.

Excerpt:

Since the release of ChatGPT, we’ve heard from users that they love using ChatGPT on the go. Today, we’re launching the ChatGPT app for iOS.

The ChatGPT app is free to use and syncs your history across devices. It also integrates Whisper, our open-source speech-recognition system, enabling voice input. ChatGPT Plus subscribers get exclusive access to GPT-4’s capabilities, early access to features and faster response times, all on iOS.


Spotlight: AI Myths and Misconceptions — Your Undivided Attention — from your-undivided-attention.simplecast.com

A few episodes back, we presented Tristan Harris and Aza Raskin’s talk The AI Dilemma. People inside the companies that are building generative artificial intelligence came to us with their concerns about the rapid pace of deployment and the problems that are emerging as a result. We felt called to lay out the catastrophic risks that AI poses to society and sound the alarm on the need to upgrade our institutions for a post-AI world.

The talk resonated – over 1.6 million people have viewed it on YouTube as of this episode’s release date. The positive reception gives us hope that leaders will be willing to come to the table for a difficult but necessary conversation about AI.

However, now that so many people have watched or listened to the talk, we’ve found that there are some AI myths getting in the way of making progress. On this episode of Your Undivided Attention, we debunk five of those misconceptions.



The State of Voice Technology in 2023 — from deepgram.com; with thanks to The Rundown for this resource
Explore the latest insights on speech AI applications and automatic speech recognition (ASR) across a dozen industries, as seen by 400 business leaders surveyed for this report by Opus Research.


Also relevant here, see:


Your guide to AI: May 2023 — from nathanbenaich.substack.com by Nathan Benaich and Othmane Sebbouh
Welcome to the latest issue of your guide to AI, an editorialized newsletter covering key developments in AI research (particularly for this issue!), industry, geopolitics and startups during April 2023. 


NYC Public Schools Drop Ban on AI Tool ChatGPT — from bloomberg.com


 

 

Introducing Teach AI — Empowering educators to teach w/ AI & about AI [ISTE & many others]




Also relevant/see:

 

ChatGPT: 30 incredible ways to use the AI-powered chatbot — from interestingengineering.com by Christopher McFadden
You’ve heard of ChatGPT, but do you know how to use it? Or what to use it for? If not, then here are some ideas to get you started.

Excerpts:

  • It’s great at writing CVs and resumes
  • It can also read and improve the existing CV or resume
  • It can help you prepare for a job interview
  • ChatGPT can even do some translation work for you
  • Have it draft you an exam

Chatbots’ Time Has Come. Why Now? — from every.to by Nathan Baschez
Narratives have network effects

Excerpt:

There are obvious questions like “Are the AI’s algorithms good enough?” (probably not yet) and “What will happen to Google?” (nobody knows), but I’d like to take a step back and ask some more fundamental questions: why chat? And why now?

Most people don’t realize that the AI model powering ChatGPT is not all that new. It’s a tweaked version of a foundation model, GPT-3, that launched in June 2020. Many people have built chatbots using it before now. OpenAI even has a guide in its documentation showing exactly how you can use its APIs to make one.

So what happened? The simple narrative is that AI got exponentially more powerful recently, so now a lot of people want to use it. That’s true if you zoom out. But if you zoom in, you start to see that something much more complex and interesting is happening.

This leads me to a surprising hypothesis: perhaps the ChatGPT moment never would have happened without DALL-E 2 and Stable Diffusion happening earlier in the year!
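As Baschez notes above, people were already building chatbots on GPT-3 well before ChatGPT. Below is a minimal, hypothetical sketch of that pre-ChatGPT pattern, written against the GPT-3-era Completion endpoint of the pre-1.0 openai Python SDK; the model name, prompt framing, and parameters are illustrative and not taken from OpenAI's guide:

```python
import openai  # pre-1.0 OpenAI Python SDK; assumes OPENAI_API_KEY is set in the environment

# Completion models have no built-in chat memory, so the prompt must carry
# the running transcript of the conversation on every turn.
history = "The following is a conversation between a human and a helpful assistant.\n"

while True:
    user = input("You: ")
    history += f"Human: {user}\nAssistant:"
    response = openai.Completion.create(
        model="text-davinci-003",  # a GPT-3-era completion model (illustrative choice)
        prompt=history,
        max_tokens=200,
        temperature=0.7,
        stop=["Human:"],  # stop before the model starts writing the next user turn
    )
    answer = response["choices"][0]["text"].strip()
    history += f" {answer}\n"
    print("Assistant:", answer)
```

The chat "interface" here is nothing more than a loop that appends each turn to the prompt, which lines up with Baschez's point that the underlying model was not the new ingredient.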


The Most Important Job Skill of This Century — from theatlantic.com by Charlie Warzel
Your work future could depend on how well you can talk to AI.

Excerpt:

Like writing and coding before it, prompt engineering is an emergent form of thinking. It lies somewhere between conversation and query, between programming and prose. It is the one part of this fast-changing, uncertain future that feels distinctly human.


The ChatGPT AI hype cycle is peaking, but even tech skeptics don’t expect a bust — from cnbc.com by Eric Rosenbaum

Key Points:

  • OpenAI’s ChatGPT, with new funding from Microsoft, has grown to over one million users faster than many of the dominant tech companies, apps, and platforms of the past decade.
  • Unlike the metaverse concept, which had a hype cycle based on an idea still nebulous to many, generative AI as tech’s next big thing is being built on top of decades of existing machine learning already embedded in business processes.
  • We asked top technology officers, specifically reaching out to many at non-tech sector companies, to break down the potential and pitfalls of AI adoption.

ChatGPT and the college curriculum — out at youtube.com by Bryan Alexander with Maria Anderson


AI in EDU: Know the Risks — from linkedin.com by Angela Maiers


 


 

Introducing: ChatGPT Edu-Mega-Prompts — from drphilippahardman.substack.com by Dr. Philippa Hardman; with thanks to Ray Schroeder out on LinkedIn for this resource
How to combine the power of AI + learning science to improve your efficiency & effectiveness as an educator

From DSC:
Before relaying some excerpts, I want to say that I get the gist of what Dr. Hardman is saying re: quizzes. But I’m surprised to hear she had so many pedagogical concerns with quizzes. I, too, would like to see quizzes used as an instrument of learning and to practice recall — and not just for assessment. But I would give quizzes a higher thumbs up than what she did. I think she was also trying to say that quizzes don’t always identify misconceptions or inaccurate foundational information. 

Excerpts:

The Bad News: Most AI technologies that have been built specifically for educators in the last few years and months imitate and threaten to spread the use of broken instructional practices (i.e. content + quiz).

The Good News: Armed with prompts which are carefully crafted to ask the right thing in the right way, educators can use AI like GPT3 to improve the effectiveness of their instructional practices.

As is always the case, ChatGPT is your assistant. If you’re not happy with the result, you can edit and refine it using your expertise, either alone or through further conversation with ChatGPT.

For example, once the first response is generated, you can ask ChatGPT to make the activity more or less complex, to change the scenario and/or suggest more or different resources – the options are endless.

Philippa recommended checking out Rob Lennon’s streams of content. Here’s an example from his Twitter account:


Also relevant/see:


3 Trends That May Unlock AI’s Potential for L&D in 2023 — from learningguild.com by Juan Naranjo

Excerpts:

AI-assisted design and development work
This is the trend most likely to have a dramatic evolution this year.

Solutions like large language models, speech generators, content generators, image generators, translation tools, transcription tools, and video generators, among many others, will transform the way IDs create the learning experiences our organizations use. Two examples are:

1. IDs will be doing more curation and less creation:

  • Many IDs will start pulling raw material from content generators (built using natural language processing platforms like OpenAI’s GPT-3, Microsoft’s LUIS, IBM’s Watson, Google’s BERT, etc.) to obtain ideas and drafts that they can then clean up and add to the assets they are assembling. As technology advances, the output from these platforms will be more suitable to become final drafts, and the curation and clean-up tasks will be faster and easier.
  • Then, the designer can leverage a solution like DALL-E 2 (or a product developed based on it) to obtain visuals that can (or not) be modified with programs like Illustrator or Photoshop (see the image below for DALL-E’s “Cubist interpretation of AI and brain science”).

2. IDs will spend less, and in some cases no time at all, creating learning pathways

AI engines contained in LXPs and other platforms will select the right courses for employees and guide these learners from their current level of knowledge and skill to their goal state with substantially less human intervention.

 


The Creator of ChatGPT Thinks AI Should Be Regulated — from time.com by John Simons

Excerpts:

Somehow, Mira Murati can forthrightly discuss the dangers of AI while making you feel like it’s all going to be OK.

A growing number of leaders in the field are warning of the dangers of AI. Do you have any misgivings about the technology?

This is a unique moment in time where we do have agency in how it shapes society. And it goes both ways: the technology shapes us and we shape it. There are a lot of hard problems to figure out. How do you get the model to do the thing that you want it to do, and how you make sure it’s aligned with human intention and ultimately in service of humanity? There are also a ton of questions around societal impact, and there are a lot of ethical and philosophical questions that we need to consider. And it’s important that we bring in different voices, like philosophers, social scientists, artists, and people from the humanities.


Whispers of A.I.’s Modular Future — from newyorker.com by James Somers; via Sam DeBrule

Excerpts:

Gerganov adapted it from a program called Whisper, released in September by OpenAI, the same organization behind ChatGPT and DALL-E. Whisper transcribes speech in more than ninety languages. In some of them, the software is capable of superhuman performance—that is, it can actually parse what somebody’s saying better than a human can.

Until recently, world-beating A.I.s like Whisper were the exclusive province of the big tech firms that developed them.

Ever since I’ve had tape to type up—lectures to transcribe, interviews to write down—I’ve dreamed of a program that would do it for me. The transcription process took so long, requiring so many small rewindings, that my hands and back would cramp. As a journalist, knowing what awaited me probably warped my reporting: instead of meeting someone in person with a tape recorder, it often seemed easier just to talk on the phone, typing up the good parts in the moment.

From DSC:
Journalism majors — and even seasoned journalists — should keep an eye on this type of application, as it will save them a significant amount of time and/or money.
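For anyone who wants to try this, here is a minimal sketch using the open-source openai-whisper Python package (this assumes `pip install openai-whisper` plus ffmpeg on the PATH; the audio filename is just a placeholder):

```python
import whisper  # pip install openai-whisper; also requires ffmpeg on the PATH

# Load one of the published checkpoints; "base" is small enough for a laptop CPU,
# while larger checkpoints trade speed for accuracy.
model = whisper.load_model("base")

# Transcribe an audio file (the filename here is a placeholder).
result = model.transcribe("interview.mp3")

print(result["text"])  # the full transcript as a single string

# Each segment carries start/end timestamps, handy for finding quotes on the tape.
for segment in result["segments"]:
    print(f'[{segment["start"]:7.1f}s] {segment["text"]}')
```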

Microsoft Teams Premium: Cut costs and add AI-powered productivity — from microsoft.com by Nicole Herskowitz

Excerpt:

Built on the familiar, all-in-one collaborative experience of Microsoft Teams, Teams Premium brings the latest technologies, including Large Language Models powered by OpenAI’s GPT-3.5, to make meetings more intelligent, personalized, and protected—whether it’s one-on-one, large meetings, virtual appointments, or webinars.


 

AI, Instructional Design, and OER — from opencontent.org by David Wiley

Excerpt:

LLMs Will Make Creating the Content Infrastructure Significantly Easier, Faster, and Cheaper
LLMs will dramatically increase the speed of creating the informational resources that comprise the content infrastructure. Of course the drafts of these informational resources will need to be reviewed and improvements will need to be made – just as is the case with all first drafts – to ensure accuracy and timeliness. But it appears that LLMs can get us 80% or so of the way to reasonable first drafts orders of magnitude faster, eliminating the majority of the expense involved in this part of the process. Here’s an example of what I’m talking about. Imagine you’re a SME who has been tasked with writing the content for an introductory economics textbook. (The following examples are from ChatGPT.)

Speaking of ID and higher education, also relevant/see:

 

Some example components of a learning ecosystem [Christian]

A learning ecosystem is composed of people, tools, technologies, content, processes, culture, strategies, and any other resource that helps one learn. Learning ecosystems can be at an individual level as well as at an organizational level.

Some example components:

  • Subject Matter Experts (SMEs) such as faculty, staff, teachers, trainers, parents, coaches, directors, and others
  • Fellow employees
  • L&D/Training professionals
  • Managers
  • Instructional Designers
  • Librarians
  • Consultants
  • Types of learning
    • Active learning
    • Adult learning
    • PreK-12 education
    • Training/corporate learning
    • Vocational learning
    • Experiential learning
    • Competency-based learning
    • Self-directed learning (i.e., heutagogy)
    • Mobile learning
    • Online learning
    • Face-to-face-based learning
    • Hybrid/blended learning
    • Hyflex-based learning
    • Game-based learning
    • XR-based learning (AR, MR, and VR)
    • Informal learning
    • Formal learning
    • Lifelong learning
    • Microlearning
    • Personalized/customized learning
    • Play-based learning
  • Cloud-based learning apps
  • Coaching & mentoring
  • Peer feedback
  • Job aids/performance tools and other on-demand content
  • Websites
  • Conferences
  • Professional development
  • Professional organizations
  • Social networking
  • Social media – Twitter, LinkedIn, Facebook/Meta, other
  • Communities of practice
  • Artificial Intelligence (AI) — including ChatGPT, learning agents, learner profiles, and more
  • LMS/CMS/Learning Experience Platforms
  • Tutorials
  • Videos — including on YouTube, Vimeo, other
  • Job-aids
  • E-learning-based resources
  • Books, digital textbooks, journals, and manuals
  • Enterprise social networks/tools
  • RSS feeds and blogging
  • Podcasts/vodcasts
  • Videoconferencing/audio-conferencing/virtual meetings
  • Capturing and sharing content
  • Tagging/rating/curating content
  • Decision support tools
  • Getting feedback
  • Webinars
  • In-person workshops
  • Discussion boards/forums
  • Chat/IM
  • VOIP
  • Online-based resources (periodicals, journals, magazines, newspapers, and others)
  • Learning spaces
  • Learning hubs
  • Learning preferences
  • Learning theories
  • Microschools
  • MOOCs
  • Open courseware
  • Portals
  • Wikis
  • Wikipedia
  • Slideshare
  • TED talks
  • …and many more components.

These people, tools, technologies, etc. are constantly morphing — as well as coming and going in and out of our lives.

 

 
© 2024 | Daniel Christian