Bill Gates Reveals Superhuman AI Prediction — from youtube.com by Rufus Griscom, Bill Gates, Andy Sack, and Adam Brotman

In this episode of the Next Big Idea podcast, host Rufus Griscom and Bill Gates are joined by Andy Sack and Adam Brotman, co-authors of an exciting new book called “AI First.” Together, they consider AI’s impact on healthcare, education, productivity, and business. They dig into the technology’s risks. And they explore its potential to cure diseases, enhance creativity, and usher in a world of abundance.

Key moments:

00:05 Bill Gates discusses AI’s transformative potential in revolutionizing technology.
02:21 Superintelligence is inevitable and marks a significant advancement in AI technology.
09:23 Future AI may integrate deeply as cognitive assistants in personal and professional life.
14:04 AI’s metacognitive advancements could revolutionize problem-solving capabilities.
21:13 AI’s next frontier lies in developing human-like metacognition for sophisticated problem-solving.
27:59 AI advancements empower both good and malicious intents, posing new security challenges.
28:57 Rapid AI development raises questions about controlling its global application.
33:31 Productivity enhancements from AI can significantly improve efficiency across industries.
35:49 AI’s future applications in consumer and industrial sectors are subjects of ongoing experimentation.
46:10 AI democratization could level the economic playing field, enhancing service quality and reducing costs.
51:46 AI plays a role in mitigating misinformation and bridging societal divides through enhanced understanding.


OpenAI Introduces CriticGPT: A New Artificial Intelligence AI Model based on GPT-4 to Catch Errors in ChatGPT’s Code Output — from marktechpost.com

The team has summarized their primary contributions as follows.

  1. The team offers the first instance of a simple, scalable oversight technique that greatly assists humans in more thoroughly detecting problems in real-world RLHF data.
  2. Within the ChatGPT and CriticGPT training pools, the team found that critiques produced by CriticGPT catch more inserted bugs and are preferred over those written by human contractors.
  3. Compared to human contractors working alone, teams consisting of critic models and human contractors generate more thorough critiques. Compared to reviews generated exclusively by models, this partnership also lowers the incidence of hallucinations.
  4. The study introduces Force Sampling Beam Search (FSBS), an inference-time sampling and scoring technique that balances the trade-off between minimizing bogus concerns and surfacing genuine faults in LLM-generated critiques.
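The core idea of that last contribution — sample several candidate critiques, then trade off genuine bugs found against spurious complaints when picking one — can be sketched in a few lines. This is an illustrative toy, not the paper’s actual procedure; all the names and scoring inputs below are made up for the example.

```python
def select_critique(candidates, issues_found, bogus_claims, precision_weight=0.5):
    """Pick the best of several sampled critiques.

    Score each candidate as (genuine faults flagged) minus a weighted
    penalty for bogus concerns, and keep the highest-scoring one.
    Raising precision_weight favors conservative critiques; lowering it
    favors thorough ones.
    """
    def score(c):
        return issues_found[c] - precision_weight * bogus_claims[c]
    return max(candidates, key=score)


# Toy run: critique_c flags 4 real bugs with only 1 spurious complaint,
# so it beats the noisier critique_b and the more timid critique_a.
candidates = ["critique_a", "critique_b", "critique_c"]
issues_found = {"critique_a": 3, "critique_b": 4, "critique_c": 4}
bogus_claims = {"critique_a": 0, "critique_b": 3, "critique_c": 1}
best = select_critique(candidates, issues_found, bogus_claims)
print(best)  # critique_c
```

In the real system the scoring comes from a reward model rather than hand-supplied counts, but the knob being tuned is the same one the authors describe.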

Character.AI now allows users to talk with AI avatars over calls — from techcrunch.com by Ivan Mehta

a16z-backed Character.AI said today that it is now allowing users to talk to AI characters over calls. The feature currently supports multiple languages, including English, Spanish, Portuguese, Russian, Korean, Japanese and Chinese.

The startup tested the calling feature ahead of today’s public launch. During that time, it said that more than 3 million users had made over 20 million calls. The company also noted that calls with AI characters can be useful for practicing language skills, giving mock interviews, or adding them to the gameplay of role-playing games.


Google Translate Just Added 110 More Languages — from lifehacker.com
You can now use the app to communicate in languages you’ve never even heard of.

Google Translate can come in handy when you’re traveling or communicating with someone who speaks another language, and thanks to a new update, you can now connect with some 614 million more people. Google is adding 110 new languages to its Translate tool using its AI PaLM 2 large language model (LLM), which brings the total of supported languages to nearly 250. This follows the 24 languages added in 2022, including Indigenous languages of the Americas as well as those spoken across Africa and central Asia.




Listen to your favorite books and articles voiced by Judy Garland, James Dean, Burt Reynolds and Sir Laurence Olivier — from elevenlabs.io
ElevenLabs partners with estates of iconic stars to bring their voices to the Reader App

 

The Musician’s Rule and GenAI in Education — from opencontent.org by David Wiley

We have to provide instructors the support they need to leverage educational technologies like generative AI effectively in the service of learning. Given the amount of benefit that could accrue to students if powerful tools like generative AI were used effectively by instructors, it seems unethical not to provide instructors with professional development that helps them better understand how learning occurs and what effective teaching looks like. Without more training and support for instructors, the amount of student learning higher education will collectively “leave on the table” will only increase as generative AI gets more and more capable. And that’s a problem.

From DSC:
As is often the case, David put together a solid posting here. A few comments/reflections on it:

  • I agree that more training/professional development is needed, especially regarding generative AI. This would help achieve a far greater ROI and impact.
  • The pace of change makes it difficult to see where the sand is settling…and thus what to focus on.
  • The Teaching & Learning Groups out there are also trying to learn and grow in their knowledge (so that they can train others).
  • The administrators out there are also trying to figure out what all of this generative AI stuff is all about; and so are the faculty members. It takes time for educational technologies’ impact to roll out and be integrated into how people teach.
  • As we’re talking about multiple disciplines here, I think we need more team-based content creation and delivery.
  • There needs to be more research on how best to use AI — again, it would be helpful if the sand settled a bit first, so as not to waste time and $$. But then that research needs to be piped into the classrooms far better.

We need to take more of the research from learning science and apply it in our learning spaces.

 

2024 Global Skills Report — from Coursera

  • AI literacy emerges as a global imperative
  • AI readiness initiatives drive emerging skill adoption across regions
  • The digital skills gap persists in a rapidly evolving job market
  • Cybersecurity skills remain crucial amid talent shortages and evolving threats
  • Micro-credentials are a rapid pathway for learners to prepare for in-demand jobs
  • The global gender gap in online learning continues to narrow, but regional disparities persist
  • Different regions prioritize different skills, but the majority focus on emerging or foundational capabilities

You can use the Global Skills Report 2024 to:

  • Identify critical skills for your students to strengthen employability
  • Align curriculum to drive institutional advantage nationally
  • Track emerging skill trends like GenAI and cybersecurity
  • Understand entry-level and digital role skill trends across six regions
 

Daniel Christian: My slides for the Educational Technology Organization of Michigan’s Spring 2024 Retreat

From DSC:
Last Thursday, I presented at the Educational Technology Organization of Michigan’s Spring 2024 Retreat. I wanted to pass along my slides to you all, in case they are helpful to you.

Topics/agenda:

  • Topics & resources re: Artificial Intelligence (AI)
    • Top multimodal players
    • Resources for learning about AI
    • Applications of AI
    • My predictions re: AI
  • The powerful impact of pursuing a vision
  • A potential, future next-gen learning platform
  • Share some lessons from my past with pertinent questions for you all now
  • The significant impact of an organization’s culture
  • Bonus material: Some people to follow re: learning science and edtech

 

Educational Technology Organization of Michigan — ETOM — Spring 2024 Retreat on June 6-7

PowerPoint slides of Daniel Christian's presentation at ETOM

Slides of the presentation (.PPTX)
Slides of the presentation (.PDF)

 


Plus several more slides re: this vision.

 

How Generative AI Owns Higher Education. Now What? — from forbes.com by Steve Andriole

Excerpt (emphasis DSC):

What about course videos? Professors can create them (by lecturing into a camera for several hours, hopefully in different clothes) from the readings, from their interpretations of the readings, from their own case experiences – from anything they like. But now professors can direct the creation of the videos by talking – actually describing – to a CustomGPT about what they’d like the video to communicate with their own or another image. Wait. What? They can make a video by talking to a CustomGPT and even select the image they want the “actor” to use? Yes. They can also add a British accent and insert some (GenAI-developed) jokes into the videos if they like. All this and much more is now possible. This means that a professor can specify how long the video should be, what sources should be consulted and describe the demeanor the professor wants the video to project.

From DSC:
Though I wasn’t crazy about the clickbait type of title here, I still thought that the article was solid and thought-provoking. It contained several good ideas for using AI.


Excerpt from a recent EdSurge Higher Ed newsletter:


There are darker metaphors though — ones that focus on the hazards for humanity of the tech. Some professors worry that AI bots are simply replacing hired essay-writers for many students, doing work for a student that they can then pass off as their own (and doing it for free).

From DSC:
Hmmm…the use of essay writers was around long before AI became mainstream within higher education. So we already had a serious problem where students didn’t see the why in what they were being asked to do. Some students still aren’t sold on the why of the work in the first place. The situation seems to involve ethics, yes, but it also seems to say that we haven’t sold students on the benefits of putting in the work. Students seem to be saying I don’t care about this stuff…I just need the degree so I can exit stage left.

My main point: The issue didn’t start with AI…it started long before that.

And somewhat relevant here, also see:

I Have Bigger Fish to Fry: Why K12 Education is Not Thinking About AI — from medium.com by Maurie Beasley, M.Ed. (Edited by Jim Beasley)

This financial stagnation is occurring as we face a multitude of escalating challenges. These challenges include, but are in no way limited to, chronic absenteeism, widespread student mental health issues, critical staff shortages, rampant classroom behavior issues, a palpable sense of apathy for education in students, and even, I dare say, hatred towards education among parents and policymakers.

Our current focus is on keeping our heads above water, ensuring our students’ safety and mental well-being, and simply keeping our schools staffed and our doors open.


Meet Ed: Ed is an educational friend designed to help students reach their limitless potential. — from lausd.org (Los Angeles Unified School District, the second largest in the U.S.)

What is Ed?
An easy-to-understand learning platform designed by Los Angeles Unified to increase student achievement. It offers personalized guidance and resources to students and families 24/7 in over 100 languages.


Also relevant/see:

  • Los Angeles Unified Bets Big on ‘Ed,’ an AI Tool for Students — by Lauraine Langreo
    The Los Angeles Unified School District has launched an AI-powered learning tool that will serve as a “personal assistant” to students and their parents. The tool, named “Ed,” can provide students from the nation’s second-largest district information about their grades, attendance, upcoming tests, and suggested resources to help them improve their academic skills on their own time, Superintendent Alberto Carvalho announced March 20. Students can also use the app to find social-emotional-learning resources, see what’s for lunch, and determine when their bus will arrive.

Could OpenAI’s Sora be a big deal for elementary school kids? — from futureofbeinghuman.com by Andrew Maynard
Despite all the challenges it comes with, AI-generated video could unleash the creativity of young children and provide insights into their inner worlds – if it’s developed and used responsibly

Like many others, I’m concerned about the challenges that come with hyper-realistic AI-generated video. From deep fakes and disinformation to blurring the lines between fact and fiction, generative AI video is calling into question what we can trust, and what we cannot.

And yet despite all the issues the technology is raising, it also holds quite incredible potential, including as a learning and development tool — as long as we develop and use it responsibly.

I was reminded of this a few days back while watching the latest videos from OpenAI created by their AI video engine Sora — including the one below generated from the prompt “an elephant made of leaves running in the jungle.”

What struck me while watching this — perhaps more than any of the other videos OpenAI has been posting on its TikTok channel — is the potential Sora has for translating the incredibly creative but often hard to articulate ideas someone may have in their head, into something others can experience.


Can AI Aid the Early Education Workforce? — from edsurge.com by Emily Tate Sullivan
During a panel at SXSW EDU 2024, early education leaders discussed the potential of AI to support and empower the adults who help our nation’s youngest children.

While the vast majority of the conversations about AI in education have centered on K-12 and higher education, few have considered the potential of this innovation in early care and education settings.

At the conference, a panel of early education leaders gathered to do just that, in a session exploring the potential of AI to support and empower the adults who help our nation’s youngest children, titled, “ChatECE: How AI Could Aid the Early Educator Workforce.”

Hau shared that K-12 educators are using the technology to improve efficiency in a number of ways, including to draft individualized education programs (IEPs), create templates for communicating with parents and administrators, and in some cases, to support building lesson plans.


From EIEIO…Seasons Of Change

Again, we’ve never seen change happen as fast as it’s happening.


Enhancing World Language Instruction With AI Image Generators — from edutopia.org by Rachel Paparone
By crafting an AI prompt in the target language to create an image, students can get immediate feedback on their communication skills.

Educators are, perhaps rightfully so, cautious about incorporating AI in their classrooms. With thoughtful implementation, however, AI image generators, with their ability to use any language, can provide powerful ways for students to engage with the target language and increase their proficiency.


AI in the Classroom: A Teacher’s Toolkit for Transformation — from esheninger.blogspot.com by Eric Sheninger

While AI offers numerous benefits, it’s crucial to remember that it is a tool to empower educators, not replace them. The human connection between teacher and student remains central to fostering creativity, critical thinking, and social-emotional development. The role of teachers will shift towards becoming facilitators, curators, and mentors who guide students through personalized learning journeys. By harnessing the power of AI, educators can create dynamic and effective classrooms that cater to each student’s individual needs. This paves the way for a more engaging and enriching learning experience that empowers students to thrive.


Teachers Are Using AI to Create New Worlds, Help Students with Homework, and Teach English — from themarkup.org by Ross Teixeira; via Matthew Tower
Around the world, these seven teachers are making AI work for them and their students

In this article, seven teachers across the world share their insights on AI tools for educators. You will hear a host of varied opinions and perspectives on everything from whether AI could hasten the decline of learning foreign languages to whether AI-generated lesson plans are an infringement on teachers’ rights. A common theme emerged from those we spoke with: just as the internet changed education, AI tools are here to stay, and it is prudent for teachers to adapt.


Teachers Desperately Need AI Training. How Many Are Getting It? — from edweek.org by Lauraine Langreo

Even though it’s been more than a year since ChatGPT made a big splash in the K-12 world, many teachers say they are still not receiving any training on using artificial intelligence tools in the classroom.

More than 7 in 10 teachers said they haven’t received any professional development on using AI in the classroom, according to a nationally representative EdWeek Research Center survey of 953 educators, including 553 teachers, conducted between Jan. 31 and March 4.

From DSC:
This article mentioned the following resource:

Artificial Intelligence Explorations for Educators — from iste.org


 

Amid explosive demand, America is running out of power — from washingtonpost.com by Evan Halper
AI and the boom in clean-tech manufacturing are pushing America’s power grid to the brink. Utilities can’t keep up.

Vast swaths of the United States are at risk of running short of power as electricity-hungry data centers and clean-technology factories proliferate around the country, leaving utilities and regulators grasping for credible plans to expand the nation’s creaking power grid.

A major factor behind the skyrocketing demand is the rapid innovation in artificial intelligence, which is driving the construction of large warehouses of computing infrastructure that require exponentially more power than traditional data centers. AI is also part of a huge scale-up of cloud computing. Tech firms like Amazon, Apple, Google, Meta and Microsoft are scouring the nation for sites for new data centers, and many lesser-known firms are also on the hunt.


The Obscene Energy Demands of A.I. — from newyorker.com by Elizabeth Kolbert
How can the world reach net zero if it keeps inventing new ways to consume energy?

“There’s a fundamental mismatch between this technology and environmental sustainability,” de Vries said. Recently, the world’s most prominent A.I. cheerleader, Sam Altman, the C.E.O. of OpenAI, voiced similar concerns, albeit with a different spin. “I think we still don’t appreciate the energy needs of this technology,” Altman said at a public appearance in Davos. He didn’t see how these needs could be met, he went on, “without a breakthrough.” He added, “We need fusion or we need, like, radically cheaper solar plus storage, or something, at massive scale—like, a scale that no one is really planning for.”


A generative AI reset: Rewiring to turn potential into value in 2024 — from mckinsey.com by Eric Lamarre, Alex Singla, Alexander Sukharevsky, and Rodney Zemmel; via Philippa Hardman
The generative AI payoff may only come when companies do deeper organizational surgery on their business.

  • Figure out where gen AI copilots can give you a real competitive advantage
  • Upskill the talent you have but be clear about the gen-AI-specific skills you need
  • Form a centralized team to establish standards that enable responsible scaling
  • Set up the technology architecture to scale
  • Ensure data quality and focus on unstructured data to fuel your models
  • Build trust and reusability to drive adoption and scale

AI Prompt Engineering Is Dead — from spectrum.ieee.org
Long live AI prompt engineering

Since ChatGPT dropped in the fall of 2022, everyone and their donkey has tried their hand at prompt engineering—finding a clever way to phrase your query to a large language model (LLM) or AI art or video generator to get the best results or sidestep protections. The Internet is replete with prompt-engineering guides, cheat sheets, and advice threads to help you get the most out of an LLM.

However, new research suggests that prompt engineering is best done by the model itself, and not by a human engineer. This has cast doubt on prompt engineering’s future—and increased suspicions that a fair portion of prompt-engineering jobs may be a passing fad, at least as the field is currently imagined.
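The approach the research describes — letting the model optimize its own prompts — can be pictured as a simple search loop: a model proposes rewordings of the current best prompt, each candidate is scored on a small eval set, and the winner survives. The sketch below is a stand-in under stated assumptions: `propose_variants` and `evaluate` are placeholders for LLM calls, not real APIs.

```python
def optimize_prompt(seed_prompt, propose_variants, evaluate, rounds=3):
    """Greedy automatic prompt search.

    Each round, ask a model to propose variants of the current best
    prompt, score every candidate, and keep the highest scorer.
    """
    best, best_score = seed_prompt, evaluate(seed_prompt)
    for _ in range(rounds):
        for candidate in propose_variants(best):
            candidate_score = evaluate(candidate)
            if candidate_score > best_score:
                best, best_score = candidate, candidate_score
    return best, best_score


# Toy run: the "evaluation" just rewards prompts that ask for reasoning
# steps, so each round appends another nudge to think step by step.
evaluate = lambda p: p.count("step by step")
propose_variants = lambda p: [p + " Think step by step.", p.upper()]
best, score = optimize_prompt("Solve the puzzle.", propose_variants, evaluate)
```

Real systems score candidates against held-out task examples rather than a keyword count, but the control flow — propose, evaluate, keep the best — is the part that removes the human from the loop.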


What the birth of the spreadsheet teaches us about generative AI — from timharford.com by Tim Harford; via Sam DeBrule

There is one very clear parallel between the digital spreadsheet and generative AI: both are computer apps that collapse time. A task that might have taken hours or days can suddenly be completed in seconds. So accept for a moment the premise that the digital spreadsheet has something to teach us about generative AI. What lessons should we absorb?

It’s that pace of change that gives me pause. Ethan Mollick, author of the forthcoming book Co-Intelligence, tells me “if progress on generative AI stops now, the spreadsheet is not a bad analogy”. We’d get some dramatic shifts in the workplace, a technology that broadly empowers workers and creates good new jobs, and everything would be fine. But is it going to stop any time soon? Mollick doubts that, and so do I.


 

 

6 work and workplace trends to watch in 2024 — from weforum.org by Kate Whiting; via Melanie Booth on LinkedIn

Excerpts (emphasis DSC):

The world of work is changing fast.

By 2027, businesses predict that almost half (44%) of workers’ core skills will be disrupted.

Technology is moving faster than companies can design and scale up their training programmes, found the World Economic Forum’s Future of Jobs Report.

The Forum’s Global Risks Report 2024 found that “lack of economic opportunity” ranked as one of the top 10 biggest risks among risk experts over the next two years.

5. Skills will become even more important
With 23% of jobs expected to change in the next five years, according to the Future of Jobs Report, millions of people will need to move between declining and growing jobs.

 

How Workers Rise — from the-job.beehiiv.com by Paul Fain
A look forward at skills-based hiring and AI’s impacts on education and work.

Impacts of AI: Fuller is optimistic about companies making serious progress on skills-based hiring over the next five to 10 years. AI will help drive that transformation, he says, by creating the data to better understand the skills associated with jobs.

The technology will allow for a more accurate matching of skills and experiences, says Fuller, and for companies to “not rely on proxies like degrees or grade point averages or even the proxy of what someone currently makes or how fast they’ve gotten promoted on their résumé.”

Change is coming soon, Fuller predicts, particularly as AI’s impacts accelerate. And the disruption will affect wealthier Americans who’ve been spared during previous shifts in the labor market.

The Kicker: “When people in bedroom suburbs are losing their six-figure jobs, that changes politics,” Fuller says. “That changes the way people are viewing things like equity and where that leads. It’s certainly going to put a lot of pressure on the way the system has worked.”

 


Learners’ Edition: AI-powered Coaching, Professional Certifications + Inspiring conversations about mastering your learning & speaking skills — from linkedin.com by Tomer Cohen

Excerpts:

1. Your own AI-powered coaching
Learners can go into LinkedIn Learning and ask a question or explain a challenge they are currently facing at work (we’re focusing on areas within Leadership and Management to start). AI-powered coaching will pull from the collective knowledge of our expansive LinkedIn Learning library and, instantaneously, offer advice, examples, or feedback that is personalized to the learner’s skills, job, and career goals.

What makes us so excited about this launch is we can now take everything we as LinkedIn know about people’s careers and how they navigate them and help accelerate them with AI.

3. Learn exactly what you need to know for your next job
When looking for a new job, it’s often the time we think about refreshing our LinkedIn profiles. It’s also a time we can refresh our skills. And with skill sets for jobs having changed by 25% since 2015 – with the number expected to increase by 65% by 2030 – keeping our skills a step ahead is one of the most important things we can do to stand out.

There are a couple of ways we’re making it easier to learn exactly what you need to know for your next job:

When you set a job alert, in addition to being notified about open jobs, we’ll recommend learning courses and Professional Certificate offerings to help you build the skills needed for that role.

When you view a job, we recommend specific courses to help you build the required skills. If you have LinkedIn Learning access through your company or as part of a Premium subscription, you can follow the skills for the job, that way we can let you know when we launch new courses for those skills and recommend you content on LinkedIn that better aligns to your career goals.


2024 Edtech Predictions from Edtech Insiders — from edtechinsiders.substack.com by Alex Sarlin, Ben Kornell, and Sarah Morin
Omni-modal AI, edtech funding prospects, higher ed wake up calls, focus on career training, and more!

Alex: I talked to the 360 Learning folks at one point and they had this really interesting epiphany, which is basically that it’s been almost impossible for every individual company in the past to create a hierarchy of skills and a hierarchy of positions and actually organize what it looks like for people to move around and upskill within the company and get to new paths.

Until now. AI actually can do this very well. It can take not only job description data, but it can take actual performance data. It can actually look at what people do on a daily basis and back fit that to training, create automatic training based on it.

From DSC:
I appreciated how they addressed K-12, higher ed, and the workforce all in one posting. Nice work. We don’t need siloes. We need more overall design thinking re: our learning ecosystems — as well as more collaborations. We need more on-ramps and pathways in a person’s learning/career journey.

 

Education evolves to match the speed of tech innovation — from “Tech predictions for 2024 and beyond” at allthingsdistributed.com
Higher education alone cannot keep up with the rate of technological change. Industry-led skills-based training programs will emerge that more closely resemble the journeys of skilled tradespeople. This shift to continuous learning will benefit individuals and businesses alike.

Similar to the software development processes of decades past, we have reached a pivotal point with tech education, and we will see what was once bespoke on-the-job-training for a few evolve into industry-led skills-based education for many.

We have seen glimpses of this shift underway for years. Companies like Coursera, who originally focused on consumers, have partnered with enterprises to scale their upskilling and reskilling efforts. Degree apprenticeships have continued to grow in popularity because education can be specialized by the employer, and apprentices can earn as they learn. But now, companies themselves are starting to seriously invest in skills-based education at scale.

All of these programs enable learners at different points in their career journey to gain the exact skills they need to enter in-demand roles, without the commitment of a traditional multi-year program.

But there will be many industries where the impact of technology outpaces traditional educational systems. To meet the demands of business, we will see a new era of industry-led educational opportunities that can’t be ignored.

From DSC:
It seems to me that this is saying that higher education is not able to — nor will it be able to in the future — match the speed of innovation taking place today. Therefore, alternatives will continue to hit the learning landscapes/radar. For example, Amazon’s CTO, Dr. Werner Vogels, mentioned Amazon’s efforts here:

Amazon just announced that it has already trained 21 million tech learners across the world in tech skills. And it’s in part thanks to programs like the Mechatronics and Robotics Apprenticeship and AWS Cloud Institute.

 

Expanding Bard’s understanding of YouTube videos — via AI Valley

  • What: We’re taking the first steps in Bard’s ability to understand YouTube videos. For example, if you’re looking for videos on how to make olive oil cake, you can now also ask how many eggs the recipe in the first video requires.
  • Why: We’ve heard you want deeper engagement with YouTube videos. So we’re expanding the YouTube Extension to understand some video content so you can have a richer conversation with Bard about it.

Reshaping the tree: rebuilding organizations for AI — from oneusefulthing.org by Ethan Mollick
Technological change brings organizational change.

I am not sure who said it first, but there are only two ways to react to exponential change: too early or too late. Today’s AIs are flawed and limited in many ways. While that restricts what AI can do, the capabilities of AI are increasing exponentially, both in terms of the models themselves and the tools these models can use. It might seem too early to consider changing an organization to accommodate AI, but I think that there is a strong possibility that it will quickly become too late.

From DSC:
Readers of this blog have seen the following graphic for several years now, but there is no question that we are in a time of exponential change. One would have had an increasingly hard time arguing the opposite of this perspective during that time.

 


 



Nvidia’s revenue triples as AI chip boom continues — from cnbc.com by Jordan Novet; via GSV

KEY POINTS

  • Nvidia’s results surpassed analysts’ projections for revenue and income in the fiscal fourth quarter.
  • Demand for Nvidia’s graphics processing units has been exceeding supply, thanks to the rise of generative artificial intelligence.
  • Nvidia announced the GH200 GPU during the quarter.

Here’s how the company did, compared to the consensus among analysts surveyed by LSEG, formerly known as Refinitiv:

  • Earnings: $4.02 per share, adjusted, vs. $3.37 per share expected
  • Revenue: $18.12 billion, vs. $16.18 billion expected

Nvidia’s revenue grew 206% year over year during the quarter ending Oct. 29, according to a statement. Net income, at $9.24 billion, or $3.71 per share, was up from $680 million, or 27 cents per share, in the same quarter a year ago.
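The excerpt’s figures are internally consistent: growing 206% year over year means revenue is 3.06 times the year-ago number, which matches the “triples” headline. A quick back-of-the-envelope check:

```python
# Sanity-check the headline numbers from the excerpt.
revenue = 18.12                      # $B, quarter ending Oct. 29
growth = 2.06                        # 206% year-over-year growth
prior_year = revenue / (1 + growth)  # implied year-ago revenue
print(round(prior_year, 2))          # ≈ 5.92 ($B)
```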



 



Don’t Believe the Hype? Practical Thoughts About Using AI in Legal (Stephen Embry – TechLaw Crossroads) — from tlpodcast.com by Stephen Embry

Despite the hype and big promises about AI, could it, used correctly, be the differentiator that sets good legal professionals apart from the pack? Stephen Embry offers a good argument for this in the latest episode.

Stephen is a long-time attorney and the legal tech aficionado behind the TechLaw Crossroads blog – a great resource for practical and real-world insight about legal tech and how technology is impacting the practice of law. Embry emphasizes that good lawyers will embrace artificial intelligence to increase efficiency and serve their clients better, leaving more time for strategic thinking and advisory roles.

 

The Beatles’ final song is now streaming thanks to AI — from theverge.com by Chris Welch
Machine learning helped Paul McCartney and Ringo Starr turn an old John Lennon demo into what’s likely the band’s last collaborative effort.


Scientists excited by AI tool that grades severity of rare cancer — from bbc.com by Fergus Walsh

Artificial intelligence is nearly twice as good at grading the aggressiveness of a rare form of cancer from scans as the current method, a study suggests.

By recognising details invisible to the naked eye, AI was 82% accurate, compared with 44% for lab analysis.

Researchers from the Royal Marsden Hospital and Institute of Cancer Research say it could improve treatment and benefit thousands every year.

They are also excited by its potential for spotting other cancers early.


Microsoft unveils ‘LeMa’: A revolutionary AI learning method mirroring human problem solving — from venturebeat.com by Michael Nuñez

Researchers from Microsoft Research Asia, Peking University, and Xi’an Jiaotong University have developed a new technique to improve large language models’ (LLMs) ability to solve math problems by having them learn from their mistakes, akin to how humans learn.

The researchers have revealed a pioneering strategy, Learning from Mistakes (LeMa), which trains AI to correct its own mistakes, leading to enhanced reasoning abilities, according to a research paper published this week.
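The core idea, as described in the article, is to collect a model's flawed solutions, pair each with a correction, and fine-tune on those pairs. A minimal, hypothetical sketch of that data pipeline; the function names and prompt format are illustrative, not taken from the paper's code:

```python
# Hypothetical sketch of a "learn from mistakes" data pipeline:
# keep only the solutions that fail a correctness check, and pair
# each with a correction to use as a fine-tuning example.

def build_correction_examples(problems, solve, check, correct):
    """solve(p) -> candidate solution; check(p, s) -> bool (is s correct?);
    correct(p, s) -> explanation of the mistake plus a fixed answer."""
    examples = []
    for p in problems:
        s = solve(p)
        if not check(p, s):  # keep only flawed reasoning paths
            examples.append({
                "prompt": f"{p}\nIncorrect solution: {s}\nFind and fix the mistake:",
                "target": correct(p, s),  # mistake explanation + revised answer
            })
    return examples

# Toy usage with stand-in functions:
probs = ["What is 12 * 12?"]
data = build_correction_examples(
    probs,
    solve=lambda p: "12 * 12 = 124",
    check=lambda p, s: s.endswith("144"),
    correct=lambda p, s: "12 * 12 is 144, not 124.",
)
print(len(data))  # 1 correction example
```

In the paper's setting, `solve` and `correct` would be LLM calls (with a stronger model producing the corrections), and the resulting pairs become the fine-tuning set.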

Also from Michael Nuñez at venturebeat.com, see:


GPTs for all, AzeemBot; conspiracy theorist AI; big tech vs. academia; reviving organs ++448 — from exponentialview.co by Azeem Azhar and Chantal Smith


Personalized A.I. Agents Are Here. Is the World Ready for Them? — from nytimes.com by Kevin Roose (behind a paywall)

You could think of the recent history of A.I. chatbots as having two distinct phases.

The first, which kicked off last year with the release of ChatGPT and continues to this day, consists mainly of chatbots capable of talking about things. Greek mythology, vegan recipes, Python scripts — you name the topic and ChatGPT and its ilk can generate some convincing (if occasionally generic or inaccurate) text about it.

That ability is impressive, and frequently useful, but it is really just a prelude to the second phase: artificial intelligence that can actually do things. Very soon, tech companies tell us, A.I. “agents” will be able to send emails and schedule meetings for us, book restaurant reservations and plane tickets, and handle complex tasks like “negotiate a raise with my boss” or “buy Christmas presents for all my family members.”


From DSC:
Very cool!


Nvidia Stock Jumps After Unveiling of Next Major AI Chip. It’s Bad News for Rivals. — from barrons.com

On Monday, Nvidia (ticker: NVDA) announced its new H200 Tensor Core GPU. The chip incorporates 141 gigabytes of memory and offers 60% to 90% performance improvements over the current H100 model when used for inference, i.e., generating answers from popular AI models.

From DSC:
The exponential curve seems to be continuing: a 60% to 90% performance improvement is a huge boost.

Also relevant/see:


The 5 Best GPTs for Work — from the AI Exchange

Custom GPTs are exploding, and we wanted to highlight our top 5 that we’ve seen so far:

 

How Have Schools Improved Since the Pandemic? What Teachers Had to Say — from the74million.org by Cory Beets
Educator’s view: In technology, mental health, and nurturing and solutions-oriented environments, COVID provided lessons schools have taken to heart.

In doing research for my Ph.D. program, I sought out the perspectives of five teachers through informal conversations about how schools have improved since the pandemic. Four themes emerged.

From DSC:
To add another positive to the COVID-19 picture…

Just as COVID-19 did more to advance online learning within our learning ecosystems than the previous 20+ years of development, it may also have done more to move our younger learners along the flexibility route that will serve them well in their futures. That is, with today's exponential pace of change, we all need to be more agile and flexible, and be able to reinvent ourselves along the way. The kind of learning our K-12 students went through during COVID-19 may prove to be the most helpful preparation yet for their future success and career development. They will need to pivot, adapt, and take right turn after right turn.

 

180 Degree Turn: NYC District Goes From Banning ChatGPT to Exploring AI’s Potential — from edweek.org by Alyson Klein (behind paywall)

New York City Public Schools will launch an Artificial Intelligence Policy Lab to guide the nation’s largest school district’s approach to this rapidly evolving technology.


The Leader’s Blindspot: How to Prepare for the Real Future — from preview.mailerlite.io by the AIEducator
The Commonly Held Belief: AI Will Automate Only Boring, Repetitive Tasks First

The Days of Task-Based Views on AI Are Numbered
The winds of change are sweeping across the educational landscape (emphasis DSC):

  1. Multifaceted AI: AI technologies are not one-trick ponies; they are evolving into complex systems that can handle a variety of tasks.
  2. Rising Expectations: As technology becomes integral to our lives, the expectations for personalised, efficient education are soaring.
  3. Skill Transformation: Future job markets will demand a different skill set, one that is symbiotic with AI capabilities.

Teaching: How to help students better understand generative AI — from chronicle.com by Beth McMurtrie
Beth describes ways professors have used ChatGPT to bolster critical thinking in writing-intensive courses

Kevin McCullen, an associate professor of computer science at the State University of New York at Plattsburgh, teaches a freshman seminar about AI and robotics. As part of the course, students read Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots, by John Markoff. McCullen had the students work in groups to outline and summarize the first three chapters. Then he showed them what ChatGPT had produced in an outline.

“Their version and ChatGPT’s version seemed to be from two different books,” McCullen wrote. “ChatGPT’s version was essentially a ‘laundry list’ of events. Their version was narratives of what they found interesting. The students had focused on what the story was telling them, while ChatGPT focused on who did what in what year.” The chatbot also introduced false information, such as wrong chapter names.

The students, he wrote, found the writing “soulless.”


7 Questions with Dr. Cristi Ford, VP of Academic Affairs at D2L — from campustechnology.com by Rhea Kelly

In the Wild West of generative AI, educators and institutions are working out how best to use the technology for learning. How can institutions define AI guidelines that allow for experimentation while providing students with consistent guidance on appropriate use of AI tools?

To find out, we spoke with Dr. Cristi Ford, vice president of academic affairs at D2L. With more than two decades of educational experience in nonprofit, higher education, and K-12 institutions, Ford works with D2L’s institutional partners to elevate best practices in teaching, learning, and student support. Here, she shares her advice on setting and communicating AI policies that are consistent and future-ready.


AI Platform Built by Teachers, for Teachers, Class Companion Raises $4 Million to Tap Into the Power of Practice — from prweb.com

“If we want to use AI to improve education, we need more teachers at the table,” said Avery Pan, Class Companion co-founder and CEO. “Class Companion is designed by teachers, for teachers, to harness the most sophisticated AI and improve their classroom experience. Developing technologies specifically for teachers is imperative to supporting our next generation of students and education system.”


7 Questions on Generative AI in Learning Design — from campustechnology.com by Rhea Kelly
Open LMS Adoption and Education Specialist Michael Vaughn on the challenges and possibilities of using artificial intelligence to move teaching and learning forward.

The potential for artificial intelligence tools to speed up course design could be an attractive prospect for overworked faculty and spread-thin instructional designers. Generative AI can shine, for example, in tasks such as reworking assessment question sets, writing course outlines and learning objectives, and generating subtitles for audio and video clips. The key, says Michael Vaughn, adoption and education specialist at learning platform Open LMS, is treating AI like an intern who can be guided and molded along the way, and whose work is then vetted by a human expert.

We spoke with Vaughn about how best to utilize generative AI in learning design, ethical issues to consider, and how to formulate an institution-wide policy that can guide AI use today and in the future.


10 Ways Technology Leaders Can Step Up and Into the Generative AI Discussion in Higher Ed — from er.educause.edu by Lance Eaton and Stan Waddell

  1. Offer Short Primers on Generative AI
  2. Explain How to Get Started
  3. Suggest Best Practices for Engaging with Generative AI
  4. Give Recommendations for Different Groups
  5. Recommend Tools
  6. Explain the Closed vs. Open-Source Divide
  7. Avoid Pitfalls
  8. Conduct Workshops and Events
  9. Spot the Fake
  10. Provide Proper Guidance on the Limitations of AI Detectors


 
© 2024 | Daniel Christian