One left — by u/jim_andr in r/OpenAI

 

From DSC:
I’m not trying to gossip here. I post this because Sam Altman is the head of arguably one of the most powerful companies in the world today — at least in terms of introducing change to a variety of societies throughout the globe (both positive and negative). So when we’ve now seen almost the entire leadership team head out the door, this certainly gives me major pause. I don’t like it.
Items like the ones below begin to capture why these moves leave me troubled and suspicious.

 

AI researcher Jim Fan has had a charmed career. He was OpenAI’s first intern before doing his PhD at Stanford with the “godmother of AI,” Fei-Fei Li. He graduated into a research scientist position at Nvidia and now leads its Embodied AI “GEAR” group. The lab’s current work ranges from foundation models for humanoid robots to agents for virtual worlds. Jim describes a three-pronged data strategy for robotics that combines internet-scale data, simulation data, and real-world robot data. He believes that in the next few years it will be possible to create a “foundation agent” that can generalize across skills, embodiments, and realities — both physical and virtual. He also supports Jensen Huang’s idea that “Everything that moves will eventually be autonomous.”


Runway Partners with Lionsgate — from runwayml.com via The Rundown AI
Runway and Lionsgate are partnering to explore the use of AI in film production.

Lionsgate and Runway have entered into a first-of-its-kind partnership centered around the creation and training of a new AI model, customized on Lionsgate’s proprietary catalog. Fundamentally designed to help Lionsgate Studios, its filmmakers, directors and other creative talent augment their work, the model generates cinematic video that can be further iterated using Runway’s suite of controllable tools.

Per The Rundown: Lionsgate, the film company behind The Hunger Games, John Wick, and Saw, teamed up with AI video generation company Runway to create a custom AI model trained on Lionsgate’s film catalogue.

The details:

  • The partnership will develop an AI model specifically trained on Lionsgate’s proprietary content library, designed to generate cinematic video that filmmakers can further manipulate using Runway’s tools.
  • Lionsgate sees AI as a tool to augment and enhance its current operations, streamlining both pre-production and post-production processes.
  • Runway is considering ways to offer similar custom-trained models as templates for individual creators, expanding access to AI-powered filmmaking tools beyond major studios.

Why it matters: As many writers, actors, and filmmakers push back against generative AI, Lionsgate is diving head-first into the world of generative AI through its partnership with Runway. This is one of the first major collabs between an AI startup and a major Hollywood company — and its success or failure could set precedent for years to come.


A bottle of water per email: the hidden environmental costs of using AI chatbots — from washingtonpost.com by Pranshu Verma and Shelly Tan (behind paywall)
AI bots generate a lot of heat, and keeping their computer servers running exacts a toll.

Each prompt on ChatGPT flows through a server that runs thousands of calculations to determine the best words to use in a response.

In completing those calculations, these servers, typically housed in data centers, generate heat. Often, water systems are used to cool the equipment and keep it functioning. Water transports the heat generated in the data centers into cooling towers to help it escape the building, similar to how the human body uses sweat to keep cool, according to Shaolei Ren, an associate professor at UC Riverside.

Where electricity is cheaper, or water comparatively scarce, electricity is often used to cool these warehouses with large units resembling air-conditioners, he said. That means the amount of water and electricity an individual query requires can depend on a data center’s location and vary widely.
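To make that variability concrete, here is a back-of-envelope sketch of per-query water use. All the constants below (energy per query, water-usage effectiveness, facility overhead) are illustrative assumptions, not figures from the article — real values vary widely with a data center’s location and cooling design.

```python
# Back-of-envelope estimate of the cooling water attributable to one chatbot query.
# Every constant here is an illustrative assumption, not a measured value.

ENERGY_PER_QUERY_WH = 0.3  # assumed server energy per query, in watt-hours
WUE_L_PER_KWH = 1.8        # assumed water-usage effectiveness, liters per kWh
PUE = 1.2                  # assumed power-usage effectiveness (facility overhead)

def water_per_query_ml(energy_wh=ENERGY_PER_QUERY_WH,
                       wue=WUE_L_PER_KWH, pue=PUE):
    """Estimate milliliters of cooling water for one query."""
    total_kwh = (energy_wh * pue) / 1000.0  # facility-level energy in kWh
    return total_kwh * wue * 1000.0         # liters converted to milliliters

if __name__ == "__main__":
    print(f"~{water_per_query_ml():.2f} mL of water per query (under stated assumptions)")
```

Swapping in a water-cooled facility in a hot climate versus an air-cooled one simply means changing `WUE_L_PER_KWH` and `PUE` — which is exactly why estimates in the press differ so much.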


AI, Humans and Work: 10 Thoughts. — from rishad.substack.com by Rishad Tobaccowala
The Future Does Not Fit in the Containers of the Past. Edition 215.

10 thoughts about AI, Humans and Work in 10 minutes:

  1. AI is still Under-hyped.
  2. AI itself will be like electricity and is unlikely to be a differentiator for most firms.
  3. AI is not alive but can be thought of as a new species.
  4. Knowledge will be free, and every knowledge worker’s job will change in 2025.
  5. The key is not to ask what AI will do to us but what AI can do for us.
  6. Plus 5 other thoughts

 

 

Why “Wisdom Work” Is the New “Knowledge Work” — from hbr.org by Chip Conley

Summary:

Today the workforce is getting older, and the number of younger workers in positions of senior management is growing. These two developments might appear to spell trouble, in that they seem to set the generations against one another, but the author of this article argues that in fact they represent an important opportunity: If companies can figure out how to enable the intergenerational transfer of the wisdom that comes with age and experience, they can strengthen themselves — and the workplace as a whole.

It also allowed us to develop a list of the character qualities that most commonly defined our best informal mentors, among them: less ego and more collaboration skills, a knack for asking generative questions, and an ability to offer unvarnished insight that feels like a gift as opposed to judgment.

It’s time that we invest as much energy in helping older workers distill their wisdom as we do in helping younger workers accumulate their knowledge.


From DSC:
I think Chip hits on many important and valuable insights in this article. His messages apply to all kinds of organizations. Still, they are especially relevant to the Magnificent Seven (i.e., Google parent Alphabet, Meta Platforms, Amazon.com, Tesla, Apple, Microsoft, and Nvidia) and other tech-related companies that often move forward with designing and producing things without ever asking whether they SHOULD be producing those things. What are the positive and negative ramifications of this technology on society? THAT’s a wise question.

I would also add that the word WISDOM is spread throughout the Bible as you can see here — especially in Proverbs 2. So while Chip talks about human wisdom, there is a deeper kind of wisdom that comes from God: 

For the Lord gives wisdom;
    from his mouth come knowledge and understanding.

THAT kind of wisdom is priceless. And we need it in our businesses and in our lives.


 

 

From DSC:
On a macro scale…this is on my heart these days.

I ran across some troubling but informative items re: religion in America from item #5 at Rex Woodbury’s 10 Charts That Capture How the World Is Changing:

  • How Religious Are Americans? — from news.gallup.com
    • The long-term decline in church attendance is linked to a drop in religious identification in general — particularly for Protestant religions — but also to decreasing weekly attendance among U.S. Catholics.
    • Steep Decline in U.S. Church Membership
      Additionally, less than half of Americans, 45%, belong to a formal house of worship. Church membership has been below the majority level each of the past four years. When Gallup first asked the question in 1937, 73% were members of a church, and as recently as 1999, 70% were. The decline in formal church membership has largely been driven by younger generations of Americans. Slightly more than one-third of U.S. young adults have no religious affiliation. Further, many young adults who do identify with a religion do not belong to a church. But even older adults who have a religious preference are less likely to belong to a church today than in the past.


I’ve known about this decline for years now, but Rex’s posting and graphs were disheartening nonetheless. And Samuel Abrams’ article contains many reflections that I’ve had as well.

The Christian journey is about transformation — our hearts and minds are changed so that we become more like Jesus Christ (the pioneer and perfecter of [our] faith, per Hebrews 12:2). This transformation involves how we see and experience the world as well as how we are supposed to treat others. We receive new “glasses” if you will — new lenses on the world. In fact, Jesus said in Matthew 22:37-40:

37 Jesus replied: “‘Love the Lord your God with all your heart and with all your soul and with all your mind.’ 38 This is the first and greatest commandment. 39 And the second is like it: ‘Love your neighbor as yourself.’ 40 All the Law and the Prophets hang on these two commandments.”

So Christians are taught to love our neighbors. I/we mess up on this constantly, but many of us are trying to get better at it.

But what happens when we don’t love — or even care for/about — our neighbor? Do you know that if you are living in the United States right now, you are already feeling and experiencing the impact of this in an enormous way?

Here are a few ways that you can see this playing out — even from a secular/business standpoint:

  • Numerous businesses don’t care at all if their products harm you, your family, or your future. For example, food companies don’t care if their products aren’t good for you; they just want your repeat business. They are concerned FAR more about Wall Street and their shareholders than about your health. Knowing that I myself am a chief sinner, I could also point to the businesses pushing marijuana/cannabis (especially right next to universities and colleges), cigarettes, gambling, and more. There are some dubious folks within the healthcare and pharmaceutical industries as well.
  • Many businesses lie to you when you call their 800 numbers and they tell you that they care about you and your business. Again, we see that they greatly appreciate your money, but they really don’t care about you or your time. They often put you in a long queue. The worst voice-response units are programmed to make it extremely difficult — if not impossible — to talk to a live person.
  • Many businesses have embraced the shrinkflation of their products: offering smaller amounts while charging the same price.
  • Many businesses don’t care if our youth are being negatively impacted (social media companies may come to some people’s minds. Disclaimer: I use Twitter/X and LinkedIn frequently).
  • Many businesses don’t care if their technologies are beneficial to society. They don’t stop to think about whether they should design and produce their products…just whether or not they can. Little to no wisdom is being displayed here.
  • …and I — and you — could list many more here.

So you and I are already being impacted when we push God out of our lives and out of our institutions. When we Americans look around these days…how’s that going for us? In my own life, the further I get from God, the worse things get.

Also, we could talk about mental health*, shootings in our schools and on our streets, and several other things.

Do we care? I do. I think about this kind of thing more and more these days. LORD, forgive us. We need your help.

* I realize that Christians can struggle with mental health too
 

7.2M Americans Over 50 Hold Student Debt, New Report Shows — from insidehighered.com by Jessica Blake
Urban Institute researchers say the financial burden not only puts a strain on the borrowers themselves but also on the social welfare programs designed to be their safety net.

However, a recent series of reports and blog posts published by Urban Institute shows that older adults are also struggling to pay back their student loans.

By analyzing a nationally representative sample of credit records from roughly four million adults aged 50 and older, Urban Institute’s report concludes that as of August 2022, approximately 6 percent of older adults—or 7.2 million Americans—have yet to pay off their student loans. Among those same borrowers, 8 percent, or 580,000 individuals, are behind on payments. The median amount of delinquent debt was approximately $11,500.

“These disadvantages can compound over decades within and across generations, making these borrowers less able to repay their loans on time,” wrote Mingli Zhong, an Urban Institute senior research associate who specializes in borrowing behavior. “Over all, older adults are carrying more debt, not just student loan debt but all kinds of debt [medical, mortgage, etc.] into retirement,” she later told Inside Higher Ed.

 

What aspects of teaching should remain human? — from hechingerreport.org by Chris Berdik
Even techno-optimists hesitate to say teaching is best left to the bots, but there’s a debate about where to draw the line

ATLANTA — Science teacher Daniel Thompson circulated among his sixth graders at Ron Clark Academy on a recent spring morning, spot checking their work and leading them into discussions about the day’s lessons on weather and water. He had a helper: As Thompson paced around the class, peppering them with questions, he frequently turned to a voice-activated AI to summon apps and educational videos onto large-screen smartboards.

When a student asked, “Are there any animals that don’t need water?” Thompson put the question to the AI. Within seconds, an illustrated blurb about kangaroo rats appeared before the class.

Nitta said there’s something “deeply profound” about human communication that allows flesh-and-blood teachers to quickly spot and address things like confusion and flagging interest in real time.


Deep Learning: Five New Superpowers of Higher Education — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
How Deep Learning is Transforming Higher Education

While the traditional model of education is entrenched, emerging technologies like deep learning promise to shake its foundations and usher in an age of personalized, adaptive, and egalitarian education. It is expected to have a significant impact across higher education in several key ways.

…deep learning introduces adaptivity into the learning process. Unlike a typical lecture, deep learning systems can observe student performance in real-time. Confusion over a concept triggers instant changes to instructional tactics. Misconceptions are identified early and remediated quickly. Students stay in their zone of proximal development, constantly challenged but never overwhelmed. This adaptivity prevents frustration and stagnation.
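As a loose illustration of the adaptive loop described above — observe performance, detect confusion, change tactic — here is a minimal sketch. The tactics, thresholds, and mastery-update rule are all hypothetical placeholders, not how any particular adaptive-learning system actually works.

```python
# Minimal sketch of an adaptivity loop: nudge a mastery estimate after each
# answer and switch instructional tactics when confusion (low mastery) appears.
# All tactics, thresholds, and update sizes are hypothetical placeholders.

from dataclasses import dataclass

TACTICS = ["lecture", "worked_example", "guided_practice"]  # hypothetical tactics

@dataclass
class StudentState:
    mastery: float = 0.5   # estimated mastery of the concept, 0..1
    tactic_index: int = 0  # index into TACTICS for the current tactic

def update(state: StudentState, answer_correct: bool) -> StudentState:
    """Update the mastery estimate and adapt the tactic in real time."""
    delta = 0.1 if answer_correct else -0.15
    state.mastery = min(1.0, max(0.0, state.mastery + delta))
    if state.mastery < 0.4:
        # Confusion detected: remediate with a more supportive tactic.
        state.tactic_index = min(state.tactic_index + 1, len(TACTICS) - 1)
    elif state.mastery > 0.8:
        # Student is comfortable: reduce scaffolding to keep them challenged.
        state.tactic_index = max(state.tactic_index - 1, 0)
    return state
```

The point of the sketch is the shape of the loop — per-interaction observation driving an immediate change in instruction — which is what distinguishes this from a fixed lecture.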


InstructureCon 24 Conference Notes — from onedtech.philhillaa.com by Glenda Morgan
Another solid conference from the market leader, even with unclear roadmap

The new stuff: AI
Instructure rolled out multiple updates and improvements – more than last year. These included many AI-based or focused tools and services as well as some functional improvements. I’ll describe the AI features first.

Sal Khan was a surprise visitor to the keynote stage to announce the September availability of the full suite of AI-enabled Khanmigo Teacher Tools for Canvas users. The suite includes 20 tools, such as tools to generate lesson plans and quiz questions and write letters of recommendation. Next year, they plan to roll out tools for students themselves to use.

Other AI-based features include:

    • Discussion tool summaries and AI-generated responses…
    • Translation of inbox messages and discussions…
    • Smart search …
    • Intelligent Insights…

 

 

Introducing Eureka Labs — “We are building a new kind of school that is AI native.” — by Andrej Karpathy, Previously Director of AI @ Tesla, founding team @ OpenAI

However, with recent progress in generative AI, this learning experience feels tractable. The teacher still designs the course materials, but they are supported, leveraged and scaled with an AI Teaching Assistant who is optimized to help guide the students through them. This Teacher + AI symbiosis could run an entire curriculum of courses on a common platform. If we are successful, it will be easy for anyone to learn anything, expanding education in both reach (a large number of people learning something) and extent (any one person learning a large amount of subjects, beyond what may be possible today unassisted).


After Tesla and OpenAI, Andrej Karpathy’s startup aims to apply AI assistants to education — from techcrunch.com by Rebecca Bellan

Andrej Karpathy, former head of AI at Tesla and researcher at OpenAI, is launching Eureka Labs, an “AI native” education platform. In tech speak, that usually means built from the ground up with AI at its core. And while Eureka Labs’ AI ambitions are lofty, the company is starting with a more traditional approach to teaching.

San Francisco-based Eureka Labs, which Karpathy registered as an LLC in Delaware on June 21, aims to leverage recent progress in generative AI to create AI teaching assistants that can guide students through course materials.


What does it mean for students to be AI-ready? — from timeshighereducation.com by David Joyner
Not everyone wants to be a computer scientist, a software engineer or a machine learning developer. We owe it to our students to prepare them with a full range of AI skills for the world they will graduate into, writes David Joyner

We owe it to our students to prepare them for this full range of AI skills, not merely the end points. The best way to fulfil this responsibility is to acknowledge and examine this new category of tools. More and more tools that students use daily – word processors, email, presentation software, development environments and more – have AI-based features. Practising with these tools is a valuable exercise for students, so we should not prohibit that behaviour. But at the same time, we do not have to just shrug our shoulders and accept however much AI assistance students feel like using.


Teachers say AI usage has surged since the school year started — from eschoolnews.com by Laura Ascione
Half of teachers report an increase in the use of AI and continue to seek professional learning

Fifty percent of educators reported an increase in AI usage, by both students and teachers, over the 2023–24 school year, according to The 2024 Educator AI Report: Perceptions, Practices, and Potential, from Imagine Learning, a digital curriculum solutions provider.

The report offers insight into how teachers’ perceptions of AI use in the classroom have evolved since the start of the 2023–24 school year.


OPINION: What teachers call AI cheating, leaders in the workforce might call progress — from hechingerreport.org by C. Edward Waston and Jose Antonio Bowen
Authors of a new guide explore what AI literacy might look like in a new era

Excerpt (emphasis DSC):

But this very ease has teachers wondering how we can keep our students motivated to do the hard work when there are so many new shortcuts. Learning goals, curriculums, courses and the way we grade assignments will all need to be reevaluated.

The new realities of work also must be considered. A shift in employers’ job postings rewards those with AI skills. Many companies report already adopting generative AI tools or anticipate incorporating them into their workflow in the near future.

A core tension has emerged: Many teachers want to keep AI out of our classrooms, but also know that future workplaces may demand AI literacy.

What we call cheating, business could see as efficiency and progress.

It is increasingly likely that using AI will emerge as an essential skill for students, regardless of their career ambitions, and that action is required of educational institutions as a result.


Teaching Writing With AI Without Replacing Thinking: 4 Tips — by Erik Ofgang
AI has a lot of potential for writing students, but we can’t let it replace the thinking parts of writing, says writing professor Steve Graham

Reconciling these two goals — having AI help students learn to write more efficiently without hijacking the cognitive benefits of writing — should be a key goal of educators. Finding the ideal balance will require more work from both researchers and classroom educators, but Graham shares some initial tips for doing this currently.




Why I ban AI use for writing assignments — from timeshighereducation.com by James Stacey Taylor
Students may see handwriting essays in class as a needlessly time-consuming approach to assignments, but I want them to learn how to engage with arguments, develop their own views and convey them effectively, writes James Stacey Taylor

Could they use AI to generate objections to the arguments they read? Of course. AI does a good job of summarising objections to Singer’s view. But I don’t want students to parrot others’ objections. I want them to think of objections themselves. 

Could AI be useful for them in organising their exegesis of others’ views and their criticisms of them? Yes. But, again, part of what I want my students to learn is precisely what this outsources to the AI: how to organise their thoughts and communicate them effectively. 


How AI Will Change Education — from digitalnative.tech by Rex Woodbury
Predicting Innovation in Education, from Personalized Learning to the Downfall of College 

This week explores how AI will bleed into education, looking at three segments of education worth watching, then examining which business models will prevail.

  1. Personalized Learning and Tutoring
  2. Teacher Tools
  3. Alternatives to College
  4. Final Thoughts: Business Models and Why Education Matters

New Guidance from TeachAI and CSTA Emphasizes Computer Science Education More Important than Ever in an Age of AI — from csteachers.org by CSTA
The guidance features new survey data and insights from teachers and experts in computer science (CS) and AI, informing the future of CS education.

SEATTLE, WA – July 16, 2024 – Today, TeachAI, led by Code.org, ETS, the International Society of Technology in Education (ISTE), Khan Academy, and the World Economic Forum, launches a new initiative in partnership with the Computer Science Teachers Association (CSTA) to support and empower educators as they grapple with the growing opportunities and risks of AI in computer science (CS) education.

The briefs draw on early research and insights from CSTA members, organizations in the TeachAI advisory committee, and expert focus groups to address common misconceptions about AI and offer a balanced perspective on critical issues in CS education, including:

  • Why is it Still Important for Students to Learn to Program?
  • How Are Computer Science Educators Teaching With and About AI?
  • How Can Students Become Critical Consumers and Responsible Creators of AI?
 

A New Digital Divide: Student AI Use Surges, Leaving Faculty Behind— from insidehighered.com by Lauren Coffey
While both students and faculty have concerns with generative artificial intelligence, two new reports show a divergence in AI adoption. 

Meanwhile, a separate survey of faculty released Thursday by Ithaka S+R, a higher education consulting firm, showed that faculty — while increasingly familiar with AI — often do not know how to use it in classrooms. Two out of five faculty members are familiar with AI, the Ithaka report found, but only 14 percent said they are confident in their ability to use AI in their teaching. Just slightly more (18 percent) said they understand the teaching implications of generative AI.

“Serious concerns about academic integrity, ethics, accessibility, and educational effectiveness are contributing to this uncertainty and hostility,” the Ithaka report said.

The diverging views about AI are causing friction. Nearly a third of students said they have been warned to not use generative AI by professors, and more than half (59 percent) are concerned they will be accused of cheating with generative AI, according to the Pearson report, which was conducted with Morning Consult and surveyed 800 students.


What teachers want from AI — from hechingerreport.org by Javeria Salman
When teachers designed their own AI tools, they built math assistants, tools for improving student writing, and more

An AI chatbot that walks students through how to solve math problems. An AI instructional coach designed to help English teachers create lesson plans and project ideas. An AI tutor that helps middle and high schoolers become better writers.

These aren’t tools created by education technology companies. They were designed by teachers tasked with using AI to solve a problem their students were experiencing.

Over five weeks this spring, about 300 people – teachers, school and district leaders, higher ed faculty, education consultants and AI researchers – came together to learn how to use AI and develop their own basic AI tools and resources. The professional development opportunity was designed by technology nonprofit Playlab.ai and faculty at the Relay Graduate School of Education.


The Comprehensive List of Talks & Resources for 2024 — from aiedusimplified.substack.com by Lance Eaton
Resources, talks, podcasts, etc that I’ve been a part of in the first half of 2024

Resources from things such as:

  • Lightning Talks
  • Talks & Keynotes
  • Workshops
  • Podcasts & Panels
  • Honorable Mentions

Next-Gen Classroom Observations, Powered by AI — from educationnext.org by Michael J. Petrilli
The use of video recordings in classrooms to improve teacher performance is nothing new. But the advent of artificial intelligence could add a helpful evaluative tool for teachers, measuring instructional practice relative to common professional goals with chatbot feedback.

Multiple companies are pairing AI with inexpensive, ubiquitous video technology to provide feedback to educators through asynchronous, offsite observation. It’s an appealing idea, especially given the promise and popularity of instructional coaching, as well as the challenge of scaling it effectively (see “Taking Teacher Coaching To Scale,” research, Fall 2018).

Enter AI. Edthena is now offering an “AI Coach” chatbot that offers teachers specific prompts as they privately watch recordings of their lessons. The chatbot is designed to help teachers view their practice relative to common professional goals and to develop action plans to improve.

To be sure, an AI coach is no replacement for human coaching.


Personalized AI Tutoring as a Social Activity: Paradox or Possibility? — from er.educause.edu by Ron Owston
Can the paradox between individual tutoring and social learning be reconciled through the possibility of AI?

We need to shift our thinking about GenAI tutors serving only as personal learning tools. The above activities illustrate how these tools can be integrated into contemporary classroom instruction. The activities should not be seen as prescriptive but merely suggestive of how GenAI can be used to promote social learning. Although I specifically mention only one online activity (“Blended Learning”), all can be adapted to work well in online or blended classes to promote social interaction.


Stealth AI — from higherai.substack.com by Jason Gulya
What happens when students use AI all the time, but aren’t allowed to talk about it?

In many ways, this comes back to one of my general rules: You cannot ban AI in the classroom. You can only issue a gag rule.

And if you do issue a gag rule, then it deprives students of the space they often need to make heads or tails of this technology.

We need to listen to actual students talking about actual uses, and reflecting on their actual feelings. No more abstraction.

In this conversation, Jason Gulya (a Professor of English at Berkeley College) talks to Zack Kinzler about what students are saying about Artificial Intelligence and education.


What’s New in Microsoft EDU | ISTE Edition June 2024 — from techcommunity.microsoft.com

Welcome to our monthly update for Teams for Education and thank you so much for being part of our growing community! We’re thrilled to share over 20 updates and resources and show them in action next week at ISTELive 24 in Denver, Colorado, US.

Copilot for Microsoft 365 – Educator features

  • Guided Content Creation — Coming soon to Copilot for Microsoft 365 is a guided content-generation experience to help educators get started with creating materials like assignments, lesson plans, lecture slides, and more. The content will be created based on the educator’s requirements, with easy ways to customize the content to their exact needs.
  • Standards alignment and creation
  • Quiz generation through Copilot in Forms
  • Suggested AI Feedback for Educators
  • Teaching extension — To better support educators with their daily tasks, we’ll be launching a built-in Teaching extension to help guide them through relevant activities and provide contextual, educator-based support in Copilot.
  • Education data integration

Copilot for Microsoft 365 – Student features

  • Interactive practice experiences
  • Flashcards activity
  • Guided chat activity
  • Learning extension in Copilot for Microsoft 365


New AI tools for Google Workspace for Education — from blog.google by Akshay Kirtikar and Brian Hendricks
We’re bringing Gemini to teen students using their school accounts to help them learn responsibly and confidently in an AI-first future, and empowering educators with new tools to help create great learning experiences.

 

Latent Expertise: Everyone is in R&D — from oneusefulthing.org by Ethan Mollick
Ideas come from the edges, not the center

Excerpt (emphasis DSC):

And to understand the value of AI, they need to do R&D. Since AI doesn’t work like traditional software, but more like a person (even though it isn’t one), there is no reason to suspect that the IT department has the best AI prompters, nor that it has any particular insight into the best uses of AI inside an organization. IT certainly plays a role, but the actual use cases will come from workers and managers who find opportunities to use AI to help them with their job. In fact, for large companies, the source of any real advantage in AI will come from the expertise of their employees, which is needed to unlock the expertise latent in AI.


OpenAI’s former chief scientist is starting a new AI company — from theverge.com by Emma Roth
Ilya Sutskever is launching Safe Superintelligence Inc., an AI startup that will prioritize safety over ‘commercial pressures.’

Ilya Sutskever, OpenAI’s co-founder and former chief scientist, is starting a new AI company focused on safety. In a post on Wednesday, Sutskever revealed Safe Superintelligence Inc. (SSI), a startup with “one goal and one product”: creating a safe and powerful AI system.

Ilya Sutskever Has a New Plan for Safe Superintelligence — from bloomberg.com by Ashlee Vance (behind a paywall)
OpenAI’s co-founder discloses his plans to continue his work at a new research lab focused on artificial general intelligence.

Safe Superintelligence — from theneurondaily.com by Noah Edelman

Ilya Sutskever is kind of a big deal in AI, to put it lightly.

Part of OpenAI’s founding team, Ilya was Chief Scientist (read: genius) before being part of the coup that fired Sam Altman.

Yesterday, Ilya announced that he’s forming a new initiative called Safe Superintelligence.

If AGI = AI that can perform a wide range of tasks at our level, then Superintelligence = an even more advanced AI that surpasses human capabilities in all areas.


AI is exhausting the power grid. Tech firms are seeking a miracle solution. — from washingtonpost.com by Evan Halper and Caroline O’Donovan
As power needs of AI push emissions up and put big tech in a bind, companies put their faith in elusive — some say improbable — technologies.

As the tech giants compete in a global AI arms race, a frenzy of data center construction is sweeping the country. Some computing campuses require as much energy as a modest-sized city, turning tech firms that promised to lead the way into a clean energy future into some of the world’s most insatiable guzzlers of power. Their projected energy needs are so huge, some worry whether there will be enough electricity to meet them from any source.


Microsoft, OpenAI, Nvidia join feds for first AI attack simulation — from axios.com by Sam Sabin

Federal officials, AI model operators and cybersecurity companies ran the first joint simulation of a cyberattack involving a critical AI system last week.

Why it matters: Responding to a cyberattack on an AI-enabled system will require a different playbook than the typical hack, participants told Axios.

The big picture: Both Washington and Silicon Valley are attempting to get ahead of the unique cyber threats facing AI companies before they become more prominent.


Hot summer of AI video: Luma & Runway drop amazing new models — from heatherbcooper.substack.com by Heather Cooper
Plus an amazing FREE video-to-sound app from ElevenLabs

Immediately after we saw Sora-like videos from KLING, Luma AI’s Dream Machine video results overshadowed them.

Dream Machine is a next-generation AI video model that creates high-quality, realistic shots from text instructions and images.


Introducing Gen-3 Alpha — from runwayml.com by Anastasis Germanidis
A new frontier for high-fidelity, controllable video generation.


AI-Generated Movies Are Around the Corner — from news.theaiexchange.com by The AI Exchange
The future of AI in filmmaking; participate in our AI for Agencies survey

AI-Generated Feature Films Are Around the Corner.
We predict feature-film-length AI-generated films are coming by the end of 2025, if not sooner.

Don’t believe us? You need to check out Runway ML’s new Gen-3 model, released this week.

They’re not the only ones. We also have Pika, which just raised $80M. And Google’s Veo. And OpenAI’s Sora. (+ many others)

 

The Musician’s Rule and GenAI in Education — from opencontent.org by David Wiley

We have to provide instructors the support they need to leverage educational technologies like generative AI effectively in the service of learning. Given the amount of benefit that could accrue to students if powerful tools like generative AI were used effectively by instructors, it seems unethical not to provide instructors with professional development that helps them better understand how learning occurs and what effective teaching looks like. Without more training and support for instructors, the amount of student learning higher education will collectively “leave on the table” will only increase as generative AI gets more and more capable. And that’s a problem.

From DSC:
As is often the case, David put together a solid posting here. A few comments/reflections on it:

  • I agree that more training/professional development is needed, especially regarding generative AI. This would help achieve a far greater ROI and impact.
  • The pace of change makes it difficult to see where the sand is settling…and thus what to focus on.
  • The Teaching & Learning Groups out there are also trying to learn and grow in their knowledge (so that they can train others).
  • The administrators out there are also trying to figure out what all of this generative AI stuff is all about; and so are the faculty members. It takes time for educational technologies’ impact to roll out and be integrated into how people teach.
  • As we’re talking about multiple disciplines here, I think we need more team-based content creation and delivery.
  • There needs to be more research on how best to use AI — again, it would be helpful if the sand settled a bit first, so as not to waste time and $$. But then that research needs to be piped into the classrooms far better.

We need to take more of the research from learning science and apply it in our learning spaces.

 

Exclusive: AI isn’t a daily habit yet for teens, young adults — from axios.com by Scott Rosenberg

Young Americans are quickly embracing generative AI as a tool, but few have yet made it a part of their daily lives, according to new data shared exclusively with Axios from Common Sense Media, Hopelab and the Harvard Graduate School of Education’s Center for Digital Thriving.

Why it matters: Since the rise of the web 30 years ago, young users have typically adopted and shaped each new dominant tech platform.

By the numbers: The survey of 1,274 U.S.-based teens and young adults, conducted in October and November 2023, found that only 4% of respondents, all aged 14-22, said they use AI tools daily or almost daily.

As cited in the above article, also see:

The Verge | What’s Next With AI | February 2024 | Consumer Survey



Microsoft AI creates talking deepfakes from single photo — from inavateonthenet.net


The Great Hall – where now with AI? It is not ‘Human Connection V Innovative Technology’ but ‘Human Connection + Innovative Technology’ — from donaldclarkplanb.blogspot.com by Donald Clark

The theme of the day was Human Connection V Innovative Technology. I see this a lot at conferences: setting up human connection (social) against the machine (AI). I think this is ALL wrong. It is, and has always been, a dialectic: human connection (social) PLUS the machine. Everyone has a smartphone, and most use it for work, comms and social media. The binary between human and tech has long since disappeared.


Techno-Social Engineering: Why the Future May Not Be Human, TikTok’s Powerful ForYou Algorithm, & More — by Misha Da Vinci

Things to consider as you dive into this edition:

  • As we increasingly depend on technology, how is it changing us?
  • In the interaction between humans and technology, who is adapting to whom?
  • Is the technology being built for humans, or are we being changed to fit into tech systems?
  • As time passes, will we become more like robots or the AI models we use?
  • Over the next 30 years, as we increasingly interact with technology, who or what will we become?

 

GTC March 2024 Keynote with NVIDIA CEO Jensen Huang


 

“AI native” Gen Zers are comfortable on the cutting edge — from axios.com by Jennifer A. Kingson; via The Rundown AI

When it comes to generative AI at school and work, Gen Z says: Bring it on.

Why it matters: While some workers are fearful or ambivalent about how ChatGPT, DALL-E and their ilk will affect their jobs, many college students and newly minted grads think it can give them a career edge.

  • So-called “AI natives” who are studying the technology in school may have a leg up on older “digital natives” — the same way that “digital native” millennials smugly bested their “digital immigrant” elders.

 

 
© 2024 | Daniel Christian