Here are some excerpts from the table of contents that NotebookLM generated for this Future U podcast episode, Searching for Fit: The Impacts of AI in Higher Ed:

Part 1: Setting the Stage

I. Introduction (0:00 – 6:16): …
II. Historical Contextualization (6:16 – 11:30): …
III. The Role of Product Fit in AI’s Impact (11:30 – 17:10): …
IV. AI and the Future of Knowledge Work (17:10 – 24:03): …
V. Teaching About AI in Higher Ed: A Measured Approach (24:03 – 34:20): …
VI. AI & the Evolving Skills Landscape (34:20 – 44:35): …
VII. Ethical & Pedagogical Considerations in an AI-Driven World (44:35 – 54:03):…
VIII. AI Beyond the Classroom: Administrative Applications & the Need for Intuition (54:03 – 1:04:30): …
IX. Reflections & Future Directions (1:04:30 – 1:11:15): ….

Part 2: Administrative Impacts & Looking Ahead

X. Bridging the Conversation: From Classroom to Administration (1:11:15 – 1:16:45): …
XI. The Administrative Potential of AI: A Looming Transformation (1:16:45 – 1:24:42): …
XII. The Need for Intuitiveness & the Importance of Real-World Applications (1:24:42 – 1:29:45): …
XIII. Looking Ahead: From Hype to Impactful Integration (1:29:45 – 1:34:25): …
XIV. Conclusion and Call to Action (1:34:25 – 1:36:03): …


The future of language learning — from medium.com by Sami Tatar

Most language learners do not have access to affordable 1:1 tutoring, which is also proven to be the most effective way to learn (short of moving to a specific country for complete immersion). Meanwhile, language learning is a huge market, an estimated 60% of which is still dominated by “offline” solutions, making it prime for disruption, never more so than with the opportunities unlocked by AI-powered language learning. Therefore, we believe this presents huge opportunities for new startups creating AI-native products to become the next language-learning unicorns.



“The Broken Mirror: Rethinking Education, AI, and Equity in America’s Classrooms” — from nickpotkalitsky.substack.com by JC Price

It’s not that AI is inherently biased, but in its current state, it favors those who can afford it. The wealthy districts continue to pull ahead, leaving schools without resources further behind. Students in these underserved areas aren’t just being deprived of technology—they’re being deprived of the future.

But imagine a different world—one where AI doesn’t deepen the divide, but helps to bridge it. Technology doesn’t have to be the luxury of the wealthy. It can be a tool for every student, designed to meet them where they are. Adaptive AI systems, integrated into schools regardless of their budget, can provide personalized learning experiences that help students catch up and push forward, all while respecting the limits of their current infrastructure. This is where AI’s true potential lies—not in widening the gap, but in leveling the field.

But imagine if, instead of replacing teachers, AI helped to support them. Picture a world where teachers are freed from the administrative burdens that weigh them down. Where AI systems handle the logistics, so teachers can focus on what they do best—teaching, mentoring, and inspiring the next generation. Professional development could be personalized, helping teachers integrate AI into their classrooms in ways that enhance their teaching, without adding to their workload. This is the future we should be striving toward—one where technology serves to lift up educators, not push them out.

 

Employers Say Students Need AI Skills. What If Students Don’t Want Them? — from insidehighered.com by Ashley Mowreader
Colleges and universities are considering new ways to incorporate generative AI into teaching and learning, but not every student is on board with the tech yet. Experts weigh in on the necessity of AI in career preparation and higher education’s role in preparing students for jobs of the future.

Among the 5,025-plus survey respondents, around 2 percent (n=93) provided free responses to the question on AI policy and use in the classroom. Over half (55) of those responses were flat-out refusals to engage with AI. A few said they don’t know how to use AI or are not familiar with the tools, which affects their ability to apply them appropriately to coursework.

But as generative AI becomes more ingrained into the workplace and higher education, a growing number of professors and industry experts believe this will be something all students need, in their classes and in their lives beyond academia.

From DSC:
I used to teach a Foundations of Information Technology class. Some of the students didn’t want to be there at the start, as it was a required course for non-CS majors. But after seeing what various applications and technologies could do for them, a good portion of those same folks changed their minds. But not all. Some students (2% sounds about right) asserted that they would never use such technologies in their futures. Good luck with that, I thought to myself. There’s hardly a job out there that doesn’t use some sort of technology.

And I still think that today — if not more so. If students want good jobs, they will need to learn how to use AI-based tools and technologies. I’m not sure there’s much of a choice. And I don’t think there’s much of a choice for the rest of us either — whether we’re still working or not. 

So in looking at the title of the article — “Employers Say Students Need AI Skills. What If Students Don’t Want Them?” — those of us who have spent any time working within the world of business already know the answer.

#Reinvent #Skills #StayingRelevant #Surviving #Workplace + several other categories/tags apply.


For those folks who have tried AI:

Skills: However, genAI may also be helpful in building skills to retain a job or secure a new one. People who had used genAI tools were more than twice as likely to think that these tools could help them learn new skills that may be useful at work or in locating a new job. Specifically, among those who had not used genAI tools, 23 percent believed that these tools might help them learn new skills, whereas 50 percent of those who had used the tools thought they might be helpful in acquiring useful skills (a highly statistically significant difference, after controlling for demographic traits).

Source: Federal Reserve Bank of New York
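For readers who want to see what a comparison like that looks like in code, here is a minimal sketch of an unadjusted two-proportion z-test in Python. The sample sizes below are hypothetical stand-ins (the report gives the 23 percent and 50 percent figures, not raw counts), and the actual analysis also controlled for demographic traits, which this simple test does not.

```python
# Minimal sketch of a two-proportion z-test for the 23% vs. 50% gap.
# NOTE: the sample sizes are hypothetical stand-ins; the NY Fed report
# provides percentages, and its analysis also controlled for demographics.
from statsmodels.stats.proportion import proportions_ztest

users_yes, users_n = 500, 1000        # hypothetical: 50% of genAI users said "yes"
nonusers_yes, nonusers_n = 230, 1000  # hypothetical: 23% of non-users said "yes"

z_stat, p_value = proportions_ztest(
    count=[users_yes, nonusers_yes],
    nobs=[users_n, nonusers_n],
)
print(f"z = {z_stat:.2f}, p = {p_value:.3g}")  # a gap this large is highly significant
```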

 

Workera’s CEO was mentored by Andrew Ng. Now he wants an AI agent to mentor you. — from techcrunch.com by Maxwell Zeff; via Claire Zau

On Tuesday, Workera announced Sage, an AI agent you can talk with that’s designed to assess an employee’s skill level, goals, and needs. After the employee takes some short tests, Workera claims Sage will accurately gauge how proficient they are at a certain skill. Then Sage can recommend appropriate online courses through Coursera, Workday, or other learning platform partners. Through conversations with Sage, Workera aims to meet employees where they are, testing their skills in writing, machine learning, or math and giving them a path to improve.

From DSC:
This is very much akin to what I’ve been trying to get at with my Learning from the Living [AI-Based Class] Room vision. And as learning agents come onto the scene, this type of vision should take off!

 

Voice and Trust in Autonomous Learning Experiences — from learningguild.com by Bill Brandon

This article seeks to apply some lessons from brand management to learning design at a high level. Throughout the rest of this article, it is essential to remember that the context is an autonomous, interactive learning experience. The experience is created adaptively by Gen AI or (soon enough) by agents, not by rigid scripts. It may be that an AI will choose to present prewritten texts or prerecorded videos from a content library according to the human users’ responses or questions. Still, the overall experience will be different for each user. It will be more like a conversation than a book.
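To make the “adaptive rather than scripted” idea concrete, here is a toy sketch (mine, not the article’s) of an experience engine that picks the next prewritten item from a content library based on the learner’s latest response. In practice the selection would be made by a generative model or an agent; the keyword-overlap scoring below merely stands in for that decision step.

```python
# Toy sketch: adaptively choose the next prewritten item from a content
# library based on the learner's latest response, instead of following a
# fixed script. The overlap scoring stands in for an LLM/agent decision.

CONTENT_LIBRARY = [
    {"id": "video-01", "tags": {"brand", "voice", "trust"}, "title": "Why voice builds trust"},
    {"id": "text-02", "tags": {"feedback", "practice"}, "title": "Practice with feedback"},
    {"id": "video-03", "tags": {"risk", "privacy"}, "title": "Perceived risk and privacy"},
]

def choose_next_item(learner_response: str, history: set[str]) -> dict:
    """Pick the unseen library item whose tags best match the learner's response."""
    words = set(learner_response.lower().split())
    candidates = [item for item in CONTENT_LIBRARY if item["id"] not in history]
    return max(candidates, key=lambda item: len(item["tags"] & words))

seen: set[str] = set()
response = "I'm worried about privacy and the risk of sharing my data"
next_item = choose_next_item(response, seen)
print(next_item["id"], "->", next_item["title"])  # video-03 -> Perceived risk and privacy
```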

In summary, while AI chatbots have the potential to enhance learning experiences, their acceptance and effectiveness depend on several factors, including perceived usefulness, ease of use, trust, relational factors, perceived risk, and enjoyment. 

Personalization and building trust are essential for maintaining user engagement and achieving positive learning outcomes. The right “voice” for autonomous AI or a chatbot can enhance trust by making interactions more personal, consistent, and empathetic.

 

AI’s Trillion-Dollar Opportunity — from bain.com by David Crawford, Jue Wang, and Roy Singh
The market for AI products and services could reach between $780 billion and $990 billion by 2027.

At a Glance

  • The big cloud providers are the largest concentration of R&D, talent, and innovation today, pushing the boundaries of large models and advanced infrastructure.
  • Innovation with smaller models (open-source and proprietary), edge infrastructure, and commercial software is reaching enterprises, sovereigns, and research institutions.
  • Commercial software vendors are rapidly expanding their feature sets to provide the best use cases and leverage their data assets.

Accelerated market growth. Nvidia’s CEO, Jensen Huang, summed up the potential in the company’s Q3 2024 earnings call: “Generative AI is the largest TAM [total addressable market] expansion of software and hardware that we’ve seen in several decades.”


And on a somewhat related note (i.e., emerging technologies), also see the following two postings:

Surgical Robots: Current Uses and Future Expectations — from medicalfuturist.com by Pranavsingh Dhunnoo
As the term implies, a surgical robot is an assistive tool for performing surgical procedures. Such manoeuvres, also called robotic surgeries or robot-assisted surgery, usually involve a human surgeon controlling mechanical arms from a control centre.

Key Takeaways

  • Robots’ potential has long been a fascination for humans and has even led to a booming field of robot-assisted surgery.
  • Surgical robots assist surgeons in performing accurate, minimally invasive procedures that are beneficial for patients’ recovery.
  • The assistance of robots extends beyond incisions and includes laparoscopies, radiosurgeries and, in the future, a combination of artificial intelligence technologies to assist surgeons in their craft.

Proto hologram tech allows cancer patients to receive specialist care without traveling large distances — from inavateonthenet.net

“Working with the team from Proto to bring to life, what several years ago would have seemed impossible, is now going to allow West Cancer Center & Research Institute to pioneer options for patients to get highly specialized care without having to travel to large metro areas,” said West Cancer’s CEO, Mitch Graves.




Clone your voice in minutes: The AI trick 95% don’t know about — from aidisruptor.ai by Alex McFarland
Warning: May cause unexpected bouts of talking to yourself

Now that you’ve got your voice clone, what can you do with it?

  1. Content Creation:
    • Podcast Production: Record episodes in half the time. Your listeners won’t know the difference, but your schedule will thank you.
    • Audiobook Narration: Always wanted to narrate your own book? Now you can, without spending weeks in a recording studio.
    • YouTube Videos: Create voiceovers for your videos in multiple languages. World domination, here you come!
  2. Business Brilliance:
    • Customer Service: Personalized automated responses that actually sound personal.
    • Training Materials: Create engaging e-learning content in your own voice, minus the hours of recording.
    • Presentations: Never worry about losing your voice before a big presentation again. Your clone’s got your back.

185 real-world gen AI use cases from the world’s leading organizations — from blog.google by Brian Hall; via Daniel Nest’s Why Try AI

In a matter of months, organizations have gone from AI helping answer questions, to AI making predictions, to generative AI agents. What makes AI agents unique is that they can take actions to achieve specific goals, whether that’s guiding a shopper to the perfect pair of shoes, helping an employee find the right health benefits, or supporting nursing staff with smoother patient hand-offs during shift changes.

In our work with customers, we keep hearing that their teams are increasingly focused on improving productivity, automating processes, and modernizing the customer experience. These aims are now being achieved through the AI agents they’re developing in six key areas: customer service; employee empowerment; code creation; data analysis; cybersecurity; and creative ideation and production.

Here’s a snapshot of how 185 of these industry leaders are putting AI to use today, creating real-world use cases that will transform tomorrow.


AI Data Drop: 3 Key Insights from Real-World Research on AI Usage — from microsoft.com; via Daniel Nest’s Why Try AI
One of the largest studies of Copilot usage—at nearly 60 companies—reveals how AI is changing the way we work.

  1. AI is starting to liberate people from email
  2. Meetings are becoming more about value creation
  3. People are co-creating more with AI—and with one another


*** Dharmesh has been working on creating agent.ai — a professional network for AI agents. ***


Speaking of agents, also see:

Onboarding the AI workforce: How digital agents will redefine work itself — from venturebeat.com by Gary Grossman

AI in 2030: A transformative force

  1. AI agents are integral team members
  2. The emergence of digital humans
  3. AI-driven speech and conversational interfaces
  4. AI-enhanced decision-making and leadership
  5. Innovation and research powered by AI
  6. The changing nature of job roles and skills

AI Video Tools You Can Use Today — from heatherbcooper.substack.com by Heather Cooper
The latest AI video models that deliver results

AI video models are improving so quickly, I can barely keep up! I wrote about unreleased Adobe Firefly Video in the last issue, and we are no closer to public access to Sora.

No worries – we do have plenty of generative AI video tools we can use right now.

  • Kling AI launched its updated v1.5, and the quality of its image-to-video and text-to-video output is impressive.
  • Hailuo MiniMax text-to-video remains free to use for now, and it produces natural, photorealistic results (with watermarks).
  • Runway added the option to upload portrait aspect ratio images to generate vertical videos in Gen-3 Alpha & Turbo modes.
  • …plus several more

 

Why Jensen Huang and Marc Benioff see ‘gigantic’ opportunity for agentic AI — from venturebeat.com by Taryn Plumb

Going forward, the opportunity for AI agents will be “gigantic,” according to Nvidia founder and CEO Jensen Huang.

Already, progress is “spectacular and surprising,” with AI development moving faster and faster and the industry getting into the “flywheel zone” that technology needs to advance, Huang said in a fireside chat at Salesforce’s flagship event Dreamforce this week.

“This is an extraordinary time,” Huang said while on stage with Marc Benioff, Salesforce chair, CEO and co-founder. “In no time in history has technology moved faster than Moore’s Law. We’re moving way faster than Moore’s Law, arguably Moore’s Law squared.”

“We’ll have agents working with agents, agents working with us,” said Huang.

 

One left — from reddit.com by u/jim_andr in r/OpenAI

 

From DSC:
I’m not trying to gossip here. I post this because Sam Altman is the head of arguably one of the most powerful companies in the world today — at least in terms of introducing change to a variety of societies throughout the globe (both positive and negative). So when we’ve now seen almost the entire leadership team head out the door, this certainly gives me major pause. I don’t like it.
Items like the ones below begin to capture some of why I’m troubled by, and suspicious of, these moves.

 

This article….

Artificial Intelligence and Schools: When Tech Makers and Educators Collaborate, AI Doesn’t Have to be Scary — from the74million.org by Edward Montalvo
AI is already showing us how to make education more individualized and equitable.

The XQ Institute shares this mindset as part of our mission to reimagine the high school learning experience so it’s more relevant and engaging for today’s learners, while better preparing them for the future. We see AI as a tool with transformative potential for educators and makers to leverage — but only if it’s developed and implemented with ethics, transparency and equity at the forefront. That’s why we’re building partnerships between educators and AI developers to ensure that products are shaped by the real needs and challenges of students, teachers and schools. Here’s how we believe all stakeholders can embrace the Department’s recommendations through ongoing collaborations with tech leaders, educators and students alike.

…led me to the XQ Institute, and I very much like what I’m initially seeing! Here are some excerpts from their website:

 


 

FlexOS’ Stay Ahead Edition #43 — from flexos.work

People started discussing what they could do with NotebookLM after Google launched Audio Overviews, which let you listen to two AI hosts talking in depth about the documents you upload. Here’s what it can do:

  • Summarization: Automatically generate summaries of uploaded documents, highlighting key topics and suggesting relevant questions.
  • Question Answering: Users can ask NotebookLM questions about their uploaded documents, and answers will be provided based on the information contained within them.
  • Idea Generation: NotebookLM can assist with brainstorming and developing new ideas.
  • Source Grounding: A big plus against AI chatbot hallucination, NotebookLM lets users ground its responses in the specific documents they choose (see the sketch after this list).
  • …plus several other items
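As a rough illustration of what “source grounding” means under the hood, here is a generic retrieval-style sketch: pull the most relevant passages from the user’s own documents and constrain the model’s answer to them. This is not NotebookLM’s actual implementation (Google hasn’t published it); it only shows the general pattern.

```python
# Generic sketch of source grounding: retrieve the most relevant passages
# from the user's own documents and instruct the model to answer only from
# them, citing the source. (Illustrative only; not NotebookLM's internals.)

def top_passages(question: str, documents: dict[str, str], k: int = 2) -> list[tuple[str, str]]:
    """Rank document passages by naive word overlap with the question."""
    q_words = set(question.lower().split())
    scored = []
    for name, text in documents.items():
        for passage in text.split("\n\n"):
            score = len(q_words & set(passage.lower().split()))
            scored.append((score, name, passage))
    scored.sort(reverse=True)
    return [(name, passage) for _, name, passage in scored[:k]]

def grounded_prompt(question: str, documents: dict[str, str]) -> str:
    sources = "\n".join(f"[{name}] {passage}" for name, passage in top_passages(question, documents))
    return (
        "Answer the question using ONLY the sources below. "
        "Cite the source name in brackets. If the sources don't contain the answer, say so.\n\n"
        f"Sources:\n{sources}\n\nQuestion: {question}"
    )

docs = {"syllabus.pdf": "Office hours are Tuesdays 2-4pm.\n\nThe final project is due December 12."}
print(grounded_prompt("When is the final project due?", docs))
```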

The posting also lists several ideas to try with NotebookLM such as:

Idea 2: Study Companion

  • Upload all your course materials and ask NotebookLM to turn them into Question-and-Answer format, a glossary, or a study guide.
  • Get a breakdown of the course materials to understand them better.

Google’s NotebookLM: A Game-Changer for Education and Beyond — from ai-supremacy.com by Michael Spencer and Nick Potkalitsky
AI Tools: Breaking down Google’s latest AI tool and its implications for education.

“Google’s AI note-taking app NotebookLM can now explain complex topics to you out loud”

With more immersive text-to-video and audio products soon available, and the rise of apps like Suno AI, how we “experience” generative AI is also changing, from the chatbot of two years ago to a more multi-modal educational journey. The AI tools on the research and curation side are starting to reflect these advancements as well.


Meet Google NotebookLM: 10 things to know for educators — from ditchthattextbook.com by Matt Miller

1. Upload a variety of sources for NotebookLM to use. 
You can use …

  • websites
  • PDF files
  • links to websites
  • any text you’ve copied
  • Google Docs and Slides
  • even Markdown

You can’t link it to YouTube videos, but you can copy and paste the transcript (and maybe type a little context about the YouTube video before pasting it); one way to pull a transcript programmatically is sketched after this list.

2. Ask it to create resources.
3. Create an audio summary.
4. Chat with your sources.
5. Save (almost) everything. 
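Related to the YouTube tip under item 1, here is one way to pull a transcript programmatically so you can paste it into NotebookLM as a text source. This assumes the third-party youtube-transcript-api package (pip install youtube-transcript-api) and a video that actually has captions; the video ID below is just a placeholder.

```python
# Fetch a YouTube transcript so it can be pasted into NotebookLM as text.
# Assumes: pip install youtube-transcript-api, and that the video has captions.
from youtube_transcript_api import YouTubeTranscriptApi

video_id = "dQw4w9WgXcQ"  # placeholder: the part after "v=" in the YouTube URL

segments = YouTubeTranscriptApi.get_transcript(video_id)
transcript = " ".join(segment["text"] for segment in segments)

# Add a little context up front, as the tip above suggests, then copy/paste.
print(f"Context: transcript of YouTube video {video_id}\n\n{transcript[:500]}...")
```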


NotebookLM summarizes my dissertation — from darcynorman.net by D’Arcy Norman, PhD

I finally tried out Google’s newly announced NotebookLM generative AI application. It provides a set of LLM-powered tools to summarize documents. I fed it my dissertation and was surprised at how useful the output could be.

The most impressive tool creates a podcast episode, complete with dual hosts in conversation about the document. First – these are AI-generated hosts. Synthetic voices, speaking for synthetic hosts. And holy moly is it effective. Second – although I’d initially thought the conversational summary would be a dumb gimmick, it is surprisingly powerful.


4 Tips for Designing AI-Resistant Assessments — from techlearning.com by Steve Baule and Erin Carter
As AI continues to evolve, instructors must modify their approach by designing meaningful, rigorous assessments.

As instructors revise assessments so that they resist being completed by AI tools with little student input, they should consider the following principles:

  • Incorporate personal experiences and local content into assignments
  • Ask students for multi-modal deliverables
  • Assess the developmental benchmarks for assignments and transition assignments further up Bloom’s Taxonomy
  • Consider real-time and oral assignments

Google CEO Sundar Pichai announces $120M fund for global AI education — from techcrunch.com by Anthony Ha

He added that he wants to avoid a global “AI divide” and that Google is creating a $120 million Global AI Opportunity Fund through which it will “make AI education and training available in communities around the world” in partnership with local nonprofits and NGOs.


Educators discuss the state of creativity in an AI world — from gettingsmart.com by Joe & Kristin Merrill, LaKeshia Brooks, Dominique’ Harbour, Erika Sandstrom

Key Points

  • AI allows for a more personalized learning experience, enabling students to explore creative ideas without traditional classroom limitations.
  • The focus of technology integration should be on how the tool is used within lessons, not just the tool itself.

Addendum on 9/27/24:

Google’s NotebookLM enhances AI note-taking with YouTube, audio file sources, sharable audio discussions — from techcrunch.com by Jagmeet Singh

Google on Thursday announced new updates to its AI note-taking and research assistant, NotebookLM, allowing users to get summaries of YouTube videos and audio files and even create sharable AI-generated audio discussions.

NotebookLM adds audio and YouTube support, plus easier sharing of Audio Overviews — from blog.google

 

AI researcher Jim Fan has had a charmed career. He was OpenAI’s first intern before he did his PhD at Stanford with “godmother of AI,” Fei-Fei Li. He graduated into a research scientist position at Nvidia and now leads its Embodied AI “GEAR” group. The lab’s current work spans foundation models for humanoid robots to agents for virtual worlds. Jim describes a three-pronged data strategy for robotics, combining internet-scale data, simulation data and real world robot data. He believes that in the next few years it will be possible to create a “foundation agent” that can generalize across skills, embodiments and realities—both physical and virtual. He also supports Jensen Huang’s idea that “Everything that moves will eventually be autonomous.”


Runway Partners with Lionsgate — from runwayml.com via The Rundown AI
Runway and Lionsgate are partnering to explore the use of AI in film production.

Lionsgate and Runway have entered into a first-of-its-kind partnership centered around the creation and training of a new AI model, customized on Lionsgate’s proprietary catalog. Fundamentally designed to help Lionsgate Studios, its filmmakers, directors and other creative talent augment their work, the model generates cinematic video that can be further iterated using Runway’s suite of controllable tools.

Per The Rundown: Lionsgate, the film company behind The Hunger Games, John Wick, and Saw, teamed up with AI video generation company Runway to create a custom AI model trained on Lionsgate’s film catalogue.

The details:

  • The partnership will develop an AI model specifically trained on Lionsgate’s proprietary content library, designed to generate cinematic video that filmmakers can further manipulate using Runway’s tools.
  • Lionsgate sees AI as a tool to augment and enhance its current operations, streamlining both pre-production and post-production processes.
  • Runway is considering ways to offer similar custom-trained models as templates for individual creators, expanding access to AI-powered filmmaking tools beyond major studios.

Why it matters: As many writers, actors, and filmmakers strike against ChatGPT, Lionsgate is diving head-first into the world of generative AI through its partnership with Runway. This is one of the first major collabs between an AI startup and a major Hollywood company — and its success or failure could set precedent for years to come.


A bottle of water per email: the hidden environmental costs of using AI chatbots — from washingtonpost.com by Pranshu Verma and Shelly Tan (behind paywall)
AI bots generate a lot of heat, and keeping their computer servers running exacts a toll.

Each prompt on ChatGPT flows through a server that runs thousands of calculations to determine the best words to use in a response.

In completing those calculations, these servers, typically housed in data centers, generate heat. Often, water systems are used to cool the equipment and keep it functioning. Water transports the heat generated in the data centers into cooling towers to help it escape the building, similar to how the human body uses sweat to keep cool, according to Shaolei Ren, an associate professor at UC Riverside.

Where electricity is cheaper, or water comparatively scarce, electricity is often used to cool these warehouses with large units resembling air-conditioners, he said. That means the amount of water and electricity an individual query requires can depend on a data center’s location and vary widely.


AI, Humans and Work: 10 Thoughts. — from rishad.substack.com by Rishad Tobaccowala
The Future Does Not Fit in the Containers of the Past. Edition 215.

10 thoughts about AI, Humans and Work in 10 minutes:

  1. AI is still Under-hyped.
  2. AI itself will be like electricity and is unlikely to be a differentiator for most firms.
  3. AI is not alive but can be thought of as a new species.
  4. Knowledge will be free, and every knowledge worker’s job will change in 2025.
  5. The key about AI is not to ask what AI will do to us but what AI can do for us.
  6. Plus 5 other thoughts

 

 

10 Ways I Use LLMs like ChatGPT as a Professor — from automatedteach.com by Graham Clay
ChatGPT-4o, Gemini 1.5 Pro, Claude 3.5 Sonnet, custom GPTs – you name it, I use it. Here’s how…

Excerpt:

  1. To plan lessons (especially activities)
  2. To create course content (especially quizzes; see the sketch below this list)
  3. To tutor my students
  4. To grade faster and give better feedback
  5. To draft grant applications
  6. Plus 5 other items
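As one hedged example of item 2 above (creating quizzes), the sketch below asks a chat model for multiple-choice questions and parses the JSON it returns. It assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable; the prompt wording and model name are illustrative choices of mine, not taken from Clay’s article.

```python
# Illustrative sketch: generate a short multiple-choice quiz from course notes.
# Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set;
# the prompt and model name are examples, not from the article.
import json
from openai import OpenAI

client = OpenAI()

notes = "Photosynthesis converts light energy, water, and CO2 into glucose and oxygen."

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You write clear quiz questions for undergraduates."},
        {
            "role": "user",
            "content": (
                "Write 3 multiple-choice questions based on these notes. Return a JSON "
                'object with a "questions" list of {"question", "choices", "answer"} items.\n\n'
                + notes
            ),
        },
    ],
    response_format={"type": "json_object"},
)

quiz = json.loads(response.choices[0].message.content)
print(json.dumps(quiz, indent=2))
```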

From Caution to Calcification to Creativity: Reanimating Education with AI’s Frankenstein Potential — from nickpotkalitsky.substack.com by Nick Potkalitsky
A Critical Analysis of AI-Assisted Lesson Planning: Evaluating Efficacy and Pedagogical Implications

Excerpt (emphasis DSC):

As we navigate the rapidly evolving landscape of artificial intelligence in education, a troubling trend has emerged. What began as cautious skepticism has calcified into rigid opposition. The discourse surrounding AI in classrooms has shifted from empirical critique to categorical rejection, creating a chasm between the potential of AI and its practical implementation in education.

This hardening of attitudes comes at a significant cost. While educators and policymakers debate, students find themselves caught in the crossfire. They lack safe, guided access to AI tools that are increasingly ubiquitous in the world beyond school walls. In the absence of formal instruction, many are teaching themselves to use these tools, often in less than productive ways. Others live in a state of constant anxiety, fearing accusations of AI reliance in their work. These are just a few symptoms of an overarching educational culture that has become resistant to change, even as the world around it transforms at an unprecedented pace.

Yet, as this calcification sets in, I find myself in a curious position: the more I thoughtfully integrate AI into my teaching practice, the more I witness its potential to enhance and transform education.


NotebookLM and Google’s Multimodal Vision for AI-Powered Learning Tools — from marcwatkins.substack.com by Marc Watkins

A Variety of Use Cases

  • Create an Interactive Syllabus
  • Presentation Deep Dive: Upload Your Slides
  • Note Taking: Turn Your Chalkboard into a Digital Canvas
  • Explore a Reading or Series of Readings
  • Help Navigating Feedback
  • Portfolio Building Blocks

Must-Have Competencies and Skills in Our New AI World: A Synthesis for Educational Reform — from er.educause.edu by Fawzi BenMessaoud
The transformative impact of artificial intelligence on educational systems calls for a comprehensive reform to prepare future generations for an AI-integrated world.

The urgency to integrate AI competencies into education is about preparing students not just to adapt to inevitable changes but to lead the charge in shaping an AI-augmented world. It’s about equipping them to ask the right questions, innovate responsibly, and navigate the ethical quandaries that come with such power.

AI in education should augment and complement learners’ and educators’ aptitude and expertise, personalize and optimize the learning experience, and support lifelong learning and development. AI in education should be a national priority and a collaborative effort among all stakeholders, to ensure that AI is designed and deployed in an ethical, equitable, and inclusive way that respects the diversity and dignity of all learners and educators and that promotes the common good and social justice. And AI in education should be about the production of AI, not just the consumption of AI, meaning that learners and educators should have the opportunity to learn about AI, to participate in its creation and evaluation, and to shape its impact and direction.

 

Top Software Engineering Newsletters in 2024 — from ai-supremacy.com by Michael Spencer
Including a very select few ML, AI and product Newsletters into the mix for Software Engineers.

This is an article specifically for the software engineers and developers among you.

In the past year (2023–2024), professionals have been finding more value in newsletters than ever before, especially on Substack.

As working from home took off, the nature of mentorship and skill acquisition also evolved and shifted. Newsletters with pragmatic advice on our careers, it turns out, are super valuable. This article is a resource list. Are you a software developer, or do you work with one or know someone who is or wants to be?

 

Legal budgets will get an AI-inspired makeover in 2025: survey — from legaldive.com by Justin Bachman
Nearly every general counsel is budgeting to add generative AI tools to their departments – and they’re all expecting to realize efficiencies by doing so.

Dive Brief:

  • Nearly all general counsel say their budgets are up slightly after wrestling with widespread cuts last year. And most of them, 61%, say they expect slightly larger budgets next year as well, an average of 5% more, according to the 2025 In-House Legal Budgeting Report from Axiom and Wakefield Research. Technology was ranked as the top in-house investment priority for both 2024 and 2025 for larger companies.
  • Legal managers predict their companies will boost investment on technology and real estate/facilities in 2025, while reducing outlays for human resources and mergers and acquisition activity, according to the survey. This mix of changing priorities might disrupt legal budgets.
  • Among the planned legal tech spending, the top three areas for investment are virtual legal assistants/AI-powered chatbots (35%); e-billing and spend-management software (31%); and contract management platforms (30%).
 



“Who to follow in AI” in 2024? — from ai-supremacy.com by Michael Spencer
Part III – #35–55 – I combed the internet and found the best sources of AI insights, education, and articles. LinkedIn | Newsletters | X | YouTube | Substack | Threads | Podcasts

This list features both some of the best Newsletters on AI and people who make LinkedIn posts about AI papers, advances and breakthroughs. In today’s article we’ll be meeting the first 19-34, in a list of 180+.

Newsletter Writers
YouTubers
Engineers
Researchers who write
Technologists who are Creators
AI Educators
AI Evangelists of various kinds
Futurism writers and authors

I have been sharing the list in reverse chronological order on LinkedIn here.


Inside Google’s 7-Year Mission to Give AI a Robot Body — from wired.com by Hans Peter Brondmo
As the head of Alphabet’s AI-powered robotics moonshot, I came to believe many things. For one, robots can’t come soon enough. For another, they shouldn’t look like us.


Learning to Reason with LLMs — from openai.com
We are introducing OpenAI o1, a new large language model trained with reinforcement learning to perform complex reasoning. o1 thinks before it answers—it can produce a long internal chain of thought before responding to the user.
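For anyone who wants to try the model, here is a minimal call to o1 through the OpenAI Python SDK. It assumes API access to the o1 models and an OPENAI_API_KEY environment variable; at the preview release the model accepted only user messages, so the sketch keeps things deliberately bare.

```python
# Minimal sketch of calling the o1 reasoning model via the OpenAI Python SDK.
# Assumes: pip install openai, OPENAI_API_KEY set, and access to the o1 models.
# At launch, o1-preview did not accept system messages, so only a user turn is sent.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="o1-preview",
    messages=[
        {
            "role": "user",
            "content": "A bat and a ball cost $1.10 in total; the bat costs $1.00 "
                       "more than the ball. How much does the ball cost? Explain briefly.",
        }
    ],
)

print(response.choices[0].message.content)  # the visible answer, after internal reasoning
```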


Also see various items re: Microsoft Copilot, including a video on Copilot Pages.

Sal Khan on the critical human skills for an AI age — from time.com by Kevin J. Delaney

As a preview of the upcoming Summit interview, here are Khan’s views on two critical questions, edited for space and clarity:

  1. What are the enduring human work skills in a world with ever-advancing AI? Some people say students should study liberal arts. Others say deep domain expertise is the key to remaining professionally relevant. Others say you need to have the skills of a manager to be able to delegate to AI. What do you think are the skills or competencies that ensure continued relevance professionally, employability, etc.?
  2. A lot of organizations are thinking about skills-based approaches to their talent. It involves questions like, ‘Does someone know how to do this thing or not?’ And what are the ways in which they can learn it and have some accredited way to know they actually have done it? That is one of the ways in which people use Khan Academy. Do you have a view of skills-based approaches within workplaces, and any thoughts on how AI tutors and training fit within that context?

 
© 2024 | Daniel Christian