From DSC:
The above item is simply excellent!!! I love it!



Also relevant/see:

3 new Chrome AI features for even more helpful browsing — from blog.google by Parisa Tabriz
See how Chrome’s new AI features, including Google Lens for desktop and Tab compare, can help you get things done more easily on the web.


On speaking to AI — from oneusefulthing.org by Ethan Mollick
Voice changes a lot of things

So, let’s talk about ChatGPT’s new Advanced Voice mode and the new AI-powered Siri. They are not just different approaches to talking to AI. In many ways, they represent the divide between two philosophies of AI – Copilots versus Agents, small models versus large ones, specialists versus generalists.


Your guide to AI – August 2024 — from nathanbenaich.substack.com by Nathan Benaich and Alex Chalmers


Microsoft says OpenAI is now a competitor in AI and search — from cnbc.com by Jordan Novet

Key Points

  • Microsoft’s annually updated list of competitors now includes OpenAI, a long-term strategic partner.
  • The change comes days after OpenAI announced a prototype of a search engine.
  • Microsoft has reportedly invested $13 billion into OpenAI.


Excerpt from Graham Clay:

1. Flux, an open-source text-to-image creator that is comparable to industry leaders like Midjourney, was released by Black Forest Labs (the “original team” behind Stable Diffusion). It is capable of generating high quality text in images (there are tons of educational use cases). You can play with it on their demo page, on Poe, or by running it on your own computer (tutorial here).

Other items re: Flux:

How to FLUX — from heatherbcooper.substack.com by Heather Cooper
Where to use FLUX online & full tutorial to create a sleek ad in minutes


Also from Heather Cooper:

Introducing FLUX: Open-Source text to image model

FLUX… has been EVERYWHERE this week, as I’m sure you have seen. Developed by Black Forest Labs, FLUX is an open-source image generation model that’s gaining attention for its ability to rival leading models like Midjourney, DALL·E 3, and SDXL.

What sets FLUX apart is its blend of creative freedom, precision, and accessibility—it’s available across multiple platforms and can be run locally.

Why FLUX Matters
FLUX’s open-source nature makes it accessible to a broad audience, from hobbyists to professionals.

It offers advanced multimodal and parallel diffusion transformer technology, delivering high visual quality, strong prompt adherence, and diverse outputs.

It’s available in 3 models:
FLUX.1 [pro]: A high-performance, commercial image synthesis model.
FLUX.1 [dev]: An open-weight, non-commercial variant of FLUX.1 [pro].
FLUX.1 [schnell]: A faster, distilled version of FLUX.1, operating up to 10x quicker.

Daily Digest: Huge (in)Flux of AI videos. — from bensbites.beehiiv.com
PLUS: Review of ChatGPT’s advanced voice mode.

  1. During the weekend, image models made a comeback. Recently released Flux models can create realistic images with near-perfect text—straight from the model, without much patchwork. To get the party going, people are putting these images into video generation models to create pretty trippy videos. I can’t identify half of them as AI, and they’ll only get better. See this tutorial on how to create a video ad for your product.

 


7 not only cool but handy use cases of new Claude — from techthatmatters.beehiiv.com by Harsh Makadia

  1. Data visualization
  2. Infographic
  3. Copy the UI of a website
  4. …and more

Achieving Human Level Competitive Robot Table Tennis — from sites.google.com

 

What Students Want When It Comes To AI — from onedtech.philhillaa.com by Glenda Morgan
The Digital Education Council Global AI Student Survey 2024

The Digital Education Council (DEC) this week released the results of a global survey of student opinions on AI. It’s a large survey with nearly 4,000 respondents conducted across 16 countries, but more importantly, it asks some interesting questions. There are many surveys about AI out there right now, but this one stands out. I’m going to go into some depth here, as the entire survey report is worth reading.



AI is forcing a teaching and learning evolution — from eschoolnews.com by Laura Ascione
AI and technology tools are leading to innovative student learning–along with classroom, school, and district efficiency

Key findings from the 2024 K-12 Educator + AI Survey, which was conducted by Hanover Research, include:

  • Teachers are using AI to personalize and improve student learning, not just run classrooms more efficiently, but challenges remain
  • While post-pandemic challenges persist, the increased use of technology is viewed positively by most teachers and administrators
  • …and more

From DSC:
I wonder…how will the use of AI in education square with the issues of using smartphones/laptops within the classrooms? See:

  • Why Schools Are Racing to Ban Student Phones — from nytimes.com by Natasha Singer; via GSV
    As the new school year starts, a wave of new laws that aim to curb distracted learning is taking effect in Indiana, Louisiana and other states.

A three-part series from Dr. Phillippa Hardman:

Part 1: Writing Learning Objectives  
The Results Part 1: Writing Learning Objectives

In this week’s post I will dive into the results from task 1: writing learning objectives. Stay tuned over the next two weeks to see all of the results.

Part 2: Selecting Instructional Strategies
The Results Part 2: Selecting an Instructional Strategy

Welcome back to our three-part series exploring the impact of AI on instructional design.

This week, we’re tackling a second task and a crucial aspect of instructional design: selecting instructional strategies. The ability to select appropriate instructional strategies to achieve intended objectives is a mission-critical skill for any instructional designer. So, can AI help us do a good job of it? Let’s find out!

Part 3: How Close is AI to Replacing Instructional Designers?
The Results Part 3: Creating a Course Outline

Today, we’re diving into what many consider to be the role-defining task of the instructional designer: creating a course design outline.


ChatGPT Cheat Sheet for Instructional Designers! — from Alexandra Choy Youatt EdD

Instructional Designers!
Whether you’re new to the field or a seasoned expert, this comprehensive guide will help you leverage AI to create more engaging and effective learning experiences.

What’s Inside?
Roles and Tasks: Tailored prompts for various instructional design roles and tasks.
Formats: Different formats to present your work, from training plans to rubrics.
Learning Models: Guidance on using the ADDIE model and various pedagogical strategies.
Engagement Tips: Techniques for online engagement and collaboration.
Specific Tips: Industry certifications, work-based learning, safety protocols, and more.

Who Can Benefit?
Corporate Trainers
Curriculum Developers
E-Learning Specialists
Instructional Technologists
Learning Experience Designers
And many more!

ChatGPT Cheat Sheet | Instructional Designer


5 AI Tools I Use Every Day (as a Busy Student) — from theaigirl.substack.com by Diana Dovgopol
AI tools that I use every day to boost my productivity.
#1 Gamma
#2 Perplexity
#3 Cockatoo

I use this AI tool almost every day as well. Since I’m still a master’s student at university, I have to attend lectures and seminars, which are always in English or German, neither of which is my native language. With the help of Cockatoo, I create scripts of the lectures and/or translations into my language. This means I don’t have to take notes in class and then manually translate them afterward. All I need to do is record the lecture audio on any device or directly in Cockatoo, upload it, and the audio and text are ready for me.

…and more


Students Worry Overemphasis on AI Could Devalue Education — from insidehighered.com by Juliette Rowsell
Report stresses that AI is “new standard” and universities need to better communicate policies to learners.

Rising use of AI in higher education could cause students to question the quality and value of education they receive, a report warns.

This year’s Digital Education Council Global AI Student Survey, of more than 3,800 students from 16 countries, found that more than half (55 percent) believed overuse of AI within teaching devalued education, and 52 percent said it negatively impacted their academic performance.

Despite this, significant numbers of students admitted to using such technology. Some 86 percent said they “regularly” used programs such as ChatGPT in their studies, 54 percent said they used it on a weekly basis, and 24 percent said they used it to write a first draft of a submission.

Higher Ed Leadership Is Excited About AI – But Investment Is Lacking — from forbes.com by Vinay Bhaskara

As corporate America races to integrate AI into its core operations, higher education finds itself in a precarious position. A survey I conducted of 63 university leaders reveals that while higher ed leaders recognize AI’s transformative potential, they’re struggling to turn that recognition into action.

This struggle is familiar for higher education — gifted with the mission of educating America’s youth but plagued with a myriad of operational and financial struggles, higher ed institutions often lag behind their corporate peers in technology adoption. In recent years, this gap has become threateningly large. In an era of declining enrollments and shifting demographics, closing this gap could be key to institutional survival and success.

The survey results paint a clear picture of inconsistency: 86% of higher ed leaders see AI as a “massive opportunity,” yet only 21% believe their institutions are prepared for it. This disconnect isn’t just a minor inconsistency – it’s a strategic vulnerability in an era of declining enrollments and shifting demographics.


(Generative) AI Isn’t Going Anywhere but Up — from stefanbauschard.substack.com by Stefan Bauschard
“Hype” claims are nonsense.

There has been a lot of talk recently about an “AI Bubble.” Supposedly, the industry, or at least the generative AI subset of it, will collapse. This is known as the “Generative AI Bubble.” A bubble — a broad one or a generative one — is nonsense. These are the reasons we will continue to see massive growth in AI.


AI Readiness: Prepare Your Workforce to Embrace the Future — from learningguild.com by Danielle Wallace

Artificial Intelligence (AI) is revolutionizing industries, enhancing efficiency, and unlocking new opportunities. To thrive in this landscape, organizations need to be ready to embrace AI not just technologically but also culturally.

Learning leaders play a crucial role in preparing employees to adapt and excel in an AI-driven workplace. Transforming into an AI-empowered organization requires more than just technological adoption; it demands a shift in organizational mindset. This guide delves into how learning leaders can support this transition by fostering the right mindset attributes in employees.


Claude AI for eLearning Developers — from learningguild.com by Bill Brandon

Claude is fast, produces grammatically correct text, and outputs easy-to-read articles, emails, blog posts, summaries, and analyses. Take some time to try it out. If you worry about plagiarism and text scraping, put the results through Grammarly’s plagiarism checker (I did not use Claude for this article, but I did send the text through Grammarly).


Survey: Top Teacher Uses of AI in the Classroom — from thejournal.com by Rhea Kelly

A new report from Cambium Learning Group outlines the top ways educators are using artificial intelligence to manage their classrooms and support student learning. Conducted by Hanover Research, the 2024 K-12 Educator + AI Survey polled 482 teachers and administrators at schools and districts that are actively using AI in the classroom.

More than half of survey respondents (56%) reported that they are leveraging AI to create personalized learning experiences for students. Other uses included providing real-time performance tracking and feedback (cited by 52% of respondents), helping students with critical thinking skills (50%), proofreading writing (47%), and lesson planning (44%).

On the administrator side, top uses of AI included interpreting/analyzing student data (61%), managing student records (56%), and managing professional development (56%).


Addendum on 8/14/24:

 


ChatGPT Voice Mode Is Here: Will It Revolutionize AI Communication?


Advanced Voice Mode – FAQ — from help.openai.com
Learn more about our Advanced Voice capabilities.

Advanced Voice Mode on ChatGPT features more natural, real-time conversations that pick up on and respond with emotion and non-verbal cues.

Advanced Voice Mode on ChatGPT is currently in a limited alpha. Please note that it may make mistakes, and access and rate limits are subject to change.


From DSC:
Think about the impacts/ramifications of global, virtual, real-time language translations!!! This type of technology will create very powerful, new affordances in our learning ecosystems — as well as in business communications, with the various governments across the globe, and more!

 

 

Welcome to the Digital Writing Lab -- Supporting teachers to develop and empower digitally literate citizens.

Digital Writing Lab

About this Project

The Digital Writing Lab is a key component of the Australian national Teaching Digital Writing project, which runs from 2022-2025.

This stage of the broader project involves academic and secondary English teacher collaboration to explore how teachers are conceptualising the teaching of digital writing and what further supports they may need.

Previous stages of the project included archival research reviewing materials related to digital writing in Australia’s National Textbook Collection, and a national survey of secondary English teachers. You can find out more about the whole project via the project blog.

Who runs the project?

Project Lead Lucinda McKnight is an Associate Professor and Australian Research Council (ARC) DECRA Fellow researching how English teachers can connect the teaching of writing to contemporary media and students’ lifeworlds.

She is working with Leon Furze, who holds the doctoral scholarship attached to this project, and Chris Zomer, the project Research Fellow. The project is located in the Research for Educational Impact (REDI) centre at Deakin University, Melbourne.


Teaching Digital Writing is a research project about English today.

 

For college students—and for higher ed itself—AI is a required course — from forbes.com by Jamie Merisotis

Some of the nation’s biggest tech companies have announced efforts to reskill people to avoid job losses caused by artificial intelligence, even as they work to perfect the technology that could eliminate millions of those jobs.

It’s fair to ask, however: What should college students and prospective students, weighing their choices and possible time and financial expenses, think of this?

The news this spring was encouraging for people seeking to reinvent their careers to grab middle-class jobs and a shot at economic security.

 


Addressing Special Education Needs With Custom AI Solutions — from teachthought.com
AI can offer many opportunities to create more inclusive and effective learning experiences for students with diverse learning profiles.

For too long, students with learning disabilities have struggled to navigate a traditional education system that often fails to meet their unique needs. But what if technology could help bridge the gap, offering personalized support and unlocking the full potential of every learner?

Artificial intelligence (AI) is emerging as a powerful ally in special education, offering many opportunities to create more inclusive and effective learning experiences for students with diverse learning profiles.



 



11 Summer AI Developments Important to Educators — from stefanbauschard.substack.com by Stefan Bauschard
Equity demands that we help students prepare to thrive in an AI-World

*SearchGPT
*Smaller & on-device (phones, glasses) AI models
*AI TAs
*Access barriers decline, equity barriers grow
*Claude Artifacts and Projects
*Agents, and Agent Teams of a million+
*Humanoid robots & self-driving cars
*AI Curricular integration
*Huge video and video-segmentation gains
*Writing Detectors — The final blow
*AI Unemployment, Student AI anxiety, and forward-thinking approaches
*Alternative assessments


Academic Fracking: When Publishers Sell Scholars Work to AI — from aiedusimplified.substack.com by Lance Eaton
Further discussion of publisher practices selling scholars’ work to AI companies

Last week, I explored AI and academic publishing in response to an article that came out a few weeks ago about a deal Taylor & Francis made to sell their books to Microsoft and one other AI company (unnamed) for a boatload of money.

Since then, two more pieces have been widely shared, including this piece from Inside Higher Ed by Kathryn Palmer (for which I was interviewed and in which I’m mentioned) and this piece from The Chronicle of Higher Ed by Christa Dutton. Both pieces try to cover the different sides, talking to authors, scanning the commentary online, finding some experts to consult, and talking to the publishers. It’s one of those things that can feel really important and also probably only to a very small number of folks who find themselves thinking about academic publishing, scholarly communication, and generative AI.


At the Crossroads of Innovation: Embracing AI to Foster Deep Learning in the College Classroom — from er.educause.edu by Dan Sarofian-Butin
AI is here to stay. How can we, as educators, accept this change and use it to help our students learn?

The Way Forward
So now what?

In one respect, we already have a partial answer. Over the last thirty years, there has been a dramatic shift from a teaching-centered to a learning-centered education model. High-impact practices, such as service learning, undergraduate research, and living-learning communities, are common and embraced because they help students see the real-world connections of what they are learning and make learning personal.

Therefore, I believe we must double down on a learning-centered model in the age of AI.

The first step is to fully and enthusiastically embrace AI.

The second step is to find the “jagged technological frontier” of using AI in the college classroom.




Futures Thinking in Education — from gettingsmart.com by Getting Smart Staff

Key Points

  • Educators should leverage these tools to prepare for rapid changes driven by technology, climate, and social dynamics.
  • Cultivating empathy for future generations can help educators design more impactful and forward-thinking educational practices.
 

Per the Rundown AI:

Why it matters: AI is slowly shifting from a tool we text/prompt with, to an intelligence that we collaborate, learn, and grow with. Advanced Voice Mode’s ability to understand and respond to emotions in real-time convos could also have huge use cases in everything from customer service to mental health support.

Also relevant/see:


Creators to Have Personalized AI Assistants, Meta CEO Mark Zuckerberg Tells NVIDIA CEO Jensen Huang — from blogs.nvidia.com by Brian Caulfield
Zuckerberg and Huang explore the transformative potential of open source AI, the launch of AI Studio, and exchange leather jackets at SIGGRAPH 2024.

“Every single restaurant, every single website will probably, in the future, have these AIs …” Huang said.

“…just like every business has an email address and a website and a social media account, I think, in the future, every business is going to have an AI,” Zuckerberg responded.

More broadly, the advancement of AI across a broad ecosystem promises to supercharge human productivity, for example, by giving every human on earth a digital assistant — or assistants — that they can interact with quickly and fluidly, allowing people to live richer lives.

Also related/see:


From DSC:
Today was a MUCH better day for Nvidia, however (up 12.81%). But it’s been very volatile over the last several weeks — as people and institutions ask where the ROIs are going to come from.






9 compelling reasons to learn how to use AI Chatbots — from interestingengineering.com by Atharva Gosavi
AI Chatbots are conversational agents that can act on your behalf and converse with humans – a futuristic novelty that is already getting people excited about its usage in improving efficiency.

7. Accessibility and inclusivity
Chatbots can be designed to support multiple languages and accessibility needs, making services more inclusive. They can cater to users with disabilities by providing voice interaction capabilities and simplifying access to information. Understanding how to develop inclusive chatbots can help you contribute to making technology more accessible to everyone, a crucial aspect in today’s diverse society.

8. Future-proofing your skills
AI and automation are the future of work. Building AI chatbots is a great way to future-proof your skill set, and given the rising trajectory of AI, it’ll be an in-demand skill in the market in the years to come. Staying ahead of technological trends is a great way to ensure you remain relevant and competitive in the job market.


Top 7 generative AI use cases for business — from cio.com by Grant Gross
Advanced chatbots, digital assistants, and coding helpers seem to be some of the sweet spots for gen AI use so far in business.

Many AI experts say the current use cases for generative AI are just the tip of the iceberg. More use cases will present themselves as gen AIs get more powerful and users get more creative with their experiments.

However, a handful of gen AI use cases are already bubbling up. Here’s a look at the most popular and promising.

 

The resistance to AI in education isn’t really about learning — from medium.com by Peter Shea


A quick comment first from DSC:
Peter Shea gives us some interesting perspectives here. His thoughts should give many of us fodder for our own further reflection.


This reaction underscores a deeper issue: the resistance to AI in education is not truly about learning. It reflects a reluctance to re-evaluate the traditional roles of educators and to embrace the opportunities AI offers to enhance the learning experience.

To thrive in the learning ecosystem that will evolve in the Age of AI, the teaching profession needs to do some difficult but essential re-evaluation of its role, in order to better understand where it can provide the best value to learners. This requires confronting some comforting myths and uncomfortable truths.

Problem #2: The Closed World of Academic Culture
In addition, many teachers have spent little time working in non-academic professions. This is especially true for college instructors, who must devote five to seven years to graduate education before obtaining their first full-time position, and thus have little time to explore careers outside academia. This common lack of non-academic work experience heightens the anxiety that educators feel when contemplating the potential impact of generative AI on their work lives.


Also see this related posting:

Majority of Grads Wish They’d Been Taught AI in College — from insidehighered.com by Lauren Coffey
A new survey shows 70 percent of graduates think generative AI should be incorporated into courses. More than half said they felt unprepared for the workforce.

A majority of college graduates believe generative artificial intelligence tools should be incorporated into college classrooms, with more than half saying they felt unprepared for the workforce, according to a new survey from Cengage Group, an education-technology company.

The survey, released today, found that 70 percent of graduates believe basic generative AI training should be integrated into courses; 55 percent said their degree programs did not prepare them to use the new technology tools in the workforce.

 

Researchers develop VR training to tackle racial disparity — from inavateonthenet.net

Researchers at the University of Illinois Urbana-Champaign have developed a VR training system for physicians, aimed at tackling racial and class health disparities.

“Ultimately, this virtual reality training system could become a viable tool for practicing communication with diverse patients across different types of health care professions. There’s no reason why nurses couldn’t also use this across different health care contexts — not just for Black maternal health, but chronic pain, diabetes or some of these other health issues in which we know that there are disparities based on markers of difference such as race or class.”

Two additional VR training modules are under development, aimed at promoting self-reflection by helping medical students to identify their own biases and learn how to mitigate them. The third module will focus on students practicing intercultural communication skills through interactions with a virtual patient, an approach that is seen by the researchers as more cost-effective than recruiting people for role playing with medical students.

 

From DSC:
I’ve often thought that VR could be used to help us walk in someone else’s shoes… to experience things as THEY experience things.

 


“Who to follow in AI” in 2024? [Part I] — from ai-supremacy.com by Michael Spencer [some of this posting is behind a paywall]
#1-20 [of 150] – I combed the internet, I found the best sources of AI insights, education and articles. LinkedIn | Newsletters | X | YouTube | Substack | Threads | Podcasts

Along these lines, also see:


AI In Medicine: 3 Future Scenarios From Utopia To Dystopia — from medicalfuturist.com by Andrea Koncz
There’s a vast difference between baseless fantasizing and realistic forward planning. Structured methodologies help us learn how to “dream well”.

Key Takeaways

  • We’re often told that daydreaming and envisioning the future is a waste of time. But this notion is misguided.
  • We all instinctively plan for the future in small ways, like organizing a trip or preparing for a dinner party. This same principle can be applied to larger-scale issues, and smart planning does bring better results.
  • We show you a method that allows us to think “well” about the future on a larger scale so that it better meets our needs.

Adobe Unveils Powerful New Innovations in Illustrator and Photoshop Unlocking New Design Possibilities for Creative Pros — from news.adobe.com

  • Latest Illustrator and Photoshop releases accelerate creative workflows, save pros time and empower designers to realize their visions faster
  • New Firefly-enabled features like Generative Shape Fill in Illustrator along with the Dimension Tool, Mockup, Text to Pattern, the Contextual Taskbar and performance enhancement tools accelerate productivity and free up time so creative pros can dive deeper into the parts of their work they love
  • Photoshop introduces all-new Selection Brush Tool and the general availability of Generate Image, Adjustment Brush Tool and other workflow enhancements empowering creators to make complex edits and unique designs


Nike is using AI to turn athletes’ dreams into shoes — from axios.com by Ina Fried

Zoom in: Nike used genAI for ideation, including using a variety of prompts to produce images with different textures, materials and color to kick off the design process.

What they’re saying: “It’s a new way for us to work,” Nike lead footwear designer Juliana Sagat told Axios during a media tour of the showcase on Tuesday.


AI meets ‘Do no harm’: Healthcare grapples with tech promises — from finance.yahoo.com by Maya Benjamin

Major companies are moving at high speed to capture the promises of artificial intelligence in healthcare while doctors and experts attempt to integrate the technology safely into patient care.

“Healthcare is probably the most impactful utility of generative AI that there will be,” Kimberly Powell, vice president of healthcare at AI hardware giant Nvidia (NVDA), which has partnered with Roche’s Genentech (RHHBY) to enhance drug discovery in the pharmaceutical industry, among other investments in healthcare companies, declared at the company’s AI Summit in June.


Mistral reignites this week’s LLM rivalry with Large 2 (source) — from superhuman.ai

Today, we are announcing Mistral Large 2, the new generation of our flagship model. Compared to its predecessor, Mistral Large 2 is significantly more capable in code generation, mathematics, and reasoning. It also provides a much stronger multilingual support, and advanced function calling capabilities.


Meta releases the biggest and best open-source AI model yet — from theverge.com by Alex Heath
Llama 3.1 outperforms OpenAI and other rivals on certain benchmarks. Now, Mark Zuckerberg expects Meta’s AI assistant to surpass ChatGPT’s usage in the coming months.

Back in April, Meta teased that it was working on a first for the AI industry: an open-source model with performance that matched the best private models from companies like OpenAI.

Today, that model has arrived. Meta is releasing Llama 3.1, the largest-ever open-source AI model, which the company claims outperforms GPT-4o and Anthropic’s Claude 3.5 Sonnet on several benchmarks. It’s also making the Llama-based Meta AI assistant available in more countries and languages while adding a feature that can generate images based on someone’s specific likeness. CEO Mark Zuckerberg now predicts that Meta AI will be the most widely used assistant by the end of this year, surpassing ChatGPT.


4 ways to boost ChatGPT — from wondertools.substack.com by Jeremy Caplan & The PyCoach
Simple tactics for getting useful responses

To help you make the most of ChatGPT, I’ve invited & edited today’s guest post from the author of a smart AI newsletter called The Artificial Corner. I appreciate how Frank Andrade pushes ChatGPT to produce better results with four simple, clever tactics. He offers practical examples to help us all use AI more effectively.

Frank Andrade: Most of us fail to make the most of ChatGPT.

  1. We omit examples in our prompts.
  2. We fail to assign roles to ChatGPT to guide its behavior.
  3. We let ChatGPT guess instead of providing it with clear guidance.

If you rely on vague prompts, learning how to create high-quality instructions will get you better results. It’s a skill often referred to as prompt engineering. Here are several techniques to get you to the next level.
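To make the three tactics concrete, here is a minimal sketch in Python. It assumes the widely used chat-message format (a list of role/content dictionaries); the `build_prompt` function and the example content are hypothetical illustrations, not from Frank Andrade's post, and no API call is made — the code only assembles the messages a chat model would receive.

```python
# Sketch of the three tactics: include examples, assign a role,
# and give explicit guidance instead of letting the model guess.

def build_prompt(role, examples, task, guidance):
    """Assemble a chat-style message list using all three tactics."""
    # Tactic 2: assign a role via the system message
    messages = [{"role": "system", "content": role}]
    # Tactic 1: include few-shot examples as prior user/assistant turns
    for question, ideal_answer in examples:
        messages.append({"role": "user", "content": question})
        messages.append({"role": "assistant", "content": ideal_answer})
    # Tactic 3: state the guidance explicitly in the final request
    messages.append({"role": "user", "content": f"{task}\n\nGuidelines: {guidance}"})
    return messages

prompt = build_prompt(
    role="You are an experienced instructional designer.",
    examples=[("Write a learning objective for a lesson on fractions.",
               "Students will be able to add fractions with unlike denominators.")],
    task="Write a learning objective for a lesson on photosynthesis.",
    guidance="One sentence, measurable verb, no jargon.",
)
print(len(prompt))  # 4 messages: system, example question, example answer, final task
```

The same message list can then be passed to whichever chat model you use; the point is that the role, the worked example, and the explicit guidelines all travel with the request rather than being left implicit.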

 

How to use Perplexity in your daily workflow — from ai-supremacy.com by Michael Spencer and Alex McFarland
“I barely use Google anymore (for anything)” says today’s guest author.

Make Perplexity your go-to research companion with these strategies:

  • Morning briefings: Start your day by asking Perplexity for the latest news in your field. (I personally like to use Perplexity to curate the top AI news of the day to consider writing about for Unite AI and Techopedia.)
  • Fact-checking: Use it to quickly verify information before including it in your work.
  • Brainstorming: Generate ideas for projects or content by asking open-ended questions.
  • Learning new concepts: When you encounter an unfamiliar term or idea, turn to Perplexity for a quick, comprehensive explanation.
  • Writing assistance: Use it to find relevant statistics, examples, or counterarguments for your content.

From DSC:
So I used it to ask “What is Elon Musk’s Memphis Supercluster?” (which went online yesterday I believe) — which resulted in this reply:

Elon Musk’s Memphis Supercluster is a newly activated AI training cluster that is claimed to be the most powerful in the world. Here are the key details about this supercomputer:

1. Location: The supercluster is located in Memphis, Tennessee[1][2].

2. Hardware: It consists of 100,000 liquid-cooled Nvidia H100 GPUs connected through a single RDMA (Remote Direct Memory Access) fabric[1][3].

3. Purpose: The supercluster is designed for training large language models (LLMs) and other advanced AI technologies for Musk’s xAI company[1][2].

4. Activation: The Memphis Supercluster began training at approximately 4:20 AM local time on July 22, 2024[1][3].

5. Collaboration: The project is a result of collaboration between xAI, X (formerly Twitter), Nvidia, and other supporting companies[1][2].

6. Investment: With each H100 GPU estimated to cost between $30,000 to $40,000, the total investment in GPUs alone is estimated to be between $3 billion to $4 billion[5].

7. Goals: Musk claims that this supercluster will be used to develop “the world’s most powerful AI by every measure” by December 2024[1].

8. Comparison: The Memphis Supercluster’s 100,000 H100 GPUs significantly outclass other supercomputers in terms of GPU horsepower, such as Frontier (37,888 AMD GPUs) and Microsoft Eagle (14,400 Nvidia H100 GPUs)[3].

9. Infrastructure: The project required significant infrastructure development, including fiber optic networking[5].

While Musk’s claims about the supercluster’s capabilities are ambitious, it remains to be seen how it will perform in practice and whether it will meet the stated goals within the given timeframe[1].

Citations:
[1] https://venturebeat.com/ai/elon-musk-announces-most-powerful-ai-training-cluster-in-the-world/
[2] https://www.tuaw.com/2024/07/22/elon-musks-xai-unveils-worlds-most-powerful-ai-training-cluster/
[3] https://www.tomshardware.com/pc-components/gpus/elon-musk-fires-up-the-most-powerful-ai-training-cluster-in-the-world-uses-100000-nvidia-h100-gpus-on-a-single-fabric
[4] https://www.thestack.technology/xai-elon-musk-ai-supercomputer/
[5] https://www.benzinga.com/news/24/07/39881748/elon-musks-xai-flips-the-switch-on-100k-nvidia-h100-gpus-worth-up-to-4b-in-memphis-supercluster-most
[6] https://www.youtube.com/watch?v=0JhtJa05C2Y
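
The investment figure in item 6 of Perplexity's reply is easy to sanity-check. A back-of-envelope calculation, using only the per-GPU range quoted above:

```python
# Back-of-envelope check of item 6: 100,000 H100 GPUs at an
# estimated $30,000 to $40,000 each.
gpus = 100_000
low, high = 30_000, 40_000

cost_low = gpus * low    # low end of the estimate
cost_high = gpus * high  # high end of the estimate
print(f"${cost_low / 1e9:.1f}B to ${cost_high / 1e9:.1f}B")  # $3.0B to $4.0B
```

The arithmetic matches the $3 billion to $4 billion range Perplexity reported.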

—–
Elon’s AI empire expands — from theneurondaily.com by Grant Harvey
Elon Musk’s team at xAI just powered on the “World’s Most Powerful AI Training Cluster.”
If you don’t know what a supercluster is, it’s basically a massive network of Nvidia GPUs (computer chips) working together as a single unit to solve “super” complex calculations at unprecedented speeds.

And this Memphis Supercluster is the most “super” supercluster we’ve ever seen. The new facility, dubbed the “Gigafactory of Compute”, is a beast:

  • 100,000 liquid-cooled Nvidia H100 GPUs on a single RDMA fabric (for context, Google snagged only 50,000 H100 GPUs last year).
  • A power draw of up to 150 megawatts, enough for roughly 100K homes.
  • At least one million gallons of water per day to keep cool!
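
The "100K homes" comparison checks out under a common assumption of roughly 1.2 kW average household draw (about 10,500 kWh per year); that assumed figure is mine, not the newsletter's:

```python
# Sanity check of "150 megawatts is enough for ~100K homes",
# assuming an average household draw of ~1.2 kW (an assumption,
# roughly 10,500 kWh per year).
total_watts = 150e6
avg_home_watts = 1.2e3

homes = total_watts / avg_home_watts
print(f"{homes:,.0f} homes")  # 125,000 homes
```

So 150 MW supports on the order of 100,000 average homes, consistent with the claim.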

What to expect: Better models, more frequently. That’s been the trend, at least—look at how the last few model releases have become more squished together. 


OpenAI to make GPT-4o Advanced Voice available by the end of the month to select group of users — from tomsguide.com by Ryan Morrison

GPT-4o Advanced Voice is an entirely new type of voice assistant, similar to but larger than the recently unveiled French model Moshi, which argued with me over a story.

In demos of the model, we’ve seen GPT-4o Advanced Voice create custom character voices, generate sound effects while telling a story and even act as a live translator.

This native speech ability is a significant step in creating more natural AI assistants. In the future, it will also come with live vision abilities, allowing the AI to see what you see.


Could AGI break the world? — from theneurondaily.com by Noah Edelman

“Biggest IT outage in history” proves we’re not ready for AGI.

Here’s the TL;DR: a faulty software update from cybersecurity firm CrowdStrike made this happen:

  • Grounded 5,000+ flights around the world.
  • Slowed healthcare across the UK.
  • Forced retailers to revert to cash-only transactions in Australia (what is this, the stone ages?!).


Here’s where AI comes in: Imagine today’s AI as a new operating system. In 5-10 years, it’ll likely be as integrated into our economy as Microsoft’s cloud servers are now. This isn’t that far-fetched—Microsoft is already planning to embed AI into all its programs.

So what if a CrowdStrike-like incident happens with a more powerful AI system? Some experts predict an AI-powered IT outage could be 10x worse than Friday’s fiasco.


The Crowdstrike outage and global software’s single-point failure problem — from cnbc.com by Kaya Ginsky

KEY POINTS

  • The CrowdStrike software bug that took down global IT infrastructure exposed a single-point-of-failure risk unrelated to malicious cyberattack.
  • National security and cybersecurity experts say the risk of this kind of technical outage is increasing alongside the risk of hacks, and the market will need to adopt better competitive practices.
  • Government is also likely to look at new regulations related to software updates and patches.

The “largest IT outage in history,” briefly explained — from vox.com by Li Zhou
Airlines, banks, and hospitals saw computer systems go down because of a CrowdStrike software glitch.

 

The race to deploy GenAI in the legal sector — from sifted.eu by Kai Nicol-Schwarz
LegalFly’s €15m Series A is the latest in a string of raises for European GenAI legaltech startups

Speak to any lawyer and you’ll soon discover that the job is a far cry from the fevered excitement of a courtroom drama. Behind the scenes, there’s an endless stream of laborious, typically manual tasks, such as drafting, reviewing and negotiating contracts and other legal documents, that have to be done daily.

It was this realisation that led four product managers at dating app giant Tinder, frustrated by what they saw as a lack of AI adoption at the company, to jump ship and found Belgium-based LegalFly last year. The startup is building a generative AI copilot for lawyers which eventually, it says, will be able to automate entire workflows in the legal profession.

“We were looking at what GenAI was good at, which is synthesising data and generating content,” says founder and CEO Ruben Miessen. “What industry works like that? Law, and it does it all in a very manual way.”

“The legal industry is a global behemoth that’s seen minimal innovation since the advent of Microsoft Word in the 90s,” says Carina Namih, partner at Plural. “GenAI — especially with a human in the loop to keep accuracy high — is ideally suited to drafting, editing and negotiating legal documents.”


Legal Technology Company Relativity Announces OpenAI ChatGPT Integration — from lawfuel.com

CHICAGO– July 18 – Relativity, a global legal technology company, today announced it is integrating with OpenAI’s ChatGPT Enterprise Compliance API. The integration adds ChatGPT Enterprise as a data source for Collect in RelativityOne, allowing users to seamlessly collect and process human-to-AI conversational data.

“The future around human and AI interaction is changing rapidly, calling for innovative legal data management solutions to include novel data sources, such as conversations with AI agents,” said Chris Brown, Chief Product Officer at Relativity. “In answering that call, we are committed to equipping our community with the tools they need to traverse the evolving future of human-to-AI conversational data and putting users in control of this new data landscape.”

 

What aspects of teaching should remain human? — from hechingerreport.org by Chris Berdik
Even techno optimists hesitate to say teaching is best left to the bots, but there’s a debate about where to draw the line

ATLANTA — Science teacher Daniel Thompson circulated among his sixth graders at Ron Clark Academy on a recent spring morning, spot checking their work and leading them into discussions about the day’s lessons on weather and water. He had a helper: As Thompson paced around the class, peppering them with questions, he frequently turned to a voice-activated AI to summon apps and educational videos onto large-screen smartboards.

When a student asked, “Are there any animals that don’t need water?” Thompson put the question to the AI. Within seconds, an illustrated blurb about kangaroo rats appeared before the class.

Nitta said there’s something “deeply profound” about human communication that allows flesh-and-blood teachers to quickly spot and address things like confusion and flagging interest in real time.


Deep Learning: Five New Superpowers of Higher Education — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
How Deep Learning is Transforming Higher Education

While the traditional model of education is entrenched, emerging technologies like deep learning promise to shake its foundations and usher in an age of personalized, adaptive, and egalitarian education. It is expected to have a significant impact across higher education in several key ways.

…deep learning introduces adaptivity into the learning process. Unlike a typical lecture, deep learning systems can observe student performance in real-time. Confusion over a concept triggers instant changes to instructional tactics. Misconceptions are identified early and remediated quickly. Students stay in their zone of proximal development, constantly challenged but never overwhelmed. This adaptivity prevents frustration and stagnation.
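
The adaptivity loop described above can be sketched as a simple difficulty controller. This is an illustrative toy, not any real adaptive-learning product; the thresholds and step size are arbitrary placeholders.

```python
# Illustrative sketch of the adaptive loop: keep the learner in their
# zone of proximal development by raising difficulty when they succeed
# and lowering it when they struggle. Thresholds and step size are
# arbitrary placeholders, not from any real system.

def adjust_difficulty(level, recent_scores, low=0.5, high=0.85, step=0.1):
    """Return a new difficulty in [0, 1] based on recent performance."""
    if not recent_scores:
        return level
    success_rate = sum(recent_scores) / len(recent_scores)
    if success_rate > high:        # too easy: raise the challenge
        level += step
    elif success_rate < low:       # confusion detected: remediate
        level -= step
    return min(1.0, max(0.0, level))

level = 0.5
for scores in [[1, 1, 1], [1, 1, 0], [0, 0, 1]]:  # 1 = correct, 0 = wrong
    level = adjust_difficulty(level, scores)
print(round(level, 2))
```

A real system would infer "confusion" from far richer signals than right/wrong answers, but the control loop, observe performance, adjust instruction, repeat, is the core of what makes these systems adaptive.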


InstructureCon 24 Conference Notes — from onedtech.philhillaa.com by Glenda Morgan
Another solid conference from the market leader, even with unclear roadmap

The new stuff: AI
Instructure rolled out multiple updates and improvements – more than last year. These included many AI-based or focused tools and services as well as some functional improvements. I’ll describe the AI features first.

Sal Khan was a surprise visitor to the keynote stage to announce the September availability of the full suite of AI-enabled Khanmigo Teacher Tools for Canvas users. The suite includes 20 tools, such as tools to generate lesson plans and quiz questions and write letters of recommendation. Next year, they plan to roll out tools for students themselves to use.

Other AI-based features include:

    • Discussion tool summaries and AI-generated responses…
    • Translation of inbox messages and discussions…
    • Smart search …
    • Intelligent Insights…

 

 

School 3.0: Reimagining Education in 2026, 2029, and 2034 — from davidborish.com by David Borish

The landscape of education is on the brink of a profound transformation, driven by rapid advancements in artificial intelligence. This shift was highlighted recently by Andrej Karpathy’s announcement of Eureka Labs, a venture aimed at creating an “AI-native” school. As we look ahead, it’s clear that the integration of AI in education will reshape how we learn, teach, and think about schooling altogether.

Traditional textbooks will begin to be replaced by interactive, AI-powered learning materials that adapt in real-time to a student’s progress.

As we approach 2029, the line between physical and virtual learning environments will blur significantly.

Curriculum design will become more flexible and personalized, with AI systems suggesting learning pathways based on each student’s interests, strengths, and career aspirations.

The boundaries between formal education and professional development will blur, creating a continuous learning ecosystem.

 

Introducing Eureka Labs — “We are building a new kind of school that is AI native.” — by Andrej Karpathy, Previously Director of AI @ Tesla, founding team @ OpenAI

However, with recent progress in generative AI, this learning experience feels tractable. The teacher still designs the course materials, but they are supported, leveraged and scaled with an AI Teaching Assistant who is optimized to help guide the students through them. This Teacher + AI symbiosis could run an entire curriculum of courses on a common platform. If we are successful, it will be easy for anyone to learn anything, expanding education in both reach (a large number of people learning something) and extent (any one person learning a large amount of subjects, beyond what may be possible today unassisted).


After Tesla and OpenAI, Andrej Karpathy’s startup aims to apply AI assistants to education — from techcrunch.com by Rebecca Bellan

Andrej Karpathy, former head of AI at Tesla and researcher at OpenAI, is launching Eureka Labs, an “AI native” education platform. In tech speak, that usually means built from the ground up with AI at its core. And while Eureka Labs’ AI ambitions are lofty, the company is starting with a more traditional approach to teaching.

San Francisco-based Eureka Labs, which Karpathy registered as an LLC in Delaware on June 21, aims to leverage recent progress in generative AI to create AI teaching assistants that can guide students through course materials.


What does it mean for students to be AI-ready? — from timeshighereducation.com by David Joyner
Not everyone wants to be a computer scientist, a software engineer or a machine learning developer. We owe it to our students to prepare them with a full range of AI skills for the world they will graduate into, writes David Joyner

We owe it to our students to prepare them for this full range of AI skills, not merely the end points. The best way to fulfil this responsibility is to acknowledge and examine this new category of tools. More and more tools that students use daily – word processors, email, presentation software, development environments and more – have AI-based features. Practising with these tools is a valuable exercise for students, so we should not prohibit that behaviour. But at the same time, we do not have to just shrug our shoulders and accept however much AI assistance students feel like using.


Teachers say AI usage has surged since the school year started — from eschoolnews.com by Laura Ascione
Half of teachers report an increase in the use of AI and continue to seek professional learning

Fifty percent of educators reported an increase in AI usage, by both students and teachers, over the 2023–24 school year, according to The 2024 Educator AI Report: Perceptions, Practices, and Potential, from Imagine Learning, a digital curriculum solutions provider.

The report offers insight into how teachers’ perceptions of AI use in the classroom have evolved since the start of the 2023–24 school year.


OPINION: What teachers call AI cheating, leaders in the workforce might call progress — from hechingerreport.org by C. Edward Waston and Jose Antonio Bowen
Authors of a new guide explore what AI literacy might look like in a new era

Excerpt (emphasis DSC):

But this very ease has teachers wondering how we can keep our students motivated to do the hard work when there are so many new shortcuts. Learning goals, curriculums, courses and the way we grade assignments will all need to be reevaluated.

The new realities of work also must be considered. A shift in employers’ job postings rewards those with AI skills. Many companies report already adopting generative AI tools or anticipate incorporating them into their workflow in the near future.

A core tension has emerged: Many teachers want to keep AI out of our classrooms, but also know that future workplaces may demand AI literacy.

What we call cheating, business could see as efficiency and progress.

It is increasingly likely that using AI will emerge as an essential skill for students, regardless of their career ambitions, and that action is required of educational institutions as a result.


Teaching Writing With AI Without Replacing Thinking: 4 Tips — by Erik Ofgang
AI has a lot of potential for writing students, but we can’t let it replace the thinking parts of writing, says writing professor Steve Graham

Reconciling these two goals — having AI help students learn to write more efficiently without hijacking the cognitive benefits of writing — should be a key goal of educators. Finding the ideal balance will require more work from both researchers and classroom educators, but Graham shares some initial tips for doing this currently.




Why I ban AI use for writing assignments — from timeshighereducation.com by James Stacey Taylor
Students may see handwriting essays in class as a needlessly time-consuming approach to assignments, but I want them to learn how to engage with arguments, develop their own views and convey them effectively, writes James Stacey Taylor

Could they use AI to generate objections to the arguments they read? Of course. AI does a good job of summarising objections to Singer’s view. But I don’t want students to parrot others’ objections. I want them to think of objections themselves. 

Could AI be useful for them in organising their exegesis of others’ views and their criticisms of them? Yes. But, again, part of what I want my students to learn is precisely what this outsources to the AI: how to organise their thoughts and communicate them effectively. 


How AI Will Change Education — from digitalnative.tech by Rex Woodbury
Predicting Innovation in Education, from Personalized Learning to the Downfall of College 

This week explores how AI will bleed into education, looking at three segments of education worth watching, then examining which business models will prevail.

  1. Personalized Learning and Tutoring
  2. Teacher Tools
  3. Alternatives to College
  4. Final Thoughts: Business Models and Why Education Matters

New Guidance from TeachAI and CSTA Emphasizes Computer Science Education More Important than Ever in an Age of AI — from csteachers.org by CSTA
The guidance features new survey data and insights from teachers and experts in computer science (CS) and AI, informing the future of CS education.

SEATTLE, WA – July 16, 2024 – Today, TeachAI, led by Code.org, ETS, the International Society for Technology in Education (ISTE), Khan Academy, and the World Economic Forum, launches a new initiative in partnership with the Computer Science Teachers Association (CSTA) to support and empower educators as they grapple with the growing opportunities and risks of AI in computer science (CS) education.

The briefs draw on early research and insights from CSTA members, organizations in the TeachAI advisory committee, and expert focus groups to address common misconceptions about AI and offer a balanced perspective on critical issues in CS education, including:

  • Why is it Still Important for Students to Learn to Program?
  • How Are Computer Science Educators Teaching With and About AI?
  • How Can Students Become Critical Consumers and Responsible Creators of AI?
 


The race against time to reinvent lawyers — from jordanfurlong.substack.com by Jordan Furlong
Our legal education and licensing systems produce one kind of lawyer. The legal market of the near future will need another kind. If we can’t close this gap fast, we’ll have a very serious problem.

Excerpt (emphasis DSC):

Lawyers will still need competencies like legal reasoning and analysis, statutory and contractual interpretation, and a range of basic legal knowledge. But it’s unhelpful to develop these skills through activities that lawyers won’t be performing much longer, while neglecting to provide them with other skills and prepare them for other situations that they will face. Our legal education and licensing systems are turning out lawyers whose competence profiles simply won’t match up with what people will need lawyers to do.

A good illustration of what I mean can be found in an excellent recent podcast from the Practising Law Institute, “Shaping the Law Firm Associate of the Future.” Over the course of the episode, moderator Jennifer Leonard of Creative Lawyers asked Professors Alice Armitage of UC Law San Francisco and Heidi K. Brown of New York Law School to identify some of the competencies that newly called lawyers and law firm associates are going to need in future. Here’s some of what they came up with:

  • Agile, nimble, extrapolative thinking
  • Collaborative, cross-disciplinary learning
  • Entrepreneurial, end-user-focused mindsets
  • Generative AI knowledge (“Their careers will be shaped by it”)
  • Identifying your optimal individual workflow
  • Iteration, learning by doing, and openness to failure
  • Leadership and interpersonal communication skills
  • Legal business know-how, including client standards and partner expectations
  • Receiving and giving feedback to enhance effectiveness

Legal Tech for Legal Departments – What In-House Lawyers Need to Know — from legal.thomsonreuters.com by Sterling Miller

Whatever the reason, you must understand the problem inside and out. Here are the key points to understanding your use case:

  • Identify the problem.
  • What is the current manual process to solve the problem?
  • Is there technology that will replace this manual process and solve the problem?
  • What will it cost and do you have (or can you get) the budget?
  • Will the benefits of the technology outweigh the cost? And how soon will those benefits pay off the cost? In other words, what is the return on investment?
  • Do you have the support of the organization to buy it (inside the legal department and elsewhere, e.g., CFO, CTO)?
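
The last two cost questions in the checklist boil down to a payback calculation. A minimal sketch, with all figures as hypothetical placeholders:

```python
# The ROI question reduces to: how many months until cumulative
# savings cover the annual cost? All figures below are hypothetical.

def payback_months(annual_cost, hours_saved_per_month, hourly_value):
    """Months until cumulative savings cover one year's cost."""
    monthly_savings = hours_saved_per_month * hourly_value
    return annual_cost / monthly_savings

months = payback_months(
    annual_cost=24_000,          # license + implementation, per year
    hours_saved_per_month=20,    # manual work eliminated
    hourly_value=200,            # loaded cost of in-house counsel time
)
print(f"{months:.1f} months")  # 6.0 months
```

If the payback period comes in well under a year, the budget conversation with the CFO becomes much easier; if it stretches past the contract term, the use case probably isn't ready.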

2024-05-13: Of Legal AI — from emergentbehavior.co

Long discussion with a senior partner at a major Bay Area law firm:

Takeaways

A) They expect legal AI to decimate the profession…
B) Unimpressed by most specific legal AI offerings…
C) Generative AI error rates are acceptable even at 10–20%…
D) The future of corporate law is in-house…
E) The future of law in general?…
F) Of one large legal AI player…


2024 Legal Technology Survey Results — from lexology.com

Additional findings of the annual survey include:

  • 77 percent of firms have a formal technology strategy in place
  • Interest and intentions regarding generative A.I. remain high, with almost 80 percent of participating firms expecting to leverage it within the next five years. Many have either already begun or are planning to undertake data hygiene projects as a precursor to using generative A.I. and other automation solutions. Although legal market analysts have hypothesized that building proprietary generative A.I. solutions remains out of reach for mid-sized firms, several Meritas survey respondents are gaining traction. Many other firms are also licensing third-party generative A.I. solutions.
  • The survey showed strong technology progression among several Meritas member firms, with most adopting a tech stack of core, foundational systems of infrastructure technology and adding cloud-based practice management, document management, time, billing, and document drafting applications.
  • Most firms reported increased adoption and utilization of options already available within their current core systems, such as Microsoft Office 365 Teams, SharePoint, document automation, and other native functionalities for increasing efficiencies; these functions were used more often in place of dedicated purpose-built solutions such as comparison and proofreading tools.
  • The legal technology market serving Meritas’ member firms continues to be fractured, with very few providers emerging as market leaders.

AI Set to Save Professionals 12 Hours Per Week by 2029 — from legalitprofessionals.com

Thomson Reuters, a global content and technology company, today released its 2024 Future of Professionals report, an annual survey of more than 2,200 professionals working across legal, tax, and risk & compliance fields globally. Respondents predicted that artificial intelligence (AI) has the potential to save them 12 hours per week in the next five years, or four hours per week over the upcoming year – equating to 200 hours annually.

This timesaving potential is the equivalent productivity boost of adding an extra colleague for every 10 team members on staff. Harnessing the power of AI across various professions opens immense economic opportunities. For a U.S. lawyer, this could translate to an estimated $100,000 in additional billable hours.*
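
The report's figures are internally consistent under common assumptions. Assuming roughly 50 working weeks per year and a 40-hour week (both my assumptions, not stated in the report):

```python
# Checking the report's arithmetic, assuming ~50 working weeks per
# year and a 40-hour week (assumptions, not from the report).
hours_per_week_saved = 4
working_weeks = 50

annual_hours = hours_per_week_saved * working_weeks
print(annual_hours)                    # 200 hours, as stated

implied_rate = 100_000 / annual_hours  # what $100K in billables implies
print(implied_rate)                    # $500/hour

extra_staff_ratio = hours_per_week_saved / 40
print(extra_staff_ratio)               # 0.1 -> one colleague per 10 staff
```

The $100,000 figure thus implies a $500/hour billable rate, and the 4 saved hours out of a 40-hour week is exactly the "one extra colleague per 10 team members" equivalence the report describes.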

 
© 2024 | Daniel Christian