From DSC:
As I can’t embed his posting, I’m copying/pasting Jeff’s posting on LinkedIn:


According to Flighty, I logged more than 2,220 flight miles in the last 5 days traveling to three conferences to give keynotes and spend time with housing officers in Milwaukee, college presidents on Mackinac Island, MI, and enrollment and marketing leaders in Raleigh.

Before I rest, I wanted to post some quick thoughts about what I learned. Thank you to everyone who shared their wisdom these past few days:

  • We need to think about the “why” and “how” of AI in higher ed. The “why” shouldn’t be just because everyone else is doing it. Rather, the “why” is to reposition higher ed for a different future of competitors. The “how” shouldn’t be just to seek efficiency and cut jobs. Rather, we should use AI to learn from its users and create a better experience going forward.
  • Residence halls are not just infrastructure. They are part and parcel of the student experience and critical to student success. Almost half of students living on campus say it increases their sense of belonging, according to research by the Association of College & University Housing Officers.
  • How do we extend the “residential experience”? More than half of traditional undergraduates who live on campus now take at least one course online. As students increasingly spend time off campus – or move off campus as early as their second year in college – we need to help them continue to make the connections they would make in a dorm. Why? 47% of college students believe living in a college residence hall enhanced their ability to resolve conflicts.
  • Career must be at the core of the student experience for colleges to thrive in the future, says Andy Chan. Yes, some people might see that as too narrow a view of higher ed or might not want to provide cogs for the wheel of the workforce, but without the job, none of the other benefits of college follow: citizenship, health, engagement.
  • A “triple threat grad” – someone who has an internship, a semester-long project, and an industry credential (think Salesforce or Adobe) in addition to their degree – matters more in the job market than major or institution, says Brandon Busteed.
  • Every faculty member should think of themselves as an ambassador for the institution. Yes, care about their discipline/department, but that doesn’t survive if the rest of the institution falls down around them.
  • Presidents need to place bigger bets rather than spread pennies and dimes across a bunch of new strategies. That means they need to stop doing some things in order to free up resources.
  • Higher ed needs a new business model. Institutions can’t make money from tuition alone, and new products, like certificates, are pennies on the dollar compared to degrees.
  • Boards aren’t ready for the future. They are over-indexed on philanthropy and alumni and under-indexed on the expertise needed to lead higher ed.

From DSC:
As I can’t embed his posting, I’m copying/pasting Jeff’s posting on LinkedIn:


It’s the stat that still gnaws at me: 62%.

That’s the percentage of high school graduates going right on to college. A decade ago it was around 70%. So for all the bellyaching about the demographic cliff in higher ed, just imagine if we were still close to that 70% number today. We’d be talking about a few hundred thousand more students in the system.

As I told a gathering of presidents of small colleges and universities last night on Mackinac Island — the first time I had to take [numerous modes of transportation] to get to a conference — being small isn’t distinctive anymore.

There are many reasons undergrad enrollment is down, but they all come down to two interrelated trends: jobs and affordability.

The job has become so central to what students want out of the experience. It’s almost as if colleges now need to guarantee a job.

These institutions will need to rethink the learner relationship with work. Instead of college with work on the side, we might need to move to more of a mindset of work with college on the side by:

  • Making campus jobs more meaningful. Why can’t accounting and finance majors work in the CFO’s office, and liberal arts majors work in IT on platforms such as Salesforce and Workday – skills needed in the workplace?
  • Apprenticeships are not just for the trades anymore. Integrate work-based learning into the undergrad experience in a much bigger way than internships and even co-ops.
  • Credentials within the degree. Every graduate should leave college with not just a BA but also a certified credential in areas like data viz, project management, the Adobe suite, Alteryx, etc.
  • The curriculum needs to be more flexible for students to combine work and learning — not only for the experience but also money for college — so more availability of online courses, hybrid courses, and flexible semesters.

How else can we think about learning and earning?


 

The Musician’s Rule and GenAI in Education — from opencontent.org by David Wiley

We have to provide instructors the support they need to leverage educational technologies like generative AI effectively in the service of learning. Given the amount of benefit that could accrue to students if powerful tools like generative AI were used effectively by instructors, it seems unethical not to provide instructors with professional development that helps them better understand how learning occurs and what effective teaching looks like. Without more training and support for instructors, the amount of student learning higher education will collectively “leave on the table” will only increase as generative AI gets more and more capable. And that’s a problem.

From DSC:
As is often the case, David put together a solid posting here. A few comments/reflections on it:

  • I agree that more training/professional development is needed, especially regarding generative AI. This would help achieve a far greater ROI and impact.
  • The pace of change makes it difficult to see where the sand is settling…and thus what to focus on.
  • The Teaching & Learning Groups out there are also trying to learn and grow in their knowledge (so that they can train others)
  • The administrators out there are also trying to figure out what all of this generative AI stuff is all about; and so are the faculty members. It takes time for educational technologies’ impact to roll out and be integrated into how people teach.
  • As we’re talking about multiple disciplines here, I think we need more team-based content creation and delivery.
  • There needs to be more research on how best to use AI — again, it would be helpful if the sand settled a bit first, so as not to waste time and $$. But then that research needs to be piped into the classrooms far better.

We need to take more of the research from learning science and apply it in our learning spaces.

 

Brazil hires OpenAI to cut costs of court battles — from reuters.com by Marcela Ayres and Bernardo Caram

BRASILIA, June 10 (Reuters) – Brazil’s government is hiring OpenAI to expedite the screening and analysis of thousands of lawsuits using artificial intelligence (AI), trying to avoid costly court losses that have weighed on the federal budget.

The AI service will flag to government the need to act on lawsuits before final decisions, mapping trends and potential action areas for the solicitor general’s office (AGU).


Gen AI Cut Lawyers’ Drafting Time in Half, UK’s Ashurst Says — from news.bloomberglaw.com by Isabel Gottlieb

The firm’s findings included:

  • Using generative AI tools, lawyers saved 45% of the time they said it would otherwise have taken them to write a first draft of a legal brief—about 2.5 hours saved per briefing draft, the firm said in a release. They also saved 80% of the time it would have taken them to draft UK corporate filings “requiring review and extraction of information from articles of association”; and 59% of the time it would have taken them to draft reports about industries and sectors using company filings.
  • Output produced by generative AI was at least as accurate, or legally correct, as a human lawyer’s first draft in 67% of cases. Human output had higher average scores for accuracy, and the AI content had greater variations than human-created content in how accurate it was.
  • The panel was unable to identify half of the AI-created output as coming from AI, but it made no mistakes in identifying human-created content.

 

 

The Impact of Technology on Modern Legal Services — from smdailyjournal.com

In an ever-evolving digital landscape, the fusion of technology and legal services has ushered in a new era of efficiency, accessibility, and innovation. The traditional image of legal professionals buried in stacks of paperwork and endless research has been transformed by cutting-edge technologies that promise to revolutionize how legal services are delivered, accessed, and executed. From artificial intelligence to blockchain, cloud computing to automation, the impact of technology on modern legal services is palpable and profound.

Technology Trends Shaping Legal Services
The legal industry is experiencing a seismic shift driven by technology, with key trends reshaping legal services. Artificial Intelligence (AI) is revolutionizing legal research, document analysis, and predictive analytics, enabling legal professionals to streamline their workflow and deliver more accurate and timely insights to clients. Blockchain technology improves the safety and transparency of legal transactions, while cloud computing optimizes data storage and accessibility in the legal sector.


Clients Care About Legal Tech: Dig Into Legal Tech and Tech Related Careers. — from legaltalknetwork.com by Dan Lear and Adriana Linares
A new survey proves clients care about a lawyer’s tech skills. Hear about adding the latest tech and about emerging jobs in the legal tech field.

A new survey finds that clients care deeply about their attorney’s tech tools and tech skills. The numbers don’t lie: Legal tech matters. An efficient, integrated system is no longer “nice to have.” It’s table stakes, from case management to client communications to online filing and billing. 

 

 

Microsoft teams with Khan Academy to make its AI tutor free for K-12 educators and will develop a Phi-3 math model — from venturebeat.com by Ken Yeung

Microsoft is partnering with Khan Academy in a multifaceted deal to demonstrate how AI can transform the way we learn. The cornerstone of today’s announcement centers on Khan Academy’s Khanmigo AI agent. Microsoft says it will migrate the bot to its Azure OpenAI Service, enabling the nonprofit educational organization to provide all U.S. K-12 educators free access to Khanmigo.

In addition, Microsoft plans to use its Phi-3 model to help Khan Academy improve math tutoring and collaborate to generate more high-quality learning content while making more courses available within Microsoft Copilot and Microsoft Teams for Education.


One-Third of Teachers Have Already Tried AI, Survey Finds — from the74million.org by Kevin Mahnken
A RAND poll released last month finds English and social studies teachers embracing tools like ChatGPT.

One in three American teachers have used artificial intelligence tools in their teaching at least once, with English and social studies teachers leading the way, according to a RAND Corporation survey released last month. While the new technology isn’t yet transforming how kids learn, both teachers and district leaders expect that it will become an increasingly common feature of school life.


Professors Try ‘Restrained AI’ Approach to Help Teach Writing — from edsurge.com by Jeffrey R. Young
Can ChatGPT make human writing more efficient, or is writing an inherently time-consuming process best handled without AI tools?

This article is part of the guide: For Education, ChatGPT Holds Promise — and Creates Problems.

When ChatGPT emerged a year and a half ago, many professors immediately worried that their students would use it as a substitute for doing their own written assignments — that they’d click a button on a chatbot instead of doing the thinking involved in responding to an essay prompt themselves.

But two English professors at Carnegie Mellon University had a different first reaction: They saw in this new technology a way to show students how to improve their writing skills.

“They start really polishing way too early,” Kaufer says. “And so what we’re trying to do is with AI, now you have a tool to rapidly prototype your language when you are prototyping the quality of your thinking.”

He says the concept is based on writing research from the 1980s that shows that experienced writers spend about 80 percent of their early writing time thinking about whole-text plans and organization and not about sentences.


On Building AI Models for Education — from aieducation.substack.com by Claire Zau
Google’s LearnLM, Khan Academy/MSFT’s Phi-3 Models, and OpenAI’s ChatGPT Edu

This piece primarily breaks down how Google’s LearnLM was built, and takes a quick look at Microsoft/Khan Academy’s Phi-3 and OpenAI’s ChatGPT Edu as alternative approaches to building an “education model” (not necessarily a new model in the latter case, but we’ll explain). Thanks to the public release of their 86-page research paper, we have the most comprehensive view into LearnLM. Our understanding of Microsoft/Khan Academy small language models and ChatGPT Edu is limited to the information provided through announcements, leaving us with less “under the hood” visibility into their development.


AI tutors are quietly changing how kids in the US study, and the leading apps are from China — from techcrunch.com by Rita Liao

Answer AI is among a handful of popular apps that are leveraging the advent of ChatGPT and other large language models to help students with everything from writing history papers to solving physics problems. Of the top 20 education apps in the U.S. App Store, five are AI agents that help students with their school assignments, including Answer AI, according to data from Data.ai on May 21.


Is your school behind on AI? If so, there are practical steps you can take for the next 12 months — from stefanbauschard.substack.com by Stefan Bauschard

If your school (district) or university has not yet made significant efforts to think about how you will prepare your students for a World of AI, I suggest the following steps:

July 24 – Administrator PD & AI Guidance
In July, administrators should receive professional development on AI, if they haven’t already. This should include…

August 24 – Professional Development for Teachers and Staff…
Fall 24 – Parents; Co-curricular; Classroom experiments…
December 24 – Revision to Policy…


New ChatGPT Version Aiming at Higher Ed — from insidehighered.com by Lauren Coffey
ChatGPT Edu, emerging after initial partnerships with several universities, is prompting both cautious optimism and worries.

OpenAI unveiled a new version of ChatGPT focused on universities on Thursday, building on work with a handful of higher education institutions that partnered with the tech giant.

The ChatGPT Edu product, expected to start rolling out this summer, is a platform for institutions intended to give students free access. OpenAI said the artificial intelligence (AI) toolset could be used for an array of education applications, including tutoring, writing grant applications and reviewing résumés.

 

Khan Academy and Microsoft partner to expand access to AI tools that personalize teaching and help make learning fun — from news.microsoft.com

[On 5/21/24] at Microsoft Build, Microsoft and Khan Academy announced a new partnership that aims to bring these time-saving and lesson-enhancing AI tools to millions of educators. By donating access to Azure AI-optimized infrastructure, Microsoft is enabling Khan Academy to offer all K-12 educators in the U.S. free access to the pilot of Khanmigo for Teachers, which will now be powered by Azure OpenAI Service.

The two companies will also collaborate to explore opportunities to improve AI tools for math tutoring in an affordable, scalable and adaptable way with a new version of Phi-3, a family of small language models (SLMs) developed by Microsoft.

 

Also see/referenced:

Khanmigo – a free, AI-powered teaching assistant


Also relevant/see:

Khan Academy and Microsoft are teaming up to give teachers a free AI assistant — from fastcompany.com by Steven Melendez
AI assistant Khanmigo can help time-strapped teachers come up with lesson ideas and test questions, the companies say.

Khan Academy’s AI assistant, Khanmigo, has earned praise for helping students to understand and practice everything from math to English, but it can also help teachers devise lesson plans, formulate questions about assigned readings, and even generate reading passages appropriate for students at different levels. More than just a chatbot, the software offers specific AI-powered tools for generating quizzes and assignment instructions, drafting lesson plans, and formulating letters of recommendation.

Having a virtual teaching assistant is especially valuable in light of recent research from the RAND Corporation that found teachers work longer hours than most working adults, which includes administrative and prep work outside the classroom.

 

Grasp is the world’s first generative AI platform for finance professionals.

We build domain-specific AI systems that address the complex needs of investment bankers and management consultants.

By automating finance workflows, Grasp dramatically increases employee productivity and satisfaction.

 

Introducing Copilot+ PCs — from blogs.microsoft.com

[On May 20th], at a special event on our new Microsoft campus, we introduced the world to a new category of Windows PCs designed for AI, Copilot+ PCs.

Copilot+ PCs are the fastest, most intelligent Windows PCs ever built. With powerful new silicon capable of an incredible 40+ TOPS (trillion operations per second), all-day battery life and access to the most advanced AI models, Copilot+ PCs will enable you to do things you can’t on any other PC. Easily find and remember what you have seen in your PC with Recall, generate and refine AI images in near real-time directly on the device using Cocreator, and bridge language barriers with Live Captions, translating audio from 40+ languages into English.

From DSC:
As a first, shoot-from-the-hip take, Recall seems fraught with potential security/privacy-related issues. But what do I know? The Neuron states “Microsoft assures that everything Recall sees remains private.” Ok…


From The Rundown AI concerning the above announcements:

The details:

  • A new system enables Copilot+ PCs to run AI workloads up to 20x faster and 100x more efficiently than traditional PCs.
  • Windows 11 has been rearchitected specifically for AI, integrating the Copilot assistant directly into the OS.
  • New AI experiences include a new feature called Recall, which allows users to search for anything they’ve seen on their screen with natural language.
  • Copilot’s new screen-sharing feature allows AI to watch, hear, and understand what a user is doing on their computer and answer questions in real-time.
  • Copilot+ PCs will start at $999, and ship with OpenAI’s latest GPT-4o models.

Why it matters: Tony Stark’s all-powerful JARVIS AI assistant is getting closer to reality every day. Once Copilot, ChatGPT, Project Astra, or anyone else can not only respond but start executing tasks autonomously, things will start getting really exciting — and likely initiate a whole new era of tech work.


 


2024 EDUCAUSE Horizon Report® Teaching and Learning Edition

Trends
As a first activity, we asked the Horizon panelists to provide input on the macro trends they believe are going to shape the future of postsecondary teaching and learning and to provide observable evidence for those trends. To ensure an expansive view of the larger trends serving as context for institutions of higher education, panelists provided input across five trend categories: social, technological, economic, environmental, and political. Given the widespread impacts of emerging AI technologies on higher education, we are also including in this year’s report a list of “honorary trends” focused on AI. After several rounds of voting, the panelists selected the following trends as the most important:

 

Shares of two big online education stocks tank more than 10% as students use ChatGPT — from cnbc.com by Michelle Fox; via Robert Gibson on LinkedIn

The rapid rise of artificial intelligence appears to be taking a toll on the shares of online education companies Chegg and Coursera.

Both stocks sank by more than 10% on Tuesday after issuing disappointing guidance in part because of students using AI tools such as ChatGPT from OpenAI.



Synthetic Video & AI Professors — from drphilippahardman.substack.com by Dr. Philippa Hardman
Are we witnessing the emergence of a new, post-AI model of async online learning?

TLDR: by effectively tailoring the learning experience to the learner’s comprehension levels and preferred learning modes, AI can enhance the overall learning experience, leading to increased “stickiness” and higher rates of performance in assessments.

TLDR: AI enables us to scale responsive, personalised “always on” feedback and support in a way that might help to solve one of the most wicked problems of online async learning – isolation and, as a result, disengagement.

In the last year we have also seen the rise of an unprecedented number of “always on” AI tutors, built to provide coaching and feedback when and how learners need it.

Perhaps the most well-known example is Khan Academy’s Khanmigo and its GPT sidekick Tutor Me. We’re also seeing similar tools emerge in K12 and Higher Ed where AI is being used to extend the support and feedback provided for students beyond the physical classroom.


Our Guidance on School AI Guidance document has been updated — from stefanbauschard.substack.com by Stefan Bauschard

We’ve updated the free 72-page document we wrote to help schools design their own AI guidance policies.

There are a few key updates.

  1. Inclusion of Oklahoma and significant updates from North Carolina and Washington.
  2. More specifics on implementation — thanks NC and WA!
  3. A bit more on instructional redesign. Thanks to NC for getting this party started!

Creating a Culture Around AI: Thoughts and Decision-Making — from er.educause.edu by Courtney Plotts and Lorna Gonzalez

Given the potential ramifications of artificial intelligence (AI) diffusion on matters of diversity, equity, inclusion, and accessibility, now is the time for higher education institutions to adopt culturally aware, analytical decision-making processes, policies, and practices around AI tools selection and use.

 

[Report] The Top 100 AI for Work – April 2024 — from flexos.work; with thanks to Daan van Rossum for this resource
AI is helping us work up to 41% more effectively, according to recent Bain research. We review the platforms to consider for ourselves and our teams.

Following our AI Top 150, we spent the past few weeks analyzing data on the top AI platforms for work. This report shares key insights, including the AI tools you should consider adopting to work smarter, not harder.

While there is understandable concern about AI in the work context, the platforms in this list paint a different picture. It shows a future of work where people can do what humans are best suited for while offloading repetitive, digital tasks to AI.

This will fuel the notion that it’s not AI that takes your job but a supercharged human with an army of AI tools and agents. This should be a call to action for every working person and business leader reading this.

 


How Early Adopters of Gen AI Are Gaining Efficiencies — from knowledge.wharton.upenn.edu by Prasanna (Sonny) Tambe and Scott A. Snyder; via Ray Schroeder on LinkedIn
Enterprises are seeing gains from generative AI in productivity and strategic planning, according to speakers at a recent Wharton conference.

Its unique strengths in translation, summation, and content generation are especially useful in processing unstructured data. Some 80% of all new data in enterprises is unstructured, he noted, citing research firm Gartner. Very little of that unstructured data that resides in places like emails “is used effectively at the point of decision making,” he noted. “[With gen AI], we have a real opportunity” to garner new insights from all the information that resides in emails, team communication platforms like Slack, and agile project management tools like Jira, he said.


6 YouTube Channels to Stay Up to Date with AI — from heaigirl.substack.com by Diana Dovgopol
Here are some cool AI YouTube channels.

Here are 6 YouTube channels I watch to stay up to date with AI. This list will be useful whether you’re a casual AI enthusiast or an experienced programmer.

1. Matt Wolfe: AI for non-coders
This is a fast-growing YouTube channel focused on artificial intelligence for non-coders. On this channel, you’ll find videos about ChatGPT, Midjourney, and any AI tool that’s gaining popularity.


Top AI mobile apps, Stable Video 3D, & my AI film workflow — by Heather Cooper
Plus 1-Click 3D animation and other cool AI tools

#3 Photomath
Photomath is a comprehensive math help app that provides step-by-step explanations for a wide range of math problems, from elementary to college level. Photomath is only available as a mobile app. (link)

Features:

  • Get step-by-step solutions with multiple methods to choose from
  • Scan any math problem, including word problems, using the app’s camera
  • Access custom visual aids and extra “how” and “why” tips for deeper understanding

Google researchers unveil ‘VLOGGER’, an AI that can bring still photos to life — from venturebeat.com by Michael Nuñez

Google researchers have developed a new artificial intelligence system that can generate lifelike videos of people speaking, gesturing and moving — from just a single still photo. The technology, called VLOGGER, relies on advanced machine learning models to synthesize startlingly realistic footage, opening up a range of potential applications while also raising concerns around deepfakes and misinformation.



What We Risk By Automating Tasks We Loathe — from marcwatkins.substack.com by Marc Watkins

I’m fascinated by the potential of these tools to augment and enhance our work and creativity. There’s no denying the impressive capabilities we’re already seeing with text generation, image creation, coding assistance, and more. Used thoughtfully, AI can be a powerful productivity multiplier.

At the same time, I have significant concerns about the broader implications of this accelerating technology, especially for education and society at large. We’re traversing new ground at a breakneck pace, and it’s crucial that we don’t blindly embrace AI without considering the potential risks.

My worry is that by automating away too many tasks, even seemingly rote ones like creating slide decks, we risk losing something vital—humanity at the heart of knowledge work.


Nvidia Introduces AI Nurses — from wireprompt.substack.com | Weekly AI Report from WirePrompt

Nvidia has announced a partnership with Hippocratic AI to introduce AI “agents” aimed at replacing nurses in hospitals. These AI “nurses” come at a significantly low cost compared to human nurses and are purportedly intended to address staffing issues by handling “low-risk,” patient-facing tasks via video calls. However, concerns are raised regarding the ethical implications and effectiveness of replacing human nurses with AI, particularly given the complex nature of medical care.



16 Changes to the Way Enterprises Are Building and Buying Generative AI — from a16z.com by Sarah Wang and Shangda Xu

TABLE OF CONTENTS

  • Resourcing: budgets are growing dramatically and here to stay
  • Models: enterprises are trending toward a multi-model, open source world
  • Use cases: more migrating to production
  • Size of total opportunity: massive and growing quickly

 

Which AI should I use? Superpowers and the State of Play — by Ethan Mollick
And then there were three

For over a year, GPT-4 was the dominant AI model, clearly much smarter than any of the other LLM systems available. That situation has changed in the last month: there are now three GPT-4 class models, each powering its own chatbot – GPT-4 (accessible through ChatGPT Plus or Microsoft’s Copilot), Anthropic’s Claude 3 Opus, and Google’s Gemini Advanced.

Where we stand
We are in a brief period in the AI era where there are now multiple leading models, but none has yet definitively beaten the GPT-4 benchmark set over a year ago. While this may represent a plateau in AI abilities, I believe this is likely to change in the coming months as, at some point, models like GPT-5 and Gemini 2.0 will be released. In the meantime, you should be using a GPT-4 class model and using it often enough to learn what it does well. You can’t go wrong with any of them, pick a favorite and use it…

From DSC:
Here’s a powerful quote from Ethan:

In fact, in my new book I postulate that you haven’t really experienced AI until you have had three sleepless nights of existential anxiety, after which you can start to be productive again.


Using AI for Immersive Educational Experiences — from automatedteach.com by Graham Clay
Realistic video brings course content to life but requires AI literacy.

For us, I think the biggest promise of AI tools like Sora — that can create video with ease — is that they lower the cost of immersive educational experiences. This increases the availability of these experiences, expanding their reach to student populations who wouldn’t otherwise have them, whether due to time, distance, or expense.

Consider the profound impact on a history class, where students are transported to California during the gold rush through hyperrealistic video sequences. This vivifies the historical content and cultivates a deeper connection with the material.

In fact, OpenAI has already demonstrated the promise of this sort of use case, with a very simple prompt producing impressive results…


The Empathy Illusion: How AI Agents Could Manipulate Students — from marcwatkins.substack.com by Marc Watkins

Take this scenario. A student misses a class and, within twenty minutes, receives a series of texts and even a voicemail from a very concerned and empathic-sounding voice wanting to know what’s going on. Of course, the text is entirely generated, and the voice is synthetic as well, but the student likely doesn’t know this. To them, communication isn’t something as easy to miss or brush off as an email. It sounds like someone who cares is talking to them.

But let’s say that isn’t enough. By that evening, the student still hadn’t logged into their email or checked the LMS. The AI’s strategic reasoning is communicating with the predictive AI and analyzing the pattern of behavior against students who succeed or fail vs. students who are ill. The AI tracks the student’s movements on campus, monitors their social media usage, and deduces the student isn’t ill and is blowing off class.

The AI agent resumes communication with the student. But this time, the strategic AI adopts a different persona, not the kind and empathetic persona used for the initial contact, but a stern, matter-of-fact one. The student’s phone buzzes with alerts that talk about scholarships being lost, teachers being notified, etc. The AI anticipates the excuses the student will use and presents evidence tracking the student’s behavior to show they are not sick.


Not so much focused on learning ecosystems, but still worth mentioning:

The top 100 Gen AI Consumer Apps — from a16z.com / andreessen horowitz by Olivia Moore



The Edtech Insiders Rundown of SXSW EDU 2024 — from edtechinsiders.substack.com by Ben Kornell, Alex Sarlin, and Sarah Morin
And more on our ASU + GSV Happy Hour, GenAI in edtech market valuations, and interviews from The Common Sense Summit.

Theme 1: The Kids Are Not Alright
This year’s SXSW EDU had something for everyone, with over a dozen “Program Tracks.” However, the one theme that truly connected the entire conference was mental health.

36 sessions were specifically tagged with mental health and wellness, but in sessions on topics ranging from literacy to edtech to civic engagement, presenters continued to come back again and again to the mental health crisis amongst teens and young adults.

Theme 2: Aye AI, Captain
Consistent with past conferences, this year’s event leaned in on the K12 education world. As expected, one of the hottest topics for K12 was the role of AI (or lack thereof) in schools. Key takeaways included…


AI Literacy: A New Graduation Requirement and Civic Imperative — from gettingsmart.com by Tom Vander Ark and Mason Pashia

Key Points

  • There is still time to ensure that all of your students graduate with an understanding of how AI works, why it is important and how to best use it.

What would it look like to make a commitment that, come graduation, every senior will have at least basic AI literacy? This includes an appreciation of AI as a creation engine and learning partner, but also an understanding of the risks of deepfakes and biased curation. We’re entering a time where, to quote Ethan Mollick, “You can’t trust anything you read or see ever again.” Whether formal or informal, it’s time to start building AI literacy.


New Alabama Education Law Represents Small But Significant Advance — from forbes.com by Jeanne Allen

Valiant Cross Academy raises the bar for young men.

More than 50 years later, across the street from the church and concerned with declining education and the pace of social change, brothers Anthony and Fred Brock founded Valiant Cross Academy, an all-male academy aimed at “helping boys of color become men of valor.”

Valiant Cross embodies King’s hopes, pursuing the dream that its students will be judged by the content of their character, not the color of their skin, and working to ensure that they are well prepared for productive lives filled with accomplishment and purpose.

“We’re out to prove that it’s an opportunity gap, not an achievement gap,” says head of school Anthony Brock. And they have. In 2022, 100 percent of Valiant seniors graduated from the academy and pursued post-graduate options, enrolling in four- or two-year colleges or in established career-training programs.




[Report] Generative AI Top 150: The World’s Most Used AI Tools (Feb 2024) — from flexos.work by Daan van Rossum
FlexOS.work surveyed Generative AI platforms to reveal which get used most. While ChatGPT reigns supreme, countless AI platforms are used by millions.

1. ChatGPT Reigns Supreme.
As the FlexOS research study “Generative AI at Work” concluded based on a survey amongst knowledge workers, ChatGPT reigns supreme.

2. AI Tool Usage is Way Higher Than People Expect – Beating Netflix, Pinterest, Twitch.
As measured by data analysis platform Similarweb based on global web traffic tracking, the AI tools in this list generate over 3 billion monthly visits.

With 1.67 billion visits, ChatGPT represents over half of this traffic and is already bigger than Netflix, Microsoft, Pinterest, Twitch, and The New York Times.



Artificial Intelligence Act: MEPs adopt landmark law — from europarl.europa.eu

  • Safeguards on general purpose artificial intelligence
  • Limits on the use of biometric identification systems by law enforcement
  • Bans on social scoring and AI used to manipulate or exploit user vulnerabilities
  • Right of consumers to launch complaints and receive meaningful explanations


The untargeted scraping of facial images from CCTV footage to create facial recognition databases will be banned.


A New Surge in Power Use Is Threatening U.S. Climate Goals — from nytimes.com by Brad Plumer and Nadja Popovich
A boom in data centers and factories is straining electric grids and propping up fossil fuels.

Something unusual is happening in America. Demand for electricity, which has stayed largely flat for two decades, has begun to surge.

Over the past year, electric utilities have nearly doubled their forecasts of how much additional power they’ll need by 2028 as they confront an unexpected explosion in the number of data centers, an abrupt resurgence in manufacturing driven by new federal laws, and millions of electric vehicles being plugged in.


OpenAI and the Fierce AI Industry Debate Over Open Source — from bloomberg.com by Rachel Metz

The tumult could seem like a distraction from the startup’s seemingly unending march toward AI advancement. But the tension, and the latest debate with Musk, illuminates a central question for OpenAI, along with the tech world at large as it’s increasingly consumed by artificial intelligence: Just how open should an AI company be?

The meaning of the word “open” in “OpenAI” seems to be a particular sticking point for both sides — something that you might think sounds, on the surface, pretty clear. But actual definitions are both complex and controversial.


Researchers develop AI-driven tool for near real-time cancer surveillance — from medicalxpress.com by Mark Alewine; via The Rundown AI
Artificial intelligence has delivered a major win for pathologists and researchers in the fight for improved cancer treatments and diagnoses.

In partnership with the National Cancer Institute, or NCI, researchers from the Department of Energy’s Oak Ridge National Laboratory and Louisiana State University developed a long-sequenced AI transformer capable of processing millions of pathology reports to provide experts researching cancer diagnoses and management with exponentially more accurate information on cancer reporting.


© 2024 | Daniel Christian