School 3.0: Reimagining Education in 2026, 2029, and 2034 — from davidborish.com by David Borish

The landscape of education is on the brink of a profound transformation, driven by rapid advancements in artificial intelligence. This shift was highlighted recently by Andrej Karpathy’s announcement of Eureka Labs, a venture aimed at creating an “AI-native” school. As we look ahead, it’s clear that the integration of AI in education will reshape how we learn, teach, and think about schooling altogether.

Traditional textbooks will begin to be replaced by interactive, AI-powered learning materials that adapt in real-time to a student’s progress.

As we approach 2029, the line between physical and virtual learning environments will blur significantly.

Curriculum design will become more flexible and personalized, with AI systems suggesting learning pathways based on each student’s interests, strengths, and career aspirations.

The boundaries between formal education and professional development will blur, creating a continuous learning ecosystem.

 


And while we’re having some fun, also see the following items:


 

Introducing Eureka Labs — “We are building a new kind of school that is AI native.” — by Andrej Karpathy, Previously Director of AI @ Tesla, founding team @ OpenAI

However, with recent progress in generative AI, this learning experience feels tractable. The teacher still designs the course materials, but they are supported, leveraged and scaled with an AI Teaching Assistant who is optimized to help guide the students through them. This Teacher + AI symbiosis could run an entire curriculum of courses on a common platform. If we are successful, it will be easy for anyone to learn anything, expanding education in both reach (a large number of people learning something) and extent (any one person learning a large amount of subjects, beyond what may be possible today unassisted).


After Tesla and OpenAI, Andrej Karpathy’s startup aims to apply AI assistants to education — from techcrunch.com by Rebecca Bellan

Andrej Karpathy, former head of AI at Tesla and researcher at OpenAI, is launching Eureka Labs, an “AI native” education platform. In tech speak, that usually means built from the ground up with AI at its core. And while Eureka Labs’ AI ambitions are lofty, the company is starting with a more traditional approach to teaching.

San Francisco-based Eureka Labs, which Karpathy registered as an LLC in Delaware on June 21, aims to leverage recent progress in generative AI to create AI teaching assistants that can guide students through course materials.


What does it mean for students to be AI-ready? — from timeshighereducation.com by David Joyner
Not everyone wants to be a computer scientist, a software engineer or a machine learning developer. We owe it to our students to prepare them with a full range of AI skills for the world they will graduate into, writes David Joyner

We owe it to our students to prepare them for this full range of AI skills, not merely the end points. The best way to fulfil this responsibility is to acknowledge and examine this new category of tools. More and more tools that students use daily – word processors, email, presentation software, development environments and more – have AI-based features. Practising with these tools is a valuable exercise for students, so we should not prohibit that behaviour. But at the same time, we do not have to just shrug our shoulders and accept however much AI assistance students feel like using.


Teachers say AI usage has surged since the school year started — from eschoolnews.com by Laura Ascione
Half of teachers report an increase in the use of AI and continue to seek professional learning

Fifty percent of educators reported an increase in AI usage, by both students and teachers, over the 2023–24 school year, according to The 2024 Educator AI Report: Perceptions, Practices, and Potential, from Imagine Learning, a digital curriculum solutions provider.

The report offers insight into how teachers’ perceptions of AI use in the classroom have evolved since the start of the 2023–24 school year.


OPINION: What teachers call AI cheating, leaders in the workforce might call progress — from hechingerreport.org by C. Edward Watson and Jose Antonio Bowen
Authors of a new guide explore what AI literacy might look like in a new era

Excerpt (emphasis DSC):

But this very ease has teachers wondering how we can keep our students motivated to do the hard work when there are so many new shortcuts. Learning goals, curriculums, courses and the way we grade assignments will all need to be reevaluated.

The new realities of work also must be considered. A shift in employers’ job postings rewards those with AI skills. Many companies report already adopting generative AI tools or anticipate incorporating them into their workflow in the near future.

A core tension has emerged: Many teachers want to keep AI out of our classrooms, but also know that future workplaces may demand AI literacy.

What we call cheating, business could see as efficiency and progress.

It is increasingly likely that using AI will emerge as an essential skill for students, regardless of their career ambitions, and that action is required of educational institutions as a result.


Teaching Writing With AI Without Replacing Thinking: 4 Tips — by Erik Ofgang
AI has a lot of potential for writing students, but we can’t let it replace the thinking parts of writing, says writing professor Steve Graham

Reconciling these two goals — having AI help students learn to write more efficiently without hijacking the cognitive benefits of writing — should be a key goal of educators. Finding the ideal balance will require more work from both researchers and classroom educators, but Graham shares some initial tips for doing this currently.




Why I ban AI use for writing assignments — from timeshighereducation.com by James Stacey Taylor
Students may see handwriting essays in class as a needlessly time-consuming approach to assignments, but I want them to learn how to engage with arguments, develop their own views and convey them effectively, writes James Stacey Taylor

Could they use AI to generate objections to the arguments they read? Of course. AI does a good job of summarising objections to Singer’s view. But I don’t want students to parrot others’ objections. I want them to think of objections themselves. 

Could AI be useful for them in organising their exegesis of others’ views and their criticisms of them? Yes. But, again, part of what I want my students to learn is precisely what this outsources to the AI: how to organise their thoughts and communicate them effectively. 


How AI Will Change Education — from digitalnative.tech by Rex Woodbury
Predicting Innovation in Education, from Personalized Learning to the Downfall of College 

This week explores how AI will bleed into education, looking at three segments of education worth watching, then examining which business models will prevail.

  1. Personalized Learning and Tutoring
  2. Teacher Tools
  3. Alternatives to College
  4. Final Thoughts: Business Models and Why Education Matters

New Guidance from TeachAI and CSTA Emphasizes Computer Science Education More Important than Ever in an Age of AI — from csteachers.org by CSTA
The guidance features new survey data and insights from teachers and experts in computer science (CS) and AI, informing the future of CS education.

SEATTLE, WA – July 16, 2024 – Today, TeachAI, led by Code.org, ETS, the International Society for Technology in Education (ISTE), Khan Academy, and the World Economic Forum, launches a new initiative in partnership with the Computer Science Teachers Association (CSTA) to support and empower educators as they grapple with the growing opportunities and risks of AI in computer science (CS) education.

The briefs draw on early research and insights from CSTA members, organizations in the TeachAI advisory committee, and expert focus groups to address common misconceptions about AI and offer a balanced perspective on critical issues in CS education, including:

  • Why is it Still Important for Students to Learn to Program?
  • How Are Computer Science Educators Teaching With and About AI?
  • How Can Students Become Critical Consumers and Responsible Creators of AI?
 

OpenAI illegally barred staff from airing safety risks, whistleblowers say — from washingtonpost.com by Pranshu Verma, Cat Zakrzewski, and Nitasha Tiku
In a letter exclusively obtained by The Washington Post, whistleblowers asked the SEC to probe company’s allegedly restrictive non-disclosure agreements

OpenAI whistleblowers have filed a complaint with the Securities and Exchange Commission alleging the artificial intelligence company illegally prohibited its employees from warning regulators about the grave risks its technology may pose to humanity, calling for an investigation.

The whistleblowers said OpenAI issued its employees overly restrictive employment, severance and nondisclosure agreements that could have led to penalties against workers who raised concerns about OpenAI to federal regulators, according to a seven-page letter sent to the SEC commissioner earlier this month that referred to the formal complaint. The letter was obtained exclusively by The Washington Post.

 

What to Know About Buying A Projector for School — by Luke Edwards
Buy the right projector for school with these helpful tips and guidance.

Picking the right projector for school can be a tough decision as the types and prices range pretty widely. From affordable options to professional grade pricing, there are many choices. The problem is that the performance is also hugely varied. This guide aims to be the solution by offering all you need to know about buying the right projector for school where you are.

Luke covers a variety of topics including:

  • Types of projectors
  • Screen quality
  • Light type
  • Connectivity
  • Pricing

From DSC:
I posted this because Luke covered a variety of topics — and if you’re set on going with a projector, this is a solid article. But I hesitated to post this, as I’m not sure of the place that projectors will have in the future of our learning spaces. With voice-enabled apps and appliances continuing to be more prevalent — along with the presence of AI-based human-computer interactions and intelligent systems — will projectors be the way to go? Will enhanced interactive whiteboards be the way to go? Will there be new types of displays? I’m not sure. Time will tell.

 


The race against time to reinvent lawyers — from jordanfurlong.substack.com by Jordan Furlong
Our legal education and licensing systems produce one kind of lawyer. The legal market of the near future will need another kind. If we can’t close this gap fast, we’ll have a very serious problem.

Excerpt (emphasis DSC):

Lawyers will still need competencies like legal reasoning and analysis, statutory and contractual interpretation, and a range of basic legal knowledge. But it’s unhelpful to develop these skills through activities that lawyers won’t be performing much longer, while neglecting to provide them with other skills and prepare them for other situations that they will face. Our legal education and licensing systems are turning out lawyers whose competence profiles simply won’t match up with what people will need lawyers to do.

A good illustration of what I mean can be found in an excellent recent podcast from the Practising Law Institute, “Shaping the Law Firm Associate of the Future.” Over the course of the episode, moderator Jennifer Leonard of Creative Lawyers asked Professors Alice Armitage of UC Law San Francisco and Heidi K. Brown of New York Law School to identify some of the competencies that newly called lawyers and law firm associates are going to need in future. Here’s some of what they came up with:

  • Agile, nimble, extrapolative thinking
  • Collaborative, cross-disciplinary learning
  • Entrepreneurial, end-user-focused mindsets
  • Generative AI knowledge (“Their careers will be shaped by it”)
  • Identifying your optimal individual workflow
  • Iteration, learning by doing, and openness to failure
  • Leadership and interpersonal communication skills
  • Legal business know-how, including client standards and partner expectations
  • Receiving and giving feedback to enhance effectiveness

Legal Tech for Legal Departments – What In-House Lawyers Need to Know — from legal.thomsonreuters.com by Sterling Miller

Whatever the reason, you must understand the problem inside and out. Here are the key points for understanding your use case:

  • Identify the problem.
  • What is the current manual process to solve the problem?
  • Is there technology that will replace this manual process and solve the problem?
  • What will it cost and do you have (or can you get) the budget?
  • Will the benefits of the technology outweigh the cost? And how soon will those benefits pay off the cost? In other words, what is the return on investment?
  • Do you have the support of the organization to buy it (inside the legal department and elsewhere, e.g., CFO, CTO)?
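
To make the cost/benefit and ROI questions in the checklist above concrete, here is a minimal back-of-the-envelope sketch in Python. All figures are hypothetical placeholders, not numbers from the article; the point is simply that payback period and ROI fall out of a few inputs.

```python
# Hypothetical back-of-the-envelope ROI sketch for a legal-tech purchase.
# All inputs are made-up placeholders; swap in your own estimates.

annual_license_cost = 50_000   # vendor subscription, per year (USD)
one_time_setup_cost = 20_000   # implementation, training, data migration
hours_saved_per_month = 80     # manual work eliminated by the tool
blended_hourly_cost = 150      # loaded cost of the staff doing that work

annual_benefit = hours_saved_per_month * 12 * blended_hourly_cost
first_year_cost = annual_license_cost + one_time_setup_cost

roi = (annual_benefit - first_year_cost) / first_year_cost
payback_months = first_year_cost / (hours_saved_per_month * blended_hourly_cost)

print(f"Annual benefit:  ${annual_benefit:,.0f}")
print(f"First-year cost: ${first_year_cost:,.0f}")
print(f"First-year ROI:  {roi:.0%}")
print(f"Payback period:  {payback_months:.1f} months")
```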

2024-05-13: Of Legal AI — from emergentbehavior.co

Long discussion with a senior partner at a major Bay Area law firm:

Takeaways

A) They expect legal AI to decimate the profession…
B) Unimpressed by most specific legal AI offerings…
C) Generative AI error rates are acceptable even at 10–20%…
D) The future of corporate law is in-house…
E) The future of law in general?…
F) Of one large legal AI player…


2024 Legal Technology Survey Results — from lexology.com

Additional findings of the annual survey include:

  • 77 percent of firms have a formal technology strategy in place
  • Interest and intentions regarding generative A.I. remain high, with almost 80 percent of participating firms expecting to leverage it within the next five years. Many have either already begun or are planning to undertake data hygiene projects as a precursor to using generative A.I. and other automation solutions. Although legal market analysts have hypothesized that proprietary building of generative A.I. solutions remains out of reach for mid-sized firms, several Meritas survey respondents are gaining traction. Many other firms are also licensing third-party generative A.I. solutions.
  • The survey showed strong technology progression among several Meritas member firms, with most adopting a tech stack of core, foundational systems of infrastructure technology and adding cloud-based practice management, document management, time, billing, and document drafting applications.
  • Most firms reported increased adoption and utilization of options already available within their current core systems, such as Microsoft Office 365 Teams, SharePoint, document automation, and other native functionalities for increasing efficiencies; these functions were used more often in place of dedicated purpose-built solutions such as comparison and proofreading tools.
  • The legal technology market serving Meritas’ member firms continues to be fractured, with very few providers emerging as market leaders.

AI Set to Save Professionals 12 Hours Per Week by 2029 — from legalitprofessionals.com

Thomson Reuters, a global content and technology company, today released its 2024 Future of Professionals report, an annual survey of more than 2,200 professionals working across legal, tax, and risk & compliance fields globally. Respondents predicted that artificial intelligence (AI) has the potential to save them 12 hours per week in the next five years, or four hours per week over the upcoming year – equating to 200 hours annually.

This timesaving potential is the equivalent productivity boost of adding an extra colleague for every 10 team members on staff. Harnessing the power of AI across various professions opens immense economic opportunities. For a U.S. lawyer, this could translate to an estimated $100,000 in additional billable hours.*
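
As a rough check on the arithmetic behind those headline figures (the report's exact assumptions, marked by the asterisk, are not reproduced in the excerpt), here is a short sketch under assumed values. The billing rate and the number of working weeks are assumptions for illustration, not figures from Thomson Reuters.

```python
# Rough reconstruction of the headline arithmetic in the excerpt.
# The billing rate and working weeks are assumptions, not report figures.

hours_saved_per_week_next_year = 4
working_weeks_per_year = 50        # assumption behind "200 hours annually"
assumed_billing_rate = 500         # USD/hour, hypothetical U.S. lawyer average

annual_hours_saved = hours_saved_per_week_next_year * working_weeks_per_year
print(annual_hours_saved)                         # 200 hours per year
print(annual_hours_saved * assumed_billing_rate)  # ~$100,000 in potential billable time
```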

 


Higher Education Has Not Been Forgotten by Generative AI — from insidehighered.com by Ray Schroeder
The generative AI (GenAI) revolution has not ignored higher education; a whole host of tools are available now and more revolutionary tools are on the way.

Some of the apps that have been developed for general use can be customized for specific topical areas in higher ed. For example, I created a version of GPT, “Ray’s EduAI Advisor,” that builds on the current GPT-4o version with specific updates and perspectives on AI in higher education. It is freely available to users. With a few tools and no knowledge of programming, anyone can build their own GPT to supplement information for their classes or interest groups.
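
Ray's bot was built with OpenAI's no-code GPT builder, but the same idea can be expressed in a few lines of code. Below is a minimal, hypothetical sketch (assuming the `openai` Python package and an API key) of a course-specific assistant that layers custom instructions on top of a general model; it illustrates the concept and is not Ray's actual configuration.

```python
# Minimal sketch of a course-specific AI assistant: a general model plus
# custom instructions. Hypothetical example; not "Ray's EduAI Advisor" itself.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

COURSE_INSTRUCTIONS = """You are an advisor on AI in higher education.
Ground answers in current practice (personalized learning, tutoring systems,
assessment, accessibility, analytics) and clearly flag speculation as such."""

def ask(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": COURSE_INSTRUCTIONS},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask("How might AI change assessment in large online courses?"))
```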

Excerpts from Ray’s EduAI Advisor bot:

AI’s global impact on higher education, particularly in at-scale classes and degree programs, is multifaceted, encompassing several key areas:
1. Personalized Learning…
2. Intelligent Tutoring Systems…
3. Automated Assessment…
4. Enhanced Accessibility…
5. Predictive Analytics…
6. Scalable Virtual Classrooms…
7. Administrative Efficiency…
8. Continuous Improvement…

Instructure and Khan Academy Announce Partnership to Enhance Teaching and Learning With Khanmigo, the AI Tool for Education — from instructure.com
Shiren Vijiasingam and Jody Sailor make an exciting announcement about a new partnership sure to make a difference in education everywhere.

 

Rethinking Legal Ops Skills: Generalists Versus Specialists — from abovethelaw.com by Silvie Tucker and Brandi Pack
This ongoing conversation highlights the changing demands on legal ops practitioners.

A thought-provoking discussion is unfolding in the legal operations community regarding one intriguing question: Should legal operations professionals strive to be generalists or specialists?

The conversation is timely as the marketplace consolidates and companies grapple with the best way to fill valuable and limited headcount allotments. It also highlights the evolving landscape of legal operations and the changing demands on its practitioners.

The Evolution of Legal Ops
Over the past decade, the field of legal operations has undergone significant transformation. Initially focused strictly on streamlining processes and reducing costs, the role has expanded to include various responsibilities driven by technological advancements and heightened industry expectations. Key areas of expansion include:


He added: “I have for long been of the view – for decades – that AI will be a vital tool in overcoming the access to justice challenge. Existing and emerging technologies are now very promising.”

Richard Susskind


In A First for Law Practice Management Platforms, Clio Rolls Out An Integrated E-Filing Service in Texas — from lawnext.com by Bob Ambrogi

Last October, during its annual Clio Cloud Conference, the law practice management company Clio announced its plan to roll out an e-filing service, called Clio File, during 2024, starting with Texas, which would make it the first law practice management platform with built-in e-filing. Today, it delivered on that promise, launching Clio File for e-filing in Texas courts.

“Lawyers can now seamlessly submit court documents directly from our flagship practice management product, Clio Manage, streamlining their workflows and simplifying the filing process,” said Chris Stock, vice president of legal content and migrations at Clio. “This is an exciting step in expanding the capabilities of our platform, providing a comprehensive solution for legal documents, from drafting to court filing.”


Just-Launched Quench Uses Gen AI to Bring Greater Speed and Accuracy to Medico-Legal Records Review — from lawnext.com by Bob Ambrogi

A cardiologist with a background in medical technology, computer science and artificial intelligence has launched a product for legal professionals and physician expert witnesses that targets the tedious task of reviewing and analyzing thousands of pages of medical records.

The product, Quench SmartChart, uses generative AI to streamline the medico-legal review process, enabling users to quickly extract, summarize and create chronologies from large, disorganized PDFs of medical records.

The product also includes a natural language chat feature, AskQuench, that lets users interact with and interrogate records to surface essential insights.

 




 

Can Schools and Vendors Work Together Constructively on AI? A New Guide May Help — from edweek.org by Alyson Klein
The Education Department outlines key steps on AI development for schools

Educators need to work with vendors and tech developers to ensure artificial intelligence-driven innovations for schools go hand-in-hand with managing the technology’s risks, recommends guidance released July 8 by the U.S. Department of Education.

The guidance—called “Designing for Education with Artificial Intelligence: An Essential Guide for Developers”—includes extensive recommendations for both vendors and school district officials.


Also, on somewhat related notes see the following items:


 



 

Mary Meeker wants AI and higher education to be partners — from axios.com by Dan Primack; via Robert Gibson on LinkedIn

Mary Meeker has written her first report in over four years, focused on the relationship between artificial intelligence and U.S. higher education.

Why it matters: Meeker’s annual “Internet Trends” reports were among Silicon Valley’s most cited and consumed documents.

  • Each one dug deep into the new tech economy, with hundreds of pages of slides. The last one was published in 2019.
  • Meeker’s new effort is a shorter attempt (16 pages!) at reconciling tech’s brave new world and America’s economic vitality, with higher ed as the connective tissue.

Excerpts from Meeker’s report:

Actions taken in the next five years will be consequential. It’s important for higher education to take a leadership role, in combination with industry and government. The ramp in artificial intelligence – which leverages the history of learning for learning – affects all forms of learning, teaching, understanding, and decision making. This should be the best of times…

Our first-pass observations on these topics follow. We begin with an overview, followed by thoughts on the unprecedented ramp in AI usage and the magnitude of investment in AI from America’s leading global technology companies. Then we explore ways that this rapidly changing AI landscape may drive transformations in higher education. We hope these add to the discussion.

AI & Universities – Will Masters of Learning Master New Learnings?

In a time of rapid technological change led by American companies, American universities must determine how best to optimize for the future. Many institutions have work to do to meet these changes in demand, per the Burning Glass Institute. As the AI challenge looms, they will need thoughtful plans that balance their rich traditions and research history with the needs of a rapidly evolving marketplace supercharged by innovation. Keeping an eye on the output and trends in various AI skunkworks, such as the team at AI Acceleration at Arizona State, may help universities determine the products and software tools that could transform the educational experience.

 


Bill Gates Reveals Superhuman AI Prediction — from youtube.com by Rufus Griscom, Bill Gates, Andy Sack, and Adam Brotman

In this episode of the Next Big Idea podcast, host Rufus Griscom and Bill Gates are joined by Andy Sack and Adam Brotman, co-authors of an exciting new book called “AI First.” Together, they consider AI’s impact on healthcare, education, productivity, and business. They dig into the technology’s risks and explore its potential to cure diseases, enhance creativity, and usher in a world of abundance.

Key moments:

00:05 Bill Gates discusses AI’s transformative potential in revolutionizing technology.
02:21 Superintelligence is inevitable and marks a significant advancement in AI technology.
09:23 Future AI may integrate deeply as cognitive assistants in personal and professional life.
14:04 AI’s metacognitive advancements could revolutionize problem-solving capabilities.
21:13 AI’s next frontier lies in developing human-like metacognition for sophisticated problem-solving.
27:59 AI advancements empower both good and malicious intents, posing new security challenges.
28:57 Rapid AI development raises questions about controlling its global application.
33:31 Productivity enhancements from AI can significantly improve efficiency across industries.
35:49 AI’s future applications in consumer and industrial sectors are subjects of ongoing experimentation.
46:10 AI democratization could level the economic playing field, enhancing service quality and reducing costs.
51:46 AI plays a role in mitigating misinformation and bridging societal divides through enhanced understanding.


OpenAI Introduces CriticGPT: A New Artificial Intelligence AI Model based on GPT-4 to Catch Errors in ChatGPT’s Code Output — from marktechpost.com

The team has summarized their primary contributions as follows.

  1. The team has offered the first instance of a simple, scalable oversight technique that greatly assists humans in more thoroughly detecting problems in real-world RLHF data.
  2. Within the ChatGPT and CriticGPT training pools, the team has discovered that critiques produced by CriticGPT catch more inserted bugs and are preferred over those written by human contractors.
  3. Compared to human contractors working alone, this research indicates that teams consisting of critic models and human contractors generate more thorough criticisms. When compared to reviews generated exclusively by models, this partnership lowers the incidence of hallucinations.
  4. This study provides Force Sampling Beam Search (FSBS), an inference-time sampling and scoring technique. This strategy balances the trade-off between minimizing bogus concerns and discovering genuine faults in LLM-generated critiques.
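
The excerpt does not describe how FSBS works internally, so the following is only a generic, hypothetical sketch of the broader idea it names: sample several candidate critiques, score each with a reward model minus a penalty for the number of claims raised, and keep the best one. The scoring rule, the names, and the `generate_critique`/`reward_model`/`count_claims` functions are assumptions for illustration, not OpenAI's implementation.

```python
# Generic "sample, score, select" sketch for choosing one critique among many.
# Hypothetical illustration only; not OpenAI's actual FSBS algorithm.
from typing import Callable, List

def select_critique(
    generate_critique: Callable[[], str],   # draws one candidate critique from a model
    reward_model: Callable[[str], float],   # scores how helpful/accurate a critique is
    count_claims: Callable[[str], int],     # number of distinct issues the critique raises
    num_samples: int = 8,
    precision_penalty: float = 0.5,         # higher values discourage nitpicky/bogus claims
) -> str:
    candidates: List[str] = [generate_critique() for _ in range(num_samples)]

    def score(critique: str) -> float:
        # Trade-off: reward thoroughness, penalize piling on dubious claims.
        return reward_model(critique) - precision_penalty * count_claims(critique)

    return max(candidates, key=score)
```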

Character.AI now allows users to talk with AI avatars over calls — from techcrunch.com by Ivan Mehta

a16z-backed Character.AI said today that it is now allowing users to talk to AI characters over calls. The feature currently supports multiple languages, including English, Spanish, Portuguese, Russian, Korean, Japanese and Chinese.

The startup tested the calling feature ahead of today’s public launch. During that time, it said that more than 3 million users had made over 20 million calls. The company also noted that calls with AI characters can be useful for practicing language skills, giving mock interviews, or adding them to the gameplay of role-playing games.


Google Translate Just Added 110 More Languages — from lifehacker.com
You can now use the app to communicate in languages you’ve never even heard of.

Google Translate can come in handy when you’re traveling or communicating with someone who speaks another language, and thanks to a new update, you can now connect with some 614 million more people. Google is adding 110 new languages to its Translate tool using its PaLM 2 large language model (LLM), which brings the total number of supported languages to nearly 250. This follows the 24 languages added in 2022, including Indigenous languages of the Americas as well as those spoken across Africa and central Asia.




Listen to your favorite books and articles voiced by Judy Garland, James Dean, Burt Reynolds and Sir Laurence Olivier — from elevenlabs.io
ElevenLabs partners with estates of iconic stars to bring their voices to the Reader App

 

A New Digital Divide: Student AI Use Surges, Leaving Faculty Behind — from insidehighered.com by Lauren Coffey
While both students and faculty have concerns with generative artificial intelligence, two new reports show a divergence in AI adoption. 

Meanwhile, a separate survey of faculty released Thursday by Ithaka S+R, a higher education consulting firm, showcased that faculty—while increasingly familiar with AI—often do not know how to use it in classrooms. Two out of five faculty members are familiar with AI, the Ithaka report found, but only 14 percent said they are confident in their ability to use AI in their teaching. Just slightly more (18 percent) said they understand the teaching implications of generative AI.

“Serious concerns about academic integrity, ethics, accessibility, and educational effectiveness are contributing to this uncertainty and hostility,” the Ithaka report said.

The diverging views about AI are causing friction. Nearly a third of students said they have been warned to not use generative AI by professors, and more than half (59 percent) are concerned they will be accused of cheating with generative AI, according to the Pearson report, which was conducted with Morning Consult and surveyed 800 students.


What teachers want from AI — from hechingerreport.org by Javeria Salman
When teachers designed their own AI tools, they built math assistants, tools for improving student writing, and more

An AI chatbot that walks students through how to solve math problems. An AI instructional coach designed to help English teachers create lesson plans and project ideas. An AI tutor that helps middle and high schoolers become better writers.

These aren’t tools created by education technology companies. They were designed by teachers tasked with using AI to solve a problem their students were experiencing.

Over five weeks this spring, about 300 people – teachers, school and district leaders, higher ed faculty, education consultants and AI researchers – came together to learn how to use AI and develop their own basic AI tools and resources. The professional development opportunity was designed by technology nonprofit Playlab.ai and faculty at the Relay Graduate School of Education.


The Comprehensive List of Talks & Resources for 2024 — from aiedusimplified.substack.com by Lance Eaton
Resources, talks, podcasts, etc that I’ve been a part of in the first half of 2024

Resources from things such as:

  • Lightning Talks
  • Talks & Keynotes
  • Workshops
  • Podcasts & Panels
  • Honorable Mentions

Next-Gen Classroom Observations, Powered by AI — from educationnext.org by Michael J. Petrilli
The use of video recordings in classrooms to improve teacher performance is nothing new. But the advent of artificial intelligence could add a helpful evaluative tool for teachers, measuring instructional practice relative to common professional goals with chatbot feedback.

Multiple companies are pairing AI with inexpensive, ubiquitous video technology to provide feedback to educators through asynchronous, offsite observation. It’s an appealing idea, especially given the promise and popularity of instructional coaching, as well as the challenge of scaling it effectively (see “Taking Teacher Coaching To Scale,” research, Fall 2018).

Enter AI. Edthena is now offering an “AI Coach” chatbot that offers teachers specific prompts as they privately watch recordings of their lessons. The chatbot is designed to help teachers view their practice relative to common professional goals and to develop action plans to improve.

To be sure, an AI coach is no replacement for human coaching.


Personalized AI Tutoring as a Social Activity: Paradox or Possibility? — from er.educause.edu by Ron Owston
Can the paradox between individual tutoring and social learning be reconciled through the possibility of AI?

We need to shift our thinking about GenAI tutors serving only as personal learning tools. The above activities illustrate how these tools can be integrated into contemporary classroom instruction. The activities should not be seen as prescriptive but merely suggestive of how GenAI can be used to promote social learning. Although I specifically mention only one online activity (“Blended Learning”), all can be adapted to work well in online or blended classes to promote social interaction.


Stealth AI — from higherai.substack.com by Jason Gulya (a Professor of English at Berkeley College), in conversation with Zack Kinzler
What happens when students use AI all the time, but aren’t allowed to talk about it?

In many ways, this comes back to one of my general rules: You cannot ban AI in the classroom. You can only issue a gag rule.

And if you do issue a gag rule, then it deprives students of the space they often need to make heads or tails of this technology.

We need to listen to actual students talking about actual uses, and reflecting on their actual feelings. No more abstraction.

In this conversation, Jason Gulya (a Professor of English at Berkeley College) talks to Zack Kinzler about what students are saying about Artificial Intelligence and education.


What’s New in Microsoft EDU | ISTE Edition June 2024 — from techcommunity.microsoft.com

Welcome to our monthly update for Teams for Education and thank you so much for being part of our growing community! We’re thrilled to share over 20 updates and resources and show them in action next week at ISTELive 24 in Denver, Colorado, US.

Copilot for Microsoft 365 – Educator features

  • Guided Content Creation: Coming soon to Copilot for Microsoft 365 is a guided content generation experience to help educators get started with creating materials like assignments, lesson plans, lecture slides, and more. The content will be created based on the educator’s requirements with easy ways to customize the content to their exact needs.
  • Standards alignment and creation
  • Quiz generation through Copilot in Forms
  • Suggested AI Feedback for Educators
  • Teaching extension: To better support educators with their daily tasks, we’ll be launching a built-in Teaching extension to help guide them through relevant activities and provide contextual, educator-based support in Copilot.
  • Education data integration

Copilot for Microsoft 365 – Student features

  • Interactive practice experiences
  • Flashcards activity
  • Guided chat activity
  • Learning extension in Copilot for Microsoft 365


New AI tools for Google Workspace for Education — from blog.google by Akshay Kirtikar and Brian Hendricks
We’re bringing Gemini to teen students using their school accounts to help them learn responsibly and confidently in an AI-first future, and empowering educators with new tools to help create great learning experiences.

 

Overcoming the ‘Entry Level’ Catch-22 in the Age of AI — from reachcapital.com by Shauntel Garvey

The New Entry-Level Job (and Skill)
In a world where AI can perform entry-level tasks, and employers are prioritizing experienced candidates, how can recent college graduates and job seekers find a job?

AI is the new entry-level skill. As AI permeates every industry, it’s becoming increasingly common for employers to ask candidates how they think about applying AI to their jobs. (We’ve started asking this here at Reach ourselves.) Even if the job is not technical and doesn’t list AI as a skill, candidates would do well to prepare. Journalists, for instance, are warming up to using AI to transcribe interviews and suggest headlines.

So it’s not just AI that may take your entry-level role, but rather the person who knows how to use it. Candidates who are bracing for this technological shift and proactively building their AI literacy and expertise will have a leg up.


On a related note, also see:

Make AI Literacy a Priority With These Free Resources — from gettingsmart.com by Tom Vander Ark

Key Points

  • Leading school systems are incorporating AI tools such as tutoring, chatbots, and teacher assistants, and promoting AI literacy among teachers and students to adapt to the evolving role of AI in education.

 
© 2024 | Daniel Christian