Higher Education Has Not Been Forgotten by Generative AI — from insidehighered.com by Ray Schroeder
The generative AI (GenAI) revolution has not ignored higher education; a whole host of tools are available now and more revolutionary tools are on the way.

Some of the apps that have been developed for general use can be customized for specific topical areas in higher ed. For example, I created a version of GPT, “Ray’s EduAI Advisor,” that builds on the current GPT-4o model with specific updates and perspectives on AI in higher education. It is freely available to users. With a few tools and no programming knowledge, anyone can build their own GPT to supplement information for their classes or interest groups.
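For readers who want to go one step beyond the no-code GPT Builder inside ChatGPT, here is a minimal, hedged sketch of the same underlying idea done programmatically: layering domain-specific instructions on top of GPT-4o via the OpenAI Python library. This is not Ray’s actual configuration; the instruction text and the sample question are hypothetical placeholders.

```python
# A minimal, illustrative sketch -- not Ray's actual GPT. The GPT Builder inside
# ChatGPT requires no code at all; this simply shows the same idea of layering
# domain-specific instructions on top of GPT-4o, using the OpenAI Python
# library (v1.x). The instructions and question below are hypothetical.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

ADVISOR_INSTRUCTIONS = (
    "You are an advisor on generative AI in higher education. "
    "Offer practical guidance on teaching, assessment, and accessibility, "
    "and note where institutional policies may differ."
)

def ask_advisor(question: str) -> str:
    """Send one question to GPT-4o with the advisor instructions prepended."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": ADVISOR_INSTRUCTIONS},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_advisor("How is generative AI changing at-scale degree programs?"))
```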

Excerpts from Ray’s EduAI Advisor bot:

AI’s global impact on higher education, particularly in at-scale classes and degree programs, is multifaceted, encompassing several key areas:
1. Personalized Learning…
2. Intelligent Tutoring Systems…
3. Automated Assessment…
4. Enhanced Accessibility…
5. Predictive Analytics…
6. Scalable Virtual Classrooms…
7. Administrative Efficiency…
8. Continuous Improvement…

Instructure and Khan Academy Announce Partnership to Enhance Teaching and Learning With Khanmigo, the AI Tool for Education — from instructure.com
Shiren Vijiasingam and Jody Sailor make an exciting announcement about a new partnership sure to make a difference in education everywhere.

 

Is College Worth It? Poll Finds Only 36% of Americans Have Confidence in Higher Education — from usnews.com by Associated Press
A new poll finds Americans are increasingly skeptical about the value and cost of college

Americans are increasingly skeptical about the value and cost of college, with most saying they feel the U.S. higher education system is headed in the “wrong direction,” according to a new poll.

Overall, only 36% of adults say they have a “great deal” or “quite a lot” of confidence in higher education, according to the report released Monday by Gallup and the Lumina Foundation. That confidence level has declined steadily from 57% in 2015.

 

Enrollment Planning in the Specter of Closure — from insidehighered.com by Mark Campbell and Rachel Schreiber; via GSV
Misunderstandings about enrollment management and changing student needs can make a bad situation worse, Mark Campbell and Rachel Schreiber write. 

Excerpts (emphasis DSC):

However, we find that many institutions provide little to no information to prospective students about actual outcomes for graduates. Examples include: What does applying to graduate school look like for graduates? Employment and earning potential? Average student loan debt? What do alumni say about their experience? What data do you have that is compelling to answer these and related questions? Families increasingly ask, “What is the ROI on this investment?”

Another important issue relates to the unwillingness of leaders to evolve the institution to meet market demands. We have too often seen that storied, historic institutions have cultures that are change averse, and this seems to be particularly true in the liberal arts. This statement might appear to be controversial—but only if misunderstood.

To be clear, the humanities and the arts are vital, critical aspects of our institutions. But today’s prospective students are highly focused on career outcomes, given the financial investment they and their families are being asked to make. We believe that curricular offerings can place a high value on the core principles of the humanities and liberal arts while also preparing students for careers.

By contrast, curricular innovation, alterations to long-held marketing practices, openness to self-reflection regarding out-of-date programs, practices and policies—in short, a willingness to change and adapt—are all key. Finally, vital and successful institutions develop long-term strategic enrollment plans that are tactical, realistic and assessable and for which there is clarity about accountability. Putting these practices in place now can avert catastrophe down the road.

 
 

More colleges are breaching their debt requirements: S&P — from highereddive.com by Ben Unglesbee
Amid operating pressures, some institutions are struggling to meet financial metrics stipulated in their bond and loan covenants.

Dive Brief:

  • A growing number of colleges are breaching bond and loan stipulations, known as covenants, that require them to stay within certain financial health parameters, according to a new report from S&P Global Ratings.
  • The agency cited 12 colleges it rates that have breached covenants since last June. In most cases, bondholders waived the violation. Some covenants could allow debtholders to accelerate repayment, which could add to an institution’s liquidity and ratings risks.
  • S&P downgraded ratings for about half the institutions with violations, typically because of underlying financial issues. “We see continued credit quality divergence in the U.S. higher education sector, with weaker-positioned institutions experiencing budgetary pressure and covenant violations,” the analysts said.

Student Loan Borrowers Owe $1.6 Trillion. Nearly Half Aren’t Paying. — from nytimes.com by Stacy Cowley (behind a paywall)
Millions of people are overdue on their federal loans or still have them paused — and court rulings keep upending collection efforts.

After an unprecedented three-year timeout on federal student loan payments because of the pandemic, millions of borrowers began repaying their debt when billing resumed late last year. But nearly as many have not.

That reality, along with court decisions that regularly upend the rules, has complicated the government’s efforts to restart its system for collecting the $1.6 trillion it is owed.


Universities Investing in Microcredential Leadership — from insidehighered.com by Lauren Coffey
As microcredential programs slowly gain traction, more universities are looking for leaders to coordinate the efforts.

Microcredentials—also known as digital badges, certificates, or alternative credentials—grew in popularity during the COVID-19 pandemic. Now they are attracting renewed interest as institutions look to widen their nets for nontraditional students as an enrollment cliff looms.

In addition to backing these programs, some universities are going further by hiring staff solely to oversee microcredential efforts.


A Plan to Save Small Colleges — from insidehighered.com by Michael Alexander
Small colleges could join forces through a supporting-organization model, Michael Alexander writes.

The challenges are significant. But there is a way to increase the probability of survival for many small colleges or spare them from a spartan existence. It involves groups of colleges affiliating under a particular structure that would facilitate both (1) a significant reduction in operating costs for each college and (2) a rationalization of each college’s academic offerings to concentrate on its strongest programs.

 


Bill Gates Reveals Superhuman AI Prediction — from youtube.com by Rufus Griscom, Bill Gates, Andy Sack, and Adam Brotman

In this episode of the Next Big Idea podcast, host Rufus Griscom and Bill Gates are joined by Andy Sack and Adam Brotman, co-authors of an exciting new book called “AI First.” Together, they consider AI’s impact on healthcare, education, productivity, and business. They dig into the technology’s risks. And they explore its potential to cure diseases, enhance creativity, and usher in a world of abundance.

Key moments:

00:05 Bill Gates discusses AI’s transformative potential in revolutionizing technology.
02:21 Superintelligence is inevitable and marks a significant advancement in AI technology.
09:23 Future AI may integrate deeply as cognitive assistants in personal and professional life.
14:04 AI’s metacognitive advancements could revolutionize problem-solving capabilities.
21:13 AI’s next frontier lies in developing human-like metacognition for sophisticated problem-solving.
27:59 AI advancements empower both good and malicious intents, posing new security challenges.
28:57 Rapid AI development raises questions about controlling its global application.
33:31 Productivity enhancements from AI can significantly improve efficiency across industries.
35:49 AI’s future applications in consumer and industrial sectors are subjects of ongoing experimentation.
46:10 AI democratization could level the economic playing field, enhancing service quality and reducing costs.
51:46 AI plays a role in mitigating misinformation and bridging societal divides through enhanced understanding.


OpenAI Introduces CriticGPT: A New Artificial Intelligence AI Model based on GPT-4 to Catch Errors in ChatGPT’s Code Output — from marktechpost.com

The team has summarized their primary contributions as follows.

  1. The team has offered the first instance of a simple, scalable oversight technique that greatly assists humans in more thoroughly detecting problems in real-world RLHF data.
  2. Within the ChatGPT and CriticGPT training pools, the team found that critiques produced by CriticGPT catch more inserted bugs and are preferred over those written by human contractors.
  3. The research indicates that teams consisting of critic models and human contractors generate more thorough critiques than human contractors working alone. Compared with reviews generated exclusively by models, this partnership also lowers the incidence of hallucinations.
  4. The study introduces Force Sampling Beam Search (FSBS), an inference-time sampling and scoring technique that balances the trade-off between minimizing bogus concerns and surfacing genuine faults in LLM-generated critiques (a rough sketch of the general pattern appears below).
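Because the FSBS description above is abstract, here is a rough, heavily hedged sketch of the general pattern it names: sample several candidate critiques, score each, and keep the one that best balances thoroughness against spurious complaints. The function names, scoring rule, and the precision-weight parameter are hypothetical stand-ins, not OpenAI’s actual implementation.

```python
# Illustrative only: a generic "sample several critiques, then score and select"
# loop in the spirit of the FSBS idea described above. This is NOT OpenAI's
# implementation; generate_critiques() and score_critique() are hypothetical
# stand-ins for real critic-model and reward-model calls.
import random
from typing import List, Tuple

def generate_critiques(code: str, n: int) -> List[str]:
    # Stand-in for sampling n candidate critiques from a critic model.
    return [f"candidate critique #{i} of the submitted code" for i in range(n)]

def score_critique(critique: str) -> Tuple[float, float]:
    # Stand-in for scoring a critique: (coverage of real bugs, rate of spurious nitpicks).
    return random.random(), random.random()

def select_critique(code: str, n: int = 8, precision_weight: float = 0.5) -> str:
    """Pick the candidate that best trades off thoroughness against bogus concerns."""
    best, best_score = "", float("-inf")
    for candidate in generate_critiques(code, n):
        coverage, spuriousness = score_critique(candidate)
        combined = coverage - precision_weight * spuriousness
        if combined > best_score:
            best, best_score = candidate, combined
    return best

if __name__ == "__main__":
    print(select_critique("def add(a, b): return a - b"))
```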

Character.AI now allows users to talk with AI avatars over calls — from techcrunch.com by Ivan Mehta

a16z-backed Character.AI said today that it is now allowing users to talk to AI characters over calls. The feature currently supports multiple languages, including English, Spanish, Portuguese, Russian, Korean, Japanese and Chinese.

The startup tested the calling feature ahead of today’s public launch. During that time, it said that more than 3 million users had made over 20 million calls. The company also noted that calls with AI characters can be useful for practicing language skills, giving mock interviews, or adding them to the gameplay of role-playing games.


Google Translate Just Added 110 More Languages — from lifehacker.com
You can now use the app to communicate in languages you’ve never even heard of.

Google Translate can come in handy when you’re traveling or communicating with someone who speaks another language, and thanks to a new update, you can now connect with some 614 million more people. Google is adding 110 new languages to its Translate tool using its AI PaLM 2 large language model (LLM), which brings the total of supported languages to nearly 250. This follows the 24 languages added in 2022, including Indigenous languages of the Americas as well as those spoken across Africa and central Asia.




Listen to your favorite books and articles voiced by Judy Garland, James Dean, Burt Reynolds and Sir Laurence Olivier — from elevenlabs.io
ElevenLabs partners with estates of iconic stars to bring their voices to the Reader App

 

A New Digital Divide: Student AI Use Surges, Leaving Faculty Behind — from insidehighered.com by Lauren Coffey
While both students and faculty have concerns with generative artificial intelligence, two new reports show a divergence in AI adoption. 

Meanwhile, a separate survey of faculty released Thursday by Ithaka S+R, a higher education consulting firm, showcased that faculty—while increasingly familiar with AI—often do not know how to use it in classrooms. Two out of five faculty members are familiar with AI, the Ithaka report found, but only 14 percent said they are confident in their ability to use AI in their teaching. Just slightly more (18 percent) said they understand the teaching implications of generative AI.

“Serious concerns about academic integrity, ethics, accessibility, and educational effectiveness are contributing to this uncertainty and hostility,” the Ithaka report said.

The diverging views about AI are causing friction. Nearly a third of students said they have been warned to not use generative AI by professors, and more than half (59 percent) are concerned they will be accused of cheating with generative AI, according to the Pearson report, which was conducted with Morning Consult and surveyed 800 students.


What teachers want from AI — from hechingerreport.org by Javeria Salman
When teachers designed their own AI tools, they built math assistants, tools for improving student writing, and more

An AI chatbot that walks students through how to solve math problems. An AI instructional coach designed to help English teachers create lesson plans and project ideas. An AI tutor that helps middle and high schoolers become better writers.

These aren’t tools created by education technology companies. They were designed by teachers tasked with using AI to solve a problem their students were experiencing.

Over five weeks this spring, about 300 people – teachers, school and district leaders, higher ed faculty, education consultants and AI researchers – came together to learn how to use AI and develop their own basic AI tools and resources. The professional development opportunity was designed by technology nonprofit Playlab.ai and faculty at the Relay Graduate School of Education.


The Comprehensive List of Talks & Resources for 2024 — from aiedusimplified.substack.com by Lance Eaton
Resources, talks, podcasts, etc. that I’ve been a part of in the first half of 2024

Resources from things such as:

  • Lightning Talks
  • Talks & Keynotes
  • Workshops
  • Podcasts & Panels
  • Honorable Mentions

Next-Gen Classroom Observations, Powered by AI — from educationnext.org by Michael J. Petrilli
The use of video recordings in classrooms to improve teacher performance is nothing new. But the advent of artificial intelligence could add a helpful evaluative tool for teachers, measuring instructional practice relative to common professional goals with chatbot feedback.

Multiple companies are pairing AI with inexpensive, ubiquitous video technology to provide feedback to educators through asynchronous, offsite observation. It’s an appealing idea, especially given the promise and popularity of instructional coaching, as well as the challenge of scaling it effectively (see “Taking Teacher Coaching To Scale,” research, Fall 2018).

Enter AI. Edthena is now offering an “AI Coach” chatbot that offers teachers specific prompts as they privately watch recordings of their lessons. The chatbot is designed to help teachers view their practice relative to common professional goals and to develop action plans to improve.

To be sure, an AI coach is no replacement for human coaching.


Personalized AI Tutoring as a Social Activity: Paradox or Possibility? — from er.educause.edu by Ron Owston
Can the paradox between individual tutoring and social learning be reconciled through the possibility of AI?

We need to shift our thinking about GenAI tutors serving only as personal learning tools. The above activities illustrate how these tools can be integrated into contemporary classroom instruction. The activities should not be seen as prescriptive but merely suggestive of how GenAI can be used to promote social learning. Although I specifically mention only one online activity (“Blended Learning”), all can be adapted to work well in online or blended classes to promote social interaction.


Stealth AI — from higherai.substack.com by Jason Gulya (a Professor of English at Berkeley College), in conversation with Zack Kinzler
What happens when students use AI all the time, but aren’t allowed to talk about it?

In many ways, this comes back to one of my general rules: You cannot ban AI in the classroom. You can only issue a gag rule.

And if you do issue a gag rule, then it deprives students of the space they often need to make heads or tails of this technology.

We need to listen to actual students talking about actual uses, and reflecting on their actual feelings. No more abstraction.

In this conversation, Jason Gulya (a Professor of English at Berkeley College) talks to Zack Kinzler about what students are saying about Artificial Intelligence and education.


What’s New in Microsoft EDU | ISTE Edition June 2024 — from techcommunity.microsoft.com

Welcome to our monthly update for Teams for Education and thank you so much for being part of our growing community! We’re thrilled to share over 20 updates and resources and show them in action next week at ISTELive 24 in Denver, Colorado, US.

Copilot for Microsoft 365 – Educator features
Guided Content Creation
Coming soon to Copilot for Microsoft 365 is a guided content generation experience to help educators get started with creating materials like assignments, lesson plans, lecture slides, and more. The content will be created based on the educator’s requirements with easy ways to customize the content to their exact needs.
Standards alignment and creation
Quiz generation through Copilot in Forms
Suggested AI Feedback for Educators
Teaching extension
To better support educators with their daily tasks, we’ll be launching a built-in Teaching extension to help guide them through relevant activities and provide contextual, educator-based support in Copilot.
Education data integration

Copilot for Microsoft 365 – Student features
Interactive practice experiences
Flashcards activity
Guided chat activity
Learning extension in Copilot for Microsoft 365


New AI tools for Google Workspace for Education — from blog.google by Akshay Kirtikar and Brian Hendricks
We’re bringing Gemini to teen students using their school accounts to help them learn responsibly and confidently in an AI-first future, and empowering educators with new tools to help create great learning experiences.

 

Overcoming the ‘Entry Level’ Catch-22 in the Age of AI — from reachcapital.com by Shauntel Garvey

The New Entry-Level Job (and Skill)
In a world where AI can perform entry-level tasks, and employers are prioritizing experienced candidates, how can recent college graduates and job seekers find a job?

AI is the new entry-level skill. As AI permeates every industry, it’s becoming increasingly common for employers to ask candidates how they think about applying AI to their jobs. (We’ve started asking this here at Reach ourselves.) Even if the job is not technical and doesn’t list AI as a skill, candidates would do well to prepare. Journalists, for instance, are warming up to using AI to transcribe interviews and suggest headlines.

So it’s not just AI that may take your entry-level role, but rather the person who knows how to use it. Candidates who are bracing for this technological shift and proactively building their AI literacy and expertise will have a leg up.


On a related note, also see:

Make AI Literacy a Priority With These Free Resources — from gettingsmart.com by Tom Vander Ark

Key Points

  • Leading school systems are incorporating AI tools such as tutoring, chatbots, and teacher assistants, and promoting AI literacy among teachers and students to adapt to the evolving role of AI in education.

 

From DSC:
As I can’t embed his posting, I’m copying/pasting Jeff’s posting on LinkedIn:


According to Flighty, I logged more than 2,220 flight miles in the last 5 days traveling to three conferences to give keynotes and spend time with housing officers in Milwaukee, college presidents in Mackinac Island, MI, and enrollment and marketing leaders in Raleigh.

Before I rest, I wanted to post some quick thoughts about what I learned. Thank you to everyone who shared their wisdom these past few days:

  • We need to think about the “why” and “how” of AI in higher ed. The “why” shouldn’t be just because everyone else is doing it. Rather, the “why” is to reposition higher ed for a different future of competitors. The “how” shouldn’t be to just seek efficiency and cut jobs. Rather we should use AI to learn from its users to create a better experience going forward.
  • Residence halls are not just infrastructure. They are part and parcel of the student experience and critical to student success. Almost half of students living on campus say it increases their sense of belonging, according to research by the Association of College & University Housing Officers.
  • How do we extend the “residential experience”? More than half of traditional undergraduates who live on campus now take at least one course online. As students increasingly spend time off campus – or move off campus as early as their second year in college – we need to continue helping them make the connections they would make in a dorm. Why? 47% of college students believe living in a college residence hall enhanced their ability to resolve conflicts.
  • Career must be at the core of the student experience for colleges to thrive in the future, says Andy Chan. Yes, some people might see that as too narrow of a view of higher ed or might not want to provide cogs for the wheel of the workforce, but without the job, none of the other benefits of college follow–citizenship, health, engagement.
  • A “triple threat grad”–someone who has an internship, a semester-long project, and an industry credential (think Salesforce or Adobe) in addition to their degree–matters more in the job market than major or institution, says Brandon Busteed.
  • Every faculty member should think of themselves as an ambassador for the institution. Yes, care about their discipline/department, but that doesn’t survive if the rest of the institution falls down around them.
  • Presidents need to place bigger bets rather than spend pennies and dimes on a bunch of new strategies. That means to free up resources they need to stop doing things.
  • Higher ed needs a new business model. Institutions can’t make money just from tuition, and new products, such as certificates, bring in pennies on the dollar compared to degrees.
  • Boards aren’t ready for the future. They are over-indexed on philanthropy and alumni and not enough on the expertise needed for leading higher ed.

From DSC:
As I can’t embed his posting, I’m copying/pasting Jeff’s posting on LinkedIn:


It’s the stat that still gnaws at me: 62%.

That’s the percentage of high school graduates going right on to college. A decade ago it was around 70%. So for all the bellyaching about the demographic cliff in higher ed, just imagine if today we were still close to that 70% number. We’d be talking about a few hundred thousand more students in the system.

As I told a gathering of presidents of small colleges and universities last night on Mackinac Island — the first time I had to take [numerous modes of transportation] to get to a conference — being small isn’t distinctive anymore.

There are many reasons undergrad enrollment is down, but they all come down to two interrelated trends: jobs and affordability.

The job has become so central to what students want out of the experience. It’s almost as if colleges now need to guarantee a job.

These institutions will need to rethink the learner relationship with work. Instead of college with work on the side, we might need to move to more of a mindset of work with college on the side by:

  • Making campus jobs more meaningful. Why can’t we have accounting and finance majors work in the CFO office, liberal arts majors work in IT on platforms such as Salesforce and Workday, which are skills needed in the workplace, etc.?
  • Apprenticeships are not just for the trades anymore. Integrate work-based learning into the undergrad experience in a much bigger way than internships and even co-ops.
  • Credentials within the degree. Every graduate should leave college not just with a BA but also with a certified credential in things like data viz, project management, the Adobe suite, Alteryx, etc.
  • The curriculum needs to be more flexible for students to combine work and learning — not only for the experience but also money for college — so more availability of online courses, hybrid courses, and flexible semesters.

How else can we think about learning and earning?


 

Latent Expertise: Everyone is in R&D — from oneusefulthing.org by Ethan Mollick
Ideas come from the edges, not the center

Excerpt (emphasis DSC):

And to understand the value of AI, they need to do R&D. Since AI doesn’t work like traditional software, but more like a person (even though it isn’t one), there is no reason to suspect that the IT department has the best AI prompters, nor that it has any particular insight into the best uses of AI inside an organization. IT certainly plays a role, but the actual use cases will come from workers and managers who find opportunities to use AI to help them with their job. In fact, for large companies, the source of any real advantage in AI will come from the expertise of their employees, which is needed to unlock the expertise latent in AI.


OpenAI’s former chief scientist is starting a new AI company — from theverge.com by Emma Roth
Ilya Sutskever is launching Safe Superintelligence Inc., an AI startup that will prioritize safety over ‘commercial pressures.’

Ilya Sutskever, OpenAI’s co-founder and former chief scientist, is starting a new AI company focused on safety. In a post on Wednesday, Sutskever revealed Safe Superintelligence Inc. (SSI), a startup with “one goal and one product:” creating a safe and powerful AI system.

Ilya Sutskever Has a New Plan for Safe Superintelligence — from bloomberg.com by Ashlee Vance (behind a paywall)
OpenAI’s co-founder discloses his plans to continue his work at a new research lab focused on artificial general intelligence.

Safe Superintelligence — from theneurondaily.com by Noah Edelman

Ilya Sutskever is kind of a big deal in AI, to put it lightly.

Part of OpenAI’s founding team, Ilya was Chief Scientist (read: genius) before being part of the coup that fired Sam Altman.

Yesterday, Ilya announced that he’s forming a new initiative called Safe Superintelligence.

If AGI = AI that can perform a wide range of tasks at our level, then Superintelligence = an even more advanced AI that surpasses human capabilities in all areas.


AI is exhausting the power grid. Tech firms are seeking a miracle solution. — from washingtonpost.com by Evan Halper and Caroline O’Donovan
As power needs of AI push emissions up and put big tech in a bind, companies put their faith in elusive — some say improbable — technologies.

As the tech giants compete in a global AI arms race, a frenzy of data center construction is sweeping the country. Some computing campuses require as much energy as a modest-sized city, turning tech firms that promised to lead the way into a clean energy future into some of the world’s most insatiable guzzlers of power. Their projected energy needs are so huge, some worry whether there will be enough electricity to meet them from any source.


Microsoft, OpenAI, Nvidia join feds for first AI attack simulation — from axios.com by Sam Sabin

Federal officials, AI model operators and cybersecurity companies ran the first joint simulation of a cyberattack involving a critical AI system last week.

Why it matters: Responding to a cyberattack on an AI-enabled system will require a different playbook than the typical hack, participants told Axios.

The big picture: Both Washington and Silicon Valley are attempting to get ahead of the unique cyber threats facing AI companies before they become more prominent.


Hot summer of AI video: Luma & Runway drop amazing new models — from heatherbcooper.substack.com by Heather Cooper
Plus an amazing FREE video-to-sound app from ElevenLabs

Immediately after we saw Sora-like videos from KLING, Luma AI’s Dream Machine video results overshadowed them.

Dream Machine is a next-generation AI video model that creates high-quality, realistic shots from text instructions and images.


Introducing Gen-3 Alpha — from runwayml.com by Anastasis Germanidis
A new frontier for high-fidelity, controllable video generation.


AI-Generated Movies Are Around the Corner — from news.theaiexchange.com by The AI Exchange
The future of AI in filmmaking; participate in our AI for Agencies survey

AI-Generated Feature Films Are Around the Corner.
We predict feature-film length AI-generated films are coming by the end of 2025, if not sooner.

Don’t believe us? You need to check out Runway ML’s new Gen-3 model they released this week.

They’re not the only ones. We also have Pika, which just raised $80M. And Google’s Veo. And OpenAI’s Sora. (+ many others)

 

The Musician’s Rule and GenAI in Education — from opencontent.org by David Wiley

We have to provide instructors the support they need to leverage educational technologies like generative AI effectively in the service of learning. Given the amount of benefit that could accrue to students if powerful tools like generative AI were used effectively by instructors, it seems unethical not to provide instructors with professional development that helps them better understand how learning occurs and what effective teaching looks like. Without more training and support for instructors, the amount of student learning higher education will collectively “leave on the table” will only increase as generative AI gets more and more capable. And that’s a problem.

From DSC:
As is often the case, David put together a solid posting here. A few comments/reflections on it:

  • I agree that more training/professional development is needed, especially regarding generative AI. This would help achieve a far greater ROI and impact.
  • The pace of change makes it difficult to see where the sand is settling…and thus what to focus on
  • The Teaching & Learning Groups out there are also trying to learn and grow in their knowledge (so that they can train others)
  • The administrators out there are also trying to figure out what all of this generative AI stuff is all about; and so are the faculty members. It takes time for educational technologies’ impact to roll out and be integrated into how people teach.
  • As we’re talking about multiple disciplines here, I think we need more team-based content creation and delivery.
  • There needs to be more research on how best to use AI — again, it would be helpful if the sand settled a bit first, so as not to waste time and $$. But then that research needs to be piped into the classrooms far better.

We need to take more of the research from learning science and apply it in our learning spaces.

 


From DSC:
I’ve been wondering about collaborations, consortiums, and other forms of pooling resources within higher education for quite some time. As such, this is an interesting item to me.


 

2024 Global Skills Report — from Coursera

  • AI literacy emerges as a global imperative
  • AI readiness initiatives drive emerging skill adoption across regions
  • The digital skills gap persists in a rapidly evolving job market
  • Cybersecurity skills remain crucial amid talent shortages and evolving threats
  • Micro-credentials are a rapid pathway for learners to prepare for in-demand jobs
  • The global gender gap in online learning continues to narrow, but regional disparities persist
  • Different regions prioritize different skills, but the majority focus on emerging or foundational capabilities

You can use the Global Skills Report 2024 to:

  • Identify critical skills for your students to strengthen employability
  • Align curriculum to drive institutional advantage nationally
  • Track emerging skill trends like GenAI and cybersecurity
  • Understand entry-level and digital role skill trends across six regions
 

Daniel Christian: My slides for the Educational Technology Organization of Michigan’s Spring 2024 Retreat

From DSC:
Last Thursday, I presented at the Educational Technology Organization of Michigan’s Spring 2024 Retreat. I wanted to pass along my slides to you all, in case they are helpful to you.

Topics/agenda:

  • Topics & resources re: Artificial Intelligence (AI)
    • Top multimodal players
    • Resources for learning about AI
    • Applications of AI
    • My predictions re: AI
  • The powerful impact of pursuing a vision
  • A potential, future next-gen learning platform
  • Share some lessons from my past with pertinent questions for you all now
  • The significant impact of an organization’s culture
  • Bonus material: Some people to follow re: learning science and edtech

 

Educational Technology Organization of Michigan — ETOM — Spring 2024 Retreat on June 6-7

PowerPoint slides of Daniel Christian's presentation at ETOM

Slides of the presentation (.PPTX)
Slides of the presentation (.PDF)

 


Plus several more slides re: this vision.

 

A Right to Warn about Advanced Artificial Intelligence — from righttowarn.ai

We are current and former employees at frontier AI companies, and we believe in the potential of AI technology to deliver unprecedented benefits to humanity.

We also understand the serious risks posed by these technologies. These risks range from the further entrenchment of existing inequalities, to manipulation and misinformation, to the loss of control of autonomous AI systems potentially resulting in human extinction. AI companies themselves have acknowledged these risks [1, 2, 3], as have governments across the world [4, 5, 6] and other AI experts [7, 8, 9].

We are hopeful that these risks can be adequately mitigated with sufficient guidance from the scientific community, policymakers, and the public. However, AI companies have strong financial incentives to avoid effective oversight, and we do not believe bespoke structures of corporate governance are sufficient to change this.

 