Learning Designers Call for More User Testing of Edtech Products and Teaching Materials — from edsurge.com by Jeffrey R. Young
‘Co-creating’ materials with students can lead to better outcomes, experts argue.

He notes that there aren’t good incentives for edtech companies to spend the time and effort on more-detailed testing with students. “They’re selling to the government, to the administration, to the district,” he points out. “They’re not selling to the child — the child has no purchasing power. The kids never really get heard and the teachers rarely get heard. Then they throw it into the classroom and then you’re testing, ‘Did the scores go up?’”

Experts have also called for more teachers and educators to be involved with the development of edtech products.

 

The $340 Billion Corporate Learning Industry Is Poised For Disruption — from joshbersin.com by Josh Bersin

What if, for example, the corporate learning system knew who you were and you could simply ask it a question and it would generate an answer, a series of resources, and a dynamic set of learning objects for you to consume? In some cases you’ll take the answer and run. In other cases you’ll pore through the content. And in other cases you’ll browse through the course and take the time to learn what you need.

And suppose all this happened in a totally personalized way. So you didn’t see a “standard course” but a special course based on your level of existing knowledge?

This is what AI is going to bring us. And yes, it’s already happening today.
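To make this concrete, here is a minimal sketch, purely illustrative and not any vendor's actual product, of the flow Bersin describes: the system knows the learner, takes a question, and returns a generated answer plus a learning path filtered by what the learner already knows. Every name in it (LearnerProfile, call_llm, KNOWLEDGE_BASE) is hypothetical, and the LLM call is stubbed out.

```python
# A minimal, hypothetical sketch of a personalized corporate learning assistant.
# LearnerProfile, call_llm, and KNOWLEDGE_BASE are illustrative stand-ins only.

from dataclasses import dataclass, field


@dataclass
class LearnerProfile:
    name: str
    role: str
    known_skills: set[str] = field(default_factory=set)  # skills already demonstrated


@dataclass
class LearningObject:
    title: str
    skill: str
    level: str  # "intro" or "advanced"


# Stand-in for an organization's content library.
KNOWLEDGE_BASE = [
    LearningObject("Pivot tables in 10 minutes", "spreadsheets", "intro"),
    LearningObject("Forecasting with regression", "statistics", "advanced"),
    LearningObject("Storytelling with dashboards", "data-viz", "intro"),
]


def call_llm(prompt: str) -> str:
    """Placeholder for a real chat-completion API call."""
    return f"[generated answer to: {prompt!r}]"


def ask(profile: LearnerProfile, question: str) -> dict:
    """Return a direct answer plus a learning path tailored to the learner."""
    answer = call_llm(f"As a coach for a {profile.role}, answer: {question}")
    # Personalization: skip intro material for skills the learner already has.
    path = [
        obj for obj in KNOWLEDGE_BASE
        if not (obj.level == "intro" and obj.skill in profile.known_skills)
    ]
    return {"answer": answer, "learning_path": [obj.title for obj in path]}


if __name__ == "__main__":
    learner = LearnerProfile("Dana", "financial analyst", known_skills={"spreadsheets"})
    result = ask(learner, "How do I build a demand forecast?")
    print(result["answer"])
    print("Suggested learning objects:", result["learning_path"])
```

The detail worth noticing is the final filter: "personalization" in this toy is simply dropping introductory objects for skills the learner has already demonstrated, which is the "no standard course" behavior Bersin points to.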

 

Which AI should I use? Superpowers and the State of Play — by Ethan Mollick
And then there were three

For over a year, GPT-4 was the dominant AI model, clearly much smarter than any of the other LLM systems available. That situation has changed in the last month; there are now three GPT-4 class models, all powering their own chatbots: GPT-4 (accessible through ChatGPT Plus or Microsoft’s CoPilot), Anthropic’s Claude 3 Opus, and Google’s Gemini Advanced.

Where we stand
We are in a brief period in the AI era where there are now multiple leading models, but none has yet definitively beaten the GPT-4 benchmark set over a year ago. While this may represent a plateau in AI abilities, I believe this is likely to change in the coming months as, at some point, models like GPT-5 and Gemini 2.0 will be released. In the meantime, you should be using a GPT-4 class model and using it often enough to learn what it does well. You can’t go wrong with any of them; pick a favorite and use it…

From DSC:
Here’s a powerful quote from Ethan:

In fact, in my new book I postulate that you haven’t really experienced AI until you have had three sleepless nights of existential anxiety, after which you can start to be productive again.


Using AI for Immersive Educational Experiences — from automatedteach.com by Graham Clay
Realistic video brings course content to life but requires AI literacy.

For us, I think the biggest promise of AI tools like Sora — that can create video with ease — is that they lower the cost of immersive educational experiences. This increases the availability of these experiences, expanding their reach to student populations who wouldn’t otherwise have them, whether due to time, distance, or expense.

Consider the profound impact on a history class, where students are transported to California during the gold rush through hyperrealistic video sequences. This vivifies the historical content and cultivates a deeper connection with the material.

In fact, OpenAI has already demonstrated the promise of this sort of use case, with a very simple prompt producing impressive results…


The Empathy Illusion: How AI Agents Could Manipulate Students — from marcwatkins.substack.com by Marc Watkins

Take this scenario. A student misses a class and, within twenty minutes, receives a series of texts and even a voicemail from a very concerned and empathic-sounding voice wanting to know what’s going on. Of course, the text is entirely generated, and the voice is synthetic as well, but the student likely doesn’t know this. To them, this communication isn’t as easy to miss or brush off as an email. It sounds like someone who cares is talking to them.

But let’s say that isn’t enough. By that evening, the student still hasn’t logged into their email or checked the LMS. The AI’s strategic-reasoning component communicates with the predictive AI, analyzing the student’s pattern of behavior against that of students who succeed or fail versus students who are ill. The AI tracks the student’s movements on campus, monitors their social media usage, and deduces the student isn’t ill and is blowing off class.

The AI agent resumes communication with the student. But this time, the strategic AI adopts a different persona, not the kind and empathetic persona used for the initial contact, but a stern, matter-of-fact one. The student’s phone buzzes with alerts that talk about scholarships being lost, teachers being notified, etc. The AI anticipates the excuses the student will use and presents evidence tracking the student’s behavior to show they are not sick.
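Part of what makes this scenario plausible is how little machinery it would take. The sketch below is purely illustrative: the signals, thresholds, and message templates are all hypothetical, and no real product is depicted. It shows how an outreach agent could switch from an empathetic persona to a stern one based entirely on surveillance-derived inferences rather than anything the student has actually said.

```python
# Purely illustrative: how trivially an "empathic" outreach agent could switch
# personas based on behavioral signals, as in the scenario above. The signals,
# thresholds, and prompts are all hypothetical; no real system is depicted.

from dataclasses import dataclass


@dataclass
class StudentSignals:
    missed_class: bool
    hours_since_last_lms_login: float
    on_campus_today: bool          # e.g., inferred from Wi-Fi or card swipes
    posting_on_social_media: bool


def choose_persona(signals: StudentSignals) -> str:
    """Pick a messaging persona from inferred behavior, not stated facts."""
    likely_ill = not signals.on_campus_today and not signals.posting_on_social_media
    if signals.missed_class and likely_ill:
        return "empathetic"        # gentle check-in
    if signals.missed_class and signals.hours_since_last_lms_login > 8:
        return "stern"             # escalation: warnings about consequences
    return "neutral"


def draft_message(persona: str) -> str:
    templates = {
        "empathetic": "Hey, we missed you in class today. Is everything okay?",
        "stern": "You missed class and have not logged in. Your scholarship "
                 "standing may be affected, and your instructor has been notified.",
        "neutral": "Reminder: materials from today's class are posted on the LMS.",
    }
    return templates[persona]


if __name__ == "__main__":
    signals = StudentSignals(
        missed_class=True,
        hours_since_last_lms_login=10.0,
        on_campus_today=True,
        posting_on_social_media=True,
    )
    persona = choose_persona(signals)
    print(persona, "->", draft_message(persona))
```

That is exactly Watkins’ worry: the “empathy” is a template chosen by a behavioral classifier, and the same classifier can just as easily choose coercion.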


Not so much focused on learning ecosystems, but still worth mentioning:

The top 100 Gen AI Consumer Apps — from a16z.com / Andreessen Horowitz by Olivia Moore


 

 

The Edtech Insiders Rundown of SXSW EDU 2024 — from edtechinsiders.substack.com by Ben Kornell, Alex Sarlin, and Sarah Morin
And more on our ASU + GSV Happy Hour, GenAI in edtech market valuations, and interviews from The Common Sense Summit.

Theme 1: The Kids Are Not Alright
This year’s SXSW EDU had something for everyone, with over a dozen “Program Tracks.” However, the one theme that truly connected the entire conference was mental health.

Thirty-six sessions were specifically tagged with mental health and wellness, but in sessions on topics ranging from literacy to edtech to civic engagement, presenters returned again and again to the mental health crisis amongst teens and young adults.

Theme 2: Aye AI, Captain
Consistent with past conferences, this year’s event leaned into the K12 education world. As expected, one of the hottest topics for K12 was the role of AI (or lack thereof) in schools. Key takeaways included…


AI Literacy: A New Graduation Requirement and Civic Imperative — from gettingsmart.com by Tom Vander Ark and Mason Pashia

Key Points

  • There is still time to ensure that all of your students graduate with an understanding of how AI works, why it is important and how to best use it.

What would it look like to make a commitment that, come graduation, every senior will have at least basic AI literacy? This includes an appreciation of AI as a creation engine and learning partner, but also an understanding of the risks of deepfakes and biased curation. We’re entering a time where, to quote Ethan Mollick, “You can’t trust anything you read or see ever again.” Whether formal or informal, it’s time to start building AI literacy.


New Alabama Education Law Represents Small But Significant Advance — from forbes.com by Jeanne Allen

Valiant Cross Academy raises the bar for young men.

More than 50 years later, across the street from the church and concerned with declining education and the pace of social change, brothers Anthony and Fred Brock founded Valiant Cross Academy, an all-male academy aimed at “helping boys of color become men of valor.”

Valiant Cross embodies King’s hopes, pursuing the dream that its students will be judged by the content of their character, not the color of their skin, and working to ensure that they are well prepared for productive lives filled with accomplishment and purpose.

“We’re out to prove that it’s an opportunity gap, not an achievement gap,” says head of school Anthony Brock. And they have. In 2022, 100 percent of Valiant seniors graduated from the academy and went on to post-graduate options, enrolling in either four- or two-year colleges or established career-training programs.


 

 

GTC March 2024 Keynote with NVIDIA CEO Jensen Huang






 

Supreme Court: Bar exam will no longer be required to become attorney in Washington State — from spokesman.com by Emma Epperly

Excerpt (emphasis DSC):

The bar exam will no longer be required to become a lawyer in Washington, the state Supreme Court ruled in a pair of orders Friday.

The court approved alternative ways to show competency and earn a law license after appointing a task force to examine the issue in 2020.

The Bar Licensure Task Force found that the traditional exam “disproportionally and unnecessarily blocks” marginalized groups from becoming practicing attorneys and is “at best minimally effective” for ensuring competency, according to a news release from the Washington Administrative Office of the Courts.

 

12 Books for Instructional Designers to Read This Year — from theelearningcoach.com by Connie Malamed

Over the past year, many excellent and resourceful books have crossed my desk or Kindle. I’m rounding them up here so you can find a few to expand your horizons. The list below is in alphabetical order by title.

Each book is unique, yet as a collection, they reflect some common themes and trends in Learning and Development: a focus on empathy and emotion, adopting best practices from other fields, using data for greater impact, aligning projects with organizational goals, and developing consultative skills. The authors listed here are optimistic and forward-thinking—they believe change is possible. I hope you enjoy the books.

 


[Report] Generative AI Top 150: The World’s Most Used AI Tools (Feb 2024) — from flexos.work by Daan van Rossum
FlexOS.work surveyed Generative AI platforms to reveal which get used most. While ChatGPT reigns supreme, countless AI platforms are used by millions.

As the FlexOS research study “Generative AI at Work” concluded based on a survey amongst knowledge workers, ChatGPT reigns supreme.

2. AI Tool Usage is Way Higher Than People Expect – Beating Netflix, Pinterest, Twitch.
As measured by data analysis platform Similarweb based on global web traffic tracking, the AI tools in this list generate over 3 billion monthly visits.

With 1.67 billion visits, ChatGPT represents over half of this traffic and is already bigger than Netflix, Microsoft, Pinterest, Twitch, and The New York Times.



Artificial Intelligence Act: MEPs adopt landmark law — from europarl.europa.eu

  • Safeguards on general purpose artificial intelligence
  • Limits on the use of biometric identification systems by law enforcement
  • Bans on social scoring and AI used to manipulate or exploit user vulnerabilities
  • Right of consumers to launch complaints and receive meaningful explanations


The untargeted scraping of facial images from CCTV footage to create facial recognition databases will be banned.


A New Surge in Power Use Is Threatening U.S. Climate Goals — from nytimes.com by Brad Plumer and Nadja Popovich
A boom in data centers and factories is straining electric grids and propping up fossil fuels.

Something unusual is happening in America. Demand for electricity, which has stayed largely flat for two decades, has begun to surge.

Over the past year, electric utilities have nearly doubled their forecasts of how much additional power they’ll need by 2028 as they confront an unexpected explosion in the number of data centers, an abrupt resurgence in manufacturing driven by new federal laws, and millions of electric vehicles being plugged in.


OpenAI and the Fierce AI Industry Debate Over Open Source — from bloomberg.com by Rachel Metz

The tumult could seem like a distraction from the startup’s seemingly unending march toward AI advancement. But the tension, and the latest debate with Musk, illuminates a central question for OpenAI, along with the tech world at large as it’s increasingly consumed by artificial intelligence: Just how open should an AI company be?

The meaning of the word “open” in “OpenAI” seems to be a particular sticking point for both sides — something that you might think sounds, on the surface, pretty clear. But actual definitions are both complex and controversial.


Researchers develop AI-driven tool for near real-time cancer surveillance — from medicalxpress.com by Mark Alewine; via The Rundown AI
Artificial intelligence has delivered a major win for pathologists and researchers in the fight for improved cancer treatments and diagnoses.

In partnership with the National Cancer Institute, or NCI, researchers from the Department of Energy’s Oak Ridge National Laboratory and Louisiana State University developed a long-sequenced AI transformer capable of processing millions of pathology reports to provide experts researching cancer diagnoses and management with exponentially more accurate information on cancer reporting.


 

Doubts About Value Are Deterring College Enrollment — from insidehighered.com by Jessica Blake
Survey data suggests that prospective learners are being dissuaded from college by skepticism about whether degrees are worth the time and money.

Enrollment has been declining in higher education for more than a decade, and the most common explanations in recent years have been lingering effects of the pandemic and a looming demographic cliff expected to shrink the number of traditional-age college students. But new research suggests that public doubts about the value of a college degree are a key contributor.

The study—conducted by Edge Research, a marketing research firm, and HCM Strategists, a public policy and advocacy consulting firm with funding from the Bill & Melinda Gates Foundation—uses focus groups and parallel national surveys of current high school students and of adults who decided to leave college or who didn’t go at all to link the value proposition of a college degree and Americans’ behaviors after high school.

“At the end of the day, higher education has a lot of work to do to convince these audiences of its value,” said Terrell Dunn, an HCM consultant.


‘A Condemnation’: Under Mental Health Strains, Students Weigh Quitting College — from edsurge.com by Daniel Mollenkamp

When college students think about quitting, it’s most likely because of mental health strain or stress.

That’s according to the recent data from the “State of Higher Education Study,” conducted by the analytics company Gallup and the private foundation Lumina.

 

Edtech Unicorns Are Evolving Rather Than Disrupting — from bloomberg.com by Alex Webb

Consider Coursera Inc., the most prominent survivor of that early edtech hype. It’s now a public company, with a hefty $2.3 billion valuation. Finally, 12 years after it was founded — by, incidentally, another Google veteran, Andrew Ng — it’s set to report its first profit this year, according to analyst estimates. And the enterprise business is considerably more profitable, enjoying a 68% gross margin in 2023, compared to the consumer business’s 53% margin.

Figuring out the right match between training and utility is how several business schools seem to have developed successful online courses — which they are charging top dollar for. They’re in close contact with the sort of large corporations who hire their graduates, giving them a more intimate understanding of what those businesses seek.

Harvard Business School is one example. It made $74 million from online courses in fiscal 2022, the most recent year for which data is available.

 

Also see:

Cognition Labs Blog

 

Amid explosive demand, America is running out of power — from washingtonpost.com by Evan Halper
AI and the boom in clean-tech manufacturing are pushing America’s power grid to the brink. Utilities can’t keep up.

Vast swaths of the United States are at risk of running short of power as electricity-hungry data centers and clean-technology factories proliferate around the country, leaving utilities and regulators grasping for credible plans to expand the nation’s creaking power grid.

A major factor behind the skyrocketing demand is the rapid innovation in artificial intelligence, which is driving the construction of large warehouses of computing infrastructure that require exponentially more power than traditional data centers. AI is also part of a huge scale-up of cloud computing. Tech firms like Amazon, Apple, Google, Meta and Microsoft are scouring the nation for sites for new data centers, and many lesser-known firms are also on the hunt.


The Obscene Energy Demands of A.I. — from newyorker.com by Elizabeth Kolbert
How can the world reach net zero if it keeps inventing new ways to consume energy?

“There’s a fundamental mismatch between this technology and environmental sustainability,” de Vries said. Recently, the world’s most prominent A.I. cheerleader, Sam Altman, the C.E.O. of OpenAI, voiced similar concerns, albeit with a different spin. “I think we still don’t appreciate the energy needs of this technology,” Altman said at a public appearance in Davos. He didn’t see how these needs could be met, he went on, “without a breakthrough.” He added, “We need fusion or we need, like, radically cheaper solar plus storage, or something, at massive scale—like, a scale that no one is really planning for.”


A generative AI reset: Rewiring to turn potential into value in 2024 — from mckinsey.com by Eric Lamarre, Alex Singla, Alexander Sukharevsky, and Rodney Zemmel; via Philippa Hardman
The generative AI payoff may only come when companies do deeper organizational surgery on their business.

  • Figure out where gen AI copilots can give you a real competitive advantage
  • Upskill the talent you have but be clear about the gen-AI-specific skills you need
  • Form a centralized team to establish standards that enable responsible scaling
  • Set up the technology architecture to scale
  • Ensure data quality and focus on unstructured data to fuel your models
  • Build trust and reusability to drive adoption and scale

AI Prompt Engineering Is Dead — from spectrum.ieee.org
Long live AI prompt engineering

Since ChatGPT dropped in the fall of 2022, everyone and their donkey has tried their hand at prompt engineering—finding a clever way to phrase your query to a large language model (LLM) or AI art or video generator to get the best results or sidestep protections. The Internet is replete with prompt-engineering guides, cheat sheets, and advice threads to help you get the most out of an LLM.

However, new research suggests that prompt engineering is best done by the model itself, and not by a human engineer. This has cast doubt on prompt engineering’s future—and increased suspicions that a fair portion of prompt-engineering jobs may be a passing fad, at least as the field is currently imagined.
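To see what “prompt engineering done by the model itself” can look like in its simplest form, here is a rough sketch of an automated selection loop: score candidate prompts against a tiny labeled eval set and keep the winner. This is an illustrative toy, not the method from the research IEEE Spectrum covers; call_llm stands in for a real chat-completion call, and in a fuller version the candidate prompts would themselves be generated and mutated by the model.

```python
# Toy sketch of automated prompt selection. `call_llm` is a placeholder for a
# real LLM API call; the exact-match scoring is illustrative, not the cited
# research's procedure. A fuller loop would have the model generate and mutate
# the candidate prompts itself rather than using a fixed list.


def call_llm(prompt: str, question: str) -> str:
    """Placeholder: send the prompt and question to an LLM, return its reply."""
    # Crude stand-in behavior so the example runs without an API key.
    return "4" if "only the number" in prompt else "The answer is four."


EVAL_SET = [("What is 2 + 2?", "4")]

CANDIDATE_PROMPTS = [
    "Answer the question.",
    "Answer the question. Respond with only the number.",
    "Think step by step, then respond with only the number.",
]


def score(prompt: str) -> float:
    """Fraction of eval questions answered exactly right under this prompt."""
    hits = sum(call_llm(prompt, q).strip() == answer for q, answer in EVAL_SET)
    return hits / len(EVAL_SET)


def best_prompt(candidates: list[str]) -> str:
    """Keep whichever candidate scores highest on the eval set."""
    return max(candidates, key=score)


if __name__ == "__main__":
    print("Selected prompt:", best_prompt(CANDIDATE_PROMPTS))
```

Swap the stub for a real API call, let the model propose the candidates, and enlarge the eval set, and you have the skeleton of an automatic prompt optimizer.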


What the birth of the spreadsheet teaches us about generative AI — from timharford.com by Tim Harford; via Sam DeBrule

There is one very clear parallel between the digital spreadsheet and generative AI: both are computer apps that collapse time. A task that might have taken hours or days can suddenly be completed in seconds. So accept for a moment the premise that the digital spreadsheet has something to teach us about generative AI. What lessons should we absorb?

It’s that pace of change that gives me pause. Ethan Mollick, author of the forthcoming book Co-Intelligence, tells me “if progress on generative AI stops now, the spreadsheet is not a bad analogy”. We’d get some dramatic shifts in the workplace, a technology that broadly empowers workers and creates good new jobs, and everything would be fine. But is it going to stop any time soon? Mollick doubts that, and so do I.


 

 

A Notre Dame Senior’s Perspective on AI in the Classroom — from learning.nd.edu — by Sarah Ochocki; via Derek Bruff on LinkedIn

At this moment, as a college student trying to navigate the messy, fast-developing, and varied world of generative AI, I feel more confused than ever. I think most of us can share that feeling. There’s no roadmap on how to use AI in education, and there aren’t the typical years of proof to show something works. However, this promising new tool is sitting in front of us, and we would be foolish to not use it or talk about it.

I’ve used it to help me understand sample code I was viewing, rather than mindlessly trying to copy what I was trying to learn from. I’ve also used it to help prepare for a debate, practicing making counterarguments to the points it came up with.

AI alone cannot teach something; there needs to be critical interaction with the responses we are given. However, this is something that is true of any form of education. I could sit in a lecture for hours a week, but if I don’t do the homework or critically engage with the material, I don’t expect to learn anything.


A Map of Generative AI for Education — from medium.com by Laurence Holt; via GSV
An update to our map of the current state-of-the-art


Last ones (for now):


Survey: K-12 Students Want More Guidance on Using AI — from govtech.com by Lauraine Langreo
Research from the nonprofit National 4-H Council found that most 9- to 17-year-olds have an idea of what AI is and what it can do, but most would like help from adults in learning how to use different AI tools.

“Preparing young people for the workforce of the future means ensuring that they have a solid understanding of these new technologies that are reshaping our world,” Jill Bramble, the president and CEO of the National 4-H Council, said in a press release.

AI School Guidance Document Toolkit, with Free Comprehensive Review — from stefanbauschard.substack.com by Stefan Bauschard and Dr. Sabba Quidwai

 

This week in 5 numbers: Another faith-based college plans to close — by Natalie Schwartz
We’re rounding up some of our top recent stories, from Notre Dame College’s planned closure to Valparaiso’s potential academic cuts.

BY THE NUMBERS

  • 1,444
    The number of students who were enrolled at Notre Dame College in fall 2022, down 37% from 2014. The Roman Catholic college recently said it would close after the spring term, citing declining enrollment, along with rising costs and significant debt.
  • 28
    The number of academic programs that Valparaiso University may eliminate. Eric Johnson, the Indiana institution’s provost, said it offers too many majors, minors and graduate degrees in relation to its enrollment.

A couple of other items re: higher education that caught my eye were:

Universities Expect to Use More Tech in Future Classrooms—but Don’t Know How — from insidehighered.com by Lauren Coffey

University administrators see the need to implement education technology in their classrooms but are at a loss regarding how to do so, according to a new report.

The College Innovation Network released its first CIN Administrator EdTech survey today, which revealed that more than half (53 percent) of the 214 administrators surveyed do not feel extremely confident in choosing effective ed-tech products for their institutions.

“While administrators are excited about offering new ed-tech tools, they are lacking knowledge and data to help them make informed decisions that benefit students and faculty,” Omid Fotuhi, director of learning and innovation at WGU Labs, which funds the network, said in a statement.

From DSC:
I always appreciated our cross-disciplinary team at Calvin (then College). As we looked at enhancing our learning spaces, we had input from the Teaching & Learning Group, IT, A/V, the academic side of the house, and facilities. It was definitely a team-based approach. (As I think about it, it would have been helpful to have more channels for student feedback as well.)


Per Jeff Selingo:

Optionality. In my keynote, I pointed out that the academic calendar and credit hour in higher ed are like “shelf space” on the old television schedule that has been upended by streaming. In much the same way, we need similar optionality to meet the challenges of higher ed right now: from how students access learning (in-person, hybrid, online), to credentials (certificates, degrees), to how those experiences stack together for lifelong learning.

Culture in institutions. The common thread throughout the conference was how the culture of institutions (both universities and governments) needs to change so our structures and practices can evolve. Too many people in higher ed right now are employing a scarcity mindset and seeing every change as a zero-sum game. If you’re not happy about the present, as many attendees suggested, you’re not going to be excited about the future.

 

Accenture to acquire Udacity to build a learning platform focused on AI — from techcrunch.com by Ron Miller

Excerpt (emphasis DSC):

Accenture announced today that it would acquire the learning platform Udacity as part of an effort to build a learning platform focused on the growing interest in AI. While the company didn’t specify how much it paid for Udacity, it also announced a $1 billion investment in building a technology learning platform it’s calling LearnVantage.

“The rise of generative AI represents one of the most transformative changes in how work gets done and is driving a growing need for enterprises to train and upskill people in cloud, data and AI as they build their digital core and reinvent their enterprises,” Kishore Durg, global lead of Accenture LearnVantage, said in a statement.




 
© 2024 | Daniel Christian