Over the past year, many excellent and resourceful books have crossed my desk or Kindle. I’m rounding them up here so you can find a few to expand your horizons. The list below is in alphabetical order by title.
Each book is unique, yet as a collection, they reflect some common themes and trends in Learning and Development: a focus on empathy and emotion, adopting best practices from other fields, using data for greater impact, aligning projects with organizational goals, and developing consultative skills. The authors listed here are optimistic and forward-thinking—they believe change is possible. I hope you enjoy the books.
As the FlexOS research study “Generative AI at Work” concluded based on a survey of knowledge workers, ChatGPT reigns supreme. … 2. AI Tool Usage is Way Higher Than People Expect – Beating Netflix, Pinterest, Twitch. As measured by the data analysis platform Similarweb, based on global web-traffic tracking, the AI tools in this list generate over 3 billion monthly visits.
With 1.67 billion visits, ChatGPT represents over half of this traffic and is already bigger than Netflix, Microsoft, Pinterest, Twitch, and The New York Times.
Something unusual is happening in America. Demand for electricity, which has stayed largely flat for two decades, has begun to surge.
Over the past year, electric utilities have nearly doubled their forecasts of how much additional power they’ll need by 2028 as they confront an unexpected explosion in the number of data centers, an abrupt resurgence in manufacturing driven by new federal laws, and millions of electric vehicles being plugged in.
The tumult could seem like a distraction from the startup’s seemingly unending march toward AI advancement. But the tension, and the latest debate with Musk, illuminates a central question for OpenAI, along with the tech world at large as it’s increasingly consumed by artificial intelligence: Just how open should an AI company be?
…
The meaning of the word “open” in “OpenAI” seems to be a particular sticking point for both sides — something that you might think sounds, on the surface, pretty clear. But actual definitions are both complex and controversial.
In partnership with the National Cancer Institute, or NCI, researchers from the Department of Energy’s Oak Ridge National Laboratory and Louisiana State University developed a long-sequenced AI transformer capable of processing millions of pathology reports to provide experts researching cancer diagnoses and management with exponentially more accurate information on cancer reporting.
Doubts About Value Are Deterring College Enrollment — from insidehighered.com by Jessica Blake Survey data suggests that prospective learners are being dissuaded from college by skepticism about whether degrees are worth the time and money.
Enrollment has been declining in higher education for more than a decade, and the most common explanations in recent years have been lingering effects of the pandemic and a looming demographic cliff expected to shrink the number of traditional-age college students. But new research suggests that public doubts about the value of a college degree are a key contributor.
The study—conducted by Edge Research, a marketing research firm, and HCM Strategists, a public policy and advocacy consulting firm with funding from the Bill & Melinda Gates Foundation—uses focus groups and parallel national surveys of current high school students and of adults who decided to leave college or who didn’t go at all to link the value proposition of a college degree and Americans’ behaviors after high school.
“At the end of the day, higher education has a lot of work to do to convince these audiences of its value,” said Terrell Dunn, an HCM consultant.
When college students think about quitting, it’s most likely because of mental health strain or stress.
That’s according to recent data from the “State of Higher Education Study,” conducted by the analytics company Gallup and the private foundation Lumina.
Consider Coursera Inc., the most prominent survivor of that early edtech hype. It’s now a public company, with a hefty $2.3 billion valuation. Finally, 12 years after it was founded — incidentally, by another Google veteran, Andrew Ng — it’s set to report its first profit this year, according to analyst estimates. And the enterprise business is considerably more profitable, enjoying a 68% gross margin in 2023, compared to the consumer business’s 53% margin.
…
Figuring out the right match between training and utility is how several business schools seem to have developed successful online courses — which they are charging top dollar for. They’re in close contact with the sort of large corporations that hire their graduates, giving them a more intimate understanding of what those businesses seek.
Harvard Business School is one example. It made $74 million from online courses in fiscal 2022, the most recent year for which data is available.
DC: Hmmm…given that the militaries of the world have been integrating AI into their arsenals (likely for years), this kind of thing is a bit disturbing for me. Autonomous/self-correcting missiles, robotic tanks, drones, and more…here we come. Ouch. https://t.co/Qljl1U9m9S
— Daniel Christian (he/him/his) (@dchristian5) March 13, 2024
Vast swaths of the United States are at risk of running short of power as electricity-hungry data centers and clean-technology factories proliferate around the country, leaving utilities and regulators grasping for credible plans to expand the nation’s creaking power grid.
…
A major factor behind the skyrocketing demand is the rapid innovation in artificial intelligence, which is driving the construction of large warehouses of computing infrastructure that require exponentially more power than traditional data centers. AI is also part of a huge scale-up of cloud computing. Tech firms like Amazon, Apple, Google, Meta and Microsoft are scouring the nation for sites for new data centers, and many lesser-known firms are also on the hunt.
The Obscene Energy Demands of A.I. — from newyorker.com by Elizabeth Kolbert How can the world reach net zero if it keeps inventing new ways to consume energy?
“There’s a fundamental mismatch between this technology and environmental sustainability,” de Vries said. Recently, the world’s most prominent A.I. cheerleader, Sam Altman, the C.E.O. of OpenAI, voiced similar concerns, albeit with a different spin. “I think we still don’t appreciate the energy needs of this technology,” Altman said at a public appearance in Davos. He didn’t see how these needs could be met, he went on, “without a breakthrough.” He added, “We need fusion or we need, like, radically cheaper solar plus storage, or something, at massive scale—like, a scale that no one is really planning for.”
A generative AI reset: Rewiring to turn potential into value in 2024 — from mckinsey.com by Eric Lamarre, Alex Singla, Alexander Sukharevsky, and Rodney Zemmel; via Philippa Hardman The generative AI payoff may only come when companies do deeper organizational surgery on their business.
Figure out where gen AI copilots can give you a real competitive advantage
Upskill the talent you have but be clear about the gen-AI-specific skills you need
Form a centralized team to establish standards that enable responsible scaling
Set up the technology architecture to scale
Ensure data quality and focus on unstructured data to fuel your models
Build trust and reusability to drive adoption and scale
Since ChatGPT dropped in the fall of 2022, everyone and their donkey has tried their hand at prompt engineering—finding a clever way to phrase your query to a large language model (LLM) or AI art or video generator to get the best results or sidestep protections. The Internet is replete with prompt-engineering guides, cheat sheets, and advice threads to help you get the most out of an LLM.
…
However, new research suggests that prompt engineering is best done by the model itself, and not by a human engineer. This has cast doubt on prompt engineering’s future—and increased suspicions that a fair portion of prompt-engineering jobs may be a passing fad, at least as the field is currently imagined.
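The idea that the model itself can do the prompt engineering can be sketched as a simple search loop: an LLM (or a human seed list) proposes candidate prompts, each candidate is scored against a handful of labeled examples, and the best scorer wins. The sketch below is a minimal, hypothetical illustration — the scoring step is mocked with a keyword heuristic standing in for a real LLM evaluation call, and none of the names come from the research cited above.

```python
# Minimal sketch of model-driven prompt optimization.
# Assumption: in a real system, `score_prompt` would run an LLM on each
# labeled example and measure accuracy; here it is mocked with a keyword
# heuristic so the selection loop itself is runnable.

def score_prompt(prompt: str, examples: list[tuple[str, str]]) -> float:
    """Stand-in for evaluating a candidate prompt against labeled examples."""
    # Hypothetical heuristic: reward prompts that include useful task framing.
    keywords = ("step by step", "classify")
    return float(sum(kw in prompt.lower() for kw in keywords))

def optimize_prompt(candidates: list[str],
                    examples: list[tuple[str, str]]) -> str:
    """Return the best-scoring candidate, as an optimizer model would."""
    return max(candidates, key=lambda p: score_prompt(p, examples))

candidates = [
    "Classify the sentiment of this review.",
    "Think step by step, then classify the sentiment of this review.",
]
best = optimize_prompt(candidates, examples=[])
print(best)  # the step-by-step variant scores higher under the mock heuristic
```

In real automatic-prompt-optimization systems the proposal step is also delegated to a model, which rewrites the current best prompt each round — the loop structure stays the same.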
There is one very clear parallel between the digital spreadsheet and generative AI: both are computer apps that collapse time. A task that might have taken hours or days can suddenly be completed in seconds. So accept for a moment the premise that the digital spreadsheet has something to teach us about generative AI. What lessons should we absorb?
It’s that pace of change that gives me pause. Ethan Mollick, author of the forthcoming book Co-Intelligence, tells me “if progress on generative AI stops now, the spreadsheet is not a bad analogy”. We’d get some dramatic shifts in the workplace, a technology that broadly empowers workers and creates good new jobs, and everything would be fine. But is it going to stop any time soon? Mollick doubts that, and so do I.
At this moment, as a college student trying to navigate the messy, fast-developing, and varied world of generative AI, I feel more confused than ever. I think most of us can share that feeling. There’s no roadmap on how to use AI in education, and there aren’t the typical years of proof to show something works. However, this promising new tool is sitting in front of us, and we would be foolish to not use it or talk about it.
…
I’ve used it to help me understand sample code I was viewing, rather than mindlessly copying what I was trying to learn from. I’ve also used it to prepare for a debate, practicing counterarguments to the points it came up with.
AI alone cannot teach something; there needs to be critical interaction with the responses we are given. However, this is something that is true of any form of education. I could sit in a lecture for hours a week, but if I don’t do the homework or critically engage with the material, I don’t expect to learn anything.
Survey: K-12 Students Want More Guidance on Using AI — from govtech.com by Lauraine Langreo Research from the nonprofit National 4-H Council found that most 9- to 17-year-olds have an idea of what AI is and what it can do, but most would like help from adults in learning how to use different AI tools.
“Preparing young people for the workforce of the future means ensuring that they have a solid understanding of these new technologies that are reshaping our world,” Jill Bramble, the president and CEO of the National 4-H Council, said in a press release.
1,444
The number of students who were enrolled at Notre Dame College in fall 2022, down 37% from 2014. The Roman Catholic college recently said it would close after the spring term, citing declining enrollment, along with rising costs and significant debt.
28
The number of academic programs that Valparaiso University may eliminate. Eric Johnson, the Indiana institution’s provost, said it offers too many majors, minors and graduate degrees in relation to its enrollment.
…
A couple of other items re: higher education that caught my eye were:
University administrators see the need to implement education technology in their classrooms but are at a loss regarding how to do so, according to a new report.
The College Innovation Network released its first CIN Administrator EdTech survey today, which revealed that more than half (53 percent) of the 214 administrators surveyed do not feel extremely confident in choosing effective ed-tech products for their institutions.
“While administrators are excited about offering new ed-tech tools, they are lacking knowledge and data to help them make informed decisions that benefit students and faculty,” Omid Fotuhi, director of learning and innovation at WGU Labs, which funds the network, said in a statement.
From DSC: I always appreciated our cross-disciplinary team at Calvin (then College). As we looked at enhancing our learning spaces, we had input from the Teaching & Learning Group, IT, A/V, the academic side of the house, and facilities. It was definitely a team-based approach. (As I think about it, it would have been helpful to have more channels for student feedback as well.)
Optionality. In my keynote, I pointed out that the academic calendar and credit hour in higher ed are like “shelf space” on the old television schedule that has been upended by streaming. In much the same way, we need similar optionality to meet the challenges of higher ed right now: from how students access learning (in-person, hybrid, online) to credentials (certificates, degrees) to how those experiences stack together for lifelong learning.
Culture in institutions. The common thread throughout the conference was how the culture of institutions (both universities and governments) needs to change so our structures and practices can evolve. Too many people in higher ed right now are employing a scarcity mindset and seeing every change as a zero-sum game. If you’re not happy about the present, as many attendees suggested, you’re not going to be excited about the future.
Accenture announced today that it would acquire the learning platform Udacity as part of an effort to build a learning platform focused on the growing interest in AI. While the company didn’t specify how much it paid for Udacity, it also announced a $1 billion investment in building a technology learning platform it’s calling LearnVantage.
“The rise of generative AI represents one of the most transformative changes in how work gets done and is driving a growing need for enterprises to train and upskill people in cloud, data and AI as they build their digital core and reinvent their enterprises,” Kishore Durg, global lead of Accenture LearnVantage, said in a statement.
The market continues to be a matter of Canvas and Brightspace winning new accounts and Anthology’s Bb Learn and Moodle losing them, with more variety among smaller institutions.
Canvas has further consolidated its position as the market leader in North America, with 41% of the market share. Blackboard fell from 18% of the market share to 17%. Moodle has plateaued at 16% while Brightspace increased to 16%. As a reminder, we have removed Open LMS from the Moodle market share.
Populi LMS (3%), Open LMS (3%), Sakai (2%), and Schoology (1%) round out the remainder of the market, with 1% of the market going to Other.
As always, market share is very much a story of size, type of institution (public or private), and control. The numbers above refer to the market take as a whole, but if we start to drill down to different sizes and types of institution, we get very different market dynamics.
1. AI already plays a central part in the instructional design process
A whopping 95.3% of the instructional designers interviewed said they use AI in their day-to-day work. Those who don’t use AI cite access or permission issues as the primary reason they haven’t integrated AI into their process.
2. AI is predominantly used at the design and development stages of the instructional design process
When mapped to the ADDIE process, the breakdown of use cases goes as follows:
Analysis: 5.5% of use cases
Design: 32.1%
Development: 53.2%
Implementation: 1.8%
Evaluation: 7.3%
Speaking of AI in our learning ecosystems, also see:
The majority of educators expect use of artificial intelligence tools will increase in their school or district over the next year, according to an EdWeek Research Center survey.
Applying Multimodal AI to L&D — from learningguild.com by Sarah Clark We’re just starting to see Multimodal AI systems hit the spotlight. Unlike the text-based and image-generation AI tools we’ve seen before, multimodal systems can absorb and generate content in multiple formats – text, image, video, audio, etc.
Most people enroll in college because they believe it will help them secure a good job and open the door to economic opportunity. In “Talent Disrupted,” a new and updated version of the 2018 report, “The Permanent Detour,” Strada Institute for the Future of Work and The Burning Glass Institute show that a college degree is not always a guarantee of labor market success.
Using a combination of online career histories of tens of millions of graduates, as well as census microdata for millions of graduates, the report offers a comprehensive picture of how college graduates fare in the job market over their first decade of employment after college.
The report shows that only about half of bachelor’s degree graduates secure employment in a college-level job within a year of graduation.
Roughly half of college graduates end up in jobs where their degrees aren’t needed, and that underemployment has lasting implications for workers’ earnings and career paths.
That is the key finding of a new study tracking the career paths of more than 10 million people who entered the job market over the past decade. It suggests that the number of graduates in jobs that don’t make use of their skills or credentials—52%—is greater than previously thought, and underscores the lasting importance of that first job after graduation.
The early vibrations of AI have already been shaking the newsroom. One downside of the new technology surfaced at CNET and Sports Illustrated, where editors let AI run amok with disastrous results. Elsewhere in news media, AI is already writing headlines, managing paywalls to increase subscriptions, performing transcriptions, turning stories into audio feeds, discovering emerging stories, fact-checking, copy editing and more.
Felix M. Simon, a doctoral candidate at Oxford, recently published a white paper about AI’s journalistic future that eclipses many early studies. Swinging a bat from a crouch that is neither doomer nor Utopian, Simon heralds both the downsides and promise of AI’s introduction into the newsroom and the publisher’s suite.
Unlike earlier technological revolutions, AI is poised to change the business at every level. It will become — if it already isn’t — the beginning of most story assignments and will become, for some, the new assignment editor. Used effectively, it promises to make news more accurate and timely. Used frivolously, it will spawn an ocean of spam. Wherever the production and distribution of news can be automated or made “smarter,” AI will surely step up. But the future has not yet been written, Simon counsels. AI in the newsroom will be only as bad or good as its developers and users make it.
We proposed EMO, an expressive audio-driven portrait-video generation framework. Given a single reference image and vocal audio (e.g., talking or singing), our method can generate vocal avatar videos with expressive facial expressions and varied head poses; videos can be of any duration, depending on the length of the input audio.
New experimental work from Adobe Research is set to change how people create and edit custom audio and music. An early-stage generative AI music generation and editing tool, Project Music GenAI Control allows creators to generate music from text prompts, and then have fine-grained control to edit that audio for their precise needs.
“With Project Music GenAI Control, generative AI becomes your co-creator. It helps people craft music for their projects, whether they’re broadcasters, or podcasters, or anyone else who needs audio that’s just the right mood, tone, and length,” says Nicholas Bryan, Senior Research Scientist at Adobe Research and one of the creators of the technologies.
There’s a lot going on in the world of generative AI, but maybe the biggest is the increasing number of copyright lawsuits being filed against AI companies like OpenAI and Stability AI. So for this episode, we brought on Verge features editor Sarah Jeong, who’s a former lawyer just like me, and we’re going to talk about those cases and the main defense the AI companies are relying on in those copyright cases: an idea called fair use.
The FCC’s war on robocalls has gained a new weapon in its arsenal with the declaration of AI-generated voices as “artificial” and therefore definitely against the law when used in automated calling scams. It may not stop the flood of fake Joe Bidens that will almost certainly trouble our phones this election season, but it won’t hurt, either.
The new rule, contemplated for months and telegraphed last week, isn’t actually a new rule — the FCC can’t just invent them with no due process. Robocalls are just a new term for something largely already prohibited under the Telephone Consumer Protection Act: artificial and pre-recorded messages being sent out willy-nilly to every number in the phone book (something that still existed when they drafted the law).
EIEIO…Chips Ahoy! — from dashmedia.co by Michael Moe, Brent Peus, and Owen Ritz
Here Come the AI Worms — from wired.com by Matt Burgess Security researchers created an AI worm in a test environment that can automatically spread between generative AI agents—potentially stealing data and sending spam emails along the way.
Now, in a demonstration of the risks of connected, autonomous AI ecosystems, a group of researchers have created one of what they claim are the first generative AI worms—which can spread from one system to another, potentially stealing data or deploying malware in the process. “It basically means that now you have the ability to conduct or to perform a new kind of cyberattack that hasn’t been seen before,” says Ben Nassi, a Cornell Tech researcher behind the research.
In fact a slew of research over the past two decades has found that teaching online makes professors better teachers in their classrooms, so much so that one 2009 study recommended that “faculty should be trained in distance education methods and technologies and should be encouraged to use those methods back in the classroom.”
It’s a message I’ve been arguing for a while. But now that so many educators and students have had direct experience with online formats, it’s a narrative that seems to be sinking in.
Now is the time to fully embrace how physical classrooms can be improved by online techniques.
When professors teaching face-to-face adopt online pedagogy, the classroom is transformed into a “blended” experience, moving from conventional to active learning. And that helps students turn from passive to engaged participants in their own intellectual excursions.
The intent of this report is to help communities build their capacity for transformation of education, advancing toward what our society needs most—a system that works for young people. It draws on the experiences and insights of innovators across the United States who are already answering this challenge—creating learner-centered, community-based ecosystems.
…
This report includes:
a landscape analysis of select communities creating learning ecosystems;
a framework that emerged from the analysis and can be used by communities to consider their readiness and appetite for this transformation;
an invitation to communities to explore and discover their own path for reimagining education; and
a call for national and regional institutions to listen, learn from, and create the conditions for communities to pursue their visions.
From DSC:
The above items were accessed via the article below:
The Norris School District in Wisconsin exemplifies how learner profiles and community connections can enhance authentic learning experiences for young people, fostering a culture of belonging and responsibility.
Purdue Polytechnic High School demonstrates the importance of enabling conditions, such as creating microschools with access to shared services, to support a learner-centered approach while ensuring scalability and access to a variety of resources.