Enrollment Planning in the Specter of Closure — from insidehighered.com by Mark Campbell and Rachel Schreiber; via GSV
Misunderstandings about enrollment management and changing student needs can make a bad situation worse, Mark Campbell and Rachel Schreiber write.
Excerpts (emphasis DSC):
However, we find that many institutions provide little to no information to prospective students about actual outcomes for graduates. Examples include: What does applying to graduate school look like for graduates? Employment and earning potential? Average student loan debt? What do alumni say about their experience? What data do you have that is compelling to answer these and related questions? Families increasingly ask, “What is the ROI on this investment?”
… Another important issue relates to the unwillingness of leaders to evolve the institution to meet market demands. We have too often seen that storied, historic institutions have cultures that are change averse, and this seems to be particularly true in the liberal arts. This statement might appear to be controversial—but only if misunderstood.
To be clear, the humanities and the arts are vital, critical aspects of our institutions. But today’s prospective students are highly focused on career outcomes, given the financial investment they and their families are being asked to make. We believe that curricular offerings can place a high value on the core principles of the humanities and liberal arts while also preparing students for careers.
By contrast, curricular innovation, alterations to long-held marketing practices, openness to self-reflection regarding out-of-date programs, practices and policies—in short, a willingness to change and adapt—are all key. Finally, vital and successful institutions develop long-term strategic enrollment plans that are tactical, realistic and assessable and for which there is clarity about accountability. Putting these practices in place now can avert catastrophe down the road.
Meanwhile, a separate survey of faculty released Thursday by Ithaka S+R, a higher education consulting firm, showed that faculty—while increasingly familiar with AI—often do not know how to use it in classrooms. Two out of five faculty members are familiar with AI, the Ithaka report found, but only 14 percent said they are confident in their ability to use AI in their teaching. Just slightly more (18 percent) said they understand the teaching implications of generative AI.
“Serious concerns about academic integrity, ethics, accessibility, and educational effectiveness are contributing to this uncertainty and hostility,” the Ithaka report said.
The diverging views about AI are causing friction. Nearly a third of students said they have been warned by professors not to use generative AI, and more than half (59 percent) are concerned they will be accused of cheating with generative AI, according to the Pearson report, which was conducted with Morning Consult and surveyed 800 students.
What teachers want from AI — from hechingerreport.org by Javeria Salman
When teachers designed their own AI tools, they built math assistants, tools for improving student writing, and more
An AI chatbot that walks students through how to solve math problems. An AI instructional coach designed to help English teachers create lesson plans and project ideas. An AI tutor that helps middle and high schoolers become better writers.
These aren’t tools created by education technology companies. They were designed by teachers tasked with using AI to solve a problem their students were experiencing.
Over five weeks this spring, about 300 people – teachers, school and district leaders, higher ed faculty, education consultants and AI researchers – came together to learn how to use AI and develop their own basic AI tools and resources. The professional development opportunity was designed by technology nonprofit Playlab.ai and faculty at the Relay Graduate School of Education.
Next-Gen Classroom Observations, Powered by AI — from educationnext.org by Michael J. Petrilli
The use of video recordings in classrooms to improve teacher performance is nothing new. But the advent of artificial intelligence could add a helpful evaluative tool for teachers, measuring instructional practice relative to common professional goals with chatbot feedback.
Multiple companies are pairing AI with inexpensive, ubiquitous video technology to provide feedback to educators through asynchronous, offsite observation. It’s an appealing idea, especially given the promise and popularity of instructional coaching, as well as the challenge of scaling it effectively (see “Taking Teacher Coaching To Scale,” research, Fall 2018).
…
Enter AI. Edthena is now offering an “AI Coach” chatbot that gives teachers specific prompts as they privately watch recordings of their lessons. The chatbot is designed to help teachers view their practice relative to common professional goals and to develop action plans to improve.
To be sure, an AI coach is no replacement for human coaching.
We need to shift our thinking about GenAI tutors serving only as personal learning tools. The above activities illustrate how these tools can be integrated into contemporary classroom instruction. The activities should not be seen as prescriptive but merely suggestive of how GenAI can be used to promote social learning. Although I specifically mention only one online activity (“Blended Learning”), all can be adapted to work well in online or blended classes to promote social interaction.
Stealth AI — from higherai.substack.com by Jason Gulya (a Professor of English at Berkeley College), who talks to Zack Kinzler
What happens when students use AI all the time, but aren’t allowed to talk about it?
In many ways, this comes back to one of my general rules: You cannot ban AI in the classroom. You can only issue a gag rule.
And if you do issue a gag rule, then it deprives students of the space they often need to make heads or tails of this technology.
We need to listen to actual students talking about actual uses, and reflecting on their actual feelings. No more abstraction.
In this conversation, Jason Gulya (a Professor of English at Berkeley College) talks to Zack Kinzler about what students are saying about Artificial Intelligence and education.
Welcome to our monthly update for Teams for Education and thank you so much for being part of our growing community! We’re thrilled to share over 20 updates and resources and show them in action next week at ISTELive 24 in Denver, Colorado, US.
…
Copilot for Microsoft 365 – Educator features
Guided Content Creation
Coming soon to Copilot for Microsoft 365 is a guided content generation experience to help educators get started with creating materials like assignments, lesson plans, lecture slides, and more. The content will be created based on the educator’s requirements, with easy ways to customize the content to their exact needs.
Standards alignment and creation
Quiz generation through Copilot in Forms
Suggested AI Feedback for Educators
Teaching extension
To better support educators with their daily tasks, we’ll be launching a built-in Teaching extension to help guide them through relevant activities and provide contextual, educator-based support in Copilot.
Education data integration
Copilot for Microsoft 365 – Student features
Interactive practice experiences
Flashcards activity
Guided chat activity
Learning extension in Copilot for Microsoft 365
…
New AI tools for Google Workspace for Education — from blog.google by Akshay Kirtikar and Brian Hendricks
We’re bringing Gemini to teen students using their school accounts to help them learn responsibly and confidently in an AI-first future, and empowering educators with new tools to help create great learning experiences.
According to Flighty, I logged more than 2,220 flight miles in the last 5 days traveling to three conferences to give keynotes and spend time with housing officers in Milwaukee, college presidents in Mackinac Island, MI, and enrollment and marketing leaders in Raleigh.
Before I rest, I wanted to post some quick thoughts about what I learned. Thank you to everyone who shared their wisdom these past few days:
We need to think about the “why” and “how” of AI in higher ed. The “why” shouldn’t be just because everyone else is doing it. Rather, the “why” is to reposition higher ed for a different future of competitors. The “how” shouldn’t be to just seek efficiency and cut jobs. Rather we should use AI to learn from its users to create a better experience going forward.
Residence halls are not just infrastructure. They are part and parcel of the student experience and critical to student success. Almost half of students living on campus say it increases their sense of belonging, according to research by the Association of College & University Housing Officers.
How do we extend the “residential experience”? More than half of traditional undergraduates who live on campus now take at least one course online. As students increasingly spend time off campus – or move off campus as early as their second year in college – we need to help them continue to make the kinds of connections they would make in a dorm. Why? 47% of college students believe living in a college residence hall enhanced their ability to resolve conflicts.
Career must be at the core of the student experience for colleges to thrive in the future, says Andy Chan. Yes, some people might see that as too narrow a view of higher ed or might not want to provide cogs for the wheel of the workforce, but without the job, none of the other benefits of college follow: citizenship, health, engagement.
A “triple threat grad”–someone who has an internship, a semester-long project, and an industry credential (think Salesforce or Adobe) in addition to their degree–matters more in the job market than major or institution, says Brandon Busteed.
Every faculty member should think of themselves as an ambassador for the institution. Yes, they should care about their discipline/department, but the department doesn’t survive if the rest of the institution falls down around it.
Presidents need to place bigger bets rather than scatter nickels and dimes across a bunch of new strategies. That means they need to stop doing some things in order to free up resources.
Higher ed needs a new business model. Institutions can’t make money just from tuition, and new products, like certificates, bring in pennies on the dollar compared with degrees.
Boards aren’t ready for the future. They are over-indexed on philanthropy and alumni relations and under-indexed on the expertise needed to lead higher ed.
That’s the percentage of high school graduates going right on to college. A decade ago it was around 70%. So for all the bellyaching about the demographic cliff in higher ed, just imagine if today we were close to that 70% number. We’d be talking about a few hundred thousand more students in the system.
As I told a gathering of presidents of small colleges and universities last night on Mackinac Island — the first time I had to take [numerous modes of transportation] to get to a conference — being small isn’t distinctive anymore.
There are many reasons undergrad enrollment is down, but they all come down to two interrelated trends: jobs and affordability.
The job has become so central to what students want out of the experience. It’s almost as if colleges now need to guarantee a job.
These institutions will need to rethink the learner relationship with work. Instead of college with work on the side, we might need to move to more of a mindset of work with college on the side by:
Making campus jobs more meaningful. Why can’t accounting and finance majors work in the CFO’s office, and liberal arts majors work in IT on platforms such as Salesforce and Workday, which are skills needed in the workplace?
Apprenticeships are not just for the trades anymore. Integrate work-based learning into the undergrad experience in a much bigger way than internships and even co-ops.
Credentials within the degree. Every graduate should leave college not just with a BA but also with a certified credential in things like data viz, project management, the Adobe suite, Alteryx, etc.
The curriculum needs to be more flexible for students to combine work and learning — not only for the experience but also money for college — so more availability of online courses, hybrid courses, and flexible semesters.
We have to provide instructors the support they need to leverage educational technologies like generative AI effectively in the service of learning. Given the amount of benefit that could accrue to students if powerful tools like generative AI were used effectively by instructors, it seems unethical not to provide instructors with professional development that helps them better understand how learning occurs and what effective teaching looks like. Without more training and support for instructors, the amount of student learning higher education will collectively “leave on the table” will only increase as generative AI gets more and more capable. And that’s a problem.
From DSC: As is often the case, David put together a solid posting here. A few comments/reflections on it:
I agree that more training/professional development is needed, especially regarding generative AI. This would help achieve a far greater ROI and impact.
The pace of change makes it difficult to see where the sand is settling…and thus what to focus on
The Teaching & Learning Groups out there are also trying to learn and grow in their knowledge (so that they can train others)
The administrators out there are also trying to figure out what all of this generative AI stuff is all about; and so are the faculty members. It takes time for educational technologies’ impact to roll out and be integrated into how people teach.
As we’re talking about multiple disciplines here, I think we need more team-based content creation and delivery.
There needs to be more research on how best to use AI — again, it would be helpful if the sand settled a bit first, so as not to waste time and $$. But then that research needs to be piped into the classrooms far better.
Survey: Most Students Approve of Education Quality, Climate — from insidehighered.com by Colleen Flaherty
Initial findings from Inside Higher Ed’s annual Student Voice survey challenge popular narratives about how college is failing students, while also pointing to areas for improvement.
Public doubts about higher education may be increasing, but three in four current students rate the quality of education they’re receiving as good (46 percent) or excellent (27 percent), according to just-in results from Inside Higher Ed’s annual Student Voice survey. This is relatively consistent across institution types.
How Learning Designers Are Using AI for Analysis — from drphilippahardman.substack.com by Dr. Philippa Hardman
A practical guide on how to 10X your analysis process using free AI tools, based on real use cases
There are three key areas where AI tools make a significant impact on how we tackle the analysis part of the learning design process:
Understanding the why: what is the problem this learning experience solves? What’s the change we want to see as a result?
Defining the who: who do we need to target in order to solve the problem and achieve the intended goal?
Clarifying the what: given who our learners are and the goal we want to achieve, what concepts and skills do we need to teach?
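For readers who want to experiment with this kind of analysis programmatically rather than in a chat window, here is a minimal, hypothetical sketch using the OpenAI Python SDK. The model name, the sample learner feedback, and the exact wording of the prompts are illustrative assumptions, not part of Hardman's guide; the point is simply that a model can draft first-pass answers to the why/who/what questions, which the learning designer then validates against real data.

```python
# Hypothetical sketch: asking an LLM to draft answers to the "why / who / what"
# analysis questions from a pile of raw learner feedback.
# Assumptions: the OpenAI Python SDK (v1+) is installed, OPENAI_API_KEY is set,
# and the "gpt-4o-mini" model is available to your account.
from openai import OpenAI

client = OpenAI()

# Placeholder feedback; in practice this could be survey exports, tickets, etc.
raw_feedback = """
- New hires say the onboarding course is too long and too generic.
- Managers report that new hires still can't use the CRM after training.
"""

questions = [
    "Why: what problem should this learning experience solve, and what change do we want to see?",
    "Who: which learners do we need to target in order to solve the problem?",
    "What: given those learners and that goal, what concepts and skills should we teach?",
]

for q in questions:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a learning-design analyst. Answer concisely."},
            {"role": "user", "content": f"Learner feedback:\n{raw_feedback}\n\nQuestion: {q}"},
        ],
    )
    print(q)
    print(response.choices[0].message.content, "\n")
```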
Two new surveys, both released this month, show how high school and college-age students are embracing artificial intelligence. There are some inconsistencies and many unanswered questions, but what stands out is how much teens are turning to AI for information and to ask questions, not just to do their homework for them. And they’re using it for personal reasons as well as for school. Another big takeaway is that there are different patterns by race and ethnicity with Black, Hispanic and Asian American students often adopting AI faster than white students.
We’ve ceded so much trust to digital systems already that most simply assume a tool is safe to use with students because a company published it. We don’t check to see if it is compliant with any existing regulations. We don’t ask what powers it. We do not question what happens to our data or our students’ data once we upload it. We likewise don’t know where its information came from or how it came to generate human-like responses. The trust we put into these systems is entirely unearned and uncritical.
…
The allure of these AI tools for teachers is understandable—who doesn’t want to save time on the laborious process of designing lesson plans and materials? But we have to ask ourselves what is lost when we cede the instructional design process to an automated system without critical scrutiny.
From DSC: I post this to be a balanced publisher of information. I don’t agree with everything Marc says here, but he brings up several solid points.
As news about generative artificial intelligence (GenAI) continually splashes across social media feeds, including how ChatGPT 4o can help you play Rock, Paper, Scissors with a friend, breathtaking pronouncements about GenAI’s “disruptive” impact aren’t hard to find.
It turns out that it doesn’t make much sense to talk about GenAI as being “disruptive” in and of itself.
Can it be part of a disruptive innovation? You bet.
But much more important than just the AI technology in determining whether something is disruptive is the business model in which the AI is used—and its competitive impact on existing products and services in different markets.
On a somewhat related note, also see:
National summit explores how digital education can promote deeper learning — from digitaleducation.stanford.edu by Jenny Robinson; via Eric Kunnen on Linkedin.com
The conference, held at Stanford, was organized to help universities imagine how digital innovation can expand their reach, improve learning, and better serve the public good.
The summit was organized around several key questions: “What might learning design, learning technologies, and educational media look like in three, five, or ten years at our institutions? How will blended and digital education be poised to advance equitable, just, and accessible education systems and contribute to the public good? What structures will we need in place for our teams and offices?”
From DSC: Last Thursday, I presented at the Educational Technology Organization of Michigan’s Spring 2024 Retreat. I wanted to pass along my slides to you all, in case they are helpful to you.
Every six months or so, I write a guide to doing stuff with AI. A lot has changed since the last guide, while a few important things have stayed the same. It is time for an update.
…
To learn to do serious stuff with AI, choose a Large Language Model and just use it to do serious stuff – get advice, summarize meetings, generate ideas, write, produce reports, fill out forms, discuss strategy – whatever you do at work, ask the AI to help. A lot of people I talk to seem to get the most benefit from engaging the AI in conversation, often because it gives good advice, but also because just talking through an issue yourself can be very helpful. I know this may not seem particularly profound, but “always invite AI to the table” is the principle in my book that people tell me had the biggest impact on them. You won’t know what AI can (and can’t) do for you until you try to use it for everything you do. And don’t sweat prompting too much, though here are some useful tips, just start a conversation with AI and see where it goes.
You do need to use one of the most advanced frontier models, however.
Hybrid learning through podcasts: a practical approach — from timeshighereducation.com by Catherine Chambers
Adapting practice-based learning to a blend of synchronous and asynchronous delivery gives learners more control and creates opportunities for real-world learning of skills such as podcast production, writes Catherine Chambers
Hybrid learning provides students with greater control over their learning and enables the development of employability skills, supporting practice-based group work through in situ activities.
Aligned with Keele’s curriculum expectations framework, the module was designed around podcasts to support inclusivity, active learning, digital capability and external engagement.
Microsoft is partnering with Khan Academy in a multifaceted deal to demonstrate how AI can transform the way we learn. The cornerstone of today’s announcement centers on Khan Academy’s Khanmigo AI agent. Microsoft says it will migrate the bot to its Azure OpenAI Service, enabling the nonprofit educational organization to provide all U.S. K-12 educators free access to Khanmigo.
In addition, Microsoft plans to use its Phi-3 model to help Khan Academy improve math tutoring and collaborate to generate more high-quality learning content while making more courses available within Microsoft Copilot and Microsoft Teams for Education.
One in three American teachers have used artificial intelligence tools in their teaching at least once, with English and social studies teachers leading the way, according to a RAND Corporation survey released last month. While the new technology isn’t yet transforming how kids learn, both teachers and district leaders expect that it will become an increasingly common feature of school life.
When ChatGPT emerged a year and a half ago, many professors immediately worried that their students would use it as a substitute for doing their own written assignments — that they’d click a button on a chatbot instead of doing the thinking involved in responding to an essay prompt themselves.
But two English professors at Carnegie Mellon University had a different first reaction: They saw in this new technology a way to show students how to improve their writing skills.
“They start really polishing way too early,” Kaufer says. “And so what we’re trying to do is with AI, now you have a tool to rapidly prototype your language when you are prototyping the quality of your thinking.”
He says the concept is based on writing research from the 1980s that shows that experienced writers spend about 80 percent of their early writing time thinking about whole-text plans and organization and not about sentences.
On Building AI Models for Education — from aieducation.substack.com by Claire Zau
Google’s LearnLM, Khan Academy/MSFT’s Phi-3 Models, and OpenAI’s ChatGPT Edu
This piece primarily breaks down how Google’s LearnLM was built, and takes a quick look at Microsoft/Khan Academy’s Phi-3 and OpenAI’s ChatGPT Edu as alternative approaches to building an “education model” (not necessarily a new model in the latter case, but we’ll explain). Thanks to the public release of their 86-page research paper, we have the most comprehensive view into LearnLM. Our understanding of Microsoft/Khan Academy’s small language models and ChatGPT Edu is limited to the information provided through announcements, leaving us with less “under the hood” visibility into their development.
Answer AI is among a handful of popular apps that are leveraging the advent of ChatGPT and other large language models to help students with everything from writing history papers to solving physics problems. Of the top 20 education apps in the U.S. App Store, five are AI agents that help students with their school assignments, including Answer AI, according to data from Data.ai on May 21.
If your school (district) or university has not yet made significant efforts to think about how you will prepare your students for a World of AI, I suggest the following steps:
July 24 – Administrator PD & AI Guidance
In July, administrators should receive professional development on AI, if they haven’t already. This should include…
August 24 – Professional Development for Teachers and Staff…
Fall 24 — Parents; Co-curricular; Classroom experiments…
December 24 — Revision to Policy…
New ChatGPT Version Aiming at Higher Ed — from insidehighered.com by Lauren Coffey
ChatGPT Edu, emerging after initial partnerships with several universities, is prompting both cautious optimism and worries.
OpenAI unveiled a new version of ChatGPT focused on universities on Thursday, building on work with a handful of higher education institutions that partnered with the tech giant.
The ChatGPT Edu product, expected to start rolling out this summer, is a platform for institutions intended to give students free access. OpenAI said the artificial intelligence (AI) toolset could be used for an array of education applications, including tutoring, writing grant applications and reviewing résumés.
Introducing Perplexity Pages — from perplexity.ai
You’ve used Perplexity to search for answers, explore new topics, and expand your knowledge. Now, it’s time to share what you learned.
Meet Perplexity Pages, your new tool for easily transforming research into visually stunning, comprehensive content. Whether you’re crafting in-depth articles, detailed reports, or informative guides, Pages streamlines the process so you can focus on what matters most: sharing your knowledge with the world.
Seamless creation
Pages lets you effortlessly create, organize, and share information. Search any topic, and instantly receive a well-structured, beautifully formatted article. Publish your work to our growing library of user-generated content and share it directly with your audience with a single click.
A tool for everyone Pages is designed to empower creators in any field to share knowledge.
Educators: Develop comprehensive study guides for your students, breaking down complex topics into easily digestible content.
Researchers: Create detailed reports on your findings, making your work more accessible to a wider audience.
Hobbyists: Share your passions by creating engaging guides that inspire others to explore new interests.
AI’s New Conversation Skills Eyed for Education — from insidehighered.com by Lauren Coffey
The latest ChatGPT’s more human-like verbal communication has professors pondering personalized learning, on-demand tutoring and more classroom applications.
ChatGPT’s newest version, GPT-4o (the “o” standing for “omni,” meaning “all”), has a more realistic voice and quicker verbal response time, both aiming to sound more human. The version, which should be available to free ChatGPT users in coming weeks—a change also hailed by educators—allows people to interrupt it while it speaks, simulates more emotions with its voice and translates languages in real time. It also can understand instructions in text and images and has improved video capabilities.
…
Ajjan said she immediately thought the new vocal and video capabilities could allow GPT to serve as a personalized tutor. Personalized learning has been a focus for educators grappling with the looming enrollment cliff and for those pushing for student success.
There’s also the potential for role playing, according to Ajjan. She pointed to mock interviews students could do to prepare for job interviews, or, for example, using GPT to play the role of a buyer to help prepare students in an economics course.
Trends
As a first activity, we asked the Horizon panelists to provide input on the macro trends they believe are going to shape the future of postsecondary teaching and learning and to provide observable evidence for those trends. To ensure an expansive view of the larger trends serving as context for institutions of higher education, panelists provided input across five trend categories: social, technological, economic, environmental, and political. Given the widespread impacts of emerging AI technologies on higher education, we are also including in this year’s report a list of “honorary trends” focused on AI. After several rounds of voting, the panelists selected the following trends as the most important:
About one university or college per week so far this year, on average, has announced that it will close or merge. That’s up from a little more than two a month last year, according to the State Higher Education Executive Officers Association, or SHEEO.
…
Most students at colleges that close give up on their educations altogether. Fewer than half transfer to other institutions, a SHEEO study found. Of those, fewer than half stay long enough to get degrees. Many lose credits when they move from one school to another and have to spend longer in college, often taking out more loans to pay for it.
…
Colleges are almost certain to keep closing. As many as one in 10 four-year colleges and universities are in financial peril, the consulting firm EY Parthenon estimates.
Students who transfer lose an average of 43 percent of the credits they’ve already earned and paid for, the Government Accountability Office found in the most recent comprehensive study of this problem.
This paper explores how instructors can leverage generative AI to create personalized learning experiences for students that transform teaching and learning. We present a range of AI-based exercises that enable novel forms of practice and application, including simulations, mentoring, coaching, and co-creation. For each type of exercise, we provide prompts that instructors can customize, along with guidance on classroom implementation, assessment, and risks to consider. We also provide blueprints, prompts that help instructors create their own original prompts. Instructors can leverage their content and pedagogical expertise to design these experiences, putting them in the role of builders and innovators. We argue that this instructor-driven approach has the potential to democratize the development of educational technology by enabling individual instructors to create AI exercises and tools tailored to their students’ needs. While the exercises in this paper are a starting point, not definitive solutions, they demonstrate AI’s potential to expand what is possible in teaching and learning.