It’s an era many instructors would like to put behind them: black boxes on Zoom screens, muffled discussions behind masks, students struggling to stay engaged. But how much more challenging would teaching during the pandemic have been if colleges did not have experts on staff to help with the transition? On many campuses, teaching-center directors, instructional designers, educational technologists, and others worked alongside professors to explore learning-management systems, master video technology, and rethink what and how they teach.
A new book out this month, Higher Education Beyond Covid: New Teaching Paradigms and Promise, explores this period through the stories of campus teaching and learning centers. Their experiences reflect both successes and failures, and suggest what higher education could learn as it plans for the future.
As usual, our readers were full of suggestions. Kathryn Schild, the lead instructional designer in faculty development and instructional support at the University of Alaska at Anchorage, shared a guide she’s compiled on holding asynchronous discussions, which includes a section on difficult topics.
In an email, Schild also pulled out a few ideas she thought were particularly relevant to Le’s question, including:
Set the ground rules as a class. One way to do this is to share your draft rules in a collaborative document and ask students to annotate it and add suggestions.
Plan to hold fewer difficult discussions than in a face-to-face class, and work on quality over quantity. This could include multiweek discussions, where you spiral through the same issue with fresh perspectives as the class learns new approaches.
Start with relationship-building interactions in the first few weeks, such as introductions, low-stakes group assignments, or peer feedback.
Wagner herself recently relied on the power of collegial conversations to probe the question: What’s on the minds of educators as they make ready for the growing influence of AI in higher education? CT asked her for some takeaways from the process.
We are in the very early days of seeing how AI is going to affect education. Some of us are going to need to stay focused on the basic research to test hypotheses. Others are going to dive into laboratory “sandboxes” to see if we can build some new applications and tools for ourselves. Still others will continue to scan newsletters like ProductHunt every day to see what kinds of things people are working on. It’s going to be hard to keep up, to filter out the noise on our own. That’s one reason why thinking with colleagues is so very important.
We are interested in how K-12 education will change in terms of foundational learning. With in-class, active learning designs, will younger students do a lot more intensive building of foundational writing and critical thinking skills before they get to college?
The Human in the Loop: AI is built on math; think of it as applied statistics on steroids. Humans will be needed more than ever to manage, review, and evaluate the validity and reliability of results. Curation will be essential.
We will need to generate ideas about how to address AI factors such as privacy, equity, bias, copyright, intellectual property, accessibility, and scalability.
Have other institutions experimented with AI detection, and/or held off on emerging tools in this area? We recently adjusted our guidance and paused some of these tools, given the massive inaccuracies in detection (and the related downstream issues in faculty-elevated conduct cases).
Even though we learn repeatedly that innovation has a lot to do with effective project management and a solid message that helps people understand what they can do to implement change, people really need innovation to be more exciting and visionary than that. This is the place where we all need to help each other stay the course of change.
I have been talking to a lot of people about Generative AI, from teachers to business executives to artists to people actually building LLMs. In these conversations, a few key questions and themes keep coming up over and over again. Many of those questions are informed more by viral news articles about AI than by the real thing, so I thought I would try to answer a few of the most common, to the best of my ability.
I can’t blame people for asking because, for whatever reason, the companies actually building and releasing Large Language Models often seem allergic to providing any sort of documentation or tutorial besides technical notes. I was given much better documentation for the generic garden hose I bought on Amazon than for the immensely powerful AI tools being released by the world’s largest companies. So, it is no surprise that rumor has been the way that people learn about AI capabilities.
Currently, there are only really three AIs to consider: (1) OpenAI’s GPT-4 (which you can get access to with a Plus subscription or via Microsoft Bing in creative mode, for free), (2) Google’s Bard (free), or (3) Anthropic’s Claude 2 (free, but paid mode gets you faster access). As of today, GPT-4 is the clear leader, Claude 2 is second best (but can handle longer documents), and Google trails, though that will likely change when Google updates its model, which is rumored to be happening soon.
In this second edition, award-winning educator Sue Ellen Christian offers students an accessible and informed guide to how they can consume and create media intentionally and critically.
The textbook applies media literacy principles and critical thinking to the key issues facing young adults today, from analyzing and creating media messages to verifying information and understanding online privacy. Through discussion prompts, writing exercises, key terms, and links, readers are provided with a framework from which to critically consume and create media in their everyday lives. This new edition includes updates covering privacy aspects of AI, VR and the metaverse, and a new chapter on digital audiences, gaming, and the creative and often unpaid labor of social media and influencers. Chapters examine news literacy, online activism, digital inequality, social media and identity, and global media corporations, giving readers a nuanced understanding of the key concepts at the core of media literacy. Concise, creative, and curated, this book highlights the cultural, political, and economic dynamics of media in contemporary society, and how consumers can mindfully navigate their daily media use.
This textbook is perfect for students and educators of media literacy, journalism, and education looking to build their understanding in an engaging way.
Canva’s new AI tools automate boring, labor-intensive design tasks — from theverge.com by Jess Weatherbed
Magic Studio features like Magic Switch automatically convert your designs into blogs, social media posts, emails, and more to save time on manually editing documents.
Preparing Students for the AI-Enhanced Workforce — from insidehighered.com by Ray Schroeder
Our graduating and certificate-completing students need documented generative AI skills, and they need them now.
The common adage repeated again and again is that AI will not take your job; a person with AI skills will replace you. The learners we are teaching this fall who will be entering, re-entering or seeking advancement in the workforce at the end of the year or in the spring must become demonstrably skilled in using generative AI. The vast majority of white-collar jobs will demand the efficiencies and flexibilities defined by generative AI now and in the future. As higher education institutions, we will be called upon to document and validate generative AI skills.
Think all incoming teachers have a natural facility with technology just because most are digital natives? Think again.
Teacher preparation programs have a long way to go in preparing prospective educators to teach with technology, according to a report released September 12 by the International Society for Technology in Education, a nonprofit.
In fact, more than half of incoming teachers—56 percent—lack confidence in using learning technology prior to entering the classroom, according to survey data included with the report.
AI-Powered Higher Ed — from drphilippahardman.substack.com by Dr. Philippa Hardman
What a House of Commons round table discussion tells us about how AI will impact the purpose of higher education
In this week’s blog post I’ll summarise the discussion and share what we agreed would be the most likely new model of assessment in HE in the post-AI world.
But this in turn raises a bigger question: why do people go to university, and what is the role of higher education in the twenty-first century? Is it to create the workforce of the future? Or is it an institution for developing deep and original domain expertise? Can and should it be both?
In my previous position with Richmond Public Schools, we chose to dive in with computational thinking, programming and coding, in that order. I recommend building computational thinking (CT) competency first by helping students recognize and apply the four elements of CT to familiar problems/situations. Computational thinking should come first because it’s the highest order of problem-solving, is a cross-curricular skill and is understandable to both machines and humans. Here are the four components of CT and how to help students understand them.
So much of the way that we think about education and work is organized into silos. Sure, that’s one way to ensure a depth of knowledge in a field and to encourage learners to develop mastery. But it also leads to domains with strict boundaries. Colleges are typically organized into school sub-domains, managed like fiefdoms, with strict rules for professors who can teach in different schools.
Yet it’s at the intersections of seemingly disparate domains where breakthrough innovation can occur.
Maybe intersections bring a greater chance of future work opportunity, because that young person can increase their focus in one arena or another as they discover new options for work — and because this is what meaningful work in the future is going to look like.
From DSC: This posting strikes me as an endorsement for interdisciplinary degrees. I agree with much of this. It’s just hard to find the right combination of disciplines. But I suppose that depends upon the individual student and what he/she is passionate or curious about.
A lot of people have been asking if AI is really a big deal for the future of work. We have a new paper that strongly suggests the answer is YES.
Consultants using AI finished 12.2% more tasks on average, completed tasks 25.1% more quickly, and produced 40% higher quality results than those without. Those are some very big impacts. Now, let’s add in the nuance.
One of the biggest challenges to navigate now is the fact that more digital tools will come with generative AI already embedded in them, says Annette Vee, director of composition and an associate professor at the University of Pittsburgh. “It’s everywhere in professional writing.”
“We need to be fundamentally rethinking ways we teach writing, so we are thinking about integrating tools mindfully,” says Vee, who helped develop a new resource, TextGenEd, that provides guidance in this area. “The real challenge is how do we teach courses that are preparing students and that are smart about generative AI? We have very few teachers currently equipped to do that work.”
“It’s best if there are real stakes attached to the work, for example, an authentic audience the student is writing to,” he writes. “A subject on which students have both sufficient interest and knowledge in order to feel as though they can write convincingly to this audience also matters a lot.”
Also relevant/see — via Robert Gibson on LinkedIn:
Learnt.ai — Built for Learning Specialists — from learnt.ai
Harness the power of artificial intelligence to enhance your learning and development efforts with our easy-to-use platform – no technical expertise required!
Introducing Learnt.ai – a revolutionary collection of AI-powered content generation tools and AI chatbots that are specifically designed to support the common writing tasks of educationalists and learning and development professionals. Imagine being able to generate learning objectives on any topic of your choice, create engaging icebreakers and activities, write assessment questions with ease, and so much more.
Students are using artificial intelligence tools to assist them in their academic careers. Three students share their viewpoints on the tools they use and how using these tools helps them in their coursework and prepares them for the professional world.
Also relevant/see:
Why Professors Are Polarized on AI — from insidehighered.com by Susan D’Agostino
Academics who perceive threats to education from AI band together as a survival mechanism. The resulting alliances echo divisions formed during online learning’s emergence.
What does active learning require from students? It is no secret that PBL and all other active learning approaches are much more demanding of students than traditional methods, mainly in terms of skills and attitudes toward learning. Here are some of the aspects where students, especially when first faced with active learning, seem to struggle:
Formulating one's own learning goals and following through with independent study. While in traditional teaching the learning goals are given to students, in PBL (or at least in some of its purest variants), students need to come up with their own goals for each problem they are solving. This requires understanding the problem well, but also a certain frame of mind in which one can assess what is necessary to solve it and make a plan for how to go about it (independently and as a group). All of these seemingly easy steps are often new to students, who intrinsically expect them from us as educators.
From DSC: The above excerpt re: formulating one’s own learning goals reminded me of project management and learning how to be a project manager.
It reminded me of a project that I was assigned back at Kraft (actually Kraft General Foods at the time): an online directory of everyone in the company. When it was given to me, several questions arose in my mind:
Where do I start?
How do I even organize this project?
What is the list of to-do’s?
Who will I need to work with?
Luckily I had a mentor/guide who helped me get going and an excellent contact with the vendor who educated me and helped me get the ball rolling.
I’ll end with another quote and a brief comment:
Not being afraid of mistakes and learning from them. The education system, at all stages, still penalises mistakes, often with long-term consequences. So it’s no wonder students are afraid of making mistakes…
What we teachers desperately need, though, is an ocean of examples and training. We need to see and share examples of generative AI—any type of artificial intelligence that can be used to create new text, images, video, audio, code, or data—being used across the curriculum. We need catalogs of new lesson plans and new curriculum.
And we need training on theoretical and practical levels: training to understand what artificial intelligence actually is and where it stands in the development timeline and training about how to integrate it into our classes.
So, my advice to teachers is to use any and all the generative AI you can get your hands on. Then verify the information for yourself. Track it back to the source, because in doing so, you’ll land on the adjustments you need to make in your classes next year.
From DSC: Interesting.
Learners can now seamlessly transition between AI-powered assistance (AI Tutor) and Live Expert support, getting instant help through AI-guided learning or real-time interactions with a human expert.
ASSIGNMENT MAKEOVERS IN THE AI AGE WITH DEREK BRUFF — from teachinginhighered.com by Bonni Stachowiak
Derek Bruff shares about assignment makeovers in the AI age on episode 481 of the Teaching in Higher Ed podcast
Comment on this per Derek Bruff:
Why not ask ChatGPT to write what King or X would say about a current debate and then have the students critique the ChatGPT output? That would meet the same learning goals while also teaching AI literacy.
(Be sure to read Asim’s contribution for a useful take.)
Here’s a closer look at the current AI landscape in schools — and a prediction of what the future holds.
So far, high-profile ventures in the instruction realm, such as Kyron Learning, have fused teacher-produced, recorded content with LLM-powered conversational UX. The micro-learning tool Nolej references internet material when generating tasks and tests, but always keeps the language model close to the ground truth provided by teachers. Both are intriguing takes on re-imagining how to deliver core instruction while avoiding hallucinations (generated content that is nonsensical).
As a result, real-time 3D jobs are among the most in demand within the tech industry. According to Unity’s vice president of Education and Social Impact, Jessica Lindl, demand is 50% higher than for traditional IT jobs—adding that salaries for real-time 3D jobs are 60% greater.
“We want to provide really simple on-ramps and pathways that will lead you into entry-level jobs so that at any point in your career, you can decide to transfer into the industry,” Lindl says.
University World News continues its exploration of generative AI in our new special report on ‘AI and Higher Education’. In commentaries and features, academics and our journalists around the world investigate issues and developments around AI that are impacting on universities. Generative AI tools are challenging and changing higher education systems and institutions — how they are run as well as ways of teaching and learning and conducting research.
My advice for you today is this: fill your LinkedIn feed and/or inbox with ideas, inspirational writing and commentary on AI.
This will get you up to speed quickly and is a great way to stay informed on the newest movements you need to be aware of.
My personal recommendation for you is to check out these bright people who are all very active on LinkedIn and/or have a newsletter worth paying attention to.
I have kept the list fairly short – only 15 people – in order to make it as easy as possible for you to begin exploring.
It is crucial to recognize that the intrinsic value of higher education isn’t purely in its ability to adapt to market fluctuations or technological innovations. Its core strength lies in promoting critical thinking, nurturing creativity, and instilling a sense of purpose and belonging. As AI progresses, these traits will likely become even more crucial. The question then becomes whether higher education institutions as we know them today are the only ones, or indeed the best ones, equipped to convey those core strengths to students.
Higher education clearly finds itself caught in a whirlwind of transformation, both in its essence and execution. The juxtaposition of legacy structures and the evolving technological landscape paints a complex picture.
For institutional leaders, the dual challenge lies in proactively seeking and initiating change (not merely adapting to it) without losing sight of their foundational principles. Simultaneously, they must equip students with skills and perspectives that AI cannot replicate.
“They begged, bargained with, and berated their instructor in pursuit of better grades — not ‘because they like points,’ but rather ‘because the education system has told them that these points are the currency with which they can buy a successful future.’”
During this special keynote presentation, Western Michigan University (WMU) professor Sue Ellen Christian speaks about the importance of media literacy for all ages and how we can help educate our friends and families about media literacy principles. Hosted by the Grand Rapids Public Library and GRTV, a program of the Grand Rapids Community Media Center. Special thanks to the Grand Rapids Public Library Foundation for their support of this program.
Per Jeff Maggioncalda, Coursera CEO: “This system-wide industry micro-credential program sets an innovative blueprint for the future of higher education.”
The University of Texas and Coursera, the online learning platform and a pioneer of Massive Open Online Courses (MOOCs), are launching a large-scale, industry-recognized micro-credential program. The collaboration was announced today in a blog post by Coursera.
Through the new partnership, every student, faculty member, and staff member (and even alumni) across all nine universities in the University of Texas (UT) System will gain access to Coursera’s Career Academy at no additional cost to them.
The Homework Apocalypse — from oneusefulthing.org by Ethan Mollick
Fall is going to be very different this year. Educators need to be ready.
Excerpt:
Students will cheat with AI. But they also will begin to integrate AI into everything they do, raising new questions for educators. Students will want to understand why they are doing assignments that seem obsolete thanks to AI. They will want to use AI as a learning companion, a co-author, or a teammate. They will want to accomplish more than they did before, and also want answers about what AI means for their future learning paths. Schools will need to decide how to respond to this flood of questions.
The challenge of AI in education can feel abstract, so to understand a bit more about what is going to happen, I wanted to examine some common assignment types.
After a successful career as a recording artist, David “TC” Ellis created Studio 4 in St. Paul to spot budding music stars.
It became a hangout spot for creative young people, most of whom had “dropped out of school due to boredom and a sense that school wasn’t relevant to their lives and dreams.”
Ellis and colleagues then opened the High School for Recording Arts in 1998.
— Daniel Christian (he/him/his) (@dchristian5) June 23, 2023
On giving AI eyes and ears — from oneusefulthing.org by Ethan Mollick
AI can listen and see, with bigger implications than we might realize.
Excerpt:
But even this is just the beginning, and new modes of using AI are appearing, which further increases their capabilities. I want to show you some examples of this emerging world, which I think will soon introduce a new wave of AI use cases, and accompanying disruption.
We need to recognize that these capabilities will continue to grow, and AI will be able to play a more active role in the real world by observing and listening. The implications are likely to be profound, and we should start thinking through both the huge benefits and major concerns today.
Even though generative AI is a new thing, it doesn’t change why students cheat. They’ve always cheated for the same reason: They don’t find the work meaningful, and they don’t think they can achieve it to their satisfaction. So we need to design assessments that students find meaning in.
Tricia Bertram Gallant
Caught off guard by AI — from chronicle.com by Beth McMurtrie and Beckie Supiano
Professors scrambled to react to ChatGPT this spring — and started planning for the fall
Excerpt:
Is it cheating to use AI to brainstorm, or should that distinction be reserved for writing that you pretend is yours? Should AI be banned from the classroom, or is that irresponsible, given how quickly it is seeping into everyday life? Should a student caught cheating with AI be punished because they passed work off as their own, or given a second chance, especially if different professors have different rules and students aren’t always sure what use is appropriate?
…OpenAI built tool use right into the GPT API with an update called function calling. It’s a little like a child’s ability to ask their parents to help them with a task that they know they can’t do on their own. Except in this case, instead of parents, GPT can call out to external code, databases, or other APIs when it needs to.
Each function in function calling represents a tool that a GPT model can use when necessary, and GPT gets to decide which ones it wants to use and when. This instantly upgrades GPT capabilities—not because it can now do every task perfectly—but because it now knows how to ask for what it wants and get it.
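The "ask for help" loop described above can be sketched in a few lines of Python. This is a minimal, simulated illustration: the `get_current_weather` tool, its schema, and the `dispatch` helper are hypothetical names invented for this sketch, and in a real integration the `function_call` dict would come back from the Chat Completions API rather than being hard-coded here.

```python
import json

# A hypothetical local tool the model can request. The function name,
# schema, and dispatcher are illustrative, not from any official SDK.
def get_current_weather(location: str, unit: str = "celsius") -> str:
    """Stub standing in for a real weather lookup."""
    return json.dumps({"location": location, "temperature": 22, "unit": unit})

# JSON Schema description of the tool, the shape advertised to the model
# (via the `functions` parameter of the 2023-era Chat Completions API).
WEATHER_SCHEMA = {
    "name": "get_current_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["location"],
    },
}

AVAILABLE_TOOLS = {"get_current_weather": get_current_weather}

def dispatch(function_call: dict) -> str:
    """Run the tool the model asked for and return its string result.

    `function_call` mirrors the shape the API returns:
    {"name": "<tool name>", "arguments": "<JSON-encoded string>"}.
    """
    tool = AVAILABLE_TOOLS[function_call["name"]]
    args = json.loads(function_call["arguments"])  # arguments arrive as a JSON string
    return tool(**args)

# Simulated model turn: GPT decided it needs the weather tool.
simulated_call = {
    "name": "get_current_weather",
    "arguments": '{"location": "Grand Rapids", "unit": "celsius"}',
}
result = dispatch(simulated_call)
print(result)  # this result would be sent back to the model as a new message
```

The schema is the only part the model ever sees; the dispatcher plays the "parent" in the analogy, doing the work the model can't do itself and handing the answer back.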
How ChatGPT can help disrupt assessment overload — from timeshighereducation.com by David Carless
Advances in AI are not necessarily the enemy – in fact, they should prompt long overdue consideration of assessment types and frequency, says David Carless
Excerpt:
Reducing the assessment burden could support trust in students as individuals wanting to produce worthwhile, original work. Indeed, students can be co-opted as partners in designing their own assessment tasks, so they can produce something meaningful to them.
A strategic reduction in quantity of assessment would also facilitate a refocusing of assessment priorities on deep understanding more than just performance and carries potential to enhance feedback processes.
If we were to tackle assessment overload in these ways, it opens up various possibilities. Most significantly there is potential to revitalise feedback so that it becomes a core part of a learning cycle rather than an adjunct at its end. End-of-semester, product-oriented feedback, which comes after grades have already been awarded, fails to encourage the iterative loops and spirals typical of productive learning.
Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent AI tools in education and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.
These claims conjure up the rosiest of images: human resource departments and their robot buddies solving discrimination in workplace hiring. It seems plausible, in theory, that AI could root out unconscious bias, but a growing body of research shows the opposite may be more likely.
…
Companies’ use of AI didn’t come out of nowhere: For example, automated applicant tracking systems have been used in hiring for decades. That means if you’ve applied for a job, your resume and cover letter were likely scanned by an automated system. You probably heard from a chatbot at some point in the process. Your interview might have been automatically scheduled and later even assessed by AI.
From DSC:
Here was my reflection on this:
DC: Along these lines, I wonder if Applicant Tracking Systems cause us to become like typecast actors and actresses — only thought of for certain roles. Pigeonholed.
— Daniel Christian (he/him/his) (@dchristian5) June 23, 2023
In June, ResumeBuilder.com surveyed more than 1,000 employees who are involved in hiring processes at their workplaces to find out about their companies’ use of AI interviews.
The results:
43% of companies already have or plan to adopt AI interviews by 2024
Two-thirds of this group believe AI interviews will increase hiring efficiency
15% say that AI will be used to make decisions on candidates without any human input
More than half believe AI will eventually replace human hiring managers
Watch OpenAI CEO Sam Altman on the Future of AI — from bloomberg.com
Sam Altman, CEO & Co-Founder, OpenAI discusses the explosive rise of OpenAI and its products and what an AI-laced future can look like with Bloomberg’s Emily Chang at the Bloomberg Technology Summit.
The implementation of generative AI within these products will dramatically improve educators’ ability to deliver personalized learning to students at scale by enabling the application of personalized assessments and learning pathways based on individual student needs and learning goals. K-12 educators will also benefit from access to OpenAI technology…