Last week, Matt Barnum reported in Chalkbeat that the Chan Zuckerberg Initiative is laying off dozens of staff members and pivoting away from the personalized learning platform they have funded since 2015 with somewhere near $100M.
…
I have tried to illustrate as often as my subscribers will tolerate that students don’t particularly enjoy learning alone with laptops within social spaces like classrooms. That learning fails to answer their questions about their social identity. It contributes to their feelings of alienation and disbelonging. I find this case easy to make but hard to prove. Maybe we just haven’t done personalized learning right? Maybe Summit just needed to include generative AI chatbots in their platform?
What is far easier to prove, or rather to disprove, is the idea that “whole class instruction must feel impersonal to students,” that “whole class instruction must necessarily fail to meet the needs of individual students.”
From DSC: I appreciate Dan’s comments here (as highlighted above) as they are helpful in my thoughts regarding the Learning from the Living [Class] Room vision. They seem to be echoed here by Jeppe Klitgaard Stricker when he says:
Personalized learning paths can be great, but they also entail a potential abolishment or unintended dissolution of learning communities and belonging.
Perhaps this powerful, global, Artificial Intelligence (AI)-backed, next-generation, lifelong learning platform of the future will be more focused on postsecondary students and experiences — but not so much for the K12 learning ecosystem.
But the school systems I’ve seen here in Michigan (USA) represent systems that address a majority of the class only. These one-size-fits-all systems don’t work for many students who need extra help and/or who are gifted students. The trains move fast. Good luck if you can’t keep up with the pace.
But if K-12’ers are involved in a future learning platform, the platform needs to address what Dan’s saying. It must address students’ questions about their social identity and not contribute to their feelings of alienation and disbelonging. It needs to support communities of practice and learning communities.
To use Sherpa, an instructor first uploads the reading they’ve assigned, or they can have the student upload a paper they’ve written. Then the tool asks a series of questions about the text (either questions input by the instructor or generated by the AI) to test the student’s grasp of key concepts. The software gives the instructor the choice of whether they want the tool to record audio and video of the conversation, or just audio.
The tool then uses AI to transcribe the audio from each student’s recording and flags areas where the student answer seemed off point. Teachers can review the recording or transcript of the conversation and look at what Sherpa flagged as trouble to evaluate the student’s response.
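Sherpa’s internals aren’t public, but the flagging step described above can be made concrete with a toy sketch. Everything below is an assumption for illustration only: it scores each transcribed answer by keyword overlap with the instructor’s key concepts and flags low-scoring answers for teacher review (a real system would use an LLM rather than string matching).

```python
# Toy illustration of "flag off-point answers": score each transcribed
# answer by overlap with expected key concepts; flag low scores for
# teacher review. Names and threshold are invented for this sketch.

def flag_answers(answers, key_concepts, threshold=0.5):
    """Return (question, score, flagged) triples for teacher review."""
    concepts = {c.lower() for c in key_concepts}
    results = []
    for question, transcript in answers.items():
        words = set(transcript.lower().split())
        score = len(concepts & words) / len(concepts)
        results.append((question, round(score, 2), score < threshold))
    return results

answers = {
    "What causes the seasons?": "the tilt of earth's axis changes sunlight angle",
    "Why is summer warmer?": "because the sun is closer to earth",  # misconception
}

for question, score, flagged in flag_answers(
        answers, ["tilt", "axis", "sunlight", "angle"]):
    print(question, score, "FLAG" if flagged else "ok")
```

The teacher still makes the call; the tool only surfaces which recordings are worth a closer look.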
AI Meets Med School — from insidehighered.com by Lauren Coffey
Adding to academia’s AI embrace, two institutions in the University of Texas system are jointly offering a medical degree paired with a master’s in artificial intelligence.
The University of Texas at San Antonio has launched a dual-degree program combining medical school with a master’s in artificial intelligence.
Several universities across the nation have begun integrating AI into medical practice. Medical schools at the University of Florida, the University of Illinois, the University of Alabama at Birmingham and Stanford and Harvard Universities all offer variations of a certificate in AI in medicine that is largely geared toward existing professionals.
“I think schools are looking at, ‘How do we integrate and teach the uses of AI?’” Dr. Whelan said. “And in general, when there is an innovation, you want to integrate it into the curriculum at the right pace.”
Speaking of emerging technologies and med school, also see:
— Salma – Midjourney & SD AI Product Photographer (@Salmaaboukarr) September 29, 2023
How to stop AI deepfakes from sinking society — and science — from nature.com by Nicola Jones; via The Neuron
Deceptive videos and images created using generative AI could sway elections, crash stock markets and ruin reputations. Researchers are developing methods to limit their harm.
48+ hours since ChatGPT-4V started rolling out for Plus and enterprise users.
With just under 10 acquisitions in the last 5 years, PowerSchool has been active in transforming itself from a student information systems company to an integrated education company that works across the day and lifecycle of K–12 students and educators. What’s more, the company turned heads in June with its announcement that it was partnering with Microsoft to integrate AI into its PowerSchool Performance Matters and PowerSchool LearningNav products to empower educators in delivering transformative personalized-learning pathways for students.
As readers of this series know, I’ve developed a six-session design/build workshop series for learning design teams to create an AI Learning Design Assistant (ALDA). In my last post in this series, I provided an elaborate ChatGPT prompt that can be used as a rapid prototype that everyone can try out and experiment with. In this post, I’d like to focus on how to address the challenges of AI literacy effectively and equitably.
Countries worldwide are designing and implementing AI governance legislation commensurate to the velocity and variety of proliferating AI-powered technologies. Legislative efforts include the development of comprehensive legislation, focused legislation for specific use cases, and voluntary guidelines and standards.
This tracker identifies legislative policy and related developments in a subset of jurisdictions. It is not globally comprehensive, nor does it include all AI initiatives within each jurisdiction, given the rapid and widespread policymaking in this space. This tracker offers brief commentary on the wider AI context in specific jurisdictions, and lists index rankings provided by Tortoise Media, the first index to benchmark nations on their levels of investment, innovation and implementation of AI.
“AI is real”
JPMorgan CEO Jamie Dimon says artificial intelligence will be part of “every single process,” adding it’s already “doing all the equity hedging for us” https://t.co/EtsTbiME1a pic.twitter.com/J9YD4slOpv
Part 1: October 16 | 3:00–4:30 p.m. ET
Part 2: October 19 | 3:00–4:30 p.m. ET
Part 3: October 26 | 3:00–4:30 p.m. ET
Part 4: October 30 | 3:00–4:30 p.m. ET
Welcome to The Future of Education with Michael B. Horn. In this insightful episode, Michael gains perspective on mapping AI’s role in education from Jacob Klein, a Product Consultant at Oko Labs, and Laurence Holt, an Entrepreneur In Residence at the XQ Institute. Together, they peer into the burgeoning world of AI in education, analyzing its potential, risks, and roadmap for integrating it seamlessly into learning environments.
Student Use Cases for AI: Start by Sharing These Guidelines with Your Class — from hbsp.harvard.edu by Ethan Mollick and Lilach Mollick
To help you explore some of the ways students can use this disruptive new technology to improve their learning—while making your job easier and more effective—we’ve written a series of articles that examine the following student use cases:
Earlier this week, CETL and AIG hosted a discussion among UM faculty and other instructors about teaching and AI this fall semester. We wanted to know what was working when it came to policies and assignments that responded to generative AI technologies like ChatGPT, Google Bard, Midjourney, DALL-E, and more. We were also interested in hearing what wasn’t working, as well as questions and concerns that the university community had about teaching and AI.
Then, in class he put them into groups where they worked together to generate a 500-word essay on “Why I Write” entirely through ChatGPT. Each group had complete freedom in how they chose to use the tool. The key: They were asked to evaluate their essay on how well it offered a personal perspective and demonstrated a critical reading of the piece. Weiss also graded each ChatGPT-written essay and included an explanation of why he came up with that particular grade.
After that, the students were asked to record their observations on the experiment on the discussion board. Then they came together again as a class to discuss the experiment.
Weiss shared some of his students’ comments with me (with their approval). Here are a few:
Asked to describe the state of generative AI that they would like to see in higher education 10 years from now, panelists collaboratively constructed their preferred future.
Julie York, a computer science and media teacher at South Portland High School in Maine, was scouring the internet for discussion tools for her class when she found TeachFX. An AI tool that takes recorded audio from a classroom and turns it into data about who talked and for how long, it seemed like a cool way for York to discuss issues of data privacy, consent and bias with her students. But York soon realized that TeachFX was meant for much more.
York found that TeachFX listened to her very carefully, and generated a detailed feedback report on her specific teaching style. York was hooked, in part because she says her school administration simply doesn’t have the time to observe teachers while tending to several other pressing concerns.
“I rarely ever get feedback on my teaching style. This was giving me 100 percent quantifiable data on how many questions I asked and how often I asked them in a 90-minute class,” York says. “It’s not a rubric. It’s a reflection.”
TeachFX is easy to use, York says. It’s as simple as switching on a recording device.
…
But TeachFX, she adds, is focused not on her students’ achievements, but instead on her performance as a teacher.
ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds — from the74million.org by Mark Keierleber
While educators worry that students are using generative AI to cheat, a new report finds students are turning to the tool more for personal problems.
Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.
Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat.
The prospect of AI-powered, tailored, on-demand learning and performance support is exhilarating: It starts with traditional digital learning made into fully adaptive learning experiences, which would adjust to strengths and weaknesses for each individual learner. The possibilities extend all the way through to simulations and augmented reality, an environment to put into practice knowledge and skills, whether as individuals or working in a team simulation. The possibilities are immense.
Thanks to generative AI, such visions are transitioning from fiction to reality.
Video: Unleashing the Power of AI in L&D — from drphilippahardman.substack.com by Dr. Philippa Hardman
An exclusive video walkthrough of my keynote at Sweden’s national L&D conference this week
Highlights
The wicked problem of L&D: last year, $371 billion was spent on workplace training globally, but only 12% of employees apply what they learn in the workplace
An innovative approach to L&D: when Mastery Learning is used to design & deliver workplace training, the rate of “transfer” (i.e. behaviour change & application) is 67%
AI 101: quick summary of classification, generative and interactive AI and its uses in L&D
The impact of AI: my initial research shows that AI has the potential to scale Mastery Learning and, in the process:
reduce the “time to training design” by 94% > faster
reduce the cost of training design by 92% > cheaper
increase the quality of learning design & delivery by 96% > better
Research also shows that the vast majority of workplaces are using AI only to “oil the machine” rather than innovate and improve our processes & practices
Practical tips: how to get started on your AI journey in your company, and a glimpse of what L&D roles might look like in a post-AI world
Edsoma is an app that uses an AI reading assistant to help people learn or improve their reading and communication.
…
For now, the company is targeting users in kindergarten through fourth grade based on the content that it has today. Wallgren noted that Edsoma’s technology will work right through into university, and he has ambitions for it to become the No. 1 literacy resource in the United States.
Outschool, the online learning platform that offers kid-friendly academic and interest-based classes, announced today the launch of its AI Teaching Assistant, a tool for tutors to generate progress reports for their students. The platform — mainly popular for its small group class offerings — also revealed that it’s venturing into one-on-one tutoring, putting it in direct competition with companies like Varsity Tutors, Tutor.com and Preply.
But this fall, a few educators are quietly charting a different course they believe could change everything: At least two groups are pushing to create new AI chatbots that would offer teachers unlimited access to sometimes confusing and often paywalled peer-reviewed research on the topics that most bedevil them.
Their aspiration is to offer new tools that are more focused and helpful than wide-ranging ones like ChatGPT, which tends to stumble over research questions with competing findings. Like many juveniles faced with questions they can’t answer, it has a frustrating tendency to make things up.
Then, I turned Custom Instructions on and gave it the same starting prompt. Here are a few of the responses (emphases are mine):
“Purpose: What’s the primary goal of the course? Is it to augment your existing offerings at Every, serve as a standalone product, or something else?”
“Content Depth: Given your love for detailed, intellectual writing, how in-depth will you go on each topic?”
“Guest Contributions: Do you plan to involve guest experts, perhaps even some of your writing heroes or team members like Kate Lee and Evan Armstrong?”
“Time Investment: Given that you’re already grappling with being too opportunistic and dutiful, how will you allocate time for this without sidelining your creative work or burdening yourself?”
AI for comics — from aiandacademia.substack.com by Bryan Alexander
Exploring an entertaining new Hugging Face application
As generative AI grows, more uses appear. Beyond text and images, other domains for creation pop up as companies, groups, and individuals try out new functions.
Today’s case in point is AI Comic Factory from Hugging Face.* This service will generate a couple of comic book pages based on your text input. It gives you choices of comic style and page layout as well.
Items from Tom Barrett’s Promptcraft: AI for a better learning ecosystem
How many languages do you speak? Thanks to AI, that number could be as many as seven. Los Angeles-based AI video platform HeyGen has launched a new tool that clones your voice from a video and translates what you’re saying into seven different languages. If that wasn’t enough, it also syncs your lips to your new voice so the final clip looks (and sounds) as realistic as possible.
Microsoft and Project Gutenberg have used AI technologies to create more than 5,000 free audiobooks with high-quality synthetic voices.
For the project, the researchers combined advances in machine learning, automatic text selection (which texts are read aloud, which are not), and natural-sounding speech synthesis systems.
What if you could have a conversation with your notes? That question has consumed a corner of the internet recently, as companies like Dropbox, Box, Notion, and others have built generative AI tools that let you interact with and create new things from the data you already have in their systems.
Google’s version of this is called NotebookLM. It’s an AI-powered research tool that is meant to help you organize and interact with your own notes.
That got me to thinking…
What if the presenter/teacher/professor/trainer/preacher provided a set of notes for the AI to compare to the readers’ notes?
That way, the AI could see the discrepancies between what the presenter wanted their audience to learn/hear and what was actually being learned/heard. In a sort of digital Socratic Method, the AI could then generate some leading questions to get the audience member to check their thinking/understanding of the topic.
The end result would be that the main points were properly communicated/learned/received.
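Purely as a thought experiment, the comparison step in that idea could start as simply as diffing the presenter’s key points against a learner’s notes and turning each missed point into a leading question. The sketch below is entirely hypothetical (no real product works this way); a real system would use an LLM for semantic matching rather than literal string matching.

```python
# Hypothetical sketch of the idea above: compare a presenter's key
# points against a learner's notes, then turn each missed point into
# a leading, Socratic-style question. Plain substring matching stands
# in for what would really be an LLM comparison.

def missed_points(presenter_points, learner_notes):
    """Key points the learner's notes never mention."""
    notes = learner_notes.lower()
    return [p for p in presenter_points if p.lower() not in notes]

def leading_questions(points):
    """One nudge question per missed point."""
    return [f"The presenter emphasized '{p}'. What did you take away about it?"
            for p in points]

presenter_points = ["retrieval practice", "spaced repetition", "feedback loops"]
learner_notes = "We talked about spaced repetition and quizzing ourselves often."

for q in leading_questions(missed_points(presenter_points, learner_notes)):
    print(q)
```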
… Right now, it’s really just a prototype, but a small team inside the company has been trying to figure out what an AI notebook might look like.
The single narrative education system is no longer working.
Its main limitation is its inability to honor young people as the dynamic individuals that they are.
New models of teaching and learning need to be designed to center on the student, not the teacher.
When the opportunity arises to implement learning that uses immersive technology, ask yourself if the learning you are designing passes the Ready Player One Test:
Does it allow learners to immerse themselves in environments that would be too expensive or dangerous to experience otherwise?
Can the learning be personalized by the student?
Is it regenerative?
Does it allow for learning to happen non-linearly, at any time and place?
So, as educators, mentors, and guides to our future generations, we must ask ourselves three pivotal questions:
What value do we offer to our students?
What value will they need to offer to the world?
How are we preparing them to offer that value?
The answers to these questions are crucial, and they will redefine the trajectory of our education system.
We need to create an environment that encourages curiosity, embraces failure as a learning opportunity, and celebrates diversity. We need to teach our students how to learn, how to ask the right questions, and how to think for themselves.
Leveraging ChatGPT for learning is the most meaningful skill this year for lifelong learners. But it’s too hard to find resources to master it.
As a learning science nerd, I’ve explored hundreds of prompts over the past months. Most of the advice doesn’t go beyond text summaries and multiple-choice testing.
That’s why I’ve created this article — it merges learning science with prompt writing to help you learn anything faster.
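One pattern from the learning-science literature that goes beyond summaries and multiple choice is asking the model to run retrieval practice: quiz you one question at a time and withhold answers until you respond. The template below is my own illustration, not taken from the article.

```python
# Illustrative prompt template for retrieval practice with an LLM.
# The wording is an example only; adapt topic and question count.

RETRIEVAL_PROMPT = """You are my study coach for the topic: {topic}.
Ask me {n} open-ended questions, one at a time.
After each of my answers, give brief corrective feedback,
then ask the next question. Do not reveal answers up front."""

print(RETRIEVAL_PROMPT.format(topic="photosynthesis", n=5))
```

The key design choice is forcing effortful recall before feedback, which is what makes retrieval practice effective.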
Midjourney AI Art for Teachers (for any kind of teacher, not just Art Teachers) — from The AI Educator on YouTube by Dan Fitzpatrick
From DSC: This is a very nice, clearly illustrated, free video to get started with the Midjourney (text-to-image) app. Nice work Dan!
In the new-normal of generative AI, how does one articulate the value of academic integrity? This blog presents my current response in about 2,500 words; a complete answer could fill a sizable book.
Massive amounts of misinformation are disseminated about generative AI, so the first part of my discussion clarifies what large language models (ChatGPT and its counterparts) can currently do and what they cannot accomplish at this point in time. The second part describes ways in which generative AI can be misused as a means of learning; unfortunately, many people are now advocating for these mistaken applications to education. The third part describes ways in which large language models (LLMs), used well, may substantially improve learning and education. I close with a plea for a robust, informed public discussion about these topics and issues.
Many of the more than a dozen teachers TIME interviewed for this story argue that the way to get kids to care is to proactively use ChatGPT in the classroom.
…
Some of those creative ideas are already in effect at Peninsula High School in Gig Harbor, about an hour from Seattle. In Erin Rossing’s precalculus class, a student got ChatGPT to generate a rap about vectors and trigonometry in the style of Kanye West, while geometry students used the program to write mathematical proofs in the style of raps, which they performed in a classroom competition. In Kara Beloate’s English-Language Arts class, she allowed students reading Shakespeare’s Othello to use ChatGPT to translate lines into modern English to help them understand the text, so that they could spend class time discussing the plot and themes.
I found that other developed countries share concerns about students cheating but are moving quickly to use AI to personalize education, enhance language lessons and help teachers with mundane tasks, such as grading. Some of these countries are in the early stages of training teachers to use AI and developing curriculum standards for what students should know and be able to do with the technology.
Several countries began positioning themselves several years ago to invest in AI in education in order to compete in the fourth industrial revolution.
AI in Education — from educationnext.org by John Bailey
The leap into a new era of machine intelligence carries risks and challenges, but also plenty of promise
In the realm of education, this technology will influence how students learn, how teachers work, and ultimately how we structure our education system. Some educators and leaders look forward to these changes with great enthusiasm. Sal Khan, founder of Khan Academy, went so far as to say in a TED talk that AI has the potential to effect “probably the biggest positive transformation that education has ever seen.” But others warn that AI will enable the spread of misinformation, facilitate cheating in school and college, kill whatever vestiges of individual privacy remain, and cause massive job loss. The challenge is to harness the positive potential while avoiding or mitigating the harm.
Generative AI and education futures — from ucl.ac.uk Video highlights from Professor Mike Sharples’ keynote address at the 2023 UCL Education Conference, which explored opportunities to prosper with AI as a part of education.
Bringing AI Literacy to High Schools — by Nikki Goth Itoi
Stanford education researchers collaborated with teachers to develop classroom-ready AI resources for high school instructors across subject areas.
To address these two imperatives, all high schools need access to basic AI tools and training. Yet the reality is that many underserved schools in low-income areas lack the bandwidth, skills, and confidence to guide their students through an AI-powered world. And if the pattern continues, AI will only worsen existing inequities. With this concern top of mind plus initial funding from the McCoy Ethics Center, Lee began recruiting some graduate students and high school teachers to explore how to give more people equal footing in the AI space.
Inspired by my recent Review: Shure MV7 dynamic hybrid studio microphone – near, far and beyond, Beaker Films of Fairfield, Connecticut, US has developed and deployed a first batch of 10 kits to capture remote conversations from different locations worldwide. Beaker Films is frequently contracted to record remote interviews or testimonials from medical professionals. For this project, Beaker Films’ clients wanted consistent, high quality audio and video, but with 3 additional challenges: they preferred to have no visible microphone in the shot, they needed a teleprompter function and the whole kit needed to be as simple as possible for non-technical guests.
West Suffolk College in the UK has opened its Extended Reality Lab (XR Lab). The facilities comprise four distinct areas: an Immersion Lab, a Collaboration Theatre, a Green Room, and a Conference Room. The project was designed by architects WindsorPatania for Eastern Colleges Group.
Systems integrator CJP Broadcast Service Solutions, has won a tender to build a virtual production environment for Solent University in the UK.
The new facilities, converted from an existing studio space, will provide students on the film production courses with outstanding opportunities to develop their creative output.
I get weary of AI hype auto-generated by *gasp* ChatGPT without any real applicability to the classroom. I’m collecting blogs of teachers who are in the classroom actually using AI and have practical, real examples to share. Got links for those teachers? Enough of the junk.
COLORADO SPRINGS, Colo., July 26, 2023 /PRNewswire/ — A new survey of teens conducted for Junior Achievement by the research firm Big Village shows that nearly half of teens (44%) are “likely” to use AI to do their schoolwork instead of doing it themselves this coming school year. However, most teens (60%) consider using AI in this way as “cheating.” The survey of 1,006 13- to 17-year-olds was conducted by Big Village from July 6 through 11, 2023.
From DSC: In a competitive society as we have in the U.S. and when many of our K-12 learning ecosystems are designed to create game players, we shouldn’t be surprised to see a significant amount of our students using AI to “win”/game the system.
As it becomes appropriate for each student, offering more choice and control should help to allow more students to pursue what they want to learn about. They won’t be as interested in gaming the system if they truly want to learn about something.
“Some of the stuff we’re doing is creating templates and workflows that capture multiple feeds: not just the teacher, [but also] the white board, an overhead camera,” Risby says.
“The student can then go in and pick what they look at, so it’s more interactive. You might be watching it the first time to listen to the lecturer, but you might watch the second time to concentrate on the experiment. It makes the stream more valuable.”
The implications of this development are perhaps more significant than we realise. There has been much discussion in recent months about the risks associated with the rise of generative AI for higher education, with most of the discussion centring around the challenge that ChatGPT poses to academic integrity.
However, much less work has been done on exploring the negative – even existential – consequences that might stem from not embracing AI in higher education. Are these new principles enough to reverse the risk of irrelevance?
What if we reimagine “learning” in higher education as something more than the recall and restructuring of existing information? What if instead of lectures, essays and exams we shifted to a model of problem sets, projects and portfolios?
I am often asked what this could look like in practice. If we turn to tried and tested instructional strategies which optimise for learner motivation and mastery, it would look something like this…
Also relevant/see:
Do or Die? — from drphilippahardman.substack.com by Dr. Philippa Hardman
The invisible cost of resisting AI in higher education
Excerpt:
Embracing AI in the higher education sector prepares students for the increasingly technology-driven job market and promotes more active, participatory learning experiences which we know lead to better outcomes for both students and employers.
With the rising popularity of alternative education routes such as bootcamps and apprenticeships, it’s crucial for traditional higher education to engage positively with AI in order to maintain its competitiveness and relevance.
DC: Sounds very useful for learning-related items.
“Custom instructions allow you to add preferences or requirements that you’d like ChatGPT to consider when generating its responses.” https://t.co/n0WOJnmDIY
— Daniel Christian (he/him/his) (@dchristian5) July 21, 2023
For example, a teacher crafting a lesson plan no longer has to repeat that they’re teaching 3rd grade science. A developer preferring efficient code in a language that’s not Python – they can say it once, and it’s understood. Grocery shopping for a big family becomes easier, with the model accounting for 6 servings in the grocery list.
Teaching technology
There is also the misconception around the word ‘generative’, the assumption that all it does is create blocks of predictable text. Wrong. Many of its best uses in learning are its ability to summarise, outline, provide guidance, support and many other pedagogic features that can be built into the software. This works and will mean tutors, teachers, teaching support, note-taking support, coaches and many other services will emerge that aid both teaching and learning. They are being developed in their hundreds as we speak.
This simple fact, that this is the first technology to ‘learn’ and learn fast, at scale, continuously, across a range of media and tasks, is what makes it extraordinary.
On holding back the strange AI tide — from oneusefulthing.org by Ethan Mollick
There is no way to stop the disruption. We need to channel it instead
And empowering workers is not going to be possible with a top-down solution alone. Instead, consider:
Radical incentives to ensure that workers are willing to share what they learn. If they are worried about being punished, they won’t share. If they are worried they won’t be rewarded, they won’t share. If they are worried that the AI tools that they develop might replace them, or their coworkers, they won’t share. Corporate leaders need to figure out a way to reassure and reward workers, something they are not used to doing.
Empowering user-to-user innovation. Build prompt libraries that help workers develop and share prompts with other people inside the organization. Open up tools broadly to workers to use (while still setting policies around proprietary information), and see what they come up with. Create slack time for workers to develop, and discuss, AI approaches.
Don’t rely on outside providers or your existing R&D groups to tell you the answer. We are in the very early days of a new technology. Nobody really knows anything about the best ways to use AI, and they certainly don’t know the best ways to use it in your company. Only by diving in, responsibly, can you hope to figure out the best use cases.
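A “prompt library,” as suggested in the list above, can start out as nothing fancier than shared, parameterized templates that coworkers reuse and extend. The names and templates below are invented for illustration, not from Mollick’s article.

```python
# Minimal sketch of an internal prompt library: named templates with
# placeholders that anyone in the organization can fill in and reuse.
# All names and wording here are made up for this example.

PROMPT_LIBRARY = {
    "summarize_meeting": (
        "Summarize the following meeting notes in {n_bullets} bullet "
        "points for an audience of {audience}:\n\n{notes}"
    ),
    "draft_reply": (
        "Draft a {tone} reply to this customer message:\n\n{message}"
    ),
}

def render(name, **params):
    """Fill a shared template; raises KeyError if a parameter is missing."""
    return PROMPT_LIBRARY[name].format(**params)

prompt = render("summarize_meeting", n_bullets=3,
                audience="new hires", notes="...")
print(prompt)
```

Even this much gives workers a place to deposit what they learn, which is the sharing behavior the incentives above are meant to encourage.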
In recent years, I have witnessed the transformative power of technology in higher education. One particular innovation that has captured my attention is Artificial Intelligence (AI). AI holds tremendous potential as an assistive technology for students with reasonable adjustments in further education (FE) and higher education (HE).
In this comprehensive blog post, I will delve into the multifaceted aspects of AI as an assistive technology, exploring its benefits, considerations, challenges, and the future it holds for transforming higher education.
The integration of AI as an assistive technology can create an inclusive educational environment where all students, regardless of disabilities or specific learning needs, have equal access to educational resources. Real-time transcription services, text-to-speech capabilities, and personalized learning experiences empower students like me to engage with course content in various formats and at our own pace (Fenews, 2023). This not only removes barriers but also fosters a more inclusive and diverse academic community.
What can we do? Here are five considerations I’ll be following this coming fall in response to that nagging “less discussion, more instruction” evaluation.
Courseware Can Be Integral to a Course. Why, Then, Are Students Footing the Bill for It? — from chronicle.com by Taylor Swaak
The Homework Tax | For students already struggling to afford college, courseware can add to the burden
Their argument is multifold: For one, they say, products like these — which often deliver key elements of a course that an instructor would typically be responsible for, like homework, assessments, and grading — should not be the student’s burden. At least one student advocate said colleges, rather, should cover or subsidize the cost, as they do with software like learning-management systems, if they’re allowing faculty free rein to adopt the products.
…
And the fact that students’ access to these products expires — sometimes after just a semester — rubs salt in the wound, and risks further disadvantaging students.
Institutions aren’t “letting the wolf into the henhouse”; instead, “we’re letting the hens out into a forest of wolves,” said Billy Meinke, an open educational resources technologist with the Outreach College at the University of Hawaii-Manoa who’s done research on publisher misuse of student data.
Here are five summer reading challenges to learn about the science of learning.
Important: make sure you remember what you learn! Engage yourself in retrieval practice and retrieve two things after each book, practice guide, and research article you read. Share your two things with our communities on Twitter and Facebook, make a list of what you’ve learned to boost your long-term learning,…
Last week, I explored some ways an instructor might want to (or need to) redesign a reading response assignment for the fall, given the many AI text generation tools now available to students. This week, I want to continue that thread with another assignment makeover. Reading response assignments were just the warm up; now we’re tackling the essay assignment.
First, they need to understand that the technological side of AI can no longer be simply left to the information technology experts. Regardless of the professional domain, understanding what AI is, how it works, how the underlying code and algorithms are designed, and what assumptions lie behind the computer code are important components to being able to use and consume the products of AI tools appropriately.