This moment requires agility — and L&D can lead the way.
For individuals, agility fuels career growth and relevance. For organizations, agility equals the ability to survive and thrive even amid economic headwinds.
And what is agility if not constant learning? To build a resilient and adaptable future, learning leaders are working across HR to ensure everyone has the tools, the career paths, and the skills to succeed.
Read on for insights and ideas to power your own path.
— Daniel Christian (he/him/his) (@dchristian5) June 23, 2023
On giving AI eyes and ears — from oneusefulthing.org by Ethan Mollick AI can listen and see, with bigger implications than we might realize.
Excerpt:
But even this is just the beginning, and new modes of using AI are appearing, which further increases their capabilities. I want to show you some examples of this emerging world, which I think will soon introduce a new wave of AI use cases, and accompanying disruption.
We need to recognize that these capabilities will continue to grow, and AI will be able to play a more active role in the real world by observing and listening. The implications are likely to be profound, and we should start thinking through both the huge benefits and major concerns today.
Even though generative AI is a new thing, it doesn’t change why students cheat. They’ve always cheated for the same reason: They don’t find the work meaningful, and they don’t think they can achieve it to their satisfaction. So we need to design assessments that students find meaning in.
Tricia Bertram Gallant
Caught off guard by AI — from chronicle.com by Beth McMurtrie and Beckie Supiano Professors scrambled to react to ChatGPT this spring — and started planning for the fall
Excerpt:
Is it cheating to use AI to brainstorm, or should that distinction be reserved for writing that you pretend is yours? Should AI be banned from the classroom, or is that irresponsible, given how quickly it is seeping into everyday life? Should a student caught cheating with AI be punished because they passed work off as their own, or given a second chance, especially if different professors have different rules and students aren’t always sure what use is appropriate?
…OpenAI built tool use right into the GPT API with an update called function calling. It’s a little like a child’s ability to ask their parents to help them with a task that they know they can’t do on their own. Except in this case, instead of parents, GPT can call out to external code, databases, or other APIs when it needs to.
Each function in function calling represents a tool that a GPT model can use when necessary, and GPT gets to decide which ones it wants to use and when. This instantly upgrades GPT capabilities — not because it can now do every task perfectly, but because it now knows how to ask for what it wants and get it.
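To make that concrete, here is a minimal sketch of one function-calling round trip, written against the OpenAI Python library as it worked when function calling launched in June 2023 (openai < 1.0). The weather lookup is a made-up stand-in for any external tool, and the model name is an assumption; newer SDK versions have since moved to a "tools" parameter, so treat this as illustrative rather than reference code.

```python
import json
import os

import openai  # openai < 1.0 style API, as it was in mid-2023

openai.api_key = os.environ["OPENAI_API_KEY"]

# A stand-in for any external tool: a database query, calendar lookup, LMS call, etc.
def get_current_weather(location, unit="fahrenheit"):
    # A real implementation would call a weather API; this stub just echoes values.
    return json.dumps({"location": location, "temperature": "72", "unit": unit})

# Describe the tool so the model knows what it can ask for.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state, e.g. Boston, MA"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

messages = [{"role": "user", "content": "What's the weather like in Boston right now?"}]

# First call: the model decides on its own whether it needs the tool.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",  # assumed; any function-calling-capable model works
    messages=messages,
    functions=functions,
    function_call="auto",
)
message = response["choices"][0]["message"]

# If the model "asked for help," run the function and hand the result back to it.
if message.get("function_call"):
    args = json.loads(message["function_call"]["arguments"])
    tool_result = get_current_weather(**args)
    messages.append(message)
    messages.append({"role": "function", "name": "get_current_weather", "content": tool_result})
    followup = openai.ChatCompletion.create(model="gpt-3.5-turbo-0613", messages=messages)
    print(followup["choices"][0]["message"]["content"])
```

Note that the model never runs the function itself: it only asks for it, and the surrounding code decides whether and how to comply, which is exactly the parent-and-child dynamic described above.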
How ChatGPT can help disrupt assessment overload — from timeshighereducation.com by David Carless Advances in AI are not necessarily the enemy – in fact, they should prompt long overdue consideration of assessment types and frequency, says David Carless
Excerpt:
Reducing the assessment burden could support trust in students as individuals wanting to produce worthwhile, original work. Indeed, students can be co-opted as partners in designing their own assessment tasks, so they can produce something meaningful to them.
A strategic reduction in quantity of assessment would also facilitate a refocusing of assessment priorities on deep understanding more than just performance and carries potential to enhance feedback processes.
If we were to tackle assessment overload in these ways, it opens up various possibilities. Most significantly there is potential to revitalise feedback so that it becomes a core part of a learning cycle rather than an adjunct at its end. End-of-semester, product-oriented feedback, which comes after grades have already been awarded, fails to encourage the iterative loops and spirals typical of productive learning.
Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent AI tools in education and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.
These claims conjure up the rosiest of images: human resource departments and their robot buddies solving discrimination in workplace hiring. It seems plausible, in theory, that AI could root out unconscious bias, but a growing body of research shows the opposite may be more likely.
…
Companies’ use of AI didn’t come out of nowhere: For example, automated applicant tracking systems have been used in hiring for decades. That means if you’ve applied for a job, your resume and cover letter were likely scanned by an automated system. You probably heard from a chatbot at some point in the process. Your interview might have been automatically scheduled and later even assessed by AI.
From DSC:
Here was my reflection on this:
DC: Along these lines, I wonder if Applicant Tracking Systems cause us to become like typecast actors and actresses — only thought of for certain roles. Pigeonholed.
— Daniel Christian (he/him/his) (@dchristian5) June 23, 2023
In June, ResumeBuilder.com surveyed more than 1,000 employees who are involved in hiring processes at their workplaces to find out about their companies’ use of AI interviews.
The results:
43% of companies already have or plan to adopt AI interviews by 2024
Two-thirds of this group believe AI interviews will increase hiring efficiency
15% say that AI will be used to make decisions on candidates without any human input
More than half believe AI will eventually replace human hiring managers
Watch OpenAI CEO Sam Altman on the Future of AI — from bloomberg.com Sam Altman, CEO & Co-Founder of OpenAI, discusses the explosive rise of OpenAI and its products and what an AI-laced future can look like with Bloomberg’s Emily Chang at the Bloomberg Technology Summit.
The implementation of generative AI within these products will dramatically improve educators’ ability to deliver personalized learning to students at scale by enabling the application of personalized assessments and learning pathways based on individual student needs and learning goals. K-12 educators will also benefit from access to OpenAI technology…
Schools and organizations are human systems, filled with opportunities.
And yet there is a profound difference between building from a foundation of schooling and building from a foundation of learning.
Rather than simply replicate and scale the arrangements of schooling, we must seize the possibility to shift from a schooling system to a learning ecosystem to truly empower learners, educators, and parents to create a virtuous future for themselves and their communities.
Louka Parry
Our challenge will be how we choose to redesign and remake our experiences, environments, and ecosystems so that we unlock our true creative potential and thrive in the future.
After chronicling 160+ AI tools (which is surely only a small fraction of the total), we’re seeing a few clear patterns among the tools that have come out so far — here are 10 categories that are jumping out!
“I don’t usually get worked up about announcements but I see promise in JFF’s plans for a new Center for Artificial Intelligence & the Future of Work, in no small part because the organization bridges higher ed, K-12 education, employers, and policymakers.”
BOSTON, June 14, 2023 — Jobs for the Future (JFF), a national nonprofit that drives transformation in the U.S. education and workforce systems, today announced the launch of its new Center for Artificial Intelligence & the Future of Work. This center will play an integral role in JFF’s mission and newly announced 10-year North Star goal to help 75 million people facing systemic barriers to advancement work in quality jobs. As AI’s explosive growth reshapes every aspect of how we learn, work, and live, this new center will serve as a nexus of collaboration among stakeholders from every part of the education-to-career ecosystem to explore the most promising opportunities—and profound challenges—of AI’s potential to advance an accessible and equitable future of learning and work.
OpenAI Considers ‘App Store’ For ChatGPT — from searchenginejournal.com; with thanks to Barsee at AI Valley for this resource OpenAI explores launching an ‘app store’ for AI models, potentially challenging current partners and expanding customer reach.
Highlights:
OpenAI considers launching an ‘app store’ for customized AI chatbots.
This move could create competition with current partners and extend OpenAI’s customer reach.
Early interest from companies like Aquant and Khan Academy shows potential, but product development and market positioning challenges remain.
The rise of artificial intelligence, especially generative AI, boosts productivity in content creation — text, code, images, and increasingly video.
Here are six preliminary conclusions about the nature of work and learning.
Wonder Tools: AI to try — from wondertools.substack.com by Jeremy Caplan 9 playful little ways to explore AI
Excerpt:
Create a personalized children’s story | Schrodi. Collaborate with AI on a free customized, illustrated story for someone special. Give your story’s hero a name, pick a genre (e.g. comedy, thriller), choose an illustration style (e.g. watercolor, 3D animation), and provide a prompt to shape a simple story. You can even suggest a moral. After a minute, download a full-color PDF to share. Or print it and read your new mini picture book aloud.
Generate a quiz | Piggy. Put in a link, a topic, or some text and you’ll get a quiz you can share, featuring multiple-choice or true-false questions. Example: try this quick entrepreneurship quiz Piggy generated for me.
Q: How will generative AI impact teaching and learning in the near and long term?
Baker Stein: One-on-one tutoring at scale is finally being unlocked for learners around the world. This type of quality education is no longer only available to students with the means to hire a private tutor. I’m also particularly excited to see how educators make use of generative AI tools to create courses much faster and likely at a higher quality with increased personalization for each student or even by experimenting with new technologies like extended reality. Professors will be able to put their time toward high-impact activities like mentoring, researching and office hours instead of tedious course-creation tasks. This helps open up the capacity for educators to iterate on their courses faster to keep pace with industry and global changes that may impact their field of study.
Another important use case is how generative AI can serve as a great equalizer for students when it comes to writing, especially second language learners.
Why it matters: The best AI assistants will be the ones that require the least prompting. They’ll get to know who you are, what you need, and your modus operandi. Profiles are a good starting point, but we believe the game-changer will be larger context windows (that’s nerd-speak for the amount of context ChatGPT can handle).
From DSC: And how about taking this a step further and remembering — or being able to access — our constantly updated Cloud-Based Learning Profiles?
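As a rough illustration of what "amount of context" means in practice, the sketch below counts tokens with the tiktoken library to check whether a hypothetical learner profile plus a prompt fits inside a model's context window. The profile text, the model name, and the 4,096-token limit (roughly the base limit for gpt-3.5-turbo in mid-2023) are all assumptions for the example.

```python
# Counting tokens to see how much of a context window a learner profile consumes.
# Requires the tiktoken package (pip install tiktoken).
import tiktoken

MODEL = "gpt-3.5-turbo"   # assumed model
CONTEXT_WINDOW = 4096     # assumed limit; larger-window models raise this number

def token_count(text: str, model: str = MODEL) -> int:
    """Return how many tokens the given model would use to encode this text."""
    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(text))

# Hypothetical, constantly updated learner profile, flattened to plain text.
learner_profile = (
    "Goals: data analysis, Spanish B2. "
    "Completed: Intro to Statistics, Excel Fundamentals. "
    "Prefers: short videos, practice quizzes, evening study sessions."
)
prompt = "Suggest the next three learning activities for this learner and explain why."

used = token_count(learner_profile) + token_count(prompt)
print(f"{used} of {CONTEXT_WINDOW} tokens used; {CONTEXT_WINDOW - used} left for the reply")
```

If the profile alone fills most of the window, there is little room left for the model's answer, which is why larger context windows (or retrieving only the relevant slice of a profile) matter so much for the kind of cloud-based learning profiles imagined here.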
My hypothesis and research suggest that as bar associations and the ABA begin to recognize the on-going systemic issues of high-cost legal education, growing legal deserts (where no lawyer serves a given population), on-going and pervasive access to justice issues, and a public that is already weary of the legal system – alternative options that are already in play might become more supported.
What might that look like?
The combination of AI-assisted education with traditional legal apprenticeships has the potential to create a rich, flexible, and engaging learning environment. Here are three scenarios that might illustrate what such a combination could look like:
Scenario One – Personalized Curriculum Development
Scenario Two – On-Demand Tutoring and Mentoring
Scenario Three – AI-assisted Peer Networks and Collaborative Learning
We know that there are challenges – a threat to human jobs, the potential implications for cyber security and data theft, or perhaps even an existential threat to humanity as a whole. But we certainly don’t yet have a full understanding of all of the implications. In fact, a World Economic Forum report recently stated that organizations “may currently underappreciate AI-related risks,” with just four percent of leaders considering the risk level to be “significant.”
A survey carried out by analysts Baker McKenzie concluded that many C-level leaders are over-confident in their assessments of organizational preparedness in relation to AI. In particular, it exposed concerns about the potential implications of biased data when used to make HR decisions.
AI & lawyer training: How law firms can embrace hybrid learning & development — from thomsonreuters.com A big part of law firms’ successful adaptation to the increased use of ChatGPT and other forms of generative AI may depend upon how firmly they embrace online learning & development tools designed for hybrid work environments
Excerpt:
As law firms move forward in their use of advanced artificial intelligence such as ChatGPT and other forms of generative AI, their success may hinge upon how they approach lawyer training and development and what tools they enlist for the process.
One of the tools that some law firms use to deliver a new, multi-modal learning environment is an online, video-based learning platform, Hotshot, that delivers more than 250 on-demand courses on corporate, litigation, and business skills.
Ian Nelson, co-founder of Hotshot, says he has seen a dramatic change in how law firms are approaching learning & development (L&D) in the decade or so that Hotshot has been active. He believes the biggest change is that 10 years ago, firms hadn’t yet embraced the need to focus on training and development.
From DSC: Heads up law schools. Are you seeing/hearing this!?
Are we moving more towards a lifelong learning model within law schools?
If not, shouldn’t we be doing that?
Are LL.M. programs expanding quickly enough? Is more needed?
Teaching with music can enhance learning in almost any subject area, says Sherena Small, a school social worker at Champaign Unit 4 School District in Illinois.
“It’s just such a good way to enhance what kids are learning,” says Small, who uses hip-hop and other music to teach social-emotional learning skills, including empathy and active listening. Earlier this year, Nearpod recognized Small as an Educator of the Year for her innovative efforts using Nearpod’s Flocabulary tool to incorporate music into class.
Speaking of multimedia, also see:
And here’s another interesting item from Dr. Burns:
Since 2012, 65 private colleges and universities with enrollment of 500 students or more, that I know of, have reduced their tuition, and commensurately reduced their discount rate. Several more schools are planning price resets for fall 2024. Schools use this strategy to increase the number of students who will consider them, and this approach has been successful for more than 80 percent of the schools which have reduced their published price.
From DSC: What I learned in my college economics courses agrees with this last bit: as the price goes down, demand goes up; and conversely, as the price goes up, demand goes down. As Lucie points out, many people don’t know about the heavily discounted prices within higher education. I’ve been fighting for price decreases for over 15 years… clearly, I haven’t had much success in that area.
AI-assisted cheating isn’t a temptation if students have a reason to care about their own learning.
Yesterday I happened to listen to two different podcasts that ended up resonating with one another and with an idea that’s been rattling around inside my head with all of this moral uproar about generative AI:
** If we trust students – and earn their trust in return – then they will be far less motivated to cheat with AI or in any other way. **
First, the question of motivation. On the Intentional Teaching podcast, while interviewing James Lang and Michelle Miller on the impact of generative AI, Derek Bruff points out (drawing on Lang’s Cheating Lessons book) that if students have “real motivation to get some meaning out of [an] activity, then there’s far less motivation to just have ChatGPT write it for them.” Real motivation and real meaning FOR THE STUDENT translates into an investment in doing the work themselves.
…
Then I hopped over to one of my favorite podcasts – Teaching in Higher Ed – where Bonni Stachowiak was interviewing Cate Denial about a “pedagogy of kindness,” which is predicated on trusting students and not seeing them as adversaries in the work we’re doing.
So the second key element: being kind and trusting students, which builds a culture of mutual respect and care that again diminishes the likelihood that they will cheat.
…
Again, human-centered learning design seems to address so many of the concerns and challenges of the current moment in higher ed. Maybe it’s time to actually practice it more consistently. #aiineducation #higheredteaching #inclusiveteaching
How liberal arts colleges can make career services a priority — from highereddive.com by John Boyer Creating internships and focusing on short-term experiences has a big impact, the longtime undergraduate dean at the University of Chicago says.
TI-ADDIE: A Trauma-Informed Model of Instructional Design — from er.educause.edu by Ali Carr-Chellman and Treavor Bogard Adjusting the ADDIE model of instructional design specifically to accommodate trauma offers an opportunity to address the collective challenges that designers, instructors, and learners have faced during the current learning moment.
From DSC: Dr. Nino makes several solid points in this article. The article won’t let me copy/paste some excerpts for you, but I would encourage you to look at it.
I would add a few things:
A huge advantage of online-based learning is that a significant amount of learning-related data is captured automatically and doesn’t need to be entered manually (and manually entered data often never gets entered at all).
Learners have much more control over the pacing within the digital realm — i.e., which media they want to use as well as stopping/fast-forwarding/rewinding certain kinds of media.
Most people are now required to be lifelong learners, so convenience and time savings become very important factors in continuing one’s education.
And finally, as AI and other technologies continue to make their way forward, it will be hard to beat online-based and/or hybrid-based learning.
Ernst and Young dug a little deeper. “Today’s disruptive working landscape requires organisations to largely restructure the way they are doing work,” they noted in a bulletin in March this year. “Time now spent on tasks will be equally divided between people and machines. For these reasons, workforce roles will change and so do the skills needed to perform them.”
The World Economic Forum has pointed to this global skills gap and estimates that, while 85 million jobs will be displaced, 50% of all employees will need reskilling and/or upskilling by 2025. This, it almost goes without saying, will require Learning and Development departments to do the heavy lifting, not only in this initial transformational phase but also in an ongoing capacity.
“And that’s the big problem,” says Hardman. “2025 is only two and half years away and the three pillars of L&D – knowledge transference, knowledge reinforcement and knowledge assessment – are crumbling. They have been unchanged for decades and are now, faced by revolutionary change, no longer fit for purpose.”
ChatGPT is the shakeup education needs — from eschoolnews.com by Joshua Sine As technology evolves, industries must evolve alongside it, and education is no exception — especially when students heavily and regularly rely on edtech
Key points:
Education must evolve along with technology–students will expect it
Embracing new technologies helps education leverage adaptive technology that engages student interest
With the pilot project, Wyoming has become the final state to allow competency-based learning in some form, marking a historic point in a growing, albeit slow, movement in favor of a model that emphasizes students’ achievement rather than the set 13-year academic schedule. That movement—long championed by many high-profile education leaders—has seen a handful of states embrace competency-based education faster than others and uneven progress within those states.
In Wyoming, it’s part of a three-pronged effort to move the state toward what State Superintendent of Public Instruction Megan Degenfelder calls “student-centered learning,” which is heavier on personalized and project-based learning that emphasizes the development of problem-solving skills and lets students test-drive different career pathways.
So you think you want to hire an instructional designer. Great choice. Instructional designers are eLearning industry superheroes. They create learning experiences and develop instructional materials to make learning accessible. Whether you’re creating training modules for your employees or building online courses for students, an instructional designer is an essential member of your eLearning team.
Addendum on 6/6/23, a somewhat relevant posting:
Professional Organizations for Instructional Designers — from christytuckerlearning.com by Christy Tucker What professional organizations are useful for instructional designers? The Learning Guild, ATD, TLDC, Training Magazine Network, and LDA.
But whereas Finland’s schools are still characterized by a culture of teaching, Oodi stands as a beacon of learning — self-organizing, emergent, and overflowing with the life force of its inhabitants.
From DSC: As the above got me to thinking about learning spaces, here’s another somewhat relevant item from Steelcase:
Addendum on 6/6/23:
Also relevant to the first item in this posting, see:
Looking for Miracles in the Wrong Places — from nataliewexler.substack.com by Natalie Wexler An “edutourist” in Finland finds the ideal school, but it isn’t a school at all.
Counterpoint/excerpt:
It sounds appealing, but any country following that route is not only likely to find itself at the bottom of the PISA heap. It’s also likely to do a profound disservice to many of its children, particularly those from less highly educated families, who depend on teachers to impart information they don’t already have and to systematically build their knowledge.
Of course it’s possible for explicit, teacher-directed instruction to be soul-crushing for students. But it certainly doesn’t have to be, and there’s no indication from Mr. X’s account that the students in the schools he visited felt their experience was oppressive. When teachers get good training—of the kind apparently provided in Finland—they know how to engage students in the content they’re teaching and guide them to think about it deeply and analytically.
That’s not oppressive. In fact, it’s the key to enabling students to reach their full potential. In that sense, it’s liberating.