The Three Wave Strategy of AI Implementation — from aiczar.blogspot.com by Alexander “Sasha” Sidorkin

The First Wave: Low-Hanging Fruit

These are just examples:

  • Student services
  • Resume and Cover Letter Review (Career Services): Offering individual resume critiques …
  • Academic Policy Development and Enforcement (Academic Affairs)…
  • Health Education and Outreach (Health and Wellness Services) …
  • Sustainability Education and Outreach (Sustainability and Environmental Initiatives) …
  • Digital Marketing and Social Media Management (University Communications and Marketing) …
  • Grant Proposal Development and Submission (Research and Innovation) …
  • Financial Aid Counseling (Financial Aid and Scholarships) …
  • Alumni Communications (Alumni Relations and Development) …
  • Scholarly Communications (Library Services) …
  • International Student and Scholar Services (International Programs and Global Engagement)

Duolingo Max: A Paid Subscription to Learn a Language Using ChatGPT AI (Worth It?) — from theaigirl.substack.com by Diana Dovgopol (behind paywall for the most part)
The integration of AI in language learning apps could be game-changing.


Research Insights #12: Copyrights and Academia — from aiedusimplified.substack.com by Lance Eaton
Scholarly authors are not going to be happy…

A while back, I wrote about some of my thoughts on generative AI around the copyright issues. Not much has changed since then, but a new article (Academic authors ‘shocked’ after Taylor & Francis sells access to their research to Microsoft AI) is definitely stirring up all sorts of concerns among academic authors. The basics of that article are that Taylor & Francis sold access to authors’ research to Microsoft for AI development without informing the authors, sparking significant concern among academics and the Society of Authors about transparency, consent, and the implications for authors’ rights and future earnings.

The stir can be seen as both valid and redundant. Two folks’ points stick out to me in this regard.

 

AI candidate running for Parliament in the U.K. says AI can humanize politics — from nbcnews.com by Angela Yang and Daniele Hamamdjian; via The Rundown AI
Voters can talk to AI Steve, whose name will be on the ballot for the U.K.’s general election next month, to ask policy questions or raise concerns.

Commentary from The Rundown AI:

The Rundown: An AI-powered candidate named ‘AI Steve’ is running for U.K. Parliament in next month’s general election — creating polarizing questions around AI’s use in government affairs.

The details:

  • AI Steve is represented by businessman Steve Endacott and will appear as an independent candidate in the upcoming election.
  • Voters can interact with AI Steve online to ask policy questions and raise concerns or suggestions, which the AI will incorporate based on feedback.
  • If elected, Endacott will serve as AI Steve’s human proxy in Parliament, attending meetings and casting votes based on the AI’s constituent-driven platform.

Why it matters: The idea of an AI running for office might sound like a joke, but the tech behind it could actually help make our politicians more independent and (ironically) autonomous. AI-assisted governance is likely coming someday, but it’s probably still a bit too early to be taken seriously.



From The Deep View:

The details: Hearing aids have employed machine learning algorithms for decades. But these algorithms historically have not been powerful enough to tackle the ‘cocktail party’ problem; they weren’t able to isolate a single voice in a loud, crowded room.

Dr. DeLiang Wang has been working on the problem for decades and has published numerous studies in recent years that explore the application of deep learning within hearing aids.

Last year, Google partnered up with a number of organizations to design personalized, AI-powered hearing aids.

Why it matters: Wang’s work has found that deep learning algorithms, running in real time, could separate speech from background noise, “significantly” improving intelligibility for hearing-impaired listeners.
The tech is beginning to become publicly available, with brands like Phonak and Starkey leveraging deep learning and AI to enhance their hearing aids.
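
From DSC: To make the “cocktail party” idea above a bit more concrete, here is a minimal, hypothetical sketch of time-frequency masking, the core trick behind deep-learning speech separation. It is not Dr. Wang’s method or any vendor’s pipeline; a real hearing aid predicts the mask with a trained neural network running in real time, whereas this toy uses synthetic audio and a crude energy threshold.

```python
# Toy illustration of the time-frequency masking idea behind deep-learning
# speech separation. This is NOT Dr. Wang's method or any hearing-aid
# vendor's pipeline: a real system predicts the mask with a trained neural
# network, while this sketch uses synthetic audio and a crude energy threshold.
import numpy as np
from scipy.signal import stft, istft

fs = 16_000                                        # sample rate in Hz
t = np.arange(0, 2.0, 1 / fs)
voice = 0.6 * np.sin(2 * np.pi * 440 * t)          # stand-in for one talker
babble = 0.3 * np.random.randn(t.size)             # stand-in for a noisy room
mixture = voice + babble

# 1) Move the noisy mixture into the time-frequency domain.
freqs, frames, Z = stft(mixture, fs=fs, nperseg=512)

# 2) Estimate a binary mask: keep bins whose energy stands out.
#    (A deep-learning hearing aid would predict this mask with a network
#    trained on pairs of noisy and clean speech.)
magnitude = np.abs(Z)
mask = magnitude > 2.0 * magnitude.mean()

# 3) Apply the mask and go back to a waveform the listener would hear.
_, enhanced = istft(Z * mask, fs=fs, nperseg=512)

print(f"kept {mask.mean():.0%} of time-frequency bins")
```

The hard part, and the reason this has only recently become practical in hearing aids, is step 2: learning a mask that keeps one voice and suppresses everything else, quickly enough to run on-device.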



 

Microsoft teams with Khan Academy to make its AI tutor free for K-12 educators and will develop a Phi-3 math model — from venturebeat.com by Ken Yeung

Microsoft is partnering with Khan Academy in a multifaceted deal to demonstrate how AI can transform the way we learn. The cornerstone of today’s announcement centers on Khan Academy’s Khanmigo AI agent. Microsoft says it will migrate the bot to its Azure OpenAI Service, enabling the nonprofit educational organization to provide all U.S. K-12 educators free access to Khanmigo.

In addition, Microsoft plans to use its Phi-3 model to help Khan Academy improve math tutoring and collaborate to generate more high-quality learning content while making more courses available within Microsoft Copilot and Microsoft Teams for Education.


One-Third of Teachers Have Already Tried AI, Survey Finds — from the74million.org by Kevin Mahnken
A RAND poll released last month finds English and social studies teachers embracing tools like ChatGPT.

One in three American teachers have used artificial intelligence tools in their teaching at least once, with English and social studies teachers leading the way, according to a RAND Corporation survey released last month. While the new technology isn’t yet transforming how kids learn, both teachers and district leaders expect that it will become an increasingly common feature of school life.


Professors Try ‘Restrained AI’ Approach to Help Teach Writing — from edsurge.com by Jeffrey R. Young
Can ChatGPT make human writing more efficient, or is writing an inherently time-consuming process best handled without AI tools?

This article is part of the guide: For Education, ChatGPT Holds Promise — and Creates Problems.

When ChatGPT emerged a year and half ago, many professors immediately worried that their students would use it as a substitute for doing their own written assignments — that they’d click a button on a chatbot instead of doing the thinking involved in responding to an essay prompt themselves.

But two English professors at Carnegie Mellon University had a different first reaction: They saw in this new technology a way to show students how to improve their writing skills.

“They start really polishing way too early,” Kaufer says. “And so what we’re trying to do is with AI, now you have a tool to rapidly prototype your language when you are prototyping the quality of your thinking.”

He says the concept is based on writing research from the 1980s that shows that experienced writers spend about 80 percent of their early writing time thinking about whole-text plans and organization and not about sentences.


On Building AI Models for Education — from aieducation.substack.com by Claire Zau
Google’s LearnLM, Khan Academy/MSFT’s Phi-3 Models, and OpenAI’s ChatGPT Edu

This piece primarily breaks down how Google’s LearnLM was built, and takes a quick look at Microsoft/Khan Academy’s Phi-3 and OpenAI’s ChatGPT Edu as alternative approaches to building an “education model” (not necessarily a new model in the latter case, but we’ll explain). Thanks to the public release of their 86-page research paper, we have the most comprehensive view into LearnLM. Our understanding of Microsoft/Khan Academy small language models and ChatGPT Edu is limited to the information provided through announcements, leaving us with less “under the hood” visibility into their development.


AI tutors are quietly changing how kids in the US study, and the leading apps are from China — from techcrunch.com by Rita Liao

Answer AI is among a handful of popular apps that are leveraging the advent of ChatGPT and other large language models to help students with everything from writing history papers to solving physics problems. Of the top 20 education apps in the U.S. App Store, five are AI agents that help students with their school assignments, including Answer AI, according to data from Data.ai on May 21.


Is your school behind on AI? If so, there are practical steps you can take for the next 12 months — from stefanbauschard.substack.com by Stefan Bauschard

If your school (district) or university has not yet made significant efforts to think about how you will prepare your students for a World of AI, I suggest the following steps:

July 24 – Administrator PD & AI Guidance
In July, administrators should receive professional development on AI, if they haven’t already. This should include…

August 24 – Professional Development for Teachers and Staff…
Fall 24 — Parents; Co-curricular; Classroom experiments…
December 24 — Revision to Policy…


New ChatGPT Version Aiming at Higher Ed — from insidehighered.com by Lauren Coffey
ChatGPT Edu, emerging after initial partnerships with several universities, is prompting both cautious optimism and worries.

OpenAI unveiled a new version of ChatGPT focused on universities on Thursday, building on work with a handful of higher education institutions that partnered with the tech giant.

The ChatGPT Edu product, expected to start rolling out this summer, is a platform for institutions intended to give students free access. OpenAI said the artificial intelligence (AI) toolset could be used for an array of education applications, including tutoring, writing grant applications and reviewing résumés.

 


[Report] Generative AI Top 150: The World’s Most Used AI Tools (Feb 2024) — from flexos.work by Daan van Rossum
FlexOS.work surveyed Generative AI platforms to reveal which get used most. While ChatGPT reigns supreme, countless AI platforms are used by millions.

As the FlexOS research study “Generative AI at Work” concluded based on a survey amongst knowledge workers, ChatGPT reigns supreme.

2. AI Tool Usage is Way Higher Than People Expect – Beating Netflix, Pinterest, Twitch.
As measured by data analysis platform Similarweb based on global web traffic tracking, the AI tools in this list generate over 3 billion monthly visits.

With 1.67 billion visits, ChatGPT represents over half of this traffic and is already bigger than Netflix, Microsoft, Pinterest, Twitch, and The New York Times.



Artificial Intelligence Act: MEPs adopt landmark law — from europarl.europa.eu

  • Safeguards on general purpose artificial intelligence
  • Limits on the use of biometric identification systems by law enforcement
  • Bans on social scoring and AI used to manipulate or exploit user vulnerabilities
  • Right of consumers to launch complaints and receive meaningful explanations


The untargeted scraping of facial images from CCTV footage to create facial recognition databases will be banned.


A New Surge in Power Use Is Threatening U.S. Climate Goals — from nytimes.com by Brad Plumer and Nadja Popovich
A boom in data centers and factories is straining electric grids and propping up fossil fuels.

Something unusual is happening in America. Demand for electricity, which has stayed largely flat for two decades, has begun to surge.

Over the past year, electric utilities have nearly doubled their forecasts of how much additional power they’ll need by 2028 as they confront an unexpected explosion in the number of data centers, an abrupt resurgence in manufacturing driven by new federal laws, and millions of electric vehicles being plugged in.


OpenAI and the Fierce AI Industry Debate Over Open Source — from bloomberg.com by Rachel Metz

The tumult could seem like a distraction from the startup’s seemingly unending march toward AI advancement. But the tension, and the latest debate with Musk, illuminates a central question for OpenAI, along with the tech world at large as it’s increasingly consumed by artificial intelligence: Just how open should an AI company be?

The meaning of the word “open” in “OpenAI” seems to be a particular sticking point for both sides — something that might sound, on the surface, pretty clear. But actual definitions are both complex and controversial.


Researchers develop AI-driven tool for near real-time cancer surveillance — from medicalxpress.com by Mark Alewine; via The Rundown AI
Artificial intelligence has delivered a major win for pathologists and researchers in the fight for improved cancer treatments and diagnoses.

In partnership with the National Cancer Institute, or NCI, researchers from the Department of Energy’s Oak Ridge National Laboratory and Louisiana State University developed a long-sequenced AI transformer capable of processing millions of pathology reports to provide experts researching cancer diagnoses and management with exponentially more accurate information on cancer reporting.


 

Slow Shift to Skills — from the-job.beehiiv.com by Paul Fain
Gains in nondegree hiring aren’t happening at scale yet, even in Texas.

Real progress in efforts to increase mobility for nondegree workers is unlikely during the next couple years, Joseph Fuller, a professor at Harvard University’s business school who co-leads its Managing the Future of Work initiative, recently told me.

Yet Fuller is bullish on skills-based hiring becoming a real thing in five to 10 years. That’s because he predicts that AI will create the data to solve the skills taxonomy problem Kolko describes. And if skills-based hiring allows for serious movement for workers without bachelor’s degrees, Fuller says the future will look like where Texas is headed.

Beyond coding, Altman said he’s most excited about the productivity improvement curve for healthcare and education. 

Even so, AI’s likely disruption to jobs isn’t following the pattern envisioned by most experts.

“The consensus prediction, if we rewind seven or 10 years, was that the impact was going to be blue-collar work first, white-collar work second, creativity maybe never, but certainly last, because that was magic and human,” Altman said. “Obviously, it’s gone exactly the other direction.”

From DSC:
Altman is most excited about productivity improvements in coding, healthcare, and education. Nice to see healthcare and education making that list.

AI will create the data to solve the skills taxonomy problem — i.e., skills and competencies will be better matched with positions/jobs.

 

Generative AI Is Set to Shake Up Education — from morganstanley.com
While educators debate the risks and opportunities of generative AI as a learning tool, some education technology companies are using it to increase revenue and lower costs.

Key Takeaways

  • Contrary to the view that generative AI is undermining education, it could ultimately improve access and quality.
  • Education technology companies have opportunities from generative AI that markets may be missing.
  • Generative AI could bring $200 billion in value to the global education sector by 2025.
  • Reskilling and retraining alone could require $6 billion in investments by 2025, with edtech companies poised to fill that need.

Outgoing SNHU president: AI means universities must change ‘dramatically’ — from msn.com by Steven Porter

In his next chapter, LeBlanc will work with a team of researchers to study emerging AI trends, impacts on education, and opportunities to innovate. (The initiative harkens back to his early scholarship. During grad school decades ago, LeBlanc studied the ways computers could impact how societies think.)

LeBlanc said the AI-induced changes on the horizon will require educational institutions to reimagine how they assess student learning and grapple with implications for privacy and data security. There are also bigger questions about what jobs will go away and what jobs will be created, which influences the fields of study schools will offer, he said.



AI & Education: A Year in Review — from drphilippahardman.substack.com by Dr. Philippa Hardman
The top five use cases & most popular tools among educators at the end of 2023 – the year that Gen AI shook up education

Use Case #1: Content Creation
Use Case #2: Brainstorming & Ideation
Use Case #3: Research & Analysis
Use Case #4: Writing & Communicating
Use Case #5: Task Automation


I Used ChatGPT for 12 Months. Here Are Some Hidden Gems That Will Change Your Life — from theaigirl.substack.com by Diana Dovgopol
Transform your life with these hidden ChatGPT gems.

1. Summarize videos, articles, papers and posts
Here’s how it works (note that you need to enable browsing or plugins for this):

  1. Find the video/article/paper/post.
  2. Copy the link.
  3. Ask ChatGPT to summarize it for you.
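
From DSC: The three steps above run inside the ChatGPT interface (with browsing or plugins enabled). If you would rather script the same idea, here is a rough sketch against the OpenAI API; the model name, prompt wording, and the 8,000-character cutoff are my own illustrative choices, not something from Diana’s post.

```python
# Minimal sketch of the "paste a link, get a summary" workflow, done via API
# instead of the ChatGPT UI. The model name, prompt wording, and the
# 8,000-character cutoff are illustrative assumptions, not from the source post.
import requests
from bs4 import BeautifulSoup
from openai import OpenAI  # pip install openai beautifulsoup4 requests

def summarize_url(url: str, model: str = "gpt-4o-mini") -> str:
    # 1) Fetch the page and strip it down to readable text.
    html = requests.get(url, timeout=30).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

    # 2) Ask the model for a short summary of that text.
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": "Summarize the article in 5 bullet points."},
            {"role": "user", "content": text[:8000]},  # crude length guard
        ],
    )
    return response.choices[0].message.content

print(summarize_url("https://example.com/some-article"))
```

The same pattern works for papers and posts; the only real work is getting clean text out of the page before handing it to the model.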

AI ADVISORY BOARDS: Giving Students and Teachers a Voice — from aiadvisoryboards.wordpress.com

My mission is to spread awareness about the incredible potential of AI and AI advisory boards in education. Through my website, aiadvisoryboards.wordpress.com, I aim to inspire educators, administrators, and students to embrace AI and create innovative learning environments.


Report Update: Human and Computer Deep Learning and the Future of Humanity — by Stefan Bauschard
New Chapter on School Guidance; updates on technology, the labor markets, and deep learning


 

Some scripture for us

Psalms 121:5-6


Hebrews 2:17-18 NIV

For this reason he had to be made like them, fully human in every way, in order that he might become a merciful and faithful high priest in service to God, and that he might make atonement for the sins of the people. Because he himself suffered when he was tempted, he is able to help those who are being tempted.


Lamentations 3:40 NIV

Let us examine our ways and test them, and let us return to the LORD.


Psalms 121:1-2


Jeremiah 31:31-34

31 “The days are coming,” declares the Lord,
    “when I will make a new covenant
with the people of Israel
    and with the people of Judah.
32 It will not be like the covenant
    I made with their ancestors
when I took them by the hand
    to lead them out of Egypt,
because they broke my covenant,
    though I was a husband to them,”
declares the Lord.
33 “This is the covenant I will make with the people of Israel
    after that time,” declares the Lord.
“I will put my law in their minds
    and write it on their hearts.
I will be their God,
    and they will be my people.
34 No longer will they teach their neighbor,
    or say to one another, ‘Know the Lord,’
because they will all know me,
    from the least of them to the greatest,”
declares the Lord.
“For I will forgive their wickedness
    and will remember their sins no more.”

 

Home schooling’s rise from fringe to fastest-growing form of education — from washingtonpost.com by Peter Jamison, Laura Meckler, Prayag Gordy, Clara Ence Morse and Chris Alcantara
A district-by-district look at home schooling’s explosive growth, which a Post analysis finds has far outpaced the rate at private and public schools

 

The Game-Changer: How Legal Technology is Transforming the Legal Sector — from todaysconveyancer.co.uk by Perfect Portal

Rob Lawson, Strategic Sales Manager at Perfect Portal discussed why he thinks legal technology is so important:

“I spent almost 20 years in private practice and was often frustrated at the antiquated technology and processes that were deployed. It is one of the reasons that I love working in legal tech to provide solutions and streamline processes in the modern law firm. One of the major grumbles for practitioners is the amount of admin that they must do to fulfil the needs of their clients. Technology can automate routine tasks, streamline processes, and help manage large volumes of data more effectively. This then allows legal professionals to focus on more strategic aspects of their work. Ultimately this will increase efficiency and productivity.”



Some gen AI vendors say they’ll defend customers from IP lawsuits. Others, not so much. — from techcrunch.com by Kyle Wiggers

A person using generative AI — models that generate text, images, music and more given a prompt — could infringe on someone else’s copyright through no fault of their own. But who’s on the hook for the legal fees and damages if — or rather, when — that happens?

It depends.

In the fast-changing landscape of generative AI, companies monetizing the tech — from startups to big tech companies like Google, Amazon and Microsoft — are approaching IP risks from very different angles.


Clio Goes All Out with Major Product Announcements, Including A Personal Injury Add-On, E-Filing, and (Of Course) Generative AI — from lawnext.com by Bob Ambrogi

At its annual Clio Cloud Conference in Nashville today, the law practice management company Clio introduced an array of major new products and product updates, calling the series of announcements its most expansive product update ever in its 15-year history.


AI will invert the biglaw pyramid — from alexofftherecord.com by Cece Xie

These tasks that GPT can now handle are, coincidentally, common tasks for junior associates. From company and transaction summaries to legal research and drafting memos, analyzing and drafting have long been the purview of bright-eyed, bushy-tailed new law grads.

If we follow the capitalistic impulse of biglaw firms to its logical conclusion, this means that junior associates may soon face obsolescence. Why spend an hour figuring out how to explain an assignment to a first-year associate when you can just ask CoCounsel in five minutes? And the initial output will likely be better than a first-year’s initial work product, too.

Given the immense cost-savings that legal GPT products can confer, I suspect the rise of AI in legal tech will coincide with smaller junior associate classes. Gone are the days of 50+ junior lawyers all working on the same document review or due diligence. Instead, a fraction of those junior lawyers will be hired to oversee and QC the AI’s outputs. Junior associates will edit more than they do currently and manage more than they do right now. Juniors will effectively be more like midlevels from the get-go.


Beyond Law Firms: How Legal Tech’s Real Frontier Lies With SMBs (small and medium-sized businesses) — from forbes.com by Charles Brecque

Data and artificial intelligence are transforming the legal technology space—there’s no doubt about it. A recent Thomson Reuters Institute survey of lawyers showed that a large majority (82%) of respondents believe ChatGPT and generative AI can be readily applied to legal work.

While it’s tempting to think of legal tech as a playground exclusive to law firms, as technology enables employees without legal training to use and create legal frameworks and documentation, I’d like to challenge that narrative. Being the founder of a company that uses AI to manage contracts, the way I see it is the real magic happens when legal tech tools meet the day-to-day challenges of small and medium-sized businesses (known as “SMBs”).

 

Student Use Cases for AI: Start by Sharing These Guidelines with Your Class — from hbsp.harvard.edu by Ethan Mollick and Lilach Mollick

To help you explore some of the ways students can use this disruptive new technology to improve their learning—while making your job easier and more effective—we’ve written a series of articles that examine the following student use cases:

  1. AI as feedback generator
  2. AI as personal tutor
  3. AI as team coach
  4. AI as learner
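
From DSC: As a purely illustrative example of the first use case, an “AI as feedback generator” can be as simple as wrapping a student draft in a rubric-style system prompt. The rubric, wording, and model name below are my own assumptions, not the prompts published in the Mollicks’ articles.

```python
# Illustrative sketch of "AI as feedback generator": wrap a student draft in a
# rubric-style prompt and ask a chat model for criteria-based feedback.
# The rubric, wording, and model name are assumptions for illustration only.
from openai import OpenAI

FEEDBACK_PROMPT = """You are a teaching assistant giving formative feedback.
Evaluate the student draft against these criteria:
1. Clarity of the thesis
2. Use of evidence
3. Organization
For each criterion, quote one short passage, explain what works or does not,
and suggest one concrete revision. Do not rewrite the draft for the student."""

def generate_feedback(draft: str, model: str = "gpt-4o-mini") -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": FEEDBACK_PROMPT},
            {"role": "user", "content": draft},
        ],
    )
    return response.choices[0].message.content
```

Roughly speaking, the other use cases differ mainly in the system prompt: a tutor is told to ask questions rather than give answers, a team coach prompts a group to reflect on how it is working together, and “AI as learner” flips the roles so the student does the explaining.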

Recap: Teaching in the Age of AI (What’s Working, What’s Not) — from celt.olemiss.edu by Derek Bruff, visiting associate director

Earlier this week, CETL and AIG hosted a discussion among UM faculty and other instructors about teaching and AI this fall semester. We wanted to know what was working when it came to policies and assignments that responded to generative AI technologies like ChatGPT, Google Bard, Midjourney, DALL-E, and more. We were also interested in hearing what wasn’t working, as well as questions and concerns that the university community had about teaching and AI.


Teaching: Want your students to be skeptical of ChatGPT? Try this. — from chronicle.com by Beth McMurtrie

Then, in class he put them into groups where they worked together to generate a 500-word essay on “Why I Write” entirely through ChatGPT. Each group had complete freedom in how they chose to use the tool. The key: They were asked to evaluate their essay on how well it offered a personal perspective and demonstrated a critical reading of the piece. Weiss also graded each ChatGPT-written essay and included an explanation of why he came up with that particular grade.

After that, the students were asked to record their observations on the experiment on the discussion board. Then they came together again as a class to discuss the experiment.

Weiss shared some of his students’ comments with me (with their approval). Here are a few:


2023 EDUCAUSE Horizon Action Plan: Generative AI — from library.educause.edu by Jenay Robert and Nicole Muscanell

Asked to describe the state of generative AI that they would like to see in higher education 10 years from now, panelists collaboratively constructed their preferred future.


Will Teachers Listen to Feedback From AI? Researchers Are Betting on It — from edsurge.com by Olina Banerji

Julie York, a computer science and media teacher at South Portland High School in Maine, was scouring the internet for discussion tools for her class when she found TeachFX. An AI tool that takes recorded audio from a classroom and turns it into data about who talked and for how long, it seemed like a cool way for York to discuss issues of data privacy, consent and bias with her students. But York soon realized that TeachFX was meant for much more.

York found that TeachFX listened to her very carefully, and generated a detailed feedback report on her specific teaching style. York was hooked, in part because she says her school administration simply doesn’t have the time to observe teachers while tending to several other pressing concerns.

“I rarely ever get feedback on my teaching style. This was giving me 100 percent quantifiable data on how many questions I asked and how often I asked them in a 90-minute class,” York says. “It’s not a rubric. It’s a reflection.”

TeachFX is easy to use, York says. It’s as simple as switching on a recording device.

But TeachFX, she adds, is focused not on her students’ achievements, but instead on her performance as a teacher.
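
From DSC: TeachFX has not published its pipeline, but the core analytics York describes (who talked, and for how long) can be sketched from a speaker-diarized transcript. The segments below are invented for illustration; in a real system they would come from speech-to-text plus speaker diarization run on the classroom recording.

```python
# Toy sketch of the talk-time analytics described above (who spoke, for how
# long, and what share of the class). TeachFX's actual pipeline is not public;
# these diarization segments are invented and would normally come from a
# speech-to-text + speaker-diarization model run on the classroom recording.
from collections import defaultdict

# (speaker, start_seconds, end_seconds) for a hypothetical class recording
segments = [
    ("teacher", 0.0, 310.0),
    ("student", 310.0, 330.0),
    ("teacher", 330.0, 700.0),
    ("student", 700.0, 760.0),
]

def talk_time_report(segments):
    totals = defaultdict(float)
    for speaker, start, end in segments:
        totals[speaker] += end - start
    recorded = sum(totals.values())
    return {
        speaker: {"minutes": round(seconds / 60, 1),
                  "share": round(seconds / recorded, 2)}
        for speaker, seconds in totals.items()
    }

print(talk_time_report(segments))
# -> {'teacher': {'minutes': 11.3, 'share': 0.89}, 'student': {'minutes': 1.3, 'share': 0.11}}
```

Counting question-like utterances in the transcript, the source of the “how many questions I asked” figure York mentions, would be a similar pass over the text.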


ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds — from the74million.org by Mark Keierleber
While educators worry that students are using generative AI to cheat, a new report finds students are turning to the tool more for personal problems.

Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat.


Embracing weirdness: What it means to use AI as a (writing) tool — from oneusefulthing.org by Ethan Mollick
AI is strange. We need to learn to use it.

But LLMs are not Google replacements, or thesauruses or grammar checkers. Instead, they are capable of so much more weird and useful help.


Diving Deep into AI: Navigating the L&D Landscape — from learningguild.com by Markus Bernhardt

The prospect of AI-powered, tailored, on-demand learning and performance support is exhilarating: It starts with traditional digital learning made into fully adaptive learning experiences, which would adjust to strengths and weaknesses for each individual learner. The possibilities extend all the way through to simulations and augmented reality, an environment to put into practice knowledge and skills, whether as individuals or working in a team simulation. The possibilities are immense.

Thanks to generative AI, such visions are transitioning from fiction to reality.


Video: Unleashing the Power of AI in L&D — from drphilippahardman.substack.com by Dr. Philippa Hardman
An exclusive video walkthrough of my keynote at Sweden’s national L&D conference this week

Highlights

  • The wicked problem of L&D: last year, $371 billion was spent on workplace training globally, but only 12% of employees apply what they learn in the workplace
  • An innovative approach to L&D: when Mastery Learning is used to design & deliver workplace training, the rate of “transfer” (i.e. behaviour change & application) is 67%
  • AI 101: quick summary of classification, generative and interactive AI and its uses in L&D
  • The impact of AI: my initial research shows that AI has the potential to scale Mastery Learning and, in the process:
    • reduce the “time to training design” by 94% > faster
    • reduce the cost of training design by 92% > cheaper
    • increase the quality of learning design & delivery by 96% > better
  • Research also shows that the vast majority of workplaces are using AI only to “oil the machine” rather than innovate and improve our processes & practices
  • Practical tips: how to get started on your AI journey in your company, and a glimpse of what L&D roles might look like in a post-AI world

 

Post-prison job training — hechingerreport.org by Olivia Sanchez and Tara García Mathewson

We’ve known for a long time that higher education can play a huge role in helping people who serve time in prison get back on their feet. Research shows that higher-ed attainment is directly correlated with a lower likelihood of being reincarcerated, as is stable employment.

But people getting out of prison face many obstacles in finding jobs, and lack of educational opportunities is just part of the issue. A patchwork of more than 14,000 federal, state and local laws and regulations restricts individuals who have arrest and conviction histories from getting licensed in certain fields. Here’s some of what my reporting found about how pervasive this problem is and why it matters:

 

Some scripture for us

Psalms 86:11
Teach me your way, LORD, that I may rely on your faithfulness; give me an undivided heart, that I may fear (respect/hold in awe) your name.

Amos 5:24
But let justice roll on like a river, righteousness like a never-failing stream!

Psalms 86:5
You, Lord, are forgiving and good, abounding in love to all who call to you.

Proverbs 17:17
A friend loves at all times, and a brother is born for a time of adversity.

 

 

Holograms Are Coming To California State Parks — from vrscout.com by Kyle Melnick



AR Technology Is Invading The Kitchen — from vrscout.com by Kyle Melnick

Excerpt (emphasis DSC):

Could this immersive AR experience revolutionize the culinary arts?

Earlier this month, the popular culinary livestreaming network Kittch announced that it is partnering with American technology company Qualcomm to create hands-free cooking experiences accessible via AR glasses.


Apple Vision Pro, Higher Education and the Next 10 Years — from insidehighered.com by Joshua Kim
How this technology will play out in our world over the next decade.

 

From DSC:
Attention all school districts! Listen up! –> Top 10 Tips for Creating a Successful Onboarding Experience  — from learningguild.com by Tara Roberson-Moore, PhD

Excerpts:

Did you know that 69% of employees say they are likely to stay with a company for at least three years after a great onboarding experience? (O.C. Tanner, 2018)

Did you know that new hires in companies with longer onboarding programs report being more proficient in their roles four months sooner than in companies with short onboarding programs? (O.C. Tanner, 2018)

Also relevant/see:

Analyze This — from learningguild.com by Tara Roberson-Moore

Excerpt:

Onboarding is a program. It is not one training event. It is also usually not one development event either—I have seen onboarding programs built in one full development phase and I have seen onboarding programs divided into several development phases (costs are one factor, but project fatigue is also a factor). Due to the sheer size of a long-term initiative like onboarding, getting everything in place before development begins is imperative to success.

Analysis results in a plan. As we have mentioned before, you get the blueprint of the house—not the house. Remember, not even paint colors are being picked yet!

If you are thinking about creating an onboarding program, whether on your own or working with a really cool and fun troupe of elves, start by asking questions…


Cognitive Load and Virtual Training — from learningguild.com by Pamela Wise

Excerpt:

 As designers navigate the waters of building learning interventions that utilize virtual technology, they must keep the learner’s cognitive load low in order to increase learning retention. As with all instructional design, virtual training should utilize effective instructional methods. However, studies have shown that some virtual training strategies, such as immersion, may increase the cognitive load in one setting but decrease it in another.

 

 
© 2024 | Daniel Christian