From DSC:
I was watching a sermon the other day, and I’m always amazed when a pastor doesn’t need to read their notes (or hardly ever refers to them) — and can keep that up through a much longer sermon, too. Not me, man.

It got me wondering about the idea of having a teleprompter on our future Augmented Reality (AR) glasses and/or on our Virtual Reality (VR) headsets. Or perhaps such functionality will be provided on our mobile devices as well (e.g., our smartphones, tablets, laptops, and other devices) via cloud-based applications.

One could see one’s presentation, sermon, main points for a meeting, the charges being brought against a defendant, and so on, and the system would know to scroll as the words were spoken (via speech recognition and Natural Language Processing (NLP)). If the speaker went off script, the system would stop scrolling, and they might need to scroll manually or simply pick up where they left off.
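Here is a minimal sketch of that script-following behavior, assuming some streaming speech-to-text engine supplies the recognized words one at a time (the usage loop below just hard-codes a few words as a stand-in). The script is tokenized, each recognized word is matched against a small window just ahead of the current position, and the scroll position advances only while the speaker stays on script.

```python
import re

def tokenize(text):
    """Lowercase word tokens, for loose matching against recognized speech."""
    return re.findall(r"[a-z']+", text.lower())

class ScriptFollower:
    """Tracks how far along a script the speaker is, so a teleprompter can auto-scroll."""

    def __init__(self, script_text, lookahead=8):
        self.words = tokenize(script_text)
        self.position = 0            # index of the next expected script word
        self.lookahead = lookahead   # how far ahead to search before declaring "off script"

    def hear(self, spoken_word):
        """Advance the position if the spoken word appears just ahead in the script.

        Returns the new word index to scroll to, or None if the speaker seems to be
        off script (the UI would then pause auto-scrolling).
        """
        spoken = spoken_word.lower()
        window = self.words[self.position:self.position + self.lookahead]
        if spoken in window:
            self.position += window.index(spoken) + 1
            return self.position
        return None

# Hypothetical usage: feed words from any streaming speech-to-text engine.
follower = ScriptFollower("Four score and seven years ago our fathers brought forth on this continent")
for word in ["four", "score", "and", "um", "seven"]:
    pos = follower.hear(word)
    print(word, "->", "paused (off script)" if pos is None else f"scroll to word {pos}")
```

A real system would add fuzzy matching and timing on top of this, but the core bookkeeping is that simple.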

For that matter, I suppose a faculty member could toggle an AI-based feed showing where a topic appears in the textbook. Or a CEO or university president could get prompted to refer to a particular section of the strategic plan. Hmmm…I don’t know…it might be too much cognitive load/overload…I’d have to try it out.

And/or perhaps this is a feature in our future videoconferencing applications.

But I just wanted to throw these ideas out there in case someone wanted to run with one or more of them.

Along these lines, see:


Is a teleprompter a feature in our future Augmented Reality (AR) glasses?


 

The Top 10 Digital Health Stories Of 2022 — from medicalfuturist.com by Dr. Bertalan Mesko

Excerpt:

Edging towards the end of the year, it is time for a summary of how digital health progressed in 2022. It is easy to get lost in the noise – I myself shared well over a thousand articles, studies and news items between January and the end of November 2022. Thus, just like in 2021, 2020 (and so on), I picked the 10 topics I believe will have the most significance in the future of healthcare.

9. Smart TVs Becoming A Remote Care Platform
The concept of turning one’s TV into a remote care hub isn’t new. Back in 2012, researchers designed a remote health assistance system for the elderly to use through a TV set. But we are exploring this idea now as a major tech company has recently pushed for telehealth through TVs. In early 2022, electronics giant LG announced that its smart TVs will be equipped with the remote health platform Independa. 

And in just a few months (late November) came a follow-up: a product called Carepoint TV Kit 200L, in beta testing now. Powered by Amwell’s Converge platform, the product is aimed at helping clinicians more easily engage with patients amid healthcare’s workforce shortage crisis.

Also relevant/see:

Asynchronous Telemedicine Is Coming And Here Is Why It’s The Future Of Remote Care — from medicalfuturist.com by Dr. Bertalan Mesko

Excerpt:

Asynchronous telemedicine is one of those terms we will need to get used to in the coming years. Although it may sound alien, chances are you have been using some form of it for a while.

With the progress of digital health, especially due to the pandemic’s impact, remote care has become a popular approach in the healthcare setting. It can come in two forms: synchronous telemedicine and asynchronous telemedicine.

 

What might the ramifications be for text-to-everything? [Christian]

From DSC:

  • We can now type in text to get graphics and artwork.
  • We can now type in text to get videos.
  • There are several tools to give us transcripts of what was said during a presentation.
  • We can search videos for spoken words and/or for words listed within slides within a presentation.

Allie Miller’s posting on LinkedIn (see below) pointed these things out as well — along with several other things.
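As a small illustration of the last item in that list (searching a video for spoken words), here is a minimal sketch that assumes a transcript is already available as (timestamp, word) pairs; the sample data below is made up. It builds an inverted index so that a query word returns the moments in the video where it was spoken, which is exactly what a player needs for jump-to-word search.

```python
from collections import defaultdict

def build_index(transcript):
    """Map each lowercased word to the timestamps (seconds into the video) where it was spoken."""
    index = defaultdict(list)
    for timestamp, word in transcript:
        index[word.lower()].append(timestamp)
    return index

# Hypothetical transcript fragment: (seconds_into_video, word) pairs from a speech-to-text pass.
transcript = [(12.4, "learning"), (12.9, "ecosystems"), (47.0, "learning"), (47.6, "analytics")]
index = build_index(transcript)

print(index["learning"])            # [12.4, 47.0] -> jump points for the video player
print(index.get("metaverse", []))   # []           -> the word was never spoken
```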



This raises some ideas/questions for me:

  • What might the ramifications be in our learning ecosystems for these types of functionalities? What affordances are forthcoming? For example, a teacher, professor, or trainer could quickly produce several types of media from the same presentation.
  • What’s said in a videoconference or a webinar can already be captured, translated, and transcribed.
  • Or what’s said in a virtual courtroom, or in a telehealth-based appointment. Or perhaps what we currently think of as a smart/connected TV will give us these functionalities as well.
  • How might this type of thing impact storytelling?
  • Will this help someone who prefers to soak in information via the spoken word, or via a podcast, or via a video?
  • What does this mean for Augmented Reality (AR), Mixed Reality (MR), and/or Virtual Reality (VR) types of devices?
  • Will this kind of thing be standard in the next version of the Internet (Web3)?
  • Will this help people with special needs — and way beyond accessibility-related needs?
  • Will data be next (instead of typing in text)?

Hmmm….interesting times ahead.

 

Why text-to-speech tools might have a place in your classroom with Dr. Kirsten Kohlmeyer – Easy EdTech Podcast 183 — from classtechtips.com by Monica Burns

Excerpt:

In this episode, Assistive Technology Director, Dr. Kirsten Kohlmeyer, joins to discuss the power of accessibility and text-to-speech tools in classroom environments. You’ll also hear plenty of digital resources to check out for text-to-speech options, audiobooks, and more!

Assistive tools can provide:

  • Text-to-speech
  • Definitions/vocabularies
  • Ability to adjust the Lexile level of a reading
  • Capability to declutter a website
  • More chances to read to learn something new
  • and more
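As a rough sketch of the first item above (text-to-speech), here is what the core of such a tool looks like in Python, assuming the open-source pyttsx3 library is installed (pip install pyttsx3); classroom products wrap the same idea in friendlier interfaces and add word highlighting, voice choices, and pacing controls.

```python
import pyttsx3  # offline text-to-speech engine; install with: pip install pyttsx3

def read_aloud(text, words_per_minute=150):
    """Speak the given text through the default system voice."""
    engine = pyttsx3.init()
    engine.setProperty("rate", words_per_minute)  # slower rates can help emerging readers
    engine.say(text)
    engine.runAndWait()

read_aloud("The water cycle describes how water moves between the oceans, the air, and the land.")
```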

Speaking of tools, also see:

 


Ways that artificial intelligence is revolutionizing education — from thetechedvocate.org by Matthew Lynch

Excerpt:

I was speaking with an aging schoolteacher who believes that AI is destroying education. They challenged me to come up with 26 ways that artificial intelligence (AI) is improving education, and I did. They’re right here.


AI Startup Speeds Healthcare Innovations To Save Lives — by Geri Stengel

Excerpt:

This project was a light-bulb moment for her. The financial industry had Bloomberg to analyze content and data to help investors uncover opportunities and minimize risk, and pharmaceutical, biotech, and medical device companies needed something similar.



 

The Future of Education | By Futurist Gerd Leonhard | A Video for EduCanada — from futuristgerd.com

Per Gerd:

Recently, I was invited by the Embassy of Canada in Switzerland to create this special presentation and promotional video discussing the Future of Education and to explore how Canada might be leading the way. Here are some of the key points I spoke about in the video. Watch the whole thing here: the Future of Education.

 

…because by 2030, I believe, the traditional way of learning — just in case, you know, storing and downloading information — will be replaced by learning just in time, on demand: learning to learn, unlearning, relearning, and the importance of being the right person. Character skills, personality skills, traits — they may very well rival the value of having the right degree.

If you learn like a robot…you’ll never have a job to begin with.

Gerd Leonhard


Also relevant/see:

The Next 10 Years: Rethinking Work and Revolutionising Education (Gerd Leonhard’s keynote in Riga) — from futuristgerd.com


 

Will Learning Move into the Metaverse? — from learningsolutionsmag.com by Pamela Hogle

Excerpt:

In its 2022 Tech Trends report, the Future Today Institute predicts that, “The future of work will become more digitally immersive as companies deploy virtual meeting platforms, digital experiences, and mixed reality worlds.”

Learning leaders are likely to spearhead the integration of their organizations’ workers into a metaverse, whether by providing training in using the tools that make a metaverse possible or through developing training and performance support resources that learners will use in an immersive environment.

Advantages of moving some workplace collaboration and learning into a metaverse include ease of scaling and globalization. The Tech Trends report mentions personalization at scale and easy multilingual translation as advantages of “synthetic media”—algorithmically generated digital content, which could proliferate in metaverses.

Also see:

Future Today Institute -- Tech Trends 2022


Also from learningsolutionsmag.com, see:

Manage Diverse Learning Ecosystems with Federated Governance

Excerpt:

So, over time, the L&D departments eventually go back to calling their own shots.

What does this mean for the learning ecosystem? If each L&D team chooses its own learning platforms, maintenance and support will be a nightmare. Each L&D department may be happy with the autonomy but learners have no patience for navigating multiple LMSs or going to several systems to get their training records.

Creating common infrastructure among dispersed groups
Here you have the problem: How can groups that have no accountability to each other share a common infrastructure?

 


The Race to Hide Your Voice — from wired.com by Matt Burgess
Voice recognition—and data collection—have boomed in recent years. Researchers are figuring out how to protect your privacy.

AI: Where are we now? — from educause.edu by EDUCAUSE
Is the use of AI in higher education today invisible? dynamic? perilous? Maybe it’s all three.

What is artificial intelligence and how is it used? — from europarl.europa.eu; with thanks to Tom Barrett for this resource

 

Apple Starts Connecting the Dots for Its Next Big Thing — from nytimes.com by Tripp Mickle and Brian X. Chen
The company has enlisted Hollywood directors like Jon Favreau to help its effort to create products that blend the physical and virtual worlds.

Excerpt:

Nearly 15 years after the iPhone set off the smartphone revolution, Apple is assembling the pieces for what it hopes will become its next business-altering device: a headset that blends the digital world with the real one.

The company has enlisted Hollywood directors such as Jon Favreau to develop video content for a headset that it is expected to ship next year, according to three people familiar with that work. Mr. Favreau, an executive producer of “Prehistoric Planet” on Apple TV+, is working to bring that show’s dinosaurs to life on the headset, which looks like a pair of ski goggles and aims to offer virtual- and augmented-reality experiences, these people said.

Speaking of the future, here’s another item regarding what’s coming down the pike:

 

From DSC:
Wow…I hadn’t heard of voice banking before. This was an interesting item from multiple perspectives.

Providing a creative way for people with Motor Neurone Disease to bank their voices, I Will Always Be Me is a dynamic and heartfelt publication — from itsnicethat.com by Olivia Hingley
Speaking to the project’s illustrator and creative director, we discover how the book aims to be a tool for family and loved ones to discuss and come to terms with the diagnosis.

Excerpt:

Whilst voice banking technology is widely available to those suffering from MND, Tal says that the primary problem is “that not enough people are banking their voice because the process is long, boring and solitary. People with MND don’t want to sit in a lonely room to record random phrases and sentences; they already have a lot to deal with.” Therefore, many people only realise or interact with the importance of voice banking when their voice has already deteriorated. “So,” Tal expands, “the brief we got was: turn voice banking into something that people will want to do as soon as they’re diagnosed.”

 

AI research is a dumpster fire and Google’s holding the matches — from thenextweb.com by Tristan Greene
Scientific endeavor is no match for corporate greed

Excerpts:

The world of AI research is in shambles. From the academics prioritizing easy-to-monetize schemes over breaking novel ground, to the Silicon Valley elite using the threat of job loss to encourage corporate-friendly hypotheses, the system is a broken mess.

And Google deserves a lion’s share of the blame.

Google, more than any other company, bears responsibility for the modern AI paradigm. That means we need to give big G full marks for bringing natural language processing and image recognition to the masses.

It also means we can credit Google with creating the researcher-eat-researcher environment that has some college students and their big-tech-partnered professors treating research papers as little more than bait for venture capitalists and corporate headhunters.

But the system’s set up to encourage the monetization of algorithms first, and to further the field second. In order for this to change, big tech and academia both need to commit to wholesale reform in how research is presented and reviewed.

Also relevant/see:

Every month, Essentials publishes an Industry Trend Report on AI in general and on the following related topics:

  • AI Research
  • AI Applied Use Cases
  • AI Ethics
  • AI Robotics
  • AI Marketing
  • AI Cybersecurity
  • AI Healthcare

It’s never too early to get your AI ethics right — from protocol.com by Veronica Irwin
The Ethical AI Governance Group wants to give startups a framework for avoiding scandals and blunders while deploying new technology.

Excerpt:

To solve this problem, a group of consultants, venture capitalists and executives in AI created the Ethical AI Governance Group last September. In March, it went public, and published a survey-style “continuum” for investors to use in advising the startups in their portfolio.

The continuum conveys clear guidance for startups at various growth stages, recommending that startups have people in charge of AI governance and data privacy strategy, for example. EAIGG leadership argues that using the continuum will protect VC portfolios from value-destroying scandals.

 

Radar trends to watch: May 2022 — from oreilly.com
Developments in Web3, Security, Biology, and More

Excerpt:

April was the month for large language models. There was one announcement after another; most new models were larger than the previous ones, and several claimed to be significantly more energy efficient.

 

Remote court transcription technology enables virtual court appearances — from abajournal.com by Nicole Black

Excerpts:

That’s why it’s imperative to make certain remote options are available for all aspects of legal work since doing so is the only way to guarantee the justice system doesn’t come to a grinding halt. One way to prevent that is to take advantage of the virtual deposition transcription tools I discussed in last month’s column. In that article, I provided an overview of virtual deposition transcription products and services that rely on videoconferencing tools and software platforms to facilitate remote depositions.

Another way business continuity has been maintained since March 2020 is via virtual court proceedings. Remote court appearances are now more common since courts periodically shifted to partial or fully remote operations throughout the pandemic. Many judges have become accustomed to and appreciate the convenience of virtual court proceedings, and many expect them to continue even after the pandemic ends.

Because all signs point to the continuation of virtual court proceedings, I promised in last month’s article that I would focus on remote court proceeding options in this column. These include software platforms and artificial intelligence language-processing tools that facilitate remote court proceedings.

Nicole’s article mentioned the following vendor/product:

Live Litigation -- Remote Solutions for Attending and Participating in Depositions, Trials, Hearings, Arbitrations, Mediations, Witness Prep, and more.

 

The amazing opportunities of AI in the future of the educational metaverse [Darbinyan]

The amazing opportunities of AI in the future of the educational metaverse — from forbes.com by Rem Darbinyan

Excerpt:

Looking ahead, let’s go over several potential AI-backed applications of the metaverse that can empower the education industry in many ways.

Multilingual Learning Opportunities
Language differences may be a real challenge for students from different cultures as they may not be able to understand and keep up with the materials and assignments. Artificial intelligence, VR and AR technologies can enhance multilingual accessibility for learners no matter where they are in the world. Speech-to-text, text-to-speech and machine translation technologies enrich the learning process and create more immersive learning environments.

AI can process multiple languages simultaneously and provide real-time translations, enabling learners to engage with the materials in the language of their choice. With the ability to instantly transcribe speech across multiple languages, artificial intelligence removes any language barriers for students, enabling them to be potentially involved, learn and communicate in any language.

Artificial Intelligence (AI) in Education Market size exceeded USD 1 billion in 2020 and is expected to grow at a CAGR of over 40% between 2021 and 2027. (source)
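Here is a minimal sketch of the speech-to-text plus machine-translation pipeline the excerpt describes, assuming the Hugging Face transformers library and the two model names below (both are my assumptions, not anything named in the article): a short lecture clip is transcribed and then translated for the learner.

```python
# A rough sketch, not a production pipeline. Assumes: pip install transformers torch,
# plus ffmpeg for audio decoding, and that the model names below are available.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")
translate_en_to_es = pipeline("translation", model="Helsinki-NLP/opus-mt-en-es")

def lecture_to_spanish(audio_path):
    """Return (English transcript, Spanish translation) for a short audio clip."""
    english_text = asr(audio_path)["text"]
    spanish_text = translate_en_to_es(english_text)[0]["translation_text"]
    return english_text, spanish_text

transcript, translation = lecture_to_spanish("lecture_clip.wav")  # hypothetical file
print(transcript)
print(translation)
```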

Along the lines of innovation within our educational learning ecosystems, see:

3 Questions for Coursera’s Betty Vandenbosch & U-M’s Lauren Atkins Budde on XR — from insidehighered.com by Joshua Kim
How might extended reality shape the future of learning?

Excerpts (emphasis DSC):

[Lauren Atkins Budde] “Being able to embed quality, effective extended reality experiences into online courses is exponentially a game-changer. One of the persistent constraints of online learning, especially at scale, is how do learners get hands-on practice? How do they experience specific contexts and situations? How do they learn things that are best experienced? XR provides that opportunity for actively doing different kinds of tasks, in various environments, in ways that would otherwise not be possible. It will open up both how we teach online and also what we teach online.”

These courses are really exciting and cover a broad range of disciplines, which is particularly important. To choose the right subjects, we did an extensive review of insights from industry partners, learners and market research on in-demand and emerging future-of-work skills and then paired that with content opportunities where immersive learning is really a value-add and creates what our learning experience designers call “embodied learning.”

Addendum on 5/1/22:
Can the Metaverse Improve Learning? New Research Finds Some Promise — from edsurge.com by Jeffrey R. Young

“The findings support a deeper understanding of how creating unique educational experiences that feel real (i.e., create a high level of presence) through immersive technology can influence learning through different affective and cognitive processes including enjoyment and interest,” Mayer and his colleagues write.

 

2021 Year End report ADA Digital Accessibility Lawsuits — from info.usablenet.com (need to complete form to get access to the report)

Excerpts:

The UsableNet data and research team reviews hundreds of website and app accessibility lawsuits in federal courts under ADA and in California state courts for our bi-annual lawsuit reports.

Highlights from the report include:

  • The top-line numbers in 2021
  • The trend of repeat digital lawsuits
  • Where key cases have brought some relief and re-focus on connecting to brick and mortar
  • A break-down of the impact on E-commerce
  • Top plaintiffs and plaintiff firms
  • The increase in lawsuits involving accessibility widgets and overlays

Also relevant/see:


 