94% of Consumers are Satisfied with Virtual Primary Care — from hitconsultant.net

Excerpt from What You Should Know (emphasis DSC):

  • Among people who have used virtual primary care, the vast majority (94%) are satisfied with their experience, and nearly four in five (79%) say it has allowed them to take charge of their health. The study included findings on familiarity and experience with virtual primary care, virtual primary care and chronic conditions, current health and practices, and more.
  • As digital health technology continues to advance and the healthcare industry evolves, many Americans want the ability to use more digital methods to manage their health, according to a study recently released by Elevance Health — formerly Anthem, Inc. Elevance Health commissioned an online study of more than 5,000 US adults (age 18+) about virtual primary care.
 

The talent needed to adopt mobile AR in industry — from chieflearningofficer.com by Yao Huang Ph.D.

Excerpt:

Therefore, when adopting mobile AR to improve job performance, L&D professionals need to shift their mindset from offering training with AR alone to offering performance support with AR in the middle of the workflow.

The learning director from a supply chain industry pointed out that “70 percent of the information needed to build performance support systems already exists. The problem is it is all over the place and is available on different systems.”

It is the learning and development professional’s job to design a solution with the capability of the technology and present it in a way that most benefits the end users.

All participants revealed that mobile AR adoption in L&D is still new, but growing rapidly. L&D professionals face many opportunities and challenges. Understanding the benefits, challenges and opportunities of mobile AR used in the workplace is imperative.

A brief insert from DSC:
Augmented Reality (AR) is about to hit the mainstream in the next 1-3 years. It will connect the physical world with the digital world in powerful, helpful ways (and likely in negative ways as well). I think it will be far bigger and more commonly used than Virtual Reality (VR). (By the way, I’m also including Mixed Reality (MR) within the greater AR domain.) With Artificial Intelligence (AI) making strides in object recognition, AR could be huge.

Learning & Development groups should ask for funding soon — or develop proposals for future funding as the new hardware and software products mature — in order to upskill at least some members of their groups in the near future.

As within Teaching & Learning Centers within higher education, L&D groups need to practice what they preach — and be sure to train their own people as well.

 

From DSC:
I was watching a sermon the other day, and I’m always amazed when the pastor doesn’t need to read their notes (or hardly ever refers to them), and can keep that up through a much longer sermon too. Not me, man.

It got me wondering about the idea of having a teleprompter on our future Augmented Reality (AR) glasses and/or Virtual Reality (VR) headsets. Or perhaps such functionality will be provided on our mobile devices as well (i.e., our smartphones, tablets, laptops, etc.) via cloud-based applications.

One could see one’s presentation, sermon, main points for a meeting, the charges being brought against a defendant, etc., and the system would know to scroll down as you spoke the words (via Natural Language Processing (NLP)). If you went off script, the system would stop scrolling, and you might need to scroll down manually or simply pick up where you left off.
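The scroll-as-you-speak idea above can be sketched in a few lines. This is a hypothetical illustration, not any real product’s API: the `TeleprompterTracker` class and its parameters are made up, and a real system would feed it words from a speech-recognition engine rather than a hand-typed list.

```python
import re

def normalize(word):
    """Lowercase and strip punctuation so 'Hello,' matches 'hello'."""
    return re.sub(r"[^\w']", "", word.lower())

class TeleprompterTracker:
    """Toy tracker: advances a script pointer as recognized words arrive."""

    def __init__(self, script_text, lookahead=5, off_script_limit=4):
        self.words = [normalize(w) for w in script_text.split()]
        self.pos = 0                        # index of the next expected word
        self.lookahead = lookahead          # how far ahead to search for a match
        self.off_script_limit = off_script_limit
        self.misses = 0                     # consecutive unmatched spoken words
        self.paused = False

    def hear(self, spoken_word):
        """Feed one recognized word; returns the current script position."""
        w = normalize(spoken_word)
        # Search a small window ahead so minor skips don't stall the scroll.
        window = self.words[self.pos:self.pos + self.lookahead]
        if w in window:
            self.pos += window.index(w) + 1
            self.misses = 0
            self.paused = False
        else:
            self.misses += 1
            if self.misses >= self.off_script_limit:
                self.paused = True          # speaker went off script: stop scrolling
        return self.pos

tracker = TeleprompterTracker("Good morning everyone and welcome to the meeting")
for word in ["good", "morning", "everyone"]:
    tracker.hear(word)
print(tracker.pos)   # 3 -- the display has scrolled past the first three words
```

The small lookahead window is the design choice that matters: it tolerates a skipped word or a recognizer hiccup without freezing the scroll, while the miss counter is what detects going off script.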

For that matter, I suppose a faculty member could turn on and off a feed for an AI-based stream of content on where a topic is in the textbook. Or a CEO or University President could get prompted to refer to a particular section of the Strategic Plan. Hmmm…I don’t know…it might be too much cognitive load/overload…I’d have to try it out.

And/or perhaps this is a feature in our future videoconferencing applications.

But I just wanted to throw these ideas out there in case someone wanted to run with one or more of them.

Along these lines, see:


Is a teleprompter a feature in our future Augmented Reality (AR) glasses?


 

Making a Digital Window Wall from TVs — from theawesomer.com

Drew Builds Stuff has an office in the basement of his parents’ house. Because of its subterranean location, it doesn’t get much light. To brighten things up, he built a window wall out of three 75-inch 4K TVs, resulting in a 12-foot diagonal image. Since he can load up any video footage, he can pretend to be anywhere on Earth.

From DSC:
Perhaps some ideas here for learning spaces!

 

What might the ramifications be for text-to-everything? [Christian]

From DSC:

  • We can now type in text to get graphics and artwork.
  • We can now type in text to get videos.
  • There are several tools to give us transcripts of what was said during a presentation.
  • We can search videos for spoken words and/or for words listed within slides within a presentation.

Allie Miller’s posting on LinkedIn (see below) pointed these things out as well — along with several other things.



This raises some ideas/questions for me:

  • What might the ramifications be in our learning ecosystems for these types of functionalities? What affordances are forthcoming? For example, a teacher, professor, or trainer could quickly produce several types of media from the same presentation.
  • What’s said in a videoconference or a webinar can already be captured, translated, and transcribed.
  • Or what’s said in a virtual courtroom, or in a telehealth-based appointment. Or perhaps, what we currently think of as a smart/connected TV will give us these functionalities as well.
  • How might this type of thing impact storytelling?
  • Will this help someone who prefers to soak in information via the spoken word, or via a podcast, or via a video?
  • What does this mean for Augmented Reality (AR), Mixed Reality (MR), and/or Virtual Reality (VR) types of devices?
  • Will this kind of thing be standard in the next version of the Internet (Web3)?
  • Will this help people with special needs — and way beyond accessibility-related needs?
  • Will data be next (instead of typing in text)?

Hmmm….interesting times ahead.

 

Top Tools for Learning 2022 [Jane Hart]


 

Top tools for learning 2022 — from toptools4learning.com by Jane Hart

Excerpt:

It has become clear that whilst 2021 was the year of experimentation – with an explosion of tools being used as people tried out new things – 2022 has been the year of consolidation, with people reverting to their trusty old favourites. In fact, many of the tools that were knocked off their perches in 2021 have now recovered their lost ground this year.


Also somewhat relevant/see:


 

The Metaverse in 2040 — from pewresearch.org by Janna Anderson and Lee Rainie
Hype? Hope? Hell? Maybe all three. Experts are split about the likely evolution of a truly immersive ‘metaverse.’ They expect that augmented- and mixed-reality enhancements will become more useful in people’s daily lives. Many worry that current online problems may be magnified if Web3 development is led by those who built today’s dominant web platforms.

 

The metaverse will, at its core, be a collection of new and extended technologies. It is easy to imagine that both the best and the worst aspects of our online lives will be extended by being able to tap into a more-complete immersive experience, by being inside a digital space instead of looking at one from the outside.

Laurence Lannom, vice president at the Corporation for National Research Initiatives

“Virtual, augmented and mixed reality are the gateway to phenomenal applications in medicine, education, manufacturing, retail, workforce training and more, and it is the gateway to deeply social and immersive interactions – the metaverse.”

Elizabeth Hyman, CEO for the XR Association

 


 

The table of contents for the Metaverse in 2040 set of articles at pewresearch.org -- June 30, 2022

 


 

Meet the metaverse: Creating real value in a virtual world — from mckinsey.com with Eric Hazan and Lareina Yee

Excerpt (emphasis DSC):

Welcome to the metaverse. Now, where exactly are we? Imagine for a moment the next iteration of the internet, seamlessly combining our physical and digital lives. It’s many things: a gaming platform, a virtual retail spot, a training tool, an advertising channel, a digital classroom, a gateway to entirely new virtual experiences. While the metaverse continues to be defined, its potential to unleash the next wave of digital disruption is clear. In the first five months of 2022, more than $120 billion has been invested in building out metaverse technology and infrastructure. That’s more than double the $57 billion invested in all of 2021.

How would you define the metaverse?
Lareina: What’s exciting is that the metaverse, like the internet, is the next platform on which we can work, live, connect, and collaborate. It’s going to be an immersive virtual environment that connects different worlds and communities. There are going to be creators and alternative currencies that you can buy and sell things with. It will have a lot of the components of Web3 and gaming and AR, but it will be much larger.

Also relevant/see:




 

Conduct Your Own Virtual Orchestra In Maestro VR — from vrscout.com by Kyle Melnick

Niantic moves beyond games with Lightship AR platform and a social network — from theverge.com by Alex Heath
The maker of Pokémon Go is releasing its AR map for other apps and a location-based social network called Campfire

Excerpt:

Niantic made a name for itself in the mobile gaming industry through the enduring success of Pokémon Go. Now the company is hoping to become something else: a platform for other developers to build location-aware AR apps on top of.

disguise launches Metaverse Solutions division enabling next-level extended reality experiences — from etnow.com

Excerpt:

UK – disguise, the visual storytelling platform and market leader for extended reality (xR) solutions, has launched its Metaverse Solutions division to enable the next generation of extraordinary live, virtual production and audiovisual location-based experiences for the metaverse.

The recent rise of real-time 3D graphics rendering capabilities in gaming platforms means that today’s audiences are craving richer, more immersive experiences that are delivered via the metaverse. While the metaverse has already been sized as an $8 trillion opportunity by Goldman Sachs, companies are still finding it challenging to navigate the technical elements needed to start building metaverse experiences.

On this item, also see:

disguise.one

disguise launches Metaverse Solutions division — from televisual.com

Excerpt:

“Our xR technology combines key metaverse building blocks including real-time 3D graphics, spatial technologies and advanced display interfaces – all to deliver a one-of-a-kind gateway to the metaverse,” says disguise CXO and head of Metaverse Solutions Alex Wills.

 


Opportunities for Education in the Metaverse — from downes.ca by Stephen Downes

Excerpt:

This short presentation introduces major elements of the metaverse, outlines some applications for education, discusses how it may be combined with other technologies for advanced applications, and outlines some issues and concerns.

Also relevant/see:

What Should Higher Ed in the Metaverse Look like? – from linkedin.com by Joe Schaefer

Excerpt:

The Metaverse is coming whether we like it or not, and it is time for educators to think critically about how it can benefit students. As higher education continues to evolve, I believe every learning product and platform working with or within the Metaverse should, at least, have these functionalities:


Addendum on 5/23/22:


 

From DSC:
I love the parts about seeing instant language translations — including sign language! Very cool indeed!
(With thanks to Ori Inbar out on Twitter for this resource.)

Real-time American Sign Language translation via a potential set of AR glasses from Google

Also see:

 

From DSC:
For the last few years, I’ve been thinking that we need to make learning science-related information more accessible to students, teachers, professors, trainers, and employees — no matter what level they are at.

One idea on how to do this — besides putting posters up in the hallways, libraries, classrooms, conference rooms, cafeterias, etc. — is that we could put a How best to study/learn link in all of the global navigation bars and/or course navigation bars out there in organizations’ course management systems and learning management systems. Learners of all ages could have 24 x 7 x 365, easy, instant access as to how to be more productive as they study and learn about new things.

For example, they could select that link in their CMS/LMS to access information on:

  • Retrieval practice
  • Spacing
  • Interleaving
  • Metacognition
  • Elaboration
  • The Growth Mindset
  • Accessibility-related tools / assistive technologies
  • Links to further resources re: learning science and learning theories

What do you think? If we started this in K12, kept it up in higher ed and vocational programs, and took the idea into the corporate world, valuable information could be relayed and absorbed. This is the kind of information that is highly beneficial these days — as all of us need to be lifelong learners now.
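As one concrete example of what such a “How best to study/learn” page could convey, the “spacing” item above can be made tangible with a toy expanding-interval review scheduler. This is illustrative only: the `next_review` function and its doubling rule are a deliberate simplification, not a validated spaced-repetition algorithm.

```python
from datetime import date, timedelta

def next_review(last_interval_days, recalled):
    """Return the interval (in days) until the next review of an item."""
    if not recalled:
        return 1                             # missed it: review again tomorrow
    return max(2, last_interval_days * 2)    # recalled it: double the spacing

# Simulate a learner who recalls the item on three successive reviews.
interval = 1
today = date(2022, 9, 1)
schedule = []
for outcome in [True, True, True]:
    interval = next_review(interval, outcome)
    today += timedelta(days=interval)
    schedule.append(today.isoformat())
print(schedule)   # reviews spaced 2, then 4, then 8 days apart
```

The point a learner-facing page would make is visible in the output: successful recalls push reviews further apart, while a miss pulls the item back to the next day.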

 
 

The amazing opportunities of AI in the future of the educational metaverse [Darbinyan]

The amazing opportunities of AI in the future of the educational metaverse — from forbes.com by Rem Darbinyan

Excerpt:

Looking ahead, let’s go over several potential AI-backed applications of the metaverse that can empower the education industry in many ways.

Multilingual Learning Opportunities
Language differences may be a real challenge for students from different cultures as they may not be able to understand and keep up with the materials and assignments. Artificial intelligence, VR and AR technologies can enhance multilingual accessibility for learners no matter where they are in the world. Speech-to-text, text-to-speech and machine translation technologies enrich the learning process and create more immersive learning environments.

AI can process multiple languages simultaneously and provide real-time translations, enabling learners to engage with the materials in the language of their choice. With the ability to instantly transcribe speech across multiple languages, artificial intelligence removes any language barriers for students, enabling them to be potentially involved, learn and communicate in any language.
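The pipeline described above — speech recognition feeding machine translation to produce captions in each learner’s language — can be sketched as a simple composition of stages. Everything here is a stand-in: `speech_to_text` and `translate` are hypothetical placeholder functions (a real system would call an actual recognition engine and a translation service); only the shape of the pipeline is the point.

```python
def speech_to_text(audio_chunk):
    # Stand-in recognizer: pretend the engine already produced a transcript.
    return audio_chunk["transcript"]

def translate(text, target_lang):
    # Stand-in translator backed by a tiny phrasebook; a real system would
    # call a machine-translation model or service here.
    phrasebook = {("Hello class", "es"): "Hola clase"}
    return phrasebook.get((text, target_lang), text)

def caption_stream(audio_chunks, target_lang):
    """Yield a translated caption for each incoming chunk of audio."""
    for chunk in audio_chunks:
        yield translate(speech_to_text(chunk), target_lang)

chunks = [{"transcript": "Hello class"}]
print(list(caption_stream(chunks, "es")))   # ['Hola clase']
```

Because each stage only consumes the previous stage’s output, the same structure works whether the audio comes from a classroom microphone, a webinar, or a metaverse session — you swap the stand-ins for real services without changing the pipeline.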

Artificial Intelligence (AI) in Education Market size exceeded USD 1 billion in 2020 and is expected to grow at a CAGR of over 40% between 2021 and 2027. (source)

Along the lines of innovation within our educational learning ecosystems, see:

3 Questions for Coursera’s Betty Vandenbosch & U-M’s Lauren Atkins Budde on XR — from insidehighered.com by Joshua Kim
How might extended reality shape the future of learning?

Excerpts (emphasis DSC):

[Lauren Atkins Budde] “Being able to embed quality, effective extended reality experiences into online courses is exponentially a game-changer. One of the persistent constraints of online learning, especially at scale, is how do learners get hands-on practice? How do they experience specific contexts and situations? How do they learn things that are best experienced? XR provides that opportunity for actively doing different kinds of tasks, in various environments, in ways that would otherwise not be possible. It will open up both how we teach online and also what we teach online.”

These courses are really exciting and cover a broad range of disciplines, which is particularly important. To choose the right subjects, we did an extensive review of insights from industry partners, learners and market research on in-demand and emerging future-of-work skills and then paired that with content opportunities where immersive learning is really a value-add and creates what our learning experience designers call “embodied learning.”

Addendum on 5/1/22:
Can the Metaverse Improve Learning? New Research Finds Some Promise — from edsurge.com by Jeffrey R. Young

“The findings support a deeper understanding of how creating unique educational experiences that feel real (i.e., create a high level of presence) through immersive technology can influence learning through different affective and cognitive processes including enjoyment and interest,” Mayer and his colleagues write.

 

We need to use more tools — that go beyond screen sharing — where we can collaborate regardless of where we’re at. [Christian]

From DSC:
Seeing the functionality in Freehand makes me once again think that we need to use more tools where faculty, staff, and students can collaborate with each other REGARDLESS of where they’re coming in to partake in a learning experience (i.e., remotely or physically/locally). This is also true for trainers and employees, teachers and students, as well as in virtual tutoring types of situations. We need tools that offer functionalities which go beyond screen sharing in order to collaborate, design, present, discuss, and create things.

 
© 2022 | Daniel Christian