Opportunities for Education in the Metaverse — from downes.ca by Stephen Downes

Excerpt:

This short presentation introduces major elements of the metaverse, outlines some applications for education, discusses how it may be combined with other technologies for advanced applications, and outlines some issues and concerns.

Also relevant/see:

What Should Higher Ed in the Metaverse Look Like? — from linkedin.com by Joe Schaefer

Excerpt:

The Metaverse is coming whether we like it or not, and it is time for educators to think critically about how it can benefit students. As higher education continues to evolve, I believe every learning product and platform working with or within the Metaverse should, at least, have these functionalities:


Addendum on 5/23/22:


 

Google Maps to add “immersive view” — from futuretimeline.net
Google Maps, the world’s most-downloaded travel app, will soon become more immersive and intuitive thanks to a major upgrade.

 

Excerpts:

A new “immersive view” will generate far more detailed graphics than are available currently. This will provide sweeping views of cities in full 3D, complete with simulated cars, real-time weather, realistic day/night cycles, shadow effects, water reflections, and even animations like birds flying through the sky.

The company also announced an update for Live View. First launched in 2019, this provides augmented reality (AR) walking routes in the form of arrows, directions, and distance markers.

These improvements are possible thanks to advances in computer vision and AI that allow billions of aerial, street view, and other images to combine and create a rich, seamless, digital model of the world.


Addendums on 5/23/22:

Google Wants To Deliver World-Scale AR Using Google Maps — from vrscout.com by Bobby Carlton

Somewhat relevant/see:

Earth // Around The World From The Air 4K from Kien Lam on Vimeo.

 
 

Ransomware is already out of control. AI-powered ransomware could be ‘terrifying.’ — from protocol.com by Kyle Alspach
Hiring AI experts to automate ransomware could be the next step for well-endowed ransomware groups that are seeking to scale up their attacks.

Excerpt:

In the perpetual battle between cybercriminals and defenders, the latter have always had one largely unchallenged advantage: The use of AI and machine learning allows them to automate a lot of what they do, especially around detecting and responding to attacks. This leg-up hasn’t been nearly enough to keep ransomware at bay, but it has still been far more than what cybercriminals have ever been able to muster in terms of AI and automation.

That’s because deploying AI-powered ransomware would require AI expertise. And the ransomware gangs don’t have it. At least not yet.

But given the wealth accumulated by a number of ransomware gangs in recent years, it may not be long before attackers do bring aboard AI experts of their own, prominent cybersecurity authority Mikko Hyppönen said.

Also re: AI, see:

Nuance partners with The Academy to launch The AI Collaborative — from artificialintelligence-news.com by Ryan Daws

Excerpt:

Nuance has partnered with The Health Management Academy (The Academy) to launch The AI Collaborative, an industry group focused on advancing healthcare using artificial intelligence and machine learning.

Nuance became a household name for creating the speech recognition engine behind Siri. In recent years, the company has put a strong focus on AI solutions for healthcare and is now a full-service partner of 77 percent of US hospitals and is trusted by over 500,000 physicians daily.

Inflection AI, led by LinkedIn and DeepMind co-founders, raises $225M to transform computer-human interactions — from techcrunch.com by Kyle Wiggers

Excerpts:

Inflection AI, the machine learning startup headed by LinkedIn co-founder Reid Hoffman and founding DeepMind member Mustafa Suleyman, has secured $225 million in equity financing, according to a filing with the U.S. Securities and Exchange Commission.

“[Programming languages, mice, and other interfaces] are ways we simplify our ideas and reduce their complexity and in some ways their creativity and their uniqueness in order to get a machine to do something,” Suleyman told the publication. “It feels like we’re on the cusp of being able to generate language to pretty much human-level performance. It opens up a whole new suite of things that we can do in the product space.”

 

Inside Microsoft’s new Inclusive Tech Lab — from engadget.com by C. Low; with thanks to Nick Floro on Twitter for some of these resources
“An embassy for people with disabilities.”

Increasing our Focus on Inclusive Technology — from mblogs.microsoft.com by Dave Dame

Excerpt:

In recent years, tied to Microsoft’s mission of empowering every person and organization on the planet to achieve more, teams from across Microsoft have launched several products and features to make technology more inclusive and accessible. [On May 10, 2022], as part of the 12th annual Microsoft Ability Summit, we celebrate a new and expanded Inclusive Tech Lab, powerful new software features, and are unveiling Microsoft adaptive accessories designed to give people with disabilities greater access to technology.

Microsoft’s Latest Hardware Is More Accessible and Customizable — from wired.com by Brenda Stolyar
The wireless system—a mouse, a button, and a hub—is designed to increase productivity for those with limited mobility.

Excerpt:

Microsoft is expanding its lineup of accessibility hardware. During its annual Ability Summit—an event dedicated to disability inclusion and accessibility—the company showed attendees some new PC hardware it has developed for users with limited mobility. Available later this year, the wireless system will consist of an adaptive mouse, a programmable button, and a hub to handle the connection to a Windows PC. Users set up the devices to trigger various keystrokes, shortcuts, and sequences. These new input devices can be used with existing accessories, and they can be further customized with 3D-printed add-ons. There are no price details yet.

Along these lines, also see:

  • 14 Equity Considerations for Ed Tech — from campustechnology.com by Reed Dickson
    Is the education technology in your online course equitable and inclusive of all learners? Here are key equity questions to ask when considering the pedagogical experience of an e-learning tool.
 
 

The Future Is Here: Assistive Technology for Learning Disabilities — from studycorgi.com; with thanks to Alysson Webb for this resource

Excerpt:

Equal learning and personal development opportunities help ensure everyone reaches their highest potential. However, it is important to look at comparable needs. People with learning disabilities can require individual or additional services from a school program. According to the National Center for Education Statistics, in 2019–2020, 14% (7.3 million) of students ages 3 to 21 received special education services in the US. One-third of them had various learning disabilities that required specific assistance and tools.

Table of Contents

  1. What Is a Learning Disability?
  2. Assistive Technology (AT) in the Classroom
  3. AT for Learning Disabilities: Benefits & Tools
  4. References
 

The amazing opportunities of AI in the future of the educational metaverse [Darbinyan]

The amazing opportunities of AI in the future of the educational metaverse — from forbes.com by Rem Darbinyan

Excerpt:

Looking ahead, let’s go over several potential AI-backed applications of the metaverse that can empower the education industry in many ways.

Multilingual Learning Opportunities
Language differences may be a real challenge for students from different cultures as they may not be able to understand and keep up with the materials and assignments. Artificial intelligence, VR and AR technologies can enhance multilingual accessibility for learners no matter where they are in the world. Speech-to-text, text-to-speech and machine translation technologies enrich the learning process and create more immersive learning environments.

AI can process multiple languages simultaneously and provide real-time translations, enabling learners to engage with the materials in the language of their choice. With the ability to instantly transcribe speech across multiple languages, artificial intelligence removes any language barriers for students, enabling them to be potentially involved, learn and communicate in any language.
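To make the pipeline described above concrete, here is a minimal sketch of a transcribe-then-translate flow. The `transcribe()` and `translate()` functions are hypothetical stand-ins built on toy lookup tables — not any vendor's actual speech or translation API:

```python
# Minimal sketch of the transcribe -> translate pipeline described above.
# transcribe() and translate() are hypothetical stand-ins; in practice they
# would wrap real speech-to-text and machine-translation services rather
# than the toy lookup table used here.

TOY_TRANSLATIONS = {
    ("es", "en"): {"hola": "hello", "clase": "class"},
}

def transcribe(audio_chunk: bytes, language: str) -> str:
    """Hypothetical speech-to-text stand-in: pretend the audio decodes to text."""
    return audio_chunk.decode("utf-8")

def translate(text: str, source: str, target: str) -> str:
    """Toy word-by-word translation using the lookup table above."""
    table = TOY_TRANSLATIONS.get((source, target), {})
    return " ".join(table.get(word, word) for word in text.lower().split())

def live_caption(audio_chunk: bytes, source: str, target: str) -> str:
    """Transcribe a chunk of speech, then translate it for the learner."""
    return translate(transcribe(audio_chunk, source), source, target)
```

For example, `live_caption(b"hola clase", "es", "en")` returns `"hello class"` — the same transcribe-then-translate chain, however implemented, is what would sit behind real-time multilingual captions.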

Artificial Intelligence (AI) in Education Market size exceeded USD 1 billion in 2020 and is expected to grow at a CAGR of over 40% between 2021 and 2027. (source)

Along the lines of innovation within our educational learning ecosystems, see:

3 Questions for Coursera’s Betty Vandenbosch & U-M’s Lauren Atkins Budde on XR — from insidehighered.com by Joshua Kim
How might extended reality shape the future of learning?

Excerpts (emphasis DSC):

[Lauren Atkins Budde] “Being able to embed quality, effective extended reality experiences into online courses is exponentially a game-changer. One of the persistent constraints of online learning, especially at scale, is how do learners get hands-on practice? How do they experience specific contexts and situations? How do they learn things that are best experienced? XR provides that opportunity for actively doing different kinds of tasks, in various environments, in ways that would otherwise not be possible. It will open up both how we teach online and also what we teach online.”

These courses are really exciting and cover a broad range of disciplines, which is particularly important. To choose the right subjects, we did an extensive review of insights from industry partners, learners and market research on in-demand and emerging future-of-work skills and then paired that with content opportunities where immersive learning is really a value-add and creates what our learning experience designers call “embodied learning.”

Addendum on 5/1/22:
Can the Metaverse Improve Learning? New Research Finds Some Promise — from edsurge.com by Jeffrey R. Young

“The findings support a deeper understanding of how creating unique educational experiences that feel real (i.e., create a high level of presence) through immersive technology can influence learning through different affective and cognitive processes including enjoyment and interest,” Mayer and his colleagues write.

 

What Educators Need to Know About Assistive Tech Tools: Q&A with Texthelp CEO — from thejournal.com by Kristal Kuykendall and Texthelp CEO Martin McKay

Excerpts (emphasis DSC):

THE Journal: What are some examples of the types of assistive technology tools now available for K–12 schools?
McKay: There is a broad range of disabilities and, accordingly, a broad range of learning and access difficulties that assistive technology can help with. Just considering students with dyslexia — since that is the largest group among students who can benefit from assistive tech tools — the main problems they have are around reading comprehension and writing. Assistive technology can provide text-to-speech, talking dictionaries, picture dictionaries, and text simplification tools to help with comprehension.

It’s important that these tools work everywhere — not just in the word processor. Assistive technology must work in the learning management system and in the online assessment environment, so that students can use the assistive tech tools not only in class, but at home as they work on their homework — and, perhaps most importantly, on test day when they are using a secure assessment environment.

 

We need to use more tools — that go beyond screen sharing — where we can collaborate regardless of where we’re at. [Christian]

From DSC:
Seeing the functionality in Freehand — it makes me once again think that we need to use more tools where faculty/staff/students can collaborate with each other REGARDLESS of where they’re coming in to partake in a learning experience (i.e., remotely or physically/locally). This is also true for trainers and employees, teachers and students, as well as in virtual tutoring types of situations. We need tools that offer functionalities that go beyond screen sharing in order to collaborate, design, present, discuss, and create things.  (more…)

 

Zoom Announces New Education Features, Enhancing Hybrid [Hyflex] Learning Experience for Educators & Students — from edtechreview.in by Stephen Soulunii

Excerpt:

According to a news release, the features span Zoom’s Chat and Meeting offerings and are designed to support teachers who need to engage and manage students joining class remotely or submitting homework assignments.

Breakout Rooms Enhancements
Breakout rooms, a popular education feature, also received enhancements in this latest release. Program Audio allows meeting hosts to share content with audio to breakout rooms, adding the ability to share videos with audio. With the LTI Pro integration enhancement, educators can populate breakout rooms from the course roster. This can be used to assign breakout rooms in advance, and then automatically sort students into breakout rooms.
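The roster-to-rooms sorting described above can be sketched as a simple round-robin assignment. This is an illustrative algorithm only, not Zoom's actual implementation:

```python
# Sketch of auto-assigning a course roster to breakout rooms, similar in
# spirit to the LTI Pro roster integration described above. Illustrative
# only -- not Zoom's actual algorithm.

def assign_breakout_rooms(roster: list[str], num_rooms: int) -> dict[int, list[str]]:
    """Deal students round-robin into num_rooms rooms so sizes differ by at most one."""
    rooms: dict[int, list[str]] = {i: [] for i in range(num_rooms)}
    for index, student in enumerate(roster):
        rooms[index % num_rooms].append(student)
    return rooms
```

A five-student roster split across two rooms lands as three students in one room and two in the other, which is the "automatically sort students" behavior an instructor would want.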

Anywhere Polls
Anywhere Polls will allow polling content to live in a central repository that can be accessed from any meeting on an account, instead of being associated with a particular meeting. This will make it easier for instructors to reuse polls and will also be beneficial for grading. This feature will be available this year.

 

From Instructional Design to Learning Experience Design: Understanding the Whole Student — a podcast out at campustechnology.com by Rhea Kelly, Mark Milliron, and Kim Round

Excerpt:

These days, we hear a lot about the “new normal” in higher education. Remote and hybrid learning is here to stay, offering students more flexibility in their learning journeys. But what if the new normal is not enough? It’s time to go beyond the new normal and consider the “new possible” — how to put together the best of face-to-face, online and hybrid to create powerful learning experiences based on a deep understanding of the whole student. We spoke to Mark Milliron, senior vice president of Western Governors University and executive dean of the Teachers College, and Kim Round, academic programs director and associate dean of the Teachers College, about their vision for reimagining education and why learning experience design is essential to student success.

Another interesting podcast:

 

Now we just need a “Likewise TV” for learning-related resources! [Christian]

Likewise TV Brings Curation to Streaming — from lifewire.com by Cesar Aroldo-Cadenas
And it’s available on iOS, Android, and some smart TVs

All your streaming services in one place. One search. One watchlist. Socially powered recommendations.

Entertainment startup Likewise has launched a new recommendations hub that pulls from all the different streaming platforms to give you personalized picks.

Likewise TV is a streaming hub powered by machine learning, people from the Likewise community, and other streaming services. The service aims to do away with mindlessly scrolling through a menu, looking for something to watch, or jumping from one app to another by providing a single location for recommendations.

Note that Likewise TV is purely an aggregator.



 


From DSC:
Now we need this type of AI-based recommendation engine, aggregator, and service for learning-related resources!

I realize that we have a long way to go here — as a friend/former colleague of mine just reminded me that these recommendation engines often miss the mark. I’m just hoping that a recommendation engine like this could ingest our cloud-based learner profiles and our current goals and then present some promising learning-related possibilities for us. Especially if the following graphic is or will be the case in the future:
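As a rough illustration of how such an engine might score learning resources against a learner profile, here is a minimal tag-overlap sketch. The profiles, resources, and tags are all invented for illustration; a real system would use far richer signals:

```python
# Minimal sketch of the kind of recommendation engine imagined above:
# rank learning resources by how well their tags overlap a learner profile.
# All names and tags here are invented for illustration.

def recommend(profile_tags: set[str],
              resources: dict[str, set[str]],
              top_n: int = 3) -> list[str]:
    """Rank resources by Jaccard similarity between their tags and the profile."""
    def jaccard(tags: set[str]) -> float:
        if not tags and not profile_tags:
            return 0.0
        return len(tags & profile_tags) / len(tags | profile_tags)
    ranked = sorted(resources, key=lambda name: jaccard(resources[name]), reverse=True)
    return ranked[:top_n]
```

Even this toy version shows the core loop — compare what a learner cares about against what each resource covers, then surface the closest matches.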


Learning from the living class room


Also relevant/see:

From DSC:
Some interesting/noteworthy features:

  • “The 32-inch display has Wi-Fi capabilities to support multiple streaming services, can stream smartphone content, and comes with a removable SlimFit Cam.”
  • The M8 has Wi-Fi connectivity for its native streaming apps so you won’t have to connect to a computer to watch something on Netflix. And its Far Field Voice mic can be used w/ the Always On feature to control devices like Amazon Alexa with your voice, even if the monitor is off.
  • “You can also connect devices to the monitor via the SmartThings Hub, which can be tracked with the official SmartThings app.”

I wonder how what we call the TV (or television) will continue to morph in the future.


Addendum on 3/31/22 from DSC:
Perhaps people will co-create their learning playlists…as is now possible with Spotify’s “Blend” feature:

Today’s Blend update allows you to share your personal Spotify playlists with your entire group chat—up to 10 users. You can manually invite these friends and family members to join you from in the app, then Spotify will create a playlist for you all to listen to using a mixture of everyone’s music preferences. Spotify will also create a special share card that everyone in the group can use to save and share the created playlist in the future.
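The group-playlist idea above can be sketched as a round-robin merge of each participant's ranked list. Spotify's actual Blend algorithm is not public, so this is only an illustration of the mixing idea:

```python
# Sketch of "blending" learning playlists the way Spotify Blend mixes music:
# interleave each participant's ranked list, skipping duplicates.
# The 10-user cap mirrors the description above; the real Blend algorithm
# is not public, so this is only an illustration.

from itertools import zip_longest

def blend(playlists: list[list[str]], max_users: int = 10) -> list[str]:
    """Round-robin merge of up to max_users ranked playlists; first occurrence wins."""
    merged: list[str] = []
    seen: set[str] = set()
    for round_items in zip_longest(*playlists[:max_users]):
        for item in round_items:
            if item is not None and item not in seen:
                seen.add(item)
                merged.append(item)
    return merged
```

A co-created learning playlist could work the same way: each learner's top picks take turns, so everyone's preferences show up early in the shared list.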


 

Some announcements from NVIDIA today:


Nvidia unveils server CPU to challenge Intel and AMD in the data center — from protocol.com by Max A. Cherney
Nvidia’s CEO Jensen Huang announced the company’s next-generation GPU architecture and a new CPU Tuesday.

The new Grace processor is designed for AI, high-performance computing and hyperscale data center applications.

Nvidia launched a mapping product for the autonomous vehicle industry — from techcrunch.com by Rebecca Bellan

Nvidia has launched a new mapping platform that will provide the autonomous vehicle industry with ground truth mapping coverage of over 300,000 miles of roadway in North America, Europe and Asia by 2024, founder and CEO Jensen Huang said at the company’s GTC event on Tuesday.

NVIDIA Launches AI Computing Platform for Medical Devices — from hitconsultant.net by Jasmine Pennic

Nvidia Unveils AI Chips and Software, Plus Tools for Creating Virtual Worlds — by Eric Savitz and Conor Smith

Excerpt:

Nvidia is doubling down on artificial-intelligence technology that CEO Jensen Huang predicts will revolutionize every industry.

In the keynote speech for the annual Nvidia GTC conference—the acronym once stood for GPU Technology Conference, a reference to the company’s roots in graphics processing chips—Huang focused specifically on expanding the company’s portfolio of AI-focused chips and software applications.

Nvidia CEO lays out plans after Arm deal fell through, reveals new Hopper GPU — from marketwatch.com by Jeremy C. Owens
Roadmap seems little changed after chip maker ditched $40 billion acquisition of designer Arm, with new GPUs and first server CPU still on track as EV makers sign on for autonomous-driving tech


Addendum 3/26/22:

Nvidia’s Clara Holoscan MGX means to bring high-powered AI to the doctor’s office — from techcrunch.com


Addendum 3/28/22:

Nvidia’s $1 trillion ambitions draw cheers as software becomes a bigger piece of the pie — from marketwatch.com


 
© 2022 | Daniel Christian