Instructional Audio: 4 Benefits to Improving It — from techlearning.com by Erik Ofgang
Ensuring every classroom has instructional audio capabilities helps all students hear what the teacher is saying.

Excerpt (emphasis DSC):

Sound is a key component of education. If students can’t hear their instructor well, they’re clearly not going to focus or learn as much. That’s why more and more schools are investing in instructional audio systems, which are high-tech public address systems designed with classrooms, teachers, and all students in mind.

Terri Meier is director of education technology for Rio Rancho Public Schools in New Mexico, where all new classrooms are being built with voice amplification systems in place and many existing classrooms are being retrofitted with similar systems. These systems are key for schools in their accessibility efforts and in providing quality instruction overall, she says.

And speaking of accessibility-related postings/items, also see:

 

Apple just quietly gave us the golden key to unlock the Metaverse — from medium.com by Klas Holmlund; with thanks to Ori Inbar out on Twitter for this resource

Excerpt:

But the ‘Oh wow’ moment came when I pointed the app at a window. Or a door. Because with a short pause, a correctly placed 3D model of the window snapped in place. Same with a door. But the door could be opened or closed. RoomPlan did not care. It understands a door. It understands a chair. It understands a cabinet. And when it sees any of these things, it places a model of them, with the same dimensions, in the model.

Oh, the places you will go!
OK, so what will this mean for Metaverse building? Why is this a big deal? Well, to someone who is not a 3D modeler, it is hard to overstate how much work has to go into generating useable geometry. The key word here being useable. To be able to move around and exist in a VR space, the geometry has to be optimized. You’re not going to have a fun party if your dinner guests fall through a hole in reality. This technology will let you create a fully digital twin of any space you are in, in the time it takes you to look around.

In a future Apple VR or AR headset, this technology will obviously be built in. You will build a VR-capable digital twin of any space you are in just by wearing the headset. All of this is optimized.

Also with thanks to Ori Inbar:


Somewhat relevant/see:

“The COVID-19 pandemic spurred us to think creatively about how we can train the next generation of electrical construction workers in a scalable and cost-effective way,” said Beau Pollock, president and CEO of TRIO Electric. “Finding electrical instructors is difficult and time-consuming, and training requires us to use the same materials that technicians use on the job. The virtual simulations not only offer learners real-world experience and hands-on practice before they go into the field, they also help us to conserve resources in the process.”


 

Top 5 Developments in Web 3.0 We Will See in the Next Five Years — from intelligenthq.com

Excerpt:

Today, websites have become highly engaging, and the internet is full of exciting experiences. Yet, web 3.0 is coming with noteworthy trends and things to look out for.

Here are the top 5 developments in web 3.0 expected in the coming five years.

 

European telco giants collaborate on 5G-powered holographic videocalls — from inavateonthenet.net

Excerpt:

Some of Europe’s biggest telecoms operators have joined forces for a pilot project that aims to make holographic calls as simple and straightforward as a phone call.

Deutsche Telekom, Orange, Telefónica and Vodafone are working with holographic presence company Matsuko to develop an easy-to-use platform for immersive 3D experiences that could transform communications and the virtual events market.

Advances in connectivity, thanks to 5G and edge computing technology, allow smooth and natural movement of holograms and make the possibility of easy-to-access holographic calls a reality.

Top XR Vendors Majoring in Education for 2022 — from xrtoday.com

Excerpt:

Few things are more important than delivering the right education to individuals around the globe. Whether enlightening a new generation of young students, or empowering professionals in a complex business environment, learning is the key to building a better future.

In recent years, we’ve discovered just how powerful technology can be in delivering information to those who need it most. The cloud has paved the way for a new era of collaborative remote learning, while AI tools and automated systems are assisting educators in their tasks. XR has the potential to be one of the most disruptive new technologies in the educational space.

With Extended Reality technology, training professionals can deliver incredible experiences to students all over the globe, without the risks or resource requirements of traditional education. Today, we’re looking at just some of the major vendors leading the way to a future of immersive learning.

 

Course Awareness in HyFlex: Managing unequal participation numbers — from hyflexlearning.org by Candice Freeman

Excerpt:

How do you teach a HyFlex course when the number of students in various participation modes is very unequal? How do you teach one student in a mode – often in the classroom? Conversely, how do you teach 50 asynchronous students with very few in the synchronous mode(s)? Answers will vary greatly from teacher to teacher. This article suggests a strategy called Course Awareness, a mindfulness technique designed to help teachers envision each learner as being in the instructor’s presence and engaged in the instruction regardless of participation (or attendance) mode choice.

Teaching HyFlex in an active learning classroom

From DSC:
I had understood the hyflex teaching model as addressing both online-based (i.e., virtual/not on-site) and on-site/physically-present students at the same time — and that each student could choose the manner in which they wanted to attend that day’s class. For example, on one day, a student could take the course in room 123 of Anderson Hall. The next time the class meets, that same student could attend from their dorm room.

But this article introduces — at least to me — the idea that we have a third method of participating in the hyflex model — asynchronously (i.e., not at the same time). So rather than making their way to Anderson Hall or attending from their dorm, that same student does not attend at the same time as other students (either virtually or physically). That student will likely check in with a variety of tools to catch up with — and contribute to — the discussions. As the article mentions:

Strategically, you need to employ web-based resources designed to gather real-time information over a specified period of time, capturing all students and not just students engaging live. For example, Mentimeter, PollEverywhere, and Sli.do allow the instructor to pose engaging, interactive questions without limiting engagement time to the instance the question is initially posed. These tools are designed to support both synchronous and asynchronous participation. 

So it will be interesting to see how our learning ecosystems morph in this area. Will there be other new affordances, pedagogies, and tools that take into consideration that the faculty members are addressing synchronous and asynchronous students as well as online and physically present students? Hmmm…more experimentation is needed here, as well as more:

  • Research
  • Instructional design
  • Design thinking
  • Feedback from students and faculty members

Will this type of model work best in the higher education learning ecosystem but not the K-12 learning ecosystem? Will it thrive with employees within the corporate realm? Hmmm…again, time will tell.


And to add another layer to the teaching and learning onion, now let’s talk about multimodal learning. This article, How to support multimodal learning by Monica Burns, mentions that:

Multimodal learning is a teaching concept where using different senses simultaneously helps students interact with content at a deeper level. In the same way we learn through multiple types of media in our own journey as lifelong learners, students can benefit from this type of learning experience.

The only comment I have here is that if you think that throwing a warm body into a K12 classroom fixes the problem of teachers leaving the field, you haven’t a clue how complex this teaching and learning onion is. Good luck to all of those people who are being thrown into the deep end — and essentially being told to sink or swim.

 

To Improve Outcomes for Students, We Must Improve Support for Faculty — from campustechnology.com by Dr. David Wiley
The doctoral programs that prepare faculty for their positions often fail to train them on effective teaching practices. We owe it to our students to provide faculty with the professional development they need to help learners realize their full potential.

Excerpts:

Why do we allow so much student potential to go unrealized? Why are well-researched, highly effective teaching practices not used more widely?

The doctoral programs that are supposed to prepare them to become faculty in physics, philosophy, and other disciplines don’t require them to take a single course in effective teaching practices. 

The entire faculty preparation enterprise seems to be caught in a loop, unintentionally but consistently passing on an unawareness that some teaching practices are significantly more effective than others. How do we break this cycle and help students realize their full potential as learners?

From DSC:
First of all, I greatly appreciate the work of Dr. David Wiley. His career has been dedicated to teaching and learning, open educational resources, and more. I also appreciate and agree with what David is saying here — i.e., that professors need to be taught how to teach as well as what we know about how people learn at this point in time. 

For years now, I’ve been (unpleasantly) amazed that we hire and pay our professors primarily for their research capabilities — vs. their teaching competence. At the same time, we continually increase the cost of tuition, books, and other fees. Students have the right to let their feet do the walking. As the alternatives to traditional institutions of higher education increase, I’m quite sure that we’ll see that happen more and more.

While I think that training faculty members about effective teaching practices is highly beneficial, I also think that TEAM-BASED content creation and delivery will deliver the best learning experiences that we can provide. I say this because multiple disciplines and specialists are involved, such as:

  • Subject Matter Experts (i.e., faculty members)
  • Instructional Designers
  • Graphic Designers
  • Web Designers
  • Learning Scientists; Cognitive Learning Researchers
  • Audio/Video Specialists and Learning Space Designers/Architects
  • CMS/LMS Administrators
  • Programmers
  • Multimedia Artists who are skilled in working with digital audio and digital video
  • Accessibility Specialists
  • Librarians
  • Illustrators and Animators
  • and more

The point here is that one person can’t do it all — especially now that the expectation is that courses should be offered in a hybrid format or in an online-based format. For a solid example of the power of team-based content creation/delivery, see this posting.

One last thought/question here though. Once a professor is teaching, are they open to working with and learning from the Instructional Designers, Learning Scientists, and/or others from the Teaching & Learning Centers that do exist on their campus? Or do they, like many faculty members, think that such people are irrelevant because they aren’t faculty members themselves? Oftentimes, faculty members look to each other and don’t really care what support is offered (unless they need help with some of the technology).


Also relevant/see:


 

What if smart TVs’ new killer app was a next-generation learning-related platform? [Christian]

TV makers are looking beyond streaming to stay relevant — from protocol.com by Janko Roettgers and Nick Statt

A smart TV's main menu listing what's available -- application wise

Excerpts:

The search for TV’s next killer app
TV makers have some reason to celebrate these days: Streaming has officially surpassed cable and broadcast as the most popular form of TV consumption; smart TVs are increasingly replacing external streaming devices; and the makers of these TVs have largely figured out how to turn those one-time purchases into recurring revenue streams, thanks to ad-supported services.

What TV makers need is a new killer app. Consumer electronics companies have for some time toyed with the idea of using TV for all kinds of additional purposes, including gaming, smart home functionality and fitness. Ad-supported video took priority over those use cases over the past few years, but now, TV brands need new ways to differentiate their devices.

Turning the TV into the most useful screen in the house holds a lot of promise for the industry. To truly embrace this trend, TV makers might have to take some bold bets and be willing to push the envelope on what’s possible in the living room.

 


From DSC:
What if smart TVs’ new killer app was a next-generation learning-related platform? Could smart TVs deliver more blended/hybrid learning? Hyflex-based learning?

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV


Or what if smart TVs had to do with delivering telehealth-based apps? Or telelegal/virtual courts-based apps?


 

From DSC:
I wanted to pass this along to the learning space designers out there in case it’s helpful to them.

“The sound absorbing Sola felt is made from 50% post-consumer recycled PET.”

Source


Acoustic Static Links lighting by LightArt — from dezeen.com

 

Dive Into AI, Avatars and the Metaverse With NVIDIA at SIGGRAPH — from blogs.nvidia.com

Excerpt:

Innovative technologies in AI, virtual worlds and digital humans are shaping the future of design and content creation across every industry. Experience the latest advances from NVIDIA in all these areas at SIGGRAPH, the world’s largest gathering of computer graphics experts, [which ran from Aug. 8-11].

At SIGGRAPH, NVIDIA CEO Jensen Huang Illuminates Three Forces Sparking Graphics Revolution — from blogs.nvidia.com by Rick Merritt
NVIDIA unveils new products and research to transform industries with AI, the metaverse and digital humans.

NVIDIA AI Makes Performance Capture Possible With Any Camera — from blogs.nvidia.com by Isha Salian
Derivative, Notch, Pixotope and others use NVIDIA Vid2Vid Cameo and 3D body-pose estimation tools to drive performances in real time.

How to Start a Career in AI — from blogs.nvidia.com by Brian Caulfield
Four most important steps to starting a career in AI, seven big questions answered.

As Far as the AI Can See: ILM Uses Omniverse DeepSearch to Create the Perfect Sky — from blogs.nvidia.com by Richard Kerris
Omniverse AI-enabled search tool lets legendary studio sift through massive database of 3D scenes.

Future of Creativity on Display ‘In the NVIDIA Studio’ During SIGGRAPH Special Address — from blogs.nvidia.com by Gerardo Degaldo
Major NVIDIA Omniverse updates power 3D virtual worlds, digital twins and avatars, reliably boosted by August NVIDIA Studio Driver; #MadeInMachinima contest winner revealed.

What Is Direct and Indirect Lighting? — from blogs.nvidia.com by JJ Kim
In computer graphics, the right balance between direct and indirect lighting elevates the photorealism of a scene.

NVIDIA Studio Laptops Offer Students AI, Creative Capabilities That Are Best in… Class — from blogs.nvidia.com by Gerardo Degaldo
Designed for creativity and speed, Studio laptops are the ultimate creative tool for aspiring 3D artists, video editors, designers and photographers.

Design in the Age of Digital Twins: A Conversation With Graphics Pioneer Donald Greenberg — from blogs.nvidia.com by Rick Merritt
From his Cornell office, home to a career of 54 years and counting, he shares with SIGGRAPH attendees his latest works in progress.

 

Augmented Books Are On The Way According To Researchers — from vrscout.com by Kyle Melnick

Excerpt:

Imagine this. You’re several chapters into a captivating novel when a character from an earlier book makes a surprise appearance. You swipe your finger across their name on the page, at which point their entire backstory is displayed on a nearby smartphone, allowing you to refresh your memory before moving forward.

This may sound like science fiction, but researchers at the University of Surrey in England say that the technology described above is already here in the form of “a-books” (augmented reality books).

The potential use-cases for such a technology are virtually endless. As previously mentioned, a-books could be used to deliver character details and plot points for a variety of fictional works. The same technology could also be applied to textbooks, allowing students to display helpful information on their smartphones, tablets, and smart TVs with the swipe of a finger.

From DSC:

  • How might instructional designers use this capability?
  • How about those in theatre/drama?
  • Educational gaming?
  • Digital storytelling?
  • Interaction design?
  • Interface design?
  • User experience design?

Also see:


 
 

The Metaverse Is Not a Place — from oreilly.com by Tim O’Reilly
It’s a communications medium.

Excerpt:

Foundations of the metaverse
You can continue this exercise by thinking about the metaverse as the combination of multiple technology trend vectors progressing at different speeds and coming from different directions, and pushing the overall vector forward (or backward) accordingly. No new technology is the product of a single vector.

So rather than settling on just “the metaverse is a communications medium,” think about the various technology vectors besides real-time communications that are coming together in the current moment. What news from the future might we be looking for?

  • Virtual Reality/Augmented Reality
  • Social media
  • Gaming
  • AI
  • Cryptocurrencies and “Web3”
  • Identity

#metaverse #AI #communications #gaming #socialmedia #cryptocurrencies #Web3 #identity #bots #XR #VR #emergingtechnologies

 

‘It’s a mess and I’ve never seen anything like it’: global lost luggage crisis mounts — from theguardian.com by Mattha Busby

From DSC:
After seeing the above article, I wondered:

  • How might the future #AR glasses help with this situation?
  • Could such devices help the relevant employees automatically scan each bar code to determine where to route (or not route) the luggage? (As I’m sure is already happening in many situations away from the tarmac.)

Will AR glasses be used for such training-related applications in the future? For such logistical applications? Hmmm…time will tell.

 

The Metaverse Will Reshape Our Lives. Let’s Make Sure It’s for the Better. — from time.com by Matthew Ball

Excerpts (emphasis DSC):

The metaverse, a 30-year-old term but nearly century-old idea, is forming around us. Every few decades, a platform shift occurs—such as that from mainframes to PCs and the internet, or the subsequent evolution to mobile and cloud computing. Once a new era has taken shape, it’s incredibly difficult to alter who leads it and how. But between eras, those very things usually do change. If we hope to build a better future, then we must be as aggressive about shaping it as are those who are investing to build it.

The next evolution to this trend seems likely to be a persistent and “living” virtual world that is not a window into our life (such as Instagram) nor a place where we communicate it (such as Gmail) but one in which we also exist—and in 3D (hence the focus on immersive VR headsets and avatars).

 


Ways that artificial intelligence is revolutionizing education — from thetechedvocate.org by Matthew Lynch

Excerpt:

I was speaking with an aging schoolteacher who believes that AI is destroying education. They challenged me to come up with 26 ways that artificial intelligence (AI) is improving education, and I came up with them. They’re right here.


AI Startup Speeds Healthcare Innovations To Save Lives — by Geri Stengel

Excerpt:

This project was a light-bulb moment for her. The financial industry had Bloomberg to analyze content and data to help investors uncover opportunities and minimize risk, and pharmaceutical, biotech, and medical device companies needed something similar.



 

Matthew Ball on the metaverse: We’ve never seen a shift this enormous — from protocol.com by Janko Roettgers
The leading metaverse theorist shares his thoughts on the sudden rise of the concept, its utility for the enterprise and what we still get wrong about the metaverse.

Excerpts:

What are the biggest misconceptions about the metaverse?
First, the idea that the metaverse is immersive virtual reality, such as an Oculus or Meta Quest. That’s an access device. It would be akin to saying the mobile internet is a smartphone.

We should think of the metaverse as perhaps changing the devices we use, the experiences, business models, protocols and behaviors that we enjoy online. But we’ll keep using smartphones and keyboards. We don’t need to do all video conferences or all calls in 3D. It’s supplements and complements; it doesn’t replace everything.

Also relevant/see:

A former Amazon exec thinks Disney will win the metaverse — from protocol.com

Excerpt:

This month, Ball is publishing his book, “The Metaverse: And How It Will Revolutionize Everything.” The work explains in detail what the metaverse is all about and which shifts in tech, business and culture need to fall into place for it to come into existence.

How will the metaverse change Hollywood? In his book, Ball argues that people tend to underestimate the changes new technologies will have on media and entertainment.

  • Instead of just seeing a movie play out in 360 degrees around us, we’ll want to be part of the movie and play a more active role.
  • One way to achieve that is through games, which have long blurred the lines between storytelling and interactivity. But Ball also predicts there will be a wide range of adjacent content experiences, from virtual Tinder dates in the “Star Wars” universe to Peloton rides through your favorite movie sets.

Addendum on 7/24/22:

Neurodiversity, Inclusion And The Metaverse — from workdesign.com by Derek McCallum

Excerpt:

Innovation in virtual and augmented reality platforms and the vast opportunities connected to the metaverse are driving innovation in nearly every industry. In the workplace, future-focused companies are increasingly exploring ways to use this nascent technology to offer workers more choices and better support for neurodiverse employees.

It would be nearly impossible to list all the challenges and opportunities associated with this technology in a single article, so I’ll keep things focused on an area that is top-of-mind right now as many of us start to make our way back into the office—the workplace. The truth is, while we can use our expertise and experience to anticipate outcomes, no one truly knows what the metaverse will become and what the wide-ranging effects will be. At the moment, the possibilities are exciting and bring to mind more questions than answers. As a principal and hands-on designer in a large, diverse practice, my hope is that we will be able to collectively harness the inherent opportunities of the metaverse to support richer, more accessible human experiences across all aspects of the built environment, and that includes the workplace.


 
© 2022 | Daniel Christian