How to Set Up a VR Pilot — from campustechnology.com by Dian Schaffhauser
As Washington & Lee University has found, there is no best approach for introducing virtual reality into your classrooms — just stages of faculty commitment.
Excerpt:
The work at the IQ Center offers a model for how other institutions might want to approach their own VR experimentation. The secret to success, suggested IQ Center Coordinator David Pfaff, “is to not be afraid to develop your own stuff” — in other words, diving right in. But first, there’s dipping a toe.
The IQ Center is a collaborative workspace housed in the science building but providing services to “departments all over campus,” said Pfaff. The facilities include three labs: one loaded with high-performance workstations, another decked out for 3D visualization and a third packed with physical/mechanical equipment, including 3D printers, a laser cutter and a motion-capture system.
Here, I would like to stick to the challenges and opportunities presented by augmented reality and virtual reality for language learning.
…
While the challenge is a significant one, I am more optimistic than most that wearable AR will be available and popular soon. We don’t yet know how Snap Spectacles will evolve, and, of course, there’s always Apple.
…
I suspect we will see a flurry of new VR apps from language learning startups soon, especially from Duolingo and in combination with their AI chat bots. I am curious if users will quickly abandon the isolating experiences or become dedicated users.
Bose has a plan to make AR glasses — from cnet.com by David Carnoy
Best known for its speakers and headphones, the company has created a $50 million development fund to back a new AR platform that’s all about audio.
Excerpts:
“Unlike other augmented reality products and platforms, Bose AR doesn’t change what you see, but knows what you’re looking at — without an integrated lens or phone camera,” Bose said. “And rather than superimposing visual objects on the real world, Bose AR adds an audible layer of information and experiences, making every day better, easier, more meaningful, and more productive.”
…
The secret sauce seems to be the tiny, “wafer-thin” acoustics package developed for the platform. Bose said it represents the future of mobile micro-sound and features “jaw-dropping power and clarity.”
Bose adds that the technology can “be built into headphones, eyewear, helmets and more and it allows simple head gestures, voice, or a tap on the wearable to control content.”
Here are some examples Bose gave for how it might be used:
For travel, the Bose AR could simulate historic events at landmarks as you view them — “so voices and horses are heard charging in from your left, then passing right in front of you before riding off in the direction of their original route, fading as they go.” You could hear a statue make a famous speech when you approach it. Or get told which way to turn towards your departure gate while checking in at the airport.
Bose AR could translate a sign you’re reading. Or tell you the word or phrase for what you’re looking at in any language. Or explain the story behind the painting you’ve just approached.
With gesture controls, you could choose or change your music with simple head nods indicating yes, no, or next (Bragi headphones already do this).
Bose AR would add useful information based on where you look: the forecast when you look up, or details about restaurants on the street when you look down.
On the heels of last week’s rollout on Android, Google’s new AI-powered technology, Google Lens, is now arriving on iOS. The feature is available within the Google Photos iOS application, where it can do things like identify objects, buildings, and landmarks, and tell you more information about them, including helpful details like their phone number, address, or open hours. It can also identify things like books, paintings in museums, plants, and animals. In the case of some objects, it can also take actions.
For example, you can add an event to your calendar from a photo of a flyer or event billboard, or you can snap a photo of a business card to store the person’s phone number or address to your Contacts.
The eventual goal is to allow smartphone cameras to understand what they’re seeing across any type of photo, then to help you take action on that information if need be – whether that’s calling a business, saving contact information, or just learning about the world on the other side of the camera.
The introduction of VR technology is part of a “courtroom evidence visualization system” developed by the local court. The system also includes a newly developed computer program that allows lawyers to present evidence with higher quality and efficiency, which will replace a traditional PowerPoint slideshow.
It is reported that the system will soon be implemented in courtrooms across the city of Beijing.
As virtual- and augmented-reality technologies mature, legal questions are emerging that could trip up VR and AR developers. One of the first lawyers to explore these questions is Robyn Chatwood, of the international law firm Dentons. “VR and AR are areas where the law is just not keeping up with [technology] developments,” she says. IEEE Spectrum contributing editor Tam Harbert talked with Chatwood about the legal challenges.
A new virtual reality app designed to help kids suffering from conditions like Crohn’s disease understand their maladies immerses them in a cartoon-like virtual reality tour through their own bodies.
Called HealthVoyager, the tool, a collaboration between Boston Children’s Hospital and the health-tech company Klick Health, is being launched today at an event featuring former First Lady Michelle Obama.
A lot of kids are confused by doctors’ intricate explanations of complex procedures like a colonoscopy, and they and their families can feel much more engaged and satisfied if they really understand what’s going on. But that’s been hard to do in a way that really works and doesn’t get bogged down in meaningless jargon.
The state of virtual reality — from furthermore.equinox.com by Rachael Schultz
How the latest advancements are optimizing performance, recovery, and injury prevention
Excerpt:
Virtual reality is increasingly used to enhance everything from museum exhibits to fitness classes. Elite athletes are using VR goggles to refine their skills, sports rehabilitation clinics are incorporating it into recovery regimes, and others are using it to improve focus and memory.
Here, some of the most exciting things happening with virtual reality, as well as what’s to come.
Cornell researchers are taking 3-D printing and 3-D modeling to a new level by using augmented reality (AR) to allow designers to design in physical space while a robotic arm rapidly prints the work. To use the Robotic Modeling Assistant (RoMA), a designer wears an AR headset with hand controllers. As soon as a design feature is completed, the robotic arm prints the new feature.
From DSC: How might the types of technologies being developed and used by Kazendi’s Holomeeting be used for building/enhancing learning spaces?
In my introduction to the AR Cloud I argued that in order to reach mass adoption, AR experiences need to persist in the real world across space, time, and devices.
To achieve that, we will need a persistent realtime spatial map of the world that enables sharing and collaboration of AR Experiences among many users.
And according to AR industry insiders, it’s poised to become:
“the most important software infrastructure in computing”
5 benefits of using Augmented & Virtual Reality Technologies in eLearning — from elearningindustry.com by Christopher Pappas
Are you looking for ways to make your eLearning course stand out from the crowd? What if I told you there is technology that can help you achieve not only that but also increase online learner engagement and motivation? In this article, I’ll share the most notable benefits of using Augmented and Virtual Reality technologies in your eLearning course.
Excerpt:
Although their full implications are yet to be explored, alternate reality technologies make eLearning more engaging and productive. They are here to stay, and who knows what benefits they will bring to future learners. As the technology evolves, so too will its applications in eLearning. That is why it’s essential for eLearning pros to keep up with cutting-edge tech and think of new and innovative uses for AR and VR tools.
Augmented reality (AR) technologies that blend computer-generated images and data from MRI and CT scans with real-world views are making it possible for doctors to “see under the skin” of their patients to visualize bones, muscles, and internal organs without having to cut open a body.
Experts say AR will transform medical care by improving precision during operations, reducing medical errors, and giving doctors and patients alike a better understanding of complex medical problems.
Healthcare VR innovations are healing patients — from cio.com by Peter Nichol
Virtual reality is healing patients with augmented technologies. The patient experience has been transformed. Welcome to the era of engaged recovery — the new world of healing.
Excerpt:
Three emerging realities will change the hospital experience with unparalleled possibilities:
Virtual reality (VR): full immersion, a simulated reality.
Mixed reality: partial immersion, virtual objects in a real world.
Augmented reality (AR): information overlay, simulated information on top of the real world.
Today, we’ll explore how advances in virtual reality are creating worlds that heal.
…
The next generation of clinical education
The list of possibilities for VR is endless. Augmented and virtual reality medical solutions are removing distractions, improving the quality of critical thinking, and maturing learning solutions, saving time and money while supercharging the learning experience. Explosive developments in 3D virtual and augmented reality have taken clinical education and hands-on learning to the next level.
…
Innovation is ever present in the virtual reality space for healthcare.
MindMaze has developed a breakthrough platform to build intuitive human-machine interfaces combining virtual reality, computer graphics, brain imaging and neuroscience.
MindMotionPRO is a healthcare product offering immersive virtual reality solutions for early motor rehabilitation in stroke patients.
Live 360 uses consumer-level virtual reality devices such as the Oculus Rift.
Medical Realities offers systems designed to reduce the cost of training.
ImmersiveTouch is a surgical virtual reality technology that offers a realistic surgical touch and feel. It also brings patient images to life with AR and VR imaging.
BioFlight VR offers a broad range of medical VR and AR services, including VR training and simulations, AR training, behavior modification and 360-degree video.
zSpace is an immersive medical learning platform, virtualizing anatomical representations into complete procedural planning. zSpace brings a new dimension to medical learning and visualization across three spaces: a gross anatomy VR lab (13,000+ anatomical objects), a teaching presentation view (share the teaching experience with the class via HD TV) and a DICOM viewer (volumetrically render 2D DICOM slices).
Digital reality is generally defined as the wide spectrum of technologies and capabilities that inhere in AR, VR, MR, 360° video, and the immersive experience, enabling simulation of reality in various ways (see figure 1).
Key players in digital reality
In terms of key players, the digital reality space can be divided into areas of activity:
Tools/content—platforms, apps, capture tools, etc.
Application content—information from industry, analytics, social, etc.
Infrastructure—hardware, data systems, HMDs, etc.
Increasing investment in infrastructure may drive the growth of software and content, leading to new practical applications and, possibly, an infusion of digital reality software development talent.
This All-Female Founders Pitch Event Was Held in VR — from vrscout.com by Malia Probst
Hailing from 26 countries across the world, people came together in virtual reality to cheer on these top female founders in the XR industry.
Enterprise leading consumer tech adoption
Concerning the need for a VR/AR ecosystem, Max referred to the challenge of technology adoption: people need to be able to try different use cases and be convinced of the potential of AR, VR and MR. In order to become available (and affordable) for consumers, the technology would have to be adopted by businesses first, as the story of 3D printing shows.
He also highlighted the importance of the right training for users to reduce the general learning curve for immersive technology. Poor instructions at the outset can lead to bad user experiences and cause doubt, or even outright dismissal, of ‘new’ technologies.
We see this firsthand at Kazendi when users try out Microsoft HoloLens for the first time. Max commented that: ‘When people try to make the basic hand gestures and fail they often take the device off and say it’s broken.’
We do have a robust entry demo process to combat this, but at the consumer level (and this is as true for VR as it is for MR and AR) there is little room for error where learning curves are concerned.
Oklahoma State University’s inaugural “Virtual + Augmented Reality Hackathon,” hosted January 26-27 by the Mixed Reality Lab in the university’s College of Human Sciences, gave students and the community a chance to tackle real-world problems using augmented and virtual reality tools, while offering researchers a glimpse into the ways teams work with digital media tools. Campus Technology asked Dr. Tilanka Chandrasekera, an assistant professor in the department of Design, Housing and Merchandising at Oklahoma State University, about the hackathon and how it fits into the school’s broader goals.
To set up the audio feed, use the Alexa mobile app to search for “Campus Technology News” in the Alexa Skills catalog. Once you enable the skill, you can ask Alexa “What’s in the news?” or “What’s my Flash Briefing?” and she will read off the latest news briefs from Campus Technology.
Computer simulations are nothing new in the field of aviation education. But a new partnership between Western Michigan University and Microsoft is taking that one big step further. Microsoft has selected Lori Brown, an associate professor of aviation at WMU, to test out their new HoloLens, the world’s first self-contained holographic computer. The augmented reality interface will bring students a little closer to the realities of flight.
When it comes to the use of innovative technology in the classroom, this is by no means Professor Brown’s first rodeo. She has spent years researching the uses of virtual and augmented reality in aviation education.
“In the past 16 years that I’ve been teaching advanced aircraft systems, I have identified many gaps in the tools and equipment available to me as a professor. Ultimately, mixed reality bridges the gap between simulation, the aircraft and the classroom,” Brown told WMU News.
Storytelling traces its roots back to the very beginning of human experience. It’s found its way through multiple forms, from oral traditions to art, text, images, cinema, and multimedia formats on the web.
As we move into a world of immersive technologies, how will virtual and augmented reality transform storytelling? What roles will our institutions and students play as early explorers? In the traditional storytelling format, a narrative structure is presented to a listener, reader, or viewer. In virtual reality, in contrast, you’re no longer the passive witness. As Chris Milk said, “In the future, you will be the character. The story will happen to you.”
If the accepted rules of storytelling are undermined, we find ourselves with a remarkably creative opportunity no longer bound by the rectangular frame of traditional media.
We are in the earliest stages of virtual reality as an art form. The exploration and experimentation with immersive environments is so nascent that new terms have been proposed for immersive storytelling. Abigail Posner, the head of strategic planning at Google Zoo, said that it totally “shatters” the storytelling experience and refers to it as “storyliving.” At the Tribeca Film Festival, immersive stories are termed “storyscapes.”
Learning through a virtual experience
The concept of using VR as an educational tool has been gaining traction amongst teachers and students, who apply the medium to a wide range of activities and in a variety of subjects. Many schools start with a simple cardboard viewer such as Google Cardboard, available for less than $10 and enough to experiment with simple VR experiences.
A recent study by Foundry10 analyzed how students perceived the usage of VR in their education and in what subjects they saw it being the most useful. According to the report, 44% of students were interested in using VR for science education, 38% for history education, 12% for English education, 3% for math education, and 3% for art education.
Among the many advantages brought by VR, the aspect that generally comes first when discussing the new technology is the immersion made possible by entering a 360° and 3-dimensional virtual space. This immersive aspect offers a different perception of the content being viewed, which enables new possibilities in education.
Schools today seem to be getting more and more concerned with making their students “future-ready.” By bringing the revolutionary medium of VR to the classroom and letting kids experiment with it, they help prepare them for the digital world in which they will grow and later start a career.
Last but not least, the new medium also adds a considerable amount of fun to the classroom as students get excited to receive the opportunity, sometimes for the first time, to put a headset viewer on and try VR.
VR also has the potential to stimulate enthusiasm within the classroom and increase students’ engagement. Several teachers have reported that they were impressed by the impact on students’ motivation and, in some cases, even on their perspective toward the learning material.
These teachers explained that when put in control of creating a piece of content and exposed to the fascinating new medium of VR, some of their students showed higher levels of participation and in some cases, even better retention of the information.
“The good old reality is no longer best practice in teaching. Worksheets and book reports do not foster imagination or inspire kids to connect with literature. I want my students to step inside the characters and the situations they face, I want them to visualize the setting and the elements of conflict in the story that lead to change.”
Extended Reality (XR)
Extended Reality (XR) is a relatively new addition to the technical lexicon, and for now only a few people are aware of it. Extended Reality refers to all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables. It encompasses Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR). In other words, XR is an umbrella that brings all three (AR, VR, MR) together under one term, leading to less public confusion. Extended reality spans a wide range of levels of virtuality, from partial sensory overlays to fully immersive virtuality.
For the past few years we have been talking about AR, VR, and MR; in the coming years, we will probably be talking about XR.
Summary: VR immerses people in a completely virtual environment; AR creates an overlay of virtual content that cannot interact with the physical environment; MR mixes the virtual and the real, creating virtual objects that can interact with the actual environment. XR brings all three (AR, VR, MR) together under one term.
Audi has released a new AR smartphone application that is triggered by their TV commercials. The app brings the cars from the commercial out of the screen and into your living room or driveway.
According to a release from the company, the Audi quattro coaster AR application “recognizes” specific Audi TV commercials. If the right commercial is playing, it will then trigger a series of AR events.
From DSC: How might this type of setup be used for learning-related applications?
Will Augmented and Virtual Reality Replace Textbooks? — from centerdigitaled.com by Michael Mathews
Students who are conceptual and visual learners can grasp concepts through AVR, which in turn allows textbooks to make sense.
Excerpt:
This past year, Tulsa TV-2, an NBC News affiliate, did a great story on the transition in education through the eyes of professors and students who are using augmented and virtual reality. As you watch the news report you will notice the following:
Professors will quickly embrace technology that directly impacts student success.
Students are more engaged and learn quicker through visual stimulation.
Grades can be immediately improved with augmented and virtual reality.
A global reach is possible with stimulating technology.
Within the food industry, AR and VR have also begun to make headway. Although development costs are still high, more and more F&B businesses are beginning to realize the potential of AR/VR and see it as a worthwhile investment. Three main areas – human resources, customer experiences, food products – have seen the most concentration of AR/VR development so far and will likely continue to push the envelope on what use cases AR & VR have within the industry.
Hologram-like 3D images offer new ways to study educational models in science and other subjects. zSpace has built a tablet that uses a stylus and glasses to allow students to have interactive learning experiences. Technology like this not only makes education more immersive and captivating, but also can provide more accurate models for students in professional fields like medicine.
Just days after previewing its augmented reality content strategy, the Times has already delivered on its promise to unveil its first official AR coverage, centered on the 2018 Winter Olympic Games in PyeongChang. When viewed through the NYTimes app for iPhones and iPads, the “Four of the World’s Best Olympians, as You’ve Never Seen Them Before” article displays AR content embedded at regular intervals as readers scroll along.
Retail IT is still in its infancy and is yet to become general practice, but given the popularity of video, the immersive experience will undoubtedly catch on. The explanation lies in the fact that the wealth of information and the extensive range of products on offer are overwhelming for consumers. Having the opportunity to try products by touching a button in an environment that feels real is what can make the shopping experience more animated and less stressful. Also, through VR, even regular customers can experience VIP treatment at no additional cost. Sitting in the front row at the Paris Fashion Week without leaving your local mall or, soon, your own house, will become the norm.
The VR / AR / MR Breakdown
This year will see growth in a variety of virtual technologies and uses. There are differences and similarities between virtual, augmented, and mixed reality technologies. The technology is constantly evolving and even the terminology around it changes quickly, so you may hear variations on these terms.
Augmented reality is what was behind the Pokémon Go craze. Players could see game characters on their devices superimposed over images of their physical surroundings. Virtual features seemed to exist in the real world.
Mixed reality combines virtual features and real-life objects. So, in this way it includes AR but it also includes environments where real features seem to exist in a virtual world.
The folks over at Recode explain mixed reality this way:
In theory, mixed reality lets the user see the real world (like AR) while also seeing believable, virtual objects (like VR). And then it anchors those virtual objects to a point in real space, making it possible to treat them as “real,” at least from the perspective of the person who can see the MR experience.
And, virtual reality uses immersive technology to seemingly place a user into a simulated lifelike environment.
…
Where You’ll Find These New Realities
Education and research fields are at the forefront of VR and AR technologies, where an increasing number of students have access to tools. But higher education isn’t the only place you see this trend. The number of VR companies grew 250 percent between 2012 and 2017. Even the latest iPhones include augmented reality capabilities. Aside from the classroom and your pocket, here are some other places you’re likely to see VR and AR pop up in 2018.
VR/AR for Impact experiences shown this week at WEF 2018 include:
OrthoVR aims to increase the availability of well-fitting prosthetics in low-income countries by using Virtual Reality and 3D rapid prototyping tools to increase the capacity of clinical staff without reducing quality. VR allows current prosthetists and orthotists to leverage their hands-on and embodied skills within a digital environment.
The Extraordinary Honey Bee is designed to help deepen our understanding of the honey bee’s struggle and learn what is at stake for humanity due to the dying global population of the honey bee. Told from a bee’s perspective, The Extraordinary Honey Bee harnesses VR to inspire change in the next generation of honey bee conservationists.
The Blank Canvas: Hacking Nature is an episodic exploration of the frontiers of bioengineering as taught by the leading researchers within the field. Using advanced scientific visualization techniques, the Blank Canvas will demystify the cellular and molecular mechanisms that are being exploited to drive substantial leaps such as gene therapy.
LIFE (Life-saving Instruction For Emergencies) is a new mobile and VR platform developed by the University of Oxford that enables all types of health worker to manage medical emergencies. Through the use of personalized simulation training and advanced learning analytics, the LIFE platform offers the potential to dramatically extend access to life-saving knowledge in low-income countries.
Tree is a critically acclaimed virtual reality experience to immerse viewers in the tragic fate that befalls a rainforest tree. The experience brings to light the harrowing realities of deforestation, one of the largest contributors to global warming.
For the Amazonian Yawanawa, ‘medicine’ has the power to transport you, in a vision, to a place you have never been. Hushuhu, the first woman shaman of the Yawanawa, uses VR like medicine to open a portal to another way of knowing. AWAVENA is a collaboration between a community and an artist, melding technology and transcendent experience so that a vision can be shared, and a story told of a people ascending from the edge of extinction.
Types of Virtual Reality Technology
We can categorize virtual reality technology according to the user experience it delivers:
Non-Immersive
Non-immersive simulations are the least immersive implementation of Virtual Reality Technology. In this kind of simulation, only a subset of the user’s senses is replicated, allowing for marginal awareness of the reality outside the VR simulation. A user enters 3D virtual environments through a portal or window, utilizing standard HD monitors typically found on conventional desktop workstations.
Semi-Immersive
In this kind of simulation, users experience richer immersion, being partly, but not fully, involved in a virtual environment. Semi-immersive simulations are based on high-performance graphical computing, often coupled with large-screen projector systems or multiple TV projections to properly simulate the user’s visuals.
Fully Immersive
This offers the fullest immersive experience of Virtual Reality Technology. Here, head-mounted displays and motion-sensing devices are used to simulate all of the user’s senses. The user experiences a realistic virtual environment, with a wide field of view, high resolution, increased refresh rates and high-quality visualization through the HMD.
These workers already routinely use technology such as tablets to access plans and data on site, but going from 2D to 3D at scale brings that to a whole new level. “Superimposing the digital model on the physical environment provides a clear understanding of the relations between the 3D design model and the actual work on a jobsite,” explained Olivier Pellegrin, BIM manager, GA Smart Building.
The application they are using is called Trimble Connect. It turns data into 3D holograms, which are then mapped out to scale onto the real-world environment. This gives workers an instant sense of where and how various elements will fit and exposes mistakes early on in the process.
Also see:
Trimble Connect for HoloLens is a mixed reality solution that improves building coordination by combining models from multiple stakeholders such as structural, mechanical and electrical trade partners. The solution provides for precise alignment of holographic data on a 1:1 scale on the job site, to review models in the context of the physical environment. Predefined views from Trimble Connect further simplify in-field use with quick and easy access to immersive visualizations of 3D data. Users can leverage mixed reality for training purposes and to compare plans against work completed. Advanced visualization further enables users to view assigned tasks and capture data with onsite measurement tools.
Trimble Connect for HoloLens is available now through the Microsoft Windows App Store. A free trial option is available enabling integration with HoloLens. Paid subscriptions support premium functionality allowing for precise on-site alignment and collaboration.
Trimble’s Hard Hat Solution for Microsoft HoloLens extends the benefits of HoloLens mixed reality into areas where increased safety requirements are mandated, such as construction sites, offshore facilities, and mining projects. The solution, which is ANSI-approved, integrates the HoloLens holographic computer with an industry-standard hard hat. Trimble’s Hard Hat Solution for HoloLens is expected to be available in the first quarter of 2018. To learn more, visit mixedreality.trimble.com.
From DSC:
Combining voice recognition / Natural Language Processing (NLP) with Mixed Reality should provide some excellent, powerful user experiences. Doing so could also provide some real-time understanding as well as highlight potential issues in current designs. It will be interesting to watch this space develop. If there were an issue, wouldn’t it be great to remotely ask someone to update the design and then see the updated design in real-time? (Or might there be a way to make edits via one’s voice and/or with gestures?)
I could see where these types of technologies could come in handy when designing / enhancing learning spaces.
There’s been a lot of cool stuff happening lately around Augmented Reality (AR), and since I love exploring and having fun with new technologies, I thought I would see what I could do with AR and the Web — and it turns out I was able to do quite a lot!
Most AR demos are with static objects, like showing how you can display a cool model on a table, but AR really begins to shine when you start adding in animations!
With animated AR, your models come to life, and you can then start telling a story with them.
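To make the idea concrete, here is a minimal sketch of an animated web AR scene. It assumes three.js and its WebXR ARButton helper; the excerpt above doesn’t name a specific toolchain, so the library choice, the placeholder cube, and every identifier below are illustrative rather than the author’s actual code.

```typescript
// Minimal animated web AR sketch (assumed toolchain: three.js + WebXR).
import * as THREE from 'three';
import { ARButton } from 'three/examples/jsm/webxr/ARButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  70, window.innerWidth / window.innerHeight, 0.01, 20
);

const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true; // hand frame timing over to the WebXR session
document.body.appendChild(renderer.domElement);
document.body.appendChild(ARButton.createButton(renderer)); // "Start AR" button

// A small cube stands in for the model; a real demo would load a glTF asset.
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(0.1, 0.1, 0.1),
  new THREE.MeshNormalMaterial()
);
cube.position.set(0, 0, -0.5); // roughly half a metre in front of the viewer
scene.add(cube);

// The animation loop is what turns a static object into one that feels alive.
renderer.setAnimationLoop(() => {
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
});
```

Marker-based libraries such as AR.js take a different route on phones without WebXR support, but the principle is the same: once the model is animating in the user’s space, you can start telling a story with it.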
If you’re in the market for some art in your house or apartment, Art.com will now let you use AR to put digital artwork up on your wall.
The company’s ArtView feature is one of the few augmented reality features that actually adds a lot to the app it’s put in. With the ARKit-enabled tech, the artwork is accurately sized so you can get a perfect idea of how your next purchase could fit on your wall. The feature can be used for the two million pieces of art on the site and can be customized with different framing types.
Bailenson’s newest book, Experience on Demand, builds on that earlier work while focusing more clearly — even bluntly — on what we do and don’t know about how VR affects humans.
…
“The best way to use it responsibly is to be educated about what it is capable of, and to know how to use it — as a developer or a user — responsibly,” Bailenson wrote in the book.
Among the questions raised:
“How educationally effective are field trips in VR? What are the design principles that should guide these types of experiences?”
“How many individuals are not meeting their potential because they lack access to good instruction and learning tools?”
“When we consider that the subjects were made uncomfortable by the idea of administering fake electric shocks, what can we expect people will feel when they are engaging all sorts of fantasy violence and mayhem in virtual reality?”
“What is the effect of replacing social contact with virtual social contact over long periods of time?”
“How do we walk the line and leverage what is amazing about VR, without falling prey to the bad parts?”
Explore what’s next in learning spaces + design thinking that breaks the barriers of tradition and inspires innovation
Retool your learning environments with virtual & augmented reality
Connect your learning space design with strategic planning initiatives
Discover next generation learning solutions during our networking breaks
Overcome institutional and financial roadblocks to building active learning spaces
Redesign spaces with limited budgets
From DSC: I am honored to be serving on the Advisory Council for this conference with a great group of people. Missing — at least from my perspective — from the image below is Kristen Tadrous, Senior Program Director with the Corporate Learning Network. Kristen has done a great job these last few years planning and running this conference.
NOTE: The above graphic reflects a change for me. I am still an Adjunct Faculty Member at Calvin College, but I am no longer a Senior Instructional Designer there.
This national conference will be held in Los Angeles, CA on February 26-28, 2018. It is designed to help institutions of higher education develop highly-innovative cultures — something that’s needed in many institutions of traditional higher education right now.
I have attended the first 3 conferences and I moderated a panel at last year’s conference out in San Diego. I just want to say that this is a great conference and I encourage you to bring a group of people to it from your organization! I say a group of people because a group of 5 of us (from a variety of departments) went one year and the result of attending the NGLS Conference was a brand new Sandbox Classroom — an active-learning based, highly-collaborative learning space where faculty members can experiment with new pedagogies as well as with new technologies. The conference helped us discuss things as a diverse group, think out loud, come up with some innovative ideas, and then build the momentum to move forward with some of those key ideas.
If you haven’t already attended this conference, I highly recommend that you check it out.
8 ways augmented and virtual reality are changing medicine — from israel21c.org by Abigail Klein Leichman
Israeli companies are using futuristic technologies to simplify complex surgery, manage rehab, relieve pain, soothe autistic kids and much more.
Here’s the trade-off: what we gain in development ease-of-use (native SDKs, integration into existing workflows) and performance enhancements (load times, battery efficiency, render quality, integration with native apps), we lose in universality; naturally, each company wants you staying within its own ecosystem.
In a nutshell: new AR platforms from today’s tech giants are aimed at reducing technical headache so you can focus on creating amazing experiences… but they also want you creating more apps for their respective mobile ecosystems.
Learning to play the piano is getting an immersive upgrade with a new augmented reality (AR) piano training software called Music Everywhere. The HoloLens app aims to help students of all talent levels build fundamental music theory and performance skills. While traditional piano lessons can cost upwards of $100 per hour, Music Everywhere is free on the Microsoft store and offers a cost effective tutoring solution that provides students with immediate interaction feedback, making it differ greatly from watching a video tutorial.
Founded in 2017, Music Everywhere began at Carnegie Mellon’s ETC with Seth Glickman, Fu Yen Hsiao, and Byunghwan Lee realizing the nascent technology could be used for skills training. The app was the first Augmented Reality music learning platform to take first prize in Microsoft’s HoloLens Developer Contest, beating more than one-thousand submissions.
The market for virtual reality applications is growing at a rapid pace, and is expected to double in the next five years (Bolkan, 2017). As the cost of equipment falls and schools have greater access to technology, there is great interest in virtual reality as an educational tool. A small but growing group of educators have started to integrate virtual reality in their classrooms, with promising results (Castaneda, Cechony & Bautista, 2017). We reached out to teachers and administrators who are currently using virtual reality in their classrooms to hear their perspectives and practical strategies for infusing this resource into their classrooms.
Teachers have creative ideas for how to incorporate immersive education in current classrooms: how to select activities, how to set up the classroom, how to get support during the activity and how to transport devices. Teachers also shared their ideas for future applications of VR, including how to deepen the learning experience and to expand the reach of these technologies to a greater population of students.
Here we share three vignettes of three different approaches: a social studies class in a suburban school district, a district-wide perspective from an urban school district and a class designed entirely around understanding and implementing VR for other classrooms. We also share how we are using these ideas to inform our own project in designing a collaborative immersive virtual reality educational game for introductory high school biology.
VR is already being used for many real-world applications–hiring, training, marketing/sales, medical purposes, entertainment, and more–and is worth considering for many different university departments.
…
At German University in Cairo, architecture students used our platform to create tours of historical Cairo buildings, complete with educational hotspot overlays on particularly interesting features. This multimedia approach educated students without them having to travel to the buildings. It also made for a “stickier” learning experience for the students involved in creating it.
…
At Emporia State University, for example, the forensic science students view virtual crime scenes recorded at the Kansas Bureau of Investigation in Topeka. Forensic-science students can look for clues and learn facts via voiceover, mimicking an actual crime-scene inquiry quite impressively.
Just as televisions and driverless cars have become part of the furniture at the CES technology show, so too have virtual and augmented reality headsets.
Although the momentum behind VR’s growth slowed in 2017 – the industry seemingly unsure if it should progress with a technology destined to remain a niche – AR is being welcomed into the spotlight with open arms.
…
Here are six AR and VR highlights from CES 2018.
The University of Washington, hoping to get ahead in the burgeoning field of augmented reality (AR) and virtual reality (VR), has launched the UW Reality Lab, a center for research, education and innovation in AR and VR.
…
One of the first research centers in the world built for AR and VR projects, the UW Reality Lab is also located in Seattle — a hotspot for technology companies, from behemoths like Amazon and Microsoft to startups still trying to get off the ground.
“We’re seeing some really compelling and high-quality AR and VR experiences being built today,” Steve Seitz, center co-lead and Allen School professor, said in the university’s statement. “But, there are still many core research advances needed to move the industry forward — tools for easily creating content, infrastructure solutions for streaming 3D video, and privacy and security safeguards — that university researchers are uniquely positioned to tackle.”
Why Augmented Reality is Important for eLearning
According to a report released by CCS Insight, augmented and virtual reality hardware is set to become a $4 billion market by 2018. Let’s take a look at how augmented reality can be leveraged in the online learning space:
Simulated working environments
One of the most common advantages of online learning is the ability to create an environment in which users have the freedom to experiment. As people usually learn from their mistakes, when they work in a consequence-free environment they are most likely to remember the right way to do things.
Support Gamification
As online learning management systems (LMSs) use gamification widely, augmented reality can be directly applied. In an AR training module, employees can be rewarded for effectively performing their routine tasks in the right way, which will eventually improve performance.
Immersive Learning Environments
Using a tablet or smartphone for online training means users are constantly distracted by emails, notifications from social channels, and so on. This is one of the reasons why elearning content uses interactive multimedia elements to engage students. With augmented reality, elearning courses can be supported with 360° video, which will engage users and remove distractions for them.
Motion tracking
Motion and gesture tracking are part of the AR experience. They are commonly leveraged for choosing menu items or engaging with video game-based environments.
In the online learning domain, LMSs can use this technology to track learners’ progress and ensure that they are achieving the set targets. This will boost real-time training performance and improve interactivity with instant feedback.
Simply put, with augmented reality the possibilities are endless. With the growing number of Bring Your Own Device (BYOD) workplaces, it is expected that employees and learners will be delighted to use augmented reality.
The Musical Future of VR
VR technology is still in its earliest stages, but musicians are already seeing how they will be able to connect with fans and make new ones without the expense of touring. In artificial environments, bands can invite music lovers into their world.
But beyond the obvious entertainment factor, VR has the potential to become a tool for education. Music students could enter a studio space using VR gear for lessons and practice. The immediate feedback provided and game-like atmosphere may keep students more motivated and engaged. Imagine methods for teaching that include ways to slow down and loop difficult parts or bringing in the composer for lessons.
VR can also connect music lovers to the many people behind the scenes involved in producing the music they enjoy. Listeners can learn about the industry and how a song comes to life. They’ll understand why it’s important to play a part in sustaining the music business.
For this technology to become a reality in itself inside consumers’ listening and learning spaces, obstacles need addressing. The hardware is still pricey, bulky and requires a power source. Apps need creators who will need more in the way of artificial intelligence.
“The ability to combine digital information with the real world is going to disrupt every business model, transform human/machine interaction, and generate innovative use cases across every discipline and in every vertical including education, healthcare, manufacturing,” Werner continued. “I see ARiA as the TED for AR, where the best minds come together to solve real work problems and share ideas to capitalize on the huge opportunity.”
Broadcast news and sports now routinely lay data, graphics, and animation onto the physical world. AR has become ubiquitous in ways that have nothing to do with smart glasses. “AR is on the verge.
In regards to mixed reality for immersive learning:
Pearson – the world’s largest education company – will begin rolling out curriculum in March that will work on both HoloLens and Windows Mixed Reality immersive VR headsets. These six new applications will deliver seamless experiences across devices and further illustrate the value of immersive educational experiences.
We are expanding our mixed media reality curriculum offerings through a new partnership with WGBH’s Bringing the Universe to America’s Classrooms project, for distribution nationally on PBS LearningMedia™. This effort brings cutting-edge Earth and Space Science content into classrooms through digital learning resources that increase student engagement with science phenomena and practices.
To keep up with growing demand for HoloLens in the classroom we are committed to providing affordable solutions. Starting on January 22, we are making available a limited-time academic pricing offer for HoloLens. To take advantage of the limited-time academic pricing offer, please visit hololens.com/edupromo.
From DSC: Though this outlook report paints a rosier picture than I think we will actually encounter, there are several interesting perspectives in this report. We need to be peering out into the future to see which trends and scenarios are most likely to occur…then plan accordingly. With that in mind, I’ve captured a few of the thoughts below.
At its inception, very few people anticipated the pace at which the internet would spread across the world, or the impact it would have in remaking business and culture. And yet, as journalist Oliver Burkeman wrote in 2009, “Without most of us quite noticing when it happened, the web went from being a strange new curiosity to a background condition of everyday life.”
In Dell’s Digital Transformation Index study, with 4,000 senior decision makers across the world, 45% say they are concerned about becoming obsolete in just 3-5 years, nearly half don’t know what their industry will look like in just three years’ time, and 73% believe they need to be more ‘digital’ to succeed in the future.
With this in mind, we set out with 20 experts to explore how various social and technological drivers will influence the next decade and, specifically, how emerging technologies will recast our society and the way we conduct business by the year 2030. As a result, this outlook report concludes that, over the next decade, emerging technologies will underpin the formation of new human-machine partnerships that make the most of their respective complementary strengths. These partnerships will enhance daily activities around the coordination of resources and in-the-moment learning, which will reset expectations for work and require corporate structures to adapt to the expanding capabilities of human-machine teams.
…
For the purpose of this study, IFTF explored the impact that Robotics, Artificial Intelligence (AI) and Machine Learning, Virtual Reality (VR) and Augmented Reality (AR), and Cloud Computing will have on society by 2030. These technologies, enabled by significant advances in software, will underpin the formation of new human-machine partnerships.
… On-demand access to AR learning resources will reset expectations and practices around workplace training and retraining, and real-time decision-making will be bolstered by easy access to information flows. VR-enabled simulation will immerse people in alternative scenarios, increasing empathy for others and preparation for future situations. It will empower the internet of experience by blending physical and virtual worlds.
Already, the number of digital platforms that are being used to orchestrate either physical or human resources has surpassed 1,800. They are not only connecting people in need of a ride with drivers, or vacationers with a place to stay, but job searchers with work, and vulnerable populations with critical services. The popularity of the services they offer is introducing society to the capabilities of coordinating technologies and resetting expectations about the ownership of fixed assets.
Human-machine partnerships won’t spell the end of human jobs, but work will be vastly different. …
The U.S. Bureau of Labor Statistics says that today’s learners will have 8 to 10 jobs by the time they are 38. Many of them will join the workforce of freelancers. Already 50 million strong, freelancers are projected to make up 50% of the workforce in the United States by 2020. Most freelancers will not be able to rely on traditional HR departments, onboarding processes, and many of the other affordances of institutional work.
By 2030, in-the-moment learning will become the modus operandi, and the ability to gain new knowledge will be valued higher than the knowledge people already have.
Though the Next Big Thing won’t appear for a while, we know pretty much what it will look like: a lightweight, always-on wearable that obliterates the divide between the stuff we see on screens and the stuff we see when we look up from our screens.
…
Not every company working on post-reality glasses shares an identical vision; some have differing views of how immersive it should be. But all have quietly adopted the implicit assumption that a persistent, wearable artificial reality is the next big thing. The pressure of the competition has forced them to begin releasing interim products, now.
And he added that “these glasses will offer AR, VR, and everything in between, and we’ll wear them all day and we’ll use them in every aspect of our lives.”
Concerned with the decay of African heritage sites, The Zamani Project, based at the University of Cape Town, South Africa, is seeking to immortalize historic spots in three-dimensional, virtual reality-ready models.
Professor Heinz Ruther steers the project. He ventures up and down the continent — visiting Ghana, Tanzania, Mali, Ethiopia, Kenya and elsewhere — recording in remarkable detail the structure and condition of tombs, churches and other buildings.
“I’ve seen how sites are deteriorating visibly,” Ruther told CNN.
The project’s aim is to build a database of complex, lifelike 3-D models. Presently, they’ve mapped around 16 sites including Lalibela in Ethiopia, Timbuktu in Mali and Kilwa in Tanzania.
Education has been a recurring theme throughout the many programs of the NYC Media Lab, a public-private partnership where I serve as an Executive Director. How will virtual and augmented reality change the classroom? How can teachers use immersive media to educate citizens and keep our communities vibrant? In what ways can enterprises leverage innovation to better train employees and streamline workflows?
These are just a few of the top-of-mind questions that NYC Media Lab’s consortium is thinking about as we enter the next wave of media transformation.
Researchers and professionals at work across the VR/AR community in New York City are excited for what comes next. At NYC Media Lab’s recent Exploring Future Reality conference, long-time educators including Agnieszka Roginska of New York University and Columbia University’s Steven Feiner pointed to emerging media as a way to improve multi-modal learning for students and train computer systems to understand the world around us.
NYC Media Lab merges engineering and design research happening at the city’s universities with resources and opportunities from the media and technology industry—to produce new prototypes, launch new companies and advocate for the latest thinking.
In the past year, the Lab has completed dozens of rapid prototyping projects; exhibited hundreds of demos from the corporate, university and entrepreneurship communities; helped new startups make their mark; and hosted three major events, all to explore emerging media technologies and their evolving impact.
While it’s all too easy to lose ourselves in the countless VR worlds at our fingertips, sometimes we just need to access the desktop and get things done in Windows. Thanks to a few innovative apps, this is possible without removing your headset.
The augmented reality industry made great strides in 2017, but its apex is not even in sight.
In terms of software, augmented reality is approaching meaningful mainstream awareness, thanks mostly to Apple and ARKit. Meanwhile, on the hardware side, AR is very much in its infancy, with headsets mostly limited to enterprise customers or developer kits and the majority of smartphones lacking the sensors necessary to achieve much more than parlor tricks.
We now have enough specimens of AR software and hardware to compile a roster of the best efforts of the industry. Rest assured, we expect every item listed herein to be surpassed in 2018 either by their next iterations or new entries.
3. Augmented reality goes mainstream
Before smartphones existed 10 years ago, most people would have considered spending five hours a day staring at a phone crazy. In 2018, the bent-neck trend will start to reverse itself.
The mobile game Pokémon Go has unleashed a billion-dollar demand for augmented reality entertainment, and major brands are taking notice. Thanks to the introduction of affordable augmented reality glasses, our phones will remain in our pockets and Heads Up Displays (HUD) will improve how we work, shop, and play.
HUDs, best known today as the instrument gauges that fighter pilots monitor on their visors or windshields, will become a standard in consumer eyeglasses. Imagine walking down the street in a foreign country, for example, and having all of the store signs instantly translated into English thanks to your trendy sunglasses.
AR will customize in-store experiences with mannequins that match your body type and display enough virtual inventory to rival any online site. Merchants will create AR experiences with their packaging so that demonstration videos can appear when you look at the product on the shelf or celebrity spokespeople can magically stand in the aisle to pitch the product. Virtual pop-up stores can be built to appear anywhere that crowds are gathered (in a stadium, a busy street corner, or even inside a subway). These non-brick and mortar retail locations will bring new opportunities for merchants to create engaging shopping experiences anywhere with accessible bandwidth.
Li-Fi, a new light-based wireless connection with data speeds 100 times that of Wi-Fi, will bring high-definition virtual objects into stores. With Li-Fi and AR, consumers can see limitless virtual inventory in store, at scale.
With just a wave of your hand, a car salesperson can change the model, color, and customized features of the car “sitting” on the dealership’s showroom floor. Combining real and virtual objects can enhance experiences for all out-of-home activities. Sports stadiums will be brought into the 21st century with personalized HUDs of players on the field. Imagine watching a live football game in the stadium and seeing personalized stats floating above the fantasy sports players you follow. When watching sports from home, AR has the potential to bring the excitement of life-size boxing matches into your living room. The real promise of AR is to bring people the information they need without having to ask for it.
For many, 2018 will be the start of living an augmented life.
Smartphones vs. Glasses vs. The Future
On the upside, it’s clear that investors are still bullish about AR, particularly in contrast to VR. That’s likely because they understand something that’s fairly obvious once you spend a decent amount of time analyzing both sectors: VR, as we know it today, is best suited as an escape experience (including interactive entertainment, passive consumption of visuals, and virtual meetings) in a stationary location. On the other hand, AR is limitless in its reach as a mobile-friendly tool for interfacing with the real world through our smartphones and tablets.
Those real-world AR interactions will begin to include everything from banking transactions, to driving and walking assistance (actually, some of this is already happening). Soon, AR will provide real-time language translations with subtitles floating over someone’s head like captions during a foreign movie, but in the real world.
A retrospective on the 2017 innovations that are helping the #VR and #AR industries to evolve. https://t.co/6dtSwtfoNm
Artificial Intelligence is the talk of the town. It evolved from a mere buzzword in 2016 into something used in a more practical manner in 2017. As 2018 rolls out, we will gradually see AI transition into a necessity. We have prepared a detailed report on what we can expect from AI in the coming year. So sit back, relax, and enjoy the ride through the future. (Don’t forget to wear your VR headgear!)
Here are 18 things that will happen in 2018 that are either AI-driven or driving AI:
Artificial General Intelligence may gain major traction in research.
We will turn to AI-enabled solutions to solve mission-critical problems.
Machine learning adoption in business will see rapid growth.
Safety, ethics, and transparency will become an integral part of AI application design conversations.
Mainstream adoption of AI on mobile devices.
Major research on data-efficient learning methods.
AI personal assistants will continue to get smarter.
The race to conquer the AI-optimized hardware market will heat up further.
We will see closer AI integration into our everyday lives.
The cryptocurrency hype will normalize and pave the way for AI-powered blockchain applications.
Deep learning will continue to play a significant role in AI development.
AI will be on both sides of the cybersecurity challenge.
Augmented reality content will be brought to smartphones.
Reinforcement learning will be applied to a large number of real-world situations.
Robotics development will be powered by deep reinforcement learning and meta-learning.
Rise in immersive media experiences enabled by AI.
A large number of organizations will use digital twins.
Inside AI — from inside.com The year 2017 has been full of interesting news about Artificial Intelligence, so to close out the year, we’re doing two special retrospective issues covering the highlights.
Excerpt:
A Reality Check For IBM’s A.I. Ambitions. MIT Tech Review. This is a must-read piece about the failures, and continued promise, of Watson. Some of the press about Watson has made IBM appear behind some of the main tech leaders, but keep in mind that Google, Amazon, Facebook, and others don’t do the kinds of customer-facing projects IBM is doing with Watson. When you look at how the tech giants are positioned, I think IBM has been vastly underestimated, given that it has the one thing few others do: large-scale enterprise A.I. projects. Whether it all works today or not doesn’t matter. The experience and expertise IBM is building are a competitive advantage in a very young market where no other companies are doing these types of projects, which will soon enough be mainstream.
The Business of Artificial Intelligence. Harvard Business Review. This cover story for the latest edition of HBR explains why artificial intelligence is the most powerful general-purpose technology to come around in a long time. It also looks at some of the key ways to think about applying A.I. at work, and how the next phase of this technology can be expected to play out.
The Robot Revolution Is Coming. Just Be Patient. Bloomberg. We keep hearing that robots and A.I. are about to make us super productive. But when? Sooner than we think, according to this.
There are few electronic devices with which you cannot order a Domino’s pizza. When the craving hits, you can place an order via Twitter, Slack, Facebook Messenger, SMS, your tablet, your smartwatch, your smart TV, and even your app-enabled Ford. This year, the pizza monger added another ordering tool: If your home is one of the 20 million with a voice assistant, you can place a regular order through Alexa or Google Home. Just ask for a large extra-cheese within earshot, and voila—your pizza is in the works.
Amazon’s Alexa offers more than 25,000 skills—the set of actions that serve as applications for voice technology. Yet Domino’s is one of a relatively small number of brands that has seized the opportunity to enter your home by creating a skill of its own. Now that Amazon Echoes and Google Homes are in kitchens and living rooms across the country, they open a window into user behavior that marketers previously only dreamt of. But brands’ efforts to engage consumers directly via voice have been scattershot. The list of those that have tried is sparse: some banks; a couple of fast food chains; a few beauty companies; retailers here and there. Building a marketing plan for Alexa has been a risky venture. That’s because, when it comes to our virtual assistants, no one knows what the *&^& is going on.
But if 2017 was the year that Alexa hit the mainstream, 2018 will be the year that advertisers begin to take her seriously by investing time and money in figuring out how to make use of her.
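To make the “skills” idea above a bit more concrete, here is a minimal sketch of how a brand’s voice skill might map a spoken intent to a reply. It follows the documented request/response JSON shape for custom Alexa skills, but the intent name, slot, and ordering call are purely hypothetical illustrations, not Domino’s actual integration.

```typescript
// Minimal sketch of a custom voice-skill handler. The request/response JSON shape
// follows the documented Alexa custom-skill format; "ReorderPizzaIntent", the "Size"
// slot, and the (omitted) ordering call are illustrative assumptions only.

interface IntentRequest {
  request: {
    type: string;
    intent?: { name: string; slots?: Record<string, { value?: string }> };
  };
}

interface SkillResponse {
  version: string;
  response: {
    outputSpeech: { type: "PlainText"; text: string };
    shouldEndSession: boolean;
  };
}

// Helper that wraps plain text in the expected response envelope.
function say(text: string, endSession = true): SkillResponse {
  return {
    version: "1.0",
    response: { outputSpeech: { type: "PlainText", text }, shouldEndSession: endSession },
  };
}

// Entry point a Lambda-style handler might expose.
export function handler(event: IntentRequest): SkillResponse {
  if (event.request.type === "LaunchRequest") {
    return say("Welcome back. Want your usual order?", false);
  }
  if (event.request.type === "IntentRequest" && event.request.intent?.name === "ReorderPizzaIntent") {
    const size = event.request.intent?.slots?.Size?.value ?? "large";
    // A real skill would call the brand's ordering API here -- omitted in this sketch.
    return say(`Okay, a ${size} extra-cheese pizza is in the works.`);
  }
  return say("Sorry, I didn't catch that.", false);
}
```

The interesting design question for brands is less the plumbing above and more the conversation design: deciding which handful of intents (reorder, track, cancel) are worth claiming in a voice-only interface.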
8 emerging AI jobs for IT pros— from enterprisersproject.com by Kevin Casey What IT jobs will be hot in the age of AI? Take a sneak peek at roles likely to be in demand
Excerpt:
If you’re watching the impact of artificial intelligence on the IT organization, your interest probably starts with your own job. Can robots do what you do? But more importantly, you want to skate where the puck is headed. What emerging IT roles will AI create? We talked to AI and IT career experts to get a look at some emerging roles that will be valuable in the age of AI.
This past June, Fortune magazine asked the CEOs of the Fortune 500 what they believed was the biggest challenge facing their companies. Their biggest concern for 2017: “The rapid pace of technological change,” said 73% of those polled, up from 64% in 2016. Cybersecurity came in a distant second, at 61%, even after all the mega-hacks of the past year.
But apart from the Big 8 technology companies – Google, Facebook, Microsoft, Amazon, IBM, Baidu, Tencent, and Alibaba – business leaders, especially of earlier generations, may feel they don’t know enough about AI to make informed decisions.
For the first time, artificial intelligence has been used to discover two new exoplanets. One of the discoveries, made using data from NASA’s Kepler mission, brings the Kepler-90 system to a total of eight planets – the first system found with the same number of planets as our own.
From DSC: After reviewing the article below, I wondered…if we need to interact with content in order to learn it, how might mixed reality allow for new ways of interacting with such content? This is especially intriguing when we interact with that content alongside others (i.e., social learning).
Perhaps Mixed Reality (MR) will bring forth a major expansion of how we look at “blended learning” and “hybrid learning.”
Changing How We Perceive The World One Industry At A Time
Part of the reason mixed reality has garnered this momentum in such a short span of time is that it promises to revolutionize how we perceive the world without necessarily altering our natural perspective. While VR/AR invite you into their somewhat complex worlds, mixed reality analyzes the surrounding real-world environment before projecting an enhanced and interactive overlay. It essentially “mixes” our reality with digitally generated graphical information.
…
All this, however, pales in comparison to the impact of mixed reality on the storytelling process. While present technologies deliver content in a one-directional manner, from storyteller to audience, mixed reality allows for delivery of content followed by interaction between content, creator, and other users. This mechanism cultivates fertile ground for increased contact between all participating entities, fostering the creation of shared experiences. Mixed reality also reinvents the storytelling process: by merging the storyline with reality, viewers are presented with a holistic experience that can feel virtually indistinguishable from real life.
Mixed reality is without a doubt going to play a major role in shaping our realities in the near future, not just because of its numerous use cases but also because it is the flag bearer of all virtualized technologies. It combines VR, AR and other relevant technologies to deliver a potent cocktail of digital excellence.
At North Carolina State University, Assistant Professor of Chemistry Denis Fourches uses technology to research the effectiveness of new drugs. He uses computer programs to model interactions between chemical compounds and biological targets to predict the effectiveness of the compound, narrowing the field of drug candidates for testing. Lately, he has been using a new program that allows the user to create 3D models of molecules for 3D printing, plus augmented and virtual reality applications.
RealityConvert converts molecular objects like proteins and drugs into high-quality 3D models. The models are generated in standard file formats that are compatible with most augmented and virtual reality programs, as well as 3D printers. The program is specifically designed for creating models of chemicals and small proteins.
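To illustrate the general idea (this is not RealityConvert’s actual pipeline), here is a rough sketch of turning atomic coordinates into a standard 3D file that AR/VR viewers and 3D-printing tools can import. The hard-coded water molecule, the bond list, and the output filename are all assumptions for the example.

```typescript
// Illustrative sketch only -- not RealityConvert's actual code or file pipeline.
// It converts a tiny hard-coded molecule into a Wavefront OBJ file: one vertex per
// atom plus "l" (line) elements for bonds.
import { writeFileSync } from "fs";

interface Atom { element: string; x: number; y: number; z: number; }

const atoms: Atom[] = [
  { element: "O", x: 0.000, y: 0.000, z: 0.000 },
  { element: "H", x: 0.757, y: 0.586, z: 0.000 },
  { element: "H", x: -0.757, y: 0.586, z: 0.000 },
];
const bonds: Array<[number, number]> = [[0, 1], [0, 2]]; // indices into `atoms`

function toObj(atoms: Atom[], bonds: Array<[number, number]>): string {
  const lines: string[] = ["# molecule exported as OBJ (vertices + bond lines)"];
  for (const a of atoms) {
    lines.push(`# ${a.element} atom`);
    lines.push(`v ${a.x} ${a.y} ${a.z}`);
  }
  // OBJ indices are 1-based; each bond becomes a two-vertex line element.
  for (const [i, j] of bonds) {
    lines.push(`l ${i + 1} ${j + 1}`);
  }
  return lines.join("\n") + "\n";
}

writeFileSync("water.obj", toObj(atoms, bonds));
```

A production tool would of course also generate proper sphere-and-stick meshes, materials, and formats such as glTF or STL; the point here is just that “molecule to 3D model” is, at bottom, a coordinate-to-geometry conversion.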
Mozilla has launched its first ever augmented reality app for iOS. The company, best known for its Firefox browser, wants to create an avenue for developers to build augmented reality experiences using open web technologies, WebXR, and Apple’s ARKit framework.
This latest effort from Mozilla is called WebXR Viewer. It contains several sample AR programs, demonstrating its technology in the real world. One is a teapot, suspended in the air. Another contains holographic silhouettes, which you can place in your immediate vicinity. Should you be so inclined, you can also use it to view your own WebXR creations.
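For developers curious what “open web technologies” means in practice, here is a minimal sketch of requesting an immersive AR session with the WebXR Device API, which is what a WebXR-capable browser or viewer exposes to a page. Note that the 2017-era WebXR Viewer tracked earlier drafts of the spec, so treat this as a starting point under that assumption rather than a drop-in implementation; the TypeScript DOM typings may also lack WebXR types, hence the casts.

```typescript
// Minimal sketch: start an immersive AR session with the WebXR Device API.
// Casts to `any` are used because standard TS DOM typings may not include WebXR.
const xr = (navigator as any).xr;

async function startAR(): Promise<void> {
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.log("Immersive AR is not available in this browser.");
    return;
  }
  const session = await xr.requestSession("immersive-ar");

  // Rendering goes through a WebGL context flagged as XR-compatible.
  const canvas = document.createElement("canvas");
  const gl = canvas.getContext("webgl", { xrCompatible: true } as any);
  const XRWebGLLayerCtor = (window as any).XRWebGLLayer;
  session.updateRenderState({ baseLayer: new XRWebGLLayerCtor(session, gl) });

  const refSpace = await session.requestReferenceSpace("local");
  session.requestAnimationFrame(function onFrame(_time: number, frame: any) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // Draw the scene (e.g., a floating teapot) once per view in pose.views here.
    }
    session.requestAnimationFrame(onFrame);
  });
}

startAR();
```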
Airbnb announced today (Dec. 11) that it’s experimenting with augmented- and virtual-reality technologies to enhance customers’ travel experiences.
The company showed off some simple prototype ideas in a blog post, detailing how VR could be used to explore apartments that customers may want to rent, from the comfort of their own homes. Hosts could scan apartments or houses to create 360-degree images that potential customers could view on smartphones or VR headsets.
It also envisioned an augmented-reality system where hosts could leave notes and instructions to their guests as they move through their apartment, especially if their house’s setup is unusual. AR signposts in the Airbnb app could help guide guests through anything confusing more efficiently than the instructions hosts often leave for their guests.
Now Object Theory has just released a new collaborative computing application for the HoloLens called Prism, which takes much of the functionality the company has been developing for its clients over the past couple of years and offers it to users in a free Windows Store application.
Spending on augmented and virtual reality will nearly double in 2018, according to a new forecast from International Data Corp. (IDC), growing from $9.1 billion in 2017 to $17.8 billion next year. The market research company predicts that aggressive growth will continue throughout its forecast period, achieving an average 98.8 percent compound annual growth rate (CAGR) from 2017 to 2021.
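As a quick sanity check on those numbers, the sketch below simply compounds the quoted 2017 base at the stated 98.8 percent CAGR. The 2021 figure it prints is only what that stated rate implies from the excerpt’s own $9.1 billion base, not an IDC number quoted here.

```typescript
// Compounding check using only the figures quoted above: a $9.1B base in 2017 and a
// 98.8% compound annual growth rate through 2021. The 2021 value is what that stated
// rate implies under simple compounding, not a figure taken from the IDC release.
const base2017 = 9.1; // billions of US dollars
const cagr = 0.988;   // 98.8% per year

for (let year = 2017; year <= 2021; year++) {
  const value = base2017 * Math.pow(1 + cagr, year - 2017);
  console.log(`${year}: ~$${value.toFixed(1)}B`);
}
// 2018 works out to roughly $18.1B, close to the $17.8B forecast above;
// 2021 comes out to roughly $142B under this simple model.
```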
Scope AR has launched Remote AR, an augmented reality video support solution for Microsoft’s HoloLens AR headsets.
The San Francisco company is launching its enterprise-class AR solution to enable cross-platform live support video calls.
Remote AR for Microsoft HoloLens brings AR support to field technicians, enabling them to perform tasks with greater speed and accuracy. It does so by allowing an expert to get on a video call with a technician and mark the spot on the screen where the technician has to do something, like turn a screwdriver. The technician can see where the expert is pointing by looking at the AR overlay on the video scene.
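To give a feel for how that “mark a spot, see it in the overlay” flow might be wired up, here is a hypothetical sketch of the annotation message such a remote-assist call could exchange. This is not Scope AR’s actual protocol; the type names, the quaternion pose, and the stand-in hit-test function are all assumptions for illustration.

```typescript
// Hypothetical annotation message for a remote-assist AR call -- not Scope AR's
// actual protocol. The expert taps the shared video frame; the technician's client
// anchors a marker at the corresponding 3D pose in the live scene.

interface Vector3 { x: number; y: number; z: number; }

interface ArAnnotation {
  id: string;
  kind: "arrow" | "circle" | "note";
  label?: string;                                   // e.g., "turn this screw a quarter turn"
  anchorPose: { position: Vector3; rotation: [number, number, number, number] }; // quaternion
  createdBy: "expert" | "technician";
  timestampMs: number;
}

let nextId = 0;

// Stand-in for the AR runtime's hit test: maps a 2D tap to a 3D point in the scene.
function unprojectTapToWorld(tapX: number, tapY: number): Vector3 {
  return { x: tapX * 0.001, y: tapY * 0.001, z: -1.0 }; // placeholder mapping
}

// Expert side: convert a 2D tap on the video into an annotation to send over the call.
function makeAnnotation(tapX: number, tapY: number, label: string): ArAnnotation {
  return {
    id: `ann-${nextId++}`,
    kind: "arrow",
    label,
    anchorPose: { position: unprojectTapToWorld(tapX, tapY), rotation: [0, 0, 0, 1] },
    createdBy: "expert",
    timestampMs: Date.now(),
  };
}

// Technician side: received annotations are rendered as markers pinned to the scene.
function onAnnotationReceived(a: ArAnnotation): void {
  console.log(`Placing ${a.kind} "${a.label ?? ""}" at`, a.anchorPose.position);
}

onAnnotationReceived(makeAnnotation(640, 360, "turn this screw a quarter turn"));
```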
Ultimately, VR in education will revolutionize not only how people learn but how they interact with real-world applications of what they have been taught. Imagine medical students performing an operation or geography students really seeing where and what Kathmandu is. The world just opens up to a rich abundance of possibilities.