Here’s how Google made VR history and got its first Oscar nom — from inverse.com by Victor Fuste
Google’s short film ‘Pearl’ marks a major moment in VR history.
Excerpt:
The team at Google Spotlight Stories made history on Wednesday, as its short film Pearl became the first virtual reality project to be nominated for an Academy Award. But instead of serving as a capstone, the Oscar nod is just a nice moment at the beginning of the Spotlight team’s plan for the future of storytelling in the digital age.
…
Google Spotlight Stories are not exactly short films. Rather, they are interactive experiences created by the technical pioneers at Google’s Advanced Technologies and Projects (ATAP) division, and they defy expectations and conventions. Film production has in many ways been perfected, but for each Spotlight Story, the technical staff at Google uncovers new challenges to telling stories in a medium that blends together film, mobile phones, games, and virtual reality. Needless to say, it’s been an interesting road.
“The world’s first smart #AugmentedReality for the Connected Home has arrived.” — from thunderclap.it
From DSC:
Note this new type of Human-Computer Interaction (HCI). I think we’ll likely be seeing much more of this sort of thing.
Excerpt (emphasis DSC):
How is Hayo different?
AR that connects the magical and the functional:
Unlike most AR integrations, Hayo removes the screens from smarthome use and transforms the objects and spaces around you into a set of virtual remote controls. Hayo empowers you to create experiences that have previously been limited by the technology, but now are only limited by your imagination.
Screenless IoT:
The best interface is no interface at all. Aside from the one-time setup, Hayo does not use any screens. Your real-life surfaces become the interface and you, the user, become the controls. Virtual remote controls can be placed wherever you want for whatever you need by simply using your Hayo device to take a 3D scan of your space.
Smarter AR experience:
Hayo anticipates your unique context, passive motion and gestures to create useful and more unique controls for the connected home. The Hayo system learns your behaviors and uses its AI to help meet your needs.
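To make the “screenless IoT” idea above concrete, here is a minimal sketch, assuming a hypothetical setup in which a scanned room is divided into named 3D zones and a gesture detected inside a zone triggers a smart-home command. None of this is Hayo’s actual software or API; every name below is made up for illustration.

```python
# Hypothetical sketch (not Hayo's actual API): mapping gestures detected
# inside scanned 3D zones of a room to smart-home actions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Zone:
    """An axis-aligned box carved out of the room's 3D scan."""
    name: str
    min_xyz: tuple
    max_xyz: tuple

    def contains(self, point):
        return all(lo <= p <= hi
                   for p, lo, hi in zip(point, self.min_xyz, self.max_xyz))

# "Virtual remote controls": (zone, gesture) pairs bound to device commands.
BINDINGS = {
    ("coffee_table", "wave"): ("living_room_lights", "toggle"),
    ("kitchen_counter", "tap"): ("coffee_maker", "brew"),
}

ZONES = [
    Zone("coffee_table", (0.0, 0.0, 0.4), (0.8, 0.6, 0.5)),
    Zone("kitchen_counter", (2.0, 1.0, 0.9), (3.0, 1.6, 1.0)),
]

def handle_gesture(gesture: str, position: tuple):
    """Route a detected gesture to the device bound to the zone it occurred in."""
    for zone in ZONES:
        if zone.contains(position):
            command = BINDINGS.get((zone.name, gesture))
            if command:
                device, action = command
                print(f"Sending '{action}' to {device}")
            return
    print("Gesture outside any configured zone; ignored.")

handle_gesture("wave", (0.3, 0.2, 0.45))   # -> toggles the living-room lights
```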
Also see:
Apple iPhone 8 To Get 3D-Sensing Tech For Augmented-Reality Apps — from investors.com by Patrick Seitz
Excerpt:
Apple’s (AAPL) upcoming iPhone 8 smartphone will include a 3D-sensing module to enable augmented-reality applications, Rosenblatt Securities analyst Jun Zhang said Wednesday. Apple has included the 3D-sensing module in all three current prototypes of the iPhone 8, which have screen sizes of 4.7, 5.1 and 5.5 inches, he said. “We believe Apple’s 3D sensing might provide a better user experience with more applications,” Zhang said in a research report. “So far, we think 3D sensing aims to provide an improved smartphone experience with a VR/AR environment.”
Apple’s iPhone 8 is expected to have 3D-sensing tech like Lenovo’s Phab 2 Pro smartphone. (Lenovo)
10 Prominent Developers Detail Their 2017 Predictions for The VR/AR Industry — from uploadvr.com
Excerpt:
As we look forward to 2017 then, we’ve reached out to a bunch of industry experts and insiders to get their views on where we’re headed over the next 12 months.
2016 provided hints of where Facebook, HTC, Sony, Google, and more will take their headsets in the near future, but where do the industry’s best and brightest think we’ll end up this time next year? With CES, the year’s first major event, now in the books, let’s hear from some of those who work with VR itself about what happens next.
We asked all of these developers the same four questions:
1) What do you think will happen to the VR/AR market in 2017?
2) What NEEDS to happen to the VR/AR market in 2017?
3) What will be the big breakthroughs and innovations of 2017?
4) Will 2017 finally be the “year of VR?”
MEL Lab’s Virtual Reality Chemistry Class — from thereisonlyr.com by Grant Greene
An immersive learning startup brings novel experiences to science education.
The MEL app turned my iPhone 6 into a virtual microscope, letting me walk through 360-degree, 3D representations of the molecules featured in the experiment kits.
Labster releases ‘World of Science’ Simulation on Google Daydream — from labster.com by Marian Reed
Excerpt:
Labster is exploring new platforms by which students can access its laboratory simulations and is pleased to announce the release of its first Google Daydream-compatible virtual reality (VR) simulation, ‘Labster: World of Science’. This new simulation, modeled on Labster’s original ‘Lab Safety’ virtual lab, continues to incorporate scientific learning alongside a specific context, enriched by storytelling elements. The use of the Google VR platform has enabled Labster to fully immerse the student, or science enthusiast, in a wet lab that can easily be navigated with intuitive use of Daydream’s handheld controller.
The Inside Story of Google’s Daydream, Where VR Feels Like Home — from wired.com by David Pierce
Excerpt:
Jessica Brillhart, Google’s principal VR filmmaker, has taken to calling people “visitors” rather than “viewers,” as a way of reminding herself that in VR, people aren’t watching what you’ve created. They’re living it. Which changes things.
Welcoming more devices to the Daydream-ready family — from blog.google.com by Amit Singh
Excerpt:
In November, we launched Daydream with the goal of bringing high quality, mobile VR to everyone. With the Daydream View headset and controller, and a Daydream-ready phone like the Pixel or Moto Z, you can explore new worlds, kick back in your personal VR cinema and play games that put you in the center of the action.
Daydream-ready phones are built for VR with high-resolution displays, ultra smooth graphics, and high-fidelity sensors for precise head tracking. To give you even more choices to enjoy Daydream, today we’re welcoming new devices that will soon join the Daydream-ready family.
Kessler Foundation awards virtual reality job interview program — from haptic.al by Deniz Ergürel
Excerpt:
Kessler Foundation, one of the largest public charities in the United States, is funding a virtual reality training project to support high school students with disabilities. The foundation is providing a two-year, $485,000 Signature Employment Grant to the University of Michigan in Ann Arbor to launch the Virtual Reality Job Interview Training program. Kessler Foundation says the VR program will allow for highly personalized role-play, with precise feedback and coaching that may be repeated as often as desired without fear or embarrassment.
Deep-water safety training goes virtual — from shell.com by Soh Chin Ong
How a visit to a shopping centre led to the use of virtual reality safety training for a new oil production project, Malikai, in the deep waters off Sabah in Malaysia.
GE’s Sam Murley scopes out the state of AR and what’s next — from thearea.org
Excerpt (emphasis DSC):
AREA: How would you describe the opportunity for Augmented Reality in 2017?
SAM MURLEY: I think it’s huge — almost unprecedented — and I believe the tipping point will happen sometime this year. This tipping point has been primed over the past 12 to 18 months with large investments in new startups, successful pilots in the enterprise, and increasing business opportunities for providers and integrators of Augmented Reality. During this time, we have witnessed examples of proven implementations – small scale pilots, larger scale pilots, and companies rolling out AR in production — and we should expect this to continue to increase in 2017. You can also expect to see continued growth of assisted reality devices, scalable for industrial use cases such as manufacturing, industrial, and services industries as well as new adoption of mixed reality and augmented reality devices, spatially-aware and consumer focused for automotive, consumer, retail, gaming, and education use cases. We’ll see new software providers emerge, existing companies taking the lead, key improvements in smart eyewear optics and usability, and a few strategic partnerships will probably form.
AREA: Do you have visibility into all the different AR pilots or programs that are going on at GE?
SAM MURLEY:
…
At the 2016 GE Minds + Machines conference, our Vice President of GE Software Research, Colin Parris, showed off how the Microsoft HoloLens could help the company “talk” to machines and service malfunctioning equipment. It was a perfect example of how Augmented Reality will change the future of work, giving our customers the ability to talk directly to a Digital Twin — a virtual model of that physical asset — and ask it questions about recent performance, anomalies, potential issues and receive answers back using natural language. We will see Digital Twins of many assets, from jet engines to compressors. Digital Twins are powerful – they allow tweaking and changing aspects of your asset in order to see how it will perform, prior to deploying in the field. GE’s Predix, the operating system for the industrial Internet, makes this cutting-edge methodology possible. “What you saw was an example of the human mind working with the mind of a machine,” said Parris. With Augmented Reality, we are able to empower the workforce with tools that increase productivity, reduce downtime, and tap into the Digital Thread and Predix. With Artificial Intelligence and Machine Learning, Augmented Reality quickly allows language to be the next interface between the Connected Workforce and the Internet of Things (IoT). No keyboard or screen needed.
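The Digital Twin idea lends itself to a small illustration. The sketch below is a minimal, hypothetical model, not GE’s Predix platform or any real API: a twin that mirrors telemetry from a physical asset, flags readings that drift from nominal behavior, and answers a “what if” question before anything is changed on the machine itself. All names and numbers are invented for illustration.

```python
# Minimal digital-twin sketch (hypothetical; not GE Predix or any real API).
# The twin mirrors a physical asset's telemetry, flags anomalies, and lets
# you trial a parameter change before touching the machine itself.
from statistics import mean

class DigitalTwin:
    def __init__(self, asset_id, nominal_temp_c):
        self.asset_id = asset_id
        self.nominal_temp_c = nominal_temp_c
        self.temperature_log = []          # recent sensor readings, deg C

    def ingest(self, reading_c):
        """Mirror a telemetry reading streamed from the physical asset."""
        self.temperature_log.append(reading_c)

    def recent_anomalies(self, tolerance_c=10.0):
        """Readings that drift beyond tolerance of the nominal temperature."""
        return [r for r in self.temperature_log
                if abs(r - self.nominal_temp_c) > tolerance_c]

    def simulate(self, cooling_boost_c):
        """'What if' question: average temperature with extra cooling applied."""
        if not self.temperature_log:
            return None
        return mean(r - cooling_boost_c for r in self.temperature_log)

twin = DigitalTwin("compressor-42", nominal_temp_c=70.0)
for reading in (68.9, 71.2, 84.5, 69.8):
    twin.ingest(reading)

print(twin.recent_anomalies())             # [84.5] -- flag for inspection
print(twin.simulate(cooling_boost_c=5.0))  # projected average with more cooling
```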
From DSC:
I also believe that the tipping point will happen sometime this year. I hadn’t heard of the concept of a Digital Twin — but I sense that we’ll be hearing that more often in the future.
With Artificial Intelligence and Machine Learning, Augmented Reality quickly allows language to be the next interface between the Connected Workforce and the Internet of Things (IoT). No keyboard or screen needed.
From DSC:
I then saw the concept of the “Digital Twin” again out at:
From DSC:
The following article reminded me of a vision that I’ve had for the last few years…
Though I’m a huge fan of online learning, why build a production studio that’s meant to support online courses only? Let’s take it a step further and design a space that can address content development for online learning as well as for blended learning — which can include the flipped classroom approach.
To do so, colleges and universities need to build something akin to what the National University of Singapore has done. I would like to see institutions create facilities large enough to house multiple types of recording studios. Each facility would feature:
A piece of this facility could look and act like the Sound Lab at the Museum of Pop Culture (MoPOP)
Sydney – The Opera House has joined forces with Samsung to open a new digital lounge that encourages engagement with the space. — from lsnglobal.com by Rhiannon McGregor
The Lounge, enabled by Samsung on November 8, 2016 in Sydney, Australia. (Photo by Anna Kucera)
Also see:
The Lounge enabled by Samsung
Open day and night, The Lounge enabled by Samsung is a new place in the heart of the Opera House where people can sit and enjoy art and culture through the latest technology. The most recent in a series of future-facing projects enabled by Sydney Opera House’s Principal Partner, Samsung, the new visitor lounge features stylish, comfortable seating, as well as interactive displays and exclusive digital content, including:
User interface features:
Anatomy 4D transports teachers, medical professionals, and students of all levels into an interactive 4D experience of human anatomy.
CES 2017: Intel’s VR visions — from jwtintelligence.com by Shepherd Laughlin
The company showed off advances in volumetric capture, VR live streaming, and “merged reality.”
Excerpt (emphasis DSC):
Live-streaming 360-degree video was another area of focus for Intel. Guests were able to watch a live basketball game being broadcast from Indianapolis, Indiana, choosing from multiple points of view as the action moved up and down the court. Intel “will be among the first technology providers to enable the live sports experience on multiple VR devices,” the company stated.
After taking a 3D scan of the room, Project Alloy can substitute virtual objects where physical objects stand.
From DSC:
If viewers of a live basketball game can choose from multiple points of view, why can’t remote learners do this as well with a face-to-face classroom that’s taking place at a university or college? Learning from the Living [Class] Room.
From CES 2017: Introducing DAQRI’s Smart Glasses™
Excerpt:
Data visualization, guided work instructions, remote expert — for use in a variety of industries: medical, aviation and aerospace, architecture and AEC, lean manufacturing, engineering, and construction.
Third-party Microsoft HoloLens-based mixed reality headsets coming soon, prices to start at $299 — from bgr.in by Deepali Moray
Microsoft has partnered with companies including Dell and Acer, which will release their own HoloLens-compatible devices.
Excerpt:
The company said that it is teaming up with the likes of Dell, HP, Lenovo and Acer, which will release headsets based on the HoloLens technology. “These new head-mounted displays will be the first consumer offerings utilizing the Mixed Reality capabilities of Windows 10 Creators Update,” a Microsoft spokesperson said. Microsoft’s partner companies for taking the HoloLens technology forward include Dell, HP, Lenovo, Acer, and 3 Glasses. Headsets by these manufacturers will work the same way as the original HoloLens but carry the design and branding of their respective companies. While the HoloLens developer edition costs a whopping $2999 (approximately Rs 2,00,000), the third-party headsets will be priced starting at $299 (approximately Rs 20,000).
Verto Studio 3D App Makes 3D Modeling on HoloLens Easy — from winbuzzer.com by Luke Jones
The upcoming Verto Studio 3D application allows users to create 3D models and interact with them when wearing HoloLens. It is the first software of its kind for mixed reality.
Excerpt:
How is The Immersive Experience Delivered?
Tethered Headset VR – The user can participate in a VR experience by using a computer with a tethered VR headset (also known as a Head-Mounted Display, or HMD) like Facebook’s Oculus Rift, PlayStation VR, or the HTC Vive. The user can move freely and interact in the VR environment while using a handheld controller to emulate VR hands. But the user has a limited area in which to move about because they are tethered to a computer.
Non-Tethered Headset VR/AR – These devices are headsets and computers built into one system, so users are free of any cables limiting their movement. These devices use AR to deliver a 360° immersive experience. Much like with the Oculus Rift and Vive, the user can move around in the AR environment as well as interact with and manipulate objects. A great example of this headset is Microsoft’s HoloLens, which delivers an AR experience through just a headset.
Mobile Device Inserted into Headgear – To experience VR, the user inserts their mobile device into a Google Cardboard, Samsung Gear VR, or any other type of mobile device headgear, along with headphones if they choose. This form of VR doesn’t require the user to be tethered to a computer, and most VR experiences consist of 360° photos, videos, and interactive scenarios.
Mobile VR – The user can access VR without any type of headgear simply by using a mobile device and headphones (optional). They can still have many of the same experiences that they would through Google Cardboard or any other type of mobile device headgear. Although they don’t get the full immersion that they would with headgear, they would still be able to experience VR. Currently, this version of the VR experience seems to be the most popular because it only requires a mobile device. Apps like Pokémon Go and Snapchat’s animated selfie lens only require a mobile device and have a huge number of users.
Desktop VR – Using just a desktop computer, the user can access 360° photos and videos, as well as other VR and AR experiences, by using the trackpad or computer mouse to move their field of view and become immersed in the VR scenario.
New VR – Non-mobile and non-headset platforms like Leap Motion use depth sensors to create a VR image of one’s hands on a desktop computer; they emulate hand gestures in real time. This technology could be used for anything from teaching assembly in a manufacturing plant to learning a step-by-step process to medical training.
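As a rough way to see the taxonomy above as data, here is a short sketch, using hypothetical names, that encodes each delivery option and picks the richest mode a learner’s hardware can support. The selection rules are an assumption made for illustration, not something stated in the article.

```python
# Sketch only: encoding the delivery options above as data, then picking the
# richest mode the available hardware supports. Categories follow the list;
# the ordering and selection rules are an assumption, not from the source.
from dataclasses import dataclass

@dataclass
class DeliveryMode:
    name: str
    needs_pc: bool
    needs_headset: bool
    room_scale: bool

MODES = [
    DeliveryMode("Tethered headset VR (Rift, Vive, PSVR)", True,  True,  True),
    DeliveryMode("Non-tethered headset (HoloLens)",        False, True,  True),
    DeliveryMode("Mobile device in headgear (Cardboard)",  False, True,  False),
    DeliveryMode("Mobile VR (360 apps, no headgear)",      False, False, False),
    DeliveryMode("Desktop VR (mouse/trackpad 360 view)",   True,  False, False),
]

def best_mode(has_pc: bool, has_headset: bool) -> DeliveryMode:
    """Return the first (richest) mode the available hardware can run."""
    for mode in MODES:
        if mode.needs_pc and not has_pc:
            continue
        if mode.needs_headset and not has_headset:
            continue
        return mode
    return MODES[-1]

print(best_mode(has_pc=False, has_headset=True).name)   # Non-tethered headset (HoloLens)
print(best_mode(has_pc=False, has_headset=False).name)  # Mobile VR (360 apps, no headgear)
```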
VR/AR Solutions
Goggles that are worn, while they are “Oh Myyy” awesome, will not be the final destination of VR/AR. We will want to engage and respond, without wearing a large device over our eyes. Pokémon Go was a good early predictor of how non-goggled experiences will soar.
Elliott Masie
Top 8 VR & AR predictions for 2017 — from haptic.al by Christine Hart
Excerpt:
Education will go virtual
Similar to VR for brand engagement, we’ve seen major potential for delivering hands-on training and distance education in a virtual environment. If VR can take a class on a tour of Mars, the current trickle of educational VR could turn into a flood in 2017.
Published on Dec 26, 2016
Top 10 Virtual Reality Predictions For 2017 In vTime. It’s been an amazing year for VR and AR: new VR and AR headsets, groundbreaking content and lots more. 2017 promises to be amazing as well. Here’s our top 10 virtual reality predictions for the coming year. Filmed in vTime with vCast. Sorry about the audio quality. We used mics on Rift and Vive, which are very good on other platforms. We’ve reported this to vTime.
Addendums
AR becomes the killer app for smartphones
Mixed reality is coming in 2017! Here’s what you need to know — from linkedin.com by Keith Curtin
Excerpts:
A hybrid of AR and VR, Mixed Reality (MR) is far more advanced than Virtual Reality because it combines several types of technologies, including sensors, advanced optics and next-gen computing power. All of this technology bundled into a single device provides the user with the capability to overlay holographic digital content onto their real-time space, creating scenarios that are unbelievably realistic and mind-blowing.
How does it work?
Mixed Reality works by scanning your physical environment and creating a 3D map of your surroundings, so the device knows exactly where and how to place digital content into that space – realistically – while allowing you to interact with it using gestures. Unlike Virtual Reality, where the user is immersed in a totally different world, Mixed Reality experiences invite digital content into your real-time surroundings and allow you to interact with it.
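A minimal sketch of the loop described above, assuming hypothetical stand-in types rather than the actual HoloLens or Windows Mixed Reality APIs: scan the room into a map of surfaces, anchor a piece of digital content to one of them, then let a gesture interact with it.

```python
# Simplified sketch of the mixed-reality loop described above: build a map of
# real-world surfaces, anchor digital content to them, then react to gestures.
# All types here are hypothetical stand-ins, not the HoloLens/Windows MR API.
from dataclasses import dataclass, field

@dataclass
class Surface:
    """A horizontal plane found by scanning the room (e.g. a table top)."""
    name: str
    height_m: float

@dataclass
class Hologram:
    label: str
    anchored_to: Surface
    visible: bool = True

@dataclass
class SpatialMap:
    surfaces: list = field(default_factory=list)

    def place(self, label: str, surface_name: str) -> Hologram:
        """Anchor a piece of digital content to a scanned surface."""
        surface = next(s for s in self.surfaces if s.name == surface_name)
        return Hologram(label, anchored_to=surface)

def on_gesture(gesture: str, target: Hologram):
    """Let the user interact with anchored content (an air-tap toggles it)."""
    if gesture == "air_tap":
        target.visible = not target.visible

# One pass of the pipeline: scan -> map -> anchor -> interact.
room = SpatialMap(surfaces=[Surface("desk", 0.75), Surface("floor", 0.0)])
engine_model = room.place("jet engine model", "desk")
on_gesture("air_tap", engine_model)
print(engine_model.anchored_to.name, engine_model.visible)  # desk False
```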
Mixed reality use cases mentioned in the article included: