Law schools have always taught the law and the practice thereof, but in the 21st century that is not nearly enough to provide students with the tools to succeed.
Clients, particularly business clients, are not only looking for an “attorney” in the customary sense, but a strategic partner equipped to deal with everything from project management to metrics to process enhancement. Those demands present law schools with both an opportunity for and expectation of innovation in legal education.
At Hofstra Law, we are in the process of establishing a new Center for Applied Legal Technology and Innovation where law students will be taught to use current and emerging technology, and to apply those skills and expertise to provide cutting-edge legal services while taking advantage of interdisciplinary opportunities.
Our goal is to teach law students how to use technology to deliver legal services and to yield graduates who combine exceptional legal acumen with the skill and ability to travel comfortably among myriad disciplines. The lawyers of today—and tomorrow—must be more than just conversant with other professionals. Rather, they need to be able to collaborate with experts in other fields to serve the myriad and intertwined interests of the client.
We are living through a fundamental transformation in the way we work. Automation and ‘thinking machines’ are replacing human tasks and jobs, and changing the skills that organisations are looking for in their people. These momentous changes raise huge organisational, talent and HR challenges – at a time when business leaders are already wrestling with unprecedented risks, disruption and political and societal upheaval.
The pace of change is accelerating.
Competition for the right talent is fierce. And ‘talent’ no longer means the same as ten years ago; many of the roles, skills and job titles of tomorrow are unknown to us today. How can organisations prepare for a future that few of us can define? How will your talent needs change? How can you attract, keep and motivate the people you need? And what does all this mean for HR?
This isn’t a time to sit back and wait for events to unfold. To be prepared for the future you have to understand it. In this report we look in detail at how the workplace might be shaped over the coming decade.
Artificial Intelligence (AI), virtual reality, augmented reality, robotics, drones, automation, bots, machine learning, NLP/voice recognition and personal assistants, the Internet of Things, facial recognition, data mining, and more. How these technologies roll out — and whether some of them should be rolling out at all — needs to be discussed and dealt with sooner rather than later, because the pace of change itself has changed. If you can look at those articles — comparing them against the last 500-1,000 years of history — and still say that we aren’t living in times where the trajectory of technological change is exponential, then either you or I don’t know the meaning of that word.
The ABA and law schools need to be much more responsive and innovative — or society will end up suffering the consequences.
Mae Jemison, the first woman of color to go into space, stood in the center of the room and prepared to become digital. Around her, 106 cameras captured her image in 3-D, which would later render her as a life-sized hologram when viewed through a HoloLens headset.
Jemison was recording what would become the introduction for a new exhibit at the Intrepid Sea, Air, and Space Museum, which opens tomorrow as part of the Smithsonian’s annual Museum Day. In the exhibit, visitors will wear HoloLens headsets and watch Jemison materialize before their eyes, taking them on a tour of the Space Shuttle Enterprise—and through space history. They’re invited to explore artifacts both physical (like the Enterprise) and digital (like a galaxy of AR stars) while Jemison introduces women throughout history who have made important contributions to space exploration.
Interactive museum exhibits like this are becoming more common as augmented reality tech becomes cheaper, lighter, and easier to create.
Using either an Oculus Go standalone device or a mobile Gear VR headset, users will be able to log in to the Oculus Venues app and join other users for an immersive live stream of various developer keynotes and adrenaline-pumping esports competitions.
From DSC: What are the ramifications of this for the future of webinars, teaching and learning, online learning, MOOCs and more…?
Apple’s iOS 12 has finally landed. The big update appeared for everyone on Monday, Sept. 17, and hiding within are some pretty amazing augmented reality upgrades for iPhones, iPads, and iPod touches. We’ve been playing with them ever since the iOS 12 beta launched in June, and here are the things we learned that you’ll want to know about.
For now, here’s everything AR-related that Apple has included in iOS 12. There are some new features aimed to please AR fanatics as well as hook those new to AR into finally getting with the program. But all of the new AR features rely on ARKit 2.0, the latest version of Apple’s augmented reality framework for iOS.
In a pilot program at Berkeley College, members of a Virtual Reality Faculty Interest Group tested the use of virtual reality to immerse students in a variety of learning experiences. During winter 2018, seven different instructors in nearly as many disciplines used inexpensive Google Cardboard headsets along with apps on smartphones to virtually place students in North Korea, a taxicab and other environments as part of their classwork.
Participants used free mobile applications such as Within, the New York Times VR, Discovery VR, Jaunt VR and YouTube VR. Their courses included critical writing, international business, business essentials, medical terminology, international banking, public speaking and crisis management.
Last year Amazon announced a new feature for its Amazon Web Services called Amazon Sumerian, a platform that would allow anyone to create full-featured virtual reality experiences. The Royal Melbourne Institute of Technology (RMIT) has now announced that it will be offering short courses in artificial intelligence (AI), augmented reality (AR) and VR thanks to a partnership with Amazon Web Services.
The new partnership between RMIT and Amazon Web Services was announced at the AWS Public Sector Summit in Canberra, Australia on Wednesday. The new courses will be using the Amazon Sumerian platform.
The newly launched courses include Developing AI Strategy, Developing AR and VR Strategy and Developing AR and VR Applications. All of these have been adapted from the AWS Educate program, which was created in response to the changing nature of the workplace and the growing relevance of immersive technology.
“My team’s mission is to build a community of lifelong learners, successfully navigating the world of work … yes, sometimes your degree is the right solution education-wise for a person, but throughout our lives, certainly I know in my digital career, constantly we need to be updating our skills and understand the new, emerging technology and talk with experts.”
London College of Communication, UAL is launching a Master of Arts in VR degree for the 2018-19 academic year.
The demand for VR professionals is growing in the film and media industries, where these technologies are being used most frequently.
From DSC: Collaboration/videoconferencing tools like Webex, Blackboard Collaborate, Zoom, Google Hangouts, Skype, etc. are being used in a variety of scenarios. That category of platform is well established. It will be interesting to see how VR might play into web-based collaboration, training, and learning.
The first collaborative VR molecular modeling application was released August 29 to encourage hands-on chemistry experimentation.
The open-source tool is free for download now on Oculus and Steam.
Nanome Inc., the San Diego-based start-up that built the intuitive application, comprises UCSD professors and researchers, web developers and top-level pharmaceutical executives.
“With our tool, anyone can reach out and experience science at the nanoscale as if it is right in front of them. At Nanome, we are bringing the craftsmanship and natural intuition from interacting with these nanoscale structures at room scale to everyone,” McCloskey said.
From DSC: While VR will have its place — especially for times when you need to completely immerse yourself in another environment — I think AR and MR will be much larger and have a greater variety of applications. For example, I could see how instructions on how to put something together in the future could use AR and/or MR to assist with that process. The system could highlight the next part that I’m looking for and then highlight the corresponding parts where it goes — and, if requested, could show me a clip on how it fits into what I’m trying to put together.
“…workers with mixed-reality solutions that enable remote assistance, spatial planning, environmentally contextual data, and much more,” Bardeen told me. With the HoloLens, Firstline Workers conduct their usual day-to-day activities with the added benefit of a heads-up, hands-free display that gives them immediate access to valuable, contextual information. Microsoft says speech services like Cortana, along with gesture, will be critical for control, according to the unique needs of each situation.
Expect new worker roles. What constitutes an “information worker” could change because mixed reality will allow everyone to be involved in the collection and use of information. Many more types of information will become available to any worker in a compelling, easy-to-understand way.
STEM students engaged in scientific disciplines, such as biochemistry and neuroscience, are often required by their respective degrees to spend a certain amount of time engaged in an official laboratory environment. Unfortunately, crowded universities and the rise of online education have made it difficult for these innovators-in-training to access properly equipped labs and log their necessary hours.
Cue Google VR Labs, a series of comprehensive virtual lab experiences available on the Google Daydream platform. Developed as part of a partnership between Google and simulation education company Labster, the in-depth program boasts 30 interactive lab experiences in which biology students can engage in a series of hands-on scientific activities in a realistic environment.
These activities can include everything from the use of practical tools, such as DNA sequencers and microscopes, to reality-bending experiences only possible in a virtual environment, like traveling to the surface of the newly discovered Astakos IV exoplanet or examining and altering DNA on a molecular level.
Immersive learning with VR experiences: Design learning scenarios that your learners can experience in Virtual Reality using VR headsets. Import 360° media assets and add hotspots, quizzes and other interactive elements to engage your learners with near real-life scenarios
Interactive videos: Liven up demos and training videos by making them interactive with the new Adobe Captivate. Create your own or bring in existing YouTube videos, add questions at specific points and conduct knowledge checks to aid learner remediation
Fluid Boxes 2.0: Explore the building blocks of Smart eLearning design with intelligent containers that use white space optimally. Objects placed in Fluid Boxes get aligned automatically so that learners always get a fully responsive experience regardless of their device or browser.
360° learning experiences: Augment the learning landscape with 360° images and videos and convert them into interactive eLearning material with customizable overlay items such as information blurbs, audio content & quizzes.
Blippar unveils indoor visual positioning system to anchor AR — from martechtoday.com by Barry Levine
Employing machine vision to recognize mapped objects, the company says it can determine which way a user is looking and can calculate positioning down to a centimeter.
To even scratch the surface of these questions, we need to better understand the audience’s experience in VR — not just their experience of the technology, but the way that they understand story and their role within it.
HoloLens technology is being paired with Microsoft’s Surface Hub, a kind of digital whiteboard. The idea is that the surgical team can gather together around a Surface Hub to review patient information, discuss the details of a procedure, and select what information should be readily accessible during surgery. During the procedure, a surgeon wearing a HoloLens would be able to review a CT or MRI scan, access other data in the electronic medical records, and manipulate these so as to get a clear picture of what is being worked on and what needs to be done.
The VR solution allows emergency medical services (EMS) personnel to dive into a rich and detailed environment in which they can pinpoint portions of the body to dissect. This then allows them to see each part of the body in great detail and to view it from any angle. The goal is to allow users to gain the experience to diagnose injuries from a variety of vantage points, all while working within a virtual environment capable of displaying countless scenarios.
A team of Harvard University researchers recently achieved a major breakthrough in robotics, engineering a tiny spider robot using tech that could one day work inside your body to repair tissues or destroy tumors. Their work could not only change medicine–by eliminating invasive surgeries–but could also have an impact on everything from how industrial machines are maintained to how disaster victims are rescued.
Until now, most advanced, small-scale robots followed a certain model: They tend to be built at the centimeter scale and have only one degree of freedom, which means they can only perform one movement. Not so with this new ‘bot, developed by scientists at Harvard’s Wyss Institute for Biologically Inspired Engineering, the John A. Paulson School of Engineering and Applied Sciences, and Boston University. It’s built at the millimeter scale, and because it’s made of flexible materials–easily moved by pneumatic and hydraulic power–the critter has an unprecedented 18 degrees of freedom.
I liked that it gave a new perspective to the video clip I’d watched: It threw the actual game up on the wall alongside the kind of information a basketball fan would want, including 3-D renderings and stats. Today, you might turn to your phone for that information. With Magic Leap, you wouldn’t have to.
…
Abovitz also said that intelligent assistants will play a big role in Magic Leap’s future. I didn’t get to test one, but Abovitz says he’s working with a team in Los Angeles that’s developing high-definition people that will appear to Magic Leap users and assist with tasks. Think Siri, Alexa or Google Assistant, but instead of speaking to your phone, you’d be speaking to a realistic-looking human through Magic Leap. Or you might be speaking to an avatar of someone real.
“You might need a doctor who can come to you,” Abovitz said. “AI that appears in front of you can give you eye contact and empathy.”
And I loved the idea of being able to place a digital TV screen anywhere I wanted.
In December of last year, U.S. startup Magic Leap unveiled its long-awaited mixed reality headset, a secretive device five years and $2.44B USD in the making.
This morning that same headset, now referred to as the Magic Leap One Creator Edition, became available for purchase in the U.S. On sale to creators at a hefty starting price of $2,275, the spatial computing device utilizes synthetic lightfields to capture natural lightwaves and superimpose interactive, 3D content over the real world.
After spending about an hour with the headset running through set up and poking around its UI and a couple of the launch day apps, I thought it would be helpful to share a quick list of some of my first impressions as someone who’s spent a lot of time with a HoloLens over the past couple years and try to start answering many of the burning questions I’ve had about the device.
UNIVERSITY PARK, Pa. — Penn State instructional designers are researching whether using virtual reality and 360-degree video can help students in online classes learn more effectively.
Designers worked with professors in the College of Nursing to incorporate 360-degree video into Nursing 352, a class on Advanced Health Assessment. Students in the class, offered online through Penn State World Campus, were offered free VR headsets to use with their smartphones to create a more immersive experience while watching the video, which shows safety and health hazards in a patient’s home.
Bill Egan, the lead designer for the Penn State World Campus RN to BSN nursing program, said students in the class were surveyed as part of a study approved by the Institutional Review Board and overwhelmingly said that they enjoyed the videos and thought they provided educational value. Eighty percent of the students said they would like to see more immersive content such as 360-degree videos in their online courses, he said.
In this post, we run through some practical stumbling blocks that prevent VR training from being feasible for most.
…
There are quite a number of practical considerations which prevent VR from totally overhauling the corporate training world. Some are obvious, whilst others only become apparent after using the technology a number of times. It’s important to be made aware of these limitations so that a large investment isn’t made in tech that isn’t really practical for corporate training.
Augmented reality – the next big thing for HR? — from hrdconnect.com
Augmented reality (AR) could have a huge impact on HR, transforming long-established processes into something engaging and exciting. What will this look like? How can we shape this into our everyday working lives?
Excerpt (emphasis DSC):
AR also has the potential to revolutionise our work lives, changing the way we think about office spaces and equipment forever.
Most of us still commute to an office every day, which can be a time-consuming and stressful experience. AR has the potential to turn any space into your own customisable workspace, complete with digital notes, folders and files – even a digital photo of your loved ones. This would give you access to all the information and tools that you would typically find in an office, but wherever and whenever you need them.
And instead of working on a flat, stationary, two-dimensional screen, your workspace would be a customisable three-dimensional space, where objects and information are manipulated with gestures rather than hardware. All you would need is an AR headset.
AR could also transform the way we advertise brands and share information. Imagine if your organisation had an AR stand at a conference – how engaging would that be for potential customers? How much more interesting and fun would meetings be if we used AR to present information instead of slides on a projector?
AR could transform the on-boarding experience into something fun and interactive – imagine taking an AR tour of your office, where information about key places, company history or your new colleagues pops into view as you go from place to place.
A new project aims to use augmented reality (AR) technology to help staff in airport control towers visualize information, making their jobs easier. The project, dubbed RETINA, is looking to modernise Europe’s air traffic management for safer, smarter and even smoother air travel.
From DSC: Having served as one of the judges for these competitions during the last several years, I really appreciate the level of innovation that’s been displayed by many of the submissions and the individuals/institutions behind them.
Overhyped by some, drastically underestimated by others, few emerging technologies have generated as much digital ink as virtual reality (VR), augmented reality (AR), and mixed reality (MR). Still lumbering through the novelty phase and roller coaster-like hype cycles, the technologies are only just beginning to show signs of real-world usefulness with a new generation of hardware and software applications aimed at the enterprise and at end users like you. On the line is what could grow to be a $108 billion AR/VR industry as soon as 2021. Here’s what you need to know.
The reason is that VR environments by nature demand a user’s full attention, which make the technology poorly suited to real-life social interaction outside a digital world. AR, on the other hand, has the potential to act as an on-call co-pilot to everyday life, seamlessly integrating into daily real-world interactions. This will become increasingly true with the development of the AR Cloud.
The AR Cloud
Described by some as the world’s digital twin, the AR Cloud is essentially a digital copy of the real world that can be accessed by any user at any time.
For example, it won’t be long before whatever device I have on me at a given time (a smartphone or wearable, for example) will be equipped to tell me all I need to know about a building just by training a camera at it (GPS is operating as a poor-man’s AR Cloud at the moment).
What the internet is for textual information, the AR Cloud will be for the visible world. Whether it will be open source or controlled by a company like Google is a hotly contested issue.
Augmented reality will have a bigger impact on the market and our daily lives than virtual reality — and by a long shot. That’s the consensus of just about every informed commentator on the subject.
Despite all the hype in recent years about the potential for virtual reality in education, an emerging technology known as mixed reality has far greater promise in and beyond the classroom.
Unlike experiences in virtual reality, mixed reality interacts with the real world that surrounds us. Digital objects become part of the real world. They’re not just digital overlays, but interact with us and the surrounding environment.
If all that sounds like science fiction, a much-hyped device promises some of those features later this year. The device is by a company called Magic Leap, and it uses a pair of goggles to project what the company calls a “lightfield” in front of the user’s face to make it look like digital elements are part of the real world. The expectation is that Magic Leap will present digital objects in a much more vivid, dynamic and fluid way compared to other mixed-reality devices such as Microsoft’s HoloLens.
Now think about all the other things you wished you had learned this way and imagine a dynamic digital display that transforms your environment and even your living room or classroom into an immersive learning lab. It is learning within a highly dynamic and visual context infused with spatial audio cues reacting to your gaze, gestures, gait, voice and even your heartbeat, all referenced with your geo-location in the world. Unlike what happens with VR, where our brain is tricked into believing the world and the objects in it are real, MR recognizes and builds a map of your actual environment.
Also see:
Exploring The Potential for the Vive Focus in Education — from virtualiteach.com
On the big screen it’s become commonplace to see a 3D rendering or holographic projection of an industrial floor plan or a mechanical schematic. Casual viewers might take for granted that the technology is science fiction and many years away from reality. But today we’re going to outline where these sophisticated virtual replicas – Digital Twins – are found in the real world, here and now. Essentially, we’re talking about a responsive simulated duplicate of a physical object or system. When we first wrote about Digital Twin technology, we mainly covered industrial applications and urban infrastructure like transit and sewers. However, the full scope of their presence is much broader, so now we’re going to break it up into categories.
Digital twin refers to a digital replica of physical assets (physical twin), processes and systems that can be used for various purposes.[1] The digital representation provides both the elements and the dynamics of how an Internet of Things device operates and lives throughout its life cycle.[2]
Digital twins integrate artificial intelligence, machine learning and software analytics with data to create living digital simulation models that update and change as their physical counterparts change. A digital twin continuously learns and updates itself from multiple sources to represent its near real-time status, working condition or position. This learning system learns from itself, using sensor data that conveys various aspects of its operating condition; from human experts, such as engineers with deep and relevant industry domain knowledge; from other similar machines; from other similar fleets of machines; and from the larger systems and environment of which it may be a part. A digital twin also integrates historical data from past machine usage to factor into its digital model.
In various industrial sectors, twins are being used to optimize the operation and maintenance of physical assets, systems and manufacturing processes.[3] They are a formative technology for the Industrial Internet of Things, where physical objects can live and interact with other machines and people virtually.[4]
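To make that pattern concrete, here is a minimal, hypothetical sketch in Swift of a digital twin for a single piece of equipment: it ingests sensor readings from its physical counterpart, keeps a running model of "normal" behavior, and exposes a simple condition estimate. The type names, fields and thresholds are invented for illustration and are not tied to any vendor's digital-twin platform.

```swift
import Foundation

// Hypothetical sensor reading streamed from the physical asset.
struct SensorReading {
    let timestamp: Date
    let temperature: Double   // degrees Celsius
    let vibration: Double     // mm/s RMS
}

// A toy digital twin: a virtual model kept in sync with its physical twin.
final class PumpDigitalTwin {
    private(set) var latest: SensorReading?
    private(set) var history: [SensorReading] = []   // historical data feeds the model
    private var smoothedVibration = 0.0

    // Ingest a new reading and update the twin's internal state.
    func update(with reading: SensorReading) {
        history.append(reading)
        latest = reading
        // Exponentially weighted average: the twin "learns" its normal vibration level over time.
        smoothedVibration = 0.9 * smoothedVibration + 0.1 * reading.vibration
    }

    // A simple condition estimate a maintenance dashboard might query.
    var needsInspection: Bool {
        guard let latest = latest else { return false }
        return latest.vibration > smoothedVibration * 1.5 || latest.temperature > 80
    }
}
```

In a real deployment the update step would also draw on expert rules, data from similar machines, and fleet-level models, as the excerpt above describes.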
Walt Disney Animation Studios is set to debut its first VR short film, Cycles, this August in Vancouver, the Association for Computing Machinery announced today. The plan is for it to be a headliner at the ACM’s computer graphics conference (SIGGRAPH), joining other forms of VR, AR and MR entertainment in the conference’s designated Immersive Pavilion.
This film is a first for both Disney and its director, Jeff Gipson, who joined the animation team in 2013 to work as a lighting artist on films like Frozen, Zootopia and Moana. The objective of this film, Gipson said in the statement released by ACM, is to inspire a deep emotional connection with the story.
“We hope more and more people begin to see the emotional weight of VR films, and with Cycles in particular, we hope they will feel the emotions we aimed to convey with our story,” said Gipson.
Computing in the Camera — from blog.torch3d.com by Paul Reynolds
Mobile AR, with its ubiquitous camera, is set to transform what and how human experience designers create.
One of the points Allison [Woods, CEO, Camera IQ] made repeatedly on that call (and in this wonderful blog post of the same time period) was that the camera is going to be at the center of computing going forward, an indispensable element. Spatial computing could not exist without it. Simple, obvious, straightforward, but not earth shaking. We all heard what she had to say, but I don’t think any of us really understood just how profound or prophetic that statement turned out to be.
“[T]he camera will bring the internet and the real world into a single time and space.”
— Allison Woods, CEO, Camera IQ
The Camera As Platform — from shift.newco.co by Allison Wood
When the operating system moves to the viewfinder, the world will literally change
“Every day two billion people carry around an optical data input device — the smartphone Camera — connected to supercomputers and informed by massive amounts of data that can have nearly limitless context, position, recognition and direction to accomplish tasks.”
The bigger story, however, is how fast the enterprise segment is growing as applications as straightforward as schematics on a head-mounted monocular microdisplay are transforming manufacturing, assembly, and warehousing. Use cases abounded.
…
After traveling the country and most recently to Europe, I’ve now experienced almost every major VR/AR/MR/XR related conference out there. AWE’s exhibit area was by far the largest display of VR and AR companies to date (with the exception of CES).
Specifically, we explored the potential for how virtual reality can help create a more empathetic nurse, which, we hypothesize, will lead to increased development of nursing students’ knowledge, skills, and attitudes. We aim to integrate these virtual experiences into early program coursework, with the intent of changing nursing behavior by providing a deeper understanding of the patient’s perspective during clinical interactions.
…
In addition to these compelling student reflections and the nearly immediate change in reporting practice, survey findings show that students unanimously felt that this type of patient-perspective VR experience should be integrated and become a staple of the nursing curriculum. Seeing, hearing, and feeling these moments results in significant and memorable learning experiences compared to traditional classroom learning alone. The potential that this type of immersive experience can have in the field of nursing and beyond is only limited by the imagination and creation of other virtual experiences to explore. We look forward to continued exploration of the impact of VR on student learning and to establishing ongoing partnerships with developers.
Oculus is officially launching Oculus TV, its dedicated hub for watching flatscreen video in virtual reality, on the standalone Oculus Go headset. Oculus TV was announced at last month’s F8 conference, and it ties together a lot of existing VR video options, highlighting Oculus’ attempts to emphasize non-gaming uses of VR. The free app features a virtual home theater with what Oculus claims is the equivalent of a 180-inch TV screen. It offers access to several streaming video services, including subscription-based platforms like Showtime and free web television services like Pluto TV as well as video from Oculus’ parent company Facebook.
Lehi, UT, May 29, 2018 (GLOBE NEWSWIRE) — Today, fast-growing augmented reality startup Seek is launching Seek Studio, the world’s first mobile augmented reality studio, allowing anybody with a phone, no coding expertise required, to create their own AR experiences and publish them for the world to see. With mobile AR now made more readily available, average consumers are beginning to discover the magic that AR can bring to the palm of their hand, and Seek Studio turns everyone into a creator.
To make the process incredibly easy, Seek provides templates for users to create their first AR experiences. As an example, a user can select a photo on their phone, outline the portion of the image they want turned into a 3D object and then publish it to Seek. They will then be able to share it with their friends through popular social networks or text. A brand could additionally upload a 3D model of their product and publish it to Seek, providing an experience for their customers to easily view that content in their own home. Seek Studio will launch with 6 templates and will release new ones every few days over the coming months to constantly improve the complexity and types of experiences possible to create within the platform.
Apple unveiled its new augmented reality file format, as well as ARKit 2.0, at its annual WWDC developer conference today. Both will be available to users later this year with iOS 12.
The tech company partnered with Pixar to develop the AR file format Universal Scene Description (USDZ) to streamline the process of sharing and accessing augmented reality files. USDZ will be compatible with tools like Adobe, Autodesk, Sketchfab, PTC, and Quixel. Adobe CTO Abhay Parasnis spoke briefly on stage about how the file format will have native Adobe Creative Cloud support, and described it as the first time “you’ll be able to have what you see is what you get (WYSIWYG) editing” for AR objects.
With a starting focus on University-level education and vocational schools in sectors such as mechanical engineering, VivEdu branched out to K-12 education in 2018, boasting a comprehensive VR approach to learning science, technology, engineering, mathematics, and art for kids.
That roadmap, of course, is just beginning. Which is where the developers—and those arm’s-length iPads—come in. “They’re pushing AR onto phones to make sure they’re a winner when the headsets come around,” Miesnieks says of Apple. “You can’t wait for headsets and then quickly do 10 years’ worth of R&D on the software.”
To fully realize the potential will require a broad ecosystem. Adobe is partnering with technology leaders to standardize interaction models and file formats in the rapidly growing AR ecosystem. We’re also working with leading platform vendors, open standards efforts like usdz and glTF as well as media companies and the creative community to deliver a comprehensive AR offering. usdz is now supported by Apple, Adobe, Pixar and many others while glTF is supported by Google, Facebook, Microsoft, Adobe and other industry leaders.
There are a number of professionals who would find the ability to quickly and easily create floor plans to be extremely useful. Estate agents, interior designers and event organisers would all no doubt find such a capability to be extremely valuable. For those users, the new feature added to iStaging’s VR Maker app might be of considerable interest.
The new VR Maker feature utilises Apple’s ARKit toolset to recognise spaces, such as walls and floors, and can provide accurate measurements. By scanning each wall of a space, a floor plan can be produced quickly and easily.
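As a rough illustration of the underlying mechanism, the sketch below shows how ARKit's plane detection can report the estimated size of detected walls and floors, which is the kind of raw data a floor-plan tool could build on. It assumes an iOS app with an ARSCNView; the class name and the print output are illustrative and are not iStaging's actual implementation.

```swift
import ARKit
import SceneKit

// A minimal plane-detection sketch: ARKit anchors detected floors and walls,
// and each ARPlaneAnchor carries an estimated extent in meters.
class PlaneScanner: NSObject, ARSCNViewDelegate {

    func startScanning(in sceneView: ARSCNView) {
        sceneView.delegate = self
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]   // floors and walls
        sceneView.session.run(config)
    }

    // Called whenever ARKit adds a newly detected plane to the scene.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        let width = plane.extent.x        // meters
        let length = plane.extent.z       // meters
        let kind = plane.alignment == .vertical ? "wall" : "floor"
        print("Detected \(kind): \(width) m x \(length) m")
    }
}
```

Stitching those per-plane measurements into an actual 2D floor plan, and refining them as ARKit updates each anchor, is where the real product work lies.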
I’ve interviewed nine investors who have provided their insights on where the VR industry has come, as well as the risks and opportunities that exist in 2018 and beyond. We’ve asked them what opportunities are available in the space — and what tips they have for startups.
Augmented reality (AR) hasn’t truly permeated the mainstream consciousness yet, but the technology is swiftly being adopted by global industries. It’ll soon be unsurprising to find a pair of AR glasses strapped to a helmet sitting on the heads of service workers, and RealWear, a company at the forefront of developing these headsets, thinks it’s on the edge of something big.
…
VOICE ACTIVATION
What’s most impressive about the RealWear HMT-1Z1 is how you control it. There are no touch-sensitive gestures you need to learn — it’s all managed with voice, and better yet, there’s no need for a hotword like “Hey Google.” The headset listens for certain commands. For example, from the home screen just say “show my files” to see files downloaded to the device, and you can go back to the home screen by saying “navigate home.” When you’re looking at documents — like schematics — you can say “zoom in” or “zoom out” to change focus. It worked almost flawlessly, even in a noisy environment like the AWE show floor.
David Scowsill’s experience in the aviation industry spans over 30 years. He has worked for British Airways, American Airlines, Easy Jet, Manchester Airport, and most recently the World Travel and Tourism Council, giving him a unique perspective on how Augmented and Virtual Reality (AVR) can impact the aviation industry.
These technologies have the power to transform the entire aviation industry, providing benefits to companies and consumers. From check-in, baggage drop, ramp operations and maintenance, to pilots and flight attendants, AVR can accelerate training, improve safety, and increase efficiency.
London-based design studio Marshmallow Laser Feast is using VR to let us reconnect with nature. With headsets, you can see a forest through the eyes of different animals and experience the sensations they feel. Creative Director Ersinhan Ersin took the stage at TNW Conference last week to show us how and why they created the project, titled In the Eyes of the Animal.
Have you already taken a side when it comes to XR wearables? Whether you prefer AR glasses or VR headsets likely depends on the application you need. But wouldn’t it be great to have a device that could perform as both? As XR tech advances, we think crossovers will start popping up around the world.
A Beijing startup called AntVR recently rocketed past its Kickstarter goal for an AR/VR visor. Their product, the Mix, uses tinted lenses to toggle between real world overlay and full immersion. It’s an exciting prospect. But rather than digging into the tech (or the controversy surrounding their name, their marketing, and a certain Marvel character) we’re looking at what this means for how XR devices are developed and sold.
Google is bringing AR tech to its Expeditions app with a new update going live today. Last year, the company introduced its Google Expeditions AR Pioneer Program, which brought the app into classrooms across the country; with this launch the functionality is available to all.
Expeditions will have more than 100 AR tours in addition to the 800 VR tours already available. Examples include experiences that let users explore Leonardo Da Vinci’s inventions and ones that let you interact with the human skeletal system.
At four recent VR conferences and events there was a palpable sense that despite new home VR devices getting the majority of marketing and media attention this year, the immediate promise and momentum is in the location-based VR (LBVR) attractions industry. The VR Arcade Conference (April 29th and 30th), VRLA (May 4th and 5th), the Digital Entertainment Group’s May meeting (May 1), and FoIL (Future of Immersive Leisure, May 16th and 17th) all highlighted a topic that suddenly no one can stop talking about: location-based VR (LBVR). With hungry landlords giving great deals for empty retail locations, VRcades, which are inexpensive to open (like Internet Cafes), are popping up all over the country. As a result, VRcade royalties for developers are on the rise, so they are shifting their attention accordingly to shorter experiences optimized for LBVR, which is much less expensive than building a VR app for the home.
The world is about to be painted with data. Every place. Every person. Every thing. In the near term this invisible digital layer will be revealed by the camera in your phone, but in the long term it will be incorporated into a wearable device, likely a head-mounted display (HMD) integrating phone, audio, and AI assistants. Users will control the system with a combination of voice, gesture and ring controller. Workers in factories use monocular displays to do this now, but it’s going to be quite some time before this benefits consumers. While this coming augmentation of man represents an evolutionary turning point, its adoption will resemble that of the personal computer, which took at least fifteen years. Mobile AR, on the other hand, is here now, and in a billion Android and Apple smartphones, which are about to get a lot better. Thanks to AR, we can start building the world’s digital layer for the smartphone, right now, without waiting for HMDs to unlock the benefits of an AR-enabled world.
12 hot augmented reality ideas for your business — from information-age.com
Augmented reality is one of the most exciting technologies to have made its way into the mass market in recent years.
Excerpt:
In this article we will tell you about ways to use this technology in a mobile app beyond gaming, and give you some augmented reality business ideas.
The new AR features combine Google’s existing Street View and Maps data with a live feed from your phone’s camera to overlay walking directions on top of the real world and help you figure out which way you need to go.
The VR travel industry may be in its infancy, but if you expect to see baby steps leading to market adoption, think again. Digital travel sales are expected to reach $198 billion this year, with virtual reality travel apps and VR tours capturing a good share of market revenue.
Of course, this should come as no surprise. Consumers increasingly turn to digital media when planning aspects of their lives, from recreational activities to retirement. Because VR has the power to engage travelers like no other technology can, it is a natural step in the evolution of the travel industry. It is also likely to disrupt travel planning as we know it.
In this article, we will explore VR travel technology, and what it means for business in 2018.
HP Inc. is teaming up with DiSTI to create VR training programs for enterprise customers. DiSTI is a platform for user interface software and custom 3D training solutions. The companies are partnering to create maintenance and operations training in VR for vehicle, aircraft and industrial equipment systems. DiSTI’s new VE Studio software lets customers develop their own virtual training applications or have DiSTI and HP professional services teams assist in designing and building the program. — TECHRADAR
HP Inc. today announced an alliance with the DiSTI Corporation, a leading global provider of VR and advanced human machine interface development solutions, to address the growing demand for high-impact, cost-effective VR training.
The two companies will work together to develop unique VR training solutions for enterprise customers, with a specific focus on maintenance and operations training for complex systems such as vehicle, aircraft and industrial equipment.
It’s a different approach to storytelling: Just as standard video is a step up from photography in terms of immersiveness, 360 video and VR amp this up considerably further. Controlling what’s in the frame and editing to home in on the elements of the picture that we’d like the viewer to focus on is somewhat ‘easy’ with photography. With moving pictures (video), this is harder, but with the right use of the camera it’s still easy to direct the viewer’s attention to the elements of the narrative we’d like to highlight.

Since 360 and VR allow the user to essentially take control of the camera, content creators have a lot less control in terms of capturing attention. This has its upsides too, though… 360 video and particularly VR provide for a very rich sensory environment that standard video just can’t match.
Virtual reality (VR) has often been described as the empathy machine; however, it has many use cases. When it comes to memory and retention, it looks like VR is useful not only for simulation but for education as well. Immersive VR Education teamed up with HTC Vive and Windsor Forest Colleges Group to create a memorable experience of virtual teaching.
On April 25th, ten students from Windsor Forest Colleges Group in the UK put on HTC Vive headsets and were guided by David Whelan, CEO & Founder of Immersive VR Education, and Mike Armstrong, Senior/Lead Developer of Immersive VR Education, using the free VR social education and presentation platform ENGAGE. ENGAGE allows users to hold meetings, classes, private lessons and presentations. Users can record and create their own lessons and presentations, as well as interact with virtual objects.
The $200 Oculus Go is the most accessible VR headset today.
Up until now, one of the biggest barriers to entry for VR has been price. Headset adoption has taken a conservative growth path, mostly due in part to high prices of PC-required systems or just requiring consumers to own a specific line of VR compatible phones to pair with mobile headsets.
But now the Oculus Go is finally here and it’s a big deal, especially for the millions of iPhone users out there who up until today have had limited options to get into VR.
Starting today, the Oculus Go standalone VR headset is available for purchase for $199. Available for sale on Oculus.com in 23 countries, you can also pick up one online from Amazon or in Best Buy Stores in the U.S. The Oculus companion app used for initial setup is available for both iPhone or Android devices.