To higher ed: When the race track is going 180mph, you can’t walk or jog onto the track. [Christian]

From DSC:
When the race track is going 180mph, you can’t walk or jog onto the track.  What do I mean by that? 

Consider this quote from an article that Jeanne Meister wrote at Forbes entitled, “The Future of Work: Three New HR Roles in the Age of Artificial Intelligence:”*

This emphasis on learning new skills in the age of AI is reinforced by the most recent report on the future of work from McKinsey, which suggests that as many as 375 million workers around the world may need to switch occupational categories and learn new skills because approximately 60% of jobs will have at least one-third of their work activities able to be automated.

Go scan the job openings and you will likely see many that have to do with technology, and increasingly, with emerging technologies such as artificial intelligence, deep learning, machine learning, virtual reality, augmented reality, mixed reality, big data, cloud-based services, robotics, automation, bots, algorithm development, blockchain, and more. 

 

From Robert Half’s 2019 Technology Salary Guide

How many of us have those kinds of skills? Did we get that training at the community colleges, colleges, and universities we attended? Highly unlikely — even if you graduated from one of those institutions only 5-10 years ago. Many of those institutions are moving at the pace of a leisurely walk, some are jogging, and even fewer are sprinting. But all of them are now being asked to enter a race track that’s moving at 180mph. Higher ed — and society at large — is not used to moving at this pace. 

This is why I think that higher education and its regional accrediting organizations will either need to up their game hugely — going through a paradigm shift in their thinking, programming, curricula, and level of responsiveness — or watch while alternatives to traditional higher education increasingly attract learners away from them.

This is also why I think we’ll see an online-based, next-generation learning platform emerge. It will be much more nimble — able to offer up-to-the-minute, in-demand skills and competencies. 

 

 

The graphic below is from:
Jobs lost, jobs gained: What the future of work will mean for jobs, skills, and wages


* Three New HR Roles To Create Compelling Employee Experiences
These new HR roles include:

  1. IBM: Vice President, Data, AI & Offering Strategy, HR
  2. Kraft Heinz: Senior Vice President, Global HR, Performance and IT
  3. SunTrust: Senior Vice President, Employee Wellbeing & Benefits

What do these three roles have in common? All have been created in the last three years and acknowledge the growing importance of a company’s commitment to create a compelling employee experience by using data, research, and predictive analytics to better serve the needs of employees. In each case, the employee assuming the new role also brought a new set of skills and capabilities into HR. And importantly, the new roles created in HR address a common vision: create a compelling employee experience that mirrors a company’s customer experience.

 


 

An excerpt from McKinsey Global Institute | Notes from the Frontier | Modeling the Impact of AI on the World Economy 

Workers.
A widening gap may also unfold at the level of individual workers. Demand for jobs could shift away from repetitive tasks toward those that are socially and cognitively driven and others that involve activities that are hard to automate and require more digital skills.[12] Job profiles characterized by repetitive tasks and activities that require low digital skills may experience the largest decline as a share of total employment, from some 40 percent to near 30 percent by 2030. The largest gain in share may be in nonrepetitive activities and those that require high digital skills, rising from some 40 percent to more than 50 percent. These shifts in employment would have an impact on wages. We simulate that around 13 percent of the total wage bill could shift to categories requiring nonrepetitive and high digital skills, where incomes could rise, while workers in the repetitive and low digital skills categories may potentially experience stagnation or even a cut in their wages. The share of the total wage bill of the latter group could decline from 33 to 20 percent.[13] Direct consequences of this widening gap in employment and wages would be an intensifying war for people, particularly those skilled in developing and utilizing AI tools, and structural excess supply for a still relatively high portion of people lacking the digital and cognitive skills necessary to work with machines.
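From DSC: those simulated share shifts are easier to see laid out side by side. The toy sketch below only restates the figures quoted in the excerpt above — the 2030 values are McKinsey’s simulated estimates, not my own data:

```python
# Toy restatement (not McKinsey's model) of the employment- and wage-share
# shifts quoted in the excerpt above. All numbers come from the quote itself.

# (today's share, simulated 2030 share) of total employment
employment_share = {
    "repetitive / low digital skills":     (0.40, 0.30),
    "nonrepetitive / high digital skills": (0.40, 0.50),
}

# (today's share, simulated 2030 share) of the total wage bill,
# quoted only for the repetitive / low-digital group
wage_bill_share = {"repetitive / low digital skills": (0.33, 0.20)}

for category, (now, later) in employment_share.items():
    print(f"{category}: employment share {now:.0%} -> {later:.0%}")

# The "around 13 percent of the total wage bill could shift" figure matches
# the decline in the repetitive / low-digital group's wage-bill share.
now, later = wage_bill_share["repetitive / low digital skills"]
wage_bill_shift = now - later
print(f"wage-bill share shifting away from repetitive / low-digital work: "
      f"{wage_bill_shift:.0%}")
```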

 


 

 
For museums, augmented reality is the next frontier — from wired.com by Arielle Pardes

Excerpt:

Mae Jemison, the first woman of color to go into space, stood in the center of the room and prepared to become digital. Around her, 106 cameras captured her image in 3-D, which would later render her as a life-sized hologram when viewed through a HoloLens headset.

Jemison was recording what would become the introduction for a new exhibit at the Intrepid Sea, Air, and Space Museum, which opens tomorrow as part of the Smithsonian’s annual Museum Day. In the exhibit, visitors will wear HoloLens headsets and watch Jemison materialize before their eyes, taking them on a tour of the Space Shuttle Enterprise—and through space history. They’re invited to explore artifacts both physical (like the Enterprise) and digital (like a galaxy of AR stars) while Jemison introduces women throughout history who have made important contributions to space exploration.

Interactive museum exhibits like this are becoming more common as augmented reality tech becomes cheaper, lighter, and easier to create.

 

 

Oculus will livestream its 5th Connect Conference on Oculus Venues — from vrscout.com by Kyle Melnick

Excerpt (emphasis DSC):

Using either an Oculus Go standalone device or a mobile Gear VR headset, users will be able to login to the Oculus Venues app and join other users for an immersive live stream of various developer keynotes and adrenaline-pumping esports competitions.

 

From DSC:
What are the ramifications of this for the future of webinars, teaching and learning, online learning, MOOCs and more…?

 

 

 

10 new AR features in iOS 12 for iPhone & iPad — from mobile-ar.reality.news by Justin Meyers

Excerpt:

Apple’s iOS 12 has finally landed. The big update appeared for everyone on Monday, Sept. 17, and hiding within are some pretty amazing augmented reality upgrades for iPhones, iPads, and iPod touches. We’ve been playing with them ever since the iOS 12 beta launched in June, and here are the things we learned that you’ll want to know about.

For now, here’s everything AR-related that Apple has included in iOS 12. There are some new features aimed to please AR fanatics as well as hook those new to AR into finally getting with the program. But all of the new AR features rely on ARKit 2.0, the latest version of Apple’s augmented reality framework for iOS.

 

 

Berkeley College Faculty Test VR for Learning — from campustechnology.com by Dian Schaffhauser

Excerpt:

In a pilot program at Berkeley College, members of a Virtual Reality Faculty Interest Group tested the use of virtual reality to immerse students in a variety of learning experiences. During winter 2018, seven different instructors in nearly as many disciplines used inexpensive Google Cardboard headsets along with apps on smartphones to virtually place students in North Korea, a taxicab and other environments as part of their classwork.

Participants used free mobile applications such as Within, the New York Times VR, Discovery VR, Jaunt VR and YouTube VR. Their courses included critical writing, international business, business essentials, medical terminology, international banking, public speaking and crisis management.

 

 

 

 

Just released from the Future Today Institute:
The 2019 Journalism, Media and Tech Trends Report 

Launched at the Online News Association conference in Austin, Texas, the Future Today Institute’s new industry report for the future of journalism, media and technology follows the same approach as our popular annual mega trends report, now in its 11th year with more than 7.5 million cumulative views.

Key findings:

  • Blockchain emerged as a significant driver of change in 2019 and beyond. The blockchain ecosystem is still maturing; however, we’ve now seen enough development, adoption and consolidation that it warrants its own full section. There are numerous opportunities for media and journalism organizations. For that reason, we’ve included an explainer, a list of companies to watch, and a cross-indexed list of trends to complement blockchain technology. We’ve also included detailed scenarios in this section.

 

  • Mixed Reality is entering the mainstream.
    The mixed reality ecosystem has grown enough that we now see concrete opportunities on the horizon for media organizations. From immersive video to wearable technology, news and entertainment media organizations should begin mapping their strategy for new kinds of devices and platforms.

 

  • Artificial Intelligence is not a tech trend—it is the third era of computing. And it isn’t just for story generation. You will see the AI ecosystem represented in many of the trends in this report, and it is vitally important that all decision-makers and teams familiarize themselves with current and emerging AI trends.

 

In addition to the 108 trends identified, the report also includes several guides for journalists, including a Blockchain Primer, an AI Primer, a mixed reality explainer, hacker terms and lingo, and a guide to policy changes on the horizon.

The report also includes guidance on how everyone working within journalism and media can take action on tech trends and how to evaluate a trend’s impact on their local marketplaces.

Download and read the full report here.   |   View it on Slideshare.

Courses in VR and AR will now be available thanks to Amazon Sumerian partnership — from vrfocus.com by Rebecca Hills-Duty
The Royal Melbourne Institute of Technology (RMIT) has announced a partnership with Amazon Web Services.

Excerpt:

Last year Amazon announced a new feature for its Amazon Web Services called Amazon Sumerian, a platform that would allow anyone to create full-featured virtual reality experiences. The Royal Melbourne Institute of Technology (RMIT) has now announced that it will be offering short courses in artificial intelligence (AI), augmented reality (AR) and VR thanks to a partnership with Amazon Web Services.

The new partnership between RMIT and Amazon Web Services was announced at the AWS Public Sector Summit in Canberra, Australia, on Wednesday. The new courses will use the Amazon Sumerian platform.

The newly launched courses include Developing AI Strategy, Developing AR and VR Strategy, and Developing AR and VR Applications. All of these have been adapted from the AWS Educate program, which was created in response to the changing nature of the workplace and the growing relevance of immersive technology.

 

“My team’s mission is to build a community of lifelong learners, successfully navigating the world of work… Yes, sometimes your degree is the right solution education-wise for a person, but throughout our lives — certainly I know in my digital career — we constantly need to be updating our skills, understanding new, emerging technology, and talking with experts.”

 

 

 

 



 

Everything you need to know about those new iPhones — from wired.com by Arielle Pardes

Excerpt:

Actually, make that three new iPhones. Apple followed last year’s iPhone X with the iPhone Xs, iPhone Xs Max, and iPhone Xr. It also spent some time showing off the Apple Watch Series 4, its most powerful wearable yet. Missed the event? Catch our commentary on WIRED’s liveblog, or read on for everything you need to know about today’s big Apple event.

 

 

Apple’s latest iPhones are packed with AI smarts — from wired.com by Tom Simonite

Excerpt:

At a glance the three new iPhones unveiled next to Apple’s glassy circular headquarters Wednesday look much like last year’s iPhone X. Inside, the devices’ computational guts got an invisible but more significant upgrade.

Apple’s phones come with new chip technology with a focus on helping the devices understand the world around them using artificial intelligence algorithms. The company says the improvements allow the new devices to offer slicker camera effects and augmented reality experiences.

For the first time, non-Apple developers will be allowed to run their own algorithms on Apple’s AI-specific hardware.

 

 

Apple Watch 4 adds ECG, EKG, and more heart-monitoring capabilities — from wired.com by Lauren Goode

Excerpt:

The new Apple Watch Series 4, revealed by Apple earlier today, underscores that some of the watch’s most important features are its health and fitness-tracking functions. The new watch is one of the first over-the-counter devices in the US to offer electrocardiogram, or ECG, readings. On top of that, the Apple Watch has received FDA clearance—both for the ECG feature and another new feature that detects atrial fibrillation.

 

 

 

 

 

Why one London university is now offering degrees in VR — from techrepublic.com by Conner Forrest
London College of Communication, UAL is launching a master’s degree in virtual reality for the 2018-19 academic year.

The big takeaways for tech leaders:

  • London College of Communication, UAL is launching a Master of Arts in VR degree for the 2018-19 academic year.
  • The demand for VR professionals is growing in the film and media industries, where these technologies are being used most frequently.

 

From DSC:
Collaboration/videoconferencing tools like Webex, Blackboard Collaborate, Zoom, Google Hangouts, Skype, etc. are being used in a variety of scenarios. Those platforms are well established. It will be interesting to see how VR might play into web-based collaboration, training, and learning.

 

 

San Diego’s Nanome Inc. releases collaborative VR-STEM software for free — from vrscout.com by Becca Loux

Excerpt:

The first collaborative VR molecular modeling application was released August 29 to encourage hands-on chemistry experimentation.

The open-source tool is free for download now on Oculus and Steam.

Nanome Inc., the San Diego-based start-up that built the intuitive application, comprises UCSD professors and researchers, web developers and top-level pharmaceutical executives.

 

“With our tool, anyone can reach out and experience science at the nanoscale as if it is right in front of them. At Nanome, we are bringing the craftsmanship and natural intuition from interacting with these nanoscale structures at room scale to everyone,” McCloskey said.

 


 

 

10 ways VR will change life in the near future — from forbes.com

Excerpts:

  1. Virtual shops
  2. Real estate
  3. Dangerous jobs
  4. Health care industry
  5. Training to create VR content
  6. Education
  7. Emergency response
  8. Distraction simulation
  9. New hire training
  10. Exercise

 

From DSC:
While VR will have its place — especially for times when you need to completely immerse yourself in another environment — I think AR and MR will be much larger and have a greater variety of applications. For example, I could see how instructions for putting something together could use AR and/or MR to assist with that process. The system could highlight the next part that I’m looking for, then highlight where it goes — and, if requested, show me a clip of how it fits into what I’m trying to put together.

 

How MR turns firstline workers into change agents — from virtualrealitypop.com by Charlie Fink
Mixed Reality, a new dimension of work — from Microsoft and Harvard Business Review

Excerpts:

“…workers with mixed-reality solutions that enable remote assistance, spatial planning, environmentally contextual data, and much more,” Bardeen told me. With the HoloLens, Firstline Workers conduct their usual day-to-day activities with the added benefit of a heads-up, hands-free display that gives them immediate access to valuable, contextual information. Microsoft says speech services like Cortana will be critical for control, along with gesture, according to the unique needs of each situation.

 

Expect new worker roles. What constitutes an “information worker” could change because mixed reality will allow everyone to be involved in the collection and use of information. Many more types of information will become available to any worker in a compelling, easy-to-understand way. 

 

 

Let’s Speak: VR language meetups — from account.altvr.com


Adobe Announces the 2019 Release of Adobe Captivate, Introducing Virtual Reality for eLearning Design — from theblog.adobe.com

Excerpt:

  • Immersive learning with VR experiences: Design learning scenarios that your learners can experience in Virtual Reality using VR headsets. Import 360° media assets and add hotspots, quizzes and other interactive elements to engage your learners with near real-life scenarios
  • Interactive videos: Liven up demos and training videos by making them interactive with the new Adobe Captivate. Create your own or bring in existing YouTube videos, add questions at specific points and conduct knowledge checks to aid learner remediation
  • Fluid Boxes 2.0: Explore the building blocks of Smart eLearning design with intelligent containers that use white space optimally. Objects placed in Fluid Boxes get aligned automatically so that learners always get a fully responsive experience regardless of their device or browser.
  • 360° learning experiences: Augment the learning landscape with 360° images and videos and convert them into interactive eLearning material with customizable overlay items such as information blurbs, audio content & quizzes.

 

 

Blippar unveils indoor visual positioning system to anchor AR — from martechtoday.com by Barry Levine
Employing machine vision to recognize mapped objects, the company says it can determine which way a user is looking and can calculate positioning down to a centimeter.

A Blippar visualization of AR using its new indoor visual positioning system

 

The Storyteller’s Guide to the Virtual Reality Audience — from medium.com by Katy Newton

Excerpt:

To even scratch the surface of these questions, we need to better understand the audience’s experience in VR — not just their experience of the technology, but the way that they understand story and their role within it.

 

 

Hospital introducing HoloLens augmented reality into the operating room — from medgadget.com

Excerpt:

HoloLens technology is being paired with Microsoft’s Surface Hub, a kind of digital whiteboard. The idea is that the surgical team can gather around a Surface Hub to review patient information, discuss the details of a procedure, and select what information should be readily accessible during surgery. During the procedure, a surgeon wearing a HoloLens would be able to review a CT or MRI scan, access other data in the electronic medical records, and manipulate these so as to get a clear picture of what is being worked on and what needs to be done.

 

 

Raleigh Fire Department invests in virtual reality to enrich training — from vrfocus.com by Nikholai Koolon
New system allows department personnel to learn new skills through immersive experiences.

Excerpt:

The VR solution allows emergency medical services (EMS) personnel to dive into a rich and detailed environment in which they can pinpoint portions of the body to dissect. This lets them see each part of the body in great detail and view it from any angle. The goal is to let users gain the experience to diagnose injuries from a variety of vantage points, all while working within a virtual environment capable of displaying countless scenarios.

 

 

For another emerging technology, see:

Someday this tiny spider bot could perform surgery inside your body — from fastcompany.com by Jesus Diaz
The experimental robots could also fix airplane engines and find disaster victims.

Excerpt:

A team of Harvard University researchers recently achieved a major breakthrough in robotics, engineering a tiny spider robot using tech that could one day work inside your body to repair tissues or destroy tumors. Their work could not only change medicine–by eliminating invasive surgeries–but could also have an impact on everything from how industrial machines are maintained to how disaster victims are rescued.

Until now, most advanced, small-scale robots followed a certain model: They tend to be built at the centimeter scale and have only one degree of freedom, which means they can only perform one movement. Not so with this new ‘bot, developed by scientists at Harvard’s Wyss Institute for Biologically Inspired Engineering, the John A. Paulson School of Engineering and Applied Sciences, and Boston University. It’s built at the millimeter scale, and because it’s made of flexible materials–easily moved by pneumatic and hydraulic power–the critter has an unprecedented 18 degrees of freedom.

 


Plus some items from a few weeks ago


 

After almost a decade and billions in outside investment, Magic Leap’s first product is finally on sale for $2,295. Here’s what it’s like. — from

Excerpts (emphasis DSC):

I liked that it gave a new perspective to the video clip I’d watched: It threw the actual game up on the wall alongside the kind of information a basketball fan would want, including 3-D renderings and stats. Today, you might turn to your phone for that information. With Magic Leap, you wouldn’t have to.

Abovitz also said that intelligent assistants will play a big role in Magic Leap’s future. I didn’t get to test one, but Abovitz says he’s working with a team in Los Angeles that’s developing high-definition people that will appear to Magic Leap users and assist with tasks. Think Siri, Alexa or Google Assistant, but instead of speaking to your phone, you’d be speaking to a realistic-looking human through Magic Leap. Or you might be speaking to an avatar of someone real.

“You might need a doctor who can come to you,” Abovitz said. “AI that appears in front of you can give you eye contact and empathy.”

 

And I loved the idea of being able to place a digital TV screen anywhere I wanted.

 

 

Magic Leap One Available For Purchase, Starting At $2,295 — from vrscout.com by Kyle Melnick

Excerpt:

In December of last year, U.S. startup Magic Leap unveiled its long-awaited mixed reality headset, a secretive device five years and $2.44B USD in the making.

This morning that same headset, now referred to as the Magic Leap One Creator Edition, became available for purchase in the U.S. On sale to creators at a hefty starting price of $2,295, the spatial computing device utilizes synthetic lightfields to capture natural lightwaves and superimpose interactive 3D content over the real world.

 

 

 

Magic Leap One First Hands-On Impressions for HoloLens Developers — from magic-leap.reality.news

Excerpt:

After spending about an hour with the headset running through set up and poking around its UI and a couple of the launch day apps, I thought it would be helpful to share a quick list of some of my first impressions as someone who’s spent a lot of time with a HoloLens over the past couple years and try to start answering many of the burning questions I’ve had about the device.

 

 

World Campus researches effectiveness of VR headsets and video in online classes — from news.psu.edu

Excerpt:

UNIVERSITY PARK, Pa. — Penn State instructional designers are researching whether using virtual reality and 360-degree video can help students in online classes learn more effectively.

Designers worked with professors in the College of Nursing to incorporate 360-degree video into Nursing 352, a class on Advanced Health Assessment. Students in the class, offered online through Penn State World Campus, were offered free VR headsets to use with their smartphones to create a more immersive experience while watching the video, which shows safety and health hazards in a patient’s home.

Bill Egan, the lead designer for the Penn State World Campus RN to BSN nursing program, said students in the class were surveyed as part of a study approved by the Institutional Review Board and overwhelmingly said that they enjoyed the videos and thought they provided educational value. Eighty percent of the students said they would like to see more immersive content such as 360-degree videos in their online courses, he said.

 

 

7 Practical Problems with VR for eLearning — from learnupon.com

Excerpt:

In this post, we run through some practical stumbling blocks that prevent VR training from being feasible for most.

There are quite a number of practical considerations which prevent VR from totally overhauling the corporate training world. Some are obvious, whilst others only become apparent after using the technology a number of times. It’s important to be made aware of these limitations so that a large investment isn’t made in tech that isn’t really practical for corporate training.

 

Augmented reality – the next big thing for HR? — from hrdconnect.com
Augmented reality (AR) could have a huge impact on HR, transforming long-established processes into something engaging and exciting. What will this look like? How can we fit this into our everyday working lives?

Excerpt (emphasis DSC):

AR also has the potential to revolutionise our work lives, changing the way we think about office spaces and equipment forever.

Most of us still commute to an office every day, which can be a time-consuming and stressful experience. AR has the potential to turn any space into your own customisable workspace, complete with digital notes, folders and files – even a digital photo of your loved ones. This would give you access to all the information and tools that you would typically find in an office, but wherever and whenever you need them.

And instead of working on a flat, stationary, two-dimensional screen, your workspace would be a customisable three-dimensional space, where objects and information are manipulated with gestures rather than hardware. All you would need is an AR headset.

AR could also transform the way we advertise brands and share information. Imagine if your organisation had an AR stand at a conference – how engaging would that be for potential customers? How much more interesting and fun would meetings be if we used AR to present information instead of slides on a projector?

AR could transform the on-boarding experience into something fun and interactive – imagine taking an AR tour of your office, where information about key places, company history or your new colleagues pops into view as you go from place to place. 

 

 

RETINA Are Bringing Augmented Reality To Air Traffic Control Towers — from vrfocus.com by Nikholai Koolonavi

Excerpt:

A new project is aiming to make it easier for staff in airport control towers to visualize information to help make their job easier by leveraging augmented reality (AR) technology. The project, dubbed RETINA, is looking to modernise Europe’s air traffic management for safer, smarter and even smoother air travel.

 

 

 

2018 NMC Horizon Report: The trends, challenges, and developments likely to influence ed tech

2018 NMC Horizon Report — from library.educause.edu

Excerpt:

What is on the five-year horizon for higher education institutions? Which trends and technology developments will drive educational change? What are the critical challenges and how can we strategize solutions? These questions regarding technology adoption and educational change steered the discussions of 71 experts to produce the NMC Horizon Report: 2018 Higher Education Edition brought to you by EDUCAUSE. This Horizon Report series charts the five-year impact of innovative practices and technologies for higher education across the globe. With more than 16 years of research and publications, the Horizon Project can be regarded as one of education’s longest-running explorations of emerging technology trends and uptake.

Six key trends, six significant challenges, and six developments in educational technology profiled in this higher education report are likely to impact teaching, learning, and creative inquiry in higher education. The three sections of this report constitute a reference and technology planning guide for educators, higher education leaders, administrators, policymakers, and technologists.

 


 

Also see:

 

 


 

Augmented and virtual reality mean business: Everything you need to know — from zdnet by Greg Nichols
An executive guide to the technology and market drivers behind the hype in AR, VR, and MR.

Excerpt:

Overhyped by some, drastically underestimated by others, few emerging technologies have generated the digital ink like virtual reality (VR), augmented reality (AR), and mixed reality (MR).  Still lumbering through the novelty phase and roller coaster-like hype cycles, the technologies are only just beginning to show signs of real world usefulness with a new generation of hardware and software applications aimed at the enterprise and at end users like you. On the line is what could grow to be a $108 billion AR/VR industry as soon as 2021. Here’s what you need to know.

 

The reason is that VR environments by nature demand a user’s full attention, which makes the technology poorly suited to real-life social interaction outside a digital world. AR, on the other hand, has the potential to act as an on-call co-pilot to everyday life, seamlessly integrating into daily real-world interactions. This will become increasingly true with the development of the AR Cloud.

The AR Cloud
Described by some as the world’s digital twin, the AR Cloud is essentially a digital copy of the real world that can be accessed by any user at any time.

For example, it won’t be long before whatever device I have on me at a given time (a smartphone or wearable, for example) will be equipped to tell me all I need to know about a building just by training a camera at it (GPS is operating as a poor-man’s AR Cloud at the moment).

What the internet is for textual information, the AR Cloud will be for the visible world. Whether it will be open source or controlled by a company like Google is a hotly contested issue.

 

Augmented reality will have a bigger impact on the market and our daily lives than virtual reality — and by a long shot. That’s the consensus of just about every informed commentator on the subject.

 

 

 

Mixed reality will transform learning (and Magic Leap joins act one) — from edsurge.com by Maya Georgieva

Excerpt:

Despite all the hype in recent years about the potential for virtual reality in education, an emerging technology known as mixed reality has far greater promise in and beyond the classroom.

Unlike experiences in virtual reality, mixed reality interacts with the real world that surrounds us. Digital objects become part of the real world. They’re not just digital overlays, but interact with us and the surrounding environment.

If all that sounds like science fiction, a much-hyped device promises some of those features later this year. The device is by a company called Magic Leap, and it uses a pair of goggles to project what the company calls a “lightfield” in front of the user’s face to make it look like digital elements are part of the real world. The expectation is that Magic Leap will bring digital objects in a much more vivid, dynamic and fluid way compared to other mixed-reality devices such as Microsoft’s Hololens.

 

Now think about all the other things you wished you had learned this way and imagine a dynamic digital display that transforms your environment and even your living room or classroom into an immersive learning lab. It is learning within a highly dynamic and visual context infused with spatial audio cues reacting to your gaze, gestures, gait, voice and even your heartbeat, all referenced with your geo-location in the world. Unlike what happens with VR, where our brain is tricked into believing the world and the objects in it are real, MR recognizes and builds a map of your actual environment.

Also see:

virtualiteach.com
Exploring The Potential for the Vive Focus in Education

Digital Twins Doing Real World Work — from stambol.com

Excerpt:

On the big screen it’s become commonplace to see a 3D rendering or holographic projection of an industrial floor plan or a mechanical schematic. Casual viewers might take for granted that the technology is science fiction and many years away from reality. But today we’re going to outline where these sophisticated virtual replicas – Digital Twins – are found in the real world, here and now. Essentially, we’re talking about a responsive simulated duplicate of a physical object or system. When we first wrote about Digital Twin technology, we mainly covered industrial applications and urban infrastructure like transit and sewers. However, the full scope of their presence is much broader, so now we’re going to break it up into categories.

 

Digital twin — from Wikipedia

Digital twin refers to a digital replica of physical assets (physical twin), processes and systems that can be used for various purposes.[1] The digital representation provides both the elements and the dynamics of how an Internet of Things device operates and lives throughout its life cycle.[2]

Digital twins integrate artificial intelligence, machine learning and software analytics with data to create living digital simulation models that update and change as their physical counterparts change. A digital twin continuously learns and updates itself from multiple sources to represent its near real-time status, working condition or position. This learning system learns from itself, using sensor data that conveys various aspects of its operating condition; from human experts, such as engineers with deep and relevant industry domain knowledge; from other similar machines; from other similar fleets of machines; and from the larger systems and environment of which it may be a part. A digital twin also integrates historical data from past machine usage to factor into its digital model.

In various industrial sectors, twins are being used to optimize the operation and maintenance of physical assets, systems and manufacturing processes.[3] They are a formative technology for the Industrial Internet of Things, where physical objects can live and interact with other machines and people virtually.[4]
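The pattern described above — a digital replica that continuously ingests sensor readings, mirrors the asset’s near real-time state, and keeps historical data for its model — can be sketched in a few lines. This is a hedged illustration only: the class name, the temperature threshold, and the asset ID are all made up for the example, and the simple rule stands in for the analytics/ML layer a real twin would use.

```python
import statistics

class DigitalTwin:
    """Mirror of one physical asset, updated from its sensor stream."""

    def __init__(self, asset_id, warn_temp_c=90.0):
        self.asset_id = asset_id
        self.warn_temp_c = warn_temp_c  # illustrative maintenance threshold
        self.state = {}    # near real-time view of the physical twin
        self.history = []  # past readings, kept for the learning model

    def ingest(self, reading):
        """Apply one sensor reading (a dict like {'temp_c': 71.2})."""
        self.history.append(dict(reading))
        self.state.update(reading)

    def condition(self):
        """Simple rule standing in for the analytics/ML layer."""
        temp = self.state.get("temp_c")
        if temp is None:
            return "unknown"
        return "needs maintenance" if temp >= self.warn_temp_c else "healthy"

    def mean_temp(self):
        """Historical aggregate of the kind a real twin would feed its model."""
        temps = [r["temp_c"] for r in self.history if "temp_c" in r]
        return statistics.mean(temps) if temps else None

twin = DigitalTwin("pump-17")
for t in (70.5, 72.0, 93.1):
    twin.ingest({"temp_c": t})
print(twin.condition())  # last reading crossed the threshold
```

An industrial twin would also pull in data from similar machines and fleets, as the excerpt notes, but the core loop — ingest, update state, retain history, assess condition — is the same.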

 

 

Disney to debut its first VR short next month — from techcrunch.com by Sarah Wells

Excerpt:

Walt Disney Animation Studio is set to debut its first VR short film, Cycles, this August in Vancouver, the Association for Computing Machinery announced today. The plan is for it to be a headliner at the ACM’s computer graphics conference (SIGGRAPH), joining other forms of VR, AR and MR entertainment in the conference’s designated Immersive Pavilion.

This film is a first for both Disney and its director, Jeff Gipson, who joined the animation team in 2013 to work as a lighting artist on films like Frozen, Zootopia and Moana. The objective of this film, Gipson said in the statement released by ACM, is to inspire a deep emotional connection with the story.

“We hope more and more people begin to see the emotional weight of VR films, and with Cycles in particular, we hope they will feel the emotions we aimed to convey with our story,” said Gipson.


© 2018 | Daniel Christian