Want to learn a new language? With this AR app, just point & tap — from fastcodesign.com by Mark Wilson
A new demo shows how augmented reality could redefine apps as we know them.

Excerpt:

There’s a new app gold rush. After Facebook and Apple both released augmented reality development kits in recent months, developers are demonstrating just what they can do with these new technologies. It’s a race to invent the future first.

To get a taste of how quickly and dramatically our smartphone apps are about to change, just take a look at this little demo by front-end engineer Frances Ng, featured on Prosthetic Knowledge. Just by aiming her iPhone at various objects and tapping, she can both identify items like lamps and laptops and translate their names into a number of different languages. Bye bye, multilingual dictionaries and Google Translate. Hello, “what the heck is the Korean word for that?”

Also see:

Apple ARKit & Machine Learning Come Together Making Smarter Augmented Reality — from next.reality.news by Jason Odom

Excerpt:

The world is a massive place, especially when you consider the field of view of your smartglasses or mobile device. To fulfill the potential promise of augmented reality, we must find a way to fill that view with useful and contextual information. Of course, the job of creating contextual, valuable information to fill the massive space that is planet Earth is a daunting task to take on. Machine learning seems to be one solution many are moving toward.

Tokyo, Japan-based web developer Frances Ng released a video on Twitter showing off her first experiments with Apple’s ARKit and CoreML, Apple’s machine learning system. In the video, her mobile device recognizes a few objects around her room and then displays the names of the identified objects.
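
For readers who want to see roughly what a demo like this involves, below is a minimal Swift sketch of the tap-to-identify-and-translate loop. It assumes a generic image-classification Core ML model bundled with the app; the ObjectClassifier class, the tiny Korean lookup table and the console print-out are illustrative placeholders, not code from Ng’s project. On each tap, the current ARKit camera frame is handed to Vision, the top classification label comes back, and that label is looked up in the translation table.

import ARKit
import UIKit
import Vision

final class ObjectTranslatorViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!

    // Illustrative lookup table; a real app would use a full dictionary
    // or a translation service.
    private let koreanNames = ["lamp": "램프", "laptop": "노트북"]

    // Assumes an image-classification Core ML model has been added to the
    // project and compiled into a class named ObjectClassifier (placeholder).
    private lazy var visionModel: VNCoreMLModel? =
        try? VNCoreMLModel(for: ObjectClassifier().model)

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(identifyAndTranslate))
        sceneView.addGestureRecognizer(tap)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    // Tap anywhere to classify whatever the camera currently sees.
    @objc private func identifyAndTranslate() {
        guard let frame = sceneView.session.currentFrame,
              let model = visionModel else { return }

        let request = VNCoreMLRequest(model: model) { [weak self] request, _ in
            guard let best = request.results?.first as? VNClassificationObservation else { return }
            let english = best.identifier
            let korean = self?.koreanNames[english] ?? "?"
            DispatchQueue.main.async {
                // A real app would render this as a label in the AR scene.
                print("\(english) / \(korean)  (confidence \(best.confidence))")
            }
        }

        // ARKit hands us the camera image as a CVPixelBuffer; .right assumes
        // the device is held in portrait.
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right,
                                            options: [:])
        DispatchQueue.global(qos: .userInitiated).async {
            try? handler.perform([request])
        }
    }
}

A shipping app would draw the result as a label anchored in the AR scene and use a real translation service instead of a hard-coded dictionary, but the basic pipeline is about this short.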

Davy Crockett to give tours of Alamo in new augmented reality app — from mysanantonio.com by Samantha Ehlinger

Excerpt:

Using a smartphone, users will be able to see and interact with computer-generated people and scenes from the past — overlaid on top of the very real and present-day Alamo. The app will also show the Alamo as it was at different points in history, and tell the story of the historic battle through different perspectives of the people (like Crockett) who were there. The app includes extra features users can buy, much like Pokémon Go.

“We’re making this into a virtual time machine so that if I’m standing on this spot and I look at, oh well there’s Davy Crockett, then I can go back a century and I can see the mission being built,” Alamo Reality CEO Michael McGar said. The app will allow users to see the Alamo not only as it was in 1836, but as it was before and after, McGar said.

“We’re developing a technology that’s going to be able to span across generations to tell a story”

— Lane Traylor

Augmented Reality Technology: A student creates the closest thing yet to a magic ring — from forbes.com by Kevin Murnane

Excerpt:

Nat Martin set himself the problem of designing a control mechanism that can be used unobtrusively to meld AR displays with the user’s real-world environment. His solution was a controller in the shape of a ring that can be worn on the user’s finger. He calls it Scroll. It uses the ARKit software platform and contains an Arduino circuit board, a capacitive sensor, a gyroscope, an accelerometer, and a SoftPot potentiometer. Scroll works with any AR device that supports the Unity game engine, such as Google Cardboard or Microsoft’s HoloLens.

Also see:

Scroll from Nat on Vimeo.

Addendum on 8/15/17:

New iOS 11 ARKit Demo Shows Off Drawing With Fingers In Augmented Reality [Video] — from redmondpie.com by Oliver Haslam

Excerpt:

When Apple releases iOS 11 to the public next month, it will also release ARKit for the first time. The framework, designed to make bringing augmented reality to iOS a reality, debuted during the opening keynote of WWDC 2017 when Apple announced iOS 11, and ever since then we have been seeing new concepts and demos released by developers.

Those developers have given us a glimpse of what we can expect when apps taking advantage of ARKit start to ship alongside iOS 11, and the latest of those is a demonstration in which someone’s finger is used to draw on a notepad.

Campus Technology Announces 2017 Impact Award Honorees — from campustechnology.com

Excerpt:

“When you consider the use of technology in education, one of the most important factors is impact — how it benefits students, improves teaching, streamlines costs, contributes to the community, furthers the institutional mission, etc.,” said Rhea Kelly, executive editor of Campus Technology. “These 10 projects are making a difference in higher education in a variety of inspiring ways, and we are so pleased to recognize them with this year’s Impact Awards.”

From DSC:
I was a Member of the Review Board for this year’s Impact Awards program. As such, I want to extend my sincere congratulations to these recipients! I also want to extend congratulations to the many other people/organizations who — though they didn’t win an award this year — are doing some great work out there as well!

How SLAM technology is redrawing augmented reality’s battle lines — from venturebeat.com by Mojtaba Tabatabaie

Excerpt (emphasis DSC):

In early June, Apple introduced its first attempt to enter the AR/VR space with ARKit. What makes ARKit stand out for Apple is a technology called SLAM (Simultaneous Localization And Mapping). Every tech giant — especially Apple, Google, and Facebook — is investing heavily in SLAM technology, and whichever takes best advantage of SLAM tech will likely end up on top.

SLAM is a computer vision technique that captures visual data from the physical world as a set of points and turns those points into an understanding for the machine. SLAM makes it possible for machines to “have an eye and understand” what’s around them through visual input. What the machine sees with SLAM from even a simple scene is essentially a cloud of such points.

Using these points, machines can build an understanding of their surroundings. This data also helps AR developers like myself create much more interactive and realistic experiences. That understanding can be used in different scenarios such as robotics, self-driving cars, AI and, of course, augmented reality.

The simplest form of understanding from this technology is recognizing walls, barriers and floors. Right now, most AR SLAM technologies like ARKit use only floor recognition and position tracking to place AR objects around you, so they don’t actually know enough about what’s going on in your environment to react to it correctly. More advanced SLAM technologies, like Google Tango, can create a mesh of your environment, so the machine can not only tell you where the floor is but also identify walls and objects, allowing everything around you to become an element to interact with.
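
To make that floor-recognition point concrete, here is a minimal ARKit sketch in Swift (class and outlet names are generic placeholders, not taken from the article). The session is configured for horizontal plane detection, and ARKit calls the delegate back with an ARPlaneAnchor each time its tracking recognizes a surface such as the floor; the sketch simply visualizes each detected plane as a translucent overlay.

import ARKit
import SceneKit
import UIKit

final class PlaneDetectionViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // ARKit's SLAM currently exposes horizontal planes (floors, tabletops)
        // plus position tracking, i.e. the "floor recognition" described above.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called each time ARKit recognizes a new horizontal surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Visualize the detected plane as a translucent overlay so you can
        // watch how much of the floor has been mapped.
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
        planeNode.eulerAngles.x = -.pi / 2   // SCNPlane is vertical by default; lay it flat
        node.addChildNode(planeNode)
    }
}

Tango-class systems go further and return a mesh of walls and objects, but the basic pattern of receiving an anchor for each recognized piece of the environment is the same.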

The company with the most complete SLAM database will likely be the winner. This database will metaphorically give these giants an eye on the world, so, for example, Facebook could tag and locate your photo just by analyzing the image, or Google could place ads and virtual billboards around you by analyzing the camera feed from your smart glasses. Your self-driving car could navigate itself with nothing more than visual data.

2017 Ed Tech Trends: The Halfway Point — from campustechnology.com by Rhea Kelly
Four higher ed IT leaders weigh in on the current state of education technology and what’s ahead.

This article includes some perspectives shared by the following four IT leaders:

  • Susan Aldridge, Senior Vice President for Online Learning, Drexel University (PA); President, Drexel University Online
  • Daniel Christian, Adjunct Faculty Member, Calvin College
  • Marci Powell, CEO/President, Marci Powell & Associates; Chair Emerita and Past President, United States Distance Learning Association
  • Phil Ventimiglia, Chief Innovation Officer, Georgia State University

Also see:

Mortenson Creates First-of-Its-Kind Augmented Reality App for Construction Visualization — from prnewswire.com
Akin to Pokémon Go, the new mobile app is helping the University of Washington community “experience” a new computer science building 18 months before its doors open to students

Excerpt:

SEATTLE, July 18, 2017 /PRNewswire-USNewswire/ — After pioneering the use of virtual design in construction, Mortenson Construction has developed a first-of-its-kind augmented reality (AR) mobile app to help the University of Washington community “see” the future CSE2 computer science building – well before its doors open to students in January of 2019. Similar to the popular Pokémon Go, users can either point their smartphones at the construction site on campus – or at a printed handout if off campus – to experience a life-like digital representation of the future CSE2 building.

4 ways augmented reality could change corporate training forever — by Jay Samit

Excerpt:

In the coming years, machine learning and augmented reality will likely take both educational approaches to the next level by empowering workers to have the latest, most accurate information available in context, when and where they need it most.

Here are four ways that digital reality can revolutionize corporate training…

…augmented reality (AR) is poised not only to address issues faced by our aging workforce, but to fundamentally increase productivity by changing how all employees are trained in the future.

More Than Just Cool? — from insidehighered.com by Nick Roll
Virtual and augmented realities make headway in courses on health care, art history and social work.

Excerpt:

When Glenn Gunhouse visits the Pantheon, you would think that the professor, who teaches art and architecture history, wouldn’t be able to keep his eyes off the Roman temple’s columns, statues or dome. But there’s something else that always catches his eye: the jaws of the tourists visiting the building, and the way they all inevitably drop.

“Wow.”

There’s only one other way that Gunhouse has been able to replicate that feeling of awe for his students short of booking expensive plane tickets to Italy. Photos, videos and even three-dimensional walk-throughs on a computer screen don’t do it: It’s when his students put on virtual reality headsets loaded with images of the Pantheon.

…nursing schools are using virtual reality or augmented reality to bring three-dimensional anatomy illustrations off of two-dimensional textbook pages.

Also see:

Oculus reportedly planning $200 standalone wireless VR headset for 2018 — from techcrunch.com by Darrell Etherington

Excerpt:

Facebook is set to reveal a standalone Oculus virtual reality headset sometime later this year, Bloomberg reports, with a ship date of sometime in 2018. The headset will work without requiring a tethered PC or smartphone, according to the report, and will be branded with the Oculus name around the world, except in China, where it’ll carry Xiaomi trade dress and run some Xiaomi software as part of a partnership that extends to manufacturing plans for the device.

Facebook Inc. is taking another stab at turning its Oculus Rift virtual reality headset into a mass-market phenomenon. Later this year, the company plans to unveil a cheaper, wireless device that the company is betting will popularize VR the way Apple did the smartphone.

Source

© 2017 | Daniel Christian