Want to learn a new language? With this AR app, just point & tap — from fastcodesign.com by Mark Wilson
A new demo shows how augmented reality could redefine apps as we know them.
Excerpt:
There’s a new app gold rush. After Facebook and Apple both released augmented reality development kits in recent months, developers are demonstrating just what they can do with these new technologies. It’s a race to invent the future first.
To get a taste of how quickly and dramatically our smartphone apps are about to change, just take a look at this little demo by front-end engineer Frances Ng, featured on Prosthetic Knowledge. Just by aiming her iPhone at various objects and tapping, she can both identify items like lamps and laptops and translate their names into a number of different languages. Bye-bye, multilingual dictionaries and Google Translate. Hello, “what the heck is the Korean word for that?”
My first attempt on #ARKit and #CoreML using the pre-built model Inception v3! ARKit is awesome!!! pic.twitter.com/62d7vZcgAt
— Frances Ng (@franxyzxyz) August 14, 2017
Also see:
Apple ARKit & Machine Learning Come Together Making Smarter Augmented Reality — from next.reality.news by Jason Odom
Excerpt:
The world is a massive place, especially when you consider the field of view of your smartglasses or mobile device. To fulfill the potential promise of augmented reality, we must find a way to fill that view with useful and contextual information. Of course, creating contextual, valuable information to fill the massive space that is planet Earth is a daunting task to take on. Machine learning seems to be one solution many are moving toward.
Tokyo-based web developer Frances Ng released a video on Twitter showing off her first experiments with Apple’s ARKit and CoreML, the company’s machine learning framework. As you can see in the GIFs below, her mobile device recognizes a few objects around her room and then displays the names of the identified objects.