Amazon Opening Store That Will Eliminate Checkout — and Lines — from bloomberg.com by Jing Cao
At Amazon's Seattle location, items get charged to a Prime account | New technology combines artificial intelligence and sensors
Excerpt:
Amazon.com Inc. unveiled technology that will let shoppers grab groceries without having to scan and pay for them — in one stroke eliminating the checkout line.
The company is testing the new system at what it’s calling an Amazon Go store in Seattle, which will open to the public early next year. Customers will be able to scan their phones at the entrance using a new Amazon Go mobile app. The technology will then track which items they pick up, or even put back on the shelves, and update a virtual shopping cart in real time, according to a video Amazon posted on YouTube. Once customers exit the store, their Amazon accounts will be charged automatically.
Amazon Introduces ‘Amazon Go’ Retail Stores, No Checkout, No Lines — from investors.com
Excerpt:
Online retail king Amazon.com (AMZN) is taking dead aim at the physical-store world Monday, introducing Amazon Go, a retail convenience store format it is developing that will use computer vision and deep-learning algorithms to let shoppers just pick up what they want and exit the store without any checkout procedure.
Shoppers will merely need to tap the Amazon Go app on their smartphones, and their virtual shopping carts will automatically tabulate what they owe, deduct that amount from their Amazon accounts and send them a receipt. It’s what the company has dubbed “Just Walk Out” technology, which it said is based on the same technology used in self-driving cars. It’s certain to up the ante in the company’s competition with Wal-Mart (WMT), Target (TGT) and the other retail leaders.
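Neither excerpt describes the underlying data model, but the flow they sketch (detect a pick-up or a put-back, keep a per-shopper virtual cart in sync, charge the linked account on exit) can be illustrated roughly as follows. This is a minimal sketch only; the event names, SKUs, prices and the charge callback are hypothetical and are not Amazon's actual system.

```python
from collections import defaultdict

class VirtualCart:
    """Toy model of the 'just walk out' flow described above (hypothetical, not Amazon's implementation)."""

    def __init__(self, account_id):
        self.account_id = account_id
        self.items = defaultdict(int)  # SKU -> quantity currently held by the shopper

    def on_pickup(self, sku):
        # The sensor/vision pipeline reports that the shopper took an item off a shelf.
        self.items[sku] += 1

    def on_return(self, sku):
        # The shopper put the item back; remove it from the virtual cart.
        if self.items[sku] > 0:
            self.items[sku] -= 1

    def checkout(self, price_lookup, charge):
        # On exit, total the cart and charge the linked account automatically.
        total = sum(price_lookup[sku] * qty for sku, qty in self.items.items() if qty)
        charge(self.account_id, total)
        return total

# Hypothetical usage
prices = {"milk-1l": 2.49, "granola-bar": 1.25}
cart = VirtualCart(account_id="prime-12345")
cart.on_pickup("milk-1l")
cart.on_pickup("granola-bar")
cart.on_return("granola-bar")
cart.checkout(prices, charge=lambda acct, amt: print(f"charged {acct}: ${amt:.2f}"))
```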
Google DeepMind Makes AI Training Platform Publicly Available — from bloomberg.com by Jeremy Kahn
Company is increasingly embracing open-source initiatives | Move comes after rival Musk’s OpenAI made its robot gym public
Excerpt:
Alphabet Inc.’s artificial intelligence division Google DeepMind is making the maze-like game platform it uses for many of its experiments available to other researchers and the general public.
DeepMind is putting the entire source code for its training environment — which it previously called Labyrinth and has now renamed DeepMind Lab — on the open-source repository GitHub, the company said Monday. Anyone will be able to download the code and customize it to help train their own artificial intelligence systems. They will also be able to create new game levels for DeepMind Lab and upload them to GitHub.
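The excerpt doesn't show what using such an environment looks like, but training environments of this kind are typically driven by an observe-act-reward loop. The sketch below uses a made-up MazeEnv stand-in (not the real deepmind_lab module, whose API differs) purely to illustrate that pattern.

```python
import random

class MazeEnv:
    """Stand-in for a maze-like training environment (hypothetical; the real DeepMind Lab API differs)."""

    ACTIONS = ["forward", "back", "turn_left", "turn_right"]

    def __init__(self, level_name):
        self.level_name = level_name
        self.steps_left = 0

    def reset(self):
        self.steps_left = 100
        return {"pixels": [[0] * 4 for _ in range(4)]}  # dummy observation

    def step(self, action):
        self.steps_left -= 1
        reward = 1.0 if action == "forward" and random.random() < 0.1 else 0.0
        done = self.steps_left <= 0
        return {"pixels": [[0] * 4 for _ in range(4)]}, reward, done

# A random agent interacting with a custom level, roughly what a researcher's training
# loop would do after cloning the environment and defining a level of their own.
env = MazeEnv(level_name="my_custom_level")
obs, total_reward, done = env.reset(), 0.0, False
while not done:
    action = random.choice(MazeEnv.ACTIONS)
    obs, reward, done = env.step(action)
    total_reward += reward
print("episode reward:", total_reward)
```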
Related:
Alphabet’s DeepMind is inviting developers into the digital world where its AI learns to explore — from qz.com by Dave Gershgorn
After Retail Stumble, Beacons Shine From Banks to Sports Arenas — from bloomberg.com by Olga Kharif
Shipments of the devices expected to grow to 500 million
Excerpt (emphasis DSC):
Beacon technology, which was practically left for dead after failing to deliver on its promise to revolutionize the retail industry, is making a comeback.
Beacons are puck-size gadgets that can send helpful tips, coupons and other information to people’s smartphones through Bluetooth. They’re now being used in everything from bank branches and sports arenas to resorts, airports and fast-food restaurants. In the latest sign of the resurgence, Mobile Majority, an advertising startup, said on Monday that it was buying Gimbal Inc., a beacon maker it bills as the largest independent source of location data other than Google and Apple Inc.
…
Several recent developments have sparked the latest boom. Companies like Google parent Alphabet Inc. are making it possible for people to receive beacon messages without downloading any apps, which had been a major barrier to adoption, said Patrick Connolly, an analyst at ABI Research. Introduced this year, Google’s Nearby Notifications lets developers tie an app or a website to a beacon so it can send messages to consumers even when they have no app installed.
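The reason no app is needed is that a beacon can simply broadcast a web URL (the Eddystone-URL frame format) that a nearby phone resolves on its own. The encoder below is an illustrative sketch of that frame layout; the byte values follow the published Eddystone-URL specification as recalled here, so treat them as assumptions and verify against the official documentation before relying on them.

```python
# Illustrative encoder for an Eddystone-URL style frame, the kind of broadcast that
# app-free beacon notifications are built on. Byte values are assumptions from memory.

SCHEMES = {"https://www.": 0x01, "http://www.": 0x00, "https://": 0x03, "http://": 0x02}
EXPANSIONS = {".com/": 0x00, ".org/": 0x01, ".net/": 0x03, ".com": 0x07, ".org": 0x08, ".net": 0x0A}

def encode_eddystone_url(url, tx_power=-20):
    frame = bytearray([0x10, tx_power & 0xFF])  # frame type 0x10, TX power at 0 m
    for prefix, code in SCHEMES.items():
        if url.startswith(prefix):
            frame.append(code)
            url = url[len(prefix):]
            break
    else:
        raise ValueError("unsupported URL scheme")
    while url:
        for text, code in EXPANSIONS.items():
            if url.startswith(text):
                frame.append(code)      # common substrings compress to a single byte
                url = url[len(text):]
                break
        else:
            frame.append(ord(url[0]))   # everything else is plain ASCII
            url = url[1:]
    if len(frame) > 20:  # frame type + TX power + scheme + up to 17 URL bytes
        raise ValueError("URL too long for a single frame")
    return bytes(frame)

print(encode_eddystone_url("https://example.com/deals").hex())
```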
…
But in June, Cupertino, California-based Mist Systems began shipping a software-based product that simplified the process. Instead of placing 10 beacons on walls and ceilings, for example, managers using Mist can install one device every 2,000 feet (610 meters) and then designate various points on a digital floor plan as virtual beacons, which can be moved with a click of a mouse.
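Mist's product isn't described beyond this, but the idea of virtual beacons that can be repositioned in software amounts to treating each beacon as a point on a digital floor plan. The FloorPlan model and coordinates below are hypothetical, meant only to show why moving a beacon becomes a data update rather than a physical re-installation.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualBeacon:
    name: str
    x_m: float  # position on the floor plan, in meters
    y_m: float
    message: str

@dataclass
class FloorPlan:
    """Hypothetical model of a software-defined beacon layout (not Mist's actual product)."""
    beacons: dict = field(default_factory=dict)

    def place(self, beacon: VirtualBeacon):
        self.beacons[beacon.name] = beacon

    def move(self, name: str, x_m: float, y_m: float):
        # Repositioning is just a data update, not a hardware change.
        self.beacons[name].x_m, self.beacons[name].y_m = x_m, y_m

plan = FloorPlan()
plan.place(VirtualBeacon("entrance-promo", 2.0, 1.5, "Welcome! Today's offers"))
plan.move("entrance-promo", 10.0, 4.0)  # 'moved with a click of a mouse'
print(plan.beacons["entrance-promo"])
```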
Google’s Hand-Fed AI Now Gives Answers, Not Just Search Results — from wired.com by Cade Metz
Excerpt:
Ask the Google search app “What is the fastest bird on Earth?,” and it will tell you.
“Peregrine falcon,” the phone says. “According to YouTube, the peregrine falcon has a maximum recorded airspeed of 389 kilometers per hour.”
That’s the right answer, but it doesn’t come from some master database inside Google. When you ask the question, Google’s search engine pinpoints a YouTube video describing the five fastest birds on the planet and then extracts just the information you’re looking for. It doesn’t mention those other four birds. And it responds in similar fashion if you ask, say, “How many days are there in Hanukkah?” or “How long is Totem?” The search engine knows that Totem is a Cirque du Soleil show, and that it lasts two-and-a-half hours, including a thirty-minute intermission.
Google answers these questions with help from deep neural networks, a form of artificial intelligence rapidly remaking not just Google’s search engine but the entire company and, well, the other giants of the internet, from Facebook to Microsoft. Deep neural nets are pattern recognition systems that can learn to perform specific tasks by analyzing vast amounts of data. In this case, they’ve learned to take a long sentence or paragraph from a relevant page on the web and extract the upshot—the information you’re looking for.
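The article doesn't describe the models themselves, but the overall pattern (score candidate sentences from a retrieved page against the question and return the best one) can be illustrated with a deliberately simple word-overlap scorer. A real system would use trained neural networks; the function and example passage below are only a toy stand-in.

```python
import re

def extract_answer(question, passage):
    """Toy extractive answering: return the passage sentence sharing the most words
    with the question. A crude stand-in for the neural extraction described above."""
    q_words = set(re.findall(r"\w+", question.lower()))
    sentences = re.split(r"(?<=[.!?])\s+", passage.strip())
    return max(sentences, key=lambda s: len(q_words & set(re.findall(r"\w+", s.lower()))))

passage = (
    "Many birds can fly fast. "
    "The fastest bird on Earth is the peregrine falcon, with a recorded dive speed of 389 kilometers per hour. "
    "Ostriches are the fastest birds on land."
)
print(extract_answer("What is the fastest bird on Earth?", passage))
```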
Deep Learning in Production at Facebook — from re-work.co by Katie Pollitt
Excerpt:
Facebook is powered by machine learning and AI. From advertising relevance, news feed and search ranking to computer vision, face recognition, and speech recognition, they run ML models at massive scale, computing trillions of predictions every day.
At the 2016 Deep Learning Summit in Boston, Andrew Tulloch, Research Engineer at Facebook, talked about some of the tools and tricks Facebook uses to scale both the training and deployment of its deep learning models. He also covered some useful libraries the company has open-sourced for production-oriented deep learning applications. Tulloch’s session can be watched in full below.
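The talk itself is linked rather than summarized here, but the scale claim (trillions of predictions a day) rests on serving trained models as cheap, vectorized batch inference. The NumPy snippet below is a generic illustration of that batching pattern under assumed weights; it is not Facebook's serving stack or any of its open-sourced libraries.

```python
import numpy as np

# Pretend these weights come from an already-trained ranking model (hypothetical).
rng = np.random.default_rng(0)
W, b = rng.normal(size=(16, 1)), 0.0

def predict_batch(features):
    """Score a whole batch of requests in one vectorized forward pass, the basic
    trick behind serving very large prediction volumes cheaply."""
    logits = features @ W + b
    return 1.0 / (1.0 + np.exp(-logits))  # probability an item is relevant

requests = rng.normal(size=(1024, 16))   # e.g., feature vectors for 1,024 news-feed candidates
scores = predict_batch(requests)
print(scores.shape, float(scores.mean()))
```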
The Artificial Intelligence Gold Rush — from foresightr.com by Mark Vickers
Big companies, venture capital firms and governments are all banking on AI
Excerpt:
Let’s start with some of the brand-name organizations laying down big bucks on artificial intelligence.
- Amazon: Sells the successful Echo home speaker, which comes with the personal assistant Alexa.
- Alphabet (Google): Uses deep learning technology to power Internet searches and developed AlphaGo, an AI that beat the world champion in the game of Go.
- Apple: Developed the popular virtual assistant Siri and is working on other phone-related AI applications, such as facial recognition.
- Baidu: Wants to use AI to improve search, recognize images of objects and respond to natural language queries.
- Boeing: Works with Carnegie Mellon University to develop machine learning capable of helping it design and build planes more efficiently.
- Facebook: Wants to create the “best AI lab in the world.” Has its personal assistant, M, and focuses heavily on facial recognition.
- IBM: Created the Jeopardy-winning Watson AI and is leveraging its data analysis and natural language capabilities in the healthcare industry.
- Intel: Has made acquisitions to help it build specialized chips and software to handle deep learning.
- Microsoft: Works on chatbot technology and acquired SwiftKey, which predicts what users will type next.
- Nokia: Has introduced various machine learning capabilities to its portfolio of customer-experience software.
- Nvidia: Builds computer chips customized for deep learning.
- Salesforce: Took first place on the Stanford Question Answering Dataset, a test of machine learning and comprehension, and has developed the Einstein model that learns from data.
- Shell: Launched a virtual assistant to answer customer questions.
- Tesla Motors: Continues to work on self-driving automobile technologies.
- Twitter: Created an AI-development team called Cortex and acquired several AI startups.
IBM Watson and Education in the Cognitive Era — from i-programmer.info by Nikos Vaggalis
Excerpt:
IBM’s seemingly ubiquitous Watson is now infiltrating education through AI-powered software that ‘reads’ the needs of individual students in order to engage them through tailored learning approaches.
This is not to be taken lightly, as it opens the door to a new breed of technologies that will spearhead the education or re-education of the workforce of the future.
As outlined in the 2030 report, although robots and AI will displace a big chunk of the workforce, they will also play a major role in creating job opportunities as never before. In such a competitive landscape, workers of all kinds, white collar and blue collar alike, should come equipped with new, versatile and contemporary skills.
The point is that the very AI that leaves someone jobless will also help that person adapt to a new job’s requirements. It will also prepare the new generations through optimal methodologies that could once more give meaning to an aging and counterproductive schooling system, one that leaves students’ skills disengaged from the needs of industry and still segregates students into ‘good’ and ‘bad’. Might it be that ‘bad’ students become that way because of the system’s inability to stimulate their interest?