Google is bringing translation to its Home speakers — from businessinsider.com by Peter Newman

Excerpt:

Google has added real-time translation capabilities to its Google Home smart speakers, the Home Hub screened speaker, as well as other screened devices from third parties, according to Android Police.

 


Amazon has 10,000 employees dedicated to Alexa — here are some of the areas they’re working on — from businessinsider.com by Avery Hartmans

Summary (emphasis DSC):

  • Amazon’s vice president of Alexa, Steve Rabuchin, has confirmed that yes, there really are 10,000 Amazon employees working on Alexa and the Echo.
  • Those employees are focused on things like machine learning and making Alexa more knowledgeable.
  • Some employees are working on giving Alexa a personality, too.


From DSC:
How might this trend impact learning spaces? For example, I am interested in using voice to intuitively “drive” smart classroom control systems:

  • “Alexa, turn on the projector”
  • “Alexa, dim the lights by 50%”
  • “Alexa, open Canvas and launch my Constitutional Law I class”
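
To make the idea concrete, here is a rough sketch of the dispatch layer that might sit behind commands like those above. Everything in it is hypothetical — the intent names and the ClassroomController backend are invented for illustration — though the request shape mirrors the JSON of an Alexa-style IntentRequest. A real skill would be built on the Alexa Skills Kit and wired into a campus AV-control system.

```python
class ClassroomController:
    """Stand-in for a hypothetical room-control backend (projector, lights, LMS)."""

    def __init__(self):
        self.log = []  # record of actions taken, for demonstration

    def projector_on(self):
        self.log.append("projector on")

    def dim_lights(self, percent):
        self.log.append(f"lights dimmed by {percent}%")

    def launch_course(self, course):
        self.log.append(f"opened LMS course: {course}")


def handle_intent(request, controller):
    """Route a parsed voice request (Alexa-style IntentRequest JSON) to a room action."""
    intent = request["request"]["intent"]
    name = intent["name"]
    slots = intent.get("slots", {})  # slot values carry the command's parameters

    if name == "TurnOnProjectorIntent":
        controller.projector_on()
        return "Turning on the projector."
    if name == "DimLightsIntent":
        percent = int(slots["percent"]["value"])
        controller.dim_lights(percent)
        return f"Dimming the lights by {percent} percent."
    if name == "LaunchCourseIntent":
        controller.launch_course(slots["course"]["value"])
        return "Opening your course now."
    return "Sorry, I didn't catch that."
```

The appeal of this pattern for learning spaces is that each new room capability is just one more intent-to-action mapping, rather than another button on a podium touch panel.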


Ten HR trends in the age of artificial intelligence — from fortune.com by Jeanne Meister
The future of HR is both digital and human as HR leaders focus on optimizing the combination of human and automated work. This is driving a new HR priority: requiring leaders and teams to develop fluency in artificial intelligence while they re-imagine HR to be more personal, human, and intuitive.

Excerpt from 21 More Jobs Of the Future (emphasis DSC):

Voice UX Designer: This role will leverage voice as a platform to deliver an “optimal” dialect and sound that is pleasing to each of the seven billion humans on the planet. The Voice UX Designer will do this by creating a set of AI tools and algorithms to help individuals find their “perfect voice” assistant.

Head of Business Behavior: The head of business behavior will analyze employee behavioral data such as performance data along with data gathered through personal, environmental and spatial sensors to create strategies to improve employee experience, cross company collaboration, productivity and employee well-being.

The question for HR leaders is: What new HR job roles are on the horizon as A.I. becomes integrated into the workplace?

Chief Ethical and Humane Use Officer: Salesforce is already filling this role, having announced its first Chief Ethical and Humane Use Officer this month. The new role will focus on developing strategies for using technology in an ethical and humane way. As practical uses of AI have exploded in recent years, we expect more companies to establish new jobs focused on the ethical use of AI, to ensure AI’s trustworthiness while also helping to defuse fears about it.

A.I. Trainer: This role prepares the existing knowledge people have about a job for A.I. to use. Creating knowledge for an A.I.-supported workplace requires individuals to tag or “annotate” discrete knowledge nuggets so the correct data is served up in a conversational interface. This role is increasingly important as the work of recruiters is augmented by AI.
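
To make that tagging idea concrete, here is a toy sketch of how annotated knowledge nuggets might be matched to a spoken question. The nuggets, tags, and the simple tag-overlap matching rule are all invented for illustration; production systems would use far richer retrieval.

```python
# Each "knowledge nugget" is a short answer plus the tags an A.I. Trainer
# attached to it during annotation.
KNOWLEDGE_BASE = [
    {"text": "Benefits enrollment opens November 1.",
     "tags": {"hr", "benefits", "enrollment"}},
    {"text": "New hires complete orientation in their first week.",
     "tags": {"hr", "onboarding", "orientation"}},
    {"text": "Interview feedback is due within 48 hours.",
     "tags": {"recruiting", "interview", "feedback"}},
]


def best_nugget(query_words, knowledge_base=KNOWLEDGE_BASE):
    """Return the nugget whose tags best overlap the words in the user's query."""
    scored = [(len(n["tags"] & set(query_words)), n) for n in knowledge_base]
    score, nugget = max(scored, key=lambda pair: pair[0])
    return nugget["text"] if score > 0 else None
```

The trainer’s annotation work is what makes the lookup possible at all: without the tags, the conversational layer has nothing to match against.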


Also see:

  • Experts Weigh in on Merits of AI in Education — by Dian Schaffhauser
    Excerpt:
    Will artificial intelligence make most people better off over the next decade, or will it redefine what free will means or what a human being is? A new report by the Pew Research Center has weighed in on the topic by conferring with some 979 experts, who have, in summary, predicted that networked AI “will amplify human effectiveness but also threaten human autonomy, agency and capabilities.”

    These same experts also weighed in on the expected changes in formal and informal education systems. Many mentioned seeing “more options for affordable adaptive and individualized learning solutions,” such as the use of AI assistants to enhance learning activities and their effectiveness.


Smart speakers hit critical mass in 2018 — from techcrunch.com by Sarah Perez

Excerpt (emphasis DSC):

We already know Alexa had a good Christmas — the app shot to the top of the App Store over the holidays, and the Alexa service even briefly crashed from all the new users. But Alexa, along with other smart speaker devices like Google Home, didn’t just have a good holiday — they had a great year, too. The smart speaker market reached critical mass in 2018, with around 41 percent of U.S. consumers now owning a voice-activated speaker, up from 21.5 percent in 2017.

 

In the U.S., there are now more than 100 million Alexa-enabled devices installed — a key milestone for Alexa to become a “critical mass platform,” the report noted.


From DSC:
How long before voice drives most appliances, thermostats, and other devices in our homes and workplaces?

Hisense is bringing Android and AI smarts to its 2019 TV range — from techradar.com by Stephen Lambrechts
Some big announcements planned for CES 2019

Excerpt (emphasis DSC):

Hisense has announced that it will unveil the next evolution of its VIDAA smart TV platform at CES 2019 next month, promising to take full advantage of artificial intelligence with version 3.0.

Each television in Hisense’s 2019 ULED TV lineup will boast the updated VIDAA 3.0 AI platform, with Amazon Alexa functionality fully integrated into the devices, meaning you won’t need an Echo device to use Alexa voice control features.


Hey bot, what’s next in line in chatbot technology? — from blog.engati.com by Imtiaz Bellary

Excerpts:

Scenario 1: Are bots the new apps?
Scenario 2: Bot conversations that make sense
Scenario 3: Can bots increase employee throughput?
Scenario 4: Let voice take over!

 

Voice as an input medium is catching on, with an increasing number of people adopting Amazon Echo and other digital assistants for their daily chores. Can we expect bots to gauge your mood and provide a personalised experience rather than a standard response? In regulated scenarios, voice acts as an authentication mechanism that allows the bot to pursue actions. Voice as an input adds sophistication and makes it easier to complete tasks quickly, thereby improving the user experience.


Alexa, get me the articles (voice interfaces in academia) — from blog.libux.co by Kelly Dagan

Excerpt:

Credit to Jill O’Neill, who has written an engaging consideration of the applications, discussions, and potential of voice-user interfaces in the scholarly realm. She details a few use case scenarios: finding recent, authoritative biographies of Jane Austen; finding out whether your closest library has an item on the shelf right now (and whether it’s worth the drive, based on traffic).

Coming from an undergraduate-focused (and library) perspective, I can think of a few more:

  • asking if there are any group study rooms available at 7 pm and making a booking
  • finding out if [X] is open now (Archives, the Cafe, the Library, etc.)
  • finding three books on the Red Brigades, seeing if they are available, and saving the locations
  • grabbing five research articles on stereotype threat, to read later
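
As a thought experiment, use cases like these could sit on top of a thin query layer such as the sketch below. The LibraryCatalog class and its data shapes are assumptions made for illustration; a real integration would call the library’s ILS or discovery-layer APIs.

```python
class LibraryCatalog:
    """Hypothetical backend a library voice skill might query."""

    def __init__(self, rooms, hours, holdings):
        self.rooms = rooms        # {room name: set of free time-slot strings}
        self.hours = hours        # {location: (opening hour, closing hour), 24h ints}
        self.holdings = holdings  # {title: True if on the shelf}

    def book_room(self, slot):
        """Book the first study room free at the requested slot; None if all taken."""
        for room, free_slots in self.rooms.items():
            if slot in free_slots:
                free_slots.remove(slot)  # mark the slot as taken
                return room
        return None

    def is_open(self, location, hour):
        """Is a location (Archives, the Cafe, ...) open at the given hour?"""
        open_h, close_h = self.hours[location]
        return open_h <= hour < close_h

    def on_shelf(self, title):
        """Is a title available on the shelf right now?"""
        return self.holdings.get(title, False)
```

Each bullet above then reduces to one or two such calls — the hard part is not the lookup but mapping loosely phrased speech onto them reliably.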

 


Reflections on “Are ‘smart’ classrooms the future?” [Johnston]

Are ‘smart’ classrooms the future? — from campustechnology.com by Julie Johnston
Indiana University explores that question by bringing together tech partners and university leaders to share ideas on how to design classrooms that make better use of faculty and student time.

Excerpt:

To achieve these goals, we are investigating smart solutions that will:

  • Untether instructors from the room’s podium, allowing them control from anywhere in the room;
  • Streamline the start of class, including biometric login to the room’s technology, behind-the-scenes routing of course content to room displays, control of lights and automatic attendance taking;
  • Offer whiteboards that can be captured, routed to different displays in the room and saved for future viewing and editing;
  • Provide small-group collaboration displays and the ability to easily route content to and from these displays; and
  • Deliver these features through a simple, user-friendly and reliable room/technology interface.

Activities included collaborative brainstorming focusing on these questions:

  • What else can we do to create the classroom of the future?
  • What current technology exists to solve these problems?
  • What could be developed that doesn’t yet exist?
  • What’s next?


From DSC:
Though many people’s eyes (including faculty members’) gloss over when we start talking about learning spaces and smart classrooms, it’s still an important topic. Personally, I’d rather learn in an engaging, exciting learning environment that’s outfitted with a variety of tools (physical as well as digital and virtual) that make sense for that community of learners. Also, faculty members have very limited time to get across campus, into the classroom, and get things set up; the more of that setup that can be automated, the better!

I’ve long posted items regarding machine-to-machine communications, voice recognition and voice-enabled interfaces, artificial intelligence, bots, algorithms, a variety of vendors and their products (including Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana, and Google’s Home / Google Assistant), learning spaces, and smart classrooms, as I do think these things will be components of our future learning ecosystems.



© 2019 | Daniel Christian