AI chatbot apps to infiltrate businesses sooner than you think — from searchbusinessanalytics.techtarget.com by Bridget Botelho
Artificial intelligence chatbots aren’t the norm yet, but within the next five years, there’s a good chance the salesperson emailing you won’t be a person at all.

Excerpt:

In fact, artificial intelligence has come so far so fast in recent years that Gartner predicts it will be pervasive in all new products by 2020, with technologies including natural language capabilities, deep neural networks and conversational capabilities.

Other analysts share that expectation. Technologies that fall under the umbrella term artificial intelligence — including image recognition, machine learning, AI chatbots and speech recognition — will soon be ubiquitous in business applications as developers gain access to them through platforms such as the IBM Watson Conversation API and the Google Cloud Natural Language API.
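
For a sense of what "gaining access through a platform" looks like in practice, here is a minimal sketch of calling the Google Cloud Natural Language API from Python to pull sentiment and entities out of a customer message. It assumes the google-cloud-language client library is installed and credentials are configured; the sample message is invented and nothing here comes from the article itself.

```python
# Minimal sketch: send one customer message to the Google Cloud Natural Language API
# and print its overall sentiment plus the named entities it finds.
# Assumes `pip install google-cloud-language` and that GOOGLE_APPLICATION_CREDENTIALS
# points at a valid service-account key.
from google.cloud import language_v1


def analyze_message(text: str) -> None:
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT
    )

    sentiment = client.analyze_sentiment(document=document).document_sentiment
    print(f"sentiment score={sentiment.score:.2f}, magnitude={sentiment.magnitude:.2f}")

    for entity in client.analyze_entities(document=document).entities:
        print(f"entity: {entity.name} ({entity.type_.name})")


if __name__ == "__main__":
    analyze_message("I love the new chatbot, but my order #1234 still hasn't shipped.")
```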

3 corporate departments that chatbots will disrupt — from venturebeat.com by Natalie Lambert

Excerpt:

  1. Customer Service
  2. Human Resources
  3. Marketing

Facebook Messenger’s 11,000 chatbots are much more interactive — from androidcentral.com by Harish Jonnalagadda

Excerpt:

Facebook introduced chatbots on Messenger three months ago, and the company has shared today that over 11,000 bots are active on the messaging service. The Messenger Platform has picked up an update that adds a slew of new features to bots, such as a persistent menu that lists a bot’s commands, quick replies, the ability to respond with GIFs, audio, video, and other files, and a rating system to provide feedback to bot developers.
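
To make the quick-reply feature concrete, here is a hedged sketch of posting a message with quick replies to the Messenger Send API using Python's requests library. The page access token and recipient ID are placeholders, and the payload shape follows the 2016-era Messenger Platform documentation rather than anything stated in the article.

```python
# Sketch only: send a Messenger message with two quick replies via the Send API.
# PAGE_ACCESS_TOKEN and the recipient id are placeholders; the API version and
# payload fields follow the 2016-era Messenger Platform docs.
import requests

PAGE_ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"  # placeholder
SEND_API_URL = "https://graph.facebook.com/v2.6/me/messages"


def send_quick_replies(recipient_id: str) -> None:
    payload = {
        "recipient": {"id": recipient_id},
        "message": {
            "text": "How would you like to get your order updates?",
            "quick_replies": [
                {"content_type": "text", "title": "Email", "payload": "UPDATES_EMAIL"},
                {"content_type": "text", "title": "Messenger", "payload": "UPDATES_MESSENGER"},
            ],
        },
    }
    response = requests.post(
        SEND_API_URL,
        params={"access_token": PAGE_ACCESS_TOKEN},
        json=payload,
        timeout=10,
    )
    response.raise_for_status()


if __name__ == "__main__":
    send_quick_replies("USER_PSID_PLACEHOLDER")
```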

Chatbots are coming to take over the world — from telecom.economictimes.indiatimes.com

Excerpts:

In another example, many businesses use interactive voice response (IVR) telephony systems, which have limited functionalities and often provide a poor user experience. Chatbots can replace these applications in future where the user will interact naturally to get relevant information without following certain steps or waiting for a logical sequence to occur.

Chatbots are a good starting point, but the future lies in more advanced versions of audio and video bots. Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s voice assistant are all working in the same direction. Bot ecosystems will become even more relevant as IoT reaches mass adoption and input/output (I/O) technology improves.

With big players investing heavily in AI, chatbots are likely to be an increasing feature of social media and other communications platforms.

Everything You Wanted to Know About Chatbots But Were Afraid to Ask — from businessinsider.com by Andrew Meola

Excerpt:

Chatbots are software programs that use messaging platforms as the interface to perform a wide variety of tasks—everything from scheduling a meeting to reporting the weather, to helping a customer buy a sweater.

Because texting is the heart of the mobile experience for smartphone users, chatbots are a natural way to turn something users are very familiar with into a rewarding service or marketing opportunity.

And when you consider that the top 4 messaging apps reach over 3 billion global users (MORE than the top 4 social networks), you can see that the opportunity is huge.

[Image: the chatbot ecosystem (Business Insider)]

Stanford’s virtual reality lab cultivates empathy for the homeless — from kqed.org by Rachael Myrow

 

Excerpt:

The burgeoning field of Virtual Reality — or VR as it is commonly known — is a vehicle for telling stories through 360-degree visuals and sound that put you right in the middle of the action, be it at a crowded Syrian refugee camp, or inside the body of an 85-year-old with a bad hip and cataracts.  Because of VR’s immersive properties, some people describe the medium as “the ultimate empathy machine.” But can it make people care about something as fraught and multi-faceted as homelessness?

A study in progress at Stanford’s Virtual Human Interaction Lab explores that question, and I strapped on an Oculus Rift headset (one of the most popular devices people currently use to experience VR) to look for an answer.

A new way of understanding homelessness
The study, called Empathy at Scale, puts participants in a variety of scenes designed to help them imagine the experience of being homeless themselves.

 

How chatbots and deep learning will change the future of organizations — from forbes.com by Daniel Newman

Excerpt (emphasis DSC):

Don’t let the fun, casual name mislead you. Chatbots—software that you can “chat with”—have serious implications for the business world. Though many businesses have already considered their use for customer service purposes, a chatbot’s internal applications could be invaluable on a larger scale. For instance, chatbots could help employees break down siloes and provide targeted data to fuel every department. This digital transformation is happening, even in organizational structures that face challenges with other formats of real-time communication.

Still unclear on what chatbots are and what they do? Think of a digital assistant—such as iPhone’s Siri or Alexa, the Artificial Intelligence within the Amazon Echo. A chatbot reduces or eliminates the need for many mobile apps, as the answers are stored inside the chatbot. Need to know what the weather’s like in LA? Ask your chatbot. Is your flight running on time? Ask your chatbot. Is the package you ordered going to be delivered while you’re away? You get the gist.

 

From DSC:
How might institutions of higher education as well as K-12 school districts benefit from such internal applications of chatbots? How about external applications of chatbots? If done well, such applications could facilitate solid customer service (e.g., self-service knowledgebases) and aid in communications.

Hold Tight: The Chatbot Era Has Begun — from chatbotsmagazine.com by
Chatbots: What they are, and why they’re so important for businesses

Excerpt:

Imagine being able to have your own personal assistant at your fingertips, one you can chat with, give instructions and errands to, and rely on to handle the majority of your online research, both personally and professionally.

That’s essentially what a chatbot does.

Capable of being installed on any messaging platform, such as Facebook Messenger or text messages, a chatbot is a basic artificial intelligence software program that does the legwork for you — and communicates in a way that feels like an intelligent conversation over text or voice messaging.

Once-in-a-decade paradigm shift: Messaging — from chatbotsmagazine.com by Beerud Sheth
In the history of the personal computing industry, we have had a paradigm shift about once every decade.

[Image: from Chatbots Magazine, June 2016]

Smart assistants and chatbots will be top consumer applications for AI over next 5 years, poll says — from venturebeat.com by Blaise Zerega

Excerpt:

Virtual agents and chatbots will be the top consumer applications of artificial intelligence over the next five years, according to a consensus poll released today by TechEmergence, a marketing research firm for AI and machine learning.

The emphasis on virtual agents and chatbots is in many ways not surprising. After all, the tech industry’s 800-pound gorillas have all made big bets: Apple with Siri, Amazon with Alexa, Facebook with M and Messenger, Google with Google Assistant, Microsoft with Cortana and Tay. However, the poll’s data also suggests that chatbots may soon be viewed as a horizontal enabling technology for many industries.

 

From DSC:
Might this eventually affect students’ expectations? Blackboard’s partnership with IBM and Amazon may come into play here.

Some like it bot: Why chatbots could be the start of something big — from business.sprint.com
Chatbots are on the rise. With Google, Facebook and Microsoft competing to be the caring voice your customers turn to, what does this mean for your enterprise?

Excerpt:

The most prevalent indicator that the artificial intelligence-driven future is rapidly becoming reality is the rise of the chatbot – an A.I. software program designed to chat with people. Well, more accurately, the re-emergence of the bot. Basically, robots talking to humans talking to robots is the tech vision for the future.

 

Maybe one of the most obvious uses for this technology would be in customer service. If you can incorporate your self-service knowledge base into a bot that responds directly to questions and uses customer data to provide the most relevant answers, it’s more convenient for customers…
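
To make the "knowledge base behind a bot" idea concrete, here is a small sketch that matches an incoming customer question against a handful of FAQ entries using TF-IDF similarity from scikit-learn. The FAQ content and the similarity threshold are invented for illustration, not taken from the article.

```python
# Sketch: answer customer questions by retrieving the closest entry from a
# self-service knowledge base. Uses scikit-learn's TF-IDF vectorizer; the FAQ
# entries and the 0.2 similarity threshold are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

FAQ = {
    "How do I reset my password?": "Use the 'Forgot password' link on the sign-in page.",
    "Where is my order?": "Check Orders > Track shipment for real-time status.",
    "How do I return an item?": "Start a return from Orders within 30 days of delivery.",
}

questions = list(FAQ.keys())
vectorizer = TfidfVectorizer().fit(questions)
question_vectors = vectorizer.transform(questions)


def answer(user_message: str, threshold: float = 0.2) -> str:
    scores = cosine_similarity(vectorizer.transform([user_message]), question_vectors)[0]
    best = scores.argmax()
    if scores[best] < threshold:
        return "I'm not sure. Let me hand you over to a human agent."
    return FAQ[questions[best]]


if __name__ == "__main__":
    print(answer("I forgot my password, help!"))
    print(answer("what's the weather like today?"))
```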

Chatbot lawyer overturns 160,000 parking tickets in London and New York — from theguardian.com by Samuel Gibbs
Free service DoNotPay helps appeal over $4m in parking fines in just 21 months, but is just the tip of the legal AI iceberg for its 19-year-old creator

Excerpt:

An artificial-intelligence lawyer chatbot has successfully contested 160,000 parking tickets across London and New York for free, showing that chatbots can actually be useful.

Dubbed “the world’s first robot lawyer” by its 19-year-old creator, London-born second-year Stanford University student Joshua Browder, DoNotPay helps users contest parking tickets through an easy-to-use, chat-like interface.

Addendum on 6/29/16:

A New Chatbot Would Like to Help You With Your Bank Account — from wired.com by Cade Metz

Excerpt:

People in Asia are already using MyKai, and beginning today, you can too. Because it’s focused on banking—and banking alone—it works pretty well. But it’s also flawed. And it’s a bit weird. Unsettling, even. All of which makes it a great way of deconstructing the tech world’s ever-growing obsession with “chatbots.”

MyKai is remarkably adept at understanding what I’m asking—and that’s largely because it’s focused solely on banking. When I ask “How much money do I have?” or “How much did I spend on food in May?,” it understands. But when I ask who won the Spain-Italy match at Euro 2016, it suggests I take another tack. The thing to realize about today’s chatbots is that they can be reasonably effective if they’ve been honed to a particular task—and that they break down if the scope gets too wide.

 

UX to LX: The Rise of Learner Experience Design — from edsurge.com by Whitney Kilgore

Excerpt:

Instructional design is now approaching a similar transition. Most student consumers have yet to experience great learning design, but the commoditization of online learning is forcing colleges and universities to think differently about how they construct digital courses. Courseware is enabling the development of new modalities and pedagogical shifts. An abundance of data now enables instructional designers to decode learning patterns. As a result, we are witnessing the growth of a new field: Learner Experience Design.

Parse higher-education job postings and descriptions, and it’s evident that LX design is, as a discipline, among the fastest growing fields in education. But what exactly makes for great learning design, and how can instructional designers ensure they remain competitive in this new era of student-centric education?

The transition to digital content has made entirely new layers of student data available. Learners now leave a digital footprint that allows designers to understand how students are interacting with course materials and for how long. LX designers can develop course pathways that connect student challenges to specific sections of content. For the first time, faculty have insights into time on task—before, during and after class. Ready access to student behavior data is helping institutions develop powerful predictive analytics, and LX designers are leading the field to make more and better informed choices on content delivery to help students better understand the critical concepts.

The groundswell of data and learning technology shows no sign of slowing down, and an LX designer’s job will grow more complex alongside it. Learner experience designers must rise to the challenge, so universities can deliver online courses that captivate and resonate with each unique student.
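
As one concrete illustration of the "time on task" signal described above, the sketch below aggregates per-student time spent on course items from a simple clickstream log. The event format and the 30-minute idle cutoff are assumptions for illustration, not the schema of any particular LMS.

```python
# Sketch: estimate per-student, per-resource time on task from a clickstream log.
# The (student, resource, timestamp) event format and the 30-minute idle cutoff
# are illustrative assumptions, not a real LMS schema.
from collections import defaultdict
from datetime import datetime, timedelta

IDLE_CUTOFF = timedelta(minutes=30)

events = [  # would normally come from the LMS event stream
    ("ana", "module-1-video", datetime(2016, 6, 1, 9, 0)),
    ("ana", "module-1-quiz", datetime(2016, 6, 1, 9, 12)),
    ("ana", "module-2-reading", datetime(2016, 6, 1, 13, 0)),  # after a long idle gap
    ("ben", "module-1-video", datetime(2016, 6, 1, 10, 5)),
    ("ben", "module-1-quiz", datetime(2016, 6, 1, 10, 25)),
]


def time_on_task(event_log):
    totals = defaultdict(timedelta)
    by_student = defaultdict(list)
    for student, resource, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
        by_student[student].append((resource, ts))
    for student, items in by_student.items():
        # Time "on" a resource = gap until the next event, unless the student went idle.
        for (resource, ts), (_, next_ts) in zip(items, items[1:]):
            gap = next_ts - ts
            if gap <= IDLE_CUTOFF:
                totals[(student, resource)] += gap
    return totals


if __name__ == "__main__":
    for (student, resource), spent in time_on_task(events).items():
        print(f"{student} spent {spent} on {resource}")
```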

 

Ten skills you need to be a UX unicorn — from medium.com by Conor Ward

Excerpt:

So if the discipline of UX is not about improving how things LOOK, but instead how they WORK, then of course UX Design includes a multitude of varied deep specialisms and expertise. How could it not?

Well, that’s where the mythical part of this discussion comes in: many UX designers out there still believe very strongly (and for good reason) that this multi-skilled ‘specialist-generalist’ cannot exist.

They could well be correct in their current circumstances. For example, if their company does not work like this, then how could they? Especially if the company is an agency and their access to users is limited or non-existent.

My own experience is that we must strive towards unicorn-ism. I have created and curated a fantastic team of UX Unicorns (yes, a group of unicorns is called a blessing; Google says so) and, more importantly, together with my colleagues and bosses we have created the environment for them to survive and thrive.

From DSC:
The above two articles get at a piece of what I was trying to relay in my article at The EvoLLLution.com: the growing complexity of putting digitally-based materials online — with a high degree of engagement, professionalism, and quality — requires the use of specialists. One person simply can’t do it all anymore. In fact, User Experience Designers and Learner Experience Designers are but two of the potential players at the table.

Specialists central to high-quality, engaging online programming [Christian]

[Image: Daniel Christian’s article at The EvoLLLution, June 20, 2016]

Specialists central to high-quality, engaging online programming — from EvoLLLution.com (where the LLL stands for lifelong learning) by Daniel Christian

Excerpts:

Creating high-quality online courses is getting increasingly complex—requiring an ever-growing set of skills. Faculty members can’t do it all, nor can instructional designers, nor can anyone else.  As time goes by, new entrants and alternatives to traditional institutions of higher education will likely continue to appear on the higher education landscape—the ability to compete will be key.

For example, will there be a need for the following team members in your not-too-distant future?

  • Human Computer Interaction (HCI) Specialists: those with knowledge of how to leverage Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) in order to create fun and engaging learning experiences (while still meeting the learning objectives)
  • Data Scientists
  • Artificial Intelligence Integrators
  • Cognitive Computing Specialists
  • Intelligent Tutoring Developers
  • Learning Agent Developers
  • Algorithm Developers
  • Personalized Learning Specialists
  • Cloud-based Learner Profile Administrators
  • Transmedia Designers
  • Social Learning Experts

 

From DSC:
If the future of TV is apps > and if bots are the new apps > does that mean that the future of TV is bots…?

[Photo: Apple CEO Tim Cook introduces the new Apple TV during a special event at Bill Graham Civic Auditorium, San Francisco, September 9, 2015. Stephen Lam/Getty Images]

[Slide image: from a presentation titled “Bots are loving you”]

[Images: Holographic storytelling (JWT Intelligence, June 2016)]

Holographic storytelling — from jwtintelligence.com by Jade Perry

Excerpt (emphasis DSC):

The stories of Holocaust survivors are brought to life with the help of interactive 3D technologies.

‘New Dimensions in Testimony’ is a new way of preserving history for future generations. The project brings to life the stories of Holocaust survivors with 3D video, revealing raw first-hand accounts that are more interactive than learning through a history book.

Holocaust survivor Pinchas Gutter, the first subject of the project, was filmed answering over 1000 questions, generating approximately 25 hours of footage. By incorporating natural language processing from Conscience Display, viewers were able to ask Gutter’s holographic image questions that triggered relevant responses.
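
A minimal sketch of the underlying idea (routing a viewer's question to the closest pre-recorded answer) might look like the following. The sample questions, clip names, and the use of difflib are purely illustrative and are not how the actual 'New Dimensions in Testimony' system works.

```python
# Sketch: route a spoken or typed question to the closest pre-recorded answer clip.
# The question/clip pairs and the difflib similarity measure are illustrative only;
# the real New Dimensions in Testimony system is far more sophisticated.
import difflib

RECORDED_ANSWERS = {
    "where were you born": "clip_0001_birthplace.mp4",
    "what happened to your family": "clip_0412_family.mp4",
    "how did you survive": "clip_0733_survival.mp4",
    "what message do you have for young people": "clip_0998_message.mp4",
}


def pick_clip(question: str) -> str:
    """Return the clip whose indexed question is most similar to the viewer's question."""
    best_key = max(
        RECORDED_ANSWERS,
        key=lambda k: difflib.SequenceMatcher(None, question.lower(), k).ratio(),
    )
    return RECORDED_ANSWERS[best_key]


if __name__ == "__main__":
    print(pick_clip("Can you tell me where you were born?"))
    print(pick_clip("What would you like young people to remember?"))
```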

From DSC:
I wonder… is this an example of a next-generation, visually-based chatbot*?

With the growth of artificial intelligence (AI), intelligent systems, and new types of human computer interaction (HCI), this type of concept could offer an on-demand learning approach that’s highly engaging — and accessible from face-to-face settings as well as from online-based learning environments. (If it could be made to take in some of the context of a particular learner and where a learner is in the relevant Zone of Proximal Development (via web-based learner profiles/data), it would be even better.)

As an aside, is this how we will obtain customer service from the businesses of the future? See below.

*The complete beginner’s guide to chatbots — from chatbotsmagazine.com by Matt Schlicht
Everything you need to know.

Excerpt (emphasis DSC):

What are chatbots? Why are they such a big opportunity? How do they work? How can I build one? How can I meet other people interested in chatbots?

These are the questions we’re going to answer for you right now.

What is a chatbot?
A chatbot is a service, powered by rules and sometimes artificial intelligence, that you interact with via a chat interface. The service could be any number of things, ranging from functional to fun, and it could live in any major chat product (Facebook Messenger, Slack, Telegram, Text Messages, etc.).

Examples of chatbots
Weather bot. Get the weather whenever you ask.
Grocery bot. Help me pick out and order groceries for the week.
News bot. Ask it to tell you whenever something interesting happens.
Life advice bot. I’ll tell it my problems and it helps me think of solutions.
Personal finance bot. It helps me manage my money better.
Scheduling bot. Get me a meeting with someone on the Messenger team at Facebook.
A bot that’s your friend. In China there is a bot called Xiaoice, built by Microsoft, that over 20 million people talk to.
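
To ground that definition ("powered by rules and sometimes artificial intelligence, that you interact with via a chat interface"), here is a toy rules-only bot in Python. The intents and canned replies are invented for illustration; a real bot would sit behind a messaging platform rather than the console.

```python
# Toy sketch of the "powered by rules" end of the spectrum: a console chatbot
# that maps keyword patterns to canned replies. Intents and replies are invented.
import re

RULES = [
    (re.compile(r"\bweather\b", re.I), "It's 72 degrees and sunny (well, somewhere)."),
    (re.compile(r"\b(meeting|schedule)\b", re.I), "What day works for you? I'll find a slot."),
    (re.compile(r"\b(news|happening)\b", re.I), "Here's the latest headline I found for you."),
    (re.compile(r"\b(bye|quit)\b", re.I), None),  # None signals "end the chat"
]


def reply(message: str):
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return "Sorry, I don't understand that yet."


if __name__ == "__main__":
    print("bot> Hi! Ask me about the weather, your schedule, or the news.")
    while True:
        text = input("you> ")
        answer = reply(text)
        if answer is None:
            print("bot> Goodbye!")
            break
        print(f"bot> {answer}")
```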

HoloAnatomy app previews use of augmented reality in medical schools — from medgadget.com

Excerpt:

The Cleveland Clinic has partnered with Case Western Reserve University to release a Microsoft HoloLens app that allows users to explore the human body using augmented reality technology. The HoloLens is a headset that superimposes computer-generated 3D graphics onto a person’s field of view, essentially blending reality with virtual reality.

The HoloAnatomy app lets people explore a virtual human, to walk around it looking at details of different systems of the body, and to select which are showing. Even some sounds are replicated, such as that of the beating heart.

Will “class be in session” soon on tools like Prysm & Bluescape? If so, there will be some serious global interaction, collaboration, & participation here! [Christian]

From DSC:
Below are some questions and thoughts that are going through my mind:

  • Will “class be in session” soon on tools like Prysm & Bluescape?
  • Will this type of setup be the next platform that we’ll use to meet our need to be lifelong learners? That is, will what we know of today as Learning Management Systems (LMS) and Content Management Systems (CMS) morph into this type of setup?
  • Via platforms/operating systems like tvOS, will our connected TVs turn into much more collaborative devices, allowing us to contribute content with learners from all over the globe?
  • Prysm is already available on mobile devices, and what we consider a television continues to morph.
  • Will second and third screens be used in such setups? What functionality will be assigned to the main/larger screens? To the mobile devices?
  • Will colleges and universities innovate into such setups?  Or will organizations like LinkedIn.com/Lynda.com lead in this space? Or will it be a bit of both?
  • How will training, learning and development groups leverage these tools/technologies?
  • Are there some opportunities for homeschoolers here?

Along these lines, here are some videos/images/links for you:

[Images: Prysm Visual Workspace (June 2016) and Bluescape (2015, 2016)]

[Image: Lynda.com on Apple TV, June 2016]

[Image: The Living [Class] Room, by Daniel Christian, July 2012: a second device used in conjunction with a Smart/Connected TV]

Also see:

[Image: the Kitchen Stories app on Apple TV, May 2016]

Also see:

 


Prysm Adds Enterprise-Wide Collaboration with Microsoft Applications — from ravepubs.com by Gary Kayye

Excerpt:

To enhance the Prysm Visual Workplace, Prysm today announced an integration with Microsoft OneDrive for Business and Office 365. Using the OneDrive for Business API from Microsoft, Prysm has made it easy for customers to connect Prysm to their existing OneDrive for Business environments to make it a seamless experience for end users to access, search for, and sync with content from OneDrive for Business. Within a Prysm Visual Workplace project, users may now access, work within and download content from Office 365 using Prysm’s built-in web capabilities.
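
For readers curious what that kind of integration looks like at the API level, the sketch below lists the files in a OneDrive for Business root folder through the Microsoft Graph endpoint. The access token is a placeholder obtained via an OAuth 2.0 flow, and the snippet is a generic illustration, not Prysm's actual implementation.

```python
# Generic sketch (not Prysm's implementation): list the files in a user's
# OneDrive for Business root folder through the Microsoft Graph API.
# ACCESS_TOKEN is a placeholder obtained via an OAuth 2.0 flow.
import requests

ACCESS_TOKEN = "YOUR_OAUTH_ACCESS_TOKEN"  # placeholder
GRAPH_URL = "https://graph.microsoft.com/v1.0/me/drive/root/children"


def list_onedrive_files() -> None:
    response = requests.get(
        GRAPH_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    for item in response.json().get("value", []):
        kind = "folder" if "folder" in item else "file"
        print(f"{kind}: {item.get('name')}")


if __name__ == "__main__":
    list_onedrive_files()
```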

Questions from DSC:

  • Which jobs/positions are being impacted by new forms of Human Computer Interaction (HCI)?
  • What new jobs/positions will be created by these new forms of HCI?
  • Will it be necessary for instructional technologists, instructional designers, teachers, professors, trainers, coaches, learning space designers, and others to pulse check this landscape?  Will that be enough? 
  • Or will such individuals need to dive much deeper than that in order to build the necessary skillsets, understandings, and knowledgebases to meet the new/changing expectations for their job positions?
  • How many will say, “No thanks, that’s not for me” — causing organizations to create new positions that do dive deeply in this area?
  • Will colleges and universities build and offer more courses involving HCI?
  • Will Career Services Departments get up to speed in order to help students carve out careers involving new forms of HCI?
  • How will languages and language translation be impacted by voice recognition software?
  • Will new devices be introduced to our classrooms in the future?
  • In the corporate space, how will training departments handle these new needs and opportunities?  How will learning & development groups be impacted? How will they respond in order to help the workforce get/be prepared to take advantage of these sorts of technologies? What does it mean for these staffs personally? Do they need to invest in learning more about these advancements?

As an example of what I’m trying to get at here, who all might be involved with an effort like Echo Dot? What types of positions helped create it? Who all could benefit from it? What other platforms could these technologies be integrated into? Besides the home, where else might we find these types of devices?



[Image: What is Echo Dot? (Amazon, June 2016)]

Echo Dot is a hands-free, voice-controlled device that uses the same far-field voice recognition as Amazon Echo. Dot has a small built-in speaker—it can also connect to your speakers over Bluetooth or with the included audio cable. Dot connects to the Alexa Voice Service to play music, provide information, news, sports scores, weather, and more—instantly.

Echo Dot can hear you from across the room, even while music is playing. When you want to use Echo Dot, just say the wake word “Alexa” and Dot responds instantly. If you have more than one Echo or Echo Dot, you can set a different wake word for each—you can pick “Amazon”, “Alexa” or “Echo” as the wake word.
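
For developers wondering how their own services plug into Alexa on devices like Echo Dot, here is a minimal sketch of an Alexa Skills Kit handler written in the style of an AWS Lambda function. The intent name and reply text are invented; the request/response JSON shape follows the published Alexa custom-skill format and says nothing about Echo Dot internals.

```python
# Minimal sketch of an Alexa Skills Kit handler (AWS Lambda style).
# The "GetCampusHoursIntent" name and reply text are invented for illustration;
# the request/response JSON shape follows the Alexa custom-skill format.
def lambda_handler(event, context):
    request = event.get("request", {})
    if request.get("type") == "LaunchRequest":
        speech = "Welcome. You can ask me for today's library hours."
    elif (request.get("type") == "IntentRequest"
          and request.get("intent", {}).get("name") == "GetCampusHoursIntent"):
        speech = "The library is open from 8 a.m. to 11 p.m. today."
    else:
        speech = "Sorry, I didn't catch that."

    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }


if __name__ == "__main__":
    fake_event = {"request": {"type": "IntentRequest",
                              "intent": {"name": "GetCampusHoursIntent"}}}
    print(lambda_handler(fake_event, None)["response"]["outputSpeech"]["text"])
```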

Or how might students learn about the myriad of technologies involved with IBM’s Watson? What courses are out there today that address these technologies? Are more courses in the works? In which areas (Computer Science, User Experience Design, Interaction Design, others)?

[Image: What is IBM Watson? (June 2016)]

Lots of questions…but few answers at this point. Still, given the increasing pace of technological change, it’s important that we think about this type of thing and become more responsive, nimble, and adaptive in our organizations and in our careers.

Top 10 future technology jobs: VR developer, IoT specialist and AI expert — from v3.co.uk; with thanks to Norma Owen for this resource
V3 considers some of the emerging technology jobs that could soon enter your business

Top 10 jobs:

10. VR developer
9.   Blockchain engineer/developer
8.   Security engineer
7.   Internet of Things architect
6.   UX designer
5.   Data protection officer
4.   Chief digital officer
3.   AI developer
2.   DevOps engineer
1.   Data scientist

 

Amazon now lets you test drive Echo’s Alexa in your browser — by Dan Thorp-Lancaster

Excerpt:

If you’ve ever wanted to try out the Amazon Echo before shelling out for one, you can now do just that right from your browser. Amazon has launched a dedicated website where you can try out an Echo simulation and put Alexa’s myriad of skills to the test.

[Image: Echosim.io Amazon Echo simulator, May 2016]

From DSC:
The use of voice and gestures to communicate with a computing device or software program represents a growing form of Human Computer Interaction (HCI). With the growth of artificial intelligence (AI), personal assistants, and bots, we should expect to see voice recognition services/capabilities baked into an increasing number of products and solutions in the future.

Given these trends, personnel working within K-12 and higher ed need to start building their knowledgebases now so that we can begin offering more courses in the near future to help students build their skillsets. Current user experience designers, interface designers, programmers, graphic designers, and others will also need to augment their skillsets.

Now you can build your own Amazon Echo at home—and Amazon couldn’t be happier — from qz.com by Michael Coren

Excerpt:

Amazon’s $180 Echo and the new Google Home (due out later this year) promise voice-activated assistants that order groceries, check calendars and perform sundry tasks of your everyday life. Now, with a little initiative and some online instructions, you can build the devices yourself for a fraction of the cost. And that’s just fine with the tech giants.

At this weekend’s Bay Area Maker Faire, Arduino, an open-source electronics manufacturer, announced new hardware “boards”—bundles of microprocessors, sensors, and ports—that will ship with voice and gesture capabilities, along with Wi-Fi and Bluetooth connectivity. By plugging them into the free voice-recognition services offered by Google’s Cloud Speech API and Amazon’s Alexa Voice Service, anyone can access world-class natural language processing power, and tap into the benefits those companies are touting. Amazon has even released its own blueprint and code repository to build a $60 version of its Echo using Raspberry Pi, another piece of open-source hardware.
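
As a taste of the "plug into a cloud speech service" step in such do-it-yourself builds, here is a sketch that sends a short recording to the Google Cloud Speech API for transcription. It assumes the google-cloud-speech client library, configured credentials, and a 16 kHz mono LINEAR16 WAV file; the file name is a placeholder.

```python
# Sketch: transcribe a short recording with the Google Cloud Speech API, the kind
# of call a DIY Echo would make after capturing audio from its microphone.
# Assumes `pip install google-cloud-speech`, configured credentials, and a
# 16 kHz mono LINEAR16 WAV file; "command.wav" is a placeholder.
from google.cloud import speech


def transcribe(path: str = "command.wav") -> None:
    client = speech.SpeechClient()
    with open(path, "rb") as audio_file:
        audio = speech.RecognitionAudio(content=audio_file.read())
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code="en-US",
    )
    response = client.recognize(config=config, audio=audio)
    for result in response.results:
        print("heard:", result.alternatives[0].transcript)


if __name__ == "__main__":
    transcribe()
```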

 

From DSC:
Perhaps this type of endeavor could find its way into some project-based learning out there, as well as in:

  • Some Computer Science-related courses
  • Some Engineering-related courses
  • User Experience Design bootcamps
  • Makerspaces
  • Programs targeted at gifted students
  • Other…??
