From DSC:
Interactive video — a potentially very powerful medium, especially for blended and online-based courses or training-related materials! This interactive piece from Heineken is very well done: it even remembers how you answered and delivers its own evaluation of you based on its 12-question “interview.”

But notice again that a TEAM of specialists is needed to create such a piece. Neither a faculty member, a trainer, nor an instructional designer can do something like this all on their own. Some of the positions I could imagine here are:

  • Script writer(s)
  • Editor(s)
  • Actors and actresses
  • Those skilled in stage lighting and sound / audio recording
  • Digital video editors
  • Programmers
  • Graphic designers
  • Web designers
  • Producers
  • Product marketers
  • …and perhaps others

This is the kind of work that I wish we saw more of in the world of online and blended courses!  Also, I appreciated their use of humor. Overall, a very engaging, fun, and informative piece!
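
For those wondering about the underlying mechanics, the interactive layer in a piece like this doesn’t require exotic technology: the player records each answer, uses it to choose the next clip, and tallies up a profile at the end. Below is a minimal sketch of that idea in Python; the questions, traits, and scoring are all invented for illustration and are not Heineken’s actual implementation:

# A toy model of an interactive-video "interview": each answer is recorded,
# drives which clip plays next, and contributes to a final evaluation.
# Hypothetical questions, traits, and scoring -- for illustration only.

from collections import Counter

QUESTIONS = {
    "q1": {
        "prompt": "The room is on fire. What do you do first?",
        "options": {
            "a": {"trait": "decisive", "next": "q2"},
            "b": {"trait": "analytical", "next": "q2"},
        },
    },
    "q2": {
        "prompt": "Pick a seat at the interview table.",
        "options": {
            "a": {"trait": "sociable", "next": None},   # None marks the end
            "b": {"trait": "decisive", "next": None},
        },
    },
}

def run_interview(answers):
    """Walk the question graph, record each answer, and return an evaluation."""
    traits = Counter()
    transcript = []
    current = "q1"
    for choice in answers:
        option = QUESTIONS[current]["options"][choice]
        transcript.append((current, choice))
        traits[option["trait"]] += 1
        if option["next"] is None:
            break
        current = option["next"]
    evaluation = traits.most_common(1)[0][0] if traits else "undecided"
    return {"transcript": transcript, "evaluation": evaluation}

print(run_interview(["a", "b"]))
# {'transcript': [('q1', 'a'), ('q2', 'b')], 'evaluation': 'decisive'}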

[Screenshots: Heineken interactive video (September 2016)]

LinkedIn ProFinder expands nationwide to help you hire freelancers — from blog.linkedin.com

Excerpt:

The freelance economy is on the rise. In fact, the number of freelancers on LinkedIn has grown by nearly 50% in just the past five years. As the workforce evolves, we, too, are evolving to ensure we’re creating opportunity for the expanding sector of professionals looking for independent, project-based work in place of the typical 9 to 5 profession.

Last October, we began piloting a brand new platform in support of this very endeavor and today, we’re excited to announce its nationwide availability. Introducing LinkedIn ProFinder, a LinkedIn marketplace that connects consumers and small businesses looking for professional services – think Design, Writing and Editing, Accounting, Real Estate, Career Coaching – with top quality freelance professionals best suited for the job.

Also see:

[Screenshot: LinkedIn ProFinder (August 2016)]

Also see:

[Chart: 40 percent freelancers by 2020 (Quartz, April 2013)]

[Screenshot: Campus Technology article on AR/VR/MR in education with Daniel Christian (8-16-16)]

From Dreams to Realities: AR/VR/MR in Education | A Q&A with Daniel Christian — from campustechnology.com by Mary Grush; I’d like to thank Jason VanHorn for his contributions to this article

Excerpt:

Grush: Is there a signpost you might point to that would indicate that there’s going to be more product development in AR/VR/MR?

Christian: There’s a significant one. Several major players — with very deep pockets — within the corporate world are investing in new forms of HCI, including Microsoft, Google, Apple, Facebook, Magic Leap, and others. In fact, according to an article on engadget.com from 6/16/16, “Magic Leap has amassed an astounding $1.39 billion in funding without shipping an actual product.” So to me, it’s just not likely that the billions of dollars being invested in a variety of R&D-related efforts are simply going to evaporate without producing any impactful, concrete products or services. There are too many extremely smart, creative people working on these projects, and they have impressive financial backing behind their research and product development efforts. So, I think we can expect an array of new choices in AR/VR/MR.

Just the other day I was talking to Jason VanHorn, an associate professor in our geology, geography, and environmental studies department. After finishing our discussion about a particular learning space and how we might implement active learning in it, we got to talking about mixed reality. He related his wonderful dreams of being able to view, manipulate, maneuver through, and interact with holographic displays of our planet Earth.

When I mentioned a video piece done by Case Western and the Cleveland Clinic that featured Microsoft’s Hololens technology, he knew exactly what I was referring to. But this time, instead of being able to drill down through the human body to review, explore, and learn about the various systems composing our human anatomy, he wanted to be able to drill down through the various layers of the planet Earth. He also wanted to be able to use gestures to maneuver and manipulate the globe — turning the globe to just the right spot before using a gesture to drill down to a particular place.

[Video embed, uploaded on Jul 21, 2016]

Description:
A new wave of compute technology, fueled by big data analytics, the internet of things, augmented reality and so on, will change the way we live and work to be more immersive and natural with technology in the role as partner.


Also see:

Excerpt:

We haven’t even scratched the surface of the things technology can do to further human progress.  Education is the next major frontier.  We already have PC- and smartphone-enabled students, as well as tech-enabled classrooms, but the real breakthrough will be in personalized learning.

Every educator divides his or her time between teaching and interacting.  In lectures they have to choose between teaching to the smartest kid in the class or the weakest.  Efficiency (and reality) dictates that they must teach to the theoretical median, meaning some students will be bored and some will still struggle.  What if a digital assistant could step in to personalize the learning experience for each student, accelerating the curriculum for the advanced students and providing greater extra support for those that need more help?  The digital assistant could “sense” and “learn” that Student #1 has already mastered a particular subject and “assign” more advanced materials.  And it could provide additional work to Student #2 to ensure that he or she was ready for the next subject.  Self-paced learning to supplant and support classroom learning…that’s the next big advancement.
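
The “sense, learn, assign” loop described above can be pictured as a small rule engine sitting on top of mastery data. Here is a rough sketch in Python; the thresholds and material names are made up purely to make the idea concrete, and a real system would draw on far richer learner data:

# A toy "digital assistant" that routes each student to the next activity
# based on a mastery score. Thresholds and material names are invented
# for illustration only.

def next_assignment(student, topic, mastery_score):
    """Return the next activity for one student on one topic.

    mastery_score: fraction of recent items answered correctly (0.0-1.0).
    """
    if mastery_score >= 0.9:
        return f"{student}: advanced/enrichment materials for {topic}"
    if mastery_score >= 0.7:
        return f"{student}: proceed to the unit after {topic}"
    return f"{student}: extra practice and worked examples on {topic}"

# Student #1 has mastered the topic; Student #2 still needs more support.
print(next_assignment("Student 1", "fractions", 0.95))
print(next_assignment("Student 2", "fractions", 0.55))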


Specialists central to high-quality, engaging online programming [Christian]

[Screenshot: Specialists central to high-quality, engaging online programming (The EvoLLLution, 6-20-16)]

Specialists central to high-quality, engaging online programming — from EvoLLLution.com (where the LLL stands for lifelong learning) by Daniel Christian

Excerpts:

Creating high-quality online courses is getting increasingly complex—requiring an ever-growing set of skills. Faculty members can’t do it all, nor can instructional designers, nor can anyone else.  As time goes by, new entrants and alternatives to traditional institutions of higher education will likely continue to appear on the higher education landscape—the ability to compete will be key.

For example, will there be a need for the following team members in your not-too-distant future?

  • Human Computer Interaction (HCI) Specialists: those with knowledge of how to leverage Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) in order to create fun and engaging learning experiences (while still meeting the learning objectives)
  • Data Scientists
  • Artificial Intelligence Integrators
  • Cognitive Computing Specialists
  • Intelligent Tutoring Developers
  • Learning Agent Developers
  • Algorithm Developers
  • Personalized Learning Specialists
  • Cloud-based Learner Profile Administrators
  • Transmedia Designers
  • Social Learning Experts

[Screenshots: Holographic storytelling (JWT Intelligence, June 2016)]

Holographic storytelling — from jwtintelligence.com by Jade Perry

Excerpt (emphasis DSC):

The stories of Holocaust survivors are brought to life with the help of interactive 3D technologies.

‘New Dimensions in Testimony’ is a new way of preserving history for future generations. The project brings to life the stories of Holocaust survivors with 3D video, revealing raw first-hand accounts that are more interactive than learning through a history book.

Holocaust survivor Pinchas Gutter, the first subject of the project, was filmed answering over 1000 questions, generating approximately 25 hours of footage. By incorporating natural language processing from Conscience Display, viewers were able to ask Gutter’s holographic image questions that triggered relevant responses.
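
One way to picture the natural language piece of this: since roughly 25 hours of answers were pre-recorded, an incoming question only needs to be matched to the closest existing clip. Below is a bare-bones keyword-overlap sketch of that matching step in Python; the clip IDs and texts are invented placeholders, and the real project’s pipeline is certainly far more sophisticated:

# A bare-bones illustration of matching a spoken question to the most
# similar pre-recorded answer clip, using simple word overlap.
# The indexed questions below are invented placeholders, not actual testimony.

import re

CLIPS = {
    "clip_017": "Where were you born and what was your childhood like?",
    "clip_132": "How did you survive during the war?",
    "clip_305": "What message do you have for young people today?",
}

def tokenize(text):
    return set(re.findall(r"[a-z']+", text.lower()))

def best_clip(question):
    """Return the clip whose indexed question shares the most words with the input."""
    q_words = tokenize(question)
    scores = {
        clip_id: len(q_words & tokenize(indexed_question))
        for clip_id, indexed_question in CLIPS.items()
    }
    return max(scores, key=scores.get)

print(best_clip("What would you like to tell young people?"))  # clip_305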


From DSC:
I wonder…is this an example of a next generation, visually-based chatbot*?

With the growth of artificial intelligence (AI), intelligent systems, and new types of human computer interaction (HCI), this type of concept could offer an on-demand learning approach that’s highly engaging — and accessible from face-to-face settings as well as from online-based learning environments. (If it could be made to take in some of the context of a particular learner and where a learner is in the relevant Zone of Proximal Development (via web-based learner profiles/data), it would be even better.)

As an aside, is this how we will obtain customer service from the businesses of the future? See below.


*The complete beginner’s guide to chatbots — from chatbotsmagazine.com by Matt Schlicht
Everything you need to know.

Excerpt (emphasis DSC):

What are chatbots? Why are they such a big opportunity? How do they work? How can I build one? How can I meet other people interested in chatbots?

These are the questions we’re going to answer for you right now.

What is a chatbot?
A chatbot is a service, powered by rules and sometimes artificial intelligence, that you interact with via a chat interface. The service could be any number of things, ranging from functional to fun, and it could live in any major chat product (Facebook Messenger, Slack, Telegram, Text Messages, etc.).

Examples of chatbots

  • Weather bot. Get the weather whenever you ask.
  • Grocery bot. Help me pick out and order groceries for the week.
  • News bot. Ask it to tell you whenever something interesting happens.
  • Life advice bot. I’ll tell it my problems and it helps me think of solutions.
  • Personal finance bot. It helps me manage my money better.
  • Scheduling bot. Get me a meeting with someone on the Messenger team at Facebook.
  • A bot that’s your friend. In China there is a bot called Xiaoice, built by Microsoft, that over 20 million people talk to.
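
To make the “powered by rules and sometimes artificial intelligence” part of the definition concrete, here is about the smallest possible rules-only bot, sketched in Python. The patterns and replies are invented for illustration; a production bot would live inside a platform such as Messenger or Slack rather than a terminal:

# A minimal rules-only chatbot: it matches keywords in the user's message
# and returns a canned reply. No AI involved -- just the "rules" half of
# the definition above. Patterns and replies are invented for illustration.

RULES = [
    ({"weather", "forecast"}, "It looks sunny today (in this demo, it always does)."),
    ({"grocery", "groceries", "shopping"}, "Added milk, eggs, and bread to your list."),
    ({"news"}, "Nothing interesting has happened yet. I'll let you know."),
]

FALLBACK = "Sorry, I don't understand that yet."

def reply(message):
    words = set(message.lower().split())
    for keywords, response in RULES:
        if words & keywords:          # any keyword present -> use this rule
            return response
    return FALLBACK

if __name__ == "__main__":
    for line in ["What's the weather like?", "Order my groceries", "Tell me a joke"]:
        print(f"> {line}\n{reply(line)}")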


Faculty Innovation Toolkit — from campustechnology.com by Leila Meyer and Dian Schaffhauser

Excerpt:

  • 15 Sites for Free Digital Textbooks
  • 12 Tips for Gamifying a Course
  • 10 Tools for More Interactive Videos
  • 4 Ways to Use Social Media for Learning

[Screenshot: Faculty Innovation Toolkit (Campus Technology, June 2016)]

Will “class be in session” soon on tools like Prysm & Bluescape? If so, there will be some serious global interaction, collaboration, & participation here! [Christian]

From DSC:
Below are some questions and thoughts that are going through my mind:

  • Will “class be in session” soon on tools like Prysm & Bluescape?
  • Will this type of setup be the next platform that we’ll use to meet our need to be lifelong learners? That is, will what we know of today as Learning Management Systems (LMS) and Content Management Systems (CMS) morph into this type of setup?
  • Via platforms/operating systems like tvOS, will our connected TVs turn into much more collaborative devices, allowing us to contribute content with learners from all over the globe?
  • Prysm is already available on mobile devices, and what we consider a television continues to morph.
  • Will second and third screens be used in such setups? What functionality will be assigned to the main/larger screens? To the mobile devices?
  • Will colleges and universities innovate into such setups?  Or will organizations like LinkedIn.com/Lynda.com lead in this space? Or will it be a bit of both?
  • How will training, learning and development groups leverage these tools/technologies?
  • Are there some opportunities for homeschoolers here?

Along these lines, here are some videos/images/links for you:

[Screenshots: Prysm Visual Workspace (June 2016)]

[Screenshots: Bluescape (2015 and 2016)]

[Screenshot: Lynda.com on Apple TV (June 2016)]

[Image: The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV]

Also see:

[Screenshot: Kitchen Stories on Apple TV (May 2016)]
Also see:

Prysm Adds Enterprise-Wide Collaboration with Microsoft Applications — from ravepubs.com by Gary Kayye

Excerpt:

To enhance the Prysm Visual Workplace, Prysm today announced an integration with Microsoft OneDrive for Business and Office 365. Using the OneDrive for Business API from Microsoft, Prysm has made it easy for customers to connect Prysm to their existing OneDrive for Business environments to make it a seamless experience for end users to access, search for, and sync with content from OneDrive for Business. Within a Prysm Visual Workplace project, users may now access, work within and download content from Office 365 using Prysm’s built-in web capabilities.
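
The excerpt doesn’t describe Prysm’s actual code, but the general shape of this kind of integration is familiar: authenticate via OAuth 2.0, then ask the OneDrive for Business back end for the contents of a folder. Below is a hedged sketch in Python against the Microsoft Graph endpoint that exposes OneDrive for Business; the token acquisition step is omitted and assumed to have already produced an access token, and the exact API Prysm uses may differ:

# A sketch of the kind of call such an integration sits on top of:
# listing the files in a user's OneDrive for Business via Microsoft Graph.
# How Prysm actually wires this up is not described in the excerpt; the
# OAuth 2.0 step is omitted and assumed to have produced ACCESS_TOKEN.

import requests

GRAPH_ROOT = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<oauth2-access-token>"  # placeholder -- obtain via OAuth 2.0

def list_drive_root():
    """Return the names of items in the root folder of the signed-in user's drive."""
    response = requests.get(
        f"{GRAPH_ROOT}/me/drive/root/children",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return [item["name"] for item in response.json().get("value", [])]

if __name__ == "__main__":
    for name in list_drive_root():
        print(name)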


Questions from DSC:

  • Which jobs/positions are being impacted by new forms of Human Computer Interaction (HCI)?
  • What new jobs/positions will be created by these new forms of HCI?
  • Will it be necessary for instructional technologists, instructional designers, teachers, professors, trainers, coaches, learning space designers, and others to pulse check this landscape?  Will that be enough? 
  • Or will such individuals need to dive much deeper than that in order to build the necessary skillsets, understandings, and knowledgebases to meet the new/changing expectations for their job positions?
  • How many will say, “No thanks, that’s not for me” — causing organizations to create new positions that do dive deeply in this area?
  • Will colleges and universities build and offer more courses involving HCI?
  • Will Career Services Departments get up to speed in order to help students carve out careers involving new forms of HCI?
  • How will languages and language translation be impacted by voice recognition software?
  • Will new devices be introduced to our classrooms in the future?
  • In the corporate space, how will training departments handle these new needs and opportunities?  How will learning & development groups be impacted? How will they respond in order to help the workforce get/be prepared to take advantage of these sorts of technologies? What does it mean for these staffs personally? Do they need to invest in learning more about these advancements?

As an example of what I’m trying to get at here, who all might be involved with an effort like Echo Dot?  What types of positions created it? Who all could benefit from it?  What other platforms could these technologies be integrated into?  Besides the home, where else might we find these types of devices?



[Image: What is Echo Dot? (June 2016)]

Echo Dot is a hands-free, voice-controlled device that uses the same far-field voice recognition as Amazon Echo. Dot has a small built-in speaker—it can also connect to your speakers over Bluetooth or with the included audio cable. Dot connects to the Alexa Voice Service to play music, provide information, news, sports scores, weather, and more—instantly.

Echo Dot can hear you from across the room, even while music is playing. When you want to use Echo Dot, just say the wake word “Alexa” and Dot responds instantly. If you have more than one Echo or Echo Dot, you can set a different wake word for each—you can pick “Amazon”, “Alexa” or “Echo” as the wake word.
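
The wake-word behavior described above boils down to a device that listens continuously but only acts on what follows its trigger word, handing the rest of the request to a cloud service. Here is a toy, text-based stand-in for that gating step in Python; the real far-field speech recognition and the Alexa Voice Service protocol are, of course, far more involved:

# A toy, text-based stand-in for wake-word gating: the device "hears"
# everything but only acts on what follows its configured wake word.
# Real devices do this with on-device speech models and then hand the
# request to a cloud service; none of that is modeled here.

WAKE_WORDS = {"alexa", "echo", "amazon"}   # the three choices mentioned above

def handle_utterance(utterance, wake_word="alexa"):
    assert wake_word in WAKE_WORDS
    words = utterance.lower().split()
    if not words or words[0] != wake_word:
        return None                        # not addressed to the device: ignore
    request = " ".join(words[1:])
    return f"Handling request: {request!r}"

print(handle_utterance("Alexa what's the weather in Grand Rapids"))
print(handle_utterance("turn on the radio"))   # ignored -> None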


Or how might students learn about the myriad of technologies involved with IBM’s Watson?  What courses are out there today that address these technologies?  Are more courses in the works? In which areas (Computer Science, User Experience Design, Interaction Design, others)?

[Image: What is IBM Watson? (June 2016)]

Lots of questions…but few answers at this point. Still, given the increasing pace of technological change, it’s important that we think through these possibilities and become more responsive, nimble, and adaptive in our organizations and in our careers.


Amazon now lets you test drive Echo’s Alexa in your browser — by Dan Thorp-Lancaster

Excerpt:

If you’ve ever wanted to try out the Amazon Echo before shelling out for one, you can now do just that right from your browser. Amazon has launched a dedicated website where you can try out an Echo simulation and put Alexa’s myriad of skills to the test.

[Screenshot: Echosim.io, Amazon Echo simulator (May 2016)]

From DSC:
The use of voice and gestures to communicate with a computing device or software program represents a growing form of Human Computer Interaction (HCI).  With the growth of artificial intelligence (AI), personal assistants, and bots, we should expect to see voice recognition services/capabilities baked into an increasing number of products and solutions in the future.

Given these trends, personnel working within K-12 and higher ed need to start building their knowledgebases now so that we can begin offering more courses in the near future to help students build their skillsets.  Current user experience designers, interface designers, programmers, graphic designers, and others will also need to augment their skillsets.


Beyond touch: designing effective gestural interactions — from blog.invisionapp.com by Yanna Vogiazou; with thanks to Mark Pomeroy for the resource


Excerpts:

The future of interaction is multimodal. But combining touch with air gestures (and potentially voice input) isn’t a typical UI design task.

Gestures are often perceived as a natural way of interacting with screens and objects, whether we’re talking about pinching a mobile screen to zoom in on a map, or waving your hand in front of your TV to switch to the next movie. But how natural are those gestures, really?

Try not to translate touch gestures directly to air gestures even though they might feel familiar and easy. Gestural interaction requires a fresh approach—one that might start as unfamiliar, but in the long run will enable users to feel more in control and will take UX design further.

Forget about buttons — think actions.

Eliminate the need for a cursor as feedback, but provide an alternative.
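
One concrete way to read “think actions, not buttons”: the application watches a stream of hand positions and infers an action such as a swipe, then gives the user some non-cursor feedback. Below is a toy Python sketch over made-up sampled coordinates; real sensors deliver much noisier data and need per-device filtering and thresholds:

# A toy "action, not button" example: classify a swipe direction from a
# short series of sampled hand positions (x, y) instead of waiting for a
# click on a target. The samples below are invented; real gesture sensors
# produce noisier data that needs filtering and device-specific tuning.

SWIPE_THRESHOLD = 0.3   # minimum horizontal travel (normalized units)

def classify_gesture(samples):
    """samples: list of (x, y) positions in normalized [0, 1] coordinates."""
    if len(samples) < 2:
        return "none"
    dx = samples[-1][0] - samples[0][0]
    dy = samples[-1][1] - samples[0][1]
    if abs(dx) >= SWIPE_THRESHOLD and abs(dx) > abs(dy):
        return "swipe right" if dx > 0 else "swipe left"
    return "none"

# A hand moving steadily to the left across the sensor's field of view:
print(classify_gesture([(0.8, 0.5), (0.6, 0.5), (0.4, 0.52), (0.2, 0.5)]))  # swipe left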


Creators of Siri reveal first public demo of AI assistant “Viv” — from seriouswonder.com by B.J. Murphy

Excerpts:

When it comes to AI assistants, a battle has been waged between different companies, with assistants like Siri, Cortana, and Alexa at the forefront of the battle. And now a new potential competitor enters the arena.

During a 20 minute onstage demo at Disrupt NYC, creators of Siri Dag Kittlaus and Adam Cheyer revealed Viv – a new AI assistant that makes Siri look like a children’s toy.


“Viv is an artificial intelligence platform that enables developers to distribute their products through an intelligent, conversational interface. It’s the simplest way for the world to interact with devices, services and things everywhere. Viv is taught by the world, knows more than it is taught, and learns every day.”

[Screenshot: Viv demo (May 2016)]

From DSC:
I saw a posting at TechCrunch.com the other day — The Information Age is over; welcome to the Experience Age.  I mention it here not so much for the article’s content as for its title. An interesting concept…and probably spot on, with ramifications for numerous types of positions, skillsets, and industries all over the globe.

Also see:

[Screenshot: Viv (May 2016)]

Addendum on 5/12/16:

  • New Siri sibling Viv may be next step in A.I. evolution — from computerworld.com by Sharon Gaudin
    Excerpt:
    With the creators of Siri offering up a new personal assistant that won’t just tell you what pizza is but can order one for you, artificial intelligence is showing a huge leap forward. Viv is an artificial intelligence (AI) platform built by Dag Kittlaus and Adam Cheyer, the creators of the AI behind Apple’s Siri, the most well-known digital assistant in the world. Siri is known for answering questions, like how old Harrison Ford is, and reminding you to buy milk on the way home. Viv, though, promises to go well past that.


Virtual reality: The Next Big Thing for college creatives — from college.usatoday.com by Morgan Buckley, University of Southern California

Excerpt:

College students across the country — from the University of Southern California to the University of Minnesota to Southern Methodist University — are also experimenting with virtual reality applications via clubs, design labs and hackathons.

The tech industry is taking note.

One of the bigger showcases for the technology took place last month, when the University of Southern California’s Virtual Reality Club (VRSC) hosted its first annual Virtual Reality Festival and Demo Day, a showcase of projects and panels with The Walt Disney Company as its title sponsor.

Students traveled from the University of California-San Diego, UCLA, Chapman University, Loyola Marymount University and the University of Colorado-Boulder to attend the fest. The judges were industry professionals from companies including NVIDIA, Google, Maker Studios and Industrial Light and Magic’s X Lab.

Some $25,000 in prizes were split among winners in four categories: 360 Live-Action Videos, 360 Animation, Interactive VR Games and Immersive Technology/Augmented Reality (AR). VR/AR categories ranged from health care to games, journalism, interactive design and interpretive dance.


A New Morning — by Magic Leap; posted on 4/19/16
Welcome to a new way to start your day. Shot directly through Magic Leap technology on April 8, 2016 without use of special effects or compositing.

[Video: A New Morning (Magic Leap, April 2016)]

Also see:


What Gen Z thinks about ed tech in college — from edtechmagazine.com by D. Frank Smith
A report on digital natives sheds light on their learning preferences.

Excerpt (emphasis DSC):

A survey of the collegiate educational-technology expectations of 1,300 middle and high school students from 49 states was captured by Barnes and Noble. The survey, Getting to Know Gen Z, includes feedback on the students’ expectations for higher education.

“These initial insights are a springboard for colleges and universities to begin understanding the mindset of Gen Z as they prepare for their future, focusing specifically on their aspirations, college expectations and use of educational technology for their academic journey ahead,” states the survey’s introduction.

Like the millennials before them, Generation Z grew up as digital natives, with devices a fixture in the learning experience. According to the survey results, these students want “engaging, interactive learning experiences” and want to be “empowered to make their own decisions.” In addition, the students “expect technology to play an instrumental role in their educational experience.”


From DSC:
First of all, I’d like to thank D. Frank Smith for the solid article and for addressing the topic of students’ expectations. These messages were echoed in what I heard a few days ago at the MVU Online Learning Symposium, a conference focused on the K-12 space.

[Image: More choice. More control. (DSC)]

I want to quote and elaborate on one of the items from the report (as mentioned in the article):

“There is a need for user-friendly tools that empower faculty to design the kinds of compelling resources that will comprise the next wave of instructional resources and materials,” the report states.

Most likely, even if such tools were developed, the end goal from the quote above won’t happen. Why? Because:

  • Most faculty simply don’t have the time — they are being overrun with all sorts of other demands on their time (committees, task forces, advising, special projects, keeping up with the changes in their disciplines, etc.)
  • Even with user-friendly tools, one still needs a variety of skill sets to create engaging, sophisticated content and learning environments. Creating “the next wave of instructional resources and materials” is waaaaaay beyond the skillsets of any one person!!! Numerous skills will be required to create the kinds of learning materials that we can expect to see in the future:
    • Information architecture
    • Instructional design
    • Interaction design
    • Videography and creating/working with multiple kinds of media
    • Programming/coding
    • Responsive web design and knowing how best to design content for multiple kinds of devices
    • User experience design
    • Graphic design
    • Game design
    • Knowledge of copyrights
    • Expertise in accessibility-related items
    • The ability to most effectively write for blended and/or online-based approaches
    • Knowing how to capture and use learning analytics/data
    • Keeping up with advancements in human computer interfaces (HCI)
    • Staying current with learning space design
    • How best to deliver personalized learning
    • and much more!

This is why I continue to assert that we need a much more team-based approach to creating our learning environments. The problem is, very few people are listening to this advice.

How can I say this?

Because I continue to hear people discussing how important professional development is and how much support is needed for faculty members.  I continue to see quotes, like the one above, that put the onus solely on the backs of our faculty members. Conferences are packed full of this type of approach.

Let’s get rid of that approach — it’s not working!  Or at least not nearly to the degree that students need it to. There may be a small percentage of faculty members who have the time and skills to pull some things off here, but even they will run into some walls eventually (depending upon the level of sophistication being pursued). None of us can do it all.

But for the most part, years have gone by and not much has changed. Rather, we need to figure out how we could use teams to create and deliver content. That would be a much wiser use of our energies and time. This perspective is not meant to dog faculty members — it’s just recognizing realities:

  • One person simply can’t do it all anymore.
  • Tools don’t exist that can pull all of the necessary pieces together.
  • Even if such tools existed, they wouldn’t be able to keep pace with the exponential rate of technological change that we’re currently experiencing — and will likely continue to experience over the next 10-20 years.

If we’re going to insist on faculty members creating the next wave of instructional materials and resources, then faculty members had better look out — they don’t know what’s about to hit them.  Forget about having families. Forget about having a life outside of creating/delivering content.  And find a way to create a 50-60 hour work DAY (not week) — because that’s how much time one will need to get anywhere close to mastery in all the prerequisite areas.

© 2025 | Daniel Christian