Faculty Innovation Toolkit — from campustechnology.com by Leila Meyer and Dian Schaffhauser

Excerpt:

  • 15 Sites for Free Digital Textbooks
  • 12 Tips for Gamifying a Course
  • 10 Tools for More Interactive Videos
  • 4 Ways to Use Social Media for Learning


[Embedded media: FacultyInnovationToolkit-CampusTechnology-June2016]

Will “class be in session” soon on tools like Prysm & Bluescape? If so, there will be some serious global interaction, collaboration, & participation here! [Christian]

From DSC:
Below are some questions and thoughts that are going through my mind:

  • Will “class be in session” soon on tools like Prysm & Bluescape?
  • Will this type of setup be the next platform that we’ll use to meet our need to be lifelong learners? That is, will what we know of today as Learning Management Systems (LMS) and Content Management Systems (CMS) morph into this type of setup?
  • Via platforms/operating systems like tvOS, will our connected TVs turn into much more collaborative devices, allowing us to contribute content with learners from all over the globe?
  • Prysm is already available on mobile devices, and what we consider a television continues to morph.
  • Will second and third screens be used in such setups? What functionality will be assigned to the main/larger screens? To the mobile devices?
  • Will colleges and universities innovate into such setups?  Or will organizations like LinkedIn.com/Lynda.com lead in this space? Or will it be a bit of both?
  • How will training, learning and development groups leverage these tools/technologies?
  • Are there some opportunities for homeschoolers here?

Along these lines, here are some videos/images/links for you:

[Embedded media: PrysmVisualWorkspace-June2016]

[Embedded media: PrysmVisualWorkspace2-June2016]

[Embedded media: BlueScape-2016]

[Embedded media: BlueScape-2015]

[Embedded media: DSC-LyndaDotComOnAppleTV-June2016]

[Embedded media: The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV]

Also see:

[Embedded media: kitchenstories-AppleTV-May2016]

Also see:
Prysm Adds Enterprise-Wide Collaboration with Microsoft Applications — from ravepubs.com by Gary Kayye

Excerpt:

To enhance the Prysm Visual Workplace, Prysm today announced an integration with Microsoft OneDrive for Business and Office 365. Using the OneDrive for Business API from Microsoft, Prysm has made it easy for customers to connect Prysm to their existing OneDrive for Business environments to make it a seamless experience for end users to access, search for, and sync with content from OneDrive for Business. Within a Prysm Visual Workplace project, users may now access, work within and download content from Office 365 using Prysm’s built-in web capabilities.
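
For a concrete sense of what “access, search for, and sync with content from OneDrive for Business” can look like at the API level, here is a minimal sketch in Python. It is not Prysm’s code, and the article does not describe Prysm’s implementation; the sketch assumes the current Microsoft Graph endpoints for OneDrive for Business and an OAuth access token obtained separately.

```python
# A rough illustration of searching and downloading OneDrive for Business
# content via Microsoft Graph. This is NOT Prysm's implementation -- just a
# sketch of the kinds of calls an integration like the one described might make.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"


def search_onedrive(access_token, query):
    """Search the signed-in user's OneDrive for Business for matching items."""
    resp = requests.get(
        f"{GRAPH}/me/drive/root/search(q='{query}')",
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("value", [])


def download_item(access_token, item_id, dest_path):
    """Download a drive item's content, e.g. to pull a file into a workspace."""
    resp = requests.get(
        f"{GRAPH}/me/drive/items/{item_id}/content",
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=60,
    )
    resp.raise_for_status()
    with open(dest_path, "wb") as f:
        f.write(resp.content)


if __name__ == "__main__":
    token = "<access token from an OAuth 2.0 / Azure AD sign-in flow>"  # placeholder
    for item in search_onedrive(token, "quarterly review"):
        print(item["name"], item["id"])
```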

Questions from DSC:

  • Which jobs/positions are being impacted by new forms of Human Computer Interaction (HCI)?
  • What new jobs/positions will be created by these new forms of HCI?
  • Will it be necessary for instructional technologists, instructional designers, teachers, professors, trainers, coaches, learning space designers, and others to pulse check this landscape?  Will that be enough? 
  • Or will such individuals need to dive much deeper than that in order to build the necessary skillsets, understandings, and knowledgebases to meet the new/changing expectations for their job positions?
  • How many will say, “No thanks, that’s not for me” — causing organizations to create new positions that do dive deeply in this area?
  • Will colleges and universities build and offer more courses involving HCI?
  • Will Career Services Departments get up to speed in order to help students carve out careers involving new forms of HCI?
  • How will languages and language translation be impacted by voice recognition software?
  • Will new devices be introduced to our classrooms in the future?
  • In the corporate space, how will training departments handle these new needs and opportunities? How will learning & development groups be impacted? How will they respond in order to help the workforce be prepared to take advantage of these sorts of technologies? What does it mean for these staff members personally? Do they need to invest in learning more about these advancements?

As an example of what I’m trying to get at here, who might be involved with an effort like Echo Dot? What types of positions were needed to create it? Who could benefit from it? What other platforms could these technologies be integrated into? Besides the home, where else might we find these types of devices?

[Embedded media: WhatIsEchoDot-June2016]

Echo Dot is a hands-free, voice-controlled device that uses the same far-field voice recognition as Amazon Echo. Dot has a small built-in speaker—it can also connect to your speakers over Bluetooth or with the included audio cable. Dot connects to the Alexa Voice Service to play music, provide information, news, sports scores, weather, and more—instantly.

Echo Dot can hear you from across the room, even while music is playing. When you want to use Echo Dot, just say the wake word “Alexa” and Dot responds instantly. If you have more than one Echo or Echo Dot, you can set a different wake word for each—you can pick “Amazon”, “Alexa” or “Echo” as the wake word.
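
Behind a sentence like “Dot connects to the Alexa Voice Service” sits a simple request/response model that developers can extend with “skills.” The sketch below assumes the Alexa Skills Kit custom-skill JSON interface and an AWS Lambda-style entry point; the intent name and the spoken replies are hypothetical, invented only for illustration.

```python
# Minimal sketch of a custom Alexa skill backend (e.g., an AWS Lambda handler),
# assuming the Alexa Skills Kit JSON request/response format. The intent name
# "GetCampusEventsIntent" and the reply text are made-up examples.

def build_response(speech_text, end_session=True):
    """Wrap plain text in the response envelope the Alexa service expects."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }


def lambda_handler(event, context):
    """Entry point the Alexa service invokes with a JSON request."""
    request = event.get("request", {})
    request_type = request.get("type")

    if request_type == "LaunchRequest":
        # The user said the skill's invocation name with no specific request.
        return build_response("Welcome. Ask me what is happening on campus today.",
                              end_session=False)

    if request_type == "IntentRequest":
        intent_name = request.get("intent", {}).get("name")
        if intent_name == "GetCampusEventsIntent":  # hypothetical intent
            return build_response("There is a guest lecture on HCI at 3 PM.")
        return build_response("Sorry, I did not catch that.")

    # SessionEndedRequest or anything unexpected.
    return build_response("Goodbye.")
```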

Or how might students learn about the myriad of technologies involved with IBM’s Watson? What courses are out there today that address these technologies? Are more courses in the works? In which areas (Computer Science, User Experience Design, Interaction Design, others)?

[Embedded media: WhatIsIBMWatson-June2016]

Lots of questions…but few answers at this point. Still, given the increasing pace of technological change, it’s important that we think about this type of thing and become more responsive, nimble, and adaptive in our organizations and in our careers.


Amazon now lets you test drive Echo’s Alexa in your browser — by Dan Thorp-Lancaster

Excerpt:

If you’ve ever wanted to try out the Amazon Echo before shelling out for one, you can now do just that right from your browser. Amazon has launched a dedicated website where you can try out an Echo simulation and put Alexa’s myriad of skills to the test.

[Embedded media: Echosimio-Amazon-EchoMay2016]

From DSC:
Using voice and gestures to communicate with a computing device or software program represents a growing form of Human Computer Interaction (HCI). With the growth of artificial intelligence (AI), personal assistants, and bots, we should expect to see voice recognition services/capabilities baked into an increasing number of products and solutions in the future.

Given these trends, personnel working within K-12 and higher ed need to start building their knowledgebases now so that we can begin offering more courses in the near future to help students build their skillsets.  Current user experience designers, interface designers, programmers, graphic designers, and others will also need to augment their skillsets.
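
To underscore how approachable basic voice input has become for prototyping, here is a small sketch that transcribes one spoken utterance. It assumes the third-party SpeechRecognition package for Python (plus PyAudio for microphone access) and its free Google web recognizer; it is meant only to show how little code a first voice-driven prototype requires, not to represent any particular product.

```python
# Minimal voice-input sketch, assuming the third-party SpeechRecognition
# package (pip install SpeechRecognition) and PyAudio for microphone access.
import speech_recognition as sr


def transcribe_once():
    """Listen for one utterance and return the recognized text (or None)."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # brief noise calibration
        print("Say something...")
        audio = recognizer.listen(source)
    try:
        # Uses the free Google web recognizer; other engines can be swapped in.
        return recognizer.recognize_google(audio)
    except sr.UnknownValueError:
        return None  # speech was unintelligible
    except sr.RequestError:
        return None  # recognition service unreachable


if __name__ == "__main__":
    text = transcribe_once()
    print("You said:", text or "(nothing recognized)")
```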


Beyond touch: designing effective gestural interactions — from blog.invisionapp.com by Yanna Vogiazou; with thanks to Mark Pomeroy for the resource
Excerpts:

The future of interaction is multimodal. But combining touch with air gestures (and potentially voice input) isn’t a typical UI design task.

Gestures are often perceived as a natural way of interacting with screens and objects, whether we’re talking about pinching a mobile screen to zoom in on a map, or waving your hand in front of your TV to switch to the next movie. But how natural are those gestures, really?

Try not to translate touch gestures directly to air gestures even though they might feel familiar and easy. Gestural interaction requires a fresh approach—one that might start as unfamiliar, but in the long run will enable users to feel more in control and will take UX design further.


Forget about buttons — think actions.


Eliminate the need for a cursor as feedback, but provide an alternative.


Creators of Siri reveal first public demo of AI assistant “Viv” — from seriouswonder.com by B.J. Murphy

Excerpts:

When it comes to AI assistants, a battle has been waged between different companies, with assistants like Siri, Cortana, and Alexa at the forefront of the battle. And now a new potential competitor enters the arena.

During a 20 minute onstage demo at Disrupt NYC, creators of Siri Dag Kittlaus and Adam Cheyer revealed Viv – a new AI assistant that makes Siri look like a children’s toy.

 

“Viv is an artificial intelligence platform that enables developers to distribute their products through an intelligent, conversational interface. It’s the simplest way for the world to interact with devices, services and things everywhere. Viv is taught by the world, knows more than it is taught, and learns every day.”

 
[Embedded media: VIV-2-May2016]

From DSC:
I saw a posting at TechCrunch.com the other day — The Information Age is over; welcome to the Experience Age. It’s the title of that article, more than its content, that’s relevant here. An interesting concept…and probably spot on, with ramifications for numerous types of positions, skillsets, and industries all over the globe.

Also see:

[Embedded media: VIV-May2016]

Addendum on 5/12/16:

  • New Siri sibling Viv may be next step in A.I. evolution — from computerworld.com by Sharon Gaudin
    Excerpt:
    With the creators of Siri offering up a new personal assistant that won’t just tell you what pizza is but can order one for you, artificial intelligence is showing a huge leap forward. Viv is an artificial intelligence (AI) platform built by Dag Kittlaus and Adam Cheyer, the creators of the AI behind Apple’s Siri, the most well-known digital assistant in the world. Siri is known for answering questions, like how old Harrison Ford is, and reminding you to buy milk on the way home. Viv, though, promises to go well past that.


Virtual reality: The Next Big Thing for college creatives — from college.usatoday.com by Morgan Buckley, University of Southern California

Excerpt:

College students across the country — from the University of Southern California to the University of Minnesota to Southern Methodist University — are also experimenting with virtual reality applications via clubs, design labs and hackathons.

The tech industry is taking note.

One of the bigger showcases for the technology took place last month, when the University of Southern California’s Virtual Reality Club (VRSC) hosted its first annual Virtual Reality Festival and Demo Day, a showcase of projects and panels with The Walt Disney Company as its title sponsor.

Students traveled from the University of California-San Diego, UCLA, Chapman University, Loyola Marymount University and the University of Colorado-Boulder to attend the fest. The judges were industry professionals from companies including NVIDIA, Google, Maker Studios and Industrial Light and Magic’s X Lab.

Some $25,000 in prizes were split among winners in four categories: 360 Live-Action Videos, 360 Animation, Interactive VR Games and Immersive Technology/Augmented Reality (AR). VR/AR categories ranged from health care to games, journalism, interactive design and interpretive dance.

 

A New Morning — by Magic Leap; posted on 4/19/16
Welcome to a new way to start your day. Shot directly through Magic Leap technology on April 8, 2016 without use of special effects or compositing.

[Embedded media: MagicLeap-ANewMorning-April2016]

What Gen Z thinks about ed tech in college — from edtechmagazine.com by D. Frank Smith
A report on digital natives sheds light on their learning preferences.

Excerpt (emphasis DSC):

A survey of the collegiate educational-technology expectations of 1,300 middle and high school students from 49 states was captured by Barnes and Noble. The survey, Getting to Know Gen Z, includes feedback on the students’ expectations for higher education.

“These initial insights are a springboard for colleges and universities to begin understanding the mindset of Gen Z as they prepare for their future, focusing specifically on their aspirations, college expectations and use of educational technology for their academic journey ahead,” states the survey’s introduction.

Like the millennials before them, Generation Z grew up as digital natives, with devices a fixture in the learning experience. According to the survey results, these students want “engaging, interactive learning experiences” and want to be “empowered to make their own decisions.” In addition, the students “expect technology to play an instrumental role in their educational experience.”

 

From DSC:
First of all, I’d like to thank D. Frank Smith for the solid article and for addressing the topic of students’ expectations. These messages were echoed in what I heard a few days ago at the MVU Online Learning Symposium, a conference focused on the K-12 space.

 
[Embedded media: MoreChoiceMoreControl-DSC2]

I want to quote and elaborate on one of the items from the report (as mentioned in the article):

“There is a need for user-friendly tools that empower faculty to design the kinds of compelling resources that will comprise the next wave of instructional resources and materials,” the report states.

Most likely, even if such tools were developed, the end goal described in the quote above won’t be realized. Why? Because:

  • Most faculty simply don’t have the time — they are being overrun with all sorts of other demands on their time (committees, task forces, advising, special projects, keeping up with the changes in their disciplines, etc.)
  • Even with user-friendly tools, one still needs a variety of skillsets to create engaging, sophisticated content and learning environments. Creating “the next wave of instructional resources and materials” is well beyond the skillsets of any one person. Numerous skills will be required to create the kinds of learning materials that we can expect to see in the future:
    • Information architecture
    • Instructional design
    • Interaction design
    • Videography and creating/working with multiple kinds of media
    • Programming/coding
    • Responsive web design and knowing how best to design content for multiple kinds of devices
    • User experience design
    • Graphic design
    • Game design
    • Knowledge of copyrights
    • Expertise in accessibility-related items
    • The ability to write effectively for blended and/or online-based approaches
    • Knowing how to capture and use learning analytics/data
    • Keeping up with advancements in human-computer interaction (HCI)
    • Staying current with learning space design
    • How best to deliver personalized learning
    • and much more!

This is why I continue to assert that we need a much more team-based approach to creating our learning environments. The problem is, very few people are listening to this advice.

How can I say this?

Because I continue to hear people discussing how important professional development is and how much support is needed for faculty members. I continue to see quotes, like the one above, that put the onus solely on the backs of our faculty members. Conferences are packed full of this type of approach.

Let’s get rid of that approach — it’s not working!  Or at least not nearly to the degree that students need it to. There may be a small percentage of faculty members who have the time and skills to pull some things off here, but even they will run into some walls eventually (depending upon the level of sophistication being pursued). None of us can do it all.

But for the most part, years have gone by and not much has changed. Rather, we need to figure out how we could use teams to create and deliver content. That would be a much wiser use of our energies and time. This perspective is not meant to dog faculty members — it’s just recognizing realities:

  • One person simply can’t do it all anymore.
  • Tools don’t exist that can pull all of the necessary pieces together.
  • Even if such tools existed, they wouldn’t be able to keep pace with the exponential rate of technological change that we’re currently experiencing — and will likely continue to experience over the next 10-20 years.

If we’re going to insist on faculty members creating the next wave of instructional materials and resources, then faculty members had better look out — they don’t know what’s about to hit them. Forget about having families. Forget about having a life outside of creating/delivering content. And find a way to create a 50-60 hour work DAY (not week) — because that’s how much time one will need to get anywhere close to mastery in all the prerequisite areas.

 

 

8ninths Develops “Holographic Workstation”™ for Citi Traders using Microsoft HoloLens — from 8ninths.com

Excerpt:

San Francisco – March 30, 2016 – 8ninths was named today by Microsoft Corporation as one of seven companies chosen for the Microsoft HoloLens Agency Readiness Program, and will showcase their “Holographic Workstation”™ prototype, designed and engineered for Citi, this week at Microsoft Build 2016. The Holographic Workstation™ increases efficiency by using the Microsoft HoloLens platform to create 3D holograms of real-time financial data. A three-tiered system of dynamically updated and interactive information enables traders to view, process, and interact with large amounts of abstract data in a combined 3D and 2D environment. The physical workstation integrates tablet screen space, 3D holographic docking space, keyboard, mouse, gaze, gesture, voice input, and existing Citi devices and workflows.

 
[Embedded media: Hololens-VR-Workstation-March2016]

[Embedded media: 8ninths-march2016-main]

[Embedded media: 8ninths-march2016]

HoloLens could get into finance with this VR workstation — from mashable.com by Lance Ulanoff

Excerpt:

8Ninths Cofounder and CEO Adam Sheppard told me they looked at the pain points of existing workstations and then drew inspiration from how, for example, they’d seen Microsoft and NASA solve 3D problems by embedding information in 2D and real environments.

The result is 8ninths’ Holographic Workstation, which was announced Wednesday at Microsoft’s Build 2016 developers conference. It’s a true blend of the real world (a physical day trader desk with a pair of real screens and a Surface Pro 4 in the middle) and a host of live, financial visualizations spread above the physical desk, including a cloud-like work area floating above the top shelf.

 

Also see the Vimeo video on this:

[Embedded media: 8ninths-march2016-vimeo]

Microsoft HoloLens used as the basis for a cool holographic stock trading workstation — from windowscentral.com by John Callaham

[Embedded media: MicrosoftHololensDevelopmentKit-March2016]

Introducing first ever experiences for the Microsoft HoloLens Development Edition — from blogs.windows.com by Kudo Tsunoda

Excerpt:

I am super excited about today’s announcement that the Microsoft HoloLens Development Edition is available for pre-order. We set out on a mission to deliver the world’s first untethered holographic computer and it is amazing to finally be at this point in time where developers will be receiving the very first versions so they can start building their own holographic experiences.

With HoloLens, we are committed to providing the development community with the best experience possible. In order to help get developers started creating experiences for HoloLens, we’ve provided a number of great resources. First of all, there is a complete set of documentation provided to developers both by the people who have created the platform and by the people who have been building holographic experiences. We want to share all of our holographic knowledge with developers so they can start bringing their holographic dreams to reality as easily as possible. We have also provided a host of tutorial videos to help people along. All of the documentation and videos can be found at dev.windows.com/holographic.

 
[Embedded media: MicrosoftHololensDevelopmentKit2-March2016]

How Google is reimagining books — from fastcodesign.com by Meg Miller
“Editions at Play” sees designers and authors working simultaneously to build a new type of e-book from the ground up.

Excerpt:

The first sign that Reif Larsen’s Entrances & Exits is not a typical e-book comes at the table of contents, which is just a list of chapters titled “Location Unknown.” Click on one of them, and you’ll be transported to a location (unknown) inside Google Street View, facing a door. Choose to enter the house and that’s where the narrative, a sort of choose-your-own-adventure string of vignettes, begins. As the book’s description reads, it’s a “Borgesian love story” that “seamlessly spans the globe” and it represents a fresh approach to the book publishing industry.

Larsen’s book is one of the inaugural titles from Editions at Play, a joint e-books publishing venture between Google Creative Lab Sydney and the design-driven publishing house Visual Editions, which launched this week. With the mission of reimagining what an e-book can be, Editions at Play brings together the author, developers, and designers to work simultaneously on building a story from the ground up. They are the opposite of the usual physical-turned-digital-books; rather, they’re books that “cannot be printed.”

 

From DSC:
Interesting to note the use of teams of specialists here…


Winner revealed for Microsoft’s HoloLens App Competition — from vrfocus.com by Peter Graham

Excerpt:

Out of the thousands of ideas entered, Airquarium, Grab the Idol and Galaxy Explorer were the three that made it through. Out of those the eventual winner was Galaxy Explorer with a total of 58 per cent of the votes. The app aims to give users the ability to wander the Milky Way and learn about our galaxy. Navigating through the stars and landing on the myriad of planets that are out there.

 
Also see:

Virtual Reality in 2016: What to expect from Google, Facebook, HTC and others in 2016 — from tech.firstpost.com by Naina Khedekar

Excerpt:

Many companies have made their intentions for virtual reality clear. Let’s see what they are up to in 2016.

 
Also see:

 

Somewhat related, but in the AR space:

Excerpt:
Augmented reality (AR) has continued to gain momentum in the educational landscape over the past couple of years. These educators featured below have dove in head first using AR in their classrooms and schools. They continue to share excellent resources to help educators see how augmented reality can engage students and deepen understanding.

 
Intel launches x-ray-like glasses that allow wearers to ‘see inside’ objects — from theguardian.com
Smart augmented reality helmet allows wearers to overlay maps, schematics and thermal images to effectively see through walls, pipes and other solid objects

Excerpt:

Unlike devices such as HoloLens or Google Glass, which have been marketed as consumer devices, the Daqri Smart Helmet is designed with industrial use in mind. It will allow the wearer to effectively peer into the workings of objects using real-time overlay of information, such as wiring diagrams, schematics and problem areas that need fixing.

 
From DSC:
Currently, you can add interactivity to your digital videos; several tools already allow you to do this.

So I wonder…what might interactivity look like in the near future when we’re talking about viewing things in immersive virtual reality (VR)-based situations?  When we’re talking about videos made using cameras that can provide 360 degrees worth of coverage, how are we going to interact with/drive/maneuver around such videos? What types of gestures and/or input devices, hardware, and software are we going to be using to do so? 

What new forms of elearning/training/education will we have at our disposal? How will such developments impact instructional design/designers? Interaction designers? User experience designers? User interface designers? Digital storytellers?

Hmmm…

The forecast?  High engagement, interesting times ahead.

Also see:

  • Interactive video is about to get disruptive — from kineo.com by James Cory-Wright
    Excerpt:
    Seamless and immersive because it all happens within the video
    We can now have embedded hotspots (motion tags) that move within the video; we can use branching within the video to change the storyline depending on the decisions you make; we can show consequences of making that decision; we can add video within video to share expert views, link directly to other rich media or gather real-time data via social media tools – all without leaving the actual video. A seamless experience.
  • Endless learning: Virtual reality in the classroom — from pixelkin.org by David Jagneaux
    Excerpt:
    What if you could be part of the audience for Martin Luther King Jr.’s riveting “I Have a Dream” speech? What if you could stand in a chemistry lab and experiment without any risk of harm or danger? What if you could walk the earth millions of years ago and watch dinosaurs? With virtual reality technology, these situations could become real. Virtual reality (VR) is a hot topic in today’s game industry, but games are only one aspect of the technology. I’ve had fun putting on a headset and shooting down ships in outer space. But VR also has the potential to enhance education in classrooms.
  • Matter VR

[Embedded media: MatterVR-Jan2016]

© 2024 | Daniel Christian