Winner revealed for Microsoft’s HoloLens App Competition — by Peter Graham


Out of the thousands of ideas entered, Airquarium, Grab the Idol and Galaxy Explorer were the three that made it through. Of those, the eventual winner was Galaxy Explorer, with 58 per cent of the votes. The app aims to give users the ability to wander the Milky Way and learn about our galaxy, navigating through the stars and landing on the myriad planets that are out there.




Also see:

Virtual Reality in 2016: What to expect from Google, Facebook, HTC and others in 2016 — by Naina Khedekar


Many companies have made their intentions for virtual reality clear. Let’s see what they are up to in 2016.



Also see:


Somewhat related, but in the AR space:

Augmented reality (AR) has continued to gain momentum in the educational landscape over the past couple of years. The educators featured below have dived in headfirst, using AR in their classrooms and schools. They continue to share excellent resources to help educators see how augmented reality can engage students and deepen understanding.




Intel launches x-ray-like glasses that allow wearers to ‘see inside’ objects
Smart augmented reality helmet allows wearers to overlay maps, schematics and thermal images to effectively see through walls, pipes and other solid objects


Unlike devices such as HoloLens or Google Glass, which have been marketed as consumer devices, the Daqri Smart Helmet is designed with industrial use in mind. It will allow the wearer to effectively peer into the workings of objects using real-time overlay of information, such as wiring diagrams, schematics and problem areas that need fixing.



Top 20 User Experience Blogs and Resources of 2015 — by Matt Ellis

Example resources from that posting:










BONUS: 15 Honorable Mentions

The following is a list of other excellent user experience blogs that did not make this list but are so good that they are still worth a mention. So please be sure to check them out as well!


From DSC:
Though I’m sure this list is missing many talented folks and firms, it’s a great place to start learning about user experience design, interface design, interaction design, prototyping, and usability testing.




From DSC:
Currently, you can add interactivity to your digital videos. For example, several tools allow you to do this, such as:

So I wonder…what might interactivity look like in the near future when we’re talking about viewing things in immersive virtual reality (VR)-based situations? When we’re talking about videos made using cameras that can provide 360 degrees of coverage, how are we going to interact with/drive/maneuver around such videos? What types of gestures and/or input devices, hardware, and software are we going to be using to do so?

What new forms of elearning/training/education will we have at our disposal? How will such developments impact instructional design/designers? Interaction designers? User experience designers? User interface designers? Digital storytellers?


The forecast?  High engagement, interesting times ahead.

Also see:

  • Interactive video is about to get disruptive — by James Cory-Wright
    Seamless and immersive because it all happens within the video
    We can now have embedded hotspots (motion tags) that move within the video; we can use branching within the video to change the storyline depending on the decisions you make; we can show consequences of making that decision; we can add video within video to share expert views, link directly to other rich media or gather real-time data via social media tools – all without leaving the actual video. A seamless experience.
  • Endless learning: Virtual reality in the classroom — by David Jagneaux
    What if you could be part of the audience for Martin Luther King Jr.’s riveting “I Have a Dream” speech? What if you could stand in a chemistry lab and experiment without any risk of harm or danger? What if you could walk the earth millions of years ago and watch dinosaurs? With virtual reality technology, these situations could become real. Virtual reality (VR) is a hot topic in today’s game industry, but games are only one aspect of the technology. I’ve had fun putting on a headset and shooting down ships in outer space. But VR also has the potential to enhance education in classrooms.
  • Matter VR




From DSC:
The folks who worked on the site have done a great job! Besides the wonderful resources therein, I really appreciated the user experience one is able to get by using their site. Check out the interface and functionality you can experience there (and which I’ve highlighted below).


We need a similar interface for matching up pedagogies with technologies.






Also, there are some excellent accessibility features:






From DSC:
When you read the article below, you’ll see why I’m proposing the aforementioned interface/service/database — a service that would be similar to Wikipedia in terms of allowing many people to contribute to it.


Why ed tech is not transforming how teachers teach — by Benjamin Herold
Student-centered, technology-driven instruction remains elusive for most


Public schools now provide at least one computer for every five students. They spend more than $3 billion per year on digital content. And nearly three-fourths of high school students now say they regularly use a smartphone or tablet in the classroom.

But a mountain of evidence indicates that teachers have been painfully slow to transform the ways they teach, despite that massive influx of new technology into their classrooms. The student-centered, hands-on, personalized instruction envisioned by ed-tech proponents remains the exception to the rule.


Everything you need to know from today’s Apple WWDC Keynote — by Greg Kumparak


  • The Next Version Of OS X — Apple announced OS X 10.11, or “OS X El Capitan”.
  • iOS 9
  • Siri is getting smarter
  • Split Screen iPad Apps: Perhaps the biggest feature of all, and something that has been rumored for ages: the iPad is getting split screen apps. You’ll now be able to run two apps at once, side by side. (Split screen iPad apps are limited to the latest/most powerful iPad hardware: the iPad Air 2)
  • Picture in picture video
  • CarPlay goes wireless
  • Swift 2: Apple also announced Swift 2, the second iteration of its new programming language.
  • watchOS 2
  • Apple Music: As long rumored, Apple is launching a Spotify/Rdio competitor. It’s $9.99 per month, or $14.99 on a family plan (with support for up to 6 accounts)



The Apple Watch just got a lot more useful — by John Paul Titlow
Apple’s watchOS 2 is here, and it will let developers build native, much more capable apps.

If you had your doubts about the Apple Watch, Cupertino just made it much more interesting. At the Worldwide Developers Conference this morning, Apple announced watchOS 2, a new version of the watch’s operating system that lets developers build native apps for the device.

watchOS 2 will also ship with a number of new features, like the ability to set photo watch faces, reply to email, take FaceTime audio calls, use Apple Pay’s new virtual loyalty cards, and get mass transit directions.



Apple introduces “News”: An old idea with big potential — by John Paul Titlow
WWDC 2015 Update: Apple just unveiled News, a Flipboard-style news aggregation app that will ship with iOS 9.


Apple is now a Flipboard competitor. When iOS 9 ships in the fall, it will include a new app called News, a newspaper, magazine, and blog aggregator that will feel very familiar to anyone who’s ever used Flipboard, Pulse, or one of the many other apps of this nature. The new app was unveiled by Apple’s vice president of Product Marketing, Susan Prescott, this afternoon at the company’s Worldwide Developers Conference.



Apple Maps in iOS 9 adds public transit, local business search — by Marco Tabini








Kevin shows some of the great things developers can do with the new version of WatchKit.




Example snapshots from Microsoft’s Productivity Future Vision



















My thanks to Mary Grush at Campus Technology for her continued work in bringing relevant topics and discussions to light — so that our institutions of higher education will continue delivering on their missions well into the future. By doing so, learners will be able to continue to partake of the benefits of attending such institutions. But in order to do so, we must adapt, be responsive, and be willing to experiment. Towards that end, this Q&A with Mary relays some of my thoughts on the need to move more towards a team-based approach.

When you think about it, we need teams whether we’re talking about online learning, hybrid learning or face-to-face learning. In fact, I just came back from an excellent Next Generation Learning Space Conference and it was never so evident to me that you need a team of specialists to design the Next Generation Learning Space and to design/implement pedagogies that take advantage of the new affordances being offered by active learning environments.








From DSC:
We continue to see more articles and innovations that involve the Internet of Things (IoT) or the Internet of Everything (IoE). This trend has made me reflect upon what I think will be a future, required subset of needed expertise within the fields of Instructional Design, User Experience Design, User Interface Design, Product Development, Programming, Human Computer Interaction (HCI), and likely other fields such as Artificial Intelligence (AI), Augmented Reality (AR), and Virtual Reality (VR) as well.

And that is: we will need people who can craft learning experiences from the presence of beacons/sensors, experiences that integrate such concepts as those found in “If This Then That” (IFTTT), whereby one puts the Internet and cloud/mobile-based applications to work. Certainly, those involved in retail are already busy working on these types of projects. But those of us involved with learning, human computer interaction (HCI), and interface design need to get involved as well.





For example, this potential scenario of a K-12 field trip might be fodder for such a learning experience.

So for those individuals who are involved with the aforementioned disciplines…we need to pulse check what new affordances are coming from the rollout and further development of the IoT/IoE.






Oculus Connect Videos and Presentations Online




All the keynotes, panels, and developer sessions from Connect are now available to watch online. The slides from each session are also available for download from the Connect site under the “Schedule” section.  Complete list of the keynotes, panels, and developer sessions from Connect:


  • Brendan Iribe and Nate Mitchell — Oculus CEO Brendan Iribe and VP of Product Nate Mitchell officially open Connect with their Keynote discussing Oculus, the Gear VR, and the newest prototype: Crescent Bay.
  • Michael Abrash  — Oculus Chief Scientist Michael Abrash discusses perception in virtual reality, the future of VR, and what that means for developers.
  • John Carmack — Oculus CTO John Carmack discusses the Gear VR and shares development stories at Oculus Connect.


Keynote Panel:


Developer Sessions:




Related items:





Excerpt from A Brief Look at Texting and the Internet in Film — by Tony Zhou

Is there a better way of showing a text message in a film? How about the internet? Even though we’re well into the digital age, film is still ineffective at depicting the world we live in. Maybe the solution lies not in content, but in form.



From DSC:
With a shout out/thanks to Krista Spahr,
Senior Instructional Designer at Calvin College, for this resource


Does Studying Fine Art = Unemployment? Introducing LinkedIn’s Field of Study Explorer — by Kathy Hwang


[On July 28, 2014], we are pleased to announce a new product – Field of Study Explorer – designed to help students like Candice explore the wide range of careers LinkedIn members have pursued based on what they studied in school.

So let’s explore the validity of this assumption (studying fine art = unemployment) by looking at the careers of members who studied Fine & Studio Arts at universities around the world. Are they all starving artists who live in their parents’ basements?






Also see:

The New Rankings? — by Charlie Tyson


Who majored in Slovak language and literature? At least 14 IBM employees, according to LinkedIn.

Late last month LinkedIn unveiled a “field of study explorer.” Enter a field of study – even one as obscure in the U.S. as Slovak – and you’ll see which companies Slovak majors on LinkedIn work for, which fields they work in and where they went to college. You can also search by college, by industry and by location. You can winnow down, if you desire, to find the employee who majored in Slovak at the Open University and worked in Britain after graduation.



Bye bye second screen? The InAIR lets you browse the web and watch TV all in one place — by Colleen Taylor (@loyalelectron)


Nowadays, many people browse the web at the same time that they’re watching TV — the phenomenon is called the “second screen.” But a new gadget called InAIR from a startup called SeeSpace wants to bring our attention back to just one screen by putting the best of the laptop, the smartphone, and the TV all together in one place.




From DSC:
I like what Nam Do, the CEO of SeeSpace, is saying about one’s attention and how it’s divided when you are using a second screen. That is, when you are trying to process some information from the large screen and some information from the smaller/more mobile screen on your lap or desk, you keep looking up…then looking down. Looking up…then looking down. By putting information in closer proximity in one’s visual channel, perhaps processing information will be more intuitive and less mentally taxing.

This fits with an app I saw yesterday called Spritz.



In this posting, it said:

To understand Spritz, you must understand Rapid Serial Visual Presentation (RSVP). RSVP is a common speed-reading technique used today. However, RSVP was originally developed for psychological experiments to measure human reactions to content being read. When RSVP was created, there wasn’t much digital content and most people didn’t have access to it anyway. The internet didn’t even exist yet.  With traditional RSVP, words are displayed either left-aligned or centered. Figure 1 shows an example of a center-aligned RSVP, with a dashed line on the center axis.

When you read a word, your eyes naturally fixate at one point in that word, which visually triggers the brain to recognize the word and process its meaning. In Figure 1, the preferred fixation point (character) is indicated in red. In this figure, the Optimal Recognition Position (ORP) is different for each word. For example, the ORP is only in the middle of a 3-letter word. As the length of a word increases, the percentage that the ORP shifts to the left of center also increases. The longer the word, the farther to the left of center your eyes must move to locate the ORP.


In the Science behind this app, it says (emphasis DSC):

Reading Basics
Traditional reading involves publishing text in lines and moving your eyes sequentially from word to word. For each word, the eye seeks a certain point within the word, which we call the “Optimal Recognition Point” or ORP. After your eyes find the ORP, your brain starts to process the meaning of the word that you’re viewing. With each new word, your eyes make a movement called a “saccade,” and then seek out the ORP for that word. Once the ORP is found, processing the word for meaning and context occurs and your eyes move to the next word. When your eyes encounter punctuation within and between sentences, your brain is prompted to assemble all of the words that you have read and process them into a coherent thought.

When reading, only around 20% of your time is spent processing content. The remaining 80% is spent physically moving your eyes from word to word and scanning for the next ORP. With Spritz we help you get all that time back.
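From DSC: Spritz hasn’t published its exact positioning formula, but the pattern described above (a 3-letter word fixates on its middle character, and the fixation point drifts further left of center as words get longer) can be sketched with a simple heuristic. The length breakpoints below are illustrative assumptions of mine, not Spritz’s actual algorithm:

```python
def orp_index(word: str) -> int:
    """Approximate the 0-based Optimal Recognition Point (ORP) index.

    Illustrative heuristic only: short words fixate near the middle
    (index 1 of a 3-letter word), and the fixation point shifts
    further left of center as word length grows.
    """
    n = len(word)
    if n <= 1:
        return 0
    if n <= 5:
        return 1
    if n <= 9:
        return 2
    if n <= 13:
        return 3
    return 4

def highlight_orp(word: str) -> str:
    """Bracket the ORP character, e.g. 'reading' -> 're[a]ding'."""
    i = orp_index(word)
    return word[:i] + "[" + word[i] + "]" + word[i + 1:]

for w in ["the", "reading", "recognition"]:
    print(highlight_orp(w))
```

An RSVP display would then center each flashed word on its ORP character rather than on the word’s geometric midpoint, so the eye never has to move.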


So, if the convergence of the television, the telephone, and the computer continues, I think Nam Do’s take on things might be very useful. If I were watching a lecture on the large screen for example, I could get some additional information in closer proximity to the professor.  If the content I wanted to see was more closely aligned with each other, perhaps my mind could take in more things…more efficiently.





The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV



Also see:
  • Clarity Matrix MultiTouch Interactive LCD Video Wall
    The Clarity™ Matrix MultiTouch LCD Video Wall is an ultra-slim profile touch screen video wall ideal for public spaces or collaboration environments. Utilizing the latest touch technology, Clarity Matrix MultiTouch enables up to 32 touch points and allows multiple users to simultaneously interact with the video wall without affecting other users.



From DSC:
In addition to video walls and large multitouch surfaces…we need to start thinking about how we might integrate the Internet of Things (IoT) into the classroom. For example, what might occur if technologies like Apple’s iBeacon were in the blended/hybrid classroom? Tony goes to corner A, and up pops a video re: XYZ for him to check out on the large multitouch display. Tony then moves to another corner of the room, and up pops a quiz or an assignment related to what he saw in the first corner. Available in one of the windows on the video wall is a remote specialist on the subject. Tony clicks on the “Connect with specialist” button and finds himself in a videoconference with this remote specialist/tutor on a portion of the video wall.

As learning hubs* grow, these technologies might be very useful.

* where some of the content is delivered electronically
and where some of the content
is discussed in a face-to-face manner


From DSC:
First some recent/relevant postings:

IFTTT’s ingenious new feature: Controlling apps with your location
— by Kyle VanHemert


An update to the IFTTT app lets you use your location in recipes. Image: IFTTT



IFTTT stands athwart history. At a point where the software world is obsessed with finding ever more specialized apps for increasingly specific problems, the San Francisco-based company is gleefully doing just the opposite. It simply wants to give people a bunch of tools and let them figure it out. It all happens with simple conditional statements the company calls “recipes.” So, you can use the service to execute the following command: If I take a screenshot, then upload it to Dropbox. If this RSS feed is updated, then send me a text message. It’s great for kludging together quick, automated solutions for the little workflows that slip into the cracks between apps and services.
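The conditional shape of those recipes ("if this trigger fires, then run that action") is easy to sketch in code. The triggers and actions below are toy stand-ins of mine, not IFTTT’s real channels or API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Recipe:
    trigger: Callable[[dict], bool]   # "If this..."
    action: Callable[[dict], None]    # "...then that"

def dispatch(event: dict, recipes: list) -> None:
    """Run every recipe whose trigger matches the incoming event."""
    for recipe in recipes:
        if recipe.trigger(event):
            recipe.action(event)

log = []  # stands in for the real side effects (upload, SMS, ...)
recipes = [
    # If I take a screenshot, then upload it to Dropbox.
    Recipe(trigger=lambda e: e["type"] == "screenshot",
           action=lambda e: log.append("upload " + e["file"] + " to Dropbox")),
    # If this RSS feed is updated, then send me a text message.
    Recipe(trigger=lambda e: e["type"] == "rss_update",
           action=lambda e: log.append("text me: " + e["title"])),
]

dispatch({"type": "screenshot", "file": "shot.png"}, recipes)
dispatch({"type": "rss_update", "title": "New post"}, recipes)
print(log)
```

The point of the sketch: each recipe is a tiny, independent rule, and the "platform" is nothing more than a loop that checks every rule against every event.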


If This, Then That (IFTTT)



4 reasons why Apple’s iBeacon is about to disrupt interaction design — by Kyle VanHemert


You step inside Walmart and your shopping list is transformed into a personalized map, showing you the deals that’ll appeal to you most. You pause in front of a concert poster on the street, pull out your phone, and you’re greeted with an option to buy tickets with a single tap. You go to your local watering hole, have a round of drinks, and just leave, having paid—and tipped!—with Uber-like ease. Welcome to the world of iBeacon.

It sounds absurd, but it’s true: Here we are in 2013, and one of the most exciting things going on in consumer technology is Bluetooth. Indeed, times have changed. This isn’t the maddening, battery-leeching, why-won’t-it-stay-paired protocol of yore. Today we have Bluetooth Low Energy, which solves many of the technology’s perennial problems with new protocols for ambient, continuous, low-power connectivity. It’s quickly becoming a big deal.


The Internet of iThings: Apple’s iBeacon is already in almost 200 million iPhones and iPads — by Anthony Kosner

Excerpt (emphasis DSC):

Because of iBeacons’ limited range, they are well-suited for transmitting content that is relevant in the immediate proximity.




From DSC:
Along the lines of the above postings…I recently had a meeting whereby the topic of iBeacons came up. It was mentioned that museums will be using this sort of thing; i.e., approaching a piece of art will trigger an explanation of that piece in the museum’s self-guided tour application.

That idea made me wonder whether such technology could be used in a classroom…and I quickly thought, “Yes!” 

For example, a student goes to the SW corner of the room and approaches a table. That table has an iBeacon-like device on it, which triggers a presentation within a mobile application on the student’s device. The student reviews the presentation and moves on to the SE corner of the room, where they approach a different table with another iBeacon on it. That beacon triggers a quiz on the material they just reviewed, and then proceeds to build upon that information. And so on. Physically-based scaffolding along with some serious blended/hybrid learning. It’s like taking the concept of QR codes to the next level.
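A rough sketch of the plumbing behind that scenario: iBeacons identify themselves by a region UUID plus a major/minor number pair, so each table’s beacon can map to the next activity in the sequence. The UUID, beacon numbers, and activity names below are all hypothetical:

```python
from typing import Optional

# Hypothetical region UUID shared by all beacons in this classroom.
CLASSROOM_UUID = "f7826da6-4fa2-4e98-8024-bc5b71e0893e"

# Map each table's (major, minor) beacon identifiers to an activity.
ACTIVITIES = {
    (1, 1): "SW table: presentation on topic XYZ",
    (1, 2): "SE table: quiz on the presentation, then new material",
}

def on_beacon_entered(uuid: str, major: int, minor: int) -> Optional[str]:
    """Called from the app's region-monitoring callback when the
    student's device comes into range of a beacon; returns the
    activity to launch, or None for unknown beacons."""
    if uuid != CLASSROOM_UUID:
        return None
    return ACTIVITIES.get((major, minor))

print(on_beacon_entered(CLASSROOM_UUID, 1, 2))
```

The sequencing logic (presentation before quiz, quiz building on the presentation) would live in the app itself; the beacons merely tell it where the student is standing.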

Some iBeacon vendors out there include:

Data mining, interaction design, user interface design, and user experience design may never be the same again.



© 2020 | Daniel Christian