Microsoft advances natural UI with Kinect SDK — from cnet.com by Jay Greene

 

Oregon State University student Alex Wiggins gestures to Kinect, which in turn makes a remote-control
toy helicopter take off while teammates Ruma Paul (left) and Fabio Matsui (right) look on. (Credit: Microsoft)

 

How HTML5 will transform the online video landscape — from Mashable.com by Christina Warren

 

Mike Matas: A next-generation digital book — from TED (filmed March 2011)

 

About this talk
Software developer Mike Matas demos the first full-length interactive book for the iPad — with clever, swipeable video and graphics and some very cool data visualizations to play with. The book is “Our Choice,” Al Gore’s sequel to “An Inconvenient Truth.”

About Mike Matas
While at Apple, Mike Matas helped write the user interface for the iPhone and iPad. Now with Push Pop Press, he’s helping to rewrite the electronic book.

 

Why Angry Birds is so successful and popular: A cognitive teardown of the user experience — from Pulse > UX by Charles L. Mauro

Excerpt:

Simple yet engaging interaction concept: This seems an obvious point, but few realize that a simple interaction model need not be, and rarely is, procedurally simple. Simplification means once users have a relatively brief period of experience with the software, their mental model of how the interface behaves is well formed and fully embedded. This is known technically as schema formation. In truly great user interfaces, this critical bit of skill acquisition takes place during a specific use cycle known as the First User Experience or FUE. When users are able to construct a robust schema quickly, they routinely rate the user interface as “simple”. However, simple does not equal engaging. It is possible to create a user interface solution that is initially perceived by users as simple. However, the challenge is to create a desire by users to continue interaction with a system over time, what we call user “engagement”.

What makes a user interface engaging is adding more detail to the user’s mental model at just the right time. Angry Birds’ simple interaction model is easy to learn because it allows the user to quickly develop a mental model of the game’s interaction methodology, core strategy and scoring processes. It is engaging, in fact addictive, due to the carefully scripted expansion of the user’s mental model of the strategy component and incremental increases in problem/solution methodology. These little birds are packed with clever behaviors that expand the user’s mental model at just the point when game-level complexity is increased. The process of creating simple, engaging interaction models turns out to be exceedingly complex. Most groups developing software today think expansion of the user’s mental model is for the birds. Not necessarily so.
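
As a loose, hypothetical illustration of the “carefully scripted expansion of the user’s mental model” described above (this sketch is mine, not from Mauro’s article, and the mechanic names and pacing are invented), a minimal Python sketch of a level schedule that unlocks exactly one new mechanic each time the difficulty steps up might look like this:

```python
# Hypothetical sketch: introduce exactly one new mechanic per difficulty step,
# so the player's mental model grows incrementally rather than all at once.
from dataclasses import dataclass, field
from typing import List

MECHANICS = ["slingshot aiming", "red bird", "blue bird (splits)",
             "yellow bird (dash)", "black bird (explodes)", "stone blocks"]

@dataclass
class Level:
    number: int
    difficulty: int                       # rough structural complexity of the level
    mechanics: List[str] = field(default_factory=list)

def build_level_schedule(num_levels: int) -> List[Level]:
    """Unlock a new mechanic only at the moment difficulty ramps up."""
    levels = []
    unlocked = 1                          # start with the single core mechanic
    for n in range(1, num_levels + 1):
        difficulty = 1 + n // 3           # difficulty rises every few levels
        if difficulty > unlocked and unlocked < len(MECHANICS):
            unlocked += 1                 # expand the model exactly when complexity rises
        levels.append(Level(n, difficulty, MECHANICS[:unlocked]))
    return levels

if __name__ == "__main__":
    for lvl in build_level_schedule(9):
        print(lvl.number, lvl.difficulty, lvl.mechanics)
```

The point of the sketch is simply that new concepts arrive one at a time, at the moment complexity rises, which is one way an interface can stay “simple” while becoming more engaging.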

Other key items discussed:

  • Simple yet engaging interaction concept
  • Cleverly managed response time
  • Short-term memory management
  • Mystery
  • How things sound
  • How things look
  • Measuring that which some say cannot be measured

 

From DSC:
What Apple is able to do with many of its hardware and software products, what Charles describes here with Angry Birds, what Steelcase did with its Media:Scape product’s puck — and other examples — point out that creating something that is “easy” is actually quite hard.

 

A hugely powerful vision: A potent addition to our learning ecosystems of the future

 

Daniel Christian:
A Vision of Our Future Learning Ecosystems


In the near future, as the computer, the television, the telephone (and more) continue to converge, we will most likely enjoy even more powerful capabilities to conveniently create and share our content, as well as participate in a global learning ecosystem — whether from within our homes or from within our schools, colleges, universities, and businesses throughout the world.

We will be teachers and students at the same time — even within the same hour — with online-based learning exchanges taking place all over the virtual and physical world. Subject Matter Experts (SMEs) — in the form of online-based tutors, instructors, teachers, and professors — will be available on demand. Even more powerful, accurate, and helpful learning engines will be involved behind the scenes in delivering personalized, customized learning — available 24x7x365. Cloud-based learner profiles may enter the equation as well.
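
As a purely hypothetical illustration of how a cloud-based learner profile might feed such a behind-the-scenes learning engine (every name, field, and threshold below is invented for the sketch, not drawn from any existing system), a minimal Python example could look like this:

```python
# Hypothetical sketch of a cloud-based learner profile and a naive
# personalization step; field names and logic are illustrative only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class LearnerProfile:
    learner_id: str
    interests: List[str] = field(default_factory=list)
    # mastery per topic, from 0.0 (novice) to 1.0 (mastered)
    mastery: Dict[str, float] = field(default_factory=dict)

def recommend_next_topics(profile: LearnerProfile,
                          catalog: List[str],
                          limit: int = 3) -> List[str]:
    """Suggest catalog topics the learner cares about but has not yet mastered."""
    candidates = [t for t in catalog
                  if t in profile.interests and profile.mastery.get(t, 0.0) < 0.8]
    # weakest (least mastered) topics first
    candidates.sort(key=lambda t: profile.mastery.get(t, 0.0))
    return candidates[:limit]

if __name__ == "__main__":
    profile = LearnerProfile("learner-001",
                             interests=["algebra", "digital video", "spanish"],
                             mastery={"algebra": 0.4, "digital video": 0.9})
    print(recommend_next_topics(profile, ["algebra", "digital video", "spanish", "chemistry"]))
```

A real learning engine would of course draw on far richer data, but even this toy version shows how a stored profile lets a system pick the next topic for a given learner on demand.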

The chances for creativity, innovation, and entrepreneurship that are coming will be mind-blowing! What employers will be looking for — and where they can look for it — may change as well.

What we know today as the “television” will most likely play a significant role in this learning ecosystem of the future. But it won’t be like the TV we’ve come to know. It will be much more interactive and will be aware of who is using it — and what that person is interested in learning about. Technologies/applications like Apple’s AirPlay will become more standard, allowing a person to move from device to device without missing a beat. Transmedia storytellers will thrive in this environment!

Much of the professionally produced content will be created by teams of specialists, including publishers of educational content and in-house teams within colleges, universities, and corporations around the globe. Perhaps consortia of colleges and universities will each contribute some of the content, more readily accepting previous coursework delivered through their consortium’s membership.

An additional thought regarding higher education, K-12, and their Smart Classrooms/Spaces:
For input devices…
The “chalkboards” of the future may be transparent, sit atop a drawing-board-sized table, or be tablet-based. But whatever form they take and whatever is displayed upon them, the ability to annotate will be there, with the resulting graphics saved and instantly distributed. (Eventually, we may get to voice-controlled Smart Classrooms, but we have a ways to go in that area…)

Below are some of the graphics that capture a bit of what I’m seeing in my mind…and in our futures.

Alternatively available as a PowerPoint Presentation (audio forthcoming in a future version)

[Graphics: A Vision of Our Future Learning Ecosystems]

— from Daniel S. Christian | April 2011


 


 

Originally saw this at:
BizDeansTalk.net

 

From DSC:
I appreciated the variety of clever interface elements at play here. Worth checking out from an interface-design standpoint alone, if not also for the creative way it relays and processes information (and for how it surfaces the critical business-related items and decisions).

Concept future - Universal Remote Controller 6

Concept future - Universal Remote Controller 4

 

Student engagement on the go — from The Journal by Chris Riedel

Assistant Principal Patrick McGee explains that, whatever the other advantages of adopting iPads and iPods in the classroom, the key is student engagement.

“This is my 3-year-old daughter the day the iPad came out,” said Patrick McGee as he displayed a movie of a young girl sitting at a kitchen counter, gripping an iPad in both hands. The audience watched as the little girl found, launched, and began to use a Dr. Seuss app, all without intervention or explanation from an adult. “Kids know, intuitively, how these things work, even at 3,” he said. “We need to use that.”

 

The connected life at home — from Cisco

From DSC:

How will these types of technologies affect what we can do with K-12 education, higher education, and workplace training and development? I’d say they will open up a world of new applications and opportunities for those who are ready to innovate, and they will move the “Forthcoming Walmart of Education” along.


Mobile content is twice as difficult [usability] — from Jakob Nielsen
Summary:

When reading from an iPhone-sized screen, comprehension scores for complex Web content were 48% of desktop monitor scores.

It’s more painful to use the Web on mobile phones than on desktop computers for many reasons:

  • Slower downloads
  • No physical keyboard for data entry
  • No mouse for selection; no mouse buttons to issue commands and access contextual menus (indeed fewer signaling states, as discussed further in our seminar on Applying HCI Principles to Real World Problems: a touchscreen only signals “finger-down/up,” whereas a mouse has hover state in addition to button press/release)
  • Small screen (often with tiny text)
  • Websites designed for desktop access instead of following the usability guidelines for mobile
  • App UIs that lack consistency

New research by R.I. Singh and colleagues from the University of Alberta provides one more reason: it’s much harder to understand complicated information when you’re reading through a peephole.

…rest of posting here.

From DSC:
With the above said, the mobile learning wave cannot — and most likely should not — be stopped. The types of devices we end up using may change, but mobile learning will move forward.

