SeeSpace InAiR: The World’s 1st Augmented Television — from kickstarter.com by Nam Do, Dale Herigstad, A-M Roussel

 

Defining the new Augmented Television space

 

From DSC:
For those interested in Human Computer Interaction (HCI), interactivity, interface design, augmented reality, and end user experience design — I think you will really like this one!

Again, I can’t help but think there are some significant educationally-related opportunities here; let’s let the students help us innovate here.

 

 

 

Transmedia Storytelling: Trends for 2014 — from Robert Pratten, CEO at Transmedia Storyteller Ltd, on Dec 06, 2013

Excerpt:

Pratten-TransmediaStorytellingIn2014

 

Conducttr-Jan2014

 

From DSC:
Something here for education/learning? With the creativity, innovation, interactivity, participation, and opportunities for more choice/more control being offered here, I would say YES!

 

 

Also see:

 

 

 

Human Computer Interaction & the next generation of exhibits — from ideum.com by one of my former instructors at SFSU’s Multimedia Studies Program, Mr. Jim Spadaccini

 

HCI Science Center Exhibits

Excerpt:

Computer-based technology continues to evolve at an ever-accelerating rate, creating both opportunities and challenges for science centers and museums. We are now seeing computing enter new realms, ones that are potentially more promising for exhibit development than earlier ones.

 

Split screen interactivity and finger motion control on 2014 Samsung Smart TVs — from v-net.tv

Excerpt:

Samsung has made some incremental improvements to its Smart TV platform for 2014. During International CES the company unveiled the Multi-Link feature, which lets you split the screen and use one half to get more information about content you are watching. For example, you can watch live TV on half the screen and get search results from a web browser on the other or seek out relevant YouTube content. In effect, the company is enabling ‘companion’ or Second Screen activities but on the main screen.

 

Items re: IBM and Watson:

 

 

FURo-S and also see FutureRobot

 

FURo-S-Jan2014

 

Nuance unlocks personalized content for Smart TVs with voice biometrics for Dragon TV — from online.wsj.com

Excerpt:

Nuance Communications, Inc. (NASDAQ: NUAN) today announced that its Dragon TV platform now includes voice biometrics for TVs, set-top boxes and second screen applications, creating an even more personalized TV experience through voice. Upon hearing a person’s unique voice, Dragon TV not only understands what is being said, but authenticates who is saying it, and from there enables personalized content for different members of a household. As a result, individuals can have immediate access to their own preferred channels and content, customized home screens and social media networks.

From DSC:
Re: this last one from Nuance, imagine using this to address a persistent question in online learning: is it really Student A taking that quiz? This type of technology could also open up possibilities for personalized playlists/content for each learner.
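To make the idea concrete, here is a rough sketch of how a voice-biometric gate for a quiz could work. This is purely illustrative — it is not Nuance’s API, and the "voiceprints" here are just lists of floats standing in for the embeddings a real speaker-verification model would produce; access is granted when the live sample is similar enough to the enrolled one.

```python
import math

# Hypothetical sketch of a voice-biometric gate for online quizzes.
# Real systems (e.g., Dragon TV) use proprietary voiceprint models;
# here each "voiceprint" is a plain list of floats standing in for
# an embedding produced by such a model.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify_speaker(enrolled, live, threshold=0.85):
    """Return True if the live voiceprint matches the enrolled one."""
    return cosine_similarity(enrolled, live) >= threshold

# Enrollment sample for "Student A" and two live attempts:
student_a = [0.9, 0.1, 0.4]
same_person = [0.88, 0.12, 0.41]   # near-identical embedding
impostor = [0.1, 0.9, 0.2]         # very different voice

print(verify_speaker(student_a, same_person))  # → True
print(verify_speaker(student_a, impostor))     # → False
```

The threshold is the key design knob: too low and an impostor passes, too high and the legitimate student gets locked out of her own quiz.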

 

 

 

From DSC:
First some recent/relevant postings:



IFTTT’s ingenious new feature: Controlling apps with your location — from wired.com by Kyle VanHemert

 

An update to the IFTTT app lets you use your location in recipes. Image: IFTTT

 

Excerpt:

IFTTT stands athwart history. At a point where the software world is obsessed with finding ever more specialized apps for increasingly specific problems, the San Francisco-based company is gleefully doing just the opposite. It simply wants to give people a bunch of tools and let them figure it out. It all happens with simple conditional statements the company calls “recipes.” So, you can use the service to execute the following command: If I take a screenshot, then upload it to Dropbox. If this RSS feed is updated, then send me a text message. It’s great for kluging together quick, automated solutions for the little workflows that slip into the cracks between apps and services.

 

If This, Then That (IFTTT)

IFTTT-Dec2013

 

4 reasons why Apple’s iBeacon is about to disrupt interaction design — from wired.com by Kyle VanHemert

Excerpt:

You step inside Walmart and your shopping list is transformed into a personalized map, showing you the deals that’ll appeal to you most. You pause in front of a concert poster on the street, pull out your phone, and you’re greeted with an option to buy tickets with a single tap. You go to your local watering hole, have a round of drinks, and just leave, having paid—and tipped!—with Uber-like ease. Welcome to the world of iBeacon.

It sounds absurd, but it’s true: Here we are in 2013, and one of the most exciting things going on in consumer technology is Bluetooth. Indeed, times have changed. This isn’t the maddening, battery-leeching, why-won’t-it-stay-paired protocol of yore. Today we have Bluetooth Low Energy, which solves many of the technology’s perennial problems with new protocols for ambient, continuous, low-power connectivity. It’s quickly becoming a big deal.

 

The Internet of iThings: Apple’s iBeacon is already in almost 200 million iPhones and iPads — from forbes.com by Anthony Kosner

Excerpt (emphasis DSC):

Because of iBeacons’ limited range, they are well-suited for transmitting content that is relevant in the immediate proximity.

 

 


 

From DSC:
Along the lines of the above postings…I recently had a meeting whereby the topic of iBeacons came up. It was mentioned that museums will be using this sort of thing; i.e. approaching a piece of art will initiate an explanation of that piece on the museum’s self-guided tour application. 

That idea made me wonder whether such technology could be used in a classroom…and I quickly thought, “Yes!” 

For example, a student goes to the SW corner of the room and approaches a table. That table has an iBeacon-like device on it, which triggers a presentation within a mobile application on the student’s device. The student reviews the presentation and moves on to the SE corner of the room, where they approach a different table with another iBeacon on it. That beacon triggers a quiz on the material they just reviewed, and then proceeds to build upon that information. Etc. Physically based scaffolding along with some serious blended/hybrid learning. It’s like taking the concept of QR codes to the next level.
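That station-to-station flow can be sketched as follows. This is hypothetical — a real app would receive beacon identifiers and distances from the platform’s beacon-ranging API (e.g., Core Location on iOS) rather than from the simulated readings used here, and the station names and topics are made up:

```python
# Hypothetical sketch of beacon-triggered classroom stations.
# A real app would get (beacon_id, distance) pairs from the OS's
# beacon-ranging API; here we simulate those readings directly.

STATIONS = {
    "beacon-sw": {"activity": "presentation", "topic": "Intro to circuits"},
    "beacon-se": {"activity": "quiz", "topic": "Intro to circuits"},
}

def nearest_beacon(readings):
    """Pick the closest beacon from (beacon_id, distance_m) readings."""
    beacon_id, _ = min(readings, key=lambda r: r[1])
    return beacon_id

def activity_for(readings):
    """Return the learning activity the student's position triggers."""
    station = STATIONS.get(nearest_beacon(readings))
    if station is None:
        return "no station nearby"
    return f"launch {station['activity']}: {station['topic']}"

# Student walks to the SW table (0.8 m away); the SE table is across the room:
print(activity_for([("beacon-sw", 0.8), ("beacon-se", 7.5)]))
# → launch presentation: Intro to circuits
```

Because beacons only broadcast an identifier, all of the sequencing logic (presentation first, then quiz) lives in the app — which is exactly what makes the scaffolding easy for an instructor to rearrange.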

Some iBeacon vendors out there include:

Data mining, interaction design, user interface design, and user experience design may never be the same again.

 

Learning from the Living (Class) Room [Grush & Christian]

CampusTechnology-12-5-13-DSCLivingClassRoom

 

Learning in ‘the Living [Class] Room’
From campustechnology.com by Mary Grush and Daniel Christian
Convergent technologies have the ability to support streams of low-cost, personalized content, both at home and in college.

 

A proposal for Apple, Google, IBM, Microsoft, and any other company who wants to own the future living room [Christian]

DanielChristian-A-proposal-to-Apple-MS-Google-IBM-Nov182013

 

 

 

“The main obstacle to an Apple television set has been content. It has mostly failed to convince cable companies to make their programming available through an Apple device. And cable companies have sought to prevent individual networks from signing distribution deals with Apple.”

Apple, closer to its vision for a TV set, wants
ESPN, HBO, Viacom, and others to come along

qz.com by Seward, Chon, & Delaney, 8/22/13

 

From DSC:
I wonder if this is because of the type of content that Apple is asking for. Instead of entertainment-oriented content, what if the content were more focused on engaging, interactive, learning materials? More on educational streams of content (whether we — as individuals — create and contribute that content or whether businesses do)?

Also see:

 

internet of things

 

Excerpt (emphasis DSC):

The communications landscape has historically taken the form of a tumultuous ocean of opportunities. Like rolling waves on a shore, these opportunities are often strong and powerful – yet ebb and flow with time.

Get ready, because the next great wave is upon us. And, like a tropical storm, it is likely to change the landscape around us.

As detailed by analyst Chetan Sharma, this particular wave is the one created by the popularity of over-the-top (OTT) solutions – apps that allow access to entertainment, communication and collaboration over the Internet from smartphones, tablets and laptops, rather than traditional telecommunications methods. Sharma has coined this the mobile “fourth wave” – the first three being voice, messaging (SMS) and data access, respectively – and it is rapidly washing over us.

 

Addendum on 11/25:

 

SmartTVFeatures

 

 

 

 

Items re: Helpouts by Google, which was just introduced on Monday, November 4th, 2013:


 

HelpoutsByGoogle-IntroducedNov-4-2013

 

 

 


From DSC:
This type of thing goes hand in hand with what I’m saying in the Learning from the Living Room vision/concept: “More choice. More control.” This type of thing may impact K-12, higher ed, and corporate training/L&D departments.

Is this how we are going to make a living in the future? If so, what changes do we need to make:

  • To the curricula out there?
  • To the “cores” out there?
  • In helping people build their digital/online-based footprints?
  • In helping people market themselves?

 

 

 

“Learning in the Living [Class] Room” — as explained by Daniel Christian [Campus Technology]

Learning from the Living [Class] Room  — from Campus Technology by Daniel Christian and Mary Grush; with a huge thanks also going out to Mr. Steven Niedzielski (@Marketing4pt0) and to Mr. Sam Beckett (@SamJohnBeck) for their assistance and some of the graphics used in making these videos.

From DSC:
These 4 short videos explain what I’m trying to relay with a vision I’m calling Learning from the Living [Class] Room. I’ve been pulse-checking a variety of areas for years now, and the pieces of this vision continue to come to fruition. This is what I see Massive Open Online Courses (MOOCs) morphing into (though there may be other directions/offshoots that they go in as well).

After watching these videos, I think you will see why I think we must move to a team-based approach.

(It looks like the production folks for Campus Technology had to scale things way back in terms of video quality to ensure an overall better performance for the digitally-based magazine.)


To watch these videos in a higher resolution, please use these links:


  1. What do you mean by “the living [class] room”?
  2. Why consider this now?
  3. What are some examples of apps and tech for “the living [class] room”?
  4. What skill sets will be needed to make “the living [class] room” a reality?

 

 


Alternatively, these videos can be found at:


 

DanielSChristianLearningFromTheLivingClassRoom-CampusTechnologyNovember2013


 

 

Tidebreak Next Generation Mobile App Powers Full-Participation Learning  — from digitaljournal.com

Excerpts (emphasis DSC):

Mountain View and Anaheim, CA (PRWEB) October 15, 2013

New web app increases collaboration between students and faculty in the classroom in BYOD learning environments.

“Using technology in the classroom can help spur creativity, increase participation, and foster a collaborative environment,” said Andrew J. Milne, Ph.D., chief executive officer of Tidebreak. “The latest version of Tidebreak’s mobile web app allows students to use any tablet or handheld device to share information with the entire class in real-time. By incorporating mobile web apps into devices that students already own, faculty can improve the learning process by creating a more collaborative environment that encourages active participation.”

The mobile web app from Tidebreak has many new features that will help increase student participation in the classroom. New features that have been incorporated into ClassSpot, ClassSpot PBL and TeamSpot include:

  • Work “at the board” without getting up – Full keyboard and track pad control from a tablet or phone allows students to collaborate on the large classroom screen in real-time.
  • Bridge the physical and digital world – Capture and share photos of whiteboard content, physical objects, or images, and then share them on-screen or archive them instantly.
  • Surf and share – Search the web for relevant content and then share it to the main screen, the session archive, or to everyone in the group simultaneously.
  • Navigate an enhanced design – A great deal of improvement has gone into the user interface which helps generate new ideas among students.
 

FutureOfStorytellingGaskins-Oct2013

 

Excerpt:

Many of us go about our lives constantly surrounded by screens, immersed in various “stories”: movies, TV shows, books, plot-driven video games, news articles, advertising, and more. Whether we realize it or not, we’re creating new behaviors, routines, mindsets, and expectations around what we watch, read or play—which in turn presents new challenges and opportunities for creators and marketers.

In other words, while the fundamentals of good storytelling remain the same, technology is changing how stories can be told. But what does that mean exactly?

 

From DSC:
There’s something here for classrooms/education — even for the living rooms of the future!

 

Little Mermaid Second Screen Live makes iPads part of the movie world — from gigaom.com by Liz Shannon Miller

Summary:

Disney advertises the Second Screen Live experience as a rebellion: “Break the rules — bring your iPad to the movies!” But it’s less a trip to the movies and more a fully interactive experience.

As the film began, so did the games. Most second-screen experiences I’ve tried have been largely passive, but Little Mermaid demanded the audience’s attention right from the beginning with games, trivia questions and other forms of interactivity for all ages.

 

lmssl trivia 1

The action on the big screen even froze from time to time for more complex games, and there were moments of seemingly new animation inserted at key plot points, as well.

 

Hacking the classroom: Purdue U’s approach to augmented learning — from campustechnology.com by Mary Grush
A Q&A with Kyle Bowen

Excerpt:

A term more familiar in the competitive world of the television industry, “second screen” offers a simple way for viewers to access additional information relevant to a TV program. At Purdue University, researchers aim to put the concept to work in the classroom — making an enriched and interactive learning experience with classroom apps easier. Here, Kyle Bowen, Purdue’s director of informatics explores his latest research on “hacking the classroom”: how to leverage the second screen concept to help instructors and students negotiate the realm of multiple and varied classroom apps.

 

From DSC:
Reminds me of the underlying vision that I was trying to get at here:

 

Learning from the living room -- a component of our future learning ecosystems -- by Daniel S. Christian, June 2012

 

True personalization is the next big thing in multiscreen TV — from v-net.tv by John Moulding

 

 

 

From DSC:
Not a far stretch to see some applications of this in the future aimed at learning objects/learning agents/and personalized streams of content.

 

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

Addendum:
(With thanks going out to Mr. Richard Byrne over at the Free Technology for Teachers blog for this item.)

 

 

SchoolsWorldTV-Sept2013

 

GoogleGlass-SurgeryToolOfFuture-9-2013


GoogleGlass-SurgeryToolOfFuture2-9-2013

 

 

Also see:

Wearable tech at the workplace — from connectedworldmag.com

Excerpt:

Wearable sensors are the latest tech-related fashion craze, but not just for consumers. Businesses could soon make use of sensors affixed to hats, watches, glasses, and more. Equipped to provide workers with much-needed data about the surrounding environment, these devices may soon invade the job and redefine the workplace—much like tablets and smartphones did just a few years back.

Naturally, businesses have been discussing how something like Google Glass could be used at work. Even the BYOD (bring your own device) phenomenon has the potential to extend to wearable technology, with workers bringing their personal smart watches and glasses for use at work. However, the potential for wearable sensors could be much greater, with specific applications created for targeted industries.

From DSC:
Some serious potential for mobile/distance learning here.  What if, instead of a surgeon, that person was an archaeologist on a dig speaking to students in their living rooms, on mobile devices, and/or in classrooms somewhere else…? An electrician fixing a broken transformer on the electrical grid and talking through what she is fixing…? The possibilities are numerous.

See Andrew Vanden Heuvel’s work on this as well:

 

Moving back to the surgery…one last thought/idea here:

  • What if there were several people wearing Google Glass and the learners/students could pick the camera angle that they wanted to see?

 

 

 

IBM-WatsonAtWork-Sept2013

 

From DSC:
IBM Watson continues to expand into different disciplines/areas, which currently include:

  • Healthcare
  • Finance
  • Customer Service

But Watson is also entering the marketing and education/research realms.

I see a Watson-type-of-tool as being a key ingredient for future MOOCs and the best chance for MOOCs to morph into something very powerful indeed — offloading the majority of the workload to computers/software/intelligent tutoring/learning agents, while at the same time allowing students to connect with each other and/or to Subject Matter Experts (SMEs) as appropriate.

The price of education could hopefully come way down — depending upon the costs involved with licensing Watson or a similar set of technologies — as IBM could spread out their costs to multiple institutions/organizations.  This vision represents another important step towards the “Walmart of Education” that continues to develop before our eyes.

Taking this even one step further, I see this system being available to us on our mobile devices as well as in our living rooms — as the telephone, the television, and the computer continue to converge.  Blended learning on steroids.

What would make this really powerful would be to provide:

  • The ability to create narratives/stories around content
  • To feed streams of content into Watson for students to tap into
  • Methods of mining data and using that to tweak algorithms, etc. to improve the tools/learning opportunities

Such an architecture could be applied towards lifelong learning opportunities — addressing what we now know as K-12, higher education, and corporate training/development.
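One way to picture that loop — content streams in, learner data tunes what each student sees next — is sketched below. This is a loose illustration under my own assumptions, not IBM Watson’s actual design; the content items, skills, and mastery threshold are all made up for the example:

```python
# Loose sketch of the tutoring loop described above (my own assumptions,
# not IBM Watson's actual architecture): a content stream plus per-learner
# mastery data drive what each student sees next.

CONTENT_STREAM = [
    {"id": "algebra-1", "skill": "algebra", "difficulty": 1},
    {"id": "algebra-2", "skill": "algebra", "difficulty": 2},
    {"id": "stats-1", "skill": "statistics", "difficulty": 1},
]

def next_item(stream, mastery):
    """Pick the easiest item whose skill the learner hasn't mastered.

    `mastery` maps skill -> score in [0, 1]; >= 0.8 counts as mastered.
    """
    candidates = [
        item for item in stream
        if mastery.get(item["skill"], 0.0) < 0.8
    ]
    if not candidates:
        return None  # everything mastered: hand off to a human SME
    return min(candidates, key=lambda item: item["difficulty"])

def update_mastery(mastery, skill, correct):
    """Nudge the mastery score up or down after a quiz response."""
    score = mastery.get(skill, 0.0)
    mastery[skill] = min(1.0, score + 0.2) if correct else max(0.0, score - 0.1)
    return mastery

learner = {"algebra": 0.9}            # algebra already mastered
item = next_item(CONTENT_STREAM, learner)
print(item["id"])                      # → stats-1
```

The point of the sketch is the division of labor: the algorithmic loop handles sequencing and assessment at scale, and only the `None` case — a learner who has exhausted the stream or is stuck — needs to reach a human expert.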


 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

 
© 2024 | Daniel Christian