NMC Horizon Report > 2015 Higher Education Edition — from nmc.org

Excerpt:

What is on the five-year horizon for higher education institutions? Which trends and technologies will drive educational change? What are the challenges that we consider as solvable or difficult to overcome, and how can we strategize effective solutions? These questions and similar inquiries regarding technology adoption and educational change steered the collaborative research and discussions of a body of 56 experts to produce the NMC Horizon Report: 2015 Higher Education Edition, in partnership with the EDUCAUSE Learning Initiative (ELI). The NMC Horizon Report series charts the five-year horizon for the impact of emerging technologies in learning communities across the globe. With more than 13 years of research and publications, it can be regarded as the world’s longest-running exploration of emerging technology trends and uptake in education.

 

NMCHorizonReport-2015

 

NMCHorizonReport-2015-toc

 

 
 

6 things you can learn from LeWeb Day One — from nextberlin.eu by Adam Tinworth

Excerpt:

1. The sharing economy is bigger than you think
2. The secret of wearables will be simplicity
3. We all wear clothes, but we don’t all wear jewellery or watches
4. Yandex started life searching floppy disks
5. The clock is ticking on the classic office suite of software
6. Games make you a cognitive superhuman

 

What is LeWeb?

Excerpt:

LeWeb in a nutshell
Founded in 2004 by French entrepreneurs Loïc and Geraldine Le Meur, LeWeb is an internationally-renowned conference for digital innovation where visionaries, startups, tech companies, brands and leading media converge to explore today’s hottest trends and define the future of internet-driven business.

 

How do we create an advanced experience ecosystem? — from uxmag.com by Dallas Sargent

Excerpt:

“The Internet of Things is interesting because it forces us to rethink the interaction paradigms we have with objects—that is, something is different about our engagement with a smart object versus a ‘dumb’ one,” writes Thomas Wendt in his recent UX Magazine article, “Internet of Things and the Work of the Hands.”

Whatever you choose to call the grayish area sitting at the intersection of wearable technology, customizable sensors, and enhanced connectivity, it seems likely that this is where the next major breakthroughs in experience design will take place. Cisco estimates that by 2020, there will be 50 billion things connected to the Internet, so whether we settle on the “Internet of Things,” “The Internet of Everything,” or an “Experience Ecosystem,” the space is primed for explosive growth.

As Wendt notes, there is something different about interacting with smart objects and, “It’s our job to understand what that something is.” That something has been elusive in recent years, with users adopting wearable tech slowly and a noticeable lack of sophisticated communication between objects. After all, an object sharing information with your smartphone or tablet only constitutes a relationship; it takes many more overlapping relationships to create an ecosystem.

 

Also see:

Apple pioneer Bill Fernandez on Google Glass, Oculus Rift and the look of the future — from cnet.com by Jason Hiner
In his time at Apple, Bill Fernandez helped develop technologies that have shaped modern computing. He also has a keen perspective on what’s next.

Excerpt:

“The true user interface is the gestalt of many things: industrial design, ergonomics, internal hardware, performance, error handling, visual design, interaction, latency, reliability, predictability, and more. For this reason I’m glad to see my profession moving from the concept of user ‘interface’ (where the user interacts with the screen) to user ‘experience’ (where more of the above factors are considered),” said Fernandez.

“I can design the best user interface in the world on paper, but if it’s not implemented well by the hardware and software engineers (e.g. the product is slow or buggy) then the product will fail. If you do a really good job of designing a product then everyone who sees it thinks it’s so obvious that they could have done it themselves.”

“So there will be a seamless, multimodal UI experience using sight, sound, voice, head motion, arm movement, gestures, typing on virtual keyboards, etc. There have been pieces of this in research for decades. Only now are we beginning to see some of these things emerge into the consumer market: with Google Glass, Siri, etc. We have further to go, but we’re on the road.”

 

From DSC:
I’m intrigued with the overlay (or should I say integration?) of services/concepts similar to IFTTT over the infrastructure of machine-to-machine communications and sensors such as iBeacons. I think we’re moving into entirely new end user experiences and affordances enabled by these technologies. I’m looking for opportunities that might benefit students, teachers, faculty members, and trainers.
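To make the sensor half of that concrete, here is a minimal sketch in Swift of how a mobile app might watch for one iBeacon region using Apple’s Core Location framework. The UUID and region identifier are placeholder values for illustration; a real deployment would use its own beacon IDs and would pair this trigger with whatever learning action makes sense.

```swift
import CoreLocation

// Minimal sketch: react when the device enters the range of a known iBeacon.
// The UUID and identifier below are placeholders, not real deployment values.
class BeaconListener: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let classroomRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "classroom-corner-1")

    func start() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()   // region monitoring requires it
        locationManager.startMonitoring(for: classroomRegion)
    }

    // Called when the device crosses into the beacon's region.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        guard region.identifier == classroomRegion.identifier else { return }
        // This is the "if this" half; the "then that" half could launch a video,
        // open a quiz, or notify a learning management system.
        print("Entered \(region.identifier): trigger the associated learning action")
    }
}
```

The same pattern generalizes to beacon ranging (estimating distance to several beacons at once) if finer-grained, corner-of-the-room triggers are needed.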

 

11 things we just learned about how the Apple Watch works — from theverge.com by Ross Miller

 

 

Excerpt:

As of [11/18/14], developers can now make apps for Apple Watch. Well, they’re not separate apps so much as they are extensions of pre-existing iPhone apps, and there isn’t a lot of flexibility in the WatchKit toolset — but it looks like that’ll change next year.
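For those wondering what that extension model looks like in code, below is a rough, hypothetical sketch of a WatchKit interface controller in Swift. The class name, outlet, and label text are made up for illustration; the substantive logic still lives in the paired iPhone app, with the watch side limited to configuring its interface objects.

```swift
import WatchKit
import Foundation

// Rough sketch of a first-generation WatchKit "app": a thin interface
// controller that runs as an extension of an existing iPhone app.
// The class name, outlet, and strings are illustrative placeholders.
class GlanceInterfaceController: WKInterfaceController {

    @IBOutlet var statusLabel: WKInterfaceLabel!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // Configure the interface; the heavy lifting stays on the phone.
        statusLabel.setText("Synced from the iPhone app")
    }

    override func willActivate() {
        super.willActivate()  // Called when the watch interface is about to be visible.
    }

    override func didDeactivate() {
        super.didDeactivate() // Called when the interface is no longer visible.
    }
}
```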

 

Oculus Connect Videos and Presentations Online — from oculus.com

 

 

Excerpt:

All the keynotes, panels, and developer sessions from Connect are now available to watch online. The slides from each session are also available for download from the Connect site under the “Schedule” section. Here is the complete list of the keynotes, panels, and developer sessions from Connect:

Keynotes:

  • Brendan Iribe and Nate Mitchell — Oculus CEO Brendan Iribe and VP of Product Nate Mitchell officially open Connect with their Keynote discussing Oculus, the Gear VR, and the newest prototype: Crescent Bay.
  • Michael Abrash — Oculus Chief Scientist Michael Abrash discusses perception in virtual reality, the future of VR, and what that means for developers.
  • John Carmack — Oculus CTO John Carmack discusses the Gear VR and shares development stories at Oculus Connect.


Keynote Panel:


Developer Sessions:

 

 

 

Related items:

 

 

World-of-Comenius-Oct2014

 

‘World of Comenius’ demonstrates powerful educational interaction w/ Leap Motion & Oculus Rift and Tomáš “Frooxius” Mariančík

Excerpt:

We recently covered an (at that point unnamed) VR project by developer Tomáš “Frooxius” Mariančík, the mind behind the stunning ‘Sightline’ VR series of demos. The project fused Leap Motion skeletal hand tracking with Oculus Rift DK2 positional tracking to produce an impressively intuitive VR interface. Now, a new video of the interface shows impressive progress. We catch up with Tomáš to find out more about this mysterious VR project.

Enter the ‘World of Comenius’
The virtual reality resurgence that is currently underway necessarily and predictably concentrates on bringing people new ways to consume and experience media and games. This is where virtual reality has the best chance of breaking through as a viable technology, one that will appeal to consumers worldwide. But virtual reality’s greatest impact, at least in terms of historical worth to society, could and probably will come in the form of non-entertainment-based fields.

 

 

Also see:

 
 

The amazing ways new tech shapes storytelling — from stuff.tv by Stephen Graves

Excerpt:

From the moment some singer-poet livened up his verse performances with a musical instrument, technology has changed entertainment. The printing press, theatrical lighting, the cinema, radio, cinematic sound – they’ve all either impacted on existing storytelling forms, or created whole new ones.

In recent years, the arrival of digital formats and non-linear editing changed TV. Existing TV formats like drama benefited from the same level of technical polish as films; and at the same time, the ability to shoot and edit large amounts of footage quickly and cheaply created a whole new form of storytelling – reality TV.

Streaming media’s one thing – but the biggest tech leap in years is, of course, your smartphone. Texting during films may infuriate, but whipping your phone out in the cinema may become an integral part of the story: the 2013 film App used a second-screen app to display extra layers of narrative, synced to the film’s soundtrack. There are also books that use second-screen apps: last year’s Night Film lets you scan tags in the physical book to unlock extra content, including mocked-up websites and trailers.

 

 

The amazing ways new tech shapes storytelling

 

 

 

Also see:

 

InternetOfEverything-BI-Oct2014

 

Some excerpted slides:

 

wearehere-bi-oct2014

 

wearehere2-bi-oct2014

 

wearehere3-bi-oct2014

 

wearehere4-bi-oct2014

 

 


From DSC:
Educators/edupreneurs should be watching what happens in the living room…or better yet, let’s create some second-screen-based learning applications that ride the Smart/Connected TV wave.
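As a thought experiment, a bare-bones version of such a second-screen learning app might simply keep a timeline of companion cues keyed to the playback position of whatever is showing on the connected TV. A hypothetical Swift sketch (all cue times and prompts are invented examples):

```swift
import Foundation

// Hypothetical sketch of a second-screen learning timeline: companion content
// keyed to the playback position of the program on the connected TV.
struct CompanionCue {
    let startTime: TimeInterval   // seconds into the program
    let content: String           // e.g. a question, link, or activity prompt
}

struct SecondScreenTimeline {
    let cues: [CompanionCue]      // assumed sorted by startTime

    // Return the most recent cue for the current playback position, if any.
    func cue(at playbackTime: TimeInterval) -> CompanionCue? {
        return cues.last { $0.startTime <= playbackTime }
    }
}

let timeline = SecondScreenTimeline(cues: [
    CompanionCue(startTime: 120, content: "Poll: what do you predict happens next?"),
    CompanionCue(startTime: 300, content: "Background reading on the topic just mentioned"),
])

// The tablet/phone app would poll or receive the TV's playback position:
if let cue = timeline.cue(at: 185) {
    print(cue.content)  // "Poll: what do you predict happens next?"
}
```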

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

From DSC:
I’m thinking out loud again…

What if we were able to take the “If This Then That” (IFTTT) concept/capabilities and combine them with sensor-based technologies? It seems to me that we’re at the very embryonic stages of some very powerful learning scenarios, scenarios that are packed with learning potential, engagement, intrigue, interactivity, and opportunities for participation.

For example, what would happen if you went to one corner of the room, causing an app on your mobile device to launch and bring up a particular video to review? Then, after you’ve watched the video, a brief quiz appears to check your understanding of the video’s main points. And once you’ve submitted the quiz — and it’s been received by system ABC — that submission triggers an unexpected learning event for you.
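To make that chain a bit more tangible, here is a hypothetical sketch in Swift of an IFTTT-style rule engine for the scenario above. Every name in it (the trigger cases, the video and quiz IDs, “system ABC”) is invented for illustration; the point is only that each completed step raises an event that the next rule listens for.

```swift
import Foundation

// Hypothetical sketch of an IFTTT-style chain for the corner-of-the-room scenario.
// All trigger names, IDs, and endpoints are illustrative, not a real API.

enum LearningTrigger {
    case enteredBeaconRegion(identifier: String)
    case videoFinished(videoID: String)
    case quizSubmitted(quizID: String, score: Int)
}

struct LearningRule {
    let description: String
    let matches: (LearningTrigger) -> Bool
    let action: () -> Void
}

final class LearningRuleEngine {
    private var rules: [LearningRule] = []

    func register(_ rule: LearningRule) { rules.append(rule) }

    // Feed every sensor/app event through the registered rules.
    func handle(_ trigger: LearningTrigger) {
        for rule in rules where rule.matches(trigger) {
            rule.action()
        }
    }
}

let engine = LearningRuleEngine()

engine.register(LearningRule(
    description: "Entering corner 1 launches a review video",
    matches: { trigger in
        if case .enteredBeaconRegion(let id) = trigger { return id == "classroom-corner-1" }
        return false
    },
    action: { print("Play review video 'photosynthesis-intro'") }))

engine.register(LearningRule(
    description: "Finishing the video presents a short quiz",
    matches: { trigger in
        if case .videoFinished(let id) = trigger { return id == "photosynthesis-intro" }
        return false
    },
    action: { print("Show a three-question comprehension quiz") }))

engine.register(LearningRule(
    description: "Submitting the quiz notifies 'system ABC' and unlocks a surprise activity",
    matches: { trigger in
        if case .quizSubmitted(_, _) = trigger { return true }
        return false
    },
    action: { print("POST results to system ABC, then trigger the unexpected learning event") }))

// Simulated sequence of events for one learner:
engine.handle(.enteredBeaconRegion(identifier: "classroom-corner-1"))
engine.handle(.videoFinished(videoID: "photosynthesis-intro"))
engine.handle(.quizSubmitted(quizID: "photosynthesis-quiz", score: 3))
```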

Combining the physical with the digital…

Establishing IFTTT-based learning playlists…

Building learning channels…learning triggers…learning actions…

Setting a schedule of things to do for a set of iBeacons over a period of time (and being able to save that schedule of events for “next time”).

Hmmm…there’s a lot of potential here!

 

 

IfThisThenThat-Combined-With-iBeacons

 

 

IfThisThenThat

 

 

iBeaconsAndEducation-8-10-14

 

 

Now throw augmented reality, wearables, and intelligent tutoring into the equation! Whew!

We need to be watching how machine-to-machine (M2M) communications can be leveraged in classrooms and training programs across the globe.

One last thought here…
How are we changing our curricula to prepare students to leverage the power of the Internet of Things (IoT)?

 

Recording of yesterday’s event/keynote

 

 

AppleWatch-9-9-14

 

 

 

Apple-iPhone6-9-9-14

 

 

Beyond the Major Apple Headlines, 11 Things You Need to Know — from blogs.wsj.com by Brian R. Fitzgerald

 

iPhone6-Apple-9-9-14

 

The Apple Watch: Everything You Need to Know — from businessweek.com by Joshua Brustein

 

Apple Watch: Here Are the Apps to Expect  — from blogs.wsj.com by Nathan Olivarez-Giles

 

Apple in Focus After Debut of iPhone 6, Smart Watch, Apple Pay — from finance.yahoo.com by Zacks Research Staff

 

Tim Cook tells USA TODAY: ‘This is epic’ — from usatoday.com by Marco della Cava

 

Apple unveils smartwatch, bets on wearable devices — from finance.yahoo.com by Michael Liedtke and Anick Jesdanun

 

Apple Watch: Coming to a Classroom Near You? — from chronicle.com by Rebecca Koenig

 

applewatch

 