Recording of yesterday’s event/keynote

 

 

AppleWatch-9-9-14

 

 

 

Apple-iPhone6-9-9-14

 

 

Beyond the Major Apple Headlines, 11 Things You Need to Know — from blogs.wsj.com by Brian R. Fitzgerald

 

iPhone6-Apple-9-9-14

 

The Apple Watch: Everything You Need to Know — from businessweek.com by Joshua Brustein

 

Apple Watch: Here Are the Apps to Expect  — from blogs.wsj.com by Nathan Olivarez-Giles

 

Apple in Focus After Debut of iPhone 6, Smart Watch, Apple Pay — from finance.yahoo.com by Zacks Research Staff

 

Tim Cook tells USA TODAY: ‘This is epic’ — from usatoday.com by Marco della Cava

 

Apple unveils smartwatch, bets on wearable devices — from finance.yahoo.com by Michael Liedtke and Anick Jesdanun

 

Apple Watch: Coming to a Classroom Near You? — from chronicle.com by Rebecca Koenig

 

applewatch

 

Beacons at the museum: Pacific Science Center to roll out location-based Mixby app next month — from geekwire.com by Todd Bishop

Excerpt:

Seattle’s Pacific Science Center has scheduled an Oct. 4 public launch for a new system that uses Bluetooth-enabled beacons and the Mixby smartphone app to offer new experiences to museum guests — presenting them with different features and content depending on where they’re standing at any given moment.
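The core mechanic described above, serving different content depending on which beacon a visitor is standing near, can be sketched in a few lines. Everything here is invented for illustration: the beacon IDs, exhibit content, and RSSI values are hypothetical, not Mixby's actual API.

```python
# Hypothetical sketch: choosing exhibit content from nearby Bluetooth beacons.
# Each beacon is tied to a piece of location-specific content.
BEACON_CONTENT = {
    "beacon-dinosaur-hall": "Video: how this T. rex skeleton was assembled",
    "beacon-tide-pool":     "Quiz: name the creatures in the touch tank",
    "beacon-planetarium":   "Schedule: today's star shows",
}

def nearest_beacon(scan_results):
    """Pick the beacon with the strongest signal (RSSI closest to 0 dBm)."""
    return max(scan_results, key=scan_results.get)

def content_for_visitor(scan_results):
    beacon_id = nearest_beacon(scan_results)
    return BEACON_CONTENT.get(beacon_id, "Welcome to the Pacific Science Center!")

# A visitor's phone hears three beacons; the strongest one wins.
scan = {"beacon-dinosaur-hall": -62, "beacon-tide-pool": -81, "beacon-planetarium": -90}
print(content_for_visitor(scan))
```

In practice the app would rescan as the visitor walks, so the content changes with their position.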

 

Also see:

 

From DSC:
The use of location-based apps & associated technologies (machine-to-machine (M2M) communications) should be part of all ed tech planning from here on out — and also applicable to the corporate world and training programs therein. 

Not only applicable to museums, but also to art galleries, classrooms, learning spaces, campus tours, and more.  Such apps could be used on plant floors in training-related programs as well.

Now mix augmented reality in with location-based technology.  Come up to a piece of artwork, and a variety of apps could be launched to really bring that piece to life! Some serious engagement.

Digital storytelling. The connection of the physical world with the digital world. Digital learning. Physical learning. A new form of blended/hybrid learning.  Active learning. Participation.

 

 

 

Addendum on 9/4/14 — also see:

Aerohive Networks Delivers World’s First iBeacon™- and AltBeacon™-Enabled Enterprise Wi-Fi Access Points
New Partnership with Radius Networks Delivers IoT Solution to Provide Advanced Insights and Mobile Experience Personalization

Excerpt (emphasis DSC):

SUNNYVALE, Calif.–(BUSINESS WIRE)–Aerohive Networks® (NYSE:HIVE), a leader in controller-less Wi-Fi and cloud-managed mobile networking for the enterprise market, today announced that it is partnering with Radius Networks, a market leader in proximity services and proximity beacons with iBeacon™ and AltBeacon™ technology, to offer retailers, educators and healthcare providers a cloud-managed Wi-Fi infrastructure enabled with proximity beacons. Together, Aerohive and Radius Networks provide complementary cloud platforms for helping these organizations meet the demands of today’s increasingly connected customers who are seeking more personalized student education, patient care and shopper experiences.

 

Also:

 

 

 

TextingInTV-Film-Romano-Aug2014

 

Excerpt from A Brief Look at Texting and the Internet in Film — by Tony Zhou

Is there a better way of showing a text message in a film? How about the internet? Even though we’re well into the digital age, film is still ineffective at depicting the world we live in. Maybe the solution lies not in content, but in form.

 

 

From DSC:
With a shout out/thanks to Krista Spahr,
Senior Instructional Designer at Calvin College, for this resource

 

TechInEducation-FutureClassroom-Jan2014NemroffPictures

 

From DSC:
With thanks to Mr. Jeff Finder at
Faculty Row for this resource!

Also, take note of how
interdisciplinary this piece is,
encompassing Daniel Nemroff’s skills
in filmmaking, visual effects,
photography, & graphic design — but also
his visionary thinking and his awareness
of what might be effective uses of
educational technologies.

 

WatsonInBoardRoomMeetingsMIT-Aug2014

 

Excerpt:

First, Watson was brought up to speed by being directed, verbally, to read over an internal memo summarizing the company’s strategy for artificial intelligence. It was then asked by one of the researchers to use that knowledge to generate a long list of candidate companies. “Watson, show me companies between $15 million and $60 million in revenue relevant to that strategy,” he said.

After the humans in the room talked over the results Watson displayed on screen, they called out a shorter list for Watson to put in a table with columns for key characteristics. After mulling some more, one of them said: “Watson, make a suggestion.” The system ran a set of decision-making algorithms and bluntly delivered its verdict: “I recommend eliminating Kawasaki Robotics.” When Watson was asked to explain, it simply added: “It is inferior to Cognilytics in every way.”
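The query Watson answered, filter candidates by a revenue range, then rank what survives, is easy to sketch in miniature. The company data, revenue figures, and "strategy fit" scores below are invented for illustration; this is a toy stand-in, not how Watson actually works.

```python
# Toy sketch of the boardroom query described above. All data is invented.
companies = [
    {"name": "Kawasaki Robotics", "revenue_m": 45,  "strategy_fit": 0.55},
    {"name": "Cognilytics",       "revenue_m": 30,  "strategy_fit": 0.90},
    {"name": "MegaCorp AI",       "revenue_m": 900, "strategy_fit": 0.80},
]

def shortlist(companies, low_m=15, high_m=60):
    """'Show me companies between $15 million and $60 million in revenue.'"""
    return [c for c in companies if low_m <= c["revenue_m"] <= high_m]

def recommend_elimination(candidates):
    """'Watson, make a suggestion': drop the weakest strategic fit."""
    return min(candidates, key=lambda c: c["strategy_fit"])["name"]

in_range = shortlist(companies)        # MegaCorp AI is filtered out by revenue
print(recommend_elimination(in_range))
```

The real system layers natural-language parsing and learned scoring on top of this kind of filter-and-rank pipeline.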

 

Reflections on “C-Suite TV debuts, offers advice for the boardroom” [Dreier]

C-Suite TV debuts, offers advice for the boardroom — from streamingmedia.com by Troy Dreier
Business leaders now have an on-demand video network to call their own, thanks to one Bloomberg host’s online venture.

Excerpt:

Bringing some business acumen to the world of online video, C-Suite TV is launching today. Created by Bloomberg TV host and author Jeffrey Hayzlett, the on-demand video network offers interviews with and shows about business execs. It promises inside information on business trends and the discussions taking place in the biggest boardrooms.

 

MYOB-July2014

 

The Future of TV is here for the C-Suite — from hayzlett.com by Jeffrey Hayzlett

Excerpt:

Rather than wait for networks or try and gain traction through the thousands of cat videos, we went out and built our own network.

 

 

See also:

  • Mind your own business
    From the About page:
    C-Suite TV is a web-based digital on-demand business channel featuring interviews and shows with business executives, thought leaders, authors and celebrities providing news and information for business leaders. C-Suite TV is your go-to resource to find out the inside track on trends and discussions taking place in businesses today. This online channel will be home to such shows as C-Suite with Jeffrey Hayzlett, MYOB – Mind Your Own Business and Bestseller TV with more shows to come.

 

 

From DSC:
The above items took me back to the concept of Learning from the Living [Class] Room.

Many of the following bullet points are already happening — but what I’m trying to influence/suggest is to bring all of them together in a powerful, global, 24 x 7 x 365, learning ecosystem:

  • When our “TVs” become more interactive…
  • When our mobile devices act as second screens and when second screen-based apps are numerous…
  • When discussion boards, forums, social media, assignments, assessments, and videoconferencing capabilities are embedded into our Smart/Connected TVs and are also available via our mobile devices…
  • When education is available 24 x 7 x 365…
  • When even the C-Suite taps into such platforms…
  • When education and entertainment are co-mingled…
  • When team-based educational content creation and delivery are mainstream…
  • When self-selecting Communities of Practice thrive online…
  • When Learning Hubs combine the best of both worlds (online and face-to-face)…
  • When Artificial Intelligence, powerful cognitive computing capabilities (i.e., IBM’s Watson), and robust reporting mechanisms are integrated into the backends…
  • When lifelong learners have their own cloud-based profiles…
  • When learners can use their “TVs” to tap into interactive, multimedia-based streams of content of their choice…
  • When recommendation engines are offered not just at Netflix but also at educationally-oriented sites…
  • When online tutoring and intelligent tutoring really take off…

…then I’d say we’ll have a powerful, engaging, responsive, global education platform.

 

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

 

Mobile Megatrends 2014…uncovering major mobile trends in 2014 — from visionmobile.com

Excerpt:

This report examines five major trends that we expect to shape the future of mobile in the coming years:

  1. Apps: The Tip of the Iceberg
  2. Mobile Ecosystems: Don’t Come Late to the Game
  3. OTT Squared: Messaging Apps are the new Platforms
  4. Handset Business Reboot: Hardware is the new Distribution
  5. The Future of HTML5: Beyond the Browser

 

From DSC:
In looking at the below excerpted slide from this solid presentation, I have to ask…

“Does this same phenomenon also apply to educationally-related products/services?”

Yes, I think it does.

That is, the educationally-related products and services of an organization will compete not by size, but by how well the experience roams across screens.  Lifelong learners (who are using well-designed learning experiences) will be able to tap into streams of content on multiple devices without missing a beat.  The organizations that provide such solid learning experiences across multiple “channels” should do well in the future.  This is due to:

  • The affordances of cloud-based computing
  • The increasing power of mobile computing
  • The convergence of the television, the telephone, and the computer — which is opening up the door for powerful, interactive, multi-directional communications that involve smart/connected televisions
  • Generation Z’s extensive use of screens*

 

 

 

HowEcosystemsWillCompete-VisionMobile-June2014

 

 

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

 

* From Here Comes Generation Z — bloombergview.com by Leonid Bershidsky

If Y-ers were the perfectly connected generation, Z-ers are overconnected. They multi-task across five screens: TV, phone, laptop, desktop and either a tablet or some handheld gaming device, spending 41 percent of their time outside of school with computers of some kind or another, compared to 22 percent 10 years ago.

 

Goodbye, TV Channels—And Hello, TV Apps — from readwrite.com by Adriana Lee
How a small change in language represents a universal shift in the television experience.

 

GoodbyeTVChannels-May2014

 

Excerpt (emphasis DSC):

But television is evolving. Increasingly, it’s all about the apps now—browsable, downloadable, interactive TV applications. You can thank the swelling ranks of streaming services and devices for that.

The software applications they’re delivering to our living rooms are growing in number and prominence. And they’re starting to eclipse the passive, one-way broadcasts we once fought over for two-way, interactive experiences that let you share democratically among multiple users (née viewers) across mobile devices and computers.

According to research firm NPD Group, the smart television business has begun to boom. In the beginning of 2013, there were 140 million Internet-ready TVs in American homes. By 2015, it will grow 44 percent, to 202 million. And by that time, nearly two-thirds of them will actually be connected to the Internet, compared to just 56 percent now.

How they connect is important. When it comes to television, “apps” are where it’s at, not ye olde “TV channels.” It’s just a shift in language, true—but it’s also a shift in thinking.

 

 

In a multi-screen future, phones don’t control TVs, TVs control phones — from foxnews.com by Alex Tretbar

Excerpt:

Right now, most “second-screen” usage is more distracting than it is enriching, but that’s about to change. Soon your tablet will spring to life when you tune into your favorite show, and you’ll have more opportunities than ever to engage. The million-dollar buzzword here is Automatic Content Recognition, or ACR. But, before we get too far into that, let’s start at the beginning: the screen itself.

Navin wants his apps to automatically deliver content viewers might otherwise seek out manually. This might mean recommendations, related video, social-media discussions, or even a simple plot synopsis.
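The ACR flow described above, fingerprint the TV audio, match it against a catalog, then push related content to the second screen, can be sketched as follows. The shows, audio samples, and hash-based "fingerprint" are invented stand-ins; real ACR uses acoustic fingerprints that tolerate noise and partial matches, which a plain hash does not.

```python
# A minimal sketch of Automatic Content Recognition (ACR): audio from the TV
# is fingerprinted, matched against a catalog, and the match drives what the
# second-screen app displays. All names and samples are hypothetical.
import hashlib

CATALOG = {}  # fingerprint -> show metadata

def fingerprint(audio_bytes):
    # Stand-in for a real acoustic fingerprint (a plain hash only matches
    # exact byte-for-byte samples).
    return hashlib.sha256(audio_bytes).hexdigest()[:12]

def register(audio_bytes, show, companion):
    CATALOG[fingerprint(audio_bytes)] = {"show": show, "companion": companion}

def second_screen_content(mic_capture):
    match = CATALOG.get(fingerprint(mic_capture))
    if match is None:
        return "No match; showing generic recommendations"
    return "Now watching {}: {}".format(match["show"], match["companion"])

register(b"theme-song-sample", "Cosmos", "plot synopsis + cast discussion feed")
print(second_screen_content(b"theme-song-sample"))
```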

 

 

What television will look like in 2025, according to Netflix — from  wired.com by Issie Lapowsky

Excerpts:

People have traditionally discovered new shows by tuning into the channels that were most aligned with their interests. Love news?  Then CNN might be the channel for you.  If it’s children’s programming you want, Nickelodeon has you covered.  And yet, none of these channels can serve 100 percent of their customers what they want to watch 100 percent of the time.

According to Hunt, this will change with internet TV.  He said Netflix is now working to perfect its personalization technology to the point where users will no longer have to choose what they want to watch from a grid of shows and movies.  Instead, the recommendation engine will be so finely tuned that it will show users “one or two suggestions that perfectly fit what they want to watch now.”

“I think this vision is possible,” Hunt said. “We’ve come a long way towards it, and we have a ways to go still.” He said Netflix is now devoting as much time and energy to building out that personalization technology as the company put into building the infrastructure for delivering that content in the first place.

“The stories we watch today are not your parents’ TV,” Hunt said, “and the stories your kids watch in 2025 will blow your mind away.”

 

And by the year 2025, he told his audience, everyone will own a smart TV.

 

 

TV transformed by smart thinking — from theaustralian.com.au

Excerpt:

As LG puts it, your apps to the right of the cards are “the future” — what you will watch, while the display of your recently used apps, to the left of the cards, is “the past” — so the launcher is an amalgam of your past, present and future viewing activity.

 

 

 

From DSC:
“…everyone will own a smart TV by 2025.”  Well, maybe not everyone, but many of us will have access to these Internet-connected “TVs” (if they are even called TVs at that point).

I hope that Netflix will license those personalization technologies to other vendors or, if not, that some other vendor will create them for educationally-related purposes.

Can you imagine a personalization engine — focused on education and/or training — that could provide the scaffolding necessary for learning about many topics? That is, digital playlists of learning: streams of content focused on education.  Such engines would remember where you left off and what you still need to review…what you have mastered and what you are still struggling with…what you enjoy learning about…your learning preferences…and more.
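The learner profile imagined above, one that remembers mastery, review needs, and where you left off, could be sketched as a small data structure. The class name, topics, scores, and threshold below are all invented for illustration; no such product exists in this form.

```python
# A sketch of a cloud-based learner profile as imagined above.
# Topic names, scores, and the mastery threshold are hypothetical.
class LearnerProfile:
    def __init__(self):
        self.mastery = {}        # topic -> score in [0, 1]
        self.last_position = {}  # topic -> where the learner left off

    def record(self, topic, score, position):
        self.mastery[topic] = score
        self.last_position[topic] = position

    def needs_review(self, threshold=0.7):
        """Topics whose mastery score falls below the threshold."""
        return [t for t, s in self.mastery.items() if s < threshold]

    def next_suggestion(self):
        """Recommend the weakest topic, resuming where the learner stopped."""
        topic = min(self.mastery, key=self.mastery.get)
        return topic, self.last_position[topic]

profile = LearnerProfile()
profile.record("Fractions", 0.9, "lesson 12")
profile.record("Algebra basics", 0.4, "lesson 3")
print(profile.needs_review())     # the topics still being struggled with
print(profile.next_suggestion())  # where the engine would resume
```

A production engine would add collaborative filtering across many learners, but the profile itself is the foundation.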

 

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

 

Addendum:
How Samsung is enabling the future of social TV — from lostremote.com by Natan Edelsburg

 

 
 

FirstLegoLeague-2014-FutureOfLearning

 

Excerpt (emphasis DSC — with thanks to Mr. Joe Byerwalter for this resource):

What is the future of learning? FIRST® LEGO® League teams will find the answers. In the 2014 FLL WORLD CLASS℠ Challenge, over 230,000 children ages 9 to 16* from over 70 countries will redesign how we gather knowledge and skills in the 21st century. Teams will teach adults about the ways that kids need and want to learn. Get ready for a whole new class – FLL WORLD CLASS℠!

FLL challenges kids to think like scientists and engineers. During FLL WORLD CLASS℠, teams will build, test, and program an autonomous robot using LEGO MINDSTORMS® to solve a set of missions in the Robot Game. They will also choose and solve a real-world question in the Project. Throughout their experience, teams will operate under FLL’s signature set of Core Values.

*9-14 in the US, Canada, and Mexico

 

New centre will give robots the gift of sight — from Queensland University of Technology; 18 March 2014

Excerpt:

“Society is on the cusp of a revolution in robotics in which personal robots will become an everyday fact of life. What we need to make this happen is to develop technology which allows a robot to perceive its environment – to sense, understand and learn from the information it gathers.

“We want to use cameras in robots, which are less costly and use less power than other types of sensors,” he said.

“Cameras provide very rich information about the world, and we know from our own experience that our eyes, which are just cameras, provide the information we need to perform complex tasks like driving cars or picking up objects.

“This Centre will deliver the science and technologies that will turn cameras into powerful sensing devices capable of understanding and responding to their environment, and enabling robots to operate reliably over long periods, in complex unstructured surroundings where they will interact with humans as well as objects, many of which will require delicate handling.

 

US Army to test robots that “think – look – move – talk – work” — from robohub.org by Colin Lewis

Excerpt:

A report issued by the US Department of Defense shows that the army intends to test robots that “think – look – move – talk and work.” Figure 24 from the report summarizes the Army’s vision for these five problem domains, barriers to achieving its vision, and work to be done to advance toward the vision.

 

 

Deep Learning’s Role in the Age of Robots — from innovationinsights.wired.com by Julian Green

 

 

IEEE_Spectrum_-robots-app

 

Avidbots wants to automate commercial cleaning with robots — from spectrum.ieee.org by Evan Ackerman

 
Photo: Steve Zylius/UC Irvine. Dr. Warren Wiechmann, assistant clinical professor of emergency medicine and associate dean of instructional technologies, will oversee implementation of the four-year Google Glass program at UCI.

 

UCI School of Medicine first to integrate Google Glass into curriculum — from news.uci.edu
Wearable computing technology will transform training of future doctors

Excerpt:

Irvine, Calif., May 14, 2014 — As physicians and surgeons explore how to use Google Glass, the UC Irvine School of Medicine is taking steps to become the first in the nation to integrate the wearable computer into its four-year curriculum – from first- and second-year anatomy courses and clinical skills training to third- and fourth-year hospital rotations.

Leaders of the medical school have confidence that faculty and students will benefit from Glass’s unique ability to display information in a smartphone-like, hands-free format; being able to communicate with the Internet via voice commands; and being able to securely broadcast and record patient care and student training activities using proprietary software compliant with the 1996 federal Health Insurance Portability & Accountability Act.

When faculty wear Google Glass for instruction, he added, it gives students an unprecedented first-person perspective. Conversely, when students are wearing Glass, they can take advantage of pertinent information delivered directly into their line of sight by faculty members, who can see exactly what a student sees and thus better guide a dissection or simulation exercise.

“The most promising part is having patients wear Glass so that our students can view themselves through the patients’ eyes, experience patient care from the patients’ perspective, and learn from that information to become more empathic and engaging physicians,” Wiechmann said.

 

CNN’s Google Glass app now supports iReport citizen journalism – from glassalmanac.com by Matt McGee

Excerpt:

We can all be journalists via Google Glass. That’s because CNN has updated its existing Glassware to add support for the iReport service.

For the uninitiated, iReport is a citizen journalism program where Joe Public can send news tips, videos, photos, etc. to CNN for possible use on the TV news or on CNN’s website. If you watch CNN enough, chances are you’ve seen iReport video or photos being used on air.

And now, as a Glass Explorer, you can share your #throughglass photos and videos straight to CNN’s iReport service.

 

Metaio releases first true “see-through” wearable augmented reality — from realareal.com by Kiran Voleti

Excerpt:

The world leader in Augmented reality (AR) software and solutions, Metaio, today announced the first ever “see-through” wearable AR capabilities through the newest Beta version of the Metaio SDK, now optimized for wearable computing devices like the brand-new Epson Moverio BT-200. Instead of utilizing a camera view, Metaio’s technology allows the user to perceive reality itself with digital and virtual content directly overlaid onto their surroundings.

Wearable computing is on the rise, with devices like Google Glass and Oculus Rift in the public eye more and more. But in order to perform augmented reality experiences, even transparent displays like Google Glass rely on a camera video feed that duplicates reality rather than using the reality itself, potentially creating a disconnect for the user between the augmented content and the real world.

 

My Google Glass Experiment — from mrspepe.com
This section is an experiment that I began in March of 2014 when I had the chance to become a Google Glass explorer.  Each day I took a few minutes to reflect upon how my students and I used “Google Glass in the Class” –  here is the data.  I am doing this experiment for 60 days.

 

Smart Glasses for warehouses: SmartPick — from postscapes.com
Order picking with a vision.

See also:
smartpick.be

Smart Glasses for Warehouses: SmartPick

 

OK Glass, identify this dinosaur fossil — from popularmechanics.com by William Herkewitz
How Google Glass will help paleontologists identify new finds, share fossil hunts with people half a world away, and build digital models of dinosaur bones before the fossils even come out of the ground.

 

Air Force eyeing Google Glass for battlefield informatics — from blogs.wsj.com by Clint Boulton

 

Google Glass enters the operating theatre: Surgeon becomes first in the UK to use smart specs during an operation — from dailymail.co.uk by Emma Innes

Excerpt:

  • David Isaac, at Torbay Hospital, was first surgeon in UK to use the device
  • Other surgeons at the same Devon hospital have now used them too
  • They say Google Glass has ‘huge potential’ for medical education
  • It can be used to live-stream operations to lecture theatres so students can see them from the surgeon’s perspective

 
 

From DSC:
Last spring, I saw the following graphic from Sparks & Honey’s presentation entitled, “8 Exponential Trends That Will Shape Humanity”:

 

ExponentialNotLinearSparksNHoney-Spring2013

 

If today’s changes are truly exponential — and I agree with Sparks & Honey that they are, especially as they relate to technological changes — how soon will it be before each of us is interacting with a robot?

This is not an idle idea or question, nor is it a joke. It will be here sooner than most of us think!  The science fiction of the past is here (at least in part).  Some recent items I’ve run across come to mind, such as:

 

Hitachi’s EMIEW Robot Learns to Navigate Around the Office — from spectrum.ieee.org by Jason Falconer

 

Photo: Hitachi

Excerpt:

Now EMIEW 2 still relies on maps of its surroundings, but its navigation software has a new feature: It uses designated zones that make the robot change its speed and direction.
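The "designated zones" idea, where the robot's current position falls inside a zone that dictates its speed, can be sketched with a simple lookup. The zone coordinates, speeds, and labels below are invented; EMIEW 2's actual navigation software is far richer than this.

```python
# Sketch of zone-based speed control as described above. All values invented.
ZONES = [
    # (x_min, y_min, x_max, y_max, max_speed_kmh, label)
    (0,  0, 10, 5, 6.0, "open corridor"),
    (10, 0, 14, 5, 2.0, "doorway (slow down)"),
    (14, 0, 30, 5, 4.0, "office floor"),
]

def speed_limit(x, y, default=1.0):
    """Return the speed cap and zone label for the robot's current position."""
    for x0, y0, x1, y1, speed, label in ZONES:
        if x0 <= x < x1 and y0 <= y < y1:
            return speed, label
    return default, "unmapped area (creep)"

# Approaching the doorway, the robot slows from 6 km/h to 2 km/h.
print(speed_limit(12, 2))
```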

 

 

iRobot Ava 500

iRobotAva-2014

 

Excerpt:

Ava 500 enables this new dimension in telepresence with:

  • autonomous navigation and mobility – remote users simply specify a destination and the robot automatically navigates to the desired location without any human intervention.
  • standards-based videoconferencing – built-in Cisco Telepresence® solutions deliver enterprise-class security and reliability.
  • an easy-to-use client application – an iPad mini™ tablet enables remote users to schedule and control the robot.
  • scheduling and management – seamlessly handled through an iRobot managed cloud service.
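The first bullet, "specify a destination and the robot automatically navigates there," rests on path planning. A breadth-first search over a grid map is the textbook toy version; the map below is invented, and Ava 500's real stack (mapping, obstacle avoidance, people detection) is far more involved.

```python
# Toy sketch of destination-based navigation: BFS shortest path on a grid.
# The floor map is invented for illustration.
from collections import deque

GRID = [  # 0 = free, 1 = wall
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]

def route(start, goal):
    """Shortest path from start to goal as a list of (row, col) cells."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(GRID) and 0 <= nc < len(GRID[0])
                    and GRID[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None  # destination unreachable

print(route((0, 0), (2, 4)))
```

The remote user only ever sees "pick a destination"; everything else happens in the planner.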

 

 

 

Also see:

 

 

 
© 2025 | Daniel Christian