Recording of yesterday’s event/keynote
Beyond the Major Apple Headlines, 11 Things You Need to Know — from blogs.wsj.com by Brian R. Fitzgerald
The Apple Watch: Everything You Need to Know — from businessweek.com by Joshua Brustein
Apple Watch: Here Are the Apps to Expect — from blogs.wsj.com by Nathan Olivarez-Giles
Apple in Focus After Debut of iPhone 6, Smart Watch, Apple Pay — from finance.yahoo.com by Zacks Research Staff
Tim Cook tells USA TODAY: ‘This is epic’ — from usatoday.com by Marco della Cava
Apple unveils smartwatch, bets on wearable devices — from finance.yahoo.com by Michael Liedtke and Anick Jesdanun
Apple Watch: Coming to a Classroom Near You? — from chronicle.com by Rebecca Koenig
Excerpt from A Brief Look at Texting and the Internet in Film — by Tony Zhou
Is there a better way of showing a text message in a film? How about the internet? Even though we’re well into the digital age, film is still ineffective at depicting the world we live in. Maybe the solution lies not in content, but in form.
From DSC:
With a shout out/thanks to Krista Spahr, Senior Instructional Designer at Calvin College, for this resource.
From DSC:
With thanks to Mr. Jeff Finder at Faculty Row for this resource! Also, take note of how interdisciplinary this piece is, encompassing Daniel Nemroff's skills in filmmaking, visual effects, photography, & graphic design — but also his visionary thinking and his awareness of what might be effective uses of educational technologies.
Excerpt:
First, Watson was brought up to speed by being directed, verbally, to read over an internal memo summarizing the company’s strategy for artificial intelligence. It was then asked by one of the researchers to use that knowledge to generate a long list of candidate companies. “Watson, show me companies between $15 million and $60 million in revenue relevant to that strategy,” he said.
After the humans in the room talked over the results Watson displayed on screen, they called out a shorter list for Watson to put in a table with columns for key characteristics. After mulling some more, one of them said: “Watson, make a suggestion.” The system ran a set of decision-making algorithms and bluntly delivered its verdict: “I recommend eliminating Kawasaki Robotics.” When Watson was asked to explain, it simply added: “It is inferior to Cognilytics in every way.”
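Stripped of its natural-language front end, the first request in that demo amounts to a simple range query over company data. A minimal sketch in Python; Kawasaki Robotics and Cognilytics appear in the article, but the third name and all revenue figures are invented for illustration:

```python
# Hypothetical data: the revenue figures and "Acme AI" are invented.
companies = {
    "Kawasaki Robotics": 22_000_000,
    "Cognilytics": 48_000_000,
    "Acme AI": 75_000_000,
}

def candidates(revenue_by_company, low, high):
    """Return, alphabetically, the companies whose revenue falls in [low, high]."""
    return sorted(
        name
        for name, revenue in revenue_by_company.items()
        if low <= revenue <= high
    )

# "Show me companies between $15 million and $60 million in revenue."
shortlist = candidates(companies, 15_000_000, 60_000_000)
```

The interesting part of Watson, of course, is everything around this query: parsing the spoken request and ranking the results against a strategy document.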
Goodbye, TV Channels—And Hello, TV Apps — from readwrite.com by Adriana Lee
How a small change in language represents a universal shift in the television experience.
Excerpt (emphasis DSC):
But television is evolving. Increasingly, it’s all about the apps now—browsable, downloadable, interactive TV applications. You can thank the swelling ranks of streaming services and devices for that.
The software applications they’re delivering to our living rooms are growing in number and prominence. And they’re starting to eclipse the passive, one-way broadcasts we once fought over for two-way, interactive experiences that let you share democratically among multiple users (née viewers) across mobile devices and computers.
…
According to research firm NPD Group, the smart television business has begun to boom. In the beginning of 2013, there were 140 million Internet-ready TVs in American homes. By 2015, it will grow 44 percent, to 202 million. And by that time, nearly two-thirds of them will actually be connected to the Internet, compared to just 56 percent now.
…
How they connect is important. When it comes to television, “apps” are where it’s at, not ye olde “TV channels.” It’s just a shift in language, true—but it’s also a shift in thinking.
In a multi-screen future, phones don’t control TVs, TVs control phones — from foxnews.com by Alex Tretbar
Excerpt:
Right now, most “second-screen” usage is more distracting than it is enriching, but that’s about to change. Soon your tablet will spring to life when you tune into your favorite show, and you’ll have more opportunities than ever to engage. The million-dollar buzzword here is Automatic Content Recognition, or ACR. But, before we get too far into that, let’s start at the beginning: the screen itself.
…
Navin wants his apps to automatically deliver content viewers might otherwise seek out manually. This might mean recommendations, related video, social-media discussions, or even a simple plot synopsis.
What television will look like in 2025, according to Netflix — from wired.com by Issie Lapowsky
Excerpts:
People have traditionally discovered new shows by tuning into the channels that were most aligned with their interests. Love news? Then CNN might be the channel for you. If it’s children’s programming you want, Nickelodeon has you covered. And yet, none of these channels can serve 100 percent of their customers what they want to watch 100 percent of the time.
According to Hunt, this will change with internet TV. He said Netflix is now working to perfect its personalization technology to the point where users will no longer have to choose what they want to watch from a grid of shows and movies. Instead, the recommendation engine will be so finely tuned that it will show users “one or two suggestions that perfectly fit what they want to watch now.”
“I think this vision is possible,” Hunt said. “We’ve come a long way towards it, and we have a ways to go still.” He said Netflix is now devoting as much time and energy to building out that personalization technology as the company put into building the infrastructure for delivering that content in the first place.
…
“The stories we watch today are not your parents’ TV,” Hunt said, “and the stories your kids watch in 2025 will blow your mind away.”
And by the year 2025, he told his audience, everyone will own a smart TV.
TV transformed by smart thinking — from theaustralian.com.au
Excerpt:
As LG puts it, your apps to the right of the cards are “the future” — what you will watch — while the display of your recently used apps, to the left of the cards, is “the past,” so the launcher is an amalgam of your past, present and future viewing activity.
From DSC:
“…everyone will own a smart TV by 2025.” Well, maybe not everyone, but many of us will have access to these Internet-connected “TVs” (if they are even called TVs at that point).
I hope that Netflix will license those personalization technologies to other vendors or, if not, that some other vendor will create them for education-related purposes.
Can you imagine a personalization engine — focused on education and/or training — that could provide the scaffolding necessary for learning about many topics? That is, digital playlists of learning: streams of content focused on education. Such engines would remember where you left off and what you still need to review…what you have mastered and what you are still struggling with…what you enjoy learning about…your learning preferences…and more.
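A minimal sketch of the bookkeeping such an engine would need (every class, topic, and function name here is hypothetical; a real recommendation engine would be far more sophisticated):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Set

@dataclass
class LearnerProfile:
    """Tracks one learner's progress through a digital playlist of topics."""
    mastered: Set[str] = field(default_factory=set)
    struggling: Dict[str, int] = field(default_factory=dict)  # topic -> failed attempts
    last_item: Optional[str] = None  # where the learner left off

    def record(self, topic: str, passed: bool) -> None:
        """Update the profile after an assessment on one topic."""
        self.last_item = topic
        if passed:
            self.mastered.add(topic)
            self.struggling.pop(topic, None)
        else:
            self.struggling[topic] = self.struggling.get(topic, 0) + 1

def next_suggestion(profile: LearnerProfile, playlist: List[str]) -> Optional[str]:
    """Suggest reviewing the weakest topic first; else the next unmastered item."""
    if profile.struggling:
        return max(profile.struggling, key=profile.struggling.get)
    for topic in playlist:
        if topic not in profile.mastered:
            return topic
    return None  # playlist complete
```

Even this toy version captures the core idea: the engine resumes where you left off and steers you back to what you are still struggling with before moving you forward.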
Addendum:
How Samsung is enabling the future of social TV — from lostremote.com by Natan Edelsburg
UCI School of Medicine first to integrate Google Glass into curriculum — from news.uci.edu
Wearable computing technology will transform training of future doctors
Excerpt:
Irvine, Calif., May 14, 2014 — As physicians and surgeons explore how to use Google Glass, the UC Irvine School of Medicine is taking steps to become the first in the nation to integrate the wearable computer into its four-year curriculum – from first- and second-year anatomy courses and clinical skills training to third- and fourth-year hospital rotations.
Leaders of the medical school are confident that faculty and students will benefit from Glass’s unique ability to display information in a smartphone-like, hands-free format, to communicate with the Internet via voice commands, and to securely broadcast and record patient care and student training activities using proprietary software compliant with the 1996 federal Health Insurance Portability & Accountability Act.
…
When faculty wear Google Glass for instruction, he added, it gives students an unprecedented first-person perspective. Conversely, when students are wearing Glass, they can take advantage of pertinent information delivered directly into their line of sight by faculty members, who can see exactly what a student sees and thus better guide a dissection or simulation exercise.
“The most promising part is having patients wear Glass so that our students can view themselves through the patients’ eyes, experience patient care from the patients’ perspective, and learn from that information to become more empathic and engaging physicians,” Wiechmann said.
CNN’s Google Glass app now supports iReport citizen journalism — from glassalmanac.com by Matt McGee
Excerpt:
We can all be journalists via Google Glass. That’s because CNN has updated its existing Glassware to add support for the iReport service.
For the uninitiated, iReport is a citizen journalism program where Joe Public can send news tips, videos, photos, etc. to CNN for possible use on the TV news or on CNN’s website. If you watch CNN enough, chances are you’ve seen iReport video or photos being used on air.
And now, as a Glass Explorer, you can share your #throughglass photos and videos straight to CNN’s iReport service.
Metaio releases first true “see-through” wearable augmented reality — from realareal.com by Kiran Voleti
Excerpt:
The world leader in augmented reality (AR) software and solutions, Metaio, today announced the first-ever “see-through” wearable AR capabilities through the newest Beta version of the Metaio SDK, now optimized for wearable computing devices like the brand-new Epson Moverio BT-200. Instead of utilizing a camera view, Metaio’s technology allows the user to perceive reality itself with digital and virtual content directly overlaid onto their surroundings.
Wearable computing is on the rise, with devices like Google Glass and Oculus Rift in the public eye more and more. But in order to perform augmented reality experiences, even transparent displays like Google Glass rely on a camera video feed that duplicates reality rather than using the reality itself, potentially creating a disconnect for the user between the augmented content and the real world.
My Google Glass Experiment — from mrspepe.com
This section is an experiment that I began in March of 2014 when I had the chance to become a Google Glass Explorer. Each day I take a few minutes to reflect upon how my students and I use “Google Glass in the Class” – here is the data. I am running this experiment for 60 days.
Smart Glasses for warehouses: SmartPick — from postscapes.com
Order picking with a vision.
See also:
smartpick.be
OK Glass, identify this dinosaur fossil — from popularmechanics.com by William Herkewitz
How Google Glass will help paleontologists identify new finds, share fossil hunts with people half a world away, and build digital models of dinosaur bones before the fossils even come out of the ground.
Air Force eyeing Google Glass for battlefield informatics — from blogs.wsj.com by Clint Boulton
Google Glass enters the operating theatre: Surgeon becomes first in the UK to use smart specs during an operation — from dailymail.co.uk by Emma Innes
Excerpt:
David Isaac, at Torbay Hospital, was first surgeon in UK to use the device
Other surgeons at the same Devon hospital have now used them too
They say Google Glass has ‘huge potential’ for medical education
It can be used to live-stream operations to lecture theatres so students can see them from the surgeon’s perspective