New centre will give robots the gift of sight — from Queensland University of Technology; 18 March 2014

Excerpt:

“Society is on the cusp of a revolution in robotics in which personal robots will become an everyday fact of life. What we need to make this happen is to develop technology which allows a robot to perceive its environment – to sense, understand and learn from the information it gathers.

“We want to use cameras in robots, which are less costly and use less power than other types of sensors,” he said.

“Cameras provide very rich information about the world, and we know from our own experience that our eyes, which are just cameras, provide the information we need to perform complex tasks like driving cars or picking up objects.

“This Centre will deliver the science and technologies that will turn cameras into powerful sensing devices capable of understanding and responding to their environment, and enabling robots to operate reliably over long periods, in complex unstructured surroundings where they will interact with humans as well as objects, many of which will require delicate handling.

 

US Army to test robots that “think – look – move – talk – work” — from robohub.org by Colin Lewis

Excerpt:

A report issued by the US Department of Defense shows that the army intends to test robots that “think – look – move – talk and work.” Figure 24 from the report summarizes the Army’s vision for these five problem domains, barriers to achieving its vision, and work to be done to advance toward the vision.

 

 

Deep Learning’s Role in the Age of Robots — from innovationinsights.wired.com by Julian Green

 

 


 

Avidbots wants to automate commercial cleaning with robots — from spectrum.ieee.org by Evan Ackerman

 
Photo: Steve Zylius/UC Irvine. Dr. Warren Wiechmann, assistant clinical professor of emergency medicine and associate dean of instructional technologies, will oversee implementation of the Google Glass four-year program at UCI.

 

UCI School of Medicine first to integrate Google Glass into curriculum — from news.uci.edu
Wearable computing technology will transform training of future doctors

Excerpt:

Irvine, Calif., May 14, 2014 — As physicians and surgeons explore how to use Google Glass, the UC Irvine School of Medicine is taking steps to become the first in the nation to integrate the wearable computer into its four-year curriculum – from first- and second-year anatomy courses and clinical skills training to third- and fourth-year hospital rotations.

Leaders of the medical school have confidence that faculty and students will benefit from Glass’s unique ability to display information in a smartphone-like, hands-free format, to communicate with the Internet via voice commands, and to securely broadcast and record patient care and student training activities using proprietary software compliant with the 1996 federal Health Insurance Portability & Accountability Act.

When faculty wear Google Glass for instruction, he added, it gives students an unprecedented first-person perspective. Conversely, when students are wearing Glass, they can take advantage of pertinent information delivered directly into their line of sight by faculty members, who can see exactly what a student sees and thus better guide a dissection or simulation exercise.

“The most promising part is having patients wear Glass so that our students can view themselves through the patients’ eyes, experience patient care from the patients’ perspective, and learn from that information to become more empathic and engaging physicians,” Wiechmann said.

 

CNN’s Google Glass app now supports iReport citizen journalism — from glassalmanac.com by Matt McGee

Excerpt:

We can all be journalists via Google Glass. That’s because CNN has updated its existing Glassware to add support for the iReport service.

For the uninitiated, iReport is a citizen journalism program where Joe Public can send news tips, videos, photos, etc. to CNN for possible use on the TV news or on CNN’s website. If you watch CNN enough, chances are you’ve seen iReport video or photos being used on air.

And now, as a Glass Explorer, you can share your #throughglass photos and videos straight to CNN’s iReport service.

 

Metaio releases first true “see-through” wearable augmented reality — from realareal.com by Kiran Voleti

Excerpt:

The world leader in Augmented reality (AR) software and solutions, Metaio, today announced the first ever “see-through” wearable AR capabilities through the newest Beta version of the Metaio SDK, now optimized for wearable computing devices like the brand-new Epson Moverio BT-200. Instead of utilizing a camera view, Metaio’s technology allows the user to perceive reality itself with digital and virtual content directly overlaid onto their surroundings.

Wearable computing is on the rise, with devices like Google Glass and Oculus Rift in the public eye more and more. But in order to perform augmented reality experiences, even transparent displays like Google Glass rely on a camera video feed that duplicates reality rather than using the reality itself, potentially creating a disconnect for the user between the augmented content and the real world.

 

My Google Glass Experiment — from mrspepe.com
This section is an experiment that I began in March of 2014 when I had the chance to become a Google Glass explorer.  Each day I took a few minutes to reflect upon how my students and I used “Google Glass in the Class” –  here is the data.  I am doing this experiment for 60 days.

 

Smart Glasses for warehouses: SmartPick — from postscapes.com
Order picking with a vision.

See also:
smartpick.be


 

OK Glass, identify this dinosaur fossil — from popularmechanics.com by William Herkewitz
How Google Glass will help paleontologists identify new finds, share fossil hunts with people half a world away, and build digital models of dinosaur bones before the fossils even come out of the ground.

 

Air Force eyeing Google Glass for battlefield informatics — from blogs.wsj.com by Clint Boulton

 

Google Glass enters the operating theatre: Surgeon becomes first in the UK to use smart specs during an operation — from dailymail.co.uk by Emma Innes

Excerpt:

David Isaac, at Torbay Hospital, was first surgeon in UK to use the device
Other surgeons at the same Devon hospital have now used them too
They say Google Glass has ‘huge potential’ for medical education
It can be used to live-stream operations to lecture theatres so students can see them from the surgeon’s perspective

 
 

From DSC:
Last spring, I saw the following graphic from Sparks & Honey’s presentation entitled, “8 Exponential Trends That Will Shape Humanity”:

 

Exponential, not linear — Sparks & Honey, Spring 2013

 

If today’s changes are truly exponential — and I agree with Sparks & Honey that they are, especially as they relate to technological changes — how soon will it be before each of us is interacting with a robot?

This is not an idle idea or question, nor is it a joke. It will be here sooner than most of us think!  The science fiction of the past is here (at least in part).  Some recent items I’ve run across come to my mind, such as:

 

Hitachi’s EMIEW Robot Learns to Navigate Around the Office — from spectrum.ieee.org by Jason Falconer

 

Photo: Hitachi

Excerpt:

Now EMIEW 2 still relies on maps of its surroundings, but its navigation software has a new feature: It uses designated zones that make the robot change its speed and direction.
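From DSC:
To make the “designated zones” idea concrete, here is a rough Python sketch. The zone names, bounds, and speed limits are my own invention for illustration — not Hitachi’s actual navigation software:

```python
# A minimal sketch of zone-based speed control: the robot's planner checks
# which designated zone its position falls in and caps its speed accordingly.
# All names, bounds, and limits below are invented for illustration.

ZONES = [
    # (name, (x_min, y_min, x_max, y_max), max_speed_m_s)
    ("doorway",  (0.0, 0.0, 2.0, 1.0), 0.3),   # slow down near doors
    ("corridor", (0.0, 1.0, 2.0, 9.0), 1.2),   # cruise in an open corridor
]
DEFAULT_MAX_SPEED = 0.8  # m/s outside any designated zone

def speed_limit_at(x, y):
    """Return the speed cap for position (x, y) based on designated zones."""
    for name, (x0, y0, x1, y1), limit in ZONES:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return limit
    return DEFAULT_MAX_SPEED

print(speed_limit_at(1.0, 0.5))  # inside "doorway"  -> 0.3
print(speed_limit_at(1.0, 5.0))  # inside "corridor" -> 1.2
```

The same lookup could just as easily change heading or trigger an audible warning — the point is that behavior is attached to places on the map, not hard-coded into the motion controller.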

 

 

iRobot Ava 500


 

Excerpt:

Ava 500 enables this new dimension in telepresence with:

  • autonomous navigation and mobility – remote users simply specify a destination and the robot automatically navigates to the desired location without any human intervention.
  • standards-based videoconferencing – built-in Cisco Telepresence® solutions deliver enterprise-class security and reliability.
  • an easy-to-use client application – an iPad mini™ tablet enables remote users to schedule and control the robot.
  • scheduling and management – seamlessly handled through an iRobot managed cloud service.

 

 

 


 

 

 

Everything you need to know about iBeacon — from 3months.com

From DSC:
I’m posting this because:

  1. The video on that page is helpful in learning more about iBeacon.
  2. It’s important for those involved with education to be in the loop on beacon-related technologies — as such devices will help us bridge the physical world with the digital world.

 

 


 

 

 


 

 

 

 

Hypermedia storytelling — from kirkbowe.com
Museum exhibitions as dynamic storytelling experiences using the latest technology

Excerpt:

The secret of many great storytellers lies in their ability to adapt delivery to their audiences, even as they speak.  Storytelling is at its best when it is not a one-way monologue but rather an experience which is shaped by the teller and the listener together.  Underpinning this is the notion of real-time mutual discovery.

Great museum exhibitions tell great stories.  But for practical reasons they lack a dynamic edge, unable to see the faces and hear the thoughts of the people walking around them.  This is because many exhibitions are, to some extent, static place-holders for the mind and soul of the curator or curation team.

One of my passions is researching into how to use technology to bring a vibrant storytelling relationship to the fore.  Recently, advances in certain areas of mobile technology have begun to show me that the potential is now there for the cultural heritage sector to take advantage of it.

But how about the cultural sector?  Many museums have already experimented with mobile interaction through the use of printed codes, such as QR codes, which visitors must scan with their devices.  Bluetooth Smart removes that cumbersome step: visitors need only be within proximity of a beacon in order for your app to provide them with the contextual information you wish to deliver.  The technology has many different potential applications:

– Place a beacon in each room of the exhibition.  Your app then triggers a screen of scene-setting background information for the room as the visitor enters.  No need to have congestion points around wall-mounted text at the door.

– Place a beacon under selected objects or cases.  As visitors walk up to the object, your app detects the beacon and provides commentary, video, or a three-dimensional representation of the object.  No need for visitors to type in an object number to a traditional electronic guide.
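From DSC:
For those curious about the mechanics, here is a rough Python sketch of the beacon-triggered content idea described above. The beacon IDs, signal values, and content table are all invented for illustration; a real app would use a platform beacon-ranging API rather than raw numbers:

```python
# Hypothetical sketch: map beacon sightings to exhibit content, triggering
# only when the visitor is close enough. Values are invented for illustration.

def estimate_distance(rssi, tx_power=-59, path_loss_exponent=2.0):
    """Rough distance estimate (metres) from signal strength, using a
    log-distance path-loss model -- a common, coarse approximation."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))

# Content keyed by (major, minor) beacon identifiers -- one per room or case.
EXHIBIT_CONTENT = {
    (1, 1): "Room 1: scene-setting background for the gallery",
    (1, 2): "Case 2: commentary and 3-D model of the object",
}

def content_for_sighting(major, minor, rssi, trigger_radius_m=2.0):
    """Return exhibit content if the visitor is close enough to the beacon."""
    distance = estimate_distance(rssi)
    if distance <= trigger_radius_m:
        return EXHIBIT_CONTENT.get((major, minor))
    return None  # too far away; don't trigger anything

print(content_for_sighting(1, 1, rssi=-55))   # strong signal -> room intro
print(content_for_sighting(1, 2, rssi=-85))   # weak signal -> None
```

Real deployments calibrate the transmit power per beacon and smooth the noisy signal over time; the sketch only shows the shape of the proximity-triggers-content logic.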

From DSC:
This has major implications — and applications — for teaching and learning spaces and for blended/hybrid learning experiments.  Such technologies can bridge the physical and virtual/digital worlds!

 

 

Another interesting application, providing access to published content from a specific location only:

 

New iBeacon App Stations of the Cross at St. Thomas — from mrspepe.com by Courtney Pepe

Excerpt:

This app was developed by a fellow ADE, Jay Anderson. It uses iBeacon technology to sense how close you are to different pieces of art related to the 14 Stations of the Cross in a church in Lancaster, Pennsylvania. The app has three different settings: meditation for children, meditation for adults, and the comments of the artists. Great use of the iBeacon technology.

Addendum on 4/19:

 


The National Slate Museum in Llanberis, which is operating an iBeacon smartphone information point.

 

App Ed Review

 


 

From the About Us page (emphasis DSC):

App Ed Review is a free searchable database of educational app reviews designed to support classroom teachers in finding and using apps effectively in their teaching practice. In its database, each app review includes:

  • A brief, original description of the app;
  • A classification of the app based on its purpose;
  • Three or more ideas for how the app could be used in the classroom;
  • A comprehensive app evaluation;
  • The app’s target audience;
  • Subject areas where the app can be used; and,
  • The cost of the app.
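From DSC:
For those who think in code, the review fields listed above map naturally onto a simple record type. This Python sketch uses field names and sample values of my own invention — not App Ed Review’s actual schema:

```python
# Hypothetical record type for one app review, following the fields above.
from dataclasses import dataclass

@dataclass
class AppReview:
    description: str        # brief, original description of the app
    purpose: str            # classification based on the app's purpose
    classroom_ideas: list   # three or more ideas for classroom use
    evaluation: str         # comprehensive app evaluation
    target_audience: str
    subject_areas: list
    cost: str

# An invented sample record, just to show the shape of the data.
review = AppReview(
    description="Flashcard app with spaced repetition",
    purpose="Practice and drill",
    classroom_ideas=["vocabulary review", "exit tickets", "exam prep"],
    evaluation="Strong scaffolding; limited analytics",
    target_audience="Grades 6-12",
    subject_areas=["Languages", "Science"],
    cost="Free with in-app purchases",
)
print(review.purpose)
```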

 

 

Also see the Global Education Database:

 


 

From the About Us page:

It’s our belief that digital technologies will utterly change the way education is delivered and consumed over the next decade. We also reckon that this large-scale disruption doesn’t come with an instruction manual. And we’d like GEDB to be part of the answer to that.

It’s the pulling together of a number of different ways in which all those involved in education (teachers, parents, administrators, students) can make some sense of the huge changes going on around them. So there’s consumer reviews of technologies, a forum for advice, an aggregation of the most important EdTech news and online courses for users to equip themselves with digital skills. Backed by a growing community on social media (here, here and here for starters).

It’s a fast-track to digital literacy in the education industry.

GEDB has been pulled together by California residents Jeff Dunn, co-founder of Edudemic, and Katie Dunn, the other Edudemic co-founder, and, across the Atlantic in London, Jimmy Leach, a former habitué of digital government and media circles.

 

 

Addendum:

Favorite educational iPad apps that are also on Android — from the Learning in Hand blog by Tony Vincent

 

What educationally-related affordances might we enjoy from these TV-related developments?

Making TV More Personal — V-Net TV, April 2014

 


 

CONTENTS

  • Content discovery and synchronization
    With access to rich data about their subscribers and what they do, operators can improve recommendation, encourage social TV and exploit second screen synchronization.
  • Recordings get more personal
    One of the next big steps in multiscreen TV is giving people access to their personal recordings on every screen. This is the moment for nPVR to finally make its entrance.
  • Evolving the User Experience
    As service providers go beyond household level and address individuals, the role of log-ins or context will become important. There is a place for social TV and big data.
  • The role of audio in personalization
    Audio has a huge impact on how much we enjoy video services. Now it can help to personalize them. ‘Allegiance’ based audio choices are one possibility.
  • Making advertising more targeted
    Addressable advertising is in its infancy but has a bright future, helping to fund the growth of on-demand and multiscreen viewing.

 

Some excerpts from this report:

Good content should be matched by good content discovery, including recommendations. The current state-of-the-art is defined by Netflix.

Today’s TV experience is worlds apart from the one we were talking about even five years ago. We’ve witnessed exponential growth in services such as HD and have moved from a model in which one screen is watched by many, to many screens (and devices) being available to the individual viewer, what is today called TV Everywhere.  Having multiscreen access to content is driving the demand for a more personalised experience, in which the viewer can expect to see what they want, where, and when. While video on-demand (VOD) has been a great method for delivering compelling content to viewers, it is not always a truly seamless TV-like experience, and traditionally has been limited to the living room. The growing demand for the personalised experience is driving seismic change within the TV industry, and we’ve seen great strides made already, with time-shifted TV and nPVR as just two examples of how we in the industry can deliver content in the ways viewers want to watch. The next step is to move towards more advanced content discovery, effectively creating a personalised channel or playlist for the individual user.

As the tools become available to deliver personalized experiences to consumers, content owners can better create experiences that leverage their content. For example, for sports with multiple points of action, like motor racing, multiple camera angles and audio feeds will allow fans to follow the action that is relevant to their favourite racing team. And for movies, access to additional elements such as director’s commentaries, which have been available on Blu-ray discs for some time, can be made available over broadcast networks.
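From DSC:
The “personalised channel” idea in the excerpts above can be sketched very simply — rank programmes by their overlap with what a viewer has already watched. The titles and tags below are invented, and production recommenders (like Netflix’s) are vastly more sophisticated, but the shape of the idea is this:

```python
# Toy content-discovery sketch: score catalogue items by how many of their
# tags overlap with the viewer's watch history. All data here is invented.

CATALOGUE = {
    "Grand Prix Live":  {"sport", "live", "motor-racing"},
    "Box-Set Drama":    {"drama", "on-demand"},
    "Pit-Lane Stories": {"documentary", "motor-racing"},
}

def recommend(watched_tags, top_n=2):
    """Rank catalogue items by tag overlap with the viewer's history,
    dropping anything with no overlap at all."""
    scored = [(len(tags & watched_tags), title)
              for title, tags in CATALOGUE.items()]
    scored.sort(reverse=True)
    return [title for score, title in scored[:top_n] if score > 0]

print(recommend({"motor-racing", "live"}))
```

Swap the tag sets for viewing histories across every screen in the household and you have the beginnings of the per-person “channel” the report describes.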

 

 

From DSC:
Some words and phrases that come to my mind:

  • Personalization.
  • Data driven.
  • Content discovery and recommendation engines (which could easily relate to educational playlists)
  • Training on demand
  • Learning agents
  • Web-based learner profiles
  • Learning hubs
  • What MOOCs morph into
  • More choice. More control.
  • Virtual tutoring
  • Interactivity and participation
  • Learning preferences
  • Lifelong learning
  • Reinventing oneself
  • Streams of content
  • Learning from The Living [Class] Room

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

 


 

Immigrants from the future — from The Economist
Robots offer a unique insight into what people want from technology. That makes their progress peculiarly fascinating, says Oliver Morton

 


 

The pieces in the Special Report include:

 

 

Expanding Learning Opportunities with Transmedia Practices (part 1) — from worlds-of-learning.com by Laura Fleming

 


Excerpt:

The proliferation of digital and networking technologies enables us to rethink,  restructure, and redefine teaching and learning. Transmedia storytelling takes advantage of the rapid convergence of media and allows teachers and learners to participate in rich virtual (and physical) environments that have been shown to foster students’ real emotional engagement with the process of learning. Transmedia learning applies storytelling techniques across multiple platforms to create immersive educational experiences that enable multiple entry and exit points for learning and teaching. By utilizing constructivist and connectivist precepts in the application of these techniques, we can create pedagogies that are transformative on many levels. Encapsulating these notions in the concept of the Transmedia LearningWorld (TLW) allows educators to combine the exciting affordances of the digital technologies with real-life experiences and truly learner-focused pedagogies to produce profoundly productive and powerful learning experiences.

Expanding Learning Opportunities with Transmedia Practices (part 2) — from worlds-of-learning.com by Laura Fleming

Excerpt:

Transmedia storytelling exemplifies learning in the twenty-first century by merging the concept of storytelling with that of the listener-learner and the resulting emotional engagement with the pervasiveness of media. We might define transmedia learning as: the application of storytelling techniques combined with the use of multiple platforms to create an immersive learning landscape which enables multivarious entry and exit points for learning and teaching.

Expanding Learning Opportunities with Transmedia Practices (part 3) — from worlds-of-learning.com by Laura Fleming

Excerpt:

None of this is easy though. Teachers must be given the support they will need to prepare for the concomitant shift in instruction; they will need help to make sense of the new kinds of content that will make their way into the classroom; they will need encouragement to change their approach to teaching and to learning accordingly; and they will need support in how to effectively weave and integrate technology into their practice. The effective use of digital learning can help school districts meet these educational challenges, including, as we have noted, implementing college and career-ready standards for all students, as outlined in the Common Core. Educators need to come to see technology as intrinsic to their instructional practices. Rather than envisaging a process in which technology is merely embedded into the curriculum, an attitude that so often relegates the technology to an afterthought or just one amongst a range of motivating techniques, it should be about the seamless integration of technology into every aspect of teaching and learning through transmedia practices. Technology tools should be so much a part of learning that the friction is removed because educators and learners do not waste energy thinking about how they work; instead, technology becomes an essential component of all that goes on in the classroom.

Expanding Learning Opportunities with Transmedia Practices (part 4) — from worlds-of-learning.com by Laura Fleming

Excerpt:

The paradox lies in the fact that, at the same time that political and economic forces are pushing the agenda of standardization with some determination, the social-technological environment that we now inhabit is pushing education in the opposite direction. In a real sense, learning is breaking free from the traditional model of education—with school as the central paradigm in that model—simply because the walls of the school can no longer contain all the knowledge and content and desire to learn that is now flowing freely across the ether and intermingling across borders without constraint.

Expanding Learning Opportunities with Transmedia Practices (part 5) — from worlds-of-learning.com by Laura Fleming

Excerpt:

The power of Inanimate Alice lies in the organic connection that is made between the story and the medium along with the innovative use of design and structure. The story unfolds in a game-like world that makes readers direct participants in helping the story to unfold across multiple platforms. With hours of interactive audio-visual experience built in, a gripping mesh of games, puzzles, sights, and sounds embellish and enhance the storyline. The interactivity and narrative are not distinct from one another. In the case of Inanimate Alice, the interactive elements simply cannot be separated from the story. Whether it is controlling Alice’s Baxi (her handheld gaming device) or communicating with Brad (her virtual friend on the Baxi), the embedded technology enhances the narrative and helps it to unfold in manifold directions under the reader’s impulse. It is this that makes Alice a truly unique digital reading experience.

Expanding Learning Opportunities with Transmedia Practices (part 6) — from worlds-of-learning.com by Laura Fleming

Excerpt:

As co-creators of content, our students actively participate in and take control of their own learning. As echoed by the United States Department of Education, the rich, fictional worlds of transmedia tend to create a greater level of social interaction that can inspire children to create their own stories and media products and to share them with each other. The experience of reading is changing. In a transmedia learning experience, reading is now simultaneously an individual act and a social act. Similarly, students can be individual producers but are also able to engage on collaborative sharing, joint creativity, and proliferation of knowledge across the globe.

 

 

Apple iBeacon: signs of new direction — from theage.com.au by Garry Barker

Excerpt (emphasis DSC):

For local initiatives, I consulted two Australian pioneers of iBeacon technology and related educational app development: Geoff Elwood, chief executive and founder of Melbourne-based Specialist Apps (specialistapps.com); and Paul Hamilton, an Apple distinguished educator, and primary school teacher at Matthew Flinders Anglican College at Buderim, Queensland.

Both are highly respected developers of educational technology and iBeacon is the latest of their passions.

Hamilton says he was the first person in the world to use iBeacon technology in a school. At Matthew Flinders he has installed three iBeacons for interactive technology, library and art learning zones. His website, appsbypaulhamilton.com, includes videos showing them at work.

Elwood’s largest installation so far is at Bryanston, an elite coeducational school based in a country mansion on a 160-hectare estate beside the River Stour in Dorset.

Bryanston employs a student management application that uses an online eLockers information system, distributed by the iBeacon network. “With iBeacons, a teacher can use an eLocker application to quickly form a proximity group, press a button on the iPad and transmit a notification to students within the proximity to open an eLocker that is blinking on their screens. The app is both transmitter and receiver, but we have taken the technology beyond just transmitting and receiving to establishing direct relationships between teachers and students, individually or in groups,” Elwood says.

 

 

 

W3C: Web Design & Applications


 

Lynda.com


 

Web Design Groups on LinkedIn.com


 

Relevant hashtags on Twitter:

 

Top Designer Google+ Communities You Should Follow — from hongkiat.com by Charnita Fance

Excerpt:

If you are active on Google+ there are a lot of communities for web designers or UI designers to join. Google+ Communities are like online groups or forums where people can come together to talk about a common hobby, interest or career (such as Design). Only members of a given community can see your posts in their stream. As a designer, this is great because you can share your work and get feedback from thousands of other designers, for free.

In the design communities below, you’ll find lots of great information, freebies, tips and tricks, and personal design work from members. Plus, you can ask for help or offer help to others. Let’s find the perfect Google+ design community for you.

 

Fresh Resources for Designers and Developers — March 2014 — from hongkiat.com

 

Infographic: HTML5 vs. native mobile app development [updated] — from kony.com by Dipesh Mukerji — also see his posting: Developing apps with HTML5: benefits and challenges

 

Responsive e-learning in a Multi-Device World — from elearning-reviews.traineasy.com

Excerpt:

“Day by day, the number of devices, platforms, and browsers that need to work with your site grows. Responsive web design represents a fundamental shift in how we’ll build websites for the decade to come,” says Jeffrey Veen, CEO & Cofounder of Typekit.

 

New ebook all about web design for Google Glass — from glassalmanac.com by Christian Bullock

Excerpt:

Well that was fast.

As someone who didn’t really know too many people were currently all about ensuring a site’s web design was fit for Google Glass browsing, there’s now an eBook by author Joe Casabona (known for his blog People Reacting to Glass) that’s a guide to web design for Glass.

Pretty cool idea and something I could see as being crucial when Glass launches publicly. I’m sure Glass adoption won’t be as high as tablet or smartphone adoption rates, but as we’re seeing now, it’s necessary for web designers to think of other ways people are interacting with websites aside from normal desktop or laptop computers.

 

Webmonkey.com

 

WebDesignerDepot.com

 

WebDesign.Tutsplus.com

 

Addendums on 4/1/14:

 

Tapping M2M: The Internet of Things — from zdnet.com

Excerpt:

The rise of objects that connect themselves to the internet — from cars to heart monitors to stoplights — is unleashing a wave of new possibilities for data gathering, predictive analytics, and IT automation. We discuss how to tap these nascent solutions.

 


 

 

 

Practical ideas for using augmented reality in contextual mobile learning — from blog.commlabindia.com by Aruna Vayuvegula

Excerpt:

Using Augmented Reality for contextual mLearning sounds too futuristic – a phenomenon that is being experimented with in universities and research centers across the world. Wikipedia defines augmented reality (AR) as a live view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. It is rather hard for a non-technical person to perceive the implication of this definition in a learning situation.

However, when you come across a report that says

  • 864 million high-end cell phones could be AR enabled in 2014,
  • 103 million automobiles will have AR technology by 2020, (Ref: Semico)

you can’t ignore Augmented Reality (AR). You need to stop to understand what it all means. That’s when I came across an article by Jason Haag, who spoke at DevLearn 13 on Augmented Reality in mobile learning. Published on the Advanced Distributed Learning website, it lists some cool examples, in the form of videos, of AR in action. The video from the link given below is one of them; it truly helped me conceive the idea of AR in a learning context.

 

AR Music App Enchantium by DAQRI — from realareal.com by Kiran Voleti

Excerpt:

In the 21st century classroom: Students can see the shape of knowledge, Students can hear the shape of knowledge, students can TOUCH the shape of knowledge.

 

Zientia: Changing the Way We Learn with Augmented Reality — from realareal.com by Kiran Voleti

 


 


 

Enchantium is a 4D platform that uses augmented reality — from realareal.com by Kiran Voleti

Excerpt:

Discover a New Dimension of Play™ in a magical world of curated, kid-safe content where play sets can come to life and toys can talk, interact and learn new things.

Toys Become Enchanted

So much more than an app, Enchantium is a 4D platform that uses augmented reality and cutting edge technology to connect games and toys with interactive experiences.

 
© 2025 | Daniel Christian