Can you apply Google’s 20% time in the classroom? — from guardian.co.uk/teacher-network by Stuart Spendlow
Google offers its engineers 20% of their timetable to work on their own projects. Keen to see if it could work for education, Stuart Spendlow introduced the idea to his own classroom.

Pupils cooking

“20% time developed a class of intrinsically motivated learners who strive to make themselves proud without any fear whatsoever of making mistakes,” says Stuart Spendlow. Photograph: Paul Doyle/Alamy

McGraw-Hill & Kno offer a peek into the future of textbooks: They’re dynamic, vocal, adaptive & bring stats to studying — from techcrunch.com by Rip Empson

Excerpt:

The suite leverages adaptive learning technology — one of the hottest topics in education this past year — which, simply put, seeks to personalize the educational experience by collecting data on student comprehension (knowledge, skill and confidence), employing algorithms to create customized study plans/paths based on that data. The goal being to keep students engaged (and improving) by helping them to identify and focus on areas where they’re struggling.
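
To make the excerpt above a bit more concrete, here is a minimal, hypothetical sketch (in Swift) of the kind of logic an adaptive engine might use to turn comprehension data into a prioritized study path. The type and function names are my own illustrative assumptions, not McGraw-Hill’s, Kno’s, or Knewton’s actual implementations.

```swift
import Foundation

// Hypothetical sketch of an adaptive study-path selector; not any vendor's real algorithm.
struct ComprehensionRecord {
    let topic: String
    let knowledge: Double    // estimated mastery, 0.0...1.0
    let confidence: Double   // self-reported confidence, 0.0...1.0
}

/// Rank topics so the weakest (and least confident) areas are studied first.
func studyPath(from records: [ComprehensionRecord]) -> [String] {
    records
        .sorted { a, b in
            // Weight low mastery more heavily than low confidence.
            (0.7 * a.knowledge + 0.3 * a.confidence) < (0.7 * b.knowledge + 0.3 * b.confidence)
        }
        .map { $0.topic }
}

let records = [
    ComprehensionRecord(topic: "Fractions", knowledge: 0.45, confidence: 0.60),
    ComprehensionRecord(topic: "Decimals",  knowledge: 0.80, confidence: 0.85),
    ComprehensionRecord(topic: "Ratios",    knowledge: 0.30, confidence: 0.40),
]

print(studyPath(from: records))   // ["Ratios", "Fractions", "Decimals"]
```

In a real product, the weighting and the mastery estimates would of course come from far richer models and data than this toy example.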

 

Prediction from DSC:
I’d like to take these developments one step further…

These developments will find their way into our living rooms via second-screen devices and interactions with Smart/Connected TVs. Highly sophisticated, back-end, behind-the-scenes technologies will continue to develop (think Next Gen Knewton or IBM’s Watson), aiding in the fulfillment of one’s learning objectives. Personalized digital playlists will be presented, featuring multimedia-based content and offering more choice, more control, interactivity, social learning, and more. They will meet us where we are (i.e., in our Zone of Proximal Development) and encourage us to keep learning via game-like interfaces, while trying not to overwhelm or discourage us. Live persons will be instantly available to assist, or will walk us through the steps; or perhaps we’ll go through these types of exercises in virtual cohorts that come together quickly and then, once the badge or exercise is finished, disband.

 

 

8 things to look for in today’s classroom — by George Couros

Excerpt:

As I think that leaders should be able to describe what they are looking for in schools I have thought of eight things that I really want to see in today’s classroom.  I really believe that classrooms need to be learner focused. This is not simply that students are creating but that they are also having opportunities to follow their interests and explore passions.  The teacher should embody learning as well.

 

From DSC:
If we can tap into students’ passions/hearts, I believe we’ll see enormous amounts of creative energy pour forth! Once again I find myself adding the same tags/keywords to this posting: more choice, more control.

 

From DSC:
The other day, I mentioned how important it will be for institutions of higher education to increase the priority of experimentation. But, for a variety of reasons, I believe this is true for the K-12 world as well. Especially with the kindergarten/early elementary classroom in mind, I created the graphic below. Clicking on it will give you another example of the kind of experimentation that I’m talking about — whether that be in K-12 or in higher ed.

 

DanielChristianJan2013-ExperimentsInCustomizedLearningSpaces

 

From DSC:
I’m trying to address the students who are more easily distracted and who, due to how their minds process information, have a harder time focusing on the task at hand. In fact, at times, all of the external stimuli can be overwhelming. How can we provide a learning environment that’s more under the students’ control? That is, how can we provide “volume knobs” for their auditory and visual channels?

Along these lines, I’m told that some theaters offer sensory-friendly film showings — i.e., with different light and sound settings than are typically used.

Also see — with thanks going out to Ori Inbar (@comogard) for these:

 

A relevant addendum on 1/10/13:

TryBeingMe-Jan2013

 

“Mom! Check out what I did at school today!”

If you’re a parent, don’t you love to hear the excitement in your son’s or daughter’s voice when they bring home something from school that really piqued their interest? Their passions?

I woke up last night with several ideas and thoughts on how technology could help students become — and stay — engaged, while passing over more control and choice to the students in order for them to pursue their own interests and passions. The idea would enable students to efficiently gain some exposure to a variety of things to see if those things were interesting to them — perhaps opening a way for a future internship or, eventually, a career.

The device I pictured in my mind was the sort of thing I saw a while back from Double Robotics and from Suitable Technologies:

doublerobotics.com -- wheels for your iPad

 

 

Remote presence system called Beam -- from Suitable Technologies - September 2012

 

The thoughts centered on implementing a growing network of such remote-controlled, mobile, videoconferencing-based devices, hooked up to voice-translation engines. Students could control these devices to pursue things they wanted to know more about, such as:

  • Touring the Louvre in Paris
  • Being backstage at a Broadway musical or checking out a live performance of Macbeth
  • Watching a filming of a National Geographic Special in the Fiji Islands
  • Attending an IEEE International Conference in Taiwan
  • Attending an Educause Conference or a Sloan C event to get further knowledge about how to maximize your time studying online or within a hybrid environment
  • Touring The Exploratorium in San Francisco
  • Touring the Museum of Science & Industry in Chicago
  • Being a fly on the wall during a Senate hearing/debate
  • Seeing how changes are made in the assembly lines at a Ford plant
  • Or perhaps, when a student wheels their device into a particular area, such as the front row of a conference, the signal could automatically switch to the main speaker/event (keynote speakers, panels, etc.) via machine-to-machine communications
  • Inviting guest speakers into a class: pastors, authors, poets, composers, etc.
  • Working with local/virtual teams on how to heighten public awareness about a project that deals with sustainability
  • Virtually heading to another country to immerse themselves in that country’s language — and, in return, helping the students there learn their visitors’ native languages

For accountability — as well as for setting aside intentional time to process the information — students would update their own blogs about what they experienced, heard, and saw. They would need to include at least one image along with the text they write about their experience, or perhaps a brief, edited piece of digital video or audio capturing statements that really resonated with them or that raised further questions. Such postings would be kept private by default, but if the teacher and the student felt that a posting could or should be made public, a quick setting could be checked to publish it for others to see/experience.

Real world. Engaging. Passing over more choice and control to the students so that they can pursue what they are passionate about.

 

 

 

 


From DSC, some examples:

  • Unbundling and Unmooring: Technology and the Higher Ed Tsunami — from educause.org by Audrey Watters
  • Unbundling Higher Education | From the Bell Tower –– from lj.libraryjournal.com by Steven Bell
    Excerpt (emphasis DSC):
    Recent events in higher education suggest a new trend — earning degrees by the course from multiple providers. Are we looking at the iTunes model of unbundled higher ed? Call it alt-HE.
  • Napster, Udacity, and the Academy — from Clay Shirky
    Excerpt:
    Once you see this pattern—a new story rearranging people’s sense of the possible, with the incumbents the last to know—you see it everywhere. First, the people running the old system don’t notice the change. When they do, they assume it’s minor. Then that it’s a niche. Then a fad. And by the time they understand that the world has actually changed, they’ve squandered most of the time they had to adapt.
    It’s been interesting watching this unfold in music, books, newspapers, TV, but nothing has ever been as interesting to me as watching it happen in my own backyard. Higher education is now being disrupted; our MP3 is the massive open online course (or MOOC), and our Napster is Udacity, the education startup.

    But who faces that choice? Are we to imagine an 18 year old who can set aside $250K and 4 years, but who would have a hard time choosing between a residential college and a series of MOOCs? Elite high school students will not be abandoning elite colleges any time soon; the issue isn’t what education of “the very best sort” looks like, but what the whole system looks like.

From DSC:
I understand that Mr. George Lucas intends to donate the $4.05 billion from the sale of Lucasfilm to education.

Here’s a question/idea that I’d like to put forth to Mr. Lucas (or to the United States Department of Education, or to another interested/committed party):

Would you consider using the $4+ billion gift to build an “Online Learning Dream Team”?

Daniel Christian -- The Online Learning Dream Team - as of November 2012

 Original image credit (before purchased/edited by DSC)
yobro10 / 123RF Stock Photo

 

 

From DSC:
What do you think? What other “players” — technologies, vendors, skillsets, etc. — should be on this team?

  • Perhaps videography?
  • Online tutoring?
  • Student academic services?
  • Animation?
  • Digital photography?

 


Some of the powerful words that describe what online learning, if done well, enables:


Choice:

  • Of format — digital audio, digital video, text, graphics, animations, games, role playing, etc.; I can look at content from multiple angles and in multiple ways
  • Of assignment — which one works best for me? Which one interests me the most? Or if I don’t really get the assignment in one way, I can reach the learning objective in another way.

Control:

  • Of when and where I bop into my course and participate in it; it may be a brief 5-minute posting of a great and relevant article I ran across, or it may be a 3-4 hour stint
  • Of playing media — pausing, fast forwarding, rewinding, slowing down or speeding up digital video and/or digital audio

Participation:

  • I can contribute content that I created — in a variety of formats; content that I can spend some time on creating
  • I can take my time to engage in thoughtful, reflective discussions; especially helpful to those of us who don’t think very fast on our feet and need time to think about a topic and develop our responses

Communication and collaboration:

  • Between students
  • Between faculty and students

In my mind, those are very powerful words in many people’s learning experiences.

 

 

 

IBM’s Watson expands commercial applications, aims to go mobile  — from singularityhub.com by Jason Dorrier


From DSC:
This relates to what I was trying to get at with the posting on mobile learning. I would add the word “Education” to the list of industries that the technologies encapsulated in Watson will impact in the future. Combine this with the convergence that’s enabling/building the Learning from the Living [Class] Room environment, and you have one heck of an individualized, data-driven learning ecosystem that’s available 24x7x365 — throughout your lifetime!

 

IBM Watson-Introduction and Future Applications

 

 


Also relevant here are some visions/graphics I created in 2012 and in 2008:


 


The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 


Why couldn't these channels represent online-based courses/MOOCs? Daniel Christian - 10-17-12

 

 


The power and potential of mobile learning [Christian]

The power and possibility of mobile learning, as cross-posted from evoLLLution.com (for LifeLong Learning)
Daniel S. Christian | October 2012

As I sat down to write about mobile learning, I struggled with narrowing down the scope of what I was going to attempt to address.  Which angle(s) should I take?

And then I reflected on my morning so far. I helped my daughter wake up to the sounds of a song coming from my iPhone.  She opened one eye, then the other, and soon, she was dancing around the room.  Success!

I then proceeded to listen to my iPhone as I drove my car into work – it gave me the energy I needed to start my internal engines.  (By the way, the idea of automobile-based technologies continues to grow, opening up further possibilities; but that’s a topic for another day.)

Then I caught up with a friend for coffee and he reached for his iPad.  He showed me an app for the local Art Prize competition that’s currently going on in our area.  He mentioned that if a person wasn’t in the immediate vicinity of the Art Prize event, that person could not vote on any of the pieces.  However, if the GPS-based coordinates were within the approved range, a person could use that app to:

  • Vote on which pieces of artwork they liked
  • Find out where the artwork was located (at numerous locations on a map)
  • Learn more about the pieces themselves – what they were made of, the artists’ thoughts on why they created what they created, etc.

So by the time I pulled up to my PC at 9:00am, I had already been positively impacted by mobile technology in several ways.  The common words and phrases that are often used to describe mobile learning and mobile technologies rang true and popped back into my mind: ubiquitous, always on, always connected, 24x7x365, convenient, etc.

As I enjoy peering into the future, as well as pulse-checking a variety of trends, I would like to ask the following questions concerning the potential power and possibility of mobile learning, as well as the relevant, emerging set of technologies that enable it:

  • Q:  What happens when the technologies behind IBM’s Watson and Apple’s Siri get perfected and integrated into learning-based products and services? What types of devices will be able to tap into those products and services?
    A:  IBM’s Watson beat the best human players in a game of Jeopardy and is now being used as a data-analytics engine for the medical community, wading through terabytes of patient healthcare information and research data in order to determine how best to treat illnesses.  So such technologies hold some serious promise in terms of at least addressing the lower to mid levels of Bloom’s Taxonomy. For 2012, tablets, smartphones, laptops, and notebooks would be the likely candidates for accessing these types of products/services. But we are just at the embryonic stage of the Smart/Connected TV, and I believe that such a device will become an important and commonly used mechanism for accessing such cloud-based applications and services in the future.
  • Q: Will students of all ages have access to their own virtual tutors, so to speak? From any device, at any time?
    A: Yes; this is highly likely, especially given the current (and increasing) levels of investment being made in educational technology-related areas.  It’s very feasible to think that Apple, IBM, Google, Microsoft, McGraw-Hill, Pearson, or some other organization with deep pockets will develop such virtual tutoring products and services.
  • Q:  What will happen when a virtual tutor is unable to resolve or address the student’s issue to the student’s satisfaction?  Will the student be able to instantly access a human tutor – with the option of keeping the existing work/issue/problem visible to the human tutor?
    A:  Yes, again…highly likely. This will open up new opportunities for faculty members, teachers, instructors, trainers, and tutors.  Getting experience in teaching online is a solid career move at this point in history.  (Also relevant will be those people creating applications and technically supporting them. An understanding of web-based videoconferencing and collaboration tools will be helpful here.)
  • Q:  How will the convergence of the television, the computer, and the telephone impact what can be done with learning-based applications and experiences?
    A:  We are just beginning to see the ecosystems changing and adapting to deal with the convergence of the television, the computer, and the telephone.  The Smart/Connected TV – along with “second screen”-based applications – is being driven by innovations in the entertainment, marketing, and advertising industries.  But it’s not a stretch to think that education-related content will be right behind such innovative solutions.
  • Q:  How will the multiple screens phenomenon affect how content can be consumed and discussed?
    A:   I created a couple of graphics along these lines that attempt to capture a potential vision here. I call it “Learning from the Living [Class] Room,” and it continues to develop in front of our eyes. We could be watching a “lecture” on a big screen and simultaneously interacting with people throughout the world on our smaller screens.

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 


Learning from the living room -- a component of our future learning ecosystems -- by Daniel S. Christian, June 2012

  • Q:  How will mobile conferencing affect what can be achieved?
    A:  Mobile-based conferencing will provide 24x7x365 opportunities for learning – and communications – to occur.  Such technologies have applications even in more traditional face-to-face classroom settings.  For a few possibilities here, see how mobile technologies are used in this vision by Intel  as well as in this vision by Corning.

 

Great vision from Intel!

Great vision from Corning!

  • Q:  What sort of creative doors are opened when a story can be told across a variety of “channels” and means?  And what sorts of skill sets do we need to start building – or continue to build – in order to help students find work in these emerging fields?
    A:  Create a Google Alert on transmedia and/or transmedia-based storytelling and you will get a sense of what’s happening in this arena.  There will be huge opportunity for creative, innovative folks out there!  Being versed in new media would be a solid idea if one hopes to pursue careers in these burgeoning fields. That is, building at least a rudimentary skill set of how to creatively use text, graphics, animations, digital audio, digital video, and interactive programming to deliver and obtain information would be very beneficial here.
  • Q:  What types of analytics will be tracked and fed into one’s cloud-based learner profile?
    A:  It will depend upon where we want such technologies to take us.  However, think about the applications and implications of this approach if a web-based profile were used to:

    • Feed a workplace-based exchange – matching buyers and sellers of services
    • Inform a learning agent on which topics/disciplines that person wants to learn more about – helping that person obtain a highly-personalized, customized, relevant, engaging, productive learning experience with a solid ROI
    • Inform a cloud-based app on what prior knowledge one has and where to begin the “next lesson” (a rough sketch of such a profile appears just after this Q&A list)
  • Q:  Will courses become apps?  Will what we know of TV programs become apps and, if so, how will that affect what each of us can contribute to our own communities of practice?
    A:  Just as the web has enabled individuals to deliver their own podcasts, information, etc. – essentially becoming their own radio stations to a degree – the ecosystems being built up around the Smart/Connected TVs could help each of us become our own TV station. The potential is huge in terms of further developing and sharing knowledge within communities of practice.
  • Q:  Will educational gaming dovetail nicely with mobile learning and emerging technologies such as augmented reality, 3D, and connected television?
    A:  Yes; the synergies and foundational pieces are already coming into place.
  • Q:  How will the bring-your-own-device (BYOD) situation affect what can occur in the face-to-face classroom?
    A:  Students will be able to seamlessly and efficiently contribute content to the discussions in a face-to-face classroom without breaking the flow of the class.  An example graphic can be found here.

 

A piece of the Next Generation Smart Classroom -- Daniel Christian -- June 2012
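
Returning to the learner-profile question above, here is a rough, hypothetical sketch (in Swift) of what such a cloud-based profile might look like and how an app could use it to pick the “next lesson.” The field and function names are illustrative assumptions only, not any vendor’s actual schema.

```swift
import Foundation

// Hypothetical shape of a cloud-based learner profile; field names are illustrative only.
struct LearnerProfile: Codable {
    var learnerID: UUID
    var interests: [String]              // topics the learner wants to pursue
    var masteredObjectives: [String]     // prior knowledge, by learning objective
    var inProgress: [String: Double]     // objective -> estimated mastery (0.0...1.0)
    var servicesOffered: [String]        // skills that could feed a workplace-based exchange
}

/// Pick the next lesson: the in-progress objective with the lowest estimated mastery.
func nextLesson(for profile: LearnerProfile) -> String? {
    profile.inProgress.min { $0.value < $1.value }?.key
}

let profile = LearnerProfile(
    learnerID: UUID(),
    interests: ["sustainability", "filmmaking"],
    masteredObjectives: ["fractions"],
    inProgress: ["decimals": 0.8, "ratios": 0.3],
    servicesOffered: ["video editing"]
)

print(nextLesson(for: profile) ?? "all caught up")   // "ratios"
```

A real profile would obviously carry far more signal (assessment history, time on task, preferences), but even a small structure like this is enough to drive the kinds of scenarios listed above.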

 

The topics and potential routes that additional articles could take are almost endless.  But I think it’s safe to say that mobile, lifelong learning is here to stay.

Listed below are some recent articles and resources if you are interested in pursuing the topic of mobile learning.  I also have a section on my Learning Ecosystems blog dedicated to mobile learning.

If you are intrigued by what I’m calling Learning from the Living [Class] Room, you might be interested in these postings.

 

Some recent articles/resources regarding mobile learning:

 

Addendum:

 


Apple TV and the transformation of web apps into tablet and TV dual screen apps — from brightcove.com by Jeremy Allaire


 

Excerpts:

Importantly, designers and developers need to shed the concept that “TVs” are for rendering video, and instead think about “TVs” as large monitors on which they can render applications, content and interactivity that is supported by a touch-based tablet application.

The key concept here is that this pervasive adoption of TV monitors is the tip of the spear in creating a social computing surface in the real world.

Specifically, Apple has provided the backbone for dual screen apps, enabling:

  • Any iOS device (and OSX Mountain Lion-enabled PCs) to broadcast its screen onto a TV. Think of this as essentially a wireless HDMI output to a TV. If you haven’t played with AirPlay mirroring features in iOS and Apple TV, give it a spin, it’s a really exciting development.
  • A set of APIs and an event model for enabling applications to become “dual screen aware” (e.g. to know when a device has a TV screen it can connect to, and to handle rendering information, data and content onto both the touch screen and the TV screen).


[Jeremy listed several applications for these concepts:  Buying a house, buying a car, doctor’s office, kids edutainment, the classroom, retail electronics store, consuming news, consuming video, sales reporting, board games.]
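
For developers wondering what “dual screen aware” can look like in practice, here is a rough sketch using UIKit’s external-screen notifications, which is one way an iOS app can detect a TV screen (including an AirPlay-connected Apple TV) and render separate content onto it. Modern Swift naming is used rather than the 2012-era Objective-C constants, and ExternalViewController is a hypothetical placeholder for whatever content belongs on the big screen.

```swift
import UIKit

// Hypothetical placeholder for the TV-sized content.
final class ExternalViewController: UIViewController {}

// Sketch of a "dual screen aware" app: watch for an external display and put a
// separate window of content on it, while the touch screen keeps its own UI.
final class DualScreenCoordinator {
    private var externalWindow: UIWindow?
    private var observers: [NSObjectProtocol] = []

    func start() {
        // Handle a TV screen that is already connected when the app launches.
        if UIScreen.screens.count > 1 {
            attach(to: UIScreen.screens[1])
        }
        // React when a TV screen connects or disconnects later on.
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification, object: nil, queue: .main
        ) { [weak self] note in
            if let screen = note.object as? UIScreen {
                self?.attach(to: screen)
            }
        })
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification, object: nil, queue: .main
        ) { [weak self] _ in
            self?.externalWindow = nil   // tear down the TV window when the screen goes away
        })
    }

    private func attach(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen                                // target the external display
        window.rootViewController = ExternalViewController()  // render TV-sized content here
        window.isHidden = false
        externalWindow = window
    }
}
```

The touch-screen side of the app keeps its normal window; the coordinator simply adds and removes a second UIWindow as the TV comes and goes.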


Also see:

 
From DSC:
Graphically speaking — and approaching this from an educational/learning ecosystems standpoint — I call this “Learning from the Living [Class] Room.”

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

 

Learning from the living room -- a component of our future learning ecosystems -- by Daniel S. Christian, June 2012

 

 

Related item:

The future of education, according to McGraw-Hill — from edcetera.com by Kirsten Winkler

Excerpt:

Certainly, there needed to be a set of skills defined to call a student college-ready, but the path of how to achieve these fixed goals should be an individual one that allows students to go after their interests and respects their individual talents. Students shouldn’t be in the same grade when they’re not on the same level. What we’re talking about is a flexible system that would depend on when a student is ready rather than when a curriculum defines they’re ready.

Such a more flexible model is inspired by some of the things that are happening in higher education already. This is a great example of how innovation taken from higher ed is making its way into high schools.

Vinet Madan used a nice metaphor when he explained this approach in our interview. It’s like the decision to take either the scenic route by the coast or the highway to get to your goal. Ultimately, both routes will take you to the destination, it’s just a personal preference.

Also see:

© 2024 | Daniel Christian