Reflections on “Inside Amazon’s artificial intelligence flywheel” [Levy]

Inside Amazon’s artificial intelligence flywheel — from wired.com by Steven Levy
How deep learning came to power Alexa, Amazon Web Services, and nearly every other division of the company.

Excerpt (emphasis DSC):

Amazon loves to use the word flywheel to describe how various parts of its massive business work as a single perpetual motion machine. It now has a powerful AI flywheel, where machine-learning innovations in one part of the company fuel the efforts of other teams, who in turn can build products or offer services to affect other groups, or even the company at large. Offering its machine-learning platforms to outsiders as a paid service makes the effort itself profitable—and in certain cases scoops up yet more data to level up the technology even more.

It took a lot of six-pagers to transform Amazon from a deep-learning wannabe into a formidable power. The results of this transformation can be seen throughout the company—including in a recommendations system that now runs on a totally new machine-learning infrastructure. Amazon is smarter in suggesting what you should read next, what items you should add to your shopping list, and what movie you might want to watch tonight. And this year Thirumalai started a new job, heading Amazon search, where he intends to use deep learning in every aspect of the service.

“If you asked me seven or eight years ago how big a force Amazon was in AI, I would have said, ‘They aren’t,’” says Pedro Domingos, a top computer science professor at the University of Washington. “But they have really come on aggressively. Now they are becoming a force.”

Maybe the force.

 

 

From DSC:
When will we begin to see more mainstream recommendation engines for learning-based materials? With the demand for people to reinvent themselves, such a next generation learning platform can’t come soon enough!

  • Turning over control to learners to create/enhance their own web-based learner profiles; and allowing people to say who can access their learning profiles.
  • AI-based recommendation engines to help people identify curated, effective digital playlists for what they want to learn about.
  • Voice-driven interfaces.
  • Matching employees to employers.
  • Matching one’s learning preferences (not styles) with the content being presented as one piece of a personalized learning experience.
  • From cradle to grave. Lifelong learning.
  • Multimedia-based, interactive content.
  • Asynchronously and synchronously connecting with others learning about the same content.
  • Online-based tutoring/assistance; remote assistance.
  • Reinvent. Staying relevant. Surviving.
  • Competency-based learning.
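To make the recommendation-engine bullet above a bit more concrete: here is a minimal, purely illustrative sketch of how such an engine might match a learner profile against curated playlists. The playlist data, tags, and overlap scoring are all assumptions for illustration — not the design of any existing platform.

```python
# Hypothetical sketch: a tiny content-based recommender that ranks curated
# playlists by how well their tags overlap a learner's stated interests.

def score(playlist_tags, profile_interests):
    """Jaccard similarity between a playlist's tags and a learner's interests."""
    tags, interests = set(playlist_tags), set(profile_interests)
    if not tags or not interests:
        return 0.0
    return len(tags & interests) / len(tags | interests)

def recommend(playlists, profile, top_n=3):
    """Return the titles of the top-n playlists ranked by relevance."""
    ranked = sorted(
        playlists,
        key=lambda p: score(p["tags"], profile["interests"]),
        reverse=True,
    )
    return [p["title"] for p in ranked[:top_n]]

playlists = [
    {"title": "Intro to Data Analysis", "tags": ["python", "statistics", "data"]},
    {"title": "Digital Storytelling", "tags": ["video", "writing", "media"]},
    {"title": "Machine Learning Basics", "tags": ["python", "ml", "statistics"]},
]
learner = {"interests": ["python", "statistics"]}

print(recommend(playlists, learner, top_n=2))
# → ['Intro to Data Analysis', 'Machine Learning Basics']
```

A real next-generation learning platform would of course layer in the other bullets — learner-controlled profiles, voice interfaces, employer matching — but the core ranking step could start as simply as this.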

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV


We’re about to embark on a period in American history where career reinvention will be critical, perhaps more so than it’s ever been before. In the next decade, as many as 50 million American workers—a third of the total—will need to change careers, according to McKinsey Global Institute. Automation, in the form of AI (artificial intelligence) and RPA (robotic process automation), is the primary driver. McKinsey observes: “There are few precedents in which societies have successfully retrained such large numbers of people.”

Bill Triant and Ryan Craig


Also relevant/see:

Online education’s expansion continues in higher ed with a focus on tech skills — from educationdive.com by James Paterson

Dive Brief:

  • Online learning continues to expand in higher ed with the addition of several online master’s degrees and a new for-profit college that offers a hybrid of vocational training and liberal arts curriculum online.
  • Inside Higher Ed reported the nonprofit learning provider edX is offering nine master’s degrees through five U.S. universities — the Georgia Institute of Technology, the University of Texas at Austin, Indiana University, Arizona State University and the University of California, San Diego. The programs include cybersecurity, data science, analytics, computer science and marketing, and they cost from around $10,000 to $22,000. Most offer stackable certificates, helping students who change their educational trajectory.
  • Former Harvard University Dean of Social Science Stephen Kosslyn, meanwhile, will open Foundry College in January. The for-profit, two-year program targets adult learners who want to upskill, and it includes training in soft skills such as critical thinking and problem solving. Students will pay about $1,000 per course, though the college is waiving tuition for its first cohort.


Production Values for Audio Podcasts, Part I — from learningsolutionsmag.com by Jeff D’Anza

Excerpts:

There are a number of production values that narrative podcasters find effective for grabbing listener attention and keeping their audiences engaged in the story; you could think of these as technical elements of professional audio quality. They range from script-writing techniques that improve content to methods for audio recording and editing. The most successful professional podcasters use these elements to create immersion in the audio environment and to eliminate audio distraction. The result is the creation of a kind of audio theater. Here are four basic practices to embrace while creating your narrative podcasts.

  1. Set the scene first
  2. Hook the audience
  3. Vary character voices
  4. Talk like real people

 

Production Values for Audio Podcasts, Part II — from learningsolutionsmag.com by Jeff D’Anza

Excerpts:

In this article, I will continue with more production tricks that can substantially increase the quality of your narrative podcasts.

Use music to reset scenes
It’s not revolutionary to suggest that learners tend to have short attention spans, and the case is no different when it comes to narrative podcasts. Every so often you need to reset your learners’ brains in order to keep their attention level high.

One excellent way to accomplish this is through the use of musical breaks. Music breaks can function as a type of auditory palate cleanser, allowing the brain a few moments to stop focusing on information that is being presented and prepare the learner to be ready for the next section of content.

Also:

  • Host/producer structure
  • Get out of the studio
  • Don’t fear insignificant details

 


From DSC:

Seems to me there’s some wisdom here for instructional designers as well as professors, teachers, and trainers who are creating learning/training related content and/or who are flipping their classrooms.


The Mobile AR Leaders of 2018 — from next.reality.news

Excerpt:

This time last year, we were getting our first taste of what mobile app developers could do in augmented reality with Apple’s ARKit, and most people had never heard of Animojis. Google’s AR platform was still Tango. Snapchat had just introduced its World Lens AR experiences. Most mobile AR experiences existing in the wild were marker-based offerings from the likes of Blippar and Zappar, or generic Pokémon GO knock-offs.

In last year’s NR50, published before the introduction of ARKit, only two of the top 10 professionals worked directly with mobile AR, and Apple CEO Tim Cook was ranked number 26, based primarily on his forward-looking statements about AR.

This year, Cook comes in at number one, with five others categorized under mobile AR in the overall top 10 of the NR30.

What a difference a year makes.

In just 12 months, we’ve seen mobile AR grow at a breakneck pace. Since Apple launched its AR toolkit, users have downloaded more than 13 million ARKit apps from the App Store, not including existing apps updated with ARKit capabilities. Apple has already updated its platform and will introduce even more new features to the public with the release of ARKit 2.0 this fall. Last year’s iPhone X also introduced a depth-sensing camera and AR Animojis that captured the imaginations of its users.

 

 

The Weather Channel forecasts more augmented reality for its live broadcasts with Unreal Engine — from next.reality.news by Tommy Palladino

Excerpt:

Augmented reality made its live broadcast debut for The Weather Channel in 2015. The technology helps on-air talent at the network to explain the science behind weather phenomena and tell more immersive stories. Powered by Unreal Engine, The Future Group’s Frontier platform will enable The Weather Channel to be able to show even more realistic AR content, such as accurately rendered storms and detailed cityscapes, all in real time.


From DSC:
Imagine this type of thing in online-based learning, MOOCs, and/or even in blended learning environments (i.e., in situations where learning materials are designed/created by teams of specialists). If that were the case, who needs to be trained to create these pieces? Will students be creating these types of pieces in the future? Hmmm….

 

 

Winners announced of the 2018 Journalism 360 Challenge — from vrfocus.com
The question of “How might we experiment with immersive storytelling to advance the field of journalism?” looks to be answered by 11 projects.

Excerpt:

The eleven winners of a contest held by the Google News Initiative, Knight Foundation and Online News Association were announced on 9/11/18. The 2018 Journalism 360 Challenge asked people the question “How might we experiment with immersive storytelling to advance the field of journalism?” and generated over 400 responses.


Addendum:

Educause Explores Future of Extended Reality on Campus — from campustechnology.com by Dian Schaffhauser

Among the findings:

  • VR makes people feel like they’re really there. The “intellectual and physiological reactions” to constructs and events in VR are the same — “and sometimes identical” — to a person’s reactions in the real world;
  • 3D technologies facilitate active and experiential learning. AR, for example, lets users interact with an object in ways that aren’t possible in the physical world — such as seeing through surfaces or viewing data about underlying objects. And with 3D printing, learners can create “physical objects that might otherwise exist only [in] simulations”; and
  • Simulations allow for scaling up of “high-touch, high-cost learning experiences.” Students may be able to go through virtual lab activities, for instance, even when a physical lab isn’t available.

Common challenges included implementation learning curves, instructional design, data storage of 3D images and effective cross-departmental collaboration.

“One significant result from this research is that it shows that these extended reality technologies are applicable across a wide spectrum of academic disciplines,” said Malcolm Brown, director of learning initiatives at Educause, in a statement. “In addition to the scientific disciplines, students in the humanities, for example, can re-construct cities and structures that no longer exist. I think this study will go a long way in encouraging faculty, instructional designers and educational technologists across higher education to further experiment with these technologies to vivify learning experiences in nearly all courses of study.”


San Diego’s Nanome Inc. releases collaborative VR-STEM software for free — from vrscout.com by Becca Loux

Excerpt:

The first collaborative VR molecular modeling application was released August 29 to encourage hands-on chemistry experimentation.

The open-source tool is free for download now on Oculus and Steam.

Nanome Inc., the San Diego-based start-up that built the intuitive application, comprises UCSD professors and researchers, web developers and top-level pharmaceutical executives.

 

“With our tool, anyone can reach out and experience science at the nanoscale as if it is right in front of them. At Nanome, we are bringing the craftsmanship and natural intuition from interacting with these nanoscale structures at room scale to everyone,” McCloskey said.

 


 

 

10 ways VR will change life in the near future — from forbes.com

Excerpts:

  1. Virtual shops
  2. Real estate
  3. Dangerous jobs
  4. Health care industry
  5. Training to create VR content
  6. Education
  7. Emergency response
  8. Distraction simulation
  9. New hire training
  10. Exercise

 

From DSC:
While VR will have its place — especially for times when you need to completely immerse yourself in another environment — I think AR and MR will be much larger and have a greater variety of applications. For example, I could see how instructions for putting something together could, in the future, use AR and/or MR to assist with that process. The system could highlight the next part that I’m looking for and then highlight the corresponding spot where it goes — and, if requested, show me a clip on how it fits into what I’m trying to put together.

 

How MR turns firstline workers into change agents — from virtualrealitypop.com by Charlie Fink
Mixed Reality, a new dimension of work — from Microsoft and Harvard Business Review

Excerpts:

“[We equip] workers with mixed-reality solutions that enable remote assistance, spatial planning, environmentally contextual data, and much more,” Bardeen told me. With the HoloLens, Firstline Workers conduct their usual, day-to-day activities with the added benefit of a heads-up, hands-free display that gives them immediate access to valuable, contextual information. Microsoft says speech services like Cortana will be critical to control, along with gesture, according to the unique needs of each situation.

 

Expect new worker roles. What constitutes an “information worker” could change because mixed reality will allow everyone to be involved in the collection and use of information. Many more types of information will become available to any worker in a compelling, easy-to-understand way. 

 

 

Let’s Speak: VR language meetups — from account.altvr.com


Google’s VR Labs provide STEM students with hands-on experience — from vrscout.com by Kyle Melnick

Excerpt:

STEM students engaged in scientific disciplines, such as biochemistry and neuroscience, are often required by their respective degrees to spend a certain amount of time engaged in an official laboratory environment. Unfortunately, crowded universities and the rise of online education have made it difficult for these innovators-in-training to access properly equipped labs and log their necessary hours.

Cue Google VR Labs, a series of comprehensive virtual lab experiences available on the Google Daydream platform. Developed as part of a partnership between Google and simulation education company Labster, the in-depth program boasts 30 interactive lab experiences in which biology students can engage in a series of hands-on scientific activities in a realistic environment.

These actions can include everything from the use of practical tools, such as DNA sequencers and microscopes, to reality-bending experiences only capable in a virtual environment, like traveling to the surface of the newly discovered Astakos IV exoplanet or examining and altering DNA on a molecular level.

 



Adobe Announces the 2019 Release of Adobe Captivate, Introducing Virtual Reality for eLearning Design — from theblog.adobe.com

Excerpt:

  • Immersive learning with VR experiences: Design learning scenarios that your learners can experience in Virtual Reality using VR headsets. Import 360° media assets and add hotspots, quizzes and other interactive elements to engage your learners with near real-life scenarios
  • Interactive videos: Liven up demos and training videos by making them interactive with the new Adobe Captivate. Create your own or bring in existing YouTube videos, add questions at specific points and conduct knowledge checks to aid learner remediation
  • Fluid Boxes 2.0: Explore the building blocks of Smart eLearning design with intelligent containers that use white space optimally. Objects placed in Fluid Boxes get aligned automatically, so learners always get a fully responsive experience regardless of their device or browser.
  • 360° learning experiences: Augment the learning landscape with 360° images and videos and convert them into interactive eLearning material with customizable overlay items such as information blurbs, audio content & quizzes.

 

 

Blippar unveils indoor visual positioning system to anchor AR — from martechtoday.com by Barry Levine
Employing machine vision to recognize mapped objects, the company says it can determine which way a user is looking and can calculate positioning down to a centimeter.

A Blippar visualization of AR using its new indoor visual positioning system

 

The Storyteller’s Guide to the Virtual Reality Audience — from medium.com by Katy Newton

Excerpt:

To even scratch the surface of these questions, we need to better understand the audience’s experience in VR — not just their experience of the technology, but the way that they understand story and their role within it.

 

 

Hospital introducing HoloLens augmented reality into the operating room — from medgadget.com

Excerpt:

HoloLens technology is being paired with Microsoft’s Surface Hub, a kind of digital whiteboard. The idea is that the surgical team can gather together around a Surface Hub to review patient information, discuss the details of a procedure, and select what information should be readily accessible during surgery. During the procedure, a surgeon wearing a HoloLens would be able to review a CT or MRI scan, access other data in the electronic medical records, and to be able to manipulate these so as to get a clear picture of what is being worked on and what needs to be done.

 

 

Raleigh Fire Department invests in virtual reality to enrich training — from vrfocus.com by Nikholai Koolonavi
New system allows department personnel to learn new skills through immersive experiences.

Excerpt:

The VR solution allows emergency medical services (EMS) personnel to dive into a rich and detailed environment in which they can pinpoint portions of the body to dissect. This lets them see each part of the body in great detail and view it from any angle. The goal is to let users gain the experience to diagnose injuries from a variety of vantage points, all while working within a virtual environment capable of displaying countless scenarios.

 

 

For another emerging technology, see:

Someday this tiny spider bot could perform surgery inside your body — from fastcompany.com by Jesus Diaz
The experimental robots could also fix airplane engines and find disaster victims.

Excerpt:

A team of Harvard University researchers recently achieved a major breakthrough in robotics, engineering a tiny spider robot using tech that could one day work inside your body to repair tissues or destroy tumors. Their work could not only change medicine–by eliminating invasive surgeries–but could also have an impact on everything from how industrial machines are maintained to how disaster victims are rescued.

Until now, most advanced, small-scale robots followed a certain model: They tend to be built at the centimeter scale and have only one degree of freedom, which means they can only perform one movement. Not so with this new ‘bot, developed by scientists at Harvard’s Wyss Institute for Biologically Inspired Engineering, the John A. Paulson School of Engineering and Applied Sciences, and Boston University. It’s built at the millimeter scale, and because it’s made of flexible materials–easily moved by pneumatic and hydraulic power–the critter has an unprecedented 18 degrees of freedom.

 


Plus some items from a few weeks ago


 

After almost a decade and billions in outside investment, Magic Leap’s first product is finally on sale for $2,295. Here’s what it’s like. — from

Excerpts (emphasis DSC):

I liked that it gave a new perspective to the video clip I’d watched: It threw the actual game up on the wall alongside the kind of information a basketball fan would want, including 3-D renderings and stats. Today, you might turn to your phone for that information. With Magic Leap, you wouldn’t have to.

Abovitz also said that intelligent assistants will play a big role in Magic Leap’s future. I didn’t get to test one, but Abovitz says he’s working with a team in Los Angeles that’s developing high-definition people that will appear to Magic Leap users and assist with tasks. Think Siri, Alexa or Google Assistant, but instead of speaking to your phone, you’d be speaking to a realistic-looking human through Magic Leap. Or you might be speaking to an avatar of someone real.

“You might need a doctor who can come to you,” Abovitz said. “AI that appears in front of you can give you eye contact and empathy.”

 

And I loved the idea of being able to place a digital TV screen anywhere I wanted.

 

 

Magic Leap One Available For Purchase, Starting At $2,295 — from vrscout.com by Kyle Melnick

Excerpt:

In December of last year, U.S. startup Magic Leap unveiled its long-awaited mixed reality headset, a secretive device five years and $2.44B USD in the making.

This morning that same headset, now referred to as the Magic Leap One Creator Edition, became available for purchase in the U.S. On sale to creators at a hefty starting price of $2,295, the spatial computing device utilizes synthetic lightfields to capture natural lightwaves and superimpose interactive, 3D content over the real world.


Magic Leap One First Hands-On Impressions for HoloLens Developers — from magic-leap.reality.news

Excerpt:

After spending about an hour with the headset running through set up and poking around its UI and a couple of the launch day apps, I thought it would be helpful to share a quick list of some of my first impressions as someone who’s spent a lot of time with a HoloLens over the past couple years and try to start answering many of the burning questions I’ve had about the device.

 

 

World Campus researches effectiveness of VR headsets and video in online classes — from news.psu.edu

Excerpt:

UNIVERSITY PARK, Pa. — Penn State instructional designers are researching whether using virtual reality and 360-degree video can help students in online classes learn more effectively.

Designers worked with professors in the College of Nursing to incorporate 360-degree video into Nursing 352, a class on Advanced Health Assessment. Students in the class, offered online through Penn State World Campus, were offered free VR headsets to use with their smartphones to create a more immersive experience while watching the video, which shows safety and health hazards in a patient’s home.

Bill Egan, the lead designer for the Penn State World Campus RN to BSN nursing program, said students in the class were surveyed as part of a study approved by the Institutional Review Board and overwhelmingly said that they enjoyed the videos and thought they provided educational value. Eighty percent of the students said they would like to see more immersive content such as 360-degree videos in their online courses, he said.

 

 

7 Practical Problems with VR for eLearning — from learnupon.com

Excerpt:

In this post, we run through some practical stumbling blocks that prevent VR training from being feasible for most.

There are quite a number of practical considerations which prevent VR from totally overhauling the corporate training world. Some are obvious, whilst others only become apparent after using the technology a number of times. It’s important to be made aware of these limitations so that a large investment isn’t made in tech that isn’t really practical for corporate training.

 

Augmented reality – the next big thing for HR? — from hrdconnect.com
Augmented reality (AR) could have a huge impact on HR, transforming long-established processes into something engaging and exciting. What will this look like? How can we shape this into our everyday working lives?

Excerpt (emphasis DSC):

AR also has the potential to revolutionise our work lives, changing the way we think about office spaces and equipment forever.

Most of us still commute to an office every day, which can be a time-consuming and stressful experience. AR has the potential to turn any space into your own customisable workspace, complete with digital notes, folders and files – even a digital photo of your loved ones. This would give you access to all the information and tools that you would typically find in an office, but wherever and whenever you need them.

And instead of working on a flat, stationary, two-dimensional screen, your workspace would be a customisable three-dimensional space, where objects and information are manipulated with gestures rather than hardware. All you would need is an AR headset.

AR could also transform the way we advertise brands and share information. Imagine if your organisation had an AR stand at a conference – how engaging would that be for potential customers? How much more interesting and fun would meetings be if we used AR to present information instead of slides on a projector?

AR could transform the on-boarding experience into something fun and interactive – imagine taking an AR tour of your office, where information about key places, company history or your new colleagues pops into view as you go from place to place. 

 

 

RETINA Are Bringing Augmented Reality To Air Traffic Control Towers — from vrfocus.com by Nikholai Koolonavi

Excerpt:

A new project is aiming to make it easier for staff in airport control towers to visualize information to help make their job easier by leveraging augmented reality (AR) technology. The project, dubbed RETINA, is looking to modernise Europe’s air traffic management for safer, smarter and even smoother air travel.


2018 NMC Horizon Report: The trends, challenges, and developments likely to influence ed tech

2018 NMC Horizon Report — from library.educause.edu

Excerpt:

What is on the five-year horizon for higher education institutions? Which trends and technology developments will drive educational change? What are the critical challenges and how can we strategize solutions? These questions regarding technology adoption and educational change steered the discussions of 71 experts to produce the NMC Horizon Report: 2018 Higher Education Edition brought to you by EDUCAUSE. This Horizon Report series charts the five-year impact of innovative practices and technologies for higher education across the globe. With more than 16 years of research and publications, the Horizon Project can be regarded as one of education’s longest-running explorations of emerging technology trends and uptake.

Six key trends, six significant challenges, and six developments in educational technology profiled in this higher education report are likely to impact teaching, learning, and creative inquiry in higher education. The three sections of this report constitute a reference and technology planning guide for educators, higher education leaders, administrators, policymakers, and technologists.

 

2018 NMC Horizon Report -- a glance at the trends, challenges, and developments likely to influence ed tech -- visual graphic

 

Also see:

 

 

Campus Technology recently announced the recipients of the 2018 Campus Technology Impact Awards.


Categories include:

  • Teaching and Learning
  • Education Futurists
  • Student Systems & Services
  • Administration
  • IT Infrastructure & Systems

 

From DSC:
Having served as one of the judges for these competitions during the last several years, I really appreciate the level of innovation that’s been displayed by many of the submissions and the individuals/institutions behind them. 

 

 

You’re already harnessing the science of learning (you just don’t know it) — from edsurge.com by Pooja Agarwal

Excerpt (emphasis DSC):

Now, a decade later, I see the same clicker-like trend: tools like Kahoot, Quizlet, Quizizz and Plickers are wildly popular due to the increased student engagement and motivation they can provide. Meanwhile, these tech tools continue to incorporate powerful strategies for learning, which are discussed less often. Consider, for example, four of the most robust research-based strategies from the science of learning:

  1. Retrieval practice
  2. Spaced practice
  3. Interleaving
  4. Feedback

Sound familiar? It’s because approaches that encourage students to use what they know, revisit it over time, mix it up and learn about their own learning are core elements in many current edtech tools. Kahoot and Quizlet, for example, provide numerous retrieval formats, reminders, shuffle options and instant feedback. A century of scientific research demonstrates that these features don’t simply increase engagement—they also improve learning, higher-order thinking and transfer of knowledge.
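As a concrete (and purely illustrative) example of how a tool might implement the spaced-practice strategy listed above, here is a minimal Leitner-style scheduler. The box count and review intervals are assumptions for the sketch — not the algorithm of Kahoot, Quizlet, or any other product.

```python
# Illustrative Leitner-style spaced-practice scheduler: a card moves up one
# box after a correct retrieval attempt and drops back to box 1 on a miss;
# higher boxes are reviewed at progressively longer intervals.

INTERVAL_DAYS = {1: 1, 2: 3, 3: 7, 4: 14, 5: 30}  # assumed review gap per box

def review(card, correct):
    """Return the card with its box and next-review gap updated."""
    box = min(card["box"] + 1, 5) if correct else 1
    return {"prompt": card["prompt"], "box": box, "due_in_days": INTERVAL_DAYS[box]}

card = {"prompt": "What is retrieval practice?", "box": 2, "due_in_days": 3}
card = review(card, correct=True)    # promoted to box 3 → reviewed in 7 days
card = review(card, correct=False)   # missed → back to box 1, reviewed tomorrow
print(card)
```

The instant-feedback and reminder features these apps advertise map directly onto this loop: the answer check drives `correct`, and `due_in_days` drives the reminder.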

 

 


From DSC:
Pastors should ask this type of question as well: “What did we talk about the last time we met?” — then give the congregation a minute to write down what they can remember.


 

 

Also from Pooja Agarwal and RetrievalPractice.org

For teachers, here’s what we share in a minute or less about retrieval practice:

And when it comes to students, the first thing we share are Retrieval Warm Ups. These quick, fun questions engage students in class discussion and start a conversation about how retrieval is something we do every day. Try one of these with a teacher to start a conversation about retrieval practice, too!


Guiding faculty into immersive environments — from campustechnology.com by David Raths
What’s the best way to get faculty to engage with emerging technologies and incorporate new learning spaces into their teaching? Five institutions share their experiences.

Guiding faculty into immersive environments -- by David Raths

Excerpt:

One of the biggest hurdles for universities has been the high cost of VR-enabled computers and headsets, and some executives say prices must continue to drop before we’ll see more widespread usage. But John Bowditch, director of the Game Research and Immersive Design Lab at Ohio University’s Scripps College of Communication, is already seeing promising developments on that front as he prepares to open a new 20-seat VR classroom. “Probably the best thing about VR in 2018 is that it is a lot more affordable now and that democratizes it,” he said. “We purchased a VR helmet 13 years ago, and it was $12,000 just for the headset. The machine that ran it cost about $20,000. That would be a nonstarter beyond purchasing just one or two. Today, you can get a VR-enabled laptop and headset for under $2,000. That makes it much easier to think about integrating it into classes.”

 

 

Colleges and universities face several hurdles in getting faculty to incorporate virtual reality or immersive experiences in their courses. For one, instructional designers, instructional technologists and directors of teaching and learning centers may not have access to these tools yet, and the budgets aren’t always there to get the labs off the ground, noted Daniel Christian, instructional services director at Western Michigan University‘s Cooley Law School. “Many faculty members’ job plates are already jam-packed — allowing little time to even look at emerging technologies,” he said. “Even if they wanted to experiment with such technologies and potential learning experiences, they don’t have the time to do so. Tight budgets are impacting this situation even further.”


Computing in the Camera — from blog.torch3d.com by Paul Reynolds
Mobile AR, with its ubiquitous camera, is set to transform what and how human experience designers create.

One of the points Allison [Wood, CEO, Camera IQ] made repeatedly on that call (and in this wonderful blog post from the same period) was that the camera is going to be at the center of computing going forward, an indispensable element. Spatial computing could not exist without it. Simple, obvious, straightforward, but not earth-shaking. We all heard what she had to say, but I don’t think any of us really understood just how profound or prophetic that statement would turn out to be.

“[T]he camera will bring the internet and the real world into a single time and space.”

— Allison Wood, CEO, Camera IQ

The Camera As Platform — from shift.newco.co by Allison Wood
When the operating system moves to the viewfinder, the world will literally change

“Every day two billion people carry around an optical data input device — the smartphone Camera — connected to supercomputers and informed by massive amounts of data that can have nearly limitless context, position, recognition and direction to accomplish tasks.”

– Jacob Mullins, Shasta Ventures

The State Of The ARt At AWE 18 — from forbes.com by Charlie Fink

Excerpt:

The bigger story, however, is how fast the enterprise segment is growing as applications as straightforward as schematics on a head-mounted monocular microdisplay are transforming manufacturing, assembly, and warehousing. Use cases abounded.

After traveling the country and, most recently, to Europe, I’ve now experienced almost every major VR/AR/MR/XR-related conference out there. AWE’s exhibit area was by far the largest display of VR and AR companies to date (with the exception of CES).

AR is being used to identify features and parts within cars

Student Learning and Virtual Reality: The Embodied Experience — from er.educause.edu by Jaime Hannans, Jill Leafstedt and Talya Drescher

Excerpts:

Specifically, we explored the potential for how virtual reality can help create a more empathetic nurse, which, we hypothesize, will lead to increased development of nursing students’ knowledge, skills, and attitudes. We aim to integrate these virtual experiences into early program coursework, with the intent of changing nursing behavior by providing a deeper understanding of the patient’s perspective during clinical interactions.

In addition to these compelling student reflections and the nearly immediate change in reporting practice, survey findings show that students unanimously felt that this type of patient-perspective VR experience should be integrated and become a staple of the nursing curriculum. Seeing, hearing, and feeling these moments results in significant and memorable learning experiences compared to traditional classroom learning alone. The potential that this type of immersive experience can have in the field of nursing and beyond is only limited by the imagination and creation of other virtual experiences to explore. We look forward to continued exploration of the impact of VR on student learning and to establishing ongoing partnerships with developers.

Also see:

Reimagining the Higher Education Ecosystem — from edu2030.agorize.com
How might we empower people to design their own learning journeys so they can lead purposeful and economically stable lives?

Excerpts:

The problem
Technology is rapidly transforming the way we live, learn, and work. Entirely new jobs are emerging as others are lost to automation. People are living longer, yet switching jobs more often. These dramatic shifts call for a reimagining of the way we prepare for work and life—specifically, how we learn new skills and adapt to a changing economic landscape.

The changes ahead are likely to hurt most those who can least afford to manage them: low-income and first-generation learners already ill-served by our existing postsecondary education system. Our current system stifles economic mobility and widens income and achievement gaps; we must act now to ensure that we have an educational ecosystem flexible and fair enough to help all people live purposeful and economically stable lives. And if we are to design solutions proportionate to this problem, new technologies must be called on to scale approaches that reach the millions of vulnerable people across the country.

The challenge
How might we empower people to design their own learning journeys so they can lead purposeful and economically stable lives?

The Challenge—Reimagining the Higher Education Ecosystem—seeks bold ideas for how our postsecondary education system could be reimagined to foster equity and encourage learner agency and resilience. We seek specific pilots to move us toward a future in which all learners can achieve economic stability and lead purposeful lives. This Challenge invites participants to articulate a vision and then design pilot projects for a future ecosystem that has the following characteristics:

Expands access: The educational system must ensure that all people—including low-income learners who are disproportionately underserved by the current higher education system—can leverage education to live meaningful and economically stable lives.

Draws on a broad postsecondary ecosystem: While colleges and universities play a vital role in educating students, there is a much larger ecosystem in which students learn. This ecosystem includes non-traditional “classes” or alternative learning providers, such as MOOCs, bootcamps, and online courses, as well as on-the-job training and informal learning. Our future learning system must value the learning that happens in many different environments and enable seamless transitions between learning, work, and life.

From DSC:
This is where I could see a vision similar to Learning from the Living [Class] Room come into play. It would provide a highly affordable, accessible platform that would offer more choice and more control to learners of all ages. It would be available 24×7 and would support lifelong learning. It would combine a variety of AI-enabled functionalities with human expertise, teaching, training, motivation, and creativity.

It could be that what comes out of this challenge will lay the groundwork for a future, massive new learning platform.

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

Also see:

What teaching that lifts all students could look like — from kqed.org by Kristina Rizga

An initial comment from DSC:
I recently ran across this article. Although it’s from 12/24/15, it has some really solid points and strategies in it. Definitely worth a read if you are a teacher or even a parent with school-age kids.

Excerpts (emphasis DSC):

…his comments in class about the substance of her ideas, his feedback on her writing, the enthusiasm in his voice when he discussed her thinking. Over the course of a year, that proof solidified into a confidence that couldn’t be easily shaken anymore. It was that pride in her intellect that gave her the fortitude and resilience to cut through many racial stereotypes and negative myths as she made her way through high school and then Boston University.

For McKamey, the most important value driving her teaching and coaching is her conviction that being a good teacher means hearing, seeing, and succeeding with all students—regardless of how far a student is from the teacher’s preconceived notions of what it means to be ready to learn. When teachers are driven by a belief that all of their students can learn, they are able to respond to the complexity of their students’ needs and to adjust if something is not working for a particular individual or group of students.

The best way to improve teaching and reduce the achievement gaps, McKamey argues, is to allow teachers to act as school-based researchers and leaders, justifying classroom reforms based on the broad range of performance markers of their students: daily grades, the quality of student work and the rate of its production, engagement, effort, attendance, and student comments. That means planning units together and then spending a lot of time analyzing the iterative work the students produce. This process teaches educators to recognize that there are no standard individuals, and there are as many learning trajectories as there are people. 

© 2024 | Daniel Christian