AR and VR in STEM: The New Frontiers in Science  — from er.educause.edu by Emory Craig and Maya Georgieva

Excerpt:

Virtual and Augmented Reality are poised to profoundly transform the STEM curriculum. In this article, we offer several inspiring examples and key insights on the future of immersive learning and the sciences. Immersive technologies will revolutionize learning through experiential simulations, modelling and spatial representation of data, and a sense of presence in contextual gamification.

Understanding our place in the universe, building the next Martian Rover, designing new transportation systems, fostering sustainable communities, modeling economic stability — finding solutions for these pressing and interconnected challenges brings us to STEM and STEAM in teaching and learning. The movement behind STEAM advocates incorporating the arts and humanities into the science, technology, engineering, and math curriculum.


From DSC:
I appreciated hearing the perspectives from Bruce Dixon and Will Richardson this morning, as I listened to a webinar that they recently offered. A few key takeaways for me from that webinar — and from a document that they shared — were:

  • The world has fundamentally changed. (Bruce and Will also mentioned the new pace of change; i.e., that it’s much faster.)
  • We need to have more urgency about the need to reimagine school, not to try to improve the existing model.
  • “Because of the advent of the Web and the technologies we use to access it, learning is, in a phrase, leaving the (school) building.”
  • There is a newfound capacity to take full control of one’s own learning; self-determined learning should be at the center of students’ and teachers’ work, along with a co-constructed curriculum.
  • And today, at a moment when learners of all ages have never had more agency over their own learning, schools must unlearn centuries-old mindsets and practices and relearn them in ways that truly will serve every child living in the modern, connected world.
  • Will and Bruce believe that every educator — and district for that matter — should articulate their own “principles of learning”
  • Beliefs about how kids learn (powerfully and deeply) need to be articulated and consistently communicated and lived out
  • Everything we do as educators, administrators, etc. tells a story. What stories are we telling? (For example, what does the signage around your school building say? Is it about compliance? Is it about a love of learning? Wonder? What does the 20′ jumbotron say about priorities? Etc.)
  • Bruce and Will covered a “story audit” and how to do one

 

“Learning is, in a phrase, leaving the (school) building.”

Richardson & Dixon


These educators have decades’ worth of experience. They are pulse-checking their environments. They want to see students thrive both now and into the future. For these reasons, at least for me, their perspectives are well worth reflecting upon.


ARCore: Augmented reality at Android scale — from blog.google by Dave Burke

Excerpt:

With more than two billion active devices, Android is the largest mobile platform in the world. And for the past nine years, we’ve worked to create a rich set of tools, frameworks and APIs that deliver developers’ creations to people everywhere. Today, we’re releasing a preview of a new software development kit (SDK) called ARCore. It brings augmented reality capabilities to existing and future Android phones. Developers can start experimenting with it right now.

 

Google just announced its plan to match the coolest new feature coming to the iPhone — from cnbc.com by Todd Haselton

  • Google just announced its answer to Apple’s augmented reality platform
  • New tools called ARCore will let developers enable AR on millions of Android devices

 

AR Experiments

Description:

AR Experiments is a site that features work by coders who are experimenting with augmented reality in exciting ways. These experiments use various tools like ARCore, an SDK that lets Android developers create awesome AR experiences. We’re featuring some of our favorite projects here to help inspire more coders to imagine what could be made with AR.

 

Google’s ARCore hopes to introduce augmented reality to the Android masses — from androidauthority.com by Williams Pelegrin

Excerpt:

Available as a preview, ARCore is an Android software development kit (SDK) that lets developers introduce AR capabilities to, you guessed it, Android devices. Because of how ARCore works, there is no need for folks to purchase additional sensors or hardware – it will work on existing and future Android phones.

 

 

Augray, An Augmented Reality Company Has Launched World’s First AR Messenger — from tada-time.com; with thanks to Mr. Woontack Woo for this resource

Excerpt:

Never before has there been a messenger app based only on AR.

Just by downloading the app and taking a selfie, you get your own avatar and can customize it. There are many clothing and style options to fit your personality and mood, and you can change its appearance at any time. Once you create your digital you, you can place it in any environment around you and, with the default animation audio or your own voice, really make your avatar come alive. You can record these scenes in the real world, chat with your friends and family, or post your recordings to the built-in social feed so everyone can see your creation or experience your digital you in their world. This app will change your social life for the better.

 

Also related/see:
The Future of Social Media is Here: The Rise of Augmented Reality

 

 

The Top 5 Virtual Reality and Augmented Reality Apps for Architects — from archdaily.com by Lidija Grozdanic for Archipreneur.com; again, with thanks to Mr. Woontack Woo for this resource

Excerpt:

Virtual reality and augmented reality tools for the AEC industry are getting increasingly better and more optimized. As prices keep dropping, there are fewer reasons why every architect, engineer, contractor, and owner shouldn’t use some form of VR/AR in bringing their projects to life.

Once a novelty just a few years ago, VR/AR solutions are becoming a medium that’s transforming the way professionals in the AEC industry communicate, create, and experience content. By offering a more immersive experience of architectural designs, as well as of products and areas related to space building, VR and AR tools are becoming an industry standard that offers rapid iteration and the opportunity to refine designs in collaboration with clients and colleagues.


Dell makes a VR Visor to go with its Alienware laptops — from engadget.com by Andrew Tarantola
It’ll retail for $349 when it goes on sale in October.


Addendums:
Microsoft’s new partnership with Valve looks like a win-win — from businessinsider.com by Matt Weinberger

Excerpt:

  • Cheap VR headsets, priced at around $300, are coming to Windows 10 later this year
  • Microsoft just announced those headsets will support apps and games from the Steam platform, which will greatly expand selection. Plus, Microsoft says it’s working on “Halo” content for virtual reality.
  • The VR headsets will work with less powerful computers, potentially expanding the market.

 
Virtual reality, real rewards in higher ed — from universitybusiness.com by Jennifer Fink
Using augmented and virtual reality to attract students and engage donors


Top Trends in the Gartner Hype Cycle for Emerging Technologies, 2017 — from gartner.com by Kasey Panetta
Enterprises should explain the business potential of blockchain, artificial intelligence and augmented reality.

Excerpt (emphasis DSC):

…emerging technologies such as machine learning, blockchain, drones (commercial UAVs), software-defined security and brain-computer interfaces have moved significantly along the Hype Cycle since 2016.

The Gartner Hype Cycle for Emerging Technologies, 2017 focuses on three emerging technology mega-trends: Artificial intelligence (AI) everywhere, transparently immersive experiences and digital platforms. Enterprise architects and technology innovation leaders should explore and ideate these three mega-trends to understand the future impacts to their business.

“Organizations will continue to be faced with rapidly accelerating technology innovation that will profoundly impact the way they deal with their workforces, customers and partners,” says Mike J. Walker, research director. “Our 2017 Hype Cycle reveals three distinct technology trends that profoundly create new experiences with unrivaled intelligence, and offer platforms that propel organizations to connect with new business ecosystems in order to become competitive over the next five to 10 years.”


Artificial intelligence will transform universities. Here’s how. — from weforum.org by Mark Dodgson & David Gann

Excerpt:

The most innovative AI breakthroughs, and the companies that promote them – such as DeepMind, Magic Pony, Aysadi, Wolfram Alpha and Improbable – have their origins in universities. Now AI will transform universities.

We believe AI is a new scientific infrastructure for research and learning that universities will need to embrace and lead, otherwise they will become increasingly irrelevant and eventually redundant.

Through their own brilliant discoveries, universities have sown the seeds of their own disruption. How they respond to this AI revolution will profoundly reshape science, innovation, education – and society itself.

As AI gets more powerful, it will not only combine knowledge and data as instructed, but will search for combinations autonomously. It can also assist collaboration between universities and external parties, such as between medical research and clinical practice in the health sector.

The implications of AI for university research extend beyond science and technology.

When it comes to AI in teaching and learning, many of the more routine academic tasks (and least rewarding for lecturers), such as grading assignments, can be automated. Chatbots, intelligent agents using natural language, are being developed by universities such as the Technical University of Berlin; these will answer questions from students to help plan their course of studies.

Virtual assistants can tutor and guide more personalized learning. As part of its Open Learning Initiative (OLI), Carnegie Mellon University has been working on AI-based cognitive tutors for a number of years. It found that its OLI statistics course, run with minimal instructor contact, resulted in comparable learning outcomes for students with fewer hours of study. In one course at the Georgia Institute of Technology, students could not tell the difference between feedback from a human being and a bot.

 

 

Also see:

Digital audio assistants in teaching and learning — from blog.blackboard.com by Szymon Machajewski

Excerpts:

I built an Amazon Alexa skill called Introduction to Computing Flashcards. Using the skill in the Amazon Alexa app, students are able to listen to Alexa and then answer questions. Alexa helps students prepare for an exam by speaking definitions and then waiting for their identification. In addition to quizzing the student, Alexa also keeps track of the correct answers. If a student answers five questions correctly, Alexa shares a game code, which is worth class experience points in the course’s gamification app, My Game.

Certainly, exam preparation apps are one way to use digital assistants in education. As the development and publishing of Amazon Alexa skills becomes easier, faculty will be able to produce such skills just as easily as they now create PowerPoints. Given the basic code available through Amazon tutorials, it takes about 20 minutes to create a new exam preparation app; a basic voice-experience Alexa skill can take as little as five minutes to complete.

Universities can publish their campus news through the Alexa Flash Briefing. This type of skill can publish news, success stories, and other events associated with the campus.

If you are a faculty member, how can you develop your first Amazon Alexa skill? You can use any of the tutorials already available. You can also participate in an Amazon Alexa classroom training provided by Alexa Dev Days. It is possible that schools or maker spaces near you offer in-person developer sessions. You can use meetup.com to track these opportunities.
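
To make this concrete, here is a minimal sketch of the kind of AWS Lambda handler that could sit behind a flashcard-style skill like the one described above. The intent name (AnswerIntent), the slot name (term), and the sample flashcards are hypothetical placeholders rather than details from Machajewski's skill; only the request and response JSON shapes follow Alexa's documented format.

```python
# Minimal sketch of an AWS Lambda handler for a flashcard-style Alexa skill.
# Intent/slot names (AnswerIntent, term) and the flashcard data are hypothetical;
# the request/response JSON follows Alexa's standard format.

FLASHCARDS = [
    ("a named block of reusable code", "function"),
    ("a variable that stores a whole number", "integer"),
]

def build_response(text, end_session=False):
    """Wrap plain text in the Alexa response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context):
    request = event["request"]

    if request["type"] == "LaunchRequest":
        definition, _ = FLASHCARDS[0]
        return build_response(f"Here is your first definition: {definition}. What term is this?")

    if request["type"] == "IntentRequest" and request["intent"]["name"] == "AnswerIntent":
        definition, answer = FLASHCARDS[0]
        spoken = request["intent"].get("slots", {}).get("term", {}).get("value", "").lower()
        if spoken == answer:
            # A fuller skill would track the score in session attributes and
            # reveal a game code after five correct answers, as described above.
            return build_response("Correct! Here is the next definition.")
        return build_response(f"Not quite. The term for '{definition}' is {answer}.")

    return build_response("Thanks for studying. Goodbye.", end_session=True)
```

The same handler can be wired to a skill's interaction model in the Alexa developer console, which is roughly the workflow the Amazon tutorials mentioned above walk through.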


5 reasons iPhone 8 will be an augmented reality game changer — from techradar.com by Mark Knapp
Augmented Reality (AR) will be everywhere!

Excerpt (emphasis DSC):

Most of us are at least vaguely familiar with augmented reality thanks to Pokemon Go sticking Charmanders in our sock drawers and Snapchat letting us spew stars and rainbows from our mouths, but Apple’s iPhone 8 is going to push AR to the point of ubiquity. When the iPhone 8 launches, we’ll all be seeing the world differently.

iPhones are everywhere, so AR will be everywhere!

The iPhone 8 will bring with it iOS 11, and with iOS 11 will come Apple’s AR. Since the iPhone 8 is all but guaranteed to be a best-seller and earlier iPhones and iPads will be widely updated to iOS 11, Apple will have a massive AR platform. Craig Federighi, Apple’s Senior Vice President of Software Engineering, believes it will be “the largest AR platform in the world,” which will lure AR developers en masse.

 

Apple AR has huge potential in education.
Apple has been positioning itself in the education world for years, with programs like iTunes U and iBooks, as well as efforts to get iPads into classrooms. AR already has major prospects in education, with the ability to make museum exhibits interactive and to put a visually explorable world in front of users.

 

 

Apple could guide you around your city using augmented reality — from techcrunch.com by Romain Dillet, with thanks to Woontack Woo for this resource


Startup Simulanis uses augmented and virtual reality to skill professionals — from economictimes.indiatimes.com by Vinay Dwivedi

Excerpt:

[India] The widening gap between the skills required by businesses and the know-how of a large number of engineering students got Raman Talwar started on his entrepreneurial journey.

Delhi-based Simulanis harnesses AR and VR technology to help companies across industries — pharmaceuticals, auto, FMCG, and manufacturing — train their staff. It continues to work in the engineering education sector and has developed applications that help students visualise challenging subjects and concepts.

“Our products help students and trainees learn difficult concepts easily and interactively through immersive AR-VR and 3-D gamification methods,” says Talwar. Simulanis’ offerings include an AR learning platform, Saral, and a gamified learning platform, Protocol.


Want to learn a new language? With this AR app, just point & tap — from fastcodesign.com by Mark Wilson
A new demo shows how augmented reality could redefine apps as we know them.

Excerpt:

There’s a new app gold rush. After Facebook and Apple both released augmented reality development kits in recent months, developers are demonstrating just what they can do with these new technologies. It’s a race to invent the future first.

To get a taste of how quickly and dramatically our smartphone apps are about to change, just take a look at this little demo by front-end engineer Frances Ng, featured on Prosthetic Knowledge. Just by aiming her iPhone at various objects and tapping, she can both identify items like lamps and laptops and translate their names into a number of different languages. Bye bye, multilingual dictionaries and Google Translate. Hello, “what the heck is the Korean word for that?”
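
For readers curious how a "point, identify, translate" flow might look in code, the sketch below is a rough, non-AR analogue in Python: it labels a photo with a pretrained image classifier and then looks the label up in a small translation table. The demo itself runs on-device with ARKit and Core ML; the model choice, the tiny dictionary, and its Korean entries here are illustrative assumptions, not details taken from the app.

```python
# Rough Python analogue of the "point, identify, translate" flow:
# classify an image with a pretrained model, then translate the label.
# The translation table is a tiny hypothetical sample, not data from the demo.
import numpy as np
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, decode_predictions, preprocess_input)
from tensorflow.keras.preprocessing import image

TRANSLATIONS = {          # ImageNet label -> Korean (sample entries only)
    "laptop": "노트북",
    "table_lamp": "램프",
}

model = MobileNetV2(weights="imagenet")  # downloads pretrained weights on first use

def identify_and_translate(image_path):
    img = image.load_img(image_path, target_size=(224, 224))
    x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
    _, label, score = decode_predictions(model.predict(x), top=1)[0][0]
    return label, TRANSLATIONS.get(label, "(no translation available)"), score

# Example: label, korean, score = identify_and_translate("desk_photo.jpg")
```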


Also see:

Apple ARKit & Machine Learning Come Together Making Smarter Augmented Reality — from next.reality.news by Jason Odom

Excerpt:

The world is a massive place, especially when you consider the field of view of your smartglasses or mobile device. To fulfill the potential promise of augmented reality, we must find a way to fill that view with useful and contextual information. Of course, the job of creating contextual, valuable information, to fill the massive space that is the planet earth, is a daunting task to take on. Machine learning seems to be one solution many are moving toward.

Tokyo-based web developer Frances Ng released a video on Twitter showing off her first experiments with Apple’s ARKit and Core ML, Apple’s machine learning framework. As you can see in the GIFs accompanying the article, her mobile device recognizes a few objects around her room and then displays the names of the identified objects.

 

 

 

Learn the skills and resources you need to master virtual reality — from vudream.com by Mark Metry

Excerpt:

[From] Tee Jia Hen, CEO of VRcollab
In my opinion, there are 4 specializations for the VR content professional.

  1. VR native app development
  2. Cinematic VR creation
  3. Photogrammetry
  4. VR web development

 

 


Also see:

Getting Started with WebVR – from virtualrealitypop.com by Michael Hazani

Excerpt:

This is not a tutorial or a comprehensive, thorough technical guide — many of those already exist — but rather a way to think about WebVR and acquaint yourself with what it is, exactly, and how best to approach it from scratch. If you’ve been doing WebVR or 3D programming for a while, this article is most certainly not for you. If you’ve been curious about that stuff and want to know how to join the party — read on!


Davy Crockett to give tours of Alamo in new augmented reality app — from mysanantonio.com by Samantha Ehlinger

Excerpt:

Using a smartphone, users will be able to see and interact with computer-generated people and scenes from the past — overlaid on top of the very real and present-day Alamo. The app will also show the Alamo as it was at different points in history, and tell the story of the historic battle through the different perspectives of the people (like Crockett) who were there. The app includes extra features users can buy, much like Pokémon Go.

“We’re making this into a virtual time machine so that if I’m standing on this spot and I look at, oh well there’s Davy Crockett, then I can go back a century and I can see the mission being built,” Alamo Reality CEO Michael McGar said. The app will allow users to see the Alamo not only as it was in 1836, but as it was before and after, McGar said.


“We’re developing a technology that’s going to be able to span across generations to tell a story”

— Lane Traylor

 

 

Augmented Reality Technology: A student creates the closest thing yet to a magic ring — from forbes.com by Kevin Murnane

Excerpt:

Nat Martin set himself the problem of designing a control mechanism that can be used unobtrusively to meld AR displays with the user’s real-world environment. His solution was a controller in the shape of a ring that can be worn on the user’s finger. He calls it Scroll. It uses the ARKit software platform and contains an Arduino circuit board, a capacitive sensor, a gyroscope, an accelerometer, and a SoftPot potentiometer. Scroll works with any AR device that supports the Unity game engine, such as Google Cardboard or Microsoft’s HoloLens.
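
The article does not publish Scroll's firmware or protocol, so the sketch below is purely hypothetical: it only illustrates how a host application might read a ring-style controller (Arduino plus SoftPot) over a USB serial link and turn swipe deltas into scroll events for an AR scene. The port name, baud rate, and comma-separated line format are all assumptions for illustration.

```python
# Hypothetical sketch: reading a ring-style controller (Arduino + SoftPot)
# over USB serial and turning swipe deltas into scroll events for an AR app.
# The port, 115200 baud, and "softpot,gyro_z" line format are assumptions;
# Scroll's real firmware and protocol are not described in the article.
import serial  # pyserial

PORT, BAUD = "/dev/ttyUSB0", 115200
SWIPE_THRESHOLD = 40  # minimum SoftPot delta (0-1023 ADC units) to count as a swipe

def read_events():
    last_pos = None
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                softpot, gyro_z = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed lines
            if last_pos is not None and abs(softpot - last_pos) > SWIPE_THRESHOLD:
                direction = "up" if softpot > last_pos else "down"
                yield {"type": "scroll", "direction": direction, "gyro_z": gyro_z}
            last_pos = softpot

# for event in read_events():
#     pass  # forward the event to the AR scene (e.g., a Unity client over a socket)
```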

 

Also see:

Scroll from Nat on Vimeo.

 

 


Addendum on 8/15/17:

New iOS 11 ARKit Demo Shows Off Drawing With Fingers In Augmented Reality [Video] — from redmondpie.com by Oliver Haslam

Excerpt:

When Apple releases iOS 11 to the public next month, it will also release ARKit for the first time. The framework, designed to make augmented reality on iOS a reality, debuted during the opening keynote of WWDC 2017 when Apple announced iOS 11, and ever since then we have been seeing new concepts and demos released by developers.

Those developers have given us a glimpse of what we can expect when apps taking advantage of ARKit start to ship alongside iOS 11, and the latest of those is a demonstration in which someone’s finger is used to draw on a notepad.


Why Natural Language Processing is the Future of Business Intelligence — from dzone.com by Gur Tirosh
Until now, we have been interacting with computers in a way that they understand, rather than us. We have learned their language. But now, they’re learning ours.

Excerpt:

Every time you ask Siri for directions, a complex chain of cutting-edge code is activated. It allows “her” to understand your question, find the information you’re looking for, and respond to you in a language that you understand. This has only become possible in the last few years. Until now, we have been interacting with computers in a way that they understand, rather than us. We have learned their language.

But now, they’re learning ours.

The technology underpinning this revolution in human-computer relations is Natural Language Processing (NLP). And it’s already transforming BI, in ways that go far beyond simply making the interface easier. Before long, business-transforming, life-changing information will be discovered merely by talking with a chatbot.

This future is not far away. In some ways, it’s already here.

What Is Natural Language Processing?
NLP, otherwise known as computational linguistics, is the combination of Machine Learning, AI, and linguistics that allows us to talk to machines as if they were human.
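
As a toy illustration of the idea, the sketch below maps a natural-language question such as "What were total sales by region last quarter?" onto a SQL string using simple keyword rules. Real BI products use trained language models rather than hand-written rules, and the table name, column names, and date condition here are hypothetical placeholders.

```python
# Toy rule-based sketch: mapping a natural-language BI question to SQL.
# Production NLP layers use trained models; the keyword rules, table name
# ("sales"), and column names below are hypothetical placeholders.
import re

METRICS = {"sales": "SUM(amount)", "orders": "COUNT(order_id)"}
DIMENSIONS = {"region": "region", "product": "product_name", "month": "order_month"}
PERIODS = {"last quarter": "order_date >= CURRENT_DATE - INTERVAL '90 days'"}  # rough

def question_to_sql(question: str) -> str:
    q = question.lower()
    metric = next((expr for word, expr in METRICS.items() if word in q), "COUNT(*)")
    dims = [col for word, col in DIMENSIONS.items() if re.search(rf"\bby {word}\b", q)]
    where = next((cond for phrase, cond in PERIODS.items() if phrase in q), "TRUE")
    select = ", ".join(dims + [metric]) if dims else metric
    group_by = f" GROUP BY {', '.join(dims)}" if dims else ""
    return f"SELECT {select} FROM sales WHERE {where}{group_by};"

print(question_to_sql("What were total sales by region last quarter?"))
# SELECT region, SUM(amount) FROM sales
#   WHERE order_date >= CURRENT_DATE - INTERVAL '90 days' GROUP BY region;
```

A chatbot front end would then run the generated query and speak or display the result, which is the conversational BI loop the article describes.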

 

 

But NLP aims to eventually render GUIs — even UIs — obsolete, so that interacting with a machine is as easy as talking to a human.


Penn State World Campus implements 360-degree videos in online courses — from news.psu.edu by Mike Dawson
Videos give students virtual-reality experiences; leaders hopeful for quick expansion

Excerpt:

UNIVERSITY PARK, Pa. — Penn State World Campus is using 360-degree videos and virtual reality for the first time with the goal of improving the educational experience for online learners.

The technology has been implemented in the curriculum of a graduate-level special education course in Penn State’s summer semester. Students can use a VR headset to watch 360-degree videos on a device such as a smartphone.

The course, Special Education 801, focuses on how teachers can respond to challenging behaviors, and the 360-degree videos place students in a classroom where they see an instructor explaining strategies for arranging the classroom in ways best suited to the learning activity. The videos were produced using a 360-degree video camera and uploaded into the course in just a few days.


Report: AI will be in nearly all new software by 2020 — from thejournal.com by Joshua Bolkan

Excerpt:

Artificial intelligence will be in nearly all new software products by 2020 and a top five investment priority for more than 30 percent of chief information officers, according to a new report from Gartner.

The company lists three keys to successfully exploiting AI technologies over the next few years:

  • Many vendors are “AI washing” their products, or applying the term artificial intelligence to tools that don’t really merit it. Vendors should use the term wisely and be clear about what differentiates their AI products and what problems they solve;
  • Forgo more complicated or cutting-edge AI techniques in favor of simpler, proven approaches; and
  • Organizations do not have the skills to evaluate, build or deploy AI and are looking for embedded or packaged AI rather than custom building their own.


How SLAM technology is redrawing augmented reality’s battle lines — from venturebeat.com by Mojtaba Tabatabaie

 

 

Excerpt (emphasis DSC):

In early June, Apple introduced its first attempt to enter the AR/VR space with ARKit. What makes ARKit stand out for Apple is a technology called SLAM (Simultaneous Localization and Mapping). Every tech giant — especially Apple, Google, and Facebook — is investing heavily in SLAM technology, and whichever takes best advantage of SLAM tech will likely end up on top.

SLAM is a computer vision technique that captures visual data from the physical world as a set of points so the machine can build an understanding of it. SLAM makes it possible for machines to “have an eye and understand” what’s around them through visual input. What the machine “sees” with SLAM in a simple scene is a sparse cloud of such feature points.

Using these points, machines can build an understanding of their surroundings. This data also helps AR developers like myself create much more interactive and realistic experiences. This understanding can be used in different scenarios such as robotics, self-driving cars, AI, and, of course, augmented reality.

The simplest form of understanding from this technology is recognizing walls, barriers, and floors. Right now most AR SLAM technologies like ARKit only use floor recognition and position tracking to place AR objects around you, so they don’t actually know what’s going on in your environment well enough to react to it correctly. More advanced SLAM technologies like Google Tango can create a mesh of the environment, so that the machine can not only tell you where the floor is but also identify walls and objects around you, allowing everything in your surroundings to become an element to interact with.
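
To make the "recognize the floor from a cloud of points" step concrete, here is a minimal sketch, not tied to ARKit or Tango, that fits a floor plane to a 3D point cloud with a basic RANSAC loop in NumPy. A real SLAM pipeline also tracks the camera pose over time and fuses many frames; this covers only the plane-fitting idea.

```python
# Minimal sketch: estimate a floor plane from a 3D point cloud with RANSAC.
# Illustrates the "recognize the floor from tracked points" step described above;
# real SLAM systems (ARKit, Tango) also estimate camera pose frame to frame.
import numpy as np

def fit_floor_plane(points, iterations=200, threshold=0.02, rng=None):
    """points: (N, 3) array of x, y, z positions in meters.
    Returns (normal, d) for the plane n.p + d = 0 with the most inliers."""
    rng = rng if rng is not None else np.random.default_rng(0)
    best_inliers, best_plane = 0, None
    for _ in range(iterations):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:              # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal.dot(sample[0])
        distances = np.abs(points @ normal + d)
        inliers = int((distances < threshold).sum())
        if inliers > best_inliers:
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane

# Example: synthetic floor at y = 0 plus scattered clutter above it.
rng = np.random.default_rng(1)
floor = np.column_stack([rng.uniform(-2, 2, 500), np.zeros(500), rng.uniform(-2, 2, 500)])
clutter = rng.uniform([-2.0, 0.0, -2.0], [2.0, 2.0, 2.0], size=(100, 3))
normal, d = fit_floor_plane(np.vstack([floor, clutter]))
print("estimated floor normal:", np.round(normal, 2))   # close to (0, +/-1, 0)
```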

 

 

The company with the most complete SLAM database will likely be the winner. This database will allow these giants to have a metaphorical eye on the world, so, for example, Facebook can tag and know the location of your photo just by analyzing the image, or Google can place ads and virtual billboards around you by analyzing the camera feed from your smart glasses. Your self-driving car can navigate itself with nothing more than visual data.
