Teaching with Technology in 2018 — from thejournal.com by David Nagel
In our third-annual ed tech survey, teachers reveal an overwhelmingly positive attitude toward tech in the classroom and its impact on teaching, learning and professional development.

Excerpt:

Teachers are growing fonder of technology every year. Even the dreaded mobile phone is gaining acceptance as a classroom tool, at least among those who participated in THE Journal’s third-annual Teaching with Technology Survey.

Teacher Attitudes Toward Tech
While teachers in each of the preceding surveys were, for the most part, pumped up about tech for learning, this year’s results reveal an evolving positivism not just about tech, but about the direction tech is heading.

Exactly three-quarters of teachers in the survey indicated tech has had an extremely positive (38.37 percent) or mostly positive (36.63 percent) impact on education. The remaining 25 percent said tech has had both positive and negative effects on education. Zero respondents said tech had a negative or extremely negative impact.

Responses about tech’s impact on student learning were similar, with 84 percent saying it’s had a positive impact, 6 percent saying it’s had a negative impact and 10 percent being neutral.

 

 

 

 


From DSC:
So regardless of what was being displayed on any given screen at the time, once a learner was invited to share information from his or her device, a graphical layer would appear both on the learner’s mobile device and on each screen, letting him or her know what code to enter in order to wirelessly send content up to a particular screen. (Whatever was actually being projected on a screen would drop back to a muted, roughly 25%-opacity layer so that the code would visually “pop.”) This could be especially helpful when you have multiple screens in a room.
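To make the idea a bit more concrete, here is a minimal sketch in Swift, not tied to any particular vendor’s control system (Crestron, Extron, etc.), of how each screen in a room might be assigned a short pairing code while its current content is dimmed behind that code. All of the type names, the 4-character code format, and the 25% opacity value are assumptions made purely for illustration.

```swift
import Foundation

/// One physical display in the room, identified by a label and a short pairing code.
struct RoomDisplay {
    let name: String            // e.g. "Front screen" (hypothetical label)
    let pairingCode: String     // the code a learner enters to claim this screen
}

/// Generates a short, human-readable pairing code (ambiguous characters omitted).
func makePairingCode(length: Int = 4) -> String {
    let alphabet = Array("ABCDEFGHJKMNPQRSTUVWXYZ23456789")
    return String((0..<length).map { _ in alphabet.randomElement()! })
}

/// Describes the overlay each screen would show while pairing is active:
/// the current content is dimmed and the code is drawn on top of it.
struct PairingOverlay {
    let code: String
    let backgroundOpacity: Double   // e.g. 0.25 so the code "pops"
}

func beginSharingSession(displayNames: [String]) -> [RoomDisplay] {
    return displayNames.map { RoomDisplay(name: $0, pairingCode: makePairingCode()) }
}

// Example: three screens in a classroom each get their own code.
let displays = beginSharingSession(displayNames: ["Front screen", "Left wall", "Right wall"])
for d in displays {
    let overlay = PairingOverlay(code: d.pairingCode, backgroundOpacity: 0.25)
    print("\(d.name): enter code \(overlay.code) to share (background dimmed to \(overlay.backgroundOpacity))")
}
```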

For folks at Microsoft: I could have said Mixed Reality here as well.


 

#ActiveLearning #AR #MR #IoT #AV #EdTech #M2M #MobileApps
#Sensors #Crestron #Extron #Projection #Epson #SharingContent #Wireless

 

 

The 50 Best Augmented Reality Apps for iPhone, iPad & Android Devices — from next.reality.news by Tommy Palladino

Excerpt:

Complete Anatomy 2018 +Courses (iOS): Give your preschoolers a head start on their education! Okay, clearly this app is meant for more advanced learners. Compared to the average app, you’ll end up paying through the nose with in-app purchases, but it’s really a drop in the bucket compared to the student loans students will accumulate in college. Price: Free with in-app purchases ranging from $0.99 to $44.99.

SkyView (iOS & Android): If I can wax nostalgic for a bit, I recall that one of the first mobile apps that wowed me was Google’s original Sky Map app. Now you can bring back that feeling with some augmented reality. With SkyView, you can point your phone to the sky and the app will tell you what constellations or other celestial bodies you are looking at. Price: $1.99, but there’s a free version for iOS and Android.

JigSpace (iOS): JigSpace is an app dedicated to showing users how things work (the human body, mechanical objects, etc.). And the app recently added how-to info for those who WonderHowTo do other things as well. JigSpace can now display its content in augmented reality as well, which is a brilliant application of immersive content to education. Price: Free.

NY Times (iOS & Android): The New York Times only recently adopted augmented reality as a means for covering the news, but already we’ve had the chance to see Olympic athletes and David Bowie’s freaky costumes up close. That’s a pretty good start! Price: Free with in-app purchases ranging from $9.99 to $129.99 for subscriptions.

BBC Civilisations (iOS & Android): Developed as a companion to the show of the same name, this app ends up holding its own as an AR app experience. Users can explore digital scans of ancient artifacts, learn more about their significance, and even interact with them. Sure, Indiana Jones would say this stuff belongs in a museum, but augmented reality lets you view them in your home as well. Price: Free.

SketchAR (iOS, Android, & Windows): A rare app that works on the dominant mobile platforms and HoloLens, SketchAR helps users learn how to draw. SketchAR scans your environment for your drawing surface and anchors the content there as you draw around it. As you can imagine, the app works best on HoloLens since it keeps users’ hands free to draw. Price: Free.

 

 

Sun Seeker (iOS & Android): This app displays the solar path, hour intervals, and more in augmented reality. While this becomes a unique way to teach students about the Earth’s orbit around the sun (and help refute silly flat-earthers), it can also be a useful tool for professionals. For instance, it can help photographers plan a photoshoot and see where sunlight will shine at certain times of the day. Price: $9.99.

Froggipedia (iOS): Dissecting a frog is basically a rite of passage for anyone who has graduated from primary school in the US within the past 50 years or so. Thanks to augmented reality, we can now save precious frog lives while still learning about their anatomy. The app enables users to dissect virtual frogs as if they are on the table in front of them, and without the stench of formaldehyde. Price: $3.99.

GeoGebra Augmented Reality (iOS): Who needs a graphing calculator when you can visualize equations in augmented reality? That’s what GeoGebra does. The app is invaluable for visualizing graphs. Price: Free.

 

 

Addendum:

Udacity Launches a ‘Learn ARKit’ Course Created in Collaboration with Unity — from roadtovr.com by Scott Hayden

Excerpt:

With ARKit already baked into the mobile operating system of “hundreds of millions of iPhones and iPads,” the massive potential install base means there are plenty of reasons for developers to start making new augmented reality apps for Apple’s App Store. Now Udacity, the for-profit online education site that was spawned from free Stanford University computer science classes, has created a course that it says will take one month to complete so you can start making your own AR apps for iOS.
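For readers wondering what “making your own AR apps for iOS” looks like at the very first step, the bare minimum needed to get an ARKit session running is quite small. The view controller below is an unofficial, minimal sketch (the class name and layout choices are my own); it assumes an ARKit-capable device (A9 chip or newer), iOS 11 or later, and a camera-usage entry in the app’s Info.plist.

```swift
import UIKit
import ARKit

// Minimal ARKit setup: a full-screen AR view running world tracking with
// horizontal plane detection.
class MinimalARViewController: UIViewController {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking provides six-degrees-of-freedom device tracking;
        // plane detection lets you anchor virtual content to real surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```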

 

 

From DSC:
Again, how many of these types of courses/programs are in the works right now throughout traditional institutions of higher education? My guess? Very few.

 

 

What’s keeping us from being more responsive?


Also see:



 

Everything Apple Announced — from wired.com by Arielle Pardes

Excerpt:

To much fanfare, Apple CEO Tim Cook unveiled the next crop of iPhones [on 9/12/17] at the new Steve Jobs Theater at Apple’s new headquarters in Cupertino. With the introduction of three new phones, Cook made clear that Apple’s premier product is very much still evolving. The iPhone X, he said, represents “the future of smartphones”: a platform for augmented reality, a tool for powerful computing, a screen for everything. But it’s not all about the iPhones. The event also brought with it a brand new Apple Watch, upgrades to Apple TV, and a host of other features coming to the Apple ecosystem this fall. Missed the big show? Check out our archived live coverage of Apple’s big bash, and read all the highlights below.

 

 

iPhone Event 2017 — from techcrunch.com

From DSC:
A nice listing of articles that cover all of the announcements.

 

 

Apple Bets on Augmented Reality to Sell Its Most Expensive Phone — from bloomberg.com by Alex Webb and Mark Gurman

Excerpt:

Apple Inc. packed its $1,000 iPhone with augmented reality features, betting the nascent technology will persuade consumers to pay premium prices for its products even as cheaper alternatives abound.

The iPhone X, Apple’s most expensive phone ever, was one of three new models Chief Executive Officer Tim Cook showed off during an event at the company’s new $5 billion headquarters in Cupertino, California, on Tuesday. It also rolled out an updated Apple Watch with a cellular connection and an Apple TV set-top box that supports higher-definition video.

Augmented Reality
Apple executives spent much of Tuesday’s event describing how AR is at the core of the new flagship iPhone X. Its new screen, 3-D sensors, and dual cameras are designed for AR video games and other more-practical uses such as measuring digital objects in real world spaces. Months before the launch, Apple released a tool called ARKit that made it easier for developers to add AR capabilities to their apps.

These technologies have never been available in consumer devices and “solidify the platform on which Apple will retain and grow its user base for the next decade,” Gene Munster of Loup Ventures wrote in a note following Apple’s event.

The company is also working on smart glasses that may be AR-enabled, people familiar with the plan told Bloomberg earlier this year.

 

 

Meet the iPhone X, Apple’s New High-End Handset — from wired.com by David Pierce

Excerpt:

First of all, the X looks like no other phone. It doesn’t even look like an iPhone. On the front, it’s screen head to foot, save for a small trapezoidal notch taken out of the top where Apple put selfie cameras and sensors. Otherwise, the bezel around the edge of the phone has been whittled to near-nonexistence and the home button disappeared—all screen and nothing else. The case is made of glass and stainless steel, like the much-loved iPhone 4. The notched screen might take some getting used to, but the phone’s a stunner. It goes on sale starting at $999 on October 27, and it ships November 3.

If you can’t get your hands on an iPhone X in the near future, Apple still has two new models for you. The iPhone 8 and 8 Plus both look like the iPhone 7—with home buttons!—but offer a few big upgrades to match the iPhone X. Both new models support wireless charging, run the latest A11 Bionic processor, and have 2 gigs of RAM. They also have glass backs, which gives them a glossy new look. They don’t have OLED screens, but they’re getting the same TrueTone tech as the X, and they can shoot video in 4K.

 

 

Apple Debuts the Series 3 Apple Watch, Now With Cellular — from wired.com by David Pierce

 

 

 

Ikea and Apple team up on augmented reality home design app — from curbed.com by Asad Syrkett
The ‘Ikea Place’ app lets shoppers virtually test drive furniture

 

 

The New Apple iPhone 8 Is Built for Photography and Augmented Reality — from time.com by Alex Fitzpatrick

Excerpt:

Apple says the new iPhones are also optimized for augmented reality, or AR, which is software that makes it appear that digital images exist in the user’s real-world environment. Apple SVP Phil Schiller demonstrated several apps making use of AR technology, from a baseball app that shows users player statistics when pointing their phone at the field to a stargazing app that displays the location of constellations and other celestial objects in the night sky. Gaming will be a major use case for AR as well.

Apple’s new iPhone 8 and iPhone 8 Plus will have wireless charging as well. Users will be able to charge the device by laying it down on a specially designed power mat on their desk, bedside table or inside their car, similar to how the Apple Watch charges. (Competing Android devices have long had a similar feature.) Apple is using the Qi wireless charging standard for the iPhones.

 

 

Why you shouldn’t unlock your phone with your face — from medium.com by Quincy Larson

Excerpt:

Today Apple announced its new FaceID technology. It’s a new way to unlock your phone through facial recognition. All you have to do is look at your phone and it will recognize you and unlock itself. At time of writing, nobody outside of Apple has tested the security of FaceID. So this article is about the security of facial recognition, and other forms of biometric identification in general.

Historically, biometric identification has been insecure. Cameras can be tricked. Voices can be recorded. Fingerprints can be lifted. And in many countries — including the US — the police can legally force you to use your fingerprint to unlock your phone. So they can most certainly point your phone at your face and unlock it against your will. If you value the security of your data — your email, social media accounts, family photos, the history of every place you’ve ever been with your phone — then I recommend against using biometric identification.

Instead, use a passcode to unlock your phone.

 

 

The iPhone lineup just got really compleX  — from techcrunch.com by Josh Constine

 

 

 

 

 

Apple’s ‘Neural Engine’ Infuses the iPhone With AI Smarts — from wired.com by Tom Simonite

Excerpt:

When Apple CEO Tim Cook introduced the iPhone X Tuesday he claimed it would “set the path for technology for the next decade.” Some new features are superficial: a near-borderless OLED screen and the elimination of the traditional home button. Deep inside the phone, however, is an innovation likely to become standard in future smartphones, and crucial to the long-term dreams of Apple and its competitors.

That feature is the “neural engine,” part of the new A11 processor that Apple developed to power the iPhone X. The engine has circuits tuned to accelerate certain kinds of artificial-intelligence software, called artificial neural networks, that are good at processing images and speech.

Apple said the neural engine would power the algorithms that recognize your face to unlock the phone and transfer your facial expressions onto animated emoji. It also said the new silicon could enable unspecified “other features.”

Chip experts say the neural engine could become central to the future of the iPhone as Apple moves more deeply into areas such as augmented reality and image recognition, which rely on machine-learning algorithms. They predict that Google, Samsung, and other leading mobile-tech companies will soon create neural engines of their own. Earlier this month, China’s Huawei announced a new mobile chip with a dedicated “neural processing unit” to accelerate machine learning.

 

 

 

 

ARCore: Augmented reality at Android scale — from blog.google by Dave Burke

Excerpt:

With more than two billion active devices, Android is the largest mobile platform in the world. And for the past nine years, we’ve worked to create a rich set of tools, frameworks and APIs that deliver developers’ creations to people everywhere. Today, we’re releasing a preview of a new software development kit (SDK) called ARCore. It brings augmented reality capabilities to existing and future Android phones. Developers can start experimenting with it right now.

 

Google just announced its plan to match the coolest new feature coming to the iPhone — from cnbc.com by Todd Haselton

  • Google just announced its answer to Apple’s augmented reality platform
  • New tools called ARCore will let developers enable AR on millions of Android devices

 

AR Experiments

Description:

AR Experiments is a site that features work by coders who are experimenting with augmented reality in exciting ways. These experiments use various tools like ARCore, an SDK that lets Android developers create awesome AR experiences. We’re featuring some of our favorite projects here to help inspire more coders to imagine what could be made with AR.

 

Google’s ARCore hopes to introduce augmented reality to the Android masses — from androidauthority.com by Williams Pelegrin

Excerpt:

Available as a preview, ARCore is an Android software development kit (SDK) that lets developers introduce AR capabilities to, you guessed it, Android devices. Because of how ARCore works, there is no need for folks to purchase additional sensors or hardware – it will work on existing and future Android phones.

 

 

5 reasons iPhone 8 will be an augmented reality game changer — from techradar.com by Mark Knapp
Augmented Reality (AR) will be everywhere!

Excerpt (emphasis DSC):

Most of us are at least vaguely familiar with augmented reality thanks to Pokemon Go sticking Charmanders in our sock drawers and Snapchat letting us spew stars and rainbows from our mouths, but Apple’s iPhone 8 is going to push AR to the point of ubiquity. When the iPhone 8 launches, we’ll all be seeing the world differently.

iPhones are everywhere, so AR will be everywhere!

The iPhone 8 will bring with it iOS 11, and with iOS 11 will come Apple’s AR. Since the iPhone 8 is all but guaranteed to be a best-seller and earlier iPhones and iPads will be widely updated to iOS 11, Apple will have a massive AR platform. Craig Federighi, Apple’s Senior Vice President of Software Engineering, believes it will be “the largest AR platform in the world,” which will lure AR developers en masse.

 

Apple AR has huge potential in education.
Apple has been positioning itself in the education world for years, with programs like iTunes U and iBooks, as well as efforts to get iPads into classrooms. AR already has major prospects in education, with the ability to make museum exhibits interactive and to put a visually explorable world in front of users.

 

 

Apple could guide you around your city using augmented reality — from techcrunch.com by Romain Dillet, with thanks to Woontack Woo for this resource

 

 

 

Startup Simulanis uses augmented and virtual reality to skill professionals — from economictimes.indiatimes.com by Vinay Dwivedi

Excerpt:

[India] The widening gap between the skills required by businesses and the know-how of a large number of engineering students got Raman Talwar started on his entrepreneurial journey.

Delhi-based Simulanis harnesses AR and VR technology to help companies across industries — pharmaceuticals, auto, FMCG and manufacturing — train their staff. It continues to work in the engineering education sector and has developed applications that help students visualise challenging subjects and concepts.

“Our products help students and trainees learn difficult concepts easily and interactively through immersive AR-VR and 3-D gamification methods,” says Talwar. Simulanis’ offerings include an AR learning platform, Saral, and a gamified learning platform, Protocol.

 

Also see:

 

 

 
 

ARKit: Augmented Reality on 195 million iPhones and iPads by year end — from blog.mapbox.com by Ceci Alvarez

 

Excerpt:

Apple’s ARKit just made augmented reality (AR) mainstream — and together with the Maps SDK for Unity, will fundamentally change the types of location-based apps that developers can build.

Using ARKit plus the Maps SDK for Unity allows you to record your bike ride up to Twin Peaks in Strava and project the map of your route on your coffee table. As you plan your next vacation over dinner, you’ll be able to open your Lonely Planet app and have the Big Sur coast hovering in front of you as you browse the different camp sites. Or, when you’re at work appraising a property for flood insurance, you could just tilt up your phone and see the flood plain in front of you, and which parts of the property are susceptible to flooding. Or, when you’re teaching a geology class, you can project the evolution of Pangea in 3D for students to visualize instead of being limited by 2D images in textbooks.

 

 

 

Inside Peter Jackson’s New Augmented Reality Studio — from cartoonbrew.com by Ian Failes

Excerpt:

At Apple’s recent Worldwide Developers Conference (WWDC) in San Jose, one of the stand-out demos was from Wingnut AR, the augmented reality studio started by director Peter Jackson and his partner Fran Walsh.

On stage, Wingnut AR’s creative director Alasdair Coull demonstrated a tabletop AR experience made using Apple’s upcoming augmented reality developer kit, called ARKit, and Epic Games’ Unreal Engine 4. The experience blended a real-world environment – the tabletop – with digital objects, in this case a sci-fi location complete with attacking spaceships, while being viewed live on an iPad.

 

 

 

 

Soon your desk will be a computer too — from wired.com

 

 

More Fun Uses for Augmented Reality & Your iPhone Keep Popping Up — from ar.reality.news by Juliet Gallagher

Excerpt:

Developers are really having a field day with Apple’s ARKit, announced last month. Since its release to developers, videos have been appearing all over the internet showing the different ways developers are getting creative with ARKit on iPhones and iPads.

Here are a few videos of the cool things that are happening with ARKit:

 

 

 

 


Addendum on 7/10/17:

Google Lens offers a snapshot of the future for augmented reality and AI — from androidauthority.com by Adam Sinicki


 

Google Lens is a tool that effectively brings search into the real world. The idea is simple: you point your phone at something around you that you want more information on and Lens will provide that information.

 

What a future, powerful, global learning platform will look & act like [Christian]


Learning from the Living [Class] Room:
A vision for a global, powerful, next generation learning platform

By Daniel Christian

NOTE: Having recently lost my Senior Instructional Designer position due to a staff reduction program, I am looking to help build such a platform as this. So if you are working on such a platform or know of someone who is, please let me know: danielchristian55@gmail.com.

I want to help people reinvent themselves quickly, efficiently, and cost-effectively — while providing more choice, more control to lifelong learners. This will become critically important as artificial intelligence, robotics, algorithms, and automation continue to impact the workplace.


 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

Learning from the Living [Class] Room:
A global, powerful, next generation learning platform

 

What does the vision entail?

  • A new, global, collaborative learning platform that offers more choice, more control to learners of all ages – 24×7 – and could become the organization that futurist Thomas Frey discusses here with Business Insider:

“I’ve been predicting that by 2030 the largest company on the internet is going to be an education-based company that we haven’t heard of yet,” Frey, the senior futurist at the DaVinci Institute think tank, tells Business Insider.

  • A learner-centered platform that is enabled by – and reliant upon – human beings but is backed up by a powerful suite of technologies that work together in order to help people reinvent themselves quickly, conveniently, and extremely cost-effectively
  • An AI-backed system for analyzing employment trends and opportunities, which will highlight those courses and “streams of content” that will help someone obtain the most in-demand skills
  • A system that tracks learning and, via Blockchain-based technologies, feeds all completed learning modules/courses into learners’ web-based learner profiles (a small sketch of this idea follows this list)
  • A learning platform that provides customized, personalized recommendation lists – based upon the learner’s goals
  • A platform that delivers customized, personalized learning within a self-directed course (meant for those content creators who want to deliver more sophisticated courses/modules while moving people through the relevant Zones of Proximal Development)
  • Notifications and/or inspirational quotes will be available upon request to help provide motivation, encouragement, and accountability – helping learners establish habits of continual, lifelong-based learning
  • (Potentially) An online-based marketplace, matching learners with teachers, professors, and other such Subject Matter Experts (SMEs)
  • (Potentially) Direct access to popular job search sites
  • (Potentially) Direct access to resources that describe what other companies do/provide and descriptions of any particular company’s culture (as described by current and former employees and freelancers)
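As a concrete illustration of the bullet above about feeding completed modules into Blockchain-backed learner profiles, here is a deliberately small Swift sketch of a tamper-evident completion log. It only demonstrates the hash-chaining idea; the type names, the “genesis” placeholder, and the issuer field are my own assumptions, and a real system would add digital signatures, identity management, and a shared or distributed ledger underneath.

```swift
import Foundation
import CryptoKit   // Apple framework (iOS 13+ / macOS 10.15+)

/// One completed module/course as it might be recorded in a learner's profile.
struct CompletedModule: Codable {
    let learnerID: String
    let moduleID: String
    let completedAt: Date
    let issuer: String          // the organization vouching for the completion
}

/// A single entry in a tamper-evident log: each entry stores the hash of the
/// previous entry, so altering earlier history breaks the chain. This is the
/// core idea behind blockchain-backed credentials, minus any network or consensus layer.
struct LedgerEntry: Codable {
    let record: CompletedModule
    let previousHash: String
    let hash: String
}

func sha256Hex(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

func append(_ record: CompletedModule, to ledger: [LedgerEntry]) throws -> [LedgerEntry] {
    let previousHash = ledger.last?.hash ?? "genesis"
    let encoder = JSONEncoder()
    encoder.dateEncodingStrategy = .iso8601
    var payload = try encoder.encode(record)
    payload.append(Data(previousHash.utf8))
    let entry = LedgerEntry(record: record,
                            previousHash: previousHash,
                            hash: sha256Hex(payload))
    return ledger + [entry]
}

// Example: two completions chained into one learner's profile.
do {
    var ledger: [LedgerEntry] = []
    ledger = try append(CompletedModule(learnerID: "learner-001", moduleID: "intro-to-ar",
                                        completedAt: Date(), issuer: "ExampleU"),
                        to: ledger)
    ledger = try append(CompletedModule(learnerID: "learner-001", moduleID: "arkit-basics",
                                        completedAt: Date(), issuer: "ExampleU"),
                        to: ledger)
    for entry in ledger { print(entry.record.moduleID, entry.hash.prefix(12)) }
} catch {
    print("Encoding failed:", error)
}
```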

Further details:
While basic courses will be accessible via mobile devices, the optimal learning experience will leverage two or more displays/devices. So while smaller devices (smartphones, laptops, and/or desktop workstations) will be used to communicate synchronously or asynchronously with other learners, the larger displays will deliver an excellent learning environment for times when there is:

  • A Subject Matter Expert (SME) giving a talk or making a presentation on any given topic
  • A need to display multiple things going on at once (one rough way to model such a multi-pane layout in code is sketched just after this list), such as:
    • The SME(s)
    • An application or multiple applications that the SME(s) are using
    • Content/resources that learners are submitting in real-time (think Bluescape, T1V, Prysm, other)
    • The ability to annotate on top of the application(s) and point to things w/in the app(s)
    • Media being used to support the presentation such as pictures, graphics, graphs, videos, simulations, animations, audio, links to other resources, GPS coordinates for an app such as Google Earth, other
    • Other attendees (think Google Hangouts, Skype, Polycom, or other videoconferencing tools)
    • An (optional) representation of the Personal Assistant (such as today’s Alexa, Siri, M, Google Assistant, etc.) that’s being employed via the use of Artificial Intelligence (AI)
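Since the items above essentially describe a multi-pane composition on the larger display(s), here is one rough, purely hypothetical way to model those panes in Swift. Every type name, label, and the example layout are assumptions for illustration only, not a reference to any existing product.

```swift
import Foundation

/// The kinds of content the larger display(s) might show simultaneously
/// during a live session (names are illustrative only).
enum PaneContent {
    case presenterVideo(sme: String)
    case sharedApplication(name: String)
    case learnerSubmission(author: String, title: String)
    case supportingMedia(description: String)
    case remoteAttendees(count: Int)
    case personalAssistant(name: String)   // e.g. an Alexa/Siri-style agent
}

/// One rectangular region of the large display, in normalized coordinates
/// (0...1), so the same layout can be scaled to any screen size.
struct Pane {
    let content: PaneContent
    let x: Double, y: Double, width: Double, height: Double
}

/// A very rough example layout: SME video on the left, the app being
/// demonstrated on the right, learner submissions and attendees along the bottom.
let exampleLayout: [Pane] = [
    Pane(content: .presenterVideo(sme: "Dr. Example"), x: 0.0, y: 0.0, width: 0.4, height: 0.7),
    Pane(content: .sharedApplication(name: "Google Earth"), x: 0.4, y: 0.0, width: 0.6, height: 0.7),
    Pane(content: .learnerSubmission(author: "Learner A", title: "Field notes"), x: 0.0, y: 0.7, width: 0.5, height: 0.3),
    Pane(content: .remoteAttendees(count: 12), x: 0.5, y: 0.7, width: 0.5, height: 0.3)
]

for pane in exampleLayout {
    print(pane.content, "at", pane.x, pane.y, pane.width, pane.height)
}
```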

This new learning platform will also feature:

  • Voice-based commands to drive the system via Natural Language Processing (NLP) (a toy routing sketch follows this list)
  • Language translation (using techs similar to what’s being used in Translate One2One, an earpiece powered by IBM Watson)
  • Speech-to-text capabilities for use w/ chatbots, messaging, inserting discussion board postings
  • Text-to-speech capabilities as an assistive technology and also for everyone to be able to be mobile while listening to what’s been typed
  • Chatbots
    • For learning how to use the system
    • For asking questions of – and addressing any issues with – the organization owning the system (credentials, payments, obtaining technical support, etc.)
    • For asking questions within a course
  • As many profiles as needed per household
  • (Optional) Machine-to-machine-based communications to automatically launch the correct profile when the system is initiated (from one’s smartphone, laptop, workstation, and/or tablet to a receiver for the system)
  • (Optional) Voice recognition to efficiently launch the desired profile
  • (Optional) Facial recognition to efficiently launch the desired profile
  • (Optional) Upon system launch, to immediately return to where the learner previously left off
  • The capability of the webcam to recognize objects and bring up relevant resources for that object
  • A built in RSS feed aggregator – or a similar technology – to enable learners to tap into the relevant “streams of content” that are constantly flowing by them
  • Social media dashboards/portals – providing quick access to multiple sources of content and whereby learners can contribute their own “streams of content”
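And as a nod to the voice-based commands item above, here is a toy sketch of routing a transcribed utterance (assumed to arrive from a speech-to-text layer) to an intent. It is deliberately naive keyword matching; a real platform would hand the transcript to a proper NLP/NLU service, and every phrase and name here is an assumption for illustration.

```swift
import Foundation

/// The handful of intents this toy sketch understands.
enum LearningIntent {
    case resumeCourse(named: String)
    case searchTopic(String)
    case openDiscussionBoard
    case unknown
}

/// A deliberately naive, keyword-based router standing in for real Natural
/// Language Processing.
func route(transcript: String) -> LearningIntent {
    let text = transcript.lowercased()
    if text.hasPrefix("resume ") {
        return .resumeCourse(named: String(text.dropFirst("resume ".count)))
    }
    if text.hasPrefix("search for ") {
        return .searchTopic(String(text.dropFirst("search for ".count)))
    }
    if text.contains("discussion board") {
        return .openDiscussionBoard
    }
    return .unknown
}

// Example transcripts, as they might arrive from a speech-to-text layer.
for utterance in ["Resume statistics 101", "Search for augmented reality", "Open the discussion board"] {
    print(route(transcript: utterance))
}
```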

In the future, new forms of Human Computer Interaction (HCI) such as Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) will be integrated into this new learning environment – providing entirely new means of collaborating with one another.

Likely players:

  • Amazon – personal assistance via Alexa
  • Apple – personal assistance via Siri
  • Google – personal assistance via Google Assistant; language translation
  • Facebook — personal assistance via M
  • Microsoft – personal assistance via Cortana; language translation
  • IBM Watson – cognitive computing; language translation
  • Polycom – videoconferencing
  • Blackboard – videoconferencing, application sharing, chat, interactive whiteboard
  • T1V, Prysm, and/or Bluescape – submitting content to a digital canvas/workspace
  • Samsung, Sharp, LG, and others – for large displays with integrated microphones, speakers, webcams, etc.
  • Feedly – RSS aggregator
  • _________ – for providing backchannels
  • _________ – for tools to create videocasts and interactive videos
  • _________ – for blogs, wikis, podcasts, journals
  • _________ – for quizzes/assessments
  • _________ – for discussion boards/forums
  • _________ – for creating AR, MR, and/or VR-based content

 

 
