From DSC: If we had more beacons on our campus (a Christian liberal arts college), I could see how we could offer a variety of things in new ways:
For example, we might place beacons along the main walkways of our campus; when we approach one of them, a piece of advice or teaching could appear in an app on our mobile devices (a rough sketch of that trigger appears below). Examples could include:
Micro-tips on prayer from John Calvin, Martin Luther, or Augustine (i.e., 1 or 2 small tips at a time; could change every day or every week)
Or, for a current, campus-wide Bible study, the app could show a question for that week’s study; you could reflect on that question as you’re walking around
Or, for musical events…when one walks by the Covenant Fine Arts Center, one could get that week’s schedule of performances or what’s currently showing in the Art Gallery
Pieces of scripture, with links to Biblegateway.com or other sites
Further information re: what’s being displayed on posters within the hallways — works that might be done by faculty members and/or by students
Etc.
A person could turn the app’s notifications on or off at any time. The app would encourage greater exercise; i.e., the more you walk around, the more tips you get.
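A minimal sketch of how one of these beacon-triggered tips might work on iOS, assuming an iBeacon-style deployment; the UUID, region identifier, and notification text below are placeholders, not real campus values:

```swift
import CoreLocation
import UserNotifications

// Sketch only: surfaces a micro-tip when the phone enters a campus walkway beacon region.
// The proximity UUID, identifier, and tip text are placeholders.
final class CampusBeaconListener: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let walkwayRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "campus.walkway.tips")

    func start() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()   // beacon region monitoring needs "Always"
        UNUserNotificationCenter.current()
            .requestAuthorization(options: [.alert, .sound]) { _, _ in }
        locationManager.startMonitoring(for: walkwayRegion)
    }

    // Fired when the phone comes within range of a walkway beacon.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        guard region.identifier == walkwayRegion.identifier else { return }
        let content = UNMutableNotificationContent()
        content.title = "Today's micro-tip"
        content.body = "A short excerpt from Augustine on prayer (placeholder text)."
        UNUserNotificationCenter.current().add(
            UNNotificationRequest(identifier: "walkway-tip", content: content, trigger: nil))
    }
}
```

A real app would presumably pull the day's tip from a server keyed to each beacon's major/minor values, and would honor the notification on/off preference mentioned above.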
A school bus, virtual reality, & an out-of-this-world journey — from goodmenproject.com
“Field Trip To Mars” is the barrier-shattering outcome of an ambitious mission to give a busload of people the same Virtual Reality experience – going to Mars.
Excerpt:
Inspiration was Lockheed‘s goal when it asked its creative resources, led by McCann, to create the world’s first mobile group Virtual Reality experience. As one creator notes, VR now is essentially a private, isolating experience. But wouldn’t it be cool to give a busload of people the same, simultaneous VR experience? And then – just to make it really challenging – put the whole thing on wheels?
“Field Trip To Mars” is the barrier-shattering outcome of this ambitious mission.
From DSC: This is incredible! Very well done. The visual experience even tracks the bus’s speed and its turns.
The United States Department of Education (ED) has formally kicked off a new competition designed to encourage the development of virtual and augmented reality concepts for education.
Dubbed the EdSim Challenge, the competition is aimed squarely at developing students’ career and technical skills — it’s funded through the Carl D. Perkins Career and Technical Education Act of 2006 — and calls on developers and ed tech organizations to develop concepts for “computer-generated virtual and augmented reality educational experiences that combine existing and future technologies with skill-building content and assessment. Collaboration is encouraged among the developer community to make aspects of simulations available through open source licenses and low-cost shareable components. ED is most interested in simulations that pair the engagement of commercial games with educational content that transfers academic, technical, and employability skills.”
Virtual reality boosts students’ results — from raconteur.net by Edwin Smith
Virtual and augmented reality can enable teaching and training in situations which would otherwise be too hazardous, costly or even impossible in the real world
Excerpt:
More recently, though, the concept described in Aristotle’s Nicomachean Ethics has been bolstered by further scientific evidence. Last year, a University of Chicago study found that students who physically experience scientific concepts, such as the angular momentum acting on a bicycle wheel spinning on an axle that they’re holding, understand them more deeply and also achieve significantly improved scores in tests.
Virtual and augmented reality are shaking up sectors — from raconteur.net by Sophie Charara
Both virtual and augmented reality have huge potential to leap from visual entertainment to transform the industrial and service sectors
Microsoft might not have envisioned its HoloLens headset as a war helmet, but that’s not stopping Ukrainian company LimpidArmor from experimenting. Defence Blog reports that LimpidArmor has started testing military equipment that includes a helmet with Microsoft’s HoloLens headset integrated into it.
The helmet is designed for tank commanders to use alongside a Circular Review System (CRS) of cameras located on the sides of armored vehicles. Microsoft’s HoloLens gathers feeds from the cameras outside to display them in the headset as a full 360-degree view. The system even includes automatic target tracking, and the ability to highlight enemy and allied soldiers and positions.
Bring your VR to work — from itproportal.com by Timo Elliott, Josh Waddell
With all the hype, there’s surprisingly little discussion of the latent business value which VR and AR offer.
Excerpt:
With all the hype, there’s surprisingly little discussion of the latent business value which VR and AR offer — and that’s a blind spot that companies and CIOs can’t afford to have. It hasn’t been that long since consumer demand for the iPhone and iPad forced companies, grumbling all the way, into finding business cases for them. Gartner has said that the next five to ten years will bring “transparently immersive experiences” to the workplace. They believe this will introduce “more transparency between people, businesses, and things” and help make technology “more adaptive, contextual, and fluid.”
If digitally enhanced reality generates even half as much consumer enthusiasm as smartphones and tablets, you can expect to see a new wave of consumerisation of IT as employees who have embraced VR and AR at home insist on bringing it to the workplace. This wave of consumerisation could have an even greater impact than the last one. Rather than risk being blindsided for a second time, organisations would be well advised to take a proactive approach and be ready with potential business uses for VR and AR technologies by the time they invade the enterprise.
In Gartner’s latest emerging technologies hype cycle, Virtual Reality is already on the Slope of Enlightenment, with Augmented Reality following closely.
One place where students are literally immersed in VR is at Carnegie Mellon University’s Entertainment Technology Center (ETC). ETC offers a two-year Master of Entertainment Technology program (MET) launched in 1998 and cofounded by the late Randy Pausch, author of “The Last Lecture.”
MET starts with an intense boot camp called the “immersion semester,” in which students take a Building Virtual Worlds (BVW) course and a leadership course, along with courses in improvisational acting and visual storytelling. Pioneered by Pausch, BVW challenges students in small teams to create virtual reality worlds quickly over a period of two weeks, culminating in a presentation festival every December.
Apple patents augmented reality mapping system for iPhone — from appleinsider.com by Mikey Campbell
Apple on Tuesday was granted a patent detailing an augmented reality mapping system that harnesses iPhone hardware to overlay visual enhancements onto live video, lending credence to recent rumors suggesting the company plans to implement an iOS-based AR strategy in the near future.
Virtual reality is a combination of life-like images, effects and sounds that creates an imaginary world in front of our eyes.
But what if we could also imitate more complex sensations like the feeling of falling rain, a beating heart or a cat walking? What if we could distinguish between a light sprinkle and a heavy downpour in a virtual experience?
Disney Research — a network of research laboratories supporting The Walt Disney Company — has announced the development of a 360-degree virtual reality application offering a library of feel effects and full body sensations.
Literature class meets virtual reality — from blog.cospaces.io by Susanne Krause
Not every student finds it easy to let a novel come to life in their imagination. Could virtual reality help? Tiffany Capers gave it a try: She let her 7th graders build settings from Lois Lowry’s “The Giver” with CoSpaces and explore them in virtual reality. And: they loved it.
After generations of peering into a microscope to examine cells, scientists could simply stroll straight through one.
Calling his project the “stuff of science fiction,” John McGhee, director of the 3D Visualisation Aesthetics Lab at the University of New South Wales (UNSW), is letting people come face-to-face with a breast cancer cell.
In contrast, VR has been described as the “ultimate empathy machine.” It gives us a way to virtually put ourselves in someone else’s shoes and experience the world the way they do.
This isn’t meant to be an exhaustive list, but if I missed something major, please tell me and I’ll add it. Also, please reach out if you’re working on anything cool in this space: sarah(at)accomplice(dot)co.
Hand and finger tracking, gesture interfaces, and grip simulation:
From DSC: How long before recommendation engines like this can be filtered/focused down to just display apps, channels, etc. that are educational and/or training related (i.e., a recommendation engine to suggest personalized/customized playlists for learning)?
That is, in the future, will we have personalized/customized playlists for learning on our Apple TVs — as well as on our mobile devices — with the assessment results from the modules or courses we take being sent to:
A credentials database on LinkedIn (via blockchain) and/or
A credentials database at the college(s) or university(ies) that we’re signed up with for lifelong learning (via blockchain)
and/or
Our cloud-based learning profiles — which could then feed a variety of HR-related systems used to find talent (via blockchain)? (A rough sketch of what one such credential hand-off might look like follows the questions below.)
Will participants in MOOCs, virtual K-12 schools, homeschoolers, and more take advantage of learning from home?
Will solid ROIs from having thousands of participants each paying a smaller amount (to take your course virtually) enable higher production values?
Will bots and/or human tutors be instantly accessible from our couches?
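Purely as a thought experiment, here is a rough sketch of the kind of record such a pipeline might hand off; every type, field, and endpoint below is hypothetical, and none of it reflects a real LinkedIn, university, or HR API:

```swift
import Foundation

// Hypothetical credential record emitted after a learner finishes a module or course.
// The field names and the idea of a ledger transaction ID are illustrative assumptions.
struct LearningCredential {
    let learnerID: String
    let moduleTitle: String
    let issuingInstitution: String
    let assessmentScore: Double        // e.g., 0.87
    let completedOn: Date
    let ledgerTransactionID: String?   // set once the record has been anchored on a blockchain
}

// One record could fan out to several destinations: a credentials database,
// a lifelong-learning profile, an HR-facing talent system, and so on.
func publish(_ credential: LearningCredential, to destinations: [URL]) {
    for destination in destinations {
        var request = URLRequest(url: destination)
        request.httpMethod = "POST"
        request.addValue("application/json", forHTTPHeaderField: "Content-Type")
        // JSON encoding, signing, and ledger anchoring are omitted in this sketch.
        URLSession.shared.dataTask(with: request).resume()
    }
}
```

The harder questions are probably less about the record itself than about who is allowed to write to these profiles and how the receiving systems verify the ledger entry.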
From DSC: Here’s an idea that came to my mind the other day as I was walking by a person who was trying to put some books back onto the shelves within our library.
From DSC: Perhaps this idea is not very timely…as many collections of books will likely continue to be digitized and made available electronically. But preservation is still a goal for many libraries out there.
Today, the IoT sits at the peak of Gartner’s Hype Cycle. It’s probably not surprising that industry is abuzz with the promise of streaming sensor data. The oft quoted “50 billion connected devices by 2020!” has become a rallying cry for technology analysts, chip vendors, network providers, and other proponents of a deeply connected, communicating world. What is surprising is that academia has been relatively slow to join the parade, particularly when the potential impacts are so exciting. Like most organizations that manage significant facilities, universities stand to benefit by adopting the IoT as part of their management strategy. The IoT also affords new opportunities to improve the customer experience. For universities, this means the ability to provide new student services and improve on those already offered. Perhaps most surprisingly, the IoT represents an opportunity to better engage a diverse student base in computer science and engineering, and to amplify these programs through meaningful interdisciplinary collaboration.
…
The potential benefits of the IoT to the academic community extend beyond facilities management to improving our students’ experience. The lowest hanging fruit can be harvested by adapting some of the smart city applications that have emerged. What student hasn’t shown up late to class after circling the parking lot looking for a space? Ask any student at a major university if it would improve their campus experience to be able to check on their smart phones which parking spots were available. The answer will be a resounding “yes!” and there’s nothing futuristic about it. IoT parking management systems are commercially available through a number of vendors. This same type of technology can be adapted to enable students to find open meeting rooms, computer facilities, or café seating. What might be really exciting for students living in campus dormitories: A guarantee that they’ll never walk down three flights of stairs balancing two loads of dirty laundry to find that none of the washing machines are available. On many campuses, the washing machines are already network-connected to support electronic payment; availability reporting is a straightforward extension.
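Even the laundry example is within easy reach. Below is a minimal client-side sketch, assuming a campus endpoint that returns each machine's status as JSON; the URL, field names, and JSON shape are assumptions, not a real vendor API:

```swift
import Foundation

// Sketch only: fetches washer availability from a hypothetical campus endpoint.
struct WasherStatus {
    let machineID: String
    let isAvailable: Bool
}

func fetchWasherStatuses(from endpoint: URL,
                         completion: @escaping ([WasherStatus]) -> Void) {
    URLSession.shared.dataTask(with: endpoint) { data, _, _ in
        guard let data = data,
              let raw = try? JSONSerialization.jsonObject(with: data),
              let entries = raw as? [[String: Any]] else {
            completion([])
            return
        }
        // Keep only the entries we can parse; assumed keys: "machineID", "available".
        let statuses = entries.compactMap { entry -> WasherStatus? in
            guard let id = entry["machineID"] as? String,
                  let available = entry["available"] as? Bool else { return nil }
            return WasherStatus(machineID: id, isAvailable: available)
        }
        completion(statuses)
    }.resume()
}

// Usage (hypothetical URL):
// fetchWasherStatuses(from: URL(string: "https://laundry.example.edu/api/machines")!) { statuses in
//     let openMachines = statuses.filter { $0.isAvailable }
//     print("\(openMachines.count) washers free right now")
// }
```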
Also see:
2016 Innovators Awards | A Location-Aware App for Exploring the Library — from campustechnology.com by Meg Lloyd
To help users access rich information resources on campus, the University of Oklahoma Libraries created a mobile app with location-based navigation and “hyperlocal” content.
Category: Education Futurists
Institution: University of Oklahoma
Project: OU Libraries NavApp
Project lead: Matt Cook, emerging technologies librarian
From DSC: I have attended the Next Generation Learning Spaces Conference for the past two years. Both conferences were very solid and they made a significant impact on our campus, as they provided the knowledge, research, data, ideas, contacts, and the catalyst for us to move forward with building a Sandbox Classroom on campus. This new, collaborative space allows us to experiment with different pedagogies as well as technologies. As such, we’ve been able to experiment much more with active learning-based methods of teaching and learning. We’re still in Phase I of this new space, and we’re learning new things all of the time.
For the upcoming conference in February, I will be moderating a New Directions in Learning panel on the use of augmented reality (AR), virtual reality (VR), and mixed reality (MR). Time permitting, I hope that we can also address other promising, emerging technologies that are heading our way such as chatbots, personal assistants, artificial intelligence, the Internet of Things, tvOS, blockchain and more.
The goal of this quickly-moving, engaging session will be to provide a smorgasbord of ideas to generate creative, innovative, and big thinking. We need to think about how these topics, trends, and technologies relate to what our next generation learning environments might look like in the near future — and put these things on our radars if they aren’t already there.
Key takeaways for the panel discussion:
Reflections regarding the affordances that new developments in Human Computer Interaction (HCI) — such as AR, VR, and MR — might offer for our learning and our learning spaces (or is our concept of what constitutes a learning space about to significantly expand?)
An update on the state of the approaching ed tech landscape
Creative, new thinking: What might our next generation learning environments look like in 5-10 years?
I’m looking forward to catching up with friends, meeting new people, and to the solid learning that I know will happen at this conference. I encourage you to check out the conference and register soon to take advantage of the early bird discounts.
ARMONK, N.Y., Sept. 28, 2016 /PRNewswire/ — Teachers will have access to a new, first-of-its-kind, free tool using IBM’s innovative Watson cognitive technology that has been trained by teachers and designed to strengthen teachers’ instruction and improve student achievement, the IBM Foundation and the American Federation of Teachers announced today.
Hundreds of elementary school teachers across the United States are piloting Teacher Advisor with Watson – an innovative tool by the IBM Foundation that provides teachers with a complete, personalized online resource. Teacher Advisor enables teachers to deepen their knowledge of key math concepts, access high-quality vetted math lessons and acclaimed teaching strategies and gives teachers the unique ability to tailor those lessons to meet their individual classroom needs.
…
Litow said there are plans to make Teacher Advisor available to all elementary school teachers across the U.S. before the end of the year.
In this first phase, Teacher Advisor offers hundreds of high-quality vetted lesson plans, instructional resources, and teaching techniques, which are customized to meet the needs of individual teachers and the particular needs of their students.
Also see:
Educators can also access high-quality videos on teaching techniques to master key skills and bring a lesson or teaching strategy to life in their classroom.
From DSC: Today’s announcement involved personalization and giving customized directions, and it caused my mind to go in a slightly different direction. (IBM, Google, Microsoft, Apple, Amazon, and others like Smart Sparrow are likely also thinking about this type of direction as well. Perhaps they’re already there…I’m not sure.)
But given the advancements in machine learning/cognitive computing (where example applications include optical character recognition (OCR) and computer vision), how much longer will it be before software can remotely or locally “see” what a third grader wrote down for a given math problem (via character and symbol recognition) and check the student’s work? If the answer is incorrect, the algorithms will likely know where the student went wrong. The software will be able to ascertain what the student did wrong and then show them how the problem should be solved (either via hints or by showing the fully worked problem to the student — per the teacher’s instructions/admin settings). Perhaps, via natural language processing, this process could be verbalized as well.
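Setting the OCR piece aside, the checking step itself is easy to prototype today. A minimal sketch, assuming a hypothetical recognition stage has already transcribed the student's written work into one "expression = answer" string per line, and handling only simple two-term addition for illustration:

```swift
import Foundation

// Sketch only: flags where a student's addition work goes wrong.
// Input is assumed to come from an earlier OCR/handwriting-recognition step.
struct StepFeedback {
    let line: Int
    let message: String
}

func checkAdditionWork(_ transcribedLines: [String]) -> [StepFeedback] {
    var feedback: [StepFeedback] = []
    for (index, line) in transcribedLines.enumerated() {
        let sides = line.components(separatedBy: "=")
        guard sides.count == 2,
              let claimed = Int(sides[1].trimmingCharacters(in: .whitespaces)) else {
            feedback.append(StepFeedback(line: index + 1, message: "Couldn't read this step."))
            continue
        }
        let operands = sides[0].components(separatedBy: "+")
            .compactMap { Int($0.trimmingCharacters(in: .whitespaces)) }
        guard operands.count == 2 else { continue }
        let expected = operands[0] + operands[1]
        if claimed != expected {
            feedback.append(StepFeedback(
                line: index + 1,
                message: "\(operands[0]) + \(operands[1]) should be \(expected), not \(claimed)."))
        }
    }
    return feedback
}

// Example: checkAdditionWork(["47 + 38 = 75"]) flags line 1 and suggests 85.
```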
Further questions/thoughts/reflections then came to my mind:
Will we have bots that teachers can use to teach different subjects? (“Watson may even ask the teacher additional questions to refine its response, honing in on what the teacher needs to address certain challenges.”)
Will we have bots that students can use to get the basics of a given subject/topic/equation?
Will instructional designers — and/or trainers in the corporate world — need to modify their skillsets to develop these types of bots?
Will teachers — as well as schools of education in universities and colleges — need to modify their toolboxes and their knowledgebases to take advantage of these sorts of developments?
How might the corporate world take advantage of these trends and technologies?
Will MOOCs begin to incorporate these sorts of technologies to aid in personalized learning?
What sorts of delivery mechanisms could be involved? Will we be tapping into learning-related bots from our living rooms or via our smartphones?
IBM Watson will help educators improve teaching skills — from zdnet.com by Natalie Gagliordi
The IBM Foundation has teamed with teachers and the American Federation of Teachers union to create an AI-based lesson plan tool called Teacher Advisor.
I believe we are moving into the fourth era of personal computing. The first era was characterized by the emergence of the PC. The second by the web and the browser, and the third by mobile and apps.
The fourth personal computing platform will be a combination of IOT, wearable and AR-based clients using speech and gesture, connected over 4G/5G networks to PA, CaaS and social networking platforms that draw upon a new class of cloud-based AI to deliver highly personalized access to information and services.
So what does the fourth era of personal computing look like? It’s a world of smart objects, smart spaces, voice control, augmented reality, and artificial intelligence.
Soon, hundreds of millions of mobile users in China will have direct access to an augmented reality smartphone platform on their smartphones.
Baidu, China’s largest search engine, unveiled an AR platform today called DuSee that will allow China’s mobile users the opportunity to test out smartphone augmented reality on their existing devices. The company also detailed that they plan to integrate the technology directly into their flagship apps, including the highly popular Mobile Baidu search app.
From DSC: With “a billion iOS devices out in the world,” I’d say that’s a good, safe call…at least for one of the avenues/approaches via which AR will be offered.
Recently it’s appeared that augmented reality (AR) is gaining popularity as a professional platform, with Visa testing it as a brand new e-commerce solution and engineering giant Aecom piloting a project that will see the technology used in their construction projects across three continents. Now, it’s academia’s turn with Deakin University in Australia announcing that it plans to use the technology as a teaching tool in its medicine and engineering classrooms.
As reported by ITNews, AR technology will be introduced to Deakin University’s classes from December, with the first AR apps to be used during the university’s summer programme which runs from November to March, before the technology is distributed more widely in the first semester of 2017.
Creating realistic interactions with objects and people in virtual reality is one of the industry’s biggest challenges right now, but what about for augmented reality?
That’s an area that researchers from Massachusetts Institute of Technology’s (MIT’s) Computer Science and Artificial Intelligence Laboratory (CSAIL) have recently made strides in with what they call Interactive Dynamic Video (IDV). First designed with video in mind, PhD student Abe Davis has created a unique concept that could represent a way to not only interact with on-screen objects, but for those objects to also realistically react to the world around them.
If that sounds a little confusing, then we recommend taking a look at…
Venice Architecture Biennale 2016: augmented reality will revolutionise the architecture and construction industries according to architect Greg Lynn, who used Microsoft HoloLens to design his contribution to the US Pavilion at the Venice Biennale (+ movie).
Augmented Reality In Healthcare Will Be Revolutionary — from medicalfuturist.com
Augmented reality is one of the most promising digital technologies at present – look at the success of Pokémon Go – and it has the potential to change healthcare and everyday medicine completely for physicians and patients alike.
Excerpt:
Nurses can find veins easier with augmented reality
The start-up company AccuVein is using AR technology to make both nurses’ and patients’ lives easier. AccuVein’s marketing specialist, Vinny Luciano, said 40% of IVs (intravenous injections) miss the vein on the first stick, with the numbers getting worse for children and the elderly. AccuVein uses augmented reality via a handheld scanner that projects over skin and shows nurses and doctors where veins are in the patients’ bodies. Luciano estimates that it’s been used on more than 10 million patients, making finding a vein on the first stick 3.5x more likely. Such technologies could assist healthcare professionals and extend their skills.
But even with a wealth of hardware partners over the years, Urbach says he’d never tried a pair of consumer VR glasses that could effectively trick his brain until he began working with Osterhout Design Group (ODG).
ODG has previously made military night-vision goggles, and enterprise-focused glasses that overlay digital objects onto the real world. But now the company is partnering with OTOY, and will break into the consumer AR/VR market with a model of glasses codenamed “Project Horizon.”
The glasses work by using a pair of micro OLED displays to reflect images into your eyes at 120 frames-per-second. And the quality blew Urbach away, he tells Business Insider.
“You could overlay images onto the real world in a way that didn’t appear ‘ghost-like.’ We have the ability to do true opacity matching,” he says.
Live streaming VR events continue to make the news. First, it was the amazing Reggie Watts performance on AltspaceVR. Now the startup Rivet has launched an iOS app (sorry, Android still to come) for live streams of concerts. As the musician, record producer and visual artist Brian Eno once said,
You can’t really imagine music without technology.
In the near future, we may not be able to imagine a live performance without the option of a live stream in virtual reality.
While statistics on VR use in K-12 schools and colleges have yet to be gathered, the steady growth of the market is reflected in the surge of companies (including zSpace, Alchemy VR and Immersive VR Education) solely dedicated to providing schools with packaged educational curriculum and content, teacher training and technological tools to support VR-based instruction in the classroom. Myriad articles, studies and conference presentations attest to the great success of 3D immersion and VR technology in hundreds of classrooms in educationally progressive schools and learning labs in the U.S. and Europe.
…
Much of this early foray into VR-based learning has centered on the hard sciences — biology, anatomy, geology and astronomy — as the curricular focus and learning opportunities are notably enriched through interaction with dimensional objects, animals and environments. The World of Comenius project, a biology lesson at a school in the Czech Republic that employed a Leap Motion controller and specially adapted Oculus Rift DK2 headsets, stands as an exemplary model of innovative scientific learning.
In other areas of education, many classes have used VR tools to collaboratively construct architectural models, recreations of historic or natural sites and other spatial renderings. Instructors also have used VR technology to engage students in topics related to literature, history and economics by offering a deeply immersive sense of place and time, whether historic or evolving.
“Perhaps the most utopian application of this technology will be seen in terms of bridging cultures and fostering understanding among young students.”
Business will be a leading use case for AR over the next several years. An example is DHL’s use of wearables and AR in a warehouse pilot program to achieve improvement in the picking process. Other AR apps allow for automatic sign translation, or in a hands-busy environment, technicians can get equipment maintenance instructions through their HMDs while keeping focused on the task at hand.
“Virtual reality (VR) and AR capabilities will merge with the digital mesh to form a more seamless system of devices capable of orchestrating a flow of information that comes to the user as hyperpersonalized and relevant apps and services,” said Mr. Blau. “Integration across multiple mobile, wearable, IoT and sensor-rich environments (the digital mesh) will extend immersive applications beyond isolated and single-person experiences. Rooms and spaces will become active with things, and their connection through the mesh will appear and work in conjunction with immersive virtual worlds.”
From DSC: I would add those working in higher education to this as well — as well as vendors working on education-related solutions.
Just as the world’s precious artworks and monuments need a touch-up to look their best, the home we’ve built to host the world’s cultural treasures online needs a lick of paint every now and then. We’re ready to pull off the dust sheets and introduce the new Google Arts & Culture website and app, by the Google Cultural Institute. The app lets you explore anything from cats in art since 200 BCE to the color red in Abstract Expressionism, and everything in between. Our new tools will help you discover works and artifacts, allowing you to immerse yourself in cultural experiences across art, history and wonders of the world—from more than a thousand museums across 70 countries…
Paper 53 is a brilliant app which combines drawings, notes, photos and sketches. It is available on the App Store. The app is simple and user-friendly; just use your finger (or a stylus) to draw, paint, select colours, erase and lots more.
Below is a collection of some new educational web tools and mobile apps to try out in your instruction. The purpose is to keep you updated about the new releases in the EdTech world and empower you with the necessary technology to take your teaching and learning to the next level. Some of the things you can do with these applications include: Learn English pronunciation from native speakers, easily save web content to Google, search YouTube without having to stop the video playing, learn basic math skills through challenging games and activities, unshare sent files in Gmail, create interactive and engaging videos by adding polls, short questions and quizzes, create beautiful presentations and animations using drawn images and stick figures and many more.
This year the Center for Teaching hosted a few educational technology working groups for faculty, staff, and students interested in exploring ways particular technologies might meet their instructional goals. One of the groups investigated the use of digital timeline tools, like Tiki-Toki and TimelineJS, that facilitate the creation of online, multimedia, interactive, and collaborative timelines. I had used such tools in my own teaching, having asked my 2010 writing seminar students to create a class timeline on the history of cryptography, and I was eager to talk with other instructors about the potential of student-produced timelines.
In Silicon Valley and elsewhere there’s currently an AI arms race going on. The first wave of this race is centered around artificial virtual assistants that are poised to become our new digital best friends in the very near future. While many people are familiar with Apple’s popular AI virtual assistant, Siri, there are four other main players in the AI virtual assistant space.
Introducing the new mobile app for Microsoft Flow — from flow.microsoft.com with thanks to Mr. Michael Haan, Technology Integration Specialist/Purchasing w/ Calvin Information Technology, for this resource
Creating high-quality online courses is getting increasingly complex—requiring an ever-growing set of skills. Faculty members can’t do it all, nor can instructional designers, nor can anyone else. As time goes by, new entrants and alternatives to traditional institutions of higher education will likely continue to appear on the higher education landscape—the ability to compete will be key.
…
For example, will there be a need for the following team members in your not-too-distant future?
Human Computer Interaction (HCI) Specialists: those with knowledge of how to leverage Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) in order to create fun and engaging learning experiences (while still meeting the learning objectives)
Apple made several interesting announcements today during its WWDC 2016 keynote. Here are the major announcements from the event.
iOS 10 — the big focus with this release is on Siri.
macOS Sierra — The next version of Apple’s operating system for desktops and laptops is dropping the ‘X’, opting instead for the new ‘macOS’ branding. With macOS, Siri makes its debut on Apple’s traditional computers for the first time.
watchOS
tvOS — The next major version of the Apple TV’s software will offer single sign-on for cable logins, along with its own dark mode. There will also be a number of Siri enhancements, as well as improvements for watching live TV.
Highlights from Apple’s WWDC 2016 Keynote — from fastcompany.com
From Messages to Music, and Siri to Apple Pay on the web, here are the most important announcements from Apple’s event today.
Apple today announced Swift Playgrounds for the iPad, a new project that aims to teach kids to code in Swift.
When you first open it, Swift Playground presents you with a number of basic coding lessons, as well as challenges. The interface looks somewhat akin to Codecademy, but it’s far more graphical and playful, which makes sense, given that the target audience is kids. Most of the projects seem to involve games and fun little animations to keep kids motivated.
To make coding on the iPad a bit easier, Apple is using a special keyboard with a number of shortcuts and other features that will make it easier to enter code.
The biggest news for Siri from the WWDC keynote: Apple’s assistant is now open to third party developers.
…
Apple is now opening Siri to all of those potential interactions for developers through SiriKit. Siri will be able to access messaging, photos, search, ride booking through Uber or Lyft etc., payments, health and fitness and so forth. Siri will also be incorporated into Apple CarPlay apps to make it easier to interact with the assistant while driving.
Speaking at WWDC 2016, Apple announced that it is bringing the power of machine learning to the Photos app in iOS 10. With machine learning, the Photos app will include object and scene recognition thanks to what Apple calls “Advanced Computer Vision.” For example, the app will be able to automatically pick out specific animals, features and more. Facial recognition is also available, all done locally on the iPhone with automatic people albums.
Apple has announced its newest and easiest way to control any and all HomeKit accessories that you may have in your house: Home. With Home, you’ll be able to control all of your accessories, including Air Conditioners, cameras, door locks and other new categories. 3D Touch will be able to give you deeper controls at just a press, and notifications from these will also have 3D Touch functionality as well.
Here’s what Apple is bringing to the Apple TV — from fastcompany.com
tvOS is taking a step forward with updates to Siri, and new features such as single sign on, dark mode, and more.
Excerpt:
Less than nine months after the first version of tvOS, there are now over 6,000 native apps for the Apple TV. Of those apps, 1,300 are for streaming video. Popular over-the-top Internet television service Sling TV arrives on the Apple TV today. Live Fox Sports Go streaming will come this summer. Speaking of apps: Apple is introducing a new Apple TV Remote app for iOS that allows people to navigate tvOS using Siri from their iPhone and iPad.
… Download apps on iPhone and get them on Apple TV
Now when you download an app on your iPad or iPhone, if there is an Apple TV version of the app, it will download to your Apple TV automatically.
Apple iOS 10 “Memories” turns old photos into editable mini-movies — from techcrunch.com by Josh Constine
Using local, on-device facial recognition and AI detection of what’s in your images, it can combine photos and videos into themed mini-movies complete with transitions and a soundtrack.
As a reminder: You shouldn’t install developer betas on your primary devices if you want them to work.
This is our yearly reminder, folks: Unless you’re a developer with a secondary iPhone or Mac, we strongly, strongly urge you to consider not installing developer betas on your devices.
It’s not because we don’t want you to have fun: iOS 10, watchOS, tvOS, and macOS have some phenomenal features coming this Fall. But they’re beta seeds for a reason: These features are not fully baked, may crash at will, and probably will slow down or crash your third-party applications.