7 things you should know about artificial intelligence in teaching and learning — from Educause Learning Initiative (ELI)

Abstract:

The term artificial intelligence (AI) refers to computer systems that undertake tasks usually thought to require human cognitive processes and decision-making capabilities. To exhibit intelligence, computers apply algorithms to find patterns in large amounts of data—a process called machine learning, which plays a key role in a number of AI applications. AI learning agents have the potential to function like adaptive learning but at a much more sophisticated and nuanced level, potentially giving every student a computer-simulated personal mentor. Many colleges and universities are developing AI projects that aid teaching and learning.

 

7 things you should know about the evolution of teaching and learning professions — from Educause Learning Initiative (ELI)

Abstract

For this issue of the 7 Things, we asked a set of seven community leaders—who come from different walks of life in the community—to offer a short meditation on the evolution of the profession. In this issue you will find comments from professionals such as an instructional designer, a CIO, an accessibility expert, and a librarian. We hope that this issue and the spotlight it casts on the evolution of our profession will encourage us to begin further conversations about where we are headed and how we can help one another to achieve our professional goals.

 

Chief information officers are fast becoming chief innovation officers. It is increasingly critical for the CIO to be an advocate and leader of transformational change on campus rather than a director and manager of IT operations.

A key “big picture” area is the mission of teaching and learning. How do the systems we select today enable improved learning opportunities over the next three years? Will this solution empower students and faculty for years to come or merely meet a tactical need today?
There are increasing opportunities for librarians to work as partners with faculty to develop challenging assignments that encourage students to create a project with an output of a video, podcast, website, data visualization, blog, or other format.
“Support” connotes a hierarchy that doesn’t recognize that staff are valuable assets who play an important role in postsecondary education. We need to find a new language that promotes the ethos of service and servant leadership, within the context of describing ourselves as non-faculty educators and alternative academics.
Once, we thought the faculty role was expanding such that instructors would become learning designers and proto-technologists. Instead, an increasingly competitive and austere landscape is putting competing pressures on faculty, either around research expectations or expanded teaching responsibilities, preventing most from expanding their roles. 
 

Veeery interesting. Alexa now adds visuals / a screen! With the addition of 100 skills a day, where might this new platform lead?

Amazon introduces Echo Show

The description reads:

  • Echo Show brings you everything you love about Alexa, and now she can show you things. Watch video flash briefings and YouTube, see music lyrics, security cameras, photos, weather forecasts, to-do and shopping lists, and more. All hands-free—just ask.
  • Introducing a new way to be together. Make hands-free video calls to friends and family who have an Echo Show or the Alexa App, and make voice calls to anyone who has an Echo or Echo Dot.
  • See lyrics on-screen with Amazon Music. Just ask to play a song, artist or genre, and stream over Wi-Fi. Also, stream music on Pandora, Spotify, TuneIn, iHeartRadio, and more.
  • Powerful, room-filling speakers with Dolby processing for crisp vocals and extended bass response
  • Ask Alexa to show you the front door or monitor the baby’s room with compatible cameras from Ring and Arlo. Turn on lights, control thermostats and more with WeMo, Philips Hue, ecobee, and other compatible smart home devices.
  • With eight microphones, beam-forming technology, and noise cancellation, Echo Show hears you from any direction—even while music is playing
  • Always getting smarter and adding new features, plus thousands of skills like Uber, Jeopardy!, Allrecipes, CNN, and more


From DSC:

Now we’re seeing a major competition among the heavy hitters to own our living rooms, kitchens, and more. Voice-controlled artificial intelligence. But now, add the ability to show videos, text, graphics, and more. Play music. Control the lights and the thermostat. Communicate with others via hands-free video calls.

Hmmm….very interesting times indeed.

 

 

Developers and corporates released 4,000 new skills for the voice assistant in just the last quarter. (source)

 

…with the company adding about 100 skills per day. (source)

 

 

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

The 2017 Dean’s List: EdTech’s 50 Must-Read Higher Ed Blogs [Meghan Bogardus Cortez at edtechmagazine.com]

 

The 2017 Dean’s List: EdTech’s 50 Must-Read Higher Ed Blogs — from edtechmagazine.com by Meghan Bogardus Cortez
These administrative all-stars, IT gurus, teachers and community experts understand how the latest technology is changing the nature of education.

Excerpt:

With summer break almost here, we’ve got an idea for how you can use some of your spare time. Take a look at the Dean’s List, our compilation of the must-read blogs that seek to make sense of higher education in today’s digital world.

Follow these education trailblazers for not-to-be-missed analyses of the trends, challenges and opportunities that technology can provide.

If you’d like to check out the Must-Read IT blogs from previous years, view our lists from 2016, 2015, 2014 and 2013.

 

 



From DSC:
I would like to thank Tara Buck, Meghan Bogardus Cortez, D. Frank Smith, Meg Conlan, and Jimmy Daly and the rest of the staff at EdTech Magazine for their support of this Learning Ecosystems blog through the years — I really appreciate it. 

Thanks all for your encouragement through the years!

Brace yourselves: AI is set to explode in higher ed in the next 4 years — from ecampusnews.com by Laura Ascione
As AI grows in the education sector, its impact could be felt in student learning programs, websites and admissions programs

Excerpt:

A new report predicts that artificial intelligence (AI) in the U.S. education sector will grow 47.5 percent through 2021.

The report, Artificial Intelligence Market in the U.S. Education Sector 2017-2021, is based on in-depth market analysis with inputs from industry experts.

One of the major trends surrounding AI and education is AI-powered educational games. Because games can engage students while teaching them challenging educational concepts, vendors are incorporating AI features into games to enhance their interactivity.

Educational games that include adaptive learning features give students frequent and timely suggestions for a guided learning experience.

 

From DSC:
I can’t say how many AI-based solutions we’ll see within higher education in the next 4 years…it could be a lot…it could be a little. But it will happen. At some point, it will happen.

The use of AI will likely play a key role in a future organization that I’m calling the Next Amazon.com of Higher Education.  AI will likely serve as a foundational piece of what futurist Thomas Frey claims will be the largest company on the internet: “an education-based company that we haven’t heard of yet.” (source)

Web-based learner profiles and blockchain-based technologies should also be on our radars, and are relevant in this discussion.

 

Also see:

Key questions answered in this report

  • What will the market size be in 2021 and what will the growth rate be?
  • What are the key market trends?
  • What is driving this market?
  • What are the challenges to market growth?
  • Who are the key vendors in this market space?
  • What are the market opportunities and threats faced by the key vendors?
  • What are the strengths and weaknesses of the key vendors?

 

 

 

Q&A: Artificial Intelligence Expert Shares His Vision of the Future of Education — from edtechmagazine.com by Amy Burroughs
Artificial intelligence expert Joseph Qualls believes AI can solve some of the biggest challenges facing higher education — and the change is already underway.

Excerpts:

EDTECH: What AI applications might we see in higher education?

QUALLS: You are going to see a massive change in education from K–12 to the university. The thought of having large universities and large faculties teaching students is probably going to go away — not in the short-term, but in the long-term. You will have a student interact with an AI system that will understand him or her and provide an educational path for that particular student. Once you have a personalized education system, education will become much faster and more enriching. You may have a student who can do calculus in the sixth grade because AI realized he had a mathematical sense. That personalized education is going to change everything.


A smorgasbord of ideas to put on your organization’s radar! [Christian]

From DSC:
At the Next Generation Learning Spaces Conference, held recently in San Diego, CA, I moderated a panel discussion on AR, VR, and MR. I opened the discussion with some introductory ideas and remarks, meant to make sure that numerous ideas were on the radars of attendees’ organizations. Then Vinay and Carrie did a super job of addressing several topics and questions (Mary was unable to make it that day, as she got stuck in the UK due to transportation-related issues).

That said, I didn’t get a chance to finish the second part of the presentation, which I’ve listed below in both 4:3 and 16:9 formats. So I made a recording of these ideas, and I’m relaying it to you in the hope that it can help you and your organization.

 


Presentations/recordings:


 

Audio/video recording (187 MB MP4 file)

 

 


Again, I hope you find this information helpful.

Thanks,
Daniel

 

 

 

HarvardX rolls out new adaptive learning feature in online course — from edscoop.com by Corinne Lestch
Students in MOOC adaptive learning experiment scored nearly 20 percent better than students using more traditional learning approaches.

Excerpt:

Online courses at Harvard University are adapting on the fly to students’ needs.

Officials at the Cambridge, Massachusetts, institution announced a new adaptive learning technology that was recently rolled out in a HarvardX online course. The feature offers tailored course material that directly correlates with student performance while the student is taking the class, as well as tailored assessment algorithms.

HarvardX is an independent university initiative that was launched in parallel with edX, the online learning platform that was created by Harvard and Massachusetts Institute of Technology. Both HarvardX and edX run massive open online courses. The new feature has never before been used in a HarvardX course, and has only been deployed in a small number of edX courses, according to officials.
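
The article doesn’t spell out how HarvardX’s adaptive algorithm works, so treat the following as a point of reference only: a rough sketch of Bayesian Knowledge Tracing, one common technique behind adaptive courseware. It updates an estimate of a student’s mastery after each answer and uses that estimate to choose what to serve next. The parameter values and thresholds below are illustrative assumptions, not anything HarvardX has published.

```python
# A rough sketch of Bayesian Knowledge Tracing (BKT), one common technique
# behind adaptive courseware: after each response, update the estimate that
# the student has mastered a skill, then pick the next activity from that
# estimate. All parameter values and thresholds here are illustrative only.

P_INIT = 0.2      # prior probability the skill is already mastered
P_TRANSIT = 0.15  # probability of learning the skill on each practice attempt
P_SLIP = 0.1      # probability of answering wrong despite mastery
P_GUESS = 0.25    # probability of answering right without mastery
MASTERY_THRESHOLD = 0.95

def update_mastery(p_mastery: float, correct: bool) -> float:
    """One BKT step: condition on the observed answer, then apply learning."""
    if correct:
        evidence = p_mastery * (1 - P_SLIP)
        posterior = evidence / (evidence + (1 - p_mastery) * P_GUESS)
    else:
        evidence = p_mastery * P_SLIP
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - P_GUESS))
    return posterior + (1 - posterior) * P_TRANSIT

def next_activity(p_mastery: float) -> str:
    """Choose tailored material based on the current mastery estimate."""
    if p_mastery >= MASTERY_THRESHOLD:
        return "advance to the next concept"
    if p_mastery >= 0.6:
        return "offer a harder practice item"
    return "show a remedial explanation and an easier item"

# Example: a student answers wrong, wrong, right, right on a single skill.
p = P_INIT
for answer in (False, False, True, True):
    p = update_mastery(p, answer)
    print(f"mastery estimate = {p:.2f} -> {next_activity(p)}")
```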

 

 

From DSC:
Given the growth of AI, this is certainly radar worthy — something that’s definitely worth pulse-checking to see where opportunities exist to leverage these types of technologies.  What we now know of as adaptive learning will likely take an enormous step forward in the next decade.

IBM’s assertion rings in my mind:

 

 

I’m cautiously hopeful that these types of technologies can extend beyond K-12 and help us deal with the current need to be lifelong learners, and the need to constantly reinvent ourselves — while providing us with more choice, more control over our learning. I’m hopeful that learners will be able to pursue their passions, and enlist the help of other learners and/or the (human) subject matter experts as needed.

I don’t see these types of technologies replacing any teachers, professors, or trainers. That said, they should be able to do some of the heavy lifting in teaching and learning, helping someone get up to speed on a new topic.

Again, this is one piece of the Learning from the Living [Class] Room that we see developing.


IBM to Train 25 Million Africans for Free to Build Workforce — by Loni Prinsloo
* Tech giant seeking to bring, keep digital jobs in Africa
* Africa to have world’s largest workforce by 2040, IBM projects

Excerpt:

International Business Machines Corp. is ramping up its digital-skills training program to accommodate as many as 25 million Africans in the next five years, looking toward building a future workforce on the continent. The U.S. tech giant plans to make an initial investment of 945 million rand ($70 million) to roll out the training initiative in South Africa…

 

Also see:

IBM Unveils IT Learning Platform for African Youth — from investopedia.com by Tim Brugger

Excerpt (emphasis DSC):

Responding to concerns that artificial intelligence (A.I.) in the workplace will lead to companies laying off employees and shrinking their work forces, IBM (NYSE: IBM) CEO Ginni Rometty said in an interview with CNBC last month that A.I. wouldn’t replace humans, but rather open the door to “new collar” employment opportunities.

IBM describes new collar jobs as “careers that do not always require a four-year college degree but rather sought-after skills in cybersecurity, data science, artificial intelligence, cloud, and much more.”

In keeping with IBM’s promise to devote time and resources to preparing tomorrow’s new collar workers for those careers, it has announced a new “Digital-Nation Africa” initiative. IBM has committed $70 million to its cloud-based learning platform that will provide free skills development to as many as 25 million young people in Africa over the next five years.

The platform will include online learning opportunities for everything from basic IT skills to advanced training in social engagement, digital privacy, and cyber protection. IBM added that its A.I. computing wonder Watson will be used to analyze data from the online platform, adapt it, and help direct students to appropriate courses, as well as refine the curriculum to better suit specific needs.
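
The article doesn’t say how Watson makes those recommendations, but “directing students to appropriate courses” is, at its simplest, a ranking problem: estimate the learner’s skill gaps and score each course by how much of the gap it would close. Here is a small sketch of that general shape; the skills, courses, and scoring rule are invented for illustration and are not IBM’s method.

```python
# A small sketch of "direct students to appropriate courses" treated as a
# ranking problem: estimate the learner's skill gaps against a target role,
# then score each course by how much of that gap it would close. The skills,
# courses, and scoring rule are invented for illustration.

LEARNER_SKILLS = {"python": 0.8, "statistics": 0.4, "cybersecurity": 0.1}
TARGET_ROLE = {"python": 0.7, "statistics": 0.8, "cybersecurity": 0.8}

COURSES = {
    "Intro to Cybersecurity": {"cybersecurity": 0.6},
    "Applied Statistics": {"statistics": 0.5, "python": 0.1},
    "Advanced Python": {"python": 0.3},
}

def skill_gap(learner, target):
    """How far the learner sits below the target level for each skill."""
    return {s: max(target[s] - learner.get(s, 0.0), 0.0) for s in target}

def rank_courses(learner, target, courses):
    gap = skill_gap(learner, target)

    def coverage(taught):
        # Credit a course only for the portion of each gap it would close.
        return sum(min(level, gap.get(skill, 0.0))
                   for skill, level in taught.items())

    return sorted(courses, key=lambda name: coverage(courses[name]),
                  reverse=True)

print(rank_courses(LEARNER_SKILLS, TARGET_ROLE, COURSES))
# -> ['Intro to Cybersecurity', 'Applied Statistics', 'Advanced Python']
```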

 

 

From DSC:
That last part, about Watson being used to personalize learning and direct students to appropriate courses, is one of the elements that I see in the Learning from the Living [Class] Room vision that I’ve been pulse-checking for the last several years. AI/cognitive computing will most assuredly be a part of our learning ecosystems in the future. Amazon is currently building out its own platform, which is gaining about 100 skills each day and has more than 1,000 people working on Alexa and the Echo ecosystem. This type of thing isn’t going away any time soon. Rather, I’d say that we haven’t seen anything yet!

 

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 

 

And Amazon has doubled down to develop Alexa’s “skills,” which are discrete voice-based applications that allow the system to carry out specific tasks (like ordering pizza for example). At launch, Alexa had just 20 skills, which has reportedly jumped to 5,200 today with the company adding about 100 skills per day.

In fact, Bezos has said, “We’ve been working behind the scenes for the last four years; we have more than 1,000 people working on Alexa and the Echo ecosystem … It’s just the tip of the iceberg.” Just last week, Amazon launched a new website to help brands and developers create more skills for Alexa.

Source

 

 

Also see:

 

“We are trying to make education more personalised and cognitive through this partnership by creating a technology-driven personalised learning and tutoring,” Lula Mohanty, Vice President, Services at IBM, told ET. IBM will also use its cognitive technology platform, IBM Watson, as part of the partnership.

“We will use the IBM Watson data cloud as part of the deal, and access Watson education insight services, Watson library, student information insights — these are big data sets that have been created through collaboration and inputs with many universities. On top of this, we apply big data analytics,” Mohanty added.

Source

 

 


 

Also see:

  • Most People in Education are Just Looking for Faster Horses, But the Automobile is Coming — from etale.org by Bernard Bull
    Excerpt:
    Most people in education are looking for faster horses. It is too challenging, troubling, or beyond people’s sense of what is possible to really imagine a completely different way in which education happens in the world. That doesn’t mean, however, that the educational equivalent of the automobile is not on its way. I am confident that it is very much on its way. It might even arrive earlier than even the futurists expect. Consider the following prediction.


Excerpt from Amazon fumbles earnings amidst high expectations (emphasis DSC):

Aside from AWS, Amazon Alexa-enabled devices were the top-selling products across all categories on Amazon.com throughout the holiday season and the company is reporting that Echo family sales are up over 9x compared to last season. Amazon aims to brand Alexa as a platform, something that has helped the product to gain capabilities faster than its competition. Developers and corporates released 4,000 new skills for the voice assistant in just the last quarter.

 

Alexa got 4,000 new skills in just the last quarter!

From DSC:
What are the teaching & learning ramifications of this?

By the way, I’m not saying that professors, teachers, and trainers should run for the hills (i.e., that they’ll be replaced by AI-based tools). Rather, I would like to suggest that we not only put this type of thing on our radars but also begin actively experimenting with such technologies, to see whether they can help us do some of the heavy lifting for students learning about new topics.
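
To make that experimentation a bit more concrete, here is a minimal sketch of what the backend for a custom Alexa skill could look like: a hypothetical “study coach” skill that defines a term for a learner. It is written as a plain AWS Lambda handler against Alexa’s published JSON request/response format. The intent name, slot, and glossary content are invented for illustration, and the skill’s interaction model would still need to be set up in the Alexa developer console.

```python
# A minimal, hypothetical "study coach" Alexa skill backend, written as a
# plain AWS Lambda handler against Alexa's JSON request/response format.
# The intent name (DefineTermIntent), its Term slot, and the glossary are
# invented purely for illustration.

GLOSSARY = {
    "machine learning": ("Machine learning means finding patterns in data "
                         "so that a system can make predictions or decisions."),
}

def build_response(speech_text, end_session=True):
    """Wrap plain text in the response envelope Alexa expects."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context):
    request = event.get("request", {})

    if request.get("type") == "LaunchRequest":
        return build_response(
            "Welcome to the study coach. Ask me to define a term.",
            end_session=False,
        )

    if request.get("type") == "IntentRequest":
        intent = request.get("intent", {})
        if intent.get("name") == "DefineTermIntent":
            term = (intent.get("slots", {})
                          .get("Term", {})
                          .get("value") or "").lower()
            answer = GLOSSARY.get(
                term, f"I don't have a definition for {term} yet.")
            return build_response(answer)

    return build_response("Sorry, I didn't catch that.")
```

Even a sketch this small is enough to start testing how well a voice-first interaction fits a given learning task.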

 

Per X Media Lab:

The authoritative CB Insights lists imminent Future Tech Trends: customized babies; personalized foods; robotic companions; 3D printed housing; solar roads; ephemeral retail; enhanced workers; lab-engineered luxury; botroots movements; microbe-made chemicals; neuro-prosthetics; instant expertise; AI ghosts. You can download the whole outstanding report here (125 pgs).

 

From DSC:
Though I’m generally pro-technology, there are several items in here that support the need for all members of society to be informed and to have some input into whether and how these technologies should be used. Prime example: customized babies. The report discusses the genetic modification of babies: “In the future, we will choose the traits for our babies.” Veeeeery slippery ground here.

 

Below are some example screenshots:

Also see:

CBInsights — Innovation Summit

  • The New User Interface: The Challenge and Opportunities that Chatbots, Voice Interfaces and Smart Devices Present
  • Fusing the physical, digital and biological: AI’s transformation of healthcare
  • How predictive algorithms and AI will rule financial services
  • Autonomous Everything: How Connected Vehicles Will Change Mobility and Which Companies Will Own this Future
  • The Next Industrial Age: The New Revenue Sources that the Industrial Internet of Things Unlocks
  • The AI-100: 100 Artificial Intelligence Startups That You Better Know

 

 

 

The Periodic Table of AI — from ai.xprize.org by Kris Hammond

Excerpts:

This is an invitation to collaborate.  In particular, it is an invitation to collaborate in framing how we look at and develop machine intelligence. Even more specifically, it is an invitation to collaborate in the construction of a Periodic Table of AI.

Let’s be honest. Thinking about Artificial Intelligence has proven to be difficult for us. We argue constantly about what is and is not AI. We certainly cannot agree on how to test for it. We have difficulty deciding which technologies should be included within it. And we struggle with how to evaluate it.

Even so, we are looking at a future in which intelligent technologies are becoming commonplace.

With that in mind, we propose an approach to viewing machine intelligence from the perspective of its functional components. Rather than argue about the technologies behind them, the focus should be on the functional elements that make up intelligence.  By stepping away from how these elements are implemented, we can talk about what they are and their roles within larger systems.

 

 

Also see this article, which contains the graphic below:

 

 

 

From DSC:
These graphics are helpful to me, as they increase my understanding of some of the complexities involved within the realm of artificial intelligence.

 

 

 


Also relevant/see:

 

 

 
 

CES 2017: Intel’s VR visions — from jwtintelligence.com by Shepherd Laughlin
The company showed off advances in volumetric capture, VR live streaming, and “merged reality.”

Excerpt (emphasis DSC):

Live-streaming 360-degree video was another area of focus for Intel. Guests were able to watch a live basketball game being broadcast from Indianapolis, Indiana, choosing from multiple points of view as the action moved up and down the court. Intel “will be among the first technology providers to enable the live sports experience on multiple VR devices,” the company stated.

After taking a 3D scan of the room, Project Alloy can substitute virtual objects where physical objects stand.

 

From DSC:
If viewers of a live basketball game can choose from multiple points of view, why can’t remote learners do this as well with a face-to-face classroom that’s taking place at a university or college?  Learning from the Living [Class] Room.

 

 

 

From CES 2017: Introducing DAQRI’s Smart Glasses™

Excerpt:

Data visualization, guided work instructions, remote expert — for use in a variety of industries: medical, aviation and aerospace, architecture and AEC, lean manufacturing, engineering, and construction.

 

 

 

Third-party Microsoft HoloLens-based mixed reality headsets coming soon, prices to start at $299 — from bgr.in by Deepali Moray
Microsoft has partnered with companies including Dell and Acer which will release their own HoloLens compatible devices.

Excerpt:

The company said that it is teaming up with the likes of Dell, HP, Lenovo and Acer, which will release headsets based on the HoloLens technology. “These new head-mounted displays will be the first consumer offerings utilizing the Mixed Reality capabilities of Windows 10 Creators Update,” a Microsoft spokesperson said. Microsoft’s partner companies for taking the HoloLens technology forward include Dell, HP, Lenovo, Acer, and 3 Glasses. Headsets by these manufacturers will work the same way as the original HoloLens but carry the design and branding of their respective companies. While the HoloLens developer edition costs a whopping $2999 (approximately Rs 2,00,000), the third-party headsets will be priced starting $299 (approximately Rs 20,000).

 

 

Verto Studio 3D App Makes 3D Modeling on HoloLens Easy — from winbuzzer.com by Luke Jones
The upcoming Verto Studio 3D application allows users to create 3D models and interact with them when wearing HoloLens. It is the first software of its kind for mixed reality.

 

 

 

 

Excerpt:
How is The Immersive Experience Delivered?

Tethered Headset VR – The user can participate in a VR experience by using a computer with a tethered VR headset (also known as a Head Mounted Display – HMD) like Facebook’s Oculus Rift, PlayStation VR, or the HTC Vive. The user has the ability to move freely and interact in the VR environment while using a handheld controller to emulate VR hands. But, the user has a limited area in which to move about because they are tethered to a computer.

Non-Tethered Headset VR/AR – These devices are headsets and computers built into one system, so users are free of any cables limiting their movement. These devices use AR to deliver a 360° immersive experience. Much like with Oculus Rift and Vive, the user would be able to move around in the AR environment as well as interact and manipulate objects. A great example of this headset is Microsoft’s HoloLens, which delivers an AR experience to the user through just a headset.

Mobile Device Inserted into a Headgear – To experience VR, the user inserts their mobile device into a Google Cardboard, Samsung Gear 360°, or any other type of mobile device headgear, along with headphones if they choose. This form of VR doesn’t require the user to be tethered to a computer and most VR experiences can be 360° photos, videos, and interactive scenarios.

Mobile VR – The user can access VR without any type of headgear simply by using a mobile device and headphones (optional). They can still have many of the same experiences that they would through Google Cardboard or any other type of mobile device headgear. Although they don’t get the full immersion that they would with headgear, they would still be able to experience VR. Currently, this version of the VR experience seems to be the most popular because it only requires a mobile device. Apps like Pokémon Go and Snapchat’s animated selfie lens only require a mobile device and have a huge number of users.

Desktop VR – Using just a desktop computer, the user can access 360° photos and videos, as well as other VR and AR experiences, by using the trackpad or computer mouse to move their field of view and become immersed in the VR scenario.

New VR – Non-mobile and non-headset platforms like Leap Motion use depth sensors to create a VR image of one’s hands on a desktop computer; they emulate hand gestures in real time. This technology could be used for anything from teaching assembly in a manufacturing plant to learning a step-by-step process to medical training.

VR/AR Solutions

  • Oculus Rift – www.oculus.com
  • HTC Vive – htcvive.com
  • Playstation VR – playstation.com
  • Samsung VR Gear – www.samsung.com
  • Google Daydream – https://vr.google.com/daydream/
  • Leap Motion – www.leapmotion.com
  • Magic Leap – www.magicleap.com
  • Most mobile devices

 

Goggles that are worn, while they are “Oh Myyy” awesome, will not be the final destination of VR/AR. We will want to engage and respond, without wearing a large device over our eyes. Pokémon Go was a good early predictor of how non-goggled experiences will soar.

Elliott Masie

 

 

 

Top 8 VR & AR predictions for 2017 — from haptic.al by Christine Hart

Excerpt:

Education will go virtual
Similar to VR for brand engagement, we’ve seen major potential for delivering hands-on training and distance education in a virtual environment. If VR can take a class on a tour of Mars, the current trickle of educational VR could turn into a flood in 2017.

 

 

 

 

Published on Dec 26, 2016
Top 10 Virtual Reality Predictions For 2017 In vTime. It’s been an amazing year for VR and AR: new VR and AR headsets, groundbreaking content, and lots more. 2017 promises to be amazing as well. Here are our top 10 virtual reality predictions for the coming year. Filmed in vTime with vCast. Sorry about the audio quality. We used mics on Rift and Vive, which are very good on other platforms. We’ve reported this to vTime.

 

 


Addendums


 

  • 5 top Virtual Reality and Augmented Reality technology trends for 2017 — from marxentlabs.com by Joe Bardi
    Excerpt:
    So what’s in store for Virtual Reality and Augmented Reality in 2017? We asked Marxent’s talented team of computer vision experts, 3D artists and engineers to help us suss out what the year ahead will hold. Here are their predictions for the top Virtual Reality and Augmented Reality technology trends for 2017.

 

AR becomes the killer app for smartphones


5 Online Education Trends to Watch in 2017 — from usnews.com by Jordan Friedman
Experts predict more online programs will offer alternative credentials and degrees in specialized fields.

Excerpts:

  1. Greater emphasis on nontraditional credentials
  2. Increased use of big data to measure student performance
  3. Greater incorporation of artificial intelligence into classes
  4. Growth of nonprofit online programs
  5. Online degrees in surprising and specialized disciplines

 

 

The Future of Online Learning Is Offline: What Strava Can Teach Digital Course Designers — from edsurge.com by Amy Ahearn

Excerpt:

I became a Strava user in 2013, around the same time I became an online course designer. Quickly I found that even as I logged runs on Strava daily, I struggled to find the time to log into platforms like Coursera, Udemy or Udacity to finish courses produced by my fellow instructional designers. What was happening? Why was the fitness app so “sticky” as opposed to the online learning platforms?

As a thought experiment, I tried to recast my Strava experience in pedagogical terms. I realized that I was recording hours of deliberate practice (my early morning runs), formative assessments (the occasional speed workout on the track) and even a few summative assessments (races) on the app. Strava was motivating my consistent use by overlaying a digital grid on my existing offline activities. It let me reconnect with college teammates who could keep me motivated. It enabled me to analyze the results of my efforts and compare them to others. I didn’t have to be trapped behind a computer to benefit from this form of digital engagement—yet it was giving me personalized feedback and results. How could we apply the same practices to learning?

I’ve come to believe that one of the biggest misunderstandings about online learning is that it has to be limited to things that can be done in front of a computer screen. Instead, we need to reimagine online courses as something that can enable the interplay between offline activities and digital augmentation.

A few companies are heading that way. Edthena enables teachers to record videos of themselves teaching and then upload these to the platform to get feedback from mentors.

 

 

DIY’s JAM online courses let kids complete hands-on activities like drawing or building with LEGOs and then have them upload pictures of their work to earn badges and share their projects.

 

 

My team at +Acumen has built online courses that let teams complete projects together offline and then upload their prototypes to the NovoEd platform to receive feedback from peers. University campuses are integrating Kaltura into their LMS platforms to enable students to capture and upload videos.

 

 

We need to focus less on building multiple choice quizzes or slick lecture videos and more on finding ways to robustly capture evidence of offline learning that can be validated and critiqued at scale by peers and experts online.


Don’t discount the game-changing power of the morphing “TV” when coupled with AI, NLP, and blockchain-based technologies! [Christian]

From DSC:

Don’t discount the game-changing power of the morphing “TV” when coupled with artificial intelligence (AI), natural language processing (NLP), and blockchain-based technologies!

When I saw the article below, I couldn’t help but wonder what the devices we currently know of as “TVs” will morph into, and what functionality they will be able to provide to us in the not-too-distant future.

For example, the article mentions that Seiki, Westinghouse, and Element will be offering TVs that can not only access Alexa — a personal assistant from Amazon which uses artificial intelligence — but will also be able to provide access to over 7,000 apps and games via the Amazon Fire TV Store.

Some of the questions that come to my mind:

  • Why can’t there be more educationally-related games and apps available on this type of platform?
  • Why can’t the results of the assessments taken on these apps get fed into cloud-based learner profiles that capture one’s lifelong learning? (#blockchain; see the sketch after this list)
  • When will potential employers start asking for access to such web-based learner profiles?
  • Will tvOS and similar operating systems expand to provide blockchain-based technologies as well as the types of functionality we get from our current set of CMSs/LMSs?
  • Will this type of setup become a major outlet for competency-based education as well as for corporate training-related programs?
  • Will augmented reality (AR), virtual reality (VR), and mixed reality (MR) capabilities come with our near future “TVs”?
  • Will virtual tutoring be one of the available apps/channels?
  • Will the microphone and the wide-angle HD camera on the “TV” be able to be disconnected from the Internet for security reasons (i.e., to be sure no hacker is eavesdropping on one’s private life)?
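
On the learner-profile question above: one reason blockchain keeps surfacing in these conversations is that a lifelong record of learning needs to be tamper-evident. Here is a toy sketch of that underlying idea, in which each credential entry carries a hash of the previous entry, so altering any earlier entry breaks the chain. It illustrates the concept only; it is not a real blockchain, a particular credential standard, or a production design.

```python
# A toy, tamper-evident learner profile: each credential entry stores a hash
# of the previous entry, so altering any earlier entry breaks the chain.
# This illustrates the idea only; it is not a real blockchain, a credential
# standard, or a production design.
import hashlib
import json
from datetime import datetime, timezone

def _hash_entry(entry):
    return hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode("utf-8")
    ).hexdigest()

class LearnerProfile:
    def __init__(self, learner_id):
        self.learner_id = learner_id
        self.entries = []

    def add_credential(self, skill, score, issuer):
        entry = {
            "learner_id": self.learner_id,
            "skill": skill,
            "score": score,
            "issuer": issuer,
            "issued_at": datetime.now(timezone.utc).isoformat(),
            "prev_hash": _hash_entry(self.entries[-1]) if self.entries else None,
        }
        self.entries.append(entry)
        return entry

    def verify(self):
        """Recompute the hash links; any edited entry breaks the chain."""
        return all(
            self.entries[i]["prev_hash"] == _hash_entry(self.entries[i - 1])
            for i in range(1, len(self.entries))
        )

profile = LearnerProfile("learner-123")
profile.add_credential("Intro to AI", 0.92, issuer="hypothetical-tv-app")
profile.add_credential("Data Visualization", 0.88, issuer="hypothetical-tv-app")
print(profile.verify())            # True
profile.entries[0]["score"] = 1.0  # tamper with history...
print(profile.verify())            # ...and verification now fails (False)
```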

 

Forget a streaming stick: These 4K TVs come with Amazon Fire TV inside — from techradar.com by Nick Pino

Excerpt:

The TVs will not only have access to Alexa via a microphone-equipped remote but, more importantly, will have access to the over 7,000 apps and games available on the Amazon Fire TV Store – a huge boon considering that most of these Smart TVs usually include, at max, a few dozen apps.

 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV

 


Addendums


 

“I’ve been predicting that by 2030 the largest company on the internet is going to be an education-based company that we haven’t heard of yet,” Frey, the senior futurist at the DaVinci Institute think tank, tells Business Insider.


  • Once thought to be a fad, MOOCs showed staying power in 2016 — from educationdive.com
    Dive Brief:

    • EdSurge profiles the growth of massive open online courses in 2016, which attracted more than 58 million students across over 700 colleges and universities last year.
    • The top three MOOC providers — Coursera, Udacity and EdX — collectively grossed more than $100 million last year, as much of the content provided on these platforms shifted from free to paywall-guarded materials.
    • Many MOOCs have moved to offering credentialing programs or nanodegree offerings to increase their value in industrial marketplaces.
 

From DSC:
Recently, my neighbor graciously gave us his old Honda snowblower, as he was getting a new one. He wondered if we had a use for it.  As I’m definitely not getting any younger and I’m not Howard Hughes, I said, “Sure thing! That would be great — it would save my back big time!  Thank you!” (Though the image below is not mine, it might as well be…as both are quite old now.)

 

 

Anyway…when I recently ran out of gas, I would have loved to be able to take out my iPhone, hold it up to the snowblower, and ask an app whether this particular model takes a gas/oil mixture or has a separate reservoir for the oil. (It wasn’t immediately clear where the oil would go in, so I’m figuring it’s a mix.)

But here’s what I would have liked to have happen:

  1. I would launch an app on my iPhone that featured machine learning-based capabilities.
  2. The app would scan the snowblower, identify the make and model, and tell me whether it needs a gas/oil mix (or not).
  3. If there were a separate place to pour in the oil, the app would ask me if I wanted to learn how to add it. Upon my saying yes, it would display an augmented reality-based training video — showing me where the oil goes and what type of oil to use (links to local providers would also come in handy…offering nice revenue streams for advertisers and suppliers alike).

So several technologies would have to work together here…but those technologies already exist. We just need to pull them together in order to provide this type of useful functionality!
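
For the recognition step in particular, the building blocks are already off the shelf. Below is a rough sketch that classifies a photo with a pre-trained torchvision model and then looks the predicted label up in a purely hypothetical maintenance table. A real app would need a model trained on specific makes and models (an off-the-shelf ImageNet classifier only knows generic categories such as “snowplow”), a much richer knowledge base, and the augmented reality overlay on top. The point is simply how little code the core inference step takes.

```python
# A rough sketch of the recognition step: classify a photo with a pre-trained
# torchvision model, then look the predicted label up in a hypothetical
# maintenance table. A production app would need a model trained on specific
# makes and models plus a real knowledge base; this is illustration only.
import torch
from PIL import Image
from torchvision.models import resnet50, ResNet50_Weights

# Hypothetical maintenance notes keyed by generic ImageNet labels.
MAINTENANCE_NOTES = {
    "snowplow": ("Check the filler cap or the engine plate: many small "
                 "two-stroke engines take a gas/oil mix, while four-stroke "
                 "engines have a separate oil fill."),
}

weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights)
model.eval()
preprocess = weights.transforms()
categories = weights.meta["categories"]

def identify_and_advise(image_path):
    image = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probabilities = model(image).softmax(dim=1)
    label = categories[int(probabilities.argmax(dim=1))]
    note = MAINTENANCE_NOTES.get(label, "No maintenance notes on file.")
    return f"This looks like a {label}. {note}"

print(identify_and_advise("snowblower.jpg"))  # hypothetical photo path
```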

 

 