Gartner: Immersive experiences among top tech trends for 2019 — from campustechnology.com by Dian Schaffhauser

Excerpt:

IT analyst firm Gartner has named its top 10 trends for 2019, and the “immersive user experience” is on the list, alongside blockchain, quantum computing and seven other drivers influencing how we interact with the world. The annual trend list covers breakout tech with broad impact and tech that could reach a tipping point in the near future.


Reflections on “Inside Amazon’s artificial intelligence flywheel” [Levy]

Inside Amazon’s artificial intelligence flywheel — from wired.com by Steven Levy
How deep learning came to power Alexa, Amazon Web Services, and nearly every other division of the company.

Excerpt (emphasis DSC):

Amazon loves to use the word flywheel to describe how various parts of its massive business work as a single perpetual motion machine. It now has a powerful AI flywheel, where machine-learning innovations in one part of the company fuel the efforts of other teams, who in turn can build products or offer services to affect other groups, or even the company at large. Offering its machine-learning platforms to outsiders as a paid service makes the effort itself profitable—and in certain cases scoops up yet more data to level up the technology even more.

It took a lot of six-pagers to transform Amazon from a deep-learning wannabe into a formidable power. The results of this transformation can be seen throughout the company—including in a recommendations system that now runs on a totally new machine-learning infrastructure. Amazon is smarter in suggesting what you should read next, what items you should add to your shopping list, and what movie you might want to watch tonight. And this year Thirumalai started a new job, heading Amazon search, where he intends to use deep learning in every aspect of the service.

“If you asked me seven or eight years ago how big a force Amazon was in AI, I would have said, ‘They aren’t,’” says Pedro Domingos, a top computer science professor at the University of Washington. “But they have really come on aggressively. Now they are becoming a force.”

Maybe the force.

 

 

From DSC:
When will we begin to see more mainstream recommendation engines for learning-based materials? With the demand for people to reinvent themselves, such a next-generation learning platform can’t come soon enough! It could include the following (a minimal sketch of the recommendation piece appears after the list):

  • Turning over control to learners to create and enhance their own web-based learner profiles, and letting them decide who can access those profiles.
  • AI-based recommendation engines to help people identify curated, effective digital playlists for what they want to learn about.
  • Voice-driven interfaces.
  • Matching employees to employers.
  • Matching one’s learning preferences (not styles) with the content being presented as one piece of a personalized learning experience.
  • From cradle to grave. Lifelong learning.
  • Multimedia-based, interactive content.
  • Asynchronously and synchronously connecting with others learning about the same content.
  • Online-based tutoring/assistance; remote assistance.
  • Reinventing oneself. Staying relevant. Surviving.
  • Competency-based learning.
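
To make the recommendation-engine bullet a bit more concrete, here is a minimal sketch of a content-based recommender. Everything in it is an assumption made for illustration (the tiny catalog, the learner profile text, and the TF-IDF plus cosine-similarity scoring); it does not describe any existing platform.

```python
# Minimal, hypothetical sketch of a content-based recommender for learning
# resources. The catalog, profile text, and scoring approach are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A tiny stand-in for a curated catalog of digital learning resources.
catalog = [
    {"title": "Intro to Data Analysis", "tags": "statistics spreadsheets python data"},
    {"title": "UX Design Fundamentals", "tags": "design usability prototyping research"},
    {"title": "Cybersecurity Basics",    "tags": "security networks threats passwords"},
    {"title": "Machine Learning 101",    "tags": "python models training data prediction"},
]

# A learner-controlled profile: interests the learner has chosen to share.
learner_profile = "I want to learn python and data skills to change careers"

# Vectorize the catalog tags and the profile, then rank by cosine similarity.
vectorizer = TfidfVectorizer()
catalog_matrix = vectorizer.fit_transform([item["tags"] for item in catalog])
profile_vector = vectorizer.transform([learner_profile])

scores = cosine_similarity(profile_vector, catalog_matrix).flatten()
ranked = sorted(zip(scores, catalog), key=lambda pair: pair[0], reverse=True)

for score, item in ranked:
    print(f"{score:.2f}  {item['title']}")
```

A real next-generation learning platform would draw on a much richer, learner-controlled profile (competencies, credentials, preferences) and blend in collaborative and voice-driven signals, but the core matching problem has this shape.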

 

[Image: The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV]


We’re about to embark on a period in American history where career reinvention will be critical, perhaps more so than it’s ever been before. In the next decade, as many as 50 million American workers—a third of the total—will need to change careers, according to McKinsey Global Institute. Automation, in the form of AI (artificial intelligence) and RPA (robotic process automation), is the primary driver. McKinsey observes: “There are few precedents in which societies have successfully retrained such large numbers of people.”

Bill Triant and Ryan Craig


Also relevant/see:

Online education’s expansion continues in higher ed with a focus on tech skills — from educationdive.com by James Paterson

Dive Brief:

  • Online learning continues to expand in higher ed with the addition of several online master’s degrees and a new for-profit college that offers a hybrid of vocational training and liberal arts curriculum online.
  • Inside Higher Ed reported the nonprofit learning provider edX is offering nine master’s degrees through five U.S. universities — the Georgia Institute of Technology, the University of Texas at Austin, Indiana University, Arizona State University and the University of California, San Diego. The programs include cybersecurity, data science, analytics, computer science and marketing, and they cost from around $10,000 to $22,000. Most offer stackable certificates, helping students who change their educational trajectory.
  • Former Harvard University Dean of Social Science Stephen Kosslyn, meanwhile, will open Foundry College in January. The for-profit, two-year program targets adult learners who want to upskill, and it includes training in soft skills such as critical thinking and problem solving. Students will pay about $1,000 per course, though the college is waiving tuition for its first cohort.


A cyber-skills shortage means students are being recruited to fight off hackers — from technologyreview.com by Erin Winick
Students with little or no cybersecurity knowledge are being paired with easy-to-use AI software that lets them protect their campus from attack.

Excerpt:

There aren’t enough cybersecurity workers out there—and things are getting worse. By one estimate, 3.5 million cybersecurity jobs will be unfilled by 2021. And of the candidates who apply, fewer than one in four are even qualified.

That’s why many large corporations are investing in longer-term solutions like mobile training trucks and apprenticeship programs. But Texas A&M University has found a way to solve its labor shortage in the short term. It’s pairing student security beginners with AI software.

The college’s Security Operations Center deals with about a million attempts to hack the university system each month. While the center does have some full-time employees, the majority of its security force is made up of students. Ten students currently work alongside AI software to detect, monitor, and remediate the threats.
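
The article doesn’t name the AI software Texas A&M uses, so the snippet below is only a generic sketch of the kind of technique involved: unsupervised anomaly detection over event features. The feature choices and numbers are invented for illustration.

```python
# Illustrative sketch only: flag unusual login events with an Isolation Forest.
# The features and data are made up; real SOC tooling is far richer than this.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Features per login event: [hour_of_day, failed_attempts, bytes_transferred_mb]
normal_logins = np.column_stack([
    rng.normal(13, 3, 500),   # mostly business hours
    rng.poisson(0.2, 500),    # rare failed attempts
    rng.normal(5, 2, 500),    # modest transfer sizes
])

suspicious_logins = np.array([
    [3.0, 12, 250.0],   # 3 a.m., many failures, large transfer
    [2.0, 20, 400.0],
])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_logins)

# -1 means "anomaly", 1 means "looks normal"; a student analyst would review the -1s.
print(model.predict(suspicious_logins))
```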


In the world of 2030 and beyond, employers will no longer be separate entities from the education establishment. Pressures from both the supply and demand side are so large that employers and learners will end up, by default, co-designing new learning experiences, where all learning counts.

 

OBJECTIVES FOR CONVENINGS

  • Identify the skills everyone will need to navigate the changing relationship between machine intelligence and people over the next 10-12 years.
  • Develop implications for work, workers, students, working learners, employers, and policymakers.
  • Identify a preliminary set of actions that need to be taken now to best prepare for the changing work + learn ecosystem.

Three key questions guided the discussions:

  1. What are the LEAST and MOST essential skills needed for the future?
  2. Where and how will tomorrow’s workers and learners acquire the skills they really need?
  3. Who is accountable for making sure individuals can thrive in this new economy?

This report summarizes the experts’ views on what skills will likely be needed to navigate the work + learn ecosystem over the next 10–15 years—and their suggested steps for better serving the nation’s future needs.

 

In a new world of work, driven especially by AI, institutionally-sanctioned curricula could give way to AI-personalized learning. This would drastically change the nature of existing social contracts between employers and employees, teachers and students, and governments and citizens. Traditional social contracts would need to be renegotiated or revamped entirely. In the process, institutional assessment and evaluation could well shift from top-down to new bottom-up tools and processes for developing capacities, valuing skills, and managing performance through new kinds of reputation or accomplishment scores.

 

In October 2017, Chris Wanstrath, CEO of GitHub, the foremost code-sharing and social networking resource for programmers today, made a bold statement: “The future of coding is no coding at all.” He believes that the writing of code will be automated in the near future, leaving humans to focus on “higher-level strategy and design of software.” Many of the experts at the convenings agreed. Even creating the AI systems of tomorrow, they asserted, will likely require less human coding than is needed today, with graphic interfaces turning AI programming into a drag-and-drop operation.

Digital fluency does not mean knowing coding languages. Experts at both convenings contended that effectively “befriending the machine” will be less about teaching people to code and more about being able to empathize with AIs and machines, understanding how they “see the world” and “think” and “make decisions.” Machines will create languages to talk to one another.

Here’s a list of many skills the experts do not expect to see much of—if at all—in the future:

  • Coding. Systems will be self-programming.
  • Building AI systems. Graphic interfaces will turn AI programming into drag-and-drop operations.
  • Calendaring, scheduling, and organizing. There won’t be a need for email triage.
  • Planning and even decision-making. AI assistants will pick this up.
  • Creating more personalized curricula. Learners may design more of their own personalized learning adventure.
  • Writing and reviewing resumes. Digital portfolios, personal branding, and performance reputation will replace resumes.
  • Language translation and localization. This will happen in real time using translator apps.
  • Legal research and writing. Many of our legal systems will be automated.
  • Validation skills. Machines will check people’s work to validate their skills.
  • Driving. Driverless vehicles will replace the need to learn how to drive.

Here’s a list of the most essential skills needed for the future:

  • Quantitative and algorithmic thinking.  
  • Managing reputation.  
  • Storytelling and interpretive skills.  
  • First principles thinking.  
  • Communicating with machines as machines.  
  • Augmenting high-skilled physical tasks with AI.
  • Optimization and debugging frame of mind.
  • Creativity and growth mindset.
  • Adaptability.
  • Emotional intelligence.
  • Truth seeking.
  • Cybersecurity.

 

The rise of machine intelligence is just one of the many powerful social, technological, economic, environmental, and political forces that are rapidly and disruptively changing the way everyone will work and learn in the future. Because this largely tech-driven force is so interconnected with other drivers of change, it is nearly impossible to understand the impact of intelligent agents on how we will work and learn without also imagining the ways in which these new tools will reshape how we live.


AI Now Law and Policy Reading List — from medium.com by the AI Now Institute

Excerpt:

Data-driven technologies are widely used in society to make decisions that affect many critical aspects of our lives, from health, education, employment, and criminal justice to economic, social and political norms. Their varied applications, uses, and consequences raise a number of unique and complex legal and policy concerns. As a result, it can be hard to figure out not only how these systems work but what to do about them.

As a starting point, AI Now offers this Law and Policy Reading List tailored for those interested in learning about key concepts, debates, and leading analysis on law and policy issues related to artificial intelligence and other emerging data-driven technologies.

 

New game lets players train AI to spot legal issues — from abajournal.com by Jason Tashea

Excerpt:

Got a free minute? There’s a new game that will help train an artificial intelligence model to spot legal issues and help close the access-to-justice gap.

Called Learned Hands—yes, it’s a pun—the game takes 75,000 legal questions posted on Reddit dealing with family, consumer, criminal and other legal issues and asks the user to determine what the issue is.

While it may conjure up nightmares of the first year of law school for many lawyers, David Colarusso says it’s for a good cause.

“It’s an opportunity for attorneys to take their downtime to train machine learning algorithms to help access-to-justice issues,” says Colarusso, director of Suffolk University Law School’s Legal Innovation and Technology (LIT) Lab and partner on this project with the Stanford Legal Design Lab.
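
Learned Hands itself crowdsources the labels; what those labels would eventually feed is a fairly standard text-classification step. The sketch below is a hypothetical illustration of that downstream step. The posts and issue labels are invented, and the real project’s models are certainly more sophisticated.

```python
# Hypothetical sketch of the downstream step: training a classifier on
# crowd-labeled legal questions. The posts and labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "My landlord is keeping my security deposit and won't return calls",
    "My ex won't follow the custody schedule the court ordered",
    "A debt collector calls me at work ten times a day",
    "I was pulled over and the officer searched my car without asking",
]
labels = ["housing", "family", "consumer", "criminal"]

issue_spotter = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
issue_spotter.fit(posts, labels)

print(issue_spotter.predict(
    ["My landlord is trying to evict me with only three days notice"]
))  # with this toy training set, the word overlap should yield ['housing']
```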

 

From learnedhands.law.stanford.edu/legalIssues

When you play the game, you’ll be spotting if different legal issues are present in people’s stories. Some of these issues will be high level categories, and others will be more specific issues.

Here are the high-level categories:

 

 

MIT plans $1B computing college, AI research effort — from educationdive.com by James Paterson

Dive Brief (emphasis DSC):

  • The Massachusetts Institute of Technology is creating a College of Computing with the help of a $350 million gift from billionaire investor Stephen A. Schwarzman, who is the CEO and co-founder of the private equity firm Blackstone, in a move the university said is its “most significant reshaping” since 1950.
  • Featuring 50 new faculty positions and a new headquarters building, the $1 billion interdisciplinary initiative will bring together computer science, artificial intelligence (AI), data science and related programs across the institution. MIT will establish a new deanship for the college.
  • The new college…will explore and promote AI’s use in non-technology disciplines with a focus on ethical considerations, which are a growing concern as the technology becomes embedded in many fields.

 

Also see:

Alexa Sessions You Won’t Want to Miss at AWS re:Invent 2018 — from developer.amazon.com

Excerpts — with an eye towards where this might be leading in terms of learning spaces:

Alexa and AWS IoT — Voice is a natural interface to interact not just with the world around us, but also with physical assets and things, such as connected home devices, including lights, thermostats, or TVs. Learn how you can connect and control devices in your home using the AWS IoT platform and Alexa Skills Kit.

Connect Any Device to Alexa and Control Any Feature with the Updated Smart Home Skill API — Learn about the latest update to the Smart Home Skill API, featuring new capability interfaces you can use as building blocks to connect any device to Alexa, including those that fall outside of the traditional smart home categories of lighting, locks, thermostats, sensors, cameras, and audio/video gear. Start learning about how you can create a smarter home with Alexa.

Workshop: Build an Alexa Skill with Multiple Models — Learn how to build an Alexa skill that utilizes multiple interaction models and combines functionality into a single skill. Build an Alexa smart home skill from scratch that implements both custom interactions and smart home functionality within a single skill. Check out these resources to start learning:
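
Taken together, these session descriptions boil down to one backend that can answer both custom-skill requests and Smart Home directives. The sketch below shows that shape as a bare AWS Lambda handler. The intent name, endpoint details, and device behavior are invented for illustration, and the official workshop materials use the Alexa Skills Kit SDK rather than hand-built response dictionaries.

```python
# Hypothetical sketch: one AWS Lambda handler serving both a custom interaction
# model and Smart Home directives in a single skill. The intent name
# ("ClassroomStatusIntent") and the device behavior are invented.
import uuid
from datetime import datetime, timezone


def handler(event, context):
    # Smart Home requests arrive as a top-level "directive";
    # custom-skill requests arrive with a top-level "request" object.
    if "directive" in event:
        return handle_smart_home(event["directive"])
    return handle_custom(event["request"])


def handle_custom(request):
    # Minimal custom-skill response envelope (plain-text speech, end the session).
    if request["type"] == "IntentRequest" and request["intent"]["name"] == "ClassroomStatusIntent":
        speech = "The lights are on and the display is connected."
    else:
        speech = "Welcome. Ask me about the room."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }


def handle_smart_home(directive):
    # Handle an Alexa.PowerController TurnOn/TurnOff directive for a connected device.
    turning_on = directive["header"]["name"] == "TurnOn"
    return {
        "event": {
            "header": {
                "namespace": "Alexa",
                "name": "Response",
                "payloadVersion": "3",
                "messageId": str(uuid.uuid4()),
                "correlationToken": directive["header"].get("correlationToken"),
            },
            "endpoint": directive["endpoint"],
            "payload": {},
        },
        "context": {
            "properties": [{
                "namespace": "Alexa.PowerController",
                "name": "powerState",
                "value": "ON" if turning_on else "OFF",
                "timeOfSample": datetime.now(timezone.utc).isoformat(),
                "uncertaintyInMilliseconds": 500,
            }]
        },
    }
```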

 

Your next doctor’s appointment might be with an AI — from technologyreview.com by Douglas Heaven
A new wave of chatbots are replacing physicians and providing frontline medical advice—but are they as good as the real thing?

Excerpt:

The idea is to make seeking advice about a medical condition as simple as Googling your symptoms, but with many more benefits. Unlike self-diagnosis online, these apps lead you through a clinical-grade triage process—they’ll tell you if your symptoms need urgent attention or if you can treat yourself with bed rest and ibuprofen instead. The tech is built on a grab bag of AI techniques: language processing to allow users to describe their symptoms in a casual way, expert systems to mine huge medical databases, machine learning to string together correlations between symptom and condition.

Babylon Health, a London-based digital-first health-care provider, has a mission statement it likes to share in a big, bold font: to put an accessible and affordable health service in the hands of every person on earth. The best way to do this, says the company’s founder, Ali Parsa, is to stop people from needing to see a doctor.

Not everyone is happy about all this. For a start, there are safety concerns. Parsa compares what Babylon does with your medical data to what Facebook does with your social activities—amassing information, building links, drawing on what it knows about you to prompt some action. Suggesting you make a new friend won’t kill you if it’s a bad recommendation, but the stakes are a lot higher for a medical app.
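
At its core, the “clinical-grade triage process” described above is a decision procedure over reported symptoms, wrapped in language processing so people can describe those symptoms casually. The toy rule set below is purely illustrative: the symptom lists and advice strings are invented, this is not medical guidance, and real systems such as Babylon’s rely on far richer models and data.

```python
# Toy illustration of rule-based triage over reported symptoms.
# The rules and advice strings are invented; this is not medical guidance.

URGENT = {"chest pain", "difficulty breathing", "severe bleeding"}
SEE_GP = {"fever", "persistent cough", "rash"}

def triage(symptoms):
    """Return a coarse triage level for a set of reported symptom keywords."""
    reported = {s.strip().lower() for s in symptoms}
    if reported & URGENT:
        return "Seek urgent care now."
    if reported & SEE_GP:
        return "Book an appointment with a doctor."
    return "Self-care is likely fine; rest and monitor your symptoms."

print(triage(["Fever", "persistent cough"]))  # -> "Book an appointment with a doctor."
print(triage(["headache"]))                   # -> "Self-care is likely fine; ..."
```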

 

 

Also see:

 

 

What will be important in the learn and work ecosystem in 2030? How do we prepare? — from evolllution.com by Holly Zanville | Senior Advisor for Credentialing and Workforce Development, Lumina Foundation

Excerpt:

These seven suggested actions—common to all scenarios—especially resonated with Lumina:

  1. Focus on learning: All learners will need a range of competencies and skills, most critically: learning how to learn; having a foundation in math, science, IT and cross-disciplines; and developing the behaviors of grit, empathy and effective communication.
  2. Prepare all “systems”: Schools will continue to be important places to teach competencies and skills. Parents will be important teachers for children. Workplaces will also be important places for learning, and many learners will need instruction on how to work effectively as part of human/machine teams.
  3. Integrate education and work: Education systems will need to be integrated with work in an education/work ecosystem. To enable movement within the ecosystem, credentials will be useful, but only if they are transparent and portable. The competencies and skills that stand behind credentials will need to be identifiable, using a common language to enable (a) credential providers to educate/train for an integrated education/work system; (b) employers to hire people and upgrade their skills; and (c) governments (federal/state/local) to incentivize and regulate programs and policies that support the education/work system.
  4. Assess learning: Assessing competencies and skills acquired in multiple settings and modes (including artificial reality and virtual reality tools), will be essential. AI will enable powerful new assessment tools to collect and analyze data about what humans know and can do.
  5. Build fair, moral AI: There will be a high priority on ensuring that AI has built-in checks and balances that reflect moral values and honor different cultural perspectives.
  6. Prepare for human/machine futures: Machines will join humans in homes, schools and workplaces. Machines will likely be viewed as citizens with rights. Humans must prepare for side-by-side “relationships” with machines, especially in situations in which machines will be managing aspects of education, work and life formerly managed by humans. Major questions will also arise about the ownership of AI structures—what ownership looks like, and who profits from ubiquitous AI structures.
  7. Build networks for readiness/innovation: Open and innovative partnerships will be needed for whatever future scenarios emerge. In a data-rich world, we won’t solve problems alone; networks, partnerships and communities will be key.

 

 

Also see:

 

 

The coming revolution in software development — from forbes.com by Matt Bornstein

Excerpt:

Amid the deep learning hype, though, many observers miss the biggest reason to be optimistic about its future: deep learning requires coders to write very little actual code. Rather than relying on preset rules or if-then statements, a deep learning system writes rules automatically based on past examples. A software developer only has to create a “rough skeleton,” to paraphrase Andrej Karpathy from Tesla, then let the computers do the rest.

In this new world, developers no longer need to design a unique algorithm for each problem. Most work focuses, instead, on generating datasets that reflect desired behavior and managing the training process. Pete Warden from Google’s TensorFlow team pointed this out as far back as 2014: “I used to be a coder,” he wrote. “Now I teach computers to write their own programs.”

Again: the programming model driving the most important advances in software today does not require a significant amount of actual programming.

What does this mean for the future of software development?
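
Karpathy’s “rough skeleton” point is easier to see with a small, toy example. In the sketch below (a generic illustration, not anything from the article), the developer writes only the shape of a tiny network and a training loop; the behavior is learned from synthetic example data.

```python
# Toy "rough skeleton": the developer specifies a model shape and a training
# loop; the actual behavior is learned from example data, not hand-coded.
import torch
from torch import nn

# Synthetic examples of the desired behavior: roughly y = 3x + 1 plus noise.
x = torch.linspace(-1, 1, 200).unsqueeze(1)
y = 3 * x + 1 + 0.1 * torch.randn_like(x)

model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for step in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")
```

As the article notes, most of the real effort then goes into assembling datasets that reflect the desired behavior and managing training, not into the model code itself.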


An open letter to Microsoft and Google’s Partnership on AI — from wired.com by Gerd Leonhard
In a world where machines may have an IQ of 50,000, what will happen to the values and ethics that underpin privacy and free will?

Excerpt:

This open letter is my modest contribution to the unfolding of this new partnership. Data is the new oil – which now makes your companies the most powerful entities on the globe, way beyond oil companies and banks. The rise of ‘AI everywhere’ is certain to only accelerate this trend. Yet unlike the giants of the fossil-fuel era, there is little oversight on what exactly you can and will do with this new data-oil, and what rules you’ll need to follow once you have built that AI-in-the-sky. There appears to be very little public stewardship, while accepting responsibility for the consequences of your inventions is rather slow in surfacing.

 

In a world where machines may have an IQ of 50,000 and the Internet of Things may encompass 500 billion devices, what will happen with those important social contracts, values and ethics that underpin crucial issues such as privacy, anonymity and free will?

 

 

My book identifies what I call the “Megashifts”. They are changing society at warp speed, and your organisations are in the eye of the storm: digitization, mobilisation and screenification, automation, intelligisation, disintermediation, virtualisation and robotisation, to name the most prominent. Megashifts are not simply trends or paradigm shifts, they are complete game changers transforming multiple domains simultaneously.

 

 

If the question is no longer about if technology can do something, but why…who decides this?

Gerd Leonhard

 

 

From DSC:
Though this letter was written two years ago, in October 2016, the messages, reflections, and questions that Gerd puts on the table are still very relevant today. The leaders of these powerful companies have enormous power — power to do good or to do evil. Power to help or power to hurt. Power to be a positive force for societies throughout the globe and to help create dreams, or power to create dystopian societies and a future filled with nightmares. The state of the human heart is extremely key here — though many will hate me saying that. But it’s true. At the end of the day, we need to care very much about — and be extremely aware of — the character and values of the leaders of these powerful companies.

Also relevant/see:

Spray-on antennas will revolutionize the Internet of Things — from networkworld.com by Patrick Nelson
Researchers at Drexel University have developed a method to spray on antennas that outperform traditional metal antennas, opening the door to faster and easier IoT deployments.

 From DSC:
Again, it’s not too hard to imagine in this arena that technologies can be used for good or for ill.


The 11 coolest announcements Google made at its biggest product event of the year — from businessinsider.com by Dave Smith

Excerpt:

On Tuesday (10/9), Google invited journalists to New York City to debut its newest smartphone, the Pixel 3, among several other hardware goodies.

Google made dozens of announcements in its 75-minute event, but a handful of “wow” moments stole the show.

Here are the 11 coolest announcements Google made at its big hardware event…


Google’s Pixel 3 event in 6 minutes — from businessinsider.com

 

 

Google unveiled its new Home Hub but it might be ‘late to the market’ — from barrons.com by Emily Bary

Excerpt:

Alphabet’s Google has sold millions of voice-enabled speakers, but it inched toward the future at a Tuesday launch event when it introduced the Home Hub smart screen.

Google isn’t the first company to roll out a screen-enabled home device with voice-assistant technology — Amazon.com released its Echo Show in July 2017. Meanwhile, Lenovo has gotten good reviews for its Smart Display, and Facebook introduced the Portal on Monday.

For the most part, though, consumers have stuck to voice-only devices, and it will be up to Google and its rivals to convince them that an added screen is worth paying for. They’ll also have to reassure consumers that they can trust big tech to maintain their privacy, an admittedly harder task these days after recent security issues at Google and Facebook.

 

Amazon was right to realize early on that consumers aren’t always comfortable buying items they can’t even see pictures of, and that it’s hard to remember directions you’ve heard but not seen.

 

 

Google has announced its first smart speaker with a screen — from businessinsider.com by Brandt Ranj

  • Google has announced the Google Hub, its first smart home speaker with a screen. It’s available for pre-order at Best Buy, Target, and Walmart for $149. The Hub will be released on October 22.
  • The Hub has a 7″ touchscreen and sits on a base with a built-in speaker and microphones, which you can use to play music, watch videos, get directions, and control smart home accessories with your voice.
  • Its biggest advantage is its ability to hook into Google’s first-party services, like YouTube and Google Maps, which none of its competitors can use.
  • If you’re an Android or Chromecast user, trust Google more than Amazon, or want a smaller smart home speaker with a screen, the Google Hub is now your best bet.

 

 

Google is shutting down Google+ for consumers following security lapse — from theverge.com by Ashley Carman

Excerpt:

Google is going to shut down the consumer version of Google+ over the next 10 months, the company writes in a blog post today. The decision follows the revelation of a previously undisclosed security flaw, remedied in March 2018, that exposed users’ profile data.

 

 

Google shutters Google+ after users’ personal data exposed — from cbsnews.com

 

 

Google+ is shutting down, and the site’s few loyal users are mourning – from cnbc.com by Jillian D’Onfro

  • Even though Google had diverted resources away from Google+, there’s a small loyal base of users devastated to see it go away.
  • One fan said the company is using its recently revealed security breach as an excuse to shutter the site.
  • Google said it will be closing Google+ in the coming months.


Jarvish’s smart motorcycle helmets will offer Alexa and Siri support and an AR display — from theverge.com by Chaim Gartenberg

Excerpt:

The Jarvish X is the more basic of the two models. It offers integrated microphones and speakers for Siri, Google Assistant, and Alexa support, so wearers can access things like directions and weather updates and control music by voice. There’s also a 2K, front-facing camera built into the helmet so you can record your ride. It’s set to cost $799 when it hits Kickstarter in January.


10 jobs that are safe in an AI world — from linkedin.com by Kai-Fu Lee

Excerpts:

Teaching
AI will be a great tool for teachers and educational institutions, as it will help educators figure out how to personalize curriculum based on each student’s competence, progress, aptitude, and temperament. However, teaching will still need to be oriented around helping students figure out their interests, teaching students to learn independently, and providing one-on-one mentorship. These are tasks that can only be done by a human teacher. As such, there will still be a great need for human educators in the future.

Criminal defense law
Top lawyers will have nothing to worry about when it comes to job displacement. Reasoning across domains, winning the trust of clients, applying years of experience in the courtroom, and having the ability to persuade a jury are all examples of the cognitive complexities, strategies, and modes of human interaction that are beyond the capabilities of AI. However, a lot of paralegal and preparatory work like document review, analysis, creating contracts, handling small cases, packing cases, and coming up with recommendations can be done much better and more efficiently with AI. The costs of law make it worthwhile for AI companies to go after AI paralegals and AI junior lawyers, but not top lawyers.

 

From DSC:
In terms of teaching, I agree that while #AI will help personalize learning, there will still be a great need for human teachers, professors, and trainers. I also agree with my boss (and with some of the author’s viewpoints here, but not all) that many kinds of legal work will still need the human touch and human thought processes. I diverge from his thinking in terms of scope: the need for human lawyers will go far beyond those practicing criminal defense law.

 

Also see:

15 business applications for artificial intelligence and machine learning — from forbes.com

Excerpt:

Fifteen members of Forbes Technology Council discuss some of the latest applications they’ve found for AI/ML at their companies. Here’s what they had to say…
