In the world of 2030 and beyond, employers will no longer be separate entities from the education establishment. Pressures from both the supply and demand sides are so great that employers and learners will end up, by default, co-designing new learning experiences in which all learning counts.

 

OBJECTIVES FOR CONVENINGS

  • Identify the skills everyone will need to navigate the changing relationship between machine intelligence and people over the next 10-12 years.
  • Develop implications for work, workers, students, working learners, employers, and policymakers.
  • Identify a preliminary set of actions that need to be taken now to best prepare for the changing work + learn ecosystem.

Three key questions guided the discussions:

  1. What are the LEAST and MOST essential skills needed for the future?
  2. Where and how will tomorrow’s workers and learners acquire the skills they really need?
  3. Who is accountable for making sure individuals can thrive in this new economy?

This report summarizes the experts’ views on what skills will likely be needed to navigate the work + learn ecosystem over the next 10–15 years—and their suggested steps for better serving the nation’s future needs.

 

In a new world of work, driven especially by AI, institutionally sanctioned curricula could give way to AI-personalized learning. This would drastically change the nature of existing social contracts between employers and employees, teachers and students, and governments and citizens. Traditional social contracts would need to be renegotiated or revamped entirely. In the process, institutional assessment and evaluation could well shift from top-down approaches to new bottom-up tools and processes for developing capacities, valuing skills, and managing performance through new kinds of reputation or accomplishment scores.

 

In October 2017, Chris Wanstrath, CEO of GitHub, the foremost code-sharing and social networking resource for programmers today, made a bold statement: “The future of coding is no coding at all.” He believes that the writing of code will be automated in the near future, leaving humans to focus on “higher-level strategy and design of software.” Many of the experts at the convenings agreed. Even creating the AI systems of tomorrow, they asserted, will likely require less human coding than is needed today, with graphic interfaces turning AI programming into a drag-and-drop operation.

Digital fluency does not mean knowing coding languages. Experts at both convenings contended that effectively “befriending the machine” will be less about teaching people to code and more about being able to empathize with AIs and machines, understanding how they “see the world” and “think” and “make decisions.” Machines, after all, will increasingly create their own languages to talk to one another.

Here’s a list of many skills the experts do not expect to see much of—if at all—in the future:

  • Coding. Systems will be self-programming.
  • Building AI systems. Graphic interfaces will turn AI programming into drag-and-drop operations.
  • Calendaring, scheduling, and organizing. There won’t be a need for email triage.
  • Planning and even decision-making. AI assistants will pick this up.
  • Creating personalized curricula. Learners may design more of their own learning adventures.
  • Writing and reviewing resumes. Digital portfolios, personal branding, and performance reputation will replace resumes.
  • Language translation and localization. This will happen in real time using translator apps.
  • Legal research and writing. Many of our legal systems will be automated.
  • Validation skills. Machines will check people’s work to validate their skills.
  • Driving. Driverless vehicles will replace the need to learn how to drive.

Here’s a list of the most essential skills needed for the future:

  • Quantitative and algorithmic thinking.  
  • Managing reputation.  
  • Storytelling and interpretive skills.  
  • First principles thinking.  
  • Communicating with machines as machines.  
  • Augmenting high-skilled physical tasks with AI.
  • Optimization and debugging frame of mind.
  • Creativity and growth mindset.
  • Adaptability.
  • Emotional intelligence.
  • Truth seeking.
  • Cybersecurity.

 

The rise of machine intelligence is just one of the many powerful social, technological, economic, environmental, and political forces that are rapidly and disruptively changing the way everyone will work and learn in the future. Because this largely tech-driven force is so interconnected with other drivers of change, it is nearly impossible to understand the impact of intelligent agents on how we will work and learn without also imagining the ways in which these new tools will reshape how we live.

 

 

 

Benchmarking Higher Ed AV Staffing Levels — Revisited — from campustechnology.com by Mike Tomei
As AV-equipped classrooms on campus increase in both numbers and complexity, have AV departments staffed up accordingly? A recent survey sheds some light on how AV is managed in higher education.

Excerpt:

I think we can all agree that new AV system installs have a much higher degree of complexity compared to AV systems five or 10 years ago. The obvious culprits are active learning classrooms that employ multiple displays and matrix switching backends, and conferencing systems of varying complexity being installed in big and small rooms all over campus. But even if today’s standard basic classrooms are offering the same presentation functionality as they were five years ago, the backend AV technology running those systems has still increased in complexity. We’re trying to push very high resolution video signals around the room; copyright-protected digital content is coming into play; there are myriad BYOD devices and connectors that need to be supported; and we’re making a strong push to connect our AV devices to the enterprise network for monitoring and troubleshooting. This increase in AV system complexity just adds to the system design, installation and support burdens placed upon an AV department. Without an increase in FTE staff beyond what we’re seeing, there’s just no way that AV support can truly flourish on campuses.

Today we’re reopening the survey to continue to gather data about AV staffing levels, and we’ll periodically tabulate and publish the results for those that participate. Visit www.AV-Survey.com to take the survey. If you would like to request the full 2018 AV staffing survey results, including average AV department budgets, staffing levels by position, breakouts by public/private/community colleges and small/medium/large schools, please send an e-mail to me (mike@tomeiav.com) and to Craig Park from The Sextant Group (cpark@thesextantgroup.com).

 

 

 

AI Now Law and Policy Reading List — from medium.com by the AI Now Institute

Excerpt:

Data-driven technologies are widely used in society to make decisions that affect many critical aspects of our lives, from health, education, employment, and criminal justice to economic, social and political norms. Their varied applications, uses, and consequences raise a number of unique and complex legal and policy concerns. As a result, it can be hard to figure out not only how these systems work but what to do about them.

As a starting point, AI Now offers this Law and Policy Reading List tailored for those interested in learning about key concepts, debates, and leading analysis on law and policy issues related to artificial intelligence and other emerging data-driven technologies.

 

New game lets players train AI to spot legal issues — from abajournal.com by Jason Tashea

Excerpt:

Got a free minute? There’s a new game that will help train an artificial intelligence model to spot legal issues and help close the access-to-justice gap.

Called Learned Hands—yes, it’s a pun—the game takes 75,000 legal questions posted on Reddit dealing with family, consumer, criminal and other legal issues and asks the user to determine what the issue is.

While the game may conjure up nightmares of the first year of law school for many lawyers, David Colarusso says it’s for a good cause.

“It’s an opportunity for attorneys to take their downtime to train machine learning algorithms to help access-to-justice issues,” says Colarusso, director of Suffolk University Law School’s Legal Innovation and Technology (LIT) Lab and partner on this project with the Stanford Legal Design Lab.
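To make the workflow concrete, here is a minimal, hypothetical sketch of the kind of issue-spotting classifier that crowdsourced labels could feed: short posts paired with legal categories train a simple text model, which then tags new questions. This is not the Learned Hands project's actual model; the example posts and labels below are invented purely for illustration.

```python
# Sketch of an issue-spotting text classifier trained on crowdsourced labels.
# NOT the Learned Hands model; posts and labels are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples (in practice: tens of thousands of questions).
posts = [
    "My landlord is trying to evict me without notice.",
    "My ex won't let me see my kids this weekend.",
    "A debt collector keeps calling me at work.",
    "I was pulled over and charged with a DUI.",
]
labels = ["housing", "family", "consumer", "criminal"]

# TF-IDF features plus logistic regression: a simple, common baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# Spot the likely issue in a new, unlabeled question.
print(model.predict(["Can my landlord raise the rent in the middle of my lease?"]))
```

The real project works at far larger scale and with richer, multi-label data, but the basic loop (label, train, predict) is the same.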

 

From learnedhands.law.stanford.edu/legalIssues

When you play the game, you’ll be spotting whether different legal issues are present in people’s stories. Some of these issues will be high-level categories, and others will be more specific issues.

Here are the high-level categories:

 

 

MIT plans $1B computing college, AI research effort — from educationdive.com by James Paterson

Dive Brief (emphasis DSC):

  • The Massachusetts Institute of Technology is creating a College of Computing with the help of a $350 million gift from billionaire investor Stephen A. Schwarzman, who is the CEO and co-founder of the private equity firm Blackstone, in a move the university said is its “most significant reshaping” since 1950.
  • Featuring 50 new faculty positions and a new headquarters building, the $1 billion interdisciplinary initiative will bring together computer science, artificial intelligence (AI), data science and related programs across the institution. MIT will establish a new deanship for the college.
  • The new college…will explore and promote AI’s use in non-technology disciplines with a focus on ethical considerations, which are a growing concern as the technology becomes embedded in many fields.

 

Also see:

Alexa Sessions You Won’t Want to Miss at AWS re:Invent 2018 — from developer.amazon.com

Excerpts — with an eye towards where this might be leading in terms of learning spaces:

Alexa and AWS IoT — Voice is a natural interface to interact not just with the world around us, but also with physical assets and things, such as connected home devices, including lights, thermostats, or TVs. Learn how you can connect and control devices in your home using the AWS IoT platform and Alexa Skills Kit.

Connect Any Device to Alexa and Control Any Feature with the Updated Smart Home Skill API — Learn about the latest update to the Smart Home Skill API, featuring new capability interfaces you can use as building blocks to connect any device to Alexa, including those that fall outside of the traditional smart home categories of lighting, locks, thermostats, sensors, cameras, and audio/video gear. Start learning about how you can create a smarter home with Alexa.

Workshop: Build an Alexa Skill with Multiple Models — Learn how to build an Alexa skill that utilizes multiple interaction models and combines functionality into a single skill. Build an Alexa smart home skill from scratch that implements both custom interactions and smart home functionality within a single skill. Check out these resources to start learning:
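For readers curious what “combining functionality into a single skill” can look like in practice, here is a rough Python sketch of one AWS Lambda entry point that routes smart home directives and custom intents to separate handlers. The request and response payloads are simplified, and the device-control helpers and the RoomStatusIntent are hypothetical placeholders, so treat it as an illustration of the dispatch pattern rather than a complete, production-ready skill.

```python
# Sketch of one Lambda entry point serving both a custom interaction model
# and smart home functionality. Payloads are simplified; turn_on_light(),
# build_power_response(), and RoomStatusIntent are illustrative placeholders.

def lambda_handler(event, context):
    # Smart Home Skill API requests arrive as a top-level "directive".
    if "directive" in event:
        return handle_smart_home(event["directive"])
    # Custom skill requests arrive with a "request" object instead.
    request = event.get("request", {})
    if request.get("type") == "IntentRequest":
        return handle_custom_intent(request["intent"])
    return speak("Sorry, I didn't understand that.")

def handle_smart_home(directive):
    header = directive["header"]
    if header["namespace"] == "Alexa.PowerController" and header["name"] == "TurnOn":
        turn_on_light(directive["endpoint"]["endpointId"])
        return build_power_response(directive)
    raise ValueError("Unsupported directive: %s" % header["name"])

def handle_custom_intent(intent):
    if intent["name"] == "RoomStatusIntent":  # hypothetical custom intent
        return speak("All classroom displays are online.")
    return speak("That request isn't supported yet.")

def speak(text):
    # Minimal custom-skill response envelope.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

def turn_on_light(endpoint_id):
    # Placeholder: real code would call a device cloud or AWS IoT here.
    print("Turning on device %s" % endpoint_id)

def build_power_response(directive):
    # Placeholder: a real skill returns a full Alexa.Response event.
    return {"event": {"header": {"namespace": "Alexa", "name": "Response",
                                 "payloadVersion": "3"},
                      "payload": {}}}

if __name__ == "__main__":
    # Simulate a custom-skill request hitting the same entry point.
    fake_event = {"request": {"type": "IntentRequest",
                              "intent": {"name": "RoomStatusIntent"}}}
    print(lambda_handler(fake_event, None))
```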

 

Psalm 25:4-6 New International Version (NIV) — from biblegateway.com

Show me your ways, Lord,
    teach me your paths.
Guide me in your truth and teach me,
    for you are God my Savior,
    and my hope is in you all day long.
Remember, Lord, your great mercy and love,
    for they are from of old.

 

Your next doctor’s appointment might be with an AI — from technologyreview.com by Douglas Heaven
A new wave of chatbots are replacing physicians and providing frontline medical advice—but are they as good as the real thing?

Excerpt:

The idea is to make seeking advice about a medical condition as simple as Googling your symptoms, but with many more benefits. Unlike self-diagnosis online, these apps lead you through a clinical-grade triage process—they’ll tell you if your symptoms need urgent attention or if you can treat yourself with bed rest and ibuprofen instead. The tech is built on a grab bag of AI techniques: language processing to allow users to describe their symptoms in a casual way, expert systems to mine huge medical databases, machine learning to string together correlations between symptom and condition.

Babylon Health, a London-based digital-first health-care provider, has a mission statement it likes to share in a big, bold font: to put an accessible and affordable health service in the hands of every person on earth. The best way to do this, says the company’s founder, Ali Parsa, is to stop people from needing to see a doctor.

Not everyone is happy about all this. For a start, there are safety concerns. Parsa compares what Babylon does with your medical data to what Facebook does with your social activities—amassing information, building links, drawing on what it knows about you to prompt some action. Suggesting you make a new friend won’t kill you if it’s a bad recommendation, but the stakes are a lot higher for a medical app.
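The “grab bag of AI techniques” described in the excerpt above (language processing, expert systems, machine learning) can be hard to picture. Here is a deliberately tiny, invented sketch of a triage flow: pull known symptom phrases out of free text, score them against a small condition table, and map the best match to an urgency level. The table and scoring rule are made up for illustration and are nothing like the clinically validated knowledge bases and trained models these products actually use.

```python
# Toy triage sketch: keyword extraction plus a small "expert system" lookup.
# The symptom/condition table and urgency labels are invented for
# illustration only; this is not medical advice or any product's real logic.

CONDITIONS = {
    "tension headache": {"symptoms": {"headache", "stress"}, "urgency": "self-care"},
    "influenza":        {"symptoms": {"fever", "cough", "aches"}, "urgency": "routine visit"},
    "appendicitis":     {"symptoms": {"abdominal pain", "fever", "nausea"}, "urgency": "urgent care"},
}

def extract_symptoms(text):
    """Very rough 'language processing': match known symptom phrases."""
    text = text.lower()
    known = {s for c in CONDITIONS.values() for s in c["symptoms"]}
    return {s for s in known if s in text}

def triage(text):
    """Score each condition by symptom overlap and return the best guess."""
    symptoms = extract_symptoms(text)
    best, best_score = None, 0.0
    for name, info in CONDITIONS.items():
        score = len(symptoms & info["symptoms"]) / len(info["symptoms"])
        if score > best_score:
            best, best_score = name, score
    if best is None:
        return "No match; ask follow-up questions or refer to a clinician."
    return "Possible %s (overlap %.0f%%): suggested action is %s." % (
        best, best_score * 100, CONDITIONS[best]["urgency"])

print(triage("I've had a fever and a bad cough with body aches for two days."))
```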

 

 

Also see:

 

 

What will be important in the learn and work ecosystem in 2030? How do we prepare? — from evolllution.com by Holly Zanville | Senior Advisor for Credentialing and Workforce Development, Lumina Foundation

Excerpt:

These seven suggested actions—common to all scenarios—especially resonated with Lumina:

  1. Focus on learning: All learners will need a range of competencies and skills, most critically: learning how to learn; having a foundation in math, science, IT and cross-disciplines; and developing the behaviors of grit, empathy and effective communication.
  2. Prepare all “systems”: Schools will continue to be important places to teach competencies and skills. Parents will be important teachers for children. Workplaces will also be important places for learning, and many learners will need instruction on how to work effectively as part of human/machine teams.
  3. Integrate education and work: Education systems will need to be integrated with work in an education/work ecosystem. To enable movement within the ecosystem, credentials will be useful, but only if they are transparent and portable. The competencies and skills that stand behind credentials will need to be identifiable, using a common language to enable (a) credential providers to educate/train for an integrated education/work system; (b) employers to hire people and upgrade their skills; and (c) governments (federal/state/local) to incentivize and regulate programs and policies that support the education/work system.
  4. Assess learning: Assessing competencies and skills acquired in multiple settings and modes (including artificial reality and virtual reality tools), will be essential. AI will enable powerful new assessment tools to collect and analyze data about what humans know and can do.
  5. Build fair, moral AI: There will be a high priority on ensuring that AI has built-in checks and balances that reflect moral values and honor different cultural perspectives.
  6. Prepare for human/machine futures: Machines will join humans in homes, schools and workplaces. Machines will likely be viewed as citizens with rights. Humans must prepare for side-by-side “relationships” with machines, especially in situations in which machines will be managing aspects of education, work and life formerly managed by humans. Major questions will also arise about the ownership of AI structures—what ownership looks like, and who profits from ubiquitous AI structures.
  7. Build networks for readiness/innovation: Open and innovative partnerships will be needed for whatever future scenarios emerge. In a data-rich world, we won’t solve problems alone; networks, partnerships and communities will be key.

 

 

Also see:

 

 

7 Internet of Things examples that are super futuristic — from blog.hubspot.com by Caroline Forsey

 


7 ways to keep your smart home from being hacked — from marketwatch.com

 

The coming revolution in software development — from forbes.com by Matt Bornstein

Excerpt:

Amid the deep learning hype, though, many observers miss the biggest reason to be optimistic about its future: deep learning requires coders to write very little actual code. Rather than relying on preset rules or if-then statements, a deep learning system writes rules automatically based on past examples. A software developer only has to create a “rough skeleton,” to paraphrase Andrej Karpathy from Tesla, then let the computers do the rest.

In this new world, developers no longer need to design a unique algorithm for each problem. Most work focuses, instead, on generating datasets that reflect desired behavior and managing the training process. Pete Warden from Google’s TensorFlow team pointed this out as far back as 2014: “I used to be a coder,” he wrote. “Now I teach computers to write their own programs.”

Again: the programming model driving the most important advances in software today does not require a significant amount of actual programming.

What does this mean for the future of software development?
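A small sketch can make the shift Karpathy and Warden describe more tangible. In the example below, no conversion rule is ever written; the developer supplies example input/output pairs and a rough model skeleton, and training recovers the rule (a simple Fahrenheit-to-Celsius mapping) from the data. It uses PyTorch and toy data purely to illustrate the workflow, not any particular production system.

```python
# "Software 2.0" in miniature: no explicit conversion rule is coded.
# The developer supplies (a) example input/output pairs and (b) a rough
# model skeleton; training recovers the rule from the data.
import torch
from torch import nn, optim

# Example data reflecting the desired behavior: Fahrenheit -> Celsius.
f = torch.linspace(-40, 120, 200).unsqueeze(1)
c = (f - 32.0) * 5.0 / 9.0 + 0.5 * torch.randn_like(f)  # noisy "labels"

# The "rough skeleton": a single linear layer suffices for this rule.
model = nn.Linear(1, 1)
opt = optim.Adam(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

# Managing the training process is most of the remaining work.
for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(f), c)
    loss.backward()
    opt.step()

print(model.weight.item(), model.bias.item())  # should approach 5/9 and about -17.8
print(model(torch.tensor([[212.0]])).item())   # roughly 100 degrees C
```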

 

 

 

An open letter to Microsoft and Google’s Partnership on AI — from wired.com by Gerd Leonhard
In a world where machines may have an IQ of 50,000, what will happen to the values and ethics that underpin privacy and free will?

Excerpt:

This open letter is my modest contribution to the unfolding of this new partnership. Data is the new oil – which now makes your companies the most powerful entities on the globe, way beyond oil companies and banks. The rise of ‘AI everywhere’ is certain to only accelerate this trend. Yet unlike the giants of the fossil-fuel era, there is little oversight on what exactly you can and will do with this new data-oil, and what rules you’ll need to follow once you have built that AI-in-the-sky. There appears to be very little public stewardship, while accepting responsibility for the consequences of your inventions is rather slow in surfacing.

 

In a world where machines may have an IQ of 50,000 and the Internet of Things may encompass 500 billion devices, what will happen with those important social contracts, values and ethics that underpin crucial issues such as privacy, anonymity and free will?

 

 

My book identifies what I call the “Megashifts”. They are changing society at warp speed, and your organisations are in the eye of the storm: digitization, mobilisation and screenification, automation, intelligisation, disintermediation, virtualisation and robotisation, to name the most prominent. Megashifts are not simply trends or paradigm shifts, they are complete game changers transforming multiple domains simultaneously.

 

 

If the question is no longer about if technology can do something, but why…who decides this?

Gerd Leonhard

 

 

From DSC:
Though this letter was written two years ago, back in October 2016, the messages, reflections, and questions that Gerd puts on the table are very much still relevant today. The leaders of these powerful companies have enormous power — power to do good, or to do evil. Power to help or power to hurt. Power to be a positive force for societies throughout the globe and to help create dreams, or power to create dystopian societies while developing a future filled with nightmares. The state of the human heart is extremely key here — though many will hate me saying that. But it’s true. At the end of the day, we need to very much care about — and be extremely aware of — the characters and values of the leaders of these powerful companies.

 

 

Also relevant/see:

Spray-on antennas will revolutionize the Internet of Things — from networkworld.com by Patrick Nelson
Researchers at Drexel University have developed a method to spray on antennas that outperform traditional metal antennas, opening the door to faster and easier IoT deployments.

 From DSC:
Again, it’s not too hard to imagine in this arena that technologies can be used for good or for ill.

 

 

Three shifts as big as print to digital — from gettingsmart.com by Tom Vander Ark

Excerpts (emphasis DSC):

We just lived through the biggest shift in learning since the printing press—a 25-year shift from print to digital. While it extended access and options to billions, it didn’t prove as transformational as many of us expected. It did, however, set the stage for three shifts that will change what and how people learn.

  1. Basic to broader aims.
  2. Passive to active learning.
  3. Time to demonstrated learning.

 

 

 

California enacts first law regulating Internet of Things devices — from iplawtrends.com by David Rice with thanks to my friend Justin Wagner for posting this on LinkedIn

Excerpt:

California has enacted the nation’s first law regulating Internet of Things (IoT) devices, which was signed by Governor Jerry Brown on September 28, 2018. IoT refers to the rapidly-expanding world of internet-connected objects such as home security systems, video monitors, enterprise devices that track packages and vehicles, health monitors, connected cars, smart city devices that manage traffic congestion, and smart meters for utilities.

IoT devices promise to bring efficiencies to a broad range of industries and improve lives. But these devices also collect vast troves of information, and this raises data security and privacy concerns. In 2016, a distributed denial of service (DDoS) attack on the internet infrastructure company Dyn was powered by millions of hacked IoT devices such as web cameras and connected refrigerators. Hackers have used baby monitors to view inside homes, with a prominent recent example being the widely-deployed Mi-Cam baby monitor. If hackers are able to get into critical IoT systems in first responder networks, then there could be public safety risks.

 

The law states that a reasonable security feature includes either a unique preprogrammed password for each IoT device or a requirement that the user generate a new means of authentication before access to the device is granted for the first time.
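As a rough illustration of the second option described above (requiring the user to generate a new means of authentication before first access), here is a small, hypothetical sketch of a first-boot gate that refuses to serve requests until the factory-default credential has been replaced. The function names and storage file are invented; real firmware would rely on the platform's own secure storage and authentication facilities.

```python
# Sketch of a "no default password" gate of the kind the California law
# points toward: the device refuses normal operation until the owner sets a
# new credential. Function names and the credential file are illustrative.
import hashlib
import json
import os
import secrets

CRED_FILE = "credential.json"  # illustrative; real firmware uses secure storage

def _hash(password, salt):
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000).hex()

def credential_is_set():
    return os.path.exists(CRED_FILE)

def set_credential(new_password):
    """Called from the first-run setup flow before any other feature works."""
    if len(new_password) < 12:
        raise ValueError("Choose a longer passphrase.")
    salt = secrets.token_bytes(16)
    with open(CRED_FILE, "w") as fh:
        json.dump({"salt": salt.hex(), "hash": _hash(new_password, salt)}, fh)

def require_auth(password):
    """Gate every request: no stored credential means no access at all."""
    if not credential_is_set():
        raise PermissionError("Set a device password before first use.")
    with open(CRED_FILE) as fh:
        stored = json.load(fh)
    if _hash(password, bytes.fromhex(stored["salt"])) != stored["hash"]:
        raise PermissionError("Invalid credential.")
```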

 

 

 
 

The 11 coolest announcements Google made at its biggest product event of the year — from businessinsider.com by Dave Smith

Excerpt:

On Tuesday (10/9), Google invited journalists to New York City to debut its newest smartphone, the Pixel 3, among several other hardware goodies.

Google made dozens of announcements in its 75-minute event, but a handful of “wow” moments stole the show.

Here are the 11 coolest announcements Google made at its big hardware event…

 

 

 

Google’s Pixel 3 event in 6 minutes — from businessinsider.com

 

 

Google unveiled its new Home Hub but it might be ‘late to the market’ — from barrons.com by Emily Bary

Excerpt:

Alphabet’s Google has sold millions of voice-enabled speakers, but it inched toward the future at a Tuesday launch event when it introduced the Home Hub smart screen.

Google isn’t the first company to roll out a screen-enabled home device with voice-assistant technology—Amazon.com released its Echo Show in July 2017. Meanwhile, Lenovo has gotten good reviews for its Smart Display, and Facebook introduced the Portal on Monday.

For the most part, though, consumers have stuck to voice-only devices, and it will be up to Google and its rivals to convince them that an added screen is worth paying for. They’ll also have to reassure consumers that they can trust big tech to maintain their privacy, an admittedly harder task these days after recent security issues at Google and Facebook.

 

Amazon was right to realize early on that consumers aren’t always comfortable buying items they can’t even see pictures of, and that it’s hard to remember directions you’ve heard but not seen.

 

 

Google has announced its first smart speaker with a screen — from businessinsider.com by Brandt Ranj

  • Google has announced the Google Home Hub, its first smart home speaker with a screen. It’s available for pre-order at Best Buy, Target, and Walmart for $149. The Hub will be released on October 22.
  • The Hub has a 7″ touchscreen and sits on a base with a built-in speaker and microphones, which you can use to play music, watch videos, get directions, and control smart home accessories with your voice.
  • Its biggest advantage is its ability to hook into Google’s first-party services, like YouTube and Google Maps, which none of its competitors can use.
  • If you’re an Android or Chromecast user, trust Google more than Amazon, or want a smaller smart home speaker with a screen, the Google Hub is now your best bet.

 

 

Google is shutting down Google+ for consumers following security lapse — from theverge.com by Ashley Carman

Excerpt:

Google is going to shut down the consumer version of Google+ over the next 10 months, the company writes in a blog post today. The decision follows the revelation of a previously undisclosed security flaw, remedied in March 2018, that exposed users’ profile data.

 

 

Google shutters Google+ after users’ personal data exposed — from cbsnews.com

 

 

Google+ is shutting down, and the site’s few loyal users are mourning — from cnbc.com by Jillian D’Onfro

  • Even though Google had diverted resources away from Google+, there’s a small loyal base of users devastated to see it go away.
  • One fan said the company is using its recently revealed security breach as an excuse to shutter the site.
  • Google said it will be closing Google+ in the coming months.

 

 

 