From DSC:
I agree with futurist Thomas Frey:

“I’ve been predicting that by 2030 the largest company on the internet is going to be an education-based company that we haven’t heard of yet.”

(source)

 

Along these lines, see what Arizona State University is up to:

We think of this as a transformation away from a mass-production model to a mass-personalization model. For us, that’s the big win in this whole process. When we move away from the large lectures in that mass-production model that we’ve used for the last 170 years and get into something that reflects each of the individual learners’ needs and can personalize their learning path through the instructional resources, we will have successfully moved the education industry to the new frontier in the learning process. We think that mass personalization has already permeated every aspect of our lives, from navigation to entertainment; and education is really the next big frontier.

(source)

 

From DSC:
Each year, the vision I outlined here gets closer and closer. With the advancements in Artificial Intelligence (AI), change is on the horizon…big time. Mass personalization. More choice. More control.

 

 

LinkedIn Learning Opens Its Platform (Slightly) [Young]

LinkedIn Learning Opens Its Platform (Slightly) — from edsurge by Jeff Young

Excerpt (emphasis DSC):

A few years ago, in a move toward professional learning, LinkedIn bought Lynda.com for $1.5 billion, adding the well-known library of video-based courses to its professional social network. Today LinkedIn officials announced that they plan to open up their platform to let in educational videos from other providers as well—but with a catch or two.

The plan, announced Friday, is to let companies or colleges who already subscribe to LinkedIn Learning add content from a select group of other providers. The company or college will still have to subscribe to those other services separately, so it’s essentially an integration—but it does mark a change in approach.

For LinkedIn, the goal is to become the front door for employees as they look for micro-courses for professional development.

 

LinkedIn also announced another service for its LinkedIn Learning platform called Q&A, which will give subscribers the ability to pose a question they have about the video lessons they’re taking. The question will first be sent to bots, but if that doesn’t yield an answer the query will be sent on to other learners, and in some cases the instructor who created the videos.
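
The escalation path described above (automated answer first, then other learners, then the instructor) is essentially a fallback chain. Here is a minimal sketch of that pattern in Python; the handler functions are hypothetical stand-ins, not LinkedIn's actual API.

```python
def answer_question(question: str) -> str:
    """Route a learner's question through an escalation chain:
    automated answer first, then peers, then the course instructor.
    All handlers below are hypothetical stand-ins for illustration."""
    for handler in (ask_bot, ask_other_learners, ask_instructor):
        answer = handler(question)
        if answer is not None:
            return answer
    return "No answer yet. We'll notify you when someone responds."

# Hypothetical handlers; each returns an answer string, or None if it can't help.
def ask_bot(question: str):
    return None  # e.g., no confident match in the FAQ/transcript index

def ask_other_learners(question: str):
    return None  # e.g., posted to course peers, no reply yet

def ask_instructor(question: str):
    return "Instructor: good question; see the note I added to lesson 3."

print(answer_question("Why does the demo use a different dataset?"))
```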

 

 

Also see:

LinkedIn becomes a serious open learning experience platform — from clomedia.com by Josh Bersin
LinkedIn is becoming a dominant learning solution with some pretty interesting competitive advantages, according to one learning analyst.

Excerpt:

LinkedIn has become quite a juggernaut in the corporate learning market. Last time I checked the company had more than 17 million users, 14,000 corporate customers, more than 3,000 courses and was growing at high double-digit rates. And all this in only about two years.

And the company just threw down the gauntlet; it’s now announcing it has completely opened up its learning platform to external content partners. This is the company’s formal announcement that LinkedIn Learning is not just an amazing array of content, it is a corporate learning platform. The company wants to become a single place for all organizational learning content.

 

LinkedIn now offers skills-based learning recommendations to any user through its machine learning algorithms. 

 

 



Is there demand for staying relevant? For learning new skills? For reinventing oneself?

Well…let’s see.


From DSC:
So…look out, higher ed and traditional forms of accreditation — your window of opportunity may be starting to close. Alternatives to traditional higher ed continue to appear on the scene and gain momentum. LinkedIn — and/or similar organizations in the future — along with blockchain- and big-data-backed efforts may gain traction and start taking away major market share. If employers get solid performance from employees who have gone this route, higher ed had better look out.

Microsoft/LinkedIn/Lynda.com are nicely positioned to be a major player that can offer society a next generation learning platform at an incredible price — offering up-to-date microlearning along with new forms of credentialing. It’s what I’ve been calling the Amazon.com of higher ed (previously the Walmart of Education) for roughly 10 years. It will likely take shape in a strategy/platform similar to this one.

 



Also, this is what a gorilla on the back looks like:

 


 



Also see:

  • Meet the 83-Year-Old App Developer Who Says Edtech Should Better Support Seniors — from edsurge.com by Sydney Johnson
    Excerpt (emphasis DSC):
    Now at age 83, Wakamiya beams with excitement when she recounts her journey, which has been featured in news outlets and even at Apple’s developer conference last year. But through learning how to code, she believes that experience offers an even more important lesson to today’s education and technology companies: don’t forget about senior citizens. Today’s education technology products overwhelmingly target young people. And while there’s a growing industry around serving adult learners in higher education, companies largely neglect to consider the needs of the elderly.

 

 

The global companies that failed to adapt to change. — from trainingmag.com by Professor M.S. Rao, Ph.D.

Excerpt:

Eastman Kodak, a leader for many years, filed for bankruptcy in 2012. Blockbuster Video became defunct in 2013. Similarly, Borders — one of the largest book retailers in the U.S. — went out of business in 2011. Why did these companies, which once had great brands, ultimately fail? It is because they failed to adapt to change. Additionally, they failed to unlearn and relearn.

Former GE CEO Jack Welch once remarked, “If the rate of change on the outside exceeds the rate of change on the inside, the end is near.” Thus, accept change before the change is thrust on you.

Leaders must adopt tools and techniques to adapt to change. Here is a blueprint to embrace change effectively:

  • Keep the vision right and straight, and articulate it effectively.
  • Create an organizational culture conducive to bringing about change.
  • Communicate clearly about the need to change.
  • Enlighten people about the implications of maintaining the status quo.
  • Show them the benefits once the change is implemented.
  • Coordinate all stakeholders effectively.
  • Remove roadblocks by allaying people’s apprehensions.
  • Show them small gains to ensure that the entire change takes place smoothly, without resistance.

 

From DSC:
Though I’m not on board with all of the perspectives in that article, institutions of traditional higher education likely have something to learn from the failures of these companies — while there’s still time to change and to innovate.

 

 

Robots won’t replace instructors, 2 Penn State educators argue. Instead, they’ll help them be ‘more human.’ — from edsurge.com by Tina Nazerian

Excerpt:

Specifically, [AI] will help them prepare for and teach their courses through several phases—ideation, design, assessment, facilitation, reflection and research. The two described a few prototypes they’ve built to show what that might look like.

 

Also see:

The future of education: Online, free, and with AI teachers? — from fool.com by Simon Erickson
Duolingo is using artificial intelligence to teach 300 million people a foreign language for free. Will this be the future of education?

Excerpts:

While it might not get a lot of investor attention, education is actually one of America’s largest markets.

The U.S. has 20 million undergraduates enrolled in colleges and universities right now and another 3 million enrolled in graduate programs. Those undergrads paid an average of $17,237 for tuition, room, and board at public institutions in the 2016-17 school year and $44,551 for private institutions. Graduate education varies widely by area of focus, but the average amount paid for tuition alone was $24,812 last year.

Add all of those up, and America’s students are paying more than half a trillion dollars each year for their education! And that doesn’t even include the interest amassed for student loans, the college-branded merchandise, or all the money spent on beer and coffee.
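
As a rough back-of-envelope check on that half-trillion figure: the enrollment counts and average prices come from the excerpt above, but the 75/25 public/private undergraduate split below is purely an assumption for illustration.

```python
# Back-of-envelope check of the "more than half a trillion" claim.
# Enrollment and price figures are from the excerpt; the public/private
# split is an assumed 75/25 for illustration only.

undergrads = 20_000_000
grads = 3_000_000

public_cost = 17_237      # avg tuition, room, and board (public, 2016-17)
private_cost = 44_551     # avg tuition, room, and board (private, 2016-17)
grad_tuition = 24_812     # avg graduate tuition alone

public_share = 0.75       # assumption, not from the article

total = (undergrads * public_share * public_cost
         + undergrads * (1 - public_share) * private_cost
         + grads * grad_tuition)

print(f"Estimated annual spend: ${total / 1e12:.2f} trillion")
# Roughly $0.56 trillion under these assumptions, i.e. "more than half a trillion."
```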

Keeping the costs down
Several companies are trying to find ways to make college more affordable and accessible.

 

But after we launched, we have so many users that nowadays if the system wants to figure out whether it should teach plurals before adjectives or adjectives before plurals, it just runs a test with about 50,000 people. So for the next 50,000 people that sign up, which takes about six hours for 50,000 new users to come to Duolingo, to half of them it teaches plurals before adjectives. To the other half it teaches adjectives before plurals. And then it measures which ones learn better. And so once and for all it can figure out, ah it turns out for this particular language to teach plurals before adjectives for example.

So every week the system is improving. It’s making itself better at teaching by learning from our learners. So it’s doing that just based on huge amounts of data. And this is why it’s become so successful I think at teaching and why we have so many users.
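
The excerpt above is describing a large-scale A/B test over curriculum ordering. Here is a minimal sketch of that general idea, assuming a hypothetical learning_score() metric and a simple 50/50 assignment; it illustrates the technique, not Duolingo's actual code.

```python
import random

def assign_variant(user_id: int) -> str:
    """Assign each new user to one of two curriculum orderings (50/50 split)."""
    return "plurals_first" if user_id % 2 == 0 else "adjectives_first"

def run_experiment(users, learning_score):
    """Compare mean learning outcomes for the two orderings.

    learning_score(user, variant) is a hypothetical metric (e.g., accuracy on
    a follow-up quiz); in a real system it would come from logged learner data.
    """
    results = {"plurals_first": [], "adjectives_first": []}
    for user in users:
        variant = assign_variant(user)
        results[variant].append(learning_score(user, variant))
    return {variant: sum(scores) / len(scores) for variant, scores in results.items()}

if __name__ == "__main__":
    # Simulated data: pretend plurals-first users score slightly higher.
    random.seed(0)
    outcome = run_experiment(
        users=range(50_000),
        learning_score=lambda u, v: random.gauss(0.72 if v == "plurals_first" else 0.70, 0.1),
    )
    print(outcome)  # the ordering with the higher mean "wins" for this language
```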

 

 

From DSC:
I see AI helping learners, instructors, teachers, and trainers. I see AI being a tool to help do some of the heavy lifting, but people still like to learn with other people…with actual human beings. That said, a next generation learning platform could be far more responsive than what today’s traditional institutions of higher education are delivering.

 

 

Affordable and at-scale — from insidehighered.com by Ray Schroeder
Affordable degrees at scale have arrived. The momentum behind this movement is undeniable, and its impact will be significant, Ray Schroeder writes.

Excerpt (emphasis DSC):

How many times have we been told that major change in our field is on the near horizon? Too many times, indeed.

The promises of technologies and practices have fallen short more often than not. Just seven years ago, I was part of the early MOOC movement and felt the pulsating potential of teaching thousands of students around the world in a single class. The “year of the MOOC” was declared in 2012. Three years later, skeptics declared that the MOOC had died an ignominious death with high “failure” rates and relatively little recognition by employers.

However, the skeptics were too impatient, misunderstood the nature of MOOCs and lacked the vision of those at Georgia Tech, the University of Illinois, Arizona State University, Coursera, edX and scores of other institutions that have persevered in building upon MOOCs’ premises to develop high-quality, affordable courses, certificates and now, degrees at scale.

No, these degrees are not free, but they are less than half the cost of on-campus versions. No, they are not massive in the hundreds of thousands, but they are certainly at large scale with many thousands enrolled. In computer science, the success is felt across the country.

 

Georgia Tech alone has enrolled 10,000 students overall in its online master’s program and is adding thousands of new students each semester in a top-10-ranked degree program costing less than $7,000. Georgia Tech broke new ground by building collaborations among several partners. Yet that was just the beginning, and many leading universities have followed.

 

 

Also see:

Trends for the future of education with Jeff Selingo — from steelcase.com
How the future of work and new technology will make place more important than ever.

Excerpt:

Selingo sees artificial intelligence and big data as game changers for higher education. He says AI can free up professors and advisors to spend more time with students by answering some of the more frequently asked questions and handling some of the grading. He also says data can help us track and predict student performance to help students achieve better outcomes. “When they come in as a first-year student, we can say ‘People who came in like you that had similar high school grades and took similar classes ended up here. So, if you want to get out of here in four years and have a successful career, here are the different pathways you should follow.’”
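
The "people who came in like you" idea is essentially a nearest-neighbor lookup over past student records. A minimal sketch follows, with entirely made-up features and outcomes, just to show the shape of the approach.

```python
from math import dist

# Hypothetical historical records: (hs_gpa, math_credits, stem_intent) -> outcome.
# Every feature, record, and outcome label here is invented for illustration.
past_students = [
    ((3.9, 8, 1), "graduated in 4 years, CS degree"),
    ((3.5, 6, 1), "graduated in 5 years, engineering degree"),
    ((2.8, 2, 0), "switched majors twice, graduated in 6 years"),
    ((3.7, 7, 1), "graduated in 4 years, data-science degree"),
    ((3.0, 3, 0), "left after 2 years, finished online later"),
]

def likely_pathways(new_student, k=3):
    """Return the outcomes of the k most similar past students
    ("people who came in like you")."""
    ranked = sorted(past_students, key=lambda rec: dist(rec[0], new_student))
    return [outcome for _, outcome in ranked[:k]]

print(likely_pathways((3.8, 7, 1)))
```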

 

 

 

Academics Propose a ‘Blockchain University,’ Where Faculty (and Algorithms) Rule — from edsurge.com by Jeff Young

Excerpt:

A group of academics affiliated with Oxford University have proposed a new model of higher education that replaces traditional administrators with “smart contracts” on the blockchain, the same technology that drives Bitcoin and other cryptocurrencies.

“Our aim is to create a university in which the bulk of administrative tasks are either eliminated or progressively automated,” said the effort’s founders in a white paper released earlier this year. Those proposing the idea added the university would be “a decentralised, non-profit, democratic community in which the use of blockchain technology will provide the contractual stability needed to pursue a full course of study.”

Experiments with blockchain in higher education are underway at multiple campuses around the country, and many of the researchers are looking into how to use the technology to verify and deliver credentials. The Massachusetts Institute of Technology, for example, began issuing diplomas via blockchain last year.

The plan by Oxford researchers goes beyond digital diplomas—and beyond many typical proposals to disrupt education in general. It argues for a completely new framework for how college is organized, how professors are paid, and how students connect with learning. In other words, it’s a long shot.

But even if the proposed platform never emerges, it is likely to spur debates about whether blockchain technology could one day allow professors to reclaim greater control of how higher education operates through digital contracts.

 

The platform would essentially allow professors to organize their own colleges, and teach and take payments from students directly.

 

 

 

Reflections on “Inside Amazon’s artificial intelligence flywheel” [Levy]

Inside Amazon’s artificial intelligence flywheel — from wired.com by Steven Levy
How deep learning came to power Alexa, Amazon Web Services, and nearly every other division of the company.

Excerpt (emphasis DSC):

Amazon loves to use the word flywheel to describe how various parts of its massive business work as a single perpetual motion machine. It now has a powerful AI flywheel, where machine-learning innovations in one part of the company fuel the efforts of other teams, who in turn can build products or offer services to affect other groups, or even the company at large. Offering its machine-learning platforms to outsiders as a paid service makes the effort itself profitable—and in certain cases scoops up yet more data to level up the technology even more.

It took a lot of six-pagers to transform Amazon from a deep-learning wannabe into a formidable power. The results of this transformation can be seen throughout the company—including in a recommendations system that now runs on a totally new machine-learning infrastructure. Amazon is smarter in suggesting what you should read next, what items you should add to your shopping list, and what movie you might want to watch tonight. And this year Thirumalai started a new job, heading Amazon search, where he intends to use deep learning in every aspect of the service.

“If you asked me seven or eight years ago how big a force Amazon was in AI, I would have said, ‘They aren’t,’” says Pedro Domingos, a top computer science professor at the University of Washington. “But they have really come on aggressively. Now they are becoming a force.”

Maybe the force.

 

 

From DSC:
When will we begin to see more mainstream recommendation engines for learning-based materials? With the demand for people to reinvent themselves, such a next generation learning platform can’t come soon enough! Some of the building blocks such a platform might include (a rough sketch follows the list below):

  • Turning over control to learners to create/enhance their own web-based learner profiles; and allowing people to say who can access their learning profiles.
  • AI-based recommendation engines to help people identify curated, effective digital playlists for what they want to learn about.
  • Voice-driven interfaces.
  • Matching employees to employers.
  • Matching one’s learning preferences (not styles) with the content being presented as one piece of a personalized learning experience.
  • From cradle to grave. Lifelong learning.
  • Multimedia-based, interactive content.
  • Asynchronously and synchronously connecting with others learning about the same content.
  • Online-based tutoring/assistance; remote assistance.
  • Reinventing oneself. Staying relevant. Surviving.
  • Competency-based learning.

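To make the first two bullets above a bit more concrete, here is a minimal sketch of what a learner-owned profile and a simple playlist-matching step could look like. Every class, field, and function name is hypothetical; this is an illustration of the idea, not any existing platform's design.

```python
from dataclasses import dataclass, field

@dataclass
class LearnerProfile:
    """A web-based learner profile owned, and selectively shared, by the learner."""
    learner_id: str
    goals: set[str] = field(default_factory=set)          # e.g., {"data analysis"}
    completed: set[str] = field(default_factory=set)      # credential/competency IDs
    preferences: dict = field(default_factory=dict)       # e.g., {"format": "video"}
    shared_with: set[str] = field(default_factory=set)    # who may read this profile

@dataclass
class Playlist:
    playlist_id: str
    topics: set[str]
    format: str

def recommend(profile: LearnerProfile, playlists: list[Playlist], top_n: int = 3):
    """Rank curated playlists by overlap with the learner's goals and preferences."""
    def score(p: Playlist) -> float:
        topic_match = len(p.topics & profile.goals)
        format_match = 0.5 if p.format == profile.preferences.get("format") else 0.0
        return topic_match + format_match
    return sorted(playlists, key=score, reverse=True)[:top_n]

# Example usage
me = LearnerProfile("learner-001", goals={"AI", "data analysis"}, preferences={"format": "video"})
catalog = [
    Playlist("pl-1", {"AI", "ethics"}, "video"),
    Playlist("pl-2", {"accounting"}, "text"),
    Playlist("pl-3", {"data analysis", "statistics"}, "video"),
]
for p in recommend(me, catalog):
    print(p.playlist_id)
```

The design point is that the profile, not the institution, is the unit the learner controls and chooses to share.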
 

The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV


We’re about to embark on a period in American history where career reinvention will be critical, perhaps more so than it’s ever been before. In the next decade, as many as 50 million American workers—a third of the total—will need to change careers, according to McKinsey Global Institute. Automation, in the form of AI (artificial intelligence) and RPA (robotic process automation), is the primary driver. McKinsey observes: “There are few precedents in which societies have successfully retrained such large numbers of people.”

Bill Triant and Ryan Craig

 

 

 

Also relevant/see:

Online education’s expansion continues in higher ed with a focus on tech skills — from educationdive.com by James Paterson

Dive Brief:

  • Online learning continues to expand in higher ed with the addition of several online master’s degrees and a new for-profit college that offers a hybrid of vocational training and liberal arts curriculum online.
  • Inside Higher Ed reported the nonprofit learning provider edX is offering nine master’s degrees through five U.S. universities — the Georgia Institute of Technology, the University of Texas at Austin, Indiana University, Arizona State University and the University of California, San Diego. The programs include cybersecurity, data science, analytics, computer science and marketing, and they cost from around $10,000 to $22,000. Most offer stackable certificates, helping students who change their educational trajectory.
  • Former Harvard University Dean of Social Science Stephen Kosslyn, meanwhile, will open Foundry College in January. The for-profit, two-year program targets adult learners who want to upskill, and it includes training in soft skills such as critical thinking and problem solving. Students will pay about $1,000 per course, though the college is waiving tuition for its first cohort.

 

 

 

 

In the 2030 and beyond world, employers will no longer be a separate entity from the education establishment. Pressures from both the supply and demand side are so large that employers and learners will end up, by default, co-designing new learning experiences, where all learning counts.

 

OBJECTIVES FOR CONVENINGS

  • Identify the skills everyone will need to navigate the changing relationship between machine intelligence and people over the next 10-12 years.
  • Develop implications for work, workers, students, working learners, employers, and policymakers.
  • Identify a preliminary set of actions that need to be taken now to best prepare for the changing work + learn ecosystem.

Three key questions guided the discussions:

  1. What are the LEAST and MOST essential skills needed for the future?
  2. Where and how will tomorrow’s workers and learners acquire the skills they really need?
  3. Who is accountable for making sure individuals can thrive in this new economy?

This report summarizes the experts’ views on what skills will likely be needed to navigate the work + learn ecosystem over the next 10–15 years—and their suggested steps for better serving the nation’s future needs.

 

In a new world of work, driven especially by AI, institutionally-sanctioned curricula could give way to AI-personalized learning. This would drastically change the nature of existing social contracts between employers and employees, teachers and students, and governments and citizens. Traditional social contracts would need to be renegotiated or revamped entirely. In the process, institutional assessment and evaluation could well shift from top-down to new bottom-up tools and processes for developing capacities, valuing skills, and managing performance through new kinds of reputation or accomplishment scores.

 

In October 2017, Chris Wanstrath, CEO of GitHub, the foremost code-sharing and social networking resource for programmers today, made a bold statement: “The future of coding is no coding at all.” He believes that the writing of code will be automated in the near future, leaving humans to focus on “higher-level strategy and design of software.” Many of the experts at the convenings agreed. Even creating the AI systems of tomorrow, they asserted, will likely require less human coding than is needed today, with graphic interfaces turning AI programming into a drag-and-drop operation.

Digital fluency does not mean knowing coding languages. Experts at both convenings contended that effectively “befriending the machine” will be less about teaching people to code and more about being able to empathize with AIs and machines, understanding how they “see the world” and “think” and “make decisions.” Machines will create languages to talk to one another.

Here’s a list of many skills the experts do not expect to see much of—if at all—in the future:

  • Coding. Systems will be self-programming.
  • Building AI systems. Graphic interfaces will turn AI programming into drag-and-drop operations.
  • Calendaring, scheduling, and organizing. There won’t be a need for email triage.
  • Planning and even decision-making. AI assistants will pick this up.
  • Creating more personalized curricula. Learners may design more of their own personalized learning adventure.
  • Writing and reviewing resumes. Digital portfolios, personal branding, and performance reputation will replace resumes.
  • Language translation and localization. This will happen in real time using translator apps.
  • Legal research and writing. Many of our legal systems will be automated.
  • Validation skills. Machines will check people’s work to validate their skills.
  • Driving. Driverless vehicles will replace the need to learn how to drive.

Here’s a list of the most essential skills needed for the future:

  • Quantitative and algorithmic thinking.  
  • Managing reputation.  
  • Storytelling and interpretive skills.  
  • First principles thinking.  
  • Communicating with machines as machines.  
  • Augmenting high-skilled physical tasks with AI.
  • Optimization and debugging frame of mind.
  • Creativity and growth mindset.
  • Adaptability.
  • Emotional intelligence.
  • Truth seeking.
  • Cybersecurity.

 

The rise of machine intelligence is just one of the many powerful social, technological, economic, environmental, and political forces that are rapidly and disruptively changing the way everyone will work and learn in the future. Because this largely tech-driven force is so interconnected with other drivers of change, it is nearly impossible to understand the impact of intelligent agents on how we will work and learn without also imagining the ways in which these new tools will reshape how we live.

 

 

 

MIT plans $1B computing college, AI research effort — from educationdive.com by James Paterson

Dive Brief (emphasis DSC):

  • The Massachusetts Institute of Technology is creating a College of Computing with the help of a $350 million gift from billionaire investor Stephen A. Schwarzman, who is the CEO and co-founder of the private equity firm Blackstone, in a move the university said is its “most significant reshaping” since 1950.
  • Featuring 50 new faculty positions and a new headquarters building, the $1 billion interdisciplinary initiative will bring together computer science, artificial intelligence (AI), data science and related programs across the institution. MIT will establish a new deanship for the college.
  • The new college…will explore and promote AI’s use in non-technology disciplines with a focus on ethical considerations, which are a growing concern as the technology becomes embedded in many fields.

 

Also see:

Alexa Sessions You Won’t Want to Miss at AWS re:Invent 2018 — from developer.amazon.com

Excerpts — with an eye towards where this might be leading in terms of learning spaces:

Alexa and AWS IoT — Voice is a natural interface to interact not just with the world around us, but also with physical assets and things, such as connected home devices, including lights, thermostats, or TVs. Learn how you can connect and control devices in your home using the AWS IoT platform and Alexa Skills Kit.

Connect Any Device to Alexa and Control Any Feature with the Updated Smart Home Skill API — Learn about the latest update to the Smart Home Skill API, featuring new capability interfaces you can use as building blocks to connect any device to Alexa, including those that fall outside of the traditional smart home categories of lighting, locks, thermostats, sensors, cameras, and audio/video gear. Start learning about how you can create a smarter home with Alexa.

Workshop: Build an Alexa Skill with Multiple Models — Learn how to build an Alexa skill that utilizes multiple interaction models and combines functionality into a single skill. Build an Alexa smart home skill from scratch that implements both custom interactions and smart home functionality within a single skill. Check out these resources to start learning:

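With an eye toward learning spaces, here is a minimal sketch of a custom Alexa skill handler for a hypothetical classroom device. The intent and slot names are invented for illustration; only the general Alexa Skills Kit request/response JSON envelope is assumed.

```python
# Minimal AWS Lambda handler for a hypothetical Alexa custom skill that controls
# a piece of classroom gear. "ControlDeviceIntent" and the "device" slot are
# invented names; the request/response envelope follows the Alexa Skills Kit format.

def build_response(speech_text: str, end_session: bool = True) -> dict:
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event: dict, context) -> dict:
    request = event.get("request", {})

    if request.get("type") == "LaunchRequest":
        return build_response("Welcome to the learning space. What should I do?", end_session=False)

    if request.get("type") == "IntentRequest":
        intent = request.get("intent", {})
        if intent.get("name") == "ControlDeviceIntent":
            device = intent.get("slots", {}).get("device", {}).get("value", "the device")
            # In a real skill, this is where you would call AWS IoT to switch the device.
            return build_response(f"Okay, turning on {device}.")

    return build_response("Sorry, I didn't catch that.")
```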
 

What will be important in the learn and work ecosystem in 2030? How do we prepare? — from evolllution.com by Holly Zanville | Senior Advisor for Credentialing and Workforce Development, Lumina Foundation

Excerpt:

These seven suggested actions—common to all scenarios—especially resonated with Lumina:

  1. Focus on learning: All learners will need a range of competencies and skills, most critically: learning how to learn; having a foundation in math, science, IT and cross-disciplines; and developing the behaviors of grit, empathy and effective communication.
  2. Prepare all “systems”: Schools will continue to be important places to teach competencies and skills. Parents will be important teachers for children. Workplaces will also be important places for learning, and many learners will need instruction on how to work effectively as part of human/machine teams.
  3. Integrate education and work: Education systems will need to be integrated with work in an education/work ecosystem. To enable movement within the ecosystem, credentials will be useful, but only if they are transparent and portable. The competencies and skills that stand behind credentials will need to be identifiable, using a common language to enable (a) credential providers to educate/train for an integrated education/work system; (b) employers to hire people and upgrade their skills; and (c) governments (federal/state/local) to incentivize and regulate programs and policies that support the education/work system.
  4. Assess learning: Assessing competencies and skills acquired in multiple settings and modes (including artificial reality and virtual reality tools), will be essential. AI will enable powerful new assessment tools to collect and analyze data about what humans know and can do.
  5. Build fair, moral AI: There will be a high priority on ensuring that AI has built-in checks and balances that reflect moral values and honor different cultural perspectives.
  6. Prepare for human/machine futures: Machines will join humans in homes, schools and workplaces. Machines will likely be viewed as citizens with rights. Humans must prepare for side-by-side “relationships” with machines, especially in situations in which machines will be managing aspects of education, work and life formerly managed by humans. Major questions will also arise about the ownership of AI structures—what ownership looks like, and who profits from ubiquitous AI structures.
  7. Build networks for readiness/innovation: Open and innovative partnerships will be needed for whatever future scenarios emerge. In a data-rich world, we won’t solve problems alone; networks, partnerships and communities will be key.

 

 

Also see:

 

 

An open letter to Microsoft and Google’s Partnership on AI — from wired.com by Gerd Leonhard
In a world where machines may have an IQ of 50,000, what will happen to the values and ethics that underpin privacy and free will?

Excerpt:

This open letter is my modest contribution to the unfolding of this new partnership. Data is the new oil – which now makes your companies the most powerful entities on the globe, way beyond oil companies and banks. The rise of ‘AI everywhere’ is certain to only accelerate this trend. Yet unlike the giants of the fossil-fuel era, there is little oversight on what exactly you can and will do with this new data-oil, and what rules you’ll need to follow once you have built that AI-in-the-sky. There appears to be very little public stewardship, while accepting responsibility for the consequences of your inventions is rather slow in surfacing.

 

In a world where machines may have an IQ of 50,000 and the Internet of Things may encompass 500 billion devices, what will happen with those important social contracts, values and ethics that underpin crucial issues such as privacy, anonymity and free will?

 

 

My book identifies what I call the “Megashifts”. They are changing society at warp speed, and your organisations are in the eye of the storm: digitization, mobilisation and screenification, automation, intelligisation, disintermediation, virtualisation and robotisation, to name the most prominent. Megashifts are not simply trends or paradigm shifts, they are complete game changers transforming multiple domains simultaneously.

 

 

If the question is no longer about if technology can do something, but why…who decides this?

Gerd Leonhard

 

 

From DSC:
Though this letter was written two years ago, back in October 2016, the messages, reflections, and questions that Gerd puts on the table are still very relevant today. The leaders of these powerful companies have enormous power — power to do good, or to do evil. Power to help or power to hurt. Power to be a positive force for societies throughout the globe and to help create dreams, or power to create dystopian societies while developing a future filled with nightmares. The state of the human heart is key here — though many will hate my saying that. But it’s true. At the end of the day, we need to care very much about — and be extremely aware of — the character and values of the leaders of these powerful companies.

 

 

Also relevant/see:

Spray-on antennas will revolutionize the Internet of Things — from networkworld.com by Patrick Nelson
Researchers at Drexel University have developed a method to spray on antennas that outperform traditional metal antennas, opening the door to faster and easier IoT deployments.

 From DSC:
Again, it’s not hard to imagine how technologies in this arena could be used for good or for ill.

 

 
 

This is how the Future Today Institute researches, models & maps the future & develops strategies

 


 

Also see what the Institute for the Future does in this regard

Foresight Tools
IFTF has pioneered tools and methods for building foresight ever since its founding days. Co-founder Olaf Helmer was the inventor of the Delphi Method, and early projects developed cross-impact analysis and scenario tools. Today, IFTF is methodologically agnostic, with a brimming toolkit that includes the following favorites…

 

 

From DSC:
How might higher education use this foresight workflow? How might we better develop a future-oriented mindset?

From my perspective, I think we need to be pulse-checking a variety of landscapes, looking for those early signals. We need to be thinking about what should be on our radar. Then we need to develop some potential scenarios, along with strategies to deal with them if they occur. Graphically speaking, here’s an excerpted slide from my introductory piece for an NGLS 2017 panel that we did.

 

 

 

This resource regarding their foresight workflow was mentioned in a recent e-newsletter from the FTI, which also highlighted this important item:

  • Climate change: a megatrend that impacts us all
    Excerpt:
    Earlier this week, the United Nations’ scientific panel on climate change issued a dire report [PDF]. To say the report is concerning would be a dramatic understatement. Models built by the scientists show that at our current rate, the atmosphere will warm as much as 1.5 degrees Celsius, leading to a dystopian future of food shortages, wildfires, extreme winters, a mass die-off of coral reefs and more –– as soon as 2040. That’s just 20 years away from now.

 

But China also decided to ban the import of foreign plastic waste –– which includes trash from around the U.S. and Europe. The U.S. alone could wind up with an extra 37 million metric tons of plastic waste, and we don’t have a plan for what to do with it all.

 

Immediate Futures Scenarios: Year 2019

  • Optimistic: Climate change is depoliticized. Leaders in the U.S., Brazil and elsewhere decide to be the heroes, and invest resources into developing solutions to our climate problem. We understand that fixing our futures isn’t only about foregoing plastic straws, but about systemic change. Not all solutions require regulation. Businesses and everyday people are incentivized to shift behavior. Smart people spend the next two decades collaborating on plausible solutions.
  • Pragmatic: Climate change continues to be debated, while extreme weather events cause damage to our power grid, wreak havoc on travel, cause school cancellations, and devastate our farms. The U.S. fails to work on realistic scenarios and strategies to combat the growing problem of climate change. More countries elect far-right leaders, who shun global environmental accords and agreements. By 2029, it’s clear that we’ve waited too long, and that we’re running out of time to adapt.
  • Catastrophic: A chorus of voices calling climate change a hoax grows ever louder in some of the world’s largest economies, whose leaders choose immediate political gain over longer-term consequences. China builds an environmental coalition of 100 countries within the decade, developing green infrastructure while accumulating debt service. Beijing sets global emissions standards––and it locks the U.S. out of trading with coalition members. Trash piles up in the U.S., which didn’t plan ahead for waste management. By 2040, our population centers have moved inland and further north, our farms are decimated, our lives are miserable.

Watchlist: United Nations’ Intergovernmental Panel on Climate Change; European Geosciences Union; National Oceanic and Atmospheric Administration (NOAA); NASA; Department of Energy; Department of Homeland Security; House Armed Services Sub-committee on Emerging Threats and Capabilities; Environmental Justice Foundation; Columbia University’s Earth Institute; University of North Carolina at Wilmington; Potsdam Institute for Climate Impact Research; National Center for Atmospheric Research.

 

3 trends shaping the future world of work — from hrtechnologist.com by Becky Frankiewicz, President of Manpower Group North America

Excerpt:

In a world of constant change, continuity has given way to adaptability. It’s no secret the world of work has changed. Yet today it’s changing faster than ever before.

The impact of technology means new skills and new roles are emerging as fast as others become extinct.

My career path is a case in point. When I entered high school, I intended to follow a linear career path similar to generations before me. Pick a discipline, get a degree, commit to it, retire. Now in my fourth career, that’s not how it worked out, and I’m glad. In fact, the only true constant I’ve had is constant learning. Because success in the future won’t be defined by performance, but by potential and the ability to learn, apply and adapt.

 

From Jobs for Life to Skills for Life
Each day we see firsthand technology’s impact on jobs. 65% of the jobs my three daughters will do don’t even exist yet. Employability is less about what you already know and more about your capacity to learn. It requires a new mindset for us to develop a workforce with the right skillsets, and for individuals seeking to advance their careers. We need to be ready to help upskill and reskill people for new jobs and new roles. 

 

 

 

How AI could help solve some of society’s toughest problems — from technologyreview.com by Charlotte Jee
Machine learning and game theory help Carnegie Mellon assistant professor Fei Fang predict attacks and protect people.

Excerpt:

Fei Fang has saved lives. But she isn’t a lifeguard, medical doctor, or superhero. She’s an assistant professor at Carnegie Mellon University, specializing in artificial intelligence for societal challenges.

At MIT Technology Review’s EmTech conference on Wednesday, Fang outlined recent work across academia that applies AI to protect critical national infrastructure, reduce homelessness, and even prevent suicides.

 

 

How AI can be a force for good — from science.sciencemag.org by Mariarosaria Taddeo & Luciano Floridi

Excerpts:

Invisibility and Influence
AI supports services, platforms, and devices that are ubiquitous and used on a daily basis. In 2017, the International Federation of Robotics suggested that by 2020, more than 1.7 million new AI-powered robots will be installed in factories worldwide. In the same year, the company Juniper Networks issued a report estimating that, by 2022, 55% of households worldwide will have a voice assistant, like Amazon Alexa.

As it matures and disseminates, AI blends into our lives, experiences, and environments and becomes an invisible facilitator that mediates our interactions in a convenient, barely noticeable way. While creating new opportunities, this invisible integration of AI into our environments poses further ethical issues. Some are domain-dependent. For example, trust and transparency are crucial when embedding AI solutions in homes, schools, or hospitals, whereas equality, fairness, and the protection of creativity and rights of employees are essential in the integration of AI in the workplace. But the integration of AI also poses another fundamental risk: the erosion of human self-determination due to the invisibility and influencing power of AI.

To deal with the risks posed by AI, it is imperative to identify the right set of fundamental ethical principles to inform the design, regulation, and use of AI and leverage it to benefit as well as respect individuals and societies. It is not an easy task, as ethical principles may vary depending on cultural contexts and the domain of analysis. This is a problem that the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems tackles with the aim of advancing public debate on the values and principles that should underpin ethical uses of AI.

 

 

Who’s to blame when a machine botches your surgery? — from qz.com by Robert Hart

Excerpt:

That’s all great, but even if an AI is amazing, it will still fail sometimes. When the mistake is caused by a machine or an algorithm instead of a human, who is to blame?

This is not an abstract discussion. Defining both ethical and legal responsibility in the world of medical care is vital for building patients’ trust in the profession and its standards. It’s also essential in determining how to compensate individuals who fall victim to medical errors, and ensuring high-quality care. “Liability is supposed to discourage people from doing things they shouldn’t do,” says Michael Froomkin, a law professor at the University of Miami.

 

 

Google Cloud’s new AI chief is on a task force for AI military uses and believes we could monitor ‘pretty much the whole world’ with drones — from businessinsider.in by Greg Sandoval

Excerpt:

“We could afford if we wanted to, and if we needed, to be surveilling pretty much the whole world with autonomous drones of various kinds,” Moore said. “I’m not saying we’d want to do that, but there’s not a technology gap there where I think it’s actually too difficult to do. This is now practical.”

Google’s decision to hire Moore was greeted with displeasure by at least one former Googler who objected to Project Maven.

“It’s worrisome to note after the widespread internal dissent against Maven that Google would hire Andrew Moore,” said one former Google employee. “Googlers want less alignment with the military-industrial complex, not more. This hire is like a punch in the face to the over 4,000 Googlers who signed the Cancel Maven letter.”

 

 

Organizations Are Gearing Up for More Ethical and Responsible Use of Artificial Intelligence, Finds Study — from businesswire.com
Ninety-two percent of AI leaders train their technologists in ethics; 74 percent evaluate AI outcomes weekly, says report from SAS, Accenture Applied Intelligence, Intel, and Forbes Insights

Excerpt:

AI oversight is not optional

Despite popular messages suggesting AI operates independently of human intervention, the research shows that AI leaders recognize that oversight is not optional for these technologies. Nearly three-quarters (74 percent) of AI leaders reported careful oversight with at least weekly review or evaluation of outcomes (less successful AI adopters: 33 percent). Additionally, 43 percent of AI leaders shared that their organization has a process for augmenting or overriding results deemed questionable during review (less successful AI adopters: 28 percent).

 

 

 

Do robots have rights? Here’s what 10 people and 1 robot have to say — from createdigital.org.au
When it comes to the future of technology, nothing is straightforward, and that includes the array of ethical issues that engineers encounter through their work with robots and AI.

 

 

 