The five most important new jobs in AI, according to KPMG — from qz.com by Cassie Werber

Excerpt:

Perhaps as a counter to the panic that artificial intelligence will destroy jobs, consulting firm KPMG published a list (on 1/8/19) of what it predicts will soon become the five most sought-after AI roles. The predictions are based on the company’s own projects and those on which it advises. They are:

  • AI Architect – Responsible for working out where AI can help a business, measuring performance and—crucially— “sustaining the AI model over time.” A lack of architects “is a big reason why companies cannot successfully sustain AI initiatives,” KPMG notes.
  • AI Product Manager – Liaises between teams, making sure ideas can be implemented, especially at scale. Works closely with architects, and with human resources departments to make sure humans and machines can all work effectively.
  • Data Scientist – Manages the huge amounts of available data and designs algorithms to make it meaningful.
  • AI Technology Software Engineer – “One of the biggest problems facing businesses is getting AI from pilot phase to scalable deployment,” KPMG writes. Software engineers need to be able both to build scalable technology and to understand how AI actually works.
  • AI Ethicist – AI presents a host of ethical challenges which will continue to unfold as the technology develops. Creating guidelines and ensuring they’re upheld will increasingly become a full-time job.

While it’s all very well to list the jobs people should be training and hiring for, it’s another matter to actually create a pipeline of people ready to enter those roles. Brad Fisher, KPMG’s US lead on data and analytics and the lead author of the predictions, tells Quartz there aren’t enough people getting ready for these roles.

Fisher has a steer for those who are eyeing AI jobs but have yet to choose an academic path: business process skills can be “trained,” he said, but “there is no substitute for the deep technical skillsets, such as mathematics, econometrics, or computer science, which would prepare someone to be a data scientist or a big-data software engineer.”

From DSC:
I don’t think institutions of higher education (as well as several other types of institutions in our society) recognize that the pace of technological change has accelerated, and that those changes have significant ramifications for society. And if these institutions have picked up on it, you can hardly tell. We simply aren’t used to this pace of change.

Technologies change quickly. People change slowly. And, by the way, that is not a comment on how old someone is…change is hard at almost any age.

Cut the curriculum — from willrichardson.com by Will Richardson

Excerpt:

Here’s an idea: A Minimal Viable Curriculum (MVC). That’s what Christian Talbot over at Basecamp is proposing, and I have to say, I love the idea.

He writes: “What if we were to design MVCs: Minimum Viable Curricula centered on just enough content to empower learners to examine questions or pursue challenges with rigor? Then, as learners go deeper into a question or challenge, they update their MVC…which is pretty much how learning happens in the real world.”

The key there to me is that THEY update their MVC. That resonates so deeply; it feels like that’s what I’m doing with my learning each day as I read about and work with school leaders who are thinking deeply about change.

When we pursue questions that matter to us, rigor is baked in.

From DSC:
I love the idea of giving students — as they can handle it — more choice, more control. So somewhere around 8th-12th grade, I say we turn much more control over to the students and let them make more choices about what they want to learn. We should at least try some experiments along these lines.

Now that everyone in the workforce is required to be a lifelong learner, our quality of life is much higher if we actually enjoy learning. As I think about it, I have often heard an adult (especially middle-aged and older) say something like, “I hated school, but now I love to learn.”

Plus, I can easily imagine greater engagement with the materials that students choose for themselves, as well as increased attention spans and higher motivation levels.

Also, here’s a major shout out to Will Richardson, Bruce Dixon, Missy Emler and Lyn Hilt for the work they are doing at ModernLearners.com.

Big tech may look troubled, but it’s just getting started — from nytimes.com by David Streitfeld

Excerpt:

SAN JOSE, Calif. — Silicon Valley ended 2018 somewhere it had never been: embattled.

Lawmakers across the political spectrum say Big Tech, for so long the exalted embodiment of American genius, has too much power. Once seen as a force for making our lives better and our brains smarter, tech is now accused of inflaming, radicalizing, dumbing down and squeezing the masses. Tech company stocks have been pummeled from their highs. Regulation looms. Even tech executives are calling for it.

The expansion underlines the dizzying truth of Big Tech: It is barely getting started.

“For all intents and purposes, we’re only 35 years into a 75- or 80-year process of moving from analog to digital,” said Tim Bajarin, a longtime tech consultant to companies including Apple, IBM and Microsoft. “The image of Silicon Valley as Nirvana has certainly taken a hit, but the reality is that we the consumers are constantly voting for them.”

Big Tech needs to be regulated, many are beginning to argue, and yet there are worries about giving that power to the government.

Which leaves regulation up to the companies themselves, always a dubious proposition.

The world is changing. Here’s how companies must adapt. — from weforum.org by Joe Kaeser, President and Chief Executive Officer, Siemens AG

Excerpts (emphasis DSC):

Although we have only seen the beginning, one thing is already clear: the Fourth Industrial Revolution is the greatest transformation human civilization has ever known. As far-reaching as the previous industrial revolutions were, they never set free such enormous transformative power.

The Fourth Industrial Revolution is transforming practically every human activity...its scope, speed and reach are unprecedented.

Enormous power (insert from DSC: this is what I was trying to get at here) entails enormous risk. Yes, the stakes are high.

“And make no mistake about it: we are now writing the code that will shape our collective future.” CEO of Siemens AG

Contrary to Milton Friedman’s maxim, the business of business should not just be business. Shareholder value alone should not be the yardstick. Instead, we should make stakeholder value, or better yet, social value, the benchmark for a company’s performance.

Today, stakeholders…rightfully expect companies to assume greater social responsibility, for example, by protecting the climate, fighting for social justice, aiding refugees, and training and educating workers. The business of business should be to create value for society.

This seamless integration of the virtual and the physical worlds in so-called cyber-physical systems – that is the giant leap we see today. It eclipses everything that has happened in industry so far. As in previous industrial revolutions but on a much larger scale, the Fourth Industrial Revolution will eliminate millions of jobs and create millions of new jobs.

“…because the Fourth Industrial Revolution runs on knowledge, we need a concurrent revolution in training and education.

If the workforce doesn’t keep up with advances in knowledge throughout their lives, how will the millions of new jobs be filled?” 

Joe Kaeser, President and Chief Executive Officer, Siemens AG

From DSC:
At least three critically important things jump out at me here:

  1. We are quickly approaching a time when people will need to be able to reinvent themselves quickly and cost-effectively, especially those with families who are working in their (still-existing) jobs. (Or have we already entered this period of time…?)
  2. There is a need to help people identify which jobs are safe to reinvent themselves into — at least for the next 5-10 years.
  3. Citizens across the globe — and their relevant legislatures, governments, and law schools — need to help close the gap between the pace of emerging technologies and our ability to decide whether those technologies should even be rolled out, and if so, how and with which features.

What freedoms and rights should individuals have in the digital age?

Joe Kaeser, President and Chief Executive Officer, Siemens AG

Facial recognition has to be regulated to protect the public, says AI report — from technologyreview.com by Will Knight
The research institute AI Now has identified facial recognition as a key challenge for society and policymakers—but is it too late?

Excerpt (emphasis DSC):

Artificial intelligence has made major strides in the past few years, but those rapid advances are now raising some big ethical conundrums.

Chief among them is the way machine learning can identify people’s faces in photos and video footage with great accuracy. This might let you unlock your phone with a smile, but it also means that governments and big corporations have been given a powerful new surveillance tool.

A new report from the AI Now Institute (large PDF), an influential research institute based in New York, has just identified facial recognition as a key challenge for society and policymakers.

Also see:

EXECUTIVE SUMMARY
At the core of the cascading scandals around AI in 2018 are questions of accountability: who is responsible when AI systems harm us? How do we understand these harms, and how do we remedy them? Where are the points of intervention, and what additional research and regulation is needed to ensure those interventions are effective? Currently there are few answers to these questions, and the frameworks presently governing AI are not capable of ensuring accountability. As the pervasiveness, complexity, and scale of these systems grow, the lack of meaningful accountability and oversight – including basic safeguards of responsibility, liability, and due process – is an increasingly urgent concern.

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central problem and addresses the following key issues:

  1. The growing accountability gap in AI, which favors those who create and deploy these technologies at the expense of those most affected
  2. The use of AI to maximize and amplify surveillance, especially in conjunction with facial and affect recognition, increasing the potential for centralized control and oppression
  3. Increasing government use of automated decision systems that directly impact individuals and communities without established accountability structures
  4. Unregulated and unmonitored forms of AI experimentation on human populations
  5. The limits of technological solutions to problems of fairness, bias, and discrimination

Within each topic, we identify emerging challenges and new research, and provide recommendations regarding AI development, deployment, and regulation. We offer practical pathways informed by research so that policymakers, the public, and technologists can better understand and mitigate risks. Given that the AI Now Institute’s location and regional expertise is concentrated in the U.S., this report will focus primarily on the U.S. context, which is also where several of the world’s largest AI companies are based.

From DSC:
As I said in this posting, we need to be aware of the emerging technologies around us. Just because we can, doesn’t mean we should. People need to be aware of — and involved with — which emerging technologies get rolled out (or not) and/or which features are beneficial to roll out (or not).

One of the things that’s beginning to alarm me these days is how the United States has turned over the keys to the Maserati — i.e., think an expensive, powerful thing — to youth who lack the life experiences to know how to handle such power and, often, the proper respect for such power. Many of these youthful members of our society don’t own the responsibility for the positive and negative influences and impacts that such powerful technologies can have (and the more senior execs have not taken enough responsibility either)!

If you owned a ~$137,000+ car, would you turn the keys over to your 16-to-25-year-old? Yet that’s what America has been doing for years. And, in some areas, we’re now paying the price.

The corporate world continues to discard the hard-earned experience that age brings…as it shoves older people out of the workforce. (I hesitate to use the word wisdom…but in some cases, that’s also relevant/involved here.) Then we, as a society, sit back and wonder: how did we get to this place?

Even technologists and programmers in their 20s and 30s are beginning to step back and ask…WHY did we develop this application or that feature? Was it — is it — good for society? Is it beneficial? Or should it be tabled or revised into something else?

Below is but one example — though I don’t mean to pick on Microsoft, as they likely have more older workers than the Facebooks, Googles, or Amazons of the world. I fully realize that all of these companies have some older employees. But the youth-oriented culture in America today has almost become an obsession — and not just in the tech world. Turn on the TV, check out the new releases on Netflix, go see a movie in a theater, listen to the radio, cast but a glance at the magazines in the checkout lines, etc., and you’ll instantly know what I mean.

In the workplace, there appears to be a bias against older employees as being less innovative or tech-savvy — a perspective that is often completely incorrect. Check out LinkedIn for items regarding age discrimination…it’s a very real thing. Many of us over the age of 30 know this to be true if we’ve lost a job in the last decade or two and have tried to get a job that involves technology.

Microsoft argues facial-recognition tech could violate your rights — from finance.yahoo.com by Rob Pegoraro

Excerpt (emphasis DSC):

On Thursday, the American Civil Liberties Union provided a good reason for us to think carefully about the evolution of facial-recognition technology. In a study, the group used Amazon’s (AMZN) Rekognition service to compare portraits of members of Congress to 25,000 arrest mugshots. The result: 28 members were mistakenly matched with 28 suspects.

The ACLU isn’t the only group raising the alarm about the technology. Earlier this month, Microsoft (MSFT) president Brad Smith posted an unusual plea on the company’s blog asking that the development of facial-recognition systems not be left up to tech companies.

Saying that the tech “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression,” Smith called for “a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission.”

But we may not get new laws anytime soon.
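
From DSC:
For readers who want to see concretely how a false match like the ACLU’s 28 can happen: face-recognition systems typically reduce each face to an embedding vector and call anything above a similarity threshold a “match.” Here is a toy sketch in Python. The three-number embeddings, names, and threshold values are all made up for illustration — real systems use vectors with hundreds of dimensions, and this is not Amazon Rekognition’s actual API:

```python
import math

def cosine(a, b):
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(probe, gallery, threshold):
    """One-to-many search: return every gallery entry at or above the threshold."""
    return [name for name, emb in gallery.items() if cosine(probe, emb) >= threshold]

# Made-up embeddings for two different people in a mugshot gallery.
gallery = {
    "mugshot_A": [0.9, 0.1, 0.4],
    "mugshot_B": [0.1, 0.9, 0.2],
}

# A third, different person whose embedding happens to sit near mugshot_A.
lawmaker = [0.88, 0.15, 0.42]

loose_matches = search(lawmaker, gallery, threshold=0.95)    # the false match slips through
strict_matches = search(lawmaker, gallery, threshold=0.999)  # a stricter cutoff rejects it
```

Lower the threshold and more innocent people “match”; raise it and real matches get missed — which is why the choice of threshold matters so much in these policy debates.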

just because we can does not mean we should

MOOCs are dead, welcome MOOC-based degrees — from iblnews.org

Coursera, edX, Udacity grew their businesses by over 20% in 2018 — from iblnews.org

Also see:

Reinventing the College Degree: A Future with Modular Credentials — from iblnews.org

Excerpt:

Rapidly changing technology is impacting the workforce and the economy, highlighting the need to be continually learning and refreshing skills in order to stay relevant. Additionally, the jobs of the future will require a set of skills from a variety of subject areas. “We have to rethink our education system,” writes Anant Agarwal, CEO at edX, in a column in Quartz.

A summary of Mr. Agarwal’s ideas:

  • Lego-like modular education…
  • MOOCs at scale, with personalized pacing and open admissions…
  • Modular credentials…

5 questions we should be asking about automation and jobs — from hbr.org by Jed Kolko

Excerpts:

  1. Will workers whose jobs are automated be able to transition to new jobs?*
  2. Who will bear the burden of automation?
  3. How will automation affect the supply of labor?
  4. How will automation affect wages, and how will wages affect automation?
  5. How will automation change job searching?

From DSC:
For those Economics profs and students out there, I posted this with you in mind; it is also highly applicable and relevant to MBA programs.

* I would add a few follow-up questions to question #1 above:

  • To which jobs should they transition?
  • Who can help identify the jobs that might be safe for 5-10 years?
  • If you have a family to feed, how are you going to be able to reinvent yourself quickly and as efficiently/flexibly as possible? (Yes…constant, online-based learning comes to my mind as well, as campus-based education is great, but very time-consuming.)

Also see:

We Still Don’t Know Much About the Jobs the AI Economy Will Make — or Take — from medium.com by Rachel Metz with MIT Technology Review
Experts think companies need to invest in workers the way they do for other core aspects of their business they’re looking to future-proof

One big problem that could have lasting effects, she thinks, is a mismatch between the skills companies need in new employees and those that employees have or know that they can readily acquire. To fix this, she said, companies need to start investing in their workers the way they do their supply chains.

Per LinkedIn:

Putting robots to work is becoming more and more popular — particularly in Europe. According to the European Bank for Reconstruction and Development, Slovakian workers face a 62% median probability that their jobs will be automated “in the near future.” Workers in Eastern Europe face the biggest likelihood of having their jobs overtaken by machines, with the textile, agriculture, and manufacturing industries seen as the most vulnerable.

Robot Ready: Human+ Skills for the Future of Work — from economicmodeling.com

Key Findings

In Robot-Ready, we examine several striking insights:

1. Human skills—like leadership, communication, and problem solving—are among the most in-demand skills in the labor market.

2. Human skills are applied differently across career fields. To be effective, liberal arts grads must adapt their skills to the job at hand.

3. Liberal arts grads should add technical skills. There is considerable demand for workers who complement their human skills with basic technical skills like data analysis and digital fluency.

4. Human+ skills are at work in a variety of fields. Human skills help liberal arts grads thrive in many career areas, including marketing, public relations, technology, and sales.

The top learning trends for 2019: Towards a digital-human workforce — from hrdive.com; a sponsored posting by Shelley Osborne, Head of L&D at Udemy

Excerpt:

New digital technologies like artificial intelligence (AI) and automation tools are rapidly changing the way we work, develop products, and interact with our customers. Intelligent automation tools augment what people do at work and will redefine what’s possible.

As organizations navigate this complex digital transformation, learning & development (L&D) leaders are tasked with keeping employees up to speed with the ever-evolving skills ecosystem.

To uncover emerging trends and predict what’s required for 2019, we surveyed 400 L&D leaders to find out what they’re doing to prepare their workforce for this digital transformation.

“With the rise of automation, the world of work is experiencing the largest job transition since the shift from agriculture to manufacturing jobs during the Industrial Revolution. By 2030, as many as 375 million workers—or roughly 14 percent of the global workforce—may need to switch occupational categories as digitization, automation, and advances in artificial intelligence disrupt the world of work,” according to McKinsey Global Institute.

As Thomson Reuters readies layoffs of 3,200, what’s it mean for customers? — from lawsitesblog.com by Bob Ambrogi

Excerpts:

Thomson Reuters, the dominant provider of research and information services for the legal profession, last week announced plans to reduce its workforce by 3,200 and close 30 percent of its offices by the end of 2020. What is going on and what does it mean for the company’s customers?

The overall goal, the company said, is to create a leaner, more agile organization that will allow it to better serve its customers and shift its orientation from a content company to a software company.

“As the velocity of technology change increases and the iteration cycles become ever shorter, the new Thomson Reuters needs to run leaner, be faster and more effective,” Neil T. Masterson, co-COO, told the investors. TR plans to accomplish that through three “levers” which will result in a headcount reduction of 12 percent by 2020…

New operating structure of Thomson Reuters

Forecast 5.0: Navigating the Future of Learning — from knowledgeworks.org by Katherine Prince, Jason Swanson, and Katie King
Discover how current trends could impact learning ten years from now and consider ways to shape a future where all students can thrive.

AI Now Report 2018 | December 2018  — from ainowinstitute.org

Meredith Whittaker, AI Now Institute, New York University, Google Open Research
Kate Crawford, AI Now Institute, New York University, Microsoft Research
Roel Dobbe, AI Now Institute, New York University
Genevieve Fried, AI Now Institute, New York University
Elizabeth Kaziunas, AI Now Institute, New York University
Varoon Mathur, AI Now Institute, New York University
Sarah Myers West, AI Now Institute, New York University
Rashida Richardson, AI Now Institute, New York University
Jason Schultz, AI Now Institute, New York University School of Law
Oscar Schwartz, AI Now Institute, New York University

With research assistance from Alex Campolo and Gretchen Krueger (AI Now Institute, New York University)

Excerpt (emphasis DSC):

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central problem, and provides 10 practical recommendations that can help create accountability frameworks capable of governing these powerful technologies.

  1. Governments need to regulate AI by expanding the powers of sector-specific agencies to oversee, audit, and monitor these technologies by domain.
  2. Facial recognition and affect recognition need stringent regulation to protect the public interest.
  3. The AI industry urgently needs new approaches to governance. As this report demonstrates, internal governance structures at most technology companies are failing to ensure accountability for AI systems.
  4. AI companies should waive trade secrecy and other legal claims that stand in the way of accountability in the public sector.
  5. Technology companies should provide protections for conscientious objectors, employee organizing, and ethical whistleblowers.
  6. Consumer protection agencies should apply “truth-in-advertising” laws to AI products and services.
  7. Technology companies must go beyond the “pipeline model” and commit to addressing the practices of exclusion and discrimination in their workplaces.
  8. Fairness, accountability, and transparency in AI require a detailed account of the “full stack supply chain.”
  9. More funding and support are needed for litigation, labor organizing, and community participation on AI accountability issues.
  10. University AI programs should expand beyond computer science and engineering disciplines. AI began as an interdisciplinary field, but over the decades has narrowed to become a technical discipline. With the increasing application of AI systems to social domains, it needs to expand its disciplinary orientation. That means centering forms of expertise from the social and humanistic disciplines. AI efforts that genuinely wish to address social implications cannot stay solely within computer science and engineering departments, where faculty and students are not trained to research the social world. Expanding the disciplinary orientation of AI research will ensure deeper attention to social contexts, and more focus on potential hazards when these systems are applied to human populations.

Also see:

After a Year of Tech Scandals, Our 10 Recommendations for AI — from medium.com by the AI Now Institute
Let’s begin with better regulation, protecting workers, and applying “truth in advertising” rules to AI

Also see:

Excerpt:

As we discussed, this technology brings important and even exciting societal benefits but also the potential for abuse. We noted the need for broader study and discussion of these issues. In the ensuing months, we’ve been pursuing these issues further, talking with technologists, companies, civil society groups, academics and public officials around the world. We’ve learned more and tested new ideas. Based on this work, we believe it’s important to move beyond study and discussion. The time for action has arrived.

We believe it’s important for governments in 2019 to start adopting laws to regulate this technology. The facial recognition genie, so to speak, is just emerging from the bottle. Unless we act, we risk waking up five years from now to find that facial recognition services have spread in ways that exacerbate societal issues. By that time, these challenges will be much more difficult to bottle back up.

In particular, we don’t believe that the world will be best served by a commercial race to the bottom, with tech companies forced to choose between social responsibility and market success. We believe that the only way to protect against this race to the bottom is to build a floor of responsibility that supports healthy market competition. And a solid floor requires that we ensure that this technology, and the organizations that develop and use it, are governed by the rule of law.

From DSC:
This is a major heads up for the American Bar Association (ABA), law schools, governments, legislatures around the country, the courts, the corporate world, as well as for colleges, universities, and community colleges. Emerging technologies are moving much faster than society’s ability to deal with them!

The ABA and law schools need to majorly pick up their pace — for the benefit of all within our society.

These news anchors are professional and efficient. They’re also not human. — from washingtonpost.com by Taylor Telford

Excerpt:

The new anchors at China’s state-run news agency have perfect hair and no pulse.

Xinhua News just unveiled what it is calling the world’s first news anchors powered by artificial intelligence, at the World Internet Conference on Wednesday in China’s Zhejiang province. From the outside, they are almost indistinguishable from their human counterparts, crisp-suited and even-keeled. Although Xinhua says the anchors have the “voice, facial expressions and actions of a real person,” the robotic anchors relay whatever text is fed to them in stilted speech that sounds less human than Siri or Alexa.

From DSC:
The question is…is this what we want our future to look like? Personally, I don’t care to watch a robotic newscaster giving me the latest “death and dying report.” It comes off bad enough — callous enough — from human beings backed up by TV networks/stations that have agendas of their own; let alone from a robot run by AI.

LinkedIn Learning Opens Its Platform (Slightly) — from edsurge by Jeff Young

Excerpt (emphasis DSC):

A few years ago, in a move toward professional learning, LinkedIn bought Lynda.com for $1.5 billion, adding the well-known library of video-based courses to its professional social network. Today LinkedIn officials announced that they plan to open up their platform to let in educational videos from other providers as well—but with a catch or two.

The plan, announced Friday, is to let companies or colleges who already subscribe to LinkedIn Learning add content from a select group of other providers. The company or college will still have to subscribe to those other services separately, so it’s essentially an integration—but it does mark a change in approach.

For LinkedIn, the goal is to become the front door for employees as they look for micro-courses for professional development.

LinkedIn also announced another service for its LinkedIn Learning platform called Q&A, which will give subscribers the ability to pose a question they have about the video lessons they’re taking. The question will first be sent to bots, but if that doesn’t yield an answer the query will be sent on to other learners, and in some cases the instructor who created the videos.
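
From DSC:
The routing described above — bots first, then fellow learners, then (in some cases) the instructor who created the videos — is essentially a fallback chain. Here is a minimal sketch in Python; the handler names and canned answers are hypothetical stand-ins, not LinkedIn’s actual implementation:

```python
from typing import Callable, Optional

def route_question(question: str,
                   answerers: list[Callable[[str], Optional[str]]]) -> Optional[str]:
    """Try each answerer in order; the first non-None reply wins."""
    for answer in answerers:
        result = answer(question)
        if result is not None:
            return result
    return None  # no one in the chain could answer

# Illustrative handlers (stand-ins for a real Q&A service):
def bot(q):
    # A bot handles only FAQ-style questions it recognizes.
    return "See lesson 2." if "lesson" in q else None

def peers(q):
    # Other learners pick up what the bot missed.
    return "A learner replies: try the exercise files." if "exercise" in q else None

def instructor(q):
    # Last resort: escalate to the course's instructor.
    return "Instructor: please post in the course forum."

chain = [bot, peers, instructor]
```

For example, `route_question("Where are the exercise files?", chain)` falls past the bot and is answered by a peer, while an unrecognized question falls all the way through to the instructor.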

Also see:

LinkedIn becomes a serious open learning experience platform — from clomedia.com by Josh Bersin
LinkedIn is becoming a dominant learning solution with some pretty interesting competitive advantages, according to one learning analyst.

Excerpt:

LinkedIn has become quite a juggernaut in the corporate learning market. Last time I checked the company had more than 17 million users, 14,000 corporate customers, more than 3,000 courses and was growing at high double-digit rates. And all this in only about two years.

And the company just threw down the gauntlet; it’s now announcing it has completely opened up its learning platform to external content partners. This is the company’s formal announcement that LinkedIn Learning is not just an amazing array of content, it is a corporate learning platform. The company wants to become a single place for all organizational learning content.

LinkedIn now offers skills-based learning recommendations to any user through its machine learning algorithms. 
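
From DSC:
LinkedIn doesn’t publish the details of those algorithms, but the core idea of a skills-based recommendation can be sketched with a simple overlap score between a member’s skills and each course’s skill tags. Everything below — the course names, the tags, and the use of Jaccard similarity — is illustrative only, not LinkedIn’s actual method:

```python
def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap between two skill sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def recommend(member_skills: set[str],
              courses: dict[str, set[str]],
              top_n: int = 2) -> list[str]:
    # Rank courses by how well their skill tags match the member's skills.
    ranked = sorted(courses,
                    key=lambda c: jaccard(member_skills, courses[c]),
                    reverse=True)
    return ranked[:top_n]

catalog = {  # hypothetical course -> skill tags
    "Intro to SQL": {"sql", "databases"},
    "Deep Learning 101": {"python", "neural networks"},
    "Data Analysis Basics": {"python", "statistics", "sql"},
}
```

A member tagged with `{"python", "sql"}` would be steered first toward “Data Analysis Basics,” since it shares the most skills with their profile. A production system would add signals like job title, peers’ learning history, and course popularity.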

Is there demand for staying relevant? For learning new skills? For reinventing oneself?

Well…let’s see.

From DSC:
So…look out higher ed and traditional forms of accreditation — your window of opportunity may be starting to close. Alternatives to traditional higher ed continue to appear on the scene and gain momentum. LinkedIn — and/or similar organizations in the future — along with blockchain and big data backed efforts may gain traction in the future and start taking away some major market share. If employers get solid performance from their employees who have gone this route…higher ed better look out. 

Microsoft/LinkedIn/Lynda.com is nicely positioned to be a major player that can offer society a next-generation learning platform at an incredible price — offering up-to-date microlearning along with new forms of credentialing. It’s what I’ve been calling the Amazon.com of higher ed (previously the Walmart of Education) for ~10 years. It will take place in a strategy/platform similar to this one.

Also, this is what a gorilla on the back looks like:

Also see:

  • Meet the 83-Year-Old App Developer Who Says Edtech Should Better Support Seniors — from edsurge.com by Sydney Johnson
    Excerpt (emphasis DSC):
    Now at age 83, Wakamiya beams with excitement when she recounts her journey, which has been featured in news outlets and even at Apple’s developer conference last year. But through learning how to code, she believes that experience offers an even more important lesson to today’s education and technology companies: don’t forget about senior citizens.Today’s education technology products overwhelmingly target young people. And while there’s a growing industry around serving adult learners in higher education, companies largely neglect to consider the needs of the elderly.

The global companies that failed to adapt to change. — from trainingmag.com by Professor M.S. Rao, Ph.D.

Excerpt:

Eastman Kodak, a leader for many years, filed for bankruptcy in 2012. Blockbuster Video became defunct in 2013. Similarly, Borders — one of the largest book retailers in the U.S. — went out of business in 2011. Why did these companies, which once had great brands, ultimately fail? It is because they failed to adapt to change. Additionally, they failed to unlearn and relearn.

Former GE CEO Jack Welch once remarked, “If the rate of change on the outside exceeds the rate of change on the inside, the end is near.” Thus, accept change before the change is thrust on you.

Leaders must adopt tools and techniques to adapt to change. Here is a blueprint to embrace change effectively:

  • Keep the vision right and straight, and articulate it effectively.
  • Create an organizational culture conducive to bringing about change.
  • Communicate clearly about the need to change.
  • Enlighten people about the implications of the status quo.
  • Show them benefits once the change is implemented.
  • Coordinate all stakeholders effectively.
  • Remove the roadblocks by allaying their apprehensions.
  • Show them small gains to ensure that the entire change takes place smoothly, without resistance.

From DSC:
Though I’m not on board with all of the perspectives in that article, institutions of traditional higher education likely have something to learn from the failures of these companies…while there’s still time to change and to innovate.

© 2025 | Daniel Christian