Top six AI and automation trends for 2019 — from forbes.com by Daniel Newman

Excerpt:

If your company hasn’t yet created a plan for AI and automation throughout your enterprise, you have some work to do. Experts believe AI will add nearly $16 trillion to the global economy by 2030, and 20% of companies surveyed are already planning to incorporate AI throughout their companies next year. As 2018 winds down, now is the time to take a look at some trends and predictions for AI and automation that I believe will dominate the headlines in 2019—and to think about how you may incorporate them into your own company.

 

Also see — and an insert here from DSC:

Kai-Fu Lee paints a rosier picture than I do with regard to how humanity will be impacted by AI. One simply needs to check out today’s news to see that humans have a very hard time creating unity, thinking about why businesses exist in the first place, and being kind to one another…

How AI can save our humanity — a TED talk by Kai-Fu Lee

Big tech may look troubled, but it’s just getting started — from nytimes.com by David Streitfeld

Excerpt:

SAN JOSE, Calif. — Silicon Valley ended 2018 somewhere it had never been: embattled.

Lawmakers across the political spectrum say Big Tech, for so long the exalted embodiment of American genius, has too much power. Once seen as a force for making our lives better and our brains smarter, tech is now accused of inflaming, radicalizing, dumbing down and squeezing the masses. Tech company stocks have been pummeled from their highs. Regulation looms. Even tech executives are calling for it.

The expansion underlines the dizzying truth of Big Tech: It is barely getting started.

 

“For all intents and purposes, we’re only 35 years into a 75- or 80-year process of moving from analog to digital,” said Tim Bajarin, a longtime tech consultant to companies including Apple, IBM and Microsoft. “The image of Silicon Valley as Nirvana has certainly taken a hit, but the reality is that we the consumers are constantly voting for them.”

 

Big Tech needs to be regulated, many are beginning to argue, and yet there are worries about giving that power to the government.

Which leaves regulation up to the companies themselves, always a dubious proposition.

Facial recognition has to be regulated to protect the public, says AI report — from technologyreview.com by Will Knight
The research institute AI Now has identified facial recognition as a key challenge for society and policymakers—but is it too late?

Excerpt (emphasis DSC):

Artificial intelligence has made major strides in the past few years, but those rapid advances are now raising some big ethical conundrums.

Chief among them is the way machine learning can identify people’s faces in photos and video footage with great accuracy. This might let you unlock your phone with a smile, but it also means that governments and big corporations have been given a powerful new surveillance tool.

A new report from the AI Now Institute (large PDF), an influential research institute based in New York, has just identified facial recognition as a key challenge for society and policymakers.
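The capability the excerpt describes typically rests on face embeddings: a model maps each face image to a numeric vector, and two photos are declared the same person when their vectors are close enough. A minimal sketch of the comparison step — the embedding model itself is assumed, and the vectors and the 0.6 threshold below are purely illustrative:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(emb1, emb2, threshold=0.6):
    """Declare a match when the embeddings are close enough.

    The threshold is a policy choice: lower it and the system matches
    more faces (greater surveillance reach, more false positives).
    """
    return cosine_similarity(np.asarray(emb1), np.asarray(emb2)) >= threshold

# Illustrative vectors standing in for a real model's output:
alice_photo_1 = [0.9, 0.1, 0.3]
alice_photo_2 = [0.88, 0.15, 0.28]
bob_photo     = [0.1, 0.9, 0.2]

print(same_person(alice_photo_1, alice_photo_2))  # True  — similar vectors
print(same_person(alice_photo_1, bob_photo))      # False — dissimilar vectors
```

The surveillance concern follows directly from the mechanics: once embeddings exist, the same comparison runs as easily against a watchlist of millions as against the single enrolled face on your phone.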

 

Also see:

EXECUTIVE SUMMARY
At the core of the cascading scandals around AI in 2018 are questions of accountability: who is responsible when AI systems harm us? How do we understand these harms, and how do we remedy them? Where are the points of intervention, and what additional research and regulation is needed to ensure those interventions are effective? Currently there are few answers to these questions, and the frameworks presently governing AI are not capable of ensuring accountability. As the pervasiveness, complexity, and scale of these systems grow, the lack of meaningful accountability and oversight – including basic safeguards of responsibility, liability, and due process – is an increasingly urgent concern.

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central
problem and addresses the following key issues:

  1. The growing accountability gap in AI, which favors those who create and deploy these technologies at the expense of those most affected
  2. The use of AI to maximize and amplify surveillance, especially in conjunction with facial and affect recognition, increasing the potential for centralized control and oppression
  3. Increasing government use of automated decision systems that directly impact individuals and communities without established accountability structures
  4. Unregulated and unmonitored forms of AI experimentation on human populations
  5. The limits of technological solutions to problems of fairness, bias, and discrimination

Within each topic, we identify emerging challenges and new research, and provide recommendations regarding AI development, deployment, and regulation. We offer practical pathways informed by research so that policymakers, the public, and technologists can better understand and mitigate risks. Given that the AI Now Institute’s location and regional expertise is concentrated in the U.S., this report will focus primarily on the U.S. context, which is also where several of the world’s largest AI companies are based.

From DSC:
As I said in this posting, we need to be aware of the emerging technologies around us. Just because we can, doesn’t mean we should. People need to be aware of — and involved with — which emerging technologies get rolled out (or not) and/or which features are beneficial to roll out (or not).

One of the things that’s beginning to alarm me these days is how the United States has turned over the keys to the Maserati — i.e., an expensive, powerful machine — to youth who lack the life experience to handle such power and, often, the proper respect for it. Many of these youthful members of our society don’t take responsibility for the positive and negative impacts that such powerful technologies can have (and the more senior execs haven’t taken enough responsibility either)!

If you owned that $137,000+ car, would you turn the keys over to your 16-to-25-year-old? Yet that’s what America has been doing for years. And, in some areas, we’re now paying the price.


The corporate world continues to discard the hard-earned experience that age brings…as it shoves older people out of the workforce. (I hesitate to use the word wisdom…but in some cases, that’s also relevant here.) Then we, as a society, sit back and wonder: how did we get to this place?

Even technologists and programmers in their 20s and 30s are beginning to step back and ask…WHY did we develop this application or that feature? Was it — is it — good for society? Is it beneficial? Or should it be tabled or revised into something else?

Below is but one example — though I don’t mean to pick on Microsoft, as they likely have more older workers than the Facebooks, Googles, or Amazons of the world. I fully realize that all of these companies have some older employees. But the youth-oriented culture in America today has almost become an obsession — and not just in the tech world. Turn on the TV, check out the new releases on Netflix, go see a movie in a theater, listen to the radio, cast but a glance at the magazines in the checkout lines, etc., and you’ll instantly know what I mean.

In the workplace, there appears to be a bias against older employees as being less innovative or tech-savvy — a perspective that is often completely incorrect. Go check out LinkedIn for items re: age discrimination…it’s a very real thing. And many of us over the age of 30 know this to be true if we’ve lost a job in the last decade or two and have tried to get a job that involves technology.

 

Microsoft argues facial-recognition tech could violate your rights — from finance.yahoo.com by Rob Pegoraro

Excerpt (emphasis DSC):

On Thursday, the American Civil Liberties Union provided a good reason for us to think carefully about the evolution of facial-recognition technology. In a study, the group used Amazon’s (AMZN) Rekognition service to compare portraits of members of Congress to 25,000 arrest mugshots. The result: 28 members were mistakenly matched with 28 suspects.

The ACLU isn’t the only group raising the alarm about the technology. Earlier this month, Microsoft (MSFT) president Brad Smith posted an unusual plea on the company’s blog asking that the development of facial-recognition systems not be left up to tech companies.

Saying that the tech “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression,” Smith called for “a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission.”

But we may not get new laws anytime soon.
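The ACLU result above is essentially a threshold audit: run a set of people known to be absent from a mugshot database through the matcher and count how many clear the similarity cutoff anyway. A hedged sketch of that kind of audit — the function and all scores are hypothetical; Rekognition reports similarity on a 0–100 scale, and the ACLU study reportedly used the 80% default setting:

```python
def false_match_audit(scores_by_member, threshold=80.0):
    """Count members of a 'known innocent' set whose best match in a
    mugshot database clears the similarity threshold.

    scores_by_member maps a member's name to the similarity scores a
    face-matching service returned against every mugshot (0-100 scale).
    """
    return [name for name, scores in scores_by_member.items()
            if scores and max(scores) >= threshold]

# Hypothetical scores — the point is how the threshold drives the result:
scores = {
    "Member A": [41.0, 83.5, 12.2],   # false match at the 80% default
    "Member B": [55.1, 79.9],         # just under the default threshold
    "Member C": [99.2, 20.0],         # false match even at a strict cutoff
}

print(false_match_audit(scores))                  # ['Member A', 'Member C']
print(false_match_audit(scores, threshold=99.0))  # ['Member C']
```

Note how the number of false matches is driven as much by the operator’s choice of threshold as by the underlying model — which is exactly why Smith argues this shouldn’t be left to tech companies alone.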

just because we can does not mean we should

Just because we can…

5 influencers predict AI’s impact on business in 2019 — from martechadvisor.com by Christine Crandell

Excerpt:

With Artificial Intelligence (AI) already proving its worth to adopters, it’s not surprising that an increasing number of companies will implement and leverage AI in 2019. Now, it’s no longer a question of whether AI will take off. Instead, it’s a question of which companies will keep up. Here are five predictions from five influencers on the impact AI will have on businesses in 2019, writes Christine Crandell, President, New Business Strategies.


Should we be worried about computerized facial recognition? — from newyorker.com by David Owen
The technology could revolutionize policing, medicine, even agriculture—but its applications can easily be weaponized.

 

Facial-recognition technology is advancing faster than the people who worry about it have been able to think of ways to manage it. Indeed, in any number of fields the gap between what scientists are up to and what nonscientists understand about it is almost certainly greater now than it has been at any time since the Manhattan Project. 

 

From DSC:
This is why law schools, legislatures, and the federal government need to become much more responsive to emerging technologies. The pace of technological change has changed. But have other important institutions of our society adapted to this new pace of change?


Andrew Ng sees an eternal springtime for AI — from zdnet.com by Tiernan Ray
Former Google Brain leader and Baidu chief scientist Andrew Ng lays out the steps companies should take to succeed with artificial intelligence, and explains why there’s unlikely to be another “AI winter” like in times past.


Google Lens now recognizes over 1 billion products — from venturebeat.com by Kyle Wiggers with thanks to Marie Conway for her tweet on this

Excerpt:

Google Lens, Google’s AI-powered analysis tool, can now recognize over 1 billion products from Google’s retail and price comparison portal, Google Shopping. That’s four times the number of objects Lens covered in October 2017, when it made its debut.

Aparna Chennapragada, vice president of Google Lens and augmented reality at Google, revealed the tidbit in a retrospective blog post about Google Lens’ milestones.
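Recognizing a product among a billion candidates is typically framed as nearest-neighbor search over image embeddings: embed the query photo, then find the closest catalog vector. A toy sketch of the lookup step, under the assumption that some model has already produced the embeddings — the names and vectors are illustrative, and the brute-force search stands in for the approximate nearest-neighbor indexes a billion-item system would actually require:

```python
import numpy as np

def nearest_product(query_emb, catalog_embs, product_names):
    """Return the catalog product whose embedding is closest to the
    query image's embedding (brute-force Euclidean distance)."""
    catalog = np.asarray(catalog_embs, dtype=float)
    q = np.asarray(query_emb, dtype=float)
    dists = np.linalg.norm(catalog - q, axis=1)  # distance to every product
    return product_names[int(np.argmin(dists))]

# Tiny illustrative catalog; a real one would hold ~1 billion entries.
names = ["running shoe", "coffee mug", "desk lamp"]
embs  = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]

print(nearest_product([0.9, 0.1], embs, names))  # prints "running shoe"
```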

 

Amazon Customer Receives 1,700 Audio Files Of A Stranger Who Used Alexa — from npr.org by Sasha Ingber

Excerpt:

When an Amazon customer in Germany contacted the company to review his archived data, he wasn’t expecting to receive recordings of a stranger speaking in the privacy of a home.

The man requested to review his data in August under a European Union data protection law, according to a German trade magazine called c’t. Amazon sent him a download link to tracked searches on the website — and 1,700 audio recordings by Alexa that were generated by another person.

“I was very surprised about that because I don’t use Amazon Alexa, let alone have an Alexa-enabled device,” the customer, who was not named, told the magazine. “So I randomly listened to some of these audio files and could not recognize any of the voices.”


5 questions we should be asking about automation and jobs — from hbr.org by Jed Kolko

Excerpts:

  1. Will workers whose jobs are automated be able to transition to new jobs?*
  2. Who will bear the burden of automation?
  3. How will automation affect the supply of labor?
  4. How will automation affect wages, and how will wages affect automation?
  5. How will automation change job searching?

 

From DSC:
For those Economics profs and students out there, I posted this with you in mind; it’s also highly applicable and relevant to MBA programs.

* I would add a few follow-up questions to question #1 above:

  • To which jobs should they transition?
  • Who can help identify the jobs that might be safe for 5-10 years?
  • If you have a family to feed, how are you going to be able to reinvent yourself quickly and as efficiently/flexibly as possible? (Yes…constant, online-based learning comes to my mind as well, as campus-based education is great, but very time-consuming.)

 

Also see:

We Still Don’t Know Much About the Jobs the AI Economy Will Make — or Take — from medium.com by Rachel Metz with MIT Technology Review
Experts think companies need to invest in workers the way they do for other core aspects of their business they’re looking to future-proof

One big problem that could have lasting effects, she thinks, is a mismatch between the skills companies need in new employees and those that employees have or know that they can readily acquire. To fix this, she said, companies need to start investing in their workers the way they do their supply chains.

 

Per LinkedIn:

Putting robots to work is becoming more and more popular — particularly in Europe. According to the European Bank for Reconstruction and Development, Slovakian workers face a 62% median probability that their job will be automated “in the near future.” Workers in Eastern Europe face the biggest likelihood of having their jobs overtaken by machines, with the textile, agriculture, and manufacturing industries seen as the most vulnerable.

 

Robot Ready: Human+ Skills for the Future of Work — from economicmodeling.com

Key Findings

In Robot-Ready, we examine several striking insights:

1. Human skills—like leadership, communication, and problem solving—are among the most in-demand skills in the labor market.

2. Human skills are applied differently across career fields. To be effective, liberal arts grads must adapt their skills to the job at hand.

3. Liberal arts grads should add technical skills. There is considerable demand for workers who complement their human skills with basic technical skills like data analysis and digital fluency.

4. Human+ skills are at work in a variety of fields. Human skills help liberal arts grads thrive in many career areas, including marketing, public relations, technology, and sales.


Digital transformation reality check: 10 trends — from enterprisersproject.com by Stephanie Overby
2019 is the year when CIOs scrutinize investments, work even more closely with the CEO, and look to AI to shape strategy. What other trends will prove key?

Excerpt (emphasis DSC):

6. Technology convergence expands
Lines have already begun to blur between software development and IT operations thanks to the widespread adoption of DevOps. Meanwhile, IT and operational technology are also coming together in data-centric industries like manufacturing and logistics.

“A third convergence – that many are feeling but not yet articulating will have a profound impact on how CIOs structure and staff their organizations, design their architectures, build their budgets, and govern their operations – is the convergence of applications and infrastructure,” says Edwards. “In the digital age, it is nearly impossible to build a strategy for infrastructure that doesn’t include a substantial number of considerations for applications and vice versa.”


While most IT organizations still have heads of infrastructure and applications managing their own teams, that may begin to change as trends like software-defined infrastructure grow. “In 2019, CIOs will need to begin to grapple with the challenges to their operating models when the lines within the traditional IT tower blur and sometimes fade,” Edwards says.


Forecast 5.0 – Navigating the Future of Learning — from knowledgeworks.org by Katherine Prince, Jason Swanson, and Katie King
Discover how current trends could impact learning ten years from now and consider ways to shape a future where all students can thrive.


The global companies that failed to adapt to change — from trainingmag.com by Professor M.S. Rao, Ph.D.

Excerpt:

Eastman Kodak, a leader for many years, filed for bankruptcy in 2012. Blockbuster Video became defunct in 2013. Similarly, Borders — one of the largest book retailers in the U.S. — went out of business in 2011. Why did these companies, which once had great brands, ultimately fail? It is because they failed to adapt to change. Additionally, they failed to unlearn and relearn.

Former GE CEO Jack Welch once remarked, “If the rate of change on the outside exceeds the rate of change on the inside, the end is near.” Thus, accept change before the change is thrust on you.

Leaders must adopt tools and techniques to adapt to change. Here is a blueprint to embrace change effectively:

  • Keep the vision right and straight, and articulate it effectively.
  • Create an organizational culture conducive to change.
  • Communicate clearly about the need to change.
  • Enlighten people about the implications of the status quo.
  • Show them benefits once the change is implemented.
  • Coordinate all stakeholders effectively.
  • Remove the roadblocks by allaying their apprehensions.
  • Show them small gains to ensure that the entire change takes place smoothly, without resistance.

 

From DSC:
Though I’m not on board with all of the perspectives in that article, institutions of traditional higher education likely have something to learn from the failures of these companies…while there’s still time to change and to innovate.


Report: Accessibility in Digital Learning Increasingly Complex — from campustechnology.com by Dian Schaffhauser

Excerpt:

The Online Learning Consortium (OLC) has introduced a series of original reports to keep people in education up to date on the latest developments in the field of digital learning. The first report covers accessibility and addresses both K-12 and higher education. The series is being produced by OLC’s Research Center for Digital Learning & Leadership.

The initial report addresses four broad areas tied to accessibility:

  • The national laws governing disability and access and how they apply to online courses;
  • What legal cases exist to guide online course design and delivery in various educational settings;
  • The issues that emerge regarding online course access that might be unique to higher ed or to K-12, and which ones might be shared; and
  • What support online course designers need to generate accessible courses for learners across the education life span (from K-12 to higher education).
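On that last point — supporting designers in generating accessible courses — some of the most basic checks can be automated. A minimal sketch of one such check, flagging images that lack the alt text screen readers depend on (the class name and sample page are illustrative; this is one small check, not a substitute for a full accessibility review):

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flag <img> tags with no alt attribute — one of the most basic
    accessibility checks for online course pages (WCAG 1.1.1,
    non-text content)."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing.append(dict(attrs).get("src", "(no src)"))

page = """
<h1>Week 1</h1>
<img src="diagram.png" alt="Flow of the enrollment process">
<img src="photo.jpg">
"""

checker = MissingAltChecker()
checker.feed(page)
print(checker.missing)  # prints ['photo.jpg']
```

Note that an explicitly empty alt="" is left alone here, since that is the standard way to mark a purely decorative image.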



© 2019 | Daniel Christian