The world is changing. Here’s how companies must adapt. — from weforum.org by Joe Kaeser, President and Chief Executive Officer, Siemens AG

Excerpts (emphasis DSC):

Although we have only seen the beginning, one thing is already clear: the Fourth Industrial Revolution is the greatest transformation human civilization has ever known. As far-reaching as the previous industrial revolutions were, they never set free such enormous transformative power.

The Fourth Industrial Revolution is transforming practically every human activity...its scope, speed and reach are unprecedented.

Enormous power (insert from DSC: this is what I was trying to get at here) entails enormous risk. Yes, the stakes are high.

 

“And make no mistake about it: we are now writing the code that will shape our collective future.” CEO of Siemens AG

 

 

Contrary to Milton Friedman’s maxim, the business of business should not just be business. Shareholder value alone should not be the yardstick. Instead, we should make stakeholder value, or better yet, social value, the benchmark for a company’s performance.

Today, stakeholders…rightfully expect companies to assume greater social responsibility, for example, by protecting the climate, fighting for social justice, aiding refugees, and training and educating workers. The business of business should be to create value for society.

This seamless integration of the virtual and the physical worlds in so-called cyber-physical systems – that is the giant leap we see today. It eclipses everything that has happened in industry so far. As in previous industrial revolutions but on a much larger scale, the Fourth Industrial Revolution will eliminate millions of jobs and create millions of new jobs.

 

“…because the Fourth Industrial Revolution runs on knowledge, we need a concurrent revolution in training and education.

If the workforce doesn’t keep up with advances in knowledge throughout their lives, how will the millions of new jobs be filled?” 

Joe Kaeser, President and Chief Executive Officer, Siemens AG

 

 


From DSC:
At least three critically important things jump out at me here:

  1. We are quickly approaching a time when people will need to be able to reinvent themselves quickly and cost-effectively, especially those with families who are working in their (still-existing) jobs. (Or have we already entered this period of time…?)
  2. There is a need to help people identify which jobs are safe to reinvent themselves into — at least for the next 5-10 years.
  3. Citizens across the globe — and their relevant legislatures, governments, and law schools — need to help close the gap between emerging technologies and the question of whether those technologies should even be rolled out and, if so, how and with which features.

 


 

What freedoms and rights should individuals have in the digital age?

Joe Kaeser, President and Chief Executive Officer, Siemens AG

 

 

Facial recognition has to be regulated to protect the public, says AI report — from technologyreview.com by Will Knight
The research institute AI Now has identified facial recognition as a key challenge for society and policymakers—but is it too late?

Excerpt (emphasis DSC):

Artificial intelligence has made major strides in the past few years, but those rapid advances are now raising some big ethical conundrums.

Chief among them is the way machine learning can identify people’s faces in photos and video footage with great accuracy. This might let you unlock your phone with a smile, but it also means that governments and big corporations have been given a powerful new surveillance tool.

A new report from the AI Now Institute (large PDF), an influential research institute based in New York, has just identified facial recognition as a key challenge for society and policymakers.

 

Also see:

EXECUTIVE SUMMARY
At the core of the cascading scandals around AI in 2018 are questions of accountability: who is responsible when AI systems harm us? How do we understand these harms, and how do we remedy them? Where are the points of intervention, and what additional research and regulation is needed to ensure those interventions are effective? Currently there are few answers to these questions, and the frameworks presently governing AI are not capable of ensuring accountability. As the pervasiveness, complexity, and scale of these systems grow, the lack of meaningful accountability and oversight – including basic safeguards of responsibility, liability, and due process – is an increasingly urgent concern.

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central problem and addresses the following key issues:

  1. The growing accountability gap in AI, which favors those who create and deploy these technologies at the expense of those most affected
  2. The use of AI to maximize and amplify surveillance, especially in conjunction with facial and affect recognition, increasing the potential for centralized control and oppression
  3. Increasing government use of automated decision systems that directly impact individuals and communities without established accountability structures
  4. Unregulated and unmonitored forms of AI experimentation on human populations
  5. The limits of technological solutions to problems of fairness, bias, and discrimination

Within each topic, we identify emerging challenges and new research, and provide recommendations regarding AI development, deployment, and regulation. We offer practical pathways informed by research so that policymakers, the public, and technologists can better understand and mitigate risks. Given that the AI Now Institute’s location and regional expertise is concentrated in the U.S., this report will focus primarily on the U.S. context, which is also where several of the world’s largest AI companies are based.

 

 

From DSC:
As I said in this posting, we need to be aware of the emerging technologies around us. Just because we can, doesn’t mean we should. People need to be aware of — and involved with — which emerging technologies get rolled out (or not) and/or which features are beneficial to roll out (or not).

One of the things that’s beginning to alarm me these days is how the United States has turned over the keys to the Maserati — i.e., an expensive, powerful machine — to youth who lack the life experience to know how to handle such power and, often, the proper respect for it. Many of these youthful members of our society don’t own the responsibility for the positive and negative influences and impacts that such powerful technologies can have (and the more senior execs have not taken enough responsibility either)!

If you owned the car below, would you turn the keys of this ~$137,000+ car over to your 16-25 year old? Yet that’s what America has been doing for years. And, in some areas, we’re now paying the price.

 

If you owned this $137,000+ car, would you turn the keys of it over to your 16-25 year old?!

 

The corporate world continues to discard the hard-earned experience that age brings…as it shoves older people out of the workforce. (I hesitate to use the word wisdom…but in some cases, that’s also relevant here.) Then we, as a society, sit back and wonder how we got to this place.

Even technologists and programmers in their 20s and 30s are beginning to step back and ask…WHY did we develop this application or that feature? Was it — is it — good for society? Is it beneficial? Or should it be tabled or revised into something else?

Below is but one example — though I don’t mean to pick on Microsoft, as they likely have more older workers than the Facebooks, Googles, or Amazons of the world. I fully realize that all of these companies have some older employees. But the youth-oriented culture in America today has almost become an obsession — and not just in the tech world. Turn on the TV, check out the new releases on Netflix, go see a movie in a theater, listen to the radio, cast but a glance at the magazines in the checkout lines, etc., and you’ll instantly know what I mean.

In the workplace, there appears to be a bias against older employees as being less innovative or tech-savvy — a perspective that is often completely incorrect. Go check out LinkedIn for items re: age discrimination…it’s a very real thing. Many of us over the age of 30 know this to be true if we’ve lost a job in the last decade or two and have tried to get a new one that involves technology.

 

Microsoft argues facial-recognition tech could violate your rights — from finance.yahoo.com by Rob Pegoraro

Excerpt (emphasis DSC):

On Thursday, the American Civil Liberties Union provided a good reason for us to think carefully about the evolution of facial-recognition technology. In a study, the group used Amazon’s (AMZN) Rekognition service to compare portraits of members of Congress to 25,000 arrest mugshots. The result: 28 members were mistakenly matched with 28 suspects.

The ACLU isn’t the only group raising the alarm about the technology. Earlier this month, Microsoft (MSFT) president Brad Smith posted an unusual plea on the company’s blog asking that the development of facial-recognition systems not be left up to tech companies.

Saying that the tech “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression,” Smith called for “a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission.”

But we may not get new laws anytime soon.
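
From DSC:
For readers curious about the mechanics of a test like the ACLU’s, below is a minimal, hypothetical sketch of how a single portrait could be compared against a batch of mugshot photos using Amazon’s Rekognition service (via the boto3 library in Python). The file names are placeholders and this is not the ACLU’s actual code; reporting indicated the group used Rekognition’s default 80 percent similarity threshold, which Amazon has argued is too low for law-enforcement use.

```python
# Hypothetical sketch: compare one portrait against several mugshot images
# using Amazon Rekognition's CompareFaces API. File names are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def load_bytes(path):
    with open(path, "rb") as f:
        return f.read()

portrait = load_bytes("portrait.jpg")                      # hypothetical file
mugshots = ["mugshot_001.jpg", "mugshot_002.jpg"]          # hypothetical files

for mugshot_path in mugshots:
    response = rekognition.compare_faces(
        SourceImage={"Bytes": portrait},
        TargetImage={"Bytes": load_bytes(mugshot_path)},
        SimilarityThreshold=80.0,  # Rekognition's default; higher values yield fewer false matches
    )
    for match in response["FaceMatches"]:
        print(f"{mugshot_path}: similarity {match['Similarity']:.1f}%")
```

The point of the sketch is simply that this capability is a few dozen lines of code away for anyone with a cloud account, which is part of why the calls for regulation above feel so urgent.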

 

just because we can does not mean we should

 

Just because we can…

 

 

Dezeen’s top 10 architecture and interiors trends of 2018 — from dezeen.com by Tom Ravenscroft

Excerpt:

For our review of 2018, deputy editor Tom Ravenscroft looks at the year’s biggest trends in architecture and interior design, including action to address gender imbalance, the rise of hyperloop and the move towards off-grid living.

 

 

 

Smart speakers hit critical mass in 2018 — from techcrunch.com by Sarah Perez

Excerpt (emphasis DSC):

We already know Alexa had a good Christmas — the app shot to the top of the App Store over the holidays, and the Alexa service even briefly crashed from all the new users. But Alexa, along with other smart speaker devices like Google Home, didn’t just have a good holiday — they had a great year, too. The smart speaker market reached critical mass in 2018, with around 41 percent of U.S. consumers now owning a voice-activated speaker, up from 21.5 percent in 2017.

 

In the U.S., there are now more than 100 million Alexa-enabled devices installed — a key milestone for Alexa to become a “critical mass platform,” the report noted.

 

 

5 influencers predict AI’s impact on business in 2019 — from martechadvisor.com by Christine Crandell

Excerpt:

With Artificial Intelligence (AI) already proving its worth to adopters, it’s not surprising that an increasing number of companies will implement and leverage AI in 2019. Now, it’s no longer a question of whether AI will take off. Instead, it’s a question of which companies will keep up. Here are five predictions from five influencers on the impact AI will have on businesses in 2019, writes Christine Crandell, President, New Business Strategies.

 

 

Should we be worried about computerized facial recognition? — from newyorker.com by David Owen
The technology could revolutionize policing, medicine, even agriculture—but its applications can easily be weaponized.

 

Facial-recognition technology is advancing faster than the people who worry about it have been able to think of ways to manage it. Indeed, in any number of fields the gap between what scientists are up to and what nonscientists understand about it is almost certainly greater now than it has been at any time since the Manhattan Project. 

 

From DSC:
This is why law schools, legislatures, and the federal government need to become much more responsive to emerging technologies. The pace of technological change has changed. But have other important institutions of our society adapted to this new pace of change?

 

 

Andrew Ng sees an eternal springtime for AI — from zdnet.com by Tiernan Ray
Former Google Brain leader and Baidu chief scientist Andrew Ng lays out the steps companies should take to succeed with artificial intelligence, and explains why there’s unlikely to be another “AI winter” like in times past.

 

 

Google Lens now recognizes over 1 billion products — from venturebeat.com by Kyle Wiggers with thanks to Marie Conway for her tweet on this

Excerpt:

Google Lens, Google’s AI-powered analysis tool, can now recognize over 1 billion products from Google’s retail and price comparison portal, Google Shopping. That’s four times the number of objects Lens covered in October 2017, when it made its debut.

Aparna Chennapragada, vice president of Google Lens and augmented reality at Google, revealed the tidbit in a retrospective blog post about Google Lens’ milestones.

 

Amazon Customer Receives 1,700 Audio Files Of A Stranger Who Used Alexa — from npr.org by Sasha Ingber

Excerpt:

When an Amazon customer in Germany contacted the company to review his archived data, he wasn’t expecting to receive recordings of a stranger speaking in the privacy of a home.

The man requested to review his data in August under a European Union data protection law, according to a German trade magazine called c’t. Amazon sent him a download link to tracked searches on the website — and 1,700 audio recordings by Alexa that were generated by another person.

“I was very surprised about that because I don’t use Amazon Alexa, let alone have an Alexa-enabled device,” the customer, who was not named, told the magazine. “So I randomly listened to some of these audio files and could not recognize any of the voices.”

 

 

On one hand, XR-related technologies
show some promise and possibilities…

 

The AR Cloud will infuse meaning into every object in the real world — from venturebeat.com by Amir Bozorgzadeh

Excerpt:

Indeed, if you haven’t yet heard of the “AR Cloud”, it’s time to take serious notice. The term was coined by Ori Inbar, an AR entrepreneur and investor who founded AWE. It is, in his words, “a persistent 3D digital copy of the real world to enable sharing of AR experiences across multiple users and devices.”

 

Augmented reality invades the conference room — from zdnet.com by Ross Rubin
Spatial extends the core functionality of video and screen sharing apps to a new frontier.

 

 

The 5 most innovative augmented reality products of 2018 — from next.reality.news by Adario Strange

 

 

Augmented, virtual reality major opens at Shenandoah U. next fall — from edscoop.com by Betsy Foresman

Excerpt:

“It’s not about how virtual reality functions. It’s about, ‘How does history function in virtual reality? How does biology function in virtual reality? How does psychology function with these new tools?’” he said.

The school hopes to prepare students for careers in a field with a market size projected to grow to $209.2 billion by 2022, according to Statista. With the technology still at its advent, Whelan compared VR to the introduction of the personal computer.

 

VR is leading us into the next generation of sports media — from venturebeat.com by Mateusz Przepiorkowski

 

 

Accredited surgery instruction now available in VR — from zdnet.com by Greg Nichols
The medical establishment has embraced VR training as a cost-effective, immersive alternative to classroom time.

 

Toyota is using Microsoft’s HoloLens to build cars faster — from cnn.com by Rachel Metz

From DSC:
But even in that posting the message is mixed…some pros, some cons. Some things are going well for XR-related technologies, but other things are not going well at all.

 

 

…but on the other hand,
some things don’t look so good…

 

Is the Current Generation of VR Already Dead? — from medium.com by Andreas Goeldi

Excerpt:

Four years later, things are starting to look decidedly bleak. Yes, there are about 5 million Gear VR units and 3 million Sony Playstation VR headsets in market, plus probably a few hundred thousand higher-end Oculus and HTC Vive systems. Yes, VR is still being demonstrated at countless conferences and events, and big corporations that want to seem innovative love to invest in a VR app or two. Yes, Facebook just cracked an important low-end price point with its $200 Oculus Go headset, theoretically making VR affordable for mainstream consumers. Plus, there’s even more hype about Augmented Reality, which in a way could be a gateway drug to VR.

But it’s hard to ignore a growing feeling that VR is not developing as the industry hoped it would. So is that it again, we’ve seen this movie before, let’s all wrap it up and wait for the next wave of VR to come along about five years from now?

There are a few signs that are really worrying…

 

 

From DSC:
My take is that it’s too early to tell. We need to give things more time.

 

 

 

5 questions we should be asking about automation and jobs — from hbr.org by Jed Kolko

Excerpts:

  1. Will workers whose jobs are automated be able to transition to new jobs?*
  2. Who will bear the burden of automation?
  3. How will automation affect the supply of labor?
  4. How will automation affect wages, and how will wages affect automation?
  5. How will automation change job searching?

 

From DSC:
For those Economics profs and students out there, I posted this with you in mind; it’s also highly applicable and relevant to MBA programs.

* I would add a few follow-up questions to question #1 above:

  • To which jobs should they transition?
  • Who can help identify the jobs that might be safe for 5-10 years?
  • If you have a family to feed, how are you going to be able to reinvent yourself quickly and as efficiently/flexibly as possible? (Yes…constant, online-based learning comes to my mind as well, as campus-based education is great, but very time-consuming.)

 

Also see:

We Still Don’t Know Much About the Jobs the AI Economy Will Make — or Take — from medium.com by Rachel Metz with MIT Technology Review
Experts think companies need to invest in workers the way they do for other core aspects of their business they’re looking to future-proof

One big problem that could have lasting effects, she thinks, is a mismatch between the skills companies need in new employees and those that employees have or know that they can readily acquire. To fix this, she said, companies need to start investing in their workers the way they do their supply chains.

 

Per LinkedIn:

Putting robots to work is becoming more and more popular, particularly in Europe. According to the European Bank for Reconstruction and Development, Slovakian workers face a 62% median probability that their job will be automated “in the near future.” Workers in Eastern Europe face the biggest likelihood of having their jobs overtaken by machines, with the textile, agriculture and manufacturing industries seen as the most vulnerable.

 

Robot Ready: Human+ Skills for the Future of Work — from economicmodeling.com

Key Findings

In Robot-Ready, we examine several striking insights:

1. Human skills—like leadership, communication, and problem solving—are among the most in-demand skills in the labor market.

2. Human skills are applied differently across career fields. To be effective, liberal arts grads must adapt their skills to the job at hand.

3. Liberal arts grads should add technical skills. There is considerable demand for workers who complement their human skills with basic technical skills like data analysis and digital fluency.

4. Human+ skills are at work in a variety of fields. Human skills help liberal arts grads thrive in many career areas, including marketing, public relations, technology, and sales.

 

 

 

The top learning trends for 2019: Towards a digital-human workforce — from hrdive.com; a sponsored posting by Shelley Osborne, Head of L&D at Udemy

Excerpt:

New digital technologies like artificial intelligence (AI) and automation tools are rapidly changing the way we work, develop products, and interact with our customers. Intelligent automation tools augment what people do at work and will redefine what’s possible.

As organizations navigate this complex digital transformation, learning & development (L&D) leaders are tasked with keeping employees up to speed with the ever-evolving skills ecosystem.

To uncover emerging trends and predict what’s required for 2019, we surveyed 400 L&D leaders to find out what they’re doing to prepare their workforce for this digital transformation.

 

“With the rise of automation, the world of work is experiencing the largest job transition since the shift from agriculture to manufacturing jobs during the Industrial Revolution. By 2030, as many as 375 million workers—or roughly 14 percent of the global workforce—may need to switch occupational categories as digitization, automation, and advances in artificial intelligence disrupt the world of work,” according to McKinsey Global Institute.

 

Digital transformation reality check: 10 trends — from enterprisersproject.com by Stephanie Overby
2019 is the year when CIOs scrutinize investments, work even more closely with the CEO, and look to AI to shape strategy. What other trends will prove key?

Excerpt (emphasis DSC):

6. Technology convergence expands
Lines have already begun to blur between software development and IT operations thanks to the widespread adoption of DevOps. Meanwhile, IT and operational technology are also coming together in data-centric industries like manufacturing and logistics.

“A third convergence – that many are feeling but not yet articulating, and that will have a profound impact on how CIOs structure and staff their organizations, design their architectures, build their budgets, and govern their operations – is the convergence of applications and infrastructure,” says Edwards. “In the digital age, it is nearly impossible to build a strategy for infrastructure that doesn’t include a substantial number of considerations for applications and vice versa.”

While most IT organizations still have heads of infrastructure and applications managing their own teams, that may begin to change as trends like software-defined infrastructure grow. “In 2019, CIOs will need to begin to grapple with the challenges to their operating models when the lines within the traditional IT tower blur and sometimes fade,” Edwards says.

 

 

5 things you will see in the future “smart city” — from interestingengineering.com by Taylor Donovan Barnett
The Smart City is on the horizon, and here are some of the crucial technologies that are part of it.


Excerpt:

A New Framework: The Smart City
So, what exactly is a smart city? A smart city is an urban center that hosts a wide range of digital technology across its ecosystem. However, smart cities go far beyond just this definition.

Smart cities use technology to better the population’s living experience, operating as one big data-driven ecosystem.

The smart city uses data from people, vehicles, buildings, etc., not only to improve citizens’ lives but also to minimize the environmental impact of the city itself, constantly communicating with itself to maximize efficiency.

So what are some of the crucial components of the future smart city? Here is what you should know.

 

 

 

Hey bot, what’s next in line in chatbot technology? — from blog.engati.com by Imtiaz Bellary

Excerpts:

Scenario 1: Are bots the new apps?
Scenario 2: Bot conversations that make sense
Scenario 3: Can bots increase employee throughput?
Scenario 4: Let voice take over!

 

Voice as an input medium is catching on, with an increasing number of folks adopting Amazon Echo and other digital assistants for their daily chores. Can we expect bots to gauge your mood and provide a personalised experience rather than a standard response? In regulated scenarios, voice acts as an authentication mechanism for the bot to pursue actions. Voice as an input adds sophistication and makes it easier to do tasks quickly, thereby improving the user experience.

 

 

Forecast 5.0 – Navigating the Future of Learning — from knowledgeworks.org by Katherine Prince, Jason Swanson, and Katie King
Discover how current trends could impact learning ten years from now and consider ways to shape a future where all students can thrive.

 

 

 

AI Now Report 2018 | December 2018  — from ainowinstitute.org

Meredith Whittaker, AI Now Institute, New York University, Google Open Research
Kate Crawford, AI Now Institute, New York University, Microsoft Research
Roel Dobbe, AI Now Institute, New York University
Genevieve Fried, AI Now Institute, New York University
Elizabeth Kaziunas, AI Now Institute, New York University
Varoon Mathur, AI Now Institute, New York University
Sarah Myers West, AI Now Institute, New York University
Rashida Richardson, AI Now Institute, New York University
Jason Schultz, AI Now Institute, New York University School of Law
Oscar Schwartz, AI Now Institute, New York University

With research assistance from Alex Campolo and Gretchen Krueger (AI Now Institute, New York University)

Excerpt (emphasis DSC):

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central problem, and provides 10 practical recommendations that can help create accountability frameworks capable of governing these powerful technologies.

  1. Governments need to regulate AI by expanding the powers of sector-specific agencies to oversee, audit, and monitor these technologies by domain.
  2. Facial recognition and affect recognition need stringent regulation to protect the public interest.
  3. The AI industry urgently needs new approaches to governance. As this report demonstrates, internal governance structures at most technology companies are failing to ensure accountability for AI systems.
  4. AI companies should waive trade secrecy and other legal claims that stand in the way of accountability in the public sector.
  5. Technology companies should provide protections for conscientious objectors, employee organizing, and ethical whistleblowers.
  6.  Consumer protection agencies should apply “truth-in-advertising” laws to AI products and services.
  7. Technology companies must go beyond the “pipeline model” and commit to addressing the practices of exclusion and discrimination in their workplaces.
  8. Fairness, accountability, and transparency in AI require a detailed account of the “full stack supply chain.”
  9. More funding and support are needed for litigation, labor organizing, and community participation on AI accountability issues.
  10. University AI programs should expand beyond computer science and engineering disciplines. AI began as an interdisciplinary field, but over the decades has narrowed to become a technical discipline. With the increasing application of AI systems to social domains, it needs to expand its disciplinary orientation. That means centering forms of expertise from the social and humanistic disciplines. AI efforts that genuinely wish to address social implications cannot stay solely within computer science and engineering departments, where faculty and students are not trained to research the social world. Expanding the disciplinary orientation of AI research will ensure deeper attention to social contexts, and more focus on potential hazards when these systems are applied to human populations.

 

Also see:

After a Year of Tech Scandals, Our 10 Recommendations for AI — from medium.com by the AI Now Institute
Let’s begin with better regulation, protecting workers, and applying “truth in advertising” rules to AI

 

Also see:

Excerpt:

As we discussed, this technology brings important and even exciting societal benefits but also the potential for abuse. We noted the need for broader study and discussion of these issues. In the ensuing months, we’ve been pursuing these issues further, talking with technologists, companies, civil society groups, academics and public officials around the world. We’ve learned more and tested new ideas. Based on this work, we believe it’s important to move beyond study and discussion. The time for action has arrived.

We believe it’s important for governments in 2019 to start adopting laws to regulate this technology. The facial recognition genie, so to speak, is just emerging from the bottle. Unless we act, we risk waking up five years from now to find that facial recognition services have spread in ways that exacerbate societal issues. By that time, these challenges will be much more difficult to bottle back up.

In particular, we don’t believe that the world will be best served by a commercial race to the bottom, with tech companies forced to choose between social responsibility and market success. We believe that the only way to protect against this race to the bottom is to build a floor of responsibility that supports healthy market competition. And a solid floor requires that we ensure that this technology, and the organizations that develop and use it, are governed by the rule of law.

 

From DSC:
This is a major heads up to the American Bar Association (ABA), law schools, governments, legislatures around the country, the courts, the corporate world, as well as for colleges, universities, and community colleges. The pace of emerging technologies is much faster than society’s ability to deal with them! 

The ABA and law schools need to majorly pick up their pace — for the benefit of all within our society.

 

 

 

Alexa, get me the articles (voice interfaces in academia) — from blog.libux.co by Kelly Dagan

Excerpt:

Credit to Jill O’Neill, who has written an engaging consideration of applications, discussions, and potentials for voice-user interfaces in the scholarly realm. She details a few use case scenarios: finding recent, authoritative biographies of Jane Austen; finding if your closest library has an item on the shelf now (and whether it’s worth the drive based on traffic).

Coming from an undergraduate-focused (and library) perspective, I can think of a few more:

  • asking if there are any group study rooms available at 7 pm and making a booking
  • finding out if [X] is open now (Archives, the Cafe, the Library, etc.)
  • finding three books on the Red Brigades, seeing if they are available, and saving the locations
  • grabbing five research articles on stereotype threat, to read later
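
From DSC:
To make one of those scenarios concrete, here is a rough, hypothetical sketch of what the back end of an “is the library open right now?” voice query could look like, written as an AWS Lambda handler for a custom Alexa skill. The intent name, the slot, and the get_hours() helper are invented placeholders; a real deployment would wire these to the library’s own hours, room-booking, and catalog systems.

```python
# Hypothetical sketch of an AWS Lambda handler for a custom Alexa skill that
# answers "Is the library open right now?" Names below are placeholders.
from datetime import datetime

def get_hours(location):
    # Placeholder: a real skill would query the library's hours/booking API here.
    return {"open": "08:00", "close": "22:00"}

def lambda_handler(event, context):
    request = event.get("request", {})
    speech = "Sorry, I didn't catch that."

    if request.get("type") == "IntentRequest":
        intent = request.get("intent", {})
        if intent.get("name") == "LibraryHoursIntent":  # hypothetical intent name
            slots = intent.get("slots", {})
            location = slots.get("location", {}).get("value") or "the library"
            hours = get_hours(location)
            now = datetime.now().strftime("%H:%M")
            if hours["open"] <= now < hours["close"]:
                speech = f"Yes, {location} is open until {hours['close']}."
            else:
                speech = f"No, {location} is closed right now."

    # Alexa expects a JSON response in this shape.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

The hard part, of course, is not the handler itself but connecting it securely to live systems (room availability, the catalog, campus authentication), which is exactly where libraries will need to weigh convenience against privacy.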

 
