5 questions we should be asking about automation and jobs — from hbr.org by Jed Kolko

Excerpts:

  1. Will workers whose jobs are automated be able to transition to new jobs?*
  2. Who will bear the burden of automation?
  3. How will automation affect the supply of labor?
  4. How will automation affect wages, and how will wages affect automation?
  5. How will automation change job searching?

 

From DSC:
For those Economics profs and students out there, I'm posting this with you in mind; it's also highly applicable and relevant to MBA programs.

* I would add a few follow-up questions to question #1 above:

  • To which jobs should they transition?
  • Who can help identify the jobs that might be safe for 5-10 years?
  • If you have a family to feed, how are you going to reinvent yourself quickly and as efficiently/flexibly as possible? (Yes…constant, online-based learning comes to my mind here, as campus-based education is great but very time-consuming.)

 

Also see:

We Still Don’t Know Much About the Jobs the AI Economy Will Make — or Take — from medium.com by Rachel Metz with MIT Technology Review
Experts think companies need to invest in workers the way they do for other core aspects of their business they’re looking to future-proof

One big problem that could have lasting effects, she thinks, is a mismatch between the skills companies need in new employees and those that employees have or know that they can readily acquire. To fix this, she said, companies need to start investing in their workers the way they do their supply chains.

 

Per LinkedIn:

Putting robots to work is becoming more and more popular, particularly in Europe. According to the European Bank for Reconstruction and Development, Slovakian workers face a 62% median probability that their job will be automated “in the near future.” Workers in Eastern Europe face the biggest likelihood of having their jobs overtaken by machines, with the textile, agriculture and manufacturing industries seen as the most vulnerable.

 

Robot Ready: Human+ Skills for the Future of Work — from economicmodeling.com

Key Findings

In Robot-Ready, we examine several striking insights:

1. Human skills—like leadership, communication, and problem solving—are among the most in-demand skills in the labor market.

2. Human skills are applied differently across career fields. To be effective, liberal arts grads must adapt their skills to the job at hand.

3. Liberal arts grads should add technical skills. There is considerable demand for workers who complement their human skills with basic technical skills like data analysis and digital fluency.

4. Human+ skills are at work in a variety of fields. Human skills help liberal arts grads thrive in many career areas, including marketing, public relations, technology, and sales.

 

 

 

From DSC:
Not too long ago, I really enjoyed watching a program on PBS regarding America’s 100 most-loved books, entitled, “The Great American Read.”

 

Watch “The Grand Finale”

 

While that’s not the show I’m talking about here, it got me thinking of one similar to it — something educational, yet entertaining. But also something more.

The program that came to mind would focus on significant topics and issues within American society — offered up in a debate/presentation-style format.

For example, you could have different individuals, groups, or organizations discuss the pros and cons of an issue or topic. The show would provide contact information for helpful resources, groups, organizations, legislators, etc.  These contacts would be for learning more about a subject or getting involved with finding a solution for that problem.

For example, how about this for a potential topic: Grades or no grades?
  • What are the pros and cons of using an A-F grading system?
  • What are the benefits and issues/drawbacks with using grades? 
  • How are we truly using grades? Do we use them to rank and compare individuals, schools, school systems, communities? Do we use them to “weed people out” of a program?
  • With our current systems, what “product” do we get? Do we produce game-players or people who enjoy learning? (Apologies for some of my bias showing up here! But my son has become a major game-player and, likely, so did I at his age.)
  • How do grades jibe with Individualized Education Programs (IEPs)? On one hand, how do you keep someone moving forward, staying positive, and trying to keep learning/school enjoyable; yet on the other hand, how do you have those grades mean something to those who obtain data to rank school systems, communities, colleges, programs, etc.?
  • How do grades impact one’s desire to learn throughout one’s lifetime?

Such debates could be watched by students and then they could have their own debates on subjects that they propose.

Or the show could have journalists debate college or high school teams. The format could sometimes involve professors and deans debating against researchers. Or practitioners/teachers debating against researchers/cognitive psychologists. 

Such a show could be entertaining, yet highly applicable and educational. We would probably all learn something. And perhaps have our eyes opened up to a new perspective on an issue.

Or better yet, we might actually resolve some more issues and then move on to address other ones!

 

 

5 things you will see in the future “smart city” — from interestingengineering.com by Taylor Donovan Barnett
The Smart City is on the horizon, and here are some of the crucial technologies that are part of it.


Excerpt:

A New Framework: The Smart City
So, what exactly is a smart city? A smart city is an urban center that hosts a wide range of digital technology across its ecosystem. However, smart cities go far beyond just this definition.

Smart cities use technology to better the population’s living experiences, operating as one big data-driven ecosystem.

The smart city uses that data from the people, vehicles, buildings, etc. to not only improve citizens’ lives but also minimize the environmental impact of the city itself, constantly communicating with itself to maximize efficiency.

So what are some of the crucial components of the future smart city? Here is what you should know.

 

 

 

Google Glass wasn’t a failure. It raised crucial concerns. — from wired.com by Rose Eveleth

Excerpts:

So when Google ultimately retired Glass, it was in reaction to an important act of line drawing. It was an admission of defeat not by design, but by culture.

These kinds of skirmishes on the front lines of surveillance might seem inconsequential — but they can not only change the behavior of tech giants like Google, they can also change how we’re protected under the law. Each time we invite another device into our lives, we open up a legal conversation over how that device’s capabilities change our right to privacy. To understand why, we have to get wonky for a bit, but it’s worth it, I promise.

 

But where many people see Google Glass as a cautionary tale about tech adoption failure, I see a wild success. Not for Google of course, but for the rest of us. Google Glass is a story about human beings setting boundaries and pushing back against surveillance…

 

In the United States, the laws that dictate when you can and cannot record someone have several layers. But most of these laws were written when smartphones and digital home assistants weren’t even a glimmer in Google’s eye. As a result, they are mostly concerned with issues of government surveillance, not individuals surveilling each other or companies surveilling their customers. Which means that as cameras and microphones creep further into our everyday lives, there are more and more legal gray zones.

 

From DSC:
We need to be aware of the emerging technologies around us. Just because we can, doesn’t mean we should. People need to be aware of — and involved with — which emerging technologies get rolled out (or not) and/or which features are beneficial to roll out (or not).

One of the things that’s beginning to alarm me these days is how the United States has turned over the keys to the Maserati — i.e., think an expensive, powerful thing — to youth who lack the life experiences to know how to handle such power and, often, the proper respect for such power. Many of these youthful members of our society don’t own the responsibility for the positive and negative influences and impacts that such powerful technologies can have.

If you owned a ~$137,000+ car, would you turn the keys over to your 16-25 year old? Yet that’s what America has been doing for years. And, in some areas, we’re now paying the price.

 


 

The corporate world continues to discard the hard-earned experience that age brings…as it shoves older people out of the workforce. (I hesitate to use the word wisdom…but in some cases, that’s also relevant/involved here.) Then we, as a society, sit back and wonder how we got to this place.

Even technologists and programmers in their 20s and 30s are beginning to step back and ask…WHY did we develop this application or that feature? Was it — is it — good for society? Is it beneficial? Or should it be tabled or revised into something else?

Below is but one example — though I don’t mean to pick on Microsoft, as they likely have more older workers than the Facebooks, Googles, or Amazons of the world. I fully realize that all of these companies have some older employees. But the youth-oriented culture in America today has almost become an obsession — and not just in the tech world. Turn on the TV, check out the new releases on Netflix, go see a movie in a theater, listen to the radio, cast but a glance at the magazines in the checkout lines, etc. and you’ll instantly know what I mean.

In the workplace, there appears to be a bias against older employees as being less innovative or tech-savvy — such a perspective is often completely incorrect. Go check out LinkedIn for items re: age discrimination…it’s a very real thing. But many of us over the age of 30 know this to be true if we’ve lost a job in the last decade or two and have tried to get a job that involves technology.

Microsoft argues facial-recognition tech could violate your rights — from finance.yahoo.com by Rob Pegoraro

Excerpt (emphasis DSC):

On Thursday, the American Civil Liberties Union provided a good reason for us to think carefully about the evolution of facial-recognition technology. In a study, the group used Amazon’s (AMZN) Rekognition service to compare portraits of members of Congress to 25,000 arrest mugshots. The result: 28 members were mistakenly matched with 28 suspects.

The ACLU isn’t the only group raising the alarm about the technology. Earlier this month, Microsoft (MSFT) president Brad Smith posted an unusual plea on the company’s blog asking that the development of facial-recognition systems not be left up to tech companies.

Saying that the tech “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression,” Smith called for “a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission.”

But we may not get new laws anytime soon.
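
To make the ACLU test described above a bit more concrete, here is a minimal sketch (in Python, using the boto3 Rekognition client) of the kind of one-to-one face comparison such a study relies on. The image file names and the 80% similarity threshold are illustrative assumptions, not details taken from the ACLU’s work; the point is simply that a single threshold parameter largely determines how often false matches occur.

```python
# Minimal sketch of a Rekognition face comparison (not the ACLU's actual code).
# The image file names and the 80% threshold below are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def face_similarities(portrait_bytes: bytes, mugshot_bytes: bytes,
                      threshold: float = 80.0) -> list:
    """Return the similarity scores (0-100) of any matches at or above the threshold."""
    response = rekognition.compare_faces(
        SourceImage={"Bytes": portrait_bytes},
        TargetImage={"Bytes": mugshot_bytes},
        SimilarityThreshold=threshold,  # raising this reduces false matches; lowering it increases them
    )
    return [match["Similarity"] for match in response["FaceMatches"]]

# Usage: compare one portrait against one mugshot.
with open("portrait.jpg", "rb") as p, open("mugshot.jpg", "rb") as m:
    scores = face_similarities(p.read(), m.read())
    print("possible match" if scores else "no match above threshold", scores)
```

A mismatch rate that looks small per comparison can still produce many false matches when a portrait is run against tens of thousands of mugshots, which is why both the threshold choice and the size of the comparison set matter.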

 


Addendum on 12/27/18: — also related/see:

‘We’ve hit an inflection point’: Big Tech failed big-time in 2018 — from finance.yahoo.com by JP Mangalindan

Excerpt (emphasis DSC):

2018 will be remembered as the year the public’s big soft-hearted love affair with Big Tech came to a screeching halt.

For years, lawmakers and the public let massive companies like Facebook, Google, and Amazon run largely unchecked. Billions of people handed them their data — photos, locations, and other status-rich updates — with little scrutiny or question. Then came revelations around several high-profile data breaches from Facebook: a back-to-back series of rude awakenings that taught casual web-surfing, smartphone-toting citizens that uploading their data into the digital ether could have consequences. Google reignited the conversation around sexual harassment, spurring thousands of employees to walk out, while Facebook reminded some corners of the U.S. that racial bias, even in supposedly egalitarian Silicon Valley, remained alive and well. And Amazon courted well over 200 U.S. cities in its gaudy and protracted search for a second headquarters.

“I think 2018 was the year that people really called tech companies on the carpet about the way that they’ve been behaving conducting their business,” explained Susan Etlinger, an analyst at the San Francisco-based Altimeter Group. “We’ve hit an inflection point where people no longer feel comfortable with the ways businesses are conducting themselves. At the same time, we’re also at a point, historically, where there’s just so much more willingness to call out businesses and institutions on bigotry, racism, sexism and other kinds of bias.”

 

The public’s love affair with Facebook hit its first major rough patch in 2016 when Russian trolls attempted to meddle with the 2016 U.S. presidential election using the social media platform. But it was the Cambridge Analytica controversy that may go down in internet history as the start of a series of back-to-back, bruising controversies for the social network, which for years, served as the Silicon Valley poster child of the nouveau American Dream. 

 

 

Forecast 5.0 – The Future of Learning: Navigating the Future of Learning  — from knowledgeworks.org by Katherine Prince, Jason Swanson, and Katie King
Discover how current trends could impact learning ten years from now and consider ways to shape a future where all students can thrive.

 

 

 

AI Now Report 2018 | December 2018  — from ainowinstitute.org

Meredith Whittaker, AI Now Institute, New York University, Google Open Research
Kate Crawford, AI Now Institute, New York University, Microsoft Research
Roel Dobbe, AI Now Institute, New York University
Genevieve Fried, AI Now Institute, New York University
Elizabeth Kaziunas, AI Now Institute, New York University
Varoon Mathur, AI Now Institute, New York University
Sarah Myers West, AI Now Institute, New York University
Rashida Richardson, AI Now Institute, New York University
Jason Schultz, AI Now Institute, New York University School of Law
Oscar Schwartz, AI Now Institute, New York University

With research assistance from Alex Campolo and Gretchen Krueger (AI Now Institute, New York University)

Excerpt (emphasis DSC):

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central problem, and provides 10 practical recommendations that can help create accountability frameworks capable of governing these powerful technologies.

  1. Governments need to regulate AI by expanding the powers of sector-specific agencies to oversee, audit, and monitor these technologies by domain.
  2. Facial recognition and affect recognition need stringent regulation to protect the public interest.
  3. The AI industry urgently needs new approaches to governance. As this report demonstrates, internal governance structures at most technology companies are failing to ensure accountability for AI systems.
  4. AI companies should waive trade secrecy and other legal claims that stand in the way of accountability in the public sector.
  5. Technology companies should provide protections for conscientious objectors, employee organizing, and ethical whistleblowers.
  6.  Consumer protection agencies should apply “truth-in-advertising” laws to AI products and services.
  7. Technology companies must go beyond the “pipeline model” and commit to addressing the practices of exclusion and discrimination in their workplaces.
  8. Fairness, accountability, and transparency in AI require a detailed account of the “full stack supply chain.”
  9. More funding and support are needed for litigation, labor organizing, and community participation on AI accountability issues.
  10. University AI programs should expand beyond computer science and engineering disciplines. AI began as an interdisciplinary field, but over the decades has narrowed to become a technical discipline. With the increasing application of AI systems to social domains, it needs to expand its disciplinary orientation. That means centering forms of expertise from the social and humanistic disciplines. AI efforts that genuinely wish to address social implications cannot stay solely within computer science and engineering departments, where faculty and students are not trained to research the social world. Expanding the disciplinary orientation of AI research will ensure deeper attention to social contexts, and more focus on potential hazards when these systems are applied to human populations.

 

Also see:

After a Year of Tech Scandals, Our 10 Recommendations for AI — from medium.com by the AI Now Institute
Let’s begin with better regulation, protecting workers, and applying “truth in advertising” rules to AI

 

Also see:

Excerpt:

As we discussed, this technology brings important and even exciting societal benefits but also the potential for abuse. We noted the need for broader study and discussion of these issues. In the ensuing months, we’ve been pursuing these issues further, talking with technologists, companies, civil society groups, academics and public officials around the world. We’ve learned more and tested new ideas. Based on this work, we believe it’s important to move beyond study and discussion. The time for action has arrived.

We believe it’s important for governments in 2019 to start adopting laws to regulate this technology. The facial recognition genie, so to speak, is just emerging from the bottle. Unless we act, we risk waking up five years from now to find that facial recognition services have spread in ways that exacerbate societal issues. By that time, these challenges will be much more difficult to bottle back up.

In particular, we don’t believe that the world will be best served by a commercial race to the bottom, with tech companies forced to choose between social responsibility and market success. We believe that the only way to protect against this race to the bottom is to build a floor of responsibility that supports healthy market competition. And a solid floor requires that we ensure that this technology, and the organizations that develop and use it, are governed by the rule of law.

 

From DSC:
This is a major heads up to the American Bar Association (ABA), law schools, governments, legislatures around the country, the courts, the corporate world, as well as for colleges, universities, and community colleges. The pace of emerging technologies is much faster than society’s ability to deal with them! 

The ABA and law schools need to majorly pick up their pace — for the benefit of all within our society.

 

 

 

What teaching that lifts all students could look like — from kqed.org by Kristina Rizga

An initial comment from DSC:
I recently ran across this article. Although it’s from 12/24/15, it has some really solid points and strategies in it. Definitely worth a read if you are a teacher or even a parent with school age kids.

Excerpts (emphasis DSC):

…his comments in class about the substance of her ideas, his feedback on her writing, the enthusiasm in his voice when he discussed her thinking. Over the course of a year, that proof solidified into a confidence that couldn’t be easily shaken anymore. It was that pride in her intellect that gave her the fortitude and resilience to cut through many racial stereotypes and negative myths as she made her way through high school and then Boston University.

For McKamey, the most important value driving her teaching and coaching is her conviction that being a good teacher means hearing, seeing, and succeeding with all students—regardless of how far a student is from the teacher’s preconceived notions of what it means to be ready to learn. When teachers are driven by a belief that all of their students can learn, they are able to respond to the complexity of their students’ needs and to adjust if something is not working for a particular individual or group of students.

 

 

The best way to improve teaching and reduce the achievement gaps, McKamey argues, is to allow teachers to act as school-based researchers and leaders, justifying classroom reforms based on the broad range of performance markers of their students: daily grades, the quality of student work and the rate of its production, engagement, effort, attendance, and student comments. That means planning units together and then spending a lot of time analyzing the iterative work the students produce. This process teaches educators to recognize that there are no standard individuals, and there are as many learning trajectories as there are people. 

 

 

 

 

 

Below are some excerpted slides from her presentation…

Also see:

  • 20 important takeaways for learning world from Mary Meeker’s brilliant tech trends – from donaldclarkplanb.blogspot.com by Donald Clark
    Excerpt:
    Mary Meeker’s slide deck has a reputation of being the Delphic Oracle of tech. But, at 294 slides it’s a lot to take in. Don’t worry, I’ve been through them all. It has tons on economic stuff that is of marginal interest to education and training but there’s plenty to get our teeth into. We’re not immune to tech trends, indeed we tend to follow in lock-step, just a bit later than everyone else. Among the data are lots of fascinating insights that point the way forward in terms of what we’re likely to be doing over the next decade. So here’s a really quick, top-end summary for folk in the learning game.

 

“Educational content usage online is ramping fast” with over 1 billion daily educational videos watched. There is evidence that use of the Internet for informal and formal learning is taking off.

 

 

 

 

 

 

10 Big Takeaways From Mary Meeker’s Widely-Read Internet Report — from fortune.com by Leena Rao

 

 

 

 

From DSC:
As important as being able to effectively communicate with others is, I think we could do a better job throughout the entire learning continuum of providing more coaching in this regard. We should provide far more courses, and/or e-learning-based modules, and/or examples where someone is being coached on how best to communicate in a variety of situations.

Some examples/scenarios along the continuum (i.e., pre-K-12, higher ed and/or vocational work, and the workplace) might include:

  • For children communicating with each other
    • How to ask if someone wants to play (and how best to find an activity everyone wants to do; or how to handle getting a no each time)
    • How to handle a situation where one’s friend is really angry about something or is being extra quiet about something
    • How to listen
  • For children communicating with adults (and vice versa)
    • How to show respect
    • How to listen
    • Not being shy but feeling free to say what’s on their mind with a known/respected adult
  • For high schoolers
    • Wondering how best to interview for that new job
    • For communicating with parents
    • How to handle issues surrounding diversity and showing respect for differences
    • How to listen
  • For college students
    • Wondering how best to interview for that new job
    • Encouraging them to use their professors’ office hours — and to practice communication-related skills therein as well
    • For communicating with parents, and vice versa
    • How to listen
  • For those entering the workplace
    • How to communicate with co-workers
    • For dealing with customers who are irate about something that happened (or didn’t happen) to them
    • How to listen
  • For managers and their communications with their employees
    • How to encourage
    • How to handle disciplinary issues or change behaviors
    • How to listen
  • For leaders and their communications with their departments, staffs, companies, organizations
    • How to inspire, guide, lead
    • How to listen

I intentionally inserted the word listen many times in the above scenarios, as I don’t think we do enough about — or even think about — actively developing that skill.

The manner in which we deliver and engage learners here could vary:

  • One possible way would be to use interactive videos that pause at critical points within conversations and ask the listeners how they would respond at those points in the scenarios. They might have 2-3 choices per decision point. When the video continues, based upon which selection they chose, the learner could see how things panned out along that route. (A rough sketch of this kind of branching structure appears after this list.)
  • Or perhaps we could host some seminars or workshops with students on how to use web-based collaboration tools (videoconferencing and/or audio only based meetings) and/or social media related tools.
  • Or perhaps such training could occur in more face-to-face environments with 2 or more learners reading a scene-setting script, then pausing at critical points in the conversation for students to discuss the best possible responses.
  • ….and I’m sure there are other methods that could be employed as well.
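
For the interactive-video idea above, here is one minimal sketch (in Python, purely illustrative — the segment names, clip file names, and the console interaction are assumptions, not an existing product or API) of how such a branching scenario could be represented: each segment points to a clip, and each decision point maps a learner’s choice to the next segment.

```python
# Illustrative sketch of a branching video scenario; clip names and the console
# "player" are hypothetical stand-ins for a real video platform.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DecisionPoint:
    prompt: str                                   # question shown when the video pauses
    choices: dict = field(default_factory=dict)   # choice text -> id of the next segment

@dataclass
class Segment:
    clip: str                                     # video clip to play for this branch
    decision: Optional[DecisionPoint] = None      # None means the scenario ends here

# Example scenario: coaching a learner on handling an irate customer.
scenario = {
    "start": Segment("irate_customer_intro.mp4", DecisionPoint(
        "The customer raises their voice. What do you do first?",
        {"Interrupt and explain the policy": "escalates",
         "Listen and restate their concern": "de_escalates"})),
    "escalates": Segment("customer_walks_out.mp4"),
    "de_escalates": Segment("customer_calms_down.mp4", DecisionPoint(
        "They calm down. How do you close the conversation?",
        {"Offer a concrete next step": "good_ending",
         "End the call abruptly": "mixed_ending"})),
    "good_ending": Segment("positive_outcome.mp4"),
    "mixed_ending": Segment("neutral_outcome.mp4"),
}

def run(scenario: dict, start: str = "start") -> None:
    """Walk a learner through the scenario one choice at a time (console stand-in for video)."""
    segment = scenario[start]
    while True:
        print(f"[playing {segment.clip}]")
        if segment.decision is None:
            break
        print(segment.decision.prompt)
        options = list(segment.decision.choices)
        for i, text in enumerate(options, 1):
            print(f"  {i}. {text}")
        pick = options[int(input("Your choice: ")) - 1]
        segment = scenario[segment.decision.choices[pick]]

if __name__ == "__main__":
    run(scenario)
```

The same structure could drive an actual video player: pause at the decision point, record the choice, jump to the corresponding clip, and log the path the learner took for later debriefing.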

But for all the talk of the importance of communications, are we doing enough to provide effective examples/coaching here?

 


Some thoughts on this topic from scripture


James 1:19-20 (NIV)
Listening and Doing
19 My dear brothers and sisters, take note of this: Everyone should be quick to listen, slow to speak and slow to become angry, 20 because human anger does not produce the righteousness that God desires.

 

Proverbs 21:23 (NIV)
23 Those who guard their mouths and their tongues keep themselves from calamity.

 

Proverbs 18:20-21 (NIV)
20 From the fruit of their mouth a person’s stomach is filled; with the harvest of their lips they are satisfied. 21 The tongue has the power of life and death, and those who love it will eat its fruit.

 

Proverbs 12:18-19 (NIV)
18 The words of the reckless pierce like swords, but the tongue of the wise brings healing. 19 Truthful lips endure forever, but a lying tongue lasts only a moment.

 


 

 

 

With great tech success, comes even greater responsibility — from techcrunch.com by Ron Miller

Excerpts:

As we watch major tech platforms evolve over time, it’s clear that companies like Facebook, Apple, Google and Amazon (among others) have created businesses that are having a huge impact on humanity — sometimes positive and other times not so much.

That suggests that these platforms have to understand how people are using them and when they are trying to manipulate them or use them for nefarious purposes — or the companies themselves are. We can apply that same responsibility filter to individual technologies like artificial intelligence and indeed any advanced technologies and the impact they could possibly have on society over time.

We can be sure that Twitter’s creators never imagined a world where bots would be launched to influence an election when they created the company more than a decade ago. Over time though, it becomes crystal clear that Twitter, and indeed all large platforms, can be used for a variety of motivations, and the platforms have to react when they think there are certain parties who are using their networks to manipulate parts of the populace.

 

 

But it’s up to the companies who are developing the tech to recognize the responsibility that comes with great economic success, or simply the impact that whatever they are creating could have on society.

 

 

 

 

Why the Public Overlooks and Undervalues Tech’s Power — from morningconsult.com by Joanna Piacenza
Some experts say the tech industry is rapidly nearing a day of reckoning

Excerpts:

  • 5% picked tech when asked which industry had the most power and influence, well behind the U.S. government, Wall Street and Hollywood.
  • Respondents were much more likely to say sexual harassment was a major issue in Hollywood (49%) and government (35%) than in Silicon Valley (17%).

It is difficult for Americans to escape the technology industry’s influence in everyday life. Facebook Inc. reports that more than 184 million people in the United States log on to the social network daily, or roughly 56 percent of the population. According to the Pew Research Center, nearly three-quarters (73 percent) of all Americans and 94 percent of Americans ages 18-24 use YouTube. Amazon.com Inc.’s market value is now nearly three times that of Walmart Inc.

But when asked which geographic center holds the most power and influence in America, respondents in a recent Morning Consult survey ranked the tech industry in Silicon Valley far behind politics and government in Washington, finance on Wall Street and the entertainment industry in Hollywood.

 

 

 

 

Fake videos are on the rise. As they become more realistic, seeing shouldn’t always be believing — from latimes.com by David Pierson

Excerpts:

It’s not hard to imagine a world in which social media is awash with doctored videos targeting ordinary people to exact revenge, extort or to simply troll.

In that scenario, where Twitter and Facebook are algorithmically flooded with hoaxes, no one could fully believe what they see. Truth, already diminished by Russia’s misinformation campaign and President Trump’s proclivity to label uncomplimentary journalism “fake news,” would be more subjective than ever.

The danger there is not just believing hoaxes, but also dismissing what’s real.

The consequences could be devastating for the notion of evidentiary video, long considered the paradigm of proof given the sophistication required to manipulate it.

“This goes far beyond ‘fake news’ because you are dealing with a medium, video, that we traditionally put a tremendous amount of weight on and trust in,” said David Ryan Polgar, a writer and self-described tech ethicist.

 

 

 

 

From DSC:
Though I’m typically pro-technology, this is truly disturbing. There are certainly downsides to technology as well as upsides — but it’s how we use a technology that can make the real difference. Again, this is truly disturbing.

 

 

Global Human Capital Report 2017 — from the World Economic Forum

Excerpt from the Conclusion section (emphasis DSC):

Technological change and its impact on labour markets calls for a renewed focus on how the world’s human capital is invested in and leveraged for social well-being and economic prosperity for all. Many of today’s education systems are already disconnected from the skills needed to function in today’s labour markets and the exponential rate of technological and economic change is further increasing the gap between education and labour markets. Furthermore, the premise of current education systems is on developing cognitive skills, yet behavioural and non-cognitive skills that nurture an individual’s capacity to collaborate, innovate, self-direct and problem-solve are increasingly important. Current education systems are also time-compressed in a way that may not be suited to current or future labour markets. They force narrow career and expertise decisions in early youth. The divide between formal education and the labour market needs to be overcome, as learning, R&D, knowledge-sharing, retraining and innovation take place simultaneously throughout the work life cycle, regardless of the job, level or industry.

 

Insert from DSC…again I ask:

“Is it time to back up a major step and practice design thinking on the entire continuum of lifelong learning?”

 

Education delivery and financing mechanisms have gone through little change over the last decades. In many countries, many youth and children may find their paths constrained depending on the type of education they are able to afford, while others may not have access to even basic literacy and learning. On the other hand, many developed world education systems have made enormous increases in spending—with little explicit return. Early childhood education and teacher quality remain neglected areas in many developed and developing countries, despite their proven impact on learning outcomes. Both areas also suffer from lack of objective, global data.

Generational shifts also necessitate an urgent focus by governments on human capital investments, one that transcends political cycles. Ageing economies will face a historical first, as more and more of their populations cross into the 65 and over age group and their workforces shrink further, necessitating a better integration of youth, female workers, migrants and older workers. Many emerging economies face change of a different kind as a very large cohort of the next generation—one that is more connected and globalized than ever before—enters the workforce with very different aspirations, expectations and worldviews than their predecessors.

The expansion of the digital economy is accelerating the presence of a new kind of productive entity, somewhere between human capital and physical capital—robots and intelligent algorithms. As a result, some experts expect a potential reduction in the use of human labour as part of economic value creation while others expect a restructuring of the work done by people across economies but stable or growing overall levels of employment. Yet others have cautioned of the risks to economic productivity of technological reticence at the cost of realizing the raw potential of new technological advancements unfettered. While in the immediate term the link between work and livelihoods remains a basic feature of our societies, the uncertainty around the shifts underway poses fundamental questions about the long-term future structure of economies, societies and work. However, for broad-based transition and successful adaptation towards any one of these or other long-term futures, strategic and deep investments in human capital will be even more—not less—important than before.

 

 

 

 

Google AR and VR: Get a closer look with Street View in Google Earth VR

Excerpt:

With Google Earth VR, you can go anywhere in virtual reality. Whether you want to stroll along the canals of Venice, stand at the summit of Mount Kilimanjaro or soar through the sky faster than a speeding bullet, there’s no shortage of things to do or ways to explore. We love this sense of possibility, so we’re bringing Street View to Earth VR to make it easier for you to see and experience the world.

This update lets you explore Street View imagery from 85 countries right within Earth VR. Just fly down closer to street level, check your controller to see if Street View is available and enter an immersive 360° photo. You’ll find photos from the Street View team and those shared by people all around the world.

 

 

 

 

 

 

Getting students ready for the gig economy — from gettingsmart.com by Emily Liebtag

Excerpt:

1) Finding a Passion and Making an Impact
Exploring passions, interests and causes that matter should be a part of every student’s education. Projects or several short-term gigs are a great way to help students reveal (or discover) their personal passions and to facilitate their interest explorations (all while still covering core content and standards). There are many students who already have opportunities to explore their passions during the regular school day through projects and are thriving as a result.

We’ve seen students at High Tech High create business plans and sell self-designed t-shirts, students at Thrive Public Schools engage in projects around kindness and empathy in their communities, and students at One Stone work with clients on advertising and marketing gigs, exploring their passions one project at a time.

Need ideas? Engage students in projects around the Sustainable Development Goals, snag an idea from the PBL Q & A blog or simply ask students what they are curious about exploring in their community.

 

 

Robots and AI are going to make social inequality even worse, says new report — from theverge.com
Rich people are going to find it easier to adapt to automation

Excerpt:

Most economists agree that advances in robotics and AI over the next few decades are likely to lead to significant job losses. But what’s less often considered is how these changes could also impact social mobility. A new report from UK charity Sutton Trust explains the danger, noting that unless governments take action, the next wave of automation will dramatically increase inequality within societies, further entrenching the divide between rich and poor.

There are a number of reasons for this, say the report’s authors, including the ability of richer individuals to re-train for new jobs; the rising importance of “soft skills” like communication and confidence; and the reduction in the number of jobs used as “stepping stones” into professional industries.

For example, the demand for paralegals and similar professions is likely to be reduced over the coming years as artificial intelligence is trained to handle more administrative tasks. In the UK more than 350,000 paralegals, payroll managers, and bookkeepers could lose their jobs if automated systems can do the same work.

 

Re-training for new jobs will also become a crucial skill, and it’s individuals from wealthier backgrounds that are more able to do so, says the report. This can already be seen in the disparity in terms of post-graduate education, with individuals in the UK with working class or poorer backgrounds far less likely to re-train after university.

 

 

From DSC:
I can’t emphasize this enough. There are dangerous, tumultuous times ahead if we can’t figure out ways to help ALL people within the workforce reinvent themselves quickly, cost-effectively, and conveniently. Re-skilling/up-skilling ourselves is becoming increasingly important. And I’m not just talking about highly-educated people. I’m talking about people whose jobs are going to be disappearing in the near future — especially people whose stepping stones into brighter futures are disappearing, and who are going to wake up to a very different world. A very harsh world.

That’s why I’m so passionate about helping to develop a next-generation learning platform. Higher education, as an industry, has some time left to figure out its part/contribution in this new world. But the window of time could be closing, as another window of opportunity / era could be opening up for “the next Amazon.com of higher education.”

It’s up to current, traditional institutions of higher education as to how much they want to be a part of the solution. Some of the questions each institution ought to be asking are:

  1. Given our institution’s mission/vision, what landscapes should we be pulse-checking?
  2. Do we have faculty/staff/members of administration looking at those landscapes that are highly applicable to our students and to their futures? How, specifically, are the insights from those employees fed into the strategic plans of our institution?
  3. What are some possible scenarios as a result of these changing landscapes? What would our response(s) be for each scenario?
  4. Are there obstacles keeping us from innovating and responding to the shifting landscapes, especially within the workforce?
  5. How do we remove those obstacles?
  6. On a scale of 0 (we don’t innovate at all) to 10 (highly innovative), where is our culture today? Where do we hope to be 5 years from now? How do we get there?

…and there are many other questions no doubt. But I don’t think we’re looking into the future nearly enough to see the massive needs — and real issues — ahead of us.

 

 

The report, which was carried out by the Boston Consulting Group and published this Wednesday [7/12/17], looks specifically at the UK, where it says some 15 million jobs are at risk of automation. But the Sutton Trust says its findings are also relevant to other developed nations, particularly the US, where social mobility is a major problem.

 

 

 

 