Defining the skills citizens will need in the future world of work — from McKinsey & Company; with thanks to Ryan Craig for this resource

Excerpts:

Our findings help define the particular skills citizens are likely to require in the future world of work and suggest how proficiency in them can influence work-related outcomes, namely employment, income, and job satisfaction. This, in turn, suggests three actions governments may wish to take.

  1. Reform education systems
  2. Reform adult-training systems
  3. Ensure affordability of lifelong education

Establish an AI aggregator of training programs to attract adult learners and encourage lifelong learning. AI algorithms could guide users on whether they need to upskill or reskill for a new profession and shortlist relevant training programs. 
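One way such an aggregator might shortlist programs, sketched minimally in Python (all role names, program names, and skill tags below are hypothetical examples, not from McKinsey's report): compute the gap between a user's current skills and a target role's required skills, then rank programs by how many gap skills each covers.

```python
# Minimal sketch of a training-program aggregator.
# Roles, programs, and skill tags are hypothetical placeholders.

TARGET_ROLES = {
    "data analyst": {"sql", "statistics", "data visualization"},
}

PROGRAMS = {
    "Intro to SQL": {"sql"},
    "Applied Statistics": {"statistics", "data visualization"},
    "Public Speaking": {"communication"},
}

def shortlist(current_skills, target_role, top_n=3):
    """Return the user's skill gap and a ranked shortlist of programs."""
    gap = TARGET_ROLES[target_role] - set(current_skills)
    # Rank programs by how many of the missing skills each one covers.
    ranked = sorted(
        ((len(gap & skills), name) for name, skills in PROGRAMS.items()),
        reverse=True,
    )
    return gap, [name for covered, name in ranked if covered > 0][:top_n]

gap, programs = shortlist({"communication"}, "data analyst")
```

A real system would of course draw on far richer signals (labor-market data, learner history, program outcomes), but the core matching step could look something like this.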

Foundational skills that will help citizens thrive in the future of work


From DSC:
No one will have all 56 skills that McKinsey recommends here. So (HR) managers, please don’t load up your job postings with every single skill listed here. The search for purple unicorns can get tiring, old, and discouraging for those who are looking for work.

That said, much of what McKinsey’s research/data shows — and what their recommendations are — resonates with me. And that’s why I keep adding to the developments out at:

Learning from the living class room

A powerful, global, next-generation learning platform — meant to help people reinvent themselves quickly, safely, cost-effectively, conveniently, & consistently!!!

 

The Fight to Define When AI Is ‘High Risk’ — from wired.com by Khari Johnson
Everyone from tech companies to churches wants a say in how the EU regulates AI that could harm people.

Excerpt:

The AI Act is one of the first major policy initiatives worldwide focused on protecting people from harmful AI. If enacted, it will classify AI systems according to risk, more strictly regulate AI that’s deemed high risk to humans, and ban some forms of AI entirely, including real-time facial recognition in some instances. In the meantime, corporations and interest groups are publicly lobbying lawmakers to amend the proposal according to their interests.

 

From DSC:
Yet another example of the need for the legislative and legal realms to try and catch up here.

The legal realm needs to try and catch up with the exponential pace of technological change

 

Many Americans aren’t aware they’re being tracked with facial recognition while shopping  — from techradar.com by Anthony Spadafora
You’re not just on camera, you’re also being tracked

Excerpt:

Despite consumer opposition to facial recognition, the technology is currently being used in retail stores throughout the US, according to new research from Piplsay.

While San Francisco banned the police from using facial recognition back in 2019 and the EU called for a five-year ban on the technology last year, several major retailers in the US, including Lowe’s, Albertsons, and Macy’s, have been using it for both fraud and theft detection.

From DSC:
I’m not sure how prevalent this practice is…and that’s precisely the point. We don’t know what all of those cameras are actually doing in our stores, gas stations, supermarkets, etc. I put this in the categories of policy, law schools, legal, government, and others, as the legislative and legal realms need to scramble to catch up to this Wild Wild West.

Along these lines, I was watching a portion of 60 Minutes last night where they were doing a piece on autonomous trucks (reportedly to hit the roads without a person sometime later this year). When asked about oversight, there was some…but not much.

Readers of this blog will know that I have often wondered…”How does society weigh in on these things?”

Along these same lines, also see:

  • The NYPD Had a Secret Fund for Surveillance Tools — from wired.com by Sidney Fussell
    Documents reveal that police bought facial-recognition software, vans equipped with x-ray machines, and “stingray” cell site simulators—with no public oversight.
 

‘Best of Both Worlds’ — from insidehighered.com by Alexis Gravely
The expansion will allow more people to participate in prison education programs while the department prepares for across-the-board Pell Grant access for incarcerated students.

Excerpt:

The Second Chance Pell Experimental Sites Initiative will be expanded for the 2022-23 award year to allow another 69 colleges and universities to participate, paving the way for even more incarcerated individuals to gain access to higher education.

A maximum of 200 two- and four-year colleges will be able to offer prison education programs with the support of the Pell Grant, up from the 131 institutions currently participating. The department is also planning to broaden the geographic scope of Second Chance Pell, with the goal of having programs in most or all 50 states.

 

 

NLADA Welcomes DOJ’s Advice to Chief Justices to Ensure Fair Process to Keep Families in Their Homes — from nlada.org

Excerpts:

NLADA welcomes the Associate Attorney General’s letter urging Chief Justices and State Court Administrators to consider eviction diversion strategies that can help families avoid the disruption and damage that evictions cause. Together with the Centers for Disease Control and Prevention’s decision to extend the eviction moratorium and the additional steps announced by the White House, this letter is an important tool in a package of federal resources to help our country and courts navigate this crisis.

The housing crisis, exacerbated by the pandemic, disproportionally affects women and communities of color. More than 80 percent of people facing eviction are Black. We know that when families have access to counsel provided by civil legal aid attorneys, approximately 80 percent of families stay in their homes, and that rates of homelessness have had a direct correlation to coronavirus rates. These numbers represent men, women and children who are integral to communities across the country, and we are pleased by today’s actions to help families and communities struggling in our nation.

 

 

 

AI cannot be recognised as an inventor, US rules — from bbc.com

An artificial intelligence system has been refused the right to two patents in the US, after a ruling only “natural persons” could be inventors.

 

Watch a Drone Swarm Fly Through a Fake Forest Without Crashing — from wired.com by Max Levy
Each copter doesn’t just track where the others are. It constantly predicts where they’ll go.

From DSC:
I’m not too crazy about this drone swarm…in fact, the more I thought about it, the more alarming and nerve-racking I found it. It doesn’t take much imagination to think what the militaries of the world are already doing with this kind of thing. And our son is now in the Marines. So forgive me if I’m a bit biased here…but I can’t help but wonder what the role/impact of foot soldiers will be in the next war. I hope we don’t have one.

Anyway, just because we can…

 

The Future of Social Media: Re-Humanisation and Regulation — by Gerd Leonhard

How could social media become ‘human’ again? How can we stop the disinformation, dehumanisation and dataism that has resulted from social media’s algorithmic obsessions? I foresee that the EXTERNALITIES, i.e., the consequences of unmitigated growth of exponential digital technologies, will become just as big as the consequences of climate change. In fact, today, the social media industry already has quite a few parallels to the oil, gas and coal business: while private companies make huge profits from extracting the ‘oil’ (i.e. user data), the external damage is left to society and governments to fix. This needs to change! In this keynote I make some precise suggestions as to how that could happen.

Some snapshots/excerpts:

The future of social media -- a video by Gerd Leonhard in the summer of 2021
From DSC:
Gerd brings up some solid points here. His presentation and perspectives are not only worth checking out, but they’re worth some time for us to seriously reflect on what he’s saying.

What kind of future do we want?

And for you professors, teachers, instructional designers, trainers, and presenters out there, check out *how* he delivers the content. It’s well done and very engaging.


 

From DSC:
Again, as you can see from the items below…there are various pluses and minuses regarding the use of Artificial Intelligence (AI). Some of the items below are neither positive nor negative, but I found them interesting nonetheless.


How Amazon is tackling the A.I. talent crunch — from fortune.com by Jonathan Vanian

Excerpt:

“One way Amazon has adapted to the tight labor market is to require potential new programming hires to take classes in machine learning, said Bratin Saha, a vice president and general manager of machine learning services at Amazon. The company’s executives believe they can teach these developers machine learning basics over a few weeks so that they can work on more cutting-edge projects after they’re hired.”

“These are not formal college courses, and Saha said the recruits aren’t graded like they would be in school. Instead, the courses are intended to give new developers a foundation in machine learning and statistics so they can understand the theoretical underpinnings.”

Machine Learning Can Predict Rapid Kidney Function Decline — from sicklecellanemianews.com by Steve Bryson PhD; with thanks to Sam DeBrule for this resource

Excerpt:

Machine learning tools can identify sickle cell disease (SCD) patients at high risk of progressive kidney disease as early as six months in advance, a study shows. The study, “Using machine learning to predict rapid decline of kidney function in sickle cell anemia,” was published in the journal eJHaem.

NYPD’s Sprawling Facial Recognition System Now Has More Than 15,000 Cameras — from vice.com by Todd Feathers; with thanks to Sam DeBrule for this resource
The massive camera network is concentrated in predominantly Black and brown neighborhoods, according to a new crowdsourced report.

Excerpt:

The New York City Police Department has built a sprawling facial recognition network that may include more than 15,000 surveillance cameras in Manhattan, Brooklyn, and the Bronx, according to a massive crowdsourced investigation by Amnesty International.

“This sprawling network of cameras can be used by police for invasive facial recognition and risk turning New York into an Orwellian surveillance city,” Matt Mahmoudi, an artificial intelligence and human rights researcher at Amnesty, wrote in the group’s report. “You are never anonymous. Whether you’re attending a protest, walking to a particular neighbourhood, or even just grocery shopping—your face can be tracked by facial recognition technology using imagery from thousands of camera points across New York.”

Related to that article is this one:

The All-Seeing Eyes of New York’s 15,000 Surveillance Cameras — from wired.com by Sidney Fussell
Video from the cameras is often used in facial-recognition searches. A report finds they are most common in neighborhoods with large nonwhite populations.

Excerpt:

A NEW VIDEO from human rights organization Amnesty International maps the locations of more than 15,000 cameras used by the New York Police Department, both for routine surveillance and in facial-recognition searches. A 3D model shows the 200-meter range of a camera, part of a sweeping dragnet capturing the unwitting movements of nearly half of the city’s residents, putting them at risk for misidentification. The group says it is the first to map the locations of that many cameras in the city.

Don’t End Up on This Artificial Intelligence Hall of Shame — from wired.com by Tom Simonite
A list of incidents that caused, or nearly caused, harm aims to prompt developers to think more carefully about the tech they create.

Excerpt:

The AI Incident Database is hosted by Partnership on AI, a nonprofit founded by large tech companies to research the downsides of the technology. The roll of dishonor was started by Sean McGregor, who works as a machine learning engineer at voice processor startup Syntiant. He says it’s needed because AI allows machines to intervene more directly in people’s lives, but the culture of software engineering does not encourage safety.

 

June 2021: Rethinking Lawyer Regulation — by Jim Sandman

Excerpts:

The National Center for State Courts estimates that in 76 percent of civil cases in state courts, at least one party is unrepresented. That figure does not include family law cases. If it did, the percentage would be even higher. It is common for more than 90 percent of tenants facing eviction in the United States to be without counsel, even though more than 90 percent of landlords have lawyers. It is common for more than 80 percent of domestic violence victims seeking protection orders to be without counsel.

The model on which our adversary system of justice is based – with each party represented by counsel who present evidence and arguments on behalf of their clients – is a fiction in the majority of civil cases in the United States today. Unrepresented litigants fend for themselves in tens of millions of cases every year involving the most basic of human needs: shelter (evictions and foreclosures), family stability (child custody, child support, adoptions, and guardianships), personal safety (protection orders against abusers), and economic subsistence (access to unemployment insurance, health care, food, and other benefits programs). The lack of access to counsel affects not just low-income people, but the middle class and small businesses, too.

Our nation is defaulting on its foundational promise of justice for all. We need solutions commensurate with the magnitude and the urgency of the problem, and those solutions must include regulatory reform.

 

Ed Department Sets Expectations For Special Education As Schools Reopen — from disabilityscoop.com by Michelle Diament

Excerpt:

With schools across the nation increasingly eyeing a return to normalcy, federal education officials are further clarifying what that should mean for students with disabilities.

In a 23-page question-and-answer document, the U.S. Department of Education is laying out how the Individuals with Disabilities Education Act, Section 504 of the Rehabilitation Act and other civil rights laws apply as schools return to in-person learning.

The guidance addresses schools’ responsibilities to students with disabilities in remote, hybrid and in-person situations, touching on everything from the right to a free appropriate public education to handling children who are unable to wear masks or maintain social distance.

 

A lack of devices but a heroic effort by teachers — from global-edtech.com by Dorothy Lepkowska
EDUCATE webinar hears how countries coped with the Covid school shutdown

Excerpts:

As in many countries, including the UK, schools and students in Portugal were hindered by a lack of devices, Hugo Fonseca, an EdTech innovator, told the webinar. He said teachers were “superheroes” and did what they could to connect with pupils, including distributing work on bicycles.

He said: “In Portugal there is a lack of devices and a lack of teacher training. The profession is among the oldest in Europe, and half of teachers are more than 50 years old. There is no government support to develop them, and if you leave teachers without learning new skills for 20 years or more then it is hard for them to adapt. We need more young people to go into teaching.”

In Uruguay, more coherence was needed between the distribution of technology and teacher training.

Meanwhile, in Brazil, there was a roll-out of technology but little training for teachers.

In Zambia, however, the pandemic offered an opportunity for project-based learning to address some of the problems faced by communities.

She said the pandemic had shone a light on the heroic work of teachers and there was a “shift in perceptions about teachers, and the importance of face-to-face teaching as a valued part of civil society”.

Also see:

 

A reasonable robot in the eyes of the law — from innovationaus.com by Stuart Corner

Excerpts:

“But what happens when an autonomous vehicle kills someone? A robot is not subject to the law. So is the car manufacturer liable, or the developer of the software? And how do you pinpoint the cause of such an accident?”

You can’t tax a robot worker
“If my employer, the University of Surrey could get a robot to automate my job, which they will someday, they would save money doing that, even if we were both equally good, because they have to make National Insurance contributions [a UK earnings tax that funds state pensions and other benefits] for my work and a host of other reasons, but the machine is taxed more favourably.

 
© 2021 | Daniel Christian