Legal Tech’s Predictions for Legal Technology Innovation in 2022 — from legalreader.com by Smith Johnes

Excerpts:

  • By 2024, non-lawyer workers will replace 20% of generalist lawyers in legal departments.
  • Legal departments will have automated half of legal work connected to big corporate transactions by 2024.
  • Legal departments will triple their investment in legal technology by 2025.

From DSC:
What are the ramifications for law schools and legal departments if these legal tech-related predictions come true?

 

Justice, Equity, And Fairness: Exploring The Tense Relationship Between Artificial Intelligence And The Law With Joilson Melo — from forbes.com by Annie Brown

Excerpt:

The law and legal practitioners stand to gain a lot from a proper adoption of AI into the legal system. Legal research is one area where AI has already begun to help. AI can streamline the thousands of results an internet or directory search would otherwise provide, offering a smaller, digestible handful of relevant authorities for legal research. This is already proving helpful, and with more targeted machine learning it would only get better.

The possible benefits go on: automated drafts of documents and contracts, document review, and contract analysis are among those considered imminent.

Many have even considered the possibilities of AI helping with more administrative functions, like the appointment of officers and staff, the administration of staff, and making citizens aware of their legal rights.

A future without AI seems bleak and laborious for most industries, including the legal industry, and while we must march on, we must be cautious about our strategies for adoption. This point is better put in the words of Joilson Melo: “The possibilities are endless, but the burden of care is very heavy[…] we must act and evolve with [caution].”
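
From DSC:
To make the legal-research point above a bit more concrete, here is a minimal sketch of the kind of relevance ranking such a tool might perform, assuming a plain TF-IDF and cosine-similarity approach and using made-up case summaries (illustrative only, not any vendor's actual method):

    # Toy illustration of AI-assisted legal research: rank a small set of
    # made-up case summaries against a research question and keep only the
    # most relevant few. Real research tools use far richer models than this.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical case summaries (placeholders, not real authorities).
    cases = {
        "Case A": "Landlord failed to give proper notice before starting eviction proceedings.",
        "Case B": "Patent application rejected because the named inventor was an AI system.",
        "Case C": "Tenant withheld rent due to uninhabitable housing conditions.",
        "Case D": "Employment discrimination claim involving an automated hiring tool.",
    }

    query = "notice requirements a landlord must meet before an eviction"

    # Score each case against the query with TF-IDF vectors and cosine similarity.
    texts = list(cases.values())
    vectors = TfidfVectorizer(stop_words="english").fit_transform([query] + texts)
    scores = cosine_similarity(vectors[0:1], vectors[1:]).flatten()

    # Keep a small, digestible handful of the highest-scoring results.
    ranked = sorted(zip(cases.keys(), scores), key=lambda pair: pair[1], reverse=True)
    for name, score in ranked[:2]:
        print(f"{name}: {score:.2f}")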

 

Defining the skills citizens will need in the future world of work — from McKinsey & Company; with thanks to Ryan Craig for this resource

Excerpts:

Our findings help define the particular skills citizens are likely to require in the future world of work and suggest how proficiency in them can influence work-related outcomes, namely employment, income, and job satisfaction. This, in turn, suggests three actions governments may wish to take.

  1. Reform education systems
  2. Reform adult-training systems
  3. Ensure affordability of lifelong education

Establish an AI aggregator of training programs to attract adult learners and encourage lifelong learning. AI algorithms could guide users on whether they need to upskill or reskill for a new profession and shortlist relevant training programs. 

Foundational skills that will help citizens thrive in the future of work


From DSC:
No one will have all 56 skills that McKinsey recommends here. So (HR) managers, please don’t load up your job postings with every single skill on the list. The search for purple unicorns can get tiring, old, and discouraging for those who are looking for work.

That said, much of what McKinsey’s research/data shows — and what their recommendations are — resonates with me. And that’s why I keep adding to the developments out at:

Learning from the living class room

A powerful, global, next-generation learning platform — meant to help people reinvent themselves quickly, safely, cost-effectively, conveniently, & consistently!!!
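
Along these lines, the “AI aggregator” idea in the McKinsey piece above boils down to comparing a learner’s current skills with those a target role requires and then shortlisting programs that close the gap. Here is a minimal sketch of that matching step, using invented skills and program names (nothing below comes from McKinsey’s data or any real catalog):

    # Toy sketch of the "AI aggregator" idea: compare a learner's current skills
    # with those a target role requires and shortlist programs that cover the gap.
    # The skills and program names are invented placeholders, not McKinsey's taxonomy.
    target_role_skills = {"data analysis", "statistics", "project management", "communication"}
    current_skills = {"communication", "spreadsheets"}

    programs = {
        "Intro to Data Analysis": {"data analysis", "spreadsheets"},
        "Statistics Fundamentals": {"statistics"},
        "Leading Projects": {"project management", "communication"},
        "Creative Writing": {"writing"},
    }

    # The gap is whatever the target role needs that the learner does not yet have.
    gap = target_role_skills - current_skills
    print("Skills to develop:", sorted(gap))

    # Shortlist programs by how many of the missing skills each one covers.
    shortlist = sorted(
        ((name, len(skills & gap)) for name, skills in programs.items() if skills & gap),
        key=lambda item: item[1],
        reverse=True,
    )
    for name, covered in shortlist:
        print(f"{name}: covers {covered} needed skill(s)")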

 

The Fight to Define When AI Is ‘High Risk’ — from wired.com by Khari Johnson
Everyone from tech companies to churches wants a say in how the EU regulates AI that could harm people.

Excerpt:

The AI Act is one of the first major policy initiatives worldwide focused on protecting people from harmful AI. If enacted, it will classify AI systems according to risk, more strictly regulate AI that’s deemed high risk to humans, and ban some forms of AI entirely, including real-time facial recognition in some instances. In the meantime, corporations and interest groups are publicly lobbying lawmakers to amend the proposal according to their interests.

 

From DSC:
Yet another example of the need for the legislative and legal realms to try to catch up here.

The legal realm needs to try and catch up with the exponential pace of technological change

 

Many Americans aren’t aware they’re being tracked with facial recognition while shopping — from techradar.com by Anthony Spadafora
You’re not just on camera, you’re also being tracked

Excerpt:

Despite consumer opposition to facial recognition, the technology is currently being used in retail stores throughout the US according to new research from Piplsay.

While San Francisco banned the police from using facial recognition back in 2019 and the EU called for a five-year ban on the technology last year, several major retailers in the US, including Lowe’s, Albertsons, and Macy’s, have been using it for both fraud and theft detection.

From DSC:
I’m not sure how prevalent this practice is…and that’s precisely the point. We don’t know what all of those cameras are actually doing in our stores, gas stations, supermarkets, etc. I put this in the categories of policy, law schools, legal, government, and others, as the legislative and legal realms need to scramble to catch up to this Wild Wild West.

Along these lines, I was watching a portion of 60 Minutes last night, where they were doing a piece on autonomous trucks (reportedly set to hit the roads without a person sometime later this year). When asked about oversight, there was some…but not much.

Readers of this blog will know that I have often wondered…”How does society weigh in on these things?”

Along these same lines, also see:

  • The NYPD Had a Secret Fund for Surveillance Tools — from wired.com by Sidney Fussell
    Documents reveal that police bought facial-recognition software, vans equipped with x-ray machines, and “stingray” cell site simulators—with no public oversight.
 

‘Best of Both Worlds’ — from insidehighered.com by Alexis Gravely
The expansion will allow more people to participate in prison education programs while the department prepares for across-the-board Pell Grant access for incarcerated students.

Excerpt:

The Second Chance Pell Experimental Sites Initiative will be expanded for the 2022-23 award year to allow another 69 colleges and universities to participate, paving the way for even more incarcerated individuals to gain access to higher education.

A maximum of 200 two- and four-year colleges will be able to offer prison education programs with the support of the Pell Grant, up from the 131 institutions currently participating. The department is also planning to broaden the geographic scope of Second Chance Pell, with the goal of having programs in most or all 50 states.

 

NLADA Welcomes DOJ’s Advice to Chief Justices to Ensure Fair Process to Keep Families in Their Homes — from nlada.org

Excerpts:

NLADA welcomes the Associate Attorney General’s letter urging Chief Justices and State Court Administrators to consider eviction diversion strategies that can help families avoid the disruption and damage that evictions cause. Together with the Centers for Disease Control and Prevention’s decision to extend the eviction moratorium and the additional steps announced by the White House, this letter is an important tool in a package of federal resources to help our country and courts navigate this crisis.

The housing crisis, exacerbated by the pandemic, disproportionally affects women and communities of color. More than 80 percent of people facing eviction are Black. We know that when families have access to counsel provided by civil legal aid attorneys, approximately 80 percent of families stay in their homes, and that rates of homelessness have had a direct correlation to coronavirus rates. These numbers represent men, women and children who are integral to communities across the country, and we are pleased by today’s actions to help families and communities struggling in our nation.

 

AI cannot be recognised as an inventor, US rules — from bbc.com

An artificial intelligence system has been refused the right to two patents in the US, after a ruling that only “natural persons” could be inventors.

 

Watch a Drone Swarm Fly Through a Fake Forest Without Crashing — from wired.com by Max Levy
Each copter doesn’t just track where the others are. It constantly predicts where they’ll go.

From DSC:
I’m not too crazy about this drone swarm…in fact, the more I think about it, the more alarming and nerve-racking I find it. It doesn’t take much imagination to think about what the militaries of the world are already doing with this kind of thing. And our son is now in the Marines. So forgive me if I’m a bit biased here…but I can’t help but wonder what the role/impact of foot soldiers will be in the next war. I hope we don’t have one.

Anyway, just because we can…
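
That said, for anyone curious what “constantly predicts where they’ll go” involves, here is a minimal sketch, assuming a simple constant-velocity model and made-up numbers; the actual system described in the article is far more sophisticated:

    # Toy illustration of "predicting where the others will go": each drone
    # extrapolates its neighbors' positions a short horizon ahead using a
    # constant-velocity model and flags predicted near-collisions. Positions,
    # velocities, and thresholds below are made up for illustration.
    import numpy as np

    def predict_positions(positions, velocities, horizon=0.5):
        """Extrapolate positions `horizon` seconds ahead, assuming constant velocity."""
        return positions + velocities * horizon

    def predicted_conflicts(predicted, min_separation=1.5):
        """Return index pairs of drones predicted to come within min_separation meters."""
        pairs = []
        for i in range(len(predicted)):
            for j in range(i + 1, len(predicted)):
                if np.linalg.norm(predicted[i] - predicted[j]) < min_separation:
                    pairs.append((i, j))
        return pairs

    # Three drones: current 3D positions (meters) and velocities (meters/second).
    positions = np.array([[0.0, 0.0, 1.0], [2.0, 0.0, 1.0], [0.0, 3.0, 1.0]])
    velocities = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0], [0.0, -1.0, 0.0]])

    predicted = predict_positions(positions, velocities, horizon=0.5)
    print("Predicted conflicts:", predicted_conflicts(predicted))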

 

The Future of Social Media: Re-Humanisation and Regulation — by Gerd Leonhard

How could social media become ‘human’ again? How can we stop the disinformation, dehumanisation and dataism that have resulted from social media’s algorithmic obsessions? I foresee that the EXTERNALITIES, i.e. the consequences of the unmitigated growth of exponential digital technologies, will become just as big as the consequences of climate change. In fact, today, the social media industry already has quite a few parallels to the oil, gas and coal business: while private companies make huge profits from extracting the ‘oil’ (i.e. user data), the external damage is left to society and governments to fix. This needs to change! In this keynote I make some precise suggestions as to how that could happen.

Some snapshots/excerpts:

The future of social media -- a video by Gerd Leonhard in the summer of 2021

From DSC:
Gerd brings up some solid points here. His presentation and perspectives are not only worth checking out, but also worth taking some time to seriously reflect on.

What kind of future do we want?

And for you professors, teachers, instructional designers, trainers, and presenters out there, check out *how* he delivers the content. It’s well done and very engaging.

 

From DSC:
Again, as you can see from the items below…there are various pluses and minuses regarding the use of Artificial Intelligence (AI). Some of the items below are neither positive nor negative, but I found them interesting nonetheless.


How Amazon is tackling the A.I. talent crunch — from fortune.com by Jonathan Vanian

Excerpt:

“One way Amazon has adapted to the tight labor market is to require potential new programming hires to take classes in machine learning, said Bratin Saha, a vice president and general manager of machine learning services at Amazon. The company’s executives believe they can teach these developers machine learning basics over a few weeks so that they can work on more cutting-edge projects after they’re hired.”

“These are not formal college courses, and Saha said the recruits aren’t graded like they would be in school. Instead, the courses are intended to give new developers a foundation in machine learning and statistics so they can understand the theoretical underpinnings.”

Machine Learning Can Predict Rapid Kidney Function Decline — from sicklecellanemianews.com by Steve Bryson PhD; with thanks to Sam DeBrule for this resource

Excerpt:

Machine learning tools can identify sickle cell disease (SCD) patients at high risk of progressive kidney disease as early as six months in advance, a study shows. The study, “Using machine learning to predict rapid decline of kidney function in sickle cell anemia,” was published in the journal eJHaem.

NYPD’s Sprawling Facial Recognition System Now Has More Than 15,000 Cameras — from vice.com by Todd Feathers; with thanks to Sam DeBrule for this resource
The massive camera network is concentrated in predominantly Black and brown neighborhoods, according to a new crowdsourced report.

Excerpt:

The New York City Police Department has built a sprawling facial recognition network that may include more than 15,000 surveillance cameras in Manhattan, Brooklyn, and the Bronx, according to a massive crowdsourced investigation by Amnesty International.

“This sprawling network of cameras can be used by police for invasive facial recognition and risk turning New York into an Orwellian surveillance city,” Matt Mahmoudi, an artificial intelligence and human rights researcher at Amnesty, wrote in the group’s report. “You are never anonymous. Whether you’re attending a protest, walking to a particular neighbourhood, or even just grocery shopping—your face can be tracked by facial recognition technology using imagery from thousands of camera points across New York.”

Related to that article is this one:

The All-Seeing Eyes of New York’s 15,000 Surveillance Cameras — from wired.com by Sidney Fussell
Video from the cameras is often used in facial-recognition searches. A report finds they are most common in neighborhoods with large nonwhite populations.

Excerpt:

A new video from human rights organization Amnesty International maps the locations of more than 15,000 cameras used by the New York Police Department, both for routine surveillance and in facial-recognition searches. A 3D model shows the 200-meter range of a camera, part of a sweeping dragnet capturing the unwitting movements of nearly half of the city’s residents, putting them at risk for misidentification. The group says it is the first to map the locations of that many cameras in the city.

Don’t End Up on This Artificial Intelligence Hall of Shame — from wired.com by Tom Simonite
A list of incidents that caused, or nearly caused, harm aims to prompt developers to think more carefully about the tech they create.

Excerpt:

The AI Incident Database is hosted by Partnership on AI, a nonprofit founded by large tech companies to research the downsides of the technology. The roll of dishonor was started by Sean McGregor, who works as a machine learning engineer at voice processor startup Syntiant. He says it’s needed because AI allows machines to intervene more directly in people’s lives, but the culture of software engineering does not encourage safety.

 

June 2021: Rethinking Lawyer Regulation — by Jim Sandman

Excerpts:

The National Center for State Courts estimates that in 76 percent of civil cases in state courts, at least one party is unrepresented. That figure does not include family law cases. If it did, the percentage would be even higher. It is common for more than 90 percent of tenants facing eviction in the United States to be without counsel, even though more than 90 percent of landlords have lawyers. It is common for more than 80 percent of domestic violence victims seeking protection orders to be without counsel.

The model on which our adversary system of justice is based – with each party represented by counsel who present evidence and arguments on behalf of their clients – is a fiction in the majority of civil cases in the United States today. Unrepresented litigants fend for themselves in tens of millions of cases every year involving the most basic of human needs: shelter (evictions and foreclosures), family stability (child custody, child support, adoptions, and guardianships), personal safety (protection orders against abusers), and economic subsistence (access to unemployment insurance, health care, food, and other benefits programs). The lack of access to counsel affects not just low-income people, but the middle class and small businesses, too.

Our nation is defaulting on its foundational promise of justice for all. We need solutions commensurate with the magnitude and the urgency of the problem, and those solutions must include regulatory reform.

 

Ed Department Sets Expectations For Special Education As Schools Reopen — from disabilityscoop.com by Michelle Diament

Excerpt:

With schools across the nation increasingly eyeing a return to normalcy, federal education officials are further clarifying what that should mean for students with disabilities.

In a 23-page question-and-answer document, the U.S. Department of Education is laying out how the Individuals with Disabilities Education Act, Section 504 of the Rehabilitation Act and other civil rights laws apply as schools return to in-person learning.

The guidance addresses schools’ responsibilities to students with disabilities in remote, hybrid and in-person situations, touching on everything from the right to a free appropriate public education to handling children who are unable to wear masks or maintain social distance.

 