“Access to Justice” – the first open access issue of Dædalus — from amacad.org

Excerpt:

“Access to Justice” – the first open access issue of Dædalus – features twenty-four essays that examine the national crisis in civil legal services facing poor and low-income Americans: from the challenges of providing quality legal assistance to more people, to the social and economic costs of an often unresponsive legal system, to the opportunities for improvement offered by new technologies, professional innovations, and fresh ways of thinking about the crisis.

 

How surveillance cameras could be weaponized with A.I. — from nytimes.com by Niraj Chokshi
Advances in artificial intelligence could supercharge surveillance cameras, allowing footage to be constantly monitored and instantly analyzed, the A.C.L.U. warned in a new report.

Excerpt:

In the report, the organization imagined a handful of dystopian uses for the technology. In one, a politician requests footage of his enemies kissing in public, along with the identities of all involved. In another, a life insurance company offers rates based on how fast people run while exercising. And in another, a sheriff receives a daily list of people who appeared to be intoxicated in public, based on changes to their gait, speech or other patterns.

Analysts have valued the market for video analytics at as much as $3 billion, with the expectation that it will grow exponentially in the years to come. The important players include smaller businesses as well as household names such as Amazon, Cisco, Honeywell, IBM and Microsoft.

 

From DSC:
We can no longer let a handful of companies tell the rest of us how our society will be formed and shaped, or how it will act. 

For example, Amazon should NOT be able to just send its robots/drones to deliver packages — that type of decision is NOT up to them. I suspect that Amazon cares more about earning larger profits and pleasing Wall Street than about our society at large. If Amazon is able to introduce its robots all over the place, what’s to keep any and every company from introducing its own army of robots or drones? If we allow this to occur, it won’t be long before our streets, sidewalks, and airspace are filled with noise and clutter.

So…a question for representatives, senators, legislators, mayors, judges, lawyers, etc.:

  • What should we be building in order to better allow citizens to weigh in on emerging technologies and whether any given emerging technology — or a specific product/service — should be rolled out…or not? 

 

 

INSIGHT: Ten ways machine learning will transform the practice of law — from news.bloomberglaw.com by Caroline Sweeney
Law firms are increasingly using machine learning and artificial intelligence, which have become standard in document review. Dorsey & Whitney’s Caroline Sweeney says any firm that wants to stay competitive should get on board now, and she gives examples of uses and best practices.

 

 

This state is expected to become the first to collect prosecutor data, with breakdowns by race — from abajournal.com by Debra Cassens Weiss

Excerpt:

Connecticut is expected to become the first state to collect statewide criminal case data from prosecutors broken down by the defendants’ race, sex, ethnicity, age and ZIP code.

The bill requires the state to collect statistics on arrests, diversionary programs, case dispositions, plea agreements, cases going to trial, court fines and fees, and restitution orders.

[Gov. Ned] Lamont said the bill will provide the public with greater insight into prosecutors’ decisions.

 

 

8 industrial IoT trends of 2019 that cannot be ignored — from datafloq.com

Excerpt:

From manufacturing to the retail sector, the infinite applications of the industrial internet of things are disrupting business processes, thereby improving operational efficiency and business competitiveness. The trend of employing IoT-powered systems for supply chain management, smart monitoring, remote diagnosis, production integration, inventory management, and predictive maintenance is catching up as companies take bold steps to address a myriad of business problems.

No wonder the global technology spend on IoT is expected to reach USD 1.2 trillion by 2022. The growth of this segment will be driven both by firms deploying IIoT solutions and by the giant tech organizations that are developing them.

To help you stay ahead of the curve, we have listed a few trends that will dominate the industrial IoT sphere.

 

5. 5G Will Drive Real-Time IIoT Applications
5G deployments are digitizing the industrial domain and changing the way enterprises manage their business operations. Industries such as transportation, manufacturing, healthcare, energy and utilities, agriculture, retail, media, and financial services will benefit from the low latency and high data-transfer speeds of 5G mobile networks.

 

State Attempts to Nix Public School’s Facial Recognition Plans — from futurism.com by Kristin Houser
But it might not have the authority to actually stop an upcoming trial.

Excerpt (emphasis DSC):

Chaos Reigns
New York’s Lockport City School District (CSD) was all set to become the first public school district in the U.S. to test facial recognition on its students and staff. But just two days after the school district’s superintendent announced the project’s June 3 start date, the New York State Education Department (NYSED) attempted to put a stop to the trial, citing concerns for students’ privacy. Still, it’s not clear whether the department has the authority to actually put the project on hold — *****the latest sign that the U.S. is in desperate need of clear-cut facial recognition legislation.*****

 

10 things we should all demand from Big Tech right now — from vox.com by Sigal Samuel
We need an algorithmic bill of rights. AI experts helped us write one.

Excerpts:

  1. Transparency: We have the right to know when an algorithm is making a decision about us, which factors are being considered by the algorithm, and how those factors are being weighted.
  2. Explanation: We have the right to be given explanations about how algorithms affect us in a specific situation, and these explanations should be clear enough that the average person will be able to understand them.
  3. Consent: We have the right to give or refuse consent for any AI application that has a material impact on our lives or uses sensitive data, such as biometric data.
  4. Freedom from bias: We have the right to evidence showing that algorithms have been tested for bias related to race, gender, and other protected characteristics — before they’re rolled out. The algorithms must meet standards of fairness and nondiscrimination and ensure just outcomes. (Inserted comment from DSC: Is this even possible? I hope so, but I have my doubts, especially given the enormous lack of diversity within the large tech companies.) (A minimal sketch of what one such bias test could look like appears just after this list.)
  5. Feedback mechanism: We have the right to exert some degree of control over the way algorithms work.
  6. Portability: We have the right to easily transfer all our data from one provider to another.
  7. Redress: We have the right to seek redress if we believe an algorithmic system has unfairly penalized or harmed us.
  8. Algorithmic literacy: We have the right to free educational resources about algorithmic systems.
  9. Independent oversight: We have the right to expect that an independent oversight body will be appointed to conduct retrospective reviews of algorithmic systems gone wrong. The results of these investigations should be made public.
  10. Federal and global governance: We have the right to robust federal and global governance structures with human rights at their center. Algorithmic systems don’t stop at national borders, and they are increasingly used to decide who gets to cross borders, making international governance crucial.
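
As one concrete illustration of what “tested for bias” can mean in practice, the minimal sketch below (in Python) computes a demographic-parity gap: the difference in favorable-decision rates between groups. The decisions, group labels, and any threshold for concern here are entirely hypothetical, and real audits use richer measures (equalized odds, calibration, and so on), so treat this as a sketch of the idea rather than a recipe.

```python
from collections import defaultdict

def demographic_parity_gap(decisions, groups):
    """Largest difference in favorable-decision rates across groups.

    decisions: list of 0/1 outcomes from an algorithm (1 = favorable).
    groups:    list of group labels (a protected characteristic),
               aligned index-by-index with decisions.
    """
    totals = defaultdict(int)
    favorable = defaultdict(int)
    for decision, group in zip(decisions, groups):
        totals[group] += 1
        favorable[group] += decision
    rates = {g: favorable[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical example: 1 = loan approved, two groups "A" and "B".
decisions = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

gap, rates = demographic_parity_gap(decisions, groups)
print(rates)                      # {'A': 0.6, 'B': 0.4}
print(f"parity gap = {gap:.2f}")  # 0.20 -- a large gap flags the system for review
```

Whether a given gap is acceptable is a policy question rather than an engineering one, which is exactly why the oversight and governance items above matter.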

 

This raises the question: Who should be tasked with enforcing these norms? Government regulators? The tech companies themselves?

 

 

Law’s Looming Skills Crisis — from forbes.com by Mark Cohen

Excerpts (emphasis DSC):

The good news is that legal professionals have more career paths, lifestyle options, and geographic nimbleness than ever before. The bad news is that relatively few in the industry are prepared to fill the new roles.

A growing list of clients demand transformed legal services. What does that mean? Legal professionals must meld law, technology, and business and apply principles of digital transformation to the legal function. They must be proactive, data-driven, client-centric, and collaborative*. They must appreciate that clients want solutions to business challenges, not legal tomes.

Legal culture responds to the warp-speed change — when it does at all — with buzzwords, denial, and self-congratulation. The packed calendar of industry award dinners celebrating pioneers, innovation, diversity, and other self-declared advances belies data exposing law’s dreadful scorecard on diversity, gender pay equality, advancement opportunities for non-white males, and other legal guild cultural holdovers. Then there’s the disconnect between the lawyer and client view of industry performance. Law’s net promoter score lags other professions and almost all industries. Legal culture needs a jolt; client-centricity, the ability to respond rapidly and effectively to new risk factors and challenges, data-driven judgments, and agile workforces are among law’s transformational musts.

Legal culture is slow to embrace data, technology, new delivery models, multidisciplinary practice, regulatory reform, collaboration, diversity, gender pay equality, the distinction between the practice of law and the delivery of legal services, client-centricity, and digital transformation. Law is rooted in precedent; it looks to the past to prepare for the future. That is no longer the world we live in.

 

From DSC:
*And I would add the ability to look into the future, to develop some potential scenarios, and to craft responses to those potential scenarios…in other words, to practice some of the methods used in futurism.

I would encourage my colleagues within the legal education world — which includes the American Bar Association (ABA) — but also judges, lawyers, legislators, attorneys general, public defenders, and many others, to realize that we need to massively pick up the pace if we are to help tame the wild west of emerging technologies these days!


[ABA] Council enacts new bar passage standard for law schools — from americanbar.org

Excerpt (emphasis DSC):

On May 17, the Council of the ABA Section of Legal Education and Admissions to the Bar approved a major change in the bar passage standard, known as 316, that would require 75 percent of a law school’s graduates who sit for the bar to pass it within two years. The change takes effect immediately, although schools falling short of the standard would have at least two years to come into compliance.

Twice since 2017, the ABA policy-making House of Delegates has voted against the change, as some delegates feared it would have an adverse effect on law schools with significant minority enrollment. But under ABA rules and procedures, the Council, which is recognized by the U.S. Department of Education as the national accreditor of law schools, has the final say on accreditation matters.

 

Also see:

  • ABA’s Tougher Bar Pass Rule for Law Schools Applauded, Derided — from law.com by Karen Sloan
    The American Bar Association’s new standard could increase pressure on jurisdictions like California with high cut scores to lower that threshold. It could also add momentum to the burgeoning movement to overhaul the bar exam itself.

“Either the ABA Council simply ignored the clear empirical evidence that the new bar standard will decrease diversity in the bar, or it passed the new standard with the hope that states, like California, that have unreasonably high bar cut scores will lower those metrics in order to ameliorate the council’s action,” Patton said.

 

At that January meeting, former ABA President Paulette Brown, the first African-American woman to hold that position, called the proposed change “draconian.”

“I know and understand fully that the [ABA] council has the right to ignore what we say,” she said. “That does not absolve us of our responsibility to give them a very clear and strong message that we will not idly stand by while they decimate the diversity in the legal profession.”

 

 

San Francisco becomes first city to bar police from using facial recognition — from cnet.com by Laura Hautala
It won’t be the last city to consider a similar law.


Excerpt:

The city of San Francisco approved an ordinance on Tuesday [5/14/19] barring the police department and other city agencies from using facial recognition technology on residents. It’s the first such ban of the technology in the country.

The ordinance, which passed by a vote of 8 to 1, also creates a process for the police department to disclose what surveillance technology they use, such as license plate readers and cell-site simulators that can track residents’ movements over time. But it singles out facial recognition as too harmful to residents’ civil liberties to even consider using.

“Facial surveillance technology is a huge legal and civil liberties risk now due to its significant error rate, and it will be worse when it becomes perfectly accurate mass surveillance tracking us as we move about our daily lives,” said Brian Hofer, the executive director of privacy advocacy group Secure Justice.

For example, Microsoft asked the federal government in July to regulate facial recognition technology before it becomes more widespread, and said it declined to sell the technology to law enforcement. As it is, the technology is on track to become pervasive in airports and shopping centers, and other tech companies, such as Amazon, are selling it to police departments.

 

Also see:

 

People, Power and Technology: The Tech Workers’ View — from doteveryone.org.uk

Excerpt:

People, Power and Technology: The Tech Workers’ View is the first in-depth research into the attitudes of the people who design and build digital technologies in the UK. It shows that workers are calling for an end to the era of moving fast and breaking things.

Significant numbers of highly skilled people are voting with their feet and leaving jobs they feel could have negative consequences for people and society. This is heightening the UK’s tech talent crisis and running up employers’ recruitment and retention bills. Organisations and teams that can understand and meet their teams’ demands to work responsibly will have a new competitive advantage.

While Silicon Valley CEOs have tried to reverse the “techlash” by showing their responsible credentials in the media, this research shows that workers:

    • need guidance and skills to help navigate new dilemmas
    • have an appetite for more responsible leadership
    • want clear government regulation so they can innovate with awareness

Also see:

  • U.K. Tech Staff Quit Over Work On ‘Harmful’ AI Projects — from forbes.com by Sam Shead
    Excerpt:
    An alarming number of technology workers operating in the rapidly advancing field of artificial intelligence say they are concerned about the products they’re building. Some 59% of U.K. tech workers focusing on AI have experience of working on products that they felt might be harmful for society, according to a report published on Monday by Doteveryone, the think tank set up by lastminute.com cofounder and Twitter board member Martha Lane Fox.


Watch Salvador Dalí Return to Life Through AI — from interestingengineering.com
The Dalí Museum has created a deepfake of surrealist artist Salvador Dalí that brings him back to life.

Excerpt:

The Dalí Museum has created a deepfake of surrealist artist Salvador Dalí that brings him back to life. This life-size deepfake is set up to have interactive discussions with visitors.

The deepfake can produce 45 minutes of content and 190,512 possible combinations of phrases and decisions taken by the fake but realistic Dalí. The exhibition was created by Goodby, Silverstein & Partners using 6,000 frames of Dalí taken from historic footage and 1,000 hours of machine learning.

 

From DSC:
On the one hand: incredible work! Fantastic job! On the other hand, if this type of deepfake can be done, how can any video be trusted from here on out? What technology/app will be able to confirm that a video really is that person, actually saying those words?

Will we get to a point where a video has to declare, “This is so-and-so, and I approved this video”? Or will we have an electronic signature? Will a blockchain-based tech be used? I don’t know…there always seem to be pros and cons to any given technology. It’s how we use it. It can be a dream, or it can be a nightmare.
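
One plausible building block for that “electronic signature” idea is ordinary public-key signing of a file’s hash: the creator signs the footage, and anyone holding the creator’s public key can later check that the file has not been altered. The sketch below is illustrative only; it assumes Python’s third-party cryptography package and a hypothetical file name, and it deliberately ignores the much harder problems of proving whose key it is and distributing that key.

```python
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def file_digest(path: str) -> bytes:
    """Hash the video in chunks so large files don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.digest()

# The creator signs the digest of the footage with a private key...
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

signature = private_key.sign(file_digest("dali_interview.mp4"))  # hypothetical file name

# ...and a viewer who trusts the creator's public key verifies it later.
try:
    public_key.verify(signature, file_digest("dali_interview.mp4"))
    print("Signature valid: the footage matches what the creator signed.")
except InvalidSignature:
    print("Signature invalid: the footage was altered or signed by someone else.")
```

The math here is the easy part; the open questions above are really about identity, that is, how a viewer comes to trust that a given public key belongs to the person on screen. That is where proposals such as registries or blockchain-anchored records usually enter the conversation.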

 

 

From DSC:
Re: the Learning from the Living [Class] Room vision of a next gen learning platform

 

Learning from the Living Class Room

 

…wouldn’t it be cool if you could use your voice to ask your smart/connected “TV” type of device:

“Show me the test questions for Torts I from WMU-Cooley Law School.” (Cooley could then charge, say, $0.99 for access to those questions.)

Then, the system keeps track of how you did on those questions. The ones you got right come back for review less often than the ones you got wrong, and the more often you answer a question correctly, the less frequently you are asked it.
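
A minimal sketch of that review logic, along the lines of a simple Leitner-style spaced-repetition scheme (the intervals, question text, and data structures below are hypothetical placeholders, not a description of any existing product): each correct answer moves a question into a box that is reviewed less often, and each miss sends it back to the front of the line.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Review intervals in days per box: the better you know it, the longer the gap.
INTERVALS = [1, 3, 7, 21, 60]

@dataclass
class Card:
    question: str
    box: int = 0                              # 0 = newest / least well known
    due: date = field(default_factory=date.today)

    def record_answer(self, correct: bool) -> None:
        """Move the card between boxes and schedule its next review."""
        if correct:
            self.box = min(self.box + 1, len(INTERVALS) - 1)
        else:
            self.box = 0                      # missed questions come back quickly
        self.due = date.today() + timedelta(days=INTERVALS[self.box])

def due_today(cards: list) -> list:
    """The day's review queue: only cards whose due date has arrived."""
    return [c for c in cards if c.due <= date.today()]

# Hypothetical usage with placeholder questions:
deck = [Card("Elements of negligence?"), Card("Definition of battery?")]
for card in due_today(deck):
    card.record_answer(correct=True)          # in practice, graded from the learner's answer
```

Real systems layer far more on top of this (difficulty models, forgetting curves, licensing and payment for the content itself), but the core scheduling idea really is that simple.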

You sign up for such streams of content — and the system assesses you periodically. This helps a person keep certain topics/information fresh in their memory. This type of learning method would be incredibly helpful for students trying to pass the Bar or other types of large/summative tests — especially when a student has to be able to recall information that they learned over the last 3-5 years.

Come to think of it…this method could help all of us in learning new disciplines/topics throughout our lifetimes. Sign up for the streams of content that you want to learn more about…and drop the no-longer-relevant subscriptions as needed.

 

We need to tap into streams of content in our next gen learning platform

 

5 new legal professions under the impact of legal tech — from medium.com by Valentin Pivovarov

Excerpt:

Professionals who work in the field of jurisprudence should take the coming changes into account. It is necessary to realize the significance of the transition from an industrial society operating with printed texts to an information society operating with Internet-oriented resources. Implementing legal innovations will allow specialists to change their routines and make their work more effective. It is already important to retrain and look for new opportunities. And it doesn’t matter whether you work in a large law firm or your own legal tech startup. The game is changing.

 

2019 State of Corporate Law Departments Report — from Thomson Reuters Legal Executive Institute, the International Bar Association (IBA), the Corporate Legal Operations Consortium (CLOC), and UK-based legal research firm Acritas

 

 

Because many corporate law departments are faced with dynamic and wide-ranging problems, the solutions require a highly diverse set of skills and capabilities. Teams of lawyers alone are no longer enough to solve all problems in optimal ways. That’s why law departments must provide legal support to corporations that not only enables them to maximize their competitive advantage, but also safeguards the organization against unnecessary risk.

The State of Corporate Law Departments 2019 — a new report that was recently published by Thomson Reuters Legal Executive Institute, the International Bar Association (IBA), the Corporate Legal Operations Consortium (CLOC), and UK-based legal research firm Acritas — examines the landscape for corporate law departments and explains that in order to maximize the value of legal services being delivered, corporate law departments must make their best efforts to improve the impact of those legal services while at the same time reducing the cost of those services.

Excerpts (emphasis DSC):

However, this report suggests that in order to maximize the value delivered, it’s time to pay as much attention to improving the impact of legal services the corporate law departments are delivering as it is to reducing the cost of those services.

Indeed, today’s legal problems are dynamic and wide-ranging, and the solutions require a highly diverse set of skills and capabilities. Teams of lawyers alone are no longer enough to solve problems in optimal ways. All types of professional need to work together collaboratively, often from different organizations, and they need the support of modern working processes and systems.

Innovative law departments and innovative law firms score significantly higher across all key performance areas, including the ultimate measures of quality and value.

Innovation incorporates a whole host of different areas, such as embracing legal technologies, utilizing expert professionals holistically with lawyers, overhauling work processes and pricing models, and building collaborative partnerships between in-house teams and their outside law firms and alternative legal services suppliers.

To that end, the report identified a number of key levers that corporate law departments can use to create a higher-performing legal function and to enhance the impact their departments make on the overall success of the organization.

The solution to this challenge reinforces the key findings of this report as a whole — the need to tap into a diverse range of skills beyond legal expertise, to access new technologies, and to report on the progress made.
