From DSC:
As some of you may know, I’m now working for the WMU-Thomas M. Cooley Law School. My faith gets involved here, but I believe that the LORD wanted me to get involved with:

  • Using technology to increase access to justice (#A2J)
  • Contributing to leveraging the science of learning for the long-term benefit of our students, faculty, and staff
  • Raising awareness regarding the potential pros and cons of today’s emerging technologies
  • Increasing the understanding that the legal realm has a looooong way to go to catch up (even somewhat) with the impacts that such emerging technologies can and might have on us
  • Contributing and collaborating with others to help develop a positive future, not a negative one.

Along these lines…in regard to what’s been happening with law schools over the last few years, I wanted to share a couple of things:

1) An article from The Chronicle of Higher Education by Benjamin Barton:

The Law School Crash

 

2) A response from our President and Dean, James McGrath: Repositioning a Law School for the New Normal

 

From DSC:
I also wanted to personally say that I arrived at WMU-Cooley Law School in 2018, and have been learning a lot there (which I love about my job!).  Cooley employees are very warm, welcoming, experienced, knowledgeable, and professional. Everyone there is mission-driven. My boss, Chris Church, is multi-talented and excellent. Cooley has a great administrative/management team as well.

There have been many exciting, new things happening there. But that said, it will take time before we see the results of these changes. Perseverance and innovation will be key ingredients in crafting a modern legal education — especially in an industry that is only now beginning to offer online-based courses at the Juris Doctor (J.D.) level (i.e., roughly 20 years after such courses began appearing in undergraduate higher education).

My point in posting this is to say that we should ALL care about what’s happening within the legal realm!  We are all impacted by it, whether we realize it or not. We are all in this together and no one is an island — not as individuals, and not as organizations.

We need:

  • Far more diversity within the legal field
  • More technical expertise within the legal realm — not only among lawyers, but among legislators, senators, representatives, judges, and others
  • Greater use of teams of specialists within the legal field
  • To offer more courses regarding emerging technologies — not only for legal practitioners themselves but also for society at large
  • To be far more vigilant in crafting a positive world to be handed down to our kids and grandkids — a dream, not a nightmare. Just because we can, doesn’t mean we should.

Still not convinced that you should care? Here are some things on the CURRENT landscapes:

  • You go to drop something off at your neighbor’s house. They have a camera that gets activated. What facial recognition database are you in now? Did you give your consent to that? No, you didn’t.
  • Because you posted your photo on Facebook, YouTube, Venmo and/or millions of other websites, your face could be in Clearview AI’s database. Did you give your consent to that occurring? No, you didn’t.
  • You’re at the airport and facial recognition is used instead of a passport. Whose database was that from and what gets shared? Did you give your consent to that occurring? Probably not, and it’s not easy to opt out either.
  • Numerous types of drones, delivery bots, and more are already coming onto the scene. What will the sidewalks, streets, and skies look like — and sound like — in your neighborhood in the near future? Is that how you want it? Did you give your consent to that happening? No, you didn’t.
  • …and on and on it goes.

Addendum — speaking of islands!

Palantir CEO: Silicon Valley can’t be on ‘Palo Alto island’ — Big Tech must play by the rules — from cnbc.com by Jessica Bursztynsky

Excerpt:

Palantir Technologies co-founder and CEO Alex Karp said Thursday the core problem in Silicon Valley is the attitude among tech executives that they want to be separate from United States regulation.

“You cannot create an island called Palo Alto Island,” said Karp, who suggested tech leaders would rather govern themselves. “What Silicon Valley really wants is the canton of Palo Alto. We have the United States of America, not the ‘United States of Canton,’ one of which is Palo Alto. That must change.”

“Consumer tech companies, not Apple, but the other ones, have basically decided we’re living on an island and the island is so far removed from what’s called the United States in every way, culturally, linguistically and in normative ways,” Karp added.

 

Gartner: 10 ways technology will change what it means to be human — from campustechnology.com by Rhea Kelly

Excerpts:

Gartner’s top 10 strategic predictions for technology are:

  1. “By 2023, the number of people with disabilities employed will triple due to AI and emerging technologies, reducing barriers to access.”
  2. “By 2024, AI identification of emotions will influence more than half of the online advertisements you see.”
  3. “Through 2023, 30 percent of IT organizations will extend BYOD policies with ‘bring your own enhancement’ (BYOE) to address augmented humans in the workforce.”
  4. “By 2025, 50 percent of people with a smartphone but without a bank account will use a mobile-accessible cryptocurrency account.”
  5. “By 2023, a self-regulating association for oversight of AI and machine learning designers will be established in at least four of the G7 countries.”
  6. “By 2023, 40 percent of professional workers will orchestrate their business application experiences and capabilities like they do their music streaming experience.”
  7. “By 2023, up to 30 percent of world news and video content will be authenticated as real by blockchain countering deep fake technology.”
  8. “Through 2021, digital transformation initiatives will take large traditional enterprises on average twice as long and cost twice as much as anticipated.”
  9. “By 2023, individual activities will be tracked digitally by an ‘Internet of Behavior’ to influence benefit and service eligibility for 40 percent of people worldwide.”
  10. “By 2024, the World Health Organization will identify online shopping as an addictive disorder, as millions abuse digital commerce and encounter financial stress.”

Facial recognition, location tracking and big data will allow organizations to monitor individual behavior and link that behavior to other digital actions, Gartner said, noting that “The Internet of Things (IoT) – where physical things are directed to do a certain thing based on a set of observed operating parameters relative to a desired set of operating parameters — is now being extended to people, known as the Internet of Behavior (IoB).”
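
The control loop Gartner describes (observed operating parameters steered toward desired ones) can be sketched in a few lines of Python. This is a minimal illustration; the parameter names and numbers are invented for the example, not drawn from Gartner's report.

```python
# A minimal sketch of the control loop Gartner describes: observed
# operating parameters are compared against desired ones, and the
# system issues corrections. All names and values are illustrative.

def control_step(observed: dict, desired: dict) -> dict:
    """Return an adjustment for each parameter that is off target."""
    adjustments = {}
    for name, target in desired.items():
        error = target - observed.get(name, 0.0)
        if abs(error) > 1e-9:
            adjustments[name] = error  # simple proportional correction
    return adjustments

# For an IoT device, the parameters might be temperature or speed; the
# IoB extends the same loop to people (e.g., nudging a driver whose
# observed braking distance deviates from a "desired" one).
observed = {"temperature_c": 23.5, "braking_distance_m": 42.0}
desired = {"temperature_c": 21.0, "braking_distance_m": 35.0}
print(control_step(observed, desired))
```

The unsettling part is not the loop itself, which is standard control engineering, but the substitution of human behavior for machine parameters as the thing being corrected.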

 

From DSC:
That last quote about the “Internet of Behavior (IoB)” should disturb us. I don’t want that kind of world for the next generation. 

 

20 digital transformation leaders to follow on Twitter in 2020 — from enterprisersproject.com by David F. Carr
Committed to digital transformation this year? Follow these people for perspective and emerging lessons

Excerpt:

One of our New Year’s resolutions was to refresh and expand our Twitter feed for digital transformation leaders, reviewing them not just for the use of the right hashtags but for the content they share.

There are a few repeats from a similar list we shared last year, but for the most part, we tried to give you new Twitter handles to follow. This year’s list includes CIOs, authors, consultants, and cloud computing leaders. Some only post on technology topics, while others share thoughts on family, culture, politics, and favorite movies.

The common denominator we looked for was a thoughtfully curated feed that’s not entirely self-promotional but adds to the conversation we’re all having about how to understand the potential of digital transformation and put it to work for our organizations.


From DSC:

While these types of lists invariably leave off a ton of extremely talented individuals and organizations who are worth following as well, such lists are a good starting point for:

  • Beginning to tap into streams of content in a given area
  • Observing the topics, issues, ideas being discussed
  • Building one’s network
  • Seeing who these folks follow and who they respect
  • …and more.

Searching for the top __ people on Twitter in subject XYZ is a solid way to enhance our [lifelong] learning ecosystems.

 

From DSC:
I’ll say it again, just because we can, doesn’t mean we should.

From the article below, we can see another unintended consequence developing on society’s landscapes. I really wish the 20- and 30-somethings being hired by the big tech companies — especially at Amazon, Facebook, Google, Apple, and Microsoft — who are developing these things would ask themselves:

  • “Just because we can develop this system/software/application/etc., SHOULD we be developing it?”
  • What might the negative consequences be? 
  • Do the positive contributions outweigh the negative impacts…or not?

To college professors and teachers:
Please pass these thoughts on to your students now, so that these internal questions and conversations begin to take place in K-16.


Report: Colleges Must Teach ‘Algorithm Literacy’ to Help Students Navigate Internet — from edsurge.com by Rebecca Koenig

Excerpt (emphasis DSC):

If the Ancient Mariner were sailing on the internet’s open seas, he might conclude there’s information everywhere, but nary a drop to drink.

That’s how many college students feel, anyway. A new report published this week about undergraduates’ impressions of internet algorithms reveals students are skeptical of and unnerved by tools that track their digital travels and serve them personalized content like advertisements and social media posts.

And some students feel like they’ve largely been left to navigate the internet’s murky waters alone, without adequate guidance from teachers and professors.

Researchers set out to learn “how aware students are about their information being manipulated, gathered and interacted with,” said Alison Head, founder and director of Project Information Literacy, in an interview with EdSurge. “Where does that awareness drop off?”

They found that many students not only have personal concerns about how algorithms compromise their own data privacy but also recognize the broader, possibly negative implications of tools that segment and customize search results and news feeds.

 
 

Some of the topics/items mentioned include:

  • Technologists join lawyers in creating the legal realm of the future.
  • Future lawyers will need to either have project managers on staff or be able to manage projects themselves.
  • Lifelong learning is now critically important. One doesn’t necessarily need to be able to code, but one needs to be constantly learning.
  • You will need to understand legal principles, but you will also need augmented skills (which will differ from person to person).
  • New business and delivery models. Don’t presuppose that the current model will always be around.
  • There will be fewer traditional roles/practices. Traditional roles are sunsetting; new skillsets are needed.
  • Students: Do your due diligence; read up on the industry and think about whether there’s a good fit. Learn your craft. Get experience. Be who you are. Bring your unique brand to the table.
 

Web Technologies of the Year 2019 — from w3techs.com

Excerpts:
These are the technologies that gained the most sites in 2019, in areas such as:

  • Content Management System of the Year 2019
  • Server-side Programming Language of the Year 2019
  • JavaScript Library of the Year 2019
  • Web Server of the Year 2019
  • Operating System of the Year 2019
  • Traffic Analysis Tool of the Year 2019
  • …and several more categories
 

Don’t trust AI until we build systems that earn trust — from economist.com
Progress in artificial intelligence belies a lack of transparency that is vital for its adoption, says Gary Marcus, coauthor of “Rebooting AI”

Excerpts:

Mr Marcus argues that it would be foolish of society to put too much stock in today’s AI techniques since they are so prone to failures and lack the transparency that researchers need to understand how algorithms reached their conclusions.

As part of The Economist’s Open Future initiative, we asked Mr Marcus about why AI can’t do more, how to regulate it and what teenagers should study to remain relevant in the workplace of the future.

Trustworthy AI has to start with good engineering practices, mandated by laws and industry standards, both of which are currently largely absent. Too much of AI thus far has consisted of short-term solutions, code that gets a system to work immediately, without a critical layer of engineering guarantees that are often taken for granted in other fields. The kinds of stress tests that are standard in the development of an automobile (such as crash tests and climate challenges), for example, are rarely seen in AI. AI could learn a lot from how other engineers do business.

The assumption in AI has generally been that if it works often enough to be useful, then that’s good enough, but that casual attitude is not appropriate when the stakes are high. It’s fine if autotagging people in photos turns out to be only 90 percent reliable—if it is just about personal photos that people are posting to Instagram—but it better be much more reliable when the police start using it to find suspects in surveillance photos.
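
Marcus's point about reliability thresholds can be made concrete with some base-rate arithmetic. The numbers below are assumptions for illustration (a 90% detection rate, a 10% false-positive rate, 10 genuine suspects among 100,000 scanned faces), not figures from the interview: even a "90 percent reliable" matcher produces overwhelmingly false matches when true suspects are rare.

```python
# Illustrative base-rate arithmetic: why "90 percent reliable" is not
# good enough for police surveillance. All numbers are assumptions
# chosen for the example, not data from the article.

def match_counts(population, true_suspects, sensitivity, false_positive_rate):
    """Return (true hits, false hits, precision) for a screening system."""
    innocents = population - true_suspects
    true_hits = true_suspects * sensitivity        # suspects correctly flagged
    false_hits = innocents * false_positive_rate   # innocents wrongly flagged
    precision = true_hits / (true_hits + false_hits)
    return true_hits, false_hits, precision

true_hits, false_hits, precision = match_counts(100_000, 10, 0.9, 0.1)
print(f"correct matches: {true_hits:.0f}")
print(f"false matches:   {false_hits:.0f}")
print(f"share of flagged people who are actual suspects: {precision:.4%}")
```

Under these assumed numbers, roughly 9 correct matches are swamped by nearly 10,000 false ones, so well under 1% of the people flagged are actual suspects. That is why an accuracy standard that is fine for autotagging Instagram photos fails when the stakes are high.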

 

120 AI predictions for 2020 — from forbes.com by Gil Press

Excerpt:

As for the universe, it is an open book for the 120 senior executives featured here, all involved with AI, delivering 2020 predictions for a wide range of topics: Autonomous vehicles, deepfakes, small data, voice and natural language processing, human and augmented intelligence, bias and explainability, edge and IoT processing, and many promising applications of artificial intelligence and machine learning technologies and tools. And there will be even more 2020 AI predictions, in a second installment to be posted here later this month.

 

From the American Bar Association (ABA): 2019 Solo and Small Firm Technology Usage Report

Excerpt:

Solos and small firms are in a unique position to leverage technology in order to become more sustainable and competitive with larger firms and alternative legal service providers—they have the ability to be more agile than larger firms because of their size. There are numerous technology options available and many are solo and small firm budget-friendly. Consumers are demanding convenience, transparency, and efficiency from the legal profession. Attorneys must be adaptable to this driving force. The legal profession has some of the highest rates of addiction, depression, and suicide. Women are leaving the profession in droves and lawyers are struggling to find work-life balance. Technology use can reduce human error, increase efficiency, reduce overhead, provide flexibility, enhance attorney well-being, and better meet client needs.

 

The future of law and computational technologies: Two sides of the same coin — from law.mit.edu by Daniel Linna
Law and computation are often thought of as being two distinct fields. Increasingly, that is not the case. Dan Linna explores the ways a computational approach could help address some of the biggest challenges facing the legal industry.

Excerpt:

The rapid advancement of artificial intelligence (“AI”) introduces opportunities to improve legal processes and facilitate social progress. At the same time, AI presents an original set of inherent risks and potential harms. From a Law and Computational Technologies perspective, these circumstances can be broadly separated into two categories. First, we can consider the ethics, regulations, and laws that apply to technology. Second, we can consider the use of technology to improve the delivery of legal services, justice systems, and the law itself. Each category presents an unprecedented opportunity to use significant technological advancements to preserve and expand the rule of law.

For basic legal needs, access to legal services might come in the form of smartphones or other devices that are capable of providing users with an inventory of their legal rights and obligations, as well as providing insights and solutions to common legal problems. Better yet, AI and pattern matching technologies can help catalyze the development of proactive approaches to identify potential legal problems and prevent them from arising, or at least mitigate their risk.

We risk squandering abundant opportunities to improve society with computational technologies if we fail to proactively create frameworks to embed ethics, regulation, and law into our processes by design and default.

To move forward, technologists and lawyers must radically expand current notions of interdisciplinary collaboration. Lawyers must learn about technology, and technologists must learn about the law.

 

 

Technology is increasingly being used to provide legal services, which demands a new breed of innovative lawyer for the 21st century. Law schools are launching specialist LL.M.s in response, giving students computing skills — from llm-guide.com by Seb Murray

Excerpts:

Junior lawyers at Big Law firms have long been expected to work grueling hours on manual and repetitive tasks like reviewing documents and doing due diligence. Increasingly, such work is being undertaken by machines – which can be faster, cheaper and more accurate than humans. This is the world of legal technology – the use of technology to provide legal services.

The top law schools recognize the need to train not just excellent lawyers but tech-savvy ones too, who understand the application of technology and its impact on the legal market. They are creating specialist courses for those who want to be more involved with the technology used to deliver legal advice.

“Technology is changing the way we live, work and interact,” says Alejandro Touriño, co-director of the course. “This new reality demands a new breed of lawyers who can adapt to the emerging paradigm. An innovative lawyer in the 21st century needs not only to be excellent in law, but also in the sector where their clients operate and the technologies they deal with.” 

The rapid growth in Legal Tech LL.M. offerings reflects a need in the professional world. Indeed, law firms know they need to become digital businesses in order to attract and retain clients and prospective employees.

 

From DSC:
In case it’s helpful or interesting: a person interested in a legal career needs to first earn a Juris Doctor (J.D.) degree, then pass the bar exam. At that point, if they want to expand their knowledge in a certain area or areas, they can move on to an LL.M. degree if they choose to.

As in the world of higher ed and also in the corporate training area, I believe that the legal field will need to move more toward the use of teams of specialists. Several members of such teams will NOT have law degrees. For example, technologists, programmers, user experience designers, and others should be teaming up with lawyers more and more these days.

 

The 20 top tech skills that employers want and that can help you find a job, according to recruiting site Indeed — from businessinsider.com by Rosalie Chan and Bani Sapra

Excerpt:

The job search site Indeed released a report this month about the top tech skills of 2019 based on job descriptions that are being posted.

Andrew Flowers, an economist at Indeed, says that in today’s job market, there are two major trends that drive the top skills in tech. The first is the rise of data science, machine learning, and artificial intelligence. The second is the rise of cloud computing.

“Python has had explosive growth,” Flowers told Business Insider. “If I’m around the dinner table and a nephew asks what should I learn? Having done this report, I would say, learn Python.”

 

‘Academic capitalism’ is reshaping faculty life. What does that mean? — from edsurge.com by Rebecca Koenig

Excerpt:

Professors have long savored their position outside of commercial systems. But these days, there’s plenty of signs of capitalism within the academy—scholars seek support from funders in hopes that the findings will lead to commercial applications, departments market their courses to students as pathways to lucrative careers and universities replace tenure-track positions with adjunct jobs to cut costs.

Last week on the latest installment of EdSurge Live, our monthly online discussion forum, we dug into the topic, often referred to as “academic capitalism.”

 

Why AI is a threat to democracy – and what we can do to stop it — from asumetech.com by Lawrence Cole

Excerpts:

In the US, however, we also have a tragic lack of foresight. Instead of creating a grand strategy for AI or for our long-term futures, the federal government has removed the financing of scientific and technical research. The money must therefore come from the private sector. But investors also expect a certain return. That is a problem. You cannot plan your R&D breakthroughs when working on fundamental technology and research. It would be great if the big tech companies had the luxury of working very hard without having to organize an annual conference to show off their newest and best whiz bang thing. Instead, we now have countless examples of bad decisions made by someone in the G-MAFIA, probably because they worked quickly. We begin to see the negative effects of the tension between doing research that is in the interest of humanity and making investors happy.

The problem is that our technology has become increasingly sophisticated, but our thinking about what free speech is and what a free market economy looks like has not become that advanced. We tend to resort to very basic interpretations: free speech means that all speech is free, unless it conflicts with defamation laws, and that’s the end of the story. That is not the end of the story. We need to start a more sophisticated and intelligent conversation about our current laws, our emerging technology, and how we can make the two meet halfway.

 

So I absolutely believe that there is a way forward. But we have to come together and bridge the gap between Silicon Valley and DC, so that we can all steer the boat in the same direction.

— Amy Webb, futurist, NYU professor, founder of the Future Today Institute

 

Also see:

“FRONTLINE investigates the promise and perils of artificial intelligence, from fears about work and privacy to rivalry between the U.S. and China. The documentary traces a new industrial revolution that will reshape and disrupt our lives, our jobs and our world, and allow the emergence of the surveillance society.”

The film has five distinct messages about:

1. China’s AI Plan
2. The Promise of AI
3. The Future of Work
4. Surveillance Capitalism
5. The Surveillance State

 
© 2025 | Daniel Christian