From DSC:
I can’t help but reflect on how slippery the slope becomes when we start talking about using drones — especially when they are sponsored and used by governments, including our own government here in the U.S. Consider the following from the Future Today Institute.

The Future Today Institute -- discussing the slippery slope of using drones

Excerpt:

Eyes in the sky
As nationwide racial justice protests continue, some journalists and protestors have noticed a new addition to the armed police officers and National Guard troops: a drone flying a hexagon-shaped route 20,000 feet above the streets in Minneapolis. The drone, flown by U.S. Customs and Border Protection, is called a Predator, and is a piece of military technology used for identifying and targeting terrorists overseas. Lately, it’s become a more common sight domestically.

Last month, a number of New Yorkers witnessed a drone floating above them, barking orders to follow social distancing guidelines. The mysterious drone wasn’t official police equipment, but rather a privately owned device piloted by a man named Xavier Arthur in Queens, who was frustrated that people weren’t following stay-at-home orders. He claimed to represent the “Anti-Covid-19 Volunteer Drone Task Force.”

It’s not an isolated incident. During the outbreak, drones have been used extensively to monitor residents and encourage them to stay indoors, to inspect traffic stops and hospitals, and to spray cities with disinfectants. In Paris and Mumbai, they’re patrolling for social distancing violators. In China, a video clip went viral showing a drone breaking up a mahjong game — residents had defied local orders to stay indoors. Drones with infrared cameras also allegedly flew overhead and checked for people with fevers.

Advanced drones can pinpoint certain behaviors in crowds from high altitudes, recognize and automatically follow targets, and communicate with each other or with command centers on the ground with remarkable precision and low latency. The pandemic and the protests are playing to the strengths of an emerging real-time aerial surveillance ecosystem.

3 Things You Should Know

  1. The Flying Internet of Things is taking off.
  2. New drones can self-destruct.
  3. Shareable drones may drive growth in the industry.
 
 

From DSC:
As the ripples move outward from this time of the Coronavirus, we need to be very careful with #EmergingTechnologies. For example, where might the use of (police dept) drones equipped with #AI #FacialRecognition flying overhead take us? What if you’re of the “wrong religion” in a country? Hmm…

 

Social Distancing Enforcement Drones Arrive in the U.S. — from nymag.com by Adam Raymond

Excerpts:

In late January, a viral video from China showed people who’d wandered outside in the early days of the coronavirus outbreak getting scolded by a disembodied voice from a drone flying overhead. Last month, similar campaigns began in France, where locals flouting travel restrictions were gently reminded to “respectez les distances de sécurité s’il vous plaît.”

Now, self-righteous flying robots have made their way to the U.S., with at least two American police departments deploying drones to tell people to disperse, go home, and stay there.

“These drones will be around the City with an automated message from the Mayor telling you to STOP gathering, disperse and go home,” the police department wrote on Facebook. “Summonses HAVE AND WILL CONTINUE to be issued to those found in violation. Fines are up to $1000. You have been advised.”

 

 

Per this TechCrunch article:

The world is vulnerable to a new type of trolling as people turn to Zoom video calls to feel connected amidst quarantines. Jerks are using Zoom’s screensharing feature to blast other viewers with the most awful videos from across the internet, from violence to shocking pornography.

 

From DSC:
I’d like to thank Mr. Eric Osterberg — a fraternity brother and friend of mine — for sending me the following article. I wrote back to him, and after thanking him for the article, I said:

An article like this makes me reflect on things — which is always good for me, as it helps me see my blind spots and weigh the good and the bad of things. Technologies are becoming more powerful and more integrated into our lives — for better at times, and for worse at others.

I’m wondering how the legal realm can help create a positive future for societies throughout the globe… any thoughts?


Can computers ever replace the classroom? — from theguardian.com by Alex Beard
With 850 million children worldwide shut out of schools, tech evangelists claim now is the time for AI education. But as the technology’s power grows, so too do the dangers that come with it. 

Excerpts:

But it’s in China, where President Xi Jinping has called for the nation to lead the world in AI innovation by 2030, that the fastest progress is being made. In 2018 alone, Li told me, 60 new AI companies entered China’s private education market. Squirrel AI is part of this new generation of education start-ups. The company has already enrolled 2 million student users, opened 2,600 learning centres in 700 cities across China, and raised $150m from investors.

The supposed AI education revolution is not here yet, and it is likely that the majority of projects will collapse under the weight of their own hype.

The point, in short, is that AI doesn’t have to match the general intelligence of humans to be useful – or indeed powerful. This is both the promise of AI, and the danger it poses.

It was a reminder that Squirrel AI’s platform, like those of its competitors worldwide, doesn’t have to be better than the best human teachers – to improve people’s lives, it just needs to be good enough, at the right price, to supplement what we’ve got. The problem is that it is hard to see technology companies stopping there. For better and worse, their ambitions are bigger. “We could make a lot of geniuses,” Li told me.

 

Future Today Institute's 2020 tech trends report

Key takeaways of this report:

  • Welcome to the Synthetic Decade.
  • You’ll soon have augmented hearing and sight.
  • A.I.-as-a-Service and Data-as-a-Service will reshape business.
  • China has created a new world order.
  • Home and office automation is nearing the mainstream.
  • Everyone alive today is being scored.
  • We’ve traded FOMO for abject fear.
  • It’s the end of forgetting.
  • Our new trust economy is being formed.

 

Facial recognition startup Clearview AI says its full client list was stolen — from engadget.com by Igor Bonifacic

Excerpt:

You might expect a high-profile (and controversial) facial recognition startup like Clearview AI would have its data locked down, but it turns out it’s just as vulnerable as almost any other company to malicious individuals. In a notification obtained by The Daily Beast, the company says a recent vulnerability allowed someone to gain “unauthorized access” to a list of all of its customers. Clearview works with approximately 600 law enforcement agencies across North America, including the Chicago Police Department.


Cost cutting algorithms are making your job search a living hell — from vice.com by Nick Keppler
More companies are using automated job screening systems to vet candidates, forcing jobseekers to learn new and absurd tricks to have their résumés seen by a human.

Excerpts:

Companies are increasingly using automated systems to select who gets ahead and who gets eliminated from pools of applicants. For jobseekers, this can mean a series of bizarre, time-consuming tasks demanded by companies who have not shown any meaningful consideration of them.

Maneuvering around algorithmic gatekeepers to reach an actual person with a say in hiring has become a crucial skill, even if the tasks involved feel duplicitous and absurd. ATS software can also enable a company to discriminate, possibly unwittingly, based on bias-informed data and culling of certain psychological traits.

Until he started as a legal writer for FreeAdvice.com last month, Johnson, 36, said he was at potential employers’ whims. “I can’t imagine I’d move to the next round if I didn’t do what they said,” he told Motherboard.
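
To make the excerpt above more concrete, here is a minimal, hypothetical sketch of the kind of keyword-based screening that applicant tracking systems (ATS) are reported to perform. The keywords, weights, and cutoff below are invented for illustration; no specific vendor necessarily works this way.

```python
import re

# Hypothetical keyword weights -- invented for illustration only.
REQUIRED_KEYWORDS = {
    "python": 3,
    "contract law": 2,
    "project management": 2,
    "sql": 1,
}
CUTOFF = 5  # minimum score needed to reach a human reviewer (made up)


def score_resume(text: str) -> int:
    """Count weighted keyword hits in the resume text."""
    text = text.lower()
    score = 0
    for keyword, weight in REQUIRED_KEYWORDS.items():
        # \b word boundaries keep "sql" from matching inside "sqlite", etc.
        hits = len(re.findall(rf"\b{re.escape(keyword)}\b", text))
        score += min(hits, 3) * weight  # cap repeats so keyword stuffing has limits
    return score


def passes_screen(text: str) -> bool:
    """True if the resume clears the automated cutoff."""
    return score_resume(text) >= CUTOFF


if __name__ == "__main__":
    resume = "Legal writer with project management experience and Python scripting."
    print(score_resume(resume), passes_screen(resume))  # -> 5 True
```

Even a toy filter like this shows why applicants end up mirroring the exact wording of a job posting: a qualified résumé that merely paraphrases the keywords can score below the cutoff and never reach a human reviewer.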

 

How to block Facebook and Google from identifying your face — from cnbc.com by Todd Haselton

Excerpt:

  • A New York Times report over the weekend discussed a company named Clearview AI that can easily recognize people’s faces when someone uploads a picture.
  • It scrapes this data from the internet and sites people commonly use, such as Facebook and YouTube, according to the report.
  • You can stop Facebook and Google from recognizing your face in their systems, which is one step toward regaining your privacy.
  • Still, it probably won’t entirely stop companies like Clearview AI from recognizing you, since they’re not using the systems developed by Google or Facebook.
 

From DSC:
As some of you may know, I’m now working for the WMU-Thomas M. Cooley Law School. My faith gets involved here, but I believe that the LORD wanted me to get involved with:

  • Using technology to increase access to justice (#A2J)
  • Contributing to leveraging the science of learning for the long-term benefit of our students, faculty, and staff
  • Raising awareness regarding the potential pros and cons of today’s emerging technologies
  • Increasing understanding that the legal realm has a looooong way to go to get (even somewhat) caught up with the impacts that such emerging technologies can/might have on us
  • Contributing and collaborating with others to help develop a positive future, not a negative one

Along these lines… in regard to what’s been happening with law schools over the last few years, I wanted to share a couple of things:

1) An article from The Chronicle of Higher Education by Benjamin Barton:

The Law School Crash

 

2) A response from our President and Dean, James McGrath: Repositioning a Law School for the New Normal

 

From DSC:
I also wanted to personally say that I arrived at WMU-Cooley Law School in 2018, and have been learning a lot there (which I love about my job!).  Cooley employees are very warm, welcoming, experienced, knowledgeable, and professional. Everyone there is mission-driven. My boss, Chris Church, is multi-talented and excellent. Cooley has a great administrative/management team as well.

There have been many exciting new things happening there. That said, it will take time before we see the results of these changes. Perseverance and innovation will be key ingredients in crafting a modern legal education — especially in an industry that is only now beginning to offer online-based courses at the Juris Doctor (J.D.) level (i.e., roughly 20 years after this began occurring within undergraduate higher education).

My point in posting this is to say that we should ALL care about what’s happening within the legal realm!  We are all impacted by it, whether we realize it or not. We are all in this together and no one is an island — not as individuals, and not as organizations.

We need:

  • Far more diversity within the legal field
  • More technical expertise within the legal realm — not only among lawyers, but also among legislators, senators, representatives, judges, and others
  • Greater use of teams of specialists within the legal field
  • More courses on emerging technologies — not only for legal practitioners but also for society at large
  • Far more vigilance in crafting a positive world to hand down to our kids and grandkids — a dream, not a nightmare. Just because we can, doesn’t mean we should.

Still not convinced that you should care? Here are some items from the CURRENT landscape:

  • You go to drop something off at your neighbor’s house. They have a camera that gets activated. Which facial recognition database are you now in? Did you give your consent to that? No, you didn’t.
  • Because you posted your photo on Facebook, YouTube, Venmo, and/or millions of other websites, your face could be in Clearview AI’s database. Did you give your consent to that occurring? No, you didn’t.
  • You’re at the airport and facial recognition is used instead of a passport. Whose database was that from, and what gets shared? Did you give your consent to that occurring? Probably not, and it isn’t easy to opt out either.
  • Numerous types of drones, delivery bots, and more are already coming onto the scene. What will the sidewalks, streets, and skies look like — and sound like — in your neighborhood in the near future? Is that how you want it? Did you give your consent to that happening? No, you didn’t.
  • …and on and on it goes.

Addendum — speaking of islands!

Palantir CEO: Silicon Valley can’t be on ‘Palo Alto island’ — Big Tech must play by the rules — from cnbc.com by Jessica Bursztynsky

Excerpt:

Palantir Technologies co-founder and CEO Alex Karp said Thursday the core problem in Silicon Valley is the attitude among tech executives that they want to be separate from United States regulation.

“You cannot create an island called Palo Alto Island,” said Karp, who suggested tech leaders would rather govern themselves. “What Silicon Valley really wants is the canton of Palo Alto. We have the United States of America, not the ‘United States of Canton,’ one of which is Palo Alto. That must change.”

“Consumer tech companies, not Apple, but the other ones, have basically decided we’re living on an island and the island is so far removed from what’s called the United States in every way, culturally, linguistically and in normative ways,” Karp added.

 

 

From DSC:
I’ll say it again: just because we can, doesn’t mean we should.

From the article below, we can see another unintended consequence developing on society’s landscape. I really wish the 20- and 30-somethings who are being hired by the big tech companies — especially Amazon, Facebook, Google, Apple, and Microsoft — and who are developing these things would ask themselves:

  • “Just because we can develop this system/software/application/etc., SHOULD we be developing it?”
  • What might the negative consequences be? 
  • Do the positive contributions outweigh the negative impacts…or not?

To college professors and teachers:
Please pass these thoughts on to your students now, so that this internal questioning and these conversations begin to take place in K-16.


Report: Colleges Must Teach ‘Algorithm Literacy’ to Help Students Navigate Internet — from edsurge.com by Rebecca Koenig

Excerpt (emphasis DSC):

If the Ancient Mariner were sailing on the internet’s open seas, he might conclude there’s information everywhere, but nary a drop to drink.

That’s how many college students feel, anyway. A new report published this week about undergraduates’ impressions of internet algorithms reveals students are skeptical of and unnerved by tools that track their digital travels and serve them personalized content like advertisements and social media posts.

And some students feel like they’ve largely been left to navigate the internet’s murky waters alone, without adequate guidance from teachers and professors.

Researchers set out to learn “how aware students are about their information being manipulated, gathered and interacted with,” said Alison Head, founder and director of Project Information Literacy, in an interview with EdSurge. “Where does that awareness drop off?”

They found that many students not only have personal concerns about how algorithms compromise their own data privacy but also recognize the broader, possibly negative implications of tools that segment and customize search results and news feeds.
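
For a concrete picture of what “tools that segment and customize search results and news feeds” can mean in practice, here is a deliberately simplified, hypothetical sketch of interest-based feed ranking. The interest profile, the items, and the 0.7/0.3 blend are invented for illustration; no particular platform is being described.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Item:
    title: str
    topics: Dict[str, float]   # topic -> relevance, as tagged by the platform
    engagement: float          # prior click/share rate, 0..1


def rank(items: List[Item], profile: Dict[str, float]) -> List[Item]:
    """Order items by a blend of inferred interest and prior engagement."""
    def score(item: Item) -> float:
        interest = sum(profile.get(topic, 0.0) * w for topic, w in item.topics.items())
        return 0.7 * interest + 0.3 * item.engagement  # arbitrary blend, made up

    return sorted(items, key=score, reverse=True)


if __name__ == "__main__":
    # A profile inferred from tracked behavior (searches, watch time, clicks).
    user_profile = {"drones": 0.9, "privacy": 0.7, "basketball": 0.2}

    feed = rank(
        [
            Item("New drone delivery trial", {"drones": 1.0}, 0.4),
            Item("Facial recognition lawsuit", {"privacy": 1.0}, 0.6),
            Item("Playoff highlights", {"basketball": 1.0}, 0.9),
        ],
        user_profile,
    )
    for item in feed:
        print(item.title)
```

The report’s point is that most students never see this scoring step; they only see its output, already reordered around a profile they never knowingly provided.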

 

From DSC:
It is very disturbing that citizens had no say in this. Legislators, senators, representatives, lawyers, law schools, politicians, engineers, programmers, professors, teachers, and more… please reflect upon our current situation here. How can we help create the kind of future that we can hand down to our kids and rest well at night, knowing we did all that we could to provide a dream — and not a nightmare — for them?


The Secretive Company That Might End Privacy as We Know It — from nytimes.com by Kashmir Hill
A little-known start-up helps law enforcement match photos of unknown people to their online images — and “might lead to a dystopian future or something,” a backer says.

[Hoan Ton-That’s] tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.

 

Excerpts:

“But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year…”

Clearview’s app carries extra risks because law enforcement agencies are uploading sensitive photos to the servers of a company whose ability to protect its data is untested.
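
As a rough mental model of how an app like the one described above could work (this is not Clearview’s actual implementation, which has not been published), a face-search system generally combines a face-embedding model with a nearest-neighbor lookup over a database of scraped photos. The sketch below is generic: the embedding function is a stand-in, the 128-dimension figure is an assumption, and real systems use trained neural networks plus approximate nearest-neighbor indexes rather than brute force.

```python
import numpy as np

EMBEDDING_DIM = 128  # typical face embeddings are roughly 128-512 dims (assumption)


def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a trained face-embedding model (NOT a real recognizer)."""
    # Deterministic pseudo-embedding derived from the image bytes, for demo only.
    rng = np.random.default_rng(abs(hash(image.tobytes())) % (2**32))
    vec = rng.normal(size=EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)


class FaceIndex:
    """Brute-force nearest-neighbor lookup; production systems use ANN indexes."""

    def __init__(self) -> None:
        self.vectors = []  # list of unit vectors
        self.urls = []     # where each photo was scraped from

    def add(self, image: np.ndarray, source_url: str) -> None:
        self.vectors.append(embed_face(image))
        self.urls.append(source_url)

    def search(self, image: np.ndarray, top_k: int = 3):
        query = embed_face(image)
        sims = np.stack(self.vectors) @ query  # cosine similarity (unit vectors)
        best = np.argsort(sims)[::-1][:top_k]
        return [(self.urls[i], float(sims[i])) for i in best]


if __name__ == "__main__":
    photo = np.zeros((64, 64, 3), dtype=np.uint8)  # placeholder "photo"
    index = FaceIndex()
    index.add(photo, "https://example.com/some-public-photo")
    print(index.search(photo, top_k=1))  # the same photo matches itself
```

Even this toy version makes the excerpted concern concrete: once the embeddings and their source URLs sit in one index, the sensitivity of the whole system lives in that database, not in the lookup code.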

 

CES 2020: Finding reality in a deluge of utopia — from web-strategist.com by Jeremiah Owyang

Excerpts:

One of my strategies is to look past the products that were announced and instead find the new technologies — such as sensors and data types — that will shed light on which products will emerge.

The trick to approaching CES: Look for what creates the data, then analyze how it will be used; therein lies the power/leverage/business model of the future.

Sharp’s augmented windows give us an interesting glimpse of what retail could look like if every window was a transparent screen…

Rivian, the new electric truck company funded by both Ford and Amazon, was featured at the Amazon booth and drew a large crowd. Each wheel has an independent motor, and the truck is Alexa-integrated – watch out, Cybertruck.

Caution: “Data leakage” (where your data ends up in places you didn’t expect) is frightening, and people will start to care. The number of devices on display that offer data collection to unknown companies in unknown countries is truly astounding. From personal, business, and national security perspectives, consumers and businesses alike really don’t know the ramifications of all of this data sharing.

Also see:

 

From pizza to transplant organs: What drones will be delivering in the 2020s — from digitaltrends.com by Luke Dormehl

Excerpt:

From drone racing to drone photography, quadcopters and other unmanned aerial vehicles rose to prominence in the 2010s. But they’re going to become an even bigger thing in the decade to come. Case in point: deliveries by drone.

Who should you be watching in this space?


From DSC:

While I appreciate Luke’s reporting on this, I am very much against darkening the skies with noisy machines. Again, we adults need to be very careful about the world that we are developing for our kids! If items could be delivered via a system of underground pipes, that would be a quieter, less visible, and more agreeable approach for me.

Just because we can…

 