From DSC:
As the ripples move outward from this time of the Coronavirus, we need to be very careful with #EmergingTechnologies. For example, where might the use of (police dept) drones equipped with #AI #FacialRecognition flying overhead take us? What if you’re of the “wrong religion” in a country? Hmm…

 

Social Distancing Enforcement Drones Arrive in the U.S. — from nymag.com by Adam Raymond

Excerpts:

In late January, a viral video from China showed people who’d wandered outside in the early days of the coronavirus outbreak getting scolded by a disembodied voice from a drone flying overhead. Last month, similar campaigns began in France, where locals flouting travel restrictions were gently reminded to “respectez les distances de sécurité s’il vous plaît.”

Now, self-righteous flying robots have made their way to the U.S., with at least two American police departments deploying drones to tell people to disperse, go home, and stay there.

“These drones will be around the City with an automated message from the Mayor telling you to STOP gathering, disperse and go home,” the police department wrote on Facebook. “Summonses HAVE AND WILL CONTINUE to be issued to those found in violation. Fines are up to $1000. You have been advised.”

 

 

Per a TechCrunch article:

The world is vulnerable to a new type of trolling as people turn to Zoom video calls to feel connected amidst quarantines. Jerks are using Zoom’s screensharing feature to blast other viewers with the most awful videos from across the internet, from violence to shocking pornography.

 

My thanks to a friend for causing me to further reflect on this article: “Can computers ever replace the classroom?” [Beard]


From DSC:
I’d like to thank Mr. Eric Osterberg — a fraternity brother and friend of mine — for sending me the following article. I wrote back to him. After thanking Eric for the article, I said:

Such an article makes me reflect on things, which is always good for me; it helps me see my blind spots and think about both the good and the bad of a technology. Technologies are becoming more powerful and more integrated into our lives, for better at times and for worse at others.

I’m wondering how the legal realm can help create a positive future for societies throughout the globe…any thoughts?


Can computers ever replace the classroom? — from theguardian.com by Alex Beard
With 850 million children worldwide shut out of schools, tech evangelists claim now is the time for AI education. But as the technology’s power grows, so too do the dangers that come with it. 

Excerpts:

But it’s in China, where President Xi Jinping has called for the nation to lead the world in AI innovation by 2030, that the fastest progress is being made. In 2018 alone, Li told me, 60 new AI companies entered China’s private education market. Squirrel AI is part of this new generation of education start-ups. The company has already enrolled 2 million student users, opened 2,600 learning centres in 700 cities across China, and raised $150m from investors.

The supposed AI education revolution is not here yet, and it is likely that the majority of projects will collapse under the weight of their own hype.

The point, in short, is that AI doesn’t have to match the general intelligence of humans to be useful – or indeed powerful. This is both the promise of AI, and the danger it poses.

It was a reminder that Squirrel AI’s platform, like those of its competitors worldwide, doesn’t have to be better than the best human teachers – to improve people’s lives, it just needs to be good enough, at the right price, to supplement what we’ve got. The problem is that it is hard to see technology companies stopping there. For better and worse, their ambitions are bigger. “We could make a lot of geniuses,” Li told me.

 

Future Today Institute's 2020 tech trends report

Key takeaways of this report:

  • Welcome to the Synthetic Decade.
  • You’ll soon have augmented hearing and sight.
  • A.I.-as-a-Service and Data-as-a-Service will reshape business.
  • China has created a new world order.
  • Home and office automation is nearing the mainstream.
  • Everyone alive today is being scored.
  • We’ve traded FOMO for abject fear.
  • It’s the end of forgetting.
  • Our new trust economy is being formed.

 

Facial recognition startup Clearview AI says its full client list was stolen — from engadget.com by Igor Bonifacic

Excerpt:

You might expect a high-profile (and controversial) facial recognition startup like Clearview AI would have its data locked down, but it turns out it’s just as vulnerable as almost any other company to malicious individuals. In a notification obtained by The Daily Beast, the company says a recent vulnerability allowed someone to gain “unauthorized access” to a list of all of its customers. Clearview works with approximately 600 law enforcement agencies across North America, including the Chicago Police Department.

Also see:

 

 

Cost cutting algorithms are making your job search a living hell — from vice.com by Nick Keppler
More companies are using automated job screening systems to vet candidates, forcing jobseekers to learn new and absurd tricks to have their résumés seen by a human.

Excerpts:

Companies are increasingly using automated systems to select who gets ahead and who gets eliminated from pools of applicants. For jobseekers, this can mean a series of bizarre, time-consuming tasks demanded by companies who have not shown any meaningful consideration of them.

Maneuvering around algorithmic gatekeepers to reach an actual person with a say in hiring has become a crucial skill, even if the tasks involved feel duplicitous and absurd. ATS software can also enable a company to discriminate, possibly unwittingly, based on bias-informed data and culling of certain psychological traits.

Until he started as a legal writer for FreeAdvice.com last month, Johnson, 36, said he was at potential employers’ whims. “I can’t imagine I’d move to the next round if I didn’t do what they said,” he told Motherboard.
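None of the vendors mentioned publish their actual algorithms, but the kind of keyword screen that jobseekers try to game can be sketched in a few lines. Everything below (the keywords, the résumé text, the 60% threshold) is invented purely for illustration:

```python
# Hypothetical sketch of a naive keyword-based resume screen.
# Real ATS products are proprietary; this only illustrates why
# candidates end up mirroring a posting's exact wording.

def keyword_score(resume_text: str, required: list[str]) -> float:
    """Fraction of required keywords found verbatim in the resume."""
    text = resume_text.lower()
    hits = sum(1 for kw in required if kw.lower() in text)
    return hits / len(required) if required else 0.0

def passes_screen(resume_text: str, required: list[str],
                  threshold: float = 0.6) -> bool:
    """Cull any resume whose verbatim-match score falls below threshold."""
    return keyword_score(resume_text, required) >= threshold

# Invented example data:
required = ["python", "contract drafting", "westlaw"]
resume = "Experienced legal writer; drafted contracts using Westlaw daily."
culled = not passes_screen(resume, required)
```

Note the failure mode: the résumé clearly describes contract-drafting experience, but the phrase "contract drafting" never appears verbatim, so a screen like this culls the candidate anyway. That mismatch is exactly what forces the "absurd tricks" the article describes.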

 

How to block Facebook and Google from identifying your face — from cnbc.com by Todd Haselton

Excerpt:

  • A New York Times report over the weekend discussed a company named Clearview AI that can easily recognize people’s faces when someone uploads a picture.
  • It scrapes this data from the internet and sites people commonly use, such as Facebook and YouTube, according to the report.
  • You can stop Facebook and Google from recognizing your face in their systems, which is one step toward regaining your privacy.
  • Still, it probably won’t entirely stop companies like Clearview AI from recognizing you, since they’re not using the systems developed by Google or Facebook.
 

From DSC:
As some of you may know, I’m now working for the WMU-Thomas M. Cooley Law School. My faith gets involved here, but I believe that the LORD wanted me to get involved with:

  • Using technology to increase access to justice (#A2J)
  • Contributing to leveraging the science of learning for the long-term benefit of our students, faculty, and staff
  • Raising awareness regarding the potential pros and cons of today’s emerging technologies
  • Increasing the understanding that the legal realm has a looooong way to go to get (even somewhat) caught up with the impacts that such emerging technologies can/might have on us.
  • Contributing and collaborating with others to help develop a positive future, not a negative one.

Along these lines…with regard to what’s been happening with law schools over the last few years, I wanted to share a couple of things:

1) An article from The Chronicle of Higher Education by Benjamin Barton:

The Law School Crash

 

2) A response from our President and Dean, James McGrath: Repositioning a Law School for the New Normal

 

From DSC:
I also wanted to personally say that I arrived at WMU-Cooley Law School in 2018, and have been learning a lot there (which I love about my job!).  Cooley employees are very warm, welcoming, experienced, knowledgeable, and professional. Everyone there is mission-driven. My boss, Chris Church, is multi-talented and excellent. Cooley has a great administrative/management team as well.

There have been many exciting new things happening there. That said, it will take time before we see the results of these changes. Perseverance and innovation will be key ingredients in crafting a modern legal education, especially in an industry that is only now beginning to offer online-based courses at the Juris Doctor (J.D.) level (roughly 20 years after online courses began appearing in undergraduate higher education).

My point in posting this is to say that we should ALL care about what’s happening within the legal realm!  We are all impacted by it, whether we realize it or not. We are all in this together and no one is an island — not as individuals, and not as organizations.

We need:

  • Far more diversity within the legal field
  • More technical expertise within the legal realm — not only among lawyers, but also among legislators, senators, representatives, judges, and others
  • Greater use of teams of specialists within the legal field
  • To offer more courses regarding emerging technologies — not only for legal practices themselves but also for society at large.
  • To be far more vigilant in crafting a positive world to be handed down to our kids and grandkids — a dream, not a nightmare. Just because we can, doesn’t mean we should.

Still not convinced that you should care? Here are some things on the CURRENT landscapes:

  • You go to drop something off at your neighbor’s house. They have a camera that gets activated.  What facial recognition database are you now on? Did you give your consent to that? No, you didn’t.
  • Because you posted your photo on Facebook, YouTube, Venmo, and/or millions of other websites, your face could be in Clearview AI’s database. Did you give your consent to that occurring? No, you didn’t.
  • You’re at the airport and facial recognition is used instead of a passport. Whose database was that from and what gets shared? Did you give your consent to that occurring? Probably not, and it’s not easy to opt out either.
  • Numerous types of drones, delivery bots, and more are already coming onto the scene. What will the sidewalks, streets, and skies look like — and sound like — in your neighborhood in the near future? Is that how you want it? Did you give your consent to that happening? No, you didn’t.
  • …and on and on it goes.

Addendum — speaking of islands!

Palantir CEO: Silicon Valley can’t be on ‘Palo Alto island’ — Big Tech must play by the rules — from cnbc.com by Jessica Bursztynsky

Excerpt:

Palantir Technologies co-founder and CEO Alex Karp said Thursday the core problem in Silicon Valley is the attitude among tech executives that they want to be separate from United States regulation.

“You cannot create an island called Palo Alto Island,” said Karp, who suggested tech leaders would rather govern themselves. “What Silicon Valley really wants is the canton of Palo Alto. We have the United States of America, not the ‘United States of Canton,’ one of which is Palo Alto. That must change.”

“Consumer tech companies, not Apple, but the other ones, have basically decided we’re living on an island and the island is so far removed from what’s called the United States in every way, culturally, linguistically and in normative ways,” Karp added.

 

 

From DSC:
I’ll say it again, just because we can, doesn’t mean we should.

From the article below, we can see another unintended consequence developing across society’s landscapes. I really wish the 20- and 30-somethings being hired by the big tech companies — especially at Amazon, Facebook, Google, Apple, and Microsoft — who are developing these things would ask themselves:

  • “Just because we can develop this system/software/application/etc., SHOULD we be developing it?”
  • What might the negative consequences be? 
  • Do the positive contributions outweigh the negative impacts…or not?

To college professors and teachers:
Please pass these thoughts on to your students now, so that these internal questions and conversations begin to take place in K-16.


Report: Colleges Must Teach ‘Algorithm Literacy’ to Help Students Navigate Internet — from edsurge.com by Rebecca Koenig

Excerpt (emphasis DSC):

If the Ancient Mariner were sailing on the internet’s open seas, he might conclude there’s information everywhere, but nary a drop to drink.

That’s how many college students feel, anyway. A new report published this week about undergraduates’ impressions of internet algorithms reveals students are skeptical of and unnerved by tools that track their digital travels and serve them personalized content like advertisements and social media posts.

And some students feel like they’ve largely been left to navigate the internet’s murky waters alone, without adequate guidance from teachers and professors.

Researchers set out to learn “how aware students are about their information being manipulated, gathered and interacted with,” said Alison Head, founder and director of Project Information Literacy, in an interview with EdSurge. “Where does that awareness drop off?”

They found that many students not only have personal concerns about how algorithms compromise their own data privacy but also recognize the broader, possibly negative implications of tools that segment and customize search results and news feeds.

 

From DSC:
Very disturbing that citizens had no say in this. Legislators, senators, representatives, lawyers, law schools, politicians, engineers, programmers, professors, teachers, and more…please reflect upon our current situation here. How can we help create the kind of future that we can hand down to our kids and rest well at night…knowing we did all that we could to provide a dream — and not a nightmare — for them?


The Secretive Company That Might End Privacy as We Know It — from nytimes.com by Kashmir Hill
A little-known start-up helps law enforcement match photos of unknown people to their online images — and “might lead to a dystopian future or something,” a backer says.

His tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.

 

Excerpts:

“But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year…”

Clearview’s app carries extra risks because law enforcement agencies are uploading sensitive photos to the servers of a company whose ability to protect its data is untested.
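Clearview hasn’t published how its matching works, but systems like the one described above typically map each face photo to a numeric "embedding" vector and then search for the stored vector most similar to the query. Here is a minimal sketch of that matching step only; the tiny 3-number vectors, URLs, and similarity threshold are all invented stand-ins for what a real neural network and a three-billion-image database would provide:

```python
import math

# Hypothetical illustration of embedding-based face matching.
# A real system computes embeddings with a neural network; here a
# tiny made-up database of 3-D vectors stands in for that step.

def cosine_similarity(a, b):
    """Similarity of two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(query, database, threshold=0.95):
    """Return (source_url, similarity) of the closest stored face, or None."""
    url, vec = max(database.items(),
                   key=lambda item: cosine_similarity(query, item[1]))
    sim = cosine_similarity(query, vec)
    return (url, sim) if sim >= threshold else None

database = {  # scraped-photo URL -> embedding (invented values)
    "site-a/photo1.jpg": [0.9, 0.1, 0.2],
    "site-b/photo9.jpg": [0.1, 0.8, 0.5],
}
query = [0.88, 0.12, 0.21]  # embedding of the uploaded photo
match = best_match(query, database)  # links back to where the photo appeared
```

The unsettling part is how little is needed once the scraping is done: the lookup itself is just a nearest-neighbor search over vectors, which scales to billions of images with standard indexing techniques.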

 

CES 2020: Finding reality in a deluge of utopia — from web-strategist.com by Jeremiah Owyang

Excerpts:

One of my strategies is to look past the products that were announced and instead find the new technologies, such as sensors and data types, that will shed light on which products will emerge.

The trick to approaching CES: Look for what creates the data, then analyze how it will be used, therein lies the power/leverage/business model of the future.

Sharp’s augmented windows give us an interesting glimpse of what retail could look like if every window was a transparent screen…

Rivian, the new electric-truck company funded by both Ford and Amazon, was featured at the Amazon booth and drew a large crowd. Each wheel has an independent motor, and the truck is Alexa-integrated. Watch out, Cybertruck.

Caution: “Data leakage” (where your data ends up in places you didn’t expect) is frightening, and people will start to care. The number of devices present that offer data collection to unknown companies in unknown countries is truly astounding. From personal, business, and national-security perspectives alike, consumers and businesses really don’t know the ramifications of all of this data sharing.

Also see:

 

From pizza to transplant organs: What drones will be delivering in the 2020s — from digitaltrends.com by Luke Dormehl

Excerpt:

From drone racing to drone photography, quadcopters and other unmanned aerial vehicles rose to prominence in the 2010s. But they’re going to become an even bigger thing in the decade to come. Case in point: deliveries by drone.

Who should you be watching in this space?


From DSC:

While I appreciate Luke’s reporting on this, I am very much against darkening the skies with noisy machines. Again, we adults need to be very careful about the world that we are developing for our kids! If items could be delivered via a system of underground pipes, that would be a quieter, less visible, and more agreeable approach for me.

Just because we can…

 

Indian police are using facial recognition to identify protesters in Delhi — from fastcompany.com by Kristin Toussaint

Excerpt:

At Modi’s rally on December 22, Delhi police used Automated Facial Recognition System (AFRS) software—which officials there acquired in 2018 as a tool to find and identify missing children—to screen the crowd for faces that match a database of people who have attended other protests around the city, and who officials said could be disruptive.

According to the Indian Express, Delhi police have long filmed these protest events, and the department announced Monday that officials fed that footage through AFRS. Sources told the Indian news outlet that once “identifiable faces” are extracted from that footage, a dataset will point out and retain “habitual protesters” and “rowdy elements.” That dataset was put to use at Modi’s rally to keep away “miscreants who could raise slogans or banners.”

 

From DSC:
Here in the United States…are we paying attention to today’s emerging technologies and collaboratively working to create a future dream — versus a future nightmare?! A vendor or organization might propose a beneficial use for its product or technology, and it might even live up to the hype at times…but then come other unintended uses and consequences of that technology. For example, in the article above, a technology that was supposed to be used to find and identify missing children (a benefit) was later used to identify protesters (an unintended consequence and, I might add, a nightmare in terms of such an expanded scope of use)!

Along these lines, the youth of today have every right to voice their opinions and to have a role in developing or torpedoing emerging technologies. What we build and put into place now will impact their lives big time!

 


© 2020 | Daniel Christian