Pushback is growing against automated proctoring services. But so is their use — from edsurge.com by Jeffrey R. Young

Excerpt:

Many students have pushed back, arguing that remote proctoring tools result in a serious invasion of privacy and create stress that can hinder academic performance. More than 60,000 students across the U.S. have signed petitions calling on their colleges to stop using automated proctoring tools, meaning that the technology has become arguably the most controversial tool of the pandemic at colleges.

From DSC:
We have an issue oftentimes within higher education — including graduate and professional schools as well — where the student and the professor aren’t always on the same team (or at least that’s the perception). To me, the professors need to be saying (and living out the message that), “We ARE on your team. We are working to help make you successful in the future that you have chosen for yourself. We’re here to help you…not monitor you.”

It’s like how I feel when I walk into so many public places these days (and even out on the roadways as well). When I walk into a store, it’s like the security cameras are whispering to me and to others: “We don’t trust you. Some of you have stolen in the past, so we’re going to carefully watch every single one of you. And we aren’t just going to watch you, we’re going to record you as well.”

The message? We don’t trust you.

This severely hampers the relationships involved.

And I’m sure that cheating is going on. But then, that makes me think that perhaps it’s time to change the way we assess students — and to help them see assessments as opportunities to learn, not to cheat. 

Lower the stakes. Offer tests more frequently. Provide more opportunities to practice recall. And be on their team.

 

“Especially given that these systems replicate and amplify the harms of structural racism and historical discrimination, which fall predominantly on Black, brown, and poor communities.”

From DSC:
Some serious fodder for thought in this article. I’d like to see #computerscience students and faculty members debate and/or weigh in on this type of topic.



Wrongfully accused by an algorithm — from nytimes.com by Kashmir Hill

Excerpt:

On a Thursday afternoon in January, Robert Julian-Borchak Williams was in his office at an automotive supply company when he got a call from the Detroit Police Department telling him to come to the station to be arrested. He thought at first that it was a prank.

An hour later, when he pulled into his driveway in a quiet subdivision in Farmington Hills, Mich., a police car pulled up behind, blocking him in. Two officers got out and handcuffed Mr. Williams on his front lawn, in front of his wife and two young daughters, who were distraught. The police wouldn’t say why he was being arrested, only showing him a piece of paper with his photo and the words “felony warrant” and “larceny.”

His wife, Melissa, asked where he was being taken. “Google it,” she recalls an officer replying.

“Is this you?” asked the detective.

The second piece of paper was a close-up. The photo was blurry, but it was clearly not Mr. Williams. He picked up the image and held it next to his face.

“No, this is not me,” Mr. Williams said. “You think all black men look alike?”

 

Also relevant/see:

What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias

 

PhotoVoice -- using photography to tell one's story -- from photovoice.org

 

Why photography?

A photograph is the quickest and easiest way for somebody to document the realities of their circumstances. Most people are familiar with photography to some degree, and it can be picked up relatively quickly by all abilities and ages.

Photography also crosses cultural and linguistic barriers, with its power lying in its dual role as both a form of creative expression and a way to document facts.

It provides an accessible way to describe realities, communicate perspectives, and raise awareness of social and global issues to different audiences.

Its relatively low cost and ease of dissemination encourage sharing, facilitating dialogue and discussion, even for those who have never picked up a camera before.

 

Addendum on 6/25/20

 

IBM, Amazon, and Microsoft abandon law enforcement face recognition market — from which-50.com by Andrew Birmingham

Excerpt:

Three global tech giants — IBM, Amazon, and Microsoft — have all announced that they will no longer sell their face recognition technology to police in the USA, though each announcement comes with its own nuance.

The new policy comes in the midst of ongoing national demonstrations in the US about police brutality and more generally the subject of racial inequality in the country under the umbrella of the Black Lives Matter movement.

From DSC:
While I didn’t read the fine print (so I don’t know all of the “nuances” they are referring to), I see this as good news indeed! Well done to whoever at those companies paused and thought…

 

…just because we can…



…doesn’t mean we should.

 


Addendum on 6/18/20:

  • Why Microsoft and Amazon are calling on Congress to regulate facial recognition tech — from finance.yahoo.com by Daniel Howley

    Excerpt:
    The technology, which can be used to identify suspects in things like surveillance footage, has faced widespread criticism after studies found it can be biased against women and people of color. And according to at least one expert, there needs to be some form of regulation put in place if these technologies are going to be used by law enforcement agencies.

    “If these technologies were to be deployed, I think you cannot do it in the absence of legislation,” Siddharth Garg, assistant professor of computer science and engineering at NYU Tandon School of Engineering, told Yahoo Finance.
 

From DSC:
As the technologies get more powerful, so do the ramifications/consequences. And here’s the kicker. Just like we’re seeing with trying to deal with systemic racism in our country…like it or not, it all comes back to the state of the hearts and minds out there. I’m not trying to be preachy or to pretend that I’m better than anyone. I’m a sinner. I know that full well. And I view all of us as equals trying to survive and to make our way in this tough-to-live-in world. But the unfortunate truth is that technologies are tools, and how the tools are used depends upon one’s heart.

Proverbs 4:23 New International Version (NIV) — from biblegateway.com

23 Above all else, guard your heart,
    for everything you do flows from it.

 

This startup is using AI to give workers a “productivity score” — from technologyreview.com by Will Douglas Heaven
Enaible is one of a number of new firms that are giving employers tools to help keep tabs on their employees—but critics fear this kind of surveillance undermines trust.

Excerpt:

In the last few months, millions of people around the world stopped going into offices and started doing their jobs from home. These workers may be out of sight of managers, but they are not out of mind. The upheaval has been accompanied by a reported spike in the use of surveillance software that lets employers track what their employees are doing and how long they spend doing it.

Companies have asked remote workers to install a whole range of such tools. Hubstaff is software that records users’ keyboard strokes, mouse movements, and the websites that they visit. Time Doctor goes further, taking videos of users’ screens. It can also take a picture via webcam every 10 minutes to check that employees are at their computer. And Isaak, a tool made by UK firm Status Today, monitors interactions between employees to identify who collaborates more, combining this data with information from personnel files to identify individuals who are “change-makers.”

Machine-learning algorithms also encode hidden bias in the data they are trained on. Such bias is even harder to expose when it’s buried inside an automated system. If these algorithms are used to assess an employee’s performance, it can be hard to appeal an unfair review or dismissal. 
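From DSC:
To make that last point a bit more concrete, here is a minimal, hypothetical sketch in Python (my own illustration; it is not Enaible’s or any vendor’s actual code). It trains a toy “performance score” model on historical ratings that were skewed against one group, then shows that skew resurfacing in the model’s predictions, where it is much harder to see and to appeal.

```python
# A minimal, hypothetical sketch of how bias hidden in historical training
# labels resurfaces in an automated "performance score" model.
# (Illustration only -- not any vendor's actual system.)
# Assumes numpy and scikit-learn are installed.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=42)
n = 5_000

# Feature 0: a genuine productivity signal; feature 1: group membership (0 or 1).
productivity = rng.normal(size=n)
group = rng.integers(0, 2, size=n)
X = np.column_stack([productivity, group])

# Historical labels: reviewers rated equally productive people in group 1 as
# "good" less often. The bias lives in the training data, not in the math.
y = (productivity - 0.8 * group + rng.normal(scale=0.5, size=n) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Two workers with identical productivity but different group membership
# now receive different predicted "scores" -- and nothing in the model's
# output explains why, which is what makes an unfair review hard to appeal.
probe = np.array([[0.5, 0.0], [0.5, 1.0]])
print(model.predict_proba(probe)[:, 1])  # the group-1 worker scores lower
```

Nothing in that code mentions the group directly in its scoring rule; the skew rides in silently on the training data, which is exactly why such bias is “even harder to expose when it’s buried inside an automated system.”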


From DSC:
I can’t help but reflect on how slippery the slope is when we start talking about using drones — especially as sponsored and used by governments, including our government here in the U.S. Consider the following from the Future Today Institute.

The Future Today Institute -- discussing the slippery slope of using drones

Excerpt:

Eyes in the sky
As nationwide racial justice protests continue, some journalists and protestors have noticed a new addition to the armed police officers and National Guard troops: a drone flying a hexagon-shaped route 20,000 feet above the streets in Minneapolis. The drone, flown by U.S. Customs and Border Protection, is called a Predator, and is a piece of military technology used for identifying and targeting terrorists overseas. Lately, it’s become a more common sight domestically.

Last month, a number of New Yorkers witnessed a drone floating above them, barking orders to follow social distancing guidelines. The mysterious drone wasn’t official police equipment, but rather a privately owned device piloted by a man named Xavier Arthur in Queens, who was frustrated that people weren’t following stay-at-home orders. He claimed to represent the “Anti-Covid-19 Volunteer Drone Task Force.”

It’s not an isolated incident. During the outbreak, drones have been used extensively to monitor residents and encourage them to stay indoors, to inspect traffic stops and hospitals, and to spray cities with disinfectants. In Paris and Mumbai, they’re patrolling social distancing violators. In China, a video clip went viral, showing a drone breaking up a mahjong game—residents had defied local orders that they stay indoors. Drones with infrared cameras also allegedly flew overhead and checked for people with fevers.

Advanced drones can pinpoint certain behaviors in crowds from high altitudes, recognize and automatically follow targets, and communicate with each other or to command centers on the ground with remarkable precision and low latency. The pandemic and protests are playing to the strengths of an emerging real-time aerial surveillance ecosystem.

3 Things You Should Know

  1. The Flying Internet of Things is taking off.
  2. New drones can self-destruct.
  3. Shareable drones may drive growth in the industry.
 

NYC classrooms cancel Zoom after trolls make ‘Zoombombing’ a thing — from thenextweb.com by Bryan Clark

From DSC:
I’m a sinner. I know that all too well. But I’m tagging this with ethics, morals, and matters of the heart because I don’t have a tag for “I’m bummed out that a solid technology gets ruined by a few bad apples.” It’s not the technology’s fault, per se. It has everything to do with the human heart…whether we like to admit/recognize it or not.

Also see:

“Zoombombing problems are more likely to occur for online events for which the link is circulated widely, or even posted on a public website.” (source)
 

From DSC:
Normally I don’t advertise or “plug” where I work, but as our society is in serious need of increasing Access to Justice (#A2J) & promoting diversity w/in the legal realm, I need to post this video.

 

From DSC:
My brother-in-law sent me the link to the video below. It’s a very solid interview about racism and what some solutions are to it. It offers some important lessons for us.

A heads up here: There’s some adult language in this piece — from the interviewer, not the interviewee (i.e., you know…several of those swear words that I’ve been trying since second grade to get rid of in my vocabulary! Sorry to report that I’ve not enjoyed too much success in that area. Thanks for your patience, LORD…the work/process continues).

While I have several pages worth of notes (because that’s just how I best process information and stay focused), I will just comment on a couple things:

* A 10-year-old boy has rocks thrown at him by adults and kids and rightfully asks, “Why are they doing this to me when they don’t even *know* me?!” That burning question led to a decades-long search, as Mr. Daryl Davis sought the answer to that excellent question.

* Years later, Daryl surmised this phenomenon was/is at play: unchecked ignorance leads to fear; unchecked fear leads to hatred; unchecked hatred leads to destruction. One of the best ways to stop this is via education and exposure to the truth — which we can get by being with and talking to/with each other. How true.
One of the best things my parents ever did was to move us from a predominantly white neighborhood and school setting to a far more diverse setting. Prior to the move, we used to hear (and likely believed it was true) that “There are all kinds of guns and knives at this junior high school and at this high school. Violence abounds there.” After moving and getting exposure to the people and learning environments at those schools, we realized that that fear was a lie…a lie born out of ignorance. The truth/reality was different from the lie/ignorance.
* Mr. Daryl Davis is an instrument of peace. He is:
  • Highly articulate
  • A multi-talented gentleman
  • A deep thinker
  • …and an eloquent communicator.

I thanked my brother-in-law for the link to the interview.


Also see:

Healing Racial Trauma: The Road to Resilience — from christianbook.com by Sheila Wise Rowe

Product Description
As a child, Sheila Wise Rowe was bused across town to a majority white school, where she experienced the racist lie that one group is superior to all others. This lie continues to be perpetuated today by the action or inaction of the government, media, viral videos, churches, and within families of origin. In contrast, Scripture declares that we are all fearfully and wonderfully made.

Rowe, a professional counselor, exposes the symptoms of racial trauma to lead readers to a place of freedom from the past and new life for the future. In each chapter, she includes an interview with a person of color to explore how we experience and resolve racial trauma. With Rowe as a reliable guide who has both been on the journey and shown others the way forward, you will find a safe pathway to resilience.


From DSC:
As some of you may know, I’m now working for the WMU-Thomas M. Cooley Law School. My faith gets involved here, but I believe that the LORD wanted me to get involved with:

  • Using technology to increase access to justice (#A2J)
  • Contributing to leveraging the science of learning for the long-term benefit of our students, faculty, and staff
  • Raising awareness regarding the potential pros and cons of today’s emerging technologies
  • Increasing the understanding that the legal realm has a looooong way to go to get (even somewhat) caught up with the impacts that such emerging technologies can/might have on us
  • Contributing and collaborating with others to help develop a positive future, not a negative one.

Along these lines…in regards to what’s been happening with law schools over the last few years, I wanted to share a couple of things:

1) An article from The Chronicle of Higher Education by Benjamin Barton:

The Law School Crash

 

2) A response from our President and Dean, James McGrath: Repositioning a Law School for the New Normal

 

From DSC:
I also wanted to personally say that I arrived at WMU-Cooley Law School in 2018, and have been learning a lot there (which I love about my job!).  Cooley employees are very warm, welcoming, experienced, knowledgeable, and professional. Everyone there is mission-driven. My boss, Chris Church, is multi-talented and excellent. Cooley has a great administrative/management team as well.

There have been many exciting, new things happening there. But that said, it will take time before we see the results of these changes. Perseverance and innovation will be key ingredients in crafting a modern legal education — especially in an industry that is only now beginning to offer online-based courses at the Juris Doctor (J.D.) level (i.e., roughly 20 years after online courses began appearing within undergraduate higher education).

My point in posting this is to say that we should ALL care about what’s happening within the legal realm!  We are all impacted by it, whether we realize it or not. We are all in this together and no one is an island — not as individuals, and not as organizations.

We need:

  • Far more diversity within the legal field
  • More technical expertise within the legal realm — not only with lawyers, but also with legislators, senators, representatives, judges, and others
  • Greater use of teams of specialists within the legal field
  • To offer more courses regarding emerging technologies — and not only for the legal practices themselves but also for society at large.
  • To be far more vigilant in crafting a positive world to be handed down to our kids and grandkids — a dream, not a nightmare. Just because we can, doesn’t mean we should.

Still not convinced that you should care? Here are some things on the CURRENT landscapes:

  • You go to drop something off at your neighbor’s house. They have a camera that gets activated.  What facial recognition database are you now on? Did you give your consent to that? No, you didn’t.
  • Because you posted your photo on Facebook, YouTube, Venmo, and/or on millions of other websites, your face could be in Clearview AI’s database. Did you give your consent to that occurring? No, you didn’t.
  • You’re at the airport and facial recognition is used instead of a passport. Whose database was that from and what gets shared? Did you give your consent to that occurring? Probably not, and it’s not easy to opt-out either.
  • Numerous types of drones, delivery bots, and more are already coming onto the scene. What will the sidewalks, streets, and skies look like — and sound like — in your neighborhood in the near future? Is that how you want it? Did you give your consent to that happening? No, you didn’t.
  • …and on and on it goes.

Addendum — speaking of islands!

Palantir CEO: Silicon Valley can’t be on ‘Palo Alto island’ — Big Tech must play by the rules — from cnbc.com by Jessica Bursztynsky

Excerpt:

Palantir Technologies co-founder and CEO Alex Karp said Thursday the core problem in Silicon Valley is the attitude among tech executives that they want to be separate from United States regulation.

“You cannot create an island called Palo Alto Island,” said Karp, who suggested tech leaders would rather govern themselves. “What Silicon Valley really wants is the canton of Palo Alto. We have the United States of America, not the ‘United States of Canton,’ one of which is Palo Alto. That must change.”

“Consumer tech companies, not Apple, but the other ones, have basically decided we’re living on an island and the island is so far removed from what’s called the United States in every way, culturally, linguistically and in normative ways,” Karp added.


From DSC:
I’ll say it again, just because we can, doesn’t mean we should.

From the article below…we can see another unintended consequence developing on society’s landscapes. I really wish the 20- and 30-somethings being hired by the big tech companies — especially at Amazon, Facebook, Google, Apple, and Microsoft — who are developing these things would ask themselves:

  • “Just because we can develop this system/software/application/etc., SHOULD we be developing it?”
  • What might the negative consequences be? 
  • Do the positive contributions outweigh the negative impacts…or not?

To college professors and teachers:
Please pass these thoughts on to your students now, so that this internal questioning and these conversations begin to take place in K-16.


Report: Colleges Must Teach ‘Algorithm Literacy’ to Help Students Navigate Internet — from edsurge.com by Rebecca Koenig

Excerpt (emphasis DSC):

If the Ancient Mariner were sailing on the internet’s open seas, he might conclude there’s information everywhere, but nary a drop to drink.

That’s how many college students feel, anyway. A new report published this week about undergraduates’ impressions of internet algorithms reveals students are skeptical of and unnerved by tools that track their digital travels and serve them personalized content like advertisements and social media posts.

And some students feel like they’ve largely been left to navigate the internet’s murky waters alone, without adequate guidance from teachers and professors.

Researchers set out to learn “how aware students are about their information being manipulated, gathered and interacted with,” said Alison Head, founder and director of Project Information Literacy, in an interview with EdSurge. “Where does that awareness drop off?”

They found that many students not only have personal concerns about how algorithms compromise their own data privacy but also recognize the broader, possibly negative implications of tools that segment and customize search results and news feeds.
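From DSC:
To help make “algorithm literacy” concrete, here is a toy sketch in Python (my own illustration, not something from the report) of the basic personalization loop the students are describing: the system builds an interest profile from your tracked clicks, then reorders what you see. All of the article names and topic tags below are hypothetical.

```python
# A toy, hypothetical sketch of feed personalization: rank articles by their
# overlap with a user's tracked click history, so that two users looking at
# the same catalog see different "top" results. (Illustration only.)
from collections import Counter

# Hypothetical articles, each tagged with topics.
ARTICLES = {
    "campus-sports-recap": {"sports", "campus"},
    "tuition-policy-debate": {"policy", "tuition"},
    "ai-bias-explainer": {"ai", "policy"},
}

def rank_for(click_history):
    """Score each article by how many of its topics the user clicked before."""
    profile = Counter(tag for item in click_history for tag in ARTICLES[item])
    return sorted(ARTICLES,
                  key=lambda name: sum(profile[t] for t in ARTICLES[name]),
                  reverse=True)

# The same catalog, two different users, two different feeds.
print(rank_for(["campus-sports-recap"]))    # a sports reader sees sports first
print(rank_for(["tuition-policy-debate"]))  # a policy reader sees policy first
```

Even this tiny toy produces two different “front pages” for two different users, which is exactly the kind of segmenting and customizing of search results and news feeds that the report wants students to be able to recognize.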

 