The Chegg situation is worse than you think — from eliterate.us by Michael Feldstein

Excerpts:

Forbes just ran an article entitled “This $12 Billion Company Is Getting Rich Off Students Cheating Their Way Through Covid.”

Ouch.


[Per Michael] To sum up:

  • Publishers, after selling expensive textbooks to students, sold the answers to the homework questions in those expensive books to Chegg.
  • Chegg sells the answers to the questions to the students, who often use them to cheat.
  • To combat this problem, universities pay for proctoring software, which is apparently more effective at preventing students from going to the bathroom than it is at preventing cheating.
  • To add insult to all of this injury, “to chegg” is now apparently a verb in the English language. We will all have to live with that linguistic violence.

Addendum on 2/9/21:

 

Could 2021 Be The Year Of Civil Justice Reform? — from law360.com by Cara Bayles

Excerpt:

Now, any tenant in Boulder, regardless of means, can get an attorney to represent them in housing court for free. While a handful of other cities like New York and San Francisco offer similar programs, there is no universal right to an attorney in eviction cases.

Justice advocates are hoping that could change. COVID-19, they say, has drawn attention to access to justice issues that have plagued civil proceedings for years. They hope the tragedies of 2020 can fuel reform in 2021.

Six months into the pandemic, millions of Americans had fallen behind on rent. One statistic that civil justice advocates have known for years became apparent — that while 90% of landlords have legal counsel in eviction proceedings, only 10% of tenants do.

 

 

From DSC:
An interesting, more positive use of AI here:

Deepdub uses AI to dub movies in the voice of famous actors — from protocol.com by Janko Roettgers
Fresh out of stealth, the startup is using artificial intelligence to automate the localization process for global streaming.

Excerpt:

Tel Aviv-based startup Deepdub wants to help streaming services accelerate this kind of international rollout by using artificial intelligence for their localization needs. Deepdub, which came out of stealth on Wednesday, has built technology that can translate a voice track to a different language, all while staying true to the voice of the talent. This makes it possible to have someone like Morgan Freeman narrate a movie in French, Italian or Russian without losing what makes Freeman’s voice special and recognizable.

From DSC:
A much more negative use of AI here:


 

 

The Digital Divide for Tribal College Students — COVID, CARES Act, and Critical Next Steps — from diverseeducation.com

Excerpt:

In this episode, staff writer Sara Weissman shares a story about the digital divide for Native Americans, bringing in the voices of tribal college leaders and their students during the COVID-19 pandemic.

Many don’t know it, but Native American colleges and universities have long struggled with the worst internet connectivity in the nation while, ironically, paying the highest rates for service. Hear first-hand how students from Diné College and other institutions are currently affected. Carrie Billie (Big Water Clan), President & CEO of the American Indian Higher Education Consortium (AIHEC), and Dr. Cynthia Lindquist (Star Horse Woman), President of Cankdeska Cikana Community College in North Dakota, break down the data and lay out the critical next steps necessary to address the digital divide.

Many don’t know it, but Native American colleges and universities have long struggled with the worst internet connectivity in the nation while, ironically, paying the highest rates for service.

From DSC:
When will there be justice!? Let’s join in and make amends and provide the funding, concrete assistance, products, and services to Native American colleges, universities, and communities. Some potential ideas:

  • For the short term, could there be Loon balloons deployed immediately to provide free and stronger access to the Internet?


  • Could our Federal Government make amends and do the right thing here? (e-rate program, put Internet access in, make policy changes, offer more grants, other?)
  • Could Silicon Valley assist with hardware and software? For example:
    • Can Apple, HP, Microsoft, and others donate hardware and software?
    • Can Zoom, Adobe, Cisco Webex, Microsoft Teams, and others donate whatever these communities need to provide videoconferencing licenses?
  • Could telecom providers provide free internet access?
  • Could MOOCs offer more free courses?
  • Could furniture makers such as Steelcase, Herman Miller, and others donate furniture and help establish connected learning spaces?
  • How might faculty members and staff within higher education contribute?
  • How could churches, synagogues, and such get involved?
  • Could the rest of us locate and donate to charities that aim to provide concrete assistance to Native American schools, colleges, universities, and communities?

We need to do the right thing here. This is another area* where our nation can do much better.

* Here’s another example/area where we can do much better and make amends/changes.

 


Addendum on 12/7/20:

 

Pushback is growing against automated proctoring services. But so is their use — from edsurge.com by Jeffrey R. Young

Excerpt:

Many students have pushed back, arguing that remote proctoring tools result in a serious invasion of privacy and create stress that can hinder academic performance. More than 60,000 students across the U.S. have signed petitions calling on their colleges to stop using automated proctoring tools, meaning that the technology has become arguably the most controversial tool of the pandemic at colleges.

From DSC:
We often have an issue within higher education — including graduate and professional schools — where the student and the professor aren’t on the same team (or at least that’s the perception). To me, professors need to be saying (and living out the message that), “We ARE on your team. We are working to help make you successful in the future that you have chosen for yourself. We’re here to help you…not monitor you.”

It’s like how I feel when I walk into so many public places these days (and even out on the roadways as well). When I walk into a store, it’s as if the security cameras are whispering to me and to others: “We don’t trust you. Some of you have stolen in the past, so we’re going to carefully watch every single one of you. And we aren’t just going to watch you; we’re going to record you as well.”

The message? We don’t trust you.

This severely hampers the relationships involved.

And I’m sure that cheating is going on. But then, that makes me think that perhaps it’s time to change the way we assess students — and to help them see assessments as opportunities to learn, not to cheat. 

Lower the stakes. Offer tests more frequently. Provide more opportunities to practice recall. And be on their team.

 

From DSC:
As the technologies get more powerful, so do the ramifications/consequences. And here’s the kicker: just as we’re seeing with trying to deal with systemic racism in our country, like it or not, it all comes back to the state of the hearts and minds out there. I’m not trying to be preachy or to pretend that I’m better than anyone. I’m a sinner. I know that full well. And I view all of us as equals trying to survive and make our way in this tough-to-live-in world. But the unfortunate truth is that technologies are tools, and how the tools are used depends upon one’s heart.

Proverbs 4:23 New International Version (NIV) — from biblegateway.com

23 Above all else, guard your heart,
    for everything you do flows from it.

 

This startup is using AI to give workers a “productivity score” — from technologyreview.com by Will Douglas Heaven
Enaible is one of a number of new firms that are giving employers tools to help keep tabs on their employees—but critics fear this kind of surveillance undermines trust.

Excerpt:

In the last few months, millions of people around the world stopped going into offices and started doing their jobs from home. These workers may be out of sight of managers, but they are not out of mind. The upheaval has been accompanied by a reported spike in the use of surveillance software that lets employers track what their employees are doing and how long they spend doing it.

Companies have asked remote workers to install a whole range of such tools. Hubstaff is software that records users’ keyboard strokes, mouse movements, and the websites that they visit. Time Doctor goes further, taking videos of users’ screens. It can also take a picture via webcam every 10 minutes to check that employees are at their computer. And Isaak, a tool made by UK firm Status Today, monitors interactions between employees to identify who collaborates more, combining this data with information from personnel files to identify individuals who are “change-makers.”

Machine-learning algorithms also encode hidden bias in the data they are trained on. Such bias is even harder to expose when it’s buried inside an automated system. If these algorithms are used to assess an employee’s performance, it can be hard to appeal an unfair review or dismissal. 

 

 

From DSC:
I can’t help but reflect on how slippery the slope is when we start talking about using drones — especially as sponsored and used by governments, including our own here in the U.S. Consider the following from the Future Today Institute.

The Future Today Institute -- discussing the slippery slope of using drones

Excerpt:

Eyes in the sky
As nationwide racial justice protests continue, some journalists and protestors have noticed a new addition to the armed police officers and National Guard troops: a drone flying a hexagon-shaped route 20,000 feet above the streets in Minneapolis. The drone, flown by U.S. Customs and Border Protection, is called a Predator, and is a piece of military technology used for identifying and targeting terrorists overseas. Lately, it’s become a more common sight domestically.

Last month, a number of New Yorkers witnessed a drone floating above them, barking orders to follow social distancing guidelines. The mysterious drone wasn’t official police equipment, but rather a privately owned device piloted by a man named Xavier Arthur in Queens, who was frustrated that people weren’t following stay-at-home orders. He claimed to represent the “Anti-Covid-19 Volunteer Drone Task Force.”

It’s not an isolated incident. During the outbreak, drones have been used extensively to monitor residents and encourage them to stay indoors, to inspect traffic stops and hospitals, and to spray cities with disinfectants. In Paris and Mumbai, they’re patrolling social distancing violators. In China, a video clip went viral, showing a drone breaking up a mahjong game—residents had defied local orders that they stay indoors. Drones with infrared cameras also allegedly flew overhead and checked for people with fevers.

Advanced drones can pinpoint certain behaviors in crowds from high altitudes, recognize and automatically follow targets, and communicate with each other or to command centers on the ground with remarkable precision and low latency. The pandemic and protests are playing to the strengths of an emerging real-time aerial surveillance ecosystem.

3 Things You Should Know

  1. The Flying Internet of Things is taking off.
  2. New drones can self-destruct.
  3. Shareable drones may drive growth in the industry.
 

NYC classrooms cancel Zoom after trolls make ‘Zoombombing’ a thing — from thenextweb.com by Bryan Clark

From DSC:
I’m a sinner. I know that all too well. But I’m tagging this with ethics, morals, and matters of the heart because I don’t have a tag for “I’m bummed out that a solid technology gets ruined by a few bad apples.” It’s not the technology’s fault, per se. It has everything to do with the human heart…whether we like to admit/recognize it or not.

Also see:

“Zoombombing problems are more likely to occur for online events for which the link is circulated widely, or even posted on a public website.” (source)
 

Why education is a ‘wicked problem’ for learning engineers to solve — from edsurge.com by Rebecca Koenig

Excerpts (emphasis DSC):

So, back to the wicked problem: How do we make education that’s both quality education and at the same time accessible and affordable?

“Now, we are building a new technology that we call Agent Smith. It’s another AI technology— and we’re very excited about it—that builds [a] Jill Watson for you. And Agent Smith can build a Jill Watson for you in less than 10 percent of the hours.”

So one question for online education is, can we build a new set of tools—and I think that’s where AI is going to go, that learning engineering is going to go—where AI is not helping individual humans as much as AI is helping human-human interaction.

Huge ethical issues and something that learning engineering has not yet started focusing on in a serious manner. We are still in a phase of, “Look ma, no hands, I can ride a bike without hands.”

Technology should not be left to technologists.

Learning from the living class room

 

From DSC:
Normally I don’t advertise or “plug” where I work, but as our society is in serious need of increasing Access to Justice (#A2J) & promoting diversity w/in the legal realm, I need to post this video.

 

From DSC:
My brother-in-law sent me the link to the video below. It’s a very solid interview about racism and what some solutions are to it. It offers some important lessons for us.

A heads up here: There’s some adult language in this piece — from the interviewer, not the interviewee (i.e., you know…several of those swear words that I’ve been trying since second grade to get rid of in my vocabulary! Sorry to report that I haven’t enjoyed much success in that area. Thanks for your patience, LORD…the work/process continues).

While I have several pages worth of notes (because that’s just how I best process information and stay focused), I will just comment on a couple things:

* A 10-year-old boy has rocks thrown at him by adults and kids and rightfully asks, “Why are they doing this to me when they don’t even *know* me?!” That burning question led Mr. Daryl Davis on a decades-long search for the answer to that excellent question.

* Years later, Daryl surmised that this phenomenon was/is at play: unchecked ignorance leads to fear; unchecked fear leads to hatred; unchecked hatred leads to destruction. One of the best ways to stop this cycle is through education and exposure to the truth — which we can get by being with and talking with one another. How true.
One of the best things my parents ever did was to move us from a predominantly white neighborhood and school setting to a far more diverse setting. Prior to the move, we used to hear (and likely believed) that “There are all kinds of guns and knives at this junior high school and at this high school. Violence abounds there.” After moving and getting to know the people and learning environments at those schools, we realized that that fear was a lie…a lie born of ignorance. The truth/reality was different from the lie/ignorance.
* Mr. Daryl Davis is an instrument of peace. He is:
  • Highly articulate
  • A multi-talented gentleman
  • A deep thinker
  • …and an eloquent communicator.

I thanked my brother-in-law for the link to the interview.


Also see:

Healing Racial Trauma: The Road to Resilience— from christianbook.com by Sheila Wise Rowe

Product Description
As a child, Sheila Wise Rowe was bused across town to a majority white school, where she experienced the racist lie that one group is superior to all others. This lie continues to be perpetuated today by the action or inaction of the government, media, viral videos, churches, and within families of origin. In contrast, Scripture declares that we are all fearfully and wonderfully made.

Rowe, a professional counselor, exposes the symptoms of racial trauma to lead readers to a place of freedom from the past and new life for the future. In each chapter, she includes an interview with a person of color to explore how we experience and resolve racial trauma. With Rowe as a reliable guide who has both been on the journey and shown others the way forward, you will find a safe pathway to resilience.

 

 

From DSC:
As some of you may know, I’m now working for the WMU-Thomas M. Cooley Law School. My faith gets involved here, but I believe that the LORD wanted me to get involved with:

  • Using technology to increase access to justice (#A2J)
  • Contributing to leveraging the science of learning for the long-term benefit of our students, faculty, and staff
  • Raising awareness regarding the potential pros and cons of today’s emerging technologies
  • Increasing the understanding that the legal realm has a looooong way to go to get (even somewhat) caught up with the impacts that such emerging technologies can/might have on us.
  • Contributing and collaborating with others to help develop a positive future, not a negative one.

Along these lines…in regards to what’s been happening with law schools over the last few years, I wanted to share a couple of things:

1) An article from The Chronicle of Higher Education by Benjamin Barton:

The Law School Crash

 

2) A response from our President and Dean, James McGrath: Repositioning a Law School for the New Normal

 

From DSC:
I also wanted to personally say that I arrived at WMU-Cooley Law School in 2018, and have been learning a lot there (which I love about my job!).  Cooley employees are very warm, welcoming, experienced, knowledgeable, and professional. Everyone there is mission-driven. My boss, Chris Church, is multi-talented and excellent. Cooley has a great administrative/management team as well.

There have been many exciting, new things happening there. But that said, it will take time before we see the results of these changes. Perseverance and innovation will be key ingredients in crafting a modern legal education — especially in an industry that is only now beginning to offer online-based courses at the Juris Doctor (J.D.) level (i.e., roughly 20 years after this began occurring within undergraduate higher education).

My point in posting this is to say that we should ALL care about what’s happening within the legal realm!  We are all impacted by it, whether we realize it or not. We are all in this together and no one is an island — not as individuals, and not as organizations.

We need:

  • Far more diversity within the legal field
  • More technical expertise within the legal realm — not only with lawyers, but with legislators, senators, representatives, judges, others
  • Greater use of teams of specialists within the legal field
  • To offer more courses regarding emerging technologies — and not only for the legal practices themselves but also for society at large.
  • To be far more vigilant in crafting a positive world to be handed down to our kids and grandkids — a dream, not a nightmare. Just because we can, doesn’t mean we should.

Still not convinced that you should care? Here are some things on the CURRENT landscapes:

  • You go to drop something off at your neighbor’s house. They have a camera that gets activated. What facial recognition database are you now in? Did you give your consent to that? No, you didn’t.
  • Because you posted your photo on Facebook, YouTube, Venmo, and/or millions of other websites, your face could be in Clearview AI’s database. Did you give your consent to that occurring? No, you didn’t.
  • You’re at the airport and facial recognition is used instead of a passport. Whose database was that from and what gets shared? Did you give your consent to that occurring? Probably not, and it’s not easy to opt-out either.
  • Numerous types of drones, delivery bots, and more are already coming onto the scene. What will the sidewalks, streets, and skies look like — and sound like — in your neighborhood in the near future? Is that how you want it? Did you give your consent to that happening? No, you didn’t.
  • …and on and on it goes.

Addendum — speaking of islands!

Palantir CEO: Silicon Valley can’t be on ‘Palo Alto island’ — Big Tech must play by the rules — from cnbc.com by Jessica Bursztynsky

Excerpt:

Palantir Technologies co-founder and CEO Alex Karp said Thursday the core problem in Silicon Valley is the attitude among tech executives that they want to be separate from United States regulation.

“You cannot create an island called Palo Alto Island,” said Karp, who suggested tech leaders would rather govern themselves. “What Silicon Valley really wants is the canton of Palo Alto. We have the United States of America, not the ‘United States of Canton,’ one of which is Palo Alto. That must change.”

“Consumer tech companies, not Apple, but the other ones, have basically decided we’re living on an island and the island is so far removed from what’s called the United States in every way, culturally, linguistically and in normative ways,” Karp added.

 

 

From DSC:
I’ll say it again, just because we can, doesn’t mean we should.

From the article below…we can see another unintended consequence developing on society’s landscapes. I really wish the 20- and 30-somethings being hired by the big tech companies — especially at Amazon, Facebook, Google, Apple, and Microsoft — who are developing these things would ask themselves:

  • “Just because we can develop this system/software/application/etc., SHOULD we be developing it?”
  • What might the negative consequences be? 
  • Do the positive contributions outweigh the negative impacts…or not?

To college professors and teachers:
Please pass these thoughts on to your students now, so that this internal questioning and these conversations begin to take place in K-16.


Report: Colleges Must Teach ‘Algorithm Literacy’ to Help Students Navigate Internet — from edsurge.com by Rebecca Koenig

Excerpt (emphasis DSC):

If the Ancient Mariner were sailing on the internet’s open seas, he might conclude there’s information everywhere, but nary a drop to drink.

That’s how many college students feel, anyway. A new report published this week about undergraduates’ impressions of internet algorithms reveals students are skeptical of and unnerved by tools that track their digital travels and serve them personalized content like advertisements and social media posts.

And some students feel like they’ve largely been left to navigate the internet’s murky waters alone, without adequate guidance from teachers and professors.

Researchers set out to learn “how aware students are about their information being manipulated, gathered and interacted with,” said Alison Head, founder and director of Project Information Literacy, in an interview with EdSurge. “Where does that awareness drop off?”

They found that many students not only have personal concerns about how algorithms compromise their own data privacy but also recognize the broader, possibly negative implications of tools that segment and customize search results and news feeds.

 
© 2021 | Daniel Christian