The Digital Divide for Tribal College Students — COVID, CARES Act, and Critical Next Steps — from diverseeducation.com

Excerpt:

In this episode, staff writer Sara Weissman shares a story that focuses on the digital divide for Native Americans, bringing in the voices of tribal college leaders and their students during the COVID-19 pandemic.

Many don’t know it, but Native American colleges and universities have long struggled with the worst internet connectivity in the nation while, ironically, paying the highest rates for service. Hear first-hand how students from Diné College and other institutions are currently affected. Carrie Billie (Big Water Clan), President & CEO of the American Indian Higher Education Consortium (AIHEC), and Dr. Cynthia Lindquist (Star Horse Woman), President of Cankdeska Cikana Community College in North Dakota, break down the data and lay out the critical next steps necessary to address the digital divide.


From DSC:
When will there be justice!? Let’s join in and make amends and provide the funding, concrete assistance, products, and services to Native American colleges, universities, and communities. Some potential ideas:

  • For the short term, could there be Loon balloons deployed immediately to provide free and stronger access to the Internet?

Could Project Loon assist Native American colleges, universities, and communities?

  • Could our Federal Government make amends and do the right thing here? (e-rate program, put Internet access in, make policy changes, offer more grants, other?)
  • Could Silicon Valley assist with hardware and software? For example:
    • Can Apple, HP, Microsoft, and others donate hardware and software?
    • Can Zoom, Adobe, Cisco Webex, Microsoft Teams, and others donate the videoconferencing licenses these communities need?
  • Could telecom providers provide free internet access?
  • Could MOOCs offer more free courses?
  • Could furniture makers such as Steelcase, Herman Miller, and others donate furniture and help establish connected learning spaces?
  • How might faculty members and staff within higher education contribute?
  • How could churches, synagogues, and such get involved?
  • Could the rest of us locate and donate to charities that aim to provide concrete assistance to Native American schools, colleges, universities, and communities?

We need to do the right thing here. This is another area* where our nation can do much better.

* Here’s another example/area where we can do much better and make amends/changes.

 

 

Pushback is growing against automated proctoring services. But so is their use — from edsurge.com by Jeffrey R. Young

Excerpt:

Many students have pushed back, arguing that remote proctoring tools result in a serious invasion of privacy and create stress that can hinder academic performance. More than 60,000 students across the U.S. have signed petitions calling on their colleges to stop using automated proctoring tools, making the technology arguably the most controversial tool of the pandemic at colleges.

From DSC:
We oftentimes have an issue within higher education — including graduate and professional schools — where the student and the professor aren’t always on the same team (or at least that’s the perception). To me, the professors need to be saying (and living out) the message: “We ARE on your team. We are working to help make you successful in the future that you have chosen for yourself. We’re here to help you…not monitor you.”

It’s like the feeling I get when I walk into so many public places these days (and even out on the roadways as well). When I walk into a store, it’s as if the security cameras are whispering to me and to others: “We don’t trust you. Some of you have stolen in the past, so we’re going to carefully watch every single one of you. And we aren’t just going to watch you, we’re going to record you as well.”

The message? We don’t trust you.

This severely hampers the relationships involved.

And I’m sure that cheating is going on. But then, that makes me think that perhaps it’s time to change the way we assess students — and to help them see assessments as opportunities to learn, not to cheat. 

Lower the stakes. Offer tests more frequently. Provide more opportunities to practice recall. And be on their team.

 

From DSC:
As the technologies get more powerful, so do the ramifications/consequences. And here’s the kicker. Just like we’re seeing with trying to deal with systemic racism in our country…like it or not, it all comes back to the state of the hearts and minds out there. I’m not trying to be preachy or to pretend that I’m better than anyone. I’m a sinner. I know that full well. And I view all of us as equals trying to survive and to make our way in this tough-to-live-in world. But the unfortunate truth is that technologies are tools, and how the tools are used depends upon one’s heart.

Proverbs 4:23 New International Version (NIV) — from biblegateway.com

23 Above all else, guard your heart,
    for everything you do flows from it.

 

This startup is using AI to give workers a “productivity score” — from technologyreview.com by Will Douglas Heaven
Enaible is one of a number of new firms that are giving employers tools to help keep tabs on their employees—but critics fear this kind of surveillance undermines trust.

Excerpt:

In the last few months, millions of people around the world stopped going into offices and started doing their jobs from home. These workers may be out of sight of managers, but they are not out of mind. The upheaval has been accompanied by a reported spike in the use of surveillance software that lets employers track what their employees are doing and how long they spend doing it.

Companies have asked remote workers to install a whole range of such tools. Hubstaff is software that records users’ keyboard strokes, mouse movements, and the websites that they visit. Time Doctor goes further, taking videos of users’ screens. It can also take a picture via webcam every 10 minutes to check that employees are at their computer. And Isaak, a tool made by UK firm Status Today, monitors interactions between employees to identify who collaborates more, combining this data with information from personnel files to identify individuals who are “change-makers.”

Machine-learning algorithms also encode hidden bias in the data they are trained on. Such bias is even harder to expose when it’s buried inside an automated system. If these algorithms are used to assess an employee’s performance, it can be hard to appeal an unfair review or dismissal. 
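
From DSC:
To make that proxy problem concrete, here is a deliberately tiny sketch in Python/scikit-learn. Everything in it (the data, the feature names, the numbers) is synthetic and assumed purely for illustration: a model trained on biased historical reviews reproduces that bias even after the sensitive attribute is dropped, because a correlated proxy feature carries it in.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000
    group = rng.integers(0, 2, n)          # sensitive attribute (synthetic)
    skill = rng.normal(0, 1, n)            # true, job-relevant signal
    proxy = group + rng.normal(0, 0.5, n)  # innocuous-looking feature correlated with group

    # Biased historical labels: past reviewers penalized group 1.
    past_review = (skill - 0.8 * group + rng.normal(0, 0.5, n)) > 0

    # Train WITHOUT the sensitive attribute; the bias persists via the proxy.
    X = np.column_stack([skill, proxy])
    model = LogisticRegression().fit(X, past_review)

    for g in (0, 1):
        mean_score = model.predict_proba(X[group == g])[:, 1].mean()
        print(f"group {g}: mean predicted score {mean_score:.2f}")

    # Both groups are equally skilled on average, yet group 1 scores lower:
    # the model has learned the reviewers' bias from the proxy feature.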

 

 

From DSC:
I can’t help but reflect on how slippery the slope is when we start talking about using drones — especially as sponsored and used by governments, including our government here in the U.S. Consider the following from the Future Today Institute.

The Future Today Institute -- discussing the slippery slope of using drones

Excerpt:

Eyes in the sky
As nationwide racial justice protests continue, some journalists and protestors have noticed a new addition to the armed police officers and National Guard troops: a drone flying a hexagon-shaped route 20,000 feet above the streets in Minneapolis. The drone, flown by U.S. Customs and Border Protection, is called a Predator, and is a piece of military technology used for identifying and targeting terrorists overseas. Lately, it’s become a more common sight domestically.

Last month, a number of New Yorkers witnessed a drone floating above them, barking orders to follow social distancing guidelines. The mysterious drone wasn’t official police equipment, but rather a privately owned device piloted by a man named Xavier Arthur in Queens, who was frustrated that people weren’t following stay-at-home orders. He claimed to represent the “Anti-Covid-19 Volunteer Drone Task Force.”

It’s not an isolated incident. During the outbreak, drones have been used extensively to monitor residents and encourage them to stay indoors, to inspect traffic stops and hospitals, and to spray cities with disinfectants. In Paris and Mumbai, they’re patrolling for social-distancing violators. In China, a video clip went viral, showing a drone breaking up a mahjong game—residents had defied local orders that they stay indoors. Drones with infrared cameras also allegedly flew overhead and checked for people with fevers.

Advanced drones can pinpoint certain behaviors in crowds from high altitudes, recognize and automatically follow targets, and communicate with each other or to command centers on the ground with remarkable precision and low latency. The pandemic and protests are playing to the strengths of an emerging real-time aerial surveillance ecosystem.

3 Things You Should Know

  1. The Flying Internet of Things is taking off.
  2. New drones can self-destruct.
  3. Shareable drones may drive growth in the industry.
 

NYC classrooms cancel Zoom after trolls make ‘Zoombombing’ a thing — from thenextweb.com by Bryan Clark

From DSC:
I’m a sinner. I know that all too well. But I’m tagging this with ethics, morals, and matters of the heart because I don’t have a tag for “I’m bummed out that a solid technology gets ruined by a few bad apples.” It’s not the technology’s fault, per se. It has everything to do with the human heart…whether we like to admit/recognize it or not.

Also see:

“Zoombombing problems are more likely to occur for online events for which the link is circulated widely, or even posted on a public website.” (source)
 

Why education is a ‘wicked problem’ for learning engineers to solve — from edsurge.com by Rebecca Koenig

Excerpts (emphasis DSC):

So, back to the wicked problem: How do we make education that’s both quality education and at the same time accessible and affordable?

“Now, we are building a new technology that we call Agent Smith. It’s another AI technology—and we’re very excited about it—that builds [a] Jill Watson for you. And Agent Smith can build a Jill Watson for you in less than 10 percent of the hours.”

So one question for online education is, can we build a new set of tools—and I think that’s where AI is going to go, that learning engineering is going to go—where AI is not helping individual humans as much as AI is helping human-human interaction.

Huge ethical issues and something that learning engineering has not yet started focusing on in a serious manner. We are still in a phase of, “Look ma, no hands, I can ride a bike without hands.”

Technology should not be left to technologists.

Learning from the Living [Class] Room

 

From DSC:
Normally I don’t advertise or “plug” where I work, but as our society is in serious need of increasing Access to Justice (#A2J) & promoting diversity w/in the legal realm, I need to post this video.

 

From DSC:
My brother-in-law sent me the link to the video below. It’s a very solid interview about racism and what some solutions are to it. It offers some important lessons for us.

A heads up here: There’s some adult language in this piece — from the interviewer, not the interviewee (i.e., you know…several of those swear words that I’ve been trying since second grade to get rid of in my vocabulary! Sorry to report that I’ve not enjoyed too much success in that area. Thanks for your patience, LORD…the work/process continues).

While I have several pages worth of notes (because that’s just how I best process information and stay focused), I will just comment on a couple things:

* A 10-year-old boy has rocks thrown at him by adults and kids and rightfully asks, “Why are they doing this to me when they don’t even *know* me?!” That burning question led Mr. Daryl Davis on a decades-long search for the answer to that excellent question.

* Years later, Daryl surmised that this phenomenon was/is at play: unchecked ignorance leads to fear; unchecked fear leads to hatred; unchecked hatred leads to destruction. One of the best ways to stop this progression is via education and exposure to the truth — which we can get by being with and talking to/with each other. How true.
One of the best things my parents ever did was to move us from a predominantly white neighborhood and school setting to a far more diverse setting. Prior to the move, we used to hear (and likely believed) that “there are all kinds of guns and knives at this junior high school and at this high school. Violence abounds there.” After moving and getting exposure to the people and learning environments at those schools, we realized that that fear was a lie…a lie born out of ignorance. The truth/reality was different from the lie/ignorance.
* Mr. Daryl Davis is an instrument of peace. He is:
  • Highly articulate
  • A multi-talented gentleman
  • A deep thinker
  • …and an eloquent communicator.

I thanked my brother-in-law for the link to the interview.


Also see:

Healing Racial Trauma: The Road to Resilience — from christianbook.com by Sheila Wise Rowe

Product Description
As a child, Sheila Wise Rowe was bused across town to a majority white school, where she experienced the racist lie that one group is superior to all others. This lie continues to be perpetuated today by the action or inaction of the government, media, viral videos, churches, and within families of origin. In contrast, Scripture declares that we are all fearfully and wonderfully made.

Rowe, a professional counselor, exposes the symptoms of racial trauma to lead readers to a place of freedom from the past and new life for the future. In each chapter, she includes an interview with a person of color to explore how we experience and resolve racial trauma. With Rowe as a reliable guide who has both been on the journey and shown others the way forward, you will find a safe pathway to resilience.

 

 

From DSC:
As some of you may know, I’m now working for the WMU-Thomas M. Cooley Law School. My faith gets involved here, but I believe that the LORD wanted me to get involved with:

  • Using technology to increase access to justice (#A2J)
  • Contributing to leveraging the science of learning for the long-term benefit of our students, faculty, and staff
  • Raising awareness regarding the potential pros and cons of today’s emerging technologies
  • Increasing the understanding that the legal realm has a looooong way to go to try to get (even somewhat) caught up with the impacts that such emerging technologies can/might have on us.
  • Contributing and collaborating with others to help develop a positive future, not a negative one.

Along these lines…in regards to what’s been happening with law schools over the last few years, I wanted to share a couple of things:

1) An article from The Chronicle of Higher Education by Benjamin Barton:

The Law School Crash

 

2) A response from our President and Dean, James McGrath: Repositioning a Law School for the New Normal

 

From DSC:
I also wanted to personally say that I arrived at WMU-Cooley Law School in 2018, and have been learning a lot there (which I love about my job!).  Cooley employees are very warm, welcoming, experienced, knowledgeable, and professional. Everyone there is mission-driven. My boss, Chris Church, is multi-talented and excellent. Cooley has a great administrative/management team as well.

There have been many exciting new things happening there. That said, it will take time before we see the results of these changes. Perseverance and innovation will be key ingredients in crafting a modern legal education — especially in an industry that is only now beginning to offer online-based courses at the Juris Doctor (J.D.) level (i.e., roughly 20 years after such courses began appearing within undergraduate higher education).

My point in posting this is to say that we should ALL care about what’s happening within the legal realm!  We are all impacted by it, whether we realize it or not. We are all in this together and no one is an island — not as individuals, and not as organizations.

We need:

  • Far more diversity within the legal field
  • More technical expertise within the legal realm — not only among lawyers, but also among legislators, senators, representatives, judges, and others
  • Greater use of teams of specialists within the legal field
  • To offer more courses regarding emerging technologies — and not only for the legal practices themselves but also for society at large.
  • To be far more vigilant in crafting a positive world to be handed down to our kids and grandkids — a dream, not a nightmare. Just because we can, doesn’t mean we should.

Still not convinced that you should care? Here are some things on the CURRENT landscapes:

  • You go to drop something off at your neighbor’s house. They have a camera that gets activated. What facial recognition database are you now in? Did you give your consent to that? No, you didn’t.
  • Because you posted your photo on Facebook, YouTube, Venmo, and/or millions of other websites, your face could be in Clearview AI’s database. Did you give your consent to that occurring? No, you didn’t.
  • You’re at the airport and facial recognition is used instead of a passport. Whose database was that from and what gets shared? Did you give your consent to that occurring? Probably not, and it’s not easy to opt-out either.
  • Numerous types of drones, delivery bots, and more are already coming onto the scene. What will the sidewalks, streets, and skies look like — and sound like — in your neighborhood in the near future? Is that how you want it? Did you give your consent to that happening? No, you didn’t.
  • …and on and on it goes.

Addendum — speaking of islands!

Palantir CEO: Silicon Valley can’t be on ‘Palo Alto island’ — Big Tech must play by the rules — from cnbc.com by Jessica Bursztynsky

Excerpt:

Palantir Technologies co-founder and CEO Alex Karp said Thursday the core problem in Silicon Valley is the attitude among tech executives that they want to be separate from United States regulation.

“You cannot create an island called Palo Alto Island,” said Karp, who suggested tech leaders would rather govern themselves. “What Silicon Valley really wants is the canton of Palo Alto. We have the United States of America, not the ‘United States of Canton,’ one of which is Palo Alto. That must change.”

“Consumer tech companies, not Apple, but the other ones, have basically decided we’re living on an island and the island is so far removed from what’s called the United States in every way, culturally, linguistically and in normative ways,” Karp added.

 

 

From DSC:
I’ll say it again, just because we can, doesn’t mean we should.

From the article below, we can see another unintended consequence developing on society’s landscape. I really wish the 20- and 30-somethings being hired by the big tech companies — especially at Amazon, Facebook, Google, Apple, and Microsoft — who are developing these things would ask themselves:

  • “Just because we can develop this system/software/application/etc., SHOULD we be developing it?”
  • What might the negative consequences be? 
  • Do the positive contributions outweigh the negative impacts…or not?

To college professors and teachers:
Please pass these thoughts on to your students now, so that this internal questioning and these conversations begin to take place in K-16.


Report: Colleges Must Teach ‘Algorithm Literacy’ to Help Students Navigate Internet — from edsurge.com by Rebecca Koenig

Excerpt (emphasis DSC):

If the Ancient Mariner were sailing on the internet’s open seas, he might conclude there’s information everywhere, but nary a drop to drink.

That’s how many college students feel, anyway. A new report published this week about undergraduates’ impressions of internet algorithms reveals students are skeptical of and unnerved by tools that track their digital travels and serve them personalized content like advertisements and social media posts.

And some students feel like they’ve largely been left to navigate the internet’s murky waters alone, without adequate guidance from teachers and professors.

Researchers set out to learn “how aware students are about their information being manipulated, gathered and interacted with,” said Alison Head, founder and director of Project Information Literacy, in an interview with EdSurge. “Where does that awareness drop off?”

They found that many students not only have personal concerns about how algorithms compromise their own data privacy but also recognize the broader, possibly negative implications of tools that segment and customize search results and news feeds.
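
From DSC:
For readers who want to peek under the hood, here is a deliberately tiny Python sketch of the basic personalization loop the report describes. The articles, topics, and scoring rule are all hypothetical (real feed-ranking systems are vastly more complex), but the dynamic is the same: rank items by overlap with what a user already clicked, and the feed narrows.

    from collections import Counter

    # Hypothetical articles, each tagged with topics.
    articles = {
        "campus-sports-upset":   {"sports", "campus"},
        "tuition-policy-debate": {"policy", "finance"},
        "ai-ethics-report":      {"tech", "ethics"},
        "playoff-preview":       {"sports"},
    }

    click_history = ["campus-sports-upset", "playoff-preview"]

    # Build a simple interest profile: how often each topic was clicked.
    profile = Counter(topic for a in click_history for topic in articles[a])

    def score(name):
        """Overlap between an article's topics and the user's profile."""
        return sum(profile[topic] for topic in articles[name])

    feed = sorted(articles, key=score, reverse=True)
    print(feed)  # sports stories float to the top; policy and ethics sink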

 

From DSC:
Very disturbing that citizens had no say in this. Legislators, senators, representatives, lawyers, law schools, politicians, engineers, programmers, professors, teachers, and more…please reflect upon our current situation here. How can we help create the kind of future that we can hand down to our kids and rest well at night…knowing we did all that we could to provide a dream — and not a nightmare — for them?


The Secretive Company That Might End Privacy as We Know It — from nytimes.com by Kashmir Hill
A little-known start-up helps law enforcement match photos of unknown people to their online images — and “might lead to a dystopian future or something,” a backer says.

His tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.
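
From DSC:
To demystify the matching step at the core of any such app, here is a toy sketch that uses the open-source face_recognition Python library. To be clear, this is NOT Clearview’s actual system (its code and database are private), and the file paths below are hypothetical.

    import face_recognition

    # The "database": encodings precomputed from scraped public photos (paths hypothetical).
    known_files = ["scraped/photo_001.jpg", "scraped/photo_002.jpg"]
    known_encodings = [
        face_recognition.face_encodings(face_recognition.load_image_file(f))[0]
        for f in known_files
    ]

    # The query: the photo someone uploads.
    query_image = face_recognition.load_image_file("upload/query.jpg")
    query_encoding = face_recognition.face_encodings(query_image)[0]

    # Compare the query face against every face in the database.
    distances = face_recognition.face_distance(known_encodings, query_encoding)
    for path, dist in zip(known_files, distances):
        verdict = "match" if dist < 0.6 else "no match"  # 0.6 is the library's usual threshold
        print(f"{path}: {verdict} (distance {dist:.2f})")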

 

Excerpts:

“But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year…”

Clearview’s app carries extra risks because law enforcement agencies are uploading sensitive photos to the servers of a company whose ability to protect its data is untested.

 

Indian police are using facial recognition to identify protesters in Delhi — from fastcompany.com by Kristin Toussaint

Excerpt:

At Modi’s rally on December 22, Delhi police used Automated Facial Recognition System (AFRS) software—which officials there acquired in 2018 as a tool to find and identify missing children—to screen the crowd for faces that match a database of people who have attended other protests around the city, and who officials said could be disruptive.

According to the Indian Express, Delhi police have long filmed these protest events, and the department announced Monday that officials fed that footage through AFRS. Sources told the Indian news outlet that once “identifiable faces” are extracted from that footage, a dataset will point out and retain “habitual protesters” and “rowdy elements.” That dataset was put to use at Modi’s rally to keep away “miscreants who could raise slogans or banners.”

 

From DSC:
Here in the United States…are we paying attention to today’s emerging technologies and collaboratively working to create a future dream — versus a future nightmare!?! A vendor or organization might propose a beneficial reason to use their product or technology — and it might even meet the hype at times…but then along come other unintended uses and consequences of that technology. For example, in the article above, what started out as a technology that was supposed to be used to find/identify missing children (a benefit) was later used to identify protesters (an unintended consequence and, I might add, a nightmare in terms of such an expanded scope of use)!

Along these lines, the youth of today have every right to voice their opinions and to have a role in developing or torpedoing emerging techs. What we build and put into place now will impact their lives bigtime!

 

7 Artificial Intelligence Trends to Watch in 2020 — from interestingengineering.com by Christopher McFadden

Excerpts:

Per this article, the following trends were listed:

  1. Computer Graphics will greatly benefit from AI
  2. Deepfakes will only get better, er, worse
  3. Predictive text should get better and better
  4. Ethics will become more important as time goes by
  5. Quantum computing will supercharge AI
  6. Facial recognition will appear in more places
  7. AI will help in the optimization of production pipelines

Also, this article listed several more trends:

According to sources like The Next Web, some of the main AI trends for 2020 include:

  • The use of AI to make healthcare more accurate and less costly
  • Greater attention paid to explainability and trust
  • AI becoming less data-hungry
  • Improved accuracy and efficiency of neural networks
  • Automated AI development
  • Expanded use of AI in manufacturing
  • Geopolitical implications for the uses of AI

Artificial Intelligence offers great potential and great risks for humans in the future. While still in its infancy, it is already being employed in some interesting ways.

According to sources like Forbes, some of the next “big things” in technology include, but are not limited to:

  • Blockchain
  • Blockchain As A Service
  • AI-Led Automation
  • Machine Learning
  • Enterprise Content Management
  • AI For The Back Office
  • Quantum Computing AI Applications
  • Mainstreamed IoT

Also see:

Artificial intelligence predictions for 2020: 16 experts have their say — from verdict.co.uk by Ellen Daniel

Excerpts:

  • Organisations will build in processes and policies to prevent and address potential biases in AI
  • Deepfakes will become a serious threat to corporations
  • Candidate (and employee) care in the world of artificial intelligence
  • AI will augment humans, not replace them
  • Greater demand for AI understanding
  • Ramp up in autonomous vehicles
  • To fully take advantage of AI technologies, you’ll need to retrain your entire organisation
  • Voice technologies will infiltrate the office
  • IT will run itself while data acquires its own DNA
  • The ethics of AI
  • Health data and AI
  • AI to become an intrinsic part of robotic process automation (RPA)
  • BERT will open up a whole new world of deep learning use cases

The hottest trend in the industry right now is in Natural Language Processing (NLP). Over the past year, a new method called BERT (Bidirectional Encoder Representations from Transformers) has been developed for designing neural networks that work with text. Now, we suddenly have models that will understand the semantic meaning of what’s in text, going beyond the basics. This creates a lot more opportunity for deep learning to be used more widely.
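
From DSC:
For the technically curious, here is a minimal sketch of what “understanding semantic meaning” looks like in practice, using a pretrained BERT via the Hugging Face transformers library. The model choice and the mean-pooling step are my assumptions for illustration, not details from the article.

    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    def embed(sentence):
        """Mean-pool BERT's last hidden states into one sentence vector."""
        inputs = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state  # shape: (1, num_tokens, 768)
        return hidden.mean(dim=1).squeeze(0)

    a = embed("The professor postponed the exam.")
    b = embed("The test was delayed by the instructor.")
    c = embed("Drones patrolled the city streets.")

    cos = torch.nn.functional.cosine_similarity
    print(f"a vs b: {cos(a, b, dim=0).item():.2f}")  # paraphrases score higher
    print(f"a vs c: {cos(a, c, dim=0).item():.2f}")  # unrelated sentences score lower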

 

 


© 2020 | Daniel Christian