“How has your school delivered on the promise of equal access and educational excellence, particularly during these challenging times?”

YouTube Contest, “With Justice for All,” Seeks Submissions from Students About the Effect of Covid-19 and Recent Tragedies on Their Educational Experience

Prizes include 11 scholarships for students who best address the question, “How has your school delivered on the promise of equal access and educational excellence, particularly during these challenging times?”

WASHINGTON, D.C. — The Center for Education Reform (CER), in partnership with the Freedom Coalition for Charter Schools, the Children’s Scholarship Fund, and the National Alliance for Public Charter Schools, today launched “With Justice for All,” a national YouTube contest for students.

Over the last few months, students and schools have faced significant challenges from distance learning and national tragedies. Times like these highlight that a great education is the most important asset a student has for changing the world.

So CER decided to ask students directly: Has your school delivered on the promise of equal access and educational excellence, particularly during these challenging times? Tell us how well your school did — or didn’t do — in providing you a great education.

“We want you to be able to take charge of your education,” said Jeanne Allen, CER’s founder and chief executive. “We want to assist you in writing the next chapter of your education story. Tell us your story, and we’ll tell everybody who needs to know, especially those in power.”

Videos must be shorter than three minutes, hashtagged with #MyEducationVideo, and submitted to MyEducationVideo.com by 11:59 PM EDT on July 4, 2020. Submissions will be evaluated by a panel of celebrity judges. Awards include ten $2,500 scholarships and one $20,000 scholarship, each toward the high school or college of the student’s choice. Winners will be announced during a live-streamed ceremony (date and time TBD), and their videos may be shown to delegates at both of the 2020 national conventions this summer.

“We’ve designed this contest for students ages 13 and older, because we know it can be hard to get your ideas about education heard when you’re a kid,” said Allen.

For more information, visit MyEducationVideo.com.

https://www.myeducationvideo.com/

 

Also see:

“If we are not centering equitable student success, we’re gonna be put back decades and decades. And we’re already trying to retrofit.”

 

‘Unauthorized Practice Of Law’ Rules Promote Racial Injustice — from law.com by Rohan Pavuluri with thanks to Daniel Rodriguez for his Tweet on this

Excerpts:

A less discussed, yet still pernicious, set of policies that must change are the rules lawyers use to regulate their own profession.

Known as unauthorized practice of law, or UPL, rules, every state in America has policies that grant lawyers a monopoly on providing legal advice, prohibiting professionals who are not lawyers from providing meaningful legal assistance. These policies promote racial inequity and guarantee that black Americans don’t have equal opportunities and equal rights under the law.

“It should come as no surprise that only 5% of lawyers are black.”

To reform UPL doesn’t mean choosing between regulation and no regulation of the legal industry. It’s a choice between maintaining a status quo where black people are disproportionately excluded from both providing and receiving assistance and a system where we re-regulate the legal industry to make it more inclusive, increasing the supply of vetted, qualified helpers available.

Also see:

 

Homework gap a growing focus for nonprofits, lawmakers as closures persist — from educationdive.com by Shawna De La Rosa

Excerpt:

Dive Brief:

  • After declaring success in the goal of connecting most schools to fast, reliable internet, nonprofit EducationSuperHighway is turning its attention to the “homework gap,” according to The Hechinger Report. More than 9 million students still lack internet access at home.
  • Evan Marwell, CEO and founder of EducationSuperHighway, said school closures triggered a change in attitude about the importance of home internet service for students. Now, some lawmakers indicate they are willing to spend billions of dollars to bridge that gap.
  • The organization launched digitalbridgeK12.org to provide detailed information on the problem, as well as recommendations for policymakers and school leaders, and advice and best practices on topics like how to collect data on connectivity at the district level or purchase in bulk.
 

From DSC:
Normally I don’t advertise or “plug” where I work, but as our society is in serious need of increasing Access to Justice (#A2J) and promoting diversity within the legal realm, I need to post this video.

 

Six quick — but very important — points about Coronavirus and poverty in the US — from commondreams.org by Bill Quigley; with thanks to a colleague at WMU-Cooley Law School for her message on this.
The most vulnerable among us simply do not have the same options as the most privileged.

Excerpts:

In the United States, tens of millions of people are at a much greater risk of getting sick from the coronavirus than others. The most vulnerable among us do not have the option to comply with suggestions to stay home from work or work remotely. Most low-wage workers do not have any paid sick days and cannot do their work from home. The more than two million people in jails and prisons each night do not have these options, nor do the half a million homeless people.

One.  Thirty-four million workers do not have a single day of paid sick leave. Even though most of the developed world gives its workers paid sick leave, there is no federal law requiring it in the United States.

Two.  Low wage workers and people without a paid sick day have to continue to work to survive.

Three.  About 30 million people in the US do not have health insurance, according to the Kaiser Family Foundation.

Four.  Staying home is not an option for the homeless.

Five.  Nearly 2.2 million people are in jails and prisons every day, the highest rate in the world.

Six.  Solutions?  [The article lists several.]

 

American Bar Assn. President criticizes U.S. legal system as backward, resistant to change — from forbes.com by Patricia Barnes

Excerpt (emphasis DSC):

Judy Perry Martinez, president of the American Bar Association (ABA), has issued an unusually frank plea calling upon the legal profession to support reform of America’s backward legal system to better serve the public.

“We need new ideas,” said Martinez. “We are one-fifth into the 21st century, yet we continue to rely on 20th-century processes, procedures and regulations. We need to retain 20th-century values but advance them using 21st-century approaches that can increase access to justice.”

Martinez’s comments are contained in a letter appearing in the February-March 2020 issue of the ABA’s monthly magazine, the ABA Journal.

Martinez expressed frustration with resistance in the legal profession to state-level efforts to innovate in the provision of legal services.

Martinez was particularly critical of the lack of access to civil justice in the United States. She cited the World Justice Project’s ranking of the U.S. in the bottom tier with respect to access to and affordability of civil justice. She said the U.S. is tied for 99th place out of 126 countries. Additionally, Martinez said research by the Legal Services Corp. found that low-income Americans received inadequate or no professional legal help for 86% of their civil legal problems, including child custody, debt collection, eviction and foreclosure. She did not spare the criminal justice system. In many states, Martinez says, “overwhelming caseloads and inadequate resources for public defenders severely hamper the Sixth Amendment right to counsel for indigent criminal defendants.”

 

From DSC:
I congratulate Judy Perry Martinez for her stance here, as she’s ultimately fighting for our society — especially for access to justice. Though I don’t know Judy, I appreciate the courage that it must have taken to pen that letter.

 

The Research is in: 2019 Education Research Highlights — from edutopia.org by Youki Terada
Does doodling boost learning? Do attendance awards work? Do boys and girls process math the same way? Here’s a look at the big questions that researchers tackled this year.

Excerpt:

Every year brings new insights—and cautionary tales—about what works in education. 2019 is no different, as we learned that doodling may do more harm than good when it comes to remembering information. Attendance awards don’t work and can actually increase absences. And while we’ve known that school discipline tends to disproportionately harm students of color, a new study reveals a key reason why: Compared with their peers, black students tend to receive fewer warnings for misbehavior before being punished.

CUT THE ARTS AT YOUR OWN RISK, RESEARCHERS WARN
As arts programs continue to face the budget ax, a handful of new studies suggest that’s a grave mistake. The arts provide cognitive, academic, behavioral, and social benefits that go far beyond simply learning how to play music or perform scenes in a play.

In a major new study from Rice University involving 10,000 students in third through eighth grades, researchers determined that expanding a school’s arts programs improved writing scores, increased the students’ compassion for others, and reduced disciplinary infractions. The benefits of such programs may be especially pronounced for students who come from low-income families, according to a 10-year study of 30,000 students released in 2019.

Unexpectedly, another recent study found that artistic commitment—think of a budding violinist or passionate young thespian—can boost executive function skills like focus and working memory, linking the arts to a set of overlooked skills that are highly correlated to success in both academics and life.

Failing to identify and support students with learning disabilities early can have dire, long-term consequences. In a comprehensive 2019 analysis, researchers highlighted the need to provide interventions that align with critical phases of early brain development. In one startling example, reading interventions for children with learning disabilities were found to be twice as effective if delivered by the second grade instead of third grade.

 

AI hiring could mean robot discrimination will head to courts — from news.bloomberglaw.com by Chris Opfer

  • Algorithm vendors, employers grappling with liability issues
  • EEOC already looking at artificial intelligence cases

Excerpt:

As companies turn to artificial intelligence for help making hiring and promotion decisions, contract negotiations between employers and vendors selling algorithms are being dominated by an untested legal question: Who’s liable when a robot discriminates?

The predictive strength of any algorithm is based at least in part on the information it is fed by human sources. That comes with concerns the technology could perpetuate existing biases, whether it is against people applying for jobs, home loans, or unemployment insurance.

From DSC:
Are law schools and their faculty/students keeping up with these kinds of issues? Are lawyers, judges, attorneys general, and others informed about these emerging technologies?
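
To make the underlying concern concrete, below is a minimal, hypothetical sketch (in Python) of how a screening model can absorb bias from its training data. The groups, the proxy feature, and every number are invented for illustration; this is not any vendor’s actual system.

```python
# Hypothetical sketch: a hiring model trained on biased historical labels.
# All data, feature names, and numbers are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Skill is identically distributed across the two groups.
group = rng.integers(0, 2, n)          # 0 = group A, 1 = group B
skill = rng.normal(0.0, 1.0, n)

# Historical hiring decisions were biased against group B at equal skill.
past_hired = (skill - 0.8 * group + rng.normal(0.0, 0.3, n) > 0).astype(int)

# The model never sees "group" directly -- only a correlated proxy feature
# (think zip code or school attended).
proxy = group + rng.normal(0.0, 0.2, n)
X = np.column_stack([skill, proxy])

model = LogisticRegression().fit(X, past_hired)
pred = model.predict(X)

for g, name in [(0, "A"), (1, "B")]:
    print(f"Predicted hire rate for group {name}: {pred[group == g].mean():.2f}")

# Group B ends up with a markedly lower predicted hire rate even though skill
# was drawn from the same distribution: the proxy feature lets the model
# reproduce the bias baked into its training labels.
```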

 

Welcome to the future! The future of work is… — from gettingsmart.com

Excerpt:

The future of work is here, and with it, new challenges — so what does this mean for teaching and learning? It means more contribution and young people learning how to make a difference. In our exploration of the #futureofwork, sponsored by eduInnovation and powered by Getting Smart, we dive into what’s happening, what’s coming and how schools might prepare.

 

 

 

A face-scanning algorithm increasingly decides whether you deserve the job — from washingtonpost.com by Drew Harwell
HireVue claims it uses artificial intelligence to decide who’s best for a job. Outside experts call it ‘profoundly disturbing.’

Excerpt:

An artificial intelligence hiring system has become a powerful gatekeeper for some of America’s most prominent employers, reshaping how companies assess their workforce — and how prospective employees prove their worth.

Designed by the recruiting-technology firm HireVue, the system uses candidates’ computer or cellphone cameras to analyze their facial movements, word choice and speaking voice before ranking them against other applicants based on an automatically generated “employability” score.

 

The system, they argue, will assume a critical role in helping decide a person’s career. But they doubt it even knows what it’s looking for: Just what does the perfect employee look and sound like, anyway?

“It’s a profoundly disturbing development that we have proprietary technology that claims to differentiate between a productive worker and a worker who isn’t fit, based on their facial movements, their tone of voice, their mannerisms,” said Meredith Whittaker, a co-founder of the AI Now Institute, a research center in New York.
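
As a rough illustration of what ranking applicants by an automatically generated score can mean mechanically, here is a hypothetical sketch in Python. The sub-scores, weights, and applicants are invented; this is not HireVue’s model, whose actual features and weights are proprietary.

```python
# Hypothetical composite "employability" score used to rank applicants.
# Sub-scores, weights, and applicants are invented for illustration only.

WEIGHTS = {"facial_movement": 0.4, "speaking_voice": 0.3, "word_choice": 0.3}

def employability(subscores: dict) -> float:
    """Weighted sum of opaque sub-scores, each assumed to lie in [0, 1]."""
    return sum(WEIGHTS[k] * subscores.get(k, 0.0) for k in WEIGHTS)

applicants = {
    "applicant_1": {"facial_movement": 0.9, "speaking_voice": 0.4, "word_choice": 0.7},
    "applicant_2": {"facial_movement": 0.5, "speaking_voice": 0.8, "word_choice": 0.8},
}

# Rank candidates against one another, as the article describes.
ranking = sorted(applicants, key=lambda a: employability(applicants[a]), reverse=True)
for rank, name in enumerate(ranking, start=1):
    print(rank, name, round(employability(applicants[name]), 2))
```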

 

From DSC:
If you haven’t been screened out by an algorithm from an Applicant Tracking System recently, then you haven’t been looking for a job in the last few years. If that’s the case:

  • Then you might not be very interested in this posting.
  • But you will be very surprised in the future, when you do need to search for a new job.

Because the truth is, it’s very difficult to get a human being to even look at your resume, let alone meet you in person. The article above should disturb you even more: I don’t think programmers have captured everything that goes on inside an experienced HR professional’s mind.
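
To illustrate what that screening step can look like, here is a deliberately crude, hypothetical sketch (in Python) of a keyword-based resume filter. The keywords, weights, and cutoff are invented; real Applicant Tracking Systems are more elaborate, but the basic gatekeeping effect is the same.

```python
# Hypothetical keyword screen of the kind many Applicant Tracking Systems
# apply before a human ever sees a resume. Keywords, weights, and the cutoff
# are invented for illustration only.

REQUIRED_KEYWORDS = {"python": 2.0, "sql": 1.5, "project management": 1.0}
CUTOFF = 3.0  # resumes scoring below this are never shown to a recruiter

def ats_score(resume_text: str) -> float:
    """Crude keyword-matching score; real systems add parsing, synonyms, etc."""
    text = resume_text.lower()
    return sum(weight for kw, weight in REQUIRED_KEYWORDS.items() if kw in text)

def passes_screen(resume_text: str) -> bool:
    return ats_score(resume_text) >= CUTOFF

resume = "Led project management for a data team; built SQL dashboards."
print(ats_score(resume), passes_screen(resume))  # 2.5 False -> filtered out
```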

 

Also see:

  • In case after case, courts reshape the rules around AI — from muckrock.com
    AI Now Institute recommends improvements and highlights key AI litigation
    Excerpt:
    When undercover officers with the Jacksonville Sheriff’s Office bought crack cocaine from someone in 2015, they couldn’t actually identify the seller. Less than a year later, though, Willie Allen Lynch was sentenced to eight years in prison, identified through a facial recognition system. He’s still fighting in court over how the technology was used, and his case and others like it could ultimately shape the use of algorithms going forward, according to a new report.
 

Everyday Media Literacy — from routledge.com by Sue Ellen Christian
An Analog Guide for Your Digital Life, 1st Edition

Description:

In this graphic guide to media literacy, award-winning educator Sue Ellen Christian offers students an accessible, informed and lively look at how they can consume and create media intentionally and critically.

The straight-talking textbook offers timely examples and relevant activities to equip students with the skills and knowledge they need to assess all media, including news and information. Through discussion prompts, writing exercises, key terms, online links and even origami, readers are provided with a framework from which to critically consume and create media in their everyday lives. Chapters examine news literacy, online activism, digital inequality, privacy, social media and identity, global media corporations and beyond, giving readers a nuanced understanding of the key concepts and concerns at the core of media literacy.

Concise, creative and curated, this book highlights the cultural, political and economic dynamics of media in our contemporary society, and how consumers can mindfully navigate their daily media use. Everyday Media Literacy is perfect for students (and educators) of media literacy, journalism, education and media effects looking to build their understanding in an engaging way.

 

YouTube’s algorithm hacked a human vulnerability, setting a dangerous precedent — from which-50.com by Andrew Birmingham

Excerpt (emphasis DSC):

Even as YouTube’s recommendation algorithm was rolled out with great fanfare, the fuse was already burning. A project of Google Brain and designed to optimise engagement, it did something unforeseen — and potentially dangerous.

Today, we are all living with the consequences.

As Zeynep Tufekci, an associate professor at the University of North Carolina, explained to attendees of Hitachi Vantara’s Next 2019 conference in Las Vegas this week, “What the developers did not understand at the time is that YouTube’s algorithm had discovered a human vulnerability. And it was using this [vulnerability] at scale to increase YouTube’s engagement time — without a single engineer thinking, ‘is this what we should be doing?’”

 

The consequence of the vulnerability — a natural human tendency to engage with edgier ideas — was that YouTube’s users were exposed to increasingly extreme content, irrespective of their preferred areas of interest.

“What they had done was use machine learning to increase watch time. But what the machine learning system had done was to discover a human vulnerability. And that human vulnerability is that things that are slightly edgier are more attractive and more interesting.”

 

From DSC:
Just because we can…
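
For readers who want to see the mechanism Tufekci describes rather than just the claim, here is a toy sketch: a recommender that optimizes only for predicted watch time keeps surfacing whatever holds attention longest, and if slightly edgier items hold attention slightly longer, the loop drifts toward the extreme end of the catalog. Everything in the snippet is invented; it is not YouTube’s actual system.

```python
# Toy recommender that optimizes nothing except predicted watch time.
# Item names and numbers are invented; this is not YouTube's algorithm.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    edginess: float            # 0.0 = mainstream, 1.0 = extreme
    base_watch_minutes: float

CATALOG = [
    Video("Intro to gardening", 0.10, 4.0),
    Video("Gardening conspiracy theories", 0.70, 4.5),
    Video("Why 'they' don't want you to garden", 0.95, 5.0),
]

def predicted_watch_time(video: Video) -> float:
    # The learned pattern: slightly edgier content holds attention a bit longer.
    return video.base_watch_minutes * (1.0 + 0.5 * video.edginess)

def recommend_next() -> Video:
    # Pure engagement optimization: pick whatever maximizes watch time, with no
    # term for accuracy, well-being, or extremeness.
    return max(CATALOG, key=predicted_watch_time)

print(recommend_next().title)  # always lands on the most extreme item here
```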

 

 

AI is in danger of becoming too male — new research — from singularityhub.com by Juan Mateos-Garcia and Joysy John

Excerpts (emphasis DSC):

But current AI systems are far from perfect. They tend to reflect the biases of the data used to train them and to break down when they face unexpected situations.

So do we really want to turn these bias-prone, brittle technologies into the foundation stones of tomorrow’s economy?

One way to minimize AI risks is to increase the diversity of the teams involved in their development. As research on collective decision-making and creativity suggests, groups that are more cognitively diverse tend to make better decisions. Unfortunately, this is a far cry from the situation in the community currently developing AI systems. And a lack of gender diversity is one important (although not the only) dimension of this.

A review published by the AI Now Institute earlier this year showed that less than 20 percent of the researchers applying to prestigious AI conferences are women, and that only a quarter of undergraduates studying AI at Stanford and the University of California at Berkeley are female.

 


From DSC:
My niece just left a very lucrative programming job and managerial role at Microsoft after working there for several years. As a single woman, she got tired of fighting the culture there. 

It was again a reminder to me that there are significant ramifications to the cultures of the big tech companies…especially given the power of these emerging technologies and the growing influence they are having on our culture.


Addendum on 8/20/19:

  • Google’s Hate Speech Detection A.I. Has a Racial Bias Problem — from fortune.com by Jonathan Vanian
    Excerpt:
    A Google-created tool that uses artificial intelligence to police hate speech in online comments on sites like the New York Times has become racially biased, according to a new study. The tool, developed by Google and a subsidiary of its parent company, often classified comments written in the African-American vernacular as toxic, researchers from the University of Washington, Carnegie Mellon, and the Allen Institute for Artificial Intelligence said in a paper presented in early August at the Association for Computational Linguistics conference in Florence, Italy.
  • On the positive side of things:
    Number of Female Students, Students of Color Tackling Computer Science AP on the Rise — from thejournal.com
 

Take a tour of Google Earth with speakers of 50 different indigenous languages — from fastcompany.com by Melissa Locker

Excerpt:

After the United Nations declared 2019 the International Year of Indigenous Languages, Google decided to help draw attention to the indigenous languages spoken around the globe and perhaps help preserve some of the endangered ones too. To that end, the company recently launched its first audio-driven collection, a new Google Earth tour complete with audio recordings from more than 50 indigenous language speakers from around the world.

 

 

State Attempts to Nix Public School’s Facial Recognition Plans — from futurism.com by Kristin Houser
But it might not have the authority to actually stop an upcoming trial.

Excerpt (emphasis DSC):

Chaos Reigns
New York’s Lockport City School District (CSD) was all set to become the first public school district in the U.S. to test facial recognition on its students and staff. But just two days after the school district’s superintendent announced the project’s June 3 start date, the New York State Education Department (NYSED) attempted to put a stop to the trial, citing concerns for students’ privacy. Still, it’s not clear whether the department has the authority to actually put the project on hold — *****the latest sign that the U.S. is in desperate need of clear-cut facial recognition legislation.*****

 