Motherboard uncovered documents showing the AEGIS system misidentified Black men four times more often and Black women 16 times more often than white men.
Facial recognition software / Photo Credit: Getty Images
by Jon Greig
A New York public school is under fire after internal emails showed administrators knew their facial recognition system could not identify Black faces and had other significant problems.
Through the Freedom of Information Law, Motherboard obtained hundreds of emails from experts expressing concerns about the Lockport City School District’s decision to spend $2.7 million on SN Technologies’ AEGIS face and weapons detection system. Lockport is 11% Black and about an hour away from Niagara Falls.
The school district was told by auditors and scientists who had tested the system that it misidentified Black men four times more often and Black women 16 times more often than white men. The emails also show that SN Technologies outright lied about how the system performed on racial bias tests.
Despite these concerns and an ACLU-backed lawsuit filed by parents, Lockport City School District turned on the system in January. It has since been rendered largely useless by pandemic rules requiring everyone to wear a mask indoors.
But parents in the district, particularly Black families, have been incensed about the school’s decision to spend millions on a system that routinely misidentifies Black people and was found to have a persistent problem of misidentifying broom handles as guns.
Parents in the district told Motherboard that the danger caused by this inaccurate system is incalculable. According to Motherboard, the AEGIS system “begins a process to automatically alert police when it detects weapons or certain people on the district’s ‘hot list.’”
In an interview with Motherboard, Lockport parent Jim Shultz said he was at a loss to understand how the benefits of the system could outweigh the potential consequences.
“The police have said if they get a notification they’re going to treat it as a live shooter system, and you have a system that’s predisposed to make mistakes and misidentify people. The risk of an accident, the risk of something horrible happening because the system is structured the way it is, to me, is 1 million times higher than [the chance] that the cameras are going to prevent a real situation,” Shultz said.
Concerns about the system deepened when it was revealed that SN Technologies had lied about its problems with Black faces.
The National Institute of Standards and Technology (NIST) performed the racial bias tests on SN Technologies’ system and confirmed what hundreds of Black scientists have repeatedly shown over the last five years: facial recognition systems are significantly less effective with brown and Black faces.
In documents obtained by Motherboard, the accounting firm hired by the city told administrators that SN Technologies lied about the accuracy of its system. SN Technologies had claimed its algorithm misidentified Black men only twice as often as white men and Black women 10 times more often than white men in the NIST test.
When the school district turned on the system in January, they received dozens of false notifications for weeks because of broom handles. Motherboard shared emails showing the problem got so bad that a representative for SN Technologies wrote they were “working on building an image database on broom handles to add this into the learning tool, this will help enhance the detection accuracy for guns.”
In an affidavit, Renee Cheatham, one of the parents serving as plaintiffs in the case, said that beyond the problems with the system, it was an enormous waste of money at a time when many in the district are struggling to access devices and wi-fi because of the pandemic.
“Lockport should have spent the funds it received to purchase and install a face recognition system on actual educational programs and instructional technology. Neighboring districts invested their Smart Schools Bond Act money in iPads and faster internet, while Lockport bought spy cameras,” Cheatham said.
“I am witnessing today the impact placed on the Lockport students who do not have the types of home technological resources Lockport could have put into place with SmartSchool Bond Act grant funding. The coronavirus is exposing the true digital divide that exists here in Lockport: the gap between students and their families who have speedy, modern-day internet connections and those who do not,” she added.
She went on to note that many families without home internet access are also going without the iPads or Chromebooks that other districts provided using Smart Schools Bond Act funds, while Lockport opted to spend its allocation almost entirely on the AEGIS system.
Lockport has made national news because of the fracas over the facial recognition software. The district was the first in New York to pursue the technology, and the controversy prompted a new state law banning schools from using it.
However, education leaders, according to Motherboard, believe the technology will help stop shootings, and other school districts have now expressed interest in buying the system.
Facial recognition systems remain inaccurate, yet they are being used widely by police departments, schools, airports and militaries around the world. In June, the Detroit Police Department became the first to admit it had mistakenly arrested a Black man because of a false facial recognition match.
BREAKING: We’re filing a complaint against Detroit police for wrongfully arresting Robert Williams, an innocent Black man — all because face recognition technology can’t tell Black people apart.
Officers hauled him away in front of his kids and locked him up for 30 hours. pic.twitter.com/84XJs0XWqu — ACLU (@ACLU), June 24, 2020
Robert Williams told his story to The Washington Post, writing that he was arrested in front of his wife, children and neighbors.
“I never thought I’d have to explain to my daughters why Daddy got arrested. How does one explain to two little girls that a computer got it wrong, but the police listened to it anyway? Why is law enforcement even allowed to use such technology when it obviously doesn’t work? I get angry when I hear companies, politicians and police talk about how this technology isn’t dangerous or flawed,” Williams wrote.
“I wouldn’t be surprised if others like me became suspects but didn’t know that a flawed technology made them guilty in the eyes of the law. I wouldn’t have known that facial recognition was used to arrest me had it not been for the cops who let it slip while interrogating me. I keep thinking about how lucky I was to have spent only one night in jail—as traumatizing as it was. Many Black people won’t be so lucky. My family and I don’t want to live with that fear. I don’t want anyone to live with that fear,” he added.