Facial Recognition Moves Into a New Front: Schools

By Davey Alba, The New York Times

LOCKPORT, N.Y. — Jim Shultz tried everything he could think of to stop facial recognition technology from entering the public schools in Lockport, a small city 20 miles east of Niagara Falls. He posted about the issue in a Facebook group called Lockportians. He wrote an Op-Ed in The New York Times. He filed a petition with the superintendent of the district, where his daughter is in high school.

But a few weeks ago, he lost. The Lockport City School District turned on the technology to monitor who’s on the property at its eight schools, becoming the first known public school district in New York to adopt facial recognition, and one of the first in the nation.

The district, said Shultz, 62, “turned our kids into lab rats in a high-tech experiment in privacy invasion.”

The decision underscores how facial recognition is spreading across the country and being deployed in new ways in the United States, as public officials turn to the technology in the name of public safety.

[Photo: Lockport High School in Lockport, N.Y., Jan. 8, 2020. Libby March, The New York Times]

A few cities, like San Francisco and Somerville, Massachusetts, have barred their governments from using the technology, but they are exceptions. More than 600 law enforcement agencies started using the technology of one company, Clearview AI, in just the past year. Airports and other public venues, like Madison Square Garden in Manhattan, have adopted it as well.

Schools are a newer front, and the debate that took place in Lockport encapsulates the furor surrounding the technology. Proponents call it a crucial crime-fighting tool that can help prevent mass shootings and stop sexual predators. Robert LiPuma, the Lockport City School District’s director of technology, said he believed that if the technology had been in place at Marjory Stoneman Douglas High School in Parkland, Florida, the deadly 2018 attack there might never have happened.

[Photo: Robert LiPuma, the Lockport school district’s technology director, during a school board meeting in Lockport, N.Y., Jan. 8, 2020. Libby March, The New York Times]

“You had an expelled student that would have been put into the system, because they were not supposed to be on school grounds,” LiPuma said. “They snuck in through an open door. The minute they snuck in, the system would have identified that person.”

But opponents like Shultz say the concerns about facial recognition — namely privacy, accuracy and racial bias — are even more worrisome when it comes to children.

“Subjecting 5-year-olds to this technology will not make anyone safer, and we can’t allow invasive surveillance to become the norm in our public spaces,” said Stefanie Coyle, education counsel for the New York Civil Liberties Union. “Reminding people of their greatest fears is a disappointing tactic, meant to distract from the fact that this product is discriminatory, unethical and not secure.”

The debate in Lockport has unfolded over nearly two years. The school district initially announced its plans to install a facial recognition security system, called Aegis, in March 2018. The district spent $1.4 million, with money it had been awarded by the state, to install the technology across 300 cameras.

[Photo: Tina Ni and Nick Doxey, students at Lockport High School, outside the school in Lockport, N.Y., Jan. 9, 2020. Libby March, The New York Times]

But when administrators wanted to do a test run last May, the State Education Department told them to hold off, partly in response to mounting public concern over student privacy. The state wanted Lockport to ensure that students’ data would be properly protected, and demanded a policy that would forbid the use of student data, including students’ photos.

By June, Lockport officials said they had adjusted their policies, and they began testing parts of the system. In late November, the State Education Department said the district’s revised policy addressed its concerns. In January, the school board unanimously approved the latest policy revision.

When the system is on, LiPuma said, the software looks at the faces captured by the hundreds of cameras and calculates whether those faces match a “persons of interest” list made by school administrators.

That list includes sex offenders in the area, people prohibited from seeing students by restraining orders, former employees who are barred from visiting the schools and others deemed “credible threats” by law enforcement.
