Face-off Over Facial Recognition

Rappaport panel helps to clear up confusion over a new technology applauded for doing good and blamed for creating problems.

From left, Kimberly Atkins Stohr, Justice Elspeth Cypher, and William G. Brooks III. Panelist Kade Crockford appeared on Zoom. Photograph by Reba Saldanha 

In the last few years, facial recognition has quietly developed into one of the more useful—and terrifying—technologies of our time. What was once considered an Orwellian possibility of the distant future has arrived: Madison Square Garden Entertainment was recently sued by a slew of attorneys who were turned away at the doors of venues like Radio City Music Hall, the Beacon Theatre, and the Garden itself after facial recognition software flagged them as undesirable. The attorneys in question were turned away for the simple fact that they represented litigants in civil suits against the live entertainment giant, or were merely affiliated with a law firm engaged in such representation; the company used headshots pulled from law firms’ websites to train the program.

On the other hand, facial recognition could herald a technological revolution in law enforcement on a scale not seen since the development of DNA analysis more than thirty years ago. Proponents of the nascent technology believe it could be used to solve crimes that would otherwise go unsolved: police could simply run an image of a suspect, pulled from surveillance video, against a database of faces (often compiled by the local DMV/RMV) and receive a list of possible matches, all without resorting to time-intensive and unreliable eyewitness identifications and photo arrays.

On February 21, the Rappaport Center for Law and Public Policy assembled a panel of speakers from across the spectrum of views on facial recognition to find the middle ground between Big Brotheresque dystopia and panoptic utopia. The discussion was moderated by Kimberly Atkins Stohr, senior opinion writer and columnist for Boston Globe Opinion and a former trial and appellate litigation attorney, who had delivered a Rappaport Center keynote address on BC Law’s campus just days earlier as the 2024 Rappaport Senior Fellow. Stohr was joined by Chief William G. Brooks III of the Norwood Police Department; Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts; and former SJC Associate Justice Elspeth Cypher, BC Law Huber Distinguished Visiting Professor.

Crockford highlighted the urgency of regulating the technology as it is widely adopted across the US. When the ACLU of Massachusetts launched its Press Pause on Face Surveillance campaign in 2019, the Commonwealth had no laws on the books governing law enforcement’s use of facial recognition. Even now, only a dozen or so states have adopted such laws. Thanks in part to the ACLU’s work, Massachusetts has enacted some initial reforms, and four of the Commonwealth’s largest cities have banned facial recognition in law enforcement outright.

Crockford and Chief Brooks were both appointees to the Massachusetts Special Commission to Evaluate Government Use of Facial Recognition Technology in the Commonwealth, which published its final report in March 2022. Although their views on the technology are in many ways diametrically opposed, both agreed that the commission reached a genuine consensus. “The commission’s work and the recommendations we made truly represent the best of democracy. A number of views came together and we worked to hash out a compromise,” Crockford said. Neither side walked away feeling particularly satisfied, but according to Crockford, “that’s when you know you landed in the right place.”

Chief Brooks came to the technology’s defense, describing some of the fears concerning facial recognition as overblown: “Facial recognition is merely a lead, treated much like an anonymous recording left on a tip phone line. They are never relied upon for an arrest, search, or detention,” he assured the audience.

Cypher noted the immense difficulty the judiciary has had in fairly interpreting facial recognition statutes: “We are trying to balance the needs of the police department as well as the need of citizens and Massachusetts residents to have privacy,” she said. Foremost among her concerns was the rate of false positives in facial recognition searches, in which the matching algorithm misidentifies someone. “The algorithms are normally trained on white people, and men in particular. Women, African Americans… these people were not included in the initial datasets,” she warned.

Justice Cypher was confident, however, that law enforcement would not cede full control to AI and facial recognition software any time soon. “You’ve got to have a human being in the loop…you cannot just let the machines work their magic. When a match is made, some human being has got to double-check it and make sure it’s actually a match,” she said.