Quinn says the proliferation of facial recognition technology has led people to believe there will at least be relevant digital evidence, much as the TV show CSI led people to believe there would always be DNA or other physical forensic evidence. In reality, images from security cameras can be grainy, low quality, shot from odd angles, and marred by lighting problems that get in the way of a good match.
Given the widespread distrust of the police in some areas, “we really need to get it out there and help educate our communities about the value of this stuff and how we’re using it,” Quinn says. Referring to a ban on the use of facial recognition in some cities, he says it would otherwise “be very easy to discuss these technologies in terms of all or nothing.”
As more states and cities consider restricting the technology, a September report from the Center for Strategic and International Studies, a think tank, suggests Congress draft national standards to avoid a patchwork of regulations. Lead author James Lewis says he supports facial recognition and thinks its spread is inevitable, but there needs to be transparency about how the technology is being used in criminal investigations. Seven US states and cities, including Boston and San Francisco, have passed full or partial bans on facial recognition by government agencies. Lewis doesn’t think Congress will follow suit, in part because of the January 6 attack on the US Capitol and the ensuing investigation, saying, “I think that’s influential if you have to hide in a closet.”
An analysis by the Human Rights Law Review at Columbia University concluded that “defendants face meaningful barriers to challenge the technology” and called on Congress to pass a law requiring disclosure. The report also called for procedural safeguards, such as regular testing and a minimum threshold for the accuracy of facial recognition systems.
White House science and technology policymakers called last fall for more disclosure about the use of artificial intelligence as part of an AI Bill of Rights. Regulation of facial recognition technology has received bipartisan support in Congress, but there are no federal restrictions on law enforcement’s use of the technology, despite a documented lack of guardrails at the federal agencies that use it.
The National District Attorneys Association (NDAA) says it instructs its more than 5,000 members to use “professional judgment and discretion” when deciding whether to disclose the use of facial recognition, weighing issues of public safety, privacy, and relevance when making these decisions. NDAA officials did not respond to requests for examples of how disclosing facial recognition in a criminal investigation could threaten public safety.
“The longer things stay secret, the harder it is to challenge them, and the harder it is to challenge them, the longer the police go without courts setting limits on what they can do,” said Nathan Wessler, who leads the Speech, Privacy, and Technology Project at the ACLU.
An attempt to learn more
Defense attorneys say their best hope of getting law enforcement and prosecutors to reveal that facial recognition helped identify a suspect rests on a 1963 Supreme Court decision. In Brady v Maryland, the court ruled that prosecutors must hand over to a defendant any evidence they have gathered that could exonerate them.
The best-known case involving facial recognition and the Brady decision is that of Willie Allen Lynch, a Florida man who was convicted in 2016 of selling $50 worth of crack cocaine, based in part on facial recognition, and sentenced to eight years in prison. During his trial, Lynch, who represented himself for part of the proceedings, argued that he should be able to cross-examine the crime analyst who had run the facial recognition search and sent a single photo of Lynch to investigators. In pretrial testimony, the analyst said that she did not fully understand how the facial recognition program worked.
In December 2018, a Florida appeals court rejected Lynch’s appeal, ruling that he had failed to demonstrate on Brady grounds that materials such as the photographs of other potential matches would have altered the outcome of his trial.
Lynch then appealed to the Florida Supreme Court, seeking more information about how facial recognition was used in his case, including the photos of other possible matches and the software behind the algorithm. The appeal was supported by groups such as the ACLU, the Electronic Frontier Foundation, the Georgetown Law Center on Privacy and Technology, and the Innocence Project. They argued that uncertainty about the results of a facial recognition analysis should be treated the same way as an eyewitness who says they are unsure they could recognize the person who committed the crime. The Florida Supreme Court declined to hear the case.