When retired U.S. Magistrate Judge Stephen W. Smith learned the police in his Texas hometown were taking photographs of arrestees to compile a facial recognition database, it rubbed him the wrong way.
Smith, now director of Fourth Amendment & Open Courts at Stanford Law School's Center for Internet and Society, understands both sides of the debate. Law enforcement sees a database of faces as the logical next step after fingerprinting, while civil rights activists and criminal defense attorneys call for more oversight. But when the debate literally hit home, the issue became more black and white for Smith.
"I understand the need for identification for the crime under investigation, but beyond that I'm uncomfortable with the government having this huge database they can access at will without a search warrant," Smith said. "There's a possibility for abuse, and if there's not some regulation on the front end that's a problem."
Regulators are just starting to get involved with the use of facial recognition technology. This week, the House Committee on Oversight and Reform heard from Gretta Goodwin, the Government Accountability Office's director of homeland security and justice issues, who audited the use of the technology by the FBI and Justice Department.
Goodwin told lawmakers Tuesday the FBI's database included 640 million faces, and the agency could eventually run afoul of the First and Fourth Amendments without better self-reporting, a charge the FBI and Justice Department denied.
In California, the Assembly is considering a bill that would put guidelines in place for law enforcement's use of surveillance technology.
Courts are divided on whether facial recognition technology causes harm or violates civil liberties when used by law enforcement.
U.S. Magistrate Judge Kandis A. Westmore of the Northern District ruled against the Justice Department in January, writing the government "may not compel or otherwise utilize fingers, thumbs, facial recognition, optical/iris, or any other biometric feature to unlock electronic devices" because it violates the Fourth and Fifth Amendment rights of citizens.
"The challenge facing the courts is that technology is outpacing the law," Westmore wrote in her January order. "Courts have an obligation to safeguard constitutional rights and cannot permit those rights to be diminished merely due to the advancement of technology."
In the meantime, policies are being made on a piecemeal basis with each jurisdiction deciding for itself how it wants to handle the proliferation of facial recognition technology and its application to the criminal justice system.
Smith said Westmore is in the minority nationally; most jurists liken facial recognition databases to the collection of fingerprints and hold that collecting photographs is constitutional.
There is some case law attorneys can use to defend clients whose prosecution includes information obtained through new technology, he said. For example, last year the U.S. Supreme Court decided in Carpenter v. United States (2018) 138 S. Ct. 2206 that police must obtain a warrant before accessing historical location records from cellphones.
But facial recognition and machine learning are years beyond cellphone technology, and many lawmakers are concerned about the widespread use of the technology without any guardrails.
"The laws that you're relying on were passed before facial recognition became popular. That's a problem," Rep. Thomas Massie, R-Kentucky, said at Tuesday's House committee hearing.
Brian Hofer, a senior legal assistant at the Berkeley firm Gould & Hahn and a commissioner for Oakland's Privacy Advisory Board, has been behind the push to rein in surveillance technology and ban law enforcement use of facial recognition outright in multiple Bay Area jurisdictions. That effort led to the nation's first ban on facial recognition when San Francisco supervisors voted in May to prohibit its use by law enforcement there.
Hofer sued the Contra Costa County sheriff in December for an incident when he said deputies drew guns on him and his younger brother after a license plate reader misidentified the car he rented as stolen. Hofer v. Emley, 18-cv-07580 (N.D. Cal., filed Dec. 17, 2018).
Contra Costa County public information officer Jimmy Lee said the bad data resulted from human error: police in the jurisdiction where the vehicle had previously been reported stolen neglected to update the system.
According to Hofer, lack of training is just part of the problem. He said law enforcement's use of surveillance technology is already invasive, but facial recognition outweighs many other programs when it comes to its potential for civil rights violations.
"It's shared responsibility," Hofer said. "If these vendors want to be in the chain of command, they need to ensure the data in their system is accurate, especially as we move into data analytics and predictive policing."
A 2012 study co-authored by a senior FBI technologist looked at three algorithms and found they performed 5% to 10% worse on African Americans than on Caucasians. One algorithm failed nearly twice as often when the photo was of an African American, according to the study author.
Hofer said in spite of these inaccuracies, law enforcement is using the technology in more ways, often without a warrant.
A 2019 study from Georgetown Law found that New York police ran sketches through the department's facial recognition program and colored in pixels to alter photographs that weren't clear. The department also used a picture of Woody Harrelson to catch a suspect witnesses described as looking like the actor.
"I can't believe it's legal, and it can only be because the defense bar doesn't know about it yet," Hofer said. "Facial recognition is going to be the biggest power shift between the government and the governed, and I hope the majority of us wouldn't consent to those who have police power to use such a massive surveillance tool."
Providers like Amazon Web Services, which licenses its Amazon Rekognition software to law enforcement, say the use of advanced technology is of huge benefit to society. Lauren Lynch, a spokesperson for the company, wrote in an email that one customer, whom she would not identify, has used Rekognition to identify over 3,000 human trafficking victims. She also said the company has not seen a misuse of its facial recognition technology by law enforcement nor has it been used to infringe on civilians' civil liberties.
"Over the past several months, we've talked to customers, researchers, academics, policymakers and others to understand how to best balance the benefits of facial recognition with the potential risks," Lynch wrote. "We outline clear guidelines in our documentation and blog for public safety use, where we also reiterated our support for the creation of a national legislative framework covering facial recognition."
Paula Lehman-Ewing
paula_ewing@dailyjournal.com