The San Mateo County Sheriff’s Office is considering purchasing facial recognition software created by the company Clearview AI after testing it around 2,000 times.
A BuzzFeed News investigation published late last month found that use of the software, which identifies people by comparing their photographs to images from public web sources like social media, was widespread. Citing an anonymous source, the investigation found thousands of government entities and private businesses in the United States and abroad using the software.
Among the users it found was the San Mateo County Sheriff’s Office, which was said to have logged around 2,000 uses of the software between 2018 and 2020. When first asked by BuzzFeed reporters about the department’s use of the software, Public Information Officer Rosemerry Blankswade reportedly said, “No one is able to speak about these matters at this time.”
But this week, Assistant Sheriff Ed Wood confirmed the department’s past use of the software on a trial basis. He said the trial has now ended and no one in the field from the Sheriff’s Office is currently using the software.
Wood said an internal group of detectives and people from the operations side of the Sheriff’s Office are tasked with determining if it should purchase the software. He did not say when the department might decide.
The use of Clearview AI has come under scrutiny because of the company’s marketing tactics, which often target individual employees through their public email addresses with offers of free trials. Sometimes those trials occur without the knowledge or permission of the employees’ departments or of the public. The technology has also been criticized as marking “the end of privacy” because it pulls personal information from public websites so efficiently and without oversight.
These issues led San Francisco to ban the use of the technology entirely in 2019. BuzzFeed News investigative reporter Ryan Mac, who worked for more than a year to uncover who was using the software after internal documents were leaked, said he was often met with confusion when he reached out to departments for comment on their use of the software.
“We were literally informing police chiefs that their employees were using it,” Mac said. “That was shocking to us.”
The San Mateo County Sheriff’s Office currently has a policy specifically addressing the use of facial recognition technology. According to the policy, which was adopted Jan. 10, 2020, officers should only use the software in conjunction with other investigative methods to identify someone and are strongly recommended to seek peer reviews when feasible. Wood confirmed that the Sheriff’s Office has used the software successfully to help identify suspects or victims, but always alongside other investigative means.
“The software gives you a direction of who it might be,” Wood said. “It scrubs the open web, but then you can match it to a driver’s license or a government ID. This system does not solve crimes.”
The Sheriff’s Office policy says that a search warrant is generally not required to use the software, but would be obtained if necessary, and the software may only be used for legitimate law enforcement and public safety purposes.
To Mac, the bigger concern is how these and other policies are enforced. Without insight into the system, there is no auditing of its use, no search history available, and no way to know when or how it has been used by individuals. The Sheriff’s Office’s policy requires the department to maintain a log of requested, accessed, searched or disseminated facial recognition information, which the Review has requested via a Public Records Act request.
Jennifer Jones, technology and civil liberties fellow at the American Civil Liberties Union of Northern California, said the ACLU opposes use of the software by law enforcement and advocates for local bans. In addition to the issues around transparency, Jones wrote in an email to the Review that the technology raises equity concerns.
“Facial recognition algorithms erroneously match and misidentify Black and brown people at a disproportionate rate,” Jones wrote. “This is particularly alarming information when coupled with the knowledge that facial recognition tools like Clearview AI have been added to the arsenal of tools that local law enforcement and federal immigration enforcement agencies have at their disposal.”
Wood said he hasn’t heard from any colleagues who used the software and thought it was biased, but said its overall usefulness was still under evaluation. He said the department found the software useful more than half the time in helping identify a person of interest. Money is also a factor in the department’s decision, Wood said. According to Mac, most contracts with Clearview AI cost around $1,000 per user account per year.
“We’re now at the point where we’re deciding if we will spend the money,” Wood said. “... We really want to go through it and do some good research and make sure what we’re using and spending public funds on meets the Sheriff’s Office’s needs and desires.”