Raleigh police abruptly end use of controversial facial recognition tech

The Raleigh Police Department has halted the use of a facial recognition app that appears to violate the agency's own internal policies, six months after it paid a tech startup for the service.
Posted Feb. 11, 2020 | Updated Feb. 12, 2020
Facial recognition company's practices raise serious privacy concerns

The department paid $2,500 in August for several accounts with year-long access to Clearview AI, which has drawn widespread criticism after reporting from The New York Times revealed the company collected billions of images from the internet and social media without the consent of the sites or their users. That image collection, the company says in its marketing to law enforcement, is the "secret sauce" powering a fast and accurate facial recognition tool aimed at solving crimes.

Since the Times first reported on the company in January, privacy advocates have warned that the technology is ripe for abuse, especially as it spreads without clear restrictions on its use. At least four Silicon Valley tech giants have demanded Clearview stop collecting data on their users. The New Jersey attorney general instructed county prosecutors and local police departments to stop using the service.

Since WRAL News began asking questions about local police departments' use of the service, Clearview AI has so far failed to respond to multiple requests for comment.

For its part, the Raleigh Police Department has been using facial recognition services for several years. A nearly $50,000 annual contract with Greenville, S.C.-based DataWorks gave police the ability to match photos against a database of mugshots from the City-County Bureau of Identification.

That process would be consistent with the Raleigh Police Department's existing facial recognition policy, which went into effect in 2015. The policy puts limits on who can access the system and how photos can be obtained and compared. It also includes prohibitions on comparing photos from "social media sources, except for photos or videos that are obtained from the public domain and are directly related to an active criminal investigation."

"That is a much more limited universe of photographs than Clearview, which by its own marketing suggests that they have billions of photos from across the Internet," Ann Webb, policy counsel for the North Carolina chapter of the American Civil Liberties Union, said. "This suggests that using Clearview may go well beyond the existing policy, meaning it's essentially unregulated in Raleigh."

WRAL raised those concerns in early February after a Raleigh police spokeswoman confirmed the department had accounts with Clearview.

Sgt. Chuck Penny, supervisor for the agency's intelligence center, was scheduled to answer questions about the service Tuesday morning in an interview with WRAL. But in an email Monday night, spokeswoman Dia Harris canceled the interview, saying the department is no longer using Clearview AI "in keeping with our practice of regularly reviewing Departmental Operating Instructions."

The department paid for access to the service for a year, but halted its use after six months.

"As technology advances, the need to review and update policies and procedures pertaining to technology continues," Harris said Tuesday morning in response to written questions sent last week. "The Raleigh Police Department is in the process of reviewing and updating relevant policies. The RPD will release the relevant policies once they are updated."

While that review takes place, Harris said, officers no longer have access to Clearview.

She said the department was aware, prior to reporting from The New York Times, that the service used "images from the open source, public domain" and said the department is "guided by the Facial Recognition Policy, using only images from the public domain or those directly related to a criminal investigation."

Late Tuesday night, the police department issued a press release saying it contacted Clearview in early February "in an attempt to gather information about past use of the system for internal auditing purposes." That contact would have occurred after WRAL News began asking questions in late January.

"When we did not receive a response that we believed to be satisfactory, we ceased our use of this technology and have shut down all Departmental access," the statement reads. "We do not intend to resume use of Clearview AI."

Harris did not answer follow-up questions about how that statement squared with the wording of the policy, which says comparisons with a facial recognition system "will be made to a database of arrest photos maintained by the City-County Bureau of Identification."

More concerning, the ACLU's Webb said, is that the department sought no approval from Raleigh City Council or city staff before entering into an agreement with the company.

"The fundamental question here is, why hasn't Raleigh engaged in public notice, in a public discussion of its expanded use of facial recognition," Webb said.

Although the department has stopped using the service, it's unclear how often it was used during the six months it was in operation.

In an email to WRAL on Jan. 31, Harris called Clearview "an open source data tool" used by a limited number of employees in the police department.

"The software is most often used to investigate formidable crimes that are extraordinary in nature, such as reports of human trafficking and shootings," Harris wrote in the email. "With the understanding of the privacy issues that the software raises, consideration of the Fourth Amendment is forefront."

She also said the searches were "fairly narrow in their scope."

That appears to conflict with guidance from Clearview, which told two of the department's account holders in an email that it recommends its users "don't stop at one search."

"See if you can reach 100 searches. It's a numbers game," the email reads. "Our database is always expanding, and you never know when a photo will turn up a lead. Take a selfie with Clearview or search a celebrity to see how powerful the technology can be."

The emails, provided to WRAL through a public records request, also show officers told Clearview about five "successful" uses of the software by late August.

The company worked with a detective from the department to highlight one case in particular: the arrest of James Mero. Marketing materials from the company quote Detective Sgt. Robert Powell saying Mero was identified using Clearview after law enforcement issued a "be on the lookout" alert using a photo from one of his alleged victims.

It's unclear which law enforcement agency actually used Clearview to identify Mero, and Harris said the Raleigh Police Department wasn't using the service at the time.

Mero faces a slew of charges in Wake County court after police say he defrauded multiple victims of more than $100,000 and was found with child pornography on his phone. The charges include obtaining property by false pretense, credit card and identity theft and second-degree sexual exploitation of a minor.

Mero's public defender, Tad Dardess, did not respond to requests for comment on the case.

Harris refused to comment further on the Mero case or provide the number of times the department has used Clearview for other investigations, saying the information is not public record.

In subsequent emails Tuesday afternoon, she said the detective "did not give consent for his name to be used in any marketing materials."

Nationally, Google, Twitter, Facebook and LinkedIn have all sent Clearview AI cease-and-desist letters in response to the revelations about the company's data collection practices. In emails to WRAL, spokespeople from all four companies say Clearview violated their terms of service and the privacy of their users in collecting photos from their platforms.

The ACLU's national office has also taken issue with the company's accuracy claims. Clearview's marketing materials say an independent panel used methodology from the ACLU to test the software, finding it "rated 100% accurate, producing instant and accurate matches."

The ACLU calls that claim "highly misleading" and points to research from the National Institute of Standards and Technology that shows facial recognition systems are particularly error-prone when applied to people of color.

"We have seen no reason to think Clearview is different, but even perfectly accurate face surveillance technology raises profound privacy and civil liberties concerns – as does Clearview's dangerous partnership with police and its shadily assembled database of billions of face scans," Nathan Freed Wessler, staff attorney with the ACLU’s Speech, Privacy and Technology Project, said in a statement.

Webb said these tools prompt serious questions the community needs to grapple with before they're employed by law enforcement.

"We have to ask ourselves, 'Does this cross a line?'" Webb said, "and we really need to query the profound privacy and civil liberties concerns that are raised by a tool that essentially looks at the entire internet as a source of searchable material."

Editor's note: This story, originally published on Feb. 11, 2020, was updated Feb. 12 with a statement from the Raleigh Police Department issued late on Feb. 11.