
A Practitioner's Guide to the Responsible Use of Facial Recognition Technology


AI is a driving force behind many law enforcement technologies, especially video and image analysis tools. The benefits are many: reducing repetitive tasks and workloads, recognizing patterns, alerting users to anomalies, and streamlining investigations. You can see AI at work in technologies like object detection, license plate recognition, weapon detection, acoustic gunshot detection, and facial recognition.


Understanding what AI can do is important, and just as important is knowing what AI doesn’t do. Mischaracterizations of the technology might lead one to believe that AI and computerized automation are one and the same. Simply put, AI is NOT the same as automation.


That distinction is clear with Clearview AI’s facial recognition technology, powered by an algorithm that is made smarter and more accurate through machine learning. The algorithm does not automate decision making; rather, it puts the onus on a person to analyze image search results and apply best investigative practices. Law enforcement still has to do the investigative work, but the technology provides a significant head start in the form of investigative leads that might otherwise take days, months, or even years to uncover.


Recently, our Co-Founder and CEO published a blog post on the steps Clearview AI has taken to ensure the responsible use of facial recognition technology. Those steps include training protocols, advanced AI to achieve >99% accuracy across all demographics in NIST testing [1], and tools within the solution that allow for auditing and reporting at the agency level. To further support this, we developed a recommended five-step investigative workflow for law enforcement using facial recognition technology.



RECOMMENDED INVESTIGATIVE WORKFLOW

As agencies develop policies around facial recognition technology, one key consideration is how a user of the technology performs searches and analyzes the results. The following five steps can dramatically improve the results of a facial recognition search and support an agency’s policy regarding search, subsequent analysis, and investigative protocol.







1. ANALYZE PROBE IMAGES

Prior to running any search, the analyst or investigator should examine the probe image to assess its quality. Some images, especially ones taken from ATM and CCTV cameras, may present a facial image that is off-axis, where the subject's face is turned away from the camera. Depending on lighting conditions and camera quality, the image may also be blurry or dark. For most facial recognition platforms, these deficiencies can be problematic, and it is advised that the image not be run unless it can be enhanced. Clearview AI 2.0 recently added image enhancement capabilities directly on the platform, which in many cases allows investigators and analysts to produce valuable leads despite these issues. These capabilities include an in-line editor to flip, rotate, adjust brightness, crop, or un-blur an image. Clearview AI’s in-line editor documents all enhancements performed on probe images, and these modified images are marked in the audit trail.
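
To make that pre-check concrete, here is a minimal sketch in Python using OpenCV of how an agency might screen a probe image for low resolution, darkness, or blur before submitting a search. The function name and thresholds are illustrative assumptions for this sketch, not Clearview AI’s implementation.

```python
# Illustrative probe-image pre-check (not Clearview AI code): flag images that
# are likely too small, too dark, or too blurry to search without enhancement.
# The thresholds below are placeholder values an agency would tune to its own policy.
import cv2


def assess_probe_image(path: str) -> dict:
    img = cv2.imread(path)
    if img is None:
        raise ValueError(f"Could not read image: {path}")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    height, width = gray.shape
    brightness = gray.mean()                           # 0 (black) .. 255 (white)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()  # low variance suggests blur

    issues = []
    if min(height, width) < 200:
        issues.append("low resolution")
    if brightness < 60:
        issues.append("underexposed / dark")
    if sharpness < 100.0:
        issues.append("possible blur")

    return {
        "ok": not issues,
        "issues": issues,
        "width": width,
        "height": height,
        "brightness": round(float(brightness), 1),
        "sharpness": round(float(sharpness), 1),
    }


if __name__ == "__main__":
    print(assess_probe_image("probe.jpg"))
```

An image that fails such a pre-check would be routed to the in-line editor for enhancement, or set aside, rather than searched as-is.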







2. UNIQUE MARKS

Once a search of the probe image is complete and results are returned, trained face examiners can confirm whether the probe and a candidate image depict the same individual or determine that they show two different people. An agency’s policy should establish, as written protocol, that face examiners always look for distinctions and unique identifiers during the face analysis. Those identifiers include unusual facial markings such as scars, moles, or tattoos, ear shape or size, or an individual’s hairline or hair type. These unique identifiers help support decisions and validate the accuracy of the results returned from a facial recognition search. Once a face examiner has completed the facial analysis, we recommend taking any further investigative steps needed to validate or disprove the match.
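
As an illustration of how an agency might record this analysis consistently, below is a hypothetical Python data structure for an examiner’s notes. The field names and conclusion labels are assumptions for the sketch, not a Clearview AI schema or a forensic standard.

```python
# Hypothetical record of a face examiner's analysis; field names and conclusion
# labels are illustrative only, not a Clearview AI schema or a forensic standard.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class FaceExamination:
    case_number: str
    examiner_id: str
    probe_image_id: str
    candidate_image_id: str
    # Unique identifiers noted during analysis: scars, moles, tattoos,
    # ear shape or size, hairline, hair type, etc.
    identifiers_observed: list[str] = field(default_factory=list)
    conclusion: str = "inconclusive"  # e.g. "support", "non-support", "inconclusive"
    notes: str = ""
    examined_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


exam = FaceExamination(
    case_number="2023-00123",
    examiner_id="examiner-07",
    probe_image_id="probe-001",
    candidate_image_id="candidate-114",
    identifiers_observed=["scar above left eyebrow", "attached earlobes"],
    conclusion="support",
    notes="Hairline and ear shape consistent; recommend follow-up investigation.",
)
print(exam)
```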







3. PEER REVIEW

Another recognized best practice is for the investigator or analyst to solicit a peer review to confirm a potential match. Investigators ask other members of the department, also trained in facial recognition technology, to look for unique identifiers to confirm the match. If sufficient agreement exists (typically, three out of five individuals in the review group agree), the investigator can feel confident proceeding to develop the lead through ordinary investigative processes.
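
A simple way to encode that agreement rule in an agency’s own tooling might look like the Python sketch below; the three-of-five threshold mirrors the example above and is not a fixed requirement.

```python
# Minimal sketch of the peer-review agreement check described above; the
# three-of-five threshold mirrors the example in the text, not a fixed rule.
def peer_review_passes(votes: list[bool], required: int = 3, panel_size: int = 5) -> bool:
    """votes[i] is True if trained reviewer i agrees the images show the same person."""
    if len(votes) != panel_size:
        raise ValueError(f"Expected {panel_size} reviewer votes, got {len(votes)}")
    return sum(votes) >= required


# Three of five reviewers agree, so the lead can proceed through
# ordinary investigative processes.
print(peer_review_passes([True, True, False, True, False]))  # True
```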








4. ESTABLISH PROBABLE CAUSE BEYOND THE MATCH

A facial recognition match should never be used as the sole basis for an arrest. It is not a replacement for good policing and investigative work. If the facial recognition search and peer review result in a strong lead, an immediate background investigation should be conducted. Thorough follow-up may reveal, for example, that the person of interest was unavailable to commit the crime, in which case the potential match is not confirmed. The onus falls on the agency to establish cause for arrest by other investigatory means. Some common examples include the following checks (an illustrative sketch follows the list):

  • Check incarceration status - was the individual incarcerated at the time of the incident?

  • Check known address - what is the proximity of the individual’s residence to the location of the incident?

  • Check M.O. and priors - if the individual has prior arrests in his/her record, do they include a modus operandi similar to the incident in question?
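
The Python sketch below illustrates how such a corroboration checklist could be tracked; the field names, decision rules, and record sources are assumptions for illustration only.

```python
# Illustrative corroboration checklist mirroring the checks above; the field
# names and decision rules are placeholders, not agency policy or Clearview AI code.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CorroborationChecks:
    incarcerated_at_time: Optional[bool] = None   # None means the record has not been checked
    miles_from_incident: Optional[float] = None
    similar_prior_mo: Optional[bool] = None

    def rules_out_subject(self) -> bool:
        """A single exculpatory fact, such as incarceration at the time, ends the lead."""
        return self.incarcerated_at_time is True

    def fully_checked(self) -> bool:
        """True once every corroboration source has actually been consulted."""
        return None not in (self.incarcerated_at_time,
                            self.miles_from_incident,
                            self.similar_prior_mo)


checks = CorroborationChecks(incarcerated_at_time=False,
                             miles_from_incident=1.2,
                             similar_prior_mo=True)
print(checks.rules_out_subject(), checks.fully_checked())  # False True
```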

Consider the case where a photo produced by a family member helped the Hernando County Sheriff’s Office in Florida identify an associate of a homicide victim. Detectives added this person of interest to their list of people to interview and, following a thorough investigation, the agency uncovered additional evidence as well as probable cause to arrest the individual.








5. MAINTAIN A LOG

Agencies maintain transparency and earn public trust by keeping a log of their use of facial recognition technology. This check and balance documents how police used facial recognition in a case and makes it possible to audit that use. The agency’s log provides credibility, demonstrates responsibility, and offers full transparency.
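
One minimal way an agency could keep such a log on its own side is an append-only file of structured entries, as in the Python sketch below; the fields shown are assumptions, not a description of Clearview AI’s built-in audit and reporting tools.

```python
# Minimal sketch of an agency-side audit log as an append-only JSON Lines file;
# the fields are assumptions, not a description of Clearview AI's audit tooling.
import json
from datetime import datetime, timezone


def log_search(logfile: str, *, user_id: str, case_number: str,
               probe_image_id: str, enhancements: list[str], outcome: str) -> None:
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "case_number": case_number,
        "probe_image_id": probe_image_id,
        "enhancements": enhancements,  # e.g. ["crop", "un-blur"]
        "outcome": outcome,            # e.g. "lead generated", "no candidates"
    }
    with open(logfile, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")


log_search("frt_audit.jsonl", user_id="examiner-07", case_number="2023-00123",
           probe_image_id="probe-001", enhancements=["crop", "un-blur"],
           outcome="lead generated")
```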


For investigative uses of facial recognition technology like Clearview AI, agencies must ensure that they are promoting and demanding responsible use. Even if no statewide or federal regulations exist with respect to facial recognition use by law enforcement, the agency should develop its own policy to govern the use of the technology. This includes who has access, what training they must complete prior to being given access, and, most importantly, how that discrete set of approved users operates the technology. With any powerful law enforcement technology, safeguards must be established and put in place. Through proper use and application by practitioners, facial recognition technology can continue to solve crimes, locate and rescue victims, and make communities safer.


[1] NIST FRVT 1:1, October 2021

___


ROGER RODRIGUEZ

VP Market Development (NYPD-Ret.)

Roger Rodriguez is a former member of the New York Police Department directly involved with the development of the Real Time Crime Center, its Facial Identification Section and specialized initiatives for the Intelligence Division.


As a thought leader in the public safety sales space, he supports the development, marketing, and sales of facial recognition technology platforms. His domain experience with SaaS solutions and data analytics allows for regular exchanges with police executives, law enforcement associations, academia, software developers, lobbyists and the media. In 2018, Roger authored a POST accredited course for law enforcement about the uses of facial recognition technology for public safety.


Presently, Roger serves as Vice President, Market Development at Clearview AI. He can be seen at industry trade shows and conferences evangelizing the value of Clearview AI in support of public safety, homeland security, law enforcement operations, national defense, and intelligence.
