A brief summary of recent developments in face recognition and its growing use around the world.

Clearview AI

Clearview AI has scraped the internet for photos of people’s faces and amassed a database of over 3 billion images, which it makes available to law enforcement agencies. Which law enforcement agencies? According to Clearview, they are “focused on doing business in USA and Canada”, but according to leaked documents its technology is used by over 2,000 law enforcement agencies in countries including Australia and Saudi Arabia. The meaning of “used” is not entirely straightforward, since Clearview offers free trials to “active law enforcement personnel”, which can lead to employees using Clearview’s products without their agency’s knowledge or any oversight procedures in place. And there is evidence suggesting that such unsanctioned use by employees is already happening, at least in Australia.

London Metropolitan Police

On February 27 the London Metropolitan Police deployed Live Facial Recognition Technology at Oxford Circus. During the deployment they scanned around 8,600 faces and compared them against a watchlist of 7,292 people wanted for “serious and violent offenses.” The deployment produced 7 false matches and one correct match. Of the 7 false matches, 5 led to an “engagement”, which presumably means police stopping a person who was simply going about their business at Oxford Circus. The correct match led to an arrest. Thus the deployment had a false match rate of 0.08% (7 out of roughly 8,600 scanned faces), which sounds good, but at the same time 83% of all police “engagements” (5 out of 6) were unnecessary. Which statistic do we want to cite?
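To make the two numbers explicit, here is a minimal sketch in Python of how each percentage falls out of the figures reported above (the scan count of ~8,600 is approximate):

```python
# Back-of-the-envelope check of the Oxford Circus deployment numbers.

faces_scanned = 8_600          # approximate number of faces scanned
false_matches = 7              # alerts that turned out to be wrong
correct_matches = 1            # alert that led to an arrest
false_match_engagements = 5    # false matches where police stopped someone

# False match rate, measured against everyone scanned
false_match_rate = false_matches / faces_scanned
print(f"False match rate: {false_match_rate:.2%}")          # ~0.08%

# Share of police engagements that were unnecessary
# (5 stops on false matches out of 6 engagements in total)
total_engagements = false_match_engagements + correct_matches
unnecessary_share = false_match_engagements / total_engagements
print(f"Unnecessary engagements: {unnecessary_share:.0%}")  # ~83%
```

Both numbers describe the same deployment; the first divides the errors by everyone scanned, the second divides them by the people who were actually stopped.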

Russia

Russian courts are less worried about the use of face recognition. The Tverskoy District Court of Moscow dismissed a lawsuit filed by opposition politicians and activists claiming that the collection of biometric information “about participants of an authorized rally could result in the violation of their right to freedom of peaceful assembly.” So far, so unsurprising, but it adds to the public record of where face recognition is and is not used.

Black Software

Charlton D. McIlwain’s book Black Software: The Internet & Racial Justice, from the AfroNet to Black Lives Matter was released in January 2020. Drawing on the book, an article by Eisa Nefertari Ulen connects historical racism with modern surveillance and the use of face recognition. McIlwain summarises why face recognition in public spaces is such a dangerous technology:

You make suspects out of people when you utilize facial recognition in a given area, when you surveil the people in a given area.

I have added the book to my reading list.

Face Masks

Some people want to use masks to defeat face recognition systems. Others are working on printing your face onto the mask, so you can continue to use Face ID while wearing one.


Sample images