How to Thwart Facial Recognition
“Why not give the camera what it wants, which is a face?” says Leonardo Selvaggio, an interdisciplinary artist. Just don’t give it your face. To help people confound facial-recognition software, Selvaggio, who is 34 and white, made 3-D, photo-realistic prosthetic masks of his own face available to anyone who wants one. He tested the masks by asking people connected to him on Facebook to upload pictures of themselves in the prosthetic: It didn’t matter if they were skinny women or barrel-chested men; short or tall; black, brown, Asian or white — the social network’s facial-recognition software recognized them all as Selvaggio. “There’s nothing more invisible to surveillance and security technology than a white man,” he says.
Selvaggio thought up the project, which he calls URME Surveillance, when he was living in Chicago, where law-enforcement officials have access to more than 30,000 interlinked video cameras across the city. He wanted to start conversations about surveillance and what technology does with our identity. He knew that researchers have found that facial-recognition software exhibits racial biases. The programs are often best at identifying white and male faces, because they have been trained on data sets that include disproportionate numbers of them, and they are particularly bad at identifying black faces. In law-enforcement contexts, these errors can implicate people in crimes they didn’t commit.
Selvaggio sees two routes to eluding facial-recognition programs. The first is to disappear: go offline and off the grid. Selvaggio prefers the second option, which is to flood the system.