An interview with Hoan Ton-That, CEO of Clearview AI, captures his defense of Clearview technology against people fearing the implications of biometric technology. Ton-That's claim is that the images gathered by Clearview AI are willingly uploaded by people who have also agreed to certain terms and conditions. In other words, Clearview AI is simply searching for images that are already public and loading them onto its database. Clearview does not use pictures that are uploaded for private access, nor has it poked any holes in the terms and conditions of any social media site (unlike Cambridge Analytica in the Facebook data scandal).

Although the argument put forth by the founder of Clearview AI is both logically sound and technically legal, many are hesitant to accept it - and for good reason. After all, there must be some rational explanation as to why tech giants like IBM, Amazon, and Microsoft have stopped selling facial recognition software to police departments ⁴. There are two main reasons for rejecting the implementation of facial recognition software in law enforcement: fear of the possibility of bias, and privacy-consent issues. There are several existing cases of algorithmic bias disproportionately overrepresenting and underrepresenting minority and non-minority populations, as discussed by Jae Makitalo.

Clearview claims that its app is 99.6% accurate - but even the slightest chance of discrimination at the hands of the police is worrisome for the public. This marginal chance of incorrect identification and its implications are not being ignored by people protesting police practices - banning biometric technology is among their top demands ⁵. Miami Police Assistant Chief Armando Aguilar understands the concern of gender and racial bias, but says that his department has measures in place to prevent wrongful arrests. Specifically, the Miami Police Department's policy ensures that officers and detectives are aware of algorithm bias and cannot make arrests based solely on a recognition identification. The policy states that "a positive facial recognition search result alone does not constitute probable cause of an arrest", meaning officers would require additional evidence, like a witness or DNA, before making an arrest ⁶. Maureen McGough, National Programs Director of the National Police Foundation, reveals that all U.S. law enforcement agencies using Clearview AI and other biometric tools use facial recognition IDs only as leads, not as evidence ⁷.

While these safeguards may be comforting for some, opposition towards facial recognition technology comes from yet another frontier: privacy. Even if extra caution is taken to prevent algorithm bias, there remains the issue of privacy with respect to facial recognition technology. As it stands, the Clearview AI database is made up of three billion publicly available photos. It doesn't matter if those images are later deleted or changed to private-viewing only - the Clearview database will retain any image so long as that image was once public. It doesn't even matter if the images were uploaded without someone's consent, like people who may appear in the background of pictures. When questioned about this, the founder of Clearview said, somewhat sheepishly: