Last January, the London police announced the deployment of a facial recognition system in the city to help arrest people wanted by the police. The system captures and analyzes images of the faces of people in public places and searches for matches against a database of faces of criminal suspects. When the system identifies a suspect, officers approach the person, ask them to identify themselves, and make an arrest if they confirm it is the person they are looking for. Facial recognition systems have become a reality thanks to advances in artificial intelligence and the massive banks of facial images accumulated in recent decades.
The London police measure has, however, drawn criticism for the system's apparent inefficiency at identifying suspects. Although the London police argue that the system generates only one false alert for every 1,000 cases, a study by the University of Essex found that of 42 matches examined, only 5 were correct.
It is not clear what explains the large difference between the University of Essex's findings and the London police's figures, though the two numbers may not even measure the same thing: a false-alert rate over all faces scanned is not the same as the fraction of alerts that turn out to be correct. Beyond that, substantial variation in the system's accuracy would certainly be expected depending on the facial images it has to analyze: accuracy should be higher with images captured deliberately, well lit and in high resolution, than with blurry images taken in a public place.
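The arithmetic behind this can be made concrete with a toy calculation. All numbers below are hypothetical, chosen only for illustration: even a 1-in-1,000 false-alert rate, applied to thousands of scanned faces, can swamp the handful of genuine matches, so that most alerts raised are still wrong.

```python
# Toy base-rate calculation: a low per-face false-alert rate can coexist
# with low precision among the alerts actually raised.
# All inputs are hypothetical illustrations, not the police's actual data.

def alert_precision(faces_scanned, false_alert_rate, true_suspects, recall):
    """Fraction of raised alerts that point at a genuine suspect."""
    false_alerts = (faces_scanned - true_suspects) * false_alert_rate
    true_alerts = true_suspects * recall
    return true_alerts / (true_alerts + false_alerts)

# 10,000 faces scanned, 1-in-1,000 false alerts, 5 real suspects in the
# crowd, 80% of them correctly flagged:
p = alert_precision(10_000, 0.001, 5, 0.8)
print(f"{p:.0%} of alerts are correct")  # roughly 29%
```

Under these assumed numbers, fewer than a third of the alerts would be genuine, even though the per-face false-alert rate is exactly the 1-in-1,000 the police cite.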
Moreover, although the case of London is not unique – Moscow, for example, announced the deployment of an extensive facial recognition system shortly after the London announcement – other cities, such as San Francisco and Boston, have prohibited its use by police and government agencies on the grounds that the technology is not yet sufficiently mature. It has been noted, for example, that the technology produces biased results against minorities, who show a higher percentage of false positives than the majority white population.
The cause of the bias lies not, of course, in the nature of facial recognition systems themselves, but in the information supplied to them, in the form of facial images, during their training. Because these systems are exposed to many more images of white people than of minorities, their error rates end up skewed against the latter.
A case that has come to light in recent days, and that also involves a bias in facial recognition systems – this time against younger people – is that of the city of Buenos Aires, Argentina. As an article published this week in “MIT Technology Review” by Karen Hao reports, the city of Buenos Aires has installed a facial recognition system to capture people wanted by the police. The system is linked to a database known by its Spanish acronym CONARC (Consulta Nacional de Rebeldías y Capturas), which concentrates data on people suspected of crimes.
The Buenos Aires system has produced numerous failures in identifying suspects. In one case, for example, a man was held for six days and transferred to a maximum-security prison before being released. In another, the victim was warned that he could face further arrests in the future and, to avoid them, was given a pass to show to any officers who tried to detain him.
Furthermore, according to an investigation by Human Rights Watch, CONARC contains information on minors, most of them between 16 and 17 years old, but some considerably younger, including children as young as one year old. Although no infant is documented to have been arrested, older minors are exposed to that risk. In fact, they are more exposed than adults, since facial recognition systems make more mistakes on the youngest faces, having been trained mostly on images of older people.
We may well conclude that, as the technology advances, facial recognition systems will undoubtedly offer benefits. But until that happens, their positive and negative aspects must be weighed against each other, without losing sight of the Orwellian dystopia they could enable.