Earlier this week, Amazon announced that it had improved the accuracy of its facial recognition system, Rekognition, upgrading its "emotion detection" capabilities. Along with detecting the emotions happy, sad, angry, surprised, disgusted, calm, and confused, Amazon says it has "added a new emotion: fear."

Technically, the algorithm works by learning how people's faces typically look when they express fear. When it encounters such an expression, it can label it with a certain degree of confidence. The company's product description claims that "Amazon Rekognition detects emotions such as happy, sad, or surprise, and demographic information such as gender from facial images."
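In practice, the service returns a list of candidate emotion labels, each with a confidence score, and a caller picks from among them. Below is a minimal sketch of that pattern; the response shape mirrors what boto3's `rekognition.detect_faces(Image=..., Attributes=["ALL"])` returns, but the sample data here is invented for illustration, not real API output.

```python
# Sketch: selecting the highest-confidence emotion label from a
# Rekognition-style FaceDetails record. The dict layout follows the
# documented detect_faces response; the numbers below are made up.

def top_emotion(face_detail):
    """Return (type, confidence) for the most confident emotion label."""
    emotions = face_detail.get("Emotions", [])
    if not emotions:
        return None
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Hypothetical response fragment mimicking the service's format:
sample_face = {
    "Emotions": [
        {"Type": "CALM", "Confidence": 22.1},
        {"Type": "FEAR", "Confidence": 61.7},
        {"Type": "SURPRISED", "Confidence": 9.4},
    ]
}

print(top_emotion(sample_face))
```

A real call would go through `boto3.client("rekognition").detect_faces(...)` with AWS credentials; note that even the top label here carries only moderate confidence, which is exactly the uncertainty critics worry gets flattened into a single verdict.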

While Amazon has been marketing Rekognition to law enforcement and other government agencies, the technology has drawn sharp criticism from civil liberties groups like the ACLU and Fight for the Future.

"Facial recognition already automates and exacerbates police abuse, profiling, and discrimination," said Evan Greer, deputy director of the digital rights group Fight for the Future, in a statement. "Now Amazon is setting us on a path where armed government agents could make split-second judgments based on a flawed algorithm's cold testimony. Innocent people could be detained, deported, or falsely imprisoned because a computer decided they looked afraid. The dystopian surveillance state of our nightmares is being built in plain sight by a profit-hungry corporation."

[Image: Rootstocks/iStock]

In the past, civil rights groups, AI experts, and even some of Amazon's own investors have warned that the technology is likely to be used to discriminate within the criminal justice system. Oakland, California, and Somerville, Massachusetts, have issued bans, even as many other places embrace the use of Amazon's technology. Facial recognition is already used in public spaces, airports, and even in schools.

Earlier this week, the ACLU conducted a test of Rekognition in which the nonprofit found that the service falsely matched 20% of California's state legislators to mugshots within a database of 25,000 public arrest photos. More than half of the falsely identified legislators were people of color, demonstrating some of the algorithm's bias problems. For the test, the ACLU used Rekognition's default settings. (The company says that the ACLU did not use the higher accuracy threshold that the company recommends for law enforcement.)
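The threshold dispute can be made concrete: Rekognition's face-matching calls report a similarity score per candidate, and the cutoff a customer chooses determines how many matches survive. The sketch below uses invented similarity scores; the two thresholds loosely mirror a default-style setting versus the stricter setting Amazon recommends for law enforcement, as described above.

```python
# Sketch: how the match-confidence threshold changes which candidates
# count as "matches." The match list and scores are hypothetical.

def matches_above(matches, threshold):
    """Keep only face matches whose similarity meets the threshold."""
    return [m for m in matches if m["Similarity"] >= threshold]

# Hypothetical similarity scores for a single probe photo:
candidate_matches = [
    {"FaceId": "a1", "Similarity": 99.2},
    {"FaceId": "b2", "Similarity": 91.5},  # plausible false positive
    {"FaceId": "c3", "Similarity": 83.0},  # plausible false positive
]

print(len(matches_above(candidate_matches, 80)))  # lenient threshold: 3
print(len(matches_above(candidate_matches, 99)))  # strict threshold: 1
```

The point of the ACLU's test is that nothing in the tool forces a customer to pick the strict cutoff; a department running defaults would see all three "matches" above, two of them spurious.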

The ability of any algorithm to accurately measure emotions is also disputed by scientists. A July paper published in the journal Psychological Science in the Public Interest reviewed the available evidence and found that facial movements are not a reliable indicator of what a person is actually feeling. The paper does not specifically cite Amazon in the context of emotion recognition, though it does refer to the way tech companies market such capabilities.

"The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view," reads the paper's abstract. "Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category."

"So-called emotional expressions are more variable and context-dependent than originally assumed," they write.

The approach that many companies take is fundamentally flawed, write the researchers, who hail from institutions including Northeastern University and Caltech. "Efforts to simply 'read out' people's internal states from an analysis of their facial movements alone, without considering various aspects of context, are at best incomplete and at worst entirely lack validity, no matter how sophisticated the computational algorithms."

In any case, fear detection is now part of Amazon's AWS services. Amazon declined to comment on this story.