Chinese authorities use AI to analyse emotions of Uyghur prisoners

Chinese law enforcement agencies use AI systems to analyse the emotions of Uyghur prisoners in the Xinjiang Autonomous Region.

This information was reported to the BBC by an anonymous software engineer who installed such systems in police stations in the province.

The name of the company the engineer works for was not disclosed. However, the informant provided the BBC with five photographs of Uyghur prisoners on whom the emotion recognition system was tested.

“We placed an emotion detection camera 3 m from the subject. It is similar to a lie detector, but uses far more advanced technology,” the engineer explained. Law enforcement officers seated inmates in “restraint chairs”, which are commonly installed in police stations throughout China; the wrists and ankles are fixed in place with metal restraints.

According to the source, the artificial intelligence system is trained to detect and analyse even the smallest changes in facial expression and skin. The program generates a pie chart showing the proportion of a person’s negative or anxious state. According to the programmer, the software was intended for “a preliminary court decision without any reliable evidence.”

Uyghurs usually have to provide DNA samples to local authorities, undergo digital scans, and most of them have to download a government phone app that collects data, including contact lists and text messages. Most of the data is entered into a computer system called the Integrated Joint Operations Platform, designed to detect suspicious behaviour.

The system collects information on dozens of types of perfectly legal behaviour, including whether people exit through the back door instead of the front door, or whether they refuel a car that does not belong to them.

These materials were presented to Sophie Richardson, China director at Human Rights Watch.

“It is shocking material. It’s not just that people are being reduced to a pie chart: these are people in highly coercive circumstances, under enormous pressure, who are understandably nervous, and that is taken as an indication of guilt. I think that is deeply problematic,” Richardson reacted.

Human rights activists believe that, with such artificial intelligence in use, Chinese citizens cannot expect any kind of privacy.

Let me remind you that we also wrote that China officially legalized the “Social Credit System”, and that the Chinese authorities use the Tianfu Cup as a source of exploits.


Daniel Zimmermann

Daniel Zimmermann has been writing on security and malware subjects for many years and has been working in the security industry for over 10 years. Daniel was educated at the Saarland University in Saarbrücken, Germany and currently lives in New York.
