According to the BBC, a camera system that uses AI and facial recognition to reveal states of emotion is being tested on Uyghur Muslims in detention camps in Xinjiang.

What Happened: A software engineer who installed these systems in police stations across the province spoke to the BBC’s Panorama program on condition of anonymity and shared five photographs of Uyghur detainees on whom he claimed the emotion recognition system had been tested. “The Chinese government use Uyghurs as test subjects for various experiments just like rats are used in laboratories,” he said. He also said the emotion detection camera is placed 3m from the subject, who is held in a “restraint chair” in which the wrists and ankles are locked in place by metal restraints. He claimed the AI system is trained to detect and analyze minute changes in facial expressions and skin pores, generating a pie chart with the red segment representing a negative or anxious state of mind.

The Response: The report, widely seen as documenting an outright violation of human rights, has scandalized many.
Truly horrifying story.
Right now, Uyghurs are being used as lab subjects for emotion AI, under coercion, strapped into restraint chairs, and then tagged as "nervous" or "anxious" which is taken as a marker of guilt. via @hare_brain https://t.co/EprUJgxHEC
— Kate Crawford (@katecrawford) May 26, 2021
This is Orwellian. "The Chinese government use Uyghurs as test subjects for various experiments just like rats are used in laboratories," #Orwellian #China #UyghurGenocide https://t.co/CMTJcpnllJ
— RoastMyHelwa (@Roastmyhelwa) May 26, 2021
When government uses AI, ethics must be an integral part of that process. This article is a horrific example of what happens when the state develops and deploys AI technologies with no ethical oversight. Absolutely horrendous. https://t.co/cpaIOQtIyd
— Dr Cosmina Dorobantu (@CosDorobantu) May 26, 2021