More than one billion people live in China. They are recorded by police cameras on street corners, on subway ceilings, and in hotels and apartment buildings. Their purchases are monitored and their online chats censored.
Their future is also under scrutiny.
The latest generation of technology digs through the vast amounts of data collected on citizens' daily activities to find patterns and anomalies. These systems target people the Chinese government deems potential troublemakers, not only those with a criminal past but also vulnerable groups such as ethnic minorities, migrant workers and those with a history of mental illness.
The systems can warn the police if a victim of fraud tries to travel to Beijing to petition the government for compensation, or if someone makes too many calls to the same number. They can alert officers each time a person with a history of mental illness gets near a school.
Evading them takes far more maneuvering than it used to. In the past, Zhang Yuqiao, a 74-year-old man who has been petitioning the government for most of his adult life, could simply stay off the main highways to dodge the authorities and make his way to Beijing to fight for compensation over the torture of his parents during the Cultural Revolution. Now he turns off his phone, pays in cash and buys train tickets to false destinations.
New Chinese technologies, detailed in procurement and other documents reviewed by The New York Times, further extend the boundaries of social and political controls and integrate them ever deeper into people's lives. At their most basic, they justify pervasive surveillance and violate privacy; at their most extreme, they risk automating discrimination and political repression.
For the government, social stability is paramount. During his decade as China's top leader, Xi Jinping has hardened and centralized the security state, unleashing techno-authoritarian policies to quell ethnic unrest in the country's west and enforce some of the world's most severe coronavirus lockdowns. The space for dissent is quickly vanishing.
At a national public security work meeting, Mr. Xi said that big data should be used as an engine to power innovative development of public security work.
Algorithmic policing of this kind, which has proved controversial in other countries, is often trumpeted as a triumph in China.
In 2020, the authorities in southern China denied a woman's request to move to Hong Kong to be with her husband after software flagged the marriage as suspicious. Investigators found that the two were not often in the same place at the same time and had not spent the Spring Festival together, and concluded that the marriage had been faked to obtain a migration permit.
In northern China, an automated alert about a man's repeated entry into a residential compound with different companions prompted the police to investigate; they found that he was part of a pyramid scheme.
The details of these emerging security technologies are described in police research papers, as well as in hundreds of public procurement documents reviewed by The Times. Many of the procurement documents were shared by ChinaFile, an online magazine published by the Asia Society. A procurement document showing that the authorities in the port city of Tianjin had bought software to stop petitioners from reaching Beijing was provided by a publication that covers the surveillance industry.
China's Ministry of Public Security did not respond to faxed questions about its activities.
Data-driven policing software in the United States and Europe has been used to make decisions about which neighborhoods are patrolled most heavily and which prisoners are paroled. The police in China have access to a far larger trove of data and operate with far less restraint.
People are often unaware that they are being watched. The police face little scrutiny over how the technology is used, and no warrants are required to collect personal information in China.
Whether the software can accurately predict what a person will do is an open question. But experts say that even when it fails to deduce human behavior, the authorities can consider it a success for the pressure it puts on the people it targets.
Maya Wang, a senior China researcher with Human Rights Watch, said the brunt of the technology is felt disproportionately by groups of people who are already discriminated against in Chinese society.
One of China's best-known entrepreneurs had a vision of a computer system that could predict crimes.
Yin Qi, a founder of Megvii, an artificial intelligence start-up, told Chinese state media that such a system could give the police a search engine for crime and warn them about suspicious behavior. He said that if cameras caught a person spending too much time at a train station, the system could flag a possible pickpocket.
Surveillance would be frightening if people were watching behind the cameras, he argued, but an automated system is like the search engines people use every day to surf the internet: neutral, and meant to do good.
With it, he said, the bad guys would have nowhere to hide.
He laid out that vision five years ago. Megvii presentations reviewed by The Times show how the company's products can be used for this kind of police work.
One product, called "intelligent search," is described as a multidimensional database that stores faces, photos, cars, cases and incident records. The presentations say the software analyzes the data to dig out ordinary people who seem innocent and to stifle illegal acts before they happen.
A spokesman for Megvii said in an email that the company was committed to the responsible development of artificial intelligence and that its work was aimed at making life safer and more convenient.
Similar technologies are already in use. In Tianjin, the police have bought software made by Hikvision, a Megvii competitor, that aims to predict protests. The system collects data on China's legions of petitioners, people who try to file complaints about local officials with higher authorities.
It scores the likelihood that each will travel to Beijing; the data is then used to train machine-learning models.
Local officials want to stop such trips before they start, and the central government does not want crowds of discontented petitioners gathering in the capital.
A Hikvision representative declined to comment on the system.
Efforts to control petitioners have grown increasingly aggressive. A 32-year-old member of a group seeking compensation for a real estate fraud said the authorities had prevented him from buying a train ticket to Beijing. He suspected that the authorities were monitoring the group's conversations on a messaging app.
The Hikvision system in Tianjin, which is run in cooperation with the police, is more advanced.
According to the procurement document, the platform analyzes individuals' social and family relationships, past trips and personal circumstances to determine their likelihood of petitioning. It helps the police create a profile of each petitioner, with fields for officers to describe his or her temperament.
Many people petition over the government's mishandling of a tragic accident or over official neglect. According to the document, a person's risk level should be raised if he or she has low social status or has been through a major tragedy.
In Zhouning, a rural county in Fujian Province, the police listed the coordinates of the 437 cameras they had bought over the past year. According to the document, some hung above intersections and schools.
Nine were installed outside the homes of people with mental illness.
A more common type of software is built around the police's preconceived notions of who poses a threat. In more than a hundred procurement documents reviewed by The Times, some systems listed people with mental illness, convicted criminals, fugitives, drug users, petitioners, suspected terrorists and perceived threats to social stability as targets. Others singled out migrant workers, unemployed youths, ethnic minorities, foreigners and people with H.I.V.
There is no process to let people know when they are on the lists. According to experts, once individuals are in a database, they are rarely removed.
The software allows the authorities to set up digital tripwires that indicate a possible threat. One system, made by the artificial intelligence company Yitu, has an interface that lets the police devise their own early warnings.
With a simple fill-in-the-blank menu, officers can base alarms on specific parameters, such as where a blacklisted person appears, when he or she meets other blacklisted people and how often certain activities occur. The police could, for example, set the system to send a warning each time two people with a history of drug use check into the same hotel.
Yitu did not respond to requests for comment.
In 2020, the police bought software that could flag more than three key people checking into the same or nearby hotels, as well as a drug user frequently calling a new out-of-town number. The authorities also bought a system to alert them if a foreigner without a work permit spent too much time hanging around foreign-language schools or bars.
The authorities have also used software to identify people whose water and electricity use exceeded normal levels. When the system detected suspicious consumption patterns, it would send a digital whistle to the police.
Such a pattern can indicate that many people are crowded into one address, a common arrangement among migrant workers trying to save money. The police regard them as an elusive and impoverished group that can bring crime into communities.
Not every automated alert draws the same level of police response, said Suzanne E. Scoggins, a professor at Clark University who studies China's policing.
Even so, the police often speak of the need to profile people. A researcher at China's national police university said in a speech that the police use big data to paint a picture of people and assign them labels, and that pre-emptive security measures are carried out against those who receive more than one type of label.
Mr. Zhang first petitioned the government for compensation over the torture of his family. Now he also wants the police to stop targeting them.
As China has built out its techno-authoritarian tools, he has had to use spy movie tactics to circumvent surveillance that, he says, has become "high tech and Nazified."
When he traveled to Beijing, he paid for transportation in cash to minimize his digital footprint. He bought train tickets to false destinations and hired private drivers to get around checkpoints.
The petitioner-prediction system in Tianjin has a special feature for people like him who have a certain awareness of anti-reconnaissance and regularly change vehicles to evade detection.
Whether or not he has triggered the system, Mr. Zhang has noticed a change. Whenever he turns off his phone, he said, officers show up at his house to check that he has not left on another trip to Beijing.
According to Noam Yuchtman, an economics professor at the London School of Economics, police systems can be considered successful even if they can't predict behavior.
In a situation without real political accountability, he said, a surveillance system that frequently sends police officers "can work pretty well" at discouraging unrest.
Once the metrics are set and the alarms are triggered, officers have little flexibility. According to public police reports, they are evaluated on their ability to prevent protests.
The technology also reflects existing power imbalances. Some procurement documents refer to a "red list" of people whom the surveillance system should ignore.
One such document said the function was for people who need privacy protection. Another, from Guangdong Province, specified that the red list was for government officials.
Mr. Zhang, the petitioner, said he was frustrated by the way technology has cut off those in political power from the rest of the population.
The authorities, he said, do not really solve problems but do everything they can to silence the people who raise them. He called it a step backward.
He said that he still believed in the power of technology, but that in the wrong hands it could be a problem.
In the past, he said, if you left your home and took to the countryside, all roads led to Beijing. Now the nation is a net.
Research and reporting were contributed by two others. Production by Alexander Cardia.