CRBC News

AI Is Supercharging China’s Surveillance State — Algorithms Are Tightening Control

ASPI’s new report finds that China is integrating AI across surveillance, censorship, courts and prisons to monitor citizens more effectively and to anticipate and suppress dissent. Major cities and tech firms lead deployment, while the government pushes for AI systems in courts by 2025 and installs "smart" prison technologies. The report warns that these tools, including open LLMs for minority languages, could be adopted by other states, spreading censorship and surveillance practices abroad.


AI Is Deepening Surveillance And Censorship In China, ASPI Report Finds

A new report from the Australian Strategic Policy Institute (ASPI) documents how China is embedding artificial intelligence into its existing surveillance and censorship systems to monitor citizens more efficiently, predict unrest and suppress dissent. The technology is being deployed across cities, courts and prisons and is increasingly supported by major domestic tech firms and state investment.

“AI lets the CCP monitor more people, more closely, with less effort.”
— Nathan Attrill, Senior China Analyst and ASPI Report Co-Author

Key Findings

ASPI finds that AI is being used to automate content moderation, enhance facial recognition and geolocation tracking, and to produce predictive tools aimed at pre‑empting protests and identifying individuals the authorities perceive as risks. Deployment is most advanced in major urban centres such as Beijing and Shanghai, while rural areas lag behind due to weaker digital infrastructure.

Though precise counts vary, the report cites estimates of up to 600 million surveillance cameras across China — roughly three cameras for every seven people — many of which are now augmented with AI capabilities.

AI In Courts, Prisons And Policing

China’s Supreme People’s Court has urged courts to "develop a competent artificial intelligence system by 2025," encouraging the use of AI tools in trials and administrative work. ASPI documents examples where local systems recommend detention, sentences or suspended sentences.

Prisons and rehabilitation centres are being outfitted with so-called "smart" technologies: facial‑recognition cameras that monitor inmates' expressions and flag emotions such as anger, and virtual‑reality systems used in AI-assisted therapy programs.

Role Of Tech Companies

Major Chinese technology firms are identified as central to the ecosystem. The report highlights that companies develop moderation and surveillance products, cooperate with authorities on investigations and sell tools to smaller firms. Examples cited include ByteDance (content moderation on Douyin), Tencent (behaviour monitoring and risk scoring) and Baidu (content-moderation tools and cooperation in criminal cases).

Language Models And Minority Communities

Chinese companies — with government support — are building large language models (LLMs) for minority languages such as Uyghur, Tibetan, Mongolian and Korean. ASPI and outside experts warn these models could be used to monitor communications, shape information flows and tighten control over minority communities.

Global Implications

The report warns that China’s open‑weight LLMs and exportable surveillance technologies could be adopted by other governments and private actors. ASPI notes that cheaper, widely available Chinese models can carry embedded design choices that facilitate censorship and surveillance, potentially exporting those capabilities abroad.

Caveats And Official Response

ASPI’s findings echo earlier research, and the report acknowledges geographic and infrastructural variation in deployment. Chinese state bodies named in the report did not respond to ASPI or CNN requests for comment; China’s State Council Information Office and Ministry of Justice have previously questioned ASPI’s credibility and funding. ASPI is partly funded by the Australian government as well as by foreign governments and companies, a point its critics highlight.

Implications

Experts say these technologies can improve public safety but stress the political context matters: systems built for law enforcement can also be repurposed to target dissidents, religious or ethnic minorities, and other vulnerable groups. With China’s court system operating under strong Party oversight and a reported conviction rate above 99%, observers warn that AI-driven tools could further entrench opaque decision-making and limit due process.

Bottom line: ASPI concludes that AI has made China’s surveillance and censorship apparatus more efficient, predictive and far‑reaching, and that these tools increasingly form a core part of state control rather than peripheral enhancements.
