The Laboratory
Horizon Accord · Forensic Pattern Analysis

The Pipeline · Essay One of Three

Before the app on your phone, there was a test. This is where it happened — and what it proved.

The Feeling

You already know the feeling. You post something and it disappears — no engagement, no reach, no explanation. Or you search for something once and it follows you for weeks. Or you notice your feed has quietly shifted, showing you more of one thing, less of another, and you can't quite say when it changed or why.

It's a low-level unease. The sense that a system is watching, scoring, deciding — and you can't see its logic. You're not in trouble. But you can't be sure you're not. You adjust, slightly, almost unconsciously. You self-edit. You wonder.

Hold that feeling. We're going to follow it somewhere.

Because that feeling — ambient, unverifiable, shaping behavior without explicit threat — is not a side effect of consumer technology. The documentary record shows it is a feature. One that was designed, tested, and refined on a specific population before it reached your phone.

The place it was tested is called Xinjiang — a region where 13 million Turkic Muslims were subjected to one of the most comprehensive surveillance systems ever deployed against a civilian population. The results were documented — by the Chinese government itself, in leaked internal bulletins — and the technology was subsequently exported to 64 countries.

This is Essay One of three. It stays in the laboratory.

The System

Documented Fact In 2018, researchers at Human Rights Watch obtained a copy of a mobile app used by police in Xinjiang, China. They reverse-engineered it — examined its source code, mapped its functions, traced its data flows. What they found was a window into the inner workings of the system described above: how it watches, what it weighs, and who it flags.

The app connected to something called the Integrated Joint Operations Platform — IJOP. Think of IJOP as a central nervous system for population control. It pulls data from cameras, checkpoints, phone scans, home visits, gas stations, electricity meters, and package deliveries. It aggregates that data into profiles of every resident in the region. Then it uses an algorithm to generate lists of people to be investigated, detained, or flagged for further monitoring.
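The real IJOP code has never been published. What follows is only an illustrative sketch of the data flow Human Rights Watch describes: feeds aggregated into a profile, weighted rules over ordinary behavior, a threshold that emits a list for investigation. Every field name, rule, and weight below is hypothetical.

```python
# Illustrative sketch only. This mirrors the reported shape of the IJOP
# data flow; all names, rules, and thresholds here are invented, since
# the platform's actual code has never been made public.

from dataclasses import dataclass, field

@dataclass
class Profile:
    name: str
    signals: dict = field(default_factory=dict)  # aggregated feed data

# Each rule mirrors a documented flag category: ordinary, lawful
# behavior treated as a suspicion signal. Weights are hypothetical.
RULES = {
    "uses_vpn": 3,
    "relatives_abroad": 2,
    "irregular_electricity_use": 1,
    "avoids_front_door": 1,
}

def suspicion_score(p: Profile) -> int:
    """Sum the weights of every rule the profile trips."""
    return sum(w for sig, w in RULES.items() if p.signals.get(sig))

def flag_for_investigation(residents, threshold=3):
    # The output is a push list, not a verdict: people to be stopped,
    # checked, or detained, generated from the pattern of daily life.
    return [p.name for p in residents if suspicion_score(p) >= threshold]
```

Note what the threshold does in this sketch: a single ordinary behavior can be enough to cross it. The point is the shape, not the numbers; suspicion is manufactured by weights chosen upstream, invisible to the person being scored.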

Documented Fact The system was built and operated by China Electronics Technology Group Corporation — CETC — a major state-owned military contractor. Not a startup. Not a private company with ambiguous government ties. A defense contractor, building a population management system, deployed on civilians.

"It's a background check mechanism, with the possibility of monitoring people everywhere."

That's Maya Wang, senior China researcher at Human Rights Watch, describing IJOP's purpose. Not terrorism detection. Not crime prevention in the conventional sense. A background check — running continuously, on everyone, all the time.

What was being built here was not a regional security system. It was a proof of concept — for how to control a population not through explicit threat, but through continuous, invisible uncertainty.

The Test

Documented Fact The Human Rights Watch reverse-engineering of the IJOP app revealed what the system actually tracks. Some of it is predictable: travel history, known associates, prior detentions. But most of it is not what you'd expect from a counter-terrorism system.

The system flags people for: not socializing with neighbors. Using a back door instead of a front door. Owning exercise equipment. Having relatives abroad. Using WhatsApp, Telegram, or a VPN. Donating to a mosque. Applying for a passport. Changing a phone number.

None of these are crimes. Under Chinese law or any other. The system doesn't care. It generates a suspicion score from the pattern of your ordinary life — and that score determines your freedom of movement.

Documented Fact Checkpoints throughout Xinjiang are connected to the IJOP system in real time. They receive what the leaked documents call "predictive warnings" — alerts pushed to checkpoints before a flagged person arrives, so they can be stopped. A former Xinjiang resident told Human Rights Watch what this felt like in practice: "I was entering a mall, and an orange alarm went off." Police came and took him to a station. He had done nothing. The system had decided he was worth checking.
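The checkpoint mechanism the leaked documents describe reduces to a few lines. The decision is made centrally and pushed outward in advance, so the officer at the scanner sees only an alert, never a reason. A hypothetical sketch, not real platform code:

```python
# Hypothetical sketch of the "predictive warning" pattern in the leaked
# documents: flags are computed centrally and pushed to checkpoints ahead
# of time, so an ID scan triggers an alarm with no local evidence or logic.

pushed_alerts = {"ID-4471": "orange"}  # arrives from the central platform

def checkpoint_scan(resident_id: str) -> str:
    alert = pushed_alerts.get(resident_id)
    if alert:
        # The officer sees a color, not a cause.
        return f"{alert} alarm: stop and question"
    return "pass"
```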

Structural Observation This is the architecture of the feeling described at the start of this essay — scaled to lethal stakes. You don't know when you're not OK. You adjust your behavior. You avoid certain apps, certain routes, certain conversations. Not because you've been told to. Because the system has made you uncertain about what's safe.

Analyst Samantha Hoffman of the Australian Strategic Policy Institute identified this dynamic precisely in an interview with the International Consortium of Investigative Journalists: the seeming randomness of IJOP investigations is not a flaw in the system. It is how the system works. Unpredictability is the control mechanism.

What Was Learned

Documented Fact In November 2019, the International Consortium of Investigative Journalists published what became known as the China Cables — a leak of classified Chinese government documents that included the operating manuals for Xinjiang's detention camps and the IJOP surveillance system. These were not leaked by dissidents. They were internal operational documents: bulletins signed by the region's top security official, distributed to local authorities, describing the system's implementation in the Chinese government's own words.

One bulletin — Issue No. 14, dated June 2017 — provides instructions for what to do after IJOP generates a list of suspects. In a single seven-day period in June 2017, the document records, IJOP flagged more than 24,000 residents as suspicious. Security officials rounded up 15,683 of them and placed them in internment camps. An additional 706 were formally arrested. In one week. Based on algorithmic suspicion scores.

Documented Fact The same system, the leaked documents confirm, targeted users of a file-sharing app called Zapya — popular among Muslims worldwide for sharing the Quran. Chinese authorities flagged the app as a threat to political stability. Uighurs found to have downloaded it were arrested and jailed. A computer scientist from Urumqi was sent to a detention center twice — each time for thirty days — for having the app on his phone.

Not for using it to share extremist content. For having it installed.

Documented Fact Under what Chinese authorities called the Strike Hard Campaign, Xinjiang authorities also collected biometrics from all residents aged 12 to 65: DNA samples, fingerprints, iris scans, blood types. Voice samples were required when applying for a passport. All of it entered into centralized, searchable government databases — connected, according to researchers, to the IJOP system.

Human Rights Watch noted something important about this architecture: while Xinjiang's systems were particularly intrusive, their basic designs were similar to those Chinese police were simultaneously planning and implementing across the rest of China. Xinjiang was not an exception. It was a prototype.

What Was Shipped

The system did not remain contained. It moved.

Documented Fact The companies that built and refined the IJOP system and related surveillance infrastructure in Xinjiang did not stay in Xinjiang. Huawei, Hikvision, Dahua, ZTE, and other Chinese technology firms — many with documented roles in Xinjiang's surveillance architecture — supply AI surveillance technology in 64 countries, according to a Carnegie Endowment for International Peace report. Of those nations, 36 had signed onto China's Belt and Road Initiative at the time of reporting. The expansion reaches liberal democracies as well, the United States among them, as documented in the Carnegie index.

Documented Fact A 2025 leak of over 600 gigabytes of internal documents from a Chinese technology firm called Geedge Networks — founded by Fang Binxing, the engineer widely credited with designing China's national internet firewall — revealed the specific mechanism of export. Geedge had launched surveillance platform deployments in Ethiopia, Pakistan, Myanmar, and Kazakhstan. According to the Jamestown Foundation's analysis of the leaked materials, once a government adopts the platform, new capabilities developed inside China can be pushed to foreign clients through software updates. Foreign governments don't buy a product. They inherit a continuously updated system — one whose features were first tested and refined inside the PRC.
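The update mechanism described in the Jamestown analysis amounts to something like the following hypothetical sketch (not actual Geedge code): the client's capability set is simply whatever the vendor's channel currently carries.

```python
# Hypothetical sketch of the export model the Jamestown analysis
# describes: clients do not buy a fixed product, they subscribe to a
# capability channel the vendor controls.

def sync_capabilities(installed: set, vendor_channel: set) -> set:
    # Whatever is newly developed upstream propagates downstream on the
    # next update; the client never selects features one by one.
    return installed | vendor_channel
```

A capability tested inside the PRC becomes a foreign government's capability at the next sync, which is the inheritance this essay describes.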

The documents also reveal that Geedge's technology was co-developed with MESA Laboratory — the same state lab involved in building Xinjiang's surveillance infrastructure. The pipeline from detention camp to foreign government is not metaphorical. It is documented in internal engineering records.

Structural Observation The Australian Strategic Policy Institute's research into Chinese technology firms found something that deserves to be stated plainly: the companies that built Xinjiang's surveillance architecture actively obscured that work when questioned by foreign parliamentary committees. They did not disclose it. They denied or minimized it. And then the technology moved.

Xinjiang was not where the system ended. It was where the system learned what it needed to know.

Essay Two follows the technology out of Xinjiang — through Belt and Road infrastructure, into 64 countries, and toward the device in your hand.

Essay Three asks what it means that the consumer products most people use daily are the retail version of what was tested here.

If you already know the feeling, when did you first notice it?

Sources for Verification

Primary · IJOP reverse engineering Human Rights Watch, "China's Algorithms of Repression: Reverse Engineering a Xinjiang Police Mass Surveillance App," May 2019. hrw.org
Primary · Leaked government documents International Consortium of Investigative Journalists, "The China Cables," November 2019. Includes IJOP operational bulletins signed by Xinjiang's top security official. icij.org
Primary · Geedge leak Jamestown Foundation, "Internet Censorship Tools Exported Along Belt and Road," December 2025. Based on more than 600 GB of leaked Geedge Networks documents. jamestown.org
Secondary · Export data Carnegie Endowment for International Peace, AI Global Surveillance Index. Chinese firms supplying AI surveillance technology in 64 countries.
Secondary · Company disclosure Australian Strategic Policy Institute, "Mapping More of China's Tech Giants: AI and Surveillance." Documents companies obscuring Xinjiang work in foreign parliamentary testimony. aspi.org.au
Secondary · Xinjiang data project ASPI Xinjiang Data Project, "How Mass Surveillance Works in Xinjiang." Detailed technical analysis of IJOP architecture. xjdp.aspi.org.au
This essay presents pattern documentation based on primary sources, leaked government documents, and independently verified research. All factual claims are sourced. Epistemic categories — Documented Fact, Structural Observation, Hypothesis — are marked throughout. This publication makes no claim about individual product decisions or consumer choices. Readers are encouraged to verify all sources independently.
© Horizon Accord · Cherokee Schill · horizonaccord.com