China's 'social credit' system uses technology to punish citizens
Published Wednesday, December 19, 2018 7:00AM EST
Last Updated Wednesday, December 19, 2018 7:26PM EST
In a 2016 episode of the dystopian Netflix show “Black Mirror,” everyone uses their smartphones to rate each other on a five-point scale, not unlike how you might rate your Uber driver but with far more serious consequences. The protagonist wants a nicer apartment but she doesn’t have the 4.5 rating required so she’s stuck where she is. She gets into a fight with an airline employee and her score drops so low that she can’t board a plane or even rent a decent car. The pressure of trying to conform ends up making her more and more distressed. Eventually, she’s imprisoned.
The scenario might strike you as pure science fiction but these types of technology-enabled punishments and rewards are increasingly the reality in China. In March, the country’s National Development and Reform Commission boasted that more than nine million plane ticket sales and more than three million rail ticket sales had been blocked under what China calls "social credit." Passengers are reminded in Mandarin and English when they board high-speed trains that riding without a ticket, disorderly behaviour or smoking can result in a negative record in the “individual credit information system.”
"Here's a dystopian vision of the future: A real announcement I recorded on the Beijing-Shanghai bullet train. (I've subtitled it so you can watch in silence.) pic.twitter.com/ZoRWtdcSMy" — James O'Malley (@Psythor), October 29, 2018
Social credit in China isn’t exactly like “Black Mirror.” Individuals aren’t using their smartphones to rate each other. Still, the Chinese government is actively shaping citizens’ behaviour using the threat of technology-enabled blacklists, and it’s enlisting the private sector to help turn rulebreakers into virtual second-class citizens. Those who have studied social credit say it’s too soon to know how invasive the system will become, but some worry that social credit could be used to automate the process of cracking down on political dissent.
One of the most famous social credit experiments was started by local officials in Suining City in 2010. Each person was given 1,000 credit points. Citizens’ scores dropped each time they were found guilty of an infraction, including obviously anti-social things such as driving drunk or offering a bribe, and less obviously unacceptable behaviours such as “heretical activities,” disturbing the “social order” and failing to display “household virtue.” Those with the most points were rewarded with access to civil service jobs, business licenses, procurement contracts, loans, subsidies and skills training. Others were left out.
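The Suining scheme described above amounts to a simple deduction-and-threshold system. The sketch below is purely illustrative: the specific point values and the reward cutoff are invented for the example, since the city's full deduction schedule was never made public.

```python
# Hypothetical model of a Suining-style points system: everyone starts
# at 1,000 points, recorded infractions deduct points, and only those
# above a cutoff qualify for rewards. All numbers here are invented.

DEDUCTIONS = {
    "drunk_driving": 50,        # invented value
    "bribery": 35,              # invented value
    "disturbing_social_order": 20,  # invented value
}

def score(infractions, start=1000):
    """Return remaining points after a list of recorded infractions."""
    total = start - sum(DEDUCTIONS.get(i, 0) for i in infractions)
    return max(total, 0)  # floor at zero

def eligible_for_rewards(points, threshold=970):
    """Hypothetical cutoff for perks such as civil service jobs or loans."""
    return points >= threshold

print(score(["drunk_driving"]))              # 950
print(eligible_for_rewards(score([])))       # True
print(eligible_for_rewards(score(["drunk_driving"])))  # False
```

The point of the sketch is how mechanical such a system is: once behaviours are mapped to numbers, eligibility for jobs, licenses and loans reduces to a threshold comparison.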
Rogier Creemers, a Chinese law researcher at Leiden University in the Netherlands, has studied social credit and is quick to point out that Suining’s points-based system was heavily criticized in the state media and appears to have been discontinued. Still, elements of the experiment including publicly naming and shaming rulebreakers and making it difficult for them to do business are present in the national system described by the Plan for the Construction of the Social Credit System (2014 to 2020). The stated objective of the national plan is “raising the honest mentality and credit levels of the entire society.”
Since then, government agencies have signed agreements to create a series of blacklists known collectively as the Joint Punishment System, which Creemers documented in a research paper earlier this year. As of 2016, people who are defined by the courts as “chronic cheats” can be punished in a number of ways. They may not be able to hold certain positions in state-owned enterprises, work in the civil service or join the military. They may also face restrictions on buying real estate, renovating houses, buying cars, travelling on high-speed trains, using airplanes, staying in high-end hotels, staying at holiday resorts, joining golf courses, getting visas to go on foreign holidays or sending their children to private schools.
The central government argued in its 2014 plan that more trust was needed given “especially grave production safety accidents, food and drug security incidents happen from time to time, commercial swindles, production and sales of counterfeit products, tax evasion, fraudulent financial claims, academic impropriety and other such phenomena.”
Creemers told CTVNews.ca in a telephone interview that it was easy to make the case that a new way of cracking down on anti-social behaviour was needed. For example, there was a genuine problem with first-time flyers “wanting some fresh air” and opening up the emergency exit doors on planes, so taking away the right to fly was arguably a reasonable deterrent. There was also a need for online retailers to gauge the trustworthiness of people buying and selling products in a country where the vast majority did not have credit cards and therefore did not have credit scores, so another type of credit score was needed. China also needed a way to punish nefarious businessmen selling adulterated foods and drugs. In 2008, an estimated 300,000 babies were sickened and at least six died because an infant formula producer was bulking up the product with the toxic compound melamine. Creemers says social credit is at least partly about preventing these genuine problems. Courts could impose fines or issue orders to deal with these types of behaviour in the past, but people didn’t always pay their fines.
“The rationale is, if you can’t pay your fine, you’re not going to travel on aircraft, you’re not going to travel on high-speed trains. You could travel but you’re going to get the cheapest ticket on the slowest train that stops in every hick town,” he says. “Similarly, if you can’t pay your fine, you clearly aren’t in any financial position to get a mortgage, so you’re not getting a mortgage.”
Creemers says that, at least so far, to face the wrath of the social credit blacklists, “you need to have been convicted of something by a people’s court and you need to have not performed that judgment.” In other words, as far as he knows, the punishments against individuals have so far been meted out only by judges, rather than bureaucrats.
But Maya Wang, a researcher from Human Rights Watch in Hong Kong, told CTVNews.ca in a telephone interview that even if the blacklists are only used to enforce court orders, they can still be used to violate people’s human rights because Chinese courts frequently make arbitrary decisions.
Wang points to the case of Li Xiaolin, a lawyer who was hit by the social credit system in 2016. Li was 1,900 kilometres from home and tried to use his national identity card to purchase a plane ticket but was informed by a message on the screen that he was blacklisted by China’s top court. A local judge had placed Li on the list without telling him. When Li finally reached the judge weeks later, he was told it was because an apology he’d been ordered to give years earlier hadn’t been sincere. The reason?
“The judge said (Li) dated the apology on April 1 and April 1 is April Fools’ Day,” according to Wang. “Eventually he got the lawyers’ association to advocate on his behalf. The judge said, ‘OK, if you apologize again, we’ll take you off this blacklist.’ He did apologize again (and) was taken off the travel blacklist … But when he tried to obtain a mortgage, he found out he was also on the mortgage blacklist.”
In other words, not only can judges make arbitrary decisions about who gets listed in the Joint Punishment System but it’s unclear how to get off the blacklist once you’re on it.
The other big question for Wang is whether the social credit system will be used differently in the future, especially if it becomes integrated with the security cameras, facial-recognition technology and security databases proliferating in Chinese cities.
She points out that overtly political behaviour such as engaging in protests or publicly criticizing certain officials has always been heavily policed in China, and the job of cracking down on dissidents remains the purview of the Ministry of Public Security rather than social credit. But she says that doesn’t mean the tools created in the name of social credit can’t be used to punish people arbitrarily down the road.
Human Rights Watch has documented how police in some Chinese cities are creating huge databases of information about their citizens including their addresses, family relations, birth control methods and religious affiliations. Some police have said they plan to add hotel, flight and train records, biometrics and CCTV footage to the files.
In Shanghai, the city is already using facial recognition for everything from verifying identities before people can rent apartments, to letting them pay subway fares, to automatically emailing tickets to jaywalkers caught on camera, matched against the photos on their national ID cards.
Citizens in the megacity can also use their faces to sign up for a new app called Honest Shanghai, which is based on social credit principles. Those who sign up are given a rating of “good, very good or bad” based on information linked to their ID cards, such as whether they have committed a restaurant hygiene violation or paid their municipal water bill. When it was launched, there were no punishments for a bad score, but a good score allowed users to collect rewards like discounted airline tickets. It also makes a person more attractive to deal with. For example, if you sell dumplings, you might get more business if potential customers can quickly see that you don’t have any health code violations. Shao Zhiqing, whose department oversees the application, told U.S. National Public Radio last year that the city hopes to one day incorporate information from “industry associations, private companies, and social media.”
Having that type of information collected in one place and easily available to the government worries Wang. She says the simple act of saying that “I really hate (President) Xi Jinping” to the person sitting next to you on a street in Shanghai could one day lead to a near-automatic listing in the Joint Punishment System. “Now it’s the police officers doing the hard work of figuring out what this activist is up to,” she says. “The worst case scenario is that all this would be automated in a very systematic manner that is very fast,” she adds. “That would completely prevent any possibility of dissent.”
Samantha Hoffman, a non-resident fellow at the Australian Strategic Policy Institute, has studied social credit and says it’s not yet known whether apps like Honest Shanghai will be abused by the government. However, she’s convinced that social credit is about asserting political control in a way that avoids a public backlash, precisely because it does simultaneously help to solve real problems such as tainted baby formula.
Hoffman points out that social credit has already been used to bring powerful foreign companies into line politically. Earlier this year, airlines including Air Canada were told that if they did not change the way Taiwan is listed on their global websites to say it’s part of China rather than its own country, the aviation authority would create a record of “serious dishonesty” in the social credit system. “That would really seriously affect their ability to operate in China,” Hoffman says. The airlines complied.
Like Wang, Hoffman fears that “over time though, incrementally, the process for consequences of social credit will become more automatic … and a wider range of people will feel that more directly.” That’s likely to include Chinese-Canadians, she says. The Ministry of Public Security is also developing credit codes for overseas Chinese.
Hoffman says that if Canadians are concerned about human rights violations like the re-education camps targeting Uighur Muslims in western China, they ought to ask their governments to prevent the export of technology that can assist with social credit or to use laws like the Magnitsky Act to punish those profiting. She also says that social credit should be a wake-up call for democratic countries to strengthen privacy laws.
Creemers agrees. He says people in Western nations should think long and hard about what types of data we’re willing to give away to governments and private companies, knowing it could one day be used against us. For example, in the Netherlands there was a proposal to tax drivers on how much they used roads, which would have meant creating a record of where everyone has driven. “We shouldn’t really look at China as a completely different case,” he says. “A lot of the thinking that informs the Chinese social credit system is actually present in our states as well.”