Social credit is starting to bite

YouTube bristles with ‘Black Mirror’ reviews, some deciding which of its dystopian visions of tech gone wrong is nearest to coming true. Surprisingly, not all include Series 3’s ‘Nosedive’, where ‘likes’ are lifeblood: people give and receive ratings for every interaction, characters’ entire lives are determined by their score, and a bad rating from one place sticks around. It’s closer to reality than you might think.

KISS Managing Director, Sarah Reakes, writes: 

According to Fast Company, the Chinese state and regional governments are seriously trialling related ideas to measure what’s usually called ‘social credit’. For example, the central government has trialled a ‘red list’ and ‘black list’ to rank everyone on a scale from 350 to 950, where points can be lost for anything from supporting Falun Gong to playing loud music, and boosted by things like taking your parents to the doctor. A blacklisting can restrict travel, but also access to hotels, good jobs, top schools and high-speed internet. And as well as Alibaba and Tencent, some of Silicon Valley’s big players are involved, or at least starting down this road.

Big players like Airbnb can already ban you without giving a reason (competitor platforms have similar policies), and while restricting your access to Benidorm apartments might not seem too serious, there’s a disturbing trend here. People can be banned from WhatsApp, which has serious consequences in the many countries where it’s a prime communication channel, and insurers are checking social media and might revoke your insurance over that post of you in a shark cage. Bars are already using apps to detect and ban punters, potentially from bars across the entire country, for behaviour a manager deems unacceptable, from harassment to violence or persistent use of fake IDs.

While we should celebrate the joined-up use of tech and its potential (I suppose we benefit, not least because similar technologies might help catch terrorists), the big trend here is ‘extra-legal regulation’ of people’s social credit by private companies: the deliberate trawl across many channels and histories, and the rising impact a bad social credit record might have, both here and around the world.

Who gains points, who gets to book that apartment and who gets into that school are presumably decided by algorithms and, hopefully, real people. In the case of private companies like Airbnb, at least, decisions are confidential and bans are for life (and yes, they will shut you down if you simply open another account). If you wanted to challenge a blacklisting, the legal situation isn’t clear, and any redress via the justice system is likely to be cumbersome, if not impossible. But if Fast Company is right, this is small beer compared to China.

We’ve been used to credit ratings for decades, but this is far more broad-reaching. Your staff and customers might find their ability to travel to and around China restricted because of previous social media activity, or for a range of other reasons. Your insurance might rise or be refused. And of course these are early beta days – a case of mistaken identity could happen all too easily.

As brand owners, we should be aware of this trend, and to me it raises more than a few questions. We may find that in certain countries our staff, agencies and suppliers are affected by it, or are party to it, knowingly or otherwise.

For now, the most serious consequence might be that one of your staff can’t get into a cocktail bar on a Saturday night because they’ve been mistakenly identified as a troublemaker. But as technology becomes ever smarter, more joined-up and repurposed by bodies from bars to government departments, we need to be aware of its potentially far-reaching effects on customers, staff and brands.


