The algorithmic self

All over the world, predictive algorithms are shaping societies in profound ways. How can we, as a society, make sure we have the right to regulate how they affect our lives?

by MAIZE

Big data · 01 February 2018

Today, technology has permeated every sector of our society. Our daily lives are flooded with new applications, and as a species we are rapidly becoming one with our tools. Most of these developments are visible; we see them around us all the time. But what about the technologies that are not overtly present in our lives?

The ones that affect us more than anything else? These unseen forces are now an intrinsic part of how nations and businesses function, and to remain in the dark about them is to stay blind to the predictive algorithms that already control your future and the potential paths your life can take.

All over the world, predictive algorithms are shaping societies in profound ways. In China, we are witnessing one of the biggest social experiments in history, as 400 million people relocate from villages to cities. In the largest mobilization project ever undertaken, China now has to find a way to integrate these people into urban society quickly. One solution that has been devised is to give these rural citizens a credit score with which they can demonstrate their potential to new employers and schools. Many of these citizens have no formal records when it comes to banking, education and the other usual indicators of individual merit, so authorities have no real means of assessing their capabilities.

This situation has prompted a complete rethink of credit scoring: instead of a conventional financial score, China has opted for a social scoring system. The leading project in this area is Sesame Credit, a social credit scoring system developed by Ant Financial Services Group, an affiliate of the Chinese Alibaba Group and an associate of the Chinese government.

So how does it work? Initially, the platform establishes who your friends and family are in order to seed your score (between 350 and 950) and map your social graph. A person's behavior then impacts their score: by drawing on its citizens' phones and other platforms, the government can ascertain what an individual is up to. Does the individual buy baby food and provide for his or her family? Their score goes up. Do they spend ten hours a day playing video games? Their score goes down. Did they criticize the government online? Then their score takes a nosedive.
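To make those mechanics concrete, here is a minimal sketch of how a behavior-weighted score of this kind could be computed. The actual Sesame Credit model has never been published, so every signal, weight, and threshold below is a hypothetical illustration drawn only from the examples above.

```python
# Toy illustration of a behavior-weighted social score.
# Every signal and weight here is invented for illustration; the real
# Sesame Credit model is proprietary and unpublished.

SCORE_MIN, SCORE_MAX = 350, 950  # the score range cited above

# Hypothetical per-event adjustments, mirroring the examples in the text.
WEIGHTS = {
    "buys_baby_food": 10,           # providing for one's family: score rises
    "gaming_hour_over_limit": -3,   # applied per hour beyond a daily limit
    "criticized_government": -100,  # the score "takes a nosedive"
}
GAMING_LIMIT_HOURS = 2  # hypothetical daily allowance before penalties apply

def update_score(score: int, events: dict) -> int:
    """Apply the hypothetical behavior adjustments, clamped to the valid range."""
    if events.get("buys_baby_food"):
        score += WEIGHTS["buys_baby_food"]
    excess = max(0, events.get("hours_gaming", 0) - GAMING_LIMIT_HOURS)
    score += WEIGHTS["gaming_hour_over_limit"] * excess
    if events.get("criticized_government"):
        score += WEIGHTS["criticized_government"]
    return max(SCORE_MIN, min(SCORE_MAX, score))

print(update_score(600, {"hours_gaming": 10}))             # 576
print(update_score(600, {"criticized_government": True}))  # 500
```

The point of the sketch is how little machinery is needed: once a state can observe behavior, turning that behavior into a single number is trivial.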

One of the most interesting aspects of this system is that although it began as a simple social scoring experiment, once users realized they had scores they began to share them. The population has completely bought into the idea of social scoring. This new society, not yet rolled out nationwide, has four tiers that will define a substantial amount of an individual's social mobility. If you find yourself in the top tier, you can apply for governmental positions; if you are in the fourth, you cannot even obtain a passport to travel out of your region.

Businesses, too, have embraced the idea: one dating website in China has offered free membership to users with a score of 750 or above. An employment website has stated that applicants with a score below 600 will not be considered suitable for the jobs it has on offer. Others have turned it into a promotional campaign, with train companies allowing applications for first-class tickets only from those with a score of 700 or more, while a hotel chain has offered deposit-free bookings to anyone with a score of 700 or higher.
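From a developer's point of view, these business rules are nothing more than threshold gates on a single number. The sketch below encodes the examples just mentioned; the perk names and structure are invented for illustration.

```python
# Hypothetical encoding of the score thresholds quoted above.
PERKS = {
    "dating_site_free_membership": 750,
    "first_class_train_ticket": 700,
    "deposit_free_hotel_booking": 700,
}
JOB_SITE_MINIMUM = 600  # applicants below this are rejected outright

def eligible_perks(score: int) -> list:
    """Return the perks that a given score unlocks."""
    return [perk for perk, threshold in PERKS.items() if score >= threshold]

def passes_job_screen(score: int) -> bool:
    """Mirror the employment site's hard cutoff."""
    return score >= JOB_SITE_MINIMUM

print(eligible_perks(720))     # ['first_class_train_ticket', 'deposit_free_hotel_booking']
print(passes_job_screen(590))  # False
```

The unsettling part is not this code, which is trivial, but the fact that the number being gated is computed opaquely, somewhere else, by someone else.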

Right now China has implemented this idea in only one region, but a government paper states that it aims to make the system nationwide by 2020. Initially it will be transparent, but eventually citizens will not be able to see their scores or learn what is affecting them and how. The system echoes the caste and class systems seen across the world, and could prove just as difficult for individuals to climb out of.

This all seems very suspect, and we might feel thankful that our governments and businesses aren't experimenting with our prospects in the same way. However, what many don't know is that they are: very similar algorithms are already in place across Western society. But unlike the Chinese, we know next to nothing about what these forces are for and who is designing them.

This raises a multitude of issues. The first is what information is used, and who gave permission to collect and use it. The second comes with what I call the distorted algorithm: who is actually building these algorithms, and do they have an agenda? Can they, for example, be racist without even knowing it? And a third issue: what if someone is scored incorrectly? Is there anyone who can be held accountable? Can it be rectified? Are there any systems in place at all?

There is the counterargument that this is no different from the standardized rankings that already permeate our society. Our education, online presence and businesses are all scored through different means, so why should we fear a score that is merely the culmination of these?

As much as that may be the case, we as a society should still have the right to regulate how these background algorithms affect our lives. Currently we do not, and it is impossible to know whether we have missed out on jobs, health insurance or even relationships because of these unseen forces. This is a discussion that must be had, as our algorithmic selves increasingly decide our lives for us.