Why so many of us have ended up at the mercy of social media

Clemens von Stengel 19 March 2018
Image Credit: Blogtrepreneur

An hour after sitting down to start writing, you find yourself clicking through the photos of a stranger. How did you end up twenty posts deep in your Facebook news feed? You never intended this to happen. Nor did you particularly enjoy the scroll that got you there. Yet somehow the site has pulled off the same trick it has performed countless times before.

Each feature of the platform you are using has been meticulously tuned to squeeze every second of attention out of you. What’s more – when you’ve put it away, it won’t be long until your ears pick up the faint buzz of your phone. A friend will like a picture you’ve been tagged in. You’ll be reminded of the event you marked yourself “interested” in two weeks ago. And, inevitably, you’ll be sucked back into the mindless scroll, searching for the next hit of dopamine.

The designers of social media have found a back door into people’s minds, and, like a parasite controlling its host, are using it to proliferate and grow. As they find more and more effective ways to exploit this, the technology is becoming more and more intrusive. This is restructuring our society, and ultimately is being crafted into a tool of oppression and control.

How exactly are we all being duped into committing more time to social media than we would like? Whilst there are many tricks of the trade, one technique has become the bread and butter of social media design: what former Google designer Tristan Harris calls “intermittent variable rewards”.

He likens the design to that of a slot machine. You pull a lever for a chance to win a prize. An intermittent action is linked to a variable reward. Repeat this enough times and your mind will reward you for pulling the lever with a hit of dopamine. This is the same design that underpins “swiping” on Tinder, or scrolling down your Facebook news feed. It is what creates an addictive response in your mind. It is what makes slot machines bring in more revenue than all other casino games combined. It’s also a key ingredient in getting people to check their phones over 80 times a day.
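The schedule described above is simple enough to sketch in a few lines of code. The sketch below is purely illustrative: the payout probability is invented, and no real platform publishes its numbers. The point is only that the reward arrives on an unpredictable subset of “pulls”, which is the schedule that conditions the most compulsive checking.

```python
import random

def pull(reward_probability=0.3):
    """One 'lever pull' -- a refresh, a swipe, a scroll.
    Pays out unpredictably, like a slot machine."""
    return random.random() < reward_probability

def session(pulls=20, reward_probability=0.3, seed=42):
    """Simulate a browsing session and count the intermittent rewards."""
    random.seed(seed)  # seeded so the illustration is repeatable
    return sum(pull(reward_probability) for _ in range(pulls))

# The rewards land on an unpredictable subset of the pulls: you never
# know which refresh will pay off, so you keep refreshing.
print(session())
```

A fixed schedule (a reward on every tenth pull, say) is easy to walk away from; it is the unpredictability that keeps the lever being pulled.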

Importantly, once the designers opened this back door into our minds, it was inevitable that it would be perfected and incorporated into every aspect of the design of social media. This is because the companies behind the design are in an arms race to get as much attention from their users as possible. Those which don’t adopt the addictive design philosophy are outcompeted, and eventually forced to adopt it or go bust.

The overwhelming market incentive to capture people’s attention is what drives even well-intentioned companies to design in this apparently malicious way.

Dopamine Labs is the poster child of this attention economy. It is a startup based in L.A. which has built an AI to tweak an app’s design by “giving your users our perfect bursts of Dopamine”. As advertised on its website, “a burst of Dopamine doesn’t just feel good: it’s proven to re-wire user behavior and habits”. We are being programmed to make decisions that benefit corporations whose goals are very different from our own.

Another way in which social media keeps people’s eyes on their screens is even more pernicious: by changing the norms of socialising to suit business needs. With enough persistent use, our values turn into those of the corporations. Our desire for social approval becomes a craving for followers, likes or retweets. What’s more, by implementing features like “streaks” on Snapchat, or encouraging you to tag your friends on Facebook, the designers can turn us users into their advertisers. This is becoming more and more intrusive, as illustrated by the recent Tinder ad campaign, in which the company targeted specific students to become “reps” and help market and normalise its service. In this creeping fashion, our society is being restructured in a way many people aren’t comfortable with.

Chamath Palihapitiya, former vice-president for user growth at Facebook, has been vocal in his disdain for the direction the company is heading. “The short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation, misinformation, mistruth,” he said at a Stanford Business School event.

Of course, for most of us the advantages of being constantly connected to everyone we know seem to outweigh the corrosive effects this has on societal values. The psychological manipulation being employed to get our attention feels like a price we have to pay. Besides, the data which is harvested is just used by companies to convince us to buy their products and use their services. It is enough to make anyone uncomfortable, but doesn’t quite stop us from willingly accepting this subordination.

All this pales in comparison to how the politically motivated can abuse the same back door into the population’s minds.

Cambridge Analytica, a self-described “behavioural microtargeting” company, has developed a targeted advertising algorithm. People’s personalities, inferred from their Facebook activity, are used to personalise advertisements. Crucially, the company is owned by Robert Mercer, the billionaire hedge fund owner who was Trump’s biggest donor. Here the company is no longer trying to maximise “attention”; rather, it is trying to manipulate the vote.

The Trump campaign was also Cambridge Analytica’s biggest client. On their website, Cambridge Analytica boast of how they helped Trump get into power. “Analyzing millions of data points, we consistently identified the most persuadable voters and the issues they cared about. We then sent targeted messages to them at key times in order to move them to action.” In part this meant finding emotional triggers for each individual voter.

More ominously, there is evidence that it involved targeting Democratic voters with the goal of getting them to stay at home on election day. AggregateIQ (whose IP, incidentally, is also owned by Robert Mercer) offers similar services. They approached the Brexit campaign, which ended up spending over half of its budget on their targeted advertising services.

In the case of Cambridge Analytica, the feat of psychological manipulation is a subtle one. They are not explicitly targeting social norms, nor are they manipulating voters to persuade one another. This is not the case in China, where the proposed “Social Credit System” is akin to something straight out of Black Mirror.

The idea is simple. First, the government collects data on each individual’s daily activity from social media sites: how people spend their time, the people they associate with, the content of their discussions, and the things they buy. This is then used to determine an overall “Social Credit Score”, which is displayed publicly and determines your internet speed, which restaurants you can frequent, and your right to travel.

Several pilot programs have already experimented with the idea. One of these, called “Sesame Credit”, is already being used by millions of people throughout China. Here data is also sourced from people’s online purchases. The extent to which every action conforms to the government’s ideal citizen is measured, and contributes to their score. Whilst the company remains secretive about exactly how the score is calculated, it is not too hard to imagine. Buying work shoes and sharing news from “approved” sources could increase one’s score. Importing media from Japan or posting pictures of Tiananmen Square could decrease it.

This score also depends on whom you associate with since, as one senior figure at a Chinese company puts it, “we can assume good people will be friends with good people”. Thus people have a strong incentive to encourage each other to be “good”. Of course, it is now the government that decides what it means to be “good”. In this way the Chinese government can use people’s friendships and communities to encourage them to buy into their own subordination.
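How such a score might combine these signals is not public, and the sketch below is entirely hypothetical: every input and weight is invented for illustration, and none of it is drawn from Sesame Credit. It only makes the incentive structure concrete, in particular the way friends’ scores feed into one’s own.

```python
# A purely hypothetical sketch of how a social score might be computed.
# None of these inputs or weights come from Sesame Credit; they only
# illustrate the incentive structure described above.

def social_score(approved_purchases, flagged_posts, friend_scores,
                 base=600):
    """Combine one's own behaviour with the scores of one's friends."""
    own = base + 5 * approved_purchases - 20 * flagged_posts
    # "Good people will be friends with good people": your friends'
    # average score feeds directly into your own.
    avg_friends = (sum(friend_scores) / len(friend_scores)
                   if friend_scores else base)
    return round(0.7 * own + 0.3 * avg_friends)

# A single flagged post drags the score down, and so do low-scoring
# friends -- which is what pushes people to police one another.
print(social_score(approved_purchases=10, flagged_posts=0,
                   friend_scores=[650, 700]))
```

Under a scheme like this, dropping a low-scoring friend raises your own score, which is precisely the mechanism by which communities come to enforce the government’s definition of “good”.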

Efforts to get people to adopt the score as a measure of their own self-worth have gone even further. Baihe, China’s biggest online matchmaking service, has integrated Sesame Credit into its platform. People’s scores are visible on their profiles, and those with higher scores are shown to more people.

The scariest thing is how readily people are accepting this system. Its gamified feel leaves people boasting about their score on social media, and competing to get the highest scores possible. Despite the fact that the “service” is still optional, a reported 6.15 million people have been banned from taking flights for “social misdeeds” in the four years since it was first introduced.

The Social Credit System, in which all of the 1.4 billion Chinese citizens would be monitored and scored, is set to be introduced in 2020. Here it will be obedience, rather than attention, which will be engineered in the population. It promises to be the ultimate totalitarian tool.

By abusing the vulnerabilities of the mind, the social media we have grown to rely on can be used to manipulate the way we think and act. At first this was employed to generate unparalleled ad revenue for the companies that can best exploit our psychology. Increasingly, it is being used as a tool for political influence, cultivating obedience, and oppression. Perhaps it is time that we stop actively choosing to adopt platforms that control the way we think – whilst we still can.