Not yet Westworld: Do social media algorithms determine how we think?

It was all quite innocuous, till it was not. Looking up crockpot chicken dinner recipes on YouTube would lead me to a puzzle at the heart of the modern world.

There is a theory doing the rounds that we all live in bubbles of our own making, and that this is behind the divisiveness of the present time. In an earlier post, I tried to work out the fundamental problems with the bubble theory. Social media is probably more responsible for those bubbles than we are. And although it is true that we fill our bubbles with biased opinions and with people who second those views, bubble-living isn’t unique to this or any other time period.

What is unique to this time is a prequel to artificial intelligence: the algorithms we take for granted. We are not in Westworld yet, but whether or not we are aware of it, the social internet we are a part of is watching us interact and learning our behavioural patterns, so that the companies that control it can profit by targeting us with focused advertisements. It all seems innocent enough, till it is not.

A few weeks ago, Facebook was criticised for promoting fake news. Nobody realised how pernicious fake news was until falsified and spurious information in the guise of actual news began to determine the outcome of an election. Social media sites had carried on as usual without realising what was going wrong. Profit comes from targeting users with posts and ads that match their bubble preferences. This is carried out by algorithms that track our movements online and make intelligent guesses about our preferences based on the sites we visit. More often than not these guesses are correct, sending us down a rabbit hole towards things we never knew we wanted to buy and posts we never knew we wanted to read. It’s all quite innocent in a futuristic, disturbing Philip K. Dick kind of way. But innocent can also mutate rapidly into something chimeric.

The algorithms are often unable to distinguish between fake and real news. Ironically, neither are the humans receiving these notifications. Our biases can be so overwhelming that they destroy our capacity for healthy scepticism; strong biases knock out our bullshit detector. It doesn’t help that the news media, whose job it is to detect the bullshit on our behalf, is instead practising false equivalence, which blunts its ability to nose out the truth and, ironically, destroys its capacity to cover the news fairly.

A couple of weeks into looking for crockpot chicken dinners, YouTube started sending me notifications for Michelle Obama videos. Curious, I opened one to find it was racist and far more offensive than I could describe here. The notifications kept coming even though I ignored them. Finally I was forced to flag one as racist and offensive.

The change was immediate. All of a sudden, YouTube began sending me notifications for remarkable Michelle Obama speeches. What was it about crockpot chicken recipes that made YouTube think I would enjoy watching racist videos? Had its algorithm detected a similar pattern in other users? Whatever the reason, it was an eye-opening glimpse into the social-media artificial intelligence that now shapes the way we think.

These algorithms seem to hone and harden our biases: we watch, we consume, and we are then bombarded with unbalanced perspectives that match our inherent cultural biases. Biases that were once hidden are being emboldened and normalised by our social media consumption. Online society is much like the fictional Westworld: an anonymous place we visit to live out our worst prejudices.
