“It was the best of times, it was the worst of times…” pretty much sums up 2016 in a single sentence. Today’s result in the American Presidential Election came as a shock to many, partly because all the data pointed to the contrary right up until the day itself. It seems we are still a long way from understanding the contextual nature of mining big data for information, and the problem extends to how heavily reliant we have become on similar systems to control how we consume information.
Nowadays we use Facebook, Twitter, and other social networks to gain information, ingest content and read the news. We often see what we like and like what we see, and this echo chamber results in biased social feeds. Most of us have realised how biased our news feeds are, which comes down to how we use them, but recent political events such as Brexit and the US election have shown us just how much this is the case.
[quoter color="yellow"]We unknowingly accept the echo chamber we’re placed in because we are fed information we like and agree with our own opinion.[/quoter]
In many cases, users don’t even realise that they consume one-sided or similar information because of the social circles around them. This entire phenomenon can be chalked up to “You don’t know what you don’t know,” and users end up only reading regurgitated information. As a consequence, we unknowingly accept the echo chamber we’re placed in because we are fed information we like and that agrees with our own opinion. It’s then reinforced because people in the same social sphere agree with us too. So the echo chamber’s cycle continues, because it gives us a false sense of affirmation that we are right in our beliefs – also known as confirmation bias.
How likes divided a nation
The 2016 U.S. Election coverage is a perfect example. The Wall Street Journal recently put together a graphic which depicts how feeds may differ for Facebook users based on their political views: http://graphics.wsj.com/blue-feed-red-feed/
In the graphic, you can see Liberal and Conservative Facebook feeds side by side and how much they differ. After all, your feed is designed to prioritise content based on what you’ve liked, clicked and shared in the past. This means that conservatives don’t see much content from liberal sources and vice versa.
This particular presidential campaign was fought on an entirely different content battleground than previous ones, with a new army generating that content at high speed – content of low value but, judging from today’s outcome, extremely high impact. According to an article in Wired, one in every five election-related tweets from September 16 to October 21 was generated by a bot. These bots automatically generated content that served a political agenda. Through the sheer deluge of tweets they triggered, they shaped online discussions around the presidential race, including trending topics and how online activity surrounding the election debates was judged. The problem stems from a shift from an Information Economy to an Attention Economy, where he or she who makes the most noise wins. Unfortunately, noise does not equal signal, and you can’t tell them apart when a bot or algorithm is involved.
The perfect example of this was the revelation that over 100 pro-Trump websites were, in fact, being run out of a small town in Macedonia. Posts weren’t being generated by bots but by a small group of teenagers making money from clickbait articles which were mostly false and misleading. According to BuzzFeed’s investigation, the most successful post came from the fake news website ConservativeState.com, under the headline “Hillary Clinton In 2013: ‘I Would Like To See People Like Donald Trump Run For Office; They’re Honest And Can’t Be Bought.’” The post was a week old and had racked up an astounding 480,000 shares, reactions, and comments on Facebook. Those numbers prove that attention means more than information.
Breaking the cycle
Bots and algorithms don’t seek out opposing views or surface them for readers, because they’re not built that way. They serve us what we want to hear. It’s the same when we’re served news written by human hands specifically for our tastes. We become trapped in a filter bubble wrapped around an echo chamber (or should that be an echo chamber wrapped inside a filter bubble?!).
[quoter color="yellow"]Bots and algorithms…serve us what we want to hear[/quoter]
The way to break free from this is to start understanding how algorithms work, to recognise that content screaming for attention can no longer be trusted as relevant, and to surround ourselves with different viewpoints. The ultimate goal is balance; only this way can you find a new perspective, discover different content, and learn what you don’t yet know.
We should be more selective in the content we consume. Instead of letting the algorithm do the filtering first, we should manually filter the news, media, and information ourselves, leaving algorithms to gently nudge us toward new information by suggesting opposing views that broaden our perspective.
The algorithm should be the one to challenge our point of view, not reinforce it.