We Need Better Filters

This article originally appeared in Wired.

It seems pointless reeling off the numbers. There are millions of videos on YouTube. There’s an incalculable abundance of images on Facebook. There are limitless articles on a plethora of news sites and infinite cat pictures on innumerable Tumblrs.

There is now more content than ever flooding into our lives.

This year alone Gartner estimates more than 2.3 billion smartphones, tablets, laptops and desktops will be shipped to content-hungry users. Despite all these devices, we continue to struggle with this deluge of information. We are drowning in a sea of lolcats, lifehacks, and listicles.

The challenge each of us now faces is a brand new one. How do we filter the infinite flow of information? How do we create a balanced diet of content with so much junk being thrown at us?

While we may feel that we’re answering this challenge with technology, in doing so we fail to notice the bias it builds into our worldviews. Facebook, Twitter, Flipboard, Prismatic, Zite, and all of our media websites and apps give us the illusion of effective filtering, but in reality we have no idea what’s really going on.

Filtering wasn’t always this hard

Our filters were once the media, our friends, and our families. They took incoming information – large but finite – and filtered it down to what they felt was meaningful to us.

The value exchange from this filtering service usually took one of two forms: commercially driven (we have your attention and therefore will now sell ad space) or relationship driven (I like you and want you to like me).

It was simple, and more importantly, the methods and motivations behind our filters were clear. Outside our trusted circle of family and friends, newspapers had editors who made – and were accountable for – editorial decisions. TV, radio, and magazines were all slightly tweaked versions of the same model.

We were aware of our filters and understood how they operated. And if we weren’t happy, we could change them.

With more information, filtering gets tougher

Simple questions like “what’s happening in the world?”, “what good new music is out?”, or “how well am I sleeping?” only get harder to answer with access to more information.

More information requires more filtering in order to give us a meaningful answer.

Now, the “filtered” information our friends and family present us with has itself been filtered. Facebook’s EdgeRank is the perfect example of this hyper-filtering.
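EdgeRank was never published in full, but outside observers commonly described it as a score summed over every “edge” (a like, a comment, a share) connecting you to a story – affinity, times edge weight, times time decay. A one-line reconstruction of that description, not Facebook’s actual code:

```python
# EdgeRank as reconstructed by outside observers: the sum, over the
# "edges" (likes, comments, shares) linking you to a story, of
# affinity x edge weight x time decay. Illustrative only; Facebook's
# real implementation has never been published.
def edgerank(edges):
    return sum(e["affinity"] * e["weight"] * e["decay"] for e in edges)
```

Stories below some score threshold simply never reach your feed.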

Away from our friends and family, editorial decision-making was once the only filter between the media and us. Now, large media organisations create mountains of content, then track our reading habits and online behaviour in order to build a profile of us. This data is then used to deliver us only the content we are interested in – selected not by an editor but by an algorithm.
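To make that concrete, here is a minimal sketch of how such profile-driven filtering might work, assuming a hypothetical profile of per-topic interest weights inferred from a reader’s click history; the data shapes and function names are illustrative, not any real publisher’s system:

```python
# A minimal sketch of profile-driven filtering, assuming a hypothetical
# profile of per-topic interest weights inferred from a reader's clicks.
from collections import Counter

def build_profile(clicked_articles):
    """Turn a reader's click history into normalised per-topic weights."""
    counts = Counter(topic for article in clicked_articles
                     for topic in article["topics"])
    total = sum(counts.values()) or 1
    return {topic: n / total for topic, n in counts.items()}

def filter_articles(candidates, profile, n=10):
    """Rank candidate articles by overlap with the profile; keep the top n.
    Topics the reader has never clicked on score zero and quietly vanish."""
    def score(article):
        return sum(profile.get(topic, 0.0) for topic in article["topics"])
    return sorted(candidates, key=score, reverse=True)[:n]
```

Everything the reader never clicks on scores zero and disappears, without an editor ever making that call.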

Our filters are becoming invisible

EdgeRank isn’t something that Facebook users understand. It’s not even something Facebook acknowledges or talks about publicly. We simply have to have faith that Facebook’s filter is unbiased in its representation of which content from our family and friends we’ll find most useful.

Similarly, the algorithms and techniques used by media organisations to deliver us relevant content are not always acknowledged (or even understood). The “universal homepage” – a website that is identical for all visitors – is dead. Every single page load is unique, not just because of the point in time a user accesses it, but because of all the data a visitor has supplied that allows the publisher to “filter on the way out”.

Twitter continues to move closer to becoming a fully-fledged media company – particularly through its hashtag filtering. Following a popular hashtag was once an exercise in speed-reading and rapid scrolling through hundreds (or thousands) of tweets per second.

But watching the #Ashes hashtag over the past few weeks, I noticed the feed was being filtered down to a manageable 20 tweets per refresh. Of the thousands of #Ashes updates occurring every minute, I was presented with a curated selection of tweets: some from my friends, some from people my friends followed, and some highly retweeted content.

Just 20 tweets out of thousands. In cases like this, we only see one percent of the content that might be relevant to us. The filter that dictates the 99 percent is completely opaque. And that 99 percent is growing exponentially.
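We can only guess at the shape of that filter. Here is a speculative sketch, assuming hard-coded weights for friends, friends of friends, and retweet counts – emphatically not Twitter’s actual algorithm, which is exactly the point:

```python
# A speculative sketch of the selection described above, assuming
# hard-coded weights for friends, friends of friends, and retweets.
# This is not Twitter's actual algorithm; no one outside Twitter knows it.
def curate(tweets, friends, friends_of_friends, n=20):
    """Pick n tweets from thousands using weights the reader never sees."""
    def score(tweet):
        base = tweet["retweets"] / 1000  # strangers surface only if widely retweeted
        if tweet["author"] in friends:
            return base + 3.0
        if tweet["author"] in friends_of_friends:
            return base + 2.0
        return base
    return sorted(tweets, key=score, reverse=True)[:n]  # the other 99 percent vanish
```

Whatever weights are really used, we see only the 20 survivors, never the rules that chose them.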

We need to understand the filters being applied to our lives

According to one German study, one in three people feel worse after visiting Facebook. It doesn’t take reams of academic research to realise that the world presented to us on Facebook is a skewed representation of reality. While we might be able to account for this by identifying which of our friends are over-sharers, we are increasingly losing touch with how our perception of content from friends and family is being altered through biased filters.

News organisations filter with the same goal in mind as Facebook – they want to maximise our enjoyment, agreement, and ultimately the number of times we visit and the amount of time we spend on their websites. These last two are the metrics that content is increasingly judged on.

The incentive is to present us with content with which we agree – whether that is our friends’ holiday photos or a news report on the Middle East. Our natural confirmation bias draws us to the information sources we agree with, and our world view becomes slightly narrower.

Why can’t we build our own filters?

All of the outrage over the shutting down of Google Reader shared a common theme – the end of free filters: free as in beer, but more importantly free as in speech. Google suggested that displaced Reader users head to Google+, where we could experience another invisible filter on our content. With the end of Reader, we lost the ability to build a filter on our own terms.

But even if that option were available, the reality is that most people aren’t equipped to build their own filters. A set of favourite newsgroups, a list of RSS feeds, a well-curated bookmarks folder: these are all filters we once built ourselves. Now they have been replaced by the algorithms of Facebook, Twitter, and BuzzFeed.
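For contrast, a filter built on our own terms can fit in a dozen transparent lines. A minimal sketch using the feedparser library, with placeholder feed URLs and keywords standing in for your own choices:

```python
# A minimal sketch of a filter built on our own terms: a transparent
# keyword filter over RSS feeds. Uses the real feedparser library;
# the feed URLs and keywords are placeholders for your own choices.
import feedparser

FEEDS = ["https://example.com/feed.xml"]         # your own reading list
KEYWORDS = {"filters", "algorithms", "privacy"}  # your own editorial policy

def my_filter():
    for url in FEEDS:
        for entry in feedparser.parse(url).entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            if any(keyword in text for keyword in KEYWORDS):
                yield entry.get("title", ""), entry.get("link", "")

for title, link in my_filter():
    print(title, link)
```

Every rule in that filter is visible, and every rule can be changed.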

The less we understand our filters, the more we will come to accept that the world they present us with is true.

The more control we have over our filters, the more we can understand what we’re not seeing.

We need better filters.

- August 2013