The Filter Bubble – How the hidden web is shaping lives

In this RSA talk the pioneering online campaigner Eli Pariser discusses a crucial and, as yet, under-discussed danger facing the social media web: the expansion of filtering into every aspect of our online activity. Sites collect data on usage patterns, particularly our reactions to being presented with content and the actions (e.g. ‘like’, ‘share’, ‘+1’) we take in response to what we see.

Without collecting such data, any possibility of a semantic web is immediately foreclosed, because human meaning has to enter the processing system somewhere. Yet the sheer opacity with which these technologies are being developed, let alone implemented across the web, demands urgent political debate.

However, it would be easy to be alarmist about this and throw the baby out with the bathwater. The problem is not filtering per se but rather the private and opaque nature of this filtering. In so far as the development and roll-out of the technology relies on the corporate structures of capitalism, that privacy and opacity are difficult to avoid entirely. But the demand shouldn’t be for liberation from the filter bubble these corporations have placed us in – it should be for them to make their technology available to us, so that we can design and implement our own filter bubbles as part of our ongoing day-to-day interactions with the internet, driven by our awareness of what we do and do not want to see. Certainly the computational systems they’ve developed allow us to see connections we might not be consciously aware of: I’ve come across rafts of fascinating reading by following Amazon’s ‘other customers who bought this also bought’ system. But this should be an opt-in system, rather than something imposed upon us.

It could be argued that there are political problems inherent in this as well – as Cass Sunstein plausibly argues in his Republic.com 2.0 – given the possibility that already politically divided societies are likely to become ever more polarized when individuals self-select all the content they encounter. However, firstly, filtering of some kind is necessary if we’re going to have any possibility of engaging productively and creatively with modern digital technology, simply because of the exponential growth of content that goes hand-in-hand with the mass uptake of social media tools. Secondly, the problems attached to it are contingent and emergent (i.e. they arise when people in practice do this filtering badly, often for reasons not of their own making) rather than being intrinsic to filtering itself. Thirdly, the sheer cultural value of web 2.0 demands new proficiencies on the part of its users: we can retreat from information overload (see the growing trend for going offline), protectively lock ourselves into virtual bubbles of our own making, or stay passively within the corporate infosphere*, OR we can embrace the challenges that come from this revolution in human communication, using the tools available to us to dialogically develop a dynamic filtering orientation as we negotiate an ongoing path through human culture in the 21st century.

*Which I think is the main concern arising from the filter bubble as it presently stands.



3 replies »

  1. Very few people are aware of this today, even though this phenomenon is nothing new. In fact, it’s been known about for over 50 years, and the fathers of information theory, including Claude Shannon and von Neumann, debated its implications for society and human interaction. At the time they argued that humans, at a basic level, act as receivers and repeaters of information, and that this causes ideas to become reinforced over time within social groups, sometimes becoming ideologies. It could be the case that the subcultures we saw in the 1960s and early 1970s resulted from attempts to apply this to society, since that was the kind of thing being discussed by information theorists at the Macy Conferences several years before.
    But how does this fit in with your article? Early cyberneticists believed that this ‘filtering bubble’ is present at the group level. Bearing in mind that ‘information’ is defined as the change in entropy – the degree of uncertainty – we find that very little information is actually exchanged within most social groups. In fact, information and entropy can be quantified, and we have mathematical expressions for the amount of information exchanged. To actually have any flow of information, as opposed to the mere communication of ‘content’, there has to be some disagreement and uncertainty between communicating parties.
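    To make the last point concrete, here is a minimal sketch using the standard Shannon definitions (the usual textbook formulas, rather than anything specific to the Macy Conference material): the entropy of a source X and the mutual information between sender X and receiver Y are
    H(X) = -\sum_x p(x) \log_2 p(x)
    I(X;Y) = H(X) - H(X \mid Y)
    Since I(X;Y) can never exceed H(X), a group whose messages are already fully predictable to its members (H(X) close to zero) exchanges almost no information, however much content circulates.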

  2. I’m not sure I’d accept the distinction between information and content on a philosophical level but this is a fascinating response – sorry I missed it at the time. I’ve been trying to sketch out an account of ‘information ecology’ for a while via a couple of conference presentations and a book chapter (relating to protest movements & sexual communities respectively) so I definitely need to follow up the references you’re pointing towards.
