Wikipedia and Social Science 2.0

A project like Wikipedia thrives because of its ability to harness the efforts of occasional contributors. As Clay Shirky suggests in his excellent Here Comes Everybody, the numbers willing to make a small contribution (e.g. proofreading an article and correcting typos) vastly outstrip the numbers willing (or able!) to sit and write an entire article from scratch. This dynamic allows collaborative production to spiral into an endless series of feedback loops: a few who contribute a lot provide raw material which a far greater number who contribute a little subsequently ‘mop up’ (i.e. rephrase, extend, correct), in turn expanding the scope of the site and increasing both its actual traffic and potential appeal, bringing ever more co-producers to Wikipedia. It’s an incredibly powerful iterative process, as can be seen in the statistics describing the site’s growth:

In fact, the sophistication which characterises the discussion at the above link (how best to model Wikipedia’s growth) is testament to the intellectual power of iterative co-production. So the obvious question is: how can this dynamic be harnessed by Social Science 2.0?
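As an aside, the kind of growth modelling debated in that discussion can be sketched in a few lines of code. The snippet below is a minimal, illustrative sketch of fitting a logistic curve to article counts over time; every figure in it is invented for the example, and the logistic model is simply one candidate among those such discussions tend to compare, not anything drawn from the linked page itself.

```python
# A toy sketch of growth-curve fitting: a logistic model fitted to
# article counts over time. All figures are invented for illustration,
# not real Wikipedia statistics.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: K = ceiling, r = growth rate, t0 = inflection point."""
    return K / (1 + np.exp(-r * (t - t0)))

months = np.array([0, 12, 24, 36, 48, 60, 72], dtype=float)
articles = np.array([250, 1000, 4200, 15700, 44000, 77000, 93000], dtype=float)

(K, r, t0), _ = curve_fit(logistic, months, articles, p0=[100_000, 0.1, 48])
print(f"estimated ceiling: {K:,.0f} articles, inflection at month {t0:.0f}")
```

The substantive point is not any one curve, of course, but that a community of co-producers can sustain an argument about which curve fits at all.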

One of the obvious problems which Wikipedia raises, particularly within academia, is that of expertise. How can the intellectual outputs of an anonymous and collaborative endeavour be trusted? After all, academic life is predicated on systems of accreditation which have been evolving for centuries. The simple answer is that it’s not warranted to uncritically trust any particular article on Wikipedia, because mistakes and inaccuracies pervade the system. Expertise is an emergent characteristic of the overarching site, but not (at least not a taken-for-granted one) of any one article.

If this were a commercial encyclopedia, this inadequacy would be something of a deal-breaker. People wouldn’t pay money for a series of books they couldn’t trust and, conversely, the manufacturer wouldn’t produce a series of books which people would be unlikely to pay for. However, what makes Wikipedia unique is a generic property of the web (massively reduced production costs) combined with a specific property of the site itself (open-ended self-correction). These add up to one very special property: a minimal cost of failure. Aggregative failure poses far less of a threat to the integrity and utility of the overarching system than it would in any comparable pre-internet endeavour.

As a system, Wikipedia can survive a great deal of failure and this, in turn, facilitates iterative self-correction. Because there’s no central agency which has invested money in the project in the hope of making a profit, there’s no one with an incentive to cut their losses once the project ceases to be commercially viable. This lack of a cut-off point means that iterative co-production can continue and, in doing so, actually correct the failures which might otherwise have led to the project’s demise. In the process new failures will occur, but these too can be corrected.
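Put in quasi-cybernetic terms, the dynamic is a balance of rates: so long as the aggregate rate of small corrections keeps pace with the rate at which new errors appear, the stock of outstanding errors stays bounded and the system never approaches collapse. A deliberately crude toy simulation can make this vivid; every rate below is invented purely for illustration.

```python
# A deliberately crude toy model of recurrent failure plus iterative
# self-correction. Every rate here is invented for illustration.
import random

random.seed(1)
outstanding_errors = 0
for month in range(1, 121):
    outstanding_errors += random.randint(5, 15)   # new failures keep arriving
    # Many occasional contributors each make, at most, one small fix.
    for _ in range(40):
        if outstanding_errors and random.random() < 0.5:
            outstanding_errors -= 1
    if month % 24 == 0:
        print(f"month {month:3d}: {outstanding_errors} errors outstanding")
```

With these (arbitrary) numbers the backlog hovers near zero indefinitely; shrink the contributor pool enough and it grows steadily instead. That, in miniature, is why the absence of a commercial cut-off point matters.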

I find Wikipedia absolutely fascinating when seen in cybernetic terms. Three properties I’ve discussed here need to be considered when articulating a concept of Social Science 2.0:

  1. Harnessing small contributions effectively
  2. Maintaining function in spite of recurrent failure
  3. Rendering accreditation and expertise unproblematic
