We are all equal before Google

This snippet from an interview with the new Google CEO, Sundar Pichai, intrigued me:

Pichai has said that he’s attracted to computing because of its ability to do cheaply things that are useful to everyone, irrespective of class or background. “The thing which attracted me to Google and to the internet in general is that it’s a great equalizer,” he said in a video interview last year. “I’ve always been struck by the fact that Google search worked the same, as long as you had access to a computer with connectivity, if you’re a rural kid anywhere or a professor at Stanford or Harvard.”

http://www.theguardian.com/technology/2015/aug/15/google-ceo-sundar-pichai

I’m very interested in the moral self-understandings which are common within the tech industry: do other senior corporate figures think in these terms? To what extent does it motivate the work they do? Or is it simply a retrospective story which they tell to congratulate themselves on their disruption?



3 replies

  1. It works the same, yet the results aren’t the same depending on who does the search…

  2. There’s an element of truth in this, I guess. I recently wrote a blog post about one of Paul Virilio’s books and was surprised to see that if I search for an overview of the book on Google, my webpage is one of the first results. On the other hand, there are now businesses that pay for the technical expertise of people who specialise in SEO, and this practice suggests that it is not a level field: those who can afford to pay for technical expertise will have the edge.

  3. I’m more concerned about the ways our learning is limited by the personalized filter bubble that one’s activity creates. Search results are tailored to Google’s assumed understanding of who we are, built from our actions, and cannot account for cognitive dissonance when one wants to reset… so at times it gives us only what the AI thinks we need rather than what we are really after. Repetition and conditioning make for a very troubling reality.
