Links between web pages are a perfect example of a public commons. We all benefit from, and many of us contribute to, an enormous collection of public links that connect web pages together. We follow links to find information we want. The links are what made the web succeed and grow exponentially: as we connected pages to each other, the network effect made every page more valuable to all of us.
Google built a search engine that worked by reading from the commons - from those public links - into a database that it used to produce much higher quality search results. Google’s ‘spiders’ (robots that visit web pages to find links) enjoyed the same public access to the links that we all did.
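To make that concrete, here is a minimal sketch of what such a spider does: fetch a page, pull out its links, and record them in a graph. This is my own illustration, not Google’s actual code, and the URL is just a placeholder; a real spider would also keep a queue of pages still to visit.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url):
    """Fetch one page and return the absolute URLs it links to."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(url, link) for link in parser.links]

# Build a tiny link graph: page -> pages it points to.
link_graph = {}
for page in ["https://example.com"]:  # placeholder; a real spider works a frontier queue
    link_graph[page] = crawl(page)
print(link_graph)
```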
Initially, the Google search engine actually magnified the utility of the commons that was the web and its links. Being able to search in a new and better way made web pages all the more useful by making them even easier to find. From the standpoint of a web page, Google was valuable in the same way that inbound links were valuable: both helped people find it.
But after a while, with so many people using the Google search engine rather than following links, marketers and entrepreneurs figured out something that later came to be broadly called SEO - ‘Search Engine Optimization’. By reverse engineering the order in which Google’s search engine presented results, they discovered tricks they could play on links and web pages to make their own pages show up higher in the search results. They might, for example, create a bunch of fake websites that pointed to their own, to make Google think their site was more relevant. Or use keywords and other metadata suggesting their site contained information that it didn’t.
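A toy PageRank-style calculation shows why the fake-website trick works. This is my own simplified sketch of the published PageRank idea, not Google’s production ranking, and the page names are invented; the point is that a farm of pages all pointing at one site visibly inflates that site’s score even though no real endorsement occurred.

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Iteratively compute PageRank scores for a dict of page -> outbound links."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if outlinks:
                # Each page passes a share of its rank to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page with no outlinks: spread its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
        rank = new_rank
    return rank

# An honest little web: A and B cite each other, C cites A.
honest = {"A": ["B"], "B": ["A"], "C": ["A"]}
print(pagerank(honest))  # C scores lowest: no one genuinely links to it

# The same web plus a farm of fake pages that all point at C.
farmed = dict(honest)
for i in range(10):
    farmed[f"fake{i}"] = ["C"]
print(pagerank(farmed))  # C's score jumps despite no new real endorsement
```

The only input to the score is the link graph itself, which is exactly why polluting the commons of links was enough to game the results.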
The problem with this process was that it devalued the commons for everyone else. When websites ‘optimized’ for search ranking rather than simply presenting information, the clarity and value of websites and links diminished for all of us. Google was making money from search while slowly polluting the web with noise and disinformation.
Damage to a public commons like the web can be prevented with the right governance strategies. Elinor Ostrom won the Nobel Prize for studying how these kinds of commonly shared resources can be successfully managed, and she demonstrated that, historically, communities have usually been fairly successful at it. For example, people tend not to damage public resources when the damage is easily traced back to them (and in Google’s case it certainly is), and when sanctions matching the damage can be applied to violators.
To fix this and prevent future Googles from damaging the commons, we need to first collectively recognize the web as a precious resource, and then create mechanisms by which sanctions like fines can be proportionately applied for damaging it. The same sort of machine learning that we are using to power AI could, for example, be used to objectively measure the quality of web searches and to look for patterns of abuse that reduce that quality. Let’s do it.
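As one hypothetical sketch of what ‘objectively measuring search quality’ could look like - all of this is my assumption, not a description of any existing system - result lists could be scored against human relevance judgments with a standard metric like NDCG, and queries whose scores drop sharply could be flagged for review:

```python
import math

def dcg(relevances):
    """Discounted cumulative gain: rewards relevant results near the top."""
    return sum(rel / math.log2(rank + 2) for rank, rel in enumerate(relevances))

def ndcg(relevances):
    """DCG normalized against the ideal ordering, so 1.0 is a perfect ranking."""
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal else 0.0

# Hypothetical human-judged relevance (0-3) of the top five results for one
# query, before and after SEO spam pushes low-quality pages upward.
before = [3, 3, 2, 1, 0]
after = [1, 0, 3, 0, 2]
print(f"quality before={ndcg(before):.3f} after={ndcg(after):.3f}")
if ndcg(before) - ndcg(after) > 0.1:  # threshold chosen only for illustration
    print("flag: possible manipulation degrading this query")
```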
I am writing about this now because addressing big problems like climate change and AI will require similar strategies. We will have to collectively define precious resources (like coral reefs or, in the case of AI, our right not to be manipulated) and sanctions for violations. Both climate change and AI (or, more broadly, technology development) pose the additional challenge of requiring new collectives at scales larger than corporations or nation-states.
I appreciate you and your work. Thank you.