Some Skepticism About Search Neutrality

Article Source: The Next Digital Decade: Essays on the Future of the Internet, Berin Szoka & Adam Marcus, eds., TechFreedom, 2011, pp. 435-459
Some advocates urge that search engine rankings be made neutral. However, rankings have a subjective element, and regulation to require objectivity would be unworkable. Sites would seek to use ranking regulation to gain advantages over competitors.


Policy Relevance:

Established antitrust principles or FTC guidelines could address some search engine abuses.


Key Takeaways:
  • Search engines can behave badly, for example by lowering the rankings of competitors or of sites that refuse to buy advertising.
  • Leaving search engines entirely unregulated is not ideal: consumers will often be unaware when relevant results fail to appear, and healthy competition between search engines is unlikely because consumers face high switching costs.
  • The free speech rights of audiences are restricted when information is withheld from them.
  • Neutrality advocates propose that search engines be objective, but this is unworkable; search ranking is inherently subjective. Likewise, it is impossible to remove “bias” from search engines, especially since web sites and Internet users themselves are not free of bias.
  • Web sites have no right to insist that search rankings maintain their accustomed flow of traffic; sites have no entitlement to consumers’ visits, and search engines must be free to adjust rankings.
  • Asking regulators to ensure that a search engine ranks sites in terms of relevance will foment endless disputes; every site will insist that it is the most relevant.
  • Search neutrality regulations would often be used by sites seeking an advantage over competitors, and would prevent consumers from seeing sites they want.
  • Antitrust issues can arise when Google designs its search results to serve its own interests by linking to its own products, but often consumer interests are served as well.
  • Regulators should scrutinize payments from web sites to search engines made to change rankings, and search engines could disclose commercial interests in their ranking decisions, consistent with principles set out by the Federal Trade Commission.
  • Insisting that search engines reveal their algorithms would allow unscrupulous sites to manipulate results; some propose that only regulators see the information, but regulators have a poor track record of competence with technical matters.



James Grimmelmann

About James Grimmelmann

James Grimmelmann is Professor of Law at Cornell Tech and Cornell Law School. He studies how the law governing the creation and use of computer software affects individual freedom and the distribution of wealth and power in society. As a lawyer and technologist, he helps these two groups understand each other by writing about copyright and digitization, the regulation of search engines, privacy on social networks, and other topics in computer and Internet law. He teaches courses in property, intellectual property, and Internet law.