Documentation of Internet Filtering Worldwide
Jonathan Zittrain* and Benjamin Edelman**
Berkman Center for Internet & Society - Harvard Law School
This project is part of the OpenNet Initiative,
a collaboration with the Citizen Lab at the Munk Centre for
International Studies at the University of Toronto and
the Programme for Security in International Society
of the University of Cambridge

A variety of organizations, institutions, companies, and countries seek to restrict Internet access from within their premises and territories. For example, companies may seek to improve employee productivity by restricting access to leisure sites; libraries and schools may seek to avoid exposing children to sexually-explicit content, or be required to do so; countries may seek to control the information received by their citizens generally. Common to nearly all these applications is the public unavailability of the filtering lists -- that is, by the design of filtering systems, users cannot and do not know the specific set of sites blocked. In some cases users might ask for a specific site and be told of its unavailability due to filtering, but in other cases such unavailability may be conflated with unremarkable network blockages -- a Web site might be unreachable for any number of reasons, and the failure to view it at a particular moment cannot reliably be attributed to active filtering.
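To make the ambiguity concrete, consider the following minimal sketch (in Python; a hypothetical illustration, not a tool used in this study) that attempts to retrieve a URL and reports a coarse category of failure. The URL shown is illustrative only, and no single category, taken alone, proves active filtering:

    # Hypothetical sketch: a single failed fetch is ambiguous. The URL
    # below is illustrative and not part of any test list.
    import socket
    import urllib.error
    import urllib.request

    def classify_fetch(url, timeout=10):
        """Attempt to retrieve a URL and report a coarse outcome.

        A DNS error, timeout, or reset can reflect active filtering or
        ordinary network trouble; one observation cannot distinguish them.
        """
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return "reachable (HTTP %d)" % resp.status
        except urllib.error.HTTPError as e:
            return "server returned an error (HTTP %d)" % e.code
        except urllib.error.URLError as e:
            if isinstance(e.reason, socket.gaierror):
                return "DNS resolution failed"
            if isinstance(e.reason, socket.timeout):
                return "connection timed out"
            return "connection failed (%s)" % e.reason

    print(classify_fetch("http://example.com/"))

Only repeated observations, compared across time and across vantage points, begin to separate deliberate filtering from ordinary failure.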

With this project we seek to document and analyze a large number of Web pages blocked by various types of filtering regimes, and ultimately to create a distributed tool enabling Internet users worldwide to gather and relay such data from their respective locations on the Internet. We can thus start to assemble a picture not of a single hypothetical World Wide Web comprising all pages currently served upon it, but rather a mosaic of webs as viewed from respective locations, each bearing its own limitations on access. As various countries, companies, and other entities employ or consider employing filtering software, documentation of the specific details, successes, and in some instances flaws of existing filtering efforts may prove helpful. (See, for example, the European Union's Internet Action Plan - Filtering & Rating, one of a number of initiatives considering filtering.)
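The mosaic notion can be made concrete with a small sketch: the same URL, observed from several vantage points, need not yield the same result, and it is divergence among vantage points that marks a page as a candidate for filtering analysis. The vantage labels and outcomes below are invented for illustration:

    # Hypothetical sketch of the "mosaic" idea: invented observations of
    # one URL as seen from three vantage points.
    observations = {
        "vantage-A": "reachable (HTTP 200)",
        "vantage-B": "connection timed out",
        "vantage-C": "server returned an error (HTTP 403)",
    }

    # Uniform failure suggests the site itself is down; divergence
    # suggests interference between particular users and the site.
    if len(set(observations.values())) > 1:
        print("divergent views -- candidate for filtering analysis")
        for where, what in sorted(observations.items()):
            print("  %s: %s" % (where, what))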

In general we seek to focus on those blocked pages likely to be most controversial -- for example, those blocked pages seemingly inconsistent with the criteria of the blocking regimes that respectively restrict their retrieval, as well as blocked pages that are well-known or frequently requested. However, to provide a more complete sense of the state of Internet filtering efforts, we also seek to investigate all blocking, including the restriction of access to Web pages consistent with censors' category definitions.

Previous efforts to document the extent of government filtering were made by researchers at the Soros Foundation's Internet Censorship Project. In that work, the ICP sent correspondents around the world to collect anecdotal data about filtering efforts by manually searching for certain well-known Web pages; we build on their work by using automated methods to test and document thousands of pages blocked by each country or other blocking system studied. We similarly wish to augment the efforts described in such ventures as Shanthi Kalathil and Taylor Boas's "The Internet and State Control in Authoritarian Regimes" and Radio Free Europe's "20 Enemies of the Internet." Finally, our work follows a series of projects intended to document sites blocked -- and in many instances arguably wrongly blocked -- by major commercial Internet filtering applications; such projects include Bennett Haselton's Peacefire and Seth Finkelstein's Anticensorware Investigations, as well as one author's Sites Blocked by Internet Filtering Programs.

In future work, the authors intend to expand this analysis to Internet filtering systems in additional countries. To date, our methodology has been limited to obtaining Internet access through a given country and testing a set of URLs for blockages; our ultimate aim, however, is to develop a distributed software application for use in testing, analyzing, and documenting Internet filtering regimes worldwide. This application will enable Internet users to easily test what is and is not filtered from their respective locations on the network, relaying results back for analysis and documentation; a sketch of such a client appears below. Get more information and sign up to get involved.
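As a hedged illustration only -- the collector address, report format, and URLs below are invented, not part of any deployed system -- such a client might test a URL from the user's own network location and relay the observation to a central collection point:

    # Hypothetical sketch of a distributed testing client. The collector
    # address and report format are placeholders for illustration.
    import json
    import urllib.error
    import urllib.request

    COLLECTOR = "http://collector.example.net/report"  # invented endpoint

    def test_and_relay(url, timeout=10):
        """Fetch a URL locally, then relay the observed outcome."""
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                outcome = "HTTP %d" % resp.status
        except urllib.error.URLError as e:
            outcome = "failed: %s" % e.reason
        report = json.dumps({"url": url, "outcome": outcome}).encode()
        request = urllib.request.Request(
            COLLECTOR,
            data=report,
            headers={"Content-Type": "application/json"},
        )
        try:
            urllib.request.urlopen(request, timeout=timeout)
        except urllib.error.URLError:
            pass  # best effort: a missed relay is not fatal

    test_and_relay("http://example.com/")

In practice such a client would also need to record timing, vantage-point metadata, and repeated trials, so that analysts can separate active filtering from transient network failure.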


Country-Specific Studies

Other Studies

Other Analyses and Projects


Support for this project was provided by the Berkman Center for Internet & Society at Harvard Law School.

* Jack N. and Lillian R. Berkman Assistant Professor of Entrepreneurial Legal Studies, Harvard Law School.
** J.D. Candidate, Harvard Law School, 2005.


Last Updated: October 24, 2003 - Sign up for notification of major updates and related work.