In partnership with colleagues at the Center for Democracy & Technology, the Berkman Center is pleased to release a report on “Account Deactivation and Content Removal: Guiding Principles and Practices for Companies and Users” by Caroline Nolan, Cynthia Wong, and Jillian York.
You can download the report.
The report explores these dilemmas and recommends principles, strategies, and tools that companies and users alike can adopt to mitigate the negative effects of account deactivation and content removal. Through case examples, we describe ways in which platform providers can have a positive impact on user behavior by being clearer and more consistent in developing ToU and other policies, in responding to and evaluating suspected violations, and in offering opportunities for recourse and appeal. We also highlight concrete steps users can take to educate themselves about how the moderation and abuse-prevention mechanisms work for the services they use, to provide and communicate context where necessary, and to engage with companies and other users around such issues.
From the activist who communicates with her network via her Facebook account, to the user who posts documentary-style videos to YouTube, to the citizen journalist who raises awareness with photos uploaded to Flickr, platforms that host user-generated content are increasingly used by a range of civic actors in innovative ways: to amplify voices, organize campaigns, coordinate disaster response, and advocate around issues of common concern. However, while the online environment may be perceived as a public commons, private entities play a central role in setting and enforcing terms of use (ToU), community guidelines, and other mechanisms of control.
Platform providers often enforce such rules in response to threats, misuse, or ToU violations; users must observe them or risk losing their accounts, their contacts, or their content.
The clarity, transparency, and consistency of how such terms are established and implemented are important to all users, but for the growing number of human rights activists who depend on web 2.0 platforms for core elements of their work—and for whom removed content and deleted accounts can have severe consequences—the stakes are much higher. For platform providers, enforcing site guidelines can require balancing complex and often competing considerations, including supporting community norms and innovative user activity, while maintaining a safe and secure online environment, protecting the free expression and privacy rights of users while enforcing legal standards or responding to government pressure, and accounting for the potential risks faced by activists.
This document grew out of an ongoing learning series hosted by the Global Network Initiative (GNI), of which the Berkman Center is one of the founding members. Our analysis was strengthened by the experience and feedback of diverse stakeholders, including company representatives, socially responsible investors, academics, and advocates. The report aims to offer realistic and concrete approaches that are rights-sensitive while also flexible enough to be implemented practically across diverse platforms and contexts.
For more information on the Berkman Center’s work with the GNI, and on issues related to online free expression and privacy, please visit: http://cyber.law.harvard.edu/research/principles.
Additional announcements can be found at:
Last updated September 21, 2011