
Report Release: Account Deactivation and Content Removal: Guiding Principles and Practices for Companies and Users

Sep 20, 2011

In partnership with colleagues at the Center for Democracy & Technology, the Berkman Center is pleased to release a report, “Account Deactivation and Content Removal: Guiding Principles and Practices for Companies and Users,” by Erica Newland, Caroline Nolan, Cynthia Wong, and Jillian York.

You can download the report at: http://cyber.harvard.edu/publications/2011/account_deactivation

From the activist who communicates with her network via her Facebook account, to the user who posts documentary-style videos to YouTube, to the citizen journalist who raises awareness with photos uploaded to Flickr, platforms that host user-generated content are increasingly used by a range of civic actors in innovative ways: to amplify voices, organize campaigns, coordinate disaster response, and advocate around issues of common concern. However, while the online space may be perceived as a public commons, private entities shape online activity, behavior, and content via Terms of Use (ToU), community guidelines, and other mechanisms of control. Platform providers often enforce such rules in response to potential threats, misuse, or ToU violations; users must observe them or risk losing their accounts, their contacts, or their ability to post content.

The clarity, transparency, and consistency with which such terms are established and implemented matter to all users, but the stakes are much higher for the growing number of human rights activists who depend on web 2.0 platforms for core elements of their work, and for whom removed content and deleted accounts can have severe consequences. For platform providers, enforcing site guidelines requires balancing complex and often competing considerations: supporting community norms and innovative user activity while maintaining a safe and secure online environment; protecting the free expression and privacy rights of users while enforcing legal standards or responding to government pressure; and accounting for the potential risks faced by activists.

The report explores these dilemmas and recommends principles, strategies, and tools that companies and users alike can adopt to mitigate the negative effects of account deactivation and content removal. Through case examples, we outline the ways in which platform providers can build user trust and encourage constructive user behavior by being clearer and more consistent in developing ToU and other policies, responding to and evaluating suspected violations, and providing opportunities for recourse and appeal. We also highlight concrete actions that users can take to educate themselves about how the moderation, takedown, and abuse-prevention mechanisms work for the services they use, provide and communicate context where necessary, and engage with companies and other users around such issues.

This document grew out of an ongoing learning series hosted by the Global Network Initiative (GNI), of which the Berkman Center and CDT are founding members. Our analysis was strengthened by the experience and feedback of diverse stakeholders, including company representatives, socially responsible investors, academics, and advocates. The report aims to offer realistic and concrete approaches that are rights-sensitive while also flexible enough to be practically implemented across diverse platforms and contexts.

For more information on the Berkman Center’s work with the GNI, and on issues related to online free expression and privacy, please visit: http://cyber.harvard.edu/research/principles.
