
Custodians of the Internet

Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media

Social media platforms face an irreconcilable contradiction: while they promise an open space for participation and community, every one of them imposes rules of some kind. In his book, Tarleton Gillespie discusses how social media platforms police what we post online, and the societal impact of these decisions. In this talk he will flip the story to argue that content moderation is not ancillary to what platforms do; it is essential, definitional, constitutional. Given that, the very fact of moderation should change how we understand what platforms are.


This event is supported by the Ethics and Governance of Artificial Intelligence Initiative at the Berkman Klein Center for Internet & Society. In conjunction with the MIT Media Lab, the Initiative is developing activities, research, and tools to ensure that fast-advancing AI serves the public good. Learn more at /topics/ethics-and-governance-ai.

 

Social media, moderation, and the mythos of the Internet

Tarleton Gillespie joins the Berkman Klein Center for a discussion of “Custodians of the Internet.”

by Carolyn Schmitt

When social media platforms like YouTube and Twitter banned alt-right conspiracy theorist Alex Jones, the move was greeted with both praise (some said the removal was long overdue) and backlash (others argued it violated free speech). Jones’ case is but one recent example of social media platforms grappling with moderation, removing content and users to reduce the sharing of harmful and dangerous speech.

The prominence of content moderation may suggest that this is a new challenge for social media platforms. But moderation has been central to all platforms since their inception, argues Tarleton Gillespie, a Principal Researcher at Microsoft Research New England and an adjunct associate professor at Cornell University. Platforms including Twitter, Facebook, and YouTube use modes of moderation ranging from platform guidelines and policies enforced by staff to offering their users the option to flag inappropriate content.

Gillespie recently joined the Berkman Klein Center for a discussion about his latest book, “Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media.” Gillespie spoke about moderation as an important component of platforms, but highlighted that public perception has long failed to acknowledge the role moderation plays in social media.

Book cover of Custodians of the Internet

“Many of these platforms were designed by people who were inspired by—or at the very least hoped to profit from—the freedom that the web had promised. This meant that especially in the early days, the platforms needed to disavow content moderation; we didn’t hear much about it,” Gillespie said. “It was obscured behind a kind of mythos of open participation; that when you went on these platforms all you would find is all of the content you wanted, all the opportunity to speak, all the people you want to chat with, all the sociality you could possibly imagine.”

This perception leads to a misconstrued “cultural imagination” surrounding platforms, Gillespie argued. He noted that when conceptualizing such platforms, people continually fail to recognize moderation as a key component of their existence, as well as the labor that goes into moderating content.

Yet companies have always moderated content, and recent interest in—and calls for—stronger content moderation have spurred the expansion of content moderation teams. But the challenges surrounding effectively moderating platforms extend beyond the sheer number of laborers.

Concerns about what content is moderated abound. Gillespie recalled that while content policy workers at large platforms acknowledge that always being “right” or “just” in reviewing content is an impossible task, “one of the things they often emphasize is being consistent.”

“These [moderation] systems are struggling between this emphasis on consistency and automating—if not making automatic—these processes,” Gillespie added. “How do you get thousands of people to make these decisions very quickly and very consistently and very fairly and not lose their minds in the process?”

Asked about the “right approach” platform companies should take toward moderation, Gillespie stressed that education within the companies is critical. Platforms have content moderation teams of varying sizes, but those teams are often brought into projects and new developments only during the final stages, to flag potential problems, he said. In his research, Gillespie found that engineers and designers at different levels of companies hadn’t realized the extent of the misuse of their platforms by users—including Alex Jones—despite those misuses being flagged by content moderation teams.

Gillespie emphasized that the education process within platform companies should include “both the challenges of moderation—the depths of the problems of moderation—but also the economic costs of moderation.”

Social media platforms thrive on user-generated content, and within this sphere is the hope for viral or “unexpected” content, Gillespie said. Moderating content to create “reliable, predictable, safe, but also surprising” platforms is thus one of a few complicated tensions that platforms must negotiate. “Platforms need to tame that unexpectedness, while also appearing open. This is key to the business,” Gillespie explained.

“Platforms have to take our contributions, freely given, as raw material to assemble into a flow of content that they hope will be engaging and tolerable. And this is moderation. Moderation is the mechanism that tries to manage those tensions. How do you allow an unfettered flow of user-generated information that also produces an engaging and safe community?” Gillespie asked. “Moderation is the only answer to that question, and it’s an imperfect one.”

The Berkman Klein Center Luncheon Series is a weekly forum for conversations about Internet issues and research. It is free and open to the public.

Past Event
Tuesday, October 30, 2018
Time
12:00 PM - 1:15 PM ET
Location
HLS Wasserstein Hall, Milstein East C (room 2036, second floor)
1585 Massachusetts Avenue
Cambridge, MA 02138 US
