'''<center><big>TRUTHINESS IN DIGITAL MEDIA: A SYMPOSIUM THAT SEEKS TO UNDERSTAND AND ADDRESS PROPAGANDA AND MISINFORMATION IN THE NEW MEDIA ECOSYSTEM</big>'''
March 6-7, 2012
Berkman Center for Internet & Society and MIT Center for Civic Media
http://blogs.law.harvard.edu/truthiness/</center>



Revision as of 14:28, 22 February 2012


OVERVIEW

Introduction

As the networked media environment increasingly permeates private and public life, driven in part by the rapid and extensive travels of news, information, and commentary, our systems for identifying and responding to misinformation and propaganda are failing us, creating serious risk to everything from personal and financial health to fundamental democratic processes and governance. In this age, when many would argue that news and information have become key ingredients of broad social progress, our supply is tainted. Concerns about misinformation and disinformation are nothing new. Indeed, many tried-and-true techniques for disseminating misinformation remain just as vibrant in the current communications and information era. But digital media present new challenges to existing institutions, structures, and processes, jeopardizing their potential contributions to the health of political, economic, and social systems.

While opinions differ over how digital media ameliorate and exacerbate the spread and influence of misinformation, this multifaceted issue persists in the face of thoughtful, sustained, and creative responses, and it manifests in a great diversity of forms, roots, and harms. The motives for spreading misinformation are many: they may be partisan or commercial, may derive from or evoke moral and religious sensibilities, may offer political or social commentary, or may be merely whimsical. But what to do? Building upon recent convenings and a number of related projects, we are taking a critical step towards a deeper understanding of the problem, with a keen eye towards collectively identifying novel solutions and concrete actions to combat the deleterious impacts of misinformation in the near term and over time.

Objectives

This symposium will focus on exploring the many facets of this complex issue with an eye to crafting tools and strategies that ameliorate the negative impacts of deception, bias, and inaccuracy in the digital media ecosystem. We hope to put ourselves in a better position to take advantage of the benefits promised by digital media, while appreciating the positive aspects of creative media-making and probing the blurred boundaries between nefarious and beneficial media-shaping practices.

From academics to activists, techno geeks to policy geeks, and media scholars to media makers, participants will span a wide range of academic and professional backgrounds and expertise to promote cross-disciplinary learning and to facilitate the formation of novel and holistic approaches to these complex issues. The one-day public conference will include approximately 100 participants to balance inclusiveness with a participatory setting. It will draw upon a variety of formats to complement the topics, disciplinary approaches, and available participants, including selected presentations, case studies, tool demonstrations, and roundtable discussions. We hope to provide ample white space between sessions for people to process ideas, connect with one another, and generate new insights and approaches regarding the nature and complexity of the problem, relevant narratives and illustrative use cases, new areas of vulnerability and concern, and the relative merits of existing strategies to combat misinformation.

The final session on March 6, 2012 is intended to provide a bridge between the day’s discussions and the Hack Day, with a focus on interventions, tools, and useful strategies for a variety of users operating in today’s digital communications environment. Hosted at the MIT Media Lab, the Hack Day will provide a series of working sessions, informed by the conference and designed to maximize focus and productivity. It will endeavor to conceive and prototype tools, processes, and other resources to confront the challenges identified the previous day.

Mode

We are delighted to have a diverse range of experts, scholars, commentators, practitioners, and users as participants and speakers in the Symposium. Our hope is to create a highly interactive environment that encourages conversation, provocation, debate, and input from all participants throughout the day. We will provide numerous mechanisms for commentary, both in the lead-up to and after the symposium, from our blog to the Hack Day on March 7, 2012. Moderators and session leads will also look to the crowd to deepen and enrich all sessions and to approach these complex issues from multiple dimensions, disciplines, and perspectives.

RESOURCES

Please add links to papers, articles, blog posts, and other items related to the symposium's themes and of interest to symposium participants to this page. Users need to create an account to edit this wiki -- click on the link in the top right corner of this page to obtain a username/password.



Panagiotis Metaxas, “Web spam, social propaganda and the evolution of search engine rankings,” Lecture Notes in Business Information Processing (LNBIP), Springer-Verlag, 2010. http://cs.wellesley.edu/~pmetaxas/Metaxas-EvolutionSEs-LNBIP10.pdf

Panagiotis Metaxas and Eni Mustafaraj, “From obscurity to prominence in minutes: political speech and real-time search,” Web Science 2010 Conference, Raleigh, NC, April 2010. http://cs.wellesley.edu/~pmetaxas/Metaxas-Obscurity-to-prominence.pdf

Eni Mustafaraj, Samantha Finn, Carolyn Whitlock and Panagiotis Metaxas, “Vocal minority versus silent majority: discovering the opinions of the long tail,” IEEE SocialCom Conference, Boston, MA, October, 2011. http://cs.wellesley.edu/~pmetaxas/Silent-minority-Vocal-majority.pdf

Eni Mustafaraj and Panagiotis Metaxas, “Trails of trustworthiness in real-time streams,” Design, Influence and Social Technologies, ACM Computer Supported Cooperative Work (CSCW), Seattle, WA, February, 2012. http://cs.wellesley.edu/~pmetaxas/TrustTrails.pdf

Electronic Privacy Information Center, “E-Deceptive Campaign Practices: Technology and Democracy 2.0 Report 2010,” October 2010. The report reviews the potential for abuse of Internet-based technology in the election context, and makes recommendations on steps that should be taken by Election Administrators, voters, and those involved in Election Protection efforts. E-Deceptive campaigns are internet-based attempts to misdirect targeted voters regarding the voting process, and include false statements about poll place hours, election dates, voter identification rules, or voter eligibility requirements. http://epic.org/privacy/voting/E_Deceptive_Report_10_2010.pdf

Geoffrey York, “How a U.S. agency cleaned up Rwanda’s genocide-stained image,” Globe and Mail, January 31, 2012. http://www.theglobeandmail.com/news/world/how-a-us-agency-cleaned-up-rwandas-genocide-stained-image/article2322005/page1/

Melanie Newman, “PR firm ‘attacked’ critics of Rwandan government,” Bureau of Investigative Journalism, December 6, 2011. http://www.thebureauinvestigates.com/2011/12/06/pr-firm-attacked-critics-of-rwandan-government/

Truthy: Truthy is a research project that helps you understand how memes spread online. We collect tweets from Twitter and analyze them. With our statistics, images, movies, and interactive data, you can explore these dynamic networks. http://truthy.indiana.edu/

“Information Diffusion in Online Social Networks,” Center for Complex Networks and Systems Research. The focus of this research project is understanding how information propagates through complex networks. Leveraging large-scale behavioral trace data from online social networking platforms, we are able to analyze and model the spread of information, from political discourse to market trends, in unprecedented detail. http://cnets.indiana.edu/groups/nan/truthy

Yochai Benkler, “Seven Lessons from SOPA/PIPA/Megaupload and Four Proposals on Where We Go From Here,” TechPresident, January 25, 2012. http://techpresident.com/news/21680/seven-lessons-sopapipamegauplaod-and-four-proposals-where-we-go-here


Howard, P. N., Agarwal, S. D., & Hussain, M. M. (2011). When Do States Disconnect Their Digital Networks? Regime Responses to the Political Uses of Social Media. The Communication Review, Twitter Revolutions? Addressing Social Media and Dissent, 14(3), 216-232.

Howard, P. N., Agarwal, S. D., & Hussain, M. M. (2011). The Dictators’ Digital Dilemma (No. 13). Issues in Technology and Innovation (pp. 1-11). Washington, DC: The Brookings Institution.

Howard, P. N., Duffy, A., Freelon, D., Hussain, M. M., Mari, W., & Mazaid, M. (2011). Opening Closed Regimes: What was the role of social media during the Arab Spring? National Science Foundation-funded Information Technology and Political Islam project (pp. 1-30). Seattle, WA: Center for Communication and Civic Engagement.

Howard, P. N., & Hussain, M. M. (2011). The Role of Digital Media. Journal of Democracy, The Upheavals in Egypt and Tunisia, 22(3), 35-48.

Howard, P. N., Hussain, M. M., Freelon, D., Mari, W., & Duffy, A. (2011). Digital Media and Contagious Democracy: Lessons from the Arab Spring. Dallas, TX: Bush Institute for Human Freedom.

Moy, P., & Hussain, M. M. (2011). Media Influences on Political Trust and Engagement. In R. Y. Shapiro & L. R. Jacobs (Eds.), The Oxford Handbook of American Public Opinion and the Media (pp. 220-235). Oxford, UK: Oxford University Press.

Nahon, K., Hemsley, J., Hussain, M. M., & Walker, S. (2011). Viral Political Information in the US Elections Metabase, 2008 [Data Set]. Seattle, WA: Information and Society Center.

Nahon, K., Hemsley, J., Walker, S., & Hussain, M. M. (2011). Fifteen Minutes of Fame: The Power of Blogs in the Lifecycle of Viral Political Information. Policy & Internet, 3(1).

Pamela Meyer, “How to spot a lie,” CNN.com, November 14, 2011, http://www.cnn.com/2011/11/13/opinion/meyer-lie-spotting/

Andrew Sullivan, “Wikipedia is not Truth,” The Daily Dish, February 20, 2012. http://andrewsullivan.thedailybeast.com/2012/02/wikipedia-is-not-truth.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+andrewsullivan%2FrApM+%28The+Daily+Dish%29

Timothy Messer-Kruse, “The ‘Undue Weight’ of Truth on Wikipedia,” The Chronicle of Higher Education, February 12, 2012. http://chronicle.com/article/The-Undue-Weight-of-Truth-on/130704/

Fabrice Epelboin, “Did the French Govt. Ask Twitter to Suspend Satirical Accounts?,” Read Write Web, February 19, 2012. http://www.readwriteweb.com/archives/the_morning_after_french_president.php

FOOD FOR THOUGHT DINNERS

Food for Thought dinners are self-organized gatherings that allow conference attendees to engage in informal, themed conversation with other conference participants. They will take place after the symposium reception ends, on the evening of March 6 (Tues). Attendance is limited to six people per dinner (including the organizer).

If you would like to propose / organize a food for thought dinner:

If you would like to join one of the dinners:

  • Add your name to one of the slots below by 3PM 3/6 (Tues)

Please note that attendees will pay their own dinner costs.

If you decide not to attend a dinner to which you are signed up, please delete yourself from the list. If you have any questions, please contact one of the Berkman staff members.

Restaurant locations and maps are listed with each dinner. For restaurants in Harvard Square, expect approximately a 10 minute walk from HLS campus. For restaurants in Porter Square, expect approximately a 15 minute walk from HLS.

Organizer, Proposed Topic

  • Insert time, address, venue
  1. Insert name
  2. Insert name
  3. Insert name
  4. Insert name
  5. Insert name
  6. Insert name

Organizer, Proposed Topic

  • Insert time, address, venue
  1. Insert name
  2. Insert name
  3. Insert name
  4. Insert name
  5. Insert name
  6. Insert name

Organizer, Proposed Topic

  • Insert time, address, venue
  1. Insert name
  2. Insert name
  3. Insert name
  4. Insert name
  5. Insert name
  6. Insert name