The Center for an Informed Public is an interdisciplinary research initiative at the University of Washington dedicated to resisting strategic misinformation, promoting an informed society and strengthening democratic discourse.
Emma S. Spiro (PI), Kate Starbird (Co-PI)
This research addresses empirical and conceptual questions about online rumoring, asking: (1) How do online rumors mutate, branch, and otherwise evolve over the course of their lifetime? (2) How can theories of rumor spread in offline settings be extended to online interaction, and what factors (technological and behavioral) influence these dynamics, perhaps making online settings distinct environments for information flow? The dynamics of information flow are particularly salient in the context of crisis response, where social media have become an integral part of both the formal and informal communication infrastructure. Improved understanding of online rumoring could inform communication and information-gathering strategies for crisis responders, journalists, and citizens affected by disasters, leading to innovative solutions for detecting, tracking, and responding to the spread of misinformation and malicious rumors. This project has the potential to fundamentally transform both methods and theories for studying collective behavior online during disasters. Techniques developed for tracking rumors as they evolve and spread over social media will aid other researchers in addressing similar problems in other contexts.
Emma S. Spiro (PI)
A number of high-profile incidents have highlighted tensions between citizens and police, bringing issues of police-citizen trust and community policing to the forefront of the public’s attention. Efforts to mediate this tension emphasize the importance of promoting interaction and developing social relationships between citizens and police. This strategy – a critical component of community policing – may be employed in a variety of settings, including social media. While the use of social media as a community policing tool has gained attention from precincts and law enforcement oversight bodies, the ways in which police are expected to use social media to meet these goals remain an open question. This study seeks to explore how police are currently using social media as a community policing tool. It focuses on Twitter – a functionally flexible social media space – and considers whether and how law enforcement agencies are co-negotiating norms of engagement within this space, as well as how the public responds to the behavior of police accounts.
Emma S. Spiro (PI)
When crises occur, including natural disasters, mass casualty events, and political/social protests, we observe drastic changes in social behavior. Local citizens, emergency responders and aid organizations flock to the physical location of the event. Global onlookers turn to communication and information exchange platforms to seek and disseminate event-related content. This social convergence behavior, long known to occur in offline settings in the wake of crisis events, is now mirrored – perhaps enhanced – in online settings. This project looks specifically at the mass convergence of public attention onto emergency responders during crisis events. Viewed through the framework of social network analysis, convergence of attention onto individual actors can be conceptualized in terms of network dynamics. This project employs a longitudinal study of social network structures in a prominent online social media platform to characterize instances of social convergence behavior and subsequent decay of social ties over time, across different actor types and different event types.
Emma Spiro (Co-PI), Zack Almquist (Co-PI)
Individuals are influenced by their social networks. People adjust not only their opinions and attitudes, but also their behaviors based on both direct and indirect interaction with peers. Questions about social influence are particularly salient for activity-based behaviors; indeed, much attention has been paid to promoting healthy habits through social interaction in online communities. A particularly interesting implication of peer influence in these settings is the potential for network-based interventions that utilize network processes to promote or contain certain behaviors or actions in a population; however, the first step toward designing such intervention strategies is to understand how, when, and to what extent social signals delivered via social interaction influence behavior. This project fills this gap by using digital traces of behaviors in online platforms to observe and understand how social networks and interactions are associated with behavior and behavior change.
Kate Starbird (PI), Emma Spiro (Co-PI), Robert Mason (Co-PI)
This research seeks both to understand the patterns and mechanisms of the diffusion of misinformation on social media and to develop algorithms to automatically detect misinformation as events unfold. During natural disasters and other hazard events, individuals increasingly utilize social media to disseminate, search for and curate event-related information. Eyewitness accounts of event impacts can now be shared by those on the scene in a matter of seconds. There is great potential for this information to be used by affected communities and emergency responders to enhance situational awareness and improve decision-making, facilitating response activities and potentially saving lives. Yet several challenges remain; one is the generation and propagation of misinformation. Indeed, during recent disaster events, including Hurricane Sandy and the Boston Marathon bombings, the spread of misinformation via social media was noted as a significant problem; evidence suggests it spread both within and across social media sites as well as into the broader information space.
Taking a novel and transformative approach, this project aims to utilize the collective intelligence of the crowd – the crowdwork of some social media users who challenge and correct questionable information – to distinguish misinformation and aid in its detection. It will both characterize the dynamics of misinformation flow online during crisis events, and develop a machine learning strategy for automatically identifying misinformation by leveraging the collective intelligence of the crowd. The project focuses on identifying distinctive behavioral patterns of social media users in both spreading and challenging or correcting misinformation. It incorporates qualitative and quantitative methods, including manual and machine-based content analysis, to look comprehensively at the spread of misinformation.
Emma S. Spiro (PI)
Project HEROIC is a collaborative, NSF-funded effort with researchers at the University of Kentucky and the University of California, Irvine, that strives to better understand the dynamics of informal online communication in response to extreme events.
The nearly continuous, informal exchange of information — including such mundane activities as gossip, rumor, and casual conversation — is a characteristic human behavior, found across societies and throughout recorded history. While often taken for granted, these natural patterns of information exchange become an important “soft infrastructure” for decentralized resource mobilization and response during emergencies and other extreme events. Indeed, despite being historically limited by the constraints of physical proximity, small numbers of available contacts, and the frailties of human memory, informal communication channels are often the primary means by which time-sensitive hazard information first reaches members of the public. This capacity of informal communication has been further transformed by the widespread adoption of mobile devices (such as “smart-phones”) and social media technologies (e.g., microblogging services such as Twitter), which allow individuals to reach much larger numbers of contacts over greater distances than was possible in previous eras.
Although the potential to exploit this capacity for emergency warnings, alerts, and response is increasingly recognized by practitioners, much remains to be learned about the dynamics of informal online communication in emergencies — and, in particular, about the ways in which existing streams of information are modified by the introduction of emergency information from both official and unofficial sources. Our research addresses this gap, employing a longitudinal, multi-hazard, multi-event study of online communication to model the dynamics of informal information exchange in and immediately following emergency situations.
Jevin D. West, Carl Bergstrom
The aim of the Eigenfactor Project is to develop methods, algorithms, and visualizations for mapping the structure of science. We use these maps to identify (1) disciplines and emerging areas of science, (2) key authors, papers and venues, and (3) communication patterns such as differences in gender bias. We also use these maps to study scholarly publishing models and build recommendation engines and search interfaces for improving how scholars access and navigate the literature.
Jevin D. West, Carl Bergstrom
The open access movement has made great strides. There has been a significant increase in open access journals over the last ten years, and many large foundations now require OA. Unfortunately, during the same period, there has been a significant increase in exploitative, predatory publishers, which charge authors to publish with little or no peer review, editorial services or authentic certification. We are developing a cost-effectiveness tool that will create an open journal market of prices and influence scores in which these kinds of journals can be objectively identified.
Kate Starbird (PI)
Ph.D. Student Lead: Tom Wilson
While there is increasing awareness and research about online information operations—efforts by state and non-state actors to manipulate public opinion through methods such as the coordinated dissemination of disinformation, amplification of specific accounts or messages, and targeted messaging by agents who impersonate online political activists—there is still a lot we do not understand, including how these operations take place across social media platforms. In this research we are investigating cross-platform collaboration and content amplification that can work to propagate disinformation, specifically as part of an influence campaign targeting the White Helmets, a volunteer response organization that operates in Syria.
Through a mixed-method approach that uses ‘seed’ data from Twitter, we have examined the role of YouTube in the influence campaign against the White Helmets. Preliminary findings suggest that on Twitter the accounts working to discredit and undermine the White Helmets are far more active and prolific than those that support the group. Furthermore, this cluster of anti-WH accounts on Twitter uses YouTube as a resource—leveraging the specific affordances of the different platforms in complementary ways to achieve their objectives.
Using Facebook engagements to assess how information operations micro-target online audiences using “alternative” new media
Kate Starbird (PI)
Ph.D. Student Lead: Tom Wilson
There is a pressing need to understand how social media platforms are being leveraged to conduct information operations. In addition to being deployed to influence democratic processes, information operations are also utilized to complement kinetic warfare on a digital battlefield. In prior work we have developed a deep understanding of the Twitter-based information operations conducted in the context of the civil war in Syria. Extending upon this work, this project examines how Facebook is leveraged within information operations, and how a subsection of the “alternative” media ecosystem is integrated into those operations. We aim to understand the structure and dynamics of the media ecosystem that is utilized by information operations to manipulate public opinion, and more about the audiences that engage with related content from these domains. Our research will provide insight into how Facebook features into a persistent, multi-platform information operations campaign. Complementing previous research, it will provide insight into how a subsection of the alternative media ecosystem is leveraged by political entities to micro-target participants in specific online communities with strategic messaging and disinformation.
Helping Journalists Unravel the Trajectories of Online Disinformation
Kate Starbird (PI)
Ph.D. Student Lead: Melinda McClure Haughey
CiP researchers have been studying the online spread of rumors, misinformation, and disinformation for several years. We have developed and continue to expand and refine the infrastructure and research methods for collecting, curating, and analyzing social media data related to these phenomena. In this project, we explore strategies to extend our methods and to develop tools and techniques that allow other people—especially journalists and humanitarian responders—to analyze digital trace data themselves.
Unraveling Online Disinformation Trajectories: Applying and Translating a Mixed-Method Approach to Identify, Understand and Communicate Information Provenance
Kate Starbird (PI)
This project will improve our understanding of the spread of disinformation in online environments. It will contribute to the field of human-computer interaction in the areas of social computing, crisis informatics, and human-centered data science. Conceptually, it explores relationships between technology, structure, and human action – applying the lens of structuration theory toward understanding how technological affordances shape online action, how online actions shape the underlying structure of the information space, and how those integrated structures shape information trajectories. Methodologically, it enables further development, articulation and evaluation of an iterative, mixed-method approach for interpretative analysis of “big” social data. Finally, it aims to leverage these empirical, conceptual and methodological contributions toward the development of innovative solutions for tracking disinformation trajectories.
The online spread of disinformation is a societal problem at the intersection of online systems and human behavior. This research program aims to enhance our understanding of how and why disinformation spreads and to develop tools and methods that people, including humanitarian responders and everyday analysts, can use to detect, understand, and communicate its spread. The research has three specific, interrelated objectives: (1) to better understand the generation, evolution, and propagation of disinformation; (2) to extend, support, and articulate an evolving methodological approach for analyzing “big” social media data for use in identifying and communicating “information provenance” related to disinformation flows; (3) to adapt and transfer the tools and methods of this approach for use by diverse users for identification of disinformation and communication of its origins and trajectories. More broadly, it will contribute to the advancement of science through enhanced understandings and conceptualization of the relationships between technological affordances, social network structure, human behavior, and intentional strategies of deception. The program includes an education plan that supports PhD student training and recruits diverse undergraduate students into research through multiple mechanisms, including for-credit research groups and an academic bridge program.
Rolf Hapel, Chris Coward, Chris Jowaisas, Jason Young
Given their historic role of curating information, libraries have the potential to be key players in combating misinformation, political bias, and other threats to democracy. This research project seeks to support public libraries in their efforts to address misinformation by developing library-based community labs: spaces where patrons can collectively explore pressing social issues. Community labs have become popular across many European countries, where they are used to instigate democratic debate, provide discussion spaces and programming, alleviate community tensions, and promote citizenship and mutual understanding. This project aims to adapt a community lab model for use by public libraries in Washington.
Bergstrom, Carl T., and Joseph B. Bak-Coleman. “Information gerrymandering in social networks skews collective decision-making.” Nature 573 (2019): 40-41.
C.T. Bergstrom, J.D. West (2019) Calling Bullshit. Random House
West, Jevin D. and Bergstrom, Carl T. (2019) Misinformation in and about science. (In review)
Kate Starbird, Ahmer Arif, and Tom Wilson. (2019). Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations. PACMHCI. 3, Computer-Supported Cooperative Work (CSCW 2019). Article 127.
Keeping Rumors in Proportion: Managing Uncertainty in Rumor Systems. PM Krafft, ES Spiro – Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2019
Coward, C., McClay, C., Garrido, M. (2018). Public libraries as platforms for civic engagement. Seattle: Technology & Social Change Group, University of Washington Information School.
Ahmer Arif, Leo G. Stewart, and Kate Starbird. (2018). Acting the Part: Examining Information Operations within #BlackLivesMatter Discourse. PACMHCI. 2, Computer-Supported Cooperative Work (CSCW 2018). Article 20.
Bergstrom, Carl T., and Jevin D. West. (2018) “Why scatter plots suggest causality, and what we can do about it.” arXiv preprint arXiv:1809.09328
Madeline Lamo & Ryan Calo (2018) Regulating Bot Speech. UCLA Law Review. https://www.uclalawreview.org/regulating-bot-speech/
L. Kim, J.H. Portenoy, J.D. West, Katherine W. Stovel. (2018). Scientific Journals Still Matter in the Era of Academic Search Engines and Preprint Archives. Journal of the Association for Information Science and Technology. (in press)
Kate Starbird, Ahmer Arif, Tom Wilson, Katherine Van Koevering, Katya Yefimova, and Daniel Scarnecchia. (2018). Ecosystem or Echo-System? Exploring Content Sharing across Alternative Media Domains. Presented at 12th International AAAI Conference on Web and Social Media (ICWSM 2018), Stanford, CA, (pp. 365-374).
K Starbird, D Dailey, O Mohamed, G Lee, ES Spiro. Engage Early, Correct More: How Journalists Participate in False Rumors Online during Crisis Events. CHI, 2018.
P Krafft, K Zhou, I Edwards, K Starbird, ES Spiro. Centralized, Parallel, and Distributed Information Processing during Collective Sensemaking. CHI, 2017.
J.D. West. (2017) How to fine-tune your BS meter. Seattle Times. Op-ed
L. Kim, J.D. West, K. Stovel. (2017) Echo Chambers in Science? American Sociological Association (ASA) Annual Meeting, August 2017
Kate Starbird. (2017). Examining the Alternative Media Ecosystem through the Production of Alternative Narratives of Mass Shooting Events on Twitter. In 11th International AAAI Conference on Web and Social Media (ICWSM 2017), Montreal, Canada, (pp. 230-239).
Ahmer Arif, John Robinson, Stephanie Stanek, Elodie Fichet, Paul Townsend, Zena Worku and Kate Starbird. (2017). A Closer Look at the Self-Correcting Crowd: Examining Corrections in Online Rumors. Proceedings of the ACM 2017 Conference on Computer-Supported Cooperative Work & Social Computing (CSCW ’17), Portland, Oregon, (pp. 155-168).
Kate Starbird, Emma Spiro, Isabelle Edwards, Kaitlyn Zhou, Jim Maddock and Sindhu Narasimhan. (2016). Could This Be True? I Think So! Expressed Uncertainty in Online Rumoring. Proceedings of the ACM 2016 Conference on Human Factors in Computing Systems (CHI 2016), San Jose, CA. (pp. 360-371).
A Arif, K Shanahan, R Chou, S Dosouto, K Starbird, ES Spiro. How Information Snowballs: Exploring the Role of Exposure in Online Rumor Propagation. CSCW, 2016.
L Zeng, K Starbird, ES Spiro. #Unconfirmed: Classifying Rumor Stance in Crisis-Related Social Media Messages. ICWSM, 2016.
M. Rosvall, A.V. Esquivel, A. Lancichinetti, J.D. West, R. Lambiotte. (2014) Memory in network flows and its effects on spreading dynamics and community detection. Nature Communications. 5:4630, doi:10.1038/ncomms5630
Kate Starbird, Jim Maddock, Mania Orand, Peg Achterman, and Robert M. Mason. (2014). Rumors, False Flags, and Digital Vigilantes: Misinformation on Twitter after the 2013 Boston Marathon Bombings. Short paper. iConference 2014, Berlin, Germany, (9 pages).
J.D. West, T.C. Bergstrom, C.T. Bergstrom. (2014). Cost-effectiveness of open access publications. Economic Inquiry. 52: 1315-1321. doi: 10.1111/ecin.12117
ES Spiro, J Sutton, S Fitzhugh, M Greczek, N Pierski, CT Butts. Rumoring During Extreme Events: A Case Study of Deepwater Horizon 2010. WebSci, 2012.
J.D. West, C.T Bergstrom (2011) Can Ignorance Promote Democracy? Science. 334(6062):1503-1504. doi:10.1126/science.1216124
Education and Resources
To help combat the spread of misinformation, CIP director Jevin West and affiliate researcher Carl Bergstrom developed a class titled “Calling Bullshit: Data Reasoning in a Digital World.” The goal of the class is to teach students, and the public at large, how to spot and refute BS wrapped in numbers, statistics, and algorithms. View the first series of lectures from the class on YouTube or find more resources at CallingBullshit.org.
This recording of a Sept. 1, 2020 virtual public forum discusses the emerging threat of malicious synthetic media, often referred to as “deepfakes,” and the challenges they pose to democratic institutions and processes. Our panel of experts combines perspectives from technology, journalism, and civil society to explain what deepfakes are, why we should care about them, and what individuals can do to counter their impact.
In this interactive quiz, developed by the University of Washington’s Center for an Informed Public and Microsoft’s Defending Democracy Program in conjunction with USA TODAY and Sensity, learn more about deepfakes and other examples of synthetic media and test your ability to spot deepfake images.
Facts in the Time of COVID-19
During a pandemic it’s more important than ever to avoid falling for or spreading misinformation and disinformation. But with so much new and changing information, how do you know what to trust? Pacific Science Center in Seattle teamed up with the University of Washington’s Center for an Informed Public to create a virtual exhibit designed to help you navigate COVID-19 and the 24-hour news cycle.
We’re all looking at a lot of graphs and charts during the ongoing pandemic, but which are trustworthy? Check out this virtual exhibit, created by Pacific Science Center with the University of Washington’s Center for an Informed Public, to learn more about some of the common ways data visualizations can be accidentally distorted or intentionally manipulated and why that matters.
This hour-long webinar, co-presented with the University of Washington’s Center for an Informed Public and Washington State University’s Murrow College of Communication in partnership with the Washington Library Association, is designed to connect school librarians with the educational resources, tools and professional insights they need to better understand where misinformation and disinformation come from, how they’re shared and what they can do to educate students and other library users to be better information consumers.
This recording of a May 2020 webinar — presented by Jevin West, director of the Center for an Informed Public at the University of Washington, and Mike Caulfield, director of blended and networked learning at Washington State University Vancouver — is designed for educators interested in MisinfoDay, an annual event that teaches high school students, educators and librarians how to identify and combat online misinformation and disinformation.
Find insights from subject matter experts in academia and industry in this mini lecture series recorded in April 2018. Topics include strategies for cleaning up our polluted information system (CIP’s Jevin West), research conducted on online rumors in the context of crisis response (CIP’s Kate Starbird), and weaponized AI propaganda (Berit Anderson).
This spring, KUOW public radio in Seattle partnered with the Center for an Informed Public on a virtual event series focused on misinformation, how to spot it and stop its spread. On KUOW’s Stand With the Facts information page, check out extended interviews with CIP researchers and a Stand With the Facts toolkit.
Coexisting With COVID-19: COVID and the Truth
The 24-hour news cycle can be overwhelming. While we want to be informed, we also want to know that what we are learning is accurate and our information sources are trustworthy. As part of the University of Washington Graduate School’s Public Lectures series “Coexisting with COVID-19,” the Center for an Informed Public’s Jevin West and Kate Starbird talk about how we can be best informed in the face of often conflicting information.
In this July 2020 TEDx talk, Chris Coward suggests that everyone needs to become familiar with the basics of how mis- and disinformation works. He takes us on a tour of several disinformation tactics and discusses why these have been so effective in changing people’s beliefs. He concludes with some tips for more safely engaging with potentially problematic information. Chris Coward is a senior principal research scientist at the University of Washington Information School, co-founder of the Center for an Informed Public, and director of the Technology & Social Change Group.
In this engaging March 2018 TEDx talk, CIP director Jevin West teaches us to “call bullshit.” West shows how dangerous and misleading some news stories can be, and explains that while these news stories are fairly easy to create, they are harder to clean up because of the digital networks used to disseminate them.
This article from First Draft News provides a few ways to better understand the information ecosystem. It looks at the different kinds of misinformation and disinformation being created and shared, the motivations behind those who create content, and how they disseminate information. The typology of seven distinct content types and the matrix of intent are valuable tools.
CTRL-F, a project from Canadian nonprofit CIVIX, is an extension of the SIFT method, an approach to information literacy. The project provides simple skills for sorting through facts and misinformation online. Help students become robust fact-checkers and combat misinformation with a verification skills module themed around COVID-19.
AI-enabled learning algorithms can now easily and quickly generate synthetic “photographs” of people. Your task: guess which one is real. Created by the University of Washington’s Calling Bullshit project, whichfaceisreal.com aims to raise awareness of the ease with which facial identities can be faked, manufactured and disseminated digitally.
The Trust Project is an international consortium of news organizations building standards of transparency and working with technology platforms to affirm and amplify journalism’s commitment to transparency, accuracy, inclusion and fairness. The Trust Project has designed a system of “Trust Indicators” — that is, standardized disclosures about the news outlet, journalist and the commitments behind their work.
Red Pen Reviews uses a structured expert review method to deliver the most informative, consistent and unbiased nutrition and health book reviews available, free of charge. All reviewers hold at least a master’s degree, or its equivalent, in a relevant field of science. Each book review is the work of two experts: a primary reviewer who writes the review, and a peer reviewer who checks the work.
‘Lexicon of Lies’
From “misinformation” to “propaganda” to “parody,” use this report from independent nonprofit research organization Data & Society to better understand and describe the accuracy and relevance of media content. This report unpacks specific origins and applications of different forms of problematic information.