News coverage from July 2023 about the Center for an Informed Public and CIP-affiliated research and researchers.

  • PolitiFact (July 6): “How Meta, TikTok, Twitter and YouTube plan to address 2024 election misinformation”
    CIP director and co-founder Kate Starbird was interviewed by PolitiFact about YouTube’s decision to stop removing videos that promote election falsehoods. “It is interesting to note that, after false claims about voter fraud helped to motivate and mobilize the Jan. 6 (2021) attack on the U.S. Capitol, the reaction from platforms has been to move away from moderating such claims, rather than doubling down on moderation,” Starbird, a UW Human Centered Design & Engineering associate professor, told PolitiFact. “We could attribute that to the success of the effort to rhetorically equate social media moderation to ‘censorship.’”

***

  • KNKX Public Radio (July 7): “Can libraries help hold our information landscape together?”
    CIP research fellow Jason Young, who helps lead the Co-Designing for Trust project, was interviewed by KNKX about efforts to leverage rural librarians to educate communities about digital literacy and fight the spread of misinformation. “Misinformation is a very political kind of topic already and so people might want to avoid that or, for example, a lot of people don’t think they’re going to get tricked,” Young said, referring to the challenges librarians face in getting patrons to engage in digital literacy education. 

***

  • Axios (July 10): “How AI will turbocharge misinformation — and what we can do about it”
    CIP director and co-founder Kate Starbird was interviewed by Axios for an article about key ways generative AI could contribute to a rise in misinformation. Starbird said: “Generative AI has the potential to accelerate the spread of both mis- and disinformation, and exacerbate the ongoing challenge of finding information we can trust online.” 

***

  • Los Angeles Times (July 13): “Column: Artificial intelligence chatbots are spreading fast, but hype about them is spreading faster”
    Los Angeles Times business columnist Michael Hiltzik referenced a tweet by CIP co-founder Ryan Calo about the growing hype around the often unfounded abilities and perils of generative AI. The AI industry has promoted warnings about the perils of AI, such as the risk of AI-driven extinction, but those warnings come with an ulterior motive to “focus the public’s attention on a far-fetched scenario that doesn’t require much change to their business models” and to convince the public that AI is uniquely powerful, according to Calo, a professor in the UW School of Law and Information School.

***

  • Nautilus Magazine (July 18): “Yes, there is a cure for bullshit”
    CIP faculty member and UW Biology professor Carl Bergstrom and Calling Bullshit: The Art of Skepticism in a Data-Driven World, the 2020 book he co-authored with CIP co-founder and UW Information School associate professor Jevin West, were referenced in a Nautilus magazine article about bullshit and the scientific importance of identifying it.

***

  • The New York Times Magazine (July 25): “The ongoing mystery of Covid’s origin”
    In an interview with David Quammen, author of Breathless: The Scientific Race to Defeat a Deadly Virus, CIP faculty member and UW Biology professor Carl Bergstrom discussed the human affinity for dark theories of big events.

***

  • Vox (July 29): “Why TikTokers are drinking laundry detergent”
    It’s hard to track the impact of TikTok trends like the borax challenge, “[e]specially when the trend involves ingesting a substance that is typically thought of as toxic,” CIP research scientist Rachel Moran-Prestridge told Vox in an interview, so “it’s unclear how many people who show interest online will actually perform the behavior offline.”