The most recent release of the Twitter Files reveals that during the peak of the COVID-19 pandemic, Twitter’s leaders gave in to pressure from the government to censor information that was accurate but unpopular, suspended doctors who held opposing opinions, and relied on bots and foreign contractors to moderate complex scientific topics.
The 40-tweet Twitter Files exposé, “How Twitter Rigged the Covid Debate,” was published on Monday by independent journalist David Zweig. Zweig examined internal Twitter documents for the Free Press in order to compile the story. The Twitter Files is a collection of articles based on internal Twitter papers that the company’s new CEO, Elon Musk, sent to a small group of journalists.
According to Zweig’s article on Monday, the Biden and Trump administrations put pressure on Twitter and other social media platforms to highlight material that complemented their narratives and to stifle that which did not. According to Zweig, the Trump administration asked internet firms to “fight falsehoods” regarding “runs on food shops” at the start of the pandemic.
According to Zweig, when Joe Biden became president, his administration was worried about “anti-vaxxer stories,” especially the account of journalist Alex Berenson.
After Biden claimed that social media firms were “killing people” for promoting vaccination disinformation, Berenson’s Twitter account was banned. Berenson later filed a lawsuit and reached a settlement with Twitter.
According to Zweig, Biden’s staff was “extremely upset” that Twitter didn’t take more drastic measures to deplatform accounts it disapproved of.
However, according to Zweig, Twitter did censor viewpoints, including those of medical professionals and scientific authorities whose views “conflicted with the White House’s official position,” “differed from the CDC guidelines,” or were “contrarian but factual.”
According to Zweig’s article, Dr. Martin Kulldorff, an epidemiologist at Harvard Medical School, posted opinions that ran counter to those of the American Left (the political affiliation, Zweig wrote, of practically the entire personnel at Twitter) and public health authorities.
A moderator marked one of his tweets regarding vaccinations as “fake information,” even though it was essentially an expert view consistent with vaccination laws in many other nations. Because it deviated from CDC recommendations, however, Twitter censors applied the label anyway, Zweig said.
A Rhode Island doctor named Andrew Bostom was indefinitely banned from Twitter for allegedly disseminating false material, including a message that alluded to the findings of peer-reviewed research on mRNA vaccinations.
Only one of Bostom’s five alleged violations was found to be true. That one was only because it “cited data that was legitimate but inconvenient to the public health establishment’s narrative about the risks of flu versus Covid in children,” Zweig wrote. The internal audit was carried out after Bostom’s attorney contacted Twitter.
Zweig said that a large portion of Twitter’s content moderation was carried out by bots trained with artificial intelligence and machine learning, and by international contractors in countries like the Philippines.
Zweig noted that “tasking non-specialists to judge tweets on complicated issues like myocarditis and mask effectiveness data was destined for a big mistake rate.” He added that “higher-level Twitter management picked the inputs and decision trees the bots and foreign contractors based the choices on.”
Zweig noted the response to a tweet by then-president Donald Trump in October 2020, after he had acquired Covid and been discharged from Walter Reed Medical Center, calling it an example of human prejudice. “Covid is nothing to fear. It shouldn’t rule your life,” Trump tweeted.
Jim Baker, at the time Twitter’s deputy general counsel, raised the question of why that comment had not been reported as a breach of Twitter’s policy.
Yoel Roth, formerly in charge of trust and safety at Twitter, had to clarify that optimism did not qualify as false information.