How a debunked COVID-19 video kept spreading after Facebook and YouTube took it down

Although social media giants such as Facebook and YouTube have largely removed a debunked documentary about the COVID-19 pandemic from their platforms, copies and variations of the video are still up on alternative social media sites where hundreds of thousands of people are watching them.

And links to that content keep popping up on the mainstream platforms. 

Plandemic is a 26-minute video, originally touted as a vignette meant to be part of a longer documentary, that is full of false and misleading claims about the coronavirus, including claims about how people can protect themselves.

The video flooded social media platforms in the first week of May. According to the New York Times, it was viewed more than eight million times across the major platforms.

The original version has been removed from Facebook, YouTube and Vimeo in their push to crack down on false or misleading information relating to the COVID-19 pandemic. Facebook said some of the documentary’s claims could cause “imminent harm,” and YouTube cited “medically unsubstantiated diagnostic advice.”

Nonetheless, clips and modified versions of the documentary have resurfaced on the sites. Facebook has labelled them false and linked to fact checks by news organizations detailing a range of problems with the content, including its promotion of conspiracy theories and incorrect medical information.

Complicating this effort is the fact that Plandemic is easy to find on websites known as alt-tech platforms, many of which position themselves as alternatives to the popular mainstream social media platforms and as proponents of uncensored free speech.

These social media platforms often act as a reservoir for content that’s been flagged and removed from major sites such as Facebook and YouTube, as was the case with Plandemic, and links to the content on alt-tech platforms often make their way back onto mainstream social media platforms.

So, despite the efforts of the mainstream sites to crack down on what they’ve deemed potentially harmful content, alt-tech platforms help keep it circulating.

Judy Mikovits, the discredited former researcher at the centre of Plandemic, walks with Mikki Willis, the filmmaker who created the video, in this still. (Screenshot/Elevate)

One such alt-tech site is BitChute, a video-sharing platform registered in the U.K. that is similar to YouTube, allowing people to comment and vote on posted videos. It shows more than 1,770 search results for the term “Plandemic.” The top result appears to be a re-post of the original Plandemic video, and it had more than 1.6 million views as of Wednesday.

There are about a dozen alt-tech platforms in operation, some based on blockchain technology that allows them to tout their decentralized nature.

It’s not possible to verify how many users and visitors each has, but some claim to have sizable communities.

Twitter-alternative Gab, for example, which is favoured by the far right, says it had at least a million registered users last year.

BitChute recently tweeted that it had 20 million unique visitors in April.

‘A censorship backfire’

Zarine Kharazian, assistant editor at the Atlantic Council’s Digital Forensic Research Lab in Washington, D.C., which studies disinformation and how it affects democratic norms, says in the case of Plandemic, the filmmaker seemed to anticipate the video would be downranked or removed from major platforms — initial posts featuring the video predicted as much.

At the same time, she said, copies began to spring up on the alternative video platforms.

Then, when the video was flagged and removed from mainstream social media sites, links to the video on alt-tech platforms showed up on Twitter and Facebook.

“That’s how the video is sort of staying alive even as the major platforms, with the control that they have, tried to remove and downgrade it,” she said.

“I think, paradoxically, what has been happening with the Plandemic video is a sort of censorship backfire. It’s called … the Streisand Effect.”

In 2003, legendary entertainer Barbra Streisand sued to have a photo of her home removed from the internet, which resulted in a far greater number of people seeing the photo because of the publicity around the lawsuit.

In the case of Plandemic, Kharazian says people who are curious about the video may be attracted by its now-taboo status and will go looking for it.

“In parallel to that, the … hardcore believers are outraged by the fact that the video has been removed. So they increase their sort of push to get the video in front of as many people as possible,” she said.  

“Those two things together makes it so any one content moderation effort on any one platform does not stop the conspiracy spread once the conspiracy has gone viral.”

Platforms tout freedom of expression 

When asked about Plandemic remaining on the BitChute platform, the company wrote in an email, “There is already legislation that places reasonable restrictions on speech such as not allowing incitement to violence and we follow those fully. 

“Google and Facebook’s censorship of ideas is misguided and counter-productive. Hiding the public from ideas, even bad ones, only makes society more susceptible to dangerous errors and infringes on people’s universal human right to freedom of expression.”

WATCH | How Plandemic and other debunked content keeps spreading:

A new video called ‘Plandemic’ is full of false claims about COVID-19, but that hasn’t stopped it from spreading online. An expert explains how ‘Plandemic’ and other conspiracy theories get popular.

Administrators of DTube, another video-sharing alt-tech platform, are anonymous, but one who identifies as “heimindanger” responded to CBC questions on Discord, another platform where DTube’s community hosts a forum.

“I have nothing to say about these videos, I haven’t watched them but I have seen them circulating. Any type of content is allowed on DTube. If something gets deleted on [YouTube], then very often this same content is re-uploaded on alternative video solutions like ours.” 

The administrator went on to say that users have the ability to upvote or downvote the content. Content that is upvoted is considered popular and shown to more users.

“So anyone can do his ‘public duty to limit harm’. No one controls what is free speech, or what is harming, it’s all vote-based,” the administrator said. 

Matthew Johnson, the director of education for MediaSmarts, a Canadian non-profit that promotes digital and media literacy, says in some cases, the alt-tech platforms are set up specifically to host content that gets taken down elsewhere.

“Sometimes these are created by conspiracy groups, by hate groups, by health misinformation groups as a refuge. And, sometimes, it’s just that one of these platforms may have a more laissez-faire attitude toward content moderation,” said Johnson.

WATCH | Debunking COVID-19 immunity scams:

Misinformation about so-called miracle cures for COVID-19 is spreading online. Can you really buy your way to a better immune system? We ask an expert: UBC professor Bernie Garrett, who studies deception in healthcare, including alternative medicine.

Because of the pandemic, major platforms are relying more on automated content moderation than on human moderators, says Johnson, and that shift can have some negative effects.

“In some cases, there’s an increased rate of false positives, which really does … hurt the public reputation of content moderation because you’re seeing things that are getting taken down that shouldn’t be taken down,” he said.

In March, for example, after an outpouring of complaints that posts from legitimate news sites — and even Canadian authorities — were being blocked, Facebook blamed its anti-spam system and eventually fixed the problem.

Despite the limitations of moderation, it was still important to remove Plandemic from the big platforms, because once it’s gone, people are less likely to stumble across it, Johnson said.

“That’s particularly important in places like YouTube, where, for instance, 70 per cent of views don’t come because someone is looking for something, they come because people are watching what was recommended to them by the YouTube algorithm,” he said.

“So, already, moving it off those platforms is a victory. It’s moving away from the public eye.”

How platforms are removing content

The big platforms have stepped up their efforts to flag misleading or potentially dangerous COVID-19 content with varying degrees of success, according to Philip N. Howard, the director of the Oxford Internet Institute at Oxford University.

The institute did a study of how much misinformation remained on social media platforms after the initial posts were debunked by fact-checkers.

“Two-thirds of the stories that we found on Twitter were still there a week later,” Howard said, whereas on Facebook, about 24 per cent of the posts remained on the platform without warning labels.

In a separate study, the group looked at what kind of information the average person gets when they search YouTube for health information on COVID-19. 

“We found that the average person searching for health information on YouTube doesn’t find junk. They usually find stuff prepared by professional news organizations,” he said.

“You have to search pretty hard on YouTube to find the most conspiratorial content about COVID.”

But, Howard said, those who are searching for that content inevitably find it elsewhere — or build their own platforms.

“It’s par for the course when a social media firm tries to manage the extreme views and push them out of the platform. Those extremists will go elsewhere to try to build their own technologies and keep up their conversations.”


