“I’m not really expecting things to ever be what they were,” says Sarah. “There’s no going back.” Sarah’s mother is a QAnon believer who first came across the conspiracy theory on YouTube. Now that YouTube has taken steps toward regulating misinformation and conspiracy theories, a new site, Rumble, has risen to take its place. Sarah feels the platform has taken her mother away from her.

Rumble is “just the worst possible things about YouTube amplified, like 100 percent,” says Sarah. (Her name has been changed to protect her identity.) Earlier this year, her mother asked for help accessing Rumble when her favorite conservative content creators (from Donald Trump Jr. to “Patriot Streetfighter”) flocked from YouTube to the site. Sarah soon became one of 150,000 members of the support group QAnon Casualties as her mother tumbled further down the dangerous conspiracy theory rabbit hole.

Between September 2020 and January 2021, monthly site visits to Rumble rose from 5 million to 135 million; as of April, they were sitting at just over 81 million. Sarah’s mother is one of these new Rumble users and, according to Sarah, is now refusing to get the Covid-19 vaccine. To explain her decision, Sarah says, her mother repeats the dangerous anti-vax disinformation found in many videos on Rumble.

Rumble claims that it does not promote misinformation or conspiracy theories but simply has a free-speech approach to regulation. However, our research reveals that Rumble has not only allowed misinformation to thrive on its platform, it has also actively recommended it.

If you search “vaccine” on Rumble, you are three times more likely to be recommended videos containing misinformation about the coronavirus than accurate information. One video by user TommyBX featuring Carrie Madej, a popular voice in the anti-vax world, alleges: “This is not just a vaccine; we’re being connected to artificial intelligence.” Others state, without evidence, that the vaccine is deadly and has not been properly tested.

Even if you search for an unrelated term such as “law,” our research shows you are just as likely to be recommended Covid-19 misinformation as not: about half of the recommended content is misleading. If you search for “election,” you are twice as likely to be recommended misinformation as factual content.

[Chart: Rumble recommendation data. Courtesy of Ellie House, Isabelle Stanley, and Alice Wright; created with Datawrapper]

The data behind these findings was gathered over five days in February 2021. Using an adaptation of code first developed by Guillaume Chaslot, an ex-Google employee who worked on YouTube’s recommendation algorithm, we collected information about which videos Rumble recommends for five neutral search terms: “democracy,” “election,” “law,” “coronavirus,” and “vaccine.” The code was run five times for each word, on different days and at different times, so that the data reflected the consistent behavior of Rumble’s recommendation algorithm rather than a one-off snapshot.
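In outline, the collection step works like a simple crawler: fetch the search results for a term, follow each result, and record the videos Rumble recommends alongside it. The sketch below illustrates that loop in Python. The requests and BeautifulSoup usage is standard, but the URL path and CSS selectors are illustrative placeholders rather than Rumble’s actual markup, and the study itself used an adaptation of Chaslot’s code, not this script.

```python
# A minimal sketch of the crawl loop described above. Selectors and the
# search URL are assumptions for illustration, not Rumble's real markup.
import requests
from bs4 import BeautifulSoup

BASE = "https://rumble.com"
SEARCH_TERMS = ["democracy", "election", "law", "coronavirus", "vaccine"]

def fetch_soup(url):
    """Download a page and parse it into a BeautifulSoup tree."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return BeautifulSoup(resp.text, "html.parser")

def search_results(term, limit=10):
    """Return the first `limit` video paths for a search term."""
    soup = fetch_soup(f"{BASE}/search/video?q={term}")  # assumed URL pattern
    return [a["href"] for a in soup.select("a.video-item--a")][:limit]  # placeholder selector

def recommendations(video_path, limit=20):
    """Return the recommended videos shown alongside a given video."""
    soup = fetch_soup(BASE + video_path)
    return [a["href"] for a in soup.select("a.mediaList-link")][:limit]  # placeholder selector

if __name__ == "__main__":
    for term in SEARCH_TERMS:
        for video in search_results(term):
            for rec in recommendations(video):
                # Each (term, video, recommendation) triple is logged
                # for later manual labeling.
                print(term, video, rec)
```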

Over 6,000 recommendations were manually analyzed. Because there can be reasonable disagreement about what counts as misinformation, this investigation erred on the side of caution. For example, if a content creator said “I won’t take the vaccine because I think there might be a tracking chip in it,” the video was not categorized as misinformation; if a video stated “there is a tracking device in the vaccine,” it was. Our conclusions are therefore conservative.
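Once each recommendation has been hand-labeled, the per-term figures quoted above reduce to simple proportions. A minimal sketch follows, assuming each labeled recommendation is stored as a (search term, is-misinformation) pair; the data layout and sample values are illustrative only, not the investigation’s actual dataset.

```python
# Computes the share of recommended videos flagged as misinformation,
# per search term, from hand-labeled (term, bool) pairs.
from collections import Counter

def misinformation_rates(labels):
    """labels: iterable of (term, is_misinfo) pairs from the manual review."""
    totals, flagged = Counter(), Counter()
    for term, is_misinfo in labels:
        totals[term] += 1
        if is_misinfo:
            flagged[term] += 1
    return {term: flagged[term] / totals[term] for term in totals}

# Illustrative sample only:
sample = [("vaccine", True), ("vaccine", True), ("vaccine", False),
          ("law", True), ("law", False)]
print(misinformation_rates(sample))  # -> {'vaccine': 0.666..., 'law': 0.5}
```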

Of the five search terms used, Rumble is more likely than not to recommend videos containing misinformation for three: “vaccine,” “election,” and “law.” Even for the other two, “democracy” and “coronavirus,” the likelihood of being recommended misleading videos remains high.

This data was tracked almost a year into the pandemic, after more than 3 million deaths worldwide had made it far more difficult to maintain that the virus is fake. It’s possible that searching for “coronavirus” on Rumble would have resulted in much more misinformation at the start of the pandemic.