YouTube will ban QAnon content that targets individuals


YouTube announced Thursday it was taking steps to ban content related to QAnon and other conspiracies that target individuals, just days after the CEO said the company could not commit to such a move.

“Today we’re further expanding both our hate and harassment policies to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence,” YouTube wrote in a blog post. “One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate.”

QAnon is a right-wing conspiracy movement whose adherents believe, among other things, that a group of powerful politicians is involved in a child sex-trafficking ring and that a deep state exists to thwart President Trump. In 2016, the related "pizzagate" conspiracy theory, a precursor to QAnon, led to a man opening fire in a Washington pizzeria.

On Monday, CEO Susan Wojcicki said an outright ban on QAnon content would be difficult since QAnon is not a single entity. “It’s not that we’re not looking at it. … I think if you look at QAnon, part of the challenge, part of it is that it’s a grassroots movement, and so you can see just lots and lots of different people who are uploading content that has different QAnon theories.”

While YouTube is not banning all content related to QAnon, the announcement noted that algorithmic adjustments to YouTube’s recommendations have led to an 80% drop in viewership for QAnon content. “Additionally, we’ve removed tens of thousands of QAnon videos and terminated hundreds of channels under our existing policies, particularly those that explicitly threaten violence or deny the existence of major violent events,” the blog post reads.

YouTube’s work focuses on three pillars: removing violative content, raising up authoritative content, and reducing the spread of borderline content. Much of QAnon’s content falls under the last category, but with the updated policy guidelines, more content will likely be removed for violating YouTube policies.

“We will begin enforcing this updated policy today, and will ramp up in the weeks to come,” YouTube wrote.

The announcement follows an update to YouTube’s misinformation policies on Wednesday. YouTube said that it would ban any content that spread misinformation about coronavirus vaccines or that encouraged people not to get the vaccine when it becomes available.

The video streaming company said Wednesday that “any content that includes claims about Covid-19 vaccinations that contradict expert consensus from local health authorities or the World Health Organization (WHO) will be removed from YouTube.”

The Washington Examiner reached out to YouTube for comment.
