Technology

Facebook bans QAnon entirely, says previous crackdown wasn’t enough



Supporters of President Donald Trump wearing “QAnon” T-shirts wait in line before a campaign rally at Freedom Hall on October 1, 2018, in Johnson City, Tennessee.

Facebook is banning the QAnon conspiracy-theorist group, expanding on a policy that previously led to the removal of over 1,500 QAnon-related pages and groups.

Starting in mid-August, Facebook removed QAnon pages and groups that contained discussions of potential violence. Today, Facebook said, “We believe these efforts need to be strengthened when addressing QAnon,” and the company expanded the ban so that it applies regardless of whether the pages and groups discuss violence.

Facebook wrote:

Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content. This is an update from the initial policy in August that removed Pages, Groups and Instagram accounts associated with QAnon when they discussed potential violence while imposing a series of restrictions to limit the reach of other Pages, Groups and Instagram accounts associated with the movement… We are starting to enforce this updated policy today and are removing content accordingly, but this work will take time and need to continue in the coming days and weeks.

Facebook said its earlier removal of 1,500 QAnon pages and groups was accompanied by the removal of another 6,500 pages and groups that were “tied to more than 300 Militarized Social Movements.” Facebook said it will continue to disable the profiles of admins who manage pages and groups that were removed for violating the policy.

New problems popped up after initial ban

Facebook said several problems led to today’s policy update:

For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real-world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public. Additionally, QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.

This may not be the last policy update, Facebook said. “We expect renewed attempts to evade our detection, both in behavior and content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary,” the company said.

Facebook said it has its “Dangerous Organizations Operations” team on the case. The group consists of specialists who “proactively detect content for removal,” a method that “has provided better leads in identifying new evolutions in violating content than sifting through user reports.”

Twitter has taken similar action against QAnon.

As NBC News writes, the QAnon conspiracy theory “posits that high-profile Democrats and Hollywood celebrities are members of a child-eating cabal that is being secretly taken down by President Donald Trump, and that members of this fictitious cabal will soon be marched to their execution.” QAnon accounts pushed the baseless claim that Democratic presidential nominee Joe Biden would secretly wear an earpiece in a debate against President Trump, and they also “pushed the conspiracy theory that Trump is not sick with the coronavirus, but carrying out secret missions in a fictitious war that has been predicted by QAnon followers,” NBC News wrote. QAnon accounts have also been spreading disinformation about COVID-19 in general.
