
Facebook's actions against QAnon are overdue and may not be enough, experts say


Conspiracy groups act 'like a parasite' by exploiting algorithms that favour popularity over truth

Experts welcomed Facebook's move to root out groups and accounts that espouse the baseless QAnon conspiracy theory but worry that it may be an uphill battle because of the theory's proliferation across social media. (Stephanie Keith/Getty Images)

Facebook's announcement this week that it will remove any groups and accounts affiliated with the baseless conspiracy theory QAnon is a welcome move, experts say, but may not be enough to address the damage already done.

"They've allowed their platform to be used for the spread of this incredibly poisonous conspiracy theory," Matthew McGregor, campaigns director of the British advocacy group Hope Not Hate, told CBC's Thomas Daigle.

"So it is welcome, but it is incredibly frustrating that, yet again, Facebook is actually so slowin taking action against hate on their platform."

QAnon followers promote an intertwined series of beliefs, based on anonymous web postings from a user identified as "Q," who claims to have insider knowledge of the Trump administration. A core tenet of the conspiracy theory, which has been amplified on Twitter, Facebook, Instagram and YouTube, is that U.S. President Donald Trump is secretly fighting a cabal of child-sex predators that includes prominent Democrats, Hollywood elites and "deep state" allies.

"QAnon supporters use coded language to try and kind of filter their real beliefs," McGregor said. "When it comes down to it, this is a poisonous, far-right conspiracy theory rooted in anti-Semitism. But when you see the content online, it's really about opposing child abuse and hashtags like 'Save Our Children.' Those are attempts to get around these bans and attempts to kind of suck people into the conspiracy."

Less than two months ago, Facebook said it would stop surfacing content from the group and its adherents, although it faltered with spotty enforcement. It said it would only remove QAnon groups if they promote violence. That is no longer the case under a broader policy the company started enforcing Tuesday aimed at rooting out all QAnon content.

The company cautioned that the effort "will take time and will continue in the coming days and weeks."

Barbara Perry, director of Ontario Tech University's Centre on Hate, Bias and Extremism in Oshawa, Ont., anticipates that it will be a challenging undertaking, especially when it comes to the more covert posts on Facebook.

"Those that are especially canny have been very careful in couching their language in ways that just fall short of the community standards or even the legal boundaries around hate speech or misinformation," Perry said.

"I think we need to continue to engage experts, both internal to the organization and external, to help with that interpretation, if you will, that translation of the emerging terminology so that they know which new posters which new phrases and terminology needs to be flagged."

WATCH | What is QAnon?:

QAnon: The pro-Trump conspiracy theory that's gaining traction

CBC News looks at its origins and how QAnon supporters could impact U.S. politics in the months ahead.

Unintended consequences

Ghayda Hassan, director of the Canadian Practitioners Network for Prevention of Radicalization and Extremist Violence (CPN-PREV), is wary of the commitment from companies like Facebook as well as the ripple effect from its announcement.

"Big tech companies have shown some interest so far, but to my belief, to my knowledge, have not engaged seriously inconversation and also in the efforts," saidHassan, a psychologist and professor at L'Universit du Qubec Montral (UQAM).

"We know that strong censorship may also produce backlash. There needs to be a global initiative. Many players have to embark and not just one.

"It's definitely not the only way to go and not enough."

Platforms like Facebook are built to surface content that is popular and new rather than factually correct, says Joan Donovan, research director at the Harvard Kennedy School's Shorenstein Center on Media, Politics and Public Policy. 'Truth is really at a disadvantage in this moment.' (Ted S. Warren, File/The Associated Press)

Joan Donovan, research director at Harvard Kennedy School's Shorenstein Center on Media, Politics and Public Policy in Cambridge, Mass., says Facebook's announcement was aimed at targeting the wider QAnon network.

"I think they realized that these groups are able to really spawn more and more pages if you don't remove practically the entire network at once," she said.

"These groups are highly motivated to stay on broad social media platforms. So they already have places in which they gather and they talk that are off of Facebook and Twitter and YouTube. But if they're going to recruit new folks or they're going to reach new audiences, they have to be in these more public places."

Taking hold amid the pandemic

Investigations have shown that social media recommendation algorithms can drive people who show an interest in conspiracy theories toward more material. A report by the Institute for Strategic Dialogue (ISD) in July found that the number of users engaging in discussion of QAnon on Twitter and Facebook has surged this year, with membership of QAnon groups on Facebook growing by 120 per cent in March.

In June, another ISD report identified more than 6,600 online pages, accounts or groups where Canadians were involved in spreading white supremacist, misogynistic or other radical views.

On Wednesday, Facebook Canada said it removed Radio-Québec, one of the province's most prominent QAnon advocates, as part of its new efforts.

WATCH | Facebook targets QAnon group Radio-Québec:

Quebec conspiracy theorist with links to QAnon removed from Facebook


The ongoing COVID-19 pandemic has presented an opportunity for these groups to take hold in mainstream online discourse, as people spend more time online, particularly on social media, and have fewer face-to-face conversations that allow more opportunity for some of the more outlandish theories to be directly confronted.

"We have some very significant information challenges. And unfortunately, the internet, the way that it's built, is built to surface things that are popular and fresh rather than are true and correct. And so truth is really at a disadvantage in this moment."

Normalization of conspiracies

Donovan says it's important to understand how conspiracy theories such as QAnon become normalized. At least one Republican candidate who espouses QAnon beliefs is on track to earn a seat in the U.S. House of Representatives: Marjorie Taylor Greene, who won a Georgia primary runoff in August for a heavily Republican congressional district.

"There are aspects of the QAnon conspiracy theory that haven't been normalized, pieces of it that are at its root and where it got started that are incredibly anti-Semitic, that haven't really broken through," Donovan said.

The U.S. House of Representatives voted to condemn QAnon last week, with Republican Rep. Denver Riggleman, who co-sponsored the resolution, saying the anti-Semitic posts on social media "should cause concern for everyone."

However, Donovan noted that other aspects of the conspiracy theory "have gripped people," particularly with regard to the unfounded allegations of pedophilia and sex trafficking. Recently, QAnon proponents organized protests against child trafficking and were involved in a pro-police demonstration in Portland, Ore.

"That's why we need platform companies to be thinking more strategically about what is the communication they want to support, what are the groups in the communities that they believe will benefit from using their products and how are they going to then moderate these other groups that are really acting almost like a parasite," Donovan said.

"They're attaching themselves to other groups and then over time really taking over the host."

LISTEN | What's needed to tackle conspiracies and extremism online:

Movements like the Boogaloo Bois and QAnon are now surfacing at rallies and public events

'Damaging impact' on presidential election

McGregor says the onus is on social media platforms to clamp down on groups and accounts that spread QAnon conspiracies, as they have "allowed this to grow to a point where damage has already been done and damage will continue to be done."

"To have done this three weeks out from the election rather than six months out from the election, it genuinely has made a very, very damaging impact on the election itself," he said.

Hassan, meanwhile, says a multi-pronged approach, including fact-checking and media literacy initiatives by news organizations and groups such as the Canadian non-profit Media Smarts, is critical to properly addressing the issue.

"I think actions should be at different levels,and at a global prevention level, we must multiply initiatives around critical literacy, around checking information before sharing," she said.

Barbara Perry, director of Ontario Tech University's Centre on Hate, Bias and Extremism, says the combination of disinformation surrounding the U.S. election and the COVID-19 pandemic has forced companies to take action to fight it. (Joseph Prezioso/AFP/Getty Images)

What other companies have done

Facebook's announcement comes after other social media companies previously announced efforts to weed out QAnon content.

A spokesperson for the short-form video app TikTok told Reuters that it has blocked dozens of QAnon hashtags, while a Reddit spokeswoman said the site has removed QAnon communities that repeatedly violated its rules since 2018, when it took down forums such as r/greatawakening. A YouTube spokesperson said it has removed tens of thousands of Q-related videos and terminated hundreds of Q-related channels for violating its rules since updating its hate speech policy in June 2019.

"The disinformation around the election, around COVID, there are a number of things coming together at the same time ... to finally sort of make them recognize the impact of not just hate speech but disinformation on community practice and community safety," Perry said.

Beyond social media, e-commerce site Etsy said it was removing all QAnon merchandise for purchase. CBC News reached out to Amazon and eBay representatives on Tuesday to ask whether they would do the same but did not receive a response.

"Companies are incentivized by the number of people that visit their platforms. They make most of their money from advertising," McGregor said. "And what they need is a counter to that the counter of consumers, other users, activist groups, politicians saying this is not acceptable.

"If they're not under pressure to act, the dollars will keep clicking up and the incentive really isn't there."

With files from CBC's Thomas Daigle, Reuters and The Associated Press