Facebook introduces tools to combat fake news - Action News


Facebook is taking new measures to curb the spread of fake news on its huge and influential social network, focusing on the "worst of the worst" offenders and partnering with outside fact-checkers to sort honest news reports from made-up stories that play to people's passions and preconceived notions.

Bogus stories are thought to have influenced people during 2016 presidential election

Facebook is taking steps to combat fake news. (Mark Rourke/Associated Press)


Fake news stories touch on a broad range of subjects, from unproven cancer cures to celebrity hoaxes and backyard Bigfoot sightings.

But fake political stories have drawn wide attention because of the possibility that they influenced public perceptions and could have swayed the U.S. presidential election.

There have been other dangerous real-world consequences. A fake story about a child sex ring at a Washington, D.C., pizza joint prompted a man to fire an assault rifle inside the restaurant, Comet Ping Pong.

"We do believe that we have an obligation to combat the spread of fake news," said JohnHegeman, vice-president of product management on news feed, in an interview. But he added thatFacebookalso takes its role to provide people with an open platform seriously, and that it is not the company's role to decide what is true or false.

3rd-party fact-checking

To start, Facebook is making it easier for users to report fake news when they see it, which they can now do in two steps, not three.

If enough people report a story as fake, Facebook will pass it to third-party fact-checking organizations that are part of the non-profit Poynter Institute's International Fact-Checking Network.

The fact-checking organizations Facebook is currently working with are ABC News, The Associated Press, FactCheck.org, Politifact and Snopes. Facebook says this group is likely to expand.

Stories that flunk the fact check won't be removed from Facebook. But they'll be publicly flagged as "disputed" by third-party fact-checkers, which will force them to appear lower down in people's news feeds.

Users can click on a link to learn why. And if people decide they want to share the story anyway, they can, but they'll get another warning that it has been disputed.

Some people believe that fake news influenced people in the 2016 presidential election. (Scott Olson/Getty Images)

By partnering with respected outside organizations and flagging, rather than removing, these disputed stories, Facebook is sidestepping some of the biggest concerns experts had raised about it exercising its considerable power in this area.

For example, some worried that Facebook might act as a censor, and not a skilful one at that, since it is an engineer-led company with little experience making complex media ethics decisions.

"They definitely don't have the expertise," said RobynCaplan, researcher at Data & Society, a non-profit research institute funded in part by Microsoft and the National Science Foundation. In an interview beforeFacebook'sannouncement, she urged the company to "engage media professionals and organizations that are working on these issues."

Responsibility

Facebook CEO Mark Zuckerberg has said that fake news constitutes less than one per cent of what's on Facebook, but critics say that figure is wildly misleading. For a site with nearly two billion users tapping out posts by the millisecond, even one per cent is a huge number, especially since the total includes everything posted on Facebook (photos, videos and daily updates) in addition to news articles.

"If you combined the top stories from the Boston Globe, Washington Post, Chicago Tribune, and LA Times, they still had only five per cent the viewership of an article from fake news." - Mike Caufield, Washington State University

In a study released Thursday, the Pew Research Center found that nearly a quarter of Americans say they have shared a made-up news story, either knowingly or unknowingly, realizing only later that it was fake.

Forty-five per cent said that the government, politicians and elected officials bear responsibility for preventing made-up stories from gaining attention. Forty-two per cent put this responsibility on social networking sites and search engines, and a similar percentage on the public itself.

Fake news stories can go viral more quickly than news stories from traditional sources. That's because they were created for sharing: they are clickable, often inflammatory, and pander to emotional responses.

Mike Caufield, director of blended and networked learning at Washington State University Vancouver, tracked whether real or fake news is more likely to be shared on Facebook.

He compared a made-up story from a fake outlet with articles in local newspapers. The fake story, headlined "FBI Agent Suspected In Hillary Leaks Found Dead In Apparent Murder-Suicide" and attributed to the non-existent Denver Guardian, was shared 1,000 times more than material from the real newspapers.

"To put this in perspective, if you combined the top stories from the Boston Globe, Washington Post, Chicago Tribune, and LA Times, they still had only five per cent theviewershipof an article from fake news," he wrote in ablogpost .

Facebook is emphasizing that it's only going after the most egregious fake news creators and sites, "the clear hoaxes spread by spammers for their own gain," writes Adam Mosseri, vice-president of product for Facebook's news feed, in a blog post.

Follow the money

The social network's first public step toward fixing the fake-news problem since the election was a statement barring fake-news sites from using its lucrative ad network. But it wasn't much more than rhetorical. Facebook's policies already blocked sites that spread misleading information from its ad network, an automated system that places ads on sites across the internet.

Now, Facebook says it has also eliminated the ability for spammers to masquerade as real news organizations by spoofing domains. And it says it's weighing a crackdown on publishers of fake news as well.

Depriving scammers of money could be effective.

"GoogleandFacebookare the single two biggest engines formonetization," said SusanBidel, a senior analyst atForresterResearch focusing on digital publishers. "I don't think you are ever going to completely eradicate it. But it could get down to a manageable level."

Algorithms de-emphasize fake news

Facebook's main approach to problems has been to tackle them by studying its vast troves of user data, with algorithms that can be more effective than humans at certain tasks, and to favour engineers over editors. Data rules all else at the Menlo Park, California, company.

Beyond the human fact-checkers, Facebook is also using its algorithms to de-emphasize fake news stories.

For example, if people are significantly less likely to share an article after they have read it, that's a "really good sign that the article was misleading or not informative in some way," Hegeman said. It's a bit like trying a cereal sample at the grocery store and then deciding not to buy it.

Fake news stories won't disappear from Facebook, not the way child pornography, spam and other illegal content do. That is not Facebook's goal.

"We believe providing more context can help people decide for themselves what to trust and what to share," Mosseriwrote.