Facebook is finally cracking down on anti-vaccination hoaxes

Facebook is finally making some moves to limit the spread of anti-vaccination posts and hoaxes.

Facebook said on Thursday that Pages and Groups that spread anti-vaccination content will no longer appear in search results, and that it will stop recommending them for users to join.

Promoting anti-vaccination content is also now against Facebook’s rules, so ads that peddle “misinformation about vaccinations” won’t be allowed. Facebook also offered specific ad-targeting options, such as “vaccine controversies,” that let advertisers home in on users the social network thought might be interested in those hoaxes. That targeting has been removed.

Facebook will also proactively hunt for this kind of material. Previously, Facebook would limit a post’s reach only after a user flagged a vaccination hoax and the company’s fact-checkers reviewed it. Now, a company spokesperson said, Facebook will limit the spread of well-known hoaxes that have been identified by the World Health Organization and the US Centers for Disease Control and Prevention even before a fact-checker looks at them.

That’s positive news. Anti-vaccination hoaxes are not new, though they’ve been top of mind lately thanks to a recent measles outbreak in Washington state, with 71 confirmed cases so far. That outbreak has renewed concerns that misinformation on social media is driving parents to refuse to vaccinate their kids.

The most popular hoaxes, like the idea that vaccinations can lead to autism, have been resoundingly disproven by numerous large studies and refuted by the world’s top health organizations. That hasn’t stopped those hoaxes from spreading rampantly online through services like Facebook and YouTube, where misinformation can easily go viral.

You might be wondering: Why did it take Facebook until March 2019 to start taking aggressive action against anti-vaccination content?

The answer lies in Facebook’s rules on misinformation. Facebook doesn’t want to decide what’s true and what’s false, so sharing something false (so-called fake news) doesn’t by itself violate the company’s terms of service. If a post is reported by a user and a fact-checker confirms it’s false, Facebook will limit its distribution, but it won’t remove the post.

That can be problematic if, for example, the fake news in question could lead to serious real-world violence or harm. That’s been a real challenge in countries like India and Myanmar, where fake news on Facebook and WhatsApp has led to violence and deaths. Last summer, Facebook started to take a stricter stance on content in that category, in some cases removing it altogether if it looked likely to cause imminent harm.

Anti-vaccination content, though, appears to fall somewhere in the middle. Unlike with standard, run-of-the-mill fake news, Facebook is taking extra steps to fight it. It is not, however, removing anti-vaccination content altogether.

Facebook isn’t the first company to make changes to fight anti-vaccination content. YouTube stopped showing ads alongside anti-vaccination videos last month. Pinterest, meanwhile, stopped showing search results for vaccination queries.
