Vaccine misinformation and Infowars: Researchers wary of Facebook’s embrace of ‘Groups’

Two years ago, Facebook CEO Mark Zuckerberg announced a renewed focus on communities and the company’s Groups feature, boasting that it would help create a more “meaningful” social infrastructure.

It worked. Last month, Zuckerberg told investors that “hundreds of millions” of users reported belonging to “meaningful groups,” up from 100 million in 2017, when the company began focusing on Groups.

“Connecting with communities of people that you’re interested in is going to be as central to the experience as connecting with friends and family,” he told investors last month.

But that shift has also moved Facebook activity out of public view, leaving researchers to warn that they now know less about what is happening on the social network and how the company’s algorithm-driven recommendations are funneling people to fringe communities and misinformation.

“As a researcher, we can get content but content isn’t what you need to research Facebook and see how these groups work,” said Jonathan Albright, director of research at Columbia University’s Tow Center for Digital Journalism. “We need to know more about the networks and members of groups that spread false information and target individuals.”

“It would take a lot of work and more of a partnership than Facebook is willing to establish, especially for this kind of work,” Albright said.

Groups gained public attention this week when The Guardian uncovered a vast network of groups spreading false information about vaccines. “STOP Mandatory Vaccination,” one of the largest anti-vaccine private groups at more than 126,000 members, is led by Larry Cook, a self-described “social media activist” with no children or medical training. Cook and his members promote the disproven theory that vaccines cause autism and spread conspiracies that outbreaks of preventable diseases are “hoaxes” perpetrated by the government.

In response to the negative press, Facebook told Bloomberg News it was considering “reducing or removing this type of content” from the recommendations it offers users.

Those recommendations have also come under increased scrutiny. Part of Facebook’s plan for growth has relied on its recommendation engine, the algorithm the company uses to steer users toward joining more groups.

The algorithm that powers the right rail of “suggested groups” isn’t public, and a Facebook spokesperson declined to give details other than saying they are tailored to individual users. But researchers like Renée DiResta, who studies online disinformation as director of research at the cybersecurity company New Knowledge, have found that Facebook’s group recommendations can send users down a rabbit hole of increasingly misinformed, conspiratorial and radical communities.

Zuckerberg has acknowledged the significance of the recommendation engine, writing in 2017, “Most don’t seek out groups on their own—friends send invites or Facebook suggests them.”

Facebook is not the only tech company to face criticism for developing automated recommendations that push users toward misinformation. YouTube, a subsidiary of Alphabet’s Google, recently announced it would change its own recommendation engine to stop suggesting conspiracy videos, after years of similar outcry from researchers and former employees who argued that the company kept people watching by suggesting increasingly extreme content.

Facebook doesn’t appear to have any plans to follow suit. A Facebook spokesperson pointed NBC News to the option it provides users to hide or remove the groups that Facebook suggests.

Meanwhile, uneven enforcement across the platform means a person or page may be disciplined or banned while a related group continues to operate. Facebook does not appear to be enforcing its 2018 ban on the conspiracy news site Infowars in its Groups feature, highlighting how communities can skirt Facebook’s broader efforts to clean up its platform.

Facebook updated its enforcement policy last week, closing a loophole that had allowed Alex Jones, owner of Infowars, to operate pages on the platform despite the ban. In doing so, it shut down 22 separate Infowars pages. But in a private Facebook group of more than 117,000 members, administered by Jones and several of his employees, Infowars lives on.

The group describes itself as “an alternative media source for those who are awake!” Members share and comment on far-right, anti-Muslim and conspiracy content, much of it from Infowars’ website, while Jones and the group’s other moderators and administrators keep members informed of alternate pages and groups set up to evade the company’s original ban and promote the supplements and prepper gear sold in the Infowars store.

When asked about the Infowars group, a Facebook spokesperson noted that the updated policy applies across Facebook features and products including Pages, Groups, Events and Instagram, and said the company had only begun to enforce it. At the rollout of the updated policy, Facebook said it would look at two criteria to decide which groups to purge: whether groups have the same admins or similar names as those that had been removed, benchmarks the Infowars group appears to meet.

“The recent removals are a first step, and we will continue to expand in the months ahead,” the Facebook spokesperson said in a statement.

Meanwhile, inside its private community, the Infowars group’s description reads: “This group is doing great.”

