MENLO PARK, Calif. — Sandwiched between Building 20 and Building 21 in the heart of Facebook’s campus, an approximately 25-foot by 35-foot conference room is under construction.
Thick cords of blue wiring hang from the ceiling, ready to be attached to window-size computer monitors on 16 desks. On one wall, a half dozen televisions will be tuned to CNN, MSNBC, Fox News and other major cable networks. A small paper sign with orange lettering taped to the glass door describes what’s being built: “War Room.”
Although it is not much to look at now, as of next week the space will be Facebook’s headquarters for safeguarding elections. More than 300 people across the company are working on the initiative, but the War Room will house a smaller team of about 20 people focused on rooting out disinformation, monitoring false news and deleting fake accounts that may be trying to influence voters before upcoming elections in the United States, Brazil and other countries.
“We see this as probably the biggest companywide reorientation since our shift from desktops to mobile phones,” said Samidh Chakrabarti, who leads Facebook’s elections and civic engagement team. The company, he added, “has mobilized to make this happen.”
The misuse of Facebook by those involved in foreign influence campaigns has been rampant. In July and August, the company detailed previously undisclosed efforts by Iranians and Russians to mislead users of the social network through divisive ads and posts. Now, with the midterm elections in the United States just seven weeks away, Facebook is in an all-out sprint to convince the world that it is ready to handle any new attempts at such meddling. The company is under tremendous pressure to prevent a repeat of the foreign manipulation that unfolded on the social network during the 2016 presidential campaign.
Mark Zuckerberg, Facebook’s chief executive, has vowed to fix the problems, and he said this month that the company was “better prepared” to handle potential interference. But he has acknowledged that Facebook is in an “arms race” against those who are trying to manipulate the platform. The company has taken steps to build defenses against spammers, hackers and foreign operatives — including hiring thousands of people to help moderate content and starting an archive to catalog all political ads — but the War Room’s half-finished state shows how nascent and hurried many of the efforts are.
Facebook invited two New York Times reporters into the War Room before it opens next week to discuss the elections team’s work and some of the tools it has developed to try to prevent interference. The company limited the scope of what The Times could see and publish out of concern over revealing too much to adversaries who may be looking for vulnerabilities. The company said the War Room was modeled after similar operations used by political campaigns, which are typically set up in the final weeks before Election Day.
Mr. Chakrabarti, who joined Facebook about four years ago from Google, said one of the new tools the company is introducing is custom-designed software that helps track information flowing across the social network in real time. These dashboards resemble a set of line and bar graphs with statistics that provide a view into how activity on the platform is changing. They allow employees to zero in on, say, a specific false news story in wide circulation or a spike in automated accounts being created in a particular geographic area.
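Facebook has not described how these dashboards flag anomalies, but the general idea of surfacing a sudden spike in account creation can be sketched with a rolling-baseline z-score over a stream of counts. The example below is a minimal, hypothetical illustration: the function name, thresholds and data are assumptions for the sake of the sketch, not Facebook’s actual tooling.

```python
from collections import deque
from statistics import mean, stdev

def detect_spikes(counts, window=24, threshold=3.0):
    """Flag hours where a count far exceeds the trailing baseline.

    counts: hourly new-account counts for one region (hypothetical data).
    window: number of trailing hours used as the baseline.
    threshold: how many standard deviations above the baseline counts as a spike.
    """
    baseline = deque(maxlen=window)
    spikes = []
    for hour, count in enumerate(counts):
        if len(baseline) == window:
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and (count - mu) / sigma > threshold:
                spikes.append((hour, count))
        baseline.append(count)  # update the rolling baseline
    return spikes

# A steady baseline of roughly 100 sign-ups per hour, then a sudden burst.
hourly_signups = [100, 97, 103, 99, 101, 98, 102, 100] * 3 + [950]
print(detect_spikes(hourly_signups))  # -> [(24, 950)]
```

In practice, a monitor like this would presumably run per region and per signal, feeding charts like the line and bar graphs Mr. Chakrabarti described, with analysts deciding whether a flagged spike is benign growth or coordinated activity.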
The dashboards were first tested ahead of the special United States Senate election in Alabama in December. Since then, Facebook has continued to build out the systems, testing and redesigning the software during multiple elections worldwide. This month, before Brazil’s presidential election, the company will introduce the newest versions of the dashboards, Mr. Chakrabarti said.
Facebook created what it calls its elections and civic engagement team in 2007 to work with governments and campaigns on how they could use the social network most effectively. For a long time, the team numbered just a few dozen people in Silicon Valley; it expanded in 2013 to include members in offices outside the United States.
After Facebook disclosed that agents linked to the Kremlin had manipulated the social network to spread inflammatory messages to American voters during the 2016 election, the company began to increase the team’s ranks. The group was also restructured to focus more on the security of elections.
Since then, the team has mushroomed to its current size, augmented by other people at the company whose jobs involve some aspect of stopping election interference. Facebook has said that each of its units — including Instagram and WhatsApp — has been told to make election security a top priority when designing new products.
Mr. Chakrabarti meets several times a month with Facebook’s top executives, engineers and product managers. The meetings often include Mr. Zuckerberg and Sheryl Sandberg, the company’s chief operating officer.
Facebook decided to create a War Room dedicated to election-related work earlier this year so that a core group of engineers, data scientists, and executives could sit together in the same physical space before the midterms. They chose an empty conference room off the hallway that connects Building 20 and Building 21, a central point on Facebook’s campus that is easy for employees to get to.
Construction began a few months ago and the room, with its whiteboard walls and clusters of long tables, is set to open for operations on Monday. It has been refitted with cables and internet boosters, and new wiring was installed for the monitors and other equipment.
Mr. Chakrabarti said that what happens in the War Room will be a “last line of defense” for Facebook engineers to quickly spot unforeseen problems on and near Election Days in different countries. Many of the company’s other measures are meant to stop disinformation and other problems long before they show up in the War Room.
Once a problem reaches the War Room, the dashboards will be set to spot and track unusual activity, while data scientists and security experts take a closer look. Mr. Chakrabarti said the team was particularly on guard for posts that could cause “real-world harm,” and planned to actively remove posts that try to disenfranchise voters by giving incorrect polling information or spreading hoaxes like encouraging people to vote by text message.
“The best outcome for us is that nothing happens in the War Room,” he said. “Everything else we are doing is defenses we are putting down to stop this in the first place.”