Facebook is not a part of the government. That means, unlike an American government body that has to abide by the First Amendment to the U.S. Constitution, it can kick off users who violate its rules.
However, says First Amendment scholar Jameel Jaffer, we should have a discussion about that power and whether Facebook should be able to decide who gets to speak.
“The First Amendment is concerned principally with government power, but we resisted the centralization of control over the public square in the government because we didn’t like the idea of centralization of that kind of power,” Jaffer said on the latest episode of Recode Decode with Kara Swisher. “Maybe we should resist the idea of centralizing power in the social media companies for the same reason.”
As the executive director of Columbia University’s Knight First Amendment Institute, Jaffer has sued President Trump for blocking people on Twitter and called on Facebook to lift the restrictions it has placed on journalists’ and academics’ access to its platform. And although he doesn’t believe requiring Facebook to carry everyone’s speech is a workable solution, the “privatization of the public square” has made free speech a thorny issue online.
“Facebook has its own First Amendment rights here,” Jaffer said. “It expresses them by ejecting Alex Jones from the platform. I think none of that would raise difficult questions if it weren’t for Facebook’s scale. It’s the fact that Facebook is so big and that Facebook arguably controls the public square or arguably controls a large segment of the public square.”
“That’s when I think free speech advocates start to get nervous about Facebook excluding people from the platform, especially when there’s an argument that they’re excluding people on the basis of viewpoint,” he added. “You can think whatever you want to about Alex Jones, but I worry not about Alex Jones, but about the next person or the next year. Who is it that Facebook is going to be excluding next year?”
You can listen to Recode Decode wherever you get your podcasts, including Apple Podcasts, Spotify, Google Podcasts, Pocket Casts and Overcast.
Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Jameel.
Kara Swisher: Hi, I’m Kara Swisher, editor at large of Recode. You may know me as someone who loves the First Amendment, but number 19 is pretty good, too. But in my spare time I talk tech, and you’re listening to Recode Decode from the Vox Media Podcast Network.
Today in the red chair is Jameel Jaffer, the executive director of the Knight First Amendment Institute. It’s a group at Columbia University that defends the freedoms of speech and press in the digital age. And just recently, they’ve challenged the way Facebook deals with journalists and scholars, and they’re also bothering Donald Trump.
Jameel, welcome to Recode Decode.
Jameel Jaffer: Thanks very much. I really appreciate the invitation.
So, I love all the things you’re doing. And that’s the reason I wanted to have you on here, because there’s so much interesting stuff you’re doing there. So, let’s talk about how you got there. How did you get there? And how did the Knight … explain what the Knight First Amendment Institute is. But, you first.
Sure. Well, okay. So, I was a lawyer at the ACLU for 15 years.
I’ve heard of them.
Yeah, just down the street, actually.
I’m teasing.
When this building was Goldman Sachs, I was at the ACLU.
This building was Goldman Sachs?
It was Goldman Sachs, yeah, just until a couple of years ago.
Oh, man.
But, I worked at the ACLU starting in 2002, so, soon after 9/11, and, mainly at the beginning, on national security-related cases. Cases involving immigration detainees, cases involving Guantanamo, and then cases involving the Patriot Act and government surveillance and First Amendment-related questions. And that practice grew, and it eventually became a formal project of the ACLU focused on national security issues. And I ran that project for a few years.
And then, when I left the ACLU a couple years ago, I was the director of something called the Center for Democracy, which covered the national ACLU’s work on free speech, privacy, national security, technology and international human rights. Obviously I was not doing all that work myself, there were a lot of people at the ACLU that I was working with on it.
But, I was doing that. And then, the Knight First Amendment Institute was created at Columbia. It was really the project of Lee Bollinger, who is the president of Columbia, and Alberto Ibarguen, who’s the president of the Knight Foundation. And they had had conversations over many years about the possibility of setting up something like this. And the main insight that they had was that the big Supreme Court precedents from the 1960s and ’70s involving free speech were precedents created in an era that looked very, very different from the one we’re in right now, where the threats to the First Amendment were quite different from the ones we’re facing right now. And all of the things that raise complicated free speech questions right now, like the privatization of the public square, or …
That’s a very good way of putting it.
New surveillance technology, search engines. None of these things existed back when the Supreme Court decided the Pentagon Papers case or New York Times v. Sullivan, or the other big hits from the ’60s and ’70s.
So, they thought, “We need an institute that will focus on the edge of technology and the edge of the law, and these questions that new technology is presenting for the First Amendment.”
So many questions.
Yeah.
What interested you in the First Amendment? In doing this as a lawyer?
Well, I became a First Amendment lawyer almost incidentally. As I said, I was kind of a national security lawyer. But a lot of the national security cases I worked on turned out to be First Amendment cases. I worked on a lot of transparency cases. In fact, the case I probably spent the most time on at the ACLU was a Freedom of Information Act case involving interrogation under the Bush administration. That case, which we filed in 2004, is still running today.
You were trying to get information …
Yeah, we were trying to get documents about the treatment of prisoners in military and CIA custody. The case was, as these kinds of cases go, a very successful one. It resulted in the release of what are sometimes called the “torture memos,” the Bush administration’s torture memos. A lot of information about maltreatment of prisoners at Guantanamo … eventually, the CIA Inspector General report that led to a criminal investigation by the Obama administration. So, it was a relatively successful case. That’s the one I spent most of my time on.
But I did a lot of other cases involving the exclusion of foreign scholars from the United States because of their political views. A lot of cases involving government surveillance, especially post-Snowden. We represented Snowden when I was at the ACLU and we also brought a whole series of cases challenging the constitutionality of some of the surveillance programs that he disclosed or helped disclose. And I argued some of those cases just down the block from here in the Second Circuit. And one of them went to the Supreme Court in 2012.
We’re still, at the Knight Institute, working on one of those cases with the ACLU, where we represent Wikimedia in a challenge to a particular kind of NSA surveillance. And these turn out to be privacy cases, but also First Amendment cases.
Yes, hand in hand. Yeah.
And, as you know, national security has always been the kind of crucible of the First Amendment. And so, being a national security lawyer can turn you into a First Amendment lawyer, so over time I became a First Amendment lawyer too. The opportunity to help start something new at Columbia with the resources of Columbia and the Knight Foundation — and with people like Lee Bollinger and Alberto Ibarguen and Steve Coll and Nick Lemann from the Journalism School involved — that was a very exciting thing.
So, what is your charge? What are you supposed to do? You’re supposed to study these and do what?
Yeah, so the idea is to …
Have a lot of meetings?
Well, we try to limit the meetings. But, the idea is to defend …
Like, colloquia?
Colloquia, we haven’t yet done a colloquium. We should look into that.
Why not?
No, we have a research program and we have a litigation program. Those are the two main components of the institute. They do what you would expect them to. The research program is an effort to understand these threats, to understand what opportunities and what threats are presented by new technology. And the litigation program is our way of defending free speech and the freedom of the press in relation to this new technology.
Maybe that already sounds complicated, but it’s actually more complicated than it sounds because, especially now, there’s a lot of disagreement about what the First Amendment means. So, it’s great to defend the First Amendment, but first you have to decide, well, what is this thing that you’re defending?
And there’s a real debate about … I don’t know how closely you follow Supreme Court decisions relating to the First Amendment, but this recent case involving labor unions in which Justice Kagan dissented and she accused the majority of “weaponizing” the First Amendment …
I love that, I use that term myself.
Yeah, so, Justice Kagan isn’t the only one who’s used those kinds of phrases in relation to some First Amendment arguments. There’s this important question of what values the First Amendment is meant to protect. And so, the research program is our effort to struggle with those questions.
So, what is the research? Give examples, and we’ll get into the specific things you’ve recently been in the news for.
Sure, yeah. One example: We have an essay series called Emerging Threats, where we’re focused on emerging threats to the First Amendment or to the system of free expression. The first one we published was a paper by Tim Wu called “Is the First Amendment Obsolete?” He was thinking about new threats to the First Amendment. Like, instead of censorship, flooding the information space. Or, instead of …
Seems to work really well.
Yeah, apparently it does. Instead of having your henchmen go to somebody’s door and threaten them, you just harass them on social media, right? So, there are these kinds of threats to the First Amendment that the First Amendment isn’t really well-suited, at least in its current form, to address. Tim’s paper was about that, and that was the first one we published.
We published another paper by Heather Whitney about search engines and the proper analogy for them. Should we think of them as editors, or as something more like shopping malls, or as something else? That was another one in that series.
Recently, we published one by Jack Goldsmith, who’s a former Bush administration lawyer, on what he calls the failure of internet freedom. Because in his view, the internet freedom agenda that the United States put forward, especially during the Clinton administration, has been a spectacular failure and we now see that domestically as well as internationally. So, that’s an essay series that we’re overseeing.
We also did, not quite a colloquium, but a symposium …
All right.
… a few months ago with the Columbia Law Review on the First Amendment and inequality. The hope is that this research will inform our litigation decisions — and it already has to some extent — but I think the research will become even more important over time. The hope is that there’s a relationship between the two parts.
Right. So, now the litigation part.
Yeah.
So talk about the two things, the ones that go into that.
Yeah, yeah, sure. So, we now have a fairly … We’re only two years old, but we have a fairly …
How much money do you have to do all this, if you could say?
We could use more, if you’re offering.
No, I’m not.
Yeah, we were established with an operational commitment from the Knight Foundation and Columbia, $5 million from each of those over five years. So, we had this sort of generous base funding to start with.
Yeah, that is.
And, we have since been able to raise some money from a whole set of organizations across the political spectrum. The Democracy Fund, which is associated with Pierre Omidyar.
Sounds like a Pierre thing.
Yeah, the Koch Foundation. The Charles Koch Foundation. Open Society for …
What, do you have to take one from each side?
Carnegie. Well, we try to have a diverse … Well, that’s the goal.
As long as they’re not meddling with you, I don’t care.
They’re not meddling. To the contrary, they have been generous supporters of the work that we want to do.
Yeah, as long as they keep their dirty paws out, I’m good.
Yeah, yeah.
Not like an Eric Schmidt kind of thing down over at the New …
Well, it turns out to be a relatively good time to raise money for this kind of project. But at the same time, there’s a lot of work to do.
All right, so, talk about the litigation stuff that you’re doing.
One of the first cases that we filed, probably the case that’s gotten the most attention so far, is a challenge to President Trump’s practice of blocking critics on Twitter.
From his personal thing or his …?
From @realDonaldTrump. He characterizes it as his personal account, but our complaint with that characterization is that he uses the account for official purposes. So, for example, to announce the appointment of people to government posts.
He does, yeah.
Or to engage in international diplomacy, if that’s the right phrase. Or to defend …
It’s called international trolling, but go ahead.
International trolling, right, right. Or to defend or describe government policies. You know, for all sorts of official purposes. And if you go to his …
It’s not like, “You should go watch this game.” This is about baseball.
No, it’s not. It used to be that way, you know, before he became president. But once he became president he started using it almost entirely for these …
I don’t think he gets it, but go ahead. I don’t think he knows how to shift between them, but all right.
I think that’s probably true, but most of …
But, he’s using it that way.
He’s using it that way. And when people criticize him or mock him for his decisions or for his statements about his decisions, sometimes he blocks them.
Right. Or Dan Scavino does, whoever’s doing it.
Or Dan Scavino does, yeah, we happen to know it was President Trump with respect to our clients because in the litigation, the government has disclosed that.
Yeah. Can’t you see him there, jabbing his little fingers? Like, “Eehh, this guy …” I can see it.
Apparently, he does it personally.
Yeah.
He personally blocks them out of, I guess, pique.
Yeah. Well, that’s how you block people, just so you know.
No, I guess that’s true.
He shouldn’t be unfairly …
He’s not special in that particular respect, that’s right.
He’s not special. I’ve done it. I’m like, “You, gone!”
Right, right. But you’re not a government official.
No, I’m not, but I’m just saying. I’m going to give him a break on that one.
There’s one way to look at this, [that] this is a trivial thing. He’s blocking people on Twitter, is it the end of the world? And, obviously, if you look at it that way, it’s not.
There’s a principle at stake here.
Well, it’s not just a principle at stake.
No, there is. There is actually a principle at stake, sorry.
Well, there is a principle at stake. I’m saying there’s not only a principle …
Right, okay.
This is the way that public officials engage with their constituents now. It’s not just President Trump; public officials all over the country engage with their constituents, maybe principally, through social media. And if you create a rule that allows public officials to block from their social media accounts anyone who criticizes them, you’re going to have a pretty dramatic effect on public discourse.
Right. Absolutely. I’m good with this lawsuit.
Yeah, so we took on the lawsuit … not everybody was convinced, especially when we initially brought it, when we brought the lawsuit …
Did they think it was frivolous, or …?
Yeah, we made the argument that the president’s Twitter account should be thought of as a public forum under the First Amendment, with the consequence that if a public official blocks somebody from that forum, it’s unconstitutional. It violates the First Amendment. I think it took people a little bit of time to come around to that view.
But now, my impression is that most First Amendment advocates and scholars are on our side. More importantly, the district court is on our side and issued a ruling a couple of months ago in our favor, holding that this practice of blocking people on the basis of viewpoint is unconstitutional. The Trump administration has now unblocked our clients and dozens of other people in response to that ruling. It is also appealing the ruling, though, so we’ll be in the Second Circuit.
And then where does it go to? Would this go to the Supreme Court?
You know, my hope was that they wouldn’t appeal it, but they’re appealing to the Second Circuit. If we win in the Second Circuit …
And their argument being what? He can block anyone he damn well pleases?
Essentially, yes. Yes.
Yeah.
The argument is that, notwithstanding the fact that he uses it in the ways that he uses it, notwithstanding the fact that if you go to his profile page, the account is said to belong to the President of the United States and there’s a big photograph of Air Force One on the page, notwithstanding all of that, their argument is that it’s a personal account.
So, where do you imagine this going?
Well, it’s already had a pretty significant effect around the country, which is very gratifying to see. Other public officials who have adopted this practice of blocking their critics from their social media accounts …
Oh, he doesn’t have an Air Force One now, he has a rally.
Oh, now it’s a rally. Yeah, it changes every few days.
Okay, yeah.
But it’s usually something involving his official work.
There are public officials all over the country who have adopted this practice of blocking critics on social media, and now people are writing to them citing, principally, this decision that we got from the Southern District of New York here. It’s very gratifying to see public officials, Democrats and Republicans, reconsidering their social media policies in response to this litigation.
Right. And so, it should end there that they can’t do that.
It should.
Recently you’ve been doing some things around Facebook. Can you tell people about it?
Yeah, sure. As everybody knows — or should know — Facebook has terms of service that restrict how users can use its platform. The terms of service, in many respects, make sense, but one consequence of the terms of service is that journalists and researchers who want to study Facebook’s platform are impeded from doing so. They’re impeded from doing so because Facebook bars them from using some basic digital tools.
For example, scraping — collecting by automated means — information from the platform, or using temporary research accounts to prompt the platform, to figure out how the platform will respond to certain kinds of prompts. And there are many journalists and researchers who study the platform who’ve been able to do a lot of good work in spite of those restrictions, but these restrictions are increasingly limiting their ability to study not just Facebook but other platforms as well, which have similar restrictions.
And these aren’t people who just wanna do something for business reasons. Explain: These are people who are doing research into understanding how the platform works.
Yeah, basically, a whole lot of what we know about how Facebook works and how Facebook affects the world, we know because of the work that digital journalists and researchers have done. And obviously Facebook and other platforms now have a huge, if not fully understood, effect on public discourse, not just here in the United States but around the world.
So the Cambridge Analytica story, for example, is a result of work done by the Guardian. Julia Angwin has done a lot of work about discriminatory ads on Facebook, and Kashmir Hill has written about the People You May Know tool. All of this journalism has told us a lot more about how these platforms work.
How the platform is working or not working, mostly not working.
Yeah, working or not working and shaping public discourse and thereby shaping our society. Some of these things are discriminatory ads. I think Julia was focused principally on the effects domestically, but some of the effects of the social media platforms are very significant internationally. The New York Times has written about ethnic violence in Myanmar or ethnic violence in Sri Lanka.
Everybody has written.
Right. This is reporting that is especially crucial right now because nobody — and certainly not the platforms — nobody fully understands the implications of the decisions that the platforms are making.
People are chipping away at this stuff, and they were blocking it, and so you want …
They wanna study the platform, and there are these restrictions that prevent them from using the tools that would be most useful to them. The restrictions generally make sense. It’s entirely understandable why Facebook wants to limit this.
Yes, they’re worried about abuse.
Yeah, they’re worried that people will abuse the privacy of their users.
Good instinct.
Yeah, entirely good instinct. Especially right now, in the wake of the Cambridge Analytica scandal, Facebook and Twitter and a lot of platforms are under a lot of pressure to crack down and ensure that that kind of abuse doesn’t occur. All that is understandable.
But one effect of those general prohibitions is to prevent journalists and researchers from doing work that we really need them to do right now. So we’ve gone to Facebook and we’ve said, “Will you create a safe harbor? Amend your terms of service to make it possible for journalists and researchers to use these tools, to scrape public information from the platform.” The safe harbor is focused only on public information, information that users decide to make public.
We’ve said, “Can you create a safe harbor so that journalists and researchers can scrape that kind of information from the platform or so that they can use temporary research accounts to see how the platform responds to different kinds of profiles? And can you assure them that you won’t invoke the terms of service against them if they take on public interest projects?”
That may be at cross purposes, too.
You mean that Facebook …
Yeah, it would be negative towards Facebook.
Well, it could be. Some of it could be negative towards Facebook. I mean certainly, the Cambridge Analytica story was negative towards Facebook, in a sense. Although once …
It was pointing out a glaring hole.
It’s protecting Facebook’s users, right?
Yeah.
That story protects Facebook’s users, and Facebook changed its policy.
Same thing with discriminatory advertising.
Right. Facebook has changed its policies in response to the Guardian’s reporting around Cambridge Analytica, in response to ProPublica’s reporting around discriminatory advertising, so yes, some of it could be embarrassing to Facebook, but some of it could also be very useful to Facebook. More importantly …
Make it a better place.
Crucial for its users.
Right.
Yeah, so that’s the idea.
Where are you on this?
We wrote to them about a month ago. We asked them to respond by just after Labor Day. They responded very graciously, and we’ve been in a conversation with them since. I think it’s too early to say whether that conversation’s gonna lead anywhere, but we’re glad to be in the conversation.
It’s a loosening of rules for specific research projects.
Yeah, to be honest, it’s a kinda hard balance to strike here, because again, we’re very sympathetic to the privacy interests.
Let’s just be clear for listeners who don’t know: The Cambridge Analytica thing started with a university researcher doing it.
That’s right, so you don’t wanna create a situation where …
He did it and handed it over to them.
… just because it’s a researcher it’s okay. You can’t create that kind of situation. The safe harbor that we drafted, again, it’s focused only on public information, and then it requires journalists and researchers to observe certain safeguards or limitations.
So for example, you have to protect Facebook’s users’ privacy. You have to take measures to make sure that the information you collect isn’t gonna be inadvertently disclosed. You can’t transfer it to a third party. You can’t transfer it, for example, to a data aggregator, or to any other commercial enterprise. You can use it only to inform the public about matters of public concern.
And obviously, there’s gonna be disagreement about the meaning of some of these terms, and Facebook’s gonna have to flesh it out over time. Facebook would have to decide over time which projects it was willing to allow and which ones it was gonna shut down. But in our view, that’s a better situation than we’re in right now, where Facebook has effectively categorically prohibited all of this journalism and research from taking place.
Right, on an open platform presumably, a platform that aspires to be open. These are the kinds of things. Are you doing anything around Twitter? Or things like that?
You mean aside from our lawsuit against President Trump?
Yeah.
Which is not against Twitter. It’s about Twitter, but not against Twitter.
On the research side, we are just now launching a project focused on regulation of the social media platforms.
Oh good.
Jamal Greene is a Columbia law professor who is now gonna be visiting at the Knight Institute. He’s not visiting from very far away, but he’s visiting the Knight Institute for the next year. He’s a constitutional scholar there, and he’s gonna focus on …
What they should do.
Yeah, what they should do. We have been commissioning papers on this question.
He’s gonna be my new best friend.
Well, you know …
I’d like to know.
Nobody … They’re hard questions. We’ve been thinking through them ourselves, but the point of this visiting scholars program is to bring in other people, not just from the academy but ideally from the companies as well.
But, dealing with this: What’s gonna be really important is how they’re going to be regulated and how it’s gonna manifest itself. Because I think the minute the Democrats get back, if they get back, there’s gonna be some reckoning.
Yeah, I think that’s probably right. If I were at one of the companies, I would be working very hard on …
Republicans winning.
Self-regulation in order to …
Republicans winning.
I’m not sure about that.
I’m just saying.
You know, there’s a lot that the companies can do on their own that would help them if there were a more serious debate about regulation. Low-hanging fruit would be transparency, more transparency about what kinds of decisions they’re making, what the effects of those decisions are.
I just wrote a column in the New York Times today saying that transparency …
I know. I saw it.
I’d just like some transparency.
I think that now there seems to be … broad agreement may be overstating it, but broader agreement than there was a year ago, that the social media companies should be more transparent about which accounts they’re taking down and why they’re taking them down.
How they made the decision.
There are these Santa Clara Principles that some advocacy groups have put together that have to do with providing notice to people who are affected by those decisions, giving them an opportunity to appeal, disclosure of statistics about how many of these decisions are being made, and how many accounts are taken down.
And how they’re making them.
Yeah. But one observation about the debate so far is that the public debate (I think the academic debate and the debate amongst technologists is a deeper one than this) has focused on these very spectacular censorship decisions, spectacular in the sense that they get tons of attention, like Alex Jones, right?
Right.
There are hundreds or thousands of these decisions made every day.
Yes, there are.
That aren’t affecting people like Alex Jones. They’re affecting …
And also made in seconds.
People who are on the margins. Yeah, made in seconds, or made by machines. And I do worry that the debate’s focus on Alex Jones gives people the impression that we can just say the voices we really don’t like will be kept off the platforms and everything else will be the same. I do worry about marginalized voices that aren’t noticed in the way that Alex Jones is, whose accounts are taken down or whose posts are taken down. The debate should encompass the …
And how they do these things.
And how they do these things. That’s one part of it. But the other part, focusing just on the Alex Jones-style cases, which are about account takedowns or posts that are taken down, I think that’s too narrow too because, as you know, maybe the more fundamental content curation that these platforms engage in every day is through algorithmic decisions, prioritization: which information you see, which information you don’t see. I think there’s a good argument that Alex Jones’s power comes not so much from his access to the platform but from the fact that the platform’s algorithms privilege that kind of speech.
Which Nicole was talking about, the pillars. Change the pillars. Then, you’ll see a change in what wins. What she was talking about was, you know, engagement, speed, and something else are …
The “slow food movement” for it. Yeah, yeah, yeah. I think the focus on Alex Jones, I’m not saying it’s unwarranted, but I do worry that it’s shifting attention away from the responsibility of the platforms.
To be more clear about what rules they make and how they do things.
And the fact that they make these decisions on a second-by-second basis. It’s not just about taking down accounts. It’s about how you prioritize the information, which information you decide to show me and when you decide to show it to me, and in what form you decide to show it to me. Why is it that Alex Jones’s speech spreads so quickly? Maybe you can put some of that on Alex Jones, but some of it’s on Facebook, and some of it’s on Twitter. Some of it has to do with their algorithms, and so I think we should have that sort of broader conversation about the role that the social media companies are playing in shaping, and arguably, distorting public discourse.
Right, because one of the things … In the next section, we’ll talk about it. I was having an argument with someone about Alex Jones, and they were like, “They shouldn’t do this. The First Amendment.” I said, “They do it all the time.” Are you kidding me? To have them go on and on about how they protect the First Amendment. I’m like, “Do you know how many people they’ve taken down?”
By the way, you ever heard of Chuck Jones? It’s Chuck Jones, right? They took him down. They didn’t like him. They didn’t like his words. He violated their rules, and then he got kicked off. That was it. Or they move things down in priority. They bury things.
Right, to recognize that, that’s to recognize something important, but it doesn’t tell you anything about what the solution is, so yeah, they do it all the time.
I’m only saying that because they pretend they don’t. They’re like, “Oh no, we’re backing the First Amendment.”
Oh no, I agree with you. I agree with you, but I just think that the hard question doesn’t get presented until you recognize that they do it all the time. Now we recognize that they do it all the time, what’s the answer? On the one hand, you have the platform saying, “We have a First Amendment right to create the kinds of communities we wanna create.” That’s a plausible argument.
On the other hand, you have the very real concern that centralizing the power to control debate in the social media companies would be the worst thing in the world, which is also a very serious argument. The First Amendment is concerned principally with centralization of power in the government, centralization of power over …
But not this.
The First Amendment doesn’t, obviously, have very much to say about this.
Never contemplated it. One of the things you said early on was this idea of the privatization of the public space. I think in his testimony last week, Jack Dorsey called Twitter the public square. I was like, no it’s not. It is, but it’s not. If it is, then he gets to be regulated, if he’s the public square or something like that. So, talk a little bit about what you mean by the privatization of the public sphere.
I think what I mean by that is that conversations that used to take place in spaces that were subject to the First Amendment are now taking place in spaces that are controlled by private actors and therefore not subject to the First Amendment.
“Public square” is used in a lot of different ways in a lot of different contexts. There was this decision a couple of years ago, the Packingham decision, that Justice Kennedy wrote, in which he described the social media platforms as the public square, and Justice Alito wrote a concurrence effectively scolding Justice Kennedy for using that language, saying you don’t wanna go down that road, you don’t even mean what you’re saying. So there’s this debate at the level of legal doctrine: Are the social media companies public squares or not?
Then there’s a question, sort of, in a more practical way. I know Zeynep Tufekci has resisted the idea of calling the social media platforms public squares, because really their whole model is based on feeding you information that’s made just for you, so it’s the opposite of a public square in a way.
That’s right. And then it’s interested in grabbing information from you to feed advertisers.
Yeah, certainly, they’re not publicly … Their interest is not a public interest; they’re commercial corporations.
No, completely.
They make money.
You walk across a public square, and they’re watching how you walk across it, and, “Here’s somebody that might want an ice cream right here because they seem hot.”
Right, right, right. There’s this whole level of surveillance under …
That’s not what the public square does.
Yeah, yeah, yeah, a whole level of surveillance under the speech, and they’re monetizing the results of that surveillance. So in those respects, it doesn’t look like a public square. But there’s no question that these companies have immense power to decide not just who can speak, but also who gets heard. Who can speak because they decide who gets on the platform and who doesn’t, but who can be heard because their algorithms decide what speech gets prioritized and what speech gets suppressed.
Pushed down.
Yeah, so in that sense, they control the public square, and not just in the United States but in a lot of the world, right? Again, the First Amendment is concerned principally with government power, but we resisted the centralization of control over the public square in the government because we didn’t like the idea of centralization of that kind of power. Maybe we should resist the idea of centralizing power in the social media companies for the same reason.
That’s how you get to proposals like, “Well, maybe we should have a must-carry rule, which requires Facebook to carry everybody, that restricts Facebook from …” But you run up against pretty serious First Amendment arguments on the other side. Facebook can, I think, quite plausibly say that it has a First Amendment right to create the kind of community that it wants to create.
I’m not sure we really want a situation where Facebook is subject to the First Amendment in the same way that the government is. I mean, it would require Facebook to allow pornography on the platform, for example. It would require Facebook to allow Constitutionally protected hate speech. Facebook would be required to host that. I’m not sure anybody would see that as a solution to the problems that we’re facing right now. That’s all just to say that it’s complicated and I don’t have any answers.
Okay. All right. Where are we … When you talk about this debate, use a debate like the Alex Jones thing. A lot of people pulled out this, “He has a First Amendment right.” Often I say, “Well, he does, but it doesn’t mean that he doesn’t get kicked off.” That’s two different things. I think I wrote, “Freedom of speech doesn’t mean freedom from consequence,” and if you break the rules of a platform you get to pay for that, essentially.
How do First Amendment scholars look at this? I think people have been very reductive about the First Amendment, especially when it comes to these social media companies. The other argument of course is that they’re private companies that can do whatever they want.
Yeah. I mean, I think you have summarized quite well the basic First Amendment arguments here. Alex Jones has the right to speak, but that doesn’t mean he has the right to be on Facebook’s platform.
Same thing for Roseanne Barr.
Yes. Facebook has its own First Amendment rights here. It expresses them by ejecting Alex Jones from the platform. I think none of that would raise difficult questions if it weren’t for Facebook’s scale, right?
Exactly.
It’s the fact that Facebook is so big and that Facebook arguably controls the public square or arguably controls a large segment of the public square. That’s when I think free speech advocates start to get nervous about Facebook excluding people from the platform, especially when there’s an argument that they’re excluding people on the basis of viewpoint.
You can think whatever you want to about Alex Jones, but I worry not about Alex Jones, but about the next person or the next year. Who is it that Facebook is going to be excluding next year? If we know anything from the history of government censorship, we know that this power is going to be used most aggressively against marginalized voices: controversial voices, but marginalized voices that we especially need to hear.
This would not be a worry if Facebook were a community listserv or something like that, because Facebook wouldn’t have this kind of outsized effect on public discourse and on our society. If you accept — again, there’s an argument about this — but if you accept that Facebook is rightfully characterized as the public square, or a big piece of the public square, then I think you should be very troubled by the idea that Facebook is going to decide who gets to speak and who gets heard.
Now Mark Zuckerberg is trying not to have to decide.
Yeah. I’m not unsympathetic to him.
He kind of has to.
Yeah.
What does he do? I’ve had him do this to me. I’m like, “Why are you the controlling shareholder? Why do you have $64 billion?” You can’t own it and control it and say you don’t own and control it.
Look, I’m not unsympathetic to their feeling that we don’t want to have to decide these questions.
Yeah.
I would be a lot more sympathetic to them if they were at the very least offering us this low-hanging fruit of transparency and notice and an appeal right. At the very least, they should tell us what decisions they’re making. Who is getting excluded from the square? Whose voices are getting amplified? Whose voices are getting suppressed? Offer statistics about all of those things and offer transparency at the level of individual cases…
Why don’t they do that?
… and transparency at the level of algorithms. You know, I think that they have definitely made moves in that direction over the last few years. I think part of the concern is the way people will respond to that kind of transparency. I mean, the transparency could be embarrassing for them. It could lead to calls for regulation. I still think they have a responsibility to do it.
Yeah. Let’s get more into this idea that they don’t want the responsibility for something they’ve built. See, I think they have responsibility for it, so they’ve got to figure it out. It’s their problem, but they want to push it away, and yet they hold all the benefit parts: the money, the advertising.
I think that’s right.
Then give us the money if they don’t want … Let someone else run it and own it. Hand it over to someone else.
You should propose that to them.
I have. They don’t like that idea. They have rejected my brilliant idea.
I’m sorry. I’m sorry to hear that. It’s a good idea. This is part of why the Alex Jones debate, again, makes me uncomfortable. Not because I disagree that Alex Jones is a toxic person who is causing real harm to real people. Obviously he is. I think most of the responsibility for the problems that most concern people like you and me right now — disinformation or discrimination on these platforms or echo chambers or filter bubbles — all those problems are the result not of people like Alex Jones but of people like Mark Zuckerberg.
The platforms themselves are making these decisions. The platforms decide to amplify some speech and suppress other speech. They decide to facilitate some kinds of communities and to foreclose other kinds of communities. They need to take responsibility for those decisions.
What would you do if you were … Eventually Twitter did the same thing, even though you knew they were going to do it, kick off Alex Jones. It took them a while. They wanted to make their speeches about the First Amendment and then they did exactly what everybody else did. How did you look at that?
Yeah. I find it hard to answer that question in isolation because I think the Alex Jones thing … It’s a mistake to look at it in isolation because the decision that Twitter or Facebook makes is a more general decision that will have implications for cases involving people who aren’t Alex Jones. That’s one reason why I don’t like answering the Alex Jones question.
Fair point.
If you ask me, do you want that guy who you disagree with all the time to be quiet or to be stopped from speaking by the government? If I look at that in isolation then maybe I would say, “Yeah, it’s totally fine if the government stops him from speaking. I don’t like what he says.” But that’s not the way the world works.
No, of course. No, in this case they kind of had … He broke the terms of service.
That’s right.
So that’s an easy one.
He broke the terms of service. Well, it’s an easy one under their policies. Yeah.
That’s right.
The question is, are their policies the right ones for society, right? It’s a harder question.
What would you do if you were running one of these things? Fake news is a whole other thing. That’s a whole issue. That’s not protected. It’s lies, essentially.
Yeah. I would do a number of things. One, I’ve already mentioned, which is provide more information about which accounts are being taken…
Tell us what you’re doing.
Tell us what’s going on.
Yeah. They don’t want to show you that.
Both in individual cases and more generally. You asked me what I would do, all right? Not what they are going to do.
All right. Okay. I wanna know, I’m just telling you, they’re not going to do that, but I like it.
That’s one. Another form of transparency, which is what we’re asking for in this letter we sent to Facebook, is essentially to make it easier for journalists and researchers to study what’s going on on your platforms.
Right.
Make it easier for the world to understand how these problems work.
“Let us solve the problem for you.”
Or let us help solve the problem. How about that?
Do you remember Jack Dorsey’s thing? “Journalists will figure it out and tell everyone.” I was like, “Really?”
Yeah.
I don’t own Twitter stock. I’m not sure I want to do that job.
Well, Twitter also has terms of service that restrict journalists from doing these same kinds of journalism, just as Facebook’s do.
Yeah. It was ridiculous. He took it back but, boy, was that a doozy.
Yeah. Yeah.
That’s the word for it, doozy. What a doozy. Anyway, go ahead. Sorry.
Well, another piece of the transparency …
OK, so that one. Let journalists do it.
I was thinking about it today in relation to this video that Breitbart was circulating about the Google meeting. You know, if you do conceive of these … Kate Klonick is a legal scholar who has this recent article called “The New Governors” where she characterizes the social media companies as akin to governors and therefore the implication is we ought to treat them that way and they ought to … Well, we ought to demand things of them that we would demand of our governors.
Right. We can’t vote them out, though. That’s the problem.
You can’t vote them out, but you can demand that they provide things like due process protections and transparency protections, right? If you think of the social media companies as governors or akin to governors, then why not also have whistleblower rules? Why don’t Google and Facebook and Twitter have whistleblower rules to protect people who would tell the public about abuses of one kind or another that are taking place within those companies?
Because, again, if you take seriously the fact, as you should, that these companies have an outsize effect on public discourse and therefore an outsize effect on our society, then there ought to be some safeguards in place to ensure that the companies are working in the public interest.
If they are treated like that. We’ve never treated companies like this in the past. What other companies have been regulated in the public interest like this?
Well, this is not unprecedented at all.
TV? TV. I guess TV.
We regulated broadcasters in this way. Common carriers, the railroads. And I’m not even suggesting that we should think of the social media companies in the same way.
No, different. A whole new different level.
Yeah. I’m just saying that it’s not unprecedented to require …
To finish up, what regulation do you think is coming?
I have no idea. I think if I were the social media companies I’d be very nervous about that. As a First Amendment advocate, I’m a little bit nervous about it myself.
As people probably were in television time.
Sorry?
As people were in television time.
Yeah. Yeah.
But they got regulated. They didn’t die.
I do think that regulation … I think that we should consider regulating the social media companies and there are a lot of different possibilities that range from, again, low-hanging fruit like transparency regulation to much more intrusive regulation relating to content. Like a must-carry rule, for example. Yeah. Some of those make sense and some of them don’t.
Yeah. I got one for you.
It’s very unpredictable.
Section 230. That’s my favorite new thing.
Yeah.
Why not just make them liable? Then they’ll stop misbehaving. It’ll be in their interest to do a good job at their job.
Well, I think the risk there is that, yes, they will stop misbehaving, but they will also take down a whole lot of information that is constitutionally protected and valuable, because they’ll worry about liability.
Again, I’m not worried about Alex Jones here. I’m worried about Black Lives Matter, or a million different politically controversial topics where other private actors or government officials will write to Facebook saying, “Why do you have this up?” and Facebook will panic and take it down. I’m not categorically opposed to an amendment to Section 230, which …
It’s already being chipped away.
With FOSTA-SESTA, yeah. It’s already been chipped away a little bit. I’m not categorically opposed to it. I think it’s worth considering. But there are risks.
Right.
Yeah. Those risks, I think we should take seriously too.
Yeah. All right. To finish up, where do you think the next battleground is?
The one thing we haven’t talked about is surveillance.
Ah, yes.
I think that people aren’t accustomed to thinking of surveillance as a threat to the First Amendment. They think of surveillance as a privacy threat, which of course it is. Pervasive surveillance also has very real implications for the freedoms of speech and association and the press. It’s very hard to measure. When people are under surveillance, they act differently, but how do you measure whether they are acting differently and to what extent they’re acting differently? It’s a very difficult thing.
There’s a whole line of cases from the 1960s and ’70s involving more primitive forms of surveillance but that draw the connection between surveillance and the First Amendment. I think that over the next 10 years, you’re going to see at the very least advocates and I hope courts too …
Argue this?
Yeah. Yeah. Accept this connection and think about the implications for the First Amendment of both government surveillance and private surveillance.
That you didn’t give your consent and they …
Yeah, or even if you gave your consent. People give their consent for limited purposes and then the information is used for other purposes. The government loves to make this argument in other contexts: People give one piece of information in one context and another piece of information in another context, and together these things form a mosaic, and you can learn a lot more about a person by putting all this stuff together.
Yes, you can.
As data aggregators are paid to do, than you can by just studying individual data points. All that has far-reaching privacy implications. We started having this privacy debate in the wake of the Snowden disclosures. I think we’re only now starting to grapple with the free speech implications.
Absolutely. Well, this has been fascinating and there’s more to come. I want to check in with you maybe in a year about where you guys are going and what your next areas are.
That’d be great.
I think you’re right. Absolutely surveillance is another one. Things in the home. What you say in your home and what is protected with these devices, how you behave in … There’s all kinds of things that I think people … What your identity is, I think, is another one.
You’re carrying around this surveillance device in your pocket.
I call it that all the time. No one ever listens to me. I’m always like, “You just joined the prison system,” when you realize it and how much information they have on you. Jameel, it was great talking to you. Thank you for coming on the show.
Thank you.
Please come again.