Intro Montage:

What ... What ... What ... What ... What ... What ... What is ... Twitter?

Evelyn Douek:

Good day. I'm Evelyn Douek, and this is episode 2 of "Views on First."

Intro Montage:

Twitter. Twitter. Twitter. Twitter. Twitter. Twitter.

Evelyn Douek:

This is a podcast about the First Amendment in the digital age from the Knight First Amendment Institute at Columbia University.

Yoel Roth:

It's clear that social media represents new vectors for abuse and harassment at scale with dangerous consequences.

Evelyn Douek:

This podcast explores what on earth the First Amendment is going to do about social media platforms and free speech in the digital age.

Noah Feldman:

There's a very likely scenario coming where a war is going to happen and what I'm doing is trying to plan for that war.

Evelyn Douek:

If you haven't listened to episode 1, you really should pause this episode and go and listen to that before continuing. Go on, we'll wait. All right, so the question this series is exploring is what are social media platforms as a matter of constitutional law? At the end of episode 1, we left you on a high note. The lawyers from the Knight Institute had emerged victorious in the legal challenge they started in 2017 arguing that President Trump blocking critics on Twitter was a violation of the First Amendment. Go team.

But of course, the case wasn't just about Trump or just about the handful of plaintiffs Knight represented in bringing their case, and it wasn't just about Twitter, which may or may not exist by the time you're listening to this episode. When places like the Knight Institute bring strategic litigation, they're not only interested in the individual cases; they think about how the cases they bring can shape the law going forward, and that's what this episode is about. What are the ramifications of the holding in Knight v. Trump, and what might happen in the future? We've got two people to help us think through this question. One of whom has serious doubts about whether the argument made by the Knight Institute was correct, and the other... Well, the other has been on the front lines of the battles about what we can say online, but let's not get ahead of ourselves. Let's start with the Knight critic.

Noah Feldman:

My name is Noah Feldman. I teach constitutional law at Harvard Law School.

Evelyn Douek:

Noah was a skeptic of the Knight team's theory from the start. In fact, he wrote a number of columns for Bloomberg News that made his feelings about it pretty clear. In June 2017, he wrote, "Kudos for creativity to the new Knight First Amendment Institute at Columbia University, which has alleged that the First Amendment bars President Donald Trump from blocking followers on his Twitter account. Unfortunately, the law runs to the contrary." In May 2018, after the Knight Institute won its case in the district court, he wrote that "The Trump Twitter blocking ruling is bad for free speech." Then when the initial ruling was affirmed on appeal, Noah wrote that this decision would have serious negative implications for the freedom of speech. So what exactly does Noah think is so wrong with this case, and why does he think it will lead to developments in the law that will ultimately be dangerous?

Noah Feldman:

The core of my instinct about the case was to understand Twitter as a social media platform, as a single unified product, which was created in the exercise of Twitter's First Amendment rights, both to free speech and to free association. Seen from that perspective, it made no sense to think of the then-president's Twitter feed as a public forum because the whole forum was created by Twitter.

Evelyn Douek:

So this is the first area of disagreement. Remember, Knight was not arguing that all of Twitter itself was a public forum, that is, a place where the First Amendment applies to prevent restrictions on speech. It was arguing that the area under Trump's tweets, where people could interact with what the president said, was a kind of public forum. And it was not arguing that Twitter couldn't block people; it was arguing that the president, as a government actor, could not block people. Noah thought that was innovative.

Noah Feldman:

It was an extremely creative position to advance, and it also represented a kind of cutting-edge position.

Evelyn Douek:

But he remains unconvinced. He just doesn't think it makes sense conceptually to think of Twitter as lots of different kinds of spaces and have lots of different kinds of legal rules that apply to each of the different parts. He thinks Twitter is a private platform, and what happens on Twitter is the result of Twitter's actions, not Trump's.

Noah Feldman:

Twitter is a business that is wholly controlled by Twitter, and therefore the forum that is created there is within Twitter's control in exactly the same way that the legacy media would have been.

Evelyn Douek:

Legacy media like The New York Times, Fox News or film studios. To Noah there's no difference between Twitter choosing what content to show you and a newspaper choosing what stories to run. Therefore, the Constitution shouldn't have anything to say about what Trump did on Twitter because...

Noah Feldman:

One thing you can do on Twitter is block people. You can also mute people. You can also amplify certain voices. Those are all features of something that Twitter designed, and Donald Trump was using them, but while he was using them, he was doing so, as it were, at the mercy of Twitter. Donald Trump wasn't in charge of this forum; Twitter was in charge of this forum.

Evelyn Douek:

Noah has some experience thinking about the power that private companies have over online speech. A long time ago, when Facebook was first starting to get real heat for its content moderation decisions, Noah was on a bike ride in Menlo Park, contemplating the nature of free speech in the modern world as constitutional law professors are wont to do, when he thought that maybe Facebook could turn down the heat by creating something like its own Supreme Court to make its content moderation decisions for it. He wrote home to tell the college friend that he was staying with about the idea. Now, that college friend just happened to be Sheryl Sandberg, the chief operating officer of Facebook, and the idea of a Facebook oversight board became reality a few years later. Since then, Noah has consulted for Facebook, and so he sees up close the pretty breathtaking displays of power that social media platforms wield over their sites and, as a result, our public sphere. The most striking display of this awesome power, awesome in the literal sense of course, was indeed over the very same account the Knight case was about.

Noah Feldman:

After all this had happened, Twitter did in fact deplatform Donald Trump, so the forum went poof, it disappeared in an instant, and Donald Trump had zero control over that, which to my mind strongly brought home that he was never in control of the forum in the first place and that therefore his account should not have been considered a public forum in which the First Amendment applied.

Evelyn Douek:

To Noah's point about the level of control Twitter has: since we talked, Twitter's new owner, Elon Musk, has flicked the switch and depoofed Trump's account. It's back on the platform for all to see. Now, at the time of taping, Trump hasn't tweeted, but he could if he wanted to. I'm not going to spend too long thinking about what it means that Trump is showing more control staying away from his Twitter account than I've ever been able to.

Anyway, Noah isn't making this distinction between whether Twitter is in control or Trump is in control just because academics gonna academic and get all theoretical. Noah's concerned that imposing First Amendment constraints on parts of Twitter could be dangerous for two main reasons. The first reason Noah is worried about the holding in Knight v. Trump, that government actors can't block people on Twitter, is that, remember, the case was not just about President Trump but about all government actors.

Noah Feldman:

Shift it over and make it Alexandria Ocasio-Cortez. The amount of inbound hate, misogyny, nasty evil that she gets in any few moments is kind of horrific and also awe-inspiring in its quantity.

Evelyn Douek:

Alexandria Ocasio-Cortez, the young congresswoman from New York's 14th congressional district, most commonly called AOC, has 13.2 million followers on Twitter and is famous for using social media in a way no politician has before. She's used platforms that aren't usually thought of as political spaces.

Alexandria Ocasio-Cortez:

I'm happy to hop on Instagram live tonight and answer your questions around COVID relief and stuff like that.

Evelyn Douek:

She's found new ways to reach people by talking in a way that we are not used to hearing from politicians.

Young Turks Host:

AOC played Twitch games last night and the whole point of it was to get out to vote.

Alexandria Ocasio-Cortez:

It is a good way to use technology to reach constituents, even if it's just when I'm prepping vegetables for dinner.

Evelyn Douek:

And it's been tremendously effective.

Alexandria Ocasio-Cortez:

Pretty insane 163,000 of you guys are hopping on. This is nuts.

Evelyn Douek:

But although AOC seems to be having fun on social media, it's not all sunshine and lollipops. When you're big online, you're also a big target online. Death threats are the norm. AOC has talked about having her morning coffee and reviewing photos of men who want to kill her. That takes its toll.

Alexandria Ocasio-Cortez:

I personally gave up Facebook, which was kind of a big deal because I started my campaign on Facebook and Facebook was my primary digital organizing tool for a very long time. Social media poses a public health risk to everybody.

Evelyn Douek:

It's not just AOC who felt driven out of these spaces due to abuse. It's Democrat Kim Weaver who was running for Congress but...

Young Turks Host:

Announced yesterday that she would be stepping down from that race due to multiple reasons, but one of them was rampant death threats that she's been receiving for apparently many months.

Evelyn Douek:

It's Katie Hill, former representative of California's 25th district in the House.

Katie Hill:

Some people call this electronic assault, digital exploitation; others call it revenge porn. As the victim of it, I call it one of the worst things that we can do to our sisters and our daughters.

Evelyn Douek:

Erin Schrode, a former Democratic congressional candidate from California.

Erin Schrode:

It's very unsettling when all day, every day, should I look at any of my social media feeds or my email to be reminded that I'm Jewish, that I'm unwelcome, that I need to leave, that I'm inferior, that they're going to come for me.

Evelyn Douek:

Mya Whitaker, a former candidate for city council in Oakland.

Mya Whitaker:

I feel like that as a Black woman too. They definitely ask for where is my child's father, is he involved? And it's like, "What does that have to do with whether or not I can get this valid pass?"

Evelyn Douek:

Gold star to anyone who can identify what all the voices you just heard have in common.

Yoel Roth:

It represents a challenge to the ability of women to fully engage on social media, which we believe has fundamental implications for freedom of expression and information.

Evelyn Douek:

That's someone who knows an unfortunate amount about online harassment, Yoel Roth.

Yoel Roth:

And I'm the head of safety and integrity at Twitter.

Evelyn Douek:

Actually, since we spoke to him, Yoel has become the former head of safety and integrity at Twitter. It all went down in quite an unceremonious and, frankly, pretty awful way.

MSNBC Host:

Several high-ranking Twitter executives have now left the company, one of the biggest names to leave the company yesterday, Yoel Roth, in charge of trust and safety, who until recently has defended Musk's efforts to fight misinformation and hate speech.

Humanist Report Host:

Elon Musk seemingly retaliated against Twitter's former head of trust and safety, Yoel Roth, by insinuating that Roth was a pedophile based on an out-of-context excerpt from his 2016 doctoral dissertation, which led to Yoel Roth getting death threats and even having to flee his home after Musk's posts were amplified [inaudible 00:12:09].

Evelyn Douek:

Since we spoke to him, many things at Twitter have changed. Musk has different views of how the platform should moderate content.

Sky News Australia Newscaster:

Elon Musk himself, he's a self-professed free speech absolutist and that just has a lot of people in a tizzy.

Evelyn Douek:

Many people have been fired. Yoel quit his job and with all the change ups, it's kind of chaos over there. It's not always exactly clear what's going on. A few weeks after he quit in early November, Yoel sat down for an interview with tech journalist Kara Swisher, and here's how he summarized what appears to be Musk's general approach.

Yoel Roth:

One way of streamlining the work of trust and safety, I guess, is to have fewer rules. You can certainly streamline things, but that doesn't mean that malicious activity is going to get less complicated. It doesn't mean trolls are going to stop. You can't bury your head in the sand.

Evelyn Douek:

Trolls. A universal law of the internet is that if there's a platform, there's trolls and there may be more than ever right now.

Global News Host:

By opening the floodgates to unfettered free speech, Elon Musk has allowed hate to spread far and fast on Twitter.

Evelyn Douek:

Now, the ins and outs of that drama aren't relevant for us here, but what Yoel has to say about trying to keep people on a platform like Twitter safe is still relevant, because that's kind of the point. The issues raised by Knight v. Trump are fundamental issues about how to best manage the online public sphere, and that's bigger than any single platform or any single job, and much of what Yoel saw in his role at Twitter is pretty standard across the industry.

Yoel Roth:

Women journalists and elected officials and especially those from minority populations have been shown to face disproportionate levels of abuse and harassment.

Evelyn Douek:

Now, in many ways, of course, none of this is new. I mean, "Newsflash, women in politics face sexism." A lot of this is stuff that women have come to expect as the price of having a public profile, and it's not like social media created misogyny, but it is clear that there's ways in which social media has amplified these long-standing dynamics.

Yoel Roth:

It's clear that social media represents new vectors for abuse and harassment at scale with dangerous consequences.

Evelyn Douek:

And the harassment of women and minorities is not the only way that online platforms can be used to cause harm. Twitter alone is a massive platform.

Yoel Roth:

At the scale of hundreds of millions of tweets per day.

Evelyn Douek:

And so Yoel saw plenty of ways platforms can be misused.

Yoel Roth:

Things like nation-state disinformation campaigns or terrorist organizations.

Evelyn Douek:

This is after all the lesson of history.

Yoel Roth:

It's really the classic double-edged sword of every type of connective technology. It can be used for incredible good and incredible value in society, and those same tactics of organization and mobilization can be used to do incredible harm.

Evelyn Douek:

And that harm can fall especially hard on women and minorities. But that's the rub. When the Knight Institute brought its case against Trump, they were hoping to establish a rule that would constrain not only Trump but other officials too, and for better or worse, they succeeded. When you litigate a case like Knight v. Trump and get a ruling, what is good for the goose is good for the gander. If a case establishes a broad principle, you don't get to pick and choose who gets to take advantage of that principle.

Rules are rules, and the court had said that the rule was public officials can't block people on social media because they don't like what those people say. They didn't say Republican public officials can't block people on social media. They didn't say mean politicians can't block people on social media, and they definitely didn't say that the politicians that you don't like specifically can't block people on social media. So when the Knight Institute found out that AOC was also blocking critics on Twitter, it sent her a letter asking her to reconsider, drawing her attention to the cases where courts had held that what she was doing was unconstitutional. At first, AOC wasn't happy about it.

Young Turks Host:

She tweeted back, "I have 5.2 million followers, less than 20 accounts are blocked for ongoing harassment. 0 are my constituents. Harassment is not a viewpoint. Some accounts like the Daily Caller posted fake nude photos of me and abused my comments to spread it. No one is entitled to abuse. People are free to speak whatever classist, racist, false, misogynistic, bigoted comments they'd like. They do not have the right to force others to endure their harassment and abuse."

Evelyn Douek:

But a few months later, AOC changed her mind. She unblocked and settled the lawsuit with the owner of one of those accounts. She said that she agreed that people had a First Amendment right to express their views.

A really important point here is that it appeared that AOC was not simply blocking people because they were harassing her or sending her death threats. Instead, many of the times that she'd blocked people, it followed them loudly disagreeing with her policy positions. Dov Hikind, the person that AOC reached a legal settlement with, for example, had disagreed with AOC's position on Palestine and said that AOC blocked him for criticizing remarks she made comparing migrant detention centers at the Texas-Mexico border to concentration camps.

So AOC was right to reconsider. This kind of picking and choosing the viewpoints she wanted to hear from, and that others would be able to see in the replies to her tweets, is pretty anti-First Amendment values. Those values are the ones that motivated the Knight Institute to bring the case against Trump in the first place. As Jameel and Katie from the institute explained in episode 1, "You really want those values embedded in the law's treatment of social media platforms because they're such critically important spaces for politicians to engage with the public."

Yoel Roth:

For instance, during election periods, we see that leaders use Twitter to share real-time information about how to vote, and we see that candidates for office use Twitter to campaign and sometimes even interact directly with each other. We see that especially in the case of women in public life, there are very substantial chilling effects connected with abusive and harmful speech.

Evelyn Douek:

It's these chilling effects that Noah says he is worried about too.

Noah Feldman:

It's not facilitating public discourse to insist that these spaces be open to everybody because if they are, then politicians will just walk away from those spaces and they'll find some other format to express themselves, especially if they're being subject to hatred and misogyny and all kinds of bad content.

Evelyn Douek:

The idea here is that protecting free speech overall doesn't necessarily mean protecting all speech all the time. Some people will be silenced if other people can run them out of public conversation. Now, there are other features that platforms are starting to offer that are a little more nuanced than straight-out blocks or takedowns. Things like nudging people to be a little more polite, letting others mute jerks or applying warning screens to offensive content. Knight v Trump really was about the specific blocking function, and so there are all these open questions about how the case might apply to these different platform features if public officials start making use of them too.

Yoel Roth:

If you think about a feature like "Mute" on Twitter, which doesn't prevent somebody from seeing your content but hides their content from you, is that different under the law? Is that different under ethics regulations and expectations? Are we saying that you need to be able to see the content of public officials, but they don't necessarily have an obligation to pay attention to you? Those are some of the big unsolved and open questions here.

Evelyn Douek:

Another open question is what will happen to these ideas at Twitter now that so many of the staff have left the building and the lights have been turned off. Here's Yoel, talking to Kara Swisher again, when asked whether he thinks the platform will continue to function.

Yoel Roth:

I would encourage folks to keep an eye out for what are the canaries in the coal mine that suggest that something's not right. A couple of the things that I keep an eye out for are: have core safety features stopped working the way that you expect? Does block still work, or do you start seeing blocked accounts in weird places where you don't expect them? Does mute still work?

Evelyn Douek:

But what happens at Twitter specifically isn't the only thing worth watching, because just as Knight v. Trump wasn't really just about Trump, it also wasn't just about Twitter. In the over five years since Knight filed the case, the holding has been applied in lots of different contexts. Five circuit courts have adopted the same general principle, and lots of different officials have been called out for blocking people in lots of different contexts. It's not just presidents that can't block people, it's chairs of county boards, county sheriffs and members of school district boards of trustees. It's also the Army and the Navy.

Here's the story. The military wanted to recruit young people, so they went to where the young people are and had service members livestream themselves playing video games as a kind of outreach. But not everyone is a fan of the military and not everyone thinks they should be targeting young people for recruitment by playing games. So anti-military activists expressed their disapproval in the manner that is traditional on the internet, by trolling them in the comments.

Jordan Uhl:

So I saw that they were streaming on Twitch, and I went into the chat and I was like, what's your favorite war crime?

Evelyn Douek:

Then the members of the military acted like a lot of people would if they were getting trolled on Twitch. They blocked the activists.

U.S. Military Rep. on Twitch:

Have a nice time getting banned, my dude.

Evelyn Douek:

But the Army and the Navy are government actors. So the Knight Institute leapt into action.

Jordan Uhl:

Because they blocked me, both the Army and the Navy. The Knight First Amendment Center at Columbia University wrote a demand letter to both branches and was like, "You can't do this. It's unconstitutional." The government cannot ban you on the basis of viewpoint.

Evelyn Douek:

It seems that not all officials are getting the message, however, and so there's lots of ongoing litigation. In many cases, government actors are ordered to unblock their constituents, but there have also been cases where courts found that the accounts involved were personal, not official, accounts, and so they could block all they like. So these issues can be nuanced. Not only are the government actors involved different, each of the platforms involved works differently. For example, on some, public officials broadcast to the world; on others, they engage in small group conversations. Some platforms allow people to remain anonymous, others require you to use your real name. This changes the kind of conversation and abuse that can happen in each environment.

Yoel Roth:

Things probably look differently on a public social messaging service like Twitter than they would in a private one-to-one or small group messaging app. A key part of it is really understanding how these technologies work and thinking about different types of control and how they might be implemented in different platform contexts.

Evelyn Douek:

And all of these differences mean that courts are working out for themselves, one step at a time, the limits of the general principle established in the Trump case. There are problems with asking the law to do this. I mean, first, the law is slow. It was nearly four years from when Knight first filed the complaint challenging President Trump's blocking of people on Twitter to when the Supreme Court dismissed the appeal as moot. In that time, the world had changed quite a lot, and platforms changed quite a lot too.

Yoel Roth:

On the time scale of a year, it's fairly safe to assume that a product like Twitter is going to change quite a bit.

Evelyn Douek:

When I spoke to Yoel in the [inaudible 00:23:20] days of late 2022, even he couldn't envision just how much Twitter would change in the next few weeks, let alone months, and it's really hard for the law to keep up with all of that. New features are emerging all the time. New platforms emerge that work totally differently from old ones. Yesterday everyone was Facebook posting. Today they're hanging with the youths on TikTok, posting hilarious clips of themselves dancing to "Bored in the House."

"Bored in the House" Song:

Okay. I'm bored in the house and I'm in the house bored, bored in the house, and I'm in the house bored.

Evelyn Douek:

Tomorrow they'll be attending rallies in the Metaverse or something. Meanwhile, cases can take years to work their way through the courts. There are literally cases before the Supreme Court right now about Twitter's content moderation choices in 2017. A few things have changed in Twitter's content moderation since 2017. So the law is slow and it's also not necessarily so tech savvy. I mean, how many judges do you think know what the Metaverse is?

But there also might be a more fundamental issue with asking the law to solve these problems in this area at this time. That brings us to the second reason Noah Feldman was nervous about the Knight v. Trump litigation and is nervous about its holding now, which is: what happens if courts take the ruling and run with it, holding that not only can government officials not block people on social media, but neither can the social media platforms themselves?

We'll come back to the question of how likely it is that courts actually do what Noah worries they'll do. There's real disagreement about that. But for now, let's just think about why it would be such a big deal if courts restricted social media platforms from taking down content. The thing is, as you've probably heard, unless you've been under a soundproof weighted blanket for the past half decade, platforms do a lot of blocking. A platform that couldn't block would be radically different from the kind of thing you are used to. A lot of this content moderation is important for making social media at all usable and a slightly smaller dumpster fire of terrible speech than it would be otherwise.

One of Noah's big concerns about the public forum argument is that courts will paint with a broad brush and make rulings that platforms themselves cannot take down really annoying or rude or offensive speech. That is, speech that is not illegal and is protected by the First Amendment, but that would make every social media platform basically PornHub with a side serving of ads for fake Ray-Bans and enticing marriage proposals from foreign princes.

Noah Feldman:

If the First Amendment applies, generic nastiness to a political figure is protected by the First Amendment. So yes, true threats could be excluded, but the full range of human nastiness using all the vocabulary that we know would not be blockable in a classic public forum.

Evelyn Douek:

So Noah acknowledges that the First Amendment totally does allow politicians to block some of the worst threats and harassment that public figures get. Stuff like the death threats we heard about before from many of the female politicians, but there's a lot of bad stuff that falls short of the worst of the worst.

Knight v. Trump says politicians can't block that stuff, but most people would probably agree that platforms should remove it. If platforms themselves were public forums and bound by the First Amendment, they'd be able to take down a lot, lot less. It's one thing when Elon Musk voluntarily decides not to take stuff down on Twitter. It's another altogether for the law to say that all platforms must leave stuff up that they would rather take down.

The Knight Institute and Noah agree about this: neither of them thinks it would be a good thing if courts restricted platforms from taking down content, and they don't think that Knight v. Trump requires courts to do so. But Noah's worried about how other people might leverage the precedent that was set in Knight v. Trump, and there are people who might want to. Platforms policing so much speech has created intense controversy in the past few years.

U.S. Congressman Jim Jordan:

Why is it always that these big social media companies, why is it always conservatives who seem to get censored?

Florida Governor Ron DeSantis:

They're using secret algorithms and shadow banning to shape debates and control the flow of information, but yet they evade accountability by claiming they're just neutral platforms.

U.S. Senator Ted Cruz:

Mr. Dorsey, who the hell elected you and put you in charge of what the media are allowed to report and what the American people are allowed to hear? And why do you persist in behaving as a democratic super PAC silencing views to the contrary of your political beliefs?

Evelyn Douek:

When Donald Trump was deplatformed, many of these lawmakers became more adamant than ever that platforms' content moderation needs to be reined in, and they're writing and passing laws to try and make that happen. Where Noah and Knight part ways is whether the holding in Knight v. Trump will make it easier for these conservatives to make that argument. Nothing in Knight v. Trump has anything to do with what private platforms can do. It's only about First Amendment constraints on government actors. But Noah is worried that this distinction will get lost, because courts don't decide things in a vacuum, and while legal arguments are important, there's also this bigger constitutional and political struggle over platform regulation that's playing out right now.

Noah Feldman:

The Knight Institute is run by lawyers who are making legal arguments, and so it's natural for us to say this weird lawyer thing, which is that there's some big difference between these circumstances, and these are nuanced, and they're case by case, and you have to have a subtle understanding of them. I do it. We all do it, but we have to remember that this is not a game that, in the big picture, is going to be won or lost by lawyers. It's much bigger than that.

Evelyn Douek:

The fear is that Knight v. Trump is the chink in the armor, the crack in the dam, the slippery slope down the hill, that it would be better to pull up the drawbridge and create a fort around the idea that platforms are private spaces so that they can run their services however they like in order to best protect the digital public sphere from government interference.

Noah Feldman:

The path that the Knight Institute is pushing for in my view, is the path where it'll no longer be available to private companies to make their own decisions about this. There will be a default regulatory requirement that they can't, and I understand you haven't argued for that, but I see that as the logical next step along the way.

Evelyn Douek:

When the time comes, and that time may be soon, for the Supreme Court to hear arguments about whether private platforms should be allowed to perform content moderation and remove things like hate speech, misinformation, and political leaders who are inciting a riot...

Noah Feldman:

A crucial piece of their argument will be, and it already is, that social media platforms are the public square, and in order to do that in real world legal cases, you cite cases in support of that proposition, a case holding that even a part of Twitter, namely Donald Trump's Twitter feed was a public forum, is a great citation, a great proof for [inaudible 00:30:22] lawyers for the idea that Twitter is a public forum.

Evelyn Douek:

So Noah would rather we keep the Constitution out of this and leave it to private companies to manage their own products.

Noah Feldman:

I don't have some utopian idea that the market will always get the right results; it is very far from it. But what I am saying is that, just like in legacy media, which is also not a perfect market by any stretch of the imagination, you get different points of view expressed. This does come down to a core First Amendment point. "Do I trust the private sector?" Not especially. "Do I trust the government more?" No, I do not.

Evelyn Douek:

Let me reassure you. It's really, really not the case that Knight trusts the government. If anything, Knight v. Trump is a classic First Amendment case about the distrust of government power to pick and choose voices in the public sphere. Noah and Knight agree on that. What they don't agree on is whether a citation to Knight v Trump matters to the conservative crusade against content moderation. This is mainly a disagreement about tactics, not an ideological disagreement, but when a lot is at stake, tactics matter.

Noah Feldman:

What I'm saying is there's a very likely scenario to me coming where a war is going to happen, and what I'm doing is trying to plan for that war.

Evelyn Douek:

In the next episode, we're going to talk about that war and the tectonic shifts that are happening in the politics of the First Amendment. Battle lines are being drawn, strategies are being drafted, and the assault on established constitutional doctrine is beginning. What's on the line? Oh, just, you know, the entire system of freedom of expression in the United States.

This episode of "Views on First" is co-written, produced, and edited by Merrin Lazyan, and hosted by me Evelyn Douek. With production assistance and fact-checking from Kushal Dev. Candace White is our executive producer. Audio and production services are provided by Ultraviolet Audio, with production and scoring by me Merrin Lazyan and Mixing in Sound Design by Matt Boynton.

"Views on First" is brought to you by the Knight First Amendment Institute at Columbia University and is available wherever you get your podcasts. Please subscribe and leave a review. We'd love to know what you think. To learn more about the Knight Institute, visit knightcolumbia.org or follow the Institute on Twitter at @knightcolumbia, or on Mastodon by the same handle. I'm Evelyn Douek. Thanks for listening.

Trolls, trolls. Trolls, I mean, who are you? Trolls. A universal... Did you mean just the word or did you want the whole sentence?