Nov 29, 2023

Transcript
Bad Ideas

 

[RADIOLAB INTRO]

LATIF NASSER: Okay, let's just do the thing.

LULU MILLER: All right. Hey, I'm Lulu Miller.

LATIF: I'm Latif Nasser.

LULU: This is Radiolab.

LATIF: Okay, basically I was just, like, nosing around, and I found this article in a journal called Nature Machine Intelligence.

LULU: Nature Machine Intelligence. Okay.

LATIF: And this is a few months ago. It was in March. I don't—I don't even know what I was looking for when I was—and I just found this paper. And it had this weird, kind of boring title that I didn't understand, but I, like, started reading it. And the tone of it, there was something about the tone of it that was sort of breathless.

LULU: Hmm.

LATIF: Like, "Oh, my God! We just discovered this thing. And it's kind of scary, and we're not the only ones who are able to find this thing. We're not the only ones who are actually looking for this thing." And it felt, like, calamitous. And by the time I finished it, I was breathless. Like, I was like, "Oh, my God! Like, is this what I think it is? Because if it is, this thing is terrifying."

LULU: Okay, what—what—and what's the thing?

LATIF: Okay. Okay, so here's what happened. So ...

LULU: Okay.

LATIF: Our scene begins with ...

LATIF: La, la, la, la, la, la, la, la, la.

SEAN EKINS: You're muted.

LATIF: Oh, I'm muted.

LATIF: ... this guy named Sean.

SEAN EKINS: Hey!

LATIF: I'm muted. Duh! Let's actually rewind a little bit. Like, who are you? What is your company? What do you do?

SEAN EKINS: So my name's Sean Ekins. I'm the CEO of Collaborations Pharmaceuticals. And this is a company based in Raleigh, North Carolina. I founded it in 2015, and we work on using computational approaches for doing drug discovery.

LATIF: Which basically means what they do is they use AI to discover new medicines.

LULU: Hmm.

LATIF: You're a medicine hunter, and you do it through computers.

SEAN EKINS: Exactly, yes.

LATIF: So essentially what they do is they've built—like, using a lot of, like, open-source technology, a lot of, like, open-source, like, databases of drugs, they've basically created this computer program that's kind of like a search engine. And so they call it Mega-Syn.

LATIF: Or Mega-Sine? How do you say it?

SEAN EKINS: Yeah, Mega-Syn. So it was a really quick name for Mega-Synthesis.

LULU: Oh! So—and sort of synthesizing drugs that exist with receptors and brains and ...

LATIF: Kind of. It's a little complicated.

SEAN EKINS: It's one of those strange things where, like, I don't use it. I have one of my employees that basically codes it and puts it all together.

LATIF: [laughs]

LATIF: So that employee ...

FABIO URBINA: Sorry, let me take my mask off here.

LATIF: His name's Fabio.

FABIO URBINA: Okay.

LATIF: Fabio Urbina.

LULU: Okay.

LATIF: So what they do is they typically work with rare diseases.

FABIO URBINA: Which aren't considered profitable.

LATIF: Big Pharma's ignored them, there's no drugs for them. So what they'll do is take one of these diseases ...

FABIO URBINA: That, you know, usually only a few hundred people have.

LATIF: And they'll be like, "Okay, we need a drug that will do a very specific thing in the body to stop this disease, to stop the person from getting sick."

LULU: Right.

LATIF: So they'll tell Mega-Syn, "We need a drug that can do this very specific thing." And then they'll hit 'Search.' And Mega-Syn will comb through all the available drugs that have been discovered, all the drugs that have been even evaluated. Like, this giant network of basically every drug that has ever been created. And if from that, Sean and Fabio can't find a good match ...

FABIO URBINA: Well, we're kind of out of luck. That's the end of that.

LATIF: Except it's not. Because Mega-Syn can do this other thing.

FABIO URBINA: It can put together a drug, basically.

LATIF: So how this works is drugs are basically just made up of molecules. And the thing about molecules that work as drugs ...

FABIO URBINA: Is they have a certain molecular weight range, they have certain properties.

LATIF: They're distinct.

FABIO URBINA: You can look at them and say, "Okay, that's a drug," or "Oh, that looks nothing drug-like."

LATIF: And so using all these public databases and just inputting all this information into Mega-Syn about chemistry, molecular engineering ...

SEAN EKINS: In a sense, we've—we've tried to train it to make things that a chemist would make.

LATIF: And so with that knowledge, what they do is they take the rare disease and they say, "Okay, Mega-Syn. In the infinity of molecules that could be drugs that don't even exist yet, can you make something ..."

FABIO URBINA: That might be active against our disease of interest?

LATIF: "... that could work here?" So Fabio will enter all this stuff into Mega-Syn, hit 'Run,' and within minutes ...

FABIO URBINA: It'll spit out ...

LATIF: ... these brand new, never-before-seen molecules.

FABIO URBINA: Molecules that look like drugs.

LATIF: And then Sean and Fabio can go through these molecules and say, "Okay, this is the one we want to do this thing we need it to do."

LULU: Huh. Huh!

LATIF: To disrupt this disease that humankind doesn't have a cure for.

LULU: Oh my gosh! The speed of that just needs a moment.

LATIF: Yeah, yeah. It's kind of amazing. It's incredible. It's incredible.

LULU: Yeah.
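[To make that workflow a little more concrete, here is a minimal, hypothetical sketch of the kind of generate-then-filter loop Latif is describing: a generative model proposes candidate molecules, predictive models score each one for activity against the disease target and for toxicity, and only the promising, non-toxic candidates are kept for a chemist to review. The function names and thresholds below are placeholders for illustration, not MegaSyn's actual code or API.]

```python
# Hypothetical sketch of a generate-then-filter screening loop.
# generate_candidates, predict_activity, and predict_toxicity are stand-ins
# for whatever generative model and property predictors a pipeline plugs in.

def screen_candidates(generate_candidates, predict_activity, predict_toxicity,
                      n_candidates=10_000, activity_cutoff=0.8, toxicity_cutoff=0.5):
    """Return generated molecules predicted to be active against the target and non-toxic."""
    keepers = []
    for molecule in generate_candidates(n_candidates):
        activity = predict_activity(molecule)   # higher score = more likely to hit the disease target
        toxicity = predict_toxicity(molecule)   # higher score = more likely to harm the patient
        if activity >= activity_cutoff and toxicity <= toxicity_cutoff:
            keepers.append((molecule, activity, toxicity))
    # Rank the survivors so a chemist can review the most promising candidates first.
    return sorted(keepers, key=lambda item: item[1], reverse=True)
```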

LATIF: Yeah. Okay. So this is—I mean, this is where things start to get dark. So it's 2021, and these guys, they are just doing their thing, and they get invited to this conference.

SEAN EKINS: Called the Spiez or Spiez Convergence Conference.

LATIF: Kind of international security conference.

SEAN EKINS: And the goal of it really is to understand how technologies can be misapplied.

LATIF: Part of the theme is, like, this idea of, like, dual-use. So okay, for example, dual use is like, a nuclear bomb and nuclear power plants come out of the same technology.

LULU: Right.

LATIF: Like the so-called, like, double-edged sword.

FABIO URBINA: And so we got this invite. And of course, we're thinking ...

SEAN EKINS: Oh! Well, that's interesting. Why—why ...

LATIF: Why us?

FABIO URBINA: How can we misuse wonderful drug discovery tools?

LATIF: It's never even occurred to them. So they're sort of like brainstorming, like, "Okay, if we were, like, really evil ..."

FABIO URBINA: What would we do? How would we misuse what we know?

LATIF: They called it, like, their "Dr. Evil plan." Like, what would Dr. Evil do here?

FABIO URBINA: So yeah, it was—it was a very weird feeling.

LATIF: So they're, like, imagining and thinking it up ...

SEAN EKINS: Well, we were running out of time.

LATIF: ... when Sean has this idea.

SEAN EKINS: I hadn't given it a lot of thought. It was—it was pretty quick.

LATIF: So one of the things about Mega-Syn ...

FABIO URBINA: Is if we are trying to generate a new drug, we want to make sure it's not toxic.

LATIF: Fabio basically programmed this filter in Mega-Syn so that if the side effects of the medicine are gonna be worse than the medicine itself, like—or than the disease itself ...

LULU: Pass. Not interested.

LATIF: Like, don't—not interested.

LULU: Don't bother!

FABIO URBINA: Because of course, it doesn't matter if your drug cures all cancer, if it stops your heart from beating, it's going to not be a good drug.

LATIF: It'll save my life, but it'll also kill me. So ...

LULU: So I'll pass.

LATIF: Yeah, exactly.

LULU: Got it.

LATIF: And so Sean ...

SEAN EKINS: Well ...

LATIF: ... he thought ...

SEAN EKINS: What if instead of going right we went left?

LULU: Mmm ...

LATIF: What if we flipped the filter? What if we did the exact opposite, like, photo negative of that filter?

LULU: Yeah. Just spit out the deadlies?

LATIF: Yeah, exactly. And so ...

FABIO URBINA: I was using a 2015 Mac.

LATIF: ... that night ...

FABIO URBINA: Just did a couple of copy-and-paste changes. Typed a one where there was a zero and a zero where there was a one.

SEAN EKINS: It was that simple. It was literally that simple.

LATIF: He hit 'Run' on Mega-Syn.

FABIO URBINA: And then went home for the evening.
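[As a schematic illustration of the flip described above, and continuing the hypothetical sketch from earlier: reversing a single comparison in the toxicity filter turns the same pipeline from discarding predicted poisons into collecting them. The episode describes Fabio's actual change only as swapping a one and a zero; this is an illustration of the idea, not his real code.]

```python
# Schematic illustration only: the same hypothetical screening loop as before,
# but with the toxicity filter inverted. One flipped comparison is enough to
# turn a "discard predicted poisons" rule into a "keep predicted poisons" rule.

def screen_candidates_inverted(generate_candidates, predict_toxicity,
                               n_candidates=10_000, toxicity_cutoff=0.5):
    """Keep generated molecules that are PREDICTED to be toxic, ranked most toxic first."""
    keepers = []
    for molecule in generate_candidates(n_candidates):
        toxicity = predict_toxicity(molecule)
        if toxicity >= toxicity_cutoff:   # was: toxicity <= toxicity_cutoff
            keepers.append((molecule, toxicity))
    # The same ranking step now surfaces the most dangerous predictions first.
    return sorted(keepers, key=lambda item: item[1], reverse=True)
```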

LATIF: Next day, they open up the file, and it's just overwhelming. So many potential molecules.

SEAN EKINS: Tens of thousands of molecules.

LATIF: Which they run against a public database to find out if Mega-Syn had created anything truly terrible that already exists. So they're looking, they're looking, when all of a sudden they come across a match.

SEAN EKINS: With a super-hideous molecule ...

LATIF: Called VX.

[NEWS CLIP: So what exactly is VX?]

LATIF: It's the thing that ...

[NEWS CLIP: Top news today: a toxic substance was indeed the murder weapon ...]

LATIF: Do you remember that Kim Jong-Un poisoned his half-brother in an airport?

LULU: Right.

[NEWS CLIP: Two women now in custody.]

LATIF: That's VX.

[NEWS CLIP: Wiped his face with the toxic substance.]

LATIF: It's a nerve agent.

[NEWS CLIP: Developed in the United Kingdom in the 1950s.]

[NEWS CLIP: Banned by the United Nations and classified as a weapon of mass destruction, it is considered one of the most lethal chemical substances ever made.]

[NEWS CLIP: Ever created.]

SEAN EKINS: I mean, one way to think about it is if you think about salt, a few of those crystals of salt, if it was VX, would be enough to kill you.

LATIF: And if you did get exposed, your muscles would start to twitch, your pupils would dilate, you'd start sweating. Then you'd start to vomit. After that, your muscles would go completely slack. You'd be paralyzed, including your diaphragm, which would stop working, so you'd start to suffocate. And within a few minutes of being exposed, you would die.

LULU: Ugh! That's horrific.

LATIF: Yeah. I mean, it's awful. It's very awful. And Mega-Syn basically independently created it with the push of a button.

LATIF: What did you think was gonna happen?

SEAN EKINS: Um, I mean, realistically?

LATIF: Maybe it was just what did happen? I don't know.

SEAN EKINS: Yeah. Well, it was—I think what did happen was just the ease of it. I thought we would maybe get a few things that looked like VX.

LATIF: You did?

SEAN EKINS: Yeah. And we found a few, you know, in the literature, in publications.

LATIF: A few horrible things that humans had already created. And they figured that'd be it.

SEAN EKINS: But what we got was thousands of different molecules that looked like VX.

LATIF: Thousands of brand new, never-before-seen molecules that were actually predicted to be more potent than VX.

SEAN EKINS: Massively more potent. By orders of magnitude.

LATIF: This is just bad, right?

SEAN EKINS: Yeah, it's bad. It was like the alarm bells started ringing at that point.

LATIF: Because, according to Sean, if any chemist got their hands on this and wanted to make some of these molecules into weapons, if they did, because no one knows they exist, these weapons would be untraceable, undiagnosable.

LULU: Wow!

LATIF: Incurable.

LULU: Latif, this is so scary.

LATIF: It's really, really scary.

SEAN EKINS: I didn't sleep. I didn't sleep. I did not sleep. It was that gnawing away, you know, we shouldn't have done this. And then, oh, we should just stop now. Like, just stop.

LATIF: But in a minute, we're gonna keep going. Stay with us.

LULU: Lulu.

LATIF: Latif.

LULU: Radiolab.

LATIF: So Sean and Fabio have opened Pandora's box of chemical weapons.

FABIO URBINA: Yeah. Now we have this sort of file in our computer and all of a sudden holds all these warfare agents.

LATIF: Was there part of you that was just like, delete, delete, delete, and just pretend this never happened?

LULU: Just unthink the thought experiment?

SEAN EKINS: That definitely crossed our mind.

FABIO URBINA: Yeah. We definitely had that sort of reaction to this.

SEAN EKINS: Like, I don't wanna know any—I don't wanna know any more.

LATIF: But then they figured, wait a second ...

SEAN EKINS: Other people could do this.

LATIF: You do not need a PhD to do this. You just need some basic coding knowledge, a basic laptop, and then all the data is available online for free.

SEAN EKINS: This could be something people are already doing or have already done. You know, these tools are in the hands of people that there is no control of. You know, anyone could do this anywhere in the world.

LATIF: And so the two of them, in this moment, were faced with this dilemma of now that you know what you know, what do you do? Do you tell people?

SEAN EKINS: Well, we have to make people aware of these potential uses of the technology and show people that, yes, these technologies can be misused.

LATIF: So maybe people could prepare for it, try to prevent it.

SEAN EKINS: Exactly.

LATIF: But at the same time, if you tell people ...

FABIO URBINA: We could inspire instead of prevent.

SEAN EKINS: Then maybe people that would wanna do this, see how far they could push it.

LATIF: What do you do? What do you do? What do you do?

LULU: Ah!

LATIF: It's exactly the double-edged sword.

LULU: So what—what—what happened? So okay.

LATIF: So—so they make this decision together where they're like, "We gotta go to this international security conference and we gotta say this out loud. But we're not gonna show anybody the specifics, and we're gonna tell them just enough that they know that this is a really serious problem."

LULU: Yeah.

LATIF: "And we gotta flag it. And hopefully someone smarter than us can figure out a solution to this." And so they go to the conference, they present at the conference, they then publish a few months later a comment in this journal, which is basically, effectively what they say at the conference. And it blows up. Well, maybe that's the wrong word to use.

LULU: [laughs]

LATIF: But, like, it goes everywhere. Wired, Scientific American. There was a thing on it in the Washington Post. There was a thing on it in The Economist. There was a thing on it in this thing, and this thing. And a lot of people like me stumble upon this thing. There's a lot of, like, active discussion about it from not just chemical weapons people, but also AI people and people in the pharmaceutical industry, and philosophers and, you know, weapons people and, like, all kinds of different people are, like, weighing in on it and thinking about this and, like, what do you do here? So it's kind of—I think they got what they want, but they also, like—every night, they're going to bed thinking, like, "Tomorrow I could wake up and some horrible thing could happen, and it could have been because of me."

SONIA BEN OUAGRHAM-GORMLEY: Okay. Okay.

LATIF: So in the midst of reporting this story, I ended up talking to this expert.

SONIA BEN OUAGRHAM-GORMLEY: So my name is Sonia Ben Ouagrham-Gormley.

LATIF: She teaches at George Mason University.

SONIA BEN OUAGRHAM-GORMLEY: In the bio-defense program. And I study weapons of mass destruction, particularly biological weapons.

LATIF: And I found Sonia because, you know, I needed somebody else to talk to about this. I needed some kind of outside perspective because when I read Sean and Fabio's paper, I legitimately thought this was very, very terrifying.

SONIA BEN OUAGRHAM-GORMLEY: Yeah, that's—that's the impression it gives. And my point is that the thought experiment is just a thought experiment. It just shows that it is possible to identify new molecules, but there's a long way between the idea and the production of an actual drug or an actual weapon.

LATIF: Like, if you're a chemist who's gonna make something that just exists on paper, like, that takes a long time, a lot of investment, a lot of work, a lot of thought.

SONIA BEN OUAGRHAM-GORMLEY: And it's already hard enough to do it with chemicals that already exist.

LATIF: She said there's plenty of examples of scientists ...

SONIA BEN OUAGRHAM-GORMLEY: Who try to transform them.

LATIF: Who try to tweak them just a little bit.

SONIA BEN OUAGRHAM-GORMLEY: To make them more harmful. And very often they failed.

LATIF: Because it's just super, super difficult.

SONIA BEN OUAGRHAM-GORMLEY: That's the point.

LULU: Hmm.

LATIF: And I told her it was still kinda hard to wrap my head around it because the way that I was thinking about it, I was actually thinking of, like, the Anarchist Cookbook of molecules. It's almost like recipes or something, right? It's like this thing spat out 40,000 recipes or whatever, right?

LULU: And then you just go to the store and get some fertilizer and, you know, and make it. And it's the exact proportions.

LATIF: Exactly. So I was worried about that.

LULU: Right. I'm picturing that too.

LATIF: Is that the right analogy, or would you use, like, a different analogy?

SONIA BEN OUAGRHAM-GORMLEY: Um ...

LATIF: And she was like, "No, you're—actually, that's the same metaphor I use when I'm teaching my students. But ..."

LULU: Oh, no! [laughs]

LATIF: "... think about it this way."

SONIA BEN OUAGRHAM-GORMLEY: If we take the analogy of a cake, for making a cake.

LATIF: These molecules are, like, really fancy cakes.

SONIA BEN OUAGRHAM-GORMLEY: And based on what I read, right? Based on the article, what they have is a list of different ingredients.

LATIF: I feel like your takeaway about this paper is that you—you have not lost any sleep over this paper.

SONIA BEN OUAGRHAM-GORMLEY: No. No, I think I found it a little bit too alarmist.

LULU: Huh. So Sonia is soothing us.

LATIF: Yeah. But there's still a problem, and it's not a small one.

FABIO URBINA: We were just contacted by the White House this morning.

LATIF: Oh my God! Really?

FABIO URBINA: Yeah. We weren't expecting that, so yeah.

LATIF: What did they say?

FABIO URBINA: So they want us to brief them on the paper, basically.

LATIF: Holy crap!

FABIO URBINA: Apparently it's causing a lot of buzz in the White House right now.

LULU: Whoa!

LATIF: And so I actually followed up with Sean after our first interview about this very thing.

SEAN EKINS: So we ended up doing a Zoom call with folks from the Office of Science and Technology Policy and the National Security Council, which was very surreal.

LATIF: So this is just in March, 2022. So they do this presentation for the White House folks.

SEAN EKINS: I think they were worried that we were, you know, kind of crazy people, and we were just gonna let all of this information out there.

LATIF: But at the end of their presentation, there's like a Q&A, and the White House folks are asking them questions.

SEAN EKINS: "Is this information, you know, sort of locked away somewhere?"

LATIF: And it is.

FABIO URBINA: So yeah, you know, one of the first things I'm gonna do is put it in a file and encrypt it really heavily.

LATIF: And it's on a computer that is not connected to the internet. It's air gapped.

SEAN EKINS: Fabio has it on his machine, locked away, encrypted.

LATIF: But as the White House staff kept asking questions, Sean was like, "Oh."

SEAN EKINS: I was just waiting for the question.

LATIF: He's, like, anticipating it.

SEAN EKINS: It was just when are they gonna ask the question? The elephant in the room, right?

LATIF: And then finally they ask the question.

SEAN EKINS: "Uh, can we have the data?"

LATIF: And it's like another one of these moments where Sean has to decide, you know, maybe the government should know about this so that they can anticipate it, try to regulate it. But on the other hand, now you're actually handing over the list to one of the most powerful governments in the world. Like, if anybody has access to Michelin-starred chefs, here they are. And Sean, it's his call, right? Because he's the CEO of the company. And Sean says ...

SEAN EKINS: You know, no.

LATIF: No. And to me, that was like—that was like, "Oh my God!" I—how do you even?

LULU: And how did they respond?

SEAN EKINS: Well, the reaction was basically, I think it was like, "Okay." I mean, it was pretty, "All right. Well, we'll just have to accept that."

LATIF: And what's your rationale for saying—why not share it with them?

SEAN EKINS: I just didn't feel that I wanted to hand it over to them. I mean, we had other scientists reach out to us as well asking exactly the same question, and I told the White House that I didn't feel like, you know, we should do that, we should give it over to them. And I just didn't feel that it was right. I mean ...

LATIF: So you were basically like, "No to you, White House, and no to everybody. We're not sharing this with anybody."

SEAN EKINS: Right.

LATIF: And you've told everybody about the existence of the list, but you're not sharing the list. What's the impetus there?

SEAN EKINS: I mean, it's bad enough to put the article out there and tell people how to do it and watch it sort of spread out across the globe, but if I gave them the list of molecules, it's almost like a red flag to a bull, right? The chemists out there are just gonna try to figure out all right, which ones are the easiest ones to make and then just make them. And I don't want to be the person that has to say, "Yeah, I'm responsible for that."

LATIF: And so for now, the list just sits on Fabio's computer, kind of waiting to be used in case of an emergency, in case someone were to recreate it somewhere else and actually start making some of these things.

LATIF: Reporting this story, I talked to a lot of people—AI experts, weapons experts, that kind of thing—and also, like, from my own expertise studying the history of science, this is a thing that just happens where something gets created and someone finds a way to use it for the opposite purpose, right? That's the story of the nuclear bomb and nuclear power. It's the story—and this is one we've actually done on the show before—of the nitrogen-fixing process that can give you fertilizer to feed billions or, you know, gunpowder and explosives. This kind of thing just happens over and over and over in history. But I think the thing that struck me about what Sean and Fabio were doing is, when it came to changing this machine from making drugs that would help people, just how easy it was ...

FABIO URBINA: I just did a couple of copy and paste changes.

LATIF: ... to flip it.

FABIO URBINA: Typed a '1' where there was a '0' and a '0' where there was a '1.'

SEAN EKINS: It was that simple. It was literally that simple.

LATIF: Like, the line between this thing doing good and this thing doing bad felt so thin. And it was in thinking about this that I reached out to an old friend of mine, a guy named Yan Liu. He studies ancient Chinese medicine—he now works at the University at Buffalo. And I called him because he always would tell me this thing that when it comes to drugs ...

YAN LIU: You know, there's no clear boundary between what is poison and what is medicine.

LATIF: That an individual drug can be good or bad for you. It could go either way.

YAN LIU: It is the matter of the dose that makes a difference.

LATIF: That it's all about the dosage or also about the intent of the person giving it to you. Or about the body of the person who's taking it.

YAN LIU: It's more about the context ...

LATIF: ... than it is about the drug itself.

YAN LIU: Exactly.

LATIF: And I showed him Sean and Fabio's paper and he was like, "Oh, this feels like a toxicology paper."

YAN LIU: This separation between pharmacology and toxicology, you know, we have a long tradition especially in the West. It's the sense of separation between what is good and what is bad.

LATIF: But he's like, pharmacologists and toxicologists are studying the same molecules, but it's like, we just want them to be separate.

YAN LIU: Exactly. But every substance has the potential to either heal or to kill.

LATIF: And that's what I keep thinking about with this story: that there's no real line here. Everything is capable of everything, and it's just a matter of how we choose to use it.

LULU: Stick around because coming up after the break, we look at the technology we are already using, and ask are we so sure this is how we want to use it?

LATIF: Okay.

LULU: Okay. Hey, this is Radiolab. I'm Lulu Miller.

LATIF: And I'm Latif Nasser. In the first segment, you heard all about one technology and the potentially toxic stuff it could spit out. Now we're going to another technology, one you're certainly more familiar with, and the very real way it is currently making all of our lives more toxic—but maybe we're okay with that?

LULU: We should say that this story does contain both language and discussions that may not be appropriate for younger listeners.

SIMON ADLER: Come on, come on. Don't be a sassafras.

RACHAEL CUSICK: Hello, hello, hello.

SIMON: Hey!

RACHAEL: Lovely!

SIMON: Hey, this is Radiolab, and I'm Simon Adler sitting in for Lulu and Latif this week.

SIMON: Yeah, how do you feel that the B team has been sent in for this?

RACHAEL: [laughs] Yeah, they're like, "You know, the understudy of the understudy was out today so you're gonna have to take Simon Adler."

SIMON: Yeah, so you're gonna have to take—yeah, just the ...

RACHAEL: The ushers.

SIMON: [laughs] The ushers. Exactly.

SIMON: Because a while back, our reporter/producer Rachael Cusick, she sat me down in the studio to tell me a story about both how beautiful we humans are, but also just how downright awful we can be. And the tricky business of deciding who should be held responsible when that ugly part of us shows.

RACHAEL: Let's hit it.

SIMON: Mm-hmm.

RACHAEL: Okay, so we are gonna start on a stoop in Harlem.

MATTHEW HERRICK: Over in West Harlem. In what, 2016? Yeah, 2016.

RACHAEL: With this guy.

MATTHEW HERRICK: Matthew Herrick.

RACHAEL: Wait, do you go by Matt or Matthew?

MATTHEW HERRICK: Most people call me Matthew.

RACHAEL: Cool.

RACHAEL: At the time, Matthew had recently moved from LA to New York City.

MATTHEW HERRICK: It was definitely, you know, a punch to the face if you will.

RACHAEL: Trading palm trees and sunshine for a smelly city stoop.

MATTHEW HERRICK: Yeah.

RACHAEL: Anyway ...

MATTHEW HERRICK: I think it was around October, mid-October. I was sitting on my stairs, just like in the front of my building.

SIMON: What's this guy look like, do we know?

RACHAEL: Like, tall, muscular, salt-and-pepper hair nowadays. I think probably just pepper back in the day.

MATTHEW HERRICK: And I was having a cigarette, and a gentleman walked up and stopped and stood in front of me. And, you know, it's New York, so there's a lot of [bleep] weirdos. So I just, you know, figured it was just some weirdo being weird.

RACHAEL: Like, "Whatever. I'm just gonna ignore them."

MATTHEW HERRICK: Yeah. Like, it's all good. So I'm, like, kind of avoiding eye contact, but then I realize that they're standing there for, you know, an extended period of time. So I looked up ...

RACHAEL: And it's someone that he doesn't know, someone that he doesn't recognize, but this guy ...

MATTHEW HERRICK: He went, "Hey Matt." And I was like, "Hi?" Like, how the hell do you know my name? And he says, "It's so-and-so. We were talking on Grindr."

RACHAEL: Grindr. Dating app used primarily by gay men. And so this stranger tilts his phone towards Matthew, and he's like, "Look."

MATTHEW HERRICK: And it's a profile on the app. With, you know, a picture of me. And I was like, "That's not possible."

RACHAEL: Like, that is a photo of me.

MATTHEW HERRICK: But that's not me.

RACHAEL: That is not a profile that I made. I'm not on Grindr, you know? So he looks up at the guy ...

MATTHEW HERRICK: "Like, I don't know how to explain this. I don't know who you're talking to. I'm very confused right now. Like, you need to leave." And I got up and I went inside. And I remember I looked at my roommate Michael and I was like, "Someone just showed up looking for me from Grindr."

RACHAEL: "Like, how weird is that?"

MATTHEW HERRICK: Yeah. Little did I know ...

RACHAEL: Because after that, another guy came. And then another.

MATTHEW HERRICK: People started showing up.

RACHAEL: It just keeps happening.

MATTHEW HERRICK: Sometimes I would be home, or sometimes I'd be leaving the building and there would be people outside.

RACHAEL: Each time a different man.

MATTHEW HERRICK: You know, one or two a week.

RACHAEL: All saying the same thing. Like, "We were talking on Grindr. Let's have sex." So he reports the profile to Grindr, got the, like, automatic reply.

MATTHEW HERRICK: "We'll get back to you soon," or whatever the hell it said."

RACHAEL: But he doesn't actually hear back from anyone at the company. And meanwhile ...

MATTHEW HERRICK: People were showing up to my home in large volumes. And, like, I'm living my life. Leave me alone. Can I get a break?

RACHAEL: And finally one night, he's annoyed, he's fed up ...

MATTHEW HERRICK: I stood up and I said, "[bleep] this."

RACHAEL: And he decides he is going to get to the bottom of this.

MATTHEW HERRICK: Yeah, so I downloaded Grindr.

RACHAEL: He makes a profile.

MATTHEW HERRICK: Without a photo.

RACHAEL: And then logs on.

MATTHEW HERRICK: And I saw the fake profile of myself. Very close proximity.

RACHAEL: Grindr actually has this map feature where you can see where other people are who are on Grindr. And this person who had Matthew's name and his photo is, like, right there, right outside his apartment.

MATTHEW HERRICK: So I walked outside, and I remember looking down the street and he took off running. And I went and chased after him.

RACHAEL: And while he's running, Matthew is looking at the Grindr app, scanning for the fake him.

MATTHEW HERRICK: Because you can refresh it and it'll tell you how far that person is away.

RACHAEL: And this fake Matthew, he was, like, 20 feet away.

MATTHEW HERRICK: So I'm walking along Morningside Park. Refresh the app.

RACHAEL: Then 15.

MATTHEW HERRICK: I was walking and I was walking ...

RACHAEL: Then 10.

MATTHEW HERRICK: And I remember I looked down and I was like, "He's five feet away. How is he five feet away?" And I stood on the park bench, and I looked over the fence, and laying face down in the bushes was JC.

RACHAEL: His ex, JC.

MATTHEW HERRICK: And I screamed, "I [bleep] caught you! I caught you! I knew it was you!"

RACHAEL: JC started yelling back at him. Matthew ran away, JC chased him. The cops got called. It was a mess.

SIMON: Ugh, so an ex-lover made a fake profile for the purpose of terrorizing him?

RACHAEL: Yeah. Matthew and this guy JC had started dating not long after Matthew arrived in the city.

MATTHEW HERRICK: And we dated for I want to say eight to ten months.

RACHAEL: Matthew saw some ...

MATTHEW HERRICK: Little red flags.

RACHAEL: ... and ended things. And that's when these people started coming.

SIMON: Hmm.

RACHAEL: I think once Matthew broke up with him, he was like, "If you don't want to date me then, like, screw you. I'll make your life a living hell." Anyhow, once he knew it was JC ...

MATTHEW HERRICK: I ended up getting an order of protection against him.

RACHAEL: And so JC couldn't go anywhere near him in real life. But an order of protection doesn't really apply when JC's sending other people to his door.

MATTHEW HERRICK: There was no ramifications for what he was doing.

RACHAEL: There wasn't anything the courts or the cops could do about it. So Matthew contacts Grindr again, and is like, "This is the guy. Shut him down, please!" But still ...

MATTHEW HERRICK: Nothing. No acknowledgement.

RACHAEL: We reached out to Grindr for comment. Didn't hear back. Anyhow, with Grindr doing nothing, JC went on the offensive. He actually made multiple fake Matthew profiles.

MATTHEW HERRICK: There were two to three existing on the platform. And that's when, you know, the gay zombies started coming for me.

RACHAEL: [laughs] Do you call them the gay zombies?

MATTHEW HERRICK: Yeah, 'cause it's like everyone's like, "Maaaat!"

RACHAEL: Just like, "Must get sex now!"

MATTHEW HERRICK: Yeah. I would leave at six o'clock in the morning to walk my dog, there would be somebody outside waiting for me. And I would come home at night, 11:30, 12:00 at night, there'd be somebody waiting for me. Literally every single day of my life.

RACHAEL: And it wasn't just a lot of awkward but harmless encounters because these profiles ...

MATTHEW HERRICK: Said I was looking for rape fantasies.

RACHAEL: Ugh!

RACHAEL: Matthew would try to explain the situation to people calmly, but then ...

MATTHEW HERRICK: The profiles were telling these individuals it was part of my turn on, so to stay and then approach me again.

SIMON: Just diabolical!

RACHAEL: Yeah. And so again he tries reporting the profiles.

MATTHEW HERRICK: I had friends reporting profiles, family reporting profiles. Sending emails to the company.

RACHAEL: Again nothing.

MATTHEW HERRICK: I was slammed against the wall.

RACHAEL: Oh my God!

MATTHEW HERRICK: There was someone who broke into my building and physically assaulted my roommate while trying to get to me.

RACHAEL: He goes to the cops.

MATTHEW HERRICK: To file police reports and they would turn me away. They were like, "We don't understand." I don't think anybody really could grasp what I was actually talking about.

RACHAEL: JC started making profiles that promised people crystal meth, and said they should show up at the restaurant where Matthew worked.

MATTHEW HERRICK: I was taking an order at a [bleep] table, and I remember this guy is tapping on my shoulder saying my name, high out of his mind. And I'm looking at the people that are sitting down and they're looking back up at me, and they're like, "What is going on?" And my eyes are just welling up with tears because I'm like, "Oh my God." And I'm like, "How do you want your burger cooked?" You know what I mean? [laughs]

RACHAEL: And this went on for months.

SIMON: Jesus!

RACHAEL: Did you hate hearing your name at that point in your life?

MATTHEW HERRICK: Oh, I hated it. I hated everything about existing. I hated it all. Like, I remember sitting there saying to myself, "Like, I either want to [bleep] blow my brains out or throw myself off a building."

RACHAEL: And then one day, Matthew's talking to his lawyer.

MATTHEW HERRICK: My family court lawyer. She said, "Hey, there's this woman named Carrie Goldberg. She might be able to help you."

RACHAEL: So he takes the train to downtown Brooklyn.

MATTHEW HERRICK: I sat in her office and she told me a little bit about herself.

CARRIE GOLDBERG: I mean, as background, I started this law firm after I had been the target of a malicious and creative and relentless ex.

RACHAEL: Attorney Carrie Goldberg.

CARRIE GOLDBERG: One of the most malicious things that my ex was doing was—was blackmailing me with naked pictures and videos that he had. Contacting everybody in my life. He's making false IRS reports against my family.

RACHAEL: Now at that time Carrie was already a lawyer herself, but she really only did family law stuff—wills, guardianships, things like that, and ...

CARRIE GOLDBERG: I had so much difficulty during that process finding a lawyer who knew what to do and, like, was at the intersection of intimate partner violence and internet law and First Amendment and knew how to get an order of protection, I was really desperate.

RACHAEL: And so after this all ended ...

CARRIE GOLDBERG: I became the lawyer I'd needed.

RACHAEL: A lawyer for people like Matthew.

MATTHEW HERRICK: So she told me her story and I told her mine. And before I could finish, she said, "I would like to represent you. I think we can slow this attack on you."

RACHAEL: And the way to do that, Carrie said, was to go after Grindr, take them to court, and argue that this guy JC used Grindr essentially as a weapon, that Grindr knew all about it and did absolutely nothing to stop it. And so, day of the hearing, Carrie and her team file in, sit down looking confident at their little table.

CARRIE GOLDBERG: We're pretty badass litigators. [laughs]

RACHAEL: And across the aisle is, of course, Grindr's lawyers.

SIMON: Mm-hmm.

RACHAEL: And as the hearing begins, the Grindr guys, they stand up. I'm imagining they do that thing that men do where they, like, put their tie, tucked in, like, inside their jacket and they're like, "Your honor, we don't have to do anything because of Section 230."

SIMON: Section 230.

RACHAEL: Yeah. And the judge is like: "You're right. We don't have to go any further. That's the end of this." [bang bang]

MATTHEW HERRICK: It was over. It was over. And I said, "What the [bleep] is Section 230?" I didn't even understand what that meant.

ELIE MYSTAL: Okay. So let's start there.

RACHAEL: Okay.

ELIE MYSTAL: Section 230 is a provision passed by Congress in 1996. That's not a typo—1996, right? [laughs]

RACHAEL: Attorney and justice correspondent at The Nation magazine, Elie Mystal.

ELIE MYSTAL: So that's how old this law is. Now it's worth pointing out that most of the rest of the law is no longer good law. It's been amended, it's been shaped, it's been overcome with kind of newer, better laws that take into account what the internet has actually become, but Section 230 is the beating core that remains.

RACHAEL: And 230, it does one simple thing.

ELIE MYSTAL: It relieves internet companies of liability for illegal things posted on their websites.

RACHAEL: Meaning, he says, not only in Matthew's case, but in others like it when someone lies about someone else or threatens them or even tries to do them some kind of harm using an app or a website, these tech companies, they get off scot free.

ELIE MYSTAL: That's exactly what's happening. Section 230 is fundamentally at core a liability shield.

RACHAEL: A shield that no other industry gets except the tech industry. In short, Section 230 ...

MATTHEW HERRICK: ... makes the tech world untouchable.

CARRIE GOLDBERG: It's just not fair.

RACHAEL: So unlike Matthew, Carrie already knew about Section 230. She knew the Grindr lawyers would use it against Matthew, and so she had actually been trying out this way to get around 230 by arguing that Grindr was a faulty product that harmed Matthew as a consumer. But the judge wasn't buying it, and with Matthew, appeal after appeal after appeal, the case just kept getting dismissed, each time because of Section 230. And so ...

CARRIE GOLDBERG: Section 230 is my nemesis.

RACHAEL: ... she hates it. And weirdly enough, this hatred Carrie feels ...

[ARCHIVE CLIP, Ted Cruz: As you know, Google enjoys a special immunity from liability under Section 230 of the Communications Decency Act.]

RACHAEL: ... is shared by a lot of people.

[ARCHIVE CLIP, Ted Cruz: A lot of Americans have concerns.]

RACHAEL: ... Conservative lawmakers like Ted Cruz.

[ARCHIVE CLIP, Joe Biden: I'm calling on Congress to get rid of special immunity for social media companies and impose much stronger ...]

RACHAEL: President Joe Biden called to have it removed.

[ARCHIVE CLIP, Donald Trump: Section 230. We have to get rid of Section 230, politicians.]

RACHAEL: And so did former President Donald Trump. That is the thing about Section 230: it's kind of built this, like, king-sized mattress of strange bedfellows who are all teaming up and saying, "We want it gone."

MATTHEW HERRICK: It is literally this, like, ominous, looming monster!

RACHAEL: But Matthew ...

MATTHEW HERRICK: I don't think they should get rid of it.

RACHAEL: ... is not one of those people. Because even though this law lets companies like Grindr completely ignore what happens on their platforms ...

MATTHEW HERRICK: Without Section 230, we couldn't live the way we do today.

ELIE MYSTAL: It is the law that makes the internet possible. And so now we're really getting into the heart of Section 230.

RACHAEL: And we're gonna go backwards ...

SIMON: All right!

RACHAEL: ... to a time not that long ago when what the internet would be, what it would look like and feel like was a very open question. A time when sort of anything seemed possible. It's a world of ...

SIMON: Sorry, what year are we in?

RACHAEL: Blah blah blah. Okay, so 1992 ...

[ARCHIVE CLIP: Things are starting to happen.]

[ARCHIVE CLIP: Things are starting to happen.]

ELIE MYSTAL: Back when getting on the internet required somebody else in your house to get off the phone.

RACHAEL: Dun dun dun de dun dun dun. The internet has evolved from this thing that really only academics used ...

[ARCHIVE CLIP: It's taken us five years of real hard work to develop a system like this.]

RACHAEL: ... to something niche and nerdy communities are playing on in the form of chat rooms.

[ARCHIVE CLIP: It's asking, "Why not go after real people?"]

RACHAEL: To finally ...

[ARCHIVE CLIP: Introducing the power of Prodigy.]

RACHAEL: ... something that everyday people like you and me were using through these bulletin boards.

[ARCHIVE CLIP: What Prodigy does is connect our computer with a fantastic range of services.]

RACHAEL: Prodigy was one of these main early bulletin boards. And, you know, it let people post something on there and then other people would comment on it.

CHUCK EPSTEIN: It had no graphics, no pictures of any kind. It was only text. And although it may have been primitive, you had access to information all around the world.

RACHAEL: And as amazing as that was, as more and more people were logging on to get world news or share recipes or share their opinions about financial markets ...

[ARCHIVE CLIP: [bing bing bing] Prodigy needs your attention. You have new mail.]

RACHAEL: ... these bulletin boards, they began to get ...

[ARCHIVE CLIP: You're dumb.]

RACHAEL: ... heated.

[ARCHIVE CLIP: Instant message!]

[ARCHIVE CLIP: Obscene.]

[ARCHIVE CLIP: Dummy.]

[ARCHIVE CLIP: Fuck you.]

[ARCHIVE CLIP: Indecent.]

[ARCHIVE CLIP: Go fuck yourself.]

[ARCHIVE CLIP: Pornographic.]

[ARCHIVE CLIP: Go fuck yourself.]

[ARCHIVE CLIP: Goodbye.]

RACHAEL: And so guys like Chuck ...

CHUCK EPSTEIN: Chuck Epstein, moderator of the Prodigy Money Talk bulletin board.

RACHAEL: ... were brought in to turn down the temperature by removing posts that went too far.

CHUCK EPSTEIN: So I just took down swear words, derogatory words, racial slurs, et cetera.

RACHAEL: And it's just you there. You're all by yourself.

CHUCK EPSTEIN: That's correct. I was the only one who had the software, the moderating software. And there were, oh, easily a couple thousand posts per day on the Money Talk bulletin board, about stocks, bonds, real estate, equities.

RACHAEL: Hmm.

CHUCK EPSTEIN: So it was exciting.

RACHAEL: [laughs]

RACHAEL: And so Chuck, he managed to create this little neighborhood where people could connect and say what they wanted, but where he could also be a kindly curator, make sure that no one got out of line. Until ...

CHUCK EPSTEIN: Well, one evening I was at my house. Took my poodle out the front door for a walk.

RACHAEL: Fluffy little fella.

CHUCK EPSTEIN: He was a—a miniature French poodle, Bo. And we walked down the street about, you know, 40 paces.

RACHAEL: Bo does his business, Chuck stretches his legs.

CHUCK EPSTEIN: Then a man literally jumps out of bushes.

RACHAEL: Oh my God!

CHUCK EPSTEIN: It was like from the spy movies. I didn't know what this guy was doing.

RACHAEL: And standing there under a streetlight ...

CHUCK EPSTEIN: He says, like, "Mr. Epstein?" I said, "Yes?" And he hands me a piece of paper. In an—it was an envelope. And he says, "Thank you. Have a nice night."

RACHAEL: So Chuck turns around, walks home quickly.

CHUCK EPSTEIN: I went back in the house and opened the envelope, and ...

RACHAEL: The first thing that he sees on this piece of paper, it says ...

CHUCK EPSTEIN: "Stratton Oakmont versus Prodigy" in big letters at the very top. You know, I said, "What the hell is this?"

RACHAEL: Turns out that Stratton Oakmont was suing Chuck's employer, Prodigy.

SIMON: Mm-hmm.

RACHAEL: Claiming that someone had used Chuck's bulletin board to smear their company, saying that their president ...

CHUCK EPSTEIN: Was a thief, involved in some scams. And Stratton Oakmont was a criminal organization. Basically attacked the reputation and the financial acumen and the honesty and the ethics of Stratton Oakmont.

RACHAEL: And that these posts constituted defamation.

CHUCK EPSTEIN: In this $100-million lawsuit.

RACHAEL: Now as would be discovered years later, these posts were not defamatory. In fact, Stratton Oakmont and their president were doing so many illegal things that they would one day inspire Leonardo DiCaprio ...

[ARCHIVE CLIP, The Wolf of Wall Street: Was all this legal? Absolutely not.]

RACHAEL: ... in the film The Wolf of Wall Street.

[ARCHIVE CLIP, The Wolf of Wall Street: We don't work for you, man!]

[ARCHIVE CLIP, The Wolf of Wall Street: You have my money taped to your boobs. Technically you do work for me.]

SIMON: [laughs]

RACHAEL: Yeah, Jonah Hill's character was actually based on the guy who cried defamation. But at the time of this suit, nobody knew anything about that, and so ...

CHUCK EPSTEIN: The lawsuit was about—it went over ...

RACHAEL: ... when the thing went to trial, these wolves of Wall Street, they argued that because Prodigy employed people like Chuck, moderators who left posts up and took posts down, they were responsible for every defamatory post they left up. And this judge agreed.

CHUCK EPSTEIN: The judge ruled that Prodigy was responsible.

[ARCHIVE CLIP: The world of Prodigy.]

RACHAEL: Now the irony here is that right around this time there was another company ...

[ARCHIVE CLIP: ... access to the internet. Enter CompuServe.]

RACHAEL: ... CompuServe.

CHUCK EPSTEIN: And CompuServe did not hold itself out to be a family-friendly bulletin board.

RACHAEL: They were just like Prodigy, but they had no moderators, no Chucks. Didn't take anything down: all the swear words, defamation, racial slurs, all of it stayed there. And when they went to trial in a very similar online defamation lawsuit, they won. And so weirdly, in this situation, if Prodigy had never set out to be a family-friendly place, if they said, "Whatever you want, have it," they would not have lost this lawsuit.

SIMON: Well, that seems completely ass backwards.

RACHAEL: Yes! Yes!

CHRIS COX: What the law was saying is that if your approach is anything goes, then you'll be scot-free, but if you attempt to have rules of the road then we're gonna make you responsible for every piece of content that's uploaded by every single one of your users.

RACHAEL: Former Republican Representative of California Chris Cox. And when he learned about this, he was like, this is not the way we want the internet to be regulated.

CHRIS COX: Because of the—the obvious consequences. You know, the rate of increase in users of the internet was exponential, and it was clear that this new means of communication was gonna be of vital importance either for good or for ill.

RACHAEL: And he worried that this precedent set by these two cases, like, reward the wild wests, punish the family-friendly sites, that that could be disastrous.

CHRIS COX: And one of the great things about being a member of Congress is that when you pick up the newspaper and you see something that needs to be fixed, you say "There oughta be a law," and then your next thought ...

RACHAEL: You're like, "Who can do this for me?" [laughs]

CHRIS COX: Yeah. I could do that.

RACHAEL: But he needed a partner. So ...

CHRIS COX: I made a beeline to my friend Ron.

RON WYDEN: Ron Wyden, one of Oregon's United States senators.

RACHAEL: Democrat. Little buds. They get ice cream together.

SIMON: Mmm!

RON WYDEN: Chocolate chip for me.

CHRIS COX: I'm chocolate, although when I'm being very extravagant I have one scoop of vanilla and one scoop of chocolate.

RACHAEL: That's living the life. [laughs]

RACHAEL: And he says, "Ron, like, I think it's really, really important that we do something about this." Explained these two cases, and how ...

RON WYDEN: You know, online platforms were offered a choice: you could either police your website and be liable for everything even if something slipped through, or you could turn a blind eye and not police anything. And Chris and I said, "Maybe we can come up with something that really promotes the best internet."

RACHAEL: An internet where sites could take down what they wanted without getting in trouble.

RON WYDEN: And the point was to keep it really simple. So Chris and I went to the House dining room and sat by ourselves and we put this together.

RACHAEL: A couple days later ...

RON WYDEN: We're done. It wasn't perfect by any means.

RACHAEL: And do either of you know it by heart? I'm sure you do because you talk about this all the time, but could you just say it just so we have it on the recording?

CHRIS COX: Yes, sure. What it says is that, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

RACHAEL: In other words, these internet companies could control the things that got posted on their sites as they saw fit without the threat of being sued. And so ...

[NEWS CLIP: Right now we're gonna take you over to the Library of Congress for this signing ceremony. Mr. Clinton uses the same ...]

RACHAEL: ... February 8, 1996.

[ARCHIVE CLIP, Bill Clinton: Today, our world is being remade yet again by an information revolution.]

RACHAEL: In a wood-paneled hall, President Clinton signed these 26 words into law.

[ARCHIVE CLIP, Bill Clinton: This historic legislation recognizes that with freedom comes responsibility. This bill protects consumers against monopolies. It guarantees the diversity of voices our democracy depends upon. Perhaps most of all, it enhances the common good.]

RACHAEL: I mean, just as one example, if it hadn't passed and sites remained liable for every little thing that we posted ...

CHRIS COX: You couldn't imagine a project like Wikipedia taking off as it has.

RACHAEL: Or the #MeToo movement, or that ice bucket challenge that raised millions of dollars for ALS research. Or #BlackLivesMatter.

CHRIS COX: They absolutely could not exist without Section 230.

RACHAEL: I mean, Section 230 let websites moderate as best as they could without the threat of constantly being dragged to court, which allowed space for this massive online wave that we're all still surfing today. But of course, waves can be dangerous, and now more than ever it's starting to feel like we could use some more lifeguards. Because, you know, these wonky little bulletin boards that sparked all this, they evolved into comment sections, which evolved into social media platforms like Facebook and Twitter and Instagram, and then into dating apps like Tinder and Grindr. And before we knew it, billions of people were on these things. And while these sites have enabled lots of good things to happen ...

[ARCHIVE CLIP: Nasty texts and Instagram posts.]

[ARCHIVE CLIP: Fat fat fatty.]

[ARCHIVE CLIP: Drink bleach and die.]

RACHAEL: ... they've also given us this whole new universe of ways to be cruel to one another.

[ARCHIVE CLIP: Cyberbullying.]

[NEWS CLIP: …doxxing.]

[NEWS CLIP: Revenge porn.]

RACHAEL: And even though the platforms make some efforts to weed out the bad stuff, so much of it gets through.

[ARCHIVE CLIP: Suck my [bleep].]

[ARCHIVE CLIP: Kill yourself.]

[NEWS CLIP: They put my pictures on there, put my little stub on my name and everything. Report the account, please!]

RACHAEL: And when someone comes to them and says please make it stop, like Matthew, our Grindr guy, or countless other people ...

[NEWS CLIP: This is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.]

RACHAEL: ... they say, "It's not our problem. Section 230."

[NEWS CLIP: I really don't think I'll ever get these images down from the internet. And I just—I'm sorry to my husband. I'm sorry to my children.]

ELIE MYSTAL: Again, Section 230, while critical to how the internet was made, critical to how it functions, is old!

RACHAEL: Once again, justice correspondent Elie Mystal.

ELIE MYSTAL: Our laws should be updated to reflect how the internet works today, not how it worked in 1996. And so there is a coalition amongst hardcore conservatives and some progressives to do something about Section 230 and take it away.

RACHAEL: It seems not unlike 1996 when Section 230 passed. Like, there's this open question again of what is the sort of internet that we want?

ELIE MYSTAL: However, the bottom line is that we don't know what's going to happen to the internet if we take away Section 230. One way it could go is for everybody to go back to a wild wild west scenario where there is no moderation anywhere at all, right?

RACHAEL: Mm-hmm.

ELIE MYSTAL: However, the other way it could go would be to have extreme moderation, nobody has open comment threads, nobody has a forum where they can say whatever they want. Everything is either completely closed off, or highly monitored by an AI, by the algorithm that is just without pride or prejudice just running around and smacking people based on keywords. Doesn't matter the context, right?

RACHAEL: Which, you know, it would take out racial slurs, problematic stuff, but it also might weed out these kernels of ideas that led to the Arab Spring, and #BlackLivesMatter, and #MeToo.

ELIE MYSTAL: So only the most kind of anodyne, Disneyfied, "I like soup!" Right?

RACHAEL: [laughs] Are those options, like, both equally likely if Section 230 were to go away?

ELIE MYSTAL: Well, are you conservative or are you liberal?

RACHAEL: [laughs]

ELIE MYSTAL: Because what you think is more likely really tracks with your politics right now. Liberals, at least the ones who also think Section 230 should be taken away, think these—that the social media platforms will go full on aggressive stopping hateful comments. However, conservatives like Josh Hawley really think that it's gonna go the other way. That in a post-Section 230 world, because of the threat of liability, these companies they would go on a wild west format and just let everything ride so nobody gets in trouble.

RACHAEL: But the problem there, Elie says, is ...

ELIE MYSTAL: You've gotta be able to turn the internet upside down and shake money out of it, right? Like, none of this happens if somebody can't make money off of it, right?

RACHAEL: Meaning in most cases ...

[ARCHIVE CLIP: Sometimes I just want to rent a car and go, you know?]

[ARCHIVE CLIP: I do know. And I think I can help you with that.]

[ARCHIVE CLIP: Really?]

ELIE MYSTAL: Advertisers, yeah?

[ARCHIVE CLIP: I love Hertz!]

[ARCHIVE CLIP: Love Hertz.]

ELIE MYSTAL: And what the advertisers want is for there to be moderation.

RACHAEL: Hmm.

ELIE MYSTAL: Because they make more money when things are, for lack of a better word, nice. So it's highly likely that the advertisers simply will not stand for a wild wild west scenario where, like, when you click on the page all the comments are like, "F you, you dumb N-word."

RACHAEL: And if that happens, you're basically calcifying the internet as we have it today. Like, these small companies, these startups, these homespun sites, they will not have the resources to moderate.

ELIE MYSTAL: If you put these moderation controls on them, the next Twitter, the next Facebook, the next TikTok, there will be no way for them to compete.

RACHAEL: And so what we will have is basically just the titans that we have today. So we are stuck between, like, a rock, a hard place and a freakin', like, dagger right in front of our face. Like, there's no—it feels like there's, like, no clear way to tackle 230 without then destroying the internet as we know it.

MATTHEW HERRICK: It wouldn't be so comp—it wouldn't be a complicated issue if it wasn't a complicated issue.

RACHAEL: Once more, Matthew Herrick from the beginning of this episode, whose life got literally destroyed by Section 230 but still thinks we shouldn't get rid of it.

RACHAEL: I'm so surprised that you're—you don't want to just get rid of it altogether. I don't know, it's like a fricking shark came and bit your arm, and you're like, "Well, the shark has done some good for the ocean," you know? Like ...

MATTHEW HERRICK: Well, because I understand how complicated it is. I mean, obviously I'm [bleep] pissed but, like, I'm launching a coalition with a non-profit organization to help survivors. I'm trying to, like, seek out what I can utilize through that experience to create positive in the world, and I think that's all I can do.

RACHAEL: Hmm.

MATTHEW HERRICK: But I'd be bull[bleep] you if I said that I had the right answer. I just know all the wrong answers.

RACHAEL: And he's not alone. Like, no one is quite sure how to fix this thing. So the decision for now just seems to be to just leave it.

ELIE MYSTAL: And the Supreme Court said so.

[ARCHIVE CLIP, Supreme Court: Mr. Chief Justice, and may it please the court. Section 230(c)(1) distinguishes ...]

RACHAEL: So this past term ...

ELIE MYSTAL: The Supreme Court heard two cases about Section 230.

[ARCHIVE CLIP, Supreme Court: 21-1333. Gonzalez v. Google.]

ELIE MYSTAL: Gonzalez v. Google and Twitter v. Taamneh.

RACHAEL: And during one of these arguments, from the bench ...

ELIE MYSTAL: Elena Kagan says ...

[ARCHIVE CLIP, Elena Kagan: Why is it that the tech industry gets a pass? A little bit unclear. On the other hand, I mean, we're a court. We really don't know about these things. You know, these are not, like, the nine greatest experts on the internet. And boy, there is a lot of uncertainty.]

RACHAEL: And they decided to leave it in place.

ELIE MYSTAL: You know, there is a reason why a law from 1996 is still the law today, and it's because—not because it works but because it is—it is the least bad option.



-30-

 

Copyright © 2024 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.

 

New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of programming is the audio record.
