Aug 11, 2023

Transcript
The Internet Dilemma

LATIF NASSER: Hey folks, just a quick warning before we get started, this episode contains some swear words as well as some frank discussions about sex and suicide. Listener discretion is advised.

[RADIOLAB INTRO]

SIMON ADLER: Come on, come on. Don't be a sassafras.

RACHAEL CUSICK: Hello, hello, hello.

SIMON: Hey!

RACHAEL: Lovely!

SIMON: Hey, this is Radiolab. I'm Simon Adler sitting in for Lulu and Latif this week.

SIMON: Yeah, how do you feel that the B team has been sent in for this?

RACHAEL: [laughs] Yeah, they're like, "You know, the understudy of the understudy was out today so you're gonna have to take Simon Adler."

SIMON: Yeah, so you're gonna have to take—yeah, just the ...

RACHAEL: The ushers.

SIMON: [laughs] The ushers. Exactly.

SIMON: Because a while back, our reporter/producer Rachael Cusick, she sat me down in the studio to tell me a story about both how beautiful we humans are, but also just how downright awful we can be. And the tricky business of deciding who should be held responsible when that ugly part of us shows.

RACHAEL: Let's hit it.

SIMON: Mm-hmm.

RACHAEL: Okay, so we are gonna start on a stoop in Harlem.

MATTHEW HERRICK: Over in West Harlem. In what, 2016? Yeah, 2016.

RACHAEL: With this guy.

MATTHEW HERRICK: Matthew Herrick.

RACHAEL: Wait, do you go by Matt or Matthew?

MATTHEW HERRICK: Most people call me Matthew.

RACHAEL: Cool.

RACHAEL: At the time, Matthew had recently moved from LA to New York City.

MATTHEW HERRICK: It was definitely, you know, a punch to the face if you will.

RACHAEL: Trading palm trees and sunshine for a smelly city stoop.

MATTHEW HERRICK: Yeah.

RACHAEL: Anyway ...

MATTHEW HERRICK: I think it was around October, mid-October. I was sitting on my stairs, just like in the front of my building.

SIMON: What's this guy look like, do we know?

RACHAEL: Like, tall, muscular, salt-and-pepper hair nowadays. I think probably just pepper back in the day.

MATTHEW HERRICK: And I was having a cigarette, and a gentleman walked up and stopped and stood in front of me. And, you know, it's New York, so there's a lot of fucking weirdos. So I just, you know, figured it was just some weirdo being weird.

RACHAEL: Like, "Whatever. I'm just gonna ignore them."

MATTHEW HERRICK: Yeah. Like, it's all good. So I'm, like, kind of avoiding eye contact, but then I realize that they're standing there for, you know, an extended period of time. So I looked up ...

RACHAEL: And it's someone that he doesn't know, someone that he doesn't recognize, but this guy ...

MATTHEW HERRICK: He went, "Hey Matt." And I was like, "Hi?" Like, how the hell do you know my name? And he says, "It's so-and-so. We were talking on Grindr."

RACHAEL: Grindr. Dating app used primarily by gay men. And so this stranger tilts his phone towards Matthew, and he's like, "Look."

MATTHEW HERRICK: And it's a profile on the app. With, you know, a picture of me. And I was like, "That's not possible."

RACHAEL: Like, that is a photo of me.

MATTHEW HERRICK: But that's not me.

RACHAEL: That is not a profile that I made. I'm not on Grindr, you know? So he looks up at the guy ...

MATTHEW HERRICK: "Like, I don't know how to explain this. I don't know who you're talking to. I'm very confused right now. Like, you need to leave." And I got up and I went inside. And I remember I looked at my roommate Michael and I was like, "Someone just showed up looking for me from Grindr."

RACHAEL: "Like, how weird is that?"

MATTHEW HERRICK: Yeah. Little did I know ...

RACHAEL: Because after that, another guy came. And then another.

MATTHEW HERRICK: People started showing up.

RACHAEL: It just keeps happening.

MATTHEW HERRICK: Sometimes I would be home, or sometimes I'd be leaving the building and there would be people outside.

RACHAEL: Each time a different man.

MATTHEW HERRICK: You know, one or two a week.

RACHAEL: All saying the same thing. Like, "We were talking on Grindr. Let's have sex." So he reports the profiles to Grindr, got the, like, automatic reply.

MATTHEW HERRICK: "We'll get back to you soon," or whatever the hell it said.

RACHAEL: But he doesn't actually hear back from anyone at the company. And meanwhile ...

MATTHEW HERRICK: People were showing up to my home in large volumes. And, like, I'm living my life. Leave me alone. Can I get a break?

RACHAEL: And finally one night, he's annoyed, he's fed up ...

MATTHEW HERRICK: I stood up and I said, "Fuck this."

RACHAEL: And he decides he is going to get to the bottom of this.

MATTHEW HERRICK: Yeah, so I downloaded Grindr.

RACHAEL: He makes a profile.

MATTHEW HERRICK: Without a photo.

RACHAEL: And then logs on.

MATTHEW HERRICK: And I saw the fake profile of myself. Very close proximity.

RACHAEL: Grindr actually has this map feature where you can see where other people are who are on Grindr. And this person who had Matthew's name and his photo is, like, right there, right outside his apartment.

MATTHEW HERRICK: So I walked outside, and I remember looking down the street and he took off running. And I went and chased after him.

RACHAEL: And while he's running, Matthew is looking at the Grindr app, scanning for the fake him.

MATTHEW HERRICK: Because you can refresh it and it'll tell you how far that person is away.

RACHAEL: And this fake Matthew, he was, like, 20 feet away.

MATTHEW HERRICK: So I'm walking along Morningside Park. Refresh the app.

RACHAEL: Then 15.

MATTHEW HERRICK: I was walking and I was walking ...

RACHAEL: Then 10.

MATTHEW HERRICK: And I remember I looked down and I was like, "He's five feet away. How is he five feet away?" And I stood on the park bench, and I looked over the fence, and laying face down in the bushes was JC.

RACHAEL: His ex, JC.

MATTHEW HERRICK: And I screamed, "I fucking caught you! I caught you! I knew it was you!"

RACHAEL: JC started yelling back at him. Matthew ran away, JC chased him. The cops got called. It was a mess.

SIMON: Ugh, so an ex-lover made a fake profile for the purpose of terrorizing him?

RACHAEL: Yeah. Matthew and this guy JC had started dating not long after Matthew arrived in the city.

MATTHEW HERRICK: And we dated for I want to say eight to 10 months.

RACHAEL: Matthew saw some ...

MATTHEW HERRICK: Little red flags.

RACHAEL: ... and ended things. And that's when these people started coming.

SIMON: Hmm.

RACHAEL: I think once Matthew broke up with him, he was like, "If you don't want to date me then, like, screw you. I'll make your life a living hell." Anyhow, once he knew it was JC ...

MATTHEW HERRICK: I ended up getting an order of protection against him.

RACHAEL: And so JC couldn't go anywhere near him in real life. But an order of protection doesn't really apply when JC's sending other people to his door.

MATTHEW HERRICK: There was no ramifications for what he was doing.

RACHAEL: There wasn't anything the courts or the cops could do about it. So Matthew contacts Grindr again, and is like, "This is the guy. Shut him down, please!" But still ...

MATTHEW HERRICK: Nothing. No acknowledgement.

RACHAEL: We reached out to Grindr for comment. Didn't hear back. Anyhow, with Grindr doing nothing, JC went on the offensive. He actually made multiple fake Matthew profiles.

MATTHEW HERRICK: There were two to three existing on the platform. And that's when, you know, the gay zombies started coming for me.

RACHAEL: [laughs] Do you call them the gay zombies?

MATTHEW HERRICK: Yeah, 'cause it's like everyone's like, "Maaaat!"

RACHAEL: Just like, "Must get sex now!"

MATTHEW HERRICK: Yeah. I would leave at six o'clock in the morning to walk my dog, there would be somebody outside waiting for me. And I would come home at night, 11:30, 12:00 at night, there'd be somebody waiting for me. Literally every single day of my life.

RACHAEL: And it wasn't just a lot of awkward but harmless encounters because these profiles ...

MATTHEW HERRICK: Said I was looking for rape fantasies.

RACHAEL: Ugh!

RACHAEL: Matthew would try to explain the situation to people calmly, but then ...

MATTHEW HERRICK: The profiles were telling these individuals it was part of my turn on, so to stay and then approach me again.

SIMON: Just diabolical!

RACHAEL: Yeah. And so again he tries reporting the profiles.

MATTHEW HERRICK: I had friends reporting profiles, family reporting profiles. Sending emails to the company.

RACHAEL: Again nothing.

MATTHEW HERRICK: I was slammed against the wall.

RACHAEL: Oh my God!

MATTHEW HERRICK: There was someone who broke into my building and physically assaulted my roommate while trying to get to me.

RACHAEL: He goes to the cops.

MATTHEW HERRICK: To file police reports and they would turn me away. They were like, "We don't understand." I don't think anybody really could grasp what I was actually talking about.

RACHAEL: JC started making profiles that promised people crystal meth, and said they should show up at the restaurant where Matthew worked.

MATTHEW HERRICK: I was taking an order at a fucking table, and I remember this guy is tapping on my shoulder saying my name, high out of his mind. And I'm looking at the people that are sitting down and they're looking back up at me, and they're like, "What is going on?" And my eyes are just welling up with tears because I'm like, "Oh my God." And I'm like, "How do you want your burger cooked?" You know what I mean? [laughs]

RACHAEL: And this went on for months.

SIMON: Jesus!

RACHAEL: Did you hate hearing your name at that point in your life?

MATTHEW HERRICK: Oh, I hated it. I hated everything about existing. I hated it all. Like, I remember sitting there saying to myself, "Like, I either want to fucking blow my brains out or throw myself off a building."

RACHAEL: And then one day, Matthew's talking to his lawyer.

MATTHEW HERRICK: My family court lawyer. She said, "Hey, there's this woman named Carrie Goldberg. She might be able to help you." And at that time, I was so just beat to a pulp, I just heard the word "help," and I was like, "Yeah."

RACHAEL: So he takes the train to downtown Brooklyn.

MATTHEW HERRICK: I sat in her office and she told me a little bit about herself.

CARRIE GOLDBERG: I mean, as background, I started this law firm after I had been the target of a malicious and creative and relentless ex.

RACHAEL: Attorney Carrie Goldberg.

CARRIE GOLDBERG: One of the most malicious things that my ex was doing was—was blackmailing me with naked pictures and videos that he had. Contacting everybody in my life. He's making false IRS reports against my family.

RACHAEL: Now at that time Carrie was already a lawyer herself, but she really only did family law stuff—wills, guardianships, things like that, and ...

CARRIE GOLDBERG: I had so much difficulty during that process finding a lawyer who knew what to do and, like, was at the intersection of intimate partner violence and internet law and First Amendment and knew how to get an order of protection, I was really desperate.

RACHAEL: And so after this all ended ...

CARRIE GOLDBERG: I became the lawyer I'd needed.

RACHAEL: A lawyer for people like Matthew.

MATTHEW HERRICK: So she told me her story and I told her mine. And before I could finish, she said, "I would like to represent you. I think we can slow this attack on you."

RACHAEL: And the way to do that, Carrie said, was to go after Grindr, take them to court, and argue that this guy JC used Grindr essentially as a weapon, that Grindr knew all about it and did absolutely nothing to stop it. And so day of the hearing, Carrie and her team file in, sit down confident looking at their little table.

CARRIE GOLDBERG: We're pretty badass litigators. [laughs]

RACHAEL: And across the aisle is, of course, Grindr's lawyers.

SIMON: Mm-hmm.

RACHAEL: And as the hearing begins, the Grindr guys, they stand up. I'm imagining they do that thing that men do where they, like, put their tie, tucked in, like, inside their jacket and they're like, "Your honor, we don't have to do anything because of Section 230."

SIMON: Section 230.

RACHAEL: Yeah. And the judge is like: "You're right. We don't have to go any further. That's the end of this." [bang bang]

MATTHEW HERRICK: It was over. It was over. And I said, "What the fuck is Section 230?" I didn't even understand what that meant.

ELIE MYSTAL: Okay. So let's start there.

RACHAEL: Okay.

ELIE MYSTAL: Section 230 is a provision passed by Congress in 1996. That's not a typo—1996, right? [laughs]

RACHAEL: Attorney and justice correspondent at The Nation magazine, Elie Mystal.

ELIE MYSTAL: So that's how old this law is. Now it's worth pointing out that most of the rest of the law is no longer good law. It's been amended, it's been shaped, it's been overcome with kind of newer, better laws that take into account what the internet has actually become, but Section 230 is the beating core that remains.

RACHAEL: And 230, it does one simple thing.

ELIE MYSTAL: It relieves internet companies of liability for illegal things posted on their websites.

RACHAEL: Meaning, he says, not only in Matthew's case, but in others like it when someone lies about someone else or threatens them or even tries to do them some kind of harm using an app or a website, these tech companies, they get off scot free.

ELIE MYSTAL: That's exactly what's happening. Section 230 is fundamentally at core a liability shield.

RACHAEL: A shield that no other industry gets except the tech industry. In short, Section 230 ...

MATTHEW HERRICK: ... makes the tech world untouchable.

CARRIE GOLDBERG: It's just not fair.

RACHAEL: So unlike Matthew, Carrie already knew about Section 230. She knew the Grindr lawyers would use it against Matthew, and so she had actually been trying out this way to get around 230 by arguing that Grindr was a faulty product that harmed Matthew as a consumer. But the judge wasn't buying it, and with Matthew, appeal after appeal after appeal, the case just kept getting dismissed, each time because of Section 230. And so ...

CARRIE GOLDBERG: Section 230 is my nemesis.

RACHAEL: ... she hates it.

CARRIE GOLDBERG: I can only talk about it once a day because I get so aggravated that I then can't do my job. [laughs] I think it can be totally decimated and thrown into the sun.

RACHAEL: And weirdly enough, this hatred Carrie feels ...

[ARCHIVE CLIP, Ted Cruz: As you know, Google enjoys a special immunity from liability under Section 230 of the Communications Decency Act.]

RACHAEL: ... is shared by a lot of people.

[ARCHIVE CLIP, Ted Cruz: A lot of Americans have concerns.]

RACHAEL: ... Conservative lawmakers like Ted Cruz.

[ARCHIVE CLIP, Joe Biden: I'm calling on Congress to get rid of special immunity for social media companies and impose much stronger ...]

RACHAEL: President Joe Biden called to have it removed.

[ARCHIVE CLIP, Donald Trump: Section 230. We have to get rid of Section 230, politicians.]

RACHAEL: And so did former President Donald Trump. That is the thing about Section 230: it's kind of built this, like, king-sized mattress of strange bedfellows who are all teaming up and saying, "We want it gone."

MATTHEW HERRICK: It is literally this, like, ominous, looming monster!

RACHAEL: But Matthew ...

MATTHEW HERRICK: I don't think they should get rid of it.

RACHAEL: ... is not one of those people. Because even though this law lets companies like Grindr completely ignore what happens on their platforms ...

MATTHEW HERRICK: Without Section 230, we couldn't live the way we do today.

ELIE MYSTAL: It is the law that makes the internet possible. And so now we're really getting into the heart of Section 230.

RACHAEL: That heart, and what our lives might look like without it, after the break.

SIMON: Simon.

RACHAEL: Rachael.

SIMON: Radiolab. And we are back.

RACHAEL: And we're gonna go backwards ...

SIMON: All right!

RACHAEL: ... to a time not that long ago when what the internet would be, what it would look like and feel like was a very open question. A time when sort of anything seemed possible. It's a world of ...

SIMON: Sorry, what year are we in?

RACHAEL: Blah blah blah. Okay, so 1992 ...

[ARCHIVE CLIP: Things are starting to happen.]

[ARCHIVE CLIP: Things are starting to happen.]

ELIE MYSTAL: Back when getting on the internet required somebody else in your house to get off the phone.

RACHAEL: Dun dun dun de dun dun dun. The internet has evolved from this thing that really only academics used ...

[ARCHIVE CLIP: It's taken us five years of real hard work to develop a system like this.]

RACHAEL: ... to something niche and nerdy communities are playing on in the form of chat rooms.

[ARCHIVE CLIP: It's asking, "Why not go after real people?"]

RACHAEL: To finally ...

[ARCHIVE CLIP: Introducing the power of Prodigy.]

RACHAEL: ... something that everyday people like you and me were using through these bulletin boards.

[ARCHIVE CLIP: What Prodigy does is connect our computer with a fantastic range of services.]

RACHAEL: Prodigy was one of these main early bulletin boards. And, you know, it let people post something on there and then other people would comment on it.

CHUCK EPSTEIN: It had no graphics, no pictures of any kind. It was only text. And although it may have been primitive, you had access to information all around the world.

RACHAEL: And as amazing as that was, as more and more people were logging on to get world news or share recipes or share their opinions about financial markets ...

[ARCHIVE CLIP: [bing bing bing] Prodigy needs your attention. You have new mail.]

RACHAEL: ... these bulletin boards, they began to get ...

[ARCHIVE CLIP: You're dumb.]

RACHAEL: ... heated.

[ARCHIVE CLIP: Instant message!]

[ARCHIVE CLIP: Obscene.]

[ARCHIVE CLIP: Dummy.]

[ARCHIVE CLIP: Fuck you.]

[ARCHIVE CLIP: Indecent.]

[ARCHIVE CLIP: Go fuck yourself.]

[ARCHIVE CLIP: Pornographic.]

[ARCHIVE CLIP: Go fuck yourself.]

[ARCHIVE CLIP: Goodbye.]

RACHAEL: And so guys like Chuck ...

CHUCK EPSTEIN: Chuck Epstein, moderator of the Prodigy Money Talk bulletin board.

RACHAEL: ... were brought in to turn down the temperature by removing posts that went too far.

CHUCK EPSTEIN: So I just took down swear words, derogatory words, racial slurs, et cetera.

RACHAEL: And it's just you there. You're all by yourself.

CHUCK EPSTEIN: That's correct. I was the only one who had the software, the moderating software. And there were, oh, easily a couple thousand posts per day on the Money Talk bulletin board, about stocks, bonds, real estate, equities.

RACHAEL: Hmm.

CHUCK EPSTEIN: So it was exciting.

RACHAEL: [laughs]

RACHAEL: And so Chuck, he managed to create this little neighborhood where people could connect and say what they wanted, but where he could also be a kindly curator, make sure that no one got out of line. Until ...

CHUCK EPSTEIN: Well, one evening I was at my house. Took my poodle out the front door for a walk.

RACHAEL: Fluffy little fella.

CHUCK EPSTEIN: He was a—a miniature French poodle, Bo. And we walked down the street about, you know, 40 paces.

RACHAEL: Bo does his business, Chuck stretches his legs.

CHUCK EPSTEIN: Then a man literally jumps out of bushes.

RACHAEL: Oh my God!

CHUCK EPSTEIN: It was like from the spy movies. I didn't know what this guy was doing.

RACHAEL: And standing there under a streetlight ...

CHUCK EPSTEIN: He says, like, "Mr. Epstein?" I said, "Yes?" And he hands me a piece of paper. In an—it was an envelope. And he says, "Thank you. Have a nice night."

SIMON: I thought you were gonna say, "He said, 'I'm here to have sex with you.'"

RACHAEL: [laughs] Yeah, he's like, "I am a gay zombie of yesteryear." This is where it all started.

SIMON: Yeah. Yeah, yeah, yeah.

RACHAEL: Anyhow, so Chuck turns around, walks home quickly.

CHUCK EPSTEIN: I went back in the house and opened the envelope, and ...

RACHAEL: The first thing that he sees on this piece of paper, it says ...

CHUCK EPSTEIN: "Stratton Oakmont versus Prodigy" in big letters at the very top. You know, I said, "What the hell is this?"

RACHAEL: Turns out that Stratton Oakmont was suing Chuck's employer, Prodigy.

SIMON: Mm-hmm.

RACHAEL: Claiming that someone had used Chuck's bulletin board to smear their company, saying that their president ...

CHUCK EPSTEIN: Was a thief, involved in some scams. And Stratton Oakmont was a criminal organization. Basically attacked the reputation and the financial acumen and the honesty and the ethics of Stratton Oakmont.

RACHAEL: And that these posts constituted defamation.

CHUCK EPSTEIN: In this $100-million lawsuit.

RACHAEL: Now as would be discovered years later, these posts were not defamatory. In fact, Stratton Oakmont and their president were doing so many illegal things that they would one day inspire Leonardo DiCaprio ...

[ARCHIVE CLIP, The Wolf of Wall Street: Was all this legal? Absolutely not.]

RACHAEL: ... in the film The Wolf of Wall Street.

[ARCHIVE CLIP, The Wolf of Wall Street: We don't work for you, man!]

[ARCHIVE CLIP, The Wolf of Wall Street: You have my money taped to your boobs. Technically you do work for me.]

SIMON: [laughs]

RACHAEL: Yeah, Jonah Hill's character was actually based on the guy who cried defamation. But at the time of this suit, nobody knew anything about any of that, and so ...

CHUCK EPSTEIN: The lawsuit was about—it went over ...

RACHAEL: ... when the thing went to trial, these wolves of Wall Street, they argued that because Prodigy employed people like Chuck, moderators who left posts up and took posts down, that they were responsible for every defamatory post they left up. And this judge agreed.

CHUCK EPSTEIN: The judge ruled that Prodigy was responsible.

[ARCHIVE CLIP: The world of Prodigy.]

RACHAEL: Now the irony here is that right around this time there was another company ...

[ARCHIVE CLIP: ... access to the internet. Enter Compuserve.]

RACHAEL: ... Compuserve.

CHUCK EPSTEIN: And Compuserve did not hold itself out to be a family-friendly bulletin board.

RACHAEL: They were just like Prodigy, but they had no moderators, no Chucks. Didn't take anything down, all the swear words, defamation, racial slurs, all of it stayed there. And when they went to trial in a very similar online defamation lawsuit they won. And so weirdly in this situation, if Prodigy had never set out to be a family-friendly place, if they said, "Whatever you want, have it," they would not have lost this lawsuit.

SIMON: Well, that seems completely ass backwards.

RACHAEL: Yes! Yes!

CHRIS COX: What the law was saying is that if your approach is anything goes, then you'll be scot-free, but if you attempt to have rules of the road then we're gonna make you responsible for every piece of content that's uploaded by every single one of your users.

RACHAEL: Former Republican Representative of California Chris Cox. And when he learned about this, he was like this is not the way we want the internet to be regulated.

CHRIS COX: Because of the—the obvious consequences. You know, the rate of increase in users of the internet was exponential, and it was clear that this new means of communication was gonna be of vital importance either for good or for ill.

RACHAEL: And he worried that this precedent set by these two cases, like, reward the wild west sites, punish the family-friendly sites, that that could be disastrous.

CHRIS COX: And one of the great things about being a member of Congress is that when you pick up the newspaper and you see something that needs to be fixed, you say "There oughta be a law," and then your next thought ...

RACHAEL: You're like, "Who can do this for me?" [laughs]

CHRIS COX: Yeah. I could do that.

RACHAEL: But he needed a partner. So ...

CHRIS COX: I made a beeline to my friend Ron.

RON WYDEN: Ron Wyden, one of Oregon's United States senators.

RACHAEL: Democrat. Little buds. They get ice cream together.

SIMON: Mmm!

RON WYDEN: Chocolate chip for me.

CHRIS COX: I'm chocolate, although when I'm being very extravagant I have one scoop of vanilla and one scoop of chocolate.

RACHAEL: That's living the life. [laughs]

RACHAEL: And he says, "Ron, like, I think it's really, really important that we do something about this." Explained these two cases, and how ...

RON WYDEN: You know, online platforms were offered a choice: you could either police your website and be liable for everything even if something slipped through, or you could turn a blind eye and not police anything. And Chris and I said, "Maybe we can come up with something that really promotes the best internet."

RACHAEL: An internet where sites could take down what they wanted without getting in trouble.

RON WYDEN: And the point was to keep it really simple. So Chris and I went to the House dining room and sat by ourselves and we put this together.

RACHAEL: A couple days later ...

RON WYDEN: We're done. It wasn't perfect by any means.

RACHAEL: And do either of you know it by heart? I'm sure you do because you talk about this all the time, but could you just say it just so we have it on the recording?

CHRIS COX: Yes, sure. What it says is that, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

RACHAEL: In other words, these internet companies could control the things that got posted on their sites as they saw fit without the threat of being sued. And so ...

[NEWS CLIP: Right now we're gonna take you over to the Library of Congress for this signing ceremony. Mr. Clinton uses the same ...]

RACHAEL: ... February 8, 1996.

[ARCHIVE CLIP, Bill Clinton: Today, our world is being remade yet again by an information revolution.]

RACHAEL: In a wood-paneled hall, President Clinton signed these 26 words into law.

[ARCHIVE CLIP, Bill Clinton: This historic legislation recognizes that with freedom comes responsibility. This bill protects consumers against monopolies. It guarantees the diversity of voices our democracy depends upon. Perhaps most of all, it enhances the common good.]

RACHAEL: I mean, just as one example, if it hadn't passed and sites remained liable for every little thing that we posted ...

CHRIS COX: You couldn't imagine a project like Wikipedia taking off as it has.

RACHAEL: Or the #MeToo movement, or that ice bucket challenge that raised millions of dollars for ALS research. Or #BlackLivesMatter.

CHRIS COX: They absolutely could not exist without Section 230.

RACHAEL: I mean, Section 230 let websites moderate as best as they could without the threat of constantly being dragged to court, which allowed space for this massive online wave that we're all still surfing today. But of course, waves can be dangerous, and now more than ever it's starting to feel like we could use some more lifeguards. Because, you know, these wonky little bulletin boards that sparked all this, they evolved into comment sections, which evolved into social media platforms like Facebook and Twitter and Instagram, and then into dating apps like Tinder and Grindr. And before we knew it, billions of people were on these things. And while these sites have enabled lots of good things to happen ...

[ARCHIVE CLIP: Nasty texts and Instagram posts.]

[ARCHIVE CLIP: Fat fat fatty.]

[ARCHIVE CLIP: Drink bleach and die.]

RACHAEL: ... they've also given us this whole new universe of ways to be cruel to one another.

[ARCHIVE CLIP: Cyberbullying.]

[NEWS CLIP: …doxxing.]

[NEWS CLIP: Revenge porn.]

RACHAEL: And even though the platforms make some efforts to weed out the bad stuff, so much of it gets through.

[ARCHIVE CLIP: Suck my [bleep].]

[ARCHIVE CLIP: Kill yourself.]

[NEWS CLIP: They put my pictures on there, put my little stub on my name and everything. Report the account, please!]

RACHAEL: And when someone comes to them and says please make it stop, like Matthew, our Grindr guy, or countless other people ...

[NEWS CLIP: This is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.]

RACHAEL: ... they say, "It's not our problem. Section 230."

[NEWS CLIP: I really don't think I'll ever get these images down from the internet. And I just—I'm sorry to my husband. I'm sorry to my children.]

ELIE MYSTAL: Again, Section 230, while critical to how the internet was made, critical to how it functions, is old!

RACHAEL: Once again, justice correspondent Elie Mystal.

ELIE MYSTAL: And contemplates a late-'90s internet world that simply no longer exists.

RACHAEL: Yeah.

ELIE MYSTAL: So yes, there's a sense of, like, our laws should be updated to reflect how the internet works today, not how it worked in 1996. And so there is a coalition amongst hardcore conservatives and some progressives to do something about Section 230 and take it away.

RACHAEL: It seems not unlike 1996 when Section 230 passed. Like, there's this open question again of what is the sort of internet that we want?

ELIE MYSTAL: However, the other side of it is also—you know, I'm—we're kind of backing into the actual points of—I want to state the point differently, right?

RACHAEL: Yeah. Yeah, yeah, yeah. Yeah, do it.

ELIE MYSTAL: Because here's the thing: one of the reasons why we don't know what's going to happen with Section 230—sorry, the best way of saying it ...

RACHAEL: [laughs] You're your own producer. Thank you, Elie!

ELIE MYSTAL: I'm gonna get there! The bottom line is that we don't know what's going to happen to the internet if we take away Section 230. One way it could go is for everybody to go back to a wild wild west scenario where there is no moderation anywhere at all, right?

RACHAEL: Mm-hmm.

ELIE MYSTAL: However, the other way it could go would be to have extreme moderation, nobody has open comment threads, nobody has a forum where they can say whatever they want. Everything is either completely closed off, or highly monitored by an AI, by the algorithm that is just without pride or prejudice just running around and smacking people based on keywords. Doesn't matter the context, right?

RACHAEL: Which, you know it would take out racial slurs, problematic stuff, but it also might weed out these kernels of ideas that led to the Arab Spring, and #BlackLivesMatter, and #MeToo.

ELIE MYSTAL: So only the most kind of anodyne, Disneyfied, "I like soup!" Right?

RACHAEL: [laughs] Are those options, like, both equally likely if Section 230 were to go away?

ELIE MYSTAL: Well, are you conservative or are you liberal?

RACHAEL: [laughs]

ELIE MYSTAL: Because what you think is more likely really tracks with your politics right now. Liberals, at least the ones who also think Section 230 should be taken away, think these—that the social media platforms will go full on aggressive stopping hateful comments. However, conservatives like Josh Hawley really think that it's gonna go the other way. That in a post-Section 230 world, because of the threat of liability, these companies, they would go on a wild west format and just let everything ride so nobody gets in trouble.

RACHAEL: But the problem there, Elie says, is ...

ELIE MYSTAL: You've gotta be able to turn the internet upside down and shake money out of it, right? Like, none of this happens if somebody can't make money off of it, right?

RACHAEL: Meaning in most cases ...

[ARCHIVE CLIP: Sometimes I just want to rent a car and go, you know?]

[ARCHIVE CLIP: I do know. And I think I can help you with that.]

[ARCHIVE CLIP: Really?]

ELIE MYSTAL: Advertisers, yeah?

[ARCHIVE CLIP: I love Hertz!]

[ARCHIVE CLIP: Love Hertz.]

ELIE MYSTAL: And what the advertisers want is for there to be moderation.

RACHAEL: Hmm.

ELIE MYSTAL: Because they make more money when things are, for lack of a better word, nice. So it's highly likely that the advertisers simply will not stand for a wild wild west scenario where, like, when you click on the page all the comments are like, "F you, you dumb N-word."

RACHAEL: And if that happens, you're basically calcifying the internet as we have it today. Like, these small companies, these startups, these homespun sites, they will not have the resources to moderate.

ELIE MYSTAL: If you put these moderation controls on them, the next Twitter, the next Facebook, the next TikTok, there will be no way for them to compete.

RACHAEL: And so what we will have is basically just the titans that we have today. So we are stuck between, like, a rock, a hard place and a freakin', like, dagger right in front of our face. Like, there's no—it feels like there's, like, no clear way to tackle 230 without then destroying the internet as we know it.

MATTHEW HERRICK: It wouldn't be so comp—it wouldn't be a complicated issue if it wasn't a complicated issue.

RACHAEL: Once more, Matthew Herrick from the beginning of this episode, whose life got literally destroyed by Section 230 but still thinks we shouldn't get rid of it.

RACHAEL: I'm so surprised that you're—you don't want to just get rid of it altogether. I don't know, it's like a fricking shark came and bit your arm, and you're like, "Well, the shark has done some good for the ocean," you know? Like ...

MATTHEW HERRICK: Well, because I understand how complicated it is. I mean, I don't want to sound, you know—I mean, obviously I'm fucking pissed but, like, I'm launching a coalition with a non-profit organization to help survivors. I'm trying to, like, seek out what I can utilize through that experience to create positive in the world, and I think that's all I can do.

RACHAEL: Hmm.

MATTHEW HERRICK: But I'd be bullshitting you if I said that I had the right answer. I just know all the wrong answers.

RACHAEL: And he's not alone. Like, no one is quite sure how to fix this thing. So the decision for now just seems to be to just leave it.

ELIE MYSTAL: And the Supreme Court said so.

[ARCHIVE CLIP, Supreme Court: Mr. Chief Justice, and may it please the court. Section 230-C1 distinguishes ...]

RACHAEL: So this past term ...

ELIE MYSTAL: The Supreme Court heard two cases about Section 230.

[ARCHIVE CLIP, Supreme Court: 21-1333. Gonzalez v. Google.]

ELIE MYSTAL: Gonzalez v. Google and Twitter v. Taamneh.

RACHAEL: And during one of these trials, from the bench ...

ELIE MYSTAL: Elena Kagan says ...

[ARCHIVE CLIP, Elena Kagan: Why is it that the tech industry gets a pass? A little bit unclear. On the other hand, I mean, we're a court. We really don't know about these things. You know, these are not, like, the nine greatest experts on the internet. And boy, there is a lot of uncertainty.]

RACHAEL: And they decided to leave it in place.

ELIE MYSTAL: You know, there is a reason why a law from 1996 is still the law today, and it's because—not because it works but because it is—it is the least bad option.

SIMON: This story was reported by Rachael Cusick and produced by Rachael and myself, with mixing from Arianne Wack. Special thanks this week to James Grimmelmann, Eric Goldman, Naomi Leeds, and an extra extra big thank you to Jeff Kosseff.

SIMON: All right, that's about it for it here. I'm Simon Adler. This is Radiolab. Thanks for listening.

[LISTENER: Hi, this is Mr. Fiedler's fifth-grade class, calling in from Menomonie, Wisconsin.

Radiolab was created by Jad Abumrad, and is edited by Soren Wheeler. Lulu Miller and Latif Nasser are our co-hosts. Dylan Keefe is our director of sound design. Our staff includes: Simon Adler, Jeremy Bloom, Becca Bressler, Rachael Cusick, Ekedi Fausther-Keeys, W. Harry Fortuna, David Gebel, Maria Paz Gutiérrez, Sindhu Gnanasambandan, Matt Kielty, Annie McEwen, Alex Neason, Sarah Qari, Anna Rascouët-Paz, Sarah Sandbach, Arianne Wack, Pat Walters and Molly Webster. With help from Sachi Kitajima Mulkey. Our fact-checkers are Diane Kelly, Emily Krieger and Natalie Middleton.]

[LISTENER: Hi, this is Jeremiah Barba, and I'm calling from San Francisco, California. Leadership support for Radiolab's science programming is provided by the Gordon and Betty Moore Foundation, Science Sandbox, a Simons Foundation initiative, and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.]


-30-


Copyright © 2023 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.


New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of programming is the audio record.
