Aug 17, 2018

Transcript
Post No Evil

[RADIOLAB INTRO]

 

JAD: Hey, I'm Jad Abumrad.

 

ROBERT KRULWICH: I'm Robert Krulwich.

 

JAD: This is Radiolab.

 

ROBERT: And today we have a story about what we can say ...

 

JAD: And what we [bleep]-ing can't.

 

ROBERT: [laughs]

 

JAD: And by the way, there's gonna be a smattering of curse words here that we're not gonna bleep, which I think makes sense given the content of this story. And also, there are some graphic scenes, so if you've got kids with you, you may want to sit this one out.

 

ROBERT: Yeah. Anyway, the story comes to us from producer Simon Adler.

 

SIMON ADLER: So, let's start. Can we start in 2008?

 

JAD: Sure.

 

SIMON: How about with a song?

 

ROBERT: Yes, please.

 

[ARCHIVE CLIP, crowd singing: Rise up. Rise up. Demand Facebook cease their oppressive ways. Rise up. Rise up, and put back our pictures right now.]

 

SIMON: So December 27th, a sunny Saturday morning, this group of young to middle-aged women gathered in downtown Palo Alto.

 

[ARCHIVE CLIP, crowd singing: We'll fight for our cyberspace freedom. Let's all go to Riseup.net.]

 

SIMON: They're wearing these colorful hats, and are singing and swaying directly in front of the glass-doored headquarters of ...

 

[ARCHIVE CLIP, crowd singing: Yeah, Facebook says we're pornographic ...]

 

SIMON: ... Facebook.

 

[ARCHIVE CLIP, crowd singing: Is their so-called service so free?]

 

STEPHANIE MUIR: Yes. It was a humble gathering of a few dozen women and babies.

 

SIMON: That right there ...

 

[ARCHIVE CLIP, woman: Are you the organizer of this event?]

 

SIMON: ... is one of the organizers of the gathering.

 

STEPHANIE MUIR: I'm Stephanie Muir.

 

[ARCHIVE CLIP, woman: And what are you calling the event?]

 

[ARCHIVE CLIP, Stephanie Muir: It's the Facebook Nurse-In.]

 

SIMON: Nurse-in as in, like, breastfeeding.

 

STEPHANIE MUIR: The intent was really just to be visible and be peaceful and make a quiet point.

 

JAD: What -- what point were they trying to make?

 

SIMON: Well, so Stephanie and this group of mothers, you know, they were on Facebook, as many people were, and they'd have photos taken of themselves occasionally breastfeeding their babies. They wanted to share with their friends what was going on, so they would upload those photos to Facebook. And these pictures would get taken down, and they would receive a warning from Facebook for ...

 

STEPHANIE MUIR: Uploading pornographic content. And people were really getting their backs up over this.

 

SIMON: They wanted Facebook to stop taking their photos down. To say that while nudity is not allowed ...

 

STEPHANIE MUIR: Breastfeeding is exempt, period.

 

[ARCHIVE CLIP, crowd singing: Rise up and put back our pictures right now!]

 

SIMON: Now what Stephanie couldn't have known at the time, was that this small, peaceful protest would turn out to be ...

 

[NEWS CLIP: This morning, a face-off on Facebook.]

 

SIMON: ... one of the opening shots ...

 

[NEWS CLIP: Facebook triggered a hornet's nest.]

 

SIMON: ... in what would become a loud ...

 

[ARCHIVE CLIP, boy: Fuck you, Facebook.]

 

[ARCHIVE CLIP, man: Fuck you, Facebook.]

 

SIMON: ... raucous ...

 

[ARCHIVE CLIP, man: Fuck you, Facebook. Fuck you!]

 

SIMON: ... and global battle.

 

[NEWS CLIP: Embattled Facebook CEO ...]

 

[NEWS CLIP: Facebook today playing defense.]

 

SIMON: And now I'm not talking about all the things you've recently heard about: Russian interference and election meddling or data breaches. But rather, something that I think is deeper than both of those: Free speech.

 

[NEWS CLIP: Facebook has been accused of facilitating violence against Rohingya Muslims.]

 

SIMON: What we can say and what we can't say.

 

[ARCHIVE CLIP, man: You're gonna get it!]

 

[ARCHIVE CLIP, man: Facebook banned this iconic photograph.]

 

SIMON: What we can see and what we can't see ...

 

[ARCHIVE CLIP, Alex Jones: They'd let Mueller rape kids in front of people.]

 

SIMON: ... on the Internet.

 

[ARCHIVE CLIP, man: Fuck you. You're a fucking piece of shit.]

 

[ARCHIVE CLIP, politician: Thank you, Mr. Chairman. Mr. Zuckerberg, I've gotta ask you, do you subjectively prioritize or censor speech?]

 

[ARCHIVE CLIP, Mark Zuckerberg: Congresswoman, we don't think about what we're doing as censoring speech. There are types of ...]

 

SIMON: But what really grabbed me was discovering that underneath all of this is an actual rule book, a text document that dictates what I can say on Facebook, what you can say on Facebook, and what all 2.2 billion of us can say on Facebook.

 

ROBERT: For everyone in the entire globe, who's on Facebook.

 

SIMON: For everyone, there's one set of rules that all 2.2 billion of us are expected to follow.

 

JAD: This is an actual document?

 

SIMON: It's a digital document but yes, it's about 50 pages if you print it off. And in bullet points and if/then statements, it spells out sort of a First Amendment for the globe. Which made me wonder, like, what are these rules? How were they written?

 

JAD: And can you even have one rule book?

 

SIMON: Right. Exactly. And so I dove into this rule book and dug up some stories that really put it to the test.

 

JAD: Hmm. Okay.

 

ROBERT: I'll be interested to hear that.

 

JAD: How many stories are we going to hear?

 

SIMON: Three-ish.

 

ROBERT: Three-ish?

 

JAD: Okay.

 

ROBERT: Okay.

 

JAD: All right. Cool.

 

ROBERT: I'm particularly interested in the "-ish." But let's go ahead with the first one.

 

SIMON: Well, so let's start back on that morning in 2008, the morning that you could argue started it all.

 

[ARCHIVE CLIP, crowd singing: Rise up, rise up ...]

 

SIMON: Because in the building right behind those protesting mothers, there was a group of Facebook employees sitting in a conference room trying to figure out what to do.

 

FACEBOOK EMPLOYEE: Cool. So if I -- so I'm just gonna -- so I should just read this?

 

SIMON: So I was able to get in touch with a couple of former Facebook employees, one of whom was actually in that room at that moment. And now neither of these two were comfortable being identified, but they did give us permission to quote them extensively.

 

FACEBOOK EMPLOYEE: How's that? Will that take work for you?

 

SIMON: It sounded great.

 

FACEBOOK EMPLOYEE: Cool.

 

SIMON: Just so we have it, let's ...

 

SIMON: So what you're going to hear here is an actor we brought in to read quotes taken directly from interviews that we did with these two different former Facebook employees.

 

FACEBOOK EMPLOYEE: All right. Ready. So at the time when I joined them, there was a small group, 12 of us.

 

SIMON: Mostly, recent college grads.

 

FACEBOOK EMPLOYEE: Who were sort of called the Site Integrity Team.

 

SIMON: Again, keep in mind this was the late 2000s.

 

[NEWS CLIP: Seismic changes this week in the internet hierarchy.]

 

FACEBOOK EMPLOYEE: This was like the deep, dark past.

 

[NEWS CLIP: MySpace.com is now the most visited website in the U.S.]

 

SIMON: Facebook had somewhere in the neighborhood of 10 million users.

 

FACEBOOK EMPLOYEE: We were smaller than MySpace.

 

SIMON: The vast majority of them college kids. And so in those early days, those 12 people, they would sit around in a sort of conference-like room with a big, long table, each of them in front of their own computer.

 

KATE KLONICK: And things would come up onto their screen, flagged to Facebook and ...

 

SIMON: Flagged meaning like, "I, a user, saw something that I thought was wrong."

 

KATE KLONICK: Exactly. Like, reporting a piece of content that you think violates the community standards.

 

SIMON: This is Kate Klonick. She's a professor of law at St. John's. And she spent a lot of time studying this very thing. And she says in those early days what would happen is a user would flag a piece of content, and then that content along with an alert would get sent to one of those people sitting in that room. It would just pop up on their screen.

 

FACEBOOK EMPLOYEE: Most of what you were seeing was either naked people, blown-off heads, or things that there was no clear reason why someone had reported, because it was like a photo of a golden retriever, and people are just annoying.

 

SIMON: And every time something popped up onto the screen, the person sitting at that computer would have to make a decision whether to leave that thing up or take it down. And at the time, if you didn't know what to do ...

 

FACEBOOK EMPLOYEE: You would turn to your pod leader who was, you know, somebody who had been around nine months longer than you, and ask, "What do I do with this?" And they would either have seen it before and explain it to you, or you both wouldn't know and you'd Google some things.

 

KATE KLONICK: It really was just kind of an ad hoc approach.

 

ROBERT: And was there any sort of written standard or any common standard?

 

SIMON: Well, kind of.

 

KATE KLONICK: They had a set of community standards, but at the end of the day, they were just kind of -- it was one page long and it was not very specific.

 

SIMON: Sorry, the guidelines were really one page long?

 

KATE KLONICK: They were one page long.

 

SIMON: And basically, all this page said was, "Nudity is bad. So is Hitler."

 

KATE KLONICK: And if it makes you feel bad, take it down.

 

SIMON: And so when one of the people sitting in that room would have a breastfeeding picture pop up on the screen in front of them, they'd be like, "I can see a female breast. So I guess that's nudity," and they would take it down. Until ...

 

[ARCHIVE CLIP, crowd singing: Rise up. Rise up.]

 

ROBERT: Rise up! Fight for the rights to have breastfeeding -- anyway.

 

SIMON: Now a dozen or so people in front of their offices on a Saturday, it probably wasn't causing Facebook too much heartache, but ...

 

STEPHANIE MUIR: I thought, "You know, hey, we have an opportunity here with, you know, over 10,000 members in our group."

 

SIMON: According to Stephanie Muir, those protesters were just a tiny fraction of a much larger online group who had organized, ironically enough, through Facebook.

 

STEPHANIE MUIR: So to coincide with the live protest, I just typed up a little blurb encouraging our members that were in the group to do a virtual nurse-in.

 

SIMON: A virtual nurse-in?

 

STEPHANIE MUIR: Right. What we did ...

 

SIMON: They posted a message asking their members ...

 

STEPHANIE MUIR: To, for one day, change their profile avatar to an image of breastfeeding, and then change their status to the title of our group, "Hey Facebook: Breastfeeding is Not Obscene."

 

SIMON: And ...

 

STEPHANIE MUIR: It caught on.

 

[NEWS CLIP: A social networking website is under fire for its policy on photos of women breastfeeding their children.]

 

SIMON: Big time.

 

STEPHANIE MUIR: 12,000 members participated, and the media requests started pouring in.

 

[NEWS CLIP: The Facebook group called, "Hey Facebook: Breastfeeding is Not Obscene."]

 

STEPHANIE MUIR: I did hundreds of interviews for print. Chicago Tribune, Miami Herald, Time Magazine, New York Times, Washington Post ...

 

[ARCHIVE CLIP, Dr. Phil: You know, the internet is an interesting phenomenon.]

 

STEPHANIE MUIR: ... Dr. Phil. It was a media storm. And eventually, perhaps as a result of our group and our efforts, Facebook was forced to get much more specific about their rules.

 

SIMON: So for example, by then nudity was already not allowed on the site. But they had no definition for nudity. They just said no nudity. And so the Site Integrity Team, those 12 people at the time, they realized they had to start spelling out exactly what they meant.

 

KATE KLONICK: Precisely. All of these people at Facebook were in charge of trying to define nudity.

 

FACEBOOK EMPLOYEE: So I mean yeah, the first cut at it was visible male and female genitalia. And then visible female breasts. And then the question is well, okay, how much of a breast needs to be showing before it's nude? And the thing that we landed on was, if you could see essentially the nipple and areola, then that's nudity.

 

SIMON: And it would have to be taken down. Which theoretically at least, would appease these protesters because, you know, now when a picture would pop up of a mother breastfeeding, as long as the child was blocking the view of the nipple and the areola, they could say, "Cool, no problem."

 

KATE KLONICK: Then you start getting pictures that are women with just their babies on their chest with their breasts bare. Like, for example, maybe baby was sleeping on the chest of a bare-breasted woman and not actively breastfeeding.

 

FACEBOOK EMPLOYEE: Okay, now what? Like, is this actually breastfeeding? No, it's actually not breastfeeding. The woman is just holding the baby and she has her top off.

 

JAD: Yeah, but she was clearly just breastfeeding the baby.

 

ROBERT: Well, maybe just before.

 

SIMON: Well, I would say it's sort of like kicking a soccer ball. Like, a photo of someone who has just kicked a soccer ball, you can tell the ball is in the air, but there is no contact between the foot and the ball in that moment potentially. So although it is a photo of someone kicking a soccer ball, they are not, in fact, kicking the soccer ball in that photo.

 

ROBERT: [laughs]

 

JAD: [laughs] That's a good example.

 

SIMON: And this became the procedure, or the protocol, or the approach for all of these things: we have to base it purely on what we can see in the image.

 

KATE KLONICK: And so they didn't allow that to stay up under the rules, because it could be too easily exploited for other types of content, like nudity or pornography.

 

FACEBOOK EMPLOYEE: We got to the only way you could objectively say that the baby and the mother were engaged in breastfeeding is if the baby's lips were touching the woman's nipple.

 

SIMON: So they included what you could call, like, an attachment clause. But as soon as they got that rule in place ...

 

FACEBOOK EMPLOYEE: Like, you would see, you know, a 25-year-old woman and a teenage-looking boy, right? And, like, what the hell is going on there?

 

KATE KLONICK: Oh, yeah. It gets really weird if you, like, start entering into, like, child age. And I wasn't even gonna bring that up because it's kind of gross.

 

FACEBOOK EMPLOYEE: It's like breastfeeding porn.

 

JAD: Is that a thing?

 

ROBERT: Are there sites like that?

 

SIMON: Apparently. And so this team, they realized they needed to have a nudity rule that allowed for breastfeeding but also had some kind of an age cap.

 

FACEBOOK EMPLOYEE: So -- so then we were saying, "Okay. Once you've progressed past infancy, then we believe that it's inappropriate."

 

SIMON: But then pictures would start popping up on their screen and they'd be like, "Wait. Is that an infant?" Like, where's the line between infant and toddler?

 

FACEBOOK EMPLOYEE: And so the thing that we landed on was, if it looked like the child could walk on his or her own, then too old.

 

SIMON: Big enough to walk? Too big to breastfeed.

 

ROBERT: Oh, that could be 18 months.

 

JAD: Yeah, that's like a year old in some cases.

 

SIMON: Yeah. And, like, the World Health Organization recommends breastfeeding until, you know, like, 18 months or two years, which meant there were a lot of photos still being taken down.
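To keep the logic straight, here is a minimal sketch, in Python, of the early rules as they have been described up to this point: a visible nipple and areola counts as nudity, breastfeeding is exempt only if the baby's mouth is touching the nipple (the "attachment clause"), and the child has to look too young to walk. This is purely illustrative; Facebook's rulebook was a text document applied by human moderators, not code, and every name and field below is made up.

# Illustrative sketch only -- not Facebook's actual system. It encodes the
# early, circa-2008-2011 if/then rules as described in this story. All
# field names are hypothetical, and decisions rest only on what a
# moderator can literally see in the image.

from dataclasses import dataclass

@dataclass
class PhotoReview:
    nipple_and_areola_visible: bool   # the working definition of nudity
    mouth_touching_nipple: bool       # the "attachment clause"
    child_looks_able_to_walk: bool    # proxy for "past infancy"

def decide(photo: PhotoReview) -> str:
    """Return 'leave up' or 'take down' based only on what is visible."""
    if not photo.nipple_and_areola_visible:
        return "leave up"             # not nudity under the definition
    if photo.mouth_touching_nipple and not photo.child_looks_able_to_walk:
        return "leave up"             # breastfeeding exemption applies
    return "take down"                # nudity, no exemption

# Example: a bare-chested mother with a sleeping, non-nursing baby
print(decide(PhotoReview(True, False, False)))  # -> "take down"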

 

STEPHANIE MUIR: Within days, we were continuing to hear reports from people that their photographs were still being targeted.

 

SIMON: But ...

 

[NEWS CLIP: Facebook did offer a statement saying ...]

 

FACEBOOK EMPLOYEE: You know, that's where we're going to draw the line.

 

[NEWS CLIP: That Facebook isn't budging on its policy.]

 

SIMON: And keep in mind through this whole episode ...

 

[NEWS CLIP: Is this perhaps the next big thing? The Facebook.com ...?]

 

FACEBOOK EMPLOYEE: The company was growing really, really fast.

 

[NEWS CLIP: It seems like almost everyone is on it.]

 

KATE KLONICK: And there just got to be a lot more content.

 

[ARCHIVE CLIP, Mark Zuckerberg: When we first launched, we were hoping for, you know, maybe 400, 500 people. And now we're at 100,000. So who knows where we're going now?]

 

SIMON: Thousands more people are joining Facebook every day.

 

[NEWS CLIP: 60 million users so far, with a projection of 200 million by the end of the year.]

 

[NEWS CLIP: There are now more people on Facebook than the entire U.S. population.]

 

SIMON: Not just within the United States, but also ...

 

FACEBOOK EMPLOYEE: It was growing rapidly more international.

 

KATE KLONICK: You know, you were getting stuff from India and Turkey.

 

[NEWS CLIP: Facebook is in Iran.]

 

SIMON: It's getting big throughout the EU.

 

[NEWS CLIP: Korea has joined the Facebook.]

 

SIMON: So they have more and more content coming in from all these different places, in all these different languages.

 

FACEBOOK EMPLOYEE: How are we going to keep everybody on the same page?

 

KATE KLONICK: And so once they saw that this was the operational method for dealing with this, creating this, like, nesting set of exceptions and rules and these clear things that had to be there or had to not be there in order to keep content up or take it down, that I think became their procedure.

 

SIMON: And so this small team at Facebook got bigger and bigger. Jumped up to 60 people, and then a hundred. And they set out to create rules and definitions for everything.

 

ROBERT: Huh.

 

SIMON: Can we go through some of sort of the ridiculous examples?

 

ROBERT: Yes, please.

 

JAD: That's why we're here.

 

SIMON: Okay. So, gore.

 

ROBERT: Gore. You mean violence kind of gore?

 

SIMON: Yes. So the gore standard was, headline ...

 

FACEBOOK EMPLOYEE: We don't allow graphic violence and gore.

 

SIMON: And then the shorthand definition they used was ...

 

FACEBOOK EMPLOYEE: No insides on the outside.

 

ROBERT: No guts, no blood pouring out of something.

 

SIMON: Blood was a separate issue. There was an excessive blood rule. They had to come up with rules about bodily fluids.

 

FACEBOOK EMPLOYEE: Semen, for example, would be allowed in, like, a clinical setting, but like, what does a clinical setting mean? And, you know, does that mean if someone is in a lab coat?

 

ROBERT: Hmm.

 

KATE KLONICK: One of my favorite examples is, like, how do you define art?

 

SIMON: Because as these people are moderating, they would see images of naked people that were paintings or sculptures come up.

 

JAD: Oh.

 

SIMON: And so what they decided to do was say, "Art with nakedness can stay up."

 

KATE KLONICK: Like, it stays up if it is made out of wood, made out of metal, made out of stone.

 

SIMON: Really?

 

KATE KLONICK: Yeah. Because how else do you define art? You have to just be like, is this what you can see with your eyeballs?

 

SIMON: And so from then on, as they run into problems ...

 

KATE KLONICK: Those rules just constantly get updated.

 

SIMON: Constant amendments.

 

KATE KLONICK: Yeah, constant amendments.

 

SIMON: New problem, new rule. Another new problem, updated rule. In fact at this point, they are amending these rules up to 20 times a month.

 

ROBERT: Wow! Really?

 

JAD: Really?

 

SIMON: Yeah. Take for example those rules about breastfeeding. In 2013, they removed the attachment clause. So the baby no longer needed to have its mouth physically touching the nipple of the woman. And in fact, one nipple and/or areola could be visible in the photo.

 

JAD: But not two.

 

SIMON: Only one. Then, 2014, they make it so that both nipples or both areolae may be present in the photo.

 

ROBERT: So this is what happens in American law all the time, this very thing.

 

SIMON: Yes.

 

KATE KLONICK: Yeah. You know, it sounds a lot like common law.

 

SIMON: So common law is this system dating back to early England where individual judges would make a ruling, which would sort of be a law, but then that law would be amended or evolved by other judges. So the body of law was sort of constantly ...

 

KATE KLONICK: Fleshed out in face of new facts.

 

SIMON: Literally every time this team at Facebook would come up with a rule that they thought was airtight, ka-plop, something would show up that they weren't prepared for, that the rule hadn't accounted for.

 

FACEBOOK EMPLOYEE: As soon as you think, yeah, this is good, like, the next day something shows up to show you, yeah, you didn't think about this.

 

SIMON: For example, sometime around 2011 this content moderator is going through a queue of things.

 

[ARCHIVE CLIP, Facebook moderator: Accept. Reject. Accept. Escalate. Accept.]

 

SIMON: And she comes upon this image.

 

[ARCHIVE CLIP, Facebook moderator: Oh, my God! What?]

 

FACEBOOK EMPLOYEE: The photo itself was a teenage girl, African by dress and skin, breastfeeding a goat. A baby goat.

 

SIMON: The moderator throws her hands up and says ...

 

FACEBOOK EMPLOYEE: "What the fuck is this?" And we Googled breastfeeding goats and found that this was a thing. It turns out it's a survival practice.

 

SIMON: According to what they found, this is a tradition in Kenya that goes back centuries. That in a drought, a known way to help your herd get through the drought is to -- if you have a woman who's lactating, to have her nurse the kid, the baby goat, along with her human kid.

 

ROBERT: Hmm.

 

SIMON: And so there's nothing sexual about it.

 

FACEBOOK EMPLOYEE: Just good farming.

 

ROBERT: Good business.

 

SIMON: And theoretically, if we go point by point through this list, it's an infant. It sort of could walk, so maybe there's an issue there. But there is physical contact between the mouth and the nipple.

 

FACEBOOK EMPLOYEE: But -- but ...

 

SIMON: Obviously ...

 

FACEBOOK EMPLOYEE: Breastfeeding, as we intended anyway, meant human infants.

 

SIMON: And so in that moment, what they decide to do is remove the photo.

 

FACEBOOK EMPLOYEE: And there was an amendment, an asterisk, under the rules stating animals are not babies. We added that so in any future cases, people would know what to do.

 

SOREN WHEELER: They removed -- they discover it was culturally appropriate and a thing that people do, and they decided to remove the photo?

 

SIMON: Yeah.

 

JAD: That outraged individual is our editor, Soren Wheeler.

 

SOREN: Why?

 

FACEBOOK EMPLOYEE: Why didn't we make an exception?

 

SIMON: Because ...

 

FACEBOOK EMPLOYEE: Because when a problem grows large enough, you have to change the rules. If not, we don't. This was not one of those cases. The juice wasn't worth the squeeze.

 

SIMON: And, like, if they were to allow this picture, then they'd have to make some rule about when it was okay to breastfeed an animal and when it wasn't okay.

 

FACEBOOK EMPLOYEE: This is a utilitarian document. It's not about being right 100 percent of the time. It's about being able to execute effectively.

 

SIMON: In other words we're not trying to be perfect here, and we're not even necessarily trying to be 100 percent just or fair, we're just trying to make something that works.

 

AURORA ALMENDRAL: One, two, three, four, five, six, seven, eight.

 

SIMON: And when you step back and look at what Facebook has become, like, from 2008 to now, in just 10 years ...

 

AURORA ALMENDRAL: Simon, I've just arrived at the Accenture Tower here in Manila. I don't know how many floors it is. One, two, three, four, five ...

 

SIMON: The idea of a single set of rules that works, that can be applied fairly ...

 

KATE KLONICK: That's just a crazy, crazy concept.

 

AURORA ALMENDRAL: 15, 16, 17, 18 ...

 

SIMON: Because they've gone from something like 70 million users to 2.2 billion.

 

AURORA ALMENDRAL: It's hard to count, but I would say it's about 30 floors.

 

SIMON: And they've gone from 12 folks sitting in a room deciding what to take down or leave up to somewhere around 16,000 people.

 

AURORA ALMENDRAL: So there's a floor in this building where Facebook supposedly outsources content moderators.

 

SIMON: And so around 2010, they decided to start outsourcing some of this work to places like Manila, where you just heard reporter Aurora Almendral as well as ...

 

GARETH STACK: I mean, I would guess that there are thousands of people in this building.

 

SIMON: ... Dublin, where we sent reporter Gareth Stack.

 

GARETH STACK: Oh, I can see where they get their delicious Facebook treats cooked. Everybody's beavering away.

 

SIMON: And we sent them there to try to talk to some of these people, who for a living sit at a computer and collectively click through around a million flagged bits of content that pop up onto their screen every day.

 

JAD: Wow. I'm just curious, what's that like?

 

SIMON: Well ...

 

AURORA ALMENDRAL: Hello. Can I ask you some questions?

 

MAN: Sorry.

 

SIMON: We found out pretty quickly ...

 

AURORA ALMENDRAL: Who do you work for?

 

SIMON: ... none of these folks were willing to talk to us about what they do.

 

AURORA ALMENDRAL: So there's a lot of running away from me happening.

 

GARETH STACK: Hey lads, sorry to bother you, do you guys work at Facebook?

 

MAN: Ah, no. Sorry.

 

GARETH STACK: Do you happen to work in Facebook by any chance?

 

WOMAN: No, I don't.

 

GARETH STACK: Hi. Sorry to bother you, do you work inside?

 

WOMAN: No. Sorry.

 

GARETH STACK: Do you work in Facebook?

 

MAN: No.

 

GARETH STACK: I mean, you just came out of there. I know you're lying.

 

SIMON: In fact, most people wouldn't even admit they work for the company.

 

ROBERT: Like, what's the -- is there something wrong about being in the ...

 

JAD: Do they have, like, an NDA that they signed?

 

SIMON: Well, yeah. So when I finally did find someone willing to talk to me ...

 

SIMON: Do you want to be named or do you not want to be named?

 

FACEBOOK EMPLOYEE: I'd rather not.

 

SIMON: That's totally fine.

 

FACEBOOK EMPLOYEE: You know, I'm still in the industry. I don't want to lose my job over this shit, you know?

 

SIMON: He explained that he and all the other moderators like him were forced to sign these non-disclosure agreements, stating they weren't allowed to admit that they work for Facebook, they're not allowed to talk about the work they do ...

 

FACEBOOK EMPLOYEE: My contract prohibited me from talking about what content moderation was.

 

ROBERT: Why?

 

SIMON: Several reasons. One is that up until recently, Facebook wanted to keep secret what these rules were so that they couldn't be gamed.

 

ROBERT: Oh.

 

SIMON: At the same time, it creates a sort of separation between these workers and the company, which if you're Facebook, you might want ...

 

FACEBOOK EMPLOYEE: You know, I knew I signed up to monitor graphic images.

 

SIMON: ... just given the nature of the job.

 

FACEBOOK EMPLOYEE: But you know, I didn't really -- you know, you don't really know the impact that that's going to have on you until you go through it.

 

SIMON: So this guy I talked to, he got his first contract doing this work several years back. And for the duration of it, about a year, he'd show up to his desk every morning, put on his headphones ...

 

FACEBOOK EMPLOYEE: Click, click, click, click, click, click, click.

 

SIMON: Ignore, delete, delete.

 

FACEBOOK EMPLOYEE: Case by case by case by case. 5,000 cases every day. It was just image and decision. Image, decision, image, decision.

 

SIMON: Wait, 5,000 a day you just said?

 

FACEBOOK EMPLOYEE: Yeah. It was a lot of cases.

 

SIMON: Yeah, he said basically he'd have to go through an image or some other piece of content every three or four seconds.

 

JAD: Wow. All day long?

 

SIMON: All day, eight hours a day.

 

ROBERT: Whoa.

 

SIMON: Well if I can ask, what kind of things did you see?

 

FACEBOOK EMPLOYEE: I don't know if this is even, like, radio-worthy. I think it's too x-rated.

 

SIMON: Clicking through, he came across unspeakable things.

 

FACEBOOK EMPLOYEE: From heads exploding to, you know, people being squashed by a tank, to people in cages being drowned to, like, a 13-year-old girl having sex with an 8-year-old boy. And it's not just once, it's over and over and over and over.

 

SIMON: Well, and did you -- did this, like, keep you up at night? Or did this ...?

 

FACEBOOK EMPLOYEE: Absolutely. Absolutely, 100 percent. It kept me up at night.

 

SIMON: He'd catch himself thinking about these videos and photos when he was trying to relax. He had to start avoiding things.

 

FACEBOOK EMPLOYEE: There were -- there were specific, like, movies that I couldn't watch. There was one, I think it was a Quentin Tarantino one, my wife wanted to see it. I was like, "Okay." I turned it on. It was like heads were exploding. I was like, "Nope, nope. I have to walk away." And I just -- I had to. It was too real. I saw that. It's classic PTSD.

 

SIMON: A different moderator I spoke to described it as seeing the worst side of humanity. You see all of the stuff that you and I don't have to see because they are going around playing clean-up.

 

ROBERT: Yeah.

 

JAD: What a job. Wow.

 

SIMON: Yeah. And it's worth noting that more and more of this work is being done in an automated fashion, particularly with content like gore or terrorist propaganda. They're getting better.

 

ROBERT: You can automate that?

 

SIMON: Yeah. Through computer vision, they're able to detect hallmarks of a terrorist video or of a gory image, and with terrorist propaganda, they now take down 99 percent of it before anyone flags it on Facebook.

 

JAD: Wow.

 

SIMON: But moving on to our second story here, there is a type of content that they are having an incredibly hard time not just automating, but even getting their rules straight on, and that's hate speech.

 

JAD: Oh, good. Some more laughs coming up.

 

ROBERT: [laughs]

 

SIMON: Well, there will be laughter.

 

ROBERT: Oh, really?

 

SIMON: There will be comedians. There will be jokes.

 

ROBERT: Okay. Comedians.

 

JAD: Hey!

 

ROBERT: All right.

 

JAD: Okay.

 

ROBERT: Well, shall we take a break and then come right back?

 

JAD: No, I think we're gonna keep going.

 

ROBERT: Okay.

 

CARTER HODGE: Testing. One, two, three, four, five. Testing. One, two, three, four, five. I'm Simon Adler.

 

SIMON: So a couple months back ...

 

LIZA YEAGER: I think it's working.

 

CARTER HODGE: Great.

 

SIMON: ... we sent our pair of interns.

 

CARTER HODGE: On the left, 60 feet.

 

SIMON: Carter Hodge ...

 

LIZA YEAGER: Here we go at The Standing Room.

 

SIMON: ... and Liza Yeager ...

 

CLUB DOORMAN: Do you guys have tickets for tonight?

 

LIZA YEAGER: I think we're on the guest-list.

 

CLUB DOORMAN: Okay.

 

SIMON: ... to this cramped, narrow little comedy club. The kind of place with, like ...

 

CARTER HODGE: It's super expensive.

 

LIZA YEAGER: I know.

 

SIMON: ... $15 smashed rosemary cocktails.

 

BARTENDER: What's going on?

 

LIZA YEAGER: None of it. We do not need to get a drink. It's fine.

 

SIMON: High-top tables.

 

CARTER HODGE: The AC is dripping on me.

 

SIMON: But still kind of a dive.

 

CARTER HODGE: That feels good. Yeah.

 

SIMON: And we sent them there to check out someone else who'd found a fault line in Facebook's rulebook.

 

CLUB MC: This is exciting. We're gonna keep moving right along. The next comedian coming to the stage, please give it up for Marcia Belsky!

 

MARCIA BELSKY ON STAGE: Thank you. Yes. I get so mad. I feel like my first time to the city, I was such a carefree brat. You know, I was young and I had these older friends, which I thought was, like, very cool and then you just realize that they're alcoholics, you know?

 

SIMON: She's got dark, curly hair, was raised in Oklahoma.

 

MARCIA BELSKY ON STAGE: And I think -- I was raised Jewish. So when you're raised Jewish, you read about Anne Frank a lot. You know, a lot, a lot. And when you read about Anne Frank, like -- this will get funny. She ...

 

SIMON: How did you decide to become a comedian?

 

MARCIA BELSKY: You know, it was kind of the only thing that ever clicked with me. And especially political comedy. You know, I used to watch the Daily Show every day.

 

SIMON: And back in 2016, she started this political running bit that I think can be called sort of absurdist, feminist comedy.

 

MARCIA BELSKY ON STAGE: Now a lot of people think that I'm, like, an angry feminist. Which is weird. This guy called me a militant feminist the other day and I'm like, "Okay. Just because I am training a militia of women in the woods."

 

MARCIA BELSKY: At first, I just had this running bit online, on Facebook and Twitter.

 

SIMON: She was tweeting and posting jokes.

 

MARCIA BELSKY: You know, like we have all the Buffalo Wild Wings surrounded. You know, things like that.

 

SIMON: Eventually took this bit on stage, even wrote some songs.

 

MARCIA BELSKY ON STAGE: [singing] "All older white men should die, but not my dad. No, no, not my dad."

 

JAD: [laughs]

 

ROBERT: [laughs]

 

SIMON: Anyhow, so about a year into this running bit, Marcia was bored at work one day and logs onto Facebook. But instead of seeing her normal news feed, there was this message that pops up.

 

MARCIA BELSKY: It says, "You posted something that discriminated along the lines of race, gender, or ethnicity group."

 

SIMON: "And so we've removed that post."

 

MARCIA BELSKY: And so I'm like, "What could I possibly have posted?" I really -- I thought it was like a glitch.

 

SIMON: But then she clicked Continue, and there, highlighted, was the violating post. It was a photo of hers.

 

SIMON: What is the picture? Can you describe it?

 

MARCIA BELSKY: The photo is me as what can only be described as a cherub: cute little seven-year-old with big curly hair, and she's wearing this blue floral dress, her teeth are all messed up.

 

SIMON: And into the photo, Marcia had edited in a speech bubble ...

 

MARCIA BELSKY: That just says, "Kill all men." And so it's funny, you know, because I hate -- I hate -- it's funny, you know? Trust me. Whatever. So I thought it was ridiculous because I ...

 

SIMON: So she searched through her library of photos and found that "Kill all men" image.

 

MARCIA BELSKY: And I post it again.

 

SIMON: Immediately after? Like ...

 

MARCIA BELSKY: Yeah. And it got removed again.

 

SIMON: And this time there were consequences.

 

MARCIA BELSKY: I got banned for three days after that.

 

SIMON: Then after several other bans ...

 

MARCIA BELSKY: Shoot forward, this is months later.

 

SIMON: ... a friend of hers had posted an article and underneath it, in the comments section, there were guys posting just really nasty stuff.

 

MARCIA BELSKY: So I commented underneath those comments, "Men are scum." Which was very quickly removed.

 

SIMON: And how long did you get banned for this time?

 

MARCIA BELSKY: 30 days.

 

SIMON: Wow!

 

MARCIA BELSKY: Yeah. I was dumbfounded.

 

SOREN: So there's a rule somewhere that, if I type "Men are scum," you take it down?

 

FACEBOOK EMPLOYEE: Yes.

 

MARCIA BELSKY: I'm like, "What could it be?"

 

SIMON: And so Marcia called on her quote "militia of women."

 

MARCIA BELSKY: Exactly.

 

SIMON: To find out, like, is this just me?

 

MARCIA BELSKY: Female comedians who are sort of like mad on my behalf started experimenting, posting "Men are scum" to see how quickly it would get removed and if it would be removed every time. And it was.

 

SIMON: So they started trying other words.

 

MARCIA BELSKY: Woof. Yeah.

 

SIMON: To find out where the line was.

 

MARCIA BELSKY: My friend put, "Men are da scum." That got removed. "Men are the worst."

 

SIMON: Removed and banned.

 

MARCIA BELSKY: This one girl put, "Men are septic fluid." Banned.

 

SIMON: But ...

 

MARCIA BELSKY: We're only at the middle of the saga.

 

SIMON: It doesn't end there.

 

MARCIA BELSKY: Because there's no ...

 

SIMON: Now she's really like, "What the hell is going on? Is this sexism?"

 

MARCIA BELSKY: So I just start doing the most bare minimum amount of investigating.

 

SIMON: She's Googling around, trying to figure out what these policies are. And pretty quick, she comes across this leaked Facebook document.

 

MARCIA BELSKY: So this is when I lose my mind. This is when Mark Zuckerberg becomes my sworn nemesis for the rest of my life.

 

SIMON: Because what she'd found was a document Facebook used to train their moderators. And inside of it, in a section detailing who Facebook protected from hate speech, there was a multiple choice question that said, "Who do we protect? White men or Black children?" And the correct answer was white men, not Black children.

 

MARCIA BELSKY: Not even kidding.

 

JAD: White men are protected, Black children are not. That's not a good look.

 

MARCIA BELSKY: It's racist. Something's going on here. There's absolutely some sort of unaddressed bias or systematic issue at Facebook.

 

MONIKA BICKERT: Hi.

 

SIMON: Hello.

 

MONIKA BICKERT: How are you?

 

SIMON: I'm doing well. Thank you so much for being willing to do this.

 

MONIKA BICKERT: Yeah. No.

 

SIMON: So not long after sitting down with Marcia, Facebook invited me to come out to their offices in California and sit down with them.

 

MONIKA BICKERT: I'm gonna eat one cookie and then we're on. Ooh, they're little. I think I get two.

 

SIMON: Could I just get your name and your title?

 

MONIKA BICKERT: I'm Monika Bickert, and I lead the policies for Facebook.

 

SIMON: Monika Bickert is in charge of all of Facebook's rules, including their policies on hate speech. And so I asked her, like, why would there be a rule that protects white men, but not Black children?

 

MONIKA BICKERT: We have made our hate speech policies -- let me rephrase that. Our hate speech policies have become more detailed over time, but our main policy is you can't attack a person or group of people based on a protected characteristic. A characteristic like race, religion or gender.

 

SIMON: So this takes a couple of beats to explain, but the gist of it is that Facebook borrowed this idea of protected classes straight from U.S. anti-discrimination law. These are the laws that make it so that you can't not hire someone, say, based on their religion, their ethnicity, their race. And so on Facebook, you can't attack someone based on one of these characteristics. Meaning you can't say, "Men are trash." Nor could you say, "Women are trash," because essentially you're attacking all men for being men, or all women for being women.

 

SOREN: Oh, is it the "All"? Can I say, "Bob is trash?"

 

SIMON: Yeah. You can say, "Bob is trash." Because, as my sources explained to me ...

 

FACEBOOK EMPLOYEE: The distinction is that, in the first instance, you're attacking a category. In the second instance, you're attacking a person, but it's not clear that you're attacking that person because they are a member of a protected category.

 

JAD: Oh, so Bob might be trash for reasons that have nothing to do with him being a man.

 

SIMON: Yeah.

 

JAD: He just might be annoying.

 

SIMON: Right.

 

JAD: Okay, so that explains why you'd take down "Men are scum." But why would you leave up "Black children are scum"? Why would that not get taken down?

 

MONIKA BICKERT: So traditionally, we allowed speech once there was some other word in it that made it about something other than a protected characteristic.

 

SIMON: In Facebook jargon, these are referred to as "non-protected modifiers."

 

ROBERT: This means literally nothing to me. Give us an example of this?

 

MONIKA BICKERT: So traditionally, if you said, "I don't like 'this religion' cab drivers."

 

SIMON: 'Cab driver' would be the non-protected modifier because employment is not a protected category.

 

JAD: Huh.

 

SIMON: And so what the rule stated was, when you add this non-protected modifier to a protected category, in this case the cab driver's religion ...

 

MONIKA BICKERT: We would allow it, because we can't assume that you're hating this person because of his religion. You actually just may not like cab drivers.

 

JAD: So in the case of Black children, "children" is modifying the protected category of Black.

 

SIMON: Mm-hmm.

 

JAD: And so, 'children' trumps 'Black?'

 

SIMON: Age is a non-protected category.

 

JAD: Okay.

 

SIMON: And so 'children' becomes a non-protected modifier, and their childness trumps their Blackness. You can say whatever you want about Black children. Whereas in the case of white men, you've got gender and race, both protected, you can't attack them.
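To make that modifier logic concrete, here is a minimal sketch, in Python, of how the older rule could be written down: an attack on a group counts as hate speech only if the target is described purely by protected characteristics, and any non-protected modifier (age, occupation, and so on) takes the group out of protection. The term lists and function name are hypothetical; this is not Facebook's actual implementation, just an illustration of the rule as described here.

# Illustrative sketch only -- not Facebook's real classifier. It encodes the
# older "protected category + non-protected modifier" rule described above.
# Both term lists are tiny, made-up samples.

PROTECTED = {"white", "black", "men", "women", "muslim", "christian", "gay", "straight"}
NON_PROTECTED_MODIFIERS = {"children", "teenagers", "cab", "drivers", "landlords"}

def group_is_protected(target_terms) -> bool:
    """True if an attack on this group would be removed under the old rule."""
    has_protected = any(t.lower() in PROTECTED for t in target_terms)
    has_modifier = any(t.lower() in NON_PROTECTED_MODIFIERS for t in target_terms)
    return has_protected and not has_modifier

print(group_is_protected(["white", "men"]))             # True  -> "X are scum" gets removed
print(group_is_protected(["Black", "children"]))        # False -> allowed under the pre-update rule
print(group_is_protected(["Muslim", "cab", "drivers"])) # False -> allowed ("cab drivers" is the modifier)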

 

JAD: That's just a bizarre rule. I would think you'd go the other direction, that the protected class would outweigh the modifier.

 

SIMON: Well, they made this decision, as they explained to me, because their default was to allow speech. They were really trying to incorporate or nod to the American free speech tradition.

 

FACEBOOK EMPLOYEE: And so there's a whole lot of stuff out there that none of us would defend as valuable speech, but didn't rise to the level of stuff that we'd say, "This is so bad, we're going to take it down."

 

SIMON: And in this case, their concern was ...

 

FACEBOOK EMPLOYEE: We're all members of, like, you know, at least half a dozen protected categories. Like, we all have gender, we all have sexual orientation.

 

SIMON: If the rule is that any time a protected class is mentioned it could be hate speech, what you are doing at that point is opening up just about every comment that's ever made about anyone on Facebook to potentially be hate speech.

 

FACEBOOK EMPLOYEE: Then you're not left with anything, right?

 

MONIKA BICKERT: No matter where we draw this line, there are going to be some outcomes that we don't like. There are always going to be casualties. That's why we continue to change the policies.

 

SIMON: And in fact since Marcia's debacle, they've actually updated this rule. So now Black children are protected from what they consider the worst forms of hate speech.

 

MONIKA BICKERT: Now our reviewers take how severe the attack is into consideration.

 

SIMON: But despite this, there are still plenty of people ...

 

MARCIA BELSKY: That is flawed because you are a social network ...

 

SIMON: ... including Marcia, who think this still just isn't good enough.

 

MARCIA BELSKY: There are not systematic efforts to eliminate white men in the way that there are other groups. That's why you have protected groups.

 

SIMON: She thinks white men and heterosexuals should not be protected.

 

MARCIA BELSKY: Protect the groups who are actually victims of hate speech.

 

JAD: Makes sense.

 

SIMON: Well, yeah. Because in hate speech, or in thinking about hate speech, there's this idea of privilege, of historically disadvantaged groups, and the idea that those historically disadvantaged groups should have more protection precisely because they've been historically disadvantaged.

 

ROBERT: Mm-hmm.

 

SIMON: And the challenge with that that was presented to me was, okay ...

 

[NEWS CLIP: By the thousands, new Japanese reinforcements poured into ...]

 

SIMON: In the 1940s ...

 

[NEWS CLIP: ... to cut off the Chinese in Chapei and Zhengzhou.]

 

SIMON: ... you had Japanese soldiers ...

 

[NEWS CLIP: Shot and beheaded tens of thousands of Chinese civilians.]

 

SIMON: ... killing millions of Chinese during World War Two. At that same time, you had Japanese American citizens ...

 

[NEWS CLIP: There were more than a hundred thousand persons of Japanese ancestry, all of them would have to move.]

 

SIMON: ... being put into internment camps.

 

FACEBOOK EMPLOYEE: And so we had to ask ourselves a question like, are the Japanese an historically advantaged or disadvantaged group?

 

JAD: Huh.

 

SIMON: Japanese Americans, pretty easy to make a case that they were disadvantaged. But in China, it's a totally different story. And this happened at the exact same moment. So you've got two different places, two different cultural stories. And when you have a website like Facebook, this trans-national community, they realized or they decided that ideas of privilege are so geographically bound that there is no way to effectively weigh and consider who is privileged above who, and decided, therefore, that we are not going to allow historical advantage or historical privilege into the equation at all. And I think it's very important to keep in mind here ...

 

[ARCHIVE CLIP, man: I hate Americans.]

 

SIMON: ... these moderators only have, like, four or five seconds ...

 

[ARCHIVE CLIP, man: Republicans are scum.]

 

SIMON: ... to make a decision.

 

[ARCHIVE CLIP, man: I am an Indian, and even I hate Indians.]

 

SIMON: In those four seconds, is there enough time to figure out where in the world someone is, particularly given IP addresses can easily be masked?

 

[ARCHIVE CLIP, man: Go back where you came from.]

 

SIMON: Is there enough time to figure out a person's ethnicity?

 

[ARCHIVE CLIP, man: White children are better than Black children.]

 

FACEBOOK EMPLOYEE: On top of that, we often don't know an individual's race.

 

[ARCHIVE CLIP, man: Straight people suck.]

 

FACEBOOK EMPLOYEE: Other categories are even less clear, like sexual orientation.

 

SIMON: And they just realized it would be next to impossible to get anybody to be able to run these calculations effectively.

 

MONIKA BICKERT: When we were building that framework we did a lot of tests, and we saw sometimes that it was just too hard for our reviewers to implement a more detailed policy consistently. They just couldn't do it accurately. So we want the policies to be sufficiently detailed to take into account all different types of scenarios, but simple enough that we can apply them consistently and accurately around the world. And the reality is anytime that the policies become more complicated, we see dips in our consistency.

 

SIMON: What Facebook's trying to do is take the First Amendment, this high-minded, lofty legal concept and convert it into an engineering manual that can be executed every four seconds for any piece of content from anywhere on the globe. And when you've got to move that fast, sometimes justice loses.

 

MONIKA BICKERT: That's the -- that's the tension here. And I just want to make sure I emphasize that these policies, they're not gonna please everybody. They often don't please everybody that's working on the policy team at Facebook. But if we want to have one line that we enforce consistently, then it means we have to have some pretty objective black and white rules.

 

JAD: But when we come back, those rules ...

 

ROBERT: They get toppled.

 

JAD: Jad.

 

ROBERT: Robert.

 

JAD: Radiolab.

 

ROBERT: Back to Simon Adler.

 

JAD: Facebook.

 

ROBERT: Free speech.

 

SIMON: So as we just heard before the break, Facebook is trying to do two competing things at once. They're trying to make rules that are just, but at the same time can be reliably executed by thousands of people spread across the globe in ways that are fair and consistent. And I would argue that this balancing act was put to the test April 15th, 2013.

 

[NEWS CLIP: The strike, with the Koran demands ...]

 

[NEWS CLIP: Hey, Carlos, I'm so sorry. We have some breaking news, otherwise I wouldn't cut you off so abruptly. Carlos ...]

 

SIMON: Monday, April 15th, 2013, just before three in the afternoon, two pressure cooker bombs rip through the crowd near the finish line of the Boston Marathon. And as sort of the dust begins to settle ...

 

[ARCHIVE CLIP, man: Oh, my God!]

 

SIMON: ... people, like, spring into action. This one man in a cowboy hat sees this spectator who's been injured, picks him up, throws him in a wheelchair. And as they're pushing him through this sort of ashy cloud, there's this photographer there and he snaps this photo. And the photo shows the man in the cowboy hat and these two other people pushing this man, whose face is ashen from all of the debris. His hair is sort of standing on end, and you can tell that the force of the blast and the particles that got in there are actually holding it in this sort of wedge shape. And one of his legs is completely blown off, and the second one is blown off below the knee, nothing left but the bone sticking out and then sort of skin and muscle and tendons. It's horrific. Meanwhile ...

 

[NEWS CLIP: From the CBS Bay Area studios ...]

 

SIMON: ... on the other side of the country.

 

[NEWS CLIP: KPIX-5 News.]

 

FACEBOOK EMPLOYEE: I remember snippets of the day.

 

SIMON: Facebook employees were clustering around several desks staring at the computer screens, watching the news break.

 

[NEWS CLIP: And this has occurred just in the last half hour or so.]

 

FACEBOOK EMPLOYEE: I have memories of watching some of the coverage.

 

[NEWS CLIP: Chilling new images just released of the Boston bombings.]

 

FACEBOOK EMPLOYEE: I remember seeing the photo published online. And it wasn't long after that someone had posted it on Facebook.

 

SIMON: From the folks I spoke to, the order of events here is a little fuzzy. But pretty quickly this photo's going viral.

 

FACEBOOK EMPLOYEE: And we realized we're going to have to deal with it.

 

SIMON: This image is spreading like wildfire across their platform. It appears to be way outside the rules they'd written, but it's in this totally new context. So they got their team together and sat down in a conference room.

 

FACEBOOK EMPLOYEE: I don't know, there was probably eight or ten people thinking about, like, should we allow it?

 

SIMON: Or should they take it down? According to their rules ...

 

FACEBOOK EMPLOYEE: Yeah. So if you recall the "no insides on the outsides" definition that we had in place, meaning you can't see, like, people's organs or that sort of thing. And if you can, then we wouldn't allow it. And in this photo, you could see -- you could definitely see bone.

 

SIMON: And so by the rules, the photo should obviously come down.

 

FACEBOOK EMPLOYEE: Yep.

 

SIMON: However, half the room says no.

 

FACEBOOK EMPLOYEE: The other people are saying this is newsworthy.

 

SIMON: Essentially, this photo's being posted everywhere else. It's important. We need to suspend the rules, we need to make an exception. Which immediately receives pushback.

 

FACEBOOK EMPLOYEE: Well, I was saying that what we've prided ourselves on was not making those calls. And there are no exceptions. There's either mistakes or improvements.

 

SIMON: We made the guidelines for moments like this.

 

ROBERT: Hmm.

 

SIMON: To which the other side shoots back ...

 

FACEBOOK EMPLOYEE: "Oh my God, are you kidding me? Like the Boston Globe is publishing this all over the place and we're taking it down? Like, are you fucking kidding me?"

 

SIMON: Damn the guidelines, let's have common sense here. Let's be humans. We know that this is important.

 

FACEBOOK EMPLOYEE: And, yeah, they're kind of -- they're right. But the reality is, like, if you say, "Well, we allowed it because it's newsworthy," how do you answer any of the questions about any of the rest of the stuff?

 

SIMON: In other words, this is a Pandora's box. And in fact, for reasons that aren't totally clear, Team Consistency, Team Follow-the-Rules eventually wins the day. They decide to take the photo down. But before they can pull the lever, word starts making its way up the chain.

 

FACEBOOK EMPLOYEE: And internally within Facebook ...

 

SIMON: According to my sources, an executive under Zuckerberg sent down an order.

 

FACEBOOK EMPLOYEE: ... we were essentially told, "Make the exception."

 

JAD: Huh.

 

SIMON: "I don't care what your guidelines say, I don't care what your reason is, the photo stands. You're not taking this down."

 

FACEBOOK EMPLOYEE: Yes. Yes, that's what happened.

 

ROBERT: This decision means that Facebook has just become a publisher. Maybe they don't think they have, but they've made a news judgment. And just willy-nilly they've become CBS, ABC, New York Times, Herald Tribune, Atlantic Monthly, and all these other things. All at once they've just become a news organization.

 

SIMON: Yeah. And this brings up a legal question that's at the center of this conversation about free speech. Like, is Facebook a sort of collective scrapbook for us all? Or, is it a public square where you should be able to say whatever you want? Or yeah, is it now a news organization?

 

[ARCHIVE CLIP, politician: I'm sorry to interrupt, but let me get to one final question that kind of relates to what you're talking about in terms of what exactly Facebook is.]

 

SIMON: And this question has been popping up a lot recently. In fact, it even came up this past April when Zuckerberg was testifying in front of Congress.

 

[ARCHIVE CLIP, politician: I think about 140 million Americans get their news from Facebook. So which are you, are you a tech company or are you the world's largest publisher?]

 

[ARCHIVE CLIP, Mark Zuckerberg: Senator, this is a -- I view us as a tech company, because the primary thing that we do is build technology and products.]

 

[ARCHIVE CLIP, politician: But you said you're responsible for your content, which makes ...]

 

[ARCHIVE CLIP, Mark Zuckerberg: Exactly.]

 

[ARCHIVE CLIP, politician: ... you a kind of a publisher, right?]

 

[ARCHIVE CLIP, Mark Zuckerberg: Well, I agree that we're responsible for the content, but I don't think that that's incompatible with fundamentally at our core being a technology company where the main thing that we do is have engineers and build products.]

 

SIMON: Basically, Zuckerberg and others at the company are arguing, no, they're not a news organization.

 

JAD: Why? What would be the downside of that?

 

SIMON: Well, Facebook currently sits on this little idyllic legal island where they can't be held liable for much of anything, they're subjected to few regulations. However, were they to be seen in the eyes of the court as a media organization, that could change. But setting that aside, what really strikes me about all of this is, here you have a company that really up until this point has been crafting a set of rules that are both as objective as possible and can be executed as consistently as possible. And they've been willing to sacrifice rather large ideas in the name of this. For example, privilege, which we talked about, they decided was too geographically bound to allow for one consistent rule. But if you ask me, there's nothing more subjective or geographically bound than what people find interesting or important, what people find newsworthy.

 

ROBERT: Hmm.

 

SIMON: And I'll give you a great example of this that happened just six months after the Boston Marathon bombing, when this video starts being circulated out of northern Mexico. And it's a video of a woman being grabbed and forced onto her knees in front of a camera. And then a man with his face covered grabs her head, pulls it back and slices her head off right in front of the camera. And this video starts being spread.

 

SHANNON YOUNG: I can't count how many times, like, just reading my Twitter feed, I've been like, "Ah!" You know?

 

SIMON: One person who came across this video, or at least dozens of others like it, was Shannon Young.

 

SHANNON YOUNG: My name is Shannon Young. I am a freelance radio reporter. I've been living here in Mexico for many years now.

 

SIMON: Her beat is covering the drug war. And doing so years back, she noticed this strange phenomenon.

 

SHANNON YOUNG: It first caught my attention in early 2010.

 

SIMON: She'd be checking social media.

 

SHANNON YOUNG: You know, you're scrolling through your feed and you'd see all this news. People say, "Ah! There was this three-hour gun battle and intense fighting all weekend long."

 

SIMON: Folks were posting about clashes between drug cartels and government forces. But then when Shannon would watch the news that night ...

 

[NEWS CLIP: News anchor speaking Spanish.]

 

SIMON: ... she'd see reports on the economy and soccer results, but ...

 

SHANNON YOUNG: The media wasn't covering it.

 

SIMON: ... there'd be no mention of these attacks.

 

SHANNON YOUNG: Nothing to do with the violence.

 

SIMON: And so she and other journalists tried to get to the bottom of this.

 

SHANNON YOUNG: Reporters in Mexico City would contact the state authorities and, you know, public information officer and they'd be like ...

 

SIMON: "Shootings? Bombings? What are you talking about?"

 

SHANNON YOUNG: "Nothing's going on. We have no reports of anything. These are just internet rumors."

 

SIMON: The government even coined a term for these sorts of posts.

 

SHANNON YOUNG: The famous phrase at the time was "Collective psychosis." These people are crazy.

 

SIMON: Because, you know, they didn't want the situation to seem out of control. But then a video was posted. It opens, looking out the windshield of a car on a sunny day. The landscape is dry, dusty, and the video itself is shaky, clearly shot on a phone. And then the woman taping starts talking.

 

SHANNON YOUNG: And this woman, she just narrates as they drive along this highway.

 

SIMON: She pans the phone from the passenger window to the windshield, focusing in on these two destroyed silver pickup trucks.

 

SHANNON YOUNG: And she's saying, "Look at these cars over here, they're, you know, shot up. And ooh, ooh, look here, look here. You know, this 18-wheeler is totally abandoned. It got shot up."

 

SIMON: At one point, she sticks the phone out the window to show all of the bullet casings littering the ground.

 

SHANNON YOUNG: And she just turned the official denial on its head.

 

SIMON: The government was saying there's no violence. Here were cars riddled with bullets. It was impossible to dismiss.

 

SHANNON YOUNG: And from then on, you had more and more citizens, citizen journalists uploading anonymously video of the violence.

 

SIMON: These low-fi, shaky shots of ...

 

SHANNON YOUNG: Shootouts, dismemberments, beheadings. I mean, bodies hanging, dangling off of overpasses to prove to the world that this was really happening. To say, "We're not crazy."

 

ROBERT: It's a cry for help.

 

SIMON: Yeah. Which brings us back to that beheading video we mentioned a bit earlier.

 

FACEBOOK EMPLOYEE: Yeah. That video of the beheading, a lot of people were uploading it, condemning the violence of the drug cartels.

 

SIMON: And when it started showing up on Facebook, much like with the Boston Marathon bombing photo, this team of people sat down in a room, looked at the policy and weighed the arguments.

 

FACEBOOK EMPLOYEE: And my argument was, it was okay by the rules during the Boston bombing, why isn't it okay now?

 

SIMON: Particularly, given that it could help.

 

FACEBOOK EMPLOYEE: Leaving this up means we warn hundreds of thousands of people of the brutality of these cartels. And so we kept it up. However ...

 

[ARCHIVE CLIP, man: It's fucking wrong! It's wrong!]

 

[ARCHIVE CLIP, woman: I think it's utterly irresponsible, and in fact quite despicable of them to put ...]

 

SIMON: When people found out ...

 

[ARCHIVE CLIP, man: I'm talking I have little neighbor kids that don't need to see shit like that.]

 

SIMON: ... backlash.

 

[ARCHIVE CLIP, woman: Is there really any justification for allowing these videos to be ...]

 

SIMON: People as powerful as David Cameron weighed in on this decision ...

 

[NEWS CLIP: Today, the prime minister strongly criticized the move.]

 

SIMON: ... saying we have to protect children from this stuff.

 

[NEWS CLIP: David Cameron tweeted, "It's irresponsible of Facebook to post beheading videos."]

 

FACEBOOK EMPLOYEE: Yep. People were really upset because of what it was showing.

 

SIMON: And so, according to my sources, some of the folks involved in making this decision to leave it up were once again taken into an executive's office.

 

FACEBOOK EMPLOYEE: And so we went up and there was a lot of internal pressure to remove it. And I'd go to my boss and say, "Hey, look, this is the decision we made. I recognize this is controversial. I want to let you know why we made these decisions."

 

SIMON: And they made their case.

 

FACEBOOK EMPLOYEE: There are valid and important human rights reasons why you would want this to be out there to show the kind of savagery. And she vehemently disagreed with that.

 

SIMON: They took another approach, arguing that if we take this down ...

 

FACEBOOK EMPLOYEE: You're deciding to punish people who are trying to raise awareness.

 

SIMON: Again, she wasn't budging.

 

FACEBOOK EMPLOYEE: And just didn't get past that. And ultimately, I was overruled and we removed it just because there was pressure to do so.

 

SIMON: The same people that six months prior told them to leave it up because it was newsworthy said, "Take the video down."

 

[NEWS CLIP: Facebook this week reversed a decision and banned a video posted to the site of a woman being beheaded.]

 

[NEWS CLIP: In a statement, Facebook said, quote, "When we review ..."]

 

ROBERT: If you want the one from Boston in, you probably should have the one from Mexico in.

 

SIMON: Right.

 

FACEBOOK EMPLOYEE: It was a mistake. Yeah, I think it was a mistake. Because I felt like -- like, why do we have these rules in place in the first place? And it's not the only reason, but decisions like that are the thing that precipitated me leaving.

 

JAD: Leaving?

 

SIMON: Yeah. Not too long after that incident, a few members of the team decided to quit. What I think this story shows is that Facebook has become too many different things at the same time. So Facebook is now sort of a playground. It's also an R-rated movie theater. And now it's the front page of a newspaper.

 

JAD: It's all those things at the same time.

 

SIMON: It's all those things at the same time. And what we, the users, are demanding of them is that they create a set of policies that are just. And the reality is, justice means a very different thing in each one of these settings.

 

ROBERT: Justice would mean that the person in Mexico gets told the truth in Mexico by Facebook, and the little boy in England doesn't have to look at something gory and horrible in England. But you can't put them together because they clash.

 

SIMON: Exactly.

 

ROBERT: So how do you solve that?

 

SIMON: I don't know. I think it's important to keep in mind that, even if you have the perfect set of policies that somehow managed to be just in different settings and that can be consistently enforced, the people at the end of the day making these decisions, they're still -- they're still people, they're still human beings.

 

SIMON: Is this working or no?

 

MARIE: I can hear you. Yeah.

 

SIMON: Great. Okay. At long last we figured it out, huh?

 

MARIE: Yeah. Clearly.

 

SIMON: I spoke to one woman who did this work for Facebook.

 

MARIE: I just want to be anonymous. I don't want them to even know that I'm doing it because they might file charges against me.

 

SIMON: We'll call her Marie. She's from the Philippines where she grew up on a coffee farm.

 

MARIE: Yeah. That's my father's crop. And I didn't know that the coffee was only for adults. [laughs]

 

SIMON: She said many afternoons while she was growing up, she and her mother would sit together, like, outside sipping their coffee and tuning into their shortwave radio.

 

[NEWS CLIP: This is the Voice of America, Washington, DC.]

 

SIMON: And they'd sit there ...

 

MARIE: Listening to the Voice of America.

 

SIMON: Silently.

 

[ARCHIVE CLIP, Billy Graham: I'm going to ask that we all bow our heads in prayer.]

 

SIMON: She said one of her favorite things to catch on Voice of America were Billy Graham's sermons.

 

MARIE: Billy Graham, one of the great evangelists.

 

[ARCHIVE CLIP, Billy Graham: Our Father, we thank thee for this love of God that reaches around the world and engulfs all of mankind.]

 

SIMON: But then fast forward 50 years to 2010 and Marie is consuming a very different sort of American media.

 

MARIE: The videos were the ones that affected me. There were times when I felt really bad that I am a Christian and then I looked into these things.

 

SIMON: She became a content moderator back in 2010, and was actually one of the first people in the Philippines doing this work.

 

MARIE: I usually had the night shift, in the early morning or at dawn, from 2:00 a.m. to 4:00 a.m.

 

SIMON: She worked from home, and despite it being dark out, she'd put blankets up over the windows so no one could see what she was looking at. She'd lock the door to keep her kids out.

 

MARIE: I have to drive them away, or I would tell them that it's adult thing, they cannot watch.

 

SIMON: And she and the other moderators on her team, who lived throughout the Philippines, were trained on the guidelines in this rule book.

 

MARIE: There were policies that we have to adhere to, but some of us were just clicking pass, pass, pass, even if it's not really pass, just to finish.

 

SIMON: Just to get through the content fast enough. And in some cases, she thinks ...

 

MARIE: A number of the moderators are doing it as a form of retaliation for the low rate.

 

SIMON: People were pissed at the low pay.

 

SIMON: If I can ask, how much were you making an hour doing this?

 

MARIE: As far as I remember it, we were paid, like, $2.50 per hour.

 

SIMON: Marie wouldn't say whether or not this low wage led her to just let things through. But she did say ...

 

MARIE: Based on my conservative background, there are things that I cannot look objectively at, so I reject many of the things that I think are not acceptable.

 

SIMON: Really?

 

MARIE: Of course.

 

SIMON: She said whether something was outside the rules or not, if her gut told her to, she just took it down.

 

MARIE: Whenever it affects me a lot, I would click the button of, like, it's a violation, because if it's going to disturb the young audience, then it should not be there. So, like, if there's a nude person ...

 

SIMON: Whether it was a breastfeeding photo, or an anatomy video, or a piece of art.

 

MARIE: I would consider it as pornography, and then click. Right away, it's a violation.

 

SIMON: You took the law into your own hands. You went vigilante.

 

MARIE: Yeah, or something. So yeah, I have to protect kids from those evil side of humankind.

 

ROBERT: Where does that leave you feeling? Does that leave you feeling that this is just -- that at the end, this is just undoable?

 

SIMON: I think they will inevitably fail, but they have to try and -- and I think we should all be rooting for them.

 

JAD: This episode was reported by Simon Adler with help from Tracie Hunte, and produced by Simon with help from Bethel Habte.

 

ROBERT: Big thanks to Sarah Roberts, whose research into commercial content moderation got us going big time and we thank her very, very much for that.

 

JAD: Thanks also to Jeffrey Rosen, who helped us in our thinking about what Facebook is.

 

ROBERT: To Michael Churnis, whose voice we used to mask other people's voices.

 

JAD: And to Carolyn Glanville, Ruchika Budhraja.

 

ROBERT: Brian Dogan, Ellen Silver, James Mitchell, and Guy Rosen.

 

JAD: And of course, to all the content moderators who took the time to talk to us. And ...

 

SIMON: Do you want to sign off?

 

JAD: Yeah, I guess we should, huh?

 

ROBERT: We should. Ready? Do you want to go first?

 

JAD: Yeah. I'm Jad Abumrad.

 

ROBERT: I'm Robert Krulwich.

 

JAD: Thanks for listening.

 

[ANSWERING MACHINE: To play the message, press 2. Message one.]

 

[KATE KLONICK: Kate Klonick from Brooklyn, New York. Radiolab was created by Jad Abumrad and is produced by Soren Wheeler. Dylan Keefe is our Director of Sound Design. Maria Matasar-Padilla is our Managing Director. Our staff includes: Simon Adler, Maggie Bartolomeo, Becca Bressler, Rachael Cusick, David Gebel, Bethel Habte, Tracie Hunte, Matt Kielty, Robert Krulwich, Annie McEwen, Latif Nasser, Malissa O'Donnell, Arianne Wack, Pat Walters, and Molly Webster, with help from Shima Oliaee, Carter Hodge and Liza Yeager. Our fact checker is Michelle Harris.]

 

[ANSWERING MACHINE: End of message.]

 

Copyright © 2020 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.

New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.



-30-

 
