
Feb 12, 2021
Transcript
[RADIOLAB INTRO]
JAD ABUMRAD: Three, two, one. Hey, I'm Jad Abumrad. This is Radiolab. Today, we have got a special collaboration with the New Yorker magazine and the New Yorker Radio Hour—very excited about that. So for the last several years, we here at Radiolab—and by "we," I mean mainly Simon Adler—we have been watching and reporting on Facebook, specifically, how Facebook decides and then enforces what people can and cannot post on their site. As many of you know, the way that they do it is they've got this rule book, one single set of rules for every country on the globe, defining what is post-able and what isn't. And then they have a giant army of 15,000 souls who have to moderate all the crap that we put on Facebook. Anyhow, in doing so, Facebook has managed to piss off, well, just about everybody. I mean, despite all of the time, effort and money that they have thrown at this problem by taking posts down ...
[ARCHIVE CLIP: Censorship of …]
JAD: ... they have been accused of censoring voices across the political spectrum, and infringing on users' right to free expression.
[NEWS CLIP: The site won't let them post pictures nursing their infants.]
[ARCHIVE CLIP: Enemies of the First Amendment.]
JAD: And then by leaving material up ...
[NEWS CLIP: ... used to incite violence against Rohingya refugees.]
[NEWS CLIP: Investigators blame Facebook.]
JAD: ... they've been accused of helping to incite a genocide in Myanmar.
[NEWS CLIP: Fake stories influenced the election.]
JAD: And arguably swing the 2016 US presidential election.
[ARCHIVE CLIP: We're working on this. The most impactful decision.]
[ARCHIVE CLIP: Global struggle ...]
JAD: And I start here with this wrap-up because since we last reported on all of that, Facebook has actually made a pretty big shift in how they are gonna approach policing, refereeing the world's speech. It's a shift that's gonna have a massive impact on their decisions about what is and is not allowed on the site, including the question—which we'll talk about in a second—of whether former President Trump should be banned indefinitely from Facebook. But more deeply, this is a shift that has Facebook really looking less like a company and, oddly, a little bit more like a government—an unelected government for the entire planet. So with all of that, let me now hand off to ...
SIMON ADLER: [clears throat]
KATE KLONICK: Hi!
SIMON: Hello, Kate. How are you?
JAD: Simon.
SIMON: Are you rolling on your end?
KATE KLONICK: There we go. Now I am rolling.
SIMON: Great.
KATE KLONICK: I will record myself on my phone.
SIMON: Yeah.
SIMON: So a couple months back, I called up academic Kate Klonick to talk about this shift and this research project she's been working on documenting it.
KATE KLONICK: I want to be done with this project so [bleep] badly. [laughs] I just, like ...
SIMON: [laughs]
KATE KLONICK: Yeah.
SIMON: This has been your life.
KATE KLONICK: Yeah, it has.
SIMON: Yeah. Yeah.
KATE KLONICK: Like, a little bit too much so. I'm ready to, like—you know, I'm ready to kind of do something different.
SIMON: Kate is a professor of law at St. John's University. She's studied Facebook off and on for years. And she was at it again because back in 2018, Mark Zuckerberg, the company's CEO, was considering this strange proposal.
KATE KLONICK: Yes. Like, this crazy project to solve this crisis about content management.
[ARCHIVE CLIP, Kate Klonick: I think you know that I've been kind of inside Facebook for the last couple—like, a little over a year.]
[ARCHIVE CLIP, Mark Zuckerberg: Mm-hmm.]
[ARCHIVE CLIP, Kate Klonick: And ...]
SIMON: Kate actually sat down with Mark to talk about all this. She did it over the computer, so you'll hear some clacking of keys. But anyway, as he told her ...
[ARCHIVE CLIP, Mark Zuckerberg: You know, I said a bunch of times that I just think that it's not sustainable over time for one person or even one company's operations to be making so many decisions balancing free expression and safety at this scale.]
SIMON: Like, I recognize that this is a huge responsibility.
KATE KLONICK: And I'm not gonna be here forever.
[ARCHIVE CLIP, Mark Zuckerberg: You know, I'd like—I plan to be running the company for a while, but one day I'm not gonna be running the company. And I think at that point, it would be good to have built up a separate set of independent structures that ensure that the values around free expression and balance in these equities can exist.]
JAD: Oh, interesting! Like, I trust me, but I don't necessarily trust the next guy.
SIMON: Right.
KATE KLONICK: And so like a benevolent dictator, he wants to devolve power away from Facebook and himself.
SIMON: And what he'd landed on as a model for how to do this ...
KATE KLONICK: Was a Supreme Court for Facebook. And ...
SIMON: Sorry. What exactly—like, what?
KATE KLONICK: Yeah. So the proposal was pretty simple. It was creating a group of people from all over the world that would basically be this oversight on Facebook and its speech policies.
SIMON: Essentially, think of it as like the Supreme Court of the United States. But instead of overruling lower courts' decisions, this Supreme Court of Facebook would be able to overrule Facebook's own decisions.
SIMON: It's a hard pitch to make, isn't it?
KATE KLONICK: Oh, my God. One hundred percent.
SIMON: [laughs]
KATE KLONICK: You can imagine how that went over.
SIMON: Yeah, they're like, "Wait, what? You want us to do what?" That's how I imagine that going.
KATE KLONICK: Yeah. But Mark wanted this to happen, and so it happened. It's part of, like, a larger sense, I think, that he sees Facebook becoming more and more—like, a government isn't even the best term, but, like, a system of government.
[ARCHIVE CLIP, Mark Zuckerberg: I hope over time to use the fact that I have control to basically help implement some different forms of governance for ...]
KATE KLONICK: Like, a long-term legacy that he knows will not make terrible decisions.
JAD: This seems to be them catching up and being like, yeah. Like, if you've got three billion users, you're bigger than any company at that point, any country. Your rules can be as impactful as any government's laws. And so you really need to start thinking of yourself in a new way.
SIMON: Yeah, I think that's right.
SIMON: Has any company ever done anything like this before?
KATE KLONICK: I mean, honestly, there's nothing that even kind of comes close. And I don't want to be grandiose about this, but there is a sense in which it feels like you're—I felt like I was watching an experiment that would, if it—even if it completely and utterly failed, would be remembered and be a lesson for however the world ends up sorting out this problem of online speech.
SIMON: And so once Facebook decided to build this court, they suddenly needed to figure out, like, what cases would go to the court, who would be on it, how would they make these decisions? And it became clear that ...
BRENT HARRIS: It's, you know, not appropriate to have a single person answer these questions on behalf of society or, right, this institution.
SIMON: This is Brent Harris, who led Facebook's effort to build this board, this court. And as one of his first decisions, he said ...
BRENT HARRIS: We need to go out and actually listen to a wide array of people about what the problems are and the challenges are that they are finding and ask them, what do they want this to be? What can we create?
SIMON: And so they held dozens of listening sessions all over the world, talking to laypeople. But the cornerstone of this process was, really, six global workshops where they invited experts to come and weigh in. Kate was one of 40 or so people that attended the US workshop. It was held in the basement of the Nomad Hotel in downtown Manhattan. And when she walked in ...
KATE KLONICK: It was like walking into a technologist's wedding. You come in. Every, like, table is decorated with succulents and bottles of Voss Water and an iPad. The iPad is not for you to keep. And in fact, someone joked—one of the Facebook people joked to me, like, "Yeah, we used a couple-generations-old iPad to make sure no one walked away with any of them." [laughs]
SIMON: [laughs] That's spectacular.
KATE KLONICK: But so you have an iPad. And ultimately ...
SIMON: This moderator came out and tried to get the room's attention.
KATE KLONICK: And of course, like, everyone's half-listening, and most people are on their phones and, like, whatever else.
SIMON: In part because, like, a lot of people in that room were just very skeptical of what Facebook was doing here. I mean, Kate herself remained somewhat skeptical of this court.
KATE KLONICK: This is just something Facebook can scapegoat its really crappy decisions to. That was my main skeptical point in all of this.
SIMON: That Facebook is essentially erecting what will be just a body to absorb blame. But anyhow, the moderator explained what they were up to, that they'd brought these experts here to, in essence, design this institution.
KATE KLONICK: They're like, "So what do you think this should be? Like, what does it look like?" And some of it was, like, an answer to questions. Some of it was things people brought up: case selection questions, board selection, who picks the board? And I would say a solid third of it was people standing up and holding forth on topics that had nothing to do with why we were there that day.
SIMON: [laughs] Less of a question and more of a comment.
KATE KLONICK: Yeah, exactly! [laughs] Holy cow. So many of those.
SIMON: Eventually, though, they got to the heart of the matter: Like, how should a global board think about these cases that are—that are right on the edge?
BRENT HARRIS: What we wanted to do was really put people in the shoes that Facebook is in right now in taking these decisions.
SIMON: So they told them, like, "Hey, you are going to play mock court. As a group, you're going to have to decide whether a piece of content should stay up on Facebook or come down."
KATE KLONICK: And so everyone was asked to open their iPad. So you were asked to—like, we're gonna go over the first simulation. And you'll love this, Simon.
SIMON: [laughs]
KATE KLONICK: The first simulation that they did was the "Kill All Men" simulation.
SIMON: Really?
KATE KLONICK: Yes.
SIMON: Wow! Oh, that's great.
JAD: Oh, this is the thing you—the one that you focused on in the last story. I remember there was, like, a song in there.
SIMON: Yeah, it's ...
JAD: Am I right?
SIMON: You're totally right. We spent 10, 15 minutes dissecting this piece of content. This is ...
JAD: You know what? You should play this and just be like, "Here's what they focused on."
SIMON: Okay, yeah. I think we only need to do about three minutes of it. But here it is.
CLUB MC: This is exciting. We're gonna keep moving right along. The next comedian coming to the stage, please give it up for Marcia Belsky!
[cheering]
SIMON: We did this back in 2018. It's about comedian Marcia Belsky, and a photo she posted.
MARCIA BELSKY: [on stage] Thank you. Yes. I get so mad. I feel like my first time to the city, I was such a carefree brat. You know, I was young and I had these older friends, which I thought was, like, very cool, and then you just realize that they're alcoholics, you know?
SIMON: This is her up on stage. She's got dark, curly hair, was raised in Oklahoma.
SIMON: How did you decide to become a comedian?
MARCIA BELSKY: You know, it was kind of the only thing that ever clicked with me. And especially political comedy. You know, I used to watch The Daily Show every day.
SIMON: And inspired by this political comedy, she started this running bit that I think can be called sort of absurdist feminist comedy.
MARCIA BELSKY: [on stage] Now a lot of people think that I'm, like, an angry feminist. Which is weird. This guy called me a militant feminist the other day and I'm like, "Okay. Just because I am training a militia of women in the woods."
[laughter]
MARCIA BELSKY: At first, I just had this running bit online, on Facebook and Twitter.
SIMON: She was tweeting, posting jokes.
MARCIA BELSKY: You know, like, we have all the Buffalo Wild Wings surrounded. You know, things like that.
SIMON: [laughs]
SIMON: Eventually took this bit on stage, even wrote some songs.
MARCIA BELSKY: [on stage] [singing] "All older white men should die, but not my dad. No, no, not my dad."
JAD: [laughs]
ROBERT KRULWICH: [laughs]
SIMON: Anyhow, so about a year into this running bit, Marcia is bored at work one day and logs onto Facebook. But instead of seeing her normal news feed, there's this message that pops up.
MARCIA BELSKY: It says, "You posted something that discriminated along the lines of race, gender or ethnicity group."
SIMON: "And so we've removed that post."
MARCIA BELSKY: And so I'm like, "What could I possibly have posted?" I really—I thought it was like a glitch.
SIMON: But then she clicked "Continue," and there, highlighted, was the violating post. It was a photo of hers.
SIMON: What is the picture? Can you describe it?
MARCIA BELSKY: The photo is me as what can only be described as a cherub: cute little seven-year-old with big curly hair, and she's wearing this blue floral dress, her teeth are all messed up.
SIMON: And into the photo, Marcia had edited in a speech bubble.
MARCIA BELSKY: That just says, "Kill all men." And so it's funny, you know, because I hate—I hate—it's funny, you know? Trust me. Whatever.
SIMON: Facebook had taken it down because it violated their hate speech policy.
MARCIA BELSKY: I was dumbfounded.
SIMON: And so back to present day, this is the scenario they put in front of these tech elites in the basement of the Nomad Hotel to see, really, how they would react.
BRENT HARRIS: Is that hate speech? What does that mean? And should that be up on Facebook or not?
SIMON: Leave it up or take it down? And so people started to discuss.
KATE KLONICK: People were like, "Well, this wasn't funny." And someone else was like ...
SIMON: Does it matter whether it's funny or not?
KATE KLONICK: Back and forth and back and forth.
SIMON: And even so, like, should men be protected?
KATE KLONICK: Like, men are more protected than other groups.
SIMON: Eventually, though, the room pretty much came to an agreement.
BRENT HARRIS: "Kill all men" is clearly humor or social commentary. That should be up on Facebook, and it's inappropriate for Facebook to take that down.
JAD: Yeah, I get that. I mean, I remember when we first did this, feeling like this is a harmless joke, right? And Facebook should be a place where harmless jokes can get made, because in this case, the joke only works because men are the power structure. If they weren't, it wouldn't be funny.
SIMON: Yeah, it's punching up.
JAD: There you go. It's punching up. Right.
SIMON: But here's where things get interesting, because as we said, they did six of these expert global workshops.
KATE KLONICK: Berlin, Singapore, New Delhi, Mexico City, Nairobi.
SIMON: And at each of them, they ran through this Kill All Men scenario.
BRENT HARRIS: We ran that case across the world. And something that's very, very striking is we got really different viewpoints about should that be up on Facebook or not.
SIMON: Like, not just at the New York workshop, but in Berlin, another Western liberal democracy, and even in Singapore, folks supported leaving it up. And, you know, you'd think that folks who'd experienced more authoritarian governments and restrictions on their speech would also be for leaving it up. But it didn't go that way.
BERHAN TAYE: [laughs] This sounds really bad.
SIMON: Go for it.
BERHAN TAYE: But I understand that—like, I understand that, of course, like, kill all men. That's the most feminist, radical joke that you can make.
SIMON: This is Berhan Taye. She works for an NGO called Access Now.
BERHAN TAYE: We defend and extend digital rights of users at risk around the world.
SIMON: And when she was shown this photo at the global workshop in Nairobi, which had attendees from all across the African continent, her thought was ...
BERHAN TAYE: It's very funny. And, you know, many of us that are feminists might have said that once—you know, once, twice in our life, right? Where you're just like, no, could we—yeah. You know, and I understand that to be a joke. So I'm like, yeah, of course, there should be space for humor, and I know why satire is so important.
SIMON: But I'm sensing a "but." What is it?
BERHAN TAYE: So, you know, it's—how do I put it? So for me right now, you know, it's funny, but, you know, humor is a luxury. And we're not—I mean, none of us are laughing right now. So, yes, we've seen content like that that's, unfortunately, quite prevalent. And, you know, we've lived through it, so it's not something that we joke about, right?
JAD: What is she—what events in the world is she thinking of when she says that?
SIMON: Well, some very recent history—and so we're gonna take a little bit of a detour here to understand why Berhan would want that "kill all men" joke taken down. And along the way, we're gonna see close up, really, the life-and-death decisions this global court will have to make. We'll get to that right after a quick break.
[LISTENER: This is Lauren Furey from Western Springs, Illinois. Radiolab is supported in part by the Alfred P. Sloan Foundation, enhancing public understanding of science and technology in the modern world. More information about Sloan at www.sloan.org.]
[JAD: Science reporting on Radiolab is supported in part by Science Sandbox, a Simons Foundation initiative dedicated to engaging everyone with the process of science.]
JAD: Jad. Radiolab. Here with Simon Adler.
SIMON: Yes, yes, yes, yes, yes.
JAD: Okay, before we went to break, we met digital rights activist Berhan Taye, who was opposed to leaving a joke like "kill all men" on Facebook.
SIMON: That is correct.
JAD: So why is that? What was she thinking?
SIMON: Yeah. Well I mean, it comes down to what's been going on in her home country.
BERHAN TAYE: You know, there's absurdity in Ethiopia right now.
SIMON: Ethiopia.
BERHAN TAYE: There's a lot of animosity between different, you know, groups, a lot of tension.
SIMON: And looking at just the past four or five years there, you see how these questions of who's punching up and who's punching down can get flipped on their head with the click of a mouse. So to set things up, Ethiopia sits right on the Horn of Africa. It's the second-most populous country on the continent. And for a long time, it was considered one of the world's leading jailers of journalists.
ENDALK CHALA: Politically, the country used to be very authoritarian, very repressive.
SIMON: This is online-activist-turned-academic Endalk Chala.
ENDALK CHALA: Assistant professor at Hamline University. And yes, I can say that me and some of my colleagues were, like, the first people blogging to the Ethiopian public.
SIMON: He was actually forced into exile because of this activism. And the way he tells it ...
ENDALK CHALA: Late 2015 ...
[NEWS CLIP: The worst unrest in a decade.]
[NEWS CLIP: The demonstration started as a small-scale student protest.]
SIMON: Student protests break out. And they start spreading across the country.
[NEWS CLIP: Thousands took to Ethiopia's streets over the weekend.]
SIMON: And watching this unfold from the United States, Dr. Chala noticed that at the center of these protests was this guy Jawar Mohammed.
ENDALK CHALA: Yes. Jawar himself is a very tech-savvy guy. He's articulate in English.
[ARCHIVE CLIP, Jawar Mohammed: If dissenting voices are allowed, there is going to be sufficient pressure on the government to break its will.]
ENDALK CHALA: And he had about 1.4 or three million followers on Facebook.
SIMON: Making him as powerful as just about any news organization in Ethiopia. Now a couple of quick things about Jawar: Number one, he is from the Oromo ethnicity, the largest ethnic group in the country—and we'll get more into that in a moment. But first, the other notable thing about Jawar is that at the time that these protests were getting underway, he was actually living in Minnesota. He was in exile there, thousands of miles away from the action.
[NEWS CLIP: At least 75 people killed during …]
SIMON: But as these protests intensified, including clashes with the government ...
[ARCHIVE CLIP, Jawar Mohammed: They died for the true cause!]
[NEWS CLIP: Two people were killed in clashes with ...]
[ARCHIVE CLIP, Jawar Mohammed: They died to liberate their people!]
SIMON: ... he was able to galvanize folks and direct things because of Facebook.
[ARCHIVE CLIP, Jawar Mohammed: Whether you live in America, Canada, Oromia or Kenya, you have the obligation to take up the arms of these young men.]
SIMON: So that sort of amazingly, when these protests succeeded ...
[ARCHIVE CLIP: chanting and cheering]
[NEWS CLIP: Hailemariam Desalegn has resigned amid deadly anti-government protests there.]
SIMON: ... he was lionized as, well, a hero. One who'd helped usher in a new prime minister ...
[NEWS CLIP: Ethiopia has a new leader, Abiy Ahmed.]
[NEWS CLIP: Abiy Ahmed won 60 percent ...]
SIMON: And a new era in Ethiopia.
[NEWS CLIP: Since coming to power, Prime Minister Abiy Ahmed was engaged in listening to what people of the country have to say.]
BERHAN TAYE: And for the first time in our entire maybe 3,000 years of history …
SIMON: Again, Berhan Taye.
BERHAN TAYE: ... we actually thought we could be a cohesive, united country.
SIMON: The government freed thousands of political prisoners and journalists.
[NEWS CLIP: The latest of sweeping measures ...]
SIMON: Invited those in exile ...
[NEWS CLIP: Ethiopian dissidents exiled abroad ...]
SIMON: ... to come back home. Even ended a decades-long conflict with neighboring Eritrea.
[NEWS CLIP: A promise delivered.]
SIMON: I mean, these changes were so profound that Ethiopia's new prime minister, Abiy Ahmed, thanks in no small part to Jawar Mohammed, went on to win ...
[ARCHIVE CLIP, Berit Reiss-Andersen: The Nobel committee has decided to award the Nobel Peace Prize to Ethiopian Prime Minister Abiy Ahmed Ali.]
SIMON: ... that's right. The Nobel Peace Prize. So what you've got here is really the promise of Facebook realized, right? Like, man from thousands of miles away leverages Facebook's power to bring down an authoritarian government and elevate a peace-loving leader. I mean, this is David-and-Goliath-level [bleep]. And as part of all of these reforms ...
[ARCHIVE CLIP, Jawar Mohammed: I will be traveling back to the country. We have now established our office in Addis Ababa.]
SIMON: ... Jawar Mohammed returned to Ethiopia and was welcomed with open arms. However ...
[NEWS CLIP: While Abiy Ahmed's reform ambitions have increased his popularity, analysts fear that ethnic rivalries in Ethiopia will undermine his reforms.]
SIMON: ... the very forces that brought this change about began pulling in the opposite direction.
ENDALK CHALA: And I'm sure you're going to get a lot of reaction for this, because everything is contested in Ethiopia—every historical fact, everything. You know, you see people are confused. There is information disorder in the United States; it is just like child's play when you compare it with Ethiopia. But yes, the first violence happened in 2018. It was gruesome pictures circulating on Facebook along with, you know, different anti-ethnic-minority sentiment.
JAD: But what were the ethnic tensions, and what was being said?
SIMON: Yeah. So how complicated to get—or how in the weeds to get here?
JAD: Get complicated.
SIMON: Well, okay. So as I mentioned, Jawar is part of the Oromo ethnicity, the largest ethnicity in the country. And while the Oromo are the largest, they've also long felt politically and culturally marginalized. And this feeling of marginalization, this resentment, this was really at the heart of the revolutionary protests that Jawar had helped lead.
[ARCHIVE CLIP, interviewer: Jawar, I'm just curious. Are you Oromo first or Ethiopian first?]
[ARCHIVE CLIP, Jawar Mohammed: I am an Oromo first.]
SIMON: I mean, many of his posts pointed directly at it.
ENDALK CHALA: He would say Oromo are oppressed, how Oromos were marginalized. And that is absolutely okay with me because there is some historical truth to it. But he's a guy, like, who heats up the temperature, ramps up some emotions.
[ARCHIVE CLIP, Jawar Mohammed: As I said, we are forced to fight back, to coalesce together, to come together and fight back.]
SIMON: But now, even with the old government out of power and a new Oromo prime minister in power, Jawar Mohammed did not let up. He kept stoking this resentment.
[ARCHIVE CLIP, Jawar Mohammed: To be honest with you, I think there is a risk of, not civil war, but catastrophic communal violence across the country. I think people have to be very careful from that one.]
SIMON: And with this inversion of power, statements he was making during the protests sounded very different in 2018. Like, even just the line ...
[ARCHIVE CLIP, Jawar Mohammed: This is our land. This is our homeland.]
SIMON: ... went from being about Ethiopians getting a corrupt government out of power to Oromos getting minorities out of their territory. And quickly, the language began to escalate.
ENDALK CHALA: He will ramp up with, like, "Protect your land. Minorities, they are aliens. They are going to loot you. You know, they are evil."
SIMON: Until eventually ...
ENDALK CHALA: October, 2019.
[NEWS CLIP: The riots began on the 23rd of October, 2019, and lasted for several days. A mob took to the streets, burned cars and killed several people they thought were their opponents. Eighty-six people died across the country. What caused this horrific outbreak of violence? The Facebook post by opposition leader Jawar Mohammed.]
SIMON: One evening, from his home in Addis Ababa, Jawar Mohammed posted an unsupported claim.
ENDALK CHALA: Insinuating that he is going to be killed by minorities.
[NEWS CLIP: In his post, he called on his supporters for help. In response, some of his followers called for war.]
SIMON: And while Jawar denies that he was intentionally inciting violence, hate flooded onto Facebook.
BERHAN TAYE: Content calling for the killing of all minority groups.
SIMON: Again, Berhan Taye.
BERHAN TAYE: Content actually telling people, like, if your neighbor is from a different ethnic group, go and kill them. Literally, that was what we were seeing.
ENDALK CHALA: And then everyone started to take things on their own hand and, you know, kill minorities.
BERHAN TAYE: Everything that could go wrong went wrong.
ENDALK CHALA: Minorities were brutally murdered. Like, brutally. Brutal, brutal, gruesome violence.
[NEWS CLIP: Minority communities being brutally targeted by the Oromo, the country's largest ethnic group.]
[ARCHIVE CLIP: (through interpreter) When they tried to cut my granddaughter's breast, I took out mine, and I begged them to cut mine instead. Then they stopped, but they took her father instead.]
SIMON: And since then, the government just has not been able to get back to any sort of peace.
[NEWS CLIP: More than 800,000 people have been displaced.]
[NEWS CLIP: At least five people were shot dead by police on Monday.]
[NEWS CLIP: At least 50 people ...]
[NEWS CLIP: The fatal shooting of Hachalu ...]
SIMON: And so every couple weeks ...
[NEWS CLIP: Dozens have been killed.]
SIMON: ... there's just another outbreak ...
[NEWS CLIP: Gunshots continue ...]
SIMON: ... of this sort of violence.
ENDALK CHALA: Facebook brought this change, this political change. And that is bullshit for me. I'm sorry for my phrase, but that is what happened.
SIMON: And so back in Nairobi, in an air-conditioned conference room where this Supreme Court of Facebook training session was underway, as Berhan was sitting there, staring down at this iPad with a photo on it that says "kill all men," she's like, "Yeah, this has to come down."
BERHAN TAYE: You know, I'm not in a space to, you know, even give space to having a conversation about content governance and moderation when it's about humor.
SIMON: And Berhan was not alone in this.
BRENT HARRIS: Many people felt that is an incitement to violence. That could result in actual harm.
SIMON: Again, Facebook's Brent Harris.
BRENT HARRIS: And that is something that should not be on Facebook.
BERHAN TAYE: And so I think around 4:00 pm, to be honest with you, I left.
SIMON: She walked out of the session.
BERHAN TAYE: Because I was just like, no. This does not address the issues that we're talking about today.
JAD: Damn, what do we do? Because it really is a "we." What do we do if the very thing that people in New York in an ironic way say must stay up is the very thing that makes her walk out because it's just utterly privileged and completely ignorant of the real-life consequences of hate speech? [bleep]. That's—wow!
SIMON: And keep in mind, these are just mock trials—training sessions, really. Like, they ran into this as they were trying to figure out how to answer these sorts of questions. And now we will get to some of their actual rulings and this Supreme Court itself.
JAD: Yeah.
SIMON: But first, like, I think that the tension we're seeing here goes deeper than this one example. I mean, at the core of Facebook is this very American understanding of freedom of expression. And you hear this even in the way Facebook executives just talk about the company.
[ARCHIVE CLIP, Mark Zuckerberg: And more people being able to share their experiences, that's how we make progress together.]
SIMON: You know, how many times has Mark Zuckerberg said some version of this?
[ARCHIVE CLIP, Mark Zuckerberg: The most progress in our lives actually comes from individuals having more of a voice.]
SIMON: But when you talk to people from different parts of the world, like, there's not universal agreement on this.
ENDALK CHALA: I will definitely tell you that I found myself—oh, my goodness. I was not as liberal as I thought.
SIMON: Again, Professor Endalk Chala.
ENDALK CHALA: In Ethiopia, Facebook came and overwhelmed us with information. We didn't have a well-established fact-checking system. We didn't have journalism institutions.
SIMON: We, Ethiopia, have only imported Facebook. We haven't imported the rest of the institutions and democratic foundations.
JAD: Right.
SIMON: Or the economic security that makes such untrammeled freedom of expression beneficial.
ENDALK CHALA: And so, well, ten years ago, eight years ago, yeah, I thought that freedom of expression and technology would help us, liberate us and get us out of an authoritarian system. Now I have seen people who get angry, and they will take matters into their own hands. That's what happened. So it's about, like, a choice between coexistence or saying whatever you want to say. It comes down to that for me. And as I have seen the violence that such speech has made, I think I would prefer coexistence.
SIMON: And to put that opinion in perspective here ...
BERHAN TAYE: Eighty percent of Facebook users are not American.
JAD: Eight zero?
SIMON: Yeah.
JAD: Really?
SIMON: Yeah.
BERHAN TAYE: And content moderation is a very difficult task, one that's being done by people that have no freaking idea about our way of life, you know? And unfortunately, it's us that are being affected over and over again with these things, than—you know, than you guys.
JAD: I mean, is there anyone openly advocating for just abolishing Facebook?
SIMON: [laughs] Yes, but I don't think anybody's taking that particularly seriously.
JAD: But I mean, come on. Like, at a certain point, if a private company becomes so potentially toxic to the very basic functioning of a decent democracy—I don't know, man. I don't know. Unless you can somehow break Facebook into a Balkanized set of internets.
SIMON: Right.
JAD: Where each one has its own separate rules. But I doubt that's even possible.
SIMON: Well, engineering-wise, it is possible. Facebook, in a few rare instances, already does employ some version of this. I spoke to Monika Bickert, who is Facebook's head of global policy, and she explained that there are certain slurs that are outlawed in specific regions but allowed everywhere else. And similarly, they do have to abide by local laws. But she did go on to say that, quote, "If you want a borderless community, you have to have global policies." And that she doesn't expect that to change.
JAD: No. No! That's crazy. You're gonna have to be so astute and so aware of regional context and regional history. I just don't think that's possible. So actually, now that I'm saying it out loud, I think they should be outlawed. I don't know. I've suddenly talked myself into a very extreme position, but it suddenly seems like, what other solution is there?
SIMON: Well, the solution Facebook has landed on is this Supreme Court. After those global workshops, they took all that feedback and created this independent structure. It's going to have 40 members—it currently has 19. The members represent every continent other than Antarctica. And they're from just a wide array of backgrounds. Some are lawyers, others are free speech scholars, activists, journalists, even a former prime minister of Denmark.
SIMON: And among the first decisions they're going to have to make is whether or not former President Trump will be banned from the platform indefinitely. Facebook has currently banned him, but it will be up to the board to rule on whether that ban should remain or be lifted. And I mean, this decision won't just impact Trump. It could very well have implications for how Facebook will deal with political figures not just in the United States, but in places like Ethiopia.
SIMON: Hello, hello.
MAINA KIAI: Hey, Simon.
SIMON: Maina, very nice to meet you virtually here.
MAINA KIAI: How are you doing?
SIMON: I'm good. How are you, sir?
MAINA KIAI: All right.
SIMON: And while making the right decisions for the entire planet seems in many ways impossible, when I sat down and talked to several members of this court, of this board, I have to say they did make me a little bit hopeful.
SIMON: Thanks so much for being willing to do this. I hope we can have a little bit of fun here today.
MAINA KIAI: I hope so. I was—yes, I think we should make as much controversy as possible.
SIMON: Oh, wow! Okay.
SIMON: This is Maina Kiai. He's a member of the board, former special rapporteur to the United Nations. And he's basically spent his entire life fighting for human rights. And what struck me about him right off the bat is just how un-Facebook-y he is.
MAINA KIAI: I haven't—I haven't used Facebook or Twitter myself.
SIMON: Really?
MAINA KIAI: I'm old-school. I try to keep my private life private.
SIMON: Why the hell were you chosen to be on the oversight board of a product that you don't even use?
MAINA KIAI: Why?
SIMON: Yeah.
MAINA KIAI: Because there were all kinds of people being chosen for it. I mean, that's the beauty of it, isn't it, that we have all kinds of people on the board. All kinds.
SIMON: And that he sees the solution here in the incremental progress we've made in the past.
MAINA KIAI: You know, look: I see this work as human rights work. I have gone through in my life different things around hate speech, using radio in, first of all, Rwanda, then in Kenya as well. The media can be abused, and then how do you rein them in? How do you mitigate them? And how do you mitigate them in a way that doesn't abuse human rights? So the tools and the problems are basically the same. The difference is that media—mainstream media before social media—has been regulated over time, decades and years, and that has informed and guided how the information is put out.
SIMON: He said, just look at the five-second delay that live television runs on now.
MAINA KIAI: I'm sure when it started with live television and live radio, it was on the go. So I think that's the questions we have to now deal with with Facebook. But I think I have confidence that there is enough experience in the world that's dealt with these phenomenons.
SIMON: And this feeling resonates with most of the people I spoke to at Facebook.
BRENT HARRIS: I mean, I spent about 15 years working on climate before I came to Facebook. And I think the issues here are deeply analogous.
SIMON: Again, Brent Harris.
BRENT HARRIS: They are human-generated. There are major regulatory actions that are needed. There's a serious responsibility by industry to step up and think about the responsibility that they hold. And the solutions that will come forward as we start to figure out how to address these types of challenges will inherently be incremental. And at times, I worry we will kill off incremental good progress that starts to address these issues because it doesn't solve everything.
SIMON: Hmm.
BRENT HARRIS: You know, is the Paris agreement enough? No. Is it a lot better than what we had before? Yes. Is the Montreal Protocol enough? No. Is it a substantial step forward against this challenge? Yes. And building this board is only one step in a wide array of many other steps that need to be taken on.
SIMON: It sounds to me that what you're saying is this is the first piece in this global governance body Facebook is imagining.
BRENT HARRIS: Well, if it really works and people end up believing in it and thinking it's a step forward, then yeah, further steps can be taken.
[ARCHIVE CLIP, Mark Zuckerberg: You know, nothing's ever perfect. There are always gonna be issues. People will criticize the specific people who are on it. They'll criticize the process.]
SIMON: And I mean, when Kate Klonick, who turned us on to this story to begin with, when she interviewed Mark Zuckerberg, he said as much.
[ARCHIVE CLIP, Mark Zuckerberg: It's not like the oversight board is the end. It is one institution that needs to get built as part of the eventual community governance that I think we will end up having 10 years from now, or however long it takes to build all of this out. It just felt like a kind of a concrete step that we could go take.]
SIMON: And what they're thinking of in terms of next steps ...
BRENT HARRIS: One would be something like regional circuits or, you know, a level of adjudication that are more regional or more localized, that sit below this board as a means of taking these decisions.
SIMON: You mean like seven continental courts or, I don't know, 52 sub-regional courts that feed up to the one supreme court?
BRENT HARRIS: Yeah, that's right.
SIMON: And so what we're watching spring up here is not just a solution to what is truly one of the problems of our moment, but also this wholly new way to organize ourselves and sort of adjudicate our behavior.
MAINA KIAI: Look. Look, what we're trying to do is an experiment. I cannot tell you it will work, but I can tell you we'll try to make it work as much as possible. And when we make mistakes, I am absolutely—I have got no doubt in my mind that being the humans we are, not yet evolved into saints and angels, we will make mistakes. That's part of the process.
SIMON: The oversight board started officially hearing cases in October. They've already ruled on matters ranging from whether nude photos advocating breast cancer awareness should stand, to whether a post about churches in Azerbaijan constitutes hate speech. And real quick before we go: an update, actually. Since we first reported this story, the oversight board has come to a decision about President Trump. They chose to uphold Facebook's ban, meaning, well, you won't be seeing posts from him in your timeline anytime soon.
JAD: This story was produced and reported by Simon Adler, with original music throughout by Simon. Is this original music by Simon that we're hearing right now, Simon?
SIMON: It is, indeed.
JAD: All right! As we said at the top, this episode was made in collaboration with The New Yorker Radio Hour and New Yorker magazine. To hear more about the intricacies of how this court came into being, the rulings they've already made and what's coming up on their docket, check out David Remnick and reporter Kate Klonick's conversation in The New Yorker Radio Hour's podcast feed, or head over to NewYorkerRadioHour.org. And on that note, a huge thank you to Kate Klonick, whose tireless coverage of Facebook and their oversight board made this story possible. We'd also like to give special thanks to Julie Owono, Tim Wu, Noah Feldman, Andrew Marantz, Monika Bickert, John Taylor, Jeff Gelman and all the volunteers who spoke with us from the Network Against Hate Speech.
SIMON: Beautiful, Jad. That's great.
JAD: All right.
[LISTENER: Hi, this is Claire Sebree calling from Lafayette, California. Radiolab was created by Jad Abumrad and is edited by Soren Wheeler. Lulu Miller and Latif Nasser are our co-hosts. Dylan Keefe is our director of sound design. Suzie Lechtenberg is our executive producer. Our staff includes: Simon Adler, Jeremy Bloom, Becca Bressler, Rachael Cusick, David Gebel, Matt Kielty, Annie McEwen, Sarah Qari, Arianne Wack, Pat Walters and Molly Webster, with help from Shima Oliaee, Sarah Sandbach and Jonny Moens. Our fact-checkers are Diane Kelly and Emily Krieger.]
-30-
Copyright © 2024 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of programming is the audio record.