Feb 12, 2021

Facebook's Supreme Court

Since its inception, the perennial thorn in Facebook's side has been content moderation: deciding what you and I are allowed to post on the site and what we're not. Missteps by Facebook in this area have fueled everything from a genocide in Myanmar to viral disinformation about politics and the coronavirus. This past year, conceding its failings, Facebook shifted its approach. It erected an independent body of twenty jurors that will make the final call on many of Facebook's thorniest decisions. This body has been called Facebook's Supreme Court.

So today, in collaboration with The New Yorker magazine and The New Yorker Radio Hour, we explore how this body came to be, what power it really has, and why the consequences of its decisions will be nothing short of life or death.

This episode was reported and produced by Simon Adler.

To hear more about the court's origin, its rulings so far, and its upcoming docket, check out David Remnick and reporter Kate Klonick's conversation in The New Yorker Radio Hour podcast feed.

Support Radiolab by becoming a member today at Radiolab.org/donate.    



ANNOUNCER: Listener-supported WNYC Studios.


UNIDENTIFIED PERSON #1: Wait. Wait. You're listening (laughter)...





UNIDENTIFIED PERSON #2: You're listening...







JAD ABUMRAD: Three, two, one. Hey. I'm Jad Abumrad. This is RADIOLAB. Today, we have got a special collaboration with The New Yorker magazine and The New Yorker Radio Hour - very excited about that. So for the last several years, we here at RADIOLAB - and by we, I mean mainly Simon Adler - he - we have been watching and reporting on Facebook - specifically, how Facebook decides and then enforces what people can and cannot post on their site. As many of you know, the way that they do it is they've got this rule book, one single set of rules for the many countries in the globe that define what is postable and what isn't. And then they have a giant army of 15,000 souls who have to moderate all the crap that we put on Facebook. Anyhow, in doing so, Facebook has managed to piss off, well, just about everybody.


JAD ABUMRAD: I mean, despite all of the time, effort and money that they have thrown at this problem by taking posts down...


UNIDENTIFIED PERSON #4: ...Censorship of...

JAD ABUMRAD: ...They have been accused of censoring voices across the political spectrum and infringing on users' right to free expression.


UNIDENTIFIED REPORTER #1: The site won't let them post pictures nursing their...

UNIDENTIFIED REPORTER #2: ...Enemies of the First Amendment.

JAD ABUMRAD: And then by leaving material up...


UNIDENTIFIED PERSON #5: (Non-English language spoken).

UNIDENTIFIED REPORTER #3: ...Used to incite violence against Rohingya refugees.

UNIDENTIFIED PERSON #6: Investigators blame Facebook.

JAD ABUMRAD: ...They've been accused of helping to incite a genocide in Myanmar...


UNIDENTIFIED REPORTER #4: Big stories influence the election.

JAD ABUMRAD: ...And arguably swing the 2016 U.S. presidential election.


UNIDENTIFIED PERSON #7: (Unintelligible).

UNIDENTIFIED PERSON #8: We're working on this. The most impactful decision...

UNIDENTIFIED PERSON #9: Global struggle...

JAD ABUMRAD: And I start here with this wrap-up because since we last reported on all of that, Facebook has actually made a pretty big shift in how they are going to approach policing, refereeing the world's speech. It's a shift that - it's going to have a massive impact on their decisions about what is and is not allowed on the site, including the question - which we'll talk about in a second - of whether former President Trump should be banned indefinitely from Facebook. But more deeply, this is a shift that has Facebook really looking less like a company and, oddly, a little bit more like a government, an unelected government for the entire planet. So with all of that, let me now hand off to...



SIMON ADLER: Hello, Kate. How are you?


SIMON ADLER: Are you rolling on your end?

KATE KLONICK: There we go. Now I am rolling.


KATE KLONICK: I will record myself on my phone.

SIMON ADLER: Yeah. So a couple months back, I called up academic Kate Klonick to talk about this shift and this research project she's been working on documenting it.

KATE KLONICK: I want to be done with this project so [expletive] badly (laughter). I just, like...

SIMON ADLER: (Laughter).


SIMON ADLER: This has been your life.

KATE KLONICK: Yeah, it has...

SIMON ADLER: Yeah. Yeah.

KATE KLONICK: ...Like, a little bit too much so. I'm ready to, like - you know, I'm ready to kind of do something different.

SIMON ADLER: Kate is a professor of law at St. John's University. She's studied Facebook off and on for years. And she was at it again because back in 2018, Mark Zuckerberg, the company's CEO, was considering this strange proposal.

KATE KLONICK: Yes, like, this crazy project to solve this crisis about content management.


KATE KLONICK: I think you know that I've been kind of inside Facebook for the last couple - like, a little over a year.



SIMON ADLER: Kate actually sat down with Mark to talk about all this. She did it over the computer, so you'll hear some clacking of keys. But anyway, as he told her...


MARK ZUCKERBERG: You know, I said a bunch of times that I just think that it's not sustainable over time for one person or even one company's operations to be making so many decisions balancing free expression and safety at this scale.

SIMON ADLER: Like, I recognize that this is a huge responsibility...

KATE KLONICK: And I'm not going to be here forever.


MARK ZUCKERBERG: You know, I'd like - I plan to be running the company for a while, but one day, I'm not going to be running the company. And I think at that point, it would be good to have built up a separate set of independent structures that ensure that the values around free expression and balance in these equities can exist.

JAD ABUMRAD: Oh, interesting. Like, I trust me, but I don't necessarily trust the next guy.


KATE KLONICK: And so like a benevolent dictator, he wants to devolve power away from Facebook and himself.

SIMON ADLER: And what he'd landed on as a model for how to do this...

KATE KLONICK: ...Was a Supreme Court for Facebook. And...

SIMON ADLER: Sorry. What exactly - like, what?

KATE KLONICK: Yeah. So the proposal was pretty simple. It was creating a group of people from all over the world that would basically be this oversight on Facebook and its speech policies.

SIMON ADLER: Essentially, think of it as like the Supreme Court of the United States. But instead of overruling lower courts' decisions, this Supreme Court of Facebook would be able to overrule Facebook's own decisions.

It's a hard pitch to make, isn't it?

KATE KLONICK: Oh, my God, 100%.

SIMON ADLER: (Laughter).

KATE KLONICK: You can imagine how that went over.

SIMON ADLER: Yeah, they're like, wait, what? You want us to do what? That's how I imagine that going.

KATE KLONICK: Yeah. But Mark wanted this to happen, and so it happened.


KATE KLONICK: It's part of, like, a larger sense, I think, that he sees Facebook becoming more and more - like, a government isn't even the best term, but, like, a system of government.


MARK ZUCKERBERG: I hope over time to use the fact that I have control to basically help implement some different forms of governance for...

KATE KLONICK: Like, a long-term legacy that he knows will not make terrible decisions.

JAD ABUMRAD: This seems to be them catching up and being like, yeah. Like, if you've got 3 billion users, you're bigger than any company at that point, any country. Your rules can be as impactful as any government's laws. And so you really need to start thinking of yourself in a new way.

SIMON ADLER: Yeah, I think that's right.

Has any company ever done anything like this before?

KATE KLONICK: I mean, honestly, there's nothing that even kind of comes close. And I don't want to be grandiose about this, but there is a sense in which it feels like you're - I felt like I was watching an experiment that would, if it - even if it completely and utterly failed, would be remembered and be a lesson for however the world ends up sorting out this problem of online speech.


SIMON ADLER: And so once Facebook decided to build this court, they suddenly needed to figure out, like, what cases would go to the court, who would be on it, how would they make these decisions? And it became clear that...

BRENT HARRIS: It's, you know, not appropriate to have a single person answer these questions on behalf of society or create this institution.

SIMON ADLER: This is Brent Harris, who led Facebook's effort to build this board, this court. And as one of his first decisions, he said...

BRENT HARRIS: We need to go out and actually listen to a wide array of people about what the problems are and the challenges are that they are finding and ask them, what do they want this to be? What can we create?

SIMON ADLER: And so they held dozens of listening sessions all over the world, talking to laypeople. But the cornerstone of this process was, really, six global workshops where they invited experts to come and weigh in. Kate was one of 40 or so people that attended the U.S. workshop. It was held in the basement of the Nomad Hotel in downtown Manhattan. And when she walked in...

KATE KLONICK: It was like walking into a technologist's wedding.


KATE KLONICK: You come in. Every, like, table is decorated with succulents and bottles of Voss water and an iPad. The iPad is not for you to keep. And in fact, someone joked - one of the Facebook people joked to me, like, yeah, we used a couple-generations-old iPad to make sure no one walked away with it, you know.


SIMON ADLER: That's spectacular.

KATE KLONICK: But - and so you have an iPad. And ultimately...

SIMON ADLER: This moderator came out and tried to get the room's attention.

KATE KLONICK: And, of course, like, everyone's half-listening, and most people are on their phones and, like, whatever else.

SIMON ADLER: In part because, like, a lot of people in that room were just very skeptical of what Facebook was doing here. I mean, Kate herself remained somewhat skeptical of this court.

KATE KLONICK: This is just something Facebook can scapegoat its really crappy decisions to. That was my main skeptical point in all of this.

SIMON ADLER: That Facebook is essentially erecting what will be just a body to absorb blame. But anyhow, the moderator explained what they were up to, that they'd brought these experts here to, in essence, design this institution.

KATE KLONICK: They're like, so what do you think this should be? Like, what does it look like? And some of it was, like, an answer to questions. Some of it was things people brought up - case selection questions, board selection, who picks the board? And I would say a solid third of it was people standing up and holding forth on topics that had nothing to do with why we were there that day.

SIMON ADLER: (Laughter) Less of a question and more of a comment.

KATE KLONICK: Yeah, exactly.


KATE KLONICK: Holy cow - so many of those.

SIMON ADLER: Eventually, though, they got to the heart of the matter - like, how should a global board think about these cases that are right on the edge?

BRENT HARRIS: What we wanted to do was really put people in the shoes that Facebook is in right now in taking these decisions.

SIMON ADLER: So they told them, like, hey, you are going to play mock court. As a group, you're going to have to decide whether a piece of content should stay up on Facebook or come down.

KATE KLONICK: And so everyone was asked to open their iPad. So you were asked to like - we're going to go over the first simulation. And you'll love this, Simon.

SIMON ADLER: (Laughter).

KATE KLONICK: The first simulation that they did was the Kill All Men simulation.



SIMON ADLER: Wow. Oh, that's great.

JAD ABUMRAD: Oh, this is the thing you - the one that you focused on in the last story. I remember there was, like, a song in there.

SIMON ADLER: Yeah, it's...

JAD ABUMRAD: Am I right?

SIMON ADLER: You're totally right. We spent 10, 15 minutes dissecting this piece of content. This is...

JAD ABUMRAD: You know what? You should play this and just be like, here's what they focused on.

SIMON ADLER: OK, yeah. I think we only need to do about three minutes of it. But here it is.


UNIDENTIFIED PERSON #10: We're going to keep it moving right along. The next team to come to the stage - please give it up for Marcia Belsky.


SIMON ADLER: We did this back in 2018. It's about comedian Marcia Belsky and a photo she posted.


MARCIA BELSKY: Thank you. Yes. I get so mad. I feel like my first time to the city, I was such a carefree brat, you know. I was young, and I had these older friends, which I thought was, like, very cool. And then you just realize that they're alcoholics, you know.


SIMON ADLER: This is her up on stage. She's got dark, curly hair, was raised in Oklahoma.


SIMON ADLER: How did you decide to become a comedian?

MARCIA BELSKY: You know, it was kind of the only thing that ever clicked with me. And especially political comedy - you know, I used to watch "The Daily Show" every day.

SIMON ADLER: And inspired by this political comedy, she started this running bit that I think can be called sort of absurdist feminist comedy.


MARCIA BELSKY: Now a lot of people think that I'm, like, an angry feminist, which is weird. This guy called me a militant feminist the other day. And I'm like, OK, just because I am training a militia of women in the woods.


MARCIA BELSKY: At first, I just had this running bit online, on Facebook and Twitter.

SIMON ADLER: She was tweeting, posting jokes.

MARCIA BELSKY: You know, like, we have all the Buffalo Wild Wings surrounded - you know, things like that.

SIMON ADLER: (Laughter).

Eventually took this bit on stage, even wrote some songs.


MARCIA BELSKY: (Singing) Say, all older white men should die, but not my dad - not my dad - no, no, not my dad.


SIMON ADLER: Anyhow, so about a year into this running bit, Marcia was bored at work one day and logs on to Facebook. But instead of seeing her normal news feed, there was this message that pops up.

MARCIA BELSKY: It says, you posted something that discriminated along the lines of race, gender or ethnicity group.

SIMON ADLER: And so we've removed that post.

MARCIA BELSKY: And so I'm like, what could I possibly have posted? I really - I thought it was, like, a glitch.

SIMON ADLER: But then she clicked continue, and there highlighted was the violating post. It was a photo of hers.

What is the picture? Can you describe it?

MARCIA BELSKY: The photo is me as what can only be described as a cherub - cute little 7-year-old with big curly hair. And she's wearing this blue floral dress. Her teeth are all messed up.

SIMON ADLER: And into the photo, Marcia had edited in a speech bubble.

MARCIA BELSKY: That just says, kill all men. And so it's funny because a hit, a hit - it's funny, you know. Trust me. Whatever.

SIMON ADLER: Facebook had taken it down because it violated their hate speech policy.

MARCIA BELSKY: I was dumbfounded.


SIMON ADLER: And so back to present day, this is the scenario they put in front of these tech elites in the basement of the Nomad Hotel to see, really, how they would react.

BRENT HARRIS: Is that hate speech? What does that mean? And should that be up on Facebook or not?

SIMON ADLER: Leave it up or take it down? And so people started to discuss.

KATE KLONICK: People were like, well, this wasn't funny. And someone else was like...

SIMON ADLER: Does it matter whether it's funny or not?

KATE KLONICK: Back and forth and back and forth.

SIMON ADLER: And even so, like, should men be protected?

KATE KLONICK: Like, men are more protected than other groups.

SIMON ADLER: Eventually, though, the room pretty much came to an agreement.

BRENT HARRIS: Kill all men is clearly humor or social commentary. That should be up on Facebook, and it's inappropriate for Facebook to take that down.

JAD ABUMRAD: Yeah, I get that. I mean, I remember when we first did this, feeling like, this is a harmless joke, right? And Facebook should be a place where harmless jokes can get made because in this case, the joke only works because men are the power structure. If they weren't, it wouldn't be funny.

SIMON ADLER: Yeah, it's punching up.

JAD ABUMRAD: There you go. It's punching up, right.

SIMON ADLER: But here's where things get interesting - because as we said, they did six of these expert global workshops.


KATE KLONICK: Berlin, Singapore, New Delhi, Mexico City, Nairobi.

SIMON ADLER: And at each of them, they ran through this Kill All Men scenario.

BRENT HARRIS: We ran that case across the world. And something that's very, very striking is we got really different viewpoints about should that be up on Facebook or not.

SIMON ADLER: Like, not just at the New York workshop, but in Berlin, another Western liberal democracy. And even Singapore - folks supported leaving it up. And, you know, you'd think that folks who'd experienced more authoritarian governments and restrictions on their speech would also be for leaving it up. But it didn't go that way.

BERHAN TAYE: (Laughter) This sounds really bad.

SIMON ADLER: Go for it.

BERHAN TAYE: But I understand that - like, I understand that, of course, like, kill all men. That's the most feminist, radical joke that you can make.

SIMON ADLER: This is Berhan Taye. She works for an NGO called Access Now.

BERHAN TAYE: We defend and extend digital rights of users at risk around the world.

SIMON ADLER: And when she was shown this photo at the global workshop in Nairobi, which had attendees from all across the African continent, her thought was...

BERHAN TAYE: It's very funny. And, you know, many of us that are feminists might have said that once - you know, once, twice in our life, right? Where you're just like, no, could we - yeah. You know, and I understand that to be a joke. So I'm like, yeah, of course, there should be space for humor, and I know why satire is so important.

SIMON ADLER: But I'm sensing a but. What is it?

BERHAN TAYE: So, you know, it's - how do I put it? So for me right now - you know, it's funny, but, you know, humor is a luxury.


BERHAN TAYE: And we're not - I mean, none of us are laughing right now. So, yes, we've seen content like that that's, unfortunately, quite prevalent. And, you know, we've lived through it, so it's not something that we joke about, right?

JAD ABUMRAD: What is she - what events in the world is she thinking of when she says that?

SIMON ADLER: Well, some very recent history - and so we're going to take a little bit of a detour here to understand why Berhan would want that Kill All Men joke taken down. And along the way, we're going to see, close up, really, the life-and-death decisions this global court will have to make. We'll get to that right after a quick break.


JAD ABUMRAD: Jad. RADIOLAB. Here with Simon Adler.

SIMON ADLER: Yes, yes, yes, yes, yes.

JAD ABUMRAD: OK, before we went to break, we met digital rights activist Berhan Taye, who was opposed to leaving a joke like Kill All Men on Facebook.

SIMON ADLER: That is correct.

JAD ABUMRAD: So why is that? What was she thinking?

SIMON ADLER: Yeah. Well, I mean, it comes down to what's been going on in her home country.

BERHAN TAYE: You know, there's absurdity in Ethiopia right now.

SIMON ADLER: Ethiopia.

BERHAN TAYE: There's a lot of animosity between different groups, a lot of tension.

SIMON ADLER: And looking at just the past four or five years there, you see how these questions of who's punching up and who's punching down can get flipped on their head with the click of a mouse.

So to set things up, Ethiopia sits right on the Horn of Africa. It's the second-most populous country on the continent. And for a long time, it was considered one of the world's leading jailers of journalists.

ENDALK CHALA: Politically, the country used to be very authoritarian, very repressive.

SIMON ADLER: This is online-activist-turned-academic Endalk Chala.

ENDALK CHALA: Assistant professor at Hamline University - and yes, I can say that me and some of my colleagues were, like, the first people blogging to the Ethiopian public.

SIMON ADLER: He was actually forced into exile because of this activism. And the way he tells it...

ENDALK CHALA: Late 2015...


UNIDENTIFIED PROTESTERS: (Non-English language spoken).

UNIDENTIFIED PROTESTER: (Non-English language spoken).

UNIDENTIFIED PROTESTERS: (Non-English language spoken).

UNIDENTIFIED PROTESTER: (Non-English language spoken).


UNIDENTIFIED REPORTER #5: The worst unrest in a decade...

UNIDENTIFIED REPORTER #6: The demonstration started as a small-scale student protest.

SIMON ADLER: Student protests break out.


UNIDENTIFIED PROTESTER: (Non-English language spoken).

UNIDENTIFIED PROTESTERS: (Non-English language spoken).

SIMON ADLER: And they start spreading across the country.


UNIDENTIFIED REPORTER #7: Thousands took to Ethiopia's streets over the weekend.

SIMON ADLER: And watching this unfold from the United States, Dr. Chala noticed that at the center of these protests was this guy Jawar Mohammed.


JAWAR MOHAMMED: (Non-English language spoken).

ENDALK CHALA: Jawar himself is a very tech-savvy guy. He's articulate in English.


JAWAR MOHAMMED: If dissenting voices are allowed, there is going to be sufficient pressure on the government to break its will...

ENDALK CHALA: And he had about 1.4 or three million followers on Facebook.

SIMON ADLER: Making him as powerful as just about any news organization in Ethiopia.

Now, a couple of quick things about Jawar - No. 1, he is from the Oromo ethnicity, the largest ethnic group in the country. And we'll get more into that in a moment. But first, the other notable thing about Jawar is that at the time that these protests were getting underway, he was actually living in Minnesota. He was in exile there, thousands of miles away from the action.


UNIDENTIFIED REPORTER #8: At least 75 people killed during...

SIMON ADLER: But as these protests intensified, including clashes with the government...


JAWAR MOHAMMED: They died for the true cause.

UNIDENTIFIED REPORTER #9: Two people were killed in clashes with...

JAWAR MOHAMMED: They died to liberate their people.

SIMON ADLER: He was able to galvanize folks and direct things because of Facebook...


JAWAR MOHAMMED: Whether you live in America, Canada, Oromia (ph) or Kenya (ph), you have the obligation to take up the arms of these young men.

SIMON ADLER: ...So that, sort of amazingly, when these protests succeeded...


UNIDENTIFIED PEOPLE: (Singing in non-English language).


UNIDENTIFIED REPORTER #10: Hailemariam Desalegn has resigned amid deadly anti-government protests there.

SIMON ADLER: ...He was lionized as, well, a hero...


UNIDENTIFIED PEOPLE: (Singing in non-English language).

SIMON ADLER: ...One who'd helped usher in a new prime minister...


UNIDENTIFIED REPORTER #11: Ethiopia has a new leader - Abiy Ahmed.


UNIDENTIFIED REPORTER #12: Abiy Ahmed won 60%...

SIMON ADLER: ...And a new era in Ethiopia.


UNIDENTIFIED SINGER #1: (Singing in non-English language).


UNIDENTIFIED REPORTER #13: Since coming to power, Prime Minister Abiy Ahmed was engaged in listening to what people of the country have to say.

BERHAN TAYE: And for the first time in our entire maybe 3,000 years of history...

SIMON ADLER: Again, Berhan Taye.

BERHAN TAYE: ...We actually thought we could be a cohesive, united country.


SIMON ADLER: The government freed thousands of political prisoners and journalists...


UNIDENTIFIED REPORTER #14: The latest of sweeping measures...

SIMON ADLER: ...Invited those in exile...


UNIDENTIFIED REPORTER #15: ...Ethiopian dissidents exiled abroad...

SIMON ADLER: ...To come back home, even ended a decades-long conflict with neighboring Eritrea.


UNIDENTIFIED REPORTER #16: A promise delivered.

SIMON ADLER: I mean, these changes were so profound that Ethiopia's new prime minister, Abiy Ahmed, thanks in no small part to Jawar Mohammed, went on to win...


BERIT REISS-ANDERSEN: The Nobel committee has decided to award the Nobel Peace Prize to Ethiopian Prime Minister Abiy Ahmed Ali.

SIMON ADLER: That's right - the Nobel Peace Prize.


UNIDENTIFIED SINGER #2: (Singing in non-English language).

SIMON ADLER: So what you've got here is really the promise of Facebook realized, right? Like, man from thousands of miles away leverages Facebook's power to bring down an authoritarian government and elevate a peace-loving leader. I mean, this is David-and-Goliath-level [expletive]. And as part of all of these reforms...


JAWAR MOHAMMED: I will be traveling back to the country. We have now established our office in Addis Ababa.

SIMON ADLER: Jawar Mohammed returned to Ethiopia and was welcomed with open arms. However...


UNIDENTIFIED REPORTER #17: While Abiy Ahmed's reform ambitions have increased his popularity, analysts fear that ethnic rivalries in Ethiopia will undermine his reforms.

SIMON ADLER: The very forces that brought this change about began pulling in the opposite direction.

ENDALK CHALA: And I'm sure you're going to get a lot of reaction for this because everything is contested in Ethiopia - every historical fact, everything. You know, you see people are confused. There is information disorder in the United States. This is just like child's play when you compare it with Ethiopia. But, yes, the first violence that happened was in 2018. The first - it was gruesome pictures circulating on Facebook along with, you know, different anti-ethnic minority sentiment.

JAD ABUMRAD: But what were the ethnic tensions, and what was being said?

SIMON ADLER: Yeah. So how complicated to get - or how in the weeds to get here?

JAD ABUMRAD: Get complicated.

SIMON ADLER: Well, OK. So as I mentioned, Jawar is part of the Oromo ethnicity, the largest ethnicity in the country. And while the Oromo are the largest, they've also long felt politically and culturally marginalized. And this feeling of marginalization, this resentment, this was really at the heart of the revolutionary protests that Jawar had helped lead.


UNIDENTIFIED REPORTER #18: Jawar, I'm just curious. Are you Oromo first or Ethiopian first?

JAWAR MOHAMMED: I am an Oromo first.

SIMON ADLER: I mean, many of his posts pointed directly at it.

ENDALK CHALA: He would say Oromo are oppressed and how Oromos were marginalized. And that is absolutely OK with me because there is some historical truth to it. But he's a guy, like, who heats up the temperature, ramp up some emotions.


JAWAR MOHAMMED: As I said, we are forced to fight back, to coalesce together, to come together and fight back.

SIMON ADLER: But now, even with the old government out of power and a new Oromo prime minister in power, Jawar Mohammed did not let up. He kept stoking this resentment.


JAWAR MOHAMMED: To be honest with you, I think there is a risk of - not civil war, but catastrophic communal violence across the country. I think people have to be very careful from that one.

SIMON ADLER: And with this inversion of power, statements he was making during the protests sounded very different in 2018. Like, even just the line...


JAWAR MOHAMMED: This is our land. This is our homeland.

SIMON ADLER: ...Went from being about Ethiopians getting a corrupt government out of power to Oromos getting minorities out of their territory. And quickly, the language began to escalate.

ENDALK CHALA: He will ramp up with, like, protect your land. Minorities - they are aliens. They are going to loot you. You know, they are evil.

SIMON ADLER: Until eventually...

ENDALK CHALA: October 2019.


UNIDENTIFIED REPORTER #19: The riots began on the 23rd of October 2019 and lasted for several days. A mob took to the streets, burned cars and killed several people they thought were their opponents. Eighty-six people died across the country. What caused this horrific outbreak of violence? The Facebook post by opposition leader Jawar Mohammed.

SIMON ADLER: One evening, from his home in Addis Ababa, Jawar Mohammed posted an unsupported claim.

ENDALK CHALA: Insinuating that he is going to be killed by minorities.


UNIDENTIFIED REPORTER #19: In his post, he called on his supporters for help. In response, some of his followers called for war.

SIMON ADLER: And while Jawar denies that he was intentionally inciting violence, hate flooded onto Facebook.

BERHAN TAYE: Content calling for the killing of all minority groups...

SIMON ADLER: Again, Berhan Taye.

BERHAN TAYE: ...Content actually telling people, like, if your neighbor is from a different ethnic group, go and kill them. Literally, that was what we were seeing.

ENDALK CHALA: And then everyone started to take things on their own hand...



ENDALK CHALA: ...And, you know, kill minorities.


UNIDENTIFIED PEOPLE: (Chanting in non-English language)

BERHAN TAYE: Everything that could go wrong went wrong.


UNIDENTIFIED PEOPLE: (Chanting in non-English language)

ENDALK CHALA: Minorities were brutally murdered, like, brutally - brutal, brutal, gruesome violence.


UNIDENTIFIED REPORTER #20: Minority communities being brutally targeted by the Oromo, the country's largest ethnic group.

UNIDENTIFIED PERSON #11: (Through interpreter) When they tried to cut my granddaughter's breast, I took out mine, and I begged them to cut mine instead. Then they stopped, but they took her father instead.

SIMON ADLER: And since then, the government just has not been able to get back to any sort of peace.


UNIDENTIFIED REPORTER #21: More than 800,000 people have been displaced...

UNIDENTIFIED REPORTER #22: At least five people were shot dead by police on Monday.

UNIDENTIFIED REPORTER #23: At least 50 people...

UNIDENTIFIED REPORTER #24: The fatal shooting of Hachalu...

SIMON ADLER: And so every couple weeks...


UNIDENTIFIED REPORTER #25: Dozens have been killed.

SIMON ADLER: ...There's just another outbreak...


UNIDENTIFIED REPORTER #26: Gunshots continue...

SIMON ADLER: ...Of this sort of violence.

ENDALK CHALA: Facebook brought this change, this political change. And that is bullshit for me. I'm sorry for my phrase, but that is what happened.


SIMON ADLER: And so back in Nairobi, in an air-conditioned conference room where this Supreme Court of Facebook training session was underway, as Berhan was sitting there, staring down at this iPad with a photo on it that says kill all men, she's like, yeah, this has to come down.

BERHAN TAYE: You know, I'm not in a space to, you know, even give space to having a conversation about content governance and moderation when it's about humor.

SIMON ADLER: And Berhan was not alone in this.

BRENT HARRIS: Many people felt that is an incitement to violence. That could result in actual harm.

SIMON ADLER: Again, Facebook's Brent Harris.

BRENT HARRIS: And that is something that should not be on Facebook.

BERHAN TAYE: And so I think around 4:00 PM, to be honest with you, I left.

SIMON ADLER: She walked out of the session.

BERHAN TAYE: Because I was just like, no. This does not address the issues that we're talking about today.

JAD ABUMRAD: Damn, what do we do? Because it really is a we. What do we do if the very thing that people in New York, in an ironic way, say must stay up is the very thing that makes her walk out because it's just utterly privileged and completely ignorant of the real-life consequences of hate speech? [Expletive]. That's - wow.

SIMON ADLER: And keep in mind: these are just mock trials - training sessions, really. Like, they ran into this while they were still trying to figure out how to answer these sorts of questions. And now, we will get to some of their actual rulings and this Supreme Court itself.


SIMON ADLER: But first, like, I think that the tension we're seeing here goes deeper than this one example. I mean, at the core of Facebook is this very American understanding of freedom of expression. And you hear this even in the way Facebook executives just talk about the company.


MARK ZUCKERBERG: And more people being able to share their experiences, that's how we make progress together.

SIMON ADLER: You know, how many times has Mark Zuckerberg said some version of this?


MARK ZUCKERBERG: The most progress in our lives actually comes from individuals having more of a voice.

SIMON ADLER: But when you talk to people from different parts of the world, like, there's not universal agreement on this.

ENDALK CHALA: I will definitely tell you that I found myself - oh, my goodness. I was not as liberal as I thought.

SIMON ADLER: Again, professor Endalk Chala.

ENDALK CHALA: In Ethiopia, Facebook came and overwhelmed us with information. We didn't have a well-established fact-checking system. We didn't have journalism institutions.

SIMON ADLER: We, Ethiopia, have only imported Facebook. We haven't imported the rest of the institutions and democratic foundations...


SIMON ADLER: ...The economic security around which such untrammeled freedom of expression is beneficial.

ENDALK CHALA: And so, well, 10 years ago, eight years ago. Yeah, I saw that in freedom of expression and technology will help us, liberate us and get us out of authoritarian system. Now I have seen people who get angry and they will take matters on their own hands. That's what happened. So it's about, like, a choice between coexistence or saying whatever you want to say. It comes down to that for me. And as I have seen the violence that those speech has made, I think I would prefer coexistence.

SIMON ADLER: And to put that opinion in perspective here...

BERHAN TAYE: Eighty percent of Facebook users are not American.

JAD ABUMRAD: Eight zero?




BERHAN TAYE: And content moderation is a very difficult task, one that's being done by people that have no freaking idea about our way of life, you know? And unfortunately, it's us that are being affected over and over again with these things, than - you know, than you guys.

JAD ABUMRAD: I mean, is there anyone openly advocating for just abolishing Facebook?

SIMON ADLER: (Laughter) Yes, but I don't think anybody's taking that particularly seriously.

JAD ABUMRAD: But I mean, come on. Like, at a certain point, if a private company becomes so potentially toxic to the very basic functioning of a decent democracy - I don't know, man. I don't know. Unless you can somehow break Facebook into a Balkanized set of Internets...


JAD ABUMRAD: ...Where each one has its own separate rules - but I doubt that's even possible.

SIMON ADLER: Well, engineering-wise, it is possible. Facebook, in a few rare instances, already does employ some version of this. I spoke to Monika Bickert, who is Facebook's head of global policy, and she explained that there are certain slurs that are outlawed in specific regions but allowed everywhere else. And similarly, they do have to abide by local laws. But she did go on to say that, quote, "if you want a borderless community, you have to have global policies" and that she doesn't expect that to change.

JAD ABUMRAD: No. No. That's crazy. You're going to have to be so astute and so aware of regional context and regional history. I just don't think that's possible. So actually, now that I'm saying it out loud, I think they should be outlawed. I don't know. I've suddenly talked myself into a very extreme position, but it suddenly seems like, what other solution is there?

SIMON ADLER: Well, the solution Facebook has landed on is this Supreme Court. After those global workshops, they took all that feedback and created this independent structure. It's going to have 40 members. It currently has 19. The members represent every continent other than Antarctica, and they're from just a wide array of backgrounds. Some are lawyers. Others are free speech scholars, activists, journalists, even a former prime minister of Denmark.

And among the first decisions they're going to have to make is whether or not former President Trump will be banned from the platform indefinitely. Facebook has currently banned him, but it will be up to the board to rule on whether that ban should remain or be lifted. And, I mean, this decision won't just impact Trump. It could very well have implications for how Facebook will deal with political figures not just in the United States, but in places like Ethiopia.

Hello, hello.

MAINA KIAI: Hey, Simon.

SIMON ADLER: Maina, very nice to meet you virtually here.

MAINA KIAI: How are you doing?

SIMON ADLER: I'm good. How are you, sir?

MAINA KIAI: All right.

SIMON ADLER: And while making the right decisions for the entire planet seems in many ways impossible, when I sat down and talked to several members of this court, of this board, I have to say, they did make me a little bit hopeful.

Thanks so much for being willing to do this. I hope we can have a little bit of fun here today.

MAINA KIAI: I hope so. I was - yes. I think we should make as much controversy as possible.


SIMON ADLER: This is Maina Kiai. He's a member of the board, former special rapporteur to the United Nations. And he's basically spent his entire life fighting for human rights. And what struck me about him right off the bat is just how un-Facebooky (ph) he is.

MAINA KIAI: I haven't used Facebook or Twitter myself.


MAINA KIAI: I'm old-school. I try to keep my private life private.

SIMON ADLER: Why the hell were you chosen to be on the oversight board of a product that you don't even use?



MAINA KIAI: Because there were all kinds of people being chosen for it. I mean, that's the beauty of it - isn't it? - that we have all kinds of people on the board, all kinds.

SIMON ADLER: And he sees the solution here in the incremental progress we've made in the past.

MAINA KIAI: You know, look; I see this work as human rights work. I mean, I have gone through in my life - through different things around hate speech, using radio in, first of all, Rwanda, then in Kenya as well. The media can be abused. And then, how do you rein them in? How to mitigate them? And how do you mitigate them in a way that doesn't abuse human rights? So the tools and the problems is basically the same. The difference is that media, mainstream media before social media, has been regulated over time - decades and years - but then informed and guided how the information is put out.

SIMON ADLER: He said, just look at the five-second delay that live television runs on now.

MAINA KIAI: I'm sure when it started with live television and live radio, it was on the go. So I think that's the questions we have to now deal with Facebook. But I think - I mean, I have confidence that there is enough experience in the world that's dealt with these phenomenons.

SIMON ADLER: And this feeling resonates with most of the people I spoke to at Facebook.

BRENT HARRIS: I mean, I spent about 15 years working on climate before I came to Facebook. And I think the issues here are deeply analogous.

SIMON ADLER: Again, Brent Harris.

BRENT HARRIS: They are human-generated. There are major regulatory actions that are needed. There's a serious responsibility by industry to step up and think about the responsibility that they hold. And the solutions that will come forward as we start to figure out how to address these types of challenges will inherently be incremental. And at times, I worry we will kill off incremental good progress that start to address these issues because they don't solve everything.


BRENT HARRIS: You know, is the Paris agreement enough? No. Is it a lot better than what we had before? Yes. Is the Montreal Protocol enough? No. Is it a (laughter) substantial step forward against this challenge? Yes. And building this board is only one step in a wide array of many other steps that need to be taken on.

SIMON ADLER: It sounds to me that what you're saying is this is the first piece in this global governance body Facebook is imagining.

BRENT HARRIS: Well, if it really works and people end up believing in it and thinking it's a step forward, then, yeah, further steps can be taken.


MARK ZUCKERBERG: You know, nothing's ever perfect. There are always going to be issues. People will criticize the specific people who are on it. They'll criticize the process.

SIMON ADLER: And I mean, when Kate Klonick, who turned us on to this story to begin with - when she interviewed Mark Zuckerberg, he said as much.


MARK ZUCKERBERG: It's not like the oversight board is the end. It is one institution that needs to get built as part of the eventual community governance that I think we will end up having 10 years from now or however long it takes to build all of this out. It just felt like a kind of a concrete step that we could go take.

SIMON ADLER: And what they're thinking of in terms of next steps...

BRENT HARRIS: One would be something like regional circuits or, you know, a level of adjudication that are more regional or more localized that sits below this board as a means of taking these decisions.

SIMON ADLER: You mean like seven continental courts or, I don't know, 52 subregional courts that feed up to the one Supreme Court?

BRENT HARRIS: Yeah, that's right.


SIMON ADLER: And so what we're watching spring up here is not just a solution to what is truly one of the problems of our moment, but also this wholly new way to organize ourselves and sort of adjudicate our behavior.

MAINA KIAI: Look. Look. What we're trying to do is an experiment. I cannot tell you it will work, but I can tell you we'll try to make it work as much as possible. And when we make mistakes, I am - absolutely - I have got no doubt in my mind that being the humans we are, not yet evolved into saints and angels, we will make mistakes. That's part of the process.


SIMON ADLER: The oversight board started officially hearing cases in October. They've already ruled on matters ranging from whether nude photos advocating breast cancer awareness should stand to whether a post about churches in Azerbaijan constitutes hate speech. The board will render their decision on President Trump in the next few months.


JAD ABUMRAD: This story was produced and reported by Simon Adler, with original music throughout by Simon. Is this original music by Simon that we're hearing right now, Simon?

SIMON ADLER: It is indeed.

JAD ABUMRAD: All right. As we said at the top, this episode was made in collaboration with The New Yorker Radio Hour and New Yorker magazine. To hear more about the intricacies of how this court came into being, the rulings they've already made and what's coming up on their docket, check out David Remnick and reporter Kate Klonick's conversation in The New Yorker Radio Hour's podcast feed or head over to newyorkerradiohour.org. And on that note, a huge thank you to Kate Klonick, whose tireless coverage of Facebook and their oversight board made this story possible.

We'd also like to give special thanks to Julie Owono, Tim Wu, Noah Feldman, Andrew Marantz, Monika Bickert, John Taylor, Jeff Gelman and all the volunteers who spoke with us from the Network Against Hate Speech.


SIMON ADLER: Beautiful, Jad. That's great.

JAD ABUMRAD: All right.


CLAIRE SEBREE: Hi. This is Claire Sebree (ph) calling from Lafayette, Calif. RADIOLAB was created by Jad Abumrad and is edited by Soren Wheeler. Lulu Miller and Latif Nasser are our co-hosts. Dylan Keefe is our director of sound design. Suzie Lechtenberg is our executive producer.

Our staff includes Simon Adler, Jeremy Bloom, Becca Bressler, Rachael Cusick, David Gebel, Matt Kielty, Annie McEwen, Sarah Qari, Arianne Wack, Pat Walters and Molly Webster with help from Shima Oliaee, Sarah Sandbach and Jonny Moens. Our fact-checkers are Diane Kelly and Emily Krieger.

Copyright © 2020 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.


New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.