
Feb 9, 2015
Transcript
[RADIOLAB INTRO]
JAD ABUMRAD: All right. Hey, I'm Jad Abumrad.
ROBERT KRULWICH: I'm Robert Krulwich.
JAD: This is Radiolab, the podcast. So here's a story we've been following for a while.
ROBERT: Yes.
JAD: Comes from a friend of mine, Andrew Zolli, who is a great thinker and writer. He wrote a book called Resilience: Why Things Bounce Back. And he's a guy who thinks a lot about technology.
ANDREW ZOLLI: I have been interested in a long time—for a long time, in the relationship between technology and emotion. And because—well, I've thrown more than one cell phone to the ground. [laughs]
JAD: Andrew and I were having breakfast one day, and he pitched me on this idea of doing a story about Facebook.
ROBERT: Yeah, I remember.
JAD: I'm not a huge ...
ROBERT: No.
JAD: ... believer in doing stories about Facebook. But this story was wickedly interesting.
ROBERT: Yeah.
JAD: And profound in its way. So he and I have been following it for a couple years, up and down through this roller coaster of events. It really begins in 2011.
ANDREW ZOLLI: Well, let me back up for a minute. One of the challenges in talking about Facebook is just the scale of the thing. So, you know, there's—there's 1.3 billion people on Facebook as of March 2014.
JAD: Wow.
ANDREW ZOLLI: Those are active monthly users. There's a billion people who access the site through—through mobile devices. Just to put that in perspective, there's more Facebook users than there are Catholics.
JAD: That can't be true.
ANDREW ZOLLI: On Earth. Yeah.
JAD: No!
ANDREW ZOLLI: Yeah.
JAD: It turns out it is true. But they're neck and neck. Anyhow, though, the overall point is that when you have one out of every seven people on the planet in the same space trying to connect across time and geography ...
ANDREW ZOLLI: You are bound to create problems sometimes.
[NEWS CLIP: Facebook making headlines again tonight. The issue this time ...]
JAD: Before we go there, we should introduce you to the guy in our story who is the problem-solver.
ARTURO BEJAR: My name is Arturo Bejar, and I'm a director of engineering at Facebook.
JAD: The story begins Christmas, 2011.
ANDREW ZOLLI: People are doing what they do every holiday season, they're just—they're getting back together with their families, and they're going to family parties and they're taking lots and lots of pictures. And they're all uploading them to Facebook.
ARTURO BEJAR: And at the time, the number of photos that were getting uploaded was going pretty crazy.
JAD: In fact, in just those few days between Christmas and New Year's ...
ANDREW ZOLLI: There are more images uploaded to Facebook ...
ARTURO BEJAR: Than there were in the entirety of Flickr.
JAD: Wait, you're saying more images were uploaded in a week to Facebook than all of Flickr all time?
ANDREW ZOLLI: Yeah.
ROBERT: Whoa.
JAD: Which created a situation.
ARTURO BEJAR: The number of photos was going up, and along with the number of photos going up, the number of reports was going up.
ANDREW ZOLLI: What he means by reports is this ...
JAD: Back in 2011 ...
ANDREW ZOLLI: ... if you saw something on Facebook that really upset you, you could click a button to report it.
JAD: You could tell Facebook to take it down. Which from their perspective is a really important mechanism, because if you're Facebook, you don't want certain kinds of content on your site.
ARTURO BEJAR: You don't want nudity, you don't want, like, drug use, hate speech, things like that.
ANDREW ZOLLI: So a day or so after Christmas ...
JAD: Thereabouts.
ANDREW ZOLLI: Facebook engineers come back to work, and they find waiting for them literally millions of photo reports.
ARTURO BEJAR: Yes. The number of people that would be necessary to review everything that was coming in, it kind of boggled the mind.
JAD: How many people would you have needed?
ARTURO BEJAR: I think at the time we were looking at it, which was two years ago—and again, all this has grown much since then—we're looking at, like, thousands.
JAD: Thousands. Like, some giant facility in Nevada filled with nothing but humans looking at Christmas porn.
ROBERT: [laughs]
JAD: We were actually joking about this, but we found out later there actually are thousands of people across the world who do this for internet companies all day long—which clearly warrants its own show. But for our purposes, just know that when a photo is reported, a human being has to look at it.
ARTURO BEJAR: Exactly right. Because there needs to be a—a judgment on the image, and humans are the best at that.
ANDREW ZOLLI: So Arturo decided before we do anything, let's just figure out what we're dealing with.
ARTURO BEJAR: And so we sat down with a team of people, and we started going through the photos that people were reporting.
JAD: And what they found was that about 97 percent of these million or so photo reports were drastically miscategorized. [laughs] They were seeing moms holding little babies ...
ARTURO BEJAR: Reported for harassment.
JAD: Pictures of families in matching Christmas sweaters ...
ARTURO BEJAR: Reported for nudity. Pictures of puppies reported for hate speech.
JAD: Puppies reported as hate speech?
ARTURO BEJAR: Yes. And we're like, "What's going on?" Right? Hmm.
JAD: So they decide, let's investigate.
ANDREW ZOLLI: Okay, so step one for Facebook: just ask a few of these people.
ARTURO BEJAR: Why don't you like this photo?
ANDREW ZOLLI: Why did you report this?
JAD: Responses come back, and the first thing they realize is that almost always the person complaining about the image ...
ANDREW ZOLLI: Was in the image they were complaining about.
JAD: And they just hate the picture. Like, maybe they were doing a goofy dance, someone snapped a photo and then why did you post that?
ANDREW ZOLLI: Take it down.
JAD: Maybe they were at a party ...
ANDREW ZOLLI: They got a little too drunk, they hooked up with their ex, somebody took a picture, and that person says "Oh, you know, that's a one-time thing. That's never happening again. Take it down."
JAD: Arturo said there were definitely a lot of reports from people who used to be couples.
ARTURO BEJAR: And then they broke up, and then they're asking to take the photos down.
JAD: And the puppy, what would—what would be the reason for that?
ARTURO BEJAR: Oh, because it was maybe a shared puppy.
JAD: You know, maybe it's your ex-wife's puppy. You see it, makes you sad.
ANDREW ZOLLI: Take it down.
ARTURO BEJAR: So once we began investigating, you find that there's a lot of these relationship things that happen that are, like, really complicated.
ANDREW ZOLLI: You're talking about stuff that's the kind of natural detritus of human dramas.
JAD: And the only reason that the person reporting it flagged it as, like, hate speech is because that was one of the only options.
ANDREW ZOLLI: They were just picking because they needed to get to the next screen to submit the report.
ARTURO BEJAR: So we added a step.
ANDREW ZOLLI: Arturo and his team set it up so that when people were choosing that option ...
ARTURO BEJAR: I want this photo to be removed from Facebook.
ANDREW ZOLLI: ... some of them would see a little box on the screen ...
ARTURO BEJAR: That said, "How does the photo make you feel?"
ANDREW ZOLLI: And the box gave several choices.
ARTURO BEJAR: The options were "Embarrassing ..."
ANDREW ZOLLI: "Embarrassing."
ARTURO BEJAR: "Saddening ..."
ANDREW ZOLLI: "Upsetting."
ARTURO BEJAR: "Bad photo." And then we always put in an "Other."
ANDREW ZOLLI: Where you could write in whatever you wanted about the image.
ARTURO BEJAR: And it worked incredibly well. I mean, like, 50 percent of people would select an emotion.
JAD: Like, for instance, embarrassing.
ARTURO BEJAR: And then 34 percent of people would select "Other." And we read those, we sit down and we're reading the "Other," and what was the most frequent thing that people were typing into "Other"? It was "It's embarrassing."
JAD: It's embarr—but you had embarrassing on the list.
ARTURO BEJAR: I know.
JAD: That's weird!
ARTURO BEJAR: I know.
JAD: It's so odd.
ROBERT: Huh.
JAD: Arturo was like, "Okay, maybe we should just put 'It's' in front of the choices?"
ANDREW ZOLLI: As in, "Please describe this piece of content. It's embarrassing."
ARTURO BEJAR: "It's a bad photo of me. It makes me sad."
JAD: Et cetera.
ANDREW ZOLLI: And when they wrote out the choice that way ...
JAD: With that extra word ...
ARTURO BEJAR: We went from 50 percent of people selecting an emotion to 78 percent of people selecting an emotion.
JAD: In other words ...
ANDREW ZOLLI: The word "it's" all by itself boosted the response by 28 percentage points.
ARTURO BEJAR: From 50 to 78.
JAD: And in Facebook land that means thousands and thousands of people.
ROBERT: I mean, just to slow down for a second, I'm trying to think of what could it—what could that be? "It's."
JAD: It's.
ROBERT: Do people like full sentences, or ...?
JAD: Here's the thinking: it's always good to mirror the way people talk.
ROBERT: Right.
JAD: Arturo's idea, though—which I find kind of interesting—is that when you just say "embarrassing" and there's no subject, it's silently implied that you are embarrassing. But if you say "it's embarrassing," well then that shifts the sort of emotional energy to this ...
ROBERT: Photograph.
JAD: ... thing. And so then it's less hot, and it's easier to deal with.
ROBERT: Oh, how interesting. "That thing is embarrassing. I'm fine, it's embarrassing."
JAD: "It is responsible ...
ROBERT: Yes.
JAD: ... not me."
ROBERT: Good for Arturo. That's a ...
JAD: It's interesting, right?
ROBERT: ... a subtle thought.
JAD: It's very subtle. But it still doesn't solve their basic problem, because even if Facebook now knows why the person flagged the photo, that it was embarrassing and not actually hate speech, they still can't take it down.
ANDREW ZOLLI: I mean, there's nothing in the policy, the terms of service, that says you can't put up embarrassing photos.
JAD: And in fact, if they took it down, they'd be violating the rights of the person who posted it.
ARTURO BEJAR: Like, there's nothing we can do, I'm sorry.
ROBERT: Oh, so they'd actually fence themselves in a little bit.
JAD: Yeah.
ROBERT: Huh. If it were me I'd always put it in "Other."
JAD: I would just be like, "Just go deal with it yourself."
ROBERT: [laughs]
JAD: That's what I would say. Talk to the person. No, honestly, that's the solution. I—he wouldn't put it that way, but—what he needed to have happen was for the person who posted the picture and the person who was pissed about it ...
ARTURO BEJAR: To talk to each other.
ANDREW ZOLLI: To work it out themselves.
JAD: So Arturo and his team made a tweak, where if you said this photo was embarrassing or whatever, a new screen would pop up and it would ask ...
ARTURO BEJAR: Do you want your friend to take the photo down?
JAD: And if you said, "Yes, I would like my stupid friend to take the photo down."
ARTURO BEJAR: We'd put up an empty message box.
ANDREW ZOLLI: Just an empty box that said, "We think it's a good idea for you to tell the person who upset you that they upset you."
ARTURO BEJAR: And only 20 percent of people would type something in and send that message.
ANDREW ZOLLI: They just didn't do it. They just said, "I'd rather you deal with this."
JAD: So Arturo and his team were like, "Okay let's take it one step further." When that message box popped up ...
ARTURO BEJAR: We gave people a default message that we—that we crafted.
ANDREW ZOLLI: To start that conversation.
ARTURO BEJAR: Just get the conversation going. And it's kind of funny. The first version of the message that we did was like ...
ANDREW ZOLLI: "Hey, I didn't like this photo. Take it down."
ROBERT: "Hey, I don't like that photo." That's a little aggressive.
JAD: It is. But when they started presenting people with a message box with that sentence pre-written in ...
ANDREW ZOLLI: Almost immediately ...
ARTURO BEJAR: We went from 20 percent of people sending a message to 50 percent of people sending a message.
ROBERT: Really?
ARTURO BEJAR: It was surprising to all of us. Like, we weren't expecting to see that big of a shift.
ROBERT: Okay, so this means that people just don't want to write. They'll sign up for pretty much anything.
JAD: No, not necessarily.
ROBERT: No?
JAD: Maybe it's just that it's so easy to shirk the responsibility of confronting another person that you need every little stupid nudge you can get. [laughs]
ROBERT: [laughs] I see, okay.
JAD: That's how I see it.
ROBERT: Okay.
JAD: So they put up this pre-written message, it seems to really have an effect. So they're like, "Okay. If that worked so well, why don't we try some different wordings?" Instead of ...
ANDREW ZOLLI: "Hey, I didn't like this photo. Take it down."
JAD: Why don't we try ...
ANDREW ZOLLI: Hey...
JAD: ... Robert.
ANDREW ZOLLI: "I didn't like this photo. Take it down.
JAD: Just putting in your name ...
ANDREW ZOLLI: Works about seven percent better than leaving it out.
ROBERT: Meaning what?
ANDREW ZOLLI: It means that you're seven percent more likely either to get the person to do what you asked them to do ...
JAD: Take down the photo.
ANDREW ZOLLI: ... or to start a conversation about how to resolve your feelings about it.
ROBERT: Oh, we're now measuring the effectiveness of the message.
JAD: Yeah.
ROBERT: So if I'm objecting, will the other party pull it off the computer?
JAD: Pull it off, or just talk to you about it.
ROBERT: Okay.
JAD: They also tried variations like, "Hey, Robert. Would you please take it down?" Throwing in the word "Please." Or, "Would you mind taking it down?"
ARTURO BEJAR: And it turns out that "Would you please" performs four percent better than "Would you mind."
JAD: They're not totally sure why.
ROBERT: Huh.
JAD: But they're—they tried dozens of phrases like, "Would you please mind," "Would you mind," "I'm sorry to bring this up, but would you please take it down?" "I'm sorry to bring this up, but would you mind taking it down?" And at a certain point, Andrew and I got ...
ANDREW ZOLLI: We're here to see Arturo.
JAD: We just wanted to see this whole process they're going through up close. So we took a trip out to Facebook headquarters, Menlo Park, California. This is about a year ago.
ARTURO BEJAR: Have you been here before?
JAD: No, I have not.
JAD: So it was before the hubbub. We met up with Arturo, who sort of walked us through the campus.
ARTURO BEJAR: Yeah, this is like a hammock. This is a little like, terrace hangout.
JAD: It's one of these sort of like, socialist utopic Silicon Valley campuses where people are like, in hammocks and there—there's volleyball happening.
ARTURO BEJAR: We actually had baby foxes here.
JAD: Really?
JAD: They had foxes running around at one point. So we were there on a Friday because every Friday afternoon Arturo assembles this really big group ...
ARTURO BEJAR: Welcome to the meeting.
JAD: ... to review all the data. You got about 15 people crammed into a conference room, like technical folks ...
MUSTAPHA: Mustapha. Software engineering at Facebook.
DAN FERRELL: Dan Ferrell. I'm a data scientist.
PAUL: Paul. I'm also an engineer.
JAD: A lot of these guys called themselves "Trust engineers." And every Friday, the trust engineers are joined by a bunch of outside scientists.
DACHER KELTNER: Dacher Keltner, professor of psychology, UC-Berkeley.
MATT KILLINGSWORTH: Matt Killingsworth, I study the causes and nature of human happiness.
EMILIANA SIMON-THOMAS: Emiliana Simon-Thomas, and my background is neuroscience.
JAD: This is the meeting where the team was reviewing all the data about these phrases, and so everybody was looking at a giant graph projected on the wall.
MAN: It's kinda supporting your—your slightly U-shaped curve there, in that the—especially in the deletion numbers, the "Hey, I don't like this photo. Take it down?" and the "Hey, I don't like this photo. Would you please take it down?" are kinda the winners here.
WOMAN: It's kind of interesting that you see the person that's receiving a more direct message is higher, 11 percent versus four percent.
JAD: One of the things they noticed is that any time they used the word "Sorry" in a phrase, like, "Hey, Robert. Sorry to bring this up, but would you please take it down?"
ARTURO BEJAR: Turns out the "I'm sorry" doesn't actually help, it makes the numbers go down.
ROBERT: Really?
WOMAN: Seven and nine are the low—some of the low points, and those are the ones that say "Sorry."
JAD: So, like, just don't apologize. Just don't apologize because, like, it shifts the responsibility back to you, I guess.
ROBERT: No, it doesn't. It's just—it's just ...
JAD: No, man, it's like—it's a linguistic psychology subtle thing.
ROBERT: You're making that up!
JAD: I am, kind of!
ROBERT: Yeah.
JAD: But one of the things that really struck me at this meeting, on a different subject, is that the scientists in the room, as they were looking at the graph, taking in the numbers, a lot of them had this look on their faces of, like, holy [bleep].
EMILIANA SIMON-THOMAS: I'm just stunned and humbled at the numbers that we generally get in—in these studies.
JAD: That's Emiliana Simon-Thomas from Berkeley.
EMILIANA SIMON-THOMAS: My background is in neuroscience, and I'm used to studies where we look at 20 people and that's sufficient to say something general about how brains work.
ROB BOYLE: Like, in general, at Facebook, like, people would scoff at sample sizes that small.
JAD: That's Rob Boyle, who's a project manager at Facebook.
ROB BOYLE: The magnitudes that we're used to working with are in the hundreds of thousands to millions.
JAD: And it's kind of an interesting moment because there's been a lot of criticism recently—especially in social science—about the sample sizes. How they're too small, and how there's—they're too often filled with white undergraduate college kids, and how can you generalize from that? So you could tell that some of the scientists in the room, like for example Dacher Keltner, he's a psychologist at UC-Berkeley, they were like, "Oh, my God, look at what we can do now. We can get all these different people."
DACHER KELTNER: Of different class backgrounds, different countries.
JAD: To him, this kind of work with Facebook, this could be the future of social science right here.
DACHER KELTNER: There has never been a human community like this in human history.
JAD: And somewhere in the middle of all the excitement about the data and the speed at which they can now test things ...
DACHER KELTNER: The bottleneck is no longer how fast we can test how things work, it's coming up with the right things to test.
JAD: Andrew threw out a question.
ANDREW ZOLLI: What is the statistical likelihood that I have been a guinea pig in one of your experiments?
MAN: I believe 100 percent. But ...
[participants laugh]
DAN FERRELL: If we look at the data, any given person ...
JAD: That's Dan Ferrell, data scientist.
DAN FERRELL: And when we look at the data, any given person is probably currently involved in, what, 10 different experiments? And they've been exposed to—to 10 different experimental things.
MAN: Yeah.
ANDREW ZOLLI: That kind of blew me back a little bit. I was, like, amaze—I've been a research subject, and I had no idea.
JAD: Coming up, everybody gets the idea. And the lab rats revolt. Stay with us.
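[The wording experiments described in the first half come down to simple A/B comparisons of response rates. Below is a rough, hypothetical sketch of that arithmetic. It is not Facebook's actual code: the sample sizes are invented, and only the quoted rates (50 percent for "Embarrassing," 78 percent for "It's embarrassing") come from the interview.]

# Hypothetical illustration of the A/B lift arithmetic described in the story.
# Not Facebook's tooling; the counts are invented, only the 50% and 78% rates are quoted above.
def lift(control_responses, control_total, variant_responses, variant_total):
    """Return control rate, variant rate, percentage-point lift, and relative lift."""
    control_rate = control_responses / control_total
    variant_rate = variant_responses / variant_total
    point_lift = (variant_rate - control_rate) * 100               # 50% -> 78% is +28 points
    relative_lift = (variant_rate - control_rate) / control_rate   # i.e. a 56% relative increase
    return control_rate, variant_rate, point_lift, relative_lift

# "Embarrassing" vs. "It's embarrassing", with an invented 10,000 users per variant:
print(lift(5_000, 10_000, 7_800, 10_000))  # -> (0.5, 0.78, ~28.0, ~0.56)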
JAD: This is Radiolab. And we'll pick up the story with Andrew Zolli and I sitting in a meeting at Facebook headquarters—this was about a year and a half ago. We had just learned that at any given moment, any given Facebook user is part of 10 experiments at once, without really knowing it. And sitting there in that meeting, you know, this was a while ago, we both were like, "Did we just hear that correctly?"
ANDREW ZOLLI: That kind of blew me back a little bit. I was, like, amaze—I've been a research subject, and I had no idea. And I had that moment of discovery on a Friday, and literally the next day, Saturday ...
[NEWS CLIP: This is scary.]
ANDREW ZOLLI: ... the world had that experience.
[NEWS CLIP: Facebook using you and me as lab rats for a Facebook experiment on emotions.]
JAD: Barely a day after we'd gotten off the plane from Facebook headquarters ...
ANDREW ZOLLI: The kerfuffle occurred.
[NEWS CLIP: Facebook exposed for using us as lab rats.]
[NEWS CLIP: As lab rats.]
[NEWS CLIP: Lab rats, shall we say?]
[NEWS CLIP: Facebook messing with your emotions.]
JAD: You might remember the story because for a hot second it was everywhere.
[NEWS CLIP: Facebook altered the amount of positive and negative ...]
JAD: It was all over Facebook. The story was, an academic paper had come out showing that, together with some scientists, the company ...
[NEWS CLIP: Had intentionally manipulated user news feeds to study a person's emotional response.]
[NEWS CLIP: Seriously, they wanted to see how emotions spread on social media.]
JAD: They basically tinkered with the news feeds of about 700,000 people.
[NEWS CLIP: 700,000 users to test how they'd react if they saw more positive versus negative posts, and vice versa.]
JAD: And they found an effect, that when people saw more positive stuff in their newsfeeds, they would post more positive things themselves and vice versa. It was a tiny effect, tiny effect, but the results weren't really the story. The real story was that Facebook was messing with us.
[NEWS CLIP: Gives you pause, and scares me when you think that they were just doing an experiment to manipulate how people were feeling and how they then reacted on Facebook.]
ANDREW ZOLLI: People went apoplectic.
[NEWS CLIP: It has this big brother element to it that I think people are going to be very uncomfortable with.]
ANDREW ZOLLI: And some people went so far as to argue ...
[NEWS CLIP: I wonder if Facebook killed anyone with their emotional manipulation stunt.]
ANDREW ZOLLI: ... if a person had a psychological, psychiatric disorder, manipulating their social world could cause them real harm.
[NEWS CLIP: Make sure you read those terms and conditions, my friends, that's—that's the big takeaway.]
KATE CRAWFORD: What you hear is a sense of betrayal, that I really wasn't aware that this space of mine was being treated in these ways, and that I was part of your psychological experimentation.
JAD: That's Kate Crawford.
KATE CRAWFORD: I'm a principal researcher at Microsoft Research.
JAD: Visiting professor at MIT, strong critic of Facebook throughout the kerfuffle.
KATE CRAWFORD: There is a power imbalance at work. I think when we look at the way that that experiment was done, it's an example of highly centralized power and highly opaque power at work. And I don't want to see us in a situation where we just have to blindly trust that platforms are looking out for us. Here I'm thinking of one of—an earlier Facebook study, actually, back in 2010, where they did a study looking at whether they could increase voter turnout. They had this quite simple design, they—they came up with, you know, a little box that would pop up and show you where your nearest voting booth was. And then they said, "Oh, well in addition to that, when you've voted here's a button you can press that says 'I voted,' and then you'll also see the pictures of six of your friends who'd also voted that day." Would this change the number of people who went out to vote that day?
JAD: And Facebook found that it did. That if you saw a bunch of pictures of your friends who had voted, and you saw those pictures on Election Day, you were then two percent more likely to click the "I voted" button yourself. Presumably because you too had gone out and voted. Now two percent might not sound like a lot, but ...
KATE CRAWFORD: It was not insignificant. Again, I think by the order of 340,000 votes were the—were the votes that they estimate they actually shifted by getting people to go out.
JAD: Really? So these are people who wouldn't have voted who did?
KATE CRAWFORD: Who wouldn't have voted, and who that they—they have said in their own paper, in a published paper, that they increased the number of votes that day by 340,000.
JAD: Simply by saying that you—your neighbors did it too?
KATE CRAWFORD: Yeah. By your friends.
JAD: Now my first reaction to this, I must admit, was okay, I mean we're at historic lows when it comes to voter turnout, this sounds like a good thing.
KATE CRAWFORD: Yes, but what happens if someone's running a platform that a lot of people are on and they say, "Hey, you know, I'm—I'm really interested in this candidate. This candidate is gonna look out, not just for my interests, but the interests of the technology sector, and I think they're—you know, they're a great candidate. Why don't we just show that get out to vote message and that—that little system design that we have to the people who clearly, because we already have their political preferences, the ones who kind of agree with us? And the people who disagree with that candidate, they won't get those little nudges." Now that is a profound democratic power that you have.
JAD: Kate's basic position is that when it comes to social engineering—which is what this is—companies and the people that use them need to be really, really careful. In fact, when—when Andrew mentioned to her that Arturo had this group, and the group had a name ...
ANDREW ZOLLI: He actually runs a group called the Trust Engineering Group. His job is to engineer trust.
JAD: When Andrew told her that ...
ANDREW ZOLLI: Facebook users, that's his job.
JAD: You're smacking your forehead.
KATE CRAWFORD: I think we call that a facepalm. [laughs]
JAD: She facepalmed, like, really hard.
KATE CRAWFORD: These ideas that we could somehow engineer compassion, I think to some degree have a kind of hubris in them.
JAD: Hmm.
KATE CRAWFORD: Who are we to decide whether we can make somebody more compassionate or not?
ANDREW ZOLLI: How do you want to set this up?
JAD: Let's see, how do we do this?
JAD: A couple months after our first interview, we spoke to Arturo Bejar again. At this point, the kerfuffle was dying down, and we asked him about all the uproar.
JAD: I know this is not your work, this—the emotional contagion stuff.
ARTURO BEJAR: Mm-hmm.
JAD: But literally like hours after we got back from that meeting, that thing erupted. Do you understand the backlash?
ARTURO BEJAR: No, I mean, I think that—I mean, we—we—we really care about the people who use Facebook. I don't think that there's such a thing as—as—I mean, if anything I've learned in this work is that you really have to respect people's response and emotions, no matter what they are.
JAD: He says the whole thing definitely made them take stock.
ARTURO BEJAR: There was a moment of concern of what it would mean to the work, and there was like, is this gonna—is this gonna mean that—that we can't do this?
JAD: Hmm.
ARTURO BEJAR: Part of me, like, being honest coming here is that I actually want to reclaim back the word "emotion," and reclaim back the ability to do very thoughtful and careful experiments. I want to claim back the word experiment.
JAD: You want to reclaim it from—from what?
ARTURO BEJAR: Well, suddenly, like, the word "emotion" and the word "experiment," all these things became really charged.
JAD: Well yeah, because people thought that Facebook was manipulating emotion and they were like ...
ARTURO BEJAR: Yes, but ...
JAD: ... "How could they?"
ARTURO BEJAR: In our case—in our case, right, and in the work that we're talking about right now, all of the work that we do begins with a person asking us for help.
JAD: This was Arturo's most emphatic point. He said it over and over that, you know, Facebook isn't just doing this for fun, people are asking for help. They need help. Which points to one of the biggest challenges of living online, which is that, you know, offline, you know, when we try and engineer trust offline, or at least just read one another, we do it in these super subtle ways using eye contact and facial expressions and posture and tone of voice, all this non-verbal stuff. And of course when we go online, we don't have access to any of that.
ARTURO BEJAR: In the absence of that feedback, how do we communicate? What does communication turn into?
[ARCHIVE TAPE, Louis C.K.: Just 'cause the other stupid kid ...]
JAD: The best riff on this, I gotta say, is Louis C.K. on Conan.
[ARCHIVE TAPE, Louis C.K.: ... for kids. It's just this thing, it's bad.]
JAD: He did this great bit about technology and kids.
[ARCHIVE TAPE, Louis C.K.: You know, kids are mean. And it's 'cause they're trying it out. They—they look at a kid and they go, "You're fat," and then they see the kid's face scrunch up and they go, "Ooh, that doesn't feel good to make a person do that." But they—but they gotta start with doing the mean thing. But when they write "You're fat," then they just go, "Mmm, that was fun. I like that."]
JAD: Anyhow, back to Arturo.
ARTURO BEJAR: I mean, I think about, like, what it means to be in the presence of a friend or a loved one. And—and how will you build experiences that facilitate that when you cannot be physically together?
JAD: Arturo says that's really all he's up to, he's just trying to nudge people a tiny bit so that their online selves are a little bit closer to how they are offline. And I gotta say if he can do that by engineering a couple of phrases like, "Hey Robert, would you mind ..." et cetera, et cetera, well then I'm all for it.
ROBERT: Why not take the position that to create a company that stands between two people who—who are interacting, and then giving them boxes and statuses and—and little—and advertising and so forth, that this is not doing a service, this is just—this is—this is a way to wedge yourself into the ordinary business of social intercourse and make money on it.
JAD: No.
ROBERT: And you're acting like this group of people now is going to try to create the—the moral equivalent of an actual conversation? First of all, it's probably not engineerable, and second of all, I don't believe that for a moment. All I'm—all I'm thinking is they're gonna just go and figure out other ways in which to make a revenue enhancer.
JAD: No, I don't think it's one or the other. I think they're in it for the money. In fact, if they can figure this out, and make the internet universe more conducive to trust, less annoying, it could mean trillions of dollars. So yeah, it's the money, but still that doesn't negate the fact that we have to build these systems, right? That we have to make the internet a little bit better.
ROBERT: That's—that's fine. This idea, however, that you're going to have to coach people into the subtleties of the relationship, tell them you're sorry, tell them that this—here's the formula for this. Here's—he doesn't want—he did something and you need to repair that. Here are the seven ways you might repair that. To do all that, it's—it's as if the Hallmark card company, instead of living only on Mother's Day, Father's Day and birthdays, just spread its evil wings out into the whole rest of your life. And I don't think that's a wonderful thing.
JAD: [laughs] I think—you know, I have a slightly different opinion of it. I mean, yeah keep in mind how this thing came about. I mean, they tried to get people to talk to each other, they gave them the blank text box, but nobody used it, right? So they're like, "Okay, let's come up with some stock phrases that yes, are generic." But think about the next step after you send the message saying, you know, "Jad, I don't like the photo. Please take it down." Presumably, then you and I get into a conversation. Maybe I explain myself, I say, "Oh my God, I'm so sorry. I didn't realize that you didn't like that photo. I just thought, like, that was an amazing night, I just thought that was a great night, I didn't realize you thought you looked [bleep]. So sorry. I'll take it down. It's cool." See, now presumably we're having that conversation as a next step.
ROBERT: Why do you presume that? How many of the birthday cards that you've sent to first cousins have resulted in a conversation?
JAD: Maybe not, but I mean ...
ROBERT: See, that's the thing. Sometimes these things are actually not—they're really the opposite of what you're saying, they're conversation substitutes.
JAD: Maybe. Maybe they're conversation starters.
ROBERT: Maybe that's the deep experiment.
JAD: Are they conversation starters or substitutes? Well, I hope they're conversation starters.
ROBERT: Yeah.
JAD: Because maybe that would be a beginning.
ANDREW ZOLLI: It kind of, in my mind, goes back to, like, the beginning of the automobile age.
JAD: This is how Andrew puts it.
ANDREW ZOLLI: There was a time when automobiles were new. And, you know, they didn't have turn signals. The tools they did have, like the horn, didn't necessarily indicate all the things that we use it to indicate. It wasn't clear what the horn was actually there to do. Was it there to say hello, or was it there to say get out of the way? And over time, we created norms. We created roads with lanes, we created turn signals that are primarily there for other people, so that we can co-exist in this great flow without crashing into each other.
JAD: And we still have road rage.
ANDREW ZOLLI: And we still have road rage. We still have places where those tools are incomplete.
JAD: Thanks to Andrew Zolli. Many, many, many, many thanks.
ROBERT: Yes, definitely.
JAD: For bringing us that story, and for reporting it with me for so long. And also ...
ROBERT: And to Arturo, who keeps—who you kept bringing back into the studio to keep ...
JAD: Yes, thank you very much to Arturo and the whole team over there. And by the way, they've changed their name, it's no longer Trust Engineering, it is the Facebook Protect and Care Team.
ROBERT: Really?
JAD: Yeah. We had some original music this hour from Moona Night, thanks to them. Props to Andy Mills for production support. And also, Andrew Zolli put together a blog post—if you go to Radiolab.org you can see it—which covers some really interesting research that we didn't get a chance to talk about. And if you've ever sent an email with a little smiley face, you're definitely gonna want to read this. Radiolab.org, I'm Jad Abumrad.
ROBERT: I'm Robert Krulwich.
JAD: Thanks for listening.
-30-
Copyright © 2023 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of programming is the audio record.