
Dec 14, 2010
Transcript
[RADIOLAB INTRO]
JAD ABUMRAD: All right.
ROBERT KRULWICH: Okay, so let's just do the open.
JAD: All right. Hey, I'm Jad Abumrad.
ROBERT: I'm Robert Krulwich.
JAD: This is Radiolab.
ROBERT: And today, we're gonna be talking—well, let's do it this way.
JAD: Which way?
ROBERT: I was at the 92nd Street Y in New York City, big gathering spot for cool people with new books and that particular week ...
ROBERT: Is Richard Dawkins ...
ROBERT: Richard Dawkins.
[cheering]
JAD: They like him.
ROBERT: Mm-hmm.
ROBERT: Don't make it so easy for him.
ROBERT: I decided to begin ...
ROBERT: This is a real problem for a lot of people ...
ROBERT: ... by quoting him to him.
ROBERT: You write—I don't know if it's in this book or some other, "The total amount of suffering per year in the natural world is beyond all decent contemplation. During the minute it takes me to compose the sentence, thousands of animals are being eaten alive, others are running for their lives, whimpering with fear. Others are slowly being devoured from within by rasping parasites. Thousands of all kinds are dying of starvation, thirst, disease. It must be so. If there's ever a time of plenty, this very fact will automatically lead to an increase in population until the natural state of starvation and misery is restored." [shudders]
RICHARD DAWKINS: Darwin was worried by the same thing. I mean, Darwin recognized the—the total horror of the suffering in nature, it was one of the things that actually made him lose his faith, but he also realized that it's not just a fact that it happens, it's—it's intrinsic to natural selection that it must happen. And when you look at a beautiful animal like a cheetah that appears to be beautifully designed for something, like a cheetah is amazingly well designed, apparently, for catching gazelles and gazelles are amazingly well designed for escaping from—from cheetahs. But they are the end products of a sort of evolutionary arms race in which thousands, millions of animals have died. The—the shaping, the carving of the shape of a cheetah or a gazelle has come about through millions of unsuccessful gazelles being caught, and the successful ones making it through only to be caught later probably, but after reproducing and passing on the genes that help them to escape. So the sheer number of deaths that lie behind the—the sculpting of these beautiful creatures is horrifying. And at the same time, it's got a kind of savage beauty.
JAD: Wow. Why did you play me this exactly?
ROBERT: Well, because I was sitting there thinking: I know that cheetahs chase and eat antelopes, but wasn't there a nice cheetah once that went over to the antelope and said, "Hi, have a sandwich together?" And that maybe something about the cheetah and the—had something to do with an act of kindness? I can't imagine ...
JAD: Ah, so you're thinking that maybe it's not just meanness that can sculpt, but maybe niceness can sculpt, too.
ROBERT: Exactly. Niceness as a—as a scalpel.
JAD: Niceness as a scalpel. Ooh, I want to listen to that show! Wait a second. We are that show. We should do it then. Let's do it. Today on Radiolab, goodness.
ROBERT: Kindness.
JAD: Selflessness.
ROBERT: Altruism.
JAD: If the world is so cruel, how do you account for it?
ROBERT: Yeah.
JAD: How should we think about it?
ROBERT: And when you do see generosity, how do you know it's really generous?
JAD: All right, so we're gonna start the show with a story that sort of embodies the last question you asked about a guy named George Price who was a mathematician we never heard of until our producer, Lynn Levy, told us about him. She heard about it from an author, Oren Harman, who wrote a book called The Price—as in George Price—of Altruism.
OREN HARMAN: You know, this is a high school photo ...
LYNN LEVY: So—okay, so the people on the radio can't see the picture. So describe what he looks like.
OREN HARMAN: Well, I tell you, he looks a bit like some kind of Scandinavian prince in the 17th century. Good-looking guy.
LYNN: Totally. Definitely something about this guy's eyes.
OREN HARMAN: His eyes.
LYNN: Yeah.
OREN HARMAN: This was described to me by a number of people who knew him. He had a gaze that you sort of walked away from at your own peril. There was something that, you know, he—he sort of knew things.
LYNN: You could start George's story anywhere, but let's start in 1943.
JAD: Okay.
LYNN: George graduates from college, and he's this ...
OREN HARMAN: Very kinetic kind of guy.
LYNN: Really athletic.
OREN HARMAN: He'd swim in the surf and he did a lot of rock climbing.
LYNN: And by all accounts, he was ...
OREN HARMAN: Incredibly brilliant.
LYNN: And right after college, he starts to kind of bounce through history.
OREN HARMAN: He was all over the place.
LYNN: First place he ends up is ...
OREN HARMAN: The Manhattan Project on uranium enrichment. So he's working as a chemist on the atom bomb. When he was done with that ...
LYNN: After a couple of years ...
OREN HARMAN: ... he made a 90-degree turn and started working at Bell Labs on transistor research, solved some very basic problems there, and then disappeared like a phantom. Started working at a medical center on oncology research.
LYNN: Meaning cancer.
ANNA PRICE: And I remember going to his lab, playing hide and seek. All these bottles and test tubes ...
LYNN: By this time, George had a wife and two kids ...
KATHLEEN PRICE: He would look under the microscope, at slides of blood.
LYNN: ... Anna and Kathleen. But he never really saw them that much.
OREN HARMAN: He'd worked 56 hours straight without sleeping on Benzedrine.
KATHLEEN PRICE: I remember, he was always ...
OREN HARMAN: Stuff like that.
KATHLEEN PRICE: ... gone a lot.
LYNN: When the kids are still pretty young ...
KATHLEEN PRICE: They were like five and six.
OREN HARMAN: He ...
LYNN: Left his family.
OREN HARMAN: Yeah.
LYNN: Just left.
OREN HARMAN: Turned another 90-degree corner and began working on computer-aided design. In fact, he invented computer-aided design. He was firing in all directions.
LYNN: What do you think was driving him to keep moving from thing to thing?
OREN HARMAN: He just wanted to succeed at any cost. It made no difference in what field. And at one point in time, he was corresponding with about five Nobel laureates, each in a different field. He wanted to have one great discovery that would make his name.
LYNN: So that's—that's George.
JAD: Wow, quite a guy.
LYNN: Very interesting guy.
JAD: So what happens next?
LYNN: So next what happens is, he gets on a boat and he—he goes to London.
JAD: When was this, by the way?
LYNN: November, 1967. And in London, that's where things, for our purposes, start to really happen.
JAD: Why? What happens in London?
LYNN: Well, he starts looking for this question. He goes from library to library. There are 13 libraries that he would hang out at. And the question that he finds for himself, which is weird considering his personal history, is ...
OREN HARMAN: Why family?
JAD: Like, why do people have families?
LYNN: Well, like, why do families stick together?
OREN HARMAN: There are a lot of sort of dynamics within the family where it would make more sense for an individual to sort of break out.
LYNN: You know, go it alone, like he had.
OREN HARMAN: And yet, family persists. And there should be a good reason for it.
LYNN: He even wrote about the question to his daughter.
KATHLEEN PRICE: "Dear Kathleen, my big paper will be on the evolutionary origin of the human family. In most species, the father just mates with the mother and she does all the child-rearing herself. But in the human species, the dominant pattern has involved care by adult males toward their own children. Why did our species evolve this way?" You know, it just brings back what kind of a father our father was towards us, and basically there was kind of this benign neglect.
LYNN: Hmm.
LYNN: But this question, "Why family?" was only the beginning. "Why family" led him to a bigger question, which is "Why does anybody help anybody?"
JAD: Huh. Well, what do you mean?
LYNN: If you think about Darwin's idea, survival of the fittest, think about what that really means. It means if you are a creature, you have two big important jobs.
JAD: You gotta survive, and you gotta be fit.
LYNN: Right.
JAD: Whatever that means.
LYNN: Fitness really means how many babies can you make? How many babies are you making? And so if you do some stupid, you know, harebrained thing that means you can't stay alive and/or you can't make babies, that doesn't make any sense.
JAD: Right.
OREN HARMAN: And yet, wherever you look in nature ...
LYNN: You see creatures doing this.
OREN HARMAN: From bacteria ...
LYNN: To insects ...
OREN HARMAN: Birds, bees, ants and wasps, fish. I'll give you an example. There's a species of amoeba called Dictyostelium discoideum which usually, the amoeba sort of lives on its own, it's a single-celled organism in the forest. But when resources are low, what it does is it sends out this chemical signal, and all the other amoebas, who are also single-celled ...
LYNN: They start sending out signals.
OREN HARMAN: And they start sort of crawling until they all meet. And they become one slug, which is now a single organism.
LYNN: And this slug begins to sort of move along until it finds a place that's windy and sunny. At which point ...
OREN HARMAN: It stops. And the top 20 percent of the slug, the top 20 percent amoeba in the head of the slug, begins to create out of their own body a stalk, which hardens. And they die while doing so. But it—the stalk allows the bottom 80 percent to climb up the stalk, and to create an orb at the top of the stalk.
LYNN: And from there, all the amoebas that aren't, you know, dead, they can catch a wind.
OREN HARMAN: To better pastures.
LYNN: It's like a dandelion.
OREN HARMAN: So what's happened is that the top 20 percent have really sacrificed themselves for the back 80 percent. And that's an amoeba. So you figure, what the hell is happening here? This was a great mystery to Darwin. And Darwin said, "This is, in fact, the greatest mystery, and the greatest riddle. And if I can't answer it, then my theory isn't worth anything."
LYNN: And for a hundred years, when people talked about evolution, this thing, altruism, is the elephant in the room.
JAD: So we were curious about this.
CARL ZIMMER: Sorry, I'm ...
JAD: How might you take this elephant, this niceness thing that seems to be everywhere, and shove it back into the mean old theory of evolution? There's gotta be a way. And so we called up Carl Zimmer who's a journalist we have on the show quite often, who writes a lot about evolution. And he told us that in the 1960s, just as George Price was starting to ask these questions, some scientists came up with a new way of thinking about altruism, a thought experiment which he ran us through.
CARL ZIMMER: Okay, so—okay, so—so Robert, do you have siblings?
ROBERT: I have a sister.
CARL ZIMMER: Okay, you have a sister ...
ROBERT: Sarah.
CARL ZIMMER: Okay, let's just imagine that you guys are, like, home from college, say.
ROBERT: Okay.
CARL ZIMMER: And—and there's a flood at the Krulwich manor, and the water is flooding around, and you can see that your sister is—is about to die.
[VOICE: Help! Help me!]
CARL ZIMMER: If you save your sister's life and you die in the process, your genes, Robert Krulwich's genes, are gone.
ROBERT: Yep.
JAD: Right. This is the problem.
CARL ZIMMER: Yes. But you and your sister have the same parents.
ROBERT: Yes.
CARL ZIMMER: Okay. So your sister has 50 percent of your genes. Okay?
ROBERT: So if I rescue her, then half my genes survive?
CARL ZIMMER: Right. 50 percent move on. Now if you had a sister and a brother and you saved them both, they'd each have 50 percent.
JAD: So it's a wash.
CARL ZIMMER: And so it's effectively, it's like—it's like saving Robert Krulwich in his entirety.
ROBERT: Mathematically speaking.
CARL ZIMMER: Mathematically speaking. Right.
JAD: Can you do this with cousins?
CARL ZIMMER: Yeah, actually. If you step it back to cousins ...
ROBERT: What percentages are—that's a quarter in the case of the first cousins?
CARL ZIMMER: It's an eighth.
ROBERT: So I have to have eight first cousins to equal my full genome.
CARL ZIMMER: Right. Yeah.
JAD: Do you have that many?
ROBERT: I have 32 third cousins. And that's why I always round them up at a rodeo every year.
JAD: And you place them all together, just in case.
ROBERT: "You guys stay here in case something happens to me."
JAD: But here's what I don't get, like, how does this actually operate? Like, Robert's not gonna sit there while the manor is flooding and be like, "Well, let's see. I have a cousin that's an eighth and a second cousin, that's a 32nd."
CARL ZIMMER: No, you don't understand. The math has already been done.
JAD: The math has already ...
CARL ZIMMER: The math has been done by evolution on genes. And those are the genes you got.
JAD: Oh, so you're saying that the evolution has turned the math into an instinct?
CARL ZIMMER: Yeah, you got it.
ROBERT: I don't think I get it. Like, so what is the instinct here? The—I know I want to save my sister.
JAD: Yeah, well, so here's how I understand it. Since sis has half your genes ...
ROBERT: Yeah?
JAD: And since second cousin only has a 32nd, theoretically, your instinct to save your sis should be 16 times stronger than your instinct to save ...
ROBERT: No, that's actually roughly proportionally correct, really. [laughs]
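[A note for readers following the arithmetic: here is the same bookkeeping as a minimal Python sketch. The fractions are the ones from the conversation above, and the final ratio is just Jad's "16 times stronger" figure, nothing more rigorous than that.]
```python
# The fractions from the conversation above: how much of your genome a relative shares, on average.
relatedness = {
    "sibling":       1 / 2,
    "first cousin":  1 / 8,    # so it takes eight of them to "equal" one full genome
    "second cousin": 1 / 32,
}

def relatives_needed(relation):
    """How many relatives of this kind add up to one full genome, in this crude accounting."""
    return round(1 / relatedness[relation])

print(relatives_needed("first cousin"))                       # 8
print(relatedness["sibling"] / relatedness["second cousin"])  # 16.0, Jad's "16 times stronger"
```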
JAD: But keep in mind this was just an idea. It was just a thought experiment until our guy George Price comes along and writes an equation which shows mathematically how an instinct like this could evolve.
CARL ZIMMER: It's very powerful. Okay, so—well, do you want me to just read the letters?
JAD: Yeah.
LYNN: What is the equation? What equals what?
CARL ZIMMER: Okay, okay. So it's w-bar delta-z-bar equals the covariance of w-i and z-i, plus E. We call it E of w-i delta-z-i.
JAD: Oh, of course.
CARL ZIMMER: Yeah, there you go.
JAD: So complicated. I mean, it was simple a second ago.
LYNN: No, it's—yeah, it sounds a little complicated. He's not just dealing with, like, a simple setup. It's like, he's got the traits and how they affect the different groups and how things change over time. So it's a big—there's a lot going on in there.
JAD: Okay.
LYNN: Yeah.
JAD: All right. Do you understand what you just said?
LYNN: Not—nah.
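[For readers trying to follow the math: the equation Carl is reading aloud, the Price equation, is usually written like this. The rendering is ours, not something said on the air.]
```latex
\bar{w}\,\Delta\bar{z} \;=\; \operatorname{cov}(w_i, z_i) \;+\; \operatorname{E}\!\left(w_i\,\Delta z_i\right)
```
[Here z_i is how much of a trait, say altruistic behavior, individual or group i has, w_i is its fitness, the bars are population averages, and Δ marks the change from one generation to the next. The covariance term tracks selection on the trait; the expectation term tracks everything else, which is the "lot going on in there" Lynn mentions.]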
KATHLEEN PRICE: Yeah. So here, this is a really interesting letter we should read.
CARL ZIMMER: When he did write the equation, he walked off the street into the University ...
KATHLEEN PRICE: University College of the University of London.
CARL ZIMMER: In London. Complete unknown. Had just moved from America. No one knew who he was.
KATHLEEN PRICE: "I went to talk to a Professor Smith ..."
CARL ZIMMER: And he showed the equation to the professor and said, "Is this new?"
KATHLEEN PRICE: "I felt sure that someone must have discovered it before."
CARL ZIMMER: The professor looked at it, and after a very, very short amount of minutes gave him an honorary professorship, keys and the keys to—to an office. One of the best genetics departments in the world.
LYNN: So George is sitting in his office, which by the way is on the site of Darwin's old house.
JAD: Whoa!
LYNN: Yeah. And he's made this big discovery and he's thinking, thinking, thinking.
CARL ZIMMER: Thinking philosophically about what it all meant.
LYNN: Thinking. Thinking.
CARL ZIMMER: "If I can write a formal mathematical treatment of the evolution of a trait like altruism, what it means about the trait is that the trait is never really purely altruistic."
LYNN: If making a sacrifice helps me in the end or helps my genes ...
JAD: Sort of like selfishness in disguise.
LYNN: Yeah.
CARL ZIMMER: If that's true, the world is a terrible place, because it means that there's no true—there could never be true selflessness in the world. My math means that there cannot ever be true selflessness. And I can't accept a world like that.
JAD: Why could he suddenly not accept the world like that?
LYNN: I—yeah, I don't know. Oren thinks it might be because ...
OREN HARMAN: Precisely because he had been so selfish for most of his life. And so he decided in his own life to embark on a program of radical altruism, that would prove that there was true selflessness in this world. And that's what led him to the streets of London, in search of homeless people, derelicts, down and outs. And he began by sort of just walking up to them, introducing himself. "Hello, my name is George. What's your name? How can I help you?"
JAD: To random people on the street?
LYNN: Yeah.
KATHLEEN PRICE: "Everywhere I go, I keep running into down and out alcoholics, to whom I give if I have anything, and with whom I sit and drink from their bottle if they offer me a drink."
OREN HARMAN: He'd buy people sandwiches, or give them a few pounds.
KATHLEEN PRICE: "Whether it's by giving them money, cleaning a filthy kitchen ..."
OREN HARMAN: And then it got bigger.
LYNN: He started giving out keys to his place.
OREN HARMAN: Inviting these guys into his home.
LYNN: People were coming and going, he was giving them food, clothes. And after a few months of charity like that, he was out of money.
OREN HARMAN: There was one letter that he had written to John Maynard Smith, another great biologist of the era, which said, "John, I'm down to my last 15p, and I can't wait to get rid of the last 15." He thought he was proving his equation wrong.
JAD: So by getting poorer and poorer and giving away all of his stuff, he was somehow negating the thing his math seemed to say was inevitable? The selfish instinct?
LYNN: Yeah. You know, he had this self-preservation instinct and he was going to fight the self-preservation instinct, and he was going to win.
OREN HARMAN: Sort of beat the mathematics that he himself had written.
LYNN: So he was approaching it almost like—like a math proof.
OREN HARMAN: Yeah.
PRODUCER: Yeah, it's the red one that you'll be talking into.
LYNN: When he ran out of money, George moved out of his apartment and into this abandoned house in a part of London called Tolmer's Square.
SYLVIA: Which one does the volume for my headphones?
LYNN: Which is where he met Sylvia.
SYLVIA: It was rough. There were just—'cause they were just poles holding the walls up. Some—some places had walls.
LYNN: She was a young artist, also squatting at the time.
SYLVIA: The buildings are crumbling, you know. People had made makeshift staircases ...
LYNN: And George had, like, a room?
SYLVIA: Well, a few clothes on the floor, not much. But, you know, you could see he was always thinking. He would go around asking other people, "Does anybody have shoes they don't want? So and so needs a pair of shoes." You know, that would be part of it. But it might also be like, if somebody was sick, getting him to a doctor. Because if you didn't—if you were homeless, it's very hard to have a doctor. But like I said, all this is going on at the same time. He was getting thinner and thinner. Thin little neck, and then these clothes that just hung around him.
LYNN: Hmm.
LYNN: He began writing letters to his daughters.
OREN HARMAN: Apologizing, weeping.
KATHLEEN PRICE: "Dear Annamarie. Sorry I deserted you like that, and I'm sorry I was such a poor father to you."
OREN HARMAN: Yeah, I've been a terrible father.
KATHLEEN PRICE: "Looking at your picture now makes me wish I could do it all over again."
SYLVIA: Maybe where I come into the picture is he wanted to begin again.
LYNN: She says George asked her to marry him over and over.
SYLVIA: At first, I thought it was kind of a joke, and I was saying, "George, we can't get married." You know ...
LYNN: She said no each time. And at a certain point he gave up.
SYLVIA: It's hard to really really remember, but it was colder. As the—as the winter came on, you wouldn't see George as often. He became quieter, I think. I just remember—and quieter.
LYNN: One morning, this guy that was sharing a squat with George.
OREN HARMAN: His name is Shmuley Katir.
LYNN: He was heading out the door.
OREN HARMAN: He found beneath the door—as he was going out of the building, he found beneath the door a letter. And since they were living in a squat, he was afraid that this was some kind of eviction notice or something like that. And he didn't read English, he couldn't read English. So he ran up the stairs and knocked on George's door, because George was the only one who could read English. And when he knocked, the door sort of kind of went in a bit, and he could see in the aperture that there was blood all over the linoleum floor. When he had enough of an opening, he could see that George was sitting there with no blood left in his body.
JAD: He killed himself?
LYNN: Yeah.
OREN HARMAN: He took a pair of scissors and cut through his carotid artery, which is a very, very sort of terrible death. Poor George.
JAD: Thanks to producer Lynn Levy. For more on George Price, be sure to read Oren Harman's book, The Price of Altruism. And thanks also to Carl Zimmer. His latest is Microcosm.
ROBERT: We'll be right back.
[ANNMARIE PRICE: Hi, this is Annmarie Price.]
[KATHLEEN PRICE: This is Kathleen Price. Radiolab is funded in part by the Alfred P. Sloan Foundation.]
[ANNMARIE PRICE: Enhancing public understanding of science and technology in the modern world.]
[KATHLEEN PRICE: More information about Sloan at www.sloan.org.]
[OREN HARMAN: This is Oren calling. Radiolab is produced by WNYC ...]
[CARL ZIMMER: And distributed by NPR. This is Carl Zimmer. Bye.]
JAD: Hey, this is Radiolab. I'm Jad Abumrad.
ROBERT: I'm Robert Krulwich.
JAD: Our topic today is ...
ROBERT: Goodness.
JAD: Goodness. Selflessness.
ROBERT: So we've done the math, I—the math leaves me a little on the cold side. [laughs]
JAD: [laughs] I wonder why. So you know what? Forget the math. Forget it.
ROBERT: Let's—let's go to the people who do the deeds.
JAD: Yeah, people who do amazingly brave and heroic things.
ROBERT: Yeah.
JAD: No math required.
ROBERT: Maybe find out, I don't know ...
JAD: What makes them different than the rest of us.
ROBERT: Yeah.
[phone rings]
JAD: That question led us to ...
[phone rings]
WALTER RUTKOWSKI: Walter Rutkowski.
JAD: ... to a guy named Walter Rutkowski.
WALTER RUTKOWSKI: And I'm the executive director and secretary of the Carnegie Hero Fund Commission.
JAD: Cool. Well, thanks for doing this.
WALTER RUTKOWSKI: Okay.
JAD: Can you just give us a little background on the Hero Fund? What is the Carnegie Hero Fund?
WALTER RUTKOWSKI: The Carnegie Hero Fund is a private operating foundation that was established by Andrew Carnegie in 1904. And what we do is recognize civilian heroism throughout the United States and Canada by giving an award called the Carnegie Medal. And accompanying the Carnegie Medal is a financial grant.
JAD: How much?
WALTER RUTKOWSKI: Currently the amount is $5,000.
JAD: Wow. And how do you guys choose your heroes?
WALTER RUTKOWSKI: We judge the heroic acts against a list of requirements.
ROBERT: So then you have to have some kind of definition of "hero," which includes some and excludes others.
WALTER RUTKOWSKI: Yes.
JAD: Perfect.
WALTER RUTKOWSKI: A basic definition which is a civilian ...
JAD: One.
WALTER RUTKOWSKI: ... meaning no military, who voluntarily ...
JAD: Two.
WALTER RUTKOWSKI: ... leaves the point of safety ...
JAD: Three.
WALTER RUTKOWSKI: ... to risk his own life, or her own life ...
JAD: Four.
WALTER RUTKOWSKI: ... to an extraordinary degree ...
JAD: Five.
WALTER RUTKOWSKI: ... to save or to attempt to save the life of another human.
JAD: Six. And how about seven: Why?
ROBERT: Can you—can you read that one more time?
WALTER RUTKOWSKI: Okay. I wasn't reading, that just came from memory. So ...
ROBERT: Oh, okay.
JAD: Like, what is it that happens in a person's mind at that pivotal moment when they decide to voluntarily leave a point of safety and risk their life to an extraordinary degree ...
WALTER RUTKOWSKI: ... to save the life of another human?
JAD: That's what we wanted to know.
JAD: Should we just jump in?
WALTER RUTKOWSKI: Okay.
JAD: So, the first one we have on our list is Lora Shrake.
WALTER RUTKOWSKI: Okay. That's file number 73546 and the award number is 8005.
LORA SHRAKE: I am Lora Shrake. I'm from Mattoon, Illinois, and I currently live in Dubai, United Arab Emirates.
TIM HOWARD: Oh, wow.
JAD: Lora spoke with our producer Tim Howard.
TIM: Okay, so we're going back a little bit here.
LORA SHRAKE: Yeah. 15 years.
WALTER RUTKOWSKI: Back in the mid '90s ...
LORA SHRAKE: 1995.
WALTER RUTKOWSKI: Was a 21-year-old college student.
LORA SHRAKE: And I was driving through the country and I saw a woman getting mauled by a bull in a pasture.
WALTER RUTKOWSKI: So she stopped to see what was going on.
LORA SHRAKE: Jumped out, and started yelling at her to see what I could do. The woman was on the ground and the bull was ...
WALTER RUTKOWSKI: A 950-pound Jersey bull.
LORA SHRAKE: Tossing her in the air and back on the ground.
TIM: Wow!
LORA SHRAKE: She was clearly struggling.
TIM: And where were you?
LORA SHRAKE: I was right on the other side of the fence, but the fence was electric.
JAD: So here's the moment that we find fascinating. At this point, Lora can either go forward through thousands of volts of electricity toward an angry bull that will likely maul her too, or she can stay safe.
LORA SHRAKE: I went ahead and just climbed through the fence, and I don't remember ever feeling the electricity.
JAD: She says by the time she got through ...
LORA SHRAKE: Crazily enough ...
JAD: ... a neighbor had shown up and threw her a piece of pipe.
LORA SHRAKE: Maybe about two feet long.
WALTER RUTKOWSKI: So she approached the woman.
LORA SHRAKE: Who was still conscious. The whole time she's yelling at me, "Hit the bull in the face as hard as you can and don't stop."
WALTER RUTKOWSKI: So Miss Shrake went up to the bull and beat it repeatedly with this two-foot length of tubing.
LORA SHRAKE: I think it distracted the bull enough where she was able to get out from under him. And as soon as we were outside the fence, looking back into the pasture and the bull was literally right there at the fence.
WALTER RUTKOWSKI: Kicked the ground a few times and snorted.
LORA SHRAKE: He was not—he was not happy.
JAD: To our question ...
TIM: When you were there at that fence, and you had the choice to either stay put or to go through it, what was going through your mind? Was there a calculation there?
LORA SHRAKE: No, I can't really say that. I mean ...
TIM: You didn't weigh your options or anything like that?
LORA SHRAKE: I did not, no. It was just here's the problem, here's what I need to do, and something needed to happen.
TIM: Huh. So there was no choice moment?
LORA SHRAKE: Not that I recall, no. If nobody came to this woman's rescue, she would die.
JAD: "Unfortunately, this is the usual explanation," says Walter. No explanation.
WALTER RUTKOWSKI: Like, "I couldn't stand there and not do anything. I would—I was compelled to act."
LORA SHRAKE: I didn't really take the time to think about what else could happen.
WILLIAM DAVID PENNELL: Well, I can't say I ever really thought about my own life at that time. I mean ...
JAD: Okay, we just jumped ahead because we thought we'd try again. That's the voice of the next Carnegie hero that Walter told us about. Yeah, William David Pennell.
WILLIAM DAVID PENNELL: Name is William Pennell.
WALTER RUTKOWSKI: Who is the 8,362nd person to receive the Carnegie Medal.
JAD: Our producer, Lynn Levy, tracked him down.
LYNN: Bill, can you hear me?
WILLIAM DAVID PENNELL: Yeah, I can hear you.
WALTER RUTKOWSKI: William David Pennell was 37 years old at the time of his heroic act.
LYNN: Was it 1999?
WILLIAM DAVID PENNELL: Yes, it was early in the morning. It was like ...
WALTER RUTKOWSKI: 3:19 am in a small town near Pittsburgh.
WILLIAM DAVID PENNELL: Pittsburgh, Pennsylvania.
WALTER RUTKOWSKI: Monongahela, Pennsylvania.
WILLIAM DAVID PENNELL: We was in bed sleeping and my wife heard a loud crash. I actually didn't hear it, but a dog, my one dog was carrying on. So right away, I—I run down there.
WALTER RUTKOWSKI: Mr. Pennell went outside his house. There was a very bad automobile accident. A car crashed head-on into a utility pole.
WILLIAM DAVID PENNELL: Flames was, like, ripping up the windshield out from under the hood.
WALTER RUTKOWSKI: He responded to the scene wearing only sweatpants.
WALTER RUTKOWSKI: No shoes or shirt, or nothing on.
WILLIAM DAVID PENNELL: Bare chested and barefoot.
JAD: So here we are. Bill is standing in front of this ball of fire. There are three drunk teenagers inside that car, though he doesn't know it. He can either A) do nothing; or B) go in.
WALTER RUTKOWSKI: Through the driver's door.
WILLIAM DAVID PENNELL: And this big fella slumped out the door. So I reached in and grabbed the hold of him.
WALTER RUTKOWSKI: Around the chest, pulled him from the driver's seat out to the ground.
WILLIAM DAVID PENNELL: Meantime, the car was just like blazing. And my neighbor was there. She was hollering, "There's more of them in there!" So I run back to the vehicle ...
WALTER RUTKOWSKI: Found that the front-seat passenger was trapped in the wreckage ...
WILLIAM DAVID PENNELL: I finally got him loose and pulled him out.
WALTER RUTKOWSKI: Apparently, Mr. Pennell was aware that a third person was in the car, a third young man. Mr. Pennell entered the car a third time. By then ...
WILLIAM DAVID PENNELL: There was tires blowing high.
WALTER RUTKOWSKI: ... the flames have grown to about three feet above the car's roof.
WILLIAM DAVID PENNELL: The interior, like the headliner of the car, stuff was dripping like plastic down on my back. I mean, I'm in there screaming, you know, "Somebody give me a hand in here!" But nobody—nobody would help. And I reached in and grabbed a hold of the kid that was in the back by the scruff of the neck and pulled him out.
LYNN: All right, so when you were coming out of your house, and you're looking at that car, what was going through your head?
WILLIAM DAVID PENNELL: Well, just trying to—try to help. I mean, I—I—I did what any normal person would do. I mean, you know, I just kept saying, "This is somebody's kids," you know what I mean? At the time, my daughter was, like, 16, and I'm saying to myself, "You know, if something, God forbid, whatever happened to her, that I would hope someone would be there to help."
LYNN: Did you ever talk to your neighbors and ask them why they didn't come in there?
WILLIAM DAVID PENNELL: You know what? That's funny you brought that up because, no, I've never—never brought it up. Never brought it up.
LYNN: How come?
WILLIAM DAVID PENNELL: I don't know. I guess maybe I probably wouldn't like their answer. I don't know. I—I don't know why I've never asked them that.
LYNN: What do you think is the difference between you and those other people who just sort of stood by?
WILLIAM DAVID PENNELL: I—I couldn't answer that. I couldn't answer that.
JAD: So our bold girl, she didn't know, this guy didn't really know either. Somebody must be able to tell us something about what they were thinking at that moment that allowed them, that gave them the courage to do what they did.
WALTER RUTKOWSKI: I can't give you a definite answer as to what propels people to do this. No.
JAD: But we took one more shot with Walter, and he told us about a case that of all the cases he's heard, this is the one that puzzles him the most.
WALTER RUTKOWSKI: It's the case of Wesley James Autry, a construction worker from New York, 50-year-old man who did jump into the track bed in a subway station to remove a fellow, a fellow, a young man who had fallen onto the track. The gentleman was six foot, 180 pounds. He was—he was inert. And yet Mr. Autry persisted, despite the fact that a train was coming. There would come a point unless—at least in my estimation—where he would have to say, "I have to get out of here because I'm going to be killed. I'm—I'm not suicidal." But Mr. Autry didn't think that way. He and I part in this—in this manner. What he did was he lay atop the victim between the rails while the train passed over them. In the farthest reaches of my imagination, I can see myself jumping onto a subway track to attempt the rescue. What I can't see myself doing is lying atop the victim while the train passes over me.
JAD: Making this story even more nuts ...
[ARCHIVE CLIP, subway announcement: There is a downtown local ...]
JAD: ... when we finally met up with Wesley Autry on the platform where this incident happened, 135th and Broadway, he explained to us that his daughters had been with him.
WESLEY AUTRY: They—they was okay. And two ladies ...
TIM: How old were your daughters?
WESLEY AUTRY: At that time, my daughters was four and six. And this—this is them there.
JAD: He showed us a picture.
ROBERT: Oh my God!
JAD: Super cute.
WESLEY AUTRY: The one behind me is Shuki, and this is the baby Sashi.
JAD: So when they're standing there, and this guy starts convulsing and then eventually falls off the platform onto the tracks right as a train is coming, his choice is pretty stark. In order to save this complete stranger, he's got to leave his daughters behind, potentially without a dad.
WESLEY AUTRY: I'm looking at him shaking and going into another seizure. For some strange reason, a voice out of nowhere said, "Don't worry about your own. Don't worry about your daughters. You can do this."
JAD: So he jumps, runs to the guy.
ROBERT: Is he conscious?
WESLEY AUTRY: No, no.
JAD: Tries to grab the guy's hand.
WESLEY AUTRY: And each time I grabbed his hand, we'd slip apart. You know, he slipped, I look up, the train is getting closer. I grab his hand again, we'll slip apart. The train is closer.
JAD: Fifty feet. Twenty feet. Ten feet. And then it's right there, and all he can do is grab the guy, get him in a bear hug, and flatten his body against the guy as much as he can.
WESLEY AUTRY: The first train car just grazed my cap.
ROBERT: Oh my God.
JAD: Oh my God.
JAD: Train car went right over them.
WESLEY AUTRY: Then when the train came to a stop, four to five cars passed over us. I looked him in the eye and said, "Excuse me, you seemed to have had a seizure or something. I don't know you. You don't know me." So I just kept talking to him until he came through, and he was like, "Well, where are we?" I'm like, "We're under a train." He said, "Well, who are you?" I said, "I came down and saved your life." So, he kept asking me, "Are we dead? Are we in heaven?" I gave him a slight pinch on his arm. He's like, "Oh." Just like, "See? You're very much alive."
JAD: Wow!
ROBERT: Have you—did you ever ask yourself at this point, like, "What am I doing here?" I mean, he asked it. "What am I doing here?" What about you?
WESLEY AUTRY: Well, I can hear my—I can hear the two ladies who have my daughters standing in between their legs. I can hear my daughters screaming. So when that train come to a stop I yelled up from underneath the train, "Excuse me, I'm the father. We're okay. I just want to let my daughters know that—that I'm okay, because I know that they are worried about me." Everybody started clapping.
JAD: Can I ask you a question? So the point at which you said you heard a voice ...
WESLEY AUTRY: Yes.
JAD: ... that said, "I can do this."
WESLEY AUTRY: "I can do this."
JAD: What's—what's—what is amazing to me is that you left your daughters right here and dived after a guy you don't know.
WESLEY AUTRY: He was a stranger. Total stranger. But you know what? The mission wasn't com—completed. I was chose for that.
JAD: You felt chose—like, you—you were ...
WESLEY AUTRY: I felt chosen. I felt like I was the chosen one.
ROBERT: But for a religious person, though, I would wonder, "Why me?"
WESLEY AUTRY: Well, you know what? Maybe 20 years ago, I was supposed to be at a certain point ...
JAD: And then he explained to us exactly why he had jumped. He was the one guy who could. He said right before his feet left the platform, this one specific moment from his life flashed to mind.
WESLEY AUTRY: This thing that happened, you know? I had a gun pulled to my temple, but, you know, it was a misfire. So, you know ...
ROBERT: A gun was put to your head and missed? So you were almost dead for a second or two.
WESLEY AUTRY: I—I was almost dead, you know?
ROBERT: So you think you might have been spared for a purpose?
WESLEY AUTRY: I was spared for a reason.
JAD: After that moment, he says, when the gun went click and he didn't die, he always wondered why had God spared him that moment? Until he was on the platform and he saw the guy fall off, he says, then he knew, "This is why."
WESLEY AUTRY: I—I can do this.
JAD: It's just—I can do this.
WESLEY AUTRY: I can do this. That voice, when that voice said that, "You gonna be okay," I knew everything was gonna work out.
ROBERT: You know what I think at the end of the day?
JAD: What's that?
ROBERT: I don't think that there is an answer to the question we asked. I don't think ...
JAD: The hero question?
ROBERT: Why were you a hero? I don't think that any three of these here—I mean, the last one had the longest explanation. He had been selected for some purpose. But does he know why he was chosen? Not a clue.
JAD: Okay, see I—guy number three gives me something.
ROBERT: What does he give you?
JAD: Okay, so the first two, right? They have no idea.
ROBERT: None.
JAD: So there's just something in them that made them act. But guy number three is talking about circumstances. Like the world prepared him for that moment. Serendipity. So it makes me think, well, what if circumstances are just right, maybe any of us could do that?
WILLIAM DAVID PENNELL: I got—I got a mailman. He—he used to say to me all the time, he says, "How did you manage to do that up there? How did you manage to pull them kids out? I don't know if I could have done that." I said, "Well, you know what? Don't say you wouldn't do this, or you wouldn't do that until you're put in that situation."
JAD: In fact, when we asked Walter ...
JAD: How many nominations do you get a year? Are they hard to find?
WALTER RUTKOWSKI: No, they are not hard at all to find. We are fortunate to be living in a society—regardless of what you hear elsewhere, we are fortunate to be living in a society where people do look out for others, even strangers.
JAD: He told us they've even had to up their guidelines to make it harder to win.
WALTER RUTKOWSKI: Simply because of the vast number of heroic deeds that happen in day-to-day life.
[LISTENER: Hi there. This is Sven Svalis calling from London, England. Radiolab is supported in part by the National Science Foundation, and by the Alfred P. Sloan Foundation, enhancing public understanding of science and technology in the modern world. More information about Sloan at www.sloan.org. That's a bit of a tongue twister.]
JAD: Hey, this is Radiolab. I'm Jad Abumrad.
ROBERT: I'm Robert Krulwich. Oh.
JAD: [laughs] Yes. Would you like to say our topic, Robert?
ROBERT: Our topic today is goodness.
JAD: Niceness.
ROBERT: Or altruism, would be another bigger, fatter word.
JAD: Yep. And thus far, we've met a couple of folks, individuals who have struggled with altruism in some way.
ROBERT: Now we're going to sort of pull back and go from specifics to grand, global ...
JAD: Strategy.
ROBERT: Yes.
ROBERT AXELROD: Hello.
JAD: Hello, hello?
JAD: And we're gonna tell you a really cool story, we think, that begins with this guy.
ROBERT AXELROD: My name is Robert Axelrod. I'm the Walgreen Professor for the Study of Human Understanding in the Department of Political Science and the Ford School of Public Policy of the University of Michigan.
ROBERT: [laughs]
JAD: Wow, that's ...
ROBERT AXELROD: I know, that's a mouthful.
ROBERT: That was like your dean was, like, looking over you, says, "Say it all please. Say it all."
ROBERT AXELROD: Yeah. Well, you know, you—you could just say I'm a professor of public policy and political science or something.
ROBERT: Well, but before he was all of that, Axelrod, when he was in high school, he was one of those guys who just loved computers.
ROBERT AXELROD: Well, yes. In '59, 1960, I hung around the Northwestern University Computer Center.
ROBERT: '59, '60? So, what were—were those large pieces of furniture in refrigerated buildings?
ROBERT AXELROD: They were. In fact, the whole campus had one computer, and they let me use it for 15 minutes here and 15 minutes there.
JAD: And what would you do with the computer?
ROBERT AXELROD: What I did, I did a very simple computer simulation of hypothetical life forms and environments for a science project.
ROBERT: Really?
ROBERT AXELROD: Yeah.
ROBERT: You're a pre-geek is what you are.
JAD: Yes.
ROBERT: Before the word had been invented.
ROBERT AXELROD: I think you could say that.
ROBERT: But then in 1962, when Axelrod was down in a computer basement, I guess, somewhere, all over the world, everybody else was watching one of the great dramas in modern times ...
[ARCHIVE CLIP, John F. Kennedy: Good evening my fellow citizens.]
ROBERT: ... unfold.
ROBERT AXELROD: The Cuban Missile Crisis.
[ARCHIVE CLIP: John F. Kennedy: Within the past week, unmistakable evidence has established the fact that a series of offensive missile sites is now in preparation on that imprisoned island.]
ROBERT: And Axelrod started thinking about the dilemma we were in.
ROBERT AXELROD: Well, each side wants to spend more money buying missiles and things.
ROBERT: You know, we could build more bombs, but then they could build more bombs. It would be better if they would both stop, but if we stop and they don't ...
JAD: That would be bad.
ROBERT: Very bad.
ROBERT AXELROD: Yeah. And so I was interested in what's the—what were the conditions that would allow people to get out of this problem?
ROBERT: And then he starts thinking, "Well, wait. Maybe I could use my computer to help me figure out ..."
ROBERT AXELROD: What's a good strategy for this?
JAD: For something like the Cuban Missile Crisis?
ROBERT AXELROD: Well, yes. Right?
JAD: And what made you think that computers could help with that?
ROBERT AXELROD: Well, I came across a simple game called the Prisoner's Dilemma.
ANDREW ZOLLI: All right, let me go again.
JAD: Yeah, noise from the window.
JAD: Okay, so the Prisoner's Dilemma is a very famous thought experiment. It's a little tricky to describe, but I got a friend of mine, Andrew Zolli, who's written about the Prisoner's Dilemma in an upcoming book ...
ANDREW ZOLLI: Resilience: The Science of Why Things Bounce Back.
JAD: I got him to lay it out for me.
JAD: What is the Prisoner's Dilemma?
ANDREW ZOLLI: So imagine that two bank robbers are hanging out across the street from the First National Bank and the police pick them up. They've received a tip that these two guys are about to rob the bank.
JAD: Got it?
ROBERT: Yep.
JAD: So the cops take these two guys back to the station, do the whole Law and Order thing, put them in different rooms.
ANDREW ZOLLI: And they walk into each one—let's call them Lucky and Joe—and they say to Lucky, "We have enough to make sure that you go away for a six-month sentence."
JAD: But this is not really what the cops want. They want a longer sentence for one of these guys. So they make Lucky an offer.
ANDREW ZOLLI: "If you, Lucky, rat out Joe and Joe doesn't say anything, you will go free and Joe will go to jail for 10 years. If the reverse happens ..."
JAD: Meaning if you say nothing and Joe rats you out ...
ANDREW ZOLLI: "... you're going to jail for 10 years and he's gonna walk free."
JAD: If you both end up ratting on each other, you both get five ...
ANDREW ZOLLI: Five years.
JAD: Whereas if you both keep your mouth shut ...
ANDREW ZOLLI: You're each going to jail for six months for loitering.
JAD: So somehow, if Lucky and Joe could talk to each other, they both say "Don't speak."
ANDREW ZOLLI: Absolutely. But the big problem that Lucky and Joe have is they can't talk to each other.
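[To keep the numbers straight, here is the setup as a small Python sketch, with the payoffs exactly as laid out above; the only liberty is writing six months as 0.5 years. The loop at the end checks the point Jad is about to walk Robert through: whatever Joe does, Lucky's own sentence is shorter if he rats.]
```python
SILENT, RAT = "stay silent", "rat the other guy out"

# (Lucky's move, Joe's move) -> (Lucky's sentence, Joe's sentence), in years.
sentences = {
    (SILENT, SILENT): (0.5, 0.5),    # both keep quiet: six months each for loitering
    (RAT,    SILENT): (0.0, 10.0),   # Lucky rats, Joe stays quiet: Lucky walks, Joe gets ten years
    (SILENT, RAT):    (10.0, 0.0),   # the reverse
    (RAT,    RAT):    (5.0, 5.0),    # they rat on each other: five years apiece
}

# Whatever Joe does, Lucky is better off ratting:
for joes_move in (SILENT, RAT):
    quiet = sentences[(SILENT, joes_move)][0]
    rat   = sentences[(RAT, joes_move)][0]
    print(f"If Joe will {joes_move}: Lucky stays silent -> {quiet} yrs, Lucky rats -> {rat} yrs")
```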
JAD: All right, so you're Lucky.
ROBERT: Okay.
JAD: Okay? What do you do? Do you rat Joe out or not?
ROBERT: Do I know this guy?
JAD: Uh-uh.
ROBERT: At all?
JAD: I mean, you met for this one job, but tomorrow, you'll never see him again.
ROBERT: Ever?
JAD: Ever.
ROBERT: Well, like if I—if I knew him and I could trust him, then I think I know what I would do, but ...
JAD: You'd keep your mouth shut, because then you'd get six months, because he'd keep his mouth shut.
ROBERT: It would be a sweet thing.
JAD: Indeed.
ROBERT: But see, since I don't know him, what would happen if he rats me out?
JAD: You'd go to jail for 10 years, he'd go free. That bastard!
ROBERT: Ten years.
JAD: Yeah.
ROBERT: But if I rat him out, then the worst I get is ...
JAD: Five years.
ROBERT: Or, you know, I'd go away free. I'm totally free.
JAD: Do it, Krulwich. Say what's in your heart.
ROBERT: I guess I'm gonna—I'm throwing him under the bus, Jad.
JAD: Yes, throw him under.
ROBERT: What's his name again?
JAD: Joe.
ROBERT: Joe.
JAD: You see, he's already gone.
ROBERT: [laughs] He's—he's already—I don't even remember him. You're dead to me, Joe.
JAD: So you see, in this type of scenario where you don't know the guy, you have a very strong incentive ...
ANDREW ZOLLI: To rat the other guy out.
JAD: Or as the social scientist would say ...
ANDREW ZOLLI: To defect.
ROBERT AXELROD: That's right. If you play it only once, if you only meet somebody once, whatever the other guy does, you're better off defecting against them.
JAD: Just here on out, whenever you hear the word "defect," know that it means screw the other guy over.
ROBERT AXELROD: But the—the really interesting stuff happens if you play over and over again. If you're gonna meet the same people again.
JAD: Because now you're thinking, "Should I help this guy out the next time? If he screwed me, should I screw him?"
[ARCHIVE CLIP: John F. Kennedy: But this secret, swift, extraordinary buildup of Communist missiles ...]
ROBERT: What do you do? You want to cooperate, but you don't want to get screwed.
ARCHIVE CLIP: John F. Kennedy: ... which cannot be accepted by this country.]
JAD: Right.
STEVE STROGATZ: You know, these kind of thoughts were paramount in those days, because a Prisoner's Dilemma was being played between the two superpowers.
ROBERT: This is our friend, Steve Strogatz, the Cornell mathematician, who says at that time all kinds of folks ...
STEVE STROGATZ: Political scientists and economists and psychologists, mathematicians ...
JAD: ... were writing papers about the Prisoner's Dilemma.
ROBERT: Literally. And thinking, "Come on, we've got to be able to win this game if we're gonna play against the Russians. And we have to do it right."
ROBERT AXELROD: Exactly. But there was no consensus on the best way to do it. And so I was interested in what's a—what's a good strategy for this?
ROBERT: And that's when Robert Axelrod, sitting down there in the basement somewhere in the Midwest with the big computer, that's when he had his idea.
STEVE STROGATZ: His approach, which was really novel at the time, was to conduct a computer tournament.
JAD: A computer tournament!
STEVE STROGATZ: [laughs] Yeah.
ROBERT AXELROD: Invite the people that had come up with these different ideas to play with each other.
STEVE STROGATZ: In other words, what he said is, "All right, Mr. Wise Guy, you know, you've written so and so many articles on the Prisoner's Dilemma. You think you understand it. How about joining this tournament where you have to submit a program that will play Prisoner's Dilemma against a program submitted by the other experts? We'll have a round robin."
ROBERT AXELROD: Right. Try these different programs against each other.
ROBERT: So all these computer guys are brought to Caesar's Palace in—in Las Vegas and wear tuxedos, and they all sat down at the table?
ROBERT AXELROD: [laughs] No.
STEVE STROGATZ: It's a nice image, but what really happened was everyone submitted their programs to Axelrod.
ROBERT AXELROD: They would mail their entries to me. But there was a trophy.
ROBERT: There was a trophy. [laughs]
ROBERT AXELROD: I—so I wrote to people and I said, "If you win, I'll send you a trophy." You know, little plaque that says, "You won the computer tournament."
ROBERT: Okay, so here's the deal: Every program will play every other program 200 times. There will be points in each round, and then Axelrod will total the scores ...
ROBERT AXELROD: And see what actually worked.
ROBERT: By which he means in the long run, even if you lose some rounds here and there, one of these strategies is gonna beat all the others, meaning it'll let you survive, maybe even prosper. That's the game.
ROBERT AXELROD: That's right.
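[Here is a minimal Python sketch of that kind of round-robin tournament. It is not Axelrod's actual code: strategies are just functions that look at both players' histories and return a move, and the per-round point values (3 each for mutual cooperation, 1 each for mutual defection, 5 for a lone defector, 0 for the player who gets burned) are the ones conventionally associated with his tournament.]
```python
import itertools

C, D = "cooperate", "defect"
# Assumed per-round point values: (my move, their move) -> my points.
PAYOFF = {(C, C): 3, (C, D): 0, (D, C): 5, (D, D): 1}
ROUNDS = 200

def play_match(strategy_a, strategy_b, rounds=ROUNDS):
    """Play two strategies against each other for 200 rounds; return (score_a, score_b)."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)   # each strategy sees both histories
        move_b = strategy_b(history_b, history_a)
        history_a.append(move_a)
        history_b.append(move_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
    return score_a, score_b

def round_robin(entries):
    """Every entered program plays every other program; return total points per entry."""
    totals = {name: 0 for name in entries}
    for (name_a, strat_a), (name_b, strat_b) in itertools.combinations(entries.items(), 2):
        score_a, score_b = play_match(strat_a, strat_b)
        totals[name_a] += score_a
        totals[name_b] += score_b
    return totals
```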
JAD: And can you introduce us to some of the contestants?
STEVE STROGATZ: Yeah, so there was one program called Massive Retaliatory Strike.
ROBERT AXELROD: And the first move it just cooperates.
ROBERT: But then as soon as the other program doesn't cooperate ...
STEVE STROGATZ: It would then retaliate for the rest of the game.
JAD: Like, "Sorry, man, you blew it."
ROBERT AXELROD: "I'll never trust you again."
STEVE STROGATZ: Yeah. "That's it for you." This is like the way my wife is. Whenever a guy in her earlier life stood her up, that was it.
JAD: Game over.
STEVE STROGATZ: [laughs]
ROBERT: But there were also some trickier programs.
STEVE STROGATZ: I mean, some crafty ones try to make a model of the opponent.
ROBERT: Like he mentioned one that was called Tester.
STEVE STROGATZ: So Tester would—it would see what you were like. It would start by being mean, and then if you start retaliating, it backs off and says, "You know, oh chill. It's okay, man." And, you know, and then it starts cooperating for a while until it throws in another ...
ROBERT: Just to test the other guy, because after all it was called Tester.
ROBERT AXELROD: Yeah, so Tester is kind of designed to see how much it could get away with.
JAD: I mean, it sounds kind of sensible in a way. I mean, mean, but ...
ROBERT AXELROD: Well, but if you see—if you think about what happens if these two players play each other.
ROBERT: If Tester plays Massive Retaliation 200 times ...
ROBERT AXELROD: Pretty soon the Tester will defect and then Massive Retaliation will never cooperate again.
ROBERT: Screw you, pal.
JAD: Oh no, screw you.
ROBERT: Screw me? I'll take you.
JAD: You come here!
ROBERT AXELROD: So in fact, they do very badly. Both of them.
ROBERT: When—when you're sitting there, did you have a hunch as to which would be the most successful program?
ROBERT AXELROD: Well, I didn't know, which is why I wanted to do it, but I did have a hunch that thousands or tens of thousands of lines of code would be needed to have a pretty competent program.
ROBERT: So when the mailman delivers the fattest envelope to your house, you're like, "This could be the one!"
ROBERT AXELROD: Well, yes. Right. Now it didn't turn out that way.
ROBERT: When it was all said and done, when he loaded all the programs into the computer, when they'd all played each other 200 times, the program that won ...
ROBERT AXELROD: It's really two lines of code.
JAD: Two lines of code?
ROBERT AXELROD: Yeah, it's got a simple name, it's called Tit for Tat.
ROBERT: First line of code. Be ...
STEVE STROGATZ: Nice.
JAD: Nice?
STEVE STROGATZ: Yeah, nice. "Nice" is a technical word in this game. Nice means I never am nasty first.
ROBERT AXELROD: And after that ...
ROBERT: Second line of code.
ROBERT AXELROD: It just does what the other player did on the previous move.
ROBERT: Oh!
ROBERT AXELROD: So if the other player has just cooperated, it'll cooperate. And if the other player has just defected, it'll defect.
STEVE STROGATZ: It retaliates on the next move. Couldn't be clearer. On the other hand, it only retaliates that one time. I mean, unless provoked further. It does its retaliation and now bygones are bygones, and that's it.
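[Plugged into the sketch above, the whole strategy really does come to about two lines. The other entries here, an always-nice program, an always-nasty program, and a never-forgive program like the ones described in this conversation, are included for comparison; the names are ours.]
```python
def tit_for_tat(my_history, their_history):
    if not their_history:        # line one: never be nasty first
        return C
    return their_history[-1]     # line two: do whatever the other player did last time

# The other characters from this conversation, for comparison:
def jesus(my_history, their_history):
    return C                     # cooperates on every turn, no matter what

def lucifer(my_history, their_history):
    return D                     # defects on every turn, no matter what

def massive_retaliation(my_history, their_history):
    return D if D in their_history else C    # nice until crossed once, then never forgives
```
[These drop straight into the round_robin sketch earlier; which program comes out on top depends on who else is in the field, which is part of the point of running the tournament at all.]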
JAD: So wait, how exactly did it win? I mean, can you give us a sense of—of why it won?
STEVE STROGATZ: Okay, so let's suppose—here, let's take an extreme case of some very simple programs. One of them I'll call Jesus.
ROBERT: [laughing] Just for the sake of argument.
STEVE STROGATZ: Just for the sake of a name. Now the Jesus program cooperates on every turn. That is it's always—it's always, you know ...
ROBERT: Good.
STEVE STROGATZ: Yes.
JAD: So the Jesus program is a simple algorithm that says always be good.
ROBERT: Good, good, good, good, good.
STEVE STROGATZ: That's right. And let's say the other program is the Lucifer program, which no matter what, always is bad.
JAD: Okay, these are your two extremes, says Steve.
ROBERT: And, of course, most programs—and most people—fall somewhere in the middle.
JAD: Right.
ROBERT: But, in Tit for Tat, you've got a strategy that can swing both ways.
JAD: For instance, with Jesus, Tit for Tat starts by cooperating, as does Jesus.
STEVE STROGATZ: And then they're gonna keep cooperating for the whole 200 rounds.
ROBERT: Which is, you know ...
JAD: Good.
STEVE STROGATZ: But now, let's suppose it plays Lucifer.
JAD: Where there's no chance to cooperate.
ROBERT: Then, says Steve, Tit For Tat just plays good defense. So when Lucifer does his thing, Tit for Tat retaliates. And they pretty much keep doing that and stay even, so in other words ...
STEVE STROGATZ: It's—it's a very robust program. It elicits cooperation if the opponent has any inclination to cooperate, but it doesn't take any guff.
JAD: And it wins. So you might say in evolutionary terms, this program ...
ROBERT: Is the fittest.
STEVE STROGATZ: So actually, Axelrod played an evolutionary version of his tournament. That is, he had these programs, after they played their tournament, get a chance to reproduce copies of themselves according to how well they did.
JAD: You mean the winners would get to have more babies?
STEVE STROGATZ: Yeah.
JAD: And then would the babies play each other?
STEVE STROGATZ: Yeah, he ran them again. I mean, he ran them for many generations. And so, like, suppose you have a world of Lucifers, and there are a few Tit For Tat players out there. Can they thrive, can cooperation emerge in this horribly hostile world?
JAD: Wow, what an interesting question!
STEVE STROGATZ: So he looked at that, and the answer was if you have enough of them so that they have enough chance of meeting each other, they can actually invade and take over the world, even if the world starts horribly mean. I mean, what—what I take to be the big message, though, I mean, what—what always sent chills down my spine is that we see this version of morality around the world. You know, be upright, forgiving, but retaliatory. I mean, that sounds to me like the Old Testament.
JAD: Huh!
STEVE STROGATZ: It's not turn the other cheek, it's an eye for an eye. But not 10 eyes for an eye. And to think that it's not something that's handed down by our teachers, or by God, but that it's something that came from biology, I like that argument personally.
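[A sketch of the evolutionary rerun Steve describes above, using the same assumed point values and the play_match function from the earlier code, not Axelrod's actual procedure: each generation, a strategy's share of the population grows in proportion to how well it scores against the current mix, copies of itself included. With these numbers, even a Tit for Tat cluster of half a percent eventually takes over a world of Lucifers.]
```python
def evolve(entries, shares, generations=500):
    """Each generation, a strategy's share grows in proportion to its
    average score against the current population mix (self-play included)."""
    names = list(entries)
    # Average score per pairing, computed once: score of a when playing b.
    score = {a: {b: play_match(entries[a], entries[b])[0] for b in names} for a in names}
    for _ in range(generations):
        fitness = {a: sum(shares[b] * score[a][b] for b in names) for a in names}
        mean = sum(shares[a] * fitness[a] for a in names)
        shares = {a: shares[a] * fitness[a] / mean for a in names}
    return shares

# A world that is 99.5 percent Lucifer, with a tiny Tit for Tat cluster:
print(evolve({"Tit for Tat": tit_for_tat, "Lucifer": lucifer},
             {"Tit for Tat": 0.005, "Lucifer": 0.995}))
```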
ROBERT: Hmm. From biology. Now do we know whether the math has anything to do with real people in real life situations, or are we just abstracting behavior? Is this wise, or is this just math?
STEVE STROGATZ: This is what's so impressive to me about Axelrod's work, so, he's not just playing math games, he—he tries to tie this to history and politics, as we'll see.
ROBERT AXELROD: I like to scan journals. One of my—wouldn't say it's a pastime because it's part of my profession. But, I came across a book called The Live and Let Live System in World War I.
JAD: So here's where we jump away from the math and the computer tournaments, and into something very real.
STANLEY WEINTRAUB: The war began late in July, 1914.
JAD: That's Stan.
STANLEY WEINTRAUB: Stanley Weintraub.
JAD: Expert in World War I.
STANLEY WEINTRAUB: Evan Pugh Professor Emeritus at Penn State.
JAD: And the story that Stan is gonna help us tell takes place on what was called the Western Front, which was basically these two lines of trenches.
STANLEY WEINTRAUB: Very close to each other, a few hundred yards apart.
JAD: And they stretched for hundreds of miles. And that fall ...
STANLEY WEINTRAUB: In November the weather turned bad. Heavy rains, then it became icy, then slush and then snow. It became disgusting because the trenches also were filled with rats.
JAD: Rats?
STANLEY WEINTRAUB: Rats. The rats went after not only the food but after corpses.
JAD: And it was oddly in this miserable, disgusting hellhole that something quite amazing happened. No one quite knows how it started, but one day, maybe around daybreak, let's say, while the two sides were fighting, some of the British soldiers ...
STANLEY WEINTRAUB: Stopped firing long enough to have breakfast.
JAD: And as they were eating, they noticed, hmm, the Germans stopped too, to have their breakfast. And when they were both done ...
STANLEY WEINTRAUB: They'd begin firing again.
JAD: Next morning, same thing. British take their breakfast break at about the same time. The Germans do the same thing. Morning after that, the same thing, and then the next. And after a while ...
ROBERT AXELROD: Both sides caught on that if they didn't interrupt the other one, then they wouldn't be interrupted.
[ARCHIVE CLIP: On the whole, there is silence.]
JAD: This is from a letter a British soldier sent home to his wife at the time.
[ARCHIVE CLIP: After all, if you prevent your enemy from drawing his rations, his remedy is simple: He will prevent you from drawing yours.]
JAD: When Axelrod read this ...
ROBERT AXELROD: I thought, "Gee, this sounds very familiar."
JAD: Line one of Tit for Tat: Be nice first. Now the Brits probably didn't mean to be nice first when they started the breakfast truce. But it happened. And then the Germans reciprocated, which is line two. Now keep in mind, these two sides are at war. And implicit in line two is a threat. If you mess with me, I'm gonna mess with you.
ROBERT AXELROD: Well, think about snipers, for example. So there's letters where they explain where the snipers would shoot at a tree over and over and over again, showing that, in fact, they were really accurate. Meaning that if they wanted to kill you, they'd get you.
JAD: And this was going on during the breakfast truce. And these little agreements, you know, like I'm gonna be nice to you, but I could kick your ass, don't forget, well, these little truces spread all up and down the Western Front until things really changed. Fast forward to December, Christmas Eve.
STANLEY WEINTRAUB: The climate was just about freezing on Christmas Eve, and the Germans had a tradition of tabletop Christmas trees, small trees.
JAD: For weeks, he said, the German government had been ...
STANLEY WEINTRAUB: Shipping small trees, literally to the trenches, hundreds and hundreds of trees.
JAD: And that night, on Christmas Eve ...
STANLEY WEINTRAUB: At dusk, the Germans began putting up their trees ...
JAD: Mounted them on the rim of their trench ...
STANLEY WEINTRAUB: And lit candles on them, singing a Christmas carol. The British, who might have been no more than 50 or 70 yards away, crawled forward into no man's land to see better.
JAD: And then they were spotted. Here's a letter from a German soldier sent home to his family which describes what happened next.
[ARCHIVE CLIP: I shouted to our enemies that we didn't wish to shoot. I said we could speak to each other. At first, there was silence ...]
JAD: And then, very slowly, out of the darkness, the British guys approached.
[ARCHIVE CLIP: And so we came together and shook hands.]
PAT WALTERS: This—see this is where I start to think, "Are you making this up?" Because this is where it starts to sound sort of crazy to me.
JAD: That's Pat Walters, our producer.
STANLEY WEINTRAUB: It sounds as if this is being made up. And the result was for many decades people assumed that this was just myth. It couldn't possibly have happened. But we know what had happened because we have the letters that the British and the Germans sent back home. We know that they met in darkness and decided "Why don't we have a truce in the morning?"
JAD: Next morning, thousands of soldiers put down their rifles, climbed out of their trenches into no man's land, and started hanging out with each other.
[ARCHIVE CLIP: A lot of us went over and talked to them and this lasted the whole morning. I talked to several of them, and I must say they seemed extraordinarily fine men.]
JAD: Soldiers got together, started fires, cooked Christmas dinners ...
STANLEY WEINTRAUB: Swapped presents and drank.
JAD: The Germans hauled out these enormous ...
STANLEY WEINTRAUB: Barrels of beer.
JAD: They traded stuff ...
STANLEY WEINTRAUB: Cigars and trinkets ...
JAD: Even helped one another ...
STANLEY WEINTRAUB: Bury the dead.
JAD: And in some places on the Western Front, this period of goodwill lasted a whole week. But then the generals found out.
ROBERT AXELROD: They were very angry about this and they said, "We didn't send you to the front to be nice to the other guys, we sent you to kill them."
ROBERT: If the general says, "Hey, I want you to shoot those Germans, that's an order."
ROBERT AXELROD: Well then they would ...
ROBERT: Wouldn't that ...
ROBERT AXELROD: "Oh, geez. Sorry, General, I missed but I'll try again better next time." ROBERT: I see.
ROBERT AXELROD: That's the way the generals finally figured out how to disrupt this whole thing is they would say, "Okay, you guys go out on a raid and I want you to bring back a prisoner or a corpse."
JAD: In other words, "Show me a scalp. That's an order."
ROBERT AXELROD: And that messed things up royally.
JAD: Here's a letter from a British soldier whose unit contained a band—which was apparently pretty common. He writes about one of the moments when the truce vanished.
[ARCHIVE CLIP: At six minutes to midnight, the band opened with Die Wacht am Rhein.]
JAD: Which is a German patriotic anthem. So some of the Germans, according to this letter, climbed up onto the rim of their trench to listen to this English band playing their song.
[ARCHIVE CLIP: Then as the last note sounds, every grenade-firing rifle, trench mortar and bomb-throwing machine let fly simultaneously into the German trench.]
JAD: So you can imagine the Germans that weren't killed would have felt betrayed. They had just been hanging out with these guys, and the next night they would have attacked back. And the British would have attacked them back. And then the Germans would have retaliated against them. And on and on and on ...
ROBERT AXELROD: And it would kind of echo back and forth forever.
JAD: And that's what happened.
STANLEY WEINTRAUB: There were immense casualties, as many as 50,000 casualties in a day.
JAD: "And this," says Axelrod, "is where you see sort of the dark side of Tit for Tat."
ROBERT AXELROD: One of the weaknesses of the Tit for Tat strategy, or one of the problems with it, is these echoes.
JAD: Not just echoes of good, obviously, but echoes of violence.
ROBERT AXELROD: Could get bad. So what I found, though, was that instead of playing pure Tit for Tat where you always defect if the other guy defects ...
JAD: There are certain circumstances, he says—and this I find completely fascinating—where you want to modify that second line of code so that you're not always retaliating, you're nearly always retaliating.
ROBERT AXELROD: Right. If you were a little bit generous, which—by which I mean, say, 10 percent of the time you don't defect, then what happens is that these echoes will stop. And I would call that generous Tit for Tat.
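[Here is a sketch of the generous variant Axelrod describes: the same copy-the-last-move rule, except that a defection is forgiven some fraction of the time, 10 percent here, echoing his figure. Starting from a single accidental defection, pure Tit for Tat retaliates every round forever, while the generous version lets the echo die out. The match setup, random seed, and parameters are illustrative assumptions.]

```python
# A sketch of generous Tit for Tat: copy the opponent's last move, but
# forgive a defection about 10 percent of the time so retaliation echoes stop.
import random

def generous_tit_for_tat(their_history, forgiveness=0.1):
    """Cooperate first; retaliate against a defection only 90% of the time."""
    if not their_history:
        return "C"
    if their_history[-1] == "D" and random.random() < forgiveness:
        return "C"   # forgive, letting the echo die out
    return their_history[-1]

def echo_after_one_defection(strategy, rounds=50, seed=1):
    """Start two copies of a strategy just after one accidental defection,
    then count how many defections follow."""
    random.seed(seed)
    history_a, history_b = ["D"], ["C"]   # A defected once; B did not
    defections = 0
    for _ in range(rounds):
        move_a = strategy(history_b)
        move_b = strategy(history_a)
        history_a.append(move_a)
        history_b.append(move_b)
        defections += (move_a == "D") + (move_b == "D")
    return defections

pure = lambda their_history: generous_tit_for_tat(their_history, forgiveness=0.0)
print("pure Tit for Tat:    ", echo_after_one_defection(pure))
print("generous Tit for Tat:", echo_after_one_defection(generous_tit_for_tat))
```

[With no forgiveness, the two sides trade exactly one retaliation per round for all 50 rounds. With 10 percent forgiveness, the first pardon puts both sides back on cooperation and the defections stop, typically within about ten rounds.]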
JAD: So this is kind of interesting. Like, we started with Moses, you know, an eye for an eye. But here it's saying, maybe for every nine parts Moses you need one part Jesus, you know?
ROBERT: Meaning, like, turn the other cheek?
JAD: Turn the other cheek.
ROBERT: It sounds like you've described like a cooking recipe or something.
JAD: Well ...
ROBERT: Like, nine parts one thing ...
JAD: Yeah. I mean if you abstract it, it's kind of a recipe. It's a recipe for life.
ROBERT: But it isn't a recipe. That ignores the deep fact of it. Look, if I were punching you in the face right now, what are you gonna do?
JAD: I'm gonna punch you back.
ROBERT: Yeah, and I'm gonna punch you back. You punch me back ...
JAD: I punch you back.
ROBERT: Then we're in pain.
JAD: Yeah.
ROBERT: And somehow, in the middle of being blasted by my powerful fist, you have to come up with the moral courage to say "I think I'm gonna kiss this guy now." And that is not—as you well know, that is not an easy thing to do.
JAD: All right, but you're making it all personal. My point is if you zoom out, this is a strategy that just seems to be woven into the fabric of the cosmos. It works for computers, it works for people, it probably works for amoeba, okay? It just works.
ROBERT: And you think that exists on some higher plane?
JAD: I do, I do.
ROBERT: I don't. I think this is still, as you just called it, very personal. I think a person has to choose to be kind.
JAD: All right, I'm gonna make that choice right now then, okay? Even though you're irritating me. I'm gonna say to you, "Robert, you look very nice today."
ROBERT: You know what I'm gonna do to you? [thump]
JAD: [laughs] All right, enough of this. Radiolab.org is our online home. You can read lots of stuff there, and you can subscribe to our podcast.
ROBERT: It's www dot ...
JAD: That's implied.
ROBERT: Yeah.
[STEVE STROGATZ: Hi, this is Steve Strogatz. Radiolab is produced by Jad Abumrad and Pat Walters.]
[ROBERT AXELROD: Our staff includes Soren Wheeler, Ellen Horne, Tim Howard, Brenna Farrell, and Lynn Levy.]
[ANNA PRICE: With help from Abby Wendel ...]
[KATHLEEN PRICE: And Douglas Smith.]
[STEVE STROGATZ: Special thanks to Nick Capodice, Graham Parker, Annu Neumann and Meg Bowles.]
[ANSWERING MACHINE: End of mailbox.]
-30-
Copyright © 2024 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of programming is the audio record.