
Sep 17, 2019
Transcript
[RADIOLAB INTRO]
JAD ABUMRAD: Hey, this is Radiolab. I'm Jad Abumrad.
ROBERT KRULWICH: I'm Robert Krulwich. Oh!
JAD: Would you like to say our topic, Robert?
ROBERT: Our topic today is goodness ...
JAD: Hey, I'm Jad. So last week we played you a show that is one of our all-time favorites. It was three different stories about different kinds of collisions between people, between moral philosophies, between right and wrong. Even right and left. Anyhow, this week we want to continue the thoughts we had going in that show for one more step. We're gonna play part of a different show that takes the strategies that came up in those kinds of showdowns last week between people, and plays them out on a geopolitical scale.
ROBERT: Grand global ...
JAD: Strategy.
ROBERT: Yes.
ROBERT AXELROD: Hello.
JAD: Hello, hello?
JAD: And we're gonna tell you a really cool story, we think. That begins with this guy.
ROBERT AXELROD: My name is Robert Axelrod. I'm the Walgreen Professor for the Study of Human Understanding in the Department of Political Science and the Ford School of Public Policy of the University of Michigan.
ROBERT: [laughs]
ROBERT AXELROD: I know that's a mouthful.
ROBERT: That was like your Dean was, like, looking over you saying, "Say it all, please!"
ROBERT AXELROD: Oh well, you know, you could just say I'm a professor of public policy and political science or something.
ROBERT: Well, but before he was all that. Axelrod when he was in high school, he was one of those guys who just loved computers.
ROBERT AXELROD: Well, yes. In '59 and 1960, I hung around the Northwestern University Computer Center.
ROBERT: '59, '60? So what, were those large pieces of furniture in refrigerated buildings?
ROBERT AXELROD: They were. In fact, the whole campus had one computer, and they let me use it for 15 minutes here and 15 minutes there.
JAD: And what would you do with—with the computer?
ROBERT AXELROD: What I did, I did a very simple computer simulation of hypothetical life forms and environments for a science project.
JAD: Ah.
ROBERT: Really?
ROBERT AXELROD: Yeah.
ROBERT: You're a pre-geek, is what you are.
JAD: Yes.
ROBERT: Before the word had been invented.
ROBERT AXELROD: I think you could say that.
ROBERT: But then in 1962, when Axelrod was down in the computer basement I guess somewhere, all over the world everybody else was watching one of the great dramas in modern times ...
[ARCHIVE CLIP, John F. Kennedy: Good evening, my fellow citizens ...]
ROBERT: ... unfold.
ROBERT AXELROD: The Cuban missile crisis.
[ARCHIVE CLIP, John F. Kennedy: Within the past week, unmistakable evidence has established the fact that a series of offensive missile sites is now in preparation on that imprisoned island.]
ROBERT: And Axelrod started thinking about the dilemma we were in.
ROBERT AXELROD: Well, each side wants to spend more money buying missiles and things.
ROBERT: You know, we could build more bombs, but then they could build more bombs. It would be better if they would both stop. But if we stop and they don't ...
JAD: That'd be bad.
ROBERT: Very bad.
ROBERT AXELROD: Yeah. And so I was interested in what's a—what would the conditions be that would allow people to get out of this problem.
ROBERT: And then he starts thinking, "Well, wait. Maybe I could use my computer to help me figure out ..."
ROBERT AXELROD: What's a good strategy for this?
JAD: For something like the Cuban missile crisis?
ROBERT AXELROD: Well, yes. Right?
JAD: And what made you think that computers could help with that?
ROBERT AXELROD: Well, I came across a simple game called the prisoner's dilemma.
JAD: Okay, so the prisoner's dilemma is a very famous thought experiment. It's a little tricky to describe, but I got a friend of mine, Andrew Zolli, who's written about the prisoner's dilemma in an upcoming book ...
ANDREW ZOLLI: Resilience: The Science of Why Things Bounce Back.
JAD: I got him to lay it out for me.
JAD: What is the prisoner's dilemma?
ANDREW ZOLLI: So imagine that two bank robbers are hanging out across the street from the First National Bank, and the police pick them up. They've received a tip that these two guys are about to rob the bank.
JAD: Got it?
ROBERT: Yup.
JAD: So the cops take these two guys back to the station, do the whole Law & Order thing. Put them in different rooms.
ANDREW ZOLLI: And they walk into each one—let's call them Lucky and Joe. And they say to Lucky, "We have enough to make sure that you go away for a six-month sentence."
JAD: But this is not really what the cops want. They want a longer sentence for one of these guys, so they make Lucky an offer.
ANDREW ZOLLI: If you, Lucky, rat out Joe, and Joe doesn't say anything, you will go free and Joe will go to jail for 10 years. If the reverse happens ...
JAD: Meaning if you say nothing and Joe rats you out ...
ANDREW ZOLLI: You're going to jail for 10 years and he's gonna walk free.
JAD: If you both end up ratting on each other, you both get five.
ANDREW ZOLLI: Five years.
JAD: Whereas if you both keep your mouth shut ...
ANDREW ZOLLI: You're each going to jail for six months for loitering.
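[EDITOR'S NOTE: For readers who want the structure on the page, here is a minimal sketch in Python of the payoffs Zolli just described. The move names and the months-based scoring are our framing, not anything from the show.]

```python
# The four outcomes Zolli lays out, as (Lucky's sentence, Joe's sentence)
# in months. "silent" = keep your mouth shut; "rat" = rat the other guy out.
PAYOFFS = {
    ("silent", "silent"): (6, 6),     # both keep quiet: six months each
    ("rat", "silent"): (0, 120),      # Lucky rats, Joe stays quiet: Lucky walks
    ("silent", "rat"): (120, 0),      # Joe rats, Lucky stays quiet: ten years
    ("rat", "rat"): (60, 60),         # both rat: five years each
}

def sentences(lucky_move, joe_move):
    """Jail time each robber receives, in months."""
    return PAYOFFS[(lucky_move, joe_move)]

# Whatever Joe does, Lucky serves less time by ratting (0 < 6, and 60 < 120).
# That one-sided incentive is the whole dilemma.
print(sentences("silent", "silent"))  # (6, 6)
print(sentences("rat", "rat"))        # (60, 60)
```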
JAD: So somehow if Lucky and Joe could talk to each other, they'd both say, "Don't speak."
ANDREW ZOLLI: Absolutely. But the big problem that Lucky and Joe have is they can't talk to each other.
JAD: All right. So you're Lucky, okay? What do you do? Do you rat Joe out or not?
ROBERT: Do I know this guy?
JAD: Uh-uh.
ROBERT: At all?
JAD: I mean, you met for this one job but tomorrow you'll never see him again.
ROBERT: Ever.
JAD: Ever.
ROBERT: Well, like, if I knew him and I could trust him, then I think I know what I would do.
JAD: You'd keep your mouth shut, because then you'd get six months.
ROBERT: Yeah, of course.
JAD: Because he'd keep his mouth shut.
ROBERT: It would be a sweet thing.
JAD: Indeed.
ROBERT: But see, since I don't know him, what would happen if he rats me out?
JAD: You'd go to jail for 10 years and he'd go free, that bastard!
ROBERT: 10 years.
JAD: Yeah.
ROBERT: But if I rat him out, then the worst I get is ...
JAD: Five years.
ROBERT: Or, you know, I go away free. I'm totally free.
JAD: Do it, Krulwich. Say what's in your heart.
ROBERT: I'm gonna—I'm throwing him under the bus, Jad.
JAD: Yes. Throw him under.
ROBERT: What's his name again?
JAD: Joe.
ROBERT: Joe.
JAD: You see? He's already gone.
ROBERT: He's already—I don't even remember him. You're dead to me, Joe.
JAD: So you see, in this type of scenario where you don't know the guy, you have a very strong incentive ...
ANDREW ZOLLI: To rat the other guy out.
JAD: Or as the social scientist would say ...
ANDREW ZOLLI: To defect.
ROBERT AXELROD: That's right. If you play it only once, if you only meet somebody once, whatever the other guy does you're better off defecting against them.
JAD: From here on out, whenever you hear the word 'defect,' know that it means 'screw the other guy over.'
ROBERT AXELROD: But the—the really interesting stuff happens if you play over and over again, if you're gonna meet the same people again.
JAD: Because now you're thinking, "Should I help this guy out the next time? If he screwed me should I screw him?"
[ARCHIVE CLIP, John F. Kennedy: But this secret, swift, extraordinary buildup of communist missiles ...]
ROBERT: What do you do? You want to cooperate, but you don't want to get screwed.
[ARCHIVE CLIP, John F. Kennedy: ... which cannot be accepted by this country.]
JAD: Right.
STEVE STROGATZ: You know, these kind of thoughts were paramount in those days because a prisoner's dilemma was being played between the two superpowers.
ROBERT: This is our friend Steve Strogatz, the Cornell mathematician, who says at that time all kinds of folks ...
STEVE STROGATZ: Political scientists and economists and psychologists, mathematicians ...
JAD: Were writing papers about the prisoner's dilemma.
ROBERT: Literally. And thinking, "Come on, we've got to be able to win this game if we're gonna play against the Russians. And we have to do it right."
ROBERT AXELROD: Exactly. But there was no consensus on the best way to do it. And so I was interested in what's a—what's a good strategy for this.
ROBERT: And that's when Robert Axelrod, sitting down there in the basement somewhere in the Midwest with a big computer, that's when he had his idea.
STEVE STROGATZ: His approach, which was really novel at the time, was to conduct a computer tournament.
JAD: A computer tournament?
STEVE STROGATZ: Yeah.
ROBERT AXELROD: And invite the people that had come up with these different ideas to play with each other.
STEVE STROGATZ: In other words, what he said is "All right, Mr. Wise Guy, you know, you've written so and so many articles on the prisoner's dilemma, you think you understand it. How about joining this tournament where you have to submit a program that will play prisoner's dilemma against programs submitted by the other experts? We'll have a round robin."
ROBERT AXELROD: Right. Try these different programs against each other.
ROBERT: So all these computer guys are brought to Caesar's Palace in Las Vegas and they all wear tuxedos and they're all sat down at a table?
ROBERT AXELROD: No.
STEVE STROGATZ: It's a nice image, but what really happened was everyone submitted their programs to Axelrod.
ROBERT AXELROD: They would mail their entries to me, but there was a trophy.
ROBERT: There was a trophy. [laughs]
ROBERT AXELROD: So I wrote to people and I said, "If you win, I'll send you a trophy and a little plaque that says you won the computer tournament."
ROBERT: Okay, so here's the deal: every program will play every other program 200 times. There will be points in each round, and then Axelrod will total the scores ...
STEVE STROGATZ: And see what actually worked.
ROBERT: By which he means in the long run, even if you lose some rounds here and there, one of these strategies is gonna beat all the others. Meaning it'll let you survive, maybe even prosper. That's the game.
ROBERT AXELROD: That's right.
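[EDITOR'S NOTE: A sketch of the tournament mechanics just described: every entry plays every other entry 200 times, points accumulate each round, and the totals decide the winner. The per-round values here, 3 each for mutual cooperation, 1 each for mutual defection, and 5 versus 0 for a one-sided defection, are the ones Axelrod used; the strategy interface is our simplification, and details of the real tournament (self-play, the RANDOM entry) are omitted.]

```python
import itertools

# Points per round: both cooperate, 3 each; both defect, 1 each;
# a lone defector gets 5 and the player who gets suckered gets 0.
SCORE = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
         ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play_match(strat_a, strat_b, rounds=200):
    """Two strategies play each other 200 times; return total points."""
    hist_a, hist_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        move_a = strat_a(hist_a, hist_b)  # each strategy sees its own
        move_b = strat_b(hist_b, hist_a)  # history, then the opponent's
        pts_a, pts_b = SCORE[(move_a, move_b)]
        total_a += pts_a
        total_b += pts_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return total_a, total_b

def round_robin(entries):
    """Every program plays every other program; total the scores."""
    totals = {name: 0 for name in entries}
    for (name_a, strat_a), (name_b, strat_b) in itertools.combinations(
            entries.items(), 2):
        pts_a, pts_b = play_match(strat_a, strat_b)
        totals[name_a] += pts_a
        totals[name_b] += pts_b
    return totals
```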
JAD: And can you introduce us to some of the contestants?
ROBERT AXELROD: Yeah.
STEVE STROGATZ: So there was one program called Massive Retaliatory Strike.
ROBERT AXELROD: And the first move it just cooperates.
ROBERT: But then as soon as the other program doesn't cooperate ...
STEVE STROGATZ: It would then retaliate for the rest of the game.
JAD: Like, "Sorry, man. You blew it."
ROBERT AXELROD: "I will never trust you again."
STEVE STROGATZ: Yeah. That's it for you. This is like the way my wife is. Whenever a guy in her earlier life stood her up, that was it.
JAD: Game over.
STEVE STROGATZ: [laughs]
ROBERT: But there were also some trickier programs.
STEVE STROGATZ: I mean, some crafty ones tried to make a model of the opponent.
ROBERT: Like he mentioned one that was called Tester.
STEVE STROGATZ: So Tester would—it would see what you were like. It would start by being mean, and then if you start retaliating it backs off and says, "You know, whoa! Chill out. It's okay, man." And, you know, and then it starts cooperating for a while. Until it throws in another ...
ROBERT: Just to test the other guy. Because after all, it was called Tester.
ROBERT AXELROD: Yeah. So Tester is kind of designed to see how much it could get away with.
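[EDITOR'S NOTE: The two contestants just described, written against the interface of the tournament sketch above. The show doesn't give the entries' exact logic, so the probing schedule below is an illustrative guess.]

```python
def massive_retaliatory_strike(my_hist, their_hist):
    """Cooperate until the opponent defects once; then defect forever."""
    return "D" if "D" in their_hist else "C"

def tester(my_hist, their_hist):
    """Start mean to see what it can get away with; back off if punished."""
    if not my_hist:
        return "D"                 # open with a probe
    if "D" in their_hist:
        return their_hist[-1]      # got retaliation: just mirror from here on
    # Never punished yet: mostly cooperate, but throw in another test
    # every sixth move.
    return "D" if len(my_hist) % 6 == 0 else "C"

# Run play_match(tester, massive_retaliatory_strike) and, as Axelrod says,
# both do very badly: Tester's opening defection trips Massive Retaliatory
# Strike, which never cooperates again, and the pair grind out 1 point a
# round for nearly the whole game.
```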
JAD: I mean, it sounds kind of sensible in a way. I mean, mean, but ...
ROBERT AXELROD: Well, but if you see—if you think about what happens if these two players play each other ...
ROBERT: If Tester plays Massive Retaliation 200 times ...
ROBERT AXELROD: Pretty soon the Tester will defect and then Massive Retaliation will never cooperate again.
ROBERT: "Screw you, pal!"
JAD: "Ah, no. Screw you!"
ROBERT: "Screw you!"
JAD: "Let's go!"
ROBERT AXELROD: In fact, they would do very badly. Both of them.
ROBERT: When you're sitting there, did you have a hunch as to which would be the most successful program, or were you ...
ROBERT AXELROD: Well, I didn't know, which is why I wanted to do it. But I did have a hunch that thousands or tens of thousands of lines of code would be needed to have a pretty competent program.
ROBERT: So when the mailman delivers the fattest envelope to your house you're like, "This could be the one!"
ROBERT AXELROD: Well, yes. Right? Now it didn't turn out that way.
ROBERT: When it was all said and done, when he loaded all the programs into the computer, when they'd all played each other 200 times, the program that won?
ROBERT AXELROD: It's really two lines of code.
JAD: Two lines of code?
ROBERT AXELROD: Yeah, it's got a simple name. It's called Tit For Tat.
ROBERT: First line of code? Be ...
STEVE STROGATZ: Nice.
JAD: Nice?
STEVE STROGATZ: Yeah, nice. Nice is a technical word in this game. Nice means I never am nasty first.
ROBERT AXELROD: And after that ...
ROBERT: Second line of code?
ROBERT AXELROD: It just does what the other player did on the previous move.
JAD: Oh!
ROBERT AXELROD: So if the other player has just cooperated, it'll cooperate. And if the other player has just defected, it'll defect.
STEVE STROGATZ: It retaliates on the next move. Couldn't be clearer. On the other hand, it only retaliates that one time. I mean, unless provoked further. It does its retaliation and now bygones are bygones and that's it.
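[EDITOR'S NOTE: In the interface of the sketches above, the winning entry really is this short. This is a paraphrase of the Tit For Tat rules as described on the show, not the original submitted code.]

```python
def tit_for_tat(my_hist, their_hist):
    """Line one: be nice first. Line two: do what they did last time."""
    return their_hist[-1] if their_hist else "C"
```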
JAD: So, wait. How exactly did it win? I mean, can you give us a sense of why it won?
STEVE STROGATZ: Okay, so let's suppose—here, let's take an extreme case of some very simple programs. One of them I'll call Jesus.
ROBERT: [laughs] Just for the sake of argument.
STEVE STROGATZ: I couldn't think of a name. Now the Jesus program cooperates on every turn. That is, it's always—it's always, you know ...
ROBERT: Good.
STEVE STROGATZ: Yes.
JAD: So the Jesus program is a simple algorithm that says, "Always be good."
ROBERT: Good, good, good, good, good.
STEVE STROGATZ: That's right. And let's say the other program is the Lucifer program, which no matter what always is bad.
JAD: Okay.
JAD: These are your two extremes, says Steve.
ROBERT: And of course, most programs and most people fall somewhere in the middle.
JAD: Right.
ROBERT: But in Tit For Tat, you got a strategy that can swing both ways.
JAD: For instance with Jesus, Tit For Tat starts by cooperating, as does Jesus.
STEVE STROGATZ: And then they're gonna keep cooperating for the whole 200 rounds.
ROBERT: Which is, you know ...
JAD: Good.
STEVE STROGATZ: But now let's suppose it plays Lucifer.
JAD: Where there's no chance to cooperate.
ROBERT: Then, says Steve, Tit For Tat just plays good defense. So when Lucifer does his thing, Tit For Tat retaliates and they pretty much keep doing that and stay even. So in other words ...
STEVE STROGATZ: It's a very robust program. It elicits cooperation if the opponent has any inclination to cooperate, but it doesn't take any guff.
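[EDITOR'S NOTE: Steve's two extremes, run through the play_match sketch from earlier. The point totals follow from the standard scoring used above.]

```python
def jesus(my_hist, their_hist):
    return "C"   # always good, no matter what

def lucifer(my_hist, their_hist):
    return "D"   # always bad, no matter what

print(play_match(tit_for_tat, jesus))    # (600, 600): cooperation all 200 rounds
print(play_match(tit_for_tat, lucifer))  # (199, 204): suckered once, then even
print(play_match(jesus, lucifer))        # (0, 1000): pure goodness gets run over
```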
JAD: And it wins. So you might say in evolutionary terms, this program ...
ROBERT: Is the fittest.
STEVE STROGATZ: So actually, Axelrod played an evolutionary version of his tournament. That is, he had these programs, after they played their tournament, get a chance to reproduce copies of themselves according to how well they did.
JAD: You mean the winners would get to have more babies?
STEVE STROGATZ: Yeah.
JAD: And then would the babies play each other?
STEVE STROGATZ: Yeah, he ran them again. I mean, he ran them for many generations. And still—like, suppose you have a world of Lucifers and there are a few Tit For Tat players out there. Can they thrive? Can cooperation emerge in this horribly hostile world?
JAD: Wow, what an interesting question!
STEVE STROGATZ: So he looked at that and the answer was: if you have enough of them so that they have enough chance of meeting each other, they can actually invade and take over the world—even if the world starts horribly mean. I mean, what I take to be the big message though, I mean what always sent chills down my spine is that we see this version of morality around the world. You know, be upright, forgiving but retaliatory. I mean, that sounds to me like the Old Testament.
JAD: Oh.
STEVE STROGATZ: It's not turn the other cheek, it's an eye for an eye. But not 10 eyes for an eye. And to think that it's not something that's handed down by our teachers or by God, but that it's something that came from biology, I like that argument, personally.
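[EDITOR'S NOTE: A toy version of the evolutionary rerun Strogatz describes, building on the earlier sketches. The proportional-fitness update below is the textbook replicator rule, our assumption; Axelrod's actual ecological analysis differed in its details.]

```python
def evolve(entries, shares, generations=50):
    """shares maps each strategy name to its fraction of the population.
    Each generation, a strategy reproduces in proportion to the average
    score it earns against the current mix."""
    for _ in range(generations):
        fitness = {}
        for name_a, strat_a in entries.items():
            avg = 0.0
            for name_b, strat_b in entries.items():
                pts_a, _ = play_match(strat_a, strat_b)
                avg += shares[name_b] * pts_a
            fitness[name_a] = avg
        mean = sum(shares[n] * fitness[n] for n in entries)
        shares = {n: shares[n] * fitness[n] / mean for n in entries}
    return shares

# A world that is 95 percent Lucifers with a small cluster of Tit For Tats:
entries = {"lucifer": lucifer, "tit_for_tat": tit_for_tat}
print(evolve(entries, {"lucifer": 0.95, "tit_for_tat": 0.05}))
# Tit For Tat's share grows every generation: against each other the Tit
# For Tats average 3 points a round while Lucifers only manage 1, so a
# big enough cluster can invade and take over even a horribly mean world.
```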
JAD: We're gonna take a quick break. When we come back, the story moves from computer programs to real people. Real people in the middle of a very real war. Stick around.
[LISTENER: Hey there, this is Greg in Huntington Beach, California. Radiolab is supported in part by the Alfred P. Sloan Foundation, enhancing public understanding of science and technology in the modern world. More information about Sloan at www.sloan.org.]
JAD: Hey, I'm Jad. We're back, and we're playing a piece that we called at the time Tit For Tat. We just heard about the computer program of that name and how it fared in Robert Axelrod's Thunderdome-style competition between computer programs. But we're gonna shift the playing field now.
STEVE STROGATZ: This is what's so impressive to me about Axelrod's work. So he's not just playing math games. He—he tries to tie this to history and politics.
ROBERT AXELROD: I like to scan journals. [laughs] One of my kids would say it's a pastime because it's part of my profession. But I came across a book called the Live-And-Let-Live System in World War I.
JAD: So here's where we jump away from the math and the computer tournaments and into something very real.
STANLEY WEINTRAUB: The war began late in July, 1914.
JAD: That's Stan.
STANLEY WEINTRAUB: Stanley Weintraub.
JAD: Expert in World War I.
STANLEY WEINTRAUB: Evan Pugh Professor Emeritus at Penn State.
JAD: And the story that Stan's gonna help us tell takes place on what was called the Western Front, which was basically these two lines of trenches ...
STANLEY WEINTRAUB: Very close to each other. A few hundred yards apart.
JAD: And they stretch for hundreds of miles. And that fall ...
STANLEY WEINTRAUB: In November, the weather turned bad. Heavy rains, then it became icy, and then slush and then snow. It became disgusting, because the trenches also were filled with rats. The rats went after not only the food but after corpses.
JAD: And it was oddly in this miserable, disgusting hellhole that something quite amazing happened. No one quite knows how it started, but one day, maybe around daybreak, let's say, while the two sides were fighting, some of the British soldiers ...
STANLEY WEINTRAUB: Stopped firing long enough to have breakfast.
JAD: And as they were eating they noticed, hmm, the Germans stopped too, to have their breakfast. And when they were both done ...
STANLEY WEINTRAUB: They'd begin firing again.
JAD: Next morning, same thing. British take their breakfast break at about the same time. The Germans do the same thing. The morning after that the same thing, and then the next. And after a while ...
ROBERT AXELROD: Both sides caught on that if they didn't interrupt the other one, then they wouldn't be interrupted.
[ARCHIVE CLIP, letter: On the whole, there is silence.]
JAD: This is from a letter a British soldier sent home to his wife at the time.
[ARCHIVE CLIP, letter: After all, if you prevent your enemy from drawing his rations, his remedy is simple: he will prevent you from drawing yours.]
JAD: When Axelrod read this ...
ROBERT AXELROD: I thought, "Gee, this sounds very familiar."
JAD: Line one of Tit For Tat. Be nice first. Now the Brits probably didn't mean to be nice first when they started the breakfast truce, but it happened. And then the Germans reciprocated, which is line two. Now keep in mind these two sides are at war, and implicit in line two is a threat: if you mess with me, I'm gonna mess with you.
ROBERT AXELROD: Well, think about snipers, for example. See, there's letters where they explain how the snipers would shoot at a tree over and over and over again, showing that, in fact, they were really accurate. Meaning that if they wanted to kill you, they'd get you.
JAD: And this was going on during the breakfast truce. And these little agreements, you know, like, "I'm gonna be nice to you, but I could kick your ass. Don't forget." Well, these little truces spread all up and down the Western Front until things really changed. Fast forward to December. Christmas Eve.
STANLEY WEINTRAUB: The climate was just about freezing on Christmas Eve. And the Germans had a tradition of tabletop Christmas trees, small trees.
JAD: For weeks, he said, the German government had been ...
STANLEY WEINTRAUB: ... shipping small trees literally to the trenches. Hundreds and hundreds of trees.
JAD: And that night, on Christmas Eve ...
STANLEY WEINTRAUB: At dusk, the Germans began putting up their trees.
JAD: Mounted them on the rim of their trench.
STANLEY WEINTRAUB: And lit candles on them, singing Christmas carols. The British, who might have been no more than 50 or 70 yards away, crawled forward into no-man's land to see better.
JAD: And then they were spotted. Here's a letter from a German soldier sent home to his family which describes what happened next.
[ARCHIVE CLIP, letter: I shouted to our enemies that we didn't wish to shoot. I said we could speak to each other. At first there was silence.]
JAD: And then very slowly out of the darkness the British guys approached.
[ARCHIVE CLIP, letter: And so we came together and shook hands.]
PAT WALTERS: This is—see, this is where I start to think, "Are you making this up?" Because this is where it starts to sound sort of crazy to me.
JAD: That's Pat Walters, our producer.
STANLEY WEINTRAUB: It sounds as if this is being made up, and the result was for many decades people assumed that this was just myth. It couldn't possibly have happened. But we know it happened because we have the letters that the British and the Germans sent back home. We know that they met in darkness and decided, "Why don't we have a truce in the morning?"
JAD: Next morning, thousands of soldiers put down their rifles, climbed out of their trenches into no-man's land and started hanging out with each other.
[ARCHIVE CLIP, letter: A lot of us went over and talked to them and this lasted the whole morning. I talked to several of them, and I must say they seemed extraordinarily fine men.]
JAD: Soldiers got together, started fires, cooked Christmas dinners.
STANLEY WEINTRAUB: Swapped presents and drank.
JAD: The Germans hauled out these enormous ...
STANLEY WEINTRAUB: Barrels of beer.
JAD: They traded stuff.
STANLEY WEINTRAUB: Cigars and trinkets.
JAD: Even helped one another ...
STANLEY WEINTRAUB: ... bury the dead.
JAD: And in some places on the Western Front, this period of goodwill lasted a whole week. But then the generals found out.
ROBERT AXELROD: They were very angry about this, and they said, "We didn't send you to the front to be nice to the other guys, we sent you to kill them."
ROBERT: The general says, "Hey, I want you to shoot those Germans. That's an order."
ROBERT AXELROD: Well, then they would ...
ROBERT: Wouldn't that ...?
ROBERT AXELROD: "Oh, gee. Sorry, General. I missed. But I'll try again better next time."
ROBERT: I see.
ROBERT AXELROD: The way the generals finally figured out how to disrupt this whole thing is they would say, "Okay, you guys go out on a raid, and I want you to bring back a prisoner or a corpse."
JAD: In other words, show me a scalp. That's an order.
ROBERT AXELROD: And that messed things up royally.
JAD: Here's a letter from a British soldier whose unit contained a band, which was apparently pretty common. He writes this letter about one of the moments when the truce vanished.
[ARCHIVE CLIP, letter: At six minutes to midnight, the band opened with Die Wacht am Rhein.]
JAD: Which is a German patriotic anthem. So some of the Germans, according to this letter, climbed up onto the rim of their trench to listen to this English band playing their song.
[ARCHIVE CLIP, letter: Then as the last note sounded, every grenade, firing rifle, trench mortar and bomb-throwing machine let fly simultaneously into the German trench.]
JAD: So you can imagine the Germans that weren't killed would have felt betrayed. They had just been hanging out with these guys. And the next night they would have attacked back, and the British would have attacked them back. And then the Germans would have retaliated against them. And on and on and on.
ROBERT AXELROD: And it would kind of echo back and forth forever.
JAD: And that's what happened.
STANLEY WEINTRAUB: There were immense casualties. As many as 50,000 casualties in a day.
JAD: And this, says Axelrod, is where you see sort of the dark side of tit for tat.
ROBERT AXELROD: One of the weaknesses of the tit-for-tat strategy, or one of the problems with it is these echoes ...
JAD: Not just echoes of good obviously, but echoes of violence.
ROBERT AXELROD: ... could get bad. So what I found though was that instead of playing pure tit for tat, where you always defect if the other guy defects ...
JAD: There are certain circumstances, he says—and this I find completely fascinating—where you want to modify that second line of code so that you're not always retaliating, you're nearly always retaliating.
ROBERT AXELROD: Right. If you were a little bit generous, which—by which I mean, say 10 percent of the time you don't defect, then what happens is that these echoes will stop. And I would call that generous tit for tat.
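[EDITOR'S NOTE: The modification Axelrod describes, in the same interface as the sketches above. The 10 percent figure is his; forgiving at random is one simple way to implement it.]

```python
import random

def generous_tit_for_tat(my_hist, their_hist, generosity=0.10):
    """Tit For Tat, except about 10 percent of the time it lets a
    defection go unanswered, so echoes of retaliation can die out."""
    if not their_hist:
        return "C"                 # still nice first
    if their_hist[-1] == "D" and random.random() < generosity:
        return "C"                 # forgive this one: the echo stops here
    return their_hist[-1]          # otherwise, plain tit for tat
```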
JAD: So this is kind of interesting. Like, we started with Moses.
ROBERT: Mm-hmm.
JAD: You know, an eye for an eye. But here it's saying maybe for every nine parts Moses, you need one part Jesus, you know?
ROBERT: Meaning, like, turn the other cheek?
JAD: Turn the other cheek.
ROBERT: It sounds like you've described, like, a cooking recipe or something.
JAD: Well ...
ROBERT: Like, nine parts of one thing.
JAD: Yeah. Well, I mean if you abstract it, it's kind of a recipe. It's a recipe for life.
ROBERT: But it isn't a recipe. That ignores the deep fact of it. Look, if I were punching you in the face right now ...
JAD: Mm-hmm.
ROBERT: What are you gonna do?
JAD: I'm gonna punch you back.
ROBERT: Yeah. And I'm gonna punch you back. You punch me back and I punch you back. And we're in pain.
JAD: Yeah?
ROBERT: And somehow, in the middle of being blasted by my powerful fist, you have to come up with the moral courage to say, "I think I'm gonna kiss this guy now." And that is not easy—as you well know, that is not an easy thing to do.
JAD: All right, but you're making it all personal. My point is if you zoom out, this is a strategy that just seems to be woven into the fabric of the cosmos. It works for computers, it works for people, it probably works for amoeba, okay? It just works.
ROBERT: And you think that exists on some higher plane?
JAD: I do. I do.
ROBERT: I don't. I think this is still, as you just called it, very personal. I think a person has to choose to be kind.
JAD: All right, I'm gonna make that choice right now then, okay? Even though you're irritating me. I'm gonna say to you, "Robert? You look very nice today."
ROBERT: You know what I'm gonna do to you? Thump!
JAD: [laughs] All right, enough of this. Radiolab.org is our online home. You can read lots of stuff there and you can subscribe to our podcast.
ROBERT: It is w-w-w.
JAD: It's implied.
ROBERT: Yeah.
[STEVE STROGATZ: Hi, this is Steve Strogatz. Radiolab is produced by Jad Abumrad and Pat Walters. Our staff includes Soren Wheeler, Ellen Horne, Tim Howard, Brenna Farrell and Lynn Levy.]
[ROBERT AXELROD: With help from Abby Wendle and Douglas Smith.]
[STEVE STROGATZ: Special thanks to Miss Cappadice, Graham Parker, Daniel Neuman and Meg Bowles.]
[ANSWERING MACHINE: End of mailbox.]
-30-
Copyright © 2023 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of programming is the audio record.