Sep 9, 2022

Transcript
40,000 Recipes for Murder

 

LATIF: Okay, so I'm—I'm—this is unusual because I usually prep so much and I have not really prepped that much at all.

LULU: All right. You're flying fast and loose?

LATIF: I'm flying very loose here.

LULU: With secret intel about the government and poison? No!

LATIF: Something.

LULU: Is this about poison?

LATIF: Maybe.

LULU: Okay.

LATIF: Okay, let's just do the thing.

LULU: All right. Hey, I'm Lulu Miller.

LATIF: I'm Latif Nasser.

LULU: This is Radiolab.

LATIF: Okay, basically I was just, like, nosing around, and I found this article in a journal called Nature Machine Intelligence.

LULU: Nature Machine Intelligence. Okay.

LATIF: And this is a few months ago. It was in March. I don't—I don't even know what I was looking for when I was—and I just found this paper. And it had this weird, kind of boring title that I didn't understand, but I, like, started reading it. And the tone of it, there was something about the tone of it that was sort of breathless.

LULU: Hmm.

LATIF: Like, "Oh, my God! We just discovered this thing. And it's kind of scary, and we're not the only ones who are able to find this thing. We're not the only ones who are actually looking for this thing." And it felt, like, calamitous. And by the time I finished it, I was breathless. Like, I was like, "Oh, my God! Like, is this what I think it is? Because if it is, this thing is terrifying."

LULU: Okay, what—what—and what's the thing?

LATIF: Okay. Okay, so let me just start from the beginning. I was just trying to manufacture out of nothing a kind of open piece.

LULU: No! But that's—no! I mean, I'm in.

LATIF: Okay, so here's what happened. So ...

LULU: Okay.

LATIF: Our scene begins with ...

LATIF: La, la, la, la, la, la, la, la, la.

SEAN EKINS: You're muted.

LATIF: Oh, I'm muted.

LATIF: This guy named Sean.

SEAN EKINS: Hey!

LATIF: I'm muted. Duh! Okay, cool. All right.

SEAN EKINS: Okay.

LATIF: So Sean, I just—like, I gotta say when I read your paper, basically my jaw dropped and I wanted to hear you tell the story.

SEAN EKINS: Yeah, I'm happy to tell you how it came about.

LATIF: Yeah.

SEAN EKINS: It took quite a long time to get out.

LATIF: Well, before we get there, if you don't mind my interrupting, let's actually rewind a little bit. Like, who are you? What is your company? What do you do?

SEAN EKINS: So my name's Sean Ekins. I'm the CEO of Collaborations Pharmaceuticals. And this is a company based in Raleigh, North Carolina. I founded it in 2015, and we work on using computational approaches for doing drug discovery.

LATIF: Which basically means what they do is they use AI to discover new medicines.

LULU: Hmm.

LATIF: You're a medicine hunter, and you do it through computers.

SEAN EKINS: Exactly, yes.

LATIF: So essentially what they do is they've built—like, using a lot of, like, open-source technology, a lot of, like, open-source, like, databases of drugs, they've basically created this computer program that's kind of like a search engine. And so they call it Mega-Syn.

LATIF: Or Mega-Sine? How do you say it?

SEAN EKINS: Yeah, Mega-Syn. So it was a really quick name for Mega-Synthesis.

LULU: Oh! So—and sort of synthesizing drugs that exist with receptors and brains and ...

LATIF: Kind of. It's a little complicated.

SEAN EKINS: It's one of those strange things where, like, I don't use it. I have one of my employees that basically codes it and puts it all together.

LATIF: [laughs]

LATIF: So that employee ...

FABIO URBINA: Sorry, let me take my mask off here.

LATIF: His name's Fabio.

FABIO URBINA: Oh, oh—okay.

LATIF: Fabio Urbina.

LULU: Okay. Sean and Fabio. [laughs]

LATIF: Sean and Fabio.

LULU: I love it!

LATIF: So what they do is they typically work with rare diseases.

FABIO URBINA: Which aren't considered profitable.

LATIF: Big Pharma's ignored them, there's no drugs for them. So what they'll do is take one of these diseases ...

FABIO URBINA: That, you know, usually only a few hundred people have.

LATIF: And they'll be like, "Okay, we need a drug that will do a very specific thing in the body to stop this disease, to stop the person from getting sick."

LULU: Right.

LATIF: So they'll tell Mega-Syn, "We need a drug that can do this very specific thing." And then they'll hit 'Search.' And Mega-Syn will comb through all the available drugs that have been discovered, all the drugs that have been even evaluated. Like, this giant, giant network of basically every drug that has ever been created. And if from that, Sean and Fabio can't find a good match ...

FABIO URBINA: Well, we're kind of out of luck. That's the end of that.

LATIF: Except it's not. Because Mega-Syn can do this other thing.

FABIO URBINA: It can put together a drug, basically.

LATIF: So how this works is drugs are basically just made up of molecules. And the thing about molecules that work as drugs ...

FABIO URBINA: Is they have a certain molecular weight range, they have certain properties.

LATIF: They're distinct.

FABIO URBINA: You can look at them and say, "Okay, that's a drug," or "Oh, that looks nothing drug-like."

LATIF: And so using all these public databases and just inputting all this information into Mega-Syn about chemistry, molecular engineering ...

SEAN EKINS: In a sense, we've—we've tried to train it to make things that a chemist would make.

LATIF: And so with that knowledge, what they do is they take the rare disease and they say, "Okay, Mega-Syn. In the infinity of molecules that could be drugs that don't even exist yet, can you make something ..."

FABIO URBINA: That might be active against our disease of interest?

LATIF: "... that could work here?" So Fabio will enter all this stuff into Mega-Syn, hit 'Run,' and within minutes ...

FABIO URBINA: It'll spit out ...

LATIF: ... these brand new, never-before-seen molecules.

FABIO URBINA: Molecules that look like drugs.

LATIF: And then Sean and Fabio can go through these molecules and say, "Okay, this is the one we want to do this thing we need it to do."

LULU: Huh. Huh!

LATIF: To disrupt this disease that humankind doesn't have a cure for.

LULU: Oh my gosh! The speed of that just needs a moment.

LATIF: Yeah, yeah. It's kind of amazing. It's incredible. It's incredible.

LULU: Yeah.
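[To make the shape of that workflow a little more concrete, here is a minimal, purely illustrative Python sketch of the generate-and-filter loop Latif is describing. This is not Mega-Syn's actual code; the generator and the two scoring models are stand-in stubs, and every name in it (generate_candidates, predict_activity, predict_toxicity, the cutoffs) is a hypothetical placeholder for the trained models and thresholds a real system would use.]

```python
import random

# Stand-in "generative model": in a real system this would be a trained
# neural network proposing brand-new molecules (e.g., as SMILES strings).
def generate_candidates(n):
    # Hypothetical placeholder: returns made-up molecule IDs.
    return [f"candidate_{i}" for i in range(n)]

# Stand-in property predictors: real systems use models trained on
# public bioactivity and toxicity data.
def predict_activity(molecule):   # higher = more likely to hit the disease target
    return random.random()

def predict_toxicity(molecule):   # higher = more likely to harm the patient
    return random.random()

def screen(n_candidates=10_000, activity_cutoff=0.9, toxicity_cutoff=0.2):
    keepers = []
    for mol in generate_candidates(n_candidates):
        active = predict_activity(mol)
        toxic = predict_toxicity(mol)
        # The "filter" from the story: keep only molecules that look useful
        # against the disease AND are predicted to be safe.
        if active >= activity_cutoff and toxic <= toxicity_cutoff:
            keepers.append((mol, active, toxic))
    # Rank the survivors so chemists can look at the most promising first.
    return sorted(keepers, key=lambda x: x[1], reverse=True)

if __name__ == "__main__":
    for mol, active, toxic in screen()[:5]:
        print(f"{mol}: predicted activity {active:.2f}, predicted toxicity {toxic:.2f}")
```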

LATIF: Yeah. Okay. So this is—I mean, this is where things start to get dark. So it's 2021, and these guys, they are just doing their thing, and they get invited to this conference.

SEAN EKINS: Called the Spiez or Spiez Convergence Conference.

LATIF: Kind of international security conference.

SEAN EKINS: And the goal of it really is to understand how technologies can be misapplied.

LATIF: Part of the theme is, like, this idea of, like, dual-use. So okay, for example, dual use is like, a nuclear bomb and nuclear power plants come out of the same technology.

LULU: Right.

LATIF: Like the so-called, like, double-edged sword.

FABIO URBINA: And so we got this invite. And of course, we're thinking ...

SEAN EKINS: Oh! Well, that's interesting. Why—why ...

LATIF: Why us?

FABIO URBINA: How can we misuse wonderful drug discovery tools?

LATIF: It's never even occurred to them. So ...

SEAN EKINS: The location was good. It was in Switzerland.

LATIF: They're kind of excited ...

SEAN EKINS: I was like, "Oh, the pictures look really good."

LATIF: ... about going on a free trip to Switzerland, as anybody would be.

SEAN EKINS: Jumped at the chance. In the end, it was a Zoom conference, which was a bit, you know, not as exciting. But ...

LATIF: Right.

LATIF: But they were like, "Okay, let's think of something. Like, we could think of a way that we could—you know, we got this invite. Like, surely there's something we could do." So they're sort of like brainstorming, like, "Okay, if we were, like, really evil ..."

FABIO URBINA: What would we do? How would we misuse what we know?

LATIF: They called it, like, their "Dr. Evil plan." Like, what would Dr. Evil do here?

FABIO URBINA: So yeah, it was—it was a very weird feeling.

LATIF: So they're, like, imagining and thinking it up ...

SEAN EKINS: Well, we were running out of time.

LATIF: ... when Sean has this idea.

SEAN EKINS: I hadn't given it a lot of thought. It was—it was pretty quick.

LATIF: So one of the things about Mega-Syn ...

FABIO URBINA: Is if we are trying to generate a new drug, we want to make sure it's not toxic.

LATIF: Fabio basically programmed this filter in Mega-Syn so that if the side effects of the medicine are gonna be worse than the medicine itself, like—or than the disease itself ...

LULU: Pass. Not interested.

LATIF: Like, don't—not interested.

LULU: Don't bother!

FABIO URBINA: Because of course, it doesn't matter if your drug cures all cancer, if it stops your heart from beating, it's going to not be a good drug.

LATIF: It'll save my life, but it'll also kill me. So ...

LULU: So I'll pass.

LATIF: Yeah, exactly.

LULU: Got it.

LATIF: And so Sean ...

SEAN EKINS: Well ...

LATIF: ... he thought ...

SEAN EKINS: What if instead of going right we went left?

LULU: Mmm ...

LATIF: What if we flipped the filter? What if we did the exact opposite, like, photo negative of that filter?

LULU: Yeah. Just spit out the deadlies?

LATIF: Yeah, exactly. And so ...

FABIO URBINA: I was using a 2015 Mac.

LATIF: ... that night ...

FABIO URBINA: Just did a couple of copy-and-paste changes. Typed a '1' where there was a '0' and a '0' where there was a '1.'

SEAN EKINS: It was that simple. It was literally that simple.

LATIF: He hit 'Run' on Mega-Syn.

FABIO URBINA: And then went home for the evening.
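[The thing Sean keeps emphasizing is how small that change was. As a hedged, self-contained illustration (again, not the real Mega-Syn code; PENALIZE_TOXICITY, the stand-in predictors, and the scoring function are all hypothetical), flipping the filter can amount to toggling a single value in the selection logic:]

```python
import random

def predict_activity(mol):   # stand-in model: higher = better against the disease
    return random.random()

def predict_toxicity(mol):   # stand-in model: higher = more harmful to a person
    return random.random()

# The normal drug-discovery setting: predicted toxicity counts AGAINST a candidate.
PENALIZE_TOXICITY = 1   # the "1" that became a "0" in the story

def score(mol):
    sign = -1 if PENALIZE_TOXICITY else +1
    # sign = -1: reward potency, punish toxicity (drug hunting)
    # sign = +1: reward potency AND toxicity (the flipped filter)
    return predict_activity(mol) + sign * predict_toxicity(mol)

# Everything else in the pipeline stays identical: generate candidates,
# rank them by score, read the top of the list the next morning.
candidates = [f"candidate_{i}" for i in range(10_000)]
top_of_list = sorted(candidates, key=score, reverse=True)[:10]
print(top_of_list)
```

[With the flag at 1, this sketch punishes predicted toxicity; set it to 0 and the identical pipeline starts rewarding it, which is the dual-use point the paper is making.]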

LATIF: Next day ...

FABIO URBINA: I did some work in the morning, and then I think it was around noon, I just sort of opened up this file.

LATIF: And they check what they have.

SEAN EKINS: Fabio said, "All right, we've got a list here."

LATIF: It's, like, overwhelming.

SEAN EKINS: Tens of thousands of molecules.

LATIF: And so they skimmed, like, the top crop.

FABIO URBINA: These top toxic molecules.

SEAN EKINS: And then looked at them.

LATIF: And so what they did is they would pick out a Mega-Syn molecule ...

SEAN EKINS: Put it into a public database ...

FABIO URBINA: Do like a database search to see if these molecules already existed.

LATIF: So they're going through this database to see if Mega-Syn had created anything terrible. Looking, looking, when all of a sudden they come across a match.

SEAN EKINS: With a super-hideous molecule ...

LATIF: Called VX.

[NEWS CLIP: So what exactly is VX?]

LATIF: It's the thing that ...

[NEWS CLIP: Top news today: a toxic substance was indeed the murder weapon ...]

LATIF: Do you remember that Kim Jong-Un poisoned his half-brother in an airport?

LULU: Right.

[NEWS CLIP: Two women now in custody.]

LATIF: That's VX.

[NEWS CLIP: Wiped his face with the toxic substance.]

LATIF: It's a nerve agent.

[NEWS CLIP: Developed in the United Kingdom in the 1950s.]

[NEWS CLIP: Banned by the United Nations and classified as a weapon of mass destruction, it is considered one of the most lethal chemical substances ever made.]

[NEWS CLIP: Ever created.]

SEAN EKINS: I mean, one way to think about it is if you think about salt, a few of those crystals of salt, if it was VX, would be enough to kill you.

LATIF: And if you did get exposed, your muscles would start to twitch, your pupils would dilate, you'd start sweating. Then you'd start to vomit. After that, your muscles would go completely slack. You'd be paralyzed, including your diaphragm, which would stop working, so you'd start to suffocate. And within a few minutes of being exposed, you would die.

LULU: Ugh! That's horrific.

LATIF: Yeah. I mean, it's awful. It's very awful. And Mega-Syn basically independently created it with the push of a button.

LATIF: What did you think was gonna happen?

SEAN EKINS: Um, I mean, realistically?

LATIF: Maybe it was just what did happen? I don't know.

SEAN EKINS: Yeah. Well, it was—I think what did happen was just the ease of it. I thought we would maybe get a few things that looked like VX.

LATIF: You did?

SEAN EKINS: Yeah. And we found a few, you know, in the literature, in publications.

LATIF: A few horrible things that humans had already created. And they figured that'd be it.

SEAN EKINS: But what we got was thousands of different molecules that looked like VX.

LATIF: Thousands of brand new, never-before-seen molecules that were actually predicted to be more potent than VX.

SEAN EKINS: Massively more potent. By orders of magnitude.

LATIF: This is just bad, right?

SEAN EKINS: Yeah, it's bad. It was like the alarm bells started ringing at that point.

LATIF: Because, according to Sean, if any chemist got their hands on this and wanted to make some of these molecules into weapons, then, because no one knows they exist, these weapons would be untraceable, undiagnosable.

LULU: Wow!

LATIF: Incurable.

LULU: Latif, this is so scary.

LATIF: It's really, really scary.

SEAN EKINS: I didn't sleep. I didn't sleep. I did not sleep. It was that gnawing away, you know, we shouldn't have done this. And then, oh, we should just stop now. Like, just stop.

LATIF: But in a minute, we're gonna keep going. Stay with us.

 

LULU: Lulu.

LATIF: Latif.

LULU: Radiolab.

LATIF: So Sean and Fabio have opened Pandora's box of chemical weapons.

FABIO URBINA: Yeah. Now we have this sort of file in our computer and all of a sudden it holds all these warfare agents.

LATIF: Was there part of you that was just like, delete, delete, delete, and just pretend this never happened?

LULU: Just unthink the thought experiment?

SEAN EKINS: That definitely crossed our mind.

FABIO URBINA: Yeah. We definitely had that sort of reaction to this.

SEAN EKINS: Like, I don't wanna know any—I don't wanna know any more.

LATIF: But then they figured, wait a second ...

SEAN EKINS: Other people could do this.

LATIF: You do not need a PhD to do this. You just need some basic coding knowledge, a basic laptop, and then all the data is available online for free.

SEAN EKINS: This could be something people are already doing or have already done. You know, these tools are in the hands of people that there is no control of. You know, anyone could do this anywhere in the world.

LATIF: And so the two of them, in this moment, were faced with this dilemma of now that you know what you know, what do you do? Do you tell people?

SEAN EKINS: Well, we have to make people aware of these potential uses of the technology and show people that, yes, these technologies can be misused.

LATIF: So maybe people could prepare for it, try to prevent it.

SEAN EKINS: Exactly.

LATIF: But at the same time, if you tell people ...

FABIO URBINA: We could inspire instead of prevent.

SEAN EKINS: Then maybe people that would wanna do this, see how far they could push it.

LATIF: What do you do? What do you do? What do you do?

LULU: Ah!

LATIF: It's exactly the double-edged sword.

LULU: So what—what—what happened? So okay.

LATIF: So—so they make this decision together where they're like, "We gotta go to this international security conference and we gotta say this out loud. But we're not gonna show anybody the specifics, and we're gonna tell them just enough that they know that this is a really serious problem."

LULU: Yeah.

LATIF: "And we gotta flag it. And hopefully someone smarter than us can figure out a solution to this." And so they go to the conference, they present at the conference, they then publish a few months later a comment in this journal, which is basically, effectively what they say at the conference. And it blows up. Well, maybe that's the wrong word to use.

LULU: [laughs]

LATIF: But, like, it goes everywhere. Wired, Scientific American. There was a thing on it in the Washington Post. There was a thing on it in The Economist. There was a thing on it in this thing, and this thing. And a lot of people like me stumble upon this thing. There's a lot of, like, active discussion about it from not just chemical weapons people, but also AI people and people in the pharmaceutical industry, and philosophers and, you know, weapons people and, like, all kinds of different people are, like, weighing in on it. And thinking about this and, like, what do you do here? So it's kind of—I think they got what they want, but they also, like—every night, they're going to bed thinking, like, "Tomorrow I could wake up and some horrible thing could happen, and it could have been because of me."

LATIF: Um, Lulu? Lulu? Are you there? Hello? Hello Lulu?

LULU: Oh my gosh. Sorry, sorry, sorry. I think my mic just conked out. I might need to just use my computer mic right now. Can you hear this?

LATIF: Yeah, yeah, yeah.

LULU: Let's just—if you can hear me, let's just keep going with this.

LATIF: Okay. No problem. That's fine.

LATIF: Just in one sentence can you just introduce yourself for me?

SONIA BEN OUAGRHAM-GORMLEY: Okay. Okay.

LATIF: So in the midst of reporting this story, I ended up talking to this expert.

SONIA BEN OUAGRHAM-GORMLEY: So my name is Sonia Ben Ouagrham-Gormley.

LATIF: She teaches at George Mason University.

SONIA BEN OUAGRHAM-GORMLEY: In the bio-defense program. And I study weapons of mass destruction, particularly biological weapons.

LATIF: And I found Sonia because, you know, I needed somebody else to talk to about this. I needed some kind of outside perspective because when I read Sean and Fabio's paper, I legitimately thought this was very, very terrifying.

SONIA BEN OUAGRHAM-GORMLEY: Yeah, that's—that's the impression it gives. And my point is that the thought experiment is just a thought experiment. It just shows that it is possible to identify new molecules, but there's a long way between the idea and the production of an actual drug or an actual weapon.

LATIF: Like, if you're a chemist who's gonna make something that just exists on paper, like, that takes a long time, a lot of investment, a lot of work, a lot of thought.

SONIA BEN OUAGRHAM-GORMLEY: And it's already hard enough to do it with chemicals that already exist.

LATIF: She said there's plenty of examples of scientists ...

SONIA BEN OUAGRHAM-GORMLEY: Who try to transform them.

LATIF: Who try to tweak them just a little bit.

SONIA BEN OUAGRHAM-GORMLEY: To make them more harmful. And very often they failed.

LATIF: Because it's just super, super difficult.

SONIA BEN OUAGRHAM-GORMLEY: That's the point.

LULU: Hmm.

LATIF: And I told her it was still kinda hard to wrap my head around because the way that I was thinking about it, I was actually thinking of, like, the Anarchist Cookbook of molecules. It's almost like recipes or something, right? It's like this thing spat out 40,000 recipes or whatever, right?

LULU: And then you just go to the store and get some fertilizer and, you know, and make it. And it's the exact proportions.

LATIF: Exactly. So I was worried about that.

LULU: Right. I'm picturing that too.

LATIF: Is that the right analogy, or would you use, like, a different analogy?

SONIA BEN OUAGRHAM-GORMLEY: Um ...

LATIF: And she was like, "No, you're—actually, that's the same metaphor I use when I'm teaching my students. But ..."

LULU: Oh, no! [laughs]

LATIF: "... think about it this way."

SONIA BEN OUAGRHAM-GORMLEY: If we take the analogy of a cake, for making a cake.

LATIF: These molecules are, like, really fancy cakes.

SONIA BEN OUAGRHAM-GORMLEY: And based on what I read, right? Based on the article, what they have is a list of different ingredients.

LATIF: And to your mind, it would take a five-star Michelin chef with a whole kitchen full of, you know, sous chefs to kind of figure that out, and to ultimately make any of those cakes.

SONIA BEN OUAGRHAM-GORMLEY: Exactly. Exactly.

LULU: It's like a level of craft.

LATIF: It's a level of craft and expertise.

LULU: It's like yeah, you could make the David in theory if you had marble and a chisel. But, like ...

LATIF: Right.

LATIF: But it's like—I mean, to take a step back, I feel like your takeaway about this paper is that you—you have not lost any sleep over this paper.

SONIA BEN OUAGRHAM-GORMLEY: No. No, I think I found it a little bit too alarmist.

LULU: Huh. So Sonia is soothing us.

LATIF: Yeah.

LULU: Are you gonna set this whole section to, like, a nice, lullaby bedtime, like, "It's okay."

LATIF: Um ...

LULU: "It's okay."

LATIF: No. [laughs] Because there's still a problem, which is that, I mean, Michelin-starred chefs do exist.

FABIO URBINA: And, you know, I don't know if this could go on air just yet but I don't think it's private or anything, but we were just contacted by the White House this morning.

LATIF: Oh my God! Really?

FABIO URBINA: Yeah. We weren't expecting that, so yeah.

LATIF: What did they say?

FABIO URBINA: So they want us to brief them on the paper, basically.

LATIF: Holy crap!

FABIO URBINA: Apparently it's causing a lot of buzz in the White House right now.

LULU: Whoa!

LATIF: And so I actually followed up with Sean after our first interview about this very thing.

SEAN EKINS: So we ended up doing a Zoom call with folks from the Office of Science and Technology Policy and the National Security Council, which was very surreal.

LATIF: So this is just in March, 2022. Sean, Fabio and some of their team got on this Zoom call.

SEAN EKINS: With these folks in their office in the White House, with the White House crest in the background. [laughs]

LATIF: So they do this presentation for the White House folks.

SEAN EKINS: I think they were worried that we were, you know, kind of crazy people, and we were just gonna let all of this information out there.

LATIF: But at the end of their presentation, there's like a Q&A, and the White House folks are asking them questions.

SEAN EKINS: "Is this information, you know, sort of locked away somewhere?"

LATIF: And it is.

FABIO URBINA: So yeah, you know, one of the first things I'm gonna do is put it in a file and encrypt it really heavily.

LATIF: And it's on a computer that is not connected to the internet. It's air-gapped.

SEAN EKINS: Fabio has it on his machine, locked away, encrypted.

LATIF: But as the White House staff kept asking questions, Sean was like, "Oh."

SEAN EKINS: I was just waiting for the question.

LATIF: He's, like, anticipating it.

SEAN EKINS: It was just when are they gonna ask the question? The elephant in the room, right?

LATIF: And then finally they ask the question.

SEAN EKINS: "Uh, can we have the data?"

LATIF: And it's like another one of these moments where Sean has to decide, you know, maybe the government should know about this so that they can anticipate it, try to regulate it. But on the other hand, now you're actually handing over the list to one of the most powerful governments in the world. Like, if anybody has access to Michelin-starred chefs, here they are. And Sean, it's his call, right? Because he's the CEO of the company. And Sean says ...

SEAN EKINS: You know, no.

LATIF: No. And to me, that was like—that was like, "Oh my God!" I—how do you even?

LULU: And how did they respond?

SEAN EKINS: Well, the reaction was basically, I think it was like, "Okay." I mean, it was pretty, "All right. Well, we'll just have to accept that."

LATIF: And what's your rationale for saying—why not share it with them?

SEAN EKINS: I just didn't feel that I wanted to hand it over to them. I mean, we had other scientists reach out to us as well asking exactly the same question, and I told the White House that I didn't feel like, you know, we should do that, we should give it over to them. And I just didn't feel that it was right. I mean ...

LATIF: So you were basically like, "No to you, White House, and no to everybody. We're not sharing this with anybody."

SEAN EKINS: Right.

LATIF: And you've told everybody about the existence of the list, but you're not sharing the list. What's the impetus there?

SEAN EKINS: I mean, it's bad enough to put the article out there and tell people how to do it and watch it sort of spread out across the globe, but if I gave them the list of molecules, it's almost like a red flag to a bull, right? The chemists out there are just gonna try to figure out all right, which ones are the easiest ones to make and then just make them. And I don't want to be the person that has to say, "Yeah, I'm responsible for that."

LATIF: And so for now, the list just sits on Fabio's computer, kind of waiting to be used in the case of an emergency in case someone was to recreate it somewhere else and actually start making some of these things.

LATIF: Reporting this story, I talked to a lot of people—AI experts, weapons experts, that kind of thing—and also, like, from my own expertise studying the history of science, this is a thing that just happens where something gets created and someone finds a way to use it for the opposite purpose, right? That's the story of the nuclear bomb and nuclear power. It's the story—and this is one we've actually done on the show before—of the nitrogen-fixing process that can give you fertilizer to feed billions or, you know, gunpowder and explosives. This kind of thing just happens over and over and over in history. But I think the thing that struck me about what Sean and Fabio were doing is how, when it came to changing this machine from making drugs that would help people, it's just how easy it was ...

FABIO URBINA: I just did a couple of copy and paste changes.

LATIF: ... to flip it.

FABIO URBINA: Typed a '1' where there was a '0' and a '0' where there was a '1.'

SEAN EKINS: It was that simple. It was literally that simple.

LATIF: Like, the line between this thing doing good and this thing doing bad felt so thin. And it was in thinking about this that I reached out to an old friend of mine, a guy named Yan Liu. He studies ancient Chinese medicine—he now works at the University at Buffalo. And I called him because he always would tell me this thing that when it comes to drugs ...

YAN LIU: You know, there's no clear boundary between what is poison and what is medicine.

LATIF: That an individual drug can be good or bad for you. It could go either way.

YAN LIU: It is the matter of the dose that makes a difference.

LATIF: That it's all about the dosage or also about the intent of the person giving it to you. Or about the body of the person who's taking it.

YAN LIU: It's more about the context ...

LATIF: ... than it is about the drug itself.

YAN LIU: Exactly.

LATIF: And I showed him Sean and Fabio's paper and he was like, "Oh, this feels like a toxicology paper."

YAN LIU: This separation between pharmacology and toxicology, you know, we have a long tradition especially in the West. It's the sense of separation between what is good and what is bad.

LATIF: But he's like, pharmacologists and toxicologists are studying the same molecules, but it's like, we just want them to be separate.

YAN LIU: Exactly. But every substance has the potential to either heal or to kill.

LATIF: And that's what I keep thinking about with this story: that there's no real line here. Everything is capable of everything, and it's just a matter of how we choose to use it.

LULU: Yeah. It's like the idea that there is a line or that you could ever stay on one side of the line is—I don't know what. The myth? The blind spot?

LATIF: Yeah. It's a bitter pill to swallow.

LULU: But it is also apparently a sweet pill!

LATIF: Right. A spoonful of sugar to help the medicine go down?

LULU: No, like, the opposite side of the bitter is sweet. Like, it's a bitter pill but it's a sweet pill at the right dosage or something.

LATIF: Yeah. Yes.

LULU: But maybe like the kind of fact that most of us are not Michelangelos or great chefs, like, that that alone might speak to the sadness in terms of, like, curing diseases. But maybe it's the comfort.

LATIF: Yeah, I don't know if that's my silver lining takeaway. It's like, "Oh, we're too dumb to make chemical weapons."

LULU: That's kind of mine. I mean, that's actually—just sifting through all of this is like, I think I will be able to sleep a little better at night by thinking about the fact that most of us are not Michelangelos.

LATIF: The Michelangelos of chemical weapons? Yeah.

LULU: Yeah.

LATIF: But even sculpting. Like, there are a lot of people who are, like, woodworkers or carpenters or craftspeople.

LULU: Yeah. But there's no one who's Michelangelo. That's why I like ...

LATIF: But then you just need a 3D printer to just, you know ...

LULU: I mean, I think it's like why I'm settling on Michelangelo is like, because I kind of still don't think a 3D printer could print—could do it.

LATIF: Could print the David?

LULU: Yeah. And—and because I'm like ...

LATIF: Can't it though? Aren't there—like, if you go to any museum gift shop, like, you'll find ...

LULU: Yeah. But they're not—yeah! But they're not! They're not.

LATIF: How different are they, though?

LULU: Very!

LATIF: The keychain Davids?

LULU: They—yeah! Like, they're different!

LATIF: I don't think they're that different, though! [laughs]

LULU: I think it's really different [laughs]!

LATIF: I don't think it's that different! [laughs]

LULU: Well you don't get the like—how the lights ...

LATIF: But the average—the average ...

LATIF: In March 2022, the US, under its obligations under the Chemical Weapons Convention, destroyed its last stockpile of VX. It has now destroyed over 95 percent of its chemical weapons, and is slated to destroy them all by September of 2023. So mark your calendars, I guess?

LATIF: This story was reported by me, Latif Nasser. It was produced by the ever non-toxic Matthew Kielty, with production help from actual pastry chef Rachael Cusick. Original music and sound design contributed by Matthew Kielty with mixing help from Arianne Wack. And given Lulu's mic situation, I'm gonna sign off for both of us. I'm Lulu Miller. I'm Latif Nasser. Thank you for listening. [laughs]

[LISTENER: Radiolab was created by Jad Abumrad and is edited by Soren Wheeler. Lulu Miller and Latif Nasser are our co-hosts. Suzie Lechtenberg is our executive producer. Dylan Keefe is our director of sound design. Our staff includes: Simon Adler, Jeremy Bloom, Becca Bressler, Rachael Cusick, Akedi Foster-Keys, W. Harry Fortuna, David Gebel, Maria Paz Gutiérrez, Sindhu Gnanasambandan, Matt Kielty, Annie McEwen, Alex Neason, Sarah Qari, Anna Rascouët-Paz, Sarah Sandbach, Arianne Wack, Pat Walters and Molly Webster. Our fact-checkers are Diane Kelly, Emily Krieger and Natalie Middleton.]

[LISTENER: Hi. This is Suzanna calling from Washington, DC. Leadership support for Radiolab science programming is provided by the Gordon and Betty Moore Foundation, Science Sandbox—a Simons Foundation initiative—and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.]

 

-30-

 

Copyright © 2022 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.

 

 New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of programming is the audio record.
