SGU Episode 890

July 30th 2022
(Episode image: Monkeypox rashes)

SGU 889                      SGU 891

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Quote of the Week

The illiterates of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.

Alvin Toffler, American futurist

Links
Download Podcast
Show Notes
Forum Discussion

Introduction, Audiobook Recording, Celebrity Narrators

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

S: Hello and welcome to the Skeptics' Guide to the Universe. Today is Tuesday, July 26th 2022, and this is your host, Steven Novella. Joining me this week are Bob Novella...

B: Hey, everybody!

S: Cara Santa Maria...

C: Howdy.

S: Jay Novella...

J: Hey guys.

S: ...and Evan Bernstein.

E: Good evening folks!

S: Cara, you're back full-time this week. You can do the whole episode.

J: She's back.

C: Full-time.

E: Yeah, yeah.

C: Let's see if I don't fall asleep in the middle. I'm going to work real hard at it.

S: So I didn't have a chance to mention it last week, but last Tuesday, a week before we recorded this, I wrapped the audio version of our next book, The Skeptics' Guide to the Future.

B: Awesome, man. What'd you wrap it in?

S: I did it in half the scheduled time.

J: What?

B: Half?

J: How?

E: He spoke twice as fast as he normally does.

B: They've got to reset their expectations, man.

S: Yeah, totally.

E: Steve was set to times two.

S: No, actually, you have to speak slowly when you're recording those audio books. You have to slow it way down.

C: You flubbed fewer times.

S: That's it. We chatted a lot about it. I had the producer and then the audio engineer, so there's the two guys. They're both listening in. One guy's listening for diction and pacing and enunciation and acting. The other guy's just listening for noise: "Your stomach made a noise. Say that again." Just listening to the audio quality itself and room noise and everything. But we took a few breaks throughout, just so I'm not sitting for five hours in a row. So we were chatting about it, and they had interesting things to say. There's basically two kinds of people doing audiobooks today. You're the celebrity doing a fiction book.

C: You're the voice actor.

S: Yeah, you're the voice actor, yeah. Or if it's a nonfiction book, the industry expectation now is that the author will read the nonfiction book.

B: Really?

S: So you have a lot of authors that have no idea how to read aloud, because they're authors, they're writers. They don't necessarily have any experience with podcasts or speaking or whatever. And so the sort of average of that group is just really slow. They need constant correction and direction. There was one guy who was so bad, they basically just said: just read everything twice. Just everything, just read it twice. He couldn't get through anything on the first try.

B: When when the the cat cat.

S: And then celebrities have a completely different problem in that they generally do not take direction well.

B: Ha-ha. Prima donnas.

S: And they said they're all prima donnas, they're all terrible. But then they said, well, not all of them, some of them were fine.

C: Yeah, that's surprising for actors. That's their job.

E: They're used to being directed.

S: Sometimes it's actors. Like, one of them said he did William Shatner for one book, and he said he was the worst. He was a total prima donna.

J: I was going to bring that up. Did you hear the audio of William Shatner doing a read? This was probably back in like the 70s.

E: I heard him read Rocketman once.

J: Yeah, right? That was something else. So he does a read, and it was like a 30-second spot. And then the "director", the person in the booth, was like, all right, could you do that again, but could you do it like this? And William Shatner, through conversational judo, had this guy apologizing to him. (laughter) Because he goes, oh, well, if you want me to say it a certain way, then read it the way you want me to say it, and I'll copy how you do it. And he talked the guy into actually doing the read. The guy does the read, and of course, he was terrible at it. So then William Shatner copies the way he did it, which was really bad. So William Shatner got this guy all wrapped up in a knot. You know what I mean? And it was like, holy Christ, don't come at that guy the wrong way.

B: Wow, man.

S: That's basically what they were saying about other celebrities as well. It's like, can you read that again this way? And it's like, why should I read it again? Why do you want me to? I'm not going to read it again. They just won't do it. I think it's the worst combination, because it's not like it's a famous director directing them. It's somebody who works for a publisher, a book publisher. You know what I mean? They're professionals, they know what they're doing in terms of producing an audiobook, but they're not the kind of person that a Hollywood actor might recognize and respect. So they were just not taking direction well. The sweet spot were professionals who weren't famous. And that's kind of where I fall in. Like, I have a lot of experience speaking, but I'm not famous, and I think I'm not a prima donna. So they were saying, they're just trying to make you sound the best you can. That's their job: to make your book the best it can possibly be. So why wouldn't you listen to everything they have to say? It's not like it's a contest or something. I don't know. It is a little weird.

E: It's not like they're auditioning to get the job or something. They've got it. Why not take the professional advice?

S: I guess they just didn't respect the fact that they knew what they were doing. But it's also not just Hollywood celebrities. One guy said they did a book with Bill Clinton, and he also was a prima donna. Politicians, anybody with fame, it's not just Hollywood actor fame. They were a little impatient with the process. But anyway, I just cruised along very, very quickly. Plus, I'm also an editor, so I was editing myself as I went. I didn't have to wait for them to tell me to stop and read that sentence over every time. Sometimes I let them decide if it was good enough or if they wanted me to do it again. But if I clearly flubbed something, I just stopped myself, went back, and read it again. That just made it go very, very quickly. So it's good. So I'm done. Totally done. And when we finished, Bob and Jay joined me in the studio to do some bonus content. And it was a lot of fun.

B: Yeah, baby. That was so much fun.

S: We had a good conversation. It was very interesting, very lively. It was an hour and 15 minutes of extra content.

B: Flew by. Flew.

S: It was really good.

B: That was cool.

E: Bob, you worked with Bill Shatner, I recall.

B: Oh my God.

E: At one point.

B: When I was at Priceline.

E: You both worked for the same company.

B: When I was at Priceline Webhouse, I remember seeing his name in the email system of the company, because he was doing the Priceline commercials. And so that was funny. It was like, oh, look at that. But if I had tried to email him, I probably would have just been flat out fired. I mean, he probably would have insisted: fire that punk Novella.

J: So you're saying you didn't even try, Bob?

B: He had the temerity to email me? Huh?

J: You didn't even try. That's lame.

B: No.

J: That's silly, Bob. I mean, you blew your one chance.

B: Yeah. Oh, boy. How my life would have changed.

J: I got you all beat.

B: Oh, yeah. Say it. Say it.

J: I butt dialed David Copperfield.

B: That's awesome.

S: You butt dialed him?

J: Well, yeah. Because he's our keynote. The day of the keynote, I had to talk to him again on my phone, because he was having trouble with a little connection thing. We fixed it. So his number was there. And my phone, I butt dial a lot by accident.

B: Yes, he does.

J: So I butt dialed him and I'm like, oh my God, I can't believe it. So I like he picks up and I'm like, David, I'm not going to lie to you. I just butt dialed you. I'm really sorry. I said, but while I have you on the phone, I just want to thank you so much for the keynote you did today with Bill. It was awesome.

B: Nice save. Nice save.

J: You guys, I said, the audience is really going to love it. I was so entertained. You went places I didn't expect you guys to go. And I just want you to accept my apology for butt dialing you, but thank you so much for that awesome keynote today. And he was like, cool, thanks, I had a good time. He was totally cool about it. Yeah, but I was really mad at myself for about 20 minutes, and then I'm like, who gives a shit? You know what I mean? These are just people. If somebody gets mortally offended because you accidentally give them a call, someone you had spoken to all day that day, you can't let shit like that bother you, because they're just people.

E: See, Bob, you should have emailed Shat.

J: Right. He's just a guy. I mean, when push comes to shove, we look up to them because of the personas that they put out there, but those personas aren't them anyway. We don't even appreciate who they really are. Do you know what I mean? It's all fantasy.

C: Yeah. Speaking of, did you guys see, I actually just watched last night, I think on HBO, there's a really good two-part documentary on George Carlin.

B: Really?

J: Oh yeah. Yeah.

C: And it really does speak to that, kind of the difference between the public persona and the private, sort of sweet, shy guy who is a little bit disillusioned, who is a little bit angry, who is really struggling with addiction, whose wife was struggling with alcohol abuse. And it's just really interesting to see that juxtaposition. It was a very honest appraisal, and it heavily featured Kelly Carlin, his daughter, who of course is very active in the skeptic community.

B: Really?

C: Yeah. Oh yeah.

J: We saw her give a presentation at TAM. I don't remember what TAM it was.

C: Yeah. I feel like I've known, I mean, I don't know her well, but like I've had meals with her and kind of known her for years through the skeptic community. Like that's where I usually would see her is at conferences.

J: She was, she was super nice.

S: If you listen to a lot of George Carlin's routines, there's skeptical threads and themes.

B: Oh yeah.

S: All the time. That's probably why I thought it was super funny.

B: Part of the reason.

C: And also just really funny.

S: Just really funny.

C: And did you know that he didn't have past, I think they said, a ninth-grade education?

S: Really?

C: Yeah. He never finished high school.

S: He had street smarts though, right?

C: Oh yeah. Very much so. And, and book smarts. He just very much dedicated himself to reading.

S: To self-study.

What's the Word? (10:20)

S: Cara, you're going to start us off with a What's the Word?

C: Got an email. I was going through some old emails. Actually, this one's not that old. It's just from last month, from Peter from New Jersey, who said: "I was listening to this science podcast and the host was explaining the difference between tortoises and turtles." You guys remember that episode? "And used the word cladistics. Damn. So now not only do I know the difference between turtles and tortoises, but I got to look up this whole new word, cladistics." Yeah. So I thought maybe we would take some time with that, because clearly we've been throwing the term around on the show.

B: Important term.

E: Is clade the root?

J: The claden.

S: The clayde.

C: The clayde.

E: Thank you. I just wanted to hear it. I couldn't wait. I knew it was going to happen.

B: Nice.

E: I had to hear it.

C: So let's start with the etymology. And yes, we'll go back to the word clade, which comes from the Greek, and it looks like it roughly translates to an offshoot or a branch, a young branch or a shoot of a plant. So it kind of makes sense, when you start to understand what cladistics is and what a cladogram is and what it actually looks like visually, why they would use this root word clade. And of course the Greek goes back to PIE, and they think at that point it was kind of like a cut or a strike, a piece, a broken piece. But there's actually a person who coined the term. Clade was first coined by Julian Huxley, who I think we're mostly familiar with, in '57, in a paper published in Nature called The Three Types of Evolutionary Process. And then cladistics, the term for the actual field or utilization, was coined by Ernst Mayr, which is interesting because he actually opposed cladistics. So he called it cladistics, and it caught on, but he was saying it to say, no, I'm not into this.

E: Is that like the Streisand effect in a way?

C: Oh, I don't know. What's the Streisand effect?

S: No, it's not that so much as, sometimes people come up with a term that they think is a derogatory term.

C: To denigrate or to, yeah.

S: And then that actually gets flipped and is embraced and becomes positive. Somebody referred to skeptics as the reality-based community, and we're like, yeah, we'll take that. We are reality-based.

C: As long as we don't call ourselves Brights.

E: Sorry about the tangent.

S: See, that was us trying to name ourselves, and that didn't work.

C: Anyway. So let's actually talk about what all of these words, clade, cladogram, cladistics, actually mean. So, branch, offshoot: basically, cladistics is a field of evolutionary biology. And what it's specifically looking at are relationships between different organisms that are shared because they have a common ancestor. So when you look at a cladogram, as opposed to some other ways of visualizing evolutionary biology, cladograms sort of look like V's. It's one large V with all of these offshoots coming off of it to form smaller V's. And those V's identify hypothetical, or sometimes actual, common ancestors, those breakage points. And then they continue on. And so it's another way of sort of looking at phylogeny. It's another way of looking at evolution through sometimes hypothesized and sometimes known common ancestry. So the cladogram is the actual tree, but it's not really a tree. It's this V-shaped thing that's very specific to cladistics. Cladistics is the field. And then an individual clade is the group of organisms itself. So that is the actual group that has a common ancestor, plus all of the different descendants of that common ancestor on that tree. Does that make sense? That's a clade.

S: Yeah, so it is a strictly evolutionary taxonomy. And that's why─

C: It's only looking at ancestry. Exactly.

S: Yeah, it's only evolutionary relationships. And that's why there was a lot of pushback. I remember reading evolution books at the time, in the 80s. I know Stephen Jay Gould wasn't totally down with it either.

B: Really?

S: Yeah, because it's like, well, birds are dinosaurs was the big example. It's like, really? Birds are dinosaurs? They're not different enough to be considered their own group? We're going to put them on one little tiny branch of dinosaurs because they're descended from theropods? So they said you should consider other things as well, like how disparate they are, for example, how numerous they are, whatever. And that's why there was pushback. But the thing is, they just lost the argument, because over the years cladistics, especially with genetic analysis and everything, just became so useful and so suited to the type of evidence that we were getting that the new generation just embraced it and the old objections faded away.

C: And it's pretty much all I knew. I remember taking a paleo class in early college, when I went to community college my first year, and everything was cladistics. That's how we learned about dinosaurs and lineage.

S: Right. Now we have to say non-avian dinosaurs when we're talking about them.

C: Yeah, that's true.

S: And yeah, the cladists won, basically.

C: The cladists won. (laughter)
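For readers who want to see the structure Cara describes, here is a minimal Python sketch (an editor's illustration, not from the episode) of a cladogram as a nested tree, where a clade is a common ancestor plus all of its descendants. The taxa and topology are simplified and partly hypothetical.

import sys

# A cladogram as a nested tree: each key is a (possibly hypothetical)
# ancestor, each value holds its descendant branches.
tree = {
    "Amniota": {
        "Mammalia": {},
        "Reptilia": {
            "Testudines": {},                # turtles and tortoises
            "Archosauria": {
                "Crocodylia": {},
                "Dinosauria": {"Aves": {}},  # birds nest inside dinosaurs
            },
        },
    },
}

def find(name, node):
    """Depth-first search for the subtree rooted at `name`."""
    for taxon, children in node.items():
        if taxon == name:
            return {taxon: children}
        hit = find(name, children)
        if hit:
            return hit
    return None

def clade(name, node=tree):
    """A clade is the named ancestor plus every one of its descendants."""
    subtree = find(name, node)
    if subtree is None:
        sys.exit(f"no taxon named {name}")
    out = []
    def walk(n):
        for taxon, children in n.items():
            out.append(taxon)
            walk(children)
    walk(subtree)
    return out

print(clade("Dinosauria"))  # ['Dinosauria', 'Aves'] -> birds are dinosaurs
print(clade("Reptilia"))    # includes Aves: reptiles-without-birds is not a clade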

News Items

Detecting Exoplanets (16:06)

S: All right, Bob, you're going to start off the news items with a new method for detecting exoplanets.

B: Yeah. Researchers recently published a new way to detect exoplanets using unusual binary stars with one of the best names in astronomy: cataclysmic variables. All right. So these researchers are from the Autonomous University of Nuevo León, the National Autonomous University of Mexico, and New York University. And their research was published in the Monthly Notices of the Royal Astronomical Society. But seriously, if there was one reason I selected this news item, it was the name, cataclysmic variables. I just loved it. Just fell in love with it. I hadn't heard of it. I should have heard of it. Never did. It kind of reminds me, perhaps, of The Cliffs of Insanity!, doesn't it? (Evan laughs)

J: The Cliffs of Insanity?

C: What are the Cliffs of Insanity? What are the Cliffs of Insanity?

E: The Princess Bride. You have to see that.

C: No, I've seen it a million times. I'm just, I think I immediately go to, I don't know what that is when you guys reference it.

S: That's a good first assumption.

B: Mostly you would be correct with our obscure references. So what are these cataclysmic Cliffs of Insanity variables? Broadly, they're binary stars where one is a white dwarf star, like what our star will become, siphoning hydrogen off of a close donor star, usually a red dwarf. It's so close that the white dwarf star can actually distort the shape of its companion gravitationally. Most stars, even our Sun, have a variable brightness. They are, in some ways, variable. But cataclysmic variable systems change their brightness irregularly and by very, very large amounts, and that's one of their standout features. So it is really one of the iconic examples of those artist's impressions we so often see in astronomy news. You see a fantastic astronomy picture and it says artist's impression. Well, of course it is, because there's no way we got in that close and took that. So that's what you've seen with these cataclysmic variables: usually a red and weirdly oblong star that has tendrils of gas leading away, spiraling towards and joining with an accretion disk orbiting a small point of light. I'm sure you've got that image in your head now. How many times have we seen it? It is one of those iconic examples, I think. Now, there are actually many, many types of these variables, and we've discussed some of them in the past. Classic novae are cataclysmic variables. Type Ia supernovae are as well; in that case, the accreting matter reaches a point where it triggers carbon fusion and explodes, cataclysmically you might say, and destroys the white dwarf star. It doesn't always destroy the white dwarf star, though, and in some of those cases we have a scenario that will allow for a new method of detecting exoplanets in such systems, according to this new research anyway. So we already have many ways to detect exoplanets. How many times have we covered them on the damn show? Radial velocity, transit, direct imaging, gravitational microlensing, astrometry, and there's even more obscure methods. If you look it up on Wikipedia, you'd be like, oh man, look at all of these methods. Those are the primaries. Now, the transit method has by far been the most productive. It's what, 70, 80%? The vast majority have been found with the transit method, which watches the dip in light as a planet crosses in front of its sun from our point of view. But what if the exoplanet doesn't cross in front of its star from our point of view? The transit method is basically useless at that point. And much of the time, many of the other methods wouldn't be very helpful either. So what can you do? In some ways this may be even more of a general method, where it doesn't matter where the exoplanet is, assuming of course that it happens to be in a cataclysmic variable system. All right. Now, this new method can help in these scenarios, and it works by leveraging the changing luminosity of the accretion disk orbiting and feeding the white dwarf. The disk is amazing, actually: a swirling gas that gets hotter and hotter, and it can emit so much light that it actually outshines the actual stars that are creating it. When you see the light, you're basically seeing the luminosity of the accretion disk and not the stars. The theory here is that an exoplanet that is bound to a cataclysmic variable system oscillates the L1 Lagrange point between these bodies, okay? Lagrange points are where there's a gravitational equilibrium between various bodies.
If you remember, the James Webb Space Telescope is in orbit around a Lagrangian point. So this oscillation of the Lagrangian point changes and disrupts the rate of flow of gas from the donor star to the white dwarf's accretion disk, right? And that disruption changes the luminosity of the entire system that we see, because that flow of gas directly causes the luminosity in that accretion disk, as you would imagine. So if you mess with that flow, the luminosity changes, blam, caused by the exoplanet. The change in luminosity then, the researchers claim, points to the existence of exoplanets, and they showed it for the cataclysmic variable systems that they studied. They looked at four, and two, perhaps three, of them actually showed this. And it certainly seems that exoplanets are responsible for messing with the flow of gas in these systems. Now, making their theory even more potent is the fact that it's a potential solution to a mystery that's been around for years regarding these systems. It's called VLPP, which stands for Very Long Photometric Period, and it refers to the fact that some cataclysmic variables have very long periods of this luminosity change. It just takes too long for the process to go from beginning to end; it's even longer than the orbital period of the stars themselves. What is causing this? And it looks like it's these planets. Lead author Dr. Carlos Chavez said: "These perturbations can explain both the very long periods that have been observed, between 42 and 265 days, and the amplitude of those changes in brightness". So really cool news item, check it out yourself. So basically, yay science, and Steve, you may proceed now.

S: So just to be clear, the period of the alteration in the luminosity of the star is the orbital duration of the planet, like the orbital period of the planet?

B: It's directly related to the exoplanet oscillating that Lagrangian point, which then messes with the flow of gas from the donor to the primary to the white dwarf.

S: And why does it have to be a Lagrangian point? Why can't it just be the orbital period of the planet itself?

B: Because somehow that Lagrangian point is intimately related to this flow of gas, right? You've got these two binary stars in orbit, and you can imagine that the Lagrangian point, that equilibrium where gravity is in equilibrium, is related to where this gas flows. And if something is messing with that Lagrangian point where it's oscillating, then that flow of gas is going to be disrupted. And then that changes luminosity, and that points to the existence of the exoplanet in the first place.

S: All right, gotcha.
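To make the detection idea concrete, here is a toy Python sketch (an editor's illustration, not the authors' actual analysis): simulate a light curve whose brightness carries a very long photometric modulation, then recover that period with a simple FFT periodogram. The 125-day period, amplitudes, noise level, and cadence are all assumed values, chosen so the period falls in the 42 to 265 day range quoted above.

import numpy as np

rng = np.random.default_rng(42)

# Simulated light curve: fast binary-orbit variation plus a slow,
# planet-driven modulation of the accretion disk's brightness (the VLPP).
cadence = 0.05                                  # days between samples
t = np.arange(0.0, 1000.0, cadence)             # ~3 years of monitoring
orbital = 0.15 * np.sin(2 * np.pi * t / 0.3)    # ~7-hour binary orbit
vlpp = 0.40 * np.sin(2 * np.pi * t / 125.0)     # assumed 125-day modulation
flux = 1.0 + orbital + vlpp + rng.normal(0.0, 0.05, t.size)

# FFT periodogram (the data are evenly sampled, so a plain FFT suffices;
# real survey data would need something like Lomb-Scargle instead).
power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
freq = np.fft.rfftfreq(t.size, d=cadence)       # cycles per day

# Look only at periods longer than 30 days, where a VLPP would live.
slow = (freq > 0) & (freq < 1 / 30.0)
best = freq[slow][np.argmax(power[slow])]
print(f"recovered photometric period: {1 / best:.0f} days")  # ~125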

Too Hot, Wet-bulb temperature (23:20)

S: All right, Jay, is it hot enough for you?

J: It's hot, Steve, and it's getting hotter.

S: How hot can it get, and how much heat can we take?

J: What level of heat and humidity can an average human handle?

B: Four foot one.

C: I think it depends on how much time you're spending in it.

B: Yeah, it's the time, and humidity, I think, would be huge.

J: I don't know. I'm going to put as many H's in that sentence as I can.

B: Oh, nice. I appreciate any attempts at good alliteration.

J: Let me unpack this for you guys. If you're watching the news, you know: global warming. It's causing a lot of heat-related events all over the world right now. Forest fires, heat-related deaths, this is common fare this summer.

B: Yeah, the UK had its hottest day ever recorded recently.

J: Yeah, and it keeps breaking its own record. In fact, these hot days are more frequent, they last longer, and they're hotter now than they ever were historically, like Bob just said. As temperatures rise, it's important to know what combination of heat and humidity we should be staying away from. Some researchers did a new study that delves back into this; there was previous research, but they wanted to get something a little more accurate and updated. So let's talk about something called the wet-bulb temperature. You guys ever hear about this?

B: No.

E: No. I've not heard that.

J: This is interesting. It's a measure of heat stress in direct sunlight, which takes into account temperature, humidity, wind speed, sun angle, and cloud cover; it has all these different factors that it takes in. But in practice, the wet-bulb temperature is the temperature read by a thermometer that is covered in a water-soaked cloth. So why would they do that? The wet-bulb temperature is the lowest temperature that can be reached under the current ambient conditions by the evaporation of water alone. So if you think about it, you have a thermometer that has a wet cloth on it, and the water in that cloth is evaporating, and it's cooling the relative temperature of the thermometer. Do you understand? So the wet bulb is actually you. You're the wet bulb, because you sweat.

E: What did you call me?

J: You get it? So they're simulating the effect of evaporation on temperature with the bulb. That's it.

B: Right. So that temperature is as cool as you can get through sweating.

J: Through evaporation. To be─

B: Sweating and then evaporation, yes.

J: ─more specific. So there was a study published in 2010 that concluded that a wet-bulb temperature of 35 Celsius or 95 Fahrenheit at 100% humidity, so basically drenched with water, would be the highest temperature a human body can operate safely in. So 35°C or 95°F. Anything higher than that and the body would not be able to cool itself by sweat evaporation.

B: Then you're on a direct line for heat stroke.

J: Yeah, but this is, I'm just giving you a little history here, because this was a study that was done 12 years ago. The problem with that study was that it wasn't done in a laboratory setting, so it wasn't controlled. But now there's a recent study that tested people in a controlled laboratory setting. Now don't get too excited, though, because the results are not good.

B: 90? 96?

J: The researchers started off testing healthy young men and women at the Noll Lab at Penn State University and subjected them to heat stress. They had their test subjects swallow a telemetry pill, very, very cool.

B: Oh, core body temperature.

J: Yes, this pill was able to monitor their core body temperature. They were then seated in a chamber, and they were moving their bodies to simulate the typical minimal bodily activity that most people do: scrolling on your phone, eating meatballs, typical stuff, right, Cara?

C: Yep, typical.

J: As the test subjects sat in the chamber, the researchers turned up either the temperature or the humidity while they were paying attention to each subject's core temperature and when it started to go up. This is called the critical environmental limit: the moment when someone's core temperature starts to go up. So the researchers waited until the test subjects hit their critical environmental limit. They keep raising a factor here, a factor there, and while the core temperature stays the same, the body is able to keep itself cool through sweat and an increase in heart rate. But then the telemetry pill says, oh, their core temperature has just gone up. And then: what were the conditions when that happened? Once their core temperature went up, that is when the risk of heat-related illness could occur. From the moment the test subjects' core temperature went up, they were legitimately at risk. Even if it goes up one degree, it's dangerous. So when heat levels rose, the test subjects had an increase in heart rate, like I said. They raised the heat in the chamber, and immediately the test subjects' heart rates would increase. This allows blood flow to the skin to increase, and the increase in blood flow brings heat to the skin. So your blood is going to your extremities, going to your skin, and your skin is now radiating the heat that was in your core. On a hot day, your heart typically circulates two to four times more blood each minute compared to a cool day, just to do this, to bring the heat of your body to your skin. So now that body heat is brought to the skin, and our sweat evaporates, and in doing so removes heat from the body. That is the mechanism. That's why we sweat. Keep in mind, though, that sweating is a direct loss of what?

C: Water.

J: Bodily fluids.

B: Alcohol?

J: Our sacred bodily fluids. Steve, what was that line?

S: Our precious bodily fluids.

E: Our precious bodily fluids.

J: Our precious bodily fluids. If a person's body is doing this for too long, or if they hit their critical environmental limit, they can suffer from heat stroke. Heat stroke is when your body can no longer lose body heat, so the result is that your core temperature goes up, and this is very dangerous. So let me paint this picture for you, and this happens every single day─

B: Oh man.

E: Awful, awful.

J: ─to people all around the world. It's hot out, they're sweating, and their heart rate is up. Their body is trying to shed heat through this process. But their heart rate can only go up so high, and they can only lose so much water before their body can no longer shed heat. Lots of different things can cause this, but essentially it's a lack of water, and there's only so much your cardiovascular system can do to get that heat out to your extremities.

B: Now, Jay, there's also the fact that if they're factoring in 100% humidity, then you can't really lose heat: you're not going to be evaporating any sweat, because it's 100% humidity, right? Isn't that a factor as well?

J: Yeah, yes it is, Bob. That's very smart of you to pick that out. I'll give you the details on that in a second. So this study was able to refine the wet-bulb temperature and bring it down to a much more accurate number. This is the temperature where the body can no longer lose heat: the wet-bulb temperature. The new number, at 100% humidity, is 31°C, which is 88°F. That's not hot, by the way. You could hit 88 easy on an average summer day. 88 is not a hard number─

B: Yeah. It's the 100% humidity that's rare.

J: ─or 31°C. Now, if the relative humidity were 60%, the limit would go up to 38°C, which is 100°F. Now, the good news is that most people have about 50% humidity in their houses, so that number would be able to go even higher.

B: That's how my mom survives.

J: That is it, right. Plus, her heart only beats once every three minutes. So do you guys know why it's harder to stay cool with higher humidity, by the way?

B: Evaporation is short-circuited.

J: Yeah, because it's harder for the water to evaporate as fast, and you need it to do that. There's a mathematical relationship between the amount of sweat that evaporates, the temperature of your skin, and how much heat is going to be whisked off of you. The troubling news is that heat waves around the world right now are at or above these new wet-bulb temperatures I told you about, 31°C or 88°F. There's a limit to how much any person can sweat, because they're limited by their hydration. Now, the danger-level temperatures I gave you don't mean that you're perfectly safe below them. You shouldn't feel safe going out into an 88° day without being hydrated, and you have to be in good health. In fact, you're really not in a good position if that's the temperature out. What those numbers mean is that a very healthy person's body is able to keep their core temperature under control at that temperature. Lower temperatures can still stress your body and deplete water. You could be in 80° temperatures and your heart rate is still up, and you're sweating a little bit to lose heat; your body's still trying to shed heat. The length of time that you're exposed to higher temperatures is also a key factor here. This is why we all have to hydrate before we go outside. Stay hydrated. Be hydrated when you go out, and stay hydrated. Don't wait until you're thirsty, because there's a delay: by the time you feel thirsty, your body is telling you it already has a deficit. Your body isn't pre-gaming it. So getting thirsty is bad; you want to keep yourself not thirsty. Especially when you're working out in the heat, you know what I mean? If you're out in the sun in the heat, don't let yourself get thirsty. Just keep drinking. Drink before you feel thirsty. Also, limit your exposure to high temperatures. This might seem super obvious, and it is, but most people don't really take this into account when they're doing things. Even short breaks in cooler, shaded areas can be a huge help for your body to catch up. Catching some shade during the day can literally be the difference between heat stroke and just being hot. People 65 and older make up 90% of those who die from heat exposure. That's a lot of you listening to this, so please pay attention. This is important. Energy costs are also up, and some people are finding it hard or impossible to cool their living spaces. And there are also power outages. These happen, and I guarantee you that global warming is going to increase the average number of power outages we have due to severe weather. So what would you do if you lost power in the middle of an extra hot summer? Now, I've lost power during hurricanes, and it's not that hot, because with the storm, the relative temperature is nowhere near as high as it would be if there was direct sunlight. But what if you did lose power, like a brownout or a blackout in the middle of the summer, and it's hot where you live?

C: Yeah, it happens all the time.

J: What are you going to do? Like think about that.

C: Stay indoors. Make sure there's a breeze. Always make sure there's a breeze.

J: You're absolutely correct, Cara, and I'm putting it to the audience. Think about it. Think about your situation, your home. Do you have available water for situations like this? Do you have a cool place to go? Going to the basement is great, if you have one; it's always cooler down there. Or there's cooling stations; there's lots of cooling stations around the world that towns and cities are setting up for people. That's another potential place that you could go in an emergency. Global warming has drawn the line in the sand now, guys. It's here, it's getting worse, and it's starting to affect more and more people. So please think over this information and make some changes to how you approach heat and humidity. We just have to be cautious, because the world that we live in is getting a little bit harsher. When I researched this, it was an eye-opener, because I don't think about carrying water with me. I don't really think about getting into the shade.

C: Really?

J: Yeah, I'm not there yet. My wife carries a water bottle with her everywhere, and I'm like, I don't need it. I'll drink it in the house.

C: I also wonder if that is definitely a geographic difference, right?

S: Totally. Yeah, if you live in Arizona, of course you know you need to carry water with you.

C: Totally. Being from Texas and now living in LA, I always have a water bottle. I always carry a S'well bottle, one of those metal vacuum-sealed bottles, because they stay cold all day. And, yeah, I'm never without water.

S: You develop skills based upon the environment that you live in. Like in New England, we have skills dealing with snow. We know how to drive in the snow, when not to drive in the snow.

C: Yeah, and I know none of that.

S: How to not get frostbite and things like that.

J: In Canada, they wrestle bears, right, Steve?

S: Yeah. Canada thinks we're a joke. If you live in Arizona or Southern Texas, or anywhere in Texas, I guess, you know how to deal with the heat and the desert and the dry and whatever. That's a skill set. We don't have to develop that living in Connecticut.

C: Not yet.

S: Yeah, but that's the point: more and more people are going to have to learn how to manage a hotter, drier environment. All right.
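For reference, the wet-bulb numbers Jay quotes can be checked with Stull's 2011 empirical approximation, which estimates wet-bulb temperature from air temperature and relative humidity at sea-level pressure. This example is an editor's addition, not from the episode, and the formula is a rough fit, good to about a degree over ordinary conditions.

import math

def wet_bulb_stull(temp_c, rh_percent):
    """Stull (2011) empirical wet-bulb estimate in deg C at sea-level
    pressure; valid roughly for RH between 5% and 99%."""
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# 38 C at 60% humidity gives a wet-bulb of about 31 C, matching the
# episode's numbers; near 100% humidity the wet-bulb approaches the air
# temperature, so the 31 C wet-bulb limit is hit at about 31 C outdoors.
print(round(wet_bulb_stull(38.0, 60.0), 1))  # ~31.2
print(round(wet_bulb_stull(31.0, 99.0), 1))  # ~30.9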

Overconfidence and Denial (36:13)

[36:14.440 --> 36:15.440] So I had to talk about this.

[36:15.440 --> 36:18.040] There was like three things I wanted to talk about this week, but this is the one I had

[36:18.040 --> 36:19.040] to talk about.

[36:19.040 --> 36:25.240] This is a new study looking at the relationship between overconfidence and opposition to scientific

[36:25.240 --> 36:26.240] consensus.

[36:26.240 --> 36:27.800] Oh, I don't like this.

[36:27.800 --> 36:28.800] Yeah.

[36:28.800 --> 36:34.040] So it shows, you know, in the broad brushstroke, pretty much what you think it shows in line

[36:34.040 --> 36:35.360] with previous research.

[36:35.360 --> 36:37.680] But the details are always are always interesting.

[36:37.680 --> 36:45.540] So what the researchers were looking at was correlating the gap between people's self-assessment

[36:45.540 --> 36:52.880] of their own knowledge on a specific topic called subjective knowledge with their actual

[36:52.880 --> 36:55.680] objective knowledge about that same topic.

[36:55.680 --> 36:58.160] How well can they answer factual questions?

[36:58.160 --> 37:00.880] Oh, I feel some Dunning-Kruger coming on.

[37:00.880 --> 37:01.880] Yeah, absolutely.

[37:01.880 --> 37:02.880] Related.

[37:02.880 --> 37:03.880] Totally.

[37:03.880 --> 37:11.240] And then they compared that to their acceptance or opposition of controversial scientific topics.

[37:11.240 --> 37:12.460] Is global warming happening?

[37:12.460 --> 37:14.060] Did evolution happen?

[37:14.060 --> 37:19.600] Do vaccines or vaccines save or genetically modified food save, et cetera?

[37:19.600 --> 37:21.200] And what do you think they found?

[37:21.200 --> 37:28.240] They found that the general pattern for most topics, actually, again, I like to remind

[37:28.240 --> 37:29.920] people this is topic specific.

[37:29.920 --> 37:33.960] All of these things that we talk about in terms of the relationship between knowledge

[37:33.960 --> 37:39.140] and acceptance of scientific consensus, they're very topic specific, but for most topics,

[37:39.140 --> 37:46.800] they found that there was a really robust, consistent, rigorous correlation between a

[37:46.800 --> 37:54.520] larger gap in subjective knowledge to objective knowledge and opposition to scientific consensus

[37:54.520 --> 37:56.640] on controversial topics.

[37:56.640 --> 38:03.400] So the more people overestimated their knowledge, the more they also rejected the scientific

[38:03.400 --> 38:04.400] consensus.

[38:04.400 --> 38:07.440] And not only were they overestimated their knowledge.

[38:07.440 --> 38:16.740] So Dunning-Kruger, if you recall, Dunning-Kruger was that as people's objective knowledge decreased,

[38:16.740 --> 38:20.720] the degree to which they overestimated their relative knowledge, their knowledge relative

[38:20.720 --> 38:23.000] to others, increased.

[38:23.000 --> 38:27.080] But the gap increased, but the absolute value still decreased, right?

[38:27.080 --> 38:32.280] So as people knew less, they thought that they knew less, just not as much.

[38:32.280 --> 38:35.800] And so the gap increased, right?

[38:35.800 --> 38:38.680] That's Dunning-Kruger.

[38:38.680 --> 38:43.200] But I think it was first demonstrated with genetically modified food, a study looking

[38:43.200 --> 38:51.040] at that topic, where the people who had lower knowledge actually thought they knew objectively

[38:51.040 --> 38:52.040] more.

[38:52.040 --> 38:55.040] It wasn't just that it didn't decline as quickly.

[38:55.040 --> 38:56.040] It actually increased.

[38:56.040 --> 38:59.280] So the people who knew the least thought they knew the most, right?

[38:59.280 --> 39:04.400] They were like, yes, I know more than 95% of the population.

[39:04.400 --> 39:07.580] And they were in the fifth percentile.

[39:07.580 --> 39:09.160] They knew nothing.

[39:09.160 --> 39:11.160] And they thought they knew more than the experts.

[39:11.160 --> 39:15.520] They literally, in some cases, thought, I know more than doctors about autism.

[39:15.520 --> 39:16.520] And the question-

[39:16.520 --> 39:18.560] That can lead you to dangerous places, that's for sure.

[39:18.560 --> 39:20.440] Yeah, that's like the super Dunning-Kruger.

[39:20.440 --> 39:22.680] It's not just a regular Dunning-Kruger.

[39:22.680 --> 39:29.880] So this is actually replicating that research by looking at a broad series of topics.

[39:29.880 --> 39:35.320] And they're showing that, yeah, for many topics, there is that super Dunning-Kruger kind of

[39:35.320 --> 39:39.880] effect where the people who know the least think they know the most.

[39:39.880 --> 39:41.480] But not for every topic.

[39:41.480 --> 39:46.680] So what topics do you think do not show this pattern?

[39:46.680 --> 39:47.680] Topics that aren't highly politicized?

[39:47.680 --> 39:48.680] No, no.

[39:48.680 --> 39:49.680] These are all politicized.

[39:49.680 --> 39:50.680] These are all-

[39:50.680 --> 39:51.680] Oh, they all are.

[39:51.680 --> 39:52.680] It's actually the opposite, Kara.

[39:52.680 --> 39:56.880] The more politicized they are, the more likely they are to not follow this pattern.

[39:56.880 --> 39:57.880] Interesting.

[39:57.880 --> 39:58.880] Really?

[39:58.880 --> 39:59.880] Yeah.

[39:59.880 --> 40:05.640] So the two standouts are climate change and evolution.

[40:05.640 --> 40:13.240] And so what might those have in common that would make them fall under a different pattern?

[40:13.240 --> 40:14.240] Compared to what?

[40:14.240 --> 40:15.240] Sorry, what were the other topics?

[40:15.240 --> 40:16.240] I just want to kind of-

[40:16.240 --> 40:21.120] So vaccines and autism and GM food, like the two where it holds up very well.

[40:21.120 --> 40:26.320] So again, one way you could think about it is that this was showing us what the effect

[40:26.320 --> 40:29.840] is, but not why the effect exists, right?

[40:29.840 --> 40:35.400] So it shows that people who are overconfident tend to reject mainstream science more, but

[40:35.400 --> 40:37.040] it doesn't tell us why.

[40:37.040 --> 40:43.240] And so we could think about the why, and I do think that will illuminate why some topics

[40:43.240 --> 40:44.840] are different than others.

[40:44.840 --> 40:49.280] So for example, one reason may be misinformation, right?

[40:49.280 --> 40:55.600] You may be that you are subjected to misinformation, disinformation, and you are ignorant of the

[40:55.600 --> 40:56.800] scientific topic.

[40:56.800 --> 41:01.400] So this would be a knowledge deficit kind of model, right?

[41:01.400 --> 41:10.200] And interestingly, the topics like GM foods, the opposition to GM food, which does follow

[41:10.200 --> 41:16.120] a knowledge deficit model, is also the one that has this super decay effect where the

[41:16.120 --> 41:18.800] people who know the least think they know the most.

[41:18.800 --> 41:23.680] With climate change, the climate change deniers don't have a knowledge deficit problem.

[41:23.680 --> 41:25.840] That is not the issue.

[41:25.840 --> 41:31.520] They can sling the facts as well as anyone, you know, even the defenders of the scientific

[41:31.520 --> 41:32.520] consensus.

[41:32.520 --> 41:35.160] Now, of course, this makes them think that they're right.

[41:35.160 --> 41:38.040] And they'll say, you see, that's why it's not a knowledge deficit problem because climate

[41:38.040 --> 41:39.720] change is wrong and we're right.

[41:39.720 --> 41:46.200] But what it could be is that those are the issues that are the most political and not

[41:46.200 --> 41:52.700] just political, but really related to identity and ideology.

[41:52.700 --> 41:57.720] One factor that the authors speculated is playing a role here and in other research

[41:57.720 --> 42:03.960] as well, this is not a new idea, is the idea of cultural knowledge, right, that they have

[42:03.960 --> 42:08.920] a set of information that they believe because it is tied to their cultural identity.

[42:08.920 --> 42:16.720] So if you are a fundamentalist Christian, you are raised in a subculture denying evolution,

[42:16.720 --> 42:22.920] and you are filled with a lot of facts that you believe to be true, but they're all either

[42:22.920 --> 42:27.440] wrong or distorted or biased or whatever.

[42:27.440 --> 42:32.800] You are living in a narrative that is well developed and very robust, but is completely

[42:32.800 --> 42:33.800] wrong.

[42:33.800 --> 42:38.480] That will give you overconfidence, right, because you are filled with a lot of facts.

[42:38.480 --> 42:39.880] You think you know what you're talking about.

[42:39.880 --> 42:41.720] But I thought these people were less overconfident.

[42:41.720 --> 42:46.840] But you may actually know facts about the topic, but it doesn't matter because it's

[42:46.840 --> 42:49.720] motivated reasoning, right?

[42:49.720 --> 42:55.260] People who deny climate change actually do know a lot of facts, and they can cite a lot

[42:55.260 --> 42:57.680] of studies.

[42:57.680 --> 43:05.800] In fact, some of them have very detailed knowledge, but it's all couched in motivated reasoning

[43:05.800 --> 43:10.600] that they are using in a dedicated way to deny climate change, and also because they're

[43:10.600 --> 43:18.320] being handed narratives that are well-crafted by experts and by either cherry-picked scientists

[43:18.320 --> 43:23.660] or by sophisticated PR campaigns that have been honing their message over decades.

[43:23.660 --> 43:30.080] So basically they have a motivated view, but they still have access to reality, and that

[43:30.080 --> 43:33.880] reality is still kind of seeping its way in.

[43:33.880 --> 43:39.020] So they have facts, but they're overwhelmed with motivated reasoning because the particular

[43:39.020 --> 43:45.840] denial is tied to their cultural identity or their ideological identity, whereas people

[43:45.840 --> 43:53.240] who deny the scientific consensus on the safety of GM foods, it's not part of their identity.

[43:53.240 --> 43:54.560] They're just full of misinformation.

[43:54.560 --> 43:57.160] But I think for some people it is.

[43:57.160 --> 44:00.320] Yeah, but that's the minority, though.

[44:00.320 --> 44:04.000] I guess that's my majority, but you're probably right.

[44:04.000 --> 44:09.760] But I feel like most of the people who I know who are super anti-GM, that is their identity.

[44:09.760 --> 44:11.840] They're alt-medicine advocates.

[44:11.840 --> 44:14.640] They're super scared and anti-big ag.

[44:14.640 --> 44:17.800] They're super scared and anti-big pharma, big med.

[44:17.800 --> 44:22.680] They have these very, very identity-driven views.

[44:22.680 --> 44:28.900] I agree, but with that issue, they are, I think, a minority of the totality of people

[44:28.900 --> 44:30.680] who are afraid of GM foods.

[44:30.680 --> 44:36.360] When you look at people, if you select people based upon just answering a survey, I think

[44:36.360 --> 44:39.360] that GM foods are not safe.

[44:39.360 --> 44:45.620] They're not all the true believer, crunchy, New Ager people that you're talking about.

[44:45.620 --> 44:51.440] In fact, rejection of GM foods is bipartisan.

[44:51.440 --> 44:56.480] It doesn't even really hew to one political ideology or one political party because people

[44:56.480 --> 44:58.080] find reasons to do it.

[44:58.080 --> 44:59.160] They're afraid of GM foods.

[44:59.160 --> 45:01.860] If you are on the left, you say it's because you're anti-corporate.

[45:01.860 --> 45:04.360] If you're on the right, it's because you're anti-government regulation.

[45:04.360 --> 45:06.920] You don't trust the government to keep them safe.

[45:06.920 --> 45:11.760] But I also wonder how much of it is truly because when it comes to vaccines, it's a

[45:11.760 --> 45:13.560] personal choice to get a vaccine.

[45:13.560 --> 45:17.880] When it comes to GM foods, it's a personal choice to eat those foods or to feed those

[45:17.880 --> 45:19.640] foods to your children.

[45:19.640 --> 45:23.840] There's this kind of precautionary principle at work with a lot of individuals that isn't

[45:23.840 --> 45:28.080] there when we're talking about evolution or even climate change, because that's kind

[45:28.080 --> 45:29.360] of eschatological.

[45:29.360 --> 45:36.200] It's vaguely affecting all of humanity, but not me personally, like less control about

[45:36.200 --> 45:37.200] our choices.

[45:37.200 --> 45:38.200] Yeah.

[45:38.200 --> 45:41.200] It shows how multidimensional these belief systems are.

[45:41.200 --> 45:43.080] You could think about them so many different ways.

[45:43.080 --> 45:44.420] I don't know that this is the answer.

[45:44.420 --> 45:46.440] This is just something that's different about them.

[45:46.440 --> 45:53.740] But it is true that vaccine denial is nonpartisan, GMO food opposition is nonpartisan, and those

[45:53.740 --> 46:00.040] tend to be similar in terms of this super decay, and those people can be, some of them

[46:00.040 --> 46:03.920] can be brought around, and nothing's 100%, I'm not saying everybody can, but you can

[46:03.920 --> 46:06.360] move the needle with facts.

[46:06.360 --> 46:11.240] Some of them can be convinced by facts, whereas if you look at evolution denial and climate

[46:11.240 --> 46:19.440] change denial, they are super ideological, super political, and you cannot correct them

[46:19.440 --> 46:23.880] by giving them the information, because that's not the problem in the first place.

[46:23.880 --> 46:24.880] Lack of information-

[46:24.880 --> 46:25.880] Right.

[46:25.880 --> 46:26.880] They're almost like religious views at that point.

[46:26.880 --> 46:27.880] Yes.

[46:27.880 --> 46:33.240] Well, in one case, it is a religious view, and in terms of evolution denial, it is 100%

[46:33.240 --> 46:35.080] a religious view.

[46:35.080 --> 46:39.520] And with climate change denial, it is political identity.

[46:39.520 --> 46:43.680] And those two things, climate change denial and evolution denial, tend to go hand in hand

[46:43.680 --> 46:44.680] in a lot of individuals.

[46:44.680 --> 46:45.680] Of course.

[46:45.680 --> 46:46.680] Yeah, yeah, yeah.

[46:46.680 --> 46:47.680] Yeah.

[46:47.680 --> 46:51.520] But it's a pseudo-religious, at least it correlates very well with it.

[46:51.520 --> 46:55.840] I'm curious too, Steve, what you think about the idea of like, when we talk about pseudoscience

[46:55.840 --> 47:02.360] in kind of its most classical term, like pseudoscience, but then we can talk about science denial or

[47:02.360 --> 47:09.160] science distrust as having a kernel of truth in it and taking it too far.

[47:09.160 --> 47:13.040] And I wonder sometimes with the people who are anti-vaxx, anti-GM, it's because they

[47:13.040 --> 47:18.560] have a sort of prudent paranoia that's become imprudent over time, it's kind of run away

[47:18.560 --> 47:20.040] from them.

[47:20.040 --> 47:23.640] So there's a caution that has become extreme.

[47:23.640 --> 47:27.080] I don't think that's the pathway, it might be in some people.

[47:27.080 --> 47:35.520] I think we tend to choose positions based upon our instincts, and those instincts are

[47:35.520 --> 47:36.520] complicated, right?

[47:36.520 --> 47:38.380] They have a lot of parts to it.

[47:38.380 --> 47:42.760] And then we backfill a justification, we rationalize our instincts.

[47:42.760 --> 47:47.080] And so what you're talking about, I think, are people like they just simply doesn't feel

[47:47.080 --> 47:52.360] right about GM food, and then they justify that feeling with the precautionary principle

[47:52.360 --> 47:58.560] or distrust of big corporations, or I don't think we should be patenting life.

[47:58.560 --> 48:05.500] But these are not coherent opinions that evolve organically, ironically, from the facts, right?

[48:05.500 --> 48:08.400] These are just backfilling their gut instincts.

[48:08.400 --> 48:11.680] And I think that's what you're seeing with the crunchy, you know, crowd, whereas their

[48:11.680 --> 48:16.440] instincts are natural and anti-corporate and all of this.

[48:16.440 --> 48:19.800] And that's why they just doesn't like nuclear power doesn't feel right to them.

[48:19.800 --> 48:20.800] Right.

[48:20.800 --> 48:23.080] But I think that also kind of minimize it.

[48:23.080 --> 48:28.640] I mean, I know a fair amount of people who like, they have come to these conclusions,

[48:28.640 --> 48:34.760] and they personally believe that they have done legitimate, deep research on these topics.

[48:34.760 --> 48:36.620] Like it's misguided.

[48:36.620 --> 48:40.760] But this isn't just a, oh, I don't know, I buy organic because it feels like it's healthier.

[48:40.760 --> 48:46.920] Like they will sit and have a deep conversation with you about about their beliefs system

[48:46.920 --> 48:51.040] and how the how it evolved over time and what they've read and why they believe these things

[48:51.040 --> 48:52.180] to be true.

[48:52.180 --> 48:54.040] And they don't feel arbitrary.

[48:54.040 --> 48:55.040] They feel intentional.

[48:55.040 --> 48:58.360] Yeah, but again, I think you're talking about one end of the spectrum.

[48:58.360 --> 48:59.680] Yeah, I think you're right.

[48:59.680 --> 49:02.680] I talk to people all across the spectrum.

[49:02.680 --> 49:06.840] And there are a lot of people who they don't really they don't really have an identity

[49:06.840 --> 49:09.800] that is significantly tied to this.

[49:09.800 --> 49:11.840] They're just like, I just hear bad things about it.

[49:11.840 --> 49:14.120] And they can't even tell you what it stands for.

[49:14.120 --> 49:17.800] And they triggers just an uncomfortable feeling.

[49:17.800 --> 49:21.520] And those are the people that are easily easy to bring around to bring around the people

[49:21.520 --> 49:26.800] who are like activists, like this is part of their core worldview.

[49:26.800 --> 49:28.660] They are much more entrenched.

[49:28.660 --> 49:32.160] And that subgroup, you know, I think behaves differently.

[49:32.160 --> 49:37.800] I would like to see research that tries to tease apart different subgroups within like

[49:37.800 --> 49:39.360] the anti GMO crowd.

[49:39.360 --> 49:43.720] But every study I've seen just treats it all as one homogenous group, which it absolutely

[49:43.720 --> 49:45.000] isn't.

[49:45.000 --> 49:49.680] And then they're just making statistical statements about that one group treating it as if it

[49:49.680 --> 49:54.360] were homogenous, because they checked a box saying I don't trust GM food, you know, on

[49:54.360 --> 49:55.760] a survey.

[49:55.760 --> 50:00.400] And so, you know, the reason why you can convince some of them with information is

[50:00.400 --> 50:06.160] because theirs was a light denial in the first place, based upon misinformation

[50:06.160 --> 50:09.960] that can be corrected, or based upon their own misunderstanding.

[50:09.960 --> 50:13.520] Whereas, you know, with any of these topics, you can have true believers who have a

[50:13.520 --> 50:22.000] deep, rich belief system that's just massively biased through some thick ideological lens

[50:22.000 --> 50:24.000] that they can't even really perceive.

[50:24.000 --> 50:25.000] Absolutely.

[50:25.000 --> 50:26.000] Yeah.

[50:26.000 --> 50:29.440] So I think, you know, there are subgroups within all of these things as well, which

[50:29.440 --> 50:30.680] is interesting.

[50:30.680 --> 50:36.440] And this is, you know, because I write blogs every day, and I can, you know, get engaged

[50:36.440 --> 50:39.360] in the comments, so you see the full range.

[50:39.360 --> 50:44.600] And then all the way to people that are, like, indistinguishable from corporate shills, you

[50:44.600 --> 50:48.360] know. Like, whenever I write about climate change, there are people in the comments where

[50:48.360 --> 50:54.040] I really cannot tell if this person is a paid shill for the oil industry, or if

[50:54.040 --> 50:58.360] they just are somebody who came to believe all of the propaganda and talking points,

[50:58.360 --> 51:02.880] because that's the media ecosystem in which they live, you know.

[51:02.880 --> 51:07.880] So it could be cultural knowledge, it could be, you know, intellectually dishonest, it

[51:07.880 --> 51:11.600] could just be all ideological, it could just be all misinformation.

[51:11.600 --> 51:14.680] Maybe they're trying to be skeptical, and they think this is what it means to be skeptical,

[51:14.680 --> 51:15.680] and they're just wrong.

[51:15.680 --> 51:18.440] You know, it could be all of those things for different people.

[51:18.440 --> 51:23.040] And then, as I think we pointed out before, there's another layer here, which is cognitive

[51:23.040 --> 51:24.160] style, right?

[51:24.160 --> 51:27.960] So then there are people who are intuitive thinkers versus analytical thinkers.

[51:27.960 --> 51:33.920] There are people who are opportunistic conspiracy theorists, and people who are dedicated conspiracy

[51:33.920 --> 51:34.920] theorists.

[51:34.920 --> 51:38.760] For them, it's about the conspiracy, and they believe any conspiracy.

[51:38.760 --> 51:42.560] Other people only believe like the one or two that support their ideology and then

[51:42.560 --> 51:46.020] generally don't believe the other conspiracies.

[51:46.020 --> 51:49.420] So there's multiple layers interacting at the same time here.

[51:49.420 --> 51:52.520] And the research is just kind of at the level where we're just looking

[51:52.520 --> 51:57.760] at these things in the aggregate and looking at maybe the dominant effect for these different

[51:57.760 --> 52:05.100] topics but not really parsing out all the different factors that go into it.

[52:05.100 --> 52:10.080] There are studies looking at the relationship of the cognitive styles as well, but it's

[52:10.080 --> 52:13.080] hard to pull it all together, like to do it all at once in one study.

[52:13.080 --> 52:14.080] You can't.

[52:14.080 --> 52:17.360] You're only kind of looking at it from one angle at a time, which is challenging because

[52:17.360 --> 52:18.680] you have people who are complicated.

[52:18.680 --> 52:19.680] That's the bottom line.

[52:19.680 --> 52:20.920] So what does this mean for us?

[52:20.920 --> 52:27.360] You know, again, one is humility is the cornerstone of scientific skepticism, as always, right?

[52:27.360 --> 52:30.000] I mean, humility would solve a lot of these problems.

[52:30.000 --> 52:32.520] And it doesn't mean that the experts are always right.

[52:32.520 --> 52:36.680] But if you disagree with people who know more than you, that should at least give you massive

[52:36.680 --> 52:38.140] pause.

[52:38.140 --> 52:42.480] If the people who are the experts, you know, have a strong consensus and they're saying,

[52:42.480 --> 52:43.960] yes, this is what we think.

[52:43.960 --> 52:47.840] The planet is warming because industry is putting CO2 into the atmosphere.

[52:47.840 --> 52:49.760] It's clearly happening.

[52:49.760 --> 52:53.560] And if you disagree with them, you'd better be an expert yourself.

[52:53.560 --> 52:55.800] You better have a good reason.

[52:55.800 --> 53:00.400] And that reason is probably not good enough, because if you don't have a deep level of

[53:00.400 --> 53:03.360] technical knowledge, you simply can't play with the big boys.

[53:03.360 --> 53:05.480] I'm sorry, but that's just the way it is.

[53:05.480 --> 53:11.120] You know, it's hard to get people to understand this if they don't have a reasonable level

[53:11.120 --> 53:13.280] of expertise in anything.

[53:13.280 --> 53:19.280] But most people have at least above average knowledge in something, right?

[53:19.280 --> 53:24.120] So what do you think you know more about than anybody, than other people?

[53:24.120 --> 53:26.520] What's the thing you know the most about?

[53:26.520 --> 53:30.640] And compare your knowledge to the average person or the media or how it's portrayed

[53:30.640 --> 53:33.720] on television or whatever, right?

[53:33.720 --> 53:34.720] Nobody else.

[53:34.720 --> 53:39.960] Would you accept the opinion of somebody who has that sort of casual media-derived knowledge

[53:39.960 --> 53:43.120] level about this thing that you know a lot about?

[53:43.120 --> 53:44.480] Would you respect their opinions?

[53:44.480 --> 53:47.120] Are their opinions hopelessly childish and naive?

[53:47.120 --> 53:50.680] Well, that's the way you are about every other topic in the world.

[53:50.680 --> 53:51.680] Right.

[53:51.680 --> 53:57.120] And it's so interesting that there are certain topics where people really love to be armchair

[53:57.120 --> 53:58.120] experts.

[53:58.120 --> 53:59.120] Yeah.

[53:59.120 --> 54:00.120] Some more than others.

[54:00.120 --> 54:01.120] Right.

[54:01.120 --> 54:02.120] Yeah.

[54:02.120 --> 54:03.120] Yeah.

[54:03.120 --> 54:04.120] Totally.

[54:04.120 --> 54:05.840] But you'll find somebody who's willing to be an armchair skeptic about anything, even

[54:05.840 --> 54:06.840] quantum mechanics.

[54:06.840 --> 54:09.800] No, seriously, we've got the quantum quackery.

[54:09.800 --> 54:12.920] They think, oh, yeah, you know, the observer effect, you don't know what the hell you're

[54:12.920 --> 54:13.920] talking about.

[54:13.920 --> 54:14.920] Just be quiet.

[54:14.920 --> 54:22.760] But then, you know, of course, on social media, 99.9% of the conversations

[54:22.760 --> 54:26.480] are happening among non-experts, right?

[54:26.480 --> 54:30.800] And the only thing we really should be talking about is trying to understand what the consensus

[54:30.800 --> 54:32.200] of expert opinion is.

[54:32.200 --> 54:34.880] And everything else is bullshit, right?

[54:34.880 --> 54:39.960] Everything else is just children playing and thinking that they're contributing to the

[54:39.960 --> 54:42.920] cutting edge knowledge of the world, you know?

[54:42.920 --> 54:49.760] Yeah, so that lack of humility, I think, is at the core of all science denial.

[54:49.760 --> 54:54.400] And that's a good first step, I think, to dealing with a lot of it as well.

Monkeypox (54:54)


[54:54.400 --> 54:57.480] All right, Cara, tell us about the monkeypox.

[54:57.480 --> 55:03.760] Yeah, so we, I don't know if you guys remember, but we did a monkeypox piece maybe like three

[55:03.760 --> 55:08.500] months ago when the cases in the US first started cropping up.

[55:08.500 --> 55:10.360] We went ahead and covered monkeypox.

[55:10.360 --> 55:15.760] And so I want to direct everybody to our archive for the deep dive about the background of

[55:15.760 --> 55:17.200] monkeypox.

[55:17.200 --> 55:23.500] For those of you who don't know what this newly declared PHEIC, which stands for Public Health

[55:23.500 --> 55:29.280] Emergency of International Concern, really means, that's basically the WHO's highest

[55:29.280 --> 55:31.000] alarm.

[55:31.000 --> 55:33.760] What I want to do is I want to talk about what's going on worldwide, what's going on

[55:33.760 --> 55:38.720] in the US, and then maybe bust a few of the myths around monkeypox.

[55:38.720 --> 55:48.040] So there is an active ongoing CDC page for monkeypox that gives a case count and a map.

[55:48.040 --> 55:57.600] So there have been 3,591 confirmed monkeypox cases within the United States within this

[55:57.600 --> 55:59.920] current outbreak.

[55:59.920 --> 56:02.320] And you can look state by state where they are.

[56:02.320 --> 56:08.380] The vast majority are where you would expect: California, Florida, actually Illinois has

[56:08.380 --> 56:12.600] a big outbreak, so maybe they're not all where you'd expect, New York, Texas.

[56:12.600 --> 56:15.480] So larger population centers, larger outbreaks.

[56:15.480 --> 56:17.760] A lot of people are scared.

[56:17.760 --> 56:22.180] And I think what we want to do is we want to find that balance, like we did the last

[56:22.180 --> 56:30.480] time we talked about this, between being cautious and vigilant, but also not panicking.

[56:30.480 --> 56:35.660] So let's talk a little bit first about, again, a review of what monkeypox is.

[56:35.660 --> 56:39.460] So monkeypox is an orthopoxvirus.

[56:39.460 --> 56:41.920] It is not similar to chickenpox.

[56:41.920 --> 56:43.480] It's not similar to varicella.

[56:43.480 --> 56:46.400] It's similar to smallpox.

[56:46.400 --> 56:52.880] But the thing that's really important to remember is that it is significantly less deadly than

[56:52.880 --> 56:53.880] smallpox.

[56:53.880 --> 56:57.560] It is less transmissible and less deadly.

[56:57.560 --> 57:02.560] And there are two different kinds of monkeypox, two different, sort of, evolutionary

[57:02.560 --> 57:08.760] paths that the virus took, and the type that we are seeing globally in this outbreak right

[57:08.760 --> 57:13.300] now is the less transmissible, less deadly type.

[57:13.300 --> 57:17.600] So it's 99% plus survivability.

[57:17.600 --> 57:23.880] And so far of these almost 4,000 cases that are here in the US, we have not seen a single

[57:23.880 --> 57:24.880] death.

[57:24.880 --> 57:27.000] So this is the good news.

[57:27.000 --> 57:32.120] The bad news is we're seeing this cropping up all over the globe.

[57:32.120 --> 57:41.080] It's now been identified in 80 different countries, and this was only endemic in Africa previously.

[57:41.080 --> 57:43.700] So here's where things get scary.

[57:43.700 --> 57:51.480] If we can nip this in the bud, we can prevent this from becoming an endemic disease in other

[57:51.480 --> 57:52.800] countries.

[57:52.800 --> 57:56.200] We know it's always going to be, well, we shouldn't say always, but we know it's endemic

[57:56.200 --> 57:59.820] in Africa because there are animal reservoirs for this virus.

[57:59.820 --> 58:07.000] So far, we don't have endemic animal reservoirs in these other countries, but we could.

[58:07.000 --> 58:10.900] Because if enough people catch this and spread it around to each other, we're also going

[58:10.900 --> 58:11.900] to spread it to animals.

[58:11.900 --> 58:17.120] We're going to have these backwards spillovers, and we may see zoonosis going the other way.

[58:17.120 --> 58:22.860] And if that happens, there may be this continuing reservoir that exists in nature.

[58:22.860 --> 58:27.120] So we will constantly have spillover events, and people will constantly get sick with this

[58:27.120 --> 58:28.120] virus.

[58:28.120 --> 58:29.580] We don't want that to happen.

[58:29.580 --> 58:31.860] We really don't want that to happen.

[58:31.860 --> 58:35.540] So that's one thing to be really mindful of, and it's one of the reasons why it's so important

[58:35.540 --> 58:37.000] to get out in front of this.
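
(Aside for the curious: the reservoir dynamic described above can be made concrete with a toy simulation. Every number below is invented purely for illustration; this is a sketch of the concept, not a model of monkeypox. The point is that an outbreak whose human-to-human transmission alone would burn out can still produce a steady stream of cases if an animal reservoir keeps reseeding new spillover infections.)

```python
# Toy sketch (illustrative numbers only, not a real monkeypox model).
# With an effective reproduction number below 1, human-to-human chains
# die out on their own; an animal reservoir that keeps reintroducing
# cases ("spillover") sustains transmission indefinitely.
import random

def simulate(weeks, r, spillovers_per_week, initial_cases=20, seed=42):
    rng = random.Random(seed)
    active, total = initial_cases, 0
    for _ in range(weeks):
        # each active case causes one new case with probability r (crude, r <= 1)
        new = sum(1 for _ in range(active) if rng.random() < r)
        active = new + spillovers_per_week  # the reservoir reseeds chains
        total += active
    return total

print("no reservoir:  ", simulate(52, r=0.8, spillovers_per_week=0))
print("with reservoir:", simulate(52, r=0.8, spillovers_per_week=3))
```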

[58:37.000 --> 58:40.120] Obviously, we're concerned about health and human safety.

[58:40.120 --> 58:43.620] The good news is, again, monkeypox is not very fatal.

[58:43.620 --> 58:51.940] The bad news is some studies have shown fatality among sensitive or vulnerable groups.

[58:51.940 --> 58:56.880] And specifically, we're talking about children under the age of eight and pregnant women

[58:56.880 --> 58:58.680] or women who are nursing.

[58:58.680 --> 59:03.860] These seem to be more vulnerable groups, as well as those who are immunocompromised.

[59:03.860 --> 59:09.240] And sadly, those are also groups where there's still some question as to whether the vaccine

[59:09.240 --> 59:10.800] is safe for them to get.

[59:10.800 --> 59:14.720] Now, the good news is there is a monkeypox vaccine.

[59:14.720 --> 59:20.500] So not only do we have smallpox vaccines, which are, by some estimates, up to 85% effective

[59:20.500 --> 59:27.240] against monkeypox because the viruses are so similar, but there is also one FDA-approved

[59:27.240 --> 59:28.920] monkeypox vaccine.

[59:28.920 --> 59:37.360] Okay, so it's called JYNNEOS, J-Y-N-N-E-O-S, I'm not sure how to pronounce it,

[59:37.360 --> 59:44.360] and the CDC has ordered, or I should say the US government has ordered, well over, I think

[59:44.360 --> 59:51.880] around 300,000 monkeypox vaccines so far, and there are about 800,000 that are going

[59:51.880 --> 59:54.940] to be on order very soon.

[59:54.940 --> 59:59.580] And not only is that vaccine, I shouldn't say readily available because it's nowhere

[59:59.580 --> 01:00:04.280] near readily available, but not only does that vaccine exist and it's on its way and

[01:00:04.280 --> 01:00:11.720] we do have some doses, there's also an antiviral called TPOXX (tecovirimat), which is already in the national

[01:00:11.720 --> 01:00:12.720] stockpile.

[01:00:12.720 --> 01:00:18.640] So there's 1.3 million doses of TPOXX here in the US within the national stockpile, which

[01:00:18.640 --> 01:00:23.120] different agencies and organizations are working on trying to get out to providers across the

[01:00:23.120 --> 01:00:24.120] country.

[01:00:24.120 --> 01:00:27.520] Now, this feels very America-centric because it is, because this is where I'm doing the

[01:00:27.520 --> 01:00:33.120] vast majority of my research, unfortunately the outbreak in America seems to be the worst

[01:00:33.120 --> 01:00:34.120] outbreak.

[01:00:34.120 --> 01:00:38.960] I think there's something like, it's so hard because every day the numbers are rising,

[01:00:38.960 --> 01:00:45.600] something like 18,000 cases worldwide, close to 4,000 of those, or just over 3,500 of those

[01:00:45.600 --> 01:00:51.600] are right here in the US, and like I said, it's been in 80 countries so far, and I think

[01:00:51.600 --> 01:00:56.920] the top four countries, so the US, Germany, Spain, and the UK are the places where the

[01:00:56.920 --> 01:00:59.280] case count is just going up, up, up, up, up.

[01:00:59.280 --> 01:01:04.780] So let's talk a little bit about some of the myths around monkeypox because I think it's

[01:01:04.780 --> 01:01:06.400] really important that we do that.

[01:01:06.400 --> 01:01:12.440] There is a real risk, a real, real, real, real, real risk of stigma with this disease

[01:01:12.440 --> 01:01:17.360] and the reason for it has everything to do with the current epidemiological pattern and

[01:01:17.360 --> 01:01:22.240] nothing to do with the virus itself.

[01:01:22.240 --> 01:01:26.440] And what I'm talking about is that the vast majority of case counts right now are among

[01:01:26.440 --> 01:01:28.040] men who have sex with men.

[01:01:28.040 --> 01:01:36.020] So gay and bisexual men, men who have close sexual contact with other men seem to be those

[01:01:36.020 --> 01:01:39.240] who are overwhelmingly affected right now.

[01:01:39.240 --> 01:01:44.560] They're not the only people who can be affected and they're not the only people who are affected.

[01:01:44.560 --> 01:01:50.960] We've already seen here in the US cis hetero women, we've already seen sadly two children

[01:01:50.960 --> 01:01:56.640] with a confirmed diagnosis, but something like over 95% of the global cases so far have

[01:01:56.640 --> 01:02:00.000] been amongst men who have sex with men.

[01:02:00.000 --> 01:02:06.120] And I fear, and you see this a lot from different experts online, that there is going to be

[01:02:06.120 --> 01:02:10.700] a repetition of what we saw early on with the HIV/AIDS crisis, right?

[01:02:10.700 --> 01:02:13.800] This is not a discriminating virus.

[01:02:13.800 --> 01:02:19.080] It just happens to be the case that this is where the outbreak perhaps has started.

[01:02:19.080 --> 01:02:21.780] It won't be contained within this group.

[01:02:21.780 --> 01:02:26.800] And so it's important, yes, that there are targeted efforts, epidemiologic and public

[01:02:26.800 --> 01:02:32.340] health efforts to ensure that vaccines are available to high risk individuals, to ensure

[01:02:32.340 --> 01:02:37.580] that antiviral medications are available and that screening is happening.

[01:02:37.580 --> 01:02:39.360] But we have to also be really wary.

[01:02:39.360 --> 01:02:41.940] It's not going to stay within this group of individuals.

[01:02:41.940 --> 01:02:43.240] It's not a gay disease.

[01:02:43.240 --> 01:02:46.200] It's not a sexually transmitted infection.

[01:02:46.200 --> 01:02:51.960] This is a disease that is spread through close intimate contact and also through prolonged

[01:02:51.960 --> 01:02:53.200] respiratory contact.

[01:02:53.200 --> 01:02:56.400] So you can get it through skin to skin contact.

[01:02:56.400 --> 01:03:01.160] You can get it through respiratory droplets and you can even get it through clothing and

[01:03:01.160 --> 01:03:03.480] bedding of infected individuals.

[01:03:03.480 --> 01:03:08.360] So we do need to make sure that if there is a suspected case that there's safe handling,

[01:03:08.360 --> 01:03:15.640] that there's quarantining going into effect and safe handling of these fomites, basically.

[01:03:15.640 --> 01:03:20.120] But it's so, so important that we understand that this is not just a risk for men who have

[01:03:20.120 --> 01:03:21.640] sex with men.

[01:03:21.640 --> 01:03:28.660] We also know that we can give a vaccine immediately after infection.

[01:03:28.660 --> 01:03:31.680] It's probably not the best option available, though.

[01:03:31.680 --> 01:03:35.100] The best option available is contact tracing.

[01:03:35.100 --> 01:03:39.480] It's figuring out who's been exposed, quarantining those who have been exposed and preventing

[01:03:39.480 --> 01:03:41.500] the spread at the source.

[01:03:41.500 --> 01:03:45.360] We do know that there are post-infection prophylactics available.

[01:03:45.360 --> 01:03:49.800] We also know that there are antivirals, but we don't want to be trying to contain this

[01:03:49.800 --> 01:03:51.240] after the fact.

[01:03:51.240 --> 01:03:56.220] The most important thing to do from a public health perspective is to, you know, utilize

[01:03:56.220 --> 01:03:59.700] these preventive measures.

[01:03:59.700 --> 01:04:03.960] Maybe I should go over really, really quickly what the symptoms are: headache, acute onset

[01:04:03.960 --> 01:04:08.580] of fever, chills, muscle and body aches, tiredness.

[01:04:08.580 --> 01:04:14.520] One of the things that's different about monkeypox from smallpox is really swollen lymph nodes.

[01:04:14.520 --> 01:04:15.520] This is common.

[01:04:15.520 --> 01:04:18.400] And the other thing that's interesting is, yes, there is a rash.

[01:04:18.400 --> 01:04:20.480] And we talked about this last time.

[01:04:20.480 --> 01:04:22.160] Sometimes that rash can be really beneficial, right?

[01:04:22.160 --> 01:04:24.280] Oh, gosh, look, there's a monkeypox rash.

[01:04:24.280 --> 01:04:25.560] This is probably monkeypox.

[01:04:25.560 --> 01:04:29.640] Sadly, what we're seeing is that there are a lot of cases of individuals where the rash

[01:04:29.640 --> 01:04:31.560] isn't very obvious.

[01:04:31.560 --> 01:04:35.960] So they might only have one lesion and that lesion may be in the mouth or it may be around

[01:04:35.960 --> 01:04:38.100] the genitals or it may be around the anus.

[01:04:38.100 --> 01:04:43.160] And sadly, it's co-occurring very often with STIs.

[01:04:43.160 --> 01:04:46.040] Yeah, so it might look like syphilis.

[01:04:46.040 --> 01:04:49.060] It might look like another STI.

[01:04:49.060 --> 01:04:55.600] And so it's really important that we don't minimize the risk of monkeypox in this sort

[01:04:55.600 --> 01:05:02.680] of abnormal presentation because not everybody gets the pox, the scabbing lesions that happen

[01:05:02.680 --> 01:05:04.540] all over the entire body.

[01:05:04.540 --> 01:05:09.680] I think it's also important to note that the rash or the actual course of the infection

[01:05:09.680 --> 01:05:12.420] itself, it usually lasts from three to four weeks.

[01:05:12.420 --> 01:05:14.820] It usually will resolve on its own.

[01:05:14.820 --> 01:05:20.560] So in healthy individuals with strong immune systems, the best course of action is quarantine.

[01:05:20.560 --> 01:05:23.080] Limit the spread, get better on your own.

[01:05:23.080 --> 01:05:27.680] But there are drugs available for individuals who don't have healthy immune systems for

[01:05:27.680 --> 01:05:30.120] individuals who do have more severe cases.

[01:05:30.120 --> 01:05:34.860] And of course, for individuals who are at greatest risk at this point, we're talking about

[01:05:34.860 --> 01:05:40.600] men who have sex with men and health care providers who are at risk of catching this

[01:05:40.600 --> 01:05:41.600] disease.

[01:05:41.600 --> 01:05:46.640] But of course, those targeted groups will likely change if we don't kind of manage this

[01:05:46.640 --> 01:05:47.640] outbreak.

[01:05:47.640 --> 01:05:49.720] Now, a few things also to remember.

[01:05:49.720 --> 01:05:51.760] We're not calling this an epidemic.

[01:05:51.760 --> 01:05:53.700] It's definitely not a pandemic.

[01:05:53.700 --> 01:05:55.480] And it is not the same as COVID.

[01:05:55.480 --> 01:05:57.660] And I think we all have a lot of fear.

[01:05:57.660 --> 01:06:01.280] And we all have a lot of concern because of what we just went through with COVID.

[01:06:01.280 --> 01:06:04.240] Like, oh, God, not monkeypox, not now.

[01:06:04.240 --> 01:06:06.480] There's a good chance this won't be anything like COVID.

[01:06:06.480 --> 01:06:08.080] It's a completely different virus.

[01:06:08.080 --> 01:06:11.060] It's a DNA virus, first of all, so it doesn't spread the same way.

[01:06:11.060 --> 01:06:13.320] It doesn't evolve the same way.

[01:06:13.320 --> 01:06:17.680] Also, this is a disease we've known about for over 50 years.

[01:06:17.680 --> 01:06:19.660] We've had vaccines available.

[01:06:19.660 --> 01:06:21.560] We've had these antivirals available.

[01:06:21.560 --> 01:06:24.360] This is not a novel virus.

[01:06:24.360 --> 01:06:29.840] What we do need to have learned, and sadly, we're not really doing the best job quite

[01:06:29.840 --> 01:06:34.760] yet, what we should have learned from COVID is how quick our response needs to be for

[01:06:34.760 --> 01:06:37.560] containment, how organized and how quick it needs to be.

[01:06:37.560 --> 01:06:40.080] That is one thing we can learn from COVID.

[01:06:40.080 --> 01:06:45.400] But I think it's really dangerous to try to view this as though it's the same thing as

[01:06:45.400 --> 01:06:51.520] COVID because it's a completely different disease, completely different transmissibility,

[01:06:51.520 --> 01:06:54.880] completely different mortality and morbidity.

[01:06:54.880 --> 01:06:57.600] For that reason, I would say don't panic.

[01:06:57.600 --> 01:07:00.960] If you are in one of these high-risk groups, make sure that you're educated about this

[01:07:00.960 --> 01:07:05.720] and that you know what to look for and that you're protecting yourself.

[01:07:05.720 --> 01:07:11.000] Of course, let's hope that we can get our public health act together here and across

[01:07:11.000 --> 01:07:16.000] the world so that this does not become endemic in other parts of the globe.

[01:07:16.000 --> 01:07:22.400] Yeah, and I think the one thing I want to emphasize is that this is spread by intimate

[01:07:22.400 --> 01:07:25.880] contact, which includes but is not limited to sexual contact.

[01:07:25.880 --> 01:07:28.520] So do not think of this as a sexually transmitted disease.

[01:07:28.520 --> 01:07:29.520] Exactly.

[01:07:29.520 --> 01:07:33.920] It's different from an STI in that we still don't know if it's spread by semen or vaginal

[01:07:33.920 --> 01:07:38.320] fluid, which is kind of like fundamental to an STI.

[01:07:38.320 --> 01:07:39.840] This is spread by close intimate contact.

[01:07:39.840 --> 01:07:44.640] Of course, if you're having sex with somebody, you're having close intimate contact.

[01:07:44.640 --> 01:07:49.920] But also, if you live in a home with somebody, you're likely going to be touching the same

[01:07:49.920 --> 01:07:54.560] fabrics as them and using the same bathrooms and things like that.

[01:07:54.560 --> 01:07:59.840] But it's unlikely, not impossible, but it's unlikely that if you are dining in a restaurant

[01:07:59.840 --> 01:08:03.360] and somebody across the restaurant has monkeypox, that you are going to catch it.

[01:08:03.360 --> 01:08:09.620] It just doesn't spread the same way that respiratory, like standard respiratory viruses spread.

[01:08:09.620 --> 01:08:11.240] So this isn't COVID part two.

[01:08:11.240 --> 01:08:12.960] This is a totally different thing.

[01:08:12.960 --> 01:08:14.760] All right, thanks, Cara.

Who's That Noisy? (1:08:15)


New Noisy (1:15:50)

[low-quality audio of two men in conversation in English about photography]


[01:08:14.760 --> 01:08:17.280] All right, Jay, it's Who's That Noisy Time.

[01:08:17.280 --> 01:08:21.520] All right, guys, last week I played this noisy.

[01:08:21.520 --> 01:08:39.080] I mean, I know there's something very recognizable there, right?

[01:08:39.080 --> 01:08:40.160] What are you hearing, guys?

[01:08:40.160 --> 01:08:44.120] I hear a calliope playing jazz music almost in a way.

[01:08:44.120 --> 01:08:47.340] It's like it's almost a song, but it's sloppy and there's something wrong with it.

[01:08:47.340 --> 01:08:48.340] That's interesting.

[01:08:48.340 --> 01:08:54.080] I guess my brain is so primed that I pretty clearly hear exactly what it is, what song

[01:08:54.080 --> 01:08:55.080] it is.

[01:08:55.080 --> 01:09:01.040] I got a lot of dead-on correct guesses this week, so I don't have that many completely

[01:09:01.040 --> 01:09:05.880] off the rails guesses here, but there is something very interesting to talk about.

[01:09:05.880 --> 01:09:10.500] A listener named Ramsort wrote in and said, hey, Jay, that sounds like Toto's Africa as

[01:09:10.500 --> 01:09:15.880] interpreted by an AI that was given the lyrics and other generalities about the song.

[01:09:15.880 --> 01:09:22.440] Right out of the gate, yes, that is the song Africa by Toto, but there's something that

[01:09:22.440 --> 01:09:23.600] has been done to it.

[01:09:23.600 --> 01:09:29.240] So this particular person wrote in that it was interpreted by an AI, but that is not

[01:09:29.240 --> 01:09:32.520] correct, but that is a fantastic guess because I would have guessed that.

[01:09:32.520 --> 01:09:38.000] Now I'm going to get right to the person who won and then someone else who got the answer

[01:09:38.000 --> 01:09:39.040] dead-on correct.

[01:09:39.040 --> 01:09:43.760] So the person who won this week is Brian Fort and he said, yeah, Jay, this sounds like Africa

[01:09:43.760 --> 01:09:46.460] by Toto played only on piano.

[01:09:46.460 --> 01:09:50.440] It sounds like the voice is being produced by specific combinations of strange piano

[01:09:50.440 --> 01:09:51.440] keys.

[01:09:51.440 --> 01:09:55.880] Now that is very correct, but there is a little bit of missing information here.

[01:09:55.880 --> 01:10:00.680] So another listener named Kier Nathan said, hi, I'm thinking this week's noisy is a MIDI

[01:10:00.680 --> 01:10:04.680] piano file of Toto's Africa.

[01:10:04.680 --> 01:10:11.520] So what has happened is they've taken the song Africa by Toto and they have imported

[01:10:11.520 --> 01:10:14.320] it into the MIDI format.

[01:10:14.320 --> 01:10:21.020] So it went from MP3 to MIDI, and the computer hears everything that's happening and it translates

[01:10:21.020 --> 01:10:24.860] that into MIDI notes, right?

[01:10:24.860 --> 01:10:30.420] The only instrument that the programmer allowed to create the music

[01:10:30.420 --> 01:10:32.280] was a piano in MIDI.

[01:10:32.280 --> 01:10:39.280] So now when you turn that back into an MP3 and you play it back, you're hearing an interpretation

[01:10:39.280 --> 01:10:42.600] of that song only executed on a piano.

[01:10:42.600 --> 01:10:43.600] Do you understand now?

[01:10:43.600 --> 01:10:44.600] Yeah.

[01:10:44.600 --> 01:10:45.600] Yes.

[01:10:45.600 --> 01:10:46.600] All right.
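
(Aside for the technically curious: the MP3-to-MIDI step described here can be approximated with open-source tools. Below is a minimal sketch in Python, assuming the librosa and pretty_midi libraries; "africa.mp3" is a placeholder file name. Unlike the converter Jay describes, which transcribes every part polyphonically, which is exactly why the result is so dense and noisy, this toy version tracks only a single melodic line.)

```python
# Minimal audio-to-MIDI sketch using librosa (pitch tracking) and
# pretty_midi (MIDI writing). "africa.mp3" is a placeholder file name.
import librosa
import pretty_midi

y, sr = librosa.load("africa.mp3")

# pYIN fundamental-frequency estimate per analysis frame
f0, voiced, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"),
                             fmax=librosa.note_to_hz("C7"), sr=sr)
times = librosa.times_like(f0, sr=sr)

pm = pretty_midi.PrettyMIDI()
piano = pretty_midi.Instrument(program=0)  # General MIDI: Acoustic Grand Piano

# Merge consecutive voiced frames with the same rounded pitch into one note
start, pitch = None, None
for t, f, v in zip(times, f0, voiced):
    p = int(round(librosa.hz_to_midi(f))) if v else None
    if p != pitch:
        if pitch is not None:
            piano.notes.append(pretty_midi.Note(velocity=90, pitch=pitch,
                                                start=start, end=t))
        start, pitch = t, p
if pitch is not None:  # flush a note still sounding at the end
    piano.notes.append(pretty_midi.Note(velocity=90, pitch=pitch,
                                        start=start, end=times[-1]))

pm.instruments.append(piano)
pm.write("africa_piano.mid")  # render with any General MIDI piano to hear it
```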

[01:10:46.600 --> 01:10:47.600] So let me play it to you.

[01:10:47.600 --> 01:10:52.800] Let me help prime everybody that's listening to this by playing the exact clip, right?

[01:10:52.800 --> 01:10:55.840] It's the same exact segment of song, but it's the original.

[01:10:55.840 --> 01:11:16.200] You'll recognize this or most of you will, okay?

[01:11:16.200 --> 01:11:17.840] So now you have that song.

[01:11:17.840 --> 01:11:23.680] Now a computer is taking all of the music, the percussion, every, every little thing

[01:11:23.680 --> 01:11:29.980] that's in that song, the voice, the other instruments, a guitar, and it's trying to

[01:11:29.980 --> 01:11:31.520] simulate it on the keyboard.

[01:11:31.520 --> 01:11:36.240] It's trying to play all of that on a keyboard, but it's not limited to what 10 fingers can

[01:11:36.240 --> 01:11:37.240] do.

[01:11:37.240 --> 01:11:39.240] It could play any note it wants at any time.

[01:11:39.240 --> 01:11:40.240] Right?

[01:11:40.240 --> 01:11:43.400] So now, now listen to how this was, what's the right word?

[01:11:43.400 --> 01:11:45.540] Like, you know, interpolated.

[01:11:45.540 --> 01:11:46.540] Like I can't remember.

[01:11:46.540 --> 01:11:50.400] There is a word that fits this very well that my brain is trying to find.

[01:11:50.400 --> 01:11:52.400] I think it is interpolation.

[01:11:52.400 --> 01:11:53.400] Transposed.

[01:11:53.400 --> 01:11:54.400] Transposing.

[01:11:54.400 --> 01:11:55.400] Yeah.

[01:11:55.400 --> 01:11:56.400] Transposing.

[01:11:56.400 --> 01:11:57.400] Manipulation.

[01:11:57.400 --> 01:11:58.400] Yeah, maybe.

[01:11:58.400 --> 01:12:16.920] So here it is completely executed on a piano.

[01:12:16.920 --> 01:12:18.240] Do you hear the voice in there?

[01:12:18.240 --> 01:12:19.240] Yeah.

[01:12:19.240 --> 01:12:25.280] It's really odd because it's like, it's not like just a player piano version of Africa.

[01:12:25.280 --> 01:12:30.080] It's because every part is being played without any discrimination for how loud they should

[01:12:30.080 --> 01:12:31.080] be.

[01:12:31.080 --> 01:12:32.080] It's noisy.

[01:12:32.080 --> 01:12:33.080] Oh, hell yeah.

[01:12:33.080 --> 01:12:34.080] It's very noisy.

[01:12:34.080 --> 01:12:35.080] Yeah.

[01:12:35.080 --> 01:12:40.520] Because there isn't a lot of subtlety in the volumes of the different notes and everything.

[01:12:40.520 --> 01:12:49.760] But the incredible part to this is that a human voice can be simulated by playing lots

[01:12:49.760 --> 01:12:53.020] of different notes on a piano at the same time, right?

[01:12:53.020 --> 01:12:56.640] Because a human voice isn't just this singular sound.

[01:12:56.640 --> 01:12:57.840] There's overtones.

[01:12:57.840 --> 01:13:03.440] There's all sorts of resonances happening that actually make up the timbre of each of

[01:13:03.440 --> 01:13:04.440] our voices.

[01:13:04.440 --> 01:13:06.120] So now you take that and this piano-

[01:13:06.120 --> 01:13:07.120] That kid's dead.

[01:13:07.120 --> 01:13:13.440] I mean, it could be playing half the keys on a piano at the same time in different combinations

[01:13:13.440 --> 01:13:15.840] to simulate a human voice.
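
(Aside: the overtone point can be illustrated with a few lines of additive synthesis. The harmonic weights below are invented for illustration; the idea is that the same fundamental pitch takes on a different timbre depending on how strongly its overtones are weighted, which is what sounding many piano keys at once exploits.)

```python
# Additive synthesis toy: timbre lives in the overtone mix.
# Weights below are invented for illustration; a real voice also has
# formant resonances shaping which overtones are emphasized.
import numpy as np

sr = 44100                                  # sample rate (Hz)
t = np.arange(sr) / sr                      # one second of sample times
f0 = 220.0                                  # fundamental frequency (Hz)

def tone(harmonic_weights):
    """Sum sine partials at integer multiples of f0, then normalize."""
    out = np.zeros_like(t)
    for n, w in enumerate(harmonic_weights, start=1):
        out += w * np.sin(2 * np.pi * n * f0 * t)
    return out / np.max(np.abs(out))

pure = tone([1.0])                           # bare sine: dull, whistle-like
rich = tone([1.0, 0.6, 0.4, 0.3, 0.2, 0.1]) # same pitch, brighter timbre

# To hear the difference, write both to WAV (e.g. with scipy):
# from scipy.io import wavfile
# wavfile.write("pure.wav", sr, (pure * 32767).astype(np.int16))
# wavfile.write("rich.wav", sr, (rich * 32767).astype(np.int16))
```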

[01:13:15.840 --> 01:13:18.000] God, that is so cool.

[01:13:18.000 --> 01:13:23.920] That is so interesting that that instrument could somehow, you could hear a voice in there.

[01:13:23.920 --> 01:13:24.920] I can't really.

[01:13:24.920 --> 01:13:26.360] I don't think it sounded like a voice.

[01:13:26.360 --> 01:13:33.560] I think, well, Cara, if I played a version of this, of a song that you knew well, where

[01:13:33.560 --> 01:13:35.800] you recognize the lyrics and the melody instantly-

[01:13:35.800 --> 01:13:37.360] I know Africa.

[01:13:37.360 --> 01:13:38.920] I love that song.

[01:13:38.920 --> 01:13:39.920] Hold on now.

[01:13:39.920 --> 01:13:40.920] Let me play it again.

[01:13:40.920 --> 01:13:42.560] Let me play it again.

[01:13:42.560 --> 01:13:43.920] Listen again.

[01:13:43.920 --> 01:13:50.920] Get your ears involved.

[01:13:50.920 --> 01:14:00.080] Yeah.

[01:14:00.080 --> 01:14:01.080] I hear his voice.

[01:14:01.080 --> 01:14:04.800] No, I just hear the music from the treble clef.

[01:14:04.800 --> 01:14:05.800] Yeah, me too.

[01:14:05.800 --> 01:14:06.800] The treble clef line.

[01:14:06.800 --> 01:14:07.800] Yeah.

[01:14:07.800 --> 01:14:08.800] I don't hear a voice.

[01:14:08.800 --> 01:14:10.880] I think that the instruments that best emulate voices are theremins.

[01:14:10.880 --> 01:14:12.240] Oh, yeah.

[01:14:12.240 --> 01:14:15.360] It's amazing how much a theremin can sound like a voice.

[01:14:15.360 --> 01:14:16.360] Yeah.

[01:14:16.360 --> 01:14:22.040] I agree, Cara, that there is something about the way that overtones are used in the theremin.

[01:14:22.040 --> 01:14:25.840] If you heard this, there might be a loss of fidelity here because-

[01:14:25.840 --> 01:14:26.840] Yeah.

[01:14:26.840 --> 01:14:27.840] I think there is for sure.

[01:14:27.840 --> 01:14:29.360] Anyway, just a very cool thing.

[01:14:29.360 --> 01:14:33.680] I've played something similar to this, I think, seven, eight plus years ago.

[01:14:33.680 --> 01:14:35.160] There was another like talking piano.

[01:14:35.160 --> 01:14:36.160] Actually, I have it right here.

[01:14:36.160 --> 01:14:37.680] Let me play this for you.

[01:14:37.680 --> 01:14:41.160] This is such an interesting phenomenon.

[01:14:41.160 --> 01:14:58.560] [plays talking-piano clip simulating a human voice]

[01:14:58.560 --> 01:14:59.560] You hear that?

[01:14:59.560 --> 01:15:00.560] That absolutely sounds like a voice.

[01:15:00.560 --> 01:15:01.560] I can hear it.

[01:15:01.560 --> 01:15:02.560] Yeah, I can hear it.

[01:15:02.560 --> 01:15:07.200] It sounds like a creepy, distorted ghost voice, but it definitely sounds like a voice.

[01:15:07.200 --> 01:15:10.040] There's no music being played.

[01:15:10.040 --> 01:15:14.600] It is just trying to do a human voice.

[01:15:14.600 --> 01:15:19.280] When you hear that, it's unbelievable what's happening on the instrument.

[01:15:19.280 --> 01:15:24.320] I watched a video that shows you all the keys that are being played, and they're being drummed

[01:15:24.320 --> 01:15:25.320] on.

[01:15:25.320 --> 01:15:26.320] You know what I mean?

[01:15:26.320 --> 01:15:29.440] It's not just like someone going, blah, playing a chord on the piano.

[01:15:29.440 --> 01:15:34.200] There's a lot of rhythm in there to help simulate the human voice.

[01:15:34.200 --> 01:15:35.200] Fascinating.

[01:15:35.200 --> 01:15:38.000] Just incredible what you could do with a piano.

[01:15:38.000 --> 01:15:40.960] So anyway, very cool, noisy.

[01:15:40.960 --> 01:15:49.880] One of my all-time favorites, sent in by Marcel Janssens, J-A-N-S-S-E-N-S, Janssens.

[01:15:49.880 --> 01:15:51.400] I have a new noisy for you guys.

[01:15:51.400 --> 01:15:54.160] This one was sent in by a listener named Melissa Rockhill.

[plays new Noisy: low-quality audio of two men in conversation in English about photography]

[01:16:08.960 --> 01:16:09.960] Okay, the answer can't be men talking.

[01:16:09.960 --> 01:16:33.640] You have to give me the context, tell me exactly who, what, where, when on this one.

[01:16:33.640 --> 01:16:37.840] So guys, if you enjoyed Who's That Noisy this week and you want to help, you could send

[01:16:37.840 --> 01:16:42.440] me some cool noisies that you hear in your life or you find on the web, or if you want

[01:16:42.440 --> 01:16:48.760] to try to answer this week's Who's That Noisy, you can email me at

[01:16:48.760 --> 01:16:49.760] WTN@theskepticsguide.org.

Announcements (1:16:50)

  • Upcoming NECSS

[01:16:49.760 --> 01:16:52.500] Steve, we are so close to NECSS.

[01:16:52.500 --> 01:16:55.400] It's coming up like a freight train.

[01:16:55.400 --> 01:16:59.720] As we record this, we are a week and a half away.

[01:16:59.720 --> 01:17:06.200] I am very excited about this year because we have a very interesting topic.

[01:17:06.200 --> 01:17:08.920] It's called Navigating the Misinformation Apocalypse.

[01:17:08.920 --> 01:17:15.400] We're going to have people talking about lots of different things about how we as consumers

[01:17:15.400 --> 01:17:21.880] have to digest misinformation that that is being hurled at us at near light speeds on

[01:17:21.880 --> 01:17:23.420] a daily basis.

[01:17:23.420 --> 01:17:28.920] Lots of different speakers talking about this topic, from very, very different angles.

[01:17:28.920 --> 01:17:32.680] We have a keynote with Bill Nye and David Copperfield.

[01:17:32.680 --> 01:17:37.520] The two of them have a very interesting conversation about lots of different things, but they do

[01:17:37.520 --> 01:17:39.240] talk about misinformation.

[01:17:39.240 --> 01:17:43.960] They talk about their own experiences with misinformation and they just really go

[01:17:43.960 --> 01:17:46.820] into a lot of different places in their conversation.

[01:17:46.820 --> 01:17:49.780] What a wonderful time we had recording that with them.

[01:17:49.780 --> 01:17:51.300] So please do join us.

[01:17:51.300 --> 01:17:57.280] You can go to NECSS.org to buy tickets and find out more information about the list

[01:17:57.280 --> 01:17:58.840] of speakers that we have.

[01:17:58.840 --> 01:18:00.240] So please do join us this year.

[01:18:00.240 --> 01:18:01.240] All right.

[01:18:01.240 --> 01:18:02.240] Thanks, Jay.

Questions/Emails/Corrections/Follow-ups (1:18:02)

[01:18:02.240 --> 01:18:04.360] We're going to do one email this week.

[01:18:04.360 --> 01:18:11.440] We had a few emails responding to Kara's discussion of her medical experience last week.

[01:18:11.440 --> 01:18:17.480] I'm not going to read any individual one, but basically the point was that

[01:18:17.480 --> 01:18:24.960] we need to do a deeper dive on this whole idea of self-advocacy versus following the

[01:18:24.960 --> 01:18:29.240] advice of your physician, especially when it comes to screening tests.

[01:18:29.240 --> 01:18:30.960] That is a very complicated topic.

[01:18:30.960 --> 01:18:32.520] But that's also, sorry.

[01:18:32.520 --> 01:18:37.360] That's also a massive straw man, because I said multiple times to follow the advice of

[01:18:37.360 --> 01:18:38.360] your physician.

[01:18:38.360 --> 01:18:39.360] Totally.

[01:18:39.360 --> 01:18:43.240] When I was editing it, I was paying very close attention to that.

[01:18:43.240 --> 01:18:46.440] And you also said this is very anecdotal and blah, blah, blah.

[01:18:46.440 --> 01:18:49.760] And I am probably a very unique case.

[01:18:49.760 --> 01:18:54.240] But having said that, I mean, this is an interesting topic in and of itself to do a deeper dive

[01:18:54.240 --> 01:18:56.880] on and this is a good enough reason to do it.

[01:18:56.880 --> 01:19:01.020] The dilemma here is certainly that there's a delicate balance, I think that's the bottom

[01:19:01.020 --> 01:19:06.200] line, that you want to, as a patient, advocate for yourself.

[01:19:06.200 --> 01:19:13.040] But advocating for yourself doesn't necessarily mean kibitzing your doctor's advice and

[01:19:13.040 --> 01:19:17.240] the care that they're giving you or the recommendations that they're giving you; it absolutely doesn't

[01:19:17.240 --> 01:19:18.240] mean that.

[01:19:18.240 --> 01:19:19.240] Yeah, exactly.

[01:19:19.240 --> 01:19:24.200] If anything, it means going above and beyond, like doing your own research on top of your

[01:19:24.200 --> 01:19:26.720] doctor's advice, if anything.

[01:19:26.720 --> 01:19:31.140] Because so few people actually get advice from their physicians, and that's the sad

[01:19:31.140 --> 01:19:32.140] thing.

[01:19:32.140 --> 01:19:33.140] What do you mean by that?

[01:19:33.140 --> 01:19:34.200] I'm not sure what you mean by that.

[01:19:34.200 --> 01:19:37.340] There's two main topics that I think are important to push here.

[01:19:37.340 --> 01:19:44.440] The first one is there's a fundamental difference between population level, epidemiological,

[01:19:44.440 --> 01:19:52.080] randomized controlled trial, large kind of sample data, and individuals, if we were to look

[01:19:52.080 --> 01:19:57.200] at, let's say, longitudinal information about one individual and follow them over time.

[01:19:57.200 --> 01:20:03.700] And I think very often, we make the mistake when we talk about science of saying that

[01:20:03.700 --> 01:20:10.200] population level data is fundamentally science, and individual anecdotes aren't science.

[01:20:10.200 --> 01:20:13.720] But population level data is made up of individual anecdotes.

[01:20:13.720 --> 01:20:18.360] And so yes, when we say the term, and I think it gets thrown around a lot in skeptical circles,

[01:20:18.360 --> 01:20:23.920] the plural of anecdote is not data, it's true and false at the same time, because absolutely

[01:20:23.920 --> 01:20:25.480] the plural of anecdote is data.

[01:20:25.480 --> 01:20:26.840] That is what data is.

[01:20:26.840 --> 01:20:30.920] It's a bunch of anecdotes compiled together systematically.

[01:20:30.920 --> 01:20:36.520] But what we mean by "the plural of anecdote is not data" is that one aberrant experience

[01:20:36.520 --> 01:20:38.640] is not reflective of the whole.

[01:20:38.640 --> 01:20:39.640] And I think that that...

[01:20:39.640 --> 01:20:43.840] I'll push back on that a little bit just because of the definition of what an anecdote is.

[01:20:43.840 --> 01:20:50.080] The difference between anecdote and a datum is that a datum is collected within a controlled

[01:20:50.080 --> 01:20:57.040] environment, whereas an anecdote is not, and therefore, the door is open for all kinds

[01:20:57.040 --> 01:20:58.640] of biases.

[01:20:58.640 --> 01:21:04.540] There are case series, but we don't consider a case series, which is literally a string

[01:21:04.540 --> 01:21:09.120] of anecdotes, that would be a case series, for example, that's just not the same as collecting

[01:21:09.120 --> 01:21:12.280] data under controlled conditions.

[01:21:12.280 --> 01:21:15.440] And we do have to... there is a bright line between those two things, so I don't want

[01:21:15.440 --> 01:21:16.600] to conflate them.

[01:21:16.600 --> 01:21:17.600] I understand your point.

[01:21:17.600 --> 01:21:22.760] And I think really, I would frame it just differently, that when you're going from population

[01:21:22.760 --> 01:21:29.040] level data to individual decisions, you have to know how to translate that and...

[01:21:29.040 --> 01:21:31.440] Well, yeah, that's a separate point, but I agree.

[01:21:31.440 --> 01:21:36.080] Yeah, but... and also, how do you go from individual experiences to population level

[01:21:36.080 --> 01:21:38.000] data as well?

[01:21:38.000 --> 01:21:43.000] I just don't want to confuse what I think is an important distinction between the way

[01:21:43.000 --> 01:21:49.400] anecdotal data is collected versus controlled data, but of course, there are multiple different

[01:21:49.400 --> 01:21:52.860] types of scientific data, so you can't say it's not one thing.

[01:21:52.860 --> 01:21:56.040] It's all of these things, including anecdotes.

[01:21:56.040 --> 01:22:00.560] It's a spectrum of different kinds of data that all works together to push us towards

[01:22:00.560 --> 01:22:02.400] the truth, hopefully, like the right answer.

[01:22:02.400 --> 01:22:06.200] Yeah, and I think the point that I'm trying to make is that a little bit of information

[01:22:06.200 --> 01:22:10.960] can be really dangerous, and very often what I see within the skeptic community, and especially

[01:22:10.960 --> 01:22:18.040] when we receive emails from people who, I think for good reasons and well-meaning, are

[01:22:18.040 --> 01:22:22.000] pushing a sort of, like, this is the skeptical view and you're not having a skeptical

[01:22:22.000 --> 01:22:28.000] view, is a misunderstanding fundamentally of the limits of science and what science is

[01:22:28.000 --> 01:22:30.140] and how we utilize science.

[01:22:30.140 --> 01:22:35.240] Or saying, you know, this over here is science, that over there is not science, I'm sorry,

[01:22:35.240 --> 01:22:39.840] but the randomized controlled trial is not the only type of science available.

[01:22:39.840 --> 01:22:44.000] And I think it's really important that we understand that and that we understand that

[01:22:44.000 --> 01:22:47.920] even the randomized controlled trial has limits.

[01:22:47.920 --> 01:22:52.160] But in this particular situation, the point that I think even more that I'm trying to

[01:22:52.160 --> 01:22:58.720] make is that we look at population-level data to understand trends and to understand how

[01:22:58.720 --> 01:23:03.680] to make evidence-based decisions that will affect the most people in the most equitable

[01:23:03.680 --> 01:23:08.640] way, but the point that you just said, how do we then translate that to an individual

[01:23:08.640 --> 01:23:09.640] consumer?

[01:23:09.640 --> 01:23:12.760] That's a completely different situation.

[01:23:12.760 --> 01:23:17.880] One is evidence-based medicine from a top-down approach.

[01:23:17.880 --> 01:23:25.580] The other is the art of medicine, the experience of going into a medical facility, getting

[01:23:25.580 --> 01:23:30.040] a diagnosis, and figuring out how to navigate the healthcare system.

[01:23:30.040 --> 01:23:35.680] And I'm sorry, like when I say this is what's wrong with, this is an example of what's wrong

[01:23:35.680 --> 01:23:42.720] with the medical system as it stands, this is me speaking from the perspective of a psychological

[01:23:42.720 --> 01:23:47.080] thinker operating, like, I can't call myself a health psychologist yet because I haven't

[01:23:47.080 --> 01:23:51.920] graduated yet, but somebody who does health psychology work within a health psych setting,

[01:23:51.920 --> 01:23:53.680] within a hospital setting.

[01:23:53.680 --> 01:24:00.400] The thing that I contend with the most when I talk to patients is feeling like a number,

[01:24:00.400 --> 01:24:06.120] feeling like a statistic, feeling like their individual tailored concerns aren't being

[01:24:06.120 --> 01:24:08.320] listened to, aren't being heard.

[01:24:08.320 --> 01:24:13.760] And so very often what we work on in therapy is how to advocate for yourself.

[01:24:13.760 --> 01:24:19.540] And I said this last week when we talked, they might be your one doctor, you are one

[01:24:19.540 --> 01:24:20.920] of hundreds of their patients.

[01:24:20.920 --> 01:24:27.400] We can't expect physicians to have the level of specific individualized care

[01:24:27.400 --> 01:24:30.920] for every single patient that a patient's ever going to have for themselves.

[01:24:30.920 --> 01:24:32.240] We can't expect it.

[01:24:32.240 --> 01:24:34.080] And we have to pick up the pace there.

[01:24:34.080 --> 01:24:35.080] Yeah.

[01:24:35.080 --> 01:24:40.200] And that fits with the current model of clinical practice, which is the physician-patient relationship

[01:24:40.200 --> 01:24:42.080] is a partnership.

[01:24:42.080 --> 01:24:48.520] It is not the paternalistic model of, like, the 1950s where it's just, you listen

[01:24:48.520 --> 01:24:50.760] to what I say, you don't know what you're talking about.

[01:24:50.760 --> 01:24:52.120] It is absolutely a partnership.

[01:24:52.120 --> 01:24:58.000] And I count on my patients having intimate knowledge of their own clinical situation

[01:24:58.000 --> 01:25:01.880] because, again, they're going to know it more intimately than I do, because

[01:25:01.880 --> 01:25:05.500] they're their one patient whereas I have a thousand patients or whatever.

[01:25:05.500 --> 01:25:09.960] And so, absolutely, it's not that I don't have a mastery over the details and document everything,

[01:25:09.960 --> 01:25:14.720] et cetera, but still, they're the ones who are living it day to day, and you have to listen

[01:25:14.720 --> 01:25:15.720] to that.

[01:25:15.720 --> 01:25:16.720] Absolutely.

[01:25:16.720 --> 01:25:21.560] Like I've had very few interactions with the healthcare system as a patient, but I run

[01:25:21.560 --> 01:25:25.920] into all of the same problems, even as a physician when I'm in the role of the patient.

[01:25:25.920 --> 01:25:26.920] The one time-

[01:25:26.920 --> 01:25:27.920] Most do.

[01:25:27.920 --> 01:25:28.920] Yeah.

[01:25:28.920 --> 01:25:32.880] And the one situation that I had that was, I think, most analogous

[01:25:32.880 --> 01:25:39.180] to yours, meaning that I was in a situation where I felt I needed to advocate for myself

[01:25:39.180 --> 01:25:40.180] and my wife.

[01:25:40.180 --> 01:25:43.080] This is when she was giving birth to our second daughter.

[01:25:43.080 --> 01:25:50.760] And the gynecologist was following evidence-based guidelines, but in a very narrow way, like

[01:25:50.760 --> 01:25:55.600] saying we shouldn't admit you too early because that increases the risk that you'll have a

[01:25:55.600 --> 01:25:56.600] cesarean section.

[01:25:56.600 --> 01:26:01.160] Like just following the numbers, like I get that, but my wife is ready to pop, right?

[01:26:01.160 --> 01:26:05.520] I understand what you're saying, but we've been through this before and I know that she's

[01:26:05.520 --> 01:26:10.040] about to give birth and she sent us home anyway and we gave birth at home.

[01:26:10.040 --> 01:26:15.360] That was a total failure, in my opinion, of translating the literature and the evidence

[01:26:15.360 --> 01:26:18.200] to an individual patient situation.

[01:26:18.200 --> 01:26:24.240] And the mistake that I made was I didn't want to be the doctor who was kibitzing the care

[01:26:24.240 --> 01:26:25.640] of my own family member.

[01:26:25.640 --> 01:26:26.640] You know what I mean?

[01:26:26.640 --> 01:26:29.680] I wanted to just go with what they were saying, but I shouldn't have done that.

[01:26:29.680 --> 01:26:32.940] I should have insisted on being admitted.

[01:26:32.940 --> 01:26:35.400] So I made that mistake because I knew it.

[01:26:35.400 --> 01:26:39.020] I knew that it was, you know what I mean, because we lived through the first pregnancy.

[01:26:39.020 --> 01:26:40.020] This woman didn't.

[01:26:40.020 --> 01:26:42.000] This was the first time that she was covering.

[01:26:42.000 --> 01:26:45.720] She happened to be on call, you know, when we got to the hospital and she was just following

[01:26:45.720 --> 01:26:50.640] the evidence, which again is a reasonable starting point, but that's a starting point.

[01:26:50.640 --> 01:26:55.680] You then have to individualize it, and self-knowledge, self-advocacy, understanding your

[01:26:55.680 --> 01:27:00.640] illness, understanding the medical system, all that stuff is critical.

[01:27:00.640 --> 01:27:04.040] I can't tell you how important that is, absolutely.

[01:27:04.040 --> 01:27:06.240] And also listening to your body.

[01:27:06.240 --> 01:27:07.240] Totally.

[01:27:07.240 --> 01:27:13.720] It's not instead of listening to your physician, it's in addition to, it's adding information

[01:27:13.720 --> 01:27:14.720] to the physician.

[01:27:14.720 --> 01:27:19.920] It's so helpful when patients, I mean, you need to tell me what you need, you know, what

[01:27:19.920 --> 01:27:25.820] you're experiencing, and then we will individualize it. So you start with efficacy, you start

[01:27:25.820 --> 01:27:30.400] with evidence-based medicine, with science-based medicine, and then you add the personal

[01:27:30.400 --> 01:27:35.600] information and knowledge and depth and what's meaningful to you, what's important to you,

[01:27:35.600 --> 01:27:36.600] all that stuff.

[01:27:36.600 --> 01:27:37.600] Yeah.

[01:27:37.600 --> 01:27:42.000] Because medicine is not plug-and-play, robotic, flow-chart medicine, and it never was, nor should

[01:27:42.000 --> 01:27:43.000] it be.

[01:27:43.000 --> 01:27:44.000] It's not engineering.

[01:27:44.000 --> 01:27:45.000] We're not machines.

[01:27:45.000 --> 01:27:46.000] No, it absolutely is not engineering.

[01:27:46.000 --> 01:27:47.800] And we run into massive problems when we look at it that way.

[01:27:47.800 --> 01:27:52.500] And I've had physicians who looked at my healthcare that way and I did not want to work with them.

[01:27:52.500 --> 01:27:56.720] But I will say in fairness to the individual, and I know we've gotten a couple of emails,

[01:27:56.720 --> 01:28:02.880] the vast majority of emails that we got were super, super kind and actually really reinforced,

[01:28:02.880 --> 01:28:06.480] made me feel good about the conversation that we had because we got a lot of people saying

[01:28:06.480 --> 01:28:08.200] I'm going to go and get my screening now.

[01:28:08.200 --> 01:28:09.200] I've been putting it off.

[01:28:09.200 --> 01:28:10.200] Yeah.

[01:28:10.200 --> 01:28:11.200] That was kind of the point of it.

[01:28:11.200 --> 01:28:12.200] Yeah.

[01:28:12.200 --> 01:28:13.200] That was the point, right?

[01:28:13.200 --> 01:28:16.160] In fairness to, and I'm thinking of one specific email, in fairness to the MD who wrote in,

[01:28:16.160 --> 01:28:22.440] I understand the concern to say we have noticed over the past several years that when we used

[01:28:22.440 --> 01:28:28.420] to screen too much, we ended up with too many false positives, which was a burden financially

[01:28:28.420 --> 01:28:33.720] on the healthcare system and ultimately could be burdensome to individual patients.

[01:28:33.720 --> 01:28:38.360] But again, this is the difference between looking at population level data and individual

[01:28:38.360 --> 01:28:39.360] data.

[01:28:39.360 --> 01:28:43.760] I understand the fear that the conversation that we had was to kind of say, no, we need

[01:28:43.760 --> 01:28:46.760] more screening, but that is not what any of us said.

[01:28:46.760 --> 01:28:47.760] Yeah.

[01:28:47.760 --> 01:28:48.760] That wasn't the point.

[01:28:48.760 --> 01:28:49.760] That wasn't the point.

[01:28:49.760 --> 01:28:50.760] And it wasn't even the outcome.

[01:28:50.760 --> 01:28:55.800] The outcome was that the Bethesda protocol, which is what's utilized in gynecology for

[01:28:55.800 --> 01:29:00.440] the different levels of abnormal Paps and different levels of testing and colposcopies

[01:29:00.440 --> 01:29:05.160] and all the wonky stuff we talked about last week, is the standard.

[01:29:05.160 --> 01:29:08.680] And the standard is going to be the most effective for the most people.

[01:29:08.680 --> 01:29:14.520] But sometimes individual patients have to advocate beyond the standard, not less than

[01:29:14.520 --> 01:29:16.680] the standard, beyond the standard.

[01:29:16.680 --> 01:29:17.680] Yeah.

[01:29:17.680 --> 01:29:19.680] You've got to know the rules and you've got to know when to bend them a little bit.

[01:29:19.680 --> 01:29:20.680] Exactly.

[01:29:20.680 --> 01:29:25.880] The other thing is like, yeah, patients ask me for unnecessary tests all the time.

[01:29:25.880 --> 01:29:29.780] And I explain to them why it's not necessary.

[01:29:29.780 --> 01:29:31.560] It takes time.

[01:29:31.560 --> 01:29:34.000] Sometimes patients are not happy with my explanation.

[01:29:34.000 --> 01:29:38.440] But if I feel strongly, it's like, no, we do not need to do an MRI scan.

[01:29:38.440 --> 01:29:40.360] It's not indicated.

[01:29:40.360 --> 01:29:42.600] Or whatever it is, you've had three of them already.

[01:29:42.600 --> 01:29:43.600] I think we're done.

[01:29:43.600 --> 01:29:45.760] I don't think we need to do a fourth one.

[01:29:45.760 --> 01:29:46.760] And that's fine.

[01:29:46.760 --> 01:29:47.760] They're just anxious and they want to know.

[01:29:47.760 --> 01:29:51.840] And they think there's only positive things from doing that.

[01:29:51.840 --> 01:29:55.480] And sometimes I have to explain to them, actually, you know, unnecessary screening can cause

[01:29:55.480 --> 01:29:59.760] more harm than good, and we can't do unnecessary screening.

[01:29:59.760 --> 01:30:00.760] It's not a good idea.

[01:30:00.760 --> 01:30:01.760] It's not good for you.

[01:30:01.760 --> 01:30:06.520] You know, but I made that decision for that individual patient at that individual time.

[01:30:06.520 --> 01:30:11.000] I also think there's a cognitive bias that happens here where we remember the squeaky

[01:30:11.000 --> 01:30:15.160] wheel patients more than the patients who fell through the cracks and didn't get care.

[01:30:15.160 --> 01:30:20.180] So it's brighter in the minds of physicians when somebody is asking for unnecessary tests,

[01:30:20.180 --> 01:30:25.080] as opposed to all the people who aren't getting the basic screens that they need.

[01:30:25.080 --> 01:30:31.180] That is a much larger problem in our society, yet we talk about it as if over-testing is

[01:30:31.180 --> 01:30:34.200] what we're really grappling with here.

[01:30:34.200 --> 01:30:37.000] And yes, it's an issue, but it's not the issue.

[01:30:37.000 --> 01:30:41.120] And I think that comes back to one last point I want to make, and I want to see if I can

[01:30:41.120 --> 01:30:45.480] — I feel like you would be better at making this point because you've probably written

[01:30:45.480 --> 01:30:47.260] about this.

[01:30:47.260 --> 01:30:52.600] But I recently watched a documentary about women who died during childbirth, specifically

[01:30:52.600 --> 01:30:58.660] black women who died during childbirth and their surviving husbands advocating for changes

[01:30:58.660 --> 01:31:03.160] to OB care, especially within the black community.

[01:31:03.160 --> 01:31:07.660] And one of the things that a physician at Harvard that they were talking to in the documentary

[01:31:07.660 --> 01:31:14.680] pointed to was that the very statistics that help guide medical decision-making are often

[01:31:14.680 --> 01:31:25.620] not causes but outcomes of fundamentally patriarchal, racist, sexist practices within medicine.

[01:31:25.620 --> 01:31:32.540] So when they say, well, black women are this much more likely to have a C-section, so maybe

[01:31:32.540 --> 01:31:37.420] we should try and just go for the C-section, what they're actually doing is perpetuating

[01:31:37.420 --> 01:31:39.540] this negative feedback loop.

[01:31:39.540 --> 01:31:43.080] It's not that there's anything about their blackness that makes them need

[01:31:43.080 --> 01:31:49.400] C-sections, it's that historically they weren't given the opportunity to have vaginal birth,

[01:31:49.400 --> 01:31:52.380] and so that starts to perpetuate and perpetuate and perpetuate.

[01:31:52.380 --> 01:31:57.220] And I think we see this a lot when we look at pure statistics without understanding why

[01:31:57.220 --> 01:31:59.540] we got to where we are.

[01:31:59.540 --> 01:32:05.580] It's very easy to say this group of people runs this risk medically without looking at

[01:32:05.580 --> 01:32:12.620] all of the cultural and fundamental systemic problems that led to that risk factor to begin

[01:32:12.620 --> 01:32:14.260] with.

[01:32:14.260 --> 01:32:18.580] It's not that the risk factor is the problem, it's all the things that made it more risky

[01:32:18.580 --> 01:32:20.460] for these individuals.

[01:32:20.460 --> 01:32:25.540] And I think that becomes this really nasty self-perpetuating nightmare within the medical

[01:32:25.540 --> 01:32:31.100] system, and that's why we see time and time again that there are outcome studies that

[01:32:31.100 --> 01:32:35.300] show that women and people of color get significantly poorer care.

[01:32:35.300 --> 01:32:42.740] Yeah, there's definitely racial and sexual disparities in outcomes, and every time they

[01:32:42.740 --> 01:32:47.300] look at it, the research shows that there's a difference.

[01:32:47.300 --> 01:32:50.540] And it's hard to really nail down exactly what's causing it.

[01:32:50.540 --> 01:32:52.100] Oh, there's a million things that cause it.

[01:32:52.100 --> 01:32:55.900] Yeah, there's so many factors, so many that are unconscious that are just systemic that

[01:32:55.900 --> 01:32:57.300] are built in.

[01:32:57.300 --> 01:33:02.220] And I think it's really, really important that we look at what's directly in front

[01:33:02.220 --> 01:33:06.200] of us and say, this is more than just a statistical thing.

[01:33:06.200 --> 01:33:08.900] This is more than just a biomedical phenomenon.

[01:33:08.900 --> 01:33:12.680] This is a cultural, psychological phenomenon as well.

[01:33:12.680 --> 01:33:16.080] And advocacy is fundamental to the healthcare system.

[01:33:16.080 --> 01:33:17.080] We need it.

[01:33:17.080 --> 01:33:18.080] Oh, definitely.

[01:33:18.080 --> 01:33:19.080] Definitely.

[01:33:19.080 --> 01:33:21.580] I could tell you the experience with patients is so different depending on how strong an

[01:33:21.580 --> 01:33:23.820] advocate they are for themselves.

[01:33:23.820 --> 01:33:26.940] You know, this is completely tangential to anything that we've been talking about, but

[01:33:26.940 --> 01:33:34.180] in my, again, anecdotal experience, this is a pretty robust phenomenon.

[01:33:34.180 --> 01:33:40.620] The single factor that most predicts how much a patient's going to advocate for themselves

[01:33:40.620 --> 01:33:41.620] is what?

[01:33:41.620 --> 01:33:42.620] What would you think?

[01:33:42.620 --> 01:33:43.620] It's not gender.

[01:33:43.620 --> 01:33:44.620] It's not race.

[01:33:44.620 --> 01:33:45.620] SES?

[01:33:45.620 --> 01:33:46.620] Nope.

[01:33:46.620 --> 01:33:47.620] It's not SES.

[01:33:47.620 --> 01:33:48.620] Education?

[01:33:48.620 --> 01:33:49.620] Age.

[01:33:49.620 --> 01:33:50.620] Age.

[01:33:50.620 --> 01:33:51.620] Yeah, yeah.

[01:33:51.620 --> 01:33:52.620] The older they get, just the more familiar they are.

[01:33:52.620 --> 01:33:53.620] The older the patient, the more they advocate for themselves.

[01:33:53.620 --> 01:33:59.020] Young, young, young patients, like in their 20s, and this is not universal, but they just

[01:33:59.020 --> 01:34:00.020] do whatever you tell them.

[01:34:00.020 --> 01:34:01.020] You know, they just go with the flow.

[01:34:01.020 --> 01:34:03.980] But the older the patient, the more likely they are to like really want to know what's

[01:34:03.980 --> 01:34:08.460] going on and ask about specific treatments or tests and like really challenge what you're

[01:34:08.460 --> 01:34:09.460] saying and everything.

[01:34:09.460 --> 01:34:14.380] In a good way, often, in a healthy way, but they're just so much more involved in their

[01:34:14.380 --> 01:34:15.380] care.

[01:34:15.380 --> 01:34:16.380] And it's like...

[01:34:16.380 --> 01:34:19.300] Which is, I think, a good thing. As people get older, they get more confident, they get more

[01:34:19.300 --> 01:34:23.380] mature, they feel they're better able to manage the system and advocate for

[01:34:23.380 --> 01:34:24.500] themselves.

[01:34:24.500 --> 01:34:26.980] It's a skill that we learn as we get older, you know?

[01:34:26.980 --> 01:34:27.980] Yeah.

[01:34:27.980 --> 01:34:29.800] And it's like, don't get me wrong, I get it.

[01:34:29.800 --> 01:34:34.100] After a long, long day with too many patients and too much rounding and too much pressure

[01:34:34.100 --> 01:34:36.660] being put on you, it feels annoying.

[01:34:36.660 --> 01:34:41.420] It's annoying sometimes when a patient has a million questions, when a patient is really,

[01:34:41.420 --> 01:34:44.560] really concerned, when they're not satisfied with your answers.

[01:34:44.560 --> 01:34:49.660] But I'm sorry, if that is your attitude within medicine, it might not be the right field

[01:34:49.660 --> 01:34:52.540] for you because you're there to help the patient.

[01:34:52.540 --> 01:34:53.540] That's the job.

[01:34:53.540 --> 01:34:54.540] That is the job.

[01:34:54.540 --> 01:34:55.540] Yes, that is the job.

[01:34:55.540 --> 01:34:59.620] And I've got to tell you, I feel more comfortable with that than a patient who is just, like,

[01:34:59.620 --> 01:35:00.620] not engaging.

[01:35:00.620 --> 01:35:03.160] Yeah, who's just like, whatever you say, doc, sure.

[01:35:03.160 --> 01:35:07.060] That doesn't make me feel comfortable because that's likely to be a non-compliant patient

[01:35:07.060 --> 01:35:10.380] who is not going to understand why they're taking the drugs.

[01:35:10.380 --> 01:35:13.420] Like, you know, I'm teaching patients how to manage themselves.

[01:35:13.420 --> 01:35:15.820] And if they are not engaged, they're not going to do it.

[01:35:15.820 --> 01:35:18.260] They're not going to do it, they're not going to be able to do it well.

[01:35:18.260 --> 01:35:22.860] So it's more work, but the outcomes are better, and it actually saves

[01:35:22.860 --> 01:35:26.700] me work in the long run because we get to an understanding about exactly what's going

[01:35:26.700 --> 01:35:27.700] to happen.

[01:35:27.700 --> 01:35:28.700] You know what I mean?

[01:35:28.700 --> 01:35:32.740] I think it's probably very specialty or situation specific as well, but like for the kind of

[01:35:32.740 --> 01:35:38.240] stuff where I'm managing complex outpatient, you know, problems like complicated migraines

[01:35:38.240 --> 01:35:42.940] or refractory facial pain or whatever, it's really, it's so important for the patient

[01:35:42.940 --> 01:35:43.940] to be fully engaged.

[01:35:43.940 --> 01:35:46.060] I'm sure it's the same way with diabetes and other chronic illnesses.

[01:35:46.060 --> 01:35:49.780] Yeah, especially things where there's a massive lifestyle component to the treatment.

[01:35:49.780 --> 01:35:50.780] Massive lifestyle component.

[01:35:50.780 --> 01:35:51.780] Yeah.

[01:35:51.780 --> 01:35:52.780] Yeah.

[01:35:52.780 --> 01:35:58.140] Okay, guys, let's go on with Science or Fiction.

Follow-up #1: Self-Advocacy vs Kibitzing

Science or Fiction (1:35:55)

Theme: Edible plants

Item #1: There are over 400,000 known plant species in the world, about 300,000 are edible to humans, but we regularly consume only 200, and 3 crops make up over half of plant calories consumed.[5]
Item #2: Cattails are almost entirely edible, and in fact produce more edible starch per acre than any other green plant.[6]
Item #3: Many species of cacti contain significant stores of water that can be used as an emergency source in the desert.[7]

Answer Item
Fiction: Cacti w/ lots of water
Science: ~300,000 edible plants
Science: Cattails ~totally edible

Host Result
Steve: clever

Rogue Guess
Evan: Cacti w/ lots of water
Bob: Cattails ~totally edible
Jay: ~300,000 edible plants
Cara: Cacti w/ lots of water

Voice-over: It's time for Science or Fiction.

Evan's Response

Bob's Response

Jay's Response

Cara's Response

Steve Explains Item #1

Steve Explains Item #2

Steve Explains Item #3

[01:35:58.140 --> 01:36:07.860] It's time for Science or Fiction.

[01:36:07.860 --> 01:36:13.060] Each week I come up with three science news items or facts, two real, one fake, challenge

[01:36:13.060 --> 01:36:18.620] my panel of skeptics to sniff out the fake, and we have a theme this week.

[01:36:18.620 --> 01:36:26.740] On Saturday, I visited the botanical gardens in the Bronx. Very nice. You guys should,

[01:36:26.740 --> 01:36:27.740] we should go there more often.

[01:36:27.740 --> 01:36:28.740] Really, really.

[01:36:28.740 --> 01:36:29.740] I haven't been there.

[01:36:29.740 --> 01:36:30.740] Oh my gosh.

[01:36:30.740 --> 01:36:31.740] It's got to be at least 20 years.

[01:36:31.740 --> 01:36:36.100] They were having an exhibit on edible plants.

[01:36:36.100 --> 01:36:37.100] Oh, cool.

[01:36:37.100 --> 01:36:38.100] Yeah.

[01:36:38.100 --> 01:36:40.780] So that's the theme of the Science or Fiction this week, edible plants.

[01:36:40.780 --> 01:36:41.780] Okay.

[01:36:41.780 --> 01:36:42.780] Ready?

[01:36:42.780 --> 01:36:43.940] Here we go.

[01:36:43.940 --> 01:36:47.940] There are over 400,000 known plant species in the world.

[01:36:47.940 --> 01:36:54.980] About 300,000 are edible to humans, but we regularly consume only 200, and three crops

[01:36:54.980 --> 01:36:59.420] make up over half of plant calories consumed.

[01:36:59.420 --> 01:37:05.700] Number two, cattails are almost entirely edible, and in fact, produce more edible starch per

[01:37:05.700 --> 01:37:09.040] acre than any other green plant.

[01:37:09.040 --> 01:37:14.700] And number three, many species of cacti contain significant stores of water that can be used

[01:37:14.700 --> 01:37:17.020] as an emergency source in the desert.

[01:37:17.020 --> 01:37:18.380] Evan, go first.

[01:37:18.380 --> 01:37:19.380] Oh boy.

[01:37:19.380 --> 01:37:23.220] Well, I kind of have a feeling about this one.

[01:37:23.220 --> 01:37:28.440] I suppose the one about 400,000 known plant species in the world.

[01:37:28.440 --> 01:37:32.700] So there are some statements you're giving us there, but it boils down to the amount

[01:37:32.700 --> 01:37:38.300] that we regularly consume, only 200, and three crops make up over half of the plant calories

[01:37:38.300 --> 01:37:39.300] consumed.

[01:37:39.300 --> 01:37:40.300] Three crops would be what?

[01:37:40.300 --> 01:37:43.220] The wheats, the rice, and something else.

[01:37:43.220 --> 01:37:46.260] So yeah, that one sounds very interesting.

[01:37:46.260 --> 01:37:50.300] Big numbers, and yeah, I think that one's going to turn out to be correct.

[01:37:50.300 --> 01:37:53.860] The cattails, I think I know what a cattail is.

[01:37:53.860 --> 01:37:58.020] It's like that stalk with the big thick end to it.

[01:37:58.020 --> 01:37:59.020] Yep.

[01:37:59.020 --> 01:38:00.020] It looks like a hot dog, right?

[01:38:00.020 --> 01:38:02.220] And it has almost like a cotton puff at the end.

[01:38:02.220 --> 01:38:03.780] Is that what I'm thinking about?

[01:38:03.780 --> 01:38:05.740] They're like brown, right?

[01:38:05.740 --> 01:38:06.740] Yeah.

[01:38:06.740 --> 01:38:08.980] Brown hot dogs at the end of a stick.

[01:38:08.980 --> 01:38:12.300] Like, they grow in marshy areas.

[01:38:12.300 --> 01:38:18.580] So if they produce more edible starch per acre than any other green plant, that could

[01:38:18.580 --> 01:38:20.780] also definitely be true.

[01:38:20.780 --> 01:38:22.780] It doesn't mean we eat it, though.

[01:38:22.780 --> 01:38:32.580] It's edible, and there's a lot of swampland or that sort of environment in which we don't

[01:38:32.580 --> 01:38:40.300] grow many of those 200 or so edible plants that you mentioned in the first one.

[01:38:40.300 --> 01:38:46.020] So the fact that there's more edible starch per acre than any other green plant, I would

[01:38:46.020 --> 01:38:47.180] tend to believe.

[01:38:47.180 --> 01:38:52.460] But then the last one about the cacti, significant stores of water that can be used as an emergency

[01:38:52.460 --> 01:38:54.020] source in the desert.

[01:38:54.020 --> 01:39:00.780] Well, it sounds like that's something we were sort of brought up with, you know, watching

[01:39:00.780 --> 01:39:06.180] Wile E. Coyote cartoons in the desert as a six-year-old, with the cactus, you know, sort

[01:39:06.180 --> 01:39:11.140] of this preconceived notion about what exactly a cactus is and what it can do.

[01:39:11.140 --> 01:39:13.860] And, you know, I'm sure that's been exaggerated.

[01:39:13.860 --> 01:39:17.060] The only thing I'm thinking, Steve, is that you might be throwing that one as the curveball

[01:39:17.060 --> 01:39:19.900] thinking that, oh, yeah, well, we're relying on that.

[01:39:19.900 --> 01:39:22.260] But the other two seem more correct than this one.

[01:39:22.260 --> 01:39:25.020] So I have a feeling the cacti one is the fiction.

[01:39:25.020 --> 01:39:26.020] All right, Bob.

[01:39:26.020 --> 01:39:27.020] All right.

[01:39:27.020 --> 01:39:31.740] So the 400,000 known plants, blah, blah, blah, yeah, whatever, I guess.

[01:39:31.740 --> 01:39:34.140] I mean, cattails, though.

[01:39:34.140 --> 01:39:35.140] What the hell?

[01:39:35.140 --> 01:39:36.140] Cattails?

[01:39:36.140 --> 01:39:42.780] I mean, I thought that corn was like the iconic plant for the most edible starch per acre of

[01:39:42.780 --> 01:39:43.780] anything.

[01:39:43.780 --> 01:39:45.780] Is that considered a green plant?

[01:39:45.780 --> 01:39:48.580] Looks pretty, pretty green to me.

[01:39:48.580 --> 01:39:52.740] Not the corn itself, but that's not what I was thinking, but OK.

[01:39:52.740 --> 01:39:53.740] I suppose so.

[01:39:53.740 --> 01:39:56.420] Yeah, I guess you have to know what the definition of green plant is.

[01:39:56.420 --> 01:40:01.660] I mean, the stalk is pretty substantial, I mean, much more so than a cattail.

[01:40:01.660 --> 01:40:05.900] And then there's the cacti one, and this one's so wishy-washy.

[01:40:05.900 --> 01:40:09.380] Many species, significant stores.

[01:40:09.380 --> 01:40:11.740] I mean, what what do those words mean?

[01:40:11.740 --> 01:40:16.500] I'm like a true skeptic.

[01:40:16.500 --> 01:40:20.820] You know, they're, like, they're pretty damn variable.

[01:40:20.820 --> 01:40:22.500] So based on that,

[01:40:22.500 --> 01:40:25.420] I'll just go with the cattails and say it's bull crap.

[01:40:25.420 --> 01:40:26.420] OK, Jay.

[01:40:26.420 --> 01:40:29.460] Yeah, I do agree with what Bob said about those words in there.

[01:40:29.460 --> 01:40:33.580] It sounds like Steve could change that to mean whatever he wants it to mean.

[01:40:33.580 --> 01:40:38.740] Steve is a force of good in this world, so I'm not going to hold any of that against

[01:40:38.740 --> 01:40:39.740] him.

[01:40:39.740 --> 01:40:40.740] All right.

[01:40:40.740 --> 01:40:43.260] This is the thing that's bothering me about these three.

[01:40:43.260 --> 01:40:48.540] Of all the statements that are made, the first one says there are over 400,000 known plant

[01:40:48.540 --> 01:40:49.860] species in the world.

[01:40:49.860 --> 01:40:52.260] About 300,000 are edible to humans.

[01:40:52.260 --> 01:40:56.020] So two thirds of the world's plants are edible to humans.

[01:40:56.020 --> 01:40:58.500] That can't possibly be true.

[01:40:58.500 --> 01:40:59.900] That can't be true.

[01:40:59.900 --> 01:41:01.100] Three quarters, I mean.

[01:41:01.100 --> 01:41:05.940] Three quarters of the world's plants cannot be edible to humans.

[01:41:05.940 --> 01:41:11.940] You know, when you think about the selective breeding that has taken place and how few

[01:41:11.940 --> 01:41:16.280] plants we actually do eat, you know what I mean, like not a lot.

[01:41:16.280 --> 01:41:21.100] If you really line them all up, the variety is nice, but it's not 300,000 nice.

[01:41:21.100 --> 01:41:22.900] There's no way that that is science.

[01:41:22.900 --> 01:41:24.260] That has got to be fiction.

[01:41:24.260 --> 01:41:28.940] All right, Cara, they're evenly divided, one for each.

[01:41:28.940 --> 01:41:32.940] I don't agree with Bob, so I'm going to put that one on the shelf.

[01:41:32.940 --> 01:41:36.180] I think cattails are probably, sure, lots of starch.

[01:41:36.180 --> 01:41:39.500] And when we think about corn, we only eat the corn part of the corn.

[01:41:39.500 --> 01:41:41.060] We don't eat any of the other part.

[01:41:41.060 --> 01:41:44.120] But you're saying the whole plant is edible.

[01:41:44.120 --> 01:41:48.420] And that's what makes me a little bit skeptical about the first one.

[01:41:48.420 --> 01:41:53.180] So there are over 400,000 known plant species, about 300,000 are edible to humans.

[01:41:53.180 --> 01:41:56.860] Does that mean, I don't think you're saying that we actually eat them.

[01:41:56.860 --> 01:42:02.500] I think you're saying that they are edible in the sense that there is some part of 300,000

[01:42:02.500 --> 01:42:07.420] plants that won't kill us if we consume it, but we just don't.

[01:42:07.420 --> 01:42:09.380] And if that's the case, that's probably true.

[01:42:09.380 --> 01:42:12.260] I don't think we eat 300,000 plants.

[01:42:12.260 --> 01:42:18.300] I think that it's probably true that there are portions of 300,000 plants that are edible.

[01:42:18.300 --> 01:42:21.420] Plants are highly, highly specialized.

[01:42:21.420 --> 01:42:22.900] So maybe on this one, it's the fruit.

[01:42:22.900 --> 01:42:24.660] On that one, it's the flower.

[01:42:24.660 --> 01:42:25.900] On this one, it's the stalk.

[01:42:25.900 --> 01:42:28.060] On this one, it's just the leaves.

[01:42:28.060 --> 01:42:29.860] But I wouldn't say that most plants are toxic.

[01:42:29.860 --> 01:42:32.780] I'd say there's portions of plants that are toxic.

[01:42:32.780 --> 01:42:34.700] So I don't know, that one could be the case.

[01:42:34.700 --> 01:42:38.920] I definitely agree with the thing where we basically eat three plants.

[01:42:38.920 --> 01:42:43.340] It's probably like corn and soy and something else.

[01:42:43.340 --> 01:42:44.340] Peanut butter.

[01:42:44.340 --> 01:42:46.700] Peanut butter plants, yeah.

[01:42:46.700 --> 01:42:52.980] So the one that's actually getting me is the cacti one because, A, they're not just like

[01:42:52.980 --> 01:42:59.820] water, they're not like fountains of water, no plant works that way.

[01:42:59.820 --> 01:43:03.420] And B, most cacti are highly toxic.

[01:43:03.420 --> 01:43:06.420] Out here in LA, we eat cactus.

[01:43:06.420 --> 01:43:10.660] Like we eat nopales, I think is what they're called, like cactus salads, but it's like

[01:43:10.660 --> 01:43:12.660] one or two species.

[01:43:12.660 --> 01:43:15.500] I don't think you can just go around eating cacti.

[01:43:15.500 --> 01:43:18.300] I think they're really bad for you.

[01:43:18.300 --> 01:43:23.380] So that one bothers me right there because I think as an emergency source in the desert,

[01:43:23.380 --> 01:43:28.660] that's a dangerous proposition if you're running out of water to just go bite into a cactus

[01:43:28.660 --> 01:43:31.940] because I think that you could like die if you do that.

[01:43:31.940 --> 01:43:34.020] So that's why I think that one's going to be the fiction.

[01:43:34.020 --> 01:43:35.260] All right.

[01:43:35.260 --> 01:43:36.580] So you guys are spread out.

[01:43:36.580 --> 01:43:39.620] I guess we'll just take them in order then.

[01:43:39.620 --> 01:43:44.140] There are over 400,000 known plant species in the world, about 300,000 are edible to

[01:43:44.140 --> 01:43:49.260] humans, but we regularly consume only 200, and three crops make up over half of plant calories

[01:43:49.260 --> 01:43:50.260] consumed.

[01:43:50.260 --> 01:43:55.020] Jay, you think this one is the fiction because that 300,000 number is just too high.

[01:43:55.020 --> 01:43:58.300] The rest of you think this one is science.

[01:43:58.300 --> 01:44:00.740] And this one is science.

[01:44:00.740 --> 01:44:02.220] Sorry, Jay.

[01:44:02.220 --> 01:44:06.420] And yes, the 300,000 was the bit I was trying to get you on with this one.

[01:44:06.420 --> 01:44:10.460] The rest of it, if it didn't have that in there, the rest of it was fine, right?

[01:44:10.460 --> 01:44:11.460] Right.

[01:44:11.460 --> 01:44:15.620] And Cara, you're exactly correct that the definition of edible doesn't mean that we

[01:44:15.620 --> 01:44:22.140] actually eat them and it doesn't mean that it would be good to eat or that it's digestible.

[01:44:22.140 --> 01:44:23.900] It just means it won't kill you.

[01:44:23.900 --> 01:44:24.900] That's all it means.

[01:44:24.900 --> 01:44:25.900] It won't make you sick.

[01:44:25.900 --> 01:44:26.900] You know, it's not bullshit.

[01:44:26.900 --> 01:44:27.900] We don't digest a lot of plant matter.

[01:44:27.900 --> 01:44:28.900] That's what you're going on?

[01:44:28.900 --> 01:44:29.900] Jay, this is not what I'm going on.

[01:44:29.900 --> 01:44:30.900] That's what I assume, Jay.

[01:44:30.900 --> 01:44:31.900] This is the definition.

[01:44:31.900 --> 01:44:32.900] Yeah.

[01:44:32.900 --> 01:44:39.280] Because like insoluble fiber by definition is not digestible.

[01:44:39.280 --> 01:44:41.780] That definition sucks.

[01:44:41.780 --> 01:44:45.020] It really is a shit definition if that's what the actual definition is.

[01:44:45.020 --> 01:44:46.020] That's the technical definition.

[01:44:46.020 --> 01:44:47.420] Shit is not edible.

[01:44:47.420 --> 01:44:51.780] Now the 400,000 number here is rock solid.

[01:44:51.780 --> 01:44:57.140] The three crops: it's maize, rice, and wheat.

[01:44:57.140 --> 01:45:01.960] And that's like, it's almost half of our total calories.

[01:45:01.960 --> 01:45:04.860] It's a little bit more than half of our plant calories.

[01:45:04.860 --> 01:45:10.340] 95% of our total calories come from 30 species, both, you know, plant and animal.

[01:45:10.340 --> 01:45:16.860] So we really do have a very restricted diet in terms of just the majority of our calories.

[01:45:16.860 --> 01:45:21.620] But if you look at all the plants that people around the world eat, the

[01:45:21.620 --> 01:45:27.780] whole world, on a regular basis, it's only about 200 different plants.

[01:45:27.780 --> 01:45:33.220] And that's a little deceptive because, like, cabbage and broccoli and Brussels sprouts

[01:45:33.220 --> 01:45:34.740] are all considered one species.

[01:45:34.740 --> 01:45:38.420] So that would only be one of those 200, but it's all of those.

[01:45:38.420 --> 01:45:39.420] So there's a lot of cultivars.

[01:45:39.420 --> 01:45:40.420] There's a lot of varieties.

[01:45:40.420 --> 01:45:46.140] It's like apple would be just one of those 200, even though there's hundreds of cultivars

[01:45:46.140 --> 01:45:47.140] of apples.

[01:45:47.140 --> 01:45:49.820] Yeah, smooth peanut butter, crunchy peanut butter.

[01:45:49.820 --> 01:45:52.140] Yeah, exactly.

[01:45:52.140 --> 01:45:53.140] But here's an interesting number.

[01:45:53.140 --> 01:45:55.500] How many plants have humans ever consumed?

[01:45:55.500 --> 01:45:59.900] Like how many plants are on the list of things that anyone has ever eaten ever for food,

[01:45:59.900 --> 01:46:01.460] you know?

[01:46:01.460 --> 01:46:03.740] And that number is really hard to come by.

[01:46:03.740 --> 01:46:04.740] 100,000?

[01:46:04.740 --> 01:46:06.660] No, it's more like 6,000.

[01:46:06.660 --> 01:46:07.660] It's like 6,000.

[01:46:07.660 --> 01:46:08.660] That's it?

[01:46:08.660 --> 01:46:09.660] Yeah, that's it.

[01:46:09.660 --> 01:46:10.660] It's not a lot.

[01:46:10.660 --> 01:46:11.660] Wow.

[01:46:11.660 --> 01:46:15.100] So I get most of those 300,000 are technically edible, but probably not something you want

[01:46:15.100 --> 01:46:16.100] to eat.

[01:46:16.100 --> 01:46:22.900] There's also, you know, I saw some people cite 200,000.

[01:46:22.900 --> 01:46:26.420] There were different numbers given there, but 300,000 was, I think, the consensus

[01:46:26.420 --> 01:46:28.140] number that I found the most.

[01:46:28.140 --> 01:46:32.500] But I think it's where you draw the line as to what is edible because it's not a sharp

[01:46:32.500 --> 01:46:33.500] demarcation.

[01:46:33.500 --> 01:46:38.180] It is just like, what can people get away with eating, basically?

[01:46:38.180 --> 01:46:41.500] And it kind of depends on what you mean by that.

[01:46:41.500 --> 01:46:44.900] But it doesn't mean that it's consumed as food, right?

[01:46:44.900 --> 01:46:46.180] That is not what edible means.

[01:46:46.180 --> 01:46:49.140] And that was the tricky part that I was trying to get you on.

[01:46:49.140 --> 01:46:52.580] Consumed as food is probably something in that 6,000 to 8,000 range.

[01:46:52.580 --> 01:46:56.980] Again, this is an estimate, there's no hard and fast count for that.

[01:46:56.980 --> 01:46:59.700] But again, on a regular basis, it's very few of those.

[01:46:59.700 --> 01:47:05.940] So there are efforts to expand the repertoire of what people eat, you know, like to explore

[01:47:05.940 --> 01:47:14.100] all these other rarely consumed but very, very edible and consumable foods, like things

[01:47:14.100 --> 01:47:20.080] that would be good food, but just not a regular part of the human diet.

[01:47:20.080 --> 01:47:25.660] In order to create more variety and also reduce our dependence on these 30 species that make

[01:47:25.660 --> 01:47:30.580] up most of our food, and to, you know, reduce the amount of monoculture that we're dependent

[01:47:30.580 --> 01:47:36.340] on, you know, we want to branch out to many, many more different plants, if for no

[01:47:36.340 --> 01:47:40.900] other reason than to, you know, secure our agriculture, so we're not dependent on

[01:47:40.900 --> 01:47:41.900] so few plants.

[01:47:41.900 --> 01:47:45.540] But that's interesting, the disconnect between how many potential plants we could eat out

[01:47:45.540 --> 01:47:48.860] there and how many we regularly eat is really stark.

[01:47:48.860 --> 01:47:51.020] Okay, let's go to number two.

[01:47:51.020 --> 01:47:54.580] Cattails are almost entirely edible and in fact produce more edible starch per acre than

[01:47:54.580 --> 01:47:56.140] any other green plant, Bob.

[01:47:56.140 --> 01:48:00.480] You think this one is the fiction, the rest of you think this one is science, and this

[01:48:00.480 --> 01:48:04.260] one is science.

[01:48:04.260 --> 01:48:05.660] Cattails are absolutely edible.

[01:48:05.660 --> 01:48:08.940] Now they produce a lot of starch.

[01:48:08.940 --> 01:48:13.740] They can in fact be monocropped, you know, and if they are, that's when they would have

[01:48:13.740 --> 01:48:19.100] the potential to create more edible starch per acre. It's not

[01:48:19.100 --> 01:48:23.420] like they do that in the wild, because they usually only grow in small patches, but they produce

[01:48:23.420 --> 01:48:26.060] more than potatoes, more than yams, you know.

[01:48:26.060 --> 01:48:32.500] The thing that's throwing you off is you're thinking about the brown part of the plant,

[01:48:32.500 --> 01:48:35.060] but that's after it dries out.

[01:48:35.060 --> 01:48:39.620] Earlier on it's green and you can cook it and eat it, like corn, in fact they say you

[01:48:39.620 --> 01:48:41.740] can eat it like corn right off the stalk.

[01:48:41.740 --> 01:48:42.740] You can also...

[01:48:42.740 --> 01:48:43.740] That's what I was thinking.

[01:48:43.740 --> 01:48:47.620] You can eat the stalk, you can eat the roots, you have to cook it, but you can eat it.

[01:48:47.620 --> 01:48:49.660] It is a good survival plant.

[01:48:49.660 --> 01:48:54.780] There's this other category: there's edible, but you probably don't

[01:48:54.780 --> 01:48:55.780] want to eat it.

[01:48:55.780 --> 01:48:58.500] Like grass is technically edible because it's not going to hurt you, but you're not going

[01:48:58.500 --> 01:49:00.780] to get a lot of calories out of it.

[01:49:00.780 --> 01:49:04.860] Then there's survival food, which is not the kind of thing you would want to have to eat,

[01:49:04.860 --> 01:49:09.100] but if you're going to starve to death, you can eat it and get nutrition out of it and

[01:49:09.100 --> 01:49:10.720] it's better than nothing.

[01:49:10.720 --> 01:49:17.260] And then there's like wild food that actually you can eat it and live off it and it's decent.

[01:49:17.260 --> 01:49:20.420] So cattails are actually something that's not even survival food.

[01:49:20.420 --> 01:49:27.700] It's decent food and there are cultures that eat it and like it and know how to cook it

[01:49:27.700 --> 01:49:29.660] and prepare it and know what parts you can use.

[01:49:29.660 --> 01:49:38.100] The other thing about cattails is, in colonial times, pre-technological times, cattails were

[01:49:38.100 --> 01:49:47.140] very important in terms of their survival value because they don't just provide food.

[01:49:47.140 --> 01:49:51.940] You could also use the dried out brown part of them for insulation.

[01:49:51.940 --> 01:49:56.220] You can also use them to filter water and in fact, that's what they do for the plant.

[01:49:56.220 --> 01:50:01.700] The one caution is you shouldn't eat them if they're growing in toxic water because

[01:50:01.700 --> 01:50:02.700] that's what they do.

[01:50:02.700 --> 01:50:06.900] They filter the toxins out of the water, so the toxins might be concentrated in them, and you

[01:50:06.900 --> 01:50:07.900] wouldn't want that.

[01:50:07.900 --> 01:50:11.260] You're going to eat whatever crap is in the water, you know, so you don't want to do that.

[01:50:11.260 --> 01:50:13.460] So you got to make sure. Do they taste good?

[01:50:13.460 --> 01:50:14.460] Apparently they do.

[01:50:14.460 --> 01:50:15.460] I mean, I've never had them.

[01:50:15.460 --> 01:50:18.900] I don't know if you've tried them, but now that I know that they're a thing, apparently

[01:50:18.900 --> 01:50:25.100] people eat them deliberately because if you know how to prepare them, like a lot of things,

[01:50:25.100 --> 01:50:29.940] they could actually be a desired foodstuff.

[01:50:29.940 --> 01:50:34.220] To me, a cattail looks more like food than, say, an artichoke.

[01:50:34.220 --> 01:50:35.220] Yeah.

[01:50:35.220 --> 01:50:36.220] It looks like a corn dog.

[01:50:36.220 --> 01:50:37.220] It looks like a corn dog.

[01:50:37.220 --> 01:50:38.220] I'm amazed when I eat artichokes.

[01:50:38.220 --> 01:50:39.220] Who's the first guy who ate one?

[01:50:39.220 --> 01:50:45.420] And I think, in fact, part of the plant you would eat like an artichoke, where you would

[01:50:45.420 --> 01:50:50.860] sort of pull the softer parts off of the more cellulose-heavy parts that you don't digest very

[01:50:50.860 --> 01:50:51.860] well.

[01:50:51.860 --> 01:50:55.820] But if you saw them in their green state, then it's like, oh yeah, that looks more like

[01:50:55.820 --> 01:50:58.340] food than the brown stuff.

[01:50:58.340 --> 01:51:03.060] You wouldn't even know it's a cattail though because the cattail is always pictured with

[01:51:03.060 --> 01:51:04.060] the brown thing.

[01:51:04.060 --> 01:51:05.060] Right, right.

[01:51:05.060 --> 01:51:06.060] That's true.

[01:51:06.060 --> 01:51:07.060] Do you know what we called them when we were young?

[01:51:07.060 --> 01:51:08.500] Do you guys remember this?

[01:51:08.500 --> 01:51:09.500] The cattails?

[01:51:09.500 --> 01:51:10.500] Punks.

[01:51:10.500 --> 01:51:12.100] We called them cattails, but we also called them something else.

[01:51:12.100 --> 01:51:13.100] Punks.

[01:51:13.100 --> 01:51:19.700] We called them punks because you could light them and it's like a poor man's Roman candle.

[01:51:19.700 --> 01:51:20.700] You know what I mean?

[01:51:20.700 --> 01:51:23.900] Like it's a firework, you could light it and it burns and it kind of sparkles.

[01:51:23.900 --> 01:51:24.900] Are fireworks called punks?

[01:51:24.900 --> 01:51:25.900] You like Mr. Spark?

[01:51:25.900 --> 01:51:26.900] Yeah.

[01:51:26.900 --> 01:51:27.900] Oh, okay.

[01:51:27.900 --> 01:51:30.900] And I remember we used to call them punks.

[01:51:30.900 --> 01:51:31.900] Huh, weird.

[01:51:31.900 --> 01:51:34.820] They have a very distinct smell too when you burn them.

[01:51:34.820 --> 01:51:35.820] Yeah.

[01:51:35.820 --> 01:51:36.820] Yeah.

[01:51:36.820 --> 01:51:39.180] So they could also be used as a fuel source.

[01:51:39.180 --> 01:51:44.020] They could be used to burn for fire, to filter water, for food, for insulation.

[01:51:44.020 --> 01:51:47.180] So they had a lot of uses out on the frontier.

[01:51:47.180 --> 01:51:50.100] Why aren't you doing this now?

[01:51:50.100 --> 01:51:51.100] You could be.

[01:51:51.100 --> 01:51:55.180] That's kind of the point of all this is that we could be availing ourselves of thousands

[01:51:55.180 --> 01:51:57.060] of edible plants.

[01:51:57.060 --> 01:52:00.980] Not just edible, but actually digestible, consumable plants.

[01:52:00.980 --> 01:52:06.060] And we can modify them with extra nutritional value and other things, oh my gosh.

[01:52:06.060 --> 01:52:10.540] Okay, which means that many species of cacti contain significant stores of water that can

[01:52:10.540 --> 01:52:15.820] be used as an emergency source in the desert is absolute fiction.

[01:52:15.820 --> 01:52:20.500] And not because the words are wishy-washy, because this is absolutely incorrect.

[01:52:20.500 --> 01:52:22.540] You cannot drink water from a cactus.

[01:52:22.540 --> 01:52:23.580] They are toxic.

[01:52:23.580 --> 01:52:24.580] They will make you sick.

[01:52:24.580 --> 01:52:25.580] Yeah.

[01:52:25.580 --> 01:52:29.300] It's the exact last thing you should do if you're in the desert.

[01:52:29.300 --> 01:52:34.940] If you are, you know, dying of thirst in the desert, the last thing you want to do is eat

[01:52:34.940 --> 01:52:40.860] the inside of a cactus, no matter how juicy it may seem to you, because it has alkaloid

[01:52:40.860 --> 01:52:42.260] poisons in it.

[01:52:42.260 --> 01:52:44.060] It's going to fry your kidneys.

[01:52:44.060 --> 01:52:48.620] It's going to stress out your kidneys, which are already stressed out by your dehydration.

[01:52:48.620 --> 01:52:53.660] It's a complete TV media myth that you could just get the water from the cactus.

[01:52:53.660 --> 01:52:54.660] It never happens.

[01:52:54.660 --> 01:52:55.940] You can't do it.

[01:52:55.940 --> 01:53:00.260] So I wanted to include that one partly for like, don't believe what you see on TV about

[01:53:00.260 --> 01:53:01.260] the cactus.

[01:53:01.260 --> 01:53:03.380] They are not a source of water in the desert.

[01:53:03.380 --> 01:53:08.260] If you are lost in the desert or abandoned or whatever and it's hot out and you're running

[01:53:08.260 --> 01:53:14.460] out of water, you know, your best chance is (drink your pee? nope) hunkering down in the

[01:53:14.460 --> 01:53:17.260] shade somewhere and hoping you get found.

[01:53:17.260 --> 01:53:20.220] If somebody has a general idea where you are, you have to hope they

[01:53:20.220 --> 01:53:21.960] come looking for you and find you.

[01:53:21.960 --> 01:53:24.260] If nobody knows where you are, you're probably dead.

[01:53:24.260 --> 01:53:28.780] You know, if you're out of water in the middle of the desert, I guess you could just pick

[01:53:28.780 --> 01:53:30.820] a direction, make a run for it and hope for the best.

[01:53:30.820 --> 01:53:34.860] But you know, it's just, you don't need a cactus though.

[01:53:34.860 --> 01:53:36.140] Why are you in the middle of the desert?

[01:53:36.140 --> 01:53:37.140] Go to the street.

[01:53:37.140 --> 01:53:38.140] There are streets everywhere.

[01:53:38.140 --> 01:53:39.140] Yeah.

[01:53:39.140 --> 01:53:41.700] Find the street.

[01:53:41.700 --> 01:53:46.340] One of the resources I read was saying, and when you think about it, why would the cactus

[01:53:46.340 --> 01:53:47.340] do that?

[01:53:47.340 --> 01:53:51.700] Like evolutionarily, because then animals would come eat it, you know, water is precious

[01:53:51.700 --> 01:53:52.700] in the desert.

[01:53:52.700 --> 01:53:56.340] Why would you make your water accessible to anything that wants to come by and eat it?

[01:53:56.340 --> 01:54:01.660] So of course they fill it with toxins so nobody, no animals would eat it, including humans.

[01:54:01.660 --> 01:54:03.260] It also doesn't even make sense.

[01:54:03.260 --> 01:54:09.460] Like, it's a low-water plant, you know, it's not just some, like,

[01:54:09.460 --> 01:54:11.340] vase of water.

[01:54:11.340 --> 01:54:13.420] There's not enough water for it to be that way.

[01:54:13.420 --> 01:54:19.460] I think it comes from the sort of magical noble savage mythology where we think, like,

[01:54:19.460 --> 01:54:25.500] people of the land know all these secrets and like you have to show them doing amazing

[01:54:25.500 --> 01:54:29.420] things like you're starving to death and they just walk over and cut open a cactus and suck

[01:54:29.420 --> 01:54:31.780] the juice out, and they're good.

[01:54:31.780 --> 01:54:33.340] So they just make up stuff like that.

[01:54:33.340 --> 01:54:34.340] You know what I mean?

[01:54:34.340 --> 01:54:38.300] And so it just becomes part of the mythology, part of the narrative, but it's just not based

[01:54:38.300 --> 01:54:39.300] on reality.

[01:54:39.300 --> 01:54:40.300] Yikes.

[01:54:40.300 --> 01:54:41.300] That's actually,

[01:54:41.300 --> 01:54:42.300] You can't eat aloe though?

[01:54:42.300 --> 01:54:43.300] Yeah.

[01:54:43.300 --> 01:54:44.300] Aloe is edible.

[01:54:44.300 --> 01:54:45.300] Yeah.

[01:54:45.300 --> 01:54:47.940] I mean, so there are some succulents, just not, you know, there's like

[01:54:47.940 --> 01:54:51.460] two cacti that I think you could eat, but not that many.

[01:54:51.460 --> 01:54:56.460] I didn't know this, but true cacti are native only to the Americas.

[01:54:56.460 --> 01:54:57.460] Oh, cool.

[01:54:57.460 --> 01:55:03.900] There's like 750 species, 749 of which are only found in the Americas.

[01:55:03.900 --> 01:55:12.900] There's one cactus found in another country that's considered native, and it had to

[01:55:12.900 --> 01:55:13.900] get there somehow.

[01:55:13.900 --> 01:55:19.700] At some point it was probably accidental, like it was brought over there

[01:55:19.700 --> 01:55:21.580] as a contaminant or whatever.

[01:55:21.580 --> 01:55:27.700] Because, you know, they evolved in the Americas, they're not native anywhere else.

[01:55:27.700 --> 01:55:33.180] And you know, there are cactus-like plants in other deserts, right?

[01:55:33.180 --> 01:55:34.780] But they're not true cacti.

[01:55:34.780 --> 01:55:36.740] They're not cladistically cacti, right?

[01:55:36.740 --> 01:55:45.140] They're not in the cactus clade; they just superficially look like cactus.

[01:55:45.140 --> 01:55:46.140] Right.

[01:55:46.140 --> 01:55:47.140] All right.

[01:55:47.140 --> 01:55:48.140] Well, good job.

[01:55:48.140 --> 01:55:49.140] All right.

[01:55:49.140 --> 01:55:50.140] Thank you.

[01:55:50.140 --> 01:55:51.140] Thank you.

[01:55:51.140 --> 01:55:52.140] Yay.

Skeptical Quote of the Week (1:55:51)

The illiterates of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.
Alvin Toffler (1928-2016), American writer, futurist, and businessman

[01:55:52.140 --> 01:55:53.140] All right.

[01:55:53.140 --> 01:55:54.140] Evan, give us a quote.

[01:55:54.140 --> 01:55:57.700] The illiterates of the 21st century will not be those who cannot read and write, but those

[01:55:57.700 --> 01:56:02.020] who cannot learn, unlearn and relearn.

[01:56:02.020 --> 01:56:03.020] Alvin Toffler.

[01:56:03.020 --> 01:56:05.140] And who is he?

[01:56:05.140 --> 01:56:12.180] Alvin Toffler, American writer, futurist and businessman known for his works discussing

[01:56:12.180 --> 01:56:17.380] modern technologies, including the digital revolution and the communication revolution.

[01:56:17.380 --> 01:56:21.620] And obviously he was, you know, prominent in the 20th century.

[01:56:21.620 --> 01:56:26.020] Oh, his first major book about the future was called Future Shock in 1970.

[01:56:26.020 --> 01:56:27.020] Oh, yeah.

[01:56:27.020 --> 01:56:28.020] I know that book.

[01:56:28.020 --> 01:56:29.020] I remember that.

[01:56:29.020 --> 01:56:30.020] Perhaps you read that.

[01:56:30.020 --> 01:56:31.020] I did.

[01:56:31.020 --> 01:56:32.740] I forgot the author's name, but I'm very familiar with that book.

[01:56:32.740 --> 01:56:33.740] Future Shock.

[01:56:33.740 --> 01:56:34.740] Yep.

[01:56:34.740 --> 01:56:35.740] Yep.

[01:56:35.740 --> 01:56:36.740] Worldwide bestseller.

[01:56:36.740 --> 01:56:37.740] Yeah.

[01:56:37.740 --> 01:56:38.740] Cool.

[01:56:38.740 --> 01:56:39.740] Yeah.

[01:56:39.740 --> 01:56:40.740] I mean, that's totally true.

[01:56:40.740 --> 01:56:41.740] I mean, I agree with that.

[01:56:41.740 --> 01:56:42.740] Obviously, it's a little pithy.

[01:56:42.740 --> 01:56:45.940] But the idea is that it's not just basic, you know, reading and writing.

[01:56:45.940 --> 01:56:50.220] Well, still, literacy is, you know, basic literacy is still important and not everybody

[01:56:50.220 --> 01:56:51.220] has it.

[01:56:51.220 --> 01:56:53.060] And that's still something that we need to fix.

[01:56:53.060 --> 01:56:59.340] But even among the literate, they do separate out by access to information and also the

[01:56:59.340 --> 01:57:06.180] ability to know how to consume information in a world flush with misinformation and disinformation.

[01:57:06.180 --> 01:57:07.180] So that's the ability.

[01:57:07.180 --> 01:57:12.860] You need to unlearn things and you need to be able to then readjust what you think you

[01:57:12.860 --> 01:57:13.860] know.

[01:57:13.860 --> 01:57:16.420] Yeah, Alvin Toffler and Yoda both said it.

[01:57:16.420 --> 01:57:17.420] Media savvy.

[01:57:17.420 --> 01:57:18.420] All that.

[01:57:18.420 --> 01:57:19.420] That's all the new literacy.

[01:57:19.420 --> 01:57:20.420] Definitely.

[01:57:20.420 --> 01:57:21.820] Because the information is there for anybody who wants it.

[01:57:21.820 --> 01:57:26.780] The knowledge of how to access it and discern what's real from what's not

[01:57:26.780 --> 01:57:30.700] real is really the new literacy skill of the 21st century.

[01:57:30.700 --> 01:57:32.380] I totally agree with that.

Signoff/Announcements

[01:57:32.380 --> 01:57:33.820] All right, everyone.

[01:57:33.820 --> 01:57:35.460] Thanks for joining me this week.

[01:57:35.460 --> 01:57:36.460] You got it.

[01:57:36.460 --> 01:57:37.460] Thanks, Steve.

[01:57:37.460 --> 01:57:38.460] Thanks, Steve.

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned

  • Fact/Description, possibly with an article reference[8]
  • Fact/Description
  • Fact/Description

Notes

References

Vocabulary
