SGU Episode 897
== Signoff ==
[01:59:48.340 --> 01:59:50.220] Well, thank you all for joining me this week.
[01:59:50.220 --> 01:59:51.220] You got it, Steve.
[01:59:51.220 --> 01:59:52.220] Thanks.
[01:59:52.220 --> 01:59:53.220] Thanks, Steve.
<!-- ** if the signoff includes announcements or any additional conversation, it would be appropriate to include a timestamp for when this part starts
-->
'''S:''' —and until next week, this is your {{SGU}}. <!-- typically this is the last thing before the Outro -->
{{Outro664}}{{top}} <!-- for previous episodes, use the appropriate outro, found here: https://www.sgutranscripts.org/wiki/Category:Outro_templates -->
== Today I Learned ==
* Fact/Description, possibly with an article reference<ref>[url_for_TIL publication: title]</ref> <!-- add this format to include a referenced article, maintaining spaces: <ref>[URL publication: title]</ref> -->
This transcript is not finished. Please help us finish it! Add a Transcribing template to the top of this transcript before you start so that we don't duplicate your efforts. |
Template:Editing required (w/links) You can use this outline to help structure the transcription. Click "Edit" above to begin.
SGU Episode 897
September 17th 2022
(Image caption: By comparison, Neanderthals needed more brain to control their larger bodies.)
Skeptical Rogues
S: Steven Novella
B: Bob Novella
C: Cara Santa Maria
J: Jay Novella
E: Evan Bernstein
Quote of the Week
If I want to know how we learn and remember and represent the world, I will go to psychology and neuroscience.
Patricia Churchland, Canadian-American analytic philosopher
Links
Download Podcast
Show Notes
Forum Discussion
Introduction, Black Mirror reflections
Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.
[00:12.600 --> 00:17.480] Today is Wednesday, September 14th, 2022, and this is your host, Steven Novella.
[00:17.480 --> 00:19.280] Joining me this week are Bob Novella.
[00:19.280 --> 00:20.280] Hey, everybody.
[00:21.280 --> 00:22.280] Cara Santa Maria.
[00:21.280 --> 00:22.280] Howdy.
[00:22.280 --> 00:23.280] Jay Novella.
[00:23.280 --> 00:24.280] Hey, guys.
[00:24.280 --> 00:25.280] And Evan Bernstein.
[00:25.280 --> 00:26.280] Good evening, everyone.
[00:26.280 --> 00:30.800] You know what, guys, I'm rewatching The Black Mirror because I haven't seen most of those
[00:30.800 --> 00:32.600] episodes since they originally aired.
[00:32.600 --> 00:33.600] Really?
[00:33.600 --> 00:36.080] You mean you started at season one, episode one all over again?
[00:36.080 --> 00:37.080] Yeah.
[00:37.080 --> 00:38.160] I'm just going through in order.
[00:38.160 --> 00:43.840] And I forgot most of the details of the episodes, you know?
[00:43.840 --> 00:48.640] I sort of remember what the episode was about, but don't remember the details.
[00:48.640 --> 00:50.920] So it's almost like watching it again.
[00:50.920 --> 00:51.920] So good.
[00:51.920 --> 00:54.120] It is a brilliant TV series.
[00:54.120 --> 00:55.720] So you just watched the first season?
[00:55.720 --> 00:56.720] No.
[00:56.720 --> 00:57.720] I think I'm in the third season now.
[00:57.720 --> 01:00.800] I mean, there's not that many episodes, like four episodes a season, so I'm burning my
[01:00.800 --> 01:01.800] way through.
[01:01.800 --> 01:02.800] Yeah.
[01:02.800 --> 01:03.800] There's some good stuff in there, man.
[01:03.800 --> 01:04.800] Yeah.
[01:04.800 --> 01:05.800] Sure.
[01:05.800 --> 01:06.800] Very, very good.
[01:06.800 --> 01:07.800] Very good futurism, actually.
[01:07.800 --> 01:08.800] Quite good.
[01:08.800 --> 01:09.800] Even though they're mostly like cautionary tales.
[01:09.800 --> 01:10.800] Oh.
Cheating at Tournament Chess (1:09)
[01:10.800 --> 01:11.800] So speaking of cautionary tales.
[01:11.800 --> 01:12.800] Yeah.
[01:12.800 --> 01:16.040] If you're going to enter a chess tournament, okay?
[01:16.040 --> 01:17.040] Don't cheat.
[01:17.040 --> 01:18.040] Now, what the heck?
[01:18.040 --> 01:19.040] Where did that come from?
[01:19.040 --> 01:21.760] Why are you bringing that up, Evan?
[01:21.760 --> 01:24.320] Because of this particular news item I ran across today.
[01:24.320 --> 01:30.240] Of course, I'm a gamer, I've been a chess player, I've been in tournaments.
[01:30.240 --> 01:32.280] So chess is something that's near and dear to me.
[01:32.280 --> 01:37.520] So when chess pops up in the news, I do pause and I read about it.
[01:37.520 --> 01:42.240] And in this particular case, this headline, it's the New York Post, so take that for
[01:42.240 --> 01:47.920] what it is, but it reads, huge chess world upset of Grandmaster sparks wild claims of
[01:47.920 --> 01:52.160] cheating with vibrating sex toy.
[01:52.160 --> 01:53.160] What a title.
[01:53.160 --> 01:54.160] I love it.
[01:54.160 --> 01:58.240] So if that's not click bait, I don't know what it is.
[01:58.240 --> 01:59.680] But here's the thing.
[01:58.240 --> 02:04.740] Magnus Carlsen is currently the world chess champion, he's like a five-time world
[02:04.740 --> 02:05.740] chess champion.
[02:05.740 --> 02:12.400] He's on a long streak of wins, I believe he had 59 wins coming into a particular tournament
[02:12.400 --> 02:16.760] in which he was matched up in the first round against the lowest rated player, which obviously
[02:16.760 --> 02:17.760] makes sense.
[02:17.760 --> 02:21.160] Highest versus lowest and you meet in the middle and that's usually how the first round
[02:21.160 --> 02:22.600] works.
[02:22.600 --> 02:23.600] And he was upset.
[02:23.600 --> 02:24.600] He was beaten.
[02:24.600 --> 02:32.520] He was beaten by somebody who's effectively relatively new to the professional chess circuit
[02:32.520 --> 02:35.240] and tournaments and other things.
[02:35.240 --> 02:40.600] And it's causing obviously a controversy, a big one in the world of chess.
[02:40.600 --> 02:47.040] You see, because the person who beat him, his name is Hans Niemann, he admitted to cheating
[02:47.040 --> 02:49.240] in online tournaments when he was younger.
[02:49.240 --> 02:51.640] Oh boy, not good for him.
[02:51.640 --> 02:52.800] Yeah.
[02:52.800 --> 02:59.540] And so he has this cloud of accusations hovering over him that there is really no plausible
[02:59.540 --> 03:04.200] way in the world of chess that the lowest rated player can beat the highest rated who
[03:04.200 --> 03:09.800] happens to be the current grandmaster, world grandmaster, five time world champion in the
[03:09.800 --> 03:12.120] first round of a tournament like this.
[03:12.120 --> 03:18.060] Apparently it's so statistically nearly impossible that it likely would not have happened unless
[03:18.060 --> 03:21.600] there was some kind of cheating and you add on top of that the fact that this person has
[03:21.600 --> 03:26.840] admitted to cheating before.
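(For context on that plausibility claim: chess ratings use the Elo system, where the expected score of a game follows directly from the rating gap between the players. A minimal sketch of the standard formula is below; the ratings shown are placeholder numbers for illustration, not the players' actual ratings at the time.)
<pre>
def elo_expected_score(rating_a: float, rating_b: float) -> float:
    """Standard Elo expected score for player A against player B."""
    return 1 / (1 + 10 ** ((rating_b - rating_a) / 400))

# Placeholder ratings, purely to show how the estimate is made:
print(elo_expected_score(2850, 2650))  # ~0.76 expected score for the higher-rated player
print(elo_expected_score(2650, 2850))  # ~0.24 for the lower-rated player
</pre>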
[03:26.840 --> 03:33.620] He's being questioned by certainly lots of professional organizations about it, this
[03:33.620 --> 03:35.160] kid Nieman.
[03:35.160 --> 03:41.760] He has also been banned from chess.com, the world's number one chess website because of
[03:41.760 --> 03:42.760] the accusations.
[03:42.760 --> 03:43.760] I'm sorry, is it chess.org or chess.com?
[03:43.760 --> 03:44.760] I thought it was chess.com.
[03:44.760 --> 03:45.760] Evan.
[03:45.760 --> 03:50.640] And he's been banned from them because of these cheating accusations, yep.
[03:50.640 --> 03:54.760] The part that I don't get is you can make the accusation.
[03:54.760 --> 04:00.760] Well, first of all, I'm very triggery about someone like, I didn't win so therefore it
[04:00.760 --> 04:04.240] must be cheating, right, because we're seeing that.
[04:04.240 --> 04:05.240] Yes.
[04:05.240 --> 04:09.040] Number two, they either caught the guy or they didn't catch the guy.
[04:09.040 --> 04:10.040] You can't say afterwards.
[04:10.040 --> 04:11.040] They didn't catch him.
[04:11.040 --> 04:12.720] They did not catch him.
[04:12.720 --> 04:14.240] Let's say he had a device on him.
[04:14.240 --> 04:16.000] Let's say he was cheating, right?
[04:16.000 --> 04:17.000] Yes.
[04:17.000 --> 04:19.040] They don't catch him during the competition.
[04:19.040 --> 04:25.040] He gets up, he walks out, he gets rid of anything that could incriminate him.
[04:25.040 --> 04:28.400] So now they're making an accusation that is virtually unprovable.
[04:28.400 --> 04:34.440] So what I read, first of all, Carlsen, the champion who lost, did not directly accuse
[04:34.440 --> 04:37.040] him of cheating, but he implied it.
[04:37.040 --> 04:42.520] He quote unquote all but accused him, but he didn't straight up say he cheated.
[04:42.520 --> 04:46.200] And you're right, Jay, from what I'm reading, we're not experts, but this is an interesting
[04:46.200 --> 04:51.480] story is that it's all based on plausibility and game analysis.
[04:51.480 --> 04:54.720] It's based upon like what's more likely to be true.
[04:54.720 --> 04:57.160] There's no direct evidence that he cheated.
[04:57.160 --> 04:58.160] Yeah.
[04:58.160 --> 05:02.720] Speaking of game analysis, though, I just read that both, if you look at gameplay, both
[05:02.720 --> 05:08.000] sides were making mistakes and the author was claiming that, you know, something that
[05:08.000 --> 05:13.080] would make you think that maybe he really wasn't cheating if he was also making mistakes,
[05:13.080 --> 05:17.440] which isn't necessarily true because you could just cheat not for every move, but for just
[05:17.440 --> 05:20.560] some of the critical moves, you know, so you could still make mistakes.
[05:20.560 --> 05:21.560] So yeah.
[05:21.560 --> 05:27.080] So the initial analysis was like when people were watching the game live, like if you were
[05:27.080 --> 05:31.800] listening to the commentary from what I'm reading again, it said that Carlsen kind of
[05:31.800 --> 05:32.800] underestimated.
[05:32.800 --> 05:35.360] He was like, this is the first round, this is a low strength player.
[05:35.360 --> 05:41.520] He kind of rushed and that he messed up, like he did not play well early in the game, but
[05:41.520 --> 05:45.080] that he should have still been able to play him to a draw.
[05:45.080 --> 05:50.760] But then he made a bad move late in the game that Niemann exploited and won.
[05:50.760 --> 05:55.760] So it just it looked like he choked because he underestimated based on what you just said,
[05:55.760 --> 05:56.760] man.
[05:56.760 --> 06:02.040] However, once Carlsen brought up the possibility that the guy cheated and people like analyze
[06:02.040 --> 06:09.720] the game in detail, some people are saying that Niemann made a clutch, brilliant move
[06:09.720 --> 06:16.680] really quickly and that that might imply that, you know, he that he cheated, that he was,
[06:16.680 --> 06:18.480] you know, that there was some sort of guidance.
[06:18.480 --> 06:19.480] Yeah.
[06:19.480 --> 06:20.480] But of course, we don't.
[06:20.480 --> 06:23.840] This is all, you know, speculation, speculation and probability.
[06:23.840 --> 06:25.600] It's possible that it was just an upset.
[06:25.600 --> 06:29.280] The thing is, unusual outcomes are going to occur from time to time.
[06:29.280 --> 06:33.100] And when they do, you can point to that's an anomaly and therefore there must be something
[06:33.100 --> 06:34.100] going on.
[06:34.100 --> 06:36.540] But anomaly should happen pretty regularly.
[06:36.540 --> 06:37.920] And there are upsets in chess.
[06:37.920 --> 06:38.920] It does happen.
[06:38.920 --> 06:39.920] You know.
[06:39.920 --> 06:40.920] Oh, in all sports.
[06:40.920 --> 06:41.920] Sure.
[06:41.920 --> 06:42.920] Sure.
[06:42.920 --> 06:44.880] So it's not enough to say, oh, this guy should not have won.
[06:44.880 --> 06:49.920] They would need to show evidence that he actually cheated. Although it
[06:49.920 --> 06:57.800] is interesting, this idea that we can, quote unquote, prove cheating to a high degree of
[06:57.800 --> 07:00.780] probability by analyzing the game.
[07:00.780 --> 07:03.960] So let me give you an example from a game if you guys remember this.
[07:03.960 --> 07:07.140] But I can't remember the specific video game, which a lot of our listeners know.
[07:07.140 --> 07:10.720] But somebody, you know, how they do like a you try to run through the game as fast as
[07:10.720 --> 07:11.720] possible.
[07:11.720 --> 07:12.720] Yes.
[07:12.720 --> 07:13.720] I've seen some.
[07:13.720 --> 07:17.320] Somebody did that in one of the games, Portal, whatever it was, some
[07:17.320 --> 07:21.400] game where you could play through from beginning to end, and broke all records.
[07:21.400 --> 07:23.720] And I think it was from Minecraft.
[07:23.720 --> 07:28.120] I think he did a Minecraft run through like faster than anybody else.
[07:28.120 --> 07:34.820] And somebody calculated the odds of him getting the drops that he got in the game.
[07:34.820 --> 07:36.320] And it was like astronomical.
[07:36.320 --> 07:39.000] It just defied all probability.
[07:39.000 --> 07:41.120] So they said he must have been hacking somehow.
[07:41.120 --> 07:47.520] He was cheating. And it wasn't based on speed, but on drops.
[07:47.520 --> 07:49.520] And when you say drops for people who aren't familiar with Minecraft.
[07:49.520 --> 07:54.200] So in other words, like you kill a bad guy and he drops treasure and that that drop is
[07:54.200 --> 07:58.480] random and there's a very hard probability.
[07:58.480 --> 07:59.480] It's coded into the game.
[07:59.480 --> 08:03.840] Like there's a one percent chance that you'll get this drop, you know, in a perfect thing.
[08:03.840 --> 08:04.840] Yeah.
[08:04.840 --> 08:10.880] So if you calculate the odds of him getting the favorable drops that he got, it defies
[08:10.880 --> 08:11.880] all.
[08:11.880 --> 08:12.880] It's like winning a lottery.
[08:12.880 --> 08:15.760] You know, it was like, but somebody always wins the lottery.
[08:15.760 --> 08:17.520] Well, that's that's kind of the point.
[08:17.520 --> 08:18.520] It's different.
[08:18.520 --> 08:19.520] No, but it's different.
[08:19.520 --> 08:20.520] It's numbers are different.
[08:20.520 --> 08:21.520] Yeah.
[08:21.520 --> 08:23.480] I have 10 million people play in that game.
[08:23.480 --> 08:24.480] Yeah.
[08:24.480 --> 08:29.640] But but so many but so many attempts at it if it's a large enough number, shouldn't there
[08:29.640 --> 08:30.640] be?
[08:30.640 --> 08:31.720] But it wasn't even close.
[08:31.720 --> 08:33.640] Not that many people do this right.
[08:33.640 --> 08:38.040] Do this like fast running, you know, run through of Minecraft.
[08:38.040 --> 08:42.320] The probability that somebody doing this, let's say there are thousands of people doing
[08:42.320 --> 08:43.320] it, whatever.
[08:43.320 --> 08:48.200] It still is like, you know, trillions to one against, like orders of magnitude off. Trillions
[08:48.200 --> 08:49.400] is a tough number to overcome.
[08:49.400 --> 08:50.400] It's just yeah.
[08:50.400 --> 08:51.400] Yeah.
[08:51.400 --> 08:54.840] It just should not have happened by chance. That doesn't mean it's impossible.
[08:54.840 --> 08:59.120] We're just saying probabilistically it's a huge red flag.
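(A rough sketch of the kind of calculation being described: treat each drop as an independent chance event, ask how likely a run of that many favorable drops is, and then allow for how many total attempts are being made. The drop rate and counts below are made-up placeholders, not the actual speedrun figures.)
<pre>
from math import comb

def tail_prob(n_trials: int, n_successes: int, p: float) -> float:
    """Chance of at least n_successes favorable drops in n_trials,
    if each drop independently succeeds with probability p."""
    return sum(comb(n_trials, k) * p**k * (1 - p)**(n_trials - k)
               for k in range(n_successes, n_trials + 1))

# Placeholder numbers for illustration only:
p_single = tail_prob(n_trials=250, n_successes=50, p=0.05)

# Even allowing for many runners making many attempts, the chance that
# *someone* gets a streak this lucky can remain vanishingly small:
attempts = 1_000_000
p_anyone = 1 - (1 - p_single) ** attempts
print(p_single, p_anyone)
</pre>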
[08:59.120 --> 09:02.660] It's I think a little bit harder to say that with chess because it's not hard probabilities
[09:02.660 --> 09:03.660] that you can calculate.
[09:03.660 --> 09:07.560] It's just like maybe the guy choked and maybe the other guy got lucky or he made a he made
[09:07.560 --> 09:08.600] a move.
[09:08.600 --> 09:11.040] In retrospect, it was a brilliant move, but he could have just got lucky.
[09:11.040 --> 09:13.000] I mean, you know, could have just been.
[09:13.000 --> 09:14.000] Yeah.
[09:14.000 --> 09:15.000] Yeah.
[09:15.000 --> 09:18.860] The big thing for me, Steve, was when you said that this guy made
[09:18.860 --> 09:20.480] some bad moves.
[09:20.480 --> 09:21.480] He did.
[09:21.480 --> 09:24.860] A bunch of uncharacteristically bad moves.
[09:24.860 --> 09:29.600] And to me, that really kind of sways it back into this guy's corner, I think, because if
[09:29.600 --> 09:35.040] he if the champ still played a brilliant game and the guy still took him out, then that
[09:35.040 --> 09:38.520] would be, you know, it would be different, a little bit different.
[09:38.520 --> 09:39.520] Right.
[09:39.520 --> 09:40.520] Now, in terms of the cheating.
[09:40.520 --> 09:43.520] I mean, you know, this is why you don't cheat, man, because then your reputation's in the
[09:43.520 --> 09:44.520] shitter.
[09:44.520 --> 09:45.520] Yeah.
[09:45.520 --> 09:46.520] That's right.
[09:46.520 --> 09:48.080] Then if you do get lucky, no one's going to believe you.
[09:48.080 --> 09:50.280] But he said and even said, listen, he admitted it.
[09:50.280 --> 09:54.760] I admitted that I cheated once when I was 12 years
[09:54.760 --> 09:55.760] old.
[09:55.760 --> 10:03.160] And then when he was 16. He's now 19 years old, but he says, oh, you know, he's sorry about
[10:03.160 --> 10:04.160] those.
[10:04.160 --> 10:05.160] He's reformed, whatever.
[10:05.160 --> 10:06.880] He cheats about every three years.
[10:06.880 --> 10:11.000] That's what you're saying.
[10:11.000 --> 10:12.520] You can kind of take that for what it's worth.
[10:12.520 --> 10:16.820] I mean, if you were like 30, I would say, OK, it was like he was a child and I was.
[10:16.820 --> 10:18.560] But he's 19.
[10:18.560 --> 10:25.560] It's still 16 to 19 is a huge deal, but it's not so much time that we could say he's out
[10:25.560 --> 10:29.880] of the woods in terms of still right bearing the burden of having a reputation of being
[10:29.880 --> 10:30.880] a cheater.
[10:30.880 --> 10:33.920] But it's interesting like you could make a case any way you want with something like
[10:33.920 --> 10:34.920] this.
[10:34.920 --> 10:35.920] You know, it's all about you.
[10:35.920 --> 10:37.920] You're missing like a part of this, Steve.
[10:37.920 --> 10:41.480] Evan, did I hear you correctly?
[10:41.480 --> 10:44.560] Did you say that they accused him of cheating with a sex toy?
[10:44.560 --> 10:48.160] Well, that's well, yeah, that where does that detail come from?
[10:48.160 --> 10:54.740] I'm not one hundred percent sure where that I think they're saying how could he have possibly
[10:54.740 --> 10:56.880] cheated using a piece of technology?
[10:56.880 --> 10:58.200] And this was one scenario.
[10:58.200 --> 11:03.640] And because it is, you know, because of the nature, the sexual nature of it, it obviously
[11:03.640 --> 11:06.520] gets a lot of attention more so than perhaps other.
[11:06.520 --> 11:12.160] But what's the what sex toy did this guy have that was helping him play chess?
[11:12.160 --> 11:18.040] Well, according to the accusation, it's something, you know, you anally insert and you vibrate
[11:18.040 --> 11:19.040] more.
[11:19.040 --> 11:20.480] And it vibrates and it vibrates.
[11:20.480 --> 11:22.720] Somebody would have had to have been controlling it remotely.
[11:22.720 --> 11:29.320] Well, yeah, you can other other another person or a computer or something else can control
[11:29.320 --> 11:30.320] the vibration.
[11:30.320 --> 11:32.800] Oh, and use it as a means of communication.
[11:32.800 --> 11:36.920] That's it's basically a way to send him information remotely.
[11:36.920 --> 11:37.920] Yeah, right.
[11:37.920 --> 11:38.920] Yeah.
[11:38.920 --> 11:39.920] But that's correct.
[11:39.920 --> 11:40.920] Yeah.
[11:40.920 --> 11:46.000] And that is a known thing in cheating, when somebody places
[11:46.000 --> 11:51.160] a device upon their body and it gives them a shock or a vibrational pulse or something.
[11:51.160 --> 11:55.600] It is very well established that people have done that in the past.
[11:55.600 --> 11:58.940] But do you think the guy was sitting there playing chess and every like five minutes
[11:58.940 --> 12:09.060] he'd be like, oh, well, this is what that sounds awfully like an argument from lack
[12:09.060 --> 12:10.060] of evidence.
[12:10.060 --> 12:11.060] Right.
[12:11.060 --> 12:12.720] It's like there's no evidence that he cheated.
[12:12.720 --> 12:18.280] That means he's a really good cheater because he had something in his butt, you know,
[12:18.280 --> 12:20.960] it's just that's not a very compelling argument.
[12:20.960 --> 12:22.360] But it is technically feasible.
[12:22.360 --> 12:25.160] You can communicate with very little information.
[12:25.160 --> 12:30.080] I think it's like three characters, three or four characters for any given chess move.
[12:30.080 --> 12:32.920] So it wouldn't take so that that can be done.
[12:32.920 --> 12:33.920] But yeah.
[12:33.920 --> 12:34.920] Yeah, you're right.
[12:34.920 --> 12:35.920] I mean, yeah.
[12:35.920 --> 12:36.920] Well, right.
[12:36.920 --> 12:37.920] Every piece occupies.
[12:37.920 --> 12:38.920] Yeah, that's right.
[12:38.920 --> 12:39.920] Every piece has a designation, a letter number combination.
[12:39.920 --> 12:44.180] So very, very easy, like you said, but let's follow this has those codes.
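(To illustrate how little information a chess move actually carries: a move can be written as a source square and a destination square, and each square is one of 64, so a whole move fits in about 12 bits. This is a toy encoding for illustration, not anything described in the episode.)
<pre>
FILES = "abcdefgh"

def encode_square(sq: str) -> int:
    """Map a square like 'e4' to a number from 0 to 63."""
    return (int(sq[1]) - 1) * 8 + FILES.index(sq[0])

def encode_move(move: str) -> int:
    """Pack a move like 'e2e4' into a single 12-bit integer."""
    return encode_square(move[:2]) << 6 | encode_square(move[2:])

m = encode_move("e2e4")
print(m, m.bit_length())  # 796 10 -- comfortably under two bytes
</pre>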
[12:44.180 --> 12:45.360] Let's follow this.
[12:45.360 --> 12:52.080] So he had to have a co-conspirator here that was like in the audience pressing.
[12:52.080 --> 12:53.160] Was it televised?
[12:53.160 --> 12:58.000] The button like he'd have to have somebody like looking up the information and then radioing
[12:58.000 --> 12:59.000] it to his butt.
[12:59.000 --> 13:00.000] Right.
[13:00.000 --> 13:01.000] Yeah.
[13:01.000 --> 13:02.800] So I I have to check and I haven't looked for the video.
[13:02.800 --> 13:09.360] I think it was somehow being televised or was able to be watched in real time.
[13:09.360 --> 13:15.200] And so, yeah, there would be some sort of in the audience would be too too risky.
[13:15.200 --> 13:20.080] A co-conspirator with them, or something. I don't know if there are automated
[13:20.080 --> 13:25.360] programs that read the chessboard or it's somehow programmed in or somebody online is
[13:25.360 --> 13:30.240] putting in the moves, and then that is being relayed into whatever device this
[13:30.240 --> 13:31.920] thing supposedly is that can transmit.
[13:31.920 --> 13:33.480] You know, I get it, you're right.
[13:33.480 --> 13:38.480] It's total speculation and unprovable at this point.
[13:38.480 --> 13:45.160] And you know, it does smack of kind of sour grapes overall, if you ask me, you know, queen
[13:45.160 --> 13:54.720] to D2. Oh, but yes, I mean, Carlsen is denying that he accused him of cheating
[13:54.720 --> 14:00.200] because I think he knows that is bad for him now, unless you have proof.
[14:00.200 --> 14:01.200] Yeah.
[14:01.200 --> 14:03.280] You don't accuse the other guy of cheating.
[14:03.280 --> 14:05.560] Have them play five more games.
[14:05.560 --> 14:08.240] Let's see how this guy does that.
[14:08.240 --> 14:09.360] That proves nothing.
[14:09.360 --> 14:10.360] It proves nothing.
[14:10.360 --> 14:11.360] Yeah.
[14:11.360 --> 14:12.360] Why?
[14:12.360 --> 14:15.960] Because we know that Carlsen will lose.
[14:15.960 --> 14:16.960] Yeah.
[14:16.960 --> 14:17.960] Right.
[14:17.960 --> 14:20.680] Because we know that the champion is better than the lowest ranking guy.
[14:20.680 --> 14:23.000] It's just that did he underestimate him and choke?
[14:23.000 --> 14:24.000] Right.
[14:24.000 --> 14:27.080] That's the question. Did the other guy get lucky? You know, that's the question.
[14:27.080 --> 14:31.440] And then nothing will answer that because it's done because the guy's clearly not going
[14:31.440 --> 14:32.920] to underestimate him a second time.
[14:32.920 --> 14:34.440] He's going to bring his freaking A game.
[14:34.440 --> 14:35.440] Yeah.
[14:35.440 --> 14:37.400] I played one Grandmaster in my life.
[14:37.400 --> 14:38.400] Really?
[14:38.400 --> 14:39.400] Yes.
[14:39.400 --> 14:40.400] How badly did he wipe you?
[14:40.400 --> 14:42.960] He destroyed me in like nine moves.
[14:42.960 --> 14:44.440] It was pretty much done.
[14:44.440 --> 14:45.440] Nine's not bad.
[14:45.440 --> 14:46.440] You held out for nine moves.
[14:46.440 --> 14:47.440] It was.
[14:47.440 --> 14:48.440] It was.
[14:48.440 --> 14:49.440] Yeah.
[14:49.440 --> 14:50.440] It was humbling.
[14:50.440 --> 14:51.440] It was just fun.
[14:51.440 --> 14:53.240] It was a friend of mine from high school.
[14:53.240 --> 14:54.240] His father.
[14:54.240 --> 14:55.240] Yeah.
[14:55.240 --> 14:56.240] Was technically a Grandmaster.
[14:56.240 --> 14:57.240] He played for 13.
[14:57.240 --> 14:58.240] I'd just like to be one of those guys.
[14:58.240 --> 15:01.240] You know, like, to have the Grandmaster play 20 people at once.
[15:01.240 --> 15:02.240] Yeah.
[15:02.240 --> 15:03.240] Oh, gosh.
[15:03.240 --> 15:04.240] Defeats being one of those people.
[15:04.240 --> 15:08.280] You're taking up one twentieth of his attention and he still wiped the board with you.
[15:08.280 --> 15:09.280] It's humbling.
[15:09.280 --> 15:10.280] Yeah.
[15:10.280 --> 15:11.280] So many moves.
[15:11.280 --> 15:12.280] Expertise.
[15:12.280 --> 15:13.280] Oh, gosh.
[15:13.280 --> 15:14.280] Yes.
[15:14.280 --> 15:15.280] And they're thinking so many moves ahead.
[15:15.280 --> 15:16.280] Yes.
[15:16.280 --> 15:17.280] Yeah.
[15:17.280 --> 15:21.640] The Korovinsky move from 1947 when he played Stratsky in this game and, you know, really
[15:21.640 --> 15:22.640] it comes down to that.
[15:22.640 --> 15:27.080] It's like they can memorize all the moves of a particular
[15:27.080 --> 15:31.960] game from a particular tournament from a particular year, you know, that was played 90 years
[15:31.960 --> 15:32.960] ago.
[15:32.960 --> 15:33.960] It's impressive.
[15:33.960 --> 15:37.720] What's interesting from a skeptical point of view is that so many people now are trying
[15:37.720 --> 15:45.320] to infer whether or not he cheated based upon circumstantial and tangential evidence and
[15:45.320 --> 15:49.760] the logical fallacies are flying, you know, the motivated reasoning is flying.
[15:49.760 --> 15:56.900] So it's interesting to watch that from the sidelines having zero stake in the game.
[15:56.900 --> 15:58.480] But it's interesting.
[15:58.480 --> 16:02.280] And if any objective evidence emerges, we'll let you know, because that would be
[16:02.280 --> 16:03.960] then you have the hindsight.
[16:03.960 --> 16:04.960] Right.
[16:04.960 --> 16:07.960] And we'll look at all those statements and inferences with hindsight.
[16:07.960 --> 16:08.960] All right.
[16:08.960 --> 16:10.240] We're going to start off.
Is It Real: Ear Snake (16:08)
[16:10.240 --> 16:11.800] Evan, you sent this around.
[16:11.800 --> 16:12.800] This is a segment.
[16:12.800 --> 16:14.240] I think we've done this once or twice before.
[16:14.240 --> 16:15.240] Is it real?
[16:15.240 --> 16:16.240] Right.
[16:16.240 --> 16:17.240] Is the segment.
[16:17.240 --> 16:18.240] Is it real?
[16:18.240 --> 16:20.720] Have you guys all seen the YouTube video of the ear snake?
[16:20.720 --> 16:21.720] Oh, yeah.
[16:21.720 --> 16:26.000] I you know, I was going to watch it and then I realized I don't want to see whether it's
[16:26.000 --> 16:27.000] fake or not.
[16:27.000 --> 16:28.000] I don't want to see.
[16:28.000 --> 16:29.000] Oh, my God.
[16:29.000 --> 16:31.200] A snake come out of somebody's ear.
[16:31.200 --> 16:32.380] It's a high creep factor.
[16:32.380 --> 16:33.380] It's like it's.
[16:33.380 --> 16:34.380] Oh, yes.
[16:34.380 --> 16:35.380] I want to see it.
[16:35.380 --> 16:37.600] Well, it's right now.
[16:37.600 --> 16:40.680] Snakes is a natural fear, Steve, or the brain.
[16:40.680 --> 16:42.880] We have a disposition towards fear of snakes.
[16:42.880 --> 16:43.880] Oh, yeah.
[16:43.880 --> 16:44.880] I mean, generally.
[16:44.880 --> 16:47.800] So right there, you know, is the cringe.
[16:47.800 --> 16:48.800] You don't see it come out, Jay.
[16:48.800 --> 16:52.760] It's just basically hanging out in the ear with the opening and closing its mouth.
[16:52.760 --> 16:55.800] I don't like its head is facing outward.
[16:55.800 --> 16:56.800] Yeah.
[16:56.800 --> 16:57.800] Right.
[16:57.800 --> 17:03.320] So it's a it's like a portion of a video of a longer video, which is, you know, cut
[17:03.320 --> 17:09.560] strategically to only show that there's a head of a snake protruding from a woman's
[17:09.560 --> 17:14.800] ear and someone with gloves and tweezers is kind of poking it and provoking it into making
[17:14.800 --> 17:15.800] these mouth gestures.
[17:15.800 --> 17:16.800] Oh, my God.
[17:16.800 --> 17:17.800] Right.
[17:17.800 --> 17:18.800] And they're so they're so funny.
[17:18.800 --> 17:19.800] How did it turn inside?
[17:19.800 --> 17:20.800] Did it enter from another ear?
[17:20.800 --> 17:21.800] I know.
[17:21.800 --> 17:22.800] Oh, gosh.
[17:22.800 --> 17:23.800] It's crazy.
[17:23.800 --> 17:30.800] So as a neurologist, I could tell you this is 100 percent fake.
[17:30.800 --> 17:33.080] There's just no place for the snake to be.
[17:33.080 --> 17:37.800] You would be dead if there was if there was a body attached to that snake head.
[17:37.800 --> 17:42.840] There's the only place for it to be is in your brain's brain, freaking dead if that
[17:42.840 --> 17:43.840] were real.
[17:43.840 --> 17:48.460] If that were coming out of a corpse, OK, then there would be some plausibility there.
[17:48.460 --> 17:54.360] And the other thing is, the doctor is clearly not trying to remove it.
[17:54.360 --> 17:56.960] If you were trying to remove it, you would freaking remove it.
[17:56.960 --> 17:59.200] He's just poking it to make it move, poking at it.
[17:59.200 --> 18:00.200] Yeah.
[18:00.200 --> 18:03.640] Like, there's no species of snake that's just a head, right?
[18:03.640 --> 18:09.480] Like, that would be the only plausible thing is if there was just a living head of a snake
[18:09.480 --> 18:10.480] there.
[18:10.480 --> 18:11.480] Right.
[18:11.480 --> 18:12.480] My guess is there's two options.
[18:12.480 --> 18:13.480] Yeah.
[18:13.480 --> 18:16.600] Either CG, which doesn't look CG, but I mean, it's possible.
[18:16.600 --> 18:17.600] No, it doesn't.
[18:17.600 --> 18:18.600] It could be.
[18:18.600 --> 18:19.600] It could be.
[18:19.600 --> 18:20.600] It could be.
[18:20.600 --> 18:21.600] It could be animatronic.
[18:21.600 --> 18:23.800] That's damn good animatronics.
[18:23.800 --> 18:26.240] Do you consider, Bob, that it was a ghost snake?
[18:26.240 --> 18:27.240] You know, it would be ethereal.
[18:27.240 --> 18:28.240] It wouldn't be actually.
[18:28.240 --> 18:29.240] Bigfoot snake.
[18:29.240 --> 18:30.240] It's a bigfoot snake.
[18:30.240 --> 18:31.240] Lots of feet on it.
[18:31.240 --> 18:34.520] It's a psychic, ghost, bigfoot snake from the future.
[18:34.520 --> 18:36.960] No, that's the most plausible explanation I've heard yet.
[18:36.960 --> 18:41.960] Or the most likely explanation, right, we pretty much, most people agree or Snopes agrees
[18:41.960 --> 18:47.880] or whatever, is that it's just a decapitated snake and they will move for a while, even
[18:47.880 --> 18:48.880] after decapitation.
[18:48.880 --> 18:50.440] And that's why he's poking it.
[18:50.440 --> 18:51.440] Yes.
[18:51.440 --> 18:53.760] Yeah, they cut the snake's head off, stuck it in her ear, and they're poking it to make
[18:53.760 --> 18:54.760] it move.
[18:54.760 --> 18:59.000] Okay, so if that's the, ooh, if that's the explanation, I don't know what's worse.
[18:59.000 --> 19:02.160] The false story or the actual explanation for this thing.
[19:02.160 --> 19:05.320] Also then, the question with no context, is this real?
[19:05.320 --> 19:06.320] Yes.
[19:06.320 --> 19:09.440] There is a decapitated snake head in her ear.
[19:09.440 --> 19:10.440] That's real.
[19:10.440 --> 19:16.080] Well, it's not real as presented, like as a living snake nestled in somebody's ear.
[19:16.080 --> 19:19.400] And that is what we're supposed to get from it, because the first thing I said was, why
[19:19.400 --> 19:21.800] is there just a snake head in her ear?
[19:21.800 --> 19:25.440] Because of course, any reasonable person knows that there's nowhere for the body to go, because
[19:25.440 --> 19:28.560] your ear canal, how big is your ear canal?
[19:28.560 --> 19:29.560] It's teeny.
[19:29.560 --> 19:30.560] I don't know.
[19:30.560 --> 19:31.560] Like a centimeter or two?
[19:31.560 --> 19:32.560] Yeah.
[19:32.560 --> 19:33.640] An inch max?
[19:33.640 --> 19:34.760] I don't know.
[19:34.760 --> 19:35.760] And it's narrow.
[19:35.760 --> 19:36.760] Yeah.
[19:36.760 --> 19:37.760] That's what I'm saying.
[19:37.760 --> 19:38.760] Yeah.
[19:38.760 --> 19:39.760] It's short and narrow.
[19:39.760 --> 19:40.760] There's no...
[19:40.760 --> 19:44.880] And then there's your cochlea, your inner ear, and then there's your brainstem.
[19:44.880 --> 19:48.560] You know, it's just, there's no place for the snake body to be.
[19:48.560 --> 19:50.200] So there's clearly no snake body there, right?
[19:50.200 --> 19:51.480] That's that we could say for sure.
[19:51.480 --> 19:56.880] Whether it's CG or a recently decapitated head or whatever, there's no body attached
[19:56.880 --> 19:57.880] to it.
[19:57.880 --> 19:58.880] It's an illusion.
[19:58.880 --> 20:01.440] It's an illusion, right?
[20:01.440 --> 20:06.560] Is surgeon in quotes, struggles to remove live snake bones there.
[20:06.560 --> 20:09.160] It's a surgeon in quotes.
[20:09.160 --> 20:10.160] Yeah.
[20:10.160 --> 20:15.600] Because apparently it started as a clip on Facebook, posted on September 1st by an India
[20:15.600 --> 20:21.280] based social media star named Chandan Singh, with 20,000 followers, whatever.
[20:21.280 --> 20:26.900] And surgeon, it was written in a foreign language, I can't read it, but the word surgeon was
[20:26.900 --> 20:27.940] in there.
[20:27.940 --> 20:31.740] And also in quotes, it says the snake has gone in the ear.
[20:31.740 --> 20:35.800] So that's why surgeon is quoted the way it is.
[20:35.800 --> 20:37.140] That guy's not a surgeon.
[20:37.140 --> 20:41.660] Or if he is, he's not trying to remove that snake, because if he were, he would freaking
[20:41.660 --> 20:42.660] remove it.
[20:42.660 --> 20:47.880] And the other thing is, if he removed the snake, why are we not seeing that portion
[20:47.880 --> 20:48.880] of the video?
[20:48.880 --> 20:49.880] Right.
[20:49.880 --> 20:55.560] Why is it so conveniently cut before you get to see the head pop out or whatever?
[20:55.560 --> 21:00.800] And after the removal of something other than just poking the snake head.
[21:00.800 --> 21:06.800] But sitting there and allowing yourself to be used like that for this purpose is heroic,
[21:06.800 --> 21:08.760] brave or weird.
[21:08.760 --> 21:14.320] I've seen people do weirder, grosser things on the internet, so not surprising.
[21:14.320 --> 21:18.000] All right, let's move on to some news items.
News Items
What Children Believe (21:18)
[21:18.000 --> 21:21.200] Kara, you're going to start us off with what children believe.
[21:21.200 --> 21:22.200] Yeah.
[21:22.200 --> 21:26.520] This is one of those like I could approach this two different ways because as I was reading
[21:26.520 --> 21:29.860] the coverage of this, the headlines made it sound kind of juicy.
[21:29.860 --> 21:34.560] And then the more I dug into the actual paper, like the research study that we're about to
[21:34.560 --> 21:38.600] talk about, the more I was like, uh-huh, uh-huh, well, duh.
[21:38.600 --> 21:45.360] And so let's talk a little bit about how this is a well, duh subject, but also why it matters.
[21:45.360 --> 21:54.400] So an article published in Child Development by some Canadian and, I think, U.S. researchers,
[21:54.400 --> 22:02.080] yeah, University of Toronto and also Harvard, were saying kind of do kids believe everything
[22:02.080 --> 22:03.080] that they're told?
[22:03.080 --> 22:07.120] I think any of us, like as you're reading coverage of this, it's really funny to me
[22:07.120 --> 22:10.080] because you're like, have you ever been around a child?
[22:10.080 --> 22:15.680] Like yes, kids are gullible, but yes, kids also are explorers.
[22:15.680 --> 22:18.320] And so the question here, they're observers and they're explorers.
[22:18.320 --> 22:22.200] So the question here is how do those things converge?
[22:22.200 --> 22:24.940] I don't think it's really well answered by this study, but let me tell you what they
[22:24.940 --> 22:26.200] did in the study.
[22:26.200 --> 22:29.520] So there's sort of two paradigms.
[22:29.520 --> 22:33.120] In one of the paradigms, it was quite simple.
[22:33.120 --> 22:38.480] They basically had kids come in and they asked them straightforward questions like, is this
[22:38.480 --> 22:41.600] rock hard or soft?
[22:41.600 --> 22:44.880] And the kids were like, well, the rock's hard.
[22:44.880 --> 22:48.680] Like all of them said that because these were four to seven-year-olds, by the way, because
[22:48.680 --> 22:50.960] even a four-year-old knows that a rock is hard.
[22:50.960 --> 22:53.200] They have seen and felt rocks before.
[22:53.200 --> 22:57.720] And then they sort of randomized them into groups and in one group they were like, yeah,
[22:57.720 --> 22:58.720] it's hard.
[22:58.720 --> 23:02.560] And in the other group they were like, no, this rock is soft.
[23:02.560 --> 23:07.600] And then the researcher was like, oh, quick, I got a phone call, BRB, and left the kids
[23:07.600 --> 23:08.600] in the room.
[23:08.600 --> 23:13.300] And so unbeknownst to the kids, they were being videotaped and their exploratory behavior
[23:13.300 --> 23:14.740] was being observed.
[23:14.740 --> 23:16.280] And that's really what the study was.
[23:16.280 --> 23:22.640] What do the kids do after they're told that there's a surprising piece of information
[23:22.640 --> 23:28.800] that doesn't comport with their preexisting understanding of the world?
[23:28.800 --> 23:29.800] Kids explored.
[23:29.800 --> 23:33.040] I mean, you know, what do you expect them to do?
[23:33.040 --> 23:36.080] I guess you might expect that one or two kids are going to sit there and go, I guess the
[23:36.080 --> 23:39.400] rock's soft, weird, and like never touched the rock.
[23:39.400 --> 23:41.760] But most kids did exactly what you would think they would do.
[23:41.760 --> 23:47.620] They started to observe and explore and see for themselves, see if that adult's claim
[23:47.620 --> 23:50.440] was real.
[23:50.440 --> 23:51.440] What age?
[23:51.440 --> 23:52.440] Four to seven.
[23:52.440 --> 23:53.440] Yeah, sure.
[23:53.440 --> 23:54.440] Yeah.
[23:54.440 --> 24:00.600] What I love about it, though, because they have to summarize the study exactly, is that
[24:00.600 --> 24:07.520] in the pretest or in the pre-experimental condition group, so they ask them all, is
[24:07.520 --> 24:10.320] the rock hard or soft, all the kids said the rock was hard.
[24:10.320 --> 24:15.640] And then in part of the two different groups, one group, no, the rock is soft.
[24:15.640 --> 24:19.280] The other group, yeah, you're right, the rock is hard.
[24:19.280 --> 24:25.680] Most of the kids, it's really funny, they say most but not all of the kids concurred.
[24:25.680 --> 24:29.080] In the group where they were reinforced that the rock was hard, most but not all of the
[24:29.080 --> 24:32.620] kids continued to concur that the rock was hard.
[24:32.620 --> 24:33.960] This is my favorite part of the study.
[24:33.960 --> 24:36.760] That means at least one child was like, no, the rock's hard.
[24:36.760 --> 24:39.120] And then the adult goes, yeah, you're right, the rock's hard, and the kid goes, I think
[24:39.120 --> 24:40.120] it's soft.
[24:40.120 --> 24:41.120] Yeah, right?
[24:41.120 --> 24:43.240] Which is like, because they had to report it that way, right?
[24:43.240 --> 24:45.120] They said most, not all.
[24:45.120 --> 24:46.640] Anyway, that's an aside.
[24:46.640 --> 24:49.920] So some of the kids said, okay, maybe it is soft.
[24:49.920 --> 24:52.160] Some of the kids said, no, I think it's hard.
[24:52.160 --> 24:55.760] But regardless, they went in and they explored for themselves.
[24:55.760 --> 25:01.420] Then they did another study where they actually kind of compared the differences between the
[25:01.420 --> 25:06.540] younger group, so they sort of arbitrarily split them after the fact into a four to five-year-old
[25:06.540 --> 25:10.460] group and then into a six to seven-year-old group.
[25:10.460 --> 25:15.720] And in that one, they were given the vignettes, specific vignettes.
[25:15.720 --> 25:18.840] I think they were sort of based on the ideas from this first group.
[25:18.840 --> 25:24.600] And in those different vignettes, they asked the kids, what do you do if an adult says
[25:24.600 --> 25:28.140] to you, the sponge is harder than the rock?
[25:28.140 --> 25:32.160] What do you do if an adult says to you, blah, blah, blah?
[25:32.160 --> 25:36.680] And so they presented the kids with these vignettes of kind of unbelievable claims or
[25:36.680 --> 25:41.560] claims that shouldn't compute for them because they're between the ages of four and seven
[25:41.560 --> 25:45.560] and are old enough to know that a sponge is soft and a rock is hard.
[25:45.560 --> 25:53.340] And what they found was that the older group kind of identified strategies that were more
[25:53.340 --> 25:55.920] specific and more efficient.
[25:55.920 --> 26:01.080] So the older group would say things like, they should touch the sponge and they should
[26:01.080 --> 26:03.340] touch the rock and compare them.
[26:03.340 --> 26:09.280] And the younger group was less likely to have, I guess, a more sophisticated approach to
[26:09.280 --> 26:11.060] that problem solving.
[26:11.060 --> 26:12.680] This is being reported all over the place.
[26:12.680 --> 26:17.620] And basically, the quote that a lot of people are citing is from one of the researchers
[26:17.620 --> 26:20.160] that says, there's still a lot we don't know.
[26:20.160 --> 26:23.320] This is a senior author from the Toronto Lab.
[26:23.320 --> 26:24.880] It's called the Child Lab.
[26:24.880 --> 26:28.320] But what's clear is that children don't believe everything they're told.
[26:28.320 --> 26:31.520] They think about what they've been told, and if they're skeptical, they seek out additional
[26:31.520 --> 26:34.960] information that could confirm or disconfirm it.
[26:34.960 --> 26:38.480] And I think for me, this is the like, well, duh, haven't you ever been around a child
[26:38.480 --> 26:39.840] situation?
[26:39.840 --> 26:44.280] Because children aren't completely naive to the world by the time they're four.
[26:44.280 --> 26:50.020] They've lived for four years, and they've made their own observations.
[26:50.020 --> 26:55.940] And so Steve, you wrote this up, and you took sort of the study, and you said, well, let's
[26:55.940 --> 26:56.940] think broader.
[26:56.940 --> 27:00.040] Because of course, the findings, the outcome findings of the study are not surprising.
[27:00.040 --> 27:02.400] It's pretty narrow and not surprising, yeah.
[27:02.400 --> 27:03.900] It's super narrow.
[27:03.900 --> 27:06.240] It's really trying to test, are kids ultra gullible?
[27:06.240 --> 27:11.000] And it's like, well, yeah, sometimes they are, but obviously, sometimes they're not.
[27:11.000 --> 27:15.800] And then, you know, the bigger question is, you know, do kids believe everything they're
[27:15.800 --> 27:20.440] told by adults, kind of when does that start to change?
[27:20.440 --> 27:24.760] And sort of what are the factors, I think, that are involved here?
[27:24.760 --> 27:29.740] If the paradigm had been something that was a little bit vaguer or harder to confirm,
[27:29.740 --> 27:32.200] we may have seen a different outcome.
[27:32.200 --> 27:39.160] Very often, we're running into claims that we can't confirm ourselves, that we have to
[27:39.160 --> 27:42.880] confirm by figuring out who are the experts?
[27:42.880 --> 27:44.480] What are they saying?
[27:44.480 --> 27:46.120] Is there a consensus?
[27:46.120 --> 27:50.060] And this skill is not the skill that the study looked at at all.
[27:50.060 --> 27:54.600] This study looked at very basic scientific reasoning skills.
[27:54.600 --> 27:58.880] And so, I'm a little, like, the headline's fine.
[27:58.880 --> 28:01.760] Children don't believe everything they're told, well, again, duh.
[28:01.760 --> 28:06.040] Or children as, it's hard to lie to children according to scientists, that one's a stretch
[28:06.040 --> 28:07.440] for me.
[28:07.440 --> 28:08.440] Not liking that headline.
[28:08.440 --> 28:14.000] It's hard to lie to children about things that they can directly observe themselves.
[28:14.000 --> 28:17.120] They said, you know, an adult, an authority figure is telling them something.
[28:17.120 --> 28:23.480] But whether the child agreed or not, the child then explored to try and test that observation
[28:23.480 --> 28:24.480] for themselves.
[28:24.480 --> 28:27.920] And I think that is a fundamentally important aspect of this study.
[28:27.920 --> 28:32.360] But I don't think there's a lot of inferences that you can draw from it about how to develop
[28:32.360 --> 28:34.920] really strong critical thinking skills later in life.
[28:34.920 --> 28:37.240] Yeah, this one study is such a tiny slice.
[28:37.240 --> 28:39.880] You have to look at it in the context of so much of the research.
[28:39.880 --> 28:43.040] I didn't talk about it in my write-up, though, but we also, there's also research looking
[28:43.040 --> 28:51.520] at like if an adult tells a child, like gives them a toy and shows them how to use the toy,
[28:51.520 --> 28:57.040] the child will use the toy in the way that they were shown, even if it's a very narrow,
[28:57.040 --> 28:58.920] simplistic way of using the toy.
[28:58.920 --> 29:03.880] If they're given the toy with no direction, they will be more creative and they'll explore
[29:03.880 --> 29:07.760] and they'll use it in different ways and they'll try out different things.
[29:07.760 --> 29:14.520] So one factor is does it contradict or conflict with things they already know or are they
[29:14.520 --> 29:18.200] being told information in a vacuum?
[29:18.200 --> 29:23.460] And also, as you say, is it part of, like, are they being told, this is part of our identity
[29:23.460 --> 29:24.840] of who we are, right?
[29:24.840 --> 29:29.400] Obviously, parents convey religious beliefs to children and children believe them.
[29:29.400 --> 29:32.860] Most people have the religious faith that they were raised in.
[29:32.860 --> 29:36.520] And that's one of those things that you can't just turn around and observe for yourself.
[29:36.520 --> 29:38.840] Yeah, you can't say, is God there?
[29:38.840 --> 29:42.520] You know, there's nothing you can do to test that.
[29:42.520 --> 29:47.440] But what's interesting about testing kids, first of all, it's interesting to say, when
[29:47.440 --> 29:50.320] do certain modules, you know, engage?
[29:50.320 --> 29:52.640] When do they start to do things?
[29:52.640 --> 29:56.720] And you can see how they get more sophisticated and nuanced and how they approach things.
[29:56.720 --> 30:03.240] But also, there is this question about like whether or not, you know, children are like
[30:03.240 --> 30:09.600] more curious and more sort of questioning younger, and then it gets beaten out of them
[30:09.600 --> 30:12.800] by the desire to conform to society.
[30:12.800 --> 30:13.800] Right.
[30:13.800 --> 30:16.120] And is there something almost bimodal there, right?
[30:16.120 --> 30:19.800] Where when they're so young, they believe everything they're told because they don't
[30:19.800 --> 30:23.760] have context and they don't have anything to connect it to.
[30:23.760 --> 30:26.120] And they have no reason to question.
[30:26.120 --> 30:30.320] And then as they get older, they start to be more questioning.
[30:30.320 --> 30:35.040] And then as they get even older, still, they want to conform and belong because the idea
[30:35.040 --> 30:40.100] of social in-group, out-group status becomes more salient to them.
[30:40.100 --> 30:42.960] Perhaps it does kind of follow that to some extent.
[30:42.960 --> 30:44.840] I think creativity is the same way.
[30:44.840 --> 30:47.720] Creativity, I mean, it's interesting, you were talking about the research about giving
[30:47.720 --> 30:48.720] a child a toy.
[30:48.720 --> 30:50.260] And this is maybe a little bit of a departure.
[30:50.260 --> 30:55.060] But I worked with a professor who was like fascinated by creativity research.
[30:55.060 --> 30:58.480] And I hated it because I was like, how do you define that?
[30:58.480 --> 30:59.480] Oh, my gosh.
[30:59.480 --> 31:00.480] It's so vague.
[31:00.480 --> 31:02.000] It's all over the place.
[31:02.000 --> 31:09.840] And they often talked about like, give a child anything, a piece of equipment, a paperclip,
[31:09.840 --> 31:13.360] and have them list all the things it can be.
[31:13.360 --> 31:17.580] And for me, I would get frustrated when people would say it's super creative if they just
[31:17.580 --> 31:21.400] made a list of things that it could never be.
[31:21.400 --> 31:27.040] But creativity seemed really in that sweet spot when they would think of things that
[31:27.040 --> 31:31.400] were outside of the box, but they still used some amount of constraint.
[31:31.400 --> 31:33.680] Like a paperclip can't be an airplane.
[31:33.680 --> 31:35.200] It just can't.
[31:35.200 --> 31:37.440] But it can be this, this, this, and this.
[31:37.440 --> 31:41.460] And maybe those are things you wouldn't think of if you're always thinking inside the box.
[31:41.460 --> 31:48.280] And so there does seem to be some amount of developmental correlation there, right?
[31:48.280 --> 31:52.920] The older that you get, the more constrained your thinking becomes.
[31:52.920 --> 31:57.320] And so it is that sort of like the more conformist you are, the more it's being beaten out of
[31:57.320 --> 31:58.320] you.
[31:58.320 --> 32:02.360] But I think that also comes not just with age, but it comes with the amount of time
[32:02.360 --> 32:04.760] you spend in a certain paradigm as well.
[32:04.760 --> 32:09.920] Because you see this a lot with people who work in a certain field being brought into
[32:09.920 --> 32:13.600] another field to try to solve problems in that field.
[32:13.600 --> 32:16.240] And it's amazing what happens where they're like, well, did you try this?
[32:16.240 --> 32:20.760] And people are like, oh, my god, how have we none of us have ever thought of that before.
[32:20.760 --> 32:24.480] Because that's not how you were trained to think.
[32:24.480 --> 32:25.480] Fresh approach to it.
[32:25.480 --> 32:26.480] Yeah.
[32:26.480 --> 32:30.920] I always think of the fact that the Iceman, you know, they didn't know how he died.
[32:30.920 --> 32:39.640] And meanwhile, there's an arrowhead clearly visible on the X-ray in his chest that they
[32:39.640 --> 32:44.160] looked at for years and didn't see it because they weren't looking for it.
[32:44.160 --> 32:45.160] No, no.
[32:45.160 --> 32:46.880] They kept saying, wow, what is this arrow pointing to?
[32:46.880 --> 32:51.160] It must be a clue as to what might have killed him.
[32:51.160 --> 32:52.160] Someone's like, hey, what's that arrowhead?
[32:52.160 --> 32:53.160] You know, whatever.
[32:53.160 --> 33:00.960] Or what's the other one, the gorilla and the scan of the monkey in the brain?
[33:00.960 --> 33:06.240] They did a research study where they showed radiologists a CT scan of the chest.
[33:06.240 --> 33:09.880] And they said, tell us what pathology you see there.
[33:09.880 --> 33:14.000] And there was literally a gorilla in the middle of the chest.
[33:14.000 --> 33:15.000] And nobody found it.
[33:15.000 --> 33:16.000] Oh, that's great.
[33:16.000 --> 33:17.000] Nobody.
[33:17.000 --> 33:20.640] It's like a large percentage of them didn't see it because, of course, they're not looking
[33:20.640 --> 33:21.640] for it.
[33:21.640 --> 33:22.640] They're looking for what they know to be pathology.
[33:22.640 --> 33:23.640] Yeah.
[33:23.640 --> 33:24.640] That's based on that classic.
[33:24.640 --> 33:25.640] Yeah.
[33:25.640 --> 33:26.640] It's inattentional blindness.
[33:26.640 --> 33:27.640] There's a classic psych experiment that you can Google it.
[33:27.640 --> 33:28.640] Like, you can watch the video.
[33:28.640 --> 33:32.160] It's a super classic video where they tell people, count how many times the ball is passed.
[33:32.160 --> 33:36.400] And it's a really complex video where basketball is being passed amongst a lot of people.
[33:36.400 --> 33:39.320] You really have to focus to count the passes.
[33:39.320 --> 33:41.400] And while focusing on it, it's inattentional blindness.
[33:41.400 --> 33:43.640] That's the phenomenon that they're highlighting.
[33:43.640 --> 33:47.360] While you're focusing on the ball, you literally don't see the gorilla walk completely through
[33:47.360 --> 33:48.360] the scene.
[33:48.360 --> 33:49.360] It's amazing.
[33:49.360 --> 33:50.360] It's amazing.
[33:50.360 --> 33:51.360] I know.
[33:51.360 --> 33:54.440] And professors love to show this to first-year psych students and go, anybody notice anything
[33:54.440 --> 33:56.800] weird about the video?
[33:56.800 --> 33:59.240] It's pretty cool how many people are like, what do you mean?
[33:59.240 --> 34:01.880] About 30% see it, 30% or 40%.
[34:01.880 --> 34:02.880] Yeah.
[34:02.880 --> 34:03.880] Yeah.
[34:03.880 --> 34:05.920] But that's why they used a gorilla in the x-ray study.
[34:05.920 --> 34:06.920] Yeah.
[34:06.920 --> 34:07.920] Because it was like a nod.
[34:07.920 --> 34:10.920] An homage to that original gorilla video.
[34:10.920 --> 34:11.920] Yeah.
[34:11.920 --> 34:12.920] Fascinating.
[34:12.920 --> 34:15.800] Well, everyone, we're going to take a quick break from our show to talk about one of our
[34:15.800 --> 34:18.680] sponsors this week, BetterHelp.
[34:18.680 --> 34:21.640] There are so many reasons to go to therapy.
[34:21.640 --> 34:23.760] I mean, too many to list.
[34:23.760 --> 34:28.800] And I think all of us know that whether we're struggling with a mental illness, with an
[34:28.800 --> 34:35.120] actual diagnosis, or whether we're dealing with an experience in our lives, that we just
[34:35.120 --> 34:38.640] need a little bit of support, a little bit of guidance through.
[34:38.640 --> 34:43.760] These are all valid reasons to talk to somebody, and BetterHelp makes it super easy because,
[34:43.760 --> 34:45.560] of course, this is online therapy.
[34:45.560 --> 34:46.560] Yeah.
[34:46.560 --> 34:50.640] Kara, in my personal experience, going to therapy, it's doing multiple things at the
[34:50.640 --> 34:51.640] same time.
[34:51.640 --> 34:53.720] I just feel good after I go to therapy.
[34:53.720 --> 34:55.680] It's like I'm unloading.
[34:55.680 --> 35:01.200] And along with that, I'm learning skills to help me deal with my own anxiety and depression,
[35:01.200 --> 35:02.840] which it's a double win.
[35:02.840 --> 35:06.320] So when you want to be a better problem solver, therapy can get you there.
[35:06.320 --> 35:11.600] Visit BetterHelp.com slash SGU today to get 10% off your first month.
[35:11.600 --> 35:15.000] That's Better H-E-L-P dot com slash SGU.
[35:15.000 --> 35:16.160] All right, guys.
[35:16.160 --> 35:17.640] Let's get back to the show.
[35:17.640 --> 35:18.760] All right.
[35:18.760 --> 35:19.760] Let's move on.
Health Effects of Gas Stoves (35:18)
[35:19.760 --> 35:23.680] Jay, is my gas stove slowly killing me?
[35:23.680 --> 35:25.400] It is, isn't it, Jay?
[35:25.400 --> 35:26.400] Check the gas.
[35:26.400 --> 35:30.320] Do you hear it moving around the house at night when you're in bed?
[35:30.320 --> 35:31.320] Is that what it is?
[35:31.320 --> 35:32.320] Is that my noise?
[35:32.320 --> 35:37.160] Yeah, the question is, is having a gas stove in your house dangerous to your health?
[35:37.160 --> 35:42.000] Yeah, unfortunately, we've known this for a while, but recent research strongly suggests
[35:42.000 --> 35:43.680] the answer is yes.
[35:43.680 --> 35:52.440] Gas stoves are considered, and this is my opinion, but lots of people feel this way, the best of all versions of stoves out there.
[35:52.440 --> 35:53.440] Disagree.
[35:53.440 --> 35:54.440] Induction.
[35:54.440 --> 35:58.880] I'll explain to you right now why I like it better than induction.
[35:58.880 --> 36:04.000] Because you have more control over the temperature because I can visually tell what my... Once
[36:04.000 --> 36:09.780] you learn your stove, I can visually tell where it's at just by looking at the flame.
[36:09.780 --> 36:13.600] You think you have more control because you can visually see the flame height once you
[36:13.600 --> 36:19.900] get used to it as opposed to being able to dial in a digital display?
[36:19.900 --> 36:22.600] Of course you have more control when you're using technology.
[36:22.600 --> 36:24.800] Yeah, but this is gas.
[36:24.800 --> 36:29.480] I love you, Jay.
[36:29.480 --> 36:31.080] You don't have more control.
[36:31.080 --> 36:32.080] You just think you do.
[36:32.080 --> 36:33.080] It's also instant heat.
[36:33.080 --> 36:34.600] You like the flame, go vroom, vroom, vroom.
[36:34.600 --> 36:38.520] You don't have to wait for it to heat up, which is another thing that I like about it.
[36:38.520 --> 36:39.520] Induction is instant.
[36:39.520 --> 36:40.520] It boils water in 90 seconds.
[36:40.520 --> 36:41.520] It's true.
[36:41.520 --> 36:42.520] I had one.
[36:42.520 --> 36:43.520] Look into it.
[36:43.520 --> 36:45.800] You never have invited me over to show me.
[36:45.800 --> 36:46.800] You've never done this.
[36:46.800 --> 36:47.800] I know.
[36:47.800 --> 36:48.800] Sorry.
[36:48.800 --> 36:49.800] Sorry.
[36:49.800 --> 36:50.800] See?
[36:50.800 --> 36:51.800] You're talking like we hang out.
[36:51.800 --> 36:53.800] Induction can be very fast.
[36:53.800 --> 36:57.640] The one thing I don't like about induction is that you have to use special pottery.
[36:57.640 --> 36:59.320] Not special, just magnetic.
[36:59.320 --> 37:00.560] So long as everything's magnetic.
[37:00.560 --> 37:01.560] And yes, okay.
[37:01.560 --> 37:04.340] Go through your cookware.
[37:04.340 --> 37:05.560] You can't use all your cookware.
[37:05.560 --> 37:09.120] You got to now have all cookware that's consistent with the induction.
[37:09.120 --> 37:11.240] Kara, this is a good throw down.
[37:11.240 --> 37:13.480] We should have George make this a throw down for the Extravaganza.
[37:13.480 --> 37:14.480] We should have.
[37:14.480 --> 37:15.480] Oh, it'd be great.
[37:15.480 --> 37:20.440] I just can hear the sounds of all of the European listeners nodding their heads vigorously
[37:20.440 --> 37:24.800] because for some reason it's so caught on in Europe and like here in the U.S. induction
[37:24.800 --> 37:25.800] is really rare.
[37:25.800 --> 37:28.520] Anyway, I will not continue to fight for induction.
[37:28.520 --> 37:30.280] I have an induction stove top.
[37:30.280 --> 37:33.320] I have a gas stove, but I have an induction plate.
[37:33.320 --> 37:37.040] Oh yeah, that's not uncommon to have like that, the hybrid kind of thing.
[37:37.040 --> 37:38.040] Oh, so you have a hybrid.
[37:38.040 --> 37:39.040] Let's circle back.
[37:39.040 --> 37:46.320] So for the important topic here tonight, we'll put the gas debate down for a little bit.
[37:46.320 --> 37:47.880] The question is, does it pollute the air in your home?
[37:47.880 --> 37:48.880] And the answer is yes.
[37:48.880 --> 37:50.680] Let's get into the details.
[37:50.680 --> 37:56.960] Gas stoves give off nitrogen dioxide when combustion happens and nitrogen dioxide exposure
[37:56.960 --> 38:01.280] in the home is associated with an increase in asthma symptoms.
[38:01.280 --> 38:06.080] Also it's associated with a higher level of use of something called a rescue inhaler for
[38:06.080 --> 38:07.080] children, right?
[38:07.080 --> 38:10.600] So that's doing things to people's lungs.
[38:10.600 --> 38:11.600] So that's bad.
[38:11.600 --> 38:15.080] And as far as asthmatic adults go, they're affected as well.
[38:15.080 --> 38:21.080] And breathing in nitrogen dioxide increases the occurrences or worsening of chronic obstructive
[38:21.080 --> 38:22.920] pulmonary disease.
[38:22.920 --> 38:25.240] So you know, that's not good.
[38:25.240 --> 38:26.240] Definitely not good.
[38:26.240 --> 38:29.560] Keep in mind that nitrogen dioxide can come from the outdoors as well, right guys?
[38:29.560 --> 38:31.400] It's not just coming from your stove.
[38:31.400 --> 38:35.040] People who live near busy roads, you know, they're exposed to higher levels as well.
[38:35.040 --> 38:39.480] Indoor emissions of nitrogen dioxide are typically greater than outdoor sources though.
[38:39.480 --> 38:41.100] And this is important.
[38:41.100 --> 38:44.520] For example, during times when people cook, right?
[38:44.520 --> 38:48.400] It's dinnertime and you get in there and you start turning the oven on, you're cooking.
[38:48.400 --> 39:00.280] We call this peak exposure, and half of the people that are cooking are exposed to higher levels than what health standards suggest we should be exposed to.
[39:00.280 --> 39:01.280] That's not good.
[39:01.280 --> 39:02.320] That's a lot of people.
[39:02.320 --> 39:07.440] The obvious question is how could one stove expose you to more nitrogen dioxide than all
[39:07.440 --> 39:10.140] of the cars that are driving around where you live?
[39:10.140 --> 39:14.240] And the simple answer is that there is a significant amount of air outside compared to the air
[39:14.240 --> 39:16.360] you have inside your home.
[39:16.360 --> 39:21.800] All of that pollution outside is dramatically diluted in the unbelievable amount of air
[39:21.800 --> 39:22.800] that exists outside.
[39:22.800 --> 39:27.640] But when you're inside your house, you have a very small amount of air.
[39:27.640 --> 39:33.500] And when that air gets polluted in any way, it's significant, you know, and it can stay
[39:33.500 --> 39:34.680] in your house.
[39:34.680 --> 39:39.000] So another factor to consider is that the layout of a home also impacts your exposure.
[39:39.000 --> 39:44.040] So things like a stove exhaust, you know, like if you have a range hood with a vent
[39:44.040 --> 39:48.880] on it, you know, if these vent outside, I'm not talking about the microwave one, which
[39:48.880 --> 39:53.320] just blows air, you know, away from the stove, it has to vent outside.
[39:53.320 --> 39:57.920] Or if you happen to have a well-ventilated home, or even a larger home that has a lot
[39:57.920 --> 40:05.320] more airspace in it, this could help limit your exposure to what's coming out of your stove.
[40:05.320 --> 40:07.340] So a lot has to do with air circulation.
[40:07.340 --> 40:10.720] If your kitchen is in a small, non ventilated area.
[40:10.720 --> 40:15.000] This is very typical in apartments or, you know, if you're living in a city, for example,
[40:15.000 --> 40:19.460] you're in a small apartment, the kitchen could be in a little, you know, galley way, right?
[40:19.460 --> 40:20.460] That's very bad.
[40:20.460 --> 40:25.320] It's very likely you'll get greater exposure than in a kitchen where the
[40:25.320 --> 40:30.360] air can mix with, you know, the living room and dining room, like an open kind of layout.
[40:30.360 --> 40:34.000] I have that kind of layout in my house where there's not a lot of walls.
[40:34.000 --> 40:37.840] It's just like, you know, connected rooms that are open for the most part.
[40:37.840 --> 40:41.000] And that happens to be better because then the air in your kitchen will mix with the
[40:41.000 --> 40:44.700] other air and it dilutes down all of the toxins.
[40:44.700 --> 40:49.860] So simply opening a kitchen window can dramatically decrease the amount of nitrogen dioxide exposure
[40:49.860 --> 40:51.360] that you have in your kitchen.
[40:51.360 --> 40:54.560] So, you know, keep that in mind: just crack open a window or have
[40:54.560 --> 40:59.000] a fan sucking air out while you're cooking; that can make a big difference.
[40:59.000 --> 41:00.000] All right.
[41:00.000 --> 41:02.900] Now, let's get to the part where things get a little troublesome.
[41:02.900 --> 41:08.240] Even when your stove is off, you could be exposed to pollutants that can affect your
[41:08.240 --> 41:09.240] health.
[41:09.240 --> 41:10.240] That sucks.
[41:10.240 --> 41:15.020] So a study conducted this year concluded that stoves that are not currently in use emit
[41:15.020 --> 41:19.080] a colorless, odorless gas, also known as methane.
[41:19.080 --> 41:20.080] Right.
[41:20.080 --> 41:22.160] Methane is a major component of natural gas.
[41:22.160 --> 41:28.240] A study conducted this year estimated that gas stoves in the United States emitted enough
[41:28.240 --> 41:35.800] methane to equal the emissions of about four hundred thousand cars in the same time frame.
[41:35.800 --> 41:36.800] That's bad.
[41:36.800 --> 41:41.080] Now, we're talking about little leaks here, like many methane leaks found in the
[41:41.080 --> 41:44.540] home go undetected because it's just a tiny little bit of methane.
[41:44.540 --> 41:47.000] But there happens to be a lot of people.
[41:47.000 --> 41:48.320] So you add up that methane.
[41:48.320 --> 41:53.680] I mean, I remember, I lived in Manhattan two times in my life, and so many
[41:53.680 --> 41:57.240] times while living in Manhattan, you smell natural gas.
[41:57.240 --> 41:59.440] Walking by a building, you smell natural gas.
[41:59.440 --> 42:03.620] That's because that building was built, you know, a hundred years ago.
[42:03.620 --> 42:08.120] And you know, the pipes that they use to carry the natural gas to the stoves are
[42:08.120 --> 42:10.360] old and they leak and you're just smelling it.
[42:10.360 --> 42:13.840] You go into an apartment building and you're smelling natural gas the whole time you're
[42:13.840 --> 42:14.840] there.
[42:14.840 --> 42:15.840] It's just natural gas.
[42:15.840 --> 42:20.360] And just to be pedantic, the odor is put in the gas; the gas itself is odorless, as
[42:20.360 --> 42:21.360] you said.
[42:21.360 --> 42:22.360] Yeah.
[42:22.360 --> 42:25.680] They actually put a chemical in the natural gas so that it does smell, so that
[42:25.680 --> 42:26.680] you can detect it.
[42:26.680 --> 42:28.880] You know, because otherwise it would be odorless.
[42:28.880 --> 42:29.880] That's right, Steven.
[42:29.880 --> 42:30.880] That's very important.
[42:30.880 --> 42:33.700] And a lot of gas leaks are found that way.
[42:33.700 --> 42:39.480] But sometimes that smell is not strong enough and people can't detect small leaks.
[42:39.480 --> 42:40.480] Right.
[42:40.480 --> 42:43.680] Because if you have a very small leak, you're not going to be able to detect that smell
[42:43.680 --> 42:45.400] even though it's there.
[42:45.400 --> 42:49.960] Another study found that five percent of homes had an active natural gas leak, significant
[42:49.960 --> 42:53.500] enough to require repair, that went undetected.
[42:53.500 --> 42:55.540] You know, that five percent is significant.
[42:55.540 --> 43:01.180] Did you know that benzene is also present in natural gas and it causes cancer?
[43:01.180 --> 43:05.320] So one of the worst case scenarios is to be in a poorly ventilated home that has a natural
[43:05.320 --> 43:10.320] gas leak because you're just breathing this in all the time.
[43:10.320 --> 43:11.320] It's there.
[43:11.320 --> 43:15.580] It's just this leak is constantly putting out this gas 24 hours a day.
[43:15.580 --> 43:17.020] You know, it's not just when you cook.
[43:17.020 --> 43:18.020] It's just happening all the time.
[43:18.020 --> 43:21.240] Now, I'm not saying that you need to get rid of your gas stove.
[43:21.240 --> 43:23.740] It wouldn't be a bad idea, but you don't really need to do it.
[43:23.740 --> 43:28.880] There's some things you could do like, you know, my wife and I just got a new stove about
[43:28.880 --> 43:32.960] a year ago, and I am absolutely not going to get rid of that magical box that I now
[43:32.960 --> 43:35.640] have in my kitchen because I adore this stove.
[43:35.640 --> 43:36.640] But that's just me.
[43:36.640 --> 43:39.160] You know, what should you do if you have a gas stove in your home?
[43:39.160 --> 43:43.800] Well, you know, it's not a bad idea to improve the overall ventilation in your house,
[43:43.800 --> 43:47.360] open windows, you know, particularly when you're cooking.
[43:47.360 --> 43:50.920] Use your kitchen ventilation over your stove if you have it.
[43:50.920 --> 43:51.920] That can help a lot.
[43:51.920 --> 43:55.800] And if you have someone in your home that does have some type of breathing condition,
[43:55.800 --> 43:59.920] then, you know, it very well might be a good idea for you to get rid of your gas stove
[43:59.920 --> 44:02.600] and go with a magnetic induction stove, Kara.
[44:02.600 --> 44:03.600] All right.
[44:03.600 --> 44:05.640] I said I said it.
[44:05.640 --> 44:08.200] Kara, you live in the United States, correct?
[44:08.200 --> 44:09.200] I do.
[44:09.200 --> 44:11.540] I live in Fort Lauderdale now.
[44:11.540 --> 44:12.540] That's right.
[44:12.540 --> 44:13.680] You moved to Florida for your internship.
[44:13.680 --> 44:18.080] Well, if you live in the United States, I'm sure other countries have
[44:18.080 --> 44:19.320] incentives as well.
[44:19.320 --> 44:24.160] But specifically in the United States, if you look into the Inflation Reduction Act
[44:24.160 --> 44:30.200] of 2022 that Biden passed recently, this offers rebates if you purchase certain high efficiency
[44:30.200 --> 44:37.720] electric appliances for your home so you can get a break on the cost of some non gas appliances,
[44:37.720 --> 44:38.720] which is a good idea.
[44:38.720 --> 44:44.280] But let me sort of put the gas stove thing into a little bit of perspective as well.
[44:44.280 --> 44:48.800] Everything you're saying is true, but the relative risk here is actually fairly small
[44:48.800 --> 44:50.840] in terms of like the asthma risk.
[44:50.840 --> 44:54.680] And if you don't have asthma, or somebody in the home with asthma, I haven't seen
[44:54.680 --> 45:00.560] any data saying there's any other health risk. Asthma and COPD, I'll say, you know, chronic
[45:00.560 --> 45:02.160] obstructive pulmonary disease.
[45:02.160 --> 45:05.820] You also have to put it into the context of the fact that there's lots of other pollutants
[45:05.820 --> 45:15.320] in the home, pet dander, candles, your fireplace, insects, basically they leave their bits themselves
[45:15.320 --> 45:21.640] all over the place, flatulence, dust, flatulence, especially in Jay's home.
[45:21.640 --> 45:26.120] So there's lots of sources of pollutants inside the home.
[45:26.120 --> 45:30.320] It's not like if you get rid of your gas stove, your home, the air inside your home is going
[45:30.320 --> 45:32.600] to be magically perfect.
[45:32.600 --> 45:37.380] The best thing to do for all of those things is to have good ventilation.
[45:37.380 --> 45:41.880] Just think like we ventilate our home whenever we can, like whenever the weather is permitting,
[45:41.880 --> 45:46.640] we try to have as much ventilation going through the home as we possibly can, especially in
[45:46.640 --> 45:49.400] the downstairs kitchen area, you know.
[45:49.400 --> 45:50.520] So that's a good idea.
[45:50.520 --> 45:54.440] You could run a fan, you know, which should be pointed at an open window just to get the
[45:54.440 --> 45:55.440] air circulating.
[45:55.440 --> 46:00.500] It's got to go outside the house, right, not just moving air around inside the house.
[46:00.500 --> 46:04.560] You can have a HEPA air filtration system, with a HEPA filter or a HEPA carbon filter.
[46:04.560 --> 46:09.120] So that would be another way to remove particulate matter and certain contaminants from the air
[46:09.120 --> 46:10.880] in your home.
[46:10.880 --> 46:12.640] So those are all, those are all good ideas.
[46:12.640 --> 46:16.160] Just good ventilation, especially around your kitchen, especially when you're cooking is
[46:16.160 --> 46:18.560] probably the best thing to do.
[46:18.560 --> 46:23.600] I think if you have a choice between a gas stove and not a gas stove, whether it's
[46:23.600 --> 46:28.380] electric or induction, there's a lot of reasons to not choose the gas stove because of the
[46:28.380 --> 46:34.560] methane issue, not just the pollutants inside your house, but you know, also contributes
[46:34.560 --> 46:35.560] to global warming.
[46:35.560 --> 46:40.520] You know, so we're trying to minimize the amount of natural gas we're pumping around.
[46:40.520 --> 46:45.180] There's basically a certain minimal amount of leakage that happens in the
[46:45.180 --> 46:47.880] system, you know, and it's significant.
[46:47.880 --> 46:52.600] It's a significant contributor to global warming. Yep, absolutely.
[46:52.600 --> 46:54.780] Okay, let's move on.
Neanderthal Brains (46:55)
[46:54.780 --> 46:58.240] What do you guys know about Neanderthal brains?
[46:58.240 --> 47:00.200] Let me ask you a basic question.
[47:00.200 --> 47:05.920] A bit rubbery. Do you think that Neanderthals were smarter, as smart or not
[47:05.920 --> 47:07.560] as smart as modern humans?
[47:07.560 --> 47:12.200] Well, I think the knee jerk reaction would be not as smart, but I've lately been leaning
[47:12.200 --> 47:14.880] towards pretty much just as smart.
[47:14.880 --> 47:16.760] I mean, we were having sex with them.
[47:16.760 --> 47:17.760] Mm hmm.
[47:17.760 --> 47:18.760] Yeah.
[47:18.760 --> 47:22.400] Something about them that I mean, they couldn't have been significantly different cognitively
[47:22.400 --> 47:23.560] from us.
[47:23.560 --> 47:26.480] When you say we, Cara, what do you mean by that?
[47:26.480 --> 47:31.600] Human beings, Homo sapiens, a long time ago. Like, is there something you're not telling
[47:31.600 --> 47:32.600] us?
[47:32.600 --> 47:38.640] Well, I mean, the more interesting aspect of that would be if there
[47:38.640 --> 47:44.720] were a Neanderthal among us; sex with that person would be so
[47:44.720 --> 47:45.720] famous.
[47:45.720 --> 47:46.720] Exactly.
[47:46.720 --> 47:50.880] Is there a chance one of us has some Neanderthal DNA?
[47:50.880 --> 47:51.880] We all do.
[47:51.880 --> 47:52.880] We all do.
[47:52.880 --> 47:53.880] Yes.
[47:53.880 --> 47:54.880] All right.
[47:54.880 --> 47:55.880] So hey, five Neanderthals.
[47:55.880 --> 47:58.640] Two to four percent, depending on where you come from.
[47:58.640 --> 48:01.040] Europeans have the most, more like four percent.
[48:01.040 --> 48:02.480] All right.
[48:02.480 --> 48:09.760] So Neanderthals as a species officially lived from about 400,000 years ago to 40,000 years
[48:09.760 --> 48:10.760] ago.
[48:10.760 --> 48:13.400] They are extinct.
[48:13.400 --> 48:18.080] There's been a little bit of a controversy about whether or not humans and Neanderthals
[48:18.080 --> 48:25.540] are subspecies, like Homo sapiens neanderthalensis, but I think the current consensus
[48:25.540 --> 48:30.960] is that they're a distinct species from Homo sapiens, which is modern humans.
[48:30.960 --> 48:37.000] And also, this is not a scientific controversy, but just a lot of people might
[48:37.000 --> 48:43.960] be confused about the fact that we did not evolve from Neanderthals. Neanderthals
[48:43.960 --> 48:52.000] and modern humans have a common ancestor, and that common ancestor is from about 700,000 years
[48:52.000 --> 48:54.560] ago.
[48:54.560 --> 49:00.400] Probably something related to Homo heidelbergensis was our common ancestor.
[49:00.400 --> 49:04.760] And now there's also the Denisovans who are very close to Neanderthals, but they're recognized
[49:04.760 --> 49:06.400] as a separate branch.
[49:06.400 --> 49:10.320] That's basically like Asian Neanderthals were the Denisovans.
[49:10.320 --> 49:11.320] Cool.
[49:11.320 --> 49:12.320] Yeah.
[49:12.320 --> 49:18.800] So there's probably a lot more nuance that will come to light as we find more specimens.
[49:18.800 --> 49:20.520] Neanderthals were Ice Age adapted.
[49:20.520 --> 49:21.840] They were more robust.
[49:21.840 --> 49:22.840] Yeah.
[49:22.840 --> 49:23.840] Than modern humans.
[49:23.840 --> 49:27.480] So they had bigger, thicker bones, heavier muscles.
[49:27.480 --> 49:28.480] Right.
[49:28.480 --> 49:34.600] So on average, their brains were a little bigger than homo sapiens brains.
[49:34.600 --> 49:40.160] But if you take the ratio to body size, they're basically the same.
[49:40.160 --> 49:42.120] So their brains were a little bit bigger, but their bodies were also a little bit bigger.
[49:42.120 --> 49:44.880] But how many data points do we have?
[49:44.880 --> 49:45.880] We have quite a few.
[49:45.880 --> 49:47.800] We have quite a few Neanderthal skulls.
[49:47.800 --> 49:52.400] So enough to say that on average, it's a little bit bigger.
[49:52.400 --> 49:53.400] Yeah, we do.
[49:53.400 --> 49:56.880] And if you look at the skull, so if you just look at a human skull and look
[49:56.880 --> 50:00.760] at a Neanderthal, like an adult Neanderthal skull, you could see that just everything is big.
[50:00.760 --> 50:03.440] You could say that looks like their brain should be bigger.
[50:03.440 --> 50:06.720] But in any case, it's proportional to their overall robustness.
[50:06.720 --> 50:10.380] So that doesn't necessarily mean that they're more encephalized than humans are.
[50:10.380 --> 50:16.280] So from that point of view, we could say, yeah, they are very closely related, they're like
[50:16.280 --> 50:18.820] the most closely related species to homo sapiens.
[50:18.820 --> 50:21.600] Their brains were basically the same size as ours.
[50:21.600 --> 50:23.840] They can't be that different, you know, from ours.
[50:23.840 --> 50:25.400] So maybe they were the same.
[50:25.400 --> 50:27.640] So how could we know, right?
[50:27.640 --> 50:28.780] How could we know?
[50:28.780 --> 50:29.880] We don't have Neanderthals.
[50:29.880 --> 50:31.480] We can't give them IQ tests or whatever.
[50:31.480 --> 50:35.200] We can't figure out, evaluate their neurological or cognitive functions.
[50:35.200 --> 50:36.980] We have to infer it from indirect evidence.
[50:36.980 --> 50:42.640] So the two basic kinds of indirect evidence we have are one, biological and two, cultural.
[50:42.640 --> 50:47.240] So the cultural evidence is basically, well, what did they do?
[50:47.240 --> 50:50.860] What did they leave behind in terms of their toolkit?
[50:50.860 --> 50:52.580] Did they have any art?
[50:52.580 --> 50:54.840] How sophisticated was their culture?
[50:54.840 --> 50:57.400] How sophisticated was their language?
[50:57.400 --> 50:59.880] And what can we infer from that?
[50:59.880 --> 51:05.160] The problem with that line of evidence is we don't know how much is biological potential
[51:05.160 --> 51:06.600] and how much is just culture.
[51:06.600 --> 51:08.720] You know what I mean?
[51:08.720 --> 51:14.920] Like would a Neanderthal raised in modern human culture be just as smart as a homo sapiens?
[51:14.920 --> 51:15.920] We don't know.
[51:15.920 --> 51:19.340] That'd be obviously very, very interesting, but we don't know that.
[51:19.340 --> 51:26.360] But we do know that their toolkit was not as sophisticated as homo sapiens.
[51:26.360 --> 51:32.720] Homo sapiens stone tools were more delicate, were harder to make.
[51:32.720 --> 51:36.520] They had longer blades with more cutting edge.
[51:36.520 --> 51:38.700] They had way more variety.
[51:38.700 --> 51:39.700] That's it.
[51:39.700 --> 51:40.700] We win.
[51:40.700 --> 51:41.700] Done.
[51:41.700 --> 51:45.820] Well, it's just, you know, but the Neanderthal tools worked really well.
[51:45.820 --> 51:50.100] And in some ways, for some applications, they were superior to the more delicate tools,
[51:50.100 --> 51:54.920] you know, but still, a lot of paleontologists do infer from that
[51:54.920 --> 52:00.760] that there's evidence of greater tool making skill among Homo sapiens. Then there's cave paintings we can look at.
[52:00.760 --> 52:08.800] So for a while the evidence was homo sapiens, you know, had art cave paintings and there
[52:08.800 --> 52:12.380] was no evidence of any artistic creations among Neanderthals.
[52:12.380 --> 52:14.520] But more recently they have been discovered.
[52:14.520 --> 52:18.680] So the very first cave paintings were actually Neanderthal.
[52:18.680 --> 52:23.840] But definitely there are more types of art, you know, way more production of art among
[52:23.840 --> 52:27.600] homo sapiens than homo neanderthalensis, but it's not an absolute difference now.
[52:27.600 --> 52:30.020] It's just a relative difference.
[52:30.020 --> 52:34.760] There is no evidence that Neanderthals developed ever developed written language.
[52:34.760 --> 52:37.400] And of course, homo sapiens did.
[52:37.400 --> 52:43.600] And we also have to point out that Neanderthal culture was completely unchanged over 400,000
[52:43.600 --> 52:44.600] years.
[52:44.600 --> 52:45.600] Right.
[52:45.600 --> 52:48.840] It was pretty much what it was, you know, they sort of developed their toolkit early
[52:48.840 --> 52:49.840] on and that was it.
[52:49.840 --> 52:51.160] You know, we have DNA, Steve.
[52:51.160 --> 52:52.160] Yes, we do.
[52:52.160 --> 52:56.740] So you're getting to the second line of evidence, which is the biological evidence, you know,
[52:56.740 --> 53:01.880] and so one is we could look at their brain cavities, and not just the sheer size,
[53:01.880 --> 53:08.040] but we could say, well, we can infer something about, you know, the gross
[53:08.040 --> 53:12.800] anatomy of the brain from the shape of the skull.
[53:12.800 --> 53:18.480] And there are some studies which show that Neanderthal brains were bigger in the parts
[53:18.480 --> 53:23.600] of the brain that had to deal with motor and sensory function and not as big in the parts
[53:23.600 --> 53:27.520] of the brain that had to do with like the higher cortical function.
[53:27.520 --> 53:33.720] So kind of their frontal lobes weren't as developed as Homo sapiens', and more
[53:33.720 --> 53:37.400] of their brain tissue is probably dedicated to motor and sensory function, which also
[53:37.400 --> 53:39.880] goes along with the fact that they were physically more robust.
[53:39.880 --> 53:42.840] They needed more brain just to map to that bigger body.
[53:42.840 --> 53:44.640] But did their brains pulsate?
[53:44.640 --> 53:47.480] Yeah, they probably did pulsate.
[53:47.480 --> 53:50.400] I think they pulsated even more than homo sapiens.
[53:50.400 --> 53:51.400] Oh, geez.
[53:51.400 --> 53:52.400] Oh.
[53:52.400 --> 53:54.280] Now, Jay, you bring up the genetic evidence.
[53:54.280 --> 53:58.640] That's the latest evidence because we do have a lot of DNA evidence now from Neanderthals
[53:58.640 --> 54:00.160] and Denisovans.
[54:00.160 --> 54:05.280] And the reason for the actual news item here is a recent study looking at a particular
[54:05.280 --> 54:08.360] protein called the TKTL1 protein.
[54:08.360 --> 54:11.360] TK421, why aren't you at your post?
[54:11.360 --> 54:12.360] TKTL1.
[54:12.360 --> 54:19.280] And modern humans have a different version of this protein than Neanderthals.
[54:19.280 --> 54:26.320] And the version that modern humans have, when they look at the effect that it has
[54:26.320 --> 54:35.240] on brain development, correlates with greater neurogenesis, especially in the frontal
[54:35.240 --> 54:39.300] cortex, which then goes along with the brain anatomy evidence.
[54:39.300 --> 54:47.080] So probably homo sapiens have greater neuronal density than the Neanderthals, especially
[54:47.080 --> 54:52.120] in the frontal lobes, which is where all the action is in terms of the higher cortical
[54:52.120 --> 54:53.120] function.
[54:53.120 --> 54:59.400] So the brain anatomy and the genetic evidence is pointing back in the direction of, OK,
[54:59.400 --> 55:07.360] we probably were smarter than the Neanderthals when it comes to higher cognitive function.
[55:07.360 --> 55:14.360] In much less time, we developed cars and the computer and the Neanderthals over a few hundred
[55:14.360 --> 55:19.600] thousand years didn't get beyond their starting toolkit.
[55:19.600 --> 55:22.800] But again, is that because one genius kicked it off?
[55:22.800 --> 55:29.360] It's hard to say how much of this is cultural contingency rather than biological destiny.
[55:29.360 --> 55:33.760] And I'm sort of uncomfortable with the notion that this is sort of biological destiny.
[55:33.760 --> 55:39.120] But even still, there's multiple lines of evidence now to suggest that homo sapiens
[55:39.120 --> 55:44.520] did probably have greater neuronal density and greater size of their frontal lobes and
[55:44.520 --> 55:51.200] their higher neocortex, like the executive function, the highest level of cortical function,
[55:51.200 --> 55:52.740] the cognitive function.
[55:52.740 --> 55:56.680] So they would have won all the gold medals, but we would have won all the Nobel Prizes.
[55:56.680 --> 55:57.840] We would have won the chess games.
[55:57.840 --> 55:58.840] Yeah.
[55:58.840 --> 56:06.880] I saw a video where they were showing a Neanderthal and like it dislocated a couple of fingers
[56:06.880 --> 56:11.400] on its hand and it literally just popped them back out and put it back like it was nothing.
[56:11.400 --> 56:14.840] I don't know how accurate that is, but I have no idea where they would get that information.
[56:14.840 --> 56:15.840] I agree with you.
[56:15.840 --> 56:20.240] But, you know, I guess they were trying to express how physically tough they were, you
[56:20.240 --> 56:21.240] know?
[56:21.240 --> 56:22.240] But how could they even know that?
[56:22.240 --> 56:27.800] Yeah, but a human could effectively do the same thing, you know, ignore that pain and
[56:27.800 --> 56:29.320] fix themselves like that.
[56:29.320 --> 56:32.360] But unfortunately, we only have one data point, right?
[56:32.360 --> 56:33.360] Yeah.
[56:33.360 --> 56:39.240] We only know how history played itself out one time and we have to infer a lot from that.
[56:39.240 --> 56:44.480] You wonder how, like, is it possible that a genius Neanderthal would have been born
[56:44.480 --> 56:50.160] that could have kicked off agriculture and then once that happens, all of modern society
[56:50.160 --> 57:00.800] eventually unfolds from that, or did Homo sapiens really just tick over the neurological
[57:00.800 --> 57:04.680] threshold necessary to have a technological civilization?
[57:04.680 --> 57:06.640] You know, and is it that close?
[57:06.640 --> 57:11.700] Is it really that, you know, that fine that Neanderthals could never develop technology
[57:11.700 --> 57:17.600] and we did or, you know, it's just interesting to think about that because we really don't
[57:17.600 --> 57:18.600] know.
[57:18.600 --> 57:21.200] Could DNA last 40,000 years?
[57:21.200 --> 57:22.200] Yeah.
[57:22.200 --> 57:23.200] Yeah.
[57:23.200 --> 57:24.200] We have DNA from Neanderthals.
[57:24.200 --> 57:25.200] Absolutely.
[57:25.200 --> 57:26.200] I mean, like, how much?
[57:26.200 --> 57:28.680] I mean, how much of a genome?
[57:28.680 --> 57:30.880] We've pretty much sequenced their genome at this point.
[57:30.880 --> 57:31.880] Well, crap, man.
[57:31.880 --> 57:32.880] That's it.
[57:32.880 --> 57:34.880] One day we'll make one.
[57:34.880 --> 57:35.880] Well, yeah.
[57:35.880 --> 57:38.680] Now, that's ethically, you know, complicated.
[57:38.680 --> 57:39.680] No.
[57:39.680 --> 57:40.680] Not at all.
[57:40.680 --> 57:41.680] Not at all.
[57:41.680 --> 57:46.720] What I suggest, Bob, instead is we allow AI to help us make certain inferences, play out
[57:46.720 --> 57:49.720] various scenarios using what?
[57:49.720 --> 57:50.720] Computer modeling.
[57:50.720 --> 57:51.720] Oh, yeah.
[57:51.720 --> 57:52.720] We could simulate one.
[57:52.720 --> 57:55.840] That would say, you know, give you, okay, we did this 8 million times, ran it through
[57:55.840 --> 58:00.720] our systems, and in all these scenarios, you know, in every scenario, Neanderthal, you know,
[58:00.720 --> 58:02.000] did not emerge.
[58:02.000 --> 58:03.000] Yeah.
[58:03.000 --> 58:05.480] That's hard to simulate, you know, convincingly by, yeah.
[58:05.480 --> 58:08.200] I'm not saying it's going to happen in five to 10 years, but...
[58:08.200 --> 58:09.200] Yeah.
[58:09.200 --> 58:13.080] Computer simulations will help inform that as well at some point, absolutely.
[58:13.080 --> 58:17.760] And we, you know, we may be able to, with enough genetic information, you know, imagine
[58:17.760 --> 58:23.240] making a Neanderthal brain in silicon, right, basically an Android Neanderthal brain, and
[58:23.240 --> 58:31.240] then we can test it, you know, and see what its limits were, but as long as the model
[58:31.240 --> 58:32.760] is accurate enough.
[58:32.760 --> 58:34.920] But I find all this extremely fascinating.
[58:34.920 --> 58:35.920] Yeah, it's awesome.
[58:35.920 --> 58:36.920] Yeah, man.
[58:36.920 --> 58:37.920] First question.
[58:37.920 --> 58:38.920] First question we'll ask it.
[58:38.920 --> 58:39.920] How are you?
[58:39.920 --> 58:40.920] I dislocated my finger.
Synthetic Microbiome (58:43)
[58:40.920 --> 58:47.960] All right, Bob, you're going to give us an update on synthetic microbiology.
[58:47.960 --> 58:53.040] Researchers have created the most sophisticated synthetic microbiome yet with over 100 species
[58:53.040 --> 58:57.000] of bacteria, and they've tested it inside very special mice.
[58:57.000 --> 59:00.800] So why would they do such a thing, and what makes the mice so special?
[59:00.800 --> 59:06.760] If you want to find out, listen, or just go right to Cell, which was published on September
[59:06.760 --> 59:07.760] 6th of this year.
[59:07.760 --> 59:11.920] The author of the study was Michael Fischbach, an associate professor of bioengineering,
[59:11.920 --> 59:18.540] microbiology and immunology at Stanford's Sarafan ChEM-H. Interesting place, Sarafan
[59:18.540 --> 59:24.560] ChEM-H. Okay, so if we learn anything about the hundreds of species of bacteria in our
[59:24.560 --> 59:31.000] digestive systems or our gut microbiome, it's that not only do they help us digest otherwise
[59:31.000 --> 59:36.020] indigestible food, but recent decades have clearly shown that they've got a connection
[59:36.020 --> 59:40.920] to many of the scourges of the day, obesity, depression, anxiety, Parkinson's, the list
[59:40.920 --> 59:42.640] goes on and on.
[59:42.640 --> 59:47.080] There's some major connection going on there, it seems.
[59:47.080 --> 59:52.640] So the potential benefits from fully understanding our gut microbiome seems as great as the complexity
[59:52.640 --> 59:57.000] of these bacterial ecosystems that we've evolved in partnership with.
[59:57.000 --> 01:00:03.040] Up until now, if you wanted to study our microbiome, it involved words that probably dramatically
[01:00:03.040 --> 01:00:06.920] change the expression on your face every time you hear them, fecal transplants.
[01:00:06.920 --> 01:00:07.920] Yeah.
[01:00:07.920 --> 01:00:10.200] We've talked about those.
[01:00:10.200 --> 01:00:16.400] So that technique essentially just drops in the entire gut microbiome of one organism
[01:00:16.400 --> 01:00:21.560] into another, the whole thing. Tweaking it and then learning from such a transplant, though,
[01:00:21.560 --> 01:00:24.640] pretty much doesn't happen.
[01:00:24.640 --> 01:00:30.360] Since there's really just no tools now that would allow researchers to edit any of those
[01:00:30.360 --> 01:00:34.160] fecal bacterial species, it's a real shitty problem.
[01:00:34.160 --> 01:00:38.880] Come on, I just had to get it out of the way.
[01:00:38.880 --> 01:00:44.720] So the author of the study, Michael Fischbach, said, so much of what we know about biology, we
[01:00:44.720 --> 01:00:50.920] would not know if it weren't for the ability to manipulate complex biological systems piecewise.
[01:00:50.920 --> 01:00:54.280] That's exactly what we cannot do with fecal transplants.
[01:00:54.280 --> 01:01:00.840] So their solution was to just build a microbiome from scratch, which sounds hard, but is actually
[01:01:00.840 --> 01:01:04.320] in reality still hard.
[01:01:04.320 --> 01:01:08.680] All the bacteria had to do two critical things or it wasn't going to happen.
[01:01:08.680 --> 01:01:13.560] They had to get along with each other without one or two of them just taking over, right?
[01:01:13.560 --> 01:01:18.640] Imagine you throw a hundred bacteria into one place and it's like one or two of them
[01:01:18.640 --> 01:01:22.960] just like, that's it, we're kings and they just dominate and take it all over.
[01:01:22.960 --> 01:01:25.520] They also had to actually be functional, right?
[01:01:25.520 --> 01:01:30.160] Like a natural microbiome, they actually had to perform some of the functions that our
[01:01:30.160 --> 01:01:33.720] biome does, otherwise what's the point?
[01:01:33.720 --> 01:01:40.560] So now they couldn't use a natural microbiome as a template because there is no real template
[01:01:40.560 --> 01:01:41.940] out there.
[01:01:41.940 --> 01:01:47.960] If you take two random people, they only share about 50% of the bacterial genes.
[01:01:47.960 --> 01:01:56.760] Now of course, the closer you are genetically and I guess location wise, I think that increases
[01:01:56.760 --> 01:01:58.480] the similarity.
[01:01:58.480 --> 01:02:01.240] Like Steve and I probably share 80 or 90%.
[01:02:01.240 --> 01:02:06.200] I would be my guess, I have no idea, but I would guess it'd be more than 50%.
[01:02:06.200 --> 01:02:12.280] So the researchers compromised on using a hundred strains of bacteria that 20% of all
[01:02:12.280 --> 01:02:14.100] people share.
[01:02:14.100 --> 01:02:15.920] So then of course they had to do it right.
[01:02:15.920 --> 01:02:17.820] They didn't just take them and throw them together.
[01:02:17.820 --> 01:02:24.360] They grew them individually and then mix them together and they called it Human Community
[01:02:24.360 --> 01:02:28.680] 1 or HCOM1, which I guess is a decent, it's okay.
[01:02:28.680 --> 01:02:31.360] I've heard worse, but I think we could have come up with something better.
[01:02:31.360 --> 01:02:36.280] Okay, so they then tested this community in that special mice that I was talking about.
[01:02:36.280 --> 01:02:38.960] Now these mice were bred to be literally germ free.
[01:02:38.960 --> 01:02:40.880] I mean, amazing.
[01:02:40.880 --> 01:02:44.040] Imagine that, no gut bacteria at all.
[01:02:44.040 --> 01:02:50.920] So they basically implanted HCOM1 into these special mice and they found that 98% of the
[01:02:50.920 --> 01:02:56.500] HCOM bacterial species colonized and stayed stable and balanced over two months, which
[01:02:56.500 --> 01:02:58.360] is pretty sweet.
[01:02:58.360 --> 01:03:01.920] But they really were just getting started though, nowhere near the end.
[01:03:01.920 --> 01:03:06.520] Next was the stage to make HCOM more robust, right?
[01:03:06.520 --> 01:03:12.840] So to do that, they took advantage of a theory, interesting theory called colonization resistance.
[01:03:12.840 --> 01:03:17.860] So that means if I, for example, say distracted Jay and introduced a new bacterium into his
[01:03:17.860 --> 01:03:24.040] established colony, that bacterium would survive only if it fills an unoccupied niche.
[01:03:24.040 --> 01:03:30.800] So that's the essence of colonization resistance, but not Jay, it's just the idea that if you
[01:03:30.800 --> 01:03:36.640] introduce a bacterium into a colony, it's not going to get a job unless it fills a job
[01:03:36.640 --> 01:03:39.480] that is not currently being taken, okay?
[01:03:39.480 --> 01:03:44.960] For the second part, they introduced to HCOM1 in a mouse, an entire human fecal microbiome,
[01:03:44.960 --> 01:03:45.960] right?
[01:03:45.960 --> 01:03:50.680] So they have the mouse, they have HCOM1 already in there and established, and then they throw
[01:03:50.680 --> 01:03:56.620] at it, bam, here's a fecal microbiome, which is the entire suite, right, A to Z.
[01:03:56.620 --> 01:03:59.240] And a lot of people thought that, what do you think would happen?
[01:03:59.240 --> 01:04:03.840] I mean, just a gut feeling, so to speak, what do you think would happen?
[01:04:03.840 --> 01:04:09.200] It seems, I agree with a lot of the scientists who thought that, hey, this fecal microbiome
[01:04:09.200 --> 01:04:10.880] has been around for a long time.
[01:04:10.880 --> 01:04:13.060] This one specifically was 10 years.
[01:04:13.060 --> 01:04:19.600] It was an established colony of bacteria for 10 years and they put it up against this HCOM1,
[01:04:19.600 --> 01:04:22.160] which has just been inside this mouse for three weeks.
[01:04:22.160 --> 01:04:28.360] So a lot of the scientists thought that the fecal bacteria would just decimate them, but
[01:04:28.360 --> 01:04:29.760] that's not what happened.
[01:04:29.760 --> 01:04:37.900] HCOM1 had girded its loins and survived, it totally survived, but the resulting new community
[01:04:37.900 --> 01:04:42.240] now had 10% of its constituents from the fecal transplant.
[01:04:42.240 --> 01:04:46.480] So then of course, the obvious implication there is that the fecal bacteria filled
[01:04:46.480 --> 01:04:53.820] roles in HCOM1 that were not yet filled by other bacteria, per the colonization resistance
[01:04:53.820 --> 01:04:55.640] theory, okay?
[01:04:55.640 --> 01:04:59.480] So then, of course, they had to start from scratch, right?
[01:04:59.480 --> 01:05:04.920] Now they had learned something about what new bacteria were needed, were important, so they
[01:05:04.920 --> 01:05:08.880] individually grew the now 120 bacterial species.
[01:05:08.880 --> 01:05:15.000] They regrew the community and put them together and then they renamed it, right?
[01:05:15.000 --> 01:05:17.560] Because you got to rename it at this point because it's kind of new.
[01:05:17.560 --> 01:05:24.720] They called it, of course, HCOM2, HCOM2, which was now much more resistant to any more attempts
[01:05:24.720 --> 01:05:27.060] at shitty interference.
[01:05:27.060 --> 01:05:31.200] So they, all right, so then they weren't even done there.
[01:05:31.200 --> 01:05:38.240] Then they tested HCOM2 against E. coli, an E. coli infection, and showed that the synthetic
[01:05:38.240 --> 01:05:42.200] microbiome resisted infection just like a natural one does.
[01:05:42.200 --> 01:05:45.560] So that's great news, but now what, okay, what's the next step?
[01:05:45.560 --> 01:05:50.280] Okay, so in the future, what they want to do is, and of course, this makes perfect sense,
[01:05:50.280 --> 01:05:53.980] they want to more fully take advantage of this new research paradigm because now we
[01:05:53.980 --> 01:05:56.060] can tweak, you know, think about it.
[01:05:56.060 --> 01:06:02.080] They can now add or delete the individual components of an engineered microbiome to learn who does
[01:06:02.080 --> 01:06:03.480] what, right?
[01:06:03.480 --> 01:06:07.940] So first up, what they want to do is they want to identify the critical bacteria that
[01:06:07.940 --> 01:06:13.280] confer the observed infection resistance that they saw, and perhaps they can make it even
[01:06:13.280 --> 01:06:14.280] better.
[01:06:14.280 --> 01:06:18.240] And then after that, they may do the same for those strains that may show,
[01:06:18.240 --> 01:06:21.880] at some point, an immunotherapy response, for example.
[01:06:21.880 --> 01:06:27.620] So you can kind of see, all right, it's doing A, so let's find out which specific bacteria
[01:06:27.620 --> 01:06:33.560] of these 120 bacteria are filling these roles, and they could find out, you know, which bacteria,
[01:06:33.560 --> 01:06:37.280] all the roles that they play and how they interact, pretty amazing.
[01:06:37.280 --> 01:06:42.120] So engineered microbiomes certainly seem to have an amazing potential, in my eyes anyway,
[01:06:42.120 --> 01:06:46.180] for therapeutic interventions to enhance health and treat disease.
[01:06:46.180 --> 01:06:52.440] The deeper future of this technology's potential, it seems like right out of a sci-fi movie.
[01:06:52.440 --> 01:06:53.440] Like what?
[01:06:53.440 --> 01:06:57.520] Imagine, imagine, what could an engineered gut microbiome do if the bacteria themselves
[01:06:57.520 --> 01:06:58.520] were engineered?
[01:06:58.520 --> 01:06:59.520] Right?
[01:06:59.520 --> 01:07:03.160] So you're not, you're not just tweaking the constituents of the microbiome.
[01:07:03.160 --> 01:07:07.880] You're also, imagine at some point, and we're doing this now where we're taking individual,
[01:07:07.880 --> 01:07:13.320] you know, bacteria, bacterium, and tweaking it genetically to be even more efficient or
[01:07:13.320 --> 01:07:19.440] whatever at what it does, or how about altering, fundamentally altering the DNA, changing the
[01:07:19.440 --> 01:07:25.200] base pairs and, you know, using, you know, using things that just aren't found in nature
[01:07:25.200 --> 01:07:26.920] and then sticking that into a microbiome.
[01:07:26.920 --> 01:07:30.680] I mean, I'm really going, you know, I'm going, you know, many decades in the future where,
[01:07:30.680 --> 01:07:33.080] you know, where we could really tweak the crap out of this.
[01:07:33.080 --> 01:07:37.920] It's really fascinating to think what could be possible, but even beyond our own personal
[01:07:37.920 --> 01:07:44.880] gut microbiome, you know, looking non-selfishly outward, we could have engineered biomes that
[01:07:44.880 --> 01:07:49.160] could dramatically impact environmental preservation and restoration.
[01:07:49.160 --> 01:07:54.200] We could tweak oceanic biomes that can mitigate microplastics pollution and on and on.
[01:07:54.200 --> 01:07:59.720] It's really, you know, it's really interesting to think at where this could be, you know,
[01:07:59.720 --> 01:08:05.160] even five, 10 years or 30, 40 years, this could be really, make some dramatic changes,
[01:08:05.160 --> 01:08:09.160] not only to our health and the treatment of disease, but also the environment itself as
[01:08:09.160 --> 01:08:10.160] well.
[01:08:10.160 --> 01:08:11.160] Sounds like we need it today, Bob.
[01:08:11.160 --> 01:08:12.160] Yeah.
[01:08:12.160 --> 01:08:13.160] Tell me about it, dude.
[01:08:13.160 --> 01:08:14.160] Yeah.
[01:08:14.160 --> 01:08:15.160] This is potentially a really big, big step.
[01:08:15.160 --> 01:08:16.160] Oh yeah.
[01:08:16.160 --> 01:08:22.200] Essentially, now that we have like a starter microbiome ecosystem, right, because it's
[01:08:22.200 --> 01:08:26.360] exactly why like the whole probiotic thing, it's like, oh, you're going to take some,
[01:08:26.360 --> 01:08:32.380] you know, like one or two or three different bacteria and add that to your ecosystem.
[01:08:32.380 --> 01:08:33.380] It does nothing.
[01:08:33.380 --> 01:08:38.760] Like I say, if you have a stable, complete ecosystem, adding something new, isn't going
[01:08:38.760 --> 01:08:39.760] to do anything.
[01:08:39.760 --> 01:08:46.120] Um, you know, like Mark Crislip made an analogy, it's like planting corn, rows of
[01:08:46.120 --> 01:08:47.120] corn in the rainforest.
[01:08:47.120 --> 01:08:48.120] It's not going to do anything.
[01:08:48.120 --> 01:08:50.600] It's like, it's not going to affect the ecosystem.
[01:08:50.600 --> 01:08:55.840] Here, yeah, you have a complete ecosystem as a starting point and
[01:08:55.840 --> 01:08:57.980] now we could endlessly tweak it.
[01:08:57.980 --> 01:09:03.900] And as you say, like this could be a platform for like a totally new therapeutic paradigm.
[01:09:03.900 --> 01:09:07.160] People have been researching this for the last 20 years or so, but they haven't really
[01:09:07.160 --> 01:09:12.240] been making any headway because, again, they're trying to add one or two bacteria.
[01:09:12.240 --> 01:09:16.360] But here, if they could say, all right, we're going to replace this ecosystem with an alternate
[01:09:16.360 --> 01:09:24.520] gut ecosystem, one that will reduce inflammation, or will reduce depression, or will cause
[01:09:24.520 --> 01:09:30.280] you to have better weight control or whatever it is, or all kinds of things
[01:09:30.280 --> 01:09:32.120] that theoretically it could affect.
[01:09:32.120 --> 01:09:35.160] Or maybe just get rid of that excessive flatulence that you have or whatever.
[01:09:35.160 --> 01:09:39.400] So yeah, but I do think it's going to be 20, 30 years before we see
[01:09:39.400 --> 01:09:44.600] like the real therapeutic applications emerging.
[01:09:44.600 --> 01:09:52.480] Just saying, we'll have to give you an update on episode 2,374 at that point, right Cara?
[01:09:52.480 --> 01:09:59.200] But first, Cara will be done with her internship
[01:09:59.200 --> 01:10:00.200] by then.
[01:10:00.200 --> 01:10:01.200] That's right.
[01:10:01.200 --> 01:10:02.200] Hopefully.
UFO Videos Classified (1:10:03)
[01:10:02.200 --> 01:10:03.200] All right.
[01:10:03.200 --> 01:10:06.280] Evan, tell us about the latest on these Pentagon UFO videos.
[01:10:06.280 --> 01:10:07.280] Yeah.
[01:10:07.280 --> 01:10:08.280] There's some new news about this.
[01:10:08.280 --> 01:10:10.880] I first read about this at Vice.com.
[01:10:10.880 --> 01:10:16.000] The article was authored by Jason Koebler, and the news story headline is this: Navy says
[01:10:16.000 --> 01:10:18.520] all UFO videos are classified.
[01:10:18.520 --> 01:10:21.240] Releasing them will quote harm national security.
[01:10:21.240 --> 01:10:22.760] Oh my God.
[01:10:22.760 --> 01:10:23.760] The US Navy.
[01:10:23.760 --> 01:10:24.760] Yeah.
[01:10:24.760 --> 01:10:25.760] Yeah.
[01:10:25.760 --> 01:10:26.760] First, first paragraph.
[01:10:26.760 --> 01:10:29.960] US Navy says that releasing any additional UFO videos would harm national security and
[01:10:29.960 --> 01:10:32.480] told the government transparency website.
[01:10:32.480 --> 01:10:33.760] We'll talk about that in a second.
[01:10:33.760 --> 01:10:38.420] That all of the government's UFOs videos are classified information.
[01:10:38.420 --> 01:10:43.060] So vice links to the original source of the news, this government transparency, transparency
[01:10:43.060 --> 01:10:49.260] website, the black vault.com, which is a place we've talked about before, co-founded by
[01:10:49.260 --> 01:10:54.760] the lead singer of Blink 182 and some other celebrity enthusiasts who all happen to be
[01:10:54.760 --> 01:11:00.440] true believers, bill this as the largest privately run online repository of declassified government
[01:11:00.440 --> 01:11:04.840] documents anywhere in the world, 2 million pages of documents to read, blah, blah, blah.
[01:11:04.840 --> 01:11:10.160] So if you remember back in April of 2020, you know, we were in full lockdown with COVID
[01:11:10.160 --> 01:11:14.120] so things are a little fuzzy then, but anyway, that's when the Navy released the
[01:11:14.120 --> 01:11:19.320] three videos of UAPs, right, unidentified aerial phenomenon videos.
[01:11:19.320 --> 01:11:23.160] And they released them officially after they had been leaked years before and there was
[01:11:23.160 --> 01:11:25.800] a lot of controversy and back and forth and talking about it.
[01:11:25.800 --> 01:11:29.660] So they deemed them unclassified and officially released them.
[01:11:29.660 --> 01:11:34.160] And the statement from the Navy that accompanied the videos says this, the department has determined
[01:11:34.160 --> 01:11:39.080] that the authorized release of these unclassified videos does not reveal any sensitive capabilities
[01:11:39.080 --> 01:11:44.000] or systems and does not impinge on any subsequent investigations of military airspace and combat
[01:11:44.000 --> 01:11:46.840] incursions by UAPs.
[01:11:46.840 --> 01:11:50.240] The Department of Defense is releasing the videos in order to clear up any misconceptions
[01:11:50.240 --> 01:11:54.440] by the public on whether or not the footage that has been circulating was real or whether
[01:11:54.440 --> 01:11:56.040] or not there are more videos.
[01:11:56.040 --> 01:12:00.240] The aerial phenomenon observed in the videos remains characterized as unidentified.
[01:12:00.240 --> 01:12:01.240] That's their statement.
[01:12:01.240 --> 01:12:02.240] Okay.
[01:12:02.240 --> 01:12:08.400] The next day, this is April of 2020, the Black Vault sprang into action and
[01:12:08.400 --> 01:12:12.820] hit them with a freedom of information request.
[01:12:12.820 --> 01:12:14.320] They pursued this.
[01:12:14.320 --> 01:12:15.320] Yep.
[01:12:15.320 --> 01:12:22.880] So U.S. Navy, they wanted the release of all of their videos that held a UAP designation.
[01:12:22.880 --> 01:12:25.520] And they got their answer in March of 2022.
[01:12:25.520 --> 01:12:26.520] All right.
[01:12:26.520 --> 01:12:30.720] So you go from April of 2020 to March of 2022.
[01:12:30.720 --> 01:12:35.040] And they basically said that they have to deny the request because they found no additional
[01:12:35.040 --> 01:12:37.600] videos with that official designation.
[01:12:37.600 --> 01:12:43.760] Now in the meantime, the Black Vault had put through another FOIA request.
[01:12:43.760 --> 01:12:51.360] This was in February of 2021, but specifically with the Office of Naval Intelligence or ONI.
[01:12:51.360 --> 01:12:57.480] And they asked them to release any and all video footage with that designation of UAP.
[01:12:57.480 --> 01:12:58.480] Any or ONI?
[01:12:58.480 --> 01:12:59.480] I don't know.
[01:12:59.480 --> 01:13:00.480] ONI?
[01:13:00.480 --> 01:13:01.480] ANI?
[01:13:01.480 --> 01:13:02.480] ONI?
[01:13:02.480 --> 01:13:03.480] It could be either.
[01:13:03.480 --> 01:13:04.480] I was trying to make a joke.
[01:13:04.480 --> 01:13:05.880] July, they got there.
[01:13:05.880 --> 01:13:07.960] So that was a second request.
[01:13:07.960 --> 01:13:13.740] They got their answer to that in July of 2022, so just a couple months ago, in which they
[01:13:13.740 --> 01:13:16.980] said that basically, don't ask us.
[01:13:16.980 --> 01:13:18.160] You asked the wrong department.
[01:13:18.160 --> 01:13:22.880] You have to put this in through the Office of the Chief of Naval Operations.
[01:13:22.880 --> 01:13:24.920] So you basically barked up the wrong tree.
[01:13:24.920 --> 01:13:30.860] So then they put in the third request in July of 2022 with the Office of Chief Naval Operations.
[01:13:30.860 --> 01:13:31.860] And they got their answer.
[01:13:31.860 --> 01:13:34.280] They got their answer just last week.
[01:13:34.280 --> 01:13:35.280] Request denied.
[01:13:35.280 --> 01:13:36.280] Here's what they said.
[01:13:36.280 --> 01:13:41.480] The UAP task force, this is the response to the black vault from the Office of the Chief
[01:13:41.480 --> 01:13:42.840] of Naval Operations.
[01:13:42.840 --> 01:13:47.880] The UAP task force has responded back and stated that the requested videos contain sensitive
[01:13:47.880 --> 01:13:53.740] information pertaining to UAPs and are classified and exempt from disclosure under this particular
[01:13:53.740 --> 01:13:54.740] exemption.
[01:13:54.740 --> 01:14:01.360] They cite a code in the law in accordance with Executive Order 13526 and the UAP Security
[01:14:01.360 --> 01:14:03.880] Classification Guide, blah, blah, blah.
[01:14:03.880 --> 01:14:08.880] The release of this information will harm national security, as it may provide adversaries
[01:14:08.880 --> 01:14:14.360] valuable information regarding DOD and Navy operations, vulnerabilities, or capabilities.
[01:14:14.360 --> 01:14:17.560] No portions of the video can be segregated for release.
[01:14:17.560 --> 01:14:22.580] So yeah, they basically shut them down and said, we're not giving you
[01:14:22.580 --> 01:14:23.580] anything else.
[01:14:23.580 --> 01:14:28.760] But at the same time, they sort of tipped their hand that, yeah, there are videos, but
[01:14:28.760 --> 01:14:31.560] you're not getting them.
[01:14:31.560 --> 01:14:40.920] Which is a bit of a departure from past denials of these kinds of requests by these departments,
[01:14:40.920 --> 01:14:49.240] in which they'll issue what's called a GLOMAR, G-L-O-M-A-R, response, also known as Glomarization
[01:14:49.240 --> 01:14:53.600] or a Glomar denial, which refers to a response to a request for information that will neither
[01:14:53.600 --> 01:14:59.000] confirm nor deny, or N-C-N-D, the existence of whatever information you're seeking.
[01:14:59.000 --> 01:15:01.600] That's been the standard, but this is a departure from that.
[01:15:01.600 --> 01:15:08.720] So that's effectively what this new news is, is that they are not using that standard that
[01:15:08.720 --> 01:15:12.260] they've used in the past, but instead they're saying you're not getting any other videos
[01:15:12.260 --> 01:15:14.800] that have that designation.
[01:15:14.800 --> 01:15:22.880] So that's the whole news story in a nutshell, and I have a few observations, if I may, and
[01:15:22.880 --> 01:15:25.400] then you guys can maybe chime in.
[01:15:25.400 --> 01:15:29.360] So my first observation is kind of about our culture, specifically our American culture,
[01:15:29.360 --> 01:15:34.840] our Western culture, maybe in a broader sense, but we're absorbed in the UFO mythology, and
[01:15:34.840 --> 01:15:36.640] it's reached a saturation point.
[01:15:36.640 --> 01:15:41.880] With the discussion of the evidence of alien technology, alien beings on Earth, the discussion
[01:15:41.880 --> 01:15:48.560] of that evidence has been pretty much subdued, if not totally shoved to the side.
[01:15:48.560 --> 01:15:50.160] And it's because of our institutions.
[01:15:50.160 --> 01:15:53.320] It's our news media, our social media, our political media.
[01:15:53.320 --> 01:15:55.240] They've allowed it to fester.
[01:15:55.240 --> 01:16:00.360] They've not done a good job in asking that very basic fundamental question, and instead
[01:16:00.360 --> 01:16:06.040] years, decades pass, and the culture just gets used to the fact, all right, yeah, UFOs
[01:16:06.040 --> 01:16:10.640] are probably real, just matter of figuring out what they are and who's covering what
[01:16:10.640 --> 01:16:11.640] up.
[01:16:11.640 --> 01:16:17.460] So I think it's a reminder that this is cultural influence, and in this case, I think cultural
[01:16:17.460 --> 01:16:21.840] damage that the belief in UFOs and alien visitations and alien abductions and all these things,
[01:16:21.840 --> 01:16:26.280] they have real effects on the qualities of our institutions, and unfortunately, our institutions
[01:16:26.280 --> 01:16:31.360] fail us at protecting our culture against these kinds of crazy ideas.
[01:16:31.360 --> 01:16:35.400] It's a waste of time, waste of energy, waste of person power, waste of taxpayer money in
[01:16:35.400 --> 01:16:36.400] a lot of cases.
[01:16:36.400 --> 01:16:43.280] And it's not just the wastefulness of it all, it contributes to the dumbening of the
[01:16:43.280 --> 01:16:44.280] society.
[01:16:44.280 --> 01:16:48.240] It makes people less skeptical, less informed, more detached from reality, and these beliefs,
[01:16:48.240 --> 01:16:51.100] and it translates to other equally fantastical beliefs.
[01:16:51.100 --> 01:16:52.440] So that's one thing I have to say.
[01:16:52.440 --> 01:16:57.060] The next thing, we have to remember there are throngs of people who really want these
[01:16:57.060 --> 01:17:01.480] things to be true for a myriad of reasons that we've talked about in the past, but
[01:17:01.480 --> 01:17:03.880] the media amplifies those values.
[01:17:03.880 --> 01:17:07.360] And then you have a government that's really incapable of handling this situation with
[01:17:07.360 --> 01:17:11.040] any good techniques or effectiveness.
[01:17:11.040 --> 01:17:12.040] Congress is feeble.
[01:17:12.040 --> 01:17:17.680] I mean, really, it's ill-equipped to deal with this, I think at best, and at worst,
[01:17:17.680 --> 01:17:21.600] it's politically calculating, trying to make the most political currency out of it.
[01:17:21.600 --> 01:17:24.220] They lack the ability to communicate this correctly with the public.
[01:17:24.220 --> 01:17:28.480] We had the chance earlier this year, those congressional hearings about UAPs, that should
[01:17:28.480 --> 01:17:33.440] have been the chance to explain to the world, and really this world stage, the values of
[01:17:33.440 --> 01:17:37.680] skepticism and rational analysis, but instead it just added fuel to the fire.
[01:17:37.680 --> 01:17:42.560] Now, as far as the Department of Defense goes specifically, they are put in an impossible
[01:17:42.560 --> 01:17:47.880] situation where they have entirely justified and valid reasons to keep a lot of secrets
[01:17:47.880 --> 01:17:50.260] from getting into the hands of enemies of the country.
[01:17:50.260 --> 01:17:51.860] That is their job.
[01:17:51.860 --> 01:17:58.800] But that's also why they are a prime target for organizations like Black Vault.
[01:17:58.800 --> 01:18:00.400] And it goes back to a very old playbook.
[01:18:00.400 --> 01:18:05.480] You keep asking questions of the people who are incapable or will never give up the answers
[01:18:05.480 --> 01:18:12.060] that you're seeking, or they're going to lie to you outright, but it feeds the beast.
[01:18:12.060 --> 01:18:16.480] You keep doing that, and then you can claim the government is involved in massive cover-ups
[01:18:16.480 --> 01:18:18.400] involving extraterrestrials.
[01:18:18.400 --> 01:18:23.580] This takes us right back to the late 1940s when this whole thing basically started.
[01:18:23.580 --> 01:18:27.000] It's the same old patterns, and this really hasn't changed.
[01:18:27.000 --> 01:18:30.720] In fact, it's gotten more sophisticated and kind of worse as time goes on.
[01:18:30.720 --> 01:18:33.360] And here's my wrap up.
[01:18:33.360 --> 01:18:39.680] The Department of Defense or the Navy office that released this answer to Black Vault,
[01:18:39.680 --> 01:18:42.740] I think they made things worse.
[01:18:42.740 --> 01:18:47.680] The runaround and the ultimate response they gave, that they have these videos but you're
[01:18:47.680 --> 01:18:50.520] not getting them, is exactly what the believers want to hear.
[01:18:50.520 --> 01:18:53.640] It helps cement their distrust of the institutions.
[01:18:53.640 --> 01:18:58.440] I mean, the Navy, have they ever consulted with organizations that have experience in
[01:18:58.440 --> 01:19:03.040] dealing with scientific explanations for UAPs and UFOs?
[01:19:03.040 --> 01:19:07.360] Are you at all involved with psychologists who specialize in understanding how the human
[01:19:07.360 --> 01:19:08.360] brain works?
[01:19:08.360 --> 01:19:09.720] Neurologists?
[01:19:09.720 --> 01:19:11.200] Why things operate the way they do?
[01:19:11.200 --> 01:19:13.000] Why people believe what they do?
[01:19:13.000 --> 01:19:15.040] The psychology of mass delusions?
[01:19:15.040 --> 01:19:20.680] Does any of this go into their equations on how they structure their answers to these
[01:19:20.680 --> 01:19:21.680] kinds of questions?
[01:19:21.680 --> 01:19:24.220] But I don't think they do.
[01:19:24.220 --> 01:19:28.760] They fail to give reasonable scientific analysis to what the people are seeing in the videos
[01:19:28.760 --> 01:19:29.760] that they release.
[01:19:29.760 --> 01:19:36.300] And it dates back earlier this year when we had the chance to really set the record straight,
[01:19:36.300 --> 01:19:37.840] and they totally, totally blew it.
[01:19:37.840 --> 01:19:40.120] They fumble the ball time and time again.
[01:19:40.120 --> 01:19:41.120] Those are my thoughts on this.
[01:19:41.120 --> 01:19:45.080] Yeah, I think you hit all the points I was going to make, Evan.
[01:19:45.080 --> 01:19:47.940] So I think you hit the nail on the head.
[01:19:47.940 --> 01:19:52.200] They don't have the skill set necessary to deal with the situation.
[01:19:52.200 --> 01:19:57.400] It's kind of a no-win situation, as you say, they get trolled because if they release it,
[01:19:57.400 --> 01:19:58.400] they'll make hay out of it.
[01:19:58.400 --> 01:19:59.880] The true believers will make hay out of it.
[01:19:59.880 --> 01:20:02.520] They'll anomaly hunt the hell out of it.
[01:20:02.520 --> 01:20:04.780] And if they don't release it, then it's a cover-up, right?
[01:20:04.780 --> 01:20:09.840] There's no legitimate move that the government can make.
[01:20:09.840 --> 01:20:16.920] But at the same time, they completely fail to appreciate the situation, to know that
[01:20:16.920 --> 01:20:21.600] they lack the expertise to deal with it, and to respond appropriately.
[01:20:21.600 --> 01:20:26.120] It's kind of like the way universities and professional institutions respond to snake
[01:20:26.120 --> 01:20:27.120] oil.
[01:20:27.120 --> 01:20:28.120] It's the same thing.
[01:20:28.120 --> 01:20:33.520] They don't have the skill set, so they don't know what they're doing, so they flub it.
[01:20:33.520 --> 01:20:36.360] They just utterly fail to do it.
[01:20:36.360 --> 01:20:43.320] And it is a no-win scenario, but the best you can do is to have as transparent a process
[01:20:43.320 --> 01:20:52.900] as possible where you, again, have the people who do have the expertise patiently, continuously
[01:20:52.900 --> 01:20:56.200] explain what the science actually shows.
[01:20:56.200 --> 01:21:00.680] The thing is, there's always going to be some information that they're not going to be able
[01:21:00.680 --> 01:21:01.680] to release.
[01:21:01.680 --> 01:21:05.320] And it's not necessarily even about the information itself.
[01:21:05.320 --> 01:21:11.320] It could just be about how they obtained the information, or the technology they are using.
[01:21:11.320 --> 01:21:14.560] It's not even necessarily about the information itself.
[01:21:14.560 --> 01:21:18.120] It's not like those videos are showing something that they don't want the world to see.
[01:21:18.120 --> 01:21:23.000] It's that we don't want the Chinese to know that we have equipment that could do that,
[01:21:23.000 --> 01:21:27.560] that we even have the capability of producing videos like that, or whatever it is, you know?
[01:21:27.560 --> 01:21:32.240] Or seeing what the inside of a cockpit of one of these fighter jets looked like.
[01:21:32.240 --> 01:21:36.440] There may be sensitive, secret information in there that they would otherwise glean.
[01:21:36.440 --> 01:21:40.280] They don't even care what the heck they think their sensors are depicting as
[01:21:40.280 --> 01:21:43.080] far as what this unidentified object is.
[01:21:43.080 --> 01:21:45.440] No, it's, oh, look at this dial.
[01:21:45.440 --> 01:21:48.560] This fighter jet is capable of doing this.
[01:21:48.560 --> 01:21:49.560] We didn't know that.
[01:21:49.560 --> 01:21:50.560] That's a sensitive piece of information.
[01:21:50.560 --> 01:21:51.560] That kind of stuff.
[01:21:51.560 --> 01:21:52.560] Yeah.
[01:21:52.560 --> 01:21:59.160] So, yeah, I agree with you, though, they need to partner with the people who know what they're
[01:21:59.160 --> 01:22:02.720] actually talking about when it comes to things like UFOs, you know?
[01:22:02.720 --> 01:22:07.320] And again, the frustrating thing is that that knowledge base, those skill sets are out there,
[01:22:07.320 --> 01:22:08.320] and they're clamoring.
[01:22:08.320 --> 01:22:09.800] They're like, hey, we're here.
[01:22:09.800 --> 01:22:11.520] But it's just they don't know what they're doing.
[01:22:11.520 --> 01:22:12.520] Nope.
[01:22:12.520 --> 01:22:13.520] Nope.
[01:22:13.520 --> 01:22:14.520] They fumble the ball every time.
[01:22:14.520 --> 01:22:15.520] And it made it worse.
[01:22:15.520 --> 01:22:16.520] Yep.
[01:22:16.520 --> 01:22:17.520] All right.
[01:22:17.520 --> 01:22:18.520] Thanks, Evan.
[01:22:18.520 --> 01:22:19.520] Well, everyone, we're going to take a quick break from our show to talk about one of our
[01:22:19.520 --> 01:22:21.520] sponsors this week, Bombas.
[01:22:21.520 --> 01:22:25.720] So, guys, you know, we talk about Bombas all the time.
[01:22:25.720 --> 01:22:27.840] Most of us on this show wear Bombas socks.
[01:22:27.840 --> 01:22:28.840] Bob, you wear Bombas?
[01:22:28.840 --> 01:22:29.840] Oh, yeah, man.
[01:22:29.840 --> 01:22:30.840] Okay.
[01:22:30.840 --> 01:22:31.840] All of us at this point wear Bombas socks.
[01:22:31.840 --> 01:22:32.840] Yeah, they're awesome.
[01:22:32.840 --> 01:22:33.840] What was it?
[01:22:33.840 --> 01:22:35.560] A couple months ago, I said I reloaded.
[01:22:35.560 --> 01:22:40.340] And one thing I noticed was that when I compare my new socks to my old socks, I can't tell
[01:22:40.340 --> 01:22:45.000] the difference, which is a good sign, which means that the older socks are holding up
[01:22:45.000 --> 01:22:47.480] really well because, you know, you wash them, you fold them.
[01:22:47.480 --> 01:22:48.480] Which ones are new?
[01:22:48.480 --> 01:22:49.480] I don't know anymore.
[01:22:49.480 --> 01:22:52.520] Do you know that Bombas doesn't just make socks, but also shirts and underwear?
[01:22:52.520 --> 01:22:54.160] I knew that, Cara.
[01:22:54.160 --> 01:22:59.360] And did you know that everything they make is soft, seamless, tagless, and super cozy?
[01:22:59.360 --> 01:23:01.360] I knew that, Cara.
[01:23:01.360 --> 01:23:02.760] It's true.
[01:23:02.760 --> 01:23:05.600] Bombas t-shirts are made with really thoughtful design features.
[01:23:05.600 --> 01:23:07.680] Their underwear is super breathable.
[01:23:07.680 --> 01:23:12.680] And my favorite part, of course, is that Bombas donates one for one.
[01:23:12.680 --> 01:23:18.520] For every item you buy, they donate an item, you know, socks, underwear, t-shirts to homeless
[01:23:18.520 --> 01:23:19.520] shelters.
[01:23:19.520 --> 01:23:25.720] So go to bombas.com slash skeptics and use code skeptics for 20% off your first purchase.
[01:23:25.720 --> 01:23:31.640] That's B-O-M-B-A-S dot com slash skeptics and use skeptics at checkout.
[01:23:31.640 --> 01:23:32.720] All right, guys.
[01:23:32.720 --> 01:23:33.720] Let's get back to the show.
Who's That Noisy? (1:23:35)
[01:23:33.720 --> 01:23:36.480] All right, Jay, it's Who's That Noisy Time?
[01:23:36.480 --> 01:23:37.480] All right, guys.
[01:23:37.480 --> 01:23:39.040] Last week, I played this noisy.
[01:23:39.040 --> 01:23:57.320] I don't know what's making the sound, Jay, but you know what it reminds me of?
[01:23:57.320 --> 01:23:58.320] What?
[01:23:58.320 --> 01:24:01.720] Our opening music to our skeptical extravaganza.
[01:24:01.720 --> 01:24:06.600] It has the same sort of initial buildup, those first few notes, and then it gets right into
[01:24:06.600 --> 01:24:08.540] the main thrust of the music.
[01:24:08.540 --> 01:24:11.200] That's what it reminded me of a little bit.
[01:24:11.200 --> 01:24:12.200] Anybody else?
[01:24:12.200 --> 01:24:18.920] Well, I know it's a mouth harp, but I think you want more specific information than that.
[01:24:18.920 --> 01:24:19.920] A human mouth harp.
[01:24:19.920 --> 01:24:20.920] You don't have it?
[01:24:20.920 --> 01:24:21.920] Okay.
[01:24:21.920 --> 01:24:22.920] Which I don't have, yeah.
[01:24:22.920 --> 01:24:23.920] All right.
[01:24:23.920 --> 01:24:28.680] Well, I'll start off by saying, like, so many people got this one, you know, 99% correct.
[01:24:28.680 --> 01:24:29.680] Cool.
[01:24:29.680 --> 01:24:31.360] Because it does, if you know it, you know it.
[01:24:31.360 --> 01:24:32.360] You know what I mean?
[01:24:32.360 --> 01:24:33.840] So let me read some of these to you.
[01:24:33.840 --> 01:24:38.240] So a listener named Alex Vickery wrote in, hey, guys, short term listener, first time
[01:24:38.240 --> 01:24:42.980] guesser is this week's sound, someone playing a mouth harp connected to a Peter Frampton
[01:24:42.980 --> 01:24:44.080] talk box.
[01:24:44.080 --> 01:24:51.000] I thought that was a really cool interpretation of this because I kind of hear that now when
[01:24:51.000 --> 01:24:57.560] I listen to it, it has a little bit of that, you know, like vocalization aspect to it.
[01:24:57.560 --> 01:24:58.560] That is not correct.
[01:24:58.560 --> 01:24:59.560] I have a question for Kara.
[01:24:59.560 --> 01:25:00.560] Sorry to interrupt.
[01:25:00.560 --> 01:25:01.560] Kara?
[01:25:01.560 --> 01:25:02.560] Hmm?
[01:25:02.560 --> 01:25:03.560] Have you heard of Peter Frampton?
[01:25:03.560 --> 01:25:04.560] Yeah, of course.
[01:25:04.560 --> 01:25:05.560] Oh, okay.
[01:25:05.560 --> 01:25:06.560] Just making sure.
[01:25:06.560 --> 01:25:07.560] Sorry.
[01:25:07.560 --> 01:25:08.560] Go ahead.
[01:25:08.560 --> 01:25:09.560] Cool guess.
[01:25:09.560 --> 01:25:11.960] It's only the sci-fi fantasy stuff, guys.
[01:25:11.960 --> 01:25:12.960] All right.
[01:25:12.960 --> 01:25:15.960] That's the stuff I don't know about.
[01:25:15.960 --> 01:25:16.960] Okay.
[01:25:16.960 --> 01:25:20.680] Another listener, Edward Myers, wrote in, hi, Jay, I think this week's noisy is a Jew's
[01:25:20.680 --> 01:25:30.560] harp, also known as a jaw harp, a vargan, mouth harp, guga, guimbarde, khomus, Ozark harp,
[01:25:30.560 --> 01:25:33.880] Bermuda, words I can't pronounce.
[01:25:33.880 --> 01:25:34.880] Lots of names.
[01:25:34.880 --> 01:25:35.880] Is that one guess or eight guesses?
[01:25:35.880 --> 01:25:43.840] Well, no, he's saying, you know, there's a lot of names for this type of instrument.
[01:25:43.840 --> 01:25:47.600] And that's very true because lots of different cultures have their own version of this instrument.
[01:25:47.600 --> 01:25:52.640] I mean, I'm curious to know, you know, how many times this was discovered, you know what
[01:25:52.640 --> 01:25:53.640] I mean?
[01:25:53.640 --> 01:25:54.920] Like around the world.
[01:25:54.920 --> 01:25:55.920] But very cool.
[01:25:55.920 --> 01:26:00.680] I mean, I thought that kind of this answer kind of shows you like the incredible variety.
[01:26:00.680 --> 01:26:05.200] So I'm trying to ask which one is this, you know, out of all these different varieties,
[01:26:05.200 --> 01:26:07.260] which one is it specifically?
[01:26:07.260 --> 01:26:11.960] So I have a couple of people who, I got a lot of people who guessed some type of mouth
[01:26:11.960 --> 01:26:13.300] harp correctly.
[01:26:13.300 --> 01:26:18.580] And I actually got quite a few people who got the exact right answer correct.
[01:26:18.580 --> 01:26:19.580] But I have a couple of people.
[01:26:19.580 --> 01:26:25.080] So first off, Anton Evans wrote in first and he said, hi, my name is Anton Evans.
[01:26:25.080 --> 01:26:29.600] I started listening as an undergrad, continued to listen through my master's during my time
[01:26:29.600 --> 01:26:34.160] in the pharmaceutical industry, and now sitting here on a Saturday in lab, finishing up the
[01:26:34.160 --> 01:26:37.560] last experiments of my Ph.D. in microbiology.
[01:26:37.560 --> 01:26:38.560] Very cool.
[01:26:38.560 --> 01:26:39.560] That's so awesome.
[01:26:39.560 --> 01:26:44.520] This noisy comes from a TikTok, which I found on the Reddit post linked below, showing
[01:26:44.520 --> 01:26:50.880] a guy who has a triangular object in his mouth; he then proceeds to create sound while smacking
[01:26:50.880 --> 01:26:55.520] his mouth with his hand in the shape of the peace sign, eyes wildly rolling in his
[01:26:55.520 --> 01:26:56.520] head.
[01:26:56.520 --> 01:26:58.420] So this is exactly correct.
[01:26:58.420 --> 01:27:04.900] So this particular one is an Indian morchang, right?
[01:27:04.900 --> 01:27:08.500] It is a type of jaw harp, you know, it's a mouth instrument.
[01:27:08.500 --> 01:27:12.560] The instrument consists of a metal ring in the shape of a horseshoe with two parallel
[01:27:12.560 --> 01:27:17.300] forks which form the frame and a metal tongue in the middle between the forks fixed to the
[01:27:17.300 --> 01:27:20.340] ring at one end and free to vibrate at the other.
[01:27:20.340 --> 01:27:24.000] So you basically use your mouth to make resonances.
[01:27:24.000 --> 01:27:25.640] Oh, yeah.
[01:27:25.640 --> 01:27:31.240] And that's where you get like the idea that there is like a voice behind it almost.
[01:27:31.240 --> 01:27:35.920] And if you've never seen someone play a mouth harp, you should look for it.
[01:27:35.920 --> 01:27:36.920] It's pretty cool.
[01:27:36.920 --> 01:27:39.520] Pretty cool what you can do with the instrument.
[01:27:39.520 --> 01:27:40.520] How hard is it to play?
[01:27:40.520 --> 01:27:41.520] It's probably pretty hard.
[01:27:41.520 --> 01:27:43.120] I don't think it's that hard.
[01:27:43.120 --> 01:27:44.120] Really?
[01:27:44.120 --> 01:27:45.120] I really don't.
[01:27:45.120 --> 01:27:47.280] I mean, I used to have one when I was a kid and I used to get noise out of it.
[01:27:47.280 --> 01:27:49.120] It's you know, I mean, you get really good at it.
[01:27:49.120 --> 01:27:50.120] I don't know.
[01:27:50.120 --> 01:27:51.120] I honestly don't know.
[01:27:51.120 --> 01:27:52.120] Like, how good can you get at one of these?
[01:27:52.120 --> 01:27:54.960] I played a mean kazoo once, you know.
[01:27:54.960 --> 01:27:56.680] Yeah, I've played that before.
[01:27:56.680 --> 01:27:57.680] It's not hard.
[01:27:57.680 --> 01:28:01.380] Yeah, like a lot of like a lot of the instruments, it's not hard to get a sound.
[01:28:01.380 --> 01:28:03.400] It's hard to make it sound really good.
[01:28:03.400 --> 01:28:08.000] Yeah, the difference between an amateur and somebody who's just
[01:28:08.000 --> 01:28:10.000] very, very good at it,
[01:28:10.000 --> 01:28:12.200] it can be pretty big.
[01:28:12.200 --> 01:28:17.760] But it's not a challenging instrument just to make a basic musical noise.
[01:28:17.760 --> 01:28:21.160] I mean, who else can play the pan flute like Zamfir?
[01:28:21.160 --> 01:28:22.160] No one.
[01:28:22.160 --> 01:28:23.160] Yeah.
[01:28:23.160 --> 01:28:24.160] I mean, he's the best.
[01:28:24.160 --> 01:28:25.160] Yeah.
[01:28:25.160 --> 01:28:29.080] So another listener, Nikhil Somaya, also wrote in and guessed correctly.
[01:28:29.080 --> 01:28:31.640] So thank you both for writing in your correct answers.
[01:28:31.640 --> 01:28:36.440] And thank everyone this week for, you know, all the people who got it right or got really
[01:28:36.440 --> 01:28:37.440] close.
[01:28:37.440 --> 01:28:40.480] I knew that a lot of people were going to get it, but I didn't realize that there were
[01:28:40.480 --> 01:28:43.440] hundreds of people who guessed it correctly this week.
[01:28:43.440 --> 01:28:46.600] Yeah, it was pretty intense.
[01:28:46.600 --> 01:28:50.200] But that's the work that I do here, Bob.
[01:28:50.200 --> 01:28:52.360] I have a new Noisy for you guys this week.
[01:28:52.360 --> 01:28:56.600] This Noisy was sent in by a listener named Claire Distin.
[01:28:56.600 --> 01:28:58.000] And check this one out.
[01:28:58.000 --> 01:29:02.320] I will tell you that this is one of my favorite Noisies.
[01:29:02.320 --> 01:29:08.760] Every once in a while, I'll repeat a really cool Noisy that I played a decade ago because
[01:29:08.760 --> 01:29:12.440] it's just worth it, you know, I want people who haven't heard it yet to hear it because it's
[01:29:12.440 --> 01:29:13.440] pretty cool.
[01:29:13.440 --> 01:29:38.240] It's one of my favorites.
[01:29:38.240 --> 01:29:41.760] I'm sure some of you are going to get this, but I will give a little bit of a hint.
[01:29:41.760 --> 01:29:43.800] Let's see what you come up with.
[01:29:43.800 --> 01:29:45.320] That's my hint.
[01:29:45.320 --> 01:29:51.000] If you have any answers or you heard any cool Noisies this week, you could email us at WTN
[01:29:51.000 --> 01:29:55.400] at theskepticsguide.org. Steve, I got some announcements real quick.
New Noisy (1:28:51)
[whooshing and deep woodwind-like tones and vibrations]
... some of you are going to get this,
Announcements (1:29:54)
[01:29:55.400 --> 01:29:59.000] So if you didn't know, Bob, Steve and I wrote a book.
[01:29:59.000 --> 01:30:00.700] It's called The Skeptics' Guide to the Future.
[01:30:00.700 --> 01:30:03.160] It's coming out September 27th.
[01:30:03.160 --> 01:30:07.280] On September 24th, we will be having a live stream.
[01:30:07.280 --> 01:30:13.600] It'll be an extended six hour live stream where we will be, among other things, we will
[01:30:13.600 --> 01:30:18.720] be discussing chapters in the book and talking about the backstory behind the book.
[01:30:18.720 --> 01:30:24.040] We will also be recording two live podcasts that you could see us, you know, watch and
[01:30:24.040 --> 01:30:26.320] listen to us record a podcast live.
[01:30:26.320 --> 01:30:32.000] We'll have a couple of special guests that'll help us discuss elements of the book.
[01:30:32.000 --> 01:30:35.400] And Evan and Kara, of course, will definitely be there for that.
[01:30:35.400 --> 01:30:39.000] So that's happening on September 24th.
[01:30:39.000 --> 01:30:44.760] That'll be from 12 Eastern noon time to six p.m. and you should be able to, you can go
[01:30:44.760 --> 01:30:47.360] to our website or go to our Facebook page.
[01:30:47.360 --> 01:30:50.240] Any place where we're at, you could see, you'll get the link for that.
[01:30:50.240 --> 01:30:53.500] It'll be on YouTube and Facebook.
[01:30:53.500 --> 01:30:55.960] We also have shows coming up in December, guys.
[01:30:55.960 --> 01:30:57.400] They're not that far away.
[01:30:57.400 --> 01:31:01.840] And I will say this, airline tickets have been purchased.
[01:31:01.840 --> 01:31:03.760] Vehicles have been rented.
[01:31:03.760 --> 01:31:05.320] Knives have been sharpened.
[01:31:05.320 --> 01:31:10.480] This show, this show is going to... So we have four shows, and Steve and I keep discussing
[01:31:10.480 --> 01:31:11.480] this.
[01:31:11.480 --> 01:31:15.720] We want to make sure because some people write in moderately confused about what is the difference
[01:31:15.720 --> 01:31:19.060] between the private show and the extravaganza.
[01:31:19.060 --> 01:31:24.860] The extravaganza is basically a stage show and it's a lot of fun, and it's hosted by George
[01:31:24.860 --> 01:31:26.680] Hrab, who does a great job.
[01:31:26.680 --> 01:31:29.160] So there's music, there's games.
[01:31:29.160 --> 01:31:30.600] We pit ourselves against the audience.
[01:31:30.600 --> 01:31:36.280] We're basically demonstrating a lot of skeptical principles, but it's all mainly fun.
[01:31:36.280 --> 01:31:43.480] Then we have our SGU, our private shows where it's a live recording of the SGU, but we're
[01:31:43.480 --> 01:31:47.600] doing a new thing where we're doing an enhanced, expanded private show.
[01:31:47.600 --> 01:31:52.200] It's going to be at least three and a half hours where we're going to, in the middle
[01:31:52.200 --> 01:31:58.520] there, we will be recording a live podcast, but in addition, we'll be doing things just
[01:31:58.520 --> 01:32:03.160] with the audience that are not designed to go into the podcast.
[01:32:03.160 --> 01:32:08.240] So there'll be, obviously we'll have a lot of time to do pictures and book signings and
[01:32:08.240 --> 01:32:14.200] talking with everybody, but we're also going to be doing specific events with the people
[01:32:14.200 --> 01:32:16.320] that are there at the show.
[01:32:16.320 --> 01:32:21.080] So there'll be specific either games or whatever bits that we'll be doing with you.
[01:32:21.080 --> 01:32:26.200] So those are always a lot of fun and we're just basically expanding on them.
[01:32:26.200 --> 01:32:32.220] We're going to do a private show on Thursday, December 15th in Phoenix.
[01:32:32.220 --> 01:32:36.880] Then we go to Tucson and we do our first extravaganza in Tucson.
[01:32:36.880 --> 01:32:39.480] Then we have the private show on Saturday.
[01:32:39.480 --> 01:32:41.720] This is Saturday the 17th.
[01:32:41.720 --> 01:32:46.520] We have a private show, I think that'll be at noon, starting at noon in Tucson.
[01:32:46.520 --> 01:32:52.320] Then we drive back to Phoenix and do an extravaganza in Phoenix on Friday night the 17th of December.
[01:32:52.320 --> 01:32:55.940] So we have four shows, two extravaganzas and two private shows.
[01:32:55.940 --> 01:32:58.880] We would love it if you guys joined us to one or more of those shows.
[01:32:58.880 --> 01:33:03.600] Go to our website, theskepticsguide.org forward slash events for all the details.
[01:33:03.600 --> 01:33:04.600] All right.
[01:33:04.600 --> 01:33:05.600] Thanks, Jay.
Questions/Emails/Corrections/Follow-ups (1:33:06)
Email #1: Climate Change Nihilism
<!-- consider using block quotes for emails read aloud in this segment, with reduced spacing for long chunks -->
[01:33:05.600 --> 01:33:07.040] We do have one quick email.
[01:33:07.040 --> 01:33:13.440] This one comes from Siris, S-I-R-I-S, and they write, hi, long time fan.
[01:33:13.440 --> 01:33:17.120] I've been listening since 2008 as an early podcast adopter.
[01:33:17.120 --> 01:33:19.600] The SGU was my first and longest-running podcast.
[01:33:19.600 --> 01:33:25.840] I own a hard copy and audio book copy of your first book and have pre-ordered the second.
[01:33:25.840 --> 01:33:28.080] I believe in science and climate change.
[01:33:28.080 --> 01:33:29.080] I am a DINK.
[01:33:29.080 --> 01:33:31.160] Cara, you know what that is?
[01:33:31.160 --> 01:33:32.640] Dual income, no kids.
[01:33:32.640 --> 01:33:33.640] No kids.
[01:33:33.640 --> 01:33:34.640] Yeah, no kids.
[01:33:34.640 --> 01:33:35.640] Dual income, no kids.
[01:33:35.640 --> 01:33:39.400] With a comfortable household income, we will not be having children.
[01:33:39.400 --> 01:33:43.720] I fully appreciate the doom that will come to future generations as a result of climate
[01:33:43.720 --> 01:33:48.580] change and with the strong political divides and ineffective global politicking, I do not
[01:33:48.580 --> 01:33:52.280] believe that people will ever address the climate issue.
[01:33:52.280 --> 01:33:55.560] With all that said, why should I care about climate change?
[01:33:55.560 --> 01:33:58.720] Basically, my household is well off enough to get through it.
[01:33:58.720 --> 01:34:01.720] Without too much discomfort, we won't ever have children, so we don't really have to
[01:34:01.720 --> 01:34:04.520] be concerned with the welfare of future generations.
[01:34:04.520 --> 01:34:09.480] And even if we tried to do pro-social activities to address it, our individual footprint would
[01:34:09.480 --> 01:34:12.760] amount to next to nothing in addressing the issue.
[01:34:12.760 --> 01:34:17.400] Without overall structural change, which won't ever occur, anything we do as individuals
[01:34:17.400 --> 01:34:20.080] would result in less than rounding errors.
[01:34:20.080 --> 01:34:23.480] And then he says, parenthetically, I understand the fallacy of composition.
[01:34:23.480 --> 01:34:28.040] He says he's a big fan, and signs off, apathetically and nihilistically yours.
[01:34:28.040 --> 01:34:31.480] I know Bob appreciates that.
[01:34:31.480 --> 01:34:33.840] I abridged that a little bit, but those are his key points.
[01:34:33.840 --> 01:34:35.360] So what do you guys think about that?
[01:34:35.360 --> 01:34:38.920] So one is, there's a few pieces here that I wanted to address.
[01:34:38.920 --> 01:34:43.140] One is, there's no way that we're going to address climate change.
[01:34:43.140 --> 01:34:44.140] It's not going to happen.
[01:34:44.140 --> 01:34:47.680] We're too politically dysfunctional for that to happen.
[01:34:47.680 --> 01:34:51.540] And two, if you don't have kids, why should you care?
[01:34:51.540 --> 01:34:54.680] And three, nothing I do is going to make any difference anyway, because I'm one
[01:34:54.680 --> 01:34:58.960] individual out of seven to eight billion.
[01:34:58.960 --> 01:35:00.660] Our individual footprints are not the issue.
[01:35:00.660 --> 01:35:04.760] It's the big governmental things, and they're never going to change.
[01:35:04.760 --> 01:35:06.560] So how do you guys all feel about that?
[01:35:06.560 --> 01:35:08.360] There's a lot to unpack there.
[01:35:08.360 --> 01:35:09.360] Yeah, there is.
[01:35:09.360 --> 01:35:10.360] Yeah.
[01:35:10.360 --> 01:35:12.520] What are your initial, what are your top line thoughts?
[01:35:12.520 --> 01:35:13.960] His views are valid.
[01:35:13.960 --> 01:35:19.640] He is a human being in the world, and it's very hard not to feel that way.
[01:35:19.640 --> 01:35:23.040] That doesn't mean that there's nothing we can do.
[01:35:23.040 --> 01:35:29.760] It doesn't mean that there's not a reason to care about the future.
[01:35:29.760 --> 01:35:37.020] But I think we also can't minimize the fact that, because I don't think what he's saying
[01:35:37.020 --> 01:35:42.580] is we're screwed, so I'm just going to become super hedonistic.
[01:35:42.580 --> 01:35:45.920] I think he's saying, I'm living my life with balance.
[01:35:45.920 --> 01:35:52.040] I'm just worried that it's causing me so much anxiety and ultimately at the end of the day,
[01:35:52.040 --> 01:35:53.040] it's all going to burn.
[01:35:53.040 --> 01:35:57.520] Yeah, although I don't know that he's not saying that he's just going to say, screw
[01:35:57.520 --> 01:35:58.520] it.
[01:35:58.520 --> 01:36:03.760] Why not live hedonistically, meaning that why pay any attention at all to my carbon
[01:36:03.760 --> 01:36:05.520] footprint when it's a pinprick?
[01:36:05.520 --> 01:36:06.520] It doesn't matter.
[01:36:06.520 --> 01:36:07.520] Right.
[01:36:07.520 --> 01:36:10.600] And so obviously that's an extreme view that I don't subscribe to.
[01:36:10.600 --> 01:36:14.400] So if none of this matters, why not just burn it all down?
[01:36:14.400 --> 01:36:18.160] Yeah, that's terrible, but that's also a slippery slope fallacy.
[01:36:18.160 --> 01:36:20.720] That's not the case.
[01:36:20.720 --> 01:36:23.920] So let me do this.
[01:36:23.920 --> 01:36:28.880] Let's set aside the point of why should I care about the future?
[01:36:28.880 --> 01:36:32.080] That's more of a philosophical, ethical question.
[01:36:32.080 --> 01:36:38.040] We all have to decide for ourselves where we get meaning in this world.
[01:36:38.040 --> 01:36:42.720] Personally I like the idea that humanity is self-sustaining and we'll at least live with
[01:36:42.720 --> 01:36:46.560] reasonable harmony with nature sustainably into the future.
[01:36:46.560 --> 01:36:49.440] If you don't care about that, there's probably no way I could convince you.
[01:36:49.440 --> 01:36:53.040] You know, if you're saying, why should I care about what happens after the second that I
[01:36:53.040 --> 01:36:56.200] die from one perspective, it's like, you're right, you'll be dead.
[01:36:56.200 --> 01:36:58.840] Nothing can possibly affect you at that point.
[01:36:58.840 --> 01:37:04.240] And if you don't care about people beyond your life or your kids, and if you're not
[01:37:04.240 --> 01:37:08.240] going to have kids and beyond yourself, then I don't know that I could convince you that
[01:37:08.240 --> 01:37:09.240] you should.
[01:37:09.240 --> 01:37:10.240] Right.
[01:37:10.240 --> 01:37:12.400] That's kind of a philosophical worldview thing.
[01:37:12.400 --> 01:37:14.120] So let's just put that aside.
[01:37:14.120 --> 01:37:16.840] Let's just focus on the two more concrete claims.
[01:37:16.840 --> 01:37:23.680] One is that government is never going to get their shit together and do anything meaningful
[01:37:23.680 --> 01:37:26.240] about climate change.
[01:37:26.240 --> 01:37:34.560] So I would say that there is a lot of reason to be skeptical and even nihilistic about
[01:37:34.560 --> 01:37:37.600] government, effective government action on climate change.
[01:37:37.600 --> 01:37:40.360] But I am not nihilistic about it.
[01:37:40.360 --> 01:37:45.000] And I think that that's a self-fulfilling prophecy, like if we all give up, then we're
[01:37:45.000 --> 01:37:47.820] definitely not going to do anything about it.
[01:37:47.820 --> 01:37:53.720] And sometimes you just have to keep up that constant pressure over even generations and
[01:37:53.720 --> 01:37:57.180] eventually you break through.
[01:37:57.180 --> 01:38:00.240] And so I think we just need to keep doing that.
[01:38:00.240 --> 01:38:02.360] And it's never going to be too late.
[01:38:02.360 --> 01:38:06.580] It may be too late to keep it from getting bad, but it can always be worse.
[01:38:06.580 --> 01:38:10.480] It's never going to be too late to keep it from getting even worse.
[01:38:10.480 --> 01:38:15.440] And so, you know, we could lament what we didn't do, but that doesn't help us.
[01:38:15.440 --> 01:38:17.740] So we just got to do what we can.
[01:38:17.740 --> 01:38:24.300] I'd also say that there is recent evidence that governments can do things that are meaningful.
[01:38:24.300 --> 01:38:25.560] You mean can or are?
[01:38:25.560 --> 01:38:26.560] Are.
[01:38:26.560 --> 01:38:27.560] I think there are.
[01:38:27.560 --> 01:38:30.660] I mean, I think the recent, you know, the Inflation Reduction Act, which is really about
[01:38:30.660 --> 01:38:35.240] health care and climate change, shows that there are policies that could make a difference.
[01:38:35.240 --> 01:38:42.160] I mean, there were three independent analyses of the IRA and its effect on carbon reduction
[01:38:42.160 --> 01:38:46.700] in the U.S. between now and, you know, 2030, 2035.
[01:38:46.700 --> 01:38:51.640] And they found that it would significantly decrease, it's going to significantly decrease
[01:38:51.640 --> 01:38:55.200] our carbon footprint between now and then.
[01:38:55.200 --> 01:39:00.960] And it's already working in the United States, it's already working in that there are already
[01:39:00.960 --> 01:39:06.720] companies saying, OK, in light of this, we're going to invest $2 billion in a plant that's
[01:39:06.720 --> 01:39:10.100] going to make batteries or going to refine lithium or whatever, that are going to do
[01:39:10.100 --> 01:39:13.900] things that we need to do in order to make this transition to clean energy.
[01:39:13.900 --> 01:39:14.900] So it's working.
[01:39:14.900 --> 01:39:17.520] It's not, it didn't get us all the way to where we wanted to be.
[01:39:17.520 --> 01:39:19.260] But it was something.
[01:39:19.260 --> 01:39:20.700] And everything helps.
[01:39:20.700 --> 01:39:23.880] Everything will reduce how bad it's going to get, basically.
[01:39:23.880 --> 01:39:28.900] But we also know the exacerbating effect of the feedback loops
[01:39:28.900 --> 01:39:30.940] that are already in play.
[01:39:30.940 --> 01:39:36.220] Like we know that we're not in a safe place right now and that all this stuff we already
[01:39:36.220 --> 01:39:40.240] did leading up to today is still going to continue to play out.
[01:39:40.240 --> 01:39:42.880] So we need to use that in the calculus.
[01:39:42.880 --> 01:39:43.880] Absolutely.
[01:39:43.880 --> 01:39:53.660] But the bottom line is that the range of possible outcomes, it falls on both sides of very significant
[01:39:53.660 --> 01:39:54.780] tipping points.
[01:39:54.780 --> 01:39:58.500] So there are tipping points that we still may avoid.
[01:39:58.500 --> 01:40:01.080] But if we don't avoid them, it's going to get a lot worse.
[01:40:01.080 --> 01:40:02.400] So that's the uncertainty.
[01:40:02.400 --> 01:40:04.320] So we don't know exactly how bad it's going to get.
[01:40:04.320 --> 01:40:09.460] So my hope is that if we keep pushing, keep pushing, at the very least, we may avoid some
[01:40:09.460 --> 01:40:11.420] of these worst tipping points.
[01:40:11.420 --> 01:40:12.420] That's still on the table.
[01:40:12.420 --> 01:40:14.300] And that's the thing that I think is really clear.
[01:40:14.300 --> 01:40:19.260] Like I feel like the thing that isn't made very clear is that you're right.
[01:40:19.260 --> 01:40:20.580] The error bars are huge.
[01:40:20.580 --> 01:40:24.620] The tipping points are within the area, within the error bars.
[01:40:24.620 --> 01:40:26.340] But it's all worse.
[01:40:26.340 --> 01:40:28.580] The question is how much worse.
[01:40:28.580 --> 01:40:29.580] Totally.
[01:40:29.580 --> 01:40:30.580] Absolutely.
[01:40:30.580 --> 01:40:32.620] And we have to remember there's literally nothing we can do now.
[01:40:32.620 --> 01:40:36.000] It's going to get worse before it gets better, no question.
[01:40:36.000 --> 01:40:38.080] There's a question of whether it could get better.
[01:40:38.080 --> 01:40:41.180] It's just, could it get not worse-worse?
[01:40:41.180 --> 01:40:44.660] Some tipping points are irreversible on a human civilization time scale.
[01:40:44.660 --> 01:40:46.580] Yes, we're going to lose a foot.
[01:40:46.580 --> 01:40:50.420] Let's do all we can do to make sure we don't lose one leg or two legs.
[01:40:50.420 --> 01:40:51.420] Exactly.
[01:40:51.420 --> 01:40:54.780] And I think that that's the part that we have to be so clear about, which is why I think
[01:40:54.780 --> 01:41:01.460] it's valid when somebody is experiencing like existential climate distress.
[01:41:01.460 --> 01:41:05.180] Because they're like, look at my foot's going away.
[01:41:05.180 --> 01:41:06.180] I don't want to lose a foot.
[01:41:06.180 --> 01:41:08.340] And it's like, well, that's kind of a done deal.
[01:41:08.340 --> 01:41:10.940] Yes, but it's not a valid reason to be nihilistic.
[01:41:10.940 --> 01:41:15.540] Because it's not a valid argument that there's nothing we can do or that nothing we do will
[01:41:15.540 --> 01:41:19.860] matter because we still can affect the outcome.
[01:41:19.860 --> 01:41:21.260] We still can affect the outcome.
[01:41:21.260 --> 01:41:25.340] And I think the real question here, and again, this comes back to like the psychological
[01:41:25.340 --> 01:41:30.380] profile and the philosophical profile of the email that we decided to talk about, is who
[01:41:30.380 --> 01:41:34.640] is we, who am I as an actor in a system?
[01:41:34.640 --> 01:41:42.020] And so it's one thing to say, well, it's no excuse to be nihilistic or it's no, I don't
[01:41:42.020 --> 01:41:43.740] remember how you phrased it.
[01:41:43.740 --> 01:41:45.980] It doesn't justify nihilism.
[01:41:45.980 --> 01:41:47.700] It doesn't justify nihilism.
[01:41:47.700 --> 01:41:51.980] And I would say that's right if you are a person in a position of power.
[01:41:51.980 --> 01:41:54.020] Well, but I disagree.
[01:41:54.020 --> 01:41:57.780] I think that the most important thing we each can do is vote.
[01:41:57.780 --> 01:42:03.460] And at least like in the United States, if enough people vote for, you know, politicians
[01:42:03.460 --> 01:42:06.180] who are prioritizing climate change, it will matter.
[01:42:06.180 --> 01:42:07.820] It will absolutely matter.
[01:42:07.820 --> 01:42:14.220] Who we, you know, who we put in the White House between 2016 and 2020 was vastly different
[01:42:14.220 --> 01:42:17.260] than who was in the White House now in terms of climate change.
[01:42:17.260 --> 01:42:18.260] Absolutely different.
[01:42:18.260 --> 01:42:19.260] Those votes mattered.
[01:42:19.260 --> 01:42:20.260] Absolutely mattered.
[01:42:20.260 --> 01:42:24.900] And I guess there's an assumption in what you said that being nihilistic means not engaging
[01:42:24.900 --> 01:42:25.900] at all.
[01:42:25.900 --> 01:42:28.100] And what I'm saying is that there are shades of gray.
[01:42:28.100 --> 01:42:30.340] Like an individual can be like, yeah, I'm going to vote.
[01:42:30.340 --> 01:42:36.300] Of course I'm going to vote, but I'm not going to spend all day every day eaten up
[01:42:36.300 --> 01:42:37.300] by anxiety.
[01:42:37.300 --> 01:42:43.020] And I'm not going to make a million decisions that are detrimental to my mental health in
[01:42:43.020 --> 01:42:46.060] an effort to be like an eco warrior.
[01:42:46.060 --> 01:42:48.860] And I think it's about finding balance.
[01:42:48.860 --> 01:42:53.020] Like I do think you can have a touch of nihilism because it's authentic.
[01:42:53.020 --> 01:42:55.420] Yes, I would say pessimism.
[01:42:55.420 --> 01:42:56.420] I would say pessimism.
[01:42:56.420 --> 01:42:57.420] Sure.
[01:42:57.420 --> 01:42:58.420] Nihilism is kind of an absolute.
[01:42:58.420 --> 01:42:59.420] And maybe we're splitting hairs.
[01:42:59.420 --> 01:43:00.420] Yeah.
[01:43:00.420 --> 01:43:03.140] I agree with the balance issue, the balance in the middle of I'm going to be realistic,
[01:43:03.140 --> 01:43:09.680] but find optimism where I can and find pragmatic things that I can realistically do.
[01:43:09.680 --> 01:43:10.680] That's going to be the best outcome.
[01:43:10.680 --> 01:43:15.420] And I also agree that you don't necessarily have to reorganize your life around it.
[01:43:15.420 --> 01:43:19.660] You could still live your life, but take reasonable steps, like be informed enough to know who
[01:43:19.660 --> 01:43:20.660] to vote for.
[01:43:20.660 --> 01:43:22.700] Like if you just do that, then I'm happy with you.
[01:43:22.700 --> 01:43:23.700] Yeah, that's super important.
[01:43:23.700 --> 01:43:24.700] Yeah, that might be enough.
[01:43:24.700 --> 01:43:25.700] That might be enough.
[01:43:25.700 --> 01:43:31.060] In doing that, hopefully what we're doing is we're pushing the needle towards a structure
[01:43:31.060 --> 01:43:35.660] that's organized because one of the biggest frustrations for me, and you guys have heard
[01:43:35.660 --> 01:43:42.580] me bitch about this before, is how the onus has been successfully shifted to the consumer.
[01:43:42.580 --> 01:43:44.060] Yeah, I agree with that.
[01:43:44.060 --> 01:43:48.900] And that's the system itself makes us feel like the burden is on us.
[01:43:48.900 --> 01:43:49.900] It's a misdirection.
[01:43:49.900 --> 01:43:50.900] Yeah, good point.
[01:43:50.900 --> 01:43:52.420] And that's not healthy psychologically.
[01:43:52.420 --> 01:43:58.900] So if we get the right people in power to make the right regulatory decisions, what
[01:43:58.900 --> 01:44:02.260] we'll start to see is that our choices are better choices.
[01:44:02.260 --> 01:44:04.340] Yeah, we have better choices to make.
[01:44:04.340 --> 01:44:09.420] All right, let's move on from a point that we all agree on.
Science or Fiction (1:44:11)
Theme: 2022 Golden Goose Awards
Item #1: The development of laser LASIK surgery was inspired by a case of accidental laser injury to the eye, producing precise perfectly circular damage.[8]
Item #2: Researchers developed a powerful microscope out of paper that folds like origami, with total material costs less than $1.[9]
Item #3: While examining the properties of cone snail venom, researchers accidentally discovered that it is a potent inhibitor of HIV replication.[10]
Answer | Item |
---|---|
Fiction | Snail venom inhibits HIV |
Science | Powerful origami microscope |
Science | LASIK from laser eye injury |
Host | Result |
---|---|
Steve | win |
Rogue | Guess |
---|---|
Evan | LASIK from laser eye injury |
Cara | LASIK from laser eye injury |
Bob | LASIK from laser eye injury |
Jay | Snail venom inhibits HIV |
Voice-over: It's time for Science or Fiction.
Evan's Response
Cara's Response
Bob's Response
Jay's Response
Steve Explains Item #2
Steve Explains Item #1
Steve Explains Item #3
[01:44:09.420 --> 01:44:13.940] Let's go on with science or fiction.
[01:44:13.940 --> 01:44:23.300] It's time for science or fiction.
[01:44:23.300 --> 01:44:27.300] Each week I come up with three science news items or facts, two genuine and one fictitious,
[01:44:27.300 --> 01:44:31.820] and then I challenge my panel of skeptics to tell me which one is the fake.
[01:44:31.820 --> 01:44:33.580] We have a theme this week.
[01:44:33.580 --> 01:44:38.220] The theme is the Golden Goose Awards because they are being awarded tonight as we record
[01:44:38.220 --> 01:44:39.220] this show.
[01:44:39.220 --> 01:44:44.940] Now, the Golden Goose Awards are scientific awards that are given to research that had
[01:44:44.940 --> 01:44:52.300] an incredibly positive impact, but was also either fortuitous or unusual in some way.
[01:44:52.300 --> 01:44:56.060] It's not like the Ig Nobels, where they're funny.
[01:44:56.060 --> 01:45:01.940] This is just, oh, that was like a very lucky find, but it led to something hugely positive.
[01:45:01.940 --> 01:45:05.500] Okay, the Golden Goose Awards, you can look it up after we do the show.
[01:45:05.500 --> 01:45:06.500] All right, here we go.
[01:45:06.500 --> 01:45:12.220] Item number one, the development of laser LASIK surgery was inspired by a case of accidental
[01:45:12.220 --> 01:45:17.180] laser injury to the eye, producing precise, perfectly circular damage.
[01:45:17.180 --> 01:45:23.900] Item number two, researchers developed a powerful microscope out of paper that folds like origami,
[01:45:23.900 --> 01:45:28.020] with total material costs less than $1.
[01:45:28.020 --> 01:45:32.980] And item number three, while examining the properties of cone snail venom, researchers
[01:45:32.980 --> 01:45:37.860] accidentally discovered that it is a potent inhibitor of HIV replication.
[01:45:37.860 --> 01:45:39.340] So you kind of get the theme there?
[01:45:39.340 --> 01:45:40.340] Yep.
[01:45:40.340 --> 01:45:41.340] Yeah.
[01:45:41.340 --> 01:45:42.340] Evan, go first.
[01:45:42.340 --> 01:45:48.740] LASIK surgery inspired by a case of accidental laser injury to the eye, producing precise,
[01:45:48.740 --> 01:45:50.060] perfectly circular damage.
[01:45:50.060 --> 01:45:55.580] That almost sounds like too perfect, literally speaking.
[01:45:55.580 --> 01:45:59.860] And I don't know if I've heard anything about this.
[01:45:59.860 --> 01:46:00.860] I'm trying.
[01:46:00.860 --> 01:46:02.380] I'm jogging my memory.
[01:46:02.380 --> 01:46:03.980] No, it's not there.
[01:46:03.980 --> 01:46:06.620] So I don't know.
[01:46:06.620 --> 01:46:10.140] I'm leaning, I think, on this one being the fiction.
[01:46:10.140 --> 01:46:16.300] The other two are quite extraordinary, so much so that, you know, not that I'm not giving
[01:46:16.300 --> 01:46:17.300] you credit, Steve.
[01:46:17.300 --> 01:46:20.700] I give you a lot of credit for a lot of things having to do with Science or Fiction.
[01:46:20.700 --> 01:46:30.900] But of the three, these next two are so, you know, extraordinary that I'm not sure you
[01:46:30.900 --> 01:46:34.100] would have made these up.
[01:46:34.100 --> 01:46:36.980] Researchers developed a powerful microscope out of paper.
[01:46:36.980 --> 01:46:37.980] Huh?
[01:46:37.980 --> 01:46:38.980] Microscope out of paper?
[01:46:38.980 --> 01:46:39.980] What?
[01:46:39.980 --> 01:46:43.380] That folds like origami and total material costs of less than a dollar.
[01:46:43.380 --> 01:46:44.380] What is this?
[01:46:44.380 --> 01:46:47.800] Like, is it nano size scale?
[01:46:47.800 --> 01:46:49.780] Is that why it costs less than a dollar?
[01:46:49.780 --> 01:46:52.260] Like, what are we talking about here?
[01:46:52.260 --> 01:46:53.260] Made out of paper.
[01:46:53.260 --> 01:46:54.260] I have no concept.
[01:46:54.260 --> 01:46:55.260] Like, it's right.
[01:46:55.260 --> 01:46:59.260] It's beyond being able to even think or invent.
[01:46:59.260 --> 01:47:00.260] No lens.
[01:47:00.260 --> 01:47:03.740] Yeah, but well, to be clear, it's a usable microscope, right?
[01:47:03.740 --> 01:47:05.500] It's not some nothing.
[01:47:05.500 --> 01:47:06.500] No deception there.
[01:47:06.500 --> 01:47:07.500] Right.
[01:47:07.500 --> 01:47:12.340] And so what would be the technical definition of microscope, I suppose, is anything that
[01:47:12.340 --> 01:47:18.060] allows you to magnify something that's smaller, I suppose that would be.
[01:47:18.060 --> 01:47:20.380] So I guess that one's going to be right.
[01:47:20.380 --> 01:47:22.660] And then the cone snail venom.
[01:47:22.660 --> 01:47:25.900] I don't know what the cone snail is, but apparently it has venom.
[01:47:25.900 --> 01:47:29.820] And the researchers accidentally discovered it's a potent inhibitor of HIV.
[01:47:29.820 --> 01:47:34.160] How would they accidentally discover that if they weren't looking for it?
[01:47:34.160 --> 01:47:38.420] They must have been looking for it to be an inhibitor of something else.
[01:47:38.420 --> 01:47:41.060] And along the way, at some point, they stumbled on it being an HIV inhibitor.
[01:47:41.060 --> 01:47:43.700] I guess I'll say the LASIK surgery one.
[01:47:43.700 --> 01:47:46.540] It just seems the most made up of the three to me.
[01:47:46.540 --> 01:47:47.540] OK, Kara.
[01:47:47.540 --> 01:47:52.300] Yeah, I'm leaning towards Evan's for the same reasons.
[01:47:52.300 --> 01:47:58.780] I could see that, you know, this idea of something being an accident is that you're looking in
[01:47:58.780 --> 01:48:04.180] one place and then you notice a property that you weren't looking for.
[01:48:04.180 --> 01:48:07.820] And then you go, oh, that property might work over here.
[01:48:07.820 --> 01:48:12.300] So when I think about this cone snail venom one, OK, I'm looking at it and, oh, look at
[01:48:12.300 --> 01:48:13.300] this factor.
[01:48:13.300 --> 01:48:14.300] Look at this protein.
[01:48:14.300 --> 01:48:20.100] Oh, back when I was doing research on, you know, retroviruses, I knew that that protein
[01:48:20.100 --> 01:48:21.620] was really important.
[01:48:21.620 --> 01:48:23.940] Like that's what I think of when it's like an accidental.
[01:48:23.940 --> 01:48:28.780] It's not literally like they accidentally spilled some cone snail venom into an HIV patient's
[01:48:28.780 --> 01:48:29.780] IV.
[01:48:29.780 --> 01:48:32.660] You know, like that's not what's probably happening.
[01:48:32.660 --> 01:48:34.480] So that one seems reasonable.
[01:48:34.480 --> 01:48:37.900] So yeah, I think it's a laser surgery because that's sort of like what you're saying.
[01:48:37.900 --> 01:48:42.420] Like, oops, they accidentally shot a laser into somebody's eye and they go, I can see.
[01:48:42.420 --> 01:48:44.420] And for me, that seems less likely.
[01:48:44.420 --> 01:48:45.940] So I'm going to go with Evan on that.
[01:48:45.940 --> 01:48:46.940] OK, Bob.
[01:48:46.940 --> 01:48:50.060] I'm really curious about this paper microscope.
[01:48:50.060 --> 01:48:53.900] Yeah, I just can't try to imagine how that could work.
[01:48:53.900 --> 01:48:58.300] I'm having some trouble, but I still think that that's probably science in some weird
[01:48:58.300 --> 01:48:59.300] way.
[01:48:59.300 --> 01:49:02.100] The venom one, yeah, to me just seems like, duh, of course.
[01:49:02.100 --> 01:49:09.020] And yeah, this LASIK one is just, I can't imagine, you know, first off, it's creating a perfectly
[01:49:09.020 --> 01:49:12.640] circular... I mean, I assume that takes a little bit of time.
[01:49:12.640 --> 01:49:16.820] So I mean, I mean, how long would you be looking into that accidentally looking into that laser
[01:49:16.820 --> 01:49:21.780] and then perfectly arranged, you know, on your cornea and not hitting like the sclera
[01:49:21.780 --> 01:49:22.780] at all?
[01:49:22.780 --> 01:49:24.340] It just seems too contrived.
[01:49:24.340 --> 01:49:26.300] I'll say that was fiction, too.
[01:49:26.300 --> 01:49:31.340] And Jay, it's funny that you guys all went that way because the laser one out of all
[01:49:31.340 --> 01:49:34.100] of them seems the most likely to me.
[01:49:34.100 --> 01:49:35.100] I don't know.
[01:49:35.100 --> 01:49:36.100] It just doesn't seem that odd to me.
[01:49:36.100 --> 01:49:42.060] I mean, the paper microscope doesn't seem that wacky to me either, just because, you
[01:49:42.060 --> 01:49:47.980] know, the paper is going to hold up the lenses, you know, like you just have it's like it's
[01:49:47.980 --> 01:49:51.020] the lattice work to hold the lenses.
[01:49:51.020 --> 01:49:53.660] I would suppose that's what Steve is saying.
[01:49:53.660 --> 01:49:57.340] And, you know, I've seen people build very substantial things out of origami.
[01:49:57.340 --> 01:50:00.500] It doesn't seem that much of a stretch to think that someone would like fashion some
[01:50:00.500 --> 01:50:05.460] type of, you know, structure that would be able to hold a lens or whatever.
[01:50:05.460 --> 01:50:06.460] I don't know.
[01:50:06.460 --> 01:50:08.500] I just don't think that's that big of a deal.
[01:50:08.500 --> 01:50:13.100] Well, my interpretation was that it says all costs under a dollar.
[01:50:13.100 --> 01:50:18.460] Yeah, I'm thinking no, you're not going to get any lens, you're not going to get any
[01:50:18.460 --> 01:50:23.260] lens, you know, for under a buck.
[01:50:23.260 --> 01:50:26.700] A good lens, microscope, you know, a glass lens.
[01:50:26.700 --> 01:50:27.700] It doesn't say good.
[01:50:27.700 --> 01:50:28.700] Yeah, it doesn't, right.
[01:50:28.700 --> 01:50:29.700] It says powerful.
[01:50:29.700 --> 01:50:30.700] What do you mean?
[01:50:30.700 --> 01:50:31.700] It says powerful.
[01:50:31.700 --> 01:50:32.700] It doesn't say good.
[01:50:32.700 --> 01:50:33.700] We have to define that, too.
[01:50:33.700 --> 01:50:39.620] So wait, so you're assuming that there's a lens, there's a glass or some sort of lens
[01:50:39.620 --> 01:50:41.300] in this system as well?
[01:50:41.300 --> 01:50:43.180] I think you can, based on how he wrote it.
[01:50:43.180 --> 01:50:44.180] Yeah, I guess so.
[01:50:44.180 --> 01:50:45.180] He doesn't preclude that.
[01:50:45.180 --> 01:50:46.180] Yeah, but the lens...
[01:50:46.180 --> 01:50:47.180] It doesn't say entirely on paper.
[01:50:47.180 --> 01:50:49.180] Yeah, it doesn't say on paper.
[01:50:49.180 --> 01:50:53.020] I don't know, that's a good question that Bob brought up.
[01:50:53.020 --> 01:50:54.020] I don't know.
[01:50:54.020 --> 01:50:55.020] I don't know.
[01:50:55.020 --> 01:50:56.020] I don't know if the lens is...
[01:50:56.020 --> 01:50:57.020] Final answer, I don't know.
[01:50:57.020 --> 01:50:59.620] All right, I'm going to ask for a lifeline on the next one, too.
[01:50:59.620 --> 01:51:05.340] And then the last one seems to be the weirdest to me, like what kind of lab accident would
[01:51:05.340 --> 01:51:10.380] have to take place where cone snail venom has accidentally...
[01:51:10.380 --> 01:51:13.860] They accidentally discovered that it could inhibit HIV replication.
[01:51:13.860 --> 01:51:17.980] Like, was this literally like, whoa, I spilled this green stuff into this thing?
[01:51:17.980 --> 01:51:18.980] That's what Cara just talked about.
[01:51:18.980 --> 01:51:21.020] He literally was not listening.
[01:51:21.020 --> 01:51:22.020] No.
[01:51:22.020 --> 01:51:24.540] Or he was, and he was...
[01:51:24.540 --> 01:51:28.860] I got to be honest, I don't remember a word that you said, Cara, because I was thinking
[01:51:28.860 --> 01:51:29.860] about these.
[01:51:29.860 --> 01:51:32.620] So I must have just subliminally picked that up.
[01:51:32.620 --> 01:51:33.620] You were in your own head.
[01:51:33.620 --> 01:51:34.620] Yeah.
[01:51:34.620 --> 01:51:37.100] I love that you're the kid with his hand up, like, call on me, and the teacher's like,
[01:51:37.100 --> 01:51:40.180] yeah, I just answered that question when the other kid asked it.
[01:51:40.180 --> 01:51:41.860] Well, I'm deeply thinking of it.
[01:51:41.860 --> 01:51:42.860] I don't know.
[01:51:42.860 --> 01:51:43.860] I'm going to just go with three.
[01:51:43.860 --> 01:51:47.140] I don't want to go with everybody else, because first of all, I've realized that I shouldn't
[01:51:47.140 --> 01:51:48.700] care about the results of...
[01:51:48.700 --> 01:51:49.700] The cone snail.
[01:51:49.700 --> 01:51:51.340] I'm going to go with the cone snail as the fiction.
[01:51:51.340 --> 01:51:52.340] Really?
[01:51:52.340 --> 01:51:54.700] Because I don't think that that happened.
[01:51:54.700 --> 01:51:56.500] I think that that is made up.
[01:51:56.500 --> 01:51:57.500] All right.
[01:51:57.500 --> 01:52:01.260] So you all agree with number two, so we'll start there.
[01:52:01.260 --> 01:52:06.540] Researchers developed a powerful microscope out of paper that folds like origami with
[01:52:06.540 --> 01:52:10.480] total material costs less than $1.
[01:52:10.480 --> 01:52:15.900] You guys all think this one is science, and this one is science.
[01:52:15.900 --> 01:52:16.900] This is real.
[01:52:16.900 --> 01:52:17.900] You buy it at the dollar store.
[01:52:17.900 --> 01:52:18.900] Steve, can I tell you something?
[01:52:18.900 --> 01:52:19.900] Yeah.
[01:52:19.900 --> 01:52:20.900] You have one.
[01:52:20.900 --> 01:52:21.900] I have a Foldscope.
[01:52:21.900 --> 01:52:22.900] You have a Foldscope.
[01:52:22.900 --> 01:52:23.900] Yeah, it's a Foldscope.
[01:52:23.900 --> 01:52:24.900] That's what it is.
[01:52:24.900 --> 01:52:25.900] I'm really good friends with the people who developed it.
[01:52:25.900 --> 01:52:26.900] Yeah.
[01:52:26.900 --> 01:52:27.900] They charge you $2 for it.
[01:52:27.900 --> 01:52:28.900] It's so cool.
[01:52:28.900 --> 01:52:29.900] It's the coolest thing ever.
[01:52:29.900 --> 01:52:30.900] Yeah, it's a glass ball.
[01:52:30.900 --> 01:52:31.900] The lens is a glass ball.
[01:52:31.900 --> 01:52:32.900] There are magnets and paper.
[01:52:32.900 --> 01:52:33.900] That's it.
[01:52:33.900 --> 01:52:34.900] And it's not the manufacturing cost.
[01:52:34.900 --> 01:52:35.900] It's the material cost.
[01:52:35.900 --> 01:52:38.420] But still, and you can buy it for like 15 bucks.
[01:52:38.420 --> 01:52:40.380] They're very, very cheap.
[01:52:40.380 --> 01:52:45.200] They've made millions of them for distribution to poor kids or whatever, you know, school
[01:52:45.200 --> 01:52:49.180] systems that are in underdeveloped countries or...
[01:52:49.180 --> 01:52:50.180] And field researchers.
[01:52:50.180 --> 01:52:51.180] And field researchers.
[01:52:51.180 --> 01:52:52.180] Sure.
[01:52:52.180 --> 01:52:54.620] They're good enough that you could put an iPhone next to the lens and take good pictures
[01:52:54.620 --> 01:52:55.620] with it.
[01:52:55.620 --> 01:52:56.620] Wow.
[01:52:56.620 --> 01:52:57.620] Or you could just use your eyeball as well.
[01:52:57.620 --> 01:53:02.620] So, yeah, they set out to create the cheapest microscope they can, so they made it out of
[01:53:02.620 --> 01:53:03.860] basically origami.
[01:53:03.860 --> 01:53:05.100] How powerful is it?
[01:53:05.100 --> 01:53:06.100] 140x.
[01:53:06.100 --> 01:53:07.100] Okay.
[01:53:07.100 --> 01:53:08.100] Jesus.
[01:53:08.100 --> 01:53:09.100] Not bad.
[01:53:09.100 --> 01:53:10.100] Good enough.
[01:53:10.100 --> 01:53:11.100] Yeah.
[01:53:11.100 --> 01:53:12.100] The camera is ripe.
[01:53:12.100 --> 01:53:15.100] Powerful and good are two different things when you're talking about lenses.
[01:53:15.100 --> 01:53:16.100] And pretty much...
[01:53:16.100 --> 01:53:17.620] It's not very optically clear.
[01:53:17.620 --> 01:53:21.100] I mean, it is, but it's not a Zeiss lens or something.
[01:53:21.100 --> 01:53:22.100] Yeah.
[01:53:22.100 --> 01:53:25.540] It's good enough for a $15 microscope, you know.
[01:53:25.540 --> 01:53:31.940] But glass, as we say, you get exactly what you pay for, pretty much, you know, and that's
[01:53:31.940 --> 01:53:35.120] true of cameras and microscopes and telescopes or whatever it is.
[01:53:35.120 --> 01:53:39.420] There's so many things you could do to improve the quality of the lens.
[01:53:39.420 --> 01:53:41.900] And you basically get what you pay for there.
[01:53:41.900 --> 01:53:45.860] But just a glass ball can function as a lens perfectly fine, and it could be very cheap
[01:53:45.860 --> 01:53:46.860] to make.
[01:53:46.860 --> 01:53:47.860] And it could have been plastic, too.
[01:53:47.860 --> 01:53:48.860] Yeah.
[01:53:48.860 --> 01:53:51.900] Yeah, but I didn't think for under a buck you would get something that powerful.
[01:53:51.900 --> 01:53:53.860] I guess we'll go back to number one.
[01:53:53.860 --> 01:53:58.900] The development of laser LASIK surgery was inspired by a case of accidental laser injury
[01:53:58.900 --> 01:54:02.420] to the eye, producing precise, perfectly circular damage.
[01:54:02.420 --> 01:54:06.740] Bob, Kara, and Evan, you think this one is a fiction.
[01:54:06.740 --> 01:54:10.740] Jay, you're all by yourself and thinking that this one is science.
[01:54:10.740 --> 01:54:12.740] And this one is science.
[01:54:12.740 --> 01:54:14.740] Good work, Jay.
[01:54:14.740 --> 01:54:15.740] What?
[01:54:15.740 --> 01:54:16.740] Nice.
[01:54:16.740 --> 01:54:18.780] Oh, boy.
[01:54:18.780 --> 01:54:19.780] That's what happens.
[01:54:19.780 --> 01:54:20.780] Good job, Jay.
[01:54:20.780 --> 01:54:21.780] That's what you get.
[01:54:21.780 --> 01:54:22.780] That's nutty.
[01:54:22.780 --> 01:54:25.460] So a graduate student at the University of Michigan Center for the Ultrafast Optical
[01:54:25.460 --> 01:54:29.380] Science experienced an accidental laser injury to his eye.
[01:54:29.380 --> 01:54:33.380] Fortunately, it didn't destroy his vision.
[01:54:33.380 --> 01:54:38.860] The observation, you know, the people who took care of him observed that he had a very
[01:54:38.860 --> 01:54:42.860] precise and perfectly circular damage produced by the laser.
[01:54:42.860 --> 01:54:45.540] And they're like, huh, I wonder if we could cut eyes with lasers.
[01:54:45.540 --> 01:54:51.540] Because up to that point, the eye surgery, the keratotomy surgery, was done just with
[01:54:51.540 --> 01:54:53.620] a scalpel, with a blade, right?
[01:54:53.620 --> 01:54:55.140] They basically do the same thing.
[01:54:55.140 --> 01:54:56.620] They just were cutting it with a knife.
[01:54:56.620 --> 01:55:01.580] And they say, well, we could make the same cut, but smaller and more precise and less
[01:55:01.580 --> 01:55:05.100] recovery time and less pain if we could do it with a laser.
[01:55:05.100 --> 01:55:10.860] It took them eight years to develop the actual procedure, but that led to bladeless LASIK
[01:55:10.860 --> 01:55:12.420] or all laser LASIK.
[01:55:12.420 --> 01:55:16.820] They used a femtosecond laser to make the cuts.
[01:55:16.820 --> 01:55:22.420] Yeah, but it was inspired by an accidental laser injury to the eye.
[01:55:22.420 --> 01:55:23.420] Cool.
[01:55:23.420 --> 01:55:29.020] We're so lucky that people like are constantly pushing out there like, oh, look at this weird
[01:55:29.020 --> 01:55:30.580] accident that we just had.
[01:55:30.580 --> 01:55:31.580] Wait a second.
[01:55:31.580 --> 01:55:32.580] You know what I mean?
[01:55:32.580 --> 01:55:35.460] Like, it's in those moments where incredible things happen.
[01:55:35.460 --> 01:55:36.460] Yeah.
[01:55:36.460 --> 01:55:38.260] What's that expression, Steve?
[01:55:38.260 --> 01:55:39.420] It's not eureka.
[01:55:39.420 --> 01:55:40.900] It's like, that's weird.
[01:55:40.900 --> 01:55:43.380] That's interesting.
[01:55:43.380 --> 01:55:47.160] Which means that while examining the properties of cone snail venom, researchers accidentally
[01:55:47.160 --> 01:55:55.180] discovered that it is a potent inhibitor of HIV replication is the fiction because, yeah.
[01:55:55.180 --> 01:56:00.140] But the cone snail venom did get one of the Golden Goose Awards, but it was for what they
[01:56:00.140 --> 01:56:04.140] were looking for, which was for a painkiller.
[01:56:04.140 --> 01:56:11.140] What they found was a non-opioid target for pain relief, which is huge.
[01:56:11.140 --> 01:56:12.740] Yeah, that's an addiction.
[01:56:12.740 --> 01:56:17.140] So identifying a new therapeutic target is always big in medicine, meaning something
[01:56:17.140 --> 01:56:23.020] you could bind to or whatever you can affect that could alter the way something functions.
[01:56:23.020 --> 01:56:28.820] So researchers are constantly looking for more ways to manipulate the pain system to
[01:56:28.820 --> 01:56:33.580] get away from the opioid system because that's obviously what causes lots of problems.
[01:56:33.580 --> 01:56:38.860] Now the cone snail, a lot of these animals that like have venom or they bite you or whatever,
[01:56:38.860 --> 01:56:43.080] they do things like either they numb you so you don't feel them biting you or they thin
[01:56:43.080 --> 01:56:45.160] your blood so it doesn't coagulate or whatever.
[01:56:45.160 --> 01:56:49.680] So we use them to find things like anticoagulants and painkillers.
[01:56:49.680 --> 01:56:55.180] Those are common things that we try to source from these snails and other similar critters.
[01:56:55.180 --> 01:56:59.660] So they were looking for these properties and they found that, oh, this is acting through
[01:56:59.660 --> 01:57:05.060] – the discovery was that it was acting through a non-opioid pain reliever, a pain receptor.
[01:57:05.060 --> 01:57:08.180] That was the huge thing here.
[01:57:08.180 --> 01:57:13.060] Still this hasn't fully played out through the clinical research.
[01:57:13.060 --> 01:57:17.020] This is a basic science finding, but huge.
[01:57:17.020 --> 01:57:20.620] And I just made up the HIV thing because I thought it sounded semi-plausible.
[01:57:20.620 --> 01:57:21.620] It did.
[01:57:21.620 --> 01:57:22.620] It totally did.
[01:57:22.620 --> 01:57:26.300] I like to think that they were giving it to somebody who had HIV but they were testing
[01:57:26.300 --> 01:57:30.020] it on something else and it's like, huh, my HIV counts are down.
[01:57:30.020 --> 01:57:31.180] But I just made it up.
[01:57:31.180 --> 01:57:32.180] That didn't happen.
[01:57:32.180 --> 01:57:33.180] All right.
[01:57:33.180 --> 01:57:34.180] So good job, Jay.
[01:57:34.180 --> 01:57:39.260] I always like to reward people who go out on a limb and are not afraid to give an answer
[01:57:39.260 --> 01:57:41.460] that departs from the crew.
[01:57:41.460 --> 01:57:42.460] What's my reward?
[01:57:42.460 --> 01:57:43.460] What's the reward?
[01:57:43.460 --> 01:57:44.460] I didn't know this.
[01:57:44.460 --> 01:57:45.460] Your reward is that you won this week.
[01:57:45.460 --> 01:57:46.460] That's what your reward is.
[01:57:46.460 --> 01:57:47.460] There you go.
[01:57:47.460 --> 01:57:48.460] Yeah.
[01:57:48.460 --> 01:57:49.460] Wow.
[01:57:49.460 --> 01:57:50.460] Okay.
Skeptical Quote of the Week (1:57:48)
If I want to know how we learn and remember and represent the world, I will go to psychology and neuroscience. If I want to know where values come from, I will go to evolutionary biology and neuroscience and psychology, just as Aristotle and Hume would have, were they alive.
–Canadian-American analytic philosopher Patricia Churchland, Professor Emeritus of Philosophy at the University of California, San Diego, and Adjunct Professor at the Salk Institute
[01:57:50.460 --> 01:57:51.460] All right.
[01:57:51.460 --> 01:57:52.460] Evan, give us a quote.
[01:57:52.460 --> 01:57:56.740] If I want to know how we learn and remember and represent the world, I will go to psychology and neuroscience.
[01:57:56.740 --> 01:58:02.180] If I want to know where values come from, I will go to evolutionary biology and neuroscience
[01:58:02.180 --> 01:58:08.900] and psychology, just as Aristotle and Hume would have were they alive.
[01:58:08.900 --> 01:58:15.300] Patricia Churchland, she's a professor emeritus of philosophy at the University of California,
[01:58:15.300 --> 01:58:19.860] San Diego, and an adjunct professor at the Salk Institute.
[01:58:19.860 --> 01:58:22.580] Very nice homage to Aristotle and Hume, I think.
[01:58:22.580 --> 01:58:23.580] Yeah, but it's interesting, though.
[01:58:23.580 --> 01:58:26.620] If you look at the sentence, though, she doesn't say that she would go to philosophy.
[01:58:26.620 --> 01:58:29.700] Yeah, I was expecting to hear that come out.
[01:58:29.700 --> 01:58:34.340] And that's kind of, I mean, when you're talking about where values come from, that's pretty
[01:58:34.340 --> 01:58:35.340] much philosophy.
[01:58:35.340 --> 01:58:40.700] I mean, I think it's informed by neuroscience and psychology, but it's interesting to hear
[01:58:40.700 --> 01:58:44.660] a philosophy professor kind of make that point, though.
[01:58:44.660 --> 01:58:49.700] Right, but is it where they come from, or is that where we go to explore them?
[01:58:49.700 --> 01:58:50.700] Yeah.
[01:58:50.700 --> 01:58:55.340] You know, it's like, I think the idea there that, well, it depends on your philosophies,
[01:58:55.340 --> 01:58:57.060] but are philosophies natural?
[01:58:57.060 --> 01:58:58.100] Are they constructed?
[01:58:58.100 --> 01:58:59.100] Are they whatever?
[01:58:59.100 --> 01:59:02.020] It's like the process, not the content.
[01:59:02.020 --> 01:59:06.980] They're exploring how we come to those places, but they're not finding their values in philosophy.
[01:59:06.980 --> 01:59:09.820] But what she's saying, though, if she wants to understand them, she would go to these
[01:59:09.820 --> 01:59:12.220] disciplines to understand them.
[01:59:12.220 --> 01:59:13.220] And I would have just-
[01:59:13.220 --> 01:59:14.220] Grass is always greener.
[01:59:14.220 --> 01:59:17.020] Yeah, I would have just thrown philosophy into the list myself.
[01:59:17.020 --> 01:59:18.020] That's funny, yeah.
[01:59:18.020 --> 01:59:21.180] I'm sure Massimo Pigliucci would do that also.
[01:59:21.180 --> 01:59:23.900] Maybe it's because Aristotle and Hume were philosophers.
[01:59:23.900 --> 01:59:28.700] Yeah, I think she's saying that the classic philosophers would have studied neuroscience
[01:59:28.700 --> 01:59:29.700] and psychology.
[01:59:29.700 --> 01:59:30.700] I will agree with that.
[01:59:30.700 --> 01:59:34.540] I do think that if that's the point, I would agree.
[01:59:34.540 --> 01:59:35.540] Maybe she's assuming philosophy.
[01:59:35.540 --> 01:59:36.540] Right.
[01:59:36.540 --> 01:59:37.540] Yeah, maybe.
[01:59:37.540 --> 01:59:38.540] That didn't have to be stated.
[01:59:38.540 --> 01:59:40.540] That's why all of our degrees are PhDs.
[01:59:40.540 --> 01:59:45.340] It's like we're all studying philosophy, or at least that's how they view it.
[01:59:45.340 --> 01:59:46.340] Applied philosophy.
[01:59:46.340 --> 01:59:47.340] Yeah.
[01:59:47.340 --> 01:59:48.340] All right.
Signoff
[01:59:48.340 --> 01:59:50.220] Well, thank you all for joining me this week.
[01:59:50.220 --> 01:59:51.220] You got it, Steve.
[01:59:51.220 --> 01:59:52.220] Thanks.
[01:59:52.220 --> 01:59:53.220] Thanks, Steve.

S: —and until next week, this is your Skeptics' Guide to the Universe.
S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.
Today I Learned
- Fact/Description, possibly with an article reference[11]
- Fact/Description
- Fact/Description
Notes
References
- ↑ NY Post: Huge chess world upset of grandmaster sparks wild claims of cheating — with vibrating sex toy
- ↑ Snopes: Does Video Show Snake’s Head in Woman’s Ear?
- ↑ Neuroscience News: Children Don't Believe Everything They Are Told
- ↑ Ars Technica: Is your gas stove bad for your health?
- ↑ Neurologica: Neanderthal Brains
- ↑ From Cell: Design, construction, and in vivo augmentation of a complex gut microbiome
- ↑ Vice: Navy Says All UFO Videos Classified, Releasing Them ‘Will Harm National Security’
- ↑ Golden Goose Award: How a Lab Incident Led to Better Eye Surgery for Millions of People
- ↑ Golden Goose Award: Foldscopes and Frugal Science: Paper Microscopes Make Science Globally Accessible
- ↑ Golden Goose Award: Tiny Snail, Big Impact: Cone Snail Venom Eases Pain and Injects New Energy into Neuroscience
- ↑ [url_for_TIL publication: title]