SGU Episode 901
== Science or Fiction <small>(1:38:18)</small> ==
[01:38:18.000 --> 01:38:21.000] Okay, everyone, let's go on with science or fiction.
[01:38:24.000 --> 01:38:28.000] It's time for science or fiction.
[01:38:33.000 --> 01:38:36.000] Each week I come up with three science news items or facts,
[01:38:36.000 --> 01:38:38.000] two real and one fake.
[01:38:38.000 --> 01:38:42.000] And then I challenge my panel of skeptics to tell me which one is the fake.
[01:38:42.000 --> 01:38:46.000] No theme this week, just three science news items.
[01:38:46.000 --> 01:38:48.000] Are you all ready?
[01:38:48.000 --> 01:38:49.000] Yep.
[01:38:49.000 --> 01:38:50.000] Yep, yep.
[01:38:50.000 --> 01:38:51.000] Yep.
[01:38:51.000 --> 01:38:55.000] Item number one, a review of COVID-19-related preprints
[01:38:55.000 --> 01:38:58.000] that were later published in peer-reviewed journals
[01:38:58.000 --> 01:39:01.000] finds that 50% were substantially altered,
[01:39:01.000 --> 01:39:05.000] including changes to effect sizes, the data used,
[01:39:05.000 --> 01:39:07.000] and statistical significance.
[01:39:07.000 --> 01:39:10.000] Item number two, scientists have developed a simple, rapid,
[01:39:10.000 --> 01:39:14.000] and effective method for making tissue optically transparent,
[01:39:14.000 --> 01:39:16.000] including entire organs.
[01:39:16.000 --> 01:39:20.000] And item number three, in a comprehensive meta-analysis,
[01:39:20.000 --> 01:39:23.000] researchers find that women have advantages over men
[01:39:23.000 --> 01:39:28.000] in phonemic fluency, verbal memory, and verbal recognition,
[01:39:28.000 --> 01:39:31.000] and that this advantage is stable over 50 years of research
[01:39:31.000 --> 01:39:34.000] and over a participant's lifetime.
[01:39:34.000 --> 01:39:35.000] All right, Jay, go first.
[01:39:35.000 --> 01:39:39.000] All right, this first one here, a review of COVID-19-related preprints
[01:39:39.000 --> 01:39:42.000] that were later published in peer-reviewed journals
[01:39:42.000 --> 01:39:45.000] finds that 50% were substantially altered,
[01:39:45.000 --> 01:39:47.000] including changes to effect sizes, the data used,
[01:39:47.000 --> 01:39:49.000] and statistical significance.
[01:39:49.000 --> 01:39:51.000] All right, Steve, I'm not clear about what's happening here.
[01:39:51.000 --> 01:39:53.000] All right, so you know what a preprint is?
[01:39:53.000 --> 01:39:56.000] It basically means that scientists do a study
[01:39:56.000 --> 01:40:01.000] and they just put it up online before it goes through peer review.
[01:40:01.000 --> 01:40:03.000] Like, we're all ready to go, here it is.
[01:40:03.000 --> 01:40:06.000] Meanwhile, they send it to publishers, right,
[01:40:06.000 --> 01:40:09.000] hopefully one publisher, and eventually it gets published.
[01:40:09.000 --> 01:40:11.000] But there's a process to that publication process,
[01:40:11.000 --> 01:40:13.000] you know, you get feedback from reviewers
[01:40:13.000 --> 01:40:15.000] and they tell you to change this and do that and whatever.
[01:40:15.000 --> 01:40:17.000] So the question they were looking at was,
[01:40:17.000 --> 01:40:20.000] how substantially do these preprints change
[01:40:20.000 --> 01:40:22.000] in that publication process?
[01:40:22.000 --> 01:40:24.000] So the published version of these papers
[01:40:24.000 --> 01:40:28.000] were substantially altered 50% of the time.
[01:40:28.000 --> 01:40:31.000] Okay, I think that one, I would say that one is true.
[01:40:31.000 --> 01:40:34.000] That one makes sense, you know, once I'm editing,
[01:40:34.000 --> 01:40:37.000] people are giving editorial feedback and that type of thing.
[01:40:37.000 --> 01:40:39.000] That seems to make sense to me.
[01:40:39.000 --> 01:40:42.000] The second one here about that scientists developed
[01:40:42.000 --> 01:40:46.000] a fast and easy way of making tissue optically transparent.
[01:40:46.000 --> 01:40:48.000] I think the key word here is optically,
[01:40:48.000 --> 01:40:50.000] because I bet you that's the trick,
[01:40:50.000 --> 01:40:52.000] is it might not be to the human eye,
[01:40:52.000 --> 01:40:54.000] but there might be a way for them to use
[01:40:54.000 --> 01:40:56.000] some type of machine that can do it.
[01:40:56.000 --> 01:40:59.000] Well, I'll just tell you, that word is there
[01:40:59.000 --> 01:41:01.000] to tell you that it's not a trick.
[01:41:01.000 --> 01:41:04.000] It's optically transparent, as opposed to being transparent
[01:41:04.000 --> 01:41:06.000] in X-rays or in...
[01:41:06.000 --> 01:41:08.000] Yeah, so like under a microscope.
[01:41:08.000 --> 01:41:09.000] Yeah.
[01:41:09.000 --> 01:41:10.000] To the eye, basically.
[01:41:10.000 --> 01:41:13.000] This is in optical frequencies, in visible spectrum.
[01:41:13.000 --> 01:41:15.000] So it's exactly opposite of what I just said.
[01:41:15.000 --> 01:41:17.000] Okay, that's fine.
[01:41:17.000 --> 01:41:19.000] Hey, man, I'm trying to get data out of Steve
[01:41:19.000 --> 01:41:20.000] whether he knows...
[01:41:20.000 --> 01:41:21.000] You're the first one.
[01:41:21.000 --> 01:41:22.000] You're the first one.
[01:41:22.000 --> 01:41:25.000] I always try to make these as concise and precise as possible.
[01:41:25.000 --> 01:41:28.000] And so if you're like significantly misinterpreting
[01:41:28.000 --> 01:41:31.000] what I wrote, I'm happy to correct it in the first go around.
[01:41:31.000 --> 01:41:34.000] All right, so this means that if they don't need to...
[01:41:34.000 --> 01:41:39.000] If the tissue can actually be alive when it's transparent,
[01:41:39.000 --> 01:41:41.000] that they could make people invisible.
[01:41:41.000 --> 01:41:42.000] That's what you're saying.
[01:41:42.000 --> 01:41:44.000] That's what you're saying, Steve.
[01:41:44.000 --> 01:41:45.000] Okay, I'm just pointing that out.
[01:41:45.000 --> 01:41:48.000] Okay, the third one, comprehensive meta-analysis.
[01:41:48.000 --> 01:41:51.000] Researchers find that women have advantages over men
[01:41:51.000 --> 01:41:54.000] in phonemic fluency, verbal memory, and verbal recognition,
[01:41:54.000 --> 01:41:57.000] and that this advantage is stable over 50 years of research.
[01:41:57.000 --> 01:42:00.000] Wow, that's cool.
[01:42:00.000 --> 01:42:04.000] So, I mean, I have no reason to think that the third one
[01:42:04.000 --> 01:42:07.000] has something fishy about it either.
[01:42:07.000 --> 01:42:09.000] I don't know.
[01:42:09.000 --> 01:42:11.000] Women have advantages over men.
[01:42:11.000 --> 01:42:15.000] That's a very unscientific thing there.
[01:42:15.000 --> 01:42:17.000] What's the advantage?
[01:42:17.000 --> 01:42:19.000] Women have better phonemic fluency.
[01:42:19.000 --> 01:42:20.000] How much better?
[01:42:20.000 --> 01:42:22.000] How much better do they perform?
[01:42:22.000 --> 01:42:24.000] He's not going to tell us that.
[01:42:24.000 --> 01:42:27.000] Well, you see?
[01:42:27.000 --> 01:42:29.000] You see, Cara?
[01:42:29.000 --> 01:42:31.000] I just don't like the way this one is worded.
[01:42:31.000 --> 01:42:34.000] I think this one is made to be vague deliberately,
[01:42:34.000 --> 01:42:36.000] and I'm going to say it's the fake.
[01:42:36.000 --> 01:42:37.000] Okay, Evan.
[01:42:37.000 --> 01:42:40.000] Yeah, I think I'm in line with Jay here on this particular one.
[01:42:40.000 --> 01:42:42.000] I think with the COVID-19 preprints,
[01:42:42.000 --> 01:42:45.000] 50% were substantially altered.
[01:42:45.000 --> 01:42:48.000] Yeah, I would like to know how,
[01:42:48.000 --> 01:42:52.000] for things that are not COVID-19 studies,
[01:42:52.000 --> 01:42:57.000] is this different from other medical topics other than COVID-19,
[01:42:57.000 --> 01:42:59.000] or is this specific to COVID-19?
[01:42:59.000 --> 01:43:02.000] But I have a feeling this one's going to turn out to be right.
[01:43:02.000 --> 01:43:07.000] Then the second one about the method for making tissue optically transparent.
[01:43:07.000 --> 01:43:12.000] Sure, I didn't have a problem with this one per se.
[01:43:12.000 --> 01:43:14.000] There's nothing in there, I don't think,
[01:43:14.000 --> 01:43:18.000] that threw me off, but the last one throws me off.
[01:43:18.000 --> 01:43:21.000] The question, I think, boils down to why.
[01:43:21.000 --> 01:43:25.000] Why would this be the case that women would have advantages over men
[01:43:25.000 --> 01:43:27.000] in all these areas?
[01:43:27.000 --> 01:43:31.000] Maybe one, but three very distinct things,
[01:43:31.000 --> 01:43:35.000] phonemic fluency, verbal memory, and verbal recognition.
[01:43:35.000 --> 01:43:38.000] I don't see that being the case, so I'm with Jay on that fiction.
[01:43:38.000 --> 01:43:39.000] All right, Bob.
[01:43:39.000 --> 01:43:41.000] All right, so for the COVID-19,
[01:43:41.000 --> 01:43:45.000] 50% substantially altered changes to effects.
[01:43:45.000 --> 01:43:48.000] I mean, that seems like some p-hacking.
[01:43:48.000 --> 01:43:51.000] Would that be an accurate description?
[01:43:51.000 --> 01:43:54.000] So you're not answering me, I'm just going to go with it.
[01:43:54.000 --> 01:43:58.000] Yeah, I mean, it's very disappointing.
[01:43:58.000 --> 01:44:02.000] Once I get past the first person, I'm going to get extremely...
[01:44:02.000 --> 01:44:03.000] Yeah, I hear you.
[01:44:03.000 --> 01:44:05.000] It was worth a shot anyway.
[01:44:05.000 --> 01:44:07.000] If you don't ask, you don't get.
[01:44:07.000 --> 01:44:11.000] The third one, you know, it could be a minor advantage,
[01:44:11.000 --> 01:44:16.000] a small advantage over men in phonemic fluency, et cetera.
[01:44:16.000 --> 01:44:18.000] Yeah, if it's not egregious.
[01:44:18.000 --> 01:44:20.000] The one that was really rubbing me the wrong way, though,
[01:44:20.000 --> 01:44:24.000] was the second one, the rapid, simple, and effective method
[01:44:24.000 --> 01:44:27.000] for making tissue optically transparent.
[01:44:27.000 --> 01:44:29.000] I mean, wouldn't that require some fundamental changes
[01:44:29.000 --> 01:44:32.000] to the tissue in question here?
[01:44:32.000 --> 01:44:35.000] There may be some weird trick that they got.
[01:44:35.000 --> 01:44:38.000] But whatever, I'm going to say that's fiction anyway.
[01:44:38.000 --> 01:44:39.000] Go out on my own.
[01:44:39.000 --> 01:44:41.000] Okay, and Cara.
[01:44:41.000 --> 01:44:42.000] Cara, what say you?
[01:44:42.000 --> 01:44:44.000] I think I'm going to go out on my own too.
[01:44:44.000 --> 01:44:45.000] Oh, good.
[01:44:45.000 --> 01:44:47.000] Oh, you think it's the COVID one.
[01:44:47.000 --> 01:44:50.000] Yeah, so the reason that I feel...
[01:44:50.000 --> 01:44:55.000] Okay, so I totally buy the one that both Jay and Evan said
[01:44:55.000 --> 01:44:57.000] was fiction, I totally think is science,
[01:44:57.000 --> 01:45:00.000] that women have more verbal fluency or verbal memory,
[01:45:00.000 --> 01:45:02.000] verbal recognition, phonemic fluency.
[01:45:02.000 --> 01:45:04.000] There's a reason that when we give psychological tests
[01:45:04.000 --> 01:45:06.000] that we have gender norms.
[01:45:06.000 --> 01:45:08.000] There are differences among genders.
[01:45:08.000 --> 01:45:13.000] That doesn't necessarily mean that it's biologically innate.
[01:45:13.000 --> 01:45:15.000] It could also be learned culturally.
[01:45:15.000 --> 01:45:18.000] But regardless, I would not be surprised that we see
[01:45:18.000 --> 01:45:20.000] a significant, I don't know if it's huge, like Bob said,
[01:45:20.000 --> 01:45:22.000] it might be small, but a significant difference
[01:45:22.000 --> 01:45:26.000] between genders in verbal facets.
[01:45:26.000 --> 01:45:28.000] And Evan, I don't think those three things
[01:45:28.000 --> 01:45:29.000] are that wildly different.
[01:45:29.000 --> 01:45:32.000] They're all in a kind of similar part of the brain.
[01:45:32.000 --> 01:45:36.000] And then, Bob, you said the tissue being optically transparent.
[01:45:36.000 --> 01:45:38.000] Well, tissue is optically transparent
[01:45:38.000 --> 01:45:41.000] if you slice it thin enough, like we know this.
[01:45:41.000 --> 01:45:44.000] Like any sort of microscope slide you can see.
[01:45:44.000 --> 01:45:46.000] Yeah, but including entire organs.
[01:45:46.000 --> 01:45:48.000] Right, so that's the difference here, right?
[01:45:48.000 --> 01:45:50.000] But if you think about it, what gives tissue pigment,
[01:45:50.000 --> 01:45:52.000] it's literally just pigment.
[01:45:52.000 --> 01:45:54.000] It's pigmentation in little organelles.
[01:45:54.000 --> 01:45:56.000] And so if you could get rid of blood,
[01:45:56.000 --> 01:45:58.000] if you could get rid of certain pigments
[01:45:58.000 --> 01:46:01.000] within different tissues, I think you could make it.
[01:46:01.000 --> 01:46:03.000] And also this doesn't say it's in vivo.
[01:46:03.000 --> 01:46:05.000] Like this could be dead tissue.
[01:46:05.000 --> 01:46:07.000] But then it's easy.
[01:46:07.000 --> 01:46:09.000] You could do anything to it to make it clear
[01:46:09.000 --> 01:46:11.000] so that you could look at it.
[01:46:11.000 --> 01:46:13.000] That one has too many caveats.
[01:46:13.000 --> 01:46:16.000] The COVID-19 preprints is freaking me out.
[01:46:16.000 --> 01:46:20.000] 50% substantially altered, including changes to effect sizes,
[01:46:20.000 --> 01:46:24.000] the actual data set, and statistical significance.
[01:46:24.000 --> 01:46:28.000] First of all, we would be shutting down these preprints
[01:46:28.000 --> 01:46:30.000] if that were the case.
[01:46:30.000 --> 01:46:33.000] We would be like, the preprint system isn't working,
[01:46:33.000 --> 01:46:36.000] half of the studies that are pre-published are not holding up,
[01:46:36.000 --> 01:46:39.000] and they're having to make significant changes to the research
[01:46:39.000 --> 01:46:42.000] before the editors are allowing them to be good enough to publish.
[01:46:42.000 --> 01:46:44.000] It just doesn't work that way.
[01:46:44.000 --> 01:46:46.000] I mean, when you say it like that, you know.
[01:46:46.000 --> 01:46:49.000] Yeah, I feel like if something's good enough to be published,
[01:46:49.000 --> 01:46:52.000] you're making minor changes to appease the editors at that point.
[01:46:52.000 --> 01:46:56.000] You're not like adding whole parts of your data set
[01:46:56.000 --> 01:46:58.000] or completely changing.
[01:46:58.000 --> 01:47:01.000] Like all that stuff is supposed to be fixed in advance.
[01:47:01.000 --> 01:47:04.000] That's why we register studies.
[01:47:04.000 --> 01:47:06.000] So you can't after the fact go through and change.
[01:47:06.000 --> 01:47:10.000] That is the definition of p-hacking, or one definition of p-hacking, Bob.
[01:47:10.000 --> 01:47:13.000] So I don't think, like, if it was a 50% substantial change,
[01:47:13.000 --> 01:47:16.000] we'd be p-hacking between the preprint and the publication.
[01:47:16.000 --> 01:47:18.000] That's the wrong direction.
[01:47:18.000 --> 01:47:21.000] Like, no good journal editor is going to be like,
[01:47:21.000 --> 01:47:24.000] how about you add some shit so we can falsify this?
[01:47:24.000 --> 01:47:26.000] I just don't think this one is science.
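(An editor's illustration, not from the episode: a minimal Python simulation of the problem being described here. If you keep adding data and re-testing until you cross p < 0.05, even pure noise becomes "significant" far more than 5% of the time. All sample sizes and thresholds below are arbitrary choices for the sketch.)

<syntaxhighlight lang="python">
# Optional stopping: keep adding subjects and re-testing until "significant".
# The null hypothesis is true by construction, so every hit is a false positive.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def peek_until_significant(start=20, step=10, max_n=200):
    x = rng.normal(0.0, 1.0, start)            # true mean is exactly 0
    while len(x) <= max_n:
        if stats.ttest_1samp(x, 0.0).pvalue < 0.05:
            return True                         # "significant" by persistence
        x = np.append(x, rng.normal(0.0, 1.0, step))
    return False

trials = 1000
hits = sum(peek_until_significant() for _ in range(trials))
print(f"false-positive rate with peeking: {hits / trials:.0%}")  # well above 5%
</syntaxhighlight>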
[01:47:26.000 --> 01:47:28.000] What about influence from the previous administration?
[01:47:28.000 --> 01:47:30.000] Right. I mean, there is that too.
[01:47:30.000 --> 01:47:33.000] But COVID was mostly, well, yeah.
[01:47:33.000 --> 01:47:35.000] I don't think so, though, because a preprint is a preprint.
[01:47:35.000 --> 01:47:38.000] You can just throw it up there. It's not peer-reviewed.
[01:47:38.000 --> 01:47:40.000] Yeah, this one bothers me.
[01:47:40.000 --> 01:47:42.000] I feel like that has to be the fiction.
[01:47:42.000 --> 01:47:44.000] And if it's not, I'm going to be real sad
[01:47:44.000 --> 01:47:48.000] for the scientific field as a whole, biomedical science.
[01:47:48.000 --> 01:47:50.000] The publishing field, yeah.
[01:47:50.000 --> 01:47:52.000] Okay, so you guys are all spread out.
[01:47:52.000 --> 01:47:54.000] Why don't we take this in reverse order?
[01:47:54.000 --> 01:47:56.000] No, no.
[01:47:56.000 --> 01:47:58.000] So Jay and Evan, you think the third one is the fiction.
[01:47:58.000 --> 01:48:01.000] Not anymore, now that you've taken the third guy's screen.
[01:48:01.000 --> 01:48:04.000] Scientists, researchers find that women have advantages over men
[01:48:04.000 --> 01:48:07.000] in phonemic fluency, verbal memory, and verbal recognition,
[01:48:07.000 --> 01:48:10.000] and that this advantage is stable over 50 years of research
[01:48:10.000 --> 01:48:12.000] and over a participant's lifetime.
[01:48:12.000 --> 01:48:14.000] You two think this one is the fiction.
[01:48:14.000 --> 01:48:16.000] Bob and Cara think this one is science.
[01:48:16.000 --> 01:48:19.000] And this one is science.
[01:48:19.000 --> 01:48:20.000] This is science.
[01:48:20.000 --> 01:48:21.000] Yeah, Bob!
[01:48:21.000 --> 01:48:22.000] Yeah.
[01:48:22.000 --> 01:48:26.000] This is the first meta-analysis of this research, though, since 1988.
[01:48:26.000 --> 01:48:27.000] So it's been a while.
[01:48:27.000 --> 01:48:28.000] Jeez, it took so long.
[01:48:28.000 --> 01:48:30.000] So they basically did a really comprehensive meta-analysis,
[01:48:30.000 --> 01:48:32.000] and that's what they found.
[01:48:32.000 --> 01:48:34.000] But you guys, those of you who said this are correct,
[01:48:34.000 --> 01:48:37.000] the advantage is very small, but it's statistically significant,
[01:48:37.000 --> 01:48:41.000] and it's very consistent across research.
[01:48:41.000 --> 01:48:46.000] Now, this is the kind of thing where it's like a bimodal distribution
[01:48:46.000 --> 01:48:49.000] where the differences within a gender is going to be a lot greater
[01:48:49.000 --> 01:48:52.000] than the statistical difference between the two genders,
[01:48:52.000 --> 01:48:54.000] but there is a statistical difference there.
[01:48:54.000 --> 01:48:57.000] And essentially, this is sort of the conventional wisdom
[01:48:57.000 --> 01:49:00.000] that women have more verbal fluency than guys do, basically.
[01:49:00.000 --> 01:49:02.000] And they wanted to find out, is this really true,
[01:49:02.000 --> 01:49:04.000] or is it just one of those things that people believe,
[01:49:04.000 --> 01:49:06.000] but it's not really true?
[01:49:06.000 --> 01:49:08.000] But yeah, there is actually a little bit of an effect there.
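(An editor's illustration, not the study's data: what a real but tiny group difference looks like. The effect size d = 0.1 is assumed purely for the sketch; the point is that the two distributions overlap almost completely, so within-group variation dwarfs the between-group gap.)

<syntaxhighlight lang="python">
# Two overlapping normal distributions separated by a tiny effect size.
import numpy as np

rng = np.random.default_rng(1)
d = 0.1                                      # assumed standardized effect size
women = rng.normal(d, 1.0, 175_000)          # simulated standardized verbal scores
men = rng.normal(0.0, 1.0, 175_000)

gap = women.mean() - men.mean()              # between-group difference, in SD units
men_above = np.mean(men > np.median(women))  # men scoring above the median woman
print(f"gap: {gap:.2f} SD; men above the female median: {men_above:.0%}")
# Prints a gap of about 0.10 SD, with roughly 46% of men above the female median.
</syntaxhighlight>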
[01:49:08.000 --> 01:49:12.000] What's really interesting, they also found that when you look
[01:49:12.000 --> 01:49:15.000] at the individual papers, when you break them up,
[01:49:15.000 --> 01:49:20.000] the advantages were greater for women in papers
[01:49:20.000 --> 01:49:22.000] where the lead author was a woman,
[01:49:22.000 --> 01:49:27.000] and they were smaller in papers where the lead author was a man.
[01:49:27.000 --> 01:49:31.000] Right, and so that makes you wonder, is that a positive or a negative bias?
[01:49:31.000 --> 01:49:35.000] Are the men minimizing the difference, or are the women magnifying the difference?
[01:49:35.000 --> 01:49:37.000] Or both, they're not.
[01:49:37.000 --> 01:49:39.000] So there's definitely some researcher bias,
[01:49:39.000 --> 01:49:43.000] probably also some publication bias as well in the data set,
[01:49:43.000 --> 01:49:46.000] but it holds out across the totality of research.
[01:49:46.000 --> 01:49:50.000] Also, here's the other thing, the effect size was greater
[01:49:50.000 --> 01:49:53.000] in published studies than in unpublished studies.
[01:49:53.000 --> 01:49:56.000] So that's the publication bias.
[01:49:56.000 --> 01:50:00.000] So yeah, I just love the fact that those types of analyses are now fairly routine.
[01:50:00.000 --> 01:50:02.000] You have to look at data that way.
[01:50:02.000 --> 01:50:05.000] But even when you account for all of that, there's still a tiny effect there.
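(An editor's illustration, not the paper's method: a toy simulation of the publication bias Steve mentions. If only studies that reach significance get published, the average published effect size overstates the true effect. The true effect, sample sizes, and study count here are all assumptions for the sketch.)

<syntaxhighlight lang="python">
# Simulate many small studies of the same true effect, then "publish"
# only the significant ones and compare the two average effect estimates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
true_d, n, studies = 0.1, 50, 2000           # small true effect, modest samples

all_estimates, published = [], []
for _ in range(studies):
    g1 = rng.normal(true_d, 1.0, n)
    g2 = rng.normal(0.0, 1.0, n)
    d_hat = g1.mean() - g2.mean()            # estimated effect for this study
    all_estimates.append(d_hat)
    if stats.ttest_ind(g1, g2).pvalue < 0.05:
        published.append(d_hat)              # only significant results "get published"

print(f"true d: {true_d}")
print(f"mean over all studies:      {np.mean(all_estimates):.2f}")
print(f"mean over 'published' only: {np.mean(published):.2f}")  # inflated
</syntaxhighlight>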
[01:50:05.000 --> 01:50:08.000] It's cool, though, that there's so much data on certain topics
[01:50:08.000 --> 01:50:10.000] that have just been really interesting for years
[01:50:10.000 --> 01:50:13.000] that we can really look at it in a super clean way now.
[01:50:13.000 --> 01:50:18.000] These were 500 measures with 350,000 participants
[01:50:18.000 --> 01:50:21.000] in the total analysis.
[01:50:21.000 --> 01:50:25.000] All right, so it's between Bob and Cara.
[01:50:25.000 --> 01:50:27.000] We go to number two.
[01:50:27.000 --> 01:50:30.000] Scientists have developed a simple, rapid, and effective method
[01:50:30.000 --> 01:50:35.000] for making tissue optically transparent, including entire organs.
[01:50:35.000 --> 01:50:37.000] Bob, you think this one is the fiction.
[01:50:37.000 --> 01:50:39.000] Everyone else thought this one was science.
[01:50:39.000 --> 01:50:42.000] And this one is science.
[01:50:42.000 --> 01:50:44.000] Sorry, Bob.
[01:50:44.000 --> 01:50:46.000] Good job, Cara.
[01:50:46.000 --> 01:50:50.000] But Cara only won because she has greater verbal fluency than the guys do.
[01:50:50.000 --> 01:50:54.000] Well, clearly that's recent analysis.
[01:50:54.000 --> 01:50:56.000] Yeah, Jay, there's nowhere to say that they're alive.
[01:50:56.000 --> 01:50:59.000] This is not a living organism.
[01:50:59.000 --> 01:51:03.000] This is an organ that you removed.
[01:51:03.000 --> 01:51:06.000] So you could have an invisible zombie then, or an invisible corpse.
[01:51:06.000 --> 01:51:08.000] Yeah, you could have an invisible zombie.
[01:51:08.000 --> 01:51:09.000] It could be reanimated.
[01:51:09.000 --> 01:51:10.000] Good recovery, yeah.
[01:51:10.000 --> 01:51:14.000] And I believe they were mostly using mouse tissue here, or mouse organs.
[01:51:14.000 --> 01:51:17.000] And yeah, this was possible previously,
[01:51:17.000 --> 01:51:19.000] but it was really expensive and complicated,
[01:51:19.000 --> 01:51:23.000] and required specialized equipment, and used hazardous organic solvents.
[01:51:23.000 --> 01:51:26.000] So they basically found a simple method for taking away the bits
[01:51:26.000 --> 01:51:29.000] that make it opaque.
[01:51:29.000 --> 01:51:30.000] Visoray?
[01:51:30.000 --> 01:51:37.000] The key is rendering the whole organs transparent
[01:51:37.000 --> 01:51:40.000] without disrupting their architecture,
[01:51:40.000 --> 01:51:44.000] the relationship of tissue to...
[01:51:44.000 --> 01:51:47.000] They're removing proteins basically from the tissue,
[01:51:47.000 --> 01:51:51.000] but they were able to do it without disrupting the architecture.
[01:51:51.000 --> 01:51:52.000] That was the key.
[01:51:52.000 --> 01:51:54.000] And it's cheap and safe.
[01:51:54.000 --> 01:51:55.000] What kind of proteins?
[01:51:55.000 --> 01:51:58.000] I mean, it's basically mostly proteins, right?
[01:51:58.000 --> 01:52:01.000] The ones that specifically are not transparent,
[01:52:01.000 --> 01:52:04.000] the ones that are pigmented.
[01:52:04.000 --> 01:52:07.000] They call their method EZClear.
[01:52:07.000 --> 01:52:08.000] That's branding.
[01:52:08.000 --> 01:52:10.000] EZClear.
[01:52:10.000 --> 01:52:13.000] The EZClear method of organ preparation.
[01:52:13.000 --> 01:52:14.000] Have you lost an organ recently?
[01:52:14.000 --> 01:52:15.000] Well, we've got a solution.
[01:52:15.000 --> 01:52:16.000] The EZ method.
[01:52:16.000 --> 01:52:20.000] It's like a commercial on the Simpsons, or I guess Futurama.
[01:52:20.000 --> 01:52:23.000] Dr. Nick Riviera suggests.
[01:52:23.000 --> 01:52:26.000] There's a video on the link that I'll provide.
[01:52:26.000 --> 01:52:27.000] It's cool.
[01:52:27.000 --> 01:52:30.000] You can see it's a mouse eye that's completely transparent.
[01:52:30.000 --> 01:52:31.000] You can see all the blood vessels,
[01:52:31.000 --> 01:52:35.000] but everything looks shadowy, like ghostly.
[01:52:35.000 --> 01:52:39.000] Okay, this all means that a review of COVID-19-related preprints
[01:52:39.000 --> 01:52:41.000] that were later published in peer-reviewed journals finds
[01:52:41.000 --> 01:52:43.000] that 50% were substantially altered,
[01:52:43.000 --> 01:52:46.000] including changes to effect sizes, the data used,
[01:52:46.000 --> 01:52:49.000] and statistical significance is the fiction,
[01:52:49.000 --> 01:52:51.000] because thank God it's the fiction.
[01:52:51.000 --> 01:52:52.000] I know.
[01:52:52.000 --> 01:52:58.000] If this were true, then, of course, this would lead to,
[01:52:58.000 --> 01:53:01.000] I think, a change in the preprint system.
[01:53:01.000 --> 01:53:07.000] What they found, actually, was that 90% was unchanged.
[01:53:07.000 --> 01:53:09.000] Thank goodness.
[01:53:09.000 --> 01:53:13.000] So they were looking at data points.
[01:53:13.000 --> 01:53:15.000] It's not just 90% of studies.
[01:53:15.000 --> 01:53:18.000] It's 90% of the data points across the studies that they looked at
[01:53:18.000 --> 01:53:20.000] were unchanged.
[01:53:20.000 --> 01:53:25.000] About 10% were changed, but the changes had no effect on statistical significance.
[01:53:25.000 --> 01:53:27.000] So none of them would have changed
[01:53:27.000 --> 01:53:30.000] whether or not the paper was statistically significant or not.
[01:53:30.000 --> 01:53:35.000] Basically, a lot of them added data that wasn't there on the preprint,
[01:53:35.000 --> 01:53:37.000] so that counted as well.
[01:53:37.000 --> 01:53:39.000] So that was just, hey, we have more data, so they added it.
[01:53:39.000 --> 01:53:41.000] And sometimes the data was removed,
[01:53:41.000 --> 01:53:44.000] so they may have had some problematic data where they said,
[01:53:44.000 --> 01:53:46.000] eh, let's see.
[01:53:46.000 --> 01:53:49.000] So essentially, they tightened up.
[01:53:49.000 --> 01:53:54.000] About 7% of the papers had some tightening up of the data
[01:53:54.000 --> 01:53:56.000] after peer review.
[01:53:56.000 --> 01:53:58.000] Changes in the actual estimates were minor
[01:53:58.000 --> 01:54:00.000] and statistically insignificant.
[01:54:00.000 --> 01:54:02.000] So that's all very good news.
[01:54:02.000 --> 01:54:04.000] Yes, it is, especially during COVID
[01:54:04.000 --> 01:54:07.000] when we know there was like a mad dash to publish
[01:54:07.000 --> 01:54:09.000] that it still held.
[01:54:09.000 --> 01:54:11.000] Yeah, and that's what kind of threw me off on that one in a sense,
[01:54:11.000 --> 01:54:15.000] was that, okay, this is all new, therefore they got a bunch of information up
[01:54:15.000 --> 01:54:18.000] and they realized, oh, no, we have to change a bunch of stuff.
[01:54:18.000 --> 01:54:23.000] But the first preprint server was the physicists' archive,
[01:54:23.000 --> 01:54:27.000] the arXiv. When was that created?
[01:54:27.000 --> 01:54:30.000] Well, I know that the paper that I referenced that I was on
[01:54:30.000 --> 01:54:34.000] with a bunch of physicists in the neuroscience lab earlier in the show,
[01:54:34.000 --> 01:54:38.000] that was in the archive, and that was published in like 2006.
[01:54:38.000 --> 01:54:39.000] 1991.
[01:54:39.000 --> 01:54:41.000] Yeah, okay, it's old.
[01:54:41.000 --> 01:54:43.000] They've been doing it for 30 years.
[01:54:43.000 --> 01:54:45.000] CompuServe email address.
[01:54:45.000 --> 01:54:47.000] But it's been slowly spreading to other disciplines.
[01:54:47.000 --> 01:54:50.000] Now, the COVID-19 preprints were created specifically
[01:54:50.000 --> 01:54:54.000] because they wanted to get studies out quickly
[01:54:54.000 --> 01:54:58.000] so that clinicians could make decisions based upon it,
[01:54:58.000 --> 01:55:00.000] but also that other researchers would know where to go.
[01:55:00.000 --> 01:55:05.000] Like we wanted to accelerate COVID-19 research as much as possible,
[01:55:05.000 --> 01:55:09.000] and the publication process could take one to two years easily,
[01:55:09.000 --> 01:55:11.000] and so they didn't want things to be slowed down that much.
[01:55:11.000 --> 01:55:14.000] So it's good to look back and go, okay, that was a good thing,
[01:55:14.000 --> 01:55:16.000] and the information's mostly valid.
[01:55:16.000 --> 01:55:19.000] Yeah, so the preprint system worked.
{{anchor|qow}} <!-- leave this anchor directly above the corresponding section that follows -->
== Skeptical Quote of the Week <small>()</small> ==
<!--
This transcript is not finished. Please help us finish it! Add a Transcribing template to the top of this transcript before you start so that we don't duplicate your efforts.
This episode needs: transcription, time stamps, formatting, links, 'Today I Learned' list, categories, segment redirects. Please help out by contributing!
How to Contribute
This is an outline for a typical episode's transcription. Not all of these segments feature in each episode.
There may also be additional/special segments not listed in this outline.
You can use this outline to help structure the transcription. Click "Edit" above to begin.
SGU Episode 901
October 15th 2022
Wreckage of TWA Flight 800[1]

Skeptical Rogues
S: Steven Novella
B: Bob Novella
C: Cara Santa Maria
J: Jay Novella
E: Evan Bernstein

Quote of the Week
We have learned in recent years that the techniques of misinformation and misdirection have become so refined that, even in an open society, a cleverly directed flood of misinformation can overwhelm the truth, even though the truth is out there, uncensored, quietly available to anyone who can find it.
Daniel Dennett, American philosopher

Links
Download Podcast
Show Notes
[https://sguforums.org/index.php?BOARD=1.0 Forum Discussion]
== Introduction, Alex Jones lawsuit, conspiracy thinking ==
Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.
[00:09.000 --> 00:13.000] Hello and welcome to the Skeptics' Guide to the Universe.
[00:13.000 --> 00:18.000] Today is Thursday, October 13th, 2022, and this is your host, Steven Novella.
[00:18.000 --> 00:20.000] Joining me this week are Bob Novella.
[00:20.000 --> 00:21.000] Hey everybody.
[00:21.000 --> 00:22.000] Cara Santa Maria.
[00:22.000 --> 00:23.000] Howdy.
[00:23.000 --> 00:24.000] Jay Novella.
[00:24.000 --> 00:25.000] Hey guys.
[00:25.000 --> 00:26.000] And Evan Bernstein.
[00:26.000 --> 00:27.000] October 13th?
[00:27.000 --> 00:28.000] Happy Skeptic's Day!
[00:28.000 --> 00:30.000] Happy Skeptic's Day, everyone.
[00:30.000 --> 00:31.000] Cool.
[00:31.000 --> 00:32.000] There's a Skeptic's Day?
[00:32.000 --> 00:33.000] Yeah, we have one.
[00:33.000 --> 00:34.000] There is.
[00:34.000 --> 00:37.000] Is it self-proclaimed, or is there any officiality to it, or what's going on?
[00:37.000 --> 00:41.000] International Skeptic's Day is observed on October 13th every year,
[00:41.000 --> 00:45.000] also known as World Skeptic's Day or International Day of Scientific Skepticism.
[00:45.000 --> 00:47.000] All right, I'll buy that for a dollar.
[00:47.000 --> 00:51.000] My question is, is there any officiality to the word officiality?
[00:51.000 --> 00:54.000] It is a coincidence, but it feels like it isn't.
[00:54.000 --> 01:00.000] But we had a skeptical triumph happen this past week with Alex Jones getting screwed.
[01:00.000 --> 01:01.000] Yeah.
[01:01.000 --> 01:07.000] Yeah, one billion, basically almost a billion, a little bit less than a billion dollar judgment against him.
[01:07.000 --> 01:10.000] I mean, there's no way he's actually going to pay that money.
[01:10.000 --> 01:12.000] But that's the kind of judgment you like to see.
[01:12.000 --> 01:13.000] It's not a slap on the hand.
[01:13.000 --> 01:15.000] It's not the cost of doing business.
[01:15.000 --> 01:20.000] It is, you ruined these people's lives because you're a dick, and now we're going to ruin your life.
[01:20.000 --> 01:22.000] That's completely fair.
[01:22.000 --> 01:28.000] And so this was, just to get into more detail, this was the judgment for the lawsuits in Connecticut.
[01:28.000 --> 01:32.000] Remember, there's separate lawsuits in Texas, and Texas has different laws,
[01:32.000 --> 01:36.000] you know, that limit the amount of money you can get for punitive damages.
[01:36.000 --> 01:38.000] But apparently Connecticut, there aren't such limitations.
[01:38.000 --> 01:43.000] So, you know, because it's not just like this is the harm you caused to these families.
[01:43.000 --> 01:51.000] It's also because you're such a jerk, we're going to have to, we're going to magnify it just for punitive damages.
[01:51.000 --> 01:58.000] And Connecticut has other laws that could be triggered, and the final judgment could be ten times that
[01:58.000 --> 02:01.000] because of multipliers that could get added in.
[02:01.000 --> 02:03.000] No way. Really?
[02:03.000 --> 02:07.000] I mean, we're way past the money that he, the resources that he has.
[02:07.000 --> 02:10.000] Yeah, it's like a 500-year jail sentence at this point.
[02:10.000 --> 02:11.000] Yeah, exactly.
[02:11.000 --> 02:13.000] Yeah, exactly, like a 500-year jail sentence.
[02:13.000 --> 02:14.000] But, of course, he's crying poverty.
[02:14.000 --> 02:16.000] He's like, I don't have any money, you know.
[02:16.000 --> 02:24.000] But a forensic economist estimates that he's worth between $135 and $270 million.
[02:24.000 --> 02:25.000] That's something.
[02:25.000 --> 02:29.000] That was Bernard Pettingill Jr., who was the forensic economist.
[02:29.000 --> 02:36.000] So the lawyers for the families are going to have to just pursue him and wring every cent out of him.
[02:36.000 --> 02:37.000] Yeah, for the rest of his life.
[02:37.000 --> 02:38.000] Yeah, basically.
[02:38.000 --> 02:39.000] He's got five houses in Texas.
[02:39.000 --> 02:40.000] Yeah.
[02:40.000 --> 02:45.000] And according to Texas law, I mean, you can't, you know, you can't lose a house that way if you use it for your business,
[02:45.000 --> 02:46.000] which is, all right, whatever.
[02:46.000 --> 02:50.000] So at the very least, he's probably going to lose four of the five houses.
[02:50.000 --> 02:51.000] Yeah.
[02:51.000 --> 02:55.000] And he transferred them over to like his wife or ex-wife or whoever.
[02:55.000 --> 02:56.000] There's been a lot of shenanigans, yeah.
[02:56.000 --> 03:02.000] But there's even ways around that because even the court can say, no, these are fraudulent transfers
[03:02.000 --> 03:05.000] because they were obviously intentionally done to avoid creditors.
[03:05.000 --> 03:06.000] Yeah.
[03:06.000 --> 03:09.000] And he could declare bankruptcy, and he already has.
[03:09.000 --> 03:13.000] He already has declared bankruptcy, and that can delay it.
[03:13.000 --> 03:17.000] But I think the courts will probably not care about that, you know,
[03:17.000 --> 03:22.000] because Jones' actions were willful and malicious enough that they're probably going to say, so what?
[03:22.000 --> 03:24.000] Bankruptcy court's not going to help you.
[03:24.000 --> 03:28.000] And people are saying that, some people, you know, lawyers and stuff are saying,
[03:28.000 --> 03:31.000] this is going to impact his livelihood for the rest of his life.
[03:31.000 --> 03:35.000] He's going to be feeling the pain forever.
[03:35.000 --> 03:38.000] So that's pretty much the best we could hope for.
[03:38.000 --> 03:39.000] That's awesome.
[03:39.000 --> 03:40.000] Yeah.
[03:40.000 --> 03:41.000] Right.
[03:41.000 --> 03:45.000] And the lawyers at this point, once they've won the judgment, you know, it's a huge amount,
[03:45.000 --> 03:48.000] whatever the, you know, with appeals and everything, whatever the final amount turns out to be,
[03:48.000 --> 03:50.000] it's going to be massive.
[03:50.000 --> 03:56.000] So they have every motivation to spend a lot of time and resources bringing money out of him forever,
[03:56.000 --> 03:58.000] you know, because they get some of that money, right?
[03:58.000 --> 04:03.000] They're probably getting 30% or whatever they get out of Jones.
[04:03.000 --> 04:08.000] Yeah, this will also mean that his future endeavors will be tied up in a sense.
[04:08.000 --> 04:12.000] He won't be able to partner with other companies or organizations.
[04:12.000 --> 04:14.000] Obviously, they won't want to.
[04:14.000 --> 04:19.000] But anything, anything that leads him to a new stream of income will automatically be,
[04:19.000 --> 04:21.000] should automatically be tapped, is my understanding.
[04:21.000 --> 04:24.000] So this severely hinders him from doing,
[04:24.000 --> 04:29.000] branching out into future types of businesses or avenues in which he can find new revenue streams.
[04:29.000 --> 04:31.000] No, they could liquidate everything he owns.
[04:31.000 --> 04:38.000] I mean, they could just, it's a huge win and it sends out an incredibly important message,
[04:38.000 --> 04:44.000] which means that you can get yourself in serious legal trouble for spreading misinformation,
[04:44.000 --> 04:48.000] and it can cost you an immense amount of money.
[04:48.000 --> 04:54.000] But yeah, the thing is, though, while that is true, Alex Jones is like the perfect storm because...
[04:54.000 --> 04:56.000] It is. It doesn't get much worse than that.
[04:56.000 --> 05:00.000] I know. I mean, if he couldn't get hit with a big judgment, then the system is broken
[05:00.000 --> 05:02.000] because, you know, he had a terrible defense.
[05:02.000 --> 05:05.000] He didn't even bother to defend himself a lot of the times.
[05:05.000 --> 05:08.000] He was clearly, you know, malicious.
[05:08.000 --> 05:14.000] He's got all this other stuff going on where he says, like, I'm, you know, a character that I play.
[05:14.000 --> 05:19.000] So in terms of, he made it so easy to prove that he was just lying for money
[05:19.000 --> 05:22.000] and he didn't care that he was destroying these people's lives.
[05:22.000 --> 05:26.000] And it's almost like not even just for a secondary gain.
[05:26.000 --> 05:31.000] Like with Alex Jones, there was almost this like extra layer of disgust.
[05:31.000 --> 05:36.000] You know, like we talk about, oh, you can like intentionally mislead or, you know, like Jay,
[05:36.000 --> 05:40.000] I can't remember exactly how you worded it, but you were saying, you know, that somebody can like spread misinformation.
[05:40.000 --> 05:44.000] And it's like with Alex Jones, it was so much more than spreading misinformation.
[05:44.000 --> 05:46.000] It was like akin to like a hate crime.
[05:46.000 --> 05:48.000] He weaponized it.
[05:48.000 --> 05:53.000] Yeah, he weaponized this to harm people who lost their children in a violent setting.
[05:53.000 --> 05:55.000] Like it was disgusting.
[05:55.000 --> 06:01.000] I mean, maybe it was just to make money, but I think it was also a power trip that this guy has.
[06:01.000 --> 06:05.000] Oh, sure. He got caught up in himself in all of this. Absolutely.
[06:05.000 --> 06:08.000] He definitely has an out of control ego.
[06:08.000 --> 06:11.000] I mean, it's borderline if not fully blown narcissism at this point.
[06:11.000 --> 06:15.000] Oh, I'm sure. Yeah, I'm sure it's narcissism, like megalomaniac.
[06:15.000 --> 06:18.000] I mean, it's up there. It's really up there.
[06:18.000 --> 06:22.000] I suppose you can't cop an insanity plea when it comes to these kind of civil suits.
[06:22.000 --> 06:27.000] I don't know that for sure, but wouldn't that have been a tack for him to take legally if that would have been the case?
[06:27.000 --> 06:29.000] He never would have passed that, though, ever.
[06:29.000 --> 06:35.000] If he's got capacity to run a radio show, there is no way that he didn't have capacity.
[06:35.000 --> 06:41.000] Like that's a really high bar to prove that you were incompetent.
[06:41.000 --> 06:49.000] I think not for the judgment, but for the amount, there is going to be mitigating factors and exacerbating factors.
[06:49.000 --> 06:53.000] And he's a big ball of exacerbating factors, basically.
[06:53.000 --> 06:55.000] That's kind of what I'm saying.
[06:55.000 --> 07:03.000] He was like the perfect storm of why a judge would say, I'm going to basically hit you with freaking everything because you deserve it.
[07:03.000 --> 07:05.000] You know, and there's no remorse.
[07:05.000 --> 07:07.000] He's still casting doubt on it.
[07:07.000 --> 07:11.000] He's still saying on his show, like, there's reason to doubt the official story of that.
[07:11.000 --> 07:14.000] You know, it's when he's got really. Yeah, yeah.
[07:14.000 --> 07:16.000] Still going there. Wow.
[07:16.000 --> 07:18.000] When he's got to know that it's BS.
[07:18.000 --> 07:29.000] I mean, it just takes three minutes of thinking halfway reasonably to understand that there's no possible way that this shooting was a hoax.
[07:29.000 --> 07:30.000] Right, right.
[07:30.000 --> 07:35.000] So there's something severely wrong with Alex Jones and how his brain works, I think.
[07:35.000 --> 07:36.000] Or he doesn't believe it.
[07:36.000 --> 07:39.000] The simpler thing is he's a psychopath who doesn't believe it for one second.
[07:39.000 --> 07:40.000] It's all greed.
[07:40.000 --> 07:41.000] He's playing a part.
[07:41.000 --> 07:44.000] And I mean, some people you could always say, do they really believe it or not?
[07:44.000 --> 07:46.000] And most of the time you don't know.
[07:46.000 --> 07:49.000] And clearly a lot of people do believe this crazy stuff.
[07:49.000 --> 07:52.000] But in my opinion, he believes none of this.
[07:52.000 --> 07:58.000] And this is just part of his of his whole scheme of the character he's playing that he's admitted to.
[07:58.000 --> 08:08.000] And this is all the interstitials in between the bits on his show where he begs for money for crappy products, or just to give him money.
[08:08.000 --> 08:10.000] That's all this is.
[08:10.000 --> 08:15.000] And it turned out to be the most expensive piece of role playing in history.
[08:15.000 --> 08:17.000] The greatest cost ever by.
[08:17.000 --> 08:18.000] Well, we say that.
[08:18.000 --> 08:24.000] But honestly, I want to see, in the grand scheme of things, kind of after all the dust has settled,
[08:24.000 --> 08:28.000] how much actually gets collected compared to how much he earned over the course of this career.
[08:28.000 --> 08:33.000] Because I have a feeling he didn't lose everything and he's not going to lose everything.
[08:33.000 --> 08:37.000] And I will be happy to leave him in a trailer park in a T-shirt.
[08:37.000 --> 08:40.000] And really, if you're just looking at it.
[08:40.000 --> 08:46.000] Yeah, even if it's a function of, like, he lost everything, because he also lost a lot of his money over time.
[08:46.000 --> 08:50.000] But what I'm saying is that if we looked like month to month at his earnings,
[08:50.000 --> 08:54.000] I don't think this judgment is going to wipe away every cent he ever made.
[08:54.000 --> 08:57.000] Yeah, well, he's from jobs. He spent a lot of it already.
[08:57.000 --> 09:00.000] He lived his life comfortably for years.
[09:00.000 --> 09:05.000] I've heard estimates of him spending three hundred thousand dollars a month.
[09:05.000 --> 09:07.000] Oh, my God.
[09:07.000 --> 09:10.000] It's amazing that people gave him so much money.
[09:10.000 --> 09:12.000] But do we know what his net worth is?
[09:12.000 --> 09:16.000] Well, that's somewhere between one hundred thirty and two hundred sixty million dollars.
[09:16.000 --> 09:21.000] And he's claiming, I think, what, that he's worth three million, three and a half.
[09:21.000 --> 09:24.000] And he's, you know, one house.
[09:24.000 --> 09:26.000] His watch is worth more than that.
[09:26.000 --> 09:29.000] Yeah. Five houses in Texas.
[09:29.000 --> 09:32.000] That's the other thing. You look at a character like Alex Jones.
[09:32.000 --> 09:35.000] First of all, you listen to him speaking for 30 seconds.
[09:35.000 --> 09:38.000] If you don't immediately react to that,
[09:38.000 --> 09:42.000] you know, listening to him speak by thinking this guy is a raving lunatic.
[09:42.000 --> 09:48.000] Right. If that's not your immediate reaction, you have to seriously rethink your life.
[09:48.000 --> 09:49.000] You know, seriously.
[09:49.000 --> 09:55.000] But the other thing is you think that, all right, so this guy's making millions of dollars selling,
[09:55.000 --> 10:01.000] you know, supplements and whatever prepper crap, you know, selling all this stuff
[10:01.000 --> 10:04.000] that he's then promoting with his conspiracy theories.
[10:04.000 --> 10:06.000] Like, that's the guy I'm going to believe.
[10:06.000 --> 10:12.000] That's the guy, the guy who's making hundreds of millions of dollars off of saying what he's saying.
[10:12.000 --> 10:15.000] He's speaking truth to power. This guy.
[10:15.000 --> 10:20.000] That requires people having the ability to part the curtain, which they can't do.
[10:20.000 --> 10:27.000] Right. And I think that that's I did this series called Pulling the Thread for World Channel PBS,
[10:27.000 --> 10:30.000] where it's like a multipart series about like why people believe in conspiracy theories.
[10:30.000 --> 10:34.000] And one of the episodes focused specifically on this Alex Jones thing.
[10:34.000 --> 10:38.000] And there were some deep dives with some psychologists into the people.
[10:38.000 --> 10:42.000] So not about Alex, whatever, but about the people who actually like believe this conspiracy
[10:42.000 --> 10:44.000] because he has like followers, right?
[10:44.000 --> 10:47.000] And they're not all narcissistic, megalomaniacal people.
[10:47.000 --> 10:50.000] Like some people, it's like, how did you get baited?
[10:50.000 --> 10:52.000] How did you get hooked here?
[10:52.000 --> 10:58.000] And one of the kind of hypotheses, the working hypotheses that these individuals said
[10:58.000 --> 11:07.000] was the idea of the atrocity that occurred at Sandy Hook was so psychologically devastating to some people.
[11:07.000 --> 11:15.000] Like the idea that that random cruelty and violence could exist in the world is so hard for some people to swallow
[11:15.000 --> 11:22.000] that even an extravagant alternative, which is that it was made up, was more palatable for some people.
[11:22.000 --> 11:26.000] And so they got hooked in in that way because it gave order to their chaos.
[11:26.000 --> 11:29.000] And I think you see this a lot with religion as well.
[11:29.000 --> 11:34.000] Things feel too complicated, too dark, too weird, and you get hooked.
[11:34.000 --> 11:38.000] And then the conspiracy becomes complicated and dark and weird, but by then you're in it already.
[11:38.000 --> 11:43.000] Right, right. And you do give money to those kinds of, you know, Peter Popoffs of the world.
[11:43.000 --> 11:49.000] There's a whole list of religious sects out there who have done this over time.
[11:49.000 --> 11:54.000] And they've become multi... I mean, hundreds of millions of dollars poured into those people's pockets.
[11:54.000 --> 11:57.000] So I think you're right. Mainstream religions do the same thing.
[11:57.000 --> 12:02.000] Right. Even like super like the Catholic Church, like, you know, like how much money do they make off of individuals
[12:02.000 --> 12:06.000] and what are they offering people? Like when you really get into the lore, it's bananas.
[12:06.000 --> 12:11.000] But they're not diving into the deep end. They're getting drawn in in the shallow end.
[12:11.000 --> 12:16.000] Oh, yeah. It's like Scientology. They don't come out with Xenu bullshit right out of the gate.
[12:16.000 --> 12:19.000] Yeah, they don't hit you with the heavy stuff.
[12:19.000 --> 12:22.000] You got to level up a few times before you get to that stuff.
[12:22.000 --> 12:26.000] You're already committed in deep by the time you hear that word.
[12:26.000 --> 12:28.000] There's a sunk cost fallacy. Yeah.
[12:28.000 --> 12:30.000] You're literally, yeah, you're pot committed. Yeah.
[12:30.000 --> 12:34.000] Cara, that does make sense. You know, that could rationalize what you're saying there.
[12:34.000 --> 12:42.000] But the reality of that, though, is that people are incapable of facing the seriousness of the world that we live in.
[12:42.000 --> 12:46.000] Oh, yeah. It's existentially too painful for them.
[12:46.000 --> 12:48.000] I think some people that might have been the pathway.
[12:48.000 --> 12:54.000] But I know people who believe that who did not get there that by that pathway, they were already conspiracy loons.
[12:54.000 --> 12:59.000] Right. What happened? So, yeah, there's multiple paths into that rabbit hole.
[12:59.000 --> 13:02.000] But you're right. Once you get into it, it's self reinforcing.
[13:02.000 --> 13:08.000] It's hard to get out of it unless you have critical thinking skills, which these people clearly don't.
[13:08.000 --> 13:10.000] And it's super hard for people on the outside to understand.
[13:10.000 --> 13:13.000] It's like anybody who's found themselves in a cult.
[13:13.000 --> 13:16.000] You look at them and you go, how the hell do you believe what they're saying?
[13:16.000 --> 13:20.000] How did you get in so deep? And it's so hard for us to see it because we're not in it.
[13:20.000 --> 13:25.000] Yeah. The boiling frog, or an abusive relationship, you know.
[13:25.000 --> 13:31.000] Exactly. Because it doesn't. Have you guys seen Maid on Netflix?
[13:31.000 --> 13:33.000] I have not. Oh, it's stunning.
[13:33.000 --> 13:44.000] Highly, highly recommend the series. And there's a perfect example of that, Steve, where, long story short, it's a story of a girl who is in a kind of emotionally abusive relationship and leaves.
[13:44.000 --> 13:46.000] And there's just a lot of mental health issues and stuff.
[13:46.000 --> 13:52.000] She becomes a maid and she's struggling to keep food in her mouth and to keep out of homelessness and take care of her daughter.
[13:52.000 --> 13:57.000] But she's talking to a girl in the in the shelter and she's like, well, he didn't really hit me.
[13:57.000 --> 14:00.000] Like these people are actually abused. Look, she has a black eye.
[14:00.000 --> 14:03.000] He didn't really hit me. And she's like, so he still was abusive.
[14:03.000 --> 14:09.000] And she's like, yeah, but he never hit me. She's like, do you think that my boyfriend on our first date choked me out?
[14:09.000 --> 14:11.000] Like, of course not. I wouldn't have been with him.
[14:11.000 --> 14:21.000] It's like it builds up. It's like a little thing you forgive here, a tiny little violent tendency that you forgive there until you're financially dependent on them and you're in so deep.
[14:21.000 --> 14:26.000] And I think it's the same thing with cults, with conspiracy theories, these little nibbles.
[14:26.000 --> 14:32.000] And then eventually you've given them so much money, you've given them so much time, you're pot committed, like we said.
[14:32.000 --> 14:40.000] And then and then soon they're literally choking you out and you don't even realize it's happening or you realize it's happening, but you don't see an exit.
[14:40.000 --> 14:48.000] And yeah, at some point you'd have to admit so much personal failure that you can't do it, you know.
== Quickie with Bob <small>(14:50)</small> ==
- [link_URL TITLE][2]
[14:48.000 --> 14:50.000] All right, Bob, you're going to start us off with a quickie.
[14:50.000 --> 14:54.000] Thank you, Steve. This is your Quickie with Bob. Gird your loins, people.
[14:54.000 --> 15:01.000] You wouldn't think ship sails could be super high tech and smart for super heavy ships, but they are.
[15:01.000 --> 15:13.000] The China Merchants Energy Shipping Company has rolled out a three hundred and thirty-three meter supertanker with four forty-meter carbon fiber composite blades, or sails.
[15:13.000 --> 15:16.000] Twelve hundred square feet altogether. That's the surface area.
[15:16.000 --> 15:20.000] Now they're retractable at the push of a button, totally computer controlled.
[15:20.000 --> 15:29.000] The system can monitor the weather and navigation data and maximize the use of available wind to pull every ounce of energy from that wind.
[15:29.000 --> 15:32.000] You know, this ship still primarily uses diesel.
[15:32.000 --> 15:38.000] That's you know, that's the mainstay. But estimates say that it could reduce fuel consumption by almost 10 percent,
[15:38.000 --> 15:45.000] which could amount to a whopping twenty nine hundred tons of carbon dioxide not released every trip, every trip.
[15:45.000 --> 15:49.000] Twenty nine hundred, almost three thousand tons of carbon dioxide not released.
[15:49.000 --> 15:51.000] And this is only, you know, with a 10 percent savings.
[15:51.000 --> 15:58.000] Pretty astonishing numbers there, because in terms of climate change and just money,
[15:58.000 --> 16:02.000] that's a lot. And, you know, money speaks, so multiply that savings.
[16:02.000 --> 16:06.000] And this is just basically a test bed. This isn't like a dedicated
[16:06.000 --> 16:12.000] attempt to see how much we could use wind to move the ship instead of the diesel engine.
[16:12.000 --> 16:18.000] So multiply that times some or all of the fifty thousand merchant ships that are plying the waves.
[16:18.000 --> 16:24.000] And that could seriously whack back that one point seven percent of global greenhouse emissions that they all contribute.
[16:24.000 --> 16:28.000] One point seven percent. All of these merchant ships. That's a lot.
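(An editor's arithmetic sketch, taking the segment's figures at face value; the trips-per-year and fleet-adoption numbers are loudly assumed round figures, not reported ones.)

<syntaxhighlight lang="python">
# Rough scale of the savings Bob describes, using the segment's numbers as stated.
co2_saved_per_trip_t = 2_900   # tonnes CO2 avoided per trip (from the segment)
trips_per_year = 6             # assumed: a supertanker makes a handful of long voyages
ships = 50_000                 # merchant ships mentioned in the segment
adoption = 0.10                # assumed: 10% of the fleet gets similar sails

per_ship = co2_saved_per_trip_t * trips_per_year
fleet = per_ship * ships * adoption
print(f"one retrofitted ship: ~{per_ship:,} t CO2/year")
print(f"10% of the fleet:     ~{fleet / 1e6:.0f} million t CO2/year")
</syntaxhighlight>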
[16:28.000 --> 16:33.000] Now, the idea has been attempted. You know, this didn't leap out of a vacuum in the past couple of years.
[16:33.000 --> 16:36.000] It's been talked about going back a couple of decades. Attempts have been made.
[16:36.000 --> 16:40.000] There's been there's some serious other plans that are going on, but they're not, you know,
[16:40.000 --> 16:42.000] they're not going to be real for another few years.
[16:42.000 --> 16:51.000] And this is one of the best attempts at this idea of these of sales of hard sales on big ships to save from using fossil fuels that I've heard.
[16:51.000 --> 16:53.000] It's really intriguing. The ship's cool.
[16:53.000 --> 17:03.000] I mean, the obvious hope here is that this will inspire other ship designers that are also looking not only to help reduce greenhouse gas emissions,
[17:03.000 --> 17:06.000] but also, you know, save some money, too, which is a great combination.
[17:06.000 --> 17:15.000] Now, of course, the mad irony here is that this ship was made to transport not solar panels or even Halloween skulls,
[17:15.000 --> 17:19.000] but two million barrels of fossil fuel all over the planet.
[17:19.000 --> 17:23.000] It brings millions of barrels of this stuff all over.
[17:23.000 --> 17:29.000] So, yeah, this is like, woohoo, here's another drop in the bucket that we're saving here while we're transporting the fossil fuel.
[17:29.000 --> 17:31.000] So, yeah, that irony has not been lost.
[17:31.000 --> 17:35.000] But hopefully it's something that will inspire people in the future.
[17:35.000 --> 17:37.000] And hopefully we'll see a lot more of these.
[17:37.000 --> 17:39.000] I'd love to see how big some of these hard sails can get.
[17:39.000 --> 17:42.000] That'd be pretty cool. So, loins ungirded.
[17:42.000 --> 17:45.000] This has been your Quickie with Bob. I hope it was good for you, too.
[17:45.000 --> 17:49.000] Yeah, it's a good example of how sometimes low-tech things work really well.
[17:49.000 --> 17:53.000] You know, it's basically wind power. It's free.
[17:53.000 --> 18:01.000] Yeah. We know something about using sails and airfoils to push things along the water.
[18:01.000 --> 18:08.000] We've kind of been doing that for thousands of years, longer than these diesel engines or even steam power.
[18:08.000 --> 18:12.000] I mean, we know it. It's something that's not that difficult.
[18:12.000 --> 18:17.000] And I love how this is fully automated. They basically don't need to do anything.
[18:17.000 --> 18:22.000] It automatically extends them and totally makes it as efficient as possible.
[18:22.000 --> 18:25.000] Oh, no one has to call out hoist the mizzen or anything like that.
[18:25.000 --> 18:29.000] Well, that's kind of a shame in that sense.
[18:29.000 --> 18:31.000] But I've been reading about this for years as well.
[18:31.000 --> 18:41.000] I read one proposed design where it's basically like a giant kite that you get way up high enough that you're getting into the real
[18:41.000 --> 18:46.000] wind stream, right? So you get much faster wind. Yeah. And more power. Interesting.
[18:46.000 --> 18:52.000] Yeah. It'd be technically challenging. When you look at this ship, though, the sails look teeny compared to the overall size of the ship.
[18:52.000 --> 18:56.000] They could put bigger ones on there. That's a big ship. They could easily double the number.
[18:56.000 --> 19:00.000] There's enough deck space for it. But OK, let's keep going.
News Items

Brain cells playing Pong (19:00)

[19:00.000 --> 19:08.000] Kara, you're going to start off the full news items. This has been blowing up, and it's a complicated story.
[19:08.000 --> 19:12.000] And, you know, the reporting has been kind of all over the place.
[19:12.000 --> 19:22.000] Let's see if we can unpack this. Is there a clump of brain cells that plays Pong?
[19:22.000 --> 19:26.000] What an exaggeration. Yeah. The short and dirty answer is no.
[19:26.000 --> 19:36.000] But there is a monolayer network of brain cells that is doing something akin to what the researchers are calling simulated Pong.
[19:36.000 --> 19:51.000] So let's unpack this. OK, so, I just want to say this up front: part of the reason this news story really interests me is because back when I was working in neuroscience, this was the area I worked in.
[19:51.000 --> 19:55.000] So these are researchers in Australia who are working in industry.
[19:55.000 --> 20:03.000] The company is called Cortical Labs, working in conjunction with some associated universities, which is not uncommon.
[20:03.000 --> 20:12.000] Their lab is an electrophysiology lab. Electrophysiology: using electricity to understand the physiology of, specifically here, neurons.
[20:12.000 --> 20:16.000] And they are working with I'm going to be throwing around a lot of terms.
[20:16.000 --> 20:27.000] So I'm going to try and define them as I go in vitro nerve cell networks or neuronal networks in vitro, meaning in a dish, not in the body.
[20:27.000 --> 20:31.000] In vivo would be within the animal in vitro in a dish.
[20:31.000 --> 20:38.000] So their platform is very similar to the platform that I used to work with.
[20:38.000 --> 20:46.000] What you do is you take either brain cells from embryonic mice, so mouse pups before they're born.
[20:46.000 --> 20:53.000] Or in this case, they also took a pluripotent stem cell line from human beings; they didn't actually dissect it out of human beings.
[20:53.000 --> 20:58.000] It's a living stem cell line that's kept in freezers and they did both.
[20:58.000 --> 21:05.000] And they took these cells, you go through this long process of dissociating the cells from one another, getting them ready to be plated.
[21:05.000 --> 21:12.000] You stick them on these little glass plates that have microelectrodes, electrodes that you can only see under the microscope, embedded in them.
[21:12.000 --> 21:15.000] And then here's a cool thing that happens, not even part of the study.
[21:15.000 --> 21:19.000] They just grow into a relatively organized network.
[21:19.000 --> 21:23.000] That's just what neurons do, especially when they're at this age.
[21:23.000 --> 21:31.000] So you plate them, and they look like a bunch of little basketballs that light up underneath the microscope, kind of dissociated and free-floating.
[21:31.000 --> 21:36.000] They stick to the plate and they start to grow neurites, which are undifferentiated axons and dendrites.
[21:36.000 --> 21:43.000] And eventually they form these relatively complicated, what we call monolayer, so flat networks.
[21:43.000 --> 21:50.000] And here's the cool thing, the micro electrodes underneath them actually read their activity.
[21:50.000 --> 21:55.000] They can detect the electrical activity, so they can detect the firing of the cells.
[21:55.000 --> 22:03.000] Even though this is happening at microscopic levels, they can detect it and they can amplify the signal and you can literally read it on a computer.
[22:03.000 --> 22:05.000] These go both ways, though.
[22:05.000 --> 22:15.000] They also can induce electrical activity, meaning that there's like a teeny tiny, I wouldn't call it a shock, but a teeny tiny electrical pulse that they can send in.
[22:15.000 --> 22:18.000] And of course, neurons are excitable tissue.
[22:18.000 --> 22:23.000] If you load them up with enough electricity, you can actually cause them to fire on their own.
[22:23.000 --> 22:28.000] So basically their MEA chip, their multielectrode array chip, has both input and output capabilities.
[22:28.000 --> 22:29.000] Does that make sense?
[22:29.000 --> 22:30.000] So we're starting there.
[22:30.000 --> 22:33.000] Okay, so we've got a flat neuronal network on a chip.
[22:33.000 --> 22:36.000] That chip has input and output.
[22:36.000 --> 22:37.000] But this is a classic MEA.
[22:37.000 --> 22:42.000] Like I said, I worked with MEAs in, I think I got my master's in 07.
[22:42.000 --> 22:46.000] Okay, so this research has been going on a really, really long time, so keep that in mind.
[22:46.000 --> 22:48.000] They're calling it DishBrain.
[22:48.000 --> 22:50.000] This is just branding, okay?
[22:50.000 --> 22:56.000] Like I don't even want to use this name because there's a lot of branding in this study and I think it overcomplicates things.
[22:56.000 --> 22:59.000] So I'm going to try and avoid using the like made up jargon that they use.
[22:59.000 --> 23:12.000] So on this electrical system, what they decided they wanted to do is simulate, and that's kind of in all caps, SIMULATE, gameplay.
[23:12.000 --> 23:16.000] So they tried to think what's some of the simplest gameplay that we can simulate?
[23:16.000 --> 23:17.000] Pong.
[23:17.000 --> 23:19.000] I mean, was Pong the first video game?
[23:19.000 --> 23:23.000] It wasn't, no, but it was one of the first commercial video games.
[23:23.000 --> 23:24.000] There you go, okay.
[23:24.000 --> 23:25.000] Yeah.
[23:25.000 --> 23:32.000] It's simple, it's easy to pick up, it's intuitive, and the inputs and outputs aren't overly complicated, right?
[23:32.000 --> 23:43.000] What they did is they simulated this by using intentionality of inputs, of electrical inputs, and they had two different ways that they did it.
[23:43.000 --> 23:49.000] One of them is spatial, meaning that they were able to kind of divide the network into two halves, left side, right side.
[23:49.000 --> 23:53.000] So if I stimulate over here, that is a difference.
[23:53.000 --> 23:58.000] That's quantitatively and qualitatively different than stimulating on the right side over here.
[23:58.000 --> 24:03.000] And then they also did something with the frequency of the stimulation, so da-da-da-da versus da-da-da-da, right?
[24:03.000 --> 24:06.000] That was kind of how the paddle was moving.
[24:06.000 --> 24:13.000] So this is how they stimulated the idea of movement in gameplay.
[24:13.000 --> 24:18.000] It was really just through manipulating the inputs.
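To make the two coding schemes concrete, here is a minimal sketch, assuming a made-up electrode grid and a made-up 4–40 Hz stimulation range (the paper's actual parameters aren't given in the segment): the ball's horizontal position picks where to stimulate, and its distance from the paddle sets how fast.

```python
# Toy illustration of the two stimulus codes described above:
# place coding (WHERE on the array) and rate coding (HOW OFTEN).
# Grid size and frequency range are invented for illustration.

GRID_COLUMNS = 8                 # hypothetical 8-column electrode region
F_MIN_HZ, F_MAX_HZ = 4.0, 40.0   # hypothetical stimulation frequency range

def encode_ball(x: float, distance: float) -> tuple[int, float]:
    """Map a normalized ball state (both in 0..1) to a site and a rate."""
    column = min(int(x * GRID_COLUMNS), GRID_COLUMNS - 1)  # place code
    freq = F_MAX_HZ - distance * (F_MAX_HZ - F_MIN_HZ)     # rate code: closer -> faster
    return column, freq

for x, d in [(0.1, 0.9), (0.5, 0.5), (0.9, 0.1)]:
    col, hz = encode_ball(x, d)
    print(f"ball at x={x:.1f}, distance={d:.1f} -> column {col}, {hz:.1f} Hz")
```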
[24:18.000 --> 24:23.000] And what they found was that there was a significant difference.
[24:23.000 --> 24:35.000] The nerve cells actually responded differently to organized gameplay, and they gave predictable feedback,
[24:35.000 --> 24:44.000] meaning that they fired in predictable patterns versus random, chaotic, unpredictable gameplay.
[24:44.000 --> 24:55.000] So they actually started to show what might be called learning, plasticity, patternicity, but intentional patternicity,
[24:55.000 --> 24:59.000] as opposed to the patternicity that happens naturally that we might call noise.
[24:59.000 --> 25:06.000] And that's something that I think we need to specify here, because what a lot of people don't know is that when you plate nerve cell networks in vitro,
[25:06.000 --> 25:17.000] they automatically do something called, it's kind of like kindling, they automatically fire together, and they go zh-zh-zh, like the whole network does that.
[25:17.000 --> 25:23.000] It chatters a little bit too, but it can recruit all of the cells and they can all talk to each other
[25:23.000 --> 25:28.000] and start having these really predictable patterns that arises naturally.
[25:28.000 --> 25:33.000] This is really typical when there's no sensory inputs or outputs, because you've got to think about this as like a network that's,
[25:33.000 --> 25:37.000] it's almost like a circuit where the wires are just hanging, right?
[25:37.000 --> 25:41.000] There's no eyes, there's no fingers, there's no muscles.
[25:41.000 --> 25:45.000] So we think of sensory motor stuff as inputs and outputs to the brain.
[25:45.000 --> 25:49.000] Nerve cell networks don't have those things, they're just neurons.
[25:49.000 --> 25:51.000] Yeah, they call them disembodied.
[25:51.000 --> 25:53.000] Disembodied, yeah, that's a great way to put it.
[25:53.000 --> 25:57.000] I say dissociated, I think that's a little more accurate, but okay.
[25:57.000 --> 26:06.000] So they are basically inducing or simulating inputs and outputs with these different patterns of electrical activity.
[26:06.000 --> 26:11.000] And they're using Pong, we've really got to think about this whole thing with Pong,
[26:11.000 --> 26:17.000] because it's kind of irrelevant, as a way to imagine organization.
[26:17.000 --> 26:19.000] That's really all this is.
[26:19.000 --> 26:21.000] It's a mechanism for us to imagine organization.
[26:21.000 --> 26:23.000] They are not playing Pong.
[26:23.000 --> 26:27.000] I think it's even a stretch to say they're playing simulated Pong,
[26:27.000 --> 26:30.000] but they're using Pong almost metaphorically to say,
[26:30.000 --> 26:34.000] we are inducing organization and we are getting non-random feedback.
[26:34.000 --> 26:36.000] And that's the big deal here.
[26:36.000 --> 26:43.000] That is a step in the direction of something that could be really helpful.
[26:43.000 --> 26:49.000] But what I want to really impress is that this is not new, this did not happen overnight.
[26:49.000 --> 26:53.000] Electrical physiology studies have been used for decades,
[26:53.000 --> 27:02.000] and using electrical monolayer networks as what we call kind of like biosensors is a very common practice.
[27:02.000 --> 27:08.000] So you've got this in vitro cell culture, you add a drug, you see how it responds.
[27:08.000 --> 27:09.000] This is really important.
[27:09.000 --> 27:11.000] It can help us with drug development.
[27:11.000 --> 27:15.000] You've got it, you put it into a different environment, you see how it responds,
[27:15.000 --> 27:19.000] different air pressure, I don't know, different temperature, you see if the cells respond.
[27:19.000 --> 27:21.000] This can give us some really important feedback.
[27:21.000 --> 27:28.000] The difference here is instead of looking at the changes in the random electrical activity of these cells,
[27:28.000 --> 27:34.000] the naturally occurring kind of self-organizing random electrical activity,
[27:34.000 --> 27:40.000] what these researchers are saying is we can actually induce kind of a very rudimentary learning
[27:40.000 --> 27:44.000] because we're seeing predictable feedback in these cells.
[27:44.000 --> 27:48.000] So now when we add a drug, now when we put it in a new environment,
[27:48.000 --> 27:52.000] now when we change certain parameters, we can see if the learning is altered.
[27:52.000 --> 27:56.000] So we've got a whole other layer of insight.
[27:56.000 --> 28:02.000] And the cool thing about this is it really does go to show the multiple levels of organization of the brain
[28:02.000 --> 28:03.000] and how complicated it really is.
[28:03.000 --> 28:05.000] Our brains are not machines.
[28:05.000 --> 28:10.000] And it's very important to remember that not only do we have things that are happening at the molecular level,
[28:10.000 --> 28:15.000] we have things that are happening at the cellular level, we have things that are happening at the network level,
[28:15.000 --> 28:20.000] and then we have things that are happening at the organ level.
[28:20.000 --> 28:22.000] Here we're talking about the network level.
[28:22.000 --> 28:26.000] So it's BS when you're reading that these are organoids.
[28:26.000 --> 28:28.000] I don't know why a lot of the headlines say organoids.
[28:28.000 --> 28:29.000] These aren't organoids.
[28:29.000 --> 28:31.000] These are monolayer networks.
[28:31.000 --> 28:36.000] And so it's missing a whole level of complexity because the brain isn't flat.
[28:36.000 --> 28:39.000] The brain has like layers in it, which is really cool.
[28:39.000 --> 28:44.000] But in order to get to that high level of complexity, oftentimes we have to be a little bit reductionistic
[28:44.000 --> 28:47.000] and go down and see what's happening at simpler levels.
[28:47.000 --> 28:51.000] We're not even talking about what's happening at the molecular level here, or even at the cellular level.
[28:51.000 --> 28:54.000] We're talking about what's happening at the network level.
[28:54.000 --> 29:00.000] And each level of organization does seem to have its own complexity.
[29:00.000 --> 29:04.000] So it's just this really beautiful, really multifaceted thing.
[29:04.000 --> 29:08.000] What I worry about with the type of reporting that I'm seeing, and to be fair,
[29:08.000 --> 29:09.000] I don't think it's all bad reporting.
[29:09.000 --> 29:16.000] I think that the study itself is written with a lot of hype language, with terminology like,
[29:16.000 --> 29:19.000] they use the acronym over and over, SBI.
[29:19.000 --> 29:23.000] They're literally referring to their, what do they call it, brain dish or whatever.
[29:23.000 --> 29:25.000] It's like brain dish, TM.
[29:25.000 --> 29:31.000] They're literally referring to it as a synthetic biological intelligence.
[29:31.000 --> 29:35.000] Kara, they claim that they have founded a new science here.
[29:35.000 --> 29:36.000] I know.
[29:36.000 --> 29:37.000] They explicitly state that.
[29:37.000 --> 29:40.000] They're massively overcalling that.
[29:40.000 --> 29:41.000] They're massively overcalling it.
[29:41.000 --> 29:45.000] It's cool what they found, and it is interesting and helpful.
[29:45.000 --> 29:46.000] It is.
[29:46.000 --> 29:51.000] But I'm afraid that because they've over called it so much, it's overshadowing how cool it is
[29:51.000 --> 29:53.000] because they're saying it's something it's not.
[29:53.000 --> 29:55.000] It's not a synthetic biological intelligence.
[29:55.000 --> 29:57.000] These are just brain cells doing what brain cells do.
[29:57.000 --> 30:01.000] They've just figured out a way to read them better and to manipulate them better.
[30:01.000 --> 30:04.000] There's nothing synthetic about this.
[30:04.000 --> 30:09.000] These are neurons growing on a dish that we've been doing for literally decades.
[30:09.000 --> 30:14.000] Then they do break it down into some interesting physics principles.
[30:14.000 --> 30:16.000] Why is it that they would do this?
[30:16.000 --> 30:20.000] If they don't have inputs or outputs, why would they randomly, or not randomly,
[30:20.000 --> 30:23.000] why would they non-randomly, quote, play the game?
[30:23.000 --> 30:25.000] Remember, that's a simulation.
[30:25.000 --> 30:30.000] They are saying, well, we think it has to do basically with entropy.
[30:30.000 --> 30:33.000] They're always going to be going towards an area of high efficiency.
[30:33.000 --> 30:35.000] This is just how brain cells work.
[30:35.000 --> 30:38.000] They're always going to try and conserve as much energy as possible.
[30:38.000 --> 30:43.000] It's more efficient to learn than it is to make random noise all the time.
[30:43.000 --> 30:48.000] If we see an input in a closed loop system, we're going to start to react in a predictable pattern.
[30:48.000 --> 30:50.000] That's more efficient.
[30:50.000 --> 30:52.000] That's going to save, basically, cellular energy.
[30:52.000 --> 30:54.000] It's going to save ATP.
[30:54.000 --> 31:00.000] But, of course, they had to come up with a whole other lexicon to describe this.
[31:00.000 --> 31:04.000] They're calling it the ultimate biomimetic sandbox.
[31:04.000 --> 31:09.000] That's literally the term that they're using for the platform.
[31:09.000 --> 31:14.000] Then they're saying that they are applying implications
[31:14.000 --> 31:19.000] from the theory of active inference via the free energy principle.
[31:19.000 --> 31:20.000] Whoa.
[31:20.000 --> 31:22.000] That sounds like gobbledygook.
[31:22.000 --> 31:24.000] Yeah, I tried to unpack that.
[31:24.000 --> 31:27.000] The only thing I could find is that they're saying what you already said.
[31:27.000 --> 31:29.000] Yeah, they're saying this is an entropy thing.
[31:29.000 --> 31:32.000] Yeah, neurons will try to minimize their entropy.
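For anyone who wants to see what that gobbledygook cashes out to, here is the standard textbook form of the variational free energy from the active-inference literature. This is the generic formula, not anything specific to this paper:

```latex
% Variational free energy F: an upper bound on surprise (negative log evidence).
% q(s) is the system's internal model of hidden states s; o is the sensory input.
F \;=\; \underbrace{D_{\mathrm{KL}}\!\bigl[\,q(s)\,\|\,p(s \mid o)\,\bigr]}_{\ge\, 0}
\;-\; \ln p(o) \;\;\ge\;\; -\ln p(o)
% Minimizing F over time means making inputs less surprising -- i.e., the
% "predictable firing beats random noise" behavior described above.
```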
[31:32.000 --> 31:37.000] But what I found to be the cool bit, like when I dug down to paydirt,
[31:37.000 --> 31:39.000] what's actually happening here?
[31:39.000 --> 31:43.000] Forget all of the hype and the new branding and all that crap.
[31:43.000 --> 31:48.000] This is an incremental advance in that they figured out a way to close the loop in vitro.
[31:48.000 --> 31:52.000] I think that's what they're talking about with the new system that they have here.
[31:52.000 --> 31:57.000] Yeah, they're plating these neurons on an electrode array
[31:57.000 --> 32:01.000] that they could then use in an in vitro closed loop system.
[32:01.000 --> 32:02.000] Okay, that's nice.
[32:02.000 --> 32:03.000] That's a nice advance.
[32:03.000 --> 32:05.000] Good for you guys.
[32:05.000 --> 32:11.000] Not what this one experiment shows, but what all of these experiments show,
[32:11.000 --> 32:14.000] this science shows, which is really cool,
[32:14.000 --> 32:19.000] is that you can have a very simple system where you have these neurons,
[32:19.000 --> 32:25.000] like basic cells that can depolarize and produce an electrical current
[32:25.000 --> 32:28.000] and also be depolarized by an electrical current,
[32:28.000 --> 32:31.000] so they can connect to each other and communicate with each other,
[32:31.000 --> 32:39.000] that at its simplest form, they're basically firing in such a way as to minimize their work.
[32:39.000 --> 32:47.000] And by doing that, they can respond to external input to make it more predictable and more regular.
[32:47.000 --> 32:51.000] It's like insects building a big structure.
[32:51.000 --> 32:55.000] Following that very simple algorithm,
[32:55.000 --> 33:00.000] they can quote-unquote learn by adapting to environmental input.
[33:00.000 --> 33:07.000] In this case, the researchers are using the input and the output as a metaphor for Pong.
[33:07.000 --> 33:14.000] And theoretically, because it's connected to a silicon chip,
[33:14.000 --> 33:16.000] you could do a readout.
[33:16.000 --> 33:20.000] You could interpret that as Pong.
[33:20.000 --> 33:24.000] And just like you have the ones and zeros of a regular silicon computer chip,
[33:24.000 --> 33:27.000] you interpret that as everything that's happening on your desktop.
[33:27.000 --> 33:29.000] The computer chip doesn't know what's happening.
[33:29.000 --> 33:32.000] It's just calculating ones and zeros or whatever.
[33:32.000 --> 33:35.000] It's just doing things at a very, very micro level.
[33:35.000 --> 33:36.000] It's the same thing.
[33:36.000 --> 33:40.000] These neurons are just firing the way that they're evolved to fire,
[33:40.000 --> 33:43.000] that they're programmed to fire in response to environmental stimuli.
[33:43.000 --> 33:46.000] To me, that would have been more interesting, don't you think, Steve?
[33:46.000 --> 33:52.000] Neurons are not playing Pong, we're using Pong to visualize a patternicity within the neurons.
[33:52.000 --> 33:54.000] And I think the point that I just want to reiterate,
[33:54.000 --> 33:57.000] because I think you made a good point but I almost want to clarify,
[33:57.000 --> 34:01.000] is that the incremental advance here is not that they're on MEAs.
[34:01.000 --> 34:05.000] It's not that there are electrical inputs or even that there are electrical outputs.
[34:05.000 --> 34:07.000] We've been doing this for decades.
[34:07.000 --> 34:11.000] The incremental advance is that, unlike what most people have long done,
[34:11.000 --> 34:15.000] which is do something to the cell culture and then see what happens,
[34:15.000 --> 34:19.000] what these people said is, I want to try and do something to the cell culture
[34:19.000 --> 34:23.000] and see if I can take their response to then have a new thing happen.
[34:23.000 --> 34:25.000] I want to close the feedback loop.
[34:25.000 --> 34:27.000] That's what they did.
[34:27.000 --> 34:31.000] Instead of leaving one of those hanging open, they closed the loop.
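A minimal sketch of what "closing the loop" means in software terms, with everything biological stood in for by a single adaptive gain. This toy is an editor's illustration, not the paper's system; the names and numbers are invented:

```python
# Toy closed loop: stimulate -> record -> use the response to pick the next
# stimulus. The "network" is one adaptive gain, a stand-in for plasticity;
# the point is the loop structure, not any real neuroscience.
import random

random.seed(0)
gain = 0.2  # stand-in for network excitability

def record(stimulus: float) -> float:
    """Record step: noisy response proportional to the stimulus."""
    return gain * stimulus + random.gauss(0.0, 0.05)

TARGET = 1.0     # response that counts as a "hit"
stimulus = 1.0
for step in range(201):
    response = record(stimulus)          # read the network's output...
    error = TARGET - response
    gain += 0.05 * error * stimulus      # ...let it adapt (delta-rule stand-in)...
    # ...and close the loop: predictable input after a hit, a noisy burst
    # after a miss, mirroring the predictable-vs-chaotic feedback above.
    stimulus = 1.0 if abs(error) < 0.2 else random.uniform(0.0, 2.0)
    if step % 50 == 0:
        print(f"step {step:3d}: response {response:+.3f}, error {error:+.3f}")
```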
[34:31.000 --> 34:36.000] And I think anybody who's been doing this for a long time would predict that that would happen,
[34:36.000 --> 34:40.000] but it's complicated from a physics standpoint to do that,
[34:40.000 --> 34:42.000] and they managed to do that, and that is pretty cool.
[34:42.000 --> 34:46.000] But I also think that I'm not surprised that it happened at all.
[34:46.000 --> 34:49.000] If you were to ask me, hey, if somebody did this, do you think this would happen?
[34:49.000 --> 34:51.000] I would be like, yeah, I think so. That's cool.
[34:51.000 --> 34:53.000] Somebody should do that.
[34:53.000 --> 34:55.000] And so I'm glad they did.
[34:55.000 --> 34:59.000] I think the take-home for the non-expert, people listening to this,
[34:59.000 --> 35:02.000] again, you can get lost in the weeds here.
[35:02.000 --> 35:06.000] The take-home is that, from an evolutionary point of view,
[35:06.000 --> 35:12.000] a very simple system, one that has inputs and outputs,
[35:12.000 --> 35:17.000] can learn and adapt to its environment.
[35:17.000 --> 35:22.000] You think about the simplest life forms, multi-cellular life forms,
[35:22.000 --> 35:28.000] with some kind of nervous system, could function at a very simplistic level
[35:28.000 --> 35:34.000] to do things like move towards food or away from toxins or toward light or whatever.
[35:34.000 --> 35:36.000] You don't even need nervous systems.
[35:36.000 --> 35:39.000] We see this with slime mold, and we see this with plants.
[35:39.000 --> 35:43.000] Algorithmic learning, at its core, can be really simple.
[35:43.000 --> 35:46.000] I think that's the take-home here.
[35:46.000 --> 35:49.000] And from that, you can evolve human brains,
[35:49.000 --> 35:51.000] give it 600 million years or whatever.
[35:51.000 --> 35:55.000] Over time, you can just keep adding in complexity and complexity and complexity,
[35:55.000 --> 36:01.000] but there's a toehold in really, really simple learning.
[36:01.000 --> 36:03.000] And that's what this is. This is simple learning.
[36:03.000 --> 36:04.000] That's what this is.
[36:04.000 --> 36:06.000] The other little side thing I noticed was that,
[36:06.000 --> 36:10.000] because they did mouse cells and human cells, like human neurons and mouse neurons,
[36:10.000 --> 36:15.000] that the human neurons learned a little bit faster than the mouse neurons.
[36:15.000 --> 36:18.000] So there might be something intrinsically a little bit higher functioning
[36:18.000 --> 36:20.000] about human neurons than mouse neurons.
[36:20.000 --> 36:23.000] There might be more complexity or more efficiency within the cell.
[36:23.000 --> 36:26.000] The cells themselves, not just the organization of the cells.
[36:26.000 --> 36:31.000] I also noted that the neurons learned faster over time.
[36:31.000 --> 36:35.000] The feedback loop actually got more efficient over time.
[36:35.000 --> 36:36.000] And so that's interesting as well.
[36:36.000 --> 36:37.000] Which also would make sense.
[36:37.000 --> 36:38.000] Yeah, it also makes sense.
[36:38.000 --> 36:40.000] It makes sense, but it's also cool.
[36:40.000 --> 36:45.000] I guess the one last thing that I want to say is be careful when reading things like this,
[36:45.000 --> 36:46.000] because this was published in Neuron.
[36:46.000 --> 36:48.000] This is a big journal.
[36:48.000 --> 36:49.000] It's getting a lot of hype.
[36:49.000 --> 36:52.000] But even the title of the paper, are you guys ready for this?
[36:52.000 --> 36:58.000] In vitro neurons learn and exhibit sentience when embodied in a simulated game world.
[36:58.000 --> 37:00.000] Oh my God, sentience.
[37:00.000 --> 37:01.000] What?
[37:01.000 --> 37:04.000] And by the way, sentience, once again, we've talked about this a lot, literally means...
[37:04.000 --> 37:06.000] It's got baggage, that word.
[37:06.000 --> 37:09.000] It's got a lot of baggage and it's supposed to be beyond, right?
[37:09.000 --> 37:13.000] Feeling or sensation as distinguished from perception and thought.
[37:13.000 --> 37:19.000] So not only are they saying these neurons themselves can perceive and think,
[37:19.000 --> 37:21.000] they're saying they go beyond that.
[37:21.000 --> 37:22.000] They also feel.
[37:22.000 --> 37:25.000] No, nothing about this is telling us that.
[37:25.000 --> 37:28.000] All this is telling us is that they have inputs and outputs
[37:28.000 --> 37:35.000] and that they can, quote, behave or they can respond in predictable ways.
[37:35.000 --> 37:38.000] It's like a very primitive AI algorithm in neurons.
[37:38.000 --> 37:39.000] Yeah.
[37:39.000 --> 37:40.000] That's basically what it is.
[37:40.000 --> 37:41.000] It's a circuit.
[37:41.000 --> 37:43.000] It's a circuit.
[37:43.000 --> 37:45.000] Okay, but it's cool.
[37:45.000 --> 37:46.000] All right, let's move on.
[37:46.000 --> 37:51.000] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week,
[37:51.000 --> 37:53.000] BetterHelp Therapy Online.
[37:53.000 --> 37:58.000] Guys, you all know I've been going to therapy for basically all of my adult life,
[37:58.000 --> 38:02.000] and to summarize it, it really has helped me tremendously.
[38:02.000 --> 38:08.000] I mean, particularly at times when things have been very stressful, like when our father died,
[38:08.000 --> 38:14.000] I really needed to go to therapy to help me just get things off my chest and to talk things out,
[38:14.000 --> 38:17.000] not to mention the help I've received for anxiety.
[38:17.000 --> 38:18.000] It's true, and you know what?
[38:18.000 --> 38:21.000] BetterHelp makes it really easy to get started in therapy.
[38:21.000 --> 38:23.000] They're the world's largest therapy service.
[38:23.000 --> 38:27.000] They've matched millions of people with professionally licensed and vetted therapists,
[38:27.000 --> 38:30.000] and they're available 100% online.
[38:30.000 --> 38:33.000] All you've got to do is fill out a brief questionnaire to match with a therapist.
[38:33.000 --> 38:36.000] If things aren't clicking, you can easily switch to a new one.
[38:36.000 --> 38:37.000] It really couldn't be simpler.
[38:37.000 --> 38:53.000] There's no waiting rooms, no traffic.
[38:53.000 --> 38:55.000] All right, guys, let's get back to the show.
Smelling in VR (38:55)

[38:55.000 --> 38:58.000] We're going to pivot to a bit of a lighter item, Jay.
[38:58.000 --> 39:01.000] You're going to tell us about smelling in virtual reality.
[39:01.000 --> 39:06.000] Yeah, so the story here is that some people were developing technology
[39:06.000 --> 39:10.000] to smell things that you experience in virtual reality.
[39:10.000 --> 39:16.000] This group is in Sweden, and they actually created something that could be 3D printed
[39:16.000 --> 39:21.000] that would enable you, with a little bit of creativity,
[39:21.000 --> 39:27.000] to make this thing at home and connect it to your VR software.
[39:27.000 --> 39:30.000] I'm not exactly sure how, because the details really aren't that handy,
[39:30.000 --> 39:34.000] but everything that they're doing is open source.
[39:34.000 --> 39:39.000] The things that remain unclear, just for total transparency here,
[39:39.000 --> 39:42.000] they don't really describe how they're making the smells.
[39:42.000 --> 39:44.000] They don't get into that.
[39:44.000 --> 39:49.000] All they said was that this version of it is intended for people to use
[39:49.000 --> 39:55.000] with a fake VR wine-smelling app.
[39:55.000 --> 39:59.000] You're smelling different wines, and you've got to identify them, that type of thing.
[39:59.000 --> 40:01.000] It has a fruity bouquet.
[40:01.000 --> 40:06.000] Apparently, they found a way to mimic the different odors that wine creates,
[40:06.000 --> 40:10.000] but I don't think you would be able to just plug this thing in and tell it,
[40:10.000 --> 40:13.000] I want you to smell like cinnamon and oranges.
[40:13.000 --> 40:15.000] I don't think it can do all that.
[40:15.000 --> 40:18.000] I think it's very limited in what it can do.
[40:18.000 --> 40:23.000] The idea, though, is extraordinarily provocative.
[40:23.000 --> 40:27.000] That's the fun part of this item, is not what they achieved,
[40:27.000 --> 40:29.000] but what the future of this might be.
[40:29.000 --> 40:33.000] I do think, Steve, that this is definitely something that will be added
[40:33.000 --> 40:40.000] to the suite of haptic feedback with VR, and there's going to be other types of scents.
[40:40.000 --> 40:44.000] Your senses will be fooled in many different ways.
[40:44.000 --> 40:47.000] Right now, they're fooling your ears and your eyes,
[40:47.000 --> 40:51.000] but to fool your sense of touch or to fool your sense of smell,
[40:51.000 --> 40:55.000] that could be profound, and that can add a dimension to this
[40:55.000 --> 40:58.000] that would bring even more immersion.
[40:58.000 --> 41:00.000] I'll give you a cool example.
[41:00.000 --> 41:04.000] A friend of mine was using my VR headset, and there was a fan on in the room,
[41:04.000 --> 41:09.000] and in the app, by coincidence, in the VR experience that he was having,
[41:09.000 --> 41:14.000] there was wind, and he said that feeling that wind on his skin,
[41:14.000 --> 41:19.000] it really made it all feel much more real, and it was just an accident.
[41:19.000 --> 41:23.000] It wasn't done deliberately in any way to simulate what was happening
[41:23.000 --> 41:25.000] in the VR experience.
[41:25.000 --> 41:34.000] These things do have the potential to dramatically increase our feeling
[41:34.000 --> 41:37.000] of actually being there, that whole uncanny valley.
[41:37.000 --> 41:38.000] Telepresence.
[41:38.000 --> 41:40.000] Telepresence, exactly.
[41:40.000 --> 41:43.000] I think it's really cool. I'm glad that people are working on it.
[41:43.000 --> 41:45.000] This is one of those things you never know.
[41:45.000 --> 41:49.000] One of the big VR headset companies might read this.
[41:49.000 --> 41:52.000] I'm sure that they're aware, and they might think,
[41:52.000 --> 41:55.000] hey, this is actually a really cool idea.
[41:55.000 --> 41:59.000] Let's dump half a billion dollars into it, that type of thing.
[41:59.000 --> 42:06.000] Yeah, it's definitely true that when you have more than one sense overlap
[42:06.000 --> 42:13.000] and reinforce each other, it dramatically increases the realness of the experience.
[42:13.000 --> 42:14.000] Oh, yeah.
[42:14.000 --> 42:15.000] Right?
[42:15.000 --> 42:17.000] Even just sight and sound.
[42:17.000 --> 42:22.000] There's a quote-unquote parlor trick that I used to love to play on people,
[42:22.000 --> 42:26.000] where you put an ice cube in your mouth surreptitiously,
[42:26.000 --> 42:28.000] and then you put your finger in your mouth,
[42:28.000 --> 42:31.000] and you pretend to bite your finger, but you're really crunching the ice cube.
[42:31.000 --> 42:34.000] And when they see you bite down on your finger,
[42:34.000 --> 42:39.000] and they hear the crunch of the ice cube, it has a visceral reaction.
[42:39.000 --> 42:42.000] Yeah, I do the same thing, but I also squirt blood at the same time,
[42:42.000 --> 42:44.000] so it's kind of like a triple threat.
[42:44.000 --> 42:46.000] There you go.
[42:46.000 --> 42:50.000] Or remember there's a ride at Universal, I think it was the Spider-Man ride,
[42:50.000 --> 42:54.000] where at one point in the ride, you're surrounded by screens
[42:54.000 --> 42:57.000] to make it seem like you're actually in the environment,
[42:57.000 --> 43:02.000] and visually there's something that would give you a blast of heat,
[43:02.000 --> 43:06.000] and they literally blow hot air at you at the same time.
[43:06.000 --> 43:14.000] So you see and feel the heat, and it really solidifies the illusion of reality tremendously.
[43:14.000 --> 43:16.000] Yeah, they call those 4D movies or something.
[43:16.000 --> 43:18.000] They would do smells, too.
[43:18.000 --> 43:24.000] Do you remember Honey, I Shrunk the Kids back at Disney in the 80s or 90s?
[43:24.000 --> 43:28.000] And they had mice released in the movie,
[43:28.000 --> 43:31.000] and then there were these little hoses under your feet in the chairs,
[43:31.000 --> 43:34.000] and it would move under your feet and everybody would scream
[43:34.000 --> 43:36.000] and jump up out of their chair.
[43:36.000 --> 43:41.000] They had a Jurassic Park ride like that in which a velociraptor snorted
[43:41.000 --> 43:43.000] and a puff of air hit your neck,
[43:43.000 --> 43:45.000] and you were like, whoa, what the heck?
[43:45.000 --> 43:52.000] But it goes to show you guys our brain is constructing reality moment to moment as we go,
[43:52.000 --> 43:56.000] and it's not that hard to fool your senses.
[43:56.000 --> 44:02.000] Your brain is doing everything it can to make sense of all of the different input that it gets
[44:02.000 --> 44:04.000] and to present to you, into your conscious mind,
[44:04.000 --> 44:09.000] to present to you a seamless, understandable reality.
[44:09.000 --> 44:13.000] The second you interfere with that input,
[44:13.000 --> 44:17.000] the second you put two computer screen monitors close to your eyes
[44:17.000 --> 44:22.000] and block out the rest of the light and have each eye see something a little bit different
[44:22.000 --> 44:27.000] so you get a 3D effect, and then you have some audio mixed in with it,
[44:27.000 --> 44:30.000] you are largely fooled at that point.
[44:30.000 --> 44:33.000] You can intellectually know that you're not in the room,
[44:33.000 --> 44:36.000] but man, does the world drop away instantly, instantly.
[44:36.000 --> 44:40.000] Jay, do you guys remember the one movie we saw when we were kids that had,
[44:40.000 --> 44:44.000] I don't know what they called it, but they had the smells with the movie.
[44:44.000 --> 44:48.000] Basically, you had a scratch and sniff card.
[44:48.000 --> 44:53.000] And then when number three appeared on the screen, you're supposed to scratch number three.
[44:53.000 --> 44:59.000] And of course, like when they pick up a smelly gym sock and tell you to scratch number six.
[44:59.000 --> 45:01.000] No, why would you do that?
[45:01.000 --> 45:06.000] But it was a gimmick, it didn't really add anything to the experience.
[45:06.000 --> 45:10.000] Jay, think about like a kind of Psych 101 class
[45:10.000 --> 45:13.000] where you're learning about perception and cognition
[45:13.000 --> 45:18.000] and all of these cool ways that experimentalists have turned things on their ear.
[45:18.000 --> 45:23.000] And something as simple as, oh Steve, what are they called?
[45:23.000 --> 45:28.000] The prism glasses that you can literally put on a pair of prism glasses
[45:28.000 --> 45:35.000] that flips the world upside down and it doesn't take that long to fully adapt to an upside down world,
[45:35.000 --> 45:41.000] completely navigate it relatively seamlessly, and just live with your world upside down.
[45:41.000 --> 45:44.000] Yeah, but your brain will eventually make it right side up.
[45:44.000 --> 45:48.000] No, no, no, no. Well, yeah, you will adapt so much that it seems real.
[45:48.000 --> 45:53.000] But then when you take them off again, you have a whole other readapting process,
[45:53.000 --> 45:56.000] which is very cool. I mean, it's that easy.
[45:56.000 --> 46:04.000] Or if you're wearing VR goggles and you're being fed a video camera feed of your own back
[46:04.000 --> 46:11.000] and somebody touches you so that you see and feel yourself being touched,
[46:11.000 --> 46:16.000] you feel like you're in the virtual avatar that you're seeing.
[46:16.000 --> 46:18.000] Like your brain says, oh, I'm here.
[46:18.000 --> 46:20.000] It's embodied completely.
[46:20.000 --> 46:26.000] It's also the reason that mirror therapy works for phantom limb pain.
[46:26.000 --> 46:32.000] You give yourself another leg, then you can scratch the other leg and that itch goes away.
[46:32.000 --> 46:34.000] It's so cool.
[46:34.000 --> 46:38.000] Yeah, we're very hackable. Okay, let's go on.
Technosignatures and biosignatures (46:38)

[46:38.000 --> 46:42.000] So I'm going to talk about technosignatures and biosignatures.
[46:42.000 --> 46:44.000] Oh my God.
[46:44.000 --> 46:46.000] There were a couple of news items dealing with these.
[46:46.000 --> 46:49.000] Is that like signing with an electric pen versus a pencil?
[46:49.000 --> 46:51.000] Technosignatures?
[46:51.000 --> 46:57.000] What we're interested in is finding evidence that there is life beyond Earth
[46:57.000 --> 47:01.000] and specifically life beyond our solar system.
[47:01.000 --> 47:03.000] And we have a couple of options.
[47:03.000 --> 47:06.000] One is to find biosignatures.
[47:06.000 --> 47:09.000] We obviously can't go to another star system and sample the water
[47:09.000 --> 47:12.000] and see if there's bacteria crawling around.
[47:12.000 --> 47:16.000] We have to see if there's evidence of life by what they produce.
[47:16.000 --> 47:21.000] So there's two big ones that we talk about frequently, oxygen and methane,
[47:21.000 --> 47:27.000] because they're both products of the metabolism of living things on Earth,
[47:27.000 --> 47:32.000] the one example we have of a living ecosystem.
[47:32.000 --> 47:37.000] And they're also very reactive chemicals, so they don't last very long in the atmosphere.
[47:37.000 --> 47:43.000] So if we detect either oxygen or methane in the atmosphere of an exoplanet,
[47:43.000 --> 47:49.000] or Mars even, for example, we've done this, then we know that it's being replenished.
[47:49.000 --> 47:54.000] There's some process replenishing that oxygen or replenishing that methane.
[47:54.000 --> 47:58.000] And so then we have to rule out abiotic sources, right?
[47:58.000 --> 48:02.000] Is there any process going on that could be, you know, chemical process
[48:02.000 --> 48:04.000] that could be producing either the oxygen or the methane?
[48:04.000 --> 48:09.000] And there are some abiotic sources, but then there may be ways of saying,
[48:09.000 --> 48:14.000] well, based upon whatever, whether there's the presence of other things,
[48:14.000 --> 48:20.000] we can infer if there's a likely abiotic source, and if we rule them all out,
[48:20.000 --> 48:24.000] then we think, hey, maybe we're looking at a biosignature, right?
[48:24.000 --> 48:26.000] And maybe there's life on that planet.
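The replenishment logic Steve just walked through can be written as a one-line steady-state balance. This is generic photochemistry bookkeeping, not a formula from the study:

```latex
% Steady state for a reactive atmospheric gas: production balances destruction.
% N = abundance, \Phi = production flux (biotic or abiotic), \tau = lifetime.
\frac{dN}{dt} \;=\; \Phi \;-\; \frac{N}{\tau} \;=\; 0
\quad\Longrightarrow\quad
N_{\mathrm{ss}} \;=\; \Phi\,\tau
% Reactive gases like O2 and CH4 (small tau) can only stay detectable if
% something keeps supplying them -- the heart of the biosignature argument.
```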
[48:26.000 --> 48:30.000] So there's a recent study where the scientists were arguing for the addition
[48:30.000 --> 48:36.000] of another biosignature, nitrous oxide, which raises a couple of interesting points.
[48:36.000 --> 48:39.000] So one is that we can't limit ourselves to what's happening on Earth
[48:39.000 --> 48:41.000] when we think about life.
[48:41.000 --> 48:44.000] You know, we have to think about what all the different ways in which life
[48:44.000 --> 48:48.000] might be undergoing its chemistry and doing its thing.
[48:48.000 --> 48:53.000] And so they were saying that nitrous oxide, N2O,
[48:53.000 --> 48:55.000] that's two nitrogens and an oxygen.
[48:55.000 --> 49:00.000] Some living organisms actually produce nitrates as a byproduct,
[49:00.000 --> 49:07.000] and then other living organisms can metabolize those nitrates and create nitrous oxide
[49:07.000 --> 49:13.000] as a waste product in enough amounts that it could build up in the atmosphere,
[49:13.000 --> 49:16.000] and then we might be able to detect that.
[49:16.000 --> 49:20.000] And the researchers have calculated, like, how much nitrous oxide
[49:20.000 --> 49:24.000] could we reasonably expect to build up based upon things like the density
[49:24.000 --> 49:27.000] of the atmosphere, the amount of oxygen in the atmosphere, et cetera.
[49:27.000 --> 49:32.000] And there's a lot of scenarios under which this could be a reasonable biosignature.
[49:32.000 --> 49:37.000] So that's really the news item, just that they were arguing to add that to the list.
[49:37.000 --> 49:42.000] But the idea that there may be other things out there that are also biosignatures
[49:42.000 --> 49:45.000] that we just don't know that they're biosignatures because, you know,
[49:45.000 --> 49:51.000] we have a narrow conception of what life does, limited to what's happening on Earth,
[49:51.000 --> 49:56.000] and maybe we need to get a little bit more open-minded about different kinds of life.
JWST detections (49:48)

[49:56.000 --> 50:02.000] And then by a little bit of a coincidence, there was a technosignature news item this week as well,
[50:02.000 --> 50:05.000] although this is really barely a technosignature item.
[50:05.000 --> 50:08.000] It's actually just really a cool astronomy item.
[50:08.000 --> 50:14.000] This is also, in my opinion, the science picture or image of the week.
[50:14.000 --> 50:19.000] You say that, but the inside of a gecko's foot just won the Nikon,
[50:19.000 --> 50:23.000] what is it called, Small World or Small Wonder photo competition.
[50:23.000 --> 50:26.000] So that might beat it for the science picture.
[50:26.000 --> 50:30.000] Well, but that wasn't a picture that came out this week though, right? That was just a one-of-a-kind.
[50:30.000 --> 50:33.000] Oh, you're right. Yeah, it's old. It just won this week.
[50:33.000 --> 50:37.000] And also like the b-ball was another one that won this week.
[50:37.000 --> 50:47.000] But anyway, this picture was from the James Webb Space Telescope, and it's of WR140.
[50:47.000 --> 50:53.000] The previous images showed that this star, which we thought was a binary star,
[50:53.000 --> 51:00.000] was surrounded by these perfect concentric rings, regularly spaced concentric rings,
[51:00.000 --> 51:04.000] going out from the star for quite a distance.
[51:04.000 --> 51:11.000] And there wasn't really a serious consideration, I think, that this could be a technosignature, meaning
[51:11.000 --> 51:14.000] this is some kind of alien megastructure.
[51:14.000 --> 51:17.000] But that always gets brought up.
[51:17.000 --> 51:20.000] Somebody is going to bring up this could be some kind of...
[51:20.000 --> 51:24.000] But it wasn't really one of the high probability ones.
[51:24.000 --> 51:25.000] It's like dust grains, right?
[51:25.000 --> 51:28.000] Yeah. The one I got most excited about was Tabby's star,
[51:28.000 --> 51:33.000] where we really couldn't explain that the light was dipping by that much.
[51:33.000 --> 51:36.000] And we thought, maybe it's a Dyson swarm.
[51:36.000 --> 51:38.000] But that got quickly ruled out.
[51:38.000 --> 51:42.000] Well, not that quickly, but yeah.
[51:42.000 --> 51:44.000] Fairly quickly, within a year or so.
[51:44.000 --> 51:46.000] It's not glowing in the infrared.
[51:46.000 --> 51:52.000] And if that were solar panels surrounding the star, it would be glowing in the infrared.
[51:52.000 --> 51:55.000] So pretty much a technosignature got ruled out pretty quickly.
[51:55.000 --> 51:58.000] And then we eventually figured out, oh, it's just a really dusty star.
[51:58.000 --> 51:59.000] All right.
[51:59.000 --> 52:04.000] So for this one, with the higher resolution images of the James Webb,
[52:04.000 --> 52:09.000] you can see these concentric rings in beautiful detail.
[52:09.000 --> 52:10.000] Yeah, that's cool.
[52:10.000 --> 52:14.000] They're not quite perfect.
[52:14.000 --> 52:18.000] And guess how far they extend from this binary system?
[52:18.000 --> 52:20.000] Bob, did you read that?
[52:20.000 --> 52:21.000] No. Light year?
[52:21.000 --> 52:22.000] About a light year.
[52:22.000 --> 52:24.000] Yeah, about a light year.
[52:24.000 --> 52:25.000] I'm good.
[52:25.000 --> 52:27.000] About 10 trillion kilometers, which is about a lot.
[52:27.000 --> 52:28.000] That's far.
[52:28.000 --> 52:30.000] So a little bit more than a light year, actually.
[52:30.000 --> 52:31.000] So that's really far.
[52:31.000 --> 52:37.000] That would be a massive, massive megastructure if somebody built it.
[52:37.000 --> 52:40.000] But it also confirmed that it's a binary system.
[52:40.000 --> 52:46.000] The larger star is a Wolf-Rayet star.
[52:46.000 --> 52:51.000] Wolf is the first word, hyphen, R-A-Y-E-T, Rayet.
[52:51.000 --> 52:52.000] Okay.
[52:52.000 --> 52:53.000] Wolf-Rayet star.
[52:53.000 --> 52:55.000] And that's the WR.
[52:55.000 --> 53:02.000] And these are really big and unstable stars at the end of their life, essentially.
[53:02.000 --> 53:04.000] And it's about 30 solar masses, right?
[53:04.000 --> 53:07.000] Solar mass being the mass of our sun.
[53:07.000 --> 53:11.000] The other star is an O-type star that's about 10 solar masses.
[53:11.000 --> 53:18.000] But astronomers think that it was probably as big as the Wolf-Rayet star in the past.
[53:18.000 --> 53:23.000] And it's been giving off material and losing mass over time.
[53:23.000 --> 53:25.000] The two stars are a close binary.
[53:25.000 --> 53:31.000] They orbit each other once every 7.93 years, like almost eight years.
[53:31.000 --> 53:35.000] They're also called, Bob, if you've ever heard this term, a wind binary.
[53:35.000 --> 53:36.000] Have you heard of a wind binary?
[53:36.000 --> 53:37.000] No.
[53:37.000 --> 53:38.000] Yeah.
[53:38.000 --> 53:39.000] A wind binary.
[53:39.000 --> 53:40.000] Wind binary.
[53:40.000 --> 53:42.000] New term, new astronomy term to learn.
[53:42.000 --> 53:43.000] Solar wind?
[53:43.000 --> 53:44.000] What's going on there?
[53:44.000 --> 53:45.000] Yeah.
[53:45.000 --> 53:51.000] They're so close to each other and so energetic that their solar winds interact with each other.
[53:51.000 --> 53:54.000] And that's basically what's happening here.
[53:54.000 --> 53:56.000] And their orbit's very eccentric.
[53:56.000 --> 54:03.000] So as these two stars whip around each other every eight years, their solar winds are interacting.
[54:03.000 --> 54:06.000] And I guess I was trying to think of a good metaphor.
[54:06.000 --> 54:11.000] It's kind of like a bellows, you know, but at like a solar wind scale.
[54:11.000 --> 54:16.000] As they're moving toward each other and moving away from each other,
[54:16.000 --> 54:24.000] the interaction of their solar winds are pumping the dust that's coming off of the stars,
[54:24.000 --> 54:28.000] especially the smaller one, out into the surrounding space.
[54:28.000 --> 54:32.000] So imagine, here's another metaphor.
[54:32.000 --> 54:34.000] Like we used to do this in the pool all the time.
[54:34.000 --> 54:37.000] Like you have a floaty device on top of it, on the surface of the water,
[54:37.000 --> 54:43.000] and you're pumping it up and down and you're generating waves that are radiating out away from your floaty thing.
[54:43.000 --> 54:44.000] So it's the same thing.
[54:44.000 --> 54:45.000] Yeah.
[54:45.000 --> 54:47.000] But I don't think this is even about that.
[54:47.000 --> 54:50.000] It's just about the pumping action, right?
[54:50.000 --> 54:55.000] So as these stars are whipping around each other, they're pumping these waves of solar wind
[54:55.000 --> 55:03.000] that are just creating these concentric rings of dust that then move away from the stellar system.
[55:03.000 --> 55:08.000] And because their period is regular, the spacing of the rings is regular.
[55:08.000 --> 55:09.000] Right.
[55:09.000 --> 55:10.000] Oh, wow. Nice.
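You can put rough numbers on that regular spacing. Assuming dust launched once per orbit coasts outward at roughly the Wolf-Rayet wind speed, and taking ~2,600 km/s purely as an illustrative value (the episode doesn't give a wind speed), the spacing works out like this:

```latex
% Ring spacing = wind speed x orbital period (illustrative wind speed assumed).
\Delta r \;\approx\; v_{\mathrm{wind}}\,P
\;\approx\; \bigl(2.6\times10^{3}\ \mathrm{km/s}\bigr)
\times \bigl(7.93\ \mathrm{yr}\times 3.15\times10^{7}\ \mathrm{s/yr}\bigr)
\;\approx\; 6.5\times10^{11}\ \mathrm{km}
% About (10^{13}\ \mathrm{km}) / (6.5\times10^{11}\ \mathrm{km}) \approx 15
% rings would then fit inside the ~1 light-year extent quoted above.
```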
[55:10.000 --> 55:14.000] So, yeah, so that's the thing is whenever we see something that's mathematically precise,
[55:14.000 --> 55:20.000] we think life intelligence, but nature can be mathematically precise as well.
[55:20.000 --> 55:25.000] In fact, the first technosignature arguably was the LGM signal.
[55:25.000 --> 55:26.000] You guys remember this?
[55:26.000 --> 55:27.000] Little Green Men.
[55:27.000 --> 55:28.000] Little Green Men.
[55:28.000 --> 55:31.000] We saw this signal was regularly pulsed.
[55:31.000 --> 55:34.000] Yeah, but it turned out to be a pulsar because it was spinning at a regular rate.
[55:34.000 --> 55:36.000] But we thought, oh, that's way too precise.
[55:36.000 --> 55:37.000] It has to be intelligent.
[55:37.000 --> 55:39.000] But nope, nature can be precise.
[55:39.000 --> 55:40.000] Same thing here.
[55:40.000 --> 55:45.000] These rings are very, very precisely concentric, but that's because they're being created
[55:45.000 --> 55:50.000] by a very precise phenomenon, the rotation of these two stars around each other.
[55:50.000 --> 55:51.000] Phenomenon.
[55:51.000 --> 55:53.000] Now, why are they not perfect circles?
[55:53.000 --> 55:54.000] There's two reasons.
[55:54.000 --> 55:57.000] One is because the orbit is concentric.
[55:57.000 --> 56:01.000] The timing is not, it's not like a perfectly circular orbit around each other.
[56:01.000 --> 56:05.000] So the asymmetry in the orbit creates a little asymmetry in these rings.
[56:05.000 --> 56:09.000] But also we're not viewing it dead face on.
[56:09.000 --> 56:13.000] We're viewing it at a slight angle, which causes an optical illusion of a little bit
[56:13.000 --> 56:16.000] of irregularity in the rings themselves.
[56:16.000 --> 56:17.000] Very cool.
[56:17.000 --> 56:22.000] So every time we think we have a technosignature and it turns out to be a natural phenomenon,
[56:22.000 --> 56:27.000] the natural phenomenon is always cool, which makes sense because the reason we think it's
[56:27.000 --> 56:31.000] a technosignature in the first place is because it's an anomaly.
[56:31.000 --> 56:34.000] It's something we can't immediately explain.
[56:34.000 --> 56:36.000] So some idiot says it's aliens, right?
[56:36.000 --> 56:43.000] But then they're taken seriously to greater or lesser extent depending on how anomalous
[56:43.000 --> 56:45.000] the thing actually is.
[56:45.000 --> 56:47.000] But most astronomers are like, yeah, yeah, cool your jets.
[56:47.000 --> 56:48.000] We'll figure it out.
[56:48.000 --> 56:52.000] And they figure it out and it's always something awesome like this.
[56:52.000 --> 56:54.000] But not quite as awesome as a megastructure.
[56:54.000 --> 56:57.000] No, a megastructure would be awesomer.
[56:57.000 --> 56:59.000] I completely agree with that.
[56:59.000 --> 57:06.000] But I also think that finding a megastructure is the best chance we have in our lifetime
[57:06.000 --> 57:09.000] of finding evidence of alien life.
[57:09.000 --> 57:15.000] When you think about it, finding biosignatures is going to be inherently ambiguous, in my
[57:15.000 --> 57:17.000] opinion, because there are abiotic sources.
[57:17.000 --> 57:18.000] And it'd be nice.
[57:18.000 --> 57:20.000] But then also they could be bacteria.
[57:20.000 --> 57:21.000] That would be awesome.
[57:21.000 --> 57:24.000] But still, you know, that's going to be hard to confirm.
[57:24.000 --> 57:32.000] SETI, yeah, I'm a huge supporter of SETI, but somebody would have to be pumping really powerful
[57:32.000 --> 57:33.000] radio waves our way.
[57:33.000 --> 57:35.000] And that just may not be the case.
[57:35.000 --> 57:41.000] But technosignatures, like some kind of megastructure, we can just see them, you know?
[57:41.000 --> 57:48.000] And they could be really far away and they could be unambiguously technological,
[57:48.000 --> 57:50.000] depending on what it is we're looking at.
[57:50.000 --> 57:54.000] Yeah, like imagine stars, here's an unconventional one, Steve.
[57:54.000 --> 57:58.000] Stars that were herded into a pattern.
[57:58.000 --> 58:01.000] Like, okay, that can't happen naturally.
[58:01.000 --> 58:05.000] I mean, that's just a weird one.
[58:05.000 --> 58:09.000] But of course, the whole Dyson Swarm idea could be big.
[58:09.000 --> 58:11.000] That happened on Futurama.
[58:11.000 --> 58:13.000] Oh, yeah, I'm sure it did, yeah.
[58:13.000 --> 58:14.000] On that documentary.
[58:14.000 --> 58:15.000] Yeah.
[58:15.000 --> 58:17.000] Fry herded the stars into a pattern.
[58:17.000 --> 58:21.000] I think it said it was his marriage proposal to the one.
[58:21.000 --> 58:22.000] Yeah, marry me.
[58:22.000 --> 58:23.000] Yeah, marry me.
[58:23.000 --> 58:26.000] But anyway, yeah, that would do it.
[58:26.000 --> 58:29.000] Or again, like a Dyson Sphere, a Dyson Swarm type of thing.
[58:29.000 --> 58:33.000] Something that has to be constructed would be awesome.
[58:33.000 --> 58:35.000] Yeah, that's my big hope, man.
[58:35.000 --> 58:37.000] We'll find the bacteria first, though, I think.
[58:37.000 --> 58:38.000] Probably.
[58:38.000 --> 58:41.000] And then it'll only be a candidate, a life candidate.
[58:41.000 --> 58:48.000] If we're going to really have a confirmation of life off of Earth, it's going to be in our solar system.
[58:48.000 --> 58:50.000] And it better be playing Pong.
[58:50.000 --> 58:52.000] Yeah, right.
[58:52.000 --> 58:58.000] All right, Evan, you know, this is one of those stories that never completely goes away.
[58:58.000 --> 59:04.000] The TWA 800 crash is now embroiled in a new lawsuit.
[59:04.000 --> 59:05.000] Tell us about it.
[59:05.000 --> 59:06.000] Sure is.
[59:06.000 --> 59:09.000] Yeah, I read this at Law Street Media, that's the website.
[59:09.000 --> 59:18.000] Headline reads, citing new evidence, surviving family members sue feds, Raytheon, and Lockheed Martin over the 1996 TWA crash.
[59:18.000 --> 59:23.000] So this was an amended complaint filed in Massachusetts federal court.
[59:23.000 --> 59:27.000] The original complaint was filed back in July of this year.
[59:27.000 --> 59:28.000] This one's amended.
[59:28.000 --> 59:31.000] I'm not quite sure exactly what parts of it were amended.
[59:31.000 --> 59:34.000] I didn't see a link to the original one.
[59:34.000 --> 59:36.000] But that aside, there's always little technical shit that they're doing.
[59:36.000 --> 59:37.000] Yeah, right.
[59:37.000 --> 59:38.000] Yeah, there may be another amendment coming.
[59:38.000 --> 59:39.000] Who knows?
[59:39.000 --> 59:59.000] But the complaint accuses the U.S. Department of Defense, the United States Navy and the contractors Raytheon Technologies Corporation and Lockheed Martin Corporation of a cover up of the 1996 aviation incident, resulting in the death of all 230 passengers and crew members aboard TWA flight 800.
[59:59.000 --> 01:00:22.000] It's a wrongful death case that claims Freedom of Information Act, or FOIA, evidence has emerged proving that the TWA 800 explosion was not caused by any defect in the airplane, but instead by an errant United States missile fired at aerial target drones flying nearby.
[01:00:22.000 --> 01:00:32.000] And the suit is brought by the survivors of the disaster. For those of us who have been alive long enough to remember when this happened, I definitely remember when it happened.
[01:00:32.000 --> 01:00:37.000] I remember being woken up in the middle of the night with a phone call to get down to my office.
[01:00:37.000 --> 01:00:41.000] I was in the video production industry at the time.
[01:00:41.000 --> 01:00:56.000] I had to prepare a lot of equipment for crews heading out to Long Island, New York. News outlets from France and from the United States needed more cameras and more reporters, and they all needed a local shop to get more equipment to cover this.
[01:00:56.000 --> 01:00:57.000] It was a huge story.
[01:00:57.000 --> 01:01:00.000] So I definitely, definitely remember that myself.
[01:01:00.000 --> 01:01:11.000] But here are the basic facts. TWA flight 800 took off from New York's John F. Kennedy International Airport around 8:20 p.m. on July 17, 1996.
[01:01:11.000 --> 01:01:18.000] It was bound for Paris. Within 12 minutes of takeoff, the plane exploded and crashed into the Atlantic Ocean off the coast of Long Island, New York.
[01:01:18.000 --> 01:01:27.000] In the United States, we have a federal agency that investigates all airplane crashes called the National Transportation Safety Board, the NTSB.
[01:01:27.000 --> 01:01:31.000] And they issued their official report on August 23rd, 2000.
[01:01:31.000 --> 01:01:35.000] It took four years to conduct this entire investigation.
[01:01:35.000 --> 01:01:38.000] Here's the relevant summary from that report.
[01:01:38.000 --> 01:01:52.000] The NTSB determined that the probable cause of the flight accident was an explosion of the center wing fuel tank, CWT for short, resulting from ignition of the flammable fuel air mixture in the tank.
[01:01:52.000 --> 01:01:59.000] The source of ignition energy for the explosion could not be determined with certainty, but of the sources evaluated by the investigation,
[01:01:59.000 --> 01:02:11.000] the most likely cause was a short circuit outside of the CWT that allowed excessive voltage to enter through electrical wiring associated with the fuel quantity indication system.
[01:02:11.000 --> 01:02:20.000] Contributing factors to the accident were the design and certification concept that fuel tank explosions could be prevented solely by precluding all ignition sources
[01:02:20.000 --> 01:02:35.000] and the design and certification of the Boeing 747 with heat sources located beneath the CWT with no means to reduce the heat transferred into the CWT or to render the fuel vapor in the tank non-flammable.
[01:02:35.000 --> 01:02:38.000] That's the relevant summary there.
[01:02:38.000 --> 01:02:55.000] Now, according to this lawsuit, the wrongful death complaint, this report is false, and that supposedly can be proven by a series of FOIA disclosures that have been uncovered primarily by a physicist, Dr. Thomas Stalcup.
[01:02:55.000 --> 01:03:05.000] And Thomas Stalcup has been basically looking into this for, well, almost as long as the case has been a case.
[01:03:05.000 --> 01:03:17.000] He's actively involved in offering testimony and basically talking about this, how it's always been a cover-up, and how there are alternate explanations as to what exactly happened.
[01:03:17.000 --> 01:03:25.000] So this particular physicist has been on this case for the better part of two decades now.
[01:03:25.000 --> 01:03:36.000] He says that according to his evidence, instead of a fuel tank explosion, the cause of the explosion was the testing of the Aegis weapons system developed and managed by Raytheon and Lockheed.
[01:03:36.000 --> 01:03:49.000] It explains that the defense system fired SM-2 missiles with live warheads from warships at aerial missile targets off the coast of New York in close proximity to commercial flight paths.
[01:03:49.000 --> 01:03:57.000] The filing states claims for negligence, wrongful death and survivorship, kind of the technical things that you have to have in these kinds of suits.
[01:03:57.000 --> 01:04:05.000] The complaint specifies that the contractor defendants' missile system suffered from system-specific software problems.
[01:04:05.000 --> 01:04:16.000] It is further contended by plaintiffs that the defendants engaged in a top down cover up to prevent the public from learning the truth about TWA 800.
[01:04:16.000 --> 01:04:20.000] So yeah, there's a couple different things kind of really going on here.
[01:04:20.000 --> 01:04:32.000] They're saying it was a cover-up, that these FOIA requests have revealed evidence to that effect, and that the evidence, they say, also discredits the exploding fuel tank narrative.
[01:04:32.000 --> 01:04:42.000] Now, Bob, Steve and Jay, I know you guys have some information regarding this that you did some investigating yourself on this particular case.
[01:04:42.000 --> 01:04:50.000] We deeply investigated this for a pilot that we were filming, if you recall, and this is pretty open and shut.
[01:04:50.000 --> 01:04:57.000] Bob actually went to the debris of the plane, right, Bob? And you could see how it was an explosion from the inside out.
[01:04:57.000 --> 01:05:04.000] Yeah, it was in, like, a warehouse. Basically the most intense three-dimensional jigsaw puzzle ever.
[01:05:04.000 --> 01:05:07.000] They put this plane back together from all the exploded pieces.
[01:05:07.000 --> 01:05:19.000] And of course, there were pieces missing and bent and all this stuff, but it was an amazing sight, and so impressive. An extraordinary feat just to get that thing back together.
[01:05:19.000 --> 01:05:23.000] And yeah, you could clearly see that this was an internal explosion.
[01:05:23.000 --> 01:05:28.000] The other thing is, Jay and I blanketed a lot of witnesses.
[01:05:28.000 --> 01:05:30.000] We did our own sort of witness investigation.
[01:05:30.000 --> 01:05:36.000] Then we read the testimony of many witnesses. And also you can reconstruct what happened.
[01:05:36.000 --> 01:05:42.000] I'll try to give this really briefly. Here's basically what happened. There were two explosions, right?
[01:05:42.000 --> 01:05:53.000] There was an initial explosion, then Flight 800, you know, went streaking down, billowing smoke and then exploded again as it broke into two pieces.
[01:05:53.000 --> 01:06:00.000] So you can calculate distance, right, from the witnesses to where this was happening.
[01:06:00.000 --> 01:06:10.000] And because sound travels at a predictable speed through air, we know how long it would have taken for the sound of the explosion to reach the witnesses.
[01:06:10.000 --> 01:06:22.000] We know exactly how long it would have taken, right? Some witnesses were 42 seconds away from the explosion, in terms of the speed of sound.
[01:06:22.000 --> 01:06:31.000] So this is when you reconstruct what the witnesses say, and this is both contemporaneous and even, you know, current.
[01:06:31.000 --> 01:06:42.000] And they're remarkably consistent, actually. They heard an explosion, turned toward the explosion and saw a streak in the sky.
[01:06:42.000 --> 01:06:53.000] And they saw, you know, Flight 800 on fire. Some people say that they saw the plane blow up.
[01:06:53.000 --> 01:07:00.000] And then they heard the explosion either at the same time or two or three seconds later.
[01:07:00.000 --> 01:07:09.000] Now, their brains, and we were just talking about this, when you see and hear something that fits together, your brain constructs that narrative.
[01:07:09.000 --> 01:07:16.000] So it makes perfect sense that they saw and heard an explosion within a couple of seconds of each other.
[01:07:16.000 --> 01:07:21.000] And they thought that that was the explosion. Right. What they were seeing was what they were hearing.
[01:07:21.000 --> 01:07:29.000] And so imagine if you see and hear an explosion and what you're seeing is a trail of smoke leading to the plane. Right.
[01:07:29.000 --> 01:07:35.000] So that's why their brains put that together as a missile hit the plane and then exploded.
[01:07:35.000 --> 01:07:42.000] But what was actually happening and this is unequivocal, demonstrable laws of physics kind of stuff.
[01:07:42.000 --> 01:07:51.000] They were seeing the second explosion near the time that they were hearing the first explosion.
[01:07:51.000 --> 01:07:55.000] Yeah. Forty-two seconds or 50 seconds or whatever previously.
[01:07:55.000 --> 01:08:05.000] The trail of smoke was from Flight 800 itself, which had already been burning for those 50 seconds or whatever. Does that make sense?
[01:08:05.000 --> 01:08:17.000] So it's just that they didn't, in real time, calculate the time delay of the sound versus what they could see, or intuitively know what the distance was.
[01:08:17.000 --> 01:08:22.000] They didn't realize that they were miles away from what they were seeing.
[01:08:22.000 --> 01:08:31.000] And so their own testimony, even people, we spoke to people who were convinced it was a missile and yet their own testimony proves that it wasn't.
[01:08:31.000 --> 01:08:35.000] You know what I mean? So the evidence, the quote-unquote evidence, that it was a missile is not evidence at all.
[01:08:35.000 --> 01:08:45.000] And here's something that's critical. No single witness saw a missile hit the plane and then the plane blow up.
[01:08:45.000 --> 01:08:54.000] Nobody saw that. They always saw it in process and then had to sort of reconstruct what was happening.
[01:08:54.000 --> 01:08:59.000] And they just did it wrong because they didn't factor in the time delay of the sound.
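For reference, a minimal back-of-the-envelope sketch of the sound-delay reconstruction described above. The speed of sound used here (~343 m/s in air at around 20 C) is an assumed textbook value, not a figure from the episode; the 42- and 50-second delays are the figures quoted above.

```python
# Back-of-the-envelope check of the sound-delay argument. Assumption: ~343 m/s
# for the speed of sound in air at ~20 C (it varies with temperature).
SPEED_OF_SOUND = 343.0  # m/s

def distance_km(delay_s: float) -> float:
    """Distance to an event whose sound arrives delay_s seconds late."""
    return SPEED_OF_SOUND * delay_s / 1000.0

for delay in (42, 50):
    print(f"{delay} s delay -> ~{distance_km(delay):.1f} km away")
# 42 s -> ~14.4 km; 50 s -> ~17.2 km. At those distances the flash arrives
# tens of seconds before the bang, so the explosion a witness heard matched
# an earlier event than the one they were watching.
```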
[01:08:59.000 --> 01:09:11.000] That makes sense. Yeah, totally. It's remarkable how, over time, people's memories will become so distorted.
[01:09:11.000 --> 01:09:16.000] Yeah, sure. And this is all of us. I'm not pointing fingers at anyone.
[01:09:16.000 --> 01:09:29.000] We all do this. You know, just reviewing memories in your head, and then hearing outside information that's related to those memories, can significantly alter them.
[01:09:29.000 --> 01:09:33.000] It's so liquid, you know.
[01:09:33.000 --> 01:09:40.000] That's why all of our ability to perceive reality, and our memories of reality, are in flux.
[01:09:40.000 --> 01:09:52.000] Especially when there's a time factor involved. I don't know how you can possibly, actively, correctly come up with the time sequence, as Steve described,
[01:09:52.000 --> 01:09:59.000] and be able to talk about it after the fact with the detail that the evidence otherwise suggests.
[01:09:59.000 --> 01:10:03.000] I don't know of a case where someone can really do that.
[01:10:03.000 --> 01:10:14.000] Unless, Steve, like you said, they actually witnessed a missile hitting a plane and then watched all those events unfold.
[01:10:14.000 --> 01:10:18.000] Instead, people are hearing things, turning and then seeing the phenomenon.
[01:10:18.000 --> 01:10:21.000] And then their brain is piecing together what they think. Right.
[01:10:21.000 --> 01:10:29.000] Here's the other thing. I think another factor here is that our expectations are trained by cinema and television.
[01:10:29.000 --> 01:10:34.000] Right. That's true. People expect to see what things look like on the big screen,
[01:10:34.000 --> 01:10:42.000] what they look like on television and not in the real world, you know, where you have to factor things in like extreme distance,
[01:10:42.000 --> 01:10:46.000] like a 50 second delay between the explosion and the sound reaching you.
[01:10:46.000 --> 01:10:50.000] So, yeah, of course they were confused. No one's ever seen anything like that before.
[01:10:50.000 --> 01:10:55.000] This is a one off unexpected bizarre event and their brains reconstructed it wrong.
[01:10:55.000 --> 01:11:03.000] But careful analysis can reconstruct it correctly. So I have no doubt this lawsuit will fail, because now the burden is on them.
[01:11:03.000 --> 01:11:12.000] They can't just do the typical conspiracy thing where they anomaly hunt and try to make everything sound sinister.
[01:11:12.000 --> 01:11:17.000] They have to positively prove a conspiracy to cover up evidence.
[01:11:17.000 --> 01:11:21.000] And of course, one doesn't exist. So they're not going to be able to do it.
[01:11:21.000 --> 01:11:27.000] You know what I mean? Conspiracy theories don't fare well where there are rules of evidence, like in a courtroom,
[01:11:27.000 --> 01:11:33.000] because you can't just be "just asking questions." I mean, that doesn't get you very far.
[01:11:33.000 --> 01:11:39.000] What they're saying in this particular complaint, and this is on line 85 of the complaint,
[01:11:39.000 --> 01:11:41.000] is that, and this is the FOIA part,
[01:11:41.000 --> 01:11:47.000] they obtained several FBI records never released to the families or the public.
[01:11:47.000 --> 01:11:55.000] One is described as an original Navy radar tape showing an object heading straight for TWA 800.
[01:11:55.000 --> 01:12:00.000] Another describes an object on radar impacting TWA 800 and then spiraling away,
[01:12:00.000 --> 01:12:06.000] while also stating that witnesses described seeing a flare going up and circling another object.
[01:12:06.000 --> 01:12:08.000] Subsequently, debris fell from the sky.
[01:12:08.000 --> 01:12:16.000] And so what they're implying through these suggestions is that the FOIA requests have revealed that these tapes
[01:12:16.000 --> 01:12:21.000] apparently exist somewhere. I don't think they have them. I doubt it.
[01:12:21.000 --> 01:12:30.000] But that's what they're apparently suggesting is that from the Freedom of Information Act requests that have been disclosed now,
[01:12:30.000 --> 01:12:35.000] they say that there is video evidence that suggests otherwise.
[01:12:35.000 --> 01:12:38.000] And that's what I believe they're going to base this whole thing on.
[01:12:38.000 --> 01:12:41.000] We'll see how far that gets them. I don't know.
[01:12:41.000 --> 01:12:51.000] Yeah, but we know it's bullshit. What always happens is they're doing three levels deep of inference about what this could mean.
[01:12:51.000 --> 01:12:55.000] And they're reading between the lines or just misinterpreting it.
[01:12:55.000 --> 01:12:59.000] It's the same thing with UFO evidence. It's always the same thing.
[01:12:59.000 --> 01:13:04.000] It's just inferential and wishful thinking, but there's never any there there.
[01:13:04.000 --> 01:13:10.000] And then when you actually do wind up seeing tapes, you get things like you have the UFO Navy tapes, right?
[01:13:10.000 --> 01:13:14.000] Yeah. OK, we can explain this now. Yes. And here is exactly what happened.
[01:13:14.000 --> 01:13:20.000] Actually, it doesn't support your complaint here. It actually further supports what the investigation found.
[01:13:20.000 --> 01:13:25.000] Yeah. So, you know, again, we have a lot invested in this particular story, not only regionally,
[01:13:25.000 --> 01:13:30.000] because it's something that happened in our backyard, basically, but also because we investigated it so thoroughly.
[01:13:30.000 --> 01:13:35.000] We'll definitely follow this story and report whatever the ultimate conclusion is.
[01:13:35.000 --> 01:13:39.000] I definitely would like to see what they're calling evidence in this case.
[01:13:39.000 --> 01:13:43.000] I bet you it's UFO-level evidence. Right. That was my thought as well.
[01:13:43.000 --> 01:13:47.000] But we will see. Keep it going. All right. Thanks, Evan.
[01:13:47.000 --> 01:13:52.000] Yep. Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Bombas.
[01:13:52.000 --> 01:13:58.000] Guys, in case you didn't know, everything that Bombas makes is incredibly comfortable to wear.
[01:13:58.000 --> 01:14:01.000] They're soft. They're seamless. They don't have any tags.
[01:14:01.000 --> 01:14:06.000] And they overall have a cozy feel that I am sure you're going to enjoy.
[01:14:06.000 --> 01:14:11.000] Just like us, I'm wearing Bombas socks right now. I mean, it doesn't get any more real than that.
[01:14:11.000 --> 01:14:15.000] Yeah, Jay. It's not just socks. It's shirts. It's underwear. It's so many things.
[01:14:15.000 --> 01:14:21.000] And Bombas mission is simple. Make the most comfortable clothes ever and match every item sold with an equal item donated.
[01:14:21.000 --> 01:14:24.000] So when you buy Bombas, you're also giving to someone in need.
[01:14:24.000 --> 01:14:32.000] Go to Bombas dot com slash skeptics and use code skeptics for 20 percent off your first purchase.
[01:14:32.000 --> 01:14:40.000] That's Bombas dot com slash skeptics and use code skeptics at checkout.
[01:14:40.000 --> 01:14:48.000] All right, guys, let's get back to the show. So, Bob, how did the whole smashing ship into an asteroid go?
[01:14:48.000 --> 01:14:53.000] It went great, Steve, 25 times better than anticipated.
[01:14:53.000 --> 01:14:54.000] Wow.
[01:14:54.000 --> 01:14:59.000] Now, this is Steve referring to NASA's DART mission, smashing a probe into an asteroid.
[01:14:59.000 --> 01:15:07.000] And the fact that it has been shown, at least on preliminary examination, to be a success beyond expectations.
[01:15:07.000 --> 01:15:13.000] So what happened and why could that mission potentially lead to saving human civilization?
[01:15:13.000 --> 01:15:17.000] Maybe a little dramatic, but this could prove to be quite important.
[01:15:17.000 --> 01:15:23.000] So I'm sure most of us have heard about the DART mission that culminated recently, September 26, 2022.
[01:15:23.000 --> 01:15:27.000] DART stands for Double Asteroid Redirection Test.
[01:15:27.000 --> 01:15:33.000] So the what of this is that a refrigerator-sized probe traveling at 22,000 kilometers per hour.
[01:15:33.000 --> 01:15:37.000] Wait, wait, what's the metric equivalent to refrigerator-sized?
[01:15:37.000 --> 01:15:38.000] Never mind.
[01:15:38.000 --> 01:15:44.000] The speedy probe ventured into the domain of a 780-meter asteroid called Didymos,
[01:15:44.000 --> 01:15:54.000] and its 160-meter asteroid moonlet called Dimorphos, or affectionately called Didymoon, which is pretty funny.
[01:15:54.000 --> 01:15:57.000] So the probe smacked into Didymoon.
[01:15:57.000 --> 01:16:05.000] And so the why of this is important and well worth the third of a billion dollars spent on this, in my opinion.
[01:16:05.000 --> 01:16:11.000] The goal was to see if this technique was a viable method for deflecting an asteroid.
[01:16:11.000 --> 01:16:14.000] And the answer is a resounding yes.
[01:16:14.000 --> 01:16:19.000] We've proven for the first time that we can divert an asteroid and by even more than we anticipated.
[01:16:19.000 --> 01:16:24.000] So first of all, why did we even hit an asteroid that was orbiting another asteroid?
[01:16:24.000 --> 01:16:27.000] When I first saw that, I was like, why? Why do we do that? That's weird.
[01:16:27.000 --> 01:16:29.000] They can't be too common.
[01:16:29.000 --> 01:16:38.000] Actually, this was the perfect spot to do it, because it made determining changes to the moonlet's small orbit around Didymos much quicker,
[01:16:38.000 --> 01:16:41.000] easier, and more accurate than measuring an orbit around the Sun, right?
[01:16:41.000 --> 01:16:44.000] Because if it's going in these little orbits that are quick,
[01:16:44.000 --> 01:16:53.000] if we then mess with that moonlet, we can easily say, oh, its orbit around the primary is much shorter or much longer.
[01:16:53.000 --> 01:16:58.000] We can exquisitely determine, you know, how much the orbit has changed.
[01:16:58.000 --> 01:17:01.000] Imagine if this thing was in orbit around the Sun and we have to, oh, boy,
[01:17:01.000 --> 01:17:05.000] I think it would take much longer to determine that with any accuracy.
[01:17:05.000 --> 01:17:11.000] So now before the mission, NASA defined the minimum change that would be declared a success,
[01:17:11.000 --> 01:17:16.000] the minimum change at which they would break out the booze: 73 seconds.
[01:17:16.000 --> 01:17:21.000] So if the orbit changed by 73 seconds, that would be a success.
[01:17:21.000 --> 01:17:24.000] 72 seconds, not so much.
[01:17:24.000 --> 01:17:29.000] So now the actual change, give or take two minutes, was a whopping 32 minutes,
[01:17:29.000 --> 01:17:33.000] 25 times larger than their smallest success estimation.
[01:17:33.000 --> 01:17:42.000] So to describe it more fully, an orbit that once took 11 hours and 55 minutes now takes 11 hours and 23 minutes.
[01:17:42.000 --> 01:17:44.000] That's what the impact did.
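A quick sanity check on those numbers, using only the figures quoted above:

```python
# Checking the arithmetic on the orbital-period change quoted above.
before = 11 * 3600 + 55 * 60   # 11 h 55 min pre-impact period, in seconds
after = 11 * 3600 + 23 * 60    # 11 h 23 min post-impact period
change = before - after        # 1920 s
threshold = 73                 # NASA's minimum change for success, in seconds

print(change // 60, "minutes")            # 32 minutes
print(round(change / threshold, 1), "x")  # ~26x; the "25 times" above, rounded
```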
[01:17:44.000 --> 01:17:52.000] One of the interesting parts of this is that, you would think, it hit it and slowed it down or changed the orbit,
[01:17:52.000 --> 01:17:58.000] but it wasn't the pure kinetic impact alone that was so impactful, if you will.
[01:17:58.000 --> 01:18:04.000] The most spectacular visual component to the crash was the ejecta plume leaping off the asteroid.
[01:18:04.000 --> 01:18:08.000] If you've seen the videos, you could see this plume just flying away.
[01:18:08.000 --> 01:18:14.000] It's a large cloud of rocks and smaller particles that were launched away from the asteroid.
[01:18:14.000 --> 01:18:19.000] And I use the word launch there on purpose because there's similarities to a rocket plume.
[01:18:19.000 --> 01:18:26.000] That plume of rocks and debris increases the transfer of momentum from the crash itself.
[01:18:26.000 --> 01:18:29.000] So the plume actually adds to it.
[01:18:29.000 --> 01:18:34.000] So one of the NASA boffins said, it's too soon to say, there's a lot of moving parts in this calculation,
[01:18:34.000 --> 01:18:43.000] but it looks like the recoil from the ejecta blasted off the surface was a substantial contributor to the overall push given to the asteroid.
[01:18:43.000 --> 01:18:45.000] So it's kind of like a double whammy.
[01:18:45.000 --> 01:18:52.000] You've got the impact and then you've got the ejecta launched off the body that also transfers more momentum.
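This "double whammy" is conventionally captured with a momentum enhancement factor, usually written as beta: the total momentum delivered is beta times the probe's own momentum, where beta = 1 would mean the impact alone and beta > 1 means the ejecta recoil added push. A minimal sketch; the numbers below are assumptions, not figures from the episode (DART's ~570 kg mass and ~6.1 km/s speed are widely reported, and the ~4.3 billion kg mass of Dimorphos is an estimate):

```python
# Rough sketch of momentum transfer with an ejecta "rocket" contribution.
# All numbers are assumptions, not figures from the episode.
m_probe = 570.0      # kg, approximate DART mass at impact
v_probe = 6100.0     # m/s, approximate impact speed
M_target = 4.3e9     # kg, estimated mass of Dimorphos

for beta in (1.0, 3.0):  # beta = 1: impact alone; beta > 1: ejecta adds push
    dv = beta * m_probe * v_probe / M_target  # velocity change of the moonlet
    print(f"beta = {beta}: delta-v ~ {dv * 1000:.1f} mm/s")
```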
[01:18:52.000 --> 01:18:55.000] Now the research is not over though for the DART team.
[01:18:55.000 --> 01:18:57.000] They're still working on it.
[01:18:57.000 --> 01:19:05.000] They're still studying the plume itself, which is still visible, to determine the structure of the moonlet and, hopefully, its mass and density.
[01:19:05.000 --> 01:19:12.000] That would then be fed into the models describing how such objects react to such impacts, which is important.
[01:19:12.000 --> 01:19:17.000] Because if we see one coming in the future, then we'll have a much better sense of, you know,
[01:19:17.000 --> 01:19:22.000] what kind of impact would make the desired changes to its orbit and miss the Earth.
[01:19:22.000 --> 01:19:28.000] Well, we're also going to send another ship there to look at the crater and see, you know,
[01:19:28.000 --> 01:19:32.000] what kind of crater it created, like how deep it was, how big it was.
[01:19:32.000 --> 01:19:38.000] And that'll help us figure out how future versions of this would behave and what to expect.
[01:19:38.000 --> 01:19:40.000] Yeah, that's cool, Jay.
[01:19:40.000 --> 01:19:46.000] Now we know, and I'm sure you've all seen a lot of those cool close-up video images just before the impact,
[01:19:46.000 --> 01:19:49.000] that the moonlet appears to be basically a huge rubble pile.
[01:19:49.000 --> 01:19:53.000] We don't think it's a solid piece of rock.
[01:19:53.000 --> 01:19:55.000] It's basically a huge rubble pile, it seems.
[01:19:55.000 --> 01:20:00.000] And the other major thing about the surface is that, if you look at it,
[01:20:00.000 --> 01:20:05.000] there were very few fine-grained particles, like dust, on Didymoon.
[01:20:05.000 --> 01:20:13.000] If you look at other asteroids that we have landed probes on, you could see that there's definitely a lot of dust there.
[01:20:13.000 --> 01:20:15.000] But not on this moonlet.
[01:20:15.000 --> 01:20:24.000] And that's probably because it's so small that the gravity is just so weak that any time the moonlet was hit by anything,
[01:20:24.000 --> 01:20:33.000] these low-mass, really tiny particles would be ejected at the highest velocities and just escape the meager gravity.
[01:20:33.000 --> 01:20:39.000] So basically, any dust that might have been there has just been knocked off over the eons.
[01:20:39.000 --> 01:20:43.000] So that's why it kind of looked so different from other more massive asteroids.
[01:20:43.000 --> 01:20:49.000] The future should also see more tests, especially, like Jay said, getting a close-up to see what the aftermath looks like.
[01:20:49.000 --> 01:20:57.000] But also they're going to be doing other research on other asteroid deflection techniques, which I think is critical.
[01:20:57.000 --> 01:21:01.000] And so if you look at these techniques, they've been called three different things.
[01:21:01.000 --> 01:21:06.000] There are three major ideas or techniques that you can use to divert an asteroid.
[01:21:06.000 --> 01:21:09.000] There's the nuke, the kick and the tug.
[01:21:09.000 --> 01:21:11.000] So those are the three different ways.
[01:21:11.000 --> 01:21:14.000] The nuke technique is self explanatory.
[01:21:14.000 --> 01:21:22.000] In 2007, NASA actually wrote a report to Congress and they said that the best method to deflect an asteroid could be using a nuke in space.
[01:21:22.000 --> 01:21:25.000] And it seems clear that in some scenarios a nuke would be ideal.
[01:21:25.000 --> 01:21:31.000] But the obvious risk is that all the pieces could still hit the Earth and cause even more widespread damage.
[01:21:31.000 --> 01:21:33.000] There could be radiation problems.
[01:21:33.000 --> 01:21:39.000] So there's a lot of things that you'd have to be very careful of before you would throw a nuke at something in space.
[01:21:39.000 --> 01:21:44.000] But it could be extremely useful in specific scenarios.
[01:21:44.000 --> 01:21:49.000] NASA's second-best option was called the kick technique.
[01:21:49.000 --> 01:21:51.000] And that involves a kinetic impactor like DART.
[01:21:51.000 --> 01:21:53.000] So that's what we've been describing.
[01:21:53.000 --> 01:21:57.000] And then the tug idea has also been called a gravity tractor.
[01:21:57.000 --> 01:22:07.000] And these probes would fly along next to an asteroid over months or even years to change its orbit using only their gravitational attraction.
[01:22:07.000 --> 01:22:15.000] And of course, you'd have to get out there, you know, with a good sized mass and just hang out there and fly along with it for a long time.
[01:22:15.000 --> 01:22:23.000] Unfortunately, current thinking is that asteroids over 500 feet might not respond well enough to this technique,
[01:22:23.000 --> 01:22:35.000] which would be unfortunate, because the asteroids greater than 500 feet are the ones we really want to take care of. Those are obviously the ones that are going to do really significant damage.
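For a sense of scale, here is an order-of-magnitude sketch of the tug. Every number below is hypothetical, since the episode gives none: assume a 20-tonne spacecraft station-keeping 200 meters from the asteroid's center.

```python
# Order-of-magnitude sketch of the "tug" (gravity tractor). All numbers are
# hypothetical assumptions: a 20-tonne spacecraft hovering 200 m away.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
m_spacecraft = 2.0e4   # kg
r = 200.0              # m

accel = G * m_spacecraft / r**2     # pull on the asteroid, ~3.3e-11 m/s^2
delta_v = accel * (10 * 3.156e7)    # accumulated over 10 years of flying along
print(f"delta-v after 10 years: ~{delta_v * 100:.1f} cm/s")  # ~1.1 cm/s
# Tiny, which is why the tug needs years of lead time, and why it may not
# move the largest asteroids enough.
```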
[01:22:35.000 --> 01:22:40.000] Well, Bob, unless the ship has a black hole drive, then they could just use the gravity from that.
[01:22:40.000 --> 01:22:43.000] Well, obviously, I thought that was too obvious to mention.
[01:22:43.000 --> 01:22:49.000] But then there's a technique that's known just by a few people, and it's called the Amerindian Tiberius technique.
[01:22:49.000 --> 01:22:51.000] Steve, have you heard of this one?
[01:22:51.000 --> 01:22:52.000] No.
[01:22:52.000 --> 01:22:53.000] If you thought about it, you would.
[01:22:53.000 --> 01:23:00.000] That uses a mysterious obelisk to emit a blue beam to push away an asteroid.
[01:23:00.000 --> 01:23:05.000] But that's pretty far in the future, though, and I will end it there with that silly Star Trek reference.
[01:23:05.000 --> 01:23:15.000] So, Bob, I think one of the key things here is that we need more ways to detect asteroids earlier in the game.
[01:23:15.000 --> 01:23:16.000] Absolutely.
[01:23:16.000 --> 01:23:19.000] Because, you know, that's one huge factor.
[01:23:19.000 --> 01:23:20.000] And comets.
[01:23:20.000 --> 01:23:25.000] You know, if you get to it earlier, right, so if you think about like changing the course of a marble, right,
[01:23:25.000 --> 01:23:30.000] in the first few inches of that course change, you might not even be able to visually see it.
[01:23:30.000 --> 01:23:40.000] But over a long distance, you know, if you change a marble's course, it could actually be a very huge difference in its path.
[01:23:40.000 --> 01:23:47.000] So the earlier that we detect and manipulate asteroids, the better chance we'll have of having it not hit Earth, right?
[01:23:47.000 --> 01:23:48.000] That's the whole thing.
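A rough illustration of the marble point: to zeroth order, a small velocity change drifts into a miss distance of delta-v times time. The 1 cm/s nudge below is a hypothetical number, not one from the episode; real orbital mechanics can amplify or shrink this, but the linear drift is the zeroth-order picture.

```python
# Rough illustration of why lead time matters: displacement ~ delta_v * time.
# The 1 cm/s nudge is a hypothetical assumption.
delta_v = 0.01                  # m/s
seconds_per_year = 3.156e7

for years in (1, 10, 20):
    miss_km = delta_v * years * seconds_per_year / 1000.0
    print(f"{years:>2} years of lead time -> ~{miss_km:,.0f} km of drift")
# ~316 km after 1 year, ~3,156 km after 10, ~6,312 km after 20, the last
# comparable to Earth's ~6,371 km radius.
```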
[01:23:48.000 --> 01:23:55.000] And we want to put it into a safer trajectory. Or it could be the type of thing where, you know,
[01:23:55.000 --> 01:24:01.000] an asteroid flies by Earth and then comes back in, like, five more years, and that's the hit.
[01:24:01.000 --> 01:24:04.000] The first couple of flybys aren't the problem.
[01:24:04.000 --> 01:24:06.000] So we have to really know what we're doing.
[01:24:06.000 --> 01:24:12.000] And if we have, say, 10 years to deflect this asteroid, you know, with technologies like this,
[01:24:12.000 --> 01:24:17.000] if we keep putting money and research into this, we will get there.
[01:24:17.000 --> 01:24:18.000] But we've got to take it.
[01:24:18.000 --> 01:24:20.000] We've got to take it seriously.
[01:24:20.000 --> 01:24:22.000] We've got to be willing to put the money in.
[01:24:22.000 --> 01:24:23.000] And this is a global thing.
[01:24:23.000 --> 01:24:24.000] This shouldn't just be NASA.
[01:24:24.000 --> 01:24:29.000] This should be NASA working with people all around the world because everybody's at stake here.
[01:24:29.000 --> 01:24:33.000] Yeah, this is one of those existential threats that will happen.
[01:24:33.000 --> 01:24:35.000] We're going to get whacked at some point.
[01:24:35.000 --> 01:24:36.000] It could be next week.
[01:24:36.000 --> 01:24:38.000] It could be in 300 years.
[01:24:38.000 --> 01:24:42.000] We're going to get walloped, and we could actually do something about it.
[01:24:42.000 --> 01:24:44.000] We can do something.
[01:24:44.000 --> 01:24:50.000] But I love the fact that like our best technology is just to crash something into it.
[01:24:50.000 --> 01:24:51.000] We're just going to hit it with it.
[01:24:51.000 --> 01:24:52.000] We're good at that.
[01:24:52.000 --> 01:24:53.000] Heavy and fast.
[01:24:53.000 --> 01:24:54.000] That's that's the method.
[01:24:54.000 --> 01:24:55.000] But it makes sense.
[01:24:55.000 --> 01:24:56.000] There's still momentum.
[01:24:56.000 --> 01:24:57.000] Lots of ejecta.
[01:24:57.000 --> 01:25:01.000] Yeah, that is cool, though, that the ejecta basically becomes like a little bit of a rocket blast.
[01:25:01.000 --> 01:25:03.000] Yeah, exactly.
[01:25:03.000 --> 01:25:04.000] So cool.
[01:25:04.000 --> 01:25:05.000] All right.
[01:25:05.000 --> 01:25:06.000] Thanks, Bob.
Who's That Noisy? ()
New Noisy ()
[_short_vague_description_of_Noisy]
[01:25:06.000 --> 01:25:07.000] All right, Jay.
[01:25:07.000 --> 01:25:08.000] It's who's that noisy time.
[01:25:08.000 --> 01:25:09.000] All right, guys.
[01:25:09.000 --> 01:25:25.000] Last week I played this noisy.
[01:25:25.000 --> 01:25:26.000] Crazy.
[01:25:26.000 --> 01:25:27.000] What is that?
[01:25:27.000 --> 01:25:28.000] That's it.
[01:25:28.000 --> 01:25:29.000] That's it.
[01:25:29.000 --> 01:25:30.000] Someone opening a can of food.
[01:25:30.000 --> 01:25:31.000] Right.
[01:25:31.000 --> 01:25:32.000] Can open noise.
[01:25:32.000 --> 01:25:35.000] And then you're feeding your pet dinosaur, and the dinosaur's reacting to the food it's
[01:25:35.000 --> 01:25:36.000] smelling in the can.
[01:25:36.000 --> 01:25:40.000] I thought you were going to say that somebody opening up a can of whoop ass.
[01:25:40.000 --> 01:25:42.000] I think that's some kind of animal.
[01:25:42.000 --> 01:25:43.000] That's my guess.
[01:25:43.000 --> 01:25:47.000] It's some kind of a mammal that is weird.
[01:25:47.000 --> 01:25:48.000] That's my guess.
[01:25:48.000 --> 01:25:49.000] But I picked this.
[01:25:49.000 --> 01:25:54.000] I picked this because first off, it was sent in by Visto Tutti.
[01:25:54.000 --> 01:25:55.000] Visto Tutti.
[01:25:55.000 --> 01:26:00.000] Hopefully someday he and I will meet and we will be able to talk who's that noisy.
[01:26:00.000 --> 01:26:01.000] We'll just talk.
[01:26:01.000 --> 01:26:03.000] We'll talk with clicks and whistles.
[01:26:03.000 --> 01:26:06.000] They'll speak in metaphor.
[01:26:06.000 --> 01:26:11.000] So anyway, it has a Halloween-type weird thing going on.
[01:26:11.000 --> 01:26:12.000] I thought this would make Bob happy.
[01:26:12.000 --> 01:26:15.000] Apparently Bob doesn't care.
[01:26:15.000 --> 01:26:16.000] It was cool.
[01:26:16.000 --> 01:26:17.000] I did enjoy it.
[01:26:17.000 --> 01:26:24.000] There are times when I literally get hundreds of people writing in trying to
[01:26:24.000 --> 01:26:25.000] answer.
[01:26:25.000 --> 01:26:29.000] This time, I got two people who responded to this noisy.
[01:26:29.000 --> 01:26:32.000] I think it scared a lot of people away.
[01:26:32.000 --> 01:26:33.000] Was it Hans and Franz?
[01:26:33.000 --> 01:26:35.000] Or you've lost all your fans?
[01:26:35.000 --> 01:26:38.000] Was it Frick and Frack?
[01:26:38.000 --> 01:26:40.000] Ben and Jerry?
[01:26:40.000 --> 01:26:42.000] It was slim and none.
[01:26:42.000 --> 01:26:44.000] Jack and...
[01:26:44.000 --> 01:26:46.000] The first person was Michael Blaney.
[01:26:46.000 --> 01:26:50.000] Michael Blaney said, hi Jake, got a very Jurassic Park theme to it.
[01:26:50.000 --> 01:26:54.000] With that in mind, I'm guessing the sulfur-crested cockatoo.
[01:26:54.000 --> 01:26:58.000] Out of all the interesting looking and sounding birds found in Australia, that one will haunt
[01:26:58.000 --> 01:27:01.000] your dreams with its screech.
[01:27:01.000 --> 01:27:05.000] Michael, that's a good guess because birds make most of the weird sounds in the world,
[01:27:05.000 --> 01:27:07.000] but this is not a bird.
[01:27:07.000 --> 01:27:08.000] It is not a bird.
[01:27:08.000 --> 01:27:15.000] Another person wrote in, Marcin Bukowski, and he said, my guess is it was a gun, perhaps
[01:27:15.000 --> 01:27:20.000] a rifle being loaded and possibly fired, recorded from within its interior.
[01:27:20.000 --> 01:27:21.000] All the best to everyone.
[01:27:21.000 --> 01:27:24.000] So that is also not correct.
[01:27:24.000 --> 01:27:26.000] So I'm just going to get right to it.
[01:27:26.000 --> 01:27:29.000] Steve, you were very, very close with your guess.
[01:27:29.000 --> 01:27:31.000] This is actually a mammal.
[01:27:31.000 --> 01:27:33.000] It's a weird mammal?
[01:27:33.000 --> 01:27:35.000] I'll test you guys real quick.
[01:27:35.000 --> 01:27:37.000] So it's a mammal.
[01:27:37.000 --> 01:27:38.000] I'll give you another hint.
[01:27:38.000 --> 01:27:40.000] It lives in Asia and Africa.
[01:27:40.000 --> 01:27:43.000] It likes...
[01:27:43.000 --> 01:27:44.000] Pangolin.
[01:27:44.000 --> 01:27:45.000] Is it a pangolin?
[01:27:45.000 --> 01:27:46.000] Yes, it is.
[01:27:46.000 --> 01:27:47.000] Very good, Kara.
[01:27:47.000 --> 01:27:48.000] Excellent.
[01:27:48.000 --> 01:27:49.000] I was like, what mammal lives in Asia now?
[01:27:49.000 --> 01:27:51.000] That's a made-up name.
[01:27:51.000 --> 01:27:54.000] Pangolins are known as scaly anteaters.
[01:27:54.000 --> 01:27:59.000] They're mammals, but they have a very scaly exterior.
[01:27:59.000 --> 01:28:01.000] They have extraordinarily long and sticky tongues.
[01:28:01.000 --> 01:28:06.000] They like to eat ants and those types of bugs.
[01:28:06.000 --> 01:28:10.000] They prefer sandy soils, and they can be found in woodlands and savannas.
[01:28:10.000 --> 01:28:14.000] They like to be near water, and they're found mostly in Asia and Africa.
[01:28:14.000 --> 01:28:17.000] And yeah, they make a very, very weird noise.
[01:28:17.000 --> 01:28:19.000] Real quick, just listen to that noise again.
[01:28:19.000 --> 01:28:29.000] I mean, it has like an inhuman sound to it.
[01:28:29.000 --> 01:28:31.000] Oh yeah, it's horrifying.
[01:28:31.000 --> 01:28:34.000] Yeah, that would definitely put me on edge a little bit.
[01:28:34.000 --> 01:28:37.000] You could use that as an audio track.
[01:28:37.000 --> 01:28:42.000] Like a foley artist could use that for like an alien or a predator or whatever.
[01:28:42.000 --> 01:28:44.000] Yeah, the beginnings of a cool sound effect.
[01:28:44.000 --> 01:28:46.000] I'm sure they have, Steve.
[01:28:46.000 --> 01:28:51.000] But what I'd like to know is if anybody out there has actually interacted with one,
[01:28:51.000 --> 01:28:53.000] like how dangerous are they?
[01:28:53.000 --> 01:28:55.000] What's their demeanor?
[01:28:55.000 --> 01:28:57.000] Well, they're scared of people.
[01:28:57.000 --> 01:28:58.000] Smart.
[01:28:58.000 --> 01:29:03.000] When I was in Africa, was it this very last time?
[01:29:03.000 --> 01:29:07.000] We were in the Limpopo region of South Africa,
[01:29:07.000 --> 01:29:13.000] and we stopped by a rehab facility where sadly a pangolin had been recovered, right?
[01:29:13.000 --> 01:29:18.000] These are the most trafficked animals on the planet, so it had been recovered from a poacher.
[01:29:18.000 --> 01:29:22.000] Yeah, it's the keratin shells, traditional Chinese medicine, man.
[01:29:22.000 --> 01:29:24.000] Yeah, people like their pangolin scales.
[01:29:24.000 --> 01:29:25.000] It's disgusting.
[01:29:25.000 --> 01:29:30.000] And so it had been in a box and it had tried to claw its way out of the box
[01:29:30.000 --> 01:29:32.000] and broke all of its fingernails off.
[01:29:32.000 --> 01:29:33.000] So it was in a lot of pain.
[01:29:33.000 --> 01:29:34.000] Yeah, it was really sad.
[01:29:34.000 --> 01:29:38.000] And so I was there with the vet and the pangolin was anesthetized,
[01:29:38.000 --> 01:29:42.000] but I got to like touch it and be really near it while they were doing a surgery on it.
[01:29:42.000 --> 01:29:43.000] So that was really, really cool.
[01:29:43.000 --> 01:29:47.000] I've also been out in the field with a pangolin researcher who was looking at them.
[01:29:47.000 --> 01:29:50.000] I've never held one in the field or anything.
[01:29:50.000 --> 01:29:52.000] You can catch them and hold them.
[01:29:52.000 --> 01:29:55.000] But yeah, their behaviors, they're not interested in people.
[01:29:55.000 --> 01:29:58.000] And they're not people predators.
[01:29:58.000 --> 01:30:01.000] I wouldn't be afraid of a pangolin unless you heard it and didn't see it.
[01:30:01.000 --> 01:30:03.000] Yeah, it makes sense.
[01:30:03.000 --> 01:30:05.000] Yeah, they're really cute actually.
[01:30:05.000 --> 01:30:06.000] Do they get rabies?
[01:30:06.000 --> 01:30:08.000] I don't know if they get rabies.
[01:30:08.000 --> 01:30:09.000] They get COVID.
[01:30:09.000 --> 01:30:10.000] We know that.
[01:30:10.000 --> 01:30:11.000] Yeah, that's right.
[01:30:11.000 --> 01:30:13.000] Yeah, we know that they carry coronaviruses.
[01:30:13.000 --> 01:30:14.000] So don't mess with them.
[01:30:14.000 --> 01:30:16.000] That's what you learned in today's show.
[01:30:16.000 --> 01:30:17.000] Don't mess with my wild animals, people.
[01:30:17.000 --> 01:30:19.000] Never under any circumstances.
[01:30:19.000 --> 01:30:20.000] Hello, nature trying to kill you.
[01:30:20.000 --> 01:30:21.000] Remember that?
[01:30:21.000 --> 01:30:22.000] All right.
[01:30:22.000 --> 01:30:24.000] Listen, you guys, I have something special for you.
[01:30:24.000 --> 01:30:28.000] I made this for the six hour live stream that we did.
[01:30:28.000 --> 01:30:29.000] What was that?
[01:30:29.000 --> 01:30:32.000] Like a month and a half ago at this point?
[01:30:32.000 --> 01:30:36.000] It didn't make it in because, you know, we literally talked for six hours.
[01:30:36.000 --> 01:30:38.000] We have an endless amount of content to give the world.
[01:30:38.000 --> 01:30:40.000] So we didn't play it.
[01:30:40.000 --> 01:30:41.000] But I did.
[01:30:41.000 --> 01:30:45.000] I did create this and I will I will tell you before I play it for you.
[01:30:45.000 --> 01:30:54.000] This is 100 percent created by, you know, I'd like to say an artificial intelligence, but I feel like that's giving it too much credit.
[01:30:54.000 --> 01:30:56.000] It's basically a deep fake, but I want to play it for you.
[01:30:56.000 --> 01:31:00.000] I think you guys will really enjoy it and we'll talk about what you think afterwards.
[01:31:00.000 --> 01:31:03.000] Hey, guys, let's talk about how much we love meatballs.
[01:31:03.000 --> 01:31:04.000] I mean, science.
[01:31:04.000 --> 01:31:05.000] Bob, what do you think?
[01:31:05.000 --> 01:31:08.000] First, I'd like to say, Jay, you don't love Halloween enough.
[01:31:08.000 --> 01:31:09.000] I love science so much.
[01:31:09.000 --> 01:31:11.000] It makes my skin tingle.
[01:31:11.000 --> 01:31:15.000] Well, I do love science, but I also love to think about taxes and goats.
[01:31:15.000 --> 01:31:17.000] I'm a doctor, so you should believe everything I say.
[01:31:17.000 --> 01:31:18.000] OK.
[01:31:18.000 --> 01:31:21.000] Did you guys know that Halloween is a known cure for cancer?
[01:31:21.000 --> 01:31:22.000] Bob, very funny.
[01:31:22.000 --> 01:31:24.000] Everyone knows that Halloween is fake.
[01:31:24.000 --> 01:31:26.000] How does this relate to me being a doctor?
[01:31:26.000 --> 01:31:27.000] Hold on a second.
[01:31:27.000 --> 01:31:29.000] I think something weird is going on here.
[01:31:29.000 --> 01:31:31.000] Why do we all sound so robotic?
[01:31:31.000 --> 01:31:32.000] Did it finally happen?
[01:31:32.000 --> 01:31:35.000] Artificial intelligence has digitized all humans.
[01:31:35.000 --> 01:31:37.000] I don't feel digital.
[01:31:37.000 --> 01:31:38.000] Well, look on the bright side.
[01:31:38.000 --> 01:31:41.000] Maybe this will help me do taxes faster.
[01:31:41.000 --> 01:31:42.000] I'm OK with this.
[01:31:42.000 --> 01:31:43.000] This is absurd.
[01:31:43.000 --> 01:31:45.000] Artificial intelligence says don't need doctors.
[01:31:45.000 --> 01:31:47.000] Christ, now what am I going to do?
[01:31:47.000 --> 01:31:50.000] At least we are all still together, even if it's in a computer somewhere.
[01:31:50.000 --> 01:31:55.000] Maybe now I will be able to pronounce hard words in people's last names.
[01:31:55.000 --> 01:31:56.000] Wow.
[01:31:56.000 --> 01:31:59.000] Jay, you are so funny.
[01:31:59.000 --> 01:32:00.000] Oh, my God.
[01:32:00.000 --> 01:32:05.000] So I had so much fun playing around with our voices.
[01:32:05.000 --> 01:32:09.000] Kara, I was trying to get your thing done, but I didn't get it done in time.
[01:32:09.000 --> 01:32:11.000] Remember I was being pushy with all you guys?
[01:32:11.000 --> 01:32:12.000] Yeah, you were like, send us a sample.
[01:32:12.000 --> 01:32:15.000] And I was like, I don't want them to own my voice.
[01:32:15.000 --> 01:32:17.000] I'm scared to give it to you.
[01:32:17.000 --> 01:32:18.000] Where's the fine print?
[01:32:18.000 --> 01:32:22.000] I want to run this by my agent and my attorney.
[01:32:22.000 --> 01:32:28.000] What I found out, though, was because we do this show so often and so consistently,
[01:32:28.000 --> 01:32:33.000] I can create very quickly very funny dialogue between us.
[01:32:33.000 --> 01:32:38.000] And the fact that I had your voices, it made it so much more visceral for me.
[01:32:38.000 --> 01:32:42.000] Like when I was doing it, I'm like, it just was very easy.
[01:32:42.000 --> 01:32:46.000] I could do two hours of that without a problem, and it would all be kind of like that.
[01:32:46.000 --> 01:32:48.000] But I thought it was really funny.
[01:32:48.000 --> 01:32:50.000] Bob, here's what I found out.
[01:32:50.000 --> 01:32:53.000] First, I used a program called Descript.
[01:32:53.000 --> 01:32:57.000] And this is not intended to really do a deep fake.
[01:32:57.000 --> 01:33:04.000] This is more intended to make small word changes or fixes to a recording.
[01:33:04.000 --> 01:33:08.000] So let's say that I get a really good sample file from everybody.
[01:33:08.000 --> 01:33:13.000] I think I only used about 20 minutes of everybody in past shows.
[01:33:13.000 --> 01:33:16.000] I just went into our files and I pulled us talking.
[01:33:16.000 --> 01:33:20.000] But if I gave it more like three plus hours of any one of us talking,
[01:33:20.000 --> 01:33:23.000] it would be able to sound a lot more authentic.
[01:33:23.000 --> 01:33:29.000] And then what you could do is you would play your audio file into the Descript software,
[01:33:29.000 --> 01:33:32.000] and then it would textualize everything.
[01:33:32.000 --> 01:33:35.000] It'll give you the text of everything that's being said.
[01:33:35.000 --> 01:33:39.000] And then I could go in and make textual changes.
[01:33:39.000 --> 01:33:43.000] And it will overlay the AI deep fake voice.
[01:33:43.000 --> 01:33:47.000] And most of the time, you probably won't really notice it,
[01:33:47.000 --> 01:33:52.000] especially if it's not during a sentence that has any dramatic value to it.
[01:33:52.000 --> 01:33:57.000] Like if it's just normal talking like I'm doing right now, it could do it without a problem.
[01:33:57.000 --> 01:34:03.000] But in this particular program, and with the amount of training that they're accepting,
[01:34:03.000 --> 01:34:07.000] it's not really good at, like, if you do crazy things like this with your voice,
[01:34:07.000 --> 01:34:09.000] you really can't tell it to do that.
[01:34:09.000 --> 01:34:12.000] You can't say, I want you to over-enunciate.
[01:34:12.000 --> 01:34:16.000] They have a little bit of a tweaking method where you can...
[01:34:16.000 --> 01:34:17.000] Oh, I'm tweaked.
[01:34:17.000 --> 01:34:24.000] So what I could do is I could say, here's a sample of Steve's voice sounding excited.
[01:34:24.000 --> 01:34:29.000] Here's a sample of Steve's voice sounding sad, right?
[01:34:29.000 --> 01:34:32.000] And I label it and I let it absorb that.
[01:34:32.000 --> 01:34:36.000] Then I could go in and apply that to something that I typed,
[01:34:36.000 --> 01:34:40.000] and it's supposed to get you a little bit closer to what the real thing would be.
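A hypothetical sketch of the text-based editing idea described here. These are not Descript's actual internals or API, just the general shape of the workflow: diff the old transcript against the edited one to find which words need to be re-synthesized in the cloned voice, leaving the rest of the recording untouched.

```python
# Hypothetical sketch (not Descript's real API) of text-based audio editing:
# diff old vs. edited transcript to find the span to re-synthesize.
import difflib

old = "I used a program that makes small word changes to a recording".split()
new = "I used a program that makes tiny word fixes to a recording".split()

for op, i1, i2, j1, j2 in difflib.SequenceMatcher(a=old, b=new).get_opcodes():
    if op != "equal":
        print(f"{op}: synthesize {new[j1:j2]} in place of {old[i1:i2]}")
# Only the changed words get generated audio; everything else plays back
# verbatim, which is why small, undramatic edits are hard to notice.
```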
[01:34:40.000 --> 01:34:44.000] So you fed it, oh, this is Steve sounding like a robot.
[01:34:44.000 --> 01:34:51.000] No, no, no, that's what you get from about 20 minutes of voice training.
[01:34:51.000 --> 01:34:57.000] I used an artificial intelligence that is not intended to completely mimic human speech.
[01:34:57.000 --> 01:35:01.000] There aren't a lot of knobs for me to turn to tweak everything,
[01:35:01.000 --> 01:35:03.000] but there is software that does this.
[01:35:03.000 --> 01:35:06.000] There is software that does this really well.
[01:35:06.000 --> 01:35:09.000] And it's only getting better by leaps and bounds every year,
[01:35:09.000 --> 01:35:13.000] like deepfake technology, video and audio is getting much, much, much, much better.
[01:35:13.000 --> 01:35:17.000] Anybody that's a listener or watches The Mandalorian,
[01:35:17.000 --> 01:35:22.000] the first time that, can I be spoiler free here or what?
[01:35:22.000 --> 01:35:25.000] Do I have to worry about spoilers after this thing aired a couple of years ago?
[01:35:25.000 --> 01:35:26.000] No, we're good.
[01:35:26.000 --> 01:35:30.000] They showed a deepfake of Luke Skywalker,
[01:35:30.000 --> 01:35:33.000] and then the very next season, about a year and a half later,
[01:35:33.000 --> 01:35:36.000] they showed a newer version of Luke Skywalker,
[01:35:36.000 --> 01:35:40.000] and it was an order of magnitude better than the first one that they did.
[01:35:40.000 --> 01:35:45.000] It was almost seamless until he starts talking, and that's the hardest part.
[01:35:45.000 --> 01:35:48.000] But it still was over the waterline.
[01:35:48.000 --> 01:35:53.000] So bottom line is, this technology is here.
[01:35:53.000 --> 01:35:56.000] Consumers have access to versions of it.
[01:35:56.000 --> 01:35:58.000] It's exciting. It's really cool.
[01:35:58.000 --> 01:36:00.000] It's also a little bit scary.
[01:36:00.000 --> 01:36:05.000] We really have a lot to work out as far as all the privacy and all the rights and all that stuff.
[01:36:05.000 --> 01:36:08.000] It is unbelievably complicated.
[01:36:08.000 --> 01:36:10.000] But I've got to tell you guys,
[01:36:10.000 --> 01:36:13.000] if you have a group of friends that you could get to do this,
[01:36:13.000 --> 01:36:18.000] if you can get them to give you 20 minutes to an hour of them talking,
[01:36:18.000 --> 01:36:20.000] and you train this thing to do it,
[01:36:20.000 --> 01:36:25.000] you won't believe how much fun it is to get the people you know to say crazy stuff.
[01:36:25.000 --> 01:36:28.000] Definitely something fun to play around with if you have some free time.
[01:36:28.000 --> 01:36:30.000] The name of the app, again, is Descript.
[01:36:30.000 --> 01:36:32.000] All right, I'm going to jump back into who's that noisy.
[01:36:32.000 --> 01:36:33.000] Let's finish this up.
[01:36:33.000 --> 01:36:37.000] So thanks, everyone, the two people who sent in a guess for this week's noisy.
[01:36:37.000 --> 01:36:39.000] I got a new one for you.
[01:36:39.000 --> 01:36:42.000] Boy, don't I wish that I had a really good deep fake of myself,
[01:36:42.000 --> 01:36:45.000] and this is the deep fake right now that I was tricking you guys with,
[01:36:45.000 --> 01:36:47.000] but it isn't. Someday.
[01:36:47.000 --> 01:36:48.000] Someday.
[01:36:48.000 --> 01:37:13.000] This noisy was sent in by a listener named David Knott.
[01:37:13.000 --> 01:37:15.000] I know that sounds a little disturbing.
[01:37:15.000 --> 01:37:18.000] I'm definitely on a Halloween theme here for Bob.
[01:37:18.000 --> 01:37:19.000] Attaboy.
[01:37:19.000 --> 01:37:22.000] Definitely trying to find things that are weird.
[01:37:22.000 --> 01:37:24.000] So if you think you know what this week's noisy is,
[01:37:24.000 --> 01:37:30.000] or you heard something cool, email me at wtn at theskepticsguide.org.
[01:37:30.000 --> 01:37:33.000] And don't forget, if you email me at that address,
[01:37:33.000 --> 01:37:35.000] you could simply attach a file to that.
[01:37:35.000 --> 01:37:37.000] You're not going through our website or anything.
[01:37:37.000 --> 01:37:39.000] You could attach any sound file that you have.
[01:37:39.000 --> 01:37:42.000] That's the best way to send in a who's that noisy,
[01:37:42.000 --> 01:37:45.000] and that's the one that I check 100% of the emails that come in.
[01:37:45.000 --> 01:37:47.000] Any other place that you send who's that noisy,
[01:37:47.000 --> 01:37:49.000] I might not be seeing it.
Announcements ()
[01:37:49.000 --> 01:37:52.000] And I'd like to make sure everybody out there knows that we have
[01:37:52.000 --> 01:37:55.000] four shows coming up in Arizona.
[01:37:55.000 --> 01:37:58.000] This is a great way to show your support of the work that we do,
[01:37:58.000 --> 01:38:02.000] and it's also a great way to get into a room with us and have a lot of fun.
[01:38:02.000 --> 01:38:06.000] We're going to be doing four shows, two in Phoenix, two in Tucson.
[01:38:06.000 --> 01:38:08.000] All the data that you need is on our website.
[01:38:08.000 --> 01:38:11.000] Go to theskepticsguide.org forward slash events.
[01:38:11.000 --> 01:38:12.000] You can see the dates.
[01:38:12.000 --> 01:38:15.000] These are all going to be in December, starting on December 15th.
[01:38:15.000 --> 01:38:17.000] So please do join us in Arizona.
[01:38:17.000 --> 01:38:18.000] All right. Thanks, Jay.
Questions/Emails/Corrections/Follow-ups ()
_consider_using_block_quotes_for_emails_read_aloud_in_this_segment_with_reduced_spacing_for_long_chunks_
Question_Email_Correction #1: _brief_description_ ()
Question_Email_Correction #2: _brief_description_ ()
Science or Fiction (h:mm:ss)
Answer | Item
---|---
Fiction |
Science |

Host | Result
---|---
Steve |

Rogue | Guess
---|---
Voice-over: It's time for Science or Fiction.
_Rogue_ Response
_Rogue_ Response
_Rogue_ Response
_Rogue_ Response
Steve Explains Item #_n_
Steve Explains Item #_n_
Steve Explains Item #_n_
[01:38:18.000 --> 01:38:21.000] Okay, everyone, let's go on with science or fiction.
[01:38:24.000 --> 01:38:28.000] It's time for science or fiction.
[01:38:33.000 --> 01:38:36.000] Each week I come up with three science news items or facts,
[01:38:36.000 --> 01:38:38.000] two real and one fake.
[01:38:38.000 --> 01:38:42.000] And then I challenge my panel of skeptics to tell me which one is the fake.
[01:38:42.000 --> 01:38:46.000] No theme this week, just three science news items.
[01:38:46.000 --> 01:38:48.000] Are you all ready?
[01:38:48.000 --> 01:38:49.000] Yep.
[01:38:49.000 --> 01:38:50.000] Yep, yep.
[01:38:50.000 --> 01:38:51.000] Yep.
[01:38:51.000 --> 01:38:55.000] Item number one, a review of COVID-19-related preprints
[01:38:55.000 --> 01:38:58.000] that were later published in peer-reviewed journals
[01:38:58.000 --> 01:39:01.000] finds that 50% were substantially altered,
[01:39:01.000 --> 01:39:05.000] including changes to effect sizes, the data used,
[01:39:05.000 --> 01:39:07.000] and statistical significance.
[01:39:07.000 --> 01:39:10.000] Item number two, scientists have developed a simple, rapid,
[01:39:10.000 --> 01:39:14.000] and effective method for making tissue optically transparent,
[01:39:14.000 --> 01:39:16.000] including entire organs.
[01:39:16.000 --> 01:39:20.000] And item number three, in a comprehensive meta-analysis,
[01:39:20.000 --> 01:39:23.000] researchers find that women have advantages over men
[01:39:23.000 --> 01:39:28.000] in phonemic fluency, verbal memory, and verbal recognition,
[01:39:28.000 --> 01:39:31.000] and that this advantage is stable over 50 years of research
[01:39:31.000 --> 01:39:34.000] and over a participant's lifetime.
[01:39:34.000 --> 01:39:35.000] All right, Jay, go first.
[01:39:35.000 --> 01:39:39.000] All right, this first one here, a review of COVID-19-related preprints
[01:39:39.000 --> 01:39:42.000] that were later published in peer-reviewed journals
[01:39:42.000 --> 01:39:45.000] finds that 50% were substantially altered,
[01:39:45.000 --> 01:39:47.000] including changes to effect sizes, the data used,
[01:39:47.000 --> 01:39:49.000] and statistical significance.
[01:39:49.000 --> 01:39:51.000] All right, Steve, I'm not clear about what's happening here.
[01:39:51.000 --> 01:39:53.000] All right, so you know what a preprint is?
[01:39:53.000 --> 01:39:56.000] It basically means that scientists do a study
[01:39:56.000 --> 01:40:01.000] and they just put it up online before it goes through peer review.
[01:40:01.000 --> 01:40:03.000] Like, we're all ready to go, here it is.
[01:40:03.000 --> 01:40:06.000] Meanwhile, they send it to publishers, right,
[01:40:06.000 --> 01:40:09.000] hopefully one publisher, and eventually it gets published.
[01:40:09.000 --> 01:40:11.000] But there's a process to that publication process,
[01:40:11.000 --> 01:40:13.000] you know, you get feedback from reviewers
[01:40:13.000 --> 01:40:15.000] and they tell you to change this and do that and whatever.
[01:40:15.000 --> 01:40:17.000] So the question they were looking at was,
[01:40:17.000 --> 01:40:20.000] how substantially do these preprints change
[01:40:20.000 --> 01:40:22.000] in that publication process?
[01:40:22.000 --> 01:40:24.000] So the published version of these papers
[01:40:24.000 --> 01:40:28.000] were substantially altered 50% of the time.
[01:40:28.000 --> 01:40:31.000] Okay, I think that one, I would say that one is true.
[01:40:31.000 --> 01:40:34.000] That one makes sense, you know, once I'm editing,
[01:40:34.000 --> 01:40:37.000] people are giving editorial feedback and that type of thing.
[01:40:37.000 --> 01:40:39.000] That seems to make sense to me.
[01:40:39.000 --> 01:40:42.000] The second one here about that scientists developed
[01:40:42.000 --> 01:40:46.000] a fast and easy way of making tissue optically transparent.
[01:40:46.000 --> 01:40:48.000] I think the key word here is optically,
[01:40:48.000 --> 01:40:50.000] because I bet you that's the trick,
[01:40:50.000 --> 01:40:52.000] is it might not be to the human eye,
[01:40:52.000 --> 01:40:54.000] but there might be a way for them to use
[01:40:54.000 --> 01:40:56.000] some type of machine that can do it.
[01:40:56.000 --> 01:40:59.000] Well, I'll just tell you, that word is there
[01:40:59.000 --> 01:41:01.000] to tell you that it's not a trick.
[01:41:01.000 --> 01:41:04.000] It's optically transparent, as opposed to being transparent
[01:41:04.000 --> 01:41:06.000] in X-rays or in...
[01:41:06.000 --> 01:41:08.000] Yeah, so like under a microscope.
[01:41:08.000 --> 01:41:09.000] Yeah.
[01:41:09.000 --> 01:41:10.000] To the eye, basically.
[01:41:10.000 --> 01:41:13.000] This is in optical frequencies, in visible spectrum.
[01:41:13.000 --> 01:41:15.000] So it's exactly opposite of what I just said.
[01:41:15.000 --> 01:41:17.000] Okay, that's fine.
[01:41:17.000 --> 01:41:19.000] Hey, man, I'm trying to get data out of Steve
[01:41:19.000 --> 01:41:20.000] whether he knows...
[01:41:20.000 --> 01:41:21.000] You're the first one.
[01:41:21.000 --> 01:41:22.000] You're the first one.
[01:41:22.000 --> 01:41:25.000] I always try to make these as concise and precise as possible.
[01:41:25.000 --> 01:41:28.000] And so if you're like significantly misinterpreting
[01:41:28.000 --> 01:41:31.000] what I wrote, I'm happy to correct it in the first go around.
[01:41:31.000 --> 01:41:34.000] All right, so this means that if they don't need to...
[01:41:34.000 --> 01:41:39.000] If the tissue can actually be alive when it's transparent,
[01:41:39.000 --> 01:41:41.000] that they could make people invisible.
[01:41:41.000 --> 01:41:42.000] That's what you're saying.
[01:41:42.000 --> 01:41:44.000] That's what you're saying, Steve.
[01:41:44.000 --> 01:41:45.000] Okay, I'm just pointing that out.
[01:41:45.000 --> 01:41:48.000] Okay, the third one, comprehensive meta-analysis.
[01:41:48.000 --> 01:41:51.000] Researchers find that women have advantages over men
[01:41:51.000 --> 01:41:54.000] in phonemic fluency, verbal memory, and verbal recognition,
[01:41:54.000 --> 01:41:57.000] and that this advantage is stable over 50 years of research.
[01:41:57.000 --> 01:42:00.000] Wow, that's cool.
[01:42:00.000 --> 01:42:04.000] So, I mean, I have no reason to think that the third one
[01:42:04.000 --> 01:42:07.000] is something fishy about it either.
[01:42:07.000 --> 01:42:09.000] I don't know.
[01:42:09.000 --> 01:42:11.000] Women have advantages over men.
[01:42:11.000 --> 01:42:15.000] That's a very unscientific thing there.
[01:42:15.000 --> 01:42:17.000] What's the advantage?
[01:42:17.000 --> 01:42:19.000] Women have better phonemic fluency.
[01:42:19.000 --> 01:42:20.000] How much better?
[01:42:20.000 --> 01:42:22.000] How much better do they perform?
[01:42:22.000 --> 01:42:24.000] He's not going to tell us that.
[01:42:24.000 --> 01:42:27.000] Well, you see?
[01:42:27.000 --> 01:42:29.000] You see, Kara?
[01:42:29.000 --> 01:42:31.000] I just don't like the way this one is worded.
[01:42:31.000 --> 01:42:34.000] I think this one is made to be vague deliberately,
[01:42:34.000 --> 01:42:36.000] and I'm going to say it's the fake.
[01:42:36.000 --> 01:42:37.000] Okay, Evan.
[01:42:37.000 --> 01:42:40.000] Yeah, I think I'm in line with Jay here on this particular one.
[01:42:40.000 --> 01:42:42.000] I think with the COVID-19 pre-prints,
[01:42:42.000 --> 01:42:45.000] 50% were substantially altered.
[01:42:45.000 --> 01:42:48.000] Yeah, I would like to know,
[01:42:48.000 --> 01:42:52.000] for studies that are not about COVID-19,
[01:42:52.000 --> 01:42:57.000] whether this is different for other medical topics,
[01:42:57.000 --> 01:42:59.000] or whether it's specific to COVID-19.
[01:42:59.000 --> 01:43:02.000] But I have a feeling this one's going to turn out to be right.
[01:43:02.000 --> 01:43:07.000] Then the second one about the method for making tissue optically transparent.
[01:43:07.000 --> 01:43:12.000] Sure, I didn't have a problem with this one per se.
[01:43:12.000 --> 01:43:14.000] There's nothing in there, I don't think,
[01:43:14.000 --> 01:43:18.000] that threw me off, but the last one throws me off.
[01:43:18.000 --> 01:43:21.000] The question, I think, boils down to why.
[01:43:21.000 --> 01:43:25.000] Why would this be the case that women would have advantages over men
[01:43:25.000 --> 01:43:27.000] in all these areas?
[01:43:27.000 --> 01:43:31.000] Maybe one, but three very distinct things,
[01:43:31.000 --> 01:43:35.000] phonemic fluency, verbal memory, and verbal recognition.
[01:43:35.000 --> 01:43:38.000] I don't see that being the case, so I'm with Jay on that fiction.
[01:43:38.000 --> 01:43:39.000] All right, Bob.
[01:43:39.000 --> 01:43:41.000] All right, so for the COVID-19 one,
[01:43:41.000 --> 01:43:45.000] 50% substantially altered, with changes to effect sizes.
[01:43:45.000 --> 01:43:48.000] I mean, that seems like some p-hacking.
[01:43:48.000 --> 01:43:51.000] Would that be an accurate description?
[01:43:51.000 --> 01:43:54.000] So you're not answering me; I'm just going to go with it.
[01:43:54.000 --> 01:43:58.000] Yeah, I mean, it's very disappointing.
[01:43:58.000 --> 01:44:02.000] Once I get past the first person, I'm going to get extremely...
[01:44:02.000 --> 01:44:03.000] Yeah, I hear you.
[01:44:03.000 --> 01:44:05.000] It's worth a shot anyway.
[01:44:05.000 --> 01:44:07.000] If you don't ask, you don't get.
[01:44:07.000 --> 01:44:11.000] The third one, you know, it could be a minor advantage,
[01:44:11.000 --> 01:44:16.000] a small advantage over men in phonemic fluency, et cetera.
[01:44:16.000 --> 01:44:18.000] Yeah, if it's not egregious.
[01:44:18.000 --> 01:44:20.000] The one that was really rubbing me the wrong way, though,
[01:44:20.000 --> 01:44:24.000] was the second one, the rapid, simple, and effective method
[01:44:24.000 --> 01:44:27.000] for making tissue optically transparent.
[01:44:27.000 --> 01:44:29.000] I mean, wouldn't that require some fundamental changes
[01:44:29.000 --> 01:44:32.000] to the tissue in question here?
[01:44:32.000 --> 01:44:35.000] There may be some weird trick that they got.
[01:44:35.000 --> 01:44:38.000] But whatever, I'm going to say that's fiction anyway.
[01:44:38.000 --> 01:44:39.000] Go out on my own.
[01:44:39.000 --> 01:44:41.000] Okay, and Cara.
[01:44:41.000 --> 01:44:42.000] Cara, what say you?
[01:44:42.000 --> 01:44:44.000] I think I'm going to go out on my own too.
[01:44:44.000 --> 01:44:45.000] Oh, good.
[01:44:45.000 --> 01:44:47.000] Oh, you think it's the COVID one.
[01:44:47.000 --> 01:44:50.000] Yeah, so the reason that I feel...
[01:44:50.000 --> 01:44:55.000] Okay, so I totally buy the one that both Jay and Evan said
[01:44:55.000 --> 01:44:57.000] was fiction, I totally think is science,
[01:44:57.000 --> 01:45:00.000] that women have more verbal fluency or verbal memory,
[01:45:00.000 --> 01:45:02.000] verbal recognition, phonemic fluency.
[01:45:02.000 --> 01:45:04.000] There's a reason that when we give psychological tests
[01:45:04.000 --> 01:45:06.000] that we have gender norms.
[01:45:06.000 --> 01:45:08.000] There are differences among genders.
[01:45:08.000 --> 01:45:13.000] That doesn't necessarily mean that it's biologically innate.
[01:45:13.000 --> 01:45:15.000] It could also be learned culturally.
[01:45:15.000 --> 01:45:18.000] But regardless, I would not be surprised that we see
[01:45:18.000 --> 01:45:20.000] a significant, I don't know if it's huge, like Bob said,
[01:45:20.000 --> 01:45:22.000] it might be small, but a significant difference
[01:45:22.000 --> 01:45:26.000] between genders in verbal facets.
[01:45:26.000 --> 01:45:28.000] And Evan, I don't think those three things
[01:45:28.000 --> 01:45:29.000] are that wildly different.
[01:45:29.000 --> 01:45:32.000] They're all in a kind of similar part of the brain.
[01:45:32.000 --> 01:45:36.000] And then, Bob, you said the tissue being optically transparent.
[01:45:36.000 --> 01:45:38.000] Well, tissue is optically transparent
[01:45:38.000 --> 01:45:41.000] if you slice it thin enough, like we know this.
[01:45:41.000 --> 01:45:44.000] Like any sort of microscope slide you can see.
[01:45:44.000 --> 01:45:46.000] Yeah, but including entire organs.
[01:45:46.000 --> 01:45:48.000] Right, so that's the difference here, right?
[01:45:48.000 --> 01:45:50.000] But if you think about it, what gives tissue pigment,
[01:45:50.000 --> 01:45:52.000] it's literally just pigment.
[01:45:52.000 --> 01:45:54.000] It's pigmentation in little organelles.
[01:45:54.000 --> 01:45:56.000] And so if you could get rid of blood,
[01:45:56.000 --> 01:45:58.000] if you could get rid of certain pigments
[01:45:58.000 --> 01:46:01.000] within different tissues, I think you could make it.
[01:46:01.000 --> 01:46:03.000] And also this doesn't say it's in vivo.
[01:46:03.000 --> 01:46:05.000] Like this could be dead tissue.
[01:46:05.000 --> 01:46:07.000] But then it's easy.
[01:46:07.000 --> 01:46:09.000] You could do anything to it to make it clear
[01:46:09.000 --> 01:46:11.000] so that you could look at it.
[01:46:11.000 --> 01:46:13.000] That one has too many caveats.
[01:46:13.000 --> 01:46:16.000] The COVID-19 pre-prints is freaking me out.
[01:46:16.000 --> 01:46:20.000] 50% substantially altered, including changes to effect sizes,
[01:46:20.000 --> 01:46:24.000] the actual data set, and statistical significance.
[01:46:24.000 --> 01:46:28.000] First of all, we would be shutting down these pre-prints
[01:46:28.000 --> 01:46:30.000] if that were the case.
[01:46:30.000 --> 01:46:33.000] We would be like, the pre-print system isn't working,
[01:46:33.000 --> 01:46:36.000] half of the studies that are pre-published are not holding up,
[01:46:36.000 --> 01:46:39.000] and they're having to make significant changes to the research
[01:46:39.000 --> 01:46:42.000] before the editors are allowing them to be good enough to publish.
[01:46:42.000 --> 01:46:44.000] It just doesn't work that way.
[01:46:44.000 --> 01:46:46.000] I mean, when you say it like that, you know.
[01:46:46.000 --> 01:46:49.000] Yeah, I feel like if something's good enough to be published,
[01:46:49.000 --> 01:46:52.000] you're making minor changes to appease the editors at that point.
[01:46:52.000 --> 01:46:56.000] You're not like adding whole parts of your data set
[01:46:56.000 --> 01:46:58.000] or completely changing.
[01:46:58.000 --> 01:47:01.000] Like all that stuff is supposed to be fixed in advance.
[01:47:01.000 --> 01:47:04.000] That's why we register studies.
[01:47:04.000 --> 01:47:06.000] So you can't after the fact go through and change.
[01:47:06.000 --> 01:47:10.000] That is the definition of p-hacking, or one definition of p-hacking, Bob.
[01:47:10.000 --> 01:47:13.000] So I don't think, like, if it was a 50% substantial change,
[01:47:13.000 --> 01:47:16.000] we'd be p-hacking between the pre-print and the publication.
[01:47:16.000 --> 01:47:18.000] That's the wrong direction.
[01:47:18.000 --> 01:47:21.000] Like, no good journal editor is going to be like,
[01:47:21.000 --> 01:47:24.000] how about you add some shit so we can falsify this?
[01:47:24.000 --> 01:47:26.000] I just don't think this one is science.
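A minimal sketch (not from the episode) of the post-hoc data changes Kara is describing: both groups below are drawn from the same null distribution, so any "significant" difference is a false positive, and selectively dropping inconvenient points after seeing the results makes false positives far more common. The 30-per-group samples and the five-deletion cap are arbitrary illustration choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials, alpha = 1000, 0.05
honest_hits = 0
hacked_hits = 0

for _ in range(n_trials):
    a = rng.normal(0.0, 1.0, 30)  # group A: no real effect
    b = rng.normal(0.0, 1.0, 30)  # group B: no real effect
    honest_hits += stats.ttest_ind(a, b).pvalue < alpha

    # "Tighten up" the data after the fact: drop up to five points that
    # most oppose a group difference, re-testing after each deletion.
    a_h, b_h = a.copy(), b.copy()
    for _ in range(5):
        if stats.ttest_ind(a_h, b_h).pvalue < alpha:
            break
        a_h = np.delete(a_h, np.argmin(a_h))  # remove A's lowest score
        b_h = np.delete(b_h, np.argmax(b_h))  # remove B's highest score
    hacked_hits += stats.ttest_ind(a_h, b_h).pvalue < alpha

print(f"false-positive rate, untouched data:  {honest_hits / n_trials:.1%}")  # ~5%
print(f"false-positive rate, 'tightened' data: {hacked_hits / n_trials:.1%}")  # much higher
```

Pre-registration blocks exactly this move: the data set and analysis are fixed before the results are known, so changes like these can't be made quietly after the fact.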
[01:47:26.000 --> 01:47:28.000] What about influence from the previous administration?
[01:47:28.000 --> 01:47:30.000] Right. I mean, there is that too.
[01:47:30.000 --> 01:47:33.000] But COVID was mostly, well, yeah.
[01:47:33.000 --> 01:47:35.000] I don't think so, though, because a pre-print is a pre-print.
[01:47:35.000 --> 01:47:38.000] You can just throw it up there. It's not peer-reviewed.
[01:47:38.000 --> 01:47:40.000] Yeah, this one bothers me.
[01:47:40.000 --> 01:47:42.000] I feel like that has to be the fiction.
[01:47:42.000 --> 01:47:44.000] And if it's not, I'm going to be real sad
[01:47:44.000 --> 01:47:48.000] for the scientific field as a whole, biomedical science.
[01:47:48.000 --> 01:47:50.000] The publishing field, yeah.
[01:47:50.000 --> 01:47:52.000] Okay, so you guys are all spread out.
[01:47:52.000 --> 01:47:54.000] Why don't we take this in reverse order?
[01:47:54.000 --> 01:47:56.000] No, no.
[01:47:56.000 --> 01:47:58.000] So Jay and Evan, you think the third one is the fiction.
[01:47:58.000 --> 01:48:01.000] Not anymore, now that you've taken the third guy's screen.
[01:48:01.000 --> 01:48:04.000] Scientists, researchers find that women have advantages over men
[01:48:04.000 --> 01:48:07.000] in phonemic fluency, verbal memory, and verbal recognition,
[01:48:07.000 --> 01:48:10.000] and that this advantage is stable over 50 years of research
[01:48:10.000 --> 01:48:12.000] and over a participant's lifetime.
[01:48:12.000 --> 01:48:14.000] You two think this one is the fiction.
[01:48:14.000 --> 01:48:16.000] Bob and Kara think this one is science.
[01:48:16.000 --> 01:48:19.000] And this one is science.
[01:48:19.000 --> 01:48:20.000] This is science.
[01:48:20.000 --> 01:48:21.000] Yeah, Bob!
[01:48:21.000 --> 01:48:22.000] Yeah.
[01:48:22.000 --> 01:48:26.000] This is the first meta-analysis of this research, though, since 1988.
[01:48:26.000 --> 01:48:27.000] So it's been a while.
[01:48:27.000 --> 01:48:28.000] Jeez, it took so long.
[01:48:28.000 --> 01:48:30.000] So they basically did a really comprehensive meta-analysis,
[01:48:30.000 --> 01:48:32.000] and that's what they found.
[01:48:32.000 --> 01:48:34.000] But you guys, those of you who said this are correct,
[01:48:34.000 --> 01:48:37.000] the advantage is very small, but it's statistically significant,
[01:48:37.000 --> 01:48:41.000] and it's very consistent across research.
[01:48:41.000 --> 01:48:46.000] Now, this is the kind of thing where it's like a bimodal distribution
[01:48:46.000 --> 01:48:49.000] where the differences within a gender are going to be a lot greater
[01:48:49.000 --> 01:48:52.000] than the statistical difference between the two genders,
[01:48:52.000 --> 01:48:54.000] but there is a statistical difference there.
[01:48:54.000 --> 01:48:57.000] And essentially, this is sort of the conventional wisdom
[01:48:57.000 --> 01:49:00.000] that women have more verbal fluency than guys do, basically.
[01:49:00.000 --> 01:49:02.000] And they wanted to find out, is this really true,
[01:49:02.000 --> 01:49:04.000] or is it just one of those things that people believe,
[01:49:04.000 --> 01:49:06.000] but it's not really true?
[01:49:06.000 --> 01:49:08.000] But yeah, there is actually a little bit of an effect there.
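A minimal sketch of Steve's point about overlapping distributions: with a large enough pooled sample, even a tiny between-group shift is statistically significant, while individuals within each group vary far more than the groups differ. The 0.1 SD effect size is an assumption for illustration, not the paper's figure; only the ~350,000-participant scale comes from the episode.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 175_000  # per group; the meta-analysis pooled roughly 350,000 participants
d = 0.1      # assumed standardized advantage (illustrative, not the paper's value)

women = rng.normal(d, 1.0, n)
men = rng.normal(0.0, 1.0, n)

t, p = stats.ttest_ind(women, men)
print(f"between-group gap: {women.mean() - men.mean():.3f} SD, p = {p:.1e}")

# Within-group spread dwarfs the gap: nearly half of the men still score
# above the average woman.
print(f"share of men above the women's mean: {(men > women.mean()).mean():.1%}")
```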
[01:49:08.000 --> 01:49:12.000] What's really interesting, they also found that when you look
[01:49:12.000 --> 01:49:15.000] at the individual papers, when you break them up,
[01:49:15.000 --> 01:49:20.000] the advantages were greater for women in papers
[01:49:20.000 --> 01:49:22.000] where the lead author was a woman,
[01:49:22.000 --> 01:49:27.000] and they were smaller in papers where the lead author was a man.
[01:49:27.000 --> 01:49:31.000] Right, and so that makes you wonder, is that a positive or a negative bias?
[01:49:31.000 --> 01:49:35.000] Are the men minimizing the difference, or are the women magnifying the difference?
[01:49:35.000 --> 01:49:37.000] Or both, they're not.
[01:49:37.000 --> 01:49:39.000] So there's definitely some researcher bias,
[01:49:39.000 --> 01:49:43.000] probably also some publication bias as well in the data set,
[01:49:43.000 --> 01:49:46.000] but it holds out across the totality of research.
[01:49:46.000 --> 01:49:50.000] Also, here's the other thing, the effect size was greater
[01:49:50.000 --> 01:49:53.000] in published studies than in unpublished studies.
[01:49:53.000 --> 01:49:56.000] So that's the publication bias.
[01:49:56.000 --> 01:50:00.000] So yeah, I just love the fact that those types of analyses are now fairly routine.
[01:50:00.000 --> 01:50:02.000] You have to look at data that way.
[01:50:02.000 --> 01:50:05.000] But even when you account for all of that, there's still a tiny effect there.
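A minimal sketch of the publication-bias pattern Steve describes: if clearing p < 0.05 makes a study more likely to be published, the published literature overstates the true effect. The 0.1 SD true effect, the per-study sample size, and the all-or-nothing publication rule are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
true_d = 0.1      # assumed true effect in SD units (illustrative)
n_per_group = 50  # small studies, so individual estimates are noisy
n_studies = 2000

published, unpublished = [], []
for _ in range(n_studies):
    a = rng.normal(true_d, 1.0, n_per_group)
    b = rng.normal(0.0, 1.0, n_per_group)
    d_hat = a.mean() - b.mean()               # this study's observed effect
    if stats.ttest_ind(a, b).pvalue < 0.05:   # only "significant" results get published
        published.append(d_hat)
    else:
        unpublished.append(d_hat)

print(f"true effect:              {true_d:.2f}")
print(f"mean effect, published:   {np.mean(published):.2f}")    # inflated
print(f"mean effect, unpublished: {np.mean(unpublished):.2f}")  # closer to the truth
```

This is why comparing published against unpublished effect sizes, as the meta-analysis did, is a routine check for publication bias.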
[01:50:05.000 --> 01:50:08.000] It's cool, though, that there's so much data on certain topics
[01:50:08.000 --> 01:50:10.000] that have just been really interesting for years
[01:50:10.000 --> 01:50:13.000] that we can really look at it in a super clean way now.
[01:50:13.000 --> 01:50:18.000] These were 500 measures with 350,000 participants
[01:50:18.000 --> 01:50:21.000] in the total analysis.
[01:50:21.000 --> 01:50:25.000] All right, so it's between Bob and Kara.
[01:50:25.000 --> 01:50:27.000] We go to number two.
[01:50:27.000 --> 01:50:30.000] Scientists have developed a simple, rapid, and effective method
[01:50:30.000 --> 01:50:35.000] for making tissue optically transparent, including entire organs.
[01:50:35.000 --> 01:50:37.000] Bob, you think this one is the fiction.
[01:50:37.000 --> 01:50:39.000] Everyone else thought this one was science.
[01:50:39.000 --> 01:50:42.000] And this one is science.
[01:50:42.000 --> 01:50:44.000] Sorry, Bob.
[01:50:44.000 --> 01:50:46.000] Good job, Kara.
[01:50:46.000 --> 01:50:50.000] But Kara only won because she has greater verbal fluency than the guys do.
[01:50:50.000 --> 01:50:54.000] Well, clearly that's a recent analysis.
[01:50:54.000 --> 01:50:56.000] Yeah, Jay, nowhere does it say that they're alive.
[01:50:56.000 --> 01:50:59.000] This is not a living organism.
[01:50:59.000 --> 01:51:03.000] This is an organ that you removed.
[01:51:03.000 --> 01:51:06.000] So you could have an invisible zombie then, or an invisible corpse.
[01:51:06.000 --> 01:51:08.000] Yeah, you could have an invisible zombie.
[01:51:08.000 --> 01:51:09.000] It could be reanimated.
[01:51:09.000 --> 01:51:10.000] Good recovery, yeah.
[01:51:10.000 --> 01:51:14.000] And I believe they were mostly using mouse tissue here, or mouse organs.
[01:51:14.000 --> 01:51:17.000] And yeah, this was possible previously,
[01:51:17.000 --> 01:51:19.000] but it was really expensive and complicated,
[01:51:19.000 --> 01:51:23.000] and required specialized equipment, and used hazardous organic solvents.
[01:51:23.000 --> 01:51:26.000] So they basically found a simple method for taking away the bits
[01:51:26.000 --> 01:51:29.000] that make it opaque.
[01:51:29.000 --> 01:51:30.000] Visoray?
[01:51:30.000 --> 01:51:37.000] The key is rendering the whole organs transparent
[01:51:37.000 --> 01:51:40.000] without disrupting their architecture,
[01:51:40.000 --> 01:51:44.000] the relationship of tissue to...
[01:51:44.000 --> 01:51:47.000] They're removing proteins basically from the tissue,
[01:51:47.000 --> 01:51:51.000] but they were able to do it without disrupting the architecture.
[01:51:51.000 --> 01:51:52.000] That was the key.
[01:51:52.000 --> 01:51:54.000] And it's cheap and safe.
[01:51:54.000 --> 01:51:55.000] What kind of proteins?
[01:51:55.000 --> 01:51:58.000] I mean, it's basically mostly proteins, right?
[01:51:58.000 --> 01:52:01.000] The ones that specifically are not transparent,
[01:52:01.000 --> 01:52:04.000] the ones that are pigmented.
[01:52:04.000 --> 01:52:07.000] They call their method EZClear.
[01:52:07.000 --> 01:52:08.000] That's branding.
[01:52:08.000 --> 01:52:10.000] EZClear.
[01:52:10.000 --> 01:52:13.000] The EZClear method of organ preparation.
[01:52:13.000 --> 01:52:14.000] Have you lost an organ recently?
[01:52:14.000 --> 01:52:15.000] Well, we've got a solution.
[01:52:15.000 --> 01:52:16.000] The EZ method.
[01:52:16.000 --> 01:52:20.000] It's like a commercial on the Simpsons, or I guess Futurama.
[01:52:20.000 --> 01:52:23.000] Dr. Nick Riviera suggests.
[01:52:23.000 --> 01:52:26.000] There's a video on the link that I'll provide.
[01:52:26.000 --> 01:52:27.000] It's cool.
[01:52:27.000 --> 01:52:30.000] You can see it's a mouse eye that's completely transparent.
[01:52:30.000 --> 01:52:31.000] You can see all the blood vessels,
[01:52:31.000 --> 01:52:35.000] but everything looks shadowy, like ghostly.
[01:52:35.000 --> 01:52:39.000] Okay, this all means that a review of COVID-19-related preprints
[01:52:39.000 --> 01:52:41.000] that were later published in peer-reviewed journals finds
[01:52:41.000 --> 01:52:43.000] that 50% were substantially altered,
[01:52:43.000 --> 01:52:46.000] including changes to effect sizes, the data used,
[01:52:46.000 --> 01:52:49.000] and statistical significance is the fiction,
[01:52:49.000 --> 01:52:51.000] because thank God it's the fiction.
[01:52:51.000 --> 01:52:52.000] I know.
[01:52:52.000 --> 01:52:58.000] If this were true, then, of course, this would lead to,
[01:52:58.000 --> 01:53:01.000] I think, a change in the preprint system.
[01:53:01.000 --> 01:53:07.000] What they found, actually, was that 90% was unchanged.
[01:53:07.000 --> 01:53:09.000] Thank goodness.
[01:53:09.000 --> 01:53:13.000] So they were looking at data points.
[01:53:13.000 --> 01:53:15.000] It's not just 90% of studies.
[01:53:15.000 --> 01:53:18.000] It's 90% of the data points across the studies that they looked at
[01:53:18.000 --> 01:53:20.000] were unchanged.
[01:53:20.000 --> 01:53:25.000] About 10% were changed, but the changes had no effect on statistical significance.
[01:53:25.000 --> 01:53:27.000] So none of them would have changed
[01:53:27.000 --> 01:53:30.000] whether or not the paper was statistically significant or not.
[01:53:30.000 --> 01:53:35.000] Basically, a lot of them added data that wasn't there on the preprint,
[01:53:35.000 --> 01:53:37.000] so that counted as well.
[01:53:37.000 --> 01:53:39.000] So that was just, hey, we have more data, so they added it.
[01:53:39.000 --> 01:53:41.000] And sometimes the data was removed,
[01:53:41.000 --> 01:53:44.000] so they may have had some problematic data where they said,
[01:53:44.000 --> 01:53:46.000] eh, let's see.
[01:53:46.000 --> 01:53:49.000] So essentially, they tightened up.
[01:53:49.000 --> 01:53:54.000] About 7% of the papers had some tightening up of the data
[01:53:54.000 --> 01:53:56.000] after peer review.
[01:53:56.000 --> 01:53:58.000] Changes in the actual estimates were minor
[01:53:58.000 --> 01:54:00.000] and statistically insignificant.
[01:54:00.000 --> 01:54:02.000] So that's all very good news.
[01:54:02.000 --> 01:54:04.000] Yes, it is, especially during COVID
[01:54:04.000 --> 01:54:07.000] when we know there was like a mad dash to publish
[01:54:07.000 --> 01:54:09.000] that it still held.
[01:54:09.000 --> 01:54:11.000] Yeah, and that's what kind of threw me off on that one in a sense,
[01:54:11.000 --> 01:54:15.000] was that, okay, this is all new, therefore they got a bunch of information up
[01:54:15.000 --> 01:54:18.000] and they realized, oh, no, we have to change a bunch of stuff.
[01:54:18.000 --> 01:54:23.000] But the first preprint server was the physicists' archive,
[01:54:23.000 --> 01:54:27.000] arXiv. When was that created?
[01:54:27.000 --> 01:54:30.000] Well, I know that the paper that I referenced that I was on
[01:54:30.000 --> 01:54:34.000] with a bunch of physicists in the neuroscience lab earlier in the show,
[01:54:34.000 --> 01:54:38.000] that was on arXiv, and that was published in like 2006.
[01:54:38.000 --> 01:54:39.000] 1991.
[01:54:39.000 --> 01:54:41.000] Yeah, okay, it's old.
[01:54:41.000 --> 01:54:43.000] They've been doing it for 30 years.
[01:54:43.000 --> 01:54:45.000] CompuServe email address.
[01:54:45.000 --> 01:54:47.000] But it's been slowly spreading to other disciplines.
[01:54:47.000 --> 01:54:50.000] Now, the COVID-19 preprint servers were created specifically
[01:54:50.000 --> 01:54:54.000] because they wanted to get studies out quickly
[01:54:54.000 --> 01:54:58.000] so that clinicians could make decisions based upon it,
[01:54:58.000 --> 01:55:00.000] but also that other researchers would know where to go.
[01:55:00.000 --> 01:55:05.000] Like we wanted to accelerate COVID-19 research as much as possible,
[01:55:05.000 --> 01:55:09.000] and the publication process could take one to two years easily,
[01:55:09.000 --> 01:55:11.000] and so they didn't want things to be slowed down that much.
[01:55:11.000 --> 01:55:14.000] So it's good to look back and go, okay, that was a good thing,
[01:55:14.000 --> 01:55:16.000] and the information's mostly valid.
[01:55:16.000 --> 01:55:19.000] Yeah, so the preprint system worked.
Skeptical Quote of the Week ()
We have learned in recent years that the techniques of misinformation and misdirection have become so refined that, even in an open society, a cleverly directed flood of misinformation can overwhelm the truth, even though the truth is out there, uncensored, quietly available to anyone who can find it.
– Daniel Dennett, American philosopher, writer, and cognitive scientist
Signoff/Announcements ()
S: —and until next week, this is your Skeptics' Guide to the Universe.
S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.
Today I Learned
- Fact/Description, possibly with an article reference[11]
- Fact/Description
- Fact/Description
Notes
References
- ↑ AIN Online: TWA 800... 10 years later: Putting the conspiracy theories to rest
- ↑ [url_from_quickie_item_show_notes PUBLICATION: TITLE]
- ↑ [url_from_news_item_show_notes PUBLICATION: TITLE]
- ↑ [url_from_news_item_show_notes PUBLICATION: TITLE]
- ↑ [url_from_news_item_show_notes PUBLICATION: TITLE]
- ↑ [url_from_news_item_show_notes PUBLICATION: TITLE]
- ↑ [url_from_news_item_show_notes PUBLICATION: TITLE]
- ↑ [url_from_SoF_show_notes PUBLICATION: TITLE]
- ↑ [url_from_SoF_show_notes PUBLICATION: TITLE]
- ↑ [url_from_SoF_show_notes PUBLICATION: TITLE]
- ↑ [url_for_TIL publication: title]
Vocabulary