SGU Episode 891
August 6th 2022

Skeptical Rogues
S: Steven Novella
B: Bob Novella
C: Cara Santa Maria
J: Jay Novella

Quote of the Week
If anyone can refute me–show me I'm making a mistake or looking at things from the wrong perspective–I'll gladly change. It's the truth I'm after, and the truth never harmed anyone. What harms us is to persist in self-deceit and ignorance.
Marcus Aurelius, Roman emperor

Links
Download Podcast
Show Notes
Forum Discussion
Introduction, passing of Nichelle Nichols
Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.
[00:12.720 --> 00:17.200] Today is Tuesday, August 2nd, 2022, and this is your host, Steven Novella.
[00:17.200 --> 00:23.000] Joining me this week are Bob Novella, everybody, Cara Santa Maria, howdy, and Jay Novella.
[00:23.000 --> 00:24.000] Hey, guys.
[00:24.000 --> 00:25.800] Evan is off this week.
[00:25.800 --> 00:31.600] Yeah, he is otherwise engaged, some family matter, but he'll be joining us for the next
[00:31.600 --> 00:32.600] episode.
[00:32.600 --> 00:38.080] This next episode we're going to record actually is NECSS, which is, it's happening probably
[00:38.080 --> 00:42.240] right now as you're listening to this, and, or maybe it happened already if you're listening
[00:42.240 --> 00:47.460] to this after Saturday, and that's, that show will be airing in two weeks.
[00:47.460 --> 00:53.560] So guys, the big news this week, sad news, Nichelle Nichols passed away.
[00:53.560 --> 00:57.120] She was 89.
[00:57.120 --> 00:58.120] Good age, good age.
[00:58.120 --> 00:59.120] Yeah, 89.
[00:59.120 --> 01:00.120] Lived a long life.
[01:00.120 --> 01:01.120] Always great.
[01:01.120 --> 01:06.000] So that means that only Kirk, Sulu, and Chekov are left from the bridge crew of the original
[01:06.000 --> 01:07.000] series.
[01:07.000 --> 01:08.000] Yeah.
[01:08.000 --> 01:13.040] You know what I loved about her and that original bridge crew was when I was a kid watching
[01:13.040 --> 01:18.900] the original Star Trek series, I didn't realize that there was anything special about the
[01:18.900 --> 01:22.240] fact that it was a multicultural crew.
[01:22.240 --> 01:23.240] You know what I mean?
[01:23.240 --> 01:25.280] And it wasn't just humans and aliens.
[01:25.280 --> 01:29.400] There was people from different parts of the world, but you know, different, the differences
[01:29.400 --> 01:30.600] were important back then.
[01:30.600 --> 01:35.680] And it was one of the reasons why Gene Roddenberry actually, you know, constructed it that way.
[01:35.680 --> 01:39.760] And she was, she was key to that because not only was she black, but she was a woman.
[01:39.760 --> 01:44.180] Yeah, she was one of the first black women on TV, like in a major series.
[01:44.180 --> 01:45.180] And she had a significant role.
[01:45.180 --> 01:50.000] I know it's very easy to, to, you know, to denigrate say, well, she was basically answering
[01:50.000 --> 01:51.000] the phone, right?
[01:51.000 --> 01:54.360] Sort of the comedian, one line about her.
[01:54.360 --> 01:58.600] But she actually, you know, had, she was part of the bridge crew, man.
[01:58.600 --> 02:00.640] And I remember there was one episode, I just watched it.
[02:00.640 --> 02:05.920] I just happened to watch it because I was, we're going to be reviewing it on AQ6, Balance
[02:05.920 --> 02:09.760] of Terror, you know, where they're fighting the Romulans for the first time.
[02:09.760 --> 02:12.120] And she is called to the helm.
[02:12.120 --> 02:16.040] She has to actually work at the helm because the helmsman had to be called away.
[02:16.040 --> 02:20.360] So she, you know, if you're on the bridge crew, you're, you're able to handle any station
[02:20.360 --> 02:21.960] that you're called upon to.
[02:21.960 --> 02:28.320] So anyway, it was, the point is it wasn't, it wasn't a small position on the Enterprise
[02:28.320 --> 02:33.560] and it was, and her character was a significant character in the show.
[02:33.560 --> 02:38.560] But even she, you know, had doubts about, about that character that she was portraying.
[02:38.560 --> 02:39.560] Yeah.
[02:39.560 --> 02:43.640] You know, the story is she actually gave her resignation letter after season one
[02:43.640 --> 02:47.400] to Gene Roddenberry. And Martin Luther King Jr.,
[02:47.400 --> 02:52.440] she was talking to him at some point, not too far after that.
[02:52.440 --> 02:56.240] And he told her like, you absolutely have to stay on the show.
[02:56.240 --> 03:00.800] You don't know how unbelievably important it is as a black woman that you're on this
[03:00.800 --> 03:02.080] particular TV show.
[03:02.080 --> 03:03.080] Yeah.
[03:03.080 --> 03:04.080] She's a role model.
[03:04.080 --> 03:05.080] Yeah.
[03:05.080 --> 03:06.080] And he convinced her to stay.
[03:06.080 --> 03:09.900] And it was, you know, it was really, he felt that it was very important to have that representation
[03:09.900 --> 03:10.900] on TV.
[03:10.900 --> 03:11.900] It really was important.
[03:11.900 --> 03:12.900] It was very important.
[03:12.900 --> 03:13.900] Yeah.
[03:13.900 --> 03:17.820] You know, Whoopi Goldberg was talking about it recently and she said that when she saw
[03:17.820 --> 03:22.120] it growing up, she was running around the house and she was saying, you know, she was
[03:22.120 --> 03:26.640] very excited, very happy because from her point of view, this was the first time that
[03:26.640 --> 03:30.000] she had seen a black woman on TV that wasn't a maid.
[03:30.000 --> 03:33.880] And to her, to her, that was huge, huge.
[03:33.880 --> 03:38.480] And it was, I think it was huge for lots of, lots of people growing up at that time.
[03:38.480 --> 03:43.880] So yeah, it was clearly a very important role at a very important time.
[03:43.880 --> 03:47.940] It's a point that it's kind of the point that Dr. King was making.
[03:47.940 --> 03:52.040] He was saying that if you leave, they could just fill anybody in that role.
[03:52.040 --> 03:55.320] Like the great thing about the role you have is that it's not, there's nothing about it
[03:55.320 --> 03:56.380] that's black.
[03:56.380 --> 03:58.200] There's nothing about it that's female.
[03:58.200 --> 03:59.340] It's just a strong role.
[03:59.340 --> 04:00.920] They could stick an alien in there.
[04:00.920 --> 04:03.500] They could stick anybody to fill your role.
[04:03.500 --> 04:09.540] And so it shows that, you know, black women are people, fully formed people who can have
[04:09.540 --> 04:13.260] skills and who can engage.
[04:13.260 --> 04:16.800] It's not just, like you said, like you're not just the maid.
[04:16.800 --> 04:21.840] And how important for not just young people, not just young black people, not just young
[04:21.840 --> 04:26.720] black women to see that, but white men to see that, you know?
[04:26.720 --> 04:31.960] I mean, Cara, I was a kid, I was a kid watching those Star Trek episodes.
[04:31.960 --> 04:36.520] And it absolutely had an impact on my life in so many different ways.
[04:36.520 --> 04:41.960] But I, you know, I remember thinking about her on that crew and not thinking anything
[04:41.960 --> 04:42.960] odd about it.
[04:42.960 --> 04:43.960] And that's, that's my point.
[04:43.960 --> 04:44.960] Yeah.
[04:44.960 --> 04:45.960] It normalizes it.
[04:45.960 --> 04:48.200] As a young kid, I wasn't looking at her as black or as a woman.
[04:48.200 --> 04:51.720] I was just looking at her as one of the, one of the people on the bridge.
[04:51.720 --> 04:54.200] And I know it had an impact on me in my life.
[04:54.200 --> 04:59.800] Do you guys know that after Star Trek that Nichelle Nichols went to work for NASA recruiting
[04:59.800 --> 05:02.960] women and minorities into the space program?
[05:02.960 --> 05:09.000] And she actually, she recruited Sally Ride, who was the first American female astronaut.
[05:09.000 --> 05:10.000] That is awesome.
[05:10.000 --> 05:11.000] I didn't know that.
[05:11.000 --> 05:15.840] Oh, and let's not forget that in 1968, wow, what a year, 1968.
[05:15.840 --> 05:17.680] Think about everything that happened that year.
[05:17.680 --> 05:21.240] It was the first interracial kiss on television.
[05:21.240 --> 05:26.400] Yeah, that wasn't, in the US it was, but there was a British one that preceded it,
[05:26.400 --> 05:27.400] you know?
[05:27.400 --> 05:28.400] Okay.
[05:28.400 --> 05:29.400] Yeah.
[05:29.400 --> 05:30.880] First US, first interracial kiss on US television.
[05:30.880 --> 05:31.880] Yeah.
[05:31.880 --> 05:32.880] Yeah.
[05:32.880 --> 05:33.880] Yeah, which was significant.
[05:33.880 --> 05:34.880] Absolutely.
[05:34.880 --> 05:35.880] Yeah.
[05:35.880 --> 05:39.580] And I think if I remember, I'm trudging up some old memories here, they wanted to do
[05:39.580 --> 05:48.920] multiple takes because they were very nervous because the people were very strict with what
[05:48.920 --> 05:53.520] you could have and what you can't have on the show in terms of that kind of thing and
[05:53.520 --> 05:55.380] sexual situations and all that.
[05:55.380 --> 06:01.140] So if I remember correctly, William Shatner actually screwed up all the other takes to
[06:01.140 --> 06:05.400] such a degree that they would never be able to use any of them except the one that you
[06:05.400 --> 06:06.400] saw.
[06:06.400 --> 06:07.400] Well, he and her did that.
[06:07.400 --> 06:08.400] They did it together.
[06:08.400 --> 06:09.400] Oh, even better.
[06:09.400 --> 06:10.400] Even better.
[06:10.400 --> 06:11.400] Even better.
[06:11.400 --> 06:12.400] Yeah.
[06:12.400 --> 06:13.400] That's actually accurate, Bob.
[06:13.400 --> 06:16.280] They deliberately screwed up all the alternate takes.
[06:16.280 --> 06:17.280] Good for them.
[06:17.280 --> 06:19.880] Because they knew the system so well, they knew how much time they had.
[06:19.880 --> 06:21.620] They couldn't keep doing this.
[06:21.620 --> 06:22.620] That was it.
[06:22.620 --> 06:23.620] Their time was allotted to do it.
[06:23.620 --> 06:28.680] So they had no choice in the editing room but to put that in or not show the episode,
[06:28.680 --> 06:29.680] you know?
[06:29.680 --> 06:30.680] Mm-hmm.
[06:30.680 --> 06:34.420] And I didn't realize that she put out albums throughout her life.
[06:34.420 --> 06:41.800] She did two albums, and one of them was standards, and then the other one was rock and roll.
[06:41.800 --> 06:48.960] But they were put out while she was filming Star Trek, and they have those themes, like
[06:48.960 --> 06:53.160] Down to Earth and Out of This World are the names of the albums.
[06:53.160 --> 06:57.320] And I can guarantee you that they are far superior to the albums that William Shatner
[06:57.320 --> 06:58.320] put out.
[06:58.320 --> 07:05.080] Bob, actually, I could do a better job.
[07:05.080 --> 07:09.280] If you want to know what we're talking about, go on YouTube and look up William Shatner
[07:09.280 --> 07:13.200] Tambourine Man, and you'll see everything that you need to know.
[07:13.200 --> 07:14.200] Yikes.
[07:14.200 --> 07:15.200] All right.
[07:15.200 --> 07:16.200] Let's move on.
Quickie with Bob: Friction (7:15)
- Friction: Atomic-scale friction between single-asperity contacts unveiled through in situ transmission electron microscopy[1]
[07:16.200 --> 07:17.640] Bob, you're going to start us off with a quickie.
[07:17.640 --> 07:18.640] Thank you, Steve.
[07:18.640 --> 07:19.640] Gird your loins, everyone.
[07:19.640 --> 07:22.160] This is your Quickie with Bob.
[07:22.160 --> 07:26.920] Friction in the news, specifically from the journal Nature Nanotechnology.
[07:26.920 --> 07:32.840] Recently, using an electron microscope, researchers were, for the first time, able to image two
[07:32.840 --> 07:37.240] surfaces coming into contact and sliding across each other with atomic resolution.
[07:37.240 --> 07:43.000] A professor of mechanical engineering and materials science, Guofeng Wang, said, in
[07:43.000 --> 07:47.160] the study, we were able to actually see the sliding pathway of interface atoms and the
[07:47.160 --> 07:52.600] dynamic strain and stress evolution on the interface that has only previously been shown
[07:52.600 --> 07:53.600] by simulations.
[07:53.600 --> 07:58.560] Now, after seeing this atomic process of friction in the real world, researchers were able to
[07:58.560 --> 08:03.680] go back to the simulation to verify not only what the microscopic visualization showed,
[08:03.680 --> 08:08.760] but also understand more about the specific forces at play at the atomic scale.
[08:08.760 --> 08:13.720] Wang describes one of his main takeaways from this experiment when he said, what we found
[08:13.720 --> 08:19.040] is that no matter how smooth and clean the surface is, friction still occurs at the atomic
[08:19.040 --> 08:20.040] level.
[08:20.040 --> 08:21.360] It's completely unavoidable.
[08:21.360 --> 08:25.680] However, this knowledge can lead to better lubricants and materials to minimize friction
[08:25.680 --> 08:29.760] and wear as much as possible, extending the life of mechanical systems.
[08:29.760 --> 08:34.240] And also, this new method can now apparently be applied to any material to learn more about
[08:34.240 --> 08:36.600] the role friction and wear plays on it.
[08:36.600 --> 08:40.320] And who knows what this research can do for Astroglide.
[08:40.320 --> 08:43.280] Loins un-girded, this has been your Quickie with Bob.
[08:43.280 --> 08:45.080] I hope it was good for you, too.
[08:45.080 --> 08:46.080] Jesus, Bob.
[08:46.080 --> 08:47.080] Hey, man.
[08:47.080 --> 08:48.080] It's thematic.
[08:48.080 --> 08:54.160] Yeah, it's only because you said loins un-girded right after you said astroglide.
[08:54.160 --> 08:55.160] All right.
[08:55.160 --> 08:56.160] Oh, yes.
[08:56.160 --> 08:57.160] Thanks, Bob.
[08:57.160 --> 08:58.160] Sure.
News Items
The Neuroscience of Politics (8:59)
[08:58.160 --> 08:59.160] All right.
[08:59.160 --> 09:02.720] I'm going to start off the news items.
[09:02.720 --> 09:06.120] This is one about the neuroscience of politics.
[09:06.120 --> 09:07.120] Oh, geez.
[09:07.120 --> 09:08.120] Yeah.
[09:08.120 --> 09:09.120] Well, this is interesting.
[09:09.120 --> 09:17.080] And so there's a study that came out looking at fMRI scans of different people, subjects
[09:17.080 --> 09:23.000] that took a survey, answered a bunch of questions so that they could be characterized on a Likert
[09:23.000 --> 09:29.440] scale from one to six, from very liberal to very conservative.
[09:29.440 --> 09:33.480] And so they're just using that just one axis, liberal or conservative, which is a massive
[09:33.480 --> 09:36.120] oversimplification of the political landscape.
[09:36.120 --> 09:39.380] But whatever, that's the scale they used.
[09:39.380 --> 09:46.080] And comparing that to different functions in the brain, looking at functional connectivity
[09:46.080 --> 09:52.120] directly, under nine different tasks, one of which is doing nothing.
[09:52.120 --> 09:57.900] And a quote unquote task is just something, it's just a neurological environment that
[09:57.900 --> 09:59.800] you're putting the subject under.
[09:59.800 --> 10:04.040] For example, you could be showing them images of people expressing emotion.
[10:04.040 --> 10:06.960] That would be one quote unquote task, right?
[10:06.960 --> 10:09.200] Another one might be a memory task.
[10:09.200 --> 10:13.720] Another one was trying to get a reward by hitting a button as quickly as you could.
[10:13.720 --> 10:17.960] Now, none of these tasks had anything to do with politics or ideology.
[10:17.960 --> 10:23.720] They were considered politically neutral, but they just wanted to get the brain to light
[10:23.720 --> 10:27.640] up in different ways and see if there are statistical differences between liberal brains
[10:27.640 --> 10:29.800] and conservative brains, right?
[10:29.800 --> 10:31.680] That was the goal of this research.
[10:31.680 --> 10:37.760] This is not the first study to do this, although this is the first one to try to look at functional
[10:37.760 --> 10:44.520] connectivity. Previous research of the neuroanatomical correlates of political ideology, as we would
[10:44.520 --> 10:50.120] say, looked at more of like modules in the brain, like which piece of the brain is lighting
[10:50.120 --> 10:54.720] up versus, you know, this one is looking at different circuits in the brain, lighting
[10:54.720 --> 10:55.720] up.
[10:55.720 --> 10:56.720] You know, the differences are interesting.
[10:56.720 --> 11:02.020] Now, you know, this is a technically very complicated study and I don't have the expertise
[11:02.020 --> 11:06.720] to dissect it technically, like I don't know if they're using the right fMRI technique
[11:06.720 --> 11:12.120] or they used AI to sort of analyze the data and I have no idea if their analysis is valid
[11:12.120 --> 11:13.120] or not.
[11:13.120 --> 11:16.880] So I just wanted to talk conceptually about the research itself, you know, sort of just
[11:16.880 --> 11:22.980] take the results at face value for now, you know, it got through peer review and it will
[11:22.980 --> 11:26.680] go through further analysis and attempts at replication.
[11:26.680 --> 11:32.920] So we'll put that aside, the technical analysis for now and just see, I'm sorry, it was published
[11:32.920 --> 11:37.160] in the Proceedings of the National Academy of Sciences, right?
[11:37.160 --> 11:38.160] Right?
[11:38.160 --> 11:41.600] So, which is a good journal.
[11:41.600 --> 11:46.120] So let's back up a little bit and look at this research in general.
[11:46.120 --> 11:51.700] Are there brain differences between people with different political ideology?
[11:51.700 --> 11:57.680] So I've already alluded to one of the issues with this research paradigm and that is how
[11:57.680 --> 11:59.920] are you choosing the ideology to look at?
[11:59.920 --> 12:06.360] In this research, they did a one-dimensional liberal to conservative scale, right?
[12:06.360 --> 12:11.440] But we know that politics is about a lot more than that.
[12:11.440 --> 12:18.240] You know, we have a two-party system in the United States and so things tend to sort out,
[12:18.240 --> 12:23.980] you know, it's actually Democrat to Republican, but Democrats are at least several different
[12:23.980 --> 12:29.120] ideologies and Republicans are at least several different ideologies.
[12:29.120 --> 12:35.360] You know, there are sort of coalitions of ideologies and they're actually somewhat in
[12:35.360 --> 12:38.160] flux in the US in the last few years.
[12:38.160 --> 12:39.480] But that's not what they looked at.
[12:39.480 --> 12:40.480] They didn't look at Democrat versus Republican.
[12:40.480 --> 12:41.480] No, they didn't.
[12:41.480 --> 12:42.480] They looked at liberal versus conservative.
[12:42.480 --> 12:46.120] But the point of why, you know, what is, why choose that, you know?
[12:46.120 --> 12:50.160] Well, I think that that's the prevailing model across all political science.
[12:50.160 --> 12:57.400] I know that, but my point is, does that reflect reality or is that a cultural construct that
[12:57.400 --> 13:01.320] may not reflect any kind of neurological reality?
[13:01.320 --> 13:02.320] Oh, I see.
[13:02.320 --> 13:06.680] Yeah, I think the interesting thing is that it's a cultural construct that seems to hold
[13:06.680 --> 13:08.400] in most cultures.
[13:08.400 --> 13:11.600] Yeah, but so what parts of it though?
[13:11.600 --> 13:17.360] Because what, so what's, so for example, are we talking about socially liberal, right?
[13:17.360 --> 13:26.200] Or economically liberal or liberal in terms of foreign policy or what, you know, it breaks
[13:26.200 --> 13:32.060] down multiple different ways and they don't always align, you know?
[13:32.060 --> 13:39.120] And so you can define it in different ways and then you're looking at basically completely
[13:39.120 --> 13:42.320] different phenomena that you're just labeling liberal and labeling conservative.
[13:42.320 --> 13:47.680] I think the issue, I mean, obviously this is, you dug a lot deeper, but I think the
[13:47.680 --> 13:53.840] issue is that if we use that sort of, let's say, fiscal and socially liberal conservative
[13:53.840 --> 13:58.840] quad, you're going to find that there are people who are extremely liberal, which means
[13:58.840 --> 14:03.240] that they are both socially and financially or economically liberal.
[14:03.240 --> 14:06.320] And then you're going to find people who are extremely conservative.
[14:06.320 --> 14:10.400] So they are both financially and socially conservative and they're going to be the most
[14:10.400 --> 14:11.480] severe ends.
[14:11.480 --> 14:17.520] And so if you can take the most severe ends and almost caricature them as an archetype,
[14:17.520 --> 14:20.240] then you might have a better chance to see differences.
[14:20.240 --> 14:29.240] But I think, so they also did look at extremists versus moderates in this study and which to
[14:29.240 --> 14:35.760] me provokes yet another question, which is not answered by the data.
[14:35.760 --> 14:43.860] And that is, so is it, is an extreme liberal someone who is just very liberal or are they
[14:43.860 --> 14:49.680] a liberal who happens to have other cognitive qualities that make them extreme?
[14:49.680 --> 14:51.880] How do you even define extreme liberal?
[14:51.880 --> 14:55.920] Because that definitely, as you mentioned, is culture bound.
[14:55.920 --> 15:00.800] Like what we consider super liberal in America is like moderate in a lot of European countries.
[15:00.800 --> 15:04.520] Well, yeah, it's relative to the culture, but in this case, this is a study of Americans
[15:04.520 --> 15:09.040] and they looked at and they were using a survey.
[15:09.040 --> 15:13.600] You answer these questions and then we grade you on these questions, liberal to conservative.
[15:13.600 --> 15:18.620] You're right, that spectrum may be different in different countries.
[15:18.620 --> 15:21.840] They may use the labels differently too, which we won't get into.
[15:21.840 --> 15:22.960] That's a different thing.
[15:22.960 --> 15:29.720] But my point is like, are there extremists and extremists can end up as an independent
[15:29.720 --> 15:34.740] variable of whether they're liberal or conservative or they just people who are really conservative
[15:34.740 --> 15:38.860] and people who are really liberal and maybe it's both.
[15:38.860 --> 15:39.860] Maybe it's both.
[15:39.860 --> 15:44.500] Maybe there are people who are, whatever ideology they have, they're going to be extreme, right?
[15:44.500 --> 15:53.040] And if they happen to fall on the conservative side, then they would rank as an extreme conservative.
[15:53.040 --> 15:57.000] That doesn't necessarily mean that they're really more conservative than a moderate conservative.
[15:57.000 --> 16:03.120] They're just conservative and they have a cognitive style that makes them extreme.
[16:03.120 --> 16:06.460] And this is just speculation because again, this study didn't really have any way of
[16:06.460 --> 16:10.960] sorting all this out, but it's just sort of treating it as one dimensional.
[16:10.960 --> 16:13.520] Are they all neuroscientists on the study?
[16:13.520 --> 16:16.540] Are there any psychologists that are in their political neuroscience?
[16:16.540 --> 16:18.080] They are political neuroscientists.
[16:18.080 --> 16:19.080] Political neuroscientists.
[16:19.080 --> 16:20.080] Interesting.
[16:20.080 --> 16:21.080] Okay.
[16:21.080 --> 16:24.240] But with all that in mind, let's look a little bit at the data and then you'll also see what
[16:24.240 --> 16:25.920] I mean a little bit.
[16:25.920 --> 16:31.760] So the bottom line is what they found is that all nine states, all nine tasks that they
[16:31.760 --> 16:37.980] gave them showed statistical differences between people who ranked liberal and people who ranked
[16:37.980 --> 16:42.920] conservative on this, on their study, which is interesting.
[16:42.920 --> 16:47.400] Why would, you know, they basically chose nine tasks mainly because they could do that,
[16:47.400 --> 16:48.400] right?
[16:48.400 --> 16:50.120] There are just easy ways to do them for fMRI studies.
[16:50.120 --> 16:54.000] We'll give them a reward task and a memory task and whatever.
[16:54.000 --> 16:58.320] I mean, they weren't picked because they thought they would relate to ideology.
[16:58.320 --> 17:01.400] In fact, they thought that they wouldn't relate to ideology.
[17:01.400 --> 17:02.800] That's why they picked them.
[17:02.800 --> 17:08.840] So why did they all show statistical differences between liberal and conservative?
[17:08.840 --> 17:17.400] The authors suspect that it's because there are just some fundamental differences
[17:17.400 --> 17:23.800] between the liberal to conservative neurological function, if you will, that just shows up
[17:23.800 --> 17:27.040] in every fMRI you do, even when they're doing nothing.
[17:27.040 --> 17:31.120] You know, it's just, you know, the brains are just functioning differently and it just
[17:31.120 --> 17:34.520] contaminates every state that you look at.
[17:34.520 --> 17:39.440] But did they only look at people who, like on those self-report surveys, who kind of
[17:39.440 --> 17:43.280] fell at a certain threshold of liberality or conservatism?
[17:43.280 --> 17:47.760] No, again, there was a Likert scale, a six-point scale, so you could have been in the middle,
[17:47.760 --> 17:48.760] right?
[17:48.760 --> 17:49.760] Right.
[17:49.760 --> 17:50.760] But they didn't use anybody who's like, meh?
[17:50.760 --> 17:56.000] Well, yeah, it was mild liberal, liberal, extreme liberal, right?
[17:56.000 --> 17:59.200] Mild conservative, conservative, extreme conservative.
[17:59.200 --> 18:01.580] Those are the six points on this scale.
[18:01.580 --> 18:05.660] And they found significant differences between mild liberals and mild conservatives across
[18:05.660 --> 18:06.660] all six?
[18:06.660 --> 18:08.160] Because that's surprising.
[18:08.160 --> 18:13.880] No, I think if they like included all of the liberals and included all of the conservatives,
[18:13.880 --> 18:15.880] they showed differences.
[18:15.880 --> 18:19.440] But they also looked at extremists versus moderates.
[18:19.440 --> 18:23.780] And for that, so I'm going to back up a little bit and just ask you guys a question.
[18:23.780 --> 18:30.520] What do you think is the factor that predicts somebody's political ideology more than any
[18:30.520 --> 18:31.520] other factor?
[18:31.520 --> 18:32.520] Because this could be anything.
[18:32.520 --> 18:35.960] You mean like, for example, like education, something like that?
[18:35.960 --> 18:36.960] Yeah.
[18:36.960 --> 18:37.960] Age.
[18:37.960 --> 18:38.960] Like a demographic factor?
[18:38.960 --> 18:39.960] I'm going to say where they're born.
[18:39.960 --> 18:40.960] Family history.
[18:40.960 --> 18:41.960] Bob's correct.
[18:41.960 --> 18:44.800] You know, pretty much, it's their parents, right?
[18:44.800 --> 18:51.420] So whatever your parents' ideology are, that is the strongest predictor of your ideology.
[18:51.420 --> 18:55.600] It's not where you're born, because if you're born to a liberal parent in a red state,
[18:55.600 --> 18:59.360] you're still going to be liberal, not red, right?
[18:59.360 --> 19:01.000] Not conservative.
[19:01.000 --> 19:03.280] And then, of course, this cuts both ways.
[19:03.280 --> 19:07.060] This doesn't tell you if it's nature versus nurture, because you could say that they inherited
[19:07.060 --> 19:11.720] the genes from their parents, but they also were raised by their parents to be liberal
[19:11.720 --> 19:13.080] or to be conservative.
[19:13.080 --> 19:14.620] And so you don't know.
[19:14.620 --> 19:19.340] But there are twin studies, you know, you do like twins separated at birth, and that
[19:19.340 --> 19:21.940] shows that it's at least partly genetic.
[19:21.940 --> 19:26.800] It does appear to be at least partly genetic, but not fully, like pretty much everything
[19:26.800 --> 19:27.800] with the brain, right?
[19:27.800 --> 19:33.440] It's a combination of genetic and, you know, hardwiring and also environmental factors.
[19:33.440 --> 19:40.040] Okay, so with that in mind, they said, how much does every, you know, all of these states
[19:40.040 --> 19:44.640] predict whether or not somebody will be on the liberal or the conservative end?
[19:44.640 --> 19:51.800] And also, does it predict who will be extreme versus moderate, like who will, you know,
[19:51.800 --> 19:57.160] counting extreme as the two ends of the spectrum, the very liberal and very conservative?
[19:57.160 --> 20:02.280] So first of all, they found that overall, the functional connectivity patterns were
[20:02.280 --> 20:08.760] as predictive as the ideology of the parents, which is like the gold standard.
[20:08.760 --> 20:12.120] So it was as predictive as parental ideology.
[20:12.120 --> 20:17.600] And for the tasks that they looked at, there were three that correlated the most.
[20:17.600 --> 20:20.000] So there were three standouts.
[20:20.000 --> 20:24.320] One was the empathy task, which was looking at pictures of people who are expressing an
[20:24.320 --> 20:25.320] emotion.
[20:25.320 --> 20:28.800] Right, so it was supposed to be like, how much are you reacting to the emotion that
[20:28.800 --> 20:29.800] you're seeing?
[20:29.800 --> 20:32.800] Aka bleeding heart liberals.
[20:32.800 --> 20:33.800] Yeah.
[20:33.800 --> 20:35.320] Well, that's not correct.
[20:35.320 --> 20:36.320] Oh, really?
[20:36.320 --> 20:42.120] What that correlated with is being politically moderate or ideologically moderate.
[20:42.120 --> 20:46.560] The reward task, on the other hand, correlated with extremism.
[20:46.560 --> 20:53.200] You know, the reward task was trying to win a prize by hitting the button fast.
[20:53.200 --> 20:59.760] And a retrieval task, which involved, you know, it was a memory retrieval thing.
[20:59.760 --> 21:06.480] And that task was the most different among liberal to conservative spectrum, right?
[21:06.480 --> 21:08.400] So you could see that you could predict.
[21:08.400 --> 21:09.400] In what direction?
[21:09.400 --> 21:13.320] I'm just saying, like, you could, you know, you can predict based upon the way their brains
[21:13.320 --> 21:17.320] looked on that task, if they're liberal or conservative more than other tasks.
[21:17.320 --> 21:21.060] Yeah, but I'm saying, like, what was the pattern?
[21:21.060 --> 21:25.320] It's a statistical thing using AI looking at the patterns on fMRI scans.
[21:25.320 --> 21:28.360] So I'd have to show you pictures of fMRI scans.
[21:28.360 --> 21:33.360] Okay, so they weren't using predefined circuitry that they know is like, okay, this is a reward
[21:33.360 --> 21:38.520] circuitry, or this is a retrieval circuitry, and we want to see who is loading more
[21:38.520 --> 21:39.840] on it, and who's loading less.
[21:39.840 --> 21:42.480] No, they were just saying, what's happening in the brain when we have them do this?
[21:42.480 --> 21:47.560] Oh, and now let's see if they sort out into different patterns.
[21:47.560 --> 21:54.680] Can we use those patterns to predict how they scored on the test, you know, on the liberal
[21:54.680 --> 21:55.680] to conservative survey?
[21:55.680 --> 21:56.680] Interesting.
[21:56.680 --> 21:59.720] So it's almost like it's meaningful insofar as it has predictive power, but it's not really
[21:59.720 --> 22:02.160] meaningful insofar as it tells us anything.
[22:02.160 --> 22:03.160] Right.
[22:03.160 --> 22:04.840] We don't know what it tells us.
[22:04.840 --> 22:09.480] So this is really hypothesis hunting, if you will, because they're just saying, hey, what's
[22:09.480 --> 22:13.040] going on in the brain when we give liberals and conservatives different tasks?
[22:13.040 --> 22:14.760] Oh, I wonder what that means.
[22:14.760 --> 22:18.720] What does it mean that certain patterns in your brain when you're doing a reward task
[22:18.720 --> 22:22.480] predict if you're an extremist or a moderate, you know?
[22:22.480 --> 22:27.560] There's also no way to know if we're just loading on a completely different construct,
[22:27.560 --> 22:31.560] like a confounding variable, kind of like those old studies where it's
[22:31.560 --> 22:35.520] like, we took smokers and compared them to non-smokers, and we asked them X, Y and Z.
[22:35.520 --> 22:38.840] And it's like, well, but maybe it's because the smokers drink more coffee, or maybe it's
[22:38.840 --> 22:45.360] because of other problems. I'm sure there are confounding factors in this kind of study.
[22:45.360 --> 22:49.680] Yeah, this is what I was alluding to at the top of this, that we don't know that we are
[22:49.680 --> 22:56.120] dealing with fundamental phenomena or just, you know, one or two layers removed from those
[22:56.120 --> 22:59.120] fundamental phenomena.
[22:59.120 --> 23:02.000] What is it about someone that makes them liberal?
[23:02.000 --> 23:05.880] Like liberalness may not be a thing unto itself neurologically.
[23:05.880 --> 23:10.400] It's just a cultural manifestation of more fundamental neurological functions.
[23:10.400 --> 23:13.360] Like you could think of things like empathy, for example.
[23:13.360 --> 23:14.600] And but then what's that?
[23:14.600 --> 23:15.600] What is empathy?
[23:15.600 --> 23:17.960] Is that even a fundamental neurological function?
[23:17.960 --> 23:21.800] Or is that a manifestation of other things that are happening in the brain, other, you
[23:21.800 --> 23:23.840] know, circuits that are firing?
[23:23.840 --> 23:28.880] And so we're trying to dig down, but we are not at the base level yet.
[23:28.880 --> 23:29.880] No.
[23:29.880 --> 23:35.320] And but there's something about the there's like a face validity kind of component to
[23:35.320 --> 23:40.080] this, which has long interested me, and I'm assuming it really interests the authors too,
[23:40.080 --> 23:44.720] that there's something that feels fundamental about ideology.
[23:44.720 --> 23:49.680] Because once you sort of start to develop an awakening into ideology, you know, as your
[23:49.680 --> 23:52.520] as your child, you don't really know what's going on in the world.
[23:52.520 --> 23:57.400] But once you start to say, this becomes part of my identity, it actually is very fundamental
[23:57.400 --> 23:59.780] to a lot of people's identities.
[23:59.780 --> 24:08.340] It's very rare for people to switch parties, unless there's some sort of personal insult
[24:08.340 --> 24:12.600] to their reasoning, like their party fails them.
[24:12.600 --> 24:13.600] Yeah.
[24:13.600 --> 24:17.960] Or like they're going through a realignment like we're doing now.
[24:17.960 --> 24:18.960] Exactly.
[24:18.960 --> 24:19.960] Yeah.
[24:19.960 --> 24:25.040] So this study does not tell us what the arrow of causation is, right, it's looking
[24:25.040 --> 24:26.200] only for correlation.
[24:26.200 --> 24:30.480] So it doesn't tell us that people are liberal because their brains function this way.
[24:30.480 --> 24:33.120] It could be that their brains function this way because they're liberal.
[24:33.120 --> 24:34.120] Right.
[24:34.120 --> 24:35.120] Because these are pathways.
[24:35.120 --> 24:36.120] These aren't hard.
[24:36.120 --> 24:37.400] Like this is just use and disuse kind of stuff.
[24:37.400 --> 24:38.400] Right.
[24:38.400 --> 24:39.400] Exactly.
[24:39.400 --> 24:40.400] This could just all be learned.
[24:40.400 --> 24:42.820] You know, this could be the patterns of functioning in the brain because you were
[24:42.820 --> 24:47.640] raised this way, rather than predisposed to being liberal or conservative.
[24:47.640 --> 24:48.640] Doesn't answer that.
[24:48.640 --> 24:53.440] I also think it doesn't answer if these things that we're looking at, like the circuits in
[24:53.440 --> 24:59.080] the brain that we're looking at, if they are fundamental to ideology or incidental to ideology.
[24:59.080 --> 25:00.080] We don't know that.
[25:00.080 --> 25:03.560] But there's one other way to sort of look at this data, which is interesting.
[25:03.560 --> 25:07.440] And that is, so what are the parts of the brain that were different?
[25:07.440 --> 25:11.880] Let's just ask that question from liberal to conservative, not the necessarily the functional
[25:11.880 --> 25:17.080] circuits or what tasks they were doing, but just what parts of the brain were involved.
[25:17.080 --> 25:21.520] And they were mostly the hippocampus, the amygdala and the frontal lobes.
[25:21.520 --> 25:26.480] So those are all involved with emotional processing.
[25:26.480 --> 25:31.760] And that's very provocative, in my opinion, because that suggests that ideology is really
[25:31.760 --> 25:36.480] tied very strongly to emotional processing.
[25:36.480 --> 25:40.560] It wasn't so much the more rational cognitive parts of the brain, it was the emotional parts
[25:40.560 --> 25:46.240] of the brain that were able to predict liberal to conservative, extreme to moderate.
[25:46.240 --> 25:51.160] So it's really, you know, political ideology may say more about our emotional makeup than
[25:51.160 --> 25:56.780] our cognitive style, which is interesting to think about, which kind of does jibe with
[25:56.780 --> 26:04.560] other, you know, other research that shows that we tend to come to opinions that are
[26:04.560 --> 26:13.060] emotionally salient to us based upon our emotional instincts, and then post hoc rationalize them
[26:13.060 --> 26:19.080] with motivated reasoning, which is why it's so hard to resolve political or ideological
[26:19.080 --> 26:21.920] or religious, you know, disagreements because people aren't reasoning their way to them
[26:21.920 --> 26:22.920] in the first place.
[26:22.920 --> 26:28.400] They're just using motivated reasoning to backfill their emotional gut instinct.
[26:28.400 --> 26:31.160] And that's their worldview, what feels right to them.
[26:31.160 --> 26:35.340] Now, this is the way it is because it feels right to me, and I will make sure I figure
[26:35.340 --> 26:37.240] out a way to make it make sense.
[26:37.240 --> 26:42.560] And it's why sometimes you have to shake your head at the motivated reasoning that the quote
[26:42.560 --> 26:44.040] unquote other side is engaging.
[26:44.040 --> 26:46.000] But of course, we all do this.
[26:46.000 --> 26:48.280] This is a human condition.
[26:48.280 --> 26:52.400] It also kind of speaks to I mean, I'm curious if you agree, but it speaks to, I think, a
[26:52.400 --> 26:57.880] fundamental construct that's involved in political discourse or political thinking,
[26:57.880 --> 26:59.760] which is moral reasoning.
[26:59.760 --> 27:03.040] And moral reasoning is fundamentally emotive.
[27:03.040 --> 27:07.720] Like it is cognitive and emotive, but you can't strip emotive reasoning away from moral
[27:07.720 --> 27:08.720] philosophy.
[27:08.720 --> 27:09.720] It's part of it.
[27:09.720 --> 27:10.720] Totally.
[27:10.720 --> 27:11.720] Yeah, totally.
[27:11.720 --> 27:12.720] Yeah.
[27:12.720 --> 27:13.720] Like we feel injustice.
[27:13.720 --> 27:14.720] Yes.
[27:14.720 --> 27:15.720] Absolutely.
[27:15.720 --> 27:16.720] And then we then we then we rationalize why that's unjust.
[27:16.720 --> 27:17.720] Yeah.
[27:17.720 --> 27:18.720] Because if we feel it first, absolutely.
[27:18.720 --> 27:24.760] If we were totally cognitive and like a kind of extreme example of like a cool, cold, calculated
[27:24.760 --> 27:28.000] cognitive, the humanism would be not there.
[27:28.000 --> 27:29.840] And that's also dangerous.
[27:29.840 --> 27:30.840] Mm hmm.
[27:30.840 --> 27:31.840] Yeah.
[27:31.840 --> 27:34.880] I mean, that's why I kind of like, you know, science fiction shows that explore that through
[27:34.880 --> 27:40.000] characters that have different emotional makeup than humans like Vulcans or androids or whatever,
[27:40.000 --> 27:45.480] because it's like they are they have only rational reasoning, no emotional reasoning.
[27:45.480 --> 27:46.960] And it's just a good thought experiment.
[27:46.960 --> 27:48.840] What would that result in?
[27:48.840 --> 27:55.440] And even to the point of taking what seem like really extreme moral positions, but they
[27:55.440 --> 27:57.240] make perfect rational sense.
[27:57.240 --> 27:58.240] Right.
[27:58.240 --> 27:59.240] Exactly.
[27:59.240 --> 28:00.240] But they don't feel right.
[28:00.240 --> 28:01.240] They don't feel right to us.
[28:01.240 --> 28:02.240] So they got to be wrong.
[28:02.240 --> 28:03.240] All right.
[28:03.240 --> 28:04.240] Let's move on.
[28:04.240 --> 28:05.240] Very fascinating.
[28:05.240 --> 28:06.240] So this is something that I sort of follow.
[28:06.240 --> 28:09.320] And so I'm sure we'll be talking about this again in the future.
[28:09.320 --> 28:13.520] And this is one tiny slice of an obviously very complicated phenomenon.
[28:13.520 --> 28:17.440] No one study is going to give us the answer as to what like a liberal brain is or a conservative
[28:17.440 --> 28:19.860] brain or even if there is such a thing.
[28:19.860 --> 28:21.520] But it is very interesting.
[28:21.520 --> 28:22.520] All right.
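As a concrete illustration of the kind of analysis described in this segment, here is a minimal sketch of predicting a one-to-six ideology score from functional connectivity patterns. This is not the study's actual pipeline: the sample size, parcellation, random placeholder data, and ridge-regression model are all assumptions made for the example.

```python
# Illustrative sketch only: predict a 1-6 ideology score from functional-
# connectivity features, in the spirit of the study discussed above.
# The data here are random placeholders; the paper's real analysis is not
# reproduced, and the model choice (ridge regression) is an assumption.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_subjects = 100   # hypothetical sample size
n_regions = 50     # hypothetical brain parcellation

# Each subject's connectivity is summarized as the flattened upper triangle
# of a region-by-region correlation matrix (one row per subject).
n_edges = n_regions * (n_regions - 1) // 2
connectivity = rng.normal(size=(n_subjects, n_edges))

# Self-reported ideology on a six-point Likert scale
# (1 = very liberal ... 6 = very conservative).
ideology = rng.integers(1, 7, size=n_subjects).astype(float)

# Cross-validated check: do connectivity patterns predict the survey score?
model = Ridge(alpha=1.0)
scores = cross_val_score(model, connectivity, ideology, cv=5, scoring="r2")
print("mean cross-validated R^2:", scores.mean())
```

With random placeholder data the cross-validated R² sits near zero, as it should; the study's claim is that real connectivity data predicts the score roughly as well as parental ideology does.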
Cozy Lava Tubes (28:21)
[28:22.520 --> 28:28.280] Jay, this is cool, actually, literally and figuratively cool.
[28:28.280 --> 28:32.960] Tell us about lava tubes and the temperature of them.
[28:32.960 --> 28:33.960] All right.
[28:33.960 --> 28:36.400] First, I want to start by saying, Bob, just calm down.
[28:36.400 --> 28:37.400] Yep.
[28:37.400 --> 28:40.040] I tried to select this topic for this weekend.
[28:40.040 --> 28:41.040] I was shut down.
[28:41.040 --> 28:44.360] So I was very surprised when you said Jay and not Bob.
[28:44.360 --> 28:46.080] Bob, I need you to breathe.
[28:46.080 --> 28:47.080] Okay.
[28:47.080 --> 28:48.080] I'm breathing, man.
[28:48.080 --> 28:49.080] Just get your facts right, baby.
[28:49.080 --> 28:50.080] All right.
[28:50.080 --> 28:57.200] So we've talked about lava tubes as potential locations for future moon habitats, haven't
[28:57.200 --> 28:58.200] we?
[28:58.200 --> 28:59.200] Quite a bit.
[28:59.200 --> 29:00.200] Yes.
[29:00.200 --> 29:04.920] Well, a little later in this news item, I'm going to blow your mind about lava tubes.
[29:04.920 --> 29:09.760] But let me fill your brain a little bit with some interesting things that'll make you appreciate
[29:09.760 --> 29:10.940] it even more.
[29:10.940 --> 29:14.860] So I recently talked about, I think it was last week, about the Artemis 1 mission that
[29:14.860 --> 29:16.840] could be launching very soon.
[29:16.840 --> 29:21.560] This mission sends an uncrewed command module in orbit around the moon.
[29:21.560 --> 29:26.400] If everything goes well, the Artemis 2 mission, which will be crewed, can launch as early
[29:26.400 --> 29:27.760] as 2025.
[29:27.760 --> 29:32.720] It's NASA's intention, this is important, it's their intention to send people to the
[29:32.720 --> 29:36.920] surface of the moon, build habitats, and have people live there.
[29:36.920 --> 29:38.120] That's what they want.
[29:38.120 --> 29:40.560] And I couldn't agree with this more.
[29:40.560 --> 29:43.640] This is the best thing that I think they could be doing right now.
[29:43.640 --> 29:47.480] I imagine that this whole effort is going to be similar to how people stay on the space
[29:47.480 --> 29:48.480] station, right?
[29:48.480 --> 29:51.400] You know, they rotate crew on and off.
[29:51.400 --> 29:54.800] So they probably will rotate crew to and from the moon.
[29:54.800 --> 29:58.480] Some of them will be staying for longer periods of time.
[29:58.480 --> 30:01.900] They'll conduct experiments and at some point they'll start building a place for future
[30:01.900 --> 30:03.380] visitors to live.
[30:03.380 --> 30:08.640] There's a ton of details that we all have to consider about people living on the moon,
[30:08.640 --> 30:09.640] especially NASA.
[30:09.640 --> 30:12.760] NASA has been thinking about this for a long time.
[30:12.760 --> 30:13.760] First, what?
[30:13.760 --> 30:16.520] The moon has no atmosphere, almost.
[30:16.520 --> 30:18.480] There's a tiny little bit of atmosphere on the moon.
[30:18.480 --> 30:22.760] It's about as dense as the atmosphere that's around the space station, which is in low
[30:22.760 --> 30:23.760] Earth orbit.
[30:23.760 --> 30:28.080] There is atmosphere there, but it is essentially a vacuum.
[30:28.080 --> 30:29.080] Not a perfect vacuum.
[30:29.080 --> 30:30.520] It's fascinating, too.
[30:30.520 --> 30:31.520] It's tiny, tiny, tiny, tiny.
[30:31.520 --> 30:32.520] Yeah.
[30:32.520 --> 30:37.000] I mean, one example I heard, Jay, if you took the air that's in like a baseball or football
[30:37.000 --> 30:41.600] stadium in the United States, that kind of size, if you take the air that's inside that
[30:41.600 --> 30:45.160] and spread it around the moon, that's the density we're talking about.
[30:45.160 --> 30:46.160] Yeah.
[30:46.160 --> 30:47.160] It's nothing.
[30:47.160 --> 30:48.160] Quite thin, but fascinating, though.
[30:48.160 --> 30:49.160] It's a thing.
[30:49.160 --> 30:50.160] It's a thing.
[30:50.160 --> 30:54.920] So, Bob, as a point of curiosity, an average human can stay conscious for about 20 seconds
[30:54.920 --> 30:55.920] in a vacuum.
[30:55.920 --> 30:56.920] Yeah.
[30:56.920 --> 30:57.920] About.
[30:57.920 --> 31:03.920] And the next thing about the moon that we have to be concerned with is the temperature.
[31:03.920 --> 31:05.800] The moon has extreme temperatures.
[31:05.800 --> 31:11.880] The daytime temperature there is 260 degrees Fahrenheit or 126 degrees Celsius.
[31:11.880 --> 31:19.720] Nighttime, it goes way down to minus 280 degrees Fahrenheit or minus 173 degrees Celsius.
[31:19.720 --> 31:21.520] Super hot, super cold.
[31:21.520 --> 31:27.940] So with no atmosphere or magnetosphere, visitors on the moon will also be exposed to solar
[31:27.940 --> 31:30.020] wind and cosmic rays.
[31:30.020 --> 31:34.500] This means that the moon's habitat has to provide a lot of amenities in order for people
[31:34.500 --> 31:36.480] to stay for long periods of time.
[31:36.480 --> 31:41.840] So all that said, keeping in mind how hostile the surface of the moon is, it's looking like
[31:41.840 --> 31:44.920] lava tubes are even more awesome than we thought.
[31:44.920 --> 31:50.840] NASA figured out that some lava tubes have a consistent inner temperature of 63 degrees
[31:50.840 --> 31:54.160] Fahrenheit or 17 degrees Celsius.
[31:54.160 --> 31:55.920] Do I need to repeat that?
[31:55.920 --> 31:57.320] That's amazing.
[31:57.320 --> 31:58.320] Amazing.
[31:58.320 --> 32:00.360] It's like the perfect temperature.
[32:00.360 --> 32:03.960] It's the perfect freaking temperature, you know, or within 10 degrees of the perfect
[32:03.960 --> 32:07.000] temperature for people to live.
[32:07.000 --> 32:10.440] It's like the perfect fall day, let me put it to you that way.
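As a quick arithmetic check of the Fahrenheit and Celsius figures quoted in this segment (an illustrative snippet, not from the episode):

```python
# Verify the temperature conversions mentioned above.
def f_to_c(f):
    return (f - 32.0) * 5.0 / 9.0

print(round(f_to_c(260)))    # lunar daytime: about 127 C (quoted as 126 C)
print(round(f_to_c(-280)))   # lunar nighttime: about -173 C
print(round(f_to_c(63)))     # lava tube interior: about 17 C
```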
[32:10.440 --> 32:19.360] These lava tubes can be as big as 1,600 to 3,000 feet or 500 to 900 meters in diameter,
[32:19.360 --> 32:23.240] which is very, very large, which is fascinating as well.
[32:23.240 --> 32:26.580] There's a reason why lava tubes are large on the moon.
[32:26.580 --> 32:28.040] It's because there's less gravity, right?
[32:28.040 --> 32:31.360] So the more gravity a planet has, the smaller the lava tubes get.
[32:31.360 --> 32:35.680] Well, the moon doesn't have a lot of gravity, so the lava tubes got to be really big.
[32:35.680 --> 32:41.960] Now lava tubes can also, if we build habitats in them, they also can help block harmful
[32:41.960 --> 32:46.760] effects of radiation and micrometeor impact, which happens quite often on the moon.
[32:46.760 --> 32:50.240] It might even be possible to pressurize a lava tube.
[32:50.240 --> 32:55.840] Even if like we can't pressurize a lava tube, for example, there's still a massive benefit
[32:55.840 --> 32:58.880] to building a habitat inside of a lava tube itself.
[32:58.880 --> 33:02.960] So first of all, who would have thought that lava tubes have a cozy temperature?
[33:02.960 --> 33:06.040] That's the thing that I've been rattling around in my head the last few days.
[33:06.040 --> 33:11.600] I just simply can't believe that these things are a perfect temperature for humans to live
[33:11.600 --> 33:12.600] at.
[33:12.600 --> 33:18.200] Now NASA figured this out by analyzing data from the Diviner Lunar Radiometer that's onboard
[33:18.200 --> 33:20.480] the Lunar Reconnaissance Orbiter.
[33:20.480 --> 33:25.960] The data shows the consistent lunar cycle, which is 15 straight days of light and then
[33:25.960 --> 33:30.880] 15 days of dark, and the Diviner instrument measured the temperature of the lunar surface for over
[33:30.880 --> 33:32.120] 11 years.
[33:32.120 --> 33:37.640] And when the sunlight is hitting the surface, the temperature, like I said before, it skyrockets
[33:37.640 --> 33:38.640] way up.
[33:38.640 --> 33:43.040] And then when it gets dark, the temperature plummets very quickly way, way down to a very,
[33:43.040 --> 33:44.040] very low temperature.
[33:44.040 --> 33:47.840] So there's things that are called pits that are on the moon, and most of these were likely
[33:47.840 --> 33:49.800] created by meteor strikes, right?
[33:49.800 --> 33:56.960] Now 16 of these pits so far that have been discovered likely dropped down into a lava
[33:56.960 --> 33:57.960] tube, right?
[33:57.960 --> 33:59.840] So there's a meteor strike.
[33:59.840 --> 34:01.640] It creates a hole.
[34:01.640 --> 34:05.600] That hole cracks into a lava tube that was below the surface, right?
[34:05.600 --> 34:07.040] Can you visualize that?
[34:07.040 --> 34:14.720] Sixteen of them are probably collapsed into a lava tube, but there's no reason to think
[34:14.720 --> 34:19.920] that they're from meteor strikes, it's just that the ground over the lava tube collapsed
[34:19.920 --> 34:21.080] into the lava tube.
[34:21.080 --> 34:23.560] Oh, it sounded to me like those were meteor strikes.
[34:23.560 --> 34:27.040] No, the other ones, the other pits are caused by meteor strikes.
[34:27.040 --> 34:35.320] All right, but the important fact here is that they collapsed down into a lava tube
[34:35.320 --> 34:36.320] for some reason, right?
[34:36.320 --> 34:37.320] Yeah.
[34:37.320 --> 34:38.320] So you have a hole.
[34:38.320 --> 34:42.960] The ceiling of the lava tube collapsed, you know, fell in because it was unstable, whatever.
[34:42.960 --> 34:48.400] It could have been from a nearby impact, you know, shook it and cracked the rocks, whatever.
[34:48.400 --> 34:51.040] Just something happened and it eventually collapsed down.
[34:51.040 --> 34:53.720] Now, the pits have something, though, that's important.
[34:53.720 --> 34:57.920] They have a protective overhang that blocks some of the sunlight, right?
[34:57.920 --> 35:00.480] That's critical, it seems, for this cozy temperature.
[35:00.480 --> 35:01.640] That seems very critical.
[35:01.640 --> 35:06.900] Yeah, this, I guess, Bob, without that rocky overhang that's, you know, partly covering
[35:06.900 --> 35:12.040] up the hole that was made, things wouldn't behave the exact right way in order to create
[35:12.040 --> 35:14.000] this nice, even temperature.
[35:14.000 --> 35:15.320] But so it does a couple of things.
[35:15.320 --> 35:20.560] It blocks light from coming in and it also inhibits retained heat from leaving too fast.
[35:20.560 --> 35:24.400] Now, this is probably why the temperature stays at such a nice temperature.
[35:24.400 --> 35:30.160] This is called blackbody equilibrium, by the way, with a constant temperature of 17 degrees
[35:30.160 --> 35:32.740] Celsius or 63 degrees Fahrenheit.
[35:32.740 --> 35:38.300] There's less than, and this is mind blowing, there's less than a one degree Celsius variation
[35:38.300 --> 35:40.640] throughout the 30-day lunar cycle.
[35:40.640 --> 35:41.720] That's amazing.
[35:41.720 --> 35:43.280] How efficient that is.
[35:43.280 --> 35:48.840] So if we get lucky, one of these pits will indeed connect to a preexisting lava tube.
[35:48.840 --> 35:51.600] And if we find that, we're in business.
[35:51.600 --> 35:58.320] Yeah, they also, their simulations also show that that one degree variance, that 63 degrees
[35:58.320 --> 36:03.720] or 17 degrees Celsius with one degree Celsius variance, probably holds true throughout the
[36:03.720 --> 36:05.520] entire lava tube.
[36:05.520 --> 36:07.280] Yeah, that's what I was wondering.
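For a rough sense of the "blackbody equilibrium" idea mentioned here, the following back-of-the-envelope estimate is illustrative only; the albedo and solar constant values are our assumptions, not figures from the episode or the paper. A lunar surface that effectively averages the absorbed sunlight over the whole day-night cycle settles near

$$T_\mathrm{eq} \approx \left(\frac{(1-A)\,S_\odot}{4\sigma}\right)^{1/4} = \left(\frac{0.88 \times 1361\ \mathrm{W/m^2}}{4 \times 5.67 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}}\right)^{1/4} \approx 270\ \mathrm{K}$$

using an assumed albedo of about 0.12. The shadowed pit floor sits a bit warmer, around 290 K (the roughly 17 °C quoted above), plausibly because the overhang blocks direct sun by day, blocks radiation to the cold sky at night, and the sunlit walls re-radiate onto the floor, which fits the nearly constant temperature described in this segment.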
[36:07.280 --> 36:14.560] So now we have a lava tube that could be pretty long, could be very long, and it has a pretty
[36:14.560 --> 36:15.560] large diameter.
[36:15.560 --> 36:19.720] Huge, yeah, because a little gravity can make them go much bigger than anything found on
[36:19.720 --> 36:20.720] the Earth.
[36:20.720 --> 36:25.160] We're absolutely going to investigate these lava tubes and these pits and see what we
[36:25.160 --> 36:26.160] find.
[36:26.160 --> 36:31.960] And if things happen correctly the way that we want them to, then we most definitely will
[36:31.960 --> 36:34.680] be building some type of habitat inside of one of these.
[36:34.680 --> 36:35.680] It's too good.
[36:35.680 --> 36:36.680] It's just too many benefits.
[36:36.680 --> 36:41.600] It really seems like a no-brainer in a lot of ways because the surface of the moon is
[36:41.600 --> 36:44.920] far deadlier than people generally can appreciate.
[36:44.920 --> 36:48.920] Jay, you mentioned the micrometeoroids, absolutely.
[36:48.920 --> 36:53.160] These things can come in and if you get hit, you will be taken out.
[36:53.160 --> 36:57.980] I don't care what you're wearing, but not only that, even if it hits near you, the debris
[36:57.980 --> 37:01.400] that's kicked up can cause damage, can ruin habitats.
[37:01.400 --> 37:02.880] So you've got that.
[37:02.880 --> 37:07.880] You mentioned the radiation, cosmic radiation, solar radiation, and also the radiation that's
[37:07.880 --> 37:13.000] created at your feet by the other radiation that's hitting the ground by your feet can
[37:13.000 --> 37:15.240] also do more damage.
[37:15.240 --> 37:19.280] And the other thing, Jay, you didn't quite mention, moon dust is horrible.
[37:19.280 --> 37:25.360] The astronauts hated, hated it more than Darth Vader when he was a kid, he was a punk.
[37:25.360 --> 37:26.360] And a kid.
[37:26.360 --> 37:27.360] And a kid.
[37:27.360 --> 37:32.960] And they hated it more than he hated sand because that stuff, think about it, you've
[37:32.960 --> 37:33.960] got-
[37:33.960 --> 37:34.960] It's abrasive, I know, it's sharp.
[37:34.960 --> 37:35.960] It's sharp edges.
[37:35.960 --> 37:36.960] There's no weathering.
[37:36.960 --> 37:37.960] It's everywhere.
[37:37.960 --> 37:38.960] There's no atmosphere.
[37:38.960 --> 37:39.960] There's no water.
[37:39.960 --> 37:40.960] It's very sharp.
[37:40.960 --> 37:41.960] You breathe that in, not good.
[37:41.960 --> 37:42.960] And it also gets everywhere.
[37:42.960 --> 37:45.920] And we got to get, you know, people are going to want to get away.
[37:45.920 --> 37:47.160] You cannot stay there.
[37:47.160 --> 37:50.580] Yeah, you can stay there for three days like our astronauts did.
[37:50.580 --> 37:51.580] That's fine.
[37:51.580 --> 37:54.020] But if you're going to stay there for longer than that, you got to get out of that.
[37:54.020 --> 37:57.080] That is a horrible place to be for an extended period of time.
[37:57.080 --> 37:58.840] And it's like right there waiting for us.
[37:58.840 --> 38:00.880] Now it's even more comfortable than we thought.
[38:00.880 --> 38:04.380] You know, it's just a much better place, it's much safer.
[38:04.380 --> 38:09.520] To me, I mean, why would you spend the resources and the money to dig deep holes and bury yourself
[38:09.520 --> 38:11.080] under the regolith that way,
[38:11.080 --> 38:15.320] when there are places just waiting for you that are better, and gargantuan?
[38:15.320 --> 38:17.960] Now Bob, how about a pressurized lava tube?
[38:17.960 --> 38:18.960] Sure.
[38:18.960 --> 38:23.440] Imagine if there's one that's deep enough that it can hold pressure.
[38:23.440 --> 38:28.720] All you'd really need to do is build two end caps with airlocks, right, that would contain
[38:28.720 --> 38:30.520] the air.
[38:30.520 --> 38:38.040] And you might have to smooth out the interior surface and maybe even coat it to reduce any
[38:38.040 --> 38:39.040] leaking.
[38:39.040 --> 38:43.520] You pressurize that thing and you've got a huge underground city.
[38:43.520 --> 38:44.520] Right.
[38:44.520 --> 38:48.460] And also, it's not like what people think, that, oh, you get a little crack
[38:48.460 --> 38:53.920] in this sealed area that's maintaining air pressure.
[38:53.920 --> 38:57.320] You get a little hole or a crack and the air is going to go rushing out and people are
[38:57.320 --> 38:58.320] going to get sucked out.
[38:58.320 --> 39:03.840] No, you would actually have hours and hours and hours or maybe even days before this would
[39:03.840 --> 39:05.640] reach a critical threshold.
[39:05.640 --> 39:11.600] You have time to fix these, any problems, any cracks or holes that might appear.
[39:11.600 --> 39:12.600] You have time.
[39:12.600 --> 39:13.600] It's not like red alert.
[39:13.600 --> 39:16.200] It's more like a soft yellow alert.
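[A rough back-of-the-envelope sketch of the kind of estimate behind the "hours or days" claim above: treat the escaping air as roughly sonic flow through a small hole, so the pressure decays with an e-folding time of about V/(A·c). The tube dimensions and hole size below are hypothetical, just to show the scale of the numbers.]

```python
import math

def efolding_time(volume_m3, hole_area_m2, sound_speed_m_s=343.0):
    """Rough e-folding time for pressure loss through a small hole to vacuum.

    Assumes the air escapes at roughly the speed of sound (choked flow), so
    pressure decays approximately as p(t) = p0 * exp(-t / tau), tau = V / (A * c).
    """
    return volume_m3 / (hole_area_m2 * sound_speed_m_s)

# Hypothetical numbers: a sealed 500 m stretch of lava tube, 50 m across,
# punctured by a 1 cm^2 hole.
volume = math.pi * (25.0 ** 2) * 500.0   # ~9.8e5 cubic meters
hole = 1e-4                              # 1 cm^2 in square meters
tau = efolding_time(volume, hole)
print(f"e-folding time: ~{tau / 3600:.0f} hours (~{tau / 86400:.0f} days)")
```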
[39:16.200 --> 39:17.960] I read a good science fiction story.
[39:17.960 --> 39:21.800] I forget which one, but this was aboard a ship, but the principle could apply.
[39:21.800 --> 39:27.080] You basically have these floating balloons that are filled with a sticky substance.
[39:27.080 --> 39:28.640] I have listened to that story as well.
[39:28.640 --> 39:29.640] Yeah.
[39:29.640 --> 39:30.640] Yeah.
[39:30.640 --> 39:33.480] If there's a leak, the balloons would float to the leak.
[39:33.480 --> 39:38.040] And when they get into the crack, they break and automatically fill it with the sticky
[39:38.040 --> 39:40.040] substance that seals the crack.
[39:40.040 --> 39:44.680] So you could have these balloons floating around in the lava tube that would
[39:44.680 --> 39:50.560] just automatically, passively seal at least, you know, small cracks, if not bigger
[39:50.560 --> 39:51.560] ones.
[39:51.560 --> 39:52.560] That's pretty cool.
[39:52.560 --> 39:56.440] But you'd probably want to have some airtight habitats inside there as a backup.
[39:56.440 --> 40:01.560] I would want two layers, but absolutely, you know, maybe you're living in the habitats,
[40:01.560 --> 40:07.760] but you can have a farm that is just in the, you know, the regolith just in the lava tube
[40:07.760 --> 40:10.220] with, you know, with artificial lighting.
[40:10.220 --> 40:15.280] You could put a frigging nuclear, you know, power station down there.
[40:15.280 --> 40:16.280] Yeah, baby.
[40:16.280 --> 40:17.280] Yeah.
[40:17.280 --> 40:21.960] I think if we're going to build any permanent or long term large, you know, bases or settlements
[40:21.960 --> 40:24.200] on the moon, they're going to be in lava tubes.
[40:24.200 --> 40:25.200] I mean, it just seems...
[40:25.200 --> 40:26.200] Definitely.
[40:26.200 --> 40:27.200] It's so provocative, right?
[40:27.200 --> 40:29.760] It just, it's like the stories write themselves.
[40:29.760 --> 40:33.280] I'm just envisioning all the cool things that we could be doing in there.
[40:33.280 --> 40:37.960] You know, imagine you could go to the moon, eventually, you know, there might be a living
[40:37.960 --> 40:42.400] space, you know, where people can go to the moon and vacation there for a week.
[40:42.400 --> 40:43.600] You know what I mean?
[40:43.600 --> 40:44.600] That's incredible.
[40:44.600 --> 40:50.080] Well, I know Kara is probably asking herself right now, why would we even go to the moon?
[40:50.080 --> 40:51.720] Like, why have...why send people there?
[40:51.720 --> 40:54.320] Why not just send robots there to do whatever we want?
[40:54.320 --> 40:58.360] I mean, there are definitely, you know, you could make an argument for we should do whatever
[40:58.360 --> 41:03.840] we need to do on the moon with robots, but there are lots of things to do on the moon
[41:03.840 --> 41:10.000] like research, industry, mining.
[41:10.000 --> 41:15.840] You know, if we end up using helium-3 for our fusion reactors, then the best source of that is
[41:15.840 --> 41:19.900] the lunar surface, and it's a platform for deep space.
[41:19.900 --> 41:23.720] So if we want to get to the rest of, you know, the rest of the solar system, we're
[41:23.720 --> 41:25.120] going to go through the moon.
[41:25.120 --> 41:26.120] Yeah.
[41:26.120 --> 41:28.560] My question isn't why, it's should we?
[41:28.560 --> 41:30.120] It's not why would we, it's should we?
[41:30.120 --> 41:31.880] Well, if there are reasons to go, then...
[41:31.880 --> 41:33.960] Are those reasons good enough to go?
[41:33.960 --> 41:35.960] Yeah, which they are.
[41:35.960 --> 41:38.000] I'll agree with some of them, but not all of them.
[41:38.000 --> 41:42.540] But I think that also, you know, it's good to have humanity not on one planet in case
[41:42.540 --> 41:43.540] something happens.
[41:43.540 --> 41:49.000] Yeah, there's always that, which reminds me, of course, of an image, a still image that
[41:49.000 --> 41:53.600] is kind of unforgettable where you see an astronaut on the moon looking at the earth
[41:53.600 --> 42:01.780] and you see the earth has just been basically run through by like a mini planet.
[42:01.780 --> 42:06.080] So the earth has basically been utterly destroyed and this guy's looking at it happen, like,
[42:06.080 --> 42:07.080] whoops.
[42:07.080 --> 42:08.080] Whoa.
[42:08.080 --> 42:09.080] I wasn't there.
[42:09.080 --> 42:12.420] Well, everyone, we're going to take a quick break from our show to talk about our sponsor
[42:12.420 --> 42:13.920] this week, BetterHelp.
[42:13.920 --> 42:16.240] And let's not sugarcoat it, everyone.
[42:16.240 --> 42:18.720] It's a tough time out there and a lot of people are struggling.
[42:18.720 --> 42:25.520] And if you are struggling and you have never decided to take the plunge and talk to somebody,
[42:25.520 --> 42:27.320] maybe now is the time.
[42:27.320 --> 42:30.740] It's so important to prioritize our mental health.
[42:30.740 --> 42:35.240] If we put that first, everything else really can follow and BetterHelp can help you with
[42:35.240 --> 42:36.240] that.
[42:36.240 --> 42:38.640] You know, I myself work as a therapist and I also go to therapy.
[42:38.640 --> 42:43.440] And I can tell you that online therapy has been really, really beneficial for a lot of
[42:43.440 --> 42:46.920] folks where, you know, it fits better within your day,
[42:46.920 --> 42:50.880] or if you have limitations that make it hard to get in the car and drive somewhere.
[42:50.880 --> 42:54.560] Being able to talk to somebody online can be really a lifesaver.
[42:54.560 --> 42:57.160] And it's the model that I'm now using all the time.
[42:57.160 --> 43:02.560] Yeah, Cara, you could do it on your phone or, you know, your iPad if you want to, any
[43:02.560 --> 43:04.480] way that you connect with the video.
[43:04.480 --> 43:08.520] You can even do live chat therapy sessions so you don't have to see anyone on camera
[43:08.520 --> 43:09.520] if you don't want to.
[43:09.520 --> 43:14.440] And the other great thing is you could be matched with a therapist in under 48 hours.
[43:14.440 --> 43:20.040] Our listeners get 10% off their first month at BetterHelp.com slash SGU.
[43:20.040 --> 43:23.640] That's Better H-E-L-P dot com slash SGU.
[43:23.640 --> 43:26.400] All right, guys, let's get back to the show.
Video Games and Well-being (43:27)
[43:26.400 --> 43:33.160] Okay, Kara, tell us about video games and well-being, this endless debate that we seem
[43:33.160 --> 43:34.160] to be having.
[43:34.160 --> 43:35.160] Yeah.
[43:35.160 --> 43:38.120] So new study, really interesting, very, very large study.
[43:38.120 --> 43:40.960] So I'm just going to ask you guys point blank.
[43:40.960 --> 43:42.080] What do you think?
[43:42.080 --> 43:46.080] Does playing video games have a detrimental impact on well-being?
[43:46.080 --> 43:47.080] No.
[43:47.080 --> 43:52.320] I would say generally no, unless you abuse it, like anything else.
[43:52.320 --> 43:53.320] Yeah.
[43:53.320 --> 43:54.320] Okay.
[43:54.320 --> 43:55.320] But not especially.
[43:55.320 --> 43:56.320] What do you think?
[43:56.320 --> 43:58.760] Do video games have a positive impact on well-being?
[43:58.760 --> 43:59.760] They can.
[43:59.760 --> 44:00.760] I think so.
[44:00.760 --> 44:01.760] Yeah, that's exactly what I would say.
[44:01.760 --> 44:02.760] I would guess yes.
[44:02.760 --> 44:03.760] It can.
[44:03.760 --> 44:05.960] Probably not generically, but I think it can in some contexts.
[44:05.960 --> 44:06.960] Right.
[44:06.960 --> 44:13.160] So this study looked at probably it was more on the generic side than on the specific context
[44:13.160 --> 44:16.400] side because it was a really, really large study.
[44:16.400 --> 44:22.800] Ultimately, they looked at 38,935 players' data.
[44:22.800 --> 44:25.760] And it started way bigger than that, like hundreds of thousands.
[44:25.760 --> 44:29.080] But of course, with attrition and people not getting back and dropping out of the study,
[44:29.080 --> 44:33.840] they ended with 38,935 solid participants in this study.
[44:33.840 --> 44:36.160] So that's a big, big data set.
[44:36.160 --> 44:41.900] Their basic takeaway was there's pretty much no causal connection between gameplay and
[44:41.900 --> 44:42.900] well-being at all.
[44:42.900 --> 44:43.900] It doesn't improve.
[44:43.900 --> 44:45.640] It's not detrimental.
[44:45.640 --> 44:49.200] It has no effect at all on well-being for the most part.
[44:49.200 --> 44:51.200] Of course, we want to break that down a little bit.
[44:51.200 --> 44:52.200] Yeah, yeah, yeah.
[44:52.200 --> 44:54.880] So they looked at something else, which was kind of interesting, which we'll get to in
[44:54.880 --> 45:00.400] a second, which is the motivation for playing and how that motivation might be a sort of
[45:00.400 --> 45:02.760] underlying variable.
[45:02.760 --> 45:05.000] But first, let's talk about what they actually did.
[45:05.000 --> 45:06.740] So it was pretty cool.
[45:06.740 --> 45:13.800] These researchers were able to partner with, I think it was seven different gaming publishers,
[45:13.800 --> 45:15.360] gaming companies.
[45:15.360 --> 45:23.880] And in doing so, they were able to get direct, objective data about frequency of play because
[45:23.880 --> 45:28.960] they found that most of the, you know, obviously the reason for this study is exactly what
[45:28.960 --> 45:32.260] you asked at the beginning, Steve, like this endless debate.
[45:32.260 --> 45:37.620] And what we've seen is that there's a fair amount of public policy, like legislation,
[45:37.620 --> 45:44.240] and not just here in the US, but across the globe, that directly concerns the fear that
[45:44.240 --> 45:47.000] playing video games is detrimental to health.
[45:47.000 --> 45:48.000] But it's not evidence-based.
[45:48.000 --> 45:54.480] Like, there's, the researchers cited that in China, there is like a limit to the number
[45:54.480 --> 46:00.240] of hours people are allowed to play video games a day for fear that if somebody plays
[46:00.240 --> 46:02.540] longer than that, it can be detrimental.
[46:02.540 --> 46:06.200] And they were like, okay, if we're making like policy decisions based on this, we should
[46:06.200 --> 46:12.320] probably get to the bottom of whether or not this is even true because the data is complex.
[46:12.320 --> 46:16.600] So they were like, a lot of the data, when you look at previous studies, is subjective
[46:16.600 --> 46:17.600] in nature.
[46:17.600 --> 46:19.360] I should say it's self-report.
[46:19.360 --> 46:23.280] So not only are individuals saying, this is how I feel, but they're also saying, oh, I
[46:23.280 --> 46:27.200] kept a journal, and yeah, look, I played seven hours yesterday, or oh, I play an average
[46:27.200 --> 46:28.200] of two hours a week.
[46:28.200 --> 46:30.600] And it's like, okay, we just got to take your word for it.
[46:30.600 --> 46:36.480] So what they decided to do is figure out how to partner with these different companies.
[46:36.480 --> 46:42.320] So they partnered with Nintendo and EA and CCP Games and Microsoft and Square Enix and
[46:42.320 --> 46:43.320] Sony.
[46:43.320 --> 46:46.820] And so they looked at a handful of games.
[46:46.820 --> 46:50.240] They were Animal Crossing New Horizons.
[46:50.240 --> 46:56.480] That was Nintendo, Apex Legends, which was EA, EVE Online, which is CCP Games, Forza
[46:56.480 --> 47:01.780] Horizon 4, which is a Microsoft game, Gran Turismo Sport, which is Sony, Outriders,
[47:01.780 --> 47:07.620] which is Square Enix, and The Crew 2, which is that last one, Ubisoft.
[47:07.620 --> 47:12.700] And they had players from, I think they wanted to make sure that they were English-speaking
[47:12.700 --> 47:14.700] so that they could complete all of the surveys.
[47:14.700 --> 47:18.400] But they had players from all over the world, English-speaking world, Australia, Canada,
[47:18.400 --> 47:22.760] India, Ireland, New Zealand, South Africa, UK, US.
[47:22.760 --> 47:27.380] And they basically said, hey, if you play this game regularly, you can participate in
[47:27.380 --> 47:28.820] this research study.
[47:28.820 --> 47:35.240] They defined regularly as you've played, let's see, in the past two weeks to two months.
[47:35.240 --> 47:40.280] And then they were able to objectively record based on these players who participated the
[47:40.280 --> 47:43.400] hours that they logged on these games.
[47:43.400 --> 47:48.360] And then they were cross-referencing that, or they were actually doing their statistical
[47:48.360 --> 47:58.000] analysis, comparing those numbers to the players' different self-report surveys.
[47:58.000 --> 48:02.140] And they used multiple different self-report surveys.
[48:02.140 --> 48:03.640] So let me find them here.
[48:03.640 --> 48:08.960] So they used something called the SPANE, which is the Scale of Positive and Negative Experience.
[48:08.960 --> 48:13.520] It's a Likert scale, one to seven, where people basically just say how frequently they felt
[48:13.520 --> 48:16.180] a certain way in the past two weeks.
[48:16.180 --> 48:21.140] So like, how often did you feel this positive experience or this negative feeling?
[48:21.140 --> 48:25.460] So from very rarely to always or never to always.
[48:25.460 --> 48:30.960] They also used the Cantril Self-Anchoring Scale, and that asks participants to imagine
[48:30.960 --> 48:33.780] a ladder with steps from zero to 10.
[48:33.780 --> 48:35.880] The top of the ladder is the best possible life for you.
[48:35.880 --> 48:37.800] The bottom is the worst possible life.
[48:37.800 --> 48:41.800] Which step were you on in the last two weeks?
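[For reference, both of these measures are simple to score. A minimal sketch in Python, assuming the SPANE is scored as the sum of the positive items minus the sum of the negative items (the "affect balance") and the Cantril ladder is just the single 0-to-10 step the respondent picks; the example responses are made up.]

```python
def spane_scores(positive_items, negative_items):
    """Score the SPANE: sum the positive and negative item ratings and take
    their difference as the 'affect balance' (higher = more positive affect)."""
    spane_p = sum(positive_items)
    spane_n = sum(negative_items)
    return {"SPANE-P": spane_p, "SPANE-N": spane_n, "affect_balance": spane_p - spane_n}

def cantril_ladder(step):
    """Cantril Self-Anchoring Scale: a single 0-10 'ladder step' rating, where
    10 is the best possible life and 0 is the worst."""
    if not 0 <= step <= 10:
        raise ValueError("ladder step must be between 0 and 10")
    return step

# Hypothetical respondent: six positive-feeling items and six negative-feeling items.
print(spane_scores([6, 5, 6, 4, 5, 6], [2, 1, 3, 2, 2, 1]))  # affect_balance = 21
print(cantril_ladder(7))
```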
[48:41.800 --> 48:47.360] And then they did some very, very complicated statistical analysis where they basically
[48:47.360 --> 48:53.240] were comparing how often people were playing, like the time that they spent playing, and
[48:53.240 --> 48:57.160] also the changes in the time, like did they play more or less over the time that they
[48:57.160 --> 48:58.160] measured them?
[48:58.160 --> 49:04.400] Because I think they had three different measurement points, the sort of before, during, and after.
[49:04.400 --> 49:05.860] And they were slightly different.
[49:05.860 --> 49:10.640] This is one of the problems with doing this kind of study where they're using the publishers
[49:10.640 --> 49:15.240] to help provide the data because, of course, the collections were slightly different between
[49:15.240 --> 49:16.240] them.
[49:16.240 --> 49:20.200] But they were able to sort of normalize everything and look at these changes over time.
[49:20.200 --> 49:28.000] And that's how they were able to statistically try to develop a measure of causality.
[49:28.000 --> 49:29.000] So it's not that they were-
[49:29.000 --> 49:30.000] But no controls though?
[49:30.000 --> 49:33.200] They didn't study anyone with, well, I mean, how would they do it, like people that aren't
[49:33.200 --> 49:34.200] playing games?
[49:34.200 --> 49:37.840] I don't think they had a control group at all of non-game players.
[49:37.840 --> 49:44.040] But I don't think it would be that hard to just look at the norms data tables of responses
[49:44.040 --> 49:49.600] to the ladder and the SPANE, that is, the Cantril Self-Anchoring Scale and the SPANE.
[49:49.600 --> 49:50.880] They're all going to have norms tables.
[49:50.880 --> 49:55.200] There's going to be a bunch of published literature on pretty much every demographic you can think
[49:55.200 --> 50:01.500] of and how just people standardly answer those scales, so you can use that as an anchor.
[50:01.500 --> 50:09.880] But what they were able to do is by using the type of statistical analysis that they
[50:09.880 --> 50:13.600] utilized, they were able to sort of model causality.
[50:13.600 --> 50:17.960] And you see this quite a lot with sophisticated statistics.
[50:17.960 --> 50:22.760] So if you're not doing a randomized, double-blind, placebo-controlled trial, basically,
[50:22.760 --> 50:25.840] if you're not saying, here's time point A, here's time point B, we're going to give
[50:25.840 --> 50:29.220] half of them a placebo and half of them a drug, and then we're going to actually see
[50:29.220 --> 50:33.880] what the outcome was, it's very hard to say whether or not something is causal, you know,
[50:33.880 --> 50:37.980] when you're looking at like longitudinal data or sampling data across time points.
[50:37.980 --> 50:42.560] But there are statistical ways to try to model causality.
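[A minimal sketch of the general idea, not the authors' actual model (the study used a far more sophisticated longitudinal approach): with repeated waves of data per player, you can ask whether logged playtime at one wave predicts well-being at the next wave over and above prior well-being. The tiny data set and variable names here are hypothetical.]

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format panel: one row per player per survey wave, with
# objectively logged hours played and a self-reported well-being score.
df = pd.DataFrame({
    "player":    [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "wave":      [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "hours":     [10, 14, 9, 2, 3, 2, 25, 30, 28],
    "wellbeing": [6.5, 6.4, 6.6, 7.0, 7.1, 6.9, 5.8, 5.9, 5.7],
})

# Build lagged predictors within each player: does playtime at wave t-1
# predict well-being at wave t, controlling for well-being at wave t-1?
df = df.sort_values(["player", "wave"])
df["hours_lag"] = df.groupby("player")["hours"].shift(1)
df["wellbeing_lag"] = df.groupby("player")["wellbeing"].shift(1)
panel = df.dropna()

model = smf.ols("wellbeing ~ hours_lag + wellbeing_lag", data=panel).fit()
print(model.params)  # a near-zero coefficient on hours_lag means no detectable effect
```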
[50:42.560 --> 50:48.680] The basic outcome of looking at the amount of time was that if a person played more or
[50:48.680 --> 50:56.160] if a person played less, it neither improved nor decreased their well-being statistically.
[50:56.160 --> 51:01.320] There were some small changes, but none of them reached any sort of statistical significance.
[51:01.320 --> 51:09.520] And then, utilizing some not terribly sound but interesting guesstimations, they
[51:09.520 --> 51:13.820] were like, let's assume that there's linearity, and let's assume that some of these response
[51:13.820 --> 51:15.680] categories are equidistant.
[51:15.680 --> 51:22.560] Basically, the outcomes looked like they could say, this is not based on the data, it's based
[51:22.560 --> 51:27.360] on projecting the data into the future, that the average player would have to play like
[51:27.360 --> 51:34.280] 10 more hours per day than typical to notice a change in their well-being.
[51:34.280 --> 51:39.460] And also that even if there was a steady accumulation over time, because of course, they were only
[51:39.460 --> 51:44.720] looking at like a six-week window, but even if there was a steady accumulation over time,
[51:44.720 --> 51:50.440] players would only notice a difference after they were playing that much for 17 weeks straight.
[51:50.440 --> 51:55.020] And that's all again modeled, because they were only looking at the data that they collected.
[51:55.020 --> 51:59.200] So they were saying, based on this data, assuming things like linearity, it would take this
[51:59.200 --> 52:03.840] long to notice a difference because these differences that we saw were so small, they
[52:03.840 --> 52:06.440] didn't reach any statistical significance.
[52:06.440 --> 52:12.280] But what they did find was a wholly different question, which was an important and interesting
[52:12.280 --> 52:17.200] one, which was, why are these people playing video games to begin with?
[52:17.200 --> 52:24.560] And maybe the why gives us some indication of well-being outcomes, right?
[52:24.560 --> 52:25.560] Is it the gameplay?
[52:25.560 --> 52:30.000] No, we're not saying that playing more or playing less is having any sort of effect
[52:30.000 --> 52:35.120] on well-being, but is it the motivation for why they're playing?
[52:35.120 --> 52:38.160] And they looked at internal and external motivation.
[52:38.160 --> 52:45.560] So they used a measure called the PENS, which is the Player Experience and Need Satisfaction
[52:45.560 --> 52:50.800] Scale, which asks the different study participants to think about the past two weeks of playing
[52:50.800 --> 52:57.160] the game and answer questions on a Likert scale about a bunch of different constructs,
[52:57.160 --> 53:01.280] like your sense of autonomy, your sense of competence, your sense of how related you
[53:01.280 --> 53:06.480] felt to other people, and then two big ones, intrinsic versus extrinsic motivation.
[53:06.480 --> 53:08.120] Were you playing because you wanted to?
[53:08.120 --> 53:11.160] Were you playing because you felt pressure from the outside to play?
[53:11.160 --> 53:15.920] So interestingly, they found that when they were comparing intrinsic versus extrinsic
[53:15.920 --> 53:22.440] motivation, they looked at two different things, affect, which is like the way that their mood
[53:22.440 --> 53:25.680] was represented, and also life satisfaction.
[53:25.680 --> 53:33.840] They actually found a positive relationship between intrinsic motivation and both affect and life satisfaction, and they found a negative
[53:33.840 --> 53:39.880] relationship between extrinsic motivation and both affect and life satisfaction.
[53:39.880 --> 53:45.760] So basically, if people felt drawn to play because of external pressures, they also were
[53:45.760 --> 53:52.640] more likely to show a trend towards negative life satisfaction and poorer affect.
[53:52.640 --> 53:57.280] If people were internally motivated to play, if they played because they wanted to, you
[53:57.280 --> 54:00.060] actually saw a positive relationship there.
[54:00.060 --> 54:04.560] So basically, what the study authors say, and they list all the limitations, you know,
[54:04.560 --> 54:07.480] we've got a lot of standard limitations of this study.
[54:07.480 --> 54:09.680] Certain types of conclusions can't be drawn.
[54:09.680 --> 54:11.580] We only looked at seven different games.
[54:11.580 --> 54:14.160] Maybe different types of people play different types of games.
[54:14.160 --> 54:18.620] Maybe games that are more like fighting games or more like driving games or whatever might
[54:18.620 --> 54:20.040] have different outcomes.
[54:20.040 --> 54:24.920] But basically, they're saying, not sure these policies that say we need to limit the amount
[54:24.920 --> 54:29.400] of time people are spending playing video games because it's so detrimental to their
[54:29.400 --> 54:32.640] mental health are evidence-based.
[54:32.640 --> 54:36.800] Because our study shows that we couldn't find a relationship between the amount of time
[54:36.800 --> 54:42.320] these people were playing and their well-being, or at least their self-reported well-being.
[54:42.320 --> 54:45.960] And we know how much time they were playing because we have objective measures of it.
[54:45.960 --> 54:52.440] What if somebody or a country's government replied, well, sure, but the games, though, that are
[54:52.440 --> 54:55.920] more violent, those are the ones that we need to limit.
[54:55.920 --> 54:58.280] And that's what they talk about in their limitations.
[54:58.280 --> 55:01.960] They could only access these seven games.
[55:01.960 --> 55:07.440] They didn't do any sort of scale to say they were from less violent to more violent.
[55:07.440 --> 55:12.860] But they do, obviously, in their intro and their discussion section, cite other studies
[55:12.860 --> 55:18.820] that show, and we've talked about this before on this show, a lack of evidence supporting
[55:18.820 --> 55:21.760] that violent video games have much of an effect at all.
[55:21.760 --> 55:28.120] And the relationship is complicated because some psychologists will show studies and they'll
[55:28.120 --> 55:32.240] actually show good supporting evidence and theories based on that good supporting evidence
[55:32.240 --> 55:36.280] that certain types of violent play allow for an outlet.
[55:36.280 --> 55:41.060] And others will show that certain types of violent play exacerbate.
[55:41.060 --> 55:42.060] And so it is complicated.
[55:42.060 --> 55:48.160] Is that good evidence, good evidence to show that violent games can exacerbate?
[55:48.160 --> 55:53.800] No, I'm saying violent play across the board, not necessarily video games per se.
[55:53.800 --> 55:56.760] But a lot of that has to do with who's playing them.
[55:56.760 --> 55:58.060] And you have to remember that, too.
[55:58.060 --> 56:05.680] If a person has certain personality styles, certain maybe DSM diagnoses, a history of
[56:05.680 --> 56:11.320] violence, you know, something like that, yes, you're probably going to see exacerbation
[56:11.320 --> 56:14.360] in utilizing violent play paradigms.
[56:14.360 --> 56:21.400] But you might also see it's sort of the age old argument about, I think we talked about
[56:21.400 --> 56:26.280] this years ago on the show, do you guys remember when we talked about child pornography and
[56:26.280 --> 56:32.360] specifically modeling of child pornography, sort of digitization so they're not real
[56:32.360 --> 56:39.480] people and whether this was a healthy outlet for individuals who already feel drawn towards
[56:39.480 --> 56:45.120] engaging in this and want a healthy outlet to be able to do it legally and safely where
[56:45.120 --> 56:46.440] there are no victims?
[56:46.440 --> 56:51.000] Is that going to contribute and exacerbate their behavior and codify it and normalize
[56:51.000 --> 56:52.000] it?
[56:52.000 --> 56:54.380] Or is that something where they're going to do it anyway?
[56:54.380 --> 56:58.640] So how do we give them a safe usage that doesn't harm individuals, right?
[56:58.640 --> 57:02.480] So I think there is a more complicated conversation to be had there.
[57:02.480 --> 57:09.280] But when it comes to general use, you know, everyday people, sort of vanilla test subjects
[57:09.280 --> 57:13.440] who aren't scoring high on certain, you know, psychopathologies and don't have violent histories
[57:13.440 --> 57:17.520] and blah, blah, blah, blah, blah, yeah, a lot of the data shows that there's no correlation
[57:17.520 --> 57:21.560] at all between playing violent video games and violence in the real world.
[57:21.560 --> 57:26.880] And so you add that to this very interesting study that shows that playing a lot,
[57:26.880 --> 57:32.400] you know, the more I play, doesn't make me sadder or less connected or more angry, or
[57:32.400 --> 57:35.320] feel like my well-being is worse off.
[57:35.320 --> 57:40.320] I do think that for governments, organizations, academic and educational institutions, even
[57:40.320 --> 57:42.660] parents, it's something to think about.
[57:42.660 --> 57:46.840] It's really something to think about, you know, this is a person to person experience.
[57:46.840 --> 57:50.520] I think you know your children well, but let's not just assume that because somebody is playing
[57:50.520 --> 57:55.480] video games, they have poor well-being, because the evidence just doesn't bear that out.
[57:55.480 --> 57:57.960] Yeah, it's a moral panic kind of thing.
[57:57.960 --> 57:58.960] Totally.
[57:58.960 --> 58:00.280] It's like satanic panic of the 90s.
[58:00.280 --> 58:01.280] Absolutely.
[58:01.280 --> 58:04.640] Like these role playing games are going to turn our kids into demon worshipers.
[58:04.640 --> 58:05.640] Right.
[58:05.640 --> 58:09.080] And why isn't this going away after all these years?
[58:09.080 --> 58:10.080] I know.
[58:10.080 --> 58:12.600] People are afraid of things they don't know.
[58:12.600 --> 58:14.640] They want easy scapegoats, too.
[58:14.640 --> 58:15.640] They do.
[58:15.640 --> 58:17.800] It's a compelling media narrative also.
[58:17.800 --> 58:18.800] Yeah.
[58:18.800 --> 58:19.800] Yeah.
[58:19.800 --> 58:20.800] All right.
[58:20.800 --> 58:21.800] Interesting.
[58:21.800 --> 58:22.800] Thank you, Kara.
Invisible Dark Matter (58:23)
[58:22.800 --> 58:27.320] Bob, tell us about the latest research trying to image dark matter.
[58:27.320 --> 58:30.080] Yeah, a lot of dark matter news recently.
[58:30.080 --> 58:31.360] This one caught my attention.
[58:31.360 --> 58:37.200] Australia finished its first run of experiments for its first major dark matter detector called
[58:37.200 --> 58:42.120] the Oscillating Resonant Group Axion, also called ORGAN.
[58:42.120 --> 58:47.020] So I figured it'd be good to do a little primer to put this into context.
[58:47.020 --> 58:51.240] So when you're looking into a dark matter detector and you don't know much about it,
[58:51.240 --> 58:55.280] one of the first questions should be, well, what kind of hypothetical type of dark matter
[58:55.280 --> 58:56.880] is it made to look for?
[58:56.880 --> 58:59.080] Also, there's two broad classifications.
[58:59.080 --> 59:01.880] There's hot dark matter and cold dark matter.
[59:01.880 --> 59:06.160] You probably heard the latter one much more than the first one here.
[59:06.160 --> 59:07.800] So hot dark matter, what is it?
[59:07.800 --> 59:12.440] Hot in this context means fast, as in near light speed.
[59:12.440 --> 59:17.640] And dark, the word dark, of course, implies that it does not interact much with matter
[59:17.640 --> 59:22.120] or light, basically almost invisible in a lot of ways.
[59:22.120 --> 59:26.280] So an example of hot dark matter would be a neutrino.
[59:26.280 --> 59:31.120] It goes very, very close to the speed of light because it's nearly massless, but not totally
[59:31.120 --> 59:33.260] massless, but it goes very, very fast.
[59:33.260 --> 59:38.240] And really, with neutrinos, I remember when they found out that they had mass, they thought
[59:38.240 --> 59:41.060] for a little while that, oh, maybe this is what dark matter is.
[59:41.060 --> 59:45.640] But no, it's just not a popular candidate for dark matter these days.
[59:45.640 --> 59:48.680] Then you have the other big category, cold dark matter.
[59:48.680 --> 59:53.420] So cold in this context means slow compared to the speed of light.
[59:53.420 --> 01:00:00.880] And the two major classes of cold dark matter that I'll talk about are MACHOs and WIMPs,
[01:00:00.880 --> 01:00:05.080] which was pretty funny when they first came out because, you know, macho
[01:00:05.080 --> 01:00:07.360] and wimp, two ends of the spectrum.
[01:00:07.360 --> 01:00:14.840] So MACHO is an acronym for massive astrophysical compact halo object.
[01:00:14.840 --> 01:00:19.520] So MACHOs were actually one of the very first candidates
[01:00:19.520 --> 01:00:20.860] for dark matter.
[01:00:20.860 --> 01:00:25.480] It seemed, you know, it seemed kind of obvious, oh yeah, maybe neutron stars or brown dwarfs
[01:00:25.480 --> 01:00:31.680] or primordial black holes, they could potentially, you know, if you add them all up, add up all
[01:00:31.680 --> 01:00:35.760] the masses, maybe that's what dark matter is.
[01:00:35.760 --> 01:00:42.080] Maybe there are halos of these, many more than we would ever think,
[01:00:42.080 --> 01:00:45.800] massive halos of these around, say, all the galaxies, that are causing this extra mass
[01:00:45.800 --> 01:00:47.360] that we can't see.
[01:00:47.360 --> 01:00:53.400] But it didn't take that long before it's basically now considered to be very unlikely.
[01:00:53.400 --> 01:00:56.880] The data just doesn't show enough of these.
[01:00:56.880 --> 01:01:01.680] Like I remember reading about some studies looking for primordial black holes and gravitational
[01:01:01.680 --> 01:01:06.760] lensing that they would cause, and they're just not seeing enough of them out there to
[01:01:06.760 --> 01:01:12.120] possibly be considered a candidate, a serious candidate for dark matter.
[01:01:12.120 --> 01:01:16.780] The other big option for cold dark matter is WIMPs.
[01:01:16.780 --> 01:01:21.600] This group I think is the one that's getting the lion's share of research these days.
[01:01:21.600 --> 01:01:26.560] WIMP stands for weakly interacting massive particles.
[01:01:26.560 --> 01:01:32.040] Actually, in this context, I just found out, I always thought that they interact
[01:01:32.040 --> 01:01:33.040] weakly, right?
[01:01:33.040 --> 01:01:38.520] They just don't interact a lot, but actually it refers to the weak force when it says weakly
[01:01:38.520 --> 01:01:40.320] interacting massive particles.
[01:01:40.320 --> 01:01:43.080] Now WIMPs are assumed to be non-baryonic.
[01:01:43.080 --> 01:01:47.960] So they are not made of protons, neutrons, quarks, et cetera.
[01:01:47.960 --> 01:01:50.640] And there's some examples here you might not have heard of.
[01:01:50.640 --> 01:01:55.760] The Kaluza-Klein particle is a potential WIMP candidate for cold dark matter.
[01:01:55.760 --> 01:02:03.320] Kaluza-Klein particles are supposedly potentially curled up in a hidden fifth dimension and
[01:02:03.320 --> 01:02:05.000] we therefore cannot see them.
[01:02:05.000 --> 01:02:08.900] You look, no matter where you look, you're not going to see this particle because it's
[01:02:08.900 --> 01:02:11.720] hidden away in a super tiny fifth dimension.
[01:02:11.720 --> 01:02:17.120] But the theory states that it should be able to decay into neutrinos and photons, which
[01:02:17.120 --> 01:02:19.960] we don't see in our accelerators.
[01:02:19.960 --> 01:02:25.120] So that means perhaps that it just doesn't exist or maybe our accelerators just aren't
[01:02:25.120 --> 01:02:26.560] powerful enough to see them.
[01:02:26.560 --> 01:02:29.360] And one day we may, wouldn't that be interesting?
[01:02:29.360 --> 01:02:33.680] We have a solution to dark matter and we have a hidden fifth dimension.
[01:02:33.680 --> 01:02:36.360] That would be pretty amazingly awesome.
[01:02:36.360 --> 01:02:43.040] The other potential WIMP that is an example of cold dark matter is the gravitino, and gravitino
[01:02:43.040 --> 01:02:45.760] is a silly name and I won't discuss it anymore.
[01:02:45.760 --> 01:02:47.360] How's that?
[01:02:47.360 --> 01:02:54.040] And then we have another WIMP, the axion, and this is what Australia's ORGAN experiment
[01:02:54.040 --> 01:02:56.460] is looking for, axions.
[01:02:56.460 --> 01:02:58.100] Axions are hypothetical particles.
[01:02:58.100 --> 01:03:06.400] They were theorized decades ago initially to deal with CP violation in the strong force.
[01:03:06.400 --> 01:03:10.400] So you don't know what that is, well, look it up.
[01:03:10.400 --> 01:03:13.840] It's out of scope today, but it's worthy of a rabbit hole.
[01:03:13.840 --> 01:03:17.340] If these axions exist, they would move very slowly and they wouldn't interact much at all,
[01:03:17.340 --> 01:03:21.840] but we do know that they would likely fall within a
[01:03:21.840 --> 01:03:29.040] certain mass range, because they would need to have a minimum and maximum mass, because
[01:03:29.040 --> 01:03:32.520] if they were heavier or lighter then we would see them.
[01:03:32.520 --> 01:03:34.480] So that's pretty solid.
[01:03:34.480 --> 01:03:37.580] So if you're going to look for these, you need to look within
[01:03:37.580 --> 01:03:40.880] this mass range and that's what people have been doing.
[01:03:40.880 --> 01:03:47.440] The other major part of this theory says that axions should be able to be transformed by very,
[01:03:47.440 --> 01:03:51.520] very strong magnetic fields into photons,
[01:03:51.520 --> 01:03:56.360] and I think also neutrinos as well. Regarding ORGAN, Dr. Ben McAllister from the University
[01:03:56.360 --> 01:04:02.800] of Western Australia said it engineers the correct conditions for axion-photon conversion
[01:04:02.800 --> 01:04:07.740] and looks for weak photon signals, which are little flashes of light generated by dark
[01:04:07.740 --> 01:04:10.740] matter passing through the detector.
[01:04:10.740 --> 01:04:15.840] So the big engineering problem then here is dealing with the noise.
[01:04:15.840 --> 01:04:19.880] And I'm not talking about the machine making loud noises like Jay eating meatball after
[01:04:19.880 --> 01:04:22.040] meatball at dinner.
[01:04:22.040 --> 01:04:24.120] Oh God, that happens all the time.
[01:04:24.120 --> 01:04:29.120] Noise, noise in this context refers to the random light signals that are caused by the
[01:04:29.120 --> 01:04:34.160] high temperatures, which are in turn caused by the intense magnetic fields themselves.
[01:04:34.160 --> 01:04:40.300] So the heat creates random light that tends to swamp out any of the photons that
[01:04:40.300 --> 01:04:46.620] would come from axions being converted into photons and then detected.
[01:04:46.620 --> 01:04:51.200] So they had to deal with that and apparently they have, they have dealt with that.
[01:04:51.200 --> 01:04:52.260] So what happened?
[01:04:52.260 --> 01:04:56.640] What happened after ORGAN's recent experimental run?
[01:04:56.640 --> 01:05:02.240] They basically found no axions at all, zip, zero, zilch, and how does that expression
[01:05:02.240 --> 01:05:03.240] end?
[01:05:03.240 --> 01:05:04.240] What's the last one?
[01:05:04.240 --> 01:05:05.240] Nada.
[01:05:05.240 --> 01:05:06.240] Yes.
[01:05:06.240 --> 01:05:07.240] Very good.
[01:05:07.240 --> 01:05:10.120] Steve, you are very well conversant with zip, zero, zilch, nada.
[01:05:10.120 --> 01:05:11.120] And that sounds bad, right?
[01:05:11.120 --> 01:05:13.640] That sounds, they didn't, they looked, they didn't find it.
[01:05:13.640 --> 01:05:14.640] Holy crap.
[01:05:14.640 --> 01:05:15.640] That's not good.
[01:05:15.640 --> 01:05:17.880] That doesn't mean it's not good.
[01:05:17.880 --> 01:05:19.420] It's not necessarily bad.
[01:05:19.420 --> 01:05:21.960] That reminds me of the quote attributed to Edison.
[01:05:21.960 --> 01:05:23.040] I have not failed.
[01:05:23.040 --> 01:05:26.200] I've just found 10,000 ways that won't work.
[01:05:26.200 --> 01:05:28.520] So it's kind of similar to this axion research.
[01:05:28.520 --> 01:05:35.360] This lack of discovery is really just one step forward towards the goal of discovering
[01:05:35.360 --> 01:05:39.480] whether dark matter is made of axions or not.
[01:05:39.480 --> 01:05:41.480] We're still going in the right direction.
[01:05:41.480 --> 01:05:46.600] So to clarify that, I'll give you another quote by McAllister, he said, when we don't
[01:05:46.600 --> 01:05:52.320] see any little flashes, as was the case this time, we instead place exclusion limits where
[01:05:52.320 --> 01:05:56.860] we rule out axions that our experiment would have been sensitive to.
[01:05:56.860 --> 01:06:02.080] So then we tell the rest of the dark matter community, Hey, no dark matter here.
[01:06:02.080 --> 01:06:05.380] And we move on to search for axions of a different mass.
[01:06:05.380 --> 01:06:09.880] So basically there's a chunk of mass ranges that the axions could have, and
[01:06:09.880 --> 01:06:13.680] they basically just took away a little piece of that, like, yup, it can't be there.
[01:06:13.680 --> 01:06:14.960] Let's look over here.
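[A toy illustration of what "placing exclusion limits" amounts to in practice: bookkeeping which slices of the candidate mass range a completed run has ruled out at its sensitivity, so later searches know where to look next. The mass windows below are made up, not ORGAN's actual numbers.]

```python
# Each completed run that sees no signal excludes a window of candidate masses
# (at that run's sensitivity). Track the excluded windows and check whether a
# given candidate mass is still viable. All numbers here are hypothetical.

excluded_windows = []

def exclude(mass_low, mass_high):
    """Record that a run saw no axion signal across this mass window."""
    excluded_windows.append((mass_low, mass_high))

def still_viable(mass):
    """True if no completed run has excluded this candidate mass."""
    return not any(low <= mass <= high for low, high in excluded_windows)

exclude(15.2, 16.2)        # e.g. a narrow phase-one window
print(still_viable(15.5))  # False: ruled out by that run
print(still_viable(26.0))  # True: untested, so the search moves on
```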
[01:06:14.960 --> 01:06:18.800] So as you may have guessed, this was just ORGAN's phase one.
[01:06:18.800 --> 01:06:23.980] Future phases will be testing other unexplored mass ranges.
[01:06:23.980 --> 01:06:28.800] And that will be quite a day if they do find it,
[01:06:28.800 --> 01:06:34.760] and we find out that dark matter is composed at least partly of these axions.
[01:06:34.760 --> 01:06:39.020] So explaining the significance of their quest, McAllister had another good quote.
[01:06:39.020 --> 01:06:44.080] He said, we never would have discovered electricity or radio waves if we didn't pursue things
[01:06:44.080 --> 01:06:49.920] that at the time appeared to be strange physical phenomena beyond our understanding.
[01:06:49.920 --> 01:06:51.840] Dark matter is the same.
[01:06:51.840 --> 01:06:57.320] So yeah, you know, it was a very interesting experiment, and going through
[01:06:57.320 --> 01:07:02.120] it, you know, there wasn't a huge amount of meat necessarily in this specific news item.
[01:07:02.120 --> 01:07:05.680] It was pretty simple, you know: here's the experiment, they looked,
[01:07:05.680 --> 01:07:06.680] they didn't find it.
[01:07:06.680 --> 01:07:11.320] But I think the significance of it is the idea that they're just eating away
[01:07:11.320 --> 01:07:16.900] at one possibility, and eventually they'll have some answers one way or the
[01:07:16.900 --> 01:07:19.560] other, at least regarding axions.
[01:07:19.560 --> 01:07:24.320] And I also thought it was fun to just give a little primer on, you know, what exactly
[01:07:24.320 --> 01:07:28.480] the different types of dark matter are, whether cold or hot or whatever.
[01:07:28.480 --> 01:07:31.040] So this was kind of a fun little research thing.
[01:07:31.040 --> 01:07:32.040] So that's it.
[01:07:32.040 --> 01:07:33.040] Yeah.
[01:07:33.040 --> 01:07:36.520] It's a good reminder that negative results are results and they do push the ball forward.
[01:07:36.520 --> 01:07:44.720] It reminds me of, uh, the experiments looking to see if the earth was at rest with
[01:07:44.720 --> 01:07:49.760] respect to the ether or was it moving with respect to the ether?
[01:07:49.760 --> 01:07:53.240] And the answer was no, it's neither.
[01:07:53.240 --> 01:07:57.960] And therefore every experiment they did showed, nope, there's no ether.
[01:07:57.960 --> 01:08:01.880] But if there's no ether, then what is light propagating through?
[01:08:01.880 --> 01:08:05.200] What is light moving at speed c with respect to?
[01:08:05.200 --> 01:08:09.920] And that eventually led to the answer,
[01:08:09.920 --> 01:08:15.480] the theory of relativity. Yeah, so that was a negative result that transformed
[01:08:15.480 --> 01:08:16.480] physics.
[01:08:16.480 --> 01:08:20.600] Um, so it's good to remember that, and it's good also that they didn't just double and
[01:08:20.600 --> 01:08:25.720] triple and quadruple down, because we've seen that, of course, in our community.
[01:08:25.720 --> 01:08:30.520] Remember that flat earth show where they
[01:08:30.520 --> 01:08:35.880] did a fairly definitive test with, you know, with a laser beam
[01:08:35.880 --> 01:08:39.440] that, you know, went above the hole that they
[01:08:39.440 --> 01:08:42.920] created, because the curvature meant that it went above.
[01:08:42.920 --> 01:08:44.280] And they're like, wait a second.
[01:08:44.280 --> 01:08:50.640] And for a second, you know, you could see them just flirting with the idea, well, wait,
[01:08:50.640 --> 01:08:55.160] the best explanation here is that the earth is a sphere and it's not flat.
[01:08:55.160 --> 01:08:57.560] And you see them kind of nibble at it for a second, right?
[01:08:57.560 --> 01:08:59.480] And then they're like, ah, we did something wrong.
[01:08:59.480 --> 01:09:04.240] And then they, then they totally double and triple down and said, no, the earth is flat
[01:09:04.240 --> 01:09:07.040] and something is wrong with the experimental setup.
[01:09:07.040 --> 01:09:09.960] It's like, oh man, she's like, come on guys.
[01:09:09.960 --> 01:09:11.240] There was so much of that in that movie.
[01:09:11.240 --> 01:09:12.240] It was so brilliant.
[01:09:12.240 --> 01:09:17.040] I mean, my favorite was, you know, the woman who was, you know, a conspiracy theorist,
[01:09:17.040 --> 01:09:21.840] but then someone who was even more nutty than her had a conspiracy theory about
[01:09:21.840 --> 01:09:28.960] her, and she was saying, well, this guy is just making stuff up and doing A and B.
[01:09:28.960 --> 01:09:34.800] And then she like looks off and goes, is it possible that that's what I'm doing?
[01:09:34.800 --> 01:09:37.640] Like she had this moment of insight, like, wait a minute.
[01:09:37.640 --> 01:09:38.640] Oh yeah.
[01:09:38.640 --> 01:09:43.680] And then she had a total Theodoric of York moment where she was like, nah, you know,
[01:09:43.680 --> 01:09:46.360] like she, oh my God. Right, Steve Martin.
[01:09:46.360 --> 01:09:47.600] What a great scene.
[01:09:47.600 --> 01:09:48.600] Yeah.
[01:09:48.600 --> 01:09:49.600] Yeah.
[01:09:49.600 --> 01:09:50.600] Yeah.
[01:09:50.600 --> 01:09:53.040] I just, you know, yeah, there was a lot of that or, oh yeah, we should do this experiment
[01:09:53.040 --> 01:09:59.120] to spend $20,000 on a gyroscope that will show that the earth is not rotating.
[01:09:59.120 --> 01:10:00.120] Oops.
[01:10:00.120 --> 01:10:01.120] It is rotating.
[01:10:01.120 --> 01:10:02.120] Okay.
[01:10:02.120 --> 01:10:03.120] Hmm.
[01:10:03.120 --> 01:10:07.320] Motivated reasoning kicks in and then eventually they figured out that it's because the sky
[01:10:07.320 --> 01:10:09.920] is moving around the earth.
[01:10:09.920 --> 01:10:12.920] That's what's making the gyroscope move, it's dragging it along with it.
[01:10:12.920 --> 01:10:15.460] So we just had to invent some new physics there.
[01:10:15.460 --> 01:10:21.360] But yeah, just that movie is a masterwork of just documenting the process of conspiracy
[01:10:21.360 --> 01:10:25.280] thinking and motivated reasoning and how people can get stuck in it.
[01:10:25.280 --> 01:10:26.280] It's just wonderful.
[01:10:26.280 --> 01:10:27.280] All right.
[01:10:27.280 --> 01:10:28.280] So frustrating.
[01:10:28.280 --> 01:10:29.280] Okay.
[01:10:29.280 --> 01:10:30.280] Let's move on.
Questions/Emails/Corrections/Follow-ups (1:10:30)
[01:10:30.280 --> 01:10:33.440] Oh, we're going to, we're going to skip ahead to a few emails.
[01:10:33.440 --> 01:10:35.040] We had some fun emails.
[01:10:35.040 --> 01:10:38.840] These are just like kind of quick science questions or science feedback.
[01:10:38.840 --> 01:10:40.600] So I was going to go through a few.
Question #1: Lord Kelvin (1:10:40)
[01:10:40.600 --> 01:10:44.280] First one comes from David Allen from Stuttgart, Germany.
[01:10:44.280 --> 01:10:48.980] And David writes, I've just recently heard about your podcast and have downloaded many
[01:10:48.980 --> 01:10:53.820] of the back numbers, which explains why my feedback has to do with an old podcast.
[01:10:53.820 --> 01:10:59.640] In podcast 851, that's not that old, chirality was a topic of discussion, and that this term was
[01:10:59.640 --> 01:11:03.720] coined by Lord Kelvin, AKA William Thomson.
[01:11:03.720 --> 01:11:09.520] One of the broadcasters used a posh English accent in connection with this, probably assuming
[01:11:09.520 --> 01:11:12.800] this would be the accent Lord Kelvin would have spoken with.
[01:11:12.800 --> 01:11:17.640] Actually, having been born in Belfast and brought up from an early age in Glasgow, he
[01:11:17.640 --> 01:11:21.400] had a pronounced Scottish accent.
[01:11:21.400 --> 01:11:24.600] Furthermore, he was not a hereditary peer.
[01:11:24.600 --> 01:11:29.040] I have seen something similar in one of the Around the World in 80 Days adaptations, where he was also
[01:11:29.040 --> 01:11:32.420] given an upper class English accent and manners.
[01:11:32.420 --> 01:11:36.640] Imagine if Benjamin Franklin was similarly misrepresented in a film or a documentary.
[01:11:36.640 --> 01:11:38.300] Otherwise I thoroughly enjoy your podcast.
[01:11:38.300 --> 01:11:39.300] Keep up the good work.
[01:11:39.300 --> 01:11:40.300] All right, thanks David.
[01:11:40.300 --> 01:11:43.500] Wait, doesn't Benjamin Franklin always have a posh upper class English accent in every
[01:11:43.500 --> 01:11:45.000] film about him too?
[01:11:45.000 --> 01:11:46.000] Franklin?
[01:11:46.000 --> 01:11:47.000] No.
[01:11:47.000 --> 01:11:48.000] It's colonial.
[01:11:48.000 --> 01:11:49.720] Everybody talked like this in colonial America.
[01:11:49.720 --> 01:11:52.520] Yeah, but it wasn't a British accent, it was a colonial accent.
[01:11:52.520 --> 01:11:56.360] Well fine, whatever, that comes from the British accent.
[01:11:56.360 --> 01:11:58.080] But yeah, so I didn't know that.
[01:11:58.080 --> 01:12:01.040] So Lord Kelvin was born William Thomson.
[01:12:01.040 --> 01:12:07.320] He's considered Scots-Irish, and he was knighted and became Sir William Thomson because of
[01:12:07.320 --> 01:12:13.900] his scientific contributions, and then later was elevated to the peerage and became a baron, Lord Kelvin.
[01:12:13.900 --> 01:12:19.960] But yeah, it wasn't a hereditary title for him, I guess, because he was granted it.
[01:12:19.960 --> 01:12:22.640] But he would have had a Scottish accent.
[01:12:22.640 --> 01:12:25.960] Okay, but to be fair, also Scottish accents are really hard to do.
[01:12:25.960 --> 01:12:30.840] Yeah, I think we do the posh British accent because that's the one we could do, not because
[01:12:30.840 --> 01:12:31.840] it's accurate.
[01:12:31.840 --> 01:12:33.400] Yeah, but because it's intentionally a caricature.
[01:12:33.400 --> 01:12:34.400] Yeah.
[01:12:34.400 --> 01:12:35.400] Well not just that.
[01:12:35.400 --> 01:12:39.680] My take on that was, and I think it was probably Jay who did that accent, my take
[01:12:39.680 --> 01:12:47.720] is this: to me, in my mind, Lord Kelvin,
[01:12:47.720 --> 01:12:54.640] just by the fact that it has Lord in it, equates to upper class British, which is what...
[01:12:54.640 --> 01:12:55.640] That's his point.
[01:12:55.640 --> 01:12:56.640] We assume that.
[01:12:56.640 --> 01:12:57.640] But he was...
[01:12:57.640 --> 01:12:58.640] Yeah, that's the thing.
[01:12:58.640 --> 01:13:00.320] It doesn't mean it's English.
[01:13:00.320 --> 01:13:02.040] Just because it's British doesn't mean it's English.
[01:13:02.040 --> 01:13:03.040] Right, I know.
[01:13:03.040 --> 01:13:10.520] But to me, Lord, I think it just connects automatically with the stereotypical accent
[01:13:10.520 --> 01:13:11.520] that was used.
[01:13:11.520 --> 01:13:12.520] I don't know who did it or whatever.
[01:13:12.520 --> 01:13:13.520] Well, I can tell you why.
[01:13:13.520 --> 01:13:14.520] Because of BBC America, that's why.
[01:13:14.520 --> 01:13:15.520] Right, right.
[01:13:15.520 --> 01:13:24.560] And not just BBC America, it's literally every film ever made in the US about another culture
[01:13:24.560 --> 01:13:26.060] uses a British accent.
[01:13:26.060 --> 01:13:30.720] Doesn't matter if they're German, doesn't matter if they're Polish, well maybe Polish,
[01:13:30.720 --> 01:13:32.080] they'll try and pull something off.
[01:13:32.080 --> 01:13:38.160] But definitely any kind of European, well, most European countries, they just use British
[01:13:38.160 --> 01:13:39.160] accents.
[01:13:39.160 --> 01:13:40.160] It's ridiculous.
[01:13:40.160 --> 01:13:41.160] Mm-hmm.
[01:13:41.160 --> 01:13:42.160] Why do they do that?
[01:13:42.160 --> 01:13:44.040] Because we just, we like it.
[01:13:44.040 --> 01:13:46.080] We like that damn accent, you know?
[01:13:46.080 --> 01:13:47.080] Because we think it makes people sound smart.
[01:13:47.080 --> 01:13:48.080] It's so...
[01:13:48.080 --> 01:13:50.160] Yes, it's such a pleasing sound.
[01:13:50.160 --> 01:13:56.480] They choose accents based upon the character, not what makes sense historically.
[01:13:56.480 --> 01:13:58.200] Disney's the worst at this, you know?
[01:13:58.200 --> 01:13:59.200] Oh.
[01:13:59.200 --> 01:14:02.280] Yeah, a British accent, it means you're smart.
[01:14:02.280 --> 01:14:06.000] A Scottish accent means that you are a barbarian or a rebel.
[01:14:06.000 --> 01:14:07.000] Yeah, that you're rough.
[01:14:07.000 --> 01:14:08.000] Yeah.
[01:14:08.000 --> 01:14:09.000] Yeah.
[01:14:09.000 --> 01:14:14.160] So I remember like in How to Train Your Dragon, the Vikings had Scottish accents.
[01:14:14.160 --> 01:14:17.200] Why did they give Vikings Scottish accents?
[01:14:17.200 --> 01:14:19.560] Well, because they were tough barbarians.
[01:14:19.560 --> 01:14:25.920] And that is now the media trope of that's, I guess, the accent you have if you're a barbarian,
[01:14:25.920 --> 01:14:26.920] you have a Scottish accent.
[01:14:26.920 --> 01:14:28.720] It doesn't matter that you're a Viking.
[01:14:28.720 --> 01:14:30.640] Well, we get similar feedback.
[01:14:30.640 --> 01:14:34.800] I mean, I remember we got an email from somebody and I mean, good on them.
[01:14:34.800 --> 01:14:37.840] And like, I agree, but it's a hard habit to break.
[01:14:37.840 --> 01:14:43.280] And I'm from there. When we do a Southern accent, what does that mean?
[01:14:43.280 --> 01:14:45.400] And this is, I mean, this is across the board.
[01:14:45.400 --> 01:14:49.760] You see this in television, you see it in comedy, you see it in other countries and
[01:14:49.760 --> 01:14:50.760] here.
[01:14:50.760 --> 01:14:55.920] And of course, there are plenty of brilliant people with Southern American accents.
[01:14:55.920 --> 01:15:00.400] But I think there's this stereotype that the more colloquial an accent becomes, like the
[01:15:00.400 --> 01:15:07.160] more specific and entrenched it becomes, the more kind of regional it becomes, the less
[01:15:07.160 --> 01:15:09.960] metropolitan the person is.
[01:15:09.960 --> 01:15:12.120] And that's the stereotype.
[01:15:12.120 --> 01:15:13.720] And so you can take it anywhere.
[01:15:13.720 --> 01:15:17.920] Very deeply New York accents, we think of having all of the stereotypes of very deeply
[01:15:17.920 --> 01:15:23.200] New York people, very deeply Texas accents, you know, we think of George W. Bush.
[01:15:23.200 --> 01:15:25.160] It's just, it's the stereotype.
[01:15:25.160 --> 01:15:26.160] Totally.
[01:15:26.160 --> 01:15:29.880] I remember I had a professor in medical school who spoke with like a Brooklyn accent and
[01:15:29.880 --> 01:15:35.760] it was like, it totally was jarring, you know, because this guy's a medical professor and
[01:15:35.760 --> 01:15:39.320] he was speaking with an accent that you don't normally associate with a scholar, you know,
[01:15:39.320 --> 01:15:41.280] and an academic.
[01:15:41.280 --> 01:15:47.160] But of course, why wouldn't people who were born in New York be medical doctors and teach
[01:15:47.160 --> 01:15:48.160] at a university?
[01:15:48.160 --> 01:15:49.160] Yeah.
[01:15:49.160 --> 01:15:50.160] Yeah, but we get totally trained.
[01:15:50.160 --> 01:15:51.160] We're totally trained.
[01:15:51.160 --> 01:15:56.600] I remember, you know, I think I mentioned this when I went to Vienna and just hearing
[01:15:56.600 --> 01:16:00.560] a bunch of everyday people speaking German and it totally realized like, oh my God, up
[01:16:00.560 --> 01:16:05.000] to this point in my life, every German accent I've ever heard was coming from a Nazi.
[01:16:05.000 --> 01:16:10.360] And you're like so programmed, you know, you had to like get deep, deep, deep, like, no,
[01:16:10.360 --> 01:16:14.000] this is just a normal people accent, you know, this is just the way people speak here.
[01:16:14.000 --> 01:16:15.000] It's not.
[01:16:15.000 --> 01:16:17.720] But that's, we are absolutely programmed by media.
[01:16:17.720 --> 01:16:18.800] All right.
Question #2: Green Methane (1:16:18)
[01:16:18.800 --> 01:16:19.800] Question number two.
[01:16:19.800 --> 01:16:23.920] This one comes from Chris in Florida, and he says, I recently found this article which
[01:16:23.920 --> 01:16:29.180] claims that Musk's new rocket engines based on methane can be carbon neutral.
[01:16:29.180 --> 01:16:30.180] Fact or fiction?
[01:16:30.180 --> 01:16:35.520] So I'm just going to focus on the question: can methane be carbon neutral?
[01:16:35.520 --> 01:16:39.780] And the answer to that is, well, yes, it can be.
[01:16:39.780 --> 01:16:40.780] You can make methane.
[01:16:40.780 --> 01:16:47.840] Well, you could create methane as, like, a biofuel if you're using, as a source
[01:16:47.840 --> 01:16:52.840] of it, things that are carbon neutral, like if you're using plant, you know,
[01:16:52.840 --> 01:16:54.360] plant matter.
[01:16:54.360 --> 01:17:00.440] Or if you're carbon capturing. And the energy you're using to make
[01:17:00.440 --> 01:17:03.680] the methane, because methane is a high-energy molecule, right?
[01:17:03.680 --> 01:17:08.320] So if you're, quote unquote, making it, you're going from lower-energy molecules
[01:17:08.320 --> 01:17:14.520] like carbon dioxide or water, and you're going to this methane, which is a high-energy
[01:17:14.520 --> 01:17:17.220] molecule, so your energy is coming from somewhere.
[01:17:17.220 --> 01:17:18.480] So where's that energy coming from?
[01:17:18.480 --> 01:17:23.460] So if you're powering the process with solar panels, yeah, you could theoretically
[01:17:23.460 --> 01:17:25.560] have carbon neutral methane.
[01:17:25.560 --> 01:17:26.560] Absolutely.
[01:17:26.560 --> 01:17:28.940] But it just all depends on how you're making it.
[01:17:28.940 --> 01:17:33.880] If you're sourcing it from fossil fuel and releasing previously sequestered carbon into
[01:17:33.880 --> 01:17:35.920] the atmosphere, then no, not at all.
[01:17:35.920 --> 01:17:39.960] Or if you're burning coal to power the process to make the methane, then no.
[01:17:39.960 --> 01:17:43.160] But if you're powering it with solar or wind or whatever, then, then it could be.
[01:17:43.160 --> 01:17:44.160] Sure.
[01:17:44.160 --> 01:17:45.160] Absolutely.
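As a rough aside on the chemistry behind that answer, here is a minimal sketch of the power-to-gas route; the specific reactions and round numbers are illustrative and not from the episode. Making methane from captured carbon dioxide and electrolytic hydrogen and then burning it closes the carbon loop, but only if the input energy is low-carbon.

\[ \mathrm{CO_2 + 4\,H_2 \rightarrow CH_4 + 2\,H_2O} \qquad \Delta H \approx -165\ \text{kJ/mol (Sabatier methanation)} \]
\[ \mathrm{CH_4 + 2\,O_2 \rightarrow CO_2 + 2\,H_2O} \qquad \Delta H \approx -890\ \text{kJ/mol (combustion in the engine)} \]

The hydrogen has to come from somewhere, for example electrolysis of water at roughly +286 kJ per mole of H2, so about +1,140 kJ for the four moles needed, which is more than the roughly 890 kJ you get back from burning the methane. The CO2 emitted on burning is the CO2 you captured to make the fuel, so the cycle is carbon-neutral only to the extent that this input energy comes from solar, wind, or other low-carbon sources.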
Question #3: Universe Isotropy (1:17:45)
[01:17:45.160 --> 01:17:46.160] All right.
[01:17:46.160 --> 01:17:47.320] So we have a question from Pedro.
[01:17:47.320 --> 01:17:50.040] He gave his location as capital P small t.
[01:17:50.040 --> 01:17:54.600] I have no idea what that refers to and I couldn't find out because PT is like too generic to
[01:17:54.600 --> 01:17:55.600] search on.
[01:17:55.600 --> 01:17:56.880] You guys know what that is?
[01:17:56.880 --> 01:18:00.200] I don't know because I don't have any, any narrowing context.
[01:18:00.200 --> 01:18:01.200] Is that a state?
[01:18:01.200 --> 01:18:02.200] Is that a city?
[01:18:02.200 --> 01:18:03.200] Is that a country?
[01:18:03.200 --> 01:18:04.200] Is it Portugal?
[01:18:04.200 --> 01:18:05.200] Is it?
[01:18:05.200 --> 01:18:06.200] All the information.
[01:18:06.200 --> 01:18:07.400] Yeah, I don't know.
[01:18:07.400 --> 01:18:11.240] And I voted for Pedro and he instigated that on me.
[01:18:11.240 --> 01:18:12.240] All right.
[01:18:12.240 --> 01:18:16.360] So he asked, if we analyze the universe in all directions from our point, no matter of
[01:18:16.360 --> 01:18:22.180] Earth, solar system or galaxy, do we see any direction where universe is older than others?
[01:18:22.180 --> 01:18:28.160] Because maybe this way we could position ourselves in a kind of universe map in case it is limited
[01:18:28.160 --> 01:18:33.000] or until the visible universe and actually check if universe is bounded or not, speaking
[01:18:33.000 --> 01:18:34.720] in a little bit of broken English there.
[01:18:34.720 --> 01:18:35.720] It's entropy, baby.
[01:18:35.720 --> 01:18:36.720] Right.
[01:18:36.720 --> 01:18:37.720] So that's exactly right.
[01:18:37.720 --> 01:18:40.120] So and I emailed them back to let them know that.
[01:18:40.120 --> 01:18:44.120] So the answer to Pedro's question is no.
[01:18:44.120 --> 01:18:52.720] So if you look in any direction, obviously on a large scale, the universe looks the same.
[01:18:52.720 --> 01:18:54.200] That's a property called isotropy.
[01:18:54.200 --> 01:18:57.680] Yeah, I was going to say I pronounce it isotropy.
[01:18:57.680 --> 01:18:58.680] Yeah.
[01:18:58.680 --> 01:18:59.680] I like isotropy.
[01:18:59.680 --> 01:19:00.680] That's fun.
[01:19:00.680 --> 01:19:01.680] Isotropy.
[01:19:01.680 --> 01:19:02.680] Yeah, I think.
[01:19:02.680 --> 01:19:03.680] Yeah, that's how Bob pronounced it.
[01:19:03.680 --> 01:19:05.200] Remember, I did a what's the word on this.
[01:19:05.200 --> 01:19:08.840] We talked about kind of standing up on top of a hill and looking in all directions.
[01:19:08.840 --> 01:19:11.860] And not having that would be anisotropy.
[01:19:11.860 --> 01:19:15.760] And now there's also another property of the universe where no matter where you
[01:19:15.760 --> 01:19:20.760] are in the universe, it looks the same again, at a large enough scale.
[01:19:20.760 --> 01:19:21.760] Yeah.
[01:19:21.760 --> 01:19:22.760] And what's that called?
[01:19:22.760 --> 01:19:26.080] The universe is homogeneous, homogeneous, homogeneous.
[01:19:26.080 --> 01:19:29.240] So the universe is isotropic, like a special term.
[01:19:29.240 --> 01:19:33.080] Yeah, the universe is isotropic and homogeneous at large enough scale.
[01:19:33.080 --> 01:19:36.280] But of course, the question is at what scale does that happen?
[01:19:36.280 --> 01:19:39.920] We talked about this previously on the show because there was a news item about that.
[01:19:39.920 --> 01:19:46.320] So one good way to remember it, though, for both of these, isotropy or homogeneity,
[01:19:46.320 --> 01:19:50.040] is that they both have to do with uniformity.
[01:19:50.040 --> 01:19:52.440] Homogeneity is uniformity of position.
[01:19:52.440 --> 01:19:58.260] And isotropy is uniformity in respect to angles, like viewing angle.
[01:19:58.260 --> 01:20:03.840] So that might be an easier way to remember it, because I kind of confuse them sometimes.
[01:20:03.840 --> 01:20:06.840] And that might be one way to make it pithy in your head.
[01:20:06.840 --> 01:20:07.840] Yeah.
[01:20:07.840 --> 01:20:12.880] And this is a very important concept cosmologically, because basically, there's no privileged
[01:20:12.880 --> 01:20:15.120] location in the universe.
[01:20:15.120 --> 01:20:20.080] Every point in the universe is pretty much equal to every other point, you know, in terms
[01:20:20.080 --> 01:20:22.760] of its relationship to the universe as a whole.
[01:20:22.760 --> 01:20:25.680] There is no center, there is no edge, right?
[01:20:25.680 --> 01:20:29.320] There is no middle or whatever, just all homogenous, right?
[01:20:29.320 --> 01:20:30.600] Just all the same.
[01:20:30.600 --> 01:20:37.280] In fact, the universe is so isotropic and homogenous that physicists have a hard time
[01:20:37.280 --> 01:20:40.320] explaining why there's any clumps of anything.
[01:20:40.320 --> 01:20:42.680] Like why did galaxies form?
[01:20:42.680 --> 01:20:44.680] Cosmic microwave background radiation, bro.
[01:20:44.680 --> 01:20:45.680] Yeah, man.
[01:20:45.680 --> 01:20:48.000] Why isn't it 100% uniform?
[01:20:48.000 --> 01:20:49.000] It had to be.
[01:20:49.000 --> 01:20:50.240] Why is there patternicity at all?
[01:20:50.240 --> 01:20:54.400] Yeah, it had to be inhomogeneous at some scale at some point.
[01:20:54.400 --> 01:20:55.400] Quantum fluctuations.
[01:20:55.400 --> 01:21:03.560] Yeah, once any clumps, even slight perturbations in the homogeneity form, then gravity will
[01:21:03.560 --> 01:21:08.080] take over and form those into clumps, stars and galaxies and whatnot.
[01:21:08.080 --> 01:21:09.720] But what started it all off?
[01:21:09.720 --> 01:21:15.200] Why aren't we just a uniform haze of hydrogen, you know, mostly hydrogen, a little helium
[01:21:15.200 --> 01:21:22.200] and a tad of lithium, whatever, I mean, it's still an open question.
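To make the distinction above a little more concrete, here is a rough sketch in my own notation, not something stated on the show: think of the large-scale averaged matter density as a field ρ(x). Then

\[ \text{homogeneity:}\quad \langle \rho(\mathbf{x} + \mathbf{a}) \rangle = \langle \rho(\mathbf{x}) \rangle \quad \text{for every translation } \mathbf{a} \]
\[ \text{isotropy about } \mathbf{x}_0:\quad \langle \rho(\mathbf{x}_0 + R\,\mathbf{r}) \rangle = \langle \rho(\mathbf{x}_0 + \mathbf{r}) \rangle \quad \text{for every rotation } R \]

Isotropy around every observer implies homogeneity, which is why "it looks the same in all directions from anywhere" is the usual statement of the cosmological principle. And the departures from uniformity that seeded galaxies really are tiny: the cosmic microwave background temperature varies by only about one part in 100,000 (ΔT/T ≈ 10⁻⁵) across the sky.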
_text_from_show_about_isotropy_WTW_
Who's That Noisy? (1:21:24)
New Noisy (1:23:38)
[high-pitched, scratchy calls/music]
[01:21:22.200 --> 01:21:24.960] All right, Jay, who's that noisy time?
[01:21:24.960 --> 01:21:25.960] All right.
[01:21:25.960 --> 01:21:26.960] Last week, I played this noisy.
[01:21:26.960 --> 01:21:48.520] All right, so we got two people talking.
[01:21:48.520 --> 01:21:50.520] Do you guys have any guesses?
[01:21:50.520 --> 01:21:51.520] Dog.
[01:21:51.520 --> 01:21:52.520] A dog?
[01:21:52.520 --> 01:21:55.600] It's a dog.
[01:21:55.600 --> 01:21:57.560] It's a marine mammal, isn't it?
[01:21:57.560 --> 01:21:58.560] No.
[01:21:58.560 --> 01:22:03.320] Well, we have a listener named Michael Praxty and he wrote in, hey, long time listener and
[01:22:03.320 --> 01:22:04.320] huge fan of the show.
[01:22:04.320 --> 01:22:09.360] We just had our second kid yesterday and after all the nurses left and we were hanging out
[01:22:09.360 --> 01:22:13.160] in postpartum, we turned on the latest skeptics guide.
[01:22:13.160 --> 01:22:16.640] So you were also the first voices that Kai heard outside of the room.
[01:22:16.640 --> 01:22:17.640] That's pretty cool.
[01:22:17.640 --> 01:22:21.220] Anyway, I think this audio is a reconstruction of a conversation happening at the other end
[01:22:21.220 --> 01:22:27.980] of a fiber optic cable using a technique similar to this thing that he sent to me.
[01:22:27.980 --> 01:22:31.600] That is not correct, although I think that's really cool, like an early fiber optic message
[01:22:31.600 --> 01:22:34.040] that they were encoding and decoding.
[01:22:34.040 --> 01:22:35.640] Not correct, but that was a cool guess.
[01:22:35.640 --> 01:22:38.640] I'm going to click right into the winner.
[01:22:38.640 --> 01:22:39.980] I have two people here.
[01:22:39.980 --> 01:22:44.440] So I have a guy said his name is pronounced like Chubby and he's from Romania.
[01:22:44.440 --> 01:22:45.840] So I'm just going to say Chubby.
[01:22:45.840 --> 01:22:46.840] Hello, Jay.
[01:22:46.840 --> 01:22:51.320] So the voices belong to Neil Armstrong and Buzz Aldrin during Apollo 11.
[01:22:51.320 --> 01:22:58.080] Specifically, they were speaking at the one hour 25 minute mark and then they were discussing
[01:22:58.080 --> 01:23:00.940] the camera that they were using to take pictures on the moon.
[01:23:00.940 --> 01:23:03.080] Very freaking cool.
[01:23:03.080 --> 01:23:07.480] Another random listener who did not send it in first, but they did win because they got
[01:23:07.480 --> 01:23:10.080] it correct and you guys know who this person is.
[01:23:10.080 --> 01:23:16.280] Joe Anderson wrote in and said, is Neil talking with Buzz on the goddamn moon?
[01:23:16.280 --> 01:23:18.880] That's right, Joe.
[01:23:18.880 --> 01:23:21.920] They were talking about AOS and f-stops and all that.
[01:23:21.920 --> 01:23:27.040] So we have those two people got it correct this week and I have a new noisy for you guys.
[01:23:27.040 --> 01:23:29.440] Wait, do you know what camera they used?
[01:23:29.440 --> 01:23:30.440] Yes.
[01:23:30.440 --> 01:23:31.440] It was a Hasselblad, right Bob?
[01:23:31.440 --> 01:23:32.440] Hasselblad.
[01:23:32.440 --> 01:23:33.440] Yeah.
[01:23:33.440 --> 01:23:34.440] Is that the name?
[01:23:34.440 --> 01:23:35.440] How do you pronounce it?
[01:23:35.440 --> 01:23:36.440] Close enough.
[01:23:36.440 --> 01:23:37.440] Yeah.
[01:23:37.440 --> 01:23:38.440] All right.
[01:23:38.440 --> 01:23:39.440] Here's the new noisy.
[01:23:39.440 --> 01:23:40.440] This noisy was sent in by a listener named Quinn English.
[01:23:40.440 --> 01:24:03.200] This one I dedicate to Bob and you will know why in a second.
[01:24:03.200 --> 01:24:04.440] Very weird.
[01:24:04.440 --> 01:24:05.440] Very Halloween sounding.
[01:24:05.440 --> 01:24:06.660] Yeah, yeah, yeah.
[01:24:06.660 --> 01:24:12.680] If you think you know what this week's noisy is, you can email me at WTN at theskepticsguide.org.
[01:24:12.680 --> 01:24:16.480] Please don't forget if you heard anything cool, email me at the same address with whatever
[01:24:16.480 --> 01:24:17.480] you heard.
[01:24:17.480 --> 01:24:18.480] Steve.
[01:24:18.480 --> 01:24:19.480] Yeah.
[01:24:19.480 --> 01:24:22.560] Our patrons are what keeps our podcast going.
[01:24:22.560 --> 01:24:24.000] Do you realize this?
[01:24:24.000 --> 01:24:25.000] Absolutely.
[01:24:25.000 --> 01:24:26.000] I look at the number every day.
[01:24:26.000 --> 01:24:32.780] So if you want to support the SGU, if you enjoy this show, if we taught you something
[01:24:32.780 --> 01:24:39.160] and you want to show us some appreciation, please go to patreon.com forward slash skeptics
[01:24:39.160 --> 01:24:40.160] guide.
[01:24:40.160 --> 01:24:41.160] Become a patron.
[01:24:41.160 --> 01:24:45.520] You can join us in curing this planet of misinformation.
[01:24:45.520 --> 01:24:46.520] Join us.
[01:24:46.520 --> 01:24:47.520] All right.
[01:24:47.520 --> 01:24:48.520] Thanks, Jay.
Science or Fiction (1:24:53)
Theme: Materials Science
Item #1: Chemists have developed a method for essentially printing complex designer molecules by using specific frequencies of light.[6]
Item #2: Scientists have produced a method for combining single-walled carbon nanotubes into highly ordered structures, such as a regular helix, with minimal errors by using DNA as a lattice.[7]
Item #3: Researchers have produced a biocompatible fiber optic sensor out of spider silk.[8]
Answer | Item |
---|---|
Fiction | Complex designer molecules |
Science | DNA as a lattice |
Science | Spider silk sensor |
Host | Result |
---|---|
Steve | win |
Rogue | Guess |
---|---|
Jay | Spider silk sensor |
Bob | Complex designer molecules |
Cara | Complex designer molecules |
Voice-over: It's time for Science or Fiction.
Jay's Response
Bob's Response
Cara's Response
Steve Explains Item #2
Steve Explains Item #1
Steve Explains Item #3
[01:24:48.520 --> 01:24:54.600] Guys, let's go on with science or fiction.
[01:24:54.600 --> 01:25:04.000] It's time for science or fiction.
[01:25:04.000 --> 01:25:08.320] Each week, I come up with three science news items or facts, two real and then one fake.
[01:25:08.320 --> 01:25:13.840] And I challenge my panel of skeptics to tell me which one is the fake. There is a theme this week.
[01:25:13.840 --> 01:25:16.880] Although these are all news items, they just happen to cluster in a theme.
[01:25:16.880 --> 01:25:19.080] The theme is material science.
[01:25:19.080 --> 01:25:20.880] That is a frequent theme.
[01:25:20.880 --> 01:25:21.880] Oh, no.
[01:25:21.880 --> 01:25:26.000] When it comes to material science news and, you know, you see a bunch in a row,
[01:25:26.000 --> 01:25:27.000] I use it.
[01:25:27.000 --> 01:25:28.000] All right.
[01:25:28.000 --> 01:25:29.000] Here we go.
[01:25:29.000 --> 01:25:30.600] Three news items about material science.
[01:25:30.600 --> 01:25:35.440] Item number one, chemists have developed a method for essentially printing complex designer
[01:25:35.440 --> 01:25:39.240] molecules using specific frequencies of light.
[01:25:39.240 --> 01:25:44.960] Number two, scientists have produced a method for combining single walled carbon nanotubes
[01:25:44.960 --> 01:25:52.000] into highly ordered structures such as a regular helix with minimal errors by using DNA as
[01:25:52.000 --> 01:25:53.000] a lattice.
[01:25:53.000 --> 01:25:59.600] And item number three, researchers have produced a biocompatible fiber optic sensor out of
[01:25:59.600 --> 01:26:00.600] spider silk.
[01:26:00.600 --> 01:26:01.600] Jay, go first.
[01:26:01.600 --> 01:26:02.600] All right.
[01:26:02.600 --> 01:26:06.040] This first one, chemists have developed a method for essentially printing complex designer
[01:26:06.040 --> 01:26:09.360] molecules by using specific frequencies of light.
[01:26:09.360 --> 01:26:10.360] Whoa.
[01:26:10.360 --> 01:26:12.740] Pushing around molecules with light.
[01:26:12.740 --> 01:26:14.480] How can that possibly be?
[01:26:14.480 --> 01:26:19.700] I mean, it can't be that the photons are pushing anything because they're massless, but maybe
[01:26:19.700 --> 01:26:23.400] they do something with temperature or I don't know.
[01:26:23.400 --> 01:26:25.800] That sounds iffy, but super interesting.
[01:26:25.800 --> 01:26:29.560] The second one, scientists have produced a method for combining single walled carbon
[01:26:29.560 --> 01:26:33.680] nanotubes into highly ordered structures such as regular helix with minimal errors by using
[01:26:33.680 --> 01:26:34.920] DNA as a lattice.
[01:26:34.920 --> 01:26:36.160] I think that one is science.
[01:26:36.160 --> 01:26:37.160] I think that's really cool.
[01:26:37.160 --> 01:26:42.720] I know there's a ton of research in this type of processing, you know, nanotubes, you know,
[01:26:42.720 --> 01:26:46.080] hugely wanted, hugely useful.
[01:26:46.080 --> 01:26:49.960] So I could see that they use DNA to help them do something.
[01:26:49.960 --> 01:26:51.840] So I totally think that one is science.
[01:26:51.840 --> 01:26:56.840] Last one, researchers have produced a biocompatible fiber optic sensor out of spider silk.
[01:26:56.840 --> 01:26:59.720] A biocompatible fiber optic sensor.
[01:26:59.720 --> 01:27:05.480] So a sensor, I'm guessing what Steve is saying here is the fiber optic sensor is one of the
[01:27:05.480 --> 01:27:12.680] pieces of hardware that they use to, in this case, receive the signal of light that is
[01:27:12.680 --> 01:27:15.340] then transformed into information.
[01:27:15.340 --> 01:27:18.080] Spider silk is one of those things that you heard a lot about in your life, but they never
[01:27:18.080 --> 01:27:20.280] do anything with it.
[01:27:20.280 --> 01:27:22.040] You know, and it's always like, can you scale it up?
[01:27:22.040 --> 01:27:23.040] Can you produce it?
[01:27:23.040 --> 01:27:24.040] Sure.
[01:27:24.040 --> 01:27:28.560] They might have been able to do something in a lab, but sensing fiber optic, I don't
[01:27:28.560 --> 01:27:29.560] know.
[01:27:29.560 --> 01:27:30.560] It's so weird.
[01:27:30.560 --> 01:27:31.560] I'm going to say that that one is the fiction.
[01:27:31.560 --> 01:27:33.960] I don't think we've done anything with spider silk.
[01:27:33.960 --> 01:27:34.960] Okay, Bob.
[01:27:34.960 --> 01:27:40.880] Yeah, the spider silk biocompatible, I could see, but a fiber optic sensor, I mean, I suspect
[01:27:40.880 --> 01:27:46.160] that the silk is a component, maybe even a major component.
[01:27:46.160 --> 01:27:48.280] So I'm kind of going to buy that one.
[01:27:48.280 --> 01:27:54.960] The nanotubes into a regular helix shape using DNA as a lattice, I guess I can see that.
[01:27:54.960 --> 01:28:00.240] The one that's getting me though is this first one, basically printing designer molecules
[01:28:00.240 --> 01:28:02.440] using specific frequencies of light.
[01:28:02.440 --> 01:28:04.520] Hey, I'm not buying that.
[01:28:04.520 --> 01:28:11.280] That would be amazing and it's just kind of a little bit too amazing at this point.
[01:28:11.280 --> 01:28:13.960] So I'm going to say that one's fiction.
[01:28:13.960 --> 01:28:14.960] And Kara.
[01:28:14.960 --> 01:28:18.920] Well, it's funny because I would say I'm going to go with Bob and Jay on this because Jay
[01:28:18.920 --> 01:28:25.560] basically made the exact same argument and then picked a different
[01:28:25.560 --> 01:28:27.600] choice.
[01:28:27.600 --> 01:28:33.480] So like as Jay was going through his reasoning, I was like, yeah, yeah, the frequency, like
[01:28:33.480 --> 01:28:35.120] there's no way.
[01:28:35.120 --> 01:28:36.120] And then, and then you said the same thing.
[01:28:36.120 --> 01:28:40.920] So I think I have to go with you, Bob, and say that, yeah, the designer molecules with
[01:28:40.920 --> 01:28:43.080] frequencies of light feels like the fiction.
[01:28:43.080 --> 01:28:44.080] All right.
[01:28:44.080 --> 01:28:47.120] So you all agree on the middle one, so we'll start there.
[01:28:47.120 --> 01:28:54.360] Scientists have produced a method for combining single-walled carbon nanotubes, or SWCNTs, into
[01:28:54.360 --> 01:29:00.200] highly ordered structures, such as a regular helix, with minimal errors
[01:29:00.200 --> 01:29:03.120] by using DNA as a lattice.
[01:29:03.120 --> 01:29:07.360] We all think that the DNA lattice is science.
[01:29:07.360 --> 01:29:10.800] And this one is science.
[01:29:10.800 --> 01:29:12.600] This one is cool.
[01:29:12.600 --> 01:29:13.600] Yeah.
[01:29:13.600 --> 01:29:20.640] So it's actually hard, you know, to get these pesky carbon nanotubes to do what we want
[01:29:20.640 --> 01:29:24.440] them to do with very, very few errors.
[01:29:24.440 --> 01:29:32.280] And those errors interfere with the structure of, you know, of the material that we're going
[01:29:32.280 --> 01:29:35.760] for, and therefore really limit their utility.
[01:29:35.760 --> 01:29:43.400] For example, they can form breaking points that can unzip, you know, carbon nanofibers,
[01:29:43.400 --> 01:29:44.400] et cetera.
[01:29:44.400 --> 01:29:45.400] To what end, though?
[01:29:45.400 --> 01:29:46.400] Yeah.
[01:29:46.400 --> 01:29:47.560] So, well, that's an interesting question.
[01:29:47.560 --> 01:29:48.560] So this is how they did it.
[01:29:48.560 --> 01:29:55.600] So first of all, they use a specific sequence of DNA, in this case, you know, with C and G amino
[01:29:55.600 --> 01:30:00.320] acids, I mean, base pairs, with C and G base pairs.
[01:30:00.320 --> 01:30:07.160] So they, for example, they used one sequence that was C3GC7GC3.
[01:30:07.160 --> 01:30:14.460] And those contain cytosine and cross-linking binding spots for carbon.
[01:30:14.460 --> 01:30:21.100] And so that becomes a lattice on which these single-walled carbon nanotubes then
[01:30:21.100 --> 01:30:27.440] bind to each other, and that particular sequence formed an ordered helical structure with a
[01:30:27.440 --> 01:30:30.480] 6.5 angstrom periodicity.
[01:30:30.480 --> 01:30:36.280] But so they just created a helical structure out of the carbon nanotubes, but with very,
[01:30:36.280 --> 01:30:42.320] very few errors because they're being guided into position by this DNA lattice.
[01:30:42.320 --> 01:30:47.760] And you could basically customize the lattice by the sequence that you give it, which changes
[01:30:47.760 --> 01:30:52.800] the relative positioning, you know, of these, the cross-linking reactions.
[01:30:52.800 --> 01:30:54.040] So it's pretty cool.
[01:30:54.040 --> 01:31:03.440] Now, all of the reporting on this said that you could use this to create superconducting
[01:31:03.440 --> 01:31:04.440] materials.
[01:31:04.440 --> 01:31:09.880] But that's like they're putting the cart before the horse, you know.
[01:31:09.880 --> 01:31:15.040] One potential theoretical application of this technology could be making metamaterials that
[01:31:15.040 --> 01:31:19.640] have properties like, oh, I don't know, superconductivity, but that doesn't have anything
[01:31:19.640 --> 01:31:23.680] intrinsically to do with the process that they're developing here, which is just, oh,
[01:31:23.680 --> 01:31:24.680] look at this.
[01:31:24.680 --> 01:31:31.180] We can have exquisite control over linking up these carbon nanotubes by using a specific
[01:31:31.180 --> 01:31:35.840] sequence of, you know, DNA as a, essentially a template or as a lattice.
[01:31:35.840 --> 01:31:36.840] Yeah.
[01:31:36.840 --> 01:31:38.720] I mean, they wouldn't even have to mention superconductivity.
[01:31:38.720 --> 01:31:41.160] Just mention metamaterials and you've got my attention.
[01:31:41.160 --> 01:31:42.160] Yeah.
[01:31:42.160 --> 01:31:45.320] But they all say superconductivity and I'm like, oh, okay, look at that.
[01:31:45.320 --> 01:31:48.000] And I'm like, ah, this has nothing to do with superconductivity.
[01:31:48.000 --> 01:31:49.000] All right.
[01:31:49.000 --> 01:31:50.000] All right.
[01:31:50.000 --> 01:31:51.720] Let's go back to number one.
[01:31:51.720 --> 01:31:55.440] Chemists have developed a method for essentially printing complex designer molecules using
[01:31:55.440 --> 01:31:57.560] specific frequencies of light.
[01:31:57.560 --> 01:32:00.440] Bob and Kara, you think this one is the fiction.
[01:32:00.440 --> 01:32:05.520] Jay thinks this one is science and this one is the fiction.
[01:32:05.520 --> 01:32:06.520] Yeah.
[01:32:06.520 --> 01:32:07.520] Yay, Bob.
[01:32:07.520 --> 01:32:08.520] Yeah.
[01:32:08.520 --> 01:32:09.520] It's a little bit too much.
[01:32:09.520 --> 01:32:10.520] I made it up.
[01:32:10.520 --> 01:32:11.520] You made it up?
[01:32:11.520 --> 01:32:20.360] Well, the news item was using photochemistry as a step in a process of forming organic
[01:32:20.360 --> 01:32:21.360] molecules.
[01:32:21.360 --> 01:32:27.240] They're using the light to break up a molecule into two components that can then be used
[01:32:27.240 --> 01:32:29.680] to form other organic molecules.
[01:32:29.680 --> 01:32:31.660] It's just photochemistry.
[01:32:31.660 --> 01:32:36.800] But the idea of using light to print designer molecules I made up.
[01:32:36.800 --> 01:32:43.000] I had to come up with something that was different enough because you could do so much with light.
[01:32:43.000 --> 01:32:45.000] You can't push molecules around with light.
[01:32:45.000 --> 01:32:46.000] They've done it.
[01:32:46.000 --> 01:32:51.480] You could also combine light with matter in some cases, really heavy stuff.
[01:32:51.480 --> 01:32:58.680] I had to make sure it was significant enough that it's not actually happening out there.
[01:32:58.680 --> 01:33:08.440] Well, Steve, I got to say, wow, this week in SGU history has the distinction
[01:33:08.440 --> 01:33:14.160] of being the longest, the most amount of time I've gone through news items to find something
[01:33:14.160 --> 01:33:15.160] that grabbed me.
[01:33:15.160 --> 01:33:20.320] I went to every damn science news website I could think of.
[01:33:20.320 --> 01:33:21.560] It was unbelievable.
[01:33:21.560 --> 01:33:24.520] It was the worst, the hardest one, the most amount of time.
[01:33:24.520 --> 01:33:28.920] So I have a lot of familiarity with a lot of news items that came out this week.
[01:33:28.920 --> 01:33:34.800] And this one specifically, I related to another news item entirely.
[01:33:34.800 --> 01:33:40.360] And this one, get this, though, it was a way of using lasers to polarize atoms so that
[01:33:40.360 --> 01:33:44.720] one side is more positive, one side more negative, so that you're bonding atoms
[01:33:44.720 --> 01:33:48.440] together.
[01:33:48.440 --> 01:33:50.320] But it's not a strong bond.
[01:33:50.320 --> 01:33:52.760] It's not a molecule like bond.
[01:33:52.760 --> 01:33:57.160] It doesn't have the strength of a molecule, which is why I thought this was fiction, because
[01:33:57.160 --> 01:34:00.840] the word molecule meant that, no, this was absolutely fiction, because it doesn't have
[01:34:00.840 --> 01:34:03.040] the binding strength of molecules.
[01:34:03.040 --> 01:34:06.900] It's really just a very light attraction.
[01:34:06.900 --> 01:34:11.520] And it's so weird, because there's so much you can do with photochemistry, is the thing.
[01:34:11.520 --> 01:34:16.360] The article that inspired me was light as a tool for the synthesis of complex molecules.
[01:34:16.360 --> 01:34:21.160] But here they're using light to break apart a chemical bond and then using that to then
[01:34:21.160 --> 01:34:23.360] insert something in between.
[01:34:23.360 --> 01:34:24.360] Interesting.
[01:34:24.360 --> 01:34:25.360] I'll take the win no matter what.
[01:34:25.360 --> 01:34:26.360] Yeah.
[01:34:26.360 --> 01:34:27.360] Creating organic molecules.
[01:34:27.360 --> 01:34:30.280] But that's the reason I had to make sure it was fiction, because there's so much out there
[01:34:30.280 --> 01:34:32.760] it could easily have been real if it wasn't specific enough.
[01:34:32.760 --> 01:34:33.760] Right.
[01:34:33.760 --> 01:34:36.600] Like I said, they're using light to push atoms around to make molecules.
[01:34:36.600 --> 01:34:38.600] Yeah, that is happening.
[01:34:38.600 --> 01:34:43.680] All right, all of this means that researchers have produced a biocompatible fiber optic
[01:34:43.680 --> 01:34:47.400] sensor out of spider silk is science, but I don't know if you saw this one.
[01:34:47.400 --> 01:34:48.400] I didn't.
[01:34:48.400 --> 01:34:49.400] This one I didn't see.
[01:34:49.400 --> 01:34:50.400] Yeah.
[01:34:50.400 --> 01:34:54.240] Here's the actual title of the article, the published article: Biocompatible spider
[01:34:54.240 --> 01:34:58.560] silk-based metal-dielectric fiber optic sugar sensor.
[01:34:58.560 --> 01:35:03.080] So yeah, I didn't realize this, that spider silk, certain kinds of spider silk actually
[01:35:03.080 --> 01:35:05.760] can have fiber optic properties.
[01:35:05.760 --> 01:35:11.480] You know, fiber optic is something that, you know, the refractive index inside, you know,
[01:35:11.480 --> 01:35:16.120] the substance is such that the light will stay inside, it won't go outside.
[01:35:16.120 --> 01:35:19.360] So it travels down along the fiber, right?
[01:35:19.360 --> 01:35:21.360] That's what makes it fiber optic.
[01:35:21.360 --> 01:35:23.960] So spider silk could be used as a fiber optic.
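For context on what "the light will stay inside" means, here is a quick sketch with assumed, representative numbers; the episode doesn't quote the silk's actual refractive index. Light is trapped by total internal reflection whenever it hits the wall of the fiber beyond the critical angle,

\[ \theta_c = \arcsin\!\left(\frac{n_{\text{outside}}}{n_{\text{core}}}\right), \]

which requires the core to have the higher index. If the silk core is around n ≈ 1.5 and the surroundings are water-like at n ≈ 1.33, then θ_c ≈ arcsin(1.33/1.5) ≈ 62°, so any ray that strikes the wall at more than about 62° from the normal, meaning within roughly 28° of grazing along the wall, is totally reflected and stays confined as it travels down the strand. Anything that changes the optical environment at the surface, which is presumably what the metal-dielectric coating and the sugar solution do, changes how much light leaks or shifts the guided signal, and that is what gets read out as a sensor.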
[01:35:23.960 --> 01:35:28.800] They did coat it with like a metal, not something that made it the fiber optic, but with something
[01:35:28.800 --> 01:35:29.800] else.
[01:35:29.800 --> 01:35:35.500] And then they used it as a sensor for sugar, for different types of sugar.
[01:35:35.500 --> 01:35:41.480] So like this would be like a biosensor and then it's very, very, very precise to tell
[01:35:41.480 --> 01:35:45.040] the difference between glucose, sucrose and fructose.
[01:35:45.040 --> 01:35:46.040] How about sucralose?
[01:35:46.040 --> 01:35:48.200] I didn't, did not mention.
[01:35:48.200 --> 01:35:51.520] And it's biocompatible because you can put it in the body and it won't cause a reaction
[01:35:51.520 --> 01:35:52.520] or anything.
[01:35:52.520 --> 01:35:54.760] So yeah, very, very interesting.
[01:35:54.760 --> 01:35:57.160] More spider silk applications, please.
[01:35:57.160 --> 01:35:59.880] I want my shirt, my bulletproof shirt.
[01:35:59.880 --> 01:36:01.720] I had another one that I didn't use.
[01:36:01.720 --> 01:36:05.400] I could have used it though, but I just thought it was too obvious because, you know, this
[01:36:05.400 --> 01:36:14.140] was a process for making cement that is 40% stronger than regular cement by putting basically
[01:36:14.140 --> 01:36:18.560] nano ground up shrimp shells in there.
[01:36:18.560 --> 01:36:22.080] So yeah, so there's a seafood byproduct, right?
[01:36:22.080 --> 01:36:23.080] Yeah.
[01:36:23.080 --> 01:36:29.120] There's a stream of waste from the seafood industry for chitin shells from things like
[01:36:29.120 --> 01:36:30.120] shrimp, right?
[01:36:30.120 --> 01:36:32.720] Or crabs or lobsters or whatever, but they specifically mentioned shrimp.
[01:36:32.720 --> 01:36:33.720] 40% man.
[01:36:33.720 --> 01:36:34.720] Yeah.
[01:36:34.720 --> 01:36:38.920] So it's cement, not necessarily the resulting concrete, but it probably would translate
[01:36:38.920 --> 01:36:39.920] well to the concrete.
[01:36:39.920 --> 01:36:41.760] They just haven't looked at that yet.
[01:36:41.760 --> 01:36:46.040] And also it makes it more flexible, so it's stronger and more flexible.
[01:36:46.040 --> 01:36:55.080] And so if that translates to the final product, that could reduce the amount of cement, right?
[01:36:55.080 --> 01:36:59.600] Or concrete that you need in a build. Because remember, we talked about the fact
[01:36:59.600 --> 01:37:09.200] that steel is responsible for 10% of greenhouse gas release, concrete is responsible for 5%.
[01:37:09.200 --> 01:37:13.840] So yeah, between the two of them, it's 15%, you know, of our greenhouse gas emissions.
[01:37:13.840 --> 01:37:14.840] Yeah.
[01:37:14.840 --> 01:37:19.120] So reducing the need for cement for concrete by 40% could be significant.
[01:37:19.120 --> 01:37:21.520] You know, that could take a big chunk out of that.
[01:37:21.520 --> 01:37:24.900] And also just having a stronger cement is nice and more flexible, plus it also could
[01:37:24.900 --> 01:37:25.900] last longer.
[01:37:25.900 --> 01:37:27.280] It might last like twice as long.
[01:37:27.280 --> 01:37:28.280] Oh my God, man.
[01:37:28.280 --> 01:37:32.800] And it would, that factor alone would reduce our need by, you know, by a significant chunk
[01:37:32.800 --> 01:37:35.840] because you don't have to replace it, you know, as often.
[01:37:35.840 --> 01:37:36.840] Yeah.
[01:37:36.840 --> 01:37:41.640] So more durable, longer lasting, stronger cement using a waste stream.
[01:37:41.640 --> 01:37:42.640] A waste stream.
[01:37:42.640 --> 01:37:44.920] Talk about a win-win-win.
[01:37:44.920 --> 01:37:49.200] Right now we just dump it back in the ocean, which is probably not a bad thing, you know.
[01:37:49.200 --> 01:37:51.380] And chitin, it's basically made of chitin.
[01:37:51.380 --> 01:38:00.520] So chitin's a biopolymer, and it is the second most common, or abundant,
[01:38:00.520 --> 01:38:02.920] biopolymer in the world.
[01:38:02.920 --> 01:38:03.920] What is the first?
[01:38:03.920 --> 01:38:04.920] Hair.
[01:38:04.920 --> 01:38:05.920] Hair.
[01:38:05.920 --> 01:38:06.920] No, no, no, no.
[01:38:06.920 --> 01:38:07.920] Polycarbonate.
[01:38:07.920 --> 01:38:08.920] No.
[01:38:08.920 --> 01:38:09.920] What's it called?
[01:38:09.920 --> 01:38:10.920] Bicarb.
[01:38:10.920 --> 01:38:11.920] The thing that makes up seashells.
[01:38:11.920 --> 01:38:12.920] Calcium.
[01:38:12.920 --> 01:38:13.920] Calcium carbonate.
[01:38:13.920 --> 01:38:14.920] Calcium carbonate.
[01:38:14.920 --> 01:38:15.920] Yeah.
[01:38:15.920 --> 01:38:16.920] That's in the seashells too.
[01:38:16.920 --> 01:38:17.920] That is in cement.
[01:38:17.920 --> 01:38:18.920] So that is part of why it's a useful additive.
[01:38:18.920 --> 01:38:20.200] But no, no, biopolymer.
[01:38:20.200 --> 01:38:24.280] So it is cellulose, cellulose made by plants, right?
[01:38:24.280 --> 01:38:25.280] That's like the basic.
[01:38:25.280 --> 01:38:26.280] Ah, yeah.
[01:38:26.280 --> 01:38:31.000] Yeah, so plants beat out insects, I guess, in terms of their structural biopolymer, in
[01:38:31.000 --> 01:38:34.640] terms of just the raw amount in the world.
[01:38:34.640 --> 01:38:37.000] But chitin is the second most common.
[01:38:37.000 --> 01:38:39.880] And spider silk is also a biopolymer.
[01:38:39.880 --> 01:38:41.580] My favorite biopolymer.
[01:38:41.580 --> 01:38:47.280] Not as abundant as cellulose or chitin, but it has very desirable properties.
[01:38:47.280 --> 01:38:48.280] Yeah.
[01:38:48.280 --> 01:38:50.560] They're all structurally very strong things.
[01:38:50.560 --> 01:38:51.560] Cool.
[01:38:51.560 --> 01:38:52.560] Okay.
[01:38:52.560 --> 01:38:54.560] Well, good job, Bob and Kara.
[01:38:54.560 --> 01:38:56.160] I came close, Steve.
[01:38:56.160 --> 01:38:57.160] You did.
[01:38:57.160 --> 01:38:58.160] You were almost there, Jay.
[01:38:58.160 --> 01:39:00.080] You just doubted yourself at the last moment.
Skeptical Quote of the Week (1:39:01)
If anyone can refute me–show me I'm making a mistake or looking at things from the wrong perspective–I'll gladly change. It's the truth I'm after, and the truth never harmed anyone. What harms us is to persist in self-deceit and ignorance.
– Marcus Aurelius (121-180), Roman emperor and Stoic philosopher, from Meditations Book 6, Number 21
[01:39:00.080 --> 01:39:01.640] Okay.
[01:39:01.640 --> 01:39:05.960] I am taking over the quote for today since Evan is not here.
[01:39:05.960 --> 01:39:08.720] This was submitted by a listener called Grant.
[01:39:08.720 --> 01:39:11.280] That's all he gave as his name.
[01:39:11.280 --> 01:39:13.040] And I thought it was appropriate.
[01:39:13.040 --> 01:39:19.320] And the quote is from Marcus Aurelius, Meditations, Book 6, Number 21.
[01:39:19.320 --> 01:39:24.960] Marcus Aurelius, skeptic of the ancient world, I don't know if you guys know who he is.
[01:39:24.960 --> 01:39:29.180] And he wrote, if anyone can refute me, show me I'm making a mistake or looking at things
[01:39:29.180 --> 01:39:32.420] from the wrong perspective, I'll gladly change.
[01:39:32.420 --> 01:39:35.840] It's the truth I'm after, and the truth never harmed anyone.
[01:39:35.840 --> 01:39:39.600] What harms us is to persist in self-deceit and ignorance.
[01:39:39.600 --> 01:39:40.600] Oh, my God.
[01:39:40.600 --> 01:39:41.600] Wow.
[01:39:41.600 --> 01:39:42.600] Wow.
[01:39:42.600 --> 01:39:43.600] He's a fantastic man.
[01:39:43.600 --> 01:39:44.600] Total, right?
[01:39:44.600 --> 01:39:45.600] Total skeptic living in the ancient world.
[01:39:45.600 --> 01:39:47.640] Yeah, he's a philosopher, right?
[01:39:47.640 --> 01:39:48.640] Yeah.
[01:39:48.640 --> 01:39:49.640] Yeah.
[01:39:49.640 --> 01:39:51.480] We've quoted him before, too, because he's sort of a...
[01:39:51.480 --> 01:39:52.480] He's very quotable.
[01:39:52.480 --> 01:39:53.480] Yeah, yeah.
[01:39:53.480 --> 01:39:55.520] Very quotable.
[01:39:55.520 --> 01:39:58.920] He lived from 121 to 180 C.E.
[01:39:58.920 --> 01:40:06.760] He was a Roman emperor from 161 to 180, and a Stoic philosopher, the last of the rulers
[01:40:06.760 --> 01:40:09.840] known as the five good emperors.
[01:40:09.840 --> 01:40:13.360] But yeah, this guy's just overflowing with skeptical philosophy.
[01:40:13.360 --> 01:40:14.920] I'm going to read more about him.
[01:40:14.920 --> 01:40:17.120] I'm interested to see what he has to say.
[01:40:17.120 --> 01:40:18.120] Yeah, totally.
[01:40:18.120 --> 01:40:19.120] It is fascinating.
[01:40:19.120 --> 01:40:24.000] I took a course in Greek philosophy in college, and the professor said they basically thought
[01:40:24.000 --> 01:40:25.000] of everything.
[01:40:25.000 --> 01:40:30.920] The first time they started thinking systematically about stuff, they basically had all the ideas,
[01:40:30.920 --> 01:40:35.140] in terms of just basic philosophical ideas.
[01:40:35.140 --> 01:40:39.960] Everything has its roots there; it's like variations on a theme from the Greek philosophers.
[01:40:39.960 --> 01:40:42.920] It's probably not literally true, but it does seem like...
[01:40:42.920 --> 01:40:43.920] Wait, he's Roman.
[01:40:43.920 --> 01:40:44.920] Yeah, yeah.
[01:40:44.920 --> 01:40:47.600] But Greek led to Roman, led to, you know, but...
[01:40:47.600 --> 01:40:49.120] So you mean between the two.
[01:40:49.120 --> 01:40:51.320] Yeah, I mean, they also did something really smart.
[01:40:51.320 --> 01:40:52.680] They wrote it down.
[01:40:52.680 --> 01:40:53.680] They wrote it down.
[01:40:53.680 --> 01:40:54.680] Exactly.
[01:40:54.680 --> 01:40:58.000] You know, like, they probably weren't the first people to think of this stuff either,
[01:40:58.000 --> 01:40:59.000] but they wrote it down.
[01:40:59.000 --> 01:41:04.280] But they did systematically think about things, about philosophy, and they wrote it
[01:41:04.280 --> 01:41:05.280] down.
[01:41:05.280 --> 01:41:06.280] So it survived.
[01:41:06.280 --> 01:41:09.080] And so it's like, oh, they thought of all this stuff, you know, just...
[01:41:09.080 --> 01:41:10.080] Yeah.
[01:41:10.080 --> 01:41:13.040] I mean, they grappled with concepts that we still grapple with today.
[01:41:13.040 --> 01:41:14.520] Yeah, yeah, yeah, yeah.
[01:41:14.520 --> 01:41:15.520] Called the human condition.
[01:41:15.520 --> 01:41:16.520] Yep.
[01:41:16.520 --> 01:41:17.520] Right.
Signoff/Announcements
[01:41:17.520 --> 01:41:18.520] All right.
[01:41:18.520 --> 01:41:19.520] Well, thank you all for joining me this week.
[01:41:19.520 --> 01:41:20.520] Sure, bro.
[01:41:20.520 --> 01:41:21.520] Thanks, Steve.
[01:41:21.520 --> 01:41:22.520] Thanks, Steve.
S: —and until next week, this is your Skeptics' Guide to the Universe.
S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.
Today I Learned
- Fact/Description, possibly with an article reference[9]
- Fact/Description
- Fact/Description
Notes
References
- ↑ Nature Nanotechnology: Atomic-scale friction between single-asperity contacts unveiled through in situ transmission electron microscopy
- ↑ Neurologica: Political Ideology and the Brain
- ↑ Neurologica: Lunar Pits Warm and Comfy
- ↑ The Verge: Playing video games all summer won’t make you feel worse
- ↑ Global Times: Australian scientists begin to shine light into invisible dark matter
- ↑ Nature Chemistry: Photochemical single-step synthesis of β-amino acid derivatives from alkenes and (hetero)arenes
- ↑ Science: DNA-guided lattice remodeling of carbon nanotubes
- ↑ Biomedical Optics Express: Biocompatible spider silk-based metal-dielectric fiber optic sugar sensor
- ↑ [url_for_TIL publication: title]
Vocabulary