SGU Episode 903

This episode was transcribed by the Google Web Speech API Demonstration (or another automatic method) and therefore will require careful proof-reading.

SGU Episode 903
October 29th 2022

SGU 902                      SGU 904

Skeptical Rogues
S: Steven Novella

B: Bob Novella

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

Guest

RS: Richard Saunders

Quote of the Week

Quote

Author 


Links
Download Podcast
Show Notes
Forum Discussion


Introduction, Navigating Social Media, Misinformation, Pandemics

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

S: Hello and welcome to the Skeptics' Guide to the Universe. Today is Thursday, October 27th, 2022, and this is your host, Steven Novella. Joining me this week are Bob Novella

B: Hey, everybody!

S: Cara Santa Maria

C: Howdy.

S: Jay Novella

J: Hey guys.

S: Evan Bernstein.

E: Good evening, everyone.

S: So I wonder how many of our listeners have noticed that we're recording on Thursday now instead of Wednesday.

C: Oh, right. Because you do say the date of the recording, not the date that we air.

S: Yeah, because I want people to know when we had the conversation. Not that people still don't email us saying, why didn't you talk about this thing that happened on Friday, you know, like after we recorded the show but before it went on.

(laughter)

J: Because Friday doesn't exist yet.

S: Right.

C: Exactly.

Cara Moving To Florida (00:26)

S: We're in podcast time. So we had to change our night because of Cara because she moved to Florida to do her new job. How's it going?

C: Damn it.

S: How's the whole Florida new training thing going, Cara?

C: Oh, well, okay. This has nothing to do with the question you just asked, but kind of, because you said, "how's it going?" The first thing I thought of was: I was walking to work the other day. So you guys, I walk to and from work every day, which everybody here thinks is bananas because it's, like, hot and humid, but I love it. It's just like a mile each way, so 20 minutes, whatever. The other day I had my first iguana-fell-out-of-a-tree-right-in-front-of-me situation.

J: Did it scream as it went by or what happened?

E: You're a real Floridian now.

C: I'm a real Floridian. I walked to school in Crocs. I'm like, I'm just full on Florida girl now.

S: Not until you have an alligator stalking you.

E: Yeah, you do need to have a gator encounter.

S: Close encounter with a gator.

C: But really-

J: Can somebody explain Crocs to me? Like, and I'm dead serious.

E: Shoes?

C: Yes, I will explain them to you. So they are foam, waterproof shoes that have a "sport gear", which is what they call it down here, when you flip the strap behind your heel. Usually you keep it flipped to the front because they're easy to slip on and off, but when you need them to be in sport gear, you put it behind your heel so you can go and they don't fall off.

(Evan laughs)

C: Okay, you didn't think that was as funny as I did. Literally people say that. They're like, oh, your Crocs are in sport gear.

S: You have a desert style.

C: Yeah. So when I first came here, the first day I went to HR and was signing paperwork and I was talking to the woman and I was like, oh yeah, I'm going to be walking to work. And she was like, are you out of your mind? And I was like, what do you mean? She's like, people don't walk here, which is true. I almost get hit by cars constantly. They're not used to pedestrians. It's hot, it's humid, whatever. It's a car culture. But she was like, go ahead, go online, buy yourself a pair of Crocs. Thank me later. And I was like, okay. And I did. And it's like the best decision I ever made because it always rains here. And so you never know when you're going to be walking through puddles. You never know when you're going to step in mud. And Crocs are waterproof completely. They're just these molded foam shoes with holes in them.

J: Now what about the whole, like they look terrible angle?

C: They do look terrible, but they're just, it's because it's one solid piece, right? Usually shoes are like things glued and sewn together, but this is like one piece, like a molded shoe. And so yeah, they're hideous.

E: So you could 3D print your own Crocs.

C: You could, but that would not be comfortable because you want them to be, like, properly.

S: Isn't function over fashion, like all of Florida, isn't like in their constitution or something?

C: Yes. Function over fashion is yes, the state motto.

(Cara & Evan laugh)

C: But they're super smart. And so I keep my regular shoes in my backpack, and when I get to school, I swap them out for my regular shoes. So anyway, I was walking to school, and there's a thing here where apparently when it gets cold, and we're not there yet, it's still warm out, so I feel like this iguana just slipped, but apparently when it gets cold, iguanas freeze and fall out of trees. And then they are like stunned when they hit the ground and scurry off. Or if they're still asleep, people collect them and then they either wait until they wake up or they cull them, because they're invasive. But this iguana fell, I swear, two feet from me.

S: Can you eat them?

C: I don't know. I wouldn't want to.

S: Can you make something out of their leather if you skin them?

C: Oh, probably. You could probably do some cool taxidermy with them because they're beautiful. They're like little dinosaurs. I've talked to multiple people who said an iguana has fallen on their head, which blows my mind because they're big and they break car windshields all the time. It's a thing here. Iguana strike.

S: That's right. The last time I was in Florida, there weren't a lot of iguanas. I guess recently their populations have been taking off?

C: I think they've been here a while, but it depends on where you were. So I'm in South Florida and just the area where I live near Fort Lauderdale and Davie is very iguana heavy and they like bodies of water. So if you're walking by a lot of little ponds and rivers and lakes and streams and things like that, they're just all over the banks and climbing the trees and everything.

S: Yeah.

C: Yeah. They're cool though, but yeah, they don't like them here. But I kind of like them.

Katy Perry's Eye 'Glitch' Due To Makeup (04:53)

J: Cara, can I ask you something else that's related to women and possibly makeup?

C: Florida?

J: No.

C: Oh, yes.

J: So what's up with the Katy Perry eye situation? What do you think that was caused by?

C: So who shared this with us first? Is that you, Evan?

E: Oh, I did. Yes, yes I did.

(Cara & Evan laugh)

C: Okay. People who are listening, you have to see this video. I think it's been all over the internet, so hopefully you've seen it, but if not, make sure you watch it. So she looked like a droid, right?

S: Mmmm

C: Did you guys get that same vibe? Like she looked animatronic in the video?

S: Right.

J: Yeah.

E: And that was one of the running sort of jokes online. But then I also found some posts, you know, they have the clip posted to YouTube a thousand times and a bunch of people saying, oh, here's the result of the vaccination, of the COVID vaccine.

S: Oh, my God.

E: That she's suffering perhaps paralysis of some sort, due to the fact that she's an advocate for COVID vaccinations.

S: Evan, do you know why people say that?

E: Because they're dumb?

S: Because they're stupid.

E: (laughs) Yeah. So good. I was right.

S: You are correct, sir.

(laughter)

S: That is not neurological. That is not paralysis or weakness or any medical neurological phenomenon at all.

C: So it wasn't droopy. It wasn't like her face, there wasn't anything else in her face. Yes, it was unilateral. But if you watch the video, it's like she closes both eyes like to blink and then one of them doesn't open. The other one opens and then she's like trying to open it and it can't open. And then she like uses her finger and she opens it. And I'm looking at this video and thinking she's on stage. She's wearing a ton of stage makeup. Her eyelashes are amazing, but they're also out of control big.

E: Right.

C: Honestly, her eyes probably got stuck together from the eyelash glue.

S: I agree.

C: That's what it looks like to me.

S: Got stuck. Totally.

C: She's sweaty.

B: And it wasn't a bit?

C: Like melting. No, I don't think it was a bit. Like she looked kind of stressed about it.

E: Well, she actually posted today, today as we speak, about it. These are the first comments she's publicly made about it. Let's see. In an Instagram post, she says: "Welcoming all my flat earthers, space is fakers, birds aren't realers, sky isn't bluers to come see my broken doll eye party trick in Las Vegas next year." So poking fun at the whole thing, and it does have like a doll eye vibe to it.

C: It's very uncanny valley when she does that.

E: Yeah.

S: Yeah. Because it's not biological. That's why.

C: Right. Yeah. Because if it were biological, it wouldn't, it might look kind of different to us, but it would, it wouldn't look not real.

S: Yeah, exactly.

C: Yeah.

S: People should just, you know, stop diagnosing people online from videos and stuff.

(Evan laughs)

S: I remember when: "Oh look, Hillary coughed up a piece of her lung." No she didn't. Stop it.

E: Oh my God (laughs)

C: I mean, what was that? Actually, I'm trying to remember.

S: Probably a lozenge or something, you know.

Interview with Richard Saunders (07:40)

S: Hey, you know who we haven't heard from in a long time?

C: Who?

E: Who's that Steve?

S: Uh, Richard Saunders. Remember that guy from Australia? We used to know.

C: Yeah.

E: Yes? Richard Saunders. The Skeptic zone.

J: Remember him? I want to have his baby. That's where I'm at with him.

(laughter)

C: Wow.

S: You ever seen him since (inaudible)? I mean, peak pandemic.

Same, same thing.

2019.

Yeah. 2019.

J: You want me to bring him in, Steve?

S: Sure. Can you, Jay?

J: Hold on. Yeah. Give me a second. All right.

B: Snap your fingers. You got the gauntlet on, Jay? Snap them.

J: Richard, are you there?

RS: What the hell? Oh my gosh. Sorry, I was just on the movie set and now I'm in this room. What the hell's going on here?

S: The magic of Hollywood.

RS: Oh, it's SGU. My goodness me. All right. All right. I'll tell the director I'll be back later.

J: Yeah. We're going to need you for a couple of hours, Richard.

RS: All right. Hi guys. Rogues.

E: Hello. What's up?

J: Richard, how you doing, man?

RS: I'm doing really well. It's been a long time, and it's largely due to this pandemic and the ramifications. As you all know, in years past we'd always see each other once or twice a year at TAM or CSICon or Dragon Con or wherever the hell it was. I even came to visit you guys once. NECSS.

S: Yep. Yeah.

RS: The pandemic came along and everything went sort of haywire.

S: Yeah. We haven't been collectively to Dragon Con, for example, in three years. NECSS has been online for three years.

RS: I think the last time I saw you guys in the States would have been Dragon Con.

S: Yeah, it was. Then you came out here. We saw you in 2019.

RS: Right before the pandemic.

S: That was right before. That was the last trip we took before it happened.

RS: Thinking about being on set: luckily, now things are starting to get back to normal. I've been picking up a lot more movie and TV work and things like that, which I do as a part-time thing. And I spent a day filming on Thor some time ago, because these things take a while, then the movie came out. So I grew a beard and grew my hair for four months to look like a classical god. They dressed me up in robes and everything, and I spent a whole day filming these amazing scenes with other gods around me, looking at Thor and reacting and going crazy. And if you freeze the movie, you can actually see me for about two seconds.

J: You're my favorite deity, Richard.

RS: There you go. There I am. And I can honestly say I have played a classical god in a Marvel movie. So there you go.

E: Nice.

S: Do you still have a weekly podcast?

RS: The Skeptic Zone is now in its, heavens above, 14th or 15th year.

S: Right?

RS: Yeah. Oh my gosh. I think I kicked it off the year I met all you guys in person. And I met you guys, most of you guys, at The Amazing Meeting in 2008.

S: 2008. Yeah.

RS: 2008. There's a great picture of us all together. And that's the year I kicked off The Skeptic Zone. And yeah, every week it's still coming out. I've had many reporters over the years, and lots of things come and go, but it's been a huge part of my life. You realize, doing a weekly podcast for umpteen years, it's a huge part of your life, if you look at your life as a whole and you think this whole section was dedicated, or is still dedicated, to this one endeavor. And it's a tribute to you guys especially, because your show is a year or two older than mine. And all of us old timers, like Brian Dunning or Gio, or from around the world, this is dedication. This is real dedication.

S: This is episode number 903 for us.

RS: No! Wowee wow. Oh. My next episode will be 734. I'll catch up one day. I'll catch up one day.

C: That's how time works.

E: That's right.

J: Richard, you mentioned how things have changed since the pandemic, and it's so true. I mean, my perception on just how I spend my time, you know, with my friends outside of my house, like everything is different, but it is the norm now.

RS: Look, you know, you get used to it. I mean, we all had the period where we couldn't do anything. I mean, especially here in Australia, the lockdowns were pretty harsh. You had to have a good reason to leave the house, and there were restrictions on how far you could travel. It's all sort of fading a bit back into the memory now, but especially for me, not being able to see my regular friends. And of course, in the middle of all that, James Randi died, which hit me like a ton of bricks, you know?

S: Yeah.

RS: Apart from the influence Randi had on all our lives, as you all know, when you got to see him, he was a friend, as we call it in Australia, a good mate, a buddy, a pal. And you know, you hung out together and swapped stories and learned a lot, and you know, that all came to an end, and that's still a sadness. You know, I'm so glad I got to know him, as we all are. And that's one of the big parts of our life too.

S: So Richard, what skeptical projects are you working on?

RS: Well, at the moment in Australia, we're gearing up for the big convention, our first face-to-face convention since the beginning of the pandemic. We've had two conventions since the pandemic started, online, which were very successful. They were great; people from around the world joined us. But at last we're coming to meet again, and if you're in Australia and you want to come and meet all the skeptics face-to-face, we're meeting in Canberra, which is the capital of Australia, and this will be the first weekend of December. All the information about the guest speakers is at scepticon.org.au, AU for Australia. And we've got, for example, talking about dedicated podcasters, the ESP, all the members of the European Skeptics Podcast, coming.

E: Oh, awesome.

RS: Yeah. And it's the first time for many of them. One of them has been to Australia before, but it's the first time for the others. We also have Claire Klingenberg, of course, who is the president of the European skeptical organizations, and a host of local speakers, local Australian speakers, talking about all sorts of wonderful things. We've got some international visits via the internet from somebody called Dr. Steve Novella.

J: I've heard about him.

RS: He's going to be appearing, yeah, and our buddy Brian Dunning will also be appearing live via the screens to address everybody. But it'll be great. I mean, it'll be the wonderful thing of seeing everybody. We'll have the dinner, we'll have the Bent Spoon Award for the most preposterous piece of paranormal or pseudoscientific piffle, which we give out every year, and other prizes and things like that. So that's the big thing happening, coming up, you know. But apart from that, for me personally, the biggest thing in the last year was the publication of the Great Australian Psychic Prediction Project, which was a 12-year endeavor on my part, with, later on in that 12 years, people from around the world, collating 3,811 psychic predictions published in Australia over a 20-year period, analyzing those and then publishing our results. And that, to me, that's...

S: Oh, how'd they do?

RS: Well, it was surprising. We discovered that actually about 90% of psychics were spot on, and I'm resigning from the skeptics now.

B: Well, the evidence speaks. You've got to listen.

RS: I have the evidence. That's right. I'm done. I'm done. No, okay. So, I mean, to cut a long story short, we discovered, I call this the Saunders number of psychic predictions: if you have a lot of predictions and you analyze them, roughly 11% can be categorized as coming true. That's it, 11%. The majority, in our estimation, our calculations after the research, roughly 53% of predictions are simply wrong. Wrong, wrong, didn't happen, never happened; it's just a wrong prediction. But the other part of the chart, the pie chart, if we look at it in that respect, is we have a section called "too vague", and that's 18%. This is predictions like, "I predict that this actress, in her heart, she knows her true love will come one day, but she worries about..." This is waffly, vague waffle stuff. We also called it waffle. That's 18%. And the other part is, there's a 15% which we categorized as "expected", and these include, and I kid you not, somebody predicting earthquakes in California.

E: Wow.

RS: That's what we said: wow, I can't believe you're going to predict that.

C: That's bold, yeah. That's something.

RS: Or, somebody predicted, for the last presidential race between Joe Biden and Donald Trump, that according to numerology, either Joe Biden or Donald Trump has a chance to be president.

J: What? No way.

S: Sounds like one of my predictions.

E: I didn't know that.

RS: That was staggering, that was staggering. But if people want to read the results, we published it in our magazine, The Skeptic, at the end of last year. But I also have the results: that report is read to you on an episode of The Skeptic Zone. If you go to skepticzone.tv, right at the top of the page is a link. You can click the link, sit back, and have the whole paper we published simply read to you, and our conclusions and our methodology and the things we had to go through.

RS: I have an archive of every prediction. In other words, the database, as I said, has approaching 4,000 predictions in it, but what I have on the hard drive is every prediction. In other words, if the prediction came from a magazine, I have got a copy, a digital copy. If the prediction was made on a radio show, I've got the audio. A TV show, I've got the video. That's why the project took 12 years to complete. So I sort of figure every 10 years I seem to do something slightly momentous. So I'm wondering what the next 10 years will hold. And there's a couple of highlights in my skeptical career. I guess one of them was helping take down Power Balance, the wristbands. And the wonderful memory I share with many of you is the day we spoke at Google in California.

S: Yeah, we were talking about that recently.

RS: And I did the old whammy on Jay. We did the Power Balance routine, the applied kinesiology, for the people at Google, and that was great. I loved that time with you guys. That was fantastic.

J: Well, the content doesn't end. That's the thing. It's endless. You can apply a skeptical lens to pretty much anything that comes your way, especially from the internet. And strangely, the world needs critical thinking more now than it did 20 years ago, when all of us were just beginning doing all this.

E: Imagine how bad the world would be if we weren't doing this.

J: That's a really good question. I wonder, would there be more scam items like the Power Balance just going through our society? I really have to wonder about that.

RS: Oh, just before we leave the predictions, I just spotted some here. As you can imagine, there were many, many wrong predictions; the majority of predictions were simply wrong. But I love a few here made about Donald Trump in 2017, where an astrologer said Donald Trump will make a good president and his cabinet will be the best the US has seen for years. Another one predicted Donald Trump will not be impeached. And some were predicting he would easily win the 2020 election. So I guess this teaches us that when we see a psychic prediction, take it with a grain of salt. If you get some laughs or entertainment out of it, that's one thing. But many of these people actually make their living giving these predictions to magazines and radio shows and things like that.

S: This is actually their Trojan horse. They start out with the entertaining predictions that get into the periodical or whatever, into the newspaper. But really, they're using that to troll for marks, right? To get people to come in, then they can rip them off for all they're worth.

RS: Yeah. Yes.

C: So it's not benign.

RS: It's not benign at all. They'll find some wealthy older people, unfortunately, who are susceptible to scams. Well, but they will. And I search for news items all the time about people getting scammed by psychic charlatans. You'd be surprised: if you just do a Google search, or a browser search, under news for "psychic scam", you'll see a lot of stories come up, and hundreds of thousands, millions of dollars get scammed from these victims. They are victims. It's huge. It's huge. And another interesting thing to note: in my report, the official report, I noted that despite all this work and our conclusions, which are thorough, I fully expect that every year the TV, the radio, the newspapers, whatever, will still have, "oh, here's so-and-so the psychic, come in to tell us, what do you see in the stars for Hollywood this year" or whatever it is. That won't stop. And we know that. That won't stop. But at least this report is out there, and any journalist is free to read it and see our results.

News Items

Trust in Scientists (21:50)


[21:50.360 --> 21:52.640] Richard, let me ask you a question.

[21:52.640 --> 21:56.960] How much do you think Australians in general trust scientists?

[21:56.960 --> 21:59.680] Oh, that's a good question.

[21:59.680 --> 22:05.200] You know, I don't know officially, I don't know if I've seen any stats on that.

[22:05.200 --> 22:07.880] I would hope, of course, I would hope that they would.

[22:07.880 --> 22:09.920] I think they do.

[22:09.920 --> 22:15.720] One of our most trusted figures in Australia, one of our most trusted personalities is Dr.

[22:15.720 --> 22:18.200] Carl Krzelnicki, who is a science communicator.

[22:18.200 --> 22:19.200] Dr. Carl, yeah.

[22:19.200 --> 22:20.200] Dr. Carl.

[22:20.200 --> 22:21.280] He's like Bill Nye.

[22:21.280 --> 22:26.400] And he regularly appears in Australia's top trusted people.

[22:26.400 --> 22:32.040] So I would hope, you know, certainly they would rate above used car salesman or something

[22:32.040 --> 22:34.920] like that.

[22:34.920 --> 22:40.560] We have a Pew survey asking that question of Americans, Kara, what did they find?

[22:40.560 --> 22:41.560] This is great.

[22:41.560 --> 22:45.360] I mean, I love that Pew does these surveys periodically and consistently because we're

[22:45.360 --> 22:48.120] able to kind of track trends over time.

[22:48.120 --> 22:57.120] And this most recent survey, which was actually just administered to 10,588 adults in the

[22:57.120 --> 23:05.360] US between September 13 and 18, 2022, was interested in understanding, as they put

[23:05.360 --> 23:10.620] it, quote, how Americans view science and their levels of confidence in groups and institutions

[23:10.620 --> 23:16.920] in society, including scientists and medical scientists, because they do make some distinctions.

[23:16.920 --> 23:19.080] And you'll see a lot of write ups of this.

[23:19.080 --> 23:23.560] But there was a really good write-up in The Conversation, which I wanted to specifically

[23:23.560 --> 23:29.520] cite, that was written by John C. Besley, who's a professor of public relations

[23:29.520 --> 23:35.220] at Michigan State University, because he writes about how it's easy, when you look at these

[23:35.220 --> 23:38.980] results, to get a little bit frustrated.

[23:38.980 --> 23:44.280] But it's also really important to notice the strengths of these results.

[23:44.280 --> 23:51.540] So basically, 81 percent of Americans think that government investments in scientific

[23:51.540 --> 23:56.360] research are, quote, worthwhile investments for society over time.

[23:56.360 --> 23:57.360] So that's pretty good.

[23:57.360 --> 23:58.360] That's good.

[23:58.360 --> 23:59.360] Right.

[23:59.360 --> 24:00.400] Eighty one percent.

[24:00.400 --> 24:05.840] That also means 19 percent don't think that, but 81 percent think that.

[24:05.840 --> 24:08.800] And so there are a lot of different questions like these Pew surveys are never just one

[24:08.800 --> 24:09.800] thing.

[24:09.800 --> 24:11.960] They're like a lot of questions asked a lot of different ways.

[24:11.960 --> 24:14.440] So I wanted to go through some of the ones that he highlights.

[24:14.440 --> 24:16.420] And then I pulled a couple more.

[24:16.420 --> 24:20.920] You're going to see some trends that are very expected here.

[24:20.920 --> 24:29.540] But the important thing is that this trend is actually relatively stable.

[24:29.540 --> 24:33.920] So there are things that are trending that aren't stable, especially with regards to

[24:33.920 --> 24:37.800] like intense political polarization that has gotten worse.

[24:37.800 --> 24:44.600] But in terms of the kind of number of Americans who have a certain amount of confidence, here

[24:44.600 --> 24:50.040] we go, that scientists act in the public's best interest, the numbers are pretty good

[24:50.040 --> 24:54.200] and they've been pretty consistent over time.

[24:54.200 --> 24:58.040] So we looked at a great deal versus a fair amount.

[24:58.040 --> 25:01.800] Do you think these groups act in the public's best interest?

[25:01.800 --> 25:06.860] And when we combine the two highest level confidences, which are a great deal and a

[25:06.860 --> 25:12.000] fair amount, both kind of on the positive side, 80 percent of people said that medical

[25:12.000 --> 25:15.120] scientists act in the public's best interest.

[25:15.120 --> 25:19.480] And that's, you know, in light of COVID and in light of like a massive anti-Fauci campaign.

[25:19.480 --> 25:21.920] So this is like a big deal.

[25:21.920 --> 25:27.480] That's higher than people's confidence in the military acting in the public's best interest.

[25:27.480 --> 25:28.480] So 80 percent.

[25:28.480 --> 25:33.560] There was a bump in a positive direction during the pandemic.

[25:33.560 --> 25:34.560] During the pandemic.

[25:34.560 --> 25:35.560] Yeah.

[25:35.560 --> 25:36.560] Yeah.

[25:36.560 --> 25:42.360] And I'm like, it doesn't surprise me, A, because I think people needed to know that and believe

[25:42.360 --> 25:46.120] that because it gave them stability and it gave them structure and it gave them a reason

[25:46.120 --> 25:47.120] to act.

[25:47.120 --> 25:49.800] It's hard to not act.

[25:49.800 --> 25:54.800] But we have to remember that's sort of the angle that this professor writes from.

[25:54.800 --> 25:59.680] The point that he makes very often is we have to remember that even though the minority

[25:59.680 --> 26:02.260] is vocal, it's still a minority.

[26:02.260 --> 26:03.700] It might be loud.

[26:03.700 --> 26:07.600] It might be something that the media picks up a lot.

[26:07.600 --> 26:08.600] But it's still a minority.

[26:08.600 --> 26:14.520] OK, so 80 percent of Americans think that medical scientists act in the public's best

[26:14.520 --> 26:15.520] interest.

[26:15.520 --> 26:19.800] 77 percent the military, 77 percent general scientists.

[26:19.800 --> 26:23.360] So they do see a difference between medical scientists and general scientists, which the

[26:23.360 --> 26:26.560] author of this speaks on a little bit later.

[26:26.560 --> 26:28.920] Only 70 percent in police officers.

[26:28.920 --> 26:33.360] And then it goes down from there, 53 percent religious leaders, 44 percent journalists.

[26:33.360 --> 26:34.360] That's problematic.

[26:34.360 --> 26:37.320] Twenty eight percent elected officials.

[26:37.320 --> 26:42.360] That's appropriate.

[26:42.360 --> 26:49.560] And so here we are starting to see some big political polarization when the question

[26:49.560 --> 26:55.080] is the percentage of Americans who say, when it comes to public policy debates about scientific

[26:55.080 --> 26:59.020] issues, scientists should... and they're given a binary.

[26:59.020 --> 27:00.560] And I think it's important to remember this.

[27:00.560 --> 27:02.640] They're given only two options.

[27:02.640 --> 27:05.800] So you'll see that the numbers total 100 percent here.

[27:05.800 --> 27:12.800] Take an active role in policy debates or focus on establishing sound scientific facts.

[27:12.800 --> 27:20.900] So if we're to talk about those outcomes in a cherry picked way, in a framed, biased way,

[27:20.900 --> 27:27.240] we could say, you know, people who lean Republican in the most recent survey, only 29 percent

[27:27.240 --> 27:32.840] of them think scientists should take an active role in policy debates, while a full 66 percent

[27:32.840 --> 27:36.620] of Democrats think that scientists should take an active role in policy debates.

[27:36.620 --> 27:39.160] But remember, the opposite.

[27:39.160 --> 27:43.200] So it's 29 percent of Republicans who think scientists should take an active role

[27:43.200 --> 27:44.240] in policy debates.

[27:44.240 --> 27:48.520] That means 70 percent of them think scientists should focus on establishing sound scientific

[27:48.520 --> 27:49.520] facts.

[27:49.520 --> 27:55.280] Well, all political persuasions think scientists should focus on establishing

[27:55.280 --> 27:57.400] sound scientific facts.

[27:57.400 --> 28:03.560] But within this political climate, we're seeing that people who lean Democratic are choosing

[28:03.560 --> 28:08.160] that they should take an active role in policy debates more often than they're choosing focus

[28:08.160 --> 28:10.480] on establishing sound scientific facts.

[28:10.480 --> 28:12.800] So a lot of this is about framing.

[28:12.800 --> 28:19.280] And I think that we are going to see big fluctuations in how these polling questions are answered

[28:19.280 --> 28:23.420] when they're framed a particular way against a particular political background.

[28:23.420 --> 28:27.200] And of course, right now we're talking COVID, we're talking vaccination, we're talking

[28:27.200 --> 28:31.200] climate change, we're talking a lot of, you know, we've had this conversation a million

[28:31.200 --> 28:36.440] times on the show, non-political, but very politically fueled issues, if that makes sense.

[28:36.440 --> 28:42.440] So I tended to see those stats as the most negative in this survey.

[28:42.440 --> 28:44.440] Because the difference was so big between them?

[28:44.440 --> 28:45.440] Yeah.

[28:45.440 --> 28:50.280] Well, not only was it very big between Democrats and Republicans, but the numbers for

[28:50.280 --> 28:54.960] both Democrats and Republicans are getting worse in the last few years.

[28:54.960 --> 28:59.480] Democrats are down from 75 to 66, Republicans from 43 to 29.

[28:59.480 --> 29:03.240] But also to me, my interpretation of that, and there's multiple ways to interpret things

[29:03.240 --> 29:09.160] like this, is that that is compartmentalization, that people are basically saying, yes, I like

[29:09.160 --> 29:13.760] science when those scientists stay in their lane, just stay in your lab doing your nerdy

[29:13.760 --> 29:20.120] stuff, then just shut up when it comes to public policy, even about scientific topics.

[29:20.120 --> 29:22.080] So from, you know, the safety of GMO foods.

[29:22.080 --> 29:24.480] Right, because it specifically says of science issues, scientific issues.

[29:24.480 --> 29:28.440] Yes, this is scientific issues, this is not a non-science, they're not telling you, yeah,

[29:28.440 --> 29:29.840] weigh in on economic policy.

[29:29.840 --> 29:36.780] They're saying, we know what these issues are, they are vaccines, nuclear power, GMOs,

[29:36.780 --> 29:44.080] climate change, gun safety, you know, they say, yeah, but don't apply your science to

[29:44.080 --> 29:49.200] questions that I have an emotional or ideological opinion about, you just stay in the lab and

[29:49.200 --> 29:50.600] shut up.

[29:50.600 --> 29:55.040] And that's problematic, in my opinion, that's highly problematic.

[29:55.040 --> 29:56.040] I think it's problematic too.

[29:56.040 --> 29:58.640] Yeah, I don't think there's a positive way to spin that.

[29:58.640 --> 30:02.880] No, I think the only thing to remember, it's not so much a positive, it's more of a caveat,

[30:02.880 --> 30:05.560] which is that's a false binary.

[30:05.560 --> 30:06.560] I hear you.

[30:06.560 --> 30:11.400] And so it's up to a lot of interpretation, and the answers just by definition aren't

[30:11.400 --> 30:12.680] nuanced there.

[30:12.680 --> 30:18.400] I hear you, but I think the one thing that tells me that people are interpreting it exactly

[30:18.400 --> 30:23.960] like we expect them to interpret it is because of the Democratic-Republican split in the

[30:23.960 --> 30:26.240] predictable direction.

[30:26.240 --> 30:31.800] So that means it is pretty much telling us what we think it's telling us, you know, the

[30:31.800 --> 30:38.000] party that has more ideological positions that run afoul of science is the one that

[30:38.000 --> 30:42.760] doesn't want scientists speaking about policy issues that are scientific.

[30:42.760 --> 30:49.960] Well, and that speaks to this other question that in the coverage he really, really focused

[30:49.960 --> 30:54.400] on which was, do you think scientists are good at making policy decisions?

[30:54.400 --> 30:56.940] So now there's a value judgment there.

[30:56.940 --> 31:02.280] And they were asking them how they stack up against other people in terms of making good

[31:02.280 --> 31:06.920] policy decisions, again, specifically about scientific issues.

[31:06.920 --> 31:12.080] So the answer choices were: usually worse, neither better nor worse,

[31:12.080 --> 31:13.220] or usually better.

[31:13.220 --> 31:17.720] So again, do you think scientists are good at making policy decisions when stacked up

[31:17.720 --> 31:23.400] against other people when they're making policy decisions about scientific issues?

[31:23.400 --> 31:26.580] Usually worse, neither better nor worse, or usually better.

[31:26.580 --> 31:33.000] And we again are seeing that Republicans and people who lean Republican in September of

[31:33.000 --> 31:41.540] 2022: 24% of them said that scientists are usually better, 17% said that scientists

[31:41.540 --> 31:42.760] are usually worse.

[31:42.760 --> 31:44.420] This is worrisome.

[31:44.420 --> 31:47.740] And 58% in the middle said they're neither better nor worse.

[31:47.740 --> 31:52.640] And this varies dramatically from how Democrats and those who lean Democrat answered.

[31:52.640 --> 31:56.340] 55% of them said scientists are usually better.

[31:56.340 --> 31:58.880] Only 5% said scientists are usually worse.

[31:58.880 --> 32:03.000] And 38% in the middle said they're neither better nor worse.

[32:03.000 --> 32:06.440] That has remained relatively consistent over time.

[32:06.440 --> 32:11.760] And when I say over time, it's just the last three polls, so 2022, 2020 and 2019.

[32:11.760 --> 32:17.560] But you've seen a dramatic shift over here with Republicans recently.

[32:17.560 --> 32:22.440] The usually worse has gone from 9% to 11% to a full 17%.

[32:22.440 --> 32:27.720] And the usually better has dropped from 34%, which was consistent in 2019 and 2020 down

[32:27.720 --> 32:30.740] to 24% in the usually better camp.

[32:30.740 --> 32:38.400] So we are seeing, and it is worrisome, that basically two in 10 Republicans think that

[32:38.400 --> 32:44.520] scientists are usually worse at making good policy decisions about scientific issues than

[32:44.520 --> 32:46.760] quote, other people.

[32:46.760 --> 32:52.440] Again, not compared to politicians, not compared to some other nuclear physicists or whatever, literally

[32:52.440 --> 32:56.240] just scientists compared to other people.

[32:56.240 --> 32:58.160] So that is worrisome.

[32:58.160 --> 33:00.800] It's absolutely worrisome.

[33:00.800 --> 33:07.720] And the tack that's taken in this coverage, which I really appreciate, is the tack that

[33:07.720 --> 33:12.360] I think you often see with people who study the science of science communication.

[33:12.360 --> 33:17.240] So not just those who are scientists who want to disseminate their science and are like,

[33:17.240 --> 33:19.200] okay, I'm just going to shout my science to the rooftops.

[33:19.200 --> 33:20.260] We talk about this a lot, right?

[33:20.260 --> 33:26.360] We talk about knowledge deficits and using social psychology and different persuasion

[33:26.360 --> 33:31.200] approaches to help improve the effects of science communication.

[33:31.200 --> 33:38.520] And what the author of this, who co-wrote a book on science communication strategy with

[33:38.520 --> 33:46.980] another researcher, basically focused on the social science research that shows that people

[33:46.980 --> 33:56.760] are considered more trustworthy when they are deemed caring, honest, and competent.

[33:56.760 --> 33:59.680] Scientists very often are deemed competent.

[33:59.680 --> 34:04.960] They are quite often deemed honest, they are very rarely deemed caring.

[34:04.960 --> 34:06.720] And this is something that we need to see change.

[34:06.720 --> 34:12.020] And it's probably why you see a difference between medical scientists and scientists

[34:12.020 --> 34:16.600] in terms of people's responses, because they have personal relationships to their doctors

[34:16.600 --> 34:21.360] and they think of their doctors as caring individuals, but they see scientists as siloed

[34:21.360 --> 34:28.560] in a lab, in a lab coat, cold, calculating, mad scientists, destructive, whatever the

[34:28.560 --> 34:29.560] stereotypes are.

[34:29.560 --> 34:35.640] It's amazing that stereotype has persisted my entire life, 53 years; it hasn't seemed

[34:35.640 --> 34:36.640] to have changed.

[34:36.640 --> 34:39.360] But ironically, Kara, that's what people want them to be.

[34:39.360 --> 34:46.080] They say they want them to be like just robots in a lab, but they don't like that.

[34:46.080 --> 34:49.920] And then they're like, you're not very caring, and so I don't trust you.

[34:49.920 --> 34:54.980] But the thing is, scientists are supposed to be dispassionate, right?

[34:54.980 --> 34:58.040] And you could be caring and dispassionate at the same time.

[34:58.040 --> 35:01.760] This is something, this is a fine line that I have to walk as a physician, right?

[35:01.760 --> 35:06.780] And we are explicitly taught this as part of our medical professionalism.

[35:06.780 --> 35:10.560] It's like the difference between sympathy and empathy, like you have to be sympathetic

[35:10.560 --> 35:16.520] to your patients, you have to care about them, but you can't get emotionally attached because

[35:16.520 --> 35:21.680] then you lose your objectivity, and your job is to be objective.

[35:21.680 --> 35:23.080] And there's no way around it.

[35:23.080 --> 35:29.120] Being objective requires a certain amount of cold calculation because that's your freaking

[35:29.120 --> 35:37.960] job to say, listen, you know, these are the odds, this is the choice you have to make.

[35:37.960 --> 35:42.960] And this is what I recommend based upon the evidence, you know, and we could personalize

[35:42.960 --> 35:47.100] this to you, and I want you to get better, but I don't want to make an emotional recommendation

[35:47.100 --> 35:48.100] or decision here, right?

[35:48.100 --> 35:50.600] My job is to give you the cold, calculating facts.

[35:50.600 --> 35:55.440] But do you really think, and like honestly, Steve, because we're really talking about

[35:55.440 --> 36:01.360] the difference between a practitioner, which, as you've said before on the show, is maybe

[36:01.360 --> 36:08.240] as much art as it is science, but there is a science to it, of course, to medical

[36:08.240 --> 36:09.240] practice.

[36:09.240 --> 36:12.120] And I would even throw in psychological practice in there, but there's also a certain amount

[36:12.120 --> 36:13.320] of humanity to it.

[36:13.320 --> 36:15.720] There's also a certain amount of clinical instinct.

[36:15.720 --> 36:20.400] There's a certain amount of understanding that's very different than when we think about

[36:20.400 --> 36:27.800] laboratory-based science, where we're not having any sort of relationship with a, quote,

[36:27.800 --> 36:30.400] patient because they're not a patient, they're a subject, they're a participant, however

[36:30.400 --> 36:31.400] you want to view it.

[36:31.400 --> 36:32.400] Right.

[36:32.400 --> 36:37.640] And that's, yeah, being a clinician includes skills in addition to being a good scientist,

[36:37.640 --> 36:43.240] absolutely, being able to look at the data and make that calculating decision.

[36:43.240 --> 36:49.240] But my problem with these numbers, with the public attitude about like scientists shouldn't

[36:49.240 --> 36:55.880] get involved with policy debate on scientific topics, is that, you know, again, because

[36:55.880 --> 37:00.200] they want, the bottom line is they want to be able to have their opinion without being

[37:00.200 --> 37:02.040] contradicted by the facts.

[37:02.040 --> 37:09.360] And what we're talking about here is not scientists dictating policy or injecting their own opinion

[37:09.360 --> 37:16.320] or ideology, it's being involved in the debate about a scientific topic as an expert to provide

[37:16.320 --> 37:17.320] the facts.

[37:17.320 --> 37:18.320] Right.

[37:18.320 --> 37:23.560] Because the problem that people have with that is that, you know, they're presenting

[37:23.560 --> 37:28.320] authoritative facts that might not go their way.

[37:28.320 --> 37:33.880] And so they're trying to cheat, they're trying to avoid the facts by just by keeping them

[37:33.880 --> 37:36.000] out of the conversation.

[37:36.000 --> 37:40.080] You know, they want to have a conversation where opinion rules because the people who

[37:40.080 --> 37:42.880] know the facts aren't allowed in the room.

[37:42.880 --> 37:44.720] That's really problematic.

[37:44.720 --> 37:45.720] It is.

[37:45.720 --> 37:52.320] And I think it's why you see massive differences between answers to questions about how good

[37:52.320 --> 37:58.760] scientists are at making political judgments or how good scientists are at weighing in

[37:58.760 --> 38:03.200] compared to some of these Pew questions that say things like the percent of U.S. adults

[38:03.200 --> 38:07.520] who say government investments in scientific research are worthwhile investments over time

[38:07.520 --> 38:09.240] or not worthwhile over time.

[38:09.240 --> 38:11.220] Like most people think they're worthwhile.

[38:11.220 --> 38:13.920] Of course, because people want the benefits of science.

[38:13.920 --> 38:14.920] They want everything.

[38:14.920 --> 38:15.920] They want it all.

[38:15.920 --> 38:18.400] They want the benefits of science, but they don't want the inconvenience of having to

[38:18.400 --> 38:20.640] listen to it when it might cut against their...

[38:20.640 --> 38:25.280] So then here comes the question, and I think the important part, not so much of the

[38:25.280 --> 38:29.000] Pew study, which has a lot more statistics, by the way, because these surveys are always

[38:29.000 --> 38:33.200] really rich and dense, and they're all open access, you can read about them online.

[38:33.200 --> 38:38.720] But the part that I find really fascinating about the approach of the write-up

[38:38.720 --> 38:45.720] in The Conversation is: if we know, based on social science research, that in order to be

[38:45.720 --> 38:50.920] deemed trustworthy, there are three main components to that:

[38:50.920 --> 38:54.920] being caring, being honest, and being competent.

[38:54.920 --> 39:01.720] And if scientists are sort of starting to get up against that burnout of frustration

[39:01.720 --> 39:04.480] and of, oh, why can't these people just get it?

[39:04.480 --> 39:07.920] I wish these people would just believe us.

[39:07.920 --> 39:12.720] I wish that they would just, you know, listen to us more.

[39:12.720 --> 39:15.320] And they're you know, they're too stupid to understand.

[39:15.320 --> 39:21.880] It's the exact opposite thing that you would say if you want people to find you to be caring,

[39:21.880 --> 39:23.760] honest and competent.

[39:23.760 --> 39:31.120] And so in an effort, in the exasperated effort to come to the table in a very firm way, I

[39:31.120 --> 39:36.520] think what we're actually doing is causing a bigger rift between that 20 percent and

[39:36.520 --> 39:37.520] the scientific community.

[39:37.520 --> 39:39.080] And it's worrisome.

[39:39.080 --> 39:40.080] It's worrisome.

[39:40.080 --> 39:42.440] And so the question is, how do we approach this?

[39:42.440 --> 39:48.000] How do we as scientists do a better job of, I think, respecting the fact that

[39:48.000 --> 39:54.720] there is a body of literature and an investigative process called the science of science communication

[39:54.720 --> 39:56.600] that we can learn from?

[39:56.600 --> 39:57.600] That's one part of it.

[39:57.600 --> 40:03.000] But there's another signal in this data that's pretty strong and cuts across all other demographics.

[40:03.000 --> 40:09.360] And that is that education positively correlates with having trust in scientists.

[40:09.360 --> 40:13.640] So really, if we just make people more scientifically literate, that's probably the best single

[40:13.640 --> 40:15.960] thing we could do to improve these numbers.

Reliability of World Energy: Energy Security of Europe (40:15)

  • [link_URL title][2]

[40:15.960 --> 40:19.160] Jay, you're going to talk about the next news item.

[40:19.160 --> 40:23.880] What is the situation, especially in Europe, in terms of energy security, you know, with

[40:23.880 --> 40:25.680] the whole Ukraine war thing?

[40:25.680 --> 40:29.800] Yeah, this is actually I'm going to specifically talk about Europe.

[40:29.800 --> 40:36.160] So the global energy market, as everyone knows, has had severe ups and downs since the pandemic

[40:36.160 --> 40:41.840] hit and the start of the pandemic caused energy demand to actually go down in the beginning.

[40:41.840 --> 40:43.880] Yeah, because the world economy was in the crapper.

[40:43.880 --> 40:48.600] Yeah, because of the lockdowns and people aren't driving to work and industries weren't

[40:48.600 --> 40:51.200] able to function on the same level as they were.

[40:51.200 --> 40:54.440] But then in 2021, demand came back.

[40:54.440 --> 40:59.340] And this was because the world economy started to increase again.

[40:59.340 --> 41:04.160] And then the Ukraine war massively impacted European countries as they tried to move away

[41:04.160 --> 41:05.760] from Russian energy.

[41:05.760 --> 41:08.680] So we're seeing this play out this year, right?

[41:08.680 --> 41:15.120] We're seeing that, you know, European countries were essentially reliant on Russian energy.

[41:15.120 --> 41:20.920] And then as sanctions came down and Russia started to turn down the volume of natural

[41:20.920 --> 41:26.000] gas, as they started to turn down the amount of gas that they were selling to Europe,

[41:26.000 --> 41:30.580] Europe found itself with a deficit, and they were having a big

[41:30.580 --> 41:31.580] problem with that.

[41:31.580 --> 41:34.580] And a lot of people were thinking, you know, I guess this means we're going to switch back

[41:34.580 --> 41:38.440] to coal, you know, which has global warming implications.

[41:38.440 --> 41:44.200] So in all this energy drama, some experts are seeing, however, indications

[41:44.200 --> 41:48.240] that things might actually not be so bad after all.

[41:48.240 --> 41:53.960] You know, like, where are we today versus six months ago when the war was still in

[41:53.960 --> 41:55.360] the beginning phase?

[41:55.360 --> 41:59.720] So the renewable energy market has shown very good growth.

[41:59.720 --> 42:04.200] Over the past decade, global renewable energy consumption has grown exponentially at an

[42:04.200 --> 42:07.360] average annual rate of 12.6 percent.

[42:07.360 --> 42:11.840] The EU generated a record 12 percent of its electricity from solar from May to August

[42:11.840 --> 42:15.080] 2022 and 13 percent from wind.

[42:15.080 --> 42:17.020] These are really good numbers.

[42:17.020 --> 42:20.820] Renewables are likely to account for almost 95 percent of the increase in global power

[42:20.820 --> 42:26.120] capacity through 2026, with solar providing more than half of that figure.

[42:26.120 --> 42:32.240] Now, anybody who is keeping up with what's going on with renewables is very well accustomed

[42:32.240 --> 42:37.480] to the fact that the numbers are trending in a very positive and very strong direction.

[42:37.480 --> 42:42.120] You know, the renewables market has mostly offset its own contributions to emissions.

[42:42.120 --> 42:46.720] But you've got to keep in mind, you know, the creation of renewables, like the creation

[42:46.720 --> 42:52.920] of solar panels and creating, you know, the giant blades that go on the windmills.

[42:52.920 --> 42:57.280] Of course, they're wind turbines, I'm sorry, the turbines.

[42:57.280 --> 42:59.680] These are creating greenhouse gas emissions.

[42:59.680 --> 43:02.960] But then these things are functioning for a long time.

[43:02.960 --> 43:07.560] And, you know, there is no more creation of greenhouse gases in their ability

[43:07.560 --> 43:10.480] to collect or create electricity.

[43:10.480 --> 43:15.040] So Europe has been trying to find alternative sources of energy since the war began.

[43:15.040 --> 43:20.240] This upcoming winter is particularly under consideration because of the dramatic increase

[43:20.240 --> 43:22.560] in energy usage needed, right?

[43:22.560 --> 43:27.460] Because, you know, during the colder seasons, we have to burn more fuel or we need more

[43:27.460 --> 43:33.200] energy in order to heat our homes and to keep our livable space livable.

[43:33.200 --> 43:38.400] So most of Europe's energy demand comes from the burning of natural gas.

[43:38.400 --> 43:42.560] And historically, Russia has been their key supplier.

[43:42.560 --> 43:46.680] And since the sanctions, like I said, came down, Russia has been slowly turning

[43:46.680 --> 43:51.360] it down, giving less and less and less to them, to the point where this

[43:51.360 --> 43:56.180] has really been a serious problem that all of Europe has had to confront.

[43:56.180 --> 44:01.280] It's very possible that Russia cuts Europe off completely from its natural gas.

[44:01.280 --> 44:06.400] Now with the shortage of natural gas, the prices skyrocketed, you know, the prices of

[44:06.400 --> 44:07.560] natural gas went way up.

[44:07.560 --> 44:11.500] And this puts a financial strain on everyone, you know, everybody in Europe.

[44:11.500 --> 44:15.860] And because of this, an increase in coal use was really all but unavoidable.

[44:15.860 --> 44:18.640] But some interesting decisions were made.

[44:18.640 --> 44:22.760] Germany decided to keep its last nuclear power plants operational, the ones that were

[44:22.760 --> 44:24.200] scheduled to be shut down.

[44:24.200 --> 44:28.040] And they said that we're going to keep these, we're going to keep these open longer in order

[44:28.040 --> 44:33.320] to generate, especially during this winter, in order to generate the energy that's needed.

[44:33.320 --> 44:36.840] This has helped fill the void that natural gas shortage has created.

[44:36.840 --> 44:40.820] Now, also, companies that produce natural gas have been shipping their products to Europe,

[44:40.820 --> 44:45.720] so many that they can't offload the ships fast enough; a couple of things happening

[44:45.720 --> 44:46.720] here.

[44:46.720 --> 44:52.220] There was a great response by companies that create liquefied natural gas.

[44:52.220 --> 44:57.040] And these companies don't even have enough ships to bring the natural gas

[44:57.040 --> 44:58.040] to Europe.

[44:58.040 --> 45:00.500] That's how much is being shipped to Europe.

[45:00.500 --> 45:03.020] It's actually great that this is happening.

[45:03.020 --> 45:09.960] Add to that conservation efforts, and Europe's natural gas storage is now at approximately, what,

[45:09.960 --> 45:11.200] where do you guys think it is?

[45:11.200 --> 45:16.240] Where do you think Europe's natural gas storage is right now?

[45:16.240 --> 45:17.240] 40?

[45:17.240 --> 45:18.240] I don't know.

[45:18.240 --> 45:19.240] Make it 25.

[45:19.240 --> 45:20.240] There's a guess.

[45:20.240 --> 45:23.880] I don't know what normal is, but yeah, less than half of normal.

[45:23.880 --> 45:25.700] So I have a great number for you.

[45:25.700 --> 45:28.480] They're actually at 93.8 percent.

[45:28.480 --> 45:30.080] OK, yes.

[45:30.080 --> 45:35.160] That's because these companies have been shipping a ton of liquefied natural

[45:35.160 --> 45:36.160] gas.

[45:36.160 --> 45:37.160] Wow.

[45:37.160 --> 45:38.880] And they've been squirreling it away and squirreling it away.

[45:38.880 --> 45:44.040] And it's funny that I did not hear anything about this in the news.

[45:44.040 --> 45:47.920] I had to dig in to find this information.

[45:47.920 --> 45:53.000] So now natural gas prices are significantly dropping from their August high because Europe

[45:53.000 --> 45:56.680] actually has this massive store of natural gas.

[45:56.680 --> 46:01.840] If Europe is careful with its natural gas and no, you know, out-of-norm, crazy

[46:01.840 --> 46:08.520] cold weather happens, Europe should most definitely be fine this winter under this interpretation.

[46:08.520 --> 46:09.520] Right.

[46:09.520 --> 46:10.520] Because we'll get into that in a second.

[46:10.520 --> 46:12.760] Like, you know, how do we interpret this information?

[46:12.760 --> 46:14.440] Seems like it's very good news.

[46:14.440 --> 46:19.640] The International Energy Agency, the IEA, says that Europe's energy futures look good.

[46:19.640 --> 46:22.020] Wind and solar are going to continue to grow.

[46:22.020 --> 46:26.400] They predict that carbon emissions will not be tied to the growth of the overall European

[46:26.400 --> 46:27.680] economy.

[46:27.680 --> 46:32.420] And the obvious goal is to see emissions drop as the economy increases because we have this

[46:32.420 --> 46:39.800] major shift to renewables, not just having the renewables there, but letting those renewables

[46:39.800 --> 46:42.080] replace fossil fuel burning.

[46:42.080 --> 46:43.080] Right.

[46:43.080 --> 46:47.440] There's a backup plant that's there in case renewables can't give you the on-demand energy

[46:47.440 --> 46:48.440] that you need.

[46:48.440 --> 46:53.800] But if we are really using renewable energy as much as we possibly can, then, you know,

[46:53.800 --> 46:57.480] we're going to see a decrease in greenhouse gas emissions because we're just simply not

[46:57.480 --> 46:59.040] burning fossil fuels anymore.

[46:59.040 --> 47:05.320] So the way that I delivered this, this news item, is very much a positive look on what's

[47:05.320 --> 47:06.520] going on in Europe.

[47:06.520 --> 47:08.600] I vetted the information very carefully.

[47:08.600 --> 47:11.380] I made sure that these were factual points.

[47:11.380 --> 47:16.120] But there are a lot of people out there that are not agreeing that this is positive.

[47:16.120 --> 47:20.280] And it really is just how do you want to interpret the news and how do you want to interpret

[47:20.280 --> 47:21.960] the information that comes across your plate?

[47:21.960 --> 47:25.440] I find it very interesting.

[47:25.440 --> 47:28.920] This is kind of related to Cara's news item, right?

[47:28.920 --> 47:35.080] Because even if you have a source of information that you can trust, you could even still,

[47:35.080 --> 47:39.780] even still, you can misinterpret that information or choose to interpret it differently depending

[47:39.780 --> 47:42.680] on your, you know, your political point of view.

S: I just want to clarify a couple of things. When you say renewable energy, I just want people to understand that includes wind and solar, but also hydroelectric and geothermal. So it's not just wind and solar.

J: Yeah, I mentioned those because they were the highest.

S: Yeah, I understand. But those are very different types of energy, and sometimes you can artificially inflate the number. In some countries it's mostly hydroelectric, and if you say "renewable," people think wind and solar. But wind and solar are increasing dramatically as well. And I also have to point out, I know I've pointed this out before, that Germany is such a cautionary tale here. Because 10, 15 years ago, when they were saying, "We're going to shut down our nuclear power plants and we're going to switch over to all renewable energy," a lot of the experts were saying: if you try to do that, you're basically going to be building coal-fired plants, because the choice right now is between nuclear and fossil fuel. That's your choice. There's no way around that. And in general Germany's attitude was, "Yeah, we know what we're doing." So they did it anyway. And that made them utterly dependent on Russian oil and natural gas. That became basically the plan; they basically were saying, "We're just going to switch over from nuclear to natural gas," which is terrible. Their carbon emissions went up.

B: It went up.

S: They thought they were going to be able to wind-and-solar their way out of it, but you can't do that, for multiple reasons. One is, and this is like an entire separate issue, but it's critical here if we're talking about how we're going to build out our energy infrastructure going forward: because wind and solar are intermittent sources of power, the grid can basically take 30 percent before you start to get into serious problems of not being able to manage the supply and demand on the grid. You end up having to build a lot of redundancy, a lot of backup, a lot of overcapacity.

J: Steve, you're talking about even on a day-to-day level, like when the sun goes down and solar is not being collected.

S: Yeah, right. The grid won't have that energy. So what's the effect of that? How useful is it to have a lot of solar on your grid when it goes down? Solar is only up about 22 percent of the time; that's its capacity factor. Wind is on average 35 percent right now, but with the new wind turbines and better locations we can get up to 50 percent, maybe even higher, which is great. That's in the realm of fossil fuel plants. Nuclear is the best, at around 92 percent, meaning once it's up and running it's producing energy at 92 percent of maximum capacity. So we need to build the whole system. If you just try to maximize wind and solar, the grid can't take it. You end up producing more energy than you can use at times, then you don't have enough at other times, and you have no way to buffer it. You need big grids. So we need to be investing massively in upgrading our grids to make them more robust, and to get smart grids. And we need to research grid storage, because that makes everything better. Grid storage is both dispatchable and can absorb the excess.

B: You know, I think it's probably going to be a long time before we even get to that 30 percent figure, so I'm not worried about that, as long as we're not concentrating it in one small area.

S: Right, that's what we shouldn't be doing. We need to be spreading it out around the world, so that every grid has 20 to 30 percent. We don't want one grid, or one part of a grid, to be at 80 percent or whatever, because we can't handle it. We don't have the grid for it, we don't have the technology for it right now, and it doesn't work. And you can't shut down your nuclear plant and build wind and solar; that doesn't work. Because nuclear is on demand. So is coal and natural gas, but we don't want to rely on them. We don't want to burn coal; we don't want to burn natural gas. Hydroelectric is fantastic, but it's limited by location. Geothermal is fantastic, but it's limited by location.

B: There's actually research in the works that could dramatically expand the number of potential locations for geothermal.

S: We talked about that. That would be great, but we don't have it yet. It would be great if it really comes to fruition; that's worth investing in. So we just have to be smart about this. This is an opportunity. Even though the whole thing is dramatically spiking gas prices and is very, very problematic, it does accomplish a couple of things. Everyone realizes now, I think, two big things that they got slapped in the face with. First of all, we need nuclear. At least for the next 20 years or so, we've got to have nuclear if we're going to make the whole thing work. The second thing is that there is no energy security while you're dependent on oil. That's it. You don't have energy independence when you're dependent on the world market of oil and gas. I mean, think about this. This is a very United States-focused thing, but Putin and the Saudis recently got together and decided to reduce the production of oil. And the consensus of expert opinion is that the only plausible reason they did that was to influence America's next election, our midterm elections. They're using their oil production to screw with our internal politics. Think about that. And the only way out of that is to not be dependent on burning fossil fuels. You want to be energy independent? You need renewable energy sources. That is it. And I agree with you, Richard, that a lot of the environmentalists are coming around to the fact that we're just not going to succeed with this fantasy of 100 percent wind and solar. It's not going to happen. We need things that can produce energy on demand to make it work, at least for the next 20, 30 years: until something new comes along, until we have some really good grid storage solution, or fusion comes online. Thorium is great. Anything where we can produce lots of on-demand power; that's what we need.

RS: Yeah, well, that old word "natural" comes up again. A lot of the people who are into alternative medicine just get carried away on this, the natural fallacy: "It's natural, so it must be good. Nature provides all this energy; we can harvest the winds, which are nature's gift." It's all very flowery and wonderful and everything. But when you get down to brass tacks, people can, well, hopefully not, but people may well freeze to death this winter.

E: That would be natural.

B: Yeah, coal is natural.

S: That's right.

J: Well, everyone, we're going to take a quick break from our show to talk about our sponsor this week, BetterHelp. You know, it seems like since the pandemic it's been kind of hard to get back to what I consider to be my emotional baseline before all of that. I don't know about you guys, but it took a lot out of me.

C: Yeah, Jay. You know, therapists are trained to help you navigate life's problems, learn skills, coping mechanisms. There are so many benefits to going through therapy, whether you have a mental health diagnosis or not, whether you're just dealing with something difficult in your life or you're really struggling with mental illness. Having that navigator on your side can be a massive benefit. As the world's largest therapy service, BetterHelp has matched 3 million people with professionally licensed and vetted therapists, available 100 percent online. Plus, it's affordable. Just fill out a brief questionnaire to match with a therapist, and if things aren't clicking, you can easily switch to a new therapist any time. It couldn't be simpler: no waiting rooms, no traffic, no endless searching for the right therapist. Learn more and save 10 percent off your first month at betterhelp.com/SGU. That's BetterHelp, H-E-L-P, dot com, slash SGU.

J: All right, guys, let's get back to the show.

Video Games and Cognition (56:00)

  • [link_URL title][3]

S: The next news item is interesting, but I'm going to start with a preamble. So Bob, Jay, and I were at New York Comic Con talking about our book, and one of the questions we fielded from the audience at the end started with this premise: what do we think is going to happen with the idiocracy? Like, where are things headed with the dumbing down of the general populace? Now, I had to disagree with the premise of the question, because the facts actually do not show that. People's performance on standardized tests of cognitive function, let me put it that way, is increasing over time, if anything. There's the so-called Flynn effect, which we've talked about on the show before, where IQ points have been increasing by about three per decade over the last 50, 60 years. And if you remember Jon Miller's civic scientific literacy test, which he's been doing for 20-plus years, since 1988: the portion of the public meeting the low end of the threshold for civic scientific literacy increased from 9% up to 29% by 2008, and it's pretty much plateaued since then. But still, it increased, and it's now plateaued at a 29% level. I still think that's low, but things are moving in the positive direction. So the question is: is this a general phenomenon? And specifically, what's the relationship between technology and human cognitive function? Does technology have a positive effect or a negative effect? The news item we're going to talk about relates specifically to video games. I'll tell you what the study did, and then you guys can tell me what you think the results might have been. Essentially, it was a cohort study. They looked at two cohorts of children, by self-report, so that's a caveat. One group played no video games: zero hours per week of video game play. The other group were those children who self-reported 21 hours of video games per week or more. That averages out to three hours per day, which is quite a bit. And they just tracked them over time, did standardized testing on them, and looked at their brain function to see what was going on. So what do you guys think this study showed? Those who haven't already read my blog post about it: do you think that the group of children who played no video games had better cognitive processing on standardized testing, or the kids who played 21 hours or more of video games per week?

RS: Well, I'd have to ask: what were the kids who weren't playing video games doing? Were they sitting up a tree? Were they reading books to enlighten themselves? Were they engaged in other activities?

S: No data. They just weren't playing video games.

J: And it didn't matter what video game? Like, any video game?

S: Yeah, any video game.

B: I spend a lot of time in VR these days. That's really quite something, I can tell you.

E: That's where you've been these last few years.

B: That's where I've been.

C: I feel like either no difference or a positive correlation with the video gamers, not the other way around. If anything, the video gamers go up.

RS: I would say there wouldn't be much in it.

B: They became super geniuses.

S: There was a strong positive effect of playing video games.

E: Interesting. What do they attribute that to?

S: This study did not address the question of what causes it. This is just purely correlational, right?

C: Purely correlational. Okay.

S: So it's quite possible that children who had better performance on these tests were drawn to video games, rather than video games making them better.

E: Oh, interesting.

S: And specifically, it wasn't in everything. Specifically, they were better in the areas of response inhibition and working memory. And when they looked at functional imaging of their brains, the parts of the brain that correlate with those functions were more active. So it seems like the wiring of their brains actually was different, in a way that correlates with improved memory and response inhibition.

C: How significant was the response inhibition? I'm just loving this, and I'm so curious to see this data with groups of kids with ADHD. Because it's very common for kids with ADHD to do really well at video games; the ADHD does not inhibit their ability to play video games, because it's engaging. But the interesting thing is that kids with ADHD often suck at response inhibition. And the fact that gaming, again, we only know that this is correlative, but the fact that gaming seems like it probably has some positive effect on a kid being able to inhibit reactions is huge for ADHD kids.

S: There is a very, very plausible cause and effect here, because basically you get good at whatever you do, right? That's the basic thing that we in neuroscience have discovered over the last hundred years: you get good at what you do, and it doesn't really translate much beyond that, at least at a basic level. And a lot of video gaming is response inhibition; that's built into the skill of playing many games. Even in Beat Saber: cut the vegetables, but not the bombs. You know what I mean? Shoot the bad guys, but not the good guys. You have to be able to inhibit. There's a lot of response inhibition built into the skill set of playing games. And working memory: you're managing multiple things at the same time. There are a lot of moving parts to a lot of games.

E: Oh my gosh, I watch Rachel play these games sometimes, and I can't follow her. She's going so fast at what she's doing on the screen. Things are popping up, going away, disappearing. She's over here, she's over there. It's amazing.

S: It is amazing. And this is not a one-off. There is a lot of data saying that video games actually do correlate, if anything, with improved skills, depending on the type of game that you're playing. Games that require a lot of visual processing correlate with better visual processing skills, including in surgeons: surgeons do better with endoscopic surgery if they play video games than if they don't. So that is very, very plausible. Again, this study doesn't prove that that's the cause and effect, but the research in total leans in that direction, and there's, I think, a pretty consistent effect here. And I do think this is in line with technology in general. Think about how much access we have to information, to other opinions, just dealing with the technology itself. We need to be smarter to live in a modern technological society than in the past, so that's not surprising to me. The skill sets may be different; we may lament the loss of handwriting skills or whatever. But I think it takes way more intellectual, just raw processing power, to be able to keep up with society and interface with all the technology that we have in front of us.

C: But it can also exacerbate certain neuro-atypical styles. So if you're already having a hard time with focus and attention and you're constantly able to bounce from one thing to another, or you're already not comfortable sitting alone with your own thoughts, you don't have to do that in our kind of technological landscape.

S: Yeah. And this study also did look at things like video game addiction and lack of physical activity, so there are other issues as well, and I'm certainly not recommending that you play 21 hours a week of video games. I think they were just looking for a strong signal, which is why they wanted the effect size to be big. But everything in moderation, right? Get up, get your physical exercise, engage with the physical world. But video games are not rotting your brain, like parents say.

C: Exactly.

S: It's a different kind of cognitive engagement; it comes with its own skill sets. Like my daughters: I put them in front of video games since they were three. I mean, they were educational ones, like color matching and math and letters and things like that, but it still gave them computer skills. And again, my daughter kicks my ass on video games when we play together.

[01:04:37.680 --> 01:04:40.200] It's also Steve, it's like, it's fun, it's joyful.

[01:04:40.200 --> 01:04:43.160] And I think one of the things that we don't talk about enough, and I know that this is

[01:04:43.160 --> 01:04:45.120] like might sound a little bit wooey, but whatever.

[01:04:45.120 --> 01:04:50.880] I was just talking with a client today about this idea of like different pillars of mental

[01:04:50.880 --> 01:04:55.120] health and the things that we really need to rest on in order to feel kind of like fulfilled

[01:04:55.120 --> 01:04:56.120] and have a balanced life.

[01:04:56.120 --> 01:05:01.120] And we talked a lot about this one theoretical conceptualization, like think of a three legged

[01:05:01.120 --> 01:05:02.120] stool.

[01:05:02.120 --> 01:05:03.120] It has three legs.

[01:05:03.120 --> 01:05:05.660] If you kick one of them out, you're going to be like wobbly and fall over.

[01:05:05.660 --> 01:05:11.420] We talked about this idea of like work or education, whatever, uh, love, uh, whether

[01:05:11.420 --> 01:05:16.760] it's like familial or romantic or friendship, whatever, and then play.

[01:05:16.760 --> 01:05:21.360] And that very often the, as we get older and older and move more into adulthood, we sort

[01:05:21.360 --> 01:05:24.480] of forget how to play.

[01:05:24.480 --> 01:05:27.220] We don't do it enough in our lives.

[01:05:27.220 --> 01:05:31.800] We don't often do things that are joyful for the sake of joy.

[01:05:31.800 --> 01:05:32.800] I try.

C: Animals play. If you look at adult animals... you do, you have a gaming podcast. But if you look at animals, any animal that's a non-domesticate, but even domesticates as well, they play all the way into old age, even when they're elderly.

E: I don't know, I'm just speculating here, but maybe that's part of the rise of the popularity of nerd culture. In nerd culture it's okay to play, and it celebrates play as an adult: cosplay and video games.

B: I always said Dungeons and Dragons was an intellectual pursuit for me as much as it was fun.

J: Absolutely. Evan, you're totally right, man.

[01:06:19.400 --> 01:06:23.520] I would also say one last thing before we move on is that I do think that video games

[01:06:23.520 --> 01:06:27.960] are under leveraged in terms of just standard education.

[01:06:27.960 --> 01:06:31.480] We should video game cause video games are optimized for learning.

[01:06:31.480 --> 01:06:32.480] They're optimized.

[01:06:32.480 --> 01:06:33.480] They're individualized.

[01:06:33.480 --> 01:06:38.680] They, they move at the pace of, they are constantly keeping that carrot right in front of you.

[01:06:38.680 --> 01:06:40.540] They are optimized for learning.

[01:06:40.540 --> 01:06:44.320] And if you, and again, they're engaging and they're fun, they're everything learning is

[01:06:44.320 --> 01:06:47.560] supposed to be right there.

[01:06:47.560 --> 01:06:49.400] You know, all we have to do is leverage it.

[01:06:49.400 --> 01:06:50.400] Now I know that they're out there.

[01:06:50.400 --> 01:06:51.400] I know that they're out there.

[01:06:51.400 --> 01:06:54.960] I've looked into it even just again when I was writing about this recently, but they're

[01:06:54.960 --> 01:06:59.560] not core curriculum; they're an afterthought.

[01:06:59.560 --> 01:07:03.280] They are not incorporated into the core of our educational system.

[01:07:03.280 --> 01:07:05.880] That in my opinion is a missed opportunity.

[01:07:05.880 --> 01:07:06.880] It is.

[01:07:06.880 --> 01:07:11.160] And I'd say the same thing for, for intervention and treatment, both psychological and medical.

[01:07:11.160 --> 01:07:15.000] So when you see like neuro rehab, when you see a psychological intervention for certain

[01:07:15.000 --> 01:07:21.400] diagnoses, like if we could gamify these things and we can, that's the thing, they do exist.

[01:07:21.400 --> 01:07:22.400] It's huge.

[01:07:22.400 --> 01:07:23.400] It works.

[01:07:23.400 --> 01:07:24.400] But it can't be like cheesy.

[01:07:24.400 --> 01:07:26.120] Like it has to, it has to be authentic.

[01:07:26.120 --> 01:07:28.820] I think we need to get the stink off of video games.

[01:07:28.820 --> 01:07:33.920] I think there's still, you know, there's a stigma and there is a sort of rotting your

[01:07:33.920 --> 01:07:37.840] brain kind of thing that people react to, but no, this is just, it's another thing that

[01:07:37.840 --> 01:07:38.840] we do.

[01:07:38.840 --> 01:07:39.840] It's perfectly fine.

[01:07:39.840 --> 01:07:40.840] It's perfectly healthy.

[01:07:40.840 --> 01:07:44.240] It's actually cognitively healthy, may even be beneficial.

[01:07:44.240 --> 01:07:47.000] And it just needs to be one more thing that we do in modern society.

[01:07:47.000 --> 01:07:48.800] Steve, this isn't news.

[01:07:48.800 --> 01:07:49.800] This isn't news.

[01:07:49.800 --> 01:07:54.400] We've been reading about these studies for years now of how awesome they are.

[01:07:54.400 --> 01:07:55.400] Yeah.

[01:07:55.400 --> 01:07:56.400] Oh, I know.

[01:07:56.400 --> 01:08:01.360] And I want to ask you guys, who really legitimately is out there putting down video games today?

[01:08:01.360 --> 01:08:07.480] I think we have a memory of a lot of people

[01:08:07.480 --> 01:08:12.760] talking bad about video games, but I will remind you guys that video games are

[01:08:12.760 --> 01:08:16.000] a multi-billion dollar industry.

[01:08:16.000 --> 01:08:18.640] A ton of people are buying video games.

[01:08:18.640 --> 01:08:19.640] It's greater than movies.

[01:08:19.640 --> 01:08:20.640] Oh yeah.

[01:08:20.640 --> 01:08:22.520] It's bigger than the motion picture entertainment industry.

[01:08:22.520 --> 01:08:29.840] So you know, the reality is that an incredible number of people are playing video games,

[01:08:29.840 --> 01:08:33.080] like fully engaging, using them for social platforms.

[01:08:33.080 --> 01:08:37.440] You know, it's one of the first things that kids bond with each other in school

[01:08:37.440 --> 01:08:39.200] over: video games.

[01:08:39.200 --> 01:08:45.060] So I think that the stigma that we're talking about is, first off, it's probably largely

[01:08:45.060 --> 01:08:48.760] coming from people who have absolutely no clue about video games.

[01:08:48.760 --> 01:08:50.400] They really don't engage with them.

[01:08:50.400 --> 01:08:51.400] That's number one.

[01:08:51.400 --> 01:08:55.040] But we have a memory of people not liking video games now that we might still think

[01:08:55.040 --> 01:08:56.040] is current.

[01:08:56.040 --> 01:08:57.040] I don't think that's current.

[01:08:57.040 --> 01:09:02.560] I think most people completely accept them and this is a completely awesome, normal thing

[01:09:02.560 --> 01:09:03.560] that everybody does.

[01:09:03.560 --> 01:09:08.240] But even if they're a minority, it's still a vocal minority, because every

[01:09:08.240 --> 01:09:13.080] time there's a school shooting or something happens, people vilify video games, even

[01:09:13.080 --> 01:09:16.160] though there's no evidence for that.

[01:09:16.160 --> 01:09:17.160] No evidence.

[01:09:17.160 --> 01:09:18.480] You have to pull all the literature back out.

[01:09:18.480 --> 01:09:19.480] Yeah.

[01:09:19.480 --> 01:09:20.480] Yeah.

[01:09:20.480 --> 01:09:21.480] So let's move on.

New Aging Technique (01:09:21)

[01:09:21.480 --> 01:09:26.960] Bob, you're going to tell us about a new technique for aging stuff, like geological

[01:09:26.960 --> 01:09:27.960] layers mainly.

[01:09:27.960 --> 01:09:28.960] Yeah.

[01:09:28.960 --> 01:09:32.840] Well, basically rocks, if you just say rocks, that's a good general category of things that

[01:09:32.840 --> 01:09:36.680] this could be used for, but there's other things as well. Determining the ages of

[01:09:36.680 --> 01:09:42.520] rocks and meteorites and even cultural artifacts took a double incremental leap recently with

[01:09:42.520 --> 01:09:48.760] a new technique that can quickly and non-destructively date billion-year-old rocks. It kind of almost

[01:09:48.760 --> 01:09:52.120] reminds me of a Star Trek tricorder in some ways.

[01:09:52.120 --> 01:09:56.440] See if you agree; a little clickbaity, but maybe you'll see it.

[01:09:56.440 --> 01:09:59.480] So this innovation was made by Thermo Fisher Scientific.

[01:09:59.480 --> 01:10:04.200] Uh, this news comes from a group with the University of Chicago and the Field Museum

[01:10:04.200 --> 01:10:05.200] of Natural History.

[01:10:05.200 --> 01:10:11.040] And if you want to read about it in all its glorious jargon babble, plus acronyms and

[01:10:11.040 --> 01:10:16.600] initialisms, check it out in the Journal of Analytical Atomic Spectrometry.

[01:10:16.600 --> 01:10:21.040] So this uses radiometric dating, as you may have surmised. This is a widespread and

[01:10:21.040 --> 01:10:22.200] versatile technique.

[01:10:22.200 --> 01:10:27.240] It takes advantage of harmless radioactive impurities that were incorporated when rock,

[01:10:27.240 --> 01:10:28.760] for example, was formed.

[01:10:28.760 --> 01:10:32.600] The business end of the dating method though comes into play when you compare the amount

[01:10:32.600 --> 01:10:37.360] of that radioactive isotope, or nuclide, which is probably more accurate, to the amount

[01:10:37.360 --> 01:10:41.440] of its decay products, or daughter particles, if you will.

[01:10:41.440 --> 01:10:47.360] So since these decay products form at a very specific and constant rate based on what?

[01:10:47.360 --> 01:10:49.080] Based on its half-life, right?

[01:10:49.080 --> 01:10:54.960] Determining the sample's age based on that amount is then a fairly straightforward calculation.

[01:10:54.960 --> 01:11:00.840] We've been doing it since, I think, 1905, 1907, right around there, so well over a century.

[01:11:00.840 --> 01:11:06.880] Now this specific type of radiometric dating is specifically rubidium-strontium. Rubidium

[01:11:06.880 --> 01:11:14.240] 87 is unstable and will decay into a new element, strontium 87, with a half-life of almost 50

[01:11:14.240 --> 01:11:22.520] billion years. 50 billion! Amazing. Which of course makes it excellent for very, very

[01:11:22.520 --> 01:11:23.520] old things.

[01:11:23.520 --> 01:11:29.840] That means, of course, that half of the rubidium 87 will become the more stable strontium 87

[01:11:29.840 --> 01:11:32.160] in about 50 billion years.
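The "fairly straightforward calculation" Bob describes, turning a measured daughter-to-parent ratio into an age, can be sketched in a few lines of Python. This is a simplified illustration, not the study's actual method: it assumes no strontium-87 was present when the rock formed, whereas real Rb-Sr dating uses isochron plots to account for initial strontium.

```python
import math

RB87_HALF_LIFE_YEARS = 48.8e9  # rubidium-87 half-life, roughly 49 billion years

def rb_sr_age(daughter_to_parent):
    """Age implied by a measured Sr-87/Rb-87 ratio, assuming no initial Sr-87.

    Radioactive decay gives N_daughter/N_parent = e^(lambda * t) - 1, so
    t = ln(1 + N_daughter/N_parent) / lambda, with lambda = ln(2) / half-life.
    """
    decay_constant = math.log(2) / RB87_HALF_LIFE_YEARS
    return math.log(1 + daughter_to_parent) / decay_constant

# A ratio of 1.0 (equal parts daughter and parent, i.e. half the original
# rubidium has decayed) gives exactly one half-life, about 48.8 billion years.
# A 2.2-billion-year-old rock like Black Beauty has only accumulated a few
# percent as much strontium-87 as rubidium-87.
```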

[01:11:32.160 --> 01:11:36.280] Calculating that ratio has been done, like I said, since 1905, but it has a downside.

[01:11:36.280 --> 01:11:40.160] It can be a very destructive process, coming up with that number.

[01:11:40.160 --> 01:11:43.760] So if you're dating, for example, a large meteorite, you'd have to knock a

[01:11:43.760 --> 01:11:48.240] chunk of it off, crush it with a hammer or something, and then

[01:11:48.240 --> 01:11:54.720] dissolve the minerals using chemicals and then process all that in one of those ultra

[01:11:54.720 --> 01:11:59.340] clean laboratories, clean rooms, far cleaner than what Heisenberg used to make

[01:11:59.340 --> 01:12:00.720] his blue stuff.

[01:12:00.720 --> 01:12:05.760] Then you need a mass spectrometer to do the actual measuring of the isotopes or

[01:12:05.760 --> 01:12:08.200] nuclides to get the ratios.

[01:12:08.200 --> 01:12:12.960] So all that can take weeks to come up with the age of your sample. Weeks.

[01:12:12.960 --> 01:12:14.240] This is not quick.

[01:12:14.240 --> 01:12:18.960] Now the new technique takes a sample using a laser that will create a hole in your rock,

[01:12:18.960 --> 01:12:24.040] or whatever the sample is, and that hole will be comparable in size

[01:12:24.040 --> 01:12:30.720] to a human hair. A human hair! No pulverizing rock, no chunks breaking off, no ultra-clean

[01:12:30.720 --> 01:12:32.240] room, none of that needed.

[01:12:32.240 --> 01:12:37.180] It then analyzes the rubidium-strontium atom ratios with a cutting-edge mass

[01:12:37.180 --> 01:12:42.800] spectrometer that can spit out an answer, not in weeks, but just hours. Hours.

[01:12:42.800 --> 01:12:49.100] Imagine doing this over and over, and instead of waiting three, four weeks, in between breakfast

[01:12:49.100 --> 01:12:51.320] and lunch you've got your answer.

[01:12:51.320 --> 01:12:52.720] That must be incredible.

[01:12:52.720 --> 01:12:58.920] Nicolas Dauphas, who's the Louis Block Professor of Geophysical Sciences at the University

[01:12:58.920 --> 01:13:03.560] of Chicago and the first author of the study said, this is a huge advance.

[01:13:03.560 --> 01:13:07.660] There are many precious meteorites and artifacts that you don't want to destroy.

[01:13:07.660 --> 01:13:12.760] This allows you to tremendously minimize the impact you have during your analysis.

[01:13:12.760 --> 01:13:17.360] Now to be fair, and I was a little bummed about this next fact, the age estimate that

[01:13:17.360 --> 01:13:22.860] comes out is not quite as accurate as the gold standard, which is using thermal ionization

[01:13:22.860 --> 01:13:25.000] mass spectrometry.

[01:13:25.000 --> 01:13:29.540] And that's not part of this quick process with the laser.

[01:13:29.540 --> 01:13:34.040] So that's the one that takes a while, and apparently it's a little bit more accurate.

[01:13:34.040 --> 01:13:38.340] I'm not sure by how much it's more accurate than this new method, but the primary advantages

[01:13:38.340 --> 01:13:41.680] of this new technique are important, and they're threefold.

[01:13:41.680 --> 01:13:45.620] I mentioned that it's very fast and it's non-destructive, two huge pluses.

[01:13:45.620 --> 01:13:51.040] It can also handle samples with very small grain sizes, which can be problematic for

[01:13:51.040 --> 01:13:53.080] the thermal ionization mass spectrometry.

[01:13:53.080 --> 01:13:58.200] So it could broaden your potential number of samples that can be done.

[01:13:58.200 --> 01:14:03.680] So now to put this device through its paces, the researchers tested a famous Martian meteorite,

[01:14:03.680 --> 01:14:08.120] you may have heard of it, nicknamed Black Beauty, because it's such a beautiful, beautiful

[01:14:08.120 --> 01:14:09.120] dark color.

[01:14:09.120 --> 01:14:14.440] But if you look closely, you see spots of lighter hues though, which are older rocks

[01:14:14.440 --> 01:14:17.020] embedded within the newer rock, right?

[01:14:17.020 --> 01:14:20.800] So these types of rocks can be tricky to fully date.

[01:14:20.800 --> 01:14:22.740] So it's kind of like a meatball, right Jay?

[01:14:22.740 --> 01:14:29.380] The garlic and breadcrumbs in the meatball were created elsewhere and at different times,

[01:14:29.380 --> 01:14:33.320] but they later joined together for the creation of the meatball itself.

[01:14:33.320 --> 01:14:36.160] So good analogy Bob, thank you.

[01:14:36.160 --> 01:14:37.160] Yes.

[01:14:37.160 --> 01:14:41.440] Now Jay understands exactly what you mean.

[01:14:41.440 --> 01:14:43.440] Better than my average meatball analogies.

[01:14:43.440 --> 01:14:44.440] I think so.

[01:14:44.440 --> 01:14:45.440] Yeah.

[01:14:45.440 --> 01:14:46.440] Yeah.

[01:14:46.440 --> 01:14:47.440] It worked well.

[01:14:47.440 --> 01:14:52.400] So in this scenario though, ideally you'd want to know not only the age of the

[01:14:52.400 --> 01:14:58.400] gestalt rock, if you will, but the tiny older specks of rock inside as well.

[01:14:58.400 --> 01:15:03.960] Once you have that information, you can put together a much more cohesive, full history

[01:15:03.960 --> 01:15:08.000] that would then give you insight into, for example, what the environment on Mars was

[01:15:08.000 --> 01:15:13.560] like volcanically and atmospherically at various ancient periods of its history, because you

[01:15:13.560 --> 01:15:18.120] know when the older rocks were formed and then you would know when and what the environment

[01:15:18.120 --> 01:15:24.120] was like when the bigger rock was formed around those older little tiny pebbles inside.

[01:15:24.120 --> 01:15:29.800] And of course using this, it'd be invaluable for creating essentially a timeline of Mars's

[01:15:29.800 --> 01:15:30.800] history.

[01:15:30.800 --> 01:15:33.300] Now the age of black beauty was assessed before.

[01:15:33.300 --> 01:15:37.920] This has been done before using older techniques, but different studies didn't agree on the

[01:15:37.920 --> 01:15:38.920] age.

[01:15:38.920 --> 01:15:42.600] There was some disagreement; nobody knew, you know, which one's right.

[01:15:42.600 --> 01:15:47.600] The latest test that was just done using this new technique calculated that the meteorite

[01:15:47.600 --> 01:15:53.480] was created 2.2 billion years ago, which of course the team believes is, you know, when

[01:15:53.480 --> 01:15:54.480] it formed.

[01:15:54.480 --> 01:15:59.120] In the paper though, I want to mention this, in the paper they said that 2.2 billion years

[01:15:59.120 --> 01:16:04.440] represents the age of lithification and I immediately fell in love with that form of

[01:16:04.440 --> 01:16:05.440] the word.

[01:16:05.440 --> 01:16:06.440] I never heard that word, lithification.

[01:16:06.440 --> 01:16:11.800] I'm going to try to use that in everyday talk, well maybe once a week at least.

[01:16:11.800 --> 01:16:16.000] And lithification just means when fresh grains of sediment are turned into rock.

[01:16:16.000 --> 01:16:22.760] So Nicolas Dauphas said also this was a particularly good tool to solve this controversy and I

[01:16:22.760 --> 01:16:24.840] guess they do consider it solved.

[01:16:24.840 --> 01:16:25.840] That's interesting.

[01:16:25.840 --> 01:16:29.940] When you chip out a piece of rock to test the old way, it's possible you are getting

[01:16:29.940 --> 01:16:33.400] other fragments mixed in, which may affect your results.

[01:16:33.400 --> 01:16:35.440] We do not have that problem with the new machine.

[01:16:35.440 --> 01:16:39.840] So that's probably why they're extra confident, because you can be so precise with

[01:16:39.840 --> 01:16:40.840] that laser.

[01:16:40.840 --> 01:16:45.000] You're not going to get any other fragments mixed in messing with your results.

[01:16:45.000 --> 01:16:46.000] Okay.

[01:16:46.000 --> 01:16:47.000] So that does make sense.

[01:16:47.000 --> 01:16:48.000] Okay.

[01:16:48.000 --> 01:16:51.280] So for the future, scientists are excited about this advance obviously and they think it will

[01:16:51.280 --> 01:16:54.400] ultimately be useful in a variety of fields.

[01:16:54.400 --> 01:16:59.520] Dauphas and his team are especially interested in using this technique to understand more

[01:16:59.520 --> 01:17:01.440] fully the history of Mars.

[01:17:01.440 --> 01:17:07.160] For example, like the history of water on Mars' surface and how the solar system itself

[01:17:07.160 --> 01:17:08.160] formed.

[01:17:08.160 --> 01:17:13.200] In the next decade, scientists are going to have an embarrassment of riches of planetary

[01:17:13.200 --> 01:17:14.360] samples.

[01:17:14.360 --> 01:17:15.520] This surprised me.

[01:17:15.520 --> 01:17:19.420] Over the next 10 years, we're going to have samples, more samples of the moon, right?

[01:17:19.420 --> 01:17:21.040] New moon mission samples.

[01:17:21.040 --> 01:17:25.840] U.S. and China are definitely taking a major interest in the moon, obviously.

[01:17:25.840 --> 01:17:30.120] Also samples from an asteroid called Bennu, we'll have samples from there.

[01:17:30.120 --> 01:17:35.800] Mars' moon Phobos in 2027, and the Perseverance rover on Mars is collecting samples, and we will

[01:17:35.800 --> 01:17:37.720] have samples from that.

[01:17:37.720 --> 01:17:41.400] So using this technique will be an invaluable tool.

[01:17:41.400 --> 01:17:45.160] Nicolas Dauphas said, I'll end with his quote, we are very excited by this demonstration

[01:17:45.160 --> 01:17:49.320] study as we think that we will be able to employ the same approach to date rocks that

[01:17:49.320 --> 01:17:52.580] will be returned by multiple space missions in the future.

[01:17:52.580 --> 01:17:57.640] The next decade is going to be mind blowing in terms of planetary exploration.

[01:17:57.640 --> 01:18:05.360] So, cool new device for dating rocks, and an interesting technique. Radiometric

[01:18:05.360 --> 01:18:13.380] dating is fascinating, and if you drill down, just the whole process of atoms decaying, isotopes

[01:18:13.380 --> 01:18:17.320] and nuclides and everything, is just obviously very interesting to me.

[01:18:17.320 --> 01:18:18.320] So dig deeper.

[01:18:18.320 --> 01:18:19.320] Yeah, it's an amazing tool.

NASA UAP Study (01:18:19)

  • [link_URL title][4]

[01:18:19.320 --> 01:18:25.280] All right, Evan, finish up the news items telling us about NASA's new committee on

[01:18:25.280 --> 01:18:27.080] unidentified aerial phenomena.

[01:18:27.080 --> 01:18:28.280] Yeah, that's right.

[01:18:28.280 --> 01:18:29.500] This is an interesting one.

[01:18:29.500 --> 01:18:36.240] Back in June of this year, NASA had announced the formation of a study team to examine unidentified

[01:18:36.240 --> 01:18:43.520] aerial phenomena, UAPs, which we've talked about many times on the show, specifically

[01:18:43.520 --> 01:18:45.600] from a scientific perspective.

[01:18:45.600 --> 01:18:49.120] That's right in their mission statement effectively from the first sentence.

[01:18:49.120 --> 01:18:53.840] But the news this week since then is that NASA has finally selected the individuals

[01:18:53.840 --> 01:18:58.760] who are going to participate in this study team and their work began this week, this

[01:18:58.760 --> 01:19:00.200] past Monday.

[01:19:00.200 --> 01:19:02.480] They've rolled up their sleeves and gotten to work.

[01:19:02.480 --> 01:19:09.080] According to the statement, the study is going to focus on identifying available data, how

[01:19:09.080 --> 01:19:15.920] best to collect future data and how NASA can use that data to move the scientific understanding

[01:19:15.920 --> 01:19:18.200] of UAPs forward.

[01:19:18.200 --> 01:19:25.200] Thomas Zurbuchen, who's the associate administrator for science at NASA, says that NASA believes

[01:19:25.200 --> 01:19:31.680] that the tools of scientific discovery are powerful and apply here, meaning UAP investigations,

[01:19:31.680 --> 01:19:35.280] and we have access to a broad range of observations of Earth from space.

[01:19:35.280 --> 01:19:38.120] That is the lifeblood of scientific inquiry.

[01:19:38.120 --> 01:19:42.460] We have the tools and the team who can help us improve our understanding of the unknown.

[01:19:42.460 --> 01:19:46.360] That's the very definition of what science is and that's what we do.

[01:19:46.360 --> 01:19:47.360] Okay.

[01:19:47.360 --> 01:19:49.560] Does this remind anyone of Project Blue Book?

[01:19:49.560 --> 01:19:50.560] Yeah.

[01:19:50.560 --> 01:19:51.560] Oh, yeah.

[01:19:51.560 --> 01:19:53.840] Did that come to mind when you guys heard about this?

[01:19:53.840 --> 01:19:58.520] And of course, I know we've talked about Project Blue Book before, maybe it's been years,

[01:19:58.520 --> 01:20:04.640] but for those of you who don't know, this was a systematic study of unidentified flying

[01:20:04.640 --> 01:20:09.500] objects, as they used to be called, UFOs, and it was conducted by the United States

[01:20:09.500 --> 01:20:14.440] Air Force starting in 1952, was terminated in 1969.

[01:20:14.440 --> 01:20:20.760] So they spent the better part of two decades researching UFOs, and

[01:20:20.760 --> 01:20:25.000] they decided to close it down in 1969, but here was the basic summary of

[01:20:25.000 --> 01:20:26.760] those investigations.

[01:20:26.760 --> 01:20:32.360] Number one, no UFO reported, investigated, and evaluated was ever an indication of a

[01:20:32.360 --> 01:20:35.180] threat to national security, okay.

[01:20:35.180 --> 01:20:39.940] Number two, there was no evidence submitted or discovered that the sightings categorized

[01:20:39.940 --> 01:20:44.400] as unidentified represented technological developments or principles beyond the range

[01:20:44.400 --> 01:20:46.740] of modern scientific knowledge.

[01:20:46.740 --> 01:20:51.000] And number three, there was no evidence indicating that sightings categorized as unidentified

[01:20:51.000 --> 01:20:53.560] were extraterrestrial in nature.

[01:20:53.560 --> 01:20:56.360] So long time spent, nothing to see here.

[01:20:56.360 --> 01:21:00.920] I have a feeling we're going to wind up having some similar results with the UAP investigation.

[01:21:00.920 --> 01:21:04.280] Well, of course the US Air Force would say that.

[01:21:04.280 --> 01:21:05.280] Come on, Evan.

[01:21:05.280 --> 01:21:10.880] Again, this was in the pre-NASA days, really, but, you know, I suppose it

[01:21:10.880 --> 01:21:17.680] sort of makes sense that NASA now is going to take the helm in the new version

[01:21:17.680 --> 01:21:20.960] of Project Blue Book in their own way.

[01:21:20.960 --> 01:21:25.840] This part of the study, though, and the people that were selected for this has to do with

[01:21:25.840 --> 01:21:30.360] laying the groundwork for future study of UAPs for NASA.

[01:21:30.360 --> 01:21:35.900] So they're basically going to, I think, set the table as far as, okay, what can we see,

[01:21:35.900 --> 01:21:40.480] what do we have, what's reliable, and what's the best way to go about doing this?

[01:21:40.480 --> 01:21:44.360] And that's what they're going to be doing over the course of the next nine months.

[01:21:44.360 --> 01:21:50.760] So middle of next year, they're going to release their report and their findings, and we're

[01:21:50.760 --> 01:21:57.160] going to learn some more about what their expert opinions are as far as how UAPs should

[01:21:57.160 --> 01:22:00.360] be studied overall.

[01:22:00.360 --> 01:22:05.160] And look, it's an impressive list of people with some very

[01:22:05.160 --> 01:22:06.960] good credentials.

[01:22:06.960 --> 01:22:16.280] I didn't see anyone who had any questionable or cranky kind of credentials or certifications

[01:22:16.280 --> 01:22:17.280] or anything like that.

[01:22:17.280 --> 01:22:23.540] These all seem like reputable people with a lot of citations and publications.

[01:22:23.540 --> 01:22:30.240] You've got professors, you've got the deputy project scientist for the Vera C. Rubin Observatory

[01:22:30.240 --> 01:22:38.680] as part of this team, you've got an oceanographer as part of this team, you've got the undersecretary

[01:22:38.680 --> 01:22:44.520] for science and technology at the U.S. Department of Homeland Security here, you've got a freelance

[01:22:44.520 --> 01:22:51.560] science journalist with certain science credentials, you've got the person who's responsible for

[01:22:51.560 --> 01:22:56.840] something called the Artemis Accords, which establish the norms of behavior in space,

[01:22:56.840 --> 01:23:02.120] a former NASA astronaut and test pilot is part of this group, someone from the DAVINCI mission

[01:23:02.120 --> 01:23:07.440] to Venus, so you're pulling from some people who have some real good scientific bona

[01:23:07.440 --> 01:23:08.440] fides here.

[01:23:08.440 --> 01:23:13.480] However, however, with that said, I have a couple of observations or questions.

[01:23:13.480 --> 01:23:19.600] And the first thing that came to mind is, well, I didn't see anyone specific to a skeptics

[01:23:19.600 --> 01:23:24.120] organization or skeptics community among the group.

[01:23:24.120 --> 01:23:27.420] Not that I could tell and not from what I could research on some of these folks online

[01:23:27.420 --> 01:23:29.320] if they had any real affiliations.

[01:23:29.320 --> 01:23:33.040] Now there are a couple of people who are involved with SETI.

[01:23:33.040 --> 01:23:34.360] So perhaps that's something.

[01:23:34.360 --> 01:23:39.440] I think two of the 16 are affiliated with the SETI Institute.

[01:23:39.440 --> 01:23:41.240] So perhaps that's something.

[01:23:41.240 --> 01:23:48.820] I'm also wondering, as part of this, who will these 16 individuals be referencing or relying

[01:23:48.820 --> 01:23:55.840] on, or making phone calls and having conversations with? Who will indirectly

[01:23:55.840 --> 01:23:58.460] be involved with this project as well?

[01:23:58.460 --> 01:24:03.720] In other words, will someone reach out to maybe Mick West to get his opinion,

[01:24:03.720 --> 01:24:10.600] since he has done a lot of analysis on the UAP phenomenon that has gained steam over

[01:24:10.600 --> 01:24:12.240] the past 10 years?

[01:24:12.240 --> 01:24:19.360] Will they be talking to anyone in the medical sciences, neurologists, psychologists, the

[01:24:19.360 --> 01:24:24.640] people who can, you know, know about how the human brain works and leads people astray?

[01:24:24.640 --> 01:24:28.400] Will they be speaking to some of the lead engineers that have designed the

[01:24:28.400 --> 01:24:33.280] types of equipment that are capturing the things that they are trying to analyze?

[01:24:33.280 --> 01:24:39.040] So there's nothing specific about that; it may flesh out once we have the report.

[01:24:39.040 --> 01:24:44.780] Number two, another thing I have here is that I think it would be good, in my opinion,

[01:24:44.780 --> 01:24:51.280] if they were to come up with perhaps a statement or something right off the bat, something

[01:24:51.280 --> 01:24:53.960] like, and I just came up with this in my own head,

[01:24:53.960 --> 01:24:57.560] so today, as sort of a part of their mission statement or something,

[01:24:57.560 --> 01:25:02.160] they could say all the evidence cited as proof of alien life or alien technology

[01:25:02.160 --> 01:25:04.320] amounts to zero, right?

[01:25:04.320 --> 01:25:12.520] Kind of deflate the whole UFO extraterrestrial proponent crowd out there.

[01:25:12.520 --> 01:25:13.520] Yeah.

[01:25:13.520 --> 01:25:18.360] My big thought is that they should be doing it, but only if they're going to do it right.

[01:25:18.360 --> 01:25:26.400] And that means understanding that this is ninety nine percent a public outreach mission.

[01:25:26.400 --> 01:25:32.980] What they're doing by doing this, this is all about educating the public about the nature

[01:25:32.980 --> 01:25:41.320] of evidence as it relates to aerial phenomena, which means that their investigative

[01:25:41.320 --> 01:25:46.260] team needs to have people who know how to communicate to the public about these

[01:25:46.260 --> 01:25:49.200] things and engage with the believers, et cetera.

[01:25:49.200 --> 01:25:51.320] And so they need science communicators.

[01:25:51.320 --> 01:25:56.320] And because of the nature of this study, they need skeptics, basically.

[01:25:56.320 --> 01:26:01.520] And the fact that they don't have people on there who explicitly have that skill set

[01:26:01.520 --> 01:26:03.160] is concerning to me.

[01:26:03.160 --> 01:26:04.960] Richard, what do you think?

[01:26:04.960 --> 01:26:09.000] You know, what occurred to me is something I've mentioned on my own podcast because,

[01:26:09.000 --> 01:26:14.960] you know, I've been interested in UFOs, now UAPs, since whenever I can remember, is that

[01:26:14.960 --> 01:26:17.480] you remember Chariots of the Gods, question mark?

[01:26:17.480 --> 01:26:18.480] Oh, yeah.

[01:26:18.480 --> 01:26:19.480] Sure.

[01:26:19.480 --> 01:26:20.480] Yeah.

[01:26:20.480 --> 01:26:21.480] Erich von Däniken.

[01:26:21.480 --> 01:26:22.480] My God.

[01:26:22.480 --> 01:26:26.300] What amuses me to this day is that when you see the video, the documentary made in the

[01:26:26.300 --> 01:26:33.400] early 70s there, they're drawing an analogy between the then state of the art spacecraft,

[01:26:33.400 --> 01:26:38.000] which was the Apollo space things, to alien craft.

[01:26:38.000 --> 01:26:42.320] And I'm thinking, after all these decades, the alien craft really haven't advanced very

[01:26:42.320 --> 01:26:43.320] much, have they?

[01:26:43.320 --> 01:26:47.840] You know, in all the decades we've been stuck, they're mostly doing the same stuff.

[01:26:47.840 --> 01:26:53.000] I'm thinking, why are these aliens stuck at a certain level of their technology?

[01:26:53.000 --> 01:26:56.560] Haven't they advanced decades and decades and decades like we have?

[01:26:56.560 --> 01:27:02.940] And in 30, 40, 50, 60 years, what's going to be zipping around our atmosphere, made by us,

[01:27:02.940 --> 01:27:08.160] will put any of, you know, the past UFOs to shame, surely. My two cents.

[01:27:08.160 --> 01:27:13.880] Yeah, it's like a lot of the paranormal phenomena or, you know, these kinds of conspiracy phenomena,

[01:27:13.880 --> 01:27:16.480] they always seem to track with our existing technology.

[01:27:16.480 --> 01:27:17.480] Yes.

[01:27:17.480 --> 01:27:22.820] You know, it's like ghost photos are always something that emerges from current photographic

[01:27:22.820 --> 01:27:23.820] technology.

[01:27:23.820 --> 01:27:24.820] Yes.

[01:27:24.820 --> 01:27:25.820] Whatever.

[01:27:25.820 --> 01:27:26.820] You know what I mean?

[01:27:26.820 --> 01:27:32.680] It always seems to track along with that because it's an artifact of, you know, our own technology.

[01:27:32.680 --> 01:27:33.680] That's right.

[01:27:33.680 --> 01:27:34.680] Yeah.

[01:27:34.680 --> 01:27:35.680] All right.

[01:27:35.680 --> 01:27:36.680] Thanks, Evan.

What’s the Word? (01:27:36)

[01:27:36.680 --> 01:27:38.280] Kara, you're going to do a What's the Word?

[01:27:38.280 --> 01:27:39.360] Yes, we are.

[01:27:39.360 --> 01:27:48.480] And this week's word comes from listener Viggo Telifsen Vifstad, maybe, from Norway.

[01:27:48.480 --> 01:27:53.560] And he says, I was wondering if you would be interested in doing a deep dive into the

[01:27:53.560 --> 01:27:55.540] meaning of the word regression.

[01:27:55.540 --> 01:27:59.680] This is a word I've come across in many different settings, but it seems to have a somewhat

[01:27:59.680 --> 01:28:03.360] different meaning based on the context.

[01:28:03.360 --> 01:28:05.080] What do you mean by regression?

[01:28:05.080 --> 01:28:07.880] What do you mean?

[01:28:07.880 --> 01:28:09.920] That's sort of funny.

[01:28:09.920 --> 01:28:12.520] Oh, no, it is.

[01:28:12.520 --> 01:28:16.240] We'll get back to regression to the mean in a second.

[01:28:16.240 --> 01:28:18.720] I like that he actually does a little bit of the deep diving.

[01:28:18.720 --> 01:28:21.720] He did a little bit of the heavy lifting for us, so I'm going to actually just read some

[01:28:21.720 --> 01:28:24.200] of the things that he wrote because they're quite helpful.

[01:28:24.200 --> 01:28:25.280] Here are some examples.

[01:28:25.280 --> 01:28:30.320] Number one, in my field, so he works in applied machine learning, a regression model is a

[01:28:30.320 --> 01:28:35.880] type of machine learning model that outputs continuous data, meaning a decimal or real

[01:28:35.880 --> 01:28:40.400] number as opposed to a discrete value like an integer or a category.

[01:28:40.400 --> 01:28:45.240] In economics, regression is used to indicate the return of an undesirable state.

[01:28:45.240 --> 01:28:49.460] In medicine, regression can be used to describe when diseases decrease in severity or size

[01:28:49.460 --> 01:28:54.120] or when a patient's experiencing loss of memories, loss of acquired skills, they're regressing.

[01:28:54.120 --> 01:28:58.280] In stats, we have regression towards the mean, which is a concept that tells us that observations

[01:28:58.280 --> 01:29:01.960] following a randomly sampled extreme outlier are likely to be closer to the mean.

[01:29:01.960 --> 01:29:05.520] Interestingly, Viggo does not talk about the actual, like what I think of when I think

[01:29:05.520 --> 01:29:10.440] of regression, which is a regression analysis and statistics, not regression to the mean,

[01:29:10.440 --> 01:29:13.320] the actual statistical function of regression.

[01:29:13.320 --> 01:29:14.960] So we'll get back to that.

[01:29:14.960 --> 01:29:19.080] And then he also says in astronomy, regression can apparently mean, and this one was interesting

[01:29:19.080 --> 01:29:22.740] to me too, I had to dive into this, retrograde motion.

[01:29:22.740 --> 01:29:28.480] So they do use the term nodal regression to refer to a shift of a satellite orbit's line

[01:29:28.480 --> 01:29:31.720] of nodes over time as the Earth revolves around the Sun.

[01:29:31.720 --> 01:29:34.860] But I found that that's actually called nodal precession.

[01:29:34.860 --> 01:29:37.320] And we talk about precession on the show a lot, right?

[01:29:37.320 --> 01:29:43.040] Yeah, like we've talked about this idea, the rotating body is not actually a perfect sphere.

[01:29:43.040 --> 01:29:46.060] So there's a non-uniform gravitational field.

[01:29:46.060 --> 01:29:52.820] This thing, as it goes through the orbital plane, the satellite kind of moves around.

[01:29:52.820 --> 01:29:57.180] There is a precession, but apparently that's also called nodal regression.

[01:29:57.180 --> 01:30:00.480] So we hear this word, it seems to mean a lot of different things.

[01:30:00.480 --> 01:30:03.760] There are some core features that bind them all together.

[01:30:03.760 --> 01:30:09.080] And that's really that when we look at the etymology of the word, it takes us back to

[01:30:09.080 --> 01:30:15.180] regresse, which is a return, a passage back, an act of going back.

[01:30:15.180 --> 01:30:19.680] And that's from the Latin regressus, so a retreating, a going back, which we think ultimately

[01:30:19.680 --> 01:30:24.520] goes back to a PIE root, which I can't even pronounce, G-H-R-E-D-H.

[01:30:24.520 --> 01:30:27.000] Remember PIE is Proto-Indo-European.

[01:30:27.000 --> 01:30:32.940] So this is what is theorized or hypothesized to predate Greek and Latin.

[01:30:32.940 --> 01:30:41.440] And we see this in a ton of different words, aggressive, digression, egress, progress,

[01:30:41.440 --> 01:30:48.180] transgression, retrogress, tardigrade, gradual, graduate, degrade, degree, like these all

[01:30:48.180 --> 01:30:50.340] seem to come from that same root.

[01:30:50.340 --> 01:30:56.160] But as we build on it and we look first to the Latin and then to later usages, this kind

[01:30:56.160 --> 01:31:01.760] of regressus looks like going back, going back to a former kind of form, walking back,

[01:31:01.760 --> 01:31:06.280] going back, returning, a passage back.

[01:31:06.280 --> 01:31:10.800] In medicine, it looks like it was first used a lot earlier, like in the 1600s, this idea

[01:31:10.800 --> 01:31:15.940] of going back to a certain condition or like relapsing, like this person seems to be regressing.

[01:31:15.940 --> 01:31:20.000] We saw that first used in like the 1600s.

[01:31:20.000 --> 01:31:24.480] Prior to that, it was used more kind of generally philosophically, scientifically, this idea

[01:31:24.480 --> 01:31:26.880] of returning to a point of departure.

[01:31:26.880 --> 01:31:31.080] So it could be your actions, or even, like, measurements.

[01:31:31.080 --> 01:31:35.960] Now we started to see things get really interesting in the 1800s with Galton.

[01:31:35.960 --> 01:31:38.120] You guys remember Francis Galton?

[01:31:38.120 --> 01:31:39.120] Have you heard of Pearson?

[01:31:39.120 --> 01:31:42.460] Well, oftentimes he's lumped in with Spearman and Pearson and Galton.

[01:31:42.460 --> 01:31:50.000] So he was a statistician and eugenicist, like most of them were at

[01:31:50.000 --> 01:31:51.000] the time.

[01:31:51.000 --> 01:31:52.000] Yes.

[01:31:52.000 --> 01:31:56.160] And so it's really interesting because Galton seems to have coined the term regression to

[01:31:56.160 --> 01:31:57.440] the mean.

[01:31:57.440 --> 01:32:01.520] He was actually talking about regression toward mediocrity.

[01:32:01.520 --> 01:32:03.920] That is the phrase that he used.

[01:32:03.920 --> 01:32:08.480] And basically the idea was that offspring, because he was a geneticist, eugenicist, it

[01:32:08.480 --> 01:32:14.000] was the same thing at the time, not completely, but many of them were, offspring deviated

[01:32:14.000 --> 01:32:19.120] less from the mean value of the population than their parents did when you started to

[01:32:19.120 --> 01:32:24.660] look at the statistics as a population-wide phenomenon.

[01:32:24.660 --> 01:32:31.480] So he was saying that this is why we saw change via inheritance.

[01:32:31.480 --> 01:32:37.680] And then he started to come up with this idea of regression towards mediocrity.

[01:32:37.680 --> 01:32:44.360] And eventually that led to the idea of regression to the mean, which you already kind of mentioned

[01:32:44.360 --> 01:32:47.900] Evan, and we've talked about it a lot on the show.

[01:32:47.900 --> 01:32:55.520] But the idea here is that if you take a random sample of a normal distribution and it's an

[01:32:55.520 --> 01:33:01.080] extreme sample, the next sample is likely to be closer to the mean, median, or mode.

[01:33:01.080 --> 01:33:05.360] So that's because the normal distribution has more things in the middle than in the

[01:33:05.360 --> 01:33:06.500] extremes.

[01:33:06.500 --> 01:33:10.780] And so the more you sample, the more likely you are to regress to the mean.

[01:33:10.780 --> 01:33:15.600] You're more likely to get results that are less extreme.
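(A short simulation makes the effect Cara describes here concrete. The numbers below, repeated noisy measurements of a quantity with true mean 100 and standard deviation 15, are purely illustrative and not from the episode.)

```python
import random

random.seed(1)

def noisy_score(n=10):
    # Mean of n draws from a normal distribution (true mean 100, sd 15),
    # standing in for one noisy measurement of the same underlying quantity.
    return sum(random.gauss(100, 15) for _ in range(n)) / n

# Measure everything twice, then select only the pairs whose FIRST
# measurement landed in the top 10%.
pairs = [(noisy_score(), noisy_score()) for _ in range(10_000)]
cutoff = sorted(p[0] for p in pairs)[9_000]
extreme = [p for p in pairs if p[0] >= cutoff]

first_avg = sum(p[0] for p in extreme) / len(extreme)
second_avg = sum(p[1] for p in extreme) / len(extreme)

# Nothing changed between measurements, yet the re-measured values of the
# selected (extreme) group sit closer to the true mean of 100.
print(f"first: {first_avg:.1f}  second: {second_avg:.1f}")
```

Selecting on an extreme first measurement guarantees nothing about the second, so the second regresses toward the mean, exactly the statistical point being made.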

[01:33:15.600 --> 01:33:20.440] I looked at a few different internet forums, and I'm hoping that this is helpful to you,

[01:33:20.440 --> 01:33:25.200] Viggo, when we look at your actual field, which is applied machine learning.

[01:33:25.200 --> 01:33:28.120] Because as you described it, and you described it better than a lot of the descriptions I

[01:33:28.120 --> 01:33:33.400] found online, a regression model is a type of machine learning model that outputs continuous

[01:33:33.400 --> 01:33:34.400] data.

[01:33:34.400 --> 01:33:40.700] So we're talking like real numbers, you know, 1.76531, as opposed to discrete data, so integers

[01:33:40.700 --> 01:33:45.160] or categories, things that you can't subdivide, right?
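(A minimal sketch of the distinction Viggo draws: a regression model outputs a continuous real number, where a classifier would output a discrete category. The toy data and least-squares line below are invented for illustration, not from the episode.)

```python
# Ordinary least-squares fit of y ~ a*x + b -- the simplest regression model.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return a, my - a * mx

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]                     # roughly y = 2x, with noise

a, b = fit_line(xs, ys)
prediction = a * 3.0 + b                      # regression output: a continuous value
label = "high" if prediction > 5 else "low"   # what a classifier would emit instead

print(f"{prediction:.2f} -> {label}")
```

The regression output can take any real value (here roughly 5.97), while the classifier collapses the same input into one of a fixed set of categories.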

[01:33:45.160 --> 01:33:54.120] What does that have to do with this foundational, etymological root or definition of regression?

[01:33:54.120 --> 01:33:58.560] So I've looked at a couple different forums, and I've seemed to come up with a couple

[01:33:58.560 --> 01:34:00.960] different explanations.

[01:34:00.960 --> 01:34:03.920] One of them comes from a guy, and I don't know if he's trying to square peg a round

[01:34:03.920 --> 01:34:07.120] hole here, but I think it's interesting what he said.

[01:34:07.120 --> 01:34:12.360] Regression comes from regress, right, from the Latin regressus.

[01:34:12.360 --> 01:34:16.880] So in that sense, regression in machine learning is the technique that allows one to,

[01:34:16.880 --> 01:34:23.600] quote, go back from messy, hard-to-interpret data to a clearer and more meaningful model.

[01:34:23.600 --> 01:34:27.120] And that may be where the term came from in machine learning.

[01:34:27.120 --> 01:34:31.520] And then you have other people explaining on these same kind of internet forums that

[01:34:31.520 --> 01:34:39.180] machine learning, I guess, engineers, scientists, computer scientists, love to take terms from

[01:34:39.180 --> 01:34:42.920] other branches of science and use them the way they want to use them, and apparently

[01:34:42.920 --> 01:34:47.320] it has no basis in the actual definition of the term regression.

[01:34:47.320 --> 01:34:48.880] So I'm not sure which is correct.

[01:34:48.880 --> 01:34:51.720] I can't seem to find the answer online.

[01:34:51.720 --> 01:34:56.560] But it's either that it's kind of a misnomer in machine learning or, and this might just

[01:34:56.560 --> 01:35:02.560] be a backward justification, or it has to do with what I said earlier, this idea that

[01:35:02.560 --> 01:35:06.600] using these regression models in machine learning, because I don't fully understand them, allow

[01:35:06.600 --> 01:35:12.200] the algorithm to basically go back to a cleaner, more meaningful model.

[01:35:12.200 --> 01:35:17.420] So go back away from harder-to-interpret data back to data that's cleaner and more meaningful.

[01:35:17.420 --> 01:35:19.240] So in that way, it would be regressing.

[01:35:19.240 --> 01:35:25.120] I'd be curious, Viggo, if you get this, if you want to reply again on the contact form

[01:35:25.120 --> 01:35:30.400] and let us know what you thought about the descriptions that I found online about machine

[01:35:30.400 --> 01:35:31.400] learning.

[01:35:31.400 --> 01:35:36.880] I think most of the other uses sort of come back to this foundational definition.

[01:35:36.880 --> 01:35:37.880] All right.

[01:35:37.880 --> 01:35:38.880] Thanks, Cara.

[01:35:38.880 --> 01:35:39.880] Yep.

Who's That Noisy? (01:35:39)


New Noisy (1:39:01)

[_short_vague_description_of_Noisy]

short_text_from_transcript

[01:35:39.880 --> 01:35:40.880] Jay, it's Who's That Noisy Time?

[01:35:40.880 --> 01:35:52.320] All right, guys, last week I played this noisy.

[01:35:52.320 --> 01:35:53.320] I love that noisy.

[01:35:53.320 --> 01:35:54.320] It's so silly.

[01:35:54.320 --> 01:35:55.320] That's a good noisy.

[01:35:55.320 --> 01:35:56.320] I like that.

[01:35:56.320 --> 01:35:57.320] Right?

[01:35:57.320 --> 01:35:58.320] All right.

[01:35:58.320 --> 01:36:00.180] So I had many people write in on this one.

[01:36:00.180 --> 01:36:04.280] One listener named William Steele said, Hi, Jay, this week's noisy sounds a bit like the

[01:36:04.280 --> 01:36:07.520] Star Wars blaster sound from rapping on a long cable.

[01:36:07.520 --> 01:36:12.640] I will invoke a bit of that and say that this is a flywheel or some sort of long drive shaft

[01:36:12.640 --> 01:36:16.380] in a tube, maybe some naval application.

[01:36:16.380 --> 01:36:17.380] That is incorrect.

[01:36:17.380 --> 01:36:21.520] But I mean, I totally see where you were going with that description.

[01:36:21.520 --> 01:36:26.000] I have another listener here named Patrick McComb, who said it's a cymbal rotating to

[01:36:26.000 --> 01:36:29.360] a stop on a hard surface like a Euler's disk.

[01:36:29.360 --> 01:36:31.360] Again, I could kind of hear that.

[01:36:31.360 --> 01:36:37.360] I mean, I think a cymbal would have been a little bit higher pitched or tinnier sounding

[01:36:37.360 --> 01:36:40.200] to be on the nose with that.

[01:36:40.200 --> 01:36:43.680] Another listener named Matthew Killick wrote in and said, Hey, Jay, newish listener, first

[01:36:43.680 --> 01:36:44.680] time guessing.

[01:36:44.680 --> 01:36:49.080] I think the noisy this week is a super taut wire connected at both ends that's being spun

[01:36:49.080 --> 01:36:50.080] really fast.

[01:36:50.080 --> 01:36:54.720] Now, I've heard something like this, and I think you are correct when it comes to this

[01:36:54.720 --> 01:37:00.480] guess, as far as it goes. I've heard that, and I've heard a similar type of noise with a wire

[01:37:00.480 --> 01:37:02.840] spinning on itself.

[01:37:02.840 --> 01:37:04.640] But that is not correct.

[01:37:04.640 --> 01:37:08.280] I have another guess from Michael Dello, and he says, Hey, folks, this week's noisy had

[01:37:08.280 --> 01:37:10.320] a very motor engine feel to it.

[01:37:10.320 --> 01:37:12.680] I think I could hear the engine dying out at the end.

[01:37:12.680 --> 01:37:16.880] It also had a separate noise that sounded like someone running a stick along a metal

[01:37:16.880 --> 01:37:18.160] fence.

[01:37:18.160 --> 01:37:22.880] So he's guessing an old motor car engine with a defect or brake causing part of it to rattle

[01:37:22.880 --> 01:37:25.640] along another part as the motor runs.

[01:37:25.640 --> 01:37:30.360] So none of these are correct, and nobody actually guessed the noisy this week.

[01:37:30.360 --> 01:37:34.080] I knew this one was going to be difficult, but nobody even came close.

[01:37:34.080 --> 01:37:36.080] So I'm just going to tell you what this is.

[01:37:36.080 --> 01:37:43.920] This is a steamroller that is rolling over gravel, and there is a metal bar on a steamroller

[01:37:43.920 --> 01:37:50.440] that clears the roller before it touches the ground again, kind of like it would scrape things

[01:37:50.440 --> 01:37:53.200] off of the roller.

[01:37:53.200 --> 01:37:57.280] The way that I'm seeing it in the video, it looks kind of like, you know, almost like

[01:37:57.280 --> 01:38:06.680] a knife that runs along the drum to get stuff off of it, and it's scraping off these rocks

[01:38:06.680 --> 01:38:09.280] that have stuck to the roller itself.

[01:38:09.280 --> 01:38:11.480] I know it's weird, but it was such a cool noisy.

[01:38:11.480 --> 01:38:12.480] I had to use it.

[01:38:12.480 --> 01:38:13.480] Very hard to guess.

[01:38:13.480 --> 01:38:16.040] Of course, if anybody said steamroller, that would have been it.

[01:38:16.040 --> 01:38:17.040] Rocking roller.

[01:38:17.040 --> 01:38:27.040] I'm going to play the sound for you again.

[01:38:27.040 --> 01:38:32.920] So the vibration, echoey vibration that you're hearing, I think is resonating off of the

[01:38:32.920 --> 01:38:36.720] steel drum, which is essentially the front roller.

[01:38:36.720 --> 01:38:37.720] Very cool noisy.

[01:38:37.720 --> 01:38:40.600] I really wanted to play that one for you guys.

[01:38:40.600 --> 01:38:43.360] Damn near impossible to guess.

[01:38:43.360 --> 01:38:46.840] But I got to throw some hard ones at you guys every once in a while.

[01:38:46.840 --> 01:38:51.840] And clearly we have no steamroller operators listening to the Skeptics Guide to the Universe.

[01:38:51.840 --> 01:38:52.840] Clearly.

[01:38:52.840 --> 01:38:53.840] Thanks, guys.

[01:38:53.840 --> 01:38:58.520] We have no steamroller people representation in this crowd.

[01:38:58.520 --> 01:38:59.520] Something's off.

[01:38:59.520 --> 01:39:01.080] We got to expand our reach, yeah.

[01:39:01.080 --> 01:39:07.360] This next one was sent in by a listener named Peter Canold, and this is a really, really,

[01:39:07.360 --> 01:39:09.060] really fun, cool noisy.

[01:39:09.060 --> 01:39:13.560] Check this one out.

[01:39:13.560 --> 01:39:39.280] All right, I think I played enough of that one for you to be able to figure something

[01:39:39.280 --> 01:39:44.240] out about that, but there is something very specific happening in this noisy.

[01:39:44.240 --> 01:39:49.080] I would dare say it's unique because I've never seen anything like this before.

[01:39:49.080 --> 01:39:50.640] Wow, is this a cool noisy.

[01:39:50.640 --> 01:39:52.920] Wait until I tell you what it is next week.

[01:39:52.920 --> 01:39:59.000] If you think you know the answer, if you're from Australia, or if you heard a cool noisy

[01:39:59.000 --> 01:40:02.280] this week, you can email me at WTN@theskepticsguide.org.

[01:40:02.280 --> 01:40:04.620] All right, thanks, Jay.

Science or Fiction (01:40:04)

Answer Item
Fiction
Science
Host Result
'
Rogue Guess

Voice-over: It's time for Science or Fiction.

Item #1: The first time a sitting US president visited Australia wasn't until 1948, when Harry S. Truman visited Brisbane and then Canberra.[5]
Item #2: The one dish mentioned under traditional cuisine on the CIA World Factbook webpage on Australia is the meat pie.[6]
Item #3: The Melbourne Dogs was a short-lived gang of criminal immigrants from Australia to Los Angeles in the mid-19th century, blamed for an 1853 fire.[7]


_Rogue_ Response

_Rogue_ Response

_Rogue_ Response

_Rogue_ Response

_Host_ Explains Item #_n_

_Host_ Explains Item #_n_

_Host_ Explains Item #_n_

_Host_ Explains Item #_n_

[01:40:04.620 --> 01:40:07.160] So we have a treat tonight, guys.

[01:40:07.160 --> 01:40:10.960] Richard Saunders has prepared a science or fiction for us.

[01:40:10.960 --> 01:40:13.360] Let's do it.

[01:40:13.360 --> 01:40:18.360] It's time for Science or Fiction.

[01:40:18.360 --> 01:40:25.360] All right, all right, Rogues.

[01:40:25.360 --> 01:40:31.200] This week, I have three facts concerning the USA-Australia relationship.

[01:40:31.200 --> 01:40:34.040] One is genuine, and two are fictitious.

[01:40:34.040 --> 01:40:39.840] I challenge the panel of skeptics to tell me which one is the genuine one.

[01:40:39.840 --> 01:40:40.920] So are you ready?

[01:40:40.920 --> 01:40:42.000] Here they come.

[01:40:42.000 --> 01:40:48.140] Item number one, Australia and the US have had a long and close relationship with economic

[01:40:48.140 --> 01:40:49.780] and military ties.

[01:40:49.780 --> 01:40:55.400] An attack on the USA is considered to be an attack on Australia and vice versa under our

[01:40:55.400 --> 01:40:56.780] military pact.

[01:40:56.780 --> 01:41:03.800] We have adopted many parts of American culture, including, over the past couple of decades,

[01:41:03.800 --> 01:41:06.520] a very American version of Halloween.

[01:41:06.520 --> 01:41:14.780] With such a strong relationship going back decades, it's interesting to note that the

[01:41:14.780 --> 01:41:23.100] first time a sitting US president visited Australia wasn't until 1948 when Harry S.

[01:41:23.100 --> 01:41:28.440] Truman visited the northern city of Brisbane, then the capital, Canberra, to thank Australian

[01:41:28.440 --> 01:41:33.760] troops for their service, support, and comradeship with US servicemen in the Pacific Theatre

[01:41:33.760 --> 01:41:37.280] during World War II.

[01:41:37.280 --> 01:41:41.800] That's item number one.

[01:41:41.800 --> 01:41:49.080] Item number two, although you can get just about any kind of food in Australia, especially

[01:41:49.080 --> 01:41:53.800] in the major cities, we are also known for some unusual tastes.

[01:41:53.800 --> 01:42:01.520] While many Australians love Vegemite on their toast, or musk sticks, I know there are some people

[01:42:01.520 --> 01:42:10.420] in the Jay household who like musk sticks as a candy, the one dish, the one dish mentioned

[01:42:10.420 --> 01:42:20.360] under traditional cuisine on the CIA World Factbook webpage on Australia is, drum roll,

[01:42:20.360 --> 01:42:27.620] a meat pie, a fist-sized baked pie filled with ground meat, gravy, and cheese and topped

[01:42:27.620 --> 01:42:28.620] with ketchup.

[01:42:28.620 --> 01:42:33.340] The gravy often contains onions or mushrooms.

[01:42:33.340 --> 01:42:35.960] That's item number two.

[01:42:35.960 --> 01:42:41.640] Item number three, the Melbourne Dogs was the name given to a short-lived gang of criminal

[01:42:41.640 --> 01:42:46.720] immigrants from Australia to Los Angeles during the mid-19th century.

[01:42:46.720 --> 01:42:54.280] Originally part of the 1849 Gold Rush, they moved to LA after not striking it rich in

[01:42:54.280 --> 01:43:00.520] the gold fields, because many of these criminals came from the well-known British penal colonies

[01:43:00.520 --> 01:43:02.000] in Australia.

[01:43:02.000 --> 01:43:03.800] They were known to commit arson.

[01:43:03.800 --> 01:43:11.600] They were blamed for a 1853 fire, as well as the rampant crime in the city at that time.

[01:43:11.600 --> 01:43:17.960] There are your three items, and let's start with that well-known lover of Vegemite and all

things Australian, Jay.

[01:43:18.960 --> 01:43:19.960] Okay.

[01:43:19.960 --> 01:43:21.280] All right.

[01:43:21.280 --> 01:43:26.560] So starting with the first one, Australia and the US have a long, close relationship

[01:43:26.560 --> 01:43:30.840] economically, and they have military ties.

[01:43:30.840 --> 01:43:35.600] This is a very long, descriptive thing about that relationship.

[01:43:35.600 --> 01:43:40.600] And I'm going to say, I'm going to be very honest, I don't know to what degree the United

[01:43:40.600 --> 01:43:48.480] States and Australia have a military relationship, but I do believe that they have a very strong

[01:43:48.480 --> 01:43:50.480] one.

[01:43:50.480 --> 01:43:51.480] That's not under contention.

[01:43:51.480 --> 01:43:54.420] Really, the guts of this one is the presidential visit.

[01:43:54.420 --> 01:43:57.080] I'm going to absolutely think this one is science.

[01:43:57.080 --> 01:44:01.960] I could see that happening, although you'd think that a president would have visited

[01:44:01.960 --> 01:44:06.200] much earlier than that, but that seems true to me.

[01:44:06.200 --> 01:44:11.960] The second one is, although you can get any kind of food in Australia, their traditional

[01:44:11.960 --> 01:44:14.600] cuisine is a meat pie.

[01:44:14.600 --> 01:44:16.360] This one intrigued me.

[01:44:16.360 --> 01:44:23.560] And I would really think that this would be part of some type of traditional meal.

[01:44:23.560 --> 01:44:27.320] In most countries, to be honest with you, some type of meat pie.

[01:44:27.320 --> 01:44:29.420] So I think that one is science.

[01:44:29.420 --> 01:44:35.440] This third one about the Melbourne dogs roaming around, of these three items, the two that

[01:44:35.440 --> 01:44:42.520] I think are fiction are the first one, which is about a sitting president not visiting Australia

[01:44:42.520 --> 01:44:46.200] until 1948.

[01:44:46.200 --> 01:44:52.680] And the second one is that I don't think that there was a band of criminals in California

[01:44:52.680 --> 01:44:53.680] causing trouble.

[01:44:53.680 --> 01:44:56.640] I do not think that that happened.

[01:44:56.640 --> 01:44:57.640] Those are the two.

[01:44:57.640 --> 01:45:02.240] So you're saying that the true one is the number two, the meat pie.

[01:45:02.240 --> 01:45:03.880] Meat pie is, oh, go with the meat.

[01:45:03.880 --> 01:45:07.280] You know what I'm saying, Richard?

[01:45:07.280 --> 01:45:08.280] Absolutely.

[01:45:08.280 --> 01:45:10.160] Bob, let's hear from you.

[01:45:10.160 --> 01:45:11.960] Oh boy.

[01:45:11.960 --> 01:45:12.960] One is true.

[01:45:12.960 --> 01:45:15.480] Two are fictitious.

[01:45:15.480 --> 01:45:16.480] One is true.

[01:45:16.480 --> 01:45:24.320] So the fictitious is, I'm going to say Truman visiting in 46 is fictitious.

[01:45:24.320 --> 01:45:28.280] It definitely seems late, but you know, it's Truman.

[01:45:28.280 --> 01:45:33.960] Meat pie, I'm going to say that's fictitious too because I'm a big fan of meat pies.

[01:45:33.960 --> 01:45:35.500] I don't remember.

[01:45:35.500 --> 01:45:37.520] I remember Vegemite and I remember musk sticks.

[01:45:37.520 --> 01:45:39.720] I don't remember meat pie, the times that I've been there.

[01:45:39.720 --> 01:45:41.520] And I think I would remember that.

[01:45:41.520 --> 01:45:49.340] The one that I think is true is, I love the idea of this Melbourne dogs, these criminal

[01:45:49.340 --> 01:45:51.880] immigrants in LA roving around.

[01:45:51.880 --> 01:45:56.920] I like that idea too much to say that it's fictitious, so I'm going to say that is

[01:45:56.920 --> 01:45:57.920] not fiction.

[01:45:57.920 --> 01:45:58.920] True.

[01:45:58.920 --> 01:45:59.920] All right.

[01:45:59.920 --> 01:46:00.920] That's the true one.

[01:46:00.920 --> 01:46:01.920] Okay.

[01:46:01.920 --> 01:46:02.920] Evan.

[01:46:02.920 --> 01:46:03.920] All right.

[01:46:03.920 --> 01:46:04.920] Well, I'm going a different direction here.

[01:46:04.920 --> 01:46:08.400] I have a feeling the one that's true is the Truman one.

[01:46:08.400 --> 01:46:10.920] I think you're playing a pun there.

[01:46:10.920 --> 01:46:17.240] But also more importantly, because I went through in my head the list of presidents,

[01:46:17.240 --> 01:46:23.240] mainly starting with say, you know, the year 1900 and then going forward, you've got what?

[01:46:23.240 --> 01:46:24.960] Would McKinley have gone?

[01:46:24.960 --> 01:46:25.960] No.

[01:46:25.960 --> 01:46:28.360] Would Roosevelt have gone?

[01:46:28.360 --> 01:46:29.360] Maybe.

[01:46:29.360 --> 01:46:30.360] Maybe.

[01:46:30.360 --> 01:46:32.040] That one I don't quite know about.

[01:46:32.040 --> 01:46:34.960] But after him came, there was Taft in there.

[01:46:34.960 --> 01:46:36.160] I don't think so.

[01:46:36.160 --> 01:46:37.840] Wilson, World War I.

[01:46:37.840 --> 01:46:43.320] We were also obviously part of that allied group, even though the U.S. came in later

[01:46:43.320 --> 01:46:44.320] during the course of World War I.

[01:46:44.320 --> 01:46:46.600] But I don't think Wilson would have gone.

[01:46:46.600 --> 01:46:49.280] And you had who came after Wilson?

[01:46:49.280 --> 01:47:00.320] Was it Coolidge and then the other guy, the guy before FDR.

[01:47:00.320 --> 01:47:03.720] But in any case, I'm going through the list, I'm trying to think, well, who the heck would

[01:47:03.720 --> 01:47:04.720] have gone?

[01:47:04.720 --> 01:47:07.040] Which president would have made that visit?

[01:47:07.040 --> 01:47:08.040] For what reason?

[01:47:08.040 --> 01:47:11.760] I can't come up with a reason or a note.

[01:47:11.760 --> 01:47:17.360] FDR, I don't believe, traveled internationally to Australia during his years and went to

[01:47:17.360 --> 01:47:20.240] Europe during World War II and stuff.

[01:47:20.240 --> 01:47:24.120] So I have a feeling that that one's going to turn out to be correct, therefore, by default,

[01:47:24.120 --> 01:47:25.120] the other two are fiction.

[01:47:25.120 --> 01:47:26.120] All right.

[01:47:26.120 --> 01:47:27.120] Very good.

[01:47:27.120 --> 01:47:28.120] Steve?

[01:47:28.120 --> 01:47:29.480] Yeah, actually, I agree with Evan.

[01:47:29.480 --> 01:47:34.960] For the second one, meat pies, I always thought of them as being more British, but that could

[01:47:34.960 --> 01:47:38.680] obviously have translated over to Australia.

[01:47:38.680 --> 01:47:40.960] But I can kind of go either way with that one.

[01:47:40.960 --> 01:47:46.120] And then, you know, I may be wrong about this, but wouldn't they be called the Melbourne

[01:47:46.120 --> 01:47:47.120] dingoes?

[01:47:47.120 --> 01:47:54.320] Right, because there are no dogs in Australia.

[01:47:54.320 --> 01:47:58.560] And I was thinking along similar lines of Evan, also the fact that, you know, prior

[01:47:58.560 --> 01:48:04.960] to air travel, I mean, all you'd do is spend six weeks on a boat going to Australia, it's

[01:48:04.960 --> 01:48:05.960] far away.

[01:48:05.960 --> 01:48:06.960] As a president?

[01:48:06.960 --> 01:48:07.960] That's the thing.

[01:48:07.960 --> 01:48:11.120] You spend that much time as a president just traveling.

[01:48:11.120 --> 01:48:13.520] They were busy, you know, running the country.

[01:48:13.520 --> 01:48:19.600] So the thing that gives me caution is FDR was in office for a long time.

[01:48:19.600 --> 01:48:24.800] And you think he would have eventually visited all of our major allies, especially since

[01:48:24.800 --> 01:48:29.240] plane travel was coming into common use at that time.

[01:48:29.240 --> 01:48:31.600] So that's my one caution there.

[01:48:31.600 --> 01:48:35.760] It might have been FDR, but it's plausible that he just never got around to it.

[01:48:35.760 --> 01:48:39.240] And Truman was the first one to specifically visit Australia.

[01:48:39.240 --> 01:48:41.840] So I'm going to say that number one is science.

[01:48:41.840 --> 01:48:47.360] The visit by the president is, as far as Steve is concerned, that's the science or the true

[01:48:47.360 --> 01:48:48.360] one.

[01:48:48.360 --> 01:48:49.360] Cara.

[01:48:49.360 --> 01:48:54.280] So OK, so we've got Evan and Steve saying the visit by the president is science.

[01:48:54.280 --> 01:48:56.480] Jay likes his meat pie and Bob likes the dogs.

[01:48:56.480 --> 01:48:58.200] Jay with some meat.

[01:48:58.200 --> 01:48:59.200] Bob with the dogs.

[01:48:59.200 --> 01:49:00.200] I'm going to go with Jay.

[01:49:00.200 --> 01:49:01.200] I'm into the meat.

[01:49:01.200 --> 01:49:02.200] I'm going to say that one's the science.

[01:49:02.200 --> 01:49:05.200] I'm not going to give an explanation.

[01:49:05.200 --> 01:49:06.520] We've heard enough explanations.

[01:49:06.520 --> 01:49:08.680] I just feel like going with Jay on this one.

[01:49:08.680 --> 01:49:11.000] This is dartboard territory.

[01:49:11.000 --> 01:49:12.000] So this is great.

[01:49:12.000 --> 01:49:13.000] Excellent.

[01:49:13.000 --> 01:49:15.620] We've got a bit of a mixed palette in front of us.

[01:49:15.620 --> 01:49:18.000] And I know this is backwards to the way you normally do it.

[01:49:18.000 --> 01:49:21.560] But as I say, I'm from Australia and we're just sort of like that.

[01:49:21.560 --> 01:49:22.560] It's upside down.

[01:49:22.560 --> 01:49:29.560] But let me say the first one, the first one, the first item about the U.S.-Australia relationship

[01:49:29.560 --> 01:49:30.560] is all very true.

[01:49:30.560 --> 01:49:34.400] We do have a strong close military bond and all the rest of it.

[01:49:34.400 --> 01:49:42.080] However, it was Lyndon Baines Johnson who in 1966 had a three-day visit to Australia.

[01:49:42.080 --> 01:49:43.640] That's the first time?

[01:49:43.640 --> 01:49:44.640] Yeah.

[01:49:44.640 --> 01:49:47.740] It's the very first presidential visit as a show of gratitude to the Australian nation

[01:49:47.740 --> 01:49:50.840] for its support for the Vietnam War.

[01:49:50.840 --> 01:49:53.920] He was very late.

[01:49:53.920 --> 01:49:59.320] He was the first sitting U.S. president to visit Australia and visited again in the following

[01:49:59.320 --> 01:50:05.520] year to attend the funeral of the prime minister, Harold Holt, who died in office after going

[01:50:05.520 --> 01:50:07.720] swimming or diving in rough seas.

[01:50:07.720 --> 01:50:13.640] His body was never found, leading to conspiracy theories that he was a Chinese spy and was

[01:50:13.640 --> 01:50:15.920] taken by a submarine.

[01:50:15.920 --> 01:50:22.480] The presidents who have visited Australia have been LBJ, Bush Senior, Bill Clinton,

[01:50:22.480 --> 01:50:26.000] Bush Junior and Barack Obama.

[01:50:26.000 --> 01:50:30.560] So that one was the fiction.

[01:50:30.560 --> 01:50:36.360] So this will tell you, I'll just do them in order so we know.

[01:50:36.360 --> 01:50:42.040] Item number two, you can get any kind of food in Australia, but the CIA reckon that the

[01:50:42.040 --> 01:50:46.080] traditional cuisine of Australia is the meat pie.

[01:50:46.080 --> 01:50:49.400] That is true.

[01:50:49.400 --> 01:50:50.400] That is the fact.

[01:50:50.400 --> 01:50:51.400] Meat pie.

[01:50:51.400 --> 01:50:52.400] Meat pie.

[01:50:52.400 --> 01:50:53.400] Yeah, Jay.

[01:50:53.400 --> 01:50:56.400] Of course it's the meat pie.

[01:50:56.400 --> 01:50:58.160] Come on.

[01:50:58.160 --> 01:51:00.080] I couldn't believe this one when I saw it.

[01:51:00.080 --> 01:51:05.200] As far as the CIA World Factbook are concerned, of all the things you can get in Australia,

[01:51:05.200 --> 01:51:07.240] the meat pie is our traditional cuisine.

[01:51:07.240 --> 01:51:09.480] What the hell, man?

[01:51:09.480 --> 01:51:14.280] They could have said the damper, which is a bread made from a wheat-based dough, flour,

[01:51:14.280 --> 01:51:21.400] salt, and water, which is popular in the outback, baked over a campfire in the coals of the

[01:51:21.400 --> 01:51:22.400] fire.

[01:51:22.400 --> 01:51:26.520] Or they could have said the chicken parmigiana, which is a very popular dish, especially in

[01:51:26.520 --> 01:51:28.280] clubs and pubs throughout Australia.

[01:51:28.280 --> 01:51:31.000] But no, the meat pie wins out.

[01:51:31.000 --> 01:51:34.380] And so Cara and Jay got that one wrong.

[01:51:34.380 --> 01:51:35.380] And out of interest-

[01:51:35.380 --> 01:51:36.960] You mean we got it right?

[01:51:36.960 --> 01:51:37.960] Got it right.

[01:51:37.960 --> 01:51:38.960] You got it right.

[01:51:38.960 --> 01:51:39.960] Yeah.

[01:51:39.960 --> 01:51:40.960] I'm from Australia.

[01:51:40.960 --> 01:51:41.960] Dang it.

[01:51:41.960 --> 01:51:42.960] In Australia, right is wrong, Cara.

[01:51:42.960 --> 01:51:43.960] That's right.

[01:51:43.960 --> 01:51:44.960] Yeah.

[01:51:44.960 --> 01:51:45.960] That's right.

[01:51:45.960 --> 01:51:46.960] Wrong is good.

[01:51:46.960 --> 01:51:47.960] Wrong is right.

[01:51:47.960 --> 01:51:53.680] Now, that means the Melbourne Dogs are a fiction. However, the Sydney Ducks was a

[01:51:53.680 --> 01:51:59.200] name given to a gang of criminal immigrants from Australia in San Francisco during the

[01:51:59.200 --> 01:52:00.200] 19th-

[01:52:00.200 --> 01:52:01.200] No.

[01:52:01.200 --> 01:52:02.200] You sly bastard.

[01:52:02.200 --> 01:52:03.200] Mid-19th century.

[01:52:03.200 --> 01:52:04.200] Sure.

[01:52:04.200 --> 01:52:07.320] Because many of these criminals came from the well-known British penal colonies in Australia.

[01:52:07.320 --> 01:52:08.720] They were known to commit arson.

[01:52:08.720 --> 01:52:14.680] They were blamed for an 1849 fire that devastated the heart of San Francisco, as well as rampant

[01:52:14.680 --> 01:52:17.200] crime in the city at the time.

[01:52:17.200 --> 01:52:22.200] So the Sydney Ducks were a very real gang who came up from Sydney, from Australia,

[01:52:22.200 --> 01:52:27.800] for the gold rush, and ended up creating havoc in San Francisco.

[01:52:27.800 --> 01:52:31.080] So that was the fictitious one, because it was the dogs.

[01:52:31.080 --> 01:52:32.080] The dogs was fictitious.

[01:52:32.080 --> 01:52:34.160] Yeah, because they weren't dingoes.

[01:52:34.160 --> 01:52:35.160] They weren't dingoes.

[01:52:35.160 --> 01:52:36.160] That's right.

[01:52:36.160 --> 01:52:40.280] But to get back to it, the winners this week are Jay and Cara because they love their meat

[01:52:40.280 --> 01:52:41.280] pies.

[01:52:41.280 --> 01:52:42.280] What?

[01:52:42.280 --> 01:52:49.280] I can't tell you how excited and happy I am to win this award.

[01:52:49.280 --> 01:52:53.880] Cara and I worked for three years on this one, and I'm just so proud of the work that

[01:52:53.880 --> 01:52:54.880] we did, Cara.

[01:52:54.880 --> 01:52:55.880] Yep, me too.

[01:52:55.880 --> 01:52:56.880] Me too.


[01:52:58.880 --> 01:52:59.880] Well done.

[01:52:59.880 --> 01:53:00.880] Well done.

[01:53:00.880 --> 01:53:01.880] Well done, gents.

[01:53:01.880 --> 01:53:02.880] All right, thanks for that, Richard.

[01:53:02.880 --> 01:53:03.880] That was a lot of fun.

[01:53:03.880 --> 01:53:04.880] Cool.

Skeptical Quote of the Week (01:53:04)

Collectively, we make this future that we surprise ourselves with.
– Ian McEwan (1948-present), English novelist

[01:53:04.880 --> 01:53:05.880] Evan, give us a quote.

[01:53:05.880 --> 01:53:06.880] I love it.

[01:53:06.880 --> 01:53:09.880] It's from Jason, and Jason gives us his phone number, which I will not share on this

[01:53:09.880 --> 01:53:10.880] podcast.

[01:53:10.880 --> 01:53:11.880] What the hell?

[01:53:11.880 --> 01:53:12.880] No, it's true.

[01:53:12.880 --> 01:53:13.880] He did.

[01:53:13.880 --> 01:53:14.880] That's amazing.

[01:53:14.880 --> 01:53:15.880] Jason writes.

[01:53:15.880 --> 01:53:16.880] Great podcast.

[01:53:16.880 --> 01:53:17.880] I love it.

[01:53:17.880 --> 01:53:25.960] I was listening to Adam Buxton's podcast, where he interviews British novelist Ian McEwan,

[01:53:25.960 --> 01:53:30.000] and they were talking about the difficulty of predicting trends in the future.

[01:53:30.000 --> 01:53:35.480] Ian referenced the surprising rise of social media that even just 20 years ago, no one

[01:53:35.480 --> 01:53:36.600] really saw coming.

[01:53:36.600 --> 01:53:38.040] Here's the quote.

[01:53:38.040 --> 01:53:42.520] Collectively, we make this future that we surprise ourselves with.

[01:53:42.520 --> 01:53:44.320] Ian McEwan.

[01:53:44.320 --> 01:53:49.840] And Jason says he thought it was a rather poignant observation that the uses of new

[01:53:49.840 --> 01:53:55.600] technology such as the internet are largely shaped by us, and yet we struggle to imagine

[01:53:55.600 --> 01:53:56.860] what they will be.

[01:53:56.860 --> 01:53:57.860] So thank you, Jason.

[01:53:57.860 --> 01:53:58.860] Yeah.

[01:53:58.860 --> 01:53:59.860] I appreciate you sharing that with us.

[01:53:59.860 --> 01:54:05.960] I also think it relates to the fact that we create the future in the aggregate, but no

[01:54:05.960 --> 01:54:10.920] individual could see that process, because it involves everybody.

[01:54:10.920 --> 01:54:13.760] You know what I mean?

[01:54:13.760 --> 01:54:15.560] It's bottom up rather than top down.

[01:54:15.560 --> 01:54:16.560] Right.

[01:54:16.560 --> 01:54:21.800] It just emerges from our interface with technology and everything and millions of individual

[01:54:21.800 --> 01:54:22.800] decisions.

[01:54:22.800 --> 01:54:26.920] Nobody can necessarily, even though we're collectively doing it, we can't perceive it

[01:54:26.920 --> 01:54:30.000] from our perspective until it's already happened.

[01:54:30.000 --> 01:54:31.000] And we're like, what the hell?

[01:54:31.000 --> 01:54:32.000] How did that happen?

[01:54:32.000 --> 01:54:33.000] You know?

[01:54:33.000 --> 01:54:34.000] You know what I'm saying?

[01:54:34.000 --> 01:54:35.000] Yes.

[01:54:35.000 --> 01:54:40.360] It's slightly related to that thing where you and somebody else are walking somewhere, or

[01:54:40.360 --> 01:54:46.400] even a small group of people, and both people think that the other person is leading.

[01:54:46.400 --> 01:54:47.960] Oh my gosh.

[01:54:47.960 --> 01:54:48.960] No.

[01:54:48.960 --> 01:54:49.960] Never.

[01:54:49.960 --> 01:54:50.960] So you're following them, and they're following you.

[01:54:50.960 --> 01:54:51.960] Oh, please.

[01:54:51.960 --> 01:54:52.960] That's happened so many times.

[01:54:52.960 --> 01:54:54.960] That happened in our last trip down in New Zealand.

[01:54:54.960 --> 01:54:56.720] That happens on every trip.

[01:54:56.720 --> 01:55:00.480] So we end up like blocks away.

[01:55:00.480 --> 01:55:01.480] And then we're like, where are we going?

[01:55:01.480 --> 01:55:02.480] And everybody's like, I'm following you.

[01:55:02.480 --> 01:55:03.480] I'm following you.

[01:55:03.480 --> 01:55:05.560] Well, we get talking about things as we walk.

[01:55:05.560 --> 01:55:06.560] All kinds of things.

[01:55:06.560 --> 01:55:07.560] Yeah.

[01:55:07.560 --> 01:55:11.520] But we collectively are deciding where to go, thinking we're following each other.

[01:55:11.520 --> 01:55:12.520] Yeah.

[01:55:12.520 --> 01:55:13.520] We're a murmuration.

[01:55:13.520 --> 01:55:14.520] We're not.

[01:55:14.520 --> 01:55:15.520] Clearly we are.

[01:55:15.520 --> 01:55:16.520] Clearly we are not Borg.

[01:55:16.520 --> 01:55:17.520] A murmuration.

[01:55:17.520 --> 01:55:18.520] Nice.

Signoff/Announcements (01:55:18)

[01:55:18.520 --> 01:55:19.520] All right.

[01:55:19.520 --> 01:55:23.440] Well, thanks for joining us, Richard.

[01:55:23.440 --> 01:55:25.080] It's been great catching up with you.

[01:55:25.080 --> 01:55:29.640] We should go another three years before we touch base again.

[01:55:29.640 --> 01:55:31.640] I'll make sure that doesn't happen yet.

[01:55:31.640 --> 01:55:32.640] Yeah.

[01:55:32.640 --> 01:55:33.640] Yeah.

[01:55:33.640 --> 01:55:34.640] Yeah.

[01:55:34.640 --> 01:55:35.640] We have to meet in meat space somewhere.

[01:55:35.640 --> 01:55:36.640] Yes.

[01:55:36.640 --> 01:55:37.640] Meat space.

[01:55:37.640 --> 01:55:38.640] Meat space.

[01:55:38.640 --> 01:55:39.640] Meat space.

[01:55:39.640 --> 01:55:40.640] Meat space.

[01:55:40.640 --> 01:55:41.640] Meat space.

[01:55:41.640 --> 01:55:42.640] Meat space.

[01:55:42.640 --> 01:55:43.640] Meat space.

[01:55:43.640 --> 01:55:44.640] Meat space.

[01:55:44.640 --> 01:55:45.640] I'm holding out for an in-person situation.

[01:55:45.640 --> 01:55:46.640] Oh.

[01:55:46.640 --> 01:55:47.640] Yeah.

[01:55:47.640 --> 01:55:48.640] That's what I'm saying.

[01:55:48.640 --> 01:55:49.640] Let's do it.

[01:55:49.640 --> 01:55:50.640] Meat space.

[01:55:50.640 --> 01:55:51.640] We'll call it meat spice.

[01:55:51.640 --> 01:55:52.640] Meat spice.

[01:55:52.640 --> 01:55:53.640] Meat spice.

[01:55:53.640 --> 01:55:54.640] Spicy meat spice and spice.

[01:55:54.640 --> 01:55:55.640] Yeah.

[01:55:55.640 --> 01:55:56.640] All right.

[01:55:56.640 --> 01:55:57.640] Thanks, everyone, for joining me this week.

[01:55:57.640 --> 01:55:58.640] You got it, Steve.

[01:55:58.640 --> 01:55:59.640] Thank you, doctor.

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.

[top]                        

Today I Learned

  • Fact/Description, possibly with an article reference[8]
  • Fact/Description
  • Fact/Description

Notes

References

  1. [url_from_news_item_show_notes PUBLICATION: TITLE]
  2. [url_from_news_item_show_notes publication: title]
  3. [url_from_news_item_show_notes publication: title]
  4. [url_from_news_item_show_notes publication: title]
  5. [url_from_SoF_show_notes PUBLICATION: TITLE]
  6. [url_from_SoF_show_notes PUBLICATION: TITLE]
  7. [url_from_SoF_show_notes PUBLICATION: TITLE]
  8. [url_for_TIL publication: title]

Vocabulary


Navi-previous.png Back to top of page Navi-next.png