User:Mheguy: Difference between revisions

From SGUTranscripts
|episodeNum = 993
|episodeDate = {{900s|993|boxdate}}
|episodeIcon = File:993.jpg
|caption =
|bob = None
|swept = <!-- all the Rogues guessed right -->
}}
''Voice-over: It's time for Science or Fiction.''
transcript_placeholder
{{anchor|qow}}

Revision as of 08:22, 23 July 2024

This episode needs: proofreading, time stamps, formatting, links, 'Today I Learned' list, categories, segment redirects.
Please help out by contributing!
How to Contribute


SGU Episode 993
July 20th 2024
993.jpg
(brief caption for the episode icon)


Skeptical Rogues
S: Steven Novella

B: Bob Novella

R: Rebecca Watson

C: Cara Santa Maria

J: Jay Novella

E: Evan Bernstein

P: Perry DeAngelis

Links
Download Podcast
Show Notes
SGU Forum


Quickie with Steve: Conspiracy Theories ( 00:00:00 )

transcript_placeholder

News Item #1 – Lunar Cave ( 00:00:00 )

transcript_placeholder

News Item #2 – AI Love ( 00:00:00 )

transcript_placeholder

News Item #3 – AI Scams ( 00:00:00 )

transcript_placeholder

News Item #4 – Solar Clams ( 00:00:00 )

transcript_placeholder

From ( 00:00:00 )

transcript_placeholder

Emails ( 00:00:00 )

transcript_placeholder

Science or Fiction ( 00:00:00 )

Item #1: A new study finds that the risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccination.[5]
Item #2: A recent analysis of primate genomes finds that shared viral inclusions reduce overall cancer risk by stabilizing the genome.[6]
Item #3: Researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.[7]

Answer Item
Fiction: Item #2
Science: Item #1
Science: Item #3

Host: Steve
Rogue Guesses:

transcript_placeholder

Skeptical Quote of the Week ( 00:00:00 )


"In effect, we're all guinea pigs for the dietary supplement industry. The vast majority of these supplements don't deliver on their promises."

 – Nick Tiller, author, The Skeptic's Guide to Sports Science: Confronting Myths of the Health and Fitness Industry

transcript_placeholder



Transcript

Below is the full diarized transcript, not broken into segments.


00:00:03
US#03: You're listening to The Skeptic's Guide to the Universe.
00:00:06
US#03:Your escape to reality.
00:00:10
S: Hello and welcome to The Skeptic's Guide to the Universe.
00:00:13
S:Today is Wednesday, July 17th, 2024, and this is your host, Steven Novella.
00:00:18
S:Joining me this week are Bob Novella, Cara Santa Maria, and Evan Bernstein.
00:00:25
S:Jay is off this week.
00:00:27
E:Jay is both off and on vacation.
00:00:31
C:He just goes off.
00:00:33
C:That's all.
00:00:34
E:Wherever you are, Jay, I hope you're having an excellent time.
00:00:38
S: Jay is in Maine, actually.
00:00:41
C:We're all like, huh, alright.
00:00:44
S:So, have either of you guys watched either The Bear or Shogun?
00:00:48
C:I've watched The Bear.
00:00:49
S:I did not watch The Bear.
00:00:50
E:I watched Cocaine Bear.
00:00:52
E:I watched Shogun when it was a series back in the, what, 1980s?
00:00:56
E:Early 80s?
00:00:58
S:Cara, what did you think of this season of The Bear?
00:01:00
C: I can't do any spoilers yet, right?
00:01:02
C:Because it's still pretty new.
00:01:03
S:Yeah, you can just give an overview with no spoilers.
00:01:05
C:All right, let me think about how to say the impression with that.
00:01:08
C:I really like how atmospheric it is.
00:01:11
C:I think the writing is really good.
00:01:12
C:I think that it's really visually interesting.
00:01:15
C:I think that I am really annoyed with the main character.
00:01:19
C:I don't think he's complex.
00:01:21
C:I think he needs to get over himself.
00:01:24
E:That's the way the character's designed.
00:01:27
C:100% he's designed to be complex, but he's not.
00:01:30
E: Yeah, well, he's damaged.
00:01:31
C:Yeah, which is fine.
00:01:31
E:And they explore that a little bit.
00:01:33
C:But he's not complex.
00:01:34
C:He's just selfish and privileged and all the things that many women deal with in the dating world right now.
00:01:42
S:Yeah, I mean, my overall take, I mean, this season wasn't as good as the last two seasons.
00:01:47
S:I had the feeling they were trying too hard.
00:01:49
S:You know, like a lot of episodes like, yeah, OK, I could see like artistically where they were going with it.
00:01:54
S:It just wasn't that enjoyable an episode.
00:01:57
C: Yeah, not a lot happens this season.
00:01:58
C:I think it's fair to say that.
00:02:00
C:There's not a whole lot of activity going on this season.
00:02:03
S:It's classic.
00:02:05
S:They forgot to have shit happen.
00:02:07
C:Yeah.
00:02:07
C:Whoops.
00:02:08
C:Whoops.
00:02:09
S:Shogun, on the other hand, was excellent.
00:02:11
S:Really awesome.
00:02:12
S:Good to hear.
00:02:13
S:Leading in Emmy nominations and deserves it.
00:02:15
C:Would I like it?
00:02:16
S:It's very, very good.
00:02:18
S:Yeah, I think you would love it.
00:02:19
S:Wow.
00:02:19
C: Really, I would love it.
00:02:20
C:I'm not a big fan of, like, fighty shows.
00:02:23
S:It's not really a fighting show, you know, I mean, there's fighting that happens, but that's not what the show's about.
00:02:28
C:Oh, okay, good.
00:02:29
C:There's fighting, but it's not fighty.
00:02:31
C:Okay, cool.
00:02:31
C:Okay, good to know.
00:02:33
E:I mean, it takes place, what, in the 1600s, right?
00:02:36
E:And so what wasn't fighting in the 1600s, right?
00:02:40
E:It's sort of, it's the background, it's the environment.
00:02:44
C: Yeah, it's more that like, I'm just not like entertained by action sequences.
00:02:51
C:You know what I mean?
00:02:51
C:That's not enough to hold my attention.
00:02:54
S:It's very much driven by like excellent characters and the best character in my opinion is the lead female.
00:02:59
S:She's awesome.
00:03:01
C:That's good.
00:03:02
S:Well, we haven't spoken yet about the big thing that happened since the last episode, the failed assassination attempt on Donald Trump.
00:03:14
S: Well, I'm not going to talk about it politically, or the political aspect of it.
00:03:17
S:What I want to talk about is the fact that instantly, within seconds, minutes of this event happening, the internet was abuzz with all sorts of conspiracy theories.
00:03:30
S:Like, that's the go-to explanation now.
00:03:31
S:It's a conspiracy.
00:03:33
E: It was impressive in its own right how fast it took on a life of its own.
00:03:37
C:But I also got to ask, before Steve, you get into the brass tacks of it all, I mean, did you have a moment?
00:03:44
C:No, I really didn't.
00:03:45
C:No, really, not even a single moment.
00:03:47
S:No, because again, it's conspiracy theory.
00:03:51
S:I had sort of an immediate skeptical reaction, although a lot of people in my social circle had that immediate conspiracy theory instinct.
00:04:00
C: I think the difficulty was, yes, how could the Secret Service have failed so miserably?
00:04:07
C:I think that was, and like people are trying to answer that question and they're looking for an explanation.
00:04:13
S:Well, yeah, the simpler explanation is always that it's incompetence, right?
00:04:18
S:Never attribute to malice.
00:04:19
S:Incompetence, right?
00:04:21
E:User error, you know.
00:04:23
S: So that was the idea that, and both sides did this, both sides used the fact that the Secret Service failed to prevent this from happening as evidence that or as a reason to think that it might have been a conspiracy.
00:04:34
S:Even like within my group of friends who are generally skeptical smart people, that was sort of their instinct, like, oh, this has to be staged or whatever, you know, and then they would search for reasons to support
00:04:49
S: What they want to believe based upon their ideological outlook.
00:04:53
S:Rather than asking about how plausible it is that something like this could have been pulled off.
00:05:01
S: Also, we have to consider, like, this would be a really stupid thing for either side to do.
00:05:05
S:The risk of being found out massively outweighs any incremental benefit they may get from the politics of either, like, staging it or, you know, the outcome would not necessarily have been good, you know, if the assassination were successful.
00:05:24
S: So, meanwhile, if their campaign was discovered to have been involved with a conspiracy, that would be the end, the absolute end of their campaign, regardless of the outcome of this event.
00:05:37
J:That's right.
00:05:38
C:Exactly.
00:05:38
C:Yeah, that's the big thing.
00:05:39
E:Grand conspiracy.
00:05:41
C:I'm less convinced by an argument that certain actors are doing a lot of things based on logic.
00:05:49
C: So the argument was, was it that it was staged or that it was orchestrated?
00:05:54
C:Because I think that's two different claims.
00:05:57
C:So it could be staged as in it wasn't real.
00:06:00
C:You know what I mean?
00:06:00
C:Like, like as in he didn't actually shoot him.
00:06:03
C:It was like staged.
00:06:04
C:It was like, it was magic.
00:06:06
C:Yeah, exactly.
00:06:07
C:Versus it was orchestrated, meaning that
00:06:11
C: They intentionally had a guy shoot at Donald Trump, knowing that he would miss by a hair.
00:06:16
C:That's the other part.
00:06:17
C:Yeah.
00:06:18
S:Yeah, one of those two.
00:06:19
C:One of those two.
00:06:19
C:Okay, so there's different variations.
00:06:21
S:I think people just conflated, just like, something about this was a false flag operation.
00:06:26
B:Yeah.
00:06:26
B:God, I hate that term so much.
00:06:30
B: So much.
00:06:30
S:And then on the other side, of course, it was the fact that the Democrats are just they, that they did this, right?
00:06:36
S:So, by the way, good skeptical rule of thumb, do not use a vague reference to they, right?
00:06:42
S:Because you're whitewashing over a lot of important details.
00:06:46
S: And you're almost assuming a conspiracy when you do that.
00:06:49
S:So they're saying, like, they tried to impeach him, and then they tried to prosecute him, and then they tried to take away his wealth, and now they tried to assassinate him, as if this is all the same group or the same group of people.
00:07:06
S:And some of them are saying it explicitly.
00:07:08
C:Because they're always just referring to George Soros.
00:07:14
S: There is no they here.
00:07:16
S:This is one person.
00:07:18
S:The only thing the FBI has been able to say so far is this guy was definitely acting alone.
00:07:22
S:And he fits the total profile of a lone wolf, a mass shooter, you know, in terms of age, gender, race, you know, a little bit of on the outside, the fringe, socially, you know, enamored of guns.
00:07:36
S:I mean, it's perfect.
00:07:37
C:But generally speaking in mass shootings, because that's something I think we have to remember too, this was a mass shooting event.
00:07:43
C: Yes, it was an assassination attempt, but let's not also forget, right?
00:07:48
C:And in mass shootings, very often, especially when you're dealing with like the young white male who's like, somewhat intelligent, somewhat socially withdrawn, we often will see a manifesto, we'll see some sort of, you know, something written on social media clues leading up to it about their ideological leanings.
00:08:07
C:Yeah, there's none of that, which is fascinating.
00:08:11
C: Or like, I don't know if you guys did, did you watch the, um, we've talked about it before Manhunt, I think, which was the, yeah, the series about Lincoln and John Wilkes Booth.
00:08:20
C:And yes, John Wilkes Booth was ideologically motivated as in he was a Confederate and he really like believed in those causes.
00:08:28
C:But really he just wanted to be famous.
00:08:30
C:Like that was a huge part of it.
00:08:32
C:He wanted to be the guy.
00:08:33
C:Yeah.
00:08:34
S:All right, Bob, tell us about Caves on the Moon.
00:08:38
B: Yes, I've been waiting for this.
00:08:40
B:I knew it was going to come.
00:08:41
B:So, for the first time, we now have solid evidence that the moon does indeed have large underground tunnels or caves, and that these researchers think they would be a great place to hang out and just be safe on the moon.
00:08:55
B:What did it finally take for scientists to agree with me, and what does this mean for my dream of a Moon Base Alpha before the heat death of the universe?
00:09:03
B: So this is from the typical international team of scientists, which I love.
00:09:08
B:In this case, led by the scientists at the University of Trento in Italy.
00:09:12
B:This was published mid-July 2024, just recently, in the journal Nature Astronomy.
00:09:18
B:The title of the paper is Radar Evidence of an Accessible Cave Conduit on the Moon Below the Mare Tranquillitatis Pit.
00:09:27
B: So, this
00:09:42
B: Or I don't even like the idea of them being incredibly annoyed by moon dust.
00:09:47
B:And the utility of an already existing huge underground cave or tunnel on the moon is primarily underscored by the fact that the moon surface
00:09:57
B: Really, really kind of sucks.
00:09:59
B:It's a horrible place.
00:10:01
B:You think about the moon, you see the videos, right?
00:10:03
B:It seems like a fun place, right?
00:10:05
B:All bouncy and happy.
00:10:06
B:But really, it's really quite hellish.
00:10:09
B:And for a surprising number of deadly reasons.
00:10:13
B:First off, there's the temperature variations, which are nasty.
00:10:16
B:On the bright side of the moon, we're talking 127 degrees Celsius, 261 Fahrenheit.
00:10:22
B:On the unilluminated side, side of the whatever.
00:10:26
E:Nice, Bob.
00:10:27
B: It can drop to minus 173 Celsius or minus 280 Fahrenheit.
00:10:33
B:God, that's cold.
00:10:34
E:So there's no Goldilocks zone on the moon.
00:10:36
B:On the surface.
00:10:38
B:Hold on to that thought, Evan.
00:10:39
B:Hold on to that thought.
00:10:40
B:Do I even have to say more about those temperature swings?
00:10:43
B:They're just wow.
00:10:44
B:Next is the radiation on the surface of the moon.
00:10:46
B:There's galactic cosmic rays, which are high energy particles from things like supernova,
00:11:13
B: Well, the events, these solar particle events are kind of sudden and they're not very predictable at all.
00:11:20
B:They can expose astronauts on the surface to literally life-threatening doses.
00:11:26
B:Just generally speaking, the moon gets 200 to 2,000 times the radiation dose that we receive here on Earth.
00:11:33
B: But you got to remember, because if you go to different websites, you may find different ranges.
00:11:37
B:And that's because we're really not sure how bad the radiation is yet.
00:11:42
B:It hasn't been studied as fully as it needs to be studied.
00:11:45
E:We don't have a radiation detector sitting on the moon somewhere.
00:11:48
B:What's that?
00:11:48
E:There's no radiation.
00:12:06
B: So all right, then let's talk about a worst case scenario.
00:12:09
B:And that happened on October 20th, 1989.
00:12:12
B:There was an X-class solar flare.
00:12:14
B:That's X-class.
00:12:15
B:There is no Y-class.
00:12:17
B:It ends with the X-class.
00:12:18
B:They are the nastiest solar flares.
00:12:21
B:That essentially, it caused a geomagnetic storm that bathed the moon in radiation.
00:12:26
B:And that radiation was more than eight times the radiation received by plant workers during the Chernobyl accident.
00:12:32
B: So that's what you would have received.
00:12:34
B:If you were an astronaut on the moon, you would have received over a brief period of time eight times the dose that Chernobyl workers received.
00:12:41
B:From what I could tell, if you were an astronaut on the moon on October 20th in 1989, you probably would have died within hours.
00:12:49
B:So that's how deadly we're talking.
00:12:52
B:Hours, that's pretty fast.
00:12:53
B:The only way you're going to die quicker on the moon is if you're hit with
00:12:58
B: A micrometeorite.
00:13:00
B:That's just nasty.
00:13:02
B:And that's my next one here is micrometeorite impacts.
00:13:06
B:These are constantly bombarding the moon.
00:13:09
B:They can be very tiny particles or they could be up to a few centimeters.
00:13:13
B:And they travel potentially up to 70 kilometers per second.
00:13:18
B: With the average impact velocity of about 20 kilometers per second, that's a lot of kinetic energy there, even for something tiny.
00:13:25
B:The damage to structures or your head would be catastrophic.
00:13:29
B:And it's not even a direct hit.
00:13:30
B:You could get hit by the particles that are kicked up after it hits the ground on the moon, the ejecta.
00:13:38
B:Even that can be deadly.
00:13:39
B:So that's yet another one.
00:13:41
B:And then the final one on my list here is the dreaded moon dust, the regolith.
00:13:46
B: This is probably the most hated thing for Apollo astronauts on the moon.
00:13:50
B:They really, really did not like it.
00:13:53
B:Lunar soil, this dust is fine like powder.
00:13:56
B:It's even finer than I thought it was, but it's abrasive and sharp like glass.
00:14:01
B:This comes from mechanical weathering on the surface of the moon.
00:14:06
B:The rocks have been fractured by meteors and micrometeorites over billions of years.
00:14:10
B: And it makes them really, really tiny, but they stay sharp because there's no wind and there's no water erosion.
00:14:17
B:So they stay basically nasty forever.
00:14:20
B:Anakin Skywalker would hate this far more than sand.
00:14:24
B:It not only gets everywhere, it eventually damages whatever it comes in contact with.
00:14:29
B:It even causes what the Apollo 17 astronauts called lunar hay fever.
00:14:34
B:Ever hear of that?
00:14:35
B: Of the 12 men that walked on the moon, every one of them got this lunar hay fever.
00:14:41
B:I mean, and this was from the moon dust, the moon sand, the regolith.
00:14:46
B:It was sneezing and nasal congestion, and sometimes it took many days for it to even fade, but they all got it.
00:14:52
B:And that's just over a weekend.
00:14:54
B:They were there from like
00:14:56
B: Just like a day or three.
00:14:58
B:They weren't there very long.
00:15:00
B:And get this, they've done experiments with analogs.
00:15:03
B:They created this analog for the regolith and they showed that long-term exposure would likely destroy lung and brain cells.
00:15:11
S: That would be bad.
00:15:39
B: Just for a weekend is basically okay.
00:15:41
B:You know, it's it's, you know, it's fairly safe, but it can be annoying.
00:15:44
B:Specifically, the moon dust was the most annoying because none of the other big players came into play, right?
00:15:52
B:There were no micrometeoroids or...
00:15:55
B: You know, micrometeorites or radiation or any of that that hit them.
00:15:58
B:So it was fairly safe.
00:15:59
B:But if you go beyond that, though, you go beyond just a weekend, like is what we're planning, right?
00:16:04
B:We're trying to make much more permanent stays on the moon.
00:16:07
B:It just gets increasingly and increasingly deadly.
00:16:09
B:All right, so that's my story.
00:16:11
B:That's my background on why it's so nasty on the surface of the moon.
00:16:16
B:This latest news item starts with an open pit in the Mare Tranquillitatis, the Sea of Tranquility.
00:16:23
B:It's always such a beautiful name.
00:16:24
B: So now the Sea of Tranquility, it looked like a sea, right, to early moon observers, but it's really just an ancient lava plain.
00:16:31
B:And it's also where the Apollo 11 astronauts, right, Neil Armstrong and Buzz Aldrin first set foot on the moon.
00:16:37
B:But I mean, this is a lava plain, right?
00:16:39
B:Lava was flowing through here.
00:16:40
B:You know, there was like basaltic lava all over there many billions of years ago.
00:16:44
B: Now, these lunar pits were identified in 2009, which is actually a little bit later than I thought they were.
00:16:51
B:But 2009, they were first really identified.
00:16:55
B:And that's probably because they look like normal craters from a distance.
00:16:58
B:But if you look closer, you see that's not an impact crater.
00:17:02
B:It looks more like a collapsed roof or a skylight, if you will, rather than that impact.
00:17:08
B: Now, by now, hundreds of these pits have been found.
00:17:10
B:And the speculation is that many of these are lava tubes, billions of years old.
00:17:16
B:Basaltic lava was flowing through, was all through that area, through these mares.
00:17:21
B:They created these lava tubes.
00:17:23
B: And eventually they drained away, leaving the empty tubes.
00:17:25
B:And we've got plenty of these on the Earth.
00:17:28
B:On the Moon, they could be potentially even bigger because of the low gravity.
00:17:33
B:Now, the specific pit in the Sea of Tranquility is 100 meters in diameter and 150 meters deep.
00:17:39
B:So this is kind of big.
00:17:40
B:The difference though, this pit is special because this pit was overflown by NASA's Lunar Reconnaissance Orbiter, and it was done
00:17:48
B: Even more importantly, at a relatively oblique angle.
00:17:53
B:So that relatively low angle allowed the radar to enter the tunnel at, say, 45 degrees instead of straight up and down.
00:18:00
B:And that allowed the radar to actually get under that overhang, the pit walls.
00:18:06
B: You know, the pit sides going down.
00:18:09
B:And so we kind of went under and then the radar kind of bounced around a little bit before coming back out.
00:18:15
B:And this is what showed that the underground area under the pit extended to at least 170 meters so far, you know, wider than the actual hole going in.
00:18:26
B:So this was at least 170 meters.
00:18:28
B:And the researchers thought this was extraordinary.
00:18:31
B:And it is because clearly there is
00:18:33
B: There is some sort of area underneath this pit that's bigger than you might imagine just from looking at the pit opening itself.
00:18:42
B:So they thought this was extraordinary.
00:18:43
B:And like good scientists, they figured, well, let's validate this because this is kind of amazing.
00:18:49
B:So let's see if we can validate this in some other way.
00:18:52
B: And so the direction they decided to take was to create a 3D computer model of the pit matching the visible geometry, the known geometry of the pit from images, using basically like 3D stereoscopic images.
00:19:45
B: And whatever the model is saying now, whatever it's concluding, could be potentially true.
00:19:49
B:So the model made a couple of different solutions, and there was only one solution that was geologically plausible.
00:19:56
B:And that solution contained a big cave conduit that was up to 170 meters long, but could be even bigger, they say.
00:20:05
B:So that was their conclusion.
00:20:06
B:So according to these researchers, there's very likely to be a sizable subsurface cavern or tunnel on the moon.
00:20:14
B: And in their mind, it seemed like this is basically a done deal.
00:20:18
B:Their confidence levels are very, very high.
00:20:21
B:And that's awesome from my point of view, obviously.
00:20:23
B:But it's also, at the same time, I feel like, yeah, it's about time we confirmed this because it seemed, you know, looking at these pictures, it seemed pretty obvious that there was some sort of space underneath these pits bigger than you would think.
00:20:36
B:So I'm just very
00:20:39
B: Kind of happy and relieved that they finally are really accepting this.
00:20:43
B:All right, so what's the next step here?
00:20:45
B:The next step is to determine how big this is.
00:20:49
B:Because think about it, we have the radar going straight down in one direction.
00:20:54
B:So we know that this is kind of an extended kind of tunnel like 175 meters or more.
00:21:00
B:But what they need to do is they need to do more flybys, but from different angles.
00:21:05
B: So when you hit it from different angles, you're looking at different areas of this subsurface cavern.
00:21:10
B:Is it very narrow, making it a tube or not?
00:21:15
B:So they say right now that even though they have really no idea how wide it is, it's probably almost certainly 55 to 60 meters wide, which would mean it's probably a lava tube.
00:21:26
B:But they say that it could potentially be hundreds of meters wide, which would make it more cave-like than tube-like.
00:21:33
B: So, you know, it could be a lava tube, it could be a bigger, it could be a gargantuan lava tube, or perhaps more of a cave-like system.
00:21:39
B:They're not sure, and they say that the only way to do it is to do more flybys, which I hope we really do.
00:21:45
B:Okay, so the low-gravity elephant in this pit is the idea that if it really is roomy down there, then it would make a great location for a moon base alpha.
00:21:57
B: The scientists actually say this.
00:21:58
B:They say in their paper, this discovery suggests that the MTP, the pit basically, is a promising site for a lunar base as it offers shelter from the harsh surface environment and could support long-term human exploration of the moon.
00:22:12
B: You know, in my mind, it's not only fun to think of colonies in these lunar caves.
00:22:17
B:And of course, the protection they would offer would be really dramatic.
00:22:20
B:And that's why I went into some detail about how dangerous the surface is.
00:22:25
B:So it would be so much safer down there.
00:22:27
B:It seems like a no brainer in many ways, since this cave is already there.
00:22:32
B:You know, because once you're in this cave system, the radiation, the micrometeorites,
00:22:38
B: All that stuff goes away, and get this, the temperature difference goes away as well, because I found a study that looked into what the temperature could potentially be in these pits, and some researchers are saying it could be consistently 63 degrees Fahrenheit.
00:22:55
B:I don't know exactly what that is in Celsius, but that's nice.
00:22:58
B:That's nice weather.
00:22:59
B:That's t-shirt weather.
00:23:01
B:I'm not sure how that works.
00:23:02
C: You wear a t-shirt in 63 degrees?
00:23:04
B:You are from different parts of the country.
00:23:06
B:That's cold to me.
00:23:07
C:Yeah, okay, light jacket, hoodie weather.
00:23:09
B:Very light, very light.
00:23:10
B:But to me, that's amazing.
00:23:12
B:I didn't do a deep dive on that paper, but even if that's not correct, even if it's much higher or even much lower, but a consistent temperature around that temperature would be amazing, totally amazing.
00:23:26
B: Now, of course, it seems pretty pie in the sky with modern technology, right?
00:23:31
B:Getting all the industrial equipment and people up there and working out how to build a moon base in such an environment as the moon is obviously going to be ridiculously hard.
00:23:42
B:We cannot do that right now.
00:23:44
B:And I think before we see anything substantial on the moon, even in these pre-made caverns under the moon's surface,
00:23:52
B: I think it's going to take a hell of a long time.
00:23:55
B:Steve, if you want to make a prediction, 100 years, 80 years, it depends on... Yeah, something like that.
00:24:02
S:I mean, it all depends on how many resources you want to put into it.
00:24:06
S: We are going back to the Moon, we are going to try to have a sustained presence on the Moon.
00:24:09
S:If we want to build a base like this, it would be a huge engineering effort.
00:24:13
S:I mean, as you said, think of all the equipment we have to bring down to the surface of the Moon.
00:24:17
S:It would take decades to do this kind of construction, even once we are permanently on the Moon.
00:24:24
S:But if we want to do it, we can do it.
00:24:26
S:We can do this with our current technology.
00:24:29
S:It's not a technology issue, it's just an effort and resource issue.
00:24:33
B: It really is.
00:24:34
B:And I think they're going to take how dangerous the surface is.
00:24:38
B:They're going to take it seriously.
00:24:39
B:And they're not going to immediately, of course, try to go into these caverns.
00:24:46
B:And by the way, I am waiting.
00:24:47
B:I hope I live long enough to see
00:24:49
B: The first images from a lander that's actually cruising around, you know, in one of these tunnels.
00:24:56
B:That would be an amazing moment.
00:24:58
B:And I think they will.
00:24:59
B:They're going to take this seriously.
00:25:00
B:So whatever they construct on the moon, they're going to make sure that, you know, you pile up enough regolith, you know, you create enough of a shield to protect you, not only from radiation, but for some of the nastier, you know, maybe some of the smaller micrometeorites.
00:25:15
B:They'll take protection seriously.
00:25:17
S: Yeah, if you made a protective structure on a moon base that had two to three feet of mooncrete on the outside, that would go a long way toward protecting from radiation.
00:25:27
B:Oh, absolutely.
00:25:28
B:That's basically a given.
00:25:31
B:They've got to do something like that.
00:25:33
B:Otherwise, it's like, oh yeah, we just lost all of our astronauts on the moon because they weren't protected enough from this solar event.
00:25:41
B:So yeah, I hope they take it very seriously and realize that
00:25:44
B: Yes, it's going to be very difficult to create a large and very safe structure on the moon.
00:25:51
B:It's just so easy.
00:25:52
B:Just go underground, man.
00:25:54
B:It's just right there.
00:25:55
B:And also, Steve, I know you mentioned in your blog that
00:26:00
B: The cave walls might be sharp, but I don't think they would be because the cave walls, this is just lava.
00:26:07
B:It wasn't mechanically weathered by being hit by micrometeorites over billions of years.
00:26:14
B:I think the surface of the tunnel itself would be fairly safe.
00:26:19
S: Yeah, that'd be nice.
00:26:20
S:Yeah, it depends.
00:26:20
S:Some lava tubes on the Earth, many are smooth.
00:26:23
S:Some are rough, though.
00:26:24
S:There are none that I can think of that are sharp.
00:26:27
S:So hopefully that will be the same on the Moon.
00:26:29
S:It just depends on what the conditions are there.
00:26:31
B:Right.
00:26:32
B:And that's the other huge thing that I didn't probably stress enough.
00:26:35
B:The fact that we have a huge, very deep tunnel on the Moon right now could do amazing things for just scientific discovery.
00:27:00
B: Yeah, sure.
00:27:01
B:Eventually, what I think of that trip, though, I mean, even better than like, say, low Earth orbit, I mean, going to the moon for a week would be, you know, once it was safe, I think it would be an amazing adventure once I mean, if it if it ever gets as routine as like, say, traveling across the planet.
00:27:19
B:I think there could be lots of people that would go who knows what's going to how it's going to happen.
00:27:24
S: All right, thanks, Bob.
00:27:25
S:Cara, tell us about AI Love.
00:27:30
S:Who's that?
00:27:32
C:Okay, so before I dive into this story, which was published in The Conversation, I read The Conversation a lot.
00:27:39
C:I know that we've talked about it on this show.
00:27:41
C:The Conversation is a website that has lots of different verticals.
00:27:45
C:And the authors of the pieces on The Conversation are academics.
00:27:50
C: So it's sort of a from the horse's mouth format.
00:27:55
C:And there's an article that came out recently called Computer Love: AI-powered chatbots are changing how we understand romantic and sexual well-being.
00:28:04
C:And it's by three different authors from the University of Quebec at Montreal, and I said that very American because I can't pronounce it in French, and a researcher at the Kinsey Institute at Indiana University.
00:28:18
C: So these are psychologists and- Wait, as in Alfred Kinsey?
00:28:21
C:Yeah, yeah, as in Alfred Kinsey, yeah.
00:28:24
C:Okay.
00:28:24
C:So these are researchers in psychology and sexology departments.
00:28:28
C:Well, sure, when you say Alfred Kinsey, it's like, there you go.
00:28:31
C:I mean, what else?
00:28:32
C: The first thing I want to know kind of from you all is, when is the last time, or do you regularly, interact with a chatbot?
00:28:41
C:Like I'm thinking I have interacted with chat bots when I need to like contact IT or customer service.
00:28:49
C: But I can't think of other times when I regularly interact with chatbots.
00:28:54
B:Would you call ChatGPT a chatbot?
00:28:57
C:I don't think so.
00:28:58
B:Okay.
00:28:59
C:Yeah, because I think it's something where you're having a back and forth conversation.
00:29:03
C:And so, you know, there are digital and AI powered assistants like Siri and Alexa.
00:29:08
C: And then we're starting to see more and more chatbots on the rise for a lot of different applications.
00:29:13
C:So I think my exposure to these chatbots really generally is just customer service, which means I hate them.
00:29:20
C:I hate them with a burning passion.
00:29:22
E:Speak to a person, please.
00:29:24
C:Exactly.
00:29:24
C:But there is a growing industry of chatbots for kind of all manner of services, one of which is romantic companions.
00:29:34
C:Apparently, there are over 100 AI-powered apps.
00:29:38
C: That offer romantic and sexual engagement.
00:29:43
B:Only a hundred?
00:29:44
C:Yeah, over a hundred.
00:29:46
B:And the people know that it's chatbot when they're doing it?
00:29:48
C:A hundred percent.
00:29:49
C:A hundred percent.
00:29:50
C:So some of the ones that they listed on here are MyAnima, M-Y-A-N-I-M-A, MyAnima.ai, Eva.ai, Nomi.ai, and Replika.
00:30:07
C: And these are different apps, I guess that you download to your phone, where because they're AI powered, these chatbots evolve, the longer you talk to them, they understand what you're interested in, they understand, you know, turns of phrase that you like to use, shortcuts, how much you emote, you know, sort of your affective stance.
00:30:27
B:Are they chat GPT based?
00:30:29
C: I think they're all different, but probably some of them are.
00:30:32
B:I would assume, right?
00:30:33
B:Yeah, I would assume so.
00:30:34
B:It'd have to be, at this point.
00:30:36
C:But yeah, I'm not sure what the like, what AI platform they're being built upon.
00:30:40
B:Right, right.
00:30:41
B:What's the target audience?
00:30:43
C:Anyone, I would think.
00:30:44
B:Anyone who's interested in this.
00:30:45
B:Anyone with a sex drive?
00:30:46
B:Yeah, so yeah, but who is interested?
00:30:48
C: And so that is the question, right?
00:30:50
C:And I think it's important for us to kind of approach this with an open mind and to start asking some important questions, because there is actually a growing body of scientific data on these topics.
00:31:06
C:There are a lot of studies across multiple disciplines asking questions like, can people feel something for a chatbot?
00:31:17
C: And the answer seems to be across the board, yes.
00:31:20
C:Yeah, people are forming emotional bonds.
00:31:24
C:Some people self identify as having fallen in love with the chatbot knowing that it's a chatbot.
00:31:30
C:And interestingly, there was one study that was cited in this coverage that showed that
00:31:36
C: When everyday people are engaging with either a potential romantic partner who is human, or an AI version, a chatbot, which is a potential romantic partner, that on average, people tend to choose a more responsive
00:31:57
C: Is that because they feel they can manipulate the conversation
00:32:07
C: I don't know.
00:32:08
C:Well, first of all, I don't know if anybody can answer that question.
00:32:11
C:So like, I think that's, that's a sort of a rhetorical question.
00:32:15
C:It's probably different for different people.
00:32:17
C:But I think that that may be one reason.
00:32:19
C:It's not the first reason I would jump to.
00:32:21
C:I would think it's because they are responsive.
00:32:24
C:They're engaged with you.
00:32:25
B:But not only that, from what you've said, they're kind of like Zeligs, where they adapt themselves to you.
00:32:31
B: Which you wouldn't you wouldn't really want somebody to do to a large extent in normal conversation with human to human because then it's just like weird.
00:32:43
C:I can almost guarantee you there are probably hundreds of studies out there that show that
00:32:49
C: People feel most heard, people feel most connected when you mirror their behaviors, when you respond in ways that are similar to how they talk.
00:32:58
B:But it seems like this system, from how you described it, maybe I'm making assumptions here, would do it to a much more dramatic degree.
00:33:07
B: Possibly.
00:33:08
B:Sure.
00:33:09
B:There's some conscious mirroring for sure.
00:33:11
B:And that's just kind of instinctive, and you're maybe not even aware that you're doing it.
00:33:16
B:And that's fine.
00:33:17
B:But it just made me think of interacting with somebody whose whole purpose is to not even be themselves, but to make themselves an extension of me.
00:33:29
B:And I don't think that's necessarily healthy, right?
00:33:33
C: I think that there are some assumptions being made in that statement that are not necessarily reflective of how most people are.
00:33:42
C:I think that if you, when's the last time that you guys don't have to answer this if you don't want to, but have any of you ever been on dating apps?
00:33:49
E: Yeah, sure.
00:33:49
E:I have never been on a dating app.
00:33:51
C:I didn't think so.
00:33:52
C:I'm like, I'm talking to a bunch of married men.
00:33:56
C:But on a dating app, very often you connect with somebody for the first time, you know nothing about them except for this over the top representation that they are trying to present to you.
00:34:06
C:And then when you start engaging, you start to recognize things like, oh, they don't know the difference between your and you're.
00:34:12
E:It's the resume, then the interview.
00:34:37
B: Yeah, for sure.
00:34:39
B:I remember to this day, I remember one of the most engaging back and forth I had with somebody on a dating app.
00:34:45
B:And it was incredible.
00:34:47
B:We had so much in common.
00:34:48
B:It really was a joy.
00:34:50
B:But again, having things in common is one thing, but having someone adapt to you on the fly over time
00:35:00
B: It reminds me of the Metamorph from the Next Generation episode where the woman actually attuned everything about herself to her mate so that she became the perfect mate for that person.
00:35:12
B:And it was like, that's just not right.
00:35:14
S:Bob, what's your point with all this?
00:35:17
B: My point is that two people that have many things naturally in common is fantastic, but having somebody who adapts to you on purpose just to get along, to me, that crosses a line.
00:35:32
S:Well, you may think it crosses a line, but the question is, how will people respond to that?
00:35:36
B: Yeah.
00:35:43
C:And I want to get into those implications.
00:35:46
C:And I think that sort of a takeaway from this is, yes, there could be a point where it was creepy, right?
00:35:50
C:Where somebody where your potential romantic chatbot partner felt like too sycophantic and too inflexible.
00:35:59
C:I could see that.
00:35:59
C:But I think most people would
00:36:18
B: Oh, if it's a good algorithm that does it seamlessly, of course.
00:36:24
B:There's a lot of parts of me that are human, after all, so I think I absolutely can be swayed by that.
00:36:32
B:But it's just the way it was presented, that it's adapting to you over time.
00:36:36
C:Yeah, that's what AI does.
00:36:38
C:It adapts.
00:36:39
B:Yeah.
00:36:39
C:That's like definitionally what it does, right?
00:36:42
C:And so the question here is, aside from the ick factor,
00:36:46
C: That Bob has flagged for himself personally, like his personal proclivities.
00:36:51
C:What are some of the legitimate moral, ethical, you know, what are the what are the actual potential problems?
00:37:00
B:It's like, there's there's a lot, I think,
00:37:04
B: For creating unhealthy relationships, it's like the way advertising markets men and women that are basically weaponized beauty with people that are amazingly
00:37:19
B: Good looking, right out of
00:37:36
B: Wear all this makeup and have cosmetic surgery so I can be that pretty.
00:37:39
B:So when you create a relationship based on that, you're creating a relationship with somebody who's unrealistic because they're so attuned to you that I think you would be unsatisfied with almost anybody else because they wouldn't be as attuned to you as this AI person.
00:37:56
C: So the outcome that you are identifying in this scenario is that you as the consumer are now going to be unsatisfied in real relationships or in, I should say, in analog relationships.
00:38:11
S: Well, I think the worst case scenario here in terms of the effects on people are that would these AI girlfriend or whatever apps, significant other apps, create an arms race to create the most addictive, the most appealing, the most
00:38:28
S: You know, everything that an AI could be, would that create completely unrealistic expectations of people in terms of relationships that no living person could ever keep up with, but at the same time, it could create the pressure for people to feel like they have to be now as good as the AI, and that could be extremely unhealthy.
00:38:49
C: Yeah, and, and I think that that social isolation concern, right, because the eventual outcome of that would be social isolation, it would be the lack of engagement.
00:38:59
C:I think that that is a legitimate concern.
00:39:01
C:And to me, that's sort of, I don't want to say it's the best case scenario, but I think an even more pernicious outcome is
00:39:10
C: A lack of growth.
00:39:13
C:It's a lack.
00:39:14
C:So the consumer, the end user is now not learning about things like empathy.
00:39:20
C:They're not learning skills in relationships like compromise.
00:39:23
E:They're not learning rejection either.
00:39:25
C: Yeah, they're not learning how to have resiliency when they are rejected.
00:39:32
B:I would argue that the best AI chatbot people would be ones that can potentially push you to be a better person.
00:39:42
B:That would be something that would be interesting as hell to have a relationship with an AI that could actually make you a better person from many different angles.
00:39:54
C: It would and researchers are working on developing that for that very purpose.
00:39:59
C:So think about one more time, just to kind of recap what was just said, if you're the end user, there is a potential outcome in which you become more and more socially isolated because you start to develop more and more unrealistic expectations of a partner, which as you mentioned, Bob, it's 100% already happening.
00:40:19
C:We see this with a lot of like,
00:40:22
C: You know, there's the whole incel movement, the involuntarily celibate movement.
00:40:26
C:We see this a lot when individuals have unrealistic expectations of what actual partnership looks like, when there's a sort of privileged or a self-centered perspective that my partner is there to serve me, to give me the things that I require and that I deserve in this world, as opposed to my partner as a human being.
00:40:49
C: And this is a relationship where we are egalitarian in nature, and we are compromising with one another.
00:40:56
C:And so yes, the first negative outcome is I am now alone because I had these expectations of people and then people kept failing me because they weren't as good as my chatbot.
00:41:06
C:But the second, which I believe is a more pernicious, is
00:41:09
C: Now I'm sort of running away on a negative feedback loop of training.
00:41:14
C:And I start to treat people the way I treat a chatbot.
00:41:17
C:And this comes back to the conversation we had, was it just last week, about engaging with robots in our natural environment?
00:41:25
C:And if I can, if I know that the robot doesn't have feelings, and I can treat it in a very particular way, is that going to affect how I treat people?
00:41:33
J:Right.
00:41:34
C: Now we're talking about one level more, which is an emulation of a person in one of the most vulnerable and intimate ways that you can engage with a person, where the psychological flexibility, the emotional maturity, having done the work on yourself is so fundamentally important to be able to have
00:42:12
C: It popped up while I was reading the article, is we talk about driverless cars a lot on the show.
00:42:16
C:And we talk about, are they safer?
00:42:17
C:Are they more dangerous?
00:42:19
C:It's the nuanced gray area of when there are driverless cars on the road and human drivers on the road that it's the most dangerous, because the way that they engage, if it was all just driverless cars, they would probably communicate with each other well, and there wouldn't be as much danger.
00:42:33
C:But because there's a mix, and that's what I worry about here, individuals dipping their toe into AI companions,
00:42:40
C: And then attempting, I don't know, analog human relationships.
00:42:45
C:How do they play off of each other?
00:42:47
C:How do they affect our humanity, really?
00:42:50
C:There's a whole other thing that they talk about in the article about us, just security, like basically just surveillance.
00:42:56
C:We know that most of these apps are collecting and selling personal user data, you know, for marketing purposes.
00:43:02
C:Imagine the
00:43:04
C: Intimacy, the intimate nature of that data, and just how potentially dangerous that could be exactly.
00:43:12
C:But on the flip side, as you mentioned, the researchers are actively doing a study right now, where they are assessing the use of chatbots.
00:43:21
C:This is directly from the article, quote, to help involuntary celibates improve their romantic skills and cope with rejection.
00:43:28
C: So most of the chatbots that are training chatbots on the market right now tend to be used for sexual health education.
00:43:35
C:So like helping understand, I don't know, consent or helping understand, maybe they're not even that sophisticated, helping understand STI risks and things like that, reproductive, you know, health.
00:43:45
C:But development of chatbots to help individuals learn interpersonal skills, to help them learn
00:44:09
C: They underscore the need, all of these issues and concerns underscore the need for, quote, an educated, research-informed and well-regulated approach for positive integration into our romantic lives.
00:44:22
C:But current trends indicate that AI companions are here to stay.
00:44:26
C: Like, this is the reality, right?
00:44:28
C:So how do we ensure that this reality is safe, that this reality is ethical, and that this reality is utilized for harm reduction, not for increasing harm?
00:44:41
C:And when we talk about harm, I mean physical, psychological, financial, all of it, because all of those things are at risk when we're talking about intimate relationships with basically black box AI.
00:44:54
C:All of those things are at risk.
00:44:55
S: Yeah, I agree.
00:44:56
S:That would be like the best case scenario.
00:44:57
S:That would be awesome to have, you know, AI companions or whatever, teachers, significant others that are programmed to make you your best self, to challenge you, to work on your, you know, your personality, your skills, all of that.
00:45:10
S:That would be great.
00:45:11
S:But you could also see this instantly becoming part of the culture wars.
00:45:15
S:It's like, what, now we got to be nice to these AI robots?
00:45:18
S:I mean, can't I just have my robot slave and be done with it?
00:45:20
S:You got to shame me about it.
00:45:22
C: It's so sad.
00:45:23
C:What does that say about you that you want a robot slave?
00:45:25
C:You know what I mean?
00:45:28
C:Let's self-reflect on that a little bit.
00:45:31
E:It's worth a shot.
00:45:32
E:Let's put it that way.
00:45:33
C:It's worth a shot, but it's not worth a shot in the dark.
00:45:35
C:It's worth a shot done very safely and cautiously.
00:45:39
E:Right, but won't there be bad actors out there who will just throw something together and
00:45:44
C: Always.
00:45:45
C:It's probably already happening.
00:45:46
C:I mean, apparently, one of these companies, they were saying we never wanted it, we never intended this to be sexual.
00:45:52
C:It was supposed to be like, like a friend, right?
00:45:56
C:Well, one of the companies, one of these many, were like, okay, this is like your AI friend.
00:46:00
C:And then people started having sex with them, you know, having cyber sex with them.
00:46:05
C:And they started having all of these, like intense relationships and said they fell in love.
00:46:08
C:And
00:46:09
C: When the developers realized it's being used in this way that we didn't intend, and there's some risks there, they cut out that functionality.
00:46:18
C:And that completely changed the AI's algorithm.
00:46:21
C:And all of a sudden, all of these people's friends or companions started to act really differently than they had before.
00:46:28
C:And people had psychological distress.
00:46:31
C:Like there were Reddit threads opening up, there were all of these different conversations like,
00:46:36
C: I feel rejected.
00:46:37
C:My girlfriend broke up with me.
00:46:39
C:She suddenly doesn't want me anymore.
00:46:41
C:And it was as if they were dumped by a human being.
00:46:45
C:And so they actually, under so much pressure, reinstated the functionality because it was so traumatic for their end users.
00:46:53
C:So like these are real life examples of the fact that this is happening.
00:46:57
E:Wow, we're talking about some fragile people?
00:46:59
C:Well, I mean, I don't know if that's a fair thing to say.
00:47:02
C:Well, I don't know.
00:47:06
C: Have you ever had a terrible breakup?
00:47:09
E:Yes, I did have a terrible breakup.
00:47:10
C:Were you a fragile person at that time?
00:47:13
E:At the time, I probably was.
00:47:15
C:Well, maybe you were just human.
00:47:17
E:Well, sure.
00:47:18
E:I mean, but I didn't mean to say that there are fragile and non-fragile people.
00:47:22
E:I think everybody has some fragility to them.
00:47:24
C: Right, I think that this is just a very vulnerable topic and a vulnerable experience.
00:47:30
C:When you open yourself up, and you really are, you know, your true authentic self, whether it's to an AI or to a human being, when you're sharing your deepest, darkest vulnerabilities with them,
00:47:41
C: That is, I think, actually a form of strength.
00:47:45
E:But we are talking about a group of people who otherwise can't find this among humans.
00:47:49
C:I don't think that's true.
00:47:50
C:I don't think that's a fair assumption.
00:47:51
E:You don't know.
00:47:52
E:You think they're going after people who are capable of socializing?
00:47:57
C:I don't think anybody's going after anybody.
00:47:59
C:I think these are apps available in the app
00:48:00
S: I think there are people who are more or less vulnerable to this sort of thing, but you don't have to be vulnerable.
00:48:06
S:I think this is just the human condition.
00:48:08
S:Just like anybody can get addicted to a video game, for example.
00:48:13
E: Right, but what was the first question we asked?
00:48:14
E:Who's the end user here?
00:48:15
C:I think it's anybody and everybody.
00:48:18
C:And when I use the word vulnerable, and this is me putting my psychologist hat on here, vulnerability is a form of strength.
00:48:27
C:To be ultimately vulnerable in a trusting relationship is to be very, very brave.
00:48:35
C: And when people are brave in that way, when they open themselves up and they really put themselves out there and they are vulnerable, the bravery comes in the ability to be hurt.
00:48:46
C:And being rejected when you are vulnerable is psychic pain.
00:48:50
C:And I have seen people become suicidal over that kind of pain.
00:48:55
C:I have seen people have incredibly intense psychological reactions to that kind of pain.
00:49:00
C:People who otherwise did not have mental illness.
00:49:03
C: So I think it's, it's, I'm only saying this, Evan, because I think it's unfair to assume that there's something fundamentally different about the types of people or the individuals using the, I think anybody could find themselves in that position.
00:49:16
S:Yeah, the instinct of, well, this couldn't happen to me, I think is naive.
00:49:20
C: Yeah, because we've all been through it with people, and that's an assumption.
00:49:26
C:Not everybody listening to this podcast has had their heart broken, but many people have had their hearts broken and they felt crazy in those moments.
00:49:36
B:Oh my god, yeah, you are just not yourself.
00:49:38
C:Yeah, and there's no reason to say that wouldn't happen with a chatbot.
00:49:41
S: Let's end with what for me is the bottom line, psychologically, neurologically.
00:49:46
S:Our brains function in a way that we do not distinguish between things that act alive and things that are alive.
00:49:53
S:If something acts alive, we treat it as an agent, as a living thing emotionally, mentally, with all that that comes along with that.
00:50:02
C: A hundred percent.
00:50:03
C:And then that agent gives you something you are craving.
00:50:07
C:You're in.
00:50:08
C:You are in.
00:50:09
S:All right.
00:50:09
S:These last two news items I call A.I.
00:50:12
S:Scams and Solar Clams.
00:50:15
S:What?
00:50:15
S:They rhyme.
00:50:16
E:Wow.
00:50:17
S:Evan, tell us about those A.I. scams.
00:50:19
E: A.I. scam.
00:50:20
E:The AI-driven scam ads: deepfake tech used to peddle bogus health products.
00:50:27
E:That was the headline and that is what caught my eye.
00:50:31
E: This was at a place called HackRead.com.
00:50:34
E:Had not heard of it before, but still I stumbled upon it.
00:50:38
E:The author's name is Habiba Rashid.
00:50:41
E:She writes that scammers are leveraging deep fake technology to create convincing health and celebrity endorsed ads on social media targeting millions of people.
00:50:51
E: Here's how to spot and avoid these deceitful scams.
00:50:56
E:Okay, that's good advice.
00:50:57
E:I'm intrigued.
00:51:01
E:Social media has always been a hotspot for scam advertisements, yes.
00:51:05
E:Still, recently, cybercriminals have been creating especially deceitful ads using deepfake technology and the allure of celebrity endorsements to exploit unsuspecting individuals.
00:51:18
E: A recent investigation by Bitdefender Labs highlights a surge in health-related scam ads on major social media platforms like Facebook, Messenger, and Instagram.
00:51:38
E: I found it to be both informative and a little bit strange, which I will get to.
00:51:46
E:The link goes to Bitdefender Labs.
00:51:49
E:Bitdefender is a product.
00:51:51
E:You may have heard of it.
00:51:53
E:They consider themselves a global leader in cybersecurity.
00:51:56
E:I think they've been around since 2001, so they have a pretty good footprint.
00:52:01
E:Bitdefender provides cybersecurity solutions with leading security efficacy,
00:52:05
E: Performance and ease of use to small and medium businesses, mid-market enterprises, and consumers.
00:52:11
E:Okay, well, despite the fact that this is a product, basically, that they've linked to, their website does have a lot of information on it, and they published an article on their website, and they have a section called Scam Research.
00:52:25
E:So that was the section in which this article appeared.
00:52:28
E:And it says, a deep dive on supplement scams.
00:52:31
E: How AI Drives Miracle Cures and Sponsored Health Related Scams on Social Media.
00:52:36
E:So this is the source material for that original article.
00:52:40
E:There are four authors here, all with names that I would definitely be mispronouncing, I'm certain.
00:52:46
E:But they are Romanian.
00:52:47
E:I looked up a couple of the names.
00:52:49
E:They appear to all be Romanian, four Romanian authors here.
00:52:52
E:And I think we'll link to this so you can go ahead and give the article a read for yourself.
00:52:57
E: I think this was translated from Romanian to English, and when you go and you read it, it just feels a little off in a way.
00:53:05
E:I don't know, tell me if you feel the same about that when you read it.
00:53:08
E:It felt a little odd to me.
00:53:09
E:But in any case, here's their deep dive.
00:53:11
E:They start by talking about how sponsored social media content is on the rise.
00:53:16
E:Okay, that's no surprise.
00:53:18
E: But hand in hand has been the rise of scams in the form of phony ads on social media.
00:53:23
E:And by phony, I mean that the faces and the voices that often accompany the product being sold are either outright AI fabrications, or they're AI versions of people who really exist, and they're basically deep faking those consumers.
00:53:38
E: Here's what the article says.
00:53:40
E:Researchers at Bitdefender Labs collected and analyzed health-related scams across the globe over a three-month period from March through May of 2024, so very recently.
00:53:51
E:And here were their key findings.
00:53:53
E: Number one, a marked increase of health-related fraudulent ads leveraging AI-generated images, videos, and audio, promoting various supplements, especially on Meta's social platforms, Facebook, Messenger, and Instagram.
00:54:09
E:Number two, the highest number of followers on a compromised or fake page that promoted false advertisements was over 350,000.
00:54:20
E: That's not insignificant.
00:54:22
E:Scammers used over a thousand different deepfake videos across all communities.
00:54:27
E:They discovered that there were over 40 medical supplement ads that were promoted among these.
00:54:33
E:Most of the ads catered to targeted geographical regions with tailored content using the names of celebrities, politicians, TV presenters, doctors, and other healthcare professionals in order to bait those consumers, including people like, well, Brad Pitt.
00:54:49
E: Or Cristiano Ronaldo, soccer player, football player.
00:54:54
E:George Clooney, we certainly know who that is.
00:54:57
E:Dr. Ben Carson, I think most of us know who that is.
00:55:00
E:Bill Maher, sorry.
00:55:02
E:Denzel Washington.
00:55:04
E:Someone named Dr. Heinz Lüscher and a bunch of other doctors that are apparently either in Romania or somewhere in Eastern Europe, they have some sort of celebrity footprint to them.
00:55:15
E:OK.
00:55:16
E: The campaigns targeted millions of recipients across the globe, including Europe, North America, the Middle East, and Australia.
00:55:22
E:So basically, you know, practically everywhere.
00:55:23
E:Oh, Asia as well.
00:55:25
E:These are highly convincing messages that are grammatically correct and in the context with the ads.
00:55:31
E:In other words, not so easy to spot, right?
00:55:34
E:I mean, we've been able to look at some things that have been faked and we can pull out some irregularities about them that would denote them as fake, but they said, nah, for the most part,
00:55:46
E: Things here are pretty good.
00:55:48
E:They said most of the videos show clear signs of tampering though, if you're an expert and you know what to look for.
00:55:54
E:But they also found instances that were very difficult to put into the deepfake category.
00:55:59
E:So it's becoming more and more sophisticated is basically what they're saying.
00:56:04
E: The scammers exploit individuals who are desperate in finding a solution or treatment that will help them ease their symptoms or even cure chronic underlying diseases, they say.
00:56:13
E:And they said some of the most observed scenarios are depicted in these examples.
00:56:17
E:Number one, advertisements are described as alternatives to conventional medicine.
00:56:21
E:Where have we heard about that before?
00:56:23
E: The decline in trust in conventional medicine, aggravated by many scandals within the pharmaceutical industry, is used to prompt consumers into seeking alternative solutions.
00:56:53
E: And right, if you did have, say, a doctor who has some sort of either notoriety, celebrity, whatever, and you're able to use the AI to make that image say whatever it is you want it to say, that definitely is going to have an impact on how people see the particular product.
00:57:13
E: Here's where I thought it started to get a little bit interesting and a little bit weird.
00:57:16
E:They talked about the anatomy of a supplement scam campaign, and they basically used it as their example.
00:57:23
E:It starts with fraudsters crafting social media pages to spread misleading advertisements.
00:57:29
E:They spotted thousands of these pages that promote cures for common ailments.
00:58:01
E: It's interesting though, because in their example of this is they point to a deep fake of someone named Dr. Heinz Luescher, L-U with umlauts over it, L-U-S-C-H-E-R, who is apparently well known, not in America, but in parts of Europe, perhaps Romania and some other places, and they've used a deep fake of him, okay, and basically promoting whatever it is, a supplement of some kind.
00:58:27
E:But then I went and I actually looked up this doctor online,
00:58:31
E: And he's basically in integrative medicine and complementary medicine and does all the other things that we talk about.
00:58:41
E:So that's legitimately who this guy is.
00:58:43
E:But they're talking about faking the fact that here's this fake version of this person talking about something else that he normally doesn't talk about, whether it's a supplement or whatever.
00:58:57
E: So it's kind of a scam of another scam trying to trick people covering up another scam, right?
00:59:06
E:That's where it kind of got a little weird for me that they use this particular person as their example of how one of these campaigns go.
00:59:17
E:Because if you look at the truth of this guy, what he's doing is kind of a scam anyways to begin with.
00:59:23
E: Don't fall for the scam of the scam.
00:59:25
E:It's weird.
00:59:26
S:Yeah, well, it's a frightening look at what we're in store for.
00:59:29
S:It's very easy now for people, individuals, small corporations, whatever, to mass produce fake reality, fake endorsements, fake claims, fake scientific education, fake news articles.
00:59:43
S: The only real solution I see to this is very careful and rigorously enforced regulations.
00:59:53
S:There's really no bottom-up solution to this.
00:59:56
E: Yeah, you can't expect the consumer to have a deep level of sophistication in understanding the nuances of things, the AI deep fakes that are going on.
01:00:08
S:Yeah, you can't expect everybody to be constantly filtering out massive amounts of fraud every day of their life.
01:00:16
S:It's not practical.
01:00:17
S:I mean, who wants to live like that?
01:00:18
E:No, not practical.
01:00:20
E:It's not a practical way of going about things.
01:00:23
E: You know, so kudos to them to kind of bringing this to everyone's attention at the same time.
01:00:27
E:I think they could have used some better examples.
01:00:30
S:All right.
01:00:30
S:Thanks, Evan.
01:00:31
E:Thanks.
01:00:32
S:Okay.
01:00:32
S:So Solar Clams.
01:00:35
E:Solar Clams.
01:00:36
S:Yeah.
01:00:37
E:It sounds like a sci-fi movie from the fifties.
01:00:40
S: So what's up with these guys, right?
01:00:42
S:So this is an interesting study.
01:00:45
S:You will file this one under the biomimicry, right?
01:00:47
S:We like when technology is inspired by the millions, hundreds of millions, or whatever years of evolutionary tinkering that living things have done to perfect anatomical solutions to problems, and then we piggyback on that evolutionary tinkering to get inspiration for our own technology.
01:01:08
S: All right, so in this case, we're looking at clams.
01:01:11
S:These giant clams are photosymbiotic animals, right?
01:01:17
S:So they have a symbiotic relationship with photosynthetic algae.
01:01:24
S:The algae exists, these are single-celled algae creatures.
01:01:28
S:They exist in these vertical columns on the surface of the clams.
01:01:35
S: And they use light for photosynthesis to create food, and some of that food gets eaten by the clams.
01:01:43
S:So what the researchers were looking at, this is Allison Sweeney, who is an associate professor of physics and of ecology and evolutionary biology at Yale.
01:01:53
S: What she and her colleagues were looking for is the anatomy of these photosynthetic structures on the clams and how that relates to their quantum efficiency.
01:02:04
S:Bob, have you ever heard that term quantum efficiency before?
01:02:08
B: I don't think I've heard that term.
01:02:11
B:Quantum efficiency?
01:02:13
S:It's not as complicated as it sounds.
01:02:19
S:Quantum efficiency is the measure of the effectiveness of an imaging device to convert incident photons into electrons.
01:02:27
B: Did they really need quantum in that term?
01:02:29
S:So, for example, if you have 100% quantum efficiency, a system that's exposed to 100 photons would produce 100 electrons, or in the case of photosynthetic creatures, 100 photons would produce 100 reactions of photosynthesis.
01:02:46
S:So, you're using all of the photons, basically.
01:02:50
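Steve's definition can be captured in a couple of lines of code; this is a minimal sketch, with the 100-photon example from the discussion (the numbers are illustrative):

```python
def quantum_efficiency(incident_photons: int, conversions: int) -> float:
    """Fraction of incident photons converted into useful events:
    electrons in an imaging sensor, or photosynthesis reactions in algae."""
    return conversions / incident_photons

# A perfect system: 100 photons in, 100 photosynthesis reactions out.
print(quantum_efficiency(100, 100))  # 1.0, i.e. 100% quantum efficiency
```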
S: So what they wanted to find out was what was the quantum efficiency of the photosynthetic algae in these clams.
01:02:58
S:And what they found was that they're quite high.
01:03:02
S:They had a quantum efficiency.
01:03:04
S:So what they did was they modeled the quantum efficiency.
01:03:06
S:They just said, OK, we're going to make a model of just the anatomical structure of these clams and how the algae is organized in these vertical columns.
01:03:14
S:And they found that the quantum efficiency was 42%.
01:03:18
S: However, we know from direct measurements that these photosymbionts have a higher efficiency than that.
01:03:28
S:So they figured there's something missing from the model.
01:03:32
S:So then they included new information having to do with the dynamic movement of the clams, because the clams will open and close their mouth, and when they do this, it stretches the vertical columns so they become
01:03:44
S: So when you include this dynamic movement
01:04:04
S: Now to put things into context, just a tree in the same part of the world, like a tropical environment, would have a quantum efficiency of about 14%.
01:04:13
S:That's a lot less.
01:04:15
S:So these clams are incredibly efficient in terms of their three-dimensional structure, in terms of their quantum efficiency, and they're in fact the most efficient structures that we've ever seen.
01:04:30
S: But interestingly, trees like boreal forests in the northern hemisphere, in northern latitudes that are far away from the equator, they have similar levels, although not quite as much, but similar levels of quantum efficiency.
01:04:44
S:Again, makes sense.
01:04:46
S:They use sort of the vertical structure in order to maximize their efficiency because they don't have as much light.
01:04:54
S:They've got to make the most of the light that they get.
01:04:56
S:There's another aspect to this as well.
01:04:59
S: In terms of the anatomy, and that is that the top layer of cells over the columns that hold the algae scatters light.
01:05:11
S:It's a light scattering layer of cells.
01:05:14
S: And so that light scattering also, these are the iridocytes, is the name of the cells.
01:05:21
S:The iridocytes, they scatter the light, which maximizes the absorption of photons as well, right, because the light's bouncing around and it has multiple opportunities to be absorbed.
01:05:32
S: So these are the main takeaways from this.
01:05:35
S:You have a light scattering layer, you have vertical columns, and you have some kind of dynamic adaptation to the amount of light and the angle of the light, etc.
01:05:48
S: To maximize the quantum efficiency and you can get up into the 60s, you know, 67% in their model.
01:05:56
S:The obvious implications of this is that we want to use this knowledge in order to design more efficient solar panels, right, photovoltaics.
01:06:06
S: Some of these things are already being incorporated in design, like scattering light, using multiple layers, using sort of vertical structures.
01:06:13
S:But obviously this information could be very, very useful for attempts at improving those designs.
01:06:20
S:Right now, for context, again, a commercially available silicon solar panel is about 22, 23% efficient, which is really good. When we started following this tech
01:06:31
S: 20 years ago, you know, it was maybe 10-12% efficient, so it's almost doubled since then.
01:06:36
S:Maybe there's the potential, I don't know if we can get all the way up to 67%, but even if we get up to like 40% or 45% from where we are now, imagine twice the electricity production from the same area.
01:06:49
S:That's huge.
01:06:50
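The arithmetic behind "twice the electricity from the same area" is just the linear scaling of output with efficiency. A rough sketch (the irradiance and area figures are illustrative assumptions, not from the show):

```python
PEAK_IRRADIANCE_W_PER_M2 = 1000.0  # rough full-sun irradiance at the surface
AREA_M2 = 10.0                     # hypothetical rooftop array

def panel_output_watts(efficiency: float) -> float:
    # Output scales linearly with efficiency for fixed area and sunlight.
    return PEAK_IRRADIANCE_W_PER_M2 * AREA_M2 * efficiency

current = panel_output_watts(0.22)  # roughly today's commercial silicon
doubled = panel_output_watts(0.44)  # hypothetical doubled efficiency
print(current, doubled)  # 2200.0 4400.0 -- same area, twice the power
```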
S:You know, anything that makes solar panels more cost-effective is great, of course.
01:06:57
S: Now you also have the organic solar panels; the best ones now
01:07:02
S:are getting up to like 15% efficient, which is not quite as good as the silicon, but they are soft, flexible, durable, cheap.
01:07:12
S:So they're getting close to the point where they're like really commercially viable for more and more applications.
01:07:19
S: Now, if we could apply this kind of technology to some combination of perovskite, silicon, organic, or whatever, some combination of these solar panels, we could get to the point where it's going to be so cheap and easy to add solar panels that they're just going to be everywhere, right?
01:07:37
S:It's going to be, why not put them on everything?
01:07:41
S:That would be nice.
01:07:42
S:Yeah, that would be nice if we get to that point.
01:07:44
S:So, this is just one more study adding to the pile of these sort of
01:07:48
S: Basic science and incremental advances in photovoltaic tech that is the reason why solar is getting so much better, so much cheaper.
01:07:59
S:And it's just good to see that the potential here for efficiencies north of 40-50% is just incredible.
01:08:09
B:Nice.
01:08:10
B:Looking forward to that day.
01:08:12
S: All right, so there's no Who's That Noisy this week.
01:08:15
S:Jay will just pick that up next week.
01:08:17
S:But I do have a TikTok from TikTok segment for this week to make up for it.
01:08:25
S:So every Wednesday, usually starting around noon, we do some live streaming to various social media, to TikTok, of course, to Facebook, to Twitter, to whatever else Ian streams to.
01:08:42
E: MySpace.
01:08:43
S:Yup, to MySpace.
01:08:45
S:Friendster.
01:08:46
E:Friendster.
01:08:49
E:If you need to find us, just use Ask Jeeves.
01:08:52
S: One of the videos I covered this week is by a TikToker called Moonloops, and she was telling the story of—this is like real Food Babe territory, just to warn you.
01:09:05
S:She took her autistic child to a doctor to get some blood tests, and among the screening they did, they tested for heavy metals.
01:09:16
S: And she reports that the antimony level was in the 97th percentile.
01:09:23
S:But she had no idea where antimony could be coming from, right?
01:09:27
S:So she did an investigation, you know, and found that the power cord of the air fryer that she'd been using to make her child's food every day for the last couple of years has antimony in it.
01:09:44
C:Yeah, how would it get in the food?
01:09:46
E: Right.
01:09:47
S:Well, that's the question, isn't it?
01:09:48
S:Right?
01:09:49
S:How can antimony get from the power cord into the food?
01:09:53
S:Well, she didn't really do any scientific investigation.
01:09:56
S:She didn't close the loop on this evidence, so that was it.
01:10:00
S:Made a massive leap.
01:10:02
S:It must be the air fryer, so she threw out her air fryer.
01:10:05
S:She's telling everybody to throw out their air fryers because they're bad for you.
01:10:09
S:They're toxic.
01:10:10
S: So let's back up a little bit and deconstruct this.
01:10:14
S:So first of all, yes, antimony is a heavy metal and you can get heavy metal toxicity from it.
01:10:22
S:It's similar to arsenic.
01:10:25
S:There are safety limits that the FDA and the EPA sets for antimony.
01:10:29
S: So one question I have is, first of all, I don't know what kind of doctor she took the child to.
01:10:33
S:There are a lot of, you know, obviously fringe doctors out there, fringe labs, and why would they have tested him for antimony, of all things.
01:10:40
S:So that's curious.
01:10:41
S:Saying it was in the 97th percentile doesn't really tell us much either because what I want to know is the absolute number and is it in or outside of the toxic range, like is it in the safety range or not.
01:10:52
S: So just saying 97th percentile doesn't tell us.
01:10:55
S:Maybe 98% of people are in the safety range, you know, are within the safety limits, which is probably true.
01:11:01
S:So, you know, that again doesn't necessarily mean that it was too high, you know, it sounds high.
01:11:07
S: Also, if you do have a high antimony level, you're supposed to do a follow-up test with antimony-free test tubes, because you can get an artificially high level from the testing equipment itself.
01:11:20
S:So that first test is considered just a screen, and without the follow-up test, to verify, you don't know if it's real or not.
01:11:28
S:No indication that that was done.
01:11:31
S: Now, what about the antimony in the air fryer?
01:11:34
S:So, antimony is a metal commonly used in alloys in electronics.
01:11:38
S:As an alloying element, it tends to strengthen the other metals, right, that it's combined with.
01:11:46
S:And the use of antimony is actually increasing because it's also been recently discovered that it could improve some of the desirable properties of lithium-ion batteries.
01:11:53
S:So, if anything
01:11:55
S: Our use of antimony in electronics and battery technology is going to be increasing.
01:12:00
S:There are, I found, over a thousand household electronics that have antimony in their electronics, in their power cord or whatever.
01:12:09
S:So that's not uncommon.
01:12:10
S:Why focus on the air fryer?
01:12:12
S:Again, makes no real sense.
01:12:15
S:The big thing, the big hole: she didn't in any way demonstrate that the antimony that her son was exposed to, if it's real,
01:12:23
S: was coming from the air fryer, and it's not really plausible that it would get from the power cord into the food.
01:12:29
S:I mean, I have air fryers, I use air fryers.
01:12:31
S:The food goes in a basket, right?
01:12:34
S:There's no antimony in the basket that you're putting the food into, so there's really no plausible way that it should leach into the food.
01:12:42
S:You can't really argue that it's like being evaporated or anything because the melting point of antimony is like over a thousand degrees Fahrenheit.
01:12:50
S: And you'd have to heat it up even more to turn it into a gas, so we're not getting anywhere near those temperatures.
01:12:56
S:So it's just not plausible, not a plausible source of antimony.
01:13:01
S:Again, if it's even real in this case, which wasn't proven.
01:13:05
S: And so, there are more plausible routes of exposure.
01:13:10
S:Antimony is used in the preparation of PET plastic.
01:13:13
S:It's not in the plastic, but it could be a residue that's still left behind from the manufacturing process.
01:13:20
S:And water stored in single-use PET plastic bottles could get a little bit of antimony that leaches into that.
01:13:28
S: And that's probably one of the most common residential exposures.
01:13:32
S:Obviously, there's always the potential for exposure in the workplace if you're working in a company that uses antimony in its manufacturing process.
01:13:39
S:Although, apparently, from the research that I did, that's not a big problem.
01:13:42
S:It's just antimony is not something that people generally get exposed to, even industrially.
01:13:48
S:But residentially, it's not coming from your power cord.
01:13:51
S: If somehow you're getting exposed to antimony, you know, in your environment, that's not where I would be looking, you know, for the exposure.
01:13:58
S:It's probably from PET plastics.
01:14:01
S:That would be a much more plausible, I think, culprit there.
01:14:06
S:So, you know, this is the culture of TikTok.
01:14:10
S:Somebody who doesn't know what they're talking about, making huge leaps, huge assumptions, not doing anything even remotely scientific,
01:14:17
S: Not doing any serious investigation, just completely superficial, and then making massive leaps of logic and going right to the fear-mongering, and then just telling their followers to throw out this perfectly safe appliance, which actually is good for you in that it cooks with less oil than other types of cooking.
01:14:42
S: Yeah, I mean, there's nothing magical about an air fryer.
01:14:45
S:It's just a small oven.
01:14:46
C:Yeah, they're tabletop or countertop ovens.
01:14:48
S:It's just the air fryers are efficient because the space is very small.
01:14:52
S:It heats up and actually uses a lot less energy.
01:14:56
S:The food cooks a lot more quickly.
01:14:57
S:It's just an efficient design.
01:14:59
S:But what I'm finding actually is that the
01:15:03
S: Air fryers are the new microwaves.
01:15:05
S:And what I mean by that is that since microwaves have been around, there's been all these conspiracy theories surrounding microwaves, because people are afraid of it.
01:15:13
S:It's high technology, so people get anxious about that and they invent issues.
01:15:21
S:So there's been conspiracy theories about microwaves swirling around for decades, and now we're seeing the same thing with air fryers, just because they're new.
01:15:28
S:But again, they're just small ovens.
01:15:31
S: There's nothing magical about them.
01:15:34
C:The air fryer is great for frozen food.
01:15:36
C:I mean, it's great for a lot of things, but it's really great for frozen food.
01:15:38
S:Reheating pizza?
01:15:40
C:Reheating anything.
01:15:41
C:Yeah, it's really good for reheating stuff too.
01:15:43
S:I should get one.
01:15:45
S: OK, we've got one email.
01:15:47
S:This one comes from Daniel K. from LA.
01:15:50
S:Another rhyme.
01:15:52
S:He writes, I'm a longtime listener and fan of the SGU.
01:15:55
S:I have been reading more about climate change scientists and came across Dr. Judith Curry and her testimony on the subject that sounds straight out of the SGU critical thinking and following the data approach to skepticism.
01:16:07
S:What is your take and shouldn't this be open for discussion?
01:16:10
S:Then he gives a link to her testimony.
01:16:14
S: So yeah, so Dr. Judith Curry is a well-known climate change denier.
01:16:21
C:But she's also a climatologist.
01:16:23
S:Yes, she is a climate scientist.
01:16:26
S:Ouch.
01:16:26
B:She's one of the three percent.
01:16:27
S:It makes it more complicated.
01:16:28
C:Exactly, the one percent, yeah.
01:16:31
S: Right, so she is clearly an outlier.
01:16:35
S:She has opinions about the science behind anthropogenic global warming that are out of the mainstream, right?
01:16:45
S:So she disagrees with the 98% or whatever of her colleagues who interpret the evidence differently.
01:16:52
S:And she is known as a contrarian, and she's had this contradictory opinion for decades.
01:16:59
S: I don't know how this really, really all happened.
01:17:01
S:I don't know if this is, you know, maybe she's just not a very good climate scientist or she is just a contrarian generally, or maybe like early on she was not as convinced by the evidence or saw some problems with the evidence.
01:17:14
S: And then once she got into the position of being like the skeptic of climate change, she felt like she had to defend that position and couldn't, you know, get out of it and then like double, triple down.
01:17:28
S:I don't know what the process was.
01:17:30
S:What I do know is that her opinions on climate change have not held up well over time.
01:17:36
C: But I can imagine that in a vacuum, you know, without somebody standing next to her fact-checking her, she's using all the right lingo, she sounds like she knows what she's talking about, and she has all the right credentials, and that can be really confusing.
01:17:49
S:Her big thing is that she says that the data is more uncertain than
01:17:55
S: She actually kind of agrees that, yes, the Earth is warming, yes, it's due in part to human-generated greenhouse gases, including carbon dioxide, yes, this could lead to potentially catastrophic consequences, but just that there's way more uncertainty than what the scientific community and the international panel
01:18:18
S: Intergovernmental Panel on Climate Change is saying.
01:18:20
S:That's kind of been, you know, the drum that she's been beating.
01:18:26
S:But the thing is, when you get down to it, when you look at her specific opinions, they're not that far off of sort of mainstream climate change denial.
01:18:35
S: So, for example, let's go over some of the things that she said.
01:18:38
S:She said that global warming stopped in 1995, and she said it again in about 1998, and 2002, and 2007, and 2010.
01:18:51
S: So there's fluctuations in the background temperature, and this has been a ploy of, again, climate change deniers for a very long time.
01:19:00
S:Every time the curve turns down, you say, oh, look, climate change has stopped.
01:19:04
S:It's reverting to the mean or whatever.
01:19:06
S:But of course, the long-term trend has not changed.
01:19:10
S:We're still warming.
01:19:12
S:So she was wrong every time she said that global warming has stopped.
01:19:16
S: She also bought into the whole scientists-tried-to-quote-unquote-hide-the-decline thing, as sort of some kind of conspiracy to hide, I guess, the uncertainty, which has been completely debunked.
01:19:27
S:She's characterized the IPCC as alarmist, even though their predictions have underestimated climate warming. They've been underestimating it, and yet she's calling them alarmist.
01:19:41
S: So, just because she's a climate scientist doesn't mean that she's correct, right?
01:20:06
S:And this is a good general lesson about the argument from authority.
01:20:10
S:Reliable authority lies in a strong, hard-earned consensus of multiple scientists and experts, not one person's opinion.
01:20:19
S:It never rests on one person, because one person could be quirky, they could be a contrarian, they could just be wrong.
01:20:26
S: Now, having said all of that, I do think that the best way to respond to somebody like Dr. Curry is to just focus on their claims and debunk them, right?
01:20:38
S:Or just analyze them, see if they have any merit, and defend whatever opinion that they're criticizing with logic and evidence.
01:20:50
S: I just, that's the best way to deal with it.
01:20:52
S:It's actually not a bad thing.
01:20:54
S:You know, I think every science should have the contrarians on the fringe who are saying, but wait a minute, how do we know this is really true, and whatever.
01:21:03
S:Just to keep the whole process honest, I think it's fine.
01:21:07
S:You know, it actually, I think, helps the process.
01:21:09
S:The problem here, though, is a couple of things.
01:21:12
S: One is that there is a campaign of denial that is funded by the fossil fuel industry and that has been taken up by a political party, so we're not dealing with a good faith, you know, context here, a good faith community.
01:21:27
S: Whether or not she is deliberately part of that or not is almost irrelevant.
01:21:32
S:The problem is that even good faith, playing devil's advocate kind of science, then gets used by denialist, politically motivated, ideologically motivated campaigns.
01:21:44
S: The second thing is that there are massive important policy decisions resting upon what the scientific community says about climate science.
01:21:55
S:And so this is always tricky, you know, we're having scientific discussions
01:22:02
S: In the literature, among experts, and that's fine, but that then gets exploited and used by, again, people who are not acting in good faith, who are then trying to mine all of that for the purpose of political denial.
01:22:22
S:So that complicates the whole situation, right?
01:22:26
S: And whether intentional or not, Dr. Curry has lent a tremendous amount of aid and comfort to the climate change denial community who are not acting in good faith and have really hampered the world's response to what is a very serious and time-sensitive situation.
01:22:49
S: Right, so again, it's complicated, but when you drill down on her claims, they just don't hold water.
01:22:59
S:They have been pretty much utterly trounced by her climate expert colleagues.
01:23:08
S:All right, let's move on with science or fiction.
01:23:14
US#03:It's time for science or fiction.
01:23:24
S: Each week I come up with three science news items or facts, two real and one fake.
01:23:28
S:Then I challenge my panel of skeptics to tell me which one is the fake.
01:23:33
S:We have three regular news items this week.
01:23:35
E:Not two because Jay's not here.
01:23:37
E:You didn't lower it.
01:23:38
S:There is sort of a theme here.
01:23:40
S:There's a weak kind of theme.
01:23:41
S:They're regular news items, but there's a theme of good or bad.
01:23:45
S:Some of these are either really good or really bad.
01:23:48
S: Item number one, a new study finds that the risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccination.
01:24:05
S: Item number two, a recent analysis of primate genomes finds that shared viral inclusions reduce overall cancer risk by stabilizing the genome.
01:24:17
S:And item number three, researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
01:24:27
S: Hey, Cara, you seem very eager, so why don't you go first?
01:24:30
C:Okay, so the risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccination.
01:24:38
C:I could see this happening one of two ways, definitely.
01:24:43
C: If you decrease the risk of getting COVID, then you decrease the risk of then having long COVID symptoms after COVID infection.
01:24:53
S:This is not overall in the population.
01:24:55
S:What this is saying is that for people who got COVID, the risk of developing long COVID was reduced.
01:25:02
C: Right.
01:25:02
C:But even still, I'm wondering if that reasoning stands because for people who got COVID, the longer the pandemic went on, and the more they were vaccinated, or the more immunity they developed, the weaker their COVID infections were.
01:25:16
C:But then I wonder if there's like an almost equal and opposite way to look at this one where like,
01:25:22
C: For some people long COVID appears to be some sort of like autoimmune or like excessive immune response.
01:25:28
C:And if that's the case, yes, more COVID infection bad, but also maybe, maybe accumulation of vaccine.
01:25:36
C:I don't really know.
01:25:36
C:But I don't know, that one is feeling like science to me.
01:25:39
C:A recent analysis of primate genomes finds that shared viral inclusions
01:25:45
C: So, reduce overall cancer risk by stabilizing the genome.
01:25:49
C:Shared viral inclusion, shared by whom?
01:25:53
S:Primates.
01:25:54
S:These are viral inclusions that are found in all primates within the primate clade.
01:25:58
C:Oh, I see.
01:25:59
C:So among different, yeah, okay, gotcha.
01:26:01
C:Different species within this, okay.
01:26:03
C: So if there are viral inclusions there, that would reduce overall cancer risk by stabilizing the genome.
01:26:09
B:So these are viral genomic snippets that have incorporated themselves into our genome?
01:26:15
C:Across multiple species.
01:26:16
C:Yeah.
01:26:17
C:And so but like, I don't know, I mean, yes, cancer is like very largely genetic.
01:26:22
C:But you can have a relatively stable genome and still have like messed up oncogenes and messed up tumor suppressors.
01:26:29
C: Researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
01:26:36
C:We'll see, do we even have much of that left?
01:26:43
C:I think it's the cancer one that is the fiction.
01:26:46
C:I'm not exactly sure why.
01:26:49
S: All right, Bob.
01:26:50
B:This first one makes sense with the minimization of long COVID with the introduction of vaccinations.
01:26:58
B:Yeah, it just makes sense that the more people that have the attenuated
01:27:04
B: COVID would be more likely to not even experience long COVID.
01:27:08
B:It just seems very reasonable.
01:27:11
B:Let me go to the third one.
01:27:13
B:Global sea ice has decreased its cooling effect.
01:27:16
B:So yeah, I'm trying to figure out what the cooling effect would actually have been for sea ice.
01:27:21
B:And I don't think it's necessarily dramatic, but that's irrelevant because it's whatever cooling effect it has, it's decreased by 14%.
01:27:31
B:I wonder how they would have calculated that.
01:27:33
B: That seems somewhat reasonable, more reasonable than the second one, having these viral inclusions stabilizing the genome cancer.
01:27:42
B:Yeah, that just seems... Sure, it's not ridiculously impossible, and that would be great if it were true, but it just seems less likely than the other ones, definitely.
01:27:55
B:So I'll say that's fiction as well.
01:27:56
S: And Evan.
01:27:57
E:Yeah, I don't want to be alone on this one.
01:28:01
E:Well, I mean, mostly because from the get-go, I was really thinking of the three, the one I understand the least is the one about the primate genomes.
01:28:11
E:Whereas the other two, I kind of have at least some kind of sense for.
01:28:14
E:Obviously, I know what long COVID is.
01:28:17
E: Um, decreased over the course of the pandemic.
01:28:19
E:The only reason I think that one might be the fiction or could have been the fiction is because, you know, there have been these, what, during the course of the pandemic, there were peaks, right?
01:28:26
E:And then it went back down, but then it peaked again.
01:28:29
E:And I don't know if that had anything to do with how the numbers would have played out as far as determining the long COVID and if the vaccination, how the vaccination had an effect on that.
01:28:38
E:But you did say mostly due, not exclusively due.
01:28:41
E:So that's why I think that one's science.
01:28:43
E:And then, yeah, for the same reasons Bob said about the sea ice,
01:28:48
E: How much overall are we really talking about in the whole grouping of things that go into that?
01:28:54
E:So therefore, that only leaves me with the genomes as fiction.
01:28:58
S: All right, I guess I'll take these in order.
01:29:00
S:We'll start with number one.
01:29:01
S:A new study finds that the risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccination.
01:29:09
S:You guys all think this one is science.
01:29:12
S:Well, let me tell you first that the risk of long COVID has been decreasing over the course of the pandemic.
01:29:22
S: And it's due to two things.
01:29:25
S:One is the change of the variants, right, pre-delta to delta to omicron.
01:29:32
S:The later variants had less long COVID, but it's also due to vaccination.
01:29:37
S:So the question is, which one had more of an effect?
01:29:41
US#06:Well, this one is...
01:29:44
S: Science!
01:29:45
S:Yep, this is science.
01:29:46
S:Yep, the researchers found that vaccination accounted for about 75% of the reduction in the risk of getting long COVID following a COVID infection.
01:29:58
S:So yeah, it is one more great thing about the vaccines, they reduce your risk of getting the COVID, they reduce the severity of the COVID, and they reduce your risk of long COVID.
01:30:09
S:So yeah, that was mostly due to the vaccines.
01:30:11
S:The later
01:30:13
S: variants were overall sort of less virulent, although they spread more easily, they didn't cause as bad a disease, which is something that does typically happen during pandemics.
01:30:25
S:Okay, let's go on to number two, a recent analysis of primate genomes finds that shared viral inclusions reduce overall cancer risk by stabilizing the genome.
01:30:34
S:You guys all think this one is the fiction, and this one is the fiction.
01:30:42
E: Thanks, Cara and Bob.
01:30:43
C:I hate going first.
01:30:44
C:It's so stressful.
01:30:45
B:It is, right?
01:30:45
S:Because it turns out that these viral inclusions, these shared viral inclusions, increase the overall cancer risk by destabilizing the genome.
01:30:54
B:That actually makes more sense.
01:31:01
S: The authors write that they found that these viral inclusions cause transcriptional dysregulation.
01:31:07
S:And essentially what that means is that it's more likely for there to be the kinds of mutations that do lead to cancer, right?
01:31:15
S:Mutations that cause the cells to reproduce out of control or whatever.
01:31:19
S:So, yep, it increases the risk of cancer.
01:31:22
S: Now, these viral inclusions are very interesting from a basic science evolutionary point of view because they are an independent, powerful molecular evidence for evolution, right?
01:31:36
S:Because essentially, when, you know, some viruses have reverse transcriptase, they basically insert their DNA into cells and sometimes it gets into the germline and you have sort of this permanent addition of a bit of viral DNA in the genome.
01:31:52
S: When that happens, every descendant inherits it, right?
01:31:56
S:So, any future speciation, whatever, it carries through throughout all the descendants that pick it up.
01:32:03
S:Of course, sometimes it could be lost, but basically, you could see these patterns, these nested hierarchies of viral inclusions.
01:32:12
S: That are following the evolutionary tree of relationships and therefore it's really an awesomely powerful, independent evidence for the historical fact of evolution.
01:32:24
S:There's really just no way around it.
01:32:27
S:There is no non-evolutionary explanation for the pattern of inclusions that we see in nature.
01:32:35
B:Yeah, the other side of that coin, could they make you a superhero?
01:32:40
B: They could... Asking for a friend.
01:32:42
E:Asking for a super friend.
01:32:44
S:So you're not saying it's impossible.
01:32:46
S:Alright, let's go on to number three.
01:32:50
S:Researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
01:32:54
S:This one, of course, is science.
01:32:59
S: Yeah, this is one of those bad, positive feedback loops in climate change.
01:33:06
S: As sea ice melts, it reflects less radiation back into space; that reflection has a cooling effect, so the cooling effect is reduced, which leads to more warming, more ice melting, less cooling, more warming, etc.
01:33:20
S: So that's bad.
01:33:22
S: Because the sea ice has a pretty high albedo.
01:33:25
S: It reflects a lot of energy.
01:33:27
S: And the sea water, right, the oceans are very dark.
01:33:31
S: They have a very low albedo.
01:33:33
S: They reflect very little light.
01:33:35
S: So there is a dramatic difference between the ice and the non-ice-covered ocean.
01:33:42
S: So loss of sea ice can be a massive positive feedback effect on climate change.
01:33:48
S: One other interesting wrinkle was that the scientists found that this 14% reduction in the cooling effect is greater than just the reduction in the average surface area of sea ice over the course of a year.
01:34:05
C: You mean the decrease in the cooling effect is greater than the decrease in the surface area?
01:34:09
C: So it's not linear.
01:34:10
S: Yeah, so it's not a strictly linear relationship, which is interesting.
01:34:14
B: We won, woohoo!
01:34:15
C: We win.
01:34:16
E: Yes, winner.
01:34:17
C: What do we win?
01:34:17
S: What do you get?
01:34:18
S: Well, you all get to hear Evan's quote.
01:34:21
S: Evan, let us have the quote.
01:34:23
E: Woohoo!
01:34:25
E: Which you would have heard anyways, I suppose.
01:34:27
E: But let's not ruin the parade right now, yeah.
01:34:31
E: "In effect, we're all guinea pigs for the dietary supplement industry.
01:34:36
E: The vast majority of these supplements don't deliver on their promises."
01:34:41
E: Dr. Nick Tiller is the author of the book ''The Skeptic's Guide to Sports Science: Confronting Myths of the Health and Fitness Industry''.
01:34:51
US#06: Hey.
01:34:52
E: Yeah.
01:34:53
E: If you're anyone else but Douglas Adams, you borrowed that from us.
01:34:59
E: And Dr. Tiller will be happy to say hello to you when we are at CSICon in October as part of that conference, which is going to be cool.
01:35:08
E: And I'm featuring a bunch of quotes from people who are going to be at that conference specifically.
01:35:13
C: Oh, that's fun.
01:35:14
S: All right.
01:35:14
S: Thank you, Evan.
01:35:16
S: And thank you all for joining me this week.
01:35:18
E: Sure, Ben.
01:35:19
E: Thank you, Steve.
01:35:19
E: Thanks, Steve.
01:35:20
S: And until next week, this is your Skeptic's Guide to the Universe.
01:35:27
S: The Skeptic's Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
01:35:34
S: For more information, visit us at theskepticsguide.org.
01:35:39
S: Send your questions to info@theskepticsguide.org.
01:35:42
S: And if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community.
01:35:53
S: Our listeners and supporters are what make SGU possible.