SGU Episode 762

{{transcribing all
|transcriber = xanderox
|date        = 2020-04-28
}}
{{Editing required
|transcription          = y
|proof-reading          = y
|time-stamps            = y
|formatting             = y
|links                  = y
|Today I Learned list   = y
|categories             = y
|segment redirects      = y    <!-- redirect pages for segments with head-line type titles -->
}}
{{InfoBox
|episodeNum = 762
|episodeDate = {{month|12}} {{date|6}} <span style="color:blue">2035</span> 😉  <!-- I know this is supposed to be the broadcast date, but the show is from the FUTURE! -->
|verified = <!-- leave blank until verified, then put a 'y'-->
|episodeIcon = File:762 SGU from the future.jpg
|caption = "Greetings from the Future" art
|bob =y
|cara =y
|jay =y
|evan =y
|qowText = Science is the greatest thing known to humans. Through science we have been able to seize a modicum of control over the otherwise natural state of chaos throughout the cosmos. It is truly the most stunning achievement by a life form that emerged from the dust of the stars. In order for us to be the best stewards of our universe, we must continue the pursuit of science, and may it forever be our torch to light our way forward.
|qowAuthor = Alyssa Carson<ref name=Carson>[https://nasablueberry.com/about Alyssa Carson: Nasa Blueberry]</ref>, first resident of {{w|Moonbase|Armstrong Station}}, The Moon


|downloadLink = {{DownloadLink|2020-02-15}}
|forumLink      = https://sguforums.org/index.php?topic=51757.0
}}


== Introduction ==
''Voiceover: You're listening to the Skeptics' Guide to the Universe, your escape to reality.''

'''S:''' Hello and welcome to the {{SGU|link=y}}. ''(applause)'' Today is Thursday, December 6th, <span style="color:blue">'''''2035'''''</span>, and this is your host, Steven Novella. ''(audience laughter)'' Joining me this week are Bob Novella ...

'''B:''' Hey, everybody! ''(applause)''

'''S:''' And we're supposed to say it properly for an American.

'''C:''' Yeah, without an [inaudible].

'''S:''' And I have no idea where in the spectrum of "Mel-born" to "Mel-burn" to "Mel-bin"…

'''C:''' I keep trying to convince you.

'''B:''' Of course you've done it. And probably first class [inaudible].

'''S:''' What is it, about six hours across the…?

'''B:''' But, Jay, that big breakthrough that allowed the supersonic transport to become viable again was the fact that they designed the shape – you've seen the shape, it's a gorgeous, really elongated shape – but that minimizes the sonic boom by like a 1000th of what it used to be. And that's what was the big problem with it. Remember, what was it, the old one, the {{w|Concorde}} …

'''S:''' And when did we first talk about that? It was, like, 15 years ago.{{Link needed}}

'''B:''' Oh my god.

'''E:''' Long time ago.

'''S:''' And here we are, like just coming [inaudible].

'''B:''' Remember? I saw it. I think I saw it in a magazine the first time we were in this area. And I said, "Look at this. This is something that's really going to be big in the future." And it was.

'''J:''' It is.

'''C:''' Time-traveling a little bit here. ''(winks?)''

== Future "News" Items ==

'''S:''' So, it's 2035, so this is our 30th Anniversary year of doing the SGU and because of that, we're finishing up 30 years. We're going to talk about regular news items, but we're going to give more of a history, like, where does this fit into the arc of science and skepticism over the last 30 years of the SGU, right?

=== Québec Accord, Global Corporate Alliance <small>(3:10)</small> ===

'''S:''' So, Jay's going to start with a news item that has something to do with {{w|global warming}}. He didn't tell me what it is, but you're going to start by telling us where we've been, where we're going, where are we in this saga that we've been talking about, it seems like, for 30 years.

'''J:''' Well, yeah, I mean when we first started talking about this, I don't even know when we first started talking about this –

'''S:''' —I think right at the beginning, 2005, 2006.

'''J:''' —It was a mounting thing that, as the years went by, we started to talk more and more about it. And then somewhere around the late 2020s, we really started to talk about it, almost on every episode, to the point where listeners were emailing us, saying, "Okay, we get it. Global warming is bad news."

But we've seen a lot of bad things happen over the last 10 to 15 years where local governments, or governments in general, are doing absolutely nothing. They still can't get out of their own way, right? We know that, but nothing has really been happening. And then in 2027, when {{w|Venice}} got so flooded that it couldn't recover, that's when the world woke up.

'''C:''' It's too dangerous, guys.

'''B:''' But why didn't they try to just build up, like abandon the bottom five [inaudible].

'''C:''' They tried that.

'''J:''' It really hit a note across the globe when a lot of the art got destroyed. So that's when everybody—that's when I think we can kind of look back, as a marker, like the whole world took a pause.

So then in 2027, that same year, we had the Québec Accord happen, which was an absolute failure. I think Canada's heart was in the right place, but they tried to inspire the world to change. But governments just can't get out of their own way.

'''S:''' But think about it. Think about the {{w|Paris Agreement|Paris Accord}}, right, when was that? That was, like, 2015.

'''E:''' 2015.

'''S:''' Yeah, 2015. They said, "Okay, we're going to limit post-industrial warming to 2.0 °C above pre-industrial levels." And even though they knew that bad shit was going to happen at 2.0, really we needed to keep it beneath 1.5, which we hit this year, guys. This year we had 1.5 °C above pre-industrial level, 2035. So they didn't even try to ever get 1.5. They're like, "All right, let's just keep it below 2." And they failed to do that. What they agreed to wouldn't even accomplish that.

'''J:''' Yeah, there was no chance of them getting that.

'''S:''' And the Québec Accord, they're like, "All right, well, let's, maybe 3.0. Let's just keep it 3 °C above…"

'''E:''' Move the goalposts.

'''E:''' People, countries can exit as they wish.

'''C:''' I mean, remember back when {{w|Donald Trump|Trump}} just dropped the ball on it? {{w|United States withdrawal from the Paris Agreement|He just left}}. He just said, "No, Paris." I mean, we've been trying to make up for that ever since.

'''E:''' Gone.

'''J:''' So, the things that we've seen—it wasn't just what happened in Venice but, you know, the storms continued to become deadly, right? So we have people dying every time there's a storm, a big storm.

'''S:''' Seems like every hurricane's a {{w|Saffir–Simpson_scale#Category_5|CAT-5}} now.

'''C:''' Oh, and my city is constantly on fire. {{w|Los Angeles|LA}}, also {{w|Sydney}}, even {{w|Melbourne}}. It's on fire all the time now.

'''C:''' ''(laughs)''

'''S:''' I think it's like, "Yeah, you guys failed. You're hopeless. You're in total political gridlock. So, somebody's got to step in. So we got this. Go away. We'll [inaudible]."

'''B:''' So you're referring to governments in general, right?

'''C:''' Why would you ever think that?

'''S:''' Well, I mean it’s always complicated, all right? Companies sometimes do good things, right? And they get PR out of it, and then you say, “Okay, are they doing it because they really care about their customers, or do they really care about the planet?” They’re living on this planet, too, and some of their profits, actually—there are lots of companies who are losing profits because of climate change. So they’re invested in it as well, but then you have to wonder, are they ''just'' doing it for the PR, do they have an ulterior motive [inaudible]

'''C:''' But also, does that matter?

'''E:''' Some of them are.

'''J:''' So what, though? They’re signing on, but that—they’re the trillionaires. They have the money. They could be throwing down half their wealth to try to save the planet but that hasn’t happened yet.

'''S:''' That wouldn’t be enough.

'''E:''' ''(laughs)''

[inaudible] <!-- what is said here??? -->

'''B:''' Sorry.

'''C:''' I got like a whole decade ahead of me at least.

'''J:''' Do you still have social security? [inaudible]

'''C:''' No, it’s completely insolvent.

'''S:''' All right, so, now we have to wait for IKEA to save us, is that what you’re telling me?

'''C:''' No, the Global Corporate Alliance.

=== Fourth Domain of Life <small>(14:14)</small> ===

'''S:''' All right. Guys, let me ask you a question, especially Bob. How many {{w|Domain (biology)|domains}} of life are there?

'''B:''' Wait, there was—oh, crap. There’s bacteria, archaea, prokaryotes—

'''B:''' Whaa?

'''S:''' There’s a new, fourth domain of life.

'''B:''' Ooh, I know what you’re saying.

'''S:''' Well, hang on! We’ll get there.

''(laughter)''

'''S:''' Let’s back up a little bit.

====Revisiting GMOs <small>(15:00)</small>====

'''S:''' So again, we’re going to give the arc, right? We’re talking about {{w|genetic engineering}}, right? Initially, this kind of came on our radar around 2010, maybe 2012, that kind of area, right?

'''E:''' Of course they did.

'''S:''' All right. The {{w|American chestnut|American chestnut tree}}—there was a fungus, which was—

'''J:''' That was back in, when, like the 60s?

'''S:''' So there was the {{w|Gros Michel banana|Gros Michel}}, which died out in the early 20th century, and there was the Cavendish, which died out—

'''C:''' And that’s the one you guys always used to talk about. {{Link needed}} <!-- there was at least one 2019 episode that talked about the monoculture Cavendish’s risk of death --> You loved those weird Gros Michels.

'''S:''' They’re back, though.

'''J:''' I remember you cried when we found out that they were gone.

''(audience laughter)''

'''S:''' Well, what the hell? We knew it was coming for years, too. We were talking about it on the show. The banana’s going to be going.

'''C:''' ''(feigns crying)'' It still surprised you.

'''S:''' It still surprised me. Fusarium wilt, or Tropical Race 4, or Panama Disease, completely wiped out the Cavendish industry. I think the last holdout was South America, but it was detected in South America in 2019, and that’s when they knew "now it’s a matter of time." Once they had one banana that went ''thbbt'', that’s it.
 
'''B:''' Remember that? No ice cream sundaes for a little while?
 
'''S:''' We went years without a banana.
 
'''B:''' That was bad, man.
 
'''S:''' But even before that, before 2024, when the Cavendish was gone, back in 2017, Australian researchers had developed a Panama disease-resistant banana. <ref>[https://www.sciencemag.org/news/2017/11/gm-banana-shows-promise-against-deadly-fungus-strain# Science: GM banana shows promise against deadly fungus strain]</ref>
 
'''C:''' Oh, it came out of Australia? I didn’t realize that.
 
'''S:'''  It came out of Australia in 2017.
 
'''E:''' Well done! Well done, audience. Well done.
 
'''J:''' That was beginning of the banana hubbub.
 
'''S:''' It was the beginning of the banana hubbub—
 
'''E:''' I think also known as a "banana-rama".
 
'''C:''' Banana-rama.
 
'''S:''' Banana-rama…but, however, nobody really knew about it until the "bananapocalypse".
 
'''J:''' Bananapocalypse.
 
''(audience laughter)''
 
'''S:''' The bananapocalypse wiped out the Cavendish and then these Australian researchers were like, "Hey, we got the GMO."
 
'''E:''' "We got this."
 
'''S:''' We got the resistant banana.
 
'''B:''' We’re ready to go.
 
'''S:''' But the thing is, even that might not have—
 
'''B:''' "We got this."
 
'''C:''' ''(laughs)''
 
'''S:''' "We got this," right. Even that might not have been enough because the Cavendish—I love it, it’s a desert banana. It was the number one export fruit before it was wiped out.
 
'''J:''' That banana fed countries.
 
'''S:''' Well, no, no, not ''that'' banana — other bananas.
 
'''J:''' What other bananas?
 
'''S:''' There are staple bananas that are, basically, like what we would call plantains.
 
'''J:''' Oh, that’s right.
 
'''S:''' They’re starchy bananas, and you cook with them.
 
'''C:''' (in Spanish) ''[https://es.wikipedia.org/wiki/Pl%C3%A1tano_(desambiguaci%C3%B3n) Plátanos].''
 
'''B:''' They’re awesome.
 
'''C:''' Steve, why are you so into bananas?
 
'''S:''' I don’t know.
 
'''C:''' You’re really into bananas.
 
'''S:''' I’ve just always loved them. My favorite fruit.
 
'''C:''' That’s fair.
 
'''B:''' He tried to grow them for years and failed utterly.
 
'''C:''' ''(laughs)''
 
'''E:''' That’s right! Remember, back in the teens [2010s]—
 
'''J:''' Did I ever tell you that I hated those goddamn banana plants?
 
'''S:''' They were in our studio.
 
'''C:''' ''(laughs)''
 
'''J:''' I know. They were getting in—and his cats were pissing in the banana plants.
 
'''E:''' The cats!
 
'''B:''' That’s what it was. I remember that.
 
'''C:''' I remember that! That’s when I first joined the SGU, way back then. They were in the basement.
 
'''J:''' Steve and I almost got into a fistfight once in our entire life and it was over cats pissing in the studio in the banana plants.
 
''(laughter)''
 
'''S:''' Those cats are dead now.
 
'''C:''' A little behind-the-scenes info.
 
'''S:''' Maybe I should try again. But anyway, something like 20% of the world are dependent on bananas for their staple calories.
 
'''E:''' That’s a lot.
 
'''S:''' When those started succumbing to versions of Panama disease, then we were starting to have Africa and Southeast Asia—there was starvation looming—that’s when the world’s like, "Okay, this is not just our ice cream sundaes anymore. We can’t feed these people unless we get these banana cultivars back online."
 
'''C:''' This GM technology is looking ''pretty'' good right now.
 
'''S:''' GM technology saved the banana industry and, basically, lots of starving Africans. And then—here’s the double whammy—2026, the citrus industry was completely wiped out by {{w|citrus greening disease|citrus greening}}.
 
'''E:''' That was awful.
 
'''C:''' I remember that.
 
'''B:''' That was horrible.
 
'''S:''' And again, we talked about that for at least 15 years before it hit. Remember Kevin [inaudible]? <!-- what last name? -->
 
'''C:''' He used to come on all the time.
 
'''S:''' He would always tell us, "Man, when citrus greening wipes out the citrus fruit—"{{Link needed}}
 
'''E:''' Then you’re going to see some—
 
'''C:''' He was right.
 
'''S:''' He was absolutely right. That objection to—so, of course, in 2031, the first GMO orange with resistance genes from spinach was planted. They were working on that for years and years.<ref>[https://gmoanswers.com/biotechnology-solution-citrus-greening GMO Answers: Biotechnology as a Solution to Citrus Greening]</ref> And it essentially resurrected the citrus industry, not only in Florida but also in Australia and in other parts of the world where they grow citrus.
 
'''C:''' Well now they can grow them pretty much anywhere. It was smart.
 
'''B:''' Remember they were selling {{w|screwdriver (cocktail)|screwdrivers}} half-price at the bars?
 
'''C:''' ''(laughs)''
 
'''S:''' So here we are. There’s 8.8 billion people on the planet.
 
'''C:''' God, that’s a lot of people.
 
'''S:''' It's a lot of people. Essentially, everyone knows, except for a shrinking fringe, that there is no agriculture without GMOs, bottom line. We would not be able to feed the planet without GMOs. There are still the extremists who are like, "Yeah, let 'em starve, and then everything will be fine."
 
'''J:''' Oh, great, yeah.
 
'''C:''' Well, those people are terrible.
 
'''E:''' Heartless.
 
'''B:''' They’re so marginalized now.
 
'''S:''' Now they’re totally—even Greenpeace, remember that? What was that, 2030 or something when {{w|Greenpeace}} was like, "Yeah, okay, I guess we have to feed people. We can’t let people starve."
 
'''E:''' It only took them decades.
 
'''S:''' So you don’t really hear anything from the anti-GMO crowd anymore, right?
 
'''C:''' Not really. They’re pretty fringy.
 
'''S:''' They’re pretty fringy. There’s one more thing that happened, too. So this is good. GR-5, this is the fifth-generation {{w|golden rice}}, is now online, but even back to GR-2, which was the first one planted in Bangladesh in 2019,<ref>[https://www.sciencemag.org/news/2019/11/bangladesh-could-be-first-cultivate-golden-rice-genetically-altered-fight-blindness Science: Bangladesh could be the first to cultivate Golden Rice, genetically altered to fight blindness]</ref> if you guys remember that. So, before Golden Rice, there were 500,000, ''500,000'' children throughout the world who would go blind from {{w|vitamin A deficiency}} every year, and half of those would die within a year. Not only that, but vitamin A deficiency, even if it doesn’t make you go blind or kill you, it leaves you with low resistance, susceptible, vulnerable to other infections. So, remember all the {{w|measles}} outbreaks in 2019, 2020, 2021?
 
'''J:''' But that was because of anti-vax.
 
'''S:''' Well, even when there was an anti-vax [movement], the children in Africa especially were susceptible to measles because they had relative vitamin A deficiency.
 
'''J:''' Oh, I never knew that.
 
'''S:''' So, guess how many children went blind in 2035 so far—it’s almost at the end of the year—due to vitamin A deficiency?
 
'''C:''' Less than 500,000.
 
'''S:''' ''3,000''.
 
'''B:''' Wow.
 
'''E:''' They shaved all that.
 
'''C:''' That’s a big difference.
 
'''S:''' It’s kind of like anything. When you easily fix the problem, it goes away. So anyway, it’s hard to argue with success.
 
'''C:''' So let’s not.
 
'''J:''' But now…
 
'''S:''' But now, but wait, but of course you know—
 
'''C:''' But wait, there’s more!
 
'''E:''' It gets better?
 
====Synthetica <small>(23:55)</small>====
 
'''S:''' Well, no. So that’s the good news. The good news is over now. Now we’re getting into—so have you guys heard the term "gen-craft"? This is kind of a new term. I think we might have mentioned it right before. It’s all under genetic engineering, but it’s not genetic ''modification''. It’s basically crafting life from scratch.
 
'''C:''' This is the synthetic stuff.
 
'''S:''' This is the synthetic stuff, right. We’ve been talking about this since, I think, 2017, 2018?
 
'''C:''' Venter. Craig Venter. <ref>[https://www.nature.com/news/minimal-cell-raises-stakes-in-race-to-harness-synthetic-life-1.19633 Nature: ‘Minimal’ cell raises stakes in race to harness synthetic life]</ref>
 
'''S:''' Venter. They first did bacteria and then they did colonies, multicellular, and then, actually, not just multicellular pseudo-colonies, but now the first actual multicellular, completely synthetic creatures. Again, we’ve talked about their being created, but the first one was approved for human consumption by the {{w|Food and Drug Administration|FDA}}.
 
'''B:''' Wow.
 
'''C:''' Oh, they got it passed!
 
'''S:''' They got it passed.
 
'''C, E:''' Wow.
 
'''J:''' And it’s disgusting.
 
'''S:''' Hang on.
 
'''C:''' Don’t look at it pre-processed.
 
'''E:''' Just put a lot of to'''mah'''to sauce on it.
 
'''C:''' ''(laughs)'' ''(in British accent)'' Tomato sauce.
 
'''S:''' So it’s ''cibumlimax''—that’s a terrible name—''ventera.''
 
'''C:''' ''(laughs)''
 
'''S:''' It basically means "[https://en.wiktionary.org/wiki/cibus#Noun meat] {{w|slug}}". And then ''ventera'' is for Craig Venter.
 
'''E:''' All right, Jay, you’re right. ''(laughs)''
 
'''C:''' They’re going to come up with some yummy brand names for this [inaudible].
 
'''E:''' Yeah, something else…
 
'''S:''' That’s the {{w|taxonomy (biology)|taxonomical}} name. It’s the {{w|domain (biology)|domain}} Synthetica and then they have the "blah blah blah blah blah blah blah ''cibumlimax ventera.''"
 
'''C:''' Yeah, we don’t go to the barbecue place and ask for some, like, what’s the Latin name for a cow? ''(laughs)''
 
'''S:''' They’ll call it something—
 
'''C:''' Something "bovine."
 
'''E:''' Oh, ''bovinus, uh, whateverus.''
 
'''S:''' Remember they [inaudible] {{w|veggie burgers}}, then the {{w|Impossible_Foods#Impossible_Burger|Impossible Burger}}, then the {{w|Insects as food|Insect Burgers}}, right? The bug burgers.
 
'''B:''' We’ll call it a "blobby burger." I like that.
 
'''S:''' No, a "slug burger." Slug burger.
 
'''E:''' Slug burgers.
 
'''B:''' Slug? No, blobby burgers.
 
'''C:''' That is ''not'' appetizing.
 
'''J:''' You know what, though? You remember how I was so freaked out you were trying to make me eat—
 
'''C:''' Impossible burgers.
 
'''J:''' —cricket meat, cricket wheat or something?
 
'''C:''' Oh, yeah, {{w|cricket flour}}.
 
'''E:''' Cricket flour!
 
'''S:''' Cricket flour. That’s a staple, now, Jay.
 
''(crosstalk)''
 
'''J:''' I’m proud to say I’ve never eaten it, and—
 
'''C:''' Still!?
 
'''S:''' You probably have. I guarantee you have.
 
'''C:''' You have and you didn’t even know it.
 
'''E:''' [inaudible] Restaurants are using it. You’ve eaten it.
 
'''S:''' No they don’t. No they don’t.
 
'''C:''' No they don’t.
 
'''B:''' They don’t.
 
''(audience laughter)''
 
'''S:''' That’s the thing.
 
'''E:''' ''(laughs)''
 
'''S:''' If you have eaten processed food from the supermarket that is a wheat-like product—
 
'''J:''' That’s bullshit.
 
'''C:''' Jay, it’s in ''everything'' now.
 
'''E:''' Have you read your ingredients?
 
'''S:''' It’s in everything.
 
'''B:''' Jay, I’m going to admit right now: Jay was having a hamburger and I made an insect burger, and he didn’t know it, and I [inaudible]. He ate it and said nothing. I didn’t say a word ’til just—
 
'''J:''' When did this happen?
 
''(audience laughter)''
 
'''B:''' Six months ago. Jay, you loved it. You loved it, dude.
 
'''S:''' Insect burgers are old news. Now we have slug burgers.
 
'''B:''' Blobby burgers.
 
'''C:''' But we can call them slug burgers.
 
'''E:''' No, no, we’ll come up with something—
 
'''S:''' They’re going to call it something else.
 
'''C:''' Can we call them "craft burgers," since they come from gen-craft?
 
'''E:''' Oh, gen-craft!
 
''(crosstalk)''
 
'''J:''' You know what the thing is? The slugs look like—remember {{w|pink slime}}? McDonald’s {{w|Chicken McNuggets}}.
 
'''S:''' You’ve seen the videos?
 
'''J:''' They look like pink slime!
 
'''C:''' I know, but that’s why you don’t look at that. We don’t cook them.
 
'''S:''' It’s just a blob of meat-like protein. It’s just the amino acids and whatever for… And then they grind it up and it looks [like] meat.
 
'''B:''' It’s got no central nervous system, right? So there’s no—
 
'''E:''' Right.
 
'''S:''' Yeah. It has nerves because it can move and it can feed, and it has some kind of neuronal kind of ganglia.
 
'''E:''' Ganglia?
 
'''C:''' The vegans aren’t into this, huh?
 
'''S:''' But it’s like an invertebrate. It’s like an insect or a plant.
 
'''C:''' Steve, so the vegans won’t eat this, huh?
 
'''S:''' Why not? I don’t know. Probably not.
 
'''C:''' I think that—some of them still don’t eat insects.
 
'''S:''' Yeah, if they don’t eat insects, they won’t eat this.
 
'''C:''' Yeah, it’s like a hard-line thing.
 
'''S:''' But it has no face.
 
'''E:''' Has no face!
 
'''S:''' Nothing with the face thing.
 
''(audience laughter)''
 
'''C:''' Yeah, that’s a big part of—I don’t eat anything with a face.
 
'''S:''' No face.
 
'''B:''' Did you see the scientists who drew the face on one?
 
''(laughter)''
 
'''E:''' Yes, yes!
 
'''B:''' It’s hilarious.
 
'''S:''' So, it may still be a year or two before we could actually get these at the {{w|Hungry Jack’s}} or whatever.
 
''(audience laughter, applause)''
 
'''C:''' ''(laughs)''
 
'''S:''' It’s just protein, right? It’s just like the insect wheat. Now we got slug burgers, slug protein. And you could mass produce these things. These eat slime or something. You see them crawling around eating algae, but they’re working on ones that can photosynthesize.
 
'''C:''' Oh, that’s smart! Just kind of direct—
 
'''S:''' So guess how many genes are in this synthetic slug?
 
'''J:''' Like what, 300 or something?
 
'''B:''' Wait, no. How many genes? So we’ve got far fewer genes than we anticipated when we first—was it 20,000?
 
'''S:''' We have 10,000.
 
'''B:''' So, how about, like, 8,000?
 
'''S:''' 400.
 
'''C, J:''' 400!
 
'''E:''' That’s all?
 
'''S:''' But how much does a slug have?
 
'''J:''' I don’t know.
 
'''S:''' 428. An actual slug.
 
'''B:''' Oh, that’s right. It’s really efficient, huh?
 
'''S:''' Yeah, it’s a little bit more efficient than an actual slug. But the genes have, like, no {{w|exon|exons}}. Or no {{w|intron|introns}}.
 
'''B:''' They work. There’s no junk DNA.
 
'''C:''' So, Steve, is that why they decided to just, kind of do this as a gen-craft, like a synthetic biology sit—instead of just genetically modifying the slug?
 
'''S:''' Because you’re not going to get animal protein in an insect.
 
'''C:''' That’s true. If you eat a slug, you’re not going to get a high level—you get a little bit of protein.
 
'''S:''' Vertebrate protein [inaudible]. Muscle pro—but this is like making muscle-like protein.
 
'''C:''' Oh, it’s so gross and weird. I love it.
 
'''J:''' But why didn’t they just do it like back when they started to come up with lab meat?
 
'''S:''' But the lab-grown meat thing never really panned out.
 
'''J:''' Why did they—But what happened?
 
'''S:''' It’s too energy-intensive. You can get—I’ve had the lab-grown meat thing, and they’re fine, but they’re still a little bit expensive.
 
'''C:''' But guys, we’re in a water crisis. We can’t use that much water to produce—
 
'''S:''' It’s very water-intensive.
 
'''C:''' Yeah, we can’t do it.
 
'''B:''' Steve, when they were developing Blobby the Slug, did they figure out some of the junk DNA? Like, "Oh, this junk DNA’s important because it does something that we didn’t think it did."
 
'''S:''' There’s no junk DNA in it because it’s totally—So, Venter gave an interview about it. They’ve written articles about it. Every single gene was completely synthesized. And over the last 20 years, they’ve learned what the minimum number of genes that are absolutely necessary for something to live, something to develop—
 
'''B:''' For bacteria and stuff, but microorganisms—
 
'''S:''' But it turns out it wasn’t that hard. If you’re building a really simple multicellular creature, most of the genes are for just the cells to live, and then just getting them to differentiate a little bit differently so they break up the work—you know what I mean?—they’re not all doing the same thing. It’s not that hard. It actually turned out to be not that hard.
 
'''C:''' And remember, this thing doesn’t have to live in the wild. It doesn’t have to do a lot of the work.
 
'''S:''' All it has to do is eat.
 
'''C:''' It just has to eat and produce meat for us, or protein for us.
 
'''E:''' It doesn’t have to develop a ''defense'' mechanism.
 
'''J:''' I know people like that.
 
'''C:''' ''(laughs)''
 
'''B:''' What if we put it in the wild? Could it evolve?
 
'''S:''' No. It can’t survive in the wild.
 
'''C:''' It would die, I think. It seems like—
 
'''S:''' It has no defense.
 
'''C:''' It has no evolutionary fitness.
 
'''E:''' [inaudible]
 
''(audience laughter)''
 
'''B:''' All the other animals would be like, "Look at that slab of protein!"
 
'''C:''' ''(laughs)''
 
'''B:''' "It can’t get away, can’t do anything. Let’s go eat it!"
 
'''E:''' Is there a waste product or a byproduct of it?
 
'''S:''' I mean, it does poop, apparently. But I think they just recycle that.
 
'''C:''' Eww!
 
'''J:''' Why can’t they just make something that poops meat?
 
''(audience laughter)''
 
'''B:''' Jay!
 
'''S:''' We’ll get right on that.
 
'''C:''' The most scientifically astute question.
 
'''J:''' They could call it a "shit burger"! ''(laughs)''
 
'''E:''' That’ll sell!
 
'''J:''' I’m not eatin' that shit!
 
'''S:''' Yeah, this is the guy who won’t eat a bug burger.
 
'''C:''' Meat poop!
 
'''S:''' But he wants to eat a shit burger.
 
'''J:''' I would try a shit burger.
 
''(audience laughter)''
 
'''E:''' Comes out as sausage links, already cased, ready to go.
 
'''C:''' Quote of the day from Jay. He’ll try a shit burger but won’t eat cricket powder. ''(laughs)''
 
'''J:''' I just have a thing about bugs.
 
'''S:''' But not slugs. Slugs are okay.
 
'''C:''' But unh-unh, feces!
 
''(laughter)''
 
'''S:''' So, of course, of course there’s already an anti-gen-craft movement, saying—
 
'''E:''' Oh, this is the bad news. This is the bad news.
 
'''J:''' This is what you’ve been waiting for.
 
'''S:''' —this is the bad news—saying that "it ain’t natural," you know? It’s all the same arguments, recycled over the last 30 years of doing this show. It’s the same thing, right? "It’s not natural. It hasn’t been tested enough."
 
'''B:''' "It’s cruel. It’s cruel."
 
'''S:''' They’re trying to say that—
 
'''B:''' I’ve seen people that—
 
'''S:''' I know, but that’s a hard—this thing is like engineered not to experience its own existence.
 
'''E:''' "We’re playing God." Playing God complex.
 
'''C:''' "Playing God." Yeah, I’ve seen that one a lot.
 
'''B:''' But they’re saying they can’t detect the fact that they are having some sort of existence, some quality of—
 
'''S:''' Prove that they don’t know they’re being killed, whatever. It’s a slug.
 
'''C:''' Aww.
 
'''J:''' Yeah, but…
 
'''S:''' It’s not even cute. They designed it to not be cute.
 
'''E:''' Right. It’s not—it doesn’t have
 
'''C:''' But sometimes things that are really ugly are a little bit cute.
 
'''S:''' Oh, stop it.
 
'''C:''' It’s true!
 
'''J:''' You shouldn’t talk about your boyfriend like that.
 
''(audience laughter)''
 
'''E:''' You’ve been going into the Aug too much and putting faces on these slugs.
 
'''C:''' You know I don’t have a boyfriend.
 
'''B:''' ''(laughs)''
 
'''E:''' So you have to cut down your time.
 
'''C:''' All right, all right, all right, all right. I like the Aug.
 
'''S:''' So we’ll see. They’re already writing virtual mails to their congresspersons. And Oregon already banned it. Already banned in Oregon.
 
'''B:''' Of course they did. I’d be shocked if they didn’t.
 
'''E:''' Yeah, well.
 
'''S:''' It’s terrible. So we’ll see. This is another round, now. We’ll see what they do. They’re still sort of creating their message. But this is, I think, going to be our thing for the next few years, now, is dealing with the anti-gen-craft crowd.
 
'''B:''' Yeah, but don’t forget. This is a new domain of life. This is the first. This is the first application of that creation. I think—
 
'''C:''' Well, they’ve done more in the lab. This is the first one that we’re able to ''consume''. And that’s cool.
 
'''B:''' And that’s great, but who knows what they’re going to come up with with gen-craft.
 
'''S:''' All right, but here’s the thing.
 
'''B:''' Something that’s going to make a blobby burger look like, pff, whatever. Come on!
 
'''S:''' The thing is, they’re not releasing this into the wild. This is a lab creature, right? I think the big fight’s going to come the first time they want to release something into the wild.
 
'''B:''' Well, yeah.
 
'''S:''' Or they grow a crop in a field.
 
'''E:''' Oh, there’s going to be some renegade scientist who tries to do this and—
 
'''S:''' Probably in China.
 
'''E:''' Right, right. The old CRISPR—from way back when.
 
'''S:''' {{w|CRISPR_gene_editing#Human_germline_modification|The CRISPR babies.}}
 
'''C:''' CRISPR baby. Aww.
 
'''S:''' Yeah, they’re still kicking, I understand.
 
'''E:''' Yeah!
 
'''B:''' They can make some that, like, eat all the plastic in the oceans… We know how big of a problem that is.
 
'''E:''' Yeah! Yes!
 
'''S:''' So they’re already doing that with the bacteria. They made the ones that can eat oil spills, that can eat plastic—
 
'''C:''' Yeah, they’re working; they’re just working slowly.
 
'''S:''' —that can eat carbon. So, they’re all there. There’s just a lot in various stages of the regulatory procedure. Some are being used, but they still haven’t pulled the trigger on releasing a Synthetica into the wild. I think that’s going to be the next step.
 
'''J:''' As they should be because that’s super dangerous.
 
'''B:''' It is.
 
'''S:''' It depends.
 
'''E:''' Well, it depends on the form.
 
'''C:''' We have to hear from the experts. The regulatory boards are being formed, the ethics boards, and they’re figuring it out.
 
'''S:''' But here’s one thing: they cannot, by design, cross-pollinate or {{w|Hybrid (biology)|interbreed}} with normal life, with the other three domains of life.
 
'''C:''' Exactly.
 
'''E:''' Right. Where’s the—no compatibility.
 
'''S:''' They’re producing—
 
'''J:''' How do we know?
 
'''S:''' Maybe people will figure it out.
 
'''C:''' And these organisms are just pure prey animals at this point. They’re not…
 
'''B:''' But Steve, what—
 
'''C:''' ''(as {{w|Ian Malcolm (character)|Dr. Ian Malcolm}})'' "Life finds a way."
 
'''S:''' Life finds a way…
 
'''B:''' They’ve done—I remember, way back in 2019, I talked about how they took bacteria and they were turning them into multicellular because they were able to—
 
'''S:''' Yeah, this is an extension of that.
 
'''B:''' So, imagine taking Archea or Bacteria with their exotic metabolisms, creating multicellular life out of them. So then, what, would that fall under Synthetica? Or would that be—
 
'''S:''' It depends. So, by definition—
 
'''B:''' We’ve become Eukarya, then—
 
'''C:''' Yeah, how are they defining these?
 
'''S:''' By definition, if you are a member of the domain Synthetica, all of your genes have been created entirely artificially.
 
'''E:''' 100%.
 
'''B:''' Okay.
 
'''C:''' Even if you perfectly replicate a… gotcha.
 
'''S:''' Yes. That’s a loophole. You can replicate a gene that exists in other creatures, but you have to have completely manufactured that—
 
'''J:''' We’re going to have to—
 
'''C:''' And it’s got to be trademarked. You can read it in the DNA.
 
'''J:''' We’re now going to have to train—
 
'''S:''' At the very least, they take out all the junk and all that stuff.
 
'''J:''' I’m serious. We have to train {{w|Blade Runner|Blade Runners}} to kill these things.
 
''(laughter)''
 
'''S:''' "Slug runners."
 
'''J:''' Slug runners!
 
'''S:''' ''(laughs)''
 
'''J:''' Because they get out, think about it, they get it out and then they don’t want to be eaten. And next thing you know, they’re punching holes through walls and they’re pissed off at people.
 
'''C:''' With their little slug hands! ''(laughs)''
 
'''S:''' The tears in the rain.
 
'''J:''' They go back to the scientists who made them.
 
'''E:''' Extended protoplasm arm…
 
'''C:''' Their [inaudible]. ''(laughs)''
 
'''S:''' "I don’t want to be a burger!"  "You’re a slug!" Slug runners, yeah. All right.
 
===Social Media, CAD, & the ''Aug'' <small>(35:25)</small>===
 
'''B:''' All right, what do we got next?
 
'''S:''' What do we got next? We have—
 
'''C:''' Am I next?
 
'''S:''' Yes. Cara is next with—what’s the latest, Cara, with social media?
 
'''C:''' Oh, god, there’s so much to talk about, you guys.
 
'''S:''' This is overwhelming.
 
'''C:''' The main article that I wanted to cover today was kind of the big—and I know you all saw this. This was the headline everywhere. It just happened two days ago, and we’re still dealing with the fallout. We’re going to be dealing with fallout for a while. So you guys know Control-Alt-Delete, this hacker movement, "CAD."
 
'''S:''' CAD.
 
'''E:''' CADs. C-A-D.
 
'''C:''' Yeah, a lot of people call them "CADs", C-A-D.
 
'''S & B:''' All the cool people call them CADs.
 
'''E:''' I still like "Control Alt Delete," though.
 
'''C:''' I guess I’m not cool. And Control-Alt-Delete is this kind-of underground—we still don’t know who they are, right? There have been a couple of examples in the news where somebody came out and was like, "I’m Control-Alt-Delete," but nobody actually believes them.
 
'''S:''' If you admit to being CAD, you’re not CAD.
 
'''C:''' Then you’re not CAD.
 
'''E:''' Is that the {{w|Spartacus (film)#"I'm_Spartacus!"|Spartacus}} moment? "I am Spartacus! ''I'' am Spartacus"…
 
'''S:''' No, it’s not. It’s loser wannabes. The real people, you will never find out who they are.
 
'''C:''' So Control-Alt-Delete has been targeting a lot of these new platforms. The biggest one, the one that’s been the hardest kind to get into is the one that most of us are on, the Aug, right? I mean, I’ve been wearing—I’ve had my Aug on all night, actually. I think it’s kind of fun, especially when you’re sick and a little bit loopy.
 
''(Rogue whistles "loopy" sound effect)''
 
'''C:''' I don’t know if all of you are in it right now. We don’t really have to be sitting here.
 
'''E:''' Nah, I turned mine off.
 
'''S:''' Intermittently.
 
'''C:''' Yeah, you turn yours off.
 
'''B:''' I was told I could not bring my Aug, and I’m feeling—I’m getting {{w|separation anxiety disorder|separation anxiety}}.
 
'''C:''' Well, Bob, that’s because you just get lost.
 
'''S:''' That’s because when you’re using Aug, Bob—
 
'''J:''' You go off into worlds…
 
'''S:''' Yeah, you are staring off into space. You look creepy.
 
'''C:''' And then we’re like, "Bob? Hello! It’s your turn."
 
'''E:''' Creepi''er''.
 
''(audience laughter)''
 
'''B:''' But there’s a lot of cool stuff I’m doing. You know?
 
'''C:''' I know! ''(Rogues crosstalk.)'' You have to use Aug to improve your work, dude.
 
'''S:''' Checking your V-mail and stuff while we’re doing the show.
 
'''J:''' Do that shit at home. Don’t Aug on my time.
 
'''B:''' But looking at Jay without my filter on is ''hard''.
 
'''C:''' That’s mean!
 
'''J:''' Hey, man!
 
'''B:''' Sorry, Jay.
 
'''J:''' Thanks, Bob.
 
'''B:''' Look! He’s not shaved. Ugh.
 
'''C:''' I know.
 
'''J:''' ''(laughs)'' So you’re seeing a shaved version of me?
 
'''B:''' And the filter I put on his hair makes his hair look so cool.
 
'''J:''' What the f— is wrong with my hair?!
 
''(Cara & audience laughter)''
 
'''B:''' It’s cool. It’s nice, Jay, but the filter I have on your hair is awesome.
 
''(audience laughter)''
 
'''E:''' That blue streak? That’s cool.
 
'''B:''' Oh, yeah. And it moves and stuff.
 
'''C:''' I have to admit, it has been easier. Like, I don’t really like to wear makeup, and so I like thinking that a lot of people are looking at me in Aug-land like I’m a little improved. It’s 2.0. So, you know that the Aug has been kind of the one that’s taken off the most. There’s some offshoots and stuff, but I’m not using them. Are you guys? ''(guys all say no)'' Aug has everything we need, right? It has all of our social stats. It has our social currency. I mean, it’s tied into my bank accounts, all of them, I think.
 
'''E:''' Yep, pretty much. Yeah, for me as well.
 
'''C:''' Yeah, pretty much. And we’ve been kind of on the fence about how it’s plugging more things into it.
 
'''S:''' In 2032, I think it was, insurance companies will now pay for Aug doctor visits.
 
'''C:''' Well there you go!
 
'''B:''' Wow! How’d I miss that?
 
'''E:''' Doesn’t get more mainstream than that.
 
'''C:''' I know. Exactly. It’s kind of hard ''not'' to be in the Aug at this point because—actually, it’s impossible. I don’t think I know anybody who’s not using Aug. Do you?
 
'''E:''' Everybody’s doing it.
 
'''S:''' You can’t function in society.
 

'''C:''' You can’t function. How could you function—
 
'''S:''' It was like—
 
'''C:''' What ''did'' we do before Aug? We used—
 
'''S:''' We had to use our handheld phones…
 
'''E:''' It was a wallet or something.
 
'''C:''' ''(laughs)''
 
'''B:''' Oh my god. Remember that?
 
'''E:''' Remember cards!?
 
'''C:''' Oh, plastic cards! That’s so funny.
 
'''E:''' Oh my gosh. I kept my old ones. They’re in a file drawer.
 
'''J:''' You guys take it like it’s okay, and I’m not cool with it.
 
'''C:''' Are you still using paper money? ''(laughs)''
 
'''J:''' No. Of course not, but my point is this is a totailor—, totalerant—I can’t even say the word.
 
'''S, C, & E:''' Totalitarian.
 
'''J:''' —totalitarian’s wet dream.
 
'''E:''' Three t’s.
 
'''C:''' Jay, it ''is'' in China. It ''is'' in Russia, but the government doesn’t have their hands on Aug. I mean I know—
 
'''J:''' How the hell do you know that?
 
'''C:''' Well, I mean, they don’t own the companies.
 
'''E:''' They don’t admit to…
 
'''C:''' It’s private enterprises.
 
'''S:''' But that’s, again, the conspiracy theory. So we all know that Russia and China are complete Aug-totalitarian governments, right? If you live in China, you’re on their version of the Aug. They completely own you.
 
'''C:''' I think it’s still called {{w|WeChat}}.
 
'''S:''' Is it still WeChat?
 
'''C:''' Yeah, they never changed the name.
 
''(audience laughter)''
 
'''S:''' Out in the West, in developed—in other parts of the world, the governments don’t control it—
 
'''B:''' And on the Moon, too.
 
'''S:''' —but corporations do, and some people argue that they’re actually more powerful than the government.
 
'''C:''' Absolutely.
 
'''S:''' They own us.
 
'''C:''' We’re still having this conversation—
 
'''S:''' We just don't know it.
 
'''C:''' —privacy versus convenience. And I think at this point—
 
'''S:''' People will ''always'' trade privacy for a little bit of convenience.
 
'''B:''' It’s insidious.
 
'''J:''' Back in 2020, {{w|Amazon (company)|Amazon}} was rated the first company and the number one company to truly have such an amazing amount of data on its customers that—it’s like a transcendent moment for a company to get to that level of data. And we were questioning back then, I mean ''I'' was. I was following this very closely back then. There’s no regulations for that level of data. No government in the world created regulation to deal with that.
 
'''C:''' I know.
 
'''S:''' Remember when {{w|Mark Zuckerberg|Zuckerberg}} gave all those testimonies before our Congress and no one believed a word he said?
 
'''E:''' Oh yeah!
 
'''C:''' But Jay, don’t act like you didn’t just buy something from those {{w|Targeted advertising|targeted ads}} the Aug gave you.
 
'''J:''' I literally just did as we were talking. ''(audience laughter)'' No, but the point is, though, we can’t—
 
'''B:''' I love targeted ads.


'''C:''' Me too. They’re ''so'' good now. They’re crazy good now.


'''B:''' [They] really know what I want.


'''J:''' I don’t know. We’re so hard-wired into this thing. We have to—


'''S:''' It’s scary how [inaudible].


'''E:''' The interdependencies—


'''J:''' We can’t go back. You can never go back. When cell phones came, there was no going [back] to a life that didn’t exist.


'''C:''' It’s part of our life. Yeah, it would be really hard at this point.


'''J:''' But this thing owns us.


'''C:''' But here’s the thing. Here’s the scary thing, and it’s something that we didn’t think would be possible because of the way that data is distributed in the cloud—and Bob, I know you know about these server farms and data centers. You understand this a lot better than I do. But apparently this is the new headline. So, Control-Alt-Delete managed, finally—and you know they’ve gone in and they’ve shut down server farms before. We keep seeing these headlines where something gets blacked out for a couple weeks and it takes a while to put it back online. They finally somehow managed to trace the data of a packet of people. So 100,000 people—their entire Aug history has been erased.


'''B:''' Oh my god.


'''E:''' ''(cringing)'' Ooooooo!


'''B:''' They finally did it. They finally did it.
 
'''E:''' Backup and everything ''gone''?
 
'''C:''' They’re ghosts.
 
'''B:''' All the backups, all the—
 
'''S:''' Orphans, right? Or virtual orphans.
 
'''C & E:''' Virtual orphans.
 
'''C:''' All their money. All of their proof of their education.
 
'''J:''' And there you go.
 
'''C:''' All of their social currency. Everything. Their history. All their memories, basically. We live via our photographs and our video recordings now.
 
'''B:''' I mean, how did they—
 
'''S:''' Their high scores on ''{{w|Plants vs. Zombies}}'' are gone. ''(audience laughter)''
 
'''B:''' How did they pull that off?!
 
'''C:''' ''{{w|FarmVille}}''! Who knew ''that'' would stick around?
 
'''B:''' I really never thought they would be able to do it. Think of all the backups. It’s not one data center. You’ve got backups. You’ve got backups in the cloud, backups on the Moon. How did they get access to all of that?
 
'''C:''' Who did they know, right?
 
'''B:''' That’s scary as hell.
 
'''C:''' You would think. But this is, maybe, part of the problem, is that when a corporation, a multi-national corporation, owns these things—so they should be spread all over the world—it’s still only one ''company'', ultimately, right? It’s a conglomerate, but—
 
'''E:''' Inside job, maybe? Pirates within?
 
'''C:''' They must. They’ve got to have moles in there. They have to have access to enough information to know.
 
'''J:''' It was terrible what CAD did to these people. It’s terrible. But the reason why they did it was to show that the companies, literally—look, these people don’t have lives anymore. What are these people going to do? They literally don’t exist in our system, in our collective [inaudible].
 
'''S:''' So, congratulations. They proved you could destroy somebody’s life by destroying their Aug—by making them virtual ghosts.
 
'''C:''' ''They’re'' the ones who did it.
 
'''S:''' But they’re the ones who did it.
 
'''C:''' The companies so far—these people on the Aug had been fine.
 
'''J:''' I don’t know. I don’t agree. I know that what they did was wrong, but I think that the point that they tried to make, they made, and it’s scary.
 
'''C:''' I think this is showing the dark side of {{w|hacktivism}}. As much as I agree with a lot of the posts that I’ve read from Control-Alt-Delete, I think they went too far this time. They went way too far.
 
'''S:''' They have a point, but they’re ''basically'' terrorists. I hate to use that word, but if you’re doing that—So, there’s a talk—I don’t know if this part of your news item, though—but talk of the {{w|United Nations|UN}}—are you going to get to that part? But the UN, basically, they’re considering a resolution to make, just so that they have more regulatory power to go after CAD, you know, Control-Alt-Delete—If you 'kill' somebody’s virtual history, that’s now virtual murder.
 
'''C:''' Oh! Like their—oh, because we all have our little {{w|Avatar (computing)|avatars}}. You can actually murder somebody in the Aug?
 
'''S:''' If you comp—like 100% erase somebody’s data so they can’t come back, that is virtual murder—
 
'''C:''' So these guys could be tried in {{w|the Hague}}?
 
'''S:''' —because you create a virtual ghost. They can get tried in the Hague. If they ever catch them, they can get—
 
'''C:''' We know who they are.
 
'''E:''' Well, catching them’s going to be so hard.
 
'''J:''' If they catch those people…
 
'''B:''' Oh my god, yeah.
 
'''S:''' They’re done, they’re toast. But I’m sure it’s like cells. You might get one guy or one {{w|Clandestine cell system|cell}}, but you’ll never totally root out…
 
'''J:''' That’s the other thing, too. The other scary reality is—yeah, so Control-Alt-Delete, sure, they did something bad.
 
'''C:''' ''Really'' bad.
 
'''J:''' But there’s—okay, I don’t want to say ''real'' terrorists out there—but there are terrorist groups that ''do'' want to tear down the society that we live in.
 
'''C:''' How is this different?
 
'''S:''' Well, how better to tear down society than to get rid of someone’s complete Aug history? Jay, imagine yourself as one of these people. What do you do?
 
'''J:''' You’re done. I don’t know.
 
'''S:''' You’re done. You’re cooked. Go live on a commune in the woods somewhere?
 
'''J:''' I think the point is that we’re missing—
 
'''S:''' [inaudible]
 
'''C:''' Some people already do that. There are people who aren’t in the Aug. I don’t know any of them, but I read about them sometimes.
 
'''E:''' The {{w|Off-the-grid|Off-Gridders}}! I love them. The Off-Gridders.
 
'''C:''' Yeah, the Off-Gridders! Yeah, they’re weird. There’s a TV show about them on Discovery.
 
'''E & C:''' ''(laughs)''
 
'''S:''' The Off-Gridders?
 
'''J:''' Do you guys think—and actually the show is pretty cool—but do you guys think, though, that we are kind of going down the snake’s mouth right now with technology?
 
''(A Rogue sighs)''
 
'''S:''' But we’ve been saying this for 20 years.
 
'''C:''' That’s the thing. It’s so hard, right? Because we were going to go this route ''anyway''. That’s the thing. If the Aug’s parent company didn’t hit the right kind of algorithm to get us here, another company would have.
 
'''J:''' I’m not saying—yeah, of course, I think it would have happened anyway—but back in the mid-2015 era, we started to realize that {{w|Facebook}} really didn’t have humanity’s best interests in mind. And then we watched Zucker-freak—
 
'''C:''' Did we ever ''really'' think they did?
 
'''J:''' —go in front of Congress and lie his face off, telling them how everything that they—
 
'''C:''' Do you remember when he ran for president? Idiot. Sorry.
 
''(laughter)''
 
'''J:''' That was the beginning of his downfall. But the point is, though, we saw even with Facebook—and this is ''nothing'', Facebook is nothing compared to this. This is, literally, we live in {{w|augmented reality}} now.
 
'''C:''' I know, but Facebook didn’t give us anything except people’s pictures of their babies.
 
'''S:''' But at the time—
 
'''C:''' This is ''way'' better.
 
'''B:''' And cat…
 
'''E:''' And a lot of advertising. A ''lot'' of advertising.
 
'''C:''' And {{w|Cats and the Internet|cat videos}}.
 
'''J:''' I don’t know, I don’t know.
 
'''C:''' But now you can just watch a cat video anytime, anywhere.
 
'''E:''' Oh, yeah, that’s a good point.
 
'''B:''' I always got one running in the corner of my vision. It’s really cool.
 
''(audience laughter)''
 
'''S:''' But ''we'' were on Facebook, and it was important to our marketing. And it was—
 
'''C:''' That’s the point. That’s the part that’s so—
 
'''S:''' And now we’re on the Aug and it’s—Imagine our show without the Aug.
 
'''C:''' I know, yeah. But that’s the part that’s so unsavory to me, and that I do have the yucky feeling about, is it ''all'' is just about marketing, still.
 
''(unknown Rogue)'': Yeah, it is.
 
'''C:''' It’s all just about selling us shit.
 
'''E:''' That’s been true for ''so'' long.
 
'''S:''' Ever since the—exactly.
 
'''E:''' Since the analog days. And beyond.
 
'''B:''' And they’re so good at it now. A lot of people are saying that there’s—it’s actually giving credence to people’s belief in psychics.
 
'''C:''' They think they’re psychic?
 
'''B:''' They ''must'' be psychic because they know what I want so fast.
 
'''S:''' Before you know you want it.
 
'''C:''' Don’t they understand big data? That’s ridiculous.
 
'''E:''' ''(laughs)''
 
'''B:''' But we see it. We see it. It’s funny as hell.
 
'''J:''' All right. I’m warning you guys. I’m warning—I bet you in another 10 years we’ll see some seriously bad stuff come out of this.
 
'''C:''' Another 10 years, you’ll be ''dead''. ''(Laughs)''
 
''(audience laughter)''
 
'''E:''' There you go, Jay! How’s that?
 
'''J:''' And that’s the bad thing!
 
'''B:''' But those longevity therapies are working pretty damn good.
 
'''C:''' Ever the techno-optimist.
 
'''E:''' Can we download ourselves yet?
 
'''B:''' Look at me.
 
'''C:''' ''(laughs)''
 
'''S:''' You’ll never be [inaudible].
 
'''C:''' "Five to ten years."
 
[KiwiCo ad]
 
===Near-Earth Asteroids: Apophis and Perses <small>(48:13)</small>===
 
'''S:''' So, Evan—
 
'''B:''' All right, now this is some shit, man.
 
'''S:''' This is the ''big'' news. This is actually—everything else is just the warm up to the actual big news that everyone wants to hear about.
 
'''C:''' 100,000 people erased from—
 
'''S:''' Because what do we got, 10 years to live? What’s going on with that?
 
'''E:''' Uh, yeah. We—well, it’s 20 years to live.
 
'''C:''' Say what now?
 
'''S:''' We’ll be dead.
 
'''E:''' But we’re working on it. We’re working on it. I want to remind everyone the whole background of this, so please bear with me before I get to the ''actual'' news item.
 
'''B:''' Like we don’t know, but go ahead.
 
'''E:''' I know, I know. So I’m hoping the audience here remembers {{w|99942 Apophis|Apophis}}, right, the [[SGU_Episode_392#Quickie_With_Bob:_Apophis_Update_.2842:31.29|2029 asteroid]] that came within 25,000 kilometers of Earth?
 
'''S:''' That’s nothing. That’s a whisker.
 
'''J:''' Phew!
 
'''C:''' But it missed us.
 
'''E:''' It ''did'' miss us, absolutely, and that’s what the scientists told us—
 
'''S:''' Yeah, that’s why we’re still here, because of [inaudible].
 
''(audience laughter)''
 
'''E:''' And it happened on a {{w|Friday the 13th}}.<ref>[https://www.space.com/asteroid-apophis-2029-flyby-planetary-defense.html Space.com: Huge Asteroid Apophis Flies By Earth on Friday the 13th in 2029. A Lucky Day for Scientists]</ref>  Which, you know— ''(crosstalk)''
 
'''S:''' What are the odds?
 
'''B:''' Remember the party we threw that day?
 
'''E:''' Pretty decent. There was so much fear-mongering with Apophis. It was first discovered way back in {{w|99942_Apophis#Discovery_and_naming|2004}}, and at that point, the scientists, with the information they had—there was maybe just under a 3% chance of it actually impacting the planet based on the data that they had at the time. Well that sent people kind of into "Okay, here it is! Now, finally, this is the ''real'' apocalypse coming. Forget all the other—the Mayan, the 2012—all that. This is the actual one."
 
'''C:''' Forget all the other apocalypses!
 
'''E:''' But, as time went on, and more careful studying of it went, they realized—that shrunk down over the years, and by the time, about 2019, 2020 rolled around, the scientists said, "It is 0% chance of this [inaudible]," and, of course, it didn’t.
 
'''S:''' Yeah, it’s not going to happen.
 
'''E:''' But Apophis was the {{w|Apep|god of chaos}}, for those who don’t know their {{w|Egyptian mythology|Greek [sic] mythology}}. And you’ll remember that tragic incident leading up to the fly-by, the cult, known as the Children of Claude, that was an offshoot of the {{w|Raëlism|Raëlian Movement}}, you guys remember? We used to talk about the Raëlians way back, like in 2005, 2006.
 
'''B:''' Raëlians, right.
 
'''S:''' Didn’t they pretend to clone somebody at one point?
 
'''E:''' Yes!
 
'''C:''' I can’t believe they stuck around all that time.
 
'''E:''' They did! It was little offshoots of it.
 
'''J:''' Was that [https://knowyourmeme.com/memes/ancient-aliens guy with the hair] that said, "I’m not saying it was aliens…but it was aliens." Was he a Raëlian?
 
'''C:''' ''(laughs)''
 
'''E:''' I think I know of whom you’re speaking. That’s the {{w|Raël|Claude}} person, and this offshoot is the "Children of Claude." So, they were the ones who, as the asteroid came by, they thought it was going to open an inter-dimensional space, and the only way to get up there was to be—to leave their earthly coils. A couple dozen people, unfortunately, took their own lives. But we’ve seen this before, cults and suicide.
 
'''S:''' What was that? The {{w|Comet Hale-Bopp|Hale-Bopp}}, back in ’97, and the {{w|Heaven's Gate (religious group)|Heaven’s Gate cult}}, anyone?
 
'''C:''' [to audience] These guys are all way too young to remember that. No, they’re too young.
 
'''E:''' No? Oh, gosh, I’m totally dating myself. I’m an old man now. Well, in any case, that was the most, I think, notable fear-related story to it. The Internet obviously went wild. But then in 2030, just a couple years ago, you know what came next. The astronomers located object designation 2030-US, also known as Perses.
 
'''S:''' Mmm. Perses.
 
'''E:''' Perses. P-E-R-S-E-S, named for—
 
'''S:''' Not Perseus.
 
'''E:''' Not {{w|Perseus|Perse''us''}}, no.
 
'''S:''' Perses.
 
'''E:''' {{w|Perses (Titan)|Perses}} was the Greek {{w|Titans (mythology)|Titan}} of destruction.
 
'''S:''' Mmm. Appropriate.
 
'''E:''' And this one’s giving us trouble. 33% chance—
 
'''S:''' Don’t want to roll those dice.
 
'''E:''' —of {{w|Impact event|impact}}. And the studies since then—they’ve obviously been very closely monitoring this one—and it’s holdin' true.
 
'''C:''' How far away is it now?
 
'''E:''' Well, we’re about—2055 is going to be the date. June 21, 2055.
 
'''B:''' Aww, right around the [inaudible].
 
'''E:''' So we’ve got—there’s 20 years. But, as you know, {{w|NASA}}, the {{w|European Space Agency|ESA}}, the {{w|Roscosmos|Russian Space Federation}}, and others have finally—
 
'''S:''' {{w|Free_Willzyx#Plot|MASA}}?
 
'''E:''' MASA, among others—{{w|Israel Space Agency|Israel’s group}}, and the {{w|Indian Space Research Organisation|space agency of India}}…So they can’t get behind {{w|global warming}} and deal with that, but at least this is ''something'' that they ''can'' get behind, and they ''have'' gotten behind.
 
'''C:''' We like a good short-term threat.
 
'''E:''' Yeah, exactly. When something’s a little more immediate, and, like, right in your face, that will motivate.
 
'''B:''' Especially when it’s an {{w|Extinction event|Extinction Level Event}}…[inaudible].
 
'''C:''' And they’ll make lots of movies about it.
 
'''B:''' Oh yeah. Documentaries…
 
'''S:''' They’ll dig up {{w|Bruce Willis}}.
 
'''C:''' Poor guy.
 
'''B:''' Think he’s just virtual [inaudible].
 
'''J:''' That {{w|Armageddon_(1998_film)|movie}} he made sucked, didn’t it?
 
'''S:''' That ''one'' movie he made?
 
''(laughter)''
 
'''J:''' And what about that—remember, he was a coal-miner or something?
 
'''E:''' Oh, remember that Christmas movie, ''{{w|Die Hard}}''?
 
'''C:''' I was going to say, ''Die Hard'' is a Christmas movie! ''(laughs)''
 
'''E:''' ''Die Hard'' is a Christmas movie!
 
'''S:''' That’s still my favorite Christmas movie.
 
'''C:''' [again to audience] Also too young, too young. ''(laughs)''
 
====The good news part <small>(52:45)</small>====
 
'''E:''' Wow! Really? Here’s the news item. Here’s the news item today. ESA—
 
'''S:''' —Some good news?
 
'''E:''' It is good news because—
 
'''C:''' —Oh good, thank goodness.
 
'''S:''' I’d rather ''not'' be hit by a two-kilometer—
 
'''E:''' Exactly. And the {{w|Asteroid impact avoidance|prevention methods}} have gone into effect because ESA successfully launched GT1 into orbit the other day. No issues, everything is fine. It’s the first salvo in the fight against Perses: it’s going to approach Perses and establish a fixed position in close proximity to it. It’s using the {{w|gravity tractor}} method—
 
'''C:''' Oh, "GT1."
 
'''E:''' —GT1, which is why it’s called that.
 
'''B:''' I love this idea.
 
'''E:''' So if all goes according to plan, [presumably demonstrating to audience] here’s Perses, it’s coming in, GT1. They’re going to park it over here in a stable position and the gravity between the two objects, it should nudge it. It should nudge it just—and it doesn’t need to nudge much because it’s still out there far enough—a few centimeters! That’s all they’re looking to do at this distance.
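
The gravity-tractor idea Evan sketches here can be sanity-checked with a back-of-envelope calculation. All of the numbers below (spacecraft mass, hover distance, towing time) are illustrative assumptions, not GT1 mission figures:

```python
# Back-of-envelope gravity-tractor estimate (toy numbers, not mission data).
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2

m_spacecraft = 2.0e4   # assumed 20-tonne tractor spacecraft, kg
d = 200.0              # assumed hover distance from asteroid's center, m

# Acceleration imparted on the asteroid by the spacecraft's gravity.
# Note this is independent of the asteroid's mass: a = G * m / d^2.
a = G * m_spacecraft / d**2

years = 10
t = years * 3.156e7    # seconds of continuous towing

delta_v = a * t        # velocity change accumulated over the towing period

print(f"tug acceleration: {a:.2e} m/s^2")
print(f"delta-v after {years} years: {delta_v * 100:.2f} cm/s")
```

Even with these generous assumptions the tug delivers only about a centimeter per second of delta-v after a decade, which is why a tiny nudge must be applied many years before the encounter, and why Steve notes below that 20 years is "right on the margin" for this method.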
 
'''S:''' Yeah, but 20 years is actually ''right'' on the margin—
 
'''B:''' —It’s on the edge.
 
'''S:''' —For the gravity tug method.
 
'''J:''' It’s a little too close for comfort.
 
'''C:''' They want to try as soon as—I mean as late—whatever you want to say—as possible.
 
'''S:''' That can’t be the only thing that they’re doing.
 
'''E:''' No, it’s not—
 
'''J:''' —No, they tried other stuff.
 
'''E:''' —There’s a three-prong attack against Perses, and this was the first one, and it successfully went—but there are two more coming. So the second prong is being undertaken by {{w|China National Space Administration|China’s space agency}}. They’re going to be launching a direct impact probe into Perses, and they’re going to attempt to knock it off its trajectory. Now, this is sometimes referred to as the "battering ram attempt," but this particular project is considered, actually, a little less reliable because previous experiences from space agencies with this exact method, the direct impact approach, had mixed results.
 
So, if you recall, NASA conducted a test of the direct impact approach back in 2022. The name of the test was called DART. DART stood for {{w|Double Asteroid Redirection Test|Direct [sic] Asteroid Redirection Test}}.
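
The momentum math behind a kinetic impactor like DART is a one-line estimate: delta-v ≈ β·m·v/M, where β is the "momentum enhancement" factor from ejecta thrown back off the surface. The masses, speeds, and β below are rough illustrative values, not actual parameters of DART or of the Tuí mission described later:

```python
# Kinetic-impactor delta-v estimate: delta_v = beta * m * v / M.
import math

def impactor_delta_v(m_impactor, v_impact, m_target, beta=1.0):
    """Velocity change of the target from a head-on impact.

    beta > 1 accounts for ejecta thrown backward adding extra momentum."""
    return beta * m_impactor * v_impact / m_target

# Roughly DART-like numbers (illustrative): ~570 kg at ~6.1 km/s into a
# small ~4e9 kg body, with a beta around 3.
dv_small = impactor_delta_v(570, 6100, 4e9, beta=3.0)

# A 2 km asteroid is vastly more massive: assume density 2000 kg/m^3.
radius = 1000.0
m_big = 2000 * (4 / 3) * math.pi * radius**3   # roughly 8e12 kg
dv_big = impactor_delta_v(570, 6100, m_big, beta=3.0)

print(f"small target: {dv_small * 1000:.2f} mm/s")
print(f"2 km target:  {dv_big * 1000:.4f} mm/s")
```

The same impactor that shifts a small body by millimeters per second barely registers on a two-kilometer asteroid, which is the point Evan makes shortly: a much larger craft is needed for something Perses-sized.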
 
'''C:''' Oh yeah, DART.
 
'''E:''' And it shot the DART at—oh, you’ll love this Cara—at a small test asteroid called {{w|65803 Didymos|Didymoon}}.
 
'''C:''' Didymoon?
 
'''E:''' Didymoon.
 
'''C:''' I like Didymoon.
 
'''E:''' Jay, didn’t you name one of your dogs Didymoon?
 
'''J:''' No.
 
''(audience laughter)''
 
'''E:''' Bob?
 
'''J:''' It’s Jay. He never would admit to it.
 
'''E:''' I thought it was Jay.
 
'''S:''' It was a goldfish.
 
'''E:''' He lied to me.
 
'''C:''' ''(laughs)'' Little Didymoon.
 
'''E:''' Now, look, Didymoon was a much smaller asteroid than Perses is. So the data revealed by the impact is that, yes, it would be effective on an asteroid ''that'' size, but it wasn’t clear if it would do something the size of Per—oh, I failed to mention: Perses is two kilometers in diameter—
 
'''C:''' —But doesn’t it ''barely'' have to move because it’s so far away, still?
 
'''S:''' —Yeah, but—
 
'''C:''' I mean, I know it’s close. But it’s ''so'' far.
 
'''E:''' —A couple centimeters—
 
'''S:''' —Two kilometers is big.
 
'''J:''' Yeah, but I thought—
 
'''E:''' Yeah two kilometers is ''huge''.
 
'''C:''' —Yeah, but it’s like "bink," and then it’s, like, ''so'' far from us.
 
'''S:''' —It’s all momentum.
 
'''J:''' —But I also thought that they were worried that hitting something like that could cause just a bunch of smaller objects.
 
'''S:''' No, that’s only if they hit it with a nuclear weapon. And even then—
 
'''E:''' —Right, and that was never really a consideration, even back in the late teens, when they were talking about that even as a possibility for any future impact. They kind of ruled it out at that point, for—
 
'''J:''' Okay.
 
'''B:''' —Yeah, the composition of the asteroid’s critical in determining the best approach.
 
'''S:''' What method. But this is solid, right? So it has to be solid. You can’t hit a pile of rubble with an impact method—
 
'''E:''' Right, because you’re—
 
'''C:''' —So it’ll just stay rubble.
 
'''E:''' Exactly. No effect.
 
'''S:''' [inaudible] It’ll decay—it’ll have no effect. So—
 
'''E:''' —No effect.
 
'''S:''' But the thing is, it’s just hard launching a ship fast enough, heavy enough to hit it with enough momentum to move it—
 
'''C:''' —And also, it’s, like, yeah, it’s two kilometers, but that’s really small in the grand scheme of, like, ''space''.
 
'''E:''' So this is why—
 
'''S:''' —But it’s really big in the grand scheme of a ''rocket''.
 
'''C:''' True, but they have to get that calculation ''perfect'' to be able to reach it.
 
'''S:''' That’s not a problem.
 
'''C:''' Really?
 
'''S:''' They won’t miss.
 
'''C:''' Okay.
 
'''B:''' {{w|Classical mechanics|Newtonian mechanics}}. You don’t even need {{w|Introduction to quantum mechanics|quantum mechanics}}. [inaudible] is good enough.
 
'''E:''' China’s craft is significantly bigger than DART’s was. So they’re relying on the much, much larger size of this to perhaps do the job. They’re calling it—I don’t speak Chinese. If anyone out there does speak a dialect of Chinese, forgive me—Tuí Tuí, which is Chinese for "push" or "shove," which I thought was kind of cute. That’s a phonetic spelling. T-U-I with an accent over it is how they spelt it in English.
 
'''J:''' So can they tell—Don’t we have the science to know that the gravity from the ship is going to affect it or not? We’re all kind of sitting on pins and needles, like wanting to get something definitive.
 
'''S:''' But it’s all orbital mechanics. They’ll have to hit it, and then they’ll have to follow its orbit for, like, two years to ''really'' know what the impact is.
 
'''B:''' That’s right. You have to be—
 
'''S:''' —That’s why they can’t wait—
 
'''C:''' —It takes ''that'' long for them to know if it’s knocked off its course?
 
'''S:''' —That’s why they have to do everything at once. They can’t wait because every time they wait, we lose the ability to deflect it.
 
'''B:''' Yeah, it’s just too important to screw up, so that’s why it’s good to have Plan A, B, C, as many plans as you can muster.
 
'''C:''' Redundant. Are there more than two?
 
'''S:''' I don’t think three is enough. They should do something else.
 
'''E:''' There’s a third.
 
'''B:''' There is a third. There is—
 
'''E:''' —Now, Tuí’s going to launch in late 2038, early 2039 is the estimated window for that one. But, third prong attack—and, Bob, you’re going to love this one.
 
'''B:''' Oh yeah.
 
'''E:''' This is called Alda. A-L-D-A. It’s expected to launch in 2040, and it stands for Asteroid Laser Deflection Array. Well, I have to mention it now. We love {{w|Alan Alda}}, when we used to watch him back when television was a thing. When {{w|M*A*S*H (TV series)|M*A*S*H}}—but he was also a great science communicator. He did {{w|Scientific American Frontiers|Scientific American Discoveries [sic]}} on—
 
'''C:''' —You can still get M*A*S*H on the Aug.
 
'''E:''' —So good.
 
'''S:''' Yeah, you can.
 
'''E:''' —And you never know what kind of entertainers and stuff are going to become science communicators or great things. {{w|Millie Bobby Brown}} became an oceanographer, and who saw that coming? {{w|Stranger Things|Stranger things}}. Who saw that coming?
 
'''C:''' —''(laughs)'' She was smart.
 
'''E:''' —But, in any case, ALDA’s going to be launched in 2040. And it’s going to contain five {{w|Laser propulsion|space lasers}}<!--Not sure if there’s a better wikipedia article…-->, Bob—
 
'''C:''' —''Lasers''.
 
'''E:''' —They’re going to rendezvous with Perses—
 
'''B:''' —How powerful?
 
'''E:''' —In 2040—how powerful, indeed! 50 {{w|peta-}}watts per laser.
 
'''B:''' Yeah! [inaudible]
 
'''E:''' Woo! They’re going to blast this thing.
 
'''S:''' Are they going to draw a shark on the side of the—
 
'''E:''' —I hope so. ''(audience laughter)'' If they don’t, what a wasted opportunity. The idea being that you pound this thing with enough laser power—debris, gases get released from it—
 
'''S:''' —And that pushes it.
 
'''E:''' —And that creates a little bit of a push. It takes time. This doesn’t—you don’t send it up there, fire a couple lasers, and call it a day. They estimate it’s going to take 6 to 24 months of laser bombardment in order to get the thing to move those few centimeters.
 
'''C:''' Wait, are the lasers space-based, or are they Earth-based?
 
'''E:''' Oh, they’re launching them out.
 
'''S:''' They’re space-based.
 
'''C:''' Oh they’re launching. Okay, got it.
 
'''E:''' Yep, they’re going to launch them out there.
 
'''C:''' Is anything going to be in between this laser ship and—
 
'''B:''' —Not for long. Not for long.
 
''(laughter)''
 
'''E:''' Not at 50 peta-watts!
 
'''C:''' Are we risking anything?
 
'''E:''' Not at 50 peta watts.
 
'''C:''' They have a pretty clear shot. They’ve calculated that. They don’t care.
 
'''E:''' It will intercept it in 2043. And so that’s the three-prong attack, and the first launch happened today, so we will keep obviously close tabs on this one.
 
'''C:''' So what do we think the odds are?
 
'''E:''' With all three of these things going out there? I think, ''I'' think very good—
 
'''B:''' —Doable.
 
'''E:''' —Scientists are not really putting out any false hope and saying, "Yeah, it’s guaranteed to work," or any kind of 99.9% effective. They’re not really saying anything along those lines, for obvious reasons.
 
'''S:''' So we’re starting at 33%, and I think each one will knock it down 10% or so. They’re hoping to get it to less than 5%. But that may be the best they can do.
 
'''C:''' So it’s an interesting {{w|eschatology|eschatological}} threat. It’s kind of the first one other than climate change, which has been this slow burn. Heh, no pun intended. This is the first real time where I’m feeling like this could be how I go out, you guys.
 
'''B:''' This could be how everyone goes out.
 
'''E:''' We’ll go out with you.
 
'''C:''' You'll be ''dead'' by then.
 
''(audience laughter)''
 
'''J:''' Why do you keep reminding us of that?
 
'''C:''' I’m sorry! But you have this weird false hope that you’re going to live forever. What—
 
'''E:''' —He’s taking his extension therapy…
 
'''C:''' —It’s 2035. When is this supposed to hit us?
 
'''E:''' We’ve had relatively—2055. 20 years from now.
 
'''S:''' [inaudible] people live into their 90s.
 
'''C:''' 20 years from now and you guys are already in your mid-70s?
 
'''S:''' Yeah. Our farm relatives lived into their 90s.
 
'''E:''' I’m only 65.
 
'''J:''' My dad made it to 86, and he ate whatever the hell he wanted.
 
'''C:''' You’re only 65? Huh. You’re closer in age to me.
 
'''E:''' That’s right.
 
'''S:''' He didn’t eat meat slugs.
 
'''J:''' That’s right.
 
'''E:''' That’s right!
 
'''C:''' All right. You’re right.
 
'''B:''' My grandmother’s in her early 90s.
 
'''S:''' Or cricket biscuits.
 
'''C:''' We might all go out this way.
 
'''J:''' We have to stay hopeful, and we have to trust the scientists.
 
'''S:''' But it shows you how necessary the asteroid detection system was. Without that early detection system, we wouldn’t have known about this until it was too late.
 
'''C:''' Remember, we didn’t really have that back in the day, did we?
 
'''S:''' I think—remember we interviewed {{w|Rusty Schweickart}}? {{Link needed}} He was working with the UN to develop—
 
'''J:''' That was the beginnings of it. That was it.
 
''(crosstalk)''
 
'''E:''' This is it.
 
'''B:''' Who was he?
 
'''S:''' That detection—
 
'''C:'''  —This happened in our lifetime.
 
'''J:''' He was the {{w|Apollo program|Apollo}} astronaut that we talked to.
 
'''S:''' Remember?
 
'''B:''' I don’t remember that at all.
 
'''S:''' Yeah, that was a long time ago.
 
'''C:''' His memory’s been going.
 
'''B:''' A long time ago.
 
'''J:''' That was one my—that was one of the best interviews we ever did.
 
'''E:''' You have to go back and listen to that one, Bob.
 
'''C:''' How many episodes have we done at this point? Phew!
 
'''S:''' We’re over 1,500.
 
'''E:''' Uh, where are we?
 
'''C:''' Catalogued…
 
'''B:''' So Cara, right before the asteroid hits, I’m going to call you. I’m going to say, "I’m still here."
 
'''E:''' ''(laughs)''
 
'''C:''' We’re still going to be doing the show, my friend.
 
'''S:''' I’ll still be editing it.
 
'''C:''' ''(laughs)''
 
''(audience laughter)''
 
'''E:''' That’s true.
 
'''S:''' I mean, this episode, I’ll be editing it.
 
''(laughter)''
 
'''S:''' So you’re hopeful, Evan?
 
'''E:''' I’m optimistic. The glass is ''more'' than half-full.
 
'''S:''' It’s hard to talk about anything other than this. I know it’s kind of been dominating the news, but I think people are just expecting it’s going to be taken care of, and—
 
'''C:''' —Well, what other options—
 
'''S:''' —Or else you get—
 
''(crosstalk)''
 
'''S:''' We can’t obsess about it all the time.
 
'''E:''' What are we going to do, run around like this for 20 years with our arms flailing in the air? [presumably demonstrating]
 
''(audience laughter)''
 
'''C:''' Let’s start a cult!
 
'''S:''' I know! We’ll kill ourselves!
 
'''E:''' That’ll do.
 
'''S:''' That’ll fix it.
 
'''E:''' That’ll fix everything.
 
'''C:''' Do it the ''day'' before the asteroid, and we’ll never know what happened. ''(laughter)'' We won’t even be missed.
 
===Deep Learning <small>(1:01:56)</small> ===
 
'''S:''' Okay, Bob.
 
'''J:''' Finally.
 
'''S:''' Finally. So we’ve been literally talking about this for 30 years. Remember, 30 years ago, when you thought that we would have {{w|artificial intelligence}} by now?
 
'''B:''' Yeah, yeah, yeah.
 
'''C:''' ''(laughs)''
 
'''S:''' And I said, "Nah."
 
'''B:''' Keep rubbing it in. It’s coming.
 
'''S:''' So have we made any adv—where are we?
 
'''B:''' Yeah, this, ''this'' looks promising.
 
'''S:''' ''This'' looks promising?
 
'''C:''' ''This'' is the one!
 
'''E:''' ''This!''
 
'''B:''' So {{w|deep learning}} is in the news again.
 
'''S:''' Again!
 
'''B:''' Remember, we used to talk about deep learning—
 
'''S:''' —Right there with the {{w|The Hype About Hydrogen|hydrogen economy}}, right?
 
'''B:''' —We used to—we talked about, come on, we talked about deep learning a ''lot'' in the late teens, early 20s, and it looked promising as hell, really promising. Remember some of those advances? Let me lead with what the news is, here, that researchers from the {{w|Marvin Minsky|Minsky Institute}} have announced that they created a viable path to {{w|artificial general intelligence}}, and that they think that using the {{w|Moravec's paradox|Moravec’s artificial general intelligence test}}—they think this could be the first AI test to—
 
'''S:''' —Which one?
 
'''B:''' —The Moravec artificial general intelligence test.
 
'''S:''' What happened to the {{w|Turing test}}?
 
'''B:''' It repl—come on! Get with the times, dude.
 
'''J:''' So general intelligence, just to remind the audience, is a computer that can think like a human being.
 
'''B:''' Right. It’s adaptable and—
 
'''J:''' —It’s not—
 
'''B:''' —It’s not super-smart in one domain. It’s like a human—
 
'''C:''' —Wait, remind me what made deep—
 
'''B:''' —Intelligent in many domains.
 
'''C:''' —What made deep learning deep learning? What ''is'' deep learning?
 
'''B:''' —Well, deep learning is—it’s a technique. It’s an artificial intelligence technique using neural networks and a ''lot'' of training data to see, increasingly clearly, patterns in data—lots of data—that otherwise are very, very hard to see. So—
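
As a minimal sketch of what Bob means by "neural networks and a lot of training data," here is a toy two-layer network learning XOR by gradient descent. This is nothing deep-learning-scale, just the core fit-patterns-from-examples loop; every number in it is arbitrary:

```python
import numpy as np

# Toy "pattern from data" demo: a tiny 2-layer network learns XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # hidden layer weights
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # output layer weights

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

lr = 1.0
for step in range(10000):
    h = sigmoid(X @ W1 + b1)            # hidden activations
    out = sigmoid(h @ W2 + b2)          # network prediction
    loss = np.mean((out - y) ** 2)      # squared-error loss
    # Backpropagate the loss through both layers.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(f"final loss: {loss:.4f}")
print("predictions:", np.round(out.ravel(), 2))
```

The same recipe, scaled up by many orders of magnitude in layers, data, and compute, is what drove the chess, Go, image, and speech results Bob mentions next.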
 
'''C:''' —Oh, and that’s why it had all those creative chess moves and Go moves.
 
'''B:''' —Well, right. There was {{w|AlphaZero}}, there was {{w|AlphaGo}}. Those were the systems that beat the best {{w|chess}} and best {{w|Go (game)|Go}} players on the planet. But not just — the AlphaZero was the one that was really fascinating for me because that was a system using deep learning that created a system that is so good in chess that they didn’t even test it against people because it was a waste of time. They tested it against the best computer chess program, and it kicked its butt. And human {{w|Grandmaster (chess)|grandmasters}} that looked at it were like, "This thing played like a person but like a person-super computer hybrid." They said it was such an amazing, virtuoso performance. They could not believe how good this was, and this was largely created by deep learning. So deep learning — oh, what are you laughing at? What’s going on over there? So—
 
'''C:''' It’s a ''hybrid!''
 
'''E:''' It’s a hybrid!
 
'''J:''' Oh, I missed it!
 
'''C:''' ''(laughs)'' You missed it!
 
'''B:''' Come on!
 
'''E:''' Turn ''off'' your Aug!
 
'''C:''' ''(laughs)''
 
'''J:''' When Bob talks, I just zone out, and I start looking at {{w|Cats and the Internet|cat videos}}. I can’t help it.
 
''(audience laughter)''
 
'''S:''' Cat videos!
 
''(laughter)''
 
'''J:''' They’re f-ing adorable.
 
'''B:''' All right. So the point was, Jay, deep learning was a huge success in the late teens and the 20s, not only with chess and Go but also image recognition, autonomous driving, language recognition. It was an amazing success, but the problem was that it was overhyped. Remember? It just went—
 
'''S:''' —Like everything is overhyped.
 
'''B:''' —This one was crazy overhyped. It went viral. If you looked for AI classes, everything was deep learning, deep learning, deep learning. And so it really was a victim of its success because people kind of equated deep learning with artificial intelligence in ''general'', right? They thought deep learning was going to create the first truly artificial general intelligence, which it could never have done because if you look at it, deep learning was just a tiny little subset of {{w|machine learning}}, and machine learning was a tiny little subset of AI itself. So it never—this was just one of the tools of AI that just exploded, and it really created a false impression. So people became disillusioned when deep learning—
 
'''S:''' —And there was a post-hype phase.
 
'''B:''' —wasn’t making all these—right. So there was—
 
'''E:''' —And, Bob, as a parallel to this, remember the "train your brain" to do—what was the name of that? {{w|Lumosity|Luminosity [sic]}} or whatever.
 
'''J:''' That was all [inaudible]. Yeah, they said your playing their stupid little things will—
 
'''E:''' —Playing your games will increase your overall intelligence and a whole bunch of —but it actually only helped you out in that ''one'' very specific set of puzzles you were learning.
 
'''J:''' —Yeah, it turns out—
 
'''B:''' —Right, okay. I see where you’re going with that.
 
'''J:''' —It turns out that the game that you were training on, that’s what you got better at.
 
'''E:''' That’s right.
 
'''C:''' So that’s like a metaphor for deep learning versus general intelligence.
 
'''S:''' It’s more like a metaphor. But I think the hype, though, was similar to—remember there was the hydrogen economy hype, which never manifested. Then there was, "stem cells are going to cure all diseases," which never manifested. Although all of these—there is a niche for this. Stem cells are having their day now, twenty years later. But it’s not going to cure everything. Then we were going to cure everything with {{w|CRISPR_gene_editing|CRISPR}}—
 
'''J:''' —Then we had the {{w|Twinkie#Twinkie_diet|Twinkie diet}}. Everybody was eating Twinkies.
 
'''S:''' —The Twinkie diet.
 
'''C:''' ''(laughs)'' After ''{{w|Zombieland}}'' with Evan [inaudible].
 
'''B:''' That worked. I lost ten pounds on that.
 
'''E:''' ''(laughs)'' That franchise petered out.
 
'''C:''' Yeah, man.
 
'''S:''' And so, it’s the same thing. But we knew: deep learning was ''never'' on the path to AGI.
 
'''B:''' Right. And if you were kind of an enthusiast in this, you kinda realized that. But the general population really had no idea. So they were really disillusioned—
 
'''S:''' —They don’t know the difference between AI and AGI.
 
'''B:''' Exactly. And it kind of created this little—they call it an {{w|AI winter}}, which has happened a couple times in the past, when AI was first really, really hyped. They thought, "Well look, we can create these chess programs. We’ll have human intelligence matched in five or ten years!" And they weren’t even close and—
 
'''S:''' —Oh, you remember when we saw {{w|2001:_A_Space_Odyssey_(film)|''2001''}} how—we were like, "Yeah, we’ll have that in 30 years."
 
'''B:''' Yeah, it seemed totally reasonable. So the expectations were way, way high. Remember that? So just like previous AI winters, it caused a little mini winter, and people were very disillusioned, but the research continued. And we see a lot of its successes, and it’s not called deep learning, or even AI. They called it lots of different things so that people didn’t even realize what it was, and it had become so embedded in society that you don’t even think of it as AI, which is the true test of a system’s success.
 
'''S:''' It’s everywhere now. [inaudible] Deep learning is driving your car, it’s doing everything. But it’s in the background, and nobody talks about it, so you think it’s a failure. We’re like, nope, it’s running everything you’re using.
 
'''B:''' But it’s not AI. It’s not artificial general intelligence—
 
'''S:''' —It’s not AGI.
 
'''B:''' —which is what people—which is the real sexy thing that’s in the movies and the TV shows and what everyone really, really wants, and they’re very disappointed. So I think they may—what are you laughing at?
 
'''J:''' ''(laughs)'' This [cat] video is so funny!
 
''(Laughter)''
 
'''C:''' ''(laughs)'' He is such a jerk.
 
'''E:''' Oh, Jay! Some things ''never'' change.
 
'''J:''' Oh, man.
 
'''B:''' Turn on the scrambler again!
 
'''J:''' This zombie [inaudible]—
 
'''C:''' ''(laughs)''
 
'''E:''' —Jay, Jay, send me the link.
 
'''B:''' So one of the things that this Minsky Institute really showed was that consciousness, they think, is really—it’s like a three-dimensional thing. You need three things. You need computational intelligence, and that’s what deep learning can really help with. But that’s only ''one'' leg of the tripod. You also need autonomous complexity as well—
 
'''J:''' —What does that mean?
 
'''B:''' —And that means—it’s like survival drives. It’s things like getting out of bed in the morning because you want to get out of bed. You’re goal-oriented. You’ve got intentionality. You want to do stuff. Those are things that—
 
'''S:''' —That’s the part that always worries me.
 
'''C:''' Yeah.
 
'''B:''' What’s your goal going to be?
 
'''S:''' Right.
 
'''C:''' It’s like this locus of control.
 
'''B:''' But you need that. That’s something you—consciousness needs that leg of the tripod.
 
'''J:''' I don’t know. I don’t really want to get out of bed, and I’m conscious.
 
''(audience laughter)''
 
'''S:''' You claim.
 
'''E:''' Well…
 
'''C:''' Sometimes you do, though.
 
'''B:''' So the third leg, this is the important one—
 
'''S:''' —Twinkies?
 
'''B:''' —Social complexity. ''That’s'' the one that was really a major driver for human consciousness. Without that—
 
'''C:''' —But this is still digital consciousness. Let’s be clear. It’s ''approximating'' human consciousness.
 
'''B:''' —It is. It is. So who knows how big conscious space actually is and [for] synthetic consciousness, what form that will take. But using the human consciousness as a template, they think—
 
'''C:''' —It’s the only one we have.
 
'''B:''' —You’ve got one real data point there. Well, except for—but it’s all, like, life on Earth and primates and dolphins. So they think that if you link up these AI test beds that have those three legs—so you’ve got the computational complexity, like deep learning gives us with pattern recognition and things like that, and you link that up to another system that has autonomous complexity, and these have been developing in the labs for 15, 20 years, and then you hook that to the social complexity cognitive robotic agents, put them all together—
 
'''S:''' —You get a [https://masteroforion.fandom.com/wiki/Psilon Psilon].
 
'''B:''' —then you—well—
 
'''C:''' —I don’t understand what that means. ''(laughs)''
 
'''E:''' Still?
 
'''J:''' So what’s the point?
 
'''B:''' They’re joining these different test beds that look at AI from a different perspective, putting them together, and they’re communicating, sharing data, sharing the things that you need to become, we think, have a consciousness like humans. So they’re communicating… And the one drawback with deep learning is that even the ones that were great at chess, they couldn’t tell you, "Well, I looked at all the rules of chess and I played about a billion games, and these are my takeaways. These are my insights into chess." They ''can’t'' give us those insights because they’re not designed to speak and say, "This is what my takeaway [is]," so it’s kind of like a black box, kind of like an {{w|oracle}}, where you ask—
 
'''E:''' —We may never know.
 
'''C:''' —We can’t learn [inaudible].
 
''(crosstalk)''
 
'''B:''' —You really can’t. You ask a question, you get an answer. And it sounds like—that sounds completely unintuitive. How could that even work?
 
'''C:''' —42!
 
'''B:''' —But when you test it, it works.
 
'''E:''' —Basically.
 
'''B:''' —So these systems are now communicating with each other, and this is the crux of this news item: they’re talking together, they’re making advances that they never would expect, not only with computational complexity but social complexity and autonomous complexity. They’re seeing advances they have not seen, ever, so they think this could be—we’re not there.
 
'''C:''' Should we be scared?
 
'''B:''' Not now. Maybe later.
 
''(laughter)''
 
'''B:''' Now, now just be happy because it looks like we’re finally on the path for some sort of artificial general intelligence—
 
'''C:''' ''Now'' be happy! Be scared later. Okay.
 
'''E:''' —Enjoy it while it’s good.
 
'''B:''' —You know, brain imaging has come a long way, and that’s like comparing top-down to bottom-up approaches. I think it could give brain imaging a run for its money because that’s another viable way for artificial general intelligence. We’ve got a brain! Image it. Digitize it. Make it work digitally, and that’s another viable path. That’s very promising, but now maybe it has competition. Who knows who will get there first.
 
'''C:''' Yeah, {{w|radiology}} has been a dwindling specialty lately. Like, the techs are able to do a lot of what the physicians used to do because these new—
 
'''B:''' —The pattern recognitions are—
 
'''C:''' —Yeah, pattern recognition algorithms are amazing.
 
'''B:''' —In that domain, they’re off the hook. Off the hook.
 
'''S:''' All right but here’s the thing that concerns me, right? And this is going back at least 15 years when I first heard about this thing. You guys remember {{w|Google}}, right? They have a—it’s still sort of state of the art. They can translate any language into any language, right?
 
'''B:''' Yep.
 
'''S:''' But do you know how they do that? You translate every language into a machine language and then you translate that machine language back into any other language. So you don’t have to make a connection between every language and each other; you just have to make a connection between every language and this machine language.
 
'''B:''' That’s what’s happening here!
 
'''S:''' But on steroids. So this is going back at least 15 years—
 
'''C:''' —But it’s so glitchy, still, isn’t it?—
 
'''S:''' —No, it really isn’t.
 
'''C:''' —I mean, when you do that, you lose so much context and nuance and cultural kind of—
 
'''S:''' —It’s getting a lot better because they’re not translating word-for-word, they’re translating idea-to-idea. You can translate even a euphemism, and metaphor, whatever, into the machine language. But here’s the thing—
 
'''C:''' —And there’s {{w|prosody (linguistics)|prosody}}, and all—
 
'''S:''' —Here’s the critical bit: ''nobody'' can speak this machine language. We have ''no'' idea what it is.
 
'''C:''' Well, yeah, of course not because it’s got every—it’s like the core of everything.
 
'''S:''' Yeah, it’s a separate language that these computers developed. This is mainly deep learning. They developed it through deep learning and—
 
'''C:''' —It’s the black box.
 
'''S:''' —they understand it, but no ''human'' understands this language. So now we have computers talking to each other ''in'' this language that we can’t understand, and it’s like a closed loop. It is another black box. Who knows what the hell’s going to pop out of it.
 
'''E:''' We can’t command them to tell us what is going on?
 
'''S:''' We can’t—it’s not a human language. We can’t understand it.
 
'''C:''' Yeah, all they can do is translate back into our language, which is—
 
'''S:''' —That’s right. They’ll translate back into English, but they can’t communicate to us directly in their language, and people tried—
 
'''C:''' —Because we can’t speak their language.
 
'''E:''' It’s not just {{w|binary code|binary}}?
 
'''S:''' No, it’s not binary. It’s an abstract language.
 
'''C:''' It’s like a synthesis of everything else. It needs all of it to be able to—
 
'''S:''' No one’s been able to crack it—
 
'''J:''' There’s only about 30 movies out there {{w|Category:Films about artificial intelligence|that show how bad}} that this will turn out. And we just keep pretending like it’s going to be okay. We should just be like, "Maybe we shouldn’t let computers speak to each other in a language that we don’t understand." Maybe?
 
'''C:''' But, Jay—
 
'''B:''' —That’s been happening on some level for decades—
 
'''S:''' —It’s been happening for a while.
 
'''C:''' —It’s easy to say that, but think about all the amazing technology we’d be missing if we just, like, blocked this from the beginning.
 
'''B:''' But not just that. Imagine the things we can learn, even geo-engineering to help with this climate change ''disaster'' we’re entering.
 
'''S:''' I’m sure they’re running the calculations on the rockets to move the—
 
'''E:''' Something tells me the computers don’t care too much about carbon emissions. It’s no threat to their—
 
'''J:''' —We’re really screwed.
 
'''E:''' —existence.
 
'''C:''' —No, but that’s the thing, we are inherently limited through our own human filters and fallacies, right? So these computers are capable of maximizing algorithms. They don’t fall victim to the {{w|heuristics}} that we have to use. So they’re going to be able to solve problems that we are too limited to be able to solve.
 
'''S:''' That’s the hope.
 
'''C:''' The question is, what are the unintended consequences?
 
'''E:''' Yes, that’s always the case.
 
'''J:'''  The real day that we’ll know we’re screwed is when we finally do tell the computers, "Well, tell us what you’re talking about with the other computers." And they go, "Eh, nothing, don’t worry about it."
 
''(laughter)''
 
'''E:''' "You’ll find out."
 
'''S:''' "It’s not important."
 
'''B:''' Maybe they’re writing poetry. Probably not.
 
'''S:''' I wasn’t worried about this when they were driving your car and things like that, but when you talk about, "We’re going to combine the deep learning piece and social piece with the self-preservation, full autonomous"—that’s the piece that’s ''always'' concerned me. And even if it—and, remember, I’ve gone through these phases where at first, I’m like, "Yeah, this is something we need to be worried about." Then I’m like, "Meh, maybe not, because this is the deep learning phase. Deep learning can do anything without AGI, so we’re not going to develop AGI." Then we sort of really learned the hard limits of deep learning. It’s like, "Well, so we may need to go beyond that." But also, you don’t ''need'' self-awareness in order to be a threat to civilization.
 
'''B:''' Right, just mindlessly do something very destructive.
 
'''S:''' Exactly.
 
'''J:''' In the future, they’re going to say, "Skynet went online in 2037." And you know what happened with Skynet and the {{w|''Terminator'' (franchise)|Terminator}}, remember that?
 
'''S:''' Well didn’t Skynet turn into something else? What was the one it turned into? I forget that crappy reboot. Remember, from 20—
 
'''J:''' Yeah, whatever, that movie sucked.
 
'''C:''' ''(laughs)''
 
'''E:''' Nobody knows. Nobody watched it.
 
'''B:''' I’ve got it on my {{w|Ultra-high-definition television|10K screen}}. It’s awesome.
 
'''S:''' So they have it in 10K?
 
'''E:''' 10K, that’s it.
 
'''C:''' I just watch everything on my Aug now. You guys still have ''screens?''
 
'''S:''' Yeah, I’m old-fashioned.
 
'''E:''' Retro.
 
'''C:''' You’re so retro. You still drive cars, don’t you?
 
''(laughter)''
 
'''S:''' I will still occasionally drive.
 
'''E:''' I have a {{w|classic car|classic}}!
 
'''C:''' You guys will go out and drive a car.
 
'''S:''' Yeah, I still have my license.
 
'''B:''' The {{w|Flying car|drone cars}} are the best, though, come on.
 
'''S:''' Yeah, I know. That’s true.
 
'''C:''' Self-driving…
 
'''S:''' So if Perses doesn’t kill us, the Psilons are going to kill us. Is that what you're telling us?
 
'''J:''' Right.
 
'''B:''' Maybe. Maybe. It’s going to be a fun ride either way.
 
'''S:''' But at least we’ll have slug burgers to eat in the meantime.
 
'''B:''' ''(laughs)'' Way to bring it around, there, Steve!
 
'''S:''' Been doing this for a while, Bob.
 
''(laughter)''
 
{{anchor|sof}}
{{anchor|theme}} <!-- leave these anchors directly above the corresponding section that follows -->
== Science or Fiction <small>(1:16:50)</small> ==
{{SOFinfo
|theme = Anxiety<ref name=anxiety>[https://www.neurocorecenters.com/8-facts-anxiety-symptoms-statistics Neurocore: 8 Fascinating Facts About Anxiety: Symptoms, Statistics, and Efforts to Reduce the Stigma]</ref>
 
|item1 = Anxiety is more prevalent in developed countries and among women.
|item2 = Anxious people are less sensitive to changes in facial expressions.
|item3 = Friends and family of socially anxious people tend to think highly of them.
|item4 = People who suffer from anxiety can perceive smells negatively while having an anxious episode.
|}}
{{SOFResults
|fiction = less sensitive
 
|science1 = more prevalent
|science2 = negative smells
|science3 = think highly of
 
|rogue1=bob
|answer1=less sensitive
|rogue2=Steve
|answer2=less sensitive
|rogue3=Evan
|answer3=less sensitive
|rogue4=Cara
|answer4=think highly of
|host= Jay      <!--- asker of the questions --->
<!-- for the result options below,
    only put a 'y' next to one. -->
|sweep=      <!-- all the Rogues guessed wrong -->
|clever=    <!-- each item was guessed (Steve's preferred result) -->
|win=y        <!-- at least one Rogue guessed wrong, but not them all -->
|swept=      <!-- all the Rogues guessed the correct answer -->
 
}}
'''S:''' So, Jay, you are going to cover "Science or Fiction" this episode.
 
'''B:''' Oh boy.
 
'''J:''' Right.
 
'''E:''' Ooo!
 
''Voiceover: It’s time for Science or Fiction.''
 
'''J:''' So as you know, Cara and I very openly talk about our—we’re medicated people. I suffer from anxiety. I thought I'd talk about {{w|anxiety}} today.
 
'''C:''' ''Extra'' medicated today, though.
 
'''J:''' And I thought I would hit you guys with some interesting facts about anxiety and see if you could figure out which one of these is not correct. So the first one is—so what I'll do is I'll go through these four items—
 
'''B:''' —I’m anxious about this one.
 
'''C:''' ''(laughs)'' Yeah.
 
'''J:''' —and then I'll quiz the audience, and then I'll let you guys go, and then we'll see if you guys change the audience's decisions. So the first one is: "Anxiety is more prevalent in developed countries and among women." The second one is: "Anxious people are less sensitive to changes in facial expressions." The third one: "Friends and family of socially anxious people tend to think highly of them." And the last one: "People who suffer from anxiety, while having an anxious episode, can perceive smells negatively."
 
So if you [the audience] think that the first one – anxiety is more prevalent in developed countries and among women – if you think this one is the fake, clap when I lower my hand. ''(a few claps)'' Okay, four people. ''(audience laughter)'' The second one – anxious people are less sensitive to changes in facial expressions – if you think this one is the fake... ''(most of the audience single claps)'' The third one – friends and family of socially anxious people tend to think highly of them – if you think this one is the fake... ''(another few claps)''. And the fourth one – people who suffer from anxiety, while having an anxious episode, can perceive smells negatively. ''(remaining few claps)'' Okay so, definitely, the crowd here thinks that number 2 is the fake, the one about anxious people are less sensitive to changes in facial expressions. So, Bob – and don't scroll, because all the answers are [inaudible].
 
''(laughter)''
 
'''C:''' You can't ask your wife!
 
=== Bob’s response ===
 
'''B:''' Okay. "...more prevalent in developed countries and among women." That just makes sense. That’s all I’m going to say. "Socially anxious people tend to be thought highly of by friends and family." Yeah, that kind of makes sense. I just realized I know so little about this. I’m just going by what little experience I have. That kind of makes sense as well. And then this last one, here, this one ''really'' makes sense to me. "People who suffer from anxiety, while having an anxious episode, can perceive smells negatively." I’ve run into some people who seem to have that happen, although I don’t know if they were necessarily suffering from anxiety. But I think I’m going to go with the audience. They seem to be very confident about this. And this is the only one, the second one, that doesn’t quite make as much sense to me as the other ones. '''They’re less sensitive to changes in facial expressions.''' I can’t imagine why that would be so. So I’ll say that one’s fiction.
 
'''J:''' All right, Steve.
 
=== Steve’s response ===
 
'''B:''' Steve’s like, "I wrote a paper on this one!"
 
''(laughter)''
 
'''C:''' Novella et al., 2029. <!-- what?!?!?!? -->
 
'''S:''' The "developed countries and among women", I seem to remember that that is the demographic, yeah. Anxious people are less sensitive to changes in facial expressions? I would guess they were ''more'' sensitive to it because they’re kind of looking for things. So that may be how that one is the fiction. That was my initial thought. Friends and family think highly of them? Yeah, I think they tend to be more kind of overachiever kind of people who are anxious, so that would go along with that. And, yeah, this is going back maybe 15 or 16 years, but I seem to remember the smell one, that they interpret things in a negative way. It’s kind of like the brain is just interpreting everything negatively. So that makes sense. I was thinking that '''the facial expression one was the fiction''' even before the audience chimed in, so I’m going to agree with the audience as well.
 
'''J:''' Evan?
 
=== Evan’s response ===
 
'''E:''' Well, I’m not trying to be a lemur here, but—
 
'''S:''' —Lemurs don’t jump off cliffs. That’s a myth.
 
'''C:''' —That’s also not a lemur. That’s a ''lemming.''
 
'''E:''' —Thank you, Steve.
 
'''S:''' —Lemming.
 
'''E:''' —Oh, whatever!
 
''(laughter), (applause)''
 
'''E:''' I set 'em up, they knock 'em down! ''(laughter)''
 
'''C:''' You don’t have to be a lemur, either. ''(audience laughter)''
 
'''J:''' You’re such a ''lemur.''
 
'''S:''' So what would that be? You piss on your hands and rub it up against trees? ''(laughter)''
 
'''E:''' Yeah, let me show you. ''(laughter)'' Oh boy. Look, I really have no insight to this. I know very little about anxiety issues. I’m a neophyte when it comes to this kind of stuff. I don’t think I’ve experienced any real sensation of anxiety in my life—
 
'''J:''' —Oh, you’re ''so'' lucky.
 
'''E:''' —in which I’ve felt like I had to seek help for it or anything. Maybe I have and just didn’t, but I’ll just say what Steve kind of said—not just because it’s Steve, because I had the same thing—'''''less'' sensitive to changes in facial expressions: that seems to be the opposite.''' Wouldn’t they be ''more'' sensitive to changes in facial expression? They’re constantly looking for feedback, signals, and interpreting—
 
'''S:''' —They could be self-absorbed, though, and that’s why they’re less sensitive.
 
'''E:''' Maybe.
 
'''S:''' I’m just throwing that out there.
 
'''E:''' Maybe, but that was also my initial reaction. And I have no reason to believe that it’s otherwise, so I will go that direction.
 
'''J:''' All right, Cara, what do we got?
 
=== Cara’s response ===
 
'''C:''' This is a tough one because I’m not sure I agree with the crowd. I do agree that anxiety is more prevalent among women. I know depression is more prevalent among women, and the neurotic personality style is more prevalent among women, and anxiety and neuroticism tend to—I don’t really like that word anymore, but they still do use it in the literature. I also think that people who have anxiety might perceive a smell more negatively just because they’re—I think that vigilance that happens—and also, you specifically said while they’re having—you didn’t say panic attack, but I’m assuming it’s something along the lines of a severe experience of anxiety. They’re going to catastrophize everything. That’s a common experience.
 
My problem is with the two middle ones, and I’m kind of on the fence between them right now. So anxious people are ''less'' sensitive to changes in facial expressions? On the whole, anxious people? I don’t know because there’s so many types of anxiety. I think that if somebody is actively experiencing panic, they’re going to be way less sensitive because they’re not dialed into what somebody looks like at all, but somebody who might be ''socially'' anxious might be ''more'' sensitive to a change because they’re worried about feedback and how they’re being perceived, right? Being anxious is kind of like being high, and you’re like, "Everybody’s looking at me. They all think I’m saying something stupid." That ''can'' be an experience of somebody who’s experiencing social anxiety.
 
On the flip side of that, "friends and family of socially anxious people tend to think highly of them." You specifically said ''socially'' anxious people. Socially anxious people tend to withdraw from interaction in public. And I think that sometimes there is actually a lot of stigma around social anxiety that actually leads to people thinking that that person is anti-social. That person’s not very nice. That person kind of comes across like "they don’t really like me, or they think they’re better than me."—
 
'''B:''' —But no one cares.
 
'''C:''' —So I do think sometimes friends and family of socially anxious people might actually stigmatize them a little bit and think negatively of them. So that’s kind of where I’m on the fence because I think ''either'' of those could be true. My fear is that—or my concern is that "anxious people are less sensitive to changes in facial expressions" is a ''broad'' statement. Anxious people ''on the whole'' are less sensitive to facial expressions? Maybe? Maybe not. So—
 
'''B:''' —Come on, be a lemur. Come on!
 
'''E:''' Yeah, yeah! Be a lemur!
 
'''C:''' I might be wrong—and just to be clear, I do not study anxiety, and I don’t have anxiety. I am medicated for depression, and I don’t really work with anxiety in any of my clinical work. It’s not an area that I research ''at all'', so basically what I know is just what I know from textbooks. And I’ve never specifically come across these studies. But there’s a part of me that thinks there is still a stigma around socially anxious people. And so I’m going to say '''people actually ''don’t'' think more highly of them.''' And that’s the fiction. But I could be wrong. You guys could totally have it because I’m on the fence about those.
 
=== Jay polls the audience again ===
 
'''J:''' All right. Let’s go through again. I’m going to ask the audience, here. So, we’ll go to the first one again. "Anxiety is more prevalent in developed countries and among women." ''(one clap)''
 
'''S:''' One holdout!
 
'''C:''' [inaudible]
 
'''S:''' Stick to your convictions!
 
'''E:''' Independence! I love it. Yes.
 
'''J:''' Apparently the rest of the audience was too anxious to clap. ''(laughter)'' "Anxious people are less sensitive to changes in facial expressions." ''(audience single claps)''
 
'''C:''' Hmm…
 
'''J:''' I don’t know. It’s pretty close to the first one.
 
'''C:''' I don’t know. Let’s listen to the next one.
 
'''E:''' A few people shifted.
 
'''J:''' "Friends and family of socially anxious people tend to think highly of them." ''(audience single claps)''
 
'''B:''' Oh boy!
 
'''S:''' Cara definitely influenced them.
 
'''C:''' But, guys, I might have led you astray. ''(laughter)'' I’m really sorry.
 
'''J:''' "People who suffer from anxiety, while having an anxious episode, can perceive smells negatively." ''(another few claps)'' All right. Did you feel that those [middle] two were close?
 
'''S:''' Those two are a lot closer than initially—
 
'''E:''' —A lot closer.
 
'''B:''' —[inaudible] ask again, real quick?
 
'''S:''' No, I think we’ll just call it a tie.
 
'''C:''' I think we shifted it to more tied in between the two.
 
=== Jay explains Item #4 ===
 
'''J:''' All right. I will start with the last one: "People who suffer from anxiety, while having an anxious episode, can perceive smells negatively." So, people with anxiety disorders tend to label neutral smells as bad smells, so '''this one is science.''' Professor Wen Li explains, "in typical odor-processing, it is usually just the olfactory system that gets activated, but when a person becomes anxious, the emotional system becomes part of the olfactory processing stream."<ref name=anxiety/> That is ''fascinating.''
 
'''B:''' Wow. That’s cool!
 
'''J:''' So, your anxious consciousness taps into the way that your olfactory processing happens.
 
'''E:''' But what about the other areas, the other senses? Does it also impact—
 
'''C:''' —I think it does affect other senses, too. It might make sounds more shrill or more difficult.
 
'''E:''' —Tastes, even?
 
'''B:''' —But it makes sense that it would be tied to smells because your olfactory centers are closer to the—
 
'''J:''' —To memory.
 
'''B:''' —the limbic areas of your brain tied to emotions. So that’s why when you smell something, it can bring you back ''decades.'' Just that one trigger of a smell can bring you back to a memory that’s literally fifty years—
 
'''C:''' —They’re also very fast, right? Your olfaction, because it doesn’t pass through the {{w|thalamus}} like everything else. It’s a very fast sense compared to some of the other senses. It’s evolutionarily, like, very old.
 
'''J:''' —To answer your question, I don’t know, Ev, I don’t know if it can hijack the other senses as well. As an anxious person, I will tell you that if I’m having a ''really'' bad panic attack, ''everything'' is catastrophized.
 
'''C:''' Yeah, it’s acute.
 
'''J:''' Yeah, everything’s acute. I would imagine—
 
'''E:''' —Or exaggerated. But negative, as well.
 
'''J:''' —But it’s also something, with my personal experience, very much insular, like I’m turned into myself. I’m not peering out into the world. I’m just looking in at what’s going on.
 
'''C:''' —''(hinting)'' You might not be looking at faces…I don’t kno-o-ow. ''(laughs)''
 
=== Jay explains Item #3 ===
 
'''J:''' All right. So I want to go to #3, "Friends and family of socially anxious people tend to think highly of them."
 
'''C:''' Crap.
 
'''J:''' I’ll just read this. And then you guys will—
 
'''C:''' —Crap.
 
'''J:''' —discover what the truth is. So people with social anxiety usually think they don’t do well in social situations, but new research indicates otherwise. So '''this one is science.''' "Friends of those with social anxiety tend to think very highly of their nervous companions. This is possibly due to how sensitive anxious people can be while they’re in a social environment, meaning that they think before speaking and always consider the feelings of others."<ref name=anxiety/>
 
'''C:''' So, wait, you’re saying that they think more highly of them than they think of themselves?
 
'''J:''' Well, a socially-anxious person—
 
'''C:''' —''(playfully growling)'' ''Not'' what the item said!
 
'''J:''' —Yeah it is.
 
'''C:''' Is it?
 
'''J:''' Yeah, listen.
 
'''C:''' It just says, "highly."
 
'''J:''' "Friends and family of socially anxious people tend to think highly of them." So a socially anxious person is actually, for lack of a better way to say it—they’re scoring points with friends and family because they’re tuned into their politeness and to the other people more. ''Because'' of their social anxiety, they’re reading everyone, and they’re analyzing their environment more actively than a person that doesn’t have the anxiety.
 
'''C:''' Gotcha. Okay. Yeah.
 
=== Jay explains Item #2 ===
 
'''J:''' So I will now go to, "Anxious people are less sensitive to changes in facial expressions." '''This one is the fake.''' So the audience got it. Good job.
 
'''C:''' Good job, guys! ''(applause)''
 
'''J:''' I picked this one because the way that I did this—I tested myself on all of these facts. I read them and thought to myself whether I agreed. The website I found was kind of like, "well, what do you think the truth is?" And it was interesting.
 
I thought that this one was the opposite because of what you and I said, because when you’re having a panic attack, you’re so—your surroundings almost don’t matter because you really do kind of get this haze that comes over you and you’re just in your own head. It’s very insular. But it turns out that people who are anxious—so they said, "People with anxiety are quicker to perceive changes in facial expressions than those without anxiety; however, they are less accurate at perceiving their ''meanings.''"<ref name=anxiety/> So they can misinterpret them—
 
'''S:''' —But they probably interpret them negatively.
 
'''C:''' —They make them negative.
 
'''J:''' Right, right, of course. "It’s easy for those who struggle with anxiety to overthink and jump to conclusions. This may lead to tension and conflict in relationships."<ref name=anxiety/> So, very good, audience. You guys did a great job, except you [pointing to lone hold-out], who I noticed didn’t clap because you were thinking probably like I do. ''(laughter)''
 
=== Jay explains Item #1 ===
 
'''J:''' So the first one: "Anxiety is more prevalent in developed countries and among women." '''This one is science.''' "The US is considered to be one of the most anxious nations on Earth."<ref name=anxiety/> Sociologists blame the increased number of choices—''the increased number of choices that we have''—so our modern—
 
'''S:''' —[inaudible]
 
'''J:''' —well, modern society in general. We have—
 
'''S:''' —It’s getting worse.
 
'''J:''' —We have so many choices in front of us that it adds up to emotional stress throughout the day. You get more and more stressed. You got so many—you’re scrolling through {{w|Amazon (company)|Amazon}}, and you don’t just have one pair of socks. You’ve got hundreds of pairs of socks, and you have to think about it and think about it and think about it. So—
 
'''S:''' —Well, and this gets to the, seriously, the confluence of AI and [[SGU_Episode_762#Social_Media.2C_CAD.2C_.26_the_Aug_.2835:25.29|the Aug]], social media, is you have virtual assistants who make decisions for you, and people love that because it reduces their anxiety—
 
'''E:''' —Yeah, exactly.
 
'''S:''' —it reduces their choices. And now you have not only targeted ads; you’re allowing whoever’s in charge of the Aug to live your life for you, like to lead you around and make decisions for you. And, at first, it’s like the things you don’t really care about that much or whatever, but how intrusive is that going to get? Think about it! Again, we’ll trade convenience for security, for privacy. Imagine how much we’ll trade to really reduce our cognitive load? That is really what psychologists would call that, right? {{w|Cognitive load}} is how much work you have to do to get through your day, to get through a task, to do something. AI system software in general, it’s all engineered—or it should be, if it’s good, if it’s working well—to ''minimize'' cognitive load, right?
 
Good movie-making is about minimizing cognitive load in a lot of ways. I remember, back when we were still doing films, we learned—because we got a course from our friend at {{w|Pixar}}, who said, "If you follow the action on a movie screen"—remember movie screens?—"You follow the action. If one scene ends over here, the next scene picks up here." [Steve presumably gestures.] Right? It doesn’t pick up over here?
 
'''J:''' Yeah, meaning that where your eyes are—
 
'''S:''' —Yeah, they know where eyes are. They’re following your eyes, and then they’re making sure your eyes are following the action from one scene to the next—
 
'''C:''' —It’s less work for you.
 
'''S:''' —because—right, because that’s less work. If you have to suddenly hunt for where the action picks up—"Oh, it’s over here!"—that’s cognitive load—
 
'''E:''' —Too disorienting, yeah.
 
'''S:''' —it takes you out, it re—
 
'''C:''' —That’s why {{w|360-degree video|360 films}} are hard for people. Like it’s hard to catch on to a 360 movie because you have to—
 
'''S:''' —Or {{w|Virtual_reality_applications#Cinema|virtual films}}, remember the virtual films, which never really took off? <!-- Cara and Steve are basically riffing on the same thing here -->
 
'''C:''' —yeah, you have to ''find'' the action, as opposed to—
 
'''S:''' —Yeah, you’re constantly looking for where the action is. They can be fun, but that’s ''high'' cognitive load. You’ve got to be in the mood for that. So now we’re just going to be surrounded by systems that will reduce our cognitive load for us, and that’s like crack. Who won’t do that?
 
'''J:''' That’s like somebody cutting your lawn for you.
 
'''S:''' Yeah.
 
'''J:''' How could you not love that? ''(audience laughter)''
 
'''S:''' Right.
 
'''E:''' The lawn bots.
 
'''J:''' My wife and I were going into overdrive to get the yard cleaned up for the fall. And we hired some people to come and take down some trees from the tornado, and I remember standing—I have a cup of coffee. I’m looking out the window. I’m watching a few guys work on my yard, and I’m just like [loving gesture/nod?] "I love ''all'' of you guys. Thank you so much! This is such a pleasure."
 
'''E:''' "I’m in here, you’re out there."
 
'''B:''' I told you to get robots to do that.
 
'''C:''' ''(laughs)''
 
'''B:''' I look out my window and want the ''robots'' cutting my lawn—
 
'''J:''' —I don’t want robots in my yard. ''(laughter)''
 
'''E:''' "Get off my lawn!"
 
'''S:''' Still not down with the robots?
 
'''J:''' No.
 
== Questions/V-mails/Corrections ==
 
'''S:''' We got emailed—or v-mailed some questions, if we want to take some virtual questions.
 
'''C:''' Uh-huh, uh-huh, uh-huh.
 
'''S:''' I have one, which I want to bring up.
 
=== Question #1: New Universal Flu Vaccine <small>(1:33:24)</small> ===
 
'''S:''' So did you guys all get your {{w|Influenza vaccine|flu shot}} this year? Everybody get their flu shot?
 
''(Rogues confirm.)''
 
'''S:''' It’s not really flu season down here, right?
 
'''C:''' They got theirs six months ago, right?
 
'''E:''' Their ''quad.''
 
'''S:''' Well, the quad, that ''was'' the standard of—actually, remember the [https://en.wiktionary.org/wiki/quadrivalent#Adjective tetravalent vaccines], the flu vaccines?
 
'''E:''' Yeah.
 
'''S:''' But now we have the {{w|universal flu vaccine}}, which came out in 2032. So the question—this comes from Haywood, and Haywood asks—
 
''(Rogues cackle at inside joke.)''
 
'''E:''' [to Jay] He got you! Totally got you.
 
'''C:''' ''(laughing)'' I’m sorry.
 
'''J:''' ''(laughing)'' [inaudible] swallowing. I just [inaudible]. Did you ''not'' think I was going to lose it? Does anybody know why that’s funny?
 
'''B:''' No, you ''can’t''—
 
'''C:''' Yeah, we don’t have to tell them.
 
'''E:''' Unh-unh-unh-unh-unh-unh.
 
'''B:''' You can’t say! Jeez, stop it.
 
'''C & E:''' ''(laughs)''
 
'''J:''' ''(laughing)'' What was that emailer’s last name, Steve?
 
'''C:''' No! He left it off the email.
 
'''S:''' ''(laughing)'' He didn’t say. Just a first name [inaudible].
 
'''C:''' ''(laughs hard)'' He’s totally losing it.
 
'''E:''' ''(guffaws)''
 
'''S:''' So, anyway.
 
'''E:''' Oh, gosh!
 
'''S:''' He wants to know if he should get the new universal flu vaccine because—well, there's now the antivaxxer fear mongering around this one, right, because—
 
'''B:''' —Yeah, of course.
 
'''S:''' —because it’s all genetically modified, et cetera. So, yes, ''Haywood,'' you ''should'' get the universal flu vaccine because even the tetravalent vaccine—Every year, back in the day, up until two years ago, they would have to—If you were from the United States, like we are, they used to give us whatever strains of flu you guys [Australians] were getting, and then ''you'' get whatever strains we’re getting six months before because that was the lead time to make the vaccines. And there’s, of course, hundreds of strains, and they’re just ''guessing.'' So they increased the ''number'' of strains that they were covering per vaccine. Some sort of became permanently embedded, so you have to cover certain strains every year, then you have to add one or two that you think are going to come—
 
'''C:''' —But that left out any potential mutations.
 
'''S:''' Yeah. When the vaccine matches, it’s like 95% effective, but a mismatch could reduce that to 90, 60, even ''40%'' on bad years. It might only be 40% effective.
 
'''C:''' Yeah, there have been years like that, for sure.
 
'''E:''' Very bad.
 
'''C:''' Where you got the vaccine, you still got the flu. It sucked.
 
'''S:''' They’ve been researching a universal flu vaccine for about 40 years. The problem has always been that the parts of the flu vaccine—of the flu virus—that are universal are hidden from antibodies. The immune system can’t get access to them because all of the stuff that changes from strain to strain was in the way. But they did finally figure out a way to crack into that, to get access to the universal bits. And so they’ve now been producing a universal flu vaccine. And if you get that, you are resistant to ''every'' flu strain, and you only need to get it about once every five years. Get it once every five years, and it’s 95% effective ''every'' year.
 
'''E:''' That’s good.
 
'''S:''' So, yes, get it! You should absolutely get it.
 
'''J:''' Of course!
 
'''C:''' We all did!
 
'''S:''' Yeah, I know. I know. I made you get it.
 
'''C:''' ''(playfully proud)'' We’re the SGU!
 
'''E:''' It still hurt a little, though.
 
'''S:''' It’s still a vaccine. It’s still a shot.
 
== Skeptical Quote of the Week <small>(1:36:31)</small> ==
<blockquote>Science is the greatest thing known to humans. Through science we have been able to seize a modicum of control over the otherwise natural state of chaos throughout the cosmos. It is truly the most stunning achievement by a life form that emerged from the dust of the stars. In order for us to be the best stewards of our universe, we must continue the pursuit of science, and may it forever be our torch to light our way forward. — Dr. Alyssa Carson<ref name=Carson/>, first resident of {{w|Moonbase|Armstrong Station}}, The Moon</blockquote>
 
'''S:''' All right, Evan, before we close out the show, give us a quote!
 
'''E:''' "Science is the greatest thing known to humans. Through science we have been able to seize a modicum of control over the otherwise natural state of chaos throughout the cosmos. It is truly the most stunning achievement by a life form that emerged from the dust of the stars. In order for us to be the best stewards of our universe, we must continue the pursuit of science, and may it forever be our torch to light our way forward," spoken by Dr. Alyssa Carson. She’s a NASA astronaut and she was the first inhabitant of Armstrong Station on the Moon in 2031.
 
''(laughter), (applause)''
 
== Signoff ==
 
'''S:''' Thank you guys all for joining me for this special episode, and [to audience] thank all of you for joining us—
 
'''C:''' —Thanks, Steve.
 
'''S:''' —and until next week, this is your {{SGU}}.
 
''(applause)''
 
{{Outro664}}
 
== Today I Learned ==


== References ==


{{Navigation}}
{{Page categories
|Guest Rogues              =
|Live Recording            = <!-- redirect created for Greetings from the Future: Live from Melbourne -->
|Interview                  =
|Randi Speaks              =
|Skeptical Puzzle          =
|Amendments                =
|Alternative Medicine      =
|Astronomy & Space Science = <!-- redirect created for NEAs: Apophis and Perses -->
|Cons, Scams & Hoaxes      =
|Conspiracy Theories        =
|Creationism & ID          =
|Cryptozoology              =
|Energy Healing            =
|Entertainment              =
|ESP                        =
|General Science            =
|Ghosts & Demons            =
|History                    =
|Homeopathy                =
|Humor                      =
|Legal Issues & Regulations =  <!-- redirects created for global warming, the Aug -->
|Logic & Philosophy        =
|Myths & Misconceptions    =
|Nature & Evolution        =  <!-- redirects created for GMOs in 2035; synthetica; global warming -->
|Neuroscience & Psychology =  <!-- redirect created for anxiety -->
|New Age                    =
|Paranormal                =
|Physics & Mechanics        =
|Politics                  =  <!-- redirect created quebec accords -->
|Prophecy                  =
|Pseudoscience              =
|Religion & Faith          =
|Science & Education        =
|Science & Medicine        =  <!-- redirect created for flu vaccine -->
|Science & the Media        =
|SGU                        =
|Technology                =  <!-- redirects created for the Aug; Deep Learning -->
|UFOs & Aliens            =
|Other                      =
|SoF with a Theme =  <!-- redirect created for anxiety -->
}}


== Introduction ==

Voiceover: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

S: Hello and welcome to the Skeptics' Guide to the Universe. (applause) Today is Thursday, December 6th, 2035, and this is your host, Steven Novella. (audience laughter) Joining me this week are Bob Novella ...

B: Hey, everybody! (applause)

S: Cara Santa Maria...

C: Howdy. (applause)

S: Jay Novella ...

J: Hey guys. (applause)

S: And Evan Bernstein ...

E: Good evening folks! (applause)

S: So I have to say it's great to be back in Melbourne, but I am –

B: Wait, why did you laugh? Why was that funny? (laughter) We worked for months to get this pronunciation correct. What happened?

S: There's no right or wrong. There's no right or wrong.

J: As recently as today, somebody sent us an email that explained how to say it, yet again. (laughter) They said, "drop all the vowels."

S: Right. But then they yell at us because there's a difference between saying it properly and saying it with an accent.

J: Yeah.

S: And we're supposed to say it properly for an American.

C: Yeah, without an [inaudible].

S: And I have no idea where in the spectrum of "Mel-born" to "Mel-burn" to "Mel-bin"…

E: Yeah, just don't say, "Mel-born." You're safe.

S: So it's great to be here, but I have to say I'm getting a little old for the 14-hour flights across the Pacific. You know, it was just a couple years ago that they brought back the supersonic commercial airliners, like 2031, I think it was, but they are just still too expensive for schlubs like us.

C: I've done 'em before, though. They're worth it, you guys.

S: Oh, sure.

C: I keep trying to convince you.

B: Of course you've done it. And probably first class [inaudible].

S: What is it, about six hours across the…?

C: Yeah, it's so much easier. It's like flying – it's like it used to be when I'd fly from L.A. to New York.

J: And you don't hear the sonic boom anymore. They got rid of it.

C: Yeah, yeah, it's super comfy. Just fall asleep, wake up, I'm there.

B: But, Jay, that big breakthrough that allowed the supersonic transport to become viable again was the fact that they designed the shape – you've seen the shape, it's a gorgeous, really elongated shape – to minimize the sonic boom to like a thousandth of what it used to be. And that was the big problem with it. Remember, what was it, the old one, the Concorde?

S: And when did we first talk about that? It was, like, 15 years ago.[link needed]

B: Oh my god.

E: Long time ago.

S: And here we are, like just coming [inaudible].

B: Remember? I saw it. I think I saw it in a magazine the first time we were in this area. And I said, "Look at this. This is something that's really going to be big in the future." And it was.

J: It is.

C: It is.

E: You were right, Bob.

C: Tense-shifting is hard from, like, the U.S. to Australia.

S: Yeah, yeah, yeah.

C: Time-traveling a little bit here. (winks?)

== Future "News" Items ==

S: So, it's 2035, so this is our 30th Anniversary year of doing the SGU and because of that, we're finishing up 30 years. We're going to talk about regular news items, but we're going to give more of a history, like, where does this fit into the arc of science and skepticism over the last 30 years of the SGU, right?

=== Québec Accord, Global Corporate Alliance <small>(3:10)</small> ===

S: So, Jay's going to start with a news item that has something to do with global warming. He didn't tell me what it is, but you're going to start by telling us where we've been, where we're going, where are we in this saga that we've been talking about, it seems like, for 30 years.

J: Well, yeah, I mean when we first started talking about this, I don't even know when we first started talking about this –

S: —I think right at the beginning, 2005, 2006.

J: —It was a mounting thing that, as the years went by, we started to talk more and more about it. And then somewhere around the late 2020s, we really started to talk about, almost on every episode, to the point where listeners were emailing us, saying, "Okay, we get it. Global warming is bad news."

But we've seen a lot of bad things happen over the last 10 to 15 years where local governments, or governments in general, are doing absolutely nothing. They still can't get out of their own way, right? We know that, but nothing has really been happening. And then in 2027, when Venice got so flooded that it couldn't recover, that's when the world woke up.

C: That was so sad. I miss Venice. (laughter)

S: And you can't even visit Venice anymore, right?

J: I mean, sure, you can, but there's only certain parts that you can go to.

C: It's too dangerous, guys.

B: But why didn't they try to just to build up, like abandon the bottom five [inaudible].

C: They tried that.

E: Too cost-prohibitive, among other things.

C: The foundation can't hold it.

J: The foundations weren't capable of holding it. So—

S: They would just sink back down.

J: It really hit a note across the globe when a lot of the art got destroyed. So that's when everybody—that's when I think we can kind of look back, as a marker, like the whole world took a pause.

So then in 2027, that same year, we had the Québec Accord happen, which was an absolute failure. I think Canada's heart was in the right place, but they tried to inspire the world to change. But governments just can't get out of their own way.

S: But think about it. Think about the Paris Accord, right, when was that? That was, like, 2015.

E: 2015.

S: Yeah, 2015. They said, "Okay, we're going to limit post-industrial warming to 2.0°C above pre-industrial levels." And even though they knew that bad shit was going to happen at 2.0, really we needed to keep it beneath 1.5, which we hit this year, guys. This year we had 1.5°C above pre-industrial levels, 2035. So they didn't even try to ever get 1.5. They're like, "All right, let's just keep it below 2." And they failed to do that. What they agreed to wouldn't even accomplish that.

J: Yeah, there was no chance of them getting that.

S: And the Québec Accord, they're like, "All right, well, let's, maybe 3.0. Let's just keep it 3°C above…

E: Move the goalposts.

S: And then, they, again, "We're not going to achieve that. We're all …

C: Well, and it's because they're not giving themselves any sort of—it's like a treaty. It's like, "Oh, we'll just agree to all do this."

E: It's a pledge.

C: It's a pledge. They're not even giving—

S: There's no consequences.

C: There's no consequences for not sticking to it.

J: Well, that's the problem because it's the real first global problem.

E: People, countries can exit as they wish.

C: I mean, remember back when Trump just dropped the ball on it? He just left. He just said, "No, Paris." I mean, we've been trying to make up for that ever since.

E: Gone.

S: Maybe Rubio will do the same thing.

J: Yeah.

C: Ugh. President Rubio.

E: President Rubio.

J: So, the things that we've seen—it wasn't just what happened in Venice but, you know, the storms continued to become deadly, right? So we have people dying every time there's a storm, a big storm.

S: Seems like every hurricane's a CAT-5 now.

C: Oh, and my city is constantly on fire. LA, also Sydney, even Melbourne. It's on fire all the time now.

S: Yeah, basically it's always fires.

B: Remember when—

C: Yeah, we used to have a fire season.

B: Yeah, remember fire season. Wasn't that quaint?

C: Now it's a red flag day every day.

J: But the reason why we're reviewing this is because, as you guys know, a few years ago, in 2032, IKEA, of all companies, drew a line in the sand and said that corporations have to now take the responsibility. And I love the tagline. What's the tagline?

S: "We got it."

J: "We got this."

All: "We got this."

C: IKEA! They got this.

S: But I don't think it's (plainly) "We got this." I think it's (assuringly confident) "We got this."

C: (laughs)

S: I think it's like, "Yeah, you guys failed. You're hopeless. You're in total political gridlock. So, somebody's got to step in. So we got this. Go away. We'll [inaudible]."

B: So you're referring to governments in general, right?

S: Yeah, governments.

E: Right.

J: And it's—

B: That was a great tagline.

S: Yeah, but, you know, I'm worried about it.

J: It is a dystopian future, though, when corporations have to save us from government.

C: It's a dystopian present.

S: But, literally, I remember back in 2018, I think it was, there was a very short-lived science fiction series on some channel, some cable channel, where that's exactly what happened, [which] is that corporations had to step in because the governments were in gridlock. And then they used that in order to get—they didn't take over from the governments, governments just ceded them more and more power until they were de facto in charge, which is what a lot of people are worried about—like the conspiracy theorists, but it's actually not unreasonable—that that's the ultimate plan of the—what are they calling it? The Global Corporate Initiative.

J: Right.

C: GC—

E: GCI.

S: Yeah, their plan is not just to fix global warming for the world but to actually take power, to seize power.

J: So it didn't really—it almost started off as a joke, but then, just recently, in the news article that I'm covering, we've actually hit a critical mass. There's a lot of companies that just signed on that agreed that they're going to follow it. Now, here are the basic rules, or whatever, that they're following. So they're saying that they will have a zero-carbon emission or less, meaning that they could actually pay in to even reduce carbon emissions, so the company cannot produce any carbon whatsoever. So—

C: Oh, so they get credits if they go negative, carbon negative?

J: Well, actually, the companies are committing to the Alliance, saying that if they do, they have to pay massive fines to the—

S: Well can’t they just buy the credit from people who are negative?

J: Yeah, yeah.

S: So they have to be neutral—

J: They have to be neutral, whether it’s done through finances or through their—

S: So it’s like the old cap-and-trade thing, but they’re just doing it—

B: But what’s the motivation for them to actually join this? Why are they joining—what’s the win for them? I mean, this is going to cause some—they may have to pay fines if they don’t—

C: Haven’t you seen all of those social media boycotts of all the companies that are just eating carbon? I think young people today, they don’t want to buy products, they don’t want to engage with companies that are just destroying the environment. They’re a lot hipper than we were when we were young.

B: I don’t go on the young people’s social media, so I don’t know what the hell they’re talking about.

C: We’re all on the same platform, Bob.

J: No, but Cara, you’re right, because the boycotting is actually part of the issue now. Any company—well, there’s people—it goes both ways; there’s boycotting going both ways. So we have boycotts happening where companies that don’t join are being boycotted, which is—I’m kind of in that camp. But there are people that are saying if they do join, that these companies are trying to take power away from the government.

C: Great!

J: And people are boycotting them, saying that they’re going to be a part of the future problem.

C: True.

J: As typical—

S: You’re kind of screwed either way, right?

J: It’s a clusterfuck going both ways. It’s a little concerning because I would like to think that these companies have humanity’s best interests in mind.

C: Why would you ever think that?

S: Well, I mean it’s always complicated, all right? Companies sometimes do good things, right? And they get PR out of it, and then you say, “Okay, are they doing it because they really care about their customers, or do they really care about the planet?” They’re living on this planet, too, and some of their profits, actually—there are lots of companies who are losing profits because of climate change. So they’re invested in it as well, but then you have to wonder, are they just doing it for the PR, do they have an ulterior motive [inaudible]—

C: But also, does that matter?

S: That’s a good question, does it really matter?

J: It just depends on what the result is.

S: If you do the right thing for the wrong reason and it helps, is that—how much do you care about the motivation?

C: I mean, when it comes to climate change, I honestly don’t mind.

E: I think they’re also trying to prevent themselves from being handed down punishments by governments for not meeting certain criteria. So they’re kind of trying to stay one step ahead of that because that’s terrible for their PR.

C: They’re not going to get any punishment. The governments are in the pocket of lobbyists anyway.

S: But if they do get off their ass and actually do something, it’s probably going to be shortsighted and draconian, and the companies are afraid of what might happen if some other populist takes control. Who knows—politics now are so—we thought they were bad, 2016 to 2020. They’re even worse.

J: And the trillionaires are doing nothing. We have—

S: Well, some of them are signing onto this accord.

E: Some of them are.

J: So what, though? They’re signing on, but that—they’re the trillionaires. They have the money. They could be throwing down half their wealth to try to save the planet but that hasn’t happened yet.

S: That wouldn’t be enough.

B: Imagine $500 billion being half your wealth.

E: (laughs)

[inaudible]

B: Sorry.

(audience laughter)

J: Of course, there was an unspoken sentence in there, Bob. Something about Halloween, right?

B: No. It’s just that I don’t have $500 billion.

(laughter)

B: And I want it.

C: 2035 and SGU, we’re not making it. We’re just—we still got a long ways to go before—

E: Scratching that, scratching that—

C: Before we break even a million. Definitely not a billion.

J: So we’ll just have to wait and see. I feel like what do we have to lose? No other government—I mean, Denver—I’m sorry, Colorado and California, these are local governments, but they’re kind of signing on now, too, and they’re starting to pressure the companies that are—

S: But they’ve been doing that for years. And here’s the thing: if you look at—like recently I saw over the last thirty years—as I was looking in preparation for this—last thirty years, what has been the energy mix of the world’s energy infrastructure? Right, you’ve seen this chart. I sent this out. So, if you look at all the fossil fuels, they were increasing up until around 2025? And then they leveled off. Coal has decreased a little bit, but it’s overtaken by natural gas. But, overall, fossil fuel has been about level; it’s not decreasing, even now! What’s happening—

E: It’s population.

C: Because there’s so many more people now.

S: Right, it’s 8.8 billion people.

B: Its proportion has been decreasing.

S: Yeah, so there’s been an expansion of renewable, a little bit of nuclear—now that the Gen IV plants are just coming online—

E: About time.

S: But they only have a few years before the older plants really, seriously need to be decommissioned. That’s a looming disaster, by the way.

B: Yeah, but when the fusion plants come online, we’ll be in good shape.

S: Yeah, right.

B: Come on.

S: We’re still 20 years away.

B: It’s real close.

S: We’re still 20 years away.

(audience laughter)

B: It’s not 20 years away; it’s 15 years away.

C: (laughs) Such an optimist.

S: So renewable’s increasing, nuclear’s kind of stable, maybe increasing a little bit, but that’s just taking up all the new expansion of total global energy.

B: Right, which is something.

S: But fossil fuels are flat! We’re not decreasing fossil fuels.

J: We’re maintaining the same carbon output.

S: Over the last—we’ve been talking about this for how long? We haven’t been able—

C: How long has it been? You guys are old now.

S: 30 years.

E: Hey!

C: (laughs)

E: Okay, spring chicken.

C: Hey, well, now…

B: Yeah, when’s your social security kicking in? Not too far away.

E: Yeah, right?

C: I got like a whole decade ahead of me at least.

J: Do you still have social security? [inaudible]

C: No, it’s completely insolvent.

S: All right, so, now we have to wait for IKEA to save us, is that what you’re telling me?

C: No, the Global Corporate Alliance.

B: (sarcasm) That doesn’t sound evil.

J: “We got this.”

C: That does sound evil. (laughs)

S: How could that not be evil?

J: We’ll see what happens.

B: What else do they got?

=== Fourth Domain of Life <small>(14:14)</small> ===

S: All right. Guys, let me ask you a question, especially Bob. How many domains of life are there?

B: Wait, there was—oh, crap. There’s bacteria, archaea, prokaryotes—

S: Those are the prokaryotes.

B: Now, wait. No.

C: Yes.

B: No, no, eukaryotes.

C: And eukaryotes.

B: Archaea, Bacteria, Eukarya, and…

S: So, traditionally, that’s it.

(Rogues assent.)

S: Those three.

B: Oof. Thought I was missing something.

S: But there’s a fourth.

B: Whaa?

S: There’s a new, fourth domain of life.

B: Ooh, I know what you’re saying.

E: That is crazy.

S: And the name will pretty much give it away.

B: Of course.

S: The name is Synthetica.

B: Yes! About time.

S: So now there’s a fourth domain of life.

B: Wait, but is that recognized now?

S: Well, hang on! We’ll get there.

(laughter)

S: Let’s back up a little bit.

=== Revisiting GMOs <small>(15:00)</small> ===

S: So again, we’re going to give the arc, right? We’re talking about genetic engineering, right? Initially, this kind of came on our radar around 2010, maybe 2012, that kind of area, right?

B: Yeah.

S: Something like that—when we started talking about GMOs, right? Genetically modified organisms. And there was a big anti-GMO movement, which lasted deep into the 2020s.

C: Oh my god, we talked about that like every week on the show back then.

S: Well, it’s because it became—

E: Well, that’s because, right, it’s not our fault. It’s their fault!

S: It became a huge thing.

C: That’s true.

S: It was like there was a major science denial thing, even among skeptics initially, but I think we sort of turned the boat around for skeptics first. But then politically it was a really hard sell for a while. Let me give you a history of what’s happened and why there’s really not much of an anti-GMO movement anymore.

B: That was a good win, man. That felt good.

S: Well, it was a good win for the wrong reason. And I’ll explain why. So, first, papaya ringspot virus—this actually goes back decades—had, by 2006, slashed papaya production by 50%. By that time, also, there was basically no papaya farm in Hawaii that didn’t have the ringspot virus, so it was basically obliterating the papaya industry. In 1998 a GMO papaya was introduced, which had the viral inclusion in it, the viral DNA in it. And that was how it conferred resistance to the virus. So, basically, there would be no papaya industry—and this is like going back to 2015—there would be no papaya industry without GMO papaya, which is ironic because Hawaii was one of the most anti-GMO states, but they quietly adopted GMO papayas, because they would be f’ed without it.

C: But that didn’t really change sentiment back then, it felt like.

S: It didn’t because it was under the radar.

C: And that’s because all the staple crops still—they were mostly GM, but people—

S: All the anti-GMO people just ignored the papaya story.

C: Although they ate it.

S: They ate the papaya.

E: Of course they did.

S: All right. The American chestnut tree—there was a fungus, which was—

J: That was back in, when, like the 60s?

S: That wiped out the American chestnut in the 1950s.

J: The 50s.

S: And so we grew up with chestnuts but the trees were just basically dying away. This is like eastern United States, a very, very common tree. It was almost like the most common tree in our part of the world up until we were children, then it was gone. Just totally gonzo.

C: I don’t think I’ve ever eaten a chestnut. Is that a thing people eat?

S: However—

J: That’s at Thanksgiving.

E: You know that song? (starts singing) "Chestnuts roast—"

C: It’s a song. I mean, I've never had a chestnut.

B: Come on, I eat about three of those a year, what are you doing?

E: You’ve never had a chestnut?

C: (laughs)

S: But in 2019 they approved a GMO American chestnut tree that was resistant to the fungus that wiped it out. It was years before they planted it, but now there’s a thriving American chestnut industry.

C: You East-coasters are weird.

S: So those were good wins, but they were below the radar for whatever reason. But here’s the one that I—well, there’s two, there’s two that really drove it home. The first one—in 2024, the Cavendish banana industry completely collapsed—

E: Boom.

S: Due to Panama disease.

B: Cavendish banana? That’s the banana we all think of when you think of a banana, Cavendish.

E: Right, common.

S: At the time. At the time, that was banana.

B: That was it.

E: And that was it, one.

S: So there was the Gros Michel, which died out in the early 20th century, and there was the Cavendish, which died out—

C: And that’s the one you guys always used to talk about. [link needed] You loved those weird Gros Michels.

S: They’re back, though.

J: I remember you cried when we found out that they were gone.

(audience laughter)

S: Well, what the hell? We knew it was coming for years, too. We were talking about it on the show. The banana’s going to be going.

C: (feigns crying) It still surprised you.

S: It still surprised me. Fusarium wilt, or Tropical Race 4, or Panama Disease, completely wiped out the Cavendish industry. I think the last holdout was South America, but it was detected in South America in 2019, and that’s when they knew "now it’s a matter of time." Once they had one banana that went thbbt, that’s it.

B: Remember that? No ice cream sundaes for a little while?

S: We went years without a banana.

B: That was bad, man.

S: But even before that, before 2024, when the Cavendish was gone, back in 2017, Australian researchers had developed a Panama disease-resistant banana. [2]

C: Oh, it came out of Australia? I didn’t realize that.

S: It came out of Australia in 2017.

E: Well done! Well done, audience. Well done.

J: That was beginning of the banana hubbub.

S: It was the beginning of the banana hubbub—

E: I think also known as a "banana-rama".

C: Banana-rama.

S: Banana-rama…but, however, nobody really knew about it until the "bananapocalypse".

J: Bananapocalypse.

(audience laughter)

S: The bananapocalypse wiped out the Cavendish and then these Australian researchers were like, "Hey, we got the GMO."

E: "We got this."

S: We got the resistant banana.

B: We’re ready to go.

S: But the thing is, even that might not have—

B: "We got this."

C: (laughs)

S: "We got this," right. Even that might not have been enough because the Cavendish—I love it, it’s a dessert banana. It was the number one export fruit before it was wiped out.

J: That banana fed countries.

S: Well, no, no, not that banana — other bananas.

J: What other bananas?

S: There are staple bananas that are, basically, like what we would call plantains.

J: Oh, that’s right.

S: They’re starchy bananas, and you cook with them.

C: (in Spanish) Plátanos.

B: They’re awesome.

C: Steve, why are you so into bananas?

S: I don’t know.

C: You’re really into bananas.

S: I’ve just always loved them. My favorite fruit.

C: That’s fair.

B: He tried to grow them for years and failed utterly.

C: (laughs)

E: That’s right! Remember, back in the teens [2010s]—

J: Did I ever tell you that I hated those goddamn banana plants?

S: They were in our studio.

C: (laughs)

J: I know. They were getting in—and his cats were pissing in the banana plants.

E: The cats!

B: That’s what it was. I remember that.

C: I remember that! That’s when I first joined the SGU, way back then. They were in the basement.

J: Steve and I almost got into a fistfight once in our entire life and it was over cats pissing in the studio in the banana plants.

(laughter)

S: Those cats are dead now.

C: A little behind-the-scenes info.

S: Maybe I should try again. But anyway, something like 20% of the world are dependent on bananas for their staple calories.

E: That’s a lot.

S: When those started succumbing to versions of Panama disease, then we were starting to have Africa and Southeast Asia—there was starvation looming—that’s when the world’s like, "Okay, this is not just our ice cream sundaes anymore. We can’t feed these people unless we get these banana cultivars back online."

C: This GM technology is looking pretty good right now.

S: GM technology saved the banana industry and, basically, lots of starving Africans. And then—here’s the double whammy—2026, the citrus industry was completely wiped out by citrus greening.

E: That was awful.

C: I remember that.

B: That was horrible.

S: And again, we talked about that for at least 15 years before it hit. Remember Kevin [inaudible]?

C: He used to come on all the time.

S: He would always tell us, "Man, when citrus greening wipes out the citrus fruit—"[link needed]

E: Then you’re going to see some—

C: He was right.

S: He was absolutely right. That objection to—so, of course, in 2031, the first GMO orange with resistance genes from spinach was planted. They were working on that for years and years.[3] And it essentially resurrected the citrus industry, not only in Florida but also in Australia and in other parts of the world where they grow citrus.

C: Well now they can grow them pretty much anywhere. It was smart.

B: Remember they were selling screwdrivers half-price at the bars?

C: (laughs)

S: So here we are. There’s 8.8 billion people on the planet.

C: God, that’s a lot of people.

S: It's a lot of people. Essentially, everyone knows, except for a shrinking fringe, that there is no agriculture without GMOs, bottom line. We would not be able to feed the planet without GMOs. There are still the extremists who are like, "Yeah, let 'em starve, and then everything will be fine."

J: Oh, great, yeah.

C: Well, those people are terrible.

E: Heartless.

B: They’re so marginalized now.

S: Now they’re totally—even Greenpeace, remember that? What was that, 2030 or something when Greenpeace was like, "Yeah, okay, I guess we have to feed people. We can’t let people starve."

E: It only took them decades.

S: So you don’t really hear anything from the anti-GMO crowd anymore, right?

C: Not really. They’re pretty fringy.

S: They’re pretty fringy. There’s one more thing that happened, too. So this is good. GR-5, the fifth-generation golden rice, is now online, but even going back to GR-2, which was the first one planted in Bangladesh in 2019,[4] if you guys remember that. So, before golden rice, there were 500,000—500,000—children throughout the world who would go blind from vitamin A deficiency every year, and half of those would die within a year. Not only that, but vitamin A deficiency, even if it doesn’t make you go blind or kill you, leaves you with low resistance, susceptible, vulnerable to other infections. So, remember all the measles outbreaks in 2019, 2020, 2021?

J: But that was because of anti-vax.

S: Well, even when there was an anti-vax [movement], the children in Africa especially were susceptible to measles because they had relative vitamin A deficiency.

J: Oh, I never knew that.

S: So, guess how many children went blind in 2035 so far—it’s almost at the end of the year—due to vitamin A deficiency?

C: Less than 500,000.

S: 3,000.

B: Wow.

E: They shaved all that.

C: That’s a big difference.

S: It’s kind of like anything. When you easily fix the problem, it goes away. So anyway, it’s hard to argue with success.

C: So let’s not.

J: But now…

S: But now, but wait, but of course you know—

C: But wait, there’s more!

E: It gets better?

Synthetica (23:55)[edit]

S: Well, no. So that’s the good news. The good news is over now. Now we’re getting into—so have you guys heard the term "gen-craft"? This is kind of a new term. I think we might have mentioned it right before. It’s all under genetic engineering, but it’s not genetic modification. It’s basically crafting life from scratch.

C: This is the synthetic stuff.

S: This is the synthetic stuff, right. We’ve been talking about this since, I think, 2017, 2018?

C: Venter. Craig Venter. [5]

S: Venter. They first did bacteria and then they did colonies, multicellular, and then, actually, not just multicellular pseudo-colonies, but now the first actual multicellular, completely synthetic creatures. Again, we’ve talked about their being created, but the first one was approved for human consumption by the FDA.

B: Wow.

C: Oh, they got it passed!

S: They got it passed.

C, E: Wow.

J: And it’s disgusting.

S: Hang on.

C: Don’t look at it pre-processed.

E: Just put a lot of tomahto sauce on it.

C: (laughs) (in British accent) Tomato sauce.

S: So it’s cibumlimax—that’s a terrible name—ventera.

C: (laughs)

S: It basically means "meat slug". And then ventera is for Craig Venter.

E: All right, Jay, you’re right. (laughs)

C: They’re going to come up with some yummy brand names for this [inaudible].

E: Yeah, something else…

S: That’s the taxonomical name. It’s the domain Synthetica and then they have the "blah blah blah blah blah blah blah cibumlimax ventera."

C: Yeah, we don’t go to the barbecue place and ask for some, like, what’s the Latin name for a cow? (laughs)

S: They’ll call it something—

C: Something "bovine."

E: Oh, bovinus, uh, whateverus.

S: Remember they [inaudible] veggie burgers, then the Impossible Burger, then the Insect Burgers, right? The bug burgers.

B: We’ll call it a "blobby burger." I like that.

S: No, a "slug burger." Slug burger.

E: Slug burgers.

B: Slug? No, blobby burgers.

C: That is not appetizing.

J: You know what, though? You remember how I was so freaked out you were trying to make me eat—

C: Impossible burgers.

J: —cricket meat, cricket wheat or something?

C: Oh, yeah, cricket flour.

E: Cricket flour!

S: Cricket flour. That’s a staple, now, Jay.

(crosstalk)

J: I’m proud to say I’ve never eaten it, and—

C: Still!?

S: You probably have. I guarantee you have.

C: You have and you didn’t even know it.

E: [inaudible] Restaurants are using it. You’ve eaten it.

S: No they don’t. No they don’t.

C: No they don’t.

B: They don’t.

(audience laughter)

S: That’s the thing.

E: (laughs)

S: If you have eaten processed food from the supermarket that is a wheat-like product—

J: That’s bullshit.

C: Jay, it’s in everything now.

E: Have you read your ingredients?

S: It’s in everything.

B: Jay, I’m going to admit right now: Jay was having a hamburger and I made an insect burger, and he didn’t know it, and I [inaudible]. He ate it and said nothing. I didn’t say a word ’til just—

J: When did this happen?

(audience laughter)

B: Six months ago. Jay, you loved it. You loved it, dude.

S: Insect burgers are old news. Now we have slug burgers.

B: Blobby burgers.

C: But we can call them slug burgers.

E: No, no, we’ll come up with something—

S: They’re going to call it something else.

C: Can we call them "craft burgers," since they come from gen-craft?

E: Oh, gen-craft!

(crosstalk)

J: You know what the thing is? The slugs look like—remember pink slime? McDonald’s Chicken McNuggets.

S: You’ve seen the videos?

J: They look like pink slime!

C: I know, but that’s why you don’t look at that. We don’t cook them.

S: It’s just a blob of meat-like protein. It’s just the amino acids and whatever for… And then they grind it up and it looks [like] meat.

B: It’s got no central nervous system, right? So there’s no—

E: Right.

S: Yeah. It has nerves because it can move and it can feed, and it has some kind of neuronal kind of ganglia.

E: Ganglia?

C: The vegans aren’t into this, huh?

S: But it’s like an invertebrate. It’s like an insect or a plant.

C: Steve, so the vegans won’t eat this, huh?

S: Why not? I don’t know. Probably not.

C: I think that—some of them still don’t eat insects.

S: Yeah, if they don’t eat insects, they won’t eat this.

C: Yeah, it’s like a hard-line thing.

S: But it has no face.

E: Has no face!

S: Nothing with the face thing.

(audience laughter)

C: Yeah, that’s a big part of—I don’t eat anything with a face.

S: No face.

B: Did you see the scientists who drew the face on one?

(laughter)

E: Yes, yes!

B: It’s hilarious.

S: So, it may still be a year or two before we could actually get these at the Hungry Jack’s or whatever.

(audience laughter, applause)

C: (laughs)

S: It’s just protein, right? It’s just like the insect wheat. Now we got slug burgers, slug protein. And you could mass produce these things. These eat slime or something. You see them crawling around eating algae, but they’re working on ones that can photosynthesize.

C: Oh, that’s smart! Just kind of direct—

S: So guess how many genes are in this synthetic slug?

J: Like what, 300 or something?

B: Wait, no. How many genes? So we’ve got far fewer genes than we anticipated when we first—was it 20,000?

S: We have 10,000.

B: So, how about, like, 8,000?

S: 400.

C, J: 400!

E: That’s all?

S: But how much does a slug have?

J: I don’t know.

S: 428. An actual slug.

B: Oh, that’s right. It’s really efficient, huh?

S: Yeah, it’s a little bit more efficient than an actual slug. But the genes have, like, no exons. Or no introns.

B: They work. There’s no junk DNA.

C: So, Steve, is that why they decided to just, kind of, do this as a gen-craft, like a synthetic biology sit—instead of just genetically modifying the slug?

S: Because you’re not going to get animal protein in an insect.

C: That’s true. If you eat a slug, you’re not going to get a high level—you get a little bit of protein.

S: Vertebrate protein [inaudible]. Muscle pro—but this is like making muscle-like protein.

C: Oh, it’s so gross and weird. I love it.

J: But why didn’t they just do it like back when they started to come up with lab meat?

S: But the lab-grown meat thing never really panned out.

J: Why did they—But what happened?

S: It’s too energy-intensive. You can get—I’ve had the lab-grown meat thing, and they’re fine, but they’re still a little bit expensive.

C: But guys, we’re in a water crisis. We can’t use that much water to produce—

S: It’s very water-intensive.

C: Yeah, we can’t do it.

B: Steve, when they were developing Blobby the Slug, did they figure out some of the junk DNA? Like, "Oh, this junk DNA’s important because it does something that we didn’t think it did."

S: There’s no junk DNA in it because it’s totally—So, Venter gave an interview about it. They’ve written articles about it. Every single gene was completely synthesized. And over the last 20 years, they’ve learned the minimum number of genes that are absolutely necessary for something to live, something to develop—

B: For bacteria and stuff, but microorganisms—

S: But it turns out it wasn’t that hard. If you’re building a really simple multicellular creature, most of the genes are for just the cells to live, and then just getting them to differentiate a little bit differently so they break up the work—you know what I mean?—they’re not all doing the same thing. It’s not that hard. It actually turned out to be not that hard.

C: And remember, this thing doesn’t have to live in the wild. It doesn’t have to do a lot of the work.

S: All it has to do is eat.

C: It just has to eat and produce meat for us, or protein for us.

E: It doesn’t have to develop a defense mechanism.

J: I know people like that.

C: (laughs)

B: What if we put it in the wild? Could it evolve?

S: No. It can’t survive in the wild.

C: It would die, I think. It seems like—

S: It has no defense.

C: It has no evolutionary fitness.

E: [inaudible]

(audience laughter)

B: All the other animals would be like, "Look at that slab of protein!"

C: (laughs)

B: "It can’t get away, can’t do anything. Let’s go eat it!"

E: Is there a waste product or a byproduct of it?

S: I mean, it does poop, apparently. But I think they just recycle that.

C: Eww!

J: Why can’t they just make something that poops meat?

(audience laughter)

B: Jay!

S: We’ll get right on that.

C: The most scientifically astute question.

J: They could call it a "shit burger"! (laughs)

E: That’ll sell!

J: I’m not eatin' that shit!

S: Yeah, this is the guy who won’t eat a bug burger.

C: Meat poop!

S: But he wants to eat a shit burger.

J: I would try a shit burger.

(audience laughter)

E: Comes out as sausage links, already cased, ready to go.

C: Quote of the day from Jay. He’ll try a shit burger but won’t eat cricket powder. (laughs)

J: I just have a thing about bugs.

S: But not slugs. Slugs are okay.

C: But unh-unh, feces!

(laughter)

S: So, of course, of course there’s already an anti-gen-craft movement, saying—

E: Oh, this is the bad news. This is the bad news.

J: This is what you’ve been waiting for.

S: —this is the bad news—saying that "it ain’t natural," you know? It’s all the same arguments, recycled over the last 30 years of doing this show. It’s the same thing, right? "It’s not natural. It hasn’t been tested enough."

B: "It’s cruel. It’s cruel."

S: They’re trying to say that—

B: I’ve seen people that—

S: I know, but that’s a hard—this thing is like engineered not to experience its own existence.

E: "We’re playing God." Playing God complex.

C: "Playing God." Yeah, I’ve seen that one a lot.

B: But they’re saying they can’t detect the fact that they are having some sort of existence, some quality of—

S: Prove that they don’t know they’re being killed, whatever. It’s a slug.

C: Aww.

J: Yeah, but…

S: It’s not even cute. They designed it to not be cute.

E: Right. It’s not—it doesn’t have—

C: But sometimes things that are really ugly are a little bit cute.

S: Oh, stop it.

C: It’s true!

J: You shouldn’t talk about your boyfriend like that.

(audience laughter)

E: You’ve been going into the Aug too much and putting faces on these slugs.

C: You know I don’t have a boyfriend.

B: (laughs)

E: So you have to cut down your time.

C: All right, all right, all right, all right. I like the Aug.

S: So we’ll see. They’re already writing virtual mails to their congresspersons. And Oregon already banned it. Already banned in Oregon.

B: Of course they did. I’d be shocked if they didn’t.

E: Yeah, well.

S: It’s terrible. So we’ll see. This is another round, now. We’ll see what they do. They’re still sort of creating their message. But this is, I think, going to be our thing for the next few years, now, is dealing with the anti-gen-craft crowd.

B: Yeah, but don’t forget. This is a new domain of life. This is the first. This is the first application of that creation. I think—

C: Well, they’ve done more in the lab. This is the first one that we’re able to consume. And that’s cool.

B: And that’s great, but who knows what they’re going to come up with with gen-craft.

S: All right, but here’s the thing.

B: Something that’s going to make a blobby burger look like, pff, whatever. Come on!

S: The thing is, they’re not releasing this into the wild. This is a lab creature, right? I think the big fight’s going to come the first time they want to release something into the wild.

B: Well, yeah.

S: Or they grow a crop in a field.

E: Oh, there’s going to be some renegade scientist who tries to do this and—

S: Probably in China.

E: Right, right. The old CRISPR—from way back when.

S: The CRISPR babies.

C: CRISPR baby. Aww.

S: Yeah, they’re still kicking, I understand.

E: Yeah!

B: They can make some that, like, eat all the plastic in the oceans… We know how big of a problem that is.

E: Yeah! Yes!

S: So they’re already doing that with the bacteria. They made the ones that can eat oil spills, that can eat plastic—

C: Yeah, they’re working; they’re just working slowly.

S: —that can eat carbon. So, they’re all there. There’s just a lot in various stages of the regulatory procedure. Some are being used, but they still haven’t pulled the trigger on releasing a Synthetica into the wild. I think that’s going to be the next step.

J: As they should be because that’s super dangerous.

B: It is.

S: It depends.

E: Well, it depends on the form.

C: We have to hear from the experts. The regulatory boards are being formed, the ethics boards, and they’re figuring it out.

S: But here’s one thing: they cannot, by design, cross-pollinate or interbreed with normal life, with the other three domains of life.

C: Exactly.

E: Right. Where’s the—no compatibility.

S: They’re producing—

J: How do we know?

S: Maybe people will figure it out.

C: And these organisms are just pure prey animals at this point. They’re not…

B: But Steve, what—

C: (as Dr. Ian Malcolm) "Life finds a way."

S: Life finds a way…

B: They’ve done—I remember, way back in 2019, I talked about how they took bacteria and they were turning them into multicellular because they were able to—

S: Yeah, this is an extension of that.

B: So, imagine taking Archaea or Bacteria with their exotic metabolisms, creating multicellular life out of them. So then, what, would that fall under Synthetica? Or would that be—

S: It depends. So, by definition—

B: We’ve become Eukarya, then—

C: Yeah, how are they defining these?

S: By definition, if you are a member of the domain Synthetica, all of your genes have been created entirely artificially.

E: 100%.

B: Okay.

C: Even if you perfectly replicate a… gotcha.

S: Yes. That’s a loophole. You can replicate a gene that exists in other creatures, but you have to have completely manufactured that—

J: We’re going to have to—

C: And it’s got to be trademarked. You can read it in the DNA.

J: We’re now going to have to train—

S: At the very least, they take out all the junk and all that stuff.

J: I’m serious. We have to train Blade Runners to kill these things.

(laughter)

S: "Slug runners."

J: Slug runners!

S: (laughs)

J: Because they get out, think about it, they get it out and then they don’t want to be eaten. And next thing you know, they’re punching holes through walls and they’re pissed off at people.

C: With their little slug hands! (laughs)

S: The tears in the rain.

J: They go back to the scientists who made them.

E: Extended protoplasm arm…

C: Their [inaudible]. (laughs)

S: "I don’t want to be a burger!" "You’re a slug!" Slug runners, yeah. All right.

Social Media, CAD, & the Aug (35:25)[edit]

B: All right, what do we got next?

S: What do we got next? We have—

C: Am I next?

S: Yes. Cara is next with—what’s the latest, Cara, with social media?

C: Oh, god, there’s so much to talk about, you guys.

S: This is overwhelming.

C: The main article that I wanted to cover today was kind of the big—and I know you all saw this. This was the headline everywhere. It just happened two days ago, and we’re still dealing with the fallout. We’re going to be dealing with fallout for a while. So you guys know Control-Alt-Delete, this hacker movement, "CAD."

S: CAD.

E: CADs. C-A-D.

C: Yeah, a lot of people call them "CADs", C-A-D.

S & B: All the cool people call them CADs.

E: I still like "Control Alt Delete," though.

C: I guess I’m not cool. And Control-Alt-Delete is this kind-of underground—we still don’t know who they are, right? There have been a couple of examples in the news where somebody came out and was like, "I’m Control-Alt-Delete," but nobody actually believes them.

S: If you admit to being CAD, you’re not CAD.

C: Then you’re not CAD.

E: Is that the Spartacus moment? "I am Spartacus! I am Spartacus"…

S: No, it’s not. It’s loser wannabes. The real people, you will never find out who they are.

C: So Control-Alt-Delete has been targeting a lot of these new platforms. The biggest one, the one that’s been the hardest kind to get into is the one that most of us are on, the Aug, right? I mean, I’ve been wearing—I’ve had my Aug on all night, actually. I think it’s kind of fun, especially when you’re sick and a little bit loopy.

(Rogue whistles "loopy" sound effect)

C: I don’t know if all of you are in it right now. We don’t really have to be sitting here.

E: Nah, I turned mine off.

S: Intermittently.

C: Yeah, you turn yours off.

B: I was told I could not bring my Aug, and I’m feeling—I’m getting separation anxiety.

C: Well, Bob, that’s because you just get lost.

S: That’s because when you’re using Aug, Bob—

J: You go off into worlds…

S: Yeah, you are staring off into space. You look creepy.

C: And then we’re like, "Bob? Hello! It’s your turn."

E: Creepier.

(audience laughter)

B: But there’s a lot of cool stuff I’m doing. You know?

C: I know! (Rogues crosstalk.) You have to use Aug to improve your work, dude.

S: Checking your V-mail and stuff while we’re doing the show.

J: Do that shit at home. Don’t Aug on my time.

B: But looking at Jay without my filter on is hard.

C: That’s mean!

J: Hey, man!

B: Sorry, Jay.

J: Thanks, Bob.

B: Look! He’s not shaved. Ugh.

C: I know.

J: (laughs) So you’re seeing a shaved version of me?

B: And the filter I put on his hair makes his hair look so cool.

J: What the f— is wrong with my hair?!

(Cara & audience laughter)

B: It’s cool. It’s nice, Jay, but the filter I have on your hair is awesome.

(audience laughter)

E: That blue streak? That’s cool.

B: Oh, yeah. And it moves and stuff.

C: I have to admit, it has been easier. Like, I don’t really like to wear makeup, and so I like thinking that a lot of people are looking at me in Aug-land like I’m a little improved. It’s 2.0. So, you know that the Aug has been kind of the one that’s taken off the most. There’s some offshoots and stuff, but I’m not using them. Are you guys? (guys all say no) Aug has everything we need, right? It has all of our social stats. It has our social currency. I mean, it’s tied into my bank accounts, all of them, I think.

E: Yep, pretty much. Yeah, for me as well.

C: Yeah, pretty much. And we’ve been kind of on the fence about how it’s plugging more things into it.

S: In 2032, I think it was, insurance companies will now pay for Aug doctor visits.

C: Well there you go!

B: Wow! How’d I miss that?

E: Doesn’t get more mainstream than that.

C: I know. Exactly. It’s kind of hard not to be in the Aug at this point because—actually, it’s impossible. I don’t think I know anybody who’s not using Aug. Do you?

E: Everybody’s doing it.

S: You can’t function in society.

C: You can’t function. How could you function—

S: It was like—

C: What did we do before Aug? We used—

S: We had to use our handheld phones…

E: It was a wallet or something.

C: (laughs)

B: Oh my god. Remember that?

E: Remember cards!?

C: Oh, plastic cards! That’s so funny.

E: Oh my gosh. I kept my old ones. They’re in a file drawer.

J: You guys take it like it’s okay, and I’m not cool with it.

C: Are you still using paper money? (laughs)

J: No. Of course not, but my point is this is a totailor—, totalerant—I can’t even say the word.

S, C, & E: Totalitarian.

J: —totalitarian’s wet dream.

E: Three t’s.

C: Jay, it is in China. It is in Russia, but the government doesn’t have their hands on Aug. I mean I know—

J: How the hell do you know that?

C: Well, I mean, they don’t own the companies.

E: They don’t admit to…

C: It’s private enterprises.

S: But that’s, again, the conspiracy theory. So we all know that Russia and China are complete Aug-totalitarian governments, right? If you live in China, you’re on their version of the Aug. They completely own you.

C: I think it’s still called WeChat.

S: Is it still WeChat?

C: Yeah, they never changed the name.

(audience laughter)

S: Out in the West, in developed—in other parts of the world, the governments don’t control it—

B: And on the Moon, too.

S: —but corporations do, and some people argue that they’re actually more powerful than the government.

C: Absolutely.

S: They own us.

C: We’re still having this conversation—

S: We just don't know it.

C: —privacy versus convenience. And I think at this point—

S: People will always trade privacy for a little bit of convenience.

B: It’s insidious.

J: Back in 2020, Amazon was rated the first company, the number one company, to truly have such an amazing amount of data on its customers that—it’s like a transcendent moment for a company to get to that level of data. And we were questioning back then—I mean, I was. I was following this very closely back then. There were no regulations for that level of data. No government in the world had created regulation to deal with that.

C: I know.

S: Remember when Zuckerberg gave all those testimonies before our Congress and no one believed a word he said?

E: Oh yeah!

C: But Jay, don’t act like you didn’t just buy something from those targeted ads the Aug gave you.

J: I literally just did as we were talking. (audience laughter) No, but the point is, though, we can’t—

B: I love targeted ads.

C: Me too. They’re so good now. They’re crazy good now.

B: [They] really know what I want.

J: I don’t know. We’re so hard-wired into this thing. We have to—

S: It’s scary how [inaudible].

E: The interdependencies—

J: We can’t go back. You can never go back. When cell phones came, there was no going [back] to a life that didn’t exist.

C: It’s part of our life. Yeah, it would be really hard at this point.

J: But this thing owns us.

C: But here’s the thing. Here’s the scary thing, and it’s something that we didn’t think would be possible because of the way that data is distributed in the cloud—and Bob, I know you know about these server farms and data centers. You understand this a lot better than I do. But apparently this is the new headline. So, Control-Alt-Delete managed, finally—and you know they’ve gone in and they’ve shut down server farms before. We keep seeing these headlines where something gets blacked out for a couple weeks and it takes a while to put it back online. They finally somehow managed to trace the data of a packet of people. So 100,000 people—their entire Aug history has been erased.

B: Oh my god.

E: (cringing) Ooooooo!

B: They finally did it. They finally did it.

E: Backup and everything gone?

C: They’re ghosts.

B: All the backups, all the—

S: Orphans, right? Or virtual orphans.

C & E: Virtual orphans.

C: All their money. All of their proof of their education.

J: And there you go.

C: All of their social currency. Everything. Their history. All their memories, basically. We live via our photographs and our video recordings now.

B: I mean, how did they—

S: Their high scores on Plants vs. Zombies are gone. (audience laughter)

B: How did they pull that off?!

C: FarmVille! Who knew that would stick around?

B: I really never thought they would be able to do it. Think of all the backups. It’s not one data center. You’ve got backups. You’ve got backups in the cloud, backups on the Moon. How did they get access to all of that?

C: Who did they know, right?

B: That’s scary as hell.

C: You would think. But this is, maybe, part of the problem, is that when a corporation, a multi-national corporation, owns these things—so they should be spread all over the world—it’s still only one company, ultimately, right? It’s a conglomerate, but—

E: Inside job, maybe? Pirates within?

C: They must. They’ve got to have moles in there. They have to have access to enough information to know.

J: It was terrible what CAD did to these people. It’s terrible. But the reason why they did it was to show that the companies, literally—look, these people don’t have lives anymore. What are these people going to do? They literally don’t exist in our system, in our collective [inaudible].

S: So, congratulations. They proved you could destroy somebody’s life by destroying their Aug—by making them virtual ghosts.

C: They’re the ones who did it.

S: But they’re the ones who did it.

C: The companies so far—these people on the Aug had been fine.

J: I don’t know. I don’t agree. I know that what they did was wrong, but I think that the point that they tried to make, they made, and it’s scary.

C: I think this is showing the dark side of hacktivism. As much as I agree with a lot of the posts that I’ve read from Control-Alt-Delete, I think they went too far this time. They went way too far.

S: They have a point, but they’re basically terrorists. I hate to use that word, but if you’re doing that—So, there’s talk—I don’t know if this is part of your news item, though—but talk at the UN—are you going to get to that part? But the UN, basically, they’re considering a resolution, just so that they have more regulatory power to go after CAD, you know, Control-Alt-Delete: if you 'kill' somebody’s virtual history, that’s now virtual murder.

C: Oh! Like their—oh, because we all have our little avatars. You can actually murder somebody in the Aug?

S: If you comp—like 100% erase somebody’s data so they can’t come back, that is virtual murder—

C: So these guys could be tried in the Hague?

S: —because you create a virtual ghost. They can get tried in the Hague. If they ever catch them, they can get—

C: We know who they are.

E: Well, catching them’s going to be so hard.

J: If they catch those people…

B: Oh my god, yeah.

S: They’re done, they’re toast. But I’m sure it’s like cells. You might get one guy or one cell, but you’ll never totally root out…

J: That’s the other thing, too. The other scary reality is—yeah, so Control-Alt-Delete, sure, they did something bad.

C: Really bad.

J: But there’s—okay, I don’t want to say real terrorists out there—but there are terrorist groups that do want to tear down the society that we live in.

C: How is this different?

S: Well, how better to tear down society than to get rid of someone’s complete Aug history? Jay, imagine yourself as one of these people. What do you do?

J: You’re done. I don’t know.

S: You’re done. You’re cooked. Go live on a commune in the woods somewhere?

J: I think the point is that we’re missing—

S: [inaudible]

C: Some people already do that. There are people who aren’t in the Aug. I don’t know any of them, but I read about them sometimes.

E: The Off-Gridders! I love them. The Off-Gridders.

C: Yeah, the Off-Gridders! Yeah, they’re weird. There’s a TV show about them on Discovery.

E & C: (laughs)

S: The Off-Gridders?

J: Do you guys think—and actually the show is pretty cool—but do you guys think, though, that we are kind of going down the snake’s mouth right now with technology?

(A Rogue sighs)

S: But we’ve been saying this for 20 years.

C: That’s the thing. It’s so hard, right? Because we were going to go this route anyway. That’s the thing. If the Aug’s parent company didn’t hit the right kind of algorithm to get us here, another company would have.

J: I’m not saying—yeah, of course, I think it would have happened anyway—but back in the mid-2015 era, we started to realize that Facebook really didn’t have humanity’s best interests in mind. And then we watched Zucker-freak—

C: Did we ever really think they did?

J: —go in front of Congress and lie his face off, telling them how everything that they—

C: Do you remember when he ran for president? Idiot. Sorry.

(laughter)

J: That was the beginning of his downfall. But the point is, though, we saw even with Facebook—and this is nothing, Facebook is nothing compared to this. This is, literally, we live in augmented reality now.

C: I know, but Facebook didn’t give us anything except people’s pictures of their babies.

S: But at the time—

C: This is way better.

B: And cat…

E: And a lot of advertising. A lot of advertising.

C: And cat videos.

J: I don’t know, I don’t know.

C: But now you can just watch a cat video anytime, anywhere.

E: Oh, yeah, that’s a good point.

B: I always got one running in the corner of my vision. It’s really cool.

(audience laughter)

S: But we were on Facebook, and it was important to our marketing. And it was—

C: That’s the point. That’s the part that’s so—

S: And now we’re on the Aug and it’s—Imagine our show without the Aug.

C: I know, yeah. But that’s the part that’s so unsavory to me, and that I do have the yucky feeling about, is it all is just about marketing, still.

(unknown Rogue): Yeah, it is.

C: It’s all just about selling us shit.

E: That’s been true for so long.

S: Ever since the—exactly.

E: Since the analog days. And beyond.

B: And they’re so good at it now. A lot of people are saying that there’s—it’s actually giving credence to people’s belief in psychics.

C: They think they’re psychic?

B: They must be psychic because they know what I want so fast.

S: Before you know you want it.

C: Don’t they understand big data? That’s ridiculous.

E: (laughs)

B: But we see it. We see it. It’s funny as hell.

J: All right. I’m warning you guys. I’m warning—I bet you in another 10 years we’ll see some seriously bad stuff come out of this.

C: Another 10 years, you’ll be dead. (Laughs)

(audience laughter)

E: There you go, Jay! How’s that?

J: And that’s the bad thing!

B: But those longevity therapies are working pretty damn good.

C: Ever the techno-optimist.

E: Can we download ourselves yet?

B: Look at me.

C: (laughs)

S: You’ll never be [inaudible].

C: "Five to ten years."

[KiwiCo ad]

Near-Earth Asteroids: Apophis and Perses (48:13)[edit]

S: So, Evan—

B: All right, now this is some shit, man.

S: This is the big news. This is actually—everything else is just the warm up to the actual big news that everyone wants to hear about.

C: 100,000 people erased from—

S: Because what do we got, 10 years to live? What’s going on with that?

E: Uh, yeah. We—well, it’s 20 years to live.

C: Say what now?

S: We’ll be dead.

E: But we’re working on it. We’re working on it. I want to remind everyone the whole background of this, so please bear with me before I get to the actual news item.

B: Like we don’t know, but go ahead.

E: I know, I know. So I’m hoping the audience here remembers Apophis, right, the 2029 asteroid that came within 25,000 kilometers of Earth?

S: That’s nothing. That’s a whisker.

J: Phew!

C: But it missed us.

E: It did miss us, absolutely, and that’s what the scientists told us—

S: Yeah, that’s why we’re still here, because of [inaudible].

(audience laughter)

E: And it happened on a Friday the 13th.[6] Which, you know— (crosstalk)

S: What are the odds?

B: Remember the party we threw that day?

E: Pretty decent. There was so much fear-mongering with Apophis. It was first discovered way back in 2004, and at that point, the scientists, with the information they had—there was maybe just under a 3% chance of it actually impacting the planet based on the data that they had at the time. Well, that sent people kind of into "Okay, here it is! Now, finally, this is the real apocalypse coming. Forget all the other—the Mayan, the 2012—all that. This is the actual one."

C: Forget all the other apocalypses!

E: But, as time went on, and more careful studying of it went, they realized—that shrunk down over the years, and by the time, about 2019, 2020 rolled around, the scientists said, "It is 0% chance of this [inaudible]," and, of course, it didn’t.

S: Yeah, it’s not going to happen.

E: But Apophis was the god of chaos, for those who don’t know their Greek [sic] mythology. And you’ll remember that tragic incident leading up to the fly-by, the cult, known as the Children of Claude, that was an offshoot of the Raëlian Movement, you guys remember? We used to talk about the Raëlians way back, like in 2005, 2006.

B: Raëlians, right.

S: Didn’t they pretend to clone somebody at one point?

E: Yes!

C: I can’t believe they stuck around all that time.

E: They did! It was little offshoots of it.

J: Was that guy with the hair that said, "I’m not saying it was aliens…but it was aliens." Was he a Raëlian?

C: (laughs)

E: I think I know of whom you’re speaking. That’s the Claude person, and this offshoot is the "Children of Claude." So, they were the ones who, as the asteroid came by, they thought it was going to open an inter-dimensional space, and the only way to get up there was to be—to leave their earthly coils. A couple dozen people, unfortunately, took their own lives. But we’ve seen this before, cults and suicide.

S: What was that? The Hale-Bopp, back in ’97, and the Heaven’s Gate cult, anyone?

C: [to audience] These guys are all way too young to remember that. No, they’re too young.

E: No? Oh, gosh, I’m totally dating myself. I’m an old man now. Well, in any case, that was the most, I think, notable fear-related story to it. The Internet obviously went wild. But then in 2030, just a couple years ago, you know what came next. The astronomers located object designation 2030-US, also known as Perses.

S: Mmm. Perses.

E: Perses. P-E-R-S-E-S, named for—

S: Not Perseus.

E: Not Perseus, no.

S: Perses.

E: Perses was the Greek Titan of destruction.

S: Mmm. Appropriate.

E: And this one’s giving us trouble. 33% chance—

S: Don’t want to roll those dice.

E: —of impact. And the studies since then—they’ve obviously been very closely monitoring this one—and it’s holdin' true.

C: How far away is it now?

E: Well, we’re about—2055 is going to be the date. June 21, 2055.

B: Aww, right around the [inaudible].

E: So we’ve got—there’s 20 years. But, as you know, NASA, the ESA, the Russian Space Federation, and others have finally—

S: MASA?

E: MASA, among others—Israel’s group, and the space agency of India… So they can’t get behind global warming and deal with that, but at least this is something that they can get behind, and they have gotten behind.

C: We like a good short-term threat.

E: Yeah, exactly. When something’s a little more immediate, and, like, right in your face, that will motivate.

B: Especially when it’s an Extinction Level Event…[inaudible].

C: And they’ll make lots of movies about it.

B: Oh yeah. Documentaries…

S: They’ll dig up Bruce Willis.

C: Poor guy.

B: Think he’s just virtual [inaudible].

J: That movie he made sucked, didn’t it?

S: That one movie he made?

(laughter)

J: And what about that—remember, he was a coal-miner or something?

E: Oh, remember that Christmas movie, Die Hard?

C: I was going to say, Die Hard is Christmas movie! (laughs)

E: Die Hard is a Christmas movie!

S: That’s still my favorite Christmas movie.

C: [again to audience] Also too young, too young. (laughs)

The good news part (52:45)[edit]

E: Wow! Really? Here’s the news item. Here’s the news item today. ESA—

S: —Some good news?

E: It is good news because—

C: —Oh good, thank goodness.

S: I’d rather not be hit by a two-kilometer—

E: Exactly. And the prevention methods have gone into effect because ESA successfully launched GT1 into orbit the other day. No issues, everything is fine. It’s the first salvo in the fight against Perses, which is going to approach Perses and establish a fixed position in close proximity to it. It’s using the gravity tractor method—

C: Oh, "GT1."

E: —GT1, which is why it’s called that.

B: I love this idea.

E: So if all goes according to plan, [presumably demonstrating to audience] here’s Perses, it’s coming in, GT1. They’re going to park it over here in a stable position and the gravity between the two objects, it should nudge it. It should nudge it just—and it doesn’t need to nudge much because it’s still out there far enough—a few centimeters! That’s all they’re looking to do at this distance.

S: Yeah, but 20 years is actually right on the margin—

B: —It’s on the edge.

S: —For the gravity tug method.

J: It’s a little too close for comfort.

C: They want to try as soon as—I mean as late—whatever you want to say—as possible.

S: That can’t be the only thing that they’re doing.

E: No, it’s not—

J: —No, they tried other stuff.

E: —There’s a three-prong attack against Perses, and this was the first one, and it successfully went—but there are two more coming. So the second prong is being undertaken by China’s space agency. They’re going to be launching a direct impact probe into Perses, and they’re going to attempt to knock it off its trajectory. Now, this is sometimes referred to as the "battering ram attempt," but this particular project is considered, actually, a little less reliable because previous experiences from space agencies with this exact method, the direct impact approach, had mixed results.

So, if you recall, NASA conducted a test of the direct impact approach back in 2022. The name of the test was called DART. DART stood for Direct [sic] Asteroid Redirection Test.

C: Oh yeah, DART.

E: And it shot the DART at—oh, you’ll love this Cara—at a small test asteroid called Didymoon.

C: Didymoon?

E: Didymoon.

C: I like Didymoon.

E: Jay, didn’t you name one of your dogs Didymoon?

J: No.

(audience laughter)

E: Bob?

J: It’s Jay. He never would admit to it.

E: I thought it was Jay.

S: It was a goldfish.

E: He lied to me.

C: (laughs) Little Didymoon.

E: Now, look, Didymoon was a much smaller asteroid than Perses is. So the data revealed by the impact is that, yes, it would be effective on an asteroid that size, but it wasn’t clear if it would do something the size of Per—oh, I failed to mention: Perses is two kilometers in diameter—

C: —But doesn’t it barely have to move because it’s so far away, still?

S: —Yeah, but—

C: I mean, I know it’s close. But it’s so far.

E: —A couple centimeters—

S: —Two kilometers is big.

J: Yeah, but I thought—

E: Yeah two kilometers is huge.

C: —Yeah, but it’s like "bink," and then it’s, like, so far from us.

S: —It’s all momentum.

J: —But I also thought that they were worried that hitting something like that could cause just a bunch of smaller objects.

S: No, that’s only if they hit it with a nuclear weapon. And even then—

E: —Right, and that was never really a consideration, even back in the late teens, when they were talking about that even as a possibility for any future impact. They kind of ruled it out at that point, for—

J: Okay.

B: —Yeah, the composition of the asteroid’s critical in determining what [the] best approach [is].

S: What method. But this is solid, right? So it has to be solid. You can’t hit a pile of rubble with an impact method—

E: Right, because you’re—

C: —So it’ll just stay rubble.

E: Exactly. No effect.

S: [inaudible] It’ll decay—it’ll have no effect. So—

E: —No effect.

S: But the thing is, it’s just hard launching a ship fast enough, heavy enough to hit it with enough momentum to move it—

C: —And also, it’s, like, yeah, it’s two kilometers, but that’s really small in the grand scheme of, like, space.

E: So this is why—

S: —But it’s really big in the grand scheme of a rocket.

C: True, but they have to get that calculation perfect to be able to reach it.

S: That’s not a problem.

C: Really?

S: They won’t miss.

C: Okay.

B: Newtonian mechanics. You don’t even need quantum mechanics. [inaudible] is good enough.

E: China’s craft is significantly bigger than DART’s was. So they’re relying on the much, much larger size of this to perhaps do the job. They’re calling it—I don’t speak Chinese. If anyone out there does speak a dialect of Chinese, forgive me—Tuí Tuí, which is Chinese for "push" or "shove," which I thought was kind of cute. That’s a phonetic spelling. T-U-I with an accent over it is how they spelt it in English.

J: So can they tell—Don’t we have the science to know that the gravity from the ship is going to affect it or not? We’re all kind of sitting on pins and needles, like wanting to get something definitive.

S: But it’s all orbital mechanics. They’ll have to hit it, and then they’ll have to follow its orbit for, like, two years to really know what the impact is.

B: That’s right. You have to be—

S: —That’s why they can’t wait—

C: —It takes that long for them to know if it’s knocked off its course?

S: —That’s why they have to do everything at once. They can’t wait because every time they wait, we lose the ability to deflect it.

B: Yeah, it’s just too important to screw up, so that’s why it’s good to have Plan A, B, C, as many plans as you can muster.

C: Redundant. Are there more than two?

S: I don’t think three is enough. They should do something else.

E: There’s a third.

B: There is a third. There is—

E: —Now, Tuí’s going to launch in late 2038, early 2039 is the estimated window for that one. But, third prong attack—and, Bob, you’re going to love this one.

B: Oh yeah.

E: This is called Alda. A-L-D-A. It’s expected to launch in 2040, and it stands for Asteroid Laser Deflection Array. Well, I have to mention it now. We love Alan Alda, when we used to watch him back when television was a thing. When M*A*S*H—but he was also a great science communicator. He did Scientific American Discoveries [sic] on—

C: —You can still get M*A*S*H on the Aug.

E: —So good.

S: Yeah, you can.

E: —And you never know what kind of entertainers and stuff are going to become science communicators or great things. Millie Bobby Brown became an oceanographer, and who saw that coming? Stranger Things. Who saw that coming?

C: (laughs) She was smart.

E: —But, in any case, ALDA’s going to be launched in 2040. And it’s going to contain five space lasers, Bob—

C: Lasers.

E: —They’re going to rendezvous with Perses—

B: —How powerful?

E: —In 2040—how powerful, indeed! 50 petawatts per laser.

B: Yeah! [inaudible]

E: Woo! They’re going to blast this thing.

S: Are they going to draw a shark on the side of the—

E: —I hope so. (audience laughter) If they don’t, what a wasted opportunity. The idea being that you pound this thing with enough laser power—debris, gases get released from it—

S: —And that pushes it.

E: —And that creates a little bit of a push. It takes time. This doesn’t—you don’t send it up there, fire a couple lasers, and call it a day. They estimate it’s going to take 6 to 24 months of laser bombardment in order to get the thing to move those few centimeters.

C: Wait, are the lasers space-based, or are they Earth-based?

E: Oh, they’re launching them out.

S: They’re space-based.

C: Oh they’re launching. Okay, got it.

E: Yep, they’re going to launch them out there.

C: Is anything going to be in between this laser ship and—

B: —Not for long. Not for long.

(laughter)

E: Not at 50 petawatts!

C: Are we risking anything?

E: Not at 50 petawatts.

C: They have a pretty clear shot. They’ve calculated that. They don’t care.

E: It will intercept it in 2043. And so that’s the three-prong attack, and the first launch happened today, so we will keep obviously close tabs on this one.

C: So what do we think the odds are?

E: With all three of these things going out there? I think, I think very good—

B: —Doable.

E: —Scientists are not really putting out any false hope and saying, "Yeah, it’s guaranteed to work," or any kind of 99.9% effective. They’re not really saying anything along those lines, for obvious reasons.

S: So we’re starting at 33%, and I think each one will knock it down 10% or so. They’re hoping to get it to less than 5%. But that may be the best they can do.

C: So it’s an interesting eschatological threat. It’s kind of the first one other than climate change, which has been this slow burn. Heh, no pun intended. This is the first real time where I’m feeling like this could be how I go out, you guys.

B: This could be how everyone goes out.

E: We’ll go out with you.

C: You'll be dead by then.

(audience laughter)

J: Why do you keep reminding us of that?

C: I’m sorry! But you have this weird false hope that you’re going to live forever. What—

E: —He’s taking his extension therapy…

C: —It’s 2035. When is this supposed to hit us?

E: We’ve had relatively—2055. 20 years from now.

S: [inaudible] people live into their 90s.

C: 20 years from now and you guys are already in your mid-70s?

S: Yeah. Our farm relatives lived into their 90s.

E: I’m only 65.

J: My dad made it to 86, and he ate whatever the hell he wanted.

C: You’re only 65? Huh. You’re closer in age to me.

E: That’s right.

S: He didn’t eat meat slugs.

J: That’s right.

E: That’s right!

C: All right. You’re right.

B: My grandmother’s in her early 90s.

S: Or cricket biscuits.

C: We might all go out this way.

J: We have to stay hopeful, and we have to trust the scientists.

S: But it shows you how necessary the asteroid detection system was. Without that early detection system, we wouldn’t have known about this until it was too late.

C: Remember, we didn’t really have that back in the day, did we?

S: I think—remember we interviewed Rusty Schweickart? [link needed] He was working with the UN to develop—

J: That was the beginnings of it. That was it.

(crosstalk)

E: This is it.

B: Who was he?

S: That detection—

C: —This happened in our lifetime.

J: He was the Apollo astronaut that we talked to.

S: Remember?

B: I don’t remember that at all.

S: Yeah, that was a long time ago.

C: His memory’s been going.

B: A long time ago.

J: That was one my—that was one of the best interviews we ever did.

E: You have to go back and listen to that one, Bob.

C: How many episodes have we done at this point? Phew!

S: We’re over 1,500.

E: Uh, where are we?

C: Catalogued…

B: So Cara, right before the asteroid hits, I’m going to call you. I’m going to say, "I’m still here."

E: (laughs)

C: We’re still going to be doing the show, my friend.

S: I’ll still be editing it.

C: (laughs)

(audience laughter)

E: That’s true.

S: I mean, this episode, I’ll be editing it.

(laughter)

S: So you’re hopeful, Evan?

E: I’m optimistic. The glass is more than half-full.

S: It’s hard to talk about anything other than this. I know it’s kind of been dominating the news, but I think people are just expecting it’s going to be taken care of, and—

C: —Well, what other options—

S: —Or else you get—

(crosstalk)

S: We can’t obsess about it all the time.

E: What are we going to do, run around like this for 20 years with our arms flailing in the air? [presumably demonstrating]

(audience laughter)

C: Let’s start a cult!

S: I know! We’ll kill ourselves!

E: That’ll do.

S: That’ll fix it.

E: That’ll fix everything.

C: Do it the day before the asteroid, and we’ll never know what happened. (laughter) We won’t even be missed.

Deep Learning (1:01:56)[edit]

S: Okay, Bob.

J: Finally.

S: Finally. So we’ve been literally talking about this for 30 years. Remember, 30 years ago, when you thought that we would have artificial intelligence by now?

B: Yeah, yeah, yeah.

C: (laughs)

S: And I said, "Nah."

B: Keep rubbing it in. It’s coming.

S: So have we made any adv—where are we?

B: Yeah, this, this looks promising.

S: This looks promising?

C: This is the one!

E: This!

B: So deep learning is in the news again.

S: Again!

B: Remember, we used to talk about deep learning—

S: —Right there with the hydrogen economy, right?

B: —We used to—we talked about, come on, we talked about deep learning a lot in the late teens, early 20s, and it looked promising as hell, really promising. Remember some of those advances? Let me lead with what the news is, here, that researchers from the Minsky Institute have announced that they created a viable path to artificial general intelligence, and that they think that using the Moravec artificial general intelligence test—they think this could be the first AI test to—

S: —Which one?

B: —The Moravec artificial general intelligence test.

S: What happened to the Turing test?

B: It repl—come on! Get with the times, dude.

J: So general intelligence, just to remind the audience, is a computer that can think like a human being.

B: Right. It’s adaptable and—

J: —It’s not—

B: —It’s not super-smart in one domain. It’s like a human—

C: —Wait, remind me what made deep—

B: —Intelligent in many domains.

C: —What made deep learning deep learning? What is deep learning?

B: —Well, deep learning is—it’s a technique. It’s an artificial intelligence technique using neural networks and a lot of training data to see patterns—to see, increasingly clearly, patterns in data, lots of data, that otherwise are very, very hard to see. So—

C: —Oh, and that’s why it had all those creative chess moves and Go moves.

B: —Well, right. There was AlphaZero, there was AlphaGo. Those were the systems that beat the best chess and best Go players on the planet. But not just—the AlphaZero was the one that was really fascinating for me because that was a system using deep learning that created a system that is so good in chess that they didn’t even test it against people because it was a waste of time. They tested it against the best computer chess program, and it kicked its butt. And human grandmasters that looked at it were like, "This thing played like a person but like a person-supercomputer hybrid." They said it was such an amazing, virtuoso performance. They could not believe how good this was, and this was largely created by deep learning. So deep learning—oh, what are you laughing at? What’s going on over there? So—

C: It’s a hybrid!

E: It’s a hybrid!

J: Oh, I missed it!

C: (laughs) You missed it!

B: Come on!

E: Turn off your Aug!

C: (laughs)

J: When Bob talks, I just zone out, and I start looking at cat videos. I can’t help it.

(audience laughter)

S: Cat videos!

(laughter)

J: They’re f-ing adorable.

B: All right. So the point was, Jay, deep learning was a huge success in the late teens and the 20s, not only with chess and Go but also image recognition, autonomous driving, language recognition. It was an amazing success, but the problem was that it was overhyped. Remember? It just went—

S: —Like everything is overhyped.

B: —This one was crazy overhyped. It went viral. If you looked for AI classes, everything was deep learning, deep learning, deep learning. And so it really was a victim of its success because people kind of equated deep learning with artificial intelligence in general, right? They thought deep learning was going to create the first truly artificial general intelligence, which it could never have done because if you look at it, deep learning was just a tiny little subset of machine learning, and machine learning was a tiny little subset of AI itself. So it never—this was just one of the tools of AI that just exploded, and it really created a false impression. So people became disillusioned when deep learning—

S: —And there was a post-hype phase.

B: —wasn’t making all these—right. So there was—

E: —And, Bob, is a parallel to this, remember the "train your brain" to do—what was the name of that? Luminosity [sic] or whatever.

J: That was all [inaudible]. Yeah, they said your playing their stupid little things will—

E: —Playing your games will increase your overall intelligence and a whole bunch of —but it actually only helped you out in that one very specific set of puzzles you were learning.

J: —Yeah, it turns out—

B: —Right, okay. I see where you’re going with that.

J: —It turns out that the game that you were training on, that’s what you got better at.

E: That’s right.

C: So that’s like a metaphor for deep learning versus general intelligence.

S: It’s more like a metaphor. But I think the hype, though, was similar to—remember there was the hydrogen economy hype, which never manifested. Then there was, "stem cells are going to cure all diseases," which never manifested. Although all of these—there is a niche for this. Stem cells are having their day now, twenty years later. But it’s not going to cure everything. Then we were going to cure everything with CRISPR—

J: —Then we had the Twinkie diet. Everybody was eating Twinkies.

S: —The Twinkie diet.

C: (laughs) After {{w|Zombieland}} with Evan [inaudible].

B: That worked. I lost ten pounds on that.

E: (laughs) That franchise petered out.

C: Yeah, man.

S: And so, it’s the same thing. But we knew: deep learning was never on the path to AGI.

B: Right. And if you were kind of an enthusiast in this, you kinda realized that. But the general population really had no idea. So they were really disillusioned—

S: —They don’t know the difference between AI and AGI.

B: Exactly. And it kind of created this little—they call it an AI winter, which has happened a couple times in the past, when AI was first really, really hyped. They thought, "Well look, we can create these chess programs. We’ll have human intelligence matched in five or ten years!" And they weren’t even close and—

S: —Oh, you remember when we saw 2001 how—we were like, "Yeah, we’ll have that in 30 years."

B: Yeah, it seemed totally reasonable. So the expectations were way, way high. Remember that? So just like previous AI winters, it caused a little mini winter, and people were very disillusioned, but the research continued. And we see a lot of its successes, and it’s not called deep learning, or even AI. They called it lots of different things so that people didn’t even realize what it was, and it had become so embedded in society that you don’t even think of it as AI, which is the true test of a system’s success.

S: It’s everywhere now. [inaudible] Deep learning is driving your car, it’s doing everything. But it’s in the background, and nobody talks about it, so you think it’s a failure. We’re like, nope, it’s running everything you’re using.

B: But it’s not AI. It’s not artificial general intelligence—

S: —It’s not AGI.

B: —which is what people—which is the real sexy thing that’s in the movies and the TV shows and what everyone really, really wants, and they’re very disappointed. So I think they may—what are you laughing at?

J: (laughs) This [cat] video is so funny!

(Laughter)

C: (laughs) He is such a jerk.

E: Oh, Jay! Some things never change.

J: Oh, man.

B: Turn on the scrambler again!

J: This zombie [inaudible]—

C: (laughs)

E: —Jay, Jay, send me the link.

B: So one of the things that this Minsky Institute really showed was that consciousness, they think, is really—it’s like a three-dimensional thing. You need three things. You need computational intelligence, and that’s what deep learning can really help with. But that’s only one leg of the tripod. You also need autonomous complexity as well—

J: —What does that mean?

B: —And that means—it’s like survival drives. It’s things like getting out of bed in the morning because you want to get out of bed. You’re goal-oriented. You’ve got intentionality. You want to do stuff. Those are things that—

S: —That’s the part that always worries me.

C: Yeah.

B: What’s your goal going to be?

S: Right.

C: It’s like this locus of control.

B: But you need that. That’s something you—consciousness needs that leg of the tripod.

J: I don’t know. I don’t really want to get out of bed, and I’m conscious.

(audience laughter)

S: You claim.

E: Well…

C: Sometimes you do, though.

B: So the third leg, this is the important one—

S: —Twinkies?

B: —Social complexity. That’s the one that was really a major driver for human consciousness. Without that—

C: —But this is still digital consciousness. Let’s be clear. It’s approximating human consciousness.

B: —It is. It is. So who knows how big conscious space actually is and [for] synthetic consciousness, what form that will take. But using the human consciousness as a template, they think—

C: —It’s the only one we have.

B: —You’ve got one real data point there. Well, except for—but it’s all, like, life on Earth and primates and dolphins. So they think that if you link up these AI test beds that have those three legs—so you’ve got the computational complexity, like deep learning gives us with pattern recognition and things like that, and you link that up to another system that has autonomous complexity, and these have been developing in the labs for 15, 20 years, and then you hook that to the social complexity cognitive robotic agents, put them all together—

S: —You get a Psilon.

B: —then you—well—

C: —I don’t understand what that means. (laughs)

E: Still?

J: So what’s the point?

B: They’re joining these different test beds that look at AI from a different perspective, putting them together, and they’re communicating, sharing data, sharing the things that you need to become, we think, have a consciousness like humans. So they’re communicating… And the one drawback with deep learning is that even the ones that were great at chess, they couldn’t tell you, "Well, I looked at all the rules of chess and I played about a billion games, and these are my takeaways. These are my insights into chess." They can’t give us those insights because they’re not designed to speak and say, "This is what my takeaway [is]," so it’s kind of like a black box, kind of like an oracle, where you ask—

E: —We may never know.

C: —We can’t learn [inaudible].

(crosstalk)

B: —You really can’t. You ask a question, you get an answer. And it sounds like—that sounds completely unintuitive. How could that even work?

C: 42!

B: —But when you test it, it works.

E: —Basically.

B: —So these systems are now communicating with each other, and this is the crux of this news item: they’re talking together, they’re making advances that they never would expect, not only with computational complexity but social complexity and autonomous complexity. They’re seeing advances they have not seen, ever, so they think this could be—we’re not there.

C: Should we be scared?

B: Not now. Maybe later.

(laughter)

B: Now, now just be happy because it looks like we’re finally on the path for some sort of artificial general intelligence—

C: Now be happy! Be scared later. Okay.

E: —Enjoy it while it’s good.

B: —You know, brain imaging has come a long way, and that’s like comparing top-down to bottom-up approaches. I think it could give brain imaging a run for its money because that’s another viable way for artificial general intelligence. We’ve got a brain! Image it. Digitize it. Make it work digitally, and that’s another viable path. That’s very promising, but now maybe it has competition. Who knows who will get there first.

C: Yeah, radiology has been a dwindling specialty lately. Like, the techs are able to do a lot of what the physicians used to do because these new—

B: —The pattern recognitions are—

C: —Yeah, pattern recognition algorithms are amazing.

B: —In that domain, they’re off the hook. Off the hook.

S: All right but here’s the thing that concerns me, right? And this is going back at least 15 years when I first heard about this thing. You guys remember Google, right? They have a—it’s still sort of state of the art. They can translate any language into any language, right?

B: Yep.

S: But do you know how they do that? You translate every language into a machine language and then you translate that machine language back into any other language. So you don’t have to make a connection between every language and each other; you just have to make a connection between every language and this machine language.

B: That’s what’s happening here!

S: But on steroids. So this is going back at least 15 years—

C: —But it’s so glitchy, still, isn’t it?—

S: —No, it really isn’t.

C: —I mean, when you do that, you lose so much context and nuance and cultural kind of—

S: —It’s getting a lot better because they’re not translating word-for-word, they’re translating idea-to-idea. You can translate even a euphemism, a metaphor, whatever, into the machine language. But here’s the thing—

C: —And there’s prosody, and all—

S: —Here’s the critical bit: nobody can speak this machine language. We have no idea what it is.

C: Well, yeah, of course not because it’s got every—it’s like the core of everything.

S: Yeah, it’s a separate language that these computers developed. This is mainly deep learning. They developed it through deep learning and—

C: —It’s the black box.

S: —they understand it, but no human understands this language. So now we have computers talking to each other in this language that we can’t understand, and it’s like a closed loop. It is another black box. Who knows what the hell’s going to pop out of it.

E: We can’t command them to tell us what is going on?

S: We can’t—it’s not a human language. We can’t understand it.

C: Yeah, all they can do is translate back into our language, which is—

S: —That’s right. They’ll translate back into English, but they can’t communicate to us directly in their language, and people tried—

C: —Because we can’t speak their language.

E: It’s not just binary?

S: No, it’s not binary. It’s an abstract language.

C: It’s like a synthesis of everything else. It needs all of it to be able to—

S: No one’s been able to crack it.

J: There’s only about 30 movies out there that show how bad this will turn out. And we just keep pretending like it’s going to be okay. We should just be like, "Maybe we shouldn’t let computers speak to each other in a language that we don’t understand." Maybe?

C: But, Jay—

B: —That’s been happening on some level for decades—

S: —It’s been happening for a while.

C: —It’s easy to say that, but think about all the amazing technology we’d be missing if we just, like, blocked this from the beginning.

B: But not just that. Imagine the things we can learn, even geo-engineering to help with this climate change disaster we’re entering.

S: I’m sure they’re running the calculations on the rockets to move the—

E: Something tells me the computers don’t care too much about carbon emissions. It’s no threat to their—

J: —We’re really screwed.

E: —existence.

C: —No, but that’s the thing, we are inherently limited through our own human filters and fallacies, right? So these computers are capable of maximizing algorithms. They don’t fall victim to the heuristics that we have to use. So they’re going to be able to solve problems that we are too limited to be able to solve.

S: That’s the hope.

C: The question is, what are the unintended consequences?

E: Yes, that’s always the case.

J: The real day that we’ll know we’re screwed is when we finally do tell the computers, "Well, tell us what you’re talking about with the other computers." And they go, "Eh, nothing, don’t worry about it."

(laughter)

E: "You’ll find out."

S: "It’s not important."

B: Maybe they’re writing poetry. Probably not.

S: I wasn’t worried about this when they were driving your car and things like that, but when you talk about, "We’re going to combine the deep learning piece and social piece with the self-preservation, full autonomous"—that’s the piece that’s always concerned me. And even if it—and, remember, I’ve gone through these phases where at first, I’m like, "Yeah, this is something we need to be worried about." Then I’m like, "Meh, maybe not, because this is the deep learning phase. Deep learning can do anything without AGI, so we’re not going to develop AGI." Then we sort of really learned the hard limits of deep learning. It’s like, "Well, so we may need to go beyond that." But also, you don’t need self-awareness in order to be a threat to civilization.

B: Right, just mindlessly do something very destructive.

S: Exactly.

J: In the future, they’re going to say, "Skynet went online in 2037." And you know what happened with Skynet and the Terminator, remember that?

S: Well didn’t Skynet turn into something else? What was the one it turned into? I forget that crappy reboot. Remember, from 20—

J: Yeah, whatever, that movie sucked.

C: (laughs)

E: Nobody knows. Nobody watched it.

B: I’ve got it on my 10K screen. It’s awesome.

S: So they have it in 10K?

E: 10K, that’s it.

C: I just watch everything on my Aug now. You guys still have screens?

S: Yeah, I’m old-fashioned.

E: Retro.

C: You’re so retro. You still drive cars, don’t you?

(laughter)

S: I will still occasionally drive.

E: I have a classic!

C: You guys will go out and drive a car.

S: Yeah, I still have my license.

B: The drone cars are the best, though, come on.

S: Yeah, I know. That’s true.

C: Self-driving…

S: So if Perses doesn’t kill us, the Psilons are going to kill us. Is that what you're telling us?

J: Right.

B: Maybe. Maybe. It’s going to be a fun ride either way.

S: But at least we’ll have slug burgers to eat in the meantime.

B: (laughs) Way to bring it around, there, Steve!

S: Been doing this for awhile, Bob.

(laughter)

Science or Fiction (1:16:50)

Theme: Anxiety[7]

Item #1: Anxiety is more prevalent in developed countries and among women.[8]
Item #2: Anxious people are less sensitive to changes in facial expressions.[9]
Item #3: Friends and family of socially anxious people tend to think highly of them.[10]
Item #4: People who suffer from anxiety can perceive smells negatively while having an anxious episode.[11]

Answer: Item
Fiction: Less sensitive
Science: More prevalent
Science: Negative smells
Science: Think highly of

Host: Result
Jay: win

Rogue: Guess
Bob: Less sensitive
Steve: Less sensitive
Evan: Less sensitive
Cara: Think highly of

S: So, Jay, you are going to cover "Science or Fiction" this episode.

B: Oh boy.

J: Right.

E: Ooo!

Voiceover: It’s time for Science or Fiction.

J: So as you know, Cara and I very openly talk about our—we’re medicated people. I suffer from anxiety. I thought I'd talk about anxiety today.

C: Extra medicated today, though.

J: And I thought I would hit you guys with some interesting facts about anxiety and see if you could figure out which one of these is not correct. So the first one is—so what I'll do is I'll go through these four items—

B: —I’m anxious about this one.

C: (laughs) Yeah.

J: —and then I'll quiz the audience, and then I'll let you guys go, and then we'll see if you guys change the audience's decisions. So the first one is: "Anxiety is more prevalent in developed countries and among women." The second one is: "Anxious people are less sensitive to changes in facial expressions." The third one: "Friends and family of socially anxious people tend to think highly of them." And the last one: "People who suffer from anxiety, while having an anxious episode, can perceive smells negatively."

So if you [the audience] think that the first one – anxiety is more prevalent in developed countries and among women – if you think this one is the fake, clap when I lower my hand. (a few claps) Okay, four people. (audience laughter) The second one – anxious people are less sensitive to changes in facial expressions – if you think this one is the fake... (most of the audience single claps) The third one – friends and family of socially anxious people tend to think highly of them – if you think this one is the fake... (another few claps). And the fourth one – people who suffer from anxiety, while having an anxious episode, can perceive smells negatively. (remaining few claps) Okay so, definitely, the crowd here thinks that number 2 is the fake, the one about anxious people are less sensitive to changes in facial expressions. So, Bob – and don't scroll, because all the answers are [inaudible].

(laughter)

C: You can't ask your wife!

Bob’s response

B: Okay. "...more prevalent in developed countries and among women." That just makes sense. That’s all I’m going to say. "Socially anxious people tend to be thought highly of by friends and family." Yeah, that kind of makes sense. I just realized I know so little about this. I’m just going by what little experience I have. That kind of makes sense as well. And then this last one, here, this one really makes sense to me. "People who suffer from anxiety, while having an anxious episode, can perceive smells negatively." I’ve run into some people who seem to have that happen, although I don’t know if they were necessarily suffering from anxiety. But I think I’m going to go with the audience. They seem to be very confident about this. And this is the only one, the second one, that doesn’t quite make as much sense to me as the other ones. They’re less sensitive to changes in facial expressions. I can’t imagine why that would be so. So I’ll say that one’s fiction.

J: All right, Steve.

Steve’s response

B: Steve’s like, "I wrote a paper on this one!"

(laughter)

C: Novella et al., 2029.

S: The "developed countries and among women", I seem to remember that that is the demographic, yeah. Anxious people are less sensitive to changes in facial expressions? I would guess they were more sensitive to it because they’re kind of looking for things. So that may be how that one is the fiction. That was my initial thought. Friends and family think highly of them? Yeah, I think they tend to be more kind of overachiever kind of people who are anxious, so that would go along with that. And, yeah, this is going back maybe 15 or 16 years, but I seem to remember the smell one, that they interpret things in a negative way. It’s kind of like the brain is just interpreting everything negatively. So that makes sense. I was thinking that the facial expression one was the fiction even before the audience chimed in, so I’m going to agree with the audience as well.

J: Evan?

Evan’s response

E: Well, I’m not trying to be a lemur here, but—

S: —Lemurs don’t jump off cliffs. That’s a myth.

C: —That’s also not a lemur. That’s a lemming.

E: —Thank you, Steve.

S: —Lemming.

E: —Oh, whatever!

(laughter), (applause)

E: I set 'em up, they knock 'em down! (laughter)

C: You don’t have to be a lemur, either. (audience laughter)

J: You’re such a lemur.

S: So what would that be? You piss on your hands and rub it up against trees? (laughter)

E: Yeah, let me show you. (laughter) Oh boy. Look, I really have no insight to this. I know very little about anxiety issues. I’m a neophyte when it comes to this kind of stuff. I don’t think I’ve experienced any real sensation of anxiety in my life—

J: —Oh, you’re so lucky.

E: —in which I’ve felt like I had to seek help for it or anything. Maybe I have and just didn’t, but I’ll just say what Steve kind of said—not just because it’s Steve, because I had the same thing—less sensitive to changes in facial expressions: that seems to be the opposite. Wouldn’t they be more sensitive to changes in facial expression? They’re constantly looking for feedback, signals, and interpreting—

S: —They could be self-absorbed, though, and that’s why they’re less sensitive.

E: Maybe.

S: I’m just throwing that out there.

E: Maybe, but that was also my initial reaction. And I have no reason to believe that it’s otherwise, so I will go that direction.

J: All right, Cara, what do we got?

Cara’s response

C: This is a tough one because I’m not sure I agree with the crowd. I do agree that anxiety is more prevalent among women. I know depression is more prevalent among women, and the neurotic personality style is more prevalent among women, and anxiety and neuroticism tend to—I don’t really like that word, anymore, but they still do use it in the literature. I also think that people who have anxiety might perceive a smell more negatively just because they’re—I think that vigilance that happens—and also, you specifically said while they’re having—you didn’t say panic attack, but I’m assuming it’s something along the lines of a severe experience of anxiety. They’re going to catastrophize everything. That’s a common experience.

My problem is with the two middle ones, and I’m kind of on the fence between them right now. So anxious people are less sensitive to changes in facial expressions? On the whole, anxious people? I don’t know because there’s so many types of anxiety. I think that if somebody is actively experiencing panic, they’re going to be way less sensitive because they’re not dialed into what somebody looks like at all, but somebody who might be socially anxious might be more sensitive to a change because they’re worried about feedback and how they’re being perceived, right? Being on anxiety is kind of like being high, and you’re like, "Everybody’s looking at me. They all think I’m saying something stupid." That can be an experience of somebody who’s experiencing social anxiety.

On the flip side of that, "friends and family of socially anxious people tend to think highly of them." You specifically said socially anxious people. Socially anxious people tend to withdraw from interaction in public. And I think that sometimes there is actually a lot of stigma around social anxiety that actually leads to people thinking that that person is anti-social. That person’s not very nice. That person kind of comes across like "they don’t really like me, or they think they’re better than me."—

B: —But no one cares.

C: —So I do think sometimes friends and family of socially anxious might actually stigmatize them a little bit and think negatively of them. So that’s kind of where I’m on the fence because I think either of those could be true. My fear is that—or my concern is that "anxious people are less sensitive to changes in facial expressions" is a broad statement. Anxious people on the whole are less sensitive to facial expressions? Maybe? Maybe not. So—

B: —Come on, be a lemur. Come on!

E: Yeah, yeah! Be a lemur!

C: I might be wrong—and just to be clear, I do not study anxiety, and I don’t have anxiety. I am medicated for depression, and I don’t really work with anxiety in any of my clinical work. It’s not an area that I research at all, so basically what I know is just what I know from textbooks. And I’ve never specifically come across these studies. But there’s a part of me that thinks there is still a stigma around socially anxious people. And so I’m going to say people actually don’t think more highly of them. And that’s the fiction. But I could be wrong. You guys could totally have it because I’m on the fence about those.

Jay polls the audience again

J: All right. Let’s go through again. I’m going to ask the audience, here. So, we’ll go to the first one again. "Anxiety is more prevalent in developed countries and among women." (one clap)

S: One holdout!

C: [inaudible]

S: Stick to your convictions!

E: Independence! I love it. Yes.

J: Apparently the rest of the audience was too anxious to clap. (laughter) "Anxious people are less sensitive to changes in facial expressions." (audience single claps)

C: Hmm…

J: I don’t know. It’s pretty close to the first one.

C: I don’t know. Let’s listen to the next one.

E: A few people shifted.

J: "Friends and family of socially anxious people tend to think highly of them." (audience single claps)

B: Oh boy!

S: Cara definitely influenced them.

C: But, guys, I might have led you astray. (laughter) I’m really sorry.

J: "People who suffer from anxiety, while having an anxious episode, can perceive smells negatively." (another few claps) All right. Did you feel that those [middle] two were close?

S: Those two are a lot closer than initially—

E: —A lot closer.

B: —[inaudible] ask again, real quick?

S: No, I think we’ll just call it a tie.

C: I think we shifted it to more tied in between the two.

Jay explains Item #4

J: All right. I will start with the last one: "People who suffer from anxiety, while having an anxious episode, can perceive smells negatively." So, people with anxiety disorders tend to label neutral smells as bad smells, so this one is science. Professor Win Lee explains, "in typical odor-processing, it is usually just the olfactory system that gets activated, but when a person becomes anxious, the emotional system becomes part of the olfactory processing stream."[7] That is fascinating.

B: Wow. That’s cool!

J: So, your anxious consciousness taps into the way that your olfactory processing happens.

E: But what about the other areas, the other senses? Does it also impact—

C: —I think it does affect other senses, too. It might make sounds more shrill or more difficult.

E: —Tastes, even?

B: —But it makes sense that it would be tied to smells because your olfactory centers are closer to the—

J: —To memory.

B: —the limbic areas of your brain tied to emotions. So that’s why when you smell something, it can bring you back decades. Just that one trigger of a smell can bring you back to a memory that’s literally fifty years—

C: —They’re also very fast, right? Your olfaction, because it doesn’t pass through the thalamus like everything else. It’s a very fast sense compared to some of the other senses. It’s evolution, like, very old.

J: —To answer your question, I don’t know, Ev, I don’t know if it can hijack the other senses as well. As an anxious person, I will tell you that if I’m having a really bad panic attack, everything is catastrophized.

C: Yeah, it’s acute.

J: Yeah, everything’s acute. I would imagine—

E: —Or exaggerated. But negative, as well.

J: —But it’s also something, with my personal experience, very much insular, like I’m turned into myself. I’m not peering out into the world. I’m just looking in at what’s going on.

C: (hinting) You might not be looking at faces… I don’t kno-o-ow. (laughs)

Jay explains Item #3

J: All right. So I want to go to #3, "Friends and family of socially anxious people tend to think highly of them."

C: Crap.

J: I’ll just read this. And then you guys will—

C: —Crap.

J: —discover what the truth is. So people with social anxiety usually think they don’t do well in social situations, but new research indicates otherwise. So this one is science. "Friends of those with social anxiety tend to think very highly of their nervous companions. This is possibly due to how sensitive anxious people can be while they’re in a social environment, meaning that they think before speaking and always consider the feelings of others."[7]

C: So, wait, you’re saying that they think more highly of them than they think of themselves?

J: Well, a socially-anxious person—

C: (playfully growling) Not what the item said!

J: —Yeah it is.

C: Is it?

J: Yeah, listen.

C: It just says, "highly."

J: "Friends and family of socially anxious people tend to think highly of them." So a socially anxious person is actually, for lack of a better way to say it—they’re scoring points with friends and family because they’re tuned into their politeness and to the other people more. Because of their social anxiety, they’re reading everyone, and they’re analyzing their environment more actively than a person that doesn’t have the anxiety.

C: Gotcha. Okay. Yeah.

Jay explains Item #2

J: So I will now go to, "Anxious people are less sensitive to changes in facial expressions." This one is the fake. So the audience got it. Good job.

C: Good job, guys! (applause)

J: I picked this one because the way that I did this—I tested myself on all of these facts. I read them and thought to myself whether I agreed. The website I found was kind of like, "well, what do you think the truth is?" And it was interesting.

I thought that this one was the opposite because of what you and I said, because when you’re having a panic attack, you’re so—your surroundings almost don’t matter because you really do kind of get this haze that comes over you and you’re just in your own head. It’s very insular. But it turns out that people who are anxious—so they said, "People with anxiety are quicker to perceive changes in facial expressions than those without anxiety; however, they are less accurate at perceiving their meanings."[7] So they can misinterpret them—

S: —But they probably interpret them negatively.

C: —They make them negative.

J: Right, right, of course. "It’s easy for those who struggle with anxiety to overthink and jump to conclusions. This may lead to tension and conflict in relationships."[7] So, very good, audience. You guys did a great job, except you [pointing to lone hold-out], who I noticed didn’t clap because you were thinking probably like I do. (laughter)

Jay explains Item #1

J: So the first one: "Anxiety is more prevalent in developed countries and among women." This one is science. "The US is considered to be one of the most anxious nations on Earth."[7] Sociologists blame the increased number of choices—the increased number of choices that we have—so our modern—

S: —[inaudible]

J: —well, modern society in general. We have—

S: —It’s getting worse.

J: —We have so many choices in front of us that it adds up to emotional stress throughout the day. You get more and more stressed. You got so many—you’re scrolling through Amazon, and you don’t just have one pair of socks. You’ve got hundreds of pairs of socks, and you have to think about it and think about it and think about it. So—

S: —Well, and this gets to the, seriously, the confluence of AI and the Aug, social media, is you have virtual assistants who make decisions for you, and people love that because it reduces their anxiety—

E: —Yeah, exactly.

S: —it reduces their choices. And now you have not only targeted ads; you’re allowing whoever’s in charge of the Aug to live your life for you, like to lead you around and make decisions for you. And, at first, it’s like the things you don’t really care about that much or whatever, but how intrusive is that going to get? Think about it! Again, we’ll trade convenience for security, for privacy. Imagine how much we’ll trade to really reduce our cognitive load? That is really what psychologists would call that, right? Cognitive load is how much work you have to do to get through your day, to get through a task, to do something. AI system software in general, it’s all engineered—or it should be, if it’s good, if it’s working well—to minimize cognitive load, right?

Good movie-making is about minimizing cognitive load in a lot of ways. I remember, back when we were still doing films, we learned—because we got a course from our friend at Pixar, who said, "If you follow the action on a movie screen"—remember movie screens?—"You follow the action. If one scene ends over here, the next scene picks up here." [Steve presumably gestures.] Right? It doesn’t pick up over here?

J: Yeah, meaning that where your eyes are—

S: —Yeah, they know where eyes are. They’re following your eyes, and then they’re making sure your eyes are following the action from one scene to the next—

C: —It’s less work for you.

S: —because—right, because that’s less work. If you have to suddenly hunt for where the action picks up—"Oh, it’s over here!"—that’s cognitive load—

E: —Too disorienting, yeah.

S: —it takes you out, it re—

C: —That’s why 360 films are hard for people. Like it’s hard to catch on to a 360 movie because you have to—

S: —Or virtual films, remember the virtual films, which never really took off?

C: —yeah, you have to find the action, as opposed to—

S: —Yeah, you’re constantly looking for where the action is. They can be fun, but that’s high cognitive load. You’ve got to be in the mood for that. So now we’re just going to be surrounded by systems that will reduce our cognitive load for us, and that’s like crack. Who won’t do that?

J: That’s like somebody cutting your lawn for you.

S: Yeah.

J: How could you not love that? (audience laughter)

S: Right.

E: The lawn bots.

J: My wife and I were going in overlord to get the yard cleaned up for the fall. And we hired some people to come and take down some trees from the tornado and I remember standing—I have a cup of coffee. I’m looking out the window. I’m watching a few guys work on my yard, and I’m just like [loving gesture/nod?] "I love all of you guys. Thank you so much! This is such a pleasure."

E: "I’m in here, you’re out there."

B: I told you to get robots to do that.

C: (laughs)

B: I look out my window and want the robots cutting my lawn—

J: —I don’t want robots in my yard. (laughter)

E: "Get off my lawn!"

S: Still not down with the robots?

J: No.

Questions/V-mails/Corrections

S: We got emailed—or v-mailed some questions, if we want to take some virtual questions.

C: Uh-huh, uh-huh, uh-huh.

S: I have one, which I want to bring up.

Question #1: New Universal Flu Vaccine (1:33:24)

S: So did you guys all get your flu shot this year? Everybody get their flu shot?

(Rogues confirm.)

S: It’s not really flu season down here, right?

C: They got theirs six months ago, right?

E: Their quad.

S: Well, the quad, that was the standard of—actually, remember the tetravalent vaccines, the flu vaccines?

E: Yeah.

S: But now we have the universal flu vaccine, which came out in 2032. So the question—this comes from Haywood, and Haywood asks—

(Rogues cackle at inside joke.)

E: [to Jay] He got you! Totally got you.

C: (laughing) I’m sorry.

J: (laughing) [inaudible] swallowing. I just [inaudible]. Did you not think I was going to lose it? Does anybody know why that’s funny?

B: No, you can’t.

C: Yeah, we don’t have to tell them.

E: Unh-unh-unh-unh-unh-unh.

B: You can’t say! Jeez, stop it.

C & E: (laughs)

J: (laughing) What was that emailer’s last name, Steve?

C: No! He left it off the email.

S: (laughing) He didn’t say. Just a first name [inaudible].

C: (laughs hard) He’s totally losing it.

E: (guffaws)

S: So, anyway.

E: Oh, gosh!

S: He wants to know if he should get the new universal flu vaccine because—well, there's now the antivaxxer fear mongering around this one, right, because—

B: —Yeah, of course.

S: —because it’s all genetically modified, et cetera. So, yes, Haywood, you should get the universal flu vaccine because even the tetravalent vaccine—Every year, back in the day, up until two years ago, they would have to—If you were from the United States, like we are, they used to give us whatever strains of flu you guys [Australians] were getting, and then you get whatever strains we’re getting six months before because that was the lead time to make the vaccines. And there’s, of course, hundreds of strains, and they’re just guessing. So they increased the number of strains that they were covering per vaccine. Some sort of became permanently embedded, so you have to cover certain strains every year, then you have to add one or two that you think are going to come—

C: —But that left out any potential mutations.

S: Yeah. When the vaccine matches, it’s like 95% effective, but a mismatch could reduce that to 90, 60, 40% on bad years. It might only be 40% effective.

C: Yeah, there have been years like that, for sure.

E: Very bad.

C: Where you got the vaccine, you still got the flu. It sucked.

S: They’ve been researching, for about 40 years, a universal flu vaccine. The problem has always been that the parts of the flu virus that are universal are hidden from antibodies. The immune system can’t get access to them because all of the stuff that changes from strain to strain is in the way. But they did finally figure out a way to crack into that, to get access to the universal bits. And so they’ve now been producing a universal flu vaccine. And if you get that, you are resistant to every flu strain, and you only need to get it about once every five years. And now it’s 95% effective every year.

E: That’s good.

S: So, yes, get it! You should absolutely get it.

J: Of course!

C: We all did!

S: Yeah, I know. I know. I made you get it.

C: (playfully proud) We’re the SGU!

E: It still hurt a little, though.

S: It’s still a vaccine. It’s still a shot.

Skeptical Quote of the Week (1:36:31)

Science is the greatest thing known to humans. Through science we have been able to seize a modicum of control over the otherwise natural state of chaos throughout the cosmos. It is truly the most stunning achievement by a life form that emerged from the dust of the stars. In order for us to be the best stewards of our universe, we must continue the pursuit of science, and may it forever be our torch to light our way forward. — Dr. Alyssa Carson[1], first resident of Armstrong Station, The Moon

S: All right, Evan, before we close out the show, give us a quote!

E: "Science is the greatest thing known to humans. Through science we have been able to seize a modicum of control over the otherwise natural state of chaos throughout the cosmos. It is truly the most stunning achievement by a life form that emerged from the dust of the stars. In order for us to be the best stewards of our universe, we must continue the pursuit of science, and may it forever be our torch to light our way forward," spoken by Dr. Alyssa Carson. She’s a NASA astronaut and she was the first inhabitant of Armstrong Station on the Moon in 2031.

(laughter), (applause)

Signoff

S: Thank you guys all for joining me for this special episode, and [to audience] thank all of you for joining us—

C: —Thanks, Steve.

S: —and until next week, this is your Skeptics' Guide to the Universe.

(applause)

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned

References
