SGU Episode 854
{{Google speech|episode}}
{{Episode|11|20|2021}}<!--
** Once transcription is complete, please delete this markup section!
-->
{{Editing required
|transcription = y
|proof-reading = y
|formatting = y
|links = y
}}


'''S:''' And we have a special in-studio guest with us, George Hrab!
'''S:''' Hi.

'''GH:''' George. Hello. It's been so long. You've all changed so much since last I saw you. Oh my gosh, how you've grown. I'm so proud. I'm so proud.

'''S:''' Cara, I just want you to know we're okay with the whole flood and the rain. We're good. We survived two psychological hurricanes.

'''C:''' I'm glad. I'm glad. It was looking pretty scary.

'''S:''' But Cara, guess what happened?

'''C:''' What?

'''S:''' Whenever there's a storm, guess what happens in my yard? A tree falls down.

'''C:''' A tree falls down.

'''J:''' A tree fell down. Exactly. See, everybody knows.

'''E:''' Does it make a noise?

'''J:''' Yes, another tree fell down.

'''C:''' Does it make a sound? Well, I'll let all of you know that I'm also safe from wildfires, even though Lake Tahoe is burning. And the skies are not great. I'm not under any sort of immediate threat here in LA right now.

'''S:''' Yeah, I saw a video of somebody walking out from being indoors, and it was, I guess, dusk, and the sky was literally blood red.

'''B:''' Oh yeah.

'''C:''' Yeah, you see that periodically at the worst of it. It's so scary. But you guys saw all these videos of New York City, right?

'''J:''' Yes.

'''B:''' My God. Subways. People drowning in their basements.

'''S:''' Philly as well. Yeah. Philly was bad. Nuts. Horrible. They're calling it a one-in-500-year rain event.

'''S:''' Yeah, but that's BS, because this is the new norm, man.

'''B:''' Well, still, though, even with the new norm, I don't think we'll be seeing this at least for another 10 years. It probably won't be 500 years.

'''S:''' No.

'''E:''' I guarantee you that. It's no longer biblical.

'''S:''' So we are recording on a Saturday for a couple of reasons. One is we just like to do it, right? We like to play the podcast. Why not? We haven't done a live video SGU episode in a while. But the primary reason is that we were supposed to all be at Dragon Con this weekend. And we decided that we didn't want to be moving through massive crowds in the middle of the worst spike of the Delta variant of COVID that we've had so far.

'''S:''' Yeah, call us crazy.

'''S:''' And in a place in the country where it's particularly bad. So, unfortunately, the pandemic was just too bad. We could not in good conscience go to Atlanta and do it. I wish everyone there well, but we just couldn't do it. We have too many people in our immediate family who are vulnerable. I work at a hospital. It just doesn't work. So we couldn't go. But we will be streaming a portion of this show to Dragon Con. But we're going to do the first half of the show first.

'''S:''' That's good.

'''S:''' Doing the first half of the show first. I thought about doing the second half first, and then I thought it would be more natural if we did the first half.

'''E:''' It just makes more sense to do the first half first. It's not a Quentin Tarantino movie.

'''GH:''' Guys, I got to say, I saw the coolest, one of the all-time, I think, most original Dragon Con costumes someone posted. And it occurred to me that this costume would have made no sense a year ago. It was the Delta variant. So it was a Loki, a female Loki, dressed as a Delta stewardess.

'''S:''' Delta variant.

'''GH:''' It was the Delta variant. Is that not brilliant?

'''S:''' Very clever.

'''S:''' It was so clever.

'''S:''' That is so meta.

'''GH:''' And if we had seen that literally, whatever, 12, 15 months ago, you'd have been like, what? Because Loki wasn't out yet. COVID wasn't happening. And there was no Delta variant. So now it's just insanely funny.

'''S:''' I'll never forget the Dragon Con where we were doing a live show. We always wear costumes when we're doing the live show at Dragon Con. But Bob actually thought he was going to do the entire show wearing a full face mask.

'''S:''' Yeah, a rubber pirate mask.

'''B:''' And Steve signed off on it. And it was an awesome costume, well worth the muffled hour and a half of talking.

'''S:''' That's an excellent demonstration of memory distortion over time.

'''B:''' Yes. Thank you. Thank you.

'''C:''' So how is everybody processing the fact that we're three months away from, what, 2022?

'''C:''' Four months. Yeah.

'''C:''' And it feels like we're still just reckoning with 2019.

'''C:''' Yeah. I know.

'''C:''' So much time has been both dragging on and also completely lost. It really doesn't feel possible that 2019 will have been, quote, three years. I mean, I know it's not exactly, but, like, three years ago, the beginning of 2019.

'''S:''' Right. It is weird. It's disorienting, because we've fallen out of our normal life patterns enough. Our sense of time is completely, it's complete nonsense in our heads as it is. But you do anything to disrupt the apple cart and you get disoriented. It really is disorienting.

'''S:''' Yeah. But to clarify, I mean, the pandemic really started at the beginning of 2020. It's been a year and a half. The end of this year will be two full years, really, of the pandemic. And we're definitely going to still be in the middle of this. I mean, again, I remember when this started, we had no idea how long it was going to be. We were talking weeks.

'''S:''' Yeah.

'''S:''' Then weeks became months. Then we're like, oh, maybe we'll have a vaccine by the end of the year. The idea now that we're still on the upswing of one of the worst waves of this pandemic, more than a year and a half into it.

'''S:''' And honestly- Didn't have to be that way, though.

'''S:''' We still cannot tell how long this is going to drag on, because another variant could emerge, could reset the clock on having to get vaccinated, would have to get a booster, et cetera.

'''GH:''' But Steve, at least everyone's agreed to kind of move forward together and do all the steps necessary to try to tamp it down. Everybody. At least as a country, we're all together in this. Yeah, we're all in the same boat. Unified, doing this. At least there's that. Oh, wait, no. The total-

'''S:''' Yeah, what a sense of community.

'''GH:''' What's freaking me out is the fact that we are as far from 1982 as 1982 was from the start of World War II.

'''S:''' It's 40 years.

'''GH:''' Because in high school, or grade school, whatever, high school, to be thinking of the 40s, it was such a foreign past, such an old- Ancient history. Ancient, ancient history. And now that is what my high school years were.

'''C:''' Yeah. Well, because that's my mom, right? My mom was born in the 40s. But I am the age of many moms. And that's something that's hard to grapple with, I think. I was born in 1983. So one year apart from what you're referencing. Yeah.


== COVID-19 Update <small>(6:59)</small> ==


'''S:''' So, a little bit of good news. We're talking about the pandemic and everything. A study came out just a couple days ago where they looked at 6.8 million people receiving one of the two mRNA vaccines. And they found essentially no serious side effects. No serious side effects. 6.8 million doses of either of the mRNA vaccines. Now, there were non-serious side effects. The big one probably is the pericarditis. So it can cause a little bit of inflammation around the heart. But these were mild cases. Most of the people who had the symptoms were observed in the hospital for one day and then sent home. No long-term or serious consequences. So it wasn't considered a serious adverse event. But it was an adverse event. But again, still, they said there were 32 total cases, again, out of 6.8 million doses.

'''S:''' Wow.

'''S:''' You know, this is like-

'''GH:''' So now, when you say no serious side effects, what are the serious side effects?

'''S:''' None. So there were no serious side effects.

'''S:''' So the risk was- No, there's none. If we do the math, the denominator was 6.8 million, but the numerator is zero. Yeah, there were zero serious side effects.

'''B:''' Yes, but I think we would see serious side effects if we gave it to a trillion people. I think we would get one serious side effect.

'''B:''' That's true.

'''B:''' That's just my theory.

'''S:''' Yeah, so the thing is, we're at winning-the-lottery level of statistics at this point, where you could start to make statements like, yeah, you're more likely to die on the way to work in the car, or get hit by lightning, or die from a coconut falling out of a tree. You know what I mean?

'''S:''' Way more, yeah.

'''S:''' Yeah, it's so unlikely that statistically it's just not worth worrying about.

'''B:''' If you're afraid of the vaccine, then you shouldn't be taking baths or playing with your dog.

'''S:''' That makes too much sense, Bob. We don't live in a world where people are trusting authority and expertise and science.

'''S:''' We're going to get to this. Actually, Jay, that's one of the news items that we're going to be talking about.

'''S:''' That's correct. I'm prepping the audience.

'''S:''' You're pre-gaming it a little bit?

'''S:''' Teaser.

'''S:''' The thing is, the facts are inarguable. And again, I remember this time a year ago, they were developing the vaccine. It's a new technology, the mRNA technology, and I remember we were talking about this fact, and I was thinking, it's like, God, I really hope this vaccine works out. I really hope it works out. It was perfectly possible that the efficacy of the vaccine was 40% or 50%. That would not have surprised me. It would have been like, okay, it's better than nothing, a little disappointing, but that was one of the possible outcomes. No, it's 95%, like home-run efficacy for this vaccine. It's also possible that there could have been a lot of serious side effects. When you test it in 40,000 people, that's reassuring, but then you give it to 40 million people, 100 million people, and more side effects emerge. And we're far enough into this now, and again, this is part of the reason for this new data, is just looking, saying, okay, now that we have nine months under our belt with the mRNA vaccines, let's take a look at the data and really see what's going on here. It's really a home run. It's a home run. Science came through for us for this pandemic better than we could have hoped for, really. If you still reject the vaccine at this point, it's because you don't believe the data. You do not trust authority. You don't trust these numbers. There's no rational reason not to get vaccinated. If you look at the risk of dying of COVID versus the risk of the vaccine, it's a no-brainer. It has to stem from either misinformation or just a blatant lack of trust in the system. Again, we're going to be getting to that in more general terms.

'''GH:''' Steve, do you think it's going to end up being a three-shot, like with the booster thing?

'''S:''' I think so.

'''GH:''' Isn't that more standard than...

'''S:''' Depends on the vaccine.

'''GH:''' Depends on it. Okay.

'''S:''' Some are one-shot, some are two, some are three.

'''C:''' Yeah, Gardasil's a three-shot now. It used to be two, because now it covers more of the different types. Flu is annual. It may be that it becomes an annual thing. Maybe. The hope would be, because it's such... Yeah, and because we, as if I had anything to do with it, because the incredible researchers and scientists have managed to figure out how to accelerate... mRNA vaccines happen really fast. The hope is that anytime there's a new variant, Delta being the dominant one now, but now there's this, what, is it mu? This mu variant? That's a new one that's on the rise.

'''E:''' Yeah. The CDC just started to warn people about that.

'''C:''' Right. The hope is that each year, if this does become an annual jab, that each year we're able to sort of track and get out in front of it, just like we try to do with the flu.

'''GH:''' Just when someone argues that, oh, the third booster now means that it's not effective, the argument you can present is that, no, many of these vaccines require multiple... Right. It doesn't reflect at all on its efficacy.

'''S:''' This is how the immune system works.

'''C:''' Yeah, and to be clear, we're trying really hard to make sure that the language is clean, so there's a third dose and there's a booster.

'''C:''' Right.

'''C:''' The third dose is what's available right now to immunocompromised people. The idea is that because they are so immunocompromised, they needed a three-shot course in order to get to the same immunity that you and I only needed a two-shot course to get to, because their own body isn't working as well. Their own immune system is struggling. Later, perhaps by the end of September (there have been some signals, but maybe not), the booster will come, and we still don't know if the booster will be reformulated or if it'll just be the same old shot. That's the booster shot. The booster is the idea that you get a new jab when you have waning immunity. The third dose is for immunocompromised people. The booster shot is for waning immunity.

'''C:''' Cool.

'''S:''' Right, which is typical.

'''S:''' Yeah, totally.

'''S:''' Many vaccines, they don't last for life. You may need a booster in 10 years, or every year, or whatever. It just depends on the nature of the vaccine, the nature of the organism that you're vaccinating against. The immune system is complicated.


'''S:''' There are different pieces of it. We just have to test it, follow it, and it's all evidence-based.

'''C:''' Also, Steve, the nature of our fellow country people: if everybody had gotten vaccinated, we wouldn't see these intense variants taking hold and really cycling through the community as much. Part of the reason that we very likely are going to have to continue to get these shots is because other people aren't getting these shots.

'''S:''' Of course, no question.
<!-- ** the triple quotes are how you get the initials to be bolded. Remember to use double quotes with parentheses for non-speech sounds like (laughter) and (applause). It's a good practice to use brackets for comments like [inaudible] and [sarcasm]. -->


== News Items ==


=== Kilometers-Long Spaceship <small>(14:00)</small> ===
* [https://phys.org/news/2021-09-china-spaceship-kilometers.html China wants to build a spaceship that's kilometers long]<ref>[https://phys.org/news/2021-09-china-spaceship-kilometers.html Phys.org: China wants to build a spaceship that's kilometers long]</ref>
840.00 848.00 '''S:'''  All right, Bob, you're going to start us off with the news items telling us about China's plans to build a ginormous spaceship.
848.00 858.00 '''B:'''  Yeah, they recently announced a plan to build or research a kilometer or kilometers long spaceship to be built in low Earth orbit.
858.00 860.00 '''B:'''  I found that just so amazing.
860.00 864.00 '''B:'''  To me, that's something you only see in science fiction movies or novels.
864.00 868.00 '''B:'''  That is gargantuan, a kilometer or two.
868.00 870.00 '''E:'''  Is it too early to raise questions?
870.00 872.00 '''E:'''  No, go right ahead.
872.00 877.00 '''E:'''  You mentioned going into orbit does not make it a space station as opposed to a spaceship.
877.00 879.00 '''S:'''  Well, they're building it in orbit.
879.00 881.00 '''B:'''  Yeah, the plan would be to build in orbit.
881.00 884.00 '''B:'''  Doesn't a ship imply traveling elsewhere other than orbit?
884.00 885.00 '''B:'''  Yes, Evan, I thought the same thing.
885.00 888.00 '''B:'''  I thought it was a ship traveling in space.
888.00 893.00 '''B:'''  But no, the plan is we think that it would be something like a space station.
893.00 895.00 '''B:'''  It's not 100% clear.
895.00 902.00 '''B:'''  But the thing though is that this whole thing is about constructing it, researching it, and how would you construct something of that size?
902.00 905.00 '''B:'''  No matter what you do with it, that's the main focus.
905.00 912.00 '''B:'''  This, of course, is coming from the China version of NASA, which is the China National Space Agency, CNSA.
912.00 916.00 '''B:'''  And so I was looking at China recently in terms of their space agency.
916.00 918.00 '''B:'''  You've got to give these guys credit.
918.00 920.00 '''B:'''  They've had an amazing 20 years.
920.00 922.00 '''B:'''  I mean, look what they've accomplished.
922.00 924.00 '''B:'''  They have astronauts in space.
924.00 928.00 '''B:'''  They're building their own space station this year, Tiangong.
928.00 933.00 '''B:'''  They're building it right now and they've got a plan to get this thing.
933.00 937.00 '''B:'''  It's only going to be about the fifth of the size of the International Space Station.
937.00 940.00 '''B:'''  But I mean, that's big leagues right there.
940.00 943.00 '''B:'''  They're developing heavy lift rockets.
943.00 948.00 '''B:'''  That's something that not many nations have the wherewithal to do.
948.00 950.00 '''B:'''  They're sending robotic explorers everywhere.
950.00 955.00 '''B:'''  You remember Chang'e 4 was a lander and rover on the far side of the moon.
955.00 957.00 '''B:'''  We've never landed on the far side of the moon.
957.00 959.00 '''B:'''  They also had, get this one, this one surprised me.
959.00 960.00 '''B:'''  I wasn't aware of this.
960.00 963.00 '''B:'''  They had a Mars mission that was the first.
963.00 968.00 '''B:'''  It was the first mission that had an orbiter, a lander, and a rover all in one mission.
968.00 970.00 '''B:'''  That's something else that has never been done before.
970.00 975.00 '''B:'''  So these guys are definitely in the big leagues and they're doing an amazing job.
975.00 978.00 '''B:'''  So it begs the question, what's going to happen in the future?
978.00 982.00 '''B:'''  What's going on with China and their future with their space agency?
982.00 986.00 '''B:'''  One of the things that was in the news recently was this five-year plan.
986.00 995.00 '''B:'''  They're considering proposals that came from the National Natural Science Foundation of China, which is managed by the Ministry of Science and Magic.
995.00 998.00 '''B:'''  Oh wait, sorry, Ministry of Science and Technology.
998.00 1000.00 '''B:'''  Sorry about that.
1000.00 1011.00 '''B:'''  So one proposal of the ten proposals they got and funded for like 15 million yen was creating an ultra-large spacecraft spanning kilometers in low Earth orbit.
1011.00 1019.00 '''B:'''  Kilometers, this says kilometers, but most other people are saying that it's basically a kilometer is what they're investigating.
1019.00 1023.00 '''B:'''  Now the foundation's website, if you go to the website and if you can read the language, great.
1023.00 1026.00 '''B:'''  I couldn't, but this is apparently what it says.
1026.00 1036.00 '''B:'''  Major strategic aerospace equipment for the future use of space resources, exploration of the mysteries of the universe and staying in long term.
1036.00 1038.00 '''B:'''  So that's how it's described on their website.
1038.00 1043.00 '''B:'''  So two of the big things they talk about in their plan is that they need to minimize the weight.
1043.00 1050.00 '''B:'''  They want to research how do you minimize the weight for something of this scale and the fact that this would be constructed on the ground and then assembled in space.
1050.00 1058.00 '''B:'''  So you would make these big chunks like the International Space Station, build it on the ground and then bring them up to space and kind of put them all together.
1058.00 1062.00 '''B:'''  Just imagining a kilometer long structure of any kind is pretty daunting.
1062.00 1067.00 '''B:'''  So the former NASA chief Mason Peck said that I think it's entirely feasible.
1067.00 1073.00 '''B:'''  I would describe the problem here not as insurmountable impediments, but rather problems of scale.
1073.00 1079.00 '''B:'''  Well, yes, the scale, yes, that would be, the scale is really this whole story, is the scale of it.
1079.00 1083.00 '''B:'''  So now I thought it would be helpful to compare this to the ISS.
1083.00 1088.00 '''B:'''  So to put the International Space Station into orbit, it took 42 assembly flights.
1088.00 1092.00 '''B:'''  It took, guess how many EVAs it took to do everything?
1092.00 1093.00 '''S:'''  Hundreds probably.
1093.00 1095.00 '''B:'''  Two hundred and thirty-two EVAs.
1095.00 1106.00 '''B:'''  It cost $150 billion to develop and build and $4 billion a year just to operate, just to operate this, you know, and to do the maintenance and whatever, well, what maintenance they're doing.
1106.00 1110.00 '''B:'''  I'm not sure how good the maintenance is, it's really showing its age at this point.
1110.00 1113.00 '''E:'''  And it took the efforts of two major nations plus some, so.
1113.00 1114.00 '''B:'''  Right, exactly.
1114.00 1125.00 '''B:'''  This is clearly, yes, initially it was more of a United States thing, but then they brought in, you know, they brought in Russia and lots of other countries, so lots of countries have been contributing.
1125.00 1129.00 '''B:'''  So the International Space Station is 109 meters.
1129.00 1135.00 '''B:'''  China's proposal is ten times that, ten times that length, and who knows how massive it's going to be.
1135.00 1138.00 '''B:'''  It could be 20 times the size when you get down to it.
1138.00 1139.00 '''B:'''  So, Mike.
1139.00 1140.00 '''S:'''  What's the point, though?
1140.00 1142.00 '''S:'''  Why does it need to be that big?
1142.00 1143.00 '''S:'''  Right, that's.
1143.00 1144.00 '''C:'''  Do they get into that?
1144.00 1145.00 '''C:'''  Yeah.
1145.00 1146.00 '''S:'''  Or is this really, is this like.
1146.00 1147.00 '''S:'''  Jane, you sound like Mingi.
1147.00 1153.00 '''S:'''  But it's, look, as cool as it sounds, I mean, it sounds kind of like Jeff Bezos, you know, like.
1153.00 1154.00 '''S:'''  Who's got a bigger ship.
1154.00 1155.00 '''S:'''  You know, like.
1155.00 1156.00 '''S:'''  The one reason.
1156.00 1158.00 '''S:'''  Why, why make it so long?
1158.00 1163.00 '''S:'''  The one reason to make it that big is if you're going to use rotation or something for artificial gravity.
1163.00 1167.00 '''S:'''  So I couldn't find like an artist impression or anything with the design of the ship.
1167.00 1172.00 '''S:'''  Maybe they're not at that point yet, but are they planning it to be that big because they want artificial gravity?
1172.00 1173.00 '''S:'''  Was there any mention of that?
1173.00 1174.00 '''B:'''  No mention at all.
1174.00 1175.00 '''B:'''  And I don't think that's where they're going.
1175.00 1177.00 '''B:'''  I don't think we're quite, quite ready for that.
1177.00 1179.00 '''B:'''  I mean, I don't think we're quite ready for this either.
1179.00 1181.00 '''B:'''  But it is a little mysterious.
1181.00 1183.00 '''B:'''  We're not exactly sure what they want to do with it.
1183.00 1185.00 '''B:'''  The descriptions are a little vague.
1185.00 1188.00 '''B:'''  But Michael Lembeck, professor of aerospace engineering, he said this.
1188.00 1191.00 '''B:'''  It's kind of like talking about building the Starship Enterprise.
1191.00 1192.00 '''B:'''  It's fantastical.
1192.00 1198.00 '''B:'''  Not feasible and fun to think about, but not very realistic for our level of technology, given the cost.
1198.00 1203.00 '''B:'''  So based on, based on a lot of this and what I've been reading, this, this doesn't seem to be practical at all.
1203.00 1207.00 '''B:'''  And I think what they're doing there and they're only devoting a little bit.
1207.00 1208.00 '''B:'''  It's not a lot of money.
1208.00 1211.00 '''B:'''  Clearly, they're not thinking about creating this soon.
1211.00 1213.00 '''B:'''  I think they're going to study this.
1213.00 1214.00 '''B:'''  They're just going to study it.
1214.00 1215.00 '''B:'''  What is it going to take?
1215.00 1220.00 '''B:'''  What is it going to take to get this into to build the structure that big in space?
1220.00 1228.00 '''B:'''  And I think the conclusion that they're going to come to is that this is going to be way too expensive to do, at least in this way.
1228.00 1237.00 '''B:'''  I think when the time comes in the farther future, when we can build something of this scale, that they're going to probably do things like, how about 3D printing in space?
1237.00 1242.00 '''B:'''  What you would do is you'd have these compact masses on the ground that you would launch into space.
1242.00 1243.00 '''B:'''  Still, it would be very expensive.
1243.00 1255.00 '''B:'''  But then you could take those compact masses and then create these huge, elaborate kilometer long structures using some type of 3D printing, which would be a lot easier than building it and then launching the big chunks from the Earth.
1255.00 1257.00 '''S:'''  It seems like it would be easier.
1257.00 1263.00 '''S:'''  But I mean, when you look at any one of those modules that they put on the space station, they're complicated.
1263.00 1267.00 '''S:'''  The wiring and everything, like a lot of that stuff, I don't know if they could pull it off in space.
1267.00 1271.00 '''B:'''  Well, the problem, Jay, again, is the vagueness because we're not sure.
1271.00 1274.00 '''B:'''  A lot of this depends on what exactly are they building.
1274.00 1280.00 '''B:'''  Are they just building long struts and things or are they going to have lots of modules with people inside them?
1280.00 1281.00 '''B:'''  It's a completely different story.
1281.00 1290.00 '''B:'''  If it's going to be people living in all sections of it, then this is going to be far heavier, far more expensive, far more complex.
1290.00 1300.00 '''B:'''  If it's mainly, if there's a lot of these structures that are just there for support and you don't need to have people living in them, then it's going to be a lot lighter, it's going to be a lot cheaper, it's going to be a lot easier.
1300.00 1308.00 '''B:'''  So some of these questions can't be answered until we actually know exactly what they're going to do with it, exactly what the design is with more detail, and that will come.
1308.00 1315.00 '''B:'''  But I do like the idea that they're studying what kind of technologies are needed to create something of this size.
1315.00 1319.00 '''B:'''  Because who doesn't want a kilometer-long construct in its lowest orbit?
1319.00 1321.00 '''E:'''  But isn't that thing going to be a target for space debris?
1321.00 1324.00 '''E:'''  Have they calculated that into their planning?
1324.00 1326.00 '''B:'''  That's another problem that they'd have to factor in.
1326.00 1330.00 '''B:'''  But when you have something of this size, you have to factor in things that you've never had to factor in before.
1330.00 1349.00 '''B:'''  Because if this thing is a kilometer long and you have ships docking with it, or if you're going to be making any maneuvers in orbit, then you're going to have to, there's going to be like, if you bump this thing with something of the mass of a, say, a space shuttle, this is going to create these, it's going to create motion and little waves that go back and forth in the structure itself.
1349.00 1354.00 '''B:'''  And so they're going to need dampeners to absorb the energy from the impacts.
1354.00 1366.00 '''B:'''  Yeah, and if it's in very low Earth orbit, there's going to be drag, so the orbit's going to decay, so you're going to have to have lots of fuel with rockets so that you can get it into a higher orbit.
1366.00 1368.00 '''B:'''  The ISS does that all the time.
1368.00 1369.00 '''B:'''  It sounds fantastic.
1369.00 1374.00 '''B:'''  There are lots of complications, and we just don't have enough information to really see exactly what the details are.
1374.00 1376.00 '''GH:'''  How visible would a ship that size be from the ground?
1376.00 1378.00 '''B:'''  If it reflected light, oh, you'd see it pretty well.
1378.00 1379.00 '''E:'''  The ISS is visible.
1379.00 1380.00 '''GH:'''  Yeah, you'd see the ISS.
1380.00 1381.00 '''B:'''  This would be ten times bigger.
1381.00 1383.00 '''GH:'''  I mean, like naked eye or just the telescope?
1383.00 1384.00 '''B:'''  Absolutely naked eye.
1384.00 1385.00 '''B:'''  The ISS is naked eye right now.
1385.00 1386.00 '''B:'''  Yeah, it would be huge.
1386.00 1388.00 '''S:'''  It would be ten times more naked eye.
1388.00 1390.00 '''C:'''  It would look like, I mean, if it's going to be long.
1390.00 1392.00 '''C:'''  What's the comparison in size to the ISS?
1392.00 1394.00 '''S:'''  It's going to be ten times longer.
1394.00 1395.00 '''S:'''  It's ten times?
1395.00 1402.00 '''B:'''  Yeah, a kilometer would be, say, ten times, about a little bit less than ten times the width, just the width, not the mass necessarily.
1402.00 1404.00 '''B:'''  What about the girth?
1404.00 1406.00 '''C:'''  And how big was Mir?
1406.00 1408.00 '''C:'''  Does anybody know off the top of their heads?
1408.00 1409.00 '''C:'''  Was Mir smaller than the ISS?
1409.00 1410.00 '''C:'''  Oh, yes.
1410.00 1415.00 '''B:'''  The ISS is the biggest artificial construct ever created in orbit like that.
1415.00 1417.00 '''B:'''  Mir was far, far smaller.
1417.00 1418.00 '''B:'''  Actually, I do know.
1418.00 1419.00 '''B:'''  I do know.
1419.00 1430.00 '''B:'''  This would, I think we're talking, Mir was a fifth of the mass, I believe, or the length of the ISS, approximately.
1430.00 1431.00 '''B:'''  Oh, wow.
1431.00 1433.00 '''B:'''  And then this is a tenth?
1433.00 1435.00 '''B:'''  This is ten times the ISS.
1435.00 1437.00 '''B:'''  Yeah, ten times the width.
1437.00 1440.00 '''B:'''  A kilometer, anyways, about ten times.
1440.00 1441.00 '''S:'''  But what's the bottom line?
1441.00 1450.00 '''S:'''  Is it just like we really don't have the technology to build something that big, or is it just that it would be a massive project that would cost trillions of dollars?
1450.00 1451.00 '''B:'''  Right.
1451.00 1453.00 '''B:'''  Yeah, it's technically feasible.
1453.00 1456.00 '''B:'''  It's not like we have to develop whole new sciences to make this happen.
1456.00 1463.00 '''B:'''  We could do it, but we'd have to be willing to spend hundreds of billions, maybe, a trillion dollars, maybe not that expensive.
1463.00 1465.00 '''B:'''  But, I mean, the time and money.
1465.00 1466.00 '''B:'''  Yeah, they always run out of money.
1466.00 1470.00 '''B:'''  Steve, I compared it to the ISS; I said there were 42 assembly flights.
1470.00 1473.00 '''B:'''  This would require, what, 80, 90, 100?
1473.00 1474.00 '''B:'''  I don't know.
1474.00 1475.00 '''B:'''  How many EVAs?
1475.00 1477.00 '''B:'''  Too expensive, and we're not ready.
1477.00 1478.00 '''B:'''  We're not ready.
1478.00 1479.00 '''S:'''  We're not there yet.
1479.00 1480.00 '''S:'''  Yeah, maybe next century, that kind of project.
1480.00 1489.00 '''S:'''  From what I'm hearing, though, it just sounds like a thought experiment more than a real proposal because, again, they're not even saying why it needs to be that big.
1489.00 1491.00 '''S:'''  It just seems kind of like a fantasy.
1491.00 1492.00 '''B:'''  It's more, a little bit.
1492.00 1494.00 '''B:'''  It's more than a thought experiment.
1494.00 1500.00 '''B:'''  They're going to actually try to see what technology would be needed, how to do it.
1500.00 1503.00 '''B:'''  With modern technology, how would we construct something that big?
1503.00 1504.00 '''B:'''  That's kind of what they're looking at.
1504.00 1506.00 '''S:'''  Okay, let's move on.
   
   
=== Social Media and Kids <small>(25:05)</small> ===
* [https://www.discovermagazine.com/technology/please-please-like-me-social-media-poses-unique-danger-to-kids-experts-say Please, Please Like Me! Social Media Poses Unique Danger to Kids, Experts Say]<ref>[https://www.discovermagazine.com/technology/please-please-like-me-social-media-poses-unique-danger-to-kids-experts-say Discover: Please, Please Like Me! Social Media Poses Unique Danger to Kids, Experts Say]</ref>
1506.00 1511.00 '''S:'''  Jay, tell us about the effect of social media on children.
1511.00 1512.00 '''S:'''  No, sir.
1512.00 1517.00 '''J:'''  As a parent, I've been following news items that come up about this.
1517.00 1523.00 '''J:'''  A lot of people, friends and family, we talk about it because social media has a bad rap.
1523.00 1530.00 '''J:'''  It's measurably done some bad things, depending on, I guess, your perspective.
1530.00 1533.00 '''J:'''  But specifically, what's the deal with social media and kids?
1533.00 1542.00 '''J:'''  In March of 2021, Facebook announced that they're working on launching a version of Instagram for children 12 years old and younger.
1542.00 1546.00 '''J:'''  I'm just curious, guys, what's your knee-jerk on hearing that?
1546.00 1547.00 '''S:'''  No.
1547.00 1548.00 '''GH:'''  Bad.
1548.00 1549.00 '''GH:'''  Bad.
1549.00 1550.00 '''B:'''  What's Instagram?
1550.00 1551.00 '''B:'''  Why?
1551.00 1555.00 '''C:'''  I think the thing is, 12-year-olds are already on Instagram.
1555.00 1565.00 '''C:'''  So if there's an area where there are greater parental controls and there's more restriction, the kids are going to use it anyway.
1565.00 1567.00 '''C:'''  Why not make a space for them?
1567.00 1568.00 '''C:'''  I don't know.
1568.00 1569.00 '''C:'''  I think it's not a bad idea.
1569.00 1573.00 '''J:'''  That's the logical argument from the outside looking in.
1573.00 1578.00 '''J:'''  You could make it make sense by saying, you have to be 13 years old to use Instagram.
1578.00 1585.00 '''J:'''  The younger kids, if their parents are not paying attention or if they're letting them do it, they're faking it, they're on there anyway, and they're exposed.
1585.00 1587.00 '''C:'''  There's no way to prove that you're 13 years old.
1587.00 1588.00 '''C:'''  You just self-attest.
1588.00 1589.00 '''C:'''  Yeah, that's right.
1589.00 1591.00 '''S:'''  You can do what China does.
1591.00 1597.00 '''S:'''  China requires you to scan your face to essentially log into video games, social media.
1597.00 1603.00 '''S:'''  They're using that technology to monitor kids because of their new rule, three hours of video games per week.
1603.00 1605.00 '''S:'''  I'm going to buy a deep wow.
1605.00 1613.00 '''J:'''  Facebook is aware that kids do this, that they fake their ages, and they decided to make a targeted platform for the younger age group.
1613.00 1616.00 '''J:'''  Facebook and Instagram are saying all the right things too.
1616.00 1622.00 '''J:'''  They want to encourage kids to use an age-appropriate platform that their parents can manage.
1622.00 1624.00 '''J:'''  It sounds very benevolent.
1624.00 1626.00 '''S:'''  They could monitor for pedophiles.
1626.00 1631.00 '''J:'''  Yeah, I mean, look, they have their heads in the right place, but there are a lot of details here we have to unpack.
1631.00 1634.00 '''C:'''  Yeah, the question is, this is marketing, no?
1634.00 1636.00 '''C:'''  It all comes down to marketing.
1636.00 1638.00 '''C:'''  That's how they make their money on these platforms.
1638.00 1639.00 '''C:'''  Exactly.
1639.00 1642.00 '''C:'''  So we're going to be selling crap to our 11-year-old kids?
1642.00 1645.00 '''J:'''  There's a lot of people that are fighting against this idea.
1645.00 1655.00 '''J:'''  In particular, there's an activist group named Fairplay whose goal is to create a safe space for kids to live free from marketing and the influence of big industry, like you were saying, Cara.
1655.00 1657.00 '''J:'''  They sent Mark Zuckerberg a letter.
1657.00 1661.00 '''J:'''  They wrote a letter and said, hey, Mark, we want you to kill this project.
1661.00 1662.00 '''J:'''  It's not good.
1662.00 1667.00 '''J:'''  In their letter, they cite several studies, and I've read a lot of these studies, that back up their sentiment.
1667.00 1678.00 '''J:'''  So they're saying that media has a negative effect on children, and it's not just social media; screen use in general has some negative effects.
1678.00 1683.00 '''J:'''  So here are the common risk factors that were found in most of the studies.
1683.00 1697.00 '''J:'''  Obesity, lower psychological well-being, decreased happiness, decreased quality of sleep, increased risk of depression, and increases in suicide-related outcomes, such as suicidal ideation, plans, and attempts.
1697.00 1703.00 '''J:'''  Another study found that children are, quote, uniquely vulnerable to advertising.
1703.00 1704.00 '''J:'''  And listen to this.
1704.00 1705.00 '''J:'''  Let me give you the reasons why.
1705.00 1708.00 '''J:'''  So young kids can't detect that they're being manipulated.
1708.00 1716.00 '''J:'''  They don't know when they hear and read and see things that there's a layer of manipulation involved in the whole thing.
1716.00 1722.00 '''J:'''  Around 12 years old, kids start to become aware that advertising is really about companies making money.
1722.00 1728.00 '''J:'''  But even when kids are aware that advertising is essentially manipulation, they're horrible at resisting the marketing.
1728.00 1729.00 '''S:'''  Yeah.
1729.00 1732.00 '''C:'''  And Jay, young kids can't tell the difference between reality and fantasy.
1732.00 1742.00 '''C:'''  A very young child who sees an ad for a pair of shoes and the kid in the advertisement starts to fly is going to have a hard time understanding that these shoes, for example, can't make you fly.
1742.00 1743.00 '''C:'''  Right.
1743.00 1755.00 '''C:'''  And so that kind of marketing can have a really dramatic effect on them because they have a hard time understanding the difference between, you know, this is marketing versus this product would really do this and, oh, my gosh, mom, I need this.
1755.00 1756.00 '''C:'''  Yeah.
1756.00 1757.00 '''C:'''  It's going to change my life.
1757.00 1758.00 '''C:'''  Of course.
1758.00 1759.00 '''S:'''  Exactly.
1759.00 1760.00 '''S:'''  You can't fly with those shoes?
1760.00 1767.00 '''J:'''  No, I think this idea of screen use and social media and marketing, it's more the marketing aspect of it.
1767.00 1770.00 '''J:'''  And it's also the peer pressure that happens.
1770.00 1776.00 '''J:'''  So beyond the marketing angle, social media exposes children to online bullying and to sexual exploitation.
1776.00 1782.00 '''J:'''  And those two things, bullying and sexual exploitation, aren't even the worst thing that can happen to kids online.
1782.00 1783.00 '''J:'''  Here it is.
1783.00 1788.00 '''J:'''  During their preteen and teen years, kids are developing their identities.
1788.00 1789.00 '''S:'''  Right.
1789.00 1795.00 '''J:'''  And so from young children up into your teen years, you're figuring out who you are and what kind of...
1795.00 1796.00 '''C:'''  Well, really into your 20s.
1796.00 1797.00 '''C:'''  Right.
1797.00 1798.00 '''C:'''  You're right.
1798.00 1799.00 '''C:'''  It is.
1799.00 1800.00 '''J:'''  Yeah, it continues.
1800.00 1804.00 '''J:'''  But you're much more vulnerable and susceptible when you're younger.
1804.00 1808.00 '''J:'''  So this is the way that kids perceive themselves.
1808.00 1812.00 '''J:'''  This is how they find and fit into their social structures.
1812.00 1821.00 '''J:'''  So while their identities are being molded, kids are, like I said, super vulnerable, and they're constantly interacting with their peers online.
1821.00 1823.00 '''J:'''  And they upload images of themselves.
1823.00 1825.00 '''J:'''  They're constantly looking for acceptance and praise.
1825.00 1831.00 '''J:'''  And kids get these things from the way the platforms tell them that they're acceptable.
1831.00 1837.00 '''J:'''  It's not like they're talking and they're playing sports and their friends are like, oh, come on, you're not working hard enough, or, great job.
1837.00 1840.00 '''J:'''  They're looking for likes, upvotes.
1840.00 1842.00 '''J:'''  And it's platform things.
1842.00 1845.00 '''J:'''  It's Facebook and Reddit and Instagram.
1845.00 1851.00 '''J:'''  It's the way that the adults who created these platforms decided what's going to be the positive feedback.
1851.00 1854.00 '''J:'''  What's going to be the thing that gives people a dopamine hit?
1854.00 1857.00 '''GH:'''  And those are quantifiable, too, which is the thing.
1857.00 1859.00 '''GH:'''  A praise from a friend is one type of thing.
1859.00 1865.00 '''GH:'''  But a heart or a check and you see how many you have versus how many your friends have.
1865.00 1874.00 '''GH:'''  It is a binary piece of data where you're like, oh, I am less important than my friend because she has 400 likes and I have 335.
1874.00 1876.00 '''J:'''  Exactly. The friends and followers thing.
1876.00 1877.00 '''J:'''  It's not good.
1877.00 1879.00 '''J:'''  I mean, think about being an unpopular kid.
1879.00 1881.00 '''J:'''  Just think about this right now.
1881.00 1890.00 '''J:'''  Imagine being an unpopular, dorky kid using social media and desperately trying to get any traction you can.
1890.00 1893.00 '''J:'''  And I'm setting the stage for the next thing I'm about to tell you.
1893.00 1894.00 '''J:'''  But this is what happens.
1894.00 1905.00 '''J:'''  This is how kids are getting the way that they feel valued from their peers, from the way social media is conjuring up the acceptance.
1905.00 1909.00 '''J:'''  What is acceptance, and how are they getting praise from their peers?
1909.00 1915.00 '''C:'''  But that also assumes, Jay, that we're talking about things like chat rooms and groups and places.
1915.00 1921.00 '''C:'''  Some, as I think you're probably alluding to, some of these platforms aren't structured that way.
1921.00 1931.00 '''C:'''  So, of course, if you're on Reddit or if you're in a Facebook group, you're able to find your friends, you're able to come together as a community, and it's mostly about communication.
1931.00 1935.00 '''C:'''  Instagram is mostly about display.
1935.00 1948.00 '''C:'''  And unfortunately, there's good evidence to show that the types of posts that get the most likes, yes, Instagram is also a place for activism and it's a place to be heard.
1948.00 1955.00 '''C:'''  But it's very well accepted that a picture of your face is always going to perform significantly better than words.
1955.00 1970.00 '''C:'''  And so, unfortunately, what happens, and I don't want to fully gender this because I think it applies to young boys as well, but heavily to both cis and trans girls, what ends up happening is that there's a representation of themselves that's not true to life.
1970.00 1976.00 '''C:'''  There are so many filters available on platforms like this that allow you to completely transform the way you look.
1976.00 1988.00 '''C:'''  So, when we think about all of the psychological damage that came from magazine covers when we were kids, little girls seeing pictures of supermodels on magazines and saying, I'm never going to look like her.
1988.00 1993.00 '''C:'''  Now, she can actually transform herself to look like those things.
1993.00 1997.00 '''C:'''  And think about the deep psychological conflict that comes from that.
1997.00 2005.00 '''J:'''  So, these platforms are designed to keep people on the platform and to continue to engage, because that engagement is generating money.
2005.00 2008.00 '''J:'''  Through ad sales for these platforms.
2008.00 2011.00 '''J:'''  So, the more people that use the platform, the more money they make.
2011.00 2015.00 '''J:'''  And the kind of behavior that gets the most responses, Cara, I thought of you when I read this.
2015.00 2025.00 '''J:'''  The kind of behavior that gets the most responses on social media leans far on the side of negativity and being mean, especially in kids.
2025.00 2030.00 '''J:'''  So, people react to these kinds of posts more than they do to a happy post.
2030.00 2032.00 '''C:'''  Of course, just like on Yelp, right?
2032.00 2036.00 '''C:'''  People do leave reviews saying, I had a great experience.
2036.00 2043.00 '''C:'''  But much more often they say, that was the worst meal of my life or I hated my manicure and this is why.
2043.00 2044.00 '''C:'''  Exactly.
2044.00 2050.00 '''J:'''  So, in the adult world, we're talking about bad restaurants and bad politics and stuff.
2050.00 2054.00 '''J:'''  In a kid's world, they're exposing themselves on this platform.
2054.00 2056.00 '''J:'''  They're putting pictures of themselves up there.
2056.00 2062.00 '''J:'''  And every single post is important to them in a really significant way, right?
2062.00 2070.00 '''J:'''  So, they unconsciously learn that they get more attention when they do bizarre things and mean things or whatever.
2070.00 2072.00 '''J:'''  It's just a fact of life.
2072.00 2074.00 '''J:'''  This is what's happening online.
2074.00 2075.00 '''J:'''  And they lean into it.
2075.00 2083.00 '''C:'''  And Jay, I saw firsthand when I was working in the foster care system, and of course, these are vulnerable youth, and I was working with adolescent girls.
2083.00 2091.00 '''C:'''  So, between the ages of, let's say, 11, maybe 12 with our youngest up to 18, that very often they would sneak social media.
2091.00 2098.00 '''C:'''  And they would have multiple Instagram accounts with like different fake names or different, you know, this one is for my friends, this one is for the boys, this one.
2098.00 2106.00 '''C:'''  And very often they would be chatting with many, many members, either of the same or opposite sex, whatever they were attracted to, they would be chatting with many members.
2106.00 2113.00 '''C:'''  But I especially would see this ideation that, oh, well, this is my boyfriend, this is my boyfriend.
2113.00 2118.00 '''C:'''  I have all these boyfriends online, many of whom you don't know if this is a real human being.
2118.00 2121.00 '''C:'''  You don't know if this is another young child who's playing along.
2121.00 2123.00 '''C:'''  You don't know if this is an adult.
2123.00 2126.00 '''C:'''  And the rhetoric between them was very scary.
2126.00 2130.00 '''C:'''  I mean, this was borderline like concern around sex trafficking.
2130.00 2137.00 '''C:'''  And I saw this quite often within my foster youth who were probably more vulnerable than children obviously being raised in a family setting.
2137.00 2139.00 '''C:'''  But you don't know.
2139.00 2144.00 '''C:'''  You don't know the kinds of conversations and how much deception is actually taking place within those.
2144.00 2151.00 '''C:'''  And how much of this is healthy adolescent play versus manipulation by adults?
2151.00 2153.00 '''J:'''  Sure. So here's a question.
2153.00 2156.00 '''J:'''  This is something for you to think about as I wrap this up.
2156.00 2159.00 '''J:'''  How responsible should social media platforms be for this?
2159.00 2164.00 '''J:'''  Right now, I don't think that they wake up in the morning and think, you know, we're going to hurt kids today.
2164.00 2169.00 '''J:'''  I can't believe that. I think what they're doing is they're like, we're going to wake up and we're going to make money today.
2169.00 2171.00 '''J:'''  They lean into making money.
2171.00 2178.00 '''J:'''  I do think that social media platforms should be made responsible, because it is happening on their platform.
2178.00 2181.00 '''J:'''  You know, just because they're making money doesn't mean it's OK.
2181.00 2183.00 '''J:'''  It doesn't mean that what they're doing is OK.
2183.00 2188.00 '''C:'''  No. And, you know, we as a society, we used to set standards as a society.
2188.00 2197.00 '''C:'''  We voted so that at the governmental level, back when we didn't have social media and media meant only television and radio, there was a requirement
2197.00 2208.00 '''C:'''  that our advertising dollars on things like, you know, nightly dramas would go to pay for our news and for our children's programming.
2208.00 2213.00 '''C:'''  And those things were not linked. Children's programming and news were not money makers.
2213.00 2217.00 '''C:'''  They were money users. And it was required by law.
2217.00 2221.00 '''C:'''  That was a regulation that we completely disbanded during the Reagan era.
2221.00 2233.00 '''C:'''  There's no reason that we couldn't as a society band together and push for new regulations that say kids and basic news should not be directly tied to advertising.
2233.00 2239.00 '''C:'''  It's a massive conflict of interest. There's no reason we can't say that out loud.
2239.00 2243.00 '''GH:'''  Europe has all kinds of rules about what you can advertise to kids and what you can't.
2243.00 2244.00 '''GH:'''  Absolutely.
2244.00 2251.00 '''GH:'''  In certain times of the day or whatever, radio and television and internet, you can't have targeted advertising for children.
2251.00 2256.00 '''GH:'''  And that's probably, what, 30 or 40 percent of American advertising that's directed towards children.
2256.00 2262.00 '''GH:'''  So you can totally regulate this. But to get that to happen is a very challenging thing.
2262.00 2269.00 '''J:'''  So I'm out of time. I wanted to read more. I have information that was from the Fairplay website.
2269.00 2274.00 '''J:'''  Look them up. Their website isn't just Fairplay. It's Fairplay for kids or something like that.
2274.00 2277.00 '''J:'''  But you'll find them if you just type Fairplay into Google.
2277.00 2284.00 '''J:'''  They have a page where they go into the details very specifically. I'll give you one example.
2284.00 2293.00 '''J:'''  Being watched and watching. The always-on reality of social media means that children become hyper-aware of themselves and others at a time when their identity is really developing.
2293.00 2296.00 '''J:'''  And this hyper-awareness is actually very bad for them.
2296.00 2300.00 '''J:'''  Because they're constantly thinking about themselves and how they relate to other people.
2300.00 2304.00 '''J:'''  They're spending way too much time in that mindset, and it's bad.
2304.00 2313.00 '''J:'''  And they give you a really nice list of information that you can think about, and maybe start paying more attention to your kids' use of social media.
2313.00 2318.00 '''J:'''  Talk to them about it. Make them aware of advertising. Just tell them what advertising is.
2318.00 2319.00 '''S:'''  All right.


=== Trust in Science <small>(38:39)</small> ===
* [https://theness.com/neurologicablog/index.php/trust-in-science-may-lead-to-pseudoscience/ Trust in Science May Lead to Pseudoscience]<ref>[https://theness.com/neurologicablog/index.php/trust-in-science-may-lead-to-pseudoscience/ Neurologica: Trust in Science May Lead to Pseudoscience]</ref>
2319.00 2324.00 '''S:'''  So, Evan, the last item before we click over to DragonCon.
2324.00 2328.00 '''S:'''  We obviously are engaged with science communication.
2328.00 2332.00 '''S:'''  Our goal is to make people appreciate science, trust science.
2332.00 2335.00 '''S:'''  But this is tricky territory.
2335.00 2338.00 '''S:'''  You're going to tell us how easily this can backfire.
2338.00 2340.00 '''E:'''  It can backfire.
2340.00 2346.00 '''E:'''  And those are definitely the results of a new study published in the Journal of Experimental Social Psychology,
2346.00 2355.00 '''E:'''  in which they found that people who trust science are more likely to be duped into believing and disseminating pseudoscience.
2355.00 2356.00 '''E:'''  Yeah.
2356.00 2358.00 '''E:'''  Which is in a way not intuitive.
2358.00 2360.00 '''E:'''  But those are the results.
2360.00 2362.00 '''E:'''  And this isn't just this study.
2362.00 2365.00 '''E:'''  This is a continuation in which we are seeing similar results.
2365.00 2367.00 '''E:'''  This is just the latest greatest.
2367.00 2370.00 '''E:'''  I don't think I have to define what pseudoscience is for our audience.
2370.00 2373.00 '''E:'''  We should hopefully know what that is.
2373.00 2375.00 '''E:'''  It's something pretending to be science.
2375.00 2376.00 '''E:'''  It is not.
2376.00 2378.00 '''E:'''  It has the trappings of science.
2378.00 2380.00 '''E:'''  Yet it does not qualify.
2380.00 2385.00 '''E:'''  But that line exactly where you draw it can be complicated.
2385.00 2387.00 '''E:'''  It can be very, very tricky.
2387.00 2389.00 '''E:'''  Not always clearly demarcated.
2389.00 2396.00 '''E:'''  And it's in that blurry area in which the pseudoscientists thrive, frankly, and wholly exist.
2396.00 2402.00 '''E:'''  This study was a series of four experiments involving a total of about 2,000 US adults.
2402.00 2418.00 '''E:'''  Researchers randomly assigned study participants to read a news article, actually two news articles, and complete an online questionnaire asking, among other things, if they believed the article was true and whether it should be shared with others.
2418.00 2421.00 '''E:'''  So these adults were asked to read about two different topics.
2421.00 2427.00 '''E:'''  The first topic was a fictional virus created as a bioweapon dubbed the Valza virus.
2427.00 2429.00 '''E:'''  V-A-L-Z-A.
2429.00 2430.00 '''E:'''  The Valza virus.
2430.00 2437.00 '''E:'''  Which was said to be made in a lab and that the U.S. government concealed its role in creating it as a bioweapon.
2437.00 2438.00 '''E:'''  That's the first topic.
2438.00 2447.00 '''E:'''  The other story was actually a real study supporting the idea that mice developed tumors after eating genetically modified organisms.
2447.00 2448.00 '''E:'''  Not a tumor.
2448.00 2454.00 '''E:'''  The study was retracted in 2013, but the participants were not told about the retraction.
2454.00 2460.00 '''E:'''  So one wholesale fiction, the other was actual, but retracted because it turned out to not hold water.
2460.00 2462.00 '''C:'''  Because those mice get tumors anyway.
2462.00 2464.00 '''C:'''  They were rats, but yeah.
2464.00 2465.00 '''C:'''  That was why they were retracted.
2465.00 2466.00 '''C:'''  Yeah, the rats, yeah.
2466.00 2480.00 '''E:'''  So researchers assigned some of the people to read versions of the news stories that featured activists, non-scientific people spouting the information, whereas others read versions of the news stories featuring actual scientists.
2480.00 2484.00 '''E:'''  They would gauge the participants' level of trust in science.
2484.00 2488.00 '''E:'''  Researchers asked them to indicate whether they agreed with various statements.
2488.00 2495.00 '''E:'''  For example, one of the statements was, Scientists usually act in a truthful manner and rarely forge results.
2495.00 2503.00 '''E:'''  And then another statement they threw out there for people to gauge, The Bible provides a stronger basis for understanding the world than science does.
2503.00 2508.00 '''E:'''  So these were metrics to try to gauge how people basically feel about these things.
2508.00 2520.00 '''E:'''  And there was also an experiment among this set of studies in which the participants responded to a writing prompt, which was meant to put them into a particular mindset before reading their articles.
2520.00 2527.00 '''E:'''  So what that means is that one prompt was to put people in the trust the science mindset.
2527.00 2533.00 '''E:'''  And they gave them some examples of how science had saved lives and otherwise benefited humanity.
2533.00 2549.00 '''E:'''  And another prompt aimed at a critical evaluation mindset, which was directing participants to give examples of people needing to think for themselves, not blindly trust what the media and other sources were telling them.
2549.00 2555.00 '''C:'''  So they were priming them to either believe or to be skeptical.
2555.00 2569.00 '''E:'''  And the results they came up with are that those who expressed or espoused higher levels of trust in science, just trust in science, turned out to be the most likely to believe the reports if they contained scientific references.
2569.00 2578.00 '''E:'''  In other words, if pseudoscience was there and they simply trust science itself, it doesn't matter, they fell for it.
2578.00 2589.00 '''E:'''  But for those who demonstrated a stronger understanding of the scientific method and scientific methods in general, they were less likely to believe the false stories and what they read,
2589.00 2601.00 '''S:'''  which again is consistent with what we've seen in prior studies. The people with the lowest level of trust in science did not have any effect from whether or not the news item was connected to science or a scientist.
2601.00 2612.00 '''S:'''  So the trust in science was a prerequisite, at least a minimal amount of it, to being manipulated by saying, I'm a scientician and I say this is correct.
2612.00 2626.00 '''S:'''  So there was a dose-response curve with trust in science; it actually made people vulnerable to being manipulated by tying the pseudoscience to either a scientist or a reference to an article.
2626.00 2637.00 '''S:'''  Basically supporting the pseudoscience with some reference to science worked well for people who had a trust in science, unless they had a high degree of methodological literacy.
2637.00 2647.00 '''S:'''  So basically they were not only scientifically literate, but they had the critical thinking and methodological background to be able to deconstruct the fake science for themselves.
2647.00 2648.00 '''S:'''  You need both.
2648.00 2649.00 '''S:'''  Right.
2649.00 2650.00 '''C:'''  Well, and this makes perfect sense.
2650.00 2654.00 '''C:'''  It's the reason that Doc So-and-so's snake oil is easy to sell.
2654.00 2658.00 '''C:'''  If he doesn't have the lab coat and the name doctor, people don't want it from him.
2658.00 2666.00 '''C:'''  It's because people have that trust in authority, and pseudoscientists, charlatans, know this intuitively.
2666.00 2668.00 '''C:'''  And they use this to their advantage.
2668.00 2669.00 '''C:'''  Absolutely.
2669.00 2672.00 '''C:'''  They don't go out spouting things that don't sound scientific.
2672.00 2677.00 '''C:'''  They spout things that do sound scientific because they know people will trust them more if they do it that way.
2677.00 2679.00 '''S:'''  If they have a baseline trust in science.
2679.00 2680.00 '''S:'''  Right.
2680.00 2691.00 '''S:'''  So the irony is that if we science communicators are only fostering a generic trust in science and scientists, that is just setting people up for manipulation.
2691.00 2693.00 '''S:'''  Actually it doesn't really work as a strategy.
2693.00 2708.00 '''S:'''  You have to combine it with the critical thinking skills and the ability to at least judge sources, judge the validity of evidence for yourself to some extent, which gets very tricky.
2708.00 2709.00 '''GH:'''  Oh, yeah.
2709.00 2710.00 '''GH:'''  Our job was too easy anyway, Steve.
2710.00 2711.00 '''GH:'''  We need a challenge.
2711.00 2712.00 '''GH:'''  We need to be hard-earned.
2712.00 2714.00 '''C:'''  A challenge to inform people.
2714.00 2738.00 '''C:'''  We have to remember that through most of modern Western history, from the time that science came onto the scene, like modern science came onto the scene, only in the beginning and in the examples where it butts up against religious doctrine, we often see distrust.
2738.00 2744.00 '''C:'''  Beyond that, science was like we were in awe of science as a culture.
2744.00 2750.00 '''C:'''  We really respected science and we bowed to the amazing things that science brought us.
2750.00 2774.00 '''C:'''  And it really is in many ways a new phenomenon that there's an educated portion of society, not necessarily driven by religious dogma, who don't trust science as a function of this narcissistic I-know-better-than-the-experts attitude, the trappings, as we know, of the conspiracy sect.
2774.00 2777.00 '''C:'''  That didn't have a large hold in the past.
2777.00 2782.00 '''C:'''  So this really is a new phenomenon that we're having to grapple with because historically, people trust scientists.
2782.00 2784.00 '''C:'''  And that's why it was easy to become a pseudoscientist.
2784.00 2798.00 '''S:'''  But this is something that we know as scientific skeptics, not just science communicators, because we've long been saying, and the research really strongly supports this, is that yes, you need to teach scientific literacy.
2798.00 2801.00 '''S:'''  That's one of the planks of scientific skepticism.
2801.00 2804.00 '''S:'''  You need to correct misinformation.
2804.00 2809.00 '''S:'''  You need to give people a basic fundamental knowledge about science and how science works.
2809.00 2833.00 '''S:'''  But in addition to that, you need to give them critical thinking skills, which includes understanding pseudoscience and how pseudoscience works and how to distinguish pseudoscience from genuine science, science denial and how that works and how to distinguish science denial from genuine skepticism because the science deniers all portray themselves as skeptics.
2833.00 2836.00 '''S:'''  We're asking the hard questions that no one else will ask.
2836.00 2845.00 '''S:'''  But no, if it's a perverted version of that designed to deny legitimate science, then it's not skepticism.
2845.00 2846.00 '''S:'''  It's science denial.
2846.00 2849.00 '''S:'''  Steve, could you repeat all that so George and I can write it down?
2849.00 2853.00 '''S:'''  Yeah. And then the third thing is media savvy.
2853.00 2872.00 '''S:'''  You need to understand, like we were also just talking about social media and the internet, you need to understand how information flows through our modern society, how you could find reliable bits of information and distinguish that from unreliable sources because we all know people personally, colleagues, obviously as skeptics.
2872.00 2898.00 '''S:'''  We all know people who are essentially living in an alternate reality because they live in an alternate ecosystem of information and they think that they are completely right and we are all hopelessly duped because we believe in things like anthropogenic global warming, that vaccines are safe and effective, that GMOs are safe, that evolution happened.
2898.00 2899.00 '''S:'''  How silly are we?
2899.00 2902.00 '''S:'''  Yeah, I know. That the earth is basically a sphere.
2902.00 2905.00 '''S:'''  These things that make us gullible in their eyes.
2905.00 2913.00 '''S:'''  Seriously, there are people who if you are immersed in this alternate reality of information and alternate sources, journals and outlets and everything.
2913.00 2914.00 '''S:'''  The Truman Show.
2914.00 2918.00 '''S:'''  Yeah, exactly. Social media makes it very easy to do all these things.
2918.00 2922.00 '''S:'''  You don't need a big brick and mortar institution that has been around for a hundred years.
2922.00 2923.00 '''S:'''  You just need a slick website.
2923.00 2932.00 '''S:'''  Steve, correct me if you disagree, but the fix is that critical thinking needs to become part of the classroom, part of the canon.
2932.00 2942.00 '''S:'''  There should be multiple times during a child's career as a student, they should be taking classes on critical thinking and be taught these things.
2942.00 2945.00 '''S:'''  I agree. It has to be woven in throughout the science curriculum.
2945.00 2947.00 '''S:'''  More than just science.
2947.00 2951.00 '''C:'''  I think it should be in every class. Every class should have a component or, you know.
2951.00 2953.00 '''C:'''  Yeah, woven in.
2953.00 2960.00 '''S:'''  Wouldn't it be fantastic if teachers were taught critical thinking and then that bled into the classroom that way as well.
2960.00 2963.00 '''S:'''  But we don't, I mean, it's rare.
2963.00 2971.00 '''S:'''  We get emails from people who say, oh, I was inspired by this one teacher that I had at one point or, you know, this isn't a common thing at all.
2971.00 2973.00 '''S:'''  It's incredibly uncommon.
2973.00 2982.00 '''C:'''  Well, and let's be clear, some places do this and some school districts do this and some incredible institutions do this already.
2982.00 2985.00 '''C:'''  It's not widespread and it's not well adopted.
2985.00 2992.00 '''C:'''  But, I mean, we shouldn't talk about it as if nobody had this idea before because I think that this is among very good educators who really focus on pedagogy.
2992.00 2995.00 '''C:'''  I think that this is a well established truth.
2995.00 2996.00 '''S:'''  Of course.
2996.00 2997.00 '''S:'''  Yeah, we didn't come up with this.
2997.00 2999.00 '''S:'''  It just happens to be the truth.
2999.00 3005.00 '''S:'''  And we are, I'm telling everyone, if you don't know this, that's the answer.
3005.00 3007.00 '''S:'''  And that's what we should be focusing on.
3007.00 3016.00 '''S:'''  And it's mind boggling how impossible it is to get the money it takes to do this because there's no money in critical thinking in science.
3016.00 3017.00 '''S:'''  It's skepticism.
3017.00 3018.00 '''C:'''  And so how do you do it?
3018.00 3025.00 '''C:'''  How do you fight against the fact that there is money in teaching anti-evolution rhetoric and anti-critical race theory rhetoric?
3025.00 3041.00 '''C:'''  There are states, these states that publish our textbooks, Texas, where there are concerted efforts and many, many southern states where there are concerted efforts to take actual data, truth, reality, history, science out of the textbooks and out of the curriculum.
3041.00 3042.00 '''C:'''  And critical thinking.
3042.00 3043.00 '''C:'''  And specifically critical pedagogy.
3043.00 3044.00 '''C:'''  And critical thinking, yeah.
3044.00 3060.00 '''GH:'''  You know, this idea of a sea change in the educational system of incorporating critical thinking and skepticism across the board into the whole curriculum, I think it's a fantasy because it's just, for something like that to occur is such a gigantic proposition.
3060.00 3069.00 '''GH:'''  I wonder if a focus of our effort as sort of the fringe collective that we are that could have some influence.
3069.00 3079.00 '''GH:'''  I don't know how true this is for everybody here, but there was a time in school where all the girls went into the auditorium one day and they got the sex movie.
3079.00 3083.00 '''GH:'''  And all the boys went into the auditorium that one day and they got the sex movie.
3083.00 3084.00 '''GH:'''  And it wasn't the best.
3084.00 3085.00 '''GH:'''  It wasn't the most clear.
3085.00 3090.00 '''GH:'''  It wasn't necessarily the best sex ed presentation.
3090.00 3093.00 '''GH:'''  But it sort of gave you the basics and it set up something to understand.
3093.00 3095.00 '''GH:'''  This is how procreation works.
3095.00 3096.00 '''GH:'''  And this is the boy parts.
3096.00 3097.00 '''GH:'''  This is the girl parts.
3097.00 3098.00 '''GH:'''  Great.
3098.00 3111.00 '''GH:'''  So would there be some kind of equivalent of a one time, like an unbelievably well produced 45 minute thing of this is like how advertising works.
3111.00 3114.00 '''GH:'''  This is how critical thinking works.
3114.00 3127.00 '''GH:'''  And to get that in front of people's faces, in front of students' faces, if nothing else, is that a better target to shoot for? Like, hey, schools have assemblies.
3127.00 3128.00 '''GH:'''  Schools have assemblies.
3128.00 3139.00 '''GH:'''  And to say, let's get a critical thinking assembly into the rotation that's going to be happening versus wouldn't it be great to have critical thinking at every juncture of a student's education?
3139.00 3140.00 '''GH:'''  Of course it would.
3140.00 3142.00 '''GH:'''  Is that realistic, though?
3142.00 3146.00 '''C:'''  Yeah. But unfortunately, what we're talking about isn't a single fix.
3146.00 3150.00 '''GH:'''  I know it's not a single fix, but I'm saying better than what's happening now.
3150.00 3155.00 '''C:'''  Right. But just like that video, it doesn't necessarily, we'd have to look at the actual outcomes.
3155.00 3158.00 '''C:'''  Did that prevent any unwanted pregnancies?
3158.00 3159.00 '''C:'''  Maybe not.
3159.00 3163.00 '''C:'''  You know, we have to really understand, are we hitting them at the right time?
3163.00 3168.00 '''GH:'''  I just know that I saw Cosmos as a kid and it was like 30 minutes and I was in.
3168.00 3169.00 '''GH:'''  I was in.
3169.00 3170.00 '''GH:'''  There was 30 minutes.
3170.00 3176.00 '''GH:'''  Now, again, my environment was set up so much that my parents had inspired critical thinking and a love of science, et cetera.
3176.00 3179.00 '''GH:'''  But that was a catalyst.
3179.00 3194.00 '''GH:'''  And I'm thinking there can be catalyst moments for young people that sometimes can be just in a well-done presentation of like, it happens with music, it happens with drama, it happens in all kinds of different things that a single presentation, a single moment, a single person seeing a play.
3194.00 3202.00 '''GH:'''  I know how many stories you hear where the person, the kid goes and sees a play for the first time and they go, I want to be an actor. This is amazing.
3202.00 3204.00 '''GH:'''  I call it the big bang moment.
3204.00 3205.00 '''E:'''  Yeah, the big bang moment.
3205.00 3207.00 '''S:'''  Let's work on having big bang moments.
3207.00 3209.00 '''C:'''  We have to have enough of those.
3209.00 3212.00 '''C:'''  We have to have enough big bang moments that they'll hit the kids at the right time.
3212.00 3222.00 '''S:'''  But George, George, I think the problem is that we were all, everybody here and probably the vast majority of the audience that we have, we inherently love science, right?
3222.00 3225.00 '''S:'''  Like we grew up and something happened and we fell in love with it.
3225.00 3227.00 '''S:'''  You know, that's the problem.
3227.00 3233.00 '''S:'''  That in and of itself is the actual problem is like most people don't fall in love with it as quickly as we do.
3233.00 3235.00 '''S:'''  We're the low hanging fruit of science enthusiasts.
3235.00 3236.00 '''GH:'''  I don't know, man.
3236.00 3242.00 '''GH:'''  I think you hit a kid right with some really, I mean look how popular the slow motion guys are on YouTube.
3242.00 3249.00 '''GH:'''  Look how popular a lot of science communicators are because they're doing presentations that are incidentally science related.
3249.00 3270.00 '''GH:'''  I mean again, Mythbusters is the top example of like the top syndicated cable show, one of the top of all time, was created by a bunch of critical thinkers who weren't making a critical thinking show, but use critical thinking to blow stuff up and it was amazingly produced and amazingly done and it was the idea of let's make this entertaining and interesting.
3270.00 3275.00 '''GH:'''  And there's going to be a whole generation of scientists because of Mythbusters, no question.
3275.00 3283.00 '''GH:'''  I mean they themselves, they see that when they do their live shows or Adam talks about this all the time, that young people come up and they're like, I'm in, I'm so in.
3283.00 3292.00 '''GH:'''  I'm saying is there some equivalent of that that could be done in a partial curriculum basis as opposed to trying to say, boy, wouldn't it be great to just have an entirely different curriculum?
3292.00 3296.00 '''B:'''  I think so, but should we call the sex ed vid the big bang moment?
3296.00 3297.00 '''B:'''  Should we do that?
3297.00 3298.00 '''B:'''  I don't know.
3298.00 3299.00 '''B:'''  Think about it.
3299.00 3304.00 '''S:'''  So obviously you're correct, George, and I think partly we're doing that, right?
3304.00 3305.00 '''S:'''  Yeah.
3305.00 3306.00 '''S:'''  Oh, yeah, yeah, yeah.
3306.00 3315.00 '''S:'''  We're producing as much content as we have, we're trying to flood the culture with as much pro-critical thinking, scientific skepticism kind of content as we can.
3315.00 3320.00 '''S:'''  We do try to bring as much as we can of that into the classroom.
3320.00 3323.00 '''S:'''  I know a lot of what we produce gets used by science teachers in the classroom.
3323.00 3331.00 '''S:'''  We get a lot of feedback from science teachers who use the SGU, use our book, use a lot of the content that we create.
3331.00 3333.00 '''S:'''  Give talks to schools, right?
3333.00 3335.00 '''S:'''  We'll go into schools and give talks.
3335.00 3339.00 '''S:'''  Richard Saunders has a whole magic show that he does teaching critical thinking.
3339.00 3341.00 '''S:'''  So we should do as much of that as possible.
3341.00 3349.00 '''S:'''  Maybe what you're saying is maybe we should divert a little bit more of our focus towards producing the kind of content that could be incorporated into the classroom.
3349.00 3358.00 '''S:'''  But still, I actually don't think it's a pipe dream to change the education infrastructure in a positive way.
3358.00 3367.00 '''S:'''  For example, there is just a culture in science that it's dirty to deal with and talk about pseudoscience.
3367.00 3375.00 '''S:'''  And they don't like to do it, they don't understand it, and therefore it doesn't trickle down in science education.
3375.00 3379.00 '''S:'''  And that is the thing that I really would like to change.
3379.00 3384.00 '''S:'''  And I think we collectively, the scientific skeptical movement should try to change.
3384.00 3392.00 '''S:'''  It's like, no, teaching about pseudoscience is a critical element of teaching science.
3392.00 3395.00 '''S:'''  The evidence overwhelmingly shows that.
3395.00 3398.00 '''S:'''  This is one more study that we're talking about today.
3398.00 3404.00 '''S:'''  One more on a mountain of studies which show that the knowledge deficit model is limited.
3404.00 3406.00 '''S:'''  It's not worthless, but it's limited.
3406.00 3408.00 '''S:'''  You can't just give facts.
3408.00 3409.00 '''S:'''  You can't just teach science.
3409.00 3416.00 '''S:'''  You have to put it in the context of this is how we know what we know, and this is what happens when it goes wrong.
3416.00 3418.00 '''S:'''  Show how terrible it goes.
3418.00 3422.00 '''S:'''  You know that you're doing it right because you're not doing it wrong.
3422.00 3425.00 '''C:'''  Yeah, you can't teach clinical physiology.
3425.00 3431.00 '''C:'''  You can't teach a physician how to be a good doctor if you only ever show them non-pathological physiology.
3431.00 3438.00 '''C:'''  It's like you've got to see what happens when it's happening the normal or typical way versus when something goes wrong.
3438.00 3446.00 '''C:'''  Kids are never going to understand critical thinking if you only teach them the cheery, saccharine version of how things should be.
3446.00 3448.00 '''C:'''  You'll make them vulnerable.
3448.00 3449.00 '''S:'''  Totally.
3449.00 3456.00 '''S:'''  You have to teach pathological science in order to contrast that to healthy science.
3456.00 3459.00 '''S:'''  If you don't do that, all you're doing is making a bunch of pseudoscientists.
3459.00 3467.00 '''S:'''  Steve, this is what I'm confused about because I think teaching kids about pseudoscience is almost like showing them a magic trick.
3467.00 3471.00 '''S:'''  If you tell them, here, let me show you what's wrong about this.
3471.00 3472.00 '''S:'''  Let's discuss it.
3472.00 3473.00 '''S:'''  We'll break it down.
3473.00 3478.00 '''S:'''  It becomes interesting because you're almost empowering them with the knowledge.
3478.00 3479.00 '''S:'''  My kids love it.
3479.00 3480.00 '''S:'''  They're responding to this.
3480.00 3481.00 '''GH:'''  That can be so entertaining.
3481.00 3483.00 '''GH:'''  Yeah, I think it can be done.
3483.00 3495.00 '''C:'''  We're making a core assumption here. Melissa in the chat made a really, I think, intuitive or insightful comment, which was that it's not uncommon to teach critical thinking in schools, by the way.
3495.00 3501.00 '''C:'''  If you were to get a group of educators in a room and say, we should teach critical thinking, nobody would be like, I don't think so.
3501.00 3504.00 '''C:'''  Educators, by and large, agree with this.
3504.00 3512.00 '''C:'''  There are obviously external forces trying to change our curriculum structure, but people who do this for a living are like, hell yeah, we need to be teaching critical thinking.
3512.00 3514.00 '''C:'''  Many of them do.
3514.00 3518.00 '''C:'''  The question is or the concern is it can be undone at home.
3518.00 3521.00 '''C:'''  It can absolutely be undone in the home.
3521.00 3525.00 '''C:'''  We only have control over our academic system.
3525.00 3532.00 '''C:'''  We cannot control the fact that parents have a much greater influence over their children's outcomes.
3532.00 3534.00 '''E:'''  We've got to educate the parents as well.
3534.00 3536.00 '''S:'''  We can only do what we can do.
3536.00 3538.00 '''S:'''  We can control the things that we can do.
3538.00 3539.00 '''C:'''  It's a chicken and egg situation.
3539.00 3540.00 '''C:'''  Yeah.
3540.00 3541.00 '''S:'''  All right, let's move on.


=== Embryo Research <small>(59:01)</small> ===
* [https://www.nature.com/articles/d41586-021-02343-7 What’s next for lab-grown human embryos?]<ref>[https://www.nature.com/articles/d41586-021-02343-7 Nature: What’s next for lab-grown human embryos?]</ref>
3541.00 3546.00 '''S:'''  Cara, you're going to tell us about research using human embryos.
3546.00 3555.00 '''C:'''  Yeah, so there was a feature article that was put out in Nature by a science writer named Kendall Powell called, What's Next for Lab-Grown Human Embryos?
3555.00 3563.00 '''C:'''  The subtitle, which I think is very telling, is, Researchers are now permitted to grow human embryos in the lab for longer than 14 days.
3563.00 3565.00 '''C:'''  Here's what they could learn.
3565.00 3576.00 '''C:'''  I'm not sure if all of us are aware, but there has long been in place an international consensus decision that there is a 14-day rule.
3576.00 3578.00 '''C:'''  That's what it's called, the 14-day rule.
3578.00 3582.00 '''C:'''  This was set by the International Society for Stem Cell Research.
3582.00 3590.00 '''C:'''  It's been discussed ever since the 70s when IVF really did become an actuality for many families.
3590.00 3594.00 '''C:'''  Prior to that, you kind of didn't have a lot of options.
3594.00 3600.00 '''C:'''  But when IVF really became an actuality, there became more embryos available.
3600.00 3606.00 '''C:'''  Individual countries, different research groups said, We want access to these embryos.
3606.00 3614.00 '''C:'''  Of course, there was a big international discussion that was really codified in the mid-2000s, just like we're seeing now with CRISPR.
3614.00 3616.00 '''C:'''  You have a new technology available.
3616.00 3619.00 '''C:'''  Where do the ethics fall in?
3619.00 3621.00 '''C:'''  Where do we want to draw a line in the sand?
3621.00 3625.00 '''C:'''  For a long time, that line in the sand was at 14 days.
3625.00 3631.00 '''C:'''  Before we talk about what happens next, let's talk about what that meant for researchers.
3631.00 3637.00 '''C:'''  Well, first and foremost, most labs can't even grow embryos to 14 days.
3637.00 3638.00 '''C:'''  It's really freaking hard to do.
3638.00 3643.00 '''C:'''  There's only a few labs in the world that have managed to do this and to do this consistently.
3643.00 3650.00 '''C:'''  Part of that is because many countries have their own legal regulations that go beyond the 14-day rule.
3650.00 3654.00 '''C:'''  It's actually illegal in many countries to work with human embryos at all.
3654.00 3665.00 '''C:'''  But in the countries where it is legal, the US being one of them, the 14-day rule says that you cannot continue to grow human embryos beyond 14 days.
3665.00 3668.00 '''C:'''  What happens in those first 14 days?
3668.00 3670.00 '''C:'''  A lot, but also not a lot.
3670.00 3678.00 '''C:'''  The embryos do start to organize, and you start to get right at the edge of what's called the primitive streak.
3678.00 3686.00 '''C:'''  This is a visual thing that you can see under the microscope, a little streak that ultimately is going to form the neural tube.
3686.00 3697.00 '''C:'''  Ultimately, the streak early on, you can tell by the way that the cells are organized, is giving the body its design axes, so up, down, head, tail, left, right.
3697.00 3700.00 '''J:'''  Cara, is that the spine? Are you saying that's the spine?
3700.00 3701.00 '''C:'''  Not yet.
3701.00 3705.00 '''C:'''  It will eventually, like during many more cell divisions, become the spine.
3705.00 3711.00 '''C:'''  I'm talking right now in the first 14 days, a ball of cells, a blastocyst, that has almost no differentiation at all.
3711.00 3715.00 '''C:'''  These cells are totipotent. They're not even pluripotent.
3715.00 3718.00 '''C:'''  These are cells that could literally become anything.
3718.00 3726.00 '''C:'''  They are really in the earliest stages of starting to develop and differentiate.
3726.00 3729.00 '''C:'''  Really, really early on, it just looks like a ball of cells.
3729.00 3734.00 '''C:'''  Around the end of that 14-day mark, you start to see the primitive streak.
3734.00 3736.00 '''C:'''  It's called that because that's all it really is.
3736.00 3741.00 '''C:'''  It's super early, and it's giving you some directionality to the embryo.
3741.00 3744.00 '''C:'''  Then it's going to start folding in on itself.
3744.00 3753.00 '''C:'''  You're going to have three differentiated layers of cells that ultimately give rise to things like skin and neural tissue, to body organs, to bone, and connective tissue.
3753.00 3758.00 '''C:'''  At this point, none of that differentiation has occurred.
3758.00 3768.00 '''C:'''  That's when, up until recently, researchers have had to go ahead and just freeze the embryos and put them in a suspended state, either that or destroy the embryos.
3768.00 3771.00 '''C:'''  The research really couldn't continue beyond that.
3771.00 3773.00 '''C:'''  We're missing a lot of information.
3773.00 3787.00 '''C:'''  That's the information about what happens when the embryo, or I should say at this point the blastocyst, really does start to undergo gastrulation, to really develop into a primitive organism.
3787.00 3795.00 '''C:'''  Like you said, Jay, develop that spinal column, which starts as a neural tube based on how it folds and then continues to differentiate.
3795.00 3803.00 '''C:'''  All of that, the post-14 day up through the next couple of weeks, we can't see an ultrasound.
3803.00 3820.00 '''C:'''  The only way we've ever been able to study the development of an actual human embryo is through animal models, through embryonic models, and through taking all of this information and gleaning what we think we understand.
3820.00 3826.00 '''C:'''  We know now a really good amount of what happens from day one to day 14.
3826.00 3831.00 '''C:'''  Of course, we can study an embryo at different stages that was spontaneously aborted, for example.
3831.00 3850.00 '''C:'''  We might be able to look at it, but we can't physically grow these things beyond day 14 and understand exactly what's happening, both at the structural level, but perhaps even more importantly at the molecular level, because it's been this international consensus that says you can't do that.
3850.00 3853.00 '''C:'''  Back in May, that consensus changed.
3853.00 3862.00 '''C:'''  The governing body, again, they are called the International Society for Stem Cell Research, the ISSCR, released new guidelines.
3862.00 3869.00 '''C:'''  It relaxed the 14-day rule, and it allows for study of gastrulation beyond that.
3869.00 3873.00 '''C:'''  They didn't put a new line in the sand, which a lot of people are concerned about.
3873.00 3875.00 '''C:'''  They didn't say it's no longer 14 days, it's whatever.
3875.00 3878.00 '''C:'''  They basically said so few labs do this.
3878.00 3881.00 '''C:'''  If you want to continue beyond 14 days, you just have to seek approval.
3881.00 3884.00 '''E:'''  So few labs are capable of going beyond 14 days.
3884.00 3886.00 '''S:'''  Just individualize it rather than making a rule.
3886.00 3889.00 '''S:'''  I guess that's positive. That's a great move.
3889.00 3903.00 '''C:'''  Of course, there are people who push back and say, I wish they would have set a new line in the sand, but then of course the pushback against that is it's very arbitrary where this line is, and a lot of it really does feel moral or religious.
3903.00 3909.00 '''C:'''  Oh, maybe now we're starting to see heart structure, or now the neurons are starting to fire.
3909.00 3912.00 '''C:'''  What does that mean? Could it have thought? Could it have feelings?
3912.00 3919.00 '''C:'''  These are where a lot of these complicated questions come up, but they're the same questions we grapple with when we talk about brain organoids.
3919.00 3923.00 '''C:'''  We just discussed brain organoids in a previous podcast.
3923.00 3928.00 '''C:'''  To be clear, there are these pseudo embryos.
3928.00 3931.00 '''C:'''  They're not actually called pseudo embryos. I just made that term up.
3931.00 3935.00 '''C:'''  But there are these embryonic models that have been developed.
3935.00 3939.00 '''C:'''  Really, really complicated, incredible science.
3939.00 3948.00 '''C:'''  When a Japanese researcher, I think, first developed how to make these embryonic models, it was like a watershed moment.
3948.00 3951.00 '''C:'''  Then people were able to learn the protocol and other labs were able to do it.
3951.00 3961.00 '''C:'''  But there's no real determination or guarantee that what's happening in an embryonic model perfectly matches with what's actually happening in development.
3961.00 3965.00 '''GH:'''  What's the model like? I don't understand. Is it an organic thing?
3965.00 3972.00 '''C:'''  Yeah, so they actually can take certain types of pluripotent cells and induce them to grow in a certain way, but it's not a human embryo.
3972.00 3981.00 '''C:'''  It's cell types that have been stuck together and induced to grow, kind of like the brain organoids versus an actual brain being grown.
3981.00 3985.00 '''S:'''  Yeah, so they're trying to ultimately make a distinction between a clump of cells and a person.
3985.00 3995.00 '''S:'''  They're trying to draw lines. Beyond this point, it's actually now a person and we're actively going to treat it that way as opposed to it's a clump of cells that we're studying scientifically. But of course, it's a continuum.
3995.00 3998.00 '''S:'''  So there is no sharp demarcation line.
3998.00 4000.00 '''S:'''  There's never going to be a sharp demarcation.
4000.00 4005.00 '''C:'''  So historically, it was 14 days. This international group has lifted that.
4005.00 4015.00 '''C:'''  Now, one thing that's interesting is that, which I didn't really realize, although maybe I did, is that here in the US, we don't really have a lot of regulation about working with these things.
4015.00 4025.00 '''C:'''  The regulation that we all thought we had probably comes into play because we have a law that says that these types of investigations can't be federally funded.
4025.00 4026.00 '''C:'''  Right, yeah.
4026.00 4032.00 '''C:'''  So most labs can't do this because they can't access NIH or NSF dollars to do it.
4032.00 4061.00 '''C:'''  And it's expensive, right? And they have to have their own setup to be able to access and acquire these things. And so very few labs have really worked out the logistics, but the ones that have argue that there's so much more we could know because there's this gap in our knowledge between what we can see on ultrasound and what we can test in a developing fetus through like chorionic villi sampling or any other way to understand developmental biology and what happens after this 14-day rule.
4061.00 4065.00 '''C:'''  There's kind of a black box area that we've been able to model.
4065.00 4073.00 '''C:'''  We've been able to look at animal research, but really within humans, it's interesting how much of a gap we have in our knowledge.
4073.00 4082.00 '''S:'''  Yeah, we're getting rid of the semi-arbitrary rule and just saying, all right, we'll just make individual decisions, justify your research ethically.
4082.00 4085.00 '''S:'''  It's a legitimate ethical concern.
4085.00 4089.00 '''S:'''  You don't want people growing human beings in a lab to study them.
4089.00 4099.00 '''C:'''  Completely. But all researchers have to already go through an institutional review board, and this is an actual international review board, so it's another layer of regulation.
4099.00 4105.00 '''C:'''  They're already approving this stuff. Why not do it on a case-by-case basis if so few labs can do it anyway?
4105.00 4112.00 '''S:'''  Yeah. Okay, we're going to do a couple of quickie news items, and then we're going to do a special segment with Evan, and then we'll do science or fiction.


=== Bionic Arms <small>(1:08:25)</small> ===
* [https://theness.com/neurologicablog/index.php/bionic-arms/ Bionic Arms]<ref>[https://theness.com/neurologicablog/index.php/bionic-arms/ Neurologica: Bionic Arms]</ref>
4112.00 4117.00 '''S:'''  That's the rest of the show. So this is a robotic prosthetic arm.
4117.00 4122.00 '''S:'''  This is a type of research I've been talking about for years.
4122.00 4126.00 '''S:'''  The researchers are calling this a bionic arm.
4126.00 4128.00 '''B:'''  Bionic!
4128.00 4130.00 '''S:'''  Sure, why not?
4130.00 4132.00 '''S:'''  Why not?
4132.00 4134.00 '''S:'''  Conjures images.
4134.00 4150.00 '''S:'''  So this is a function of what we call a brain-machine interface, where basically any time you're having biology interface with a robotic or a mechanical device or a computerized device, in this case it's not connecting directly to the brain.
4150.00 4161.00 '''S:'''  They're using a very cool method where, so these are meant for people who have an amputation, and they have surviving nerve endings.
4161.00 4172.00 '''S:'''  You can graft, let's say for example you graft the motor nerve, the stub of the motor nerve basically, onto a clump of muscle tissue.
4172.00 4181.00 '''S:'''  And so what that does is it keeps everything alive, but it also means that now that nerve is going to connect to this little patch of muscle.
4181.00 4185.00 '''S:'''  And you might be thinking, well what will that do? It's just a little patch of muscle.
4185.00 4187.00 '''S:'''  It can't move your limb or anything.
4187.00 4192.00 '''S:'''  However, what that does is it amplifies the signal from the nerves.
4192.00 4198.00 '''S:'''  Because nerves produce a very, very low amplitude electrical signal.
4198.00 4204.00 '''S:'''  Muscles are like ten times as much, like a much bigger electrical signal from a muscle cell depolarizing.
4204.00 4212.00 '''S:'''  And so essentially, with that little clump of muscle, not only do the nerves and the muscles keep each other healthy, it amplifies the signal.
4212.00 4218.00 '''S:'''  And then the signal from that little patch of muscles contracting activates the bionic arm.
4218.00 4224.00 '''S:'''  So that's how you get motor control through your preexisting motor nerves.
4224.00 4233.00 '''S:'''  So you're controlling it through the normal motor pathways that you had previously, but now you're again using the clump of cells to actuate the bionic arm.
4233.00 4238.00 '''S:'''  However, the breakthrough here, no component of this is new.
4238.00 4242.00 '''S:'''  What's new is bringing it all together in this one limb.
4242.00 4246.00 '''S:'''  The other two components are sensory.
4246.00 4255.00 '''S:'''  So one is having skin sensors that connect to the sensory nerves, which are grafted onto parts of skin.
4255.00 4267.00 '''S:'''  So again, they're taking the existing nerves, grafting them to some intact tissue, and then using that as the interface to the mechanical connection.
4267.00 4274.00 '''S:'''  So in the motor nerve, it goes from the motor nerve to the muscles to the actuators in the arm.
4274.00 4288.00 '''S:'''  And then in the sensory feedback, it goes from the sensors in the surface of the skin of the robotic arm to this predetermined patch of skin cells that the sensory nerves are grafted onto.
4288.00 4298.00 '''S:'''  So with this arm attached biologically to the subject, they did this in two subjects only, so this is early research.
4298.00 4309.00 '''S:'''  They were able to control the limb through voluntary control through the nerves, and they were able to actually feel, like will their brain interpret that as a feeling in the limb?
4309.00 4320.00 '''S:'''  Yes. So the brain happily incorporates all of this into its circuitry so that people feel like they own the limb, not that they have a limb attached to them.
4320.00 4324.00 '''S:'''  There isn't this arm attached to me. It's like this is my arm now.
4324.00 4325.00 '''S:'''  Agency?
4325.00 4327.00 '''S:'''  Yeah. Well, it's ownership is the term.
4327.00 4328.00 '''S:'''  Ownership.
4328.00 4338.00 '''S:'''  Ownership. And they can control it without looking at it, and their ability to error correct and make adjustments is at like near normal levels.
4338.00 4352.00 '''S:'''  That's what they were studying: their ability to control the limb was actually more similar to an intact person's than to somebody using an older prosthetic without the sensory feedback.
4352.00 4356.00 '''B:'''  It becomes part of your internal representation of your body.
4356.00 4357.00 '''B:'''  What is that called?
4357.00 4358.00 '''B:'''  Your homunculus?
4358.00 4361.00 '''B:'''  Yes, homunculus. It becomes part of your homunculus basically.
4361.00 4362.00 '''S:'''  It's a homunculus.
4362.00 4364.00 '''S:'''  This is one cool thing when you just think about it,
4364.00 4369.00 '''GH:'''  that like you can close your eyes and you can tell when your hand is open or closed. That's proprioception.
4369.00 4370.00 '''GH:'''  That's proprioception.
4370.00 4371.00 '''GH:'''  Proprioception.
4371.00 4378.00 '''GH:'''  Okay. That's amazing. I'm sorry. I know this is stoner talk, but it's just you know that it's amazing.
4378.00 4379.00 '''C:'''  Yeah. Luckily.
4379.00 4381.00 '''C:'''  Read stories about people who lose it.
4381.00 4382.00 '''C:'''  Yeah, right. I know. I know.
4382.00 4384.00 '''C:'''  It's incredible when people lose proprioception.
4384.00 4388.00 '''S:'''  All of these pieces have been preexisting.
4388.00 4396.00 '''S:'''  This is the first lab to bring them all together into one limb and show that it improves the usability of these limbs.
4396.00 4397.00 '''S:'''  We are getting there.
4397.00 4398.00 '''S:'''  We're getting so much closer.
4398.00 4400.00 '''GH:'''  Was it a $6 million study or was it?
4400.00 4407.00 '''S:'''  I mean we're still 100 years away from the $6 million man, which is interesting.
4407.00 4413.00 '''S:'''  In the 1970s we thought, yeah, we're probably pretty close to this, but it's like 150 years away.
4413.00 4414.00 '''S:'''  But this is where we are.
4414.00 4417.00 '''S:'''  This is still a fantastic improvement, trust me.
4417.00 4427.00 '''S:'''  The technology we have depicted in The Six Million Dollar Man, indistinguishable from a living limb to the user and to other people, forget about it.
4427.00 4428.00 '''S:'''  We're nowhere near that.
4428.00 4431.00 '''S:'''  Now, Steve, if you had one of these arms, can you pick up the front end of a car?
4431.00 4433.00 '''S:'''  No.
4433.00 4434.00 '''S:'''  Yeah.
4434.00 4435.00 '''S:'''  It's still limited.
4435.00 4437.00 '''S:'''  You have to hitchhike, Steve.
4437.00 4440.00 '''S:'''  So it's strapped on still.
4440.00 4441.00 '''S:'''  It's touching the skin on the outside.
4441.00 4443.00 '''S:'''  Yes, but there is this biological interface.
4443.00 4444.00 '''S:'''  Right.
4444.00 4445.00 '''S:'''  But it's touching.
4445.00 4446.00 '''S:'''  You're not going to be entering the shotput competition.
4446.00 4447.00 '''S:'''  Is it sticking?
4447.00 4449.00 '''S:'''  Is it like electrode in the muscle?
4449.00 4455.00 '''S:'''  No, no, because the electrodes are either stimulating the skin or reading the muscle contraction from the surface.
4455.00 4456.00 '''S:'''  Okay.
4456.00 4457.00 '''S:'''  So it's not like a.
4457.00 4460.00 '''C:'''  Yeah, so if it fell off, if it was ripped off, like you wouldn't be injured.
4460.00 4462.00 '''S:'''  Yeah, there's no wires going into the arm.
4462.00 4463.00 '''S:'''  Okay.
4463.00 4470.00 '''S:'''  The reason why I say that though is because structurally the body and the musculature are not really, like these arms couldn't pick up a lot of weight.
4470.00 4471.00 '''S:'''  No.
4471.00 4473.00 '''S:'''  All kidding aside, yeah, they couldn't.
4473.00 4474.00 '''S:'''  But it would be functional.
4474.00 4476.00 '''S:'''  You could pick up a can of soda.
4476.00 4478.00 '''S:'''  You could use it.
4478.00 4482.00 '''B:'''  You could pick it up and not be looking at it with the proper strength.
4482.00 4484.00 '''S:'''  And know that it's there.
4484.00 4485.00 '''S:'''  Know how hard you're squeezing it.
4485.00 4486.00 '''S:'''  And know you're not crushing it.
4486.00 4487.00 '''GH:'''  That's amazing.
4487.00 4493.00 '''S:'''  So I keep a close eye on this technology and it's, again, nice incremental improvements.
4493.00 4495.00 '''S:'''  This was worthy of comment.


== Quickie with Bob: Caves on Mars <small>(1:15:00)</small> ==
4495.00 4499.00 '''S:'''  All right, Bob, you're going to give us a quickie about caves on Mars.
4499.00 4500.00 '''S:'''  Okay.
4500.00 4501.00 '''B:'''  Yes, thank you, Steve.
4501.00 4503.00 '''B:'''  This is your quickie with Bob, everyone.
4503.00 4512.00 '''B:'''  Recent analysis by scientists has shown what could be more than a thousand caves in the Tharsis Bulge region of Mars.
4512.00 4513.00 '''B:'''  Tharsis Bulge?
4513.00 4515.00 '''GH:'''  I don't know where that region is, but it sounds kind of cool.
4515.00 4518.00 '''GH:'''  Make a left at the Colossan Cut-Off.
4518.00 4520.00 '''GH:'''  Tharsis Bulge.
4520.00 4528.00 '''B:'''  Calculations show that these caves could be lifesaving for future human colonists, not unlike the lava tubes on Mars.
4528.00 4530.00 '''B:'''  The lava tubes on the moon.
4530.00 4531.00 '''B:'''  I knew that was going to happen.
4531.00 4539.00 '''B:'''  Now, this presupposes, though, that Mars is far deadlier than a lot of people, I think, really, really, truly understand.
4539.00 4542.00 '''B:'''  They think, oh, it's got an atmosphere, you know, that's not bad.
4542.00 4546.00 '''B:'''  But it barely has an atmosphere, barely, barely.
4546.00 4550.00 '''B:'''  And it does not have a magnetosphere, which is also very critical.
4550.00 4559.00 '''B:'''  And regarding the atmosphere, if you're at sea level on Mars, it has 0.7% the air pressure that we experience at sea level.
4559.00 4561.00 '''S:'''  I mean, that's like, it might as well be nothing.
4561.00 4562.00 '''S:'''  It's so small.
4562.00 4563.00 '''S:'''  Bob, let me ask you this.
4563.00 4564.00 '''S:'''  Yes, ask.
4564.00 4567.00 '''S:'''  You may remember this answer from our upcoming book, Guide to the Future.
4567.00 4568.00 '''S:'''  Four.
4568.00 4572.00 '''S:'''  What pressure would you need to have on Mars or anywhere?
4572.00 4577.00 '''S:'''  What pressure would a human being need in order to be able to survive without a pressure suit?
4577.00 4578.00 '''B:'''  Oh, it was...
4578.00 4579.00 '''B:'''  A lot.
4579.00 4581.00 '''B:'''  No, it was actually less than we thought.
4581.00 4582.00 '''S:'''  One atmosphere.
4582.00 4583.00 '''S:'''  Yeah, right.
4583.00 4584.00 '''S:'''  It was less than we thought, wasn't it?
4584.00 4585.00 '''S:'''  Six percent.
4585.00 4586.00 '''S:'''  Yeah.
4586.00 4587.00 '''E:'''  Oh, really?
4587.00 4588.00 '''E:'''  That's it.
4588.00 4589.00 '''S:'''  Yeah.
4589.00 4590.00 '''S:'''  Of an atmosphere.
4590.00 4591.00 '''S:'''  That's survivability.
4591.00 4592.00 '''S:'''  You can survive.
4592.00 4593.00 '''S:'''  You can survive.
4593.00 4594.00 '''S:'''  You can't breathe.
4594.00 4595.00 '''S:'''  You need oxygen.
4595.00 4596.00 '''S:'''  But it's not as much as I thought.
4596.00 4597.00 '''S:'''  It's really...
4597.00 4598.00 '''S:'''  But you won't, like, implode.
4598.00 4599.00 '''B:'''  Yeah, you won't implode.
4599.00 4600.00 '''B:'''  Right.
4600.00 4601.00 '''B:'''  And you can't move.
4601.00 4602.00 '''B:'''  You can't move because if you move, then you'd run out.
4602.00 4611.00 '''B:'''  Yeah, but still, even if that is true, even so, we would need, you know, six, seven, eight times what's there now just to be able to breathe in enough oxygen.
4611.00 4612.00 '''S:'''  Yeah, yeah.
4612.00 4613.00 '''S:'''  But we could...
4613.00 4620.28 '''S:'''  So with the volatiles on Mars, we could get to the point where there was enough of an atmosphere where all you would need was supplemental oxygen.
4620.28 4622.72 '''S:'''  You wouldn't need a pressure suit.
4622.72 4623.72 '''S:'''  Yeah.
4623.72 4626.16 '''S:'''  You wouldn't be like Arnold Schwarzenegger in Total Recall.
4626.16 4631.48 '''B:'''  But talking reality today, it's a horrible place.
4631.48 4633.84 '''B:'''  But it's not just the atmosphere, though.
4633.84 4638.28 '''B:'''  It's the extreme ultraviolet and ionizing radiation.
4638.28 4639.28 '''B:'''  It's so deadly.
4639.28 4643.76 '''B:'''  How much deadlier do you think it is on Mars, in terms of that radiation, than on the Earth?
4643.76 4644.76 '''J:'''  Ten times.
4644.76 4645.76 '''B:'''  You're wrong, Jay.
4645.76 4646.76 '''B:'''  Nine hundred times.
4646.76 4649.76 '''B:'''  The radiation doses are 900 times...
4649.76 4652.80 '''J:'''  I don't think I've ever been wrong so fast in my life.
4652.80 4654.52 '''B:'''  You were wrong before you asked the question.
4654.52 4656.00 '''B:'''  So calculations show...
4656.00 4657.76 '''B:'''  Come on, this is a quickie.
4657.76 4658.76 '''B:'''  This can't...
4658.76 4659.76 '''B:'''  It's got to be under a minute, right?
4659.76 4660.76 '''B:'''  I'm waiting for you to finish.
4660.76 4661.76 '''B:'''  Okay.
4661.76 4666.08 '''B:'''  So calculations show that only 2% of the UV would get through most of those caves.
4666.08 4667.08 '''B:'''  Only 2%.
4667.08 4670.80 '''B:'''  But enough light would get in that photosynthesis wouldn't be completely gone.
4670.80 4673.32 '''B:'''  You'd still be able to take advantage of photosynthesis.
4673.32 4677.36 '''B:'''  And they think that for the ionizing radiation, it would be the same.
4677.36 4678.36 '''B:'''  Very little would get in.
4678.36 4681.56 '''B:'''  So this could be a real haven for future colonists.
4681.56 4685.24 '''B:'''  Also, while you're in a cave, you could also search for life, Jay.
4685.24 4689.08 '''B:'''  Because if I'm living on Mars, I'm going in a cave, even if I'm a little microbe.
4689.08 4695.58 '''B:'''  So there could be, who knows what kind of life could have developed on Mars eons ago in these caves.
4695.58 4699.00 '''B:'''  So in the future, Arnold Schwarzenegger may say...
4699.00 4700.56 '''B:'''  Get your ass to a cave on Mars.
4700.56 4701.56 '''B:'''  Yes.
4701.56 4703.20 '''B:'''  This was your Quickie with Bob.
4703.20 4705.20 '''B:'''  I hope it was good for you too.
4705.20 4706.20 '''S:'''  Okay.
4706.20 4707.68 '''S:'''  Thank you, Jay.


== Mystery Quotes <small>(1:18:27)</small> ==
4707.68 4712.50 '''S:'''  So Evan worked up a little puzzle for us that we're going to do before we go to science or fiction.
4712.50 4713.50 '''S:'''  Yes.
4713.50 4716.88 '''S:'''  And he has given us a bunch of quotes.
4716.88 4718.88 '''S:'''  And we have to figure out who said them.
4718.88 4720.40 '''E:'''  Or who it's attributed to, right?
4720.40 4721.40 '''E:'''  So you're going to play at home.
4721.40 4722.40 '''E:'''  You're going to keep your own score.
4722.40 4725.84 '''E:'''  We're going to ask each of the individuals here who they think the correct answer is.
4725.84 4728.78 '''E:'''  The first quote, science, my lad, is made up of mistakes.
4728.78 4734.52 '''E:'''  But they are mistakes which it is useful to make, because they lead little by little to the truth.
4734.52 4738.42 '''E:'''  Was that written by Jules Verne, Ray Bradbury, or Edgar Rice Burroughs?
4738.42 4739.42 '''E:'''  Let's go down the list.
4739.42 4740.42 '''E:'''  Steve.
4740.42 4741.42 '''S:'''  Bradbury.
4741.42 4742.42 '''E:'''  George.
4742.42 4743.42 '''GH:'''  Bradbury.
4743.42 4744.42 '''E:'''  Cara.
4744.42 4745.42 '''C:'''  Verne?
4745.42 4746.42 '''E:'''  Bob?
4746.42 4747.42 '''B:'''  Doesn't sound like Bradbury to me.
4747.42 4748.42 '''B:'''  I'll go with Edgar Rice Burroughs.
4748.42 4749.42 '''E:'''  Okay.
4749.42 4750.42 '''E:'''  And Jay?
4750.42 4751.42 '''J:'''  Jules Verne.
4751.42 4752.42 '''E:'''  If you answered Jules Verne, you are correct.
4752.42 4753.42 '''E:'''  Nicely done, Jay.
4753.42 4754.42 '''E:'''  J. Verne.
4754.42 4755.42 '''E:'''  J. Verne to the center of the Earth.
4755.42 4756.42 '''E:'''  Cara, you and me, baby.
4756.42 4757.42 '''E:'''  Cara, too.
4757.42 4758.42 '''E:'''  Nice.
4758.42 4759.78 '''E:'''  Next, I despise the lottery.
4759.78 4766.12 '''E:'''  There's less chance of you becoming a millionaire than there is of getting hit on the head by a passing asteroid.
4766.12 4767.12 '''E:'''  We'll start at the end.
4767.12 4768.12 '''E:'''  Jay.
4768.12 4769.12 '''J:'''  That's Phil Plait.
4769.12 4770.12 '''E:'''  Bob.
4770.12 4771.12 '''B:'''  Gotta be Plait.
4771.12 4772.12 '''E:'''  Cara.
4772.12 4773.12 '''C:'''  I'll go with Phil.
4773.12 4774.12 '''GH:'''  Yeah.
4774.12 4775.12 '''E:'''  George?
4775.12 4776.12 '''GH:'''  I think it's not clever enough for Phil.
4776.12 4777.12 '''GH:'''  I'm going to say, I'm going to say Vera.
4777.12 4778.60 '''E:'''  Steve.
4778.60 4780.60 '''C:'''  So she's not clever?
4780.60 4781.60 '''C:'''  No, no.
4781.60 4785.28 '''GH:'''  Phil is known for his weird, you know, he'll like a lot of things.
4785.28 4786.28 '''E:'''  Steve is correct.
4786.28 4787.28 '''E:'''  It is Brian May.
4787.28 4788.28 '''E:'''  Brian, oh.
4788.28 4789.28 '''E:'''  Good old Brian.
4789.28 4790.28 '''E:'''  Guitarist for Queen and an astrophysicist.
4790.28 4791.28 '''S:'''  Between Vera Rubin and Phil Plait, yeah.
4791.28 4792.28 '''S:'''  Next.
4792.28 4793.28 '''E:'''  In terms of wittiness.
4793.28 4794.28 '''E:'''  Who said this?
4794.28 4798.28 '''E:'''  I was captured for life by chemistry and by crystals.
4798.28 4800.24 '''E:'''  Was it Dorothy Hodgkin?
4800.24 4801.24 '''E:'''  Was it Marie Curie?
4801.24 4802.24 '''E:'''  Or Rosalind Franklin?
4802.24 4803.24 '''E:'''  Steve.
4803.24 4804.24 '''S:'''  Franklin.
4804.24 4805.24 '''E:'''  George.
4805.24 4807.60 '''GH:'''  I'm going to say Franklin as well.
4807.60 4808.60 '''E:'''  Cara.
4808.60 4811.32 '''C:'''  X-ray crystallographer Rosalind Franklin.
4811.32 4812.32 '''E:'''  And Bob.
4812.32 4813.32 '''B:'''  Rosalind.
4813.32 4814.32 '''E:'''  Jay.
4814.32 4815.88 '''J:'''  Marie Curie.
4815.88 4816.88 '''E:'''  Dorothy Hodgkin.
4816.88 4817.88 '''E:'''  Oh my goodness.
4817.88 4818.88 '''?:'''  Everybody.
4818.88 4819.88 '''GH:'''  Good distractor.
4819.88 4820.88 '''GH:'''  Billy Crystal Kidnapper?
4820.88 4821.88 '''E:'''  That's amazing.
4821.88 4822.88 '''E:'''  Not a, yeah, exactly.
4822.88 4823.88 '''GH:'''  Oh my god.
4823.88 4826.20 '''E:'''  All right, let's try this one.
4826.20 4827.20 '''E:'''  This is a fun one.
4827.20 4830.32 '''E:'''  Your immune cells are like a circulating nervous system.
4830.32 4834.44 '''E:'''  Your nervous system, in fact, is a circulating nervous system.
4834.44 4835.44 '''E:'''  It thinks.
4835.44 4836.76 '''E:'''  It is conscious.
4836.76 4839.20 '''E:'''  Was that said by Deepak Chopra?
4839.20 4840.52 '''E:'''  Andrew Weil?
4840.52 4843.48 '''E:'''  Or Tenzin Yatso, the 14th Dalai Lama?
4843.48 4844.48 '''E:'''  Jay.
4844.48 4845.48 '''J:'''  Oh my god.
4845.48 4848.04 '''J:'''  I mean, you know, it's got to be Deepak.
4848.04 4849.04 '''E:'''  Okay, Bob.
4849.04 4850.04 '''B:'''  Tenzin.
4850.04 4851.04 '''E:'''  Cara.
4851.04 4854.12 '''C:'''  Yeah, I mean, it feels Chopra-esque, but it's almost not esoteric enough.
4854.12 4856.32 '''C:'''  So I'll say it was the Dalai Lama.
4856.32 4857.32 '''E:'''  George.
4857.32 4858.32 '''GH:'''  I'm going to say Chopra.
4858.32 4859.32 '''E:'''  Okay, Steve.
4859.32 4861.76 '''S:'''  Yeah, I think it's too coherent for Chopra.
4861.76 4862.76 '''S:'''  I'll say Weil.
4862.76 4865.76 '''E:'''  According to the internet, it is Chopra.
4865.76 4868.76 '''S:'''  His immune cells are like a circulating nervous system.
4868.76 4869.76 '''E:'''  I win again.
4869.76 4870.76 '''E:'''  I win again.
4870.76 4874.56 '''E:'''  And if that's the most cogent or the most clear that Chopra's ever been, that's not good.
4874.56 4876.56 '''C:'''  Last one.
4876.56 4878.28 '''C:'''  So that was Chopra, not the random Chopra simulator.
4878.28 4879.28 '''C:'''  Correct.
4879.28 4883.44 '''E:'''  Not the Chopra engine that generates random insanity.
4883.44 4884.44 '''E:'''  Final question, folks.
4884.44 4885.44 '''E:'''  A fun one here.
4885.44 4886.44 '''E:'''  To the movies we go.
4886.44 4896.36 '''E:'''  It could mean that that point in time inherently contains some sort of cosmic significance, almost as if it were the temporal junction point for the entire space-time continuum.
4896.36 4899.20 '''E:'''  On the other hand, it could just be an amazing coincidence.
4899.20 4909.24 '''E:'''  Jeff Goldblum as David Levinson from Independence Day, Brad Pitt as Jeffrey Goines from 12 Monkeys, or Christopher Lloyd as Dr. Emmett Brown from Back to the Future.
4909.24 4910.24 '''E:'''  Steve.
4910.24 4914.00 '''S:'''  I mean, it sounds like Emmett Brown, but I don't remember it.
4914.00 4915.36 '''S:'''  But I'll say Emmett Brown.
4915.36 4916.36 '''E:'''  Okay, George.
4916.36 4917.88 '''GH:'''  Marty, it's Emmett Brown.
4917.88 4918.88 '''GH:'''  It's me.
4918.88 4919.88 '''GH:'''  100%.
4919.88 4922.72 '''C:'''  I think it's Doc Brown, too, but I'm going to go out on a limb.
4922.72 4928.08 '''C:'''  I don't think it's from Independence Day because there was no time continuum there, but maybe it was from 12 Monkeys.
4928.08 4929.08 '''C:'''  Maybe that was Brad Pitt.
4929.08 4930.08 '''E:'''  Okay, Bob.
4930.08 4931.08 '''B:'''  Brad Pitt.
4931.08 4932.08 '''E:'''  And Jay.
4932.08 4933.08 '''J:'''  Christopher Lloyd.
4933.08 4934.80 '''E:'''  It is Christopher Lloyd, Dr. Emmett Brown.
4934.80 4937.96 '''C:'''  Yeah, it sounded like Doc Brown for sure.
4937.96 4943.80 '''GH:'''  Have you seen Rick and Morty with Christopher Lloyd as the live- Oh, that's great.
4943.80 4945.76 '''GH:'''  No, it's five seconds long.
4945.76 4946.76 '''GH:'''  It's just a little teaser.
4946.76 4947.76 '''GH:'''  Christopher Lloyd as-
4947.76 4948.76 '''S:'''  It's amazing.
4948.76 4950.76 '''E:'''  I love him. That's the Mystery Quotes segment.
4950.76 4951.76 '''S:'''  I hope you enjoyed it.
4951.76 4952.76 '''E:'''  I hope you all scored well at home.
4952.76 4953.76 '''E:'''  That was fun, Emmett.
4953.76 4954.76 '''E:'''  Thanks, Evan.


== Science or Fiction <small>(1:22:35)</small> ==


4954.76 4959.76 '''C:'''  It's time for Science or Fiction.
4959.76 4969.36 '''S:'''  All right, we have just enough time for a quick Science or Fiction.
4969.36 4970.36 '''S:'''  We're going to have to move quickly here.
4970.36 4971.36 '''S:'''  Okay, Fiction.
4971.36 4972.36 '''S:'''  This is a theme.
4972.36 4973.36 '''S:'''  The theme is fruit.
4973.36 4975.40 '''S:'''  The theme is fruit.
4975.40 4976.40 '''S:'''  It's all unusual fruit.
4976.40 4977.40 '''S:'''  Help, help.
4977.40 4978.40 '''S:'''  I need fruit.
4978.40 4979.40 '''S:'''  All right, here we go.
4979.40 4990.12 '''S:'''  The Jabuticaba berries native to Brazil are the size of plums but taste like grapes and grow directly on the trunk of the Jabuticaba tree.
4990.12 5000.80 '''S:'''  The pawpaw is a sought-after tropical fruit relative native to the eastern United States with flowers that smell like rotting flesh and fruit that contains a high concentration of neurotoxin.
5000.80 5010.44 '''S:'''  Number three, the yuzu is an Asian tree fruit that is the largest culinary fruit in the world with long tubular fruit weighing over 80 pounds.
5010.44 5022.84 '''S:'''  Now I had to throw in culinary there because pumpkins are technically fruit, and pumpkins are the largest true fruit in the world, but we think of them more as a culinary vegetable.
5022.84 5024.96 '''S:'''  This is something that is a culinary fruit.
5024.96 5026.76 '''S:'''  I thought it was a culinary fruit.
5026.76 5027.76 '''S:'''  Culinary whatever.
5027.76 5030.76 '''S:'''  All right, so Bob, go first.
5030.76 5035.56 '''E:'''  Hold on, I'm going to put the link in chat.
5035.56 5044.48 '''S:'''  So Ian will put a link in the chat to a survey where while we're giving our answers, the rogues are giving their answers, you can vote for the one that you think is.
5044.48 5047.84 '''S:'''  We'll check in with you when the rogues are done and then we'll do the reveal.
5047.84 5048.84 '''S:'''  Go ahead, Bob.
5048.84 5052.20 '''B:'''  All right, I'll say the yuzu is the fiction.
5052.20 5053.20 '''S:'''  That's it?
5053.20 5054.20 '''S:'''  Yeah.
5054.20 5055.20 '''S:'''  I got to be quick, right?
5055.20 5056.20 '''S:'''  Okay, you've got 10 minutes.
5056.20 5057.20 '''S:'''  Not that great.
5057.20 5058.20 '''S:'''  All right, so let's see.
5058.20 5059.20 '''S:'''  That's fine.
5059.20 5060.20 '''S:'''  You're good.
5060.20 5061.20 '''S:'''  But Jay's down.
5061.20 5062.20 '''S:'''  Jay.
5062.20 5065.20 '''J:'''  All right, the Jabuticaba berries.
5065.20 5066.20 '''J:'''  Jabuticaba?
5066.20 5070.92 '''C:'''  Jay, for the next five minutes, I just want you to say them.
5070.92 5074.68 '''J:'''  I believe that one is science.
5074.68 5076.40 '''J:'''  The size of plums, tastes like grapes.
5076.40 5078.32 '''J:'''  Yep, that one is science to me.
5078.32 5083.72 '''J:'''  The pawpaw, I mean, I've heard about some type of thing that smelled like rotting flesh, some type of plant.
5083.72 5089.04 '''J:'''  I don't know if it's this one, but that is a little bit of memory there.
5089.04 5091.80 '''J:'''  So I think that one is real.
5091.80 5094.48 '''J:'''  The yuzu is an Asian tree fruit that is a large.
5094.48 5098.16 '''J:'''  It's basically an 80-pound piece of tubular fruit.
5098.16 5099.28 '''J:'''  I don't believe that's real.
5099.28 5100.76 '''J:'''  I think I would have heard of it.
5100.76 5101.76 '''J:'''  That one is fiction.
5101.76 5102.76 '''S:'''  Okay, George.
5102.76 5110.40 '''GH:'''  I think the Jabuticaba is the fiction, even though you said it with such authority, Steve.
5110.40 5115.00 '''GH:'''  It felt like you practiced it a little bit too much to really make it sound like it's a real thing.
5115.00 5120.92 '''GH:'''  So I'm guessing, because it was too well done, I'm going to say that number one is the fiction.
5120.92 5121.92 '''S:'''  All right.
5121.92 5122.92 '''S:'''  Interesting logic.
5122.92 5123.92 '''S:'''  Cara?
5123.92 5129.20 '''C:'''  I have to say the yuzu is the fiction because I've eaten yuzu before, and it looks like a lemon kind of.
5129.20 5133.48 '''C:'''  I don't know what this crazy 80-pound yuzu is.
5133.48 5134.48 '''C:'''  I got to go with you.
5134.48 5136.20 '''C:'''  I mean, don't you guys... I don't get it.
5136.20 5137.20 '''C:'''  Is this an LA thing?
5137.20 5138.20 '''C:'''  You guys have never had yuzu?
5138.20 5139.20 '''C:'''  No, I've never even heard of it.
5139.20 5140.20 '''C:'''  No?
5140.20 5141.20 '''E:'''  How are you?
5141.20 5142.20 '''E:'''  It sounds like a website.
5142.20 5145.12 '''C:'''  No, it's on every Asian dessert menu.
5145.12 5146.68 '''S:'''  You should have your last date.
5146.68 5149.04 '''S:'''  Cara, I haven't been out of the house in two years.
5149.04 5151.04 '''S:'''  I don't even know what's happening right now.
5151.04 5153.72 '''C:'''  You got to eat more Asian food.
5153.72 5154.72 '''E:'''  Evan?
5154.72 5157.24 '''E:'''  I'll go with Cara and Bob.
5157.24 5158.24 '''E:'''  Yuzu, fiction.
5158.24 5160.92 '''C:'''  Maybe there's a different variant that I don't know about.
5160.92 5165.16 '''E:'''  Even without Cara's help, I think I would have chosen that as the fiction.
5165.16 5166.16 '''S:'''  Okay.
5166.16 5169.48 '''S:'''  Ian, do we have a consensus from the listeners?
5169.48 5170.48 '''B:'''  Relatively.
5170.48 5172.60 '''B:'''  Here it comes.
5172.60 5173.60 '''C:'''  Seems like it's all over the place now.
5173.60 5174.60 '''C:'''  Oh, wow.
5174.60 5175.60 '''E:'''  Yeah.
5175.60 5176.60 '''E:'''  Pawpaw.
5176.60 5177.60 '''E:'''  Oh, you can make pies out of this fruit.
5177.60 5178.60 '''S:'''  Look at that.
5178.60 5179.60 '''S:'''  And then yuzu and the Jabuticaba berries.
5179.60 5180.60 '''S:'''  All right.
5180.60 5187.72 '''S:'''  So the winner as the fiction is the pawpaw among the listeners.
5187.72 5189.52 '''S:'''  So let's take these in order.
5189.52 5191.20 '''C:'''  Why would that be sought after?
5191.20 5192.20 '''C:'''  I guess for medicine?
5192.20 5199.20 '''S:'''  Jabuticaba berries, native to Brazil, are the size of plums but taste like grapes and grow directly on the trunk of the Jabuticaba tree.
5199.20 5203.04 '''S:'''  That's weird, actually, now that I think about it.
5203.04 5208.52 '''S:'''  George and 27% of the listeners, 25%, think this one is the fiction.
5208.52 5210.88 '''S:'''  I'll answer it with a picture.
5210.88 5211.88 '''S:'''  Oh, geez.
5211.88 5212.88 '''S:'''  Yep.
5212.88 5213.88 '''S:'''  Look at that.
5213.88 5214.88 '''S:'''  Is that a basing?
5214.88 5215.88 '''S:'''  Where is it?
5215.88 5216.88 '''S:'''  Where is it?
5216.88 5217.88 '''S:'''  Now, George, on the trunk.
5217.88 5218.88 '''S:'''  Wow.
5218.88 5219.88 '''S:'''  Excuse me, George.
5219.88 5220.88 '''S:'''  If you were a frequent Reddit user, you would have gotten this correct.
5220.88 5221.88 '''S:'''  Is that where I saw it, James?
5221.88 5222.88 '''S:'''  Yes, it is.
5222.88 5223.88 '''S:'''  That's cool.
5223.88 5224.88 '''S:'''  Yeah, they're right on the trunk and the major branches.
5224.88 5225.88 '''S:'''  Some of the pictures are just amazing.
5225.88 5226.88 '''S:'''  It's so weird.
5226.88 5227.88 '''S:'''  And they're basically like grapes the size of a plum.
5227.88 5228.88 '''S:'''  And they taste like grapes?
5228.88 5229.88 '''S:'''  Yeah.
5229.88 5230.88 '''S:'''  Can you eat them?
5230.88 5231.88 '''S:'''  On the trunk.
5231.88 5232.88 '''S:'''  Yes, yes.
5232.88 5233.88 '''S:'''  They're supposed to be delicious.
5233.88 5236.88 '''E:'''  Are there other fruits that grow on trunks of trees?
5236.88 5238.88 '''S:'''  I've never seen a tree before that I want to lick.
5238.88 5239.88 '''S:'''  Right?
5239.88 5240.88 '''S:'''  Interesting.
5240.88 5241.88 '''S:'''  You just go up to the trunk and pick them off.
5241.88 5242.88 '''S:'''  I know.
5242.88 5243.88 '''S:'''  Just chew on them.
5243.88 5244.88 '''S:'''  They don't have stems or anything.
5244.88 5245.88 '''S:'''  Just chew on the tree and start chewing.
5245.88 5246.88 '''S:'''  I've never heard of that before.
5246.88 5247.88 '''C:'''  All right.
5247.88 5248.88 '''C:'''  Oops.
5248.88 5249.88 '''S:'''  Going the wrong way.
5249.88 5250.88 '''S:'''  Tastes like garbage, though.
5250.88 5251.88 '''S:'''  All right.
5251.88 5252.88 '''S:'''  Here we go.
5252.88 5258.40 '''S:'''  The pawpaws are sought after tropical fruit relative native to the eastern United States with flowers that smell like rotting flesh and fruit that contains a high concentration of neurotoxin.
5258.40 5264.72 '''S:'''  None of the rogues thought that was the fiction, but the majority of the listeners think this one is the fiction.
5264.72 5265.72 '''S:'''  Yeah.
5265.72 5266.72 '''S:'''  Interesting.
5266.72 5270.92 '''S:'''  The next one is science.
5270.92 5271.92 '''S:'''  There is a pawpaw.
5271.92 5272.92 '''S:'''  Wow.
5272.92 5273.92 '''S:'''  Ew.
5273.92 5274.92 '''S:'''  Apparently, it's delicious.
5274.92 5275.92 '''S:'''  It looks like a mitochondrion.
5275.92 5281.88 '''S:'''  I actually have a pawpaw tree growing in my backyard.
5281.88 5282.88 '''S:'''  You do?
5282.88 5285.20 '''S:'''  You can see it if you want to, but it's never fruited.
5285.20 5287.80 '''S:'''  It's just not old enough yet to fruit.
5287.80 5289.24 '''S:'''  It is on Instagram.
5289.24 5294.84 '''S:'''  I bought it because it's like a native Connecticut fruit tree, fruit bearing tree, but it has a neurotoxin that can kill you.
5294.84 5298.68 '''C:'''  Yeah, why would you eat something that has a neurotoxin in it?
5298.68 5306.48 '''S:'''  After I purchased it and I started to research it, I'm like, oh, the flowers are pollinated by flies.
5306.48 5311.12 '''S:'''  They replicate this odor of rotting flesh in order to attract flies.
5311.12 5313.40 '''S:'''  No, to attract the flies.
5313.40 5317.78 '''S:'''  If you have a lot of the trees, apparently, then they attract flies well.
5317.78 5324.60 '''S:'''  If you have only one or two, they actually suggest that you can hang rotting meat on the tree in order to help attract flies.
5324.60 5325.60 '''S:'''  Oh my God.
5325.60 5331.56 '''S:'''  Although many references say, nah, you're probably not going to want to do that, so you can just hand pollinate it.
5331.56 5333.88 '''S:'''  If you have one tree, just hand pollinate it.
5333.88 5335.28 '''S:'''  Steve, does it have to be human flesh?
5335.28 5338.88 '''S:'''  You don't have to hang roadkill on your pawpaw tree.
5338.88 5349.28 '''S:'''  Yes, it has what one reference calls a high concentration of a neurotoxin that affects neurons in your brain.
5349.28 5356.04 '''S:'''  The concern is that if you have frequently consumed the pawpaw fruit over years, that it may actually cause brain damage.
5356.04 5358.44 '''S:'''  It may actually cause toxicity.
5358.44 5360.20 '''S:'''  There is no fruit good enough that would want...
5360.20 5364.48 '''S:'''  But this is considered a delicacy.
5364.48 5365.88 '''S:'''  It is a commercial fruit.
5365.88 5369.88 '''S:'''  It is highly sought after by foodies, by people who are aware of it.
5369.88 5370.88 '''S:'''  Like blowfish.
5370.88 5373.16 '''S:'''  Yeah, and it is a tropical relative.
5373.16 5374.16 '''S:'''  It's like a tropical fruit.
5374.16 5377.12 '''S:'''  It's like a guava or something like that.
5377.12 5378.52 '''S:'''  I haven't tasted one yet.
5378.52 5380.76 '''S:'''  I kind of got reluctant when I read about the neurotoxin.
5380.76 5381.76 '''S:'''  Yeah, yeah.
5381.76 5385.28 '''S:'''  Steve, somebody in the chat said they taste like a custardy banana.
5385.28 5386.28 '''S:'''  A custardy banana?
5386.28 5388.40 '''S:'''  That sounds pretty good.
5388.40 5392.44 '''S:'''  So all of this means that the...
5392.44 5394.60 '''C:'''  And apologies guys on this one.
5394.60 5395.60 '''C:'''  I thought I was last.
5395.60 5396.60 '''C:'''  It's all right, Cara.
5396.60 5397.60 '''C:'''  That's okay, Cara.
5397.60 5398.60 '''C:'''  No problem.
5398.60 5399.60 '''C:'''  I would not have said that if I...
5399.60 5400.60 '''C:'''  Cara, I screwed up recently.
5400.60 5401.60 '''S:'''  Don't worry about it.
5401.60 5402.60 '''S:'''  We all make mistakes.
5402.60 5403.60 '''S:'''  Yeah, yeah.
5403.60 5409.52 '''S:'''  The yuzu is an Asian fruit that is the longest culinary fruit in the world, with long tubular fruit weighing over 80 pounds.
5409.52 5411.52 '''S:'''  I actually meant for you to go last.
5411.52 5417.72 '''S:'''  I just forgot because I was worried that this was a California thing because we have... I've never heard of or seen this fruit.
5417.72 5419.40 '''S:'''  Never even heard of it before.
5419.40 5423.92 '''C:'''  I think it's just that in LA, we have so many restaurants and so much food from the world.
5423.92 5424.92 '''C:'''  It's a Japanese lemon.
5424.92 5425.92 '''E:'''  The Asian influence on the West Coast is no restaurants on the East Coast.
5425.92 5426.92 '''S:'''  It is a citrus.
5426.92 5427.92 '''S:'''  It is citrus.
5427.92 5428.92 '''S:'''  It is Japanese.
5428.92 5429.92 '''S:'''  It's citrus.
5429.92 5430.92 '''S:'''  Yeah.
5430.92 5433.80 '''S:'''  So there it is on the left picture.
5433.80 5436.28 '''S:'''  That's the actual yuzu.
5436.28 5437.88 '''S:'''  Now on the right is what?
5437.88 5439.28 '''S:'''  Who recognizes that picture?
5439.28 5443.84 '''S:'''  On the right is the actual- That's the- What's that?
5443.84 5444.84 '''S:'''  The actual largest fruit in the world.
5444.84 5445.84 '''S:'''  Yeah, I have cancer.
5445.84 5446.84 '''S:'''  Durian?
5446.84 5447.84 '''GH:'''  Bigger than a watermelon.
5447.84 5448.84 '''GH:'''  Durian.
5448.84 5449.84 '''S:'''  Is that what it is?
5449.84 5450.84 '''E:'''  Durian?
5450.84 5451.84 '''E:'''  Jackfruit.
5451.84 5452.84 '''E:'''  Jackfruit.
5452.84 5453.84 '''S:'''  Yeah, yeah.
5453.84 5454.84 '''S:'''  We ate it.
5454.84 5455.84 '''S:'''  We ate it.
5455.84 5456.84 '''S:'''  It's like shredded chicken.
5456.84 5457.84 '''S:'''  Is that culinary, Steve?
5457.84 5458.84 '''S:'''  What's happening?
5458.84 5459.84 '''C:'''  Jackfruit, it grows on trees.
5459.84 5460.84 '''C:'''  That thing's hanging from a tree.
5460.84 5461.84 '''S:'''  It's huge.
5461.84 5462.84 '''S:'''  Yeah.
5462.84 5463.84 '''S:'''  It's like a watermelon.
5463.84 5464.84 '''S:'''  Yeah, it's like a vegan substitution.
5464.84 5465.84 '''S:'''  They weigh 80 pounds.
5465.84 5466.84 '''E:'''  You can make chili with it and stuff.
5466.84 5467.84 '''E:'''  That's the actual largest fruit.
5467.84 5468.84 '''C:'''  Wow.
5468.84 5471.44 '''C:'''  I think Trader Joe's sells jackfruit sloppy joes.
5471.44 5477.36 '''GH:'''  I literally have cans of Trader Joe's jackfruit in my- Those are big cans, George.
5477.36 5478.36 '''E:'''  Big cans.
5478.36 5479.36 '''E:'''  Yeah.
5479.36 5480.36 '''GH:'''  The biggest cans in the world.
5480.36 5481.36 '''GH:'''  No, it's cool.
5481.36 5483.48 '''GH:'''  You cook it and it just shreds just like chicken.
5483.48 5486.84 '''GH:'''  If you spice it right, it's almost, almost- You can't tell it's not meat.
5486.84 5487.84 '''GH:'''  It's pretty much-
5487.84 5489.84 '''S:'''  Yeah, yeah. It's a fruit.
5489.84 5490.84 '''S:'''  It's kind of a meat substitute.
5490.84 5491.84 '''S:'''  I should have got that right, but I didn't.
5491.84 5492.84 '''S:'''  There you go.
5492.84 5493.84 '''S:'''  Jackfruit.


== Skeptical Quote of the Week <small>(1:31:32)</small> ==


<blockquote> Trust in science has a critical role to play with respect to increasing public support for science funding, enhancing science education and separating trustworthy from untrustworthy sources. However, trust in science does not fix all evils and can create susceptibility to pseudoscience if trusting means not being critical.<br>– {{w|Dolores Albarracín}}, director of the Science of Science Communication Division and the [https://www.asc.upenn.edu/research/centers/social-action-lab Social Action Lab] at the University of Pennsylvania's Annenberg Public Policy Center.</blockquote>
5493.84 5494.84 '''S:'''  Evan, you have one minute to give us a quote.
5494.84 5495.84 '''E:'''  All right.
5495.84 5498.60 '''E:'''  Here's the actual quote to wrap up the show tonight.
5498.60 5510.60 '''E:'''  Trust in science has a critical role to play with respect to increasing public support for science funding, enhancing science education, and separating trustworthy from untrustworthy sources.
5510.60 5521.40 '''E:'''  However, trust in science does not fix all evils and it can create susceptibility to pseudoscience if trusting means not being critical.
5521.40 5525.32 '''E:'''  Said by Dolores Albarracín, I hope I pronounced that correctly.
5525.32 5533.84 '''E:'''  She is the director for the Science of Science Communication Division at the University of Pennsylvania's Annenberg Public Policy Center.
5533.84 5536.80 '''E:'''  I love that there is a science of science communications.
5536.80 5537.80 '''E:'''  Exactly.
5537.80 5538.80 '''S:'''  See, George gets it.
5538.80 5539.80 '''S:'''  Yes.
5539.80 5540.80 '''S:'''  All right.
5540.80 5541.80 '''S:'''  Thank you.
5541.80 5544.80 '''S:'''  Yeah, so is that one of the authors of the study?
5544.80 5545.80 '''S:'''  Yes, right.
5545.80 5550.64 '''E:'''  This is one of the co-authors of the study that we referred to earlier when we were talking
5550.64 5552.64 '''S:'''  about the news item. Yes.
5552.64 5553.64 '''S:'''  Absolutely.
5553.64 5554.64 '''S:'''  Let's have her on the show.
5554.64 5556.24 '''S:'''  So there is a science of science communication.
5556.24 5559.28 '''S:'''  We follow it very closely because it's what we do.
5559.28 5569.72 '''S:'''  And I guess this is self-serving and I try to be skeptical of this, but the fact is we've been saying this for 30 years that you can't just teach science.
5569.72 5570.72 '''S:'''  You have to teach about pseudoscience.
5570.72 5571.72 '''S:'''  You have to teach critical thinking.
5571.72 5574.24 '''S:'''  You have to teach about the mechanisms of self-deception.
5574.24 5576.64 '''S:'''  You've got to teach all of that.
5576.64 5581.88 '''S:'''  And over the last 30 years, the research has shown that we were absolutely correct.
5581.88 5588.08 '''S:'''  The scientific skepticism approach to science communication is far and away the most evidence-based.
5588.08 5595.36 '''S:'''  And it's been very satisfying to follow this research over the last 30 years and go, yeah, this is what we've been saying all along.
5595.36 5600.20 '''S:'''  Obviously, it's filling in a lot of the details and informing what we do.
5600.20 5601.60 '''S:'''  And there's a lot of nuance here.
5601.60 5603.32 '''S:'''  There's a lot of details that we didn't know obviously.
5603.32 5610.20 '''S:'''  But just the big picture of you've got to teach about pseudoscience is absolutely evidence-based and correct.


== Signoff/Announcements <small>(1:33:30)</small> ==  
<!-- ** if the signoff/announcements don't immediately follow the QoW or if the QoW comments take a few minutes, it would be appropriate to include a timestamp for when this part starts -->
5610.20 5611.20 '''S:'''  All right.
5611.20 5612.20 '''S:'''  Thank you, George.
5612.20 5613.20 '''S:'''  Thank you, Ian, for running the tech.
5613.20 5616.84 '''S:'''  Buy the book, as it says on the screen there, buy the Skeptics Guide to the Universe book.
5616.84 5618.40 '''S:'''  Give it as a gift to somebody.
5618.40 5619.96 '''S:'''  Thank you, Cara, for joining us from Cal Poly.
5619.96 5620.96 '''S:'''  Thank you, Cara.
5620.96 5622.92 '''S:'''  Thank you, Derek, for inviting us to DragonCon again.
5622.92 5624.22 '''S:'''  Sorry we couldn't be there live.
5624.22 5626.44 '''S:'''  We hope that you guys enjoyed our stream.
5626.44 5628.16 '''S:'''  Enjoy the rest of DragonCon.
5628.16 5629.16 '''S:'''  Stay skeptical.


'''S:''' —and until next week, this is your {{SGU}}. <!-- typically this is the last thing before the Outro -->  




73.00 82.00 '''S:'''  Yeah, I saw a video of somebody walking out from being indoors and it was, I guess, dusk and the sky was literally blood red.
82.00 83.00 '''B:'''  Oh yeah.
83.00 90.00 '''C:'''  Yeah, you see that periodically at the worst of it. It's so scary. But you guys saw all these videos of New York City, right?
90.00 91.00 '''J:'''  Yes.
91.00 92.00 '''B:'''  My God. Subways.
92.00 94.00 '''B:'''  People drowning in their basement.
94.00 95.00 '''S:'''  Philly as well.
95.00 96.00 '''S:'''  Yeah.
96.00 97.00 '''S:'''  Philly was bad.
97.00 98.00 '''S:'''  Nuts.
98.00 99.00 '''S:'''  Horrible.
99.00 100.00 '''S:'''  We're calling it a one-in-500-year rain event.
100.00 103.00 '''S:'''  Yeah, but that's BS because this is the new norm, man.
103.00 108.00 '''B:'''  Well, still though, even with the new norm, I don't think we'll be seeing this at least for another 10 years.
108.00 110.00 '''B:'''  It probably won't be 500 years.
110.00 111.00 '''S:'''  No.
111.00 113.00 '''E:'''  I guarantee you that.
113.00 114.00 '''E:'''  It's no longer biblical.
114.00 120.00 '''S:'''  So we are recording on a Saturday for a couple of reasons. One is we just like to do it, right?
120.00 121.00 '''S:'''  We like to play the podcast.
121.00 122.00 '''S:'''  Why not?
122.00 133.00 '''S:'''  We haven't done a live video SGU episode in a while. But the primary reason is that we were supposed to all be at Dragon Con this weekend.
133.00 147.00 '''S:'''  And we decided that we didn't want to be moving through massive crowds in the middle of the worst spike of the Delta variant of COVID that we've had so far.
147.00 148.00 '''S:'''  Yeah, call us crazy.
148.00 150.00 '''S:'''  And in a place in the country where it's particularly bad.
150.00 154.00 '''S:'''  So, unfortunately, the pandemic was just too bad.
154.00 158.00 '''S:'''  We could not in good conscience go to Atlanta and do it.
158.00 165.00 '''S:'''  I wish everyone there well, but we just couldn't do it. We have too many people in our immediate family who are vulnerable.
165.00 170.00 '''S:'''  I work at a hospital. It just doesn't work. So we couldn't go.
170.00 180.00 '''S:'''  But we will be streaming a portion of this show to Dragon Con. But we're going to do the first half of the show first.
180.00 181.00 '''S:'''  That's good.
181.00 183.00 '''S:'''  Doing the first half of the show first.
183.00 189.00 '''S:'''  I thought about doing the second half first, and then I thought it would be more natural if we did the first half.
189.00 193.00 '''E:'''  It just makes more sense to do the first half first. It's not a Quentin Tarantino movie.
193.00 200.00 '''GH:'''  Guys, I've got to say, I saw the coolest, I think one of the all-time most original Dragon Con costumes someone posted.
200.00 204.00 '''GH:'''  And it occurred to me that this costume would have made no sense a year ago.
204.00 212.00 '''GH:'''  It was the Delta variant. So it was a Loki, a female Loki dressed as a Delta stewardess.
212.00 213.00 '''S:'''  Delta variant.
213.00 216.00 '''GH:'''  It was the Delta variant. Is that not brilliant?
216.00 217.00 '''S:'''  Very clever.
217.00 218.00 '''S:'''  It was so clever.
218.00 219.00 '''S:'''  That is so meta.
219.00 223.00 '''GH:'''  And if we had seen that literally, whatever, 12, 15 months ago, you'd have been like, what?
223.00 230.00 '''GH:'''  Because Loki wasn't out yet. COVID wasn't happening. And there was no Delta variant. So now it's just insanely funny.
230.00 237.00 '''S:'''  I'll never forget the Dragon Con where we're doing a live show. We always wear costumes when we're doing the live show at Dragon Con.
237.00 243.00 '''S:'''  But Bob actually thought he was going to do the entire show wearing a full face mask.
243.00 244.00 '''S:'''  Yeah, a rubber pirate mask.
244.00 246.00 '''B:'''  And Steve signed off on it.
246.00 253.00 '''B:'''  And it was an awesome costume, well worth the muffled hour and a half of talking.
253.00 259.00 '''S:'''  That's an excellent demonstration of memory distortion over time.
259.00 261.00 '''B:'''  Yes. Thank you. Thank you.
261.00 269.00 '''C:'''  So how is everybody processing the fact that we're three months away from, what, 2022?
269.00 270.00 '''C:'''  Four months. Yeah.
270.00 274.00 '''C:'''  And it feels like we're still just reckoning with 2019.
274.00 276.00 '''C:'''  Yeah. I know.
276.00 282.00 '''C:'''  So much time has been both dragging on and also completely lost.
282.00 293.00 '''C:'''  It really doesn't feel possible that 2019 will have been, quote, three years. I mean, I know it's not exactly, but like three years ago, the beginning of 2019.
293.00 294.00 '''S:'''  Right.
294.00 301.00 '''S:'''  It is weird. It's disorienting because we've fallen out of our normal life patterns enough.
301.00 306.00 '''S:'''  Our sense of time is completely, it's complete nonsense in our heads as it is.
306.00 310.00 '''S:'''  But you do anything to disrupt the apple cart and you get disoriented. It really is disorienting.
310.00 314.00 '''S:'''  Yeah. But to clarify, I mean, the pandemic really started at the beginning of 2020.
314.00 319.00 '''S:'''  It's been a year and a half. The end of this year will be two full years, really, of the pandemic.
319.00 321.00 '''S:'''  And we're definitely going to still be in the middle of this.
321.00 325.00 '''S:'''  I mean, again, I remember when this started, we had no idea how long it was going to be.
325.00 326.00 '''S:'''  We were talking weeks.
326.00 327.00 '''S:'''  Yeah.
327.00 328.00 '''S:'''  Then weeks became months.
328.00 331.00 '''S:'''  Then we're like, oh, maybe we'll have a vaccine by the end of the year.
331.00 340.00 '''S:'''  The idea now that we're still on the upswing of one of the worst waves of this pandemic more than a year and a half into it.
340.00 343.00 '''S:'''  And honestly- Didn't have to be that way, though.
343.00 356.00 '''S:'''  We still cannot tell how long this is going to drag on because another variant could emerge, could reset the clock on having to get vaccinated, would have to get a booster, et cetera.
356.00 363.00 '''GH:'''  But Steve, at least everyone's agreed to kind of move forward together and do all the steps necessary to try to tamp it.
363.00 364.00 '''GH:'''  Everybody.
364.00 366.00 '''GH:'''  At least as a country, we're all together in this.
366.00 368.00 '''GH:'''  Yeah, we're all in the same boat.
368.00 370.00 '''GH:'''  Unified, doing this. At least there's that.
370.00 372.00 '''GH:'''  Oh, wait, no. The total-
372.00 374.00 '''S:'''  Yeah, what a sense of community.
374.00 384.00 '''GH:'''  What's freaking me out is the fact that we are as far from 1982 as 1982 was from the start of World War II.
384.00 386.00 '''S:'''  It's 40 years.
386.00 397.00 '''GH:'''  Because in high school, or grade school, whatever, high school, to be thinking of the 40s, it was such a foreign past, such an old- Ancient history.
397.00 401.00 '''GH:'''  Ancient, ancient history. And now that is what my high school years were.
401.00 402.00 '''C:'''  Yeah.
402.00 406.00 '''C:'''  Well, because that's my mom, right? My mom was born in the 40s.
406.00 410.00 '''C:'''  But I am the age of many moms.
410.00 415.00 '''C:'''  And that's something that's hard to grapple with, I think. I was born in 1983.
415.00 418.00 '''C:'''  So one year apart from what you're referencing.
418.00 419.00 '''C:'''  Yeah.

== COVID-19 Update <small>(6:59)</small> ==

419.00 423.00 '''S:'''  So a little bit of good news. We're talking about the pandemic and everything.
423.00 434.00 '''S:'''  A study came out just a couple days ago where they looked at 6.8 million people receiving one of the two mRNA vaccines.
434.00 440.00 '''S:'''  And they found essentially no serious side effects. No serious side effects.
440.00 445.00 '''S:'''  6.8 million doses of either of the mRNA vaccines.
445.00 453.00 '''S:'''  Now there were non-serious side effects. The big one probably is the pericarditis.
453.00 456.00 '''S:'''  So it can cause a little bit of inflammation around the heart.
456.00 465.00 '''S:'''  But these were mild cases. Most of the people who had the symptoms were observed in the hospital for one day and then sent home.
465.00 470.00 '''S:'''  No long-term or serious consequences. So it wasn't considered a serious adverse event.
470.00 479.00 '''S:'''  But it was an adverse event. But again, still, they said there were 32 total cases, again, out of 6.8 million doses.
479.00 480.00 '''S:'''  Wow.
480.00 481.00 '''S:'''  You know, this is like-
481.00 485.00 '''GH:'''  So now when you say no serious side effects, what are the serious side effects?
485.00 490.00 '''S:'''  None. So there's no serious side effects. What about the serious side effects?
490.00 492.00 '''S:'''  So the risk was- No, there's none.
492.00 499.00 '''S:'''  If we can do the math, the denominator was 6.8 million, but it's zero. Yeah, there was zero. Serious side effects.
499.00 505.00 '''B:'''  Yes, but I think we would see serious side effects if we gave it to a trillion people.
505.00 507.00 '''B:'''  I think we would get one serious side effect.
507.00 508.00 '''B:'''  That's true.
508.00 509.00 '''B:'''  That's just my theory.
509.00 525.00 '''S:'''  Yeah, so the thing is we're at winning-the-lottery level of statistics at this point, where you could start to make statements like, yeah, you're more likely to die on the way to work in the car or get hit by lightning or die from a coconut falling out of a tree. You know what I mean?
525.00 526.00 '''S:'''  Way more, yeah.
526.00 533.00 '''S:'''  Yeah, it's so unlikely that statistically it's just not worth worrying about.
533.00 538.00 '''B:'''  If you're afraid of the vaccine, then you shouldn't be taking baths or playing with your dog.
538.00 540.00 '''S:'''  That makes too much sense, Bob.
540.00 548.00 '''S:'''  We don't live in a world where people are trusting authority and expertise and science.
548.00 553.00 '''S:'''  We're going to get to this. Actually, Jay, that's one of the news items that we're going to be talking about.
553.00 555.00 '''S:'''  That's correct. I'm prepping the audience.
555.00 557.00 '''S:'''  You're pre-gaming it a little bit?
557.00 558.00 '''S:'''  Teaser.
558.00 563.00 '''S:'''  The thing is the facts are inarguable.
563.00 568.00 '''S:'''  And again, I remember this time a year ago, they were developing the vaccine.
568.00 579.00 '''S:'''  It's a new technology, the mRNA technology, and I remember we were talking about this fact, and I was thinking, it's like, God, I really hope this vaccine works out. I really hope it works out.
579.00 586.00 '''S:'''  It was perfectly possible that the efficacy of the vaccine was 40% or 50%.
586.00 588.00 '''S:'''  That would not have surprised me.
588.00 593.00 '''S:'''  It would have been like, okay, it's better than nothing, a little disappointing, but that was one of the possible outcomes.
593.00 598.00 '''S:'''  No, it's 95%, like home run efficacy for this vaccine.
598.00 604.00 '''S:'''  It's also possible that there could have been a lot of serious side effects.
604.00 615.00 '''S:'''  When you test it in 40,000 people, that's reassuring, but then you give it to 40 million people, 100 million people, and more side effects emerge.
615.00 629.00 '''S:'''  And we're far enough into this now, and again, this is part of the reason for this new data, is just looking, say, okay, now that we have nine months under our belt with the mRNA vaccines, let's take a look at the data and really see what's going on here.
629.00 632.00 '''S:'''  It's really a home run. It's a home run.
632.00 638.00 '''S:'''  Science came through for us for this pandemic better than we could have hoped for, really.
638.00 644.00 '''S:'''  If you still reject the vaccine at this point, it's because you don't believe the data.
644.00 648.00 '''S:'''  You do not trust authority. You don't trust these numbers.
648.00 650.00 '''S:'''  There's no rational reason not to get vaccinated.
650.00 656.00 '''S:'''  If you look at the risk of dying of COVID versus the risk of the vaccine, it's a no-brainer.
656.00 662.00 '''S:'''  It has to stem from either misinformation or just a blatant lack of trust in the system.
662.00 664.00 '''S:'''  Again, we're going to be getting to that more generally.
664.00 668.00 '''GH:'''  Steve, do you think it's going to end up being a three-shot, like with the booster thing?
668.00 669.00 '''GH:'''  I think so.
669.00 671.00 '''GH:'''  Isn't that more standard than...
671.00 672.00 '''GH:'''  Depends on the vaccine.
672.00 673.00 '''GH:'''  Depends on it. Okay.
673.00 675.00 '''S:'''  Some are one-shot, some are two, some are three.
675.00 678.00 '''C:'''  Yeah, Gardasil's a three-shot now.
678.00 681.00 '''C:'''  It used to be two because now it covers more of the different types.
681.00 683.00 '''C:'''  Flu is annual.
683.00 685.00 '''C:'''  It may be that it becomes an annual thing.
685.00 686.00 '''C:'''  Maybe.
686.00 688.00 '''C:'''  The hope would be because it's such...
688.00 700.00 '''C:'''  Yeah, and because we, as if I had anything to do with it, because the incredible researchers and scientists have managed to figure out how to accelerate...
700.00 703.00 '''C:'''  mRNA vaccines happen really fast.
703.00 709.00 '''C:'''  The hope is that anytime there's a new variant, Delta being the dominant one now, but now there's this, what, is it mu?
709.00 710.00 '''C:'''  This mu variant?
710.00 711.00 '''C:'''  That's a new one that's on the rise.
711.00 712.00 '''E:'''  Yeah.
712.00 714.00 '''E:'''  The CDC just started to warn people about that.
714.00 715.00 '''C:'''  Right.
715.00 724.00 '''C:'''  The hope is that each year, if this does become an annual jab, that each year we're able to sort of track and get out in front of it, just like we try to do with the flu.
724.00 733.00 '''GH:'''  Just when someone argues that, oh, the third booster now means that it's not effective, the argument you can present is that, no, many of these vaccines require multiple...
733.00 734.00 '''GH:'''  Right.
734.00 736.00 '''GH:'''  It doesn't reflect at all its efficacy.
736.00 738.00 '''S:'''  This is how the immune system works.
738.00 747.00 '''C:'''  Yeah, and to be clear, we're trying really hard to make sure that the language is clean, so there's a third dose and there's a booster.
747.00 748.00 '''C:'''  Right.
748.00 751.00 '''C:'''  The third dose is what's available right now to immunocompromised people.
751.00 764.00 '''C:'''  The idea is that because they are so immunocompromised, they needed a three-shot course in order to get to the same immunity that you and I only needed a two-shot course to get to because their own body isn't working as well.
764.00 766.00 '''C:'''  Their own immune system is struggling.
766.00 778.00 '''C:'''  Later, perhaps by the end of September, there have been some signals, but maybe not, and we still don't know if the booster will be reformulated or if it'll just be the same old shot.
778.00 780.00 '''C:'''  That's the booster shot.
780.00 784.00 '''C:'''  The booster is the idea that you get a new jab when you have waning immunity.
784.00 786.00 '''C:'''  The third dose is for immunocompromised people.
786.00 789.00 '''C:'''  The booster shot is for waning immunity.
789.00 790.00 '''C:'''  Cool.
790.00 792.00 '''S:'''  Right, which is typical.
792.00 794.00 '''S:'''  Yeah, totally.
794.00 797.00 '''S:'''  Many vaccines, they don't last for life.
797.00 801.00 '''S:'''  You may need a booster in 10 years or every year or whatever.
801.00 806.00 '''S:'''  It just depends on the nature of the vaccine, the nature of the organism that you're vaccinating against.
806.00 808.00 '''S:'''  The immune system is complicated.
808.00 810.00 '''S:'''  There are different pieces of it.
810.00 814.00 '''S:'''  We just have to test it, follow it, and it's all evidence-based.
814.00 829.00 '''C:'''  Also, Steve, the nature of our fellow country people, if everybody would have gotten vaccinated, we wouldn't see these intense variants taking hold and really cycling through the community as much.
829.00 838.00 '''C:'''  Part of the reason that we very likely are going to have to continue to get these shots is because other people aren't getting these shots.
838.00 840.00 '''S:'''  Of course, no question.

== News Items ==

=== Kilometers-Long Spaceship <small>(14:00)</small> ===

840.00 848.00 S: All right, Bob, you're going to start us off with the news items telling us about China's plans to build a ginormous spaceship.

848.00 858.00 B: Yeah, they recently announced a plan to build or research a kilometer or kilometers long spaceship to be built in low Earth orbit.

858.00 860.00 B: I found that just so amazing.

860.00 864.00 B: To me, that's something you only see in science fiction movies or novels.

864.00 868.00 B: That is gargantuan, a kilometer or two.

868.00 870.00 E: Is it too early to raise questions?

870.00 872.00 E: No, go right ahead.

872.00 877.00 E: You mentioned going into orbit does not make it a space station as opposed to a spaceship.

877.00 879.00 S: Well, they're building it in orbit.

879.00 881.00 B: Yeah, the plan would be to build in orbit.

881.00 884.00 B: Doesn't a ship imply traveling elsewhere other than orbit?

884.00 885.00 B: Yes, Evan, I thought the same thing.

885.00 888.00 B: I thought it was a ship traveling in space.

888.00 893.00 B: But no, the plan is we think that it would be something like a space station.

893.00 895.00 B: It's not 100% clear.

895.00 902.00 B: But the thing though is that this whole thing is about constructing it, researching it, and how would you construct something of that size?

902.00 905.00 B: No matter what you do with it, that's the main focus.

905.00 912.00 B: This, of course, is coming from the China version of NASA, which is the China National Space Agency, CNSA.

912.00 916.00 B: And so I was looking at China recently in terms of their space agency.

916.00 918.00 B: You've got to give these guys credit.

918.00 920.00 B: They've had an amazing 20 years.

920.00 922.00 B: I mean, look what they've accomplished.

922.00 924.00 B: They have astronauts in space.

924.00 928.00 B: They're building their own space station this year, Tiangong.

928.00 933.00 B: They're building it right now and they've got a plan to get this thing.

933.00 937.00 B: It's only going to be about the fifth of the size of the International Space Station.

937.00 940.00 B: But I mean, that's big leagues right there.

940.00 943.00 B: They're developing heavy lift rockets.

943.00 948.00 B: That's something that not many nations have the wherewithal to do.

948.00 950.00 B: They're sending robotic explorers everywhere.

950.00 955.00 B: You remember Chang'e 4 was a lander and rover on the far side of the moon.

955.00 957.00 B: We've never landed on the far side of the moon.

957.00 959.00 B: They also had, get this one, this one surprised me.

959.00 960.00 B: I wasn't aware of this.

960.00 963.00 B: They had a Mars mission that was the first.

963.00 968.00 B: It was the first mission that had an orbiter, a lander, and a rover all in one mission.

968.00 970.00 B: That's something else that has never been done before.

970.00 975.00 B: So these guys are definitely in the big leagues and they're doing an amazing job.

975.00 978.00 B: So it begs the question, what's going to happen in the future?

978.00 982.00 B: What's going on with China and their future with their space agency?

982.00 986.00 B: One of the things that was in the news recently was this five-year plan.

986.00 995.00 B: They're considering proposals that came from the National Natural Science Foundation of China, which is managed by the Ministry of Science and Magic.

995.00 998.00 B: Oh wait, sorry, Ministry of Science and Technology.

998.00 1000.00 B: Sorry about that.

1000.00 1011.00 B:  So one proposal of the ten proposals they got and funded for like 15 million yuan was creating an ultra-large spacecraft spanning kilometers in low Earth orbit.

1011.00 1019.00 B: Kilometers, this says kilometers, but most other people are saying that it's basically a kilometer is what they're investigating.

1019.00 1023.00 B: Now the foundation's website, if you go to the website and if you can read the language, great.

1023.00 1026.00 B: I couldn't, but this is apparently what it says.

1026.00 1036.00 B: Major strategic aerospace equipment for the future use of space resources, exploration of the mysteries of the universe and staying in long term.

1036.00 1038.00 B: So that's how it's described on their website.

1038.00 1043.00 B: So two of the big things they talk about in their plan is that they need to minimize the weight.

1043.00 1050.00 B: They want to research how do you minimize the weight for something of this scale and the fact that this would be constructed on the ground and then assembled in space.

1050.00 1058.00 B: So you would make these big chunks like the International Space Station, build it on the ground and then bring them up to space and kind of put them all together.

1058.00 1062.00 B: Just imagining a kilometer long structure of any kind is pretty daunting.

1062.00 1067.00 B:  So the former NASA chief technologist Mason Peck said, I think it's entirely feasible.

1067.00 1073.00 B: I would describe the problem here not as insurmountable impediments, but rather problems of scale.

1073.00 1079.00 B:  Well, yes, the scale. The scale is really this whole story, the scale of it.

1079.00 1083.00 B: So now I thought it would be helpful to compare this to the ISS.

1083.00 1088.00 B: So to put the International Space Station into orbit, it took 42 assembly flights.

1088.00 1092.00 B: It took, guess how many EVAs it took to do everything?

1092.00 1093.00 S: Hundreds probably.

1093.00 1095.00 B:  232 EVAs.

1095.00 1106.00 B: It cost $150 billion to develop and build and $4 billion a year just to operate, just to operate this, you know, and to do the maintenance and whatever, well, what maintenance they're doing.

1106.00 1110.00 B: I'm not sure how good the maintenance is, it's really showing its age at this point.

1110.00 1113.00 E: And it took the efforts of two major nations plus some, so.

1113.00 1114.00 B: Right, exactly.

1114.00 1125.00 B: This is clearly, yes, initially it was more of a United States thing, but then they brought in, you know, they brought in Russia and lots of other countries, so lots of countries have been contributing.

1125.00 1129.00 B: So the International Space Station is 109 meters.

1129.00 1135.00 B: China's proposal is ten times that, ten times that length, and who knows how massive it's going to be.

1135.00 1138.00 B: It could be 20 times the size when you get down to it.

1138.00 1139.00 B: So, Mike.

1139.00 1140.00 S: What's the point, though?

1140.00 1142.00 S: Why does it need to be that big?

1142.00 1143.00 S: Right, that's.

1143.00 1144.00 C: Do they get into that?

1144.00 1145.00 C: Yeah.

1145.00 1146.00 S: Or is this really, is this like.

1146.00 1147.00 S: Jane, you sound like Mingi.

1147.00 1153.00 S: But it's, look, as cool as it sounds, I mean, it sounds kind of like Jeff Bezos, you know, like.

1153.00 1154.00 S: Who's got a bigger ship.

1154.00 1155.00 S: You know, like.

1155.00 1156.00 S: The one reason.

1156.00 1158.00 S: Why, why make it so long?

1158.00 1163.00 S: The one reason to make it that big is if you're going to use rotation or something for artificial gravity.

1163.00 1167.00 S: So I couldn't find like an artist impression or anything with the design of the ship.

1167.00 1172.00 S: Maybe they're not at that point yet, but are they planning it to be that big because they want artificial gravity?

1172.00 1173.00 S: Was there any mention of that?

1173.00 1174.00 B: No mention at all.

1174.00 1175.00 B: And I don't think that's where they're going.

1175.00 1177.00 B: I don't think we're quite, quite ready for that.

1177.00 1179.00 B: I mean, I don't think we're quite ready for this either.

1179.00 1181.00 B: But it is a little mysterious.

1181.00 1183.00 B: We're not exactly sure what they want to do with it.

1183.00 1185.00 B: The descriptions are a little vague.

1185.00 1188.00 B: But Michael Lembeck, professor of aerospace engineering, he said this.

1188.00 1191.00 B: It's kind of like talking about building the Starship Enterprise.

1191.00 1192.00 B: It's fantastical.

1192.00 1198.00 B: Not feasible and fun to think about, but not very realistic for our level of technology, given the cost.

1198.00 1203.00 B:  So based on a lot of this and what I've been reading, this doesn't seem to be practical at all.

1203.00 1207.00 B: And I think what they're doing there and they're only devoting a little bit.

1207.00 1208.00 B: It's not a lot of money.

1208.00 1211.00 B: Clearly, they're not thinking about creating this soon.

1211.00 1213.00 B: I think they're going to study this.

1213.00 1214.00 B: They're just going to study it.

1214.00 1215.00 B: What is it going to take?

1215.00 1220.00 B:  What is it going to take to build a structure that big in space?

1220.00 1228.00 B: And I think the conclusion that they're going to come to is that this is going to be way too expensive to do, at least in this way.

1228.00 1237.00 B: I think when the time comes in the farther future, when we can build something of this scale, that they're going to probably do things like, how about 3D printing in space?

1237.00 1242.00 B: What you would do is you'd have these compact masses on the ground that you would launch into space.

1242.00 1243.00 B: Still, it would be very expensive.

1243.00 1255.00 B: But then you could take those compact masses and then create these huge, elaborate kilometer long structures using some type of 3D printing, which would be a lot easier than building it and then launching the big chunks from the Earth.

1255.00 1257.00 S: It seems like it would be easier.

1257.00 1263.00 S: But I mean, when you look at any one of those modules that they put on the space station, they're complicated.

1263.00 1267.00 S: The wiring and everything, like a lot of that stuff, I don't know if they could pull it off in space.

1267.00 1271.00 B: Well, the problem, Jay, again, is the vagueness because we're not sure.

1271.00 1274.00 B: A lot of this depends on what exactly are they building.

1274.00 1280.00 B: Are they just building long struts and things or are they going to have lots of modules with people inside them?

1280.00 1281.00 B: It's a completely different story.

1281.00 1290.00 B: If it's going to be people living in all sections of it, then this is going to be far heavier, far more expensive, far more complex.

1290.00 1300.00 B: If it's mainly, if there's a lot of these structures that are just there for support and you don't need to have people living in them, then it's going to be a lot lighter, it's going to be a lot cheaper, it's going to be a lot easier.

1300.00 1308.00 B: So some of these questions can't be answered until we actually know exactly what they're going to do with it, exactly what the design is with more detail, and that will come.

1308.00 1315.00 B: But I do like the idea that they're studying what kind of technologies are needed to create something of this size.

1315.00 1319.00 B:  Because who doesn't want a kilometer-long construct in low Earth orbit?

1319.00 1321.00 E: But isn't that thing going to be a target for space debris?

1321.00 1324.00 E: Have they calculated that into their planning?

1324.00 1326.00 B: That's another problem that they'd have to factor in.

1326.00 1330.00 B: But when you have something of this size, you have to factor in things that you've never had to factor in before.

1330.00 1349.00 B: Because if this thing is a kilometer long and you have ships docking with it, or if you're going to be making any maneuvers in orbit, then you're going to have to, there's going to be like, if you bump this thing with something of the mass of a, say, a space shuttle, this is going to create these, it's going to create motion and little waves that go back and forth in the structure itself.

1349.00 1354.00 B: And so they're going to need dampeners to absorb the energy from the impacts.

1354.00 1366.00 B: Yeah, and if it's in very low Earth orbit, there's going to be drag, so the orbit's going to decay, so you're going to have to have lots of fuel with rockets so that you can get it into a higher orbit.

1366.00 1368.00 B: The ISS does that all the time.

1368.00 1369.00 B: It sounds fantastic.

1369.00 1374.00 B: There are lots of complications, and we just don't have enough information to really see exactly what this detail is.

1374.00 1376.00 GH: How visible would a ship that size be from the ground?

1376.00 1378.00 B:  If it reflected light, oh, you'd see it pretty well.

1378.00 1379.00 E: The ISS is visible.

1379.00 1380.00 GH: Yeah, you'd see the ISS.

1380.00 1381.00 B: This would be ten times bigger.

1381.00 1383.00 GH: I mean, like naked eye or just the telescope?

1383.00 1384.00 B: Absolutely naked eye.

1384.00 1385.00 B: The ISS is naked eye right now.

1385.00 1386.00 B: Yeah, it would be huge.

1386.00 1388.00 S: It would be ten times more naked eye.

1388.00 1390.00 C: It would look like, I mean, if it's going to be long.

1390.00 1392.00 C: What's the comparison in size to the ISS?

1392.00 1394.00 S: It's going to be ten times longer.

1394.00 1395.00 S: It's ten times?

1395.00 1402.00 B: Yeah, a kilometer would be, say, ten times, about a little bit less than ten times the width, just the width, not the mass necessarily.

1402.00 1404.00 B: What about the girth?

1404.00 1406.00 C: And how big was Mir?

1406.00 1408.00 C: Does anybody know off the top of their heads?

1408.00 1409.00 C: Was Mir smaller than the ISS?

1409.00 1410.00 C: Oh, yes.

1410.00 1415.00 B: The ISS is the biggest artificial construct ever created in orbit like that.

1415.00 1417.00 B: Mir was far, far smaller.

1417.00 1418.00 B: Actually, I do know.

1418.00 1419.00 B: I do know.

1419.00 1430.00 B: This would, I think we're talking, Mir was a fifth of the mass, I believe, or the length of the ISS, approximately.

1430.00 1431.00 B: Oh, wow.

1431.00 1433.00 B: And then this is a tenth?

1433.00 1435.00 B: This is ten times the ISS.

1435.00 1437.00 B: Yeah, ten times the width.

1437.00 1440.00 B: A kilometer, anyways, about ten times.

1440.00 1441.00 S: But what's the bottom line?

1441.00 1450.00 S: Is it just like we really don't have the technology to build something that big, or is it just that it would be a massive project that would cost trillions of dollars?

1450.00 1451.00 B: Right.

1451.00 1453.00 B: Yeah, it's technically feasible.

1453.00 1456.00 B: It's not like we have to develop whole new sciences to make this happen.

1456.00 1463.00 B: We could do it, but we'd have to be willing to spend hundreds of billions, maybe, a trillion dollars, maybe not that expensive.

1463.00 1465.00 B: But, I mean, the time and money.

1465.00 1466.00 B: Yeah, they always run out of money.

1466.00 1470.00 B: Steve, I compared, like the ISS, I said there was 42 assembly flights.

1470.00 1473.00 B: This would require, what, 80, 90, 100?

1473.00 1474.00 B: I don't know.

1474.00 1475.00 B: How many EVAs?

1475.00 1477.00 B: Too expensive, and we're not ready.

1477.00 1478.00 B: We're not ready.

1478.00 1479.00 S: We're not there yet.

1479.00 1480.00 S: Yeah, maybe next century, that kind of project.

1480.00 1489.00 S: From what I'm hearing, though, it just sounds more like a thought experiment more than a real proposal because, again, they're not even saying why it needs to be that big.

1489.00 1491.00 S: It just seems kind of like a fantasy.

1491.00 1492.00 B: It's more, a little bit.

1492.00 1494.00 B: It's more than a thought experiment.

1494.00 1500.00 B: They're going to actually try to see what the technology would be needed, how to do it.

1500.00 1503.00 B: With modern technology, how would we construct something that big?

1503.00 1504.00 B: That's kind of what they're looking at.

1504.00 1506.00 S: Okay, let's move on.

Social Media and Kids (25:05)[edit]

1506.00 1511.00 S: Jay, tell us about the effect of social media on children.

1511.00 1512.00 S: No, sir.

1512.00 1517.00 S: As a parent, I've been following news items that come up about this.

1517.00 1523.00 S: A lot of people, friends and family, we talk about it because social media has a bad rap.

1523.00 1530.00 S: It's measurably done some bad things, depending on, I guess, your perspective.

1530.00 1533.00 S: But specifically, what's the deal with social media and kids?

1533.00 1542.00 S: In March of 2021, Facebook announced that they're working on launching a version of Instagram for children 12 years old and younger.

1542.00 1546.00 S: I'm just curious, guys, what's your knee-jerk on hearing that?

1546.00 1547.00 S: No.

1547.00 1548.00 GH: Bad.

1548.00 1549.00 GH: Bad.

1549.00 1550.00 B: What's Instagram?

1550.00 1551.00 B: Why?

1551.00 1552.00 B: I think that the thing is 12 years old and younger.

1552.00 1555.00 C:  I think that the thing is 12-year-olds are already on Instagram.

1555.00 1565.00 C: So if there's an area where there are greater parental controls and there's more restriction, the kids are going to use it anyway.

1565.00 1567.00 C: Why not make a space for them?

1567.00 1568.00 C: I don't know.

1568.00 1569.00 C: I think it's not a bad idea.

1569.00 1573.00 S: That's the logical argument from the outside looking in.

1573.00 1578.00 S: You could make it make sense by saying, you have to be 13 years old to use Instagram.

1578.00 1585.00 S: The younger kids, if their parents are not paying attention or if they're letting them do it, they're faking it, they're on there anyway, and they're exposed.

1585.00 1587.00 C: There's no way to prove that you're 13 years old.

1587.00 1588.00 C: You just self-attest.

1588.00 1589.00 C: Yeah, that's right.

1589.00 1591.00 S: You can do what China does.

1591.00 1597.00 S: China requires you to scan your face to essentially log into video games, social media.

1597.00 1603.00 S: They're using that technology in order to monitor kids because they have their new rule, three hours a week video games.

1603.00 1605.00 S: I'm going to buy a deep wow.

1605.00 1613.00 S: Facebook is aware that kids do this, that they fake their ages, and they decided to make a targeted platform for the younger age group.

1613.00 1616.00 S: Facebook and Instagram are saying all the right things too.

1616.00 1622.00 S: They want to encourage kids to use an age-appropriate platform that their parents can manage.

1622.00 1624.00 S: It sounds very benevolent.

1624.00 1626.00 S: They could monitor for pedophiles.

1626.00 1631.00 S: Yeah, I mean, look, they had their heads in the right place, but there's a lot of details here we have to unpack.

1631.00 1634.00 C: Yeah, the question is, this is marketing, no?

1634.00 1636.00 C: It all comes down to marketing.

1636.00 1638.00 C: That's how they make their money on these platforms.

1638.00 1639.00 C: Exactly.

1639.00 1642.00 C: So we're going to be selling crap to our 11-year-old kids?

1642.00 1645.00 S: There's a lot of people that are fighting against this idea.

1645.00 1655.00 S:  In particular, there's an activist group named Fair Play whose goal is to create a safe space for kids to live without marketing and influence from big industry, like you were saying, Cara.

1655.00 1657.00 S: They sent Mark Zuckerberg a letter.

1657.00 1661.00 S: They wrote a letter and said, hey, Mark, we want you to kill this project.

1661.00 1662.00 S: It's not good.

1662.00 1667.00 S: In their letter, they cite several studies, and I've read a lot of these studies, that back up their sentiment.

1667.00 1674.00 S: So they're saying that media has a negative effect on children, and it's not just social media, but in general, it's screen use.

1674.00 1678.00 S: It's not just social media, but just screen use in general has some negative effects.

1678.00 1683.00 S: So here are the common risk factors that were found in most of the studies.

1683.00 1697.00 S:  Obesity, lower psychological well-being, decreased happiness, decreased quality of sleep, increased risk of depression, and increases in suicide-related outcomes, such as suicidal ideation, plans, and attempts.

1697.00 1703.00 S: So another study found that children are, in quotes, uniquely vulnerable to advertising.

1703.00 1704.00 S: And listen to this.

1704.00 1705.00 S: Let me give you the reasons why.

1705.00 1708.00 S: So young kids can't detect that they're being manipulated.

1708.00 1716.00 S: They don't know when they hear and read and see things that there's a layer of manipulation involved in the whole thing.

1716.00 1722.00 S: So around 12 years old, kids could start to become aware that advertising is really about companies making money.

1722.00 1728.00 S: But even when kids are aware that advertising is essentially manipulation, kids are horrible at resisting the marketing.

1728.00 1729.00 S: Yeah.

1729.00 1732.00 C:  And Jay, there's also being able to tell the difference between reality and fantasy.

1732.00 1742.00 C: A very young child who sees an ad for a pair of shoes and the kid in the advertisement starts to fly is going to have a hard time understanding that these shoes, for example, can't make you fly.

1742.00 1743.00 C: Right.

1743.00 1755.00 C: And so that kind of marketing can have a really dramatic effect on them because they have a hard time understanding the difference between, you know, this is marketing versus this product would really do this and, oh, my gosh, mom, I need this.

1755.00 1756.00 C: Yeah.

1756.00 1757.00 C: It's going to change my life.

1757.00 1758.00 C: Of course.

1758.00 1759.00 S: Exactly.

1759.00 1760.00 S: You can't fly with those shoes?

1760.00 1767.00 S: No, I think this idea of screen use and social media and marketing, it's more of the marketing aspect of it.

1767.00 1770.00 S: And it's also the peer pressure that happens.

1770.00 1776.00 S: So beyond the marketing angle, social media exposes children to online bullying and to sexual exploitation.

1776.00 1782.00 S: Those two things, bullying and sexual exploitation, are not the worst thing that can happen to kids online.

1782.00 1783.00 S: Here it is.

1783.00 1788.00 S: During preteen and teen years, kids develop their identities during that age range.

1788.00 1789.00 S: Right.

1789.00 1795.00 S: And so from young children up into your teen years, you're figuring out who you are and what kind of...

1795.00 1796.00 C: Well, really into your 20s.

1796.00 1797.00 C: Right.

1797.00 1798.00 C: You're right.

1798.00 1799.00 C: It is.

1799.00 1800.00 S: Yeah, it continues.

1800.00 1804.00 S: But you're very vulnerable and susceptible much more when you're younger.

1804.00 1808.00 S: So this is the way that kids perceive themselves.

1808.00 1812.00 S: This is how they find and fit into their social structures.

1812.00 1821.00 S: So while their identities are being molded, kids are, like I said, they're super vulnerable and they're constantly interacting with their peers online.

1821.00 1823.00 S: And they upload images of themselves.

1823.00 1825.00 S: They're constantly looking for acceptance and praise.

1825.00 1831.00 S: And kids get these things from the way the platforms tell them that they're acceptable.

1831.00 1837.00 S: It's not like they're talking and they're playing sports and their friends are like, oh, come on, you're not working hard enough or great job.

1837.00 1840.00 S: They're like, they're looking for likes, upvotes.

1840.00 1842.00 S: And it's platform things.

1842.00 1845.00 S: It's Facebook and Reddit and Instagram.

1845.00 1851.00 S: It's the way that the adults who created these platforms decided what's going to be the positive feedback?

1851.00 1854.00 S: What's going to be the thing that gives people a dopamine hit?

1854.00 1857.00 GH: And those are quantifiable, too, which is the thing.

1857.00 1859.00 GH: A praise from a friend is one type of thing.

1859.00 1865.00 GH: But a heart or a check and you see how many you have versus how many your friends have.

1865.00 1874.00 GH: It is a binary piece of data where you're like, oh, I am less important than my friend because she has 400 likes and I have 335.

1874.00 1876.00 S: Exactly. The friends and followers thing.

1876.00 1877.00 S: It's not good.

1877.00 1879.00 S: I mean, think about being an unpopular kid.

1879.00 1881.00 S: Just think about this right now.

1881.00 1890.00 S: Imagine being an unpopular, dorky kid using social media and desperately trying to get any traction you can.

1890.00 1893.00 S: And I'm setting the stage for the next thing I'm about to tell you.

1893.00 1894.00 S: But this is what happens.

1894.00 1905.00 S: This is how kids are getting the way that they feel valued from their peers, from the way social media is conjuring up the acceptance.

1905.00 1909.00 S: What is acceptance and how are they getting praise from their peers?

1909.00 1915.00 C: But that also assumes, Jay, that we're talking about things like chat rooms and groups and places.

1915.00 1921.00 C: Some, as I think you're probably alluding to, some of these platforms aren't structured that way.

1921.00 1931.00 C: So, of course, if you're on Reddit or if you're in a Facebook group, you're able to find your friends, you're able to come together as a community, and it's mostly about communication.

1931.00 1935.00 C: Instagram is mostly about display.

1935.00 1948.00 C: And unfortunately, there's good evidence to show that the types of posts that get the most likes, yes, Instagram is also a place for activism and it's a place to be heard.

1948.00 1955.00 C: But it's very well accepted that a picture of your face is always going to perform significantly better than words.

1955.00 1970.00 C: And so, unfortunately, what happens, and I don't want to fully gender this because I think this does apply to young boys as well, but in a heavy way applying to both cis and trans girls, what ends up happening is that there's a representation of themselves that's not true to life.

1970.00 1976.00 C: There are so many filters available on platforms like this that allow you to completely transform the way you look.

1976.00 1988.00 C: So, when we think about all of the psychological damage that came from magazine covers when we were kids, little girls seeing pictures of supermodels on magazines and saying, I'm never going to look like her.

1988.00 1993.00 C: Now, she can actually transform herself to look like those things.

1993.00 1997.00 C: And think about the deep psychological conflict that comes from that.

1997.00 2005.00 S: So, these platforms are designed to keep people on the platform and to continue to engage because that engagement is generating money.

2005.00 2008.00 S: Through ad sales for these platforms.

2008.00 2011.00 S: So, the more people that use the platform, the more money they make.

2011.00 2015.00 S: And the kind of behavior that gets the most responses, Cara, I thought of you when I read this.

2015.00 2025.00 S: The kind of behavior that gets the most responses on social media leans far on the side of negativity and being mean, especially in kids.

2025.00 2030.00 S: So, people react to these kinds of posts more than they do a happy post.

2030.00 2032.00 C: Of course, just like on Yelp, right?

2032.00 2033.00 C: People don't leave reviews.

2033.00 2036.00 C: They do, they say, I had a great experience.

2036.00 2043.00 C: But much more often they say, that was the worst meal of my life or I hated my manicure and this is why.

2043.00 2044.00 C: Exactly.

2044.00 2050.00 S: So, in the kids' world, we're talking about bad restaurants and bad politics and stuff in the adult world.

2050.00 2054.00 S: In a kid's world, they're exposing themselves to this platform.

2054.00 2056.00 S: They're putting pictures of themselves up there.

2056.00 2062.00 S: And every single post is important to them in a really significant way, right?

2062.00 2070.00 S: So, they learn, unconsciously learn that they get more attention when they do bizarre things and mean things or whatever.

2070.00 2072.00 S: It's just the fact of life.

2072.00 2074.00 S: This is what's happening online.

2074.00 2075.00 S: And they lean into it.

2075.00 2083.00 C: And Jay, I saw firsthand when I was working in the foster care system, and of course, these are vulnerable youth, and I was working with adolescent girls.

2083.00 2091.00 C: So, between the ages of, let's say, 11, maybe 12 with our youngest up to 18, that very often they would sneak social media.

2091.00 2098.00 C: And they would have multiple Instagram accounts with like different fake names or different, you know, this one is for my friends, this one is for the boys, this one.

2098.00 2106.00 C: And very often they would be chatting with many, many members, either of the same or opposite sex, whatever they were attracted to, they would be chatting with many members.

2106.00 2113.00 C: But I especially would see this ideation that, oh, well, this is my boyfriend, this is my boyfriend.

2113.00 2118.00 C: I have all these boyfriends online, many of whom you don't know if this is a real human being.

2118.00 2121.00 C: You don't know if this is another young child who's playing along.

2121.00 2123.00 C: You don't know if this is an adult.

2123.00 2126.00 C: And the rhetoric between them was very scary.

2126.00 2130.00 C: I mean, this was borderline like concern around sex trafficking.

2130.00 2137.00 C: And I saw this quite often within my foster youth who were probably more vulnerable than children obviously being raised in a family setting.

2137.00 2139.00 C: But you don't know.

2139.00 2144.00 C: You don't know the kinds of conversations and how much deception is actually taking place within those.

2144.00 2151.00 C: And how much of this is healthy adolescent play versus manipulation by adults?

2151.00 2153.00 S: Sure. So here's a question.

2153.00 2156.00 S: This is something for you to think about as we as I wrap this up.

2156.00 2159.00 S: How responsible should social media platforms be for this?

2159.00 2164.00 S: Right now, I don't think that they wake up in the morning and think, you know, we're going to hurt kids today.

2164.00 2169.00 S: I can't believe that. I think what they're doing is they're like, we're going to wake up and we're going to make money today.

2169.00 2171.00 S: They lean into making money.

2171.00 2178.00 S: I do think that social media platforms do they should be made responsible because it is happening on their platform.

2178.00 2181.00 S: You know, just because they're making money doesn't mean it's OK.

2181.00 2183.00 S: Doesn't mean that what they're doing is OK.

2183.00 2188.00 C: No. And, you know, we as a society, we used to set standards as a society.

2188.00 2197.00 C: We voted so that at the governmental level, there was a requirement back when we didn't have social media and we were talking about media only through television and radio.

2197.00 2208.00 C: That our advertising dollars on things like, you know, nightly dramas would go to pay for our news and would go to pay for our children's programming.

2208.00 2213.00 C: And those things were not linked. Children's programming and news were not money makers.

2213.00 2217.00 C: They were money users. And it was required by law.

2217.00 2221.00 C: That was a regulation that we completely disbanded during the Reagan era.

2221.00 2233.00 C: There's no reason that we couldn't as a society band together and push for new regulations that say kids and basic news should not be directly tied to advertising.

2233.00 2239.00 C: It's a massive conflict of interest. There's no reason we can't say that out loud.

2239.00 2243.00 GH: Europe has all kinds of rules about what you can advertise to kids and what you can't.

2243.00 2244.00 GH: Absolutely.

2244.00 2251.00 GH: In certain times of the day or whatever, radio and television and internet, you can't have targeted advertising for children.

2251.00 2256.00 GH:  And that's probably, what, 30 percent of American advertising, 40 percent of American advertising is directed towards children.

2256.00 2262.00 GH: So you can totally regulate this. But to get that to happen is a very challenging thing.

2262.00 2269.00 S: So I'm out of time. I wanted to read more. I have information that was from the Fair Play website.

2269.00 2274.00 S: Look them up. Their website isn't just Fair Play. It's Fair Play for kids or something like that.

2274.00 2277.00 S: But you'll find them if you just type in Fair Play into Google.

2277.00 2284.00 S: They have a page where they go into the details about very specifically, I'll give you one example.

2284.00 2293.00 S: Being watched and watching. So the always-on reality of social media means that children become hyper-aware of themselves and others in a time when their identity is really developing.

2293.00 2296.00 S: And this hyper-awareness actually is very bad for them.

2296.00 2300.00 S: Because they're constantly thinking about themselves and how they relate to other people.

2300.00 2304.00 S: They're spending way too much time in that mindset and it's bad.

2304.00 2313.00 S: And they give you a really nice list of information that you can think about and maybe start paying more attention to your kids' use of social media.

2313.00 2318.00 S: Talk to them about it. Make them aware of advertising. Just tell them what advertising is.

2318.00 2319.00 S: All right.

Trust in Science (38:39)[edit]

2319.00 2324.00 S: So, Evan, the last item before we click over to DragonCon.

2324.00 2328.00 S: We obviously are engaged with science communication.

2328.00 2332.00 S: Our goal is to make people appreciate science, trust science.

2332.00 2335.00 S: But this is tricky territory.

2335.00 2338.00 S: You're going to tell us how easily this can backfire.

2338.00 2340.00 E: It can backfire.

2340.00 2346.00 E:  And that's definitely the result of a new study that was published in the Journal of Experimental Social Psychology.

2346.00 2355.00 E: In which they found that people who trust science are more likely to be duped into believing and disseminating pseudoscience.

2355.00 2356.00 E: Yeah.

2356.00 2358.00 E: Which is in a way not intuitive.

2358.00 2360.00 E: But those are the results.

2360.00 2362.00 E: And this isn't just this study.

2362.00 2365.00 E: This is a continuation in which we are seeing similar results.

2365.00 2367.00 E: This is just the latest greatest.

2367.00 2370.00 E: I don't think I have to define what pseudoscience is for our audience.

2370.00 2373.00 E: We should hopefully know what that is.

2373.00 2375.00 E: It's something pretending to be science.

2375.00 2376.00 E: It is not.

2376.00 2378.00 E: It has the trappings of science.

2378.00 2380.00 E: Yet it does not qualify.

2380.00 2385.00 E: But that line exactly where you draw it can be complicated.

2385.00 2387.00 E: It can be very, very tricky.

2387.00 2389.00 E: Not always clearly demarcated.

2389.00 2396.00 E: And it's in that blurry area in which the pseudoscientists thrive, frankly, and wholly exist.

2396.00 2402.00 E: This study was a series of four experiments involving a total of about 2,000 U.S. adults.

2402.00 2418.00 E: Researchers randomly assigned study participants to read a news article, actually two news articles, and complete an online questionnaire asking, among other things, if they believed the article, believed it was true, and whether it should be shared with others.

2418.00 2421.00 E: So these adults were asked to read about two different topics.

2421.00 2427.00 E: The first topic was a fictional virus created as a bioweapon dubbed the Valza virus.

2427.00 2429.00 E: V-A-L-Z-A.

2429.00 2430.00 E: The Valza virus.

2430.00 2437.00 E: Which was said to be made in a lab and that the U.S. government concealed its role in creating it as a bioweapon.

2437.00 2438.00 E: That's the first topic.

2438.00 2447.00 E: The other story was actually a real study supporting the idea that mice developed tumors after eating genetically modified organisms.

2447.00 2448.00 E: Not a tumor.

2448.00 2454.00 E: The study was retracted in 2013, but the participants were not told about the retraction.

2454.00 2460.00 E: So one was wholesale fiction; the other was actual, but retracted because it turned out not to hold water.

2460.00 2462.00 C: Because those mice get tumors anyway.

2462.00 2464.00 C: They were rats, but yeah.

2464.00 2465.00 C: That was why they were retracted.

2465.00 2466.00 C: Yeah, the rats, yeah.

2466.00 2480.00 E: So researchers assigned some of the people to read versions of the news stories that featured activists, non-scientific people spouting the information, whereas others read versions of the news stories featuring actual scientists.

2480.00 2484.00 E: They also gauged the participants' level of trust in science.

2484.00 2488.00 E: Researchers asked them to indicate whether they agreed with various statements.

2488.00 2495.00 E: For example, one of the statements was, Scientists usually act in a truthful manner and rarely forge results.

2495.00 2503.00 E: And then another statement they threw out there for people to gauge, The Bible provides a stronger basis for understanding the world than science does.

2503.00 2508.00 E: So these were metrics to try to gauge how people basically feel about these things.

2508.00 2520.00 E: And there was also an experiment among this set of studies in which the participants responded to a writing prompt, which was meant to put them into a particular mindset before reading their articles.

2520.00 2527.00 E: So what that means is that one prompt was to put people in the trust the science mindset.

2527.00 2533.00 E: And they gave them some examples of how science had saved lives and otherwise benefited humanity.

2533.00 2549.00 E: And another prompt aimed at a critical evaluation mindset, which was directing participants to give examples of people needing to think for themselves, not blindly trust what the media and other sources were telling them.

2549.00 2555.00 C: So they were priming them to either believe or to be skeptical.

2555.00 2569.00 E: And the results they came up with were that those who expressed or espoused higher levels of trust in science, just trust in science, turned out to be the most likely to believe the reports if they contained scientific references.

2569.00 2578.00 E: In other words, if pseudoscience was there and they simply trust science itself, it doesn't matter, they fell for it.

2578.00 2589.00 E: But for those who demonstrated a stronger understanding of the scientific method and scientific methods in general, they were less likely to believe the false stories and what they read,

2589.00 2601.00 S: which again is consistent with what we've seen in prior studies. The people with the lowest level of trust in science did not have any effect from whether or not the news item was connected to science or a scientist.

2601.00 2612.00 S: So the trust in science was a prerequisite, at least a minimal amount of it, to being manipulated by saying, I'm a scientician and I say this is correct.

2612.00 2626.00 S: So there was a dose-response curve with trust in science, so it actually made people vulnerable to being manipulated by tying pseudoscience to either a scientist or a reference to a scientific article.

2626.00 2637.00 S: Basically supporting the pseudoscience with some reference to science worked well for people who had a trust in science, unless they had a high degree of methodological literacy.

2637.00 2647.00 S: So basically they were not only scientifically literate, but they had the critical thinking and methodological background to be able to deconstruct the fake science for themselves.

2647.00 2648.00 S: You need both.

2648.00 2649.00 S: Right.

2649.00 2650.00 C: Well, and this makes perfect sense.

2650.00 2654.00 C: It's the reason that Doc So-and-so's snake oil is easy to sell.

2654.00 2658.00 C: If he doesn't have the lab coat and the name doctor, people don't want it from him.

2658.00 2666.00 C: It's because people have that trust in authority, which pseudoscientists, charlatans, they know this intuitively.

2666.00 2668.00 C: And they use this to their advantage.

2668.00 2669.00 C: Absolutely.

2669.00 2672.00 C: They don't go out spouting things that don't sound scientific.

2672.00 2677.00 C: They spout things that do sound scientific because they know people will trust them more if they do it that way.

2677.00 2679.00 S: If they have a baseline trust in science.

2679.00 2680.00 S: Right.

2680.00 2691.00 S: So the irony is that science communicators, if we are only fostering a generic trust in science and scientists, that is just setting people up for manipulation.

2691.00 2693.00 S: Actually it doesn't really work as a strategy.

2693.00 2708.00 S: You have to combine it with the critical thinking skills and the ability to at least judge sources, judge the validity of evidence for yourself to some extent, which gets very tricky.

2708.00 2709.00 GH: Oh, yeah.

2709.00 2710.00 GH: Our job was too easy anyway, Steve.

2710.00 2711.00 GH: We need a challenge.

2711.00 2712.00 GH: We need to be hard-earned.

2712.00 2714.00 C: A challenge to inform people.

2714.00 2738.00 C: We have to remember that through most of modern Western history, from the time that modern science came onto the scene, it was really only in the beginning, and in the examples where it butts up against religious doctrine, that we often see distrust.

2738.00 2744.00 C: Beyond that, we were in awe of science as a culture.

2744.00 2750.00 C: We really respected science and we bowed to the amazing things that science brought us.

2750.00 2774.00 C: And it really is in many ways a new phenomenon that there's an educated portion of society who's not necessarily driven by religious dogma, who don't trust science as a function of this narcissistic, I know better than the experts attitude, this kind of, like, the trappings, as we know, of the conspiracy set.

2774.00 2777.00 C: That didn't have a large hold in the past.

2777.00 2782.00 C: So this really is a new phenomenon that we're having to grapple with because historically, people trusted scientists.

2782.00 2784.00 C: And that's why it was easy to become a pseudoscientist.

2784.00 2798.00 S: But this is something that we know as scientific skeptics, not just science communicators, because we've long been saying, and the research really strongly supports this, is that yes, you need to teach scientific literacy.

2798.00 2801.00 S: That's one of the planks of scientific skepticism.

2801.00 2804.00 S: You need to correct misinformation.

2804.00 2809.00 S: You need to give people a basic fundamental knowledge about science and how science works.

2809.00 2833.00 S: But in addition to that, you need to give them critical thinking skills, which includes understanding pseudoscience and how pseudoscience works and how to distinguish pseudoscience from genuine science, science denial and how that works and how to distinguish science denial from genuine skepticism because the science deniers all portray themselves as skeptics.

2833.00 2836.00 S: We're asking the hard questions that no one else will ask.

2836.00 2845.00 S: But no, if it's a perverted version of that designed to deny legitimate science, then it's not skepticism.

2845.00 2846.00 S: It's science denial.

2846.00 2849.00 ?: Steve, could you repeat all that so George and I can write it down?

2849.00 2853.00 S: Yeah. And then the third thing is media savvy.

2853.00 2872.00 S: You need to understand, like we were also just talking about social media and the internet, you need to understand how information flows through our modern society, how you could find reliable bits of information and distinguish that from unreliable sources because we all know people personally, colleagues, obviously as skeptics.

2872.00 2898.00 S: We all know people who are essentially living in an alternate reality because they live in an alternate ecosystem of information and they think that they are completely right and we are all hopelessly duped because we believe in things like anthropogenic global warming, that vaccines are safe and effective, that GMOs are safe, that evolution happened.

2898.00 2899.00 S: How silly are we?

2899.00 2902.00 S: Yeah, I know. That the earth is basically a sphere.

2902.00 2905.00 S: These things that make us gullible in their eyes.

2905.00 2913.00 S: Seriously, there are people who are immersed in this alternate reality of information and alternate sources, journals and outlets and everything.

2913.00 2914.00 S: The Truman Show.

2914.00 2918.00 S: Yeah, exactly. Social media makes it very easy to do all these things.

2918.00 2922.00 S: You don't need a big brick and mortar institution that has been around for a hundred years.

2922.00 2923.00 S: You just need a slick website.

2923.00 2932.00 J: Steve, correct me if you disagree, but the fix is that critical thinking needs to become part of the classroom, part of the canon.

2932.00 2942.00 J: There should be multiple times during a child's career as a student when they should be taking classes on critical thinking and be taught these things.

2942.00 2945.00 S: I agree. It has to be woven in throughout the science curriculum.

2945.00 2947.00 S: More than just science.

2947.00 2951.00 C: I think it should be in every class. Every class should have a component or, you know.

2951.00 2953.00 C: Yeah, woven in.

2953.00 2960.00 S: Wouldn't it be fantastic if teachers were taught critical thinking and then that bled into the classroom that way as well.

2960.00 2963.00 S: But we don't, I mean, it's rare.

2963.00 2971.00 S: We get emails from people who say, oh, I was inspired by this one teacher that I had at one point or, you know, this isn't a common thing at all.

2971.00 2973.00 S: It's incredibly uncommon.

2973.00 2982.00 C: Well, and let's be clear, some places do this and some school districts do this and some incredible institutions do this already.

2982.00 2985.00 C: It's not widespread and it's not well adopted.

2985.00 2992.00 C: But, I mean, we shouldn't talk about it as if nobody had this idea before, because among very good educators who really focus on pedagogy,

2992.00 2995.00 C: I think that this is a well established truth.

2995.00 2996.00 S: Of course.

2996.00 2997.00 S: Yeah, we didn't come up with this.

2997.00 2999.00 S: It just happens to be the truth.

2999.00 3005.00 S: And we are, I'm telling everyone, if you don't know this, that's the answer.

3005.00 3007.00 S: And that's what we should be focusing on.

3007.00 3016.00 S: And it's mind boggling how impossible it is to get the money it takes to do this because there's no money in critical thinking in science.

3016.00 3017.00 S: It's skepticism.

3017.00 3018.00 C: And so how do you do it?

3018.00 3025.00 C: How do you fight against the fact that there is money in teaching anti-evolution rhetoric and anti-critical race theory rhetoric?

3025.00 3041.00 C: There are states, these states that publish our textbooks, Texas, where there are concerted efforts and many, many southern states where there are concerted efforts to take actual data, truth, reality, history, science out of the textbooks and out of the curriculum.

3041.00 3042.00 C: And critical thinking.

3042.00 3043.00 C: And specifically critical pedagogy.

3043.00 3044.00 C: And critical thinking, yeah.

3044.00 3060.00 GH: You know, this idea of a sea change in the educational system of incorporating critical thinking and skepticism across the board into the whole curriculum, I think it's a fantasy, because for something like that to occur is such a gigantic proposition.

3060.00 3069.00 GH: I wonder if there's a focus of our effort, as sort of the fringe collective that we are, that could have some influence.

3069.00 3079.00 GH: I don't know how true this is for everybody here, but there was a time in school where all the girls went into the auditorium one day and they got the sex movie.

3079.00 3083.00 GH: And all the boys went into the auditorium that one day and they got the sex movie.

3083.00 3084.00 GH: And it wasn't the best.

3084.00 3085.00 GH: It wasn't the most clear.

3085.00 3090.00 GH: It wasn't necessarily the best sex ed presentation.

3090.00 3093.00 GH: But it sort of gave you the basics and it set up something to understand.

3093.00 3095.00 GH: This is how procreation works.

3095.00 3096.00 GH: And this is the boy parts.

3096.00 3097.00 GH: This is the girl parts.

3097.00 3098.00 GH: Great.

3098.00 3111.00 GH: So would there be some kind of equivalent of a one-time, like, unbelievably well-produced 45-minute thing of, this is how advertising works.

3111.00 3114.00 GH: This is how critical thinking works.

3114.00 3127.00 GH: And to get that in front of people's faces, in front of students' faces, if nothing else, is that a better target to shoot for? Like, hey, schools have assemblies.

3128.00 3139.00 GH: And to say, let's get a critical thinking assembly into the rotation that's going to be happening versus wouldn't it be great to have critical thinking at every juncture of a student's education?

3139.00 3140.00 GH: Of course it would.

3140.00 3142.00 GH: Is that realistic, though?

3142.00 3146.00 C: Yeah. But unfortunately, what we're talking about isn't a single fix.

3146.00 3150.00 GH: I know it's not a single fix, but I'm saying better than what's happening now.

3150.00 3155.00 C: Right. But just like that video, it doesn't necessarily, we'd have to look at the actual outcomes.

3155.00 3158.00 C: Did that prevent any unwanted pregnancies?

3158.00 3159.00 C: Maybe not.

3159.00 3163.00 C: You know, we have to really understand, are we hitting them at the right time?

3163.00 3168.00 GH: I just know that I saw Cosmos as a kid and it was like 30 minutes and I was in.

3168.00 3169.00 GH: I was in.

3169.00 3170.00 GH: There was 30 minutes.

3170.00 3176.00 GH: Now, again, my environment was set up so much that my parents had inspired critical thinking and a love of science, et cetera.

3176.00 3179.00 GH: But that was a catalyst.

3179.00 3194.00 GH: And I'm thinking there can be catalyst moments for young people that sometimes can be just in a well-done presentation of like, it happens with music, it happens with drama, it happens in all kinds of different things that a single presentation, a single moment, a single person seeing a play.

3194.00 3202.00 GH: I know how many stories you hear where the person, the kid goes and sees a play for the first time and they go, I want to be an actor. This is amazing.

3202.00 3204.00 GH: I call it the big bang moment.

3204.00 3205.00 E: Yeah, the big bang moment.

3205.00 3207.00 S: Let's work on having big bang moments.

3207.00 3209.00 C: We have to have enough of those.

3209.00 3212.00 C: We have to have enough big bang moments that they'll hit the kids at the right time.

3212.00 3222.00 S: But George, George, I think the problem is that we were all, everybody here and probably the vast majority of the audience that we have, we inherently love science, right?

3222.00 3225.00 S: Like we grew up and something happened and we fell for it.

3225.00 3227.00 S: You know, that's the problem.

3227.00 3233.00 S: That in and of itself is the actual problem is like most people don't fall in love with it as quickly as we do.

3233.00 3235.00 S: We're the low hanging fruit of science enthusiasts.

3235.00 3236.00 GH: I don't know, man.

3236.00 3242.00 GH: I think you hit a kid right with some really, I mean, look how popular the Slow Mo Guys are on YouTube.

3242.00 3249.00 GH: Look how popular a lot of science communicators are because they're doing presentations that are incidentally science related.

3249.00 3270.00 GH: I mean again, Mythbusters is the top example, like the top syndicated cable show, one of the top of all time, was created by a bunch of critical thinkers who weren't making a critical thinking show, but used critical thinking to blow stuff up, and it was amazingly produced and amazingly done, and it was the idea of let's make this entertaining and interesting.

3270.00 3275.00 GH: And there's going to be a whole generation of scientists because of Mythbusters, no question.

3275.00 3283.00 GH: I mean they themselves, they see that when they do their live shows or Adam talks about this all the time, that young people come up and they're like, I'm in, I'm so in.

3283.00 3292.00 GH: I'm saying, is there some equivalent of that that could be done on a partial curriculum basis as opposed to trying to say, boy, wouldn't it be great to just have an entirely different curriculum?

3292.00 3296.00 B: I think so, but should we call the sex ed vid the big bang moment?

3296.00 3297.00 B: Should we do that?

3297.00 3298.00 B: I don't know.

3298.00 3299.00 B: Think about it.

3299.00 3304.00 S: So obviously you're correct, George, and I think partly we're doing that, right?

3304.00 3305.00 S: Yeah.

3305.00 3306.00 S: Oh, yeah, yeah, yeah.

3306.00 3315.00 S: We're producing as much content as we have, we're trying to flood the culture with as much pro-critical thinking, scientific skepticism kind of content as we can.

3315.00 3320.00 S: We do try to bring as much as we can of that into the classroom.

3320.00 3323.00 S: I know a lot of what we produce gets used by science teachers in the classroom.

3323.00 3331.00 S: We get a lot of feedback from science teachers who use the SGU, use our book, use a lot of the content that we create.

3331.00 3333.00 S: Give talks to schools, right?

3333.00 3335.00 S: We'll go into schools and give talks.

3335.00 3339.00 S: Richard Saunders has a whole magic show that he does teaching critical thinking.

3339.00 3341.00 S: So we should do as much of that as possible.

3341.00 3349.00 S: Maybe what you're saying is maybe we should divert a little bit more of our focus towards producing the kind of content that could be incorporated into the classroom.

3349.00 3358.00 S: But still, I actually don't think it's a pipe dream to change the education infrastructure in a positive way.

3358.00 3367.00 S: For example, there is just a culture in science that it's dirty to deal with and talk about pseudoscience.

3367.00 3375.00 S: And they don't like to do it, they don't understand it, and therefore it doesn't trickle down in science education.

3375.00 3379.00 S: And that is the thing that I really would like to change.

3379.00 3384.00 S: And I think we collectively, the scientific skeptical movement should try to change.

3384.00 3392.00 S: It's like, no, teaching about pseudoscience is a critical element of teaching science.

3392.00 3395.00 S: The evidence overwhelmingly shows that.

3395.00 3398.00 S: This is one more study that we're talking about today.

3398.00 3404.00 S: One more on a mountain of studies which show that the knowledge deficit model is limited.

3404.00 3406.00 S: It's not worthless, but it's limited.

3406.00 3408.00 S: You can't just give facts.

3408.00 3409.00 S: You can't just teach science.

3409.00 3416.00 S: You have to put it in the context of this is how we know what we know, and this is what happens when it goes wrong.

3416.00 3418.00 S: Show how terrible it goes.

3418.00 3422.00 S: You know that you're doing it right because you're not doing it wrong.

3422.00 3425.00 C: Yeah, you can't teach clinical physiology.

3425.00 3431.00 C: You can't teach a physician how to be a good doctor if you only ever show him non-pathological physiology.

3431.00 3438.00 C: It's like you've got to see what happens when it's happening the normal or typical way versus when something goes wrong.

3438.00 3446.00 C: Kids are never going to understand critical thinking if you only teach them the cheery, saccharine version of how things should be.

3446.00 3448.00 C: You'll make them vulnerable.

3448.00 3449.00 S: Totally.

3449.00 3456.00 S: You have to teach pathological science in order to contrast that to healthy science.

3456.00 3459.00 S: If you don't do that, all you're doing is making a bunch of pseudoscientists.

3459.00 3467.00 J: Steve, this is what I'm confused about because I think teaching kids about pseudoscience is almost like showing them a magic trick.

3467.00 3471.00 J: If you tell them, here, let me show you what's wrong about this.

3471.00 3472.00 J: Let's discuss it.

3472.00 3473.00 J: We'll break it down.

3473.00 3478.00 J: It becomes interesting because you're almost empowering them with the knowledge.

3478.00 3479.00 J: My kids love it.

3479.00 3480.00 J: They're responding to this.

3480.00 3481.00 GH: That can be so entertaining.

3481.00 3483.00 GH: Yeah, I think it can be done.

3483.00 3495.00 C: We're making a core assumption here. Melissa in the chat made a really, I think, intuitive or insightful comment, which was that it's not uncommon to teach critical thinking in schools, by the way.

3495.00 3501.00 C: If you were to get a group of educators in a room and say, we should teach critical thinking, nobody would be like, I don't think so.

3501.00 3504.00 C: Educators, by and large, agree with this.

3504.00 3512.00 C: There are obviously external forces trying to change our curriculum structure, but people who do this for a living are like, hell yeah, we need to be teaching critical thinking.

3512.00 3514.00 C: Many of them do.

3514.00 3518.00 C: The question is or the concern is it can be undone at home.

3518.00 3521.00 C: It can absolutely be undone in the home.

3521.00 3525.00 C: We only have control over our academic system.

3525.00 3532.00 C: We cannot control the fact that parents have a much greater influence over their children's outcomes.

3532.00 3534.00 E: We've got to educate the parents as well.

3534.00 3536.00 S: We can only do what we can do.

3536.00 3538.00 S: We can control the things that we can do.

3538.00 3539.00 C: It's a chicken and egg situation.

3539.00 3540.00 C: Yeah.

3540.00 3541.00 S: All right, let's move on.

Embryo Research (59:01)[edit]

3541.00 3546.00 S: Cara, you're going to tell us about research using human embryos.

3546.00 3555.00 C: Yeah, so there was a feature article that was put out in Nature by a science writer named Kendall Powell called, What's Next for Lab-Grown Human Embryos?

3555.00 3563.00 C: The subtitle, which I think is very telling, is, Researchers are now permitted to grow human embryos in the lab for longer than 14 days.

3563.00 3565.00 C: Here's what they could learn.

3565.00 3576.00 C: I'm not sure if all of us are aware, but there has long been in place an international consensus decision that there is a 14-day rule.

3576.00 3578.00 C: That's what it's called, the 14-day rule.

3578.00 3582.00 C: This was set by the International Society for Stem Cell Research.

3582.00 3590.00 C: It's been discussed ever since the 70s when IVF really did become an actuality for many families.

3590.00 3594.00 C: Prior to that, you kind of didn't have a lot of options.

3594.00 3600.00 C: But when IVF really became an actuality, there became more embryos available.

3600.00 3606.00 C: Individual countries, different research groups said, We want access to these embryos.

3606.00 3614.00 C: Of course, there was a big international discussion that was really codified in the mid-2000s, just like we're seeing now with CRISPR.

3614.00 3616.00 C: You have a new technology available.

3616.00 3619.00 C: Where do the ethics fall in?

3619.00 3621.00 C: Where do we want to draw a line in the sand?

3621.00 3625.00 C: For a long time, that line in the sand was at 14 days.

3625.00 3631.00 C: Before we talk about what happens next, let's talk about what that meant for researchers.

3631.00 3637.00 C: Well, first and foremost, most labs can't even grow embryos to 14 days.

3637.00 3638.00 C: It's really freaking hard to do.

3638.00 3643.00 C: There's only a few labs in the world that have managed to do this and to do this consistently.

3643.00 3650.00 C: Part of that is because many countries have their own legal regulations that go beyond the 14-day rule.

3650.00 3654.00 C: It's actually illegal in many countries to work with human embryos at all.

3654.00 3665.00 C: But in the countries where it is legal, the US being one of them, the 14-day rule says that you cannot continue to grow human embryos beyond 14 days.

3665.00 3668.00 C: What happens in those first 14 days?

3668.00 3670.00 C: A lot, but also not a lot.

3670.00 3678.00 C: The embryos do start to organize, and you start to get right at the edge of what's called the primitive streak.

3678.00 3686.00 C: This is a visual thing that you can see under the microscope, a little streak that ultimately is going to form the neural tube.

3686.00 3697.00 C: Ultimately, the streak early on, you can tell by the way that the cells are organized, is giving the body its design axes, so up, down, head, tail, left, right.

3697.00 3700.00 J: Cara, is that the spine? Are you saying that's the spine?

3700.00 3701.00 C: Not yet.

3701.00 3705.00 C: It will eventually, like during many more cell divisions, become the spine.

3705.00 3711.00 C: I'm talking right now in the first 14 days, a ball of cells, a blastocyst, that has almost no differentiation at all.

3711.00 3715.00 C: These cells are totipotent. They're not even pluripotent.

3715.00 3718.00 C: These are cells that could literally become anything.

3718.00 3726.00 C: They are really in the earliest stages of starting to develop and differentiate.

3726.00 3729.00 C: Really, really early on, it just looks like a ball of cells.

3729.00 3734.00 C: Around the end of that 14-day mark, you start to see the primitive streak.

3734.00 3736.00 C: It's called that because that's all it really is.

3736.00 3741.00 C: It's super early, and it's giving you some directionality to the embryo.

3741.00 3744.00 C: Then it's going to start folding in on itself.

3744.00 3753.00 C: You're going to have three differentiated layers of cells that ultimately give rise to things like skin and neural tissue, to body organs, to bone, and connective tissue.

3753.00 3758.00 C: At this point, none of that differentiation has occurred.

3758.00 3768.00 C: That's when, up until recently, researchers have had to go ahead and just freeze the embryos and put them in a suspended state, either that or destroy the embryos.

3768.00 3771.00 C: The research really couldn't continue beyond that.

3771.00 3773.00 C: We're missing a lot of information.

3773.00 3787.00 C: That information about what happens when the embryo, or I should say at this point, the blastocyst, really does start to undergo gastrulation, to really develop into a primitive organism.

3787.00 3795.00 C: Like you said, Jay, develop that spinal column, which starts as a neural tube based on how it folds and then continues to differentiate.

3795.00 3803.00 C: All of that, the post-14-day period up through the next couple of weeks, we can't see on ultrasound.

3803.00 3820.00 C: The only way we've ever been able to study the development of an actual human embryo is through animal models, through embryonic models, and through taking all of this information and gleaning what we think we understand.

3820.00 3826.00 C: We know now a really good amount of what happens from day one to day 14.

3826.00 3831.00 C: Of course, we can study an embryo at different stages that was spontaneously aborted, for example.

3831.00 3850.00 C: We might be able to look at it, but we can't physically grow these things beyond day 14 and understand exactly what's happening, both at the structural level, but perhaps even more importantly at the molecular level, because it's been this international consensus that says you can't do that.

3850.00 3853.00 C: Back in May, that consensus changed.

3853.00 3862.00 C: The governing body, again, they are called the International Society for Stem Cell Research, the ISSCR, released new guidelines.

3862.00 3869.00 C: It relaxed the 14-day rule, and it allows for study of gastrulation beyond that.

3869.00 3873.00 C: They didn't put a new line in the sand, which a lot of people are concerned about.

3873.00 3875.00 C: They didn't say it's no longer 14 days, it's whatever.

3875.00 3878.00 C: They basically said so few labs do this.

3878.00 3881.00 C: If you want to continue beyond 14 days, you just have to seek approval.

3881.00 3884.00 E: So few labs are capable of going beyond 14 days.

3884.00 3886.00 S: Just individualize it rather than making a rule.

3886.00 3889.00 S: I guess that's positive. That's a great move.

3889.00 3903.00 C: Of course, there are people who push back and say, I wish they would have set a new line in the sand, but then of course the push back against that is it's very arbitrary what this line is, and a lot of it really does feel moral or religious.

3903.00 3909.00 C: Oh, maybe now we're starting to see heart structure, or now the neurons are starting to fire.

3909.00 3912.00 C: What does that mean? Could it have thought? Could it have feelings?

3912.00 3919.00 C: These are where a lot of these complicated questions come up, but they're the same questions we grapple with when we talk about brain organoids.

3919.00 3923.00 C: We just discussed brain organoids in a previous podcast.

3923.00 3928.00 C: To be clear, there are these pseudo embryos.

3928.00 3931.00 C: They're not actually called pseudo embryos. I just made that term up.

3931.00 3935.00 C: But there are these embryonic models that have been developed.

3935.00 3939.00 C: Really, really complicated, incredible science.

3939.00 3948.00 C: When a Japanese researcher, I think it was, first developed how to make these embryonic models, it was like a watershed moment.

3948.00 3951.00 C: Then people were able to learn the protocol and other labs were able to do it.

3951.00 3961.00 C: But there's no real determination or guarantee that what's happening in an embryonic model perfectly matches with what's actually happening in development.

3961.00 3965.00 GH: What's the model like? I don't understand. Is it an organic thing?

3965.00 3972.00 C: Yeah, so they actually can take certain types of pluripotent cells and induce them to grow in a certain way, but it's not a human embryo.

3972.00 3981.00 C: It's cell types that have been stuck together and induced to grow, kind of like the brain organoids versus an actual brain being grown.

3981.00 3985.00 S: Yeah, so they're trying to ultimately make a distinction between a clump of cells and a person.

3985.00 3995.00 S: They're trying to draw lines. Beyond this point, it's actually now a person and we're actively going to treat it that way as opposed to it's a clump of cells that we're studying scientifically. But of course, it's a continuum.

3995.00 3998.00 S: So there is no sharp demarcation line.

3998.00 4000.00 S: There's never going to be a sharp demarcation.

4000.00 4005.00 C: So historically, it was 14 days. This international group has lifted that.

4005.00 4015.00 C: Now, one thing that's interesting is that, which I didn't really realize, although maybe I did, is that here in the US, we don't really have a lot of regulation about working with these things.

4015.00 4025.00 C: The regulation that we all thought we had probably comes into play because we have a law that says that these types of investigations can't be federally funded.

4025.00 4026.00 C: Right, yeah.

4026.00 4032.00 C: So most labs can't do this because they can't access NIH or NSF dollars to do it.

4032.00 4061.00 C: And it's expensive, right? And they have to have their own setup to be able to access and acquire these things. And so very few labs have really worked out the logistics, but the ones that have argue that there's so much more we could know because there's this gap in our knowledge between what we can see on ultrasound and what we can test in a developing fetus through like chorionic villus sampling or any other way to understand developmental biology and what happens after this 14-day rule.

4061.00 4065.00 C: There's kind of a black box area that we've been able to model.

4065.00 4073.00 C: We've been able to look at animal research, but really within humans, it's interesting how much of a gap we have in our knowledge.

4073.00 4082.00 S: Yeah, we're getting rid of the semi-arbitrary rule and just saying, all right, we'll just make individual decisions, justify your research ethically.

4082.00 4085.00 S: It's a legitimate ethical concern.

4085.00 4089.00 S: You don't want people growing human beings in a lab to study them.

4089.00 4099.00 C: Completely. But all researchers have to already go through an institutional review board, and this is an actual international review board, so it's another layer of regulation.

4099.00 4105.00 C: They're already approving this stuff. Why not do it on a case-by-case basis if so few labs can do it anyway?

4105.00 4112.00 S: Yeah. Okay, we're going to do a couple of quickie news items, and then we're going to do a special segment with Evan, and then we'll do science or fiction.

Bionic Arms (1:08:25)

4112.00 4117.00 S: That's the rest of the show. So this is a robotic prosthetic arm.

4117.00 4122.00 S: This is a type of research I've been talking about for years.

4122.00 4126.00 S: The researchers are calling this a bionic arm.

4126.00 4128.00 B: Bionic!

4128.00 4130.00 S: Sure, why not?

4130.00 4132.00 S: Why not?

4132.00 4134.00 S: Conjures images.

4134.00 4150.00 S: So this is a function of what we call a brain-machine interface, where basically any time you're having biology interface with a robotic or a mechanical device or a computerized device, in this case it's not connecting directly to the brain.

4150.00 4161.00 S: They're using a very cool method where, so these are meant for people who have an amputation, and they have surviving nerve endings.

4161.00 4172.00 S: You can graft, let's say for example you graft the motor nerve, the stub of the motor nerve basically, onto a clump of muscle tissue.

4172.00 4181.00 S: And so what that does is it keeps everything alive, but it also means that now that nerve is going to connect to this little patch of muscle.

4181.00 4185.00 S: And you might be thinking, well what will that do? It's just a little patch of muscle.

4185.00 4187.00 S: It can't move your limb or anything.

4187.00 4192.00 S: However, what that does is it amplifies the signal from the nerves.

4192.00 4198.00 S: Because nerves produce a very, very low amplitude electrical signal.

4198.00 4204.00 S: Muscles are like ten times as much, like a much bigger electrical signal from a muscle cell depolarizing.

4204.00 4212.00 S: And so essentially that little clump of muscle, not only do the nerves and the muscles keep each other healthy, it amplifies the signal.

4212.00 4218.00 S: And then the signal from that little patch of muscles contracting activates the bionic arm.

4218.00 4224.00 S: So that's how you get motor control through your preexisting motor nerves.

4224.00 4233.00 S: So you're controlling it through the normal motor pathways that you had previously, but now you're again using the clump of cells to actuate the bionic arm.

4233.00 4238.00 S: However, the breakthrough here, no component of this is new.

4238.00 4242.00 S: What's new is bringing it all together in this one limb.

4242.00 4246.00 S: The other two components are sensory.

4246.00 4255.00 S: So one is having skin sensors that connect to the sensory nerves, which are grafted onto parts of skin.

4255.00 4267.00 S: So again, they're taking the existing nerves, grafting them to some intact tissue, and then using that as the interface to the mechanical connection.

4267.00 4274.00 S: So in the motor nerve, it goes from the motor nerve to the muscles to the actuators in the arm.

4274.00 4288.00 S: And then in the sensory feedback, it goes from the sensors in the surface of the skin of the robotic arm to this predetermined patch of skin cells that the sensory nerves are grafted onto.

4288.00 4298.00 S: So with this arm attached biologically to the subject, they did this in two subjects only, so this is early research.

4298.00 4309.00 S: They were able to control the limb through voluntary control through the nerves, and they were able to actually feel, like will their brain interpret that as a feeling in the limb?

4309.00 4320.00 S: Yes. So the brain happily incorporates all of this into its circuitry so that people feel like they own the limb, not that they have a limb attached to them.

4320.00 4324.00 S: There isn't this arm attached to me. It's like this is my arm now.

4324.00 4325.00 S: Agency?

4325.00 4327.00 S: Yeah. Well, it's ownership is the term.

4327.00 4328.00 S: Ownership.

4328.00 4338.00 S: Ownership. And they can control it without looking at it, and their ability to error correct and make adjustments is at like near normal levels.

4338.00 4352.00 S: That's what they were studying is like their ability to control the limb was actually more similar to an intact person than it was to somebody using an older prosthetic without the sensory feedback.

4352.00 4356.00 B: It becomes part of your internal representation of your body.

4356.00 4357.00 B: What is that called?

4357.00 4358.00 B: Your homunculus?

4358.00 4361.00 B: Yes, homunculus. It becomes part of your homunculus basically.

4361.00 4362.00 S: It's a homunculus.

4362.00 4364.00 S: This is one cool thing when you just think about it,

4364.00 4369.00 GH: that like you can close your eyes and you can tell when your hand is open or closed. That's proprioception.

4369.00 4370.00 GH: That's proprioception.

4370.00 4371.00 GH: Proprioception.

4371.00 4378.00 GH: Okay. That's amazing. I'm sorry. I know this is stoner talk, but it's just you know that it's amazing.

4378.00 4379.00 C: Yeah. Luckily.

4379.00 4381.00 C: Read stories about people who lose it.

4381.00 4382.00 C: Yeah, right. I know. I know.

4382.00 4384.00 C: It's incredible when people lose proprioception.

4384.00 4388.00 S: All of these pieces have been preexisting.

4388.00 4396.00 S: This is the first lab to bring them all together into one limb and show that it improves the usability of these limbs.

4396.00 4397.00 S: We are getting there.

4397.00 4398.00 S: We're getting so much closer.

4398.00 4400.00 GH: Was it a $6 million study or was it?

4400.00 4407.00 S: I mean we're still 100 years away from the Six Million Dollar Man, which is interesting.

4407.00 4413.00 S: In the 1970s we thought, yeah, we're probably pretty close to this, but it's like 150 years away.

4413.00 4414.00 S: But this is where we are.

4414.00 4417.00 S: This is still a fantastic improvement, trust me.

4417.00 4427.00 S: The kind of arm depicted in the Six Million Dollar Man, indistinguishable from a living limb to the user and to other people, forget about it.

4427.00 4428.00 S: We're nowhere near that.

4428.00 4431.00 S: Now, Steve, if you had one of these arms, can you pick up the front end of a car?

4431.00 4433.00 S: No.

4433.00 4434.00 S: Yeah.

4434.00 4435.00 S: It's still limited.

4435.00 4437.00 S: You have to hitchhike, Steve.

4437.00 4440.00 S: So it's strapped on still.

4440.00 4441.00 S: It's touching the skin on the outside.

4441.00 4443.00 S: Yes, but there is this biological interface.

4443.00 4444.00 S: Right.

4444.00 4445.00 S: But it's touching.

4445.00 4446.00 S: You're not going to be entering the shotput competition.

4446.00 4447.00 S: Is it sticking?

4447.00 4449.00 S: Is it like electrode in the muscle?

4449.00 4455.00 S: No, no, because the electrodes are either stimulating the skin or reading the muscle contraction.

4455.00 4456.00 S: Okay.

4456.00 4457.00 S: So it's not like a.

4457.00 4460.00 C: Yeah, so if it fell off, if it was ripped off, like you wouldn't be injured.

4460.00 4462.00 S: Yeah, there's no wires going into the arm.

4462.00 4463.00 S: Okay.

4463.00 4470.00 S: The reason why I say that though is because structurally the body and the musculature are not really, like these arms couldn't pick up a lot of weight.

4470.00 4471.00 S: No.

4471.00 4473.00 S: All kidding aside, yeah, they couldn't.

4473.00 4474.00 S: But it would be functional.

4474.00 4476.00 S: You could pick up a can of soda.

4476.00 4478.00 S: You could use it.

4478.00 4482.00 B: You could pick it up and not be looking at it with the proper strength.

4482.00 4484.00 S: And know that it's there.

4484.00 4485.00 S: Know how hard you're squeezing it.

4485.00 4486.00 S: And know you're not crushing it.

4486.00 4487.00 GH: That's amazing.

4487.00 4493.00 S: So I keep a close eye on this technology and it's, again, nice incremental improvements.

4493.00 4495.00 S: This was worthy of comment.

Quickie with Bob: Caves on Mars (1:15:00)

4495.00 4499.00 S: All right, Bob, you're going to give us a quickie about caves on Mars.

4499.00 4500.00 S: Okay.

4500.00 4501.00 B: Yes, thank you, Steve.

4501.00 4503.00 B: This is your quickie with Bob, everyone.

4503.00 4512.00 B: Recent analysis by scientists has shown what could be more than a thousand caves in the Tharsis Bulge region of Mars.

4512.00 4513.00 B: Tharsis Bulge?

4513.00 4515.00 GH: I don't know where that region is, but it sounds kind of cool.

4515.00 4518.00 GH: Make a left at the Colossan Cut-Off.

4518.00 4520.00 GH: Tharsis Bulge.

4520.00 4528.00 B: Calculations show that these caves could be lifesaving for future human colonists, not unlike the lava tubes on Mars.

4528.00 4530.00 B: The lava tubes on the moon.

4530.00 4531.00 B: I knew that was going to happen.

4531.00 4539.00 B: Now, this presupposes, though, that Mars is far deadlier than a lot of people, I think, really, really, truly understand.

4539.00 4542.00 B: They think, oh, it's got an atmosphere, you know, that's not bad.

4542.00 4546.00 B: But it barely has an atmosphere, barely, barely.

4546.00 4550.00 B: And it does not have a magnetosphere, which is also very critical.

4550.00 4559.00 B: And regarding the atmosphere, if you're at sea level on Mars, it has 0.7% of the air pressure that we experience at sea level on Earth.

4559.00 4561.00 S: I mean, that's like, it might as well be nothing.

4561.00 4562.00 S: It's so small.

4562.00 4563.00 S: Bob, let me ask you this.

4563.00 4564.00 S: Yes, ask.

4564.00 4567.00 S: You may remember this answer from our upcoming book, Guide to the Future.

4567.00 4568.00 S: Four.

4568.00 4572.00 S: What pressure would you need to have on Mars or anywhere?

4572.00 4577.00 S: What pressure would a human being need in order to be able to survive without a pressure suit?

4577.00 4578.00 B: Oh, it was...

4578.00 4579.00 B: A lot.

4579.00 4581.00 B: No, it was actually less than we thought.

4581.00 4582.00 S: One atmosphere.

4582.00 4583.00 S: Yeah, right.

4583.00 4584.00 S: It was less than we thought, wasn't it?

4584.00 4585.00 S: Six percent.

4585.00 4586.00 S: Yeah.

4586.00 4587.00 E: Oh, really?

4587.00 4588.00 E: That's it.

4588.00 4589.00 S: Yeah.

4589.00 4590.00 S: Of an atmosphere.

4590.00 4591.00 S: That's survivability.

4591.00 4592.00 S: You can survive.

4592.00 4593.00 S: You can survive.

4593.00 4594.00 S: You can't breathe.

4594.00 4595.00 S: You need oxygen.

4595.00 4596.00 S: But it's not as much as I thought.

4596.00 4597.00 S: It's really...

4597.00 4598.00 S: But you won't, like, implode.

4598.00 4599.00 B: Yeah, you won't implode.

4599.00 4600.00 B: Right.

4600.00 4601.00 B: And you can't move.

4601.00 4602.00 B: You can't move because if you move, then you'd run out.

4602.00 4611.00 B: But, yeah, but still, even if that is true, but even so, we would need, you know, six, seven, eight times what's there now just to be able to breathe in enough oxygen.

4611.00 4612.00 S: Yeah, yeah.

4612.00 4613.00 S: But we could...

4613.00 4620.28 S: So with the volatiles on Mars, we could get to the point where there was enough of an atmosphere where all you would need was supplemental oxygen.

4620.28 4622.72 S: You wouldn't need a pressure suit.

4622.72 4623.72 S: Yeah.

4623.72 4626.16 S: You wouldn't be like Arnold Schwarzenegger in Total Recall.

4626.16 4631.48 B: But talking reality today, it's a horrible place.

4631.48 4633.84 B: But it's not just the atmosphere, though.

4633.84 4638.28 B: It's the extreme ultraviolet and ionizing radiation.

4638.28 4639.28 B: It's so deadly.

4639.28 4643.76 B: How much deadlier do you think it's on Mars in terms of that radiation than on the Earth?

4643.76 4644.76 B: Ten times.

4644.76 4645.76 B: You're wrong, Jay.

4645.76 4646.76 B: Nine hundred times.

4646.76 4649.76 B: The radiation doses are 900 times...

4649.76 4652.80 B: I don't think I've ever been wrong so fast in my life.

4652.80 4654.52 B: You were wrong before you asked the question.

4654.52 4656.00 B: So calculations show...

4656.00 4657.76 B: Come on, this is a quickie.

4657.76 4658.76 B: This can't...

4658.76 4659.76 B: It's got to be under a minute, right?

4659.76 4660.76 B: I'm waiting for you to finish.

4660.76 4661.76 B: Okay.

4661.76 4666.08 B: So calculations show that only 2% of the UV would get through most of those caves.

4666.08 4667.08 B: Only 2%.

4667.08 4670.80 B: But enough light would get in that photosynthesis wouldn't be completely gone.

4670.80 4673.32 B: You'd still be able to take advantage of photosynthesis.

4673.32 4677.36 B: And they think that for the ionizing radiation, it would be the same.

4677.36 4678.36 B: Very little would get in.

4678.36 4681.56 B: So this could be a real haven for future colonists.

4681.56 4685.24 B: Also, while you're in a cave, you could also search for life, Jay.

4685.24 4689.08 B: Because if I'm living on Mars, I'm going in a cave, even if I'm a little microbe.

4689.08 4695.58 B: So there could be, who knows what kind of life could have developed on Mars eons ago in these caves.

4695.58 4699.00 B: So in the future, Arnold Schwarzenegger may say...

4699.00 4700.56 B: Get your ass to a cave on Mars.

4700.56 4701.56 B: Yes.

4701.56 4703.20 B: This was your Quickie with Bob.

4703.20 4705.20 B: I hope it was good for you too.

4705.20 4706.20 S: Okay.

4706.20 4707.68 S: Thank you, Jay.

Mystery Quotes (1:18:27)

4707.68 4712.50 S: So Evan worked up a little puzzle for us that we're going to do before we go to science or fiction.

4712.50 4713.50 S: Yes.

4713.50 4716.88 S: And he has given us a bunch of quotes.

4716.88 4718.88 S: And we have to figure out who said them.

4718.88 4720.40 E: Or who it's attributed to, right?

4720.40 4721.40 E: So you're going to play at home.

4721.40 4722.40 E: You're going to keep your own score.

4722.40 4725.84 E: We're going to ask each of the individuals here who they think the correct answer is.

4725.84 4728.78 E: The first quote: Science, my lad, is made up of mistakes.

4728.78 4734.52 E: But they are mistakes which it is useful to make because they lead little by little to the truth.

4734.52 4738.42 E: Was that written by Jules Verne, Ray Bradbury, or Edgar Rice Burroughs?

4738.42 4739.42 E: Let's go down the list.

4739.42 4740.42 E: Steve.

4740.42 4741.42 E: Bradbury.

4741.42 4742.42 E: George.

4742.42 4743.42 E: Bradbury.

4743.42 4744.42 B: Cara.

4744.42 4745.42 B: Verne?

4745.42 4746.42 B: Bob?

4746.42 4747.42 B: Doesn't sound like Bradbury to me.

4747.42 4748.42 B: I'll go with Edgar Rice Burroughs.

4748.42 4749.42 E: Okay.

4749.42 4750.42 E: And Jay?

4750.42 4751.42 E: Jules Verne.

4751.42 4752.42 E: If you answer Jules Verne, you are correct.

4752.42 4753.42 E: Nicely done, Jay.

4753.42 4754.42 E: J. Verne.

4754.42 4755.42 E: J. Verne to the center of the Earth.

4755.42 4756.42 E: Cara, you and me, baby.

4756.42 4757.42 E: Cara, too.

4757.42 4758.42 E: Nice.

4758.42 4759.78 E: Next, I despise the lottery.

4759.78 4766.12 E: There's less chance of you becoming a millionaire than there is of getting hit on the head by a passing asteroid.

4766.12 4767.12 E: We'll start at the end.

4767.12 4768.12 E: Jay.

4768.12 4769.12 E: That's Phil Plait.

4769.12 4770.12 B: Bob.

4770.12 4771.12 B: Gotta be Plait.

4771.12 4772.12 E: Cara.

4772.12 4773.12 C: I'll go with Phil.

4773.12 4774.12 GH: Yeah.

4774.12 4775.12 GH: George?

4775.12 4776.12 GH: I think it's not clever enough for Phil.

4776.12 4777.12 GH: I'm going to say, I'm going to say Vera.

4777.12 4778.60 GH: Steve.

4778.60 4780.60 C: So she's not clever?

4780.60 4781.60 C: No, no.

4781.60 4785.28 GH: Phil is known for his weird, you know, he'll like a lot of things.

4785.28 4786.28 E: Steve is correct.

4786.28 4787.28 E: It is Brian May.

4787.28 4788.28 E: Brian, oh.

4788.28 4789.28 E: Good old Brian.

4789.28 4790.28 E: Guitarist for Queen and an astrophysicist.

4790.28 4791.28 S: Between Vera Rubin and Phil Plait, yeah.

4791.28 4792.28 S: Next.

4792.28 4793.28 E: In terms of wittiness.

4793.28 4794.28 E: Who said this?

4794.28 4798.28 E: I was captured for life by chemistry and by crystals.

4798.28 4800.24 E: Was it Dorothy Hodgkin?

4800.24 4801.24 E: Was it Marie Curie?

4801.24 4802.24 E: Or Rosalind Franklin?

4802.24 4803.24 E: Steve.

4803.24 4804.24 E: Franklin.

4804.24 4805.24 E: George.

4805.24 4807.60 GH: I'm going to say Franklin as well.

4807.60 4808.60 GH: Cara.

4808.60 4811.32 C: X-ray crystallographer Rosalind Franklin.

4811.32 4812.32 E: And Bob.

4812.32 4813.32 E: Rosalind.

4813.32 4814.32 E: Jay.

4814.32 4815.88 S: Marie Curie.

4815.88 4816.88 E: Dorothy Hodgkin.

4816.88 4817.88 E: Oh my goodness.

4817.88 4818.88 None Everybody.

4818.88 4819.88 GH: Good distractor.

4819.88 4820.88 GH: Billy Crystal Kidnapper?

4820.88 4821.88 E: That's amazing.

4821.88 4822.88 E: Not a, yeah, exactly.

4822.88 4823.88 GH: Oh my god.

4823.88 4826.20 E: All right, let's try this one.

4826.20 4827.20 E: This is a fun one.

4827.20 4830.32 E: Your immune cells are like a circulating nervous system.

4830.32 4834.44 E: Your nervous system, in fact, is a circulating nervous system.

4834.44 4835.44 E: It thinks.

4835.44 4836.76 E: It is conscious.

4836.76 4839.20 E: Was that said by Deepak Chopra?

4839.20 4840.52 E: Andrew Weil?

4840.52 4843.48 E: Or Tenzin Gyatso, the 14th Dalai Lama?

4843.48 4844.48 E: Jay.

4844.48 4845.48 S: Oh my god.

4845.48 4848.04 S: I mean, you know, it's got to be Deepak.

4848.04 4849.04 B: Okay, Bob.

4849.04 4850.04 B: Tenzin.

4850.04 4851.04 B: Cara.

4851.04 4854.12 C: Yeah, I mean, it feels Chopra-esque, but it's almost not esoteric enough.

4854.12 4856.32 C: So I'll say it was the Dalai Lama.

4856.32 4857.32 GH: George.

4857.32 4858.32 GH: I'm going to say Chopra.

4858.32 4859.32 GH: Okay, Steve.

4859.32 4861.76 S: Yeah, I think it's too coherent for Chopra.

4861.76 4862.76 S: I'll say Weil.

4862.76 4865.76 E: According to the internet, it is Chopra.

4865.76 4868.76 S: His immune cells are like a circulating nervous system.

4868.76 4869.76 E: I win again.

4869.76 4870.76 E: I win again.

4870.76 4874.56 E: And if that's the most cogent or the most clear that Chopra's ever been, that's not good.

4874.56 4876.56 E: Last one.

4876.56 4878.28 C: So that was Chopra, not the random Chopra simulator.

4878.28 4879.28 C: Correct.

4879.28 4883.44 E: Not the Chopra engine that generates random insanity.

4883.44 4884.44 E: Final question, folks.

4884.44 4885.44 E: A fun one here.

4885.44 4886.44 E: To the movies we go.

4886.44 4896.36 E: It could mean that that point in time inherently contains some sort of cosmic significance, almost as if it were the temporal junction point for the entire space-time continuum.

4896.36 4899.20 E: On the other hand, it could just be an amazing coincidence.

4899.20 4909.24 E: Jeff Goldblum as David Levinson from Independence Day, Brad Pitt as Jeffrey Goines from 12 Monkeys, or Christopher Lloyd as Dr. Emmett Brown, Back to the Future.

4909.24 4910.24 S: Steve.

4910.24 4914.00 S: I mean, it sounds like Emmett Brown, but I don't remember it.

4914.00 4915.36 S: But I'll say Emmett Brown.

4915.36 4916.36 S: Okay, George.

4916.36 4917.88 E: Marty, it's Emmett Brown.

4917.88 4918.88 GH: It's me.

4918.88 4919.88 GH: 100%.

4919.88 4922.72 C: I think it's Doc Brown, too, but I'm going to go out on a limb.

4922.72 4928.08 C: I don't think it's from Independence Day because there was no time continuum there, but maybe it was from 12 Monkeys.

4928.08 4929.08 C: Maybe that was Brad Pitt.

4929.08 4930.08 E: Okay, Bob.

4930.08 4931.08 E: Brad Pitt.

4931.08 4932.08 E: And Jay.

4932.08 4933.08 E: Christopher Lloyd.

4933.08 4934.80 E: It is Christopher Lloyd, Dr. Emmett Brown.

4934.80 4937.96 C: Yeah, it sounded like Doc Brown for sure.

4937.96 4943.80 GH: Have you seen Rick and Morty with Christopher Lloyd as the live- Oh, that's great.

4943.80 4945.76 GH: No, it's five seconds long.

4945.76 4946.76 GH: It's just a little teaser.

4946.76 4947.76 GH: Christopher Lloyd as-

4947.76 4948.76 S: It's amazing.

4948.76 4950.76 E: I love him. That's the Mystery Quotes segment.

4950.76 4951.76 S: I hope you enjoyed it.

4951.76 4952.76 E: I hope you all scored well at home.

4952.76 4953.76 E: That was fun, Emmett.

4953.76 4954.76 E: Thanks, Evan.

Science or Fiction (1:22:35)

Answer: Item
Fiction: Yuzu largest culinary fruit
Science: Jabuticaba berries
Science: Pawpaw: rotting flesh, neurotoxic

Host: Result
Steve: win

Rogue: Guess
Bob: Yuzu largest culinary fruit
Jay: Yuzu largest culinary fruit
George: Jabuticaba berries
Cara: Yuzu largest culinary fruit
Evan: Yuzu largest culinary fruit

Voice-over: It's time for Science or Fiction.

Theme: Fruit

Item #1: Jabuticaba berries, native to Brazil, are the size of plums but taste like grapes and grow directly on the trunk of the jabuticaba tree.[6]
Item #2: The pawpaw is a sought-after tropical fruit relative native to the eastern United States with flowers that smell like rotting flesh and fruit that contains a high concentration of neurotoxin.[7][8]
Item #3: The Yuzu is an Asian tree fruit that is the largest culinary fruit in the world, with long tubular fruit weighing over 80 pounds.[9]


Bob's Response

Jay's Response

George's Response

Cara's Response

Evan's Response

Listeners' Top Response

Steve Explains Item #1

Steve Explains Item #2

Steve Explains Item #3

4954.76 4959.76 C: It's time for Science or Fiction.

4959.76 4969.36 S: All right, we have just enough time for a quick Science or Fiction.

4969.36 4970.36 S: We're going to have to move quickly here.

4970.36 4971.36 S: Okay, Fiction.

4971.36 4972.36 S: This is a theme.

4972.36 4973.36 S: The theme is fruit.

4973.36 4975.40 S: The theme is fruit.

4975.40 4976.40 S: It's all unusual fruit.

4976.40 4977.40 S: Help, help.

4977.40 4978.40 S: I need fruit.

4978.40 4979.40 S: All right, here we go.

4979.40 4990.12 S: The Jabuticaba berries native to Brazil are the size of plums but taste like grapes and grow directly on the trunk of the Jabuticaba tree.

4990.12 5000.80 S: The pawpaw is a sought-after tropical fruit relative native to the eastern United States with flowers that smell like rotting flesh and fruit that contains a high concentration of neurotoxin.

5000.80 5010.44 S: Number three, the yuzu is an Asian tree fruit that is the largest culinary fruit in the world with long tubular fruit weighing over 80 pounds.

5010.44 5022.84 S: Now I had to throw in culinary there because pumpkins are technically fruit and pumpkins are the largest true fruit in the world but this is what we think of it more as it's a culinary vegetable.

5022.84 5024.96 S: This is something that is a culinary fruit.

5024.96 5026.76 S: I thought it was a culinary fruit.

5026.76 5027.76 S: Culinary whatever.

5027.76 5030.76 S: All right, so Bob, go first.

5030.76 5035.56 E: Hold on, I'm going to put the link in chat.

5035.56 5044.48 S: So Ian will put a link in the chat to a survey where while we're giving our answers, the rogues are giving their answers, you can vote for the one that you think is.

5044.48 5047.84 S: We'll check in with you when the rogues are done and then we'll do the reveal.

5047.84 5048.84 S: Go ahead, Bob.

5048.84 5052.20 B: All right, I'll say the yuzu is fiction.

5052.20 5053.20 S: That's it?

5053.20 5054.20 S: Yeah.

5054.20 5055.20 S: I got to be quick, right?

5055.20 5056.20 S: Okay, you've got 10 minutes.

5056.20 5057.20 S: Not that great.

5057.20 5058.20 S: All right, so let's see.

5058.20 5059.20 S: That's fine.

5059.20 5060.20 S: You're good.

5060.20 5061.20 S: But Jay's down.

5061.20 5062.20 S: Jay.

5062.20 5065.20 S: All right, the Jabuticaba berries.

5065.20 5066.20 S: Jabuticaba?

5066.20 5070.92 C: Jay, for the next five minutes, I just want you to say them.

5070.92 5074.68 S: I believe that one is science.

5074.68 5076.40 S: The size of plums tastes like grapes.

5076.40 5078.32 S: Yep, that one is science to me.

5078.32 5083.72 S: The pawpaw, I mean, I've heard about some type of thing that smelled like rotting flesh, some type of plant.

5083.72 5089.04 S: I don't know if it's this one, but that is a little bit of memory there.

5089.04 5091.80 S: So I think that one is real.

5091.80 5094.48 S: The yuzu is an Asian tree fruit that is a large.

5094.48 5098.16 S: It's basically an 80-pound piece of tubular fruit.

5098.16 5099.28 S: I don't believe that's real.

5099.28 5100.76 S: I think I would have heard of it.

5100.76 5101.76 S: That one is fiction.

5101.76 5102.76 S: Okay, George.

5102.76 5110.40 GH: I think the Jabuticaba is the fiction, even though you said it with such authority, Steve.

5110.40 5115.00 GH: It felt like you practiced it a little bit too much to really make it sound like it's a real thing.

5115.00 5120.92 GH: So I'm guessing, because it was too well done, I'm going to say that number one is the fiction.

5120.92 5121.92 S: All right.

5121.92 5122.92 S: Interesting logic.

5122.92 5123.92 S: Cara?

5123.92 5129.20 C: I have to say the yuzu is the fiction because I've eaten yuzu before, and it looks like a lemon kind of.

5129.20 5133.48 C: I don't know what this crazy 80-pound yuzu is.

5133.48 5134.48 C: I got to go with you.

5134.48 5136.20 C: I mean, don't you guys... I don't get it.

5136.20 5137.20 C: Is this an LA thing?

5137.20 5138.20 C: You guys have never had yuzu?

5138.20 5139.20 C: No, I've never even heard of it.

5139.20 5140.20 C: No?

5140.20 5141.20 E: How are you?

5141.20 5142.20 E: It sounds like a website.

5142.20 5145.12 C: No, it's on every Asian dessert menu.

5145.12 5146.68 S: You should have your last date.

5146.68 5149.04 S: Cara, I haven't been out of the house in two years.

5149.04 5151.04 S: I don't even know what's happening right now.

5151.04 5153.72 C: You got to eat more Asian food.

5153.72 5154.72 E: Evan?

5154.72 5157.24 E: I'll go with Cara and Bob.

5157.24 5158.24 E: Yuzu, fiction.

5158.24 5160.92 C: Maybe there's a different variant that I don't know about.

5160.92 5165.16 E: Even without Cara's help, I think I would have chosen that as the fiction.

5165.16 5166.16 S: Okay.

5166.16 5169.48 S: Ian, do we have a consensus from the listeners?

5169.48 5170.48 B: Relatively.

5170.48 5172.60 B: Here it comes.

5172.60 5173.60 C: Seems like it's all over the place now.

5173.60 5174.60 C: Oh, wow.

5174.60 5175.60 E: Yeah.

5176.60 5177.60 E: Pawpaw.

5176.60 5177.60 E: Oh, you can make pies out of this fruit.

5177.60 5178.60 S: Look at that.

5178.60 5179.60 S: And then yuzu and the jabuticaba berries.

5179.60 5180.60 S: All right.

5180.60 5187.72 S: So the winner as the fiction is the pawpaw among the listeners.

5187.72 5189.52 S: So let's take these in order.

5189.52 5191.20 C: Why would that be sought after?

5191.20 5192.20 C: I guess for medicine?

5192.20 5199.20 S: Jabuticaba berries, native to Brazil, are the size of plums but taste like grapes and grow directly on the trunk of the jabuticaba tree.

5199.20 5203.04 S: That's weird, actually, now that I think about it.

5203.04 5208.52 S: George and 27% of the listeners, 25%, think this one is the fiction.

5208.52 5210.88 S: I'll answer it with a picture.

5210.88 5211.88 S: Oh, geez.

5211.88 5212.88 S: Yep.

5212.88 5213.88 S: Look at that.

5213.88 5214.88 S: Is that a basing?

5214.88 5215.88 S: Where is it?

5215.88 5216.88 S: Where is it?

5216.88 5217.88 S: Now, George, on the trunk.

5217.88 5218.88 S: Wow.

5218.88 5219.88 S: Excuse me, George.

5219.88 5220.88 S: If you were a frequent Reddit user, you would have gotten this correct.

5220.88 5221.88 S: Is that where I saw it, James?

5221.88 5222.88 S: Yes, it is.

5222.88 5223.88 S: That's cool.

5223.88 5224.88 S: Yeah, they grow right on the trunk and the major branches.

5224.88 5225.88 S: Some of the pictures are just amazing.

5225.88 5226.88 S: It's so weird.

5226.88 5227.88 S: And they're basically like grapes the size of a plum.

5227.88 5228.88 S: And they taste like grapes?

5228.88 5229.88 S: Yeah.

5229.88 5230.88 S: Can you eat them?

5230.88 5231.88 S: On the trunk.

5231.88 5232.88 S: Yes, yes.

5232.88 5233.88 S: They're supposed to be delicious.

5233.88 5236.88 E: Are there other fruits that grow on trunks of trees?

5236.88 5238.88 S: I've never seen a tree before that I want to lick.

5238.88 5239.88 S: Right?

5239.88 5240.88 S: Interesting.

5240.88 5241.88 S: If you go to the trunk, they'll pick them up.

5241.88 5242.88 S: I know.

5242.88 5243.88 S: Just chew on them.

5243.88 5244.88 S: They don't have stems or anything.

5244.88 5245.88 S: Just chew on the tree and start chewing.

5245.88 5246.88 S: I've never heard of that before.

5246.88 5247.88 C: All right.

5247.88 5248.88 C: Oops.

5248.88 5249.88 S: Going the wrong way.

5249.88 5250.88 S: Tastes like garbage, though.

5250.88 5251.88 S: All right.

5251.88 5252.88 S: Here we go.

5252.88 5258.40 S: The pawpaw is a sought-after tropical fruit relative native to the eastern United States with flowers that smell like rotting flesh and fruit that contains a high concentration of neurotoxin.

5258.40 5264.72 S: None of the rogues thought that was the fiction, but the majority of the listeners think this one is the fiction.

5264.72 5265.72 S: Yeah.

5265.72 5266.72 S: Interesting.

5266.72 5270.92 S: The next one is science.

5270.92 5271.92 S: There is a pawpaw.

5271.92 5272.92 S: Wow.

5272.92 5273.92 S: Ew.

5273.92 5274.92 S: Apparently, it's delicious.

5274.92 5275.92 S: It looks like a mitochondrion.

5275.92 5281.88 S: I actually have a pawpaw tree growing in my backyard.

5281.88 5282.88 S: You do?

5282.88 5285.20 S: You can see it if you want to, but it's never fruited.

5285.20 5287.80 S: It's just not old enough yet to fruit.

5287.80 5289.24 S: It is on Instagram.

5289.24 5294.84 S: I bought it because it's a native Connecticut fruit-bearing tree, but it has a neurotoxin that can kill you.

5294.84 5298.68 C: Yeah, why would you eat something that has a neurotoxin in it?

5298.68 5306.48 S: After I purchased it and I started to research it, I'm like, oh, the flowers are pollinated by flies.

5306.48 5311.12 S: They replicate this odor of rotting flesh in order to attract flies.

5311.12 5313.40 S: No, to attract the flies.

5313.40 5317.78 S: If you have a lot of the trees, apparently, then they attract flies well.

5317.78 5324.60 S: If you have only one or two, they actually suggest that you can hang rotting meat on the tree in order to help attract flies.

5324.60 5325.60 S: Oh my God.

5325.60 5331.56 S: Although many references say, nah, you're probably not going to want to do that, so you can just hand pollinate it.

5331.56 5333.88 S: If you have one tree, just hand pollinate it.

5333.88 5335.28 S: Steve, does it have to be human flesh?

5335.28 5338.88 S: You don't have to hang roadkill on your pawpaw tree.

5338.88 5349.28 S: Yes, it has what one reference calls a high concentration of a neurotoxin that affects neurons in your brain.

5349.28 5356.04 S: The concern is that if you have frequently consumed the pawpaw fruit over years, that it may actually cause brain damage.

5356.04 5358.44 S: It may actually cause toxicity.

5358.44 5360.20 S: There is no fruit good enough that would want...

5360.20 5364.48 S: But this is considered a delicacy.

5364.48 5365.88 S: It is a commercial fruit.

5365.88 5369.88 S: It is highly sought after by foodies, by people who are aware of it.

5369.88 5370.88 S: Like blowfish.

5370.88 5373.16 S: Yeah, and it is a tropical relative.

5373.16 5374.16 S: It's like a tropical fruit.

5374.16 5377.12 S: It's like a guava or something like that.

5377.12 5378.52 S: I haven't tasted one yet.

5378.52 5380.76 S: I kind of got reluctant when I read about the neurotoxin.

5380.76 5381.76 S: Yeah, yeah.

5381.76 5385.28 S: Steve, somebody in the chat said they taste like a custardy banana.

5385.28 5386.28 S: A custardy banana?

5386.28 5388.40 S: That sounds pretty good.

5388.40 5392.44 S: So all of this means that the...

5392.44 5394.60 C: And apologies guys on this one.

5394.60 5395.60 C: I thought I was last.

5395.60 5396.60 C: It's all right, Cara.

5396.60 5397.60 C: That's okay, Cara.

5397.60 5398.60 C: No problem.

5398.60 5399.60 C: I would not have said that if I...

5399.60 5400.60 C: Cara, I screwed up recently.

5400.60 5401.60 S: Don't worry about it.

5401.60 5402.60 S: We all get mistakes.

5402.60 5403.60 S: Yeah, yeah.

5403.60 5409.52 S: The yuzu is an Asian fruit that is the longest culinary fruit in the world, with long tubular fruit weighing over 80 pounds.

5409.52 5411.52 S: I actually meant for you to go last.

5411.52 5417.72 S: I just forgot because I was worried that this was a California thing because we have... I've never heard of or seen this fruit.

5417.72 5419.40 S: Never even heard of it before.

5419.40 5423.92 C: I think it's just that in LA, we have so many restaurants and so much food from the world.

5423.92 5424.92 C: It's a Japanese lemon.

5424.92 5425.92 E: The Asian influence on the West Coast is beyond any restaurants on the East Coast.

5425.92 5426.92 S: It is a citrus.

5426.92 5427.92 S: It is citrus.

5427.92 5428.92 S: It is Japanese.

5428.92 5429.92 S: It's citrus.

5429.92 5430.92 S: Yeah.

5430.92 5433.80 S: So there it is on the left picture.

5433.80 5436.28 S: That's the actual yuzu.

5436.28 5437.88 S: Now on the right is what?

5437.88 5439.28 S: Who recognizes that picture?

5439.28 5443.84 S: On the right is the actual- That's the- What's that?

5443.84 5444.84 S: The actual largest fruit in the world.

5444.84 5445.84 S: Yeah, I have cancer.

5445.84 5446.84 S: Durian?

5446.84 5447.84 GH: Bigger than a watermelon.

5447.84 5448.84 GH: Durian.

5448.84 5449.84 S: Is that what it is?

5449.84 5450.84 E: Durian?

5450.84 5451.84 E: Jackfruit.

5451.84 5452.84 E: Jackfruit.

5452.84 5453.84 S: Yeah, yeah.

5453.84 5454.84 S: We ate it.

5454.84 5455.84 S: We ate it.

5455.84 5456.84 S: It's like shredded chicken.

5456.84 5457.84 S: Is that culinary, Steve?

5457.84 5458.84 S: What's happening?

5458.84 5459.84 C: Jackfruit, it grows on trees.

5459.84 5460.84 C: That thing's hanging from a tree.

5460.84 5461.84 S: It's huge.

5461.84 5462.84 S: Yeah.

5462.84 5463.84 S: It's like a watermelon.

5463.84 5464.84 S: Yeah, it's like a vegan substitution.

5464.84 5465.84 S: They weigh 80 pounds.

5465.84 5466.84 E: You can make chili with it and stuff.

5466.84 5467.84 E: That's the actual largest fruit.

5467.84 5468.84 C: Wow.

5468.84 5471.44 C: I think Trader Joe's sells jackfruit sloppy joes.

5471.44 5477.36 GH: I literally have cans of Trader Joe's jackfruit in my- Those are big cans, George.

5477.36 5478.36 E: Big cans.

5478.36 5479.36 E: Yeah.

5479.36 5480.36 GH: The biggest cans in the world.

5480.36 5481.36 GH: No, it's cool.

5481.36 5483.48 GH: You cook it and it just shreds just like chicken.

5483.48 5486.84 GH: If you spice it right, it's almost, almost- You can't tell it's not meat.

5486.84 5487.84 GH: It's pretty much-

5487.84 5489.84 S: Yeah, yeah. It's a fruit.

5489.84 5490.84 S: It's kind of a meat substitute.

5490.84 5491.84 S: I should have got that right, but I did.

5491.84 5492.84 S: There you go.

5492.84 5493.84 S: Jackfruit.

Skeptical Quote of the Week (1:31:32)

Trust in science has a critical role to play with respect to increasing public support for science funding, enhancing science education and separating trustworthy from untrustworthy sources. However, trust in science does not fix all evils and can create susceptibility to pseudoscience if trusting means not being critical.
Dolores Albarracín, director of the Science of Science Communication Division and the Social Action Lab at the University of Pennsylvania's Annenberg Public Policy Center.

5493.84 5494.84 S: Evan, you have one minute to give us a quote.

5494.84 5495.84 E: All right.

5495.84 5498.60 E: Here's the actual quote to wrap up the show tonight.

5498.60 5510.60 E: Trust in science has a critical role to play with respect to increasing public support for science funding, enhancing science education, and separating trustworthy from untrustworthy sources.

5510.60 5521.40 E: However, trust in science does not fix all evils and it can create susceptibility to pseudoscience if trusting means not being critical.

5521.40 5525.32 E: Said by Dolores Albarracín; I hope I pronounced that correctly.

5525.32 5533.84 E: She is the director for the Science of Science Communication Division at the University of Pennsylvania's Annenberg Public Policy Center.

5533.84 5536.80 E: I love that there is a science of science communications.

5536.80 5537.80 E: Exactly.

5537.80 5538.80 S: See, George Shih gets it.

5538.80 5539.80 S: Yes.

5539.80 5540.80 S: All right.

5540.80 5541.80 S: Thank you.

5541.80 5544.80 S: Yeah, so is that one of the authors of the study?

5544.80 5545.80 S: Yes, right.

5545.80 5550.64 E: This is one of the co-authors of the study that we referred to earlier when we were talking about the news item.

5550.64 5552.64 S: Yes.

5552.64 5553.64 S: Absolutely.

5553.64 5554.64 S: Let's have her on the show.

5554.64 5556.24 S: So there is a science of science communication.

5556.24 5559.28 S: We follow it very closely because it's what we do.

5559.28 5569.72 S: And I guess this is self-serving and I try to be skeptical of this, but the fact is we've been saying this for 30 years: that you can't just teach science.

5569.72 5570.72 S: You have to teach about pseudoscience.

5570.72 5571.72 S: You have to teach critical thinking.

5571.72 5574.24 S: You have to teach about the mechanisms of self-deception.

5574.24 5576.64 S: You've got to teach all of that.

5576.64 5581.88 S: And over the last 30 years, the research has shown that we were absolutely correct.

5581.88 5588.08 S: The scientific skepticism approach to science communication is far and away the most evidence-based.

5588.08 5595.36 S: And it's been very satisfying to follow this research over the last 30 years and go, yeah, this is what we've been saying all along.

5595.36 5600.20 S: Obviously, it's filling in a lot of the details and informing what we do.

5600.20 5601.60 S: And there's a lot of nuance here.

5601.60 5603.32 S: There's a lot of details that we didn't know obviously.

5603.32 5610.20 S: But just the big picture of you've got to teach about pseudoscience is absolutely evidence-based and correct.

Signoff/Announcements (1:33:30)

5610.20 5611.20 S: All right.

5611.20 5612.20 S: Thank you, George.

5612.20 5613.20 S: Thank you, Ian, for running the tech.

5613.20 5616.84 S: Buy the book, as it says on the screen there; buy The Skeptics' Guide to the Universe book.

5616.84 5618.40 S: Give it as a gift to somebody.

5618.40 5619.96 S: Thank you, Cara, for joining us from Cal Poly.

5619.96 5620.96 S: Thank you, Cara.

5620.96 5622.92 S: Thank you, Derek, for inviting us to DragonCon again.

5622.92 5624.22 S: Sorry we couldn't be there live.

5624.22 5626.44 S: We hope that you guys enjoyed our stream.

5626.44 5628.16 S: Enjoy the rest of DragonCon.

5628.16 5629.16 S: Stay skeptical.

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


Today I Learned


Notes

References

Vocabulary

