SGU Episode 893
August 20th 2022
Alex Jones' lawyer accidentally sent two years' worth of texts to plaintiffs' lawyers

Skeptical Rogues
S: Steven Novella
B: Bob Novella
J: Jay Novella
E: Evan Bernstein

Guests
AJR: Andrea Jones-Rooy
KB: Kelly Burke, from Guerrilla Skeptics
GH: George Hrab, NECSS emcee
IC: Ian Callanan, SGU tech guru

Quote of the Week
An educated person is one who has learned that information almost always turns out to be at best incomplete and very often false, misleading, fictitious, mendacious – just dead wrong.
Russell Baker, American journalist

Links
Download Podcast
Show Notes
Forum Discussion
Introduction, Live from NECSS, Book Update
- Perry DeAngelis Memorial Episode
Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.
[00:13.060 --> 00:18.960] This is your host, Steven Novella, and today is August 6th, 2022.
[00:18.960 --> 00:20.680] Joining me this week are Bob Novella.
[00:20.680 --> 00:21.680] Hey everybody.
[00:21.680 --> 00:22.680] Jay Novella.
[00:22.680 --> 00:23.680] Hey guys.
[00:23.680 --> 00:24.680] Evan Bernstein.
[00:24.680 --> 00:25.680] Hello everyone.
[00:25.680 --> 00:29.000] And we have two in-studio guests, Kelly Burke.
[00:29.000 --> 00:30.000] Hello.
[00:30.000 --> 00:31.000] Kelly, welcome to the SGU.
[00:31.000 --> 00:32.480] This is your first time on the show.
[00:32.480 --> 00:33.480] It is.
[00:33.480 --> 00:34.480] And Andrea Jones-Rooy.
[00:34.480 --> 00:35.960] Andrea, welcome back to the SGU.
[00:35.960 --> 00:36.960] Hello.
[00:36.960 --> 00:37.960] Thank you for having me.
[00:37.960 --> 00:39.400] Thank you all for joining me in the studio.
[00:39.400 --> 00:42.160] You were here live, so we had to have you on the show.
[00:42.160 --> 00:46.640] Now Kara was going to join us for this episode, she was going to join us remotely, but as
[00:46.640 --> 00:50.760] you remember, she had surgery not too long ago and she's going through a bit of a rough
[00:50.760 --> 00:51.760] patch.
[00:51.760 --> 00:57.360] She did want us to say that Kara does struggle with depression and she's having a depressive
[00:57.360 --> 01:02.840] episode partly due to hormones and the surgery and everything that's going on.
[01:02.840 --> 01:06.760] She's dealing with it, but that has to be her priority this weekend to deal with that.
[01:06.760 --> 01:09.720] And so she decided to not do the show.
[01:09.720 --> 01:13.480] So we wish her well, she'll be back for next week's show.
[01:13.480 --> 01:16.920] But we have six people instead of five to make up for it.
[01:16.920 --> 01:25.440] So as you all know, this episode, every year, this is our Perry DeAngelis Memorial episode
[01:25.440 --> 01:32.440] and before it was NECSS, it was just the Perry DeAngelis Memorial SGU episode, then it basically
[01:32.440 --> 01:37.640] morphed into NECSS and we kept it as the episode where we remember our lost rogue, Perry.
[01:37.640 --> 01:39.960] Damn, that was 15 years ago, guys.
[01:39.960 --> 01:40.960] I know, oh my gosh.
[01:40.960 --> 01:45.480] He was with us for two years and it's been 15 years since he was on the show.
[01:45.480 --> 01:46.480] It's just unbelievable.
[01:46.480 --> 01:51.080] And of course, we remember many of the friends that we lost along the way, David Young, Michael
[01:51.080 --> 01:58.000] Oreticelli, all lost too young, too soon, all really, really good friends of the SGU.
[01:58.000 --> 02:00.240] So we like to remember them every year.
[02:00.240 --> 02:01.240] Okay.
[02:01.240 --> 02:05.880] So as you all know, I have an announcement from our book publisher, if you don't mind.
[02:05.880 --> 02:06.880] Oh yes, go ahead, Jay.
[02:06.880 --> 02:12.000] That book is the result of an incredible amount of work.
[02:12.000 --> 02:16.560] So this book is about, it's about science, it's about the history of science and it's
[02:16.560 --> 02:20.440] about making predictions on future technology.
[02:20.440 --> 02:23.600] Historically and modern day.
[02:23.600 --> 02:24.960] And we had a lot of fun writing the book.
[02:24.960 --> 02:31.320] It was really intense, but it's an archive now of incredible information about predictions
[02:31.320 --> 02:34.720] that were made in the past, predictions that were made five, 10 years ago, predictions
[02:34.720 --> 02:36.800] that are made today.
[02:36.800 --> 02:41.760] We also wrote some science fiction for this to illustrate some interesting future concepts
[02:41.760 --> 02:42.760] of technology.
[02:42.760 --> 02:43.760] Yeah.
[02:43.760 --> 02:45.280] What I like is that it's also a time capsule.
[02:45.280 --> 02:47.400] It's like our own time capsule for the future.
[02:47.400 --> 02:50.320] So future generations can look back and see how we did.
[02:50.320 --> 02:53.280] Just like we are looking back at the past future and see how they did.
[02:53.280 --> 02:55.920] I hope they don't laugh at us the way we've been laughing at them.
[02:55.920 --> 02:56.920] Yeah.
[02:56.920 --> 02:58.400] We were talking about the Jetsons earlier.
[02:58.400 --> 02:59.400] It's not-
[02:59.400 --> 03:00.400] Right.
[03:00.400 --> 03:01.400] 2022.
[03:01.400 --> 03:02.400] Happy birthday, George.
[03:02.400 --> 03:03.400] Yeah.
[03:03.400 --> 03:07.840] If you go to skepticsguidetothefuturebook.com, is that right?
[03:07.840 --> 03:08.840] Skepticsguidetothefuturebook.com.
[03:08.840 --> 03:14.600] And you fill out the form there and you put in the secret password, which is the word
[03:14.600 --> 03:15.600] future.
[03:15.600 --> 03:16.600] Don't tell anybody.
[03:16.600 --> 03:17.600] It's a secret.
[03:17.600 --> 03:18.600] That's clever.
[03:18.600 --> 03:19.600] Come up with that.
[03:19.600 --> 03:25.080] And you will be entered into a giveaway of the very first signed copy of the book.
[03:25.080 --> 03:26.080] Wow.
[03:26.080 --> 03:32.040] So please go to skepticsguidetothefuturebook.com and the secret password, George, what's that
[03:32.040 --> 03:33.040] secret password?
[03:33.040 --> 03:34.040] Flabbing garbage.
[03:34.040 --> 03:35.040] Flabbing garbage.
[03:35.040 --> 03:39.560] Or spelled as in non-George language, future.
[03:39.560 --> 03:40.560] The word is future.
[03:40.560 --> 03:41.560] All right.
[03:41.560 --> 03:47.300] So this slide is just to remind everybody that the theme of NECSS 2022, the 14th NECSS
[03:47.300 --> 03:50.280] is the misinformation apocalypse.
[03:50.280 --> 03:52.960] You've had a lot of talk so far about it.
[03:52.960 --> 03:58.240] That theme might crop up on the SGU show this weekend.
[03:58.240 --> 04:01.680] But we've tried to focus on the positive, right guys?
[04:01.680 --> 04:04.360] We don't just want to say how bad it is.
[04:04.360 --> 04:06.960] We want to focus on what you can do about it.
[04:06.960 --> 04:08.800] And there's a lot of things you can do about it.
[04:08.800 --> 04:11.600] All the things that we try to do every week.
[04:11.600 --> 04:16.860] Understand science better and be more critical thinking, be more humble, know how to communicate,
[04:16.860 --> 04:22.680] be more positive when you communicate, understand how to access information over media, how
[04:22.680 --> 04:23.680] that all works.
[04:23.680 --> 04:27.600] We'll hit some of those themes during the show today as well.
[04:27.600 --> 04:34.000] Well, it's always awesome as one of the people that organize NECSS, we talk to the speakers,
[04:34.000 --> 04:37.440] but we don't get an incredible amount of details about what their talk is going to be.
[04:37.440 --> 04:40.200] Because we're just trusting professionals.
[04:40.200 --> 04:42.400] There's some conversation, but it's not detailed.
[04:42.400 --> 04:43.400] Always a bit of a throw of the dice.
[04:43.400 --> 04:44.400] Yeah.
[04:44.400 --> 04:45.400] It has to be.
[04:45.400 --> 04:47.640] It has to be the people and not the talk, basically, right?
[04:47.640 --> 04:53.260] So when we get to hear the talk and we get to see how it folds into our theme and what
[04:53.260 --> 04:56.880] information that they cover, it's always a fun discovery, right?
[04:56.880 --> 04:58.560] Like, oh my God, that's cool.
[04:58.560 --> 05:02.280] It's more relevant than I thought, or they went into an area I didn't expect them to
[05:02.280 --> 05:03.280] go.
[05:03.280 --> 05:06.080] I thought your talk this morning was a lot of fun that you had with Richard Wiseman.
[05:06.080 --> 05:07.080] Oh yeah.
[05:07.080 --> 05:08.080] Richard Wiseman's like-
[05:08.080 --> 05:09.080] That was really cool.
[05:09.080 --> 05:10.080] Very easy to talk to about stuff like that.
[05:10.080 --> 05:11.080] Definitely.
[05:11.080 --> 05:12.080] Always a pleasure.
[05:12.080 --> 05:16.560] You just gave your talk on political science and misinformation, which is obviously a huge
[05:16.560 --> 05:17.980] intersection there.
[05:17.980 --> 05:18.980] So there's still a lot to learn.
[05:18.980 --> 05:24.640] I think after doing this for 17 years, it doesn't amaze me, but it's always fascinating
[05:24.640 --> 05:30.520] how much we still have to learn about something that we've been knee-deep in for a quarter
[05:30.520 --> 05:31.520] of a century.
[05:31.520 --> 05:37.480] We've been doing this skepticism since '96, 26 years, the podcast for 17 of those years.
[05:37.480 --> 05:43.760] But there's just so much depth, and it's getting deeper, which is the good thing, is that this
[05:43.760 --> 05:45.400] is not a static field.
[05:45.400 --> 05:46.400] It's dynamic.
[05:46.400 --> 05:50.720] We are actually learning more, and we have to update ourselves.
[05:50.720 --> 05:51.720] Constantly.
[05:51.720 --> 05:52.720] Yeah.
[05:52.720 --> 05:56.620] Well, the world... I mean, look at what's happened since this podcast began.
[05:56.620 --> 06:01.800] Look how much things have changed, like you were talking about time capsules.
[06:01.800 --> 06:10.000] Just take a look at what happened in science, the skeptical community itself, and in politics.
[06:10.000 --> 06:13.400] The world has changed so much, and we're running to try to keep up with it, if anything.
[06:13.400 --> 06:14.960] It's not an easy thing to do.
[06:14.960 --> 06:19.920] Yeah, I mean, we're focused on almost entirely different issues now than we were at the start.
[06:19.920 --> 06:24.320] The things that are important or critical have been constantly evolving.
[06:24.320 --> 06:27.680] And then some things come back and stay the same, like we're doing UFOs again.
[06:27.680 --> 06:29.560] Really, we're all at the same point.
[06:29.560 --> 06:31.960] So some things are different, some things are the same.
[06:31.960 --> 06:32.960] It's always interesting.
[06:32.960 --> 06:35.880] I'm just waiting for Bigfoot to become relevant again.
[06:35.880 --> 06:40.200] How long before Bigfoot is like, we're talking about Bigfoot again?
[06:40.200 --> 06:42.360] Yeah, that's one of those eternal ones that never goes away.
[06:42.360 --> 06:45.320] I saw a Loch Ness Monster post on Twitter recently.
[06:45.320 --> 06:46.320] Oh, yeah.
[06:46.320 --> 06:47.320] They're like, oh, new evidence.
[06:47.320 --> 06:48.320] Loch Ness Monster.
[06:48.320 --> 06:49.320] New evidence.
[06:49.320 --> 06:50.320] I love that.
[06:50.320 --> 06:51.320] New evidence.
[06:51.320 --> 06:52.320] Never goes away.
[06:52.320 --> 06:53.320] Yeah.
[06:53.320 --> 06:54.320] New blurry photos.
[06:54.320 --> 06:58.120] Even after it's been definitively... The guy confessed, yeah, that was me.
[06:58.120 --> 06:59.120] I made the first...
[06:59.120 --> 07:00.120] Oh, yeah.
[07:00.120 --> 07:01.120] Awesome.
[07:01.120 --> 07:02.120] Steve, that's an admission that there's an Illuminati.
[07:02.120 --> 07:03.120] Yeah.
[07:03.120 --> 07:04.120] That's what that is.
[07:04.120 --> 07:05.120] Come on, man.
[07:05.120 --> 07:06.120] All right.
[07:06.120 --> 07:08.280] Let's get to some fun stuff.

Special Segment: Chorizo Hoax (7:09)
[07:08.280 --> 07:09.280] What do you guys think that is?
[07:09.280 --> 07:10.280] I'm showing a picture on the screen there.
[07:10.280 --> 07:11.280] Ooh, ask C.
[07:11.280 --> 07:12.280] That is a meeple.
[07:12.280 --> 07:17.640] I mean, whoever made it needs a little help, but we'll get there.
[07:17.640 --> 07:18.640] You're close.
[07:18.640 --> 07:28.120] So a French scientist spread this picture on Twitter, claiming that it was a close-up
[07:28.120 --> 07:31.480] photo from it through... Did he mention the telescope?
[07:31.480 --> 07:32.480] James Webb.
[07:32.480 --> 07:33.480] James Webb.
[07:33.480 --> 07:36.480] This is a James Webb close-up of Alpha Centauri.
[07:36.480 --> 07:37.480] Proxima Centauri.
[07:37.480 --> 07:38.480] Did he say Proxima?
[07:38.480 --> 07:39.480] I think he did.
[07:39.480 --> 07:40.480] I think he said Proxima.
[07:40.480 --> 07:43.160] Proxima Centauri, the closest star to the Earth.
[07:43.160 --> 07:47.720] And it was pretty much believed by a lot of people.
[07:47.720 --> 07:52.120] Turns out that is a picture of essentially a slice of chorizo.
[07:52.120 --> 07:54.120] I love it.
[07:54.120 --> 07:56.120] Oh, my God.
[07:56.120 --> 07:57.120] It's so awesome.
[07:57.120 --> 08:01.120] Yeah, we're having chorizo after the conference is over, because that's great.
[08:01.120 --> 08:02.120] Hey, can we have some of that?
[08:02.120 --> 08:04.120] Look at all those solar swirls in that chorizo.
[08:04.120 --> 08:06.120] He must have looked at it and goes, you know what?
[08:06.120 --> 08:09.120] This kind of looks like those blurry photos, close-ups of the sun.
[08:09.120 --> 08:11.120] I wonder how many people I can get to believe that.
[08:11.120 --> 08:12.120] You know what I love about this?
[08:12.120 --> 08:13.120] It's not even cropped.
[08:13.120 --> 08:15.120] That is the shape of the piece of-
[08:15.120 --> 08:16.120] Yeah, that's it.
[08:16.120 --> 08:17.120] Right?
[08:17.120 --> 08:18.120] That's it.
[08:18.120 --> 08:19.120] That's the whole thing.
[08:19.120 --> 08:20.120] That's great.
[08:20.120 --> 08:21.120] It is funny.
[08:21.120 --> 08:22.120] There is a superficial resemblance.
[08:22.120 --> 08:28.120] He later apologized, I'm not sure he had to, but we talk about this at times.
[08:28.120 --> 08:31.120] He's a scientist, and he pulled a little prank.
[08:31.120 --> 08:36.120] I thought it was pretty harmless, and the point was be a little bit more skeptical before
[08:36.120 --> 08:38.120] you believe things.
[08:38.120 --> 08:43.120] But I do agree that it's problematic to have a scientist doing it, because then we're
[08:43.120 --> 08:49.120] simultaneously saying consider the credentials of the person that you're listening to or
[08:49.120 --> 08:51.120] that you're getting information from.
[08:51.120 --> 08:53.120] If people are saying, hey, no, a scientist put this up.
[08:53.120 --> 08:56.120] This wasn't just some random guy on the internet.
[08:56.120 --> 08:59.120] This was a scientist saying, oh, but he was pranking us.
[08:59.120 --> 09:02.120] It may cause more harm than good.
[09:02.120 --> 09:04.120] Kelly, you are a social media expert.
[09:04.120 --> 09:05.120] What do you think?
[09:05.120 --> 09:09.120] I was going to say, that's actually pretty tricky with the James Webb pictures, too,
[09:09.120 --> 09:14.120] because I've noticed not all of them are coming from NASA, because the data is just out there
[09:14.120 --> 09:17.120] and anybody can compile the pictures.
[09:17.120 --> 09:21.120] So anytime I've seen something presented as a James Webb picture, I have to go and look
[09:21.120 --> 09:24.120] into it, because it's not coming directly from NASA.
[09:24.120 --> 09:27.120] So I could totally see why this took off.
[09:27.120 --> 09:32.120] You may think my knee-jerk reaction is, wait a second, stars are point sources of light.
[09:32.120 --> 09:34.120] You zoom in as much as you want.
[09:34.120 --> 09:38.120] You're really not going to see the disk for the most part.
[09:38.120 --> 09:42.120] That has been true for as long as astronomy has been around, until, of course, relatively recently.
[09:42.120 --> 09:47.120] Now we can zoom in on certain stars, certain giant stars or stars that are close,
[09:47.120 --> 09:51.120] and we can at least observe some of the disk itself.
[09:51.120 --> 09:53.120] It's not just a point of light.
[09:53.120 --> 09:55.120] And I think the number now is 23.
[09:55.120 --> 09:59.120] 23 stars we have actually seen part of the disk or a little bit of the disk.
[09:59.120 --> 10:02.120] Sometimes you can even see the convection cells.
[10:02.120 --> 10:09.120] So it's not an outrageous thing to say I could see the disk of this nearby star.
[10:09.120 --> 10:11.120] It was not implausible, right.
[10:11.120 --> 10:14.120] And if you looked at some of them, we found one.
[10:14.120 --> 10:16.120] Well, yeah, I got it here.
[10:16.120 --> 10:19.120] I do want to point out before we move off this picture, though, that while that's correct,
[10:19.120 --> 10:22.120] you can see the grain of the meat.
[10:22.120 --> 10:25.120] This is an in-focus photo.
[10:25.120 --> 10:28.120] If he had just blurred it out, it would have been a hundred times more powerful.
[10:28.120 --> 10:31.120] If you're looking on your phone, it's really tiny.
[10:31.120 --> 10:33.120] That could also be a bowling ball for all you know.
[10:33.120 --> 10:39.120] So this is the closest up picture I could find of Alpha Centauri A and B,
[10:39.120 --> 10:41.120] including Proxima Centauri.
[10:41.120 --> 10:44.120] Actually, Proxima C, I always forget.
[10:44.120 --> 10:46.120] Is that the third star in the system?
[10:46.120 --> 10:49.120] I just call it Proxima, and they're messing around with the names of these stars.
[10:49.120 --> 10:53.120] Yeah, but this is Alpha 1 and 2, or A and B.
[10:53.120 --> 10:57.120] And you can see they're basically point sources of light.
[10:57.120 --> 11:02.120] You're not seeing really like the surface of those stars.
[11:02.120 --> 11:07.120] There's some flare, lens flare, but you're not seeing the surface.
[11:07.120 --> 11:12.120] Bob and I found not the best picture of Alpha Centauri,
[11:12.120 --> 11:17.120] but just what's the best picture of any star ever, and there you go.
[11:17.120 --> 11:23.120] And it looks pretty much like a blurry slice of chorizo.
[11:23.120 --> 11:26.120] It doesn't even look symmetrical, Steve.
[11:26.120 --> 11:29.120] It's kind of a fattier chorizo, this one, though, right?
[11:29.120 --> 11:32.120] So as I said, if you blur out that chorizo slice,
[11:32.120 --> 11:36.120] you have a pretty good facsimile of a close-up picture of the star.
[11:36.120 --> 11:39.120] Now this is, what was it, about 520 light years away?
[11:39.120 --> 11:41.120] Yeah, surprisingly far.
[11:41.120 --> 11:43.120] But it's a red supergiant, so it's massive.
[11:43.120 --> 11:44.120] So that helps.
[11:44.120 --> 11:45.120] Yeah, that helps a lot.
[11:45.120 --> 11:49.120] Yeah, that was more plausible than I thought when I first saw it.
[11:49.120 --> 11:53.120] I feel like the scary thing is that we're all so worried about misinformation
[11:53.120 --> 11:55.120] that scientists can't make jokes.
[11:55.120 --> 11:57.120] It's kind of where we're going to live.
[11:57.120 --> 11:59.120] Not that this was the best joke of all time,
[11:59.120 --> 12:02.120] but the idea of a prank is sort of, it feels irresponsible,
[12:02.120 --> 12:04.120] and it's too bad that that's the case,
[12:04.120 --> 12:06.120] because it's making science fun and engaging,
[12:06.120 --> 12:09.120] and you could imagine he could do a fun quiz show,
[12:09.120 --> 12:12.120] like Cartwheel Galaxy or Lollipop or whatever, right?
[12:12.120 --> 12:13.120] Fallon could do a segment,
[12:13.120 --> 12:16.120] but it feels like it would cause more harm than good, which is...
[12:16.120 --> 12:18.120] Right, unless you're transparent up front.
[12:18.120 --> 12:21.120] If he did it as, like, you might think this is a close-up star,
[12:21.120 --> 12:23.120] but it's actually a chorizo.
[12:23.120 --> 12:25.120] Here's a close-up star, something like that.
[12:25.120 --> 12:27.120] That might be a new thing for our social media.
[12:27.120 --> 12:29.120] Close-up, what is this?
[12:29.120 --> 12:31.120] This one looks more like a pizza pie, though.
[12:31.120 --> 12:33.120] It's like that gimmick they did forever ago,
[12:33.120 --> 12:35.120] where they were like, was this a famous painting,
[12:35.120 --> 12:36.120] or did a gorilla paint this?
[12:36.120 --> 12:38.120] All right, so real quick,
[12:38.120 --> 12:41.120] apparently the password that the publisher put up there,
[12:41.120 --> 12:44.120] the space for the password only takes five characters,
[12:44.120 --> 12:47.120] so just type in the first five characters of the word future.
[12:47.120 --> 12:50.120] You know, you can't get good help these days, George.
[12:50.120 --> 12:51.120] Futter.
[12:51.120 --> 12:52.120] Futter.
[12:52.120 --> 12:53.120] Futter.
[12:53.120 --> 12:54.120] I don't understand.
[12:54.120 --> 12:57.120] The password field actually has a limit to how many characters it takes.
[12:57.120 --> 12:58.120] A small limit as well.
[12:58.120 --> 12:59.120] How does that even happen?
[12:59.120 --> 13:01.120] You have to pay more for the six characters.
[13:01.120 --> 13:03.120] Most passwords require it to be way too long these days,
[13:03.120 --> 13:05.120] and I can't fill it in.
[13:05.120 --> 13:08.120] A third book, Jay, maybe you could have six characters.
[13:08.120 --> 13:09.120] Look, this is what I'll do.
[13:09.120 --> 13:11.120] I'll call the publisher on Monday, and I'll tell them,
[13:11.120 --> 13:15.120] forget the password, just whoever entered is going to be legit.
[13:15.120 --> 13:19.120] So just put your info in there if you want to enter in.
[13:19.120 --> 13:21.120] All right, let's get to some news items.
[13:21.120 --> 13:23.120] We have more fun bits coming up later, too,
[13:23.120 --> 13:25.120] but first a couple news items.
News Items

Scientific Rigor (13:25)
[13:25.120 --> 13:29.120] That picture of a giant complex of buildings that I'm showing
[13:29.120 --> 13:33.120] is the NIH, the National Institutes of Health.
[13:33.120 --> 13:40.120] They are essentially the main biomedical research funding institution
[13:40.120 --> 13:41.120] in the United States.
[13:41.120 --> 13:43.120] They are a creature of Congress, as we like to say.
[13:43.120 --> 13:46.120] They are created, funded by Congress.
[13:46.120 --> 13:48.120] Essentially, if you do biomedical research in the U.S.,
[13:48.120 --> 13:52.120] you get your funding from the NIH, more likely than not.
[13:52.120 --> 13:56.120] They're massively important for medical research.
[13:56.120 --> 13:59.120] Recently, the NIH created an initiative.
[13:59.120 --> 14:00.120] It's not a new office or anything.
[14:00.120 --> 14:02.120] It's just an initiative.
[14:02.120 --> 14:07.120] They're funding specific groups who are going to create
[14:07.120 --> 14:15.120] an educational module to teach researchers how to do rigorous science.
[14:15.120 --> 14:16.120] That sounds pretty good.
[14:16.120 --> 14:17.120] That sounds pretty good to me.
[14:17.120 --> 14:19.120] That doesn't already exist, though?
[14:19.120 --> 14:20.120] That's my thought.
[14:20.120 --> 14:21.120] That's a good question.
[14:21.120 --> 14:29.120] Right now, how do we teach researchers how to do good research methodology?
[14:29.120 --> 14:32.120] Some universities may have courses on it.
[14:32.120 --> 14:33.120] They may be required.
[14:33.120 --> 14:35.120] They may be elective.
[14:35.120 --> 14:39.120] They might be a statistics course or a research methodology course.
[14:39.120 --> 14:45.120] You do get that, but not like, all right, here's how you do really rigorous research.
[14:45.120 --> 14:55.120] Here's how you avoid p-hacking or how you avoid false positives, etc., etc.
[14:55.120 --> 15:01.120] Clearly, that is needed for reasons that I've been talking about
[15:01.120 --> 15:04.120] and writing about for the last 20 years.
[15:04.120 --> 15:07.120] The other way that people learn that is through, essentially,
[15:07.120 --> 15:08.120] individual mentorship.
[15:08.120 --> 15:12.120] You work in somebody's lab, and they teach you how to do research,
[15:12.120 --> 15:16.120] not only in their specific area, technically, but also just,
[15:16.120 --> 15:18.120] this is what good science is.
[15:18.120 --> 15:22.120] But it's not systematic, and it's not thorough enough.
[15:22.120 --> 15:26.120] Clearly, there's a perception that there is a gap, a gap there.
[15:26.120 --> 15:28.120] They want to fill that gap.
[15:28.120 --> 15:35.120] Their goal is to fund the creation of this module to teach rigorous research design
[15:35.120 --> 15:38.120] and to then make it freely available, basically.
[15:38.120 --> 15:42.120] And then the hope is, so universities may require it.
[15:42.120 --> 15:46.120] They might say, all right, if you're going to work at our university,
[15:46.120 --> 15:47.120] this already happens, right?
[15:47.120 --> 15:52.120] I work at Yale, and I have to do 20 different certifications every year
[15:52.120 --> 15:56.120] on everything, like sexual harassment sensitivity
[15:56.120 --> 15:59.120] or how not to burn your eyes out or whatever, all of these things.
[15:59.120 --> 16:00.120] That's a good one.
[16:00.120 --> 16:02.120] How to treat patients ethically, all good stuff.
[16:02.120 --> 16:04.120] A lot of safety things all in there.
[16:04.120 --> 16:08.120] But just adding one that's like, here's how not to do fake research.
[16:08.120 --> 16:11.120] Here's how not to accidentally commit research fraud,
[16:11.120 --> 16:13.120] or how to p-hack or whatever.
[16:13.120 --> 16:18.120] It would be very easy to slip that into the existing system
[16:18.120 --> 16:21.120] of getting certified for quality control.
[16:21.120 --> 16:23.120] That's basically what this is.
[16:23.120 --> 16:26.120] Now, the NIH, of course, they could require,
[16:26.120 --> 16:29.120] if you apply to the NIH for a research grant,
[16:29.120 --> 16:31.120] and they're not saying they're going to do this,
[16:31.120 --> 16:33.120] but imagine if they said, all right, in order to get this grant,
[16:33.120 --> 16:37.120] you've got to have certification that you took this module and you passed.
[16:37.120 --> 16:41.120] Because again, they're interested in not wasting money.
[16:41.120 --> 16:43.120] That's their primary interest.
[16:43.120 --> 16:44.120] Obviously, they want to do good science.
[16:44.120 --> 16:45.120] That's their goal.
[16:45.120 --> 16:47.120] Their mission is to obviously do good science,
[16:47.120 --> 16:51.120] but they have a finite budget, and they want to make the most use
[16:51.120 --> 16:53.120] out of that money.
[16:53.120 --> 16:55.120] That, again, is their mission.
[16:55.120 --> 17:00.120] One of the biggest wastes in research is bad science.
[17:00.120 --> 17:04.120] If you publish a study, and it's a false positive, let's say,
[17:04.120 --> 17:08.120] you think that you have a result, but you did poor methodology,
[17:08.120 --> 17:11.120] you p-hacked or whatever, you underpowered the study,
[17:11.120 --> 17:16.120] or the blinding was inadequate, or your statistics were off, or whatever,
[17:16.120 --> 17:20.120] and then other people try to replicate that study,
[17:20.120 --> 17:24.120] how many millions of dollars could be spent proving that your crappy study
[17:24.120 --> 17:28.120] was crappy when you could have filtered it out at the beginning
[17:28.120 --> 17:32.120] by putting in some internal controls that you didn't know you should do
[17:32.120 --> 17:35.120] or by tightening up your research methodology.
[17:35.120 --> 17:39.120] The other goal here, other than not only doing good science,
[17:39.120 --> 17:44.120] is to save money by weeding out the inefficiency in the system of fraud.
[17:44.120 --> 17:49.120] Not fraud, but just bad rigor in research design.
[17:49.120 --> 17:53.120] It makes sense that once these modules are up and running,
[17:53.120 --> 17:56.120] phase two would be, and you've got to be certified in this
[17:56.120 --> 17:58.120] before we'll give you any money.
[17:58.120 --> 18:00.120] So that's one way that you, and again,
[18:00.120 --> 18:03.120] the NIH already does this for other things, for example,
[18:03.120 --> 18:07.120] they now require, this has been going on for about 10 or 15 years or so,
[18:07.120 --> 18:09.120] if you get public money to do your research,
[18:09.120 --> 18:13.120] you have to make the results of your research available to the public
[18:13.120 --> 18:15.120] and accessible by the public.
[18:15.120 --> 18:20.120] You have to say, how are you going to explain your results
[18:20.120 --> 18:23.120] to the people who are paying for your research, the public?
[18:23.120 --> 18:26.120] So this would be another way, how can you assure the people
[18:26.120 --> 18:29.120] who are funding your research that you're not wasting their money
[18:29.120 --> 18:31.120] by doing rigorous research design?
[18:31.120 --> 18:36.120] And by the way, here is an educational module,
[18:36.120 --> 18:40.120] and we could easily connect certification to that.
[18:40.120 --> 18:41.120] That's awesome.
[18:41.120 --> 18:45.120] I would like to see big science journals do the same thing.
[18:45.120 --> 18:47.120] You want to get published in our journal,
[18:47.120 --> 18:50.120] we require that you have the author, the lead author,
[18:50.120 --> 18:52.120] or every author has certification.
[18:52.120 --> 18:54.120] And of course, once either of those happens,
[18:54.120 --> 18:57.120] like if the NIH says you need to have certification to get grant money,
[18:57.120 --> 19:01.120] you better believe every university will make sure that it happens.
[19:01.120 --> 19:03.120] They're not going to have any of their people
[19:03.120 --> 19:05.120] not be able to get NIH grants.
[19:05.120 --> 19:08.120] So it's very easy to make this systematic.
[19:08.120 --> 19:10.120] So again, we're right at the very beginning of this,
[19:10.120 --> 19:13.120] and everything I'm hearing and seeing is very, very good.
[19:13.120 --> 19:15.120] We'll keep a close eye on it.
[19:15.120 --> 19:17.120] And again, a lot of people react like you, Jay.
[19:17.120 --> 19:19.120] It's like, really, why isn't this kind of already happening?
[19:19.120 --> 19:22.120] But that's because I think the main reason is,
[19:22.120 --> 19:24.120] I would say there's two things.
[19:24.120 --> 19:27.120] One is people think it is happening, but it's just not happening enough.
[19:27.120 --> 19:31.120] The second one is that the science of doing rigorous science
[19:31.120 --> 19:33.120] has been getting better.
[19:33.120 --> 19:37.120] We're learning more and more subtle ways in which studies go awry
[19:37.120 --> 19:39.120] or that results can be tweaked
[19:39.120 --> 19:42.120] or researchers can put their thumb on the scale.
[19:42.120 --> 19:44.120] We talk about researcher degrees of freedom
[19:44.120 --> 19:47.120] and researcher bias and publication bias and citation bias
[19:47.120 --> 19:52.120] and all these things that can alter the utility and the rigor
[19:52.120 --> 19:54.120] and the quality of science,
[19:54.120 --> 19:59.120] and essentially the old method of just relying upon some,
[19:59.120 --> 20:02.120] like just here's some classic statistics class,
[20:02.120 --> 20:05.120] and then whoever's lab you work in,
[20:05.120 --> 20:07.120] they'll teach you how to do good science.
[20:07.120 --> 20:09.120] It's just not good enough anymore.
[20:09.120 --> 20:13.120] It's got to be systematic, and everyone's got to go through it
[20:13.120 --> 20:17.120] in order to absolutely minimize the waste in the system
[20:17.120 --> 20:21.120] that comes from poor research design.
[20:21.120 --> 20:23.120] So this is a massive move in the right direction.
[20:23.120 --> 20:25.120] This is very, very encouraging.
[20:25.120 --> 20:27.120] Steve, where did you learn how to do it?
[20:27.120 --> 20:29.120] For me, well, I mean,
[20:29.120 --> 20:31.120] it's been the whole science-based medicine initiative,
[20:31.120 --> 20:33.120] which is I've been reading about it, following,
[20:33.120 --> 20:36.120] reading the literature on it for 20 years
[20:36.120 --> 20:38.120] and writing about it, trying to digest it.
[20:38.120 --> 20:41.120] That's basically what we explore at science-based medicine
[20:41.120 --> 20:43.120] is how to do rigorous science,
[20:43.120 --> 20:45.120] the relationship between science and practice.
[20:45.120 --> 20:47.120] How do we know what's true, what's not true?
[20:47.120 --> 20:49.120] Where's the threshold of evidence
[20:49.120 --> 20:51.120] before something should affect your practice?
[20:51.120 --> 20:53.120] That's what we do.
[20:53.120 --> 20:55.120] That's how I learned it.
[20:55.120 --> 20:57.120] It was all basically just self-taught by reading the literature,
[20:57.120 --> 21:00.120] talking to my colleagues, writing about it, engaging about it.
[21:00.120 --> 21:04.120] But most researchers are not spending most of their time,
[21:04.120 --> 21:07.120] their academic time, doing that.
[21:07.120 --> 21:09.120] They're doing their research.
[21:09.120 --> 21:12.120] They're trying to figure out what receptor is causing this disease
[21:12.120 --> 21:14.120] or whatever.
[21:14.120 --> 21:18.120] This is sort of part of that, but it's not their focus.
[21:18.120 --> 21:23.120] That's why it needs to be done systematically.
[21:23.120 --> 21:25.120] This is also one final word and then we'll move on.
[21:25.120 --> 21:28.120] Part of a bigger trend that I've noticed, at least in medicine,
[21:28.120 --> 21:32.120] Andrea, you can tell me if you think it's true in your field as well,
[21:32.120 --> 21:37.120] that you're going away from the model of just counting on mentorship
[21:37.120 --> 21:41.120] and counting on that people will learn what they need to learn
[21:41.120 --> 21:46.120] and moving towards things that are way more systematic,
[21:46.120 --> 21:51.120] that are verified, and also that there are checks in place
[21:51.120 --> 21:59.120] rather than just trying to raise the quality by just over-educating people.
[21:59.120 --> 22:01.120] You just have checks in place to make sure that they do it.
[22:01.120 --> 22:03.120] Medicine is getting too complicated.
[22:03.120 --> 22:06.120] Science is getting too complicated to rely upon methods
[22:06.120 --> 22:08.120] that are not absolutely systematic.
[22:08.120 --> 22:10.120] Is that something you find in academia from your end?
[22:10.120 --> 22:11.120] Definitely.
[22:11.120 --> 22:13.120] I'm thinking about something that I think Jay brought up
[22:13.120 --> 22:16.120] on a different live a while ago about the movement
[22:16.120 --> 22:18.120] towards pre-registering your hypotheses.
[22:18.120 --> 22:20.120] That's another way of just putting the system in place
[22:20.120 --> 22:23.120] because it turns out we can't rely on everyone to do great science
[22:23.120 --> 22:25.120] even though we all like to think that we're doing it.
[22:25.120 --> 22:27.120] Where I thought you were going, Steve, with that was
[22:27.120 --> 22:29.120] we can't rely exclusively.
[22:29.120 --> 22:31.120] Well, we still rely on it a lot, but peer review.
[22:31.120 --> 22:33.120] Peer review is not a perfect process.
[22:33.120 --> 22:35.120] It's a strong process in a lot of ways
[22:35.120 --> 22:37.120] and I don't have great ideas about what to do instead,
[22:37.120 --> 22:39.120] but it's not like it's perfect.
[22:39.120 --> 22:41.120] A lot of stuff gets through peer review,
[22:41.120 --> 22:44.120] and so this is something that could help steer people.
[22:44.120 --> 22:48.120] The only question I'm having, though, is how you could imagine
[22:48.120 --> 22:51.120] a world where they're sort of methodologically specific.
[22:51.120 --> 22:55.120] I'm thinking of machine learning where you have issues
[22:55.120 --> 22:57.120] with overfitting your model.
[22:57.120 --> 23:00.120] That would be totally irrelevant to someone running an experiment.
[23:00.120 --> 23:03.120] I don't know what the future would look like.
[23:03.120 --> 23:05.120] Ten years from now, are there different modules?
[23:05.120 --> 23:07.120] Do we need different modules?
[23:07.120 --> 23:09.120] This is what exists currently in medicine.
[23:09.120 --> 23:13.120] If I'm doing some quality control certification thing
[23:13.120 --> 23:16.120] that I do every year, there's the first part of it,
[23:16.120 --> 23:19.120] which is for everyone or maybe every physician,
[23:19.120 --> 23:22.120] and then you say what your specialty is.
[23:22.120 --> 23:24.120] I'm a neurologist.
[23:24.120 --> 23:26.120] Then you get the neurology-specific stuff.
[23:26.120 --> 23:28.120] You could do the same thing.
[23:28.120 --> 23:30.120] Here's the generic rigors that everyone needs to know,
[23:30.120 --> 23:32.120] and then what are you doing research in?
[23:32.120 --> 23:34.120] Particle physics?
[23:34.120 --> 23:37.120] Here's the particle physics part of the module for you
[23:37.120 --> 23:39.120] for those specific issues.
[23:39.120 --> 23:41.120] I could absolutely see that working that way.
[23:41.120 --> 23:44.120] I kind of like the idea of making a bunch of social scientists
[23:44.120 --> 23:47.120] do the particle physics, just to keep us humble.
[23:47.120 --> 23:49.120] Absolutely.
More Space Debris (23:51)
[23:49.120 --> 23:53.120] Jay, tell us about crap falling from the sky.
[23:53.120 --> 23:57.120] Steve, there's crap, and it's falling from the goddamn sky.
[23:57.120 --> 23:59.120] Oh, my goodness.
[23:59.120 --> 24:06.120] This is about the fact that space agencies around the world
[24:06.120 --> 24:10.120] are not doing a very good job of figuring out
[24:10.120 --> 24:13.120] how to exactly de-orbit pieces of spacecraft
[24:13.120 --> 24:16.120] that are left up there for one reason or another.
[24:16.120 --> 24:20.120] There is a significant number of objects in low Earth orbit.
[24:20.120 --> 24:25.120] NASA tracks anything from 2 inches or 5 centimeters and up,
[24:25.120 --> 24:30.120] and there's 27,000 objects that are being tracked,
[24:30.120 --> 24:36.120] and 70% of the tracked objects are in LEO, low Earth orbit,
[24:36.120 --> 24:39.120] which is the orbit that's basically as close to the Earth
[24:39.120 --> 24:41.120] as you could pretty much get.
[24:41.120 --> 24:43.120] Do they say LEO?
[24:43.120 --> 24:45.120] I've only ever heard LEO.
[24:45.120 --> 24:47.120] I just thought you meant something astrology, Jay,
[24:47.120 --> 24:49.120] and I was like, I can't believe this is happening.
[24:49.120 --> 24:50.120] I've got to go.
[24:50.120 --> 24:52.120] I'm blazing trails here.
[24:52.120 --> 24:54.120] It's low Earth orbit.
[24:54.120 --> 24:57.120] Every one of these objects that are up there
[24:57.120 --> 25:01.120] and that are going to be up there for a long time are hazards.
[25:01.120 --> 25:02.120] They're dangerous.
[25:02.120 --> 25:04.120] They actually have to plan accordingly.
[25:04.120 --> 25:07.120] When anybody launches anything into outer space,
[25:07.120 --> 25:10.120] they have to figure out the right time to do it
[25:10.120 --> 25:13.120] and how to avoid these known objects,
[25:13.120 --> 25:16.120] because one of them could be traveling at such an incredible speed
[25:16.120 --> 25:19.120] in relation to the ship that you're putting up there
[25:19.120 --> 25:20.120] that it could destroy it.
[25:20.120 --> 25:22.120] It could rip right through it.
[25:22.120 --> 25:24.120] So this is a growing issue,
[25:24.120 --> 25:27.120] and we have another issue that is a problem,
[25:27.120 --> 25:31.120] is that there are objects that are being left in low Earth orbit
[25:31.120 --> 25:36.120] that are big, that are slowly de-orbiting over time,
[25:36.120 --> 25:39.120] because there's a tiny, tiny, tiny, tiny bit of atmosphere
[25:39.120 --> 25:41.120] in low Earth orbit,
[25:41.120 --> 25:44.120] and that's just enough to slowly take something out of orbit
[25:44.120 --> 25:46.120] and bring it back down to Earth.
[25:46.120 --> 25:51.120] As an example, China had one of their Long March 5B rockets
[25:51.120 --> 25:53.120] bring something up,
[25:53.120 --> 25:56.120] and a week later, when it came out of orbit,
[25:56.120 --> 25:58.120] because it was only up for a week,
[25:58.120 --> 26:01.120] and by that time there was enough inertia and everything
[26:01.120 --> 26:03.120] to get it back down into the atmosphere,
[26:03.120 --> 26:07.120] pieces of it landed in Malaysia and Indonesia,
[26:07.120 --> 26:10.120] and it landed right near a village where people were living.
[26:10.120 --> 26:12.120] It is a real threat,
[26:12.120 --> 26:15.120] and we're not talking about millions of people getting hurt,
[26:15.120 --> 26:16.120] but it could kill people.
[26:16.120 --> 26:19.120] It could kill handfuls of people now and again,
[26:19.120 --> 26:21.120] which is something that we definitely want to avoid.
[26:21.120 --> 26:23.120] It's also just not good practice.
[26:23.120 --> 26:25.120] It's not keeping your shop clean.
[26:25.120 --> 26:28.120] So getting back to the Long March 5B rocket,
[26:28.120 --> 26:30.120] now this rocket is huge.
[26:30.120 --> 26:32.120] China launched it on July 24th,
[26:32.120 --> 26:35.120] and they were bringing up a new space station module
[26:35.120 --> 26:39.120] to their Tiangong space station, which is a China-only space station.
[26:39.120 --> 26:42.120] It's actually pretty cool, they should read up on it.
[26:42.120 --> 26:46.120] Now this rocket is not designed to de-orbit itself.
[26:46.120 --> 26:48.120] They don't send it up with the ability to do that,
[26:48.120 --> 26:52.120] and in fact, the engines can't even restart after the engines are shut off.
[26:52.120 --> 26:55.120] When it does its main push and gets all that weight up
[26:55.120 --> 26:57.120] to the altitude that they need it to,
[26:57.120 --> 26:59.120] and those engines shut off, they can't go back on.
[26:59.120 --> 27:03.120] This ultimately means that there's no way for China
[27:03.120 --> 27:07.120] to control the de-orbiting of this massive rocket.
[27:07.120 --> 27:10.120] It's just going to fly back into the Earth's atmosphere,
[27:10.120 --> 27:13.120] and I'm not even sure that they know where it's going to end up going.
[27:13.120 --> 27:16.120] I don't even know if there's good physics
[27:16.120 --> 27:19.120] that will really accurately predict where something willy-nilly
[27:19.120 --> 27:23.120] is de-orbiting at some point and coming back into the atmosphere.
[27:23.120 --> 27:27.120] It could end up anywhere, which is the scary part.
[27:27.120 --> 27:31.120] Believe me, I feel completely happy and thrilled and lucky
[27:31.120 --> 27:34.120] that we're alive during a time when space exploration
[27:34.120 --> 27:36.120] is starting to explode again.
[27:36.120 --> 27:37.120] It's a great time.
[27:37.120 --> 27:38.120] Hopefully not explode.
[27:38.120 --> 27:40.120] Yeah, you're right.
[27:40.120 --> 27:44.120] When all of these nations are launching new projects,
[27:44.120 --> 27:46.120] how's that? Is that better?
[27:46.120 --> 27:47.120] Better.
[27:47.120 --> 27:52.120] What we don't have right now are proper rules of etiquette.
[27:52.120 --> 27:55.120] There are things that people would like.
[27:55.120 --> 27:59.120] NASA is making it known what information that they would like,
[27:59.120 --> 28:03.120] but in this instance, China didn't share any of the information
[28:03.120 --> 28:06.120] about what trajectory their rocket was on
[28:06.120 --> 28:10.120] and where they think it'll end up coming back into the atmosphere.
[28:10.120 --> 28:13.120] The NASA administrator, by the name of Bill Nelson,
[28:13.120 --> 28:15.120] he said, and I'm quoting him,
[28:15.120 --> 28:18.120] All spacefaring nations should follow established best practices
[28:18.120 --> 28:22.120] and do their part to share this type of information in advance
[28:22.120 --> 28:25.120] to allow reliable predictions of potential debris impact risk,
[28:25.120 --> 28:29.120] especially for heavy-lift vehicles like the Long March 5B,
[28:29.120 --> 28:33.120] which carry a significant risk of loss of life and property.
[28:33.120 --> 28:36.120] Doing so is critical to the responsible use of space
[28:36.120 --> 28:39.120] and to ensure the safety of people here on Earth.
[28:39.120 --> 28:42.120] I wish that I could have found some information on what would have happened
[28:42.120 --> 28:48.120] if one of these pieces of larger debris ended up barreling into a city.
[28:48.120 --> 28:50.120] Could it take a part of a building out?
[28:50.120 --> 28:53.120] What's its velocity? How much mass does it have?
[28:53.120 --> 28:56.120] I do know that SpaceX had a module,
[28:56.120 --> 29:00.120] a piece of debris come back down as recently as July 9th.
[29:00.120 --> 29:03.120] Now, if you look at a picture of the Crew-1 module,
[29:03.120 --> 29:06.120] there is a component that's right underneath it
[29:06.120 --> 29:09.120] that is used to relay electricity to the module and all that,
[29:09.120 --> 29:11.120] but it's also a cargo hold, right?
[29:11.120 --> 29:13.120] A cargo hold that's not pressurized.
[29:13.120 --> 29:17.120] This thing is about 3 meters long and it weighs 4 metric tons.
[29:17.120 --> 29:22.120] That's an incredibly heavy object that hit the Earth at one point.
[29:22.120 --> 29:26.120] It came back down on July 9th and it took a year for it to deorbit.
[29:26.120 --> 29:29.120] So that's just another thing that needs to be tracked.
[29:29.120 --> 29:32.120] It could take time for them to come back down
[29:32.120 --> 29:34.120] and then we have to try to figure out where they're going to go.
[29:34.120 --> 29:36.120] But okay, let's say we know where it's going to go.
[29:36.120 --> 29:40.120] So what? What if it's going to hit a major city somewhere?
[29:40.120 --> 29:41.120] What are we going to do about it?
[29:41.120 --> 29:43.120] The answer is there's nothing.
[29:43.120 --> 29:44.120] There's nothing we can do about it.
[29:44.120 --> 29:48.120] We're going to shoot rockets up to take out rockets that are coming.
[29:48.120 --> 29:49.120] The whole thing is crazy.
[29:49.120 --> 29:54.120] So what we need to do is we need to have these rules of etiquette
[29:54.120 --> 29:58.120] where space agencies start to send up more fuel,
[29:58.120 --> 30:01.120] have rocket engines that can deorbit themselves
[30:01.120 --> 30:04.120] and not only have one turn-on cycle.
[30:04.120 --> 30:09.120] These are pretty costly and probably very expensive engineering feats
[30:09.120 --> 30:11.120] that need to become a part of all of these projects.
[30:11.120 --> 30:13.120] And that's what NASA wants.
[30:13.120 --> 30:15.120] But right now...
[30:15.120 --> 30:17.120] Just to make sure that the point is crystal clear,
[30:17.120 --> 30:21.120] it's to control the deorbit so that we know where it comes down.
[30:21.120 --> 30:26.120] We dump it in the middle of the Pacific so it doesn't hit Australia or whatever.
[30:26.120 --> 30:27.120] Exactly, yeah.
[30:27.120 --> 30:30.120] So right now there's a couple of companies that are starting to,
[30:30.120 --> 30:33.120] or space agencies that are starting to comply
[30:33.120 --> 30:37.120] and build in this functionality into the new rockets that they're building.
[30:37.120 --> 30:41.120] But let's face it, it's not a global thing.
[30:41.120 --> 30:43.120] A lot of people aren't doing that.
[30:43.120 --> 30:46.120] Some good things that we have are like SpaceX,
[30:46.120 --> 30:50.120] which is leading the pack on this whole idea of reusability.
[30:50.120 --> 30:51.120] That's fantastic.
[30:51.120 --> 30:52.120] You want to reuse your rockets.
[30:52.120 --> 30:54.120] You want your retro rockets to land themselves.
[30:54.120 --> 30:55.120] You see it all the time.
[30:55.120 --> 30:56.120] That's great.
[30:56.120 --> 30:59.120] More reusability that we build into things means more control,
[30:59.120 --> 31:02.120] more ability to bring things down safely,
[31:02.120 --> 31:05.120] which is exactly what everybody needs to be doing.
[31:05.120 --> 31:08.120] One, we don't want to pollute low Earth orbit any worse than it is.
[31:08.120 --> 31:10.120] If anything, we want to get that stuff out of there,
[31:10.120 --> 31:16.120] which no one has come up with a feasible economic way to do it yet.
[31:16.120 --> 31:18.120] But I imagine at some point in the next 50 years,
[31:18.120 --> 31:22.120] someone will come up with something that's making that move.
[31:22.120 --> 31:25.120] But in the meantime, our goals are no more debris
[31:25.120 --> 31:29.120] and absolutely no more craziness of things falling out of the sky
[31:29.120 --> 31:33.120] without any predictability on where they're going to go or drivability,
[31:33.120 --> 31:36.120] meaning we want them to go to a specific place.
[31:36.120 --> 31:38.120] So what do you think about that, Steve?
[31:38.120 --> 31:40.120] Well, it wasn't too long ago.
[31:40.120 --> 31:43.120] It was just a science or fiction item where an estimate was
[31:43.120 --> 31:46.120] that in the next decade, there's actually something like a 10% chance
[31:46.120 --> 31:48.120] of somebody getting hit by space debris.
[31:48.120 --> 31:49.120] Oh, yeah.
[31:49.120 --> 31:50.120] We all thought it was fiction.
[31:50.120 --> 31:54.120] Yeah, it's getting pretty significant now just because of the sheer volume
[31:54.120 --> 31:56.120] of stuff that we're putting up there.
[31:56.120 --> 31:59.120] So, yeah, it's, again, one of those things that we have to take
[31:59.120 --> 32:02.120] a systematic approach to it rather than relying on individuals
[32:02.120 --> 32:03.120] to all do the right thing.
[32:03.120 --> 32:05.120] How would we figure that out, Steve?
[32:05.120 --> 32:07.120] Where would we come up with such an approach?
[32:07.120 --> 32:09.120] People aren't just going to automatically do the right thing
[32:09.120 --> 32:10.120] on their own volition.
[32:10.120 --> 32:11.120] It's just stunning.
[32:11.120 --> 32:12.120] I know.
[32:12.120 --> 32:14.120] I feel like we're going to have apps where you have, like,
[32:14.120 --> 32:16.120] weather forecast, air pollution, space debris.
[32:16.120 --> 32:17.120] Space debris, yeah.
[32:17.120 --> 32:20.120] What's the probability of that thing landing in Manhattan today?
[32:20.120 --> 32:21.120] Take your umbrella.
[32:21.120 --> 32:23.120] Yeah, like a steel umbrella.
[32:23.120 --> 32:26.120] 50% chance of rain, 5% chance of...
[32:26.120 --> 32:28.120] Low Earth orbit de-orbiting.
[32:28.120 --> 32:31.120] Emily Calandrelli, who does a lot of space-related science communication,
[32:31.120 --> 32:34.120] she was following this one as it was coming down.
[32:34.120 --> 32:38.120] And what shocked me about it was we really didn't know where it was
[32:38.120 --> 32:41.120] going to be until, like, an hour before, even days before,
[32:41.120 --> 32:45.120] it was like half of the Earth was in the possible target area.
[32:45.120 --> 32:48.120] But she did say, at least this one, they thought.
[32:48.120 --> 32:51.120] But, again, they didn't really know what exactly it was made of,
[32:51.120 --> 32:53.120] but it would only take out a house or two.
[32:53.120 --> 32:54.120] A house or two.
[32:54.120 --> 32:55.120] Just a house or two.
[32:55.120 --> 32:56.120] Yeah.
[32:56.120 --> 33:00.120] Since you suggested a city, a house was the better alternative.
[33:00.120 --> 33:04.120] Does space debris zero in on trailer parks like tornadoes do?
[33:04.120 --> 33:05.120] Yeah.
[33:05.120 --> 33:06.120] I'm just wondering.
[33:06.120 --> 33:07.120] And lawn chairs and stuff.
[33:07.120 --> 33:08.120] Yeah.
[33:08.120 --> 33:10.120] But there's things to consider, though, because it's not just...
[33:10.120 --> 33:12.120] But could there be explosives in there?
[33:12.120 --> 33:15.120] Could there be some leftover rocket fuel fumes?
[33:15.120 --> 33:18.120] Or I have no idea, like, what potential explosive...
[33:18.120 --> 33:20.120] They're probably out of fuel, yeah.
[33:20.120 --> 33:21.120] You'd hope.
[33:21.120 --> 33:22.120] Yeah, you'd hope.
[33:22.120 --> 33:23.120] Who knows?
[33:23.120 --> 33:24.120] What about waste?
[33:24.120 --> 33:27.120] What about, like, dangerous gases and things like that?
[33:27.120 --> 33:30.120] Well, when Columbia broke up in 2003
[33:30.120 --> 33:34.120] and came down over the American South and Southeast,
[33:34.120 --> 33:38.120] there was concern that they didn't know what sort of contamination,
[33:38.120 --> 33:41.120] I think, there was in some of the materials,
[33:41.120 --> 33:44.120] that people were finding and picking up, like, you know,
[33:44.120 --> 33:46.120] a piece of a helmet and things.
[33:46.120 --> 33:48.120] They warned people to not go near them.
[33:48.120 --> 33:49.120] Yeah.
[33:49.120 --> 33:51.120] So I don't know what sort of danger that...
[33:51.120 --> 33:52.120] I don't know.
[33:52.120 --> 33:55.120] I know it always comes up whenever they're sending up any satellite
[33:55.120 --> 33:57.120] or anything that has a nuclear battery in it.
[33:57.120 --> 34:00.120] If that thing, you know, blows up or reenters,
[34:00.120 --> 34:03.120] then we could be dumping nuclear waste.
[34:03.120 --> 34:06.120] Well, now I'm thinking, you know, Cold War Sputnik stuff, too,
[34:06.120 --> 34:08.120] where it's like, what if it's not an accident?
[34:08.120 --> 34:10.120] Not to be the conspiracy theorist of the group,
[34:10.120 --> 34:12.120] but that would be a good way to...
[34:12.120 --> 34:14.120] Anyway, I'll stop with that one thought.
[34:14.120 --> 34:15.120] All right.
Auditory Pareidolia Again (34:16)
[34:15.120 --> 34:17.120] This is actually a couple of years old,
[34:17.120 --> 34:19.120] but it's making the rounds again, and I saw it.
[34:19.120 --> 34:21.120] I don't think we've ever played this on the show.
[34:21.120 --> 34:23.120] I missed it the first time around.
[34:23.120 --> 34:25.120] This video, just listen to the sound.
[34:25.120 --> 34:27.120] You don't have to see the video.
[34:27.120 --> 34:30.120] So either think the word brainstorm
[34:30.120 --> 34:33.120] or think the word green needle.
[34:33.120 --> 34:37.120] And whatever you think, that's what you will hear.
[34:37.120 --> 34:41.120] You don't even need to be prompted with the actual words.
[34:41.120 --> 34:45.120] You just have to think it.
[34:45.120 --> 34:47.120] Isn't that bizarre?
[34:47.120 --> 34:48.120] That's crazy.
[34:48.120 --> 34:50.120] Although I'm hearing the green needle a lot more
[34:50.120 --> 34:52.120] than I'm hearing the brainstorm.
[34:52.120 --> 34:55.120] It's either distinctively green needle or not green needle.
[34:55.120 --> 34:58.120] Yeah, but I could flip both ways at will.
[34:58.120 --> 35:02.120] You would think, though, they seem like such different phrases
[35:02.120 --> 35:05.120] phonetically and everything, but it's in there.
[35:05.120 --> 35:07.120] There are things in there that will trick your brain
[35:07.120 --> 35:09.120] for both of those.
[35:09.120 --> 35:10.120] It's uncanny.
[35:10.120 --> 35:12.120] It's not even the same number of syllables,
[35:12.120 --> 35:15.120] which is surprising to me that it still works, right?
[35:15.120 --> 35:16.120] Yeah, it's one extra syllable.
[35:16.120 --> 35:17.120] Two versus three.
[35:17.120 --> 35:20.120] I think the distortion itself must be a critical component
[35:20.120 --> 35:23.120] of the ability to switch between it from one to the other, perhaps.
[35:23.120 --> 35:26.120] Otherwise, why make it sound so distorted?
[35:26.120 --> 35:30.120] I believe it also works with brain needle and green storm as well.
[35:30.120 --> 35:32.120] If you try it.
[35:32.120 --> 35:34.120] I have to stumble upon this.
[35:34.120 --> 35:42.120] It's one of the more dramatic examples of auditory pareidolia.
[35:42.120 --> 35:45.120] This happens in a lot of our sensory streams,
[35:45.120 --> 35:48.120] but it happens a lot with language.
[35:48.120 --> 35:52.120] Our sensory streams are wired to make the closest fit
[35:52.120 --> 35:55.120] to phonemes that you know.
[35:55.120 --> 35:59.120] It's constantly trying to make that fit between speech sound
[35:59.120 --> 36:02.120] and words that you know.
[36:02.120 --> 36:05.120] That's why you can misunderstand lyrics all the time
[36:05.120 --> 36:06.120] and misunderstand what people say.
[36:06.120 --> 36:07.120] It sounds like something close to it.
[36:07.120 --> 36:11.120] This is just demonstrating that in a very dramatic way.
[36:11.120 --> 36:14.120] It's amazing how well the priming works.
[36:14.120 --> 36:18.120] When Bob brought up the distortion, it reminded me of,
[36:18.120 --> 36:22.120] we talked about it on SGU, the doll that would talk.
[36:22.120 --> 36:23.120] Pull-string dolls.
[36:23.120 --> 36:24.120] It has a recording.
[36:24.120 --> 36:27.120] It's a voice, but it's a crackly kind of voice.
[36:27.120 --> 36:29.120] It has a bit of distortion to it.
[36:29.120 --> 36:32.120] People think they're hearing things that the doll is saying
[36:32.120 --> 36:34.120] that it really isn't programmed to say,
[36:34.120 --> 36:38.120] but they can't distinguish what it was programmed to say.
[36:38.120 --> 36:42.120] They're thinking what they think it's saying instead.
[36:42.120 --> 36:45.120] We've come across this before in other mediums.
[36:45.120 --> 36:48.120] Is this behind those Disney conspiracies too,
[36:48.120 --> 36:49.120] where they're like,
[36:49.120 --> 36:52.120] there are secret horrible messages in various cartoons?
[36:52.120 --> 36:54.120] Is the light, that was one of the dolls that had it,
[36:54.120 --> 36:57.120] but that's not really what the doll was saying,
[36:57.120 --> 37:03.120] but it spread virally and that's what everyone started to hear.
[37:03.120 --> 37:05.120] It was saying because it was suggested that that's what it was saying.
[37:05.120 --> 37:07.120] The backward masking on records.
[37:07.120 --> 37:09.120] I was just going to say that.
[37:09.120 --> 37:12.120] I've listened to Stairway to Heaven backwards.
[37:12.120 --> 37:19.120] I really hear a lot of stuff in there that has a demonic connotation.
[37:19.120 --> 37:21.120] The words that they're saying.
[37:21.120 --> 37:25.120] It's probably because I've been priming myself since I was a teenager.
[37:25.120 --> 37:27.120] When I hear that, every once in a while I'll listen to it
[37:27.120 --> 37:29.120] because it's actually kind of interesting.
[37:29.120 --> 37:33.120] I'm hearing, here's to my sweet Satan and all that stuff.
[37:33.120 --> 37:35.120] It seems very clear to me.
[37:35.120 --> 37:40.120] Again, your brain is trying to make sense out of chaos.
[37:40.120 --> 37:45.120] Sometimes your brain concocts something that isn't actually there.
[37:45.120 --> 37:47.120] It's kind of like the dress.
[37:47.120 --> 37:49.120] I was just thinking about the dress.
[37:49.120 --> 37:51.120] Or Laurel and Yanny.
[37:51.120 --> 37:54.120] Yeah, Laurel and Yanny.
[37:54.120 --> 37:56.120] The internet will spit out more of these things.
[37:56.120 --> 37:58.120] We'll share them with you.
[37:58.120 --> 38:00.120] This was a particularly impressive one.
[38:00.120 --> 38:02.120] Everyone, we're going to take a quick break from our show
[38:02.120 --> 38:04.120] to talk about our sponsor this week, BetterHelp.
[38:04.120 --> 38:07.120] Guys, we have to take care of not just our physical health,
[38:07.120 --> 38:09.120] but also our mental health.
[38:09.120 --> 38:12.120] There's lots of options available to us now.
[38:12.120 --> 38:13.120] BetterHelp is one of them.
[38:13.120 --> 38:16.120] BetterHelp offers online therapy.
[38:16.120 --> 38:17.120] I'll tell you something.
[38:17.120 --> 38:19.120] I personally do online therapy.
[38:19.120 --> 38:25.120] I've been meeting with my doctor for the past six months every week.
[38:25.120 --> 38:28.120] I've been dealing with anxiety and depression my entire adult life.
[38:28.120 --> 38:32.120] Therapy is one of the biggest things that helps me deal with it.
[38:32.120 --> 38:34.120] I really think that you should consider it.
[38:34.120 --> 38:37.120] If you're suffering, if you're having anything that's bothering you
[38:37.120 --> 38:40.120] that you seem to not be able to get over,
[38:40.120 --> 38:43.120] you really should think about talking to someone to get help.
[38:43.120 --> 38:44.120] You're right, Jay.
[38:44.120 --> 38:49.120] BetterHelp is not only online, but it offers a lot of different options.
[38:49.120 --> 38:52.120] We're talking video, phone, even live chat only.
[38:52.120 --> 38:57.120] You don't have to see someone on camera if you're not in the place to do that.
[38:57.120 --> 39:02.120] It's also affordable, and you can be matched with a therapist in under 48 hours.
[39:02.120 --> 39:07.120] Our listeners get 10% off their first month at BetterHelp.com.
[39:07.120 --> 39:11.120] That's BetterHELP.com.
[39:11.120 --> 39:14.120] All right, guys, let's get back to the show.
The Alex Jones Saga (39:15)
[39:14.120 --> 39:15.120] All right.
[39:15.120 --> 39:21.120] One thing that we can all agree on is that Alex Jones is a giant douchebag.
[39:21.120 --> 39:25.120] You don't have my permission to use that photo.
[39:25.120 --> 39:28.120] I'm going to get your internet permission to not use that photo.
[39:28.120 --> 39:29.120] Buy my vitamins.
[39:29.120 --> 39:31.120] I have a worse photo.
[39:31.120 --> 39:37.120] All right, Kelly, give us an update on the Alex Jones saga.
[39:37.120 --> 39:41.120] Yes, so I, like the insane person I am,
[39:41.120 --> 39:44.120] have kind of had this on in the background for the last two weeks,
[39:44.120 --> 39:48.120] and I was very glad to have an opportunity to put that to use.
[39:48.120 --> 39:51.120] But in Steve fashion, I'm going to start with a question.
[39:51.120 --> 39:58.120] So what percentage of Americans do you guys think question the Sandy Hook shooting?
[39:58.120 --> 40:00.120] 20%.
[40:00.120 --> 40:01.120] 10%.
[40:01.120 --> 40:02.120] Question it?
[40:02.120 --> 40:04.120] Probably I would say like 22%.
[40:04.120 --> 40:06.120] 22.1%.
[40:06.120 --> 40:07.120] 25%.
[40:07.120 --> 40:08.120] Wow.
[40:08.120 --> 40:10.120] It depends on whether we're doing Price is Right rules or not,
[40:10.120 --> 40:13.120] but I don't think we are because I didn't say it, so Andrea wins.
[40:13.120 --> 40:15.120] Oh, it's that high?
[40:15.120 --> 40:16.120] There we go.
[40:16.120 --> 40:18.120] That's horrible.
[40:18.120 --> 40:22.120] A quarter of the people polled, it's hard because I would have won.
[40:22.120 --> 40:25.120] Price is Right rules, I would have won.
[40:25.120 --> 40:28.120] Granted, there's always issues with polling,
[40:28.120 --> 40:31.120] but even if it's half that, that's absolutely insane,
[40:31.120 --> 40:34.120] and it's almost single-handedly because of Alex Jones.
[40:34.120 --> 40:36.120] Oh, yeah.
[40:36.120 --> 40:39.120] So I'm going to talk more about the misinformation piece.
[40:39.120 --> 40:42.120] I know everyone has seen all of the clips of his testimony
[40:42.120 --> 40:45.120] and all of the perjury and all the fun stuff,
[40:45.120 --> 40:47.120] but since this is a misinformation conference,
[40:47.120 --> 40:50.120] I'm going to focus on that aspect of it.
[40:50.120 --> 40:54.120] And I think as skeptics, we often hear the question, what's the harm?
[40:54.120 --> 40:57.120] Especially with things like conspiracy theories or supplements.
[40:57.120 --> 41:02.120] It's just easy to dismiss until it gets to this point,
[41:02.120 --> 41:06.120] and Alex Jones took both of those things and ruined some families' lives.
[41:06.120 --> 41:08.120] So some backgrounds.
[41:08.120 --> 41:12.120] The caricature that you think of as Alex Jones is pretty much accurate.
[41:12.120 --> 41:16.120] He peddles all of the conspiracy theories, 9/11 truth or Pizzagate.
[41:16.120 --> 41:20.120] Now he's talking about the globalists trying to bring about the New World Order,
[41:20.120 --> 41:23.120] and when the Sandy Hook shooting happened,
[41:23.120 --> 41:27.120] he almost immediately was questioning the narrative.
[41:27.120 --> 41:32.120] And he's gone from saying it's a hoax, calling the parents crisis actors,
[41:32.120 --> 41:34.120] and that's changed over time.
[41:34.120 --> 41:37.120] His position has definitely evolved,
[41:37.120 --> 41:42.120] but the consistent through line of that is that he's questioning the official story
[41:42.120 --> 41:45.120] and doesn't think that the official story is true.
[41:45.120 --> 41:48.120] And because of this, the families of the children who died
[41:48.120 --> 41:51.120] have received death threats, they've been harassed,
[41:51.120 --> 41:54.120] and they're dealing with this constantly circulating.
[41:54.120 --> 41:58.120] So a bunch of the families have sued him, rightfully so.
[41:58.120 --> 42:01.120] And so this trial was for the parents of Jesse Lewis,
[42:01.120 --> 42:04.120] who was a six-year-old who died in Sandy Hook,
[42:04.120 --> 42:09.120] for defamation and intentional infliction of emotional distress.
[42:09.120 --> 42:12.120] And we're about to make fun of Alex Jones,
[42:12.120 --> 42:17.120] but as we're doing it, keep in mind that this all sounds silly and ridiculous,
[42:17.120 --> 42:20.120] but it's causing real harm to these families.
[42:20.120 --> 42:23.120] And I don't want to make light of it, but at the same time,
[42:23.120 --> 42:25.120] there's something really satisfying,
[42:25.120 --> 42:29.120] especially in the misinformation apocalypse that we're in right now,
[42:29.120 --> 42:33.120] about somebody who is this awful actually being held accountable.
[42:33.120 --> 42:37.120] So we've got to at least appreciate that for a minute.
[42:37.120 --> 42:40.120] Also, his lawyers are comically terrible.
[42:40.120 --> 42:42.120] How can they be that bad?
[42:42.120 --> 42:45.120] I mean, for a guy that has this much money,
[42:45.120 --> 42:48.120] how could he? Because it's a losing case.
[42:48.120 --> 42:50.120] Because nobody wants to defend him.
[42:50.120 --> 42:54.120] He probably has been working his way down the ladder of terrible lawyers.
[42:54.120 --> 42:56.120] And you've had that experience.
[42:56.120 --> 42:58.120] I mean, his lawyers were pretty terrible.
[42:58.120 --> 43:01.120] With your case, your opponent had that as well.
[43:01.120 --> 43:05.120] He kept going through lawyers because nobody of quality would defend him.
[43:05.120 --> 43:07.120] Who wants to defend this guy?
[43:07.120 --> 43:10.120] The other thing is that they did it on purpose.
[43:10.120 --> 43:11.120] That's what I was thinking.
[43:11.120 --> 43:12.120] You think they're sandbagging?
[43:12.120 --> 43:13.120] Yeah.
[43:13.120 --> 43:15.120] His morals got the better of him.
[43:15.120 --> 43:17.120] That thought has been brought up.
[43:17.120 --> 43:20.120] But the thing is, one, it's a civil case,
[43:20.120 --> 43:24.120] so he can't get away with the whole, like, my lawyers were incompetent,
[43:24.120 --> 43:26.120] so get out of it that way.
[43:26.120 --> 43:30.120] But also, they cross-examined the parents.
[43:30.120 --> 43:33.120] And I feel like if you were sandbagging it,
[43:33.120 --> 43:36.120] you wouldn't want to inflict additional trauma on the parents.
[43:36.120 --> 43:40.120] And some of the questions that he was asking them, I couldn't believe.
[43:40.120 --> 43:43.120] Have the lawyers made a statement about how it happened?
[43:43.120 --> 43:47.120] Because it's hard to accidentally send a huge set of files or file.
[43:47.120 --> 43:49.120] I always forget to send attachments.
[43:49.120 --> 43:52.120] Oh, the phone that's almost definitely going to the January 6th committee
[43:52.120 --> 43:54.120] is like a whole story in itself.
[43:54.120 --> 43:57.120] But basically, the one lawyer said,
[43:57.120 --> 44:00.120] please disregard after he accidentally sent the files,
[44:00.120 --> 44:04.120] but didn't actually take the legal steps to pull back all that information.
[44:04.120 --> 44:08.120] So they just got to use it after his ten days were up.
[44:08.120 --> 44:11.120] This trial was specifically for damages,
[44:11.120 --> 44:15.120] because Alex Jones didn't provide any of the documents or evidence
[44:15.120 --> 44:17.120] that he was supposed to during the discovery phase,
[44:17.120 --> 44:20.120] and he dragged things on for years, and so there was a default judgment.
[44:20.120 --> 44:23.120] So it wasn't a question of if the defamation happened.
[44:23.120 --> 44:25.120] The court had decided the defamation happened.
[44:25.120 --> 44:30.120] This was just to decide how much he had to pay for it.
[44:30.120 --> 44:36.120] And the trial was exactly as dramatic as the clips are portraying it to be,
[44:36.120 --> 44:39.120] and I think this one exchange between Alex Jones and the judge
[44:39.120 --> 44:43.120] is the epitome of his testimony at least.
[44:43.120 --> 44:45.120] So I'm going to read that.
[44:45.120 --> 44:48.120] I'm sorry, I don't have as good an Alex Jones impression as George.
[44:48.120 --> 44:53.120] So the judge, after sending the jury out because Alex Jones was talking about
[44:53.120 --> 44:56.120] things that he wasn't supposed to while he was on the stand,
[44:56.120 --> 44:59.120] said, you're already under oath to tell the truth.
[44:59.120 --> 45:02.120] You've already violated that oath twice today.
[45:02.120 --> 45:03.120] And granted, twice today.
[45:03.120 --> 45:07.120] He had been on the stand for like 10 minutes by that point maybe.
[45:07.120 --> 45:11.120] That might be an exaggeration, but it was end of the day,
[45:11.120 --> 45:12.120] he had just gotten on the stand.
[45:12.120 --> 45:16.120] It seems absurd to instruct you that you must tell the truth while you testify,
[45:16.120 --> 45:18.120] yet here I am.
[45:18.120 --> 45:20.120] You must tell the truth when you testify.
[45:20.120 --> 45:22.120] This is not your show.
[45:22.120 --> 45:25.120] And then she explains some of the specifics, and she goes,
[45:25.120 --> 45:27.120] do you understand what I have said?
[45:27.120 --> 45:30.120] And he goes, I, and she interrupts him and says, yes or no.
[45:30.120 --> 45:34.120] He goes, yes, I believe what I said is true.
[45:34.120 --> 45:35.120] And she cuts him off.
[45:35.120 --> 45:39.120] She goes, you believe everything you say is true, but it isn't.
[45:39.120 --> 45:41.120] Your beliefs do not make something true.
[45:41.120 --> 45:43.120] That's what we're doing here.
[45:43.120 --> 45:44.120] Oh my God.
[45:44.120 --> 45:45.120] Wow.
[45:45.120 --> 45:48.120] And you should really watch that whole clip because there was so much more of it,
[45:48.120 --> 45:50.120] but I couldn't go into the whole thing.
[45:50.120 --> 45:54.120] And watch all the clips from his testimony because it is absolutely horrifying,
[45:54.120 --> 45:58.120] but also really satisfying because he's an awful person and deserves every bit of that.
[45:58.120 --> 46:02.120] And I can't help, through all the things that I've consumed about this man,
[46:02.120 --> 46:07.120] I can't help but think that this entire thing is an act.
[46:07.120 --> 46:08.120] I was thinking the same, Jay.
[46:08.120 --> 46:10.120] I'm wondering what you all think about that.
[46:10.120 --> 46:13.120] You think he knows what he's doing and he's just pretending?
[46:13.120 --> 46:18.120] Of course, I'm not 100% sure, but it just seems like it is all a money-making act.
[46:18.120 --> 46:21.120] Like I don't think he's a real conspiracy theorist.
[46:21.120 --> 46:22.120] I think he is.
[46:22.120 --> 46:23.120] No, I think you're right.
[46:23.120 --> 46:27.120] He uses his conspiracies to sell supplements because he'll talk about the conspiracy theory
[46:27.120 --> 46:33.120] to get the views and then he pivots into an ad for supplements or for shelf-stable food
[46:33.120 --> 46:36.120] because the Great Reset is coming and so you need to have food,
[46:36.120 --> 46:39.120] or gold because there's going to be one world currency, so you need gold.
[46:39.120 --> 46:44.120] And didn't he admit as much during his trial with his, what, divorce with his wife, effectively?
[46:44.120 --> 46:45.120] Custody.
[46:45.120 --> 46:46.120] Was it custody?
[46:46.120 --> 46:49.120] Yeah, Alex Jones is a character that he is playing.
[46:49.120 --> 46:52.120] That was one of his lines of defense,
[46:52.120 --> 46:54.120] which I think probably is accurate.
[46:54.120 --> 46:56.120] Again, we can't read his mind.
[46:56.120 --> 46:58.120] We don't really know what he believes or doesn't believe,
[46:58.120 --> 47:01.120] but it certainly is plausible and it certainly fits everything I've seen about him,
[47:01.120 --> 47:03.120] that this is a character he's playing.
[47:03.120 --> 47:08.120] He did admit that, which means he doesn't necessarily have to believe anything.
[47:08.120 --> 47:10.120] But he's still doing the same level of damage, whether or not.
[47:10.120 --> 47:11.120] Totally.
[47:11.120 --> 47:12.120] That's right.
[47:12.120 --> 47:13.120] Absolutely.
[47:13.120 --> 47:14.120] People believe that he's real.
[47:14.120 --> 47:16.120] Well, and he's doing the character under oath, right?
[47:16.120 --> 47:17.120] Yes, that's the thing.
[47:17.120 --> 47:19.120] That has consequences.
[47:19.120 --> 47:23.120] It's been so interesting to watch because he's not used to being challenged on his show.
[47:23.120 --> 47:25.120] He has control over the entire narrative.
[47:25.120 --> 47:27.120] Now he has to be in reality.
[47:27.120 --> 47:32.120] And so he started to do one of his ad pitches on the stand.
[47:32.120 --> 47:35.120] He started talking about how great his supplements are and they get the best supplements.
[47:35.120 --> 47:36.120] He can't help it.
[47:36.120 --> 47:37.120] Oh, my God.
[47:37.120 --> 47:39.120] It's all he knows, effectively.
[47:39.120 --> 47:43.120] If he can make a few bucks on the stand, why not go for it, I guess, right?
[47:43.120 --> 47:47.120] It's always satisfying to see, because this is not the first time this has happened,
[47:47.120 --> 47:51.120] and there are cases where people who are con artists or pseudoscientists or whatever,
[47:51.120 --> 47:55.120] and they find themselves in a court of law where there are rules of evidence.
[47:55.120 --> 48:01.120] Not that courts are perfect, but they do have fairly rigorous rules of evidence and argument,
[48:01.120 --> 48:03.120] et cetera.
[48:03.120 --> 48:08.120] Judges, if they're competent, aren't going to let you get away with stuff.
[48:08.120 --> 48:13.120] And just watching that disconnect, somebody like Alex Jones who's living in a fantasy world,
[48:13.120 --> 48:19.120] whether he believes it or not, he is used to being in this con artist construct,
[48:19.120 --> 48:25.120] and now he has to deal with reality and rules of evidence,
[48:25.120 --> 48:29.120] and the clash is just wonderful to behold.
[48:29.120 --> 48:33.120] It's kind of reminding me, Jay, I think you talked about this on a live, an SGU Live,
[48:33.120 --> 48:39.120] maybe a year ago when Sanjay Gupta was on Joe Rogan and we all expected it to be kind of like that,
[48:39.120 --> 48:42.120] but Joe Rogan just sort of steamrolled the whole thing.
[48:42.120 --> 48:46.120] This is what I wish that had been like, because now we're in a place where the rules,
[48:46.120 --> 48:48.120] reality has to hold for a second.
[48:48.120 --> 48:53.120] Fun fact, Joe Rogan was on Infowars on 9-11.
[48:53.120 --> 48:55.120] As he was spewing his...
[48:55.120 --> 48:58.120] One of the least fun, fun facts I've ever heard.
[48:58.120 --> 49:03.120] As soon as 9-11 happened, he was already spewing conspiracy theories,
[49:03.120 --> 49:05.120] and then he had Joe Rogan on.
[49:05.120 --> 49:09.120] Wait, wait, Joe Rogan was on Alex Jones' Infowars show?
[49:09.120 --> 49:13.120] Well, that guy literally just dropped lower than I thought he would.
[49:13.120 --> 49:15.120] That is ridiculous.
[49:15.120 --> 49:21.120] So I read in the chat, somebody said something about Texas tort law
[49:21.120 --> 49:27.120] that drops the 45 million down to 750,000.
[49:27.120 --> 49:28.120] I read that too.
[49:28.120 --> 49:32.120] From what I saw from the plaintiff's lawyer, he was saying...
[49:32.120 --> 49:37.120] So there was talk about a cap because it was divided into two sets of damages.
[49:37.120 --> 49:40.120] So there were the compensatory damages and the punitive damages.
[49:40.120 --> 49:46.120] The compensatory damages were 4.5 million, and then the punitive damages were 41 million.
[49:46.120 --> 49:50.120] And while we were waiting to hear what the punitive damages were,
[49:50.120 --> 49:53.120] people were talking about a cap because it had to be a certain multiple
[49:53.120 --> 49:55.120] of the compensatory damages.
[49:55.120 --> 50:00.120] But from the statement that the plaintiff's lawyer gave afterwards,
[50:00.120 --> 50:03.120] that was more of a guideline, not a hard cap.
[50:03.120 --> 50:05.120] More of a guideline.
[50:05.120 --> 50:07.120] I'm just going based on his statement.
[50:07.120 --> 50:10.120] I don't know anything about Texas law, not a lawyer.
[50:10.120 --> 50:13.120] But that was what I heard about that.
[50:13.120 --> 50:18.120] I was hoping to see them literally dismantle him and his company.
[50:18.120 --> 50:21.120] Why wouldn't this guy see prison time?
[50:21.120 --> 50:24.120] It's a civil case, you don't know prison.
[50:24.120 --> 50:30.120] I understand that, but it doesn't mean that he can't be put in prison legitimately.
[50:30.120 --> 50:32.120] He did perjure himself.
[50:32.120 --> 50:34.120] That would be a whole other story.
[50:34.120 --> 50:37.120] That would be something emerging from the trial itself.
[50:37.120 --> 50:43.120] But it's hard to bring criminal charges against somebody for what they're saying
[50:43.120 --> 50:46.120] in a public forum because of free speech laws, etc.
[50:46.120 --> 50:48.120] But civil is different.
[50:48.120 --> 50:53.120] Holding people liable for the damage that they knowingly and maliciously caused,
[50:53.120 --> 50:55.120] the law allows for that.
[50:55.120 --> 50:59.120] One more thing I did want to bring up is, in my opinion,
[50:59.120 --> 51:01.120] one of the best witnesses that they had.
[51:01.120 --> 51:06.120] Her name is Becca Lewis and she does research in misinformation and disinformation
[51:06.120 --> 51:08.120] and how it spreads.
[51:08.120 --> 51:11.120] They had her on as an expert witness about misinformation.
[51:11.120 --> 51:15.120] She talked about how and why it spreads faster than the truth
[51:15.120 --> 51:20.120] since it feeds into people's world views, the confirmation bias.
[51:20.120 --> 51:24.120] The things that confirm their existing world views are going to circulate,
[51:24.120 --> 51:27.120] especially once you start to have echo chambers like Infowars'.
[51:27.120 --> 51:31.120] Also, Alex Jones platformed other conspiracy theorists.
[51:31.120 --> 51:35.120] There was one that she talked about who his content only had three views
[51:35.120 --> 51:38.120] before Alex Jones started promoting it.
[51:38.120 --> 51:40.120] It was something that nobody was going to see.
[51:40.120 --> 51:43.120] But because of his platform, a lot of people saw it.
[51:43.120 --> 51:49.120] Now we have 24% of the country who questions this main narrative.
[51:49.120 --> 51:51.120] That was a lot of what the trial was about.
[51:51.120 --> 51:53.120] He would claim, oh, I was just asking questions.
[51:53.120 --> 51:56.120] I was just having these people on to get their opinion.
[51:56.120 --> 51:58.120] Oh, my guest said it, but I didn't say it.
[51:58.120 --> 52:02.120] But he provided that platform for them to get their views out.
[52:02.120 --> 52:06.120] I think the most interesting thing she talked about was this idea
[52:06.120 --> 52:09.120] of three degrees of Alex Jones.
[52:09.120 --> 52:13.120] She said that you basically can't do misinformation research
[52:13.120 --> 52:16.120] without encountering Infowars and Alex Jones.
[52:16.120 --> 52:22.120] The common rule is that you're never more than three recommendations away
[52:22.120 --> 52:26.120] from Alex Jones or Infowars videos.
[52:26.120 --> 52:27.120] Wow.
[52:27.120 --> 52:29.120] Ouch.
[52:29.120 --> 52:34.120] The way to restate that is you can't be more full of shit than Alex Jones.
[52:34.120 --> 52:36.120] Yeah, basically.
[52:36.120 --> 52:41.120] Jones' lawyer was trying to trip her up, and he was trying to use
[52:41.120 --> 52:44.120] all of the things that a scientist or a skeptic would use.
[52:44.120 --> 52:48.120] He's talking about sample size and bias and things like that
[52:48.120 --> 52:51.120] because in any paper at the end, they're going to talk about
[52:51.120 --> 52:54.120] all of the limitations and say, like, this is a potential limitation.
[52:54.120 --> 52:57.120] This is a potential source of bias, but we tried to account for it
[52:57.120 --> 52:59.120] as best we could.
[52:59.120 --> 53:02.120] But she's a researcher, so she knew it a lot better than he did.
[53:02.120 --> 53:06.120] So she'd stop and she'd be like, no, this is what that means.
[53:06.120 --> 53:08.120] You have no idea what you're talking about.
[53:08.120 --> 53:10.120] Oh, that's great.
[53:10.120 --> 53:13.120] Yeah, and he tried to say that she hated Alex Jones and things like that,
[53:13.120 --> 53:17.120] and that would bias her, and she didn't know who Alex Jones was
[53:17.120 --> 53:19.120] before she started researching this.
[53:19.120 --> 53:21.120] And she just goes, yes, that's correct.
[53:21.120 --> 53:25.120] Like, when he'd present something, she'd say, yes, that's correct,
[53:25.120 --> 53:27.120] and it's based on hundreds of hours of research.
[53:27.120 --> 53:29.120] It's not just her opinion.
[53:29.120 --> 53:32.120] And so he kept trying to trip her up, and the best part was
[53:32.120 --> 53:37.120] he was asking her questions and said, the poll that found
[53:37.120 --> 53:42.120] 24% questioned Sandy Hook, that it was under 1,000 sample size
[53:42.120 --> 53:45.120] and was trying to discredit it that way.
[53:45.120 --> 53:47.120] And she's like, you can have statistical significance
[53:47.120 --> 53:50.120] with less than 1,000 sample size, like trying to explain that.
[53:50.120 --> 53:55.120] And then the plaintiff's lawyer comes up and hands her the actual study
[53:55.120 --> 54:00.120] and the Jones lawyer was full of shit because it was over 1,000.
[54:00.120 --> 54:02.120] So it wasn't even that, yeah.
[54:02.120 --> 54:04.120] Even the lawyer is full of BS.
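A quick sketch of the statistical point Lewis was making: for a simple random sample, the 95% margin of error on a proportion shrinks with the square root of the sample size, so roughly 1,000 respondents pins down a figure like 24% to within a few points. The numbers below are illustrative only, not the actual poll's methodology.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p from a simple
    random sample of size n (the textbook normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative numbers only, not the actual poll's design:
p, n = 0.24, 1000
print(f"{p:.0%} +/- {margin_of_error(p, n):.1%}")   # about 24% +/- 2.6 points
```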
[54:04.120 --> 54:10.120] We're really seeing this trend here with these crazy lawsuits.
[54:10.120 --> 54:13.120] How do you defend Alex Jones legitimately?
[54:13.120 --> 54:15.120] How do you do it?
[54:15.120 --> 54:18.120] You literally have to try to slip through some cracks.
[54:18.120 --> 54:22.120] Well, but you also don't have to defend him and say he's innocent.
[54:22.120 --> 54:24.120] I mean, I know innocent and guilty isn't what's happening here
[54:24.120 --> 54:26.120] because it's a civil case, but you don't have to say,
[54:26.120 --> 54:28.120] oh, no, he didn't defame people.
[54:28.120 --> 54:33.120] You can just try to mitigate the damage in an ethical way.
[54:33.120 --> 54:37.120] Well, a lawyer can give a defense they don't personally believe;
[54:37.120 --> 54:39.120] they don't have to believe it.
[54:39.120 --> 54:42.120] The ethics of law does not require that.
[54:42.120 --> 54:46.120] It just has to be a legally responsible and viable argument.
[54:46.120 --> 54:50.120] Their personal belief is actually not relevant to it.
[54:50.120 --> 54:54.120] So as long as they are mounting an ethical defense, it's fine.
[54:54.120 --> 54:58.120] But it's certainly reasonable to think that there isn't an ethical defense
[54:58.120 --> 55:07.120] of somebody like Alex Jones because it seems so obvious that he's guilty.
[55:07.120 --> 55:11.120] But again, the law is based upon the notion that everybody deserves a defense.
[55:11.120 --> 55:15.120] But that doesn't mean that lawyers can do unethical things on the stand.
[55:15.120 --> 55:18.120] It also is why I think that might speak to the quality of the lawyers
[55:18.120 --> 55:22.120] because, again, the high-quality lawyers, Jones clearly has the money.
[55:22.120 --> 55:25.120] He could pay some high-priced legal firm to defend him.
[55:25.120 --> 55:28.120] They probably don't want their reputation sullied with this.
[55:28.120 --> 55:29.120] They don't want to go anywhere near it.
[55:29.120 --> 55:31.120] Nobody wants to be the guy who defended Alex Jones.
[55:31.120 --> 55:32.120] Right.
[55:32.120 --> 55:34.120] Do we have any idea how much money, like what his net worth is?
[55:34.120 --> 55:36.120] Like how ruinous is $41 million, $45 million?
[55:36.120 --> 55:38.120] They were desperately trying to figure that out.
[55:38.120 --> 55:42.120] So officially, I'm sorry if you didn't notice, but officially,
[55:42.120 --> 55:46.120] his enterprise makes $200,000 a day.
[55:46.120 --> 55:48.120] But $200,000 a day.
[55:48.120 --> 55:50.120] Is that net?
[55:50.120 --> 55:52.120] But that's probably an underestimate.
[55:52.120 --> 55:58.120] And in the phone records that were revealed, on some days they make up to $800,000.
[55:58.120 --> 55:59.120] That was their best day.
[55:59.120 --> 56:01.120] That was a good day, yeah.
[56:01.120 --> 56:03.120] You guys have got to sell supplements, man.
[56:03.120 --> 56:04.120] This is right.
[56:04.120 --> 56:06.120] We've got to switch sides.
[56:06.120 --> 56:09.120] But they had a really hard time figuring that kind of stuff out
[56:09.120 --> 56:11.120] because he didn't turn over all the documents that he was supposed to turn over.
[56:11.120 --> 56:12.120] Right, part of the problem.
[56:12.120 --> 56:15.120] So they couldn't really get a solid answer on that.
[56:15.120 --> 56:16.120] What kind of bullshit is that?
[56:16.120 --> 56:17.120] Okay, so you don't do that.
[56:17.120 --> 56:19.120] You don't turn over the documents.
[56:19.120 --> 56:25.120] Like doesn't the law, doesn't the court have the ability to deliver some type of incredible smackdown?
[56:25.120 --> 56:27.120] So that's what they did.
[56:27.120 --> 56:29.120] That was why there was the default judgment.
[56:29.120 --> 56:34.120] And so that's why this was just for damages because they already determined that he was liable
[56:34.120 --> 56:37.120] for the defamation and for the infliction of emotional distress.
[56:37.120 --> 56:39.120] I get it, they kicked it into summary judgment.
[56:39.120 --> 56:41.120] See, we have some experience with that.
[56:41.120 --> 56:42.120] Yeah.
[56:42.120 --> 56:44.120] But in a good way.
[56:44.120 --> 56:47.120] Don't you get into legal trouble if you don't hand over?
[56:47.120 --> 56:49.120] Like doesn't he have to now deal with the fact?
[56:49.120 --> 56:53.120] Well, you could be held in contempt, right, would be the legal remedy there.
[56:53.120 --> 56:58.120] But just in a case like this, the remedy is you lose.
[56:58.120 --> 57:03.120] You now lose the case and now we're going to talk about how much money you have to pay the plaintiff.
[57:03.120 --> 57:05.120] So that was the remedy.
[57:05.120 --> 57:11.120] He was asked, you know, turn over like emails or texts where, you know, you mentioned Sandy Hook.
[57:11.120 --> 57:17.120] And he said, I did a search on my phone, did not see any text that mentioned Sandy Hook.
[57:17.120 --> 57:21.120] So I want to know what did the court or the judge do at that point?
[57:21.120 --> 57:26.120] Because then, of course, afterwards they got two years of text and of course it's all over the place.
[57:26.120 --> 57:28.120] So he was just flat out lying.
[57:28.120 --> 57:31.120] But if they didn't get that dump, what recourse would they have had to say?
[57:31.120 --> 57:32.120] Yeah, I don't believe you.
[57:32.120 --> 57:34.120] I don't believe your phone doesn't have those.
[57:34.120 --> 57:36.120] They can get the info if they want to.
[57:36.120 --> 57:38.120] They can get the info.
[57:38.120 --> 57:43.120] They can appoint somebody to go through the phone and get the information that they want.
[57:43.120 --> 57:46.120] I know, like when I had to turn over my emails, I didn't do it myself.
[57:46.120 --> 57:52.120] My lawyer hired an independent person to come in, go through all my emails and find the ones that were relevant.
[57:52.120 --> 57:54.120] My hands were not on it at all.
[57:54.120 --> 57:55.120] All right.
[57:55.120 --> 57:57.120] Anything else you want to add before we move on?
[57:57.120 --> 58:00.120] I will throw a quote out there from the lawyer today.
[58:00.120 --> 58:03.120] So this was just the first of a few cases.
[58:03.120 --> 58:10.120] And the plaintiff's lawyer said, there's going to be a large set of plaintiffs dividing up the corpse of Infowars.
[58:10.120 --> 58:12.120] And fingers crossed that that actually happens.
[58:12.120 --> 58:13.120] Yeah, that would be nice.
[58:13.120 --> 58:15.120] Tiny slice of justice in this book.
[58:15.120 --> 58:17.120] The corpse of Infowars.
[58:17.120 --> 58:18.120] It's a nice sentence.
[58:18.120 --> 58:19.120] Add that to your Halloween display.
[58:19.120 --> 58:21.120] I would, I would.
Earth Spinning Faster (58:21)
- Earth Is Suddenly Spinning Faster. Why Our Planet Just Recorded Its Shortest Day Since Records Began[5]
[58:21.120 --> 58:22.120] All right, Bob.
[58:22.120 --> 58:30.120] I understand that the earth is supposed to be slowing down over long historical time.
[58:30.120 --> 58:33.120] But maybe that's not 100 percent true.
[58:33.120 --> 58:36.120] Well, you know, I don't want to get everybody concerned.
[58:36.120 --> 58:43.120] But the earth is now spinning faster than it ever has before in the age of atomic clocks.
[58:43.120 --> 58:45.120] I thought I felt something.
[58:45.120 --> 58:51.120] January 22nd, this past year, January 22nd, no, June 22nd, 2022.
[58:51.120 --> 58:53.120] The shortest day ever recorded.
[58:53.120 --> 58:54.120] And we're not sure why.
[58:54.120 --> 58:55.120] Should we be scared?
[58:55.120 --> 58:57.120] Should we be afraid?
[58:57.120 --> 58:58.120] So what's what's going on here?
[58:58.120 --> 59:00.120] You mean the longest day ever recorded?
[59:00.120 --> 59:01.120] What did I say?
[59:01.120 --> 59:02.120] Shortest day.
[59:02.120 --> 59:03.120] Shortest day.
[59:03.120 --> 59:04.120] Because the earth is spinning faster.
[59:04.120 --> 59:05.120] Faster, so it's short days, right?
[59:05.120 --> 59:06.120] Yeah, it's getting shorter.
[59:06.120 --> 59:07.120] Yeah, it'd be shorter.
[59:07.120 --> 59:09.120] So it all starts with a day.
[59:09.120 --> 59:10.120] What is a day?
[59:10.120 --> 59:11.120] Yeah, what's a day?
[59:11.120 --> 59:12.120] If you ask anybody, what's a day?
[59:12.120 --> 59:13.120] 24 hours.
[59:13.120 --> 59:14.120] 24 hours.
[59:14.120 --> 59:15.120] Steve, what is that in metric?
[59:15.120 --> 59:17.120] Oh, never mind.
[59:17.120 --> 59:20.120] So a mean solar day is 24 hours.
[59:20.120 --> 59:21.120] That's right.
[59:21.120 --> 59:22.120] That's what it is.
[59:22.120 --> 59:25.120] But that's the outermost onion layer.
[59:25.120 --> 59:29.120] As we say, you get a little deeper and it's never really 24 hours.
[59:29.120 --> 59:30.120] Exactly.
[59:30.120 --> 59:31.120] It kind of is 24 hours.
[59:31.120 --> 59:33.120] It goes a little shorter, a little longer.
[59:33.120 --> 59:35.120] It's like right around 24 hours.
[59:35.120 --> 59:38.120] 24 hours should be the average.
[59:38.120 --> 59:43.120] But it varies because you've got the interior of the earth kind of roiling around.
[59:43.120 --> 59:45.120] You've got seismic activity.
[59:45.120 --> 59:49.120] You've got the wind, the wind running across the surface of the earth and causing
[59:49.120 --> 59:51.120] friction, pushing against mountains.
[59:51.120 --> 59:56.120] All those things conspire to make the day, you know, slower and faster than 24 hours.
[59:56.120 --> 01:00:02.120] But if you look at it over many decades, what you find is that the average is
[01:00:02.120 --> 01:00:06.120] about 24 hours and point zero zero one seconds.
[01:00:06.120 --> 01:00:08.120] So somebody asks you, how long is a day?
[01:00:08.120 --> 01:00:12.120] You say 24 hours and point zero zero one seconds, because that would be more accurate,
[01:00:12.120 --> 01:00:13.120] a little bit more accurate.
[01:00:13.120 --> 01:00:17.120] But the problem here is that we have two ways to tell time.
[01:00:17.120 --> 01:00:19.120] Really, we have atomic time, which is extremely accurate.
[01:00:19.120 --> 01:00:20.120] And there's solar time.
[01:00:20.120 --> 01:00:25.120] And every day, if the earth is a little bit slower, a little bit faster, it notches up
[01:00:25.120 --> 01:00:27.120] and it diverges from atomic time.
[01:00:27.120 --> 01:00:32.120] And after a while, you can't get beyond this, which is about, I don't know, 10 seconds.
[01:00:32.120 --> 01:00:34.120] They don't want to get beyond that, whatever that is.
[01:00:34.120 --> 01:00:36.120] So they throw in a leap second.
[01:00:36.120 --> 01:00:37.120] That's what a leap second is.
[01:00:37.120 --> 01:00:41.120] A leap second isn't because, oh, the earth is slowing and slowing and slowing and we
[01:00:41.120 --> 01:00:42.120] need to throw in a second.
[01:00:42.120 --> 01:00:46.120] It's because of that divergence between atomic time and solar time.
[01:00:46.120 --> 01:00:47.120] That's what a leap second is.
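A minimal sketch of the bookkeeping being described here: each day's length differs slightly from 86,400 atomic seconds, the differences accumulate as a gap between solar time (UT1) and atomic-based UTC, and a leap second is applied when that gap gets too big. The real convention keeps the UT1 minus UTC difference under about 0.9 seconds; the daily offsets and the trigger logic below are simplified for illustration and are not the actual IERS procedure.

```python
# Simplified sketch: accumulate each day's difference from 86,400 atomic
# seconds and adjust with a leap second when UT1 and UTC drift too far apart.
THRESHOLD = 0.9  # seconds; IERS keeps |UT1 - UTC| below roughly this value

def simulate(daily_excess_ms):
    """daily_excess_ms: per-day (observed day length - 86,400 s), in milliseconds."""
    gap = 0.0            # UT1 - UTC, in seconds
    leaps = 0
    for day, excess_ms in enumerate(daily_excess_ms, start=1):
        gap -= excess_ms / 1000.0          # longer days -> UT1 falls behind UTC
        if gap <= -THRESHOLD:              # Earth running slow: positive leap second
            gap += 1.0
            leaps += 1
            print(f"day {day}: leap second inserted (total {leaps})")
        elif gap >= THRESHOLD:             # Earth running fast: negative leap second
            gap -= 1.0
            leaps -= 1
            print(f"day {day}: negative leap second (total {leaps})")
    return gap, leaps

# Hypothetical input: 2,000 consecutive days each running 1 ms long.
simulate([1.0] * 2000)
```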
[01:00:47.120 --> 01:00:51.120] So why is there this general average slowing of the earth?
[01:00:51.120 --> 01:00:52.120] There's a bunch of reasons.
[01:00:52.120 --> 01:00:55.120] The main and most fascinating one for me is tidal braking.
[01:00:55.120 --> 01:00:59.120] It's because of that damn moon; the moon is doing it.
[01:00:59.120 --> 01:01:01.120] The tides, it's happening because of the tides.
[01:01:01.120 --> 01:01:04.120] So stealing our angular momentum.
[01:01:04.120 --> 01:01:05.120] Exactly.
[01:01:05.120 --> 01:01:06.120] Exactly.
[01:01:06.120 --> 01:01:10.120] Because of the way the earth is rotating and the bulges created by the tides,
[01:01:10.120 --> 01:01:14.120] the moon is pulling on that bulge, which actually causes friction on the earth,
[01:01:14.120 --> 01:01:17.120] which slows the earth, making our days longer.
[01:01:17.120 --> 01:01:21.120] And the moon is stealing our rotational energy, our angular momentum, because that's got to
[01:01:21.120 --> 01:01:22.120] be conserved.
[01:01:22.120 --> 01:01:25.120] And that's going into a higher orbit and getting farther and farther and farther away.
[01:01:25.120 --> 01:01:30.120] And eventually, if the solar system lasts long enough, which it won't, it will get so
[01:01:30.120 --> 01:01:33.120] far away that we'll be facing each other.
[01:01:33.120 --> 01:01:36.120] The earth and the moon will be facing each other, will be tidally locked like the moon
[01:01:36.120 --> 01:01:37.120] is to us right now.
[01:01:37.120 --> 01:01:41.120] So that's just the interesting aside of why the earth is slowing.
[01:01:41.120 --> 01:01:46.120] When and if that ever happens, does that mean that one side of the earth would be getting
[01:01:46.120 --> 01:01:48.120] sun and the other side will not be getting sun?
[01:01:48.120 --> 01:01:50.120] No, it's all about the orientation of the earth and the moon.
[01:01:50.120 --> 01:01:51.120] Right.
[01:01:51.120 --> 01:01:52.120] It's not tidally locked to the sun.
[01:01:52.120 --> 01:01:53.120] It's tidally locked to the moon.
[01:01:53.120 --> 01:01:54.120] Right.
[01:01:54.120 --> 01:01:55.120] Now, if we were like...
[01:01:55.120 --> 01:01:57.120] Would the whole thing rotate, basically?
[01:01:57.120 --> 01:01:58.120] Yes.
[01:01:58.120 --> 01:02:00.120] We would always be facing each other.
[01:02:00.120 --> 01:02:01.120] Our orbit would be like this.
[01:02:01.120 --> 01:02:04.120] Instead of the moon is locked now and the earth is rotating.
[01:02:04.120 --> 01:02:07.120] So some side of the earth will see the moon always and the other side will never see
[01:02:07.120 --> 01:02:08.120] the moon.
[01:02:08.120 --> 01:02:11.120] But that wouldn't happen because we're going to burn up before we get to that point, I
[01:02:11.120 --> 01:02:12.120] believe.
[01:02:12.120 --> 01:02:13.120] Oh, thank you.
[01:02:13.120 --> 01:02:14.120] Perfect.
[01:02:14.120 --> 01:02:16.120] But there are planets that have been tidally locked to their sun because they're very big
[01:02:16.120 --> 01:02:19.120] and they're very close to their parent star.
[01:02:19.120 --> 01:02:22.120] So the tidal forces are strong enough to tidally lock that.
[01:02:22.120 --> 01:02:28.120] But 2020, 2021, and 2022 were a little bit different.
[01:02:28.120 --> 01:02:35.120] And it wasn't just because of that damn pandemic because these were the shortest days ever
[01:02:35.120 --> 01:02:36.120] recorded.
[01:02:36.120 --> 01:02:41.120] 2020 had 28 of the shortest days ever recorded since 1960.
[01:02:41.120 --> 01:02:42.120] What?
[01:02:42.120 --> 01:02:43.120] 28 days.
[01:02:43.120 --> 01:02:44.120] Why?
[01:02:44.120 --> 01:02:49.120] 2021 also had a plethora of very, very short days.
[01:02:49.120 --> 01:02:53.120] No dramatic records were broken in 2021, but they were still very, very short.
[01:02:53.120 --> 01:02:59.120] Oh, and 2020, I think we all can agree that if the days in 2020 were shorter, that's a
[01:02:59.120 --> 01:03:03.120] good thing because that year needed to be shorter than it was.
[01:03:03.120 --> 01:03:05.120] Literally the only good thing is this.
[01:03:05.120 --> 01:03:06.120] Right.
[01:03:06.120 --> 01:03:07.120] So 2022, we're not even done with it.
[01:03:07.120 --> 01:03:09.120] We've already broken some good records.
[01:03:09.120 --> 01:03:14.120] June 22nd was 1.59 milliseconds shorter than 24 hours.
[01:03:14.120 --> 01:03:15.120] Holy shit.
[01:03:15.120 --> 01:03:16.120] The shortest day.
[01:03:16.120 --> 01:03:17.120] Is that a lot?
[01:03:17.120 --> 01:03:23.120] It's not an absolute a lot, but relative to history, it is a lot.
[01:03:23.120 --> 01:03:24.120] 1.59 milliseconds.
[01:03:24.120 --> 01:03:27.120] The short day, shortest day ever recorded, ever recorded.
[01:03:27.120 --> 01:03:31.120] And then in July, we had a day that was the second shortest.
[01:03:31.120 --> 01:03:33.120] So something's happening.
[01:03:33.120 --> 01:03:40.120] So why do we have three years where the average day was less than 24 hours when over the past
[01:03:40.120 --> 01:03:46.120] 30, 40, 50, 60, 70 years, the average day has been a little bit longer than 24 hours?
[01:03:46.120 --> 01:03:47.120] Why?
[01:03:47.120 --> 01:03:48.120] What's going on?
[01:03:48.120 --> 01:03:49.120] Well, we're not sure.
[01:03:49.120 --> 01:03:52.120] We're not sure exactly, but there's lots, of course, there's lots of scientists and
[01:03:52.120 --> 01:03:53.120] their theories.
[01:03:53.120 --> 01:03:55.120] They have got lots of ideas of why.
[01:03:55.120 --> 01:04:00.120] One idea is that glaciers are melting and basically the poles don't have as much mass
[01:04:00.120 --> 01:04:03.120] or weight by them as they used to.
[01:04:03.120 --> 01:04:04.120] That's one idea that may be contributing.
[01:04:04.120 --> 01:04:06.120] So is that like a skater pulling in their arms?
[01:04:06.120 --> 01:04:07.120] Right.
[01:04:07.120 --> 01:04:08.120] Yes.
[01:04:08.120 --> 01:04:10.120] Redistribution of mass, like the skater pulling in their arms to go faster.
[01:04:10.120 --> 01:04:12.120] That's definitely related.
[01:04:12.120 --> 01:04:16.120] And related to that, Steve, another idea of why we're getting the speed-up is
[01:04:16.120 --> 01:04:22.120] the different movements of the molten core of the planet, which could also
[01:04:22.120 --> 01:04:23.120] speed up the Earth.
[01:04:23.120 --> 01:04:27.120] Seismic activity is another option that they throw out.
[01:04:27.120 --> 01:04:32.120] My theory is that it's the sheer mass of meatballs at Jay's house that is kind of screwing with
[01:04:32.120 --> 01:04:33.120] our rotation.
[01:04:33.120 --> 01:04:34.120] It would do it.
[01:04:34.120 --> 01:04:35.120] Jay, I'm telling you, man.
[01:04:35.120 --> 01:04:37.120] I've got two scientists that agree with me on that.
[01:04:37.120 --> 01:04:43.120] But a lot of scientists will also throw out there the Chandler wobble as one potential
[01:04:43.120 --> 01:04:44.120] reason why the Earth is speeding up.
[01:04:44.120 --> 01:04:45.120] Is that a dance?
[01:04:45.120 --> 01:04:46.120] What is it?
[01:04:46.120 --> 01:04:47.120] The Friends thing?
[01:04:47.120 --> 01:04:48.120] Yes.
[01:04:48.120 --> 01:04:49.120] That's the joke.
[01:04:49.120 --> 01:04:52.120] And I couldn't think of a really, really good version of that joke.
[01:04:52.120 --> 01:04:54.120] But I'll just describe what it is.
[01:04:54.120 --> 01:04:57.120] It's essentially the varying wobble of Earth's axis of rotation.
[01:04:57.120 --> 01:04:58.120] It's actually kind of complicated.
[01:04:58.120 --> 01:05:02.120] I'm trying to really wrap my head around what's exactly going on with this Chandler wobble.
[01:05:02.120 --> 01:05:07.120] But it's the axis of rotation that varies, causing a shorter term wobble.
[01:05:07.120 --> 01:05:09.120] So that's as much as I'll say about the Chandler wobble.
[01:05:09.120 --> 01:05:10.120] Okay, so what does this mean?
[01:05:10.120 --> 01:05:11.120] What's going to happen?
[01:05:11.120 --> 01:05:13.120] What are some really bad things?
[01:05:13.120 --> 01:05:16.120] Okay, it's the leap second that could be concerning here.
[01:05:16.120 --> 01:05:17.120] Because we've had leap seconds.
[01:05:17.120 --> 01:05:22.120] We've had plenty of leap seconds where you add an extra second to coordinated universal
[01:05:22.120 --> 01:05:23.120] time.
[01:05:23.120 --> 01:05:24.120] And that's been done.
[01:05:24.120 --> 01:05:26.120] Nobody really thinks about it anymore.
[01:05:26.120 --> 01:05:28.120] But it's problematic.
[01:05:28.120 --> 01:05:32.120] In 2012, Reddit was taken down because a leap second was added that year.
[01:05:32.120 --> 01:05:33.120] Wow.
[01:05:33.120 --> 01:05:39.120] And if I was into Reddit then as I am now, I would have been pissed if Reddit went down.
[01:05:39.120 --> 01:05:40.120] But they've done tricks.
[01:05:40.120 --> 01:05:43.120] They've got something called leap smearing, where they take microsecond slowdowns.
[01:05:43.120 --> 01:05:44.120] They need a rebrand.
[01:05:44.120 --> 01:05:45.120] Yes.
[01:05:45.120 --> 01:05:52.120] In the course of a day, they might do microsecond slowdowns leading up to the leap second.
[01:05:52.120 --> 01:05:55.120] So as to make it a little more palatable, I guess.
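The smearing idea is simple enough to sketch: rather than inserting a discrete extra second, the one-second adjustment is spread over a long window (Google, for example, uses a 24-hour linear smear), so synchronized systems never see a repeated or skipped second. A minimal linear-smear sketch, with illustrative numbers rather than any operator's real configuration:

```python
def linear_smear(t, smear_start, smear_len=86_400.0, leap=1.0):
    """Spread a one-second clock adjustment over a smear window.

    t, smear_start: seconds on the reference (un-smeared) timeline.
    smear_len: window length in seconds (24 h here, like a linear smear,
               but illustrative rather than any operator's real config).
    leap: total adjustment to absorb (+1 for a normal leap second, -1 negative).

    Before the window the smeared clock matches the reference; after the window
    it differs by exactly `leap` seconds; in between it drifts linearly, so no
    second is ever repeated or skipped outright.
    """
    if t <= smear_start:
        return t
    if t >= smear_start + smear_len:
        return t - leap
    frac = (t - smear_start) / smear_len
    return t - leap * frac

# Halfway through the window, half of the adjustment has been absorbed:
print(linear_smear(43_200.0, smear_start=0.0))  # 43199.5
```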
[01:05:55.120 --> 01:05:56.120] Bob, but wait.
[01:05:56.120 --> 01:05:58.120] I hate to cut in.
[01:05:58.120 --> 01:06:03.120] But why does a fraction of a second matter in the world?
[01:06:03.120 --> 01:06:05.120] Well, it's not a fraction of a second.
[01:06:05.120 --> 01:06:06.120] It's a full second.
[01:06:06.120 --> 01:06:07.120] I mean, think about it, Jay.
[01:06:07.120 --> 01:06:12.120] I mean, a second is small, but it's important.
[01:06:12.120 --> 01:06:17.120] And computer systems and GPS and satellites, lots of things are interrelated.
[01:06:17.120 --> 01:06:18.120] And it took down Reddit.
[01:06:18.120 --> 01:06:20.120] I mean, this can happen.
[01:06:20.120 --> 01:06:26.120] Y2K is kind of a related example of when you mess with something so fundamental.
[01:06:26.120 --> 01:06:30.120] And I'll go into it in a little bit more detail in one second, Jay.
[01:06:30.120 --> 01:06:33.120] So a normal leap second can be problematic.
[01:06:33.120 --> 01:06:36.120] Perhaps it's not as much as problematic as it was.
[01:06:36.120 --> 01:06:40.120] But a negative leap second, if the Earth keeps spinning faster and faster,
[01:06:40.120 --> 01:06:46.120] or if we maintain this average of faster than 24 hours,
[01:06:46.120 --> 01:06:50.120] then we may need to add a negative leap second.
[01:06:50.120 --> 01:06:53.120] And that's much more problematic than a regular leap second,
[01:06:53.120 --> 01:06:55.120] where you're skipping one second.
[01:06:55.120 --> 01:07:01.120] It's tougher to do and more risky than adding a second for various technical reasons.
[01:07:01.120 --> 01:07:02.120] For example...
[01:07:02.120 --> 01:07:03.120] This is going into the future.
[01:07:03.120 --> 01:07:05.120] This really sounds like a time travel episode.
[01:07:05.120 --> 01:07:06.120] Yeah, right?
[01:07:06.120 --> 01:07:09.120] But smartphones, computers, communication systems,
[01:07:09.120 --> 01:07:13.120] they synchronize using something called a network time protocol.
[01:07:13.120 --> 01:07:17.120] And that network time protocol is based on the number of seconds
[01:07:17.120 --> 01:07:20.120] that have transpired since January 1st, 1970.
[01:07:20.120 --> 01:07:24.120] So you throw out a second there and things can go a little wonky.
[01:07:24.120 --> 01:07:25.120] So that's a little concerning.
[01:07:25.120 --> 01:07:28.120] It can cause some issues with these systems.
[01:07:28.120 --> 01:07:30.120] Also, there's GPS satellites.
[01:07:30.120 --> 01:07:33.120] GPS satellites don't account for rotation.
[01:07:33.120 --> 01:07:35.120] They're not really built to deal with rotation.
[01:07:35.120 --> 01:07:37.120] So if the Earth is spinning faster,
[01:07:37.120 --> 01:07:43.120] the GPS satellite will all of a sudden be over a specific area a little earlier
[01:07:43.120 --> 01:07:45.120] than it would have been previously.
[01:07:45.120 --> 01:07:46.120] And that could mean the difference,
[01:07:46.120 --> 01:07:50.120] even if the Earth sped up by a half a millisecond,
[01:07:50.120 --> 01:07:54.120] it could be 10 inches or 26 centimeters off.
[01:07:54.120 --> 01:07:56.120] And that would compound.
[01:07:56.120 --> 01:07:59.120] And eventually the GPS satellites could be essentially useless
[01:07:59.120 --> 01:08:02.120] if we don't do anything, which we probably will.
[01:08:02.120 --> 01:08:05.120] I mean, it's not like, oh my God, GPS is going to be worthless.
[01:08:05.120 --> 01:08:08.120] And when you say do something, you're like, we've got to program around this problem.
[01:08:08.120 --> 01:08:11.120] Yeah, I'm not sure what level of effort would be required,
[01:08:11.120 --> 01:08:13.120] but I'm sure it's not going to be trivial.
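The 10-inch figure can be sanity-checked with a back-of-the-envelope calculation: a point on the equator moves at roughly 465 meters per second, so half a millisecond of unaccounted rotation corresponds to a couple of tenths of a meter on the ground. This ignores how GPS actually solves for position; it is just the rotation arithmetic.

```python
import math

EQUATORIAL_RADIUS_M = 6_378_137   # WGS-84 equatorial radius
SIDEREAL_DAY_S = 86_164.1         # time for one full rotation of the Earth

ground_speed = 2 * math.pi * EQUATORIAL_RADIUS_M / SIDEREAL_DAY_S  # ~465 m/s
error_m = ground_speed * 0.0005   # half a millisecond of timing error
print(f"{ground_speed:.0f} m/s -> {error_m * 100:.0f} cm "
      f"({error_m / 0.0254:.0f} inches)")
# ~465 m/s -> about 23 cm (~9 inches), the same ballpark as the quoted figure
```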
[01:08:13.120 --> 01:08:16.120] So some people say that this is going to be over soon
[01:08:16.120 --> 01:08:23.120] and this increased rotation speed of the Earth isn't going to necessarily stay this way for years.
[01:08:23.120 --> 01:08:28.120] Some people are saying this could be the beginning of a 50-year scenario
[01:08:28.120 --> 01:08:32.120] where the Earth is spinning faster than 24 hours.
[01:08:32.120 --> 01:08:35.120] And we may absolutely need to throw in some of these negative leap seconds,
[01:08:35.120 --> 01:08:37.120] which could cause some problems.
[01:08:37.120 --> 01:08:39.120] So that's the story.
[01:08:39.120 --> 01:08:40.120] It's interesting.
[01:08:40.120 --> 01:08:42.120] I'm not too worried about it.
[01:08:42.120 --> 01:08:45.120] But we'll see if some negative leap seconds get thrown in there,
[01:08:45.120 --> 01:08:53.120] and we might find out by the end of this year or the following year if this keeps up.
[01:08:53.120 --> 01:08:55.120] So, Bob, are you angry about all this?
[01:08:55.120 --> 01:08:56.120] No.
[01:08:56.120 --> 01:09:00.120] It was just interesting research.
[01:09:00.120 --> 01:09:03.120] It was actually tough.
[01:09:03.120 --> 01:09:04.120] I'm answering.
[01:09:04.120 --> 01:09:08.120] It was tough to really get to fully understand all the nuances here,
[01:09:08.120 --> 01:09:11.120] because you've got sidereal day, solar day, mean solar day,
[01:09:11.120 --> 01:09:16.120] all these things, and different websites had different takes on exactly what those mean.
[01:09:16.120 --> 01:09:21.120] And it was interesting to put it all together and understand exactly what was happening.
[01:09:21.120 --> 01:09:22.120] So, yeah, I enjoyed this.
[01:09:22.120 --> 01:09:25.120] A great bar bet that we were talking about when we were talking about this before.
[01:09:25.120 --> 01:09:29.120] So, Andrea, how many times does the Earth rotate on its axis in one year?
[01:09:29.120 --> 01:09:31.120] 365 and a quarter, isn't that it?
[01:09:31.120 --> 01:09:32.120] Wrong.
[01:09:32.120 --> 01:09:33.120] Oh.
[01:09:33.120 --> 01:09:41.120] 366 and a quarter, because in going around the sun, it's got to rotate one extra time.
[01:09:41.120 --> 01:09:46.120] A day, you know, one day is a full rotation plus a degree,
[01:09:46.120 --> 01:09:50.120] a full rotation plus a degree, and it adds up over a year to a whole other rotation.
[01:09:50.120 --> 01:09:51.120] Right.
[01:09:51.120 --> 01:09:55.120] 361 degrees is the mean solar day, 24 hours.
[01:09:55.120 --> 01:09:57.120] A sidereal day is...
[01:09:57.120 --> 01:09:59.120] 23 hours and 56 minutes.
[01:09:59.120 --> 01:10:00.120] Exactly.
[01:10:00.120 --> 01:10:01.120] Wow.
[01:10:01.120 --> 01:10:02.120] 23 hours and 56 minutes.
[01:10:02.120 --> 01:10:03.120] It's four minutes.
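The arithmetic behind that bar bet, as a quick sketch: over a year the Earth spins once more relative to the stars than the number of solar days, so a sidereal day is the solar day scaled by 365.25/366.25, which lands on the 23 hours 56 minutes quoted above.

```python
SOLAR_DAY_H = 24.0
SOLAR_DAYS_PER_YEAR = 365.25
ROTATIONS_PER_YEAR = SOLAR_DAYS_PER_YEAR + 1   # one extra spin relative to the stars

sidereal_day_h = SOLAR_DAY_H * SOLAR_DAYS_PER_YEAR / ROTATIONS_PER_YEAR
hours = int(sidereal_day_h)
minutes = int((sidereal_day_h - hours) * 60)
seconds = ((sidereal_day_h - hours) * 60 - minutes) * 60
print(f"{hours} h {minutes} min {seconds:.0f} s")   # 23 h 56 min 4 s
```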
[01:10:03.120 --> 01:10:05.120] But there's also lots of variations.
[01:10:05.120 --> 01:10:08.120] You're going to leave work early and be like, I'm on a sidereal day.
[01:10:08.120 --> 01:10:10.120] That is such a skeptic thing.
[01:10:10.120 --> 01:10:11.120] Like, wrong.
[01:10:11.120 --> 01:10:12.120] 365.
[01:10:12.120 --> 01:10:13.120] You know what I mean?
[01:10:13.120 --> 01:10:14.120] Come on.
[01:10:14.120 --> 01:10:15.120] Yeah.
[01:10:15.120 --> 01:10:21.120] But also, the other nuances is that the day varies depending on where you are in the orbit
[01:10:21.120 --> 01:10:25.120] and what season it is and the tilt of the Earth.
[01:10:25.120 --> 01:10:29.120] There's so many little factors that go in here to make it extra confusing.
[01:10:29.120 --> 01:10:35.120] So can't we help by having a party somewhere on Earth that will slow the rotation down?
[01:10:35.120 --> 01:10:37.120] There must be some human configuration that we could do.
[01:10:37.120 --> 01:10:39.120] We all go to the North Pole at the same time.
[01:10:39.120 --> 01:10:44.120] We all have to jump at the same time so that we can alleviate the pressure.
[01:10:44.120 --> 01:10:45.120] It would be like Earth.
[01:10:45.120 --> 01:10:46.120] It would be like in an elevator.
[01:10:46.120 --> 01:10:49.120] Andrea, it would be like an 80s movie, like the end of an 80s movie where we all jump.
[01:10:49.120 --> 01:10:50.120] Yeah.
[01:10:50.120 --> 01:10:53.120] And like slow motion and freeze and this is us saving the world.
[01:10:53.120 --> 01:10:57.120] Everyone needs to jump at the same time or we have a negative leap second.
[01:10:57.120 --> 01:10:58.120] You two.
[01:10:58.120 --> 01:10:59.120] Right.
[01:10:59.120 --> 01:11:00.120] All right.
[01:11:00.120 --> 01:11:01.120] All right.
[01:11:01.120 --> 01:11:02.120] Thanks, Bob.
[01:11:02.120 --> 01:11:03.120] I'm glad that's over.
[01:11:03.120 --> 01:11:04.120] That's cool.
Science or Fiction (1:11:11)
Theme: Misinformation
Item #1: Reported trust in the media in 2021 was highest in China at 80%, and lowest in Russia at 29%, with the US in between at 39%.[6]
Item #2: Analysis of social media posts finds that bots are far more likely to spread false information and are responsible for as much as 90% of its spread on the most popular platforms.[7]
Item #3: Research shows that fake news spreads 6 times faster and 10 times farther on Twitter than true news, and that people are 70% more likely to share a false tweet than a truthful one.[8]
Answer | Item |
---|---|
Fiction | Bots spread 90% of disinfo |
Science | Reported trust in the media |
Science | Fake news faster & farther |
Host | Result |
---|---|
Steve | win |
Rogue | Guess |
---|---|
Evan | Reported trust in the media |
Kelly | Reported trust in the media |
Jay | Reported trust in the media |
Bob | Reported trust in the media |
Andrea | Bots spread 90% of disinfo |
Voice-over: It's time for Science or Fiction.
Evan's Response
Kelly's Response
Jay's Response
Bob's Response
Andrea's Response
Audience's Responses
Steve Explains Item #3
Steve Explains Item #2
Steve Explains Item #1
[01:11:04.120 --> 01:11:05.120] All right, guys.
[01:11:05.120 --> 01:11:06.120] You know what time it is.
[01:11:06.120 --> 01:11:07.120] Science or fiction.
[01:11:07.120 --> 01:11:08.120] It's time.
[01:11:08.120 --> 01:11:13.120] It's time for science or fiction.
[01:11:13.120 --> 01:11:22.120] It's time for science or fiction.
[01:11:22.120 --> 01:11:23.120] I have three items here.
[01:11:23.120 --> 01:11:26.120] There is a theme to these items.
[01:11:26.120 --> 01:11:29.120] The theme is misinformation.
[01:11:29.120 --> 01:11:32.120] Pretty obvious theme.
[01:11:32.120 --> 01:11:34.120] These are things you may have heard before.
[01:11:34.120 --> 01:11:40.120] And, you know, the details matter because you know these things in broad brushstroke.
[01:11:40.120 --> 01:11:44.120] You know, one of these things may be wrong because of the details.
[01:11:44.120 --> 01:11:46.120] So I'm going to warn you about that.
[01:11:46.120 --> 01:11:47.120] All right.
[01:11:47.120 --> 01:11:48.120] Let's get going.
[01:11:48.120 --> 01:11:55.120] Item number one, reported trust in the media in 2021 was highest in China at 80 percent
[01:11:55.120 --> 01:12:01.120] and lowest in Russia at 29 percent with the U.S. in between at 39 percent.
[01:12:01.120 --> 01:12:07.120] Number two, analysis of social media posts finds that bots are far more likely to spread
[01:12:07.120 --> 01:12:12.120] false information and are responsible for as much as 90 percent of its spread on the
[01:12:12.120 --> 01:12:14.120] most popular platforms.
[01:12:14.120 --> 01:12:20.120] And, item number three, research shows that fake news spreads six times faster and ten
[01:12:20.120 --> 01:12:26.120] times farther on Twitter than true news and that people are 70 percent more likely to
[01:12:26.120 --> 01:12:28.120] share a false treat... tweet.
[01:12:28.120 --> 01:12:31.120] I like sharing false treats better, myself.
[01:12:31.120 --> 01:12:32.120] Yeah.
[01:12:32.120 --> 01:12:33.120] A false tweet.
[01:12:33.120 --> 01:12:34.120] It's like we're going to put spinach in a brownie.
[01:12:34.120 --> 01:12:35.120] Yeah.
[01:12:35.120 --> 01:12:36.120] Than a truthful one.
[01:12:36.120 --> 01:12:37.120] Yeah.
[01:12:37.120 --> 01:12:38.120] So, spinach flavored candies.
[01:12:38.120 --> 01:12:39.120] Yeah.
[01:12:39.120 --> 01:12:40.120] Okay.
[01:12:40.120 --> 01:12:41.120] That's going to be fun editing, Steve.
[01:12:41.120 --> 01:12:43.120] You know, we're going to order this way.
[01:12:43.120 --> 01:12:45.120] Evan, we're going to start with you.
[01:12:45.120 --> 01:12:46.120] Okay.
[01:12:46.120 --> 01:12:47.120] I guess.
[01:12:47.120 --> 01:12:48.120] Wait, wait, Steve.
[01:12:48.120 --> 01:12:49.120] We have a guest.
[01:12:49.120 --> 01:12:50.120] I know.
[01:12:50.120 --> 01:12:52.120] And the political scientist is going last.
[01:12:52.120 --> 01:12:55.120] I'm ready for this one.
[01:12:55.120 --> 01:12:57.120] Can you ask the audience to vote for which one they want?
[01:12:57.120 --> 01:12:58.120] Yeah.
[01:12:58.120 --> 01:12:59.120] We'll get that at the end.
[01:12:59.120 --> 01:13:00.120] All right.
[01:13:00.120 --> 01:13:01.120] I'm going to take them in reverse order if that's okay.
[01:13:01.120 --> 01:13:02.120] Yeah.
[01:13:02.120 --> 01:13:03.120] Go ahead.
[01:13:03.120 --> 01:13:04.120] Let's see.
[01:13:04.120 --> 01:13:09.120] On Twitter, six times faster and ten times farther and that people are 70 percent more
[01:13:09.120 --> 01:13:11.120] likely to share a false tweet.
[01:13:11.120 --> 01:13:14.120] I think those numbers line up correctly.
[01:13:14.120 --> 01:13:16.120] Twitter has reach.
[01:13:16.120 --> 01:13:21.120] It has arms and it is deep and pervasive.
[01:13:21.120 --> 01:13:25.120] So, I'm not really surprised by those numbers in that regard.
[01:13:25.120 --> 01:13:27.120] It's kind of disappointing though.
[01:13:27.120 --> 01:13:32.120] The second one, social media posts find that these bots are more likely to spread false
[01:13:32.120 --> 01:13:36.120] information and are responsible for as much as 90 percent.
[01:13:36.120 --> 01:13:37.120] Wow.
[01:13:37.120 --> 01:13:39.120] Of its spread on the most popular platforms.
[01:13:39.120 --> 01:13:41.120] Well, there are bots.
[01:13:41.120 --> 01:13:43.120] There's no doubt about that.
[01:13:43.120 --> 01:13:46.120] But to this degree, that even surprises me.
[01:13:46.120 --> 01:13:52.120] But if you think about it, depending on how you program the bot, you could do it in such
[01:13:52.120 --> 01:13:56.120] a way that this would be possible because the thing is it's a runaway train basically.
[01:13:56.120 --> 01:13:58.120] So, yeah, it could go that fast.
[01:13:58.120 --> 01:14:01.120] The first one is the one I'm not really in sync with here.
[01:14:01.120 --> 01:14:08.120] And I think, part of this one, my gut is telling me it's the U.S. percent here, 39 percent.
[01:14:08.120 --> 01:14:12.120] I actually think, if my memory serves, it's lower than that.
[01:14:12.120 --> 01:14:15.120] So, these numbers I think are out of sync.
[01:14:15.120 --> 01:14:21.120] Perhaps it's the U.S. 29, Russia 39, China 80, something like that or they're all just
[01:14:21.120 --> 01:14:22.120] entirely wrong.
[01:14:22.120 --> 01:14:23.120] I think that one's the fiction.
[01:16:23.120 --> 01:16:24.120] Okay, Kelly.
[01:16:24.120 --> 01:16:26.120] It's your first science or fiction on the SGU.
[01:14:26.120 --> 01:14:27.120] Don't blow it.
[01:16:27.120 --> 01:16:30.120] I'm just glad I got out of the guest-going-first thing.
[01:14:30.120 --> 01:14:34.120] So, I'm going to go reverse order too because I feel pretty good about number three because
[01:14:34.120 --> 01:14:40.120] misinformation is designed to be more interesting and more shareable, so six times faster seems
[01:14:40.120 --> 01:14:43.120] reasonable, if not low.
[01:14:43.120 --> 01:14:45.120] Ten times farther makes sense.
[01:14:45.120 --> 01:14:47.120] I see no problem with that.
[01:14:47.120 --> 01:14:55.120] Number two, 90 percent sounds kind of high, but if the bots are just sharing misinformation
[01:14:55.120 --> 01:15:00.120] rather than creating it, I could see that happening because sharing is a lot easier
[01:15:00.120 --> 01:15:01.120] on social media.
[01:15:01.120 --> 01:15:03.120] So, I guess that leaves me with number one.
[01:15:03.120 --> 01:15:09.120] I'm not quite sure why I think number one is the fiction, but process of elimination.
[01:15:09.120 --> 01:15:10.120] Okay, Jay.
[01:15:10.120 --> 01:15:15.120] I'm going to start with number two because I think that one is the most likely to be
[01:15:15.120 --> 01:15:17.120] science.
[01:15:17.120 --> 01:15:20.120] Social media bots are absolutely real.
[01:15:20.120 --> 01:15:25.120] They absolutely are spreading misinformation and I'm not surprised to hear that they're
[01:15:25.120 --> 01:15:29.120] spreading up to 90 percent of that misinformation.
[01:15:29.120 --> 01:15:33.120] So, that one seems pretty obvious to me.
[01:15:33.120 --> 01:15:39.120] Going to number three about fake news spreads six times faster and ten times farther, I
[01:15:39.120 --> 01:15:41.120] mean that sounds legitimate as well.
[01:15:41.120 --> 01:15:44.120] That tracks with all the information that I have in my head.
[01:15:44.120 --> 01:15:49.120] The first one I think is the fiction and I'll give you a couple of reasons why.
[01:15:49.120 --> 01:15:54.120] One, I think that for some reason that 80 percent number in China, it seems a little
[01:15:54.120 --> 01:15:59.120] high to me, but the one that really has got me here is that Russia is as low as 29 percent,
[01:15:59.120 --> 01:16:05.120] which from my understanding, especially because of the Ukraine war, that there is quite a
[01:16:05.120 --> 01:16:11.120] bit of belief in the media coming out of Russia by the citizens there and I think it's much
[01:16:11.120 --> 01:16:13.120] higher than 29 percent.
[01:16:13.120 --> 01:16:14.120] Okay, Bob.
[01:16:14.120 --> 01:16:15.120] Wow.
[01:16:15.120 --> 01:16:20.120] When I first read that first one, 80 percent, 29 and 39, it seemed fairly spot on to me
[01:16:20.120 --> 01:16:23.120] and now I'm questioning myself based on what everyone's saying here.
[01:16:23.120 --> 01:16:26.120] The second one seems reasonable to me.
[01:16:26.120 --> 01:16:31.120] 90 percent at first blush seemed pretty high, but I mean that's what bots do.
[01:16:31.120 --> 01:16:36.120] I mean it doesn't take much for them to do that and Kelly was saying that they're not
[01:16:36.120 --> 01:16:37.120] creating it.
[01:16:37.120 --> 01:16:40.120] They're just kind of spreading it, which is very easy for them to do.
[01:16:40.120 --> 01:16:41.120] It's kind of what they're designed to do.
[01:16:41.120 --> 01:16:44.120] And three makes a lot of sense to me here as well.
[01:16:44.120 --> 01:16:48.120] Again, as Kelly said that these are designed to be spread.
[01:16:48.120 --> 01:16:52.120] This misinformation is designed to be enticing and clickable and spreadable.
[01:16:52.120 --> 01:16:55.120] So that makes perfect sense.
[01:16:55.120 --> 01:17:01.120] So yeah, there's just a lot of opportunity for this first one here, which is trust in
[01:17:01.120 --> 01:17:02.120] media.
[01:17:02.120 --> 01:17:04.120] There's a lot of potential for one of these to be off.
[01:17:04.120 --> 01:17:07.120] So I'm going to agree with everyone and say that that's fiction.
[01:17:07.120 --> 01:17:08.120] Okay.
[01:17:08.120 --> 01:17:09.120] And Andrea.
[01:17:09.120 --> 01:17:10.120] Correct us all, Andrea.
[01:17:10.120 --> 01:17:11.120] All right.
[01:17:11.120 --> 01:17:13.120] Well, I'm going to put – this is really going to be unfortunate when I get this wrong
[01:17:13.120 --> 01:17:19.120] because I studied trust in media for some time and those numbers actually seem okay
[01:17:19.120 --> 01:17:20.120] to me.
[01:17:20.120 --> 01:17:21.120] Can I change mine?
[01:17:21.120 --> 01:17:23.120] I'm really doubting it because it's been a while since I looked.
[01:17:23.120 --> 01:17:25.120] So I don't have 2021 numbers.
[01:17:25.120 --> 01:17:29.120] And just like you said and just like we see with contagion and all of that, the more everyone
[01:17:29.120 --> 01:17:31.120] else doubts one, I'm like, maybe, I don't know.
[01:17:31.120 --> 01:17:35.120] But I think Russia seems a little low.
[01:17:35.120 --> 01:17:36.120] China is always sky high.
[01:17:36.120 --> 01:17:42.120] And I like it because it's a good example of surveys as blunt instruments and getting
[01:17:42.120 --> 01:17:45.120] someone to report they have trust in media versus whether they have trust in media.
[01:17:45.120 --> 01:17:46.120] There's a lot more nuance to it than that.
[01:17:46.120 --> 01:17:49.120] But one of the findings is that it comes out that way even though – anyway.
[01:17:49.120 --> 01:17:51.120] So I'm going to say one is science.
[01:17:51.120 --> 01:17:54.120] Two is the one that I'm hung up on.
[01:17:54.120 --> 01:17:58.120] And I think I've fallen into this trap before and this is one of the reasons I'm bad at
[01:17:58.120 --> 01:17:59.120] science or fiction.
[01:17:59.120 --> 01:18:00.120] But I think it's so vague.
[01:18:00.120 --> 01:18:04.120] The 90% feels like a lot but it could be, yeah, they're all just talking to each other
[01:18:04.120 --> 01:18:07.120] and we just don't see it or we're not following it.
[01:18:07.120 --> 01:18:11.120] But I feel like it's the most popular platforms.
[01:18:11.120 --> 01:18:14.120] We're not great at necessarily identifying bots.
[01:18:14.120 --> 01:18:18.120] I just feel like there's a lot of detail but it's not as crystal clear as I would like
[01:18:18.120 --> 01:18:19.120] to believe that it's science.
[01:18:19.120 --> 01:18:24.120] And then number three, I was pretty convinced was science and then Kelly's social media
[01:18:24.120 --> 01:18:26.120] expert take fully convinced me that it was science.
[01:18:26.120 --> 01:18:27.120] Yeah.
[01:18:27.120 --> 01:18:28.120] So I'm going to say one is fiction.
[01:18:28.120 --> 01:18:29.120] I'm sorry.
[01:18:29.120 --> 01:18:30.120] Two is fiction.
[01:18:30.120 --> 01:18:31.120] Two is fiction.
[01:18:31.120 --> 01:18:32.120] Andrea is departing from the crowd.
[01:18:32.120 --> 01:18:35.120] Ian, do we have votes from the listeners, the audience out there?
[01:18:35.120 --> 01:18:36.120] We sure do.
[01:18:36.120 --> 01:18:42.120] So with 28 votes, number one is winning 68%.
[01:18:42.120 --> 01:18:45.120] Number two has 14%, although it's still kind of moving.
[01:18:45.120 --> 01:18:48.120] And number three is 19%.
[01:18:48.120 --> 01:18:49.120] Okay.
[01:18:49.120 --> 01:18:52.120] So pretty close to the panel.
[01:18:52.120 --> 01:18:56.120] So I guess I'll take this in reverse order since everyone agrees on the third one.
[01:18:56.120 --> 01:19:01.120] Research shows that fake news spreads six times faster and ten times farther on Twitter
[01:19:01.120 --> 01:19:10.120] than true news and that people are 70% more likely to share a false tweet than a truthful one.
[01:19:10.120 --> 01:19:15.120] So everyone on the panel thinks this one is science, most of the audience thinks this
[01:19:15.120 --> 01:19:18.120] one is science, and this one is science.
[01:19:18.120 --> 01:19:19.120] This one is science.
[01:19:19.120 --> 01:19:21.120] I get so stressed out for this.
[01:19:21.120 --> 01:19:25.120] We actually talked about this study I think on the show, whatever it was, a year or two
[01:19:25.120 --> 01:19:26.120] ago when it came out.
[01:19:26.120 --> 01:19:34.120] And yeah, there was a massive review of tweets over years, millions of tweets, and they
[01:19:34.120 --> 01:19:39.120] evaluated tweets that were objectively true or objectively false. The ones that were
[01:19:39.120 --> 01:19:47.120] false spread much faster; they would average like 10,000 retweets, whereas
[01:19:47.120 --> 01:19:53.120] the true ones would rarely get above 1,000, so they went basically ten times deeper into
[01:19:53.120 --> 01:20:00.120] Twitter, and people were 70% more likely to share a false tweet than a truthful one.
[01:20:00.120 --> 01:20:07.120] As Kelly said, if you are unfettered from reality and facts and truthfulness, then you
[01:20:07.120 --> 01:20:12.120] can formulate your tweet to be more shareable and interesting.
[01:20:12.120 --> 01:20:18.120] And also, other research shows that the factor that probably predicts whether a tweet will
[01:20:18.120 --> 01:20:23.120] be shared is that it's novel, it's something new, people haven't heard before.
[01:20:23.120 --> 01:20:28.120] And again, that's more likely to be true if the tweet itself is fake, right?
[01:20:28.120 --> 01:20:32.120] You can make, you can craft a novel tweet.
[01:20:32.120 --> 01:20:34.120] This chorizo is actually a sun.
[01:20:34.120 --> 01:20:38.120] You can craft a novel tweet if it doesn't have to be true.
[01:20:38.120 --> 01:20:41.120] I guess we'll keep going, we'll take this in reverse order.
[01:20:41.120 --> 01:20:46.120] Analysis, number two, analysis of social media posts finds that bots are far more likely
[01:20:46.120 --> 01:20:51.120] to spread false information and are responsible for as much as 90% of its spread on the most
[01:20:51.120 --> 01:20:52.120] popular platforms.
[01:20:52.120 --> 01:20:58.120] Andrea, you are pretty much out there on your own thinking that this one is the fiction.
[01:20:58.120 --> 01:21:00.120] Everyone else thinks this one is science.
[01:21:00.120 --> 01:21:02.120] The majority of the audience thinks this one is science.
[01:21:02.120 --> 01:21:05.120] And this one is the fiction.
[01:21:05.120 --> 01:21:11.120] My whole field was on the line.
[01:21:11.120 --> 01:21:14.120] There's also a lot of people in the chat being like, we changed our vote based on what you
[01:21:14.120 --> 01:21:16.120] said.
[01:21:16.120 --> 01:21:18.120] Listen to the experts.
[01:21:18.120 --> 01:21:23.120] This was the gotcha one, because everyone thinks that bots are out there menacing the
[01:21:23.120 --> 01:21:25.120] social media and doing all this horrible stuff.
[01:21:25.120 --> 01:21:29.120] People are far more likely to spread false tweets than bots.
[01:21:29.120 --> 01:21:35.120] In fact, the same study as number three looked at this, where bots are actually as likely
[01:21:35.120 --> 01:21:37.120] to spread a true tweet as a false one.
[01:21:37.120 --> 01:21:39.120] There's no discrimination.
[01:21:39.120 --> 01:21:47.120] They basically amplify tweets by automatically spreading them, but they're not good at discriminating.
[01:21:47.120 --> 01:21:53.120] Also, they're not good at retweeting, because they don't really know what are the good tweets.
[01:21:53.120 --> 01:21:57.120] Basically, the AI is just not good enough to be as good as people, whereas people are
[01:21:57.120 --> 01:22:03.120] much more effective spreaders of tweets and are also way more discriminatory towards false
[01:22:03.120 --> 01:22:05.120] tweets.
[01:22:05.120 --> 01:22:11.120] I made it platform nonspecific, just the popular ones, but the bots are, again, not to say
[01:22:11.120 --> 01:22:15.120] they're not a problem, not to say they're not making the problem worse, but actually
[01:22:15.120 --> 01:22:20.120] people are far more of a problem than bots, because they actually are more likely to spread
[01:22:20.120 --> 01:22:25.120] the false bits of news, whereas bots are really not good at discriminating.
[01:22:25.120 --> 01:22:27.120] I've got to follow more bots, then.
[01:22:27.120 --> 01:22:28.120] Yeah.
[01:22:28.120 --> 01:22:29.120] Yeah.
[01:22:29.120 --> 01:22:30.120] Okay.
[01:22:30.120 --> 01:22:35.120] All this means that reported trust in the media in 2021, Jay, before the war in Ukraine,
[01:22:35.120 --> 01:22:40.120] was highest in China, 80%, and lowest in Russia, 29%, with the U.S. in between, 39%.
[01:22:40.120 --> 01:22:42.120] This is science, as Andrea said.
[01:22:42.120 --> 01:22:48.120] I was very careful to put reported trust in the media, because that's only what we know,
[01:22:48.120 --> 01:22:52.120] and that 80% in China, there's a lot of reason to think that that's what people are saying,
[01:22:52.120 --> 01:22:55.120] but not necessarily what they believe.
[01:22:55.120 --> 01:22:59.120] What was interesting to me here, because I immediately saw that, of course they're saying
[01:22:59.120 --> 01:23:03.120] they trust the media, because they don't feel safe saying that they don't trust the media, they
[01:23:03.120 --> 01:23:08.120] don't have the security to do that, but then the disconnect between that
[01:23:08.120 --> 01:23:12.120] and Russia being literally at the bottom, wouldn't it be the same in Russia?
[01:23:12.120 --> 01:23:17.120] Despite the fact that these are both authoritarian regimes, there's something fundamentally different
[01:23:17.120 --> 01:23:21.120] about the experience of people in China and in Russia.
[01:23:21.120 --> 01:23:27.120] This is in 2021, maybe it's different now because of the war in Ukraine, and so it's
[01:23:27.120 --> 01:23:30.120] a good thought, Jay, to look at that detail.
[01:23:30.120 --> 01:23:33.120] Could it be Russia's access to the Internet?
[01:23:33.120 --> 01:23:37.120] Because we know China's locked down much more on the Internet than Russia is, I think.
[01:23:37.120 --> 01:23:39.120] Yeah, I don't know, maybe that's the difference.
[01:23:39.120 --> 01:23:41.120] I don't know how much different that is between China and Russia.
[01:23:41.120 --> 01:23:42.120] I was so sure I was right.
[01:23:42.120 --> 01:23:45.120] It's really unbelievable how you just don't know anything.
[01:23:45.120 --> 01:23:50.120] This is a little bit outdated and certainly self-serving, but so my dissertation was on
[01:23:50.120 --> 01:23:53.120] censorship in China, and I mostly looked at what they covered in their national news,
[01:23:53.120 --> 01:23:58.120] but I compared it to Russia, Venezuela, France, and the U.S. for a couple of big events.
[01:23:58.120 --> 01:24:02.120] And Russia, one of the things, I mean, these are my ideas, this isn't necessarily founded
[01:24:02.120 --> 01:24:05.120] in science, but I'm trying my best, but exactly right.
[01:24:05.120 --> 01:24:08.120] It's much more interaction with the outside world.
[01:24:08.120 --> 01:24:12.120] Basically, my whole argument is you can't censor that much if there is information flowing
[01:24:12.120 --> 01:24:13.120] from elsewhere.
[01:24:13.120 --> 01:24:16.120] And so Russia can't get away with that kind of thing, and therefore I'm not surprised
[01:24:16.120 --> 01:24:18.120] to see the lower numbers.
[01:24:18.120 --> 01:24:20.120] Why didn't you say that?
[01:24:20.120 --> 01:24:23.120] Why didn't you say that before?
[01:24:23.120 --> 01:24:25.120] The most important thing is…
[01:24:25.120 --> 01:24:28.120] I wasn't sure if the actual numbers were right, but China being higher than Russia
[01:24:28.120 --> 01:24:30.120] is not surprising.
[01:24:30.120 --> 01:24:31.120] You know what I thought?
[01:24:31.120 --> 01:24:33.120] I really did think number two was wrong, Steve.
[01:24:33.120 --> 01:24:35.120] I really thought that that was wrong.
[01:24:35.120 --> 01:24:36.120] Yeah, I thought that one was a fiction.
[01:24:36.120 --> 01:24:38.120] I really did mostly that.
[01:24:38.120 --> 01:24:42.120] I went with number one because I thought I was helping out Andrea.
[01:24:42.120 --> 01:24:46.120] That's very nice of you, Jay.
[01:24:46.120 --> 01:24:50.120] The important thing to remember here is that my decision to have Andrea go last was the
[01:24:50.120 --> 01:24:51.120] right one.
[01:24:51.120 --> 01:24:53.120] It was absolutely spot on.
[01:24:53.120 --> 01:24:54.120] Yeah, you didn't get it now.
[01:24:54.120 --> 01:24:55.120] I understand now, Steve.
[01:24:55.120 --> 01:24:56.120] Yeah, okay.
[01:24:56.120 --> 01:24:57.120] All right.
Skeptical Quote of the Week (1:24:57)
An educated person is one who has learned that information almost always turns out to be at best incomplete and very often false, misleading, fictitious, mendacious – just dead wrong.
– Russell Baker (1925-2019), American journalist and Pulitzer Prize-winning writer, host of PBS' Masterpiece Theater
[01:24:57.120 --> 01:24:59.120] Evan, take us out with a quote.
[01:24:59.120 --> 01:25:04.120] An educated person is one who has learned that information almost always turns out to
[01:25:04.120 --> 01:25:12.120] be, at best, incomplete, and very often false, misleading, fictitious, mendacious, just dead
[01:25:12.120 --> 01:25:13.120] wrong.
[01:25:13.120 --> 01:25:19.120] Russell Baker, American Pulitzer Prize winning writer, host of PBS's Masterpiece Theatre.
[01:25:19.120 --> 01:25:21.120] Very nice and very appropriate.
[01:25:21.120 --> 01:25:22.120] Very appropriate.
[01:25:22.120 --> 01:25:23.120] To the conference.
[01:25:23.120 --> 01:25:24.120] Absolutely.
Signoff/Announcements
[01:25:24.120 --> 01:25:25.120] Well, this was a ton of fun.
[01:25:25.120 --> 01:25:26.120] Thank you all for joining me.
[01:25:26.120 --> 01:25:27.120] Andrea, welcome back.
[01:25:27.120 --> 01:25:28.120] Thank you.
[01:25:28.120 --> 01:25:29.120] Always great to have you on the show.
[01:25:29.120 --> 01:25:30.120] Andrea.
[01:25:30.120 --> 01:25:31.120] Kelly, welcome to your first SGU.
[01:25:31.120 --> 01:25:32.120] Well done.
[01:25:32.120 --> 01:25:33.120] You did great.
[01:25:33.120 --> 01:25:34.120] Really good job.
[01:25:34.120 --> 01:25:35.120] Thanks for talking about Alex Jones with us.
[01:25:35.120 --> 01:25:37.120] And remember, this is coming out soon.
[01:25:37.120 --> 01:25:38.120] The Skeptics' Guide to the Future.
[01:25:38.120 --> 01:25:39.120] You can pre-order right now.
[01:25:39.120 --> 01:25:40.120] Pre-order right now.
[01:25:40.120 --> 01:25:43.620] Seriously, it does help us a lot if you pre-order.
[01:25:43.620 --> 01:25:45.960] That will help promote the book tremendously.
[01:25:45.960 --> 01:25:46.960] So please check it out.
[01:25:46.960 --> 01:25:48.200] We really would appreciate it.
[01:25:48.200 --> 01:25:52.120] Our first book is also still for sale, The Skeptics' Guide to the Universe.
[01:25:52.120 --> 01:25:53.120] Nice pair.
[01:25:53.120 --> 01:25:55.560] If you're new to the show and you didn't realize we have a book out there, we do.
[01:25:55.560 --> 01:25:56.560] Just check it out.
[01:25:56.560 --> 01:25:59.480] It basically goes over all the stuff that we talk about on the show, all the critical
[01:25:59.480 --> 01:26:01.120] thinking skills and such.
[01:26:01.120 --> 01:26:05.120] And again, thanks to George for hosting NECSS for us.
[01:26:05.120 --> 01:26:08.980] And also for his kind introduction, and thanks to everybody working behind the scenes.
S: —and until next week, this is your Skeptics' Guide to the Universe.
S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.
GH: (in a Russian accent) I think is basically—I don't trust, I don't trust the media because I am inebriated. And feel free to talk about— (Rogues laugh.)
Today I Learned
- Fact/Description, possibly with an article reference[9]
- Fact/Description
- Fact/Description
Notes
References
- ↑ Neurologica: NIH To Fund Scientific Rigor Initiative
- ↑ Ars Technica: Why space debris keeps falling out of the sky—and will continue to do so
- ↑ Today: 'Green needle' or 'brainstorm'? Hear the latest audio clip dividing the internet
- ↑ Reuters: Jury awards $45.2 million in punitive damages in Alex Jones Sandy Hook trial
- ↑ Forbes: Earth Is Suddenly Spinning Faster. Why Our Planet Just Recorded Its Shortest Day Since Records Began
- ↑ Statista.com: Level of trust in media in selected countries worldwide as of November 2021
- ↑ Science: The spread of true and false news online
- ↑ Science: Fake news spreads faster than true news on Twitter—thanks to people, not bots
- ↑ [url_for_TIL publication: title]
Vocabulary