== News Items ==

{{anchor|news#}} <!-- leave this news item anchor directly above the news item section that follows -->
=== Scientific Rigor <small>(13:25)</small> ===
* [https://theness.com/neurologicablog/index.php/nih-to-fund-scientific-rigor-initiative/ NIH To Fund Scientific Rigor Initiative]<ref>[https://theness.com/neurologicablog/index.php/nih-to-fund-scientific-rigor-initiative/ Neurologica: NIH To Fund Scientific Rigor Initiative]</ref>


'''S:''' That picture of a giant complex of buildings that I'm showing is the NIH, the National Institutes of Health. They are essentially the main biomedical research funding institution in the United States. They are a creature of Congress, as we like to say. They are created, funded by Congress. Essentially, if you do biomedical research in the U.S., you get your funding from the NIH, more likely than not. They're massively important for medical research. Recently, the NIH created an initiative. It's not a new office or anything. It's just an initiative. They're funding specific groups who are going to create an educational module to teach researchers how to do rigorous science. That sounds pretty good. That sounds pretty good to me.


'''J:''' That doesn't already exist, though?


'''AJR:''' That's my thought.


'''S:''' That's a good question. Right now, how do we teach researchers how to do good research methodology? Some universities may have courses on it. They may be required. They may be elective. They might be a statistics course or a research methodology course. You do get that, but not like, all right, here's how you do really rigorous research. Here's how you avoid p-hacking or how you avoid false positives, etc., etc. Clearly, that is needed for reasons that I've been talking about and writing about for the last 20 years. The other way that people learn that is through, essentially, individual mentorship. You work in somebody's lab, and they teach you how to do research, not only in their specific area, technically, but also just, this is what good science is. But it's not systematic, and it's not thorough enough. Clearly, there's a perception that there is a gap, a gap there. They want to fill that gap. Their goal is to fund the creation of this module to teach rigorous research design and to then make it freely available, basically. And then the hope is, so universities may require it. They might say, all right, if you're going to work at our university, this already happens. I work at Yale, and I have to do 20 different certifications every year on everything, like sexual harassment sensitivity or how not to burn your eyes out or whatever, all of these things.


'''E:''' That's a good one.


'''S:''' How to treat patients ethically, all good stuff. A lot of safety things all in there. But just adding one that's, here's how not to do fake research. Here's how not to accidentally commit research fraud. Or how to p-hack or whatever. It would be very easy to slip that into the existing system of getting certified for quality control. That's basically what this is. Now, the NIH, of course, they could require, if you apply to the NIH for a research grant, and they're not saying they're going to do this, but imagine if they said, all right, in order to get this grant, you've got to have certification that you took this module and you passed. Because again, they're interested in not wasting money. That's their primary interest. Obviously, they want to do good science. That's their goal. Their mission is to obviously do good science, but they have a finite budget, and they want to make the most use out of that money. That, again, is their mission. One of the biggest wastes in research is bad science. If you publish a study, and it's a false positive, let's say, you think that you have a result, but you did poor methodology, you p-hacked or whatever. You underpowered the study. Or the blinding was inadequate. Or your statistics were off, or whatever. And then other people try to replicate that study, how many millions of dollars could be spent proving that your crappy study was crappy when you could have filtered it out at the beginning by putting in some internal controls that you didn't know you should do? Or by tightening up your research methodology. The other goal here, other than not only doing good science, is to save money by weeding out the inefficiency in the system of fraud. It makes sense, not fraud, but just bad rigor in research design. It makes sense that once these modules are up and running, phase two would be, and you've got to be certified in this before we'll give you any money. So that's one way that you, and again, the NIH already does this for other things, for example, they now require, this has been going on for about 10 or 15 years or so, if you get public money to do your research, you have to make the results of your research available to the public and accessible by the public. You have to say, how are you going to explain your results to the people who are paying for your research, the public. So this would be another way, how can you assure the people who are funding your research that you're not wasting their money by doing rigorous research design? And by the way, here is an educational module, and we could easily connect certification to that. That's awesome. I would like to see big science journals do the same thing. You want to get published in our journal, we require that you have the author, the lead author, or every author has certification. And of course, once either of those happens, like if the NIH says you need to have certification to get grant money, you better believe every university will make sure that it happens. They're not going to have any of their people not be able to get NIH grants. So it's very easy to make this systematic. So again, we're right at the very beginning of this, and everything I'm hearing and seeing is very, very good. We'll keep a close eye on it. And again, a lot of people react like you, Jay. It's really, why isn't this kind of already happening? But that's because I think the main reason is, I would say there's two things. One is people think it is happening, but it's just not happening enough. 
The second one is that the science of doing rigorous science has been getting better. We're learning more and more subtle ways in which studies go awry or that results can be tweaked or researchers can put their thumb on the scale. We talk about researcher degrees of freedom and researcher bias and publication bias and citation bias and all these things that can alter the utility and the rigor and the quality of science and essentially the old method of just relying upon some just here's some classic statistics class. And then whoever's lab you work in, they'll teach you how to do good science. It's just not good enough anymore. It's got to be systematic, and everyone's got to go through it in order to absolutely minimize the waste in the system that comes from poor research design. So this is a massive move in the right direction. This is very, very encouraging.


'''J:''' Steve, where did you learn how to do it?


'''S:''' For me, well, it's been the whole science-based medicine initiative, which is I've been reading about it, following, reading the literature on it for 20 years and writing about it, trying to digest it. That's basically what we explore at science-based medicine is how to do rigorous science. The relationship between science and practice. How do we know what's true, what's not true? Where's the threshold of evidence before something should affect your practice? That's what we do. That's how I learned it. It was all basically just self-taught by reading the literature, talking to my colleagues, writing about it, engaging about it. But most researchers are not spending most of their time, their academic time, doing that. They're doing their research. They're trying to figure out what receptor is causing this disease or whatever. This is sort of part of that, but it's not their focus. That's why it needs to be done systematically. This is also one final word and then we'll move on. Part of a bigger trend that I've noticed, at least in medicine. Andrea, you can tell me if you think it's true in your field as well, that you're going away from the model of just counting on mentorship and counting on that people will learn what they need to learn and moving towards things that are way more systematic, that are verified, and also that there are checks in place rather than just trying to raise the quality by just over-educating people. You just have checks in place to make sure that they do it. Medicine is getting too complicated. Science is getting too complicated to rely upon methods that are not absolutely systematic. Is that something you find in academia from your end?


'''AJR:''' Definitely. I'm thinking about something that I think Jay brought up on a different live a while ago about the movement towards pre-registering your hypotheses. That's another way of just putting the system in place because it turns out we can't rely on everyone to do great science even though we all like to think that we're doing it. Where I thought you were going, Steve, with that was we can't rely exclusively. Well, we still rely on it a lot, but peer review. Peer review is not a perfect process. It's a strong process in a lot of ways and I don't have great ideas about what to do instead, but it's not like it's perfect. A lot of stuff gets through peer review, and so this is something that could help steer people. The only question I'm having, though, is how you could imagine a world where they're sort of methodologically specific. I'm thinking of machine learning where you have issues with overfitting your model. That would be totally irrelevant to someone running an experiment. I don't know what the future would look like. Ten years from now, are there different modules? Do we need different modules?


'''S:''' This is what exists currently in medicine. If I'm doing some quality control certification thing that I do every year, there's the first part of it, which is for everyone or maybe every physician, and then you say what your specialty is. I'm a neurologist. Then you get the neurology-specific stuff. You could do the same thing. Here's the generic rigors that everyone needs to know, and then what are you doing research in? Particle physics? Here's the particle physics part of the module for you for those specific issues. I could absolutely see that working that way.


'''AJR:''' I kind of like the idea of making a bunch of social scientists do the particle physics, just to keep us humble.


'''S:''' Absolutely.


=== More Space Debris <small>(23:51)</small> ===
* [https://arstechnica.com/science/2022/08/why-space-debris-keeps-falling-out-of-the-sky-and-will-continue-to-do-so/ Why space debris keeps falling out of the sky—and will continue to do so]<ref>[https://arstechnica.com/science/2022/08/why-space-debris-keeps-falling-out-of-the-sky-and-will-continue-to-do-so/ Ars Technica: Why space debris keeps falling out of the sky—and will continue to do so]</ref>


Jay, tell us about crap falling from the sky.

Steve, there's crap, and it's falling from the goddamn sky.

Oh, my goodness.

This is about the fact that space agencies around the world are not doing a very good job of figuring out how to exactly de-orbit pieces of spacecraft that are left up there for one reason or another. There is a significant number of objects in low Earth orbit. NASA tracks anything from 2 inches or 5 centimeters and up, and there's 27,000 objects that are being tracked, and 70% of the tracked objects are in LEO, low Earth orbit, which is the orbit that's basically as close to the Earth as you could pretty much get.

Do they say LEO?

I've only ever heard LEO.

I just thought you meant something astrology, Jay, and I was like, I can't believe this is happening. I've got to go.

I'm blazing trails here. It's low Earth orbit. Every one of these objects that are up there and that are going to be up there for a long time are hazards. They're dangerous. They actually have to plan accordingly. When anybody launches anything into outer space, they have to figure out the right time to do it and how to avoid these known objects, because one of them could be traveling at such an incredible speed in relation to the ship that you're putting up there that it could destroy it. It could rip right through it. So this is a growing issue, and we have another issue that is a problem, is that there are objects that are being left in low Earth orbit that are big, that are slowly de-orbiting over time, because there's a tiny, tiny, tiny, tiny bit of atmosphere in low Earth orbit, and that's just enough to slowly take something out of orbit and bring it back down to Earth. As an example, China had one of their Long March 5B rockets bring something up, and a week later, when it came out of orbit, because it was only up for a week, and by that time there was enough inertia and everything to get it back down into the atmosphere, pieces of it landed in Malaysia and Indonesia, and it landed right near a village where people were living. It is a real threat, and we're not talking about millions of people getting hurt, but it could kill people. It could kill handfuls of people now and again, which is something that we definitely want to avoid. It's also just not good practice. It's not keeping your shop clean.

So getting back to the Long March 5B rocket, now this rocket is huge. China launched it on July 24th, and they were bringing up a new space station module to their Tiangong space station, which is a China-only space station. It's actually pretty cool, they should read up on it. Now this rocket is not designed to de-orbit itself. They don't send it up with the ability to do that, and in fact, the engines can't even restart after the engines are shut off. When it does its main push and gets all that weight up to the altitude that they need it to, and those engines shut off, they can't go back on. This ultimately means that there's no way for China to control the de-orbiting of this massive rocket. It's just going to fly back into the Earth's atmosphere, and I'm not even sure that they know where it's going to end up going. I don't even know if there's good physics that will really accurately predict where something willy-nilly is de-orbiting at some point and coming back into the atmosphere. It could end up anywhere, which is the scary part. Believe me, I feel completely happy and thrilled and lucky that we're alive during a time when space exploration is starting to explode again. It's a great time.

Hopefully explode.

Yeah, you're right. When all of these nations are launching new projects, how's that? Is that better?

Better.

What we don't have right now are proper rules of etiquette. There are things that people would like. NASA is making it known what information that they would like, but in this instance, China didn't share any of the information about what trajectory their rocket was on and where they think it'll end up coming back into the atmosphere. The NASA administrator, the name of Bill Nelson, he said, and I'm quoting him, "All spacefaring nations should follow established best practices and do their part to share this type of information in advance to allow reliable predictions of potential debris impact risk, especially for heavy-lift vehicles like the Long March 5B, which carry a significant risk of loss of life and property. Doing so is critical to the responsible use of space and to ensure the safety of people here on Earth." I wish that I could have found some information on what would have happened if one of these pieces of larger debris ended up barreling into a city. Could it take a part of a building out? What's its velocity? How much mass does it have? I do know that SpaceX had a module, a piece of debris come back down as recently as July 9th. Now, if you look at a picture of the Crew-1 module, there is a component that's right underneath it that is used to relay electricity to the module and all that, but it's also a cargo hold, right? A cargo hold that's not pressurized. This thing is about 3 meters long and it weighs 4 metric tons. That's an incredibly heavy object that hit the Earth at one point. It came back down on July 9th and it took a year for it to deorbit. So that's just another thing that needs to be tracked. It could take time for them to come back down and then we have to try to figure out where they're going to go. But okay, let's say we know where it's going to go. So what? What if it's going to hit a major city somewhere? What are we going to do about it? The answer is there's nothing. There's nothing we can do about it. We're going to shoot rockets up to take out rockets that are coming. The whole thing is crazy. So what we need to do is we need to have this rules of etiquette where space agencies start to send up more fuel, have rocket engines that can deorbit themselves and not only have one turn-on cycle. These pretty costly and probably very expensive engineering feats that need to become a part of all of these projects. And that's what NASA wants. But right now...

Just to make sure that the point is crystal clear, it's to control the deorbit so that we know where it comes down. We dump it in the middle of the Pacific so it doesn't hit Australia or whatever.

Exactly, yeah. So right now there's a couple of companies that are starting to, or space agencies that are starting to comply and build in this functionality into the new rockets that they're building. But let's face it, it's not a global thing. A lot of people aren't doing that. Some good things that we have are like SpaceX, which is leading the pack on this whole idea of reusability. That's fantastic. You want to reuse your rockets. You want your retro rockets to land themselves. You see it all the time. That's great. More reusability that we build into things means more control, more ability to bring things down safely, which is exactly what everybody needs to be doing. One, we don't want to pollute low Earth orbit any worse than it is. If anything, we want to get that stuff out of there, which no one has come up with a feasible economic way to do it yet. But I imagine at some point in the next 50 years, someone will come up with something that's making that move. But in the meantime, our goals are no more debris and absolutely no more craziness of things falling out of the sky without any predictability on where they're going to go or drivability, meaning we want them to go to a specific place. So what do you think about that, Steve?

Well, it wasn't too long ago. It was just a Science or Fiction item where an estimate was that in the next decade, there's actually something like a 10% chance of somebody getting hit by space debris.

Oh, yeah. We all thought it was fiction.

Yeah, it's getting pretty significant now just because of the sheer volume of stuff that we're putting up there. So, yeah, it's, again, one of those things that we have to take a systematic approach to it rather than relying on individuals to all do the right thing.

How would we figure that out, Steve? Where would we come up with such an approach? People aren't just going to automatically do the right thing on their own volition.

It's just stunning.

I know. I feel like we're going to have apps where you have, like, weather forecast, air pollution, space debris.

Space debris, yeah. What's the probability of that thing landing in Manhattan today?

Take your umbrella.

Yeah, like a steel umbrella.

50% chance of rain, 5% chance of...

Low Earth orbit de-orbiting.

Emily Calandrelli, who does a lot of space-related science communication, she was following this one as it was coming down. And what shocked me about it was we really didn't know where it was going to be until, like, an hour before, even days before, it was like half of the Earth was in the possible target area. But she did say, at least this one, they thought. But, again, they didn't really know what exactly it was made of, but it would only take out a house or two.

A house or two.

Just a house or two.

Yeah. Since you suggested a city, a house was the better alternative.

Does space debris zero in on trailer parks like tornadoes do?

Yeah.

I'm just wondering. And lawn chairs and stuff.

Yeah. But there's things to consider, though, because it's not just... But could there be explosives in there? Could there be some leftover rocket fuel fumes? Or I have no idea, like, what potential explosive...

They're probably out of fuel, yeah.

You'd hope.

Yeah, you'd hope. Who knows? What about waste? What about, like, dangerous gases and things like that?

Well, when Columbia broke up in 2003 and came down over the American South and Southeast, there was concern that they didn't know what sort of contamination, I think, there was in some of the materials, that people were finding and picking up, like, you know, a piece of a helmet and things. They warned people to not go near them.

Yeah. So I don't know what sort of danger that... I don't know. I know it always comes up whenever they're sending up any satellite or anything that has a nuclear battery in it. If that thing, you know, blows up or reenters, then we could be dumping nuclear waste.

Well, now I'm thinking, you know, Cold War Sputnik stuff, too, where it's like, what if it's not an accident? Not to be the conspiracy theorist of the group, but that would be a good way to... Anyway, I'll stop with that one thought.

All right.

=== Auditory Pareidolia Again <small>(34:16)</small> ===
* [https://www.today.com/popculture/green-needle-or-brainstorm-hear-latest-audio-clip-dividing-internet-t188193 'Green needle' or 'brainstorm'? Hear the latest audio clip dividing the internet]<ref>[https://www.today.com/popculture/green-needle-or-brainstorm-hear-latest-audio-clip-dividing-internet-t188193 Today: 'Green needle' or 'brainstorm'? Hear the latest audio clip dividing the internet]</ref>


[10:46.120 --> 10:49.120]  I just call it Proxima, and they're messing around with the names of these stars.
[34:15.120 --> 34:17.120]  This is actually a couple of years old,


[10:49.120 --> 10:53.120]  Yeah, but this is Alpha 1 and 2, or A and B.
[34:17.120 --> 34:19.120]  but it's making the rounds again, and I saw it.


[10:53.120 --> 10:57.120]  And you can see they're basically point sources of light.
[34:19.120 --> 34:21.120]  I don't think we've ever played this on the issue.


[10:57.120 --> 11:02.120]  You're not seeing really like the surface of those stars.
[34:21.120 --> 34:23.120]  I missed it the first time around.


[11:02.120 --> 11:07.120]  There's some flare, lens flare, but you're not seeing the surface.
[34:23.120 --> 34:25.120]  This video, just listen to the sound.


[11:07.120 --> 11:12.120]  Bob and I found not the best picture of Alpha Centauri,
[34:25.120 --> 34:27.120]  You don't have to see the video.


[11:12.120 --> 11:17.120]  but just what's the best picture of any star ever, and there you go.
[34:27.120 --> 34:30.120]  So either think the word brainstorm


[11:17.120 --> 11:23.120]  And it looks pretty much like a blurry slice of chorizo.
[34:30.120 --> 34:33.120]  or think the word green needle.


[11:23.120 --> 11:26.120]  It doesn't even look symmetrical, Steve.
[34:33.120 --> 34:37.120]  And whatever you think, that's what you will hear.


[11:26.120 --> 11:29.120]  It's kind of a fattier chorizo, this one, though, right?
[34:37.120 --> 34:41.120]  You don't even need to be caught with the actual words.


[11:29.120 --> 11:32.120]  So as I said, if you blurt out that chorizo slice,
[34:41.120 --> 34:45.120]  You just have to think it.


[11:32.120 --> 11:36.120]  you have a pretty good facsimile of a close-up picture of the star.
[34:45.120 --> 34:47.120]  Isn't that bizarre?


[11:36.120 --> 11:39.120]  Now this is, what was it, about 520 light years away?
[34:47.120 --> 34:48.120]  That's crazy.


[11:39.120 --> 11:41.120]  Yeah, surprisingly far.
[34:48.120 --> 34:50.120]  Although I'm hearing the green needle a lot more


[11:41.120 --> 11:43.120]  But it's a red supergiant, so it's massive.
[34:50.120 --> 34:52.120]  than I'm hearing the brainstorm.


[11:43.120 --> 11:44.120]  So that helps.
[34:52.120 --> 34:55.120]  It's either distinctively green needle or not green needle.


[11:44.120 --> 11:45.120]  Yeah, that helps a lot.
[34:55.120 --> 34:58.120]  Yeah, but I could flip both ways at will.


[11:45.120 --> 11:49.120]  Yeah, that was more plausible than I thought when I first saw it.
[34:58.120 --> 35:02.120]  You would think, though, they seem like such different phrases


[11:49.120 --> 11:53.120]  I feel like the scary thing is that we're all so worried about misinformation
[35:02.120 --> 35:05.120]  phonetically and everything, but it's in there.


[11:53.120 --> 11:55.120]  that scientists can't make jokes.
[35:05.120 --> 35:07.120]  There are things in there that will trick your brain


[11:55.120 --> 11:57.120]  It's kind of where we're going to live.
[35:07.120 --> 35:09.120]  for both of those.


[11:57.120 --> 11:59.120]  Not that this was the best joke of all time,
[35:09.120 --> 35:10.120]  It's uncanny.


[11:59.120 --> 12:02.120]  but the idea of a prank is sort of, it feels irresponsible,
[35:10.120 --> 35:12.120]  It's not even the same number of syllables,


[12:02.120 --> 12:04.120]  and it's too bad that that's the case,
[35:12.120 --> 35:15.120]  which is surprising to me that it still works, right?


[12:04.120 --> 12:06.120]  because it's making science fun and engaging,
[35:15.120 --> 35:16.120]  Yeah, it's one extra syllable.


[12:06.120 --> 12:09.120]  and you could imagine he could do a fun quiz show,
[35:16.120 --> 35:17.120]  Two versus three.


[12:09.120 --> 12:12.120]  like Cartwheel Galaxy or Lollipop or whatever, right?
[35:17.120 --> 35:20.120]  I think the distortion itself must be a critical component


[12:12.120 --> 12:13.120]  Fallon could do a segment,
[35:20.120 --> 35:23.120]  of the ability to switch between it from one to the other, perhaps.


[12:13.120 --> 12:16.120]  but it feels like it would cause more harm than good, which is...
[35:23.120 --> 35:26.120]  Otherwise, why make it sound so distorted?


[12:16.120 --> 12:18.120]  Right, unless you're transparent up front.
[35:26.120 --> 35:30.120]  I believe it also works brain needle and green storm as well.


[12:18.120 --> 12:21.120]  If he did it as, like, you might think this is a close-up star,
[35:30.120 --> 35:32.120]  If you try it.


[12:21.120 --> 12:23.120]  but it's actually a chorizo.
[35:32.120 --> 35:34.120]  I have to stumble upon this.


[12:23.120 --> 12:25.120]  Here's a close-up star, something like that.
[35:34.120 --> 35:42.120]  It's one of the more dramatic examples of auditory parendolism.


[12:25.120 --> 12:27.120]  That might be a new thing for our social media.
[35:42.120 --> 35:45.120]  This happens in a lot of our sensory streams,


[12:27.120 --> 12:29.120]  Close-up, what is this?
[35:45.120 --> 35:48.120]  but it happens a lot with language.


[12:29.120 --> 12:31.120]  This one looks more like a pizza pie, though.
[35:48.120 --> 35:52.120]  Our sensory streams are wired to make the closest fit


[12:31.120 --> 12:33.120]  It's like that gimmick they did forever ago,
[35:52.120 --> 35:55.120]  to phonemes that you know.


[12:33.120 --> 12:35.120]  where they were like, was this a famous painting,
[35:55.120 --> 35:59.120]  It's constantly trying to make that fit between speech sound


[12:35.120 --> 12:36.120]  or did a gorilla paint this?
[35:59.120 --> 36:02.120]  and words that you know.


[12:36.120 --> 12:38.120]  All right, so real quick,
[36:02.120 --> 36:05.120]  That's why you can misunderstand lyrics all the time


[12:38.120 --> 12:41.120]  apparently the password that the publisher put up there,
[36:05.120 --> 36:06.120]  and misunderstand what people say.


[12:41.120 --> 12:44.120]  the space for the password only takes five characters,
[36:06.120 --> 36:07.120]  It sounds like something close to it.


[12:44.120 --> 12:47.120]  so just type in the first five characters of the word future.
[36:07.120 --> 36:11.120]  This is just demonstrating that in a very dramatic way.


[12:47.120 --> 12:50.120]  You know, you can't get good help these days, George.
[36:11.120 --> 36:14.120]  It's amazing how well the priming works.


[12:50.120 --> 12:51.120]  Futter.
[36:14.120 --> 36:18.120]  When Rob brought up the distortion, it reminded me of,


[12:51.120 --> 12:52.120]  Futter.
[36:18.120 --> 36:22.120]  we talked about it on SGU, the doll that would talk.


[12:52.120 --> 12:53.120]  Futter.
[36:22.120 --> 36:23.120]  Full-string dolls.


[12:53.120 --> 12:54.120]  I don't understand.
[36:23.120 --> 36:24.120]  It has a recording.


[12:54.120 --> 12:57.120]  The password field actually has a limit to how many characters it takes.
[36:24.120 --> 36:27.120]  It's a voice, but it's a crackly kind of voice.


[12:57.120 --> 12:58.120]  A small limit as well.
[36:27.120 --> 36:29.120]  It has a bit of distortion to it.


[12:58.120 --> 12:59.120]  How does that even happen?
[36:29.120 --> 36:32.120]  People think they're hearing things that the doll is saying


[12:59.120 --> 13:01.120]  You have to pay more for the six characters.
[36:32.120 --> 36:34.120]  that it really isn't programmed to say,


[13:01.120 --> 13:03.120]  Most passwords require it to be way too long these days,
[36:34.120 --> 36:38.120]  but they can't distinguish what it was programmed to say.


[13:03.120 --> 13:05.120]  and I can't fill it in.
[36:38.120 --> 36:42.120]  They're thinking what they think it's saying instead.


[13:05.120 --> 13:08.120]  A third book, Jay, maybe you could have six characters.
[36:42.120 --> 36:45.120]  We've come across this before in other mediums.


[13:08.120 --> 13:09.120]  Look, this is what I'll do.
[36:45.120 --> 36:48.120]  Is this behind those Disney conspiracies too,


[13:09.120 --> 13:11.120]  I'll call the publisher on Monday, and I'll tell them,
[36:48.120 --> 36:49.120]  where they're like,


[13:11.120 --> 13:15.120]  forget the password, just whoever entered is going to be legit.
[36:49.120 --> 36:52.120]  there are secret horrible messages in various cartoons?


[13:15.120 --> 13:19.120]  So just put your info in there if you want to enter in.
[36:52.120 --> 36:54.120]  Is the light, that was one of the dolls that had it,


[13:19.120 --> 13:21.120]  All right, let's get to some news items.
[36:54.120 --> 36:57.120]  but that's not really what the doll was saying,


[13:21.120 --> 13:23.120]  We have more fun bits coming up later, too,
[36:57.120 --> 37:03.120]  but it spread virally and that's what everyone started to hear.


[13:23.120 --> 13:25.120]  but first a couple news items.
[37:03.120 --> 37:05.120]  It was saying because it was suggested that that's what it was saying.


{{anchor|news#}} <!-- leave this news item anchor directly above the news item section that follows -->
[37:05.120 --> 37:07.120]  The awkward masking on records.
=== Scientific Rigor <small>(13:25)</small> ===
* [https://theness.com/neurologicablog/index.php/nih-to-fund-scientific-rigor-initiative/ NIH To Fund Scientific Rigor Initiative]<ref>[https://theness.com/neurologicablog/index.php/nih-to-fund-scientific-rigor-initiative/ Neurologica: NIH To Fund Scientific Rigor Initiative]</ref>


[13:25.120 --> 13:29.120]  That picture of a giant complex of buildings that I'm showing
[37:07.120 --> 37:09.120]  I was just going to say that.


[13:29.120 --> 13:33.120]  is the NIH, the National Institutes of Health.
[37:09.120 --> 37:12.120]  I've listened to Stairway to Heaven backwards.


[13:33.120 --> 13:40.120]  They are essentially the main biomedical research funding institution
[37:12.120 --> 37:19.120]  I really hear a lot of stuff in there that has a demonic connotation.


[13:40.120 --> 13:41.120]  in the United States.
[37:19.120 --> 37:21.120]  The words that they're saying.


[13:41.120 --> 13:43.120]  They are a creature of Congress, as we like to say.
[37:21.120 --> 37:25.120]  It's probably because I've been priming myself since I was a teenager.


[13:43.120 --> 13:46.120]  They are created, funded by Congress.
[37:25.120 --> 37:27.120]  When I hear that, every once in a while I'll listen to it


[13:46.120 --> 13:48.120]  Essentially, if you do biomedical research in the U.S.,
[37:27.120 --> 37:29.120]  because it's actually kind of interesting.


[13:48.120 --> 13:52.120]  you get your funding from the NIH, more likely than not.
[37:29.120 --> 37:33.120]  I'm hearing, here's to my sweet Satan and all that stuff.


[13:52.120 --> 13:56.120]  They're massively important for medical research.
[37:33.120 --> 37:35.120]  It seems very clear to me.


[13:56.120 --> 13:59.120]  Recently, the NIH created an initiative.
[37:35.120 --> 37:40.120]  Again, your brain is trying to make sense out of chaos.


[13:59.120 --> 14:00.120]  It's not a new office or anything.
[37:40.120 --> 37:45.120]  Sometimes your brain concocts something that isn't actually there.


[14:00.120 --> 14:02.120]  It's just an initiative.
[37:45.120 --> 37:47.120]  It's kind of like the dress.


[14:02.120 --> 14:07.120]  They're funding specific groups who are going to create
[37:47.120 --> 37:49.120]  I was just thinking about the dress.


[14:07.120 --> 14:15.120]  an educational module to teach researchers how to do rigorous science.
[37:49.120 --> 37:51.120]  Or Laurel and Yanny.


[14:15.120 --> 14:16.120]  That sounds pretty good.
[37:51.120 --> 37:54.120]  Yeah, Laurel and Yanny.


[14:16.120 --> 14:17.120]  That sounds pretty good to me.
[37:54.120 --> 37:56.120]  The internet will spit out more of these things.


[14:17.120 --> 14:19.120]  That doesn't already exist, though?
[37:56.120 --> 37:58.120]  We'll share them with you.


[14:19.120 --> 14:20.120]  That's my thought.
[37:58.120 --> 38:00.120]  This was a particularly impressive one.


[14:20.120 --> 14:21.120]  That's a good question.
[38:00.120 --> 38:02.120]  Everyone, we're going to take a quick break from our show


[14:21.120 --> 14:29.120]  Right now, how do we teach researchers how to do good research methodology?
[38:02.120 --> 38:04.120]  to talk about our sponsor this week, BetterHelp.


[14:29.120 --> 14:32.120]  Some universities may have courses on it.
[38:04.120 --> 38:07.120]  Guys, we have to take care of not just our physical health,


[14:32.120 --> 14:33.120]  They may be required.
[38:07.120 --> 38:09.120]  but also our mental health.


[14:33.120 --> 14:35.120]  They may be elective.
[38:09.120 --> 38:12.120]  There's lots of options available to us now.


[14:35.120 --> 14:39.120]  They might be a statistics course or a research methodology course.
[38:12.120 --> 38:13.120]  BetterHelp is one of them.


[14:39.120 --> 14:45.120]  You do get that, but not like, all right, here's how you do really rigorous research.
[38:13.120 --> 38:16.120]  BetterHelp offers online therapy.


[14:45.120 --> 14:55.120]  Here's how you avoid p-hacking or how you avoid false positives, etc., etc.
[38:16.120 --> 38:17.120]  I'll tell you something.


[14:55.120 --> 15:01.120]  Clearly, that is needed for reasons that I've been talking about
[38:17.120 --> 38:19.120]  I personally do online therapy.


[15:01.120 --> 15:04.120]  and writing about for the last 20 years.
[38:19.120 --> 38:25.120]  I've been meeting with my doctor for the past six months every week.


[15:04.120 --> 15:07.120]  The other way that people learn that is through, essentially,
[38:25.120 --> 38:28.120]  I've been dealing with anxiety and depression my entire adult life.


[15:07.120 --> 15:08.120]  individual mentorship.
[38:28.120 --> 38:32.120]  Therapy is one of the biggest things that helps me deal with it.


[15:08.120 --> 15:12.120]  You work in somebody's lab, and they teach you how to do research,
[38:32.120 --> 38:34.120]  I really think that you should consider it.


[15:12.120 --> 15:16.120]  not only in their specific area, technically, but also just,
[38:34.120 --> 38:37.120]  If you're suffering, if you're having anything that's bothering you


[15:16.120 --> 15:18.120]  this is what good science is.
[38:37.120 --> 38:40.120]  that you seem to not be able to get over,


[15:18.120 --> 15:22.120]  But it's not systematic, and it's not thorough enough.
[38:40.120 --> 38:43.120]  you really should think about talking to someone to get help.


[15:22.120 --> 15:26.120]  Clearly, there's a perception that there is a gap, a gap there.
[38:43.120 --> 38:44.120]  You're right, Jay.


[15:26.120 --> 15:28.120]  They want to fill that gap.
[38:44.120 --> 38:49.120]  BetterHelp is not only online, but it offers a lot of different options.


[15:28.120 --> 15:35.120]  Their goal is to fund the creation of this module to teach rigorous research design
[38:49.120 --> 38:52.120]  We're talking video, phone, even live chat only.


[15:35.120 --> 15:38.120]  and to then make it freely available, basically.
[38:52.120 --> 38:57.120]  You don't have to see someone on camera if you're not in the place to do that.


[15:38.120 --> 15:42.120]  And then the hope is, so universities may require it.
[38:57.120 --> 39:02.120]  It's also affordable, and you can be matched with a therapist in under 48 hours.


[15:42.120 --> 15:46.120]  They might say, all right, if you're going to work at our university,
[39:02.120 --> 39:07.120]  Our listeners get 10% off their first month at BetterHelp.com.


[15:46.120 --> 15:47.120]  this already happens, right?
[39:07.120 --> 39:11.120]  That's BetterHELP.com.


[15:47.120 --> 15:52.120]  I work at Yale, and I have to do 20 different certifications every year
[39:11.120 --> 39:14.120]  All right, guys, let's get back to the show.


[15:52.120 --> 15:56.120] on everything, like sexual harassment sensitivity
=== The Alex Jones Saga <small>(39:15)</small> ===
* [https://www.reuters.com/business/media-telecom/jury-alex-jones-defamation-case-begin-deliberations-punitive-damages-2022-08-05/ Jury awards $45.2 million in punitive damages in Alex Jones Sandy Hook trial]<ref>[https://www.reuters.com/business/media-telecom/jury-alex-jones-defamation-case-begin-deliberations-punitive-damages-2022-08-05/ Reuters: Jury awards $45.2 million in punitive damages in Alex Jones Sandy Hook trial]</ref>


[15:56.120 --> 15:59.120]  or how not to burn your eyes out or whatever, all of these things.
[39:14.120 --> 39:15.120]  All right.


[15:59.120 --> 16:00.120]  That's a good one.
[39:15.120 --> 39:21.120]  One thing that we can agree on, that is that Alex Jones is a giant douchebag.


[16:00.120 --> 16:02.120]  How to treat patients ethically, all good stuff.
[39:21.120 --> 39:25.120]  You don't have my permission to use that photo.


[16:02.120 --> 16:04.120]  A lot of safety things all in there.
[39:25.120 --> 39:28.120]  I'm going to get your internet permission to not use that photo.


[16:04.120 --> 16:08.120]  But just adding one that's like, here's how not to do fake research.
[39:28.120 --> 39:29.120]  Buy my vitamins.


[16:08.120 --> 16:11.120]  Here's how not to accidentally commit research fraud,
[39:29.120 --> 39:31.120]  I have a worse photo.


[16:11.120 --> 16:13.120]  or how to p-hack or whatever.
[39:31.120 --> 39:37.120]  All right, Kelly, give us an update on the Alex Jones saga.


[16:13.120 --> 16:18.120]  It would be very easy to slip that into the existing system
[39:37.120 --> 39:41.120]  Yes, so I, like the insane person I am,


[16:18.120 --> 16:21.120]  of getting certified for quality control.
[39:41.120 --> 39:44.120]  have kind of had this on in the background for the last two weeks,


[16:21.120 --> 16:23.120]  That's basically what this is.
[39:44.120 --> 39:48.120]  and I was very glad to have an opportunity to put that to use.


[16:23.120 --> 16:26.120]  Now, the NIH, of course, they could require,
[39:48.120 --> 39:51.120]  But in Steve fashion, I'm going to start with a question.


[16:26.120 --> 16:29.120]  if you apply to the NIH for a research grant,
[39:51.120 --> 39:58.120]  So what percentage of Americans do you guys think question the Sandy Hook shooting?


[16:29.120 --> 16:31.120]  and they're not saying they're going to do this,
[39:58.120 --> 40:00.120]  20%.


[16:31.120 --> 16:33.120]  but imagine if they said, all right, in order to get this grant,
[40:00.120 --> 40:01.120]  10%.


[16:33.120 --> 16:37.120]  you've got to have certification that you took this module and you passed.
[40:01.120 --> 40:02.120]  Question it?


[16:37.120 --> 16:41.120]  Because again, they're interested in not wasting money.
[40:02.120 --> 40:04.120]  Probably I would say like 22%.


[16:41.120 --> 16:43.120]  That's their primary interest.
[40:04.120 --> 40:06.120]  22.1%.


[16:43.120 --> 16:44.120]  Obviously, they want to do good science.
[40:06.120 --> 40:07.120]  25%.


[16:44.120 --> 16:45.120]  That's their goal.
[40:07.120 --> 40:08.120]  Wow.


[16:45.120 --> 16:47.120]  Their mission is to obviously do good science,
[40:08.120 --> 40:10.120]  It depends on whether we're doing Price is Right rules or not,


[16:47.120 --> 16:51.120]  but they have a finite budget, and they want to make the most use
[40:10.120 --> 40:13.120]  but I don't think we are because I didn't say it, so Andrea wins.


[16:51.120 --> 16:53.120]  out of that money.
[40:13.120 --> 40:15.120]  Oh, it's that high?


[16:53.120 --> 16:55.120]  That, again, is their mission.
[40:15.120 --> 40:16.120]  There we go.


[16:55.120 --> 17:00.120]  One of the biggest wastes in research is bad science.
[40:16.120 --> 40:18.120]  That's horrible.


[17:00.120 --> 17:04.120]  If you publish a study, and it's a false positive, let's say,
[40:18.120 --> 40:22.120]  A quarter of the people polled, it's hard because I would have won.


[17:04.120 --> 17:08.120]  you think that you have a result, but you did poor methodology,
[40:22.120 --> 40:25.120]  Price is Right rules, I would have won.


[17:08.120 --> 17:11.120]  you p-hacked or whatever, you underpowered the study,
[40:25.120 --> 40:28.120]  Granted, there's always issues with polling,


[17:11.120 --> 17:16.120]  or the blinding was inadequate, or your statistics were off, or whatever,
[40:28.120 --> 40:31.120]  but even if it's half that, that's absolutely insane,


[17:16.120 --> 17:20.120]  and then other people try to replicate that study,
[40:31.120 --> 40:34.120]  and it's almost single-handedly because of Alex Jones.


[17:20.120 --> 17:24.120]  how many millions of dollars could be spent proving that your crappy study
[40:34.120 --> 40:36.120]  Oh, yeah.


[17:24.120 --> 17:28.120]  was crappy when you could have filtered it out at the beginning
[40:36.120 --> 40:39.120]  So I'm going to talk more about the misinformation piece.


[17:28.120 --> 17:32.120]  by putting in some internal controls that you didn't know you should do
[40:39.120 --> 40:42.120]  I know everyone has seen all of the clips of his testimony


[17:32.120 --> 17:35.120]  or by tightening up your research methodology.
[40:42.120 --> 40:45.120]  and all of the perjury and all the fun stuff,


[17:35.120 --> 17:39.120]  The other goal here, other than not only doing good science,
[40:45.120 --> 40:47.120]  but since this is a misinformation conference,


[17:39.120 --> 17:44.120]  is to save money by weeding out the inefficiency in the system of fraud.
[40:47.120 --> 40:50.120]  I'm going to focus on that aspect of it.


[17:44.120 --> 17:49.120]  It makes sense, not fraud, but just bad rigor in research design.
[40:50.120 --> 40:54.120]  And I think as skeptics, we often hear the question, what's the harm?


[17:49.120 --> 17:53.120]  It makes sense that once these modules are up and running,
[40:54.120 --> 40:57.120]  Especially with things like conspiracy theories or supplements.


[17:53.120 --> 17:56.120]  phase two would be, and you've got to be certified in this
[40:57.120 --> 41:02.120]  It's just easy to dismiss until it gets to this point,


[17:56.120 --> 17:58.120]  before we'll give you any money.
[41:02.120 --> 41:06.120]  and Alex Jones took both of those things and ruined some families' lives.


[17:58.120 --> 18:00.120]  So that's one way that you, and again,
[41:06.120 --> 41:08.120]  So some backgrounds.


[18:00.120 --> 18:03.120]  the NIH already does this for other things, for example,
[41:08.120 --> 41:12.120]  The caricature that you think of as Alex Jones is pretty much accurate.


[18:03.120 --> 18:07.120]  they now require, this has been going on for about 10 or 15 years or so,
[41:12.120 --> 41:16.120]  He peddles all of the conspiracy theories, 9-11 truth or Pizzagate.


[18:07.120 --> 18:09.120]  if you get public money to do your research,
[41:16.120 --> 41:20.120]  Now he's talking about the globalists trying to bring about the New World Order,


[18:09.120 --> 18:13.120]  you have to make the results of your research available to the public
[41:20.120 --> 41:23.120]  and when the Sandy Hook shooting happened,


[18:13.120 --> 18:15.120]  and accessible by the public.
[41:23.120 --> 41:27.120]  he almost immediately was questioning the narrative.


[18:15.120 --> 18:20.120]  You have to say, how are you going to explain your results
[41:27.120 --> 41:32.120]  And he's gone from saying it's a hoax, calling the parents crisis actors,


[18:20.120 --> 18:23.120]  to the people who are paying for your research, the public?
[41:32.120 --> 41:34.120]  and that's changed over time.


[18:23.120 --> 18:26.120]  So this would be another way, how can you assure the people
[41:34.120 --> 41:37.120]  His position has definitely evolved,


[18:26.120 --> 18:29.120]  who are funding your research that you're not wasting their money
[41:37.120 --> 41:42.120]  but the consistent through line of that is that he's questioning the official story


[18:29.120 --> 18:31.120]  by doing rigorous research design?
[41:42.120 --> 41:45.120]  and doesn't think that the official story is true.


[18:31.120 --> 18:36.120]  And by the way, here is an educational module,
[41:45.120 --> 41:48.120]  And because of this, the families of the children who died


[18:36.120 --> 18:40.120]  and we could easily connect certification to that.
[41:48.120 --> 41:51.120]  have received death threats, they've been harassed,


[18:40.120 --> 18:41.120]  That's awesome.
[41:51.120 --> 41:54.120]  and they're dealing with this constantly circulating.


[18:41.120 --> 18:45.120]  I would like to see big science journals do the same thing.
[41:54.120 --> 41:58.120]  So a bunch of the families have sued him, rightfully so.


[18:45.120 --> 18:47.120]  You want to get published in our journal,
[41:58.120 --> 42:01.120]  And so this trial was for the parents of Jesse Lewis,


[18:47.120 --> 18:50.120]  we require that you have the author, the lead author,
[42:01.120 --> 42:04.120]  who was a six-year-old who died in Sandy Hook,


[18:50.120 --> 18:52.120]  or every author has certification.
[42:04.120 --> 42:09.120]  for defamation and intentional infliction of emotional distress.


[18:52.120 --> 18:54.120]  And of course, once either of those happens,
[42:09.120 --> 42:12.120]  And we're about to make fun of Alex Jones,


[18:54.120 --> 18:57.120]  like if the NIH says you need to have certification to get grant money,
[42:12.120 --> 42:17.120]  but as we're doing it, keep in mind that this all sounds silly and ridiculous,


[18:57.120 --> 19:01.120]  you better believe every university will make sure that it happens.
[42:17.120 --> 42:20.120]  but it's causing real harm to these families.


[19:01.120 --> 19:03.120]  They're not going to have any of their people
[42:20.120 --> 42:23.120]  And I don't want to make light of it, but at the same time,


[19:03.120 --> 19:05.120]  not be able to get NIH grants.
[42:23.120 --> 42:25.120]  there's something really satisfying,


[19:05.120 --> 19:08.120]  So it's very easy to make this systematic.
[42:25.120 --> 42:29.120]  especially in the misinformation apocalypse that we're in right now,


[19:08.120 --> 19:10.120]  So again, we're right at the very beginning of this,
[42:29.120 --> 42:33.120]  about somebody who is this awful actually being held accountable.


[19:10.120 --> 19:13.120]  and everything I'm hearing and seeing is very, very good.
[42:33.120 --> 42:37.120]  So we've got to at least appreciate that for a minute.


[19:13.120 --> 19:15.120]  We'll keep a close eye on it.
[42:37.120 --> 42:40.120]  Also, his lawyers are comically terrible.


[19:15.120 --> 19:17.120]  And again, a lot of people react like you, Jay.
[42:40.120 --> 42:42.120]  How can they be that?


[19:17.120 --> 19:19.120]  It's like, really, why isn't this kind of already happening?
[42:42.120 --> 42:45.120]  I mean, for a guy that has this much money,


[19:19.120 --> 19:22.120]  But that's because I think the main reason is,
[42:45.120 --> 42:48.120]  how could he because he's a losing case?


[19:22.120 --> 19:24.120]  I would say there's two things.
[42:48.120 --> 42:50.120]  Because nobody wants to defend him.


[19:24.120 --> 19:27.120]  One is people think it is happening, but it's just not happening enough.
[42:50.120 --> 42:54.120]  He probably has been working his way down the ladder of terrible lawyers.


[19:27.120 --> 19:31.120]  The second one is that the science of doing rigorous science
[42:54.120 --> 42:56.120]  And you've had that experience.


[19:31.120 --> 19:33.120]  has been getting better.
[42:56.120 --> 42:58.120]  I mean, his lawyers were pretty terrible.


[19:33.120 --> 19:37.120]  We're learning more and more subtle ways in which studies go awry
[42:58.120 --> 43:01.120]  With your case, your opponent had that as well.


[19:37.120 --> 19:39.120]  or that results can be tweaked
[43:01.120 --> 43:05.120]  He kept going through lawyers because nobody of quality would defend him.


[19:39.120 --> 19:42.120]  or researchers can put their thumb on the scale.
[43:05.120 --> 43:07.120]  Who wants to defend this guy?


[19:42.120 --> 19:44.120]  We talk about researcher degrees of freedom
[43:07.120 --> 43:10.120]  The other thing is that they did it on purpose.


[19:44.120 --> 19:47.120]  and researcher bias and publication bias and citation bias
[43:10.120 --> 43:11.120]  That's what I was thinking.


[19:47.120 --> 19:52.120]  and all these things that can alter the utility and the rigor
[43:11.120 --> 43:12.120]  You think they're sandbagging?


[19:52.120 --> 19:54.120]  and the quality of science,
[43:12.120 --> 43:13.120]  Yeah.


[19:54.120 --> 19:59.120]  and essentially the old method of just relying upon some,
[43:13.120 --> 43:15.120]  His morals got the better of him.


[19:59.120 --> 20:02.120]  like just here's some classic statistics class,
[43:15.120 --> 43:17.120]  That thought has been brought up.


[20:02.120 --> 20:05.120]  and then whoever's lab you work in,
[43:17.120 --> 43:20.120]  But the thing is, one, it's a civil case,


[20:05.120 --> 20:07.120]  they'll teach you how to do good science.
[43:20.120 --> 43:24.120]  so he can't get away with the whole, like, my lawyers were incompetent,


[20:07.120 --> 20:09.120]  It's just not good enough anymore.
[43:24.120 --> 43:26.120]  so get out of it that way.


[20:09.120 --> 20:13.120]  It's got to be systematic, and everyone's got to go through it
[43:26.120 --> 43:30.120]  But also, they cross-examined the parents.


[20:13.120 --> 20:17.120]  in order to absolutely minimize the waste in the system
[43:30.120 --> 43:33.120]  And I feel like if you were sandbagging it,


[20:17.120 --> 20:21.120]  that comes from poor research design.
[43:33.120 --> 43:36.120]  you wouldn't want to inflict additional trauma on the parents.


[20:21.120 --> 20:23.120]  So this is a massive move in the right direction.
[43:36.120 --> 43:40.120]  And some of the questions that he was asking them, I couldn't believe.


[20:23.120 --> 20:25.120]  This is very, very encouraging.
[43:40.120 --> 43:43.120]  Have the lawyers made a statement about how it happened?


[20:25.120 --> 20:27.120]  Steve, where did you learn how to do it?
[43:43.120 --> 43:47.120]  Because it's hard to accidentally send a huge set of files or file.


[20:27.120 --> 20:29.120]  For me, well, I mean,
[43:47.120 --> 43:49.120]  I always forget to send attachments.


[20:29.120 --> 20:31.120]  it's been the whole science-based medicine initiative,
[43:49.120 --> 43:52.120]  Oh, the phone that's almost definitely going to the January 6th committee


[20:31.120 --> 20:33.120]  which is I've been reading about it, following,
[43:52.120 --> 43:54.120]  is like a whole story in itself.


[20:33.120 --> 20:36.120]  reading the literature on it for 20 years
[43:54.120 --> 43:57.120]  But basically, the one lawyer said,


[20:36.120 --> 20:38.120]  and writing about it, trying to digest it.
[43:57.120 --> 44:00.120]  please disregard after he accidentally sent the files,


[20:38.120 --> 20:41.120]  That's basically what we explore at science-based medicine
[44:00.120 --> 44:04.120]  but didn't actually take the legal steps to pull back all that information.


[20:41.120 --> 20:43.120]  is how to do rigorous science,
[44:04.120 --> 44:08.120]  So they just got to use it after his ten days were up.


[20:43.120 --> 20:45.120]  the relationship between science and practice.
[44:08.120 --> 44:11.120]  This trial was specifically for damages,


[20:45.120 --> 20:47.120]  How do we know what's true, what's not true?
[44:11.120 --> 44:15.120]  because Alex Jones didn't provide any of the documents or evidence


[20:47.120 --> 20:49.120]  Where's the threshold of evidence
[44:15.120 --> 44:17.120]  that he was supposed to during the discovery phase,


[20:49.120 --> 20:51.120]  before something should affect your practice?
[44:17.120 --> 44:20.120]  and he dragged things on for years, and so there was a default judgment.


[20:51.120 --> 20:53.120]  That's what we do.
[44:20.120 --> 44:23.120]  So it wasn't a question of if the defamation happens.


[20:53.120 --> 20:55.120]  That's how I learned it.
[44:23.120 --> 44:25.120]  The court had decided the defamation happened.


[20:55.120 --> 20:57.120]  It was all basically just self-taught by reading the literature,
[44:25.120 --> 44:30.120]  This was just to decide how much he had to pay for it.


[20:57.120 --> 21:00.120]  talking to my colleagues, writing about it, engaging about it.
[44:30.120 --> 44:36.120]  And the trial was exactly as dramatic as the clips are portraying it to be,


[21:00.120 --> 21:04.120]  But most researchers are not spending most of their time,
[44:36.120 --> 44:39.120]  and I think this one exchange between Alex Jones and the judge


[21:04.120 --> 21:07.120]  their academic time, doing that.
[44:39.120 --> 44:43.120]  is the epitome of his testimony at least.


[21:07.120 --> 21:09.120]  They're doing their research.
[44:43.120 --> 44:45.120]  So I'm going to read that.


[21:09.120 --> 21:12.120]  They're trying to figure out what receptor is causing this disease
[44:45.120 --> 44:48.120]  I'm sorry, I don't have as good an Alex Jones impression as George.


[21:12.120 --> 21:14.120]  or whatever.
[44:48.120 --> 44:53.120]  So the judge, after sending the jury out because Alex Jones was talking about


[21:14.120 --> 21:18.120]  This is sort of part of that, but it's not their focus.
[44:53.120 --> 44:56.120]  things that he wasn't supposed to while he was on the stand,


[21:18.120 --> 21:23.120]  That's why it needs to be done systematically.
[44:56.120 --> 44:59.120]  said, you're already under oath to tell the truth.


[21:23.120 --> 21:25.120]  This is also one final word and then we'll move on.
[44:59.120 --> 45:02.120]  You've already violated that oath twice today.


[21:25.120 --> 21:28.120]  Part of a bigger trend that I've noticed, at least in medicine,
[45:02.120 --> 45:03.120]  And granted, twice today.


[21:28.120 --> 21:32.120]  Andrew, you can tell me if you think it's true in your field as well,
[45:03.120 --> 45:07.120]  He had been on the stand for like 10 minutes by that point maybe.


[21:32.120 --> 21:37.120]  that you're going away from the model of just counting on mentorship
[45:07.120 --> 45:11.120]  That might be an exaggeration, but it was end of the day,


[21:37.120 --> 21:41.120]  and counting on that people will learn what they need to learn
[45:11.120 --> 45:12.120]  he had just gotten on the stand.


[21:41.120 --> 21:46.120]  and moving towards things that are way more systematic,
[45:12.120 --> 45:16.120]  It seems absurd to instruct you that you must tell the truth while you testify,


[21:46.120 --> 21:51.120]  that are verified, and also that there are checks in place
[45:16.120 --> 45:18.120]  yet here I am.


[21:51.120 --> 21:59.120]  rather than just trying to raise the quality by just over-educating people.
[45:18.120 --> 45:20.120]  You must tell the truth when you testify.


[21:59.120 --> 22:01.120]  You just have checks in place to make sure that they do it.
[45:20.120 --> 45:22.120]  This is not your show.


[22:01.120 --> 22:03.120]  Medicine is getting too complicated.
[45:22.120 --> 45:25.120]  And then she explains some of the specifics, and she goes,


[22:03.120 --> 22:06.120]  Science is getting too complicated to rely upon methods
[45:25.120 --> 45:27.120]  do you understand what I have said?


[22:06.120 --> 22:08.120]  that are not absolutely systematic.
[45:27.120 --> 45:30.120]  And he goes, I, and she interrupts him and says, yes or no.


[22:08.120 --> 22:10.120]  Is that something you find in academia from your end?
[45:30.120 --> 45:34.120]  He goes, yes, I believe what I said is true.


[22:10.120 --> 22:11.120]  Definitely.
[45:34.120 --> 45:35.120]  And she cuts him off.


[22:11.120 --> 22:13.120]  I'm thinking about something that I think Jay brought up
[45:35.120 --> 45:39.120]  She goes, you believe everything you say is true, but it isn't.


[22:13.120 --> 22:16.120]  on a different live a while ago about the movement
[45:39.120 --> 45:41.120]  Your beliefs do not make something true.


[22:16.120 --> 22:18.120]  towards pre-registering your hypotheses.
[45:41.120 --> 45:43.120]  That's what we're doing here.


[22:18.120 --> 22:20.120]  That's another way of just putting the system in place
[45:43.120 --> 45:44.120]  Oh my God.


[22:20.120 --> 22:23.120]  because it turns out we can't rely on everyone to do great science
[45:44.120 --> 45:45.120]  Wow.


[22:23.120 --> 22:25.120]  even though we all like to think that we're doing it.
[45:45.120 --> 45:48.120]  And you should really watch that whole clip because there was so much more of it,


[22:25.120 --> 22:27.120]  Where I thought you were going, Steve, with that was
[45:48.120 --> 45:50.120]  but I couldn't go into the whole thing.


[22:27.120 --> 22:29.120]  we can't rely exclusively.
[45:50.120 --> 45:54.120]  And watch all the clips from his testimony because it is absolutely horrifying,


[22:29.120 --> 22:31.120]  Well, we still rely on it a lot, but peer review.
[45:54.120 --> 45:58.120]  but also really satisfying because he's an awful person and deserves every bit of that.


[22:31.120 --> 22:33.120]  Peer review is not a perfect process.
[45:58.120 --> 46:02.120]  And I can't help, through all the things that I've consumed about this man,


[22:33.120 --> 22:35.120]  It's a strong process in a lot of ways
[46:02.120 --> 46:07.120]  I can't help but think that this entire thing is an act.


[22:35.120 --> 22:37.120]  and I don't have great ideas about what to do instead,
[46:07.120 --> 46:08.120]  I was thinking the same, Jay.


[22:37.120 --> 22:39.120]  but it's not like it's perfect.
[46:08.120 --> 46:10.120]  I'm wondering what you all think about that.


[22:39.120 --> 22:41.120]  A lot of stuff gets through peer review,
[46:10.120 --> 46:13.120]  You think he knows what he's doing and he's just pretending?


[22:41.120 --> 22:44.120]  and so this is something that could help steer people.
[46:13.120 --> 46:18.120]  Of course, I'm not 100% sure, but it just seems like it is all a money-making act.


[22:44.120 --> 22:48.120]  The only question I'm having, though, is how you could imagine
[46:18.120 --> 46:21.120]  Like I don't think he's a real conspiracy theorist.


[22:48.120 --> 22:51.120]  a world where they're sort of methodologically specific.
[46:21.120 --> 46:22.120]  I think he is.


[22:51.120 --> 22:55.120]  I'm thinking of machine learning where you have issues
[46:22.120 --> 46:23.120]  No, I think you're right.


[22:55.120 --> 22:57.120]  with overfitting your model.
[46:23.120 --> 46:27.120]  He uses his conspiracies to sell supplements because he'll talk about the conspiracy theory


[22:57.120 --> 23:00.120]  That would be totally irrelevant to someone running an experiment.
[46:27.120 --> 46:33.120]  to get the views and then he pivots into an ad for supplements or for shelf-stable food


[23:00.120 --> 23:03.120]  I don't know what the future would look like.
[46:33.120 --> 46:36.120]  because the Great Reset is coming and so you need to have food,


[23:03.120 --> 23:05.120]  Ten years from now, are there different modules?
[46:36.120 --> 46:39.120]  or gold because there's going to be one world currency, so you need gold.


[23:05.120 --> 23:07.120]  Do we need different modules?
[46:39.120 --> 46:44.120]  And didn't he admit as much during his trial with his, what, divorce with his wife, effectively?


[23:07.120 --> 23:09.120]  This is what exists currently in medicine.
[46:44.120 --> 46:45.120]  Custody.


[23:09.120 --> 23:13.120]  If I'm doing some quality control certification thing
[46:45.120 --> 46:46.120]  Was it custody?


[23:13.120 --> 23:16.120]  that I do every year, there's the first part of it,
[46:46.120 --> 46:49.120]  Yeah, Alex Jones is a character that he is playing.


[23:16.120 --> 23:19.120]  which is for everyone or maybe every physician,
[46:49.120 --> 46:52.120]  That was one of his lines of defense,


[23:19.120 --> 23:22.120]  and then you say what your specialty is.
[46:52.120 --> 46:54.120]  which I think probably is accurate.


[23:22.120 --> 23:24.120]  I'm a neurologist.
[46:54.120 --> 46:56.120]  Again, we can't read his mind.


[23:24.120 --> 23:26.120]  Then you get the neurology-specific stuff.
[46:56.120 --> 46:58.120]  We don't really know what he believes or doesn't believe,


[23:26.120 --> 23:28.120]  You could do the same thing.
[46:58.120 --> 47:01.120]  but it certainly is plausible and it certainly fits everything I've seen about him,


[23:28.120 --> 23:30.120]  Here's the generic rigors that everyone needs to know,
[47:01.120 --> 47:03.120]  that this is a character he's playing.


[23:30.120 --> 23:32.120]  and then what are you doing research in?
[47:03.120 --> 47:08.120]  He did admit that, which means he doesn't necessarily have to believe anything.


[23:32.120 --> 23:34.120]  Particle physics?
[47:08.120 --> 47:10.120]  But he's still doing the same level of damage, whether or not.


[23:34.120 --> 23:37.120]  Here's the particle physics part of the module for you
[47:10.120 --> 47:11.120]  Totally.


[23:37.120 --> 23:39.120]  for those specific issues.
[47:11.120 --> 47:12.120]  That's right.


[23:39.120 --> 23:41.120]  I could absolutely see that working that way.
[47:12.120 --> 47:13.120]  Absolutely.


[23:41.120 --> 23:44.120]  I kind of like the idea of making a bunch of social scientists
[47:13.120 --> 47:14.120]  People believe that he's real.


[23:44.120 --> 23:47.120]  do the particle physics, just to keep us humble.
[47:14.120 --> 47:16.120]  Well, and he's doing the character under oath, right?


[23:47.120 --> 23:49.120]  Absolutely.
[47:16.120 --> 47:17.120]  Yes, that's the thing.
=== More Space Debris <small>(23:51)</small> ===
* [https://arstechnica.com/science/2022/08/why-space-debris-keeps-falling-out-of-the-sky-and-will-continue-to-do-so/ Why space debris keeps falling out of the sky—and will continue to do so]<ref>[https://arstechnica.com/science/2022/08/why-space-debris-keeps-falling-out-of-the-sky-and-will-continue-to-do-so/ Ars Technica: Why space debris keeps falling out of the sky—and will continue to do so]</ref>


[23:49.120 --> 23:53.120]  Jay, tell us about crap falling from the sky.
[47:17.120 --> 47:19.120]  That has consequences.


[23:53.120 --> 23:57.120]  Steve, there's crap, and it's falling from the goddamn sky.
[47:19.120 --> 47:23.120]  It's been so interesting to watch because he's not used to being challenged on his show.


[23:57.120 --> 23:59.120]  Oh, my goodness.
[47:23.120 --> 47:25.120]  He has control over the entire narrative.


[23:59.120 --> 24:06.120]  This is about the fact that space agencies around the world
[47:25.120 --> 47:27.120]  Now he has to be in reality.


[24:06.120 --> 24:10.120]  are not doing a very good job of figuring out
[47:27.120 --> 47:32.120]  And so he started to do one of his ad pitches on the stand.


[24:10.120 --> 24:13.120]  how to exactly de-orbit pieces of spacecraft
[47:32.120 --> 47:35.120]  He started talking about how great his supplements are and they get the best supplements.


[24:13.120 --> 24:16.120]  that are left up there for one reason or another.
[47:35.120 --> 47:36.120]  He can't help it.


[24:16.120 --> 24:20.120]  There is a significant number of objects in low Earth orbit.
[47:36.120 --> 47:37.120]  Oh, my God.


[24:20.120 --> 24:25.120]  NASA tracks anything from 2 inches or 5 centimeters and up,
[47:37.120 --> 47:39.120]  It's all he knows, effectively.


[24:25.120 --> 24:30.120]  and there's 27,000 objects that are being tracked,
[47:39.120 --> 47:43.120]  If he can make a few bucks on the stand, why not go for it, I guess, right?


[24:30.120 --> 24:36.120]  and 70% of the tracked objects are in LEO, low Earth orbit,
[47:43.120 --> 47:47.120]  It's always satisfying to see, because this is not the first time this has happened,


[24:36.120 --> 24:39.120]  which is the orbit that's basically as close to the Earth
[47:47.120 --> 47:51.120]  and there are cases where people who are con artists or pseudoscientists or whatever,


[24:39.120 --> 24:41.120]  as you could pretty much get.
[47:51.120 --> 47:55.120]  and they find themselves in a court of law where there are rules of evidence.


[24:41.120 --> 24:43.120]  Do they say LEO?
[47:55.120 --> 48:01.120]  Not that courts are perfect, but they do have fairly rigorous rules of evidence and argument,


[24:43.120 --> 24:45.120]  I've only ever heard LEO.
[48:01.120 --> 48:03.120]  et cetera.


[24:45.120 --> 24:47.120]  I just thought you meant something astrology, Jay,
[48:03.120 --> 48:08.120]  Judges, if they're competent, aren't going to let you get away with stuff.


[24:47.120 --> 24:49.120]  and I was like, I can't believe this is happening.
[48:08.120 --> 48:13.120]  And just watching that disconnect, somebody like Alex Jones who's living in a fantasy world,


[24:49.120 --> 24:50.120]  I've got to go.
[48:13.120 --> 48:19.120]  whether he believes it or not, he is used to being in this con artist construct,


[24:50.120 --> 24:52.120]  I'm blazing trails here.
[48:19.120 --> 48:25.120]  and now he has to deal with reality and rules of evidence,


[24:52.120 --> 24:54.120]  It's low Earth orbit.
[48:25.120 --> 48:29.120]  and the clash is just wonderful to behold.


[24:54.120 --> 24:57.120]  Every one of these objects that are up there
[48:29.120 --> 48:33.120]  It's kind of reminding me, Jay, I think you talked about this on a live, SGU Live,


[24:57.120 --> 25:01.120]  and that are going to be up there for a long time are hazards.
[48:33.120 --> 48:39.120]  maybe a year ago when Sanjay Gupta was on Joe Rogan and we all expected it to be kind of like that,


[25:01.120 --> 25:02.120]  They're dangerous.
[48:39.120 --> 48:42.120]  but Joe Rogan just sort of steamrolled the whole thing.


[25:02.120 --> 25:04.120]  They actually have to plan accordingly.
[48:42.120 --> 48:46.120]  This is what I wish that had been like, because now we're in a place where the rules,


[25:04.120 --> 25:07.120]  When anybody launches anything into outer space,
[48:46.120 --> 48:48.120]  reality has to hold for a second.


[25:07.120 --> 25:10.120]  they have to figure out the right time to do it
[48:48.120 --> 48:53.120]  Fun fact, Joe Rogan was on Infowars on 9-11.


[25:10.120 --> 25:13.120]  and how to avoid these known objects,
[48:53.120 --> 48:55.120]  As he was spewing his...


[25:13.120 --> 25:16.120]  because one of them could be traveling at such an incredible speed
[48:55.120 --> 48:58.120]  One of the least fun, fun facts I've ever heard.


[25:16.120 --> 25:19.120]  in relation to the ship that you're putting up there
[48:58.120 --> 49:03.120]  As soon as 9-11 happened, he was already spewing conspiracy theories,


[25:19.120 --> 25:20.120]  that it could destroy it.
[49:03.120 --> 49:05.120]  and then he had Joe Rogan on.


[25:20.120 --> 25:22.120]  It could rip right through it.
[49:05.120 --> 49:09.120]  Wait, wait, Joe Rogan was on Alex Jones' Infowars show?


[25:22.120 --> 25:24.120]  So this is a growing issue,
[49:09.120 --> 49:13.120]  Well, that guy literally just dropped lower than I thought he would.


[25:24.120 --> 25:27.120]  and we have another issue that is a problem,
[49:13.120 --> 49:15.120]  That is ridiculous.


[25:27.120 --> 25:31.120]  is that there are objects that are being left in low Earth orbit
[49:15.120 --> 49:21.120]  So I read in the chat, somebody said something about Texas tort law


[25:31.120 --> 25:36.120]  that are big, that are slowly de-orbiting over time,
[49:21.120 --> 49:27.120]  that drops the 45 million down to 750,000.


[25:36.120 --> 25:39.120]  because there's a tiny, tiny, tiny, tiny bit of atmosphere
[49:27.120 --> 49:28.120]  I read that too.


[25:39.120 --> 25:41.120]  in low Earth orbit,
[49:28.120 --> 49:32.120]  From what I saw from the plaintiff's lawyer, he was saying...


[25:41.120 --> 25:44.120]  and that's just enough to slowly take something out of orbit
[49:32.120 --> 49:37.120]  So there was talk about a cap because it was divided into two sets of damages.


[25:44.120 --> 25:46.120]  and bring it back down to Earth.
[49:37.120 --> 49:40.120]  So there were the compensatory damages and the punitive damages.


[25:46.120 --> 25:51.120]  As an example, China had one of their Long March 5B rockets
[49:40.120 --> 49:46.120]  The compensatory damages were 4.5 million, and then the punitive damages were 41 million.


[25:51.120 --> 25:53.120]  bring something up,
[49:46.120 --> 49:50.120]  And while we were waiting to hear what the punitive damages were,


[25:53.120 --> 25:56.120]  and a week later, when it came out of orbit,
[49:50.120 --> 49:53.120]  people were talking about a cap because it had to be a certain multiple


[25:56.120 --> 25:58.120]  because it was only up for a week,
[49:53.120 --> 49:55.120]  of the compensatory damages.


[25:58.120 --> 26:01.120]  and by that time there was enough inertia and everything
[49:55.120 --> 50:00.120]  But from the statement that the plaintiff's lawyer gave afterwards,


[26:01.120 --> 26:03.120]  to get it back down into the atmosphere,
[50:00.120 --> 50:03.120]  that was more of a guideline, not a hard cap.


[26:03.120 --> 26:07.120]  pieces of it landed in Malaysia and Indonesia,
[50:03.120 --> 50:05.120]  More of a guideline.


[26:07.120 --> 26:10.120]  and it landed right near a village where people were living.
[50:05.120 --> 50:07.120]  I'm just going based on his statement.


[26:10.120 --> 26:12.120]  It is a real threat,
[50:07.120 --> 50:10.120]  I don't know anything about Texas law, not a lawyer.


[26:12.120 --> 26:15.120]  and we're not talking about millions of people getting hurt,
[50:10.120 --> 50:13.120]  But that was what I heard about that.


[26:15.120 --> 26:16.120]  but it could kill people.
[50:13.120 --> 50:18.120]  I was hoping to see them literally dismantle him and his company.


[26:16.120 --> 26:19.120]  It could kill handfuls of people now and again,
[50:18.120 --> 50:21.120]  Why wouldn't this guy see prison time?


[26:19.120 --> 26:21.120]  which is something that we definitely want to avoid.
[50:21.120 --> 50:24.120]  It's a civil case, you don't get prison.


[26:21.120 --> 26:23.120]  It's also just not good practice.
[50:24.120 --> 50:30.120]  I understand that, but it doesn't mean that he can't be put in prison legitimately.


[26:23.120 --> 26:25.120]  It's not keeping your shop clean.
[50:30.120 --> 50:32.120]  He did perjure himself.


[26:25.120 --> 26:28.120]  So getting back to the Long March 5B rocket,
[50:32.120 --> 50:34.120]  That would be a whole other story.


[26:28.120 --> 26:30.120]  now this rocket is huge.
[50:34.120 --> 50:37.120]  That would be something emerging from the trial itself.


[26:30.120 --> 26:32.120]  China launched it on July 24th,
[50:37.120 --> 50:43.120]  But it's hard to bring criminal charges against somebody for what they're saying


[26:32.120 --> 26:35.120]  and they were bringing up a new space station module
[50:43.120 --> 50:46.120]  in a public forum because of free speech laws, etc.


[26:35.120 --> 26:39.120]  to their Tiangong space station, which is a China-only space station.
[50:46.120 --> 50:48.120]  But civil is different.


[26:39.120 --> 26:42.120]  It's actually pretty cool, you should read up on it.
[50:48.120 --> 50:53.120]  Holding people liable for the damage that they knowingly and maliciously caused,


[26:42.120 --> 26:46.120]  Now this rocket is not designed to de-orbit itself.
[50:53.120 --> 50:55.120]  the law allows for that.


[26:46.120 --> 26:48.120]  They don't send it up with the ability to do that,
[50:55.120 --> 50:59.120]  One more thing I did want to bring up is, in my opinion,


[26:48.120 --> 26:52.120]  and in fact, the engines can't even restart after the engines are shut off.
[50:59.120 --> 51:01.120]  one of the best witnesses that they had.


[26:52.120 --> 26:55.120]  When it does its main push and gets all that weight up
[51:01.120 --> 51:06.120]  Her name is Becca Lewis and she does research in misinformation and disinformation


[26:55.120 --> 26:57.120]  to the altitude that they need it to,
[51:06.120 --> 51:08.120]  and how it spreads.


[26:57.120 --> 26:59.120]  and those engines shut off, they can't go back on.
[51:08.120 --> 51:11.120]  They had her on as an expert witness about misinformation.


[26:59.120 --> 27:03.120]  This ultimately means that there's no way for China
[51:11.120 --> 51:15.120]  She talked about how and why it spreads faster than the truth


[27:03.120 --> 27:07.120]  to control the de-orbiting of this massive rocket.
[51:15.120 --> 51:20.120]  since it feeds into people's world views, the confirmation bias.


[27:07.120 --> 27:10.120]  It's just going to fly back into the Earth's atmosphere,
[51:20.120 --> 51:24.120]  The things that confirm their existing world views are going to circulate,


[27:10.120 --> 27:13.120]  and I'm not even sure that they know where it's going to end up going.
[51:24.120 --> 51:27.120]  especially once you start to have echo chambers like Infowars'.


[27:13.120 --> 27:16.120]  I don't even know if there's good physics
[51:27.120 --> 51:31.120]  Also, Alex Jones platformed other conspiracy theorists.


[27:16.120 --> 27:19.120]  that will really accurately predict where something willy-nilly
[51:31.120 --> 51:35.120]  There was one that she talked about whose content only had three views


[27:19.120 --> 27:23.120]  is de-orbiting at some point and coming back into the atmosphere.
[51:35.120 --> 51:38.120]  before Alex Jones started promoting it.


[27:23.120 --> 27:27.120]  It could end up anywhere, which is the scary part.
[51:38.120 --> 51:40.120]  It was something that nobody was going to see.


[27:27.120 --> 27:31.120]  Believe me, I feel completely happy and thrilled and lucky
[51:40.120 --> 51:43.120]  But because of his platform, a lot of people saw it.


[27:31.120 --> 27:34.120]  that we're alive during a time when space exploration
[51:43.120 --> 51:49.120]  Now we have 24% of the country who questions this main narrative.


[27:34.120 --> 27:36.120]  is starting to explode again.
[51:49.120 --> 51:51.120]  That was a lot of what the trial was about.


[27:36.120 --> 27:37.120]  It's a great time.
[51:51.120 --> 51:53.120]  He would claim, oh, I was just asking questions.


[27:37.120 --> 27:38.120]  Hopefully explode.
[51:53.120 --> 51:56.120]  I was just having these people on to get their opinion.


[27:38.120 --> 27:40.120]  Yeah, you're right.
[51:56.120 --> 51:58.120]  Oh, my guest said it, but I didn't say it.


[27:40.120 --> 27:44.120]  When all of these nations are launching new projects,
[51:58.120 --> 52:02.120]  But he provided that platform for them to get their views out.


[27:44.120 --> 27:46.120]  how's that? Is that better?
[52:02.120 --> 52:06.120]  I think the most interesting thing she talked about was this idea


[27:46.120 --> 27:47.120]  Better.
[52:06.120 --> 52:09.120]  of three degrees of Alex Jones.


[27:47.120 --> 27:52.120]  What we don't have right now are proper rules of etiquette.
[52:09.120 --> 52:13.120]  She said that you basically can't do misinformation research


[27:52.120 --> 27:55.120]  There are things that people would like.
[52:13.120 --> 52:16.120]  without encountering Infowars and Alex Jones.


[27:55.120 --> 27:59.120]  NASA is making it known what information that they would like,
[52:16.120 --> 52:22.120]  The common rule is that you're never more than three recommendations away


[27:59.120 --> 28:03.120]  but in this instance, China didn't share any of the information
[52:22.120 --> 52:26.120]  from Alex Jones or Infowars videos.


[28:03.120 --> 28:06.120]  about what trajectory their rocket was on
[52:26.120 --> 52:27.120]  Wow.


[28:06.120 --> 28:10.120]  and where they think it'll end up coming back into the atmosphere.
[52:27.120 --> 52:29.120]  Ouch.


[28:10.120 --> 28:13.120]  The NASA administrator, by the name of Bill Nelson,
[52:29.120 --> 52:34.120]  The way to restate that is you can't be more full of shit than Alex Jones.


[28:13.120 --> 28:15.120]  he said, and I'm quoting him,
[52:34.120 --> 52:36.120]  Yeah, basically.


[28:15.120 --> 28:18.120]  All spacefaring nations should follow established best practices
[52:36.120 --> 52:41.120]  Jones' lawyer was trying to trip her up, and he was trying to use


[28:18.120 --> 28:22.120]  and do their part to share this type of information in advance
[52:41.120 --> 52:44.120]  all of the things that a scientist or a skeptic would use.


[28:22.120 --> 28:25.120]  to allow reliable predictions of potential debris impact risk,
[52:44.120 --> 52:48.120]  He's talking about sample size and bias and things like that


[28:25.120 --> 28:29.120]  especially for heavy-lift vehicles like the Long March 5B,
[52:48.120 --> 52:51.120]  because in any paper at the end, they're going to talk about


[28:29.120 --> 28:33.120]  which carry a significant risk of loss of life and property.
[52:51.120 --> 52:54.120]  all of the limitations and say, like, this is a potential limitation.


[28:33.120 --> 28:36.120]  Doing so is critical to the responsible use of space
[52:54.120 --> 52:57.120]  This is a potential source of bias, but we tried to account for it


[28:36.120 --> 28:39.120]  and to ensure the safety of people here on Earth.
[52:57.120 --> 52:59.120]  as best we could.


[28:39.120 --> 28:42.120]  I wish that I could have found some information on what would have happened
[52:59.120 --> 53:02.120]  But she's a researcher, so she knew it a lot better than he did.


[28:42.120 --> 28:48.120]  if one of these pieces of larger debris ended up barreling into a city.
[53:02.120 --> 53:06.120]  So she'd stop and she'd be like, no, this is what that means.


[28:48.120 --> 28:50.120]  Could it take a part of a building out?
[53:06.120 --> 53:08.120]  You have no idea what you're talking about.


[28:50.120 --> 28:53.120]  What's its velocity? How much mass does it have?
[53:08.120 --> 53:10.120]  Oh, that's great.


[28:53.120 --> 28:56.120]  I do know that SpaceX had a module,
[53:10.120 --> 53:13.120]  Yeah, and he tried to say that she hated Alex Jones and things like that,


[28:56.120 --> 29:00.120]  a piece of debris come back down as recently as July 9th.
[53:13.120 --> 53:17.120]  and that would bias her, and she didn't know who Alex Jones was


[29:00.120 --> 29:03.120]  Now, if you look at a picture of the Crew-1 module,
[53:17.120 --> 53:19.120]  before she started researching this.


[29:03.120 --> 29:06.120]  there is a component that's right underneath it
[53:19.120 --> 53:21.120]  And she just goes, yes, that's correct.


[29:06.120 --> 29:09.120]  that is used to relay electricity to the module and all that,
[53:21.120 --> 53:25.120]  Like, when he'd present something, she'd say, yes, that's correct,


[29:09.120 --> 29:11.120]  but it's also a cargo hold, right?
[53:25.120 --> 53:27.120]  and it's based on hundreds of hours of research.


[29:11.120 --> 29:13.120]  A cargo hold that's not pressurized.
[53:27.120 --> 53:29.120]  It's not just her opinion.


[29:13.120 --> 29:17.120]  This thing is about 3 meters long and it weighs 4 metric tons.
[53:29.120 --> 53:32.120]  And so he kept trying to trip her up, and the best part was


[29:17.120 --> 29:22.120]  That's an incredibly heavy object that hit the Earth at one point.
[53:32.120 --> 53:37.120]  he was asking her questions and said, the poll that found


[29:22.120 --> 29:26.120]  It came back down on July 9th and it took a year for it to deorbit.
[53:37.120 --> 53:42.120]  24% questioned Sandy Hook, that it was under 1,000 sample size


[29:26.120 --> 29:29.120]  So that's just another thing that needs to be tracked.
[53:42.120 --> 53:45.120]  and was trying to discredit it that way.


[29:29.120 --> 29:32.120]  It could take time for them to come back down
[53:45.120 --> 53:47.120]  And she's like, you can have statistical significance


[29:32.120 --> 29:34.120]  and then we have to try to figure out where they're going to go.
[53:47.120 --> 53:50.120]  with less than 1,000 sample size, like trying to explain that.


[29:34.120 --> 29:36.120]  But okay, let's say we know where it's going to go.
[53:50.120 --> 53:55.120]  And then the plaintiff's lawyer comes up and hands her the actual study


[29:36.120 --> 29:40.120]  So what? What if it's going to hit a major city somewhere?
[53:55.120 --> 54:00.120]  and the Jones lawyer was full of shit because it was over 1,000.


[29:40.120 --> 29:41.120]  What are we going to do about it?
[54:00.120 --> 54:02.120]  So it wasn't even that, yeah.


[29:41.120 --> 29:43.120]  The answer is there's nothing.
[54:02.120 --> 54:04.120]  Even the lawyer is full of BS.


[29:43.120 --> 29:44.120]  There's nothing we can do about it.
[54:04.120 --> 54:10.120]  We're really seeing this trend here with these crazy lawsuits.


[29:44.120 --> 29:48.120]  We're going to shoot rockets up to take out rockets that are coming.
[54:10.120 --> 54:13.120]  How do you defend Alex Jones legitimately?


[29:48.120 --> 29:49.120]  The whole thing is crazy.
[54:13.120 --> 54:15.120]  How do you do it?


[29:49.120 --> 29:54.120]  So what we need to do is we need to have these rules of etiquette
[54:15.120 --> 54:18.120]  You literally have to try to slip through some cracks.


[29:54.120 --> 29:58.120]  where space agencies start to send up more fuel,
[54:18.120 --> 54:22.120]  Well, but you also don't have to defend him and say he's innocent.


[29:58.120 --> 30:01.120]  have rocket engines that can deorbit themselves
[54:22.120 --> 54:24.120]  I mean, I know innocent and guilty isn't what's happening here


[30:01.120 --> 30:04.120]  and not only have one turn-on cycle.
[54:24.120 --> 54:26.120]  because it's a civil case, but you don't have to say,


[30:04.120 --> 30:09.120]  These pretty costly and probably very expensive engineering feats
[54:26.120 --> 54:28.120]  oh, no, he didn't defame people.


[30:09.120 --> 30:11.120]  that need to become a part of all of these projects.
[54:28.120 --> 54:33.120]  You can just try to mitigate the damage in an ethical way.


[30:11.120 --> 30:13.120]  And that's what NASA wants.
[54:33.120 --> 54:37.120]  If a lawyer can give a defense they don't personally believe,


[30:13.120 --> 30:15.120]  But right now...
[54:37.120 --> 54:39.120]  they don't have to believe it.


[30:15.120 --> 30:17.120]  Just to make sure that the point is crystal clear,
[54:39.120 --> 54:42.120]  The ethics of law does not require that.


[30:17.120 --> 30:21.120]  it's to control the deorbit so that we know where it comes down.
[54:42.120 --> 54:46.120]  It just has to be a legally responsible and viable argument.


[30:21.120 --> 30:26.120]  We dump it in the middle of the Pacific so it doesn't hit Australia or whatever.
[54:46.120 --> 54:50.120]  Their personal belief is actually not relevant to it.


[30:26.120 --> 30:27.120]  Exactly, yeah.
[54:50.120 --> 54:54.120]  So as long as they are mounting an ethical defense, it's fine.


[30:27.120 --> 30:30.120]  So right now there's a couple of companies that are starting to,
[54:54.120 --> 54:58.120]  But it's certainly reasonable to think that there isn't an ethical defense


[30:30.120 --> 30:33.120]  or space agencies that are starting to comply
[54:58.120 --> 55:07.120]  of somebody like Alex Jones because it seems so obvious that he's guilty.


[30:33.120 --> 30:37.120]  and build in this functionality into the new rockets that they're building.
[55:07.120 --> 55:11.120]  But again, the law is based upon the notion that everybody deserves a defense.


[30:37.120 --> 30:41.120]  But let's face it, it's not a global thing.
[55:11.120 --> 55:15.120]  But that doesn't mean that lawyers can do unethical things on the stand.


[30:41.120 --> 30:43.120]  A lot of people aren't doing that.
[55:15.120 --> 55:18.120]  It also is why I think that might speak to the quality of the lawyers


[30:43.120 --> 30:46.120]  Some good things that we have are like SpaceX,
[55:18.120 --> 55:22.120]  because, again, the high-quality lawyers, Jones clearly has the money.


[30:46.120 --> 30:50.120]  which is leading the pack on this whole idea of reusability.
[55:22.120 --> 55:25.120]  He could pay some high-priced law firm to defend him.


[30:50.120 --> 30:51.120]  That's fantastic.
[55:25.120 --> 55:28.120]  They probably don't want their reputation sullied with this.


[30:51.120 --> 30:52.120]  You want to reuse your rockets.
[55:28.120 --> 55:29.120]  They don't want to go anywhere near it.


[30:52.120 --> 30:54.120]  You want your retro rockets to land themselves.
[55:29.120 --> 55:31.120]  Nobody wants to be the guy who defended Alex Jones.


[30:54.120 --> 30:55.120]  You see it all the time.
[55:31.120 --> 55:32.120]  Right.


[30:55.120 --> 30:56.120]  That's great.
[55:32.120 --> 55:34.120]  Do we have any idea how much money, like what his net worth is?


[30:56.120 --> 30:59.120]  More reusability that we build into things means more control,
[55:34.120 --> 55:36.120]  Like how ruinous is $41 million, $45 million?


[30:59.120 --> 31:02.120]  more ability to bring things down safely,
[55:36.120 --> 55:38.120]  They were desperately trying to figure that out.


[31:02.120 --> 31:05.120]  which is exactly what everybody needs to be doing.
[55:38.120 --> 55:42.120]  So officially, I'm sorry if you didn't notice, but officially it's $200,000


[31:05.120 --> 31:08.120]  One, we don't want to pollute low Earth orbit any worse than it is.
[55:42.120 --> 55:46.120]  that his enterprise makes $200,000 a day.


[31:08.120 --> 31:10.120]  If anything, we want to get that stuff out of there,
[55:46.120 --> 55:48.120]  But $200,000 a day.


[31:10.120 --> 31:16.120]  which no one has come up with a feasible economic way to do it yet.
[55:48.120 --> 55:50.120]  Is that net?


[31:16.120 --> 31:18.120]  But I imagine at some point in the next 50 years,
[55:50.120 --> 55:52.120]  But that's probably an underestimate.


[31:18.120 --> 31:22.120]  someone will come up with something that's making that move.
[55:52.120 --> 55:58.120]  And in the phone records that were revealed, on some days they make up to $800,000.


[31:22.120 --> 31:25.120]  But in the meantime, our goals are no more debris
[55:58.120 --> 55:59.120]  That was their best day.


[31:25.120 --> 31:29.120]  and absolutely no more craziness of things falling out of the sky
[55:59.120 --> 56:01.120]  That was a good day, yeah.


[31:29.120 --> 31:33.120]  without any predictability on where they're going to go or drivability,
[56:01.120 --> 56:03.120]  You guys have got to sell supplements, man.


[31:33.120 --> 31:36.120]  meaning we want them to go to a specific place.
[56:03.120 --> 56:04.120]  This is right.


[31:36.120 --> 31:38.120]  So what do you think about that, Steve?
[56:04.120 --> 56:06.120]  We've got to switch sides.


[31:38.120 --> 31:40.120]  Well, it wasn't too long ago.
[56:06.120 --> 56:09.120]  But they had a really hard time figuring that kind of stuff out


[31:40.120 --> 31:43.120]  It was just a science or fiction item where an estimate was
[56:09.120 --> 56:11.120]  because he didn't turn over all the documents that he was supposed to turn over.


[31:43.120 --> 31:46.120]  that in the next decade, there's actually something like a 10% chance
[56:11.120 --> 56:12.120]  Right, part of the problem.


[31:46.120 --> 31:48.120]  of somebody getting hit by space debris.
[56:12.120 --> 56:15.120]  So they couldn't really get a solid answer on that.


[31:48.120 --> 31:49.120]  Oh, yeah.
[56:15.120 --> 56:16.120]  What kind of bullshit is that?


[31:49.120 --> 31:50.120]  We all thought it was fiction.
[56:16.120 --> 56:17.120]  Okay, so you don't do that.


[31:50.120 --> 31:54.120]  Yeah, it's getting pretty significant now just because of the sheer volume
[56:17.120 --> 56:19.120]  You don't turn over the documents.


[31:54.120 --> 31:56.120]  of stuff that we're putting up there.
[56:19.120 --> 56:25.120]  Like doesn't the law, doesn't the court have the ability to deliver some type of incredible smackdown?


[31:56.120 --> 31:59.120]  So, yeah, it's, again, one of those things that we have to take
[56:25.120 --> 56:27.120]  So that's what they did.


[31:59.120 --> 32:02.120]  a systematic approach to it rather than relying on individuals
[56:27.120 --> 56:29.120]  That was why there was the default judgment.


[32:02.120 --> 32:03.120]  to all do the right thing.
[56:29.120 --> 56:34.120]  And so that's why this was just for damages because they already determined that he was liable


[32:03.120 --> 32:05.120]  How would we figure that out, Steve?
[56:34.120 --> 56:37.120]  for the defamation and for the infliction of emotional distress.


[32:05.120 --> 32:07.120]  Where would we come up with such an approach?
[56:37.120 --> 56:39.120]  I get that they clicked into summary judgment.


[32:07.120 --> 32:09.120]  People aren't just going to automatically do the right thing
[56:39.120 --> 56:41.120]  We see we have some experience with that.


[32:09.120 --> 32:10.120]  of their own volition.
[56:41.120 --> 56:42.120]  Yeah.


[32:10.120 --> 32:11.120]  It's just stunning.
[56:42.120 --> 56:44.120]  But in a good way.


[32:11.120 --> 32:12.120]  I know.
[56:44.120 --> 56:47.120]  Don't you get into legal trouble if you don't hand over?


[32:12.120 --> 32:14.120]  I feel like we're going to have apps where you have, like,
[56:47.120 --> 56:49.120]  Like doesn't he have to now deal with the fact?


[32:14.120 --> 32:16.120]  weather forecast, air pollution, space debris.
[56:49.120 --> 56:53.120]  Well, you could be held in contempt, right, would be the legal remedy there.


[32:16.120 --> 32:17.120]  Space debris, yeah.
[56:53.120 --> 56:58.120]  But just in a case like this, the remedy is you lose.


[32:17.120 --> 32:20.120]  What's the probability of that thing landing in Manhattan today?
[56:58.120 --> 57:03.120]  You now lose the case and now we're going to talk about how much money you have to pay the plaintiff.


[32:20.120 --> 32:21.120]  Take your umbrella.
[57:03.120 --> 57:05.120]  So that was the remedy.


[32:21.120 --> 32:23.120]  Yeah, like a steel umbrella.
[57:05.120 --> 57:11.120]  He was asked, you know, turn over like emails or texts where, you know, you mentioned Sandy Hook.


[32:23.120 --> 32:26.120]  50% chance of rain, 5% chance of...
[57:11.120 --> 57:17.120]  And he said, I did a search on my phone, did not see any text that mentioned Sandy Hook.


[32:26.120 --> 32:28.120]  Low-Earth orbit de-orbiting.
[57:17.120 --> 57:21.120]  So I want to know what did the court or the judge do at that point?


[32:28.120 --> 32:31.120]  Emily Calandrelli, who does a lot of space-related science communication,
[57:21.120 --> 57:26.120]  Because then, of course, afterwards they got two years of text and of course it's all over the place.


[32:31.120 --> 32:34.120]  she was following this one as it was coming down.
[57:26.120 --> 57:28.120]  So he was just flat out lying.


[32:34.120 --> 32:38.120]  And what shocked me about it was we really didn't know where it was
[57:28.120 --> 57:31.120]  But if they didn't get that dump, what recourse would they have had to say?


[32:38.120 --> 32:41.120]  going to be until, like, an hour before, even days before,
[57:31.120 --> 57:32.120]  Yeah, I don't believe you.


[32:41.120 --> 32:45.120]  it was like half of the Earth was in the possible target area.
[57:32.120 --> 57:34.120]  I don't believe your phone doesn't have those.


[32:45.120 --> 32:48.120]  But she did say, at least this one, they thought.
[57:34.120 --> 57:36.120]  They can get the info if they want to.


[32:48.120 --> 32:51.120]  But, again, they didn't really know what exactly it was made of,
[57:36.120 --> 57:38.120]  They can get the info.


[32:51.120 --> 32:53.120]  but it would only take out a house or two.
[57:38.120 --> 57:43.120]  They can appoint somebody to go through the phone and get the information that they want.


[32:53.120 --> 32:54.120]  A house or two.
[57:43.120 --> 57:46.120]  I know like when I had to turn over my emails, I didn't do it.


[32:54.120 --> 32:55.120]  Just a house or two.
[57:46.120 --> 57:52.120]  My lawyer hired an independent person to come in, go through all my emails and find the ones that were relevant.


[32:55.120 --> 32:56.120]  Yeah.
[57:52.120 --> 57:54.120]  My hands were not on it at all.


[32:56.120 --> 33:00.120]  Since you suggested a city, a house was the better alternative.
[57:54.120 --> 57:55.120]  All right.


[33:00.120 --> 33:04.120]  Does space debris zero in on trailer parks like tornadoes do?
[57:55.120 --> 57:57.120]  Anything else you want to add before we move on?


[33:04.120 --> 33:05.120]  Yeah.
[57:57.120 --> 58:00.120]  I will throw a quote out there from the lawyer today.


[33:05.120 --> 33:06.120]  I'm just wondering.
[58:00.120 --> 58:03.120]  So this was just the first of a few cases.


[33:06.120 --> 33:07.120]  And lawn chairs and stuff.
[58:03.120 --> 58:10.120]  And the plaintiff's lawyer said, there's going to be a large set of plaintiffs dividing up the corpse of Infowars.


[33:07.120 --> 33:08.120]  Yeah.
[58:10.120 --> 58:12.120]  And fingers crossed that that actually happens.


[33:08.120 --> 33:10.120]  But there's things to consider, though, because it's not just...
[58:12.120 --> 58:13.120]  Yeah, that would be nice.


[33:10.120 --> 33:12.120]  But could there be explosives in there?
[58:13.120 --> 58:15.120]  Tiny slice of justice in this book.


[33:12.120 --> 33:15.120]  Could there be some leftover rocket fuel fumes?
[58:15.120 --> 58:17.120]  The corpse of Infowars.


[33:15.120 --> 33:18.120]  Or I have no idea, like, what potential explosive...
[58:17.120 --> 58:18.120]  It's a nice sentence.


[33:18.120 --> 33:20.120]  They're probably out of fuel, yeah.
[58:18.120 --> 58:19.120]  Add that to your Halloween display.


[33:20.120 --> 33:21.120]  You'd hope.
[58:19.120 --> 58:21.120]  I would, I would.


[33:21.120 --> 33:22.120] Yeah, you'd hope.
=== Earth Spinning Faster <small>(58:21)</small> ===
* [https://www.forbes.com/sites/jamiecartereurope/2022/07/28/earth-is-suddenly-spinning-faster-why-our-planet-just-recorded-its-shortest-day-since-records-began/amp/ Earth Is Suddenly Spinning Faster. Why Our Planet Just Recorded Its Shortest Day Since Records Began]<ref>[https://www.forbes.com/sites/jamiecartereurope/2022/07/28/earth-is-suddenly-spinning-faster-why-our-planet-just-recorded-its-shortest-day-since-records-began/amp/ Forbes: Earth Is Suddenly Spinning Faster. Why Our Planet Just Recorded Its Shortest Day Since Records Began]</ref>


[33:22.120 --> 33:23.120]  Who knows?
[58:21.120 --> 58:22.120]  All right, Bob.


[33:23.120 --> 33:24.120]  What about waste?
[58:22.120 --> 58:30.120]  I understand that the earth is supposed to be slowing down over the long historical time.


[33:24.120 --> 33:27.120]  What about, like, dangerous gases and things like that?
[58:30.120 --> 58:33.120]  But maybe that's not 100 percent true.


[33:27.120 --> 33:30.120]  Well, when Columbia broke up in 2003
[58:33.120 --> 58:36.120]  Well, you know, I don't want to get everybody concerned.


[33:30.120 --> 33:34.120]  and came down over the American South and Southeast,
[58:36.120 --> 58:43.120]  But the earth is now spinning faster than it ever has before in the age of atomic clocks.


[33:34.120 --> 33:38.120]  there was concern that they didn't know what sort of contamination,
[58:43.120 --> 58:45.120]  I thought I felt something.


[33:38.120 --> 33:41.120]  I think, there was in some of the materials,
[58:45.120 --> 58:51.120]  January 22nd, this past year, January 22nd, no, June 22nd, 2022.


[33:41.120 --> 33:44.120]  that people were finding and picking up, like, you know,
[58:51.120 --> 58:53.120]  The shortest day ever recorded.


[33:44.120 --> 33:46.120]  a piece of a helmet and things.
[58:53.120 --> 58:54.120]  And we're not sure why.


[33:46.120 --> 33:48.120]  They warned people to not go near them.
[58:54.120 --> 58:55.120]  Should we be scared?


[33:48.120 --> 33:49.120]  Yeah.
[58:55.120 --> 58:57.120]  Should we be afraid?


[33:49.120 --> 33:51.120]  So I don't know what sort of danger that...
[58:57.120 --> 58:58.120]  So what's what's going on here?


[33:51.120 --> 33:52.120]  I don't know.
[58:58.120 --> 59:00.120]  You mean the longest day ever recorded?


[33:52.120 --> 33:55.120]  I know it always comes up whenever they're sending up any satellite
[59:00.120 --> 59:01.120]  What did I say?


[33:55.120 --> 33:57.120]  or anything that has a nuclear battery in it.
[59:01.120 --> 59:02.120]  Shortest day.


[33:57.120 --> 34:00.120]  If that thing, you know, blows up or reenters,
[59:02.120 --> 59:03.120]  Shortest day.


[34:00.120 --> 34:03.120]  then we could be dumping nuclear waste.
[59:03.120 --> 59:04.120]  Because the earth is spinning faster.


[34:03.120 --> 34:06.120]  Well, now I'm thinking, you know, Cold War Sputnik stuff, too,
[59:04.120 --> 59:05.120]  Faster, so it's short days, right?


[34:06.120 --> 34:08.120]  where it's like, what if it's not an accident?
[59:05.120 --> 59:06.120]  Yeah, it's getting shorter.


[34:08.120 --> 34:10.120]  Not to be the conspiracy theorist of the group,
[59:06.120 --> 59:07.120]  Yeah, it'd be shorter.


[34:10.120 --> 34:12.120]  but that would be a good way to...
[59:07.120 --> 59:09.120]  So it all starts with a day.


[34:12.120 --> 34:14.120]  Anyway, I'll stop with that one thought.
[59:09.120 --> 59:10.120]  What is a day?


[34:14.120 --> 34:15.120]  All right.
[59:10.120 --> 59:11.120]  Yeah, what's a day?


=== Auditory Pareidolia Again <small>(34:16)</small> ===
[59:11.120 --> 59:12.120]  If you ask anybody, what's a day?
* [https://www.today.com/popculture/green-needle-or-brainstorm-hear-latest-audio-clip-dividing-internet-t188193 'Green needle' or 'brainstorm'? Hear the latest audio clip dividing the internet]<ref>[https://www.today.com/popculture/green-needle-or-brainstorm-hear-latest-audio-clip-dividing-internet-t188193 Today: 'Green needle' or 'brainstorm'? Hear the latest audio clip dividing the internet]</ref>


[34:15.120 --> 34:17.120]  This is actually a couple of years old,
[59:12.120 --> 59:13.120]  24 hours.


[34:17.120 --> 34:19.120]  but it's making the rounds again, and I saw it.
[59:13.120 --> 59:14.120]  24 hours.


[34:19.120 --> 34:21.120]  I don't think we've ever played this on the show.
[59:14.120 --> 59:15.120]  Steve, what is that in metric?


[34:21.120 --> 34:23.120]  I missed it the first time around.
[59:15.120 --> 59:17.120]  Oh, never mind.


[34:23.120 --> 34:25.120]  This video, just listen to the sound.
[59:17.120 --> 59:20.120]  So a mean solar day is 24 hours.


[34:25.120 --> 34:27.120]  You don't have to see the video.
[59:20.120 --> 59:21.120]  That's right.


[34:27.120 --> 34:30.120]  So either think the word brainstorm
[59:21.120 --> 59:22.120]  That's what it is.


[34:30.120 --> 34:33.120]  or think the word green needle.
[59:22.120 --> 59:25.120]  But that's the outermost onion layer.


[34:33.120 --> 34:37.120]  And whatever you think, that's what you will hear.
[59:25.120 --> 59:29.120]  As we say, you get a little deeper and it's never really 24 hours.


[34:37.120 --> 34:41.120]  You don't even need to be caught with the actual words.
[59:29.120 --> 59:30.120]  Exactly.


[34:41.120 --> 34:45.120]  You just have to think it.
[59:30.120 --> 59:31.120]  It's kind of around 24 hours.


[34:45.120 --> 34:47.120]  Isn't that bizarre?
[59:31.120 --> 59:33.120]  It goes a little shorter, a little longer.


[34:47.120 --> 34:48.120]  That's crazy.
[59:33.120 --> 59:35.120]  It's like right around 24 hours.


[34:48.120 --> 34:50.120]  Although I'm hearing the green needle a lot more
[59:35.120 --> 59:38.120]  24 hours is should be the average.


[34:50.120 --> 34:52.120]  than I'm hearing the brainstorm.
[59:38.120 --> 59:43.120]  But it varies because you've got the interior of the earth kind of roiling around.


[34:52.120 --> 34:55.120]  It's either distinctively green needle or not green needle.
[59:43.120 --> 59:45.120]  You've got seismic activity.


[34:55.120 --> 34:58.120]  Yeah, but I could flip both ways at will.
[59:45.120 --> 59:49.120]  You've got the wind, the wind running across the surface of the earth and causing


[34:58.120 --> 35:02.120]  You would think, though, they seem like such different phrases
[59:49.120 --> 59:51.120]  friction, pushing against mountains.


[35:02.120 --> 35:05.120]  phonetically and everything, but it's in there.
[59:51.120 --> 59:56.120]  All those things conspire to make the day, you know, slower and faster than 24 hours.


[35:05.120 --> 35:07.120]  There are things in there that will trick your brain
[59:56.120 --> 01:00:02.120]  But if you look at it over the over many decades, what you find is that the average is


[35:07.120 --> 35:09.120]  for both of those.
[01:00:02.120 --> 01:00:06.120]  about 24 hours and point zero zero one seconds.


[35:09.120 --> 35:10.120]  It's uncanny.
[01:00:06.120 --> 01:00:08.120]  So somebody asks you, how long is a day?


[35:10.120 --> 35:12.120]  It's not even the same number of syllables,
[01:00:08.120 --> 01:00:12.120]  You say 24 hours and point zero zero one seconds, because that would be more accurate,


[35:12.120 --> 35:15.120]  which is surprising to me that it still works, right?
[01:00:12.120 --> 01:00:13.120]  a little bit more accurate.


[35:15.120 --> 35:16.120]  Yeah, it's one extra syllable.
[01:00:13.120 --> 01:00:17.120]  But the problem here is that we have two ways to tell time.


[35:16.120 --> 35:17.120]  Two versus three.
[01:00:17.120 --> 01:00:19.120]  Really, we have atomic time, which is extremely accurate.


[35:17.120 --> 35:20.120]  I think the distortion itself must be a critical component
[01:00:19.120 --> 01:00:20.120]  And here's solar time.


[35:20.120 --> 35:23.120]  of the ability to switch between it from one to the other, perhaps.
[01:00:20.120 --> 01:00:25.120]  And every day, if the earth is a little bit slower, a little bit faster, it notches up


[35:23.120 --> 35:26.120]  Otherwise, why make it sound so distorted?
[01:00:25.120 --> 01:00:27.120]  and it diverges from atomic time.


[35:26.120 --> 35:30.120]  I believe it also works with brain needle and green storm as well.
[01:00:27.120 --> 01:00:32.120]  And after a while, you can't get beyond this, which is about, I don't know, 10 seconds.


[35:30.120 --> 35:32.120]  If you try it.
[01:00:32.120 --> 01:00:34.120]  They don't want to get beyond that, whatever that is.


[35:32.120 --> 35:34.120]  I have to stumble upon this.
[01:00:34.120 --> 01:00:36.120]  So they throw in a leap second.


[35:34.120 --> 35:42.120]  It's one of the more dramatic examples of auditory pareidolia.
[01:00:36.120 --> 01:00:37.120]  That's what a leap second is.


[35:42.120 --> 35:45.120]  This happens in a lot of our sensory streams,
[01:00:37.120 --> 01:00:41.120]  A leap second isn't because, oh, the earth is slowing and slowing and slowing and we


[35:45.120 --> 35:48.120]  but it happens a lot with language.
[01:00:41.120 --> 01:00:42.120]  need to throw in a second.


[35:48.120 --> 35:52.120]  Our sensory streams are wired to make the closest fit
[01:00:42.120 --> 01:00:46.120]  It's because of that divergence between atomic time and solar time.


[35:52.120 --> 35:55.120]  to phonemes that you know.
[01:00:46.120 --> 01:00:47.120]  That's what a leap second is.


[35:55.120 --> 35:59.120]  It's constantly trying to make that fit between speech sound
[01:00:47.120 --> 01:00:51.120]  So why is there this general average slowing of the earth?


[35:59.120 --> 36:02.120]  and words that you know.
[01:00:51.120 --> 01:00:52.120]  There's a bunch of reasons.


[36:02.120 --> 36:05.120]  That's why you can misunderstand lyrics all the time
[01:00:52.120 --> 01:00:55.120]  The main and most fascinating one for me is tidal braking.


[36:05.120 --> 36:06.120]  and misunderstand what people say.
[01:00:55.120 --> 01:00:59.120]  It's because of that damn moon, the moon is doing it.


[36:06.120 --> 36:07.120]  It sounds like something close to it.
[01:00:59.120 --> 01:01:01.120]  The tides, it's happening because of the tides.


[36:07.120 --> 36:11.120]  This is just demonstrating that in a very dramatic way.
[01:01:01.120 --> 01:01:04.120]  So stealing our angular momentum.


[36:11.120 --> 36:14.120]  It's amazing how well the priming works.
[01:01:04.120 --> 01:01:05.120]  Exactly.


[36:14.120 --> 36:18.120]  When Rob brought up the distortion, it reminded me of,
[01:01:05.120 --> 01:01:06.120]  Exactly.


[36:18.120 --> 36:22.120]  we talked about it on SGU, the doll that would talk.
[01:01:06.120 --> 01:01:10.120]  Because of the way the earth is rotating and the bulges created by the tides,


[36:22.120 --> 36:23.120]  Pull-string dolls.
[01:01:10.120 --> 01:01:14.120]  the moon is pulling on that bulge, which actually causes friction on the earth,


[36:23.120 --> 36:24.120]  It has a recording.
[01:01:14.120 --> 01:01:17.120]  which slows the earth, making our days longer.


[36:24.120 --> 36:27.120]  It's a voice, but it's a crackly kind of voice.
[01:01:17.120 --> 01:01:21.120]  And the moon is stealing our rotational energy, our angular momentum, because that's got to


[36:27.120 --> 36:29.120]  It has a bit of distortion to it.
[01:01:21.120 --> 01:01:22.120]  be conserved.


[36:29.120 --> 36:32.120]  People think they're hearing things that the doll is saying
[01:01:22.120 --> 01:01:25.120]  And that's going into a higher orbit and getting farther and farther and farther away.


[36:32.120 --> 36:34.120]  that it really isn't programmed to say,
[01:01:25.120 --> 01:01:30.120]  And eventually, if the solar system lasts long enough, which it won't, it will get so


[36:34.120 --> 36:38.120]  but they can't distinguish what it was programmed to say.
[01:01:30.120 --> 01:01:33.120]  far away that we'll be facing each other.


[36:38.120 --> 36:42.120]  They're thinking what they think it's saying instead.
[01:01:33.120 --> 01:01:36.120]  The Earth and the moon will be facing each other, will be tidally locked like the moon


[36:42.120 --> 36:45.120]  We've come across this before in other mediums.
[01:01:36.120 --> 01:01:37.120]  is to us right now.


[36:45.120 --> 36:48.120]  Is this behind those Disney conspiracies too,
[01:01:37.120 --> 01:01:41.120]  So that's just the interesting aside of why the earth is slowing.


[36:48.120 --> 36:49.120]  where they're like,
[01:01:41.120 --> 01:01:46.120]  When and if that ever happens, does that mean that one side of the earth would be getting


[36:49.120 --> 36:52.120]  there are secret horrible messages in various cartoons?
[01:01:46.120 --> 01:01:48.120]  sun and the other side will not be getting sun?


[36:52.120 --> 36:54.120]  Islam is the light, that was one of the dolls that had it,
[01:01:48.120 --> 01:01:50.120]  No, it's all about the orientation of the earth and moon.


[36:54.120 --> 36:57.120]  but that's not really what the doll was saying,
[01:01:50.120 --> 01:01:51.120]  Right.


[36:57.120 --> 37:03.120]  but it spread virally and that's what everyone started to hear.
[01:01:51.120 --> 01:01:52.120]  It's not tidally locked to the sun.


[37:03.120 --> 37:05.120]  That's what they heard it saying because it was suggested that that's what it was saying.
[01:01:52.120 --> 01:01:53.120]  It's tidally locked to the moon.


[37:05.120 --> 37:07.120]  The backward masking on records.
[01:01:53.120 --> 01:01:54.120]  Right.


[37:07.120 --> 37:09.120]  I was just going to say that.
[01:01:54.120 --> 01:01:55.120]  Now, if we were like...


[37:09.120 --> 37:12.120]  I've listened to Stairway to Heaven backwards.
[01:01:55.120 --> 01:01:57.120]  Would the whole thing rotate, basically?


[37:12.120 --> 37:19.120]  I really hear a lot of stuff in there that has a demonic connotation.
[01:01:57.120 --> 01:01:58.120]  Yes.


[37:19.120 --> 37:21.120]  The words that they're saying.
[01:01:58.120 --> 01:02:00.120]  We would always be facing each other.


[37:21.120 --> 37:25.120]  It's probably because I've been priming myself since I was a teenager.
[01:02:00.120 --> 01:02:01.120]  Our orbit would be like this.


[37:25.120 --> 37:27.120]  When I hear that, every once in a while I'll listen to it
[01:02:01.120 --> 01:02:04.120]  Instead of the moon is locked now and the earth is rotating.


[37:27.120 --> 37:29.120]  because it's actually kind of interesting.
[01:02:04.120 --> 01:02:07.120]  So some side of the earth will see the moon always and the other side will never see


[37:29.120 --> 37:33.120]  I'm hearing, here's to my sweet Satan and all that stuff.
[01:02:07.120 --> 01:02:08.120]  the moon.


[37:33.120 --> 37:35.120]  It seems very clear to me.
[01:02:08.120 --> 01:02:11.120]  But that wouldn't happen because we're going to burn up before we get to that point, I


[37:35.120 --> 37:40.120]  Again, your brain is trying to make sense out of chaos.
[01:02:11.120 --> 01:02:12.120]  believe.


[37:40.120 --> 37:45.120]  Sometimes your brain concocts something that isn't actually there.
[01:02:12.120 --> 01:02:13.120]  Oh, thank you.


[37:45.120 --> 37:47.120]  It's kind of like the dress.
[01:02:13.120 --> 01:02:14.120]  Perfect.


[37:47.120 --> 37:49.120]  I was just thinking about the dress.
[01:02:14.120 --> 01:02:16.120]  But there are planets that have been tidally locked to their sun because they're very big


[37:49.120 --> 37:51.120]  Or Laurel and Yanni.
[01:02:16.120 --> 01:02:19.120]  and they're very close to their parent star.


[37:51.120 --> 37:54.120]  Yeah, Laurel and Yanni.
[01:02:19.120 --> 01:02:22.120]  So the tidal forces are strong enough to tidally lock that.


[37:54.120 --> 37:56.120]  The internet will spit out more of these things.
[01:02:22.120 --> 01:02:28.120]  But 2020, 2021, and 2022 were a little bit different.


[37:56.120 --> 37:58.120]  We'll share them with you.
[01:02:28.120 --> 01:02:35.120]  And it wasn't just because of that damn pandemic because these were the shortest days ever


[37:58.120 --> 38:00.120]  This was a particularly impressive one.
[01:02:35.120 --> 01:02:36.120]  recorded.


[38:00.120 --> 38:02.120]  Everyone, we're going to take a quick break from our show
[01:02:36.120 --> 01:02:41.120]  2020 had 28 of the shortest days ever recorded since 1960.


[38:02.120 --> 38:04.120]  to talk about our sponsor this week, BetterHelp.
[01:02:41.120 --> 01:02:42.120]  What?


[38:04.120 --> 38:07.120]  Guys, we have to take care of not just our physical health,
[01:02:42.120 --> 01:02:43.120]  28 days.


[38:07.120 --> 38:09.120]  but also our mental health.
[01:02:43.120 --> 01:02:44.120]  Why?


[38:09.120 --> 38:12.120]  There's lots of options available to us now.
[01:02:44.120 --> 01:02:49.120]  2021 also had a plethora of very, very short days.


[38:12.120 --> 38:13.120]  BetterHelp is one of them.
[01:02:49.120 --> 01:02:53.120]  No dramatic records were broken in 2021, but they were still very, very short.


[38:13.120 --> 38:16.120]  BetterHelp offers online therapy.
[01:02:53.120 --> 01:02:59.120]  Oh, and 2020, I think we all can agree that if the days in 2020 were shorter, that's a


[38:16.120 --> 38:17.120]  I'll tell you something.
[01:02:59.120 --> 01:03:03.120]  good thing because that year needed to be shorter than it was.


[38:17.120 --> 38:19.120]  I personally do online therapy.
[01:03:03.120 --> 01:03:05.120]  Literally the only good thing is this.


[38:19.120 --> 38:25.120]  I've been meeting with my doctor for the past six months every week.
[01:03:05.120 --> 01:03:06.120]  Right.


[38:25.120 --> 38:28.120]  I've been dealing with anxiety and depression my entire adult life.
[01:03:06.120 --> 01:03:07.120]  So 2022, we're not even done with it.


[38:28.120 --> 38:32.120]  Therapy is one of the biggest things that helps me deal with it.
[01:03:07.120 --> 01:03:09.120]  We've already broken some good records.


[38:32.120 --> 38:34.120]  I really think that you should consider it.
[01:03:09.120 --> 01:03:14.120]  June 22nd was 1.59 milliseconds shorter than 24 hours.


[38:34.120 --> 38:37.120]  If you're suffering, if you're having anything that's bothering you
[01:03:14.120 --> 01:03:15.120]  Holy shit.


[38:37.120 --> 38:40.120]  that you seem to not be able to get over,
[01:03:15.120 --> 01:03:16.120]  The shortest day.


[38:40.120 --> 38:43.120]  you really should think about talking to someone to get help.
[01:03:16.120 --> 01:03:17.120]  Is that a lot?


[38:43.120 --> 38:44.120]  You're right, Jay.
[01:03:17.120 --> 01:03:23.120]  It's not an absolute a lot, but relative to history, it is a lot.


[38:44.120 --> 38:49.120]  BetterHelp is not only online, but it offers a lot of different options.
[01:03:23.120 --> 01:03:24.120]  1.59 milliseconds.


[38:49.120 --> 38:52.120]  We're talking video, phone, even live chat only.
[01:03:24.120 --> 01:03:27.120]  The short day, shortest day ever recorded, ever recorded.


[38:52.120 --> 38:57.120]  You don't have to see someone on camera if you're not in the place to do that.
[01:03:27.120 --> 01:03:31.120]  And then in July, we had a day that was the second shortest.


[38:57.120 --> 39:02.120]  It's also affordable, and you can be matched with a therapist in under 48 hours.
[01:03:31.120 --> 01:03:33.120]  So something's happening.


[39:02.120 --> 39:07.120]  Our listeners get 10% off their first month at BetterHelp.com.
[01:03:33.120 --> 01:03:40.120]  So why do we have three years where the average day was less than 24 hours when over the past


[39:07.120 --> 39:11.120]  That's BetterHELP.com.
[01:03:40.120 --> 01:03:46.120]  30, 40, 50, 60, 70 years, the average day has been a little bit longer than 24 hours?


[39:11.120 --> 39:14.120]  All right, guys, let's get back to the show.
[01:03:46.120 --> 01:03:47.120]  Why?


=== The Alex Jones Saga <small>(39:15)</small> ===
[01:03:47.120 --> 01:03:48.120] What's going on?
* [https://www.reuters.com/business/media-telecom/jury-alex-jones-defamation-case-begin-deliberations-punitive-damages-2022-08-05/ Jury awards $45.2 million in punitive damages in Alex Jones Sandy Hook trial]<ref>[https://www.reuters.com/business/media-telecom/jury-alex-jones-defamation-case-begin-deliberations-punitive-damages-2022-08-05/ Reuters: Jury awards $45.2 million in punitive damages in Alex Jones Sandy Hook trial]</ref>


[39:14.120 --> 39:15.120]  All right.
[01:03:48.120 --> 01:03:49.120]  Well, we're not sure.


[39:15.120 --> 39:21.120]  One thing that we can agree on, that is that Alex Jones is a giant douchebag.
[01:03:49.120 --> 01:03:52.120]  We're not sure exactly, but there's lots, of course, there's lots of scientists and


[39:21.120 --> 39:25.120]  You don't have my permission to use that photo.
[01:03:52.120 --> 01:03:53.120]  their theories.


[39:25.120 --> 39:28.120]  I'm going to get your internet permission to not use that photo.
[01:03:53.120 --> 01:03:55.120]  They have got lots of ideas of why.


[39:28.120 --> 39:29.120]  Buy my vitamins.
[01:03:55.120 --> 01:04:00.120]  One idea is that glaciers are melting and basically the poles don't have as much mass


[39:29.120 --> 39:31.120]  I have a worse photo.
[01:04:00.120 --> 01:04:03.120]  or weight by them as they used to.


[39:31.120 --> 39:37.120]  All right, Kelly, give us an update on the Alex Jones saga.
[01:04:03.120 --> 01:04:04.120]  That's one idea that may be contributing.


[39:37.120 --> 39:41.120]  Yes, so I, like the insane person I am,
[01:04:04.120 --> 01:04:06.120]  So is that like a skater pulling in their arms?


[39:41.120 --> 39:44.120]  have kind of had this on in the background for the last two weeks,
[01:04:06.120 --> 01:04:07.120]  Right.


[39:44.120 --> 39:48.120]  and I was very glad to have an opportunity to put that to use.
[01:04:07.120 --> 01:04:08.120]  Yes.


[39:48.120 --> 39:51.120]  But in Steve fashion, I'm going to start with a question.
[01:04:08.120 --> 01:04:10.120]  Distribution of mass, like the skater pulling in the arms to go faster.


[39:51.120 --> 39:58.120]  So what percentage of Americans do you guys think question the Sandy Hook shooting?
[01:04:10.120 --> 01:04:12.120]  That's definitely related.


[39:58.120 --> 40:00.120]  20%.
[01:04:12.120 --> 01:04:16.120]  And related to that, Steve, is also another idea of why we're getting the speed up is


[40:00.120 --> 40:01.120]  10%.
[01:04:16.120 --> 01:04:22.120]  the different movements of the molten core of the planet that could address that and


[40:01.120 --> 40:02.120]  Question it?
[01:04:22.120 --> 01:04:23.120]  speed up the Earth.


[40:02.120 --> 40:04.120]  Probably I would say like 22%.
[01:04:23.120 --> 01:04:27.120]  Seismic activity is another option that they throw out.


[40:04.120 --> 40:06.120]  22.1%.
[01:04:27.120 --> 01:04:32.120]  My theory is that it's the sheer mass of meatballs at Jay's house that is kind of screwing with


[40:06.120 --> 40:07.120]  25%.
[01:04:32.120 --> 01:04:33.120]  our rotation.


[40:07.120 --> 40:08.120]  Wow.
[01:04:33.120 --> 01:04:34.120]  I would do it.


[40:08.120 --> 40:10.120]  It depends on whether we're doing Price is Right rules or not,
[01:04:34.120 --> 01:04:35.120]  Jay, I'm telling you, man.


[40:10.120 --> 40:13.120]  but I don't think we are because I didn't say it, so Andrea wins.
[01:04:35.120 --> 01:04:37.120]  I've got two scientists that agree with that with me.


[40:13.120 --> 40:15.120]  Oh, it's that high?
[01:04:37.120 --> 01:04:43.120]  But a lot of scientists will also throw out there the Chandler wobble as one potential


[40:15.120 --> 40:16.120]  There we go.
[01:04:43.120 --> 01:04:44.120]  reason why the Earth is speeding up.


[40:16.120 --> 40:18.120]  That's horrible.
[01:04:44.120 --> 01:04:45.120]  Is that a dance?


[40:18.120 --> 40:22.120]  A quarter of the people polled, it's hard because I would have won.
[01:04:45.120 --> 01:04:46.120]  What is it?


[40:22.120 --> 40:25.120]  Price is Right rules, I would have won.
[01:04:46.120 --> 01:04:47.120]  The Friends thing?


[40:25.120 --> 40:28.120]  Granted, there's always issues with polling,
[01:04:47.120 --> 01:04:48.120]  Yes.


[40:28.120 --> 40:31.120]  but even if it's half that, that's absolutely insane,
[01:04:48.120 --> 01:04:49.120]  That's the joke.


[40:31.120 --> 40:34.120]  and it's almost single-handedly because of Alex Jones.
[01:04:49.120 --> 01:04:52.120]  And I couldn't think of a really, really good version of that joke.


[40:34.120 --> 40:36.120]  Oh, yeah.
[01:04:52.120 --> 01:04:54.120]  But I'll just describe what it is.


[40:36.120 --> 40:39.120]  So I'm going to talk more about the misinformation piece.
[01:04:54.120 --> 01:04:57.120]  It's essentially the varying wobble of Earth's axis of rotation.


[40:39.120 --> 40:42.120]  I know everyone has seen all of the clips of his testimony
[01:04:57.120 --> 01:04:58.120]  It's actually kind of complicated.


[40:42.120 --> 40:45.120]  and all of the perjury and all the fun stuff,
[01:04:58.120 --> 01:05:02.120]  I'm trying to really wrap my head around what's exactly going on with this Chandler wobble.


[40:45.120 --> 40:47.120]  but since this is a misinformation conference,
[01:05:02.120 --> 01:05:07.120]  But it's the axis of rotation that varies, causing a shorter term wobble.


[40:47.120 --> 40:50.120]  I'm going to focus on that aspect of it.
[01:05:07.120 --> 01:05:09.120]  So that's as much as I'll say about the Chandler wobble.


[40:50.120 --> 40:54.120]  And I think as skeptics, we often hear the question, what's the harm?
[01:05:09.120 --> 01:05:10.120]  Okay, so what does this mean?


[40:54.120 --> 40:57.120]  Especially with things like conspiracy theories or supplements.
[01:05:10.120 --> 01:05:11.120]  What's going to happen?


[40:57.120 --> 41:02.120]  It's just easy to dismiss until it gets to this point,
[01:05:11.120 --> 01:05:13.120]  What are some really bad things?


[41:02.120 --> 41:06.120]  and Alex Jones took both of those things and ruined some families' lives.
[01:05:13.120 --> 01:05:16.120]  Okay, it's the leap second that could be concerning here.


[41:06.120 --> 41:08.120]  So, some background.
[01:05:16.120 --> 01:05:17.120]  Because we've had leap seconds.


[41:08.120 --> 41:12.120]  The caricature that you think of as Alex Jones is pretty much accurate.
[01:05:17.120 --> 01:05:22.120]  We've had plenty of leap seconds where you add an extra second to coordinated universal


[41:12.120 --> 41:16.120]  He peddles all of the conspiracy theories, 9-11 truth or Pizzagate.
[01:05:22.120 --> 01:05:23.120]  time.


[41:16.120 --> 41:20.120]  Now he's talking about the globalists trying to bring about the New World Order,
[01:05:23.120 --> 01:05:24.120]  And that's been done.


[41:20.120 --> 41:23.120]  and when the Sandy Hook shooting happened,
[01:05:24.120 --> 01:05:26.120]  Nobody really thinks about it anymore.


[41:23.120 --> 41:27.120]  he almost immediately was questioning the narrative.
[01:05:26.120 --> 01:05:28.120]  But it's problematic.


[41:27.120 --> 41:32.120]  And he's gone from saying it's a hoax, calling the parents crisis actors,
[01:05:28.120 --> 01:05:32.120]  In 2012, Reddit was taken down because a leap second was added that year.


[41:32.120 --> 41:34.120]  and that's changed over time.
[01:05:32.120 --> 01:05:33.120]  Wow.


[41:34.120 --> 41:37.120]  His position has definitely evolved,
[01:05:33.120 --> 01:05:39.120]  And if I was into Reddit then as I am now, I would have been pissed if Reddit went down.


[41:37.120 --> 41:42.120]  but the consistent through line of that is that he's questioning the official story
[01:05:39.120 --> 01:05:40.120]  But they've done tricks.


[41:42.120 --> 41:45.120]  and doesn't think that the official story is true.
[01:05:40.120 --> 01:05:43.120]  They've got something called leap smearing, where they take microsecond slowdowns.


[41:45.120 --> 41:48.120]  And because of this, the families of the children who died
[01:05:43.120 --> 01:05:44.120]  They need a rebrand.


[41:48.120 --> 41:51.120]  have received death threats, they've been harassed,
[01:05:44.120 --> 01:05:45.120]  Yes.


[41:51.120 --> 41:54.120]  and they're dealing with this constantly circulating.
[01:05:45.120 --> 01:05:52.120]  In the course of a day, they might do microsecond slowdowns leading up to the leap second.


[41:54.120 --> 41:58.120]  So a bunch of the families have sued him, rightfully so.
[01:05:52.120 --> 01:05:55.120]  So that makes it a little more palatable, I guess.
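
For a sense of scale, here is a minimal sketch of what a linear smear works out to per second of clock time. The 24-hour smear window is an assumption used for illustration; different operators document different windows.

<syntaxhighlight lang="python">
# Minimal sketch: spreading one leap second across an assumed 24-hour smear window.
leap_s = 1.0
window_s = 24 * 3600   # assumed 24-hour linear smear window

stretch_per_second = leap_s / window_s
print(f"Each smeared second is stretched by ~{stretch_per_second * 1e6:.1f} microseconds")
# ~11.6 microseconds per second: every tick runs slightly long, so no clock ever has to
# display an extra 23:59:60 second all at once.
</syntaxhighlight>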


[41:58.120 --> 42:01.120]  And so this trial was for the parents of Jesse Lewis,
[01:05:55.120 --> 01:05:56.120]  Bob, but wait.


[42:01.120 --> 42:04.120]  who was a six-year-old who died in Sandy Hook,
[01:05:56.120 --> 01:05:58.120]  I hate to cut in.


[42:04.120 --> 42:09.120]  for defamation and intentional infliction of emotional distress.
[01:05:58.120 --> 01:06:03.120]  But why does a fraction of a second matter in the world?


[42:09.120 --> 42:12.120]  And we're about to make fun of Alex Jones,
[01:06:03.120 --> 01:06:05.120]  Well, it's not a fraction of a second.


[42:12.120 --> 42:17.120]  but as we're doing it, keep in mind that this all sounds silly and ridiculous,
[01:06:05.120 --> 01:06:06.120]  It's a full second.


[42:17.120 --> 42:20.120]  but it's causing real harm to these families.
[01:06:06.120 --> 01:06:07.120]  I mean, think about it, Jay.


[42:20.120 --> 42:23.120]  And I don't want to make light of it, but at the same time,
[01:06:07.120 --> 01:06:12.120]  I mean, a second is small, but it's important.


[42:23.120 --> 42:25.120]  there's something really satisfying,
[01:06:12.120 --> 01:06:17.120]  And computer systems and GPS and satellites, lots of things are interrelated.


[42:25.120 --> 42:29.120]  especially in the misinformation apocalypse that we're in right now,
[01:06:17.120 --> 01:06:18.120]  And it took down Reddit.


[42:29.120 --> 42:33.120]  about somebody who is this awful actually being held accountable.
[01:06:18.120 --> 01:06:20.120]  I mean, this can happen.


[42:33.120 --> 42:37.120]  So we've got to at least appreciate that for a minute.
[01:06:20.120 --> 01:06:26.120]  Y2K is kind of a related example of when you mess with something so fundamental.


[42:37.120 --> 42:40.120]  Also, his lawyers are comically terrible.
[01:06:26.120 --> 01:06:30.120]  And I'll go into it in a little bit more detail in one second, Jay.


[42:40.120 --> 42:42.120]  How can they be that?
[01:06:30.120 --> 01:06:33.120]  So a normal leap second can be problematic.


[42:42.120 --> 42:45.120]  I mean, for a guy that has this much money,
[01:06:33.120 --> 01:06:36.120]  Perhaps it's not as problematic as it was.


[42:45.120 --> 42:48.120]  how could he because he's a losing case?
[01:06:36.120 --> 01:06:40.120]  But a negative leap second, if the Earth keeps spinning faster and faster,


[42:48.120 --> 42:50.120]  Because nobody wants to defend him.
[01:06:40.120 --> 01:06:46.120]  or if we maintain this average of faster than 24 hours,


[42:50.120 --> 42:54.120]  He probably has been working his way down the ladder of terrible lawyers.
[01:06:46.120 --> 01:06:50.120]  then we may need to add a negative leap second.


[42:54.120 --> 42:56.120]  And you've had that experience.
[01:06:50.120 --> 01:06:53.120]  And that's much more problematic than a regular leap second,


[42:56.120 --> 42:58.120]  I mean, his lawyers were pretty terrible.
[01:06:53.120 --> 01:06:55.120]  because you're skipping a second instead of adding one.


[42:58.120 --> 43:01.120]  With your case, your opponent had that as well.
[01:06:55.120 --> 01:07:01.120]  It's tougher to do and more risky than adding a second for various technical reasons.


[43:01.120 --> 43:05.120]  He kept going through lawyers because nobody of quality would defend him.
[01:07:01.120 --> 01:07:02.120]  For example...


[43:05.120 --> 43:07.120]  Who wants to defend this guy?
[01:07:02.120 --> 01:07:03.120]  This is going into the future.


[43:07.120 --> 43:10.120]  The other thing is that they did it on purpose.
[01:07:03.120 --> 01:07:05.120]  This really sounds like a time travel episode.


[43:10.120 --> 43:11.120]  That's what I was thinking.
[01:07:05.120 --> 01:07:06.120]  Yeah, right?


[43:11.120 --> 43:12.120]  You think they're sandbagging?
[01:07:06.120 --> 01:07:09.120]  But smartphones, computers, communication systems,


[43:12.120 --> 43:13.120]  Yeah.
[01:07:09.120 --> 01:07:13.120]  they synchronize using something called the Network Time Protocol.


[43:13.120 --> 43:15.120]  His morals got the better of him.
[01:07:13.120 --> 01:07:17.120]  And that Network Time Protocol is based on the number of seconds


[43:15.120 --> 43:17.120]  That thought has been brought up.
[01:07:17.120 --> 01:07:20.120]  that have transpired since January 1st, 1970.
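
Here is a minimal sketch of that "seconds since 1970" bookkeeping and why a negative leap second is the awkward case. It assumes the usual POSIX convention that epoch time pretends leap seconds don't exist; the date below is purely illustrative.

<syntaxhighlight lang="python">
from datetime import datetime, timezone

# POSIX/Unix epoch time counts a flat 86,400 seconds per day since 1970-01-01 and
# ignores leap seconds entirely, so it can disagree with UTC labels around one.
t = datetime(2022, 12, 31, 23, 59, 58, tzinfo=timezone.utc)   # illustrative instant
print("epoch seconds:", t.timestamp())

# Positive leap second: UTC inserts a 23:59:60 label; epoch time is typically made to
# repeat or smear through it.
# Negative leap second: UTC would skip 23:59:59 entirely -- a wall-clock second that never
# exists -- which software assuming every minute has 60 seconds may not be written to handle.
</syntaxhighlight>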


[43:17.120 --> 43:20.120]  But the thing is, one, it's a civil case,
[01:07:20.120 --> 01:07:24.120]  So you throw out a second there and things can go a little wonky.


[43:20.120 --> 43:24.120]  so he can't get away with the whole, like, my lawyers were incompetent,
[01:07:24.120 --> 01:07:25.120]  So that's a little concerning.


[43:24.120 --> 43:26.120]  so get out of it that way.
[01:07:25.120 --> 01:07:28.120]  It can cause some issues with these systems.


[43:26.120 --> 43:30.120]  But also, they cross-examined the parents.
[01:07:28.120 --> 01:07:30.120]  Also, there's GPS satellites.


[43:30.120 --> 43:33.120]  And I feel like if you were sandbagging it,
[01:07:30.120 --> 01:07:33.120]  GPS satellites don't account for rotation.


[43:33.120 --> 43:36.120]  you wouldn't want to inflict additional trauma on the parents.
[01:07:33.120 --> 01:07:35.120]  They're not really built to deal with rotation.


[43:36.120 --> 43:40.120]  And some of the questions that he was asking them, I couldn't believe.
[01:07:35.120 --> 01:07:37.120]  So if the Earth is spinning faster,


[43:40.120 --> 43:43.120]  Have the lawyers made a statement about how it happened?
[01:07:37.120 --> 01:07:43.120]  the GPS satellite will all of a sudden be over a specific area a little earlier


[43:43.120 --> 43:47.120]  Because it's hard to accidentally send a huge set of files or file.
[01:07:43.120 --> 01:07:45.120]  than it would have been previously.


[43:47.120 --> 43:49.120]  I always forget to send attachments.
[01:07:45.120 --> 01:07:46.120]  And that could mean the difference,


[43:49.120 --> 43:52.120]  Oh, the phone that's almost definitely going to the January 6th committee
[01:07:46.120 --> 01:07:50.120]  even if the Earth sped up by a half a millisecond,


[43:52.120 --> 43:54.120]  is like a whole story in itself.
[01:07:50.120 --> 01:07:54.120]  it could be 10 inches or 26 centimeters off.
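
That figure is easy to sanity-check with back-of-the-envelope arithmetic. The equatorial circumference and sidereal day below are rounded values, and this is only an illustration of the geometry, not how GPS error budgets are actually computed.

<syntaxhighlight lang="python">
# Rough sketch: how far does a point on the equator move in half a millisecond of
# unaccounted-for rotation?
equatorial_circumference_m = 40_075_000   # ~40,075 km
sidereal_day_s = 86_164                   # one full rotation, ~23 h 56 m 4 s

surface_speed = equatorial_circumference_m / sidereal_day_s   # ~465 m/s at the equator
offset_m = surface_speed * 0.0005                             # half a millisecond, as above

print(f"~{offset_m * 100:.0f} cm (~{offset_m / 0.0254:.0f} inches) of ground motion")
# ~23 cm / ~9 in: the same ballpark as the "10 inches or 26 centimeters" quoted above.
</syntaxhighlight>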


[43:54.120 --> 43:57.120]  But basically, the one lawyer said,
[01:07:54.120 --> 01:07:56.120]  And that would compound.


[43:57.120 --> 44:00.120]  please disregard after he accidentally sent the files,
[01:07:56.120 --> 01:07:59.120]  And eventually the GPS satellites could be essentially useless


[44:00.120 --> 44:04.120]  but didn't actually take the legal steps to pull back all that information.
[01:07:59.120 --> 01:08:02.120]  if we don't do anything, which we probably will.


[44:04.120 --> 44:08.120]  So they just got to use it after his ten days were up.
[01:08:02.120 --> 01:08:05.120]  I mean, it's not like, oh my God, GPS is going to be worthless.


[44:08.120 --> 44:11.120]  This trial was specifically for damages,
[01:08:05.120 --> 01:08:08.120]  And when you say do something, you're like, we've got to program this problem.


[44:11.120 --> 44:15.120]  because Alex Jones didn't provide any of the documents or evidence
[01:08:08.120 --> 01:08:11.120]  Yeah, I'm not sure what level of effort would be required,


[44:15.120 --> 44:17.120]  that he was supposed to during the discovery phase,
[01:08:11.120 --> 01:08:13.120]  but I'm sure it's not going to be trivial.


[44:17.120 --> 44:20.120]  and he dragged things on for years, and so there was a default judgment.
[01:08:13.120 --> 01:08:16.120]  So some people say that this is going to be over soon


[44:20.120 --> 44:23.120]  So it wasn't a question of if the defamation happens.
[01:08:16.120 --> 01:08:23.120]  and this increased rotation speed of the Earth isn't going to necessarily stay this way for years.


[44:23.120 --> 44:25.120]  The court had decided the defamation happened.
[01:08:23.120 --> 01:08:28.120]  Some people are saying this could be the beginning of a 50-year scenario


[44:25.120 --> 44:30.120]  This was just to decide how much he had to pay for it.
[01:08:28.120 --> 01:08:32.120]  where the Earth is spinning faster than 24 hours.


[44:30.120 --> 44:36.120]  And the trial was exactly as dramatic as the clips are portraying it to be,
[01:08:32.120 --> 01:08:35.120]  And we may absolutely need to throw in some of these negative leap seconds,


[44:36.120 --> 44:39.120]  and I think this one exchange between Alex Jones and the judge
[01:08:35.120 --> 01:08:37.120]  which could cause some problems.


[44:39.120 --> 44:43.120]  is the epitome of his testimony at least.
[01:08:37.120 --> 01:08:39.120]  So that's the story.


[44:43.120 --> 44:45.120]  So I'm going to read that.
[01:08:39.120 --> 01:08:40.120]  It's interesting.


[44:45.120 --> 44:48.120]  I'm sorry, I don't have as good an Alex Jones impression as George.
[01:08:40.120 --> 01:08:42.120]  I'm not too worried about it.


[44:48.120 --> 44:53.120]  So the judge, after sending the jury out because Alex Jones was talking about
[01:08:42.120 --> 01:08:45.120]  But we'll see if some negative leap seconds get thrown in there,


[44:53.120 --> 44:56.120]  things that he wasn't supposed to while he was on the stand,
[01:08:45.120 --> 01:08:53.120]  and we might find out by the end of this year or the following year if this keeps up.


[44:56.120 --> 44:59.120]  said, you're already under oath to tell the truth.
[01:08:53.120 --> 01:08:55.120]  So, Bob, are you angry about all this?


[44:59.120 --> 45:02.120]  You've already violated that oath twice today.
[01:08:55.120 --> 01:08:56.120]  No.


[45:02.120 --> 45:03.120]  And granted, twice today.
[01:08:56.120 --> 01:09:00.120]  It was just interesting research.


[45:03.120 --> 45:07.120]  He had been on the stand for like 10 minutes by that point maybe.
[01:09:00.120 --> 01:09:03.120]  It was actually tough.


[45:07.120 --> 45:11.120]  That might be an exaggeration, but it was end of the day,
[01:09:03.120 --> 01:09:04.120]  I'm answering.


[45:11.120 --> 45:12.120]  he had just gotten on the stand.
[01:09:04.120 --> 01:09:08.120]  It was tough to really get to fully understand all the nuances here,


[45:12.120 --> 45:16.120]  It seems absurd to instruct you that you must tell the truth while you testify,
[01:09:08.120 --> 01:09:11.120]  because you've got sidereal day, solar day, mean solar day,


[45:16.120 --> 45:18.120]  yet here I am.
[01:09:11.120 --> 01:09:16.120]  all these things, and different websites had different takes on exactly what those mean.


[45:18.120 --> 45:20.120]  You must tell the truth when you testify.
[01:09:16.120 --> 01:09:21.120]  And it was interesting to put it all together and understand exactly what was happening.


[45:20.120 --> 45:22.120]  This is not your show.
[01:09:21.120 --> 01:09:22.120]  So, yeah, I enjoyed this.


[45:22.120 --> 45:25.120]  And then she explains some of the specifics, and she goes,
[01:09:22.120 --> 01:09:25.120]  A great bar bet that we were talking about when we were talking about this before.


[45:25.120 --> 45:27.120]  do you understand what I have said?
[01:09:25.120 --> 01:09:29.120]  So, Andrea, how many times does the Earth rotate on its axis in one year?


[45:27.120 --> 45:30.120]  And he goes, I, and she interrupts him and says, yes or no.
[01:09:29.120 --> 01:09:31.120]  365 and a quarter, isn't that it?


[45:30.120 --> 45:34.120]  He goes, yes, I believe what I said is true.
[01:09:31.120 --> 01:09:32.120]  Wrong.


[45:34.120 --> 45:35.120]  And she cuts him off.
[01:09:32.120 --> 01:09:33.120]  Oh.


[45:35.120 --> 45:39.120]  She goes, you believe everything you say is true, but it isn't.
[01:09:33.120 --> 01:09:41.120]  366 and a quarter, because in going around the sun, it's got to rotate one extra time.


[45:39.120 --> 45:41.120]  Your beliefs do not make something true.
[01:09:41.120 --> 01:09:46.120]  A day, you know, one day is a full rotation plus a degree,


[45:41.120 --> 45:43.120]  That's what we're doing here.
[01:09:46.120 --> 01:09:50.120]  a full rotation plus a degree, and it adds up over a year to a whole other rotation.


[45:43.120 --> 45:44.120]  Oh my God.
[01:09:50.120 --> 01:09:51.120]  Right.


[45:44.120 --> 45:45.120]  Wow.
[01:09:51.120 --> 01:09:55.120]  361 degrees is the mean solar day, 24 hours.


[45:45.120 --> 45:48.120]  And you should really watch that whole clip because there was so much more of it,
[01:09:55.120 --> 01:09:57.120]  A sidereal day is...


[45:48.120 --> 45:50.120]  but I couldn't go into the whole thing.
[01:09:57.120 --> 01:09:59.120]  23 hours and 56 minutes.


[45:50.120 --> 45:54.120]  And watch all the clips from his testimony because it is absolutely horrifying,
[01:09:59.120 --> 01:10:00.120]  Exactly.


[45:54.120 --> 45:58.120]  but also really satisfying because he's an awful person and deserves every bit of that.
[01:10:00.120 --> 01:10:01.120]  Wow.


[45:58.120 --> 46:02.120]  And I can't help, through all the things that I've consumed about this man,
[01:10:01.120 --> 01:10:02.120]  23 hours and 56 minutes.


[46:02.120 --> 46:07.120]  I can't help but think that this entire thing is an act.
[01:10:02.120 --> 01:10:03.120]  It's four minutes.
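
The bar-bet arithmetic, as a minimal sketch; values are rounded, and the year is treated as exactly 365.25 solar days for illustration.

<syntaxhighlight lang="python">
# Relative to the distant stars, Earth completes one extra rotation per year:
# ~366.25 sidereal rotations produce only ~365.25 sunrises.
solar_days_per_year = 365.25
rotations_per_year = solar_days_per_year + 1

mean_solar_day_s = 24 * 3600
sidereal_day_s = mean_solar_day_s * solar_days_per_year / rotations_per_year

hours, rem = divmod(sidereal_day_s, 3600)
minutes, seconds = divmod(rem, 60)
print(f"sidereal day ~ {int(hours)} h {int(minutes)} m {seconds:.0f} s")   # ~23 h 56 m 4 s

# Each solar day Earth turns a full circle plus ~1 degree to catch back up with the Sun.
print(f"degrees per solar day ~ {360 * mean_solar_day_s / sidereal_day_s:.2f}")
</syntaxhighlight>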


[46:07.120 --> 46:08.120]  I was thinking the same, Jay.
[01:10:03.120 --> 01:10:05.120]  But there's also lots of variations.


[46:08.120 --> 46:10.120]  I'm wondering what you all think about that.
[01:10:05.120 --> 01:10:08.120]  You're going to leave work early and be like, I'm on a sidereal day.


[46:10.120 --> 46:13.120]  You think he knows what he's doing and he's just pretending?
[01:10:08.120 --> 01:10:10.120]  That is such a skeptic thing.


[46:13.120 --> 46:18.120]  Of course, I'm not 100% sure, but it just seems like it is all a money-making act.
[01:10:10.120 --> 01:10:11.120]  Like, wrong.


[46:18.120 --> 46:21.120]  Like I don't think he's a real conspiracy theorist.
[01:10:11.120 --> 01:10:12.120]  365.


[46:21.120 --> 46:22.120]  I think he is.
[01:10:12.120 --> 01:10:13.120]  You know what I mean?


[46:22.120 --> 46:23.120]  No, I think you're right.
[01:10:13.120 --> 01:10:14.120]  Come on.


[46:23.120 --> 46:27.120]  He uses his conspiracies to sell supplements because he'll talk about the conspiracy theory
[01:10:14.120 --> 01:10:15.120]  Yeah.


[46:27.120 --> 46:33.120]  to get the views and then he pivots into an ad for supplements or for shelf-stable food
[01:10:15.120 --> 01:10:21.120]  But also, the other nuance is that the day varies depending on where you are in the orbit


[46:33.120 --> 46:36.120]  because the Great Reset is coming and so you need to have food,
[01:10:21.120 --> 01:10:25.120]  and what season it is and the tilt of the Earth.


[46:36.120 --> 46:39.120]  or gold because there's going to be one world currency, so you need gold.
[01:10:25.120 --> 01:10:29.120]  There's so many little factors that go in here to make it extra confusing.


[46:39.120 --> 46:44.120]  And didn't he admit as much during his trial with his, what, divorce with his wife, effectively?
[01:10:29.120 --> 01:10:35.120]  So can't we help by having a party somewhere on Earth that will slow the rotation down?


[46:44.120 --> 46:45.120]  Custody.
[01:10:35.120 --> 01:10:37.120]  There must be some human configuration that we could do.


[46:45.120 --> 46:46.120]  Was it custody?
[01:10:37.120 --> 01:10:39.120]  We all go to the North Pole at the same time.


[46:46.120 --> 46:49.120]  Yeah, Alex Jones is a character that he is playing.
[01:10:39.120 --> 01:10:44.120]  We all have to jump at the same time so that we can alleviate the pressure.


[46:49.120 --> 46:52.120]  That was one of his lines of defense,
[01:10:44.120 --> 01:10:45.120]  It would be like Earth.


[46:52.120 --> 46:54.120]  which I think probably is accurate.
[01:10:45.120 --> 01:10:46.120]  It would be like in an elevator.


[46:54.120 --> 46:56.120]  Again, we can't read his mind.
[01:10:46.120 --> 01:10:49.120]  Andrea, it would be like an 80s movie, like the end of an 80s movie where we all jump.
 
[46:56.120 --> 46:58.120]  We don't really know what he believes or doesn't believe,
 
[46:58.120 --> 47:01.120]  but it certainly is plausible and it certainly fits everything I've seen about him,
 
[47:01.120 --> 47:03.120]  that this is a character he's playing.
 
[47:03.120 --> 47:08.120]  He did admit that, which means he doesn't necessarily have to believe anything.
 
[47:08.120 --> 47:10.120]  But he's still doing the same level of damage, whether or not.
 
[47:10.120 --> 47:11.120]  Totally.
 
[47:11.120 --> 47:12.120]  That's right.
 
[47:12.120 --> 47:13.120]  Absolutely.
 
[47:13.120 --> 47:14.120]  People believe that he's real.
 
[47:14.120 --> 47:16.120]  Well, and he's doing the character under oath, right?
 
[47:16.120 --> 47:17.120]  Yes, that's the thing.
 
[47:17.120 --> 47:19.120]  That has consequences.
 
[47:19.120 --> 47:23.120]  It's been so interesting to watch because he's not used to being challenged on his show.
 
[47:23.120 --> 47:25.120]  He has control over the entire narrative.
 
[47:25.120 --> 47:27.120]  Now he has to be in reality.
 
[47:27.120 --> 47:32.120]  And so he started to do one of his ad pitches on the stand.
 
[47:32.120 --> 47:35.120]  He started talking about how great his supplements are and they get the best supplements.
 
[47:35.120 --> 47:36.120]  He can't help it.
 
[47:36.120 --> 47:37.120]  Oh, my God.
 
[47:37.120 --> 47:39.120]  It's all he knows, effectively.
 
[47:39.120 --> 47:43.120]  If he can make a few bucks on the stand, why not go for it, I guess, right?
 
[47:43.120 --> 47:47.120]  It's always satisfying to see, because this is not the first time this has happened,
 
[47:47.120 --> 47:51.120]  and there are cases where people who are con artists or pseudoscientists or whatever,
 
[47:51.120 --> 47:55.120]  and they find themselves in a court of law where there are rules of evidence.
 
[47:55.120 --> 48:01.120]  Not that courts are perfect, but they do have fairly rigorous rules of evidence and argument,
 
[48:01.120 --> 48:03.120]  et cetera.
 
[48:03.120 --> 48:08.120]  Judges, if they're competent, aren't going to let you get away with stuff.
 
[48:08.120 --> 48:13.120]  And just watching that disconnect, somebody like Alex Jones who's living in a fantasy world,
 
[48:13.120 --> 48:19.120]  whether he believes it or not, he is used to being in this con artist construct,
 
[48:19.120 --> 48:25.120]  and now he has to deal with reality and rules of evidence,
 
[48:25.120 --> 48:29.120]  and the clash is just wonderful to behold.
 
[48:29.120 --> 48:33.120]  It's kind of reminding me, Jay, I think you talked about this on a live, an SGU Live,
 
[48:33.120 --> 48:39.120]  maybe a year ago when Sanjay Gupta was on Joe Rogan and we all expected it to be kind of like that,
 
[48:39.120 --> 48:42.120]  but Joe Rogan just sort of steamrolled the whole thing.
 
[48:42.120 --> 48:46.120]  This is what I wish that had been like, because now we're in a place where the rules,
 
[48:46.120 --> 48:48.120]  reality has to hold for a second.
 
[48:48.120 --> 48:53.120]  Fun fact, Joe Rogan was on Infowars on 9-11.
 
[48:53.120 --> 48:55.120]  As he was spewing his...
 
[48:55.120 --> 48:58.120]  One of the least fun, fun facts I've ever heard.
 
[48:58.120 --> 49:03.120]  As soon as 9-11 happened, he was already spewing conspiracy theories,
 
[49:03.120 --> 49:05.120]  and then he had Joe Rogan on.
 
[49:05.120 --> 49:09.120]  Wait, wait, Joe Rogan was on Alex Jones' Infowars show?
 
[49:09.120 --> 49:13.120]  Well, that guy literally just dropped lower than I thought he would.
 
[49:13.120 --> 49:15.120]  That is ridiculous.
 
[49:15.120 --> 49:21.120]  So I read in the chat, somebody said something about Texas tort law
 
[49:21.120 --> 49:27.120]  that drops the 45 million down to 750,000.
 
[49:27.120 --> 49:28.120]  I read that too.
 
[49:28.120 --> 49:32.120]  From what I saw from the plaintiff's lawyer, he was saying...
 
[49:32.120 --> 49:37.120]  So there was talk about a cap because it was divided into two sets of damages.
 
[49:37.120 --> 49:40.120]  So there were the compensatory damages and the punitive damages.
 
[49:40.120 --> 49:46.120]  The compensatory damages were 4.5 million, and then the punitive damages were 41 million.
 
[49:46.120 --> 49:50.120]  And while we were waiting to hear what the punitive damages were,
 
[49:50.120 --> 49:53.120]  people were talking about a cap because it had to be a certain multiple
 
[49:53.120 --> 49:55.120]  of the compensatory damages.
 
[49:55.120 --> 50:00.120]  But from the statement that the plaintiff's lawyer gave afterwards,
 
[50:00.120 --> 50:03.120]  that was more of a guideline, not a hard cap.
 
[50:03.120 --> 50:05.120]  More of a guideline.
 
[50:05.120 --> 50:07.120]  I'm just going based on his statement.
 
[50:07.120 --> 50:10.120]  I don't know anything about Texas law, not a lawyer.
 
[50:10.120 --> 50:13.120]  But that was what I heard about that.
 
[50:13.120 --> 50:18.120]  I was hoping to see them literally dismantle him and his company.
 
[50:18.120 --> 50:21.120]  Why wouldn't this guy see prison time?
 
[50:21.120 --> 50:24.120]  It's a civil case, you know, no prison.
 
[50:24.120 --> 50:30.120]  I understand that, but it doesn't mean that he can't be put in prison legitimately.
 
[50:30.120 --> 50:32.120]  He did perjure himself.
 
[50:32.120 --> 50:34.120]  That would be a whole other story.
 
[50:34.120 --> 50:37.120]  That would be something emerging from the trial itself.
 
[50:37.120 --> 50:43.120]  But it's hard to bring criminal charges against somebody for what they're saying
 
[50:43.120 --> 50:46.120]  in a public forum because of free speech laws, etc.
 
[50:46.120 --> 50:48.120]  But civil is different.
 
[50:48.120 --> 50:53.120]  Holding people liable for the damage that they knowingly and maliciously caused,
 
[50:53.120 --> 50:55.120]  the law allows for that.
 
[50:55.120 --> 50:59.120]  One more thing I did want to bring up is, in my opinion,
 
[50:59.120 --> 51:01.120]  one of the best witnesses that they had.
 
[51:01.120 --> 51:06.120]  Her name is Becca Lewis and she does research in misinformation and disinformation
 
[51:06.120 --> 51:08.120]  and how it spreads.
 
[51:08.120 --> 51:11.120]  They had her on as an expert witness about misinformation.
 
[51:11.120 --> 51:15.120]  She talked about how and why it spreads faster than the truth
 
[51:15.120 --> 51:20.120]  since it feeds into people's world views, the confirmation bias.
 
[51:20.120 --> 51:24.120]  The things that confirm their existing world views are going to circulate,
 
[51:24.120 --> 51:27.120]  especially once you start to have echo chambers like Infowars'.
 
[51:27.120 --> 51:31.120]  Also, Alex Jones platformed other conspiracy theorists.
 
[51:31.120 --> 51:35.120]  There was one that she talked about whose content only had three views
 
[51:35.120 --> 51:38.120]  before Alex Jones started promoting it.
 
[51:38.120 --> 51:40.120]  It was something that nobody was going to see.
 
[51:40.120 --> 51:43.120]  But because of his platform, a lot of people saw it.
 
[51:43.120 --> 51:49.120]  Now we have 24% of the country who questions this main narrative.
 
[51:49.120 --> 51:51.120]  That was a lot of what the trial was about.
 
[51:51.120 --> 51:53.120]  He would claim, oh, I was just asking questions.
 
[51:53.120 --> 51:56.120]  I was just having these people on to get their opinion.
 
[51:56.120 --> 51:58.120]  Oh, my guest said it, but I didn't say it.
 
[51:58.120 --> 52:02.120]  But he provided that platform for them to get their views out.
 
[52:02.120 --> 52:06.120]  I think the most interesting thing she talked about was this idea
 
[52:06.120 --> 52:09.120]  of three degrees of Alex Jones.
 
[52:09.120 --> 52:13.120]  She said that you basically can't do misinformation research
 
[52:13.120 --> 52:16.120]  without encountering Infowars and Alex Jones.
 
[52:16.120 --> 52:22.120]  The common rule is that you're never more than three recommendations away
 
[52:22.120 --> 52:26.120]  from Alex Jones or Infowars videos.
 
[52:26.120 --> 52:27.120]  Wow.
 
[52:27.120 --> 52:29.120]  Ouch.
 
[52:29.120 --> 52:34.120]  The way to restate that is you can't be more full of shit than Alex Jones.
 
[52:34.120 --> 52:36.120]  Yeah, basically.
 
[52:36.120 --> 52:41.120]  Jones' lawyer was trying to trip her up, and he was trying to use
 
[52:41.120 --> 52:44.120]  all of the things that a scientist or a skeptic would use.
 
[52:44.120 --> 52:48.120]  He's talking about sample size and bias and things like that
 
[52:48.120 --> 52:51.120]  because in any paper at the end, they're going to talk about
 
[52:51.120 --> 52:54.120]  all of the limitations and say, like, this is a potential limitation.
 
[52:54.120 --> 52:57.120]  This is a potential source of bias, but we tried to account for it
 
[52:57.120 --> 52:59.120]  as best we could.
 
[52:59.120 --> 53:02.120]  But she's a researcher, so she knew it a lot better than he did.
 
[53:02.120 --> 53:06.120]  So she'd stop and she'd be like, no, this is what that means.
 
[53:06.120 --> 53:08.120]  You have no idea what you're talking about.
 
[53:08.120 --> 53:10.120]  Oh, that's great.
 
[53:10.120 --> 53:13.120]  Yeah, and he tried to say that she hated Alex Jones and things like that,
 
[53:13.120 --> 53:17.120]  and that would bias her, and she didn't know who Alex Jones was
 
[53:17.120 --> 53:19.120]  before she started researching this.
 
[53:19.120 --> 53:21.120]  And she just goes, yes, that's correct.
 
[53:21.120 --> 53:25.120]  Like, when he'd present something, she'd say, yes, that's correct,
 
[53:25.120 --> 53:27.120]  and it's based on hundreds of hours of research.
 
[53:27.120 --> 53:29.120]  It's not just her opinion.
 
[53:29.120 --> 53:32.120]  And so he kept trying to trip her up, and the best part was
 
[53:32.120 --> 53:37.120]  he was asking her questions and said, the poll that found
 
[53:37.120 --> 53:42.120]  24% questioned Sandy Hook, that it was under 1,000 sample size
 
[53:42.120 --> 53:45.120]  and was trying to discredit it that way.
 
[53:45.120 --> 53:47.120]  And she's like, you can have statistical significance
 
[53:47.120 --> 53:50.120]  with less than 1,000 sample size, like trying to explain that.
 
[53:50.120 --> 53:55.120]  And then the plaintiff's lawyer comes up and hands her the actual study
 
[53:55.120 --> 54:00.120]  and the Jones lawyer was full of shit because it was over 1,000.
 
[54:00.120 --> 54:02.120]  So it wasn't even that, yeah.
 
[54:02.120 --> 54:04.120]  Even the lawyer is full of BS.
 
[54:04.120 --> 54:10.120]  We're really seeing this trend here with these crazy lawsuits.
 
[54:10.120 --> 54:13.120]  How do you defend Alex Jones legitimately?
 
[54:13.120 --> 54:15.120]  How do you do it?
 
[54:15.120 --> 54:18.120]  You literally have to try to slip through some cracks.
 
[54:18.120 --> 54:22.120]  Well, but you also don't have to defend him and say he's innocent.
 
[54:22.120 --> 54:24.120]  I mean, I know innocent and guilty isn't what's happening here
 
[54:24.120 --> 54:26.120]  because it's a civil case, but you don't have to say,
 
[54:26.120 --> 54:28.120]  oh, no, he didn't defame people.
 
[54:28.120 --> 54:33.120]  You can just try to mitigate the damage in an ethical way.
 
[54:33.120 --> 54:37.120]  A lawyer can give a defense they don't personally believe,
 
[54:37.120 --> 54:39.120]  they don't have to believe it.
 
[54:39.120 --> 54:42.120]  The ethics of law does not require that.
 
[54:42.120 --> 54:46.120]  It just has to be a legally responsible and viable argument.
 
[54:46.120 --> 54:50.120]  Their personal belief is actually not relevant to it.
 
[54:50.120 --> 54:54.120]  So as long as they are mounting an ethical defense, it's fine.
 
[54:54.120 --> 54:58.120]  But it's certainly reasonable to think that there isn't an ethical defense
 
[54:58.120 --> 55:07.120]  of somebody like Alex Jones because it seems so obvious that he's guilty.
 
[55:07.120 --> 55:11.120]  But again, the law is based upon the notion that everybody deserves a defense.
 
[55:11.120 --> 55:15.120]  But that doesn't mean that lawyers can do unethical things on the stand.
 
[55:15.120 --> 55:18.120]  It also is why I think that might speak to the quality of the lawyers
 
[55:18.120 --> 55:22.120]  because, again, the high-quality lawyers, Jones clearly has the money.
 
[55:22.120 --> 55:25.120]  He could pay some high-priced legal firm to defend him.
 
[55:25.120 --> 55:28.120]  They probably don't want their reputation sullied with this.
 
[55:28.120 --> 55:29.120]  They don't want to go anywhere near it.
 
[55:29.120 --> 55:31.120]  Nobody wants to be the guy who defended Alex Jones.
 
[55:31.120 --> 55:32.120]  Right.
 
[55:32.120 --> 55:34.120]  Do we have any idea how much money, like what his net worth is?
 
[55:34.120 --> 55:36.120]  Like how ruinous is $41 million, $45 million?
 
[55:36.120 --> 55:38.120]  They were desperately trying to figure that out.
 
[55:38.120 --> 55:42.120]  So officially, I'm sorry if you didn't notice, but officially it's $200,000
 
[55:42.120 --> 55:46.120]  that his enterprise makes $200,000 a day.
 
[55:46.120 --> 55:48.120]  But $200,000 a day.
 
[55:48.120 --> 55:50.120]  Is that net?
 
[55:50.120 --> 55:52.120]  But that's probably an underestimate.
 
[55:52.120 --> 55:58.120]  And in the phone records that were revealed, on some days they make up to $800,000.
 
[55:58.120 --> 55:59.120]  That was their best day.
 
[55:59.120 --> 56:01.120]  That was a good day, yeah.
 
[56:01.120 --> 56:03.120]  You guys have got to sell supplements, man.
 
[56:03.120 --> 56:04.120]  This is right.
 
[56:04.120 --> 56:06.120]  We've got to switch sides.
 
[56:06.120 --> 56:09.120]  But they had a really hard time figuring that kind of stuff out
 
[56:09.120 --> 56:11.120]  because he didn't turn over all the documents that he was supposed to turn over.
 
[56:11.120 --> 56:12.120]  Right, part of the problem.
 
[56:12.120 --> 56:15.120]  So they couldn't really get a solid answer on that.
 
[56:15.120 --> 56:16.120]  What kind of bullshit is that?
 
[56:16.120 --> 56:17.120]  Okay, so you don't do that.
 
[56:17.120 --> 56:19.120]  You don't turn over the documents.
 
[56:19.120 --> 56:25.120]  Like doesn't the law, doesn't the court have the ability to deliver some type of incredible smackdown?
 
[56:25.120 --> 56:27.120]  So that's what they did.
 
[56:27.120 --> 56:29.120]  That was why there was the default judgment.
 
[56:29.120 --> 56:34.120]  And so that's why this was just for damages because they already determined that he was liable
 
[56:34.120 --> 56:37.120]  for the defamation and for the infliction of emotional distress.
 
[56:37.120 --> 56:39.120]  I get that they clicked into summary judgment.
 
[56:39.120 --> 56:41.120]  We see we have some experience with that.
 
[56:41.120 --> 56:42.120]  Yeah.
 
[56:42.120 --> 56:44.120]  But in a good way.
 
[56:44.120 --> 56:47.120]  Don't you get into legal trouble if you don't hand over?
 
[56:47.120 --> 56:49.120]  Like doesn't he have to now deal with the fact?
 
[56:49.120 --> 56:53.120]  Well, you could be held in contempt, right, would be the legal remedy there.
 
[56:53.120 --> 56:58.120]  But just in a case like this, the remedy is you lose.
 
[56:58.120 --> 57:03.120]  You now lose the case and now we're going to talk about how much money you have to pay the plaintiff.
 
[57:03.120 --> 57:05.120]  So that was the remedy.
 
[57:05.120 --> 57:11.120]  He was asked, you know, turn over like emails or texts where, you know, you mentioned Sandy Hook.
 
[57:11.120 --> 57:17.120]  And he said, I did a search on my phone, did not see any text that mentioned Sandy Hook.
 
[57:17.120 --> 57:21.120]  So I want to know what did the court or the judge do at that point?
 
[57:21.120 --> 57:26.120]  Because then, of course, afterwards they got two years of text and of course it's all over the place.
 
[57:26.120 --> 57:28.120]  So he was just flat out lying.
 
[57:28.120 --> 57:31.120]  But if they didn't get that dump, what recourse would they have had to say?
 
[57:31.120 --> 57:32.120]  Yeah, I don't believe you.
 
[57:32.120 --> 57:34.120]  I don't believe your phone doesn't have those.
 
[57:34.120 --> 57:36.120]  They can get the info if they want to.
 
[57:36.120 --> 57:38.120]  They can get the info.
 
[57:38.120 --> 57:43.120]  They can appoint somebody to go through the phone and get the information that they want.
 
[57:43.120 --> 57:46.120]  I know like when I had to turn over my emails, I didn't do it.
 
[57:46.120 --> 57:52.120]  My lawyer hired an independent person to come in, go through all my emails and find the ones that were relevant.
 
[57:52.120 --> 57:54.120]  My hands were not on it at all.
 
[57:54.120 --> 57:55.120]  All right.
 
[57:55.120 --> 57:57.120]  Anything else you want to add before we move on?
 
[57:57.120 --> 58:00.120]  I will throw a quote out there from the lawyer today.
 
[58:00.120 --> 58:03.120]  So this was just the first of a few cases.
 
[58:03.120 --> 58:10.120]  And the plaintiff's lawyer said, there's going to be a large set of plaintiffs dividing up the corpse of Infowars.
 
[58:10.120 --> 58:12.120]  And fingers crossed that that actually happens.
 
[58:12.120 --> 58:13.120]  Yeah, that would be nice.
 
[58:13.120 --> 58:15.120]  Tiny slice of justice in this world.
 
[58:15.120 --> 58:17.120]  The corpse of Infowars.
 
[58:17.120 --> 58:18.120]  It's a nice sentence.
 
[58:18.120 --> 58:19.120]  Add that to your Halloween display.
 
[58:19.120 --> 58:21.120]  I would, I would.
 
=== Earth Spinning Faster <small>(58:21)</small> ===
* [https://www.forbes.com/sites/jamiecartereurope/2022/07/28/earth-is-suddenly-spinning-faster-why-our-planet-just-recorded-its-shortest-day-since-records-began/amp/ Earth Is Suddenly Spinning Faster. Why Our Planet Just Recorded Its Shortest Day Since Records Began]<ref>[https://www.forbes.com/sites/jamiecartereurope/2022/07/28/earth-is-suddenly-spinning-faster-why-our-planet-just-recorded-its-shortest-day-since-records-began/amp/ Forbes: Earth Is Suddenly Spinning Faster. Why Our Planet Just Recorded Its Shortest Day Since Records Began]</ref>
 
[58:21.120 --> 58:22.120]  All right, Bob.
 
[58:22.120 --> 58:30.120]  I understand that the earth is supposed to be slowing down over the long historical time.
 
[58:30.120 --> 58:33.120]  But maybe that's not 100 percent true.
 
[58:33.120 --> 58:36.120]  Well, you know, I don't want to get everybody concerned.
 
[58:36.120 --> 58:43.120]  But the earth is now spinning faster than it ever has before in the age of atomic clocks.
 
[58:43.120 --> 58:45.120]  I thought I felt something.
 
[58:45.120 --> 58:51.120]  January 22nd, this past year, January 22nd, no, June 22nd, 2022.
 
[58:51.120 --> 58:53.120]  The shortest day ever recorded.
 
[58:53.120 --> 58:54.120]  And we're not sure why.
 
[58:54.120 --> 58:55.120]  Should we be scared?
 
[58:55.120 --> 58:57.120]  Should we be afraid?
 
[58:57.120 --> 58:58.120]  So what's what's going on here?
 
[58:58.120 --> 59:00.120]  You mean the longest day ever recorded?
 
[59:00.120 --> 59:01.120]  What did I say?
 
[59:01.120 --> 59:02.120]  Shortest day.
 
[59:02.120 --> 59:03.120]  Shortest day.
 
[59:03.120 --> 59:04.120]  Because the earth is spinning faster.
 
[59:04.120 --> 59:05.120]  Faster, so it's short days, right?
 
[59:05.120 --> 59:06.120]  Yeah, it's getting shorter.
 
[59:06.120 --> 59:07.120]  Yeah, it'd be shorter.
 
[59:07.120 --> 59:09.120]  So it all starts with a day.
 
[59:09.120 --> 59:10.120]  What is a day?
 
[59:10.120 --> 59:11.120]  Yeah, what's a day?
 
[59:11.120 --> 59:12.120]  If you ask anybody, what's a day?
 
[59:12.120 --> 59:13.120]  24 hours.
 
[59:13.120 --> 59:14.120]  24 hours.
 
[59:14.120 --> 59:15.120]  Steve, what is that in metric?
 
[59:15.120 --> 59:17.120]  Oh, never mind.
 
[59:17.120 --> 59:20.120]  So a mean solar day is 24 hours.
 
[59:20.120 --> 59:21.120]  That's right.
 
[59:21.120 --> 59:22.120]  That's what it is.
 
[59:22.120 --> 59:25.120]  But that's the outermost onion layer.
 
[59:25.120 --> 59:29.120]  As we say, you get a little deeper and it's never really 24 hours.
 
[59:29.120 --> 59:30.120]  Exactly.
 
[59:30.120 --> 59:31.120]  It kind of is 24 hours.
 
[59:31.120 --> 59:33.120]  It goes a little shorter, a little longer.
 
[59:33.120 --> 59:35.120]  It's like right around 24 hours.
 
[59:35.120 --> 59:38.120]  24 hours should be the average.
 
[59:38.120 --> 59:43.120]  But it varies because you've got the interior of the earth kind of roiling around.
 
[59:43.120 --> 59:45.120]  You've got seismic activity.
 
[59:45.120 --> 59:49.120]  You've got the wind, the wind running across the surface of the earth and causing
 
[59:49.120 --> 59:51.120]  friction, pushing against mountains.
 
[59:51.120 --> 59:56.120]  All those things conspire to make the day, you know, slower and faster than 24 hours.
 
[59:56.120 --> 01:00:02.120]  But if you look at it over many decades, what you find is that the average is
 
[01:00:02.120 --> 01:00:06.120]  about 24 hours and 0.001 seconds.
 
[01:00:06.120 --> 01:00:08.120]  So somebody asks you, how long is a day?
 
[01:00:08.120 --> 01:00:12.120]  You say 24 hours and 0.001 seconds, because that would be more accurate,
 
[01:00:12.120 --> 01:00:13.120]  a little bit more accurate.
 
[01:00:13.120 --> 01:00:17.120]  But the problem here is that we have two ways to tell time.
 
[01:00:17.120 --> 01:00:19.120]  Really, we have atomic time, which is extremely accurate.
 
[01:00:19.120 --> 01:00:20.120]  And there's solar time.
 
[01:00:20.120 --> 01:00:25.120]  And every day, if the earth is a little bit slower, a little bit faster, it notches up
 
[01:00:25.120 --> 01:00:27.120]  and it diverges from atomic time.
 
[01:00:27.120 --> 01:00:32.120]  And after a while, you can't get beyond this, which is about, I don't know, 10 seconds.
 
[01:00:32.120 --> 01:00:34.120]  They don't want to get beyond that, whatever that is.
 
[01:00:34.120 --> 01:00:36.120]  So they throw in a leap second.
 
[01:00:36.120 --> 01:00:37.120]  That's what a leap second is.
 
[01:00:37.120 --> 01:00:41.120]  A leap second isn't because, oh, the earth is slowing and slowing and slowing and we
 
[01:00:41.120 --> 01:00:42.120]  need to throw in a second.
 
[01:00:42.120 --> 01:00:46.120]  It's because of that divergence between atomic time and solar time.
 
[01:00:46.120 --> 01:00:47.120]  That's what a leap second is.
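
''A rough sketch of the bookkeeping described above (not from the episode audio): by convention the difference between atomic time (UTC) and Earth-rotation time (UT1) is kept under 0.9 seconds, and if the mean day runs about a millisecond long the offset grows by roughly a third of a second per year, which is why 27 leap seconds have been added since 1972. The millisecond figure below is an assumed round number.''

<syntaxhighlight lang="python">
# Ballpark: how fast the UT1-UTC offset accumulates if the average day runs
# slightly longer than 86,400 SI seconds. The excess is an assumed round number.
excess_ms_per_day = 1.0
drift_s_per_year = excess_ms_per_day / 1000 * 365.25     # ~0.37 s per year
years_until_leap = 0.9 / drift_s_per_year                # 0.9 s is the tolerance
print(f"~{drift_s_per_year:.2f} s/year of drift; a leap second every "
      f"~{years_until_leap:.1f} years at this rate")
</syntaxhighlight>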
 
[01:00:47.120 --> 01:00:51.120]  So but why is there this general average of slowing the earth?
 
[01:00:51.120 --> 01:00:52.120]  There's a bunch of reasons.
 
[01:00:52.120 --> 01:00:55.120]  The main and most fascinating one for me is tidal braking.
 
[01:00:55.120 --> 01:00:59.120]  It's because that damn moon, the moon is doing it, is doing it towards the end.
 
[01:00:59.120 --> 01:01:01.120]  The tides, it's happening because of the tides.
 
[01:01:01.120 --> 01:01:04.120]  So stealing our angular momentum.
 
[01:01:04.120 --> 01:01:05.120]  Exactly.
 
[01:01:05.120 --> 01:01:06.120]  Exactly.
 
[01:01:06.120 --> 01:01:10.120]  Because of the way the earth is rotating and the bulges created by the tides,
 
[01:01:10.120 --> 01:01:14.120]  the moon is pulling on that bulge, which actually causes friction on the earth,
 
[01:01:14.120 --> 01:01:17.120]  which slows the earth, making our days longer.
 
[01:01:17.120 --> 01:01:21.120]  And the moon is stealing our rotational energy, our angular momentum, because that's got to
 
[01:01:21.120 --> 01:01:22.120]  be conserved.
 
[01:01:22.120 --> 01:01:25.120]  And that's going into a higher orbit and getting farther and farther and farther away.
 
[01:01:25.120 --> 01:01:30.120]  And eventually, if the solar system lasts long enough, which it won't, it will get so
 
[01:01:30.120 --> 01:01:33.120]  far away that we'll be facing each other.
 
[01:01:33.120 --> 01:01:36.120]  The Earth and the moon will be facing each other, will be tidally locked like the moon
 
[01:01:36.120 --> 01:01:37.120]  is to us right now.
 
[01:01:37.120 --> 01:01:41.120]  So that's just the interesting aside of why the earth is slowing.
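
''A rough sketch of the "stealing our angular momentum" bookkeeping (not from the episode audio): treating the Moon's orbit as circular, the total''

<math>L \approx I_E \omega_E + m \sqrt{G M_E a}</math>

''is nearly conserved, where <math>I_E</math> and <math>\omega_E</math> are Earth's moment of inertia and spin rate, <math>m</math> is the Moon's mass, <math>M_E</math> is Earth's mass, and <math>a</math> is the orbital radius. So as tidal friction lowers <math>\omega_E</math>, the orbital radius <math>a</math> has to grow. The measured trends match: the Moon recedes by roughly 3.8 cm per year, and the day lengthens by a couple of milliseconds per century on long-term average.''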
 
[01:01:41.120 --> 01:01:46.120]  When and if that ever happens, does that mean that one side of the earth would be getting
 
[01:01:46.120 --> 01:01:48.120]  sun and the other side will not be getting sun?
 
[01:01:48.120 --> 01:01:50.120]  No, it's all about the orientation of the Earth and the Moon.
 
[01:01:50.120 --> 01:01:51.120]  Right.
 
[01:01:51.120 --> 01:01:52.120]  It's not tidally locked to the sun.
 
[01:01:52.120 --> 01:01:53.120]  It's tidally locked to the moon.
 
[01:01:53.120 --> 01:01:54.120]  Right.
 
[01:01:54.120 --> 01:01:55.120]  Now, if we were like...
 
[01:01:55.120 --> 01:01:57.120]  Would the whole thing rotate, basically?
 
[01:01:57.120 --> 01:01:58.120]  Yes.
 
[01:01:58.120 --> 01:02:00.120]  We would always be facing each other.
 
[01:02:00.120 --> 01:02:01.120]  Our orbit would be like this.
 
[01:02:01.120 --> 01:02:04.120]  Instead of now, where the moon is locked and the earth is rotating.
 
[01:02:04.120 --> 01:02:07.120]  So some side of the earth will see the moon always and the other side will never see
 
[01:02:07.120 --> 01:02:08.120]  the moon.
 
[01:02:08.120 --> 01:02:11.120]  But that wouldn't happen because we're going to burn up before we get to that point, I
 
[01:02:11.120 --> 01:02:12.120]  believe.
 
[01:02:12.120 --> 01:02:13.120]  Oh, thank you.
 
[01:02:13.120 --> 01:02:14.120]  Perfect.
 
[01:02:14.120 --> 01:02:16.120]  But there are planets that have been tidally locked to their sun because they're very big
 
[01:02:16.120 --> 01:02:19.120]  and they're very close to their parent star.
 
[01:02:19.120 --> 01:02:22.120]  So the tidal forces are strong enough to tidally lock that.
 
[01:02:22.120 --> 01:02:28.120]  But 2020, 2021, and 2022 were a little bit different.
 
[01:02:28.120 --> 01:02:35.120]  And it wasn't just because of that damn pandemic because these were the shortest days ever
 
[01:02:35.120 --> 01:02:36.120]  recorded.
 
[01:02:36.120 --> 01:02:41.120]  2020 had 28 of the shortest days ever recorded since 1960.
 
[01:02:41.120 --> 01:02:42.120]  What?
 
[01:02:42.120 --> 01:02:43.120]  28 days.
 
[01:02:43.120 --> 01:02:44.120]  Why?
 
[01:02:44.120 --> 01:02:49.120]  2021 also had a plethora of very, very short days.
 
[01:02:49.120 --> 01:02:53.120]  No dramatic records were broken in 2021, but they were still very, very short.
 
[01:02:53.120 --> 01:02:59.120]  Oh, and 2020, I think we all can agree that if the days in 2020 were shorter, that's a
 
[01:02:59.120 --> 01:03:03.120]  good thing because that year needed to be shorter than it was.
 
[01:03:03.120 --> 01:03:05.120]  Literally the only good thing is this.
 
[01:03:05.120 --> 01:03:06.120]  Right.
 
[01:03:06.120 --> 01:03:07.120]  So 2022, we're not even done with it.
 
[01:03:07.120 --> 01:03:09.120]  We've already broken some good records.
 
[01:03:09.120 --> 01:03:14.120]  June 22nd was 1.59 milliseconds shorter than 24 hours.
 
[01:03:14.120 --> 01:03:15.120]  Holy shit.
 
[01:03:15.120 --> 01:03:16.120]  The shortest day.
 
[01:03:16.120 --> 01:03:17.120]  Is that a lot?
 
[01:03:17.120 --> 01:03:23.120]  It's not an absolute a lot, but relative to history, it is a lot.
 
[01:03:23.120 --> 01:03:24.120]  1.59 milliseconds.
 
[01:03:24.120 --> 01:03:27.120]  The short day, shortest day ever recorded, ever recorded.
 
[01:03:27.120 --> 01:03:31.120]  And then in July, we had a day that was the second shortest.
 
[01:03:31.120 --> 01:03:33.120]  So something's happening.
 
[01:03:33.120 --> 01:03:40.120]  So why do we have three years where the average day was less than 24 hours when over the past
 
[01:03:40.120 --> 01:03:46.120]  30, 40, 50, 60, 70 years, the average day has been a little bit longer than 24 hours?
 
[01:03:46.120 --> 01:03:47.120]  Why?
 
[01:03:47.120 --> 01:03:48.120]  What's going on?
 
[01:03:48.120 --> 01:03:49.120]  Well, we're not sure.
 
[01:03:49.120 --> 01:03:52.120]  We're not sure exactly, but there's lots, of course, there's lots of scientists and
 
[01:03:52.120 --> 01:03:53.120]  their theories.
 
[01:03:53.120 --> 01:03:55.120]  They have got lots of ideas of why.
 
[01:03:55.120 --> 01:04:00.120]  One idea is that glaciers are melting and basically the poles don't have as much mass
 
[01:04:00.120 --> 01:04:03.120]  or weight by them as they used to.
 
[01:04:03.120 --> 01:04:04.120]  That's one idea that may be contributing.
 
[01:04:04.120 --> 01:04:06.120]  So is that like a skater pulling in their arms?
 
[01:04:06.120 --> 01:04:07.120]  Right.
 
[01:04:07.120 --> 01:04:08.120]  Yes.
 
[01:04:08.120 --> 01:04:10.120]  Distribution of mass, like the skater pulling in their arms to go faster.
 
[01:04:10.120 --> 01:04:12.120]  That's definitely related.
 
[01:04:12.120 --> 01:04:16.120]  And related to that, Steve, is also another idea of why we're getting the speed up is
 
[01:04:16.120 --> 01:04:22.120]  the different movements of the molten core of the planet that could address that and
 
[01:04:22.120 --> 01:04:23.120]  speed up the Earth.
 
[01:04:23.120 --> 01:04:27.120]  Seismic activity is another option that they throw out.
 
[01:04:27.120 --> 01:04:32.120]  My theory is that it's the sheer mass of meatballs at Jay's house that is kind of screwing with
 
[01:04:32.120 --> 01:04:33.120]  our rotation.
 
[01:04:33.120 --> 01:04:34.120]  That would do it.
 
[01:04:34.120 --> 01:04:35.120]  Jay, I'm telling you, man.
 
[01:04:35.120 --> 01:04:37.120]  I've got two scientists that agree with that with me.
 
[01:04:37.120 --> 01:04:43.120]  But a lot of scientists will also throw out there the Chandler wobble as one potential
 
[01:04:43.120 --> 01:04:44.120]  reason why the Earth is speeding up.
 
[01:04:44.120 --> 01:04:45.120]  Is that a dance?
 
[01:04:45.120 --> 01:04:46.120]  What is it?
 
[01:04:46.120 --> 01:04:47.120]  The Friends thing?
 
[01:04:47.120 --> 01:04:48.120]  Yes.
 
[01:04:48.120 --> 01:04:49.120]  That's the joke.
 
[01:04:49.120 --> 01:04:52.120]  And I couldn't think of a really, really good version of that joke.
 
[01:04:52.120 --> 01:04:54.120]  But I'll just describe what it is.
 
[01:04:54.120 --> 01:04:57.120]  It's essentially the varying wobble of Earth's axis of rotation.
 
[01:04:57.120 --> 01:04:58.120]  It's actually kind of complicated.
 
[01:04:58.120 --> 01:05:02.120]  I'm trying to really wrap my head around what's exactly going on with this Chandler wobble.
 
[01:05:02.120 --> 01:05:07.120]  But it's the axis of rotation that varies, causing a shorter term wobble.
 
[01:05:07.120 --> 01:05:09.120]  So that's as much as I'll say about the Chandler wobble.
 
[01:05:09.120 --> 01:05:10.120]  Okay, so what does this mean?
 
[01:05:10.120 --> 01:05:11.120]  What's going to happen?
 
[01:05:11.120 --> 01:05:13.120]  What are some really bad things?
 
[01:05:13.120 --> 01:05:16.120]  Okay, it's the leap second that could be concerning here.
 
[01:05:16.120 --> 01:05:17.120]  Because we've had leap seconds.
 
[01:05:17.120 --> 01:05:22.120]  We've had plenty of leap seconds where you add an extra second to coordinated universal
 
[01:05:22.120 --> 01:05:23.120]  time.
 
[01:05:23.120 --> 01:05:24.120]  And that's been done.
 
[01:05:24.120 --> 01:05:26.120]  Nobody really thinks about it anymore.
 
[01:05:26.120 --> 01:05:28.120]  But it's problematic.
 
[01:05:28.120 --> 01:05:32.120]  In 2012, Reddit was taken down because a leap second was added that year.
 
[01:05:32.120 --> 01:05:33.120]  Wow.
 
[01:05:33.120 --> 01:05:39.120]  And if I was into Reddit then as I am now, I would have been pissed if Reddit went down.
 
[01:05:39.120 --> 01:05:40.120]  But they've done tricks.
 
[01:05:40.120 --> 01:05:43.120]  They've got something called leap smearing, where they take microsecond slowdowns.
 
[01:05:43.120 --> 01:05:44.120]  They need a rebrand.
 
[01:05:44.120 --> 01:05:45.120]  Yes.
 
[01:05:45.120 --> 01:05:52.120]  In the course of a day, they might do microsecond slowdowns leading up to the leap second.
 
[01:05:52.120 --> 01:05:55.120]  So that makes it a little more palatable, I guess.
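
''A minimal sketch of the "leap smearing" idea just described (not from the episode audio): spread the extra second evenly over a long window so no clock ever shows 23:59:60. The 24-hour window here is an assumption, matching the linear smear Google has publicly described.''

<syntaxhighlight lang="python">
# Minimal sketch of a linear leap smear: one extra second spread evenly over
# an assumed 24-hour window, so smeared time never jumps.
WINDOW_S = 86_400   # assumed smear window: 24 hours
LEAP_S = 1.0        # one positive leap second to absorb

def smear_offset(elapsed_s: float) -> float:
    """Seconds of the leap second absorbed after `elapsed_s` seconds of the window."""
    frac = min(max(elapsed_s / WINDOW_S, 0.0), 1.0)
    return LEAP_S * frac

print(smear_offset(43_200))   # halfway through the window: 0.5 s absorbed
</syntaxhighlight>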
 
[01:05:55.120 --> 01:05:56.120]  Bob, but wait.
 
[01:05:56.120 --> 01:05:58.120]  I hate to cut in.
 
[01:05:58.120 --> 01:06:03.120]  But why does a fraction of a second matter in the world?
 
[01:06:03.120 --> 01:06:05.120]  Well, it's not a fraction of a second.
 
[01:06:05.120 --> 01:06:06.120]  It's a full second.
 
[01:06:06.120 --> 01:06:07.120]  I mean, think about it, Jay.
 
[01:06:07.120 --> 01:06:12.120]  I mean, a second is small, but it's important.
 
[01:06:12.120 --> 01:06:17.120]  And computer systems and GPS and satellites, lots of things are interrelated.
 
[01:06:17.120 --> 01:06:18.120]  And it took down Reddit.
 
[01:06:18.120 --> 01:06:20.120]  I mean, this can happen.
 
[01:06:20.120 --> 01:06:26.120]  Y2K is kind of a related example of when you mess with something so fundamental.
 
[01:06:26.120 --> 01:06:30.120]  And I'll go into it in a little bit more detail in one second, Jay.
 
[01:06:30.120 --> 01:06:33.120]  So a normal leap second can be problematic.
 
[01:06:33.120 --> 01:06:36.120]  Perhaps it's not as problematic as it was.
 
[01:06:36.120 --> 01:06:40.120]  But a negative leap second, if the Earth keeps spinning faster and faster,
 
[01:06:40.120 --> 01:06:46.120]  or if we maintain this average of days shorter than 24 hours,
 
[01:06:46.120 --> 01:06:50.120]  then we may need to add a negative leap second.
 
[01:06:50.120 --> 01:06:53.120]  And that's much more problematic than a regular leap second,
 
[01:06:53.120 --> 01:06:55.120]  where you're skipping one second.
 
[01:06:55.120 --> 01:07:01.120]  It's tougher to do and more risky than adding a second for various technical reasons.
 
[01:07:01.120 --> 01:07:02.120]  For example...
 
[01:07:02.120 --> 01:07:03.120]  This is going into the future.
 
[01:07:03.120 --> 01:07:05.120]  This really sounds like a time travel episode.
 
[01:07:05.120 --> 01:07:06.120]  Yeah, right?
 
[01:07:06.120 --> 01:07:09.120]  But smartphones, computers, communication systems,
 
[01:07:09.120 --> 01:07:13.120]  they synchronize using something called a network time protocol.
 
[01:07:13.120 --> 01:07:17.120]  And that network time protocol is based on the number of seconds
 
[01:07:17.120 --> 01:07:20.120]  that have transpired since January 1st, 1970.
 
[01:07:20.120 --> 01:07:24.120]  So you throw out a second there and things can go a little wonky.
 
[01:07:24.120 --> 01:07:25.120]  So that's a little concerning.
 
[01:07:25.120 --> 01:07:28.120]  It can cause some issues with these systems.
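
''A small clarification and illustration (not from the episode audio): Unix/POSIX time, the count most computer clocks actually compare, is seconds since 1970-01-01 00:00 UTC with every day treated as exactly 86,400 seconds, so leap seconds are absorbed by repeating or smearing a second rather than being counted; that is why inserting (or, worse, removing) one can wrong-foot software. Strictly speaking, the NTP protocol's own timestamps use a 1900 epoch; the 1970 epoch is Unix time.''

<syntaxhighlight lang="python">
from datetime import datetime, timezone

# Unix/POSIX time: seconds since 1970-01-01T00:00:00 UTC, with every day
# treated as exactly 86,400 seconds (leap seconds are not counted).
ts = 1_000_000_000
print(datetime.fromtimestamp(ts, tz=timezone.utc))
# prints 2001-09-09 01:46:40+00:00, the billionth second of the Unix era
</syntaxhighlight>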
 
[01:07:28.120 --> 01:07:30.120]  Also, there's GPS satellites.
 
[01:07:30.120 --> 01:07:33.120]  GPS satellites don't account for rotation.
 
[01:07:33.120 --> 01:07:35.120]  They're not really built to deal with rotation.
 
[01:07:35.120 --> 01:07:37.120]  So if the Earth is spinning faster,
 
[01:07:37.120 --> 01:07:43.120]  the GPS satellite will all of a sudden be over a specific area a little earlier
 
[01:07:43.120 --> 01:07:45.120]  than it would have been previously.
 
[01:07:45.120 --> 01:07:46.120]  And that could mean the difference,
 
[01:07:46.120 --> 01:07:50.120]  even if the Earth sped up by a half a millisecond,
 
[01:07:50.120 --> 01:07:54.120]  it could be 10 inches or 26 centimeters off.
 
[01:07:54.120 --> 01:07:56.120]  And that would compound.
 
[01:07:56.120 --> 01:07:59.120]  And eventually the GPS satellites could be essentially useless
 
[01:07:59.120 --> 01:08:02.120]  if we don't do anything, which we probably will.
 
[01:08:02.120 --> 01:08:05.120]  I mean, it's not like, oh my God, GPS is going to be worthless.
 
[01:08:05.120 --> 01:08:08.120]  And when you say do something, you're like, we've got to program this problem.
 
[01:08:08.120 --> 01:08:11.120]  Yeah, I'm not sure what level of effort would be required,
 
[01:08:11.120 --> 01:08:13.120]  but I'm sure it's not going to be trivial.
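
''A back-of-the-envelope check on the "half a millisecond, about 10 inches" figure above (not from the episode audio): a point on the equator moves at roughly 465 m/s, so half a millisecond of unmodelled rotation corresponds to a couple of tenths of a metre of ground shift, the same order as the figure quoted. The constants are standard values; the half-millisecond offset is the assumed example.''

<syntaxhighlight lang="python">
# Ground displacement at the equator from a small unmodelled timing offset.
EQUATOR_M = 40_075_000      # Earth's equatorial circumference, metres
SIDEREAL_DAY_S = 86_164.1   # one rotation relative to the stars, seconds

surface_speed = EQUATOR_M / SIDEREAL_DAY_S   # ~465 m/s
offset_s = 0.0005                            # assumed: half a millisecond
shift_m = surface_speed * offset_s
print(f"{shift_m:.2f} m (~{shift_m / 0.0254:.0f} inches)")   # ~0.23 m, ~9 in
</syntaxhighlight>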
 
[01:08:13.120 --> 01:08:16.120]  So some people say that this is going to be over soon
 
[01:08:16.120 --> 01:08:23.120]  and this increased rotation speed of the Earth isn't going to necessarily stay this way for years.
 
[01:08:23.120 --> 01:08:28.120]  Some people are saying this could be the beginning of a 50-year scenario
 
[01:08:28.120 --> 01:08:32.120]  where the Earth is spinning faster and days are shorter than 24 hours.
 
[01:08:32.120 --> 01:08:35.120]  And we may absolutely need to throw in some of these negative leap seconds,
 
[01:08:35.120 --> 01:08:37.120]  which could cause some problems.
 
[01:08:37.120 --> 01:08:39.120]  So that's the story.
 
[01:08:39.120 --> 01:08:40.120]  It's interesting.
 
[01:08:40.120 --> 01:08:42.120]  I'm not too worried about it.
 
[01:08:42.120 --> 01:08:45.120]  But we'll see if some negative leap seconds get thrown in there,
 
[01:08:45.120 --> 01:08:53.120]  and we might find out by the end of this year or the following year if this keeps up.
 
[01:08:53.120 --> 01:08:55.120]  So, Bob, are you angry about all this?
 
[01:08:55.120 --> 01:08:56.120]  No.
 
[01:08:56.120 --> 01:09:00.120]  It was just interesting research.
 
[01:09:00.120 --> 01:09:03.120]  It was actually tough.
 
[01:09:03.120 --> 01:09:04.120]  I'm answering.
 
[01:09:04.120 --> 01:09:08.120]  It was tough to really get to fully understand all the nuances here,
 
[01:09:08.120 --> 01:09:11.120]  because you've got sidereal day, solar day, mean solar day,
 
[01:09:11.120 --> 01:09:16.120]  all these things, and different websites had different takes on exactly what those mean.
 
[01:09:16.120 --> 01:09:21.120]  And it was interesting to put it all together and understand exactly what was happening.
 
[01:09:21.120 --> 01:09:22.120]  So, yeah, I enjoyed this.
 
[01:09:22.120 --> 01:09:25.120]  A great bar bet that we were talking about when we were talking about this before.
 
[01:09:25.120 --> 01:09:29.120]  So, Andrea, how many times does the Earth rotate on its axis in one year?
 
[01:09:29.120 --> 01:09:31.120]  365 and a quarter, isn't that it?
 
[01:09:31.120 --> 01:09:32.120]  Wrong.
 
[01:09:32.120 --> 01:09:33.120]  Oh.
 
[01:09:33.120 --> 01:09:41.120]  366 and a quarter, because in going around the sun, it's got to rotate one extra time.
 
[01:09:41.120 --> 01:09:46.120]  A day, you know, one day is a full rotation plus a degree,
 
[01:09:46.120 --> 01:09:50.120]  a full rotation plus a degree, and it adds up over a year to a whole other rotation.
 
[01:09:50.120 --> 01:09:51.120]  Right.
 
[01:09:51.120 --> 01:09:55.120]  361 degrees is the mean solar day, 24 hours.
 
[01:09:55.120 --> 01:09:57.120]  A sidereal day is...
 
[01:09:57.120 --> 01:09:59.120]  23 hours and 56 minutes.
 
[01:09:59.120 --> 01:10:00.120]  Exactly.
 
[01:10:00.120 --> 01:10:01.120]  Wow.
 
[01:10:01.120 --> 01:10:02.120]  23 hours and 56 minutes.
 
[01:10:02.120 --> 01:10:03.120]  It's four minutes.
 
[01:10:03.120 --> 01:10:05.120]  But there's also lots of variations.
 
[01:10:05.120 --> 01:10:08.120]  You're going to leave work early and be like, I'm on a sidereal day.
 
[01:10:08.120 --> 01:10:10.120]  That is such a skeptic thing.
 
[01:10:10.120 --> 01:10:11.120]  Like, wrong.
 
[01:10:11.120 --> 01:10:12.120]  365.
 
[01:10:12.120 --> 01:10:13.120]  You know what I mean?
 
[01:10:13.120 --> 01:10:14.120]  Come on.
 
[01:10:14.120 --> 01:10:15.120]  Yeah.
 
[01:10:15.120 --> 01:10:21.120]  But also, the other nuance is that the day varies depending on where you are in the orbit
 
[01:10:21.120 --> 01:10:25.120]  and what season it is and the tilt of the Earth.
 
[01:10:25.120 --> 01:10:29.120]  There's so many little factors that go in here to make it extra confusing.
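
''The bar-bet arithmetic above, worked out (not from the episode audio): over a year the Earth turns once more with respect to the stars (about 366.25 times) than it has solar days (about 365.25), so a sidereal day is the 24-hour solar day scaled by 365.25/366.25, which comes out to about 23 hours 56 minutes 4 seconds, the "four minutes" mentioned above.''

<syntaxhighlight lang="python">
# Bar-bet arithmetic: the sidereal day derived from the mean solar day.
SOLAR_DAY_S = 24 * 3600
sidereal_day_s = SOLAR_DAY_S * 365.25 / 366.25
h, rem = divmod(sidereal_day_s, 3600)
m, s = divmod(rem, 60)
print(f"sidereal day ~ {int(h)} h {int(m)} min {s:.0f} s")   # ~ 23 h 56 min 4 s
</syntaxhighlight>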
 
[01:10:29.120 --> 01:10:35.120]  So can't we help by having a party somewhere on Earth that will slow the rotation down?
 
[01:10:35.120 --> 01:10:37.120]  There must be some human configuration that we could do.
 
[01:10:37.120 --> 01:10:39.120]  We all go to the North Pole at the same time.
 
[01:10:39.120 --> 01:10:44.120]  We all have to jump at the same time so that we can alleviate the pressure.
 
[01:10:44.120 --> 01:10:45.120]  It would be like Earth.
 
[01:10:45.120 --> 01:10:46.120]  It would be like in an elevator.
 
[01:10:46.120 --> 01:10:49.120]  Andrea, it would be like an 80s movie, like the end of an 80s movie where we all jump.


[01:10:49.120 --> 01:10:50.120]  Yeah.
{{anchor|sof}}
{{anchor|theme}} <!-- leave these anchors directly above the corresponding section that follows -->
== Science or Fiction <small>(1:11:11)</small> ==
<!--

Revision as of 02:20, 10 November 2022

This episode is in the middle of being transcribed by Hearmepurr (talk) as of 2022-11-09.
To help avoid duplication, please do not transcribe this episode while this message is displayed.
This episode was transcribed by the Google Web Speech API Demonstration (or another automatic method) and therefore will require careful proof-reading.
This transcript is not finished. Please help us finish it!
Add a Transcribing template to the top of this transcript before you start so that we don't duplicate your efforts.

You can use this outline to help structure the transcription.

SGU Episode 893
August 20th 2022

Alex Jones' lawyer accidentally sent two years' worth of texts to plaintiffs' lawyers

SGU 892                      SGU 894

Skeptical Rogues
S: Steven Novella

B: Bob Novella

J: Jay Novella

E: Evan Bernstein

Guests

AJR: Andrea Jones-Rooy,
political, social, and data scientist

KB: Kelly Burke, from Guerrilla Skeptics

GH: George Hrab, NECSS emcee

IC: Ian Callanan, SGU tech guru

Quote of the Week

An educated person is one who has learned that information almost always turns out to be at best incomplete and very often false, misleading, fictitious, mendacious – just dead wrong.

Russell Baker, American journalist

Links
Download Podcast
Show Notes
Forum Discussion

Introduction, Live from NECSS, Book Update

  • Perry DeAngelis Memorial Episode

Voice-over: You're listening to the Skeptics' Guide to the Universe, your escape to reality.

S: Hello and welcome to the Skeptics' Guide to the Universe. This is your host, Steven Novella and today is August 6th, 2022. Joining me this week are Bob Novella...

B: Hey, everybody!

S: Jay Novella...

J: Hey guys.

S: ...and Evan Bernstein.

E: Hello everyone.

S: And we have two in-studio guests, Kelly Burke.

KB: Hello.

S: Kelly, welcome to the SGU. This is your first time on the show.

KB: It is.

S: And Andrea Jones-Rooy. Andrea, welcome back to the SGU.

AJR: Hello. Thank you for having me.

S: Thank you all for joining me in the studio. You were here live, so we had to have you on the show. Now Cara was going to join us for this episode, she was going to join us remotely, but as you remember, she had surgery not too long ago and she's going through a bit of a rough patch. She did want us to say that Cara does struggle with depression and she's having a depressive episode partly due to hormones and the surgery and everything that's going on. She's dealing with it, but that has to be her priority this weekend to deal with that. And so she decided to not do the show. So we wish her well, she'll be back for next week's show. But we have six people instead of five to make up for it. So as you all know, this episode, every year, this is our Perry DeAngelis Memorial Episode and before it was NECSS, it was just the Perry DeAngelis Memorial SGU episode, then it basically morphed into NECSS and we kept it as the episode where we remember our lost rogue, Perry. Damn, that was 15 years ago, guys.

E: Oh my gosh.

S: He was with us for two years and it's been 15 years since he was on the show. It's just unbelievable. And of course, we remember many of the friends that we lost along the way, David Young, Michael Oreticelli, all lost too young, too soon, all really, really good friends of the SGU. So we like to remember them every year. Okay. So as you all know.

J: I have an announcement from our book publisher, if you don't mind.

S: Oh yes, go ahead, Jay. The actual, physical book.

J: That book is the result of an incredible amount of work. So this book is about, it's about science, it's about the history of science and it's about making predictions on future technology. Historically and modern day. And we had a lot of fun writing the book. It was really intense, but it's an archive now of incredible information about predictions that were made in the past, predictions that were made five, 10 years ago, predictions that are made today. We also wrote some science fiction for this to illustrate some interesting future concepts of technology.

S: Yeah. What I like is that it's also a time capsule. It's like our own time capsule for the future. So future generations can look back and see how we did. Just like we are looking back at the past future and see how they did.

B: I hope they don't laugh at us the way we've been laughing at them. Yeah.

AJR: We were talking about the Jetsons earlier.

E: Happy birthday, George.

J: If you go to skepticsguidetothefuturebook.com, is that right? skepticsguidetothefuturebook.com. And you fill out the form there and you put in the secret password, which is the word "future". Don't tell anybody. It's a secret.

AJR: That's clever.

S: [inaudible] come up with that.

J: And you will be entered in to a giveaway of the very first signed copy of the book.

B: Wow.

J: So please go to skepticsguidetothefuturebook.com and the secret password, George, what's that secret password?

GH: Flabbing garbage.

J: Flabbing garbage. Or spelled as in non-George language, "future". The word is "future".

S: All right. So this slide is just to remind everybody that the theme of NECSS 2022, the 14th NECSS is the misinformation apocalypse. You've had a lot of talk so far about it. That theme might crop up on the SGU show this weekend. But we've tried to focus on the positive, right guys? We don't just want to say how bad it is. We want to focus on what you can do about it. And there's a lot of things you can do about it. All the things that we try to do every week. Understand science better and be more critical thinking, be more humble, know how to communicate, be more positive when you communicate, understand how to access information over media, how that all works. We'll hit some of those themes during the show today as well.

J: Well, it's always awesome as one of the people that organize NECSS, we talk to the speakers, but we don't get an incredible amount of details about what their talk is going to be. Because we're just trusting professionals. There's some conversation, but it's not detailed.

S: Always a bit of a throw of the dice.

J: It has to be. It has to be the people and not the talk, basically. So when we get to hear the talk and we get to see how it folds into our theme and what information that they cover, it's always a fun discovery. Oh my god, that's cool. It's more relevant than I thought, or they went into an area I didn't expect them to go. I thought your talk this morning was a lot of fun that you had with Richard Wiseman.

S: Oh yeah. Richard Wiseman's─

AJR: That was really cool.

S: ─very easy to talk to about stuff like that.

J: Definitely.

S: Always a pleasure. Andrea, you just gave your talk on political science and misinformation, which is obviously a huge intersection there. So there's still a lot to learn. I think after doing this for 17 years, it doesn't amaze me, but it's always fascinating how much we still have to learn about something that we've been knee-deep in for a quarter of a century. We've been doing this skepticism since 96, 26 years, the podcast for 17 of those years. But there's just so much depth, and it's getting deeper, which is the good thing, is that this is not a static field. It's dynamic. We are actually learning more, and we have to update ourselves.

E: Constantly.

J: Well, the world, I mean, look at what's happened since this podcast began. Look how much things have changed, like you were talking about time capsules. Just take a look at what happened in science, the skeptical community itself, and in politics. The world has changed so much, and we're running to try to keep up with it, if anything. It's not an easy thing to do.

S: Yeah we're focused on almost entirely different issues now than we were at the start. The things that are important or critical have been constantly evolving. And then some things come back and stay the same. We're doing UFOs again. Really, we're all at the same point. So some things are different, some things are the same. It's always interesting.

B: I'm just waiting for Bigfoot to become relevant again. How long before Bigfoot is like, we're talking about Bigfoot again?

S: Yeah, that's one of those eternal ones that never goes away.

AJR: I saw a Loch Ness Monster post on Twitter recently.

E: Oh, yeah.

AJR: New evidence.

E: Loch Ness Monster.

J: New evidence. I love that. New evidence.

E: Never goes away.

J: New blurry photos.

S: Even after it's been definitively, The guy confessed, yeah, that was me. I made the first.

J: Steve, that's an admission that there's an Illuminati. That's what that is. Come on, man.

Special Segment: Chorizo Hoax (7:09)

S: All right. Let's get to some fun stuff. What do you guys think that is? I'm showing a picture on the screen there.

J: That is a meeple. I mean, whoever made it needs a little help, but we'll get there.

S: You're close. So a French scientist spread this picture on Twitter, claiming that it was─

AJR: Is it a Loch Ness Monster?

S: ─a close-up photo from it through, did he mention the telescope?

E: James Webb.

S: This is a James Webb close-up of Alpha Centauri.

B: Proxima Centauri.

S: Did he say Proxima?

B: I think he did.

E: I think he said Proxima.

S: Proxima Centauri, the closest star to the Earth. And it was pretty much believed by a lot of people. Turns out that is a picture of essentially a slice of chorizo. (laughter)

J: I love it. Oh, my god. It's so awesome.

AJR: Yeah, we're having chorizo after the conference is over? Because that's great.

S: Can we have some of that?

E: Look at all those solar swirls in that chorizo.

S: He must have looked at it and goes, you know what? This kind of looks like those blurry photos, close-ups of the sun. I wonder how many people I can get to believe that.

J: You know what I love about this? It's not even cropped. That is the shape of the piece of meat. That's it. That's the whole thing. That's great.

S: It is funny. There is a superficial resemblance. He later apologized, I'm not sure he had to, but we talk about this at times. He's a scientist, and he pulled a little prank. I thought it was pretty harmless, and the point was be a little bit more skeptical before you believe things online. But I do agree that it's problematic to have a scientist doing it, because then we're simultaneously saying consider the credentials of the person that you're listening to or that you're getting information from. If people are saying, hey, no, a scientist put this up. This wasn't just some random guy on the internet. This was a scientist saying, but he was pranking us. It may cause more harm than good. Kelly, you are a social media expert. What do you think?

KB: I was going to say, that's actually pretty tricky with the James Webb pictures, too, because I've noticed not all of them are coming from NASA, because the data is just out there and anybody can compile the pictures. So anytime I've seen something presented as a James Webb picture, I have to go and look into it, because it's not coming directly from NASA. So I could totally see why this took off.

B: You may think my knee-jerk reaction is, wait a second, stars are point sources of light. You zoom in as much as you want. You're really not going to see the disk for the most part. That has been true for as long as astronomy has been around, until of course, relatively recently. Now we can zoom in on certain stars, certain giant stars or stars that are close, and we can at least observe some of the disk itself. It's not just a point of light. And I think the number now is 23. 23 stars we have actually seen part of the disk or a little bit of the disk. Sometimes you can even see the convection cells. So it's not an outrageous thing to say I could see the disk of this nearby star.

E: It was not implausible, right.

B: And if you looked at some of them, we found one.

S: Well, yeah, I got it here. I do want to point out before we move off this picture, though, that while that's correct, you can see the grain of the meat. This is an in-focus photo. If he had just blurred it out, it would have been a hundred times more powerful.

KB: If you're looking on your phone, it's really tiny.

E: That could also be a bowling ball for all you know.

S: So this is the closest up picture I could find of Alpha Centauri A and B, including Proxima Centauri. Actually, Proxima C, I always forget. Is that the third star in the system?

B: I just call it Proxima, and they're messing around with the names of these stars.

S: Yeah, but this is Alpha 1 and 2, or A and B. And you can see they're basically point sources of light. You're not seeing really the surface of those stars. There's some flare, lens flare, but you're not seeing the surface. But Bob and I found not the best picture of Alpha Centauri, but just what's the best picture of any star ever, and there you go. And it looks pretty much like a blurry slice of chorizo.

J: It doesn't even look symmetrical, Steve.

S: Why would it?

AJR: It's kind of a fattier chorizo, this one, though, right?

S: So as I said, if you blur out that chorizo slice, you have a pretty good facsimile of a close-up picture of the star. Now this is, what was it, about 520 light years away?

B: Yeah, surprisingly far.

S: But it's a red supergiant, so it's massive.

B: So that helps.

S: Yeah, that helps a lot. Yeah, that was more plausible than I thought when I first saw it.

AJR: I feel like the scary thing is that we're all so worried about misinformation that scientists can't make jokes. It's kind of where we're going to live. Not that this was the best joke of all time, but the idea of a prank is sort of, it feels irresponsible, and it's too bad that that's the case, because it's making science fun and engaging, and you could imagine he could do a fun quiz show, like Cartwheel Galaxy or Lollipop or whatever, right? Fallon could do a segment, but it feels like it would cause more harm than good, which is...

S: Right, unless you're transparent up front. If he did it as, like, you might think this is a close-up star, but it's actually a chorizo. Here's a close-up star, something like that.

KB: That might be a new thing for our social media. Close-up, what is this?

J: This one looks more like a pizza pie, though.

AJR: It's like that gimmick they did forever ago, where they were like, was this a famous painting, or did a gorilla paint this?

J: All right, so real quick, apparently the password that the publisher put up there, the space for the password only takes five characters, so just type in the first five characters of the word future. You can't get good help these days, George.

GH: Futter.

AJR: Futter.

J: I don't understand. The password field actually has a limit to how many characters it takes. How does that even happen?

E: You have to pay more for the six characters.

AJR: Most passwords require it to be way too long these days, and I can't fill it in.

GH: A third book, Jay, maybe you could have six characters.

J: Look, this is what I'll do. I'll call the publisher on Monday, and I'll tell them, forget the password, just whoever entered is going to be legit. So just put your info in there if you want to enter in.

S: All right, let's get to some news items. We have more fun bits coming up later, too, but first a couple news items.

News Items

More Space Debris (23:51)

S: That picture of a giant complex of buildings that I'm showing is the NIH, the National Institutes of Health. They are essentially the main biomedical research funding institution in the United States. They are a creature of Congress, as we like to say. They are created, funded by Congress. Essentially, if you do biomedical research in the U.S., you get your funding from the NIH, more likely than not. They're massively important for medical research. Recently, the NIH created an initiative. It's not a new office or anything. It's just an initiative. They're funding specific groups who are going to create an educational module to teach researchers how to do rigorous science. That sounds pretty good. That sounds pretty good to me.

J: That doesn't already exist, though?

AJR: That's my thought.

S: That's a good question. Right now, how do we teach researchers how to do good research methodology? Some universities may have courses on it. They may be required. They may be elective. They might be a statistics course or a research methodology course. You do get that, but not like, all right, here's how you do really rigorous research. Here's how you avoid p-hacking or how you avoid false positives, etc., etc. Clearly, that is needed for reasons that I've been talking about and writing about for the last 20 years. The other way that people learn that is through, essentially, individual mentorship. You work in somebody's lab, and they teach you how to do research, not only in their specific area, technically, but also just, this is what good science is. But it's not systematic, and it's not thorough enough. Clearly, there's a perception that there is a gap, a gap there. They want to fill that gap. Their goal is to fund the creation of this module to teach rigorous research design and to then make it freely available, basically. And then the hope is, so universities may require it. They might say, all right, if you're going to work at our university, this already happens. I work at Yale, and I have to do 20 different certifications every year on everything, like sexual harassment sensitivity or how not to burn your eyes out or whatever, all of these things.

E: That's a good one.

S: How to treat patients ethically, all good stuff. A lot of safety things all in there. But just adding one that's, here's how not to do fake research. Here's how not to accidentally commit research fraud. Or how to p-hack or whatever. It would be very easy to slip that into the existing system of getting certified for quality control. That's basically what this is. Now, the NIH, of course, they could require, if you apply to the NIH for a research grant, and they're not saying they're going to do this, but imagine if they said, all right, in order to get this grant, you've got to have certification that you took this module and you passed. Because again, they're interested in not wasting money. That's their primary interest. Obviously, they want to do good science. That's their goal. Their mission is to obviously do good science, but they have a finite budget, and they want to make the most use out of that money. That, again, is their mission. One of the biggest wastes in research is bad science. If you publish a study, and it's a false positive, let's say, you think that you have a result, but you did poor methodology, you p-hacked or whatever. You underpowered the study. Or the blinding was inadequate. Or your statistics were off, or whatever. And then other people try to replicate that study, how many millions of dollars could be spent proving that your crappy study was crappy when you could have filtered it out at the beginning by putting in some internal controls that you didn't know you should do? Or by tightening up your research methodology. The other goal here, other than not only doing good science, is to save money by weeding out the inefficiency in the system of fraud. It makes sense, not fraud, but just bad rigor in research design. It makes sense that once these modules are up and running, phase two would be, and you've got to be certified in this before we'll give you any money. So that's one way that you, and again, the NIH already does this for other things, for example, they now require, this has been going on for about 10 or 15 years or so, if you get public money to do your research, you have to make the results of your research available to the public and accessible by the public. You have to say, how are you going to explain your results to the people who are paying for your research, the public. So this would be another way, how can you assure the people who are funding your research that you're not wasting their money by doing rigorous research design? And by the way, here is an educational module, and we could easily connect certification to that. That's awesome. I would like to see big science journals do the same thing. You want to get published in our journal, we require that you have the author, the lead author, or every author has certification. And of course, once either of those happens, like if the NIH says you need to have certification to get grant money, you better believe every university will make sure that it happens. They're not going to have any of their people not be able to get NIH grants. So it's very easy to make this systematic. So again, we're right at the very beginning of this, and everything I'm hearing and seeing is very, very good. We'll keep a close eye on it. And again, a lot of people react like you, Jay. It's really, why isn't this kind of already happening? But that's because I think the main reason is, I would say there's two things. One is people think it is happening, but it's just not happening enough. 
The second one is that the science of doing rigorous science has been getting better. We're learning more and more subtle ways in which studies go awry or that results can be tweaked or researchers can put their thumb on the scale. We talk about researcher degrees of freedom and researcher bias and publication bias and citation bias and all these things that can alter the utility and the rigor and the quality of science and essentially the old method of just relying upon some just here's some classic statistics class. And then whoever's lab you work in, they'll teach you how to do good science. It's just not good enough anymore. It's got to be systematic, and everyone's got to go through it in order to absolutely minimize the waste in the system that comes from poor research design. So this is a massive move in the right direction. This is very, very encouraging.
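
''A toy illustration of the false-positive problem described above (not from the episode audio): if a researcher tests many outcomes and only reports those crossing p < 0.05, the chance of at least one spurious "hit" climbs quickly; with 20 independent tests of true nulls it is already about 64%. The test counts below are assumed examples.''

<syntaxhighlight lang="python">
# Multiple-comparisons arithmetic: probability of at least one p < 0.05
# among k independent tests when every null hypothesis is actually true.
alpha = 0.05
for k in (1, 5, 20, 100):
    p_any = 1 - (1 - alpha) ** k
    print(f"{k:>3} tests -> {p_any:.0%} chance of a spurious 'significant' result")
</syntaxhighlight>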

'''J:''' Steve, where did you learn how to do it?

'''S:''' For me, well, it's been the whole science-based medicine initiative, which is I've been reading about it, following, reading the literature on it for 20 years and writing about it, trying to digest it. That's basically what we explore at science-based medicine is how to do rigorous science. The relationship between science and practice. How do we know what's true, what's not true? Where's the threshold of evidence before something should affect your practice? That's what we do. That's how I learned it. It was all basically just self-taught by reading the literature, talking to my colleagues, writing about it, engaging about it. But most researchers are not spending most of their time, their academic time, doing that. They're doing their research. They're trying to figure out what receptor is causing this disease or whatever. This is sort of part of that, but it's not their focus. That's why it needs to be done systematically. This is also one final word and then we'll move on. Part of a bigger trend that I've noticed, at least in medicine. Andrea, you can tell me if you think it's true in your field as well, that you're going away from the model of just counting on mentorship and counting on that people will learn what they need to learn and moving towards things that are way more systematic, that are verified, and also that there are checks in place rather than just trying to raise the quality by just over-educating people. You just have checks in place to make sure that they do it. Medicine is getting too complicated. Science is getting too complicated to rely upon methods that are not absolutely systematic. Is that something you find in academia from your end?

'''AJR:''' Definitely. I'm thinking about something that I think Jay brought up on a different live a while ago about the movement towards pre-registering your hypotheses. That's another way of just putting the system in place because it turns out we can't rely on everyone to do great science even though we all like to think that we're doing it. Where I thought you were going, Steve, with that was we can't rely exclusively. Well, we still rely on it a lot, but peer review. Peer review is not a perfect process. It's a strong process in a lot of ways and I don't have great ideas about what to do instead, but it's not like it's perfect. A lot of stuff gets through peer review, and so this is something that could help steer people. The only question I'm having, though, is how you could imagine a world where they're sort of methodologically specific. I'm thinking of machine learning where you have issues with overfitting your model. That would be totally irrelevant to someone running an experiment. I don't know what the future would look like. Ten years from now, are there different modules? Do we need different modules?

'''S:''' This is what exists currently in medicine. If I'm doing some quality control certification thing that I do every year, there's the first part of it, which is for everyone or maybe every physician, and then you say what your specialty is. I'm a neurologist. Then you get the neurology-specific stuff. You could do the same thing. Here's the generic rigors that everyone needs to know, and then what are you doing research in? Particle physics? Here's the particle physics part of the module for you for those specific issues. I could absolutely see that working that way.

'''AJR:''' I kind of like the idea of making a bunch of social scientists do the particle physics, just to keep us humble.

'''S:''' Absolutely.

=== More Space Debris <small>(23:51)</small> ===

[23:49.120 --> 23:53.120] Jay, tell us about crap falling from the sky.

[23:53.120 --> 23:57.120] Steve, there's crap, and it's falling from the goddamn sky.

[23:57.120 --> 23:59.120] Oh, my goodness.

[23:59.120 --> 24:06.120] This is about the fact that space agencies around the world

[24:06.120 --> 24:10.120] are not doing a very good job of figuring out

[24:10.120 --> 24:13.120] how to exactly de-orbit pieces of spacecraft

[24:13.120 --> 24:16.120] that are left up there for one reason or another.

[24:16.120 --> 24:20.120] There is a significant number of objects in low Earth orbit.

[24:20.120 --> 24:25.120] NASA tracks anything from 2 inches or 5 centimeters and up,

[24:25.120 --> 24:30.120] and there's 27,000 objects that are being tracked,

[24:30.120 --> 24:36.120] and 70% of the tracked objects are in LEO, low Earth orbit,

[24:36.120 --> 24:39.120] which is the orbit that's basically as close to the Earth

[24:39.120 --> 24:41.120] as you could pretty much get.

[24:41.120 --> 24:43.120] Do they say LEO?

[24:43.120 --> 24:45.120] I've only ever heard LEO.

[24:45.120 --> 24:47.120] I just thought you meant something astrology, Jay,

[24:47.120 --> 24:49.120] and I was like, I can't believe this is happening.

[24:49.120 --> 24:50.120] I've got to go.

[24:50.120 --> 24:52.120] I'm blazing trails here.

[24:52.120 --> 24:54.120] It's low Earth orbit.

[24:54.120 --> 24:57.120] Every one of these objects that are up there

[24:57.120 --> 25:01.120] and that are going to be up there for a long time are hazards.

[25:01.120 --> 25:02.120] They're dangerous.

[25:02.120 --> 25:04.120] They actually have to plan accordingly.

[25:04.120 --> 25:07.120] When anybody launches anything into outer space,

[25:07.120 --> 25:10.120] they have to figure out the right time to do it

[25:10.120 --> 25:13.120] and how to avoid these known objects,

[25:13.120 --> 25:16.120] because one of them could be traveling at such an incredible speed

[25:16.120 --> 25:19.120] in relation to the ship that you're putting up there

[25:19.120 --> 25:20.120] that it could destroy it.

[25:20.120 --> 25:22.120] It could rip right through it.

[25:22.120 --> 25:24.120] So this is a growing issue,

[25:24.120 --> 25:27.120] and we have another issue that is a problem,

[25:27.120 --> 25:31.120] is that there are objects that are being left in low Earth orbit

[25:31.120 --> 25:36.120] that are big, that are slowly de-orbiting over time,

[25:36.120 --> 25:39.120] because there's a tiny, tiny, tiny, tiny bit of atmosphere

[25:39.120 --> 25:41.120] in low Earth orbit,

[25:41.120 --> 25:44.120] and that's just enough to slowly take something out of orbit

[25:44.120 --> 25:46.120] and bring it back down to Earth.

[25:46.120 --> 25:51.120] As an example, China had one of their Long March 5B rockets

[25:51.120 --> 25:53.120] bring something up,

[25:53.120 --> 25:56.120] and a week later, when it came out of orbit,

[25:56.120 --> 25:58.120] because it was only up for a week,

[25:58.120 --> 26:01.120] and by that time there was enough inertia and everything

[26:01.120 --> 26:03.120] to get it back down into the atmosphere,

[26:03.120 --> 26:07.120] pieces of it landed in Malaysia and Indonesia,

[26:07.120 --> 26:10.120] and it landed right near a village where people were living.

[26:10.120 --> 26:12.120] It is a real threat,

[26:12.120 --> 26:15.120] and we're not talking about millions of people getting hurt,

[26:15.120 --> 26:16.120] but it could kill people.

[26:16.120 --> 26:19.120] It could kill handfuls of people now and again,

[26:19.120 --> 26:21.120] which is something that we definitely want to avoid.

[26:21.120 --> 26:23.120] It's also just not good practice.

[26:23.120 --> 26:25.120] It's not keeping your shop clean.

[26:25.120 --> 26:28.120] So getting back to the Long March 5B rocket,

[26:28.120 --> 26:30.120] now this rocket is huge.

[26:30.120 --> 26:32.120] China launched it on July 24th,

[26:32.120 --> 26:35.120] and they were bringing up a new space station module

[26:35.120 --> 26:39.120] to their Tiangong space station, which is a China-only space station.

[26:39.120 --> 26:42.120] It's actually pretty cool, they should read up on it.

[26:42.120 --> 26:46.120] Now this rocket is not designed to de-orbit itself.

[26:46.120 --> 26:48.120] They don't send it up with the ability to do that,

[26:48.120 --> 26:52.120] and in fact, the engines can't even restart after the engines are shut off.

[26:52.120 --> 26:55.120] When it does its main push and gets all that weight up

[26:55.120 --> 26:57.120] to the altitude that they need it to,

[26:57.120 --> 26:59.120] and those engines shut off, they can't go back on.

[26:59.120 --> 27:03.120] This ultimately means that there's no way for China

[27:03.120 --> 27:07.120] to control the de-orbiting of this massive rocket.

[27:07.120 --> 27:10.120] It's just going to fly back into the Earth's atmosphere,

[27:10.120 --> 27:13.120] and I'm not even sure that they know where it's going to end up going.

[27:13.120 --> 27:16.120] I don't even know if there's good physics

[27:16.120 --> 27:19.120] that will really accurately predict where something willy-nilly

[27:19.120 --> 27:23.120] is de-orbiting at some point and coming back into the atmosphere.

[27:23.120 --> 27:27.120] It could end up anywhere, which is the scary part.

[27:27.120 --> 27:31.120] Believe me, I feel completely happy and thrilled and lucky

[27:31.120 --> 27:34.120] that we're alive during a time when space exploration

[27:34.120 --> 27:36.120] is starting to explode again.

[27:36.120 --> 27:37.120] It's a great time.

[27:37.120 --> 27:38.120] Hopefully explode.

[27:38.120 --> 27:40.120] Yeah, you're right.

[27:40.120 --> 27:44.120] When all of these nations are launching new projects,

[27:44.120 --> 27:46.120] how's that? Is that better?

[27:46.120 --> 27:47.120] Better.

[27:47.120 --> 27:52.120] What we don't have right now are proper rules of etiquette.

[27:52.120 --> 27:55.120] There are things that people would like.

[27:55.120 --> 27:59.120] NASA is making it known what information that they would like,

[27:59.120 --> 28:03.120] but in this instance, China didn't share any of the information

[28:03.120 --> 28:06.120] about what trajectory their rocket was on

[28:06.120 --> 28:10.120] and where they think it'll end up coming back into the atmosphere.

[28:10.120 --> 28:13.120] The NASA administrator, by the name of Bill Nelson,

[28:13.120 --> 28:15.120] he said, and I'm quoting him,

[28:15.120 --> 28:18.120] All spacefaring nations should follow established best practices

[28:18.120 --> 28:22.120] and do their part to share this type of information in advance

[28:22.120 --> 28:25.120] to allow reliable predictions of potential debris impact risk,

[28:25.120 --> 28:29.120] especially for heavy-lift vehicles like the Long March 5B,

[28:29.120 --> 28:33.120] which carry a significant risk of loss of life and property.

[28:33.120 --> 28:36.120] Doing so is critical to the responsible use of space

[28:36.120 --> 28:39.120] and to ensure the safety of people here on Earth.

[28:39.120 --> 28:42.120] I wish that I could have found some information on what would have happened

[28:42.120 --> 28:48.120] if one of these pieces of larger debris ended up barreling into a city.

[28:48.120 --> 28:50.120] Could it take a part of a building out?

[28:50.120 --> 28:53.120] What's its velocity? How much mass does it have?

[28:53.120 --> 28:56.120] I do know that SpaceX had a module,

[28:56.120 --> 29:00.120] a piece of debris come back down as recently as July 9th.

[29:00.120 --> 29:03.120] Now, if you look at a picture of the Crew-1 module,

[29:03.120 --> 29:06.120] there is a component that's right underneath it

[29:06.120 --> 29:09.120] that is used to relay electricity to the module and all that,

[29:09.120 --> 29:11.120] but it's also a cargo hold, right?

[29:11.120 --> 29:13.120] A cargo hold that's not pressurized.

[29:13.120 --> 29:17.120] This thing is about 3 meters long and it weighs 4 metric tons.

[29:17.120 --> 29:22.120] That's an incredibly heavy object that hit the Earth at one point.

[29:22.120 --> 29:26.120] It came back down on July 9th and it took a year for it to deorbit.

[29:26.120 --> 29:29.120] So that's just another thing that needs to be tracked.

[29:29.120 --> 29:32.120] It could take time for them to come back down

[29:32.120 --> 29:34.120] and then we have to try to figure out where they're going to go.

[29:34.120 --> 29:36.120] But okay, let's say we know where it's going to go.

[29:36.120 --> 29:40.120] So what? What if it's going to hit a major city somewhere?

[29:40.120 --> 29:41.120] What are we going to do about it?

[29:41.120 --> 29:43.120] The answer is there's nothing.

[29:43.120 --> 29:44.120] There's nothing we can do about it.

[29:44.120 --> 29:48.120] We're going to shoot rockets up to take out rockets that are coming.

[29:48.120 --> 29:49.120] The whole thing is crazy.

[29:49.120 --> 29:54.120] So what we need to do is we need to have this rules of etiquette

[29:54.120 --> 29:58.120] where space agencies start to send up more fuel,

[29:58.120 --> 30:01.120] have rocket engines that can deorbit themselves

[30:01.120 --> 30:04.120] and not only have one turn-on cycle.

[30:04.120 --> 30:09.120] These are pretty costly and probably very expensive engineering feats

[30:09.120 --> 30:11.120] that need to become a part of all of these projects.

[30:11.120 --> 30:13.120] And that's what NASA wants.

[30:13.120 --> 30:15.120] But right now...

[30:15.120 --> 30:17.120] Just to make sure that the point is crystal clear,

[30:17.120 --> 30:21.120] it's to control the deorbit so that we know where it comes down.

[30:21.120 --> 30:26.120] We dump it in the middle of the Pacific so it doesn't hit Australia or whatever.

[30:26.120 --> 30:27.120] Exactly, yeah.

[30:27.120 --> 30:30.120] So right now there's a couple of companies that are starting to,

[30:30.120 --> 30:33.120] or space agencies that are starting to comply

[30:33.120 --> 30:37.120] and build in this functionality into the new rockets that they're building.

[30:37.120 --> 30:41.120] But let's face it, it's not a global thing.

[30:41.120 --> 30:43.120] A lot of people aren't doing that.

[30:43.120 --> 30:46.120] Some good things that we have are like SpaceX,

[30:46.120 --> 30:50.120] which is leading the pack on this whole idea of reusability.

[30:50.120 --> 30:51.120] That's fantastic.

[30:51.120 --> 30:52.120] You want to reuse your rockets.

[30:52.120 --> 30:54.120] You want your retro rockets to land themselves.

[30:54.120 --> 30:55.120] You see it all the time.

[30:55.120 --> 30:56.120] That's great.

[30:56.120 --> 30:59.120] More reusability that we build into things means more control,

[30:59.120 --> 31:02.120] more ability to bring things down safely,

[31:02.120 --> 31:05.120] which is exactly what everybody needs to be doing.

[31:05.120 --> 31:08.120] One, we don't want to pollute low Earth orbit any worse than it is.

[31:08.120 --> 31:10.120] If anything, we want to get that stuff out of there,

[31:10.120 --> 31:16.120] which no one has come up with a feasible economic way to do it yet.

[31:16.120 --> 31:18.120] But I imagine at some point in the next 50 years,

[31:18.120 --> 31:22.120] someone will come up with something that's making that move.

[31:22.120 --> 31:25.120] But in the meantime, our goals are no more debris

[31:25.120 --> 31:29.120] and absolutely no more craziness of things falling out of the sky

[31:29.120 --> 31:33.120] without any predictability on where they're going to go or drivability,

[31:33.120 --> 31:36.120] meaning we want them to go to a specific place.

[31:36.120 --> 31:38.120] So what do you think about that, Steve?

[31:38.120 --> 31:40.120] Well, it wasn't too long ago.

[31:40.120 --> 31:43.120] It was just a science or fiction item where an estimate was

[31:43.120 --> 31:46.120] that in the next decade, there's actually something like a 10% chance

[31:46.120 --> 31:48.120] of somebody getting hit by space debris.

[31:48.120 --> 31:49.120] Oh, yeah.

[31:49.120 --> 31:50.120] We all thought it was fiction.

[31:50.120 --> 31:54.120] Yeah, it's getting pretty significant now just because of the sheer volume

[31:54.120 --> 31:56.120] of stuff that we're putting up there.

[31:56.120 --> 31:59.120] So, yeah, it's, again, one of those things that we have to take

[31:59.120 --> 32:02.120] a systematic approach to it rather than relying on individuals

[32:02.120 --> 32:03.120] to all do the right thing.

[32:03.120 --> 32:05.120] How would we figure that out, Steve?

[32:05.120 --> 32:07.120] Where would we come up with such an approach?

[32:07.120 --> 32:09.120] People aren't just going to automatically do the right thing

[32:09.120 --> 32:10.120] on their own volition.

[32:10.120 --> 32:11.120] It's just stunning.

[32:11.120 --> 32:12.120] I know.

[32:12.120 --> 32:14.120] I feel like we're going to have apps where you have, like,

[32:14.120 --> 32:16.120] weather forecast, air pollution, space debris.

[32:16.120 --> 32:17.120] Space debris, yeah.

[32:17.120 --> 32:20.120] What's the probability of that thing landing in Manhattan today?

[32:20.120 --> 32:21.120] Take your umbrella.

[32:21.120 --> 32:23.120] Yeah, like a steel umbrella.

[32:23.120 --> 32:26.120] 50% chance of rain, 5% chance of...

[32:26.120 --> 32:28.120] Low Earth orbit de-orbiting.

[32:28.120 --> 32:31.120] Emily Calandrelli, who does a lot of space-related science communication,

[32:31.120 --> 32:34.120] she was following this one as it was coming down.

[32:34.120 --> 32:38.120] And what shocked me about it was we really didn't know where it was

[32:38.120 --> 32:41.120] going to be until, like, an hour before, even days before,

[32:41.120 --> 32:45.120] it was like half of the Earth was in the possible target area.

[32:45.120 --> 32:48.120] But she did say, at least this one, they thought.

[32:48.120 --> 32:51.120] But, again, they didn't really know what exactly it was made of,

[32:51.120 --> 32:53.120] but it would only take out a house or two.

[32:53.120 --> 32:54.120] A house or two.

[32:54.120 --> 32:55.120] Just a house or two.

[32:55.120 --> 32:56.120] Yeah.

[32:56.120 --> 33:00.120] Since you suggested a city, a house was the better alternative.

[33:00.120 --> 33:04.120] Does space debris zero in on trailer parks like tornadoes do?

[33:04.120 --> 33:05.120] Yeah.

[33:05.120 --> 33:06.120] I'm just wondering.

[33:06.120 --> 33:07.120] And lawn chairs and stuff.

[33:07.120 --> 33:08.120] Yeah.

[33:08.120 --> 33:10.120] But there's things to consider, though, because it's not just...

[33:10.120 --> 33:12.120] But could there be explosives in there?

[33:12.120 --> 33:15.120] Could there be some leftover rocket fuel fumes?

[33:15.120 --> 33:18.120] Or I have no idea, like, what potential explosive...

[33:18.120 --> 33:20.120] They're probably out of fuel, yeah.

[33:20.120 --> 33:21.120] You'd hope.

[33:21.120 --> 33:22.120] Yeah, you'd hope.

[33:22.120 --> 33:23.120] Who knows?

[33:23.120 --> 33:24.120] What about waste?

[33:24.120 --> 33:27.120] What about, like, dangerous gases and things like that?

[33:27.120 --> 33:30.120] Well, when Columbia broke up in 2003

[33:30.120 --> 33:34.120] and came down over the American South and Southeast,

[33:34.120 --> 33:38.120] there was concern that they didn't know what sort of contamination,

[33:38.120 --> 33:41.120] I think, there was in some of the materials,

[33:41.120 --> 33:44.120] that people were finding and picking up, like, you know,

[33:44.120 --> 33:46.120] a piece of a helmet and things.

[33:46.120 --> 33:48.120] They warned people to not go near them.

[33:48.120 --> 33:49.120] Yeah.

[33:49.120 --> 33:51.120] So I don't know what sort of danger that...

[33:51.120 --> 33:52.120] I don't know.

[33:52.120 --> 33:55.120] I know it always comes up whenever they're sending up any satellite

[33:55.120 --> 33:57.120] or anything that has a nuclear battery in it.

[33:57.120 --> 34:00.120] If that thing, you know, blows up or reenters,

[34:00.120 --> 34:03.120] then we could be dumping nuclear waste.

[34:03.120 --> 34:06.120] Well, now I'm thinking, you know, Cold War Sputnik stuff, too,

[34:06.120 --> 34:08.120] where it's like, what if it's not an accident?

[34:08.120 --> 34:10.120] Not to be the conspiracy theorist of the group,

[34:10.120 --> 34:12.120] but that would be a good way to...

[34:12.120 --> 34:14.120] Anyway, I'll stop with that one thought.

[34:14.120 --> 34:15.120] All right.

=== Auditory Pareidolia Again <small>(34:16)</small> ===

[34:15.120 --> 34:17.120] This is actually a couple of years old,

[34:17.120 --> 34:19.120] but it's making the rounds again, and I saw it.

[34:19.120 --> 34:21.120] I don't think we've ever played this on the show.

[34:21.120 --> 34:23.120] I missed it the first time around.

[34:23.120 --> 34:25.120] This video, just listen to the sound.

[34:25.120 --> 34:27.120] You don't have to see the video.

[34:27.120 --> 34:30.120] So either think the word brainstorm

[34:30.120 --> 34:33.120] or think the word green needle.

[34:33.120 --> 34:37.120] And whatever you think, that's what you will hear.

[34:37.120 --> 34:41.120] You don't even need to be cued with the actual words.

[34:41.120 --> 34:45.120] You just have to think it.

[34:45.120 --> 34:47.120] Isn't that bizarre?

[34:47.120 --> 34:48.120] That's crazy.

[34:48.120 --> 34:50.120] Although I'm hearing the green needle a lot more

[34:50.120 --> 34:52.120] than I'm hearing the brainstorm.

[34:52.120 --> 34:55.120] It's either distinctively green needle or not green needle.

[34:55.120 --> 34:58.120] Yeah, but I could flip both ways at will.

[34:58.120 --> 35:02.120] You would think, though, they seem like such different phrases

[35:02.120 --> 35:05.120] phonetically and everything, but it's in there.

[35:05.120 --> 35:07.120] There are things in there that will trick your brain

[35:07.120 --> 35:09.120] for both of those.

[35:09.120 --> 35:10.120] It's uncanny.

[35:10.120 --> 35:12.120] It's not even the same number of syllables,

[35:12.120 --> 35:15.120] which is surprising to me that it still works, right?

[35:15.120 --> 35:16.120] Yeah, it's one extra syllable.

[35:16.120 --> 35:17.120] Two versus three.

[35:17.120 --> 35:20.120] I think the distortion itself must be a critical component

[35:20.120 --> 35:23.120] of the ability to switch between it from one to the other, perhaps.

[35:23.120 --> 35:26.120] Otherwise, why make it sound so distorted?

[35:26.120 --> 35:30.120] I believe it also works brain needle and green storm as well.

[35:30.120 --> 35:32.120] If you try it.

[35:32.120 --> 35:34.120] I have to stumble upon this.

[35:34.120 --> 35:42.120] It's one of the more dramatic examples of auditory pareidolia.

[35:42.120 --> 35:45.120] This happens in a lot of our sensory streams,

[35:45.120 --> 35:48.120] but it happens a lot with language.

[35:48.120 --> 35:52.120] Our sensory streams are wired to make the closest fit

[35:52.120 --> 35:55.120] to phonemes that you know.

[35:55.120 --> 35:59.120] It's constantly trying to make that fit between speech sound

[35:59.120 --> 36:02.120] and words that you know.

[36:02.120 --> 36:05.120] That's why you can misunderstand lyrics all the time

[36:05.120 --> 36:06.120] and misunderstand what people say.

[36:06.120 --> 36:07.120] It sounds like something close to it.

[36:07.120 --> 36:11.120] This is just demonstrating that in a very dramatic way.

[36:11.120 --> 36:14.120] It's amazing how well the priming works.

[36:14.120 --> 36:18.120] When Rob brought up the distortion, it reminded me of,

[36:18.120 --> 36:22.120] we talked about it on SGU, the doll that would talk.

[36:22.120 --> 36:23.120] Pull-string dolls.

[36:23.120 --> 36:24.120] It has a recording.

[36:24.120 --> 36:27.120] It's a voice, but it's a crackly kind of voice.

[36:27.120 --> 36:29.120] It has a bit of distortion to it.

[36:29.120 --> 36:32.120] People think they're hearing things that the doll is saying

[36:32.120 --> 36:34.120] that it really isn't programmed to say,

[36:34.120 --> 36:38.120] but they can't distinguish what it was programmed to say.

[36:38.120 --> 36:42.120] They're thinking what they think it's saying instead.

[36:42.120 --> 36:45.120] We've come across this before in other mediums.

[36:45.120 --> 36:48.120] Is this behind those Disney conspiracies too,

[36:48.120 --> 36:49.120] where they're like,

[36:49.120 --> 36:52.120] there are secret horrible messages in various cartoons?

[36:52.120 --> 36:54.120] Islam is the light, that was one of the dolls that had it,

[36:54.120 --> 36:57.120] but that's not really what the doll was saying,

[36:57.120 --> 37:03.120] but it spread virally and that's what everyone started to hear.

[37:03.120 --> 37:05.120] It was saying because it was suggested that that's what it was saying.

[37:05.120 --> 37:07.120] The backward masking on records.

[37:07.120 --> 37:09.120] I was just going to say that.

[37:09.120 --> 37:12.120] I've listened to Stairway to Heaven backwards.

[37:12.120 --> 37:19.120] I really hear a lot of stuff in there that has a demonic connotation.

[37:19.120 --> 37:21.120] The words that they're saying.

[37:21.120 --> 37:25.120] It's probably because I've been priming myself since I was a teenager.

[37:25.120 --> 37:27.120] When I hear that, every once in a while I'll listen to it

[37:27.120 --> 37:29.120] because it's actually kind of interesting.

[37:29.120 --> 37:33.120] I'm hearing, here's to my sweet Satan and all that stuff.

[37:33.120 --> 37:35.120] It seems very clear to me.

[37:35.120 --> 37:40.120] Again, your brain is trying to make sense out of chaos.

[37:40.120 --> 37:45.120] Sometimes your brain concocts something that isn't actually there.

[37:45.120 --> 37:47.120] It's kind of like the dress.

[37:47.120 --> 37:49.120] I was just thinking about the dress.

[37:49.120 --> 37:51.120] Or Laurel and Yanni.

[37:51.120 --> 37:54.120] Yeah, Laurel and Yanni.

[37:54.120 --> 37:56.120] The internet will spit out more of these things.

[37:56.120 --> 37:58.120] We'll share them with you.

[37:58.120 --> 38:00.120] This was a particularly impressive one.

[38:00.120 --> 38:02.120] Everyone, we're going to take a quick break from our show

[38:02.120 --> 38:04.120] to talk about our sponsor this week, BetterHelp.

[38:04.120 --> 38:07.120] Guys, we have to take care of not just our physical health,

[38:07.120 --> 38:09.120] but also our mental health.

[38:09.120 --> 38:12.120] There's lots of options available to us now.

[38:12.120 --> 38:13.120] BetterHelp is one of them.

[38:13.120 --> 38:16.120] BetterHelp offers online therapy.

[38:16.120 --> 38:17.120] I'll tell you something.

[38:17.120 --> 38:19.120] I personally do online therapy.

[38:19.120 --> 38:25.120] I've been meeting with my doctor for the past six months every week.

[38:25.120 --> 38:28.120] I've been dealing with anxiety and depression my entire adult life.

[38:28.120 --> 38:32.120] Therapy is one of the biggest things that helps me deal with it.

[38:32.120 --> 38:34.120] I really think that you should consider it.

[38:34.120 --> 38:37.120] If you're suffering, if you're having anything that's bothering you

[38:37.120 --> 38:40.120] that you seem to not be able to get over,

[38:40.120 --> 38:43.120] you really should think about talking to someone to get help.

[38:43.120 --> 38:44.120] You're right, Jay.

[38:44.120 --> 38:49.120] BetterHelp is not only online, but it offers a lot of different options.

[38:49.120 --> 38:52.120] We're talking video, phone, even live chat only.

[38:52.120 --> 38:57.120] You don't have to see someone on camera if you're not in the place to do that.

[38:57.120 --> 39:02.120] It's also affordable, and you can be matched with a therapist in under 48 hours.

[39:02.120 --> 39:07.120] Our listeners get 10% off their first month at BetterHelp.com.

[39:07.120 --> 39:11.120] That's BetterHELP.com.

[39:11.120 --> 39:14.120] All right, guys, let's get back to the show.

=== The Alex Jones Saga <small>(39:15)</small> ===

[39:14.120 --> 39:15.120] All right.

[39:15.120 --> 39:21.120] One thing that we can agree on, that is that Alex Jones is a giant douchebag.

[39:21.120 --> 39:25.120] You don't have my permission to use that photo.

[39:25.120 --> 39:28.120] I'm going to get your internet permission to not use that photo.

[39:28.120 --> 39:29.120] Buy my vitamins.

[39:29.120 --> 39:31.120] I have a worse photo.

[39:31.120 --> 39:37.120] All right, Kelly, give us an update on the Alex Jones saga.

[39:37.120 --> 39:41.120] Yes, so I, like the insane person I am,

[39:41.120 --> 39:44.120] have kind of had this on in the background for the last two weeks,

[39:44.120 --> 39:48.120] and I was very glad to have an opportunity to put that to use.

[39:48.120 --> 39:51.120] But in Steve fashion, I'm going to start with a question.

[39:51.120 --> 39:58.120] So what percentage of Americans do you guys think question the Sandy Hook shooting?

[39:58.120 --> 40:00.120] 20%.

[40:00.120 --> 40:01.120] 10%.

[40:01.120 --> 40:02.120] Question it?

[40:02.120 --> 40:04.120] Probably I would say like 22%.

[40:04.120 --> 40:06.120] 22.1%.

[40:06.120 --> 40:07.120] 25%.

[40:07.120 --> 40:08.120] Wow.

[40:08.120 --> 40:10.120] It depends on whether we're doing Price is Right rules or not,

[40:10.120 --> 40:13.120] but I don't think we are because I didn't say it, so Andrea wins.

[40:13.120 --> 40:15.120] Oh, it's that high?

[40:15.120 --> 40:16.120] There we go.

[40:16.120 --> 40:18.120] That's horrible.

[40:18.120 --> 40:22.120] A quarter of the people polled, it's hard because I would have won.

[40:22.120 --> 40:25.120] Price is Right rules, I would have won.

[40:25.120 --> 40:28.120] Granted, there's always issues with polling,

[40:28.120 --> 40:31.120] but even if it's half that, that's absolutely insane,

[40:31.120 --> 40:34.120] and it's almost single-handedly because of Alex Jones.

[40:34.120 --> 40:36.120] Oh, yeah.

[40:36.120 --> 40:39.120] So I'm going to talk more about the misinformation piece.

[40:39.120 --> 40:42.120] I know everyone has seen all of the clips of his testimony

[40:42.120 --> 40:45.120] and all of the perjury and all the fun stuff,

[40:45.120 --> 40:47.120] but since this is a misinformation conference,

[40:47.120 --> 40:50.120] I'm going to focus on that aspect of it.

[40:50.120 --> 40:54.120] And I think as skeptics, we often hear the question, what's the harm?

[40:54.120 --> 40:57.120] Especially with things like conspiracy theories or supplements.

[40:57.120 --> 41:02.120] It's just easy to dismiss until it gets to this point,

[41:02.120 --> 41:06.120] and Alex Jones took both of those things and ruined some families' lives.

[41:06.120 --> 41:08.120] So some backgrounds.

[41:08.120 --> 41:12.120] The caricature that you think of as Alex Jones is pretty much accurate.

[41:12.120 --> 41:16.120] He peddles all of the conspiracy theories, 9-11 truth or pizza gate.

[41:16.120 --> 41:20.120] Now he's talking about the globalists trying to bring about the New World Order,

[41:20.120 --> 41:23.120] and when the Sandy Hook shooting happened,

[41:23.120 --> 41:27.120] he almost immediately was questioning the narrative.

[41:27.120 --> 41:32.120] And he's gone from saying it's a hoax, calling the parents crisis actors,

[41:32.120 --> 41:34.120] and that's changed over time.

[41:34.120 --> 41:37.120] His position has definitely evolved,

[41:37.120 --> 41:42.120] but the consistent through line of that is that he's questioning the official story

[41:42.120 --> 41:45.120] and doesn't think that the official story is true.

[41:45.120 --> 41:48.120] And because of this, the families of the children who died

[41:48.120 --> 41:51.120] have received death threats, they've been harassed,

[41:51.120 --> 41:54.120] and they're dealing with this constantly circulating.

[41:54.120 --> 41:58.120] So a bunch of the families have sued him, rightfully so.

[41:58.120 --> 42:01.120] And so this trial was for the parents of Jesse Lewis,

[42:01.120 --> 42:04.120] who was a six-year-old who died in Sandy Hook,

[42:04.120 --> 42:09.120] for defamation and intentional infliction of emotional distress.

[42:09.120 --> 42:12.120] And we're about to make fun of Alex Jones,

[42:12.120 --> 42:17.120] but as we're doing it, keep in mind that this all sounds silly and ridiculous,

[42:17.120 --> 42:20.120] but it's causing real harm to these families.

[42:20.120 --> 42:23.120] And I don't want to make light of it, but at the same time,

[42:23.120 --> 42:25.120] there's something really satisfying,

[42:25.120 --> 42:29.120] especially in the misinformation apocalypse that we're in right now,

[42:29.120 --> 42:33.120] about somebody who is this awful actually being held accountable.

[42:33.120 --> 42:37.120] So we've got to at least appreciate that for a minute.

[42:37.120 --> 42:40.120] Also, his lawyers are comically terrible.

[42:40.120 --> 42:42.120] How can they be that?

[42:42.120 --> 42:45.120] I mean, for a guy that has this much money,

[42:45.120 --> 42:48.120] how could he because he's a losing case?

[42:48.120 --> 42:50.120] Because nobody wants to defend him.

[42:50.120 --> 42:54.120] He probably has been working his way down the ladder of terrible lawyers.

[42:54.120 --> 42:56.120] And you've had that experience.

[42:56.120 --> 42:58.120] I mean, his lawyers were pretty terrible.

[42:58.120 --> 43:01.120] With your case, your opponent had that as well.

[43:01.120 --> 43:05.120] He kept going through lawyers because nobody of quality would defend him.

[43:05.120 --> 43:07.120] Who wants to defend this guy?

[43:07.120 --> 43:10.120] The other thing is that they did it on purpose.

[43:10.120 --> 43:11.120] That's what I was thinking.

[43:11.120 --> 43:12.120] You think they're sandbagging?

[43:12.120 --> 43:13.120] Yeah.

[43:13.120 --> 43:15.120] His morals got the better of him.

[43:15.120 --> 43:17.120] That thought has been brought up.

[43:17.120 --> 43:20.120] But the thing is, one, it's a civil case,

[43:20.120 --> 43:24.120] so he can't get away with the whole, like, my lawyers were incompetent,

[43:24.120 --> 43:26.120] so get out of it that way.

[43:26.120 --> 43:30.120] But also, they cross-examined the parents.

[43:30.120 --> 43:33.120] And I feel like if you were sandbagging it,

[43:33.120 --> 43:36.120] you wouldn't want to inflict additional trauma on the parents.

[43:36.120 --> 43:40.120] And some of the questions that he was asking them, I couldn't believe.

[43:40.120 --> 43:43.120] Have the lawyers made a statement about how it happened?

[43:43.120 --> 43:47.120] Because it's hard to accidentally send a huge set of files or file.

[43:47.120 --> 43:49.120] I always forget to send attachments.

[43:49.120 --> 43:52.120] Oh, the phone that's almost definitely going to the January 6th committee

[43:52.120 --> 43:54.120] is like a whole story in itself.

[43:54.120 --> 43:57.120] But basically, the one lawyer said,

[43:57.120 --> 44:00.120] please disregard after he accidentally sent the files,

[44:00.120 --> 44:04.120] but didn't actually take the legal steps to pull back all that information.

[44:04.120 --> 44:08.120] So they just got to use it after his ten days were up.

[44:08.120 --> 44:11.120] This trial was specifically for damages,

[44:11.120 --> 44:15.120] because Alex Jones didn't provide any of the documents or evidence

[44:15.120 --> 44:17.120] that he was supposed to during the discovery phase,

[44:17.120 --> 44:20.120] and he dragged things on for years, and so there was a default judgment.

[44:20.120 --> 44:23.120] So it wasn't a question of if the defamation happens.

[44:23.120 --> 44:25.120] The court had decided the defamation happened.

[44:25.120 --> 44:30.120] This was just to decide how much he had to pay for it.

[44:30.120 --> 44:36.120] And the trial was exactly as dramatic as the clips are portraying it to be,

[44:36.120 --> 44:39.120] and I think this one exchange between Alex Jones and the judge

[44:39.120 --> 44:43.120] is the epitome of his testimony at least.

[44:43.120 --> 44:45.120] So I'm going to read that.

[44:45.120 --> 44:48.120] I'm sorry, I don't have as good an Alex Jones impression as George.

[44:48.120 --> 44:53.120] So the judge, after sending the jury out because Alex Jones was talking about

[44:53.120 --> 44:56.120] things that he wasn't supposed to while he was on the stand,

[44:56.120 --> 44:59.120] said, you're already under oath to tell the truth.

[44:59.120 --> 45:02.120] You've already violated that oath twice today.

[45:02.120 --> 45:03.120] And granted, twice today.

[45:03.120 --> 45:07.120] He had been on the stand for like 10 minutes by that point maybe.

[45:07.120 --> 45:11.120] That might be an exaggeration, but it was end of the day,

[45:11.120 --> 45:12.120] he had just gotten on the stand.

[45:12.120 --> 45:16.120] It seems absurd to instruct you that you must tell the truth while you testify,

[45:16.120 --> 45:18.120] yet here I am.

[45:18.120 --> 45:20.120] You must tell the truth when you testify.

[45:20.120 --> 45:22.120] This is not your show.

[45:22.120 --> 45:25.120] And then she explains some of the specifics, and she goes,

[45:25.120 --> 45:27.120] do you understand what I have said?

[45:27.120 --> 45:30.120] And he goes, I, and she interrupts him and says, yes or no.

[45:30.120 --> 45:34.120] He goes, yes, I believe what I said is true.

[45:34.120 --> 45:35.120] And she cuts him off.

[45:35.120 --> 45:39.120] She goes, you believe everything you say is true, but it isn't.

[45:39.120 --> 45:41.120] Your beliefs do not make something true.

[45:41.120 --> 45:43.120] That's what we're doing here.

[45:43.120 --> 45:44.120] Oh my God.

[45:44.120 --> 45:45.120] Wow.

[45:45.120 --> 45:48.120] And you should really watch that whole clip because there was so much more of it,

[45:48.120 --> 45:50.120] but I couldn't go into the whole thing.

[45:50.120 --> 45:54.120] And watch all the clips from his testimony because it is absolutely horrifying,

[45:54.120 --> 45:58.120] but also really satisfying because he's an awful person and deserves every bit of that.

[45:58.120 --> 46:02.120] And I can't help, through all the things that I've consumed about this man,

[46:02.120 --> 46:07.120] I can't help but think that this entire thing is an act.

[46:07.120 --> 46:08.120] I was thinking the same, Jay.

[46:08.120 --> 46:10.120] I'm wondering what you all think about that.

[46:10.120 --> 46:13.120] You think he knows what he's doing and he's just pretending?

[46:13.120 --> 46:18.120] Of course, I'm not 100% sure, but it just seems like it is all a money-making act.

[46:18.120 --> 46:21.120] Like I don't think he's a real conspiracy theorist.

[46:21.120 --> 46:22.120] I think he is.

[46:22.120 --> 46:23.120] No, I think you're right.

[46:23.120 --> 46:27.120] He uses his conspiracies to sell supplements because he'll talk about the conspiracy theory

[46:27.120 --> 46:33.120] to get the views and then he pivots into an ad for supplements or for shelf-stable food

[46:33.120 --> 46:36.120] because the Great Reset is coming and so you need to have food,

[46:36.120 --> 46:39.120] or gold because there's going to be one world currency, so you need gold.

[46:39.120 --> 46:44.120] And didn't he admit as much during his trial with his, what, divorce with his wife, effectively?

[46:44.120 --> 46:45.120] Custody.

[46:45.120 --> 46:46.120] Was it custody?

[46:46.120 --> 46:49.120] Yeah, Alex Jones is a character that he is playing.

[46:49.120 --> 46:52.120] That was one of his lines of defense,

[46:52.120 --> 46:54.120] which I think probably is accurate.

[46:54.120 --> 46:56.120] Again, we can't read his mind.

[46:56.120 --> 46:58.120] We don't really know what he believes or doesn't believe,

[46:58.120 --> 47:01.120] but it certainly is plausible and it certainly fits everything I've seen about him,

[47:01.120 --> 47:03.120] that this is a character he's playing.

[47:03.120 --> 47:08.120] He did admit that, which means he doesn't necessarily have to believe anything.

[47:08.120 --> 47:10.120] But he's still doing the same level of damage, whether or not.

[47:10.120 --> 47:11.120] Totally.

[47:11.120 --> 47:12.120] That's right.

[47:12.120 --> 47:13.120] Absolutely.

[47:13.120 --> 47:14.120] People believe that he's real.

[47:14.120 --> 47:16.120] Well, and he's doing the character under oath, right?

[47:16.120 --> 47:17.120] Yes, that's the thing.

[47:17.120 --> 47:19.120] That has consequences.

[47:19.120 --> 47:23.120] It's been so interesting to watch because he's not used to being challenged on his show.

[47:23.120 --> 47:25.120] He has control over the entire narrative.

[47:25.120 --> 47:27.120] Now he has to be in reality.

[47:27.120 --> 47:32.120] And so he started to do one of his ad pitches on the stand.

[47:32.120 --> 47:35.120] He started talking about how great his supplements are and they get the best supplements.

[47:35.120 --> 47:36.120] He can't help it.

[47:36.120 --> 47:37.120] Oh, my God.

[47:37.120 --> 47:39.120] It's all he knows, effectively.

[47:39.120 --> 47:43.120] If he can make a few bucks on the stand, why not go for it, I guess, right?

[47:43.120 --> 47:47.120] It's always satisfying to see, because this is not the first time this has happened,

[47:47.120 --> 47:51.120] and there are cases where people who are con artists or pseudoscientists or whatever,

[47:51.120 --> 47:55.120] and they find themselves in a court of law where there are rules of evidence.

[47:55.120 --> 48:01.120] Not that courts are perfect, but they do have fairly rigorous rules of evidence and argument,

[48:01.120 --> 48:03.120] et cetera.

[48:03.120 --> 48:08.120] Judges, if they're competent, aren't going to let you get away with stuff.

[48:08.120 --> 48:13.120] And just watching that disconnect, somebody like Alex Jones who's living in a fantasy world,

[48:13.120 --> 48:19.120] whether he believes it or not, he is used to being in this con artist construct,

[48:19.120 --> 48:25.120] and now he has to deal with reality and rules of evidence,

[48:25.120 --> 48:29.120] and the clash is just wonderful to behold.

[48:29.120 --> 48:33.120] It's kind of reminding me, Jay, I think you talked about this on a live, SGU Live,

[48:33.120 --> 48:39.120] maybe a year ago when Sanjay Gupta was on Joe Rogan and we all expected it to be kind of like that,

[48:39.120 --> 48:42.120] but Joe Rogan just sort of steamrolled the whole thing.

[48:42.120 --> 48:46.120] This is what I wish that had been like, because now we're in a place where the rules,

[48:46.120 --> 48:48.120] reality has to hold for a second.

[48:48.120 --> 48:53.120] Fun fact, Joe Rogan was on Infowars on 9-11.

[48:53.120 --> 48:55.120] As he was spewing his...

[48:55.120 --> 48:58.120] One of the least fun, fun facts I've ever heard.

[48:58.120 --> 49:03.120] As soon as 9-11 happened, he was already spewing conspiracy theories,

[49:03.120 --> 49:05.120] and then he had Joe Rogan on.

[49:05.120 --> 49:09.120] Wait, wait, Joe Rogan was on Alex Jones' Infowars show?

[49:09.120 --> 49:13.120] Well, that guy literally just dropped lower than I thought he would.

[49:13.120 --> 49:15.120] That is ridiculous.

[49:15.120 --> 49:21.120] So I read in the chat, somebody said something about Texas tort law

[49:21.120 --> 49:27.120] that drops the 45 million down to 750,000.

[49:27.120 --> 49:28.120] I read that too.

[49:28.120 --> 49:32.120] From what I saw from the plaintiff's lawyer, he was saying...

[49:32.120 --> 49:37.120] So there was talk about a cap because it was divided into two sets of damages.

[49:37.120 --> 49:40.120] So there were the compensatory damages and the punitive damages.

[49:40.120 --> 49:46.120] The compensatory damages were 4.5 million, and then the punitive damages were 41 million.

[49:46.120 --> 49:50.120] And while we were waiting to hear what the punitive damages were,

[49:50.120 --> 49:53.120] people were talking about a cap because it had to be a certain multiple

[49:53.120 --> 49:55.120] of the compensatory damages.

[49:55.120 --> 50:00.120] But from the statement that the plaintiff's lawyer gave afterwards,

[50:00.120 --> 50:03.120] that was more of a guideline, not a hard cap.

[50:03.120 --> 50:05.120] More of a guideline.

[50:05.120 --> 50:07.120] I'm just going based on his statement.

[50:07.120 --> 50:10.120] I don't know anything about Texas law, not a lawyer.

[50:10.120 --> 50:13.120] But that was what I heard about that.

[50:13.120 --> 50:18.120] I was hoping to see them literally dismantle him and his company.

[50:18.120 --> 50:21.120] Why wouldn't this guy see prison time?

[50:21.120 --> 50:24.120] It's a civil case, you don't go to prison.

[50:24.120 --> 50:30.120] I understand that, but it doesn't mean that he can't be put in prison legitimately.

[50:30.120 --> 50:32.120] He did perjure himself.

[50:32.120 --> 50:34.120] That would be a whole other story.

[50:34.120 --> 50:37.120] That would be something emerging from the trial itself.

[50:37.120 --> 50:43.120] But it's hard to bring criminal charges against somebody for what they're saying

[50:43.120 --> 50:46.120] in a public forum because of free speech laws, etc.

[50:46.120 --> 50:48.120] But civil is different.

[50:48.120 --> 50:53.120] Holding people liable for the damage that they knowingly and maliciously caused,

[50:53.120 --> 50:55.120] the law allows for that.

[50:55.120 --> 50:59.120] One more thing I did want to bring up is, in my opinion,

[50:59.120 --> 51:01.120] one of the best witnesses that they had.

[51:01.120 --> 51:06.120] Her name is Becca Lewis and she does research in misinformation and disinformation

[51:06.120 --> 51:08.120] and how it spreads.

[51:08.120 --> 51:11.120] They had her on as an expert witness about misinformation.

[51:11.120 --> 51:15.120] She talked about how and why it spreads faster than the truth

[51:15.120 --> 51:20.120] since it feeds into people's world views, the confirmation bias.

[51:20.120 --> 51:24.120] The things that confirm their existing world views are going to circulate,

[51:24.120 --> 51:27.120] especially once you start to have echo chambers like Infowars'.

[51:27.120 --> 51:31.120] Also, Alex Jones platformed other conspiracy theorists.

[51:31.120 --> 51:35.120] There was one that she talked about who his content only had three views

[51:35.120 --> 51:38.120] before Alex Jones started promoting it.

[51:38.120 --> 51:40.120] It was something that nobody was going to see.

[51:40.120 --> 51:43.120] But because of his platform, a lot of people saw it.

[51:43.120 --> 51:49.120] Now we have 24% of the country who questions this main narrative.

[51:49.120 --> 51:51.120] That was a lot of what the trial was about.

[51:51.120 --> 51:53.120] He would claim, oh, I was just asking questions.

[51:53.120 --> 51:56.120] I was just having these people on to get their opinion.

[51:56.120 --> 51:58.120] Oh, my guest said it, but I didn't say it.

[51:58.120 --> 52:02.120] But he provided that platform for them to get their views out.

[52:02.120 --> 52:06.120] I think the most interesting thing she talked about was this idea

[52:06.120 --> 52:09.120] of three degrees of Alex Jones.

[52:09.120 --> 52:13.120] She said that you basically can't do misinformation research

[52:13.120 --> 52:16.120] without encountering Infowars and Alex Jones.

[52:16.120 --> 52:22.120] The common rule is that you're never more than three recommendations away

[52:22.120 --> 52:26.120] from Alex Jones or Infowars videos.

[52:26.120 --> 52:27.120] Wow.

[52:27.120 --> 52:29.120] Ouch.

[52:29.120 --> 52:34.120] The way to restate that is you can't be more full of shit than Alex Jones.

[52:34.120 --> 52:36.120] Yeah, basically.

[52:36.120 --> 52:41.120] Jones' lawyer was trying to trip her up, and he was trying to use

[52:41.120 --> 52:44.120] all of the things that a scientist or a skeptic would use.

[52:44.120 --> 52:48.120] He's talking about sample size and bias and things like that

[52:48.120 --> 52:51.120] because in any paper at the end, they're going to talk about

[52:51.120 --> 52:54.120] all of the limitations and say, like, this is a potential limitation.

[52:54.120 --> 52:57.120] This is a potential source of bias, but we tried to account for it

[52:57.120 --> 52:59.120] as best we could.

[52:59.120 --> 53:02.120] But she's a researcher, so she knew it a lot better than he did.

[53:02.120 --> 53:06.120] So she'd stop and she'd be like, no, this is what that means.

[53:06.120 --> 53:08.120] You have no idea what you're talking about.

[53:08.120 --> 53:10.120] Oh, that's great.

[53:10.120 --> 53:13.120] Yeah, and he tried to say that she hated Alex Jones and things like that,

[53:13.120 --> 53:17.120] and that would bias her, and she didn't know who Alex Jones was

[53:17.120 --> 53:19.120] before she started researching this.

[53:19.120 --> 53:21.120] And she just goes, yes, that's correct.

[53:21.120 --> 53:25.120] Like, when he'd present something, she'd say, yes, that's correct,

[53:25.120 --> 53:27.120] and it's based on hundreds of hours of research.

[53:27.120 --> 53:29.120] It's not just her opinion.

[53:29.120 --> 53:32.120] And so he kept trying to trip her up, and the best part was

[53:32.120 --> 53:37.120] he was asking her questions and said, the poll that found

[53:37.120 --> 53:42.120] 24% questioned Sandy Hook, that it was under 1,000 sample size

[53:42.120 --> 53:45.120] and was trying to discredit it that way.

[53:45.120 --> 53:47.120] And she's like, you can have statistical significance

[53:47.120 --> 53:50.120] with less than 1,000 sample size, like trying to explain that.

[53:50.120 --> 53:55.120] And then the plaintiff's lawyer comes up and hands her the actual study

[53:55.120 --> 54:00.120] and the Jones lawyer was full of shit because it was over 1,000.

[54:00.120 --> 54:02.120] So it wasn't even that, yeah.

[54:02.120 --> 54:04.120] Even the lawyer is full of BS.

[54:04.120 --> 54:10.120] We're really seeing this trend here with these crazy lawsuits.

[54:10.120 --> 54:13.120] How do you defend Alex Jones legitimately?

[54:13.120 --> 54:15.120] How do you do it?

[54:15.120 --> 54:18.120] You literally have to try to slip through some cracks.

[54:18.120 --> 54:22.120] Well, but you also don't have to defend him and say he's innocent.

[54:22.120 --> 54:24.120] I mean, I know innocent and guilty isn't what's happening here

[54:24.120 --> 54:26.120] because it's a civil case, but you don't have to say,

[54:26.120 --> 54:28.120] oh, no, he didn't defame people.

[54:28.120 --> 54:33.120] You can just try to mitigate the damage in an ethical way.

[54:33.120 --> 54:37.120] If a lawyer can give a defense they don't personally believe,

[54:37.120 --> 54:39.120] they don't have to believe it.

[54:39.120 --> 54:42.120] The ethics of law does not require that.

[54:42.120 --> 54:46.120] It just has to be a legally responsible and viable argument.

[54:46.120 --> 54:50.120] Their personal belief is actually not relevant to it.

[54:50.120 --> 54:54.120] So as long as they are mounting an ethical defense, it's fine.

[54:54.120 --> 54:58.120] But it's certainly reasonable to think that there isn't an ethical defense

[54:58.120 --> 55:07.120] of somebody like Alex Jones because it seems so obvious that he's guilty.

[55:07.120 --> 55:11.120] But again, the law is based upon the notion that everybody deserves a defense.

[55:11.120 --> 55:15.120] But that doesn't mean that lawyers can do unethical things on the stand.

[55:15.120 --> 55:18.120] It also is why I think that might speak to the quality of the lawyers

[55:18.120 --> 55:22.120] because, again, the high-quality lawyers, Jones clearly has the money.

[55:22.120 --> 55:25.120] He could pay some high-priced law legal firm to defend him.

[55:25.120 --> 55:28.120] They probably don't want their reputation sullied with this.

[55:28.120 --> 55:29.120] They don't want to go anywhere near it.

[55:29.120 --> 55:31.120] Nobody wants to be the guy who defended Alex Jones.

[55:31.120 --> 55:32.120] Right.

[55:32.120 --> 55:34.120] Do we have any idea how much money, like what his net worth is?

[55:34.120 --> 55:36.120] Like how ruinous is $41 million, $45 million?

[55:36.120 --> 55:38.120] They were desperately trying to figure that out.

[57:38.120 --> 57:42.120] So officially, I'm sorry if you didn't notice, but officially,

[57:42.120 --> 57:46.120] his enterprise makes $200,000 a day.

[55:46.120 --> 55:48.120] But $200,000 a day.

[55:48.120 --> 55:50.120] Is that net?

[55:50.120 --> 55:52.120] But that's probably an underestimate.

[55:52.120 --> 55:58.120] And in the phone records that were revealed, on some days they make up to $800,000.

[55:58.120 --> 55:59.120] That was their best day.

[55:59.120 --> 56:01.120] That was a good day, yeah.

[56:01.120 --> 56:03.120] You guys have got to sell supplements, man.

[56:03.120 --> 56:04.120] This is right.

[56:04.120 --> 56:06.120] We've got to switch sides.

[56:06.120 --> 56:09.120] But they had a really hard time figuring that kind of stuff out

[56:09.120 --> 56:11.120] because he didn't turn over all the documents that he was supposed to turn over.

[56:11.120 --> 56:12.120] Right, part of the problem.

[56:12.120 --> 56:15.120] So they couldn't really get a solid answer on that.

[56:15.120 --> 56:16.120] What kind of bullshit is that?

[56:16.120 --> 56:17.120] Okay, so you don't do that.

[56:17.120 --> 56:19.120] You don't turn over the documents.

[56:19.120 --> 56:25.120] Like doesn't the law, doesn't the court have the ability to deliver some type of incredible smackdown?

[56:25.120 --> 56:27.120] So that's what they did.

[56:27.120 --> 56:29.120] That was why there was the default judgment.

[56:29.120 --> 56:34.120] And so that's why this was just for damages because they already determined that he was liable

[56:34.120 --> 56:37.120] for the defamation and for the infliction of emotional distress.

[56:37.120 --> 56:39.120] I get that they clicked into summary judgment.

[56:39.120 --> 56:41.120] We see we have some experience with that.

[56:41.120 --> 56:42.120] Yeah.

[56:42.120 --> 56:44.120] But in a good way.

[56:44.120 --> 56:47.120] Don't you get into legal trouble if you don't hand over?

[56:47.120 --> 56:49.120] Like doesn't he have to now deal with the fact?

[56:49.120 --> 56:53.120] Well, you could be held in contempt, right, would be the legal remedy there.

[56:53.120 --> 56:58.120] But just in a case like this, the remedy is you lose.

[56:58.120 --> 57:03.120] You now lose the case and now we're going to talk about how much money you have to pay the plaintiff.

[57:03.120 --> 57:05.120] So that was the remedy.

[57:05.120 --> 57:11.120] He was asked, you know, turn over like emails or texts where, you know, you mentioned Sandy Hook.

[57:11.120 --> 57:17.120] And he said, I did a search on my phone, did not see any text that mentioned Sandy Hook.

[57:17.120 --> 57:21.120] So I want to know what did the court or the judge do at that point?

[57:21.120 --> 57:26.120] Because then, of course, afterwards they got two years of text and of course it's all over the place.

[57:26.120 --> 57:28.120] So he was just flat out lying.

[57:28.120 --> 57:31.120] But if they didn't get that dump, what recourse would they have had to say?

[57:31.120 --> 57:32.120] Yeah, I don't believe you.

[57:32.120 --> 57:34.120] I don't believe your phone doesn't have those.

[57:34.120 --> 57:36.120] They can get the info if they want to.

[57:36.120 --> 57:38.120] They can get the info.

[57:38.120 --> 57:43.120] They can appoint somebody to go through the phone and get the information that they want.

[57:43.120 --> 57:46.120] I know like when I had to turn over my emails, I didn't do it.

[57:46.120 --> 57:52.120] My lawyer hired an independent person to come in, go through all my emails and find the ones that were relevant.

[57:52.120 --> 57:54.120] My hands were not on it at all.

[57:54.120 --> 57:55.120] All right.

[57:55.120 --> 57:57.120] Anything else you want to add before we move on?

[57:57.120 --> 58:00.120] I will throw a quote out there from the lawyer today.

[58:00.120 --> 58:03.120] So this was just the first of a few cases.

[58:03.120 --> 58:10.120] And the plaintiff's lawyer said, there's going to be a large set of plaintiffs dividing up the corpse of Infowars.

[58:10.120 --> 58:12.120] And fingers crossed that that actually happens.

[58:12.120 --> 58:13.120] Yeah, that would be nice.

[58:13.120 --> 58:15.120] Tiny slice of justice in this book.

[58:15.120 --> 58:17.120] The corpse of Infowars.

[58:17.120 --> 58:18.120] It's a nice sentence.

[58:18.120 --> 58:19.120] Add that to your Halloween display.

[58:19.120 --> 58:21.120] I would, I would.

Earth Spinning Faster (58:21)

[58:21.120 --> 58:22.120] All right, Bob.

[58:22.120 --> 58:30.120] I understand that the earth is supposed to be slowing down over long historical timescales.

[58:30.120 --> 58:33.120] But maybe that's not 100 percent true.

[58:33.120 --> 58:36.120] Well, you know, I don't want to get everybody concerned.

[58:36.120 --> 58:43.120] But the earth is now spinning faster than it ever has before in the age of atomic clocks.

[58:43.120 --> 58:45.120] I thought I felt something.

[58:45.120 --> 58:51.120] January 22nd, this past year, January 22nd, no, June 22nd, 2022.

[58:51.120 --> 58:53.120] The shortest day ever recorded.

[58:53.120 --> 58:54.120] And we're not sure why.

[58:54.120 --> 58:55.120] Should we be scared?

[58:55.120 --> 58:57.120] Should we be afraid?

[58:57.120 --> 58:58.120] So what's what's going on here?

[58:58.120 --> 59:00.120] You mean the longest day ever recorded?

[59:00.120 --> 59:01.120] What did I say?

[59:01.120 --> 59:02.120] Shortest day.

[59:02.120 --> 59:03.120] Shortest day.

[59:03.120 --> 59:04.120] Because the earth is spinning faster.

[59:04.120 --> 59:05.120] Faster, so it's short days, right?

[59:05.120 --> 59:06.120] Yeah, it's getting shorter.

[59:06.120 --> 59:07.120] Yeah, it'd be shorter.

[59:07.120 --> 59:09.120] So it all starts with a day.

[59:09.120 --> 59:10.120] What is a day?

[59:10.120 --> 59:11.120] Yeah, what's a day?

[59:11.120 --> 59:12.120] If you ask anybody, what's a day?

[59:12.120 --> 59:13.120] 24 hours.

[59:13.120 --> 59:14.120] 24 hours.

[59:14.120 --> 59:15.120] Steve, what is that in metric?

[59:15.120 --> 59:17.120] Oh, never mind.

[59:17.120 --> 59:20.120] So a mean solar day is 24 hours.

[59:20.120 --> 59:21.120] That's right.

[59:21.120 --> 59:22.120] That's what it is.

[59:22.120 --> 59:25.120] But that's the outermost onion layer.

[59:25.120 --> 59:29.120] As we say, you get a little deeper and it's never really 24 hours.

[59:29.120 --> 59:30.120] Exactly.

[59:30.120 --> 59:31.120] It kind of hovers around 24 hours.

[59:31.120 --> 59:33.120] It goes a little shorter, a little longer.

[59:33.120 --> 59:35.120] It's like right around 24 hours.

[59:35.120 --> 59:38.120] 24 hours should be the average.

[59:38.120 --> 59:43.120] But it varies because you've got the interior of the earth kind of roiling around.

[59:43.120 --> 59:45.120] You've got seismic activity.

[59:45.120 --> 59:49.120] You've got the wind, the wind running across the surface of the earth and causing

[59:49.120 --> 59:51.120] friction, pushing against mountains.

[59:51.120 --> 59:56.120] All those things conspire to make the day, you know, a little shorter or longer than 24 hours.

[59:56.120 --> 01:00:02.120] But if you look at it over the over many decades, what you find is that the average is

[01:00:02.120 --> 01:00:06.120] about 24 hours and point zero zero one seconds.

[01:00:06.120 --> 01:00:08.120] So somebody asks you, how long is a day?

[01:00:08.120 --> 01:00:12.120] You say 24 hours and point zero zero one seconds, because that would be more accurate,

[01:00:12.120 --> 01:00:13.120] a little bit more accurate.

[01:00:13.120 --> 01:00:17.120] But the problem here is that we have two ways to tell time.

[01:00:17.120 --> 01:00:19.120] Really, we have atomic time, which is extremely accurate.

[01:00:19.120 --> 01:00:20.120] And there's solar time.

[01:00:20.120 --> 01:00:25.120] And every day, if the earth is a little bit slower, a little bit faster, it notches up

[01:00:25.120 --> 01:00:27.120] and it diverges from atomic time.

[01:00:27.120 --> 01:00:32.120] And after a while, you can't get beyond this, which is about, I don't know, 10 seconds.

[01:00:32.120 --> 01:00:34.120] They don't want to get beyond that, whatever that is.

[01:00:34.120 --> 01:00:36.120] So they throw in a leap second.

[01:00:36.120 --> 01:00:37.120] That's what a leap second is.

[01:00:37.120 --> 01:00:41.120] A leap second isn't because, oh, the earth is slowing and slowing and slowing and we

[01:00:41.120 --> 01:00:42.120] need to throw in a second.

[01:00:42.120 --> 01:00:46.120] It's because of that divergence between atomic time and solar time.

[01:00:46.120 --> 01:00:47.120] That's what a leap second is.
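
Here is a minimal sketch in Python of that bookkeeping. The 1 ms/day excess and the ten-year window are illustrative numbers, while the 0.9-second tolerance is the actual IERS rule for how far UTC is allowed to drift from solar time (tighter than the figure guessed above).

```python
# Minimal sketch of why leap seconds get scheduled.
# DAILY_EXCESS_S is an illustrative average, not a measurement;
# TOLERANCE_S is the real IERS rule of keeping |UT1 - UTC| under 0.9 s.

TOLERANCE_S = 0.9
DAILY_EXCESS_S = 0.001   # each day runs ~1 ms longer than 86,400 SI seconds

offset = 0.0             # accumulated UT1 - UTC divergence, in seconds
leap_seconds = 0

for day in range(1, 3651):            # simulate roughly ten years
    offset -= DAILY_EXCESS_S          # solar time falls behind atomic time a hair each day
    if abs(offset) >= TOLERANCE_S:
        leap_seconds += 1             # insert a positive leap second into UTC
        offset += 1.0                 # UTC is held back, so the divergence shrinks
        print(f"Day {day}: leap second #{leap_seconds}")

print(f"{leap_seconds} leap seconds in ~10 years at a 1 ms/day excess")
```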

[01:00:47.120 --> 01:00:51.120] So why is there this general average slowing of the earth?

[01:00:51.120 --> 01:00:52.120] There's a bunch of reasons.

[01:00:52.120 --> 01:00:55.120] The main and most fascinating one for me is tidal braking.

[01:00:55.120 --> 01:00:59.120] It's because that damn moon, the moon is doing it, is doing it towards the end.

[01:00:59.120 --> 01:01:01.120] The tides, it's happening because of the tides.

[01:01:01.120 --> 01:01:04.120] So stealing our angular momentum.

[01:01:04.120 --> 01:01:05.120] Exactly.

[01:01:05.120 --> 01:01:06.120] Exactly.

[01:01:06.120 --> 01:01:10.120] Because of the way the earth is rotating and the bulges created by the tides,

[01:01:10.120 --> 01:01:14.120] the moon is pulling on that bulge, which actually causes friction on the earth,

[01:01:14.120 --> 01:01:17.120] which slows the earth, making our days longer.

[01:01:17.120 --> 01:01:21.120] And the moon is stealing our rotational energy, our angular momentum, because that's got to

[01:01:21.120 --> 01:01:22.120] be conserved.

[01:01:22.120 --> 01:01:25.120] And that's going into a higher orbit and getting farther and farther and farther away.

[01:01:25.120 --> 01:01:30.120] And eventually, if the solar system lasts long enough, which it won't, it will get so

[01:01:30.120 --> 01:01:33.120] far away that we'll be facing each other.

[01:01:33.120 --> 01:01:36.120] The earth and the moon will be facing each other, will be tidally locked like the moon

[01:01:36.120 --> 01:01:37.120] is to us right now.

[01:01:37.120 --> 01:01:41.120] So that's just the interesting aside of why the earth is slowing.
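
A simplified way to see the conservation argument here, as a sketch that treats the Earth-Moon system in isolation with a circular lunar orbit:

$$
L_\text{total} \;=\; \underbrace{I_\oplus\,\omega_\oplus}_{\text{Earth's spin}} \;+\; \underbrace{m_\text{Moon}\sqrt{G\,M_\oplus\,a}}_{\text{Moon's orbital angular momentum}} \;\approx\; \text{constant}.
$$

Tidal friction lowers the spin rate $\omega_\oplus$ (longer days), so the orbital term has to grow; since it scales as $\sqrt{a}$, the semi-major axis $a$ increases and the Moon recedes, currently by roughly 3.8 cm per year.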

[01:01:41.120 --> 01:01:46.120] When and if that ever happens, does that mean that one side of the earth would be getting

[01:01:46.120 --> 01:01:48.120] sun and the other side will not be getting sun?

[01:01:48.120 --> 01:01:50.120] No, it's all about the orientation of the earth and the moon.

[01:01:50.120 --> 01:01:51.120] Right.

[01:01:51.120 --> 01:01:52.120] It's not tidally locked to the sun.

[01:01:52.120 --> 01:01:53.120] It's tidally locked to the moon.

[01:01:53.120 --> 01:01:54.120] Right.

[01:01:54.120 --> 01:01:55.120] Now, if we were like...

[01:01:55.120 --> 01:01:57.120] Would the whole thing rotate, basically?

[01:01:57.120 --> 01:01:58.120] Yes.

[01:01:58.120 --> 01:02:00.120] We would always be facing each other.

[01:02:00.120 --> 01:02:01.120] Our orbit would be like this.

[01:02:01.120 --> 01:02:04.120] Instead of now, where the moon is locked and the earth is rotating.

[01:02:04.120 --> 01:02:07.120] So one side of the earth will always see the moon and the other side will never see

[01:02:07.120 --> 01:02:08.120] the moon.

[01:02:08.120 --> 01:02:11.120] But that wouldn't happen because we're going to burn up before we get to that point, I

[01:02:11.120 --> 01:02:12.120] believe.

[01:02:12.120 --> 01:02:13.120] Oh, thank you.

[01:02:13.120 --> 01:02:14.120] Perfect.

[01:02:14.120 --> 01:02:16.120] But there are planets that have been tidally locked to their sun because they're very big

[01:02:16.120 --> 01:02:19.120] and they're very close to their parent star.

[01:02:19.120 --> 01:02:22.120] So the tidal forces are strong enough to tidally lock that.

[01:02:22.120 --> 01:02:28.120] But 2020, 2021, and 2022 were a little bit different.

[01:02:28.120 --> 01:02:35.120] And it wasn't just because of that damn pandemic because these were the shortest days ever

[01:02:35.120 --> 01:02:36.120] recorded.

[01:02:36.120 --> 01:02:41.120] 2020 had 28 of the shortest days ever recorded since 1960.

[01:02:41.120 --> 01:02:42.120] What?

[01:02:42.120 --> 01:02:43.120] 28 days.

[01:02:43.120 --> 01:02:44.120] Why?

[01:02:44.120 --> 01:02:49.120] 2021 also had a plethora of very, very short days.

[01:02:49.120 --> 01:02:53.120] No dramatic records were broken in 2021, but they were still very, very short.

[01:02:53.120 --> 01:02:59.120] Oh, and 2020, I think we all can agree that if the days in 2020 were shorter, that's a

[01:02:59.120 --> 01:03:03.120] good thing because that year needed to be shorter than it was.

[01:03:03.120 --> 01:03:05.120] Literally the only good thing is this.

[01:03:05.120 --> 01:03:06.120] Right.

[01:03:06.120 --> 01:03:07.120] So 2022, we're not even done with it.

[01:03:07.120 --> 01:03:09.120] We've already broken some good records.

[01:03:09.120 --> 01:03:14.120] June 22nd was 1.59 milliseconds shorter than 24 hours.

[01:03:14.120 --> 01:03:15.120] Holy shit.

[01:03:15.120 --> 01:03:16.120] The shortest day.

[01:03:16.120 --> 01:03:17.120] Is that a lot?

[01:03:17.120 --> 01:03:23.120] It's not a lot in absolute terms, but relative to history, it is a lot.

[01:03:23.120 --> 01:03:24.120] 1.59 milliseconds.

[01:03:24.120 --> 01:03:27.120] The short day, shortest day ever recorded, ever recorded.

[01:03:27.120 --> 01:03:31.120] And then in July, we had a day that was the second shortest.

[01:03:31.120 --> 01:03:33.120] So something's happening.

[01:03:33.120 --> 01:03:40.120] So why do we have three years where the average day was less than 24 hours when over the past

[01:03:40.120 --> 01:03:46.120] 30, 40, 50, 60, 70 years, the average day has been a little bit longer than 24 hours?

[01:03:46.120 --> 01:03:47.120] Why?

[01:03:47.120 --> 01:03:48.120] What's going on?

[01:03:48.120 --> 01:03:49.120] Well, we're not sure.

[01:03:49.120 --> 01:03:52.120] We're not sure exactly, but there's lots, of course, there's lots of scientists and

[01:03:52.120 --> 01:03:53.120] their theories.

[01:03:53.120 --> 01:03:55.120] They have got lots of ideas of why.

[01:03:55.120 --> 01:04:00.120] One idea is that glaciers are melting and basically the poles don't have as much mass

[01:04:00.120 --> 01:04:03.120] or weight by them as they used to.

[01:04:03.120 --> 01:04:04.120] That's one idea that may be contributing.

[01:04:04.120 --> 01:04:06.120] So is that like a skater pulling in their arms?

[01:04:06.120 --> 01:04:07.120] Right.

[01:04:07.120 --> 01:04:08.120] Yes.

[01:04:08.120 --> 01:04:10.120] Redistribution of mass, like the skater pulling in their arms to go faster.

[01:04:10.120 --> 01:04:12.120] That's definitely related.

[01:04:12.120 --> 01:04:16.120] And related to that, Steve, another idea for why we're getting the speed-up is

[01:04:16.120 --> 01:04:22.120] the movement of the molten core of the planet, which could also

[01:04:22.120 --> 01:04:23.120] speed up the Earth.

[01:04:23.120 --> 01:04:27.120] Seismic activity is another option that they throw out.

[01:04:27.120 --> 01:04:32.120] My theory is that it's the sheer mass of meatballs at Jay's house that is kind of screwing with

[01:04:32.120 --> 01:04:33.120] our rotation.

[01:04:33.120 --> 01:04:34.120] I would do it.

[01:04:34.120 --> 01:04:35.120] Jay, I'm telling you, man.

[01:04:35.120 --> 01:04:37.120] I've got two scientists that agree with me on that.

[01:04:37.120 --> 01:04:43.120] But a lot of scientists will also throw out there the Chandler wobble as one potential

[01:04:43.120 --> 01:04:44.120] reason why the Earth is speeding up.

[01:04:44.120 --> 01:04:45.120] Is that a dance?

[01:04:45.120 --> 01:04:46.120] What is it?

[01:04:46.120 --> 01:04:47.120] The Friends thing?

[01:04:47.120 --> 01:04:48.120] Yes.

[01:04:48.120 --> 01:04:49.120] That's the joke.

[01:04:49.120 --> 01:04:52.120] And I couldn't think of a really, really good version of that joke.

[01:04:52.120 --> 01:04:54.120] But I'll just describe what it is.

[01:04:54.120 --> 01:04:57.120] It's essentially the varying wobble of Earth's axis of rotation.

[01:04:57.120 --> 01:04:58.120] It's actually kind of complicated.

[01:04:58.120 --> 01:05:02.120] I'm trying to really wrap my head around what's exactly going on with this Chandler wobble.

[01:05:02.120 --> 01:05:07.120] But it's the axis of rotation that varies, causing a shorter term wobble.

[01:05:07.120 --> 01:05:09.120] So that's as much as I'll say about the Chandler wobble.

[01:05:09.120 --> 01:05:10.120] Okay, so what does this mean?

[01:05:10.120 --> 01:05:11.120] What's going to happen?

[01:05:11.120 --> 01:05:13.120] What are some really bad things?

[01:05:13.120 --> 01:05:16.120] Okay, it's the leap second that could be concerning here.

[01:05:16.120 --> 01:05:17.120] Because we've had leap seconds.

[01:05:17.120 --> 01:05:22.120] We've had plenty of leap seconds where you add an extra second to coordinated universal

[01:05:22.120 --> 01:05:23.120] time.

[01:05:23.120 --> 01:05:24.120] And that's been done.

[01:05:24.120 --> 01:05:26.120] Nobody really thinks about it anymore.

[01:05:26.120 --> 01:05:28.120] But it's problematic.

[01:05:28.120 --> 01:05:32.120] In 2012, Reddit was taken down because a leap second was added that year.

[01:05:32.120 --> 01:05:33.120] Wow.

[01:05:33.120 --> 01:05:39.120] And if I was into Reddit then as I am now, I would have been pissed if Reddit went down.

[01:05:39.120 --> 01:05:40.120] But they've done tricks.

[01:05:40.120 --> 01:05:43.120] They've got something called leap smearing, where they take microsecond slowdowns.

[01:05:43.120 --> 01:05:44.120] They need a rebrand.

[01:05:44.120 --> 01:05:45.120] Yes.

[01:05:45.120 --> 01:05:52.120] In the course of a day, they might do microsecond slowdowns leading up to the leap second.

[01:05:52.120 --> 01:05:55.120] So as to make it a little more palatable, I guess.
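
To make the smearing idea concrete, here is a toy Python sketch; the function name `smeared_clock`, the 24-hour window, and the linear ramp are illustrative choices (similar in spirit to Google's published leap smear), not the exact algorithm of any particular provider.

```python
# Toy sketch of leap smearing: instead of inserting one whole extra second at
# midnight, stretch every second slightly across a window so no clock ever jumps.

WINDOW_S = 24 * 60 * 60            # absorb the leap second over 24 hours
RATE = 1.0 / WINDOW_S              # each smeared second runs ~11.6 microseconds slow

def smeared_clock(true_elapsed_s: float) -> float:
    """Clock reading given true elapsed seconds since the smear window opened."""
    if true_elapsed_s <= 0:
        return true_elapsed_s                     # before the window: normal time
    if true_elapsed_s >= WINDOW_S:
        return true_elapsed_s - 1.0               # after the window: one full second absorbed
    return true_elapsed_s * (1.0 - RATE)          # inside the window: run slightly slow

print(smeared_clock(WINDOW_S / 2))   # halfway through, half a second has been absorbed
print(smeared_clock(WINDOW_S))       # by the end, the whole leap second is gone
```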

[01:05:55.120 --> 01:05:56.120] Bob, but wait.

[01:05:56.120 --> 01:05:58.120] I hate to cut in.

[01:05:58.120 --> 01:06:03.120] But why does a fraction of a second matter in the world?

[01:06:03.120 --> 01:06:05.120] Well, it's not a fraction of a second.

[01:06:05.120 --> 01:06:06.120] It's a full second.

[01:06:06.120 --> 01:06:07.120] I mean, think about it, Jay.

[01:06:07.120 --> 01:06:12.120] I mean, a second is small, but it's important.

[01:06:12.120 --> 01:06:17.120] And computer systems and GPS and satellites, lots of things are interrelated.

[01:06:17.120 --> 01:06:18.120] And it took down Reddit.

[01:06:18.120 --> 01:06:20.120] I mean, this can happen.

[01:06:20.120 --> 01:06:26.120] Y2K is kind of a related example of when you mess with something so fundamental.

[01:06:26.120 --> 01:06:30.120] And I'll go into it in a little bit more detail in one second, Jay.

[01:06:30.120 --> 01:06:33.120] So a normal leap second can be problematic.

[01:06:33.120 --> 01:06:36.120] Perhaps it's not as problematic as it was.

[01:06:36.120 --> 01:06:40.120] But a negative leap second, if the Earth keeps spinning faster and faster,

[01:06:40.120 --> 01:06:46.120] or if the day keeps averaging shorter than 24 hours,

[01:06:46.120 --> 01:06:50.120] then we may need to add a negative leap second.

[01:06:50.120 --> 01:06:53.120] And that's much more problematic than a regular leap second,

[01:06:53.120 --> 01:06:55.120] because you're skipping a second instead of adding one.

[01:06:55.120 --> 01:07:01.120] It's tougher to do and more risky than adding a second for various technical reasons.

[01:07:01.120 --> 01:07:02.120] For example...

[01:07:02.120 --> 01:07:03.120] This is going into the future.

[01:07:03.120 --> 01:07:05.120] This really sounds like a time travel episode.

[01:07:05.120 --> 01:07:06.120] Yeah, right?

[01:07:06.120 --> 01:07:09.120] But smartphones, computers, communication systems,

[01:07:09.120 --> 01:07:13.120] they synchronize using something called a network time protocol.

[01:07:13.120 --> 01:07:17.120] And that network time protocol is based on the number of seconds

[01:07:17.120 --> 01:07:20.120] that have transpired since January 1st, 1970.

[01:07:20.120 --> 01:07:24.120] So you throw out a second there and things can go a little wonky.
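
A quick illustration of that epoch-seconds bookkeeping; note that counting from January 1st, 1970 is the Unix convention, while NTP's own timestamp format counts from 1900, but the synchronization issue is the same either way.

```python
# Systems synchronize by exchanging a plain count of seconds since an epoch.
# Interval math assumes every day is exactly 86,400 seconds long, which is why
# inserting or deleting a leap second can trip up software.

import time
from datetime import datetime, timezone

now = time.time()                      # seconds since the Unix epoch (1970-01-01 UTC)
print(f"Seconds since 1970-01-01 UTC: {now:.0f}")
print(datetime.fromtimestamp(now, tz=timezone.utc).isoformat())

# "One day later" is naively just +86,400 seconds; across a leap second
# (positive or negative) that label ends up one second off from UTC's labels.
one_day_later = datetime.fromtimestamp(now + 86_400, tz=timezone.utc)
print(one_day_later.isoformat())
```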

[01:07:24.120 --> 01:07:25.120] So that's a little concerning.

[01:07:25.120 --> 01:07:28.120] It can cause some issues with these systems.

[01:07:28.120 --> 01:07:30.120] Also, there's GPS satellites.

[01:07:30.120 --> 01:07:33.120] GPS satellites don't account for rotation.

[01:07:33.120 --> 01:07:35.120] They're not really built to deal with rotation.

[01:07:35.120 --> 01:07:37.120] So if the Earth is spinning faster,

[01:07:37.120 --> 01:07:43.120] the GPS satellite will all of a sudden be over a specific area a little earlier

[01:07:43.120 --> 01:07:45.120] than it would have been previously.

[01:07:45.120 --> 01:07:46.120] And that could mean the difference,

[01:07:46.120 --> 01:07:50.120] even if the Earth sped up by a half a millisecond,

[01:07:50.120 --> 01:07:54.120] it could be 10 inches or 26 centimeters off.
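
One back-of-the-envelope way to get a figure in that ballpark (assuming the quoted number refers to ground motion at the equator, which moves at about 465 m/s):

$$
\Delta x \;\approx\; v\,\Delta t \;\approx\; 465\ \tfrac{\text{m}}{\text{s}} \times 0.5\ \text{ms} \;\approx\; 0.23\ \text{m} \;\approx\; 9\ \text{inches},
$$

the same order as the 26 cm / 10 inch figure quoted, and the error keeps accumulating if nothing corrects for it.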

[01:07:54.120 --> 01:07:56.120] And that would compound.

[01:07:56.120 --> 01:07:59.120] And eventually the GPS satellites could be essentially useless

[01:07:59.120 --> 01:08:02.120] if we don't do anything, which we probably will.

[01:08:02.120 --> 01:08:05.120] I mean, it's not like, oh my God, GPS is going to be worthless.

[01:08:05.120 --> 01:08:08.120] And when you say do something, you mean, like, we've got to program around this problem.

[01:08:08.120 --> 01:08:11.120] Yeah, I'm not sure what level of effort would be required,

[01:08:11.120 --> 01:08:13.120] but I'm sure it's not going to be trivial.

[01:08:13.120 --> 01:08:16.120] So some people say that this is going to be over soon

[01:08:16.120 --> 01:08:23.120] and this increased rotation speed of the Earth isn't going to necessarily stay this way for years.

[01:08:23.120 --> 01:08:28.120] Some people are saying this could be the beginning of a 50-year scenario

[01:08:28.120 --> 01:08:32.120] where the Earth is spinning faster and the day averages shorter than 24 hours.

[01:08:32.120 --> 01:08:35.120] And we may absolutely need to throw in some of these negative leap seconds,

[01:08:35.120 --> 01:08:37.120] which could cause some problems.

[01:08:37.120 --> 01:08:39.120] So that's the story.

[01:08:39.120 --> 01:08:40.120] It's interesting.

[01:08:40.120 --> 01:08:42.120] I'm not too worried about it.

[01:08:42.120 --> 01:08:45.120] But we'll see if some negative leap seconds get thrown in there,

[01:08:45.120 --> 01:08:53.120] and we might find out by the end of this year or the following year if this keeps up.

[01:08:53.120 --> 01:08:55.120] So, Bob, are you angry about all this?

[01:08:55.120 --> 01:08:56.120] No.

[01:08:56.120 --> 01:09:00.120] It was just an interesting bit of research.

[01:09:00.120 --> 01:09:03.120] It was actually tough.

[01:09:03.120 --> 01:09:04.120] I'm answering.

[01:09:04.120 --> 01:09:08.120] It was tough to really get to fully understand all the nuances here,

[01:09:08.120 --> 01:09:11.120] because you've got sidereal day, solar day, mean solar day,

[01:09:11.120 --> 01:09:16.120] all these things, and different websites had different takes on exactly what they mean.

[01:09:16.120 --> 01:09:21.120] And it was interesting to put it all together and understand exactly what was happening.

[01:09:21.120 --> 01:09:22.120] So, yeah, I enjoyed this.

[01:09:22.120 --> 01:09:25.120] A great bar bet came up when we were talking about this before.

[01:09:25.120 --> 01:09:29.120] So, Andrea, how many times does the Earth rotate on its axis in one year?

[01:09:29.120 --> 01:09:31.120] 365 and a quarter, isn't that it?

[01:09:31.120 --> 01:09:32.120] Wrong.

[01:09:32.120 --> 01:09:33.120] Oh.

[01:09:33.120 --> 01:09:41.120] 366 and a quarter, because in going around the sun, it's got to rotate one extra time.

[01:09:41.120 --> 01:09:46.120] A day, you know, one day is a full rotation plus a degree,

[01:09:46.120 --> 01:09:50.120] and that extra degree adds up over a year to a whole other rotation.

[01:09:50.120 --> 01:09:51.120] Right.

[01:09:51.120 --> 01:09:55.120] 361 degrees is the mean solar day, 24 hours.

[01:09:55.120 --> 01:09:57.120] A sidereal day is...

[01:09:57.120 --> 01:09:59.120] 23 hours and 56 minutes.

[01:09:59.120 --> 01:10:00.120] Exactly.

[01:10:00.120 --> 01:10:01.120] Wow.

[01:10:01.120 --> 01:10:02.120] 23 hours and 56 minutes.

[01:10:02.120 --> 01:10:03.120] It's four minutes.
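
The bar-bet arithmetic checks out: a sidereal rotation takes about 23 h 56 m 4 s, or 23.9345 hours, so in a year of 365.25 solar days the Earth turns

$$
\frac{365.25 \times 24\ \text{h}}{23.9345\ \text{h}} \;\approx\; 366.25 \ \text{times}.
$$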

[01:10:03.120 --> 01:10:05.120] But there's also lots of variations.

[01:10:05.120 --> 01:10:08.120] You're going to leave work early and be like, I'm on a sidereal day.

[01:10:08.120 --> 01:10:10.120] That is such a skeptic thing.

[01:10:10.120 --> 01:10:11.120] Like, wrong.

[01:10:11.120 --> 01:10:12.120] 365.

[01:10:12.120 --> 01:10:13.120] You know what I mean?

[01:10:13.120 --> 01:10:14.120] Come on.

[01:10:14.120 --> 01:10:15.120] Yeah.

[01:10:15.120 --> 01:10:21.120] But also, the other nuance is that the day varies depending on where you are in the orbit

[01:10:21.120 --> 01:10:25.120] and what season it is and the tilt of the Earth.

[01:10:25.120 --> 01:10:29.120] There's so many little factors that go in here to make it extra confusing.

[01:10:29.120 --> 01:10:35.120] So can't we help by having a party somewhere on Earth that will slow the rotation down?

[01:10:35.120 --> 01:10:37.120] There must be some human configuration that we could do.

[01:10:37.120 --> 01:10:39.120] We all go to the North Pole at the same time.

[01:10:39.120 --> 01:10:44.120] We all have to jump at the same time so that we can alleviate the pressure.

[01:10:44.120 --> 01:10:45.120] It would be like Earth.

[01:10:45.120 --> 01:10:46.120] It would be like in an elevator.

[01:10:46.120 --> 01:10:49.120] Andrea, it would be like an 80s movie, like the end of an 80s movie where we all jump.

[01:10:49.120 --> 01:10:50.120] Yeah.

[01:10:50.120 --> 01:10:53.120] And like slow motion and freeze and this is us saving the world.

[01:10:53.120 --> 01:10:57.120] Everyone needs to jump at the same time or we have a negative leap second.

[01:10:57.120 --> 01:10:58.120] You two.

[01:10:58.120 --> 01:10:59.120] Right.

[01:10:59.120 --> 01:11:00.120] All right.

[01:11:00.120 --> 01:11:01.120] All right.

[01:11:01.120 --> 01:11:02.120] Thanks, Bob.

[01:11:02.120 --> 01:11:03.120] I'm glad that's over.

[01:11:03.120 --> 01:11:04.120] That's cool.

Science or Fiction (1:11:11)

Theme: Misinformation

Item #1: Reported trust in the media in 2021 was highest in China at 80%, and lowest in Russia at 29%, with the US in between at 39%.[6]
Item #2: Analysis of social media posts finds that bots are far more likely to spread false information and are responsible for as much as 90% of its spread on the most popular platforms.[7]
Item #3: Research shows that fake news spreads 6 times faster and 10 times farther on Twitter than true news, and that people are 70% more likely to share a false tweet than a truthful one.[8]

Answer Item
  Fiction: Bots spread 90% of disinfo
  Science: Reported trust in the media
  Science: Fake news faster & farther

Host Result
  Steve: win

Rogue Guess
  Evan: Reported trust in the media
  Kelly: Reported trust in the media
  Jay: Reported trust in the media
  Bob: Reported trust in the media
  Andrea: Bots spread 90% of disinfo

Voice-over: It's time for Science or Fiction.

Evan's Response

Kelly's Response

Jay's Response

Bob's Response

Andrea's Response

Audience's Responses

Steve Explains Item #3

Steve Explains Item #2

Steve Explains Item #1

[01:11:04.120 --> 01:11:05.120] All right, guys.

[01:11:05.120 --> 01:11:06.120] You know what time it is.

[01:11:06.120 --> 01:11:07.120] Science or fiction.

[01:11:07.120 --> 01:11:08.120] It's time.

[01:11:08.120 --> 01:11:13.120] It's time for science or fiction.

[01:11:13.120 --> 01:11:22.120] It's time for science or fiction.

[01:11:22.120 --> 01:11:23.120] I have three items here.

[01:11:23.120 --> 01:11:26.120] There is a theme to these items.

[01:11:26.120 --> 01:11:29.120] The theme is misinformation.

[01:11:29.120 --> 01:11:32.120] Pretty obvious theme.

[01:11:32.120 --> 01:11:34.120] These are things you may have heard before.

[01:11:34.120 --> 01:11:40.120] And, you know, the details matter because you know these things in broad brushstroke.

[01:11:40.120 --> 01:11:44.120] You know, one of these things may be wrong because of the details.

[01:11:44.120 --> 01:11:46.120] Someone's going to warn you about that.

[01:11:46.120 --> 01:11:47.120] All right.

[01:11:47.120 --> 01:11:48.120] Let's get going.

[01:11:48.120 --> 01:11:55.120] Item number one, reported trust in the media in 2021 was highest in China at 80 percent

[01:11:55.120 --> 01:12:01.120] and lowest in Russia at 29 percent with the U.S. in between at 39 percent.

[01:12:01.120 --> 01:12:07.120] Number two, analysis of social media posts finds that bots are far more likely to spread

[01:12:07.120 --> 01:12:12.120] false information and are responsible for as much as 90 percent of its spread on the

[01:12:12.120 --> 01:12:14.120] most popular platforms.

[01:12:14.120 --> 01:12:20.120] And, item number three, research shows that fake news spreads six times faster and ten

[01:12:20.120 --> 01:12:26.120] times farther on Twitter than true news and that people are 70 percent more likely to

[01:12:26.120 --> 01:12:28.120] share a false treat tweet.

[01:12:28.120 --> 01:12:31.120] I like sharing false treats better than myself.

[01:12:31.120 --> 01:12:32.120] Yeah.

[01:12:32.120 --> 01:12:33.120] A false tweet.

[01:12:33.120 --> 01:12:34.120] It's like we're going to put spinach in a brownie.

[01:12:34.120 --> 01:12:35.120] Yeah.

[01:12:35.120 --> 01:12:36.120] The truthful one.

[01:12:36.120 --> 01:12:37.120] Yeah.

[01:12:37.120 --> 01:12:38.120] So, spinach flavored candies.

[01:12:38.120 --> 01:12:39.120] Yeah.

[01:12:39.120 --> 01:12:40.120] Okay.

[01:12:40.120 --> 01:12:41.120] That's going to be fun editing, Steve.

[01:12:41.120 --> 01:12:43.120] You know, we're going to order this way.

[01:12:43.120 --> 01:12:45.120] Evan, we're going to start with you.

[01:12:45.120 --> 01:12:46.120] Okay.

[01:12:46.120 --> 01:12:47.120] I guess.

[01:12:47.120 --> 01:12:48.120] Wait, wait, Steve.

[01:12:48.120 --> 01:12:49.120] We have a guest.

[01:12:49.120 --> 01:12:50.120] I know.

[01:12:50.120 --> 01:12:52.120] And the political scientist is going last.

[01:12:52.120 --> 01:12:55.120] I'm ready for this one.

[01:12:55.120 --> 01:12:57.120] Can you ask the audience to vote for which one they want?

[01:12:57.120 --> 01:12:58.120] Yeah.

[01:12:58.120 --> 01:12:59.120] We'll get that at the end.

[01:12:59.120 --> 01:13:00.120] All right.

[01:13:00.120 --> 01:13:01.120] I'm going to take them in reverse order if that's okay.

[01:13:01.120 --> 01:13:02.120] Yeah.

[01:13:02.120 --> 01:13:03.120] Go ahead.

[01:13:03.120 --> 01:13:04.120] Let's see.

[01:13:04.120 --> 01:13:09.120] On Twitter, six times faster and ten times farther and that people are 70 percent more

[01:13:09.120 --> 01:13:11.120] likely to share a false tweet.

[01:13:11.120 --> 01:13:14.120] I think those numbers line up correctly.

[01:13:14.120 --> 01:13:16.120] Twitter has reach.

[01:13:16.120 --> 01:13:21.120] It has arms and it is deep and pervasive.

[01:13:21.120 --> 01:13:25.120] So, I'm not really surprised by those numbers in that regard.

[01:13:25.120 --> 01:13:27.120] It's kind of disappointing though.

[01:13:27.120 --> 01:13:32.120] The second one, social media posts find that these bots are more likely to spread false

[01:13:32.120 --> 01:13:36.120] information and are responsible for as much as 90 percent.

[01:13:36.120 --> 01:13:37.120] Wow.

[01:13:37.120 --> 01:13:39.120] Of its spread on the most popular platforms.

[01:13:39.120 --> 01:13:41.120] Well, there are bots.

[01:13:41.120 --> 01:13:43.120] There's no doubt about that.

[01:13:43.120 --> 01:13:46.120] But to this degree, that even surprises me.

[01:13:46.120 --> 01:13:52.120] But if you think about it, depending on how you program the bot, you could do it in such

[01:13:52.120 --> 01:13:56.120] a way that this would be possible because the thing is it's a runaway train basically.

[01:13:56.120 --> 01:13:58.120] So, yeah, it could go that fast.

[01:13:58.120 --> 01:14:01.120] The first one is the one I'm not really in sync with here.

[01:14:01.120 --> 01:14:08.120] And I think, for part of this one, my gut is telling me it's the U.S. number here, 39 percent.

[01:14:08.120 --> 01:14:12.120] I actually think, if my memory serves, it's lower than that.

[01:14:12.120 --> 01:14:15.120] So, these numbers I think are out of sync.

[01:14:15.120 --> 01:14:21.120] Perhaps it's the U.S. 29, Russia 39, China 80, something like that or they're all just

[01:14:21.120 --> 01:14:22.120] entirely wrong.

[01:14:22.120 --> 01:14:23.120] I think that one's the fiction.

[01:14:23.120 --> 01:14:24.120] Okay, Kelly.

[01:14:24.120 --> 01:14:26.120] It's your first science or fiction on the SGU.

[01:14:26.120 --> 01:14:27.120] Don't blow it.

[01:14:27.120 --> 01:14:30.120] I'm just glad I got out of the guest-goes-first thing.

[01:14:30.120 --> 01:14:34.120] So, I'm going to go reverse order too because I feel pretty good about number three because

[01:14:34.120 --> 01:14:40.120] misinformation is designed to be more interesting and more shareable, so six times faster seems

[01:14:40.120 --> 01:14:43.120] reasonable, if not low.

[01:14:43.120 --> 01:14:45.120] Ten times farther makes sense.

[01:14:45.120 --> 01:14:47.120] I see no problem with that.

[01:14:47.120 --> 01:14:55.120] Number two, 90 percent sounds kind of high, but if the bots are just sharing misinformation

[01:14:55.120 --> 01:15:00.120] rather than creating it, I could see that happening because sharing is a lot easier

[01:15:00.120 --> 01:15:01.120] on social media.

[01:15:01.120 --> 01:15:03.120] So, I guess that leaves me with number one.

[01:15:03.120 --> 01:15:09.120] I'm not quite sure why I think number one is the fiction, but process of elimination.

[01:15:09.120 --> 01:15:10.120] Okay, Jay.

[01:15:10.120 --> 01:15:15.120] I'm going to start with number two because I think that one is the most likely to be

[01:15:15.120 --> 01:15:17.120] science.

[01:15:17.120 --> 01:15:20.120] Social media bots are absolutely real.

[01:15:20.120 --> 01:15:25.120] They absolutely are spreading misinformation and I'm not surprised to hear that they're

[01:15:25.120 --> 01:15:29.120] spreading up to 90 percent of that misinformation.

[01:15:29.120 --> 01:15:33.120] So, that one seems pretty obvious to me.

[01:15:33.120 --> 01:15:39.120] Going to number three about fake news spreads six times faster and ten times farther, I

[01:15:39.120 --> 01:15:41.120] mean that sounds legitimate as well.

[01:15:41.120 --> 01:15:44.120] That tracks with all the information that I have in my head.

[01:15:44.120 --> 01:15:49.120] The first one I think is the fiction and I'll give you a couple of reasons why.

[01:15:49.120 --> 01:15:54.120] One, I think that for some reason that 80 percent number in China, it seems a little

[01:15:54.120 --> 01:15:59.120] high to me, but the one that really has got me here is that Russia is as low as 29 percent,

[01:15:59.120 --> 01:16:05.120] which from my understanding, especially because of the Ukraine war, that there is quite a

[01:16:05.120 --> 01:16:11.120] bit of belief in the media coming out of Russia by the citizens there and I think it's much

[01:16:11.120 --> 01:16:13.120] higher than 29 percent.

[01:16:13.120 --> 01:16:14.120] Okay, Bob.

[01:16:14.120 --> 01:16:15.120] Wow.

[01:16:15.120 --> 01:16:20.120] When I first read that first one, 80 percent, 29 and 39, it seemed fairly spot on to me

[01:16:20.120 --> 01:16:23.120] and now I'm questioning myself based on what everyone's saying here.

[01:16:23.120 --> 01:16:26.120] The second one seems reasonable to me.

[01:16:26.120 --> 01:16:31.120] 90 percent at first blush seemed pretty high, but I mean that's what bots do.

[01:16:31.120 --> 01:16:36.120] I mean it doesn't take much for them to do that and Kelly was saying that they're not

[01:16:36.120 --> 01:16:37.120] creating it.

[01:16:37.120 --> 01:16:40.120] They're just kind of spreading it, which is very easy for them to do.

[01:16:40.120 --> 01:16:41.120] It's kind of what they've designed to do.

[01:16:41.120 --> 01:16:44.120] And three makes a lot of sense to me here as well.

[01:16:44.120 --> 01:16:48.120] Again, as Kelly said that these are designed to be spread.

[01:16:48.120 --> 01:16:52.120] This misinformation is designed to be enticing and clickable and spreadable.

[01:16:52.120 --> 01:16:55.120] So that makes perfect sense.

[01:16:55.120 --> 01:17:01.120] So yeah, there's just a lot of opportunity for this first one here, which is trust in

[01:17:01.120 --> 01:17:02.120] media.

[01:17:02.120 --> 01:17:04.120] There's a lot of potential for one of these to be off.

[01:17:04.120 --> 01:17:07.120] So I'm going to agree with everyone and say that that's fiction.

[01:17:07.120 --> 01:17:08.120] Okay.

[01:17:08.120 --> 01:17:09.120] And Andrea.

[01:17:09.120 --> 01:17:10.120] Correct us all, Andrea.

[01:17:10.120 --> 01:17:11.120] All right.

[01:17:11.120 --> 01:17:13.120] Well, I'm going to put – this is really going to be unfortunate when I get this wrong

[01:17:13.120 --> 01:17:19.120] because I studied trust in media for some time and those numbers actually seem okay

[01:17:19.120 --> 01:17:20.120] to me.

[01:17:20.120 --> 01:17:21.120] Can I change mine?

[01:17:21.120 --> 01:17:23.120] I'm really doubting it because it's been a while since I looked.

[01:17:23.120 --> 01:17:25.120] So I don't have 2021 numbers.

[01:17:25.120 --> 01:17:29.120] And just like you said and just like we see with contagion and all of that, the more everyone

[01:17:29.120 --> 01:17:31.120] else doubts one, I'm like, maybe, I don't know.

[01:17:31.120 --> 01:17:35.120] But I think Russia seems a little low.

[01:17:35.120 --> 01:17:36.120] China is always sky high.

[01:17:36.120 --> 01:17:42.120] And I like it because it's a good example of surveys as blunt instruments and getting

[01:17:42.120 --> 01:17:45.120] someone to report they have trust in media versus whether they have trust in media.

[01:17:45.120 --> 01:17:46.120] There's a lot more nuance to it than that.

[01:17:46.120 --> 01:17:49.120] But one of the findings is that it comes out that way even though – anyway.

[01:17:49.120 --> 01:17:51.120] So I'm going to say one is science.

[01:17:51.120 --> 01:17:54.120] Two is the one that I'm hung up on.

[01:17:54.120 --> 01:17:58.120] And I think I've fallen into this trap before and this is one of the reasons I'm bad at

[01:17:58.120 --> 01:17:59.120] science or fiction.

[01:17:59.120 --> 01:18:00.120] But I think it's so vague.

[01:18:00.120 --> 01:18:04.120] The 90% feels like a lot but it could be, yeah, they're all just talking to each other

[01:18:04.120 --> 01:18:07.120] and we just don't see it or we're not following it.

[01:18:07.120 --> 01:18:11.120] But I feel like it's the most popular platforms.

[01:18:11.120 --> 01:18:14.120] We're not great at necessarily identifying bots.

[01:18:14.120 --> 01:18:18.120] I just feel like there's a lot of detail but it's not as crystal clear as I would like

[01:18:18.120 --> 01:18:19.120] to believe that it's science.

[01:18:19.120 --> 01:18:24.120] And then number three, I was pretty convinced was science and then Kelly's social media

[01:18:24.120 --> 01:18:26.120] expert take fully convinced me that it was science.

[01:18:26.120 --> 01:18:27.120] Yeah.

[01:18:27.120 --> 01:18:28.120] So I'm going to say one is fiction.

[01:18:28.120 --> 01:18:29.120] I'm sorry.

[01:18:29.120 --> 01:18:30.120] Two is fiction.

[01:18:30.120 --> 01:18:31.120] Two is fiction.

[01:18:31.120 --> 01:18:32.120] Andrea is departing from the crowd.

[01:18:32.120 --> 01:18:35.120] Ian, do we have votes from the listeners, the audience out there?

[01:18:35.120 --> 01:18:36.120] We sure do.

[01:18:36.120 --> 01:18:42.120] So with 28 votes, number one is winning 68%.

[01:18:42.120 --> 01:18:45.120] Number two has 14%, although it's still kind of moving.

[01:18:45.120 --> 01:18:48.120] And number three is 19%.

[01:18:48.120 --> 01:18:49.120] Okay.

[01:18:49.120 --> 01:18:52.120] So pretty close to the panel.

[01:18:52.120 --> 01:18:56.120] So I guess I'll take this in reverse order since everyone agrees on the third one.

[01:18:56.120 --> 01:19:01.120] Research shows that fake news spreads six times faster and ten times farther on Twitter

[01:19:01.120 --> 01:19:10.120] than true news and that people are 70% more likely to share a false tweet than a truthful one.

[01:19:10.120 --> 01:19:15.120] So everyone on the panel thinks this one is science, most of the audience thinks this

[01:19:15.120 --> 01:19:18.120] one is science, and this one is science.

[01:19:18.120 --> 01:19:19.120] This one is science.

[01:19:19.120 --> 01:19:21.120] I get so stressed out for this.

[01:19:21.120 --> 01:19:25.120] We actually talked about this study I think on the show, whatever it was, a year or two

[01:19:25.120 --> 01:19:26.120] ago when it came out.

[01:19:26.120 --> 01:19:34.120] And yeah, there was a massive review of tweets over years, millions of tweets, and they found

[01:19:34.120 --> 01:19:39.120] that among tweets that were objectively true or objectively false, the ones that were

[01:19:39.120 --> 01:19:47.120] false spread much faster; they would average like 10,000 retweets, whereas

[01:19:47.120 --> 01:19:53.120] the true ones would rarely get above 1,000, so they went basically ten times deeper into

[01:19:53.120 --> 01:20:00.120] Twitter, and people were 70% more likely to share a false tweet than a truthful one.

[01:20:00.120 --> 01:20:07.120] As Kelly said, if you are unfettered from reality and facts and truthfulness, then you

[01:20:07.120 --> 01:20:12.120] can formulate your tweet to be more shareable and interesting.

[01:20:12.120 --> 01:20:18.120] And also, other research shows that the factor that probably predicts whether a tweet will

[01:20:18.120 --> 01:20:23.120] be shared is that it's novel, it's something new, people haven't heard before.

[01:20:23.120 --> 01:20:28.120] And again, that's more likely to be true if the tweet itself is fake, right?

[01:20:28.120 --> 01:20:32.120] You can make, you can craft a novel tweet.

[01:20:32.120 --> 01:20:34.120] This Carrizo is actually a son.

[01:20:34.120 --> 01:20:38.120] You can craft a novel tweet if you don't have to be true.

[01:20:38.120 --> 01:20:41.120] I guess we'll keep going, we'll take this in reverse order.

[01:20:41.120 --> 01:20:46.120] Analysis, number two, analysis of social media posts finds that bots are far more likely

[01:20:46.120 --> 01:20:51.120] to spread false information and are responsible for as much as 90% of its spread on the most

[01:20:51.120 --> 01:20:52.120] popular platforms.

[01:20:52.120 --> 01:20:58.120] Andrea, you are pretty much out there on your own thinking that this one is the fiction.

[01:20:58.120 --> 01:21:00.120] Everyone else thinks this one is science.

[01:21:00.120 --> 01:21:02.120] The majority of the audience thinks this one is science.

[01:21:02.120 --> 01:21:05.120] And this one is the fiction.

[01:21:05.120 --> 01:21:11.120] My whole field was on the line.

[01:21:11.120 --> 01:21:14.120] There's also a lot of people in the chat being like, we changed our vote based on what you

[01:21:14.120 --> 01:21:16.120] said.

[01:21:16.120 --> 01:21:18.120] Listen to the experts.

[01:21:18.120 --> 01:21:23.120] This was the gotcha one, because everyone thinks that bots are out there menacing the

[01:21:23.120 --> 01:21:25.120] social media and doing all this horrible stuff.

[01:21:25.120 --> 01:21:29.120] People are far more likely than bots to spread false tweets.

[01:21:29.120 --> 01:21:35.120] In fact, the same study as number three looked at this, where bots are actually as likely

[01:21:35.120 --> 01:21:37.120] to spread a true tweet as a false one.

[01:21:37.120 --> 01:21:39.120] There's no discrimination.

[01:21:39.120 --> 01:21:47.120] They basically amplify tweets by automatically spreading them, but they're not good at discriminating.

[01:21:47.120 --> 01:21:53.120] Also, they're not good at retweeting, because they don't really know what are the good tweets.

[01:21:53.120 --> 01:21:57.120] Basically, the AI is just not good enough to be as effective as people, whereas people are

[01:21:57.120 --> 01:22:03.120] much more effective spreaders of tweets and are also far more likely to favor false

[01:22:03.120 --> 01:22:05.120] tweets.

[01:22:05.120 --> 01:22:11.120] I made it platform nonspecific, just the popular ones, but the bots are, again, not to say

[01:22:11.120 --> 01:22:15.120] they're not a problem, not to say they're not making the problem worse, but actually

[01:22:15.120 --> 01:22:20.120] people are far more of a problem than bots, because they actually are more likely to spread

[01:22:20.120 --> 01:22:25.120] the false bits of news, whereas bots are really not good at discriminating.

[01:22:25.120 --> 01:22:27.120] I've got to follow more bots, then.

[01:22:27.120 --> 01:22:28.120] Yeah.

[01:22:28.120 --> 01:22:29.120] Yeah.

[01:22:29.120 --> 01:22:30.120] Okay.

[01:22:30.120 --> 01:22:35.120] All this means that reported trust in the media in 2021, Jay, before the war in Ukraine,

[01:22:35.120 --> 01:22:40.120] was highest in China, 80%, and lowest in Russia, 29%, with the U.S. in between, 39%.

[01:22:40.120 --> 01:22:42.120] This is science, as Andrea said.

[01:22:42.120 --> 01:22:48.120] I was very careful to put reported trust in the media, because that's only what we know,

[01:22:48.120 --> 01:22:52.120] and that 80% in China, there's a lot of reason to think that that's what people are saying,

[01:22:52.120 --> 01:22:55.120] but not necessarily what they believe.

[01:22:55.120 --> 01:22:59.120] What was interesting to me here, because I immediately saw that, of course they're saying

[01:22:59.120 --> 01:23:03.120] they trust the media, because they don't feel safe saying that they don't trust the media, they

[01:23:03.120 --> 01:23:08.120] don't have the security to do that, but then the disconnect between that

[01:23:08.120 --> 01:23:12.120] and Russia being literally at the bottom, wouldn't it be the same in Russia?

[01:23:12.120 --> 01:23:17.120] Despite the fact that these are both authoritarian regimes, there's something fundamentally different

[01:23:17.120 --> 01:23:21.120] about the experience of people in China and in Russia.

[01:23:21.120 --> 01:23:27.120] This is in 2021, maybe it's different now because of the war in Ukraine, and so it's

[01:23:27.120 --> 01:23:30.120] a good thought, Jay, to look at that detail.

[01:23:30.120 --> 01:23:33.120] Could it be Russia's access to the Internet?

[01:23:33.120 --> 01:23:37.120] Because we know China's locked down much more on the Internet than Russia is, I think.

[01:23:37.120 --> 01:23:39.120] Yeah, I don't know, maybe that's the difference.

[01:23:39.120 --> 01:23:41.120] I don't know how much different that is between China and Russia.

[01:23:41.120 --> 01:23:42.120] I was so sure I was right.

[01:23:42.120 --> 01:23:45.120] It's really unbelievable how you just don't know anything.

[01:23:45.120 --> 01:23:50.120] This is a little bit outdated and certainly self-serving, but so my dissertation was on

[01:23:50.120 --> 01:23:53.120] censorship in China, and I mostly looked at what they covered in their national news,

[01:23:53.120 --> 01:23:58.120] but I compared it to Russia, Venezuela, France, and the U.S. for a couple of big events.

[01:23:58.120 --> 01:24:02.120] And Russia, one of the things, I mean, these are my ideas, this isn't necessarily founded

[01:24:02.120 --> 01:24:05.120] science, but I'm trying my best, but exactly right.

[01:24:05.120 --> 01:24:08.120] It's much more interaction with the outside world.

[01:24:08.120 --> 01:24:12.120] Basically, my whole argument is you can't censor that much if there is information flowing

[01:24:12.120 --> 01:24:13.120] from elsewhere.

[01:24:13.120 --> 01:24:16.120] And so Russia can't get away with that kind of thing, and therefore I'm not surprised

[01:24:16.120 --> 01:24:18.120] to see the lower numbers.

[01:24:18.120 --> 01:24:20.120] Why didn't you say that?

[01:24:20.120 --> 01:24:23.120] Why didn't you say that before?

[01:24:23.120 --> 01:24:25.120] The most important thing is…

[01:24:25.120 --> 01:24:28.120] I wasn't sure if the actual numbers were right, but China being higher than Russia

[01:24:28.120 --> 01:24:30.120] is not surprising.

[01:24:30.120 --> 01:24:31.120] You know what I thought?

[01:24:31.120 --> 01:24:33.120] I really did think number two was wrong, Steve.

[01:24:33.120 --> 01:24:35.120] I really thought that that was wrong.

[01:24:35.120 --> 01:24:36.120] Yeah, I thought that one was a fiction.

[01:24:36.120 --> 01:24:38.120] I really did mostly that.

[01:24:38.120 --> 01:24:42.120] I went with number one because I thought I was helping out Andrea.

[01:24:42.120 --> 01:24:46.120] That's very nice of you, Jay.

[01:24:46.120 --> 01:24:50.120] The important thing to remember here is that my decision to have Andrea go last was the

[01:24:50.120 --> 01:24:51.120] right one.

[01:24:51.120 --> 01:24:53.120] It was absolutely spot on.

[01:24:53.120 --> 01:24:54.120] Yeah, you didn't get it now.

[01:24:54.120 --> 01:24:55.120] I understand now, Steve.

[01:24:55.120 --> 01:24:56.120] Yeah, okay.

[01:24:56.120 --> 01:24:57.120] All right.

Skeptical Quote of the Week (1:24:57)

An educated person is one who has learned that information almost always turns out to be at best incomplete and very often false, misleading, fictitious, mendacious – just dead wrong.
– Russell Baker (1925-2019), American journalist and Pulitzer Prize-winning writer, host of PBS' Masterpiece Theater

[01:24:57.120 --> 01:24:59.120] Evan, take us out with a quote.

[01:24:59.120 --> 01:25:04.120] An educated person is one who has learned that information almost always turns out to

[01:25:04.120 --> 01:25:12.120] be, at best, incomplete, and very often false, misleading, fictitious, mendacious, just dead

[01:25:12.120 --> 01:25:13.120] wrong.

[01:25:13.120 --> 01:25:19.120] Russell Baker, American Pulitzer Prize winning writer, host of PBS's Masterpiece Theatre.

[01:25:19.120 --> 01:25:21.120] Very nice and very appropriate.

[01:25:21.120 --> 01:25:22.120] Very appropriate.

[01:25:22.120 --> 01:25:23.120] To the conference.

[01:25:23.120 --> 01:25:24.120] Absolutely.

Signoff/Announcements

[01:25:24.120 --> 01:25:25.120] Well, this was a ton of fun.

[01:25:25.120 --> 01:25:26.120] Thank you all for joining me.

[01:25:26.120 --> 01:25:27.120] Andrea, welcome back.

[01:25:27.120 --> 01:25:28.120] Thank you.

[01:25:28.120 --> 01:25:29.120] Always great to have you on the show.

[01:25:29.120 --> 01:25:30.120] Andrea.

[01:25:30.120 --> 01:25:31.120] Kelly, welcome to your first SGU.

[01:25:31.120 --> 01:25:32.120] Well done.

[01:25:32.120 --> 01:25:33.120] You did great.

[01:25:33.120 --> 01:25:34.120] Really good job.

[01:25:34.120 --> 01:25:35.120] Thanks for talking about Alex Jones with us.

[01:25:35.120 --> 01:25:37.120] And remember, this is coming out soon.

[01:25:37.120 --> 01:25:38.120] The Skeptics' Guide to the Future.

[01:25:38.120 --> 01:25:39.120] You can pre-order right now.

[01:25:39.120 --> 01:25:40.120] Pre-order right now.

[01:25:40.120 --> 01:25:43.620] Seriously, it does help us a lot if you pre-order.

[01:25:43.620 --> 01:25:45.960] That will help promote the book tremendously.

[01:25:45.960 --> 01:25:46.960] So please check it out.

[01:25:46.960 --> 01:25:48.200] We really would appreciate it.

[01:25:48.200 --> 01:25:52.120] Our first book is also still for sale, The Skeptics Guide to the Universe.

[01:25:52.120 --> 01:25:53.120] Nice pair.

[01:25:53.120 --> 01:25:55.560] If you're new to the show and you didn't realize we have a book out there, we do.

[01:25:55.560 --> 01:25:56.560] Just check it out.

[01:25:56.560 --> 01:25:59.480] It basically goes over all the stuff that we talk about on the show, all the critical

[01:25:59.480 --> 01:26:01.120] thinking skills and such.

[01:26:01.120 --> 01:26:05.120] And again, thanks to George for hosting NECSS for us.

[01:26:05.120 --> 01:26:08.980] And for his kind introduction. Thanks to everybody working behind the scenes.

S: —and until next week, this is your Skeptics' Guide to the Universe.

S: Skeptics' Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking. For more information, visit us at theskepticsguide.org. Send your questions to info@theskepticsguide.org. And, if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community. Our listeners and supporters are what make SGU possible.


GH: (in a Russian accent) I think is basically—I don't trust, I don't trust the media because I am inebriated. And feel free to talk about— (Rogues laugh.)


Today I Learned

  • Fact/Description, possibly with an article reference[9]
  • Fact/Description
  • Fact/Description

Notes

References

Vocabulary

