YouTube: https://youtube.com/watch?v=FDsQB5Ug4SQ
Previous: How Do Curveballs Change Direction in Midair?
Next: Does Anti-Aging Cream Work?

Categories

Statistics

View count:890
Likes:111
Dislikes:4
Comments:44
Duration:35:35
Uploaded:2017-05-24
Last sync:2017-05-24 17:10
Henry Reich of MinutePhysics and MinuteEarth talks about statistical paradoxes and quantum entanglement. Afterwards, Jessi from Animal Wonders joins the show with two Red-Footed Tortoises!

Henry Reich's Channels:
MinutePhysics: http://www.YouTube.com/MinutePhysics
MinuteEarth: http://www.YouTube.com/MinuteEarth

Animal Wonders: http://www.YouTube.com/AnimalWondersMontana
----------
Support SciShow by becoming a patron on Patreon: https://www.patreon.com/scishow
----------
Dooblydoo thanks go to the following Patreon supporters—we couldn't make SciShow without them! Shout out to Kevin Bealer, Mark Terrio-Cameron, KatieMarie Magnone, Patrick Merrithew, Charles Southerland, Fatima Iqbal, Sultan Alkhulaifi, Tim Curwick, Scott Satovsky Jr, Philippe von Bergen, Bella Nash, Bryce Daifuku, Chris Peters, Patrick D. Ashmore, Piya Shedden, Charles George
----------
Looking for SciShow elsewhere on the internet?
Facebook: http://www.facebook.com/scishow
Twitter: http://www.twitter.com/scishow
Tumblr: http://scishow.tumblr.com
Instagram: http://instagram.com/thescishow

 (00:00) to (02:00)


(intro)

HG: Hello and welcome to SciShow Talk Show, that day where we talk to interesting people about interesting stuff.  Today, we've got YouTube maker of MinutePhysics and also kind of sometimes physicist, Henry Reich, and I see that you have brought today a bunch of decks of cards--

HR: Yes.

HG: --which I take as that you're going to demonstrate something to me and I may be made uncomfortable by it.

HR: The goal here is not to make you uncomfortable unless it is by what I'm demonstrating with the cards, not by your abilities to do things with the cards.

HG: Like the last time you were on when you made me feel like an idiot, which is fine.

HR: That was not my intention.

HG: I know it wasn't. 

HR: So I wanna talk about some weird features of statistics.

HG: Okay.

HR: And eventually, we might get into how they actually apply to physics itself, but in general--it's a paradox in statistics that is very important to know about that most people probably don't.  

HG: Okay.

HR: There are lots of paradoxes in statistics that people don't know about that are important to know about.  This is one.  So I'm going to demonstrate it with cards.  Got two decks of cards here, one deck has birds on it, like the golden eagle.

HG: Good birds.  I like these birds.

HR: And the other one is the VidCon deck of cards.

HG: VidCon deck of cards.

HR: So what I'm gonna do--these are just the rest of the cards, we don't need those--is I'm going to put out some piles of cards for you, Hank, and this first demonstration is basically I'm going to make two piles of cards, and you're gonna tell me: if you had to choose a card randomly from one of the piles, which pile would you choose it from?  So here we have--and the goal is to get a red card.  So here we have a single black card and here we have one red card and three blacks.  Which pile would you choose from to get the most--the greatest chance of getting a red card?

 (02:00) to (04:00)


HG: I mean, I think I would go for the one that had a red card in it, Henry.

HR: Yes.  That is very good and that is very correct.  So now we're gonna put those aside here for a second.  So basically, you said the bottom pile, the VidCon cards, are the ones you'd pick.

HG: Okay.  

HR: And here again are another set of piles.  Here's one that has one red card and here's one that has three red cards and a black card.  Again, your goal is to have your greatest chance of getting a red card.  Which one do you pick?

HG: I'd pick the bottom one.

HR: Right, the bottom one, the VidCon one, 'cause like, it's a 100% chance of getting a red card.  So in both cases, you had a better chance of getting a red card if you chose the  VidCon cards, right?  Now, what I'm going to do is I'm going to put the piles together and cover them up.  Which pile should you choose if you want the biggest chance of getting a red card?  

HG: Make me forget what was in there.  

HR: The VidCon one was the best both times.

HG: It was the best both times, which makes me immediately think that it is definitely not the VidCon cards.

HR: That is correct, that is correct.  There are only two red cards out of five in the VidCon pile, and there were three red cards out of five in the bird pile and so what this illustrates is something called Simpson's paradox, which is a paradox in statistics where, when you look at things in aggregate as opposed to separately, you get completely opposite results.

HG: Or you might get completely opposite results.

HR: You might, yes.

HG: You might also just not get the same results every time.  

HR: Right, the point isn't that this always happens, the point is that it's possible for this to happen.

HG: There are situations where this happens.

HR: Often you would get the same results, right?  But in this case, it has been carefully contrived so that you don't get the same results, and the reason this is relevant is 'cause, you know, this could just be something about gambling and birds and VidCon, but you might be set up in a situation where this first scenario is maybe you're trying to treat people for some disease, and this is one treatment and this is a different treatment, and in each case you say, okay, so the VidCon treatment successfully treats more people, but it may be the case that these people on the side have been selected because the populations are different, right?  

 (04:00) to (06:00)


That they have been selected for that treatment for some reason, like maybe they are getting this treatment when they already are farther progressed in the disease, so they're less likely to recover or something, and in this case, they're not as far progressed, or something like that.  So this treatment on the bottom sucks, because it only has two reds out of--

HG: Two out of five.

HR: And this treatment is better 'cause it's three out of five, but if you looked only at the treatments independently, you might say, okay, so we should do the VidCon treatment even though maybe you should be doing the bird treatment or vice versa depending on how these things are broken up.  So I wanna give another example using a slightly different set of cards to show how this, again, how this paradox, apparent paradox, kind of comes into play.  
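
(A quick tally of the card demonstration above, for anyone following along. The pile counts below are the ones from the two rounds, arranged to match the totals Henry states--VidCon ends up with two reds out of five, the birds with three out of five; the Python itself is only an illustrative sketch.)

# Simpson's paradox, tallied from the two card demonstrations.
# Each pile is written as (red cards, total cards).

bird_piles   = [(0, 1), (3, 4)]   # one black card; then three reds and a black
vidcon_piles = [(1, 4), (1, 1)]   # one red and three blacks; then a single red

def rate(red, total):
    return red / total

# Within each demonstration, the VidCon pile gives the better chance of red.
for (br, bt), (vr, vt) in zip(bird_piles, vidcon_piles):
    print(rate(vr, vt) > rate(br, bt))   # True, True

# Combine the piles and the ordering flips: 2/5 for VidCon vs 3/5 for the birds.
bird_total   = (sum(r for r, _ in bird_piles),   sum(t for _, t in bird_piles))
vidcon_total = (sum(r for r, _ in vidcon_piles), sum(t for _, t in vidcon_piles))
print(rate(*vidcon_total) > rate(*bird_total))   # False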

So this one, we're going to have a couple groups here, again, we're gonna talk about just choosing a card at random, and so in this case, I want you to think about this one.  We're gonna actually put this into an imaginary scenario.

HG: Okay.

HR: Where this is college admissions.  So we're talking about two different departments of the college and we're talking about boys and girls.  So we've got kings and queens, men and women, applying to college.  Do they get in or not to the department?  So maybe it's the ornithology department and, you know, film studies or whatever, right?  So if you look at this one, you could say that I have it arranged such that the departments are fair on a gender basis, right, you can see that half of the boys that apply to ornithology get in and half the girls get in, and in this case--

HG: So is like, red getting in and black is not getting in?

HR: Yeah, red is getting in and black's not getting in.  In this case here, film studies happens to be a harder school to get into, so only two out of six boys get in and here one out of three girls get in, so it's the same proportion.  So you say, okay, this department is fair in its admissions from a gender perspective, this department is fair, but if you put them together and look at the entire university as a whole, here we're saying, okay, three out of seven women who applied to the university got in, and here, three out of eight men.

 (06:00) to (08:00)


So you can say, okay, it looks like the university is discriminating against men, because a smaller proportion of the men who apply to the university get in than of the women, but on the department level, each of the departments is being perfectly fair, right, and this has actually been an issue at universities, and it depends on how you're looking at the data.

HG: Right.
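
(For anyone tallying along, the admissions example works out like this. The film studies counts and the university-wide totals are the ones given above; the ornithology counts are the ones consistent with those totals, and the Python is just an illustrative sketch.)

# The admissions example: each department admits men and women at the
# same rate, but the university-wide rates differ.
# Counts are (admitted, applied).

admissions = {
    "ornithology":  {"men": (1, 2), "women": (2, 4)},   # 50% and 50%
    "film studies": {"men": (2, 6), "women": (1, 3)},   # ~33% and ~33%
}

for dept, groups in admissions.items():
    for sex, (admitted, applied) in groups.items():
        print(f"{dept:12}  {sex:5}  {admitted}/{applied} = {admitted / applied:.0%}")

# Aggregated over the whole university: 3/8 of the men vs 3/7 of the women.
for sex in ("men", "women"):
    admitted = sum(g[sex][0] for g in admissions.values())
    applied = sum(g[sex][1] for g in admissions.values())
    print(f"university    {sex:5}  {admitted}/{applied} = {admitted / applied:.0%}")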

HR: I think the--it was the opposite case where it looked like women were being discriminated against in one case.  It's somewhat different from this, but it was a similar situation, where, depending on whether you look at the department level or the university level--

HG: Right.  Yeah, and like, this is--like, when we're trying to say things about the world, and it feels like this is especially the case when you're sort of on the cusp, when it's pretty close, like, well, we're getting 51% or 49%.

HR: Right.

HG: And you can use some tools to like, say, well, what if we looked at it this way and then it's the opposite of what you would think.

HR: Right.

HG: But it's that we shouldn't necessarily be trying to tease so much meaning out of so little data.

HR: Potentially, although some of these things even have huge amounts of data and these things don't show up, right, the effects don't show up, and one of the things that I think is interesting about this is that you can actually infer causal relationships from this kind of statistics.  People often think that statistics can only tell you correlations, but there are ways of using these kinds of relationships to infer causal, like, causal structure, and you may say, okay, so there was a cause, there was this departmental difference, and more women apply or more men--let me go back to it--more men applied to the hard-to-get-into department than the women.  The reason we saw this imbalance in admissions is that men were applying, in this case, more to the department that was hard to get into and women applied more to the department that was easy.  I mean, that was what was reversed in the actual case, where the women were applying for the hard departments and the men were applying for the easy ones.

HG: Right.  Okay.  

HR: And so you can basically infer there's like, there is a causal, like, common cause to these things, which is that there's--that women and men apply to departments in different proportions.

 (08:00) to (10:00)


HG: Right.

HR: And then once you understand that, all of the other patterns are explained, even if you don't necessarily have all the details, that's something that you can get at with what's called conditional probability and Bayes' Law or Bayes' Rule, which is a way of looking at, well, what's the probability that somebody got in, given that they were a woman or given that they were in the ornithology department or given that they were a woman in the ornithology department.  Those, there are different relationships between all those things, and that's what brings us to physics, because there are certain quantum mechanical phenomena that happen, like quantum teleportation or entanglement, which seem really counterintuitive.  So entanglement is when you have, you know, say something radioactively decays and you get two photons going out and they have correlated spins and so the--or polarizations, which is something you can measure, where no matter how far away they are, if you measure this one in a certain way, the other one is going to be opposite.  The correlations are stronger than you could ever have without some sort of hidden underlying communication between them.  It's like somehow they always know to do the right thing even though we don't know how.
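
(The conditional probabilities Henry is describing can be spelled out on the admissions example from earlier. The applicant list below just reuses those counts; the code is a sketch of the idea, not anything from the show.)

# Conditional probabilities on the admissions example: the chance of
# getting in depends on what you condition on, and Bayes' rule relates
# the "inverted" probabilities to each other.

# Every applicant as (sex, department, admitted), reusing the counts above.
applicants = (
    [("woman", "ornithology",  True)]  * 2 + [("woman", "ornithology",  False)] * 2 +
    [("woman", "film studies", True)]  * 1 + [("woman", "film studies", False)] * 2 +
    [("man",   "ornithology",  True)]  * 1 + [("man",   "ornithology",  False)] * 1 +
    [("man",   "film studies", True)]  * 2 + [("man",   "film studies", False)] * 4
)

def prob(event, given=lambda a: True):
    pool = [a for a in applicants if given(a)]
    return sum(event(a) for a in pool) / len(pool)

print(prob(lambda a: a[2], lambda a: a[0] == "woman"))                            # P(in | woman) = 3/7
print(prob(lambda a: a[2], lambda a: a[0] == "woman" and a[1] == "ornithology"))  # P(in | woman, ornithology) = 1/2

# Bayes' rule: P(woman | in) = P(in | woman) * P(woman) / P(in)
p_in_given_woman = prob(lambda a: a[2], lambda a: a[0] == "woman")
p_woman, p_in = prob(lambda a: a[0] == "woman"), prob(lambda a: a[2])
print(p_in_given_woman * p_woman / p_in)                # 0.5 ...
print(prob(lambda a: a[0] == "woman", lambda a: a[2]))  # ... same as P(woman | in) = 0.5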

HG: Yeah.

HR: And the question is like, well, what's the causal structure of this?  Like, is there a root cause when they're emitted that like, propagates with them or does somehow, when you measure this one, does like that cause of then knowing what this one is propagate back in time--

HG: --affect it somehow.

HR: --and then forward in time on the other branch or vice versa or is there some third party thing that was even farther back in time that caused them both?  And the thing that I think is fascinating about this is that the folks who have studied this using the same kind of causal models that I'm talking about here, what they discovered was that none of the possible classical causal models could explain the quantum entanglement causality, so basically, the normal probability-and-statistics understanding of causality did not apply.  They basically had to develop new math, which was basically a quantum version of this causality, and then the quantum version of it applies to quantum entanglement.

HG: Of this?

HR: Of this stuff, yeah.  Like, a quantum version of like, what's the probability that you got into ornithology given that you are a woman, there's a quantum version of that, which is a little bit more complicated.  

 (10:00) to (12:00)


HG: Yeah, I might imagine.

HR: It's already sufficiently counterintuitive, right, which is why these are called statistical paradoxes, but there's like a quantum mechanical version of that which is a little bit more complicated.  We don't know if that's true, if there is like, quantum causality, but what we do know is that the causal structures that we can describe with normal statistical causality do not describe certain quantum systems.  

HG: Statistics.  I think that we all need to know more about them.  I--I mean, like, I'm a little bit, I'm like stuck back here, where, so often you hear a statistic that gets thrown out and it's defending some kind of perspective--

HR: Right.

HG: And knowing that something that no one knows about, like, these kinds of paradoxes that people just aren't aware of--

HR: Right.

HG: Might be responsible for it.  That's like, we can make, like big decisions of like, policy--

HR: Oh, huge policy decisions based on statistics and based on--

HG: --decisions based on that stuff.

HR: You know, there are so many other ones as well.  There's various, you know, various fallacies where you invert things and you think, oh, the probability of, you know, getting into the, you know, anthropology department given that you're a woman is the same as the probability that you're a woman given that you got into the department, right, and those things sound the same but they're not always the same.  
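
(To put numbers on that with the admissions example from earlier: the probability that an applicant got in given that she was a woman is 3/7, about 43%, while the probability that an admitted applicant was a woman is 3/6, exactly 50%--the two statements sound alike, but they're computed over different pools, so they generally aren't the same number.)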

HG: Yeah.

HR: I mean, we can do an example right here with these cards in front of us, and it probably would pan out that way. 

HG: Yeah.

HR: But--

HG: You're also telling me that quantum entanglement is a statistical confusion.  Maybe we thought that this thing was happening but there was a statistical explanation for it all and it's just weird.  Does the statistical explanation--or explanation might not be the right word--statistical understanding of quantum entanglement, does it kind of explain what's going on?  'Cause it's always seemed to me like this ridiculous impossible thing.

HR: The thing that I found, it doesn't feel particularly satisfying.  

HG: Okay.

HR: The--it models it very well, it models the causality of it very well in a way that doesn't feel satisfying because it basically says something that feels like, well, both things work because it's quantum mechanics.

 (12:00) to (14:00)


That's kind of how it feels.  In the same way that the--

HG: I kind of feel that way about quantum mechanics all the time.

HR: Right.

HG: It works because it's quantum mechanics.

HR: Right.  But I was gonna say, so there's this thing called the many-worlds interpretation of quantum mechanics which says like, that when you have these paradoxes about like, which event happens, you know, like in Schrödinger's cat, does the cat die or does it--

HG: Does it--yeah.

HR: --survive when the radioactive decay happens and--

HG: Yeah, is there a live cat or a dead cat in the box?

HR: Right, and there's, because we currently don't have a mathematical description of what actually happens, why we see only one thing happen.

HG: Right.

HR: And this is not just like, crazy cat experiments, it's like, anytime you look at a radioactive decay and you see a decay or you don't, the question is why did we happen to see decay then when we could have not seen it or vice versa.

HG: Right, when basically--

HR: And so the many-worlds interpretation is that all the possible histories that could happen actually do happen, and we just happen to follow one of those histories, kind of like a choose-your-own-adventure book where you only--you read one story and then you're done, even though there are all the other ones that could happen.

HG: So it's sort of in the same way you're looking at this photon that has been affected by or is connected to this (?~13:20) photon that could not possibly be communicating with it, and you're saying, oh, well, we just happened to--it's just another one of those moments where, like, when we look, we are defining it.  We are creating the state of that photon by looking at it.

HR: I'm not sure it's exactly that.  

HG: Okay.  

HR: But it's another one of those moments where it feels kind of--it feels like this explanation isn't as much an explanation as like, you're just saying that this is the way the world is.

HG: Right, we're not explaining it, we're just saying, like--

HR: Here's a way it could happen that we could never test otherwise.  

HG: Oh my God.  

HR: But yeah, the statistical stuff, I think is super, I think more people need to know more about statistical fallacies.

 (14:00) to (16:00)


 (16:00) to (18:00)


 (18:00) to (20:00)


 (20:00) to (22:00)


 (22:00) to (24:00)


 (24:00) to (26:00)


 (26:00) to (28:00)


 (28:00) to (30:00)


 (30:00) to (32:00)


 (32:00) to (34:00)


 (34:00) to (35:35)
