scishow psych
Statistics Say Screens Aren't Destroying Today's Teens
YouTube: | https://youtube.com/watch?v=SYLySBpGGM8 |
Previous: | Why Do We Still Teach Freud If He Was So Wrong? |
Next: | Groups That Chant Together, Stay Together |
Categories
Statistics
View count: | 201,564 |
Likes: | 11,619 |
Comments: | 952 |
Duration: | 07:31 |
Uploaded: | 2019-05-27 |
Last sync: | 2024-10-23 08:15 |
Looking around, you might think it’s obvious that the abundance of screens and social media are ruining our lives, but what does the research actually tell us?
Hosted by: Hank Green
----------
Support SciShow by becoming a patron on Patreon: https://www.patreon.com/scishow
SciShow has a spinoff podcast! It's called SciShow Tangents. Check it out at https://www.scishowtangents.org
----------
Huge thanks go to the following Patreon supporters for helping us keep SciShow free for everyone forever:
Adam Brainard, Greg, Alex Hackman, Sam Lutfi, D.A. Noe, الخليفي سلطان, Piya Shedden, KatieMarie Magnone, Scott Satovsky Jr, Charles Southerland, Patrick D. Ashmore, charles george, Kevin Bealer, Chris Peters
----------
Looking for SciShow elsewhere on the internet?
Facebook: http://www.facebook.com/scishow
Twitter: http://www.twitter.com/scishow
Tumblr: http://scishow.tumblr.com
Instagram: http://instagram.com/thescishow
----------
Sources:
https://journals.sagepub.com/doi/abs/10.1177/2167702617723376
https://www.nature.com/articles/s41562-018-0506-1
https://www.sciencedirect.com/science/article/pii/S0190740914000693
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5122517/
http://www.stat.columbia.edu/~gelman/research/unpublished/p_hacking.pdf
https://www.vox.com/science-and-health/2019/2/20/18210498/smartphones-tech-social-media-teens-depression-anxiety-research
https://www.npr.org/2017/12/17/571443683/the-call-in-teens-and-depression?t=1555943770080
https://www.theguardian.com/society/2019/apr/24/who-warning-children-screen-time
Images:
https://www.istockphoto.com/photo/blur-teen-phone-chat-for-background-asian-girl-black-short-hair-using-or-play-with-gm980248230-266324704
https://www.istockphoto.com/photo/people-group-having-addicted-fun-together-using-smartphones-detail-of-hands-sharing-gm952414660-260008285
https://www.istockphoto.com/photo/success-is-for-the-dedicated-gm1135957607-302379977
https://www.istockphoto.com/vector/print-gm667006672-121770329
https://www.istockphoto.com/photo/betting-the-spotlight-gm1054454406-281745592
https://www.istockphoto.com/photo/social-media-and-teens-homework-gm931043378-255214144
https://www.istockphoto.com/photo/not-hungry-gm173031499-7155600
https://www.istockphoto.com/vector/newspaper-news-symbol-gm930518382-255110095
[♩INTRO].
So, we're all used to hearing news stories about how screens and social media are ruining all of our lives. And, well, looking at the world around us, there's something to be said for that.
So when you see a headline like “Screen Time Linked to Depression and Suicide in Teens,” it's easy to point fingers and place blame. Smartphones and social media are bad. You should keep your kid away from them as much as possible.
Right? Well, not exactly. It's often hard to accurately capture the results from large psychological studies in quick, eye-catching headlines.
And to understand why, we need to dive into some misconceptions about statistics. For example, one misleadingly named bit of jargon is statistical significance. When scientists say that, they mean that their data has passed a certain level of scrutiny, and that the odds that the pattern they found was due to chance alone are low.
Those odds are usually expressed as a p-value, which is just the odds written as a proportion. So a p-value of 0.5 means 50% probability, or 50-50 odds: it's equally likely that the data represent something meaningful as it is that they were random luck of the draw. The lower the p-value, the less likely it is that the result came down to chance alone.
And for something to be considered “significant”, scientists usually say it has to have a p-value of less than 0.05, or better than 1-in-20 odds. Now, the connections between things like teen phone or social media use and depressive symptoms are significant. For example, in one 2017 study, the connection between teen social media use and depressive symptoms had a p-value of less than 0.001.
And that means it's really unlikely that that result was due to chance. It's much more likely that there is a link between those two variables. That's what statistically significant means.
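To make that idea concrete, here's a minimal sketch of computing a p-value: the chance of seeing a result at least this extreme if only luck is at work. The coin-flip scenario is hypothetical, chosen for illustration; it's not from the video or the studies it discusses.

```python
from math import comb

# Hypothetical example: a fair coin comes up heads 60 times in 100 flips.
# The one-sided p-value is the chance of 60 or more heads by luck alone.
n, k = 100, 60
p = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
print(round(p, 3))  # about 0.028 -- below the usual 0.05 cutoff
```

Since 0.028 is below 0.05, this result would count as statistically significant, even though a perfectly fair coin produced it.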
But there's a catch, because although this test tells you that an effect exists, it tells you nothing about how powerful that effect is. And for that, you have to look at the effect size. In statistics, effect size refers to a measure of the magnitude of a phenomenon: basically, how strong the link between the variables is.
In many psychological studies, those “links” mean correlations: a mathematical connection between two things. And the effect size of a correlation is called the correlation coefficient, which falls somewhere between zero, where there's really no effect at all, and one (in absolute value), where the two variables are perfectly in sync. The correlation between social media and depressive symptoms in the 2017 study had a coefficient of 0.05.
That's really weak. It means we can say with confidence that social media use is correlated with depressive symptoms. But the size of the effect is so tiny that reducing screen time won't really make much of a difference to a teen's mental health.
So, it's statistically significant, but not really, like, significant. The only reason this study was even able to find such a small, significant effect was that it had a huge sample size of more than five hundred thousand teens. You see, the weaker the effect you're trying to find is, the more people you need to study to see it.
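The sample-size point can be sketched with a quick simulation. This uses synthetic data, not the study's: the true correlation is deliberately set to roughly 0.05, and with a big enough sample even that tiny effect comes out highly significant.

```python
import math
import random

random.seed(42)

# Synthetic data with a deliberately tiny true correlation (~0.05).
n = 100_000
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.05 * xi + math.sqrt(1 - 0.05**2) * random.gauss(0, 1) for xi in x]

# Pearson correlation coefficient: the effect size.
mx, my = sum(x) / n, sum(y) / n
sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
sxx = sum((a - mx) ** 2 for a in x)
syy = sum((b - my) ** 2 for b in y)
r = sxy / math.sqrt(sxx * syy)

# t statistic for the correlation: values far beyond ~1.96 mean p < 0.05.
t = r * math.sqrt((n - 2) / (1 - r * r))
print(round(r, 2), round(t, 1))  # r stays tiny, yet t is huge: "significant"
```

Halve the sample and the coefficient barely changes, but the t statistic shrinks; the effect size is a property of the phenomenon, while significance is partly a property of how many people you surveyed.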
And while it might seem like more data is always better, massive studies like this can kind of be a victim of their own success, as they can identify significant but really tiny effects that don't really mean much on a practical level. Not to mention that it's really important to consider why this correlation exists. Like, for example, if you've been told social media is harmful, you might automatically assume the screen time is causing the teens' mental health to tank.
But that's not something the data the researchers collected can say. It could very easily be the other way around: the worse a teen feels, the more they turn to their phones. In fact, that's exactly what a 2019 study of over 12,000 British students found: lower life satisfaction led to increased social media use, though the researchers called the size of the effect “trivial”.
And, I know we say this a lot, but it's worth repeating: just because two things are correlated doesn't mean that one causes the other. It's often the case that both are influenced by a third, perhaps unknown factor. Teens might happen to spend less time on their phones if they do sports, for example, because you can't exactly scroll through Facebook while you're kicking a soccer ball.
And exercise positively impacts mental health in a way that's unrelated to phones or social media. It's also worth pointing out that these statistically significant effects may not be real. A p-value of 0.001 sounds impressively small, until you consider that this finding is one among many, many others in a very big survey.
The survey asked about all sorts of subjects, from exercise habits, to TV viewing, to religious service attendance. And when you're looking for lots of effects all at once, it's much easier to happen across a false positive. Remember, a p-value is just a measure of odds: a p-value of 0.001 means the odds are 1 in 1000 that the correlation is by chance.
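That risk is easy to work out directly. Here's a back-of-the-envelope sketch, using only the 1-in-1000 figure and assuming the tests are independent:

```python
# Chance of at least one false positive across many independent tests,
# each with a 1-in-1000 chance of a fluke (alpha = 0.001).
alpha = 0.001
chance = {n: 1 - (1 - alpha) ** n for n in (1, 100, 1000)}
print(chance[1000])  # about 0.63: more likely than not to hit a fluke
```

So a single test at p = 0.001 is very unlikely to be a fluke, but run a thousand of them and at least one fluke becomes the expected outcome.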
And yeah, that's small. But it means that if you run 1000 or more tests, you're downright likely to get a false positive. To point out the flaws with the kind of analyses run in the 2017 study, a 2019 study published in Nature Human Behaviour analyzed a similarly large dataset of over 350,000 adolescents.
And they found that things like wearing glasses and eating potatoes also had significant yet small negative effects on the teens' well-being. More to the point, they found that small decisions about how to analyze the data, like where to set cut-offs between different levels of use, could change the results from a significant negative effect to a significant positive one! With massive studies where hundreds of thousands of people are asked lots of things, there can be trillions of ways to run correlations.
And that makes those p-values seem a whole lot less impressive. It's also worth noting that studies haven't universally condemned screen time or social media use. For example, a 2017 systematic review examined 43 studies on the topic between 2003 and 2013.
And surprisingly, most of them actually found mixed or no effects of social media on adolescent well-being. Some researchers even criticized the guidelines put out by the World Health Organization in 2019, which suggest limiting screen time to an hour or less before age five, because they say the evidence so far doesn't support imposing strict limits at any age. The truth is, all of these studies can only tell you specific answers to specific questions.
Questions like, “how likely is it that people who self-report playing video games for X hours a week also give depression-related answers on a survey?” That gets generalized to “screen time equals depression” even though that's hugely oversimplifying it! So, should teen screen time be limited? Maybe!
I don't know! No one knows! When it comes to murky subjects like this, science can give us a lot of information.
But sometimes, that information isn't all that useful. So the next time you see a story that says screens are destroying kids or everything you eat is going to give you cancer, take a minute to read beyond the headline and see how much of an effect they're really talking about. A little statistical knowledge can go a long way towards making better and more informed decisions for yourself and your kids. [♩OUTRO].