how to vote
How to Read a Poll | MediaWise Voter Guide #7
YouTube: | https://youtube.com/watch?v=o_9JQe-78B0 |
Previous: | How to REALLY watch a political ad | MediaWise Voter Guide #6 |
Next: | Demystifying The Voting Process | MediaWise Voter Guide #8 |
Categories
Statistics
View count: | 2,782 |
Likes: | 68 |
Comments: | 2 |
Duration: | 12:37 |
Uploaded: | 2020-08-10 |
Last sync: | 2024-10-18 11:45 |
Polls seem to be everywhere, particularly as election day gets closer. But before you read all the hot takes about what that polling data means, it might help to understand how public opinion polling works and how to make sense of the polling headlines you see in your feed.
***
Pew Research Center "A Field Guide to Polling" https://www.pewresearch.org/methods/2019/11/19/a-field-guide-to-polling-election-2020-edition/
Pew Research Center FAQ https://www.pewresearch.org/methods/u-s-survey-research/frequently-asked-questions/
New York Times "No One Picks Up the Phone, but Which Online Polls Are the Answer?" https://www.nytimes.com/2019/07/02/upshot/online-polls-analyzing-reliability.html
Vox "The difference between good and bad state polls, explained" https://www.vox.com/policy-and-politics/2019/11/14/20961794/education-weight-state-polls-trump-2020-election
Gallup "What Is Public Opinion Polling and Why Is It Important?" http://media.gallup.com/muslimwestfacts/pdf/pollingandhowtouseitr1dreveng.pdf
FiveThirtyEight "13 Tips For Reading General Election Polls Like A Pro" https://fivethirtyeight.com/features/13-tips-for-reading-general-election-polls-like-a-pro/
FiveThirtyEight "The Media Has A Probability Problem" https://fivethirtyeight.com/features/the-media-has-a-probability-problem/
***
Follow us!
Twitter: https://twitter.com/how_to_vote
Instagram: https://www.instagram.com/how_to_vote/
Facebook: https://www.facebook.com/How-to-Vote-in-Every-State-100579251723905
***
MediaWise is a nonprofit, nonpartisan project of the Poynter Institute and supported by Facebook. Complexly is an MVP partner.
https://www.poynter.org/mediawise-voter-project-mvp/
http://www.complexly.com/
***
Pew Research Center "A Field Guide to Polling" https://www.pewresearch.org/methods/2019/11/19/a-field-guide-to-polling-election-2020-edition/
Pew Research Center FAQ https://www.pewresearch.org/methods/u-s-survey-research/frequently-asked-questions/
New York Times "No One Picks Up the Phone, but Which Online Polls Are the Answer?" https://www.nytimes.com/2019/07/02/upshot/online-polls-analyzing-reliability.html
Vox "The difference between good and bad state polls, explained" https://www.vox.com/policy-and-politics/2019/11/14/20961794/education-weight-state-polls-trump-2020-election
Gallup "What Is Public Opinion Polling and Why Is It Important?" http://media.gallup.com/muslimwestfacts/pdf/pollingandhowtouseitr1dreveng.pdf
FiveThirtyEight "13 Tips For Reading General Election Polls Like A Pro" https://fivethirtyeight.com/features/13-tips-for-reading-general-election-polls-like-a-pro/
FiveThirtyEight "The Media Has A Probability Problem" https://fivethirtyeight.com/features/the-media-has-a-probability-problem/
***
Follow us!
Twitter: https://twitter.com/how_to_vote
Instagram: https://www.instagram.com/how_to_vote/
Facebook: https://www.facebook.com/How-to-Vote-in-Every-State-100579251723905
***
MediaWise is a nonprofit, nonpartisan project of the Poynter Institute and supported by Facebook. Complexly is an MVP partner.
https://www.poynter.org/mediawise-voter-project-mvp/
http://www.complexly.com/
Okay, so are you ever doing your morning scroll and you see a headline like “such and such candidate surges in Iowa” when nobody’s even voted yet?
Or an article about how “46% of Americans approve of the president,” but you wonder how they got that number because nobody’s ever asked you? Look, it’s normal to be confused about polls, especially if you’re just trying to understand them in 280 characters or less.
But because polling is a staple of the constant stream of political coverage you see in your news feeds every day, understanding how to read polls will make you a more informed voter. So if you poll 1,000 adults for a 5% margin of error... carry the 7... Wait, wait, wait. Hold up. We’re not gonna be memorizing any complex statistical models. I’m just going to arm you with a set of questions to ask any time you see a poll, to get to the bottom of what the poll is really saying.
I’m Evelyn from the internets, this is the MediaWise Voter Guide, and today we’re talking about polls. [Theme Music]. So let’s start from square one and talk about what a poll is, because it’s more than just clicking a button on Twitter to tell your friends how many days into quarantine you ran out of snacks. The idea behind polls is that if we ask a few people out of a group what they think about an issue, we can get a pretty good idea of what everybody in that group thinks.
But if we’re trying to figure out what the entire country thinks, we can’t just ask all the first-generation Black girls from Texas (although that sounds like a fun time, add me to that group chat), because my views, based on my experiences, might be different from your views, based on your experiences. Instead, we need a random sample, which means everyone in the country has an equal chance of being selected, to get the best sense of what you, and I, and everyone else think. And for most pollsters, the best way to get that random sample is to have a computer dial phone numbers at random until enough people answer and agree to participate in an interview.
Usually that’s about 500 to 1,000 people, but because not everyone answers their phones, it can take tens of thousands of calls to get the responses they need. Certain populations, like young people and people of color, are especially hard to reach, so pollsters may need to weight the responses, which means comparing the demographics of the respondents to the demographics of the country using census data. If the pollster didn’t get enough responses from young people, or Latinx people, or women, or people without college degrees, they might weight the responses of those populations more heavily, counting those responses more than once, so that the sample isn’t just random, it’s also representative.
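To make the weighting idea concrete, here’s a minimal back-of-the-envelope sketch in Python. Every number in it is made up for illustration; real pollsters weight across many demographics at once, using actual census figures.

```python
# A minimal sketch of demographic weighting, with made-up numbers.
# Real pollsters weight on many demographics at once, not just age.

# Share of each age group in the whole population (e.g., from census data)
population_share = {"18-29": 0.21, "30-49": 0.33, "50-64": 0.25, "65+": 0.21}

# Share of each age group among the people who actually answered the poll
sample_share = {"18-29": 0.10, "30-49": 0.30, "50-64": 0.32, "65+": 0.28}

# Underrepresented groups get a weight above 1 (their answers count extra);
# overrepresented groups get a weight below 1.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

for group, w in weights.items():
    print(f"{group}: each response counts {w:.2f}x")
# 18-29 responses count about 2.10x; 65+ responses count about 0.75x
```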
To paraphrase how the Pew Research Center describes it, polling is sort of like making dinner: you only need to taste a little spoonful to know how the whole dish tastes. But you’ve gotta make sure you have a little of everything in that bite to understand how all of the flavors are coming together. So now that we know how polls are taken, let’s talk about how to read them.
Say you read a poll that says more Americans would prefer to vote for Beyonce for president than for Donald Trump or Joe Biden. While I’m pretty sure the Constitution says you can’t be the president and the queen at the same time, we wanna figure out what the poll is really telling you. If you’ve seen every episode in this series, you might know by now that I’ve got a lot of questions.
So many questions. But in this case I’ve got some questions to help you think critically about polls. You’ll want to ask yourself who made this poll, what their methodology was, and what other polls are saying.
So who is behind the poll? A good poll will tell you who conducted it. Was it from a university or reputable polling company you’ve seen cited in news articles?
Great. Was it a social media poll that your favorite celebrity gossip website posted? Maybe not so great. Likewise, internal polls (those conducted by campaigns or interest groups to gauge opinions about their candidate or issue) have a pretty big stake in showing that the public is on their side, so take them with a whole shaker of salt.
If you’re not sure if a pollster is legit, check to see if they’re mentioned in a wide range of sources from all across the political spectrum, not just the ones that show up in your social media echo chambers. You can also check the pollster’s website to see if they publish information about who works there and what methods they use to conduct their polls. Which brings us to our next question: what’s the methodology behind a poll’s claims?
Methodology is the who, what, when, where, and how of taking a poll. So we already covered who’s asking the questions, but we also want to know who they talked to. Talking only to currently registered voters might give different results than talking to the general public, especially if the people who register last minute, or who face barriers to registering at all, have other demographics in common.
In our Beyonce poll, we could get different responses if we interview every adult versus if we only interview Tidal subscribers. You’ll also want to know how many people the pollsters interviewed, otherwise known as the sample size. Generally, when you’re trying to measure a really big population, like the entire United States, you’ll get the best results from talking to a larger sample.
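To see why sample size matters, here’s a rough sketch using the textbook margin-of-error formula for a simple random sample. It’s a simplification (weighting and survey design change the math), but it shows the pattern:

```python
import math

# Worst-case 95% margin of error for a simple random sample:
# MOE = z * sqrt(p * (1 - p) / n), which is largest when p = 0.5.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (300, 1000):
    print(f"n = {n:>4}: ±{margin_of_error(n):.1%}")
# n =  300: ±5.7%
# n = 1000: ±3.1%
```

Notice the diminishing returns: more than tripling the sample only cuts the margin of error roughly in half.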
So a poll that interviews 1,000 people will generally be more accurate than one that only talks to 300. We also want to know what questions pollsters are asking. The wording and order of polling questions can influence how people respond.
So if I ask you a question like, “Are you in favor of the Affordable Care Act?” or “Are you in favor of limiting immigration?” or “Are you in favor of impeccable choreography?” before I ask if you’d vote for Trump, Biden, or Beyonce, your answer may be swayed by whatever issue I put at the front of your mind. For that reason, many pollsters rotate the order of the questions they ask to limit how much question order affects responses. But question wording also matters.
Because we know that language like “Medicare for All” hits differently than “socialized medicine.” Campaigns and partisan media know that, and exploit it. Pollsters shouldn’t. OK, let’s move on to when the interviews happened.
A poll is kind of a snapshot of people’s opinions at the time it was taken, but opinions can shift and change as time goes on. If the interviews happened the day after a debate or a party convention, one candidate might see a temporary bump in enthusiasm that levels out over time. Could Bey’s approval ratings be spiking because she just dropped another surprise Netflix special?
Please. We need this right now. Lastly, you should check how the poll was conducted.
So, look for things like how the pollsters reached people. Did they only call landlines? That may tell you something about who got counted and who was left out.
Like do you know anyone your age who has their own landline? That’s what I thought. So a poll that includes landlines and cellphones might be better at representing more of the population.
Online polls also have a leg up in reaching more people, but unlike dialing a randomly generated phone number, it’s hard to conduct an online poll that gives every single person in the country an equal chance of being selected, so you won’t get a truly random sample. There are pros and cons to each method, but you should know which pros and cons are baked into a specific poll when deciding how to interpret that information. One last piece of information you can check is how a poll was weighted.
Like I said, some populations are harder to reach, so pollsters weight their responses more heavily so that the sample matches the population. A good poll will weight its responses for age, gender, race, and even the level of education someone completed. That last one is a major mistake many pollsters made in 2016: leaving education out of their models.
Because, pollsters later found, college graduation is correlated with two important things: a tendency to vote Democratic, and a tendency to pick up the phone and answer polls. Many state polls in 2016 ended up oversampling people who had completed higher education, and as a result may have counted a higher percentage of Democrats than exists in the US population. If pollsters don’t weight for all of these demographic influences, that can increase their margin of error.
That’s a term you may have heard tossed around in the news a lot, so let’s pause and talk about what it means. Once pollsters have conducted lots of interviews and weighted the responses to match the population, they may publish their results with a 95% degree of confidence, leaving 5% of room for the results to be off. That’s because they couldn’t actually talk to every adult in the country, and because some of the adults they did talk to may have been led to one response by the question design, or may even have lied about what they truly believe because their real belief is seen as less socially desirable. Like, admitting that they’re not registered to vote, for example.
That wiggle room is the margin of error. So if a poll of 1,000 people concludes, with a 5% margin of error, that 65% of Americans love Beyonce, we can assume that if we actually had the chance to ask everybody, the real result would be anywhere from 60 to 70% of people saying they’re part of the Beyhive. That margin of error can increase, though, if the pollsters talk to fewer people, if their sample isn’t representative of the whole population, or even if you start pulling apart the poll to look at a specific demographic.
So you know when the media takes a poll that says 65% of Americans wish they could be the fan that blows back the immaculate sew-in on Beyonce’s head, and says that among women respondents that number was actually 75%? Stop! No!
We’re not doing that! Because when you break down a poll to only look at a subset of the responses, you’re decreasing the sample size and increasing the margin of error, which makes those results less reliable than looking at the respondents as a whole. And that’s a mistake that we see a lot in reporting about polls.
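Here’s that subgroup effect in rough numbers, reusing the same textbook margin-of-error formula from before. The 1,000-person poll and the 50/50 gender split are made-up round figures:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    # Textbook 95% margin of error for a simple random sample of size n
    return z * math.sqrt(p * (1 - p) / n)

full_sample = 1000
women_only = 500  # pretend roughly half of the respondents are women

print(f"Full sample (n={full_sample}): ±{margin_of_error(full_sample):.1%}")  # ±3.1%
print(f"Women only (n={women_only}): ±{margin_of_error(women_only):.1%}")     # ±4.4%
# The subgroup estimate comes with noticeably more wiggle room.
```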
Another one? Not explaining probabilities! That was a major problem in 2016.
You know, when forecasts based on polls showed that Hillary Clinton had a 60-70% chance of winning the election? That doesn’t mean that 70% of voters were voting for her. It means that if we ran a simulation of the election 100 times, she would have won in about 70 of them, but lost in the other 30.
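Here’s a toy version of that kind of simulation. The numbers are invented and real forecast models are far more elaborate, but it shows how a candidate can have roughly a 70% chance of winning without ever polling anywhere near 70% of the vote:

```python
import random

# Toy forecast: the candidate polls at 51%, and we pretend the "true"
# vote share is normally distributed around that average with a
# standard deviation of 2 points. Both numbers are invented.
POLL_AVG = 0.51
POLL_SD = 0.02
TRIALS = 100_000

wins = sum(random.gauss(POLL_AVG, POLL_SD) > 0.50 for _ in range(TRIALS))
print(f"Simulated win probability: {wins / TRIALS:.0%}")  # about 69%
# The roughly 30% losing branch is a real outcome, not a polling error.
```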
Her losing doesn’t mean the pollsters were wrong. It just means that one of the less likely outcomes was the actual outcome. Polls aren’t meant to predict the results of an election; they’re just meant to capture the sentiment of the electorate at a particular point in time.
That’s why you’ve got to get all the way to our third question: what are other polls saying? A poll is just a snapshot of opinion, right? So you take one snapshot, and combine it with another snapshot, and another, and suddenly you’ve got a pretty big picture!
So make sure you look at lots of polls. Not just the one that your echo chamber is lifting up, and not even the one that’s making the most headlines. Because sometimes the media will grab an outlier poll (one that’s showing vastly different results than other polls) and run with it because it’s shocking and different.
But amplifying an outlier can make it seem a lot more meaningful than it actually is. I mean, President Beyonce makes an exciting headline, but she’s not even running for office. When you look at lots of other polls, it becomes clear that the chances of her unparalleled vocal runs actually shifting election results are pretty nonexistent.
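That’s also the logic behind polling averages: combine many snapshots, and a single weird one stops dominating. A crude sketch, with invented numbers:

```python
# Five recent polls of the same race (invented numbers). One outlier
# reads 58%; the rest cluster around 50-52%.
polls = [0.51, 0.50, 0.52, 0.58, 0.51]

outlier_headline = max(polls)
polling_average = sum(polls) / len(polls)

print(f"Outlier headline: {outlier_headline:.0%}")  # 58%
print(f"Polling average: {polling_average:.1%}")    # 52.4%
# The average barely moves; the outlier-driven headline moves a lot.
```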
The best way to be smart about polling is to read a lot of different polls from a lot of different news sources to build a broad understanding of public opinion going into election day. But also: know that polls are not votes. They’re just attempts to measure our opinions at single points in time.
So regardless of whether your candidate is up or down in the polls, none of it means anything until people cast their ballots. Remember that your beliefs and your values are part of the picture that pollsters are trying to piece together, and it’s up to you to turn them into reality. The MediaWise Voter Project (MVP for short) is led by The Poynter Institute, a journalism teaching nonprofit.
Complexly, the creator of this video, is a partner on MVP. The MediaWise Voter Project is supported by Facebook.