
It’s an election year, which means that many creators are paying attention to polling about local, state and national races – and sharing that information with their audiences.
But polls, while useful, can also be tricky, and their results are sometimes mischaracterized by campaigns, journalists, and others.
“It has clear limits and values about what it can and can’t do, so what it’s showing you is basically a snapshot in time of where public opinion stands at that moment,” said Jessa Scott-Johnson of SevenLetter Insight, a firm that specializes in bipartisan public opinion polling and research.
Polling can do some helpful things: show where public opinion stands today, identify which issues matter most, reveal misconceptions or confusion, highlight differences across key audiences and identify messaging preferences.
But it’s important to know what polling can’t do, Scott-Johnson said. It can’t predict the future, explain why people think what they do or tell you how people behave in real life.
“There is always going to be some level of uncertainty between polling and reality,” Scott-Johnson said.
There are a few things to remember about a poll and the sample it represents, she said. A poll represents a snapshot of the population, not every individual. Polls of different audiences will produce different results, so make sure the sample matches the audience you most care about.
Polls measure what people believe or feel, not what is factually true. High confidence reported in a poll doesn’t always mean high understanding of an issue, so use polls to identify where clarification, education or framing is needed.
To read a poll, start with the topline number. Then look at the intensity of responses – people who answer “somewhat” are often not strongly committed to that position. Check movement over time: has the intensity of concern gone up since the question was last asked? And compare groups – different audiences or demographics may hold different opinions.
To assess a poll’s methodology, consider the following points. You may not have all of this information, Scott-Johnson said, but you should at least have the sample size and sampling frame. Evaluate whatever you do have.
- Sample size: Is it large enough to be reliable?
- Screeners: How did respondents qualify for the poll, and how are audiences defined?
- Field dates: When was the poll conducted, and what outside events were occurring at that time?
- Sponsorship: Who paid for the poll? Are there any biases to consider?
- Sampling Frame: What mode was used (online, phone, text), and how were respondents recruited?
- Question Wording: How were questions phrased?
- Weighting variables: How was the poll weighted to mirror the target population? Age, party ID and race are the variables that can vacillate the most between polls.
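To make the weighting idea concrete, here is a minimal sketch of post-stratification weighting with hypothetical numbers (the groups, shares and support figures below are invented for illustration, not from any real poll): each group’s weight is its share of the population divided by its share of the sample.

```python
# Post-stratification sketch with hypothetical numbers:
# weight for a group = population share / sample share.

population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # assumed target shares
sample_share = {"18-34": 0.20, "35-54": 0.35, "55+": 0.45}      # shares actually polled

weights = {g: population_share[g] / sample_share[g] for g in population_share}
# 18-34 respondents are underrepresented, so each counts for more (weight 1.5);
# 55+ respondents are overrepresented, so each counts for less (weight ~0.78).

# Hypothetical raw support for some question, by group:
raw_support = {"18-34": 0.60, "35-54": 0.50, "55+": 0.40}

unweighted = sum(sample_share[g] * raw_support[g] for g in weights)
weighted = sum(sample_share[g] * weights[g] * raw_support[g] for g in weights)
print(round(unweighted, 3), round(weighted, 3))  # weighting shifts the topline
```

In this made-up case, weighting moves the topline from 47.5% to 49.5% – a two-point swing purely from correcting who was over- and undersampled, which is why the choice of weighting variables matters.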
Some red flags while evaluating a poll:
- Unusual methodology: Methods that can’t produce a representative sample (social media polls, opt-in links, QR codes).
- Lack of transparency: No methodology, sample details, or question language provided.
- Small sample size: Too few respondents, resulting in large margins of error (+/- 6%).
- Overinterpreted subgroups: Strong conclusions drawn from subgroups that are too small to be reliable (under n=100 respondents for non-directional data).
- Leading or biased questions: Wording that pushes respondents toward a certain answer or frames the issue one-sidedly.
- No weighting or poor weighting: Sample does not match the population.
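The margin-of-error red flag can be checked with simple arithmetic. At 95% confidence, a proportion near 50% has a margin of error of roughly 1.96 × √(0.25 / n), so a +/- 6-point margin implies a sample of only about 267 respondents. A quick sketch using that standard formula (the sample sizes are hypothetical):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical sample sizes: bigger samples shrink the margin of error.
for n in (100, 267, 1000):
    print(n, round(100 * margin_of_error(n), 1))
# 100  -> 9.8 points
# 267  -> 6.0 points
# 1000 -> 3.1 points
```

This is also why the subgroup red flag matters: a subgroup of 100 respondents inside a larger poll carries a margin of error near 10 points on its own, too wide to support strong conclusions.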
Scott-Johnson also gave tips on how to talk about polls – and how not to talk about them.
How to talk about a poll:
- “Voters lean towards…”: Signals direction without overstating certainty.
- “A majority/plurality support…”: Use majority when the largest share is over 50% and plurality when it is under 50%.
- “Opinion is divided”: Use when results are close or within the margin of error.
- “These messages perform the strongest…”: An accurate way to compare messages without claiming guarantees.
- “Among those who are…”: Clarifies that results refer to a specific group.
How not to talk about a poll:
- “This proves…”: Polling shows patterns, not proof.
- “Voters will definitely…”: Polls measure current attitudes, not future behavior.
- “Everyone agrees…”: Almost never accurate and usually misleading.
- “This poll is exact…”: All polls include uncertainty, or a margin of error.
SevenLetter Insight’s Matthew George advised that if you’re concerned about a poll but want to report on an outcome, check whether another poll found the same result. If not, say something like, “This poll is not scientifically sound, but it found this result, which would be true for this specific group, with these biases.”
George said pointing out a poll’s shortcomings can also help educate audiences about the ins and outs of polling, so they can become more discerning as well.