The 19th and SurveyMonkey have teamed up on a poll to explore core issues in our coverage. SurveyMonkey surveyed more than 20,000 adults on politics, abortion, caregiving, health care, the workplace and more. Here’s why we tackled this project — and how.
Why do a poll?
We talk to a lot of people in the course of our reporting on issues that disproportionately affect women, particularly women of color, and LGBTQ+ people. That reporting is invaluable, but it’s just one tool, and one that doesn’t always give us the widest view. This poll sheds light on what women and LGBTQ+ people across America think about the state of the country — its politics, its politicians and its systems.
Many news outlets and organizations do polls, often focusing on issues of the day or head-to-head electoral matchups. Our poll is different, designed to probe opinions on topics relevant to people The 19th serves. We aim to keep asking these questions every year to see how experiences, attitudes and concerns change over time.
As we report on other polls, we often run into a lack of data on smaller segments of the American population, such as transgender people. SurveyMonkey’s reach allows us to collect significant insight into the views of the communities we cover, people who are often overlooked in traditional polling. We want to know how our country’s systems are working: What is motivating people? And where do they run into barriers? Doing our own poll with SurveyMonkey is a chance to get new, more detailed data.
The data from this poll informs a number of stories we’re publishing and will continue to help us better cover and serve women and LGBTQ+ people going forward.
Explore Our Findings: The State of Our Nation
How this poll worked — plus how polls work generally
In any public opinion poll, not everyone in the country is asked for their opinion. Pollsters reach out to a representative sample of people and ask a series of questions. Most often, outreach happens through calls to landlines or cellphones, though some pollsters now use online or text message surveys. SurveyMonkey, our partner, uses the reach of its online survey service, which engages more than 2 million people a day, to select a random sample of respondents and ask if they’re willing to answer more questions.
For this 19th News/SurveyMonkey poll, 20,799 American adults took the survey in either English or Spanish. The modeled error estimate, the equivalent of a traditional margin of error for surveys of this type, is plus or minus one percentage point. That means that, for results based on the full sample, the true value is expected to fall within one percentage point of what is reported in this survey, at a 95 percent level of confidence. For results based on subgroups, the error estimate will be higher.
SurveyMonkey goes into more detail here.
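For readers curious how a traditional margin of error relates to sample size, here is a minimal sketch in Python using the standard formula for a proportion at a 95 percent confidence level. The numbers are illustrative: the modeled error estimate reported for this survey comes from SurveyMonkey’s own model-based procedure, not this exact formula.

```python
import math

def classic_margin_of_error(n, p=0.5, z=1.96):
    """Traditional 95% margin of error for a simple random sample of size n.

    p=0.5 is the most conservative assumption; z=1.96 corresponds to 95%.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Full sample of 20,799 adults: roughly +/- 0.7 percentage points.
print(f"full sample: +/- {classic_margin_of_error(20_799) * 100:.1f} points")

# A hypothetical subgroup of 500 respondents: roughly +/- 4.4 points,
# which is why subgroup results carry a higher error estimate.
print(f"subgroup:    +/- {classic_margin_of_error(500) * 100:.1f} points")
```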
But how do you know these answers are accurate?
Even if a pollster is really careful with sampling in the traditional manner, they won’t have an exact replica of the total U.S. population. That’s where weighting comes in. SurveyMonkey, like many other pollsters, compares the people who answered the poll to data collected as part of the Census Bureau’s American Community Survey (ACS) along axes of race, sex, age, education and geography. Then the data is weighted so that it matches the population at large. For example, if a survey had significantly more women respondents than men, weighting ensures that the reported results aren’t skewed by that imbalance.
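To make the weighting idea concrete, here is a minimal sketch of one common approach, post-stratification, using invented numbers for a single variable. SurveyMonkey’s actual procedure weights on several variables at once against ACS benchmarks, so treat this only as an illustration of the principle.

```python
# Post-stratification sketch with invented numbers: suppose 60% of
# respondents are women but women are 51% of adults in the benchmark.
# Each group gets weight = population share / sample share, so
# overrepresented groups count a bit less.

sample_share = {"women": 0.60, "men": 0.40}      # who actually answered
population_share = {"women": 0.51, "men": 0.49}  # benchmark, e.g. the ACS

weights = {g: population_share[g] / sample_share[g] for g in sample_share}
# women -> 0.85, men -> 1.225

# Hypothetical unweighted support for some policy within each group
support = {"women": 0.70, "men": 0.50}

unweighted = sum(sample_share[g] * support[g] for g in sample_share)
weighted = sum(sample_share[g] * weights[g] * support[g] for g in sample_share)

print(round(unweighted, 3))  # 0.62  (skewed toward the oversampled group)
print(round(weighted, 3))    # 0.602 (matches the population mix)
```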
What are the limitations of the survey language?
People describe themselves in a lot of ways; the ACS has more limited options. (As The 19th has previously covered, the Census Bureau only just started explicitly gathering data on LGBTQ+ individuals.) To ensure accurate weighting, certain demographic questions in polls must mirror the ACS exactly. That means the phrasing for several questions on our survey deviated from our editorial guidelines around language, which were created to promote clarity as well as equity.
When respondents were asked how they identify their gender, they were given only three options: male, female or “not listed/non-conforming.” A follow-up question asked respondents whether they identified as transgender, but we weren’t able to provide more expansive gender options to explicitly collect more granular data on nonbinary individuals.
In a similar vein, The 19th uses the term Latinx to describe people of Latin American origin or descent. However, the ACS refers to this category as Hispanic or Latino, which is the phrasing used in our survey questions. (We are reporting the results as “Latinx.”)
Again, SurveyMonkey’s methods allow us to capture a wide range of perspectives and highlight views from segments of the population in a way that many polls can’t. But it’s important to also acknowledge our limitations.
Do we still trust polls?
We trust them to do what they’re supposed to do: give us information about public opinion at a certain point in time. By themselves, they’re not intended to predict what will happen, and there’s always some error inherent in any polling figure. That’s why we share how the survey was conducted, as well as the estimate for the potential error.
“Polls haven’t failed us. They’re still the best way to capture data that accurately reflects the views of millions of people in just a few simple statistics,” said Laura Wronski, director of research at SurveyMonkey. People have particularly high expectations of precision for election polls, she pointed out — maybe unrealistic ones.
When people see two polls give slightly different numbers for how many people support a policy, for example, they tend to accept the variation, she said.
“They wouldn’t be so forgiving if the same were true of an election poll — even though they use the same methodology,” Wronski said. “So it’s really all about knowing what the goal of a poll is, and having enough understanding of statistics to interpret the results correctly.”
The 19th worked with SurveyMonkey to make sure we are doing just that. And in the process of reporting, our journalists have also reached out and followed up with respondents to build on the information in hand.
You can see the full data here and the questionnaire here.
Have more questions on how this poll was done, or how polls work in general? Share them.
Methodology: This SurveyMonkey poll was conducted online in English and Spanish from August 22-29, 2022, among a national sample of 20,799 adults. Respondents for this survey were selected from the more than 2 million people who take surveys on the SurveyMonkey platform each day. The modeled error estimate for this survey is plus or minus 1.0 percentage points. Data have been weighted for age, race, sex, education and geography using the Census Bureau’s American Community Survey to reflect the demographic composition of the United States age 18 and over.