
Weekend Reads | A Seattle Public Safety Survey

by Kevin Schofield

This weekend’s read is a document outlining the results from a recent poll commissioned by the Downtown Seattle Association (DSA) and the Seattle Metropolitan Chamber of Commerce. Both organizations are trade associations that count many downtown-based businesses as members, and both work to influence the City’s political outcomes on behalf of their members. This poll — and the fact that they are publishing the results — appears to be an attempt to nudge the complex topic of public safety in a certain direction during an important election year — evidenced by the fact that they chose to survey “likely voters.”

The DSA and the Chamber hired a local political polling firm, EMC, to conduct the survey. They fielded it in mid-January, conducting interviews by telephone and online. In total, they collected 500 survey responses, which gives them a margin of error of plus or minus 4.4 percentage points: When looking at the total group’s response to a given question, the “real” result for all likely voters in Seattle could be up to 4.4 points higher, or 4.4 points lower.
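That ±4.4-point figure is consistent with the standard worst-case formula for a simple random sample at the 95% confidence level. A minimal sketch (the function name is ours, not EMC's):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a simple random sample.

    p=0.5 maximizes p*(1-p), giving the conservative figure pollsters
    usually report; z=1.96 corresponds to 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(500) * 100, 1))  # 4.4 percentage points
```

Real polls use weighting and more elaborate designs, so published margins can differ slightly from this textbook calculation, but for n = 500 it lands right on the reported 4.4 points.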

They found that voters are still pessimistic about the direction of the city — 60% think it’s on the wrong track, and only 36% think it’s going in the right direction — but those figures are vastly improved from over a year ago. Mayor Harrell is viewed favorably by two-thirds of voters, Seattle police officers by a bit fewer (around 62%), and the Seattle Police Department (SPD) by just over half.

Homelessness is by far the top public safety concern, cited by 52% of respondents, followed by crime (23%) and drugs (19%).

Over 60% of survey respondents supported hiring new police officers, increasing funding to SPD, and using signing bonuses and other incentives to increase hiring. Just over half believed SPD has made “significant progress on police reform.” At the same time, there was overwhelming support for the creation of a new civilian public safety force to respond to lower-priority nonviolent calls.

We should always treat survey results with skepticism until we’ve had the chance to interrogate them; it’s easy to design a survey to elicit specific responses, and to implement it in ways that skew the results in the direction those paying for the survey want to see. That doesn’t necessarily mean that all surveys, or this one specifically, are biased; but it’s important for us to look for signs that perhaps the results shouldn’t be trusted quite so much. To that end, let’s pull this one apart.

‘Likely Voters’

The fact that EMC surveyed “likely voters” means several things. First, as mentioned above, they’re trying to send a public message about what’s on voters’ minds as they head to the polls this year. Second, “likely voters” is a different demographic group than the general Seattle populace, so we need to be clear upfront that these poll results don’t necessarily tell us how all of Seattle is thinking about public safety. The group of likely voters tends to skew older and whiter than Seattle as a whole, which is an important demographic difference in a city that is younger and more ethnically diverse than most American cities.

‘Mixed Mode’ Surveys and ‘Response Bias’

The fact that the survey was done “mixed mode” — using both phone and online interviews — is mostly a good thing, as is the fact that they reached out to both landlines and cellphones. As recently as five years ago, many polls only called landline phones, which excluded the views of a large and growing young population who only had cellphones; today, a poll that surveyed only landlines would be rejected instantly. Online polls often reach a different population than phone surveys do, but studies have also shown that people tend to be more outspoken and express stronger views in online polls than they will when speaking with a human interviewer — so including online polls changes not only the audience but also the kinds of responses received. We also need to be cognizant of “response bias”: The people who will take the time to respond to a survey tend to be the ones with stronger opinions, so surveys almost always overstate the strength of people’s beliefs.

Number of Respondents

Having a total of 500 responses to the survey is on the low end of the acceptable range for statistical purposes. As of last November, there were 482,789 registered voters in Seattle, of whom 333,912 cast a vote in the November 2022 election, and 267,414 in the November 2021 off-year election. If around 250,000 Seattle voters are “likely” to vote this coming November, then EMC surveyed 1 in 500 of them. 

It’s always better to have more responses, but it takes more time and costs more money to collect them, so organizations that commission a survey often aim for the minimum number that will give them a statistically significant result. However, this not only increases the margin of error, but it also makes response bias a bigger issue: The fewer respondents, the more easily a handful of people with strong opinions can skew the results.
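The trade-off between sample size and margin of error can be made concrete by inverting the standard formula to solve for the minimum n (function name is ours, for illustration):

```python
import math

def sample_size(moe, p=0.5, z=1.96):
    """Minimum sample size for a target margin of error.

    Inverts the worst-case formula moe = z * sqrt(p*(1-p)/n)
    at 95% confidence (z=1.96) with the conservative p=0.5.
    """
    return math.ceil((z / moe) ** 2 * p * (1 - p))

print(sample_size(0.044))  # 497 -- roughly the 500 responses EMC collected
print(sample_size(0.03))   # 1068 -- tightening to +/-3 points more than doubles the sample
```

Because the margin shrinks only with the square root of n, cutting it in half requires four times as many respondents, which is why commissioned polls so often stop near the 500 mark.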


This is particularly problematic if we try to look at piecemeal results for specific demographics: age, race/ethnicity, political party affiliation, and so on. For some of them, the sheer numbers are so small that the results are meaningless. For example, only 16 survey respondents listed their gender as nonbinary; that is not enough people for us to draw any statistical conclusions about how nonbinary Seattleites would answer the questions. But not separating them out can also be problematic: The EMC survey divides race/ethnicity into only two groups: white or BIPOC. But we know from many other polls that Asian, Black, and Latino Seattleites can have different views on public safety and policy issues based on their lived experiences, and grouping them together (probably because the individual numbers are too small — only 100 people are in the BIPOC category) masks those important differences.
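Applying the same worst-case formula to the subgroups mentioned above shows how quickly the margins blow up as the slices get thinner:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# Subgroup sizes taken from the survey as described above.
for label, n in [("full sample", 500),
                 ("BIPOC respondents", 100),
                 ("nonbinary respondents", 16)]:
    print(f"{label} (n={n}): +/-{margin_of_error(n) * 100:.1f} points")
```

A ±9.8-point margin for the 100-person BIPOC group is already shaky; at ±24.5 points, the 16 nonbinary respondents simply can’t support any statistical conclusion.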

What’s a ‘Cross-Tab’?

And this leads us to one of the most important questions to ask whenever someone hands you poll results: “Can I see the cross-tabs?” A “cross-tab” is a table of the survey results for each question, broken out by the answers to the demographic questions. This allows us to double-check that the survey respondents match the population as a whole, but it also allows us to see how different demographic groups responded to specific questions (assuming that the total number of people in a given demographic group is large enough to be statistically meaningful). In addition, it lets us see what the pollster chose not to include in their report: statistics that might contradict a narrative they wanted to tell. You can view the cross-tabs for the DSA/Chamber poll here.
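Mechanically, a cross-tab is just a count of every (demographic group, answer) pair. A toy sketch with made-up records — the field names and data here are hypothetical, not the poll’s actual variables:

```python
from collections import Counter

# Hypothetical respondent records for illustration only.
responses = [
    {"age": "50+",      "support_hiring": "yes"},
    {"age": "under 50", "support_hiring": "no"},
    {"age": "50+",      "support_hiring": "yes"},
    {"age": "under 50", "support_hiring": "yes"},
]

# Cross-tab: tally each (demographic value, answer) combination.
crosstab = Counter((r["age"], r["support_hiring"]) for r in responses)

for (age, answer), count in sorted(crosstab.items()):
    print(f"{age:9s} | {answer:3s} | {count}")
```

Pollsters publish exactly these pairwise tables (usually with percentages alongside the raw counts), which is what lets an outside reader re-check both the sample composition and any subgroup claims.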

Looking at the gender and age statistics, it seems the survey respondents generally match the population of “likely voters,” which we know skews older and whiter than Seattle’s population as a whole. Of survey respondents, 71% were white, and 48% were age 50 or older; 55% were Democrats, 4% were Republicans, and 5% were Socialists.

An Anomaly in the Data

One weird thing that pops out from the cross-tabs, though, is that 45 people, almost 10% of the survey respondents, refused to provide their race/ethnicity — a high figure for a poll like this. Among those 45, half also refused to give their gender. We might dismiss this as a random occurrence, except that this group of 45 expressed extremely strong and consistently pro-police views on all the related survey questions, as well as the strongest views that the city is less safe than two years ago and that the city remains on the wrong track. They were also somewhat less concerned about homelessness as a public safety issue and much more concerned about drugs and a lack of prosecution of alleged criminals.

Nearly 60% of this group still voiced support for a civilian public-safety department, though one-quarter strongly opposed it; 22% said they were Democrats, 31% independent, 4% Republican, none was Socialist, and a whopping 44% refused to give their political affiliation at all. All that said, 45 people is a small group in a relatively small poll: We can’t tell whether this is response bias, sampling error, or something else. But the idea that perhaps 10% of Seattle’s likely voters would give this kind of highly consistent “outlier” response means something — even if we don’t yet know what. And it was certainly a large enough group, with a strong enough response, to influence the overall results.

All told, there’s nothing in the survey composition or the cross-tabs that suggests we should ignore or reject the results — though the small survey size argues for not paying much attention beyond the top-level numbers. We also need to remind ourselves that this is a political poll, targeted at likely voters and intended to influence officials and candidates who are concerned about how the voters will respond in this year’s elections; it doesn’t represent the overall views of all Seattleites (nor does it claim to). Finally, this was a survey on public safety, and while the DSA and the Chamber claim public safety is the top issue for voters, the survey didn’t ask that question. We need to be careful with assumptions that public safety will necessarily be the motivating factor in voters’ decisions come November.

Seattle Public Safety: Likely November 2023 Voters, January 2023
Survey cross-tabs

Kevin Schofield is a freelance writer and publishes Seattle Paper Trail. Previously he worked for Microsoft, published Seattle City Council Insight, co-hosted the “Seattle News, Views and Brews” podcast, and raised two daughters as a single dad. He serves on the Board of Directors of Woodland Park Zoo, where he also volunteers.

📸 Featured image by VideoFlow/Shutterstock.com.

Before you move on to the next story …
The South Seattle Emerald is brought to you by Rainmakers. Rainmakers give recurring gifts at any amount. With over 1,000 Rainmakers, the Emerald is truly community-driven local media. Help us keep BIPOC-led media free and accessible. 
If just half of our readers signed up to give $6 a month, we wouldn't have to fundraise for the rest of the year. Small amounts make a difference. 
We cannot do this work without you. Become a Rainmaker today!