“Hiding in Plain Sight”: Charles Stewart on Voter Experiences in the 2014 SPAE


[Image – Norman Rockwell’s “Election Day, 1944” – courtesy of the Cedar Rapids Museum of Art]

Last week, I highlighted the initial release of the 2014 Survey of the Performance of American Elections (SPAE), the latest in a series of instruments designed to ask how voters themselves felt about their voting experience. This week, SPAE’s founder and champion Charles Stewart shared his thoughts with electionlineWeekly about what the data says and what it means for the field:

The experience of voters is one of those things that hide in plain sight.

Despite the fact that more than 100 million voters take part in presidential elections, and around 80 million voters take part in midterm congressional elections, very little is actually known about the experiences voters have when they go to cast a ballot.

Do their machines work? Do they wait in long lines? Are they met by competent poll workers?

Voters tell each other stories about these things, and sometimes reporters write news accounts about them, but until 2008 no one had ever attempted to ask voters about their experience on Election Day in any comprehensive, systematic way.

Thus was born the Survey of the Performance of American Elections (SPAE), the first (and thus far only) comprehensive national public opinion study of voting from the perspective of the voter.

In 2014, with the financial assistance of the Pew Charitable Trusts (which has generously funded the SPAE since its inception), we have been able to study in detail the voting experience at midterm. This report touches on some highlights.

But first, a little more background.

The SPAE was begun in 2008, supported by Pew’s Make Voting Work Initiative and the JEHT Foundation, to answer a long list of questions about voting in America that the 2000 Florida recount controversy had raised. These questions ranged from the sensational — do we live in a banana republic, with officials unable to conduct competent elections? — to the scientific — precisely how many votes get lost because of broken machines, inaccurate registration lists, and long lines?

The basic survey is administered to 200 registered voters in each state plus the District of Columbia. This allows us to make reliable comparisons between states on the questions we explore; when we aggregate the answers together, we have responses from 10,200 registered voters to consider.

The study was designed with the ability to compare across states in mind. The overall size of the sample was chosen because some of the issues it addresses, such as the breakdown of voting machines, are so infrequent that we need 10,200 respondents just to get reliable estimates nationwide.
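To see why both sample sizes matter, here is a rough back-of-the-envelope sketch (my illustration, not a calculation from the report) of the approximate margin of error for an estimated proportion:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A state-level sample of 200 carries roughly a +/-7-point margin on a 50/50 question,
# which is adequate for comparing states on common experiences.
state_moe = margin_of_error(0.5, 200)

# A rare event -- say, 2 percent reporting equipment problems -- needs something
# like the full national sample of 10,200 before the margin shrinks well below
# the estimate itself.
national_moe = margin_of_error(0.02, 10_200)

print(f"state-level margin (p=0.50, n=200):    +/-{state_moe:.3f}")
print(f"national margin    (p=0.02, n=10200):  +/-{national_moe:.4f}")
```

The exact survey weighting is more involved than this simple formula, but it conveys the design logic: 200 per state for cross-state comparisons, 10,200 pooled for rare events.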
The survey instrument itself contains scores of questions about voting. It starts with the question of whether the respondent voted. If he didn’t, we ask why not. If the respondent did vote, the questionnaire asks basic questions about the experience, depending on whether the vote was cast in person (on Election Day or early) or by mail.

For instance, in-person voters are asked about the ease of finding the polling place, problems with registration, length of time waiting to vote, and the performance of poll workers. Mail voters are asked how hard it was to request a ballot, whether the instructions on the ballot were easy to follow, and how the ballot was returned (by mail or at a drop-box? personally, or by someone else?).

In addition to basic performance questions, the survey asks about attitudes toward elections and voting. How confident are voters their votes were counted as cast? How confident are all respondents that votes nationwide were counted as cast? How common is voter impersonation fraud in the community? Should Election Day be a holiday?

Finally, the survey collects basic demographic information about respondents, and we know the state and ZIP code in which they live. Thus, we can study the relationship between the context of the voter and the voter’s experience.

Since its inception, the SPAE has helped paint an overall positive picture of voting, as experienced by voters.

Very few voters have problems finding their polling place; in 2014, 88 percent said it was “very easy” to find. Just 2 percent said they encountered registration problems when they went to vote, and another 2 percent said they encountered voting equipment problems. Overall, 85 percent of in-person voters said their polling place was run “very well.”

Absentee voters also had good experiences overall. Just 2 percent reported encountering problems receiving their mail ballot, 1.5 percent encountered problems marking the ballot, and 86 percent said they found the instructions easy to follow.

Among both in-person and absentee voters in 2014, more than 70 percent stated they were very confident their votes were counted as cast.

Of course, problems encountered by only a small percentage of voters may still be encountered by a large number of voters.

For instance, if roughly 60 million voters cast ballots in-person in 2014, the 2 percent who said they had registration problems amounts to 1.2 million voters nationwide — the same number that had problems finding the polling place in the first place.

As well, although the percentages of voters who encounter any given problem are small, the percentage who encounter at least one problem is not so small. For instance, if we consider all the ways an in-person voter could have reported a problem in the SPAE (difficulty finding a polling place, registering, or using the voting equipment, plus encountering a poorly run polling place or a poorly performing poll worker), 7.5 percent of in-person voters in 2014 had at least one problem, or 3.8 million people.
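The percentage-to-headcount arithmetic in the two paragraphs above is simple enough to sketch, using the article’s rough figure of 60 million in-person voters:

```python
# Rough figures from the article: about 60 million in-person voters in 2014,
# 2 percent of whom reported a registration problem.
in_person_voters = 60_000_000
registration_problem_rate = 0.02

affected = in_person_voters * registration_problem_rate
print(f"{affected:,.0f} voters with registration problems")  # 1,200,000 voters with registration problems
```

A 2 percent problem rate sounds negligible until it is multiplied out to over a million people.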

Using the demographic information included in the SPAE, we can see which voters tend to have more problems. The results are informative: the young, first-time voters, recent movers, and people with physical disabilities are much more likely to report encountering a problem than older, more experienced voters without physical disabilities.

This is not a shocking finding to those who follow election administration, but it helps to further refine the types of factors that give rise to challenges in meeting the needs of voters. It is also important to note the factors that are not associated (in a statistical sense) with having problems in 2014 — sex, race, income, and education.

In 2012, the big question was how long people waited in line to vote. Because of its design, the SPAE was instrumental in providing hard evidence about this hot election administration issue in the days immediately following the 2012 presidential election. Not only could the SPAE identify the states with the longest lines, it established the important fact that most people, in fact, did not stand in any appreciable lines at all — but if they did stand in a line, it could be quite a long one. The 2012 survey was also instrumental in establishing that lines were much longer in early voting than on Election Day, in inner cities than in rural areas, and among blacks compared to whites.

The 2014 SPAE finds that lines were much less prevalent in the most recent midterm election. The average nationwide wait time fell from 14 minutes in 2012 to 4 minutes in 2014. The states with the longest wait times in 2012 were not the states with the longest wait times in 2014. Furthermore, the big disparities seen in 2012 along urban/rural, racial, and voting-mode dimensions closed significantly.

Because long lines are caused by congestion, and midterm elections have significantly lower turnout than presidential elections, it is not surprising that average wait times were cut by two-thirds in 2014 compared to 2012. On the other hand, the fact that the disparities closed significantly was a bit of a surprise.

The SPAE is an evolving instrument. In 2014, some new questions were added to gain greater insight into how mail and absentee voters returned their ballots, and how voting fit into the voter’s day. I was interested to discover, for instance, that one-fifth of absentee/mail voters actually returned their ballots personally, rather than relying on the mails. In the three states that are now exclusively vote-by-mail, these percentages ranged from 39 percent in Washington to 57 percent in Oregon (with Colorado in the middle at 44 percent).

I also found one new fact about early voters especially interesting. We asked in-person voters how going to the polls fit into the routine of their day. Did they vote on the way to work or school? On the way home? In the middle of the day?

In fact, 60 percent of early voters responded that they “did not have work or school the day I voted.” This is in contrast with 49 percent of Election Day voters.

Why this is interesting to me is that I have long heard that early voting is especially useful for helping people fit voting into their busy schedules. While people who are not working or going to school no doubt have many reasons to be busy, theirs does not seem, on average, to be the same type of time constraint as that of someone working a full-time job. In any event, knowledge about what else voters are doing with their time (or not) before and after they vote can be very helpful to local election officials in figuring out when to offer early voting hours and where to locate early voting sites.

One final special feature of the 2014 SPAE was a parallel study, in which we interviewed 1,000 registered voters in 10 states — Arizona, California, Florida, Iowa, Michigan, North Carolina, Ohio, Oregon, Texas, and Washington.

The questionnaire of the parallel study was identical to that of the 50-state study just described. Space constrains me from saying much about this “oversample study.” Its main goal was to experiment with saying more about the experience of voters at the local level, within states with a variety of electoral practices and challenges. We have yet to scratch the surface of what this special study has to say.

Like any major survey research effort, a project like this could not have happened without the hard work of a team of scholars and researchers. On the design side, the original survey instrument was developed in collaboration with scholars associated with the Caltech/MIT Voting Technology Project — Michael Alvarez, Stephen Ansolabehere, Adam Berinsky, Thad Hall, and Gabriel Lenz — in addition to myself.

The survey has been flawlessly administered by YouGov (formerly Polimetrix), whose Sam Luks has been a champion of the project from the start.

And, of course, the good people at Pew, both current (Michael Caudell-Feagan, Sean Greene, and Zach Markovits) and past (Doug Chapin) have provided financial support, encouragement, and advice. (Pew, of course, while generous in support of the project, is not responsible for the analysis that comes from it.)

Finally, the SPAE data are available for download by anyone who wishes to analyze them. The 2014 SPAE, along with all previous versions, is hosted at the Harvard Dataverse.
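For anyone picking up the files, here is a minimal sketch of the kind of state-level comparison the dataset supports. The rows below are synthetic stand-ins of my own, not real SPAE responses; the actual download is a much larger file whose variable names are documented in the accompanying codebook.

```python
import csv
import io
from collections import defaultdict

# Synthetic stand-in for a few survey rows; the real file and its column
# names come from the Harvard Dataverse download and its codebook.
sample = io.StringIO(
    "state\twait_minutes\n"
    "FL\t10\n"
    "FL\t2\n"
    "VT\t0\n"
    "VT\t4\n"
)

# Group reported wait times by state.
waits = defaultdict(list)
for row in csv.DictReader(sample, delimiter="\t"):
    waits[row["state"]].append(float(row["wait_minutes"]))

# Average reported wait per state -- the kind of cross-state comparison
# the SPAE's 200-per-state design makes possible.
averages = {state: sum(v) / len(v) for state, v in waits.items()}
print(averages)  # {'FL': 6.0, 'VT': 2.0}
```

Real analyses should apply the survey weights included in the dataset rather than taking simple means, but the workflow is the same: group by state, summarize, compare.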

The data are there for the election administration community to use. Please use them.

I echo all of Charles’ thanks and add my own to him for this remarkable idea and dataset. Electiongeeks, start your stat engines!
