
[Image courtesy of businessgrow.com]

On Tuesday, the Federal Voting Assistance Program (FVAP) released a report and underlying data assessing the extent of voting by military voters in the 2010 election.

The data paints an encouraging but still mixed picture; while participation rates for members of the military (adjusted for age and gender) appear to be strong, there are still areas where the system could improve. For example, 29% of military voters reported that they requested but never received an absentee ballot – up from 16% in 2008. These figures are likely to form the backdrop for continued enforcement and potential expansion of the MOVE Act of 2009, which was designed to improve voting for military and overseas voters.

The FVAP report is so rich with data that I knew there was no way I could dive in alone; that’s why I reached out to my fellow election geeks for their read on the release. Not everyone wanted to speak for the record, so we’ll keep all of these anonymous – but what they had to say was fascinating and helped me (and hopefully you) see the data in different ways.

One geek went into great detail about FVAP’s decision to adjust military participation rates for age and gender:

[T]he rationale for the weight is that the demographic profile of the military is such that, independent of some effects I’ll mention below, the conventional wisdom is that you should expect to see lower turnout rates among the members.

Demographic profile of the military (figures are for the force as a whole; there are important rank-and-file vs. officer variations, and some branch differences, especially marked for the Marines):

+ 15% female
+ 18% African American (5 percentage points higher than the general public)
+ 25% “minority” (I think this is less than the general public but I’d have to see how they categorize them)
+ 50% are married; 70% of officers are married (not sure how this compares)
+ 93% high school diploma
+ 66% between 18 and 30 (this is a huge one)

I think, though I could not quickly confirm it, that the rank and file generally come from families with lower median incomes than the general public. There is a serious skew toward Southerners.

So some but not all of these (age, income, race) are negatively associated with turnout. What [FVAP] says the weighting does is make the military “look like” the general public, and then you can compare apples to apples on turnout rates.

What this misses are things in the error term – those things that we don’t measure about the military and the general public that might be related to turnout. This is a matter of theory and of debate, of course, but two you might consider are –

a) Are members of an all-volunteer force more patriotic, or do they feel more positive about the state of the nation (both would imply higher turnout on average)?

b) Is it possible that being in a very hierarchical organization [like the military] may encourage higher levels of turnout than you’d expect otherwise, given the demographic and attitudinal profile?
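The adjustment the first geek describes is essentially post-stratification: compute turnout within each demographic cell, then reweight the cells so the military’s mix matches the general public’s. Here is a minimal sketch of that idea; all of the cell shares and turnout figures below are invented for illustration and are not FVAP’s actual numbers.

```python
# Hypothetical post-stratification sketch: reweight military respondents so
# their age/gender mix matches the general public, then compare turnout.
# Every number below is invented for illustration, not an FVAP figure.

# (age_group, gender) -> share of each population (each sums to 1.0)
military_shares = {("18-30", "M"): 0.58, ("18-30", "F"): 0.08,
                   ("31+",   "M"): 0.27, ("31+",   "F"): 0.07}
public_shares   = {("18-30", "M"): 0.11, ("18-30", "F"): 0.11,
                   ("31+",   "M"): 0.38, ("31+",   "F"): 0.40}

# Observed turnout within each military cell (invented)
military_turnout = {("18-30", "M"): 0.45, ("18-30", "F"): 0.47,
                    ("31+",   "M"): 0.70, ("31+",   "F"): 0.72}

# Raw (unweighted) military turnout: sum over cells of share * turnout
raw = sum(military_shares[c] * military_turnout[c] for c in military_shares)

# Adjusted turnout: weight each cell by the *public* share instead,
# making the military "look like" the general public demographically.
adjusted = sum(public_shares[c] * military_turnout[c] for c in public_shares)

print(f"raw turnout: {raw:.3f}, demographically adjusted: {adjusted:.3f}")
```

Because the military skews heavily toward the 18–30 group, whose turnout is lower, the adjusted figure comes out higher than the raw one – which is exactly why the weighting makes the comparison to the general public "apples to apples." It cannot, as the geek notes, correct for unmeasured differences in the error term.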

Another geek spotted an interesting aspect of the survey data used in the report; namely, the low response rates (25% at the highest and as low as 14%) across different categories. The expert attributed this in part to the decision to conduct a census of the full population – and had this suggestion:

“Far better would be a smaller survey, in which LOTS of resources went into making sure the response rate was as close to 100% as possible. Instead of sending out 7,700 surveys, send out 1,000. Instead of 2,200 surveys of dubious value, you would have 900-1,000 surveys of great value.”
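The arithmetic behind that suggestion is worth making explicit. The sketch below uses illustrative response rates drawn from the figures quoted above (`completes` is a hypothetical helper, not anything from the FVAP report):

```python
# Back-of-the-envelope comparison of the two survey designs the geek
# describes. Response rates are illustrative, taken from the text above.

def completes(surveys_sent, response_rate):
    """Expected number of completed surveys."""
    return surveys_sent * response_rate

census_style = completes(7700, 0.25)   # large mailing, low response rate
focused      = completes(1000, 0.95)   # small sample, intensive follow-up

print(f"census-style: ~{census_style:.0f} completes at 25% response")
print(f"focused:      ~{focused:.0f} completes at 95% response")
```

The census-style design actually yields more raw completes (~1,925 vs. ~950), so the geek’s point is not about sample size: with 75-86% of the sample unobserved, nonresponse bias can swamp the estimates, whereas a design that observes 95% of its sample bounds that bias far more tightly.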

Finally, in response to a general query by Rick Hasen at Election Law Blog about how the FVAP data compared to another recent report by the Military Voter Protection Project (MVPP) – whose conclusions were far more pessimistic about the state of military voting – a third geek offered this observation, which goes to the diversity of ballots cast by voters pursuant to the Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA):

[MVPP] bases its participation numbers on EAC-reported UOCAVA ballots, but those ballots represent only roughly a third of how UOCAVA voters vote. They completely ignore in-person voting, which active duty military tell FVAP accounts for about 1/3 of participation, and ignore the large number of ballots cast by UOCAVA-eligible voters that are not tracked as such; for example, military voters who vote through normal absentee or vote-by-mail procedures without ever identifying themselves as UOCAVA voters.

UPDATE: Late Thursday I received an email from the MVP Project’s Eric Eversole. He has a longer response over at the Election Law Blog, but here’s the relevant part of what he sent me:

In your most recent post, you referenced comments from one of your geeks regarding the difference between the MVP report (as well as the EAC’s report) and the FVAP report. The commenter asserted that the MVP Report completely ignores “in-person voting, which active duty military tell FVAP account for about 1/3 of participation….” That is simply untrue.

Throughout the MVP Report, we made clear that the reported data is focused solely on absentee voting rates among military voters in 2010. We further made clear that our report was an early snapshot of the absentee voting data from states in 2010, and we tried in limited cases to compare that data to the 2006 election. As for in-person voting, we specifically note on page 4 of the report that we are not addressing in-person voting rates for the 2010 election.

The commenter also asserted that we ignored “the large number[] of ballots cast by UOCAVA eligible voters that are not tracked as such.” That assertion echoes a claim made by FVAP, but has little or no factual foundation.

[I]f you believe this assertion, it means that state election officials failed to classify more than 300,000 military voters as UOCAVA voters and, thus, failed to provide these voters with the protections guaranteed by federal law. Such a claim is unbelievable. Based on my experience, state and local election officials go out of their way to identify UOCAVA voters to provide them the maximum protection under federal law.

This is just the beginning, of course; the underlying FVAP data offers considerable opportunities for further analysis. Kudos to FVAP for releasing this data, and thanks to all the geeks who responded to my request for help … my guess is you’ll hear lots more from me and them in the not-too-distant future!