Friday, June 05, 2009

Voters Not Well Informed About Oconee SPLOST

So Who Really Voted?

Only about one in five of the registered voters in Oconee County closely followed the discussions about the Special Purpose Local Option Sales Tax before it was approved by voters on March 17, and only half even knew that the vote was for renewal of a tax already in place.

Only 6.6 percent of all registered voters cast a ballot on March 17, so these findings–from a survey of registered voters conducted by two graduate students and me just after the election–are hardly surprising.

Rather, they confirm the success of county officials in running a low-key campaign for the tax initiative and in holding the election at a time when relatively few voters were likely to show up at the polls. Research around the state has shown–and the county's experience with tax votes has confirmed–that approval is likely if turnout is low.

Of the 1,457 voters who cast their ballots at the March 17 special election, 71.2 percent approved the tax. The county had 22,113 registered voters at the time of the election.

In our survey of 128 registered voters, whose names we drew from the registration list at the beginning of February, 16.9 percent said they did not even know about the SPLOST tax until they received the survey we mailed to them the day after the election.

Another 63.7 percent said they paid some attention to discussions about the SPLOST tax before the election, "but I did not follow the discussions closely."

Only 10.5 percent told us they "paid quite a bit of attention to the discussions" about SPLOST and 8.9 percent said they "followed the discussions very closely."

Only 53.6 percent of those who returned the survey answered this question affirmatively, and correctly: "As far as you know, did Oconee County collect a SPLOST tax of 1 cent on the dollar even before the March 17 vote?" A few (4.0 percent) said the county did not have such a tax, and 42.4 percent said they did not know the answer.

The sample is relatively small, but the odds are good (19 out of 20) that, had we interviewed all registered voters rather than the sample, the answers would have been within plus or minus 8.8 percentage points of those we received.
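For readers curious where a number like that comes from, here is a rough sketch in Python of the standard 95 percent margin-of-error calculation for a simple random sample of 128 drawn from 22,113 registered voters, assuming the most conservative case of a 50/50 split on a question. It lands near, though not exactly on, the 8.8 points cited above, since the exact figure depends on the formula and rounding one uses.

import math

n = 128       # completed questionnaires
N = 22113     # registered voters in Oconee County at the time
z = 1.96      # z-score for 95 percent confidence ("19 out of 20")
p = 0.5       # worst-case (most conservative) proportion

# Simple margin of error, then a finite-population correction
# for sampling without replacement from the registration list.
moe = z * math.sqrt(p * (1 - p) / n)
fpc = math.sqrt((N - n) / (N - 1))

print(f"margin of error: +/- {moe:.1%}")                    # about +/- 8.7%
print(f"with population correction: +/- {moe * fpc:.1%}")   # about +/- 8.6%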

In fact, we conducted the study as a methodological exercise to determine our success in completing interviews–in this case through the mail–with the sample of voters we selected. The two students were Nicoleta Corbu, a visiting Fulbright Scholar from Romania, and Qingmei Qing, a doctoral student from China.

Both worked with me in the Cox International Center in the Grady College of Journalism and Mass Communication at the University of Georgia.

We sent the survey to 500 voters selected at random from the full registration list, and the post office returned 39 questionnaires because the voter no longer lived at the address given. This means that 27.8 percent of those who presumably received the survey returned it, a rate comparable to the response rates in many national telephone surveys.
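The arithmetic behind that figure is straightforward, if we assume the 461 questionnaires that did not come back as undeliverable all reached a registered voter:

sent = 500
undeliverable = 39
completed = 128

response_rate = completed / (sent - undeliverable)
print(f"response rate: {response_rate:.1%}")   # about 27.8 percent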

Among our 128 returned surveys were 32 from voters who said they actually went to the polls on March 17. At 25.0 percent, that is a considerably higher rate than the 6.6 percent turnout among registered voters overall.

Among those 32 who said they voted, 30 answered the next question as to how they voted, and 21–or 70.0 percent–said they voted for SPLOST. That is very close to the 71.2 percent for the actual election.

That is the good news. But there is bad news as well.

The voter lists themselves are public records, since only if these kinds of records are open for examination is there any way to counter voter fraud. We have to know the list contains the names of real people.

Also public are records of whether an individual actually did vote, though, of course, how one voted is secret. Again, we have to be able to determine if those recorded as voting are real people.

Four of the 128 persons who returned our survey removed their names from the returns, making it impossible for us to match their names with voter files for the March 17 election, but 124 did not.

The 32 persons who indicated they had voted were among these 124, as were 29 of the 30 who indicated how they voted.

The actual vote records, however, show that only 16 of the 32 who claimed they voted actually did vote. Of those 16, 14 had told us how they voted, and nine of them, or 64.3 percent, said they voted for the SPLOST.

In other words, if we simply wanted to match our results with the actual vote outcome, we would be slightly better off including those who told us they voted but actually did not.
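A quick back-of-the-envelope comparison, using only the counts reported above, shows the size of the two gaps:

actual_yes = 71.2                  # percent "yes" in the official March 17 result
self_reported_yes = 21 / 30 * 100  # all respondents who reported how they voted
validated_yes = 9 / 14 * 100       # only respondents verified as having voted

print(f"self-reported voters: {self_reported_yes:.1f}% "
      f"(off by {abs(self_reported_yes - actual_yes):.1f} points)")
print(f"validated voters: {validated_yes:.1f}% "
      f"(off by {abs(validated_yes - actual_yes):.1f} points)")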

It seems that at least some of our respondents felt they should have voted but didn’t, and they corrected for that by giving us the wrong answer to our question on whether they voted.

But it seems they would have voted pretty much the same as those who did go to the polls had they actually participated in the election.

My two students and I completed these analyses just last week. We hope to discuss the results of our study and one we did for the November 2008 election in Oconee County at a scientific conference this fall.

Voters were much more honest in reporting on their voting behavior in the November election. And the sample again matched pretty closely the actual election outcome.

The survey is just one more piece of evidence that the SPLOST election may have been a success from the point of view of county officials who wanted the tax approved. But it has to be considered a failure if the goal was to get an informed electorate to the polls to make a considered judgment about whether the sales tax was good or bad for the county.

The county was not allowed by law to advocate for or against the tax, but it was allowed to promote participation and to run an information campaign to help people know about the election and about the details of the tax.

The county held the required public hearings and posted some basic information on the county web site. In other words, it did the minimum.