Election Math and Statistics

Notes on where math, statistics and other forms of analysis play a part in our democracy.
(Image credit: Theresa Thompson, rights info: bit.ly/OYUnA0)

Chris Gorski, Editor

(Inside Science) -- In addition to deciding whether incumbent Barack Obama or challenger Mitt Romney will be president, tomorrow's U.S. election will settle many other state and local races and ballot issues.

Here are several interesting notes on where mathematical, statistical and other forms of analysis play a part in our democracy.

Predictions, Probability and Polls 

Last week, many pundits, most notably MSNBC's Joe Scarborough, bristled at analyst Nate Silver's projections of the presidential election outcome, which suggested at that time that Mr. Obama had a 73.6% chance of winning a majority of Electoral College votes and therefore retaining the presidency. Silver's FiveThirtyEight blog, hosted by the New York Times, is watched closely by many political enthusiasts for its mix of cold numerical analysis and projection.

Polls show a very close popular vote in the presidential race. Many pundits claim that the contest is so close it could go either way. However, the U.S. Electoral College system does not require that the winner garner a majority of the national popular vote. This point, and the larger point about what a probability estimate actually means, are what Silver's defenders have attempted to explain.

To quote from the Atlantic article linked to above: "Claiming that Barack Obama has a 73.6 percent chance of winning is not the opposite of saying 'it could go either way.' It's just saying that it looks more likely to go one way than the other."
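
To see how those two statements fit together, consider a minimal simulation with made-up numbers -- this is not Silver's model, just an illustration. If a candidate holds roughly a one-point lead and the overall polling error is a few points in either direction, the lead holds in well over half of simulated elections, even though each individual outcome is close.

```python
import random

# Toy illustration only: an assumed ~1-point lead with a ~3-point polling
# error still translates into a clear probabilistic favorite. The lead and
# error values are invented; this is not FiveThirtyEight's model.

random.seed(0)
TRIALS = 100_000
assumed_lead = 1.0      # assumed popular-vote lead, in percentage points
polling_error_sd = 3.0  # standard deviation of overall polling error, in points

wins = sum(
    1 for _ in range(TRIALS)
    if random.gauss(assumed_lead, polling_error_sd) > 0
)
print(f"Win probability: {wins / TRIALS:.1%}")  # roughly 63% with these numbers
```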

On October 17, Silver appeared on "The Daily Show with Jon Stewart" to discuss his recent book and more about the election and the related math. An extended version of the interview is available online.

"The whole book is about how even though we have more information, why we don't know how to use it well," said Silver.

During the extended interview, Stewart asked: "Isn't there a point with our elections that we are beyond the ability to know of it statistically?"

Silver answered that the outcomes of many election recounts could essentially be considered coin flips, such as the 2008 Minnesota Senate race that ended with Al Franken's win, or the 2000 presidential contest in Florida that sealed George W. Bush's victory.

Here's an interesting note: Nate Silver's projections keep moving further in the direction of an Obama victory. As of today, he forecasts the probability of reelection at 86.3%. A different approach to the numbers, Sheldon Jacobson's Election Analytics website, put the likelihood of Obama's reelection at a striking 99.6% as of November 4. Both seem like high numbers when many polls indicate that the popular vote is very close. Here's a link to a post from the Punk Rock Operations Research blog dissecting the differences between the two approaches.
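
Part of why similar poll data can produce probabilities as different as 86.3% and 99.6% is the set of modeling assumptions layered on top of the polls. The toy comparison below uses invented state leads and electoral-vote counts and is not a reconstruction of either forecast; it only illustrates one such assumption, namely whether state-level polling errors are treated as independent or as sharing a common national error. Treating states as independent tends to produce the more confident number.

```python
import random

random.seed(1)
TRIALS = 100_000

# Hypothetical leads (in points) and electoral votes for a handful of
# decisive "swing states". All numbers are invented for illustration.
states = {"A": (2.0, 18), "B": (1.5, 16), "C": (1.0, 13), "D": (0.5, 10)}
needed = 30              # electoral votes the leading candidate still needs
state_error_sd = 3.0     # state-level polling error
national_error_sd = 2.5  # shared national error (correlated scenario only)

def win_probability(correlated: bool) -> float:
    wins = 0
    for _ in range(TRIALS):
        national_shift = random.gauss(0, national_error_sd) if correlated else 0.0
        ev = sum(
            votes
            for lead, votes in states.values()
            if lead + national_shift + random.gauss(0, state_error_sd) > 0
        )
        wins += ev >= needed
    return wins / TRIALS

print(f"Independent state errors:     {win_probability(False):.1%}")
print(f"With a shared national error: {win_probability(True):.1%}")
```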

Real Clear Politics puts Obama's Electoral College total at over 300; 270 electoral votes are needed to win. The Wall Street Journal and The Washington Post, at this point, hold back from predicting a victor in several swing states, leaving both candidates short of an Electoral College majority.

Revisiting Redistricting

Redistricting is the process that follows the national decennial census, including any adjustments to the number of House of Representatives seats apportioned to each state. Each state controls the boundaries of its own districts, which are subject to various national and state regulations -- and often to the party in control of the state legislature at the time. The process is frequently the subject of highly contentious politics. Some states end up with maps that look more like Rorschach blots than a collection of well-reasoned, fairly designed districts representing a collection of communities.

Sheldon Jacobson, the operations researcher at the University of Illinois at Urbana-Champaign mentioned above, is a co-author on a project intended to make redistricting more transparent. With graduate student Douglas King, he worked on a project to optimize districts according to variables input by a would-be redistricting designer.

Their algorithm breaks down the population of a state by census block, the smallest unit of population data available. They said that districts could be designed to optimize for variables such as population balance between districts, political affiliation, the shape of districts, and more.
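
As a rough sketch of what such an optimization might look like -- this is a toy, not the Jacobson-King algorithm, and all data here are invented -- the code below greedily assigns census blocks to districts while scoring a weighted mix of population balance and partisan balance. A real tool would also enforce contiguity, compactness and legal requirements such as the Voting Rights Act.

```python
from dataclasses import dataclass
import random

@dataclass
class Block:
    population: int
    dem_share: float  # fraction of voters leaning toward one party (illustrative)

def imbalance(districts, weight_partisan=500):
    """Weighted imbalance: population spread plus scaled partisan spread.
    The factor of 500 is arbitrary, chosen only to put the two terms on
    comparable scales in this toy example."""
    pops = [d["pop"] for d in districts]
    shares = [d["dem"] / d["pop"] if d["pop"] else 0.5 for d in districts]
    return (max(pops) - min(pops)) + weight_partisan * (max(shares) - min(shares))

def assign_blocks(blocks, n_districts):
    """Greedily place each block in the district where it least worsens the score."""
    districts = [{"pop": 0, "dem": 0.0} for _ in range(n_districts)]
    for block in sorted(blocks, key=lambda b: -b.population):
        best_i, best_score = 0, float("inf")
        for i, d in enumerate(districts):
            # Tentatively add the block, score the map, then undo.
            d["pop"] += block.population
            d["dem"] += block.population * block.dem_share
            s = imbalance(districts)
            d["pop"] -= block.population
            d["dem"] -= block.population * block.dem_share
            if s < best_score:
                best_i, best_score = i, s
        districts[best_i]["pop"] += block.population
        districts[best_i]["dem"] += block.population * block.dem_share
    return districts

random.seed(0)
blocks = [Block(random.randint(50, 600), random.random()) for _ in range(400)]
for i, d in enumerate(assign_blocks(blocks, n_districts=4)):
    print(f"District {i}: population {d['pop']}, partisan share {d['dem'] / d['pop']:.2f}")
```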

"I think our main goal was that we wanted to link this very practical problem, this problem that every ten years we need to redesign congressional districts, we wanted to add this layer of mathematical theory," said King.

Jacobson added: "If someone advocates for transparency [in the redistricting process], again there's no political biases on our side, we just want to make the process as open and as transparent for all the stakeholders, then this tool has the potential to really change the way people do redistricting in any domain, simply because you can use more detailed data and a level of efficiency that is just unheard of."

A second approach to redistricting focuses on a single number that biostatistician Thomas Belin of UCLA says is a wonderful stand-in for political affiliation: housing density. He said that journalist David Brooks and others have pointed out that the most telling difference between red states and blue states is housing density: areas with higher housing density lean Democratic, and areas with lower housing density lean Republican.

Belin indicated that by designing districts to have not only similar populations but also similar housing densities, redistricting efforts could help build districts that encourage competitive races between the two major parties, rather than establishing a collection of "safe" seats for each party.

If the public and the people making redistricting decisions had the chance to review a single score that took into account housing density and a shape-based variable called compactness, in addition to whatever other data they might have access to, Belin said, it might encourage more sensible and competitive districts in general.
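
A hedged sketch of what such a single score could look like appears below. It assumes one common compactness measure, Polsby-Popper (4πA/P², which equals 1 for a perfect circle), and a simple housing-density similarity term; the formula and weights are illustrative assumptions, not Belin's actual proposal.

```python
import math

def polsby_popper(area_sq_km: float, perimeter_km: float) -> float:
    """Polsby-Popper compactness: 4*pi*A / P^2 (1.0 = a perfect circle)."""
    return 4 * math.pi * area_sq_km / perimeter_km ** 2

def district_score(area_sq_km, perimeter_km,
                   district_density, statewide_density,
                   weight_compactness=0.5):
    """Higher is better: a compact shape and a housing density close to the
    statewide average. The 50/50 weighting is an arbitrary illustration."""
    compactness = polsby_popper(area_sq_km, perimeter_km)
    # Density similarity: 1.0 when the district matches the statewide average,
    # falling toward 0 as the district becomes much denser or much sparser.
    density_similarity = (min(district_density, statewide_density)
                          / max(district_density, statewide_density))
    return weight_compactness * compactness + (1 - weight_compactness) * density_similarity

# A roughly square district versus a sprawling, elongated, very dense one
# (all numbers made up).
print(district_score(2500, 210, district_density=950, statewide_density=1000))
print(district_score(2500, 700, district_density=4000, statewide_density=1000))
```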

"One of the perspectives that I bring to this is when people think about gerrymandering, they think about irregular shapes of districts. But you could have, and I think political consultants understand this and the public doesn't…  You might have very regular looking districts that lock in safe seats for all the parties," said Belin.

Perhaps these approaches will inspire some new thinking and approaches after the 2020 census.

Proposal Practicalities 

Earlier this year I spoke to a mathematician at Grand Valley State University, Jon Hodge. In the interest of disclosure, I'll note that we knew each other as kids, but hadn't spoken in more than 15 years. His current research examines what's called mathematical voting theory.

That research area addresses the Electoral College, the redrawing of district boundaries and many other topics.

One area where this comes up is when voters weigh ballot proposals. In some instances, voters support items only conditionally -- a crude example would be two items presented to voters, the first asking whether a town should build a new school and the second asking whether taxes should be raised to fund it. In that case, many people might want to say yes to the first only if the second item passed.

If items are all decided on the same day at the voting booth, Hodge said, it can lead to the least-wanted outcome -- an approved new school with no money for construction, for instance. Some jurisdictions, he said, have been experimenting with sequential voting, in which voters decide one proposal and then, after the results are in, vote on the next related item, and so on.
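
A small worked example makes the contrast concrete. It assumes three voters with invented preference orderings over the four possible (school, tax) outcomes, and that in a simultaneous vote each voter simply votes for the components of their single favorite outcome -- one reasonable way to model sincere voting when preferences are not separable. Simultaneous voting then produces the unfunded school that most voters rank dead last, while deciding the school first and the tax afterward avoids it.

```python
# Three voters rank the four possible (school, tax) outcomes.
# "Y"/"N" means the proposal passes/fails. Orderings are invented:
# Voter 1 wants a school but no new tax; Voters 2 and 3 consider an
# unfunded school ("Y", "N") the worst possible result.
rankings = {
    "v1": [("Y", "N"), ("Y", "Y"), ("N", "N"), ("N", "Y")],
    "v2": [("Y", "Y"), ("N", "N"), ("N", "Y"), ("Y", "N")],
    "v3": [("N", "N"), ("Y", "Y"), ("N", "Y"), ("Y", "N")],
}

def majority(votes):
    return "Y" if votes.count("Y") > votes.count("N") else "N"

# Simultaneous voting: each voter votes for the components of their top outcome.
school = majority([r[0][0] for r in rankings.values()])
tax = majority([r[0][1] for r in rankings.values()])
simultaneous = (school, tax)
last_place = sum(1 for r in rankings.values() if r[-1] == simultaneous)
print(f"Simultaneous outcome: {simultaneous}, ranked last by {last_place} of 3 voters")

# Sequential voting: decide the school first, then vote on the tax knowing
# that result. Each voter picks the tax vote leading to the outcome they
# prefer, given how the school vote already went.
def preferred_tax_vote(ranking, school_result):
    yes_outcome = (school_result, "Y")
    no_outcome = (school_result, "N")
    return "Y" if ranking.index(yes_outcome) < ranking.index(no_outcome) else "N"

tax_seq = majority([preferred_tax_vote(r, school) for r in rankings.values()])
print(f"Sequential outcome: {(school, tax_seq)}")
```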

 

Author Bio & Story Archive

Chris Gorski is the Senior Editor of Inside Science. Follow him on Twitter at @c_gorski.