MPR/Humphrey Institute poll review: too many 612s?

MinnPost, December 17, 2010

"81 percent of "612" voters participated when asked. Statewide, the figure was 67 percent."

PDF of story: 12.17.10 Minn Post Jacobs Institue Poll.pdf

Full story:

Minnesota Public Radio and the University of Minnesota Humphrey Institute released critiques of their much-criticized 2010 election polls late Friday afternoon.

The two entities commissioned the review after their final poll put DFL gubernatorial candidate Mark Dayton up 12 points; he beat Republican Tom Emmer by 0.4 percentage points. Republicans especially have howled about the result.

The U profs who direct the poll, Larry Jacobs and Joanne Miller, did an internal critique, which was reviewed by Gallup Editor-in-Chief Frank Newport.

Safe to say, Jacobs and Miller don't find a lot of fault with themselves; Newport, president of the American Association for Public Opinion Research, was a tougher grader.

Too many 612s and black interviewers?
Newport says the "issue which appears most relevant" is a potential oversample of the 612 area code. The Minneapolis-anchored area favors Democrats.

According to the U profs, 81 percent of "612" voters participated when asked. Statewide, the figure was 67 percent.

This is one of the areas where Jacobs and Miller make the numbers dance a bit; Newport notes they should have compared "612s" to non-612s. Because the statewide 67 percent figure includes the high-cooperating 612s, the non-612 rate must be lower than 67, so the gap would've been even more stark.
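To see why, here's a back-of-envelope sketch. The statewide rate is a weighted average of the 612 and non-612 rates, so assuming a share of 612 numbers in the sample lets you recover the non-612 figure; the 20 percent share below is made up, since the review doesn't report one.

```python
# The statewide cooperation rate is a weighted average of the 612 and
# non-612 rates; an assumed 612 sample share recovers the non-612 rate.
rate_612 = 0.81        # cooperation rate among "612" voters (from the review)
rate_statewide = 0.67  # statewide cooperation rate (from the review)
share_612 = 0.20       # ASSUMED share of 612 numbers in the sample (not reported)

rate_non612 = (rate_statewide - share_612 * rate_612) / (1 - share_612)
print(f"Implied non-612 cooperation rate: {rate_non612:.1%}")  # ~63.5%
# 81 percent vs. ~63.5 percent is a wider gap than 81 vs. 67.
```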

Jacobs and Miller suggest weighting future polls by region. Newport agrees, somewhat witheringly: "This is commonly done in state polls, particularly in states where voting outcomes can vary significantly by region, as apparently is the case in Minnesota."
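For readers unfamiliar with the practice, here's a minimal sketch of what weighting by region looks like. The region definitions and shares are invented for illustration; they're not taken from the MPR/HHH methodology.

```python
# Post-stratification by region: scale each region's respondents to its
# actual share of the electorate. All shares below are invented.
population_share = {"612 metro": 0.15, "suburbs": 0.45, "greater MN": 0.40}
sample_share     = {"612 metro": 0.25, "suburbs": 0.40, "greater MN": 0.35}

# A weight below 1 shrinks an oversampled region; above 1 boosts an
# undersampled one, so an excess of 612s can't tilt the topline.
weights = {region: population_share[region] / sample_share[region]
           for region in population_share}
print(weights)  # {'612 metro': 0.6, 'suburbs': 1.125, 'greater MN': 1.142...}
```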

At one point, there's a jolting suggestion that the U employed too many African-American interviewers.

Jacobs and Miller wonder whether the proportion of African-American interviewers -- 44 percent -- might've freaked out (presumably white) voters in a state that's only 5 percent black. However, they conclude, "This investigation failed to detect statistically significant differences" in candidate support.
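The review doesn't publish its crosstabs, but the check it describes is a standard difference-of-proportions test. A hedged sketch, with entirely hypothetical counts:

```python
from statistics import NormalDist

# Two-proportion z-test: does Dayton support differ by interviewer group?
# All counts are hypothetical; the review's actual crosstabs aren't published.
n1, x1 = 350, 147  # reached by African-American interviewers; Dayton supporters
n2, x2 = 450, 180  # reached by other interviewers; Dayton supporters

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)
se = (p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) ** 0.5
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"z = {z:.2f}, p = {p_value:.2f}")  # here z = 0.57, p = 0.57: not significant
```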

However, Newport wonders if minority interviewers achieved higher cooperation rates among minority voters -- perhaps a factor in the high "612" cooperation rate, since that D-favoring area code also has a higher percentage of minorities. He suggests going back over the data.

Too light on weighting
During the campaign, SurveyUSA president Jay Leve criticized how the MPR/HHH poll weighted voters -- particularly how the poll simulated the preferences of so-called "cell-phone-only voters" (CPOs) who lack landlines. MPR/HHH did not call CPOs.

The Jacobs/Miller review says only that their methodology was sound, with a Columbia University expert confirming that evaluation.

However, Newport faults the U profs for not discussing the "particulars of the weighting other than to say that it was reviewed and approved. ... This is an important area of focus."

Taken alone, excluding CPOs wouldn't explain a too-big Dayton margin -- in fact, it would likely narrow it. Pollsters have generally shown that if CPOs have a partisan lean, it's toward Democrats.

However, the weighting issue here is broader. The important point is that Newport didn't have enough information to critique the formula.

(Jacobs, Miller and Newport agree that CPOs should be included in future polls, a more expensive process.)

The other guys did it
Even though the MPR/HHH poll, conducted eight to 12 days before Election Day, gave Dayton his biggest margin of the campaign, Jacobs and Miller contend it wasn't an outlier.

They examine Dayton's and Emmer's support versus four other polls that were in the field during at least part of the same period as the MPR survey.

Dayton's support fell within the margin of sampling error of all four (St. Cloud State, Rasmussen, the Star Tribune and SurveyUSA). Emmer's support fell within St. Cloud's and the Strib's margin.
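For reference, "within the margin of sampling error" is a mechanical check. A minimal sketch, using an assumed sample size and a hypothetical candidate share rather than any poll's actual figures:

```python
# 95 percent margin of error for a proportion p from n respondents.
def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    return z * (p * (1 - p) / n) ** 0.5

n = 750        # ASSUMED likely-voter sample size; each poll reports its own
dayton = 0.41  # hypothetical Dayton share in a comparison poll
moe = margin_of_error(dayton, n)
print(f"{dayton:.0%} +/- {moe:.1%}")  # 41% +/- 3.5%
# Another poll's number "falls within" if it lands inside 37.5-44.5 percent.
```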

However, it's important to remember that St. Cloud (which showed Dayton up 10) has been criticized, too -- particularly for an ultra-long 12-day interviewing process that can carry three-week-old voter attitudes into a final-days result.

The Dayton number really wasn't the problem; Emmer's was the one that varied widely. The MPR poll caught only part of the low end of the Strib's Emmer range, while missing SurveyUSA's and Rasmussen's ranges entirely.

Jacobs has repeatedly noted polls aren't predictions, but a "snapshot in time" that may legitimately miss shifting voter attitudes.

However, Newport archly notes that "the MPR/HHH poll was second closest to Election Day" of the five "and reported the highest Democratic margin."

Other recommendations
If everyone is going to treat the final poll like a prediction, Jacobs and Miller recommend polling even closer to Election Day. That way, they have a better shot at catching voters' ultimate mood. Newport agrees.

(Interestingly, the Strib decided not to do a traditional final-week poll this year because it might influence voter, volunteer and/or donor behavior too much.)

The U profs also suggest reporting poll results differently. They favor including other polls with their own, and reporting candidates' support not as a single point, but a range within the error margin.

On the latter point, Newport is wary: "It does ... again, raise the question of the purpose of, and value of, pre-election polls if they are used only to estimate broad ranges of where the population stands."

As far as overcoming any interviewer "lack of rapport," Jacobs and Miller suggest asking innocuous questions before getting to the horse race question -- something other polls do. Newport suggests testing that theory before implementing it.

A MinnPost poke?
As regular readers know, I've loudly questioned the MPR poll, and there might be a bit of payback in the U's write-up. They repeatedly refer to the St. Cloud State survey as the "MinnPost/St. Cloud State" poll.

We did pay for three questions involving ranked-choice voting, not the topline result that's the focus here. Unlike the MPR/HHH poll, which was a joint production and labeled as such, St. Cloud State's name stood alone on its survey.

At the very most, it should be "St. Cloud State/MinnPost," but unlike MPR/HHH, we didn't jointly determine the content of most of the lengthy questionnaire.
