
Pollsters grapple with methodology, impact of election polls

Pollsters from the country's most prominent public opinion research firms are debating the lessons learned from last May's election.

Polls seen to have had a 'major impact on the outcome' of last federal election

Pollster Darrell Bricker, the president and CEO of Ipsos Reid, shown here in a file photo, co-wrote an open letter to journalists covering the Ontario election earlier this month, sparking a fierce debate about polling methods and reporting standards for public opinion research. (Darren Calabrese/The Canadian Press)

What if there had been no public opinion polls published during last spring's federal election campaign?

Would the NDP's orange wave have swept across the country if no one realized there was a wave to catch? Would support for the Liberals have collapsed so utterly? Would the Conservatives have captured their long-sought majority?

Such questions are not simply idle speculation.

Given the controversy swirling around the accuracy and methodological adequacy of polls, the professionalism of pollsters and the polling literacy of journalists who slavishly report them, questions about the influence of polls on election results are arguably key to the health of Canada's democracy.

The issue arose at a conference last week, where eight pollsters from the country's most prominent public opinion research firms discussed the lessons they've gleaned from last May's election.

"I do think this was an interesting election in that it's hard to deny that polls themselves had a major impact on the outcome of the election," said Derek Leebosh of Environics.

He said a CROP poll published shortly after the televised leaders' debate and showing the NDP had leapt into a commanding lead in Quebec was like "a depth charge" exploding in the campaign.

"Imagine if there had been no polling in the election campaign at all, nobody would have known that this phenomenon in Quebec was happening and the orange crush would never, may not, have spread into the rest of the country and even within Quebec people might not have thought the NDP was a realistic option."

Ekos Research's Frank Graves countered there's "strong evidence" the outcome wouldn't have changed much had there been no polls. Indeed, he argued there was no post-debate NDP surge, that his surveys showed NDP support "proceeding on a pretty placid, straight (upward) line" throughout the campaign.

Regardless, Graves maintained voters are "not that dumb"; even without polls they would've noticed the explosion of NDP lawn signs and anecdotal evidence of New Democrat popularity.

In any event, he said post-election polling found the vast majority of Canadians maintained they weren't influenced by the polls and, among those who were influenced, there was no clear pattern favouring one party over another.

Environics' Kevin Neuman was doubtful.

"People may say that (polls) don't influence, but it would influence the media and how the media cover the story and frame the story," he said, adding that the CROP poll "may have completely changed the media coverage."

While they disagreed about the impact of polls, there was consensus among pollsters at the conference that media coverage of them is often sorely wanting. Journalists, they agreed, are riveted on the bald horse race numbers, disregard issue-based surveys, misinterpret margins of error and make no distinction between proven and unproven polling methods.

The discussion was in many ways a polite echo of the rocket launched a couple of weeks ago by Darrell Bricker and John Wright, top honchos at Canada's largest polling company, Ipsos Reid.

The duo penned an "open letter" to journalists covering the Ontario provincial election campaign, warning them that "some marginal pollsters" are counting on media ignorance and competitiveness to peddle "inferior" polls. They accused some unidentified pollsters of becoming "hucksters selling methodological snake oil" and some media outlets of publishing questionable polls simply because they support their editorial position.

"All of this MUST stop," the duo wrote. "We are distorting our democracy, confusing voters and destroying what should be a source of truth in election campaigns -- the unbiased, truly scientific public opinion poll."

Graves pointed out that pollsters used to have stable relationships with specific media outlets that paid well for quality surveys. Today, he said, it's become a "sort of auction to the bottom," with pollsters giving their research to the media for free as a publicity tool. As a result, the quality of polls and the level of "methodological fluency" among journalists reporting on them have plunged.

It's not that pollsters are knowingly peddling "crappy" polls, he insisted. It's just that they can no longer afford to do surveys with the "level of depth and rigour that we'd really like to" because "polling budgets today are a fraction of what they used to be." And journalists don't seem to notice the difference.

"The understanding of basic issues -- like what is a margin of error, or how do you create a good sample -- is dramatically lower than it was back when they had really smart (media) guys ... who knew as much as the pollster."

Last winter, veteran pollster Allan Gregg publicly aired similar concerns in an interview with The Canadian Press -- sparking an angry backlash from some fellow pollsters, including Bricker and Wright.

At the time, Graves proposed the idea of a "blue-ribbon consortium" of public opinion researchers who could apply the highest standards to produce high quality political surveys "as a public service for the industry." No one took him up on it.

"The whole industry might come (out) ahead, much more so than this internecine thing that's emerged in the last year or two," he said.

That internecine squabbling has revolved largely around the different methodologies employed as pollsters grapple with how best to achieve a random sample of the entire population in an age when people screen their phone calls or, in many cases, use only cellphones. Some have moved to online polls, which rely on self-selected respondents who sign up to be surveyed; others, like Ekos, now use automated phone surveys, or "robo-calls."

Bricker and Wright maintained in their letter that both robo-calling and online polls produce skewed results. But at the conference, most participants agreed that traditional phone and online polls produced similar, relatively accurate results in last May's election. Ekos underestimated Conservative support but, Graves argued, that wasn't a methodological problem so much as an inability to anticipate the disproportionate turnout of Tory voters.
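(Graves's turnout point can be illustrated with back-of-the-envelope arithmetic. The numbers below are invented for illustration and are not his figures; they simply show how a party whose supporters vote at a higher rate ends up with a larger share of ballots cast than of poll respondents.)

    # Hypothetical arithmetic showing how differential turnout, rather than
    # sampling method, can push a final vote share above what pre-election
    # polls measured among all eligible voters.

    def vote_share(poll_share, own_turnout, others_turnout):
        """Share of votes actually cast, given a party's share of poll
        respondents and differing turnout rates."""
        votes_for = poll_share * own_turnout
        votes_against = (1 - poll_share) * others_turnout
        return votes_for / (votes_for + votes_against)

    # A party polling at 37% whose supporters turn out at 65%,
    # while everyone else turns out at 55%:
    print(f"{vote_share(0.37, 0.65, 0.55) * 100:.1f}%")  # about 41.0%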

Whatever the drawbacks of the alternatives, it would seem traditional phone surveys are going the way of the dodo.

Ipsos representative Mike Colledge argued that no single method can produce a sample that reflects 100 per cent of the population and suggested that mixed polling methods are the way of the future.

Leger pollster Richard Hobbs predicted that by the next federal election, very few pollsters will still be conducting phone surveys.

Graves said he's no pessimist; he's confident the industry will develop ways to ensure quality polling data and develop more poll-literate media partners.

"It's not a high-water point for us right now, let's be blunt. But that doesn't mean we should give up."