What ‘blue wave’? Why pre-election polls faltered again.

Where the polls erred more was in battleground states that both candidates needed to win. The final RealClearPolitics average for Wisconsin predicted a 7-point win for Mr. Biden, with smaller margins in Michigan and Pennsylvania. Most polls had also shown little change during the campaign, suggesting that Mr. Biden’s advantage was stable. In the end, his victories in these three crucial states were thin; he carried Wisconsin by just 20,540 votes.

Some pollsters did better in predicting state-level votes. Suffolk University Political Research Center was within 2 points of final results in Florida, Arizona, New Hampshire, and Minnesota, although it also overestimated Mr. Biden’s margin in Pennsylvania. 

Even more erratic was the polling for closely watched congressional races like the Senate seat defended by Republican Susan Collins in Maine. Pre-election polls gave her opponent, Sara Gideon, a solid lead; Senator Collins won by 9 points. In South Carolina, incumbent Sen. Lindsey Graham beat his Democratic challenger by 10 points, defying polls that showed a dead heat.

Analysts say ticket-splitting may have been an underappreciated factor in states like Maine, where Mr. Biden ran far ahead of Hillary Clinton’s margin of victory in 2016 but failed to lift Ms. Gideon. Intensifying partisanship tends to boost straight-ticket voting.

What all these polling inaccuracies have in common is a direction of travel: support for Democratic candidates was often wildly overestimated. 

What is behind these misses in election polling? 

Accurate polling rests on two critical calculations: the makeup of the electorate, and which eligible voters are most likely to cast ballots. These calculations allow polling agencies to weight survey responses and project the outcome of an actual election.
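The weighting step can be sketched in a few lines. This is a minimal illustration of post-stratification weighting, not actual polling methodology; the group shares and support levels below are made-up numbers chosen only to show how underrepresented groups are scaled up to match the electorate.

```python
# Illustrative post-stratification weighting (made-up numbers, not real data).
# Each respondent group is reweighted so its influence matches its share of
# the expected electorate rather than its share of the raw sample.

sample = {
    # group: (share of poll respondents, share of electorate, candidate support)
    "college":     (0.55, 0.40, 0.58),
    "non_college": (0.45, 0.60, 0.44),
}

# Unweighted estimate: groups count in proportion to who answered the poll.
raw = sum(s_share * support for s_share, _, support in sample.values())

# Weighted estimate: groups count in proportion to the expected electorate.
weighted = sum(p_share * support for _, p_share, support in sample.values())

print(f"raw: {raw:.3f}, weighted: {weighted:.3f}")
```

If the assumed electorate shares are themselves wrong (as with under-sampled non-college voters in 2016, or the mix of early voters in 2020), the weighted estimate inherits that error, which is the "cake recipe" problem described below.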

In 2016, most state polls failed to predict Mr. Trump’s Electoral College victory, in part because their surveys didn’t include enough non-college-educated voters and underestimated the turnout in rural areas. In the aftermath, surveys were adjusted to account for these demographics. 

Experts caution that it’s too early to pinpoint what went wrong in 2020 since not all ballots have been counted. But it appears that pollsters may have been thrown off by high turnout – the highest in at least 50 years – and the popularity of mail-in and early voting during a pandemic. In Texas, which expanded early in-person voting, turnout by eligible voters rose 9 points. 

Early voting meant that pre-election surveys could identify more actual voters as opposed to likely voters. This may have led to a pro-Biden bias in their sample, since fewer Republicans voted in advance amid Mr. Trump’s baseless claims about fraud in mailed ballots. 

Michael Traugott, a research professor emeritus at the University of Michigan, compares it to a cake recipe in which the ingredients are listed correctly but their proportion is unknown. “The portion of the recipe that was early voting was too large,” he says. 

Unlike in 2016, surveys found few undecided or third-party voters. That year, a significant number of late-breaking voters in battleground states went to Mr. Trump. But late deciders do not appear to explain the underestimation of his support in 2020.

Conservatives argue that pollsters miss Mr. Trump’s support because respondents are reluctant to state their preference, knowing that it may be socially unacceptable, particularly in professional circles. Studies have failed to replicate the “shy Trump voter” hypothesis. A bigger factor may be that Trump supporters are less likely to participate in surveys because they don’t trust pollsters. 

Has it become harder to survey public opinion on voting intentions? 

Caller ID and call blocking have made it harder to conduct live surveys. Some polling agencies rely more on robocalls; others have turned to online surveys that may not be as reliable. This drives up the cost of polling and may have contributed to polling errors in 2020, though these challenges already existed in 2018, when most midterm polling was accurate.

However, this year the pandemic led to higher response rates since more voters were at home, says David Paleologos, the director of the Suffolk University Political Research Center. “We were finishing projects a day earlier than scheduled,” he says. Voters seemed happy to talk to a pollster, perhaps because they were tired of talking politics with others in their household. 

But as noted, the propensity of voters to respond to polls isn’t equally distributed. The bias in surveys may reflect an overrepresentation of Democrats, since Trump voters with lower levels of social trust are harder to reach.

