The polls were off in 2016. What does a reliable political opinion poll today look like?
Should most political polling in 2018 be ignored?
By Kelly Johnston
Just before the 2016 elections, nearly every prognosticator at every media outlet predicted that Hillary Clinton would easily be elected the 45th President of the United States. Newsweek magazine even pre-printed its post-election edition featuring "Madam President" on the cover.
Shortly afterwards, Larry Sabato, the legendary political scientist and non-partisan analyst at the University of Virginia's Center for Politics, issued a "mea culpa, mea culpa" for so uncharacteristically missing the election results. He didn't say so, but I suspect he relied a bit too much on faulty polling (as did so many others). Few others were so forthcoming.
Sure, some got it right, even though polling in and of itself is not predictive. Polls, properly conducted, are snapshots of the electorate at a given time. Usually, a series of polls can suggest a trend line or possible results, based on a variety of factors, some seen and some unseen. Surprises can and do happen, even under the best of circumstances.
But I constantly find myself telling friends, family and colleagues what makes a poll that I find reliable. Not many are reliable, frankly. Just in the past few weeks, major media organizations, such as the Washington Post, have touted polls with remarkably large leads for Democratic congressional challengers in highly competitive districts, such as Virginia's 10th District in the Washington, DC suburbs, including Loudoun County (America's wealthiest county). A recent poll by George Mason University's Schar School indicated a substantial 12-point lead for State Senator Jennifer Wexton, the Democratic nominee, over incumbent Republican Rep. Barbara Comstock.
Upon closer inspection, however, the poll's methodology was . . . interesting. Here's what "The Rundown" blog from non-partisan BIPAC (Business-Industry Political Action Committee) said:
You would think a respected news organization like the Washington Post, which sponsors their own polling of regional and national races, might have invested a little time reading the methodology before highlighting the survey. You would be wrong.
This highlights several problems with so many polls, especially those, it seems, that are attached to a college or university. What makes for a good, reliable poll? I'm not a statistician, nor have I ever worked for a polling company. But having worked in more than three dozen U.S. House and Senate campaigns over 25 years, and working with several polling experts, both qualitative and quantitative, here's what I think you should look for.
When was the poll conducted?
There are three parts to this question: when was the poll in the field, for how many days, and on which days of the week? That last part may puzzle you, but the pollsters I most respect do not conduct political surveys over a weekend - Friday and Saturday nights. They are unable to explain why, but they find the results to be very different and inconsistent with results from the previous days, or the days after. They skew Democratic. This becomes evident in what are called "tracking surveys," which are in the field every night, relying on a rolling average over two to three days, depending on the sample size. Most reliable surveys are "in the field" three days, never more than four (again, depending on the sample). I'm a bit forgiving on the size of the sample, but for House races, it should be at least 300, and for most statewide surveys, at least 800. If you're looking for in-depth information on specific demographics or populations (the "cross tabulations"), you'll need more.
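Why 300 for a House race and 800 statewide? A quick, hedged illustration: for a simple random sample, the worst-case margin of error at 95 percent confidence shrinks as the sample grows, and those two sizes land near the ±5-6 and ±3-4 point ranges pollsters typically tolerate. This sketch assumes a pure random sample with no weighting, which real polls rarely are.

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case (p = 0.5) margin of error at 95% confidence
    for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (300, 800):
    # n=300 -> roughly +/-5.7 points; n=800 -> roughly +/-3.5 points
    print(f"n={n}: +/-{margin_of_error(n) * 100:.1f} points")
```

Note that quadrupling the sample only halves the margin of error, which is why pollsters digging into cross tabulations need substantially larger samples.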
Where does the sample come from?
Who are you polling, and how do you reach them? For decades, the primary method has been "random digit dialing" - calling a series of phone numbers blindly. That worked well until the advent of the cell phone. Now, many newer pollsters, such as Scott Rasmussen and YouGov, have turned to online or automated polls. The biggest challenge for pollsters these days is the inordinate "refusal rate" among those they reach by phone (cell or landline), which is now approaching 90 percent. In addition, most states have rules and regulations on the types of calls you can make to mobile phones. Pollsters find themselves spending far more time and money to collect their samples and are looking for less expensive ways to reliably reach voters (if not cut corners).
But here's the thing: random digit dialing may be on the path to obsolescence, because since 2008, voter registration, political party and other third-party lists have become rich in reliable data. Pollsters used to ask whether you were registered, and with which party. A University of Texas study several decades ago demonstrated that unregistered voters often lie about their registration status. Now, with the quality of today's voter files, you don't have to ask whether someone is registered or even whether they're "likely" to vote. You know. It's right there in black and white. Increasingly, the best pollsters (the kind campaigns employ) use those files to contact voters.
I used to disregard online polling, but given the "shy Tory" effect we saw during the 2016 presidential election (Trump voters refusing to admit to pollsters that they were for Trump) and among Brexit voters in the United Kingdom's referendum on whether to remain in the European Union, online polls have their place. It seems many voters will trust a computer with their genuine political views more than they trust a stranger on a live call.
What does the sample look like?
Here is where pollsters engage in some guesswork: who is going to show up to vote, and in what demographic mix? In mid-term elections, seniors, for example, can comprise upwards of 30 percent of the electorate even though they might make up only 18 percent of the population in a given area. Younger voters are famously inclined not to show up. Again, a good voter file can take a lot of the guesswork out of the system (except for the newest voters). But let's face it: the elections of 2016 and 2018 surprised just about everyone, not just in the results but in the demographics of those who showed up.
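The turnout adjustment described above amounts to reweighting raw responses so each group's share matches the pollster's expected electorate rather than the raw sample. A minimal sketch, with entirely hypothetical interview counts and turnout shares:

```python
# Hypothetical raw interview counts by age group
raw_responses = {"18-29": 120, "30-64": 480, "65+": 150}
# Hypothetical shares of the expected mid-term electorate
expected_turnout_share = {"18-29": 0.12, "30-64": 0.58, "65+": 0.30}

total = sum(raw_responses.values())
weights = {
    group: expected_turnout_share[group] / (count / total)
    for group, count in raw_responses.items()
}

# Seniors are under-represented in this raw sample (20% of interviews
# vs. an expected 30% of the electorate), so their answers get a
# weight above 1 (here, 1.5); young voters get a weight below 1.
for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
```

This is also where a pollster's "dubious judgments" creep in: pick the wrong expected turnout shares and the weights bake the error into every topline number.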
Still, I find many polls overweight Democrats or make dubious judgments about who will turn out to vote. For example, the Elon Poll (Elon University) and others in 2014 consistently had incumbent U.S. Senator Kay Hagan ahead of challenger Thom Tillis. "'In this election, across the board, you saw a Democratic bias' in polling numbers, said Elon University Poll director Kenneth Fernandez, whose final poll had Hagan up 4 percentage points. 'It's not easy to be a pollster'" reported Raleigh's News & Observer. I have found that the private polls campaigns use are more accurate than the public ones.
The Bottom Line
Here's what to look for in any political poll during these final couple of weeks.
First, take time to see if the news story or source provides a link to the poll's methodology. If it doesn't, you can pretty much ignore it. Most major media outlets link to a poll's methodology. Pay special attention to the methodologies employed by university-sponsored polls. Some are better than others.
Second, find out how the population was sampled. Many still use random digit dialing, and that's fine to a point (look for the percentage of landlines, which favor older voters, versus cell phones, which are increasingly used by younger voters). But more and more pollsters rely on much-improved data files of registered voters with a clear, proven history of voting (or not) in past elections, along with the most important demographic data (age, gender, race, location, party registration, etc.). Some rely on a mix of online surveys and calls. I'm more comfortable with online surveys than I used to be, and I have found Scott Rasmussen's surveys to be quite reliable and valuable.
Third, when was the survey in the field? If it took longer than four days, ignore it. If, as so many do, it polled over a Friday and/or Saturday (which for some reason favors Democrats), ignore it.
Just today (October 22nd), I read the results of several polls in several states, including Florida. Every single one polled over the weekend. I'm casting those aside.
But fourth, remember that even the best polls are not predictors of the outcome, even just a few days out from the election. They are, at best, snapshots of an electorate at a given point in time.
And please, ignore exit polls - they're the worst. Just ask President John Kerry.