In Re:’s column, a 22-year-old with minimal understanding of politics tries to work out what it all means before the election. This week, Zoe Madden-Smith finds out how political polls work. Infographics by Liam van Eeden.
If you are like me and have wondered why pollsters still rely on landlines, why polls are based on just 1000 people, or why polls from different organisations come out looking completely different, then this article might help answer some of your questions. I had a chat with Edward Langley, the research director of the Colmar Brunton polls, and Phil O’Sullivan, the head of newsgathering at TVNZ who oversees the 1 News poll, and this is what I learned.
To keep it simple, I’ve only focussed on the 1 News Colmar Brunton poll, which has been running for 20 years and is the most regular political poll we have in New Zealand. There is also a regular Newshub-Reid Research Poll, a less frequent Roy Morgan Research poll, and some smaller online polls dotted around the place.
I have never been called for a poll. How do pollsters find the people they survey?
Pollsters find people (eligible voters) by randomly generating landline and mobile phone numbers. With landlines, they would select an area code (09), a seed code (360), and then randomly generate the rest of the digits (123456) to find each person. Landlines are the best for this technique because the area and seed codes mean they can target different areas of New Zealand and create a sample that spreads across the country.
Mobile numbers are also randomly generated, but they are a little bit less helpful because you can’t tell which area of the country you are calling, only that it is a New Zealand number. “But actually, we have found that the distribution of mobile numbers around the country actually falls in line with what we would need representatively,” says Edward. “So it still seems to work from that aspect.”
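If you like seeing these things spelled out, here’s a rough sketch of what that random digit dialling idea looks like in code. It’s just my own illustration, not Colmar Brunton’s actual system, and the digit counts and prefixes are made up for the example.

```python
# Rough illustration of random digit dialling, not Colmar Brunton's actual system.
# The area code and seed target a region; the remaining digits are random.
import random

def random_landline(area_code="09", seed_code="360", remaining_digits=4):
    """Build a candidate landline number: area code + seed code + random digits."""
    tail = "".join(str(random.randint(0, 9)) for _ in range(remaining_digits))
    return f"({area_code}) {seed_code} {tail}"

def random_mobile(prefix="021", remaining_digits=7):
    """Mobiles work the same way, but the prefix doesn't tell you the region."""
    tail = "".join(str(random.randint(0, 9)) for _ in range(remaining_digits))
    return f"{prefix} {tail}"

print(random_landline())  # e.g. (09) 360 4821
print(random_mobile())    # e.g. 021 5834710
```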
Does polling mobile phones make things more accurate?
Historically, the Colmar Brunton polls have been relatively close to the final election results. But in 2017, the first time that Colmar Brunton used mobile numbers in their polls, they found their final election-day poll was significantly more accurate.
The final poll put National at 46 percent of the party vote (they got 44.4 percent) and Labour at 37 percent (they got 36.9 percent). Because of the success, they have continued polling with a combination of 600 mobiles and 400 landlines (a total of 1000 people surveyed).
So if mobiles make things more accurate, why do they still use landlines at all?
“Landlines let us maximise the coverage. If we only called mobiles we would miss out on the population that doesn’t have a mobile, or ones that don’t really use them,” says Edward. “We found we can access older demographics and potentially access people that don't have great mobile coverage as well. It also just helps us get a wide national spread because we can target area codes.”
Mobile polls are also significantly more expensive to run because people are less likely to answer their phone if an unknown number is calling, which means it can take a lot more time to get the data they need. And time is money.
“Polling used to be shooting fish in a barrel because every home in New Zealand had a landline and people would answer them,” says Phil O’Sullivan, TVNZ’s head of newsgathering. “Whereas these days, less and less people have a landline and people are less likely to answer their mobile.”
It’s a different 1000 people every time they poll, because the numbers are randomly generated.
How can a survey of 1000 people reflect how 5 million people are feeling?
A survey of 1000 people is the happy medium of being relatively accurate, but also feasible to achieve.
One way to measure the accuracy of a poll is to look at its margin of error (the amount a poll might be wrong by). The more people you poll, the smaller your margin of error. As Scientific American explains, a sample of 100 people gives you a 10 percent margin of error, a sample of 250 people gives you a six percent margin of error, and a sample of 1000 people has a three percent margin of error.
Currently, the margin of error for Colmar Brunton polls is plus or minus 3.1 percent, at a 95 percent confidence level. This means that 95 percent of the time, the result of the poll will be within about three percentage points of what the general population thinks.
“If you wanted to reduce the margin of error to two percent you would need 2400 people,” says Edward. “So it may be slightly more accurate, but you have more than doubled your sample size, which would be far more costly. So it’s about striking that balance between cost versus statistical rigour.”
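If you want to check those numbers yourself, they line up with the standard textbook formula for the worst-case margin of error of a simple random sample at a 95 percent confidence level. This little sketch is my own illustration, not something the pollsters supplied.

```python
# Rough check of the sample sizes quoted above, using the standard formula
# for the worst-case margin of error at a 95 percent confidence level.
import math

def margin_of_error(n, z=1.96):
    return z * math.sqrt(0.25 / n)  # 0.25 = p * (1 - p) when support is 50/50

for n in (100, 250, 1000, 2400):
    print(f"{n} people -> about {margin_of_error(n) * 100:.1f} percent")
# prints roughly: 100 -> 9.8, 250 -> 6.2, 1000 -> 3.1, 2400 -> 2.0 (percent)
```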
Are those 1000 people representative of the population, and what is weighting?
Say you polled 1000 eligible voters, but only two percent of them were 18-24 years old. In reality, 18-24-year-olds might actually make up 10 percent of eligible voters. To make your sample more representative of the real world, you would want to “weight up” this group. This means inflating the data you got from this group so it represents a larger percentage of the answers and reflects New Zealand more accurately. Pollsters use census data (on age, sex, ethnicity and so on) to inform their weighting.
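Here’s a rough sketch of that maths, using the made-up shares from the example above. This is just to show the idea, not Colmar Brunton’s actual weighting scheme.

```python
# Simplified sketch of "weighting up", using made-up group shares.
population_share = {"18-24": 0.10, "everyone else": 0.90}  # e.g. from census data
sample_share     = {"18-24": 0.02, "everyone else": 0.98}  # what the poll actually got

weights = {group: population_share[group] / sample_share[group]
           for group in population_share}
print(weights)  # {'18-24': 5.0, 'everyone else': 0.918...}
# Each 18-24-year-old respondent now counts five times over, so a handful
# of people end up standing in for a much bigger slice of the country.
```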
While weighting is a helpful tool for making polls more accurate, if you do too much of it your data can become “flaky”, as Edward puts it. This means you’re increasingly guessing what people think rather than actually finding out. And although it’s an educated guess, you can still get it wrong.
Edward says using mobiles more than landlines means you don’t have to do as much weighting (especially for Māori, Pasifika and young voters), making the polls more accurate.
Pollsters will poll people multiple times every year, even when there isn’t an election
The more regularly you poll, the easier it is to see trends and patterns in public opinion, which is the goal. “A lot of people would love it if we polled more than we do,” says Phil. “But, you know, in this climate we just don't have the money.”
Polling is expensive
Neither Edward nor Phil would tell me how much a poll costs, but it’s clearly a huge consideration. “It’s a massive factor,” says Phil. “I think that partly explains why a lot of media organisations have gone out of polling.”
Mobiles have only made polling more expensive, which is why people like Phil are interested in moving towards online polling. “If we've got access to online databases with 100,000 people, and we are here really struggling to get to 1000 people on landlines and mobiles, well what’s the easy thing to do? I think digital polling is where the future is. But I'm not sure when we will start moving into this space.”
Some polls are already partly digital: the latest Newshub-Reid Research Poll surveyed 70 percent of respondents by landline and mobile, and 30 percent online.
Why doesn’t everyone just poll online?
“Online has clear benefits like being more cost-effective than telephone polling,” says Edward. “For the same budget, you could do polling more regularly, with larger samples and maybe get a greater sense of trends.”
But the compromise is that online polling relies on a different sampling technique, called quota sampling. Online polls draw on databases of people who have already agreed to answer surveys, often in exchange for rewards. Not every New Zealander is the survey-answering type, so the sample is already somewhat skewed. People also need internet access, and need to know about the surveys, to be polled, which limits the sample further. Pollsters then select different demographics from these groups to poll, so it’s a lot more curated and less random.
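To picture how different that is from random dialling, here’s a bare-bones sketch of quota sampling from a made-up online panel. Again, this is just the general idea, not any polling company’s actual system, and the quota targets are invented.

```python
# Bare-bones sketch of quota sampling from an online panel.
import random

# Hypothetical panel: people who have signed up to do surveys for rewards.
panel = [{"id": i, "age_group": random.choice(["18-24", "25-44", "45-64", "65+"])}
         for i in range(5000)]

quotas = {"18-24": 100, "25-44": 350, "45-64": 330, "65+": 220}  # made-up targets
filled = {group: 0 for group in quotas}
selected = []

for person in panel:
    group = person["age_group"]
    if filled[group] < quotas[group]:
        selected.append(person)
        filled[group] += 1
    if filled == quotas:
        break

print(len(selected))  # 1000, once every quota has been filled
```

The sample is curated to match the quotas, but everyone in it had to join the panel first, which is the self-selection Edward is talking about.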
“So if you're an academic, you possibly look down a little bit on that as an approach,” says Edward. “But that's not to say you don't get some very accurate results from online polling.”
Regular, random telephone polling can be seen as the “gold standard” because it gives almost everyone in New Zealand an equal chance of being polled. You can refuse to be polled, but if pollsters call you at a bad time they will do their best to call back until you can answer their questions.
“We don't just want to take the people that immediately say yes or are keen to participate,” says Edward. “It’s important that we encourage the less inclined people so we are as representative as possible.”
Polls are heavily affected by news events, so it’s wise to keep in mind what was happening at the time of the poll.
“The key thing to take into account is that polls are a snapshot in time,” says Edward. “They are telling you where public opinion is at the time of when the survey was done. And obviously, this can change over time or depending on an event.”
Pollsters will ask questions like, “If a general election was held today, which political party would you vote for?” If a party has just had a scandal or a triumph at that time, that could sway your answer. This is why it’s crucial to look for the fieldwork dates (the days the poll was taken) and find out what was happening then. Colmar Brunton supplies key news events with each poll’s full report.
This can explain why polls released at the same time can have completely different results. It happened in June last year when, on the same night, a 1 News Colmar Brunton Poll showed National at 44 percent support while a Newshub-Reid Research Poll showed them at 37.4 percent.
Even though the two polls were released on the same night, the polling had been done over different time frames. Colmar Brunton conducted its research from the 4th to the 8th of June, while Reid Research did theirs from the 29th of May to the 7th of June. So it’s no surprise that Newshub’s poll showed more confidence in Prime Minister Jacinda Ardern: their polling period sat closer to the aftermath of the March 15 Christchurch terror attack, when Jacinda was widely praised for her response, whereas Colmar Brunton started polling once this praise had died down.
Polls tell us how New Zealanders are feeling at one point in time. And with an election system as tactical as MMP, that can be really useful information. But next time you are reading or comparing polls, keep these things in mind, as every one of these factors can influence the results.