
Polling and the EU referendum: the post-match analysis

Posted on 01 September 2016 by Kirby Swales, Director of Survey Research Centre
Tags: EU Referendum, panel, polling, polls, research methodology

The research sector still needs to improve the way it communicates about methodological limitations.

The prevailing estimate from the EU Referendum polls, especially the final ones, was a Remain victory. Given that Leave won the vote, polling and survey organisations are in the spotlight again. As NatCen Senior Research Fellow John Curtice puts it: “Polling a referendum truly is a tough business.”

The polling “miss” at the 2015 General Election focused a lot of attention on survey methodology, including a full-blown methodological investigation.[1] Although not in the business of political polling, NatCen is an organisation engaged in survey methodology and interested in contributing to the debate about polling methods. As part of this, we ran a one-off opinion survey on the EU referendum using a new, experimental methodology. Today we publish the micro-data to enable others to examine our results.

Like a number of organisations and commentators (including the BPC, Ipsos MORI, Opinium, ComRes, Populus, YouGov and UK Polling Report) we have been reflecting on why the results didn’t produce a stronger showing for the Leave side:

A significant swing to Leave during the short campaign

There is a trade-off between using an extended fieldwork period to reach ‘hard-to-reach’ groups and capturing a shift in public opinion. We published our results on 21 June, but the majority (65%) of our survey sample responded between 16 and 26 May, before the campaign began. It is very possible that opinions changed between then and the referendum. Our own data point to this possibility: 29% of those who had stated a view on whether to leave or remain said that they “may change their mind”. There is clear evidence of a swing during the campaign, though this was somewhat masked by changes to methods and reporting.

Dealing with undecided responses and turnout

Given the one-off nature of the referendum, it was always going to be difficult to predict turnout patterns. For example, do we estimate turnout rates based on people’s stated intentions or based on their socio-demographic characteristics?
Identifying and treating those with undecided opinions (i.e. the Don’t Knows) was also a challenge.

In our report, we presented a number of different estimates using different methods of adjusting for turnout and dealing with Don’t Knows. It is possible that patterns of turnout were different from the General Election, though it is too early to tell for sure. If this does prove to be the case, it would explain why our ‘preferred’ approach (which predicted turnout based on voting patterns at the 2015 General Election) pushed our estimate towards Remain. Many other commentators are now questioning whether their turnout adjustment methods were misguided, even though they had been changed in response to the General Election “miss”. This is also a new challenge for NatCen, as we are not generally engaged in “modelling” behaviour.
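To see why these adjustment choices matter, here is a minimal sketch of how turnout weighting and the treatment of Don’t Knows can move a headline estimate. All figures and the weighting scheme are hypothetical, for illustration only; they are not drawn from the NatCen survey or its actual adjustment methods.

```python
# Illustrative sketch (hypothetical data): how turnout weighting and the
# treatment of Don't Knows can shift a referendum estimate.

def remain_share(responses, weights, drop_dk=True):
    """Weighted Remain share; Don't Knows are either dropped or split evenly."""
    remain = leave = dk = 0.0
    for vote, w in zip(responses, weights):
        if vote == "remain":
            remain += w
        elif vote == "leave":
            leave += w
        else:
            dk += w
    if drop_dk:
        return remain / (remain + leave)
    # Alternative treatment: allocate Don't Knows 50/50 between the sides
    return (remain + dk / 2) / (remain + leave + dk)

# Hypothetical sample: stated intention plus a 0-1 turnout likelihood score
responses = ["remain", "leave", "leave", "remain", "dk", "leave", "remain", "dk"]
turnout = [0.9, 0.9, 0.8, 0.4, 0.6, 0.9, 0.5, 0.7]

unweighted = remain_share(responses, [1.0] * len(responses))
turnout_weighted = remain_share(responses, turnout)

print(f"Unweighted Remain share:       {unweighted:.1%}")   # 50.0%
print(f"Turnout-weighted Remain share: {turnout_weighted:.1%}")
```

In this toy sample the two sides are level before adjustment, but because the (hypothetical) Leave respondents carry higher turnout scores, weighting moves the estimate several points towards Leave. Different but equally defensible choices produce materially different headline figures.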

Sampling bias

The polling inquiry concluded that the General Election polling “miss” was mainly due to unrepresentative samples. In contrast, with the referendum it looks as if a swing during the campaign and voter turnout models played a bigger role than samples, but there is still a possibility that the samples weren’t representative enough. Certainly, some have argued that, in reality, Leave were always ahead.

Our initial analysis suggests that this sampling bias was not a major issue for our panel sample. Looking at the General Election voter profile, our estimates were very close to the actual population, with the exception of a slight under-representation of UKIP voters.

[Table: 2015 General Election voter profile — election result vs BSA estimate vs panel estimate (EU survey). Recovered row labels: Did not vote, UK Independence Party (UKIP), Liberal Democrat, Green Party; the figures themselves did not survive extraction.]
Of course, this doesn’t rule out biases in the referendum voter profile. Therefore, in the spirit of the polling inquiry and improving methodological understanding, we will be conducting a re-contact survey and publishing the results alongside the micro-data. Apart from the ESRC-funded British Election Study, the NatCen Panel survey will be the only attempt to find out how people who were contacted before polling day actually voted.

Highlighting methodological limitations

We are still keen to stress the main message of our survey, and the level of uncertainty around the estimate itself, in line with the recommendations of the inquiry. Our main message was that the referendum was “on a knife edge”, as evidenced by the confidence interval around our main estimate, which (just) included 50%. Personally, I think NatCen and other research organisations got this right but we need to be bolder in helping the public and media interpret our results. We should try to prevent commentators focussing on the point estimate or putting too much emphasis on small changes over time. In hindsight, I also believe there could have been more discussion about the uncertainty created by unknown patterns of turnout.
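The “knife edge” point rests on a standard piece of arithmetic: a confidence interval for a proportion that straddles 50% cannot distinguish the two outcomes. A minimal sketch, using the normal approximation and hypothetical figures (a 52% point estimate from 1,600 decided respondents; these are not the survey’s actual numbers):

```python
import math

def proportion_ci(p_hat, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of the estimate
    return p_hat - z * se, p_hat + z * se

# Hypothetical: a 52% Remain point estimate from 1,600 decided respondents
low, high = proportion_ci(0.52, 1600)
print(f"95% CI: {low:.1%} to {high:.1%}")
print("Interval includes 50%:", low < 0.5 < high)
```

With these inputs the interval runs from just below 50% to roughly 54%, so the data are consistent with either outcome; reporting only the 52% point estimate would hide exactly the uncertainty discussed above. Real survey estimates also carry design effects from weighting, which widen the interval further than this simple formula suggests.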

Still a case for face-to-face?

Finally, the EU referendum experience of contested polling results strengthens the case for conducting a face-to-face survey during the election campaign, as is done by CIS in Spain. After all, as political allegiances continue to transform in the UK, it doesn’t look as if understanding voter intentions is going to get any easier.





[1] See p4 of report at 
