At the end of August, I attended the Household Survey Non-Response Workshop in Oslo. This is an annual get-together of researchers, statisticians and academics to discuss and identify solutions to the issue of falling response rates in social surveys.
The workshop has been held annually since 1990 – which goes to show just how long the international research community has been concerned about declining participation.
These were my take-home messages from three days of in-depth presentations and discussion at the rather sleek, modern offices of Statistics Norway in Oslo.
1. Response rates vary massively across Europe and North America
I really didn't realise quite how much response rates for major social surveys varied across Europe and North America. For example, the Norwegian Labour Force Survey still achieves a response rate of around 80% (!!!), while over in Germany the General Social Survey achieved just 35% in 2014. Here in the UK, our response rates sit somewhere in the middle.
I heard various anecdotes which threw some light on the possible reasons behind this huge variation. In countries like Norway there is a strong social contract. The researchers I spoke to thought that this was just natural behaviour: the government helps you, so you do a few things in return. On the other hand, a German delegate felt that the Nazis’ use of census data had a lot to do with the dislike for sharing data with those in authority in his country.
Fortunately, over in the UK there still seems to be some trust in official statistics. As part of the British Social Attitudes survey we found that 88% of the public still trust our official stats. Of course, that doesn’t mean it’s always easy to collect the data.
2. There have been big declines in response everywhere over the last 20 years
The UK is not the only country to have experienced large declines in response rates over the last 20 years. It was comforting to see that it’s not that we are doing something wrong in the UK; something in the world has changed – and we need to adapt.
3. Response rates and declines vary – but concern is universal
When I discussed the decline in response rates with delegates – who mainly came from national statistics offices and academia – everyone was concerned about what it means.
But I realised that their outlook on this was relative; in Norway there was concern about a response rate of 80% while in Germany they would be happy to go home with 40%.
4. All countries are working hard to solve the problem
Innovation and research were universal themes across the nations – we are all working to overcome the challenge. All delegates reported on the additional efforts they were making to improve response rates. There were some ingenious ideas.
Some countries have gone down the route of using approaches from the commercial world. The US Census Bureau is testing digital marketing techniques ahead of their next round of data collection in 2020 (something we have also been testing on various studies at NatCen).
Others were trying to understand just how many times you can contact sample members within one week without decreasing your response rate (answer: more than you think).
Meanwhile, other countries have been going down a more authoritarian route. Norway now has a central database of every citizen’s postal address accompanied, in the vast majority of cases, by email and mobile phone numbers. Over in Canada, responding to their major health survey will become compulsory from next year.
Here at NatCen we have an organisational group dedicated to solving the issue, and multiple experimental trials running to road-test new approaches. It was great to connect with others facing the same issues – we hope this can lead to some international collaboration.
5. Falling response rates don’t necessarily harm data quality
Many presentations were concerned with whether falling response rates actually affect data quality. I have to admit, a lot of these involved some quite complicated maths that went a little over my head, but the overall message was clear: lower response rates do not always damage sample quality.
A presentation from Michael Blohm, from the Leibniz-Institute for the Social Sciences, showed that despite plummeting response rates in the German General Social Survey over the past 20 years, the index of dissimilarity – a key measure of bias – had remained fairly static.
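For readers unfamiliar with the measure: the index of dissimilarity compares how a characteristic (say, age group) is distributed in the achieved sample versus the population, taking half the sum of the absolute differences between the two sets of proportions. A quick sketch in Python, using made-up proportions purely for illustration (these are not figures from the presentation):

```python
def dissimilarity_index(sample_props, population_props):
    """Index of dissimilarity: half the sum of absolute differences
    between sample and population category proportions.
    0 means the distributions match exactly; 1 means total mismatch."""
    return 0.5 * sum(abs(s - p) for s, p in zip(sample_props, population_props))

# Hypothetical age-group proportions (young, middle, old) - illustration only.
sample = [0.20, 0.45, 0.35]
population = [0.30, 0.40, 0.30]

print(round(dissimilarity_index(sample, population), 3))  # 0.1
```

A value like 0.1 says that 10% of the sample would need to shift category to match the population, which is why the measure can stay flat even while response rates fall.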
Koen Bullens, from the Centre for Sociological Research in Belgium, presented a fascinating paper on the relationship between sample bias, response rate, and the contrast between the sample and population. His take was that, mathematically speaking, once the response rate falls below 70%, reducing contrast between your sample and population was far more important than maximising the response rate.
Now that response rates exceeding 70% are the exception rather than the rule, he called for a more intelligent approach to replace the blind pursuit of high response rates.
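His argument rests on the standard decomposition of nonresponse bias for a sample mean: the bias equals the nonresponse rate multiplied by the contrast (the difference between the respondent and nonrespondent means). A small illustrative calculation – the numbers are invented, not taken from the presentation:

```python
def nonresponse_bias(response_rate, respondent_mean, nonrespondent_mean):
    """Deterministic nonresponse bias of the respondent mean:
    (nonresponse rate) x (contrast between respondents and nonrespondents)."""
    return (1 - response_rate) * (respondent_mean - nonrespondent_mean)

# Survey A: 80% response rate, but respondents differ a lot from nonrespondents.
# Survey B: 50% response rate, but the contrast is small.
bias_a = nonresponse_bias(0.80, respondent_mean=52.0, nonrespondent_mean=42.0)
bias_b = nonresponse_bias(0.50, respondent_mean=48.0, nonrespondent_mean=46.0)

print(round(bias_a, 2))  # 2.0 - the higher-response survey is the more biased one
print(round(bias_b, 2))  # 1.0
```

The point of the toy numbers: once the response rate drops, shrinking the contrast between sample and population can matter more than squeezing out a few extra percentage points of response.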
There will always be the issue of unknown biases, but the evidence seems to point towards developing a more nuanced approach. Given how fieldwork operates in practice, it won't be easy to put in place, but it could actually deliver high-quality samples at lower cost.
6. We need to be intelligent about efforts to boost response rates
Another interesting take-home message was that some of our efforts to bolster response rates can have unexpected effects, from improving data quality to increasing bias.
One presentation found that additional fieldwork efforts in Germany actually increased sample bias after a certain point. Indeed, from our own work at NatCen, we know that reissuing cases often gets you ‘more of the same’ rather than diversifying your sample.
A typical concern of commissioners and researchers is that incentives will decrease data quality. The logic goes that if you pay someone, they will just rush through, thinking only of the money at the end. However, a study from Holland showed that incentives in online surveys actually increased data quality across several measures.
The same study confirmed what the survey community already knows, but has not really quantified: incentives have a differential effect across sub-groups. They showed that incentives are far less effective at encouraging participation among ethnic minorities compared with other Dutch citizens.
7. Survey research needs to find a way forward
Certainly in the UK, there is constant pressure on data collection organisations to come up with higher response rates. On the other hand, there is little appetite for the larger data collection budgets that this would inevitably require. To achieve higher response rates we would have to knock on doors more times, incentivise more heavily or spend on developing other innovations.
A 70% response rate without spending a fortune is no longer a reality in the UK (the Crime Survey for England and Wales achieves this benchmark, for example, but only by reissuing some cases up to three times). And it may not even be desirable, given what ought to be our end goal: survey estimates that reflect reality.
We really need a way forward to a more nuanced view of response rates. Here are three steps that I think could take us in the right direction:
Stop the chase for high response rates.
Survey commissioners need to write briefs that do not encourage a race for the highest response rates. Points should not be awarded for promising to reach unrealistic targets – yet, increasingly, financial penalties are applied for failure to deliver. At some point, we seem to have forgotten that response rates were only ever a shorthand for sample quality. In the post-70% world, this shorthand doesn’t even work. Commissioners need to embrace a more sophisticated view of sample quality – and reward data collection organisations for doing the same.
Design fieldwork for quality not quantity
In return, survey research organisations like NatCen need to recalibrate their approach to fieldwork, creating a model that takes a more nuanced view of achieving participation. This would likely be through approaches like responsive design. NatCen has already field-tested this type of approach on the most recent round of the European Social Survey and on Next Steps, with positive results.
Foster innovation – and careful monitoring of outcomes
Of course, we need to keep on innovating in many areas – fieldwork, incentives and communications – to support this drive for quality. But we must also keep a careful eye on the consequences – good and bad – of any changes we make. Survey commissioners and data collection organisations need to work openly with one another to foster a climate of innovation.