Session Chair: Florian Keusch, University of Mannheim, Germany
Presentations
The Long-Term Impact of Different Offline Population Inclusion Strategies in Probability-Based Online Panels: Evidence From the German Internet Panel and the GESIS Panel
Carina Cornesse1, Ines Schaurer2
1University of Mannheim; 2GESIS - Leibniz Institute for the Social Sciences
Relevance & Research Question:
While online panels offer numerous advantages, they are often criticized for excluding the offline population. Some probability-based online panels have developed offline population inclusion strategies: providing internet equipment and offering an alternative survey mode. Our research questions are:
1. To what extent does including the offline population have a lasting positive impact across the survey waves of probability-based online panels?
2. Is the impact of including the offline population different when providing internet equipment than when offering an offline participation mode?
3. Is the impact of offering an alternative participation mode different when extending the alternative mode offer to reluctant internet users than when only making the offer to non-internet users?
Methods & Data:
For our analyses, we use data from two probability-based online panels in Germany: the German Internet Panel (GIP), which provides members of the offline population with internet equipment, and the GESIS Panel, which offers members of the offline population as well as reluctant internet users the option of participating via postal mail surveys. We assess the impact of including the offline population in the GIP and the GESIS Panel across their first 12 panel waves on two panel quality indicators: survey participation (measured via response rates) and sample accuracy (measured via the Average Absolute Relative Bias). Our analyses are based on nearly 10,000 online panel members, among them more than 2,000 members of the offline population.
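For readers unfamiliar with the sample accuracy measure used above: the Average Absolute Relative Bias averages, across the categories of a benchmark variable, the absolute deviation of each sample proportion from its benchmark proportion relative to that benchmark. A minimal sketch (the education distribution below is purely illustrative, not taken from the GIP or GESIS Panel data):

```python
def aarb(sample_props, benchmark_props):
    """Average Absolute Relative Bias: mean over categories of
    |sample proportion - benchmark proportion| / benchmark proportion."""
    assert set(sample_props) == set(benchmark_props)
    biases = [abs(sample_props[c] - benchmark_props[c]) / benchmark_props[c]
              for c in benchmark_props]
    return sum(biases) / len(biases)

# Hypothetical education distribution (illustrative numbers only)
benchmark = {"low": 0.35, "medium": 0.40, "high": 0.25}
sample    = {"low": 0.25, "medium": 0.42, "high": 0.33}
print(round(aarb(sample, benchmark), 3))  # prints 0.219
```

An AARB of zero would indicate a sample whose proportions match the benchmark exactly; larger values indicate greater average relative bias.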
Results:
We find that, even though recruitment and/or panel wave response rates are lower among members of the offline population than among members of the online population, including the offline population has a positive long-term effect in both panels, largely because the inclusion strategies reduce bias in education. In addition, it pays off to extend the offline population inclusion strategy to people who use the internet but are unwilling to use it for completing online surveys.
Added Value:
Ours is the first study to compare the impact of different offline population inclusion approaches in probability-based online panels.
Why do people participate in probability-based online panel surveys?
Sebastian Kocar, Paul J. Lavrakas
Australian National University, Australia
Relevance & Research Question: Because survey methodology is a fundamentally quantitative research discipline, the evidence base on particular methods is predominantly quantitative. Probability-based online panels are relatively few in number, and many knowledge gaps merit investigation. In particular, more evidence is needed to understand the successes and failures in recruiting sampled panelists and maintaining their ongoing participation. In this study, we aim to identify the main motivating factors and barriers at all stages of the online panel lifecycle: recruitment to the panel, wave-by-wave data collection, and voluntary/opt-out attrition.
Methods & Data: The data were collected through an open-ended panel survey question and semi-structured qualitative interviews. First, 1,500 panelists provided open-ended verbatim responses about their motivations for joining the panel, gathered in a 2019 wave of Life in Australia™. Between April 2020 and February 2021, fifteen of these panelists, classified into three distinct groups based on their panel response behavior, participated in in-depth qualitative interviews. Each of these panelists also completed a detailed personality inventory (the DiSC assessment). Due to the COVID-19 crisis, the in-depth interviews were conducted virtually or over the phone.
Results: The results showed that (1) having the opportunity to provide valuable information, (2) supporting research, (3) having a say, and (4) sharing their opinions were the most commonly reported reasons for joining the panel and completing panel surveys. The most commonly reported barriers were (1) major life changes, (2) survey length, (3) survey topics, and (4) repetitive or difficult questions. In terms of personality types (DiSC), non-respondents on average scored much lower on dominance and higher on steadiness than frequent respondents.
Added Value: The study uses qualitative data to link the reported motivations and barriers to existing survey participation theories, including social exchange theory, self-perception theory, and compliance heuristics. It also relates these theories and the panelists' reports of their online panel behavior to their personality types. Finally, we turn the evidence from this study into practical recruitment and panel maintenance recommendations for online panels.