Using Facebook for Comparative Survey Research: Customizing Facebook Tools and Advertisement Content
Anja Neundorf, Aykut Ozturk
University of Glasgow, United Kingdom
Relevance & Research Question: Paid advertisements running on platforms such as Facebook and Instagram offer a unique opportunity for researchers who need quick and cost-effective access to a pool of online survey participants. However, scholars using Facebook paid advertisements need to pay special attention to sample bias and cost-effectiveness. Our research explores how Facebook tools and advertisement content can be customized to obtain cost-effective and balanced samples across the world.
Methods & Data: In this paper, we present the findings of three online surveys conducted in the United Kingdom, Turkey, and Spain during February and March 2021. In these studies, we explored how two tools offered by Facebook, the choice of campaign objective and the use of demographic targeting, affected the recruitment process. The campaign objective determines the menu of optimization strategies available to the advertiser; we compare the performance of three objectives: traffic, reach, and conversion. Facebook also allows researchers to target specific demographic groups with their advertisements. We compare two targeting strategies, targeting several demographic characteristics at once versus targeting a single demographic property per advertisement, along with no targeting as a baseline.
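The comparison of campaign objectives ultimately comes down to a cost-effectiveness calculation of the kind sketched below. The figures and metric names here are invented placeholders for illustration, not the paper's actual data or code:

```python
# Hypothetical illustration: comparing Facebook campaign objectives by
# cost per completed survey. All numbers are invented placeholders.
campaigns = {
    # objective: (ad spend, link clicks, completed surveys)
    "traffic":    (500.0, 2400, 310),
    "reach":      (500.0, 1100, 150),
    "conversion": (500.0, 1600, 420),
}

def cost_metrics(spend, clicks, completes):
    """Simple cost-effectiveness metrics for one campaign objective."""
    return {
        "cost_per_click": round(spend / clicks, 3),
        "cost_per_complete": round(spend / completes, 3),
        "click_to_complete_rate": round(completes / clicks, 3),
    }

for objective, (spend, clicks, completes) in campaigns.items():
    print(objective, cost_metrics(spend, clicks, completes))
```

With placeholder numbers like these, a traffic campaign can produce the most clicks while a conversion campaign still yields the lowest cost per completed survey, which is the pattern the paper reports.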
Results: Our studies reveal a series of important findings. First, we were able to collect high-quality samples in each of these countries at low cost. Second, we found that while traffic campaigns produce more clicks on our Facebook advertisements, conversion campaigns recruit higher-quality respondents at a lower cost. Our study also demonstrated that demographic targeting is necessary to produce balanced samples, although it may reduce overall sample size.
Added Value: We believe that our study will help researchers planning to use online surveys for comparative research. Most social scientists conventionally use the traffic objective in their Facebook campaigns; we demonstrate that conversion campaigns actually return cheaper, higher-quality samples. We also demonstrate the benefits of demographic targeting and discuss the conditions under which it becomes most effective for researchers.
Trolls, bots, and fake interviews in online survey research: Lessons learned from recruitment via social media
Zaza Zindel
Bielefeld University, Germany
Relevance & Research Question: The rise of social media platforms and the increasing proportion of people active on such platforms provide researchers with new opportunities to recruit study participants. Targeted advertisements can be used to quickly and cost-effectively reach large numbers of potential survey participants – even when the target group is a rare population. Although a growing number of researchers use these new methods, the particular risks for sample validity and data quality associated with recruiting participants via social media have so far largely remained unaddressed. This presentation addresses a problem that regularly arises when recruiting research participants via social media: fake interviews by trolls and bots.
Methods & Data: To address this issue, data from several social science surveys – each designed to recruit members of rare populations into an online convenience sample via ads on the social media platforms Facebook and Instagram – are compiled. Drawing on previous findings from online survey research (e.g., Teichert et al. 2015; Bauermeister et al. 2012), extended for the specific setting of social-media-generated samples, the completed surveys were reviewed for evidence of fraud. Fraudulent or at least implausible answers, as well as suspicious information in the metadata, were flagged, and a fraud index was formed for each survey participation.
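A fraud index of the kind described above can be sketched as a set of heuristic checks whose flags are summed per respondent. The specific checks, field names, and threshold below are illustrative assumptions, not the author's actual rules:

```python
# Minimal sketch of a per-respondent fraud index: each heuristic check
# adds one flag; a high total marks the interview as suspicious.
# The checks and field names are illustrative assumptions only.
def fraud_index(resp):
    """Count suspicious indicators for one survey response (a dict)."""
    flags = 0
    # Implausibly fast completion suggests a bot or careless respondent.
    if resp.get("duration_seconds", 0) < 120:
        flags += 1
    # Straightlining: identical answers across a whole grid of items.
    grid = resp.get("grid_answers", [])
    if len(grid) >= 5 and len(set(grid)) == 1:
        flags += 1
    # Multiple submissions from the same IP address.
    if resp.get("ip_duplicate", False):
        flags += 1
    # Internally contradictory metadata, e.g. age vs. birth year.
    if resp.get("age") is not None and resp.get("birth_year") is not None:
        if abs((2021 - resp["birth_year"]) - resp["age"]) > 1:
            flags += 1
    return flags

responses = [
    {"duration_seconds": 480, "grid_answers": [4, 2, 5, 3, 4],
     "ip_duplicate": False, "age": 34, "birth_year": 1987},
    {"duration_seconds": 45, "grid_answers": [3, 3, 3, 3, 3],
     "ip_duplicate": True, "age": 25, "birth_year": 1996},
]
suspicious = [r for r in responses if fraud_index(r) >= 2]
```

Summing independent flags rather than hard-excluding on any single check lets the researcher tune the threshold and inspect borderline cases, which matters when, as reported here, a substantial share of interviews is at least suspicious.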
Results: Preliminary results indicate that more than 20 percent of the completed interviews could be classified as at least suspicious. Particularly in the case of socially polarizing survey topics, there appears to be a comparatively high proportion of people who deliberately provide false information in order to falsify study results.
Added Value: All insights derived from the various social media fieldwork campaigns are condensed into a best practice guide to handle and minimize issues due to trolls, bots, and fake interviews in social media recruited samples. This guide adds to the knowledge of how to improve the data quality of survey data generated via social media recruitment.
Using Social Networks to Recruit Health Professionals for a Web Survey
Henning Silber, Christoph Beuthner, Steffen Pötzschke, Bernd Weiß, Jessica Daikeler
GESIS - Leibniz Institute for the Social Sciences, Mannheim, Germany
Relevance & Research Question:
Recruiting respondents by placing ads on social network sites (SNS) such as Facebook or Instagram is a fairly new, non-probabilistic method that provides cost advantages and offers a larger, more targeted sampling frame than existing convenience access panels. Through social networks, hard-to-reach groups, such as migrants (Pötzschke & Weiß 2020) and LGBTQ individuals (Kühne 2020), can be reached. However, self-recruitment via ads might lead to systematic sample bias. In our study, we employ SNS advertisements to recruit individuals working in the health care sector into an online survey on SARS-CoV-2. This group is difficult to reach with established recruitment methods due to its small share of the overall population. To test the effects of different targeting strategies, three ad campaign designs are compared experimentally. The research focuses on (1) a detailed analysis of self-selection bias and (2) the evaluation of different methodological choices within SNS-based recruitment.
Methods & Data:
To test how well health sector workers can be targeted using the database and algorithm provided by Facebook/Instagram, three recruitment variants will be tested (about 500 respondents per variant): Variant 1: specifying the industry "health" in the Facebook/Instagram profile; Variant 2: specifying health as an "interest" in the profile; Variant 3: recruiting from the total population as a control group. The control group is a critical reference point for testing whether recruitment based on profile information is beneficial.
Results:
The study will be fielded in March/April 2021. We will compare the different recruitment strategies and other methodological aspects (e.g., effect of different pictures in the ads) against each other. Further, we will compare the characteristics of respondents recruited with the different recruitment variants against benchmarks from the general population (e.g., gender and age distribution).
Added Value:
The results will add to the sparse empirical evidence and provide recommendations regarding this relatively new methodology. Specifically, three ways of targeting respondents will be experimentally compared. In addition, we will provide evidence on selection bias and compare five different ad versions with respect to their effectiveness in recruiting respondents from the target population.