
Running field experiments using Facebook split test.

Authors
  • Orazi, Davide C. (1)
  • Johnston, Allen C. (2)
  • 1. Dept. of Marketing, Monash University, Australia
  • 2. Dept. of Information Systems, Statistics, & Management, University of Alabama, USA
Type
Published Article
Journal
Journal of Business Research
Publisher
Elsevier
Publication Date
Sep 01, 2020
Volume
118
Pages
189–198
Identifiers
DOI: 10.1016/j.jbusres.2020.06.053
PMID: 32834210
Source
Medline
Language
English
License
Unknown

Abstract

Business researchers use experimental methods extensively due to their high internal validity. However, controlled laboratory and crowdsourcing settings often introduce issues of artificiality, data contamination, and low managerial relevance of the dependent variables. Field experiments can overcome these issues but are traditionally time- and resource-consuming. This primer presents an alternative experimental setting to conduct online field experiments in a time- and cost-effective way. It does so by introducing the Facebook A/B split test (FBST) functionality, which allows for random assignment of manipulated variables embedded in ecologically valid stimuli. We compare and contrast this method against laboratory settings and Amazon Mechanical Turk in terms of design flexibility, managerial relevance, data quality control, and sample representativeness. We then provide an empirical demonstration of how to set up, pre-test, run, and analyze FBST experiments. © 2020 Elsevier Inc. All rights reserved.
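The abstract refers to analyzing FBST experiments but the record does not include the authors' procedure. As an illustrative sketch only (not taken from the article), results of a two-condition split test exported as click and impression counts per ad set could be compared with a two-proportion z-test in Python; all figures and variable names below are hypothetical placeholders.

```python
# Hypothetical sketch: comparing click-through rates (CTR) between two
# split-test conditions with a two-proportion z-test.
# The counts are illustrative placeholders, not data from the article.
from statsmodels.stats.proportion import proportions_ztest

clicks = [412, 498]           # link clicks for condition A and condition B
impressions = [25000, 25100]  # impressions for condition A and condition B

# Test H0: the two click-through rates are equal
z_stat, p_value = proportions_ztest(count=clicks, nobs=impressions)

ctr_a, ctr_b = (c / n for c, n in zip(clicks, impressions))
print(f"CTR A = {ctr_a:.4f}, CTR B = {ctr_b:.4f}")
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```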
