
Self-Administered Questionnaires

by Charles Hofacker


Novel Approaches for Improving Data Quality, Special issue of International Journal of Market Research; Deadline 30 Sep

INTEREST CATEGORY: MARKETING RESEARCH
POSTING TYPE: CALLS: JOURNALS
Author: Wenkai Zhou

Novel Approaches for Improving Data Quality from Self-Administered Questionnaires

Self-administered questionnaires permit relatively quick and inexpensive data collection from large, diverse, and representative respondent samples. However, many marketing scholars have scrutinized and then questioned the quality of survey data derived in this way. One ongoing issue is self-reports submitted by inattentive, disengaged, or mischievous respondents. Poor-quality data resulting from such self-reports remains problematic for marketing scholars and research practitioners. Even a low percentage of untrustworthy survey data may significantly bias statistical findings. Misleading data can yield results that hinder scientific progress or prompt misguided business actions that harm companies, increase survey costs, and damage research practitioners' reputations.

Some conventional wisdom about survey data collection, cleaning, and transformation may need revisiting, as many seminal research texts were largely or completely written in the pre-internet, pre-social media, pre-Millennial era. Marketing researchers require new approaches to ensure high-quality survey data collected across various technological platforms.

In this vein, an upcoming special issue of the International Journal of Market Research will be dedicated to new approaches for improving survey data quality. The diverse topics suitable for this special issue include, but are not limited to, the following:

Novel and creative ways to improve survey respondents’ attentiveness, engagement, interest, and response accuracy via:

Survey design innovations (e.g., modular survey design; supplementing survey data via multi-method research designs)

Questionnaire design innovations (e.g., improved manipulation checks and warnings; better presentation of questionnaire instructions; interactive and non-interactive entertainment breaks; novel question formats; gamification)

Novel approaches for post-collection data handling (e.g., identifying and deleting careless responses; purging data submitted by mischievous respondents; identifying novel data quality indicators for post-hoc data cleaning; developing composite data quality indices; improving data imputation techniques)

Recruitment techniques that eliminate undesirable respondents prior to data collection (e.g., screening based on personality traits or demographic characteristics)

Unusual incentives for survey participation based on cooperation norms or other forms of motivation (e.g., charity/political cause donation)

Novel questionnaire administration on web-based platforms that include high-tech features like embedded media and augmented or virtual reality (e.g., use of video/animation versus text for posing questions)

New methods for assessing respondents’ attentiveness and engagement (e.g., eye fixation research)

Empirical studies (qualitative or quantitative), theoretical articles, case studies, and evidence-based opinion pieces are welcome from academics and practitioners. The review process will be double-blind, with at least two referees evaluating each manuscript. Prospective authors can find manuscript guidelines at https://us.sagepub.com/en-us/nam/international-journal-of-market-research/journal203424#submission-guidelines.

This special issue is currently proposed for September 2021, and the deadline for submissions is 30 September 2020.

Guest Editors

Michael R. Hyman
Distinguished Achievement Professor of Marketing
New Mexico State University
Las Cruces, NM 88003-8001
Email: mhyman@nmsu.edu

Alena Kostyk
Adam Smith Business School
University of Glasgow
Glasgow, UK
Email: alena.kostyk@gmail.com

Wenkai Zhou
Assistant Professor of Marketing
University of Central Oklahoma
Edmond, OK 73034-5209
Email: wkzhou22@gmail.com

Leo Paas
Professor of Marketing
University of Auckland
Auckland, New Zealand
Email: leo.paas@auckland.ac.nz

The authoritative version of this call is available at:

https://journals.sagepub.com/doi/10.1177/1470785319870622a

References

28 questions to help buyers of online samples. https://ana.esomar.org/document/2666?query=Online%20research%20guidelines

Aust, F., Diedenhofen, B., Ullrich, S., Musch, J. (2013). Seriousness checks are useful to improve data validity in online research. Behavior Research Methods, 45(2), pp.527-535.

AAPOR report on non-probability sampling. https://www.aapor.org/Education-Resources/Reports/Non-Probability-Sampling.aspx

AAPOR report on evaluating survey quality in today’s complex environment. https://www.aapor.org/Education-Resources/Reports/Evaluating-Survey-Quality.aspx

Bansal, H.S., Eldridge, J., Halder, A., Knowles, R., Murray, M., Sehmer, L., Turner, D. (2017). Shorter interviews, longer surveys: Optimising the survey participant experience while accommodating ever expanding client demands. International Journal of Market Research, 59(2), pp.221-238.

Barnette, J.J. (1999). Nonattending respondent effects on internal consistency of self-administered surveys: A Monte Carlo simulation study. Educational and Psychological Measurement, 59(1), pp.38-46.

Bollen, K.A., Arminger, G. (1991). Observational residuals in factor analysis and structural equation models. Sociological Methodology, 21, pp.235-262.

Brosnan, K., Babakhani, N., Dolnicar, S. (2019). “I know what you’re going to ask me”: Why respondents don’t read survey questions. International Journal of Market Research. Available at: https://journals.sagepub.com/doi/abs/10.1177/1470785318821025?journalCode=mrea [Accessed 15 May 2019].

Casler, K., Bickel, L., Hackett, E. (2013). Separate but equal? A comparison of participants and data gathered via Amazon’s MTurk, social media, and face-to-face behavioral testing. Computers in Human Behavior, 29(6), pp.2156-2160.

Dillman, D.A., Smyth, J.D., Christian, L.M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). New York, NY: Wiley.

Dodou, D., de Winter, J.C. (2014). Social desirability is the same in offline, online, and paper surveys: A meta-analysis. Computers in Human Behavior, 36, pp.487-495.

Dolnicar, S., Grün, B., Yanamandram, V. (2013). Dynamic, interactive survey questions can increase survey data quality. Journal of Travel & Tourism Marketing, 30(7), pp.690-699.

ESOMAR guideline for online research. https://ana.esomar.org/document/2724?query=Online%20research%20guidelines.

ESOMAR/GBRN guideline for online sample quality. https://ana.esomar.org/document/7799?query=Online%20research%20guidelines.

Fleischer, A., Mead, A.D., Huang, J. (2015). Inattentive responding in MTurk and other online samples. Industrial and Organizational Psychology, 8(2), pp.196-202.

Guin, T.D.L., Baker, R., Mechling, J., Ruyle, E. (2012). Myths and realities of respondent engagement in online surveys. International Journal of Market Research, 54(5), pp.613-633.

Goodman, J.K., Cryder, C.E., Cheema, A. (2013). Data collection in a flat world: The strengths and weaknesses of Mechanical Turk samples. Journal of Behavioral Decision Making, 26(3), pp.213-224. doi:10.1002/bdm.1753

Groves, R.M., Singer, E., Corning, A. (2000). Leverage-saliency theory of survey participation: Description and an illustration. The Public Opinion Quarterly, 64(3), pp.299-308.

Hyman, M.R., Sierra, J.J. (2012). Adjusting self-reported attitudinal data for mischievous respondents. International Journal of Market Research, 54(1), pp.129-145.

Kostyk, A., Zhou, W., Hyman, M.R. (2019). Using surveytainment to counter declining survey data quality. Journal of Business Research, 95, pp.211-219.

Kropf, M.E., Blair, J. (2005). Eliciting survey cooperation: Incentives, self-interest, and norms of cooperation. Evaluation Review, 29(6), pp.559-575.

Lind, J.C., Zumbo, B.D. (1993). The continuity principle in psychological research: An introduction to robust statistics. Canadian Psychology, 34(4), pp.407-414.

Liu, Y., Zumbo, B.D. (2007). The impact of outliers on Cronbach’s coefficient alpha estimate of reliability: Visual analogue scales. Educational and Psychological Measurement, 67(4), pp.620-634.

Meade, A.W., Craig, S.B. (2012). Identifying careless responses in survey data. Psychological Methods, 17(3), pp.437-455.

Paas, L.J., Dolnicar, S., Karlsson, L. (2018). Instructional manipulation checks: A longitudinal analysis with implications for MTurk. International Journal of Research in Marketing, 35(2), pp.258-269.

Paas, L.J., Morren, M. (2018). Please do not answer if you are reading this: Respondent attention in online panels. Marketing Letters, 29(1), pp.13-21.

Payne, S.L. (1951). The art of asking questions. Princeton, NJ: Princeton University Press.

Peterson, R.A. (2001). On the use of college students in social science research: Insights from a second-order meta-analysis. Journal of Consumer Research, 28(3), pp.450-461.

Peterson, G., Griffin, J., LaFrance, J., Li, J. (2017). Smartphone participation in web surveys. In: Biemer, P.P., et al. (eds.), Total survey error in practice (pp.203-233). New York, NY: Wiley.

Sears, D.O. (1986). College sophomores in the laboratory: Influences of a narrow database on social psychology’s view of human nature. Journal of Personality and Social Psychology, 51(3), pp.515-530.

Singer, E., Couper, M.P. (2008). Do incentives exert undue influence on survey participation? Experimental evidence. Journal of Empirical Research on Human Research Ethics, 3(3), pp.49-56.

Smith, S.M., Roster, C.A., Golden, L.L., Albaum, G.S. (2016). A multi-group analysis of online survey respondent data quality: Comparing a regular USA consumer panel to MTurk samples. Journal of Business Research, 69(8), pp.3139-3148.

Van Herk, H., Poortinga, Y.H., Verhallen, T.M. (2004). Response styles in rating scales: Evidence of method bias in data from six EU countries. Journal of Cross-Cultural Psychology, 35(3), pp.346-360.

Zhang, C., Conrad, F. (2014). Speeding in web surveys: The tendency to answer very fast and its association with straightlining. Survey Research Methods, 8(2), pp.127-135.