
When Simpler is Better: Conditions and Complexity in Forecasting

CALL FOR PAPERS
SPECIAL ISSUE OF THE JOURNAL OF BUSINESS RESEARCH*

Deadline for submission of papers: 1 April 2013

Kesten C. Green, Elizabeth J. Wilson, and J. Scott Armstrong

"It is vain to do with more what can be done with fewer." (William of Occam)

We seek submissions of papers that examine conditions under which simpler forecasting methods are as useful as, or more useful than, more complicated ones. Simpler methods may be useful not only because they provide accurate forecasts, but also because they are cost effective or more likely to be adopted than more complex alternatives. Write to us now with your ideas for research to improve forecasting knowledge by comparing the usefulness of simple and complex alternatives under different conditions. We will provide you with quick feedback on the suitability of your research plan for this special issue of the Journal of Business Research.

Background

What is the right level of complexity for forecasting methods? Regression analysis and other statistical modelling techniques are readily available and widely used. Such techniques can, however, only be used when sufficient data are available. Moreover, researchers (including Paul Meehl, Robyn Dawes, and Frank Schmidt) have found that in some situations equal-weights models provide forecasts that are at least as accurate as those from regression models. This paradoxical relationship between model complexity and forecast accuracy is our motivation for the Journal of Business Research special issue that is the subject of this Call for Papers.
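
To illustrate the kind of comparison behind these findings, the following sketch (Python with NumPy) fits an ordinary least squares regression and an equal-weights model on a small training sample and compares their out-of-sample mean absolute errors. The data are simulated, and the predictors, coefficients, and sample sizes are hypothetical choices of ours, not part of the call; whether equal weights win depends on the noise level and sample size, and the point is the design of the comparison rather than the particular result.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical standardised predictors and a noisy criterion.
    n_train, n_test, k = 40, 1000, 5
    X_train = rng.standard_normal((n_train, k))
    X_test = rng.standard_normal((n_test, k))
    true_beta = np.array([0.4, 0.3, 0.3, 0.2, 0.2])
    y_train = X_train @ true_beta + rng.standard_normal(n_train)
    y_test = X_test @ true_beta + rng.standard_normal(n_test)

    # Regression model: coefficients estimated by ordinary least squares.
    beta_ols, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
    pred_ols = X_test @ beta_ols

    # Equal-weights model: sum the standardised predictors, then estimate a
    # single scaling coefficient on the training data (unit-weighting).
    composite_train = X_train.sum(axis=1)
    scale = (composite_train @ y_train) / (composite_train @ composite_train)
    pred_equal = scale * X_test.sum(axis=1)

    # Compare out-of-sample accuracy on the held-out observations.
    mae = lambda pred: np.mean(np.abs(pred - y_test))
    print(f"Out-of-sample MAE, regression:    {mae(pred_ols):.3f}")
    print(f"Out-of-sample MAE, equal weights: {mae(pred_equal):.3f}")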

The question of what is the right level of complexity for forecasting models is relevant for diverse problems and methods. Take election forecasting: simply asking a sample of people how they expect to vote can provide useful forecasts of population voting behavior. How far ahead of an election are such forecasts valid? To what extent can accuracy be increased by, for example, improved sampling, by damping, or by adjusting for errors from analogous forecasts? Other domains and problem types that we expect would be amenable to productive research on this question include, but are not limited to, human resources (hiring and remuneration), production and marketing, investment, public policy, and rare events. Simple methods to consider include, but are not limited to, heuristics, index models, structured analogies, extrapolation techniques, regression models with three or fewer variables, and combinations of forecasts. Complex models are limited only by the imaginations of the modellers, but we ask that their performance be compared with that of reasonable simple alternatives, and against large samples of out-of-sample data, under well-defined and realistic conditions.
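
As a concrete illustration of the kind of out-of-sample comparison we have in mind, the sketch below (Python with NumPy, on a hypothetical series of our own making) evaluates three simple extrapolation forecasts and their equal-weighted combination against twelve held-out observations. The series, the damping factor, and the error measure are illustrative assumptions, not prescriptions.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical monthly series: linear trend plus noise; last 12 points held out.
    t = np.arange(120)
    series = 100 + 0.5 * t + rng.normal(0, 5, size=t.size)
    train, holdout = series[:-12], series[-12:]
    h = np.arange(1, 13)  # forecast horizons 1..12

    # Three simple extrapolation forecasts of the held-out year.
    slope = (train[-1] - train[0]) / (len(train) - 1)  # average historical change
    naive = np.full(12, train[-1])        # no-change forecast
    drift = train[-1] + h * slope         # extrapolate the historical drift
    damped = train[-1] + 0.5 * h * slope  # same drift, damped towards no change

    combined = (naive + drift + damped) / 3  # equal-weighted combination of forecasts

    # Out-of-sample mean absolute error for each method.
    mae = lambda forecast: np.mean(np.abs(forecast - holdout))
    for name, forecast in [("naive", naive), ("drift", drift),
                           ("damped drift", damped), ("combination", combined)]:
        print(f"{name:12s} out-of-sample MAE: {mae(forecast):.2f}")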

Action steps

We invite you to send us your proposals for research on forecasting methods and the conditions under which simpler is better, and those under which it is not. Experimental studies examining multiple reasonable hypotheses, replications and extensions of important studies, and meta-analyses of evidence are especially welcome.

  • Now: Sound us out on your ideas for contributing a paper; contact one of us this week!
  • 30 November 2012: Detailed proposals due.
    – Relevant and well-designed studies will be accepted as “invited papers.”
  • 1 April 2013: Deadline for submission of papers.

Contact: Kesten Green at Kesten.Green@unisa.edu.au and Elizabeth Wilson at ewilson@suffolk.edu

*The Journal of Business Research has an h5-index of 47 and an h5-median of 62 in Google Scholar's publication rankings (one of the top 5 journals in citation impact by h-index over the five-year citation window).