Paper Accepted @ OrgSci

My paper “‘Optimal’ Feedback Use in Crowdsourcing Contests: Source Effect and Priming Intervention” has been accepted at Organization Science.

Abstract: Crowdsourcing contests allow firms to seek ideas from external solvers to address their problems. This research examines solvers’ use of developmental feedback from different sources and of varying constructiveness when generating ideas in contests. I theorize a source effect in solvers’ feedback use, whereby they use seeker feedback more than peer feedback, even when both sources offer identical suggestions for their ideas. I also show how the source effect shapes solvers’ use of constructive and less constructive feedback from the respective sources. A key insight is that, compared with their use of peer feedback, solvers’ use of seeker feedback is more extensive at any level of feedback constructiveness but less sensitive to it. An implication is that solvers may underuse constructive peer feedback and overuse less constructive seeker feedback. Such behaviors can be solver-optimal (in terms of improving solvers’ winning prospects) but not seeker-optimal (in terms of enhancing ideas for seekers’ problems), as constructive feedback is likely to improve idea quality, whereas less constructive feedback may hurt it. I propose a priming intervention based on a feedback evaluation mechanism to mitigate the source effect in solvers’ feedback use; in effect, the intervention can lead solvers to behave more optimally for seekers. A field survey and three online experiments test the theorizing and the proposed intervention. I discuss the contributions and implications of this research for various stakeholders in crowdsourcing contests.

Keywords: crowdsourcing, ideation contests, feedback, priming, feedback evaluation mechanism, idea quality, field survey, experiments