Metadata
Title
“Optimal” Feedback Use in Crowdsourcing Contests: Source Effect and Priming Intervention
Category
undergraduate
UUID
1cf627621e1b47d3b999f407b085504e
Source URL
https://bm.hkust.edu.hk/bizinsight/2026/01/optimal-feedback-use-crowdsourcing-co...
Parent URL
https://bm.hkust.edu.hk/bizinsight
Crawl Time
2026-03-13T04:21:17+00:00
Rendered Raw Markdown


[ Digital Platform: Design and Strategy ] [ Innovation and Entrepreneurship ]

“Optimal” Feedback Use in Crowdsourcing Contests: Source Effect and Priming Intervention

29 Jan 2026

KOH, Tat Koon

Associate Dean (UG Programs), Lee Heng Fellow, Associate Professor, Program Director, Global Business Program


Crowdsourcing contests allow firms to seek ideas from external solvers to address their problems. This research examines how solvers use developmental feedback from different sources, and of varying constructiveness, when generating ideas in contests. I theorize a source effect in solvers’ feedback use: solvers use seeker feedback more than peer feedback, even when both sources offer identical suggestions for their ideas. I also show how the source effect shapes solvers’ use of constructive and less constructive feedback from the respective sources.

One insight is that, compared with their use of peer feedback, solvers’ use of seeker feedback is more extensive at any level of feedback constructiveness but less sensitive to it. An implication is that solvers may underuse constructive peer feedback and overuse less constructive seeker feedback. Such behaviors can be solver optimal (improving solvers’ winning prospects) but not seeker optimal (enhancing ideas for seekers’ problems), as constructive feedback is likely to improve idea quality, whereas less constructive feedback may hurt it.

I propose a priming intervention based on a feedback evaluation mechanism to mitigate the source effect in solvers’ feedback use; in effect, the intervention can lead solvers to behave in ways that are more optimal for seekers. A field survey and three online experiments test the theorizing and the proposed intervention. I discuss the contributions and implications of this research for various stakeholders in crowdsourcing contests.