CMU-HCII-21-101
Human-Computer Interaction Institute
School of Computer Science, Carnegie Mellon University




From Contests to Communities of Practice:
Designing for Effective Feedback Exchange in Online Innovation Contests

Felicia Yan Ng

January 2021

Ph.D. Thesis



Keywords: Human-computer interaction, social computing, computer-supported cooperative work, crowdsourcing, online innovation contests, online feedback exchange


Online innovation contests are an increasingly popular tool that organizations are using to find breakthrough solutions to problems in a variety of social, scientific, and business domains. By posting a specific challenge and a monetary prize on the internet, they attract large and diverse crowds of people to propose new ideas, in hopes of surfacing one or a few outstanding winners. However, prize-centered contest designs are inefficient at leveraging participant contributions and producing high-quality ideas. Competitive winner-takes-all systems inherently do not reward the majority of ideas, which discourages participants from contributing their earnest efforts, both during and after contests, especially if they lose or believe they may lose. The result is that most online innovation contests produce a disproportionately large number of low-quality ideas. As such, these inefficiencies demonstrate a need for new contest designs to better leverage participant contributions towards more effective innovation processes.

In this dissertation, I address these inefficiencies by exploring an alternative approach to designing online innovation contests as communities of practice, in which many participants are encouraged to contribute their efforts to help and learn from one another, while collectively raising the quality of ideas. To explore this, I introduce new design interventions for inducing different types of participants in online innovation communities to exchange feedback on one another's innovation ideas in contests. Through five field studies in real-world online innovation contests, I test and identify the conditions under which each intervention is effective at engaging and benefiting participants and improving project quality.

Specifically, I show that the introduction of new peer advisor programs, in which participants are explicitly invited and assigned to serve as feedback providers to specific project teams, elicits greater engagement between community members than the standard collaboration mechanisms currently available on online innovation contest platforms. These programs successfully leverage more participant contributions by inducing a mutual exchange of benefits related to human capital development (e.g., learning, networking). However, meta-analyses across the five studies reveal that feedback improves project quality only when teams are matched with advisors who have relevant expertise in their project domains, and when feedback is exchanged during early stages of the contest process. In addition, helping feedback receivers process and incorporate feedback into project revisions also benefits project quality for participants who are not experts in their project domain.

In summary, this work contributes: (1) a new approach for designing online innovation contests as communities of practice, specifically by inducing participants to exchange feedback with one another, and (2) a conceptual framework of the conditions under which feedback interventions are effective at engaging and benefiting participants as well as improving project quality. These contributions provide practical guidance to designers and organizers of online innovation contests.

171 pages

Thesis Committee:
Aniket Kittur (Co-Chair)
Robert Kraut (Co-Chair)
Chinmay Kulkarni
Alex Dehgan (Conservation X Labs)

Jodi Forlizzi, Head, Human-Computer Interaction Institute
Martial Hebert, Dean, School of Computer Science
