An evaluation framework for software crowdsourcing
Wenjun WU , Wei-Tek TSAI , Wei LI
Front. Comput. Sci. ›› 2013, Vol. 7 ›› Issue (5) : 694 -709.
Recently, software crowdsourcing has become an emerging area of software engineering, yet few papers have presented a systematic analysis of its practices. This paper first presents an evaluation framework for assessing software crowdsourcing projects with respect to software quality, cost, diversity of solutions, and the competitive nature of crowdsourcing. Specifically, competitions are evaluated using the min-max relationship from game theory, in which one party tries to minimize an objective function while the other party tries to maximize the same objective function. The paper then defines a game-theoretic model to analyze the primary factors in these min-max competition rules that affect both the nature of participation and the resulting software quality. Finally, using the proposed evaluation framework, this paper examines two crowdsourcing processes, Harvard-TopCoder and AppStori. The framework reveals sharp contrasts between the two processes, as participants behave very differently when engaging in these two kinds of projects.
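To make the min-max relationship concrete, the following sketch (illustrative only, not taken from the paper) models a competition as a two-player zero-sum game: a minimizing party chooses among rule settings (rows) and a maximizing party chooses among strategies (columns) over a shared objective. The payoff matrix here is hypothetical example data; in general the minimizer's guaranteed value (minimax) is at least the maximizer's guaranteed value (maximin), with equality only at a pure-strategy saddle point.

```python
# Hypothetical two-player zero-sum competition: the row player
# minimizes the shared objective, the column player maximizes it.

def minimax_value(payoff):
    """Best guaranteed outcome for the minimizing row player:
    min over rows of the max over columns."""
    return min(max(row) for row in payoff)

def maximin_value(payoff):
    """Best guaranteed outcome for the maximizing column player:
    max over columns of the min over rows."""
    columns = zip(*payoff)
    return max(min(col) for col in columns)

# Example payoff matrix (made-up numbers): rows are the minimizer's
# rule settings, columns are the maximizer's strategies.
payoff = [
    [3, 5, 2],
    [4, 1, 6],
]

print(minimax_value(payoff))  # 5
print(maximin_value(payoff))  # 3  (< 5: no pure-strategy saddle point here)
```

Because maximin (3) is strictly below minimax (5) in this example, neither player has a dominant pure strategy, which is the kind of tension the paper's competition-rule analysis is concerned with.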
Keywords: crowdsourcing / software engineering / competition rules / game theory
Higher Education Press and Springer-Verlag Berlin Heidelberg