VOL. 10, NO. 3, FEBRUARY 2015

ARPN Journal of Engineering and Applied Sciences

ISSN 1819-6608

©2006-2015 Asian Research Publishing Network (ARPN). All rights reserved.

www.arpnjournals.com

CROWDSOURCING AND PROJECT MANAGEMENT: SCOPUS LITERATURE REVIEW

Roy Deddy Hasiholan Tobing
Faculty of Informatics Engineering, Del Institute of Technology, Jl. Sisingamangaraja, Sitoluama, Laguboti, Kabupaten Toba Samosir, Sumatera Utara, Indonesia
E-Mail: [email protected]

ABSTRACT
Crowd sourcing has become a trend that affects the way a project is managed. The success stories of crowd sourcing are noticeable, and so are the failures. Crowd sourcing provides access to diverse and abundant talent pools at relatively low cost: a project manager can easily recruit a geographically dispersed, previously unknown individual or group to perform a task. However, a project manager should understand the benefits and risks of crowd sourcing before adopting it. Moreover, the position of the recruited individual or group should be clear: are they part of a virtual project team, or just another form of outsourcing? This research reviews the current literature on crowd sourcing and project management to study the benefits and risks of crowd sourcing, its relation to project management, and how recruited workers should be positioned in a project. The discussion of crowd sourcing and project management focuses specifically on project attributes.

Keywords: crowd sourcing, benefits and risks, project management.

INTRODUCTION
In recent years, the way projects are managed has changed, with the Internet enabling access to an abundant and relatively cheap pool of resources, specifically human resources. Team members for a project can easily be sourced from the so-called "crowd" and expected to complete certain tasks. A worker can come from any country, as long as he or she is registered on a platform such as Amazon Mechanical Turk (Amazon, 2014), InnoCentive (InnoCentive, 2014; Marjanovic, Fry et al. 2012), or freelancer.com (Freelancer.org, 2014). Many different web-based platforms are available on the Internet as intermediaries between requesters (the individuals or groups that contract people to participate in a project) and workers (the hired persons). Howe (2006) coined the term "crowdsourcing" in an article describing this activity.

Crowd sourcing has produced evidence of both successes and failures (Simula, 2013). This raises the question of what the benefits and risks are of completing a project using crowd sourcing. By understanding these benefits and risks, a project manager can acquire the knowledge needed to adopt crowd sourcing successfully in a project. Regarding project definition, Gido and Clements (2012) have described a set of attributes that constitute a project; these attributes serve as guidance for managing projects. With respect to the resource utilization attribute, the benefits of lower worker costs and a large pool of available talent can motivate the decision to use crowd sourcing (Mason and Suri, 2011). The implication is that managing crowd workers differs from traditional project management, in which the workers are internal staff. Issues emerge, such as result quality (Kazai, Kamps et al. 2013) and security (Mason and Suri, 2011). Further discussion is therefore needed on how crowd sourcing affects other project attributes.

This research aims to answer questions regarding the implications of crowd sourcing on project management, based on a literature review. First, the author wants to find out what a project manager should consider before deciding to use crowd sourcing in a project. Next, the discussion focuses on the position of the individual or group recruited from a crowd sourcing platform: should they be part of a virtual project team, or considered another variant of outsourcing? This study is conducted as a systematic review in order to gain a better understanding of how crowd sourcing can be used and positioned in a project.

The next section of this paper explains the methodology used to systematically answer the research questions. In the third section, the author presents the findings on crowdsourcing and project management from the collected articles. The fourth section discusses the findings and answers the research questions. The last section presents the conclusions and suggestions for future research.

METHODOLOGY
In order to answer the research questions, the author conducts a systematic review by collecting information on crowd sourcing and project management. Answering the first research question requires an information gathering process. The decision on whether to adopt crowd sourcing is influenced by a project manager's knowledge of its benefits and risks. If crowd sourcing is adopted, the project manager can use that knowledge to decide which project attributes should be adapted to match the nature of crowd sourcing. The information gathering process consists of three activities, each of which is a search using a different strategy; the main difference is the second keyword used for retrieving the reviewed literature. The answer to the second question, about the position of the committed individual or group in a project, comes from the discussion of the findings.

In this research, the information collection is conducted by searching and gathering literature from a major academic database, Scopus. The reason is that the author only uses publications that have been cited at least once in other publications, and the Scopus academic database is considered able to accommodate this need because it offers citation analysis capability (Falagas, Pitsouni et al. 2008). In all searches, the author uses "crowd sourcing" as the primary keyword. As stated before, the difference between the searches is the second keyword in the search strategy. First, the search combines the primary keyword with the second keyword "benefits". After obtaining the results, the author runs another search combining the primary keyword with the keyword "risks". Third, the keyword "crowd sourcing" is entered along with "project management" as the second keyword. The results are constrained to published journal articles written in English. Moreover, only literature published from 2006 onwards is included, as the term "crowd sourcing" was coined in that year, despite the fact that crowd sourcing-like activities that existed before the term was created have been reported in many articles, such as neogeography (Goodchild and Glennon, 2010). After applying the search strategy, the author obtained 28 published journal articles for the combination of the "crowd sourcing" and "benefits" keywords, 18 for "crowd sourcing" and "risks", and 5 for "crowd sourcing" and "project management". Finally, 12, 8, and 2 articles, respectively, were selected for further analysis after reviewing the abstract of each article.
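The exact query strings are not reproduced in the paper, so the following is only an illustrative reconstruction of the three search strategies in Scopus advanced-search syntax; the field codes and the year, document-type and language constraints shown are assumptions about how the strategy could be expressed, not the study's verbatim queries:

    TITLE-ABS-KEY("crowdsourcing" AND "benefits") AND PUBYEAR > 2005 AND DOCTYPE(ar) AND LANGUAGE(english)
    TITLE-ABS-KEY("crowdsourcing" AND "risks") AND PUBYEAR > 2005 AND DOCTYPE(ar) AND LANGUAGE(english)
    TITLE-ABS-KEY("crowdsourcing" AND "project management") AND PUBYEAR > 2005 AND DOCTYPE(ar) AND LANGUAGE(english)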

LITERATURE REVIEW

Trends of crowd sourcing in the Scopus academic database from 2006 to August 2014
When conducting the literature search in the Scopus academic database, the author identified the trends for the three search strategies, i.e., combining the "crowd sourcing" keyword with (1) "benefits", (2) "risks", and (3) "project management". Running the searches with only the keywords and no other constraints applied, the author found 145, 65, and 44 articles and conference papers for each search strategy, respectively. Referring to Figure-1, there is no evidence of studies on the crowd sourcing topic in 2006 and 2007. The topic did not enter the Scopus academic database until 2008, and the figures show increasing interest in the topic at least until 2013. The number of published articles and conference papers in 2014 is not final, as this study was conducted in August 2014. Studies on the topic of "crowdsourcing" and "benefits" in particular increased noticeably from 19 articles and conference papers in 2011 to 40 in 2012, an increase of approximately 100%. Meanwhile, for the topic of crowdsourcing risks, increases of more than 100% in the number of published articles and conference papers can be seen from 2010 to 2011 and from 2012 to 2013.

Figure-1. Trends of crowd sourcing topics in Scopus: 2006 – August 2014.

Crowd sourcing
As mentioned before, the term crowd sourcing was coined by Howe (2006), who describes it as the situation where "hobbyists, part-timers, and dabblers suddenly have a market for their efforts" and "smart companies in industries … discover ways to tap the latent talent of the crowd". Borrowing the terms used by Amazon's Mechanical Turk (AMT), the hobbyists, part-timers and dabblers are referred to as "workers", while "requester" refers to the other side of the activity. As Figure-1 has shown, there is an increasing trend of interest in crowd sourcing. An effort towards a formal, integrated definition of crowd sourcing is available in the article by Estellés-Arolas and González-Ladrón-de-Guevara (2012). Many web-based platforms that act as intermediaries between workers and requesters are available on the Internet. Crowd sourcing platforms use Web 2.0 technology and apply the user-generated content concept. As for the different groups of crowd sourcing, Schuurman, Baccarne et al. (2012) divide it into five types. The workers usually complete certain tasks for a small amount of money (Mason and Suri 2011) or only for social status (Franzoni and Sauermann 2014).

Benefits and risks of crowd sourcing
The benefits of adopting crowd sourcing reported in the literature collected for this study are summarized in Table-1. The identified benefits are:

- Workers skill pools. The requester has access to a large number of people with different skill sets that can be matched to the tasks that need to be completed.

- Workers diversity. In a project that requires diverse data input, crowdsourcing gives access to groups of people with unique backgrounds, different cultures and languages. AMT claims that requesters have access to more than 500,000 workers from 190 countries (Amazon 2014).

- Low cost. Acquiring workers requires relatively low cost, and so does paying them. For example, in AMT the requester pays a commission of as little as $0.005 and can set the price as low as $0.01 per HIT. Franzoni and Sauermann (2014) give cases in which participants only require social recognition as the "payment" in crowdsourcing.

- Faster result delivery. As a crowdsourcing platform opens access to many potential workers, tasks can be completed in a relatively short period. Franzoni and Sauermann (2014) give an example of how Galaxy Zoo obtained remarkable results: for a single researcher, producing roughly 50 million galaxy-image classifications could take more than 83 years of full-time effort, whereas the crowd completed about 200 million classifications within three years of the site's launch. (A back-of-envelope check of these cost and throughput figures is sketched after Table-1.)
- Result quality. The quality of results is relatively equal to that of tasks performed by professional workers or in a laboratory setting (Mason and Suri 2011; Franzoni and Sauermann 2014).

Table-1. Summary of literatures discussing crowd sourcing benefits (benefit categories: B1 workers skills, B2 workers diversity, B3 low cost, B4 faster result delivery, B5 result quality). Articles reviewed: Mason and Suri (2011); Goodchild and Glennon (2010); Poetz and Schreier (2012); Chanal and Caron-Fasan (2010); Kazai, Kamps et al. (2013); Schuurman, Baccarne et al. (2012); Barbier, Zafarani et al. (2012); Franzoni and Sauermann (2014); Satzger, Psaier et al. (2013); Thaler, Simperl et al. (2012); Blohm, Leimeister et al. (2013); Way, Ottenbacher et al. (2011).
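To make the low-cost and faster-delivery figures above concrete, the short Python sketch below computes the requester cost of a batch of micro-tasks and the throughput implied by the Galaxy Zoo numbers. It is not taken from the reviewed articles: the per-HIT price and commission come from the "Low cost" item, and the 2,000 working hours per year is an added assumption.

# Back-of-envelope check of the cost and throughput figures quoted above.
# Assumptions (not from the reviewed articles): 2,000 working hours per year
# for a full-time researcher; AMT-style pricing of $0.01 per HIT plus a
# $0.005 commission per HIT, as cited in the "Low cost" item.

PRICE_PER_HIT = 0.01        # dollars paid to the worker per micro-task
COMMISSION_PER_HIT = 0.005  # dollars paid to the platform per micro-task

def batch_cost(num_tasks: int) -> float:
    """Total requester cost for a batch of micro-tasks."""
    return num_tasks * (PRICE_PER_HIT + COMMISSION_PER_HIT)

# Galaxy Zoo figures quoted by Franzoni and Sauermann (2014):
solo_years, solo_classifications = 83, 50_000_000
crowd_years, crowd_classifications = 3, 200_000_000

seconds_per_item = solo_years * 2_000 * 3_600 / solo_classifications
crowd_items_per_day = crowd_classifications / (crowd_years * 365)

print(f"Cost of 100,000 micro-tasks: ${batch_cost(100_000):,.2f}")
print(f"Implied solo effort: about {seconds_per_item:.0f} s per classification")
print(f"Crowd throughput: about {crowd_items_per_day:,.0f} classifications per day")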

In terms of risks, crowdsourcing also has many potential challenges. Table-2 summarizes the findings on crowdsourcing risks gathered from the collected literature. From the table, we can see that the potential risks of using crowdsourcing in a project are dominated by the result quality issue: all of the collected articles discuss this topic and report how usable the produced results are. This is an interesting fact, as result quality is also one of the benefits of crowdsourcing discussed in the collected literature. In further discussion, Behrend, Sharek et al. (2011), for example, explain that there are particular risks in using crowd sourcing to collect survey data. It is relatively difficult to ensure random sampling and to make sure that one person has not completed a survey multiple times, as requesters find it hard to verify the person. Hence, the data produced by crowdsourcing can be biased.

Table-2. Summary of literatures discussing crowd sourcing risks (risk categories: result quality, confidentiality, security, social risks). Articles reviewed: Behrend, Sharek et al. (2011); Shapiro, Chandler et al. (2013); Marjanovic, Fry et al. (2012); Vivacqua and Borges (2012); Trottier (2014); Wagner (2011); Poetz and Schreier (2012); Kazai, Kamps et al. (2013).

Beyond the articles summarized in Table-2, several of the other collected articles also discuss the risks of using crowd sourcing, particularly unexpected result quality (Mason and Suri 2011; Goodchild and Glennon 2010; Schuurman, Baccarne et al. 2012; Barbier, Zafarani et al. 2012; Franzoni and Sauermann 2014; Blohm, Leimeister et al. 2013). Mason and Suri (2011) discuss the confidentiality and security issues that emerge from adopting crowd sourcing. Requesters are not able to hide their identity, because workers need to be able to check that a requester is reliable. Moreover, the requester must ensure the security of its internal network or web server if the tasks performed by workers run there. Posting a HIT in AMT, for example, means that the requester opens the job and its details to the public; hence, it is exposed to malicious attacks. Finally, Marjanovic, Fry et al. (2012) argue that further studies are needed to identify the crowd sourcing variables that determine success and failure, especially in particular crowd sourcing models.

In addition, to maintain result quality, particularly in projects that generate information from the crowd, the project team has to identify potential noise in the data (Barbier, Zafarani et al. 2012) and personal biases (Schuurman, Baccarne et al. 2012; Trottier 2014). One solution is to follow the recommendation of Blohm, Leimeister et al. (2013), in which the crowd itself is involved in improving data quality, for example through data structuring and filtering tasks, or data evaluation and aggregation.
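As an illustration of crowd-based data evaluation and aggregation, the following Python sketch collapses redundant labels from several workers into one label per item by majority vote. Majority voting is not prescribed by the cited articles; it is only one common way to realize the kind of aggregation they recommend, and the item identifiers and labels are invented for the example.

from collections import Counter

def aggregate_by_majority(worker_labels: dict[str, list[str]]) -> dict[str, str]:
    """Collapse redundant worker labels into one label per item by majority vote."""
    aggregated = {}
    for item_id, labels in worker_labels.items():
        counts = Counter(labels)
        label, votes = counts.most_common(1)[0]
        # Flag items where no more than half of the workers agree; these are
        # candidates for an extra evaluation round by the crowd or the team.
        if votes <= len(labels) / 2:
            label = "NEEDS_REVIEW"
        aggregated[item_id] = label
    return aggregated

# Hypothetical labels collected from three workers per item.
labels = {
    "image-001": ["spiral", "spiral", "elliptical"],
    "image-002": ["elliptical", "spiral", "irregular"],
}
print(aggregate_by_majority(labels))
# {'image-001': 'spiral', 'image-002': 'NEEDS_REVIEW'}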

Crowd sourcing and project management
The author's search for current literature that discusses project management in crowdsourcing activities, or that reports real-life cases, returned very limited results. The process produced only two articles: (a) Ebner, Leimeister et al. (2009), who give a real-life example of how crowd sourcing can be used to gather ideas from the crowd and develop a framework for that purpose, and (b) Shao, Shi et al. (2012), who empirically examine the factors that influence the involvement of solvers in crowdsourcing, specifically in China. Despite the limited amount of literature that specifically discusses crowdsourcing in relation to project management, other articles collected in the previous searches briefly report or discuss findings on this topic. One of the challenges in crowd sourcing is finding the right people to work on a project (Franzoni and Sauermann 2014); this is a real challenge because there are many potential workers who could be chosen to finish a certain project task. Other challenges are how to divide the work into well-structured smaller tasks and integrate the results, and how to provide leadership on the project, such as making decisions and providing resources.

In terms of project management, Mason and Suri (2011) explain how an experiment should be conducted when the participants are contracted through a crowd sourcing website. One of their suggestions is to restrict the participant population, through a screening process, in order to obtain qualified workers. Kazai, Kamps et al. (2013) discuss how a task should be broken down into smaller units that are easier for workers to finish; such micro-tasks fit a crowd sourcing model such as that of Amazon's Mechanical Turk. Additionally, Blohm, Leimeister et al. (2013) recommend that the tasks be precise and understandable. Furthermore, Barbier, Zafarani et al. (2012) suggest five conditions under which crowd sourcing should not be employed for a certain project, and recommend overcoming coordination barriers. These conditions can help a project manager decide whether or not to adopt crowdsourcing for part of a project.

DISCUSSIONS
The literature collected in this study (see Table-1 and Table-2) shows that crowd sourcing is mainly used to gather innovative ideas on a particular topic (Chanal and Caron-Fasan, 2010; Wagner, 2011; Marjanovic, Fry et al. 2012; Schuurman, Baccarne et al. 2012; Blohm, Leimeister et al. 2013) or new product development ideas (Poetz and Schreier, 2012). Besides that, research projects use crowd sourcing to gather data (Behrend, Sharek et al. 2011; Shapiro, Chandler et al. 2013). Several articles also discuss the use of crowd sourcing to perform micro-tasks (Mason and Suri 2011; Kazai, Kamps et al. 2013; Thaler, Simperl et al. 2012). No article specifically discusses the adoption of crowd sourcing in software or information system development projects.

Gido and Clements (2012) explain that a project has attributes such as an objective, completion through a series of interdependent tasks, the utilization of resources, a completion period, a one-off character, clients and a degree of uncertainty. Related to crowd sourcing, the project attributes that can be affected by the adoption of this activity are the completion of interdependent tasks, resource utilization, the project life span and the degree of uncertainty. In terms of the series of interdependent tasks, a project manager can have the tasks completed by workers available on crowd sourcing platforms. However, attention should be paid to task complexity (Franzoni and Sauermann 2014; Kazai, Kamps et al. 2013): when tasks are given to workers, they should be well-structured and easy to understand.
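As a rough illustration of this decomposition step, the sketch below splits a large labeling job into small, uniformly described micro-tasks. The post_microtask() helper is hypothetical and stands in for whatever platform API a requester would actually call; the batch size and task wording are likewise only examples, not a procedure taken from the reviewed articles.

from typing import Callable

def split_into_microtasks(items: list[str], batch_size: int = 10) -> list[dict]:
    """Turn a large job into small, self-contained task specifications."""
    tasks = []
    for start in range(0, len(items), batch_size):
        batch = items[start:start + batch_size]
        tasks.append({
            "title": "Classify product images",
            # Precise, understandable instructions, in the spirit of
            # Blohm, Leimeister et al. (2013).
            "instructions": "For each image URL, choose exactly one category "
                            "from: electronics, clothing, other.",
            "items": batch,
        })
    return tasks

def publish(tasks: list[dict], post_microtask: Callable[[dict], str]) -> list[str]:
    """Post every micro-task and collect the platform-assigned task ids."""
    return [post_microtask(task) for task in tasks]

# Usage with a stand-in for a real platform client:
image_urls = [f"https://example.com/img/{i}.jpg" for i in range(25)]
task_ids = publish(split_into_microtasks(image_urls),
                   post_microtask=lambda task: f"task-{len(task['items'])}")
print(len(task_ids), "micro-tasks posted")  # 3 micro-tasks posted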

For resource utilization, the project manager should be aware that the workers may have no contract and no association with a company or organization (Barbier, Zafarani et al. 2012), and that they come from unverified skill backgrounds. Chanal and Caron-Fasan (2010) suggest that the incentive model affects people's contribution to a project. Moreover, choosing the right intermediary or platform to connect the project owner and innovation communities is important, as each platform offers different services and communities. Kazai, Kamps et al. (2013) observe the task conditions (pay, effort, and qualifying criteria) and show that higher payment leads to higher result quality, especially from qualified workers. They also recommend estimating the price per unit of effort in order to set the right pay for a given effort; this can help a first-time requester, as discussed by Mason and Suri (2011), to set the correct pay. Alternatively, in the auction-based crowd sourcing system designed by Satzger, Psaier et al. (2013), the requester does not have to specify the exact price but only sets a maximum price. Wagner (2011) discusses and gives tips on how to reward certain tasks done through crowd sourcing, especially to foster innovation.

To answer the question of whether to position the workers as another form of outsourcing or as a virtual project team, the decision can be based on the complexity of the tasks and the level of confidentiality. Work that has been broken down into smaller tasks can be outsourced to the workers on a crowd sourcing platform; such tasks may require no specific technical skill, for example image classification or product categorization. For tasks that require confidentiality, the recruited individual or group can be positioned as a virtual project team. However, there are organizational challenges that need to be considered when positioning the workers as a virtual project team (Chatfield, Shlemoon et al. 2013). By considering resource acquisition and positioning in this way, the degree of uncertainty can be minimized.

CONCLUSIONS
A literature review was conducted in the Scopus database to study the current view on crowdsourcing and project management. Crowdsourcing is already part of some completed projects reported in the literature gathered in this study. The collected literature gives insight into how crowdsourcing can leverage a project, which project attributes are affected by its adoption, and case studies in which the successes and failures of a project are discussed, and it eventually provides knowledge of the risks that crowdsourcing potentially poses for a project. However, the number of papers that mainly discuss crowdsourcing and project management is very limited in the Scopus academic database. Future literature reviews could use other academic databases, such as Web of Science or IEEE, to enhance the knowledge on this topic. Despite the effort to formulate an integrated crowdsourcing definition, further study of the different crowdsourcing models is needed in the near future; by understanding the models, a project manager can choose an appropriate crowdsourcing model that satisfies the project requirements. Research should also focus more on how the uncertainty of using crowdsourcing in a project can be diminished and project failure prevented.

REFERENCES

Amazon. (2014). "Amazon Mechanical Turk." Retrieved August 23, 2014, from https://www.mturk.com/mturk/welcome.

Barbier G., R. Zafarani, H. Gao, G. Fung and H. Liu (2012). "Maximizing benefits from crowdsourced data." Computational and Mathematical Organization Theory 18(3): 257-279.

Behrend T. S., D. J. Sharek, A. W. Meade and E. N. Wiebe (2011). "The viability of crowdsourcing for survey research." Behavior Research Methods 43(3): 800-813.

Blohm I., J. M. Leimeister and H. Krcmar (2013). "Crowdsourcing: How to benefit from (too) many great ideas." MIS Quarterly Executive 12(4): 199-211.

Chanal V. and M. L. Caron-Fasan (2010). "The difficulties involved in developing business models open to innovation communities: The case of a crowdsourcing platform." Management 13(4): 318-341.

Chatfield A. T., V. N. Shlemoon, W. Redublado and G. Darbyshire (2013). Creating Value through Virtual Teams: A Current Literature Review. 24th Australasian Conference on Information Systems. Melbourne.

Ebner W., J. M. Leimeister and H. Krcmar (2009). "Community engineering for innovations: The ideas competition as a method to nurture a virtual community for innovations." R and D Management 39(4): 342-356.

Estellés-Arolas E. and F. González-Ladrón-de-Guevara (2012). "Towards an integrated crowdsourcing definition." Journal of Information Science 38(2): 189-200.

Falagas M. E., E. I. Pitsouni, G. A. Malietzis and G. Pappas (2008). "Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses." The FASEB Journal 22(2): 338-342.

Franzoni C. and H. Sauermann (2014). "Crowd science: The organization of scientific research in open collaborative projects." Research Policy 43(1): 1-20.

Freelancer.org. (2014). "Hire Freelancer & Find Freelance Jobs Online." From https://www.freelancer.com/.

Gido J. and J. P. Clements (2012). Successful Project Management. Mason, OH, South-Western Cengage Learning.

Goodchild M. F. and J. A. Glennon (2010). "Crowdsourcing geographic information for disaster response: A research frontier." International Journal of Digital Earth 3(3): 231-241.

Howe J. (2006). "The Rise of Crowdsourcing." Retrieved August 23, 2014, from http://archive.wired.com/wired/archive/14.06/crowds.html.

InnoCentive. (2014). "Innovation Management | Crowdsourcing | Challenges | Competitions." From http://www.innocentive.com/.

Kazai G., J. Kamps and N. Milic-Frayling (2013). "An analysis of human factors and label accuracy in crowdsourcing relevance judgments." Information Retrieval 16(2): 138-178.

Marjanovic S., C. Fry and J. Chataway (2012). "Crowdsourcing based business models: In search of evidence for innovation 2.0." Science and Public Policy 39(3): 318-332.

Mason W. and S. Suri (2011). "Conducting behavioral research on Amazon's Mechanical Turk." Behavior Research Methods 44(1): 1-23.

Poetz M. K. and M. Schreier (2012). "The value of crowdsourcing: Can users really compete with professionals in generating new product ideas?" Journal of Product Innovation Management 29(2): 245-256.

Satzger B., H. Psaier, D. Schall and S. Dustdar (2013). "Auction-based crowdsourcing supporting skill management." Information Systems 38(4): 547-560.

Schuurman D., B. Baccarne, L. De Marez and P. Mechant (2012). "Smart ideas for smart cities: Investigating crowdsourcing for generating and selecting ideas for ICT innovation in a city context." Journal of Theoretical and Applied Electronic Commerce Research 7(3): 49-62.

Shao B., L. Shi, B. Xu and L. Liu (2012). "Factors affecting participation of solvers in crowdsourcing: An empirical study from China." Electronic Markets 22(2): 73-82.

Shapiro D. N., J. Chandler and P. A. Mueller (2013). "Using Mechanical Turk to study clinical populations." Clinical Psychological Science 1(2): 213-220.

Simula H. (2013). The Rise and Fall of Crowdsourcing? Hawaii International Conference on System Sciences. Hawaii: 2783-2791.

Thaler S., E. Simperl and S. Wölger (2012). "An experiment in comparing human-computation techniques." IEEE Internet Computing 16(5): 52-58.

Trottier D. (2014). "Crowdsourcing CCTV surveillance on the Internet." Information Communication and Society 17(5): 609-626.

Vivacqua A. S. and M. R. S. Borges (2012). "Taking advantage of collective knowledge in emergency response systems." Journal of Network and Computer Applications 35(1): 189-198.

Wagner E. B. (2011). "Why prize? The surprising resurgence of prizes to stimulate innovation." Research Technology Management 54(6): 32-36.

Way K. A., M. C. Ottenbacher and R. J. Harrington (2011). "Is Crowdsourcing Useful for Enhancing Innovation and Learning Outcomes in Culinary and Hospitality Education?" Journal of Culinary Science and Technology 9(4): 261-281.
