Crowdsourcing Lessons for Organizations

This article was downloaded by: [Peter Rosen] On: 18 April 2012, At: 02:23 Publisher: Taylor & Francis Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Journal of Decision Systems Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/tjds20

Crowdsourcing Lessons for Organizations Peter A. Rosen

Schroeder Family School of Business Administration, University of Evansville, 1800 Lincoln Avenue, Evansville, IN, 47722, USA Available online: 18 Apr 2012

To cite this article: Peter A. Rosen (2011): Crowdsourcing Lessons for Organizations, Journal of Decision Systems, 20:3, 309-324 To link to this article: http://dx.doi.org/10.3166/jds.20.309-324


Decision Making in a Web 2.0 Environment

Crowdsourcing Lessons for Organizations


Peter A. Rosen Schroeder Family School of Business Administration University of Evansville 1800 Lincoln Avenue Evansville, IN 47722, USA [email protected]

ABSTRACT. In the Web 2.0 era, business models have changed. This study explores the current use of the Web 2.0 phenomenon crowdsourcing as an avenue for idea generation and decision making. Crowdsourcing is a way of outsourcing tasks to communities of Internet users, typically for little or no compensation. It has been used in Haiti and Kenya to minimize the impact of natural disasters and political turmoil, by scholars to translate ancient texts, by not-for-profits to increase donations and awareness of their causes, and by large corporations such as Procter & Gamble, Boeing, Toyota, and BP to generate ideas for products and advertisements and to help solve business problems (Mullins, 2010; Falcioni, 2010; Cohen, 2010; Strom, 2010; Whitla, 2009). This article focuses on current organizational uses of crowdsourcing, along with an analysis of the pros and cons of its use in decision making. Suggestions are offered to help companies considering crowdsourcing avoid the mistakes that others have made in the past.

RÉSUMÉ. In the Web 2.0 era, business models have changed. This article explores the use of the Web 2.0 phenomenon known as "crowdsourcing" (externalisation ouverte) for idea generation and decision making. Crowdsourcing is a method of distributing tasks across Internet communities. In Haiti and Kenya, it has been used to minimize the impact of natural disasters. It has also been used to translate ancient texts, and large companies such as Procter & Gamble, Boeing, Toyota, and BP use it to generate ideas for new products and advertising and to solve organizational problems. This article examines companies' uses of crowdsourcing and its impact on decision making, and draws lessons for the future of crowdsourcing.

KEYWORDS: Crowdsourcing, Decision Making, Web 2.0.

MOTS-CLÉS: externalisation ouverte, prise de décision, web 2.0.

DOI: 10.3166/JDS.20.309-324 © 2011 Lavoisier, Paris

JDS – 20/2011. Decision Making in Web 2.0, pages 309 to 324


1. Introduction

The term "Web 2.0" was coined by Darcy DiNucci in 1999 and popularized by Tim O'Reilly at the O'Reilly Media Web 2.0 Conference of 2004. DiNucci predicted that the Web would be found not only on your computer, but on your television, your car's dashboard, your cellular phone, your hand-held gaming device, and even your microwave. She was correct on most counts. For DiNucci, this new way of using the Web, or Web 2.0 as she called it, "will be understood not as screenfuls of text and graphics, but as a transport mechanism, the ether through which interactivity happens" (DiNucci, 1999).

O'Reilly defined Web 2.0 as a collaborative, interactive, participative version of the Web in which individuals are no longer passive consumers of content that others have posted, but generate the content themselves. His formal definition of Web 2.0 is "the network as platform, spanning all connected devices; Web 2.0 applications [are] delivering software as a continually-updated service that gets better the more people use it, consuming and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an 'architecture of participation' and deliver rich user experiences" (O'Reilly, 2005).

Web 2.0 technologies include social networks, blogs, podcasts, folksonomies, wikis, RSS filters, mashups, virtual worlds, and crowdsourcing. All of these technologies make use of the Web more interactive and participative, and allow for greater communication between users. Web 2.0 is changing the way businesses operate, and its impact needs to be monitored as its technologies constantly change. In a comprehensive 2010 global study of organizational use of Web 2.0, the information security company McAfee found that Web 2.0 solutions were not yet perceived as critical, but that more organizations were exploring their use.
The real benefits of these technologies will be realized when adoption rates for organizations reach 90% (McAfee, 2010). Benefits realized by organizations using Web 2.0 technologies include increased revenue streams, improved communication and collaboration among employees, better customer service, enhanced marketing, and facilitated production and new product development (McAfee, 2010).

While wikis, blogs, and social networks are three of the most often used Web 2.0 technologies, crowdsourcing is starting to gain popularity. Jeff Howe and Mark Robinson coined the term in a 2006 Wired Magazine article, and Howe defined it as "the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call. This can take the form of peer-production (when the job is performed collaboratively), but is also often undertaken by sole individuals. The crucial prerequisite is the use of the open call format and the large network of potential laborers" (Howe, 2006).


2. Literature review

In this brief review of the crowdsourcing literature, two frameworks for crowdsourcing decision-making will be examined. In his seminal work on the topic, Surowiecki (2004) indicates that four conditions must be present for the decision-making process to be successful. The first is diversity of opinions, which means that the organization is using decision-makers whose opinions and skill sets may differ from those of its employee base. The second condition is independence, meaning that decision-makers do not interact with one another during the idea generation process. Independence lessens the impact of group decision-making biases such as groupthink, where decision-makers focus in on one solution without considering viable alternatives (Janis, 1982). Decentralization is the third condition that must be present for successful crowdsourcing efforts: information must be shared with the crowd of decision-makers instead of kept private within the organization, or the effort will likely fail. Finally, once a group of viable solutions has been generated, there must be some mechanism to aggregate the solutions for review. Once the crowd, or a group of internal experts, is presented with a list of viable alternatives, the final decision can be made and a collective decision is reached.

Bonabeau (2009) offers a model with many of the elements of Surowiecki's framework and suggests that the crowdsourcing decision-making process comes in two stages: the generation of possible solutions and the evaluation of those solutions. Each stage can be susceptible to decision-making biases that must be eliminated for the crowdsourcing effort to be successful.
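Surowiecki's aggregation condition can be made concrete with a minimal sketch (the function, data, and names below are illustrative, not taken from either framework): independent estimates are combined by a simple statistic, where the median resists the occasional wild guess that would distort a mean.

```python
# Illustrative sketch of aggregating independent crowd estimates,
# in the spirit of Surowiecki's aggregation condition. All names
# and numbers here are hypothetical.
from statistics import mean, median

def aggregate_estimates(estimates, method="median"):
    """Combine independent crowd estimates into one collective value.

    The median is robust to a few extreme guesses; the mean weights
    every contribution equally.
    """
    if not estimates:
        raise ValueError("no estimates to aggregate")
    return median(estimates) if method == "median" else mean(estimates)

# Independent guesses of some quantity (e.g., jellybeans in a jar);
# one guess is a wild outlier.
crowd = [870, 1050, 990, 3000, 940, 1010, 980]
print(aggregate_estimates(crowd))            # → 990 (median)
print(aggregate_estimates(crowd, "mean"))    # mean, pulled up by the outlier
```

The choice of statistic is itself a design decision for the aggregation mechanism: a median-style vote discounts extreme contributions, while a mean lets every opinion move the result.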
In the solution-generation stage, biases can occur such as social influence (the influence of others limits the range of decisions), anchoring (decisions all center around the first solution presented), and availability bias (the crowd settles on an acceptable, but not optimal, decision). These biases are akin to the problems that Surowiecki identified in the diversity of opinion, independence, and decentralization conditions of his model. To guard against them, Keim (2011) suggests that in the idea generation stage companies should obtain as many diverse opinions as possible, and that participants should not initially be allowed to see each other's ideas.

Before Bonabeau's second stage, idea evaluation, comes additive aggregation: collecting all of the ideas and finding a way to identify either the best solutions or an average of the solutions. In this stage, either the crowd in its entirety (if the expertise of the crowd is trusted) or identified experts can be used to reduce the set of all potential solutions to a smaller subset, which can then be evaluated. Again, this part of Bonabeau's model is consistent with Surowiecki's final condition of crowdsourcing success, the aggregation of ideas.

These frameworks fail to take into account a variety of factors that can influence the success of a project. For example, both Bonabeau and Surowiecki fail to address the incentives given to the crowd for its work. For large-scale, complex


problems like the Netflix Prize, large incentives might be required to get people with technical expertise to participate. For idea generation websites like MyStarbucks Idea, where the creator of an idea does not actually have to implement it, non-monetary awards and website fame might be enough of an incentive to generate positive contributions. Both frameworks address the aggregation of ideas at the idea selection stage, but neither addresses who is qualified to participate in crowdsourcing efforts. By better defining the minimum level of expertise necessary to participate, crowdsourcing programs can be made more successful.

In the following sections, an abbreviated list of companies using crowdsourcing will be presented. Next, two popular crowdsourcing efforts will be analyzed using Surowiecki's framework. A section on the benefits and drawbacks of crowdsourcing will follow, concluding with additional factors organizations should consider beyond what Surowiecki and Bonabeau present in their models.

3. Crowdsourcing examples

The list of organizations that have participated in crowdsourcing efforts is long. From not-for-profits, to Fortune 500 companies, to intermediaries that help firms and participants come together, many organizations are involved. The example below shows an abbreviated list from the crowdsourcingexamples.pbworks.com website, using the classification system of designer Anjali Ramachandran (2011).

1) Individual business:
– CreateMyTattoo – connects users with a thriving community of 700+ talented tattoo artists who compete to design the perfect custom tattoo.
– iStockphoto – the web's original source for user-generated, royalty-free stock photos, illustrations, video, audio and Flash. Whether a user is a designer, advertiser, entrepreneur or blogger, the site has millions of affordable images, vectors and clips.
– Threadless – T-shirt designs – members of the Threadless community submit t-shirt designs online; the designs are then put to a public vote. A small percentage of submitted designs are selected for printing and sold through an online store.

2) Brand initiatives that allow for customer customization:
– LEGO – Design By Me – allows users to create their own customized Lego sets.
– MojaMix – Premium Custom-Mixed Cereal – allows users to create their own cereal combinations with cereals, dried fruits, and nuts/seeds.
– Nike – NikeID – allows users to create their own customized shoes.


3) Brand-sponsored initiatives:
– BMW – Customer Innovation Lab – allows users to submit ideas to the company to improve its automobiles.
– Procter & Gamble – Connect & Develop – allows customers to submit product ideas to the company.
– Starbucks – My Starbucks Idea – allows users to share their ideas, tell the company what they think of other people's ideas, and join the discussion.

4) Brand-sponsored competitions/challenges:
– JetBlue – Story Booth – users submit travel stories and have a chance to win free flights.
– McDonald's – Globalcasting – users submitted stories and photos, and the winning stories were printed on the "to-go" bags.
– Netflix Prize – sought to substantially improve the accuracy of predictions about how much someone will enjoy a movie based on their movie preferences.

3.1. Popular crowdsourcing examples

Starbucks and Netflix have run popular crowdsourcing efforts. MyStarbucks Idea is an ongoing effort between the company and its loyal customers to improve the organization. The Netflix Prize was a limited-time contest designed to improve the company's movie recommendation system, with a large cash prize awarded to the winner. These examples will be analyzed using Surowiecki's framework.

3.2. MyStarbucks Idea

This brand-sponsored initiative gives Starbucks customers the chance to submit ideas across three major categories: product ideas, experience ideas, and involvement ideas. To date, approximately 115,000 ideas have been submitted to the website, the majority of them product suggestions (MyStarbucks Idea – Idea List, 2011). Ideas that users suggested and Starbucks implemented include paying with a smartphone instead of cash, a free beverage of choice on your birthday (for loyalty card holders), countless new coffee flavors, and making ice cubes out of coffee instead of water for better flavor in iced beverages. To date, over 150 ideas have been accepted and implemented (MyStarbucks Idea – 150 Ideas Launched, 2011). Essentially, a little over one in every one thousand ideas passes the approval of both the crowd and the company and finds its way into Starbucks stores.
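The acceptance rate quoted above is easy to verify from the figures in the text (variable names are illustrative):

```python
# Figures from the text above: ~115,000 ideas submitted, just over 150 launched.
ideas_submitted = 115_000
ideas_launched = 150

# Acceptance rate, expressed per thousand ideas submitted.
acceptance_rate = ideas_launched / ideas_submitted
per_thousand = acceptance_rate * 1000
print(round(per_thousand, 2))  # → 1.3, i.e. a little over 1 idea per 1000
```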


On its FAQ page, Starbucks makes it clear that once a participant submits an idea to MyStarbucks Idea, it becomes the property of Starbucks without any compensation: "The submission of your Idea to Starbucks is entirely voluntary, nonconfidential, gratuitous and non-committal. You grant to Starbucks and its designees a perpetual, irrevocable, non-exclusive fully-paid up and royalty-free license to use any ideas, expression of ideas or other materials you submit (collectively, "Idea") to Starbucks and Mystarbucksidea.com without restrictions of any kind and without any payment or other consideration of any kind, or permission or notification, to you or any third party" (MyStarbucks Idea – Idea FAQ, 2011). Starbucks is counting on the loyalty of its customers and the shot at website fame to get the crowd to generate ideas for its products and stores, and so far the effort has been a great success. While the lack of compensation is a weak incentive, the sheer number of ideas generated by the site points to its popularity.

Starbucks does a good job of getting a diversity of opinions from its large customer base. While official statistics for registered users of MyStarbucks Idea were unavailable, the Starbucks Facebook page has 27 million fans, so it is safe to assume that the Idea site has hundreds of thousands to millions of users (Starbucks Annual Report, 2010). Once a registered user has posted an idea, it becomes available to all other registered site users. This makes ideas less independent than they would be if all ideas were generated, gathered, and then posted at one time. This could introduce some of the biases that Bonabeau mentioned, but it is most likely not an important issue in the success of this website, given the sheer volume of ideas received. Starbucks does not share any privately held information with the crowd, so its decentralization efforts could be improved upon.
Again, this may not be an important factor in the success of the site, as idea generators typically have extensive experience with Starbucks stores and products. Starbucks' aggregation efforts have been widely criticized: ideas are not properly catalogued and reviewed, so an idea from the past can reappear on the website as a 'new' idea. The final decision-making authority rests with Starbucks Idea Partners – employees with area expertise – but the input of the crowd is weighed heavily in the decision-making process.

In summary, Starbucks could improve this effort by creating more independence of ideas, by sharing more corporate information, and by better aggregating ideas. The lack of compensation is also an issue that Starbucks needs to address to attract repeat participation and participants with more expertise. But despite the failure to strictly adhere to Surowiecki's framework, Starbucks has been able to improve its product offerings, store atmosphere, and packaging by adopting customer suggestions. Since no monetary award was given for the suggestions that were adopted, the crowdsourcing effort is well worth the time spent maintaining the website.


3.3. Netflix


The Netflix Prize is one of the most publicized crowdsourcing efforts. In the brand-sponsored contest category, the $1 million Netflix Prize was offered to any team that could improve Netflix's movie recommendation system, Cinematch, by 10%. The algorithm behind Cinematch tried to predict future movies that subscribers would like, based on their ratings of movies they had already watched. The idea was that if subscribers found the personal recommendations useful, they would be both more willing to rent future movies and more likely to tell their friends how good the service was. Netflix put actual data online for contest participants to work with. Three years after the contest started, a team of workers from AT&T Labs who called themselves "BellKor's Pragmatic Chaos" won the contest and was awarded the $1 million grand prize (Netflix Prize – Index, 2011).

A big reason why the contest received so much publicity was the value of the grand prize. Unlike many crowdsourcing initiatives, the incentive for this contest was large enough to attract real experts in mathematical modeling and computer programming. To its credit, Netflix also established clear rules about ownership of submitted ideas, who could participate, how submissions would be judged, and how participants would be compensated (Netflix Prize – Rules, 2011). The Netflix Prize generated entries from 51,051 contestants on 41,305 teams from 186 different countries (Netflix Prize – Leaderboard, 2011). This led to a diversity of opinions, and many different approaches were used to solve the problem of enhancing the customer recommendation system. Because teams were competing against each other for the $1 million grand prize, there was great incentive for the teams to keep their rating algorithms secret from each other, creating independence among the decision-makers.
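The contest's yardstick was root-mean-squared error (RMSE) on held-out ratings, with the grand prize requiring a 10% improvement over the Cinematch baseline. A minimal sketch of that scoring idea (the sample ratings below are made up; the baseline and target RMSE values of 0.9525 and 0.8572 are the widely reported contest figures, not taken from this article):

```python
# Sketch of the Netflix Prize scoring metric: RMSE against held-out
# ratings, and fractional improvement over a baseline system.
import math

def rmse(predicted, actual):
    """Root-mean-squared error between predicted and actual ratings."""
    return math.sqrt(
        sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)
    )

def improvement(candidate_rmse, baseline_rmse):
    """Fractional RMSE improvement of a candidate over the baseline."""
    return (baseline_rmse - candidate_rmse) / baseline_rmse

actual = [4, 3, 5, 2, 4]               # held-out star ratings (1-5), made up
predicted = [3.8, 3.1, 4.6, 2.4, 4.2]  # a hypothetical model's predictions

print(round(rmse(predicted, actual), 3))    # → 0.286
# Would the winning-level RMSE of 0.8572 beat the 0.9525 baseline by 10%?
print(improvement(0.8572, 0.9525) >= 0.10)  # → True
```

Because the metric is a single objective number, ranking thousands of competing teams on a public leaderboard required no subjective judging at all.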
Actual Netflix customer data, stripped of its identifying features, was used in the contest, so participants had the same information as the company employees who were presumably also trying to improve the recommendation system. This fits perfectly with the idea of decentralized information as described by Surowiecki. Finally, aggregation of the best ideas was done on the Netflix Prize Leaderboard: while the algorithms that produced the best results were kept secret, all contestants could see which teams had generated the best performing ideas. Since the data were numerical, subjective opinions of the crowd were not an issue in this case. All of these factors, combined with a huge monetary incentive, made this a very successful crowdsourcing effort.

4. Why crowdsourcing?

4.1. Organization perspective

One of the major advantages that organizations can achieve by using the power of the crowd is "a very large community of potential workers who have a diverse


range of skills and expertise and who are willing and able to complete activities within a short time-frame and often at a much reduced cost as compared to performing the task in-house" (Whitla, 2009). Research has found that the collective intelligence of large groups is equal or superior to the intelligence of even an elite few (Leimeister et al., 2009). Large-scale problems, even multi-generational problems such as oil dependence and global warming, may be solved by the power of the crowd and the collaboration and cooperation it brings with it. The masses may generate more and better ideas than any individual, and should be harnessed for solving the grand problems that society faces (Winsor, 2009). An example is the transcribing of historical documents by amateurs to get long-lost works of scholarship into digital form so that they can be read by the masses (Cohen, 2010). This is a painstaking process that has taken decades of work; such problems can be solved more quickly with the help of the crowd.

Small to mid-sized companies with a small employee base to draw from can get help from a seemingly unlimited pool of labor resources and ideas when posing a problem to the crowd. When ideas come without the need to pay fixed salaries, benefits, and overhead, the crowd is an attractive alternative to hiring full-time employees (Belsky, 2010). Customers, for their part, are demanding input into the organizations they are loyal to. The ability to co-create new ad campaigns and products, and to inspire innovation at the companies they purchase from, is intriguing for many, and could lead to greater loyalty in the future (Winsor, 2009). This shortening of the innovation life cycle is a huge advantage for companies, especially those in the technology sector, where constant innovation is required to stay competitive (Andriole, 2010).

4.2. Participant perspective

Leimeister et al.
(2009) attempted to explain why people participate in crowdsourcing ventures and created a framework of motivations and incentives for joining crowdsourcing initiatives. The reasons they propose are the ability to learn new skills, the compensation participants receive, the ability to self-market or promote oneself to potential employers, and the social gain of appreciation or fame from the crowdsourcing company and from other participants. In a down economy where unemployment rates in the United States are nearly 10%, unemployed workers have a chance to be productive and earn money by entering crowdsourcing contests (Bureau of Labor Statistics, 2011; Winsor, 2009). Some of these people might be considering career changes but are unsure whether they have abilities in certain fields. Crowdsourcing allows those


individuals to test a field of employment to determine if a career in that field is a good fit (Moriarty, 2010). Finally, crowdsourcing allows participants to become entrepreneurs, to have an outlet for their creative energies, to learn new skills, to stay involved in hobbies, and to explore jobs that they are passionate about (Brabham, 2008; Winsor, 2009).


5. Downside of crowdsourcing

From the crowdsourcing literature, there appear to be four categories of problems associated with this concept: the quality of the ideas generated, insufficient (or no) monetary compensation for ideas, a lack of community among crowdsourcers, and a lack of diversity among crowdsourcers. A discussion of each follows.

5.1. Crowdsourcing quality

Far and away the biggest drawback identified in the crowdsourcing literature is the quality of the ideas generated by participants. Roman (2009) indicates that crowdsourcing does not help cluster or sort through information, and in fact does the opposite. The sheer number of poor ideas generated leads to vast amounts of noise that may be of little relevance to the company looking for quality ideas (Whitla, 2009). This can lead to choice fatigue for organizations, and fear of bad advice from amateurs is leading to a return to using experts in the decision-making process (Dokoupil, 2008).

A few researchers mention the dichotomy of the "wisdom of the crowd" vs. "mob rule." Prior to trying crowdsourcing, companies cannot determine whether the crowd will be wise or stupid. In crowdsourcing, the popular ideas, not the most accurate ones, often rise to the top, leading some to believe that the ideas produced are the antithesis of correctness and accuracy (Roman, 2009; Whitla, 2009). Belsky (2010) complains that most of the people who participate in crowdsourcing are amateurs who produce poor quality work when compared with professionals in the same industry. Dokoupil (2008) describes an idea he terms Web 3.0, which runs contrary to the democratic Web 2.0, in which experts are required to verify the content of ideas posted to the Web so that information is accurate. This would help sites like Wikipedia, which constantly has to defend itself against claims that information posted on its site is inaccurate. Finally, not all projects can be crowdsourced.
In scientific research, where accuracy is paramount, the scientific wisdom of the masses is an oxymoron. Science education in the United States has been shown to be deficient, and letting the masses solve scientific problems could be dangerous. Experts are needed in these types of situations (Roman, 2009).


5.2. Monetary compensation for crowdsourcers

There is no set way for companies to compensate those working on their behalf. In some models of crowdsourcing, a contest is created whereby a winner or set of winners receives a prize and anyone not selected gets nothing; many participants thus receive no compensation at all for their work (Belsky, 2010). When viewed from a dollars-per-hour perspective, "proportionately, the amount of money paid to the crowd for high-quality labor relative to the amount the labor is worth in the market represents a slave economy" (Brabham, 2008). Crowdsourcing participants are paid like independent contractors and earn far less than professionals in the industry would be paid for the same work, with no paid benefits like insurance and 401(k) programs (Brabham, 2008; Berkus, 2009). Experts in a field, the very people companies want participating in idea generation, often avoid crowdsourcing opportunities because they either already have high-paying careers in the industry or are being paid as expert consultants, making far more money than companies pay crowdsourcers.

5.3. Crowdsourcing community

Crowdsourcing has often been compared to another concept, "open source". In typical open source communities, such as the one around the Linux operating system, the community of workers communicates and collaborates to generate better ideas and products. Crowdsourcing participants, by contrast, are in direct competition with each other, eliminating the benefits of collaboration that could come from a community (Berkus, 2008). The next question about crowdsourcing communities is whether participants are allowed to participate equally. An example is Wikipedia, the online encyclopedia, where anyone can post or edit an entry, as long as the entry passes a panel of experts.
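The question of equal participation can be made measurable. A minimal sketch (the data, function, and names below are hypothetical) of computing how concentrated contributions are among a community's editors:

```python
# Illustrative sketch: measure what share of all edits comes from the
# top fraction of editors. Data and names are hypothetical.
from collections import Counter

def top_share(edit_counts, top_fraction=0.01):
    """Fraction of all edits made by the top `top_fraction` of editors."""
    counts = sorted(edit_counts.values(), reverse=True)
    k = max(1, int(len(counts) * top_fraction))
    return sum(counts[:k]) / sum(counts)

# Hypothetical edit log: 99 occasional editors plus one heavy contributor,
# so the "top 1%" is a single editor out of 100.
edits = Counter({f"user{i}": 1 for i in range(99)})
edits["power_user"] = 99

print(top_share(edits))  # → 0.5: the top 1% of editors made half the edits
```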
It has been found that at Wikipedia, 1 percent of users make 50 percent of the edits, and the entries of that 1 percent last longer than those of other participants (Dokoupil, 2008). The final problem with crowdsourcing communities is the phenomenon Chevrolet faced when it asked consumers to create 30-second videos for its Tahoe SUV. The crowd produced unsuitable videos, leading to what Wired Magazine's Mark Robinson calls "crowdslapping": Chevy asked for advertisements about the Tahoe, and instead the entries contained political statements questioning the policies of President Bush, statements against gas-guzzling vehicles, or parodies that did not fit the theme of the Tahoe (Brabham, 2008).

5.4. Diversity of crowdsourcing ideas

While crowdsourcing offers companies the ability to reach and obtain ideas from an Internet-wide audience, some question how wide that audience really is. Most people


on the Web are likely to be under 30, white, upper to middle class, English-speaking, educated, and equipped with high-speed Internet connections. Brabham (2008) warns against assuming that "ideas emerging from crowdsourcing applications are theorized to represent an ascendence of the superior idea through democratic process." Because of the lack of diversity in the crowd, optimal solutions may not be reached and crowdsourcing projects could fail. Roman (2009) indicates that research shows crowdsourcing favors popular opinion and therefore reinforces homogeneity, which does not lead to diverse, unconventional, or idiosyncratic views. Brabham (2008) also notes that solutions generated from crowdsourcing are measured by the standards of the crowdsourcing company, or by the opinions of a homogeneous crowd, so alternative solutions that run contrary to popular belief will likely lose out.

5.5. Other issues

Currently crowdsourcing is not widely accepted by most companies, but that is expected to change. Andriole (2010) surveyed nearly 100 organizations on their use of Web 2.0 technologies and found that only 6% used crowdsourcing internally with their own employees to generate enterprise-wide ideas, and only 4% used external crowdsourcing to aid in the decision-making process. Entire industries are facing the pressure of the crowd: professionals in creative pursuits such as advertising, photography, and journalism are threatened by the work of amateur crowdsourcers, who sell their ideas for much less than the professionals do (Berkus, 2009). Whitla (2009) indicates that organizations considering crowdsourcing can face three major problems: legal, ethical, and privacy issues. There could be disputes over who owns the ideas generated by the crowd for the company unless legal issues are properly addressed in advance. Ethical issues are similar to those faced by organizations that choose to outsource.
Can companies live with the idea of eliminating loyal employees in exchange for hiring low-wage crowdsurfers? A negative impression of these organizations may be formed by current and future employees if this practice becomes too prevalent. Finally, many projects contain proprietary or secret data that competitors might desire. In these cases, organizations might benefit from keeping project in-house. 6. Advice to improve crowdsourcing efforts There are four areas where organizations can improve their crowdsourcing efforts, including project definition, properly defining who from the crowd can participate in the endeavor, communication with crowdsourcing partners, and compensation for participants. A discussion of each follows:

JDS – 20/2011. Decision Making in Web 2.0

6.1. Project definition


For participants to want to join an organization’s crowdsourcing effort, clear project parameters must be established before the project begins. The first thing an organization needs to consider is whether it will run the project itself or work with a crowdsourcing provider that has access to a particular audience type. InnoCentive is an example of a crowdsourcing provider that brings together organizations with projects and crowdsourcing participants, and has the goal of “harnessing collective brainpower around the world to solve problems that really matter” (InnoCentive.com, 2011).

Companies should next focus on the task that is being sourced. A very clear definition of the job that participants will do is important for getting useful input (Rich, 2010). The scope of the project must be included in that definition so that participants know when the job starts and stops. For example, the company Unilever had clear instructions for its advertising campaign competition: participants had to submit high-quality films, with topics that were representative of the brands Unilever produces and that enhanced the value and recognition of those brands (Campaign, 2010).

Ownership of the output of the crowdsourcing effort should also be defined up front. Does the participant get to keep his or her ideas for later use, or does the company own the final output? If this is not settled in advance, legal issues can arise, and everyone involved will be dissatisfied with the crowdsourcing experience (Belsky, 2010). Finally, companies should consider in advance, and inform the crowd of, what will occur if the ideas submitted by the crowd do not match the vision of the organization (Zuk, 2010).

6.2. Who can participate?

According to the Internet World Stats website, there are nearly two billion Internet users (2010). While one of the drawbacks of crowdsourcing appears to be a lack of diversity among its participants, companies theoretically have access to an Internet-wide crowd of ideas. But who should companies turn to when they are seeking ideas and labor? One perspective is that companies should open their initiatives to as many people as possible, as studies have shown that many crowdsourcers stop participating after one or two attempts (Yang et al., 2008). Winsor (2009) agrees that companies must strike a delicate balance between full participation from the crowd and soliciting entries only from those who will understand the business objectives of the organization seeking help and focus their work in that area.

One suggestion to improve the quality of the work from the crowd, while still maintaining an open environment for participation, is to take a filtered approach to who submits ideas. Belsky (2010) and Dokoupil (2008) want companies to consider


just using experts in the field as participants, but that drastically reduces the number of people involved. To maintain openness, companies can instead create qualification tests to reduce the number of participants who can contribute to the project, then allow those who pass the tests, a mini-crowd of qualified people, a chance to submit ideas in exchange for some monetary compensation (Belsky, 2010; Alonso et al., 2008). This will make both companies and participants more satisfied: companies will get work from more qualified candidates, and participants will increase the odds that their work will be compensated. If this mini-crowd is then given multiple projects in the same arena, from the same company, the quality of the submissions will improve as the participants learn more about the company and can better match its needs (Belsky, 2010).

6.3. Communication with the crowd

Once clear guidelines for the project have been defined, it is suggested that communication between the crowd and the company be an ongoing process, not something that occurs only at the beginning and end of the project (Rich, 2010). Winsor (2009) agrees, suggesting that there needs to be an ongoing dialogue, a give and take, between the crowd and the company; this can only improve the quality of the ideas generated. The more the crowd understands about the company and its project, the better the ideas will be. Zuk (2010) also reminds companies to ensure the privacy of the crowd during the process, announcing winners only at the end. He further suggests that to get the best out of the crowd, some company secrets will need to be shared. There is a delicate balance to strike between communicating to the crowd the information they need in order to submit quality ideas and maintaining company secrets (Zuk, 2010).

6.4. The economics behind the crowd

A clear advantage of crowdsourcing is the ability of the company to harness the ideas of the crowd for little monetary compensation, which we have already mentioned, and which Brabham (2008) compared to slave labor. Companies will improve their crowdsourcing efforts if compensation is commensurate with the work done by those whose ideas are adopted. Rich (2010) indicates that companies get what they pay for, so to attract higher quality work, compensation needs to be improved. Yang et al. (2008) suggest that rewards should match the effort given to the task, and that to attract quality participants who will return for future projects, both participants and winners need to be compensated. Belsky (2010) agrees with creating a group of qualified participants that a company can use many times, and argues that this model of crowdsourcing should focus on long-term success, not short-term greed. Alonso et al. (2008) also remind companies that project winners should be


compensated only when the task is complete, and not for the idea itself or for only part of the finished product.

If companies invest time up front in project definition and in deciding who from the crowd can participate, maintain open communication channels with crowdsourcing partners, and compensate participants properly, crowdsourcing can be a satisfying experience for both company and entrants, and will lead to great benefits for both.


7. Summary and contribution

Managing successful crowdsourcing efforts can be difficult for companies. Using the frameworks presented by Surowiecki and Bonabeau may not be enough to ensure success. This paper suggests that companies need to add four additional factors to these existing models: project scope, the proper level of participant expertise, proper communication with participants, and proper incentives.

Acknowledgements

The author would like to thank Brooke Rosen, Elijah Rosen, Hannah Rosen, Dr. Ira G. Rosen, Laura Rosen, Leah Rosen and Trent Rosen for their patience and wisdom.

8. References

Alonso O., Rose D.E., Stewart B., “Crowdsourcing for relevance evaluation”, ACM SIGIR Forum, Vol. 42, No. 2, 2008, p. 9-15.

Andriole S.J., “Business impacts of Web 2.0 technologies”, Communications of the ACM, Vol. 53, No. 12, 2010, p. 67-79.

Belsky S., “Crowdsourcing is broken: How to fix it”, Bloomberg Businessweek, 2010, January 27. Retrieved January, 2011 from http://www.businessweek.com/innovate/content/jan2010/id20100122_047502.htm

Berkus J., Never say crowdsourcing, 2009, September 29. Retrieved January, 2011 from http://it.toolbox.com/blogs/database-soup/never-say-crowdsourcing-34331.

Bonabeau E., “Decisions 2.0: The Power of Collective Intelligence”, Sloan Management Review, Vol. 50, No. 2, 2009, p. 45-52.

Brabham D.C., “Crowdsourcing as a model for problem solving”, Convergence: The International Journal of Research into New Media Technologies, Vol. 14, No. 1, 2008, p. 75-90.

Bureau of Labor Statistics, Employment situation summary, 2011, January. Retrieved January, 2011 from http://www.bls.gov/news.release/empsit.nr0.htm


Campaign, “Unilever crowdsourcing effort divides opinion”, 2010, November 5. Retrieved January, 2011 from http://www.campaignlive.co.uk/news/login/1040669/.

Cohen S., “Scholars recruit public for project”, New York Times, 2010, December 27. Retrieved January, 2011 from http://www.nytimes.com/2010/12/28/books/28transcribe.html

DiNucci D., “Fragmented future”, Print, Vol. 53, No. 4, 1999, p. 32.

Dokoupil T., “Revenge of the experts”, Newsweek, 2008, March 6. Retrieved January, 2011 from http://www.newsweek.com/2008/03/05/revenge-of-the-experts.html

Falcioni J., “Project crowdsourcing”, Mechanical Engineering, 2010, December. Retrieved January 7, 2011 from http://memagazine.asme.org/Articles/2010/December/Editorial.cfm


Howe J., “The rise of crowdsourcing”, Wired, 2006, June. Retrieved January, 2011 from http://www.wired.com/wired/archive/14.06/crowds.html

Internet World Stats, “World internet usage statistics news and world population stats”, 2011. Retrieved January, 2011 from http://www.internetworldstats.com/stats.htm

Janis I., Groupthink: Psychological studies of policy decisions and fiascoes, Cengage, Boston, MA, 1982.

Keim B., “Sharing information corrupts wisdom of crowds”, Wired, 2011, May 16. Retrieved August, 2011 from http://www.wired.com/wiredscience/2011/05/wisdom-of-crowds-decline/

Leimeister J.M., Huber M., Bretschneider U., Krcmar H., “Leveraging crowdsourcing: Activation-supporting components for IT-based ideas competitions”, Journal of Management Information Systems, Vol. 26, No. 1, 2009, p. 197-224.

McAfee Inc., Web 2.0 – A Complex Balancing Act, 2010. Retrieved January, 2011 from http://newsroom.mcafee.com/images/10039/Web2report.pdf

Moriarty G.L., “Psychology 2.0: Harnessing social networking, user-generated content, and crowdsourcing”, Journal of Psychological Issues in Organizational Culture, Vol. 1, No. 2, 2010, p. 29-39.

Mullins J., “Haiti gets help from net effect”, New Scientist, 2010, January 30. Retrieved January, 2011 from http://www.newscientist.com/article/mg20527453.600-how-crowdsourcing-is-helping-in-haiti.html.

Netflix Inc., “Netflix prize index”, 2011. Retrieved August, 2011 from http://www.netflixprize.com//index.

Netflix Inc., “Netflix prize leaderboard”, 2011. Retrieved August, 2011 from http://www.netflixprize.com//leaderboard.

Netflix Inc., “Netflix prize rules”, 2011. Retrieved January, 2011 from http://www.netflixprize.com//rules.

O’Reilly T., “What is web 2.0 - Design patterns and business models for the next generation of software”, O’Reilly Network, 2005. Retrieved January, 2011 from http://oreilly.com/lpt/a/6228


Ramachandran A., “Crowdsourcing examples”, 2011. Retrieved January, 2011 from http://crowdsourcingexamples.pbworks.com/w/page/16668404/FrontPage.

Rich L., “Tapping the wisdom of the crowd”, New York Times, 2010, August 4. Retrieved January, 2011 from http://www.nytimes.com/2010/08/05/business/smallbusiness/05sbiz.html

Roman D., “Crowdsourcing and the question of expertise”, Communications of the ACM, Vol. 52, No. 12, 2009, p. 12.

Starbucks Corporation, “MyStarbucks idea – 150 ideas launched”, 2011. Retrieved August, 2011 from http://www.starbucks.com/blog/150-ideas-launched/1038


Starbucks Corporation, “MyStarbucks idea – FAQs”, 2011. Retrieved January, 2011 from http://mystarbucksidea.force.com/ideafaq

Starbucks Corporation, “MyStarbucks idea – Idea list”, 2011. Retrieved August, 2011 from http://mystarbucksidea.force.com/ideaList

Starbucks Corporation, “Fiscal year 2010 annual report”, 2011. Retrieved August, 2011 from http://phx.corporate-ir.net/External.File?item=UGFyZW50SUQ9NzkzODl8Q2hpbGRJRD0tMXxUeXBlPTM=&t=1

Strom S., “Social networks meant for social good, but at a price”, New York Times, 2010, December 19. Retrieved January, 2011 from http://www.nytimes.com/2010/12/20/business/20charity.html

Surowiecki J., The Wisdom of Crowds, Doubleday, New York, NY, 2004.

Whitla P., “Crowdsourcing and its applications in marketing activities”, Contemporary Management Research, Vol. 5, No. 1, 2009, p. 15-28.

Winsor J., “Crowdsourcing: What it means for innovation”, Bloomberg Businessweek, 2009, June 15. Retrieved January, 2011 from http://www.businessweek.com/innovate/content/jun2009/id20090615_946326.htm

Yang J., Adamic L.A., Ackerman M.S., “Crowdsourcing and knowledge sharing: Strategic user behavior on Taskcn”, Proceedings of EC’08 Conference, Chicago, IL, 2008, p. 246-255.

Zuk R., “By popular demand: Crowdsourcing your audience for innovation”, Public Relations Society of America, 2010. Retrieved January, 2011 from http://www.prsa.org/Intelligence/Tactics/Articles/view/8732/101/Crowdsourcing_your_audience_for_innovation