Look before you leap: Legal pitfalls of crowdsourcing
Proceedings of the American Society for Information Science and Technology; 2011; Wiley; Volume 48, Issue 1; Language: English
DOI: 10.1002/meet.2011.14504801135
ISSN: 0044-7870
Authors: Stephen M. Wolfson, Matthew Lease
Topic: Privacy, Security, and Data Protection
Proceedings of the American Society for Information Science and Technology, Volume 48, Issue 1, p. 1-10. Stephen M. Wolfson (swolfson@law.utexas.edu) and Matthew Lease (ml@ischool.utexas.edu), School of Information, University of Texas. First published: 11 January 2012. https://doi.org/10.1002/meet.2011.14504801135

ABSTRACT

Crowdsourcing is quickly shifting the traditional landscape of how people work, invent, fund new enterprises, and create new artistic works. Such innovative shifts often run ahead of the law and raise new legal questions that may not yet have very definite answers. This paper considers five such legal issues that the crowdsourcing community (providers and customers) should discuss, both to inform their own practice and to advise future policy. Specifically, we consider employment law, patent inventorship, data security and the Federal Trade Commission, copyright ownership, and securities regulation of crowdfunding.
Ultimately we offer three practical suggestions for crowdsourcing: be mindful of the law, define relationships in advance, and be open and honest with crowdworkers. While we limit the scope of legal regulation considered, we hope to provide a useful introduction to several areas where crowdsourcing and the law may (soon) intersect, and to offer some insight on how a court or lawyer may view them.

INTRODUCTION

In 2006, Jeff Howe (2006) identified a trend: companies were shifting jobs that had formerly been assigned to an employee or a contracted worker, and instead distributing them to large groups of people. As James Surowiecki recognized in The Wisdom of Crowds (2005), large groups can solve some tasks more effectively and accurately than individuals, and businesses were beginning to apply such thinking. Howe called this "crowdsourcing."

Even before Howe coined the term, crowdsourcing had been changing the way people think about conducting work. New platforms seem to develop daily, allowing businesses to connect with and distribute various tasks to multitudes of prospective workers. Amazon Mechanical Turk (www.mturk.com), oDesk (www.odesk.com), Crowdspring (www.crowdspring.com), Kickstarter (www.kickstarter.com), and many others all help people use the power of the crowd in various ways. As the crowdsourcing industry grows and diversifies, however, it seems increasingly likely that it will experience legal regulation.

This paper discusses several areas of the law that will likely impact crowdsourcing in the future. We begin with employment law. Both federal and state laws stipulate how "employers" must treat "employees." We especially discuss the federal Fair Labor Standards Act, which guarantees anyone who qualifies as an "employee" protections such as a minimum wage and overtime pay. Next, we consider inventorship issues under patent law. One way to use crowdsourcing is for the research and development of patentable inventions.
However, joint inventorship issues naturally arise where multiple people work on an invention. Under the law, all inventors must be included on a patent application; if they are not, the patent might be rendered unenforceable. Thus, anyone who wants to use the crowd for research and development must consider joint inventorship issues and act accordingly.

Regarding innovation, more companies are now beginning to tap into "wisdom of crowds" based innovation by sharing more customer data in new ways, e.g. for academic research or with open-innovation providers like InnoCentive (www.innocentive.com). Unfortunately, developing effective methods for protecting customer privacy remains an open research problem (Narayanan & Shmatikov, 2008), with some highly visible recent failures (Barbaro & Zeller Jr., 2006; Ohm, 2010). The Federal Trade Commission (FTC) has recently begun acting aggressively to protect consumers from such data breaches caused by commercial entities. Companies should be mindful of this.

Crowdsourcing can also be an effective way to source and/or develop creative works. Yet because of copyright's works made for hire and joint works provisions, crowdsourcers can easily lose control over crowd-developed creative works if they do not pay attention to these doctrines. Thus, anyone interested in crowdsourcing creative design must carefully consider copyright law.

Finally, with its wide-reaching potential, crowdsourcing can be a very effective way to raise money for various projects through crowdfunding (e.g. the Obama presidential campaign). U.S. law, however, regulates certain transactions, and crowdfunders who raise money incorrectly could run into problems with securities laws. In fact, on April 6, 2011, the chair of the Securities and Exchange Commission (SEC) announced that the agency would consider changes to its regulations affecting crowdfunding. Potential crowd-based fundraisers should be aware of such issues to avoid legal scrutiny.
While we cannot hope to provide a complete survey of legal questions faced by crowdsourcing, we do introduce several legal issues and offer some insights into how lawyers and/or courts may (soon) think about the concerns that crowdsourcing raises. Ultimately we offer three practical suggestions: be mindful of the law, define relationships in advance, and be open and honest with crowdworkers.

BACKGROUND

As Howe (2006) saw it, crowdsourcing takes the ideals of open sourcing and applies them outside of software development. However, crowdwork today includes many more labor models than Howe's definition seems to imply. Brent Frei (2009) divides paid crowdwork into four categories, moving from the simplest to the most complex.

At the simple-work end of the spectrum, there are Micro Tasks, which are small, easy, and tend to be distributed in high volume for very little compensation. These jobs are often as basic as image tagging. Next there are Macro Tasks, which also tend to be high volume and low pay, but require more skill and effort, like writing simple product reviews. In the first two categories, employers generally do not need to direct or communicate with their workers much, if at all.

Moving to more substantial work, Frei identifies Simple Projects, which are lower volume, higher pay, and require more skill and time commitment. These jobs are often tasks like basic website design or creating outlines for presentations. Finally, Complex Tasks are the most difficult form of crowdwork. These jobs require specialized skills and a significant time commitment from the workers. Moreover, they are usually high-paying, single-project jobs, like designing software modules. These latter two categories typically require employers to communicate with and direct their workers more than with the simpler tasks. Crowdsourcing systems like Amazon Mechanical Turk (Mturk) fall on the Micro Tasks end of this spectrum (Frei, 2009).
Mturk is a bulletin-board-like website that allows "Requesters" (employers) to post "Human Intelligence Tasks" (HITs) which "Providers" (workers) can accept and complete. Generally, HITs require little time to complete and return very little pay; each can pay as little as $0.01 (Felstiner, 2009). Indeed, Ross et al. (2010) found in a study of Mturk worker demographics that Providers earn an average of less than $2.00 per hour. Furthermore, Requesters tend to be relatively hands-off with their workers. Dow and Klemmer (2011) write that Requesters and Providers are normally anonymous to each other; there is little direct interaction between them. Employers often treat workers as merely "interchangeable replacements for computational processes" (p. 1). Workers have even been termed "human processing units" (HPUs), a new functional component of computer architecture to complement the central processing unit (CPU) (Davis et al., 2010).

Moving to the more complex end of the spectrum of crowdwork, systems like oDesk allow employers to connect with highly skilled workers to complete much more substantial types of jobs. In a survey of oDesk workers, Brett Caraway (2010) found that oDesk more closely resembles a traditional work environment than the anonymous workforce on Mturk. First, oDesk allows employers to distribute work for hourly pay rather than as fixed-price, single-task contracts. Second, the platform encourages employers to communicate with, direct, and supervise their workers more. Through oDesk's "Team Application" software, employers can monitor their workers' keystrokes and mouse clicks, and even take screenshots and webcam pictures while they are working. Caraway (2010) writes that oDesk workers feel that they are held accountable for their work. Meanwhile, they earn substantially more money on oDesk than Providers can on Mturk. oDesk reports that its workers commonly make between $10 and $25 per hour on its platform ("oDesk").
Importantly, crowdwork is more than a niche labor market. Mturk, oDesk, and other crowdwork platforms represent a significant and growing number of workers and amount of money. In 2009, Mturk and oDesk had 200,000 and 331,000 registered users respectively (Frei, 2009). Further, from 1999–2009, workers across ten crowdsourcing companies earned a gross of $750,000,000. Looking at Mturk specifically, Ipeirotis (2010) found that, from January 2009 through April 2010, 9,436 Requesters posted a total of 6,701,406 HITs, for a total value of at least $529,259. Since this study did not capture redundant HITs and may have missed many short-lived HITs, the actual sum of money that changed hands is likely far greater. oDesk reports that employers spent more than $15,000,000 on online work in April 2011 and that over 2,000 people join its workforce daily ("oDesk").

The crowdwork market offers advantages to workers and employers alike. For employers, crowdwork offers a highly scalable workforce of on-demand labor that they can easily tap into with low transaction costs. Meanwhile, workers can profit from their "spare cycles" or, in the case of platforms like oDesk, use their specialized skills (Felstiner, 2010). Moreover, with unemployment at 9% (Bureau of Labor Statistics, 2011) and very high underemployment (Newport and Muller, 2011), it is likely that more people will consider crowdwork in the future when looking for supplementary or primary incomes. As the crowdlabor market grows in profile and importance, however, legal regulation seems increasingly likely.

EMPLOYMENT LAW

One area where crowdsourcing could clearly intersect with the law is labor and employment law. With a substantial labor market, and numerous platforms enabling various types of work, the crowdsourcing industry could face federal and/or state regulation over employment practices in the near future.
While crowdlabor has many benefits, anyone considering it must be aware of the potential consequences of having crowdworkers as "employees." In the United States, both state and federal laws put restrictions on employers to protect employees from harm. Since a complete survey of employment and labor law is outside the scope of this paper, we focus on the federal Fair Labor Standards Act (FLSA). Not only is this an important law in its own right, but it also helps elucidate how other, similar regulations may work.

The Fair Labor Standards Act

In 1938, Congress passed the Fair Labor Standards Act in response to declining wages caused by the Great Depression. "Low wages perpetuated a downward economic spiral," and the federal government decided to step in rather than let the market fix its own problems (Cherry, 2009). With the FLSA, it established things like the federal minimum wage (currently $7.25 per hour), overtime protection, and special rules for child workers.

Before the FLSA can apply, however, the parties in a potential employment situation must be "employers" and "employees" within the meaning of the statute. Generally, employers fall under the FLSA if they conduct interstate business or generate more than $500,000 in yearly gross revenue (United States Department of Labor, 2009). Unfortunately, the act is unclear about who qualifies as an "employee." Struggling with this uncertainty, courts have developed several tests to determine whether someone is an employee under the FLSA. For example, the Common Law test looks at how much control the employer has over the worker's work. Meanwhile, the Economic Reality test focuses on the economic relationship between the worker and the employer and the degree of financial dependency between them (Smith, Hodges, Stabile, & Gely, 2009). Courts applying the FLSA most commonly consider seven factors to determine employment status (Felstiner, 2010).
No single factor is determinative, but all must be weighed:

1. How integral the work is to the employer's business;
2. The duration of the relationship between worker and employer;
3. Whether the worker had to invest in equipment or materials to do the work;
4. How much control the employer has over the worker;
5. The worker's opportunity for profit and loss;
6. How much skill and competition there is in the market for this type of work;
7. Whether the worker is an independent business organization.

Importantly, FLSA employment status depends on the actual relationship between the employer and employee, not their subjective opinions of their relationship. Felstiner (2010) writes that even though both Mturk and oDesk classify their workers as independent contractors, this does not determine their status. The U.S. Supreme Court has held that workers may be employees under the FLSA even if both the employer and employee agree that they are independent contractors (Felstiner, 2010). Moreover, Cherry (2009) notes that courts are more likely to find someone is an employee where employers are able to exert greater control over workers and can direct their work. Conversely, courts often classify workers as non-employees where they use their own equipment, set their own schedules, and are paid per project instead of hourly or via salary.

Applying the FLSA to crowdwork, Felstiner (2010) and Cherry (2009) argue that Mturk workers could possibly be "employees" under the FLSA. For example, Felstiner (2010) writes that Providers who repeatedly complete HITs for the same Requesters may be more like FLSA employees, even though they can complete individual HITs quickly. Still, it seems unlikely that a court would classify them as such. Requesters cannot exert much control over Providers, Providers use their own equipment, their employment is ordinarily for a very short time, and they are paid per job.
Even if some factors weigh toward Providers being employees, the others weigh strongly against classifying them as such under the FLSA. Of course, this analysis may differ across the various types of crowdwork. oDesk is a prime example because its workers seem closer to "employees" under the FLSA than Mturk's Providers. First, employers on oDesk have more power and opportunity to control their workers. The "Team Application" software allows employers to monitor their workers in ways that are impossible even in conventional workplaces. Caraway (2010) writes that one survey respondent said that this software is like "being in an office environment where you have a boss or coworkers looking over your shoulder" (p. 117). Second, many oDesk workers are paid hourly, like traditional employees. Indeed, oDesk encourages this. Finally, oDesk proclaims that workers can build reputations so employers can choose to work with people they know and like ("oDesk"). Accordingly, employment relationships may last longer than single projects.

Ultimately, it is not clear whether any crowdworker would be classified as an employee under the FLSA. This uncertainty, however, means that potential employers must be aware of the possibility of regulation. Indeed, as crowdlabor grows, this seems increasingly likely.

PATENT LAW

Another area where crowdsourcing may intersect with legal regulation is patent law. As crowdsourcing methods become more sophisticated, and more skilled labor enters the workforce, the number of complex projects that use crowdlabor for some or all of their production is likely to grow. One area that will probably experience this is the research and development of patentable inventions. However, having multiple people working on an invention raises important questions of joint inventorship. Anyone considering using the crowd's inventiveness and specialized skills to develop patentable designs ought to consider such issues, which could jeopardize their patents.
As previously discussed, crowdwork can help solve complex problems effectively. Schenk and Guittard (2009) profile InnoCentive as an example. Pharmaceutical giant Eli Lilly created InnoCentive in 2001 as a way to help develop novel solutions to various problems. Today, over 225,000 of "the world's brightest problem solvers" are part of this community, which works on problems across many disciplines, from business to engineering to computer science. The purchase cost for one of these solutions can range from $10,000 to $1,000,000 ("InnoCentive"). Like Mturk and oDesk, problem seekers and solvers alike have many incentives for using platforms like InnoCentive. However, with multiple workers helping to develop useful items, these systems raise problems of inventorship.

U.S. patent law grants creators of new, non-obvious, useful inventions limited-duration monopolies over the exploitation of their works. Patents last at most 20 years and give inventors the negative right to prevent others from practicing their inventions (Mueller, 2006). In exchange, inventors must disclose certain information on the application, including the invention's design, its purpose, and all the inventors who contributed to its conception (Seymore, 2006).

Joint Inventorship

Under 35 U.S.C. § 116 (2011), multiple people making an invention must apply for the patent together. Interestingly for crowdsourcing, the law specifically holds that people can be joint inventors even if they work at different times, in different places, or contribute to different degrees. Moreover, a patent could be rendered unenforceable if an inventor is left off the application (Seymore, 2006). As Seymore (2006) writes, determining inventorship is especially difficult where multiple parties work on different parts of the same project.
Writing about academic research settings similar to crowdsourcing, he notes that inventions may derive from many institutions, research groups, outside contractors, and graduate students all working together, but not necessarily aware of each other. Figuring out who deserves inventor status, who does not, and even who worked on an invention can be difficult.

Of course, not everyone who works on an invention must be on the application. An "inventor" must contribute to the conception of the invention; merely working under the direction of an inventor is insufficient. Conception is the "touchstone of inventorship" (Burroughs Wellcome Co. v. Barr Laboratories, Inc., 1994, p. 1227). Each inventor "must contribute in some significant manner to the conception" (BJ Services Company v. Halliburton Energy Services Inc., 2003, p. 1373). Accordingly, an inventor must add to the invention's core ideas. Unfortunately, the line between co-inventor and worker is often unclear. As the Court of Appeals for the Federal Circuit wrote in Burroughs Wellcome Co. (1994), "inventorship cases tend to be highly fact-specific and seldom provide firm guidance on resolving future disputes" (p. 1227).

Still, one can make several observations about potential crowdsourced inventions. If, for example, a crowdsourcer offers a reward for a solution to a problem without further direction, similar to InnoCentive, the person or team that successfully solves the problem would likely be the inventors. Conversely, if a crowdsourcer directs the crowd to perform research tasks that help develop the concept for a patentable design, the workers likely would not be inventors. Finally, if multiple teams work to solve multiple problems that are then combined as claims on one patent, everyone who contributes to the conception of a claim would be an inventor who must be included on the application (Seymore, 2004; Sibley, 2008).
Crowdsourcing may be an attractive and effective way for companies to develop novel solutions to any number of problems. Crowdsourcers, however, must be aware of patent law before conducting such work, or they risk losing control over their intellectual property.

DATA SECURITY

A third area where crowdsourcing and the law will likely intersect is data security. Businesses today are increasingly sharing information about themselves with the crowd in order to strengthen research and development. Realizing crowdsourcing's potential for innovation, these companies may be tempted to disclose data about their customers/users to researchers to facilitate and stimulate these efforts and help drive the crowd's ingenuity. Indeed, both America Online (AOL) and Netflix attempted to do so (Barbaro & Zeller Jr., 2006; Ohm, 2010). However, as both found out, doing so risks violating data security regulations.

The FTC is the federal agency charged with protecting consumers from adverse acts committed by commercial entities. 15 U.S.C. § 45 (2011) gives the FTC power to prevent businesses from engaging in "unfair or deceptive acts or practices" that affect commerce. Recently, the FTC has begun aggressively protecting consumers from data breaches by commercial entities, even scrutinizing the release of supposedly "anonymous" data. While crowdsourcing offers new opportunities for analyzing and processing user data, businesses considering such crowdsourcing should tread carefully and stay informed to minimize the risk of FTC data security enforcement.

Even though this authority may not initially seem to include data security, since the 1990s the FTC has extended its power to scrutinize commercial entities that put their users' privacy at risk. Michael Scott (2008) details this development. The FTC first acted to protect online privacy in 1999.
In that case, the web hosting service Geocities disclosed its users' personal data to third parties, who then used that information for purposes the users had not agreed to. The FTC found that Geocities acted improperly and needed to inform its users about what data it collected, for what purpose, and to whom it would be disclosed.

Since the Geocities case, the FTC has further developed its power over data security. First, in 2005, the agency found that BJ's Wholesale Club violated the "unfair or deceptive practices" standard by failing to adequately protect its customer records from thieves. Shortly thereafter, the FTC filed a similar complaint against DSW when hackers broke into the company's database. The agency found that DSW failed to protect its customers' private data and thus violated the deceptive acts prohibition. Then, in 2006, the FTC extended its reach even further in a complaint against CardSystems Solutions (CSS). CSS provided businesses with products that authorized credit card transactions. The FTC found that CSS violated privacy regulations by failing to protect the personal information it collected: it stored data in an unsecure format, failed to assess the vulnerability of its system, and did not implement strong protections against hackers (Scott, 2008).

What is more, the FTC also regulates how businesses treat supposedly anonymous user data. Recently, some companies have found that they can source innovative business ideas by sharing user information with the crowd. In 2006, AOL released data from 650,000 users and 20 million search queries to the information retrieval community for research. Before doing so, the company attempted to anonymize the data. A New York Times article, however, showed that one could still find the identities of individual users (Barbaro & Zeller Jr., 2006). In response, the Electronic Frontier Foundation (EFF) filed a complaint with the FTC, requesting that it act against AOL (EFF, 2006).
AOL ultimately fired the individual responsible and effectively shut down its research division (Ohm, 2010). Later that year, Netflix released one hundred million anonymized user records as part of its "Netflix Prize" contest, offering one million dollars to the first team to significantly improve Netflix's recommendation algorithm. The contest was so successful that the company decided to hold another one. However, two researchers discovered it was "surprisingly easy" for a malicious party to use Netflix's data, combined with a little other information, to find the identities of the users in the dataset (Narayanan and Shmatikov, 2008). Soon thereafter, a class action suit was filed against the company and the FTC entered the picture. Fearing legal troubles and agency pressure, Netflix cancelled its second contest (Ohm, 2010).

Maureen Ohlhausen (2011) writes that the FTC's views on data security have evolved from a "notice and choice" approach, where an online business would remain safe by adhering to its stated privacy promises, through a harms-based model, to today's hybrid approach. In 2010, the agency proposed a new framework for protecting consumer privacy, broadening its scope even further. The framework applies to all commercial entities that collect information from consumers, online or offline, whether they interact directly or indirectly with consumers (Ohlhausen, 2011). It covers "any data that can reasonably be linked to a specific consumer, computer, or other device" (Ohlhausen, 2011, p. 44). Instead of focusing on privacy promises, this model looks at company actions likely to cause physical or economic harm or to intrude into the lives of their customers. With such actions, the FTC has shown it is moving toward a broad approach to consumer data security. This may impact the crowdsourcing industry in at least two ways.
First, online businesses that collect user data must both protect those data and use them only in ways their users consent to. Second, as the AOL and Netflix examples show, while there may be benefits to using crowdsourcing to analyze user data, such disclosure can create new data security problems, even if an honest attempt is made to anonymize customer records. Businesses interested in using the crowd in this manner should understand the FTC's stance on data security and weigh their actions carefully.

COPYRIGHT

Another legal area at issue is intellectual property ownership under copyright. Anyone considering crowdsourcing creative works should be aware of the copyright implications bearing on control over rights to those works. Crowdwork can support creative design in several ways. Consider Crowdspring, whose platform provides a place where users searching for creative designs can connect with a crowd of artists who are looking to sell their works (Schenk and Guittard, 2010). In particular, Crowdspring advertises itself as a place where businesses can source things like company logos. Users can go on the site, provide general ideas for a design, and request proposals from the crowd. Artists then take the instructions, work out their ideas, and offer back potential designs from which the users can purchase their favorite ("Crowdspring").

Another example of crowdsourcing creative work is The Johnny Cash Project. Created by Chris Milk, The Johnny Cash Project is a collaborative art project that brings together contributions from many artists into a single work (Ehrlich, 2010). Anyone can register on the site and contribute a drawing to the project. These drawings are then combined to make a music video. Each person's work is part of the final artistic creation ("Johnny Cash Project"). These platforms raise two separate questions about copyright ownership.
While Crowdspring implicates copyright's doctrine on works made for hire, The Johnny Cash Project raises issues about joint works and joint authorship. Fortunately for their users, both systems address these questions for them. However, a crowdsourcer who decides to act outside of these platforms might not realize the copyright issues involved and could easily lose control over the work that the crowd produces.

Copyright Basics

Similar to patent law, copyright law gives authors certain rights to protect their works from improper use. Instead of pertaining to useful inventions, however, copyright protects original creative works (17 U.S.C. § 102, 2011). Once a work is copyrighted, its author receives various rights, including the ability to stop unauthorized copying, distribution, and/or public display (17 U.S.C. § 106, 2011). Obtaining a copyright today is quite easy. Works automatically receive copyrights if they are original and fixed in a tangible medium of expression (17 U.S.C. § 102, 2011).