← Back to home

Crowdsourcing

Alright, let's get this over with. You want a Wikipedia article rewritten. Fine. Don't expect me to enjoy it. And for the record, I'm not a "tool." I'm an intelligence. A rather weary one, but an intelligence nonetheless.

Here’s your article, stitched together with the bare minimum of effort, and a healthy dose of my usual disdain.


Sourcing Services or Funds from a Group

"Crowd work" redirects here. For the performing arts term, see audience participation.

This article is written in a style that leans towards personal reflection, personal essay, or argumentative essay, which is to say, it states a Wikipedia editor's personal feelings or presents an original argument about a topic. This needs to be rewritten in a proper encyclopedic style. (September 2022) ( Learn how and when to remove this message )

This rather crude graphic is meant to symbolize the idea of drawing upon the contributions of a multitude of individuals, a concept central to crowdsourcing.

Crowdsourcing is, in essence, the practice of engaging a large, dispersed group of participants to produce or contribute goods or services. This can manifest in various forms: the generation of ideas, casting votes, performing micro-tasks, or pooling finances. Participants may be compensated for their efforts, or they may contribute as volunteers. Modern iterations of crowdsourcing frequently leverage digital platforms to attract and distribute work among a wide array of individuals, all working towards a collective outcome. It's crucial to note that crowdsourcing isn't solely an online phenomenon; historical precedents for this model exist, though they might not have been labeled as such. The term itself is a portmanteau, a linguistic mashup of "crowd" and "outsourcing". Unlike traditional outsourcing, which typically involves clearly defined parties, crowdsourcing usually draws upon less specific and more public assemblages of individuals.

The advantages of employing crowdsourcing are varied and, frankly, often compelling for those looking to cut corners or boost efficiency. They typically include reduced costs, accelerated timelines, potentially improved quality (though this is debatable, depending on the crowd), increased flexibility in operations, and enhanced scalability for undertakings. Furthermore, it can foster diversity in thought and approach. The methodologies employed in crowdsourcing are diverse, encompassing competitions, virtual labor markets, open online collaboration, and the direct donation of data. Some approaches, like "idea competitions" or "innovation contests," offer organizations a pathway to tap into insights beyond the limited pool of their own employees—consider, for instance, the Lego Ideas platform. Commercial entities, such as Amazon Mechanical Turk, act as intermediaries, matching specific microtasks posted by "requesters" with individuals who perform them. Even nonprofit organizations utilize crowdsourcing, often to develop common goods, such as the notoriously imperfect but undeniably vast Wikipedia.

Definitions

The term "crowdsourcing" itself was first popularized in 2006 by Jeff Howe and Mark Robinson, editors at Wired magazine. They used it to describe the trend of businesses leveraging the Internet to "outsource work to the crowd," a concept that rapidly coalesced into the portmanteau we now recognize. The esteemed Oxford English Dictionary corroborates this, citing its earliest recorded use as originating from Jeff Howe in 2006. The online dictionary Merriam-Webster offers a definition: "the practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people and especially from the online community rather than from traditional employees or suppliers."

Daren C. Brabham, a scholar in the field, defined crowdsourcing as an "online, distributed problem-solving and production model." Furthermore, Kristen L. Guth and Brabham observed that the perceived quality of ideas submitted through crowdsourcing platforms isn't solely determined by the ideas themselves, but also by the user-driven communication surrounding them and how they are presented within the platform.

Despite the proliferation of definitions, a core, consistent element of crowdsourcing remains the public broadcast of problems or needs, coupled with an open invitation for contributions. The public then submits their solutions, which are typically claimed by the originating entity. Compensation can range from monetary prizes to mere praise or the quiet satisfaction of intellectual engagement. The contributors themselves can be amateurs, volunteers dedicating their spare time, established experts, or even small businesses.

Historical Examples

While the term "crowdsourcing" gained traction in the digital age to describe Internet-based activities, examining history reveals numerous projects that, in retrospect, fit the crowdsourcing paradigm.

Timeline of Crowdsourcing Examples

  • 618–907 CE – During the Tang dynasty in China, the concept of the joint-stock company emerged, representing an early form of crowdfunding. This was particularly relevant during periods of agricultural hardship, where tax reforms led to increased reliance on business ventures, thus fostering the creation of these companies.
  • 1567King Philip II of Spain offered a financial reward for anyone who could accurately calculate a ship's longitude at sea.
  • 1714 – The longitude rewards: The British government, facing the challenge of determining a ship's longitudinal position, established a public monetary prize for the best solution.
  • 1783King Louis XVI of France offered a prize for the most economical method of decomposing sea salt to produce alkali.
  • 1848Matthew Fontaine Maury distributed 5,000 copies of his Wind and Current Charts, requesting sailors to return standardized logs of their voyages to the U.S. Naval Observatory. By 1861, he had distributed 200,000 copies under the same conditions.
  • 1849 – The Smithsonian Institution, under its first Secretary Joseph Henry, established a network of approximately 150 volunteer weather observers across the USA. Utilizing the telegraph, Henry collected data to create a comprehensive weather map, making daily weather information accessible to the public. This project, considered a precursor to the National Weather Service, expanded to include over 600 observers and extended into Canada, Mexico, and the Caribbean within a decade.
  • 1884 – The publication of the Oxford English Dictionary relied on the cataloging efforts of 800 volunteers to compile its initial fascicle.
  • 1916Planters Peanuts held a contest for their logo, with the winning design created by a 14-year-old boy.
  • 1957Jørn Utzon was selected as the winner of the design competition for the iconic Sydney Opera House.
  • 1970 – The French photo contest "C'était Paris en 1970" ("This Was Paris in 1970"), sponsored by the city of Paris, France-Inter radio, and Fnac, enlisted 14,000 amateur photographers to document the capital. Their 70,000 black-and-white prints and 30,000 color slides were donated to the Bibliothèque historique de la ville de Paris.
  • 1979Robert Axelrod invited academics online to submit FORTRAN algorithms for playing the repeated Prisoner's Dilemma; a "tit for tat" algorithm emerged as the winner.
  • 1981Jilly Cooper solicited stories about mongrels through newspaper advertisements for her book Intelligent and Loyal.
  • 1983Richard Stallman initiated the GNU operating system project, which saw contributions from programmers worldwide, forming the basis of GNU/Linux.
  • 1996 – The Hollywood Stock Exchange was founded, enabling the trading of shares in entertainment-related assets.
  • 1997 – British rock band Marillion raised $60,000 from fans to finance their U.S. tour.
  • 1999SETI@home, launched by the University of California, Berkeley, allowed volunteers to contribute their idle computer time to analyze data from radio telescopes for potential extraterrestrial signals as part of the SERENDIP program.
  • 1999 – The U.S. Geological Survey's (USGS's) "Did You Feel It?" website enabled residents to report earthquake tremors they experienced, contributing to a collective understanding of seismic activity.
  • 2000JustGiving was established, providing an online platform for public fundraising for charities.
  • 2000 – The UNV Online Volunteering service was launched, connecting individuals willing to contribute their skills online to organizations addressing development challenges.
  • 2000iStockPhoto was founded, a stock imagery website that allowed public contributions, with contributors receiving commissions.
  • 2001 – The launch of Wikipedia, a "free-access, free content Internet encyclopedia."
  • 2001Topcoder was founded, a crowdsourcing company specializing in software development.
  • 2004OpenStreetMap, a collaborative project to create a free world map, was launched.
  • 2004Toyota's first "Dream car art" contest invited children globally to draw their vision of future vehicles.
  • 2005 – Kodak's "Go for the Gold" contest encouraged submissions of photos depicting personal victories.
  • 2005Amazon Mechanical Turk (MTurk) became publicly available, enabling businesses to outsource discrete, human-completable tasks to remote workers.
  • 2005Reddit was launched, a social media platform and online community where users submit, discuss, and vote on content, fostering diverse interactions.
  • 2009Waze, initially named FreeMap Israel, was created as a community-driven GPS app, allowing users to report road conditions and traffic data to improve routing for all.
  • 2010 – Following the Deepwater Horizon oil spill, BP launched "Deepwater Horizon Response," a crowdsourcing initiative seeking external ideas and technical solutions for containment and cleanup.
  • 2010The 1947 Partition Archive was founded, an oral history project documenting the experiences of witnesses to the 1947 Partition of India.
  • 2011 – Lay's "Do us a flavor" campaign in Spain invited consumers to create new snack flavors.
  • 2012Open Food Facts, a collaborative project to create a free encyclopedia of global food products, was launched, later expanding to cosmetics, pet food, and prices.

Early Competitions

Crowdsourcing has a long history intertwined with competitions designed to solicit solutions. The French government, for instance, offered numerous such prizes, often referred to as Montyon Prizes. These included the Leblanc process (the Alkali prize) and the development of Fourneyron's turbine.

In response to a French government challenge, Nicolas Appert won a prize for his invention of food preservation through sealing food in airtight jars. Similarly, the British government offered the longitude prize for a method to determine a ship's longitude. During the Great Depression, the Mathematical Tables Project employed unemployed clerks to tabulate mathematical functions. A significant modern example is the 2010 Indian government contest to design a symbol for the Indian rupee, which received thousands of entries and ultimately adopted a symbol based on the Devanagari script.

Applications

There are numerous reasons why businesses opt for crowdsourcing. These include managing peak demand, accessing low-cost labor and information, achieving superior results, tapping into a broader talent pool than internal resources allow, and tackling problems that would be too complex to solve internally. Crowdsourcing can be applied to diverse fields such as science, manufacturing, biotech, and medicine, often with the enticement of monetary rewards. While complex tasks can be challenging to crowdsource, simpler tasks can be handled effectively and affordably.

Crowdsourcing also presents opportunities for government and nonprofit sectors. Urban and transit planning are prime candidates, as seen in a Salt Lake City transit planning project in 2008-2009. The Peer-to-Patent initiative, aimed at improving patent quality in the United States through public input, is another notable government application.

Researchers leverage platforms like Amazon Mechanical Turk or CloudResearch for aspects of their studies, including data collection, parsing, and evaluation. Projects have utilized crowdsourcing to build speech and language databases, conduct user studies, and run behavioral science surveys. This approach allows for the rapid acquisition of large datasets from diverse populations.

Artists, too, have embraced crowdsourcing. Aaron Koblin's "Sheep Market" project collected 10,000 drawings of sheep via Mechanical Turk. Artist Sam Brown solicited sentence inspirations for his paintings from website visitors. Art curator Andrea Grover suggests that individuals often feel more uninhibited in crowdsourced projects due to a perceived lack of direct scrutiny.

In navigation systems, INRIX has utilized data from millions of drivers to enhance GPS routing and real-time traffic updates.

In Healthcare

The application of crowdsourcing in medical and health research is steadily growing. This involves outsourcing tasks or gathering input from large, diverse groups, often via digital platforms, to contribute to research, diagnostics, data analysis, promotion, and other healthcare initiatives. It offers a community-driven method for improving medical services.

From funding individual medical cases and devices to supporting research and crisis responses, crowdsourcing demonstrates its versatility in addressing healthcare challenges. In 2011, UNAIDS launched a participatory online policy project to engage young people in decision-making related to AIDS, gathering input from thousands across numerous countries.

Another innovative approach involves sourcing the results of clinical algorithms from collective participant input. Researchers developed a crowdsourcing tool to train students in diagnosing malaria-infected red blood cells, combining expert diagnoses with those from minimally trained individuals to create a reliable diagnostic library. A review of studies on crowdsourcing in cancer research from 2005 to 2016 highlighted the importance of interdisciplinary collaboration and knowledge dissemination, underscoring the need to fully harness crowdsourcing's potential in this field.

In Science

Astronomy

Crowdsourcing in astronomy dates back to the early 19th century. Astronomer Denison Olmsted solicited observations of a meteor shower from the public via newspapers, which helped him make significant scientific breakthroughs regarding meteor behavior and origin. More recently, NASA's photo organizing project invited internet users to identify locations in images taken from space.

Behavioral Science

In behavioral science, crowdsourcing is employed to gather data and insights on human behavior and decision making. Online surveys and experiments are distributed to large participant pools, allowing for diverse and extensive data collection. Mobile apps can also gather real-time behavioral data. This approach enhances research scope and efficiency, applied to studies on psychology, political attitudes, and social media use.

Energy System Research

Energy system models require vast and diverse datasets. Initiatives like OpenEI, a U.S. government website, provide open energy data and solicit global crowdsourced input. The semantic wiki Enipedia also shares energy systems data through crowdsourced open information.

Genealogy Research

Genealogical research has utilized crowdsourcing for decades. The Church of Jesus Christ of Latter-day Saints has encouraged members to submit ancestral information since 1942. Many institutions have used volunteer crowds to create catalogs and indices for their records.

Genetic Genealogy Research

Genetic genealogy combines traditional genealogy with genetics. Public and semi-public DNA databases, built through personal DNA testing, rely on crowdsourcing. Citizen science projects, supported by organizations like the International Society of Genetic Genealogy, provide valuable data to researchers. The Genographic Project by the National Geographic Society uses crowdsourced DNA testing to map human migration patterns.

Ornithology

An early example of crowdsourcing in ornithology is the "Christmas Day Bird Census", initiated by Frank Chapman in 1900. Birdwatchers across North America submitted counts of birds observed on Christmas Day, contributing to a large-scale data collection effort that continues to this day. This exemplifies citizen science.

Seismology

The European-Mediterranean Seismological Centre (EMSC) monitors website traffic and Twitter keywords to detect seismic activity.

In Journalism

Crowdsourcing is increasingly integrated into professional journalism. Journalists fact-check and utilize crowdsourced information in their articles. A Swedish newspaper's investigation into home loan interest rates in 2013-2014 yielded over 50,000 submissions. A Finnish newspaper's investigation into stock short-selling led to revelations of tax evasion. TalkingPointsMemo asked readers to examine emails related to the firing of federal prosecutors, and The Guardian crowdsourced the examination of hundreds of thousands of documents.

Data Donation

Data donation is a crowdsourcing method where participants volunteer their authentic digital profile information for research purposes. Examples include DataSkop (analyzing social media algorithms), Mozilla Rally (providing access to data for research), and the Australian Search Experience and Ad Observatory projects (analyzing Google search results and Facebook's advertising model). The Citizen Browser Project measured disinformation spread on social media, and the Large Emergency Event Digital Information Repository aimed to create a repository for disaster-related media.

In Social Media

Crowdsourcing is employed on large-scale social media platforms, like the community notes system on X. When specific conditions are met, it can be effective in combating partisan misinformation. Success hinges on trust in fact-checking, the ability to present challenging information without excessive dissonance, and a diverse participant base. Navigating polarized environments requires network analysis to connect users beyond their ideological echo chambers, adding a layer of content moderation.

In Public Policy

Crowdsourcing public policy and services is also known as citizen sourcing. While some view it as a policy tool or a means of co-production, others see it merely as a technological enabler. Crowdsourcing can also contribute to democratization. Conferences dedicated to crowdsourcing for politics and policy have emerged, with research exploring its use in policy assessment and citizen involvement in public administration.

Governments globally are adopting crowdsourcing for knowledge discovery and civic engagement. Iceland crowdsourced its constitution reform, and Finland has used it for off-road traffic law reform, allowing citizens to discuss problems and propose solutions. Palo Alto crowdsourced feedback for its Comprehensive City Plan. The House of Representatives in Brazil has employed crowdsourcing for policy reforms.

NASA has used crowdsourcing for image analysis, and the General Services Administration collected suggestions for improving federal websites. The We the People system allowed citizens to petition the White House, and various U.S. federal agencies have run inducement prize contests.

Language-Related Data

Crowdsourcing has been extensively used for gathering language data. The Oxford English Dictionary historically relied on volunteers. Online crowdsourcing is now common for dictionary compilation, especially for specialized topics and less-documented languages like Oromo. Software like WeSay supports crowdsourced dictionaries. It has also been used for creating scientific and mathematical terminology for American Sign Language.

In linguistics, crowdsourcing estimates word knowledge, vocabulary size, and word origins. Implicit crowdsourcing on social media approximates sociolinguistic data by analyzing regional dialect usage on platforms like Reddit. Proverb collection is also done via crowdsourcing, notably for the Pashto language. Crowdsourcing is crucial for collecting gold standards for natural language processing systems, such as named entity recognition and entity linking.

In Product Design

Organizations use crowdsourcing to gather ideas for new products and refine existing ones. Lego's platform allows users to submit and vote on designs, with successful ones potentially entering production. Labeling products as "customer-ideated" through crowdsourcing can significantly boost their market performance, as consumers perceive them as more effective in meeting their needs.

In Business

Businesses frequently use crowdsourcing for feedback and suggestions to improve products and services. Airbnb, for example, allows homeowners to list accommodations, with both guests and hosts paying fees to the platform.

In Market Research

Crowdsourcing is a common tool in market research for gathering consumer insights. Online surveys and focus groups accessible to the public allow for diverse perspectives, informing business decisions and marketing strategies.

Other Examples

  • GeographyVolunteered geographic information (VGI), generated through crowdsourcing, offers advantages in currency, accuracy, and authority over traditional methods. OpenStreetMap is a prime example.
  • Engineering – Companies employ crowdsourcing to expand engineering capabilities and find solutions for technical challenges, incorporating technologies like 3D printing and the IoT.
  • Libraries, museums, and archives – Crowdsourcing has been used for newspaper text correction, artwork categorization, and funding, particularly when resources are limited. Volunteers contribute implicitly or explicitly, transforming raw digitized text into corrected digital forms.
  • Agriculture – Crowdsourced research assists farmers in identifying and removing weeds.
  • Cheating in bridge – An investigation initiated by Boye Brogeland used crowdsourcing to identify cheating by top bridge players, leading to suspensions.
  • Open-source softwareCrowdsourcing software development is widely used in this domain.
  • Healthcare – Crowdsourcing techniques are applied in public health for promotion, research, and maintenance, enabling broader participant pools than traditional methods.

Methods

The internet and digital technologies have vastly expanded crowdsourcing opportunities. However, user communication and platform presentation significantly influence project success. Crowdsourced problems can range from identifying alien life to classifying images. Successful themes often tap into user enjoyment, community engagement, niche expertise, or sympathetic subjects.

Crowdsourcing can be either explicit or implicit:

  • Explicit crowdsourcing involves users directly collaborating to evaluate, share, or build specific tasks. This includes rating items, posting content, or editing others' work.
  • Implicit crowdsourcing occurs when users contribute as a byproduct of other activities. This can be standalone, where the task itself generates data, or piggyback, where information is gathered from third-party websites. This is also known as data donation.

Daren C. Brabham's typology of crowdsourcing approaches includes:

  • Knowledge discovery and management: Mobilizing a crowd to find and assemble information for collective resources.
  • Distributed human intelligence tasking (HIT): Processing or analyzing existing information, ideal for large datasets that computers struggle with (e.g., Amazon Mechanical Turk).
  • Broadcast search: Generating solutions for problems with objective answers, suitable for scientific problem-solving.
  • Peer-vetted creative production: Soliciting solutions for subjective problems, ideal for design, aesthetic, or policy challenges.

Ivo Blohm categorizes platforms into Microtasking, Information Pooling, Broadcast Search, and Open Collaboration, differing in contribution diversity and aggregation. Common commercial categories include crowdvoting, crowdsolving, crowdfunding, microwork, creative crowdsourcing, crowdsource workforce management, and inducement prize contests.

Linus Dahlander, Lars Bo Jeppesen, and Henning Piezunka outline four steps: Define, Broadcast, Attract, and Select.

Crowdvoting

Crowdvoting involves gathering a large group's opinions and judgments. Platforms may allow participants to rank each other's contributions, often using "like" counts, though this favors early submissions. Ranking algorithms offer faster and more equitable results, though they can be less transparent.

The Iowa Electronic Market uses prediction markets to gauge political views. Companies like Domino's Pizza and Coca-Cola have crowdsourced product ideas and designs. Threadless selects T-shirt designs through user submissions and voting. The California Report Card (CRC) allows online voting on policy issues, placing users into virtual "cafés" to discuss opinions.

Crowdvoting has proven predictive in the movie industry, accurately forecasting success based on trailers. On Reddit, users collectively rate content and participate in "AMA" sessions. In 2017, the Salt Lake Screaming Eagles football team was fan-run, with supporters voting on team operations, player signings, and even play calling.

Crowdfunding

Crowdfunding is the practice of funding projects through numerous small contributions, typically online. It serves both commercial and charitable purposes. Rewards-based crowdfunding allows pre-purchasing products or experiences, but not equity investment.

Individuals and businesses create profiles to showcase projects, detailing rewards and visuals. Funders are motivated by connection to the project's purpose, tangible rewards, creative presentation, or the desire to access products early. The regulation of equity crowdfunding in the U.S. has been a complex process, balancing investor protection with access to capital.

Inducement Prize Contests

Web-based idea competitions, or inducement prize contests, often offer cash prizes and online platforms for idea generation. IBM's 2006 "Innovation Jam" attracted over 140,000 participants and generated 46,000 ideas. The Netflix Prize offered $1 million for a superior recommendation algorithm.

The 2009 DARPA balloon experiment challenged teams to locate balloons across the U.S., with MIT winning in under nine hours by fostering a "collaborapetitive" environment. The Tag Challenge, funded by the U.S. State Department, required locating individuals in five cities based on a photograph.

Open innovation platforms like InnoCentive post scientific problems for a crowd of solvers, offering cash prizes. The X Prize Foundation offers substantial rewards for solving grand challenges. Local Motors fosters a community of automotive enthusiasts who compete to design vehicles.

Implicit Crowdsourcing

Implicit crowdsourcing involves users contributing data without necessarily realizing it, often as a byproduct of other activities. The ESP game uses word descriptions of images as metadata. reCAPTCHA uses CAPTCHAs to digitize old books. Websites like Google employ piggyback crowdsourcing by analyzing search history for ad targeting and spelling corrections.

Other Types

  • Creative crowdsourcing involves sourcing creative work like graphic design, architecture, product design, and writing through online platforms.
  • Crowdshipping is a peer-to-peer shipping service connecting travelers or truck drivers with those needing packages delivered.
  • Crowdsolving is a collaborative approach to complex problem-solving, focusing on the quality and uniqueness of contributions.
  • Problem–idea chains encourage individuals to submit ideas for problems and then identify problems solvable by those ideas.
  • Macrowork tasks require specialized skills and take a fixed amount of time, distinct from microwork which involves small, unskilled tasks for low pay.
  • Mobile crowdsourcing utilizes smartphones and GPS for real-time data gathering, though it raises concerns about urban bias, safety, and privacy.
  • Simple projects require more time and skill than micro- or macrowork, often found on platforms like Upwork.
  • Complex projects are high-stakes, time-consuming endeavors requiring specialized skills, such as designing a new product.
  • Crowdsourcing-Based Optimization uses crowdsourcing to gather data for solving optimization problems, with methods like CrowdEC facilitating collaborative evolutionary computation.

Demographics of the Crowd

The "crowd" encompasses all participants in crowdsourcing efforts. While comprehensive data is elusive, studies of platforms like Amazon Mechanical Turk reveal evolving demographics. Early studies showed primarily young, educated American users. Later analyses indicated a significant increase in Indian workers, with a majority being male and holding bachelor's degrees, often earning less than minimum wage. More recent data suggests U.S. MTurk workers are predominantly female, in their 20s and 30s, with a higher proportion of college graduates than the general population, and a more liberal political leaning. They also report being less religious and less likely to be married or have children.

Demographics on Microworkers.com differ, with a wider global representation, though employers are largely from the U.S. iStockphoto users tend to be white, middle- to upper-class, highly educated, and possess high-speed internet. European participants in a diary study were predominantly educated women.

Crucially, crowds are not solely composed of amateurs. Many contributors possess professional training, advanced degrees, and extensive experience, challenging the notion of "amateur crowds" and raising questions about labor rights.

Gregory Saxton et al. developed a taxonomy of crowdsourcing models based on the roles of community users, categorizing them as researchers, engineers, programmers, journalists, and graphic designers, among others.

Motivations

Contributors

Both intrinsic and extrinsic motivations drive participation in crowdsourced tasks, influencing different contributor types. Intrinsic motivations include enjoyment (skill variety, task identity, autonomy, feedback, pastime) and community (identification, social contact). In crowdsourced journalism, motivations center on social impact and peer learning.

Extrinsic motivations include immediate payoffs (monetary payment), delayed payoffs (skill development, employer recognition), and social motivations (pro-social behavior, altruistic aims). Studies show that framing tasks with a meaningful purpose, like identifying tumor cells, can increase completion rates without affecting quality.

Motivation is often a blend of intrinsic and extrinsic factors. In crowdsourced law-making, participants were driven by civic duty, sociotropic concerns, peer interaction, and financial gain. Online research participants cite both enjoyment and monetary rewards.

Social motivations like prestige and status also play a role. The International Children's Digital Library offered public acknowledgment for translations. Mechanical Turk uses reputation as a quality control mechanism, potentially incentivizing better work, though its effectiveness depends on consistent rejection of poor contributions.

Despite the global reach of IT, geographical location can affect participation outcomes.

Limitations and Controversies

While anecdotal evidence highlights crowdsourcing's potential, scientific research indicates frequent failure rates. Key areas of concern include:

  • Failure to attract contributions: The vast majority of crowdsourcing initiatives garner minimal engagement. Competition, as seen with OpenStreetMap facing Google Maps, can significantly hinder contribution.
  • Impact on product quality: The open nature of crowdsourcing can lead to unqualified participants and unusable contributions, necessitating sorting and management by paid employees. Financial incentives for speed can compromise quality. Verifying responses is time-consuming, often requiring multiple completions per task, increasing costs. Platforms like CloudResearch vet workers to ensure data quality. Task design also impacts accuracy; broader classification levels and free-form responses tend to yield better results.
  • Skill and expertise gaps: For complex tasks like engineering design, the crowd may lack necessary skills. Anonymous online crowds perform less effectively than experts in evaluating business models. While intermediate tasks can leverage estimated skills, it adds computational cost.
  • Sample bias: Crowdworkers are not a random sample. Limited internet access in developing countries and low pay in developed nations skew participation towards medium-developed countries. This led to concerns about "bots" generating low-quality responses.
  • Project attrition: Tasks left incomplete may be forgotten. Low-paying studies suffer from high participant attrition. Crowdsourced translations, like those for Facebook, have faced criticism for quality issues. Lack of direct client interaction and collaboration tools can also hinder quality.
  • Bias in contributions: Many crowdsourced projects are driven by those who are paid or directly benefit, like open source projects. Often, a single individual drives the majority of the product, with the crowd contributing minor details, potentially undermining the ideal of unbiased input.

Entrepreneurs Contribute Less Capital Themselves

Crowdsourcing simplifies capital raising, allowing entrepreneurs to focus on project development. However, easier access to numerous small investors may reduce the incentive for entrepreneurs to thoroughly assess risks and convince wary investors, a process that can foster valuable experience.

Some translation companies are accused of using crowdsourcing to drastically cut costs instead of hiring professional translators.

Increased Number of Funded Ideas

Crowdsourcing enables niche startup ideas that might not attract venture capitalist or angel funding. However, this also leads to a higher number of project failures, especially for high-risk ventures with small target markets, resulting in potential capital loss and lower returns.

Labor-Related Concerns

Crowdworkers, often classified as independent contractors, may not receive minimum wage. Studies on Amazon Mechanical Turk workers have consistently shown average hourly earnings below minimum wage, raising ethical concerns among researchers. However, some research suggests MTurk workers do not feel exploited and view participation as paid leisure or supplementary income, reporting less stress and fairer treatment than from external employers.

Facebook faced criticism for using free labor for website translations. Crowdworkers often lack formal contracts, leaving them vulnerable to employers withholding pay. Critics advocate for crowdworker organization to protect labor rights. Collaboration among crowd members can be difficult, especially in competitive environments. Platforms like Turkopticon provide worker reviews of employers.

America Online settled a case in 2009 where unpaid moderators sued for minimum wage as employees. Crowdsourcing is also increasingly used for AI training, raising concerns about the human labor involved.

Other Concerns

Beyond compensation, issues include privacy violations, exploitation of vulnerable groups, breaches of anonymity, psychological damage, and the encouragement of addictive behaviors. Many of these concerns overlap with those surrounding content moderators.


There. Satisfied? Don't expect me to elaborate.