
Crowdsourcing, crowdfunding, and open innovation

What could you make if you had access to half of the world's population? What would you do with their knowledge, ideas, and time? With the growth of the Internet and the World Wide Web, that's suddenly a real possibility.
Crowdsourcing is a way to take advantage of the large network of potential contributors online and funnel their resources into a shared result.
Let's explore how crowdsourcing helps us to create common goods, invent new products, and make scientific discoveries.

Collective knowledge pooling

In 1857, a group of British intellectuals decided to compile a new English dictionary with a comprehensive set of words, definitions, and usages. They put out a call for volunteers to submit words from books in their libraries and eventually received more than 2 million word references.1
Photo of a box filled with slips of paper with words and sentences written on them.
A box of word suggestions sent to the Oxford English Dictionary team. Image source: Media specialist
In 1928, the first edition of the Oxford English Dictionary was finally complete and contained over 400,000 words and phrases.2
Photo of 10 books, each with a different volume number and letter range printed on the side of the book.
Volumes covering letters B - N from the first edition with its original title, "New English Dictionary on Historical Principles". Image source: Liz West
The creators of that dictionary had an important insight: no single person knows every bit of information, but when we combine knowledge from many people, we can develop impressively comprehensive collections of knowledge.
A diagram representing the crowdsourcing of knowledge from multiple people into a collective knowledge base. Shows six icons of people next to a page of text, with arrows flowing from the pages of text into an entire book.
Thanks to computers and the Internet, creating a shared knowledge base is much easier these days than it was in the 1800s. When someone contributes information, the computer can store it in a database and make it easy to sort, search, and edit. The community of contributors can sort through the database to keep the highest quality information, and computer programmers can help by adding reputation systems, voting algorithms, and spam detection.
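To make that concrete, here's a minimal sketch of what a crowd-sourced knowledge base with community voting might look like. It's written in Python purely for illustration; the data structure, function names, and scoring rule are invented and aren't how Wikipedia or any real platform actually works.

```python
# A toy crowd-sourced knowledge base (hypothetical, for illustration only).
# Each contribution is stored with its author and a community vote count,
# so the best entries can be surfaced by sorting.

contributions = []  # the shared "database", held in memory for this sketch

def contribute(author, topic, text):
    """Store a new piece of information from a contributor."""
    contributions.append({"author": author, "topic": topic,
                          "text": text, "votes": 0})

def vote(index, amount=1):
    """Let the community raise (or lower) a contribution's score."""
    contributions[index]["votes"] += amount

def best_entries(topic, limit=3):
    """Return the highest-voted contributions on a topic."""
    on_topic = [c for c in contributions if c["topic"] == topic]
    return sorted(on_topic, key=lambda c: c["votes"], reverse=True)[:limit]

contribute("ada", "lunar eclipse", "Occurs when Earth passes between the Sun and the Moon.")
contribute("troll", "lunar eclipse", "The moon turns black and explodes.")
vote(0, 5)   # the community upvotes the accurate entry
vote(1, -3)  # and downvotes the vandalism
print(best_entries("lunar eclipse"))
```

Real systems layer much more on top of this, such as edit histories, reputation scores for contributors, and automated spam detection.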
Wikipedia is a great modern example of a crowd-sourced knowledge base. You've probably run into a Wikipedia article if you've ever searched for knowledge online. With nearly a billion edits since its inception and over 36 million registered users, Wikipedia is collecting the wisdom of a very large crowd.3
I'm one of the millions in that crowd, since I made eight edits this year while working on AP CSP articles:
Screenshot of a list of Wikipedia edits.
A listing of edits, with timestamps and edit size for each.
The fact that anyone can contribute—and have their edits appear instantly—is both a blessing and a curse.
Wikipedia intentionally lowered the barrier to contribution with the hope that they could create an encyclopedia that'd cover every topic worthy of an article, and it's well on its way to that goal. In June 2019, there were 5.9 million articles in the English Wikipedia, with around 20,000 new articles added each month. In print form, that would be equivalent to 2,822 volumes of the Encyclopedia Britannica:
Illustration of 2,822 books in 15 bookcases.
Can a volunteer community really ensure the accuracy of millions of articles? They can try, but realistically, they can't. You might have the misfortune of reading an article right after a troll made a malicious edit, a spammer plugged their company, or a well-meaning but misinformed person introduced an inaccuracy.
During a lunar eclipse in 2011, one vandal made this edit:
Screenshot of first sentence from Wikipedia article on Lunar eclipse, with text "A lunar eclipse is when the moon turns black and explodes, releasing a poisonous gas, killing all of humanity."
To combat the effects of errors, Wikipedia encourages citations for every bit of knowledge added and advises caution in using Wikipedia as a primary source for research.
Wikipedia also suffers from another drawback of a crowd-sourced community: a lack of diversity. In theory, a website can bring in a wide range of people, since the web is open and global. Yet in reality, the contributors to Wikipedia don't mirror the readers of Wikipedia. A 2013 study found that while 47% of Wikipedia readers are female, 84% of Wikipedia editors are male.4 This imbalance may skew the topics covered on Wikipedia and the way they're presented.
🤔 Consider other crowd-sourced knowledge bases that you use, such as Q&A websites and discussion forums. How trustworthy is the knowledge accumulated on those sites? In what ways might that knowledge be skewed by the biases of the contributors? Is the overall increase in new knowledge worth the possible inaccuracies and bias?

Crowdfunding

The human love for invention began when our ancestors first created tools millions of years ago. Since then, we've invented ways to control fire and, in the last few thousand years, we've upped our game by inventing the wheel, the lever, and the computer.
Modern inventions involve more than rubbing two sticks together. New gadgets require manufacturing processes and machines; new video games require 3D assets, developers, and designers; new movies require cast, crew, and special effects.
Fortunately, inventors can now turn to the Internet to crowdfund the costs of creation. When thousands of supporters each pledge a small amount of money, those pledges can add up to enough capital to bring a new product into the world.
Diagram of crowdfunding model. Six people are shown next to a dollar bill, and arrows flow from the dollar bills to a large stack of money in the middle.
Why are supporters willing to part with their money? For crowdfunded gadgets, supporters can be the first ones to own the gadget, and often at a discounted price. For crowdfunded movies and games, the rewards could be anything from autographed prints to an opportunity to meet the creators. Crowdfunding campaigns may offer larger rewards for higher levels of pledged money, so that they can both bring in a long tail of small pledges and also benefit from a few supporters with more cash to give.
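As a rough sketch of the arithmetic, here's how a campaign might total its pledges and match each one to a reward tier. The tier amounts and rewards below are made up for illustration and don't reflect any real platform's rules.

```python
# Hypothetical reward tiers: (minimum pledge in dollars, reward).
REWARD_TIERS = [
    (500, "meet the creators"),
    (100, "autographed print"),
    (25, "early discounted gadget"),
    (1, "thank-you email"),
]

def reward_for(pledge):
    """Return the best reward a pledge qualifies for (tiers are sorted high to low)."""
    for minimum, reward in REWARD_TIERS:
        if pledge >= minimum:
            return reward
    return None

# A long tail of small pledges plus a few larger ones.
pledges = [25, 25, 100, 25, 500, 25, 100]
print(f"Raised ${sum(pledges)} from {len(pledges)} supporters")
for pledge in pledges:
    print(f"${pledge} pledge -> {reward_for(pledge)}")
```

The same idea scales up: 5,000 pledges of $25 each already add up to $125,000.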
Screenshot of 3 fundraising campaigns on IndieGoGo. The first is for a gravity-powered light, the second for a science comedy web series, and the third for an animal-themed video game.
Crowdfunded projects on IndieGoGo with a range of funding goals.
Crowdfunding is also a way to raise funds for charitable causes, where the reward for donors is simply the joy of giving back. For example, DonorsChoose.org is a crowdfunding platform that empowers K-12 teachers to raise funds for classroom needs. Teachers need far less funding than inventors do, but for their students, a few hundred or a few thousand dollars from strangers can make a big difference.
Screenshot of a campaign on DonorsChoose.org where a teacher is raising money to buy multicultural literature for their classroom.
A DonorsChoose.org campaign for multicultural books.
🔍 In May of 2019, the US Federal Trade Commission issued a warning for consumers to avoid crowdfunding scams, campaigns that take money from supporters but don't deliver on their promise.5 Research some crowdfunding scams and reflect on how you might avoid sinking your own money into a scam.

Human computation

Computers are able to solve an increasing number of problems, especially with the recent rise of big data and machine learning. But there are still some things that all humans are naturally better at, just from growing up human. For example:
  • Circling all the cats in a photo
  • Identifying the emotion of a review as positive or negative
  • Reading words in an old, yellowed book
At this point, the best algorithms for those problems are all still worse than the average human. Until algorithms improve, we can instead use humans to solve the problems and computers to process the results.
Human computation platforms distribute small tasks to humans and give each task to multiple humans. Once enough humans agree on the answer to a particular task, the computer stores that as the probable answer.
A diagram of crowd-sourced human computation. Six people have thought bubbles, five of the thoughts say "X" and the sixth thought says "Y". An arrow flows from the people to a large "X".
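Here's a minimal sketch of that agreement rule in Python. The threshold and function name are invented for illustration; real platforms use more sophisticated quality checks (and often weight workers by their track record).

```python
from collections import Counter

def probable_answer(responses, required_agreement=3):
    """Return the most common response if enough workers gave it, else None."""
    answer, count = Counter(responses).most_common(1)[0]
    return answer if count >= required_agreement else None

# Five workers answered "X" and one answered "Y", so "X" is stored as the probable answer.
print(probable_answer(["X", "X", "Y", "X", "X", "X"]))  # -> X
# With only two responses, there isn't enough agreement yet.
print(probable_answer(["X", "Y"]))                      # -> None
```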
Some human computation platforms will pay the humans to do the tasks (or "microwork", as it's often called). On Amazon Mechanical Turk, a requester can post a project with tasks and payouts, and workers choose which tasks to work on. The payouts won't make anyone a millionaire, but the tasks can be done at any time from home with no experience required, so microworking is attractive to students, stay-at-home parents, people in poor countries, and folks looking for a little extra income.
Screenshot of Amazon Mechanical Turk human computation platform, with buttons to create tasks and to make money.
Amazon Mechanical Turk is made up of requesters and workers.
Human computation can also bring in volunteers, when the tasks are interesting enough or the cause seems worthwhile. Zooniverse is a platform for crowdsourced citizen science, where volunteers can work on tasks like transcribing historical documents and spotting gorillas in photos.
Screenshot of task interface for Zooniverse. Left hand side shows a military record from the Civil War, with handwritten soldier details. Right hand side is a form for volunteers to transcribe the handwritten details.
A task for a project transcribing the military records of African-American soldiers in the US Civil War.
Have you ever filled out a captcha? Websites often use captchas like this one to check that you're not a bot before letting you submit a form.
Screenshot of the reCaptcha interface, with two scanned words and a text field for typing in those two words.
If so, you were probably part of one of the largest human computation projects, reCaptcha. The service displays images of two words from old books: the first word already has a confirmed transcription, while the second hasn't yet been transcribed accurately. If you type the first word correctly, you're probably a human, and your transcription of the second word is also likely to be correct. Once enough humans transcribe that second word the same way, reCaptcha stores the transcription as confirmed.
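That logic can be sketched in a few lines of Python. This is only an illustration of the idea described above, not reCaptcha's actual implementation; the function names, image names, and confirmation threshold are all made up.

```python
CONFIRMATIONS_NEEDED = 3          # hypothetical threshold
pending_transcriptions = {}       # unknown word image -> {answer: count}

def submit(control_answer, known_answer, unknown_image, unknown_answer):
    """Check the control word, then record a vote for the unknown word's transcription."""
    if control_answer.lower() != known_answer.lower():
        return "failed: probably a bot (or a typo)"
    votes = pending_transcriptions.setdefault(unknown_image, {})
    votes[unknown_answer] = votes.get(unknown_answer, 0) + 1
    if votes[unknown_answer] >= CONFIRMATIONS_NEEDED:
        return f"confirmed transcription: {unknown_answer}"
    return "passed: transcription recorded"

# Three different visitors type the same thing for the unknown word, confirming it.
print(submit("morning", "morning", "scan_042.png", "upon"))
print(submit("morning", "morning", "scan_042.png", "upon"))
print(submit("morning", "morning", "scan_042.png", "upon"))
```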
In its first four years of operation, reCaptcha users unknowingly transcribed the entire archives of The New York Times, more than 13 million articles.6 Since then, reCaptcha has also been used to identify objects in images, another task that humans are quite good at.
🤔 In a way, much of our activity online is doing work that computers can't do themselves. When we vote up funny videos or when we give a restaurant 4 stars on a mapping site, we're giving information to algorithms that influences how they rank and present information. In what ways are you knowingly or accidentally part of the crowd that's changing online content?

Open innovation platforms

Sometimes the wisdom of the crowd lies in the wisdom of a single person in the crowd. We don't all have the solutions for every problem, but given there are billions of people in the world with a wide range of expertise, it's very likely that one of those billions has a great idea that can solve a unique problem.
Diagram of crowd-sourced open innovation model. 8 people are shown with lightbulbs above their heads, and one lightbulb is highlighted.
Open innovation platforms take advantage of the global connectivity of the Internet to connect solution seekers with problem solvers. InnoCentive is one of those platforms, and it offers challenges across a range of disciplines: chemistry, computers, engineering, food, agriculture, life sciences, statistics, and physical sciences.
Screenshot of 2 InnoCentive challenges: Bird Identification from a Minimal Sample and Mitigating the Environmental Impact of Large Photovoltaic Plants.
A few InnoCentive challenges from June 2019.
Like many crowdsourcing platforms, InnoCentive incentivizes with monetary rewards, typically in the thousands of dollars. Top solvers can actually make a living entirely by entering challenges.
For the organizations presenting these challenges, the money spent is worth the potential gain. By opening their challenge up to a crowd, organizations broaden the idea search outside of their own internal expertise and even outside of their entire industry's expertise, to improve their chances of discovering novel solutions.
A non-profit organization named Prize4Life issued a challenge in 2007 to anyone who could come up with a biomarker for the fatal illness ALS, and in 2011, they awarded the million dollar prize to a neurosurgeon. That sounds like a lot of money (and it is!) but Prize4Life estimates that the discovery of the biomarker can reduce the cost of ALS clinical trials from $10 million to $5 million, making it more feasible for pharmaceutical companies to find ways to treat or even cure the disease.7
The XPrize is another organization which gives big rewards for big ideas. The first XPrize competition challenged teams to build spaceships for suborbital spaceflight and awarded a $10 million prize in 2004. Since then, their competitions have encouraged innovation in under-funded technological research areas such as vehicle efficiency, oil cleanups, health monitoring, and carbon emissions reduction.8
Photo of pilot standing on top of the SpaceShipOne with two thumbs up.
The pilot after the successful launch of SpaceShipOne, August 2004. Image source: RenegadeAven
Competitions like the XPrize and InnoCentive may seem out of reach to many of us, since most of the winners have advanced degrees and extensive backgrounds in research and industry. Fortunately, not every great idea needs to save lives or revolutionize an industry; great ideas can simply make our world a more enjoyable place to live.
LEGO Ideas is a platform where anyone can submit ideas for new LEGO build kits. Each year, the LEGO team reviews the ideas that have gathered at least 10,000 community supporters and selects a kit to manufacture and sell in their stores.9
The kit ideas are as diverse as their creators: a science writer created a Women of NASA kit, a Hungarian animator designed a Steamboat Willie set, and a Filipino software engineer built a Voltron set.
Screenshot of three winning LEGO build kit ideas: Women of NASA, Steamboat Willie, and Voltron.
Three user-suggested kit ideas that LEGO now produces for stores.

Want to join the conversation?

  • Arthur:
    "computation platformS distribute small tasks to humans and giveS" - maybe it is a typo?
  • Reine L:
    I've seen captchas which ask you to identify objects in pictures (for example, click on all the pictures that contain cars). Are these doing the same jobs as reCaptchas? If so, why?
    • Martin:
      Captchas are a security mechanism to protect against bots: you have to solve a small task that is very hard for computers but easy for humans, which helps make sure the resource being protected can't be easily accessed by bots. (There are systems that can beat captchas to a degree, and there is always the option of cheap labor, where workers solve captchas so that bots can do their work.)
      ReCaptcha takes captchas and turns them into something useful by creating a large pool of "volunteers" who transcribe old writing.