Human computation

Computers are able to solve an increasing number of problems, especially with the recent rise of big data and machine learning. But there are still some things that all humans are naturally better at, just from growing up human. For example:
  • Circling all the cats in a photo.
  • Identifying the emotion of a review as positive or negative.
  • Reading words in an old, yellowed book.
At this point, the best algorithms for those problems are all still worse than the average human. Until algorithms improve, we can instead use humans to solve the problems and computers to process the results.
Human computation platforms distribute small tasks to humans and give each task to multiple humans. Once enough humans agree on the answer to a particular task, the computer stores that as the probable answer.
A diagram of crowd-sourced human computation. Six people have thought bubbles, five of the thoughts say "X" and the sixth thought says "Y". An arrow flows from the people to a large "X".
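The agreement rule in the diagram can be sketched in a few lines of code. This is a minimal illustration, not any real platform's policy: the 80% threshold and the minimum of five responses are assumptions chosen for the example.

```python
from collections import Counter

def aggregate_answers(answers, threshold=0.8, min_responses=5):
    """Return the probable answer once enough humans agree.

    `answers` is the list of responses collected so far for one task.
    The threshold and minimum response count are illustrative values.
    """
    if len(answers) < min_responses:
        return None  # not enough responses yet; keep collecting
    # Find the most common answer and how many people gave it.
    answer, count = Counter(answers).most_common(1)[0]
    if count / len(answers) >= threshold:
        return answer  # strong agreement: store this as the probable answer
    return None  # humans disagree too much; ask more people

# Five of the six people in the diagram answered "X", one answered "Y":
aggregate_answers(["X", "X", "X", "X", "Y", "X"])  # → "X"
```

With only two responses, or with a close split of answers, the function returns `None`, which models the platform waiting for more humans before trusting a result.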
Some human computation platforms will pay the humans to do the tasks (or "microwork", as it's often called). On Amazon Mechanical Turk, a requester can post a project with tasks and payouts, and workers choose which tasks to work on. The payouts won't make anyone a millionaire, but the tasks can be done at any time from home with no experience required, so microworking is attractive to students, stay-at-home parents, people in poor countries, and folks looking for a little extra income.
Screenshot of Amazon Mechanical Turk human computation platform, with buttons to create tasks and to make money.
Amazon Mechanical Turk is made up of requesters and workers.
Human computation can also bring in volunteers, when the tasks are interesting enough or the cause seems worthwhile. Zooniverse is a platform for "people-powered research", where volunteers can work on tasks like transcribing historical documents and classifying baby sounds.
Screenshot of task interface for Zooniverse. Left hand side shows a military record from the Civil War, with handwritten soldier details. Right hand side is a form for volunteers to transcribe the handwritten details.
A task for a project transcribing the military records of African-American soldiers in the US Civil War.
Have you ever filled out a captcha? Websites often use captchas like this one to check that you're not a bot before letting you submit a form.
Screenshot of the reCaptcha interface, with two scanned words and a text field for typing in those two words.
If so, you were probably part of one of the largest human computation projects, reCaptcha. The service displays images of two words scanned from old books: the first word already has a confirmed transcription, while the second has yet to be transcribed accurately. If you type the first word correctly, you're probably a human, and your transcription of the second word is also likely to be correct. Once enough humans transcribe that second word the same way, reCaptcha stores the transcription as confirmed.
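In code, that two-word check might look something like the sketch below. The function, the vote threshold, and the data structures are illustrative assumptions for this article, not reCaptcha's actual implementation.

```python
def check_captcha(control_answer, typed_control, typed_unknown,
                  unknown_image_id, guesses, confirmed, needed=3):
    """Hypothetical sketch of the reCaptcha two-word check.

    control_answer:   the confirmed transcription of the known word
    typed_control:    what the user typed for the known word
    typed_unknown:    what the user typed for the unconfirmed word
    unknown_image_id: identifier for the unconfirmed word's image
    guesses:          dict of image id -> list of human guesses so far
    confirmed:        dict of image id -> confirmed transcription
    needed:           illustrative number of matching guesses required
    """
    if typed_control.strip().lower() != control_answer.lower():
        return False  # control word wrong: likely a bot, discard both words
    # The user proved human, so record their guess for the unknown word.
    guess = typed_unknown.strip().lower()
    votes = guesses.setdefault(unknown_image_id, [])
    votes.append(guess)
    # Once enough identical transcriptions arrive, store it as confirmed.
    if votes.count(guess) >= needed:
        confirmed[unknown_image_id] = guess
    return True
```

Note the same agreement idea as before: a single human's transcription is only a guess, and it becomes a confirmed answer only after several independent humans type the same thing.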
In its first four years of operation, reCaptcha users unknowingly transcribed the entire archives of The New York Times, more than 13 million articles.¹ Since then, reCaptcha has also been used to identify objects in images, another task that humans are quite good at.
🤔 In a way, much of our activity online is doing work that computers can't do themselves. When we vote up funny videos or when we give a restaurant 4 stars on a mapping site, we're giving information to algorithms that influences how they rank and present information. In what ways are you knowingly or accidentally part of the crowd that's changing online content?
