CrowdForge

Writing can be a solitary, intellectual pursuit. Or not, say Carnegie Mellon University researchers.

They have shown that an informational article written by dozens of people working independently online can be just as good as one produced by an individual writer.

"This is exciting because collaborative crowdsourcing could change the future of work," said Aniket Kittur, assistant professor in CMU's Human-Computer Interaction Institute (HCII).

Kittur led the research team.

Each person in the experiments completed just a sliver of the work of preparing an article, such as drafting an outline, gathering facts or writing simple prose.

The "authors" never spoke with each other. But the team found that the 'crowdsourced' articles compared favorably with articles written by a single author and with Simple English Wikipedia entries.

Kittur added, "We foresee a day when it will be possible to tap into hundreds of thousands or millions of workers around the globe to accomplish creative work on an unprecedented scale."

Kittur, along with Robert Kraut, professor of human-computer interaction, and Boris Smus, a student in HCII's joint master's degree program with the University of Madeira, have created a framework for this kind of crowdsourcing.

Called CrowdForge, it breaks down complex tasks into simple, independent micro-tasks that can be completed rapidly and cheaply.
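To make that decomposition concrete, here is a minimal Python sketch of the idea. It is illustrative only, not the actual CrowdForge code: the MicroTask class and the function names are assumptions, but the pattern of partitioning a job into small tasks, fanning them out and merging their results follows the description above.

```python
# Illustrative sketch only; the MicroTask class and these helpers are
# hypothetical stand-ins, not the real CrowdForge API.
from dataclasses import dataclass

@dataclass
class MicroTask:
    """One small, self-contained unit of work posted to a crowd market."""
    instructions: str
    payment_cents: int = 5  # typical crowd micro-tasks pay only a few cents

def partition(subject: str, copies: int = 3) -> list[MicroTask]:
    """Ask several workers, independently, to draft an outline."""
    return [MicroTask(f"Write a brief outline for an article about {subject}.")
            for _ in range(copies)]

def map_step(topics: list[str]) -> list[MicroTask]:
    """Fan out one fact-gathering micro-task per outline topic."""
    return [MicroTask(f"Find one fact about: {topic}") for topic in topics]

def reduce_step(topic: str, facts: list[str]) -> MicroTask:
    """Merge several workers' facts into a single writing task."""
    return MicroTask(f"Turn these facts about {topic} into one paragraph: "
                     + "; ".join(facts))
```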

Jim Giles and MacGregor Campbell, San Francisco-based science journalists, have created a blog, www.mybossisarobot.com, that will explore the use of CrowdForge for preparing science news articles based on research reports.

Crowdsourcing has become a powerful mechanism for accomplishing work online.

Millions of volunteers have performed tasks such as cataloging Martian landforms and translating text into machine-readable form.

Most crowdsourcing today pays workers just a few cents for small tasks, like writing product descriptions and transcribing audio.

"But much of the work required by real-world organizations requires more time, cognitive effort and coordination among co-workers than is typical of these crowdsourcing efforts," Kittur said.

So the CMU researchers approached the crowdsourcing market as if it were a distributed computing system.

In such a system, computations are divided into smaller chunks that can be solved simultaneously by large numbers of processors, and failures by individual processors won't undermine the entire process.
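The analogy can be seen in a few lines of ordinary parallel code. In the sketch below (an illustration, not part of CrowdForge), a computation is split into independent chunks and run across multiple processes; because the chunks are independent, a failed chunk could simply be re-run without restarting the whole job.

```python
# Distributed-computing analogy: split work into independent chunks and
# process them in parallel. A failure in one chunk would only require
# re-running that chunk, not the whole computation.
from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk: list[int]) -> int:
    return sum(chunk)  # stand-in for any per-chunk computation

if __name__ == "__main__":
    data = list(range(100))
    chunks = [data[i:i + 10] for i in range(0, len(data), 10)]
    with ProcessPoolExecutor() as pool:
        partial_results = list(pool.map(process_chunk, chunks))
    print(sum(partial_results))  # 4950, same as summing directly
```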

CrowdForge acts the same way. To prepare a brief encyclopedia article, for instance, CrowdForge would assign several people the task of writing an outline.

As a quality control measure, a second set of workers might be tasked with voting for the best outline. Or they could be asked to combine the best parts of each outline into a master outline.

Subsequent sub-tasks might include collecting one fact for a topic in the outline.

Finally, a worker might be given the task of taking several of the facts collected for a topic and turning them into a paragraph. 
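Continuing the earlier sketch, the whole pipeline described above (outline, vote, gather facts, write paragraphs) might be wired together as follows. The post_and_collect function is a simulated stand-in for posting tasks to a real crowd market such as Mechanical Turk, and the control flow is an assumption based on the steps just described, not CrowdForge's actual logic.

```python
# Continues the sketch above. post_and_collect simulates a crowd market;
# in a real deployment it would post tasks and block for worker responses.
def post_and_collect(tasks: list[MicroTask], workers: int = 1) -> list[str]:
    return [f"[a worker's answer to: {t.instructions}]"
            for t in tasks for _ in range(workers)]

def write_article(subject: str) -> str:
    # 1. Several workers draft outlines independently.
    outlines = post_and_collect(partition(subject))
    # 2. Quality control: a second set of workers votes for the best outline.
    vote = MicroTask("Pick the best of these outlines: " + " | ".join(outlines))
    best_outline = post_and_collect([vote])[0]
    # 3. Collect facts per topic (here the whole outline stands in for one
    #    topic; a real run would split it into its entries).
    facts = {t: post_and_collect(map_step([t]), workers=3)
             for t in [best_outline]}
    # 4. A worker turns each topic's facts into a paragraph.
    paragraphs = post_and_collect([reduce_step(t, f) for t, f in facts.items()])
    return "\n\n".join(paragraphs)

print(write_article("New York City"))
```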

In tests, the researchers used CrowdForge to produce articles about New York City. The group-written articles were rated higher in quality than articles written by individual authors.

"We were surprised at how well CrowdForge worked," said Kittur.

This work was supported in part by grants from the National Science Foundation. For additional details and to download the technical study, read the press release.

More information is available on the CrowdForge project page.
