Why Don't More Companies Use Prediction Markets?

Last Updated Apr 24, 2008 4:10 PM EDT

The basic idea behind James Surowiecki's The Wisdom of Crowds is that a collection of complete strangers can, in aggregate, make a better decision than a single expert. A crowd at the county fair, when its guesses are averaged, can typically estimate the weight of a giant pumpkin to within a few pounds -- much more accurately than a pumpkin farmer can.

A relatively small number of companies including Google, HP, Intel, Yahoo, and Eli Lilly have jumped on this idea by creating prediction markets to aggregate the opinions of employees on predictions such as a new product's ship date and first quarter sales. The predictions are generally remarkably accurate -- much more so than the "official" numbers offered up by internal product managers or Marketing.

So how come more companies don't avail themselves of such a valuable tool, asks Harvard Business School professor Andy McAfee in a recent blog post.

It's not because the technology is hard to acquire: Inkling Markets, Xpree, and Consensus Point, among others, will happily provide a company with Web-based prediction market software. So what is the real stumbling block? Is it that companies don't really want the most accurate information about future events to come out and be widely known?

How Prediction Markets Work

A corporate prediction market is essentially a Web location where any employee can come and bet on a potential outcome. Bets come in the form of virtual betting chips that employees can allocate any way they wish. When the market closes, the company's best predictors -- those who win the most chips -- are rewarded with some incentive or other.

Here's a very simplified example of how PMs work. Your company opens a prediction market for all employees to participate in. Among the predictions you can invest in are the following two:

Our new AutoNailer product will ship by the March 31 deadline.

Our new AutoNailer product will not ship by the March 31 deadline.

I know that my company never ships on time, so I buy shares in the "not ship" prediction. Just as in the stock market, the more people who buy that prediction, the higher its price goes. At the same time, shares of "ship on time" head down. Now, if I learn from Sally over on the assembly line that the AutoNailer build is ahead of schedule, I can sell my "won't ship" stock (at a nice profit) and use those chips to buy cheap "will ship" stock, or place bets on other predictions.

Over time and possibly thousands of transactions, company executives learn what their assembled employees really think will happen. They might end up with an aggregate opinion that employees believe there is an 80 percent chance the product won't ship on time. And most times they are right: the chances of it shipping as promised are probably only two-in-ten.
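The price dynamics in this example can be sketched in a few lines of code. What follows is a hypothetical illustration, not how any of the vendors above actually implement their products: it uses Hanson's logarithmic market scoring rule (LMSR), a common automated market maker for prediction markets. The class name, outcome labels, and liquidity value are all invented for the sketch. The key property is that prices always sum to 1, so each price can be read directly as the crowd's probability estimate for that outcome.

```python
import math

class LMSRMarket:
    """Minimal two-outcome prediction market using Hanson's
    logarithmic market scoring rule (LMSR). Prices sum to 1 and
    act as the crowd's probability estimate for each outcome."""

    def __init__(self, liquidity=100.0):
        self.b = liquidity  # higher b = prices move more slowly per share bought
        self.shares = {"ship": 0.0, "not_ship": 0.0}

    def _cost(self, shares):
        # LMSR cost function: C(q) = b * ln(sum_i exp(q_i / b))
        return self.b * math.log(sum(math.exp(q / self.b) for q in shares.values()))

    def price(self, outcome):
        # Instantaneous price = exp(q_i/b) / sum_j exp(q_j/b)
        total = sum(math.exp(q / self.b) for q in self.shares.values())
        return math.exp(self.shares[outcome] / self.b) / total

    def buy(self, outcome, amount):
        """Buy `amount` shares of `outcome`; returns the cost in virtual chips."""
        before = self._cost(self.shares)
        self.shares[outcome] += amount
        return self._cost(self.shares) - before

market = LMSRMarket()
print(f"{market.price('not_ship'):.2f}")  # prints 0.50 -- no bets yet
market.buy("not_ship", 140)               # pessimists like me pile in
print(f"{market.price('not_ship'):.2f}")  # prints 0.80 -- the 80 percent estimate
```

Buying "not ship" shares pushes that price up and "ship" down automatically, which is how thousands of individual bets get compressed into a single probability that executives can read off the screen.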

Why so effective? As McAfee observed in a recent class discussion, prediction markets provide a kind of corporate truth serum: they let the top of the organization do an end-run around the entrenched bureaucracy and plug into the collective knowledge of the people doing the enterprise's block-and-tackle work, the folks who really know what's happening on the ground.

Little Traction
So back to Andy McAfee's question: Why don't more companies use this tool to inform their decision making?

I think there have been three initial roadblocks:

  • Unfamiliarity with the Concept. Most managers, especially the veterans, grew up being told not to trust crowds and to avoid mob rule. Instead, they were encouraged to value expertise and be ready to pay consultants for it. It's a major brain-bender to suddenly believe crowds can be smarter than experts.
  • Internal Friction. Corporate prediction markets rankle folks like product managers when employees start betting on a bad outcome for their product.
  • Implementation Issues. Until recently, there weren't many best practices for running efficient prediction markets. Results can be seriously skewed by having too few participants, asking the wrong type of questions, or running a market for too long or too short a time.

These design problems are being solved. PM designers now know a lot more about factors such as motivating participants, keeping the markets liquid, and balancing questions so a full range of outcomes is captured.

Is crowdsourcing the death of reputation and expertise? Should I cancel my consulting contract with Forrester Research and go ask a bunch of strangers down at the skating rink?

Of course not. But what McAfee suggests is that prediction markets are another great tool for informing your decision making. Why wouldn't you want as much good information as possible?

Have you had experience with prediction markets? Do you think the whole idea is ludicrous? Do tell!

  • Sean Silverthorne

    Sean Silverthorne is the editor of HBS Working Knowledge, which provides a first look at the research and ideas of Harvard Business School faculty. Working Knowledge, which won a Webby award in 2007, currently records 4 million unique visitors a year. He has been with HBS since 2001.

    Silverthorne has 28 years of experience in print and online journalism. Before arriving at HBS, he was a senior editor at CNET and executive editor of ZDNet News. While at Ziff-Davis, Silverthorne also worked on the daily technology TV show The Site, and was a senior editor at PC Week Inside, which chronicled the business of the technology industry. He has held several reporting and editing roles at a variety of newspapers, and was Investor's Business Daily's first journalist based in Silicon Valley.