(MoneyWatch) The financial media focuses much of its attention on stock market forecasts by purported investment gurus because that's what gets the public's attention. Investors must believe the forecasts have value, or they wouldn't tune in, subscribe to investment newsletters, or buy publications like Barron's that claim to provide "news before the markets know."
Unfortunately for investors, a whole body of evidence demonstrates that market forecasts have no value (though they provide me with plenty of fodder for my blog) -- their accuracy is no better than what you would expect from chance. And for investors who haven't learned that forecasts should be treated purely as entertainment, or what Jane Bryant Quinn called "investment porn," forecasts actually have negative value, because they can cause investors to stray from well-developed plans.
The latest piece of evidence on the futility of forecasts comes from CXO Advisory Group. The investor research firm set out to determine whether stock market experts, whether self-proclaimed or endorsed by others, provide useful guidance on how to time the stock market. To find the answer, from 2005 through 2012 CXO collected roughly 6,600 forecasts for the U.S. stock market offered publicly by 68 experts, bulls and bears employing technical, fundamental and sentiment indicators. All of the forecasts were publicly available on the Internet, and the archives reached back as far as the end of 1998. CXO selected experts through Web searches for public archives containing enough forecasts, spanning enough market conditions, to gauge their broader accuracy.
CXO's methodology was to compare forecasts for the U.S. stock market to the S&P 500 index returns over the future intervals most relevant to the forecast horizon. They excluded forecasts that were too vague and forecasts that included conditions requiring consideration of data other than stock market returns. They matched the frequency of a guru's commentaries (such as weekly or monthly) to the forecast horizon, unless the forecast specified some other timing. And importantly, they took into account the long-run empirical behavior of the S&P 500 index. For example, if a guru said investors should be bullish on U.S. stocks over the year, and the S&P 500 index was up by just a few percent, they judged the call incorrect (because the long-term average annual return has been much higher). Finally, they graded complex forecasts with elements proving both correct and incorrect as both right and wrong (not half right and half wrong).
The following is a summary of CXO's findings:
- Across all forecasts, accuracy was worse than the proverbial flip of a coin -- just under 47 percent.
- The average guru also had a forecasting accuracy of about 47 percent.
- The distribution of forecasting accuracy by the gurus looks very much like the proverbial bell curve -- what you would expect from random outcomes. That makes it very difficult to tell if there is any skill present.
- The highest accuracy score was 68 percent and the lowest was 22 percent.
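To see why a bell-shaped spread of scores is exactly what luck produces, here is a minimal simulation sketch. The numbers are hypothetical assumptions of mine, not CXO's data: 68 forecasters each graded on 100 coin-flip calls, so any spread in their scores is pure chance.

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

# Hypothetical setup (not CXO's actual data): 68 "gurus," each making
# 100 forecasts that are nothing more than coin flips.
NUM_GURUS = 68
FORECASTS_PER_GURU = 100

scores = []
for _ in range(NUM_GURUS):
    # Each call is right with probability 0.5 -- no skill at all.
    correct = sum(random.random() < 0.5 for _ in range(FORECASTS_PER_GURU))
    scores.append(100 * correct / FORECASTS_PER_GURU)

mean = sum(scores) / len(scores)
best, worst = max(scores), min(scores)
print(f"mean accuracy: {mean:.1f}%  best: {best:.0f}%  worst: {worst:.0f}%")
```

Run it and the scores cluster in a bell shape around 50 percent, with a few "stars" well above 60 and a few laggards below 40 -- a spread much like CXO's results, generated with zero skill. That is why the distribution alone can't reveal whether any skill is present.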
There were many well-known forecasters among the "contestants." I've highlighted 10 of the more famous, most of whom I'm sure you'll recognize, along with their forecasting score.
- James Dines, founder of The Dines Letter. According to his website, "he is truly a living legend... one of the most-accurate and highly regarded Security Investment Analysts today." His forecasting accuracy score was 50 percent. Not quite the stuff of which legends are made.
- Ben Zacks, a co-founder of well-known Zacks Investment Research and senior strategist and portfolio manager at Zacks Wealth Management Group. His score was 50 percent.
- Bob Brinker, host of the widely syndicated MoneyTalk radio program and editor of the Marketimer newsletter. His score was 53 percent.
- Jeremy Grantham, Chairman of GMO LLC, a global investment management firm. His score was 48 percent.
- Dr. Marc Faber, publisher of the Gloom, Boom & Doom Report. His score was 47 percent.
- Jim Cramer, CNBC superstar. His score was 47 percent.
- John Mauldin, well-known author. According to his website, "His individual investor-readers desperately need to know what his institutional money-manager clients and friends know about the specific investments available to help them succeed in challenging markets." His score was just 40 percent.
- Gary Shilling, Forbes columnist and founder of A. Gary Shilling & Co. His score was 38 percent.
- Abby Joseph Cohen, partner and chief U.S. investment strategist at Goldman Sachs. Her score was 35 percent.
- Robert Prechter, president of Elliott Wave International, publisher of the Elliott Wave Theorist and author of multiple books. He brought up the rear with a score of 22 percent.
Of course, a few had fairly good records. But only five of the 68 gurus scored above 60 percent (among them David Dreman, at 64 percent), while 12 scored below 40 percent. It's also important to keep in mind that the scoring treats forecast-based strategies as costless to follow, but actually implementing them is not.
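A quick back-of-the-envelope check shows that even the five above-60-percent scores are consistent with luck. This is my own illustration, not CXO's analysis, and it assumes 100 graded forecasts per guru (CXO's actual counts varied; gurus with fewer forecasts would clear the bar by chance even more often):

```python
from math import comb

def tail_prob(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more correct
    calls out of n when each call is a coin flip."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n = 100                   # assumed number of graded forecasts per guru
p_60 = tail_prob(n, 60)   # chance one coin-flipping guru scores >= 60%
expected = 68 * p_60      # expected number of such "winners" among 68 gurus
print(f"P(score >= 60%) = {p_60:.3f}; expected lucky gurus: {expected:.1f}")
```

Under these assumptions, roughly two of 68 pure guessers would be expected to score 60 percent or better; with fewer forecasts per guru the expected count rises further. A handful of high scorers, in other words, is no evidence of skill.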
The bottom line is that the research shows that when it comes to predicting economic growth, interest rates, currencies or the stock market, the only value of gurus is to make weathermen look good.
Keep this in mind the next time you find yourself paying attention to the latest guru's forecast. You're best served by ignoring it. As I point out in my latest book, that's exactly what Warren Buffett does himself and what he advises you to do -- ignore all forecasts. They tell you nothing about the direction of the market, but a whole lot about the person doing the predicting.
Image courtesy of Flickr user Images_of_Money