
Business Forecasting: What Do You Expect?

Forecasting demand is a serious challenge for most organizations. As Mark Twain famously said, "the art of prophecy is very difficult, especially with respect to the future." Difficult, yes, but essential. Demand forecasts are used for all manner of planning purposes -- from capital budgeting to raw materials ordering to call center staffing. Because forecasts are so important to business, there are all sorts of mathematical techniques and expensive software to help get it right.

There's also the whole judgmental approach to forecasting, where managers tap their knowledge and experience to supply information that statistical forecasting can't really dig into. (Example: a sales manager knows his customer is about to place an uncharacteristically huge order.) When the dust settles, however, judgmental and analytical techniques may fall short of expectations. But the first thing to realize is that expectations about forecast accuracy are often totally unreasonable.

There is, indeed, an upper limit to forecast accuracy -- and it sure isn't 100 percent! If demand for a particular service is highly volatile, you may never be able to do better than, say, 60 percent accuracy, no matter how hard you try. If demand is extremely stable, you may reasonably expect a forecast with 90 percent accuracy. How can you know what forecast accuracy is reasonable to expect? Use the naïve method as a baseline -- if your advanced forecasts aren't beating the naïve, then you'd better make some changes, fast.
What is this so-called "naïve" method?

It generally comes in two flavors. The easiest is just to say "what happened this month (or week, or quarter, or year) will happen next month," and keep rolling that most recent number forward. (You can see why it's called "naïve.") But if there's any seasonality to your demand, there's another type of naïve forecast that's a lot more effective and not a bit harder to calculate.
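To see just how simple it is, here's a back-of-the-envelope sketch in Python -- the demand numbers are made up purely for illustration:

    # Simple (one-step) naive forecast: next period's forecast is just
    # this period's actual value. The demand figures are hypothetical.
    demand = [120, 132, 101, 134, 90, 98, 110]  # made-up monthly demand

    def naive_forecast(history):
        """Forecast the next period as the most recent observed value."""
        return history[-1]

    print(naive_forecast(demand))  # forecasts 110 for next month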

It's called the seasonal naïve method, and it just says "what happened in Q3 (or any time interval) this year will happen in Q3 next year." There are, of course, much more sophisticated methods, but you'll never know whether they're actually more accurate (and therefore more valuable) unless you compare them against a simple naïve forecast. And even more worrisome -- your sophisticated techniques may actually make things worse. A SAS Institute white paper warns about exactly this peril:
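Again, hardly any math required. Here's a rough Python sketch, assuming quarterly data (so a season length of four) and, once more, invented numbers:

    # Seasonal naive forecast: the forecast for a period is the actual value
    # from the same period one season ago. Season length of 4 = quarterly data.
    quarterly_demand = [200, 350, 280, 500,   # last year: Q1-Q4 (made up)
                        210, 360, 300, 520]   # this year: Q1-Q4 (made up)

    def seasonal_naive_forecast(history, season_length=4):
        """Forecast the next season by repeating the last full season."""
        return history[-season_length:]

    print(seasonal_naive_forecast(quarterly_demand))  # next year's Q1-Q4 forecast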

The naïve forecast provides a baseline level of accuracy against which all other forecasting efforts must be compared. Very few companies utilize naïve models, but everyone should. If you find, for example, that a naïve model forecasts your business with 70 percent accuracy, but your existing systems and processes generate forecasts that are only 60 percent accurate, then something is terribly wrong!
All of this sounds unfathomable -- how could million dollar systems and elaborate collaborative processes produce worse forecasts than a naïve model -- but it happens every day. Until you've [checked against the naïve] and proven otherwise, don't be so sure it isn't happening at your organization.
In other words, don't have blind faith that more math means a better forecast. And don't think that more input from managers and executives always improves forecasts, either. Just test your forecast (or have your analyst test her forecast) against the naïve one to be sure that all your highfalutin forecasts are worth the investment.
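One way to run that test, sketched in Python: score your official forecast and a naïve baseline against the same actuals. Here "accuracy" is defined as 100 minus the mean absolute percentage error -- just one common convention, and all the numbers below are invented:

    # Compare an "official" forecast with a naive baseline on the same actuals.
    # Accuracy here = 100 - MAPE (mean absolute percentage error), one common choice.

    def accuracy(actuals, forecasts):
        """Return 100 - MAPE, floored at zero, as a rough 'percent accuracy'."""
        mape = sum(abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)) / len(actuals) * 100
        return max(0.0, 100.0 - mape)

    actuals           = [105, 98, 120, 110]   # made-up actual demand
    official_forecast = [130, 80, 100, 140]   # from your system or process (made up)
    naive_baseline    = [100, 105, 98, 120]   # last period's actual, shifted forward

    print(f"official: {accuracy(actuals, official_forecast):.0f}% accurate")
    print(f"naive:    {accuracy(actuals, naive_baseline):.0f}% accurate")
    # If the naive line beats the official one, something is wrong upstream.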

(Image of Tattered Binoculars by AdmScoo, CC 2.0)
