Learn from Your Analytics Failures
by Michael Schrage | 10:00 AM September 3, 2014
By far, the safest
prediction about the business future of predictive analytics is that more
thought and effort will go into prediction than analytics. That’s bad news and
worse management. Grasping the analytic “hows” and “whys” matters more than the
promise of prediction.
In the good old
days, of course, predictions were called forecasts and stodgy
statisticians would torture their time series and/or molest multivariate
analyses to get them. Today, brave new data scientists discipline k-means clusters and random forests to proffer
their predictions. Did I mention they have petabytes more data to play with and
process?
While the
computational resources and techniques for prediction may be novel and
astonishingly powerful, many of the human problems and organizational
pathologies appear depressingly familiar. The prediction imperative frequently
narrows focus rather than broadens perception. “Predicting the future” can—in
the spirit of Dan Ariely’s Predictably
Irrational—unfortunately bring out the worst cognitive impulses
in otherwise smart people. The most enduring impact of predictive analytics,
I’ve observed, comes less from quantitatively improving the quality of
prediction than from dramatically changing how organizations think about
problems and opportunities.
Ironically, the
greatest value from predictive analytics typically comes more from their
unexpected failures than their anticipated successes. In other words, the real
influence and insight come from learning exactly how and why your predictions failed.
Why? Because it means the assumptions, the data, the model and/or the analyses
were wrong in some meaningfully measurable way. The problem—and pathology—is
that too many organizations don’t know how to learn from analytic failure. They
desperately want to make the prediction better instead of better understanding
the real business challenges their predictive analytics address. Prediction
foolishly becomes the desired destination instead of the introspective journey.
In pre-Big Data
days, for example, a hotel chain used some pretty sophisticated mathematics,
data mining, and time series analysis to coordinate its yield management
pricing and promotion efforts. This ultimately required greater centralization and limits on local operators' flexibility and discretion. The forecasting
models—which were marvels—mapped out revenues and margins by property and room
type. The projections worked fine for about a third of the hotels but were
wildly, destructively off for another third. The forensics took weeks; the data
were fine. Were competing hotels running unusual promotions that screwed up the
model? Nope. For the most part, local managers followed the yield management
rules.
Almost five months
later, after the year’s financials were totally blown and HQ’s credibility
shot, the most likely explanation materialized: The modeling group—the data
scientists of the day—had priced against the hotel group’s peer competitors.
They hadn’t weighted discount hotels into either pricing or room availability.
For roughly a quarter of the properties, the result was both lower average
occupancy and lower prices per room.
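
To make the failure mode concrete, here is a minimal, hypothetical Python sketch, not the chain's actual model. It prices a property slightly below the average of whatever competitive set it is handed, then runs the result through a crude price-sensitivity curve for occupancy; the rates, the positioning factor, the elasticity, and the demand curve are all invented for illustration. The only point it makes is that leaving discounters out of the set inflates the reference price, which drags down both occupancy and revenue per available room.

# Hypothetical illustration only: a toy yield-management rule, not the hotel
# chain's actual model. It sets a room rate from the average price of the
# chosen "competitive set" and estimates occupancy from a simple
# price-sensitivity curve against the true market average.

def recommend_rate(competitor_rates, positioning_factor=0.95):
    """Price slightly below the average of the chosen competitive set."""
    return positioning_factor * sum(competitor_rates) / len(competitor_rates)

def expected_occupancy(rate, market_rate, elasticity=1.5, base_occupancy=0.85):
    """Toy demand curve: occupancy falls as the rate rises above the market rate."""
    premium = (rate - market_rate) / market_rate
    return max(0.0, min(1.0, base_occupancy * (1 - elasticity * premium)))

peer_rates = [180, 195, 210, 175]   # full-service "peer" competitors
discount_rates = [95, 110, 120]     # discounters excluded by headquarters
true_market = (sum(peer_rates) + sum(discount_rates)) / (len(peer_rates) + len(discount_rates))

for label, comp_set in [("peers only", peer_rates),
                        ("peers + discounters", peer_rates + discount_rates)]:
    rate = recommend_rate(comp_set)
    occupancy = expected_occupancy(rate, true_market)
    print(f"{label:20s} rate=${rate:6.2f}  occupancy={occupancy:.0%}  RevPAR=${rate * occupancy:6.2f}")

On these made-up numbers, pricing against peers alone recommends a higher rate but yields lower occupancy and lower revenue per available room. The sketch is directional only; in the real episode, realized room prices fell as well.
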
The modeling group
had done everything correctly. Top management's belief in its own brand value and positioning had excluded discounters from the competitive landscape; the flawed assumption came from the top, not from the analysts. Think this
example atypical or anachronistic? I had a meeting last year with another hotel
chain that’s now furiously debating whether Airbnb’s impact should be
incorporated into its yield management equations.
More recently, a
major industrial products company made a huge predictive analytics commitment to preventive maintenance, aiming to identify and fix key components before they failed and to allocate the firm's limited technical services talent more effectively.
Halfway through the extensive—and expensive—data collection and analytics
review, a couple of the repair people observed that, increasingly, many of the
subsystems could be instrumented and remotely monitored in real time. In other
words, preventive maintenance could be analyzed and managed as part of a
networked system. This completely changed the design direction and the business
value potential of the initiative. The value emphasis shifted from preventive
maintenance to efficiency management with key customers. Again, the predictive
focus initially blurred the larger vision of where the real value could be.
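
The repair people's observation amounts to a different architecture, and a rough sketch may help. The following hypothetical Python fragment shows the networked-system framing: each instrumented subsystem streams a health reading, and a simple rolling check flags units for service across the whole installed base rather than relying only on periodically collected data. All names, thresholds, and telemetry here are invented for illustration, not drawn from the company in question.

# Hypothetical sketch of remote monitoring as a networked system: instrumented
# subsystems stream readings, and a rolling check flags units for service.
# Unit names, the vibration metric, and the threshold are all assumptions.

from collections import defaultdict, deque
from statistics import mean

WINDOW = 12            # readings per rolling window (assumed)
VIBRATION_LIMIT = 4.0  # service threshold in arbitrary units (assumed)

recent = defaultdict(lambda: deque(maxlen=WINDOW))  # unit_id -> recent readings

def ingest(unit_id: str, vibration: float) -> bool:
    """Record one remote reading; return True if the unit should be scheduled."""
    recent[unit_id].append(vibration)
    window = recent[unit_id]
    return len(window) == WINDOW and mean(window) > VIBRATION_LIMIT

# Simulated telemetry from two customer sites: one unit drifting toward failure.
stream = [("site-A/pump-7", 2.0 + 0.3 * i) for i in range(14)] + \
         [("site-B/pump-3", 2.1) for _ in range(14)]

for unit_id, reading in stream:
    if ingest(unit_id, reading):
        print(f"schedule service: {unit_id} (rolling vibration {mean(recent[unit_id]):.2f})")

In practice the simple threshold would give way to proper models, but the shift is the point: the data arrive continuously from customer sites, so maintenance becomes something to manage jointly with those customers rather than merely something to predict.
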
When predictive
analytics are done right, the analyses aren’t a means to a predictive end;
rather, the desired predictions become a means to analytical insight and
discovery. We do a better job of analyzing what we really need to analyze and
predicting what we really want to predict. Smart organizations want predictive
analytics cultures where analyzed predictions generate smarter questions as well as statistically meaningful answers. Those cultures quickly and
cost-effectively turn predictive failures into analytic successes.
To paraphrase a famous
saying in a data science context, the best way to predict the future is to
learn from failed predictive analytics.