How does a business set a realistic return on investment (ROI) expectation for analytics? This question often comes up from the customers I talk to and the vendors I advise – from customers so they can build a realistic business case for the investment, and from vendors so they can price their offerings.

Is there a rule of thumb, an industry norm for expected ROI percentage? Not really – every analytics project is different: the business problems are diverse and so are the use cases, and results are ultimately determined by how effectively the analytics project is executed and its life cycle managed.

That said, certain ways of categorizing business problems and use cases yield distinct ROI profiles. One such categorization is ‘cost mitigation’ vs. ‘revenue generation’ use cases. Let’s look at these categories in turn:

Cost mitigation use cases

Companies (hopefully) know the current costs of the different components and processes of their business, and have a notion of which business processes can be streamlined with data-driven insights and of the expected cost savings. Companies can also estimate the cost of implementing the analytics project and of managing the life cycle of the analytics model. So with sufficient due diligence, you can build a fairly reliable ROI model.

So what ROI percentage have the majority of such projects been accruing? In my experience, it has been between 50% and 300% for successful implementations. In other words, for every $100 spent on an analytics project, in the majority of cases, companies have been able to save between $150 and $400 in costs.
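To make the arithmetic explicit, here is a minimal sketch in Python, using only the illustrative figures above, of how gross savings translate into an ROI percentage (net gain relative to the amount invested):

```python
def roi_percent(savings: float, cost: float) -> float:
    """ROI as a percentage: net gain relative to the amount invested."""
    return (savings - cost) / cost * 100

# Illustrative numbers only: a $100 analytics spend yielding
# $150 to $400 in gross cost savings.
for savings in (150, 400):
    print(f"${savings} saved on a $100 project -> {roi_percent(savings, 100):.0f}% ROI")

# Output:
# $150 saved on a $100 project -> 50% ROI
# $400 saved on a $100 project -> 300% ROI
```

Note that the 50%–300% range refers to the net gain over the investment, which is why $150 in gross savings on a $100 project corresponds to a 50% ROI, not 150%.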

Risk mitigation use cases (e.g., IT security or business risk) have a dynamic similar to cost mitigation use cases in instances where you can translate the risk into an expected cost.

Revenue generation use cases

When the business goal is to generate new revenue or sustain existing revenue – for instance, by monetizing data, providing a better customer experience, or targeting marketing efforts – it becomes much more difficult to quantify the revenue potential with the degree of certainty with which you can quantify cost savings.

So what ROI percentage have the majority of such projects been accruing? Frankly, it has been all over the map. Even within the same project or initiative, the ROI can shift significantly over time. I have seen projects, such as large-scale churn analytics, where the ROI has been 10-fold (i.e., 1000%). I have also seen projects, such as personalized customer care, where the returns could not be reliably quantified and correlated with the investments.

A good analogy for this dichotomy can be found in the financial investment industry: ‘money market funds’ vs. ‘stock funds’. Money market funds have lower risk and provide predictable yields. Stock funds, in contrast, have a higher risk profile: they could provide much better returns than money market funds over an investment cycle of a few years, but could also deliver significant negative returns.

Summary

So what strategy are companies adopting when investing in analytics projects? A small fraction of companies (primarily based in Silicon Valley) derive their primary source of revenue directly from analytics. But the majority of companies are focusing on implementing cost mitigation use cases first, and plan to use the accrued – and more predictable – ROI from these use cases to then invest in revenue generation use cases, which carry a higher risk profile but a greater potential for growing the business.

Originally posted on LinkedIn