As the “Data Quality” conversation continues to gain momentum, we are constantly on the lookout for research and data that help quantify the full scope of the problem.  Really, we need to know this: how much is bad data costing our organizations?  Think of Lord Kelvin’s famous words, “If you cannot measure it, you cannot improve it.”  We want to improve quality and we want to reduce costs, so we need to measure the effect of quality on costs.

In the above infographic we have compiled a greatest-hits collection of “The Costs of Data Quality,” citing a number of analyst reports from the last few years.  Most of this data is focused on the enterprise, but the statistics are remarkably consistent when we filter them through a Lavante-colored lens.  We work directly with business leaders across the P2P space, and in conversations with clients and prospects we have heard similar estimates about the shortfalls of data quality when it comes to Supplier Information Management.  At Lavante we also have the benefit of working directly with supplier data, so we can see firsthand the effect that poor-quality data has on our clients.  The downstream impact, in terms of lost profit and wasted time, is chilling.

The most recently sourced statistics above (placed in the center of the infographic) come from a recent Deloitte study, published in the Wall Street Journal, which found that 67% of CPOs point to “Data Quality” as a key barrier to getting the maximum value out of their systems.  This is very interesting because the research takes the “Data Quality” discussion right into the P2P ecosystem; it is effectively talking about supplier information.  That means two-thirds of CPOs are suffering from bad supplier data, and it is having an adverse effect on their ability to maximize ROI.  Wow… that number is staggering when you think about it.  Expect to see much more of this conversation in the next year.  In 2016, expect CPOs (and other procurement and organizational leaders) to start paying very close attention to how improving their supplier “Data Quality” can ultimately improve their bottom line.

All of this data aligns with what I have been hearing from market trendspotters over the last several months.  Analysts and other experts consistently report that business leaders throughout the P2P cycle are realizing that the investments they have recently made in downstream point solutions and advanced analytics are not reaching their maximum ROI.  The most common reason coming out of the various user groups is “poor data quality.”  It is only a matter of time before we see laser-focused attention on how poor data quality is preventing Procurement and the entire P2P cycle from achieving their optimal levels of value.  The emergence of more “Costs of Data Quality” research in Procurement and P2P will spur a new wave of ideas to improve supplier data quality and, in turn, improve the output of our downstream point solutions and processes.  Truly, measuring the costs of bad supplier data will ultimately lead to a search for solutions and the improvements that follow.  Lord Kelvin was right.