The Heavy Cost of Dirty Data And How to Clean It Up

By Israel Rosales on January 13, 2014


Photo by: Duncan Hull [CC Attribution 2.0]

I would assert that if asked, most companies would agree that clean Master Data is incredibly important to their success. In a previous post we pointed out the importance of Master Data and the need to educate employees on the role of clean data. If all companies understand that Master Data is vital, why do so many fail to keep their data clean?

Speaking with our customers on a daily basis, we’ve discovered an important ingredient in this mystery. The customers who succeed in achieving a high level of data integrity have educated their employees on the importance of clean data. Conversely, the worst enemy of improvement is the Status Quo: doing nothing and giving in to inertia. In most cases, workers decide not to take action for a number of reasons:

  • Lack of involvement
  • Laziness
  • Fear of change
  • Not understanding the benefits of the change

There’s a common root cause behind why employees don’t take action: they haven’t been shown how clean data affects the business’s bottom line. To effectively communicate the benefits of maintaining data, there are two main financial arguments that should be used:

  • ROI (Return on Investment) of implementing those changes in the Master Data processes.
  • COI (Cost of Inactivity) of doing nothing and keeping the current Status Quo.

ROI justification is usually presented in a business case where the company analyzes the cost and benefit of the investment. Changing master data processes with Winshuttle tools is something all organizations could use to justify the investment, and the misses can be easy to identify. An example of a miss is spending 1 million USD in overages due to errors in customers’ shipping addresses. One thing we do for customers is perform a BVA (Business Value Analysis) to avoid costly mistakes and identify process improvements.
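To make the two financial arguments concrete, here is a minimal sketch of the arithmetic behind them. The figures (a 1 million USD annual loss from address errors, a hypothetical 250,000 USD remediation cost) are illustrative assumptions, not outputs of Winshuttle’s BVA methodology:

```python
# Hypothetical figures for illustration only.
def roi(annual_benefit: float, investment: float) -> float:
    """Return on Investment: net benefit divided by the cost of the change."""
    return (annual_benefit - investment) / investment

def coi(monthly_error_cost: float, months_of_inaction: int) -> float:
    """Cost of Inactivity: losses that keep accumulating while nothing changes."""
    return monthly_error_cost * months_of_inaction

# Example: shipping-address errors cost $1,000,000 per year,
# and fixing the master data process costs an assumed $250,000.
annual_loss = 1_000_000
investment = 250_000

print(f"ROI: {roi(annual_loss, investment):.0%}")              # 300%
print(f"COI over 6 months: ${coi(annual_loss / 12, 6):,.0f}")  # $500,000
```

The point of the second function is the article’s argument in miniature: the COI grows every month the Status Quo is kept, so “doing nothing” has a price even when no new money is spent.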

But in many situations the ROI is not (or should not be) the main criterion. Measuring cost inefficiencies is more about the COI (Cost of Inactivity). Instead of asking “How long will it take to recoup the investment?”, one should be asking “What is the cost of doing nothing?”. That question leads a business to analyze others:

  • How would it affect us if our competitors improve and become more competitive?
  • Could we achieve our company objectives without changing anything?
  • Could we afford to keep those inefficiencies in our processes?

In all these situations we cannot afford to maintain the current Status Quo. Dirty data would block high-level business objectives such as reducing required working capital or shortening time to market. Although the average worker rarely gives it much thought, master data is at the heart of all business processes; in fact, there is no business process that Master Data doesn’t impact. Poor-quality master data degrades every operating area, causing serious economic impact.

Everyone knows it’s imperative to maintain our investments. For example, we keep our cars running smoothly with proper maintenance, changing the oil at the recommended intervals and ensuring the brakes work. We should treat master data with the same care, regularly analyzing whether it is optimized.

Ultimately, it is not about how long it would take to recover the investment in improving your master data. It is about a more serious question: if nothing changes, how long can we cover our inefficiencies and all the associated extra cost without impacting operations?

We’re here to help!

If you’d like to speak with us about how you can analyze your business processes and your data, please reach out.

Questions or comments about this article?

Tweet @IsraelRosJ to continue the conversation!