Tim Woods Series #5: Downstream Effects of Duplicate Records
By Clinton Jones on May 17, 2017
In the 5th post in our Tim Woods series, I’ll discuss the concept of overproduction, as it relates to Application Data Management (ADM). It may not surprise you that this has a lot to do with duplication of work, and the effect it has on data quality.
O is for overproduction
Duplicate records are one of the biggest challenges for application data managers and can be a greater risk to business continuity than missing or incomplete data. Duplicate systems and duplicate entries can stifle business agility. This often stems from an incomplete migration vision or even poor data governance.
According to a 2007 Computer Weekly article, "On average, we have found that in the best countries, about 10% of the data is duplicated. In the worst countries, it is about 30%." In the article, Xerox Europe describes using an MDM solution to standardize data held by each of its operations before the data is loaded into SAP. Things have likely progressed over the last ten years, but the problem isn't unique to Xerox or to SAP customers as a whole. Duplicate data falls into the category of overproduction because multiple participants in the ADM process end up duplicating one another's efforts. Tools like LSMW do nothing to reduce or eliminate the risk of duplicate entries, and unless you have configured SAP to identify duplicates, every load can add to the problem.
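A pre-load check can catch many duplicates before they ever reach the system of record. The sketch below is a minimal, hypothetical illustration (the vendor records and key fields are invented): it normalizes chosen fields so that casing, punctuation, and stray whitespace don't hide matches, then groups records whose normalized keys collide.

```python
def normalize(value: str) -> str:
    """Canonicalize a field for comparison: casefold and drop non-alphanumerics."""
    return "".join(ch for ch in value.casefold() if ch.isalnum())

def find_duplicates(records, key_fields):
    """Group records whose normalized key fields collide; return only the collisions."""
    seen = {}
    for rec in records:
        key = tuple(normalize(rec.get(f, "")) for f in key_fields)
        seen.setdefault(key, []).append(rec)
    return {k: v for k, v in seen.items() if len(v) > 1}

# Hypothetical vendor master records awaiting load
vendors = [
    {"name": "Acme GmbH",  "city": "Berlin"},
    {"name": "ACME GmbH ", "city": "berlin"},   # same vendor, different formatting
    {"name": "Widget Co",  "city": "Hamburg"},
]
dupes = find_duplicates(vendors, ["name", "city"])
```

Real master data usually also needs fuzzy matching (abbreviations, transposed words), but even exact matching on normalized keys catches the formatting-only duplicates that mass-load tools wave straight through.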
The knock-on effect
While duplicate customer, vendor, and material records are bad enough, imagine duplicate employee records, or making the same payment twice. From a marketing perspective, duplicate contacts attached to customers in your database mean your audience for outreach campaigns is overstated. If you size campaigns by the number of people you attempt to reach, you shouldn't be surprised when some of them unsubscribe because they receive too many emails. Your ROI will suffer, and duplicate messages won't impress your audience. Gartner suggests this problem can lead to a 25% reduction in potential revenue gains.
In addition, duplicate records break the single, unified view of the customer, which makes it hard to understand the customer relationship and behavior. If you can't see customer interactions and transactions in one place, it's hard to have relevant conversations, plan appropriate outreach campaigns, or even perform meaningful analysis.
Paying an employee or supplier more than once erodes cash flow, which can force a business to borrow funds to continue operating and expanding. Worse, this failure to control disbursements speaks to weak controls and raises red flags for audit. Appropriately and consistently matched and coded payment requests reduce the likelihood of duplicate payments, but with growing transaction volumes, stemming the tide of risk is a constant battle.
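The matching-and-coding idea above can be reduced to a simple rule: two payment requests with the same vendor, the same invoice reference, and the same amount deserve a second look before disbursement. A minimal sketch, with invented payment data and field names:

```python
from collections import defaultdict

def flag_duplicate_payments(payments):
    """Return groups of payment IDs that share vendor, invoice reference, and amount."""
    groups = defaultdict(list)
    for p in payments:
        # Normalize the invoice reference and round the amount so formatting
        # differences don't mask a duplicate.
        key = (p["vendor"], p["invoice_ref"].strip().upper(), round(p["amount"], 2))
        groups[key].append(p["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

# Hypothetical payment requests
requests = [
    {"id": 1, "vendor": "100042", "invoice_ref": "INV-778",  "amount": 1250.00},
    {"id": 2, "vendor": "100042", "invoice_ref": "inv-778 ", "amount": 1250.0},
    {"id": 3, "vendor": "100042", "invoice_ref": "INV-779",  "amount": 80.00},
]
suspects = flag_duplicate_payments(requests)
```

Flagged groups would be held for review rather than rejected outright, since legitimate repeat payments (installments, recurring fees) can share these attributes.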
Winshuttle offers several mechanisms to help reduce and eliminate duplicate records. At the time of record creation, particularly master records and master data, you can route record creation requests through several layers of oversight before the record is actually created. With Winshuttle Foundation, a number of SAP and Salesforce customers have implemented forms and workflow-based processes that have record search as an integral part of the creation request process.
If the request still progresses after a search, it's possible to make certain attributes mandatory or require them to comply with specific input rules, even when the system of record does not enforce those requirements. For example, fields that are optional in SAP can be rendered mandatory in a form or workbook.
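Enforcing stricter rules in the form than in the system of record amounts to a small validation layer in front of the create request. A sketch of the idea, with hypothetical field names (these are not actual SAP field names):

```python
# Fields that are optional in the backend but mandatory in our request form.
# Field names are hypothetical, for illustration only.
REQUIRED_IN_FORM = {"name", "city", "tax_number"}

def missing_fields(request: dict) -> set:
    """Return the form-level required fields that are empty or absent."""
    return {f for f in REQUIRED_IN_FORM if not str(request.get(f, "")).strip()}

# A request that would pass the backend's checks but fail the form's stricter ones
req = {"name": "Acme GmbH", "city": "Berlin", "tax_number": ""}
problems = missing_fields(req)
```

The request would only be routed onward once `missing_fields` returns an empty set, so the stricter standard is applied uniformly at intake instead of being cleaned up after the fact.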
The ability to push record creation requests through a workflow process also provides insight into who the requestors and participants are in data or transaction creation requests, which helps build a profile and pattern of request initiation and solution usage. These usage metrics can feed KPI reporting, and those KPIs can be a key indicator of data quality.
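Deriving such usage metrics from a workflow's audit trail can be as simple as counting requests per initiator. A tiny sketch, assuming a hypothetical audit log shape (a real log would come from the workflow engine):

```python
from collections import Counter

# Hypothetical workflow audit entries
audit_log = [
    {"requestor": "alice", "action": "create_vendor"},
    {"requestor": "bob",   "action": "create_vendor"},
    {"requestor": "alice", "action": "change_material"},
]

# Requests initiated per user: a basic ingredient for usage-pattern KPIs
requests_per_user = Counter(e["requestor"] for e in audit_log)
```

Counts like these can then be trended over time or broken down by action type to spot unusual request patterns.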
Winshuttle has been used for many years for mass maintenance and correction of master data, and it continues to be relevant in this area. For many companies looking to eliminate inefficiency and waste, it's just as critical to establish good data governance practices up front, before requests are processed, to ensure the best possible data quality.
About the author
The Winshuttle blog is written by professional thought leaders who are dedicated to providing content on a variety of topics, including industry news, best practices, software updates, continued education, tips and techniques, and much more.