Standards-based data collection
By Clinton Jones on March 26, 2013
A consistent experience is the mainstay of running an effective ERP system. Part of the rationale for implementing ERP is a unified view of the data objects and processes in flight within the business. Centralizing your data processing to a single system, or a few systems, eliminates some of the inconsistencies and churn associated with servicing the differing demands of multiple data platforms.
In this vein, several overarching objectives flow from taking on the goal of standardized data collection approaches.
Whether your business unit focuses on buying, making, storing or selling materials, the inventory or material master is the mainstay of reference for determining how and what actions you need to run your business.
If your material masters do not accurately reflect the characteristics of materials across all product lines, and do not contain all the values that determine what you can do with them, then it is likely that you will experience operational inefficiencies. Some simple examples include missing or incorrect item weights and measures, such as pack sizes, dimensions and storage requirements. Misaligned data leads to downstream and upstream operational inefficiencies, and even to safety risks: ordering in inappropriate unit quantities, storing items intended for ambient storage in refrigerated areas, trying to squeeze oversized products into undersized packaging, or using the wrong materials to ship product.
Using a prescriptive approach to defining all the attributes of a material avoids the problem of elective decision making on the part of people tasked with defining materials in your inventory system.
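To make the prescriptive idea concrete, a minimal sketch of attribute validation is shown below. The field names and rules here are hypothetical illustrations, not drawn from any specific ERP data model; the point is that required attributes and allowed values are declared once, rather than left to elective decisions by whoever creates the record.

```python
# Illustrative sketch only: attribute names and rules are hypothetical,
# not taken from any particular ERP system's material master.

REQUIRED_ATTRIBUTES = {
    "base_unit",         # e.g. "EA", "KG"
    "gross_weight_kg",
    "net_weight_kg",
    "pack_size",
    "dimensions_cm",     # (length, width, height)
    "storage_condition", # e.g. "AMBIENT", "CHILLED", "FROZEN"
}

ALLOWED_STORAGE = {"AMBIENT", "CHILLED", "FROZEN"}

def validate_material(material: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    problems = ["missing attribute: " + name
                for name in sorted(REQUIRED_ATTRIBUTES - material.keys())]
    storage = material.get("storage_condition")
    if storage is not None and storage not in ALLOWED_STORAGE:
        problems.append("invalid storage_condition: " + storage)
    # A simple cross-field rule: net weight cannot exceed gross weight.
    if material.get("net_weight_kg", 0) > material.get("gross_weight_kg", 0):
        problems.append("net weight exceeds gross weight")
    return problems
```

A check like this can run before a material shell is ever saved, so an incomplete or contradictory record is rejected at the point of entry rather than discovered later in the warehouse.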
Collaboration and traceability
It is not unusual for many participants to be involved in the definition of materials in your system. These can include participants from finance, procurement, quality, logistics and engineering. In the lifecycle of material creation in your ERP system it is important to ensure that all those who contribute data to the final picture of a material get to make their contribution. In addition, when you change the characteristic values of a material, it is equally important that all those with a vested interest in its definition have insight into, and a right of veto over, the proposed changes.
A standards-based approach to gathering the data from all the participants is one in which the participation process is either linear or parallel, but is always consistent. Although a given ERP system like SAP may support the consistent creation of some of these data attributes, it is often not mandatory to provide all information at the outset. As a consequence, incomplete or partial material information may be created as inventory material shells that are ultimately not wholly usable by all those who need them. Using a prescriptive approach with multi-participant workflows and delegated responsibility enables the necessary collaboration, and moreover provides visibility into the end-to-end material creation process, avoiding the unnecessary follow-up activities that flow from unstructured, ad-hoc processes without workflow governance.
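The linear-or-parallel idea can be sketched as a staged workflow in which a material creation request advances only when every role in the current step has signed off. This is an illustrative toy, with made-up role names and stages, not a model of any particular workflow product:

```python
# Hypothetical sketch of a staged, multi-participant approval workflow.
# Each step lists the roles that may contribute in parallel; the request
# advances only once all roles in the current step have signed off.

WORKFLOW = [
    {"engineering"},                 # step 1: a single contributor
    {"procurement", "quality"},      # step 2: two contributors in parallel
    {"logistics", "finance"},        # step 3: two more in parallel
]

class MaterialRequest:
    def __init__(self):
        self.step = 0
        self.signed_off = set()

    @property
    def complete(self):
        return self.step >= len(WORKFLOW)

    def contribute(self, role):
        if self.complete:
            raise ValueError("request already complete")
        if role not in WORKFLOW[self.step]:
            # Enforces consistency: a role cannot act out of turn.
            raise ValueError(role + " may not act at step " + str(self.step + 1))
        self.signed_off.add(role)
        if self.signed_off == WORKFLOW[self.step]:
            self.step += 1          # everyone in this step has contributed
            self.signed_off = set()
```

Because every request passes through the same declared steps, the process is consistent regardless of who participates, and the current step is always visible for follow-up.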
Oftentimes a part of the data collection process requires the inclusion of content and documentation that is not a necessary part of the final data creation step or which has a temporary life and will ultimately be discarded. In this vein, one can consider credit and inspection reports and even resumes in the employee hiring process. Although a resume may be an important piece of documentation in the recruitment and hiring process, it is not normally necessary in the employee on-boarding process. However, for the purposes of an audit trail and process transparency it may be important to understand who interviewed a candidate and what they submitted as their documentation for being considered for a given position.
In the realm of transactional processing, keeping track of sourcing documentation, such as scans of incoming vendor or customer documents, communiques or other information, is an audit and compliance requirement. Here again, however, certain decisions made around engineering, sourcing, payments, receipts, the granting or denial of credit and the structuring of accounts may never find their way into the core system of record, and may instead languish on shared or local drives across the enterprise. Keeping associated documentation in its context, in a single referable repository, can lead to major long-term productivity gains for audit and business process improvement.
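A minimal sketch of "documentation in context" is shown below: each stored document carries a reference to the business object it belongs to, so it can later be retrieved for audit by that object rather than hunted down on a shared drive. The object types, identifiers and in-memory store are all hypothetical, standing in for whatever repository a real system would use.

```python
# Illustrative sketch: documents stored with a reference to the business
# object they belong to, so they stay findable in context for audit.

import hashlib
from datetime import datetime, timezone

REPOSITORY = {}  # stand-in for a real document store

def attach_document(object_type, object_id, filename, content, uploaded_by):
    """Store a document keyed by content hash, linked to its business object."""
    doc_id = hashlib.sha256(content).hexdigest()[:16]
    REPOSITORY[doc_id] = {
        "object": (object_type, object_id),  # e.g. ("vendor_invoice", "4711")
        "filename": filename,
        "uploaded_by": uploaded_by,          # who submitted it: audit trail
        "uploaded_at": datetime.now(timezone.utc).isoformat(),
        "content": content,
    }
    return doc_id

def documents_for(object_type, object_id):
    """All document ids attached to one business object."""
    return [doc_id for doc_id, meta in REPOSITORY.items()
            if meta["object"] == (object_type, object_id)]
```

Even a scheme this simple answers the audit questions raised above: what was submitted, by whom, when, and in connection with which transaction.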
Single version of the truth
A single version of the truth, with only one originating entry for each piece of master data, is often held up as one of the main drivers for convergence on solutions like a master data tool. Central to the concept is the idea of using one core repository for all the information related to the data or process. It is easy to assume that this must mean a centralized system; in reality, a centralized and consistent set of processes that drives unified data is also a way to converge on a single version of the truth.
By avoiding or preventing the introduction of data processing activities via methods other than a standardized approach, it is possible to improve the overall quality of data. When users stop relying on email, post-it notes and paper forms, and instead concentrate on a system-based approach to getting things acted upon, they not only accelerate the action completion cycle but also enhance business process transparency and effectiveness. Moreover, using flexible and adaptable technologies like Winshuttle Foundation enables businesses to be more agile in adjusting to a constantly changing operational environment without disrupting operational effectiveness.
Reliance on a standardized approach to data gathering, and restricting the number of ways that business units and employees collaborate with one another, need not mean a negative experience for getting important and meaningful master and transactional data tasks completed in the enterprise. Failure to impose a structured and prescriptive set of tools can, however, lead to maverick and malformed business processes that rely on tribal knowledge, personal relationships, individual intellect and specialized skills. In an accelerated and highly demanding work environment, unstructured approaches to data gathering and collaboration are inconsistent, do not scale, and can lead to unexpected outcomes, personal stress and, ultimately, process failures.
About the author
Clinton Jones is a Director for Finance Solutions Management at Winshuttle where he has worked since 2009. He is internationally experienced having worked on finance technologies and business process with a particular focus on integrated business solutions in Europe, the Middle East, Africa and North America. Clinton serves as a technical consultant on technology and quality management as it relates to data and process management and governance for finance organizations globally. Prior to Winshuttle he served as a Technical Quality Manager at SAP and with Microsoft in their Global Foundation Services group.
Questions or comments about this article?
Tweet @uploadsap to continue the conversation!