Ask anyone how they feel about data management and they will likely tell you that it’s a problem and it’s growing.
Data quality is a growing concern, and so are the methods used to keep data clean. You also want to ensure that any data entered or maintained is managed as well as possible.
Dirty and irrelevant data in your systems places a heavy burden on the average organization. It is often treated as an operational tax, because it costs time, resources, and potentially lost opportunities.
These data challenges pervade business activities in complex ways and aren’t limited to master data, but include all data that flows in and out of the business. The problem is also relevant to data remaining in the business and used as a reference point for less strategic decision making.
In the CGMA report From Insight to Impact, over half of corporate leaders rank big data and analytics as a top-ten corporate priority. Of the 2,000 finance professionals interviewed for the report, 87% felt that data holds the potential to change the way business is done, but weren’t entirely clear on what that data is or could be.
Although businesses gather a surplus of data, not all of it needs to inform decision making. Ideally, all data would be present and correct, but if you invest energy in correcting less important data first, you will likely miss out on early wins in your data management initiatives. To help you on this journey, I thought I would share some ideas on things to consider in lean data management initiatives.
Prioritize your inventory
A lean data management approach considers the end to end process of how data is used in the business and strives to determine which data is most pertinent to decision making.
In the same way that an organization continuously manages inventory and manufacturing processes around current and new products, a business tends to focus on active business activities, pushing old stock and spares for redundant or end-of-life products and services out of focus.
While older data and resources may still hold the key to future benefits, they aren’t necessarily where the majority of value in the data is found. Tracking inventory and prioritizing processes will help determine exactly what is important. Keeping stock of exactly what data you use is also important, especially data used to make critical decisions.
An example of a data inventory exercise might be one where you examine all types of customer master records, in order to determine where you want to begin data management improvements.
Let’s say you’ve identified, through an internal stakeholder analysis and review, that customer master data is a problem area for your business, and that the one-time customer creation process in particular is the most critical. That process might be the best place to start improving your customer data, both now and in the future, rather than customer master creation as a whole.
You might develop an automated data management solution for creating customer master records and try to incorporate every possible scenario of customer master data with every possible combination of data entry. However, this could become very complex, and finalizing your designs could take a long time. If that’s not what you need, then that approach is not a good place to start.
Using a leaner approach, you can start with your most common customer master type and the aspects that represent your greatest challenges. Tackle those as the most critical ones to perform data hygiene and improve the data creation and maintenance processes. This new and improved process can be added to your existing practices in a way that doesn’t change how you create customers as a whole.
Determine what is relevant
Getting things right the first time is ideal, but sometimes it takes several iterations before you determine the balance between a fully guided procedure and one that is a little looser in how it helps you manage data. Implementing a major change can be very disruptive, but using an agile lean approach to making changes should be additive in terms of capability and control.
For example, setting an email address as a mandatory field is great if you know that all customers will have an email address. But if you set it as mandatory simply because you think it will be present for most customers, you run the risk of introducing bad data into the process. The field may be non-mandatory in your system of record, yet you could still make it mandatory in your low-code lean application for data management.
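To make this concrete, here is a minimal sketch of a lean-layer validation rule that enforces email as mandatory in the data management app even though the system of record leaves it optional. The field names and the light-touch format check are assumptions for illustration, not taken from any specific product.

```python
# Fields this lean app treats as mandatory, even if the system of
# record does not. A tuple keeps error ordering deterministic.
REQUIRED_IN_LEAN_APP = ("customer_name", "country", "email")

def validate_customer_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field in REQUIRED_IN_LEAN_APP:
        value = record.get(field)
        if value is None or str(value).strip() == "":
            errors.append(f"missing required field: {field}")
    # Light-touch sanity check rather than a full RFC 5322 parser.
    email = record.get("email", "")
    if email and "@" not in email:
        errors.append("email does not look like an address")
    return errors
```

A record missing its email address would fail the lean app’s check here while still being perfectly acceptable to the underlying system of record, which is exactly the additive control described above.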
Every field that you choose needs to represent some value to someone in the organization, otherwise it doesn’t make sense to include it or make it a prerequisite to complete the process.
How quick does this need to be?
Data management processes can be thought of as having very similar characteristics to factory assembly lines.
Run the assembly line too slowly and products don’t leave the factory fast enough, while resources sit around idle. Similarly, if you have data that needs to be created relatively quickly, keep the number of participants in the process to the bare minimum needed to ensure proper governance through completion.
If you run the process too quickly, you risk sub-par results in terms of completeness and quality, and could cause stress to everyone involved.
If you decide you want to use alerts and monitor how long it takes to complete steps in your data governance solution, tune the timers so they are appropriate to the end-to-end process and the needs of the scenario before escalating to the next participant. For example, don’t simply choose one hour and expect that everyone will jump on every event within 60 minutes and complete it.
Alternatively, consider the design of two processes: One for a standard service level and one for an expedited service.
You can try to accommodate both scenarios into the same process, however you may find that you can only expedite if you reduce the number of participants or the amount of data you need to make a decision.
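The trade-off between the two lanes can be sketched very simply: the expedited path gets fewer approvers and fewer required fields. Everything named below is an assumption for illustration.

```python
# Two hypothetical process definitions: the expedited lane trades
# participants and data requirements for speed.
STANDARD = {
    "approvers": ["data_steward", "finance", "sales_ops"],
    "required_fields": ["name", "country", "email", "payment_terms"],
}
EXPEDITED = {
    "approvers": ["data_steward"],
    "required_fields": ["name", "country"],
}

def route(request: dict) -> dict:
    """Send a request down the expedited lane only when it is flagged as such."""
    return EXPEDITED if request.get("expedited") else STANDARD
```

The design choice this illustrates: rather than bending one process to serve both service levels, each lane stays simple, and the routing decision is the only shared logic.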
Are the changes making a difference?
While lean, low-code solutions for data management, like those offered by Winshuttle, help accelerate the reworking of your data management and governance for both master and transactional data, they can’t assess whether your approach to data management is effectively improving the end-to-end process. That is something you have to review more holistically.
The only way you can determine whether your data management solutions are effectively meeting the needs of the business is to assess how your key performance indicators (KPIs) for data management look before and after implementing the various solutions.
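A before-and-after KPI review can be as simple as computing the percentage change per metric. The KPI names and numbers below are invented for illustration only.

```python
def kpi_delta(before: dict, after: dict) -> dict:
    """Percentage change per KPI; a negative value means the metric fell."""
    return {
        k: round(100 * (after[k] - before[k]) / before[k], 1)
        for k in before
    }

# Hypothetical measurements taken before and after a process change.
before = {"avg_create_time_hours": 20.0, "duplicate_rate_pct": 4.0}
after = {"avg_create_time_hours": 8.0, "duplicate_rate_pct": 1.5}
```

Here a drop in creation time and duplicate rate would show up as negative deltas, giving a quick, comparable read on whether the lean changes actually moved the needle.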
However, with lean solutions you can pivot the way you handle various data management scenarios. You can do this in a far more agile way than you can with traditional solutions, which are very burdensome in terms of requirements management and development effort.
Take a look at this free white paper that outlines some of the common challenges associated with customer master data, and how you can address them with a lean data management approach.
Questions or comments about this article?
Tweet @uploadsap to continue the conversation!