Data Quality has often come under the spotlight as a difficult discipline in which to demonstrate quantifiable financial value. Many people have struggled to gain funding for DQ initiatives because of the purely speculative nature of the ROI. The aim here is to discuss ways in which we can go beyond this speculation and look to financially quantify data quality within an organisation.
Our Internal Processes are often hard to quantify in financial terms. For instance, a large amount of time and effort has been applied to ensure that the business community has a definitive business glossary, containing all the terminology and business rules that they use within their reporting and business processes. This has been published, and highly praised, throughout the organisation.
However, how can we begin to ascertain the true financial benefit of this activity? To do this we would need to interview the whole business community, asking them to cast their collective minds back to the world before the business glossary. Asking them:
- How long did they spend chasing the correct definition?
- How many reports did they generate with incorrect definitions?
- How much scrap and rework did they undertake because of those incorrect definitions?
Two key metrics I want to delve into are:
- Known cost of poor data quality
- Known saving due to DQ Management
Known cost of poor data quality
I like to think of the 'known cost of poor data quality' as a reactive metric. What I mean by this is that the cost of poor data quality can only truly be ascertained after an issue has occurred. If an issue has not yet occurred, the cost can only be pure speculation. As part of our reaction to a data quality issue we should undertake an impact assessment.
This impact assessment will ask, among other things:
- How long has the issue been a problem?
- Who has it impacted during that time?
- What workarounds were undertaken?
An example of the cost
We discussed Business definitions in the previous section, so let's now take an example from another one of our Internal Processes: Data Quality Issue Resolution. The below issue was raised to a newly established Data Quality team by an MI analyst within a financial services organisation:
"We have an Issue with Product Names within our datamart. A large number of records are coming through with incorrect or empty product names, which is causing havoc with my reporting. The product code is correct, but the people I send my reporting pack to won't understand the code. I'm currently bringing the data from the datamart into an Access database, and joining the table to my lookup table that contains all the correct product codes/names. This is taking me about an hour a day because of the amount of data I have to import/export. It's been like this for 4 months but I didn't know who to contact about the issue. Thanks for your help!"
This issue was promptly resolved by ensuring that reference data was updated to reflect true products. But how can we begin to add a financial perspective to this issue?
Using ITJobsWatch, the British IT jobs market tracker, I noticed that the average salary of an MI Analyst was £32,500. Based upon this salary, I estimated the cost of one hour lost each working day for 4 months.
- Weekly: £625 (based on 4.33 weeks in a month)
- Daily: £125 (based on 5 working days per week)
- Hourly: £16.66 (based on a 7.5-hour working day)
- 4 months at 4.33 weeks = 17.32 weeks
- 5 days per week for 17.32 weeks = 86.6 days
- 1 hour a day for 86.6 days at £16.66 per hour ≈ £1,442.75
In this example, the cost arose from 86.6 hours spent firefighting instead of on value-adding activity. If the MI Analyst hadn't reported the issue, and had continued to firefight for a year, they would have spent 259.8 hours — more than six working weeks — firefighting. A scary thought.
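The arithmetic above can be sketched as a short script. The salary figure and working patterns (4.33 weeks per month, 5 days per week, 7.5 hours per day) are the assumptions stated in the text; the result differs from the £1,442.75 quoted only because the hourly rate is not rounded mid-calculation.

```python
# Assumptions taken from the text above.
ANNUAL_SALARY = 32_500      # average MI Analyst salary (ITJobsWatch)
WEEKS_PER_MONTH = 4.33
DAYS_PER_WEEK = 5
HOURS_PER_DAY = 7.5

# Derive the hourly rate from the annual salary.
weekly_pay = ANNUAL_SALARY / 12 / WEEKS_PER_MONTH   # ~£625
daily_pay = weekly_pay / DAYS_PER_WEEK              # ~£125
hourly_pay = daily_pay / HOURS_PER_DAY              # ~£16.67

def firefighting_cost(hours_per_day: float, months: float) -> float:
    """Cost of time lost to a manual workaround over a period of months."""
    working_days = months * WEEKS_PER_MONTH * DAYS_PER_WEEK
    return hours_per_day * working_days * hourly_pay

# 1 hour a day for 4 months: ~£1,444 (≈ the text's £1,442.75 after rounding).
print(round(firefighting_cost(1, 4), 2))
# The same workaround left unreported for a full year: 259.8 hours.
print(round(12 * WEEKS_PER_MONTH * DAYS_PER_WEEK, 1))
```

Parameterising the calculation like this makes it easy to cost any reported workaround: plug in the hours lost per day and the duration, and the reactive metric falls out.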
Known saving due to DQ Management
'Known saving due to DQ Management' is, on the other hand, a proactive metric for quantifying financial benefit. It is the measurement of savings garnered through DQ management efforts that capture potential issues before they become actual issues.
Caution should be taken to ensure that speculative savings are not mistaken for known savings.
For instance, you could speculate that because a data quality issue relating to customer address details - that would have impacted marketing, billing, customer care and customer complaints - was fixed prior to impacting customer mailouts, you saved the organisation £5,000,000. Not to mention the potential bad publicity and customer churn. What qualifies you to suggest this monetary value?
The best way to ensure that this metric relates to true savings is to ensure that DQ management efforts are closely aligned to business processes.
An example of the saving
For instance, the Data Quality Firewall initiative that I wrote about previously discovered that a UK retail bank was about to overpay its staff by £200,000 in sales incentive payments due to duplicate sales transactions in its processing tables. The initiative resulted in these duplicate records being captured before incentive payments were calculated. Our DQ initiative saved the organisation £200,000 by performing one simple data profiling technique (FACT). Not to mention the savings due to scrap & rework, and the potential trade union/media involvement that such a mistake and the subsequent clawing back of employees' take-home pay could provoke (SPECULATION).
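The profiling technique behind that catch can be illustrated with a minimal sketch. The field names, transaction values, and the idea of profiling the transaction ID as the natural key are my illustrative assumptions, not details from the original initiative.

```python
from collections import Counter

# Hypothetical sales transactions; the second T002 row is a duplicate feed record.
transactions = [
    {"txn_id": "T001", "staff_id": "S10", "amount": 5000},
    {"txn_id": "T002", "staff_id": "S11", "amount": 7500},
    {"txn_id": "T002", "staff_id": "S11", "amount": 7500},  # duplicate
    {"txn_id": "T003", "staff_id": "S10", "amount": 2000},
]

# Profile the natural key: any txn_id appearing more than once is a duplicate.
counts = Counter(t["txn_id"] for t in transactions)
duplicates = {txn_id: n for txn_id, n in counts.items() if n > 1}

# Overpayment exposure: amount paid on duplicated records minus the amount
# that should have been paid on one copy of each duplicated sale.
dup_total = sum(t["amount"] for t in transactions if counts[t["txn_id"]] > 1)
dedup_total = sum({t["txn_id"]: t["amount"] for t in transactions
                   if counts[t["txn_id"]] > 1}.values())
overpayment = dup_total - dedup_total

print(duplicates)    # {'T002': 2}
print(overpayment)   # 7500
```

Run before the incentive calculation, a check like this turns a speculative saving into a known one: the duplicate records, and the exact exposure they represent, are captured as facts.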
When looking at measuring Data Quality from a financial perspective it is important to look at it from the perspective of both the 'known cost of poor data quality' and the 'known saving due to DQ Management'.
We know, and accept, that it may not be possible to truly quantify all aspects of data quality management. However, starting to quantify data quality in terms of costs and savings wherever you can will help to raise the profile of both your data quality management activities and the need for fit-for-purpose data within your organisation.