Monday, 1 February 2010

A balanced approach to scoring data quality: Part 3

Today I wanted to discuss the ‘Internal Processes’ section of the scorecard. In case you missed the previous parts of the series, you can read the introduction here, and about the ‘customer’ section here.

Perform, Perform, Perform

Throughout the business world, people are measured on their performance. How well do they carry out their responsibilities? Do they hit their objectives? Do they adhere to any applicable SLAs? The Data Quality team should be no different, and we should look towards measuring our performance against our internal processes.

Our internal processes are the procedures and tasks we follow to ensure data quality is managed and communicated throughout the business community.

Consider the following Internal Processes:

  • Publishing and Review of a Business Terminology / Data Dictionary
  • Resolution and Communication of DQ issues in a timely manner
  • Identification of appropriate system, data & report ownership

All of the above are critical processes within the day-to-day responsibilities of a Data Quality team. If we under-perform in delivering any of these processes, it will have a knock-on impact on how data quality management is delivered within an organisation. In some cases, poor performance within our internal processes could even be a contributing factor to poor data quality.

For example, suppose a Product Manager notices that sales data for their product is inaccurate in the data warehouse and raises a data quality issue with your team. The data warehouse is also used by the Finance team, which is currently drawing on it to provide financial figures for a last-minute board meeting. The issue was raised yesterday and is being investigated, but there has been no communication to the business community to advise them of it. The board of directors are now looking at inaccurate data, questioning the figures and wondering whether they can trust the data at all.

How can we measure our Internal Processes?

We can measure the performance of our internal processes by benchmarking them against our objectives, or against targets based upon our objectives. As an example, let’s take the process of ‘Resolution and Communication of DQ issues in a timely manner’.

Objective

All known Data Quality issues should be immediately communicated to the business community, and be resolved within 3 days of being raised.

Actuals

DQ issues raised – 125
DQ issues resolved within 3 days – 70 (56%)
DQ issues communicated to the business community – 100 (80%)

Upon seeing the measures above, we could ask:

“Why were only 56% of DQ issues resolved within our target time period? Do we need to involve more resources to fix issues? Do we need to adjust the target SLA?”

OR

“All issues were due to be communicated to the business community immediately. Why were 25 issues not communicated? Do we need to set up reminders? Was no one able to pick up the issues?”
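The benchmarking above is simple arithmetic, and can be sketched as a small script. The figures and target levels are the hypothetical ones from the example; the variable names are illustrative, not from any real tool.

```python
# Measuring the 'Resolution and Communication of DQ issues' process
# against its targets, using the example figures from the post.
issues_raised = 125
resolved_within_sla = 70   # resolved within the 3-day target
communicated = 100         # communicated to the business community

# The objective says ALL issues should be resolved and communicated,
# so both targets are 100%.
RESOLUTION_TARGET = 1.00
COMMUNICATION_TARGET = 1.00

resolution_rate = resolved_within_sla / issues_raised    # 0.56
communication_rate = communicated / issues_raised        # 0.80

for name, rate, target in [
    ("Resolution within 3 days", resolution_rate, RESOLUTION_TARGET),
    ("Communication to community", communication_rate, COMMUNICATION_TARGET),
]:
    status = "MET" if rate >= target else "MISSED"
    print(f"{name}: {rate:.0%} (target {target:.0%}) -> {status}")
```

Both measures come out below target, which is exactly what prompts the questions above: is the SLA wrong, or is the process under-resourced?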

In Conclusion

As Satesh suggested in a comment on my previous post: “What gets measured improves”. This is exactly what we are trying to achieve with a scorecard. Poor performance within our internal processes could have a knock-on effect on how our customers perceive DQ management. A process of continuous measurement, analysis and improvement is therefore required, to ensure that we do not become complacent and slip into poor DQ management habits.

The next post in this series will deal with the ‘Financial’ section of the scorecard, and we’ll look into how we can begin to measure the financial impact that DQ management can have on an organisation.

Related Posts

Part 1: Introduction
Part 2: Customer
Part 4: Financial
Part 5: Data Dimensions
Part 6: The Dashboard

4 comments:

Jim Harris said...

Phil,

First of all – excellent series – I am really enjoying it.

Measuring internal processes is probably one of the most contentious aspects of a data quality scorecard – which is why many organizations conveniently ignore it.

Internal service level agreements (SLAs) have not been documented at most of the organizations I have encountered in my consulting career.

Many organizations seem to confuse this critical measurement as simply finger-pointing criticism analogous to a well-organized blame-storming session.

I agree with Satesh as well, that “what gets measured improves” – as long as the people behind the process don’t get defensive. Measuring internal processes is about focusing the attention on process improvement, which is best facilitated by improving collaboration and information sharing such as what is provided via the data quality scorecard.

The corresponding issue I have witnessed when internal SLAs are defined is that, since they measure performance that can in many ways (real and imagined) be associated with job security, the target levels are set too easy to achieve – or the definitions of “communicate the issue” and “resolve the issue” are so nebulous that they can too easily be satisfied.

So yes, what gets measured does improve – but are you measuring and improving what will have the most positive impact on data quality, customer satisfaction, internal performance, and financial benefit?

I have seen too many data quality scorecards “grade on a curve” or otherwise make it a test that is too easy to achieve an acceptable score on – which in my opinion, is almost as bad as not bothering to administer the test in the first place.

I am looking forward to the rest of the series.

Cheers,

Jim

Phil Wright said...

Jim,

Thank you. Some great points you’ve made there too.

I’ve always advised that the objectives should not be too easy to achieve - this isn’t an exercise to catch anyone out, or point the finger. Typically, if an organisation has a structured approach to employee performance appraisals, or yearly objectives, I have advised that scorecard objectives relating to ‘Internal Processes’ should come directly from these appraisals.

For instance, the example I gave in the post about DQ resolution is actually from a yearly objectives plan for a Data Quality analyst. I benchmarked targets for the scorecard against expectations set forth in their yearly objective plan, which were set by their line manager.

You’re right that the definitions of SLAs can be contentious, and I think the only way around this is to be as definitive as we can with the definition, i.e. communication means “updating the intranet site using this template, and e-mailing the 'Data Stewards' list, telling them the issue name, detail, and impacted systems”, etc. Anything less and the definition is too slippery, so, as you say, the SLA is too easy to satisfy.
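One way to make such a definition unambiguous is to express it as an explicit checklist: an issue only counts as “communicated” once every concrete step is complete. The sketch below is purely illustrative; the field names and steps are hypothetical, modelled on the example definition above, not on any real system.

```python
# A sketch of pinning down a slippery SLA definition as a checklist.
# All fields and steps are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class DQIssue:
    name: str
    detail: str
    impacted_systems: list = field(default_factory=list)
    intranet_updated: bool = False   # intranet page updated using the template
    stewards_emailed: bool = False   # 'Data Stewards' list e-mailed

    def is_communicated(self) -> bool:
        # "Communicated" means both steps are done AND the mandatory
        # details (name, detail, impacted systems) are all filled in.
        return (self.intranet_updated
                and self.stewards_emailed
                and bool(self.name and self.detail and self.impacted_systems))

issue = DQIssue("Sales figures mismatch",
                "Warehouse totals differ from source system",
                impacted_systems=["Data Warehouse"])
print(issue.is_communicated())  # False: neither step completed yet
issue.intranet_updated = True
issue.stewards_emailed = True
print(issue.is_communicated())  # True: definition fully satisfied
```

With a definition this concrete, the “issues communicated” figure on the scorecard becomes a count of issues passing this check, rather than a judgement call.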

Julian said...

Phil,

A good, relevant post which aligns well with one of my current major themes - namely, the fact that people (and processes) are the root causes of data quality problems (the symptoms).

Data is typically an input to many business processes, will be transformed or created by the process, and is then provided as an output to other business processes and data stores. It is therefore essential that processes are aware of the quality of both the input and output data.

In a supportive organisational culture, there should be a strong desire to discover the root causes of these problems and to design them out of the process. The data quality metrics you describe should be a key part of identifying issues, assessing their impact and extent, and then developing solutions.

Since perfect data quality is a position most organisations aspire to but few achieve, the quality processes should also assess where business decision-making and follow-on processes need greater checks and controls, to ensure that real-world data quality does not adversely affect the outcome.

Julian.

Phil Wright said...

Thanks Julian.

I have a belief that data providers/creators should be responsible for ensuring that the data they are providing is of good quality, and that if there are data issues, these should be communicated to the user community who consume that data. Increased data transparency, with checkpoints, much like passport control, would be ideal.

This certainly aligns with your opinion that it's essential that processes are aware of the quality of input/output data, and the supportive organisational culture. Couldn't agree more.
