Think about these two statements:
- “If we wait for the quality of data to improve, we’ll never get the project off the ground.”
- “If we proceed with poor quality data, we’ll never get the project implemented.”
Sound familiar? While the statements are paraphrased, we’ve all encountered them at one point or another. Waiting for data quality to improve before embarking on a project most likely means the project is behind schedule before it starts. Embarking on a project knowing that data quality is poor likely means the project will fail to deliver on its goal.
With limited control over a project and almost no control over upstream data quality, here is a tried-and-true set of practices to follow:
- Publicly acknowledge and memorialize data quality concerns to key project stakeholders up front. Make certain that data suppliers are aware the project will be relying on good-quality data, but also be realistic and let them know you fully appreciate the demands of their job.
- Be extremely proactive by communicating early and often with the user community, who will eventually recognize poor data quality. Let them know that upon project completion there will be a 3-, 6-, or maybe even 9-month post-implementation review process to track incremental data quality improvement.
- Bring a few key contributors from the user community and the data supplier together so that each side gains further insight into its partner’s concerns.
- Report, report, report – and then report again.
Here are a few reporting tips:
- Report in the same format on a set frequency and to the same recipients with a copy to everyone’s direct management team
- Include month-to-date (MTD), year-to-date (YTD) and life-to-date (LTD) metrics
- Praise improvement no matter how small
- Give data suppliers a private preview at least one week prior to publishing – you will gain their confidence and in turn, they will be more likely to push forward on their initiatives
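The MTD/YTD/LTD reporting above can be sketched in a small script. Here is a minimal, hypothetical Python example, assuming each record carries a date and a pass/fail result from some upstream validation check (the sample data, function names, and dates are illustrative, not a prescribed implementation):

```python
from datetime import date

# Hypothetical sample: each tuple is (record date, passed_quality_check).
# In practice these would come from your validation pipeline.
checks = [
    (date(2023, 1, 10), True),
    (date(2023, 2, 5), False),
    (date(2023, 6, 1), True),
    (date(2023, 6, 12), True),
    (date(2023, 6, 20), False),
]

def pass_rate(rows):
    """Fraction of records that passed, or None if there are no records."""
    return sum(ok for _, ok in rows) / len(rows) if rows else None

def quality_metrics(rows, as_of):
    """Return MTD, YTD, and LTD pass rates relative to the as_of date."""
    mtd = [(d, ok) for d, ok in rows
           if d.year == as_of.year and d.month == as_of.month]
    ytd = [(d, ok) for d, ok in rows if d.year == as_of.year]
    return {
        "MTD": pass_rate(mtd),   # month-to-date
        "YTD": pass_rate(ytd),   # year-to-date
        "LTD": pass_rate(rows),  # life-to-date (everything ever received)
    }

report = quality_metrics(checks, as_of=date(2023, 6, 30))
for period, rate in report.items():
    print(f"{period}: {rate:.0%}" if rate is not None else f"{period}: n/a")
```

Keeping the computation this simple makes it easy to run on the same schedule, in the same format, for every reporting cycle, which is exactly what builds trust with recipients.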
In the meantime, the search for the silver bullet continues. If anyone finds it, please share!