You may wonder why you should bother improving your data quality or what the benefits of this activity may be. You may wonder how to secure suitable resources and funding to deliver improvements to data quality. Read on to discover why ‘data quality is free’.
Data quality has direct impacts on profitability, services, products and customer experience.
- Good-quality data, exploited well, helps deliver the best possible products, services and performance
- Poor-quality data leads to reduced efficiency, lower profitability and poorer decision-making
- Poor data exploitation reduces efficiency through the extra time and effort needed to deliver outcomes
Some may think that improving the quality of data is similar to an obsessive approach to house cleaning – spending inordinate amounts of time removing the final specks of dust from inaccessible places. However, this is not the case.
The phrase ‘data quality is free’ is adapted from 1980s quality guru Philip B. Crosby, whose 1979 book ‘Quality is Free’ inspired the ‘zero defects’ approaches adopted by many quality-conscious organisations. The way organisations use and exploit data is also a key part of the quality equation.
So, do you recognise your organisation in any of the following?
- Significant time and effort spent combining data sources and resolving the differences between them to prepare regular performance reports
- Staff creating new spreadsheets and databases because they do not trust the ones that already exist
- Significant time spent searching through different data stores to find information to support decision making
- Many different data stores all containing information about similar business entities
- Many different data stores that all need updating when reference information changes, for example, new product identifiers, additional service codes etc.
- Projects over-running on time and/or cost due to poor forecasting and scoping
- Incorrect decisions about which projects to approve
- Plus many other situations where data is poorly managed and used
If you recognise the above, now consider the financial impact of the situations you have identified:
- The cost of all the effort used to prepare performance reports that could perhaps come from a single system, or a small number of systems – perhaps a day a week for one of the team
- The time spent creating and maintaining new spreadsheets and databases – perhaps a week's effort to create and then half a day per week to maintain
- 20% of the time of your analysts taken up searching for information that already exists, but cannot easily be located
- Perhaps a couple of days' effort updating the many different data sets whenever new codes are required or entities are added
- Project costs being 10-50% over the forecast due to foreseeable, but ignored, data issues
- The cost of a project that should not have been approved
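Costs like those above can be roughed out with simple back-of-envelope arithmetic. The sketch below is purely illustrative: the daily rate, the number of analysts and the working weeks per year are assumptions invented for the example, and the effort figures are the ones suggested in the list above.

```python
# Illustrative estimate of avoidable annual data costs.
# All figures are assumptions for the sake of the example, not survey data.

DAILY_RATE = 350         # assumed fully loaded cost of one staff day (GBP)
WEEKS_PER_YEAR = 46      # assumed working weeks, allowing for leave etc.

# Effort estimates taken from the list above, in days per week
report_prep = 1.0             # a day a week preparing performance reports
spreadsheet_upkeep = 0.5      # half a day a week maintaining duplicate spreadsheets
analyst_search = 0.2 * 5 * 3  # 20% of three (assumed) analysts' five-day weeks

weekly_days = report_prep + spreadsheet_upkeep + analyst_search
annual_cost = weekly_days * WEEKS_PER_YEAR * DAILY_RATE

print(f"Avoidable effort: {weekly_days:.1f} days/week")
print(f"Estimated avoidable annual cost: £{annual_cost:,.0f}")
```

Even with these deliberately modest assumptions, the avoidable effort adds up to several staff-days per week – before counting project overruns or projects that should never have been approved.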
In addition to the above are all the intangible issues related to this situation – staff frustration at ‘not being able to find the truth’, low job satisfaction and morale, and a slow speed of response by the organisation.
All the above avoidable costs are reduced by improving data quality management and data exploitation – data quality is free.
A survey from a number of years ago suggested that infrastructure- and asset-intensive organisations spent around 20-25% of turnover on all data-related activities (acquiring, finding, storing, exploiting, improving and manipulating data etc.). The survey also suggested that, for organisations that were less effective at data exploitation, these costs could be around 5% higher. Depending on sector, the proportion of turnover spent on data-related activities could be significantly higher, and hence so could the potential efficiency gains. What proportion of your turnover and profit is spent on data-related activities? How much could this improve if data was exploited and managed better?
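To put those survey ranges into context, here is a hypothetical worked example. The turnover figure is an invented assumption, and ‘around 5% higher’ is read here as 5% above the baseline spend – one possible interpretation of the survey figure:

```python
# Hypothetical worked example using the survey ranges quoted above.
# Turnover and percentages are illustrative assumptions, not survey results.

turnover = 100_000_000      # assumed annual turnover (GBP)
data_cost_share = 0.22      # within the 20-25% of turnover suggested by the survey
inefficiency_uplift = 0.05  # '5% higher', read here as 5% above baseline spend

baseline_spend = turnover * data_cost_share
potential_saving = baseline_spend * inefficiency_uplift

print(f"Spend on data-related activities: £{baseline_spend:,.0f}")
print(f"Potential saving from better exploitation: £{potential_saving:,.0f}")
```

On these assumptions, even the modest reading of the survey implies a seven-figure annual saving for a mid-sized organisation; reading ‘5% higher’ as five percentage points of turnover would make the figure several times larger.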
So what are you waiting for?