A recent conversation with another data quality professional highlighted organisations that treat data problems as if they were an ‘Act of God’. These organisations were trying to deny that the problems existed, and were certainly denying that they could do anything about them.
Whilst there are many definitions of an ‘Act of God’, the term typically refers to events that are outside human control. Is that really the case with data?
According to the excellent ISO 8000-150, data errors fall into one of two categories:
- Errors by user – providing inaccurate data, not completing all data fields, inappropriate analysis, etc.
- Errors by structure – for example, incorrect database schemas, interfaces, etc.
Whilst lightning strikes or tornadoes may be thought of as Acts of God, people living in areas prone to them can take precautions to minimise their impact – lightning conductors and storm cellars can be utilised very effectively.
So what about data? Good database design, backed up by effective governance, should minimise the risk of ‘errors by structure’ – so that is one key area covered.
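As an illustration of design minimising ‘errors by structure’, a well-constrained schema can reject bad data at the point of entry rather than letting it accumulate. The sketch below uses SQLite; the table, column names and rules are purely illustrative assumptions, not taken from ISO 8000-150.

```python
# Minimal sketch: schema-level constraints as a 'lightning conductor' for data.
# Table and rules are hypothetical examples, not from any standard.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer (
        id       INTEGER PRIMARY KEY,
        name     TEXT NOT NULL CHECK (length(name) > 0),
        postcode TEXT NOT NULL CHECK (postcode LIKE '% %')  -- crude format rule
    )
""")

# Valid data is accepted...
conn.execute("INSERT INTO customer (name, postcode) VALUES ('A. Smith', 'AB1 2CD')")

# ...but an empty name violates the CHECK constraint and is rejected outright,
# so the error never reaches the database at all.
try:
    conn.execute("INSERT INTO customer (name, postcode) VALUES ('', 'AB1 2CD')")
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```

The point is that the constraint, not a downstream clean-up job, does the work: the bad row simply never lands.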
Errors by user may be thought of as harder to predict and overcome. However, just as lightning conductors deal with lightning strikes, effective data quality monitoring can be used to identify potential data issues to investigate. Further investigation can then determine the root cause of each issue. Root causes could include:
- Users not being aware what to do – more training and user guides needed
- Users opting out of completing non-mandatory fields – monitoring and training/awareness can help
- Users entering incorrect data by following local ‘custom and practice’ – identify the issues and raise awareness of the impacts caused
- Users entering ‘wrong’ data to speed up work completion – again more monitoring and perhaps reporting of the number of data entry errors can resolve this
This covers only some of the more common ‘errors by user’ – what others can you suggest?
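The data quality monitoring described above can be sketched very simply: profile each batch of records for missing mandatory fields and rule violations, and report the counts so that suspicious patterns (opted-out fields, ‘wrong’ data entered to speed up work) surface for investigation. The field names and validation rules here are illustrative assumptions only.

```python
# Hypothetical sketch of batch data quality monitoring for 'errors by user'.
# Fields and rules are invented for illustration.

def profile_records(records, mandatory_fields, validators):
    """Count, per field, missing mandatory values and rule violations."""
    issues = {"missing": {}, "invalid": {}}
    for record in records:
        for field in mandatory_fields:
            if not record.get(field):  # absent, None or empty string
                issues["missing"][field] = issues["missing"].get(field, 0) + 1
        for field, is_valid in validators.items():
            value = record.get(field)
            if value and not is_valid(value):
                issues["invalid"][field] = issues["invalid"].get(field, 0) + 1
    return issues

# A small batch showing the kinds of user errors discussed above
records = [
    {"name": "A. Smith", "postcode": "AB1 2CD", "phone": "01234 567890"},
    {"name": "",         "postcode": "AB1 2CD", "phone": "0"},    # skipped field
    {"name": "B. Jones", "postcode": "XXXX",    "phone": ""},     # 'wrong' data
]
report = profile_records(
    records,
    mandatory_fields=["name", "phone"],
    validators={"postcode": lambda v: " " in v and len(v) >= 6},
)
print(report)
# → {'missing': {'name': 1, 'phone': 1}, 'invalid': {'postcode': 1}}
```

Run regularly, even a crude profile like this turns ‘Acts of God’ into measurable, trackable issues that training and awareness can then address.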