A few years ago we wrote a series of blog posts on ISO 8000-150 which have been perennially popular. Since those posts were written, ISO 8000-61 has been published, which provides a richer and more comprehensive approach to data quality management. (more…)
It has been stated that the most common cause of software project failure is a failure to capture or understand what the system is required to achieve. This is equally true of your information needs: have you truly captured the data requirements that meet your information needs and organisational objectives?
Is your data over- or under-specified? (more…)
You may be wondering why you should bother improving your data quality or what the benefits of this activity may be. You may be wondering how to secure suitable resources and funding to deliver improvements to data quality. Read on to discover why ‘data quality is free’.(more…)
All data quality problems, at their root, involve some form of human error. Whilst this is easy to say, it is perhaps harder to identify and resolve the causes of these human errors.
In this blog post, I will explore ways to categorise human data errors and propose strategies to reduce the severity of these errors. (more…)
When talking about data quality, it is usual to consider different aspects or ‘dimensions’ of data quality – validity, completeness, uniqueness, consistency, timeliness and accuracy. These six dimensions were agreed as the most relevant and representative of data quality in work led by DAMA UK, to which I contributed in 2013, and were published in this White Paper.
Some recent work and discussions suggest that there may be another dimension to consider – continuity. This blog post explores things in a bit more detail.(more…)
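To make the dimensions above more concrete, here is a minimal sketch of how two of them – completeness and uniqueness – might be measured for a single field. The sample records, field name and scoring functions are hypothetical illustrations, not part of the DAMA UK White Paper.

```python
# Hypothetical sample records for illustration.
records = [
    {"customer_id": "C001", "email": "a@example.com"},
    {"customer_id": "C002", "email": None},
    {"customer_id": "C003", "email": "a@example.com"},
    {"customer_id": "C004", "email": "d@example.com"},
]

def completeness(records, field):
    """Completeness: proportion of records where the field is populated."""
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    return populated / len(records)

def uniqueness(records, field):
    """Uniqueness: proportion of populated values that are distinct."""
    values = [r[field] for r in records if r.get(field) not in (None, "")]
    return len(set(values)) / len(values)

print(completeness(records, "email"))  # 3 of 4 populated -> 0.75
print(uniqueness(records, "email"))    # 2 distinct of 3 populated -> 0.666...
```

Each dimension yields a simple proportion that can be tracked over time; the other dimensions (validity, consistency, timeliness and accuracy) typically need reference rules or a trusted source to score against.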
The world is changing, people are changing, organisations are changing, and this is no different for data requirements. An organisation needs to accept this and make sure that change requirements are captured, impact-assessed, acted upon and communicated.
During my career I’ve seen many approaches to configuration change or data change management, ranging from totally non-existent right through to exemplary. It’s true that the process takes resources (time and effort): first to put Change Management in place, and then to maintain it as a business-as-usual activity. However, this investment will pay dividends by reducing rework, ambiguity and process failure, and by creating a defined and agreed data specification that meets the business’s information requirements. (more…)
Those days when you need to make an important decision can be trying at the best of times. The old saying “Garbage In, Garbage Out” is never more relevant (particularly if an AI tool is making automated decisions).
A variant of this cropped up in a discussion recently: “Garbage In, Gospel Out”. So what can this mean, and does it apply to your organisation? (more…)
In this post I will expand on these themes and explore how modern organisations can have a ‘Victorian’ attitude to data quality.