Data quality is free

“Data quality is free. It’s not a gift, but it’s free. What costs money are the unquality things – all the actions that involve not getting data quality right the first time and all the actions to correct these data quality issues”

You may be wondering why you should bother improving your data quality or what the benefits of this activity may be. You may be wondering how to secure suitable resources and funding to deliver improvements to data quality. Read on to discover why ‘data quality is free’.


Continuity – the new data quality dimension

When talking about data quality, it is usual to consider different aspects or ‘dimensions’ of data quality – validity, completeness, uniqueness, consistency, timeliness and accuracy. These six dimensions were agreed to be the most relevant and representative of data quality in work led by DAMA UK, to which I contributed, published in 2013 in this White Paper.
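To make the dimensions a little more concrete, here is a minimal sketch (my own illustration, not part of the DAMA UK work) of how three of them – completeness, validity and uniqueness – might be measured over a set of records. The records, field names and email pattern are all hypothetical.

```python
import re

# Toy customer records used to illustrate the checks (hypothetical data).
records = [
    {"id": 1, "email": "alice@example.com", "country": "UK"},
    {"id": 2, "email": "bob@example",       "country": "UK"},   # invalid email
    {"id": 2, "email": "carol@example.com", "country": None},   # duplicate id, missing country
]

def completeness(records, field):
    """Completeness: share of records where the field is present and non-null."""
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def validity(records, field, pattern):
    """Validity: share of non-null values matching an agreed format."""
    values = [r[field] for r in records if r.get(field) is not None]
    return sum(1 for v in values if re.fullmatch(pattern, v)) / len(values)

def uniqueness(records, field):
    """Uniqueness: share of non-null values that are distinct."""
    values = [r[field] for r in records if r.get(field) is not None]
    return len(set(values)) / len(values)

print(completeness(records, "country"))                          # 2 of 3 filled
print(validity(records, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"))   # 2 of 3 valid
print(uniqueness(records, "id"))                                 # 2 distinct of 3
```

Accuracy, consistency and timeliness are harder to score mechanically, since they usually require a trusted reference source or timestamps to compare against.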

Some recent work and discussions suggest that there may be another dimension to consider – continuity. This blog post explores this potential new dimension in more detail.


Changing your data requirements

The world is changing, people are changing, organisations are changing, and this is no different for data requirements. An organisation needs to accept this and make sure that change requirements are captured, impact-assessed, acted upon and communicated.

During my career I’ve seen many approaches to configuration changes or data change management, ranging from the totally non-existent right through to exemplary practices. It’s true that the process takes resources (time and effort): first to put Change Management in place, and then to maintain it as a business-as-usual activity. However, this investment will pay dividends by reducing rework, ambiguity and process failure, and by creating a defined and agreed data specification that meets the information requirements of the business.


Garbage In, Gospel Out

Most people will be familiar with the old adage “Garbage In, Garbage Out”, which reminds us that if your input data is poor, then any outputs will also be poor.

A variant of this cropped up in a discussion recently: “Garbage In, Gospel Out”. So what does this mean, and does it apply to your organisation?
