I have just caught up on reading a recent supplement to the Times newspaper entitled “Business Intelligence”. As is often the case for such supplements, this is sponsored ‘advertorial’ content paid for by some of the key software providers in this area, and it proposes some interesting (perhaps controversial) views.

One article stated that companies with good BI saw the recession coming before it was officially announced. If that were the case, you would expect the financial institutions that arguably caused the sub-prime lending problems, and that tend to adopt BI approaches extensively, to have spotted the impending recession. They did not, which undermines the suggestion.

A number of views were presented which suggested wider use of BI by the majority of a company’s employees. For example, one article stated that “the more employees who have access to business data, the greater a company’s ability to anticipate changes”. I am not convinced: having more people with access to more data does not necessarily mean better insights – people need to understand the context of a problem as much as the answer in order to make sense of it. Another suggestion was to set up a “BI competency centre” staffed with both business and IT users to promote and encourage the use of BI, which is one effective way of ensuring that both analytical capability and contextual awareness are used to deliver BI.

Another article advocated a move from BI being presented only to senior managers towards “empowering 60-80%” of employees to use BI for day-to-day decision making, and also suggested the need to include analysis of unstructured data (emails, documents etc.). The freedom BI gives all employees to experiment with information was also presented as an advantage. Again, this may not produce the desired outcomes: there is a risk that many people will not understand the context of a problem, may duplicate analysis, and may spend too much time analysing data. I would also be interested to see a real-world demonstration of a BI tool that would allow 80% of the staff in an organisation to generate meaningful BI from unstructured data!

The concept of collaborative BI, using Web 2.0 social networking approaches to share and discuss BI outputs, was also proposed. This may work in some organisations, but arguably other, more traditional forms of communication, e.g. conference calls and meetings, may be more efficient and effective.

As many of the articles presented the suppliers’ view of BI tools, there was also a focus on tool usability and the clarity of output visuals, including “active dashboards”, faster deployment times than traditional development, and linkage to performance targets. All these suggestions are to be welcomed, if applied intelligently and effectively.

A more balanced approach, where wider visibility of BI outputs and relevant measures is linked to controlled “What if…” type analysis, is probably more suitable for most organisations. If this is supported by the dedicated BI competency centre mentioned previously, there is a far greater likelihood that the overall approach will be beneficial. In my experience, detailed domain knowledge, coupled with a knowledge of data structures and an adaptable approach to a problem, is the key to effective BI. The actual tool used to deliver the BI is one of the less important parts of the solution.
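To illustrate the kind of controlled “What if…” analysis I have in mind, here is a minimal sketch in Python. The figures, scenario names and the revenue measure are all hypothetical; the point is simply that the scenarios are agreed and governed rather than free-form experimentation by every employee.

```python
# Minimal sketch of a controlled "What if" analysis: one agreed measure
# (projected 12-month revenue) recalculated under a small, predefined set of
# churn scenarios. All figures are hypothetical, for illustration only.

def projected_revenue(customers: int, avg_monthly_revenue: float,
                      monthly_churn_rate: float, months: int = 12) -> float:
    """Project total revenue over `months`, applying a constant monthly churn rate."""
    total = 0.0
    remaining = float(customers)
    for _ in range(months):
        total += remaining * avg_monthly_revenue
        remaining *= 1.0 - monthly_churn_rate
    return total

# The "controlled" part: scenarios are defined up front, not invented ad hoc.
scenarios = {"baseline": 0.020, "churn improves": 0.015, "churn worsens": 0.030}

for name, churn in scenarios.items():
    revenue = projected_revenue(customers=10_000, avg_monthly_revenue=42.0,
                                monthly_churn_rate=churn)
    print(f"{name:<15} churn={churn:.1%}  projected revenue £{revenue:,.0f}")
```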

One major omission I noticed from most articles was the data itself: there was no reference to data catalogues or data quality, both of which clearly have a major impact on the outputs of any analysis. Interestingly, the only person to mention data quality, availability and relevance was Howard Dresner (the man who coined the term Business Intelligence in 1989). His article also focused on the strategic and organisational uses of BI and was less concerned with the actual systems that create the BI, which is not surprising as he does not represent a software vendor.
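As a simple illustration of why data quality matters so much to BI outputs, consider the sketch below. The order records are entirely made up, but they show how duplicates and missing values quietly change even a basic measure such as average order value.

```python
# Minimal sketch: the same "average order value" measure calculated before and
# after basic data quality checks. The order data is hypothetical.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1001, 1002, 1002, 1003, 1004],   # order 1002 appears twice
    "value":    [120.0, 85.0, 85.0, None, 95.0],  # order 1003 has no value recorded
})

naive_avg = orders["value"].mean()  # double-counts 1002 and silently skips the missing value

clean = orders.drop_duplicates(subset="order_id").dropna(subset=["value"])
clean_avg = clean["value"].mean()

print(f"Duplicate rows:  {orders.duplicated(subset='order_id').sum()}")
print(f"Missing values:  {orders['value'].isna().sum()}")
print(f"Average order value, uncleansed: £{naive_avg:.2f}")   # £96.25
print(f"Average order value, cleansed:   £{clean_avg:.2f}")   # £100.00
```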

Other omissions from most of the articles included data and process governance, metadata management, and approaches for managing the multitude of report outputs.

These articles present a very rosy view of the benefits that can potentially be gained from the exploitation of BI, but they pose the burning question:

Can effective Business Intelligence exist without effective data governance?

Any views…..

5 thoughts on “Business Intelligence without Data Quality”

  • 19th January 2010 at 6:04 am

    Good observations, Julian.

    I ran a survey last year (results here: http://bit.ly/w1dd3).

    There were various findings but for me the most striking was just how many companies were launching a BI initiative without even considering data quality in the first few months.

    Also, where companies had invested in BI, nearly 50% were experiencing major issues relating to data quality.

    So the message is a clear one, as you say in your post: investing in BI? Get your data governance and DQ processes in order first!

  • 22nd March 2010 at 1:54 pm

    I have made the case for all-pervasive BI on a number of occasions, but this never really went beyond 40-50% of staff; then again, I wasn’t looking to enhance my license revenue!

    I believe that it is true that the value of BI increases as more people use it, but there is a cut-off point as well. Where that cut-off is depends on the nature of the workforce and the industry in question. Maybe it could be as high as 60% for some organisations, but I would think that was atypical.

    I get a bit tired of the linkage between BI and the global recession. BI doesn’t predict macro-economic trends. It is there to help people to understand what is happening within their business and with their customers; at a push, some BI systems may also look at trends in an organisation’s market. Examples of the use of BI are things like: product X has strong take-up in small-to-medium sized clients; industry sector Y seems to be causing us profit problems; strategy Z does not seem to be achieving its objectives; our drop in profits in Q2 was driven by the loss of three key accounts, which was not offset by our strong growth in new business with smaller customers.

    While of course a global recession impacts all of the above, I can’t imagine that many companies have a cube allowing global economic health to be diced and sliced. Economists struggle enough with this themselves and this is not really the preserve of companies.

    I think the suggestion that BI either did or did not predict the recession is specious, probably emanating from the area being oversold by people not charged with actually implementing solutions. BI is not a crystal ball, just a useful tool.

    However, I also think that some of the true benefits of BI have been undersold. BI will not generally help you to predict recessions, but it can be a useful tool for navigating through one when it occurs.

    Peter

      • 24th March 2010 at 11:11 am

        Peter,

        Thanks for the comments.

        I think we are both coming from similar starting points regarding the use of analysis and the impact that data quality can have on it. I have covered this in more detail in another blog post, “Is computer analysis accurate?” http://ow.ly/1qdvS.

        In summary, people should treat analysis with caution and should understand the nature of the supporting data and the algorithms used, in order to determine what additional controls or analysis may be required.

        Julian

  • Pingback: The Business Intelligence / Data Quality symbiosis « Peter Thomas – Award-winning Business Intelligence and Cultural Transformation Expert
