The relevance of data quality in achieving delivery for Business Intelligence (BI) implementations has become increasingly clear over the past few years, and this trend is expected to continue in 2012, says GARY ALLEMANN at Master Data Management.
Predictions by Boris Evelson of Forrester Research suggest that delivery will be a primary focus for BI projects for the coming year, and a number of these are relevant to the data quality community. Of particular note is the prediction that enterprise standards will not be allowed to stand in the way of delivery: in other words, specialised data quality tools with the functionality to get things done will trump architecture standards.
Leading IT departments recognise that the traditional reliance on a few big vendors, who according to Gartner Research are focussed on protecting the status quo rather than on delivering new value, is insufficient to meet enterprise data quality needs, and they are turning to specialist platforms. The integration of an enterprise data quality platform into business applications and real-time processes is critical to support the increasing demand for near real-time analytics.
Another key prediction is that BI will move into the hands of end users (and IT will have to accept this), but that this self-service element must be controlled and managed at an appropriate level. This continues a trend that has been developing over the last few years in the data quality space, particularly within leading organisations that recognise the importance of business user involvement in data quality and data governance projects. Such involvement enables business projects to remain relevant in the face of changing external circumstances.
Business projects typically run to much tighter deadlines than IT projects and may have far fuzzier objectives. As an example, imagine that a business department needs to rapidly identify all clients in a particular segment in order to respond to a competitor’s new product launch, or risk churn of this key segment. At the same time, the legal department needs to identify all U.S. clients in order to comply with pending legislation.
Ad hoc data analysis of this nature is difficult for IT to deliver in time: typically each requirement is queued, one user loses priority, and a deadline is missed. This has a negative impact on the business; had business users been involved from the start, IT would be better equipped to deal with such requirements.
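To make the scenario concrete, the sketch below is purely illustrative: the client records, column names, segment values and country spellings are invented assumptions, not drawn from any real system. It shows how a business analyst with self-service access might answer both requests, and why a small amount of data quality work changes the answer to the legal department's question.

    import pandas as pd

    # Hypothetical client records; IDs, segments and country spellings are invented for illustration.
    clients = pd.DataFrame({
        "client_id": [101, 102, 103, 104, 105],
        "segment":   ["premium", "standard", "premium", "premium", "standard"],
        "country":   ["US", "U.S.", "United States", "ZA", "usa"],
    })

    # A naive filter on the raw country field misses inconsistently captured records.
    naive_us_clients = clients[clients["country"] == "US"]

    # A quick, "good enough" standardisation step brings the variants together.
    us_variants = {"us", "u.s.", "usa", "united states"}
    clients["country_std"] = clients["country"].str.strip().str.lower().map(
        lambda c: "US" if c in us_variants else c.upper()
    )
    us_clients = clients[clients["country_std"] == "US"]

    # The business department's request is a simple filter on the same data.
    premium_clients = clients[clients["segment"] == "premium"]

    print(len(naive_us_clients), "U.S. clients before standardisation")
    print(len(us_clients), "U.S. clients after standardisation")
    print(len(premium_clients), "clients in the targeted segment")

In this toy example the naive filter finds one U.S. client while the standardised view finds four: exactly the kind of quick, business-driven cleanup that self-service data quality is meant to put in the hands of the people who need the answer.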
Specialist data quality platforms enable business users to take ownership of data governance and data quality issues, either on an ad hoc basis or as formal projects that involve IT. Some of these specialist tools address the key criterion of enabling the business user to quickly deliver "good enough" data quality while supporting enterprise data volumes and standards, and plugging in easily to enterprise systems such as ERP and MDM, either in real time or via batch.
Specialists also bring the data-centric view that is necessary for successful data governance and master data management. This experience, and the vendor’s ability to transfer this knowledge into the corporate environment, can add tremendously to an organisation’s ability to deliver relevant and sustainable data management capabilities, bridging the gap between business and IT.
Data quality is recognised as a key driver for the successful delivery of data governance, master data management and BI projects, and is a core component of many compliance and regulatory projects.
Data quality and BI are intrinsically linked, and the data quality space can be expected to follow the predictions around BI for the coming year. If 2012 is the year of delivery for BI tools, it stands to reason that data quality will follow suit: there is no reason why 2012 should not be a year of delivery all round.