Is a reactive approach to data quality enough?
So in this example, even analysing the problem required moving mountains. It was a time when data quality technology was procured and used primarily by IT departments, with a heavy reliance on specialist skills and timescales driven by software development lifecycles. After cycles of escalations we got the matter resolved, but by then the problem had grown worse and the cost of finding and preventing it had increased. This was classic reactive data quality practice, compounded by poor data governance.
Today, data and data quality are climbing up the corporate agenda. The reactive data quality machine may still be with us, but it is fast losing out to a changing business appetite. The 2014 Experian Data Quality Global Research revealed that nearly 99% of companies surveyed have a data quality strategy.1 The context, scope and user expectations have all changed; data quality practices must be nimble and agile in responding to business needs. No longer can businesses twiddle their thumbs and sit on problems for over two months. Regulators are breathing down our necks, and competitors are stealing customers over the smallest failures in data quality. Organisations surveyed by Gartner estimate that poor-quality data costs them on average $14.2 million annually.2 That is not a figure to take lightly. Businesses today require more responsive data quality practices.
Our new paper “2014: Key Trends Driving the Change In Data Quality Technology” features the Gartner research note, “The State of Data Quality: Current Practices and Evolving Trends.” It looks at the changing nature of data quality as a practice and the technology that has evolved with it.
What are the trends that are changing the nature of data quality technology?
We follow the Gartner note with an Experian commentary on how these trends are changing the nature of data quality technology, drawing on examples from across the industry. In particular, we look at:
- Changing Data: Increasing types of data are changing the focus of data quality technology, requiring it to be specialist where needed but also all-encompassing. Data is growing in volume, variety and velocity, and data quality technology needs to keep pace. Growth in the types of reference data used by organisations adds another dimension to managing data quality.
- External Drivers: Information Governance is setting stringent expectations for data quality and how it is managed and monitored. Failures in governance are often linked to poor execution of processes, and the evidence usually lies in the data that is collected. Effective data quality technology can keep that data in check and meet the requirements of governance programmes.
Gartner reports: “Those planning to deploy data quality tools over the next 12 months cited information governance programmes as their most common intended use case, at 57%.”2
- Changing Expectations: Data profiling and visualisation of data quality are seen as critical to understanding the nature of quality, with the focus shifting from merely fixing the data to preventing the problem. Uncovering issues before they become unmanageable is vital to successful business operations, and being able to convey the impact of data quality in a language that business sponsors and stakeholders recognise is key to winning their backing.
- New Users: A widening range of business users is getting involved with data quality, and they require a new breed of technology. When we speak to companies about their data quality challenges, we see a marked shift from an IT-led to a business-wide approach. Many of these users are looking for technology that is easy to implement and adopt, especially for analysing data quality and monitoring it over time (a minimal illustration of such a profiling check follows this list).
Gartner states: “CDOs, information governance teams and other roles in the business will also become more involved. Vendors' technology solutions are beginning to reflect this change.”2
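To make the “analyse and monitor” idea concrete, here is a minimal sketch of the kind of profiling check a business user might run repeatedly: for each column of a CSV extract, it reports completeness (percentage of non-blank cells) and distinct-value counts. It is purely illustrative; the file name customers.csv and the 95% completeness threshold are assumptions, not anything taken from the paper or from Experian Pandora.

```python
# Hypothetical data-profiling sketch: completeness and distinct-value
# counts per column of a CSV extract. Illustrative only.
import csv
from collections import Counter, defaultdict

def profile(path):
    """Return per-column completeness and distinct-value counts."""
    non_blank = Counter()        # column -> count of non-empty cells
    distinct = defaultdict(set)  # column -> set of observed values
    rows = 0
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        columns = reader.fieldnames or []
        for record in reader:
            rows += 1
            for column in columns:
                value = (record.get(column) or "").strip()
                if value:
                    non_blank[column] += 1
                    distinct[column].add(value)
    return {
        column: {
            "completeness": non_blank[column] / rows if rows else 0.0,
            "distinct_values": len(distinct[column]),
        }
        for column in columns
    }

if __name__ == "__main__":
    # Flag columns falling below an illustrative 95% completeness bar;
    # run on each new extract to monitor quality over time.
    for column, stats in profile("customers.csv").items():
        flag = "" if stats["completeness"] >= 0.95 else "  <-- review"
        print(f"{column}: {stats['completeness']:.1%} complete, "
              f"{stats['distinct_values']} distinct{flag}")
```

Running a check like this against each new extract, and tracking the results over time, is the simplest form of the monitoring these new users are asking for; dedicated tooling adds the visualisation and governance reporting layered on top.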
With these trends in motion, it is imperative that businesses review their current data quality technology and identify whether gaps are already opening up between capability and expectations. The paper also lists specific actions for businesses, advising on what to look out for when facing each of these trends.
At Experian, we have seen these trends rise and are constantly adapting our data quality portfolio in response. The most notable introduction is Experian Pandora, which puts business users in control of data quality, enabling them to analyse, improve and control the quality of their data. The platform takes particular account of the changing nature of data, the widening data quality audience, the expectations set by external drivers such as governance, and the rise of data profiling and visualisation.
Which of these trends are changing the way you plan and execute the data quality strategy within your organisation? Are you observing different trends, and are they making your life easier or harder? We are keen to hear from businesses out there, so please use the comments box below to share your experience.
1 Global Data Quality Research 2014, an independent market research report commissioned by Experian Data Quality and produced by Dynamic Markets.
2 Gartner, “The State of Data Quality: Current Practices and Evolving Trends,” Ted Friedman and Saul Judah, 11 December 2013.