Between responding to supply chain disruptions, coping with the economic slowdown and inflation, retaining and winning customers, and better managing inventory and production, data quality has never been more critical to your business.
In the digital age, data is a company’s most valuable resource. Data collection, data analysis and data management strategies set leaders apart from the rest of the pack. And data quality is intertwined with the entire data architecture.
What is data quality?
A Forrester survey found that top customer intelligence professionals consider the ability to integrate data and the ability to manage data quality the two main factors holding back customer intelligence. But data quality is about more than customers: top-level executives and managers use internal data to drive day-to-day operations and achieve business goals.
Quality data must be accurate, complete, consistent, reliable, secure, up to date and not siloed. High-quality data is often defined as data that is “fit for use in operations, decision making and planning”, and it faithfully represents real-world constructs.
The distinction between internal and external data, and what makes each “fit for use”, matters. External data is generated by a company’s customer base and may be of high quality for marketing campaigns, yet unsuitable for business decisions that require internal data. Whether external or internal, data quality must always be verified and must meet or exceed expectations.
Additionally, as businesses and organizations embrace digital transformation and migrate to cloud and hybrid cloud environments, breaking down data silos becomes imperative for data quality. It is critical for companies on this digitization journey to understand the consequences of not improving data quality.
SEE: Research: Digital Transformation Initiatives Focus on Collaboration (TechRepublic Premium)
What are the operating costs or risks of poor data quality?
Data quality has a direct impact on your business results. Poor external data quality can lead to missed opportunities, lost revenue, reduced efficiency and neglected customer experiences.
Poor internal data quality is also behind ineffective supply chains, an issue that has been in the news constantly for the past year. The same problem is one of the main drivers of the Great Resignation: HR departments working with bad data struggle to understand their employees well enough to retain talent.
In addition, there are serious immediate risks that companies need to address, and they can only do that by addressing data quality. The cybersecurity and threat landscape continues to grow in size and complexity and thrives on poor data quality management policies.
Companies that work with data and fail to comply with data, financial and privacy regulations face the risk of reputational damage, lawsuits, fines and other consequences related to non-compliance.
Gartner estimates that poor data quality costs organizations an average of $9.7 million per year. IBM, meanwhile, puts the cost of insufficient data quality at $3.1 trillion a year in the US alone.
As the economic slowdown and the threat of recession loom over every organization, data quality is becoming key to navigating new economic conditions, making difficult decisions, and drawing up short-, medium- and long-term plans.
Common data quality issues
The most common data quality issues are duplicate, ambiguous, inaccurate, hidden and inconsistent data. Newer issues include siloed data, outdated data and data that is not secure.
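To make these issue types concrete, here is a minimal sketch in plain Python of automated checks for duplicate, incomplete and inconsistent records. The field names (`email`, `country`) and the specific rules are illustrative assumptions, not part of any particular product or standard.

```python
from collections import Counter

def find_quality_issues(records, key="email"):
    """Flag duplicate, incomplete and inconsistent records.

    `records` is a list of dicts; `key` is the field expected to be
    unique. Field names and rules here are illustrative only.
    """
    issues = {"duplicates": [], "incomplete": [], "inconsistent": []}

    # Duplicates: the same key value appearing more than once.
    counts = Counter(r.get(key) for r in records)
    issues["duplicates"] = [k for k, n in counts.items() if k and n > 1]

    for r in records:
        # Incomplete: any missing or empty field in the record.
        if any(v in (None, "") for v in r.values()):
            issues["incomplete"].append(r.get(key))
        # Inconsistent: e.g. country codes not stored in upper case.
        country = r.get("country")
        if country and country != country.upper():
            issues["inconsistent"].append(r.get(key))

    return issues

records = [
    {"email": "a@x.com", "country": "US"},
    {"email": "a@x.com", "country": "us"},   # duplicate and inconsistent
    {"email": "b@x.com", "country": ""},     # incomplete
]
print(find_quality_issues(records))
```

Real data quality tools apply many more rules (format validation, referential checks, freshness windows), but the pattern is the same: codify each expectation and flag records that violate it.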
Another growing problem is that data is often managed strictly by IT departments, while an organization should approach data quality at every level. McKinsey says companies should treat data as a product, managing it to create “data products” across the organization.
Addressing data quality issues
When data is managed as a product, quality is assured because the data is ready for use, consumption and sale. Such data stands out for its quality: it is verified, reliable, consistent and secure. Like a finished product your company sells, it is double-checked for quality.
Gartner explains that to address data quality issues, companies must align their data policies and quality processes with business goals and missions. Executives need to understand the connection between their business priorities and the data challenges they face, and adopt a data quality approach that solves real problems.
For example, if a company has high churn rates and its main business goal is to grow its customer base, the data quality program should focus on improving performance in those areas.
Once the business purpose and challenges are understood and data teams have selected the right performance metrics, Gartner says the organization needs to profile its current data quality.
Data profiling must be done early and often, and high standards of data quality must be established to benchmark progress towards a goal. Data quality is not a one-time activity; it is a constant, active management practice that must develop, adapt and refine itself.
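As a sketch of what "profiling early and often" can mean in practice, the snippet below computes two simple per-field metrics, completeness and distinct-value count, that can be re-run on a schedule and compared against a benchmark. The metrics, field names and the 95% threshold mentioned in the comment are illustrative assumptions; real profiling tools track many more dimensions.

```python
def profile(records, fields):
    """Produce simple per-field quality metrics for benchmarking.

    Completeness (share of non-empty values) and distinct-value count
    are illustrative metrics only.
    """
    total = len(records)
    report = {}
    for f in fields:
        values = [r.get(f) for r in records]
        non_null = [v for v in values if v not in (None, "")]
        report[f] = {
            "completeness": len(non_null) / total if total else 0.0,
            "distinct": len(set(non_null)),
        }
    return report

# Re-run the profile on a schedule and compare against a target,
# e.g. require at least 95% completeness on key fields.
records = [
    {"sku": "A1", "price": 9.99},
    {"sku": "A2", "price": None},
    {"sku": "A3", "price": 4.50},
]
report = profile(records, ["sku", "price"])
print(report)
```

Storing each run's report gives the time series needed to benchmark progress: quality that drifts below the target triggers investigation rather than going unnoticed.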
SEE: Hiring Kit: Database Engineer (TechRepublic Premium)
Improving data quality
McKinsey argues that teams that use data should not have to waste time searching for it, processing it, cleaning it or preparing it for use. It proposes an integrated data architecture to address data quality, estimating that this model can accelerate new business use cases by up to 90%, reduce data-related costs by 30% and lower exposure to data governance risks.
To improve data quality, organizations need the right model. McKinsey cautions that neither the grassroots approach, where individual teams aggregate their own data, nor the big-bang data strategy, where a centralized team handles every process, will yield good results.
In an effective data quality model, different teams are responsible for different types of data, which are classified according to use. Each team works independently. For example, data that consumers will use in digital apps must be managed by a team responsible for cleaning, storing and preparing the data as a product.
Internal data used for reporting systems or decision making should also be managed by a separate team responsible for closely monitoring quality, security and data changes. This focused approach makes it possible to use data for operational decisions and regulatory compliance. The same goes for data used for external sharing or information used for advanced analytics, where a team must clean and engineer the data so it can be used by machine learning and AI systems.
Companies that excel at creating data products will need to establish standards and best practices and track the performance and value of internal and external business activities. This focus on a product version of data is one of the most effective ways to protect against data quality erosion.