How to address poor data quality in your organization


Between responding to supply chain disruptions, adapting to the economic downturn, reacting to inflation, retaining and winning customers, and managing inventories and production more effectively, data quality has never been more critical to your business.

In the digital age, data is a company’s most valuable resource. Data collection, data analysis, and data governance strategies are what separate leaders from the rest of the pack. And data quality is woven into the entire data architecture.

What is data quality?

A Forrester survey of customer intelligence professionals found that the ability to integrate data and to manage data quality are the top two factors holding back customer intelligence efforts. But data quality goes beyond customers: Executives and senior management use internal data to drive daily operations and meet business goals.

Quality data must be accurate, complete, consistent, reliable, secure, up to date, and not isolated. High-quality data is often defined as data that is “fit for use in operations, decision-making, and planning,” and it accurately represents the real-world constructs it describes.
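Several of these “fit for use” dimensions can be expressed as simple validation rules. Below is a minimal sketch in Python; the record fields (`customer_id`, `email`, `updated_at`) and the 90-day freshness window are illustrative assumptions, not a standard:

```python
from datetime import datetime, timedelta

# Hypothetical customer record; field names are illustrative only.
REQUIRED_FIELDS = {"customer_id", "email", "country", "updated_at"}

def quality_issues(record: dict, max_age_days: int = 90) -> list[str]:
    """Return the data quality dimensions this record fails."""
    issues = []
    # Completeness: every required field is present and non-empty.
    missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
    if missing:
        issues.append(f"incomplete: missing {sorted(missing)}")
    # Accuracy (very rough): the email at least looks like an address.
    email = record.get("email", "")
    if email and "@" not in email:
        issues.append("inaccurate: malformed email")
    # Timeliness: the record was touched within the freshness window.
    updated = record.get("updated_at")
    if updated and datetime.now() - updated > timedelta(days=max_age_days):
        issues.append("stale: not updated recently")
    return issues
```

A record that passes every rule returns an empty list; anything else returns a human-readable list of failed dimensions, which is easy to aggregate into a quality report.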

The distinction between internal and external data, and what makes each “fit for use,” is important. External data is generated by a company’s customer base and may be high quality for marketing campaigns, yet unsuitable for business decisions that require internal data. Whether external or internal, data quality should always be verified and must meet or exceed expectations.

Additionally, as businesses and organizations embrace digital transformation and migrate to cloud and hybrid cloud environments, the need to break down data silos becomes imperative for data quality. It is critical that companies on this digitization journey understand the consequences of not correcting data quality.


What are the business costs or risks of poor data quality?

The quality of your data has a direct impact on your results. Poor external data quality can lead to missed opportunities, lost revenue, reduced efficiency, and degraded customer experiences.

Poor internal data quality is also responsible for inefficient supply chains, an issue that has dominated the news for the past year. It is likewise one of the main drivers of the Great Resignation: HR departments operating with poor data struggle to understand their workers well enough to retain talent.

Additionally, there are serious immediate risks that companies need to address, and they can only do so by addressing data quality. The cybersecurity and threat landscape continues to grow in size and complexity and thrives when poor data quality management policies prevail.

Companies that work with data and do not comply with financial, privacy and data regulations risk reputational damage, lawsuits, fines and other consequences related to non-compliance.

Gartner estimates that the average financial impact of poor data quality on organizations is $9.7 million per year. At the same time, IBM says that in the US alone, companies lose $3.1 trillion a year due to insufficient data quality.

As the economic slowdown and threat of recession loom over all organizations, data quality becomes key to navigating the new economy, making difficult decisions, and preparing short-, medium- and long-term plans.

Common data quality issues

The most common data quality issues are duplicate, ambiguous, inaccurate, hidden, and inconsistent data. Newer issues include siloed data, stale data, and insecure data.
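Two of these issues, duplicate and inconsistent data, are mechanical enough to detect automatically. A minimal sketch, assuming records arrive as plain dictionaries keyed by some identifier field (the field names here are hypothetical):

```python
from collections import Counter

def find_duplicates(records: list[dict], key: str) -> set:
    """Return key values that appear more than once (duplicate data)."""
    counts = Counter(r.get(key) for r in records)
    return {k for k, n in counts.items() if n > 1}

def find_inconsistent(records: list[dict], key: str, field: str) -> set:
    """Return keys whose records disagree on `field` (inconsistent data)."""
    seen: dict = {}
    conflicts = set()
    for r in records:
        k, v = r.get(key), r.get(field)
        if k in seen and seen[k] != v:
            conflicts.add(k)
        seen.setdefault(k, v)
    return conflicts
```

For example, two rows sharing `id=1` but listing different countries would surface in both checks, flagging the record for review rather than silently feeding conflicting values downstream.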

But another growing problem is that data is often managed exclusively by IT departments, when an organization should take a multi-layered approach to data quality. McKinsey says companies should treat data as a product, managing it to create “data products” across the organization.

How to address data quality issues

When data is managed as a product, quality is assured because the data is ready to use, consume, and sell. Such data is verified, reliable, consistent, and secure. Like a finished product your company sells, it is double-checked for quality.

Gartner explains that to address data quality issues, companies must align data policies and quality processes with business goals and missions. Executives must understand the connection between their business priorities and the challenges they face and adopt a data quality approach that solves real-world problems.

For example, if a company has high churn rates and its main business goal is to grow its customer base, a data quality program will work to strengthen performance in those areas.

Once the business purpose and challenges are understood and the data teams have selected the appropriate performance metrics, Gartner says the organization should profile the quality of its current data.

Data profiling should be done early and often, and high data quality standards should be set to benchmark progress toward meeting a goal. Data quality is not a one-and-done activity; it is an active and constant management approach that needs to evolve, adjust and refine.
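A first profiling pass usually means measuring, per field, how complete and how varied the data is, so progress against a quality benchmark can be tracked over time. A minimal sketch (metric names and thresholds are illustrative, not any vendor's standard):

```python
def profile(records: list[dict]) -> dict:
    """Report fill rate and distinct-value count for every field."""
    fields = {f for r in records for f in r}
    total = len(records)
    report = {}
    for f in sorted(fields):
        values = [r.get(f) for r in records]
        # Treat None and empty strings as missing values.
        non_null = [v for v in values if v not in (None, "")]
        report[f] = {
            "fill_rate": round(len(non_null) / total, 2) if total else 0.0,
            "distinct": len(set(non_null)),
        }
    return report
```

Running this early, then on a schedule, turns “data quality is improving” from a feeling into a number: a field whose fill rate drops release over release is a regression you can catch before it reaches a dashboard.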


Improving data quality

McKinsey explains that teams that use data shouldn’t have to spend time searching for it, processing it, cleaning it, or making sure it’s ready for use. It proposes a comprehensive data architecture to handle data quality and claims that its model can accelerate business use cases by 90%, reduce data-related costs by 30%, and reduce data risk and governance burden.

To improve data quality, organizations require the right model. McKinsey warns that neither the grassroots approach, in which individual teams aggregate data, nor the big bang data strategy, where a centralized team responds to all processes, will work.

In an effective data quality model, different teams are responsible for different types of data, which are classified by use. Each team works independently. For example, the data that consumers will use in digital applications must be managed by a team responsible for cleaning, storing, and preparing the data as a product.

Internal data used for reporting systems or decision-making should likewise be managed by a separate team responsible for data quality, security, and change management. This focused approach makes it possible to use data for operational decisions and regulatory compliance. The same applies to data used for external exchange, or to information used for advanced analytics, where a team must clean and engineer the data for use by AI and machine learning systems.

Companies that excel at creating data products will need to establish standards and best practices and track performance and value across internal and external business operations. This attention to a working version of the data is one of the most effective ways to guard against data quality erosion.
