

Top Ten Data Quality Problems: Part II

Posted by Sally McCormack on Jul 3, 2017 10:23:52 AM

Recently, we outlined the top five data quality problems in enterprise data management and offered best practices to solve those challenges. This post explores the topic further to highlight five additional roadblocks associated with managing the critical data of an organization. Organizations that take time to decipher the root cause behind data challenges will run more successful enterprise data programs. They’ll also have a foundation in place for sustainable growth with an enterprise-wide view of customers, manufacturing, supply chains, sales, and operations. 


6. Inaccurate transaction data. The volume of transaction data grows as systems and databases exchange data more frequently and in real time. Immediate results and real-time analysis are the resulting benefits, but they become a concern if there are underlying data quality issues. This can be doubly serious for public enterprises bound by SEC disclosure rules: what starts out as something small, such as duplicate records, can snowball into a severe data integrity issue by the time quarterly expenses or income appear in financial reports.

Solution: In a traditional data warehouse design, a copy of transaction data is captured and specifically structured for analysis. From here, data marts are used as smaller data storage facilities to provide reporting and analytical capabilities for specific business processes. Unreliable data acquisition and delivery processes can lead to inaccurate transaction data, impairing analytics capabilities and restricting visibility into transactions and customer actions. The data warehouse team should work with analysts to establish clear definitions for each data element; this will help maintain data acquisition and delivery specifications and tell the IT team how to properly populate the data warehouse.
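
As an illustration only, here is a minimal Python sketch of how agreed data element definitions might be enforced before records reach the warehouse; the field names (transaction_id, amount, order_date) and rules are assumptions, not a prescribed schema.

from datetime import datetime

# Hypothetical data element specifications agreed between the warehouse team and analysts;
# field names and rules are illustrative, not a prescription for any particular schema.
TRANSACTION_SPEC = {
    "transaction_id": {"required": True, "type": str},
    "amount":         {"required": True, "type": float},
    "order_date":     {"required": True, "type": str, "format": "%Y-%m-%d"},
}

def validate_transaction(record):
    """Return a list of specification violations for one incoming transaction record."""
    errors = []
    for field, rules in TRANSACTION_SPEC.items():
        value = record.get(field)
        if value is None:
            if rules["required"]:
                errors.append("missing required field: " + field)
            continue
        if not isinstance(value, rules["type"]):
            errors.append(field + " has the wrong type")
        elif "format" in rules:
            try:
                datetime.strptime(value, rules["format"])
            except ValueError:
                errors.append(field + " does not match format " + rules["format"])
    return errors

# Records that fail validation are quarantined for review instead of loaded into the warehouse.
print(validate_transaction({"transaction_id": "T-1001", "amount": 49.95, "order_date": "2017-07-03"}))

The specifics matter less than the principle: the checks are written down once, shared by the warehouse and IT teams, and applied to every record before it is loaded.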

7. Operational productivity roadblocks. By providing a 360-degree view of buying habits, contact preferences, and more, a data warehouse is fundamental in enabling an enterprise to better understand its customers. High-quality data can then be used for marketing automation and personalization, sales, and loyalty programs. When data integrity is questionable, however, the data stalls decision-making and productivity, becoming costly and possibly detrimental. For example, productivity drops when products and time are lost reprocessing an order shipped to an outdated address, or when staff spend more time flagging and fixing invalid records.

Solution: To eliminate productivity drains, a data quality process should be well structured and repeatable, with specific requirements that define 'good data.' It should also establish rules for certifying the quality of the data. These rules should be built into the existing data quality process and continually amended as new data types are added. For instance, if a new platform such as HubSpot is deployed to track inbound marketing activities, the enterprise data quality initiative should scale to address those new requirements.
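
To make the idea concrete, here is a minimal Python sketch of a rule registry that certifies records by data type and can be extended when a new source is onboarded; the data types, field names, and checks are all hypothetical.

import re

# Illustrative rule registry: each data type maps to named certification rules that
# must pass before a record is accepted. All names and checks here are hypothetical.
EMAIL_PATTERN = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

QUALITY_RULES = {
    "customer": [
        ("customer_id present", lambda r: bool(r.get("customer_id"))),
        ("email looks valid",   lambda r: bool(EMAIL_PATTERN.fullmatch(r.get("email", "")))),
    ],
}

# When a new platform is onboarded for inbound marketing, its records get their own
# rule set instead of changes to the core process.
QUALITY_RULES["marketing_contact"] = [
    ("email looks valid",   lambda r: bool(EMAIL_PATTERN.fullmatch(r.get("email", "")))),
    ("lifecycle stage set", lambda r: bool(r.get("lifecycle_stage"))),
]

def certify(data_type, record):
    """Return the names of any failed rules for a record of the given data type."""
    return [name for name, check in QUALITY_RULES.get(data_type, []) if not check(record)]

print(certify("marketing_contact", {"email": "jane@example.com"}))  # ['lifecycle stage set']

The point is not these particular checks but that certification rules live in one place and grow with each new data type rather than being reinvented per system.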

8. Growth and expansion impacts. It's common to combine internal data with data from third-party applications and external systems to understand customer behavior. A manufacturer, for instance, may have limited internal sales data and need third-party data from retailers to get a full picture of its end customer. If a company relies on this aggregated data and any source along the data chain is inaccurate, it faces a significant problem. Data quality issues at this level could misdirect building or expansion plans, with the potential for lost opportunities and misguided resource allocation.

Solution: A focused data quality approach is especially important during mergers and acquisitions, when new facilities are added and internal base data is enriched with third-party sources. Data enrichment tools can help merge data sources and reduce the manual effort involved. This is a key step in enhancing the value of business data, making it more meaningful in daily decision-making.
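
As a simplified illustration, assuming pandas and hypothetical column names, here is a sketch of how limited internal sales data might be enriched with third-party retailer data while keeping gaps visible.

# Illustrative only: enriching limited internal sales data with third-party retailer data.
# The column names (product_sku, units_sold_*, region) are assumptions for the example.
import pandas as pd

internal = pd.DataFrame({
    "product_sku": ["A100", "A200"],
    "units_sold_direct": [120, 75],
})

retailer = pd.DataFrame({
    "product_sku": ["A100", "A200", "A300"],
    "units_sold_retail": [430, 210, 95],
    "region": ["East", "West", "East"],
})

# An outer join keeps products that appear in only one source so gaps stay visible
# instead of being silently dropped -- a common place where enrichment surfaces quality issues.
enriched = internal.merge(retailer, on="product_sku", how="outer")
enriched["units_sold_total"] = (
    enriched["units_sold_direct"].fillna(0) + enriched["units_sold_retail"].fillna(0)
)
print(enriched)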

9. Finding a data champion when everyone is busy. Pinpointing responsibility for data quality is a challenge because no single business unit is responsible for all the data in an enterprise. Data touches every corner of the company, and quality responsibilities cut across organizational boundaries in ways that can be confusing and difficult to track.

Solution: Data governance leads should be selected early in any enterprise data management initiative, with responsibilities and formal procedures assigned to each lead and to everyone who handles data. During this phase, it's also critical that IT has a firm grip on data quality expectations so it can build the system to support the right data types, and that data consumers and business analysts understand how to leverage the data properly for real, actionable insights. Next, these groups need to agree on consistent business rules and definitions for the data collected about products, customers, sales, and so on. This may include gathering input from Manufacturing, R&D, and Sales and Marketing on agreed-upon rules and processes for data acquisition, management, dissemination, and disposal.

10. Using flawed data. Flawed data is a fact of life. Despite our best efforts to eradicate all data issues, imperfect data will always exist. Deciding when data quality is 'good enough' can be a challenge and often means striking a delicate balance. For instance, customer information may be accurate and complete enough to send timely invoices and promotions, but not complete enough to fully grasp the profitability of that customer or predict future purchases.

Solution: There is no straightforward 'fix' to determine when data quality is good enough, so it's important to see the big picture, set realistic goals, and take an iterative approach to any data quality initiative. Weigh what the data is used for against the time and resources needed to make it perfect. Even with rigorous data quality standards, some data will remain imperfect, yet it can still be useful for building models and identifying trends. When it comes to data quality, enterprises should strive for improvement, not perfection.
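
Here is a minimal Python sketch of that balancing act, assuming hypothetical fields and thresholds: the same customer records can be 'good enough' for invoicing while falling short for profitability analysis.

# Illustrative 'good enough' check: completeness of the same records is measured against
# different bars for different uses. Fields and thresholds are hypothetical.
customers = [
    {"name": "Acme Co", "billing_address": "1 Main St", "segment": None,  "lifetime_value": None},
    {"name": "Bolt Inc", "billing_address": "9 Oak Ave", "segment": "SMB", "lifetime_value": 125000},
]

REQUIRED_FIELDS = {
    "invoicing":     ["name", "billing_address"],
    "profitability": ["name", "billing_address", "segment", "lifetime_value"],
}

THRESHOLDS = {"invoicing": 0.95, "profitability": 0.90}

def completeness(records, fields):
    """Share of required field values that are actually populated."""
    filled = sum(1 for r in records for f in fields if r.get(f) is not None)
    return filled / (len(records) * len(fields))

for use_case, fields in REQUIRED_FIELDS.items():
    score = completeness(customers, fields)
    verdict = "good enough" if score >= THRESHOLDS[use_case] else "needs work"
    print(f"{use_case}: {score:.0%} complete -> {verdict}")

In practice the thresholds would come from the business owners of each use case, not from the code; the sketch only shows that 'good enough' is a decision made per use, not a single number.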

With a strong data quality program in place, organizations can achieve maximum benefit from data warehouse and business intelligence platforms. Quality data sources also empower enterprises to make significant strides in identifying new customers, increasing customer retention, and delivering exceptional value to each customer. If your organization is interested in implementing a data quality program, email us - we’d love to chat!

 

Topics: Data Quality

Written by Sally McCormack