Data Quality Analysis – Do You Have Bad Data?

With data pouring into your organization faster than ever before, it’s hard to keep a handle on exactly what data you’ve got and what kind of shape it’s in. Clearly the situation calls for regular data quality analysis, but many organizations are held back by the difficulty of building in-house solutions and the high cost of packaged data quality tools from the few market-dominating commercial vendors. With the right approach, data quality analysis is possible for any organization.

What constitutes bad data?

Without regular data quality analysis, your organization will become rife with bad data. But how do you know whether data is “bad”? Here are six hallmarks of bad data, followed by a short sketch of how the first of them can be detected:

  • Inaccurate Data: Contains misspellings, wrong numbers, missing information, or blank fields
  • Noncompliant Data: Doesn’t meet regulatory standards
  • Uncontrolled Data: Becomes polluted over time due to lack of continuous monitoring
  • Unsecured Data: Left vulnerable to access by hackers
  • Static Data: Never updated, thus becoming obsolete and useless
  • Dormant Data: Loses its value as it’s neither updated nor shared 
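
To make the first of these concrete, here is a minimal sketch, in Python, of how blank fields and malformed values might be flagged in a batch of records. The field names and the email pattern are illustrative assumptions, not part of any particular tool:

```python
# Minimal sketch: flagging "inaccurate data" in a batch of customer records.
# Field names (name, email, zip_code) are illustrative, not from any specific system.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def find_issues(record: dict) -> list[str]:
    """Return a list of data quality issues found in a single record."""
    issues = []
    for field in ("name", "email", "zip_code"):
        value = str(record.get(field) or "").strip()
        if not value:
            issues.append(f"missing or blank field: {field}")
    email = str(record.get("email") or "").strip()
    if email and not EMAIL_RE.match(email):
        issues.append(f"malformed email: {email}")
    zip_code = str(record.get("zip_code") or "").strip()
    if zip_code and not zip_code.isdigit():
        issues.append(f"non-numeric zip code: {zip_code}")
    return issues

records = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "zip_code": "10115"},
    {"name": "", "email": "grace.hopper@", "zip_code": "ABC12"},
]
for r in records:
    print(r.get("name") or "<unknown>", "->", find_issues(r) or "OK")
```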

Poor data quality is pervasive

A Harvard Business Review study found that only 3% of companies’ data meets basic quality standards. Poor data quality makes it difficult to extract reliable insights and ultimately leads to poor decision-making, and this is something many executives worry about. According to the Forbes Insights and KPMG “Global CEO Outlook”, data and analytics capability ranks highest among the top five investment priorities, yet despite this, CEOs do not have much faith in the data they rely on.

Bad data quality has never mattered more, and it adversely affects organizations of every kind on many levels. Here are just a few examples:

  • Incorrect email addresses: Will negatively impact marketing campaigns
  • Inaccurate personal details: May lead to missed sales opportunities or a rise in customer complaints
  • Incorrect shipping addresses: Goods can get shipped to the wrong locations
  • Incorrect product measurements: Can lead to significant transportation issues, e.g. the product will not fit into the truck, or more trucks are ordered than the actual load requires

Determining the value of data

The ability to perform data quality analysis is a strategic asset that can give your organization a huge competitive advantage. Data quality is achieved with the right combination of people and technology: people in different departments and at different levels will have varying opinions on which data matters most, and data’s value comes primarily from underpinning business processes and decision-making based on business intelligence. Data quality rules should therefore be agreed upon early, taking into account the value that data provides to the organization. If data is identified as having very high value in a certain context, that is a sign that more rigorous data quality rules are required in that context.
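
As a rough illustration of how rule rigor can follow data value, the sketch below (in Python, with hypothetical contexts, fields, and thresholds) gives high-value billing data a stricter rule than lower-value marketing data:

```python
# Illustrative sketch: data quality rules whose rigor depends on the business value
# of the data in a given context. Contexts, fields, and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class QualityRule:
    name: str
    required_fields: tuple
    max_error_rate: float  # tolerated share of failing records in a batch

# Higher-value contexts get more required fields and a tighter error budget.
RULES_BY_CONTEXT = {
    "billing":   QualityRule("billing", ("customer_id", "amount", "currency", "address"), 0.001),
    "marketing": QualityRule("marketing", ("email",), 0.05),
}

def evaluate(context: str, records: list) -> bool:
    """Return True if the batch meets the quality rule for its context."""
    rule = RULES_BY_CONTEXT[context]
    failures = sum(
        1 for r in records
        if any(not str(r.get(f) or "").strip() for f in rule.required_fields)
    )
    return failures / max(len(records), 1) <= rule.max_error_rate

print(evaluate("marketing", [{"email": "a@example.com"}, {"email": ""}]))  # False: 50% > 5%
```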


Performing data quality analysis

A proactive approach to data quality analysis allows you to check and measure data quality before the data ever reaches your core systems. Accessing and monitoring that data across internal, cloud, web, and mobile applications is a big task, and the only way to scale that kind of monitoring across all those systems is through data integration. It therefore becomes necessary to control data quality in real time.

The key is to avoid propagating erroneous data by embedding your agreed-upon data quality rules directly into your data integration processes. With the right tools and integrated data, you can create whistleblowers that detect some of the root causes of overall data quality problems, and built-in smart features can help accelerate your data controls. Today, almost everyone has big data, machine learning, and cloud at the top of their IT “to-do” list, and the importance of these technologies can’t be overemphasized: all three are spawning innovation, uncovering opportunities, and optimizing businesses.
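
As a minimal sketch of what embedding rules in an integration step can look like, the Python below routes failing records into a quarantine with a reason attached, so recurring reasons can point at root causes. The rules, field names, and source labels are assumptions for illustration only:

```python
# Minimal sketch: data quality rules embedded in an integration step. Valid records
# flow through; failing records are quarantined with a reason so root causes can be traced.
def rule_not_blank(field):
    """Return a (check, reason) pair for the given field."""
    def check(rec):
        return bool(str(rec.get(field) or "").strip())
    return check, f"blank {field}"

RULES = [rule_not_blank("customer_id"), rule_not_blank("email")]

def integrate(batch):
    clean, quarantined = [], []
    for rec in batch:
        reasons = [reason for check, reason in RULES if not check(rec)]
        if reasons:
            # The quarantine acts as a "whistleblower": grouping reasons by source
            # system points at the root cause of recurring problems.
            quarantined.append({**rec, "_dq_reasons": reasons})
        else:
            clean.append(rec)
    return clean, quarantined

clean, bad = integrate([
    {"customer_id": "C1", "email": "a@example.com", "source": "crm"},
    {"customer_id": "", "email": "", "source": "webform"},
])
print(len(clean), "clean;", bad)
```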

Finally, you will need to track data across your landscape of applications and systems. This allows you to parse, standardize, and match the data in real time, and to run the right checks on the right data whenever needed.
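
The standardize-and-match step can be sketched very simply, assuming name and phone fields and naive normalization rules; real matching engines use far richer logic:

```python
# Minimal sketch of standardize-and-match: normalize names and phone numbers, then
# treat records with the same normalized key as potential duplicates.
# The fields and normalization rules are simplifying assumptions.
import re
from collections import defaultdict

def standardize(record: dict) -> tuple:
    name = re.sub(r"\s+", " ", record.get("name", "").strip().lower())
    phone = re.sub(r"\D", "", record.get("phone", ""))  # keep digits only
    return name, phone

def match_duplicates(records):
    groups = defaultdict(list)
    for rec in records:
        groups[standardize(rec)].append(rec)
    return [grp for grp in groups.values() if len(grp) > 1]

dupes = match_duplicates([
    {"name": "Jane  Doe", "phone": "1 (555) 010-2030"},
    {"name": "jane doe",  "phone": "15550102030"},
    {"name": "John Roe",  "phone": "555-999-0000"},
])
print(dupes)  # the two "Jane Doe" records are grouped as potential duplicates
```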

The data stewardship and quality connection

Data stewardship is the process of managing the data lifecycle from curation to retirement, and it is becoming a critical requirement for enterprises that want to rely on insights from data. Data stewardship involves several activities, including monitoring, reconciliation, refining, de-duplication, cleansing, and aggregation, to help deliver quality data to applications and end users. Cleaner data leads to more data use while reducing the costs associated with bad data quality, such as decisions made on incorrect analytics.
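
One of those activities, reconciling duplicates into a single golden record, can be sketched as below. The survivorship rule (latest non-blank value wins) and the field names are assumptions, and real stewardship tools let stewards review and override such merges:

```python
# Illustrative sketch of one stewardship activity: merging a group of duplicate
# records into a single "golden" record by keeping the most recently updated
# non-blank value per field. Field names and the updated_at field are assumptions.
def merge_duplicates(group):
    """Survivorship rule: latest non-blank value wins for each field."""
    ordered = sorted(group, key=lambda r: r.get("updated_at", ""))
    golden = {}
    for rec in ordered:  # later (newer) records overwrite earlier ones
        for field, value in rec.items():
            if str(value or "").strip():
                golden[field] = value
    return golden

golden = merge_duplicates([
    {"customer_id": "C1", "email": "old@example.com", "phone": "", "updated_at": "2023-01-10"},
    {"customer_id": "C1", "email": "new@example.com", "phone": "555-0102", "updated_at": "2024-06-02"},
])
print(golden)  # latest email and the only non-blank phone survive
```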

Data preparation for all

Self-service is the way to get data quality standards to scale. Data scientists spend about 60% of their time cleaning data and getting it ready to use; reducing that time and effort means more value and more insight can be extracted from data.

Self-service data preparation is not just a separate discipline for making lines of business more autonomous with data; it is a core element of data quality analysis and integration, and it will become a critical part of most data integration efforts. Although it improves personal productivity, the true value of data preparation lies in driving collaboration between business and IT.

Making data quality a priority

If your organization is ready to commit to data quality, Talend Data Fabric will enable you to automate data quality and governance at every step of your data pipeline, from capturing data lineage and cataloging data; to data profiling, cleansing, and enrichment; to data stewardship throughout the data lifecycle. Unlike legacy vendors or point solutions that have separate tools for data integration and data quality, Talend embeds data quality across the data value chain into all our products, so developers can create data integration jobs with data quality functions built-in.