The article highlights the importance of data quality in effective decision making. It helps readers understand the impact poor data can have on business decisions, and suggests three steps that business managers can follow to maintain and improve the quality of the data that drives their strategies. It underscores the need for businesses to build an organizational culture and design processes that keep data clean, so that it remains useful.
Why is data quality important?
Data quality is an important element of strategy as it impacts decision making at all levels in a business organization.
Your prospect/customer data is a key strategic asset that must be managed effectively to generate significant returns. Its accuracy is directly linked to your marketing campaigns' effectiveness and your sales team's close rates. It also affects your ability to create new services and sources of revenue. Clean data enables you to accurately cross-sell and up-sell to valuable customers and reduces the risk of allocating precious marketing resources to unprofitable customers.
If data is not managed appropriately and regularly, the quality of the asset degrades and the cost of maintaining poor quality data starts to outweigh its benefits.
Poor data quality can cause a whole range of problems: flawed segmentation resulting in lost business, poor targeting, inadequate budgeting, and unreliable financial projections.
Further, especially in the financial services industry, data quality problems can make it difficult to comply with current and future regulations such as Basel II, Sarbanes-Oxley, the USA PATRIOT Act and International Accounting Standards.
It is evident that poor data quality has a broad impact on your business. Despite this breadth, our experience is that many organizations are unaware of the scale of the problem or have no real sense of how much it is affecting their business.
How big is the data quality problem?
The problem of poor data quality is far larger than many organizations realize. A study by the Seattle-based Data Warehousing Institute found that poor-quality data costs U.S. businesses approximately $611 billion a year in postage, printing and staff overheads.
In 2004, a Gartner Inc. study found that more than 25 percent of the critical data used in large corporations is flawed, due to human data-entry errors, customer profile changes (such as a change of address) and a lack of proper corporate data standards.
Our experience has been that duplicate records, missing data and non-standardized formats are the most common elements of poor data quality in organizations. In a recent study by Forrester Research Inc., 37 percent of companies cited duplicate and overlapping files as significant data-management problems.
Given the extent of the data quality problem, it is imperative that organizations set up a mechanism to manage data efficiently and prevent significant data deterioration.
What needs to be done?
Improving data quality isn't just a desirable (and profitable) goal; it's a full-fledged process in itself, one that requires input and support from all areas of a business. Keeping data clean is a constant effort that organizations need to carry out diligently.
We believe organizations like yours need to adopt the following three-step process to identify and manage this issue:
1. Identify your greatest sore points in terms of data quality. You need to answer questions such as "How bad is my data?" and "How much is it affecting my business performance?" You can do this internally or engage an external vendor to deploy a data quality team that helps you understand the extent of the problem. This team identifies and measures the issues in your data through thorough due diligence on data samples.
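As a minimal sketch of this due-diligence step (assuming tabular customer records with hypothetical field names), an initial profiling pass might count missing values and exact duplicates in a sample:

```python
from collections import Counter

def profile_records(records, key_fields):
    """Measure basic quality issues in a sample of customer records.

    records: list of dicts; key_fields: fields used to spot duplicates.
    Returns missing-value counts per field and the number of duplicate records.
    """
    missing = Counter()
    seen = Counter()
    for rec in records:
        for field, value in rec.items():
            if value is None or str(value).strip() == "":
                missing[field] += 1
        seen[tuple(rec.get(f) for f in key_fields)] += 1
    duplicates = sum(count - 1 for count in seen.values() if count > 1)
    return {"missing_per_field": dict(missing), "duplicate_records": duplicates}

# Illustrative sample: one exact duplicate, one missing email
sample = [
    {"name": "Jane Doe", "email": "jane@example.com", "zip": "98101"},
    {"name": "Jane Doe", "email": "jane@example.com", "zip": "98101"},
    {"name": "John Roe", "email": "", "zip": "10001"},
]
report = profile_records(sample, key_fields=["name", "email"])
# report shows 1 duplicate record and 1 missing email value
```

Real profiling would cover format validity, inconsistent values and cross-field checks as well; this sketch only shows how the "how bad is my data?" question becomes a measurable report.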
2. Once specific data quality issues have been identified and measured, the data quality team needs to define rules for data cleansing and set quality targets for the output data. This team needs to carry out the following:
Data Integration to collate and integrate data from different sources (Internal databases, Survey Results, External Vendor Lists etc.)
Cleansing and Parsing to identify different types of data and put them in specific fields
Standardization (Name and Address) to ensure a consistent way of storing information throughout the data
De-duping to remove duplicates from data
External Data Append, which involves comparing customer data against a universal master such as the U.S. Postal Service database
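The standardization and de-duping steps above can be sketched in a simplified form. This is only an illustration with hypothetical field names; a real pipeline would also parse free-text fields, handle fuzzy (non-exact) matches and append external reference data:

```python
def standardize(record):
    """Normalize name and state fields to a consistent stored format."""
    out = dict(record)
    out["name"] = " ".join(record["name"].split()).title()  # collapse spaces, title-case
    out["state"] = record["state"].strip().upper()
    return out

def dedupe(records, key_fields=("name", "state")):
    """Keep the first occurrence of each record, matching on key fields
    after standardization so that formatting variants collapse together."""
    seen, unique = set(), []
    for rec in map(standardize, records):
        key = tuple(rec[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

raw = [
    {"name": "jane  doe", "state": "wa"},
    {"name": "Jane Doe", "state": "WA "},
    {"name": "john roe", "state": "ny"},
]
clean = dedupe(raw)  # the two Jane Doe variants collapse into one record
```

Note that standardizing before de-duping is what makes the duplicates detectable: the two "Jane Doe" rows differ as raw strings but match once normalized.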
3. Set up a monitoring process to observe data quality on a continuous basis. This involves setting up the process internally or with an external vendor: extracting data samples at regular intervals (monthly, quarterly, etc.) and performing data quality checks for missing values, inconsistent values, duplicates, etc. Our experience has been that organizations prefer to deploy a dashboard / MIS reporting process that the vendor's data quality teams refresh after analyzing sample data on a regular schedule. Based on the data quality reports, these teams then carry out the cleansing steps described in step 2 above.
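The ongoing checks in step 3 can be sketched as a periodic comparison of measured quality metrics against the targets agreed in step 2; the metric names and thresholds below are purely illustrative:

```python
def quality_check(metrics, targets):
    """Compare measured quality metrics (fractions of records affected)
    against maximum-allowed targets; return the metrics that breach them."""
    return {name: value for name, value in metrics.items()
            if value > targets.get(name, 1.0)}

# Illustrative monthly sample results and agreed quality targets
monthly_metrics = {"missing_email": 0.08, "duplicates": 0.02, "bad_zip": 0.01}
targets = {"missing_email": 0.05, "duplicates": 0.03, "bad_zip": 0.02}

breaches = quality_check(monthly_metrics, targets)
# breaches flags only missing_email, which would trigger a cleansing pass
```

In practice the breached metrics would feed the dashboard / MIS report and trigger the cleansing steps of step 2 for the affected fields.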
Our own experience suggests that this process is best managed by outsourcing it entirely to an external vendor with deep expertise in such data management solutions. This allows organizations to focus on their core competencies while the vendor takes care of all data management activities.
Typically, these arrangements range from $100,000 to $500,000 per annum, depending on the volume and complexity of the data. Given the costs of poor data quality, we believe an investment in data management services easily yields a triple-digit ROI. Not a bad investment after all!
Remember one more thing: data quality improvement is not just about fixing data. You must also drive process and cultural change within your organization so that data quality is monitored and high-quality data is maintained on an ongoing basis.
We, at Inductis, wish you all the best for your data cleanup act!