Data Denial and Business Intelligence – How to Achieve Data Quality

The greatest battle you may face inside the organization will be to get management to the point where they agree that data quality is a goal even worth considering.

Everybody talks about data, but many confuse it with information and knowledge. Data is a core corporate asset that must be synthesized into information before it can serve as the basis for knowledge within the organization. Nevertheless, data is ubiquitous – it supports every aspect of the business and is an integral component of every key business process. However, incorrect data cannot generate useful information, and knowledge built on invalid information can lead organizations into catastrophic situations. As such, the usefulness of the information is only as good as the data behind it – and this is where many organizations run into trouble.

Many organizations neither recognize nor accept the poor quality of their data, and instead try to divert attention to supposed faults in their systems or processes. For these organizations, data denial has practically become an art form: daunting corporate barriers have been built up, typically over long periods of time, to avoid the call to embark on any “real” Data Quality improvement initiative.

However, we have found that the best way to measure the extent to which your organization may be dealing with data denial is to ask the following key questions:

  • Are you aware of any Data Quality issues within your company?
  • Are there existing processes that are not working as originally designed?
  • Are people circumventing the system in order to get their work completed?
  • Have you ever been forced to deny a business request for information due to an issue of Data Quality?
  • If the system were functioning properly, would this information have been readily available?
  • Has a business case been made outlining the economic impact of this issue? And, if so, has it ever been addressed with the organization’s leadership?
  • What was the response to these issues? And if there was no response, what is stifling this process?
  • What causes these “gaps” in Data Quality?
  • How are these issues affecting the responsiveness of your organization (e.g., to customers, stockholders, employees)?
  • If these issues were to be addressed and corrected, what strategic value would be added or enhanced?
  • Who bears the responsibility for addressing these issues within your organization?
  • What can be done to address these issues in the future?
  • What support is needed to implement a Data Quality strategy?

Depending on the answers to these questions, your organization may already be facing significant barriers to attaining Data Quality, each of which will need to be identified, assessed, prioritized and corrected. According to William K. Pollock, president of the Westtown, PA-based services consulting firm, Strategies For Growth℠, “Most companies already know what data they do not have – and for them, this is a significant problem. However, the same companies are probably not aware that some of the data they do have may be faulty, incomplete or inaccurate – and if they use this faulty data to make important business decisions, that becomes an even bigger problem.”

Common Problems with Corporate Data

Research has shown that the amount of data and information acquired by companies has nearly tripled in the past four years, while an estimated 10 to 30 percent of it may be categorized as being of “poor quality” (e.g., inaccurate, inconsistent, poorly formatted, entered incorrectly). The common problems with corporate data are many, but typically fall into the following five major areas (a simple profiling sketch follows the list):

  • Data Definition – typically manifesting itself through inconsistent definitions within a company’s corporate infrastructure.
  • Initial Data Entry – caused by incorrect values entered by employees (or vendors) into the corporate database; typos and/or intentional errors; poor training and/or monitoring of data input; poor data input templates; poor (or nonexistent) edits/proofs of data values; etc.
  • Decay – causing the data to become inaccurate over time (e.g., customer address, telephone, contact info; asset values; sales/purchase volumes; etc.).
  • Data Movement – caused by poor extract, transform and load (ETL) processes that lead to data warehouses containing more inaccurate information than the original legacy sources, or excluding data that is mistakenly identified as inaccurate; the inability to mine data in the source structure; or poor transformation of data.
  • Data Use – or the incorrect application of data to specific information objects, such as spreadsheets, queries, reports, portals, etc.

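As a practical illustration of how several of these problem areas might be surfaced, the short Python sketch below runs a few profiling checks over a customer table. It is a minimal sketch only: the column names (customer_id, email, last_verified) and the twelve-month staleness threshold are assumptions made for illustration, and the real rules would come from your own data definitions.

    import pandas as pd

    # Minimal profiling sketch; column names and thresholds are illustrative
    # assumptions, not a prescribed standard.
    def profile_customer_table(df: pd.DataFrame) -> dict:
        issues = {}

        # Initial Data Entry: missing or malformed values
        issues["missing_email"] = int(df["email"].isna().sum())
        email_ok = df["email"].dropna().str.contains(
            r"^[^@\s]+@[^@\s]+\.[^@\s]+$", regex=True
        )
        issues["malformed_email"] = int((~email_ok).sum())

        # Data Definition / duplication: the same customer keyed more than once
        issues["duplicate_customer_id"] = int(df["customer_id"].duplicated().sum())

        # Decay: records not verified within the last twelve months
        stale_cutoff = pd.Timestamp.now() - pd.DateOffset(months=12)
        issues["stale_records"] = int(
            (pd.to_datetime(df["last_verified"]) < stale_cutoff).sum()
        )

        return issues

    if __name__ == "__main__":
        sample = pd.DataFrame({
            "customer_id": [101, 102, 102, 104],
            "email": ["a@example.com", None, "not-an-email", "d@example.com"],
            "last_verified": ["2024-11-01", "2019-03-15", "2023-06-30", "2018-01-01"],
        })
        print(profile_customer_table(sample))

In practice, checks like these would be driven by the organization’s own data definitions and run on a regular schedule, so that entry errors, duplication and decay are measured rather than assumed.
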
Each of these areas represents a potential problem for any business, both in its existence within the organization and in the organization’s ability to even recognize that the problem exists. In any case, these are classic symptoms of “data denial” – one of the most costly economic drains on the well-being of businesses today.

Data Quality Maturity Levels

There are five key status indicators that can be used to measure an organization’s existing level of Data Quality maturity, each with its own set of distinct corporate – and human – attributes. Naturally, the Mature level is where you will want your organization to be positioned.

  1. Embryonic – this level is the least beneficial place to be, as Data Quality does not even appear on the organization’s radar screen; there is extensive finger-pointing with respect to data-associated blame, generally leading to cover-ups and CYAs; and there is no formal Data Quality organization in place. As far as the humans involved in the process are concerned – they are totally “clueless”.
  2. Infancy – this level is not much better, although the organization has begun to consider looking into Data Quality; various ad hoc groups may have been established to search for “answers”; and Data Quality has been positioned as a subset of corporate IT. This typically occurs as the human element begins to show an emerging interest.
  3. Adolescence – this level is one of mixed Data Quality accomplishments where most of the pain points have already been identified and the strategy team has shifted into a crisis-driven “full court press” managed by formal Data Quality teams that are populated and coordinated by both IT and the Business. However, this is also the point where alternating periods of panic and frenzy typically set in.
  4. Young Adult – by the time the organization reaches this level, there begins to be some semblance of an evolving Data Quality structure, where the entire organization is involved; one where both IT and the Business have begun to work as partners toward a common goal. Accordingly, the human attribute has also become much more “stabilizing”.
  5. Mature – once the organization has attained this level, it has finally reached the point where it has implemented an effective Data Quality structure, characterized by collaborative efforts and a Data Quality Center of Excellence (DQ COE), as well as the ability to measure and track customer value over time. As such, the organization has been able to attain a “controlled” environment, where all of the personnel involved – on both the supply and demand sides – are comfortable that the desired levels of Data Quality have been achieved.

Moving Toward Data Quality

Data Quality is the desired state where an organization’s data assets reflect the following attributes (sketched below as simple validation rules):

  • Clear definition or meaning;
  • Correct values;
  • Understandable presentation format (as represented to a knowledge worker); and
  • Usefulness in supporting targeted business processes.
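
To make these attributes more concrete, the brief Python sketch below expresses a few of them as automatable validation rules applied to a single contract record. The field names and the allowed value set are illustrative assumptions, not a prescribed standard.

    # Illustrative validation rules; field names and allowed values are assumptions.
    ALLOWED_STATUS = {"active", "expired", "pending"}  # clear definition: one agreed value set

    def validate_contract(record: dict) -> list:
        """Return a list of data quality findings for one contract record."""
        findings = []

        # Correct values: status must come from the agreed definition
        if record.get("status") not in ALLOWED_STATUS:
            findings.append(f"invalid status: {record.get('status')!r}")

        # Correct values: contract value must be a non-negative number
        value = record.get("contract_value")
        if not isinstance(value, (int, float)) or value < 0:
            findings.append(f"invalid contract_value: {value!r}")

        # Usefulness: fields that downstream processes rely on must be present
        for field in ("customer_id", "start_date", "end_date"):
            if not record.get(field):
                findings.append("missing required field: " + field)

        return findings

    if __name__ == "__main__":
        print(validate_contract({"status": "Active", "contract_value": -100, "customer_id": "C-42"}))

Rules like these are only as good as the definitions behind them, which is why the first attribute on the list (clear definition or meaning) has to come before any automated checking.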

However, regardless of the state of the organization’s data assets, there must still be a balance of data, process and systems in order to meet the company’s stated business objectives, which generally focus on:

  • Increasing revenues and margins;
  • Increasing market share; and/or
  • Increasing customer satisfaction.

In today’s economy, companies tend to invest more in packaged systems and business process optimization than in Data Quality. As a result, corporate Data Quality is often overlooked – and this can very easily lead to a significant reduction in the organization’s ability to effectively answer critical business questions, such as:

  • Who is our customer?
  • Are we missing sales opportunities?
  • Is the customer’s product entitled to service?
  • Are inaccuracies causing customer dissatisfaction?
  • Which parts should we stock as spares; how many; and where?
  • Are our service functions efficient; is our decision support timely and reliable?
  • How is our product defined?
  • Is our billing accurate and timely?

The inability to answer these critical business questions typically stems from data quality issues such as:

  • Inconsistent or incomplete product structure and service data
  • Inability to uniquely identify entitled versus non-entitled equipment
  • Incomplete or non-existent configuration data on entitled products
  • Duplication and redundancy

But it gets even worse! Poor Data Quality eventually stunts operational efficiency in virtually every area of the organization, as otherwise valuable resources (e.g., personnel, dollars, time) are required to spend an inordinate – and unnecessary – amount of extra effort:

  • Searching for missing data;
  • Correcting inaccurate information;
  • Creating temporary, or permanent, workarounds;
  • Laboring to assemble information from disparate databases; and
  • Resolving data-related customer complaints.

Over time, poor data quality significantly decreases an organization’s revenue-generating opportunities. Lost revenue can occur in the following areas:

  • Lost Maintenance Contract Revenue – products that should be under contract are not captured, and billing revenue is understated.
  • Lost T&M Revenue – non-entitled products that should be serviced under time-and-materials (T&M) terms are instead serviced under contract.
  • Lost Product Upgrade Opportunities – inability to identify customer need for product and software upgrades.
  • Incorrect Maintenance Charges – incorrect contract pricing, since product configurations cannot be accurately identified.
  • Lost Customers – lost customers and revenue due to dissatisfaction with poor asset management and cumbersome reconciliations.
  • Delayed Contract Renewals – lost renewal revenue and increased administrative costs due to delays in new contract initiation.
  • Overlooked Cross-sell & Up-sell Opportunities – missed opportunities to sell complementary or advanced solutions due to inaccurate records.

Poor data quality also significantly increases an organization’s operating costs and may, in fact, lead to reduced customer satisfaction. Increased operating costs can arise in the following areas:

  • Sales Team – more time is required to manage new opportunities and create quotes, less time is spent selling, and quoting of new maintenance contracts becomes inaccurate.
  • Customer Care Center – T&M billing disputes increase, the cost of contract dispute resolution rises, and invoice accuracy and timeliness decline, with increased dispute losses.
  • Contract Management – the effectiveness and timeliness of renewal activity is decreased.
  • Logistics – stocking locations become sub-optimized through over- or under-stocking of spare parts.
  • Finance – data for decision support and performance reporting becomes incomplete and/or inaccurate.
  • Service Delivery – technicians are double-dispatched due to wrong parts, service level commitments are missed and trouble call handling is degraded.
  • Product Management – the product lifecycle position is inaccurately identified and inaccurate service history affects service offering decisions.
  • Services Marketing – the ability to develop pricing programs is hindered, marketing programs are not deployed effectively and the burden and time required for data collection increase.

How to Achieve Data Quality

Arguably, the greatest battle you may face inside the organization will be to get management to the point where they agree that data quality is a goal even worth considering. To do this, every organization must have a champion to help find ways to remove barriers and change existing perceptions. The primary focus of the champion should be on:

  • Assisting in making data quality a strategic priority;
  • Assuring that data quality will be used to enable business processes; and
  • Finding – and communicating – compelling ways to make data quality attractive.

In our own experience, Bardess Group has helped many organizations achieve data quality by applying the most effective methodology for accelerating the data cleansing and control processes.

Finding Success

Most organizations can achieve data quality by applying a disciplined, repeatable methodology that accelerates the data cleansing and control processes.

The seven major steps that must be taken to achieve Data Quality are:

  1. Acknowledge the problem, and identify the root causes;
  2. Determine the scope of the problem by prioritizing data importance and performing the necessary data assessments;
  3. Estimate the anticipated ROI, focusing on the difference between the cost of improving Data Quality and the cost of doing nothing (a rough calculation is sketched after this list);
  4. Establish a single owner of Data Quality with accountability (e.g., make it a senior management role, such as a Data Officer/DQ COE);
  5. Create a Data Quality vision and strategy, and identify the key change drivers;
  6. Develop a formal Data Quality improvement program based on specific tools wherever possible (e.g., First Logic, Trillium, IBM Ascential, Data Flux, Group 1), and use a value-driven approach for large projects; and
  7. Make it a priority to move your organization up through the levels of the Data Maturity model!
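
As a rough illustration of step 3, the short Python sketch below compares the annual cost of living with poor data against the cost of a remediation program. All of the figures are placeholders chosen only to make the arithmetic concrete; an organization would substitute its own estimates of rework hours, lost revenue, dispute write-offs and program cost.

    # Placeholder figures for illustration only; substitute your own estimates.
    rework_hours_per_year = 12_000     # staff time spent on workarounds and corrections
    loaded_hourly_cost = 65.0          # fully loaded cost per staff hour
    lost_contract_revenue = 400_000    # entitled products never billed
    dispute_write_offs = 150_000       # T&M and invoice disputes written off

    cost_of_doing_nothing = (
        rework_hours_per_year * loaded_hourly_cost
        + lost_contract_revenue
        + dispute_write_offs
    )

    program_cost = 600_000             # cleansing effort plus first-year tooling and staffing
    expected_recovery_rate = 0.6       # share of those losses the program is expected to eliminate

    annual_benefit = cost_of_doing_nothing * expected_recovery_rate
    roi = (annual_benefit - program_cost) / program_cost

    print(f"Cost of doing nothing: ${cost_of_doing_nothing:,.0f} per year")
    print(f"Estimated annual benefit: ${annual_benefit:,.0f}")
    print(f"First-year ROI: {roi:.0%}")

Even with conservative placeholder numbers, framing the comparison this way makes the cost of doing nothing explicit, which is usually the argument that management responds to.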

Achieving Data Quality is critical, but getting there is often a complex process. Data Quality requires commitment from all business functions, as well as from the top down. Quick fixes typically do not work and generally only end up creating frustration. For many organizations, it may have taken years to create and foster a culture of data denial, and it will take rigorous processes to:

  • First, identify the problem before it can be fixed; and
  • Second, recognize – and accept – the full extent of the potential benefits that can ultimately be realized.

However, for many business enterprises, the numbers speak for themselves: the implementation of a Data Quality initiative can ultimately lead to reductions ranging from:

  • 10 – 20% of corporate budgets,
  • 40 – 50% of the IT budget, and
  • 40% of operating costs;

and increases of:

  • 15 – 20% in revenues, and
  • 20 – 40% in sales.

The application of Data Quality can provide an organization with the opportunity to capitalize on its cumulative information and knowledge assets. Knowledge that was previously unknown – or unavailable – such as cross-referenced customer buying patterns, profiles of potential buyers, or specific patterns of product/service usage may be uncovered and put into practical use for the first time. The end results can range from improvements in operational efficiency and more accurate sales forecasting to more effective target marketing and improved levels of customer service and support – all built on a strong foundation of Data Quality.