Why Data Quality is Undermining Business Success

Published in TDAN.com April 2003

Poor data quality is threatening to undermine the massive investment being made in IT projects. In an environment that is increasingly based on information, companies need to understand that the value of their businesses relies on the successful management of their data. With nearly all of a company’s operating units, from marketing and finance to customer services, depending on accurate data, the failure of many companies to take this issue seriously exposes them to serious risk. The key to addressing that risk is an ongoing data quality strategy.

The past decade has seen companies of all sizes scrambling to pour money into IT: building Internet presences, creating customer relationship management (CRM) solutions and implementing complex enterprise resource planning (ERP) systems. Businesses make these investments because managers believe that technology can deliver substantial improvements in business efficiency and customer service.

However, while huge emphasis is placed on systems integration and infrastructure, scant attention is paid to the lifeblood of any IT system: the actual data and, more importantly, the quality of that data. In many companies, even at boardroom level, there is a lack of appreciation of the damage that poor data quality can inflict upon a business. The reason probably has much to do with the relative ease with which a manager can determine the effectiveness of a business process or IT system, compared with the presumed difficulty of ascertaining the quality of the underlying data fed into that system.

However, poor data is often exposed in unexpected ways. In one instance, a telecoms company undertook a mailing to its customer base and inadvertently included a number of security alarms on the mailing list. It was eventually determined that the error arose because the inventory system had stored the alarms as business addresses: each alarm had to have a telephone line connected to it for calling the emergency services. It was a relatively harmless error, but it still added to the campaign’s costs, not to mention the company’s embarrassment.


Integration issues

The data problem worsens when a company decides to merge two disparate information repositories. This is typically done because of a merger or acquisition, or as part of an enterprise application integration (EAI) project intended to streamline a business process. However, few CEOs or CIOs are prepared for the surprises that await them when they undertake a data integration project.

Consider the case of a large insurance company that decided to merge its customer databases in order to gain a better understanding of its customers and the products they bought, with the aim of improving customer profitability and service. Prior to the project, its management believed, based on the information available to them, that it had a total of 13 million customers. However, when the project was completed, so many duplicates were discovered that around five million names had to be struck from the total. Had the management team been better informed about the data on which their business ran, a lot of corporate blushes and shareholder annoyance could have been avoided.
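The scale of that discovery is easier to picture with a small illustration. The sketch below shows, in very simplified form, the kind of duplicate matching a data integration exercise performs: records from the merged sources are grouped on a crude match key, here a normalised name plus postcode, and the distinct groups are counted. The field names and the matching rule are illustrative assumptions only, not a description of the insurer’s actual method or of any particular product.

```python
from collections import defaultdict

def normalise(value):
    """Lower-case and strip punctuation and whitespace so trivially
    different spellings compare as equal."""
    return "".join(ch for ch in str(value).lower() if ch.isalnum())

def count_unique_customers(records):
    """Group records by a crude match key (normalised name plus postcode)
    and count the distinct groups."""
    groups = defaultdict(list)
    for record in records:
        key = (normalise(record["name"]), normalise(record["postcode"]))
        groups[key].append(record)
    return len(groups)

# Two of these three records describe the same person.
merged = [
    {"name": "John Smith",   "postcode": "RG7 4PR"},
    {"name": "john  SMITH.", "postcode": "rg7 4pr"},
    {"name": "A. Patel",     "postcode": "SW1A 1AA"},
]
print(count_unique_customers(merged))  # prints 2, not 3
```

Real matching is of course far fuzzier than this, but even a crude key like the one above is enough to show how a headline customer count can be wildly inflated by duplicates.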

Effective data analysis and data management are a necessity if an EAI or data integration project is to succeed. Research has shown that failure to examine the content, structure and quality of data prior to integration is a root cause of nearly eight out of ten data integration projects running two thirds over their original budgets or collapsing altogether.

The risks are real, and they affect organisations in more ways than business profitability alone. Since the introduction of the latest UK Data Protection Act, the rules that govern information management and ownership have changed. Under this law, companies have to comply with rules for the manual and electronic processing of personal data, such as names, addresses and dates of birth. Any company that keeps such records is responsible for the data collected, how it is used and to whom it can be given. One of the eight enforceable principles of the Act states that “information must be accurate, up-to-date and held for no longer than necessary”. This means that companies need to take the issue of data quality seriously.


Taking it seriously

A significant part of the problem lies in the fact that very few companies take the issue of data quality seriously enough, and those that do rarely address it at a sufficiently senior level. A data management survey conducted by PricewaterhouseCoopers found that two thirds of traditional companies and half of all e-businesses discuss data management at board level only occasionally, if at all. This lack of senior management oversight of data strategy and quality is a major concern. The report further noted that responsibility for data is largely left in the hands of the IT department, especially within traditional companies, meaning that it often lacks the senior management backing required to secure sufficient budget and attention.

There are distinct bottom-line rewards to be had from paying attention to data quality. For instance, a large high-street retail bank, after undertaking a data quality exercise, improved the quality of its customer addresses by one per cent. While that may not sound impressive, the resulting reduction in incorrectly addressed statements, brochures and other documents shaved a million pounds off its postal bill. The problem isn’t an isolated one: how many incorrectly addressed letters end up in your mailbox every month?


Steps to success

Companies that invest the proper time and resources in creating an effective data strategy put themselves in the best position to secure maximum value from their data. At the same time, they build a foundation for whatever future business changes are required to keep them ahead of the game.

Once the required management support is in place, the crucial first step is to prioritise data quality issues and perform a thorough analysis of the company’s data. The good news is that this is no longer a task that requires teams of people painstakingly checking reams of customer names, addresses and other information. Powerful data analysis tools can now identify a wide variety of data issues, checking for most kinds of problems, inconsistencies and errors with minimal human intervention. By automating the bulk of the analysis workload, these tools free analysts and other data specialists to examine how the business can integrate and improve its data for the future.
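As a rough illustration of what such automated checks involve, the sketch below scans a small customer table for missing values and for postcodes that fail a simple format test. The field names, the UK postcode pattern and the list-of-dictionaries layout are illustrative assumptions; commercial profiling tools apply far richer rules, but the principle is the same.

```python
import re
from collections import Counter

# Simplified UK postcode shape, e.g. "RG7 4PR" (illustrative only).
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$", re.IGNORECASE)

def profile(records, postcode_field="postcode"):
    """Return per-column missing-value counts and any badly formatted postcodes."""
    missing = Counter()
    bad_postcodes = []
    for record in records:
        for column, value in record.items():
            if value is None or str(value).strip() == "":
                missing[column] += 1
        postcode = record.get(postcode_field) or ""
        if postcode and not UK_POSTCODE.match(postcode.strip()):
            bad_postcodes.append(postcode)
    return {"missing": dict(missing), "bad_postcodes": bad_postcodes}

customers = [
    {"name": "John Smith", "postcode": "RG7 4PR"},
    {"name": "",           "postcode": "NOT A CODE"},
]
print(profile(customers))
# {'missing': {'name': 1}, 'bad_postcodes': ['NOT A CODE']}
```

Run across millions of records and dozens of rules, checks of this kind quickly show where the real problems lie, which is exactly the ground work that used to consume teams of people.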

As data increasingly becomes a company’s most valuable asset, so the quality of that data needs to become one of its highest priorities. This means it should be treated with the same seriousness as revenue generation and customer service. Organisations that maintain the quality and integrity of their data will protect their business success and bottom line while promoting customer satisfaction and loyalty. This in turn helps assure their position as a major player in an ever-changing business environment.

Box-out: Sun-tan lotion for the hot spots of data quality

1. Seek out your quality hot spots: Identify the direct and indirect costs that bad data imposes on the business, for example the cost of failing to meet customer service levels, incorrect billing or lost business. This will clearly demonstrate the need for ongoing data quality initiatives.

2. Quantify them: Work out what cost benefits you could achieve if these hot spots were improved. Bear in mind that several hot spots may relate to a single data area. (Look for the jackpot.)

3. Prioritise your quality hot spots: Rank your quality pains by the return each would bring once rectified. (Fit for purpose.)

4. Secure data ownership within both business and technical communities: Responsibility for data ownership must be placed at both the commercial and practical levels within the organisation. For the data areas impacted by the hot spots, business-level ‘buy-in’ must be obtained.

5. Create a data cleansing action plan: To be successful, data quality initiatives need clear milestones, dedicated resources and demonstrable measures showing that data standards are being achieved.

6. Promote the cost benefits achieved from implementing the action plan: Consider including data quality measures in your overall business scorecard; a simple example of such a measure is sketched after this list. After all, data issues can cost more than most other problems: when things go wrong with data, they go spectacularly wrong. Solid proof of the benefits will also help when initiating further cleansing initiatives.

7. Maintain quality on an ongoing basis: Work with data owners so that any decline in data quality is quickly resolved and standards are maintained, turning your data into a highly valuable business asset.
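To make point six concrete, the sketch below shows one simple data quality measure that could feed a business scorecard: the percentage of customer records in which every required address field is populated. The choice of fields and the target figure are illustrative assumptions made for the example, not recommended thresholds.

```python
# One illustrative scorecard measure: address completeness across the
# customer base. The required fields and the target are assumptions
# made for this example only.
REQUIRED_ADDRESS_FIELDS = ("street", "town", "postcode")

def address_completeness(records):
    """Percentage of records in which every required address field is populated."""
    if not records:
        return 100.0
    complete = sum(
        1 for record in records
        if all(str(record.get(field) or "").strip() for field in REQUIRED_ADDRESS_FIELDS)
    )
    return 100.0 * complete / len(records)

customers = [
    {"street": "1 High Street", "town": "Reading", "postcode": "RG1 1AA"},
    {"street": "2 High Street", "town": "",        "postcode": "RG1 1AB"},
]
score = address_completeness(customers)
print(f"Address completeness: {score:.1f}% (illustrative target: 98%)")  # 50.0%
```

Tracked month on month alongside revenue and service measures, a figure like this makes any decline in data quality visible long before it shows up as returned mail or lost customers.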


Tony Rodriguez

Tony is the principal founder and CEO of Avellino. He has over ten years of experience in high performance computing and data integration. Founded in 1997, Avellino is an independent provider of enterprise solutions that enable organisations to integrate and migrate data, and to manage data quality accurately and effectively. Avellino has set the standard for data profiling and analysis with its flagship software product, Avellino Discovery. Launched in November 2000, Avellino Discovery is used worldwide by many blue-chip corporations, including Abbey National, BT, Co-operative Insurance Services, EDS, Ford Financial, the UK Ministry of Defence, Old Mutual, RAC and Sun Life of Canada. Avellino’s headquarters are in Aldermaston, Berkshire, UK, with US offices in Austin, Texas and Pittsburgh, Pennsylvania, and representative offices in the Netherlands, Spain, South America and Switzerland. Avellino’s website is located at http://www.avellino.com.
