METAbits – October 2003

Published in TDAN.com October 2003

Information Audits Gain Business Value from Corporate Information

27 August 2002 – META Group, Inc.

“Information is a primary currency of many organizations,” says META Group analyst Doug Laney. “Why not treat it like a financial asset?”

Currently, the problem in many enterprises is that, although the huge value of information is acknowledged, information is not treated as if it were valuable. Information must be considered an
asset in the business and IT portfolio. It should be owned by the enterprise, not by specific lines of business, divisions, or departments. However, each information asset should have clear lines
of responsibility to specific organizations and people within the enterprise. Responsibility for the security of that information must be assigned, just as corporate revenues are owned by the
corporation but are managed and secured by the financial department. Information should be managed with the same disciplines (e.g., security, focus on investment, and governance) that organizations
apply to money.

The challenge, however, is that information is less tangible than money. CIOs and business executives should ask the following questions:

  • What information do we have, and what do we lack?
  • Are we gathering too much information, or too little?
  • How can we measure and improve the quality of our information?
  • Where in the information supply chain should we deal with information quality?
  • How well do we manage and leverage information – are we leaving large amounts of valuable information buried in databases?
  • How can we improve access to that information?
  • How can our business processes be improved by access to better information?

To answer these questions, META Group has identified an information audit process – a concept borrowed from financial audits. Like a financial audit, it is designed to document the value that the
enterprise realizes from the asset – in this case, information – by answering three basic questions:

  • What information does the enterprise have?
  • How well is information managed?
  • How well does the enterprise leverage information to create a positive impact on business performance?

“We are developing a five-level model that gauges how mature client organizations are in the management of their information assets,” says Laney. “Our first cut looks at data quality. It rates
organizations on their maturity and provides specific actions they can take to move up the maturity scale and improve data quality.”

One key difference between this information audit and previous efforts is that it is tied directly to business processes and strategy. It does not look at managing information for its own sake, but
rather at how information can improve business effectiveness, support business strategy, and drive down operational costs.

For instance, through an information audit, retail clients have identified how they could improve customer service by applying household information they had already purchased for another
purpose. A major insurer discovered it was paying claims on policies that had expired as long as 30 years ago. A major manufacturer realized that it had 250 model years identified in its main
database structure for a product that has been in existence for fewer than 100 years. One of the most common issues identified by information audits is the existence of numerous conflicting
definitions for key entities, such as “customer”, which hamper efforts to identify, sell to, and service customers.

In one case, 10% of a retail client’s customer records with credit approval had missing address fields. In another, business executives had nicknamed the corporate data warehouse the “data
jailhouse” because they could not get information out of it. Their chosen platform was high performance, but it did not support the analytical tools they had. The information audit also exposes superfluous costs due to data duplication and data set overlap (e.g., multiple data warehouses).
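
As an illustration of the kind of spot check behind findings like these, the short Python sketch below (a toy example with made-up records and field names, not META Group's audit tooling) computes the share of credit-approved customer records that are missing an address:

    # Toy completeness check: what fraction of credit-approved customer
    # records lack an address? Records and field names are illustrative.
    customers = [
        {"id": 1, "credit_approved": True,  "address": "12 Elm St"},
        {"id": 2, "credit_approved": True,  "address": ""},
        {"id": 3, "credit_approved": False, "address": None},
        {"id": 4, "credit_approved": True,  "address": None},
    ]

    approved = [c for c in customers if c["credit_approved"]]
    missing = [c for c in approved if not c["address"]]
    pct = 100.0 * len(missing) / len(approved) if approved else 0.0
    print(f"{pct:.1f}% of credit-approved records have no address")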

While the information audit is designed to identify these and similar problems and provide methods for remedying them, it also looks at information from a higher, architectural level. Issues that
are considered include:

  • How is the data organized?
  • What knowledge (metadata) is available about the data?
  • How is it controlled (governance)?
  • How do operations detract from data flow and availability?
  • How is the data used versus how it is intended to be used?
  • How complete and accurate is it?

In addition, the information audit process can be used to attribute a financial value to units of information (e.g., customer contact, sales order, Web session). This helps an enterprise plan or
regulate its level of resource investment in managing and deploying information. Factors such as quality characteristics, business process relevance, loss risk, and information acquisition/management/delivery costs should feed into value-of-information models.
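
A value-of-information model along those lines might be sketched as a simple weighted score per information unit. The factor weights and scores below are purely illustrative assumptions, not a META Group formula:

    # Illustrative value-of-information score for a unit of information
    # (e.g., a customer contact record). Weights and scores are assumed.
    def information_value(quality, relevance, loss_risk, lifecycle_cost):
        weights = {"quality": 0.3, "relevance": 0.4, "loss_risk": 0.2, "cost": 0.1}
        benefit = (weights["quality"] * quality
                   + weights["relevance"] * relevance
                   + weights["loss_risk"] * loss_risk)
        # Net out the cost of acquiring, managing, and delivering the data.
        return benefit - weights["cost"] * lifecycle_cost

    # A customer contact record scored against assumed factor values (0-1).
    print(round(information_value(quality=0.8, relevance=0.9,
                                  loss_risk=0.6, lifecycle_cost=0.4), 2))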

User Action: Users need to go beyond talking about information and treat it as an asset. They should apply financial and inventory management techniques to managing information. The first step in
this process is an audit to document how information is treated currently, where the key problems in maximizing the benefits of the information asset lie, and how those problems can be fixed. This
audit should be repeated regularly, just as corporate finances are audited annually, to document progress and identify the next steps the enterprise must take to move up the information maturity
index.

META Group analysts Doug Laney, Chris Byrnes, John Brand, Jeffrey Mann, Dale Kutnick, and Val Sribar contributed to this article.

More METAbits …

The Project Stop Loss
Robert Handler – 27 August 2003

A project “stop loss” is an effective technique to prevent mismanagement of project portfolios. With a project portfolio, however, “stop loss” triggers must be created that are more complex
than the simple price-change triggers used in investment portfolios. To institute a project “stop loss” system, project managers must consistently report actual project performance against expected performance. Instituting an IT project “stop loss” will minimize losses on bad projects.
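
A stop-loss trigger of this kind might be sketched as follows; the tolerance thresholds and inputs are illustrative assumptions rather than a prescribed rule:

    # Illustrative project stop-loss trigger: flag a project when actual cost
    # or schedule slippage exceeds an assumed tolerance against the plan.
    def stop_loss_triggered(expected_cost, actual_cost,
                            planned_pct_complete, actual_pct_complete,
                            cost_tolerance=0.20, schedule_tolerance=0.15):
        cost_overrun = (actual_cost - expected_cost) / expected_cost
        schedule_slip = planned_pct_complete - actual_pct_complete
        return cost_overrun > cost_tolerance or schedule_slip > schedule_tolerance

    # 30% over budget and 25 points behind plan trips the trigger.
    print(stop_loss_triggered(500_000, 650_000, 0.60, 0.35))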

Misinformation on Enterprise Information Integration (EII)
Doug Laney – 21 August 2003

As with many novel technologies, some individuals (even respected IT professionals) get caught inappropriately pigeon-holing and misconstruing them. This is what seems to be happening with the
emerging breed of enterprise information integration (EII) technology. We have long advocated EII as a means for low-volume federated access to heterogeneous data sources. Similarly, EII vendors
(e.g., BEA, IBM, Metamatrix) have been very careful not to characterize their solutions as a substitute for physical data warehouses. Yet some in the industry still have trouble appreciating that
EII is an enabler for a valuable new style of data integration (i.e., “virtual” or “federated” integration) — one that can extend a data warehouse, but is certainly not intended to replace
it.
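
To make the federated style concrete, the toy sketch below (plain Python and SQLite stand-ins, not any EII vendor's API) answers a question by querying a warehouse table and a live operational source at request time and joining the results in the middle tier, with no physical copy:

    import sqlite3

    # Two in-memory databases stand in for a warehouse and an operational system.
    warehouse = sqlite3.connect(":memory:")
    warehouse.execute("CREATE TABLE customer_dim (customer_id INTEGER, segment TEXT)")
    warehouse.executemany("INSERT INTO customer_dim VALUES (?, ?)",
                          [(1, "Gold"), (2, "Silver")])

    operational = sqlite3.connect(":memory:")
    operational.execute("CREATE TABLE open_orders (customer_id INTEGER, amount REAL)")
    operational.executemany("INSERT INTO open_orders VALUES (?, ?)",
                            [(1, 250.0), (1, 75.0), (2, 40.0)])

    # Federated access: query each source on demand and join in flight,
    # instead of copying operational data into the warehouse.
    segments = dict(warehouse.execute("SELECT customer_id, segment FROM customer_dim"))
    for cust_id, total in operational.execute(
            "SELECT customer_id, SUM(amount) FROM open_orders GROUP BY customer_id"):
        print(cust_id, segments.get(cust_id, "unknown"), total)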

Signs of the Adaptive Infrastructure: Self-Correcting Data
Doug Laney – 13 August 2003

By 2004/05, data profiling technologies (e.g., Ascential, Evoke, Firstlogic, Avellino, DataFlux) will be used to monitor and introspect data feeds – adjusting data integration (e.g., ETL) and data
quality routines on the fly. As data profiling becomes more pervasive, manual data modeling efforts will become largely irrelevant other than for initial application design. With the push of a
button, an enterprise will be able to generate an enterprise data model that reflects the real world – not just its best guesstimate.
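
A rough illustration of the self-correcting idea (a toy sketch, not a depiction of any listed vendor's product) is to let each batch's own profile set the validation rule for the next batch, rather than hand-tuning it:

    import statistics

    def profile_batch(batch, k=3.0):
        """Derive acceptance bounds for a numeric 'amount' feed from its profile."""
        values = [row["amount"] for row in batch]
        mean, stdev = statistics.mean(values), statistics.pstdev(values)
        return (mean - k * stdev, mean + k * stdev)

    def apply_bounds(batch, bounds):
        low, high = bounds
        return [row for row in batch if low <= row["amount"] <= high]

    batch1 = [{"amount": v} for v in (10, 12, 11, 9, 13)]
    batch2 = [{"amount": v} for v in (11, 10, 400, 12)]

    bounds = profile_batch(batch1)            # learned from the data, not hand-set
    clean2 = apply_bounds(batch2, bounds)     # 400 falls outside the learned bounds
    bounds = profile_batch(clean2)            # recalibrate from the cleaned batch
    print(len(clean2), bounds)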

If You Step Twice Into a Data Stream, Is It Still the Same Stream?
Doug Laney – 8 August 2003

Information-rich enterprises, particularly those in dynamic business environments that lead to new and modified data streams, are increasingly turning the “data modeling” exercise on its head.
Using data profiling technologies (e.g., Avellino, Evoke, DataFlux, Firstlogic, Ascential), an enterprise can model data through introspection rather than laborious hand-modeling. These
technologies look at what’s actually in the data to form an understanding of real relationships (rather than the assumed ones endemic to traditional manual data modeling). Data profiling tools can
not only generate data models, but also report on data quality issues.
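
In the same spirit, a minimal introspection sketch (toy code, not a profiling product) can infer column types and candidate keys from the rows themselves and surface a quality issue along the way:

    # Toy introspection: infer column types and candidate keys from the data.
    rows = [
        {"order_id": "1001", "customer_id": "C7", "amount": "19.99"},
        {"order_id": "1002", "customer_id": "C7", "amount": "n/a"},
        {"order_id": "1003", "customer_id": "C9", "amount": "19.99"},
    ]

    def infer_type(values):
        """Call a column numeric only if every value parses as a number."""
        try:
            [float(v) for v in values]
            return "numeric"
        except ValueError:
            return "text"

    for column in rows[0]:
        values = [r[column] for r in rows]
        unique = len(set(values)) == len(values)
        # 'amount' comes back as text because of the stray "n/a" -- exactly the
        # kind of issue a profiler reports alongside the inferred model.
        print(f"{column}: type={infer_type(values)}, candidate_key={unique}")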

Business Objects’ Strategy Crystalized
David Folger – 7 August 2003

Business Objects recently announced plans to acquire Crystal Decisions, a business intelligence (BI) company based in Vancouver, Canada. While this is a merger of two healthy companies, and on
paper puts the combination ahead of BI leader Cognos, we note that like-sized high-tech mergers are often problematic. The companies will need to address overlapping product lines, differing
distribution channels, technological differences (e.g., metadata, report definitions), and major operations split between three countries (France, US, and Canada). Given the financial strength of
both companies and the large number of entrenched users, we expect the merger will ultimately be successful – probably after a year of sorting out the above issues. Given these companies’
strengths, we feel that end users can count on continued support for core products (e.g., Crystal Reports, Business Objects WebIntelligence and client version), but users of peripheral products
(e.g., OLAP viewers, analytic applications) should await definitive road maps before initiating large deployments. This merger accelerates the trend from a market of medium-sized vendors with
narrow product lines into a BI market dominated by a few large suite players. The merger also puts additional pressure on smaller BI vendors (e.g., ProClarity, Actuate, MicroStrategy) to find
partners as the market consolidates.

Customer Data Integration Lesson #1 = “Go Early, Go Executive”
Aaron Zornes – 29 July 2003

Providing a 360-degree view of customers within the organization is always targeted by customer relationship management (CRM) initiatives, but it is rarely delivered effectively, due to many challenges. Given the political gerrymandering associated with customer data ownership and costs, securing executive sponsorship early is critical. Too often, customer data integration (CDI)
initiatives fail due to political infighting over data primacy concerning customer data.

Customer Data Integration Lesson #2 = “It’s the Data, Stupid”
Aaron Zornes – 29 July 2003

Without conformance regarding certain aspects of the customer data model, other efforts such as process model integration suffer severely. In other words, organizations should fixate on
data-centricity rather than system-centricity. Next-generation customer data models offer potential for dramatically driving down costs of CRM integration as well as for documenting and enforcing
privacy policies regarding use of customer data. The functionality and extensibility of customer data models are critical evaluation criteria for customer data integration (CDI) solutions.

Customer Data Integration Lesson #3 = “Think Global, Act Local”
Aaron Zornes – 29 July 2003

Rather than boiling the ocean, as many enterprise customer data repository efforts do by provisioning an enterprise data warehouse or operational data store (ODS) with the complete set of internal and external customer data sources, organizations should simply define the customer data integration (CDI) long-term vision but act incrementally. CDI solutions are major infrastructure investments
that require traditional technology evaluation practices. Such master data infrastructure needs to be evaluated for future leverage with other master data (e.g., employees, partners, suppliers).

Customer Data Integration Lesson #4 = “Commit to Commitment”
Aaron Zornes – 29 July 2003

Creation of a customer data integration (CDI) committee or task force (a.k.a. program management office or PMO) is vital to the success of CDI initiatives. Such organizations are typically tasked
with: change management across lines of business and disciplines; coordination of cross-system metadata; and monitoring of industry standards (e.g., HIPAA, US PATRIOT Act, OFAC).

Customer Data Integration Lesson #5 = “Discipline Thyself, IT Organization”
Aaron Zornes – 29 July 2003

IT organizations (ITOs) love to evangelize discipline to IT analysts in lines of business, but often fail to measure up to such ideals themselves. ITOs can build discipline through adherence to methodologies and the establishment and recognition of best practices such as centralized customer administration (“data stewardship”); application of HIPAA rules to
restrict data access; and leverage of security roles by applying traditional LDAP security to the CDI solutions’ cross-system mapping data model.

Pretty Reports Are Not the Answer to Enterprise Analytic Value
Doug Laney – 23 July 2003

Effective consultants help clients be opportunistic with data in spite of themselves. Instead of developing low-value data warehouse solutions that deliver interesting data to curious eyeballs,
effective consultants create high-value solutions that deliver important data to business processes. A consultancy that can determine how information can fuel the performance of business processes
will fare far better than one that can wow business users with periodic pretty reports.

Mind the Gap: Data Warehouse Boutiques, Systems Integrators, or…
Doug Laney – 17 July 2003

IT consulting in the data warehousing and business intelligence space continues to become more competitive, yet decidedly polarized. In the gap between the global systems integrator generalists and
local data warehousing boutiques lies a significant opportunity for consultancies that aggressively and exclusively focus on being full-service analytic solution providers. Enterprises employing
these firms will find ample resources dedicated to and profoundly experienced in the discipline and nuances of data warehousing.

“Data Hoarding” by Business Units Leads to “Corporate Data Privation”
Aaron Zornes – 17 July 2003

Online data enrichment can instantly increase the value of customer information (e.g., many service bureaus are Web-enabled so that customer data can immediately be enhanced, instead of waiting for
monthly batch updates). A key challenge is to broadly apply a common data model and tag each customer with a unique identifier so that information from multiple touch points and lines of
business can be combined. Creating a single master reference database for the entire organization is difficult due to ownership issues. Emerging customer data integration (CDI) solutions enable
each business unit to manage its own data yet find ways to share it (e.g., DWL, ISI, IBM, Journee, SAP, Siebel, Siperian). This can also help leverage customer data without risking infringement of
privacy regulations.
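
As a toy illustration of tagging a unique identifier across touch points (real CDI products such as those listed use far richer probabilistic matching), a deterministic match key on normalized attributes might look like this:

    import hashlib

    def customer_key(record):
        """Deterministic match key from normalized name + postal code (toy only)."""
        name = record["name"].strip().lower()
        postal = record["postal_code"].replace(" ", "")
        return hashlib.sha1(f"{name}|{postal}".encode()).hexdigest()[:12]

    web_lead    = {"name": "Pat Smith ", "postal_code": "60601"}
    call_center = {"name": "pat smith",  "postal_code": "60601"}
    print(customer_key(web_lead) == customer_key(call_center))   # True: same customer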

Emergence of the Web Site as a Customer Channel
Kurt Schlegel – 11 June 2003

It took a dot-com implosion, but the role of a Web site as a customer channel complementary to others (e.g., direct sales, call center, resellers) has been clearly defined for most Global 2000
organizations. Customer relationship management programs implemented during the past two years have formalized customer life-cycle dynamics around four primary events: engaging, transacting,
fulfilling, and servicing the customer relationship. Despite the Web site’s new prominence as a customer channel, most organizations are shortchanging investments in analyzing Web sites relative
to other customer channels. In many cases, the central analytics or data warehouse team is not involved with analyzing Web site activity.

How Much Does It Cost to Make an Enterprise Intelligent?
Kurt Schlegel – 7 June 2003

In most large enterprises (30,000+ users), the analytic team consists of 15-18 FTEs during a steady-state operational phase. However, labor requirements will spike significantly (up to 100%) during
a major implementation or restructuring. These staffing requirements are typically filled with systems integrators, vendor consulting, and small boutiques. Hourly labor rates for these services are
at an all-time low (e.g., $150 per hour). We recommend using these services to gain expertise and satisfy surplus demand (e.g., design and implementation stage of new projects), but large IT
organizations should maintain analytic operations with internal staff, which is almost always cheaper.
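
A back-of-the-envelope comparison shows why. The $150/hour external rate and the 15-18 FTE range come from the note above; the internal fully loaded cost per FTE and annual hours are assumptions for illustration only:

    # Rough external vs. internal cost comparison for steady-state operations.
    HOURS_PER_YEAR = 1800                # assumed billable hours per FTE
    EXTERNAL_RATE = 150                  # $/hour, from the note above
    INTERNAL_FULLY_LOADED = 130_000      # $/year per FTE, assumed

    ftes = 16                            # midpoint of the 15-18 FTE range
    external = ftes * HOURS_PER_YEAR * EXTERNAL_RATE
    internal = ftes * INTERNAL_FULLY_LOADED
    print(f"External: ${external:,}  Internal: ${internal:,}")
    # External: $4,320,000 vs. Internal: $2,080,000 at these assumptions.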

Architecture Is a Verb
Richard Buchanan – 5 May 2003

Successful organizations do not create architectures – they architect. Architecture easily becomes shelfware unless there is an ongoing process to refresh it. The analogy between enterprise
architecture and a building’s architecture is false. Enterprise architecture is more akin to urban planning. Confusion about this basic fact sabotages many enterprise architecture efforts that are
assumed to be “projects” that have a beginning, middle, and end.

Measuring Data Quality: Divide and Conquer
Doug Laney – 7 March 2003

Through 2002/03, we believe the recognition of information as part of the IT portfolio will prompt leading organizations to fashion custom indicators and a scorecard for measuring data quality.
Data quality is a composite of characteristics including accuracy, consistency, completeness, entirety, breadth, depth, precision, latency, scarcity, redundancy and integrity – each a bit more
measurable than “data quality” itself.
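
A minimal scorecard sketch in that spirit might compute a few of these indicators on a toy customer table; the indicator definitions and equal weighting are assumptions, not META Group's model:

    # Toy data quality scorecard over a sample customer table.
    rows = [
        {"id": 1, "email": "a@x.com", "updated_days_ago": 10},
        {"id": 2, "email": "",        "updated_days_ago": 400},
        {"id": 2, "email": "b@y.com", "updated_days_ago": 35},   # duplicate id
    ]

    completeness = sum(1 for r in rows if r["email"]) / len(rows)
    redundancy   = 1 - len({r["id"] for r in rows}) / len(rows)       # lower is better
    freshness    = sum(1 for r in rows if r["updated_days_ago"] <= 90) / len(rows)  # latency proxy

    composite = (completeness + (1 - redundancy) + freshness) / 3     # equal weights assumed
    print({"completeness": round(completeness, 2),
           "redundancy": round(redundancy, 2),
           "freshness": round(freshness, 2),
           "composite": round(composite, 2)})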

Used by permission of Doug Laney, META Group, Inc.
Copyright 2003 © META Group, Inc.

Doug Laney

Doug Laney, Vice President and Service Director of Enterprise Analytics Strategies for META Group, is an experienced practitioner and authority on business performance management solutions, information supply chain architecture, decision support system project methodology, consulting practice management, and data warehouse development tools. Prior to joining META Group in February 1999, he held positions with Prism Solutions as a consulting practice director for its Central US and Asia Pacific regions, as a methodology product manager, and as a consultant to clients in Latin America. With data warehouse solution involvement in dozens of projects, his field experience spans most industries. Mr. Laney's career began at Andersen Consulting, where he advanced to managing batch technical architecture design/development projects for multimillion-dollar engagements. He also spent several years in the artificial intelligence field, leading the development of complex knowledge base and natural language query applications. Mr. Laney holds a B.S. from the University of Illinois.
