Data Professional Introspective: Comparative Data Management (Part 2)


Part 1  |  Part 3  |  Part 4 (Coming Soon)


In our last column, “Comparative Data Management Part 1,” we discussed how organizations are evolving their data management programs and reviewed average assessment results from organizations that employed the Data Management Maturity (DMM)SM Model, creating a comprehensive benchmark.

We looked at the score ranges by discipline (Process Area) in a chart of 12 organizations and saw a wide range in capability strengths and maturity.  For this column, we’ve updated the chart, which now reflects 16 organizations.  Industries included are the financial industry (ten organizations), IT product companies, health care, scientific research, and the Federal sector.  As we stated, each organization follows a unique path based on its current state, the degree of time and attention previously focused on data management, and its strategic business and technology priorities.  The DMM has been applied to organizations as large as 130,000 people and as small as 15, and to scopes ranging from the entire enterprise, including all business areas, down to a single repository.

Nonetheless, we can generalize at a high level about the state of the industry from these results, because a number of recurring themes surface – which disciplines tend to be established and nurtured, which tend to be neglected, and what challenges organizations typically face.  The DMM advocates that all organizations strive to achieve a capability strength of Level 3 – Defined – in all Process Areas.  Since the model was developed to progress from conceptual orientation to effective, sound best practices for a well-rounded data management program (Level 3), this evaluation result is the equivalent of ‘Green, Good, Go!’

[Chart: DMM assessment score ranges and average scores by Process Area across the 16 organizations]

The DMM scoring mechanism produces a raw score based on the number of practice statements that were assessed as Fully Met, Partially Met, and Not Met.  Practices are not weighted.  A raw score is rounded up or down to the nearest quarter point.  For example, a raw score of 1.58 would round down to 1.50, while a raw score of 1.66 would round up to 1.75.
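The quarter-point rounding is easy to capture in a few lines of Python. The sketch below reproduces only the rounding rule described above (not the raw-score calculation itself), and since the handling of exact ties such as 1.625 is not stated, Python’s default rounding stands in as an assumption:

```python
def round_to_quarter(raw_score: float) -> float:
    """Round a raw DMM score to the nearest quarter point (0.25 increments)."""
    # Tie behavior (e.g., 1.625) is not specified in the text; Python's default
    # rounding is used here purely as an illustrative assumption.
    return round(raw_score * 4) / 4

print(round_to_quarter(1.58))  # 1.5  -> rounds down to 1.50, as in the example above
print(round_to_quarter(1.66))  # 1.75 -> rounds up to 1.75
```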

Average scores in the chart are rounded and indicated by the orange triangles – the mid-point between the highest and lowest scores evaluated for each Process Area.  Note that for this sample of 16 organizations, there are only six Process Areas where the average score is 3.0 or above – Governance Management, Data Quality Assessment, Provider Management, Architectural Approach, Data Integration, and Configuration Management.  The top of the range for each of these six is 4.00 or above, pulling the mid-point higher.
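For readers who like to see the arithmetic behind the chart, here is a small sketch that computes the low, high, range mid-point, and average per Process Area and flags averages of 3.0 or above. The scores are invented for two generic Process Areas; they are not the benchmark data.

```python
from statistics import mean

# Invented scores (one per assessed organization) for two hypothetical
# Process Areas -- illustration only, NOT the benchmark results.
scores = {
    "Process Area A": [2.25, 3.50, 2.75, 4.25, 3.25],
    "Process Area B": [1.00, 1.75, 1.25, 2.25, 1.50],
}

for area, vals in scores.items():
    low, high = min(vals), max(vals)
    midpoint = (low + high) / 2   # mid-point of the evaluated range
    average = mean(vals)          # average across the assessed organizations
    status = "3.0 or above" if average >= 3.0 else "below 3.0"
    print(f"{area}: range {low:.2f}-{high:.2f}, "
          f"mid-point {midpoint:.2f}, average {average:.2f} ({status})")
```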

Let’s briefly review DMM Levels.  Within a DMM process area, statements are organized by best practice capabilities that represent a successive, abstracted path of capability growth for a typical organization.  A brief Levels primer:

  • Level 1 – basic practices are being performed, usually project by project. Capabilities are not systematically shared across the organization – Project
  • Level 2 – capabilities are developing and extended across at least one line of business or major multi-business line program (such as a master data hub or a data warehouse) – Program
  • Level 3 – capabilities are well developed, integrated and extended across the organization: Level 3 represents the recommended target for all organizations, yielding the benefits of improved data assets, increased efficiency, and cost reduction – Organization (indicated by the blue horizontal line)
  • Level 4 – capabilities are refined by employing metrics and statistical analysis techniques
  • Level 5 – capabilities receive senior management attention and are continuously improved.
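For quick reference, the primer can be restated as a simple lookup table. The scope keywords come from the list above; the summaries are paraphrased, and Levels 4 and 5 have no scope keyword in the text, so None is used.

```python
# Paraphrase of the Levels primer above as a lookup table (sketch only).
DMM_LEVELS = {
    1: {"scope": "Project",      "summary": "Basic practices performed project by project; not systematically shared"},
    2: {"scope": "Program",      "summary": "Capabilities extended across a line of business or major program"},
    3: {"scope": "Organization", "summary": "Capabilities integrated organization-wide; the recommended target"},
    4: {"scope": None,           "summary": "Capabilities refined using metrics and statistical analysis"},
    5: {"scope": None,           "summary": "Senior management attention and continuous improvement"},
}
```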

For Part 2, we’re going to focus on the disciplines with the lowest average scores – Data Management Strategy (2.00), Data Lifecycle Management (1.50), and Measurement and Analysis (1.50) – and what organizational characteristics and capability gaps they typically reveal.

There are clear reasons why organizations do not demonstrate strong capabilities in these three areas – let’s explore them.

Data Management Strategy

Since the mid-1980s, data management thought leaders and practitioners, armed with skills and implementation experience, have been explaining, pushing, and shouting from the housetops that “Data assets are critical components of an organization’s infrastructure and must be managed across the entire enterprise!”  Nothing new about that concept.

Because organizations are naturally focused on activities – getting things done – data itself, the fabric created and used to accomplish those activities, has historically been an afterthought.  Now that organizations have embraced the idea that ‘data assets are critical,’ they face major challenges. The larger and more complex the data layer, the more daunting the contemplation of ‘How the heck can we straighten this out?’

However, the challenge must be faced.  The data layer is always depicted as the foundation of the N-tiered architecture. The first data asset was born the day the organization opened its doors (in most cases, before that), and data stores have proliferated and expanded ever since, continuing to grow for the life of the organization.  Data needs to be well managed like any other critical asset; the alternative is ever-higher expenses, increasing chaos, and greater risk in meeting business needs.

So, organizations need to face their fears, like ‘It will be so expensive!’ or ‘We’ll never untangle the legacy environment!’ Instead of nail-biting and hiding under the blankets, the DMM recommends a simple and practical approach – essentially, for the organization to answer these questions:  What have we got, and how well are we handling it? What do we have to do? Who will do it? How do we gain agreement? And how long will it take?  The net result of thoughtful, forward-looking, and collaborative planning and sequencing is a Data Management Strategy (DMS).  Common challenges in creating and approving a DMS are:

  • A federated / distributed organizational structure that currently lacks an established governance program – organizational units are accustomed to handling everything themselves, business-line centric, rather than collaborating. This makes agreeing on vision, goals, objectives, and priorities more difficult.
  • Many staff members who should be involved in creating a DMS have limited awareness of the organization’s business strategy, and many organizations do not publish their business strategy internally. This makes prioritization difficult, especially if executive management is not actively engaged.
  • Too great a scope chosen for “enterprise data.” It should be obvious that success is easier to achieve in smaller chunks.  The successful effort is typically focused on highly shared data, or on broad key implementation initiatives that enable high-priority business gains, e.g., improving the customer experience.
  • Too broad a selection of topics. The term ’Data Strategy’ is often used to describe the entire realm of data – target architecture and technologies, as well as the data management disciplines needed to support them.  This ‘complete realm’ approach could take years and run to hundreds of pages.  It is self-defeating because it is too large and difficult to implement.

The DMM contains specific practices and a content outline to assist an organization in developing a DMS.  It should be approached as a high-energy, time-boxed effort with all relevant stakeholders involved.  As an example, the two organizations that achieved the highest scores both developed an enterprise DMS, have been implementing it, adjusting for changing priorities, and are reaping great rewards from the strategy’s multi-faceted implementation.

Data Lifecycle Management

This Process Area represents the nitty-gritty work of achieving the desired state of data knowledge – mapping, inventorying, understanding, and controlling data flows through the data life cycle, from creation or acquisition to retirement.  An important aspect of this work is determining the most important business processes and mapping the data to them – where it is created, where it is modified, and where it is used.  Data Lifecycle Management (DLM) activities and work products provide essential support for managing metadata, data governance, data architecture, data integration, and the designation of authoritative data sources.
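To make the mapping exercise concrete, here is a minimal sketch of a process-to-data matrix – which processes create, update, and read which data sets – with two simple lookups for point of origin and impact. The process and data set names are invented for illustration; the DMM does not prescribe this particular structure.

```python
# Hypothetical process-to-data mapping: for each business process, which data
# sets it creates, updates, or reads. All names are invented for illustration.
LIFECYCLE_MAP = {
    "Open Account":     {"creates": ["Customer", "Account"], "updates": [], "reads": ["Product"]},
    "Post Transaction": {"creates": ["Transaction"], "updates": ["Account"], "reads": ["Customer"]},
    "Close Account":    {"creates": [], "updates": ["Account", "Customer"], "reads": ["Transaction"]},
}

def origin_of(data_set: str) -> list[str]:
    """Business processes in which a data set is created (its point of origin)."""
    return [p for p, usage in LIFECYCLE_MAP.items() if data_set in usage["creates"]]

def impacted_by(data_set: str) -> list[str]:
    """Processes affected if the data set's definition or content changes."""
    return [p for p, usage in LIFECYCLE_MAP.items()
            if data_set in usage["creates"] + usage["updates"] + usage["reads"]]

print(origin_of("Account"))    # ['Open Account']
print(impacted_by("Account"))  # ['Open Account', 'Post Transaction', 'Close Account']
```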

Clearly, if an organization could instantly have comprehensive, accurate, well-documented knowledge of exactly what data resides where, how it is created, what happens to it later, and who uses it, the value would be enormous: decreased time to market, increased agility, stronger data governance, and more effective data integration. It is hard work, due to the many thousands of data representations implemented in many data stores over many decades by many business users and project teams, often within their own business units.  In addition, most organizations have not engaged in a top-down re-engineering of their business architecture, adding to the difficulty.  The core business processes themselves have often developed in an ad hoc manner, and individual knowledge typically fills the gaps left by the many inefficiencies that have crept in over the years as the business evolved.

Therefore, most organizations do not have clear plans in place; what we find instead is projects-by-necessity, dictated by regulatory compliance or top-priority business needs.  The approach differs each time, forcing each project to blaze its own trail.  Key challenges are:

  • Defining a standard approach and framework by which business processes are mapped to data flows.
  • Justification of the time, expense, and effort beyond single projects.
  • Lack of metadata and documentation of existing business processes, data flows, and data stores.
  • Difficulty in identifying the subject matter experts.
  • Determining which toolsets meet the requirements for the desired end result and the metadata to be captured.
  • Gaining agreements from producers and consumers, especially where data sets and business processes overlap.

The DMM’s DLM Process Area identifies clear practices that help an organization define a right-sized baseline of data sets and processes, identify data impacts and dependencies, and maintain mappings as business processes are modified.  It is recommended that this work begin with an enterprise-wide consideration of the approach to be followed, the repositories to be populated, and the depth of information to be collected.  Implementation can then be orchestrated in manageable segments, leveraging natural events such as a key program implementation.

Measurement and Analysis

Since most organizations have only recently conceived of data management as a permanent, enterprise-wide program (the data is forever; there will be no stopping unless the organization goes out of business), it is not surprising that the state of measures and metrics is haphazard – even organizations that are quite mature in many data management disciplines neglect the importance of metrics.  The DMM’s Measurement and Analysis (M&A) Process Area addresses best practices for defining and using metrics.

Metrics include milestones – usually better addressed because they can be connected to projects – as well as counts, percentages, and other measures that enable statistical analysis.  Without a solid set of meaningful metrics, organizations struggle to evaluate whether the many aspects of the data management program are healthy and progressing.

The DMS states the program’s objectives. The organization must: determine its vision of the desired state for the program itself, as well as for each discipline comprising the program; develop a standard process for how metrics will be established, maintained, and used to evaluate processes; determine how to inform stakeholders; and provide organization-wide access to important metrics.  Challenges to development of a measurement approach include:

  • Lack of a sound approach to metrics development, based on a ‘vision of the good’ and a determination of what can be measured that speaks directly to progress toward that good.
  • Lack of, or inadequate, operational definitions for processes.
  • Metrics that are mathematically justifiable but do not provide concise conclusions about how processes are performed and products produced, and therefore do not support decisions.
  • Use of metrics as part of performance reviews, to reward or punish.
  • Lack of training in how to develop metrics for those involved in crafting them.
  • Lack of involvement from staff performing the processes to be measured.
  • Lack of periodic review to ensure that metrics being employed are still useful.

The DMM contains best practices for formulating, adapting, and using metrics for program improvements, along with significant supporting information and detailed activity steps that can be used as a guide.  It is recommended to start with program-level metrics, captured in a dashboard available to all stakeholders, and then address key metrics for each priority discipline.  Taking an expanding data governance initiative as an example, the organization may want to track the number of governance participants, the number of governance action items completed, the number of action items escalated, and so on.
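Taking that governance example literally, a program-level dashboard can begin as a few counts and ratios computed on a schedule. The sketch below is illustrative only; the field names and action-item states are assumptions, not DMM prescriptions.

```python
from dataclasses import dataclass

@dataclass
class ActionItem:
    status: str  # assumed states for illustration: "open", "completed", "escalated"

def governance_metrics(participants: list[str], items: list[ActionItem]) -> dict:
    """Compute the example metrics mentioned above: participant count,
    completed action items, escalated action items, plus a completion rate."""
    completed = sum(1 for i in items if i.status == "completed")
    escalated = sum(1 for i in items if i.status == "escalated")
    return {
        "governance_participants": len(participants),
        "action_items_completed": completed,
        "action_items_escalated": escalated,
        "completion_rate": completed / len(items) if items else 0.0,
    }

# Invented sample data for illustration.
print(governance_metrics(
    ["steward_a", "steward_b", "owner_c"],
    [ActionItem("completed"), ActionItem("completed"),
     ActionItem("escalated"), ActionItem("open")],
))
```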

In Part 3 of this series, we will focus on the highest scoring disciplines and provide examples of key accomplishments that reflect success.

The higher score ranges – Level 3 and above – show that some organizations have mastered specific capabilities in a number of areas, achieving robust practices that are implemented organization-wide.  Score ranges with higher floors, such as Business Case, Governance Management, and Data Integration, show that organizations have invested resources and attention in developing consistent practices.   The average scores across all 16 organizations collectively show that while data management capabilities have indeed progressed, the average organization is still working primarily at the program level.


Melanie Mecca


Melanie Mecca, CEO of DataWise Inc., an Enterprise Data Management Council Partner and Authorized Instructor for DCAM certification courses, is the world’s most experienced evaluator of enterprise data management programs. Her expertise in evaluation, design, and implementation of data management programs has empowered clients in all industries to accelerate their success. As ISACA/CMMI Institute’s Director of Data Management, she was managing author of the DMM and has led 35+ Assessments, resulting in customized roadmaps and rapid capability implementation. DataWise provides instructor-led courses leading to DCAM certification, as well as for data stewardship and the proven Assessment method that she developed. DataWise offers a suite of eLearning courses for organizations aimed at elevating the data culture and providing practical skills to a large number of staff. Broad stakeholder education is the key to data management excellence! Visit datawise-inc.com to learn more.
