In our last column, Comparative Data Management, Part 2, we began our exploration of the state of data management using a benchmark chart based on the scores of 12 organizations in a comprehensive evaluation of their data management capabilities according to the Data Management Maturity (DMM)℠ Model. We set context by reviewing the scoring method and the concept of capability “Levels” inherent in the model, and we examined three disciplines that have been somewhat neglected, generalizing from these organizations and many other sources of feedback within our industry: Data Management Strategy, Data Lifecycle Management, and Measurement and Analysis (i.e., metrics).
In Part 3, we’re going to tour two of the disciplines with the highest average scores and provide examples of factors that reflect robust capabilities, which propel the organization towards optimizing its data assets and gaining the most value from them. But first, let’s orient ourselves with a couple of observations about the extremes of ‘data consciousness’ that organizations can exhibit.
The picture to the left represents the Buddhist view of the journey of humanity from Samsara (illusion) to Nirvana (enlightenment, or ‘emptiness’). We often hear the phrase “data Nirvana” in presentations and webinars, used as a shorthand expression for a (much-to-be-desired) future state, in which everyone in an organization can access high quality data in a timely fashion and without excessive reconciliation, to make sound business decisions, share data easily, predict trends, and minimize risks (financial, reputational or regulatory).
In the lowest section, you can see a dark figure running in fear before a charging elephant – in the realm of data, it represents a chaotic approach to handling the data assets. For example, each project doing its own thing without standards or governance, collectively creating complex spider webs of varying names, definitions, and values for the same data elements. Or, waiting until the last minute to realize “Hey, I WILL need customer reference data, but there are 13 possible sources, where do I get it?” (and then, frequently, designing the interface to grab it from the easiest place to access). Taken to the extreme, no standards, no governance, and no strategic plan for the data assets.
Skipping several turns of the depicted path (after much more drama takes place) there is the fully opened lotus flower and seated Buddha at the top left, the gateway to the two god-like figures riding on the now tame and purified elephants. In our metaphor, that represents the ‘awakened data culture,’ where everyone understands the importance of their actions as they create and manage information, everyone works together to ‘make Nirvana real,’ and sound practices are internalized and followed, becoming second nature.
In the DMM, the desired state is evidenced by groups of best practices by topic (Process Areas) which are scored at Level 3 and above, such that policies, processes, standards, and guidelines have been established, are understood, and are followed by all relevant stakeholders across the organization. This language may seem somewhat academic, but in fact, evaluating activities, work products, and the extent of their implementation is a fast and efficient method to discover the degree of overall evolution of the organization’s data culture. It provides substantive information, determined by consensus, to construct a comprehensive and unbiased estimate of data management accomplishments. In sum, it provides a definitive bottom-up answer to the overall question “Are we doing what it takes to get the most out of our data, and if not, what is lacking?”
When an organization begins its journey with an examination of where it is today, that marks the end of fear-driven action – in our metaphor, no more running from the enraged elephant, the tangled and complex data layer – and the choice to act based on thoughtful introspection and planning. This seems patently obvious, but in the past, there has been such a strong tendency to view data expediently, as ‘stuff running through the pipes’ of systems and technology, that most organizations have structured themselves around that premise, which has inevitably led to greater and greater complexity and fragmentation of the data assets.
So the overarching goal for the data management professional, business leaders, and executives is to significantly change the perspective of everyone materially involved in creating and managing data – the path we are all traveling, and no mean feat. However, the old chestnut “How do you eat an elephant? One bite at a time” applies, as does “wherever you are, there you are.” An organization cannot radically alter itself overnight without risking staff continuity and impeding progress towards vital business objectives. So how can the organization push the stalled train up the hill? By many pairs of hands pushing together.
As organizations have become more convinced that data is their lifeblood, a critical component of infrastructure, more are implementing comprehensive data governance, which combines an emphasis on increased data knowledge and controls, collective decision-making about shared data assets, and team-based pursuit of enterprise objectives.
That leads us into our first higher-performing Process Area (average score 3 or greater), Governance Management, according to the benchmark chart below. (For an explanation of this chart, see Part 2).
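To make the benchmark mechanics concrete, here is a minimal sketch of how a “higher-performing Process Area (average score 3 or greater)” could be identified; the scores below are invented for illustration and are not the actual benchmark data.

```python
# Hypothetical illustration only: average 12 organizations' scores per
# Process Area and flag the areas at Level 3 or above. These numbers are
# made up; they are not the benchmark results discussed in this series.
scores = {
    "Governance Management":    [3, 4, 3, 3, 2, 4, 3, 3, 4, 3, 2, 3],
    "Data Management Strategy": [1, 2, 2, 1, 3, 2, 1, 2, 2, 3, 1, 2],
}

def average(values):
    """Arithmetic mean of a list of capability scores."""
    return sum(values) / len(values)

# Keep only the Process Areas whose benchmark average reaches Level 3.
higher_performing = {
    area: round(average(vals), 2)
    for area, vals in scores.items()
    if average(vals) >= 3.0
}
```

With the sample numbers above, only Governance Management clears the Level 3 bar, mirroring the pattern the benchmark chart describes.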
Governance Management in the DMM addresses activities and work products related to governance itself – what collective decision-making activities high-performing organizations perform consistently (processes), and what work products (artifacts) they typically produce to support these activities (e.g., RACI charts). Specific decisions arrived at through governance activities are addressed in the disciplines where they occur; for example, the process through which authoritative data sources are determined is called out in Data Lifecycle Management, and the process through which quality targets and thresholds are set is called out in Data Quality Assessment.
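A RACI chart is one such work product. As a hypothetical sketch (the roles and activities below are illustrative, not drawn from the DMM itself), it can be represented as a simple lookup structure with a check that exactly one party is Accountable per activity:

```python
# Hypothetical RACI chart for two common governance activities.
# R = Responsible, A = Accountable, C = Consulted, I = Informed.
RACI_CHART = {
    "Approve business glossary term": {
        "Data Steward": "R",              # drafts and proposes the definition
        "Data Governance Council": "A",   # final sign-off
        "Line of Business SME": "C",      # provides domain input
        "IT Data Architect": "I",         # notified of the decision
    },
    "Set data quality thresholds": {
        "Data Quality Analyst": "R",
        "Data Governance Council": "A",
        "Data Steward": "C",
        "Application Owner": "I",
    },
}

def accountable_party(activity: str) -> str:
    """Return the single role marked Accountable for an activity."""
    roles = RACI_CHART[activity]
    accountable = [role for role, code in roles.items() if code == "A"]
    if len(accountable) != 1:
        raise ValueError(f"Exactly one Accountable role expected for {activity!r}")
    return accountable[0]
```

The single-Accountable check reflects a common RACI convention: many hands may do the work, but one party owns the decision.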
Most organizations have, by now, devoted significant energy and attention to developing effective structures to facilitate decisions about shared data and improve data controls. Successful implementations of governance are characterized by an educational orientation, active executive engagement, and a shared mission and vision.
Business representatives engaged in governance – data stewards – benefit from training in their role, which may include: the types of decisions for which their input is desired; the expectation that they will engage others in their line of business for consensus; the periodic meetings they should attend; the process to raise data issues; the template to capture their business line’s definition of a business term; the process for proposing quality rules, etc.
Executive engagement in high-performing organizations is visible, active, and consistent. A high level of executive attention is necessary to overcome the natural tendency to focus narrowly within one’s own business area. Individuals must be convinced that their efforts on behalf of the organization as a whole are both needed and appreciated. Otherwise, they will tend to stay in their business lanes, for example: “I’m from Sales and we define Product this way and always will; why should I spend my time wrangling over Product Type Codes with Marketing or Research for the enterprise business glossary?” With executive engagement and recognition, a bit of altruism becomes “cool.”
Organizations with high participation in governance succeed in showing business representatives that their participation is essential for success. This is best accomplished by engaging governance participants in key achievements and in initiatives with a clear beginning, middle, and end.
For example, one financial organization launched formal data governance with a critical initial task, determining the data set, business glossary, metadata, and quality rules for a securities master data store. This was the first segment of a complete transformation of the legacy enterprise data warehouse, and a number of business areas were highly dependent on its implementation. Success was achieved – approved sources, standardized data, technology refreshed, quality controls implemented. In this first big governance-centric project, participants were proud of their contribution and became confident, experienced, and ready to tackle the next subject area. This example shows how focusing the lens of collective action through empowering each individual creates a win for the organization and a win for the team.
Once governance groups have experienced some tangible successes – “We did that!” – they have more energy for additional team projects and more tolerance for the ongoing, sustaining functions of governance. This organization, for instance, was able to expand its governance program to set up a business manager forum to raise and resolve issues about shared data. Finally, developing a strategy and sequence plan for data management improvements, typically linked to either a major architectural transformation, regulatory compliance, or other strong business driver, will forge a consensus vision, increase data awareness, and help to get all hands on deck. To paraphrase a classic commercial, “If you’ve got governance, you’ve got just about everything.”
In the DMM, Provider Management addresses the processes and work products that enable an organization to manage its data providers: to optimize internal and external data sourcing to meet business requirements; to apply quality considerations; and to manage data provisioning agreements consistently. As a part of Data Lifecycle Management, authoritative sources are identified and approved; in Provider Management, the focus is on how organizations acquire data, agreements between suppliers and consumers, and applying quality rules to data sources.
It’s easy to find examples of problems experienced by many organizations:
- Buying multiple data feeds for the same data
- Failing to monitor provisioning issues
- Lack of quality standards in the contract
- ‘Poison pills’ in the SLA or contract. Here’s an actual example: a major data provider included within its contract the provision that if the contract were terminated, ALL data previously received would have to be deleted from the organization’s systems (!)
- Internal resistance to service level agreements (SLAs) which protect the consuming applications from interruptions
In this ‘Good Housekeeping’ aspect of data management, an ounce of prevention is worth a pound of cure. Organizations that do a good job of managing data providers realize many benefits:
- Assurance that the right data is delivered to consumers
- Improved data quality for delivered data
- Reduced lag time in defining new requirements
- A simplified data source selection process
- Reduced risk, rework, and excess costs.
It seems that many organizations have become aware of the consequences of neglecting good practices to specify, document, and control the data that is provided and consumed. Many are determined to do a better job and take the time to tackle the problems. Let’s look at the example of one large organization, which realized that, while it was purchasing millions of dollars’ worth of external feeds annually, there were many issues: procurement was taking a ‘one-off’ approach to each acquisition; legal did not have a standard contract template; there was often no clear business owner; SLAs were weak or non-existent; and data quality criteria were not specified. In short: no plan, no clear process, no business engagement, and minimal management.
The organization determined that it would address these issues. It engaged key business consumers and their data stewards to conduct an analysis of the more expensive data feeds to determine:
- The primary consuming organization (the business entity that originated the purchase request)
- Secondary consumers, i.e., anyone else who would be impacted
- If there were contract terms specifying issues escalation, penalty clauses, etc.
- If the SLAs specified timeliness, completeness, and conformity for the data provided
- If there was systematic communication with vendors to ensure that any issues were addressed, and that all information on changes or additions to the feeds was received.
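The checklist above lends itself to a simple per-feed record. As a minimal sketch (the field names and gap messages are hypothetical, not taken from the organization’s actual inventory), it could look like this:

```python
from dataclasses import dataclass, field

# Hypothetical record capturing the assessment checklist for one purchased
# data feed; field names are illustrative only.
@dataclass
class DataFeedAssessment:
    feed_name: str
    primary_consumer: str                   # business entity that originated the purchase
    secondary_consumers: list = field(default_factory=list)
    has_escalation_terms: bool = False      # contract specifies issue escalation / penalties
    sla_covers_quality: bool = False        # SLA specifies timeliness, completeness, conformity
    vendor_communication: bool = False      # systematic channel for issues and change notices

    def gaps(self) -> list:
        """Return the checklist items still unaddressed for this feed."""
        issues = []
        if not self.has_escalation_terms:
            issues.append("no escalation or penalty clauses in contract")
        if not self.sla_covers_quality:
            issues.append("SLA missing timeliness/completeness/conformity targets")
        if not self.vendor_communication:
            issues.append("no systematic vendor communication")
        return issues
```

Running `gaps()` across all assessed feeds would surface exactly the kind of weaknesses the analysis was designed to expose.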
As a result of this effort, the organization made a number of practical and significant improvements. A business owner was assigned to each data feed. Each feed was uniquely identified and captured in the metadata repository, including a description of the content, the point of contact for access, and characteristics (e.g., a weekly digest of short sales). Contract terms were standardized for all new agreements. At least one annual meeting with the vendor was required. And the organization engaged consumers to agree on a minimum set of data quality rules that the vendor must agree to satisfy. During this exercise, it was determined that some original customers no longer had a need for some of the data, so the contracts were not renewed, saving costs.
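A “minimum set of data quality rules” of the kind the consumers agreed on might be expressed as executable checks. Here is a minimal sketch, assuming hypothetical rules (required fields, a weekly freshness window) for a feed like the weekly digest of short sales mentioned above:

```python
# Sketch of a minimal quality-rule check applied to an incoming vendor feed.
# The rules below (required fields, freshness window) are hypothetical
# examples of criteria consumers and a vendor might agree on.
from datetime import date, timedelta

REQUIRED_FIELDS = ["security_id", "trade_date", "short_interest"]
MAX_AGE_DAYS = 7  # a weekly digest, so records should be at most a week old

def check_record(record: dict, today: date) -> list:
    """Return a list of rule violations for one feed record."""
    violations = []
    for f in REQUIRED_FIELDS:                # completeness rule
        if record.get(f) in (None, ""):
            violations.append(f"missing field: {f}")
    trade_date = record.get("trade_date")
    if isinstance(trade_date, date) and today - trade_date > timedelta(days=MAX_AGE_DAYS):
        violations.append("stale record: older than weekly window")  # timeliness rule
    return violations
```

Codifying the agreed rules this way makes it straightforward to verify each delivery before it reaches consuming applications, and to report violations back to the vendor in the annual (or more frequent) review meetings.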
This example demonstrates how a collaborative approach in a high-performing organization can lead to significant improvements. Note that it involved a number of players – legal, procurement, line of business experts, data stewards – and yet was a well-organized and executed effort yielding both tactical and strategic benefits.
In Part 4 of this series, we will focus on one broad topic area, Data Quality, and share observations and examples from both lower and higher-performing organizations, with practical suggestions for how to accelerate a data quality program.