There has been increasing discussion around the standards used for managing and governing master data. This is driven by the evolving maturity of data practices, but also by the demands that big data, analytics, and the shifting business models those two activities engender place on data practitioners. From a best-practices perspective, the ISO 8000 standards seek to address business challenges related to exchanging accurate information between business partners, specifically within the context of an organization’s supply chain.
While these standards have been in the works for some time, the more substantive portions were published relatively recently, in the 2017–2018 time frame. As a result, we are seeing increasing dialogue and interest from companies seeking to mature their master data management (MDM) capabilities. As is often the case with standards, two questions arise: what do they do for me, and is adoption worth the work?
The Path Forward: To Adopt or Not, and If So, When?
As with any set of standards, it is important to understand the value they bring to your organization. Most organizations with complex supply chain and master data challenges currently fix data manually, an approach that will not scale into the world of big data, analytics, and the increasing demand to share data. These standards systemically enable capabilities that deliver significant benefits and ultimately reduce the overhead associated with the master data management function. They address capabilities that:
- Classify, label, and organize data to ensure its full meaning is captured;
- Publish data in a form that supports automation and the application of machine learning techniques;
- Score data with respect to quality (completeness); and
- Structure data into a portable format to enable sharing.
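To make the third capability concrete, here is a minimal sketch of scoring a master data record for completeness. The field names, the required/optional split, and the weighting are hypothetical illustrations, not rules taken from the ISO 8000 specifications themselves:

```python
# Minimal sketch: score a master data record for completeness.
# Field names and the required/optional split are hypothetical examples,
# not taken from the ISO 8000 specifications.

REQUIRED_FIELDS = ["supplier_id", "legal_name", "country_code"]
OPTIONAL_FIELDS = ["duns_number", "tax_id", "contact_email"]

def completeness_score(record: dict) -> float:
    """Return a 0.0-1.0 completeness score, weighting required fields double."""
    def filled(field):
        value = record.get(field)
        return value is not None and str(value).strip() != ""

    required_hits = sum(2 for f in REQUIRED_FIELDS if filled(f))
    optional_hits = sum(1 for f in OPTIONAL_FIELDS if filled(f))
    max_score = 2 * len(REQUIRED_FIELDS) + len(OPTIONAL_FIELDS)
    return (required_hits + optional_hits) / max_score

record = {"supplier_id": "S-1001", "legal_name": "Acme GmbH",
          "country_code": "DE", "contact_email": ""}
print(round(completeness_score(record), 2))  # all required fields filled, no optional -> 0.67
```

In practice a scoring function like this would be applied at the point of data entry or ingestion, so quality issues are caught before records propagate through the ecosystem.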
Having said that, adoption comes at a cost, and these are detailed standards that, for some companies, may require reworking the entire operational approach to master data management and governance. The table below presents some of the key drivers and the implementation considerations, both pros and cons.
| Adoption Driver | Considerations |
| --- | --- |
| Business Value | Master data is, by definition, the high-value data within the organization. What happens when it is not right? What is the cost of being wrong? If master data were correct, what impact would that have on revenue? Generally, "value" falls into three categories that drive the business case. **Operational Efficiency:** Bad data can have a huge top-line and bottom-line impact. Inventory shortages will impact revenues and, over time, erode competitive position, while production issues will erode margins as manual remediation rework drives up costs. **Reporting & Compliance:** Regulation in one form or another is squarely focused on master data entities. The protection of personal information, for example, is considerably simplified with a mature customer master data capability. What is the business value of effective MDM here? It is generally the avoidance of cost: for example, the maximum fine under the GDPR's rules on protecting personal information is 4% of worldwide revenue. **Analytics & Insights:** MDM forces a discipline on how organizations define terms, resolve issues, and make data more systemically usable. These all have a bearing on how analysts collect, organize, and apply machine learning techniques and models to the data. Analytical capabilities are significantly enhanced, as metadata related to classification, relationship links, semantic tagging, temporal and geo tags, and pattern analysis creates a rich set of features against which analytical models can be applied. |
| Digital Transformation | I bring this up only because it drives many data discussions at the executive level. The impact of transformation is that current business models are changed, or new ones are created. Many see this as the holy grail of data and analytics. For a more complete discussion of business impacts, see *Leading Digital: Turning Technology into Business Transformation*. |
| Cost Reduction (Data Operations, Data Quality) | The driver here is automation and ensuring that data quality issues are addressed before the data enters the organization’s ecosystem. This is critical for organizations venturing into the realm of big data, as current processes are simply too cumbersome to implement in real time and cannot scale to address big data challenges. |
| Complexity / Simplification | This driver can go both ways. These standards are often considered overly complex; however, implementing them will simplify the MDM process. The question is at what level you implement them. Organizations will need to evaluate the nature of their master data challenge to understand whether the additional data complexity outweighs the business benefit. |
| Industry & Partner Alignment | These standards address how information on complex topics can be shared. In certain industries they have already been adopted or are being adopted; healthcare / pharmaceutical research and the financial industries are leaders. Often adoption occurs in phases. The concept of a metadata registry (ISO 11179), for example, is well established and in use across many industries. The ISO 8000 specifications, however, are newer and therefore less widely implemented. |
My recommendation is to understand the big concepts associated with the standards, and work those capabilities into the data strategy and roadmap. For example, you may not be able to make data “portable,” but you may be able to add the detail into the dictionary to support the process. Using this approach, one can incrementally build towards a more mature set of capabilities that simplify and streamline the MDM process across the entire supply chain.
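The "add the detail into the dictionary" step above can be sketched as a simple structure: each attribute carries a stable identifier, a definition, and a format, so records that reference the dictionary remain interpretable when shared. The identifiers and field names below are hypothetical illustrations of the metadata-registry idea (ISO 11179), not an implementation of the standard:

```python
# Minimal sketch of a data-dictionary entry that supports "portable" records.
# All identifiers and field names are hypothetical illustrations of the
# metadata-registry concept (ISO 11179); not an implementation of the standard.
from dataclasses import dataclass

@dataclass(frozen=True)
class DictionaryEntry:
    concept_id: str   # stable identifier shared with trading partners
    name: str         # human-readable attribute name
    definition: str   # unambiguous statement of what the attribute means
    data_type: str    # expected format, e.g. "decimal", "date:ISO8601"
    unit: str = ""    # unit of measure, if applicable

# The dictionary maps stable identifiers to full attribute definitions.
DICTIONARY = {
    "mdm:net_weight:v1": DictionaryEntry(
        concept_id="mdm:net_weight:v1",
        name="net_weight",
        definition="Weight of the item excluding all packaging.",
        data_type="decimal",
        unit="kg",
    ),
}

# A portable record carries dictionary identifiers rather than ad hoc labels,
# so a receiving system can resolve the full meaning of every value.
portable_record = {"mdm:net_weight:v1": "12.5"}
for concept_id, value in portable_record.items():
    entry = DICTIONARY[concept_id]
    print(f"{entry.name} = {value} {entry.unit} ({entry.definition})")
```

Even before records themselves are made portable, maintaining entries like these gives the organization the dictionary detail needed to support that process later.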