In this column, we explore the basic foundations of an integrated governance framework, defined as the correct blending of governance metadata management, technology, and general controls.
AI and machine learning hold immense potential. Far beyond buzzwords, these technologies promise to transform profitability, government effectiveness, and even quality of life. Even authors who previously focused on the fundamentals of good governance have jumped aboard the AI bandwagon.
However, we must keep firmly in mind that the power of AI and machine learning is only as good as the data beneath the technology. The simple reality is that the quality of the AI or machine learning outcome is directly related to the quality of the data fed into the algorithms. With so much focus on technology, will data quality play second fiddle? And what types of controls are being implemented to check the reasonableness of predictive outcomes?
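One possible reasonableness control of the kind the question above points to, sketched under illustrative assumptions (the function name, history values, and the three-standard-deviation band are ours, not any company's actual control): flag a forecast that falls far outside the range implied by historical values.

```python
import statistics

def reasonableness_check(history: list, forecast: float, k: float = 3.0) -> str:
    """Flag a forecast more than k standard deviations from the historical mean."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return "review" if abs(forecast - mean) > k * sd else "pass"

# Historical values cluster near 100; a forecast of 250 is implausible.
history = [100, 104, 98, 102, 101, 99]
print(reasonableness_check(history, 250.0))  # -> "review"
print(reasonableness_check(history, 103.0))  # -> "pass"
```

A control this simple will not catch every bad prediction, but routing "review" outcomes to a human is exactly the kind of inexpensive check the excitement around predictive graphics tends to crowd out.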
We recently worked with a company that implemented AI technology to create a predictive sales strategy based on historical data. These forecasts combined actual sales with third-party forecasts of a series of economic factors. The basic idea was straightforward: what future sales and product roadmap strategies follow from inputs such as past product sales and customer data, combined with forecasted interest rates and buyer preferences? Data was fed into the models. Highly polished visuals were distributed across the organization, including to the board. The future strategy seemed secure.
Unfortunately, an inconvenient truth was overlooked. Within this company was a small group of data and risk analysts expressing concerns about the quality of the data used in the modeling. Missing were basic controls for reconciling financial books, customer records, and product balances. Spreadsheets, or "magic sheets," showing the problem could not compete for attention against the excitement of the predictive graphics. The underlying data problem finally came to light through an internal audit conducted as part of a third-line-of-defense process. Manual reconciliations showed that the future predictions were off by double-digit percentages.
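The missing control here is among the cheapest to automate. A minimal sketch, with entirely hypothetical names, balances, and tolerance (the company's actual reconciliation requirements would be more involved): compare two balances that should agree before they are allowed to feed a model, and flag a break when they diverge beyond a tolerance.

```python
def reconcile(financial_total: float, customer_total: float,
              tolerance_pct: float = 0.5) -> dict:
    """Flag a break when two balances diverge beyond a percentage tolerance."""
    diff = abs(financial_total - customer_total)
    base = max(abs(financial_total), 1e-9)  # guard against division by zero
    pct = diff / base * 100
    return {
        "difference": diff,
        "percent": round(pct, 2),
        "status": "break" if pct > tolerance_pct else "reconciled",
    }

# A double-digit gap, like the one the audit uncovered, surfaces immediately:
print(reconcile(1_000_000.0, 870_000.0))  # 13% gap -> status "break"
print(reconcile(1_000_000.0, 999_000.0))  # 0.1% gap -> status "reconciled"
```

Had a check like this gated the model inputs, the "magic sheets" would not have needed to compete for attention; the pipeline itself would have refused the data.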
Triage work was then done in a panic to identify the source, use, control requirements, and movement of the data. As at so many companies, the data warehouse implementation was documented only in outdated spreadsheets, and the implementation consultants and their underlying knowledge were long gone. A task force determined the current state of the data and what needed to be done to improve data accuracy and, therefore, the predictive results. One root cause of the issue was a fragmented view of the underlying data movement and controls. The entire problem could have been avoided by approaching the system implementation through the lens of governance metadata. That approach requires a change in mindset. One analyst complained that there was never time to capture requirements as metadata because the team was too busy re-mapping spreadsheets that were constantly out of date. The age-old lament remains: "We don't have time to do it right, but we find time to do it over."
So, what is an integrated governance framework? To answer this question, we start with a basic understanding of the relationship between a business unit, its business processes, and the data created or used by those processes. This relationship is like a giant puzzle connecting the business process maps so often drawn by efficiency experts attempting to optimize resources or increase profits. A governance architect can look at a single process's data and process components and redefine them as business and governance metadata. Analyzing one business process is straightforward, but a top-down view of an organization's entire value stream reveals a dynamic that is constantly in motion. An integrated governance framework provides a common vocabulary to define this complexity as a metadata puzzle. Given the right technology solution, the puzzle is completed under a comprehensive governance methodology that provides critical control of, and awareness of, the source and use of data across an organization.
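The unit-process-data relationship described above can be sketched as a small metadata catalog. This is a minimal illustration, not a reference implementation; the class names, fields, and example processes are our assumptions. The point is that once each process records which data elements it consumes and produces, tracing the source and use of any element becomes a simple query.

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    name: str
    business_unit: str
    inputs: list = field(default_factory=list)   # data elements consumed
    outputs: list = field(default_factory=list)  # data elements produced

def trace_element(processes: list, element: str) -> dict:
    """Return where a data element is created and where it is used."""
    return {
        "created_by": [p.name for p in processes if element in p.outputs],
        "used_by": [p.name for p in processes if element in p.inputs],
    }

# Two hypothetical pieces of the puzzle:
catalog = [
    Process("Order Entry", "Sales", inputs=["customer_id"], outputs=["order_total"]),
    Process("Forecasting", "Finance", inputs=["order_total"], outputs=["sales_forecast"]),
]
print(trace_element(catalog, "order_total"))
# {'created_by': ['Order Entry'], 'used_by': ['Forecasting']}
```

Captured this way, the metadata answers exactly the questions the panicked triage team had to reconstruct by hand: where does this data come from, and who depends on it?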
A recent governance framework implementation provides a refreshing contrast to the predictive triage outlined above. This company undertook a major system conversion leveraging the governance awareness that comes from viewing the organization through the lens of an integrated puzzle. The source and use of data are known and captured as metadata. Critical data elements provide evidence of data accuracy and are integrated into automated controls. Business users receive email notifications with classic red-yellow-green labeling of these specific control attributes, targeted on a need-to-know basis via the business process. This company is poised to further leverage governance metadata in the control of unstructured data and in awareness of the risks inherent in confidential and critical data. They have the mindset and technology in place to accomplish this goal.
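The red-yellow-green notifications described above can be sketched in a few lines. Everything here is an illustrative assumption rather than the company's actual mechanism: the accuracy thresholds, the control records, and the owner routing table are invented for the example. The shape of the idea is what matters: map a control metric to a status, then route the result only to the owners of the affected business process.

```python
# Thresholds checked in order: >= 99.5% is green, >= 97% is yellow, else red.
RAG_THRESHOLDS = [(99.5, "green"), (97.0, "yellow"), (0.0, "red")]

def rag_status(accuracy_pct: float) -> str:
    for threshold, label in RAG_THRESHOLDS:
        if accuracy_pct >= threshold:
            return label
    return "red"

def build_notifications(controls: list, owners: dict) -> list:
    """One targeted message per control, routed on a need-to-know basis."""
    return [
        {"to": owners.get(c["process"], []),
         "subject": f"{c['name']}: {rag_status(c['accuracy'])}"}
        for c in controls
    ]

controls = [{"name": "CustomerRecon", "process": "Billing", "accuracy": 96.2}]
owners = {"Billing": ["billing-leads@example.com"]}
print(build_notifications(controls, owners))
# [{'to': ['billing-leads@example.com'], 'subject': 'CustomerRecon: red'}]
```

Routing each status only to the process owners, rather than broadcasting it, is what keeps the signal from becoming noise.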
In governance controls, less is more; the goal is to avoid flooding people with noise. Understanding the critical data elements, determining a balanced set of controls, and distributing them only where needed is a hallmark of lean governance. Drawing on the power of lean thinking, we equate data to inventory and apply lean manufacturing principles to an organization's data and information factory.
Fortunately, technological advancements in governance metadata management will enable companies to leverage the full power of AI while automated controls simultaneously monitor the overall quality of the underlying data. In past columns, we have written about essential governance metadata management and the need to maintain awareness. This is a call to action: stay informed and engaged in the evolving landscape of governance.