Trust. Trust is defined as the assured reliance on the character, ability, strength, or truth of someone or something (Webster’s Dictionary). It’s a term we use often to describe how we feel about the people, the institutions, and the things around us. But I would argue that the term “trust” was used differently years ago than it is used and perceived today.
Years ago, trust was binary. You either trusted someone or something, or you didn’t. Today, that simplicity is blurred. We have half-truths. We have conditional truths. Our trust in the news we read, the information we gather, and the products we use is laced with hope and skepticism. It’s a difficult world to navigate when it comes to trust.
Trust in Data in Our Digital World
One of the areas that has been impacted by this challenge is our data. Our trust in data was traditionally viewed through the lens of data quality. Data quality was either good or bad. Information was correct, or it was incorrect. It was easier then, perhaps, because data was simpler. The types of asset classes were (seemingly) finite. Information about these asset classes was relatively simple to measure. We had products, customers, transactions. But today, the types of data and the ways we collect it, measure it, and interpret it have dramatically increased. We live in a digital world now where everything we do and see is represented digitally. We have structured data and unstructured data. We have digitized our world such that our data today can describe symbols, images, opinions, sentiments, colors, and even feelings. And with this kaleidoscope of data comes a kaleidoscope of interpretations, shades of grey, and questions about the integrity of our information.
A number of years ago, I was chatting with a very senior risk officer at a major financial institution. She explained to me how she had been given the responsibility of gathering information from a number of the firm’s operations and submitting that data to a rigorous risk review. One day, she was called into the office of her boss — the Chief Risk Officer (CRO) — who asked about the data that she was using to calculate the risk positions of the firm. She told me that, when asked if she trusted the data she was using to calculate the firm’s risk, she answered, “Sort of.” With downtrodden eyes, she recalled, “That was not a good answer to give to your CRO.”
Data for Advanced Analytics
Today, trust in the data we use is taking on even greater consequence because of the advances in machine learning, artificial intelligence, and generative AI. While increasingly powerful, these technologies are also incredibly sensitive to anomalies in our data. Unintentional omissions or hints of bias in the data that feed our analytic engines can have an exaggerated impact on the outcome of these analyses. But the real challenge we face is our somewhat “blind” faith in the outcome of the analytics. Because these technologies are so powerful and can produce fantastic results, and because they are becoming so ubiquitous in everything we do, we tend to simply “trust” the outcomes just because they came from an AI. We forget that the very nature of machine learning is that it learns. Although the initial data sets may be “clean,” new data is constantly flowing into our AI engines, and if we are not careful, questionable data can skew future outcomes. At a recent EDM Council DataVision event, one of the panelists described this state of affairs by saying, “Bad data with good analytics is worse than no analytics at all.” If we cannot assure trust in the data feeding these powerful analytics, we risk, at a minimum, having to apologize, or at worst, making very bad decisions that impact our businesses, our environment, and people’s lives.
So, we are on the precipice of the perfect storm. We have an ever-increasing set of new asset classes, coming from new sources and channels, coupled with incredibly powerful technology. Putting this together, we need trust in the entire supply chain — from source to capture to storage to application. Although the challenge is multi-faceted, one way we can help improve this condition is through advanced and disciplined data management.
Trusted Data Foundations to Manage Risk
In January 2013, the Basel Committee on Banking Supervision (BCBS), on the heels of the 2008 Financial Crisis, issued the “Principles for effective risk data aggregation and risk reporting.” Referred to as BCBS239, the document (in summary) stated that in order to effectively manage financial risk, in addition to ensuring that your models are correct, you need accurate, timely, and trusted data. Ten years later, the essence of this paper still rings true. Trusted data is foundational. But as we have learned over the years, it is more than just data quality. Trusted data is the result of a disciplined “manufacturing process” that includes a series of capabilities from strategy and planning to well-defined business requirements, to advanced data architecture, and an adherence to a data governance framework that enables business through controlled access to quality data. Without this foundation, all AI/ML and analytics will be compromised.
The EDM Council, in collaboration with industry experts, has been advocating this approach via DCAM® — the Data Management Capability Assessment Model — an approach that has been utilized across public and private industries. The executive who has been charged with driving this discipline is the chief data officer (CDO). At its inception, the CDO role centered on operational efficiency (fixing errors, correcting trade fails, cleaning customer data). Today, in addition to these responsibilities, the role of the CDO has dramatically expanded to include the critical capabilities needed to enable the successful use of a firm’s information assets. CDOs focus on cloud adoption, data democratization, semantic modeling, data sharing, and the responsible and ethical use of data, all of which are precursors for leveraging responsible and trusted AI/ML and advanced analytics.
Trust is hard to achieve. It takes many factors, and many stakeholders, to get there. When it comes to trust in our data and in our analytics, don’t be so blinded by the desire to chase the shiny ball of AI/ML that you forget the basics. To realize the benefits of trusted analytics, you first need to achieve trust in your data through the foundational disciplines of data management.
Trust me …
This quarter’s column contributed by:
John Bottega, President of the EDM Council
John Bottega is a senior strategy and data management executive with more than 40 years of experience in the industry. John began working with the EDM Council as an industry contributor in 2005 and served as chairman from 2007 to 2014. He joined the Council’s executive team as a senior advisor in 2014. In 2017, he took over as the senior executive and today holds the title of president of the EDM Council. Over his career, John has served as chief data officer in both the private and public sectors, serving as CDO for Citi, Bank of America, and the Federal Reserve Bank of New York. He also served as head of data management for the Office of Financial Research at the U.S. Department of the Treasury.