Principles of Cloud Data Governance For Banks

Banks – and their data volumes – are at the epicenter of the world’s digital transformation. The pace of change mirrors the velocity, volume, and variety of data within the industry. It is where new products, new markets, and new touchpoints mean new – often cloud-based – ways to do business in financial services. 

Of course, in highly regulated industries, innovation can equal risk. That’s why continued progress depends on banking institutions establishing robust data governance principles. 

After all, wherever sensitive and personally identifiable information is stored, malicious actors will follow. Even without external threats, there are unintentional risks from internal employees. Phishing remains the most common attack vector in finance and a leading entry point for ransomware.

The Path to Successful Bank Transformation

Mitigating these threats without stifling innovation is an ever-present challenge for organizations. However, evidence clearly shows what happens when it’s done successfully. 

Industry incumbents can partner with fintechs to disrupt long-established banking models. Customer contact strategies are free to evolve from static statements to personalized interactions. Closed, proprietary models give way to open banking ecosystems. 

Naturally, delivering these on-demand services often means storing and accessing information held in the cloud. In this digital-first banking model, data often arrives unstructured, sometimes through insecure omnichannel interactions or unpatched devices. 

This means data volumes aren’t just getting bigger. They’re also getting wider, much like banks’ attack surfaces and the locations of their often-remote workforces. 

Borderless Business: Meeting Cloud Data Governance Challenges

The resulting pressure is magnified by the financial and reputational impact of data breaches. 

Flagstar experienced one of the biggest breaches of 2022, with 1.5 million consumers impacted. The US bank also paid an out-of-court settlement of $5.9 million. 

Also in 2022 came the conviction of the hacker behind the 2019 Capital One data breach, which resulted in a $190 million class action settlement.

Financial organizations already experience among the highest average costs from data breaches. At $5.97 million, only the healthcare industry is higher, at $10.10 million on average.

For international banks, increased risks come from the multiple regulations for moving and managing data across borders. For example, operating in the US requires a state-by-state approach to data governance. Five states (California, Colorado, Connecticut, Utah, Virginia) are set to enforce new statutes in 2023.

“Approximately 78% of financial institutions regard the rapid change in reporting requirements and ad-hoc, non-standardized reporting requests as significantly impacting compliance costs.”

The concept of data location is further complicated by partnerships with fintechs and other third parties. 

From data storage to API access, banks require full transparency and visibility from any cloud data governance strategy. They need this for their internal workflows, as well as to demonstrate compliance to regulators. And from a brand perspective, to reassure and satisfy their customers. 

Once these foundations are in place, organizations can focus on maximizing value with a cloud data governance framework. The process starts by analyzing where data access has been in the past, then continues with understanding where it is going in the future.

“Once a defined governance and access control structure is in place, banks can ‘democratize’ their data by allowing each business unit to access the data mesh and take greater ownership of the quality and value of the data relevant to their domain.”

Novelty to Necessity: How Data Needs Have Evolved

Traditional data access operated on a relatively open-to-all basis, at a time when concepts of data privacy and security tended to be limited to compliance and governance teams. 

After the 2007-08 financial crisis, regulatory reforms started having an impact. The international Basel Committee on Banking Supervision issued new guidance for banks around identifying and acting on risks. 

Enter the ‘Need to Know’ Approach

The reforms helped pave the way for a “need to know” approach, where data access became more controlled and permissions were granted based on roles or attributes. 
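To make the “need to know” model concrete, here is a minimal sketch of role- and attribute-based access checks. The roles, datasets, and policy rules below are hypothetical illustrations, not any bank’s actual schema:

```python
# Minimal sketch of "need to know" access control: everything is denied
# unless a policy explicitly grants it. Roles, datasets, and rules are
# illustrative assumptions.

from dataclasses import dataclass, field


@dataclass
class User:
    name: str
    role: str
    attributes: dict = field(default_factory=dict)


# Policies keyed by (role, dataset); each rule inspects user attributes.
POLICIES = {
    ("analyst", "transactions"): lambda u: u.attributes.get("region") == "US",
    ("compliance", "transactions"): lambda u: True,
}


def can_access(user: User, dataset: str) -> bool:
    """Default-deny: access requires a matching policy that evaluates True."""
    rule = POLICIES.get((user.role, dataset))
    return bool(rule and rule(user))


alice = User("alice", "analyst", {"region": "US"})
bob = User("bob", "analyst", {"region": "EU"})
print(can_access(alice, "transactions"))  # True
print(can_access(bob, "transactions"))    # False
```

The default-deny structure is the key point: the absence of a policy is itself a decision, which is what distinguishes “need to know” from the earlier open-to-all model.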

For now, some aspects of this remain manual, with IT teams playing a gatekeeping role and fielding requests from the business. It can also be relatively inefficient and time-consuming: significant internal resources are often required to keep track of requests, reconcile them, and generate reports for the relevant teams. 

As a result, bottlenecks are more likely to appear – and to impact the bottom line across different scenarios. A lack of discoverable data leads to less-informed decision-making, while a slow approval process means data is out of date by the time permissions are granted.

The ‘Need to Share’ Approach

That’s why it’s imperative that banks look beyond the “need to know” approach and start incorporating a “need to share” approach, where data is governed on the basis of sharing it with the right people at the right time. 

While this approach still fulfills regulatory requirements, it also offers competitive advantage by putting information at the fingertips of the people who need it. The focus is more on making sure insights are discoverable, rather than locked away in silos. 

Naturally, access is granted in seconds, helping organizations democratize data. Achieving this – especially at scale – relies on a mix of technology and human-in-the-loop expertise. 

How the Human-in-the-Loop Helps a Hybrid Cloud Data Management Strategy

This approach brings systems and humans together: a hybrid approach, reflecting the nature of many banks’ infrastructures. 

After all, banking remains one of the industries where legacy infrastructure still runs many core banking processes. However, there’s another form of legacy that’s an advantage to established banks. It’s a kind of legacy a bank builds over many years of managing and authenticating customer accounts:

  • Building trust and reputations by protecting people’s savings and investments
  • Generating loyalty by giving access to mortgages and financial advice

This legacy spans authentication, protection, and access. In other words, a readymade foundation for moving toward a “need to share” approach. Banks can get even further ahead by putting in place some key principles:

Principle 1: Implement a Data Security Operations (DataSecOps) Approach 

DataSecOps is the natural evolution for how organizations manage data. It mirrors the evolution of software development, where security (Sec) becomes part of the application development and operations (DevOps) cycle rather than an add-on. This reduces silos and security gaps, and opens the door to a more continuous and agile approach (DevSecOps).

In practice, DataSecOps means policy decisions are automated to ensure access is as close to real-time as possible. This differs from traditional approaches where potential access threats notify an admin who has to carry out manual remediation. 

DataSecOps is run from one single, integrated platform with a consistent and dynamic inventory. Data governance can therefore be run – and scaled – from a central location, reducing delays. Sensitive data can be masked at query run-time and aligned to relevant regulatory requirements. Self-service is at the heart of DataSecOps, enabling data democratization within the business.

Principle 2: Ensure Continuous Data Discovery, Classification, And Security

Banking’s fast-moving and highly regulated environment means access policies are complex by default. Policies also need to be dynamic and scalable to match growing data repositories alongside identifying the when, where, how, and what purpose. This relies on data catalogs, enriched with cleansed and unified metadata. It then becomes possible to maintain a standardized taxonomy for automated aggregation. 
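To illustrate how continuous classification against a catalog might work in practice, here is a minimal sketch using simple pattern-based detectors. The patterns, tags, and sampling approach are illustrative assumptions, not a production scanner:

```python
# Sketch of automated sensitive-data classification for a data catalog:
# sample values from a column are tested against known patterns, and
# matching tags feed a standardized taxonomy. Patterns and tag names
# are illustrative assumptions.

import re

CLASSIFIERS = {
    "US_SSN": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "EMAIL": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "CARD_NUMBER": re.compile(r"^\d{16}$"),
}


def classify(sample_values: list) -> set:
    """Tag a column when every non-empty sample matches a pattern."""
    tags = set()
    for name, pattern in CLASSIFIERS.items():
        samples = [v for v in sample_values if v]
        if samples and all(pattern.match(v) for v in samples):
            tags.add(name)
    return tags


print(classify(["123-45-6789", "987-65-4321"]))  # {'US_SSN'}
```

Running a scan like this continuously, rather than once at onboarding, is what keeps the catalog’s metadata aligned with growing repositories.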

Centralizing data access can remove much of the manual effort associated with data governance: one integrated dashboard provides an overview of activity, where self-service requests can be activated and monitored.  

The result: sensitive and restricted information is identified, while data discoverability is also supported. Other times, anonymization and obfuscation are the desired outcomes. That’s when it’s time to deploy techniques such as masking and partial masking, hashing, bucketing, and row-level filtering.  
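The techniques just named can each be sketched in a few lines. The column names, bucket widths, and region values below are illustrative assumptions:

```python
# Sketches of the named techniques: masking, partial masking, hashing,
# bucketing, and row-level filtering. Field names and parameters are
# illustrative assumptions.

import hashlib


def full_mask(value: str) -> str:
    """Masking: replace the entire value."""
    return "*" * len(value)


def partial_mask(value: str, keep_last: int = 4) -> str:
    """Partial masking: reveal only the last few characters."""
    return "*" * (len(value) - keep_last) + value[-keep_last:]


def hash_value(value: str) -> str:
    """Hashing: one-way transform that preserves joinability."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]


def bucket_age(age: int, width: int = 10) -> str:
    """Bucketing: replace a precise value with a range."""
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"


def row_filter(rows: list, allowed_regions: set) -> list:
    """Row-level filtering: return only rows the requester may see."""
    return [r for r in rows if r["region"] in allowed_regions]


rows = [
    {"name": "A", "region": "US", "age": 34, "card": "4111111111111111"},
    {"name": "B", "region": "EU", "age": 47, "card": "5500000000000004"},
]
for r in row_filter(rows, {"US"}):
    print(partial_mask(r["card"]), bucket_age(r["age"]))  # ************1111 30-39
```

The choice among these depends on the downstream use: hashing keeps values joinable across tables, bucketing keeps them useful for analytics, and filtering removes rows entirely.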

Principle 3: Develop a Data Governance Strategy for the Data-Driven Banking Era

For banks, being data-driven isn’t just about uncovering insights. It also means governing sensitive information at scale, in a way that secures data, protects privacy and corporate reputations, and keeps access available so that customers receive the levels of service they expect and deserve. 

It’s a form of exchange that benefits all parties – when managed correctly. The challenge is maintaining the right balance. Particularly, when new and complex regulations impact the way data should be accessed. Or when advances in technology recalibrate customer expectations, forcing banks to adjust their offerings. 

To stay ahead of the changes, banks must adopt increasingly agile approaches to data governance and access so that information can be shared, with control at a more granular level. This calls for dynamic cloud data governance tools, with self-service for managing data lifecycles. The increased freedom can be supported by data lineage that maps, tracks and audits permission statuses. Through the increased visibility, permissions can be revoked and granted based on use case and policy.
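The grant-and-revoke mechanics with an auditable trail, as described above, might be sketched as an append-only permission ledger. The event schema here is an illustrative assumption, not any specific platform’s implementation:

```python
# Sketch of a permission ledger: grants and revocations are applied to
# current state AND recorded in an append-only audit trail, supporting
# the lineage-style visibility described above. Event fields are
# illustrative assumptions.

from datetime import datetime, timezone


class PermissionLedger:
    def __init__(self):
        self.events = []     # append-only audit trail
        self.active = set()  # (user, dataset) pairs currently granted

    def _log(self, action: str, user: str, dataset: str, reason: str):
        self.events.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "user": user,
            "dataset": dataset,
            "reason": reason,  # ties each change to a use case or policy
        })

    def grant(self, user: str, dataset: str, reason: str):
        self.active.add((user, dataset))
        self._log("grant", user, dataset, reason)

    def revoke(self, user: str, dataset: str, reason: str):
        self.active.discard((user, dataset))
        self._log("revoke", user, dataset, reason)

    def has_access(self, user: str, dataset: str) -> bool:
        return (user, dataset) in self.active


ledger = PermissionLedger()
ledger.grant("alice", "loans", "use case: credit model refresh")
ledger.revoke("alice", "loans", "policy change: scope update")
print(ledger.has_access("alice", "loans"))  # False
print(len(ledger.events))                   # 2
```

Because the trail is append-only, auditors can replay it to answer who had access to what, when, and why, which is the granular visibility the text calls for.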

How to Add a Partner to The Principles 

Banks need to start implementing “need to share” access controls as soon as possible. The above principles are crucial to realizing the competitive advantage that’s currently available. By adding the Velotix platform to the process, banks also gain an AI-based policy engine. It learns from previous requests, becoming more accurate and providing more granular recommendations, even as regulations change or grow in complexity. 

Banks gain real-time access based on the “need to share”. Data flows are automated from request, while security and compliance are enforced. 

This content originally appeared on www.velotix.ai and has been republished with the author’s permission.

Hilee Avrahami

Hilee leads Velotix’s marketing initiatives and operations. She brings extensive experience in guiding B2B technology startups to market leadership positions. Hilee takes a holistic approach to the customer journey and user experience, ensuring that value is delivered at each touchpoint. Prior to joining Velotix, Hilee led marketing efforts at recognized organizations, including Trigo, Replicated, and KPMG. She has held several senior-level marketing positions at enterprise technology companies, including Clarizen (acquired by PlanView), AppDome, Zoomin Software, and Alcide (acquired by Rapid7). Earlier in her career, she was a management consultant at Accenture focused on strategy and operational excellence.