Considerations For Managing Risk In A Post-SOX Environment

Published in TDAN.com April 2007


INTRODUCTION

For many years, corporate leaders (CEOs, CFOs) and information technology (IT) executives (CIOs) have understood the rewarding possibilities of business intelligence gleaned from data warehouse stores of transaction records. Now, however, there is a countervailing requirement to protect data from unintended users and uses. Put another way, revenue is earned from better data and saved through faster data production. Yet in an era when the survival of companies and organizations is threatened by data at risk, revenue must also be protected with safer data.

A major emphasis is therefore being placed on compliance with government regulations, industry standards, and corporate policies that are designed to identify and protect data at risk of misuse
through disclosure, outsourcing, hacking, theft, and otherwise improper processing or handling.

From an auditing and risk control perspective, executives are scrambling to understand compliance and fund the implementation of controls that will solve problems with data before they surface.
From a data perspective, business and IT stakeholders are also discovering that data quality is questionable, and that they often cannot trace data from point A to point B without disconnects; i.e., from where data enters the system to where it ends up in reports. Understanding and verifying these data flows is key to complying with Sarbanes-Oxley (SOX) regulations.

In this compliance era, the common language is one of risk management. IT departments are now learning that language in many of the same ways business people always have, including the hard way. This paper lays out that landscape and suggests ways to help achieve (and verify) compliance while still producing rapid, actionable business intelligence.


THE COMPLIANCE LANDSCAPE

Following are some of the key considerations and recommendations surrounding risk mitigation in the post-SOX environment:

1) Data=Risk

Data are always at risk: at risk of hacking, theft, incorrect transmission or modification, being pulled from the wrong sets, and so on. If sensitive data are disclosed, the Privacy Rights Clearinghouse records and posts the incident so that the world will forever know which companies allowed their customers’ personal information to be compromised. In addition to the embarrassing secrets revealed, lawsuit damages, government fines, and/or data remediation costs can cripple the entire company.

Consider that H&R Block published a 47-digit customer tracking ID that contained social security numbers, and then spent $140 on each of the millions of records that had to be corrected. [1]

Other recent examples include [2]:

  • February 2005: On-line identity thieves established bogus database access accounts at ChoicePoint. The records of 163,000 people were affected. ChoicePoint subsequently settled with the Federal Trade Commission for $10 million in penalties and $5 million for consumer redress.
  • September 2006: 8,800 drivers identified through the City of Savannah’s red-light camera enforcement program had their drivers’ license numbers (which in some cases were their social security numbers) exposed on-line for seven months due to a “hole” in the city’s firewall.
  • October 2006: A laptop computer (since recovered) stolen from Connors State College contained the social security numbers and other data of its students, as well as 22,500 records of high school students who qualified for Oklahoma HLAP scholarships.
  • October 2006: A thief stole three laptop computers from Gymboree’s corporate headquarters. They contained the unencrypted human resource data (names and social security numbers) of up to
    20,000 workers.
  • November 2006: According to documents obtained under the Freedom of Information Act, 478 laptops had been lost or stolen from the IRS since 2002. Of these, 112 held sensitive taxpayer information such as social security numbers.
  • November 2006: Starbucks lost track of four laptop computers. Two of the laptops contained the names, addresses, and social security numbers of 60,000 current and former U.S. employees and about 80 Canadian workers and contractors.
  • November 2006: A computer purchased at a second-hand store, Deseret Industries, contained the names, social security numbers, employment records, and other personal information about 6,244
    Intermountain Health Care employees.
  • December 2006: A Boeing laptop containing files with the names, salary information, social security numbers and contact details of 382,000 current and former employees was stolen.

2) Risk Must Be Managed

Managers must understand the risks data represent and then formulate, execute, monitor, and verify plans to remove these risks through reasonable means. SOX adopters are specifically obligated to
conduct a formal risk assessment and choose a risk management strategy. Assessing risk means identifying where the likelihood of data compromise and the potential impact of that compromise intersect most strongly.

3) Managing Risk with Formal Controls

IT groups are starting to understand the language of “controls”: manual or automated steps, often implemented in software, that can prevent, detect, and/or correct a problem. Preventive controls are the most desirable (like a door lock), and some tools serve more than one role; a firewall, for example, is both preventive and detective. IRI, the publisher of this paper, makes a tool (CoSort) that prevents and corrects problems with data by protecting and transforming it, leaving detection to tools like Exeros DataMapper. Automating these controls effectively is preferable wherever possible, because people make (inconsistent) mistakes.
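
To make the distinction concrete, the following minimal Python sketch pairs a detective control (flagging values that look like social security numbers in a delimited extract) with a preventive/corrective one (masking those values before the file is released). It is illustrative only, not a depiction of CoSort, SortCL, or any other vendor tool, and the file names and record layout are assumptions.

    import csv
    import re

    # Illustrative only; the file names and layout used here are assumptions.
    SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

    def detect_ssns(path):
        """Detective control: report which rows and columns look like SSNs."""
        findings = []
        with open(path, newline="") as f:
            for row_num, row in enumerate(csv.reader(f), start=1):
                for col_num, value in enumerate(row, start=1):
                    if SSN_PATTERN.match(value.strip()):
                        findings.append((row_num, col_num))
        return findings

    def mask_ssns(in_path, out_path):
        """Preventive/corrective control: mask SSN-like values before release."""
        with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
            writer = csv.writer(fout)
            for row in csv.reader(fin):
                writer.writerow(
                    ["XXX-XX-" + v.strip()[-4:] if SSN_PATTERN.match(v.strip()) else v
                     for v in row])

    if __name__ == "__main__":
        if detect_ssns("employees.csv"):                     # hypothetical extract
            mask_ssns("employees.csv", "employees_masked.csv")

In practice such checks would run on a schedule and log their results, which leads to the next point.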

4) Executives Are Responsible for the Controls

Because the information in financial reports (the kind at the heart of the WorldCom and Enron scandals and the SOX legislation that followed) can flow through IT department resources, CEOs and CFOs need IT’s help to attest to the accuracy of the data and the efficacy of the controls. This is true not only for SOX, but also for HIPAA, PCI, and other regulations with which companies must comply. These executives must guarantee that the data could not have been breached, changed, or hacked, and show how the company works to protect it. A good control system must therefore have logging built in so that the use of these controls can be verified.
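
As a rough sketch of what built-in logging can look like at the code level, a control can be wrapped so that every run, completion, or failure leaves a timestamped entry an auditor can later review. The log file name, the log format, and the placeholder control below are assumptions for illustration, not a prescribed standard.

    import functools
    import logging

    # Illustrative audit trail; the file name and format are assumptions.
    logging.basicConfig(
        filename="control_audit.log",
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s")

    def audited_control(func):
        """Wrap a control so every invocation leaves a verifiable log entry."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            logging.info("control %s started", func.__name__)
            try:
                result = func(*args, **kwargs)
                logging.info("control %s completed", func.__name__)
                return result
            except Exception:
                logging.exception("control %s failed", func.__name__)
                raise
        return wrapper

    @audited_control
    def mask_payroll_extract():
        # Placeholder for an actual preventive control (e.g., field masking).
        pass

    if __name__ == "__main__":
        mask_payroll_extract()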

5) Use Industry-Standard Controls

Industry-recognized, post-SOX standards for data and access protection (like NSA Suite B encryption techniques) should be followed in order to be truly compliant. These include:

Committee of Sponsoring Organizations of the Treadway Commission (COSO)

A private-sector consortium formed in 1985, named for former SEC Commissioner James Treadway, and funded by five major professional associations in the United States. COSO is “dedicated to improving the quality of financial reporting through business ethics, effective internal controls, and corporate governance,” as published in a series of principles, reports, and recommendations available at www.coso.org.

Control Objectives for Information and Related Technologies (CobiT)

“An IT governance framework and supporting toolset (from the IT Governance Institute and the Information Systems Audit and Control Association) that allows managers to bridge the gap between
control requirements, technical issues and business risks.” [3]

International Organization for Standardization (ISO) 17799

Ten control sections, first published in 2000, comprising best practices for securing information assets.

Information Technology Infrastructure Library (ITIL)

Published by the Office of Government Commerce in Great Britain, ITIL focuses on IT services and is often used to complement the CobiT framework.

6) Implement Data Governance (People and Rules)

Applications are easy to map to corporate organizational charts because a specific person or group can be responsible for an application. Data, on the other hand, flow throughout the organization
and do not map to specific points on an organization chart. Only now are companies beginning to understand that they cannot map data accountabilities to specific job functions or departments.
Responsibility must be mapped across the organization. [4] This is the goal of those in the “Data Governance Office.” Data Governance sets the
rules of engagement for people responsible for specifying, designing, implementing, monitoring, testing, and retiring the controls on data. The controls have a life cycle, just like applications
and data do. Data Governance officials are responsible for implementing these rules throughout the life cycle, and the corporate IT staff provides input into, and is accountable to, these officials
and the agreed-upon rules. It is up to data governance efforts to identify the location and nature (e.g., risk level and need-to-know classification) of data at risk.
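
One way a Data Governance Office can make those classifications actionable is to keep them in a machine-readable register that automated controls consult. The Python sketch below is a simplified assumption of what such a register might contain; the field names, risk levels, and roles are hypothetical.

    # Hypothetical classification register; field names, risk levels, and roles
    # are illustrative assumptions, not a prescribed standard.
    CLASSIFICATION = {
        "ssn":        {"risk": "high",   "need_to_know": {"payroll", "compliance"}},
        "salary":     {"risk": "high",   "need_to_know": {"payroll", "executive"}},
        "home_phone": {"risk": "medium", "need_to_know": {"hr"}},
        "department": {"risk": "low",    "need_to_know": {"all"}},
    }

    def may_view(field, role):
        """Return True only if the role's need-to-know covers the field."""
        entry = CLASSIFICATION.get(field)
        if entry is None:
            return False                    # unclassified data is denied by default
        return "all" in entry["need_to_know"] or role in entry["need_to_know"]

    if __name__ == "__main__":
        print(may_view("ssn", "hr"))          # False in this example register
        print(may_view("department", "hr"))   # True

Denying access to unclassified fields by default keeps the register, rather than individual programs, as the single point of policy.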

7) Master Data Management (MDM)

Master data is a unique core set of transactional data elements (fields) used in most of a company’s processes and information systems transactions; i.e., control tables or reference data in a look-up table. It exists because companies need to share data across departments and business functions, or, in the case of a merger or joint venture, because two or more companies must share data across different platforms. Master data is used for modeling and data quality rules in data migration, cleansing, and stewardship (procedures to prevent and protect against disclosure). Ventana Research defines
master data management as the collection of practices and technologies for providing business and IT the capability to define enterprise-wide master or reference data that is linked to the
business. [5] When a piece of master data that affects
many applications and transactions is missing, wrong, inconsistent, or incorrectly labeled, there will be adverse effects downstream. Therefore, good master data management (MDM) is required to
keep this data clean, and to standardize master data models amid the relational taxonomy of facet data.

According to Jane Griffin, a Deloitte Consulting partner specializing in data warehousing and business intelligence systems, there are four processes essential to a good MDM strategy [6]:

  1. Data Migration and Integration
  2. Data Maintenance
  3. Data Quality Assurance and Control
  4. Data Archiving [7]

Master field data are typically stored in a centralized repository (often in support of a Service Oriented Architecture) for cleansing and sharing, which raises the stakes for field-level protection.
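
As one illustration of field-level protection for data headed into such a repository (not CoSort’s SortCL syntax; the secret key, field name, and record layout are assumed for the example), a sensitive master key such as a social security number can be deterministically pseudonymized so downstream systems can still join on it without ever seeing the raw value:

    import hmac
    import hashlib

    # Illustrative only: the secret key and the "ssn" field name are assumptions.
    SECRET_KEY = b"replace-with-a-managed-secret"

    def pseudonymize(value: str) -> str:
        """Deterministic de-identification of one field value (keyed HMAC-SHA256)."""
        return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

    def protect_record(record: dict) -> dict:
        """Replace the sensitive master key before the record enters the hub."""
        protected = dict(record)
        protected["ssn"] = pseudonymize(record["ssn"])
        return protected

    if __name__ == "__main__":
        rec = {"ssn": "123-45-6789", "name": "Jane Doe", "region": "SE"}
        print(protect_record(rec))

Because the same input always produces the same pseudonym, records from different source systems can still be matched on the protected field, while recovering the original value requires the managed secret.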

Please visit www.cosort.com to request additional material from this abridged document, including discussions and examples of
specific, field-level security techniques (such as encryption and de-identification) in CoSort’s SortCL tool that help secure master data and make transactional data safe for compliance and outsourcing.

© 2007 Innovative Routines International (IRI), Inc. All Rights Reserved.

[1] Source: http://www.privacyrights.org/ar/ChronDataBreaches.htm

[2] Corporate, brand, and product names mentioned throughout this paper are trademarks or registered trademarks of their respective holders.

[3] CobiT is likely the most important of these control sets from the perspective of the IT director and those responsible for making practices compliant and data safe.

[4] Source: The Data Governance Institute

[5] According to Ventana Global Research Director David Waddington, investing in master data management will enable companies to continue leveraging the benefits of their current
BI, ERP and data warehousing investments.

[6] Griffin, Jane. “Information Strategy: Building a Data Management Strategy, Part 1: Process.” DM Review, October 2005.

[7] IRI’s CoSort is a tool that can assist in all four processes, including the role of changed data capture. However, such applications, including CoSort’s field-level
protections for data at risk in files (at rest or in motion), must be covered in a separate white paper.

David Friedland

David, Vice President of Business Development for Innovative Routines International (IRI), Inc., has more than 15 years of progressively responsible experience growing the firm into a global leader in sorting and data manipulation technology. Prior to joining IRI, David held positions at home and abroad in journalism, management, and education. He has communication and business degrees from Cornell University and the University at Albany (SUNY), respectively.