Performance Measurement


The Performance Measurement Focus

The success and growth of every business activity, from manufacturing to customer service, depends on how an organization uses its critical data. This is a lesson that is being learned and re-learned by large and small companies everywhere. Of course data warehouses can reveal buying patterns, business trends, proactive customer service opportunities, and shifts in consumer behavior, so why concentrate on performance measures as the key?

The data warehousing initiatives in many of these companies may be graded as successes or failures, but if the companies are smart they haven’t lost faith in the underlying goal: to leverage this critical business data in a complete business strategy. Performance monitoring and reporting based on the business’s critical success factors are the best insurance that the business is on track and reaching for value-based growth. The information systems delivery mechanism for this is the data warehouse (or data mart), which supplies the appropriate operational and strategic performance measures. Corporate commitment to action must follow from the performance measure reports, through feedback loops created to effect positive change. The effects of corrective action will then be reflected in performance measure improvements. As author Thomas S. Monson has said: “When performance is measured, performance improves. When performance is measured and reported back, the rate of improvement accelerates.” (1)


The Data Jungle

Consider how many business analysts in a typical company work on various views of corporate tracking and strategy. They often work in a spreadsheet environment and are always looking for the latest official numbers. Instead of building and reinforcing business operations, reports are dispersed and often duplicated through manual aggregations up the line organization, eroding the information content of the data and inefficiently supporting the very operations it was collected to support. As in all data warehousing solutions, access to a single set of “official record” data, along with dimensional and drill-down views of the information, directly addresses this problem.

However, the sophisticated data manipulations that are necessary in analytics are considered the territory of the spreadsheet, and this can obscure the fundamental role the data warehouse can play. But an appropriate data warehouse architecture can address even this twist in requirements. Because a performance measurement application requires comparatively few measures (data points) compared with other data warehousing applications, emphasis can be put on the pre-calculation and storage of the complex measures and algorithms that are repeatedly requested.

The key is the ETL (Extract, Transform, and Load) process employed in the data warehousing architecture. The tools in this market niche are designed to handle much more than the de-coding or re-coding of information as they select, transform, and load the warehouse. In one application, 60 data points extracted from source systems on a monthly basis can explode into 400 data points once the manipulations in the transformation stage of the load are completed.
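
As a sketch of that transformation-stage explosion, here is a minimal, hypothetical example; the measures, regions, and prior-month values are illustrative assumptions, not the application referred to above:

```python
# Hypothetical sketch: the transformation stage of an ETL load deriving
# additional stored data points from a few extracted source measures.

# A few raw monthly extracts per region (illustrative values only).
raw = {
    "east": {"revenue": 120_000, "orders": 800, "complaints": 12},
    "west": {"revenue":  95_000, "orders": 610, "complaints":  9},
}

# Assumed prior-month values, used to derive growth measures.
PRIOR_REVENUE = {"east": 110_000, "west": 101_000}

def transform(region: str, row: dict) -> dict:
    """Pre-calculate the derived measures that users repeatedly request."""
    derived = dict(row)  # keep the raw data points
    derived["revenue_per_order"] = row["revenue"] / row["orders"]
    derived["complaints_per_1k_orders"] = 1000 * row["complaints"] / row["orders"]
    derived["revenue_growth_pct"] = (
        100 * (row["revenue"] - PRIOR_REVENUE[region]) / PRIOR_REVENUE[region]
    )
    return derived

warehouse_rows = {region: transform(region, row) for region, row in raw.items()}

# Three extracted points per region become six stored points per region,
# which is how a modest extract can grow into hundreds of warehouse data points.
print(warehouse_rows["east"])
```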


The Scorecard View of the Business

The concept of the Balanced Scorecard was articulated by Norton and Kaplan at Harvard a few years ago. (2) In this scheme, the financial view of business performance is complemented by views into customer, employee, and environmental performance. Since the KPIs are no longer along a single financial dimension but span many dimensions of company activity, the technology solutions are often called Integrated Performance Measurement Systems, or IPMS. The justification for investing in an IPMS is often self-evident to management, as it lies in an improved corporate bottom line.

It is important to capture the widest range of metrics from the transaction systems; however, the challenge is to balance the availability of large numbers of metrics against the added complexity they represent for the broad user community of the data warehouse. As well, making all possible metrics available probably adds to background noise. Selecting from a universe of possible candidates is a refining process that systems staff must engage in with the business analysts. The most effective companies use a set of the “vital few” indicators to monitor the health of the organization.

The mechanics of performance measurement are complex, and the development and deployment of the process can be painful. Being in the unique position of understanding what data is available, the data warehousing analyst becomes a consultant to the business on alternative measures and an expert in data ‘genesis’ to support the design of measures. One thing to note is that measures evolve as their basis in process and their usefulness to the business become better understood. Typically many measures will be reported and tracked before a key set emerges. Many choices will be driven by industry best-practice measures so that a competitive benchmarking program can be established. Many will be reported in order to establish a baseline well before a target improvement value is set. A balance among breadth, completeness, complexity, and competitive comparison must be developed.

There is a challenge in producing aggregate views with a few metrics for the CEO, while also supporting the disaggregated data for analysis and for the ‘root cause analysis’ questions that will come back down from management. The vital few measures must “roll up” from the local level, where employees understand the measure and take ownership of the data and the results. This suggests gathering requirements with a key group of users, since there will be differing but related measures between groups. Line management will report on the CEO measures plus a local set for business control. Measures that pass through unchanged to the CEO and measures that participate in aggregate views both need to be designed. Additionally, targets may vary by reporting level. More stringent targets will be set at lower levels in the organization in order to provide a performance comfort zone relative to the targets at the senior management levels. In fact, more and more companies are tying individual compensation to business performance measures at all levels of the organization.
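
A small illustration of the roll-up idea, with hypothetical business-unit values and deliberately tighter local targets than the corporate one; a simple average stands in for whatever aggregation a real measure would use:

```python
# Hypothetical sketch: a local measure rolling up to a corporate view,
# with more stringent targets at the business-unit level than at the top.

business_units = {
    # unit: (on-time delivery %, local target %)
    "plant_a": (96.5, 97.0),
    "plant_b": (98.2, 97.0),
    "plant_c": (95.1, 97.0),
}
CORPORATE_TARGET = 95.0  # more forgiving than the local targets

corporate_value = sum(value for value, _ in business_units.values()) / len(business_units)

for unit, (value, target) in business_units.items():
    print(f"{unit}: {value:.1f}% (local target {target:.1f}%)")
print(f"corporate roll-up: {corporate_value:.1f}% (target {CORPORATE_TARGET:.1f}%)")
```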

Aggregate views presented to the CEO may well be ‘bundled’ measures. In this design, groups of related measures that are seen to participate in a performance area are represented by one number that is a weighted view of the scores of the individual participating measures. For example, a customer satisfaction measure may consist of a weighted score of ‘service wait time’, ‘total service time’, ‘focus group score’, ‘intent to buy again’, and ‘# of complaints’. Each score is weighted from 1 to 10 according to its importance in the bundled measure, and the actual scores then contribute to the weighted overall score accordingly. Of course this method of reporting has its limitations. The contribution of an individual measure to the overall bundled score can be obscured, and clearly the design is very subjective. Stay with unbundled measures for a number of reporting cycles before leaping into this design alternative.
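
As a sketch of the bundling arithmetic just described, assuming illustrative component scores already normalized to a 0–100 scale and hypothetical importance weights:

```python
# Hypothetical sketch of a bundled customer satisfaction measure:
# each component score (normalized to 0-100) is weighted by an
# importance factor from 1 to 10, and the bundle is the weighted mean.

components = {
    # measure name:        (score 0-100, importance weight 1-10)
    "service wait time":   (72, 8),
    "total service time":  (80, 6),
    "focus group score":   (65, 4),
    "intent to buy again": (90, 10),
    "# of complaints":     (55, 7),
}

weighted_sum = sum(score * weight for score, weight in components.values())
total_weight = sum(weight for _, weight in components.values())

customer_satisfaction = weighted_sum / total_weight
print(f"Bundled customer satisfaction score: {customer_satisfaction:.1f}")
```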


Managing the Business: IPMS Data

Performance measures in a scorecard are put into context with other measure types or trend data. Measure types include current value, best-practice value, and average value over time. Leading, or predictive, measures can make the scorecard even more powerful. Targets for improvement include external benchmarks, compliance (regulatory) targets, and internal targets. Targets can be expressed as absolute numbers or as a range of acceptable values. Variances to targets are a common measure presentation choice.
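
A minimal sketch of how a variance to a range-style target might be calculated for presentation; the measure, the acceptable range, and the status labels are hypothetical:

```python
# Hypothetical sketch: presenting a measure against a target expressed
# as a range of acceptable values, with the variance to the midpoint.

def variance_to_target(current: float, low: float, high: float) -> dict:
    """Return the variance to the target midpoint and a simple status flag."""
    midpoint = (low + high) / 2
    return {
        "current": current,
        "target_range": (low, high),
        "variance": current - midpoint,
        "variance_pct": 100 * (current - midpoint) / midpoint,
        "status": "within target" if low <= current <= high else "out of range",
    }

# e.g. a cost per customer of 10.40 against an acceptable range of 9.00-10.00
print(variance_to_target(10.40, 9.00, 10.00))
```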


What Business Metadata You May Need to Store and Access

You’ll need to store the metadata that is important to the business for each measure, and link it to the documents holding the technical metadata on those measures.

These might include the following (a minimal example record is sketched after the list):

  • Key performance area: e.g. Shareholder value
  • Key performance indicator: e.g. Financial or Productivity
  • Measure name: e.g. Supply
  • Measure unit: e.g. Cost per customer
  • Measure status: e.g. Implemented, defined, approved for design
  • Applicable asset: e.g. Retail Services
  • Annual reporting candidate: should this measure be considered for annual reporting?
  • Measure calculation: mathematical calculation described in English
  • Measure driver: goal for improvement in plain English, e.g. to reduce unit cost of delivery to the customer while retaining reliable and efficient service
  • Measure definition: definition in text
  • Data source: source system, table, field, filter
  • Measure process: process for the extraction of the data, technical transformations, timing, delta update mechanism
  • Competitive benchmarks: external target values and discussion
  • Internal targets: value and milestones to be achieved
  • Change owner: management responsibility
  • Reporting frequency: measure and management response timetables
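
A minimal sketch of how one such business metadata record might be captured for a metadata repository; the field names mirror the list above and all example values are hypothetical:

```python
# Hypothetical sketch of one business metadata record for a measure,
# mirroring the attributes listed above (values are illustrative only).

from dataclasses import dataclass, field

@dataclass
class MeasureMetadata:
    key_performance_area: str
    key_performance_indicator: str
    measure_name: str
    measure_unit: str
    measure_status: str
    applicable_asset: str
    annual_reporting_candidate: bool
    measure_calculation: str          # calculation described in plain English
    measure_driver: str               # goal for improvement in plain English
    measure_definition: str
    data_source: str                  # source system, table, field, filter
    measure_process: str              # extraction, transformations, timing, deltas
    competitive_benchmarks: str
    internal_targets: str
    change_owner: str
    reporting_frequency: str
    technical_metadata_links: list[str] = field(default_factory=list)

example = MeasureMetadata(
    key_performance_area="Shareholder value",
    key_performance_indicator="Productivity",
    measure_name="Supply",
    measure_unit="Cost per customer",
    measure_status="Approved for design",
    applicable_asset="Retail Services",
    annual_reporting_candidate=True,
    measure_calculation="Total supply cost divided by number of active customers.",
    measure_driver="Reduce unit cost of delivery while retaining reliable service.",
    measure_definition="Monthly supply cost normalized per active retail customer.",
    data_source="Billing system, SUPPLY_COST table, COST field, retail filter",
    measure_process="Monthly extract, currency conversion, delta update",
    competitive_benchmarks="Industry median cost per customer",
    internal_targets="5% reduction by year end",
    change_owner="VP Retail Operations",
    reporting_frequency="Monthly, with quarterly management review",
)
```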


Best Practices

The ultimate benefit is that the things important to the business are communicated more effectively through these reports, effecting change more rapidly.

As data administration practitioners and data warehouse project participants, it is critical that we get this type of application onto the business agenda.

Traditional support along one dimension of the business, whether customer, financial, or human resource performance, does not provide the ultimate view for managing the business unless it is further integrated into a Balanced Scorecard that is made available to the business for continuously checking for opportunities, challenges, and improvements.

Best Practice companies use performance measurement systems together with qualitative measurement initiatives. Performance measures become enablers of change by supplying data for these initiatives, rather than serving simply as lagging indicators of performance levels.

Best Practice companies also have knowledge management programs, with all employees in two-way communication about the status of programs and the actual measured results. This is not a set of measurements for executives only. Results must be published to the company, signaling that “we are all in this boat together.” For the IT staff, this suggests that delivery of the performance measurement report through a thin client might be appropriate.

And finally, the IPMS measures must be designed to support galloping change so that the relevance of the performance measures can be maintained, continuing to produce actionable data as business needs continue to evolve.

(1) Thomas S. Monson. Favorite Quotations from the Collection of Thomas S. Monson. Deseret Book, 1985.

(2) Robert S. Kaplan and David P. Norton. The Balanced Scorecard: Translating Strategy into Action. Harvard Business School Press, 1996.

Further Reading:
Nils-Göran Olve, et al. The Performance Drivers: A Practical Guide to Using the Balanced Scorecard. 1999.


Deborah Henderson

Deborah Henderson, B.Sc., MLS, PMP, CDMP, is President of DAMA Foundation, and Vice President of Education and Research for DAMA International (DAMAI). She chairs the DAMAI Education Committee and DAMA-DMBOK Editorial Board, and sponsors the DAMA-DMBOK to advance data management education in post-secondary institutions. Deborah has many years of experience in data architecture, data warehousing, executive information systems/decision support systems, online analytical processing design, and project management, with an extensive background in information management (process modelling, data modelling, and data dictionaries), enterprise management practices, and technical report writing. She has consulting experience in many different business functions in the energy and automotive sectors, using the techniques of data and process modelling. Deborah is President of DA Henderson Consulting Ltd.
