Data Warehousing at the Speed of Business

Published in TDAN.com October 2003
How Next-Generation Technologies Enable Enterprises to
Challenge Traditional Data Warehousing Assumptions

As demand for responsive Business Intelligence (BI) and Business Performance Management (BPM) grows, global enterprises are still turning to data warehouses as their preferred source of data for
analysis[1]. The principle of gathering corporate data into a single, consistent store remains perfectly valid, but as businesses are constantly
changing, the practice of traditional data warehousing can prove complex, costly and prone to failure.

The fundamental problem is that traditional data warehousing methodology promotes stasis of the business model, but businesses thrive on change. The difficulty of reconciling these opposites is a
major contributor to why four in every ten data warehouse implementations are expected to fail[2].

Conventional data warehousing wisdom says that you should plan for a lengthy and expensive implementation, that you will need an army of skilled project managers and technicians, and that you can
forget about trying to reflect the changing state of your business: a data warehouse is static data in a static model, custom-built to meet fixed user requirements.

However, in order to adapt intelligently and at high speed to new competitive challenges, business users need access to information that remains consistent however much their
organisation changes. The cost and time overheads of re-coding a conventional data warehouse to track every change in the business are prohibitive, so reporting in such an environment will
always be delayed or inaccurate, and business intelligence initiatives will fail to deliver actionable conclusions.

Leaders of responsive, ROI-conscious enterprises rightly observe that this is no way to support a business. Rather than moulding their business models to fit in with what data warehousing
convention says is possible, major companies such as Royal Dutch/Shell Group, HBOS plc, and Unilever are breaking the rules, using next-generation tools and methodologies that make data warehousing
responsive to their businesses, and highly cost-effective.

Next-generation data warehousing assumes that both the business model and reporting requirements are ever-changing. Businesses can therefore not only obtain up-to-date business intelligence, but
also compare present, past and predicted performance, no matter what the business structure is at any given time. Business leaders can thus run truly adaptive enterprises, capitalising on
opportunities and reacting to global events faster than the competition.


The conventional rules – and how to break them

  • Build, don’t buy. Your enterprise is unique, so your data warehouse will need to be highly customised, tailored and coded to suit your individual business model.

By using a data warehousing application with a generic data structure, users can create customised data warehouses without the usual cost or time overheads.

  • The enterprise must clearly define an end-point for the data warehouse before starting any development work; the source systems to be used, and the queries and reporting formats needed, must
    be defined in advance.

With next-generation data warehousing, defining an end-point is no longer necessary, so business intelligence and performance management tools can adapt to changing user
requirements. The latest data warehousing techniques make it easier to define new data feeds and alter existing ones, because new star schemas can be created automatically. Adding a new transaction data
set, or modifying an existing one and regenerating the star schema, is a point-and-click operation. Business users can also alter their own reporting and querying requirements by defining
and managing their own data marts.
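As a rough illustration of what generating a star schema from a transaction data-set definition might involve, the sketch below derives fact and dimension tables from a declared set of measures and dimensions. The data-set description format and the generated SQL are assumptions made for illustration, not any vendor's actual interface:

```python
# Illustrative sketch: deriving a star schema (one fact table plus one
# dimension table per dimension) from a simple data-set definition.
# The definition format and the emitted DDL are assumed for this example.

dataset = {
    "name": "sales",
    "measures": ["quantity", "revenue"],
    "dimensions": ["customer", "product", "date"],
}

def generate_star_schema(ds):
    """Emit CREATE TABLE statements for the dimensions and the fact table."""
    statements = []
    for dim in ds["dimensions"]:
        statements.append(
            f"CREATE TABLE dim_{dim} ({dim}_key INTEGER PRIMARY KEY, "
            f"{dim}_name TEXT);"
        )
    fact_cols = [f"{d}_key INTEGER REFERENCES dim_{d}" for d in ds["dimensions"]]
    fact_cols += [f"{m} NUMERIC" for m in ds["measures"]]
    statements.append(
        f"CREATE TABLE fact_{ds['name']} ({', '.join(fact_cols)});"
    )
    return statements

for stmt in generate_star_schema(dataset):
    print(stmt)
```

The point of the sketch is that once the data set is described declaratively, regenerating the schema after a change is mechanical, which is what makes a point-and-click operation plausible.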

  • Freeze your business, and build the data warehouse to reflect it. Re-design is complex and expensive, therefore model the business as it is, and build your data warehouse to those
    specifications.

Global enterprises may introduce new brands, acquire competitors or sell off under-performing business units on a daily basis, so freezing the business is an impractical proposition. By separating
data from the business model, and allowing multiple models to co-exist, next-generation data warehousing enables the data warehouse to evolve at the same speed as the business even during
implementation.

  • Time variance is expensive and difficult to manage, so you must apply ongoing changes to the business model indiscriminately to all data, whether current, historical or future.

Next-generation data warehouses provide a generic data structure that separates transaction and reference (business context) data from the current business model, and stores them all as separate
entities. This makes it possible to view all of the organisation’s collected data according to past, current or future business models. A clear view of data in current and future business
models is particularly important during merger and acquisition activity, where it enables decision-makers to compare pre- and post-merger performance at high speed and low cost.
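The separation of data from the business model described above can be sketched in miniature: transactions reference stable entity identifiers, while the mapping of those entities into the business model is versioned with effective dates, so the same data can be rolled up under whichever model was (or will be) in force. All names and structures here are illustrative assumptions, not a real product's design:

```python
from datetime import date

# Sketch of "separate data from model": transactions reference a stable
# brand ID; the brand-to-division mapping is versioned with effective
# dates, so any model version can be applied to all of the data.

transactions = [
    {"date": date(2002, 6, 1), "brand": "B1", "amount": 100.0},
    {"date": date(2003, 6, 1), "brand": "B1", "amount": 150.0},
]

# Versioned business model: brand B1 moved between divisions in 2003.
model_versions = [
    {"effective": date(2000, 1, 1), "mapping": {"B1": "Foods"}},
    {"effective": date(2003, 1, 1), "mapping": {"B1": "Home Care"}},
]

def model_as_of(as_of):
    """Return the brand-to-division mapping in force on a given date."""
    current = None
    for version in sorted(model_versions, key=lambda v: v["effective"]):
        if version["effective"] <= as_of:
            current = version["mapping"]
    return current

def revenue_by_division(as_of):
    """Roll up ALL transactions under the model in force on `as_of`."""
    mapping = model_as_of(as_of)
    totals = {}
    for t in transactions:
        division = mapping[t["brand"]]
        totals[division] = totals.get(division, 0.0) + t["amount"]
    return totals

# The same data, viewed under the old and the new business model:
print(revenue_by_division(date(2002, 12, 31)))  # {'Foods': 250.0}
print(revenue_by_division(date(2003, 12, 31)))  # {'Home Care': 250.0}
```

Because the model is a versioned mapping rather than a property baked into each record, changing the organisation chart never requires re-coding the stored data, which is the essence of the claim made above.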

  • Federations of data warehouses are too complex and costly to build and synchronise. Handling multiple business models around the world is a sure-fire way to destroy the integrity of
    data.

By storing data separately from its model, enterprises can support multiple business models across a federation with greater ease. Synchronisation can be handled automatically, with new business
models distributed over the internet, and reporting controlled from a central point for maximal cost-effectiveness.
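A minimal sketch of the central distribution idea: a hub publishes versioned business models, and each local warehouse in the federation pulls the latest version while leaving its own data untouched. The class and method names are illustrative assumptions, not any real synchronisation protocol:

```python
# Sketch of federated model distribution: a central hub versions the
# business model; local warehouses synchronise their model copy only,
# so local data and local model extensions are never overwritten.

class Hub:
    def __init__(self):
        self.versions = []          # ordered list of (version, model) pairs

    def publish(self, model):
        version = len(self.versions) + 1
        self.versions.append((version, model))
        return version

    def latest(self):
        return self.versions[-1]

class LocalWarehouse:
    def __init__(self, name):
        self.name = name
        self.model_version = 0
        self.model = None

    def synchronise(self, hub):
        version, model = hub.latest()
        if version > self.model_version:
            self.model_version = version
            self.model = model      # only the model changes, never the data

hub = Hub()
hub.publish({"divisions": ["Foods"]})
hub.publish({"divisions": ["Foods", "Home Care"]})

sites = [LocalWarehouse("UK"), LocalWarehouse("NL")]
for site in sites:
    site.synchronise(hub)
# Every site now reports against the same model version.
```

With the model held centrally and merely referenced locally, reporting consistency across the federation reduces to checking that every site carries the current model version.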

  • A major data warehousing project requires significant investments in programming skills, as well as in project management, system architecture, business reporting, Online Analytical
    Processing (OLAP), and database architecture skills.

By using a pre-built data warehousing application that can quickly be adapted to suit the business, then managed by business users via a simple interface, enterprises can create and run data
warehouses without the investment in programming skills normally required – and without needing a skilled database administrator for every local instance.

  • Building a data warehouse could cost in the millions and take many months, if not years.

Enterprises that use data warehousing applications rather than building from scratch can expect much faster implementation at significantly reduced cost. Next-generation data warehousing software
also gives enterprises the opportunity to change the structure and purpose of the data warehouse during the implementation cycle, reducing the need for exhaustive pre-planning and dramatically
cutting the risk of project failure.


The next generation goes live

Next-generation data warehousing is not merely a blueprint for the future, but a reality in major enterprises around the world, where it is saving time and money, and delivering a clearer and more
accurate view of performance throughout change.

Take for example Shell OP, the various Oil Products businesses within the Royal Dutch/Shell Group. Shell OP needed to accommodate independently-changing local, regional and global business models
and data structures, while providing a standardised global view of business performance. According to the standard assumptions about data warehousing, the cost of designing, building and
maintaining such a system would be astronomical, and the system would have a high chance of failure.

Challenging the rules, Shell OP successfully built a federation of over 60 data warehouses covering over 80 countries in just 18 months, a timescale that would have been inconceivable under the
conventional rules of data warehousing. The solution brings together management information to support standardisation and segmentation, with global and local views of key business entities such as
customers and products. The federative approach permits any number of localisations to co-exist with the common corporate data model, giving a consistent top-down view without forcing a structure
on individual operating units.

Global FMCG giant Unilever regularly undertakes mergers and acquisitions, so it needed a data warehouse that would not require its multiple business models to remain static. The company also needed
to be able to view historical brand performance, in order to measure the effects of restructuring initiatives. Unilever successfully broke through the constraints of conventional data warehousing,
building a flexible and cost-effective solution that has delivered rapid results.

Using next-generation data warehousing technology, Unilever has succeeded in bringing together complex, time-variant data from numerous systems, and is using this data to deliver relevant and
timely management information directly to business users. The company now has commonality across supply-chain, brand, customer and financial data, all cross-referenced by the same master reference
data warehouse, ensuring greater consistency and accuracy of information.

The solution has made a substantial contribution to savings in procurement, and expanded Unilever’s ability to view the historic and projected performance of global brands across financial
and non-financial measures.

When Halifax and Bank of Scotland merged to form HBOS plc, the board wanted to integrate procurement data across the whole organisation in order to facilitate cost savings. Conventional wisdom
dictated that a custom-built data warehouse would be needed, and that HBOS would need to define an end-point very carefully before starting any work. HBOS could not accept these constraints,
because the nature of its ongoing business evolution meant that its organisational structures would be changing regularly. Furthermore, HBOS needed an operational data warehouse as quickly as
possible, since the board of directors wanted to use the cost savings made within the first few months of the merger as proof of its success.

With conventional data warehousing methodology, this degree of flexibility would have been at worst unfeasible, and at best expensive and slow to build. HBOS used a data warehousing application to
bring together data in different coding structures, and was able to give business users a clear view of the merged procurement information within just three months, without affecting its ability to
view data according to the old business models.


Breaking free from constraints

Enterprise leaders seeking to improve the ROI of their management information initiatives no longer need to feel that data warehousing technology holds them back. As the above examples demonstrate,
new software and methodologies make it possible to create highly responsive data warehouses that can be managed at low cost in rapidly-changing business environments. These data warehouses can
deliver a consistent view of the past and the present without requiring any costly changes to source systems, and automatically adapt to business change.

By challenging restrictive assumptions about data warehousing, enterprises can develop the flexibility they need without having to make unsustainable investments in technology. In a climate of
cost-cutting, can any enterprise afford to ignore next-generation data warehousing?

--------------------------------------------------

[1] A Harte-Hanks information integration survey published February 2003 found that 54 per cent of Global 2000 companies are implementing
a data warehouse, and 27 per cent plan to do so in the next 12 months. The survey was commissioned by Kalido Group, and was based on interviews with 154 respondents from the US, UK, and the
Netherlands.

[2] Cutter Consortium, Corporate Use of Data Warehousing and Enterprise Analytic Technologies, December 2002. According to the report,
the addition of features during development is a primary reason for data warehouse project failures.

Cliff Longman

As Chief Technology Officer, Cliff Longman is responsible for product and technology strategy for Kalido's data warehousing and master data management portfolio. For over 25 years, Cliff has been a leading-edge software technology practitioner in systems strategy, data modelling, and database design. He has significant experience in leading the design and development of major software systems in a variety of application areas, and extensive consulting experience across a number of industries including manufacturing, oil and gas, finance, electronics, and service sectors. Cliff graduated with a 1st Class Honours degree in Computer Science from Coventry University.
