The BIS Roadmap

Published in TDAN.com January 2003

Note for Readers: This article comprises a write-up on the core thinking and questioning associated with BIS implementation, embedded in explanatory notes illustrated with the example of a business case.
The article therefore has two facets:

  1. A concise reference on strategic thinking behind BIS
  2. BIS roadmap for the uninitiated

As a general rule, it is a good idea to sip this article slowly the first time around, going through the case and the illustrations based on it, appearing in blue. Upon assimilating the concepts and getting accustomed to the author's vocabulary and writing style, it is possible to refer only to the core portion appearing in black, which serves as a ready reference.


Introduction

The unfortunate reality of Information Systems projects is that a completely successful project is among the most unusual phenomena. Among those projects that do not take the "good day
path", a majority are deemed successful if they reach their objectives despite eating up excess resources and often narrowly missing the boat. Then there are projects that are deemed failures by
even the most generous evaluation, primarily because they at best meet their objectives only partially, leaving certain success criteria unfulfilled. An alarming number of these failed projects
relate to the Business Intelligence arena. Most of us find this surprising, since business intelligence is largely about analyzing and deducing rather than creating facts. The failure of these
projects must therefore be attributed to gathering wrong facts, performing wrong analysis and deducing wrong intelligence (AKA garbage). It is therefore worthwhile for us to dwell here upon the
steps that we may take, and the questions that we may ask ourselves, in establishing a BIS, so that it has a better rate of success. Here we'll concentrate on the strategic thinking associated
with BIS development, leaving the technological nitty-gritty to another article.

In the following sections we'll consider the example of a simple organization and find the questions that we should ask ourselves (and thus the lines of analysis we should pursue), leading to a roadmap
for BIS implementation.


The Case

Consider the case of a medium-sized (50-500 employees) organization operating in a geographical area with two business competitors. The organization is a spin-off from utility companies and serves them
by providing points for collecting utility-bill payments in cash, across-the-till, at various retail outlets.

The retail outlets are generically called Agents and may have a multi-tier structure, e.g. a retailer may be a franchisee of a retail chain, which in turn is a segment of a retail group, and so on. Our
organization needs to collect the payments from its Agents and in turn reward them at a small variable rate. Further, it may deal with any, some or all levels of a particular Agent hierarchy and may
need to reward higher levels in the hierarchy for contributions by lower levels.

The utility companies served by the organization are referred to as Clients. Clients too may have hierarchies and multiple points of interaction.

The process of payment may involve proprietary transaction machinery of the organization or POS machinery of the retailer, and may have scope for using multiple payment methods and multiple
currencies. In all cases, the transaction involves communication between the payment machinery and data center at the organization over telephone lines.

The organization has three main departments:

  1. Operations: Runs data center. Facilitates the electronic transaction.
  2. Retail management: Manages Agent force. Operates call center.
  3. Sales and Marketing: Maintains relations with Clients; has a role in negotiating with Agents.

It also has auxiliary arms including HR, Finance and Admin, plus a bunch of analysts, developers and testers, engaged on a 'need to' basis and clubbed into Development.

The organization needs to report suitably aggregated performance data to its Clients and to study performance data for its Agents. The management also needs to study trends and, hopefully, use them to
make the right decisions.


The Solution


Step 1: Find objectives

It is a world of demanding clients and obliging servers. Providers believe that it is their job to provide what clients want and to keep the clients happy. Unfortunately though, in an overwhelming
number of cases the two statements 'provide what the client wants' and 'keep the client happy' contradict each other, particularly when the client itself is unclear about what it wants.
In such cases the client grows progressively frustrated as the BIS takes shape. There is also the possibility that the client begins forming a low opinion of the solution framework, given that the provider simply
pleads with the client that certain requirements cannot be met, rather than highlighting that the solution platform usually offers a more elegant way of reaching the objectives than the one envisaged
by the client. It is therefore important to concentrate first on the objectives to be met rather than seeking stakeholders' dictation, with all its imagined cosmetics. In other words, the BIS
specialist must concentrate on the 'What' rather than the 'How' first.

 



To take an example from our organization, say the Retail Management department has an assistant who manually takes transaction numbers from the database into a spreadsheet (often making human
errors in the process). She then saves copies of this spreadsheet at various places for them to be potentially backed up. Next she prints this spreadsheet, struggling to fit the columns into the
printable area, and hands it to a colleague in the Sales and Marketing department (again, a couple of copies filed here and there), who puts her department's stamp on it and posts the spreadsheet to
the Client as a transaction summary report, along with a customary compliments card.

If we start our process of BIS building from this assistant – a key stake (and job) holder – we are going to get requirements such as: write software that will transfer transaction
figures from the database onto a visual interface form which looks similar to the spreadsheet. Provide two buttons on the form, one of which prints the form so it can be filed (put into a
black hole), while the other converts the form to a GIF image that can then be sent to the S & M assistant through e-mail. She will then print it off, put her departmental stamp on it and post it to the
Client as usual.

In reality, though, the process is simply one of taking certain transactional data from the database operated by the Operations department, aggregating it suitably and making it available to the Client.
The objective is therefore to provide the Client with an accurate summary of the transactions related to it (and possibly to give it the opportunity to analyze aggregated data), not to post reams of
potentially inaccurate reports that require effort on the part of Client personnel to yield a decision-worthy picture. This can be achieved without requiring any effort on the part of the Retail
Management or S & M departments, while avoiding the delays and inaccuracies of the present process. The rendering of data can be done securely using an extranet mechanism or as an encrypted e-mail
attachment. However, this solution can only be arrived at by concentrating on the objectives first, rather than blindly following stakeholders' commandments.

The following questions can guide us in our quest to find the objectives.


Does the organization have a coherent set of primary objectives?

It is often the case that what we refer to as an organization turns out to be a set of weakly coupled or, in extreme cases, practically disjoint entities / organizations, each with its own distinct
agenda. In certain cases they may share management or information infrastructure, but strategically they are disjoint. It is therefore important to ascertain that we are indeed
dealing with one organization strategically; alternatively, we should have separate or weakly related BISs that can, as necessary, share infrastructural resources to a much wider extent, but not necessarily
intelligence. The easiest way to confirm that we are dealing with a single organization at the strategic level is to confirm that the organization has a coherent set of primary objectives.


What are the primary objectives of the organization?

The task of ascertaining that we are dealing with a coherent set of primary objectives goes hand in hand with the task of identifying them. Primary objectives are typically the reasons for the existence of
the organization, the subjects that identify it (and they typically correspond to the actions constituting the mission statement of the organization).

Our organization being simple, it has a single primary objective: carrying out electronic transactions accurately and cost-effectively between its Clients and the Clients' customers.


What are the antecedent objectives to the primary objectives?

Having appreciated what the organization stands for, the next logical step is to find what it does to meet its primary objectives. I prefer the word antecedent here, since meeting the
primary objectives is consequent to meeting these.

In our organization, we need:

  1. Reliable electronic transaction machinery
  2. Appropriate geographical spread
  3. Loyalty and efficiency of Agents
  4. Trust, good life expectancy and attractive commission rate for Clients


What are the CSFs?

In an enterprise, the organization is endeavoring to succeed, i.e. to preserve and strengthen its existence. This naturally corresponds to the elements that lead to the primary objectives of the
organization being met at an efficiency where the organization not only gains enough intrinsic inertia to keep going but is also able to meet competition. These factors that must be met for the
organization to succeed are referred to as critical success factors (CSFs). Many analysts like to equate these to the antecedent objectives mentioned above, though a better representation of CSFs is as
quantified antecedent objectives, where benchmarks are laid down to represent minimum success criteria.

In our organization, the CSFs may look somewhat like:

  1. At least 95% of electronic cash transactions must succeed, with transaction time not exceeding one minute each.
  2. An Agent outlet must exist within an area of 10 sq. miles that caters to 1000 customers of existing Clients.
  3. An Agent outlet must exist within an area of 10 sq. miles that caters to 2000 customers of prospective Clients / competitors, if it caters to 500 customers of existing Clients.
  4. Transaction machinery should be withdrawn from an Agent outlet that produces both under 1000 transactions and under $5000 in total transaction amount per week for four consecutive weeks, or twice for three consecutive weeks within a three-month period (this is to ensure network supportability).
  5. Clients must receive an accurate summary of their transactions, with regional classification, each month.
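To make the flavor of such quantified factors concrete, CSF 4 can be expressed as a small programmatic rule. The sketch below (in Python, which the article does not prescribe) uses the thresholds from the list; for brevity it checks consecutive-week runs over the whole series rather than a sliding 13-week window.

```python
from typing import List, Tuple

# Thresholds from CSF 4; a week is 'bad' when BOTH are missed.
MIN_COUNT, MIN_AMOUNT = 1000, 5000.0

def below_threshold(week: Tuple[int, float]) -> bool:
    count, amount = week
    return count < MIN_COUNT and amount < MIN_AMOUNT

def should_withdraw(weeks: List[Tuple[int, float]]) -> bool:
    """weeks: chronological (transaction_count, total_amount) per week."""
    runs, current = [], 0  # lengths of consecutive 'bad' runs
    for bad in map(below_threshold, weeks):
        if bad:
            current += 1
        else:
            if current:
                runs.append(current)
            current = 0
    if current:
        runs.append(current)
    # Four consecutive bad weeks, or two runs of three or more bad weeks.
    return any(r >= 4 for r in runs) or sum(1 for r in runs if r >= 3) >= 2
```

Note that because both conditions must fail together, an outlet with a low transaction count but a healthy amount is not flagged.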


What are antecedents to CSFs?

Having identified and quantified the factors that can keep an organization ticking and potentially prospering in the marketplace, we next trace them to simpler antecedents that the BIS may keep track
of and report the level of success on. Note that we are sticking to the lowest possible level of measurable statistical quantity, not the non-statistical infrastructural issues that underlie it.
Whereas the latter may be where the focus of any remedy would lie, they are beyond the realm of BIS.

In our example, the antecedents to CSFs would be:

  • Downtime of the transaction network should be under 0.5% during daytime (8 am to 8 pm) and under 1% at night.
  • At least 99.5% of transaction terminals should be operational at any time.
  • At least 99% of transaction terminals should be operational during daytime.
  • At least 98% of electronic cash transactions must complete within a minute.
  • At least 97% of electronic cash transactions must succeed.
  • Mechanism should exist to aggregate transaction data at each Agent outlet, classified by Clients, at the end of each day.
  • Mechanism should exist to aggregate transaction data for each Client at county level granularity, at the end of each day.
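As an illustration of how such measurable antecedents lend themselves to mechanical checking, here is a minimal Python sketch; the metric names are invented, while the threshold values come from the list above.

```python
# Hypothetical metric names; thresholds are taken from the list above.
THRESHOLDS = {
    "day_downtime_pct":     ("<",  0.5),
    "night_downtime_pct":   ("<",  1.0),
    "terminals_up_pct":     (">=", 99.5),
    "terminals_up_day_pct": (">=", 99.0),
    "txn_under_1min_pct":   (">=", 98.0),
    "txn_success_pct":      (">=", 97.0),
}

def antecedents_met(metrics: dict) -> dict:
    """Return pass/fail for each antecedent given measured values."""
    result = {}
    for name, (op, limit) in THRESHOLDS.items():
        value = metrics[name]
        result[name] = value < limit if op == "<" else value >= limit
    return result
```

A BIS built on such antecedents can report the level of success on each factor directly, rather than only on the composite CSF.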


What are the desirables and their antecedents?

Like the essentials or primary objectives discussed above, we next need to explore the "good to have" traits – the desirables and their antecedents. These, though not the prime focal points of
decision support, have a definite role in the success of the organization. They also have the potential to be promoted in desirability under fierce competition, weak economic conditions or when the
market saturates.


What are ASFs and their antecedents?

The desirables, when quantified, lead to Additional Success Factors (ASFs). These and their measurable antecedents must be derived next.


Step 2: Find stakeholders

Only after performing an objective study of the objectives of the BIS should the analyst proceed to finding and studying the stakeholders. This helps present data relevant to the objectives to the
stakeholders involved, in the form most appropriate for each individual / group to consume. The following questions help with this.


Who are the stakeholders?

This has two facets:

  1. Who needs to be supplied with the data?
  2. What is the best form of presenting the data to each individual / group?

Going back to the example of the report of Client transaction aggregates mentioned earlier, we observe that apart from the Client, it is useful for the S & M personnel dealing with the Client to know
what data was made available to the Client, so as to be able to work as liaison officers on behalf of the organization. Further, it may also be useful for such a person to have statistics of a particular
Client's transaction contribution in comparison to other Clients. If the Client operates across different locales, it may be desirable to express the data in multiple units. (The biggest mess
is when mainland Europe, the British Isles and the Scandinavians get together to do business. Though geographically close, they differ in currency, units of measurement, language and
national time. You may be dealing with half a dozen locales covering a geography and business volume no bigger than that of the state of California.)


How are stakeholders classified?

This question is important in striking a balance between the personalization of information presentation and a level of commonality across presentations to similar clients (of the BIS), so as to allow
them to relate to the data while communicating with each other. It also helps in the implementation of data protection, and in turn a data design where such protection may be conveniently and
effectively applied. Another aspect of classifying stakeholders is working out who needs facts, who needs trends, and what the level / granularity of each should be.

E.g. whereas it is the same data about Client transactions that forms the basis of inputs for all members of the team, someone from the Finance department involved in the reconciliation
process needs accurate data, including low-granularity data for irregular transactions, so that accounts may be settled with the Client's personnel. A company executive, on the other hand, would like to
see a trend in how the number and amount of transactions have varied over time for a particular Client and how this compares with others. Further, the executive's counterpart at the Client
organization may be given access to the former (the trend), but perhaps not the latter (the comparison with others).


Step 3: Analyze

So far we have been hovering over the problem domain, deciding what we want and how best to group and supply this information. The next aspect is relating this to the data available.


What intelligence to look for?

The concept of BIS is comprehensive and covers DSS, MIS and lower-level factual reporting. These different facets aim at different stakeholders and need different levels of data analysis.


What triggers to look for?

Triggers serve two roles. They alert management to a decision-critical threshold being reached. They also help automate (or semi-automate, by reducing decision delay points and supervisory
contribution) the process. In particular, they greatly save effort on monitoring / wait-and-watch. Triggers in turn are based on facts or "quantified trends" (somewhat paradoxical,
basically meaning facts pertaining to "rate of change of … w.r.t. …"). Triggers precipitate from the antecedents to success factors / quantified objectives.

In our example, a warning trigger may be activated if an Agent misses defined thresholds of aggregate transaction amount and number for say 3 successive weeks, rolling and calculated each day.
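Such a rolling trigger might be sketched as follows; the thresholds and the 7-day rolling week are assumptions for illustration.

```python
from collections import deque

def rolling_warning(daily, min_count=150, min_amount=700.0, weeks=3):
    """daily: chronological list of (count, amount) per day.

    Each day, recompute trailing-7-day aggregates; fire when both stay
    below threshold for `weeks` successive rolling weeks of daily checks.
    """
    window = deque(maxlen=7)  # trailing rolling week
    consecutive = 0           # daily checks failed in a row
    for day in daily:
        window.append(day)
        if len(window) < 7:
            continue  # not enough history for a rolling week yet
        count = sum(c for c, _ in window)
        amount = sum(a for _, a in window)
        if count < min_count and amount < min_amount:
            consecutive += 1
            if consecutive >= weeks * 7:
                return True
        else:
            consecutive = 0
    return False
```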


What trends to look for?

Trends define the derivative value of data – how one data element varies in proportion to another – and help extrapolate facts into forecasts. Trends therefore allow tactical maneuvers, both to
leverage favorable circumstances and to take evasive action. They also allow piloting – outlining the bigger picture with a smaller experiment, establishing proof of concept, etc. Trends are the
reflections of triggers in a continuous evaluation paradigm. The trends to be monitored are therefore definable directly from the objectives and their antecedents.

Our organization may therefore consider it worthwhile to monitor trends in transaction volume for a Client over time, over geographical features (market share, level of urbanization), over Agent
group size, etc.

Trends are however tricky, in that they require isolation of tangibles and measuring one with respect to a second, keeping all others constant (or suitably normalizing).


What best describes trends?

Trends provide a very different picture with a change in granularity. Whereas coarser granularity may be a better indicator of performance, finer granularity may bring out the effect of the periodicity of
one tangible on another.

In our organization, consider measuring the transaction volume for a Client, month by month over an extended period, and try drawing a "trend-curve". This will give us an indication of how
that Client is doing. Instead, break this data into annual cycles and it may reveal how energy consumption fluctuates with the change of season. Analyze the same data in terms of weeks in a month for
consumers who earn their wages (say) towards month-end, and we get a plot of consumer behavior with respect to time of earning. Further, study the month at the granularity of daily data and it may also
reveal how consumers respond to billing frequency, bill dispatch date and deadline, prompting the Client organization to potentially change its billing cycle.
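The effect of granularity can be demonstrated by re-aggregating the same synthetic daily data two ways – by month (overall performance) and by week-of-month (pay-day periodicity). All figures below are invented for illustration.

```python
from collections import defaultdict
from datetime import date, timedelta

def aggregate(daily, key):
    """daily: {date: amount}; key: maps a date to a bucket label."""
    buckets = defaultdict(float)
    for d, amount in daily.items():
        buckets[key(d)] += amount
    return dict(buckets)

# Synthetic year of daily volume, elevated near month-end (pay-day effect).
daily = {}
d = date(2002, 1, 1)
while d.year == 2002:
    daily[d] = 100.0 + (200.0 if d.day >= 25 else 0.0)
    d += timedelta(days=1)

by_month = aggregate(daily, key=lambda d: d.month)                   # coarse: year-long trend
by_week_of_month = aggregate(daily, key=lambda d: (d.day - 1) // 7)  # fine: within-month periodicity
```

The monthly series looks flat, while the week-of-month series exposes the month-end spike that a coarser cut hides.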

Trends are also susceptible to dimensions[i] (in the sense in which dimension is defined in Physics). We may therefore want to take a higher derivative of one
tangible with respect to another, or in general devise calculable quantities as products / ratios / powers of primary tangibles, to best bring out the trend that the objectives leading to the BIS are
seeking.

I have actually dropped an example of this while discussing granularity. Whereas we measured the same quantity dimensionally (as usual, in the Physics sense of dimension) in transaction volume
for each week of each month and for each month of each year, we actually dealt with a different quantity when we measured transaction volume each month over an extended period (we took one lower
derivative with respect to time, studying $/t rather than $/t²). Another example of a trend changing with dimension is considering how a Client contributes to the revenue of the organization. Here,
whereas monthly transaction volume ($/t) for the Client may have shown a steady increase over time (thus $/t² is always positive), what best serves the organization's strategic purpose is the
trend in the proportion of overall monthly transaction volume ($/t) contributed by the Client (thus dimensionally, (($/t)/($/t))/t = t⁻¹), which may well turn out negative and therefore discouraging.
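A toy calculation makes the last point concrete: a Client whose absolute monthly volume rises steadily can still show a falling share of the overall volume. The figures are invented.

```python
# Absolute monthly volume for a Client ($/t) rises, but its share of the
# overall volume (dimensionless) falls - all figures invented.
client = [100.0, 110.0, 120.0, 130.0]      # Client's monthly volume
total = [1000.0, 1200.0, 1500.0, 2000.0]   # overall monthly volume

share = [c / t for c, t in zip(client, total)]
client_trend = [b - a for a, b in zip(client, client[1:])]  # ~ $/t^2
share_trend = [b - a for a, b in zip(share, share[1:])]     # ~ t^-1
```

Here every element of `client_trend` is positive while every element of `share_trend` is negative: the dimensionally different quantity tells the strategically relevant story.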


What facts to look for?

Triggers, trends and reporting requirements together constitute the set of facts that must be built to establish the BIS. Additionally, it may be necessary to collect unusual fact data (the
'unusual' being determined by static or dynamic triggering) and to sample facts for analysis.

Going back to our example, we know that to activate a static trigger such as 'notify if the amount in a single transaction exceeds $1000', we need to collect the amounts involved in individual
transactions and apply the trigger to them. We also need to collect identity information for the unusual transactions isolated by the trigger / criterion. The same example could be modified to
'isolate transactions that are over 500% of the average value of an individual transaction as computed the previous day'. This is a semi-static trigger, and additional data – the average
individual transaction value for the previous day – needs to be calculated, which then remains static through the run of the day. Then there could be a dynamic trigger where we want to 'isolate
transactions that involve an amount over 500% of the cumulative average for individual transactions calculated over the period of the day elapsed'. Here we'll need to calculate the cumulative
average periodically and frequently through the day. In short, as the trigger gets more dynamic, we need to calculate, collect and store additional data. A plot of the cumulative count of
such transactions over the progress of the day may help us reach some worthwhile conclusion (e.g. consumers tend to pay big-amount bills on the afternoon of their pay day). This can then help
anticipate the load on the transaction system and schedule 'fact finding' operations suitably.
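The three trigger styles can be sketched as follows; the class names and the wiring of the 500% factor are illustrative, not a prescribed design.

```python
class StaticTrigger:
    """Fixed threshold, e.g. 'notify if a single transaction exceeds $1000'."""
    def __init__(self, limit: float = 1000.0):
        self.limit = limit
    def check(self, amount: float) -> bool:
        return amount > self.limit

class SemiStaticTrigger:
    """Threshold fixed at start of day from the previous day's average."""
    def __init__(self, yesterday_avg: float, factor: float = 5.0):
        self.limit = factor * yesterday_avg
    def check(self, amount: float) -> bool:
        return amount > self.limit

class DynamicTrigger:
    """Threshold tracks the cumulative average of the day so far."""
    def __init__(self, factor: float = 5.0):
        self.factor, self.total, self.count = factor, 0.0, 0
    def check(self, amount: float) -> bool:
        # Flag against the running average, then fold the new amount in.
        flagged = self.count > 0 and amount > self.factor * (self.total / self.count)
        self.total += amount
        self.count += 1
        return flagged
```

Note how the dynamic variant must carry extra state (`total`, `count`) between checks – the additional data to calculate, collect and store that the text refers to.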


What inputs are available?

We have maintained that in the process of analysis we are dealing with the solution domain. So far we have discussed what quantities, at what granularity, are desired for the BIS function. The next
step is taking an inventory of the raw data at our disposal (typically coming from, say, the OLTP system) so it can be crunched and granulated suitably. This is largely a mechanical
job, though one aspect of the activity is finding whether and where redundant raw data is being produced. This may take two forms: two values covering the same quantity instance at
different granularities, and (say, in the simplest case) three quantities stored separately where the third could be calculated directly from the other two. One consideration is to minimize this
redundancy at the pre-BIS / OLTP level (and then reintroduce it in the BIS). Another is to guard against, or make a choice between, two quantities representing the same facts, where only one should be
taken as input to the BIS. Dimensional analysis has a role in this matching and selecting process.

E.g. in our process, the electronics in the payment terminal may keep track of the total number and value of transactions performed since its last reset. This information may have a role in the
verification process but is of little value to the BIS, where data is expected to arrive in raw, verified form and is available directly through individual transactions.


How to build the bridge?

We have accurate data in raw form on one hand and the type of data we require on the other. We therefore have to manipulate the first to reach the second. Dimensional analysis is about the only way of
doing this. The process is simple: work out the dimensions of the quantities on either side. Where dimensions match, quantities can cross from one side to the other, possibly changing granularity on the
way. Where a quantity on one side (the BIS solution) does not have a direct match on the other (the available data), we mix and match the dimensions (perform dimensional analysis) until we are able to relate
the two sides, where once again the granularity of the quantities involved needs to be correlated.

In general, granularity is changed only in one direction – from lower to higher grain size. This happens by the process of aggregation. When both granularity and dimensions are to be changed to
match quantities on one side with those on the other, granularity is changed first (aggregation is done first), followed by the manipulation suggested by dimensional analysis.
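The 'aggregate first, then manipulate' order can be shown with per-transaction records rolled up to day level and then combined into an average transaction value per day ($ / transaction). The sample data is invented.

```python
from collections import defaultdict

# Per-transaction records: (day, amount in $). Sample data is invented.
transactions = [
    (1, 20.0), (1, 30.0), (1, 10.0),
    (2, 50.0), (2, 50.0),
]

# Step 1: change granularity (transaction -> day) by aggregation.
amount_per_day = defaultdict(float)   # $ per day
count_per_day = defaultdict(int)      # transactions per day
for day, amount in transactions:
    amount_per_day[day] += amount
    count_per_day[day] += 1

# Step 2: dimensional manipulation - combine the aggregates into the
# target quantity, average transaction value per day ($ / transaction).
avg_value = {d: amount_per_day[d] / count_per_day[d] for d in amount_per_day}
```

Taking the ratio before aggregating (averaging per-transaction ratios) would not, in general, give the same answer, which is why aggregation comes first.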


Step 4: Implement

We have agreed that implementation is too big a topic to be done justice in this article. We are therefore leaving out aspects such as platform selection entirely. We may however briefly
define some details of the implementation step.


Is brunch a substitute for two meals?

In plain English, brunch is breakfast and lunch rolled into one. People therefore take brunch as a practical way of saving time, or they take it because they find one meal sufficient. Some go as far
as suggesting that their level of bodily activity does not deserve two meals. The same variations apply to BIS implementation.


Brunching goals

Organizations / business units that have a common set of objectives / apical goals may get together and develop a BIS that allows strategic leverage on each other's data. The arrangement works
where the units are aligned closely enough that neither has BI that must be guarded from, or is inconsistent with, the other's.


Brunching execution

We have considered earlier that there may be two or more organizations gathered under one corporate umbrella with little in common in their strategic objectives. They can however share a common pool
of infrastructure effectively, particularly where the BI demands of neither are high enough to do justice to a dedicated BI infrastructure implemented separately.

 





De-brunching

Conversely, where BI use is heavy and some portions of the BIS are used predominantly by certain subsections of the organization, it is possible to replicate the BI, or change its
granularity / operating frequency, by constructing data marts that rely substantially on the central data warehouse, and then applying different but consistent BI policies to them.

An entirely different situation is one where two BISs are kept separate for the purpose of data protection rather than efficiency. Here, they may hold a common subordinate BIS.

The two models are similar to the extent that both have some commonly operated and some uncommonly operated data. The residence of the super-data is central in the first instance, whereas it is
separate in the second.

How far should the coverage go?

This is an essential question in BIS implementation, though there is no single, simple answer. Some answers are provided by the longevity of the underlying transactional data. As a rule of thumb,
low-granularity BIS data derived from transactional data may disappear when the associated transactional data is no longer required to be maintained (typically as a legal obligation).

Many analysts follow the proportional value model to work out the worth of BI data. E.g. if hour-level aggregates provide specific value for only a month (and a month comprises, say, 180 hours' worth
of data), then this number of 180 (or thereabouts – say 200) is applied throughout. Thus day-level aggregates are valuable for ~200 days, which is nearly a working year. Likewise, week-level
aggregates will be worth keeping for ~5 years (~40 working weeks per year), and so on. Most analysts seem to agree to hover within one order of magnitude of the proportional value (e.g. 20
to 2000 in the case above).
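The proportional value model reduces to multiplying each aggregation level's period length by the chosen factor. A minimal sketch follows, using the ~200 factor from the text, with period lengths expressed in calendar days (the text itself reckons in working time, so its horizons differ slightly).

```python
# Proportional value model: retain aggregates at each level for roughly
# FACTOR periods of that level (factor ~200, per the text).
FACTOR = 200

PERIOD_DAYS = {"hour": 1 / 24, "day": 1.0, "week": 7.0}  # calendar lengths

def retention_days(level: str, factor: int = FACTOR) -> float:
    """Retention horizon, in days, for aggregates at the given level."""
    return factor * PERIOD_DAYS[level]
```

So hour-level aggregates survive for about a week of calendar time, day-level for about 200 days, and week-level for about four calendar years, consistent with the proportional pattern described above.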

Many other BIS specialists try finding this answer in the CSFs and in that respect consider most BIS data older than 5 years to be of little significance to a dynamic organization.


What should the sequence of implementation be?

It is of course well understood that BI related to CSFs / primary objectives must be derived first. Beyond that, it does not greatly matter which aspect is handled first, in terms of leaving
other aspects out. On one hand, if DSS were to assume the highest importance, followed by MIS in general, then trends and triggers assume more importance than reports. From a productivity perspective,
triggers score the highest and reports the lowest. Analysts however argue that triggers usually emerge from trend analysis (unless they represent regulatory guidelines) and trends are based
on facts that can appear as such in reports. Thus, on one hand the process essentially depends on report-worthy facts; on the other, computer-aided management has greatly improved trend and trigger
analysis and thus rendered isolated plain reporting obsolete.


Step 5: Manage

A BIS is an enduring entity, and its success depends not only upon the thinking that went into developing it but even more upon how it is managed and continuously improved.


Who holds the rope?

The BIS function often forms part of a broad Information Systems department. Such a department tends to club together everything and everyone with any computer orientation beyond the basics.
Thus PC support, the data center, network & security management, and often other technology-related jobs, including telephony and infrastructural support, are all clubbed with BIS.
Strategically, though, BIS is equidistant from all functional departments (non-functional departments being the support / auxiliary ones) of an organization. It is therefore fair that it be given separate
departmental status, irrespective of the staffing level involved.

In our organization, the BIS deals equally closely with the functional units – Operations, Retail Management and S&M. At the strategic level it is very close to the entire management of the
organization. At the tactical level, it is perhaps closest to the outward-facing units (say Retail Management and S&M). It therefore deserves a separate berth and a mandate to borrow from other units as
the need arises.


Staffing

Once again, given the strategic value of BIS, it is useful for the team to have good numerical analysts with knowledge of one or more functional areas of the organization. The team usually requires
ongoing support from the support arms, whose staff members tend to be involved less on a day-to-day basis than their functional / analytical counterparts.

Experience generally shows that a centrally positioned BIS team with all functional units represented can cut down the administrative staffing at those functional units by ~5 times its
representation in the BIS team. It also streamlines the reporting, MIS and DSS processes, which staff members at individual functional departments are often not best qualified to perform.

[i] An exhaustive discussion of dimensional analysis may be found in the article Applying Dimensional Analysis to Business Intelligence Systems.


Amit Bhagwat

Amit Bhagwat is an information architect and visual modeling enthusiast, in the thick of object oriented modeling and its application to information systems. He has developed a specialised interest in applying techniques from pure sciences to data modeling to get the best out of MIS / BIS. He is an active member of the precise UML group and happens to be the brain father of Projection Analysis - http://www.inconcept.com/JCM/June2000/bhagwat.html - and Event Progress Analysis - http://www.tdan.com/special003.htm - techniques. He also maintains contents for the celebrated Cetus Links - http://www.cetus-links.org - in Architecture and Design areas. He shares a variety of other interests including photography, poetry and sociological studies. Explore some of his work at: http://www.geocities.com/amit_bhagwat/
