A New Way of Thinking – October 2004

Published in TDAN.com October 2004

People frequently talk about business rules as if the concept were well defined and there were widespread agreement on that definition. However, while the concept seems to have widespread
acceptance, there is less agreement about some of the more mundane (and, simultaneously, pragmatic) issues surrounding the form, function, and definition of a business rule. It is very interesting,
though, to see that despite the ambiguity associated with business rule concepts, there does seem to be an overwhelming agreement as to the importance of identifying, managing, and using business
rules as a significant part of information management. At the risk of going out on a limb, I thought it might be interesting to take a step backward and think about business rules and their context
in the data management universe.

The first issue to tackle involves refining the concept of the business rule as it is used today. Historically, today’s business rule evolved from two, or perhaps three, technological tracks:
artificial intelligence, formal description, and, arguably, formal validation.

Early computer programming was (and for the most part, still is) procedural, meaning that the program describes a series of tasks or steps the system takes in order to compute the answers to one
or more questions. While a large percentage of applications are programmed using this method, historically there have been other programming languages designed to take different approaches. An
approach of interest is referred to as declarative programming, where the programmer states some expected goal in terms of a final set of variable values. In addition, the programmer provides some
initialization to a set of variables and provides a set of rules that must remain true within the system. In execution, the rules are iteratively applied to the variables to both ensure that the
system remains in a valid state while refining the value set until the expected goal values are achieved.

The essence of a rule within this system is that when the asserted rule is inconsistent with the current state (i.e., the assignment of values to the variables), the variable values are modified to
be in sync with the asserted rule. For example, consider a state with two variables, X and Y, with an initial assignment of 10 to X and 16 to Y. If we assert the rule “Y’s value is
equal to 2 times X’s value,” then during execution, it would be seen that the initial assignments are not consistent with the rule, and that one of the variables’ values must be
modified to make the rule true. In this case, there are two different ways this can be accomplished – by resetting X’s value to 8, or by resetting Y’s value to 20. The rule was
triggered by the determination that the X and Y values were not consistent with the assertion, and that some change needed to be made to correct that inconsistency.
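
The behavior of this example can be sketched in ordinary Python rather than in any particular rule-based language. Here the asserted rule is taken to be that Y’s value equals twice X’s value (consistent with the repaired values of 8 and 20), and a repair function shows the two possible ways of restoring consistency:

```python
# A minimal sketch of declarative rule enforcement: when the current
# variable assignments violate the asserted rule, one of the values is
# modified to restore consistency. Plain Python, illustrative only.

state = {"X": 10, "Y": 16}

def rule_holds(s):
    # The asserted rule: Y's value equals 2 times X's value.
    return s["Y"] == 2 * s["X"]

def repair(s, prefer="Y"):
    # Two possible repairs: reset Y to 2 * X, or reset X to Y / 2.
    if prefer == "Y":
        s["Y"] = 2 * s["X"]   # Y becomes 20
    else:
        s["X"] = s["Y"] // 2  # X becomes 8
    return s

if not rule_holds(state):
    repair(state, prefer="Y")

print(state)  # {'X': 10, 'Y': 20}
```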

The iterative application of the rules is performed through a rules engine that determines at each point whether the conditions are appropriate for firing any rules. At that point the triggered
rules are prioritized and actions are taken to correct any recognized inconsistencies. The nice thing about this approach is its built-in automation – the programmer dictates the truth and the
system enforces it automatically. The problem, though, is that using this kind of system requires expert knowledge in the specific rule-based programming language.
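
The iterative fire-and-repair cycle can be sketched as a tiny forward-chaining engine, again in plain Python rather than a dedicated rule language; the function and field names are illustrative:

```python
# A minimal sketch of a rules engine: each rule pairs a condition (does
# the state violate the assertion?) with an action (repair the state).
# The engine repeatedly fires the highest-priority triggered rule until
# no rule fires, i.e., the state is consistent.

def make_engine(rules):
    def run(state, max_steps=100):
        for _ in range(max_steps):
            triggered = [r for r in rules if r["condition"](state)]
            if not triggered:
                return state  # consistent: nothing left to fire
            # Prioritize the triggered rules (lower number = higher priority).
            triggered.sort(key=lambda r: r["priority"])
            triggered[0]["action"](state)
        raise RuntimeError("rules did not converge to a consistent state")
    return run

rules = [
    {"priority": 1,
     "condition": lambda s: s["Y"] != 2 * s["X"],
     "action": lambda s: s.update(Y=2 * s["X"])},
]

run = make_engine(rules)
print(run({"X": 10, "Y": 16}))  # {'X': 10, 'Y': 20}
```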

The second origin of business rules derives from what I am referring to as “formal description.” This concept, which evolved within the data community, was focused on continued
improvements in approaches to describing information models. The main driver for this initiative was the dichotomy between the description of the “static structure” of information,
which is embodied in traditional data modeling, and the “dynamic behavior” of the world being modeled. This dynamism was captured in terms of what was referred to as business rules. The
nice thing about this approach is that it captures a different aspect of information systems – the embedded knowledge that runs them. The problem with this approach is that its high-level
semantics are mostly designed for description, and less for direct implementation.
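
To make that contrast concrete, a business rule in this descriptive style might be captured as structured metadata about the information model – useful for cataloguing and analysis, but not directly executable. The schema and the rule below are purely illustrative:

```python
# A business rule captured as a formal description rather than as code.
# Everything here is metadata about the model; nothing enforces the
# constraint at run time. The field names are illustrative only.

rule = {
    "name":       "CustomerCreditLimit",
    "subject":    "Customer",
    "attribute":  "credit_limit",
    "constraint": "credit_limit must not exceed 2 times annual_revenue",
    "rationale":  "Limit financial exposure to any single customer",
}

# The description can be catalogued, searched, and reported on, but it
# carries no execution semantics of its own.
print(rule["name"], "-", rule["constraint"])
```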

The third technology, formal validation, was originally developed as a means to validate software during its development as a way of proving the code’s correctness. Formal validation uses a
formal definition language to express assertions or constraints before and after a section of code that must be true. Assertions before a program statement are called pre-conditions and assertions
after statements are called post-conditions. The pre-conditions essentially dictate the system state, and by ensuring that the post-conditions always evaluate to true after the statement is
executed, the statement is known to be correct. The nice thing about this approach is its use of a language with formal syntax and semantics. However, these languages were designed purely for
description – they were never meant to express conditions and trigger actions at execution time. One well-known formal description language called Z (“zed”) forms the basis of what is now called
Object Constraint Language (OCL), which is part of the Unified Modeling Language (UML).
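
The pre-condition/post-condition idea can be sketched in Python using simple assertions, in the spirit of design by contract; this is illustrative only, and is neither OCL nor Z:

```python
# A minimal sketch of pre- and post-conditions: assertions checked
# before and after a piece of code. The pre-condition constrains the
# state the code may start from; if the post-condition always evaluates
# to true afterward, the code is behaving as specified.

def contract(pre, post):
    def wrap(fn):
        def checked(*args):
            assert pre(*args), "pre-condition violated"
            result = fn(*args)
            assert post(result, *args), "post-condition violated"
            return result
        return checked
    return wrap

@contract(pre=lambda x: x >= 0,                     # input must be non-negative
          post=lambda r, x: abs(r * r - x) < 1e-9)  # squaring the result gives x back
def square_root(x):
    return x ** 0.5

print(square_root(16.0))  # 4.0
```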

Each of these approaches provides some insight into ways to view business rules – either as actionable statements, high-level business descriptions, or formal specifications. But while none
of these approaches by itself is the holy grail of business rules – high-level descriptions that can be made actionable – perhaps a combination of the three might move us closer to that goal.

Let me elaborate: rule-based languages are great for creating the execution environment, but without business insight all we have is an empty shell. The formal description approach provides the
insight, but without the ability to execute. Formal validation provides a formal declarative description that can capture some of the insight as well as some of the aspects of action, but without
the environment.

But by combining the best aspects of these approaches, we can evolve a comprehensive solution that can capture the high-level description and still make it actionable. To adopt a different analogy,
one might say that the rule-based languages form the body, formal description is the soul, and formal validation provides the “breath of life.”

Here’s a start: formal validation systems can be used to capture a low-level description of the actual business rules. Each high-level rule asserted within an information environment may map
to one or more low-level rules, each relating to a data set and its corresponding attributes. In turn, we can use a rule-based execution environment to compile the low-level rules into a
single system that ensures that the assertions remain true during application execution. In other words, we map high-level business rules into lower-level assertions that capture the business logic
and at the same time can be made actionable.
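
As a sketch of that mapping, a single high-level rule such as “every order must reference a known customer and have a positive total” might be decomposed into low-level assertions over a record’s attributes, which an execution environment can then check. The rule, data, and names below are all illustrative:

```python
# One high-level business rule mapped to low-level, attribute-level
# assertions over a data set. A small checker makes the assertions
# actionable by reporting every violation for a given record.

customers = {"C1", "C2"}  # illustrative reference data

# Low-level assertions derived from the high-level rule, each tied to
# specific attributes of an order record.
assertions = [
    ("order references a known customer",
     lambda order: order["customer_id"] in customers),
    ("order total is positive",
     lambda order: order["total"] > 0),
]

def validate(order):
    # Return the names of any assertions the record violates.
    return [name for name, check in assertions if not check(order)]

print(validate({"customer_id": "C1", "total": 250.0}))  # []
print(validate({"customer_id": "C9", "total": -5.0}))
# ['order references a known customer', 'order total is positive']
```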

I believe that eventually this composite approach will emerge as a relevant component of an information modeling or information architecture development environment. The barrier to acceptance lies
not in the technical development, but in the organizational changes needed to make this work. Because we advocate the close association of application logic to business requirements, the
organization must support cooperation between technically savvy business clients and business-savvy technologists. But even in places where the historically adversarial relationship persists, I am
now seeing evidence of a slow but sure movement in the right direction, and I expect to see this emerging cooperation reflected in successful business rule programs.

Copyright © 2004 Knowledge Integrity, Inc.



About David Loshin

David is the President of Knowledge Integrity, Inc., a consulting and development company focusing on customized information management solutions including information quality solutions consulting, information quality training and business rules solutions. Loshin is the author of The Practitioner's Guide to Data Quality Improvement, Master Data Management, Enterprise Knowledge Management: The Data Quality Approach, and Business Intelligence: The Savvy Manager's Guide. He is a frequent speaker on maximizing the value of information. David can be reached at loshin@knowledge-integrity.com or at (301) 754-6350.

Editor's Note: More articles and resources are available in David's BeyeNETWORK Expert Channel. Be sure to visit today!