A Component Framework Model for Information Technology Architecture – Part Two

Businesses constantly pursue the cost-effective application of information technology to achieve sustainable competitive advantage in an ever-changing competitive environment. An IT Architecture
should, at the very least, provide a context for defining this goal. Above and beyond this, in order to be a truly effective tool for creating solutions, an IT architecture needs to define not only
all components of a company’s IT investment, but also

  • the components’ interaction among themselves
  • the components’ interaction with the enterprise as a whole
  • the components’ interaction within the competitive environment of the enterprise.

In the initial installment of “A Component-Framework Model for Information Technology Architecture” (TDAN 5.0), we defined an Information Technology Architecture as a model, composed of
a set of variables, which forms one side of an equation. On the other side of the equation is a state of information technology usage that provides the best continuous informing of the business it
supports, at the least possible cost. In this installment we will begin to take a closer look at the framework of the IT architecture side of the equation.

We defined the term framework specifically as the set of interfaces, or common boundaries, within the IT architecture. Some observations from the current literature may help to
clarify the concepts of components and interfaces:

A collection of services supported by a component is its component interface. An interface prescribes the mechanisms for a component to interact with other components. The description and
implementation of interfaces is the single most important aspect of using components…1
[Interface definition] provides encapsulation of components and isolation between subsystems. Component isolation is an important property of a good software architecture because it enables the
reconfiguration and replacement of components during the system life cycle. 2

A graphical model of the proposed architecture is shown below.

The overall model diagram presents the architecture in a form similar to an entity-relationship model; however, the “entities” (boxes) should be viewed as component types, and the relationships
(arrows) as interfaces. Nesting of component types, for example “IT Inventory” within “Enterprise”, indicates that the inner component type is a subclass of the component type in which it is
nested, enabling the subclass to inherit the interfaces of the superclass. In this case, “IT Inventory” inherits the “Current-State-determines-Future-State” interface.
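The inheritance relationship described above can be sketched in code. This is a minimal illustration only; the class and method names are assumptions standing in for the model's component types and interfaces, not part of the published architecture:

```python
# Sketch: component types as classes, interfaces as methods.
# Class and method names here are illustrative assumptions.

class Enterprise:
    """Superclass component type exposing the
    Current-State-determines-Future-State interface (iDeterm)."""

    def i_determ(self, current_state):
        # The current state of the enterprise determines its future state.
        return f"future state derived from {current_state}"


class ITInventory(Enterprise):
    """Nested component type: inherits iDeterm from Enterprise."""
    pass


inventory = ITInventory()
print(inventory.i_determ("legacy data models"))
# → future state derived from legacy data models
```

The point of the sketch is only that nesting-as-subclassing makes the superclass's interfaces available to the nested component type without restating them.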

Let’s take as a specific example the “Data” component, near the center of the architecture diagram. (Readers of The Data Administration Newsletter should have no disagreement with this choice of
placement!) Beginning to “drill down”, using a more detailed graphic representation (specifically Microsoft Component Object Model [COM] notation), this component with its direct and inherited
interfaces would appear like this:

I would propose the following definition for the component-type “Data”:

[Data is] a persistent set of propositions, resulting from observation, which are instantiated by the assignment of values to labels.
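The definition can be made concrete with a small sketch. The labels, values, and record layout below are invented for illustration; only the idea, a proposition instantiated by assigning values to labels, comes from the definition:

```python
# One "proposition" instantiated by the assignment of values to labels.
# The specific labels and values are illustrative assumptions.
proposition = {
    "label": "outdoor_temperature_celsius",   # the label (variable name)
    "value": 21.5,                            # the assigned value
    "observed_at": "2002-06-01T09:00:00",     # resulting from observation
}

# A persistent set of such propositions is, under this definition, data.
data = [proposition]
print(data[0]["label"], "=", data[0]["value"])
# → outdoor_temperature_celsius = 21.5
```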

As we can see in the overall diagram, this component-type shares common boundaries (interfaces) with other component-types including Human, Software, Hardware, and Data itself.

It’s pretty obvious that data does interface with these things, so what’s the point of this exercise? The point is that this approach allows us to effectively and comprehensively identify and
assign labels to those things, the interfaces, which constitute the most profitable area of concentration in the study of IT architecture and its application to solving problems. The interfaces can then
be analyzed and understood: first to determine which are the greatest bottlenecks, and then to optimize those interfaces to the greatest practical extent. Techniques such as Eliyahu
Goldratt’s Theory of Constraints, one of the most widely known methods for analyzing and relieving bottlenecks, can be brought to bear here. (See www.goldratt.com. More on this later.)

Let’s return to the example, and look at this set of interfaces in yet more detail. The labels are based, again, on the Microsoft COM convention, and are more or less arbitrary. The actual name is
less important than the process of identifying and labeling the interface.

The Data-Hardware Interface, or, Rust Never Sleeps (iHarDat)

This is the interface between hardware and data, including both data storage and data transmission. Data at rest is still usually rust, that is, iron oxide. We can write words in atoms, but remain
prisoners of seek time and bandwidth. Storage and transmission technologies lag behind other information technologies, making this interface one of the most debilitating bottlenecks in
any enterprise. In circumventing this bottleneck, additional costly interface instances proliferate. Data compression techniques, holographic storage, and in-memory databases hold much
promise, but progress is painfully slow. How can a company best deal with this set of circumstances?

The Data-Human Interface, or, Knowledge-to-Data-to-Knowledge (iHumDat)

This is where “informing”, the transformation of data into knowledge, takes place. Knowledge is data that has been internalized by a human; data is knowledge that has been externalized by a human.
Its absolutely dominant nature is exemplified by the explosion of the Web, a global and revolutionary optimization of this interface. Contrast this with the relatively lesser “Windows”
revolution, “merely” an optimization of the Human-Software interface. How a business can internally optimize this critical interface by increasing the “data inventory turnover ratio” will be
examined in more detail later.
examined in more detail later.

The Data-Data Interface, or, Data Structure (iDatDat)

This interface comprises the relationships among data at the same level, and between data at multiple levels. One example is the relationship between meta-data (a type of data, after all), e.g. variable
names, and the data it describes. Foreign-key relationships are another example. A company that understands the structure of its data resource better than its competitors understand theirs can turn this understanding into a
clear advantage. The current wave of “customer-driven” application strategies is one result of the dawning realization of the power of this interface. But how can a business optimize this
interface, above and beyond its competition, to gain competitive advantage?
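Both examples of the Data-Data interface, meta-data describing data and foreign-key relationships, can be seen in a tiny sketch using an in-memory SQLite database. The table and column names are invented for illustration:

```python
import sqlite3

# Sketch of two iDatDat relationships; table and column names are
# illustrative assumptions, not from the article's model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, "
    "customer_id INTEGER REFERENCES customer(id))"
)

# Meta-data-to-data: the column names (data about data) describe the
# values the table will hold.
columns = [row[1] for row in conn.execute("PRAGMA table_info(customer)")]
print(columns)  # → ['id', 'name']

# Foreign-key relationship: orders.customer_id refers to customer.id.
fk = list(conn.execute("PRAGMA foreign_key_list(orders)"))[0]
print(fk[2], fk[3], fk[4])  # → customer customer_id id
```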

The Software-Data Interface, or, Program I/O (iSofDat)

This interface is what computer programs do with, or to, data: “read” it from keystrokes; cause it to be displayed on an output device; use it to calculate other data. The revolution from
record-at-a-time sequential processing to set-at-a-time relational processing has transformed the use of information technology in business. What will be the next revolution in this interface, and
what will its impact be?
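The contrast between record-at-a-time and set-at-a-time processing can be sketched side by side. The table and figures below are invented for illustration, using the same in-memory database approach:

```python
import sqlite3

# Sketch contrasting the two processing styles named above; the table
# and its contents are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("west", 250.0), ("east", 50.0)],
)

# Record-at-a-time: the program walks each row and accumulates a total.
total = 0.0
for region, amount in conn.execute("SELECT region, amount FROM sales"):
    if region == "east":
        total += amount

# Set-at-a-time: one declarative statement asks for the same result.
(set_total,) = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'east'"
).fetchone()

print(total, set_total)  # → 150.0 150.0
```

The declarative form says *what* is wanted and leaves the *how* to the engine, which is the shift the article credits with transforming the business use of information technology.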

Data Refinement, or, the Development Life Cycle of Data (iRefine)

This interface is inherited from the superclass IT Inventory. “Refinement”, as defined in the Unified Modeling Language (UML), is “a relationship that represents a fuller specification of
something that has already been specified at a certain level of detail. For example, a design class is a refinement of an analysis class.” Refinement, for example, is what happens to the data
component type as it proceeds from any “higher” row of the Zachman framework to a “lower” row. This interface is invoked as venerable data models are dusted off to enable their refinement into
a whole generation of new data warehouses.

Current State of Data Determines Future State of Data, or, Back to the Future? (iDeterm)

The current state of the enterprise’s data, its “legacy”, determines the future state of the enterprise’s data. This interface is inherited from the superclass Enterprise: the way the
enterprise is now determines its future. Optimizing this interface facilitates and accelerates the evolution of the enterprise’s data: updates happen faster, the data values remain “fresher”, and the
variables/labels evolve just as quickly, synchronized with the enterprise as it adapts to changes in its business environment. A company that has optimized this interface has achieved a position of
advantage relative to its competition.

This is just a brief look at the set of interfaces of one of the two key component types. As we progress we will examine other component types and their interface sets, looking into the methods
which comprise some key interface types. We’ll take a look at how all the components interact through their interfaces, and how the entire IT architecture is animated through its framework of
interfaces. Then we will examine some metrics for measuring the effectiveness of these methods, and techniques for monitoring and optimizing this effectiveness.

  1. “Software Components as Application Building Blocks”, white paper, Quoin Inc., 1998
  2. The Essential CORBA, Mowbray and Zahavi, 1995

William Lewis

William has more than 20 years’ experience delivering data-driven solutions to business challenges across the financial services, energy, healthcare, manufacturing, software and consulting industries. Bill has gained recognition as a thought leader and leading-edge practitioner in a broad range of data management and other IT disciplines including data modeling, data integration, business intelligence, meta data management, XML and XSLT, requirements structuring, automated software development tools and IT Architecture. Lewis is a Principal Consultant at EWSolutions, a GSA schedule and Chicago-headquartered strategic partner and systems integrator dedicated to providing companies and large government agencies with best-in-class business intelligence solutions using enterprise architecture, managed meta data environment, and data warehousing technologies. Visit http://www.ewsolutions.com/. William can be reached at wlewis@ewsolutions.com.