Bringing Business and IT Together – Part 3

Published in October 2006
Articles in this series – Part 1, Part 2, Part 3

This is the third and final article in a series about Business/IT working relationships. The first article, Bringing Business and IT
Together: A Mandate for Change
, was published in October 2005. In that article I presented a belief that now, more than ever before, business/IT working relationships must undergo
fundamental and systemic change. The role of business intelligence (BI) as a strategic component of information systems makes this a critical issue. Good BI systems are closely integrated
collections of technologies, business metrics, business processes, and disciplines of business management. They demand cohesion, continuity, consistency, and quality that can’t be achieved with
typical tenuous and fragile working relationships between business and IT organizations.

The second article, Bringing Business and IT Together: Practical Steps to Improved Working Relationships, made the transition from
describing the problem and its urgency to offering a solution. That article, published in January 2006, prescribes organizational alignment as the solution and describes a process of continuous
organizational alignment (COA). COA is based on a three dimensional framework that encompasses elements of working relationships, elements of organizational effectiveness, and alignment activities.
The activities – identify misalignment, correct misalignment, and sustain alignment – offer a macro view of the solution but are not sufficiently specific and concrete to be actionable. A
continuous alignment process (see Figure 1) is necessary for real and meaningful changes to occur. Continuous is a key concept here. Continuous alignment is the goal and the
challenge – continuous because organizations, people, processes, and technology continuously change. One-time alignment simply will not do.

Note that each cycle of the COA process begins and ends with measurement. As with any continuous improvement work (CQI – continuous quality improvement, CPI – continuous process improvement, etc.),
measurement is fundamental. It is the beginning of a cycle, the end of a cycle, and the essential element to provide continuity.

Although essential, measuring organizational alignment is difficult and challenging. Inter- and intra-organization working relationships are complex things that are primarily driven by human
factors – beliefs, behaviors, experiences, politics, power, and the like. Measuring organizational alignment is a complex problem. It seeks to turn personal, subjective, and intangible things into
quantifiable and objective things. That is the reason this article has taken so long to produce – the first two installments in consecutive issues of TDAN, and the final piece lagging by several issues.

At long last, I believe I have a measurement system that will work (with much credit to the work of Jerry Luftman, Distinguished Professor, Stevens Institute of Technology School of Management). By
a measurement system that will work, I don’t mean a system that provides precise and absolute quantification of organizational alignment. The purpose of this measurement system is to collect data
that is useful to (1) assess the current state of alignment, (2) identify the trouble spots, (3) determine the root causes of trouble spots, (4) make informed decisions to correct problems, (5)
take corrective actions, and (6) evaluate the impact of those actions.


As with all measurement systems, organizational alignment measures must be dimensional if they are to be useful for analysis. Two primary dimensions are organizational effectiveness and working
relationships, which are described in detail in the Practical Steps to Improved Working Relationships article and briefly reviewed here. The third dimension of organizational alignment is human
behaviors. Collectively these dimensions produce an analytic structure as illustrated in Figure 2.

Organizational Effectiveness: Ineffective organizations are difficult (and undesirable) to align with. Organizational effectiveness has three major components:

  • Processes are the procedures and activities through which interaction occurs, needs are communicated, products and services are delivered, and payment or chargeback is achieved. Important
    influences include the degree of formality in processes, the level at which they are documented and understood, the extent to which they are followed, results-orientation without undue bureaucracy,
    consistency of application, and much more.
  • Relationships involve both the state and the quality of interaction among multiple organizations. Organizational relationships have a structural component and a cultural component. Structure
    expresses the form or forms of a relationship – contractual, partnership, and collaborative for example. Culture expresses the attitudes and emotions of a relationship – friendly vs. combative,
    trusting vs. distrustful, comfortable vs. awkward, etc.
  • Skills are the abilities to produce solutions in a problem domain. Business skills, technology skills, and interpersonal skills are all important. It is increasingly important that business
    people have some technical skills and that technical people have some business skills.

Working Relationships: Working relationships are the procedures, processes, attitudes, and behaviors with which organizations interact. Working relationships are founded on three cornerstones:

  • Governance provides the structure and controls needed to achieve value from IT resources, minimize risk of IT initiatives, ensure long-term viability of IT systems, ensure that IT systems
    support regulatory compliance, raise the level of information technology maturity, and satisfy business expectations of IT. Governance addresses ownership, responsibilities, measurement, policies,
    and working practices for data, technology, and information systems.
  • Competency is the ability to produce and deliver results and encompasses both knowledge and ability to apply that knowledge. Effective Business/IT relationships demand competency in three
    domains – business, technology, and program/project management. Although frequently considered to be subjective and intangible, competency is readily affirmed and demonstrated through references
    which may range from internal word-of-mouth impressions and reputation to recognition as a best practices leader. Competency is essential for individuals, teams, and organizations alike.
  • Communications are a cornerstone of Business/IT alignment – the connections that enable access, understanding, cooperation, and teamwork. Every aspect of alignment depends in some way and to
    some degree on communication. Skills, for example, are individual and problem-domain specific; competency is collective and results-oriented; moving from skills to competency and from problem to
    results depends largely upon communications.

Human Behaviors: This third dimension, not discussed in the previous articles, is essential both for measurement and for taking action to effect change. Organizations are, of
course, composed of people. Thus organizational alignment problems are also people problems, and measuring alignment must include a human dimension. A measurement structure for organizational
alignment must include ways to measure:

  • What we think – Expectations and beliefs are the basis of natural and instinctive responses and reactions in any situation where two or more people interact. Shared expectations and beliefs –
    those of groups and teams – and individual expectations and beliefs both play a significant role in working relationships.
  • What we say – Communications of many kinds – spoken, written, and visual – influence the expectations and beliefs of individuals and groups. While communications as an element of the
    working relationships dimension addresses formal communication, it is also necessary to evaluate informal communication, which frequently has greater effect on what we think than
    formal communication does. Visual communications such as body language, dress, organization of a meeting space, etc. are highly influential and often overlooked elements of working relationships.
  • What we do – The old adage “actions speak louder than words” is undoubtedly true when applied to working relationships. The effect of positive communication is quickly undone when we say one
    thing and do another. Actions are perhaps the most visible of communications.


As discussed earlier, measuring organizational alignment is particularly complex and challenging. The purpose is to quantify “soft” things – intangibles such as beliefs and attitudes. In this
realm the only things that can be practically measured are perceptions (what we think). That is the right stuff to measure anyway because that is what drives the actions and behaviors that
influence working relationships and organizational effectiveness. Ultimately a measurement structure must provide a means to quantify. It isn’t measurement if we can’t turn it into numbers that
enable analysis and support decision-making. Using three dimensions of organizational effectiveness, working relationships, and human behaviors offers twenty-seven points of measurement that can be
used to quantify perceptions of:

  • What we think about process communication
  • What we say about process communication
  • What we do about process communication
  • What we think about process competency
  • What we say about process competency
  • What we do about process competency
  • and so on …

Each measurement point must be assigned a numerical value as a first step to making it quantifiable. For ease of analysis and intuitive understanding it is ideal to work with a one-hundred point
scale. Figure 3 illustrates the distribution of values across the twenty-seven measurement points to support a one-hundred point scale.

Note that one-hundred percent of the values are distributed through the nine building blocks that are based on organizational effectiveness and working relationships. The distribution is not equal
– that is, not all of the nine areas carry equal weight in the measurement process. The intersection of relationships with competency is given more significance than the remaining eight building
blocks. Competency in relationships is a core component, essential for achieving real and lasting impact from improvement in the other eight areas.

The values are also distributed along the human factors dimension, with “what we do” weighted more heavily than “what we think” and “what we say.” This is based on the premise that the things
we do have greater impact than the things that we think and say – actions speak louder than words.

The contribution of each of the twenty-seven individual measurement points is determined by the two distributions of values. What we do about relationship competency, for example, contributes 8% of
the overall measure (40% of 20%). Distribution across all measurement points appears as shown in Figure 3.
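For readers who prefer to see the arithmetic, the two-way distribution can be sketched in a few lines of Python. Only two weights are stated in the article – 20% for the relationships/competency building block and 40% for “what we do” – so every other value in this sketch is a hypothetical placeholder chosen to sum to one hundred percent.

```python
# Sketch of the two-way weight distribution over the 27 measurement points.
# Only the 20% (relationships x competency) and 40% ("what we do") weights
# come from the article; all other numbers are hypothetical placeholders.

# Weights for the nine building blocks (organizational effectiveness x
# working relationships). The other eight blocks share the remaining 80%.
block_weights = {("relationships", "competency"): 0.20}
for eff in ("processes", "relationships", "skills"):
    for rel in ("governance", "competency", "communications"):
        block_weights.setdefault((eff, rel), 0.80 / 8)  # 10% each (assumed)

# Weights along the human-behaviors dimension ("what we do" is stated as 40%).
behavior_weights = {"think": 0.30, "say": 0.30, "do": 0.40}

# Each of the 27 measurement points contributes block weight x behavior weight.
point_weights = {
    (eff, rel, beh): round(bw * behavior_weights[beh], 4)
    for (eff, rel), bw in block_weights.items()
    for beh in behavior_weights
}

# "What we do about relationship competency" contributes 40% of 20% = 8%.
print(point_weights[("relationships", "competency", "do")])  # 0.08
print(round(sum(point_weights.values()), 4))                 # 1.0
```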

The three dimensions, twenty-seven measurement points, and distribution of values throughout are interesting, but only academically so without a means to collect data. Without data it still is not
measurement. Figure 4 illustrates the data collection basis. A sound data collection structure provides:

  • Consistency of semantics across all measurement points: In this case the semantic structure is composed of three elements – how we manage, how we implement, and how we sustain – at each
    measurement point. Collecting data, then, about how we manage process governance has the context necessary to provide analytic value.
  • A scale that is uniform for all measurement points: In this case we use a three point scale where a value of one is the lowest rating and a value of three is the highest rating. One means that
    we do it poorly; three means that we do it well.
  • Interpretation of the scale: To have real meaning and support decision making we must be able to apply the numbers to real-world concepts about organizational alignment. In this case the scale
    is interpreted as three levels of management (purposefully, casually, carelessly), three levels of implementation (globally, locally, individually), and three levels of sustenance (proactively,
    reactively, and accidentally).
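As a sketch, the scale and its interpretation might be captured in a small lookup table. The pairing of ratings to labels below – with three as the best rating on each scale – is an assumption made to keep all three interpretations consistent.

```python
# Sketch of the uniform three-point scale and its interpretation at each
# semantic element. Mapping 3 to the best label and 1 to the worst on every
# scale is an assumption, made for consistency across the three elements.
SCALE_INTERPRETATION = {
    "manage":    {3: "purposefully", 2: "casually",   1: "carelessly"},
    "implement": {3: "globally",     2: "locally",    1: "individually"},
    "sustain":   {3: "proactively",  2: "reactively", 1: "accidentally"},
}

def interpret(element: str, rating: int) -> str:
    """Translate a raw 1-3 rating into its real-world interpretation."""
    return SCALE_INTERPRETATION[element][rating]

print(interpret("sustain", 1))  # accidentally
```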


Collecting and quantifying subjective data such as that for organizational alignment is generally performed with survey methods. The structure – three dimensions and twenty-seven measurement points
– provides a big picture. Semantics, scale, and interpretation add detail to the big picture. One additional view is needed before preparing the survey – a concept of how survey results will be
aggregated and presented for analysis. Certainly more than one person will respond to the survey. Looking at each response individually isn’t informative. Important considerations for a good
survey include knowing how you will:

  • combine multiple responses into a collective set of values,
  • visually present those values as a starting place for analysis.
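Combining multiple responses into a collective set of values can be as simple as averaging each measurement point across respondents. The sketch below uses invented respondent data and abbreviated measurement-point names.

```python
# A minimal sketch of combining multiple survey responses into one collective
# value per measurement point. The respondent data is invented for
# illustration, and measurement-point keys are abbreviated.
from statistics import mean

# Each respondent rates each measurement point on the 1-3 scale.
responses = [
    {"process/governance": 2, "process/competency": 1, "process/communications": 3},
    {"process/governance": 3, "process/competency": 2, "process/communications": 3},
    {"process/governance": 1, "process/competency": 1, "process/communications": 2},
]

# Collapse individual responses into a mean score per measurement point.
collective = {
    point: round(mean(r[point] for r in responses), 2)
    for point in responses[0]
}

print(collective)
```

The same collective values become the input to visual presentations such as bar charts or radar plots, one axis per measurement point.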

Employing the dimensions and following the survey principles I have constructed a survey that consists of one-hundred statements. Every statement maps to one of the twenty-seven measurement points,
and is further classified by the semantic structure (manage, implement, sustain) that underlies those measurement points. The percentage distribution of values determines the number of statements
that are used for each measurement point. A sample of survey statements looks something like:

(For a copy of the complete survey, email the author.)

Survey participants will respond to each statement twice – once indicating the degree that they believe it to be true for business units, and once again for the degree that they think it is true
for IT. The survey does not show interpretation of the scale but uses a more general format of always, sometimes, or never true.


Using the survey and using the measures is not an event but a process as illustrated in Figure 5. The survey process is designed to address five key points: (1) Measurement is a process. (2) Its
purpose is to collect data. (3) Collecting data is pointless unless it is used to gain understanding. (4) Understanding is pointless unless it is used to take action. (5) Action to align business
and IT is continuous and not a one-time event.

Identifying the Survey Group: The participants that you choose will influence the nature of survey responses and the resulting measures of alignment. If only IT people participate
you can be certain that analysis will show that most alignment problems result from business behaviors. And if only business people participate the reverse is likely. If you survey only managers,
be assured that your problems will appear to stem from implementation and sustaining activities – certainly not from management. Choose the survey group deliberately and carefully. Seek an
appropriate balance between business and IT participation. Include people at all levels from executives to staff. Include everyone from recently hired people to “old timers.” Try to get a sample
group that is a truly representative cross-section of the company or organization that you want to measure. Some key points when identifying a survey group: (1) Include both business and IT people.
(2) Include people from multiple levels of the organization. (3) Group size depends on company size. (4) Incremental or parallel surveys may work best in very large organizations.

Collecting the Data: How many people should be invited to participate? That is a tough question to answer. Do you expect a low rate of response from a large group, a high response
rate from a small group, or somewhere in between the two extremes?

Ideally, thirty to forty people make a manageable set of respondents. But in a very large organization that may not be enough to offer a representative sample. The rate of responsiveness varies
widely between companies, so consider that also when identifying a group. Too small a response isn’t especially valuable. But responses from a cast of thousands will turn your measurement activity
into a data management project.

Many variables influence the way that people respond to surveys. Under whose sponsorship or authority is the survey being conducted? What information should you provide when sending the survey?
Under whose name should the survey be sent?

How you distribute a survey also needs to be considered. A geographically dispersed company will find email easiest, and it may be less constrained by schedule and labor considerations even when
all participants are in the same location. But group meetings give opportunity for discussion and are certain to improve the rate of response.

Conducting the survey in a group also offers the opportunity to impose a time limit that requires rapid, first-impression responses. These are often more meaningful for the first-time measures when
you’re getting started because they capture reaction-based rather than reasoned responses. Instinctive reactions generally drive behavior in relationships until we begin to measure and observe
that behavior. Un-timed and reasoned responses, however, may be more effective on a continuing basis. The process of responding, the thought that it initiates, and the knowledge of measurement and
observation all have a role in behavioral change.

Finally, before distributing the survey, consider the demographics that you need to support analysis of the data – separating business responses from IT, for example. Ideally survey responses
should be anonymous – it encourages honest answers and improves response rate – but you must include some demographics or analytical value will be limited.

Consolidating and Aggregating the Data: Once survey data has been collected it needs to be prepared for analysis. If you use a paper survey, data entry is the logical first step.
Using a spreadsheet or other electronic survey medium eliminates data entry, but you may still need to check that each survey is complete and contains only valid data. Missing and invalid data are
sure to skew the results.

To give the measures context that is useful for decision making it is also helpful to collect some statistics (metadata) about the survey process. Expressing, for example, the following statistics
gives the user of the measures valuable insights:

  • The survey was sent to 60 people representing business and IT equally.
  • 55% of those who received the survey responded – 15 from IT and 18 from business.
  • Two surveys were removed from the pool due to missing or invalid responses.
  • The measures reflect responses of 31 people – 15 from IT and 16 from business.
  • Only one executive response is included in the pool. The remainder is evenly divided between management and staff.

Use a spreadsheet or similar tool to consolidate responses and publish the results for information and analysis. The decisions that you make about demographics significantly affect your ability to
aggregate and summarize the data, which in turn affects its value for analysis. Separating business perceptions from those of IT, analyzing differences in perception between management and staff,
distinguishing departments that work effectively with IT from those with difficult relationships, and much more all depend on the demographic data associated with each survey that you collect.
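A minimal sketch of that demographic drill-down, assuming each survey carries a business/IT group label – all of the data here is invented:

```python
# Sketch of demographic drill-down: the same survey responses, grouped by the
# business/IT demographic attached to each survey. All data is invented.
from collections import defaultdict
from statistics import mean

surveys = [
    {"group": "business", "scores": {"relationships/competency": 2}},
    {"group": "business", "scores": {"relationships/competency": 3}},
    {"group": "it",       "scores": {"relationships/competency": 1}},
    {"group": "it",       "scores": {"relationships/competency": 2}},
]

# Bucket scores per measurement point within each demographic group.
by_group = defaultdict(lambda: defaultdict(list))
for s in surveys:
    for point, score in s["scores"].items():
        by_group[s["group"]][point].append(score)

# Average within each group to compare business and IT perceptions.
summary = {
    group: {point: mean(vals) for point, vals in points.items()}
    for group, points in by_group.items()
}
print(summary["business"]["relationships/competency"])  # 2.5
print(summary["it"]["relationships/competency"])        # 1.5
```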


Aggregation and summarization make the step from data to information. For analysis and application this is a shift of thinking from measures to metrics. Applied as metrics the data helps to answer
questions such as: Where are we today? How has it changed from the past? How effective are the actions that we’ve taken? Where is progress being made? Where do problems still exist? Where are the
most severe or immediate problems?

Analyzing Survey Results: The purpose of data analysis is to gain insight that helps to prepare for action – making and implementing decisions. Three basic principles help to find
hot spots that are candidates for action:

  • Consistently low responses are indicators of weaknesses in organizational alignment.
  • Consistently high responses show strengths.
  • Widely-varied responses indicate areas of risk where common understanding and consensus are needed.

The terms “consistently high” and “consistently low” are relative to the overall average of all responses.
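The three principles translate directly into a small analysis routine. The thresholds below – how far from the overall average counts as “consistently” high or low, and how much spread counts as “widely varied” – are illustrative assumptions, as is the data.

```python
# Sketch of the three hot-spot rules: consistently low responses flag
# weaknesses, consistently high responses flag strengths, and widely varied
# responses flag risk. Thresholds and data are illustrative assumptions.
from statistics import mean, pstdev

# Per-measurement-point lists of individual ratings (invented data).
ratings = {
    "process/governance":       [1, 1, 2, 1],  # consistently low
    "skills/communications":    [3, 3, 2, 3],  # consistently high
    "relationships/competency": [1, 3, 1, 3],  # widely varied
}

overall = mean(score for scores in ratings.values() for score in scores)

def classify(scores, margin=0.4, spread=0.8):
    """Label a measurement point relative to the overall average."""
    if pstdev(scores) >= spread:
        return "risk"        # wide variation: common understanding is needed
    if mean(scores) <= overall - margin:
        return "weakness"
    if mean(scores) >= overall + margin:
        return "strength"
    return "typical"

for point, scores in ratings.items():
    print(point, classify(scores))
```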

Where weaknesses and risks are identified, further analysis to understand the root causes is helpful. Remember that the goal is to correct problems, not to treat the symptoms. Drill-down analysis
by demographic grouping, by what we think/say/do, and by manage/implement/sustain is helpful to identify root causes.

Further analysis by various demographics and classifications helps to gain insights and uncover ways to improve alignment. For example:

  • Comparing responses where differences of opinion and belief exist. These are areas where change may have particularly high impact for improved alignment.
  • Comparing results between business units may yield interesting information including the ability to see pockets of strength, weakness, and risk. Where some business units are well aligned while
    others struggle, you may find successes and strengths to recognize as best practices.
  • Analysis by organization level (executive, management, staff) identifies differences of opinion – perhaps even differences of reality – that are organizationally based.

Developing an Action Plan: Measurement without action is futile, and action without deliberation is foolish. Deliberate action begins with goals – knowing what you want to
accomplish. Measures provide the baseline from which to establish goals. Remember in goal-setting that the objective is improvement – not perfection. Set ambitious but realistic and attainable
goals. Then use the measures and results of analysis to define an action plan that will:

  • Leverage strengths by propagating them to troubled areas, by ensuring that they are repeatable, and by integrating them into the core values and culture of your organization.
  • Remediate weaknesses by changing the communications, expectations, and behaviors that are found at the root cause of each weakness. Realize that you can’t directly change beliefs. By
    changing communications, expectations, and behaviors in a planned way it is possible to change beliefs indirectly and over time.
  • Mitigate risks, often by instantiating best practices in troubled areas. Similarly to weaknesses, risks are addressed by changing communications, expectations, and behaviors. In both instances
    remember that you can’t directly change beliefs.

Complete the action plan with tactics. Know not only what you want to accomplish but how you plan to achieve it. Be specific about which communications, expectations, and behaviors to change and
about how you will change them. Improve the probability of success with attention to the barriers including politics, territorialism, resistance, defensiveness, and a culture of “the way we’ve
always done it.”

Taking Action: What more is there to say here? If you’ve done the earlier steps well, then you know what to do, why you’re doing it, and what results you expect to achieve.
Execute the plan and begin to change the way that business and IT work together. Don’t expect miracles. Expect progress and steady improvement.


Finally, keep in mind that the objective is continuous improvement – not instant and radical change. A “big bang” approach is almost certainly doomed to failure. Follow the proven principles of
continuous improvement including measurement to assess the current state, measurement as a feedback loop, and steady but incremental progress. Continuous improvement is a lifestyle, not an event.
Getting there means that you will execute an action plan, measure to know the effect of the actions, analyze the “new” current state, make a new action plan, and do it all over again.


Dave Wells

Dave Wells leads the Data Management Practice at Eckerson Group, a business intelligence and analytics research and consulting organization. Dave works at the intersection of information management and business management, where real value is derived from data assets. He is an industry analyst, consultant, and educator dedicated to building meaningful and enduring connections throughout the path from data to business value. Knowledge sharing and skills development are Dave’s passions, carried out through consulting, speaking, teaching, and writing. He is a continuous learner – fascinated with understanding how we think – and a student and practitioner of systems thinking, critical thinking, design thinking, divergent thinking, and innovation.
