The day after the end of the world was supposed to take place, I was reading a column in Computerworld by John D. Halamka of CareGroup Healthcare Systems and Harvard Medical School. He introduced me to the acronym “VUCA,” which stands for Volatility, Uncertainty, Complexity and Ambiguity. Think of it as FUD on steroids.
VUCA started as a military term in the 1990s, referring to places such as the war zone in Somalia, but it is being co-opted for organizational leadership by Bob Johansen, former CEO of the Institute for the Future. He wants to redefine VUCA to mean Vision, Understanding, Clarity, and Agility [1]. But let’s break down the original meanings.
Volatility: This is the rate of change in your system. Some systems are slower to change than others, but change can be sudden. As a math major, I can tell you there is a difference between a chaotic system and a catastrophe point. A chaotic system is sensitive to its initial conditions: the system is not predictable because even a tiny input can have huge effects. This is called “the butterfly effect” in the popular literature, and the analogy is that the flapping of a butterfly’s wings can eventually alter the path of a hurricane.
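The butterfly effect is easy to see numerically. Here is a minimal Python sketch using the logistic map, a standard textbook chaotic system (my example, not one from the column): two starting values that differ by one part in a billion soon disagree completely.

```python
# Sensitivity to initial conditions, illustrated with the logistic map
# x_{n+1} = r * x_n * (1 - x_n), which is chaotic at r = 4.

def logistic_orbit(x, r=4.0, steps=50):
    """Return the first `steps` iterates of the logistic map from x."""
    orbit = []
    for _ in range(steps):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

a = logistic_orbit(0.3)          # one initial condition
b = logistic_orbit(0.3 + 1e-9)   # perturbed by one part in a billion
gap = max(abs(p - q) for p, q in zip(a, b))
print(f"largest gap between the two trajectories: {gap:.3f}")
```

The tiny perturbation roughly doubles with each iteration, so within a few dozen steps the two trajectories are as far apart as the system allows – exactly the unpredictability the butterfly analogy describes.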
A catastrophe marks a sudden change in a system that has been working smoothly. The analogy for this is “the straw that breaks the camel’s back” – or the more modern version: “Viral marketing does not work – tell everyone you know!”
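The contrast with chaos can be sketched in code: a catastrophe is a response that stays smooth right up to a threshold and then jumps. The threshold and response function below are invented purely for illustration of the proverbial camel.

```python
# "The straw that breaks the camel's back": the load grows smoothly,
# but at a (made-up, illustrative) threshold the response is a sudden
# qualitative jump, not just a steeper slope.

BREAKING_POINT = 1000          # straws the camel can carry (hypothetical)

def camel_sag(straws):
    """Sag grows gently below the threshold, then the system fails outright."""
    if straws <= BREAKING_POINT:
        return 0.001 * straws  # smooth, nearly linear response
    return float("inf")        # collapse: a discontinuity in the output

print(camel_sag(999))          # still smooth
print(camel_sag(1001))         # catastrophe
```

Note the difference from the chaotic case: below the threshold this system is perfectly predictable; the danger is the one discontinuous step.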
Vision is seeking a future rather than being thrown around by either chaos or catastrophe. You have a worldview and a general goal.
Uncertainty: You do not know the future – or even the present situation. This is what a chaotic system does to you. Realize that groping around in the right direction is better than marching in the wrong direction with great certainty. This is where understanding comes in: it does not have to be absolute, but it does have to be shared. Expect your understanding to change with more information, and do not expect to predict the future. “There is a world market for about five computers.” (Tom Watson, Chairman of IBM, 1943.)
Complexity: How many inputs are there and how are they related? One of the laws of systems theory is that complex systems evolve from simpler systems that were successful.
Clarity can come from removing the complexity from a model of the situation.
“Entia non sunt multiplicanda praeter necessitatem.” (“Entities should not be multiplied beyond their necessity.”) William of Ockham (c.1280-1349)
In other words, no more things should be presumed to exist than are absolutely necessary.
If you don’t like Occam’s Razor, I give you Occam’s eraser: the philosophical principle that even the simplest solution is bound to have something wrong with it.
Ambiguity: You can explain the current situation in several different ways – and of course you are doing this after the fact. For example, right now the United States has one of the lowest violent crime rates in its history, even though the conventional wisdom says crime goes up during bad economic times. The data fit several competing explanations, and pundits are happy to supply all of them.
Networks are agile; hierarchies are rigid. Networks can share a vision in a community. Hierarchies follow Robert Anton Wilson’s Laws of Hierarchy (Celine’s laws). The second law is that communication does not flow honestly up or down in a hierarchy. Superiors tell subordinates only what they want them to know, to maintain power; subordinates tell superiors what the superiors want to hear, to gain favor. This is how Soviet Five-Year Plans could be reported as on or ahead of schedule until the last year, when they had total failures. Wilson’s second law maintains that true information can flow only horizontally, among units at the same level.
The fact is that VUCA gets worse, not better. You are dealing with unsolvable dilemmas; there is no recipe book. Instead, you have to find a lot of different people, keep a lot of communication flowing, and change direction quickly – all without losing your vision.
[1] Johansen, Bob (2007). Get There Early: Sensing the Future to Compete in the Present. San Francisco, CA: Berrett-Koehler Publishers. ISBN 978-1-57675-440-5.