In most of our organizations we have people doing data analysis all over the place. From very technical SQL queries and cubes to the more mundane spreadsheet-based number crunching, we have no shortage of data activity going on.
It would be too easy to say that the reason organizations fail to fully capitalize on this activity is that people just don’t know what they are doing. Certainly, sometimes they don’t, but people throughout an organization are usually putting together some clever stuff.
Not everybody is capable of building full-blown statistical and regression models, sure, but much value can be derived from simpler techniques.
Though the analysis itself is usually reasonably sound, it too often accomplishes very little. Why is this? It varies with each circumstance, but here are three common reasons that may be hiding in plain sight.
Hidden Reason #1: The Quality of Data is Insufficient for What You are Asking
It seems like everyone I talk to, when they hear I’m a data leadership advocate, responds with some variation of “Great, you should help us! Our data is terrible!”
Usually they are right, but that hasn’t stopped them from continuing to blindly push forward with reporting and analytics efforts that they know cannot be supported by the data they are using. To be fair, a general sentiment like “Our data stinks” is not so actionable, but at least the problem is already understood!
It is the solution that remains elusive.
The answer for improving data quality begins with situational awareness. If the data is bad, we must understand where and how it is insufficient to support the analyses we want to perform. We can adopt and apply a scoring methodology and then determine a score and an action plan to remediate the deficiencies.
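To make the scoring idea concrete, here is a minimal sketch in Python. The dimension names (completeness, validity), the validation rules, and the sample records are all illustrative assumptions, not a prescribed methodology; real scoring frameworks typically cover more dimensions, weight them, and roll them up per dataset.

```python
# Illustrative data-quality scoring sketch (dimension names are assumptions).

def completeness(records, field):
    """Fraction of records where the field is present and non-empty."""
    return sum(1 for r in records if r.get(field)) / len(records)

def validity(records, field, is_valid):
    """Fraction of records whose field value passes a validation rule."""
    return sum(1 for r in records if is_valid(r.get(field))) / len(records)

# Hypothetical customer records with deliberate quality problems.
customers = [
    {"email": "a@example.com", "age": 34},
    {"email": "", "age": 29},            # missing email
    {"email": "b@example", "age": -5},   # malformed email, impossible age
]

score = {
    "email_completeness": completeness(customers, "email"),
    "email_validity": validity(
        customers, "email",
        lambda v: bool(v) and "@" in v and "." in v.split("@")[-1],
    ),
    "age_validity": validity(
        customers, "age",
        lambda v: isinstance(v, int) and 0 <= v <= 120,
    ),
}
```

A scorecard like this gives the "situational awareness" the text describes: instead of "our data stinks," you can say "email is only two-thirds complete, and one-third of the values that do exist are malformed," which is something a remediation plan can act on.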
It is worth noting that the best time to remediate data quality deficiencies is when the data is initially created. While data quality problems can be introduced later in the data lifecycle, if we can get it right at the start, we have a much better chance of keeping data reliable for the widest array of later uses.
Hidden Reason #2: The Analysis Effort was Conducted for a Preferred Outcome
One of my favorite quotes from business school is: “With enough analysis, every business idea looks bad.” This is a good reminder that while data is the closest thing we have to the objective truth in our organizations, it must still pass through the subjective eyes of people.
A simplified way of performing basic analysis is to identify a hypothesis and then design a data-driven test to determine whether the observed result is unlikely to be the product of pure chance. This isn't exactly simple, and it is likely not what most people are doing in real life.
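One way to make "better than pure chance" concrete is a permutation test: shuffle the group labels many times and see how often chance alone produces a difference as large as the one observed. This sketch uses hypothetical conversion outcomes for two page variants; the data and names are invented for illustration.

```python
import random

def permutation_test(a, b, n_iter=10_000, seed=42):
    """One-sided permutation test: the fraction of random relabelings
    that produce a difference in means at least as large as observed."""
    rng = random.Random(seed)
    observed = sum(b) / len(b) - sum(a) / len(a)
    pooled = a + b
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        perm_a, perm_b = pooled[:len(a)], pooled[len(a):]
        diff = sum(perm_b) / len(perm_b) - sum(perm_a) / len(perm_a)
        if diff >= observed:
            count += 1
    return count / n_iter

# Hypothetical outcomes (1 = converted) for two page variants.
control = [1] * 30 + [0] * 70   # 30% conversion
variant = [1] * 45 + [0] * 55   # 45% conversion
p = permutation_test(control, variant)
```

If `p` is small (conventionally below 0.05), the observed lift is unlikely to be pure chance. The key discipline is deciding the hypothesis and the test *before* looking at the result, which is exactly what the next paragraph says people tend to skip.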
More often than not, people look to data to validate that their chosen hypotheses are correct. It may seem like this is the same as above, but it is like the fallacy of observing a flock of white birds and concluding that all birds are white. People looking to prove themselves right will tend to pluck out the most convincing evidence they can find and throw it into their slide deck as reinforcement that the conclusions they had all along are correct!
This leads to highly biased conclusions that may overlook more nuanced insights that exist within the data. I believe this reason happens mostly because people are kind of lazy. It feels like less work to prove ourselves right than to dig into a bunch of data with a high likelihood of having no discernible pattern emerge from all our effort. This also tracks with a more qualitative observation: not having an opinion, or (gasp) saying "I don't know," seems to be considered a sign of weakness.
The challenge all this points to is one of motivation. Our folks that are doing data analysis need to have a purpose, but if that purpose is too one-sided, they will gravitate towards the path of least resistance. It is better to use data analytics for research purposes and in unbiased development of hypotheses.
We are generally less prone to big mistakes if we use basic analytics to find the interesting questions, and then do more advanced analytics to validate conclusions. At the very least, as data consumers we must be ready to challenge the underlying methodology when data-driven conclusions are being presented in an overly one-sided way.
Hidden Reason #3: The Analysis Leads to No Change in Business Processes
Even if our data can support the intended uses, and if the analysis is as objective as it can possibly be, it still may not matter. This is because data analytics, at its best, gives us something interesting that we can use to make our business better. It does not do us any good if we simply look at a report and say “Hmm, that’s super-interesting!” and then return to doing the exact same thing we would have done before our fleeting moment of data enlightenment.
“Interesting” is worthless. We need to be shooting for “actionable.”
Analysis needs to inform business process improvement. It’s through business processes that any of our efforts (data or otherwise) become valuable. Furthermore, they should be measurable and tracked to revenue, cost, and risk impacts for our organization. This is the underlying premise of data value and data leadership!
We need to connect our meaningful business activities to our interesting data insights. That's where things matter, and it is true of the data analysis our folks are doing throughout the organization, and of the data governance and data management functions being performed by data professionals. In fact, it is true of everybody with a job to do!
What matters is our contribution to revenue, cost, and risk. It may be indirect, or it may manifest over the longer term, but it is always there. We must own this truth, forever pushing towards measurable outcomes. When we do, we will be better equipped to overcome the hidden reasons we've discussed here, as well as anything else that is getting in our way.
And until next time, go make an impact!