Why Do Top-Down Big-Picture Modeling? Part 1

Models are only ever a means to an end. Sometimes car designs are modeled in clay – you can’t drive the models, but you can get valuable feedback on the visual impression of the proposed concept. Sometimes scale models of aircraft are tested in wind tunnels – you can’t fly them, but you can evaluate their aerodynamics. Models are cheaper to build than the real things, and design alternatives can be seriously debated without breaking the bank.

Likewise, data models are only ever a means to an end, but if they drive open discussion on design alternatives and contribute to better solutions that meet real business needs, then they will be highly valued. And, unlike clay models of cars or scale models of aircraft, given the right environment, you might be able to press what I call the “big green Go button” and turn the data model into a real software deliverable.

Data modelling used to be seen by many as a strictly technical exercise, aimed at physical implementation. Increasingly, people refer to it as information modelling, and that’s all about the business. So here’s the warning – if data modelers can’t or won’t engage with the business to deliver value in a timely manner, at best they may be undervalued, and at worst shunned.

This paper argues that there are times when big-picture, top-down models are not only sufficient for today’s urgent needs, but in some cases preferable to a more detailed, rigorous bottom-up model.

This paper is the first in a trilogy (and is itself split into two – you’re reading the first segment). It argues “why” top-down models should be considered. A second companion set of materials articulates “how” to develop such models. But any model is useless shelf ware if it’s not applied. One of the most common applications of these models is the design of a Data Warehouse, and this is the topic for the third member of the trilogy, specifically using a Data Vault as the platform for discussion.

However, it is important to note that top-down models can deliver tangible business value across a variety of other scenarios, such as formulation of enterprise strategies, delivery of MDM solutions, design of a service-oriented architecture, and more. Those interested in just Data Vaults will hopefully enjoy the entire trilogy; others are encouraged to read and apply the first two parts, and then maybe add their own paper on how they applied top-down models to their particular area of interest.


Throwing Down the Gauntlet

There’s a debate that surfaced recently in Australia about doctors and the way they should communicate with patients who have gained a bit too much weight. One side argues that if someone is technically in the category of “obese,” they should call it out – a confronting label might shock the patient into change. The other side argues that the patient probably already knows that their weight is a risk to their health, and a gentler approach may help them towards a healthier lifestyle.

I’m not a health practitioner, so I will leave that debate to the experts. But I am observing concerns within the “data” world, and it seems the experts are worried too. Taking a leaf from the medical example above, maybe (1) the problems need to be called out, and (2) a gentle message of hope needs to be shared as well.

Have you ever seen times when the business wants a high-level perspective of its technology assets so it can make a quick and informed decision, but the old-school data professionals can’t or won’t give a timely answer until the last precise and agonising details have been nailed down? I’m a “data” bloke myself, so I get that there are times when the devil is in the details. But I’m also aware that my survival as a valued professional depends on delivering value to the business when they need it.

Facing the Problem: Bottom-Up vs. Top-Down Thinking

First, an analogy. Two people want a “water” expert to help them. One person has a dripping tap and wants a plumber. The other is a town planner who wants to design a new city and needs the big picture for essential infrastructure. They both need water experts, but with a different focus.

Now let’s look at our “data” experts. Perhaps the problem is a report whose bottom-line figures look dodgy. We need a detail-focused person to nut out how we arrived at the numbers. We absolutely need hands-on data people, and always will. Just like we will always need plumbers to fix dripping taps.

Now let’s look at the big-picture business scenario. The executives are tossing around strategic direction alternatives. For the focus of this paper, let’s assume some specifics of managing “data” need to be understood. Where are we today? What might tomorrow look like? And how might we get to tomorrow from today?

It’s time to be blunt. We data professionals can either contribute to the high-level (and often imprecise) think-fest in the boardrooms, or the business people will discount us. Scott Ambler was a valued speaker at Dan Linstedt’s Data Vault conference in 2017, and told the attendees that many managers simply don’t want to even try to engage with data folk. Why? Because we are so consumed by technical details that we aren’t servicing business needs. So they do the best they can without us.

We need to lift our game. One honourable reason for changing our behaviour might be to deliver value to the business because we want them to succeed. One selfish reason might be that we want to keep our jobs, and the best way is to be seen to deliver value. I am not going to even try to judge the motives of others. What we can agree on is that everyone wants the data people to deliver. Fast. And well.

This sounds great, but can it actually be done?

Can Top-Down Actually Work?

Land Titles Registrar

This story is spectacular. A government body responsible for registration of land titles had a problem with how they managed the recording of data. In fact, they had recognised the problem for a century – yes, 100 years. Of course the earlier records were not kept on computers, but “computerisation” hadn’t solved the underlying problem. It was sufficiently serious that they dedicated a team to solving it. Three-and-a-half years later they gave up.

Yours truly, taking a top-down pattern-based look at their concepts, cracked it. How? A willingness to ask the dumb questions, combined with the helicopter view facilitated by the proven, flexible, and “universally applicable” data model patterns as published by luminaries such as David Hay and Len Silverston. More on data model patterns later, but the resultant framework looked something like the following:

[Diagram: high-level concept framework – parties and roles, agreements, and assets at locations]

A quick explanation of the diagram is that vendors, purchasers, solicitors, and the like are represented as parties fulfilling roles; things like loans and the sales contract are represented as agreements; and the property that’s changing hands is an asset at a location. That’s a gross over-simplification, and each icon needs drill-down into its details. But it gives a context to start a discussion.

Long story short: even though the word “land” appeared in the title of their organisation, and in many of their documents, I suggested that their business was really about “rights,” not land – the rights of owners, the rights of tenants, the rights of people to access the property on registered easements, the rights of mining companies to perform exploration, and so on. The rights were assets as much as the land itself. Problem solved.
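To make that framework a little more concrete, here is a minimal sketch of the party / role / agreement / asset pattern, with a right treated as an asset in its own right. It is purely illustrative – the class and attribute names are my own inventions for this paper, not the registrar’s actual model, and a real implementation would carry far more detail.

```python
# A minimal, illustrative sketch of the party / role / agreement / asset pattern.
# All names here are invented for the example; they are not the registrar's model.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Party:
    """A person or organisation: vendor, purchaser, solicitor, bank, and so on."""
    name: str


@dataclass
class Role:
    """A party playing a particular part in some agreement."""
    party: Party
    role_type: str              # e.g. "Vendor", "Purchaser", "Solicitor"


@dataclass
class Asset:
    """Anything of value at a location - and a right is as much an asset as the land itself."""
    description: str
    asset_type: str             # e.g. "Land parcel", "Easement right", "Exploration right"
    location: str = ""


@dataclass
class Agreement:
    """A loan, a sales contract, a lease... linking roles to the assets they concern."""
    agreement_type: str
    roles: List[Role] = field(default_factory=list)
    assets: List[Asset] = field(default_factory=list)


# Example: the "asset" changing hands is a right over land, not the land parcel itself.
owner = Party("A. Landholder")
miner = Party("Deep Shaft Mining Pty Ltd")
exploration_right = Asset("Exploration over Lot 42", "Exploration right", "Lot 42, Somewhere")
contract = Agreement(
    agreement_type="Sale of rights",
    roles=[Role(owner, "Vendor"), Role(miner, "Purchaser")],
    assets=[exploration_right],
)
```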

Reengineering a Bank

An Aussie second-tier bank was spending a lot of money on a six-month business process reengineering exercise, but had ground to a halt. They concluded that one particular problem was a data problem, not a process problem. I was called in. They wanted an enterprise data model showing high-level business concepts, with some drill-down in one problem area. And they wanted it delivered in a day. Impossible? The simple answer is that I did deliver, again by using patterns. Thanks, David and Len – you saved my neck.

Detailed? Heck, no. Sufficient? Absolutely. Well, enough for now. Yes, they needed more details later, but for now they were making progress again. Maybe it’s a bit like the town planner needing a plan for water infrastructure services. Way down the track someone will have to decide if the Jones household wants two taps in the kitchen, or one mixer tap. But we don’t need those details to get the town planning exercise moving.

Data (& Processes) Integration For a Not-For-Profit

I did some top-down enterprise data modelling for a not-for-profit. It took 3 weeks to develop an enterprise data model (using “patterns”, of course).

They had existing business processes, many of which centred on care for the vulnerable in our community. The processes were specialised. There were processes for helping people with drug and alcohol problems, the unemployed, victims of domestic violence, the homeless, and so on. The participants in a one-day workshop came in pretty defensive. They were specialists; they didn’t want to be bunched together.

I tackled the tension at the very outset by writing a slogan on the whiteboard: “Let’s leverage off what is common, and respect what is distinct.”

At the end of the session one of the participants came up to me and said that she had initially assumed there was 20% in common, and 80% distinct. At the end of the day she concluded it was 80/20, but flipped – 80% in common, 20% distinct. Another participant’s comment summed up the feeling in the room: “You know what? We’re all just care workers.”
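As an illustration of “leverage what is common, respect what is distinct,” here is a minimal sketch of how a shared core and small specialised extensions might be separated. Again, the class names are hypothetical, invented for this paper rather than taken from that organisation’s actual model.

```python
# A minimal sketch: one shared care-case core, with small specialised extensions.
# The names are hypothetical; they are not the organisation's actual model.
from dataclasses import dataclass
from datetime import date
from typing import List


@dataclass
class CareCase:
    """The large common core shared by every care programme."""
    client_name: str
    case_worker: str
    opened_on: date
    status: str = "Open"


@dataclass
class HousingSupportCase(CareCase):
    """The small slice that is distinct to homelessness support."""
    current_accommodation: str = "Unknown"


@dataclass
class DrugAndAlcoholCase(CareCase):
    """The small slice that is distinct to drug and alcohol support."""
    treatment_plan: str = ""


# Coordinated case management can work across the common core, whatever the specialism.
cases: List[CareCase] = [
    HousingSupportCase("Client A", "Worker 1", date(2018, 5, 1), current_accommodation="Shelter"),
    DrugAndAlcoholCase("Client B", "Worker 2", date(2018, 5, 2), treatment_plan="Weekly counselling"),
]
for case in cases:
    print(case.client_name, case.case_worker, case.status)
```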

A top-down data view of the business concepts led to a massive reorganisation of business processes, with coordinated case management across not just this organisation, but associated not-for-profit and government care providers. One could argue that the outcome was obvious, but it hadn’t been obvious in the 150 years they had been operating! A high-level enterprise data model gave a break-through perspective.

I was asked to join their monthly board meetings, and did so for years afterwards. The data work had delivered business value.

Can Bottom-Up Fall Short?

So top-down can add value, but can bottom-up sometimes fail us?

Velocity on an Agile Project

The first of my two stories here came to me from a recent work colleague. He was on an agile project. The team didn’t model the data; they just started coding and building tables as required. Their velocity got progressively worse. The table designs were constructed bottom-up, looking only at the scope of the current sprint, and there was increasing rework in subsequent sprints. Rework (in the form of refactoring) delivers incremental improvement on some agile projects, and that’s great. In this case, though, the rework was a reflection of really bad design.

The scrum master made a bold call. He got commitment from the product owner to use one entire sprint to build the big-picture, town-plan data model. OK, they “lost” one sprint, but the measured velocity was 12 times faster in subsequent sprints.

The cost to that organisation of initially doing bottom-up construction was slow velocity until they wised up and did a top-down model. The next story had much more serious consequences coming out of bottom-up practices.

Surely a Tiny Application Doesn’t Need Big-Picture Thinking?

Graham Witt, an internationally recognised Aussie data modeller, told a story that from memory goes something like this. There was a junior developer, responsible for delivery of a system for a school. It included a register of the people involved: teachers, students, and parents/guardians. That doesn’t sound too hard. The developer figured there was no need to do a model and crosscheck it with the business.

Using a bottom-up approach based on a limited number of samples, he observed that each student had a family name, and the student, mum, and dad shared that one name. So to save space, why not record the family name just once? What a genius! It worked fine until he encountered a family where members had different surnames.

There’s a not-too-subtle message here, and I will come back to it when we look at “fact-based modeling”. Here’s the point: When you start bottom-up with explicit examples, they may not represent the general cases. Alternatively, when you start with proven data model patterns (think of David Hay and Len Silverston), they, by their very nature, represent designs that are typically widely applicable and exemplify what you are likely to encounter. Such as family members with different surnames!

Before we leave this school system scenario, it had one more terrible outcome. The developer also assumed that all family members lived at the same address. Things got ugly for a wife who was living in a women’s refuge for protection from a violent husband. The addresses of these facilities are closely guarded – until the IT system behind this story exposed the refuge address to the husband, based on the built-in assumption that they all lived together. The seriousness hit home when the violent man exclaimed, “Ah, now I know where the so-&-so lives.”
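For what it is worth, a sketch of the safer design follows: family name and residential address belong to each person, and an address can be suppressed outright. This is my own hypothetical reconstruction of the lesson, not the actual school system, and all names in it are invented.

```python
# A minimal sketch of the safer design: names and addresses belong to each person,
# not to the family record, and an address can be withheld entirely.
# All names are hypothetical, invented for this example.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Person:
    given_name: str
    family_name: str                            # each person carries their own surname
    residential_address: Optional[str] = None   # each person has their own address...
    address_suppressed: bool = False            # ...which may be protected (e.g. a refuge)


@dataclass
class FamilyMembership:
    """Links a person to a school family group without assuming shared names or addresses."""
    person: Person
    relationship: str                           # e.g. "Student", "Mother", "Guardian"


@dataclass
class Family:
    family_id: str
    members: List[FamilyMembership] = field(default_factory=list)


# The convenient samples that drove the original design still fit...
smiths = Family("F001", [
    FamilyMembership(Person("Jan", "Smith", "1 Example St"), "Mother"),
    FamilyMembership(Person("Sam", "Smith", "1 Example St"), "Student"),
])

# ...and so do the general cases: different surnames, different and protected addresses.
blended = Family("F002", [
    FamilyMembership(Person("Kim", "Nguyen", None, address_suppressed=True), "Mother"),
    FamilyMembership(Person("Alex", "Jones", "2 Example Rd"), "Student"),
])
```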

Let’s Be Fair

A Palette of Choices

Diversity in humans makes life interesting, and data professionals are certainly a mixed bunch.

Some have a preference for top-down modeling. They start with the concepts, and worry about the details later. Hey, if you get the framework right, success flows from there. Right?

But their opponents often quote that “the devil’s in the details”. Soft, fuzzy concepts may be all well and good for a while, but what if they fall apart when you hit the details of reality?

Those who belong to the bottom-up school of thought like to start with the hard details. After all, they represent the facts, don’t they? Surely the reality is in the hard data, not in the soft concepts. No abstractions here, please.

But their top-down opponents don’t quietly sit on the sidelines either. One of their many criticisms is that these bottom-up practitioners can’t see the wood for the trees. There is so much detail that the overarching, unifying constructs are lost, and the result is unnecessary complexity. That’s if we ever get to a result!

So how do we achieve a balance? One solution is to take the view that it doesn’t matter where you start (top-down or bottom-up), as long as you’ve incorporated the other view before you claim to have finished. That’s a view I endorse as a general principle. But a couple of caveats on this non-confrontational approach follow.

Firstly, there are some scenarios where only one approach may be needed. Be aware of that possibility, and be comfortable making that conscious choice.

And secondly, while there are cases where both the big-picture and the detailed thinking are required, sometimes the starting point makes a difference in the speed of delivery and the quality of the result.

OK, I Hand-Picked the Examples

The real-life stories I’ve shared paint success stories arising from top-down big-picture thinking, and warn of possible failures if you do only bottom-up detailed modeling. You may well be able to counter my stories with failures for people with their heads in the clouds, and success stories for those who do the hard yards grappling with grubby details! I accept that.

My message is simple. If you do only top-down, or only bottom-up, it may end in tears. It’s safer to address both ends of the spectrum. And I suggest that where you start does have an impact, at least in certain cases. In the next segment of this paper, we will look at some scenarios where starting top-down may work out much better, shine a spotlight on reasons why some people may be missing out on these benefits, and take a first look at a way to move from resistance to adoption of better ways of working.

Acknowledgments

A number of the ideas in this paper have resulted from years of interaction with many great people. One recent thought leader who has sharpened these ideas is Rob Barnard, and it is with gratitude that I acknowledge his contribution.


John Giles

John Giles is an independent consultant, focusing on information architecture, but with a passion for seeing ideas taken to fruition. He has worked in IT since the late 1960s, across many industries. He is a Fellow in the Australian Computer Society, and completed a Master’s degree at RMIT University, with a minor thesis comparing computational rules implementations using traditional and object-oriented platforms. He is the author of “The Nimble Elephant: Agile Delivery of Data Models Using a Pattern-based Approach”.
