The Data-Centric Revolution: Data-Centric and Model Driven

Model Driven Development

Model Driven seems to be enjoying a bit of an upsurge lately. Gartner has recently been hyping (is it fair to accuse the inventors of the hype curve of hyping something? or is it redundant?) what they call "low code / no code" environments.

Perhaps they are picking up on and reporting a trend, or perhaps they are creating one.

Model Driven Development has been around for a long time. To backfill what it is and where it came from, I'm going to recount my experience with Model Driven, as I think it provides a first-person narrative of most of what was happening in the field at the time.

I first encountered what would later be called Model Driven in the early 80’s when CAD (Computer Aided Design—of buildings and manufactured parts) was making software developers jealous.  Why didn’t we have workbenches where we could generate systems from designs?  Early experiments coalesced into CASE (Computer Aided Software Engineering).  I was running a custom ERP development project in the early 80’s (on an ICL Mainframe!) and we ended up building our own CASE platform.  The interesting thing about that platform was that we built the designs on recently acquired 8-bit microcomputers, which we then pushed to a compatible framework on the mainframe.  We were able to iterate our designs on the PCs, work out the logistical issues, and get a working prototype UI to review with the users before we committed to the build.

The framework built a scaffold of code based on the prototype and indicated where the custom code needed to go.  This forever changed my perspective on how systems could and should be built.

What we built was also being built at the same time by commercial vendors (we did this project in Papua New Guinea and were pretty out of the loop as to what was happening in mainstream circles).  When we came up for air, we discovered what we had built was being called "I-CASE" (Integrated Computer Aided Software Engineering), which referred to the integration of design with development (seemed like that was the idea all along).  I assume Gartner would call this approach "low code," as there was still application code to be written for the non-boilerplate functionality.

Next stop on my journey through model driven was another custom ERP build.  By the late 80's a few new trends had emerged.  One was that CAD was being invaded by parametric modeling.  Parametric modeling recognizes that many designs of physical products do not need to be redesigned by a human every time a small change is made to the input factors.  A motor mount could be designed in such a way that a change to the weight, position, or torque would drive a new design optimized for those new factors. The trusses for a basketball court could be automatically redesigned if the span, weight, or snow load changed, and the design of big box retail outlets could be derived from, among other things, wind shear, maximum rainfall, and seismic potential.

The other trend was AI (remember AI?  Oh yeah, of course you remember AI, which you forgot about from the early 90’s until Watson and Google’s renaissance of AI).

Being privy to these two trends, we decided to build a parametric model of applications and have the code generation be driven by AI.  Our goal was to be able to design a use case on a Post-it note. We didn't quite achieve it; most of our designs were up to a page long.  But this was a big improvement over what was on offer at the time.  We managed to generate 97% of the code in this very sophisticated ERP system.  While it was not a very big company, I have yet to encounter more complex requirements in any system since (lot-based inventory, multi-modal outbound logistics, a full ISO 9000-compliant laboratory information management system, in-line QA, and complex real-time product disposition based on the physical and chemical characteristics of each lot).

In the mid 90's we were working on systems for ambulatory health care, building semantic models for our domain.  Instead of parametric modeling, we defined all application behavior in a scripting language called Tcl. One day we drew on a whiteboard where all the Tcl scripts fit in the architecture (they defined the UI, the constraint logic, the schema, etc.). It occurred to us that with the right architecture, the Tcl code, and therefore the behavior of the application, could be reduced to data.  The architecture would interpret the data and create the equivalent of application behavior.
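The core idea can be sketched in a few lines of modern Python (the original system used Tcl; the entity, field names, and rules below are hypothetical, purely for illustration). The "application" is a plain data structure; a generic engine interprets it at runtime to produce behavior, here constraint checking:

```python
# Sketch of behavior-as-data: a generic engine interprets a declarative
# model instead of running hand-written, application-specific code.
# Entity and field names are illustrative, not from the original system.

MODEL = {
    "Patient": {
        "fields": {
            "name": {"type": str, "required": True},
            "age":  {"type": int, "min": 0, "max": 130},
        }
    }
}

def validate(entity: str, record: dict) -> list:
    """Interpret the model data to enforce constraints at runtime."""
    errors = []
    for field, rules in MODEL[entity]["fields"].items():
        value = record.get(field)
        if rules.get("required") and value is None:
            errors.append(f"{field} is required")
            continue
        if value is None:
            continue
        if not isinstance(value, rules["type"]):
            errors.append(f"{field} must be {rules['type'].__name__}")
        elif "min" in rules and value < rules["min"]:
            errors.append(f"{field} below minimum {rules['min']}")
        elif "max" in rules and value > rules["max"]:
            errors.append(f"{field} above maximum {rules['max']}")
    return errors

print(validate("Patient", {"name": "Ada", "age": 36}))  # no errors: []
print(validate("Patient", {"age": -1}))                 # missing name, bad age
```

Changing the application means editing `MODEL`, not rewriting code; the same data could just as easily drive form generation or schema creation, which is what the architecture described above did.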

We received what I believe to be the original patents on fully model driven application development (patent number 6,324,682).  We eventually built an architecture that would interpret the data models and build user interfaces, constraints, transactions, and even schemas.  We built several healthcare applications on this architecture and were rolling out many more when our need for capital and the collapse of the dot-com bubble ended the company.

I offer this up as a personal history of the “low code / no code” movement.  It is not only real, as far as we are concerned, but its value is underrepresented in the hype.

Data-Centric Architecture

More recently we have become attracted to the opportunity that lies in helping companies become data centric.  This data-centric focus has mostly come from our work with semantics and enterprise ontology development.

What we discovered is that when an enterprise embraces the elegant core model that drives their business, all their problems become tractable.  Integration becomes a matter of conforming to the core.  New system development becomes building to a much, much simpler core model.

Most of these benefits come without embracing model driven.  There is amazing economy in reducing the size of your enterprise data model by two orders of magnitude.

Data-Centric and Model Driven

As I started interviewing people who had achieved success with data-centric, I noticed many of them had also implemented model driven.  This seemed like more than a coincidence.  Standard & Poor's Market Intelligence has been data-centric since 2000.  Within a few years of adopting a single shared logical model, they began building model driven user interface forms, and this has continued to the present. A large percentage of their internal-facing data entry forms are code free ("no code").  The customer-facing part of their system has much less data entry and is mostly complex data presentation, where they use a more hybrid approach ("low code").

This paper explores what the connection is.  As far as I can tell, data-centric does not imply model driven, and model driven does not imply data-centric.

So, why do they coexist at much higher than coincidental rates?

My Theory

I don’t know for sure.  Here is my theory.

I think data-centric is driving model driven more than vice versa. Although I admit it could be the other way around and it might even be a coincidence.

When someone pushes data-centric, they are asking their constituents to adopt a shared model in place of the one they have a lot of application code committed to.  It is entirely possible that, at that point, they recognize that replacing a bunch of application-specific code with new code that conforms to the shared model may be a losing proposition.

At this point they may realize that the carrot in this equation is not writing yet more code dependent on the legacy model.  However, as soon as you come to this conclusion, you realize that committing to a new shared model, while better in principle, is not immediately better in implementation.

What I have seen is that many people who offered up the shared data-centric model as superior needed to give people a reason to adopt it.  This is where I think the model driven approaches sprang up.  While a given application owner might be reluctant to buy into a shared model because of the massive rewriting of their existing code, if they could be led to believe that a new system might be derivable from the inputs, they might come on board.

The other possibility is that once you have all the metadata needed to define a system in a central repository, it is natural to look at that asset and say, "this is almost enough to fully define at least a user interface, if not a whole application."

We have recently become fans of the W3C SHACL (Shapes Constraint Language) standard. SHACL's primary mission was to add a constraint layer to a semantic model or a triple store, but as adopters of data-centric have found, when you have enough metadata to define constraints unambiguously in data, you are very close to being able to dynamically define applications.
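To make this concrete, here is a small illustrative SHACL shape in Turtle (the `ex:` class and property names are hypothetical). The same declarations a validator uses to reject bad data carry enough information for an engine to render a form field; SHACL even includes non-validating annotation properties such as `sh:name` intended for display purposes:

```turtle
# Illustrative only: a hypothetical shape constraining a person's family name.
# A SHACL validator enforces these triples; a model-driven UI engine can read
# the same triples to render a required, length-limited text input.
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.com/> .

ex:PersonShape
    a sh:NodeShape ;
    sh:targetClass ex:Person ;
    sh:property [
        sh:path ex:familyName ;
        sh:datatype xsd:string ;
        sh:minCount 1 ;          # required field
        sh:maxLength 80 ;        # input length limit
        sh:name "Family name" ;  # display label for UIs
    ] .
```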


Whichever way the causality arrow points, there seems to be a strong connection between data-centric and model driven development.  Each empowers the other in a virtuous cycle.  As you become more data-centric, the scope and benefit of model driven becomes greater.  As you become model driven, the premium on very high quality models goes up, and as it does, it becomes more obvious that putting effort into a shared model will have a much greater payoff.


Dave McComb

Dave McComb is President and co-founder of Semantic Arts, Inc., a Fort Collins, Colorado based consulting firm. Since its founding in 2000, Semantic Arts has remained 100% focused on helping clients adopt Data-Centric principles and implement Enterprise Knowledge Graphs. Semantic Arts has 30 employees in three countries and is currently helping ten highly visible Fortune 500 companies adopt Data-Centric. Semantic Arts was named a Colorado Company to Watch in 2022. Dave is the author of Semantics in Business Systems, Software Wasteland (bulk purchases available through Technics), and The Data-Centric Revolution, which is also available for bulk purchase.
