Data first, programs as guests

Some ideas don’t arrive suddenly. They form slowly, through repetition and exposure, until one day they become visible. Over the years, I’ve noticed that the systems I’ve always felt most at home in share a specific trait. It took me a long time to see the bigger picture and name it clearly.

In those systems, data is the center of gravity, not programs.

Programs are transient. They come and go, evolve, get rewritten, replaced, or even discarded. Data, by contrast, is persistent, structured, and sovereign. The environment exists to protect it, give it shape, and make it accessible over time. Code is something you attach to data to perform a task, not the thing that defines reality.

This distinction may sound abstract, but it has very concrete consequences.

When data is primary, copying data is enough. Backup, restore, migration, and disaster recovery are not exceptional procedures or complex rituals. They are implicit properties of the system. If data is well structured and self-describing, copying it is preservation. Programs can be redeployed or rewritten, but the meaning remains.

Programs also become simpler in this model. They don’t need to reinvent structure, integrity, or semantics, because those already live in the environment and in the data itself. They do one thing, assume the data is real, and then step out of the way.

This only works if formats and protocols are open, documented, and stable. When the structure of data is visible and well defined, programs become interchangeable. You can replace them, rewrite them, or discard them entirely without losing meaning. The interface remains. The data remains. Programs are free to evolve because reality no longer lives inside them.

I first encountered this way of thinking very early, with dBase.

For readers who didn’t grow up around it, dBase was not just a database program. It was an environment. DBF files were the world. Screens, reports, and bits of code orbited around them. You could copy a floppy disk and have the entire system: data, structure, and logic together. The data format was inspectable, documented, and portable. Its structure was not hidden inside the program, which meant the data could outlive any single implementation. Long before SQLite and other embedded databases became common, you could move data simply by copying files.
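That inspectability is easy to show. The sketch below (file name hypothetical) reads nothing but the documented DBF header and field descriptors, which is enough to recover a table’s structure without dBase itself:

```python
import struct

def inspect_dbf(path):
    """List the fields of a DBF table using only its documented header."""
    with open(path, "rb") as f:
        header = f.read(32)
        # Offset 4: record count (uint32), 8: header size, 10: record size.
        n_records, header_len, record_len = struct.unpack_from("<IHH", header, 4)
        print(f"{n_records} records of {record_len} bytes (header: {header_len} bytes)")

        # Field descriptors follow, 32 bytes apiece, until a 0x0D terminator.
        while True:
            descriptor = f.read(32)
            if not descriptor or descriptor[0] == 0x0D:
                break
            name = descriptor[:11].split(b"\x00")[0].decode("ascii")
            ftype = chr(descriptor[11])  # C=character, N=numeric, D=date, L=logical
            print(f"  {name:<10} {ftype} {descriptor[16]}.{descriptor[17]}")

inspect_dbf("CUSTOMER.DBF")  # hypothetical file; any dBase III-style table works
```

The structure lives in the file, not in whichever program happens to read it.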

I’ve written more extensively about dBase in another post, so I won’t repeat the details here. What matters is the feeling it left behind: data had gravity. Programs were guests.

Looking back, that experience shaped more of my thinking than I realised at the time.

Around the same time, I was also deeply attracted to BBSes. Not just as a user, but as an environment. Menus, message bases, doors, files. Everything felt glued together, yet loosely coupled. The system had a shape. Interaction was mediated, intentional, and calm. Fidonet and similar networks reinforced this further: offline-first by necessity, resilient by design, tolerant of delay and imperfection.

I didn’t have the language for it then, but these were also data-centric worlds. Messages mattered more than programs. Structure mattered more than speed. Copying and forwarding data was the network.

Coming from that world, I never saw interoperability as optional. It existed because protocols were open, documented, and shared. Different systems, different software, different machines, yet they could all exchange meaning. The protocol was the contract. Programs could change, but the network remained. Data could move freely because its structure was not owned by any single implementation.

Years later, when I encountered IBM i, something immediately felt familiar, even though the context was completely different.

IBM i takes the idea of data gravity seriously at the operating system level. Files are first-class objects. Structure is enforced by the system, not improvised by applications. Programs operate within a shared, stable universe rather than defining their own private realities.

This is also why COBOL makes sense in that environment. Not because the language is superior in isolation, but because the environment already provides structure, integrity, and continuity. COBOL thrives when it can rely on that shared reality. More specifically, on IBM i data is defined externally, in physical files (PF); on IBM Z, it lives in VSAM files. On platforms where that gravity is missing, like Unix or Windows/MS-DOS, COBOL always felt slightly awkward and incomplete to me.

This is an important distinction for me: I’m not interested in ranking languages. I’m interested in environments.

This perspective also explains why I’m not hostile to modern technologies, even when I’m critical of how they’re often marketed.

I like containers. I use them. But not in the way the market tends to describe them.

I don’t see containers as applications, platforms, or scaling units. I see them as sealed software packages: a modern way to ship programs together with their runtime assumptions (a package manager of sorts) while keeping data external, persistent, and sovereign.

Used this way, containers fit perfectly into a data-centric worldview. Backups remain copying data. Disaster recovery remains restoring data and restarting known programs. Upgrading the base operating system becomes safer, because programs carry their dependencies with them. This isn’t a rejection of new tools, but an adoption of them along the same axis of responsibility that older systems embodied.
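To make that concrete, here is a minimal sketch using the Docker SDK for Python; the image name and paths are hypothetical. The program runs sealed inside the container, while its data stays in a plain host directory.

```python
import docker  # Docker SDK for Python: pip install docker

client = docker.from_env()

# The program ships sealed in the image; the data lives in an ordinary
# host directory, bind-mounted into the container.
client.containers.run(
    "registry.example.com/myapp:1.4",  # hypothetical image
    name="myapp",
    detach=True,
    restart_policy={"Name": "unless-stopped"},
    volumes={"/srv/myapp/data": {"bind": "/var/lib/myapp", "mode": "rw"}},
)

# Backup is copying /srv/myapp/data; disaster recovery is restoring that
# directory and re-running this same, known configuration.
```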

I’ve written in a previous post about responsible computing and the importance of restraint, sustainability, and minimizing waste. This way of using containers aligns with that thinking. It’s not about novelty or scale theatre. It’s about making systems that remain understandable and kind under stress.

I want to stress that this is not nostalgia.

Yes, some of the systems that embodied these values belong to the past. That part is gone, and that loss is what I grieve. But the value itself isn’t tied to a decade, a language, or a vendor. It’s an operating value.

It’s the belief that data should outlive programs.
That copying should be enough.
That recovery should be boring.
That systems should respect both resources and the people who run them.

Technology changes. Tools come and go.
But where meaning lives in a system is still a choice.

For me, meaning lives in data.
Programs come and go. The data remains.

2026-02-16