Before we called ourselves "developers," we had a different name.
The shift from Information Systems to Computer Science to Developer happened so gradually that most people don't remember there was a choice at all. It felt inevitable, like progress. But it was a choice: an economic one, made by universities, the industry, and the market. And for thirty years, it pulled us away from something fundamentally important.
In my experience, the best technical leaders I've worked with all think the same way: they start with the business problem, not the code. They ask what information needs to flow where, what decisions need support, what processes are broken. They think in systems. This isn't a coincidence. Many of them studied Information Systems, not Computer Science. They inherited a different tradition.
That tradition has a name and a history. And understanding where it came from explains both why AI is so important to our profession right now, and why so many of us feel like we're finally coming home.
The Systems Thinkers
Edward Yourdon wasn't a programmer first. He was a systems analyst. In the 1970s and 1980s, when he developed his structured analysis and design methods, his goal wasn't to optimize code. It was to model how businesses worked so that systems could be designed to support them. Data Flow Diagrams weren't beautiful abstractions. They were communication tools. A business analyst could draw the flows of information through a sales department, a finance team, a warehouse operation. Developers could read those diagrams and understand not just what to build, but why it mattered.
James Martin extended this thinking in the 1980s with Information Engineering. Again, the focus was crystal clear: information is the artifact. How does it flow? How is it transformed? How does it support decision-making? From those models, systems followed. The information came first, philosophically and practically.
In university programs across North America and Europe, Information Systems departments were serious places. They taught business process analysis, data modeling, systems thinking. They taught you that a computer system was fundamentally a system for processing information to support an organization's goals. The "computer" part was almost incidental to the larger purpose.
But here was the real problem with that era: the economics didn't work.
The gap between those beautiful Data Flow Diagrams and actual running code was enormous. CASE tools (Computer-Aided Software Engineering) and MDA (Model-Driven Architecture) promised to bridge it. Draw the model, generate the code. It almost never worked that way. The closer you got to implementation, the more reality diverged from the model. Changes to business processes meant updating models. Updates to models sometimes required rebuilding systems. The cost of maintaining that alignment grew faster than the value it provided.
By the 1990s, universities started the shift. Computer Science programs expanded. They taught algorithms, data structures, language design, optimization. Practical things. Things that directly connected to a job in software development. Information Systems programs contracted. Why study slow, expensive analysis when you could learn to code? The market spoke clearly: build faster, iterate constantly, ask forgiveness not permission.
The profession followed the money. By the early 2000s, "developer" had become the default identity. We stopped thinking of ourselves as analysts designing information systems and became craftspeople writing code. The business focus didn't disappear entirely—it became someone else's job. Product managers, business analysts, stakeholders. The development team's job was to take requirements and implement them. Fast.
The Brief Attempt to Remember
Eric Evans' Domain-Driven Design appeared in 2003, and if you know what to look for, it's a quiet cry to remember what we'd forgotten. DDD put the business domain front and center. Developers should understand the business model deeply. Systems should be organized around how the business thinks about its problems. Ubiquitous language meant that business terminology and code terminology should be the same thing.
It was, in essence, Information Systems thinking applied to the age of object-oriented programming.
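The idea is easy to see in miniature. Here is a hedged sketch, loosely echoing the cargo-shipping domain Evans uses as his book's running example; the specific names below are illustrative, not taken from any real codebase. The point is that the classes and methods carry the business's own vocabulary, so a domain expert can read the model directly.

```python
from dataclasses import dataclass

# Ubiquitous language in practice: "Cargo", "book", and "claim" are
# the shipping clerk's words, not technical inventions. A domain
# expert reading this code recognizes their own process.

@dataclass
class Cargo:
    tracking_id: str
    destination: str
    is_claimed: bool = False

class BookingService:
    """'Booking' is the business's term for accepting cargo for shipment."""

    def __init__(self) -> None:
        self._booked: dict[str, Cargo] = {}

    def book_cargo(self, cargo: Cargo) -> str:
        # "Book" mirrors how the business describes the act itself.
        self._booked[cargo.tracking_id] = cargo
        return cargo.tracking_id

    def claim(self, tracking_id: str) -> None:
        # "Claim" is what happens when the consignee collects the cargo.
        self._booked[tracking_id].is_claimed = True
```

When the code reads like this, a conversation with the business and a walk through the model use the same words, which is the whole point of a ubiquitous language.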
Ironically, DDD never became mainstream in the way it deserved. Why? Economics, again. Building systems the DDD way required significant upfront investment in understanding the business. It required developers who could think like business analysts. It required maintaining that alignment between domain models and code throughout the system's life. It worked beautifully when organizations could afford it. For everyone else, it felt like a luxury they couldn't sustain.
So we kept building as we had been. Fast, iterative, requirements-driven. And we got very good at it. Modern architectures, microservices, platform thinking—all genuine improvements. The pace of delivery accelerated. But something persistent remained: the disconnect between what the business needed and what we built. Not always catastrophic. But noticeable. Requirements misunderstood. Systems that technically worked but didn't quite fit. The need to constantly refactor because assumptions had changed.
The root cause was always the same: we weren't thinking like Information Systems professionals anymore. We were thinking like Computer Science professionals trying to solve business problems. Different discipline. Different tools. Different starting point.
What We Actually Lost
The distinction is worth stating clearly. Computer Science asks: How do we make computers do things efficiently? It's a rich, important question. Algorithms matter. Performance matters. Elegant solutions matter. We need people asking these questions.
Information Systems asks a different question: What should computers do to support this business? It's not less technical. It's differently focused. It starts with information flows, processes, decisions. It thinks in terms of systems behavior, not computational efficiency. Where data comes from. Where it needs to go. What transformations it undergoes. What decisions depend on it.
We didn't stop needing IS thinking. We just stopped calling it that, stopped teaching it systematically, stopped organizing the profession around it. And for decades, we paid the price in misaligned systems, failed implementations, and the constant feeling that technology wasn't quite solving the right problems.
What we lost wasn't nostalgia for SADT or CASE tools. Those belonged to their era, and modern tools are objectively better at handling complexity. What we lost was the starting point. We lost the assumption that understanding the business was part of your job. That systems thinking was core to software development. That being able to model information flows and processes was a professional skill, not a business analyst's task.
Coming Home with Better Tools
Here's what changed: AI and modern platform architectures finally made the IS approach economically viable again.
Consider what was always missing: the gap between beautiful models and running code, between business requirements and implementation. That gap exists because translating models into code is expensive. You need someone smart enough to understand both domains and translate between them. For thirty years, that person was scarce and expensive, so we hired fewer of them.
Now imagine that gap shrinking. Imagine AI that can participate in that translation. Business domain models expressed in multiple forms—textual descriptions, structured diagrams, event models. Code generators that understand domain language. Platform capabilities built explicitly around business concepts rather than technical primitives. The economics shift immediately.
Modern platforms (good ones, anyway) are operationalizing what Yourdon and Evans always understood: business models should be primary artifacts. Not documentation that gets out of sync with code. Not throwaway requirements written in Word documents. The actual structure of the system, visible and maintainable.
What was too expensive is becoming practical. The IS approach—starting with business information and processes, designing systems around them, maintaining that alignment—is becoming economically rational again.
Vindication, Not Nostalgia
This isn't nostalgia for the 1980s. The tools were genuinely limited. The speed was genuinely slow. The ability to handle complexity was genuinely constrained. Modern architecture is better. AI capabilities are revolutionary. Platform thinking is fundamentally sound.
But the philosophy, the starting point, was always right.
If you studied Information Systems, you weren't learning an outdated discipline. You were learning something that went out of fashion for economic reasons, not because it was wrong. And now those economic reasons are changing.
If you studied Computer Science and became a developer, you learned something genuinely valuable: the craft of making computers do things, efficiently and reliably. You still need that. You always will.
But both disciplines are needed. The CS tradition gave us speed and scale. The IS tradition was always asking the right question: what problem are we actually solving?
We're finally approaching an era where we can do both. Where thinking about information and processes isn't treated as overhead but as the foundation of what we build. Where "developer" might gradually expand to mean someone who understands both code and business context. Where AI handles some of the translation burden that made the IS approach too expensive.
Do you remember when we called it Information Systems? Do you remember the assumption that understanding your client's business—really understanding it—was part of the job?
What would your career have looked like if the economics had been different? If modeling business processes and information flows had remained as central to software development as it should have been?
We're finding out. And it's about time.