Developers need to lead the way in a business revolution

Developers, we are rapidly approaching a major inflection point (we may already even be there). The amount of code we write to pretend that legacy systems are not legacy is beyond too much.

The recent discussion about the programming paradigm needing an update spanned a number of topics (it is so dense that it could probably fill a couple of book chapters). One theme that kept coming up is that the current state of the art in business computing is horrid.

The rapid adoption of cheap multicore x64 CPUs in the server room and on the desktop means that parallel processing for CPU-intensive tasks is no longer the realm of the Big Iron developers. All of the mainstream development environments put too many layers of abstraction between the underlying data and the application itself, which indicates that either the current data storage methods or the current development environments are inadequate. Too much code lacks proper validation because developers rely purely on the data type, which is completely unrelated to the business logic. We're constantly choosing between a code generator that writes garbage code and missing deadlines because our languages require 5 LOC to write an accessor, and that is without any validation or data constraints.
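To make the accessor complaint concrete, here is a hypothetical sketch (the `Account` class and its nine-digit rule are invented for illustration): the classic getter/setter pair leans entirely on the declared type, so any `String` at all is accepted, and encoding the actual business rule takes still more hand-written code.

```java
// Hypothetical business object; the nine-digit rule is an assumed example.
public class Account {
    private String accountNumber;

    // The classic 5-LOC accessor pair: no validation, no constraints.
    // The type system knows "String"; the business knows "nine digits."
    public String getAccountNumber() {
        return accountNumber;
    }

    public void setAccountNumber(String accountNumber) {
        this.accountNumber = accountNumber;
    }

    // Actually enforcing the business rule means writing even more code.
    public void setAccountNumberValidated(String accountNumber) {
        if (accountNumber == null || !accountNumber.matches("\\d{9}")) {
            throw new IllegalArgumentException(
                "Account number must be exactly nine digits");
        }
        this.accountNumber = accountNumber;
    }
}
```

The bare setter happily stores `"not a number"`; nothing about the language pushes the developer toward the validated version.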

This is going to collapse under its own weight.

The more I work on interesting projects, the more the cracks show. The languages that are well suited to really advanced computing techniques are not well suited to business computing, for a variety of reasons. Some of the reasons are technical, some are not, and most of them are valid. After all, a language that supports lazy evaluation or set theory, or is based purely on lambda calculus, is not likely to be well suited for the serial data transformations that most business computing really is.

The issue comes back to something that Alan Kay first made me aware of in one of his papers: the history of computing. In a nutshell, current business computing is the direct descendant of the data processing or information processing folks. These are the people for whom the computer was to primarily replace the accountants and record keepers and other desk jobs that amounted to paper shuffling. In the discussion about the programming paradigm, Chad Perrin (Apotheon) mentioned that most "applications" are really a database browser with a customized interface. This is more or less correct.

Most folks who spec out software development projects cannot see past this. Rather than asking, "How can I do a better job?" they wonder, "How can I get out of performing these tedious, repetitive tasks?" There are two major reasons for this. First, it's a lot less challenging: you examine what you do now, find the parts that the computer can easily do, and have the programmers code those parts. Actually analyzing your business process is tough. The other reason is that people would feel threatened if they did anything else.

Think about it: When writing the specs for a tool they use, no one is going to spec themselves out of a job; they are going to make that job easier. The results are software designs that handle the tedious parts of a bad process instead of enabling a better process. The users leverage this to do more of the bad process instead of freeing themselves completely to learn a new job.

Nothing will change until developers show the business folks that we are capable of revolutionizing their business and not merely automating it — that is the IT world's contribution to this dilemma. Instead of playing the part of an active partner, we act like a doormat. When we transform that business process into code and say to ourselves, "Gee, this looks like a really bad process," why are we only saying that to ourselves and not the BAs, SMEs, and users? Why are we not offering more than what is requested? For example, "OK, I think I understand the spec" can become "I understand the spec, but I think I can add some value here too, and maybe make this 'smarter' as well as 'automatic.'"

This is a difficult but needed transition. As much as I am down on agile development methodologies, they are well intentioned, and my hat is off to their proponents for trying to do something about it. After all, the goal of agile development is to be a much more responsive and fully fledged member of the IT-user partnership, and to encourage the user to better leverage the development team. Most of my reservations about agile development are cultural; it takes a very special development team and some very special customers to make it work.

This speaks volumes about the current business environment. Remember that many of our current business leaders were junior clerks at companies run by people like Robert McNamara. His management techniques ("systems analysis") worked very well in many instances. The role for computing in such an environment was to replace the filing cabinet with a computer program, to provide humans with the aggregated information needed to make an informed decision about a complex subject. Does this sound familiar?

Developers have two choices

Option #1: We can proceed down the current course, which is to allow businesses to continue running on 80-year-old principles despite having the technology to create new management principles. This course leads toward ever more complex systems to handle the exponential increase in data volume (ironically, without any correlated increase in data significance). The abstractions needed to make this work are also arriving at an exponential pace. Look at how long COBOL ruled before it was supplanted by SQL. But SQL was not enough by itself, so we added ODBC to cover it up, then systems like DBM, ADO/ADO.NET, etc. on top of that, then ORM systems and "typed datasets," and now layers upon layers to hide those. At the current pace, we should be seeing a new data abstraction layer every six months within a year or two.

Option #2: We can reject the current course and lead the way out of this mess. Organizations that are jettisoning this mindset are reaping rewards. Companies that programmed the business model into the system have freed their leadership to spend their time watching the ever-shifting business landscape, not with their eyes glued to some "data dashboard" with their finger "on the pulse of the organization," waiting for its heart to stop. These smart companies are doing well.
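The layering described in Option #1 can be sketched as a toy example. None of these classes are real ODBC, ADO, or ORM APIs; they are stand-ins, invented here, to show how each layer merely delegates to the one below while adding its own vocabulary and its own place for bugs to hide.

```java
import java.util.Map;

// Toy model of the abstraction stack: four layers to read one field.
public class LayerDemo {
    // Layer 0: the data itself.
    static class Table {
        Map<String, String> row = Map.of("CUSTOMER_NAME", "Acme Corp");
        String select(String column) { return row.get(column); }
    }

    // Layer 1: a driver-style wrapper (stand-in for ODBC).
    static class Driver {
        Table table = new Table();
        String executeQuery(String column) { return table.select(column); }
    }

    // Layer 2: a data-access wrapper (stand-in for ADO/ADO.NET).
    static class DataAccess {
        Driver driver = new Driver();
        String getField(String column) { return driver.executeQuery(column); }
    }

    // Layer 3: an ORM-style entity mapping.
    static class Customer {
        DataAccess access = new DataAccess();
        String getName() { return access.getField("CUSTOMER_NAME"); }
    }

    public static void main(String[] args) {
        // One field, four hops down the stack.
        System.out.println(new Customer().getName());
    }
}
```

Each layer exists to hide the one beneath it, not to bring the application closer to the data; that is the pattern the article is objecting to.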

What's your take?

Would you prefer to allow leadership to figure out when the patient is about to die, or to help leadership keep their eyes on the road so the wreck doesn't occur in the first place? I opt for the latter.



Justin James is the Lead Architect for Conigent.
