
Do it now, or do it right

If doing a consulting job the right way isn't best for your client's timetable or budget, Chip Camden says you need to compromise.

They didn't want it good, they wanted it Wednesday.

-- Robert A. Heinlein

Consultants should strive to provide the best value they can to their clients, because that's what will keep customers coming back for more. In general, doing things the right way, though it may take more time initially, provides better value in the long run. Sometimes, however, elapsed time itself is an important component of value to our clients.

A long time ago, I worked on a project with a client to port their software to the Windows platform. To say that Windows was unlike any other platform they had ever supported would be like saying Hades is unlike any other vacation spot. We had to completely re-implement most of the user interface components, while attempting to preserve compatibility with other platforms. We didn't have a decade to accomplish this, so we ended up making lots of expedient design decisions. I said at the time that many of these would come back to haunt us. I'm not just talking about technical debt here. These were features that, once the product was released, could never be altered without major customer inconvenience. They (and I) have been living with the consequences of some of those decisions ever since.

Nevertheless, if I had to do it over again, I probably wouldn't change much. The product became a success for both my client and their customers, primarily because of those imperfect design decisions that preserved portability against all sanity and got the product to market quickly. If those hadn't been our priorities, a non-trivial number of my client's customers wouldn't have been able to benefit from the product at all.

On the other hand, rushing a project can often lead to disaster. You can only perform a Scottyesque miracle so many times before the dilithium crystals really do melt down. Unfortunately, it's all too easy to let that last-minute heroic effort become your modus operandi.

In recent years, I've become a big fan of Test-Driven Development (TDD). When done right, I find that the approach of writing tests first not only improves the quality of my work, but usually saves time. In my experience, constructing the tests for the desired behaviors (and their exceptions) becomes an initial walk-through of the design before I start writing code. I can still do some code prototyping after that phase, but by then the prototyping is already informed by all the goals it needs to meet.
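To make that concrete, here is a minimal sketch of the tests-first rhythm in Ruby's minitest (Ruby being the one language the discussion below names); the Discount class and its rules are invented for illustration, not taken from any real project. The tests name the desired behavior and its exception before the implementation exists, so the first run fails and the code is then written to make it pass.

    require "minitest/autorun"

    # Written first: these tests pin down the desired behavior and its
    # exception before any Discount code exists, so they fail initially.
    class TestDiscount < Minitest::Test
      def test_applies_percentage_discount
        assert_in_delta 90.0, Discount.new(10).apply(100.0)
      end

      def test_rejects_negative_percentage
        assert_raises(ArgumentError) { Discount.new(-5) }
      end
    end

    # Written second, shaped by the tests above.
    class Discount
      def initialize(percent)
        raise ArgumentError, "percent must be non-negative" if percent < 0
        @percent = percent
      end

      def apply(amount)
        amount * (1 - @percent / 100.0)
      end
    end

Note that the ArgumentError rule is a design decision made during the test-writing walk-through, before a line of implementation existed.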

It's not always easy to get clients to buy into TDD, however; to some of them, it just looks like wasted time. That perception usually stems from the chronic tendency in our industry to view almost any programming task as simpler than it really is. "Just make it do X when the user does Y!" All of the "what ifs" evaporate until the user actually exercises them, and then the oversimplification kicks in again: "How could you not have foreseen the obvious?" In the worst organizations, the programmer hastily slaps a patch on that behavior -- again, without considering all the "what ifs" -- until the next problem surfaces. Eventually, you have an application in which patches on patches make it hard to tell what the code is doing, or even what it's trying to do -- let alone what it should do.

Time is money, as the saying goes, so nearly every organization wants to minimize development time. For consultants like me who bill by the hour, the invoice gives clients a further motivation to compress the schedule as much as possible. When we suggest methods that take more time, they might think we're just interested in padding our wallets. We need to examine our own motives to make sure that isn't so, and then make the case for doing it right because it's the best thing for the client.

Sometimes you have to compromise. Developing a thorough set of tests for a new feature on an old system could involve creating tests for the entire system or a large portion thereof, and the client simply may not have enough calendar space for that. So, you agree to create only enough tests to prove compliance for the most common use cases and exceptions -- but document the areas not tested. I was going to add, "so someone can go back and write those tests later" -- yeah, that'll happen. However, at least you'll know what you haven't proven, for the next time you have to extend this application.
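One lightweight way to document the areas not tested is to record them as skipped tests in the suite itself, so the gaps are reported on every run instead of buried in a document nobody rereads. A hypothetical minitest sketch (the feature and the test names are invented):

    require "minitest/autorun"

    class TestCsvImport < Minitest::Test
      # Covered: the most common use case and its most common exception.
      def test_imports_well_formed_file
        # ...assertions for the happy path go here...
      end

      def test_rejects_missing_header_row
        # ...assertions for the common failure go here...
      end

      # Not covered -- recorded as skips so every test run reports
      # exactly what remains unproven before this feature is extended.
      def test_handles_multibyte_encodings
        skip "Untested: multibyte input was out of scope this round"
      end

      def test_survives_truncated_uploads
        skip "Untested: no fixture for partial files yet"
      end
    end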

About

Chip Camden has been programming since 1978, and he's still not done. An independent consultant since 1991, Chip specializes in software development tools, languages, and migration to new technology. Besides writing for TechRepublic's IT Consultant b...

10 comments
Tony Hopkinson (2 Likes)

... his option was now. Imagine my surprise....

RMSx32767 (1 Like)

Also happens in a non-consulting/in-house scenario, especially when rapidly developing apps for individuals, a group, or a client that "needs it now". Oh sure, it's always part of the plan to go back and clean it up, but as the CCR song says, "Someday never comes".

MikeGall (1 Like)

I don't do consulting work, but I am the only person here who can code in "professional" grade languages (the rest are Matlab developers -- not that Matlab is necessarily "unprofessional", but in my experience most developers tend toward the science and engineering "it seems to work good enough" mindset, which my coworkers follow). Anyway, my workflow: when I get spare time, I use it to catch up on new technology, build up some underlying infrastructure for programming (new revision control, new unit test suite, etc.), and take another look at things I wish I had done better. Since a lot of my projects tend to be modest sized (2k loc,

apotheon (1 Like)

It seems like you've independently invented a nontrivial percentage of an agile development methodology.

apotheon (1 Like)

> So, you agree to create only enough tests to prove compliance for the most common use cases and exceptions -- but document the areas not tested. I was going to add, "so someone can go back and write those tests later" -- yeah, that'll happen. However, at least you'll know what you haven't proven, for the next time you have to extend this application.

The only part of the article I find even remotely disagreeable is the notion that you save time by documenting the areas not tested. Once you have enumerated what you should have tested, the actual test-writing part is trivial, in my experience.

Have you seen testing systems like RSpec for Ruby? They call them "Behavior-Driven Development" tools, but the truth is (as far as I can see) that they're more verbose TDD tools that use terms like "Behavior" and "Specification" or "Spec" in their usage model jargon. I'm half-convinced these tools were created primarily for the purpose of getting corporate buy-in for TDD by referring to the process as "specifying" or "writing a spec" or something like that, rather than "writing tests first". In practice, the major difference between use of BDD tools and TDD tools seems to be that the comments you should have with your tests in TDD tools are mandatory string arguments to your BDD tests.

Learn a BDD tool and tell clients you're writing executable specs rather than using an xUnit tool and writing tests before you write code, if you must. It might help get buy-in to "do it right" the first time.
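For readers who haven't seen the style, here is a minimal RSpec sketch of the "executable spec" idea (the Order class is invented for this example): the strings passed to describe and it carry the information that an xUnit tool would leave to method names and comments.

    # Run with: rspec order_spec.rb

    # Hypothetical class, invented for this example.
    class Order
      def initialize(line_items)
        raise ArgumentError, "order must have line items" if line_items.empty?
        @line_items = line_items
      end

      def total
        @line_items.sum
      end
    end

    # The descriptive strings are part of the test's surface -- the
    # "more verbose TDD" that reads as a specification.
    RSpec.describe Order do
      it "sums its line items" do
        expect(Order.new([5, 10, 15]).total).to eq(30)
      end

      it "rejects an empty order" do
        expect { Order.new([]) }.to raise_error(ArgumentError)
      end
    end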

prh47 (3 Likes)

If doing foundational work is a habit, you will get quicker at it over time. One way to handle payment for such work might be to charge a fixed price for it, while charging hourly for the rest of the project. What do you think?

Sterling chip Camden (2 Likes)

Often that part is easier to predict (except on research-based projects). With some clients, it doesn't even need to be enumerated. It's just part of the process. It depends on how "hands on" the client wants to be.

DOSlover (3 Likes)

I have been involved with non-IT projects to mobilise new businesses, and the quick and dirty route tends to cost you forever after in those environments. Trying to fix the noise in the engine when the car is going down the highway at 60 miles an hour is a much harder prospect than doing the fix before you leave and knowing the fix will hold for the rest of the trip. It's always a delicate balance, but foundations allow you to build a strong framework if they are soundly implemented. If the project has a short shelf life or is a known expendable, you can afford to be a lot more compromising.

Sterling chip Camden (2 Likes)

Leaving the documented door open to doing it later can help. Completely ignoring it doesn't.

Sterling chip Camden (1 Like)

... but business factors often require just that. Nevertheless, we don't want to fall into the "broken window" syndrome and just let quality go to hell. How do you deal with this when it comes to your clients?