If doing a consulting job the right way isn't best for your client's timetable or budget, Chip Camden says you need to compromise.
They didn't want it good, they wanted it Wednesday.
Consultants should strive to provide the best value they can to their clients, because that's what will keep customers coming back for more. In general, doing things the right way, though it may take more time initially, provides better value in the long run. Sometimes, however, elapsed time itself is an important component of value to our clients.
A long time ago, I worked on a project with a client to port their software to the Windows platform. To say that Windows was unlike any other platform they had ever supported before would be like saying Hades is unlike any other vacation spot. We had to completely re-implement most of the user interface components, while attempting to preserve compatibility with other platforms. We didn't have a decade to accomplish this, so we ended up making lots of expedient design decisions. I said at the time that many of these would come back to haunt us. I'm not just talking about technical debt here. These were features that, once they released the product, could never be altered without major customer inconvenience. They (and I) have been living with the consequences of some of those decisions ever since.
Nevertheless, if I had to do it over again, I probably wouldn't change much. The product became a success for both my client and their customers, primarily because of those imperfect design decisions that preserved portability against all sanity and got the product to market quickly. If those hadn't been our priorities, a non-trivial number of my client's customers wouldn't have been able to benefit from the product at all.
On the other hand, rushing a project can often lead to disaster. You can only perform a Scottyesque miracle so many times before the dilithium crystals really do melt down. Unfortunately, it's all too easy to let that last-minute heroic effort become your modus operandi.
In recent years, I've become a big fan of Test-Driven Development (TDD). When done right, I find that the approach of writing tests first not only improves the quality of my work, but usually saves time. In my experience, constructing the tests for the desired behaviors (and their exceptions) becomes an initial walk-through of the design before I start writing code. I can still do some code prototyping after that phase, but by then the prototyping is already informed about all the goals it needs to meet.
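To make that concrete, here's a minimal sketch of the tests-first walk-through. Everything in it is hypothetical -- the `parse_quantity` function and its behaviors are invented for illustration -- but the shape is the point: the test cases enumerate the desired behaviors and their exceptions before (or alongside) the implementation.

```python
import unittest

def parse_quantity(text):
    """Parse a positive integer quantity from user input.

    A hypothetical function, written to satisfy the tests below.
    """
    stripped = text.strip()
    if not stripped.isdigit():
        raise ValueError(f"not a number: {text!r}")
    value = int(stripped)
    if value == 0:
        raise ValueError("quantity must be positive")
    return value

class TestParseQuantity(unittest.TestCase):
    # The "just make it do X when the user does Y" case --
    # the one everyone foresees.
    def test_plain_number(self):
        self.assertEqual(parse_quantity("3"), 3)

    # The "what ifs" that writing tests first forces into the open.
    def test_surrounding_whitespace(self):
        self.assertEqual(parse_quantity("  42 "), 42)

    def test_rejects_non_numeric(self):
        with self.assertRaises(ValueError):
            parse_quantity("three")

    def test_rejects_zero(self):
        with self.assertRaises(ValueError):
            parse_quantity("0")
```

Run it with `python -m unittest` against whatever module you put it in. Writing the two "rejects" cases first is exactly the design walk-through: you've decided what happens on bad input before a line of production code exists.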
It's not always easy to get clients to buy into TDD, however; to some of them, it just looks like wasted time. That perception usually stems from the chronic tendency in our industry to view almost any programming task as simpler than it really is. "Just make it do X when the user does Y!" All of the "what ifs" evaporate until the user actually exercises them, and then the oversimplification kicks in again: "How could you not have foreseen the obvious?" In the worst organizations, the programmer hastily slaps a patch on that behavior -- again, without considering all the "what ifs" -- until the next problem surfaces. Eventually, you have an application in which patches on patches make it hard to tell what the code is doing, or even what it's trying to do -- let alone what it should do.
Time is money, as the saying goes, so nearly every organization wants to minimize development time. For consultants like me who bill by the hour, the invoice gives clients a further incentive to compress the schedule as much as possible. When we suggest methods that take more time, they might suspect we're just padding our wallets. We need to monitor ourselves to make sure that isn't true, and then make the argument for doing it right because it's the best thing for the client.
Sometimes you have to compromise. Developing a thorough set of tests for a new feature on an old system could involve creating tests for the entire system or a large portion thereof, and the client simply may not have enough calendar space for that. So, you agree to create only enough tests to prove compliance for the most common use cases and exceptions -- but document the areas not tested. I was going to add, "so someone can go back and write those tests later" -- yeah, that'll happen. However, at least you'll know what you haven't proven, for the next time you have to extend this application.
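One lightweight way to document the areas not tested is to record the gaps in the test suite itself, as explicitly skipped placeholders, rather than in a document nobody will reopen. This is a sketch under assumed names -- `apply_discount` and the deferred cases are hypothetical -- but the pattern keeps the untested areas visible every time the suite runs.

```python
import unittest

def apply_discount(price, rate):
    """Hypothetical feature under test: apply a fractional discount."""
    return round(price * (1 - rate), 2)

class TestDiscountFeature(unittest.TestCase):
    def test_standard_discount(self):
        # Most common use case: proven now, within the schedule.
        self.assertEqual(apply_discount(100.0, 0.10), 90.0)

    # Known gaps, recorded where the next maintainer will see them.
    @unittest.skip("NOT TESTED: stacked discounts -- deferred for schedule")
    def test_stacked_discounts(self):
        pass

    @unittest.skip("NOT TESTED: negative price input -- deferred for schedule")
    def test_negative_price(self):
        pass
```

The test runner then reports the skips alongside the passes, so the next time you extend this application, the list of what hasn't been proven is right there in the output.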