Clients usually call in a consultant because they expect us to possess knowledge and experience that they lack. Due to choice-supportive bias, the client may inflate their assessment of how much the consultant actually knows -- which can lead to disappointment when they find out they were wrong. That's one more reason why consultants need to present themselves honestly: to avoid, as much as possible, creating those unrealistic expectations.
But how much knowledge of our field should clients reasonably expect us to possess? Technology advances so rapidly these days that unless you're in a dying niche, keeping up with even a reasonable subset of new information could be a full-time job. You can be an expert in one narrow field, but in everything else you'll find it difficult to stay even moderately well informed. Nevertheless, you can't practice your specialty in a vacuum, whether you're in software development, networking, security, or some other focus. You'll regularly need to know something about all the others, and even within your own focus it's impossible to cover all the territory.
Ben Franklin said, "The doorstep to the temple of wisdom is a knowledge of our own ignorance." We can't even begin to deal with the problem until we know what we don't know. When I first started out as a consultant 20 years ago, I had already worked for 12 years in the field, in both programming and management, and I thought I knew almost everything. I quickly found that expecting myself to know everything wasted a lot of energy, because I had to maintain that image by covering up my ignorance. The more I learn, the more I realize how little I know and how little I should be expected to know. This isn't an excuse for not educating myself -- it's a recognition that no matter how much I learn, there will always be vast tracts of relevant knowledge I'll never get around to learning.
Where does that leave our clients? If they don't know it already, we need to educate them that "I don't know" is an acceptable answer -- as long as it's followed by "but I can find out." No matter what subject you studied in college, unless you just graduated you can bet that a fair share of the details you learned are already obsolete. Hopefully, though, they taught you something much more relevant: how to research, and how to reason.
Even the knowledge of how to research is changing. When I was in college, the Internet was not available to us for research. It would be another decade before Tim Berners-Lee invented the web, and Larry and Sergey were in elementary school. Back then, you went to the library and started with the card catalog or the Bibliography of Bibliographies, and you ultimately gleaned all your information from dead trees, microfilm, or microfiche. Just finding the sources you needed consumed most of your time. That process has become vastly easier since Google arrived, especially if you know how to use it well. Google even automates, to some degree, the evaluation of relevance and authority, though you always need to apply your own judgment as well. The art of appropriating the knowledge you find and applying it to your specific problem, though, hasn't really changed.
You don't always want to start with Google, however. For fields of inquiry that repeatedly require your research, you want reliable resources bookmarked. For example, if I have a question about FreeBSD, I always go to the FreeBSD Handbook first. For Ruby, my first stop is Ruby-doc.org. And despite all its deficiencies, the most comprehensive source for Microsoft information is the MSDN Library. However, if you need to search one of these sites, it's often better to use Google with a "site:" qualifier than to rely on the site's own search engine.
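To make the "site:" qualifier concrete, here's what such a query looks like -- the search terms are just an illustration, not a specific recommendation:

```
site:freebsd.org handbook jails
```

That restricts Google's results to pages under freebsd.org, so you get the Handbook's own pages ranked by Google's engine rather than whatever the site's built-in search returns.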
After you've found what you think is the answer, you need to prove it. You don't just stick it into the production server and hope for the best. A good researcher has his or her own set of test systems for doing this sort of work. Virtualization has made that extremely affordable (as in $0), so you really don't have an excuse for not testing your solutions in an isolated environment most of the time. If you find that the project is too tightly coupled to external resources to be able to isolate it, then that probably indicates a design flaw. You should at least be able to stub out those services for testing purposes.
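The idea of stubbing out a tightly coupled external service can be sketched in a few lines. Everything here -- the `checkout` function, the `gateway` object, the payment scenario -- is a hypothetical illustration of the pattern, not code from any particular project:

```python
# Sketch: isolating business logic from an external service for testing.
# The real service (a payment gateway here, hypothetically) is replaced
# by a stub, so the test runs with no network and no production risk.
from unittest.mock import Mock

def checkout(gateway, amount):
    """Hypothetical business logic that depends on an external service."""
    # The only thing we require of the dependency is its interface.
    result = gateway.charge(amount)
    return "ok" if result else "declined"

# In a test, stand in a stub for the real service:
gateway = Mock()
gateway.charge.return_value = True
assert checkout(gateway, 42.00) == "ok"

# And exercise the failure path, which would be hard to trigger
# on demand against the real service:
gateway.charge.return_value = False
assert checkout(gateway, 42.00) == "declined"
```

Because `checkout` receives its dependency as a parameter instead of reaching out to a hard-coded service, the same code runs unchanged against the real gateway in production and a stub in your isolated test environment -- which is exactly the design property whose absence signals the flaw mentioned above.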
The ability to conduct reasoned research is at least as valuable as prior knowledge of the subject, if not more so. Indeed, the main reason for learning the details of our field turns out to be that it makes us better at researching what we don't know. Background knowledge tells us what to look for, helps us evaluate the relevance and authority of what we find, and guides us in applying it to the problem at hand. When we market ourselves to clients, we shouldn't present ourselves as having all the answers. It's far more beneficial to have the right questions.
Thanks to Bob Eisenhardt (reisen55) for suggesting this topic.
Chip Camden has been programming since 1978, and he's still not done. An independent consultant since 1991, Chip specializes in software development tools, languages, and migration to new technology. Besides writing for TechRepublic's IT Consultant blog, he also contributes to [Geeks Are Sexy] Technology News and his two personal blogs, Chip's Quips and Chip's Tips for Developers.