Justin James discusses how the pace of change in the technology industry has made it nearly impossible to specialize in much or become an expert in anything.
The pace of change in the technology industry has made it nearly impossible to specialize in much or become an expert in anything. I started cluing in on this a few years ago when I was reading a lot about Lisp, but I simply didn't have time to try it out. Learning Lisp, and learning it well, requires a lot of reading, practice, trial and error, and so on. This would have been fine and dandy, but I wasn't doing this during my 9-5 job; this was just a "wouldn't it be neat to learn this?" type of project. A few months ago, my father and I were discussing the topic of expertise, and some of the things he said really clicked in my head.
Compare the amount of knowledge needed to really know C, Pascal, or maybe COBOL to that needed for their modern business programming analogues, J2EE and .NET. In those older languages, there are a few commands, a few primitive types, and a few operators. Let's look at the programming environment of 1991 as an example (1991 is the year I learned to program). Windows was just starting to penetrate the enterprise. A lot of programming was happening on mainframes. By and large, UIs were entirely text-based, with no mouse. Input validation was a cinch: you simply looped over a "wait for input" statement until the user pressed one of the three valid buttons for that point in the program. Events? Nope. Object-oriented programming? Nope. Declarative UI definitions (HTML, XAML, etc.)? Nope. N-tiered architecture? Accessing resources over a network? Globalization? Nope, nope, and nope. Life was pretty darned simple.
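A minimal sketch of that style of input validation, in Java for illustration (the originals would have been COBOL or C on a text terminal; the prompt text and the three valid keys here are invented):

```java
import java.util.Scanner;

public class MenuLoop {
    // Accept only the three keys that are valid at this point in the program
    static boolean isValid(String input) {
        return input.equals("A") || input.equals("B") || input.equals("Q");
    }

    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        String choice;
        // Loop over the prompt until the user enters a valid key
        do {
            System.out.print("Enter A, B, or Q: ");
            choice = in.nextLine().trim().toUpperCase();
        } while (!isValid(choice));
        System.out.println("You chose: " + choice);
    }
}
```

That single rejection loop was the whole of "input validation" in that world.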
It was a really nice transition from that world to the Perl/CGI world. At the end of the day, writing Perl in a CGI environment is conceptually very similar to writing a COBOL program: you take a batch of input, do your processing, and dump your output as a batch. Roughly 50 printed pages of Perl documentation, examples, "cookbook recipes," and so on were enough to do the job with a decent level of competence. The experts were the folks who had been around long enough to understand issues such as SQL injection and the need to properly escape encoded entities. At the time, I specialized in regular expressions. After a few months of working with regular expressions (which are arguably a micro-language of their own), I could read a regex like English. Over the next few years, I worked some real magic with them, from writing my own PHP-like system that fit my needs to doing screen scraping.
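As an illustration of how a regex acts as a little screen-scraping language of its own, here is a sketch (in Java rather than Perl; the HTML snippet and the pattern are invented for this example):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class Scrape {
    // Pull the id and the link text out of a simple anchor tag:
    // group 1 captures the digits after "id=", group 2 the text before </a>
    static String[] extract(String html) {
        Pattern p = Pattern.compile("<a href=\"/item\\?id=(\\d+)\">([^<]+)</a>");
        Matcher m = p.matcher(html);
        return m.find() ? new String[] { m.group(1), m.group(2) } : null;
    }

    public static void main(String[] args) {
        String[] parts = extract("<a href=\"/item?id=42\">Widget</a>");
        System.out.println(parts[0] + ": " + parts[1]); // prints 42: Widget
    }
}
```

Reading the pattern left to right really is like reading a sentence once you are fluent: "an anchor whose href is /item?id= followed by some digits, then some text, then the closing tag."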
In 2001, I transitioned to Java. All of a sudden, I had a monolithic 1,000-page tome on my desk and 200 MB of minimally formatted HTML on my hard drive, and these were just the reference sources. I ended up reading 500-700 pages of a "teach yourself Java" type of book. (In comparison, I read C in Plain English in three days; it was only a few hundred pages, I had no access to a C compiler, and I still definitely qualified as "knowing" C afterward.) The Java language has grown quite a bit since then, and C# and VB.NET are comparable in size and scope. I would guess that it would take at least 500 pages to adequately document VB.NET, C#, or Java (with examples, thorough explanations of the techniques, usage scenarios for them, etc.). PHP is much less complex, thanks to its Perlish roots. And we still haven't looked at the massive libraries that these systems carry with them; J2EE and the .NET Framework are both quite extensive. If anyone tells me they are an expert in these systems, I know they are lying.
I have been working with the .NET Framework since 2003, and I am not an expert in the Framework. I am highly experienced with a few namespaces: System.Drawing, System.Threading, System.Net.Web, and System.Text.RegularExpressions. The rest of my time working with .NET has been so insanely spread out that I can't remember much of it. Thank goodness for Visual Studio's IntelliSense; at least with that, I can muddle through namespaces until I find what I need. If anyone were to watch me write code, they would assume that I am an idiot, since it would definitely look like I was grasping at straws 50% of the time. To be honest, that is what it feels like. Unfortunately, my "grasping at straws 50% of the time" still beats the industry average of 70% (I'm just making up numbers).
An expert programmer is no longer someone who is really knowledgeable or experienced. All too often, an expert programmer is the person who is adept at using a variety of reference tools and documentation to find out how to achieve their goals. This is my secret sauce. I am really good at looking at a problem, figuring out approximately what is wrong, and quickly finding the solution. To the uninitiated, it looks like sheer magic. Others say, "Wow, Justin figured that out in only 30 minutes!" when they had been struggling with it all day. My real talent is knowing how to rapidly research and turn my findings into usable information. I suspect that I would be just as good at being a forensic accountant or a question writer for Jeopardy!.
Most of the really good programmers I have met are the same way: they know a little of everything. They have tons of experiences that inform their "guiding light" when they look for answers. They have a natural talent, but overall, if you were to grill them on anything outside a narrow area (say, the System.Threading namespace), there is a really good chance that they will know where to get the answer but not actually know the answer. Mark Cuban recently called this the "Open Book" World, and I tend to agree, although for those of us in the IT industry, this happened 10 years ago; programmers (unlike network engineers and system administrators) have been able to touch radically new concepts with a simple download since about 1995. The eruption of new programming models, languages, frameworks, libraries, etc. closely tracks the adoption of the Internet. Between vendors and open source projects, it seems like another new "this will revolutionize how you program!" system is announced once a week or so. They all look worthy, but they take six months to learn really well, and I just can't devote my time to working with more than one at a time. So my choice is either to become barely familiar with a lot of things or to commit myself to something that may not pan out.
For now, the "sampler platter approach" has been working well for me at the professional level, although I find it quite frustrating at the personal level. I miss learning things in depth. I miss the sense of satisfaction that comes from attaining a level of expertise. I miss getting to explore abstruse and obscure areas of knowledge. But it simply does not match the reality of my work or that of most other programmers.
Disclosure of Justin's industry affiliations: Justin James has a working arrangement with Microsoft to write an article for MSDN Magazine.