Software Development

Challenges for old developers learning new tricks

The problem of staying current is more acute in IT than in other fields because of the speed of new technology development. What do you do when a new technology comes along that makes obsolete a technology that you've devoted a good part of your career to?

---------------------------------------------------------------------------------------------------------------

When my son was little, I used to help him with his homework. English, spelling, science, no problem. But then he started bringing home his math homework. At first, I thought that it would be a breeze because I'd always been a math whiz in school. I hadn't planned on [insert ominous music here] New Math. I completely confused the poor boy because I was showing him how I used to solve math problems, which was totally different from the way he was supposed to be doing it and by which he would be graded. I knew I was going to be totally useless on this front until I learned the new way of doing it.

Well, easier said than done. The new method completely baffled me. I couldn't get my brain around it because for my whole life I'd only known one way of reaching a mathematical conclusion. I got frustrated, and I whined until he finally sent me to bed without any supper.

Now, that's a funny story, but it has a much more serious meaning when you think about the effect changing technology has on the career of an IT pro. When you earn your living being very good at one specific technology and something else is thrown in, what do you do? You have no choice but to keep up. And that can be very stressful.

I received an e-mail from a TechRepublic member in response to my blog about the worst job in the world. This reader made the point, as did many others in the discussion, that a job might not be bad in itself but the circumstances around it might be.

As the reader described in his e-mail:

"Moving from VB6 to .NET. 10 years ago I was doing a lot of COM development with VB6, lots of 'object-oriented' programming, and was very successful at it. .NET changes many of the paradigms that took years to learn, even if the underlying principles are the same. A younger person with no previous OOP experience may simply embrace .NET's approach to classes without having to shed the VB6 mindset.

I personally have not had too much trouble keeping up, but I see a lot of Cobol, classic ASP, and VB6 guys who are drowning with .NET. And, given the changes expected in IT over the next several years, these people are the most likely to end up having to find new jobs -- and, without the skills necessary to replace their current salaries."

The problem of staying current is more acute in IT than in other fields because of the speed of new technology development. It's also a problem because the executives who want to shift to the newest thing are often the very ones who don't understand the learning curve. Their thinking is along the lines of, "Well, you speak French, surely you speak Italian too."

What was the hardest new technology for you to adjust to?

About

Toni Bowers is Managing Editor of TechRepublic and is the award-winning blogger of the Career Management blog. She has edited newsletters, books, and web sites pertaining to software, IT career, and IT management issues.

10 comments
avidtrober

I don't mean to sound harsh or critical. But what about VB6 takes "years" to learn? And what specifically about the .NET learning curve keeps someone from being at least productive with something as they climb it? I'm an aging developer myself (22 years of dev), so I have the same challenges. If I depended on others to train me, I'd be a nervous wreck. No company is going to take care of you at a senior level by "training" you. By the time you've been doing this 15+ years, you should know how to dive into something with the skills to be productive enough to get started. I'm puzzled by this post.

mikeg3

It's not that anyone spends "years" learning VB6 -- the point is that paradigms and practices a person has used for years and years are changing. It's often much easier for a callow 22-year-old with no practical experience to pick up complex concepts (like .NET) simply because they have nothing to compare them to. I used to teach VB6 for a national training company. Most of the classes had at least one COBOL guy (virtually always a guy, not a gal) who was there to begin or supplement the change from COBOL to a more modern language. Of these students, the most successful were the ones who were able to shed the COBOL paradigms -- all those big headers, English-like statement constructs, and top-down program flow. Seriously, event-driven programming was quite difficult for some of these guys to grasp, even when they knew and understood typical Windows apps. It can be quite frustrating when a person with a lot less experience appears to be more productive with the new tools.
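The gap between top-down and event-driven flow that this comment describes can be sketched in a few lines. This is a minimal, language-neutral illustration in Python (not VB6 or COBOL, and the function and event names are invented for the example): in the top-down style the program dictates the order of operations, while in the event-driven style the program only registers handlers and the user's actions decide what runs, and when.

```python
# Top-down style: the program fixes the sequence of steps.
def process_order(order):
    validated = f"validated {order}"
    return f"saved {validated}"

# Event-driven style: register handlers, then let events drive execution.
handlers = {}

def on(event, handler):
    """Register a handler to run when the named event fires."""
    handlers.setdefault(event, []).append(handler)

def fire(event, payload):
    """Run every handler registered for this event; order is up to the user."""
    return [handler(payload) for handler in handlers.get(event, [])]

# The program no longer knows whether Save or Cancel comes first, or at all.
on("click_save", lambda order: f"saved {order}")
on("click_cancel", lambda order: f"discarded {order}")
```

The mental shift is exactly the one described above: control flow moves out of your code and into the sequence of events.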

jperick.mbei

Well, first of all, I am not sure where the 1% statistic comes from. The fact is, things like these may happen more often than we think possible, or are aware of. However, organizations may not publicize this kind of mess because of the potential effects on corporate image. Therefore, our statistics might only take into account what we are aware of. Unless there is substantial scholarly research -- and replicated research -- on this issue, we'll continue to make decisions based on inaccurate data.

Secondly, I am also concerned that it is this attitude (i.e., the contention that it happens rarely, so why spend so much time addressing a 1% occurrence) that has left us looking miserable when the 1% actually hits us with merciless force. 9/11 may have been thought of, back then, as being in the 1% basket. The massive power outages that have recently wreaked havoc on our businesses and communities may have been considered part of the 1% basket. The deadly and probably unprecedented shooting at Virginia Tech, and Columbine High before it, may have been considered part of the 1% basket. I could go on and on.

Look at the pain -- big pain -- corporations went through after SOX and HIPAA were passed. Had organizations taken proactive and systematic steps beforehand to make security part of their core processes, they probably would not have spent so much money hiring consultants to hurriedly implement processes to become SOX- and HIPAA-compliant. In today's extremely dynamic and unpredictable environment, the old saw "prevention is better than cure" has never had better logic behind it. Proaction, not reaction, seems, in my modest opinion, the ideal approach. We cannot have perfect systems, let alone a perfect world. But we can put in place mechanisms that allow organizations to significantly mitigate risks and their impact on our lives. Just my two cents, and I might be bloody wrong. J-P

Tony Hopkinson

Proficiency with a tool = a skill? Does a mathematician view an integral as a skill? No, it's a tool. If you don't understand the math, then an integral is magic, a lamp that you rub in just the right way and you get what you thought you wanted.

There is nothing new in .NET; not one bit of it hadn't been seen before somewhere else. It was new to those who were limited to VB6, but then again so was OOP. All languages are is a different way of looking at something. They are an enabler and a constraint. No one who can't do more than one language, preferably of more than one type, should even call themselves a programmer. That's like someone who assembles flat-pack furniture calling themselves a cabinet maker.

This keeping up with new technology is a salesman's fallacy. I mean, if someone said it does nothing that you can't do now, and can't do some things that you can do now, who would buy it? OOP was formalised reification, with good development practice built in. The event model was interrupts writ large. This stuff is new only to people who haven't done it before.

Don't get me wrong, there is a learning curve, but it's in whether this language makes you allocate memory, in syntax, and in semantics. The most important thing about languages and coding environments is that each one you learn to use teaches you more about programming, simply because it's a different way of looking at the same thing. If you wear someone else's glasses, things look strange; they haven't changed, though.

Tony
coder 1976 - to date

jquigley2

I am a COBOL/PL1 programmer and have been out of the business for a number of years (20 to be exact). Just trying to wrap my brain around Java and PHP and Ajax/JavaScript is driving me bats. DOS to Windows was a snap compared to this. I started in 1970 with keypunch and IBM, and built my own Sinclair, and that was easier than this is. Top-down to OOP, whew!

nvrtis

There is hope for you actually if you managed to pick up both Cobol and PL/1. It's a LOT easier to look things up in the manuals, and a lot easier to put together an inexpensive (free) learning lab. PHP is a lot like Cobol without a working storage section (but I usually build something close by defining most of my variables at the beginning of my code). Java is a lot like PL/1.. except they call subroutines methods, and a class is a steplib/joblib that is limited to one group of subroutines. When I decided I needed to learn OO, I picked up a copy of Smalltalk. The ONLY way you can write Smalltalk is OO. Which forces you to go through the mindset change.

Tony Hopkinson

One is OOP, which you can simply think of as good coding practice: describing your data by the operations you perform on it. There is an order file; I can add, change, remove, sort, find... Event or asynchronous programming is a much bigger switch, but given you built a Sinclair, think interrupts, but with state attached to the code as opposed to decoupled on the stack.
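Tony's "order file" framing, data described by the operations you perform on it, is the core of OOP, and it can be sketched in a few lines. This is a minimal Python illustration with invented names (`OrderFile`, `add`, `find`), not any particular system's design:

```python
class OrderFile:
    """An order file described entirely by what you can do to it:
    add, remove, find, sort. Callers never touch the raw storage."""

    def __init__(self):
        self._orders = []          # internal representation, hidden from callers

    def add(self, order):
        self._orders.append(order)

    def remove(self, order):
        self._orders.remove(order)

    def find(self, predicate):
        """Return all orders matching a caller-supplied condition."""
        return [o for o in self._orders if predicate(o)]

    def sorted_by(self, key):
        """Return the orders in an order chosen by the caller's key."""
        return sorted(self._orders, key=key)
```

The point of the exercise is that nothing outside the class knows the data is a list; the operations are the interface, which is exactly the mindset shift away from COBOL's records-plus-procedures style.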

apoloduvalis

The transition to OOP was not easy (in fact, some dark OOP concept still tricks me from time to time), but the real challenge for me was to pass from DOS to Windows. In DOS programming I was used to programming very specific control flows, contextual option menus, etc. In Windows/Mac applications you have to program events, and you have no idea which of the possible control-flow branches you foresaw the user will actually follow.

keith29@usa.net

I have been doing OOP since the mid '90s, so it isn't such a big deal to me. Web programming, though, threw me for a loop. It seemed like a step backward in software development: throwing out what I had learned about display/processing separation, nice smart event-driven GUIs replaced by a "stateless" page, etc. Then, add in all the weird ways to get state back (cookies, etc.), then add Ajax to get a GUI that doesn't have to be completely refreshed... and slowly regain the capabilities I had with Tcl/Tk or something.
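The "weird ways to get state again" boil down to one trick: the server hands the browser a token (a cookie), and on each otherwise-memoryless request it looks that token up to re-attach state. A minimal Python sketch of the idea, framework-free and with invented names (`handle_request`, `sessions`), not any real web API:

```python
import uuid

# HTTP itself remembers nothing between requests; the server keeps
# per-user state in a store keyed by a token the client sends back.
sessions = {}

def handle_request(cookie=None):
    """Simulate one stateless request carrying an optional session cookie."""
    if cookie not in sessions:
        cookie = str(uuid.uuid4())        # unknown client: issue a fresh token
        sessions[cookie] = {"visits": 0}  # and create empty server-side state
    sessions[cookie]["visits"] += 1       # state survives only via the token
    return cookie, sessions[cookie]["visits"]
```

A desktop GUI keeps all this in process memory for free; on the web every scrap of continuity has to be rebuilt this way, which is why the transition felt like going backward.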

Ceespace

Oh yes, I remember that! I used to disable everything on the form until I had finished an event, to stop users wandering off.