Software Development

Frustrated by a coworker's use of old-school programming techniques

What do you do when a veteran developer isn't open-minded about the benefits of modern programming techniques? Find out what advice Justin James offers a reader who is dealing with this tough situation.

Back in November, a reader sent me a really good question about how to handle another developer who is stuck in the '80s. Unfortunately, because of how the e-mail was sent, I was unable to reply. Here is the reader's e-mail, along with my recommendations about how to handle this tough situation.

I follow your and Chip Camden's blogs on TechRepublic.com, but I have yet to see a post on a particular issue that keeps bugging me.

How do you (or do you?) deal with clients on the issue of technical expertise? I grew up with OOP, and this guy wants to basically do procedural programming (this is an ASP.NET project). He uses external functions and methods sparingly, preferring to only use Page_Load. Any time an error occurs, he begins debugging by blaming the fact that code is in a different page event (e.g., PreInit or PreRender).

When he notices that I use classes, he goes nuts and starts explaining why it's better to program 'transparently,' which is basically the non-OOP way of doing things. I'm more of an n-tier guy, myself, and no matter what he says, I can't bring myself to code down to his level.

I keep trying to explain to him why I program the way I do, and that there are benefits to using the different page lifecycle events, but he won't take the time to learn new things.

Another example of weird behavior: he doesn't know how to use the debug menu buttons, such as Step Into or Step Over. He used the correct function key for Step Into, but he didn't understand that he could also step over, or out of, a piece of code. Then he would complain that there are too many layers in the project, so I would hide that code from the debugger by using an attribute on classes he didn't need to go into.

By the way, there is an age difference of about 40 years between us, so I'm assuming that, combined with his PhD, is causing a trust issue. How would you deal with this?
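For readers who haven't worked with ASP.NET Web Forms, the events the reader mentions (PreInit, Page_Load, PreRender) are simply stages in a page's lifecycle that you can override; spreading code across them, instead of piling everything into Page_Load, is ordinary practice rather than exotica. A minimal sketch in C#, with hypothetical control and data names:

// A bare-bones Web Forms page that puts work in the lifecycle stage where
// it belongs. The overrides are the standard ones; the controls and data
// are placeholders for illustration.
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public class OrdersPage : Page
{
    protected GridView OrdersGrid = new GridView();
    protected Label StatusLabel = new Label();

    protected override void OnPreInit(EventArgs e)
    {
        base.OnPreInit(e);
        // PreInit: the only stage where a theme or master page can be set,
        // e.g. Theme = "HighContrast"; (assuming such a theme exists).
    }

    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);
        // Init: build the control tree.
        Controls.Add(OrdersGrid);
        Controls.Add(StatusLabel);
    }

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);
        // Load: the stage behind the familiar Page_Load handler.
        if (!IsPostBack)
        {
            OrdersGrid.DataSource = new[] { "Order 1001", "Order 1002" };
            OrdersGrid.DataBind();
        }
    }

    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);
        // PreRender: last chance to adjust the UI before it is rendered.
        StatusLabel.Text = OrdersGrid.Rows.Count == 0 ? "No orders" : "";
    }
}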

From the sound of it, I get the impression that this other person is not just a coworker, but someone whose opinion can make or break your career. Unfortunately, having spent decades in the "stone ages" has made it hard for this other developer to be open-minded about the benefits of modern programming techniques. In addition, it seems like this developer is unwilling to put forth even the smallest amount of effort to learn new tools and techniques.

Some things to keep in mind while working with this developer include the following:

  • He has a ton of old-school tricks and tips that he would probably love to share with someone, and he could give you insight into this business that you can't get elsewhere.
  • At the end of the day, all of the mainstream object-oriented languages end up being imperative code at the method level. While his experiences may not be directly useful or pertinent to your current projects, there are probably a lot of ideas that he has that can be used, but maybe not in the way that he thinks they should be.
  • Very few people are 100% impervious to change. Oftentimes, the "latest and greatest" does not have an obvious benefit to them, or things that you think are benefits seem like drawbacks to someone else. For example, to you, object-oriented code is great because it abstracts out the implementation details; to him, that is frightening because it means he has no idea what is happening "under the hood." Both viewpoints have merit.
  • Object-oriented programming came about to address the same problems that more traditional techniques were trying to solve, but from a different angle, one that hopefully also minimizes the problems those techniques ran into.
  • Given the length of his career, your coworker has some evidence to believe that his way is right. This doesn't mean that it is the best way, but it seems to be working for him up to this point.
  • For better or worse, he may have some influence over your career. Whether it is through a supervisory role or even just as the seasoned veteran passing on his opinions to the boss, when an experienced pro speaks ill of someone, management often listens.
  • As much as watching him flounder with the tools may drive you nuts, it probably does not actually affect you as much as it feels like it does.

What does this mean to you? Tread lightly. The last thing you want is to have someone with decades of experience telling the boss that you are impossible to work with. In these uncertain times, not being viewed as a team player will sink you.

From what you've written, it sounds like you have tried to show him the light on modern techniques, but the problem may be the way you approached the topic. Did you talk about the techniques in a way that addressed his concerns? It's one thing, for example, to say, "OOP abstracts the details away." As someone who was taught in a more old-school environment, I can tell you that this is a scary thought. Most of the more experienced developers I know turn green when they see the call chains that object-oriented code often contains; to them, it looks like a fancy version of spaghetti code. You know what? They're right. The same thing goes for "overly architected" object-oriented solutions, where someone took what could have been a 1,000-line application and turned it into a 20,000-line "framework" complete with interfaces, abstract classes, factory classes, and so on. I am not saying that these ideas don't have a place; but in so many of the examples that are out there in books and magazines and on the Internet, it is really hard to see the benefits of object-oriented programming.
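To make that concrete, here is a hypothetical sketch (the names are made up) of the kind of ceremony that makes a procedural veteran roll his eyes: an interface, an implementation, and a factory standing in for what could have been one line of arithmetic.

// A deliberately over-architected way to add 7% sales tax. Nothing here is
// wrong, but to someone used to writing one function, it reads as three
// layers of indirection wrapped around a single multiplication.
public interface ITaxCalculator
{
    decimal Apply(decimal amount);
}

public class FlatRateTaxCalculator : ITaxCalculator
{
    private readonly decimal _rate;

    public FlatRateTaxCalculator(decimal rate)
    {
        _rate = rate;
    }

    public decimal Apply(decimal amount)
    {
        return amount * (1 + _rate);
    }
}

public static class TaxCalculatorFactory
{
    public static ITaxCalculator Create()
    {
        return new FlatRateTaxCalculator(0.07m);
    }
}

// The procedural version of the same thing:
// static decimal ApplyTax(decimal amount) { return amount * 1.07m; }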

There are a number of things that you can do that will help this situation become more pleasant. You won't be able to change who he is or what his experiences have been, but take the time to get to know what those experiences are. Maybe he is really turned off by object-oriented programming because he used to work with someone who tried to ram it down his throat. Or maybe he has been involved in some really poorly architected object-oriented projects. Maybe he has never been properly introduced to object-oriented programming, and all he knows about it is what some "guru" has said. But you won't be able to approach him with any chance of success without knowing what his fears about modern ideas are.

Fears can be divided into three major categories: rational fears, irrational fears, and past-based fears. Rational fears are those that make sense, but have no experiences to back them up. For example, this person may have a rational fear that using object-oriented techniques will make it harder to find which piece of code is responsible for generating an error. An irrational fear is one with no logical basis. If this developer does not like object-oriented programming because he feels that it would lead to mass hysteria, it is an irrational fear. Past-based fears are rational fears with actual experience to back them up. For instance, if he worked on a project where 10,000 lines of code were object-oriented architecture with only 500 lines of actual code, his fear of object-oriented programming would fall into this category.

This matters because it affects how you want to address these fears. Rational fears are the easiest to deal with: simply acknowledge and respect them. If he has a rational fear, show him how you can mitigate the risk as best as possible and move on. If his fear is irrational, you can try to defuse it, but remember, irrational fears are often immune to reason, and you may need to just leave it alone. If the problem is that he has had bad experiences in the past, then you need to show him through the quality of your work that object-oriented code does not have to violate traditional principles of good coding.

As for his trouble using the tools, I know it can be frustrating. At the same time, try to keep in mind that if his debugging work goes slower than it could because he does not know much about how to use the IDE, that's his problem and not yours. If it does impact your job (maybe you are waiting for his work to integrate with yours, for example), then you need to find a way to educate him without offending him. Sometimes the best approach is to be blunt but gentle. Something like, "It seems like you are not familiar with some of the more useful debugging methods in this IDE. Would you like me to show you some tricks? They might save you a lot of time." This puts it out on the table that you know something that could help him, without making him feel like he is stupid. Things like eye rolling, muttering, etc. are never helpful. Another trick that might help would be to compile a list of IDE cheats and send it out to the entire team so he doesn't feel he's being picked on. This puts the information in his hands in a format that he can use and allows him to learn these things on his own and feel a sense of accomplishment.
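Incidentally, the trick the reader mentioned, hiding plumbing layers from the debugger, belongs on that cheat sheet too. In .NET this is typically done with the DebuggerStepThrough attribute from System.Diagnostics; a minimal sketch with hypothetical class names:

using System;
using System.Diagnostics;

// Infrastructure code nobody needs to step through. With the attribute
// applied, Step Into treats calls into this class as Step Over.
[DebuggerStepThrough]
public class AuditLogger
{
    public void Record(string message)
    {
        Console.WriteLine("[audit] " + message);
    }
}

public class OrderProcessor
{
    private readonly AuditLogger _log = new AuditLogger();

    public void Process(int orderId)
    {
        _log.Record("Processing order " + orderId);  // Step Into skips this call
        // ... the business logic actually worth debugging ...
    }
}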

One thing I must caution you about: under no circumstances should you "call him out" in public. If there is one surefire way of making a permanent enemy, it is making someone feel like a coworker is calling them stupid in front of others, even if you aren't. These conversations should be held away from others, preferably in a private office, closed conference room, or possibly at a friendly lunch. I've found that sometimes the exact same message will get a very different reception depending on who else is around. Even if you are trying your best to be polite and positive, the presence of others can make him defensive and cause him to ignore you or become hostile.

I hope this advice is helpful. If you've been in a similar situation, what words of wisdom would you offer this developer? Share your thoughts in the discussion.

J.Ja

Disclosure of Justin's industry affiliations: Justin James has a working arrangement with Microsoft to write an article for MSDN Magazine. He also has a contract with Spiceworks to write product buying guides.

About

Justin James is the Lead Architect for Conigent.

172 comments
jkameleon

It's not just an annoying coworker thing; it runs far deeper. You've mentioned ASP.NET. That means web applications, which are, in essence, procedural. An HTTP call invokes a method/procedure on the server in a stateless manner. This procedure typically has to acquire its state from a database. When translating data from the database into objects you have to deal with something called "The Vietnam of Computer Science". With the procedural approach, you don't have such problems. I have 10+ years of experience in procedural (various assemblers, C) as well as 10+ years of OOP (C++, C#). I'm familiar with both approaches, and adherent to none of them. There are situations where OOP is better (typically UI-intensive desktop apps), and situations where the procedural approach is still better (stateless, database-intensive server apps, for example). Solution? In the final consequence, everything amounts to man hours and money. If your organization has a well established development process based on the procedural approach, the cost of switching to OOP might simply not be worth the fuss.

philr

This question motivated co-workers to look at Structured Programming and its benefits circa 1973. Unless you can demonstrate practical benefits you may be guilty of pushing the latest fad! I have seen quite a few... as you may have gathered. ;-)

StephenInScotland

Slightly off the thread, but there is some merit in not using OOP, though usually the pluses outweigh the minuses. My big bugbear is maintenance. OOP is great for *building* an app, but it's a nightmare when you have to change it. And by change it I mean the interfaces have to change. How do you find all those things which use your web service and which are now going to fall in a heap? When someone cracks this one, they are going to make a fortune.

Osiyo53

In my line of work this sort of thing occurs regularly. For instance, we have a senior programmer who seems to be "stuck in his old ways". In my field of work, we do a highly specialized kind of programming. The programming of DDC equipment (Direct Digital Control), essentially dedicated computers in a "Black Box" (so to speak), which control real world mechanical and electrical equipment. The programming languages used are multiple and varied. Some of them allow either procedural or OOP coding methods, or a combination of the two. Anyway, when I first started working with the guy, I just about started to think he was stuck in the "Stone Age". As he avoided certain more "modern" techniques and methods. And avoided certain new "tools" of the trade. Then, over time I questioned him about this and that. Asking for an explanation of why he insisted on doing whatever "this" way. Not approaching him in a confrontational style. I put it more or less like, "Hmmm, Bob, maybe I'm too stupid to understand, but why are you doing this like that? Instead of like such-and-such?" Chuckle, more times than I wish to admit, turned out he'd a darned good reason. Based on past experiences, successes or failures, and so forth. He brought up, once asked, aspects, issues, and problems that I'd never even thought about, much less thought through. Not that he was always right. But I sure got my eyes opened, FAST. And started keeping some of his wisdom and experience in mind when doing my own coding. Incorporating it into my own code, even when using more modern techniques. As concerns his refusal to use some of the more modern programmer's tools. Turns out he knew his old and trusted utilities so well and proficiently, that using them he was even faster at getting things done than I (and several others) was using the "latest and greatest" stuff. Fact is, yah can't watch him and follow what he's doing fast enough to stay up with him. He KNOWS his tools and utilities, as old as they are. So I leave him alone, and just try to learn. As I said, he's not always right. But he is right more often than I'd wish for the sake of my own self image. And I'm the better for it. Now, OTOH, we have another coder who uses all the latest and greatest. Given a choice between a 100 line procedural solution and a 1,000 line OOP solution, he'll go the OOP route every time. Very modern, very up to speed on the latest and greatest. He also has more Oop's in his final product. And finding the cause of those Oop's is harder. I know, that is what I do for a living. Those two generate original programs, back in the shop. In the field, on live equipment, I do the testing of both hardware and software. Troubleshoot and fix any issues found. (They provide me copies of source code) OOP vs Procedural ... each has advantages and disadvantages. And in any particular situation, each might be a better solution than the other.

work

Excellent article! As someone who's been programming over thirty years, I get tired of moralizers. I'm proficient with new technology. Sometimes I'm more procedural. Sometimes I'm more object oriented. I will save these perspectives for later reference. It's good to see this published so it can be quoted!

Still_Rockin

I too am an older "retread" (IBM mainframes and FORTRAN in the 80's and then iSeries and RPG in the 90's), but one who has made a successful transition to OOP and I've been doing .NET exclusively for 6 years now. I've put intensive effort into OOP/.NET both on the job and off ever since 2002; one has to wonder why the old schooler being described was even let loose on .NET coding. Like the saying goes, you can lead a horse to water... That being said, the "tips folder" is a great idea, I've seen that work well in many places. Also, rather than spending time mucking with screenshots and typing text procedures in the tips, consider investing in a copy of Camtasia - much easier to just make a screen recording of the tip (complete with sound of you talking), render to a .swf, and put it out on a file share or team site. Another tip - suggest the company buy ALL developers memberships to a site like LearnVisualStudio.net (I'm not affiliated with them, I just have a membership). The cost (especially from a company investment standpoint in their devs' continuing ed) is minimal. That way ALL devs, from jr to sr, can spend time in their off hours polishing their skills in basic to advanced topics, and guys like the old schooler can get a firm grounding in the basics.

chris

"but in so many of the examples that are out there in books and magazines and on the Internet, it is really hard to see the benefits of object-oriented programming." Just am glad that there is at least one other "great mind" out there :-P

dcbohn

OK, I am an 'old guy', one of those who started on punch cards. The big problem I have with what has been said is that most assume the junior programmer is as good as he thinks he is. I have no room for evangelists in my organization; we have too much work to do. I want people who know which tool to use for which app, not just one way. The app should dictate how it is written. That being said, it does sound as if the senior programmer might benefit from an evening class at a CC on how to use the IDE; it may be disguised as a language class. I have no bias for or against OOP, it is what it is. I have seen bad and poor code written no matter the jargon/IDE/school of programming concepts. From my 'vast experience', most fan boys are not as smart as they think they are. When I go to MSFT events, rolling out the latest VS advances, most of what I hear is that we have done xxx to stop ignorant people from writing bad code. Now Sun events I find I really enjoy. When conducting job interviews, one way I have used to find the knowledge level is to ask the following question: 'What is the difference between a subroutine and an object?' How they answer tells me more about them than any code examples that they could provide.

gary.hewett

It contains gems of wisdom for dealing with many situations involving the perceived gap between the young and old, and not just in the technical arena. I too am heading for the latter at a speed that frightens me but is nonetheless inevitable. I hope and pray that I have been open-minded enough along the way to garner some real gems (like this article does) to offer the next generation as well. Isn't it amazing that timeless wisdom seems so hard to amass yet it has been here all along? It's ALWAYS about learning what the other side brings to the table first before they will look at your offerings. Every generation of programmers and every generation of programming techniques reaches that apex where it appears that everything that could possibly be done is within grasp - yet another proliferation of programmers and techniques springs forth and shows the previous generation new vistas of knowledge and domains of endeavour that simply were not visible to the prior generation. Let's celebrate the youth for what they have yet to learn and accomplish and their willingness to do so, and honour the stone-agers for the grand repository of hard-earned knowledge they represent.

Jalapeno Bob

I have been programming for a long time: The first two languages I learned were PL/1 and assembly. I coded on 80 column punched cards. If I had a 5 kilobyte memory partition to run in, I was thrilled. Speed was measured in microseconds, not nanoseconds. Debugging, in those days, was a real chore because when it ran, you looked at the output of "print statements" to check the intermediate values, and when it did not run, you pored through a nice thick core dump printout. A "mini-computer" was a Digital Equipment Corp (DEC) PDP-8, or similar machine, with a maximum memory of 4K 12-bit words. We have come a long way since then. Today, most programs are run on laptops and desktops, with background support from those newer, faster versions of those legacy machines I mentioned above. Being interactive means being event-driven and this is where OOP shines. Yes, OOP code is often a resource hog; but modern hardware is so inexpensive that, if done right, it should not hinder the user. Sequential, procedural processing still has its place. When you have to process 10 million records, the good old "do loop" directly on the record image gives a distinct time advantage. This can be very important in the timely handling of the overnight background processing of a business or government agency. The attempt to apply only OOP programming to the Texas Integrated Eligibility Redesign System (TIERS) is a major part of the project's failure to be delivered on-time with the contracted performance measures. See http://www.twc.state.tx.us/boards/guides/tiersrefguide.pdf, http://capitolannex.com/2008/02/21/tiers-system-failing-texas-families/ and http://www.window.state.tx.us/comptrol/letters/accenture/ch12.html for more information. There is a need for a well-rounded programmer to understand and be able to use both styles. Although the need for straight procedural programming is fading, I doubt it will ever disappear.

jefferyp2100

Sounds like management material to me! Seriously, I worked for a large company that 'retired' programmers like this by moving them into management. The bad news was that, like the co-worker above, he not only didn't know OOP, he didn't want anybody else to use it, either. His idea of code reuse was cut, copy and paste. I left the company and took a better job. The manager in question was 'demoted' (but somehow kept the title) and the VP who promoted him was asked to resign or be fired.

Smedley54

Change is hard. Forced change is frightening. And it just gets harder with age and experience - just keep living if you don't believe me. It'll come to you.

mckinnej

The Best Toolbox has a wide assortment of tools and the good mechanic knows when and where to use each of them. When you only have a hammer, everything looks like a nail. The same analogy works with programming. Procedural programming has its place and so does OOP. The trick is knowing when to use them. If you only know one method then you are going to use the wrong one at least part of the time. The subject of the article should spend less time complaining about the closed-minded co-worker and work on opening up his own mind a bit.

ESchlangen

This is one of the best articles that I've seen on TechRepublic, and not just because of the excellent human relations advice. As one of the old school programmers that is (more slowly than he would like) clawing his way into the modern programming world, it was quite interesting to see that there are people who do actually realize that at least some of the old timers have a positive contribution to make!

Dr_Zinj

I have a much more pragmatic attitude about coding. Documentation and internal notes are absolutely essential, no matter which style you use, especially in the development stages. Generally, the shorter the code is, the better. And the fewer cycles to process, the better. Both usually add up to less storage space and faster processing. In my limited experience (I got started programming in the late 70s, early 80s and basically stopped programming in 1994), big programming projects benefit the most from OOP. If you're building very small programs (not to be confused with small modules of a much bigger program), then whether you use OOP or not may not make much difference. This senior programmer apparently teaches (you mentioned a PhD). I suggest hitting him up for examples and even a briefing session on why and how his methodology is better. You might be surprised yourself.

showard

Excellent article - As an "old-school" programmer (nearing retirement) who has made the leap to OOP, I completely agree with what was said. Too often the younger progs I work with right out of school say a particular way is right because "the professor said so" or the book says so. I learned many years ago there is always more than one way to solve a problem (program a solution). But I deal with one young fellow every day just trying to get him to slow down and analyze the problem before charging in and making wild assumptions. Thanks for the article.

scott.damery

One way I have talked my company into sharing and educating on modern procedures or best practices is through lunchtime education, where once a month we have an employee teach on a subject and then we can all discuss it at the end... great for teaching and learning from each other.

trilithium

In the early 90s working in isolation at home I set myself the task of learning C++ because I recognised it as an important skill. The first problem I had was figuring out what if anything in my voluminous existing C code would be a candidate for conversion to C++. This was the big hurdle. It just was not obvious at all. It took me a while feeling my way in. My first class was a GIF decoder, and as it developed, I became more enthusiastic, having begun to see a way to convert code, in general, to the new paradigm, of classes with member functions. All in all it was a daunting step to get into C++ but well worth it. In my current job I have been using OOP (but not in C++, except in some extensive hobby stuff at home) for 10 years, and that C++ learning experience was important in getting the job. Suppose I had not made the step? What would I be doing now, at 55? The irony is that the OO code I work on at the moment contains vast tracts of badly-structured legacy code that I am often faced with unravelling. One of my co-workers comforted me by saying, sadly, "it is possible to write Fortran in any language".

sithomas

Am I the only one who thinks that describing an older colleague as "having spent decades in the 'stone ages'" is an ageist and derogatory comment?

gardoglee

JKameleon alludes to an important aspect, as have others. There is no best technique for all situations and assignments, any more than there is a best computer language for coding, a best platform for every task, or a best car for everyone. There are, however, some sub-optimum ways for you to address your co-worker and your concerns, several of which have also been pointed out, the worst of which center around building the conflict instead of building the relationship. This is why smart managers pay attention to their employees' people skills, and how the relationships are going in the shop. Having played all three roles in this conflict many times in several shops, my advice is like many others above. You won't get this guy to listen to you until he respects you. He won't respect you until you respect him, and until he can see that you do. You won't be able to deliver what you want to say in a way to which he will be receptive until you get to where you can also hear those parts of what he is saying which are valuable. And if you don't think he has anything valuable to say, you probably need to step back a bit and listen a bit more. As soon as you can do that, the other problem will actually be pretty easy to solve. If he has been in the business long enough to become any kind of a veteran programmer at all, he did learn somewhere along the way how to learn ways to simplify what he does. As an example, that Step Out thing. He probably didn't know about it. He probably would love to use it once he understood it. Presenting it to him as, "Yeah, I know that finding the problem takes going down into the levels, which is why I am glad I can use the Step Out function once I've located the problem" is a lot more likely to work than something like, "Yeah, but if you were a competent debugger you would know how to use the Step Out function." Establish your similarities, and then use that as a bridge to get over your differences. And yes, this takes work and effort on your part. That's part of the personal cost of getting to work on a team, rather than in a single person shop.

Sterling chip Camden

You can't change existing interfaces in the OOP model; you can only add new ones. Then, of course, the sheer number of interfaces becomes unmanageable.

atfalatitkb

Enjoyed this article very much. Do you think 20 years from now when "xyz" is the current software development paradigm, those old "oop" programmers will be seen as dinosaurs? I certainly would hope not. Their "oop" skills won't matter. What will matter is their experience. Hopefully the next generation of "xyz" developers will be willing listeners and at the same time willing teachers, just as the current generation of "oop" developers should be.

Justin James

Those are all excellent ideas, thanks! J.Ja

SnoopDougEDoug

I've written developer docs for over 20 years. Creating an example that is:
* Short enough to describe in a couple of pages
* Complex enough to make your point
* "Real world"
is very difficult. Many of the advantages of O-O programming are only apparent when the project reaches a certain size and complexity. Simple tutorials using vehicles as interfaces and cars and bicycles as classes often seem trivial to new folks. They don't understand that as their project grows in complexity, simply adding a color attribute to an interface is exponentially more difficult as the code base grows. doug
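A minimal illustration of the ripple he describes, with hypothetical C# names: the classic tutorial hierarchy compiles happily until a single member is added to the interface, at which point every implementation has to change.

public interface IVehicle
{
    int WheelCount { get; }
    // Uncommenting the next line breaks Car, Bicycle, and every other
    // implementation in the code base until each one is updated
    // (trivial with two classes, painful with two hundred).
    // string Color { get; }
}

public class Car : IVehicle
{
    public int WheelCount { get { return 4; } }
}

public class Bicycle : IVehicle
{
    public int WheelCount { get { return 2; } }
}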

Justin James

I know what you mean about those interviews. When I interviewed people, my "secret weapon" question was: "what's the difference between 'passing by reference' and 'passing by value'?" I was *shocked* at the number of "programmers" who didn't know the difference! Some of them had Master's degrees in Computer Science, others were "experienced developers" with 10+ years in the field. Unreal. J.Ja
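For anyone who wants the short version of the answer, a minimal C# sketch (hypothetical names):

using System;

class PassingDemo
{
    // Pass by value: the method gets a copy; the caller's variable is untouched.
    static void ByValue(int x)
    {
        x = 99;
    }

    // Pass by reference: the method works on the caller's variable itself.
    static void ByReference(ref int x)
    {
        x = 99;
    }

    static void Main()
    {
        int n = 1;
        ByValue(n);
        Console.WriteLine(n);       // prints 1
        ByReference(ref n);
        Console.WriteLine(n);       // prints 99
    }
}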

Justin James

Up until recently, I was still using that technique (more or less) to debug JavaScript, and depending upon what the base language is and whether or not I have an IDE for it which can debug the JavaScript, I still do. It's *miserable*. I'm grateful for all of the years I spent learning to code without debugging tools, because it taught me how to effectively work like that (well, as effectively as possible). J.Ja

gotoman_work

What I have found over the years is that new technology is in many cases just reworked old technology with enhancements. OO is a prime example of this. The biggest issue is that they changed all of the terminology for it and would not take the time to explain it in terms that the old school understood. It took me a couple of years, but I finally got a grasp on what OO was. Then I looked at it and said, that is how I program now. I found that OO is a structured and modular style of coding with controlled access to data and code. The idea of controlled access points is borrowed from Fortran IV. The point I am trying to make is that many people avoid new technology until somebody explains it to them in terms that they understand. The preachers for new technology tend to overlook that fact. Another comment is that I have found most IDEs are very poorly documented, thus making the product unusable due to the fact that it is not intuitive to use. I work in the mainframe and Unix environments, so I spend a lot of time working with new ideas. But if you want people to use new technology and ideas, you have to do 2 basic things. One, you have to show people a real benefit from it, and two, you have to make it easy to use the first time. Otherwise you have the situation in this post, and it is a no-win situation for the junior person.

Frgood

As a person approaching old-timer status, your observation is probably the most common affliction. It seems that this is key to the confusion. Sometimes it feels that we are just going in circles every few years, and we ol' fogies have to apply new terminology to concepts long established. This is easily compounded by your point about college newbies that blindly accept the most recent training without any consideration of real world application. Too often developers and analysts begin development before anyone has finished presenting the business problem. The balance, in my opinion, would be that young programmers bring great efficiency and energy to the coding process, while older developers can temper that enthusiasm with a willingness to examine the problem and can genuinely simplify the object model. This combination could produce great results or, in this case, great frustration.

Justin James

That is a really good suggestion. It helps to get everyone on the same page, without making any one person feel targeted or singled out. It's also a great way to exchange ideas in a non-confrontational way. J.Ja

ArnoldZiffle

I had a co-worker programming in Pascal in a college course. She asked me how 'C' was different. I wrote macros for the 'C' pre-processor which allowed me to compile her Pascal programs in C. Just an exercise.

Justin James

I've heard that quote before too, but I am pretty sure it had a different language in there instead of Fortran, but it is so true. I've been guilty of it plenty of times. For a while, I kept trying to shoehorn my previous Perl experience into Java, and then later, VB.Net; it just did not work very well! Indeed, for the first time in my life, I am trying to learn the "XYZ way" of doing things. In this case, I am learning Ruby from a book, and paying close attention to the "Ruby way" of doing things. Luckily for me, the "Ruby way" takes a lot of inspiration from Perl, so it resembles something that I have worked with before and enjoyed. It does *not* resemble the "Java way" (or the "C# way") at all, which in all honesty, is a world which I feel has a sub-culture that tends towards really top-heavy code. But that's another discussion entirely. :) J.Ja

trilithium

To say someone spent decades in the 'stone ages' is an ageist and derogatory comment only if it was so intended. Anyone who would suggest that is too young to know and will get wiser as they get older. I am very proud of the fact that I was in the microprocessor revolution at the beginning, working for Texas Instruments writing code for 4-bit and 16-bit microprocessors using punched cards and paper tapes and all that stuff. By the time PCs came along I was used to stripping down minicomputer chassis, so assembling and fixing PCs came easy and I still do it for family and friends. I know plenty of people who don't even know how to open the lid on a PC. These and lots of other things happen over the course of a career, and the experience provides intuition and confidence. On the principle that "you're only as good as your next bug-fix" I continue to develop my skills as do so many other programmers of all ages, because life is a learning process, and no one ever convinced me it was ok to stop thinking or to stop learning.

JCode

Without the 'stone age' we wouldn't have the wheel! I think the problem is the focus on change at the expense of experience. Working together with an understanding of the strengths of 'senior' and 'junior' developers is the way to go.

Justin James

The rate at which the IT industry develops renders anything more than 10 - 15 years old "the stone ages". The person in question has been in the industry for about 40 years more than the person writing the letter. I think that squarely qualifies them as "having spent decades in the 'stone ages.'". Heck, my first language was BASIC on a mainframe, and my second language was COBOL. My first experiences with a computer were on Wangs. *I* qualify as having spent some time in "the stone ages", and I just turned 30. J.Ja

Beauregard T. Shagnasty

No, I'll agree it's an ageist comment, but only because I'm an old guy. I retired a few years ago, and at the time was twenty years older than any/all the other 'coders' at the company. They were all between late-twenties and about forty years old. Here's the kicker: I was the only one of the bunch who used OOP. Even the youngest of them, recently out of college, wrote mostly spaghetti code. I've often wondered what they did with my applications after I left, because I'm sure none of them were able to follow it. :-)

mattohare

All my applications have been better for the OOP because I do the encapsulation right. If you really need to do that much debugging into the existing object model, then it wasn't done right in the first place. It either didn't encapsulate or it wasn't tested adequately. The bathwater might be dirty, but don't get rid of the baby too.

Sterling chip Camden

Those who use languages that don't require OOP but enable it when useful (Ruby, etc.) tend to snicker when they see someone spending an hour just setting up the class hierarchy before they write any real code.

Justin James

I can't agree more... when I was working on my presentation for the Parallel Extensions Library a while back, one of my goals was to use some examples that fit the parameters you laid out... I couldn't do it, and ended up falling back on some old-fashioned "text book" examples... Fibonacci series and prime number brute forcers. Demonstrated the ideas well and succinctly, but definitely not "real world". J.Ja

SnoopDougEDoug

I totally disagree that "most IDEs are very poorly documented". I think much of the problem is that we know what we want to do, but we do not know the term that the IDE uses to describe that action. My first job out of college was maintaining the Unix Programmer's Manual. I would have killed for Visual Studio. Do you remember command-line debuggers? No Intellisense? Just those two features are worth gold to me. I haven't written any code on Unix/Linux in about 5 years, but doesn't NetBeans run on Linux? You might try developing a simple app on Linux and see how difficult it is to port it to your flavor of Unix (we were porting Java code to Solaris back in the day and we only found a couple of version-related issues). doug

Tony Hopkinson

You'll get a big pile of hints, warnings and outright no ways from the Pascal compiler. :p C is extremely powerful, quite often though it can be like slicing a loaf with a chainsaw.

Sterling chip Camden

... until you get to a language that makes use of special characters that aren't legal in the name of an identifier for a #define, or that make use of features that can't be translated into C by mere substitution, like lexical closures and continuations.

Slayer_

Though I only saw the end of the ole green screen machines, and I was a kid. Today's youth almost frightens me; they know so little, and usually don't desire to learn more. I wish I knew what you know about those old machines. Though my COBOL/JCL course almost killed me; too different from every other programming language I had learned before that.

johndecoville

I started coding in Fortran in the mid-60's. By the 70's I was also coding in COBOL (quite a bit) and IBM's proprietary language RPG-I. Now I am in my mid-sixties and am leaving VB.NET, RPG, and COBOL behind. Firmly on C# (I love it!). Everything in retrospect is primitive-looking. Just think, in the 1840's people were wondering what steam power (boats and trains) was doing to a sense of time and space. The telegraph was rapidly set up and adopted by the newspapers. Read "What Hath God Wrought," a Pulitzer Prize-winning recounting of this era. And that was technology too!

jk2001

I believe functional programming is from the stone age of computing, but it's hit the mainstream only in the past decade. Relational databases were invented in the early 70s, but only recently have mainstream programmers really taken languages like SQL seriously.

joeller

You were still using Basic on a mainframe 15 years ago? I learned BASIC on the Naval Academy's DTSS (Dartmouth Time-Sharing System) using teletypes in 1971. Then I learned Fortran and Cobol on the same machine. (Still luckier than my brother, who had to learn Fortran using punch cards.) But the last time I used BASIC on a mainframe was in 1975. Since then I've only used it on one type or another of PC. However, there is a great deal of difference between how we programmed back then, in the real "Stone Ages," and the description of how this other guy is programming. He sounds to me like the ASP web programmers I learned ASP.Net with. They are not only less enamoured with the use of OOP for OOP's sake, but a lot of the techniques they are comfortable with, and that were part and parcel of Web programming, are completely anathema to those of us trained in "software engineering". We tend to disparage their techniques but forget that these are the people that built the Internet into what it is today.

santeewelding

You go off half-cocked. And, I think, your use of "troll" sucks. I found it thoughtful; useful; engaging. I would welcome more. Not, less.

Sterling chip Camden

You're right. It's important to keep classes small and focused. Do one thing well, and then the interface becomes natural and doesn't need to be redesigned. Compose more complex architectures by combining instances of these simple classes, rather than creating complex inheritance hierarchies with ungodly dependencies.

Tony Hopkinson

I think it was Danny Thorpe, or possibly Mr Kernighan himself. Struck a very loud chord though. I've never done C in windows, just Unix, Linux, VMS and DOS, to be quite honest I don't want to either. There's a damn good reason for a whole plethora of languages that gave us more assistance in not doing something utterly stupid while developing, and I, at least, need all the help I can get. :p

Sterling chip Camden

It used to be truer than it is now. Most C compilers have gotten a lot more strict over the years, and they give you warnings about differing levels of indirection and such. Give me the days when the compiler would happily take 'A' as a pointer! Nothing that a good cast can't solve, though!

ArnoldZiffle

LOL. An excellent analogy, I almost fell out of my chair on this one. As for picking one over the other, I think if the programmer has a free hand in selecting their development language for a project (and most of the time we don't), they pick something in their comfort zone. Because of the project(s) I'm currently involved with, I'm working with VB.Net, C#.Net and SQL Server 2005. I could achieve exactly the same results with Perl CGI or PHP and Oracle, MySQL or good old Informix. But the environment where I'm contracting is an all-MS shop.

Tony Hopkinson

A Pascal compiler is a policeman; a C compiler, an accomplice. :p Less code to write and maintain, versus a better disciplined (all things being equal skill-wise) approach in terms of reliability and maintenance. I do both, and I'm not aware of anyone picking one or the other on technical grounds; usually it's people or tools based.

ArnoldZiffle

I agree it would be a mess trying to emulate 'C' with Pascal. As most hard core 'C' programmers say, "C gives you enough rope to hang yourself!" Which is a fancy way of saying it can do anything.

ArnoldZiffle

My co-worker's instructor had a big head. No one had ever gotten this particular exercise correct, including my co-worker. He of course assumed this was because of his superiority. My boss and I analyzed her program (which we of course had assisted in) and could find nothing wrong. Eventually we narrowed it down to the fact he'd obtained his original answers on a Big Endian or other machine (probably a PDP-11) and her college was using PC's, which are Little Endian of course. This does not mean he had been incorrect as far as his answer goes, but he had failed to properly address the difference in formulating his problem to the class.

Tony Hopkinson

returning anything other than an integer, relying on type safety.... Strongly typed to weakly is generally trivial, the other way round is often total rewrite. Even if you managed it, you've lost the compile time support, and you have no defensive code to cope with the weak typing. A guaranteed runtime violation the first time you sniff near the C version. Mind you C developers are no strangers to them....

Sterling chip Camden

... is in the opposite order of an S-expression -- RP being postfix notation, and S-expressions prefix. Naturally, an S-expression parser logically flips it around internally. RP accurately reflects the true order of evaluation (first evaluate the arguments, then call the function).

ArnoldZiffle

I have several HP programmable calculators which program in Lisp. HP calls it RPL (for the Reverse Polish part and Lisp). They first implemented it in their HP 28C, 28S series in the mid eighties, followed by the 48, 49 and now 50 series. I love messing with these things to see how far you can stretch the outside of the ole envelope. Think about this. You have an application which takes, say, three inputs to derive the answer. You push the three inputs on the stack and enter the name of the app and boom, done. Ok, now visualize this: push a LIST of objects on said stack and process the list in parallel with an app. I.e., in our original example, push a list of three-element lists on the stack, then execute that three-parameter app against the big list and it will process the entire list of lists in parallel. This is on a handheld calculator, mind you. Have a nice day!

Sterling chip Camden

... dates from the late 1950s, yet we are still learning from it today. It boggles the mind to realize that Lisp's original contemporaries were Fortran and COBOL.

Slayer_

The rest is abstract. To think, 10 years ago, thinking in such an abstract way meant you were probably retarded; now it's required? Oh how the world has changed, eh. This is why I loved doing my own projects, where I am the only support, as I can code it as I want, efficiently or inefficiently; I don't need to worry if it is within company standards, because I am the only one to maintain it. The code I produce seems to be a mixed bag: it's usually fast and efficient, easy to make modifications and fixes, but usually hard to add content; I tend to couple my code a lot.

Justin James

One of the happiest times in my programming experience was working with Scheme in high school. It was one of the only times I felt like I was being 100% intellectually engaged with development. I also (for better or worse) really liked COBOL. It was straightforward, and 100% focused on one thing and one thing only. To this day, I tend to find myself doing things with batch processing instead of "real time". There was a certain... simplicity... in that kind of work. 90% of the code was the actual programmatic logic, 10% was fluff. In the last 8 or so years, basically since I stopped working with Perl, it seems to be flipped the other way around. 90% of the code is interface and accounting for architecture and filling structures to be passed around, etc., and 10% of the code is actual program logic. Bleh. :( J.Ja

Slayer_

Seriously, it's terrible, and yet it's amazingly easy to work with. 25 tables, and there is a very small relational factor to them. 10 tables are "AP" tables, and 15 tables are "LN" tables. The AP tables are literally columns named AP_String1, AP_Flag1, AP_Money1, AP_String2, AP_Flag2... etc., for the maximum number of columns you can put in a table, 10 times. Same with the "LN"s. There can be many "LN"s to a single "AP". That is the only relationship. We keep a data dictionary in an Excel spreadsheet on what each column actually is used for. Believe it or not, it actually works great. New legislation that says we need to store "_______" info? Just use another empty string field, add it to the spreadsheet, and you're done; no database restructuring.

doug.cronshaw@baesystems

I recall using BASIC on an RT-11 O/S PDP11. That BASIC could be "compiled" or directly interpreted. From what I remember the compiled version of the BASIC code - with a different filetype for its results - was still run by the BASIC interpreter, it just didn't have to do any of the line-by-line high-level lexical and syntactical analysis, and probably had all the necessary storage already allocated within the pre-defined memory space. It certainly ran a lot faster than the fully interpreted source file version of the same BASIC programs.

mr_bandit

The first commercial BASIC system was the GE 235 in 1965. The manager at GE, Arnold Spielberg (you may have heard of his son), went to a conference in 1964 at Dartmouth && saw BASIC. He came back to Phoenix (location of GE Computer Systems) and started the project. They ended up hiring a bunch of the Dartmouth students for the summer to help with the project. They set up GE service centers to allow timesharing over phone lines. BASIC was a big step forward at the time because it allowed non-programming engineers (ie mechanical guys) to write programs. GE was the first to figure out how to make a timesharing system commercial. The concept was simple - charge for resources - time, memory space, etc. Sounds familiar... This led to a number of other BASICs on the other mainframes. I used a 'compiled' BASIC on an HP3000 in 1977 - it was an interpreter with a batch job wrapper. This also led to various BASICs on the microprocessor systems at the time - Uncle Billy's being only one example. What BASIC did, for good or ill, was open up programming to the masses. BTW - I am one of the old-school old-farts. I have a set of techniques that I use to implement embedded systems. These tend to be very old-school, because they are predictable. I write very vanilla code, with the cleverness in the algorithm. I just hope I am open to new concepts :) I cannot afford to have code fail (people die), which is the primary reason I am old school. Also, I tend not to have the fancy tools like an IDE. I get a serial port and an LED. I think the response is good - non-confrontational, respectful, and willing to learn why the guy uses old-school techniques (ie non-OOP). As a veteran of the Tech Wars, I started my formal education (and from my father, who worked on the GE 235) with the modular programming concepts. The thing is - if you put all of your state (ie the variables) into a struct, then pass the struct to carefully written functions, you have objects. If you reference the functions thru another struct or carefully created function, then you have OOP, with inheritance, etc. You get it all if you understand the underlying theory. You don't even need function pointers, just magic numbers that cause functions to be called in a switch statement. Good post && good comments.

hercules.gunter

You say that the new VAX had a Basic interpreter. If you are referring to VAXBasic, that was actually a compiler, and produced obj files just like other VAX compilers which could be linked with any other collection of obj files to produce an executable. The great thing about this was that if one needed to use mathematical functions in a COBOL program, one could use Basic's mathematical capabilities in an obj called by the COBOL program.

Justin James

Yeah, we had some pretty old equipment at my high school. It was either an elderly VMS system or an NCR system. They never quite made it clear what exactly we were working on (and I didn't know enough at the time to give a positive identification), but we had dumb terminals and such. Not quite "vacuum tubes" or "punch card" era (although there were punch cards all over the place, for some reason), but from what I could tell, it was definitely mid-80's technology at best. J.Ja

Realvdude

Here I thought Basic was just the pencil pushers trying to take control of the computers. I was at Western Michigan University from 1980-83, and the only mainframe that had a Basic interpreter was the new VAX for the business school. I did have to punch-card a few Fortran programs before they would allow us in the teletype rooms.
