Software Development

The distinct lack of fundamental programming theory resources


One thing that keeps surprising me is the number of basic programming articles I keep seeing published. I would have thought that if someone has not yet gotten the message that global variables are naughty, they never will. Likewise for SQL injection attacks, cross site scripting vulnerabilities, validating user input, following a variable naming scheme (not even which one, just "follow one!"), and so on. I would almost think that these writers are just looking to milk this dead horse for hits or filler content, except that in the Real World I see "experienced programmers" making these types of Mickey Mouse mistakes all of the time (not to disparage the character of Mickey Mouse by comparing him to unvalidated user input).

What I find interesting about the whole situation is that these articles are almost always in a particular context. An article about writing Web applications in ASP.NET might mention that it is best to always use parameters in a SQL query to prevent SQL injection attacks. Or an article about Perl might just mention the problems with global variables. And so on.
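To make the parameterized-query advice concrete, here is a minimal sketch using Python's built-in sqlite3 module; the users table and the injection string are invented purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"   # a classic injection attempt

# Vulnerable: string concatenation lets the input rewrite the query itself.
unsafe_sql = "SELECT role FROM users WHERE name = '%s'" % user_input
leaked = conn.execute(unsafe_sql).fetchall()    # [('admin',)] -- injected!

# Safe: the ? placeholder keeps the input as data, never as SQL.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()                                    # [] -- matches no real user

print(leaked, safe)
```

The same placeholder idea is what the ASP.NET articles are getting at with SQL parameters; only the syntax differs.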

I cannot recall ever once seeing a "basic programming theory" book on the shelf of a bookstore. I found a few on Amazon that did not seem related to a particular language. Oh, sure, you will see tons of "learn to program" books that get your feet wet in a particular language. Some of these are aimed at developers who just want to start learning a new language, and others are oriented towards teaching someone the basics of programming, using a particular language. But where does the still "potential" programmer go from there? Alternatively, what prepared them for that?

Code Complete by Steve McConnell probably has more language neutral fundamentals in it than any other collection of writings that I have read. But it is hardly suitable for a new programmer. For one thing, the examples are in a variety of different languages; while the point is obvious to an experienced coder, it is a bit too much for a newcomer to programming. It also refers to way too many things that new programmers do not know anything about.

Ironically enough, many of the items I listed in the first paragraph are not really "language neutral," but they are issues in 99% of the languages in common usage. Sure, SQL injection applies only to applications that use a SQL backend, but that sure is a lot of applications! As a rule, though, a book that simply takes the reader through basic fundamentals in a language neutral way, either as a preparation for learning a first language or as a follow up to learning the first language, just does not seem to exist, at least in a mass market format. I think it should! Where is the O'Reilly or Wrox Press Basic Programming book? (The O'Reilly one could have a larva on the cover.)

Granted, such a book might be too short to be an actual book; it might be better for it to be a Web site. But still, where are the "back to the basics" sources for new programmers? I think maybe some better education in the early years, months, weeks, and days will go a long way towards providing a better crop of programmers in the future. I am not even looking for in-depth, hard Computer Science. I just want to see something with basic "best practices." I want to see something so universal that it could practically function as a checklist of things that a good compiler plus a good code reviewer should catch.

The world of programming is getting more and more complex. Just try to figure out where to start if you want to learn Java if you do not believe me. I could not imagine learning to program by being handed a copy of Visual Studio Express or Eclipse; just learning to use the tools takes an experienced programmer some time. When I learned to program, the only tools I needed were the BASIC interpreter command, vi, and three basic vi commands (open, save, exit). Modern applications are much more complex, not less complex. To combat the complexity, the programmer learns a million and one frameworks and libraries, each one promising to resolve the problems that the COBOL and C programmers twenty years ago just did not have. Batch processing a fixed width text file and generating a new fixed width text file as output is a far cry from a modern Web app, where a simple "Open link in new window" can cause a massive problem in the backend.

As a result, the fundamentals are more and more ignored as schools and books race to cram everything someone needs to be a programmer into 400 pages or 8 short semesters -- and the quality is suffering as a result. If anyone knows of such a book, I would love to hear about it.

J.Ja

About

Justin James is the Lead Architect for Conigent.

89 comments
ross.linfoot

I agree with your comments about the fundamentals being a necessity for having a good grounding in programming. Unfortunately I have found no recent publications that accomplish what you are seeking, and would be interested in discovering them too. The only book that I really found useful when I started programming in C was "The C Programming Language" by Brian W. Kernighan and Dennis M. Ritchie. But I think everyone has used that.

Mark Miller

With few exceptions I always learned about programming through a programming language book. I started with BASIC when I was 12 on an 8-bit computer. Several years later I learned Pascal, again on an 8-bit. I learned about assembly and C in college. The general stuff we learned was data structures, algorithms and automata, and how operating systems function. I took a couple courses where I might've gotten a general programming book. One course I took was called "Comparative Programming Languages", where we got exposed to a few languages, and did a couple assignments in each. It wasn't a memorable course though. I took a senior level course on programming languages, and I remember we used "Programming Languages: Concepts and Constructs", by Ravi Sethi, a.k.a. the Teddy Bear book (because it had a teddy bear on the cover). I think that course focused on the history of programming languages, what the different kinds were, and generally how compilers and interpreters worked. We did programming assignments in it, but they were to just get us exposed to some dynamic languages. The one time we focused on "building a program" in any depth was in a course where we learned assembly language, along with machine architecture. We got assignments to build modules in assembly that did some specific things, and at the end of the semester we built a couple of apps out of the modules. This was 18 years ago. I didn't read "Code Complete" until I got out into the work world.

WKL

It is an issue of engineering craftsmanship that isn't adequately addressed, not just by the industry but by society and law. And it is an issue that desperately needs to be addressed by society and law because of the potentially grave consequences of slipshod practice and incompetence, particularly where safety and lives are at stake. Which is why licensing is required for a great many disciplines, such as electrical engineering. There are optimization, efficiency, process, interaction, reliability, and error/exception handling issues, in addition to implementing the basic function, that are fundamental aspects of any mechanism design. Whether you're designing a car engine, a nuclear reactor or an information handling algorithm, it's the same thing. You have to have a developed skill and aptitude for machine design if you want to develop something that won't drive people crazy and drive businesses into the ground. It is said that Microsoft bases its hiring decisions largely on the applicant's ability to creatively/cleverly solve puzzles, but being able to solve arcane puzzles IS NOT the same as being an effective machine designer. Just look at all the times updates and corrections have had to be issued whenever yet another buffer overflow exploit has been discovered, for example. You never, NEVER let outside world trash mess up your mechanism. You never, NEVER let a buffer "overrun" into code space. There is just NO excuse for that. That is nothing less than design incompetence, and any "developer - developer - developer - developer" (http://www.ntk.net/media/dancemonkeyboy.mpg) that is worth half a damn should know how to implement ring buffers with flow controls to avoid that.
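The ring-buffer-with-flow-control idea can be sketched in a few lines; this is a hypothetical illustration (in Python, for brevity) of a fixed-capacity buffer whose write path refuses excess bytes rather than overrunning:

```python
class RingBuffer:
    """Fixed-capacity byte buffer: writes past capacity are refused,
    never allowed to spill into adjacent storage."""

    def __init__(self, capacity):
        self.buf = bytearray(capacity)
        self.capacity = capacity
        self.head = 0   # next read position
        self.count = 0  # bytes currently stored

    def write(self, data):
        """Accept only as many bytes as fit; return how many were taken,
        so the caller can apply flow control (e.g. tell the sender to wait)."""
        accepted = min(len(data), self.capacity - self.count)
        for i in range(accepted):
            self.buf[(self.head + self.count + i) % self.capacity] = data[i]
        self.count += accepted
        return accepted

    def read(self, n):
        """Drain up to n bytes, wrapping around the fixed storage."""
        n = min(n, self.count)
        out = bytes(self.buf[(self.head + i) % self.capacity] for i in range(n))
        self.head = (self.head + n) % self.capacity
        self.count -= n
        return out
```

In C the same discipline applies, with the added obligation that the index arithmetic, not the language runtime, is what keeps the writes inside the allocation.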
I don't care how clever you are at figuring out how to "move Mount Fuji", or how wonderful you think your programming "style" is, if you aren't savvy enough to anticipate what could happen to your program out in the real world and invent solutions to deal with it, you need to reconsider your career options. Personally, I'm tired of wasting my time fighting yet another piece of [incompetence] slung together by someone afflicted with the delusion that they actually have a penchant for designing working solutions in software.

ProblemSolverSolutionSeeker

Look around at work! The emphasis is away from IT and more towards the business end. I cannot argue with this logic, but your concerns about literature reflect this trend. Go to any bookstore - you are likely to see the same pattern. Tons of business books, and dwindling sections for IT, and, maybe, one or two books on 'fundamental programming'.

Locrian_Lyric

During the boom of the 90's and through today, there have been too many schools mass producing programmers of questionable quality, and the universities are no better. The emphasis is on getting something running, not running well or running within an institutional structure. Maintainability is not taught, nor is optimization or efficiency. What feeds this is the corporate world wanting everyone starting new positions to "hit the ground running" and a distinct lack of emphasis on training in many organizations.

gweinman

Programming books do teach theory. Many books on patterns exist. Many books on OO exist. Curricula in university extensions offer advanced topics in programming such as patterns and frameworks. However, no book or course can instill the knowledge without tools that put that knowledge into effect. Can you imagine a carpenter apprentice program that teaches cabinet making without offering tools to practice the discipline? Similarly, books and courses need to focus on a tool that implements everything the book or course teaches. Smalltalk might be the best choice, but few interested in programming would sign up to learn in a language no company utilizes. So, the courses teach C++ or Java.

Tony Hopkinson

8-bit machine code (thirty years ago), BASIC (with line numbers), Pascal, C, Fortran, Delphi, C++, now C#.NET, academic-only ones like Ada and Prolog, and a multiplicity of scripting environments. Near 100% of good coding practice learnt from being burnt by bad practice, in my code or other people's. A few books since then, but they are more a restatement of lessons I'd already learnt. There are so many things you need to know to do programming as a job that are not formally taught, or are implicit in the tools used to teach it.

Elmonk

The prospect of going up the career ladder from humble coder to analyst / business process engineer / architect etc. leads to an attitude of considering the programmer phase just an episode, hopefully over in a short period of time. I've met people with university degrees in informatics who've never heard of how a heap sort works. I've met analysts who've never managed to get a decent program working properly. I myself started coding in assembler some 40 years ago. We eagerly picked up whatever books we could get hold of in order not to re-invent the wheel. Among them of course Knuth's Art of Computer Programming. Having seen most aspects of IT related jobs, I'm now back at "coding" robust and hopefully decently maintainable programs, mostly for unattended operation - and I'm proud of it!

Wayne M.

There are two issues here: one is data validation and the second is computer security. Data validation, including bounds checking, prevents data errors from causing system errors. The key to data validation is to understand where it might be applied and where it need not be applied. It is inappropriate for every function to validate every parameter; validation must be applied appropriately. Security is another issue. Security is a problem largely because TCP/IP has been implemented as an insecure communications link rather than a compartmentalized link. Anyone in the known world can send an executable to be run on the whole of your computer. There is no reason for public internet or e-mail to have access to anything beyond a protected subset of my computer. The idea of compartmentalization is not to try to prevent someone from doing something bad, but to limit the scope of what he can do. The flip side is that currently I can e-mail out any or all data I can access from my computer. Just because I have access to corporate privileged data does not mean I should be able to e-mail it to anyone in the world. This latter issue is far more representative of the actual security breaches that companies face than a buffer overrun is. Data validation is a non-trivial exercise to prevent undefined operation. There is almost always additional data validation that could be applied, and even more that has not even been identified. I do not accept the implication that a programmer is incompetent because he has failed to address every potential failure condition; this is one of the trade-offs that is made when delivering software. I also think we need to understand the rationale for compartmentalization and not seek indestructible software.

Mark Miller

Buffer overflows have shown up in other vendors' software as well, even open source. Programming schools have not taught about making your programs secure. They didn't do it when I took CS, and they still don't. I had to learn the technique for bounds checking my data copying while programming in C. I tried as much as possible to use arrays that were allocated on the stack, and to use a macro we developed that would only copy as much data as the buffer could hold. Buffer overflow problems showed up in the first place because development houses thought it was a good idea to move from whatever they were using to C and C++. If they began with assembly language it's possible they would've had this problem as well. Most other languages were bounds checked, but C and C++ were not. If software is developed with a dynamic language, or Java, or .Net, all arrays/containers are bounds checked by default.

Tony Hopkinson

A lot of software nowadays is the equivalent of a flatpack assembly. Relatively cheap and put together by well meaning amateurs, whose major skill is clerical. Theoretically, these people were meant to move to a more abstract level of design. Two major problems though: none of the pieces fitted together as desired, and no one told them what abstract meant - and if they did, these people have absolutely no idea of what it costs. Worse still, they took these people and made them component developers, hence MSChart et al. What would one of us do to fix a component like that? We'd break it up and change it from a wrapped application with a crap UI into a set of components. Not even my ego is big enough to claim I could maintain that POS efficiently or cost effectively.

Justin James

Business is definitely where it is at, for the time being. Folks with MBAs are being treated like IT folks were, about 10 years ago. For some reason, people think that if only they get business nailed down into a standard process, success will follow. I think the whole thing is a joke, myself. Colleges in my area actually have a major called "restaurant and hotel management". Four year colleges, at that. Why would they think that you need a college degree to do what amounts to following the instructions in a 3 ring binder and having good "people skills"? But I digress. A lot of people are under the impression that IT work is a "commodity" item now, and a lot of that is thanks to the various certifications. Why bother getting someone with that expensive qualification called "experience" when you can get someone with a dinky degree and a cert in something? And the person with the cert has a "standardized" knowledge base, supposedly. So now that the IT problem has been "solved", you can treat IT like a cog in the machine, and just expect it to work. It is ridiculous, I know, but that is how people look at it now. IT cannot even settle on a "best practice" for curly braces (as Tony loves pointing out on a regular basis ;) ); what makes anyone think that it is a "cut and dried" field like inventory management is beyond me... J.Ja

Tony Hopkinson

As in units out of the door times sales price. What does matter is the general business perception of what developers do. Get it right first time. Get what right? Cheap talent saves you money. My arse. Lines of Code is a good measure of productivity? Don't even go there. New things are obviously better - anyone for a bit of QAD? New things are new things - unit tests. Maintenance is just fixing coding errors. ROFLMAO. Being able to code "Hello World" means you can design an air traffic control system. Access enterprise database, anyone? The absolute killer is the tool that can make anyone a competent developer. The thing is, when you explain these stupidities in business terms, you get ignored because you are a geek, and geeks know nothing about business. The converse is obviously not true. Says so in all those books. If you understood them you'd know what you were talking about. :p It's not that businesses want silver bullets; it's their perception that we don't. We are resistant to change! That has got to be the biggest crock ever. Change is what we do.

Tony Hopkinson

Scholars and toolset developers being promulgators of silver bullets... Personally, werewolves don't bother me; I find the concept of the silver bullet offensive though. Either it's implemented in yet another piece of shelfware or flavour-of-the-month, all-things-to-all-men methodology. Or it's the life's work of some type who has never designed crap in the real world, bought into by his students who have just as much valuable experience. Cookie cutter, or shake and bake, came out of the daft arse idea that because you could reuse a library routine, you could reuse an application if you wrapped it up as a component. It's an open question to me how much shake and bake goes wrong because the developer can't choose the ingredients correctly, and how much because the ingredients are sh*te. To name two well known ones (not a dig at MS, I hate 3rd party component suites): MSChart and MSFlexgrid, two POSs I hope never to go near again. Both designed by a committee of half wits on a really bad day. All training at work would do is redress the lack in academia. Necessary, I'll admit, but late, seeing as you've already said they were 'qualified' by employing them. I doubt you could train people to write software properly. I bet we couldn't define proper. You could show them how you do it properly, but short of global variables, side effects etc, a lot is style and how you think. The idea of a best way is another constant drive to dumb down our discipline and turn us into glorified clerks. The best thing you can do is communally generate a set of working standards. How we do this, when we do that, what we call those. The bits in between the braces will either happen or you'll let the muppet responsible go.

Justin James

In my early years, we spent half a year working with EdScheme, a language of less than 20 functions. Our exercises were basically building a more complete and useful language out of it. The language itself was dead simple. It did not get in the way. I learned a lot about raw basics (recursion, error handling, how to break a problem down into small pieces, loose coupling, etc.) simply because there was barely any language to get in the way. It took 10 - 15 minutes to teach all of the EdScheme language at first, to get folks started. Now, which do you think is better for teaching someone fundamentals? A "real world" language where someone can spend months or even years stumbling upon all of its quirks? Or a language specially designed for such a purpose that got out of the way of the lesson? J.Ja
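The EdScheme exercise Justin describes can be approximated in any language; here is a hypothetical sketch in Python of the same idea - a "language" of three primitives, with everything else built from them by recursion:

```python
# The only "language": three primitives, roughly car/cdr/null? in Scheme terms.
def head(xs):  return xs[0]
def rest(xs):  return xs[1:]
def empty(xs): return len(xs) == 0

# Everything else is constructed out of them by recursion.
def length(xs):
    return 0 if empty(xs) else 1 + length(rest(xs))

def reverse(xs):
    return [] if empty(xs) else reverse(rest(xs)) + [head(xs)]

print(length([3, 1, 4]))   # 3
print(reverse([3, 1, 4]))  # [4, 1, 3]
```

With so little language to learn, the lesson is forced onto the decomposition itself rather than onto syntax.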

Tony Hopkinson

For a start, patterns are OO. The distinction we are trying to make is more flat pack assembly vs carpentry. It's more that knowing how to use a hammer does not tell you when, where and why to use a nail. You know the aphorism: if everything looks like a nail, you always use a hammer. Well, that's where we are. Invisible list box controls, word processing in worksheets, matrix transforms in relational databases and my absolute favourite, MS Access for the enterprise. We weren't talking builder patterns here, but something much more basic, and an advanced course is far too late to learn it. You'd have a shed load of bad habits by then. If you can code in Smalltalk you'll know formal OO at the end of it; the same cannot be said of Java, and definitely not C++. Choose the best tool to teach with, not the most popular one in a locale for bodging stuff out the door because some twit sold it already. Programming is not language; all language is, is syntax and semantics.

Justin James

It is also important to note that once you get on the upwards career path, even for those desirous of continual learning, it is hard to do. I hit that analyst state about 6 months ago, and tonight I am doing the first "for fun" coding that I have done in 6 months. My "for work" coding is strictly limited to basic SQL to dig through the database. I know that if I am not careful, my programming will atrophy, very quickly. J.Ja

Justin James

The "Reply to All" button is probably a bigger security hole than most applications out there are with buffer overruns. At one job, the fact that people would just create directories in the public area without setting permissions to be just their group, and then stuck sensitive documents in there, was a bigger problem than all of the bad code in the world. On the flip side, targeted, planned attacks against bad code usually do more damage at once, like when someone rips off 2 million credit card numbers, as opposed to the accountant accidentally emailing the earnings numbers to his wife. Overall, most security breaches are people who are authorized to use the data misusing it. Sys admins who wipe a drive in anger, a DBA walking out with a USB drive full of tables to sell to a competitor, stuff like that. It is not that they made a mistake or breached security, it is that they abused their privileges. J.Ja

Wayne M.

C and C++ were developed to work with hardware and other non-software components. It is not enough merely to return buffers to the heap when you are done with them; one must also make sure to release files, close interfaces, and toggle bits in I/O boards. I have always felt that a programmer who took care of these types of object lifetime issues will handle memory release quite easily. Besides, it's Friday and I just felt like being a pro-C++ anti-GC guy today.
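The point that lifetime covers more than memory can be illustrated with a deterministic-release idiom; this is a hedged sketch in Python (the io_board device is invented for illustration), showing the resource released even when an error interrupts the work - the discipline C++ programmers get from destructors/RAII:

```python
from contextlib import contextmanager

released = []

@contextmanager
def io_board(name):
    """Hypothetical device handle: acquire on entry, and guarantee release
    (close the interface, toggle the bits back) on exit -- even on error."""
    print(f"open {name}")
    try:
        yield name
    finally:
        released.append(name)
        print(f"release {name}")

try:
    with io_board("port-A"):
        raise RuntimeError("fault mid-operation")
except RuntimeError:
    pass

print(released)  # ['port-A'] -- released despite the exception
```

A garbage collector alone gives no such guarantee about *when* the release happens, which is exactly why non-memory resources still need explicit lifetime handling.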

Justin James

C/C++ execute quickly because they are not safe. Even the most seasoned C/C++ coder can make a minor mistake (missing asterisk, anyone?) that can cause big problems. But that was the tradeoff folks made for the raw speed of C/C++. I think there is something else in there too, and that is the sheer "general purpose-ness" of the language. Other than C/C++, what truly general purpose languages were around then? BASIC, Pascal... everything else seemed to be fairly purpose-oriented, from what I can tell (Fortran, COBOL, Lisp, Ada). Smalltalk probably qualifies as a GP language, but I do not know enough about it to say. J.Ja

Tony Hopkinson

Bounds checking is built in; therefore, the coding techniques for avoiding overruns aren't taught and the coding habits aren't learnt. When I went back to Fortran :D after near 8 years of Delphi, I'd lost those habits and had to regain them, along with many others that newer environments do for you automatically, such as auto initialisation etc. If you were never aware of them in the first place.... Deity forfend, these people doing unmanaged code in .Net. Then there's managed itself: learn only that and then go to an unmanaged environment, and you're going to leak like a sprinkler system, aren't you. Not to mention the foolish assumption that memory management doesn't have to be learnt because the GC takes care of it. There are still times when you have to know how it works, and how can you do that if you don't know how memory is managed?

Mark Miller

Huh. I thought certs were worthless. That's what I heard just a few years ago. I used to hear endless complaints from people that they had a gazillion certs, but still weren't getting hired. They said every employer was looking at their experience level. What I read is that employers were skeptical of certs, because there were so many people who had "book certs" where they had studied enough to pass the tests on paper, but didn't have the experience necessary to actually do the work. Maybe this had more to do with lack of flexibility. Experience and flexibility can go hand in hand, I imagine.

Locrian_Lyric

Perhaps as a maintenance coder I get to see the worst of the worst. But when I can go in and eliminate 75% of the code and improve processing time by nearly the same ratio, something is wrong, and style has nothing whatsoever to do with it. Mass produced programmers lacking the sense that the almighty gave a senile toad is certainly a contributing factor. There is FAR more to programming than merely getting programs to run, and it goes well past the globals, et cetera:
- meaningful variable names
- efficient use of code
- freeing memory
- coding for maintainability
- following shop standards
- maintaining source control
- maintaining code libraries
Et cetera, ad nauseam - all of these can be taught. You can teach the proper way of doing things at least to the point where some of the more blatant and aggravating elements are extinguished. Yes, there is a certain artform to what we do, and to maintain code, you need to be half psychologist to begin with. That said, the artistry has nothing to do with the anarchy.

alaniane

I learned how to understand abstract constructs by flow-charting the processes. I still resort to flow-charting when I receive a complicated piece of spec. I will take the spec and diagram the various parts using a combination of flowcharts, UML, and ERD. It just depends on what the spec is trying to accomplish. Maybe students need to be taught how to diagram their problems. I have found that Assembly language has been quite useful in coding programs. I don't code in Assembly anymore, but having learned Assembly helps me understand the underlying architectures. For example, having to code FSMs in assembly has helped me improve data input validation for programs written in VB. Of course, I like to know what is going on underneath the hood, and not just that inputting X gives me Y.
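The FSM approach to input validation might look like the following; a minimal sketch (in Python rather than assembly or VB, and with invented state names) of a table-driven state machine that accepts optionally signed integer strings:

```python
# States for validating an optionally signed integer string, e.g. "-42".
START, SIGN, DIGITS, REJECT = range(4)

def classify(ch):
    """Map a character to an input class for the transition table."""
    if ch in "+-":
        return "sign"
    if ch.isdigit():
        return "digit"
    return "other"

# Any (state, input) pair missing from the table falls through to REJECT.
TRANSITIONS = {
    (START, "sign"):   SIGN,
    (START, "digit"):  DIGITS,
    (SIGN, "digit"):   DIGITS,
    (DIGITS, "digit"): DIGITS,
}

def is_valid_int(text):
    state = START
    for ch in text:
        state = TRANSITIONS.get((state, classify(ch)), REJECT)
        if state == REJECT:
            return False
    return state == DIGITS  # only the accepting state counts

print(is_valid_int("-42"))   # True
print(is_valid_int("4x2"))   # False
print(is_valid_int("-"))     # False
```

The value of the diagram is that every (state, input) pair is accounted for explicitly, so "what happens on bad input" is a design decision rather than an accident.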

Tony Hopkinson

after a mathematics course. The function Head returns the first line of a file, and a file is Head plus the rest of the file; Head(2) returns Head plus the Head of the remainder after Head(1). Not the way you would implement it. :D But it gets interesting when you start combining operations - Tail(1), for instance, as in Head(1)(Reverse).
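That Head/Reverse composition can be sketched directly; assuming a "file" is just a list of lines, Tail falls out of composing Reverse and Head:

```python
def head(lines, n=1):
    """First n lines of the file."""
    return lines[:n]

def reverse(lines):
    return lines[::-1]

def tail(lines, n=1):
    # Tail(n) as the composition Reverse -> Head(n) -> Reverse.
    return reverse(head(reverse(lines), n))

log = ["line1", "line2", "line3", "line4"]
print(head(log, 2))   # ['line1', 'line2']
print(tail(log, 1))   # ['line4']
```

Defining operations by composing a tiny set of primitives, rather than writing each one from scratch, is exactly the mathematics-course flavour being described.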

gweinman

The point is construction, not the tools used to construct. Programmers construct applications using tools, whether the tool is as simple as a hammer or as advanced as a numerically controlled lathe. Advanced tools are designed for specific disciplines that eventually implement in a programming language. A lathe is the same. It implements in the language of metal, plastic, wood, ... I learned programming on Wirth-less languages and then had my eyes opened when procedure based languages arrived in corporate America. I further became aware when OOL/OOP arrived many years later. Since then application construction has evolved substantially. I maintain no programmer is worth their salt if they do not understand patterns. If I interviewed a candidate who did not know what pattern to use to develop a cascading menu, that programmer would not advance past the first interview. Anyone can learn to create a procedure based application using any of the OO languages in the market place. That in no way suggests the programmer effectively used that OOL. Most likely because that programmer has not learned to think OO. Thinking OO must be taught early. I suggest the most effective means of conveying that knowledge is training in core patterns and frameworks.

Wayne M.

As I have tried to state in a previous post, I do not believe that software development has yet matured to the point where there are fundamentals that can be prescribed for licensing purposes. The knowledge base is growing such that new important information is being discovered, and old information that was key is often no longer important. I also feel that the fundamentals differ among different types of software development, such that a fundamental in one area (say embedded programming) may be a non-issue in another area (say web development). This is not to say that one area is "better" than another; just that there are significantly different prioritizations of what is important. Given this environment of changing expectations and knowledge, the approach to software quality needs to be one of continual learning. Businesses and organizations need to take responsibility for the technical development of their software staffs. It is simply not acceptable to "hire the best", let them work 3-5 years, and let them go because they are now out of date. Companies need to maintain institutional knowledge while teaching newer developments. The main concern I have with a legal approach is that we will be forced to apply today's standards to yesterday's software. Would we feel comfortable going through RFCs and suing the developers of e-mail for not putting in a prevention for spam, or suing the developers of FTP and Telnet for sending passwords in the clear? Software development is still a young and growing expertise. It is too early to put a stake in the ground and say "these are the rules." To develop quality software, corporations must take responsibility to educate their staffs regarding new developments in the industry.

WKL

"But too many of them are in this industry, and too many of them confuse getting code to produce the expected output with the expected input with actually writing good code." Damn straight!

Tony Hopkinson

I pride myself on being visible; I don't hide behind technicalities, ever. You try doing that while Big Fred's bonus is going negative because you screwed up. Doesn't matter how good our intentions are; they will still pave our way to 'hell'. The only way to ensure that vested interests don't take over is to make sure there aren't any. Any academic body, after a time, has tried and always will at some point try to redefine those standards to keep itself in power. Human nature 101.

WKL

...is that you, and every other software developer, continues to enjoy a period of time where the general populace is rather ignorant of just what these "computer" things actually are, and how it is they come to do what they do, and where this stuff called "software" comes from. They still think in terms of PHYSICAL problems, not PROGRAMMING problems. They still think that if it doesn't work right, it's not "software", whatever THAT is (most people literally don't understand that a "computer" is just a machine that becomes what someone PROGRAMs it to be - not even the corporate execs) but rather more like a car engine with a bad spark plug or something. The old "it's the computer's fault if there's a problem" thinking is still very much alive and well. When more and more people start to understand that computers behave according to a set of instructions that represent a person's thoughts, translated into a series of instructions that are executed sequentially, and that the reason people have problems and FAILURES is due to a PERSON's failure to implement the intended THOUGHT in SOFTWARE - well, THEY are going to start thinking about it. Need I say more? Licensing doesn't necessarily mean that there is only going to be "one way to do things", any more than it does, say, in electronics (although, that may not be such a bad idea! Getting rid of all the different languages, settling on an approved standard, and standard coding techniques, just might be a step in the right direction! Just straightforward, clearly implemented logic!) Enjoy your relative obscurity while you can. I don't think that it's going to last much longer. Heh, heh, heh.

WKL

It is downright terrifying if you think about it long enough.

Tony Hopkinson

Because as an industry we kept chucking people without wings out of the plane, or sawing their wings off first. The parachute was a response to the general inability to do things correctly. All "correctly" meant was remembering to release something after you'd finished with it. OK, a tad oversimplified. :p The thing to remember is the GC solves none of the other issues around lifetime management, such as deferring creation, localisation... The scary thing about it is that it hides these issues, and in any non-trivial development they will come back and bite your ass. I guarantee you get masses of memory tied up because it's referenced by a high-level instance instead of having the low-level instance reference the high. Avoiding crap like that is something you learn real quick in an unmanaged environment. The mistakes there tend to be dangling pointers and simply omitting release; we have tools that identify those issues. We also have techniques for avoiding creating them in the first place. All the GC means is the inexperienced, unskilled or just plain rushed developers can crank something out that won't fall over the first time you run it out of the door.
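Tony's retention point can be sketched in a few lines of Java. The classes here are hypothetical, just an illustration: the GC frees only unreachable objects, so a long-lived "high level" object that keeps a reference to a bulky "low level" one pins that memory for its entire lifetime, with no leak that any tool will flag.

```java
import java.util.ArrayList;
import java.util.List;

public class RetentionSketch {
    static class Report {                 // long-lived, high-level object
        byte[] rawInput;                  // bulky, low-level; only needed while parsing
        int total;

        Report(byte[] rawInput) {
            this.rawInput = rawInput;
            for (byte b : rawInput) total += b;
            // Bug: rawInput is never dropped, so the whole buffer stays
            // reachable (and uncollectable) as long as the Report does.
        }

        void releaseRaw() { rawInput = null; }  // fix: cut the reference when done
    }

    public static void main(String[] args) {
        List<Report> reports = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            reports.add(new Report(new byte[1_000_000]));
        }
        // ~100 MB is still reachable here even though parsing finished long ago.
        reports.forEach(Report::releaseRaw);
        // Now the buffers are eligible for collection; the Reports are tiny.
    }
}
```

The GC is doing exactly its job in both halves; the difference is whether the programmer thought about lifetime at all.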

Tony Hopkinson

As I said in response to another, go down that route and you end up with there's only one way to do things. No matter all the panacea merchants' attempts, software cannot be designed by rote. On the surface a guild of professional programmers sounds reasonable, but historically such things tend to go protectionist; they become resistant to change. What would an established guild of professionals' view of OO or Agile have been? All of a sudden they would not have been competent. :D They aren't going to like that. To avoid that, you get down to don't use global variables, give things a meaningful name, avoid side effects. Things that someone with an IQ in the high 20s can appreciate. I think we should have a group that assesses the value of the qualifications that are out there already, academic or proprietary. Force them to become meaningful, drive them to become valuable outside their own limited niches. Most experienced professionals accept that none of them mean you will be judged competent in the industry. Got to be careful: how could we say you are better than I, or JJ is better than me? We can judge competence, but we must always remember that in industry we are only allowed to be as competent as the budget will support. We've all done nasty bodges, taken expensive shortcuts and used inappropriate technologies. The difference between good and bad is not doing it out of ignorance, and that's where knowledge of the fundamentals comes in. Above all, any body should drive those; they can be restated, reworded, adapted, but at their core they are immutable. Probably the only things that are in our industry, from a technical point of view. That would be more an ethos than a rigid structure.

Justin James

I agree with the general feeling that industries do not want to be regulated, but when they combine being crucial (or people's lives depending upon them) with not being reliable, they get regulated. Software is getting to that point really, really soon, I suspect. When the Java license says not to use it to run a nuclear reactor or air traffic control, I have to wonder... and how many people have broken that license clause, anyways? Ack. Scary thought. J.Ja

Justin James

In a world where so many coders barely understand scope, what makes you think that lifetime will happen? I encounter so many apps that count on the GC to clean up unused file handles, for example, and as a result, files remain locked for too long. Or apps that slowly leak memory over the course of weeks because of one small object that is never fully dereferenced so it can be GC'ed. If coders can't figure out lifetime with GC, taking GC away magnifies the problem significantly. Not saying that I disagree. I think anyone who does not understand lifetime, scope, etc. ("the basics") really has no business in this industry. But too many of them are in this industry, and too many of them confuse getting code to produce the expected output with the expected input with actually writing good code. J.Ja
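The file-handle complaint above comes down to deterministic versus GC-driven cleanup: the collector reclaims memory on its own schedule, and OS resources like file locks ride along only as an afterthought. A minimal Java sketch (the `TrackedWriter` class is hypothetical, standing in for a real file handle) shows the fix: try-with-resources closes the resource the instant the block exits, instead of whenever finalization happens to run.

```java
import java.io.StringWriter;

public class HandleSketch {
    // Stand-in for a file handle: records whether close() has been called.
    static class TrackedWriter extends StringWriter {
        boolean closed = false;
        @Override public void close() { closed = true; }
    }

    static TrackedWriter writeWithTry() {
        TrackedWriter w = new TrackedWriter();
        try (TrackedWriter scoped = w) {   // try-with-resources
            scoped.write("hello");
        }                                   // close() runs right here, deterministically
        return w;
    }

    public static void main(String[] args) {
        System.out.println(writeWithTry().closed);  // true: no waiting on the GC
    }
}
```

Drop the try-with-resources and the handle stays open until the collector (maybe) gets around to it, which is exactly how files end up locked for weeks-long stretches in the apps described above.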

Justin James

Whenever anyone mentions GC, there are a lot of tradeoffs to it. How many programmers would there be without GC'ed languages? A heck of a lot fewer than there are now. Development would take much longer too, due to the added testing, debugging, and coding time needed. Sure, when writing something that thousands or millions of people use (Office comes to mind), the milliseconds saved per user add up to enough time to well justify the developer time. But for your run-of-the-mill internal app, it is really hard to justify the additional coding time of C/C++ compared to Java or C# or VB.Net. While I agree with you that memory management is something a "pro" will not have many issues with (heck, when writing .Net code, I tend to write like the GC barely exists, and I make its job as easy as possible by setting references to null as soon as something is no longer needed), the ratio of "pros" to "shake 'n bakes" is such that GC makes sense. It is kind of like how the verbosity of a language like Java makes the overall project go faster, because 90% of the folks on the project will stumble all over an elegant language like Lisp or Perl or whatever. If you really miss the days when non-GC'ed languages ruled the roost, just think back to Windows 3.1, when all Windows apps were C++ (pre-VB 3), and a stable app was a rarity indeed. The pro/shake-n-bake ratio has always been the same, but at least with GC, the shake-n-bakers have a parachute! J.Ja

WKL

To provide input into what is expected of competent software developers for the purpose of enacting laws requiring professional licensing? How about you, too, Wayne?

Tony Hopkinson

They don't call it the garbage collector for nothing. In my personal opinion it was designed as a crutch for amateurs. Can't wait for the "oh you don't have to worry about memory management" brigade to really get cracking, another lucrative decade of clearing up after noobs.

Justin James

My reading speed and comprehension of paper is many times that of screen. That being said, I also agree that the example you gave is exactly the type of thing I mean. This is knowledge that experienced coders have, but typically cannot even remember where they got it from. Indeed, that is one of the dividing lines between "decent enough" and "good" programmers: if something like that is intuitive to you. Five, six years ago, I was writing Java mixing primitives and objects all over the place, and then complaining about the mess that the mixture made. Today it would not occur to me to do that. J.Ja

jean-simon.s.larochelle

...of the memory issues and GC cost. I'm not talking about inappropriate advanced optimization. Things like correct library usage, for example, don't affect readability or maintainability and go a long way towards making an application run smoothly (things like using the primitive wrappers' factory method, Integer.valueOf(int), instead of new Integer(int), and other such basic stuff). I think that having exposure to languages such as assembly does raise one's awareness of those factors. One thing that I didn't mention is software programming magazines. I bought and read a lot of them, and I still buy and read the few that are still available on the shelves (paper). A magazine article is often a quick way to raise awareness of some aspect of programming or to get up to speed on new features in a language (magazine articles are often just an overview). My favorite magazine was "Computer Language". This eventually became "Software Development" (still one of my favorites, but less interesting than "Computer Language") and is now dead (assimilated by Dr. Dobb's, an excellent magazine but no substitute for the real thing). Fewer paper magazines is good for trees but bad for me, because I used to read a lot of those on the bus. JS
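The Integer.valueOf(int) point is easy to demonstrate. The Java language spec guarantees that boxing caches values in at least the range -128 to 127, so the factory method can hand back a shared object, while the (now deprecated) constructor always allocates a fresh one:

```java
public class BoxingSketch {
    public static void main(String[] args) {
        Integer a = Integer.valueOf(100);
        Integer b = Integer.valueOf(100);
        System.out.println(a == b);        // true: same cached object, zero garbage

        @SuppressWarnings("deprecation")
        Integer c = new Integer(100);
        @SuppressWarnings("deprecation")
        Integer d = new Integer(100);
        System.out.println(c == d);        // false: two separate allocations
    }
}
```

In a loop boxing millions of small values, the constructor version churns out millions of short-lived objects for the GC to sweep up; the factory version creates none. Same readability, very different memory behaviour.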

Justin James

I have seen some markets place a lot of emphasis on certs, and some that didn't. Large shops seem to like them more than small shops. I think once you have 5 - 10 years experience, they know you know the stuff. Some folks just would rather try hiring a freshly certed guy than an old timer and save some money. J.Ja

Tony Hopkinson

The certs were to get past HR, or simply a first-pass filter to cut down on the number of resumes to be trawled through. Even better, a word search would do that first pass. Negotiate that hurdle, then you get judged on experience. It's the old catch-22 for new starters that always comes up when entry level gets squeezed out: without the experience, you can't get the experience. So you get a pile of mendacious gits who sell certs as 'your' way out. To a certain extent they were right; in an abundance of candidates, without them you were f'ed from the get-go.

Tony Hopkinson

Studies show about 30% of the time in maintenance, error correction or enhancement is comprehension: trying to understand what you've got now, and what the change will do to it. Designing for at least readability and testability has enormous impacts on that time. I pretty much agree with you that none of these things are on the academic radar, except possibly efficient code. Like coding for maintainability, there is a lot of personal, or perhaps team, style in that. Ask Professor Nojob how to code properly and you'll be lucky to see these mentioned in a footnote. The rest of it will be some arse he cooked up to be different. Ask Mr Tools and he'll tell you to buy X, which he just happens to have a copy of at 15k, with a pile of bollux by Professor Nojob that says it's great but doesn't prove f'all. Ask someone who's never maintained for any amount of time and he'll say do it his way: Professor Nojob's methodology implemented in Mr Tools' product, usually. Ask us and we'll come out with a variation of "All routines in forms that get data from an external source are prefixed with Load". But it can't be that simple, can it? :p If you stick to the base fundamentals which contribute to those two, you are on a winner (side effects, functions that do one thing, and minimising global variables added to your list). If you make them language agnostic, easy Load, yeah, that's easy. And try to stay away from aesthetic issues: foreach with a test and break vs while or repeat, for instance. Then the only thing you have to do is convince management that the time spent on generating, evolving and policing these standards will have an ROI. Course, seeing as if we got it right first time this would be wasted effort...... :D

Tony Hopkinson

1. Initialise. 2. Process. 3. Finalise. Almost disappeared from the face of the earth in favour of graphical techniques. I still favour it though. A dashed line with a closed diamond symbol means what, exactly? :D Guess I should pick up that new Yourdon stuff. :p
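That three-phase shape still sketches cleanly in code (the method names here are hypothetical, just an illustration): initialise acquires, process does the work inside a try, and finalise sits in a finally so cleanup runs even when processing blows up.

```java
public class ThreePhase {
    static StringBuilder log = new StringBuilder();

    static void initialise() { log.append("init;"); }     // acquire resources
    static void process()    { log.append("process;"); }  // do the actual work
    static void finalise()   { log.append("final;"); }    // release, no matter what

    static String run() {
        log.setLength(0);       // fresh log for this run
        initialise();
        try {
            process();
        } finally {
            finalise();         // runs even if process() throws
        }
        return log.toString();
    }

    public static void main(String[] args) {
        System.out.println(run());  // init;process;final;
    }
}
```

No closed-diamond symbols required; the ordering and the failure guarantee are right there in the control flow.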

Justin James

That is a great suggestion. Have people diagram out the steps to do something, take the language totally out of the picture. Plus, it teaches them real world skills! I really like that as a method of teaching basic programming theory! J.Ja

Tony Hopkinson

...for the penny to drop on the Windows event model than it did for OO. OO expressed the way I'd been taught to perceive problems anyway. Coming from a mainframe procedural background, my thinking was very linear. That however can be a boon when looking at strategies for decoupling: service-oriented architectures or scripting hosts, for instance. I'm not the sharpest pencil in the box, but programming at its most basic (break a task into steps) does seem to come naturally to me. I get concepts very quickly; implementing them correctly, I'm as fast or slow as most of the people whose ability I respect. I was still seeing tightly coupled code in 2003, new code as well, not legacy. I was contracting, and I did specialise in pulling projects out of the cart, so maybe that contributes to my bias. Anyone can write a program; writing a good one and enjoying it is a different matter. Things went bad on us when the talentless descended on our industry for the money. We are still dealing with the consequences of that. For all our intellectual joy in the purist pursuit of programming as a discipline, we still have to sacrifice best for practical in business. As you say, the pendulum is swinging, and a real estimation of the costs of an unmaintainable code base has hit the business radar. About time too.

gweinman

Gamma et al. only formalized the concept of patterns. Experienced programmers already canned code so they would not have to redesign solutions to commonly occurring problems. I developed patterns I used for CICS development in COBOL in the early 80's for the same reason. I now see 50 or more published patterns that cover basic programming problems through advanced application construction. I still maintain OO can be best taught by attacking a problem for which a pattern exists. Niklaus Wirth argued a programmer must know all aspects of a programming language before they can create a sound solution to a problem. This is analogous to your statement about scope. Courses on C# cover that, plus information hiding, inheritance, ... These courses also teach why those concepts are good ideas. Yet a programmer can know a language well enough to pass a certification test and still not produce good results, because they cannot make the leap from procedures to classes. That is: they cannot think OO. So you learned from the school of hard knocks. You were smart enough to raise the level of abstraction sufficiently to create your own patterns. How long did that take? Every time I switched to a new language it took me many months to set up abstract modules, and I started programming a long time before abstraction was considered more important than machine efficiency. I agree that any bozo who commits pattern example code to memory will not produce quality results. I've seen such people and wondered why. It sounds like you have too. If someone cannot differentiate tightly coupled logic from loosely coupled logic, that person does not understand logic well enough for me to recommend their hire. But someone did, because they work down the hall. In the 70's and 80's it was very difficult to write loosely coupled applications. IT managers cared more about machine efficiency than labor efficiency. Now the pendulum has swung to our side forever more.
Anyway, my belief is a person can be taught how to program and how to use the best programming practices. That is probably a one year curriculum. The next step is teaching that person how to abstract. That is where patterns and frameworks present the best opportunity to convey the knowledge.

Tony Hopkinson

If you mean GoF speak, then you'd be wrong. I'd been using patterns in OO well before I read the book and discovered their names. One of the problems with being self-taught, that. We appear to be arguing past each other; patterns are a specific class of OO scenario. In them the cornerstones of OO are a given. What some of us are saying is that because they are a given, when the how of using a tool is taught, the why is not, or at best is a footnote on page 256. How can you ever understand OO if you don't understand scope, for instance? Teach people that way and they do OO by rote; as soon as they run into something that doesn't fit the pattern they know, they break the entire concept in oblivious ignorance. Tightly coupled OO code is an oxymoron. If they cover their code in side effects, misleading names, jettisoned exceptions and leaked resources, complicate things for the sake of it, and leave gaping security holes in fragile crap coupling disparate functions, then their patterns are worth less than a cross p1ssed in a snowbank. I learned programming from having to fix my own mistakes at three o'clock in the morning with a big hairy machine operator, losing his weekly bonus because of me, breathing down my neck. A hard school, but you don't want a 0/10 "See me" off Big Fred! :p Well, not a second one. :( I wrote my first program in 6502 machine code in 1977, so I've seen a lot of stuff change as well. I've seen a lot stay the same though: coupling and cohesion haven't changed at all, not even a little bit. Yet I meet people with more letters after their name than in it, who respond with Huh?
