
Variable names should usually be descriptive

Chad Perrin says the long evolution of programming style leads us to one inescapable conclusion about variable naming conventions: what we should name our variables depends on context.

A programming practice that largely vanished with early versions of the BASIC programming language was the use of single-character variable names. The difficulty of deriving meaning from a variable named a, meant to represent a concept like "accounts receivable," impedes the quick and easy comprehension of what our code is supposed to accomplish for us. As a result, a common "best practices" rule of thumb was developed that could be expressed thusly: use descriptive names for variables.
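The cost of that old habit shows up even in a two-line sketch (the balance figure and the 3% fee here are made up for illustration):

```ruby
# Opaque: the reader must guess what "a" holds and what the literal means.
a = 1500.00
f = a * 0.03

# Self-documenting: the same computation reads like a description of itself.
accounts_receivable = 1500.00           # hypothetical outstanding balance
late_fee = accounts_receivable * 0.03   # hypothetical 3% late fee
```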

This concept is central to the more modern, more general rule that, to the extent reasonable, source code should be self-documenting. It is preferable to be able to read source code directly and have all the information we need to understand it in full right there in the source, rather than having to read separate documentation of the code and keep track of where that documentation matches up with the syntactic elements of the source itself. It is preferable not least because of the tendency of documentation and source code to get out of sync. This applies almost as much to code comments explaining how code works as to separate documentation.

With self-documenting code, then, code comments can be reserved for why we wrote our source code the way we did. We increase the information density of our source code files without increasing the difficulty of reading them, this way. In fact, reading and understanding get much easier when the source code itself reads a bit like a story of how the code works, rather than like pseudorandom streams of characters.
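As a small sketch of that division of labor (the order data and the resend scenario are hypothetical), the names document the "how" while the comment records the "why":

```ruby
raw_orders = [
  { id: 1, amount: 10.0 },
  { id: 2, amount: 25.0 },
  { id: 1, amount: 10.0 }  # duplicate entry
]

# The names say *what* happens; the comment records *why*:
# the (hypothetical) upstream feed occasionally resends an order,
# so we deduplicate before totaling.
unique_orders = raw_orders.uniq
order_total = unique_orders.sum { |order| order[:amount] }
```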

The rule of thumb that we should use descriptive, (presumably) verbose variable names is just that, though -- a rule of thumb. It should not be taken as a divine commandment, never to be broken. There are cases where long descriptions of what a variable does are not appropriate or desirable when trying to make our code as clear, and thus maintainable, as possible.

To the extent that a variable name is confined to a narrow context, it is often appropriate to use a shorter, less descriptive variable name. In fact, doing so is often more than appropriate -- it is advisable, because a lengthier name would make it take longer for the developer to read and grasp the meaning of the variable in that context, cluttering up the code.

This is particularly the case where a single, primary looping variable is used to iterate through data stored in some kind of data structure, where the meaning of the variable name is fairly explicitly defined in an obvious, encompassing location in the code. Take a Ruby iterator block as an example:

purchase_list.sort.each {|p| puts product_descriptions[p] }

As we can see, the fact that we are iterating through a list of purchases is directly tied to the use of p as a looping variable. That variable, then, is used as an index or key used to retrieve product descriptions from the product_descriptions collection. Loops wherein a primary looping variable serves as the key used to retrieve a value from a collection constitute the most obvious example of where a single-character variable -- often derided as a cardinal sin of programming -- actually helps clarify code. Consider instead this needlessly verbose alternative:

purchase_list.sort.each do |purchase_list_item|
  puts product_descriptions[purchase_list_item]
end

In this example, by choosing to be more explicit in describing the source of the looping variable, we not only clutter up the code so that it takes slightly longer to read and understand, we also create a mismatch between the name of the variable and how it is to be used within the loop. In addition, our lines of code are lengthened to the point where, for clarity, a single-line, simple iterator should be broken up into multiple lines -- using do . . . end syntax instead of braces in accordance with Ruby coding style conventions for the multiline version. Another needlessly verbose alternative attempts to correct the deficiency in relevance to where the variable is used within the block:

purchase_list.sort.each do |product_key|
  puts product_descriptions[product_key]
end

In this case, we tie the name of the loop variable to its use within the loop, but in so doing we divorce it from its source. Furthermore, we make the name essentially redundant with the name of the collection whose values it keys into, conveying no information that is not stunningly obvious to the reader.

Slavish devotion to the common rule of thumb that variable names should be verbosely descriptive produces an effective reduction in quick and easy reading and comprehension of code here, rather than an improvement. One of the most important factors in play is the fact that the beginning of the loop provides a directly corresponding connection between the loop variable's name and the source of its contents on each iteration. Another is its use as a key or index for a collection, where the syntactic significance of the variable in use is made clear by context.

That rule of thumb is certainly not without its merit in most cases, though. It is a rule of thumb because it works most of the time. For instance, given a variable whose scope is global to the current program file, a single-letter name provides little or no guidance, at its various points of use throughout the program, to its meaning within the algorithmic model the source code defines. More descriptive names are necessary in such circumstances because of the relative lack of cues in close proximity to the variable's points of use -- a direct result of the distance within the source file from the source of the variable's value.

This is why one might create a hash like the following near the beginning of a program to capture command line arguments whose values must be used later in the program:

command = {
  :name => ARGV.shift,
  :target => ARGV.shift,
  :attribute => ARGV.shift
}

This way, the programmer can see sources and names of command line arguments all in one place at the beginning when maintaining that particular part of the program, and code comments can be added to clarify why things are organized that way. This provides a sort of configuration rule set for the rest of the program, defining the relationship between program inputs and the data derived therefrom as they are used later.

At the same time, the programmer reading through other parts of the program has an immediate cue as to the meaning of a given datum that originated as a command line argument when it is used within other code because of the descriptive hash-and-key names:

inventory_table = InventoryTracker.new(datafile)

if command[:name].downcase.eql? 'delete'
  inventory_table[
    command[:target]
  ].delete_entry(command[:attribute])
end

By contrast, the following could be disastrous for quick and easy comprehension of what our code is doing:

i = InventoryTracker.new(f)

if c[:n].downcase.eql? 'delete'
  i[ c[:t] ].delete_entry c[:a]
end

In this case, compressing things into fewer lines of code provides greater brevity only at the cost of any contextual cues about the meaning and purpose of the variables, resulting in what looks more like line noise than actually useful notation. On top of that, we must take into account the difficulty of finding the appropriate variable when doing a text search for something like f or -- worse yet, because variable assignment may not actually contain this specific group of characters -- c[:t]. Instead, assignment may look more like this:

c = {
  :n => ARGV.shift,
  :t => ARGV.shift,
  :a => ARGV.shift
}

Another bad approach would be to simply use ARGV indices:

if ARGV[0].downcase.eql? 'delete'
  i[ ARGV[1] ].delete_entry ARGV[2]
end

Now, every time we make use of one of those arguments in our code, we have to remember the order in which users are supposed to enter them on the command line. By taking the first approach described here for managing command line arguments -- creating a command hash at the beginning of the file with descriptive names -- we keep all our non-descriptive variable names together within the context of the program's early data-handling setup, and use more descriptive terms in later code to give the code a self-documenting quality.
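The whole pattern can be seen end to end in a minimal runnable sketch; note that a plain nested Hash stands in here for the article's hypothetical InventoryTracker class, and the simulated argument values are made up:

```ruby
# Simulated command line: program.rb delete widgets color
argv = ['delete', 'widgets', 'color']

# All the terse, positional access happens in one place, up front...
command = {
  :name      => argv.shift,
  :target    => argv.shift,
  :attribute => argv.shift
}

# ...so later code can be read without remembering the argument order.
# A plain nested Hash stands in for the article's InventoryTracker.
inventory_table = { 'widgets' => { 'color' => 'blue', 'size' => 'large' } }

if command[:name].downcase.eql? 'delete'
  inventory_table[command[:target]].delete(command[:attribute])
end
```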

The term "cargo cult programming" was coined to describe cases where people employ what they believe to be "best practices", or copy code in full or at least in form, without understanding the reasons for the practices or how the copied code works. Taking an approach like this results in abuse of descriptive verbosity in variable naming, turning a practice intended to clarify code into yet another way to obscure its meaning from the programmer who has to read it later.

Ultimately, the correct answer to questions like "How should I name my variables?" is "It depends." When using a rule of thumb, think critically about it to determine whether you are using the rule correctly.

About

Chad Perrin is an IT consultant, developer, and freelance professional writer. He holds both Microsoft and CompTIA certifications and is a graduate of two IT industry trade schools.

181 comments
oldbaritone

Once in a while, a client will insist on source code for something proprietary. A translator that replaces meaningful names with A1, A2, B1, B2, etc. produces source code that will compile and run, and is impossible to debug. I demonstrate to the customer, on the customer's machine, that the complete system will compile and run just as it has been supplied.

One time, I supplied full source code. The customer made changes, it made a mess, and they blamed me. I worked a lot of time for free, to fix the mistakes someone else introduced into the system. Next time, they couldn't figure out the code.

Escrow source code, but if the customer insists on receiving a copy, be prepared for them to make a mess.

DoctorsDad

...was the COBOL program which was basically all of the form:

Move MARY to BED
Perform SEX Until WEARY

dogknees

Basic+ was the system language for the OS RSTS/E on the venerable PDP-11 series of mini-computers from DEC. (For the youngsters out there.) One of the joys of working on these machines was the hard limit on program code size of 32KB. There were ways to swap overlays in and the like to make use of the space, but essentially that was all you got.

When you defined a variable, it took one byte per character, and one more byte for each "first" letter. So, "x1, y1, and z1" took 6 bytes, plus 3 for the three different first letters, making 9 in total. "X1, X2, and X3" only took 7, as there's only one first letter.

So, after writing our code and testing on small data sets, we'd then rename all variables to be "a" followed by some number. EVERY ONE OF THEM!!! This gave us enough space to load larger data sets for the real application, but you can imagine the joy of debugging this mess with 50 different variables called A00 through A49. No type information, no nothing.

Ahhhh, the good old days!

knudson

I was a DBA at a company a while back; a team built this system, then brought us in during QA. Fields like 'Data' -- well, it contained data; 'Date' -- it contained a date; 'Number' -- guess what it contained. I complained to management, but got nowhere. Well, my area CIO tried, but the project owner's CIO didn't care.

The other part of this is that the names need to be consistent from file to file, table to table, or whatever. Many years ago I worked on a system, VSAM based; speed was number one for this system, and DBs weren't there yet. Maybe 8 files (consider them tables): Custnbr, cust, CNum ... just for one example. There were many common fields, all with slightly different variable names. A few years later I took over management of the system; during a major rewrite, one thing high on our list was code cleanup for future manageability. All these names got standardized.

Slayer_

I agree that that notation is not needed for modern languages because they all tell the programmer if they screwed up. But I was confused, because 99% of my code is done in VBA and VB6, which have no problem with you assigning a double into a string; it just might not always work as expected. Other stupidities: Label.Caption, Textbox.text. Don't mix those up...

Personally, all my basic variables are single-letter notation like the wiki shows. For complex objects, I always use "o" to indicate them, just so that a reader knows I am using an outside object rather than a function or user type. If it's my own type, I use "udt", again indicating that it's not a function, not an object, just a collection of variables. And finally, "cls" to indicate a class local to the project (not a DLL). Anything other than the basic variable types, I use the three-letter prefix.

Another example: 1=false, "1"=true, cint("1")=false. Go figure, right? Implicit conversion from a string to an integer figures it out, but without the implicit conversion it does not work.

And finally, it can greatly help figure out scope. I know gbExecute is a global boolean; I don't need to see where it's declared. Though, on my constants, I tend to leave out the data type and just have the scope.

This is an example of code I just wrote for my current project:

Set oRow = oDocument.Tables(19).Rows.Add(oDocument.Tables(19).Rows(14 + ((i - M_DEBT_SERVICE_CAPACITY_STARTING_INDEX) / M_DEBT_SERVICE_CAPACITY_NUM_OF_COL)))

Since the whole goal of modern programming languages is to make code easy to read, write, and understand, I think the notation should be used when needed and ignored if not needed.

DuppyPog

the next person who comes along that has to modify your code.

jefferyp2100

But at least we can all agree that Hungarian Notation should be permanently and irrevocably outlawed.

adornoe

Multiply Employee-Total-Salary by Employee-Federal-Income-Tax-Rate Giving Employee-Federal-Income-Tax-Due rounded.

Alternately, if one insists: Employee-Federal-Income-Tax-Due rounded = Employee-Total-Salary * Employee-Federal-Income-Tax-Rate.

Nothing at all cryptic in there. Everything is clear and concise, even if verbose. But anyone fresh off the street could understand that "sentence". Verbosity should not be a problem when maintenance can be more costly down the road. And, it's self-documenting. A procedure, which encompasses several statements, is the same as a sub-routine, and the name there is also expected to be descriptive of the procedure's purposes, even if the statements under the procedure are self-documenting.

To me, there is no such thing as overkill when it comes to variable naming convention, and if it were up to me, even the language constructs would never be cryptic. With the speeds of computers nowadays, and with storage being so cheap, hardware considerations are the least of the problems. Software development and maintenance is still the most expensive proposition for any IT shop or IT department.

Now, even COBOL could end up being screwed up by those who insist on shorter and more cryptic names, and we could end up with: taxDue = emplSal * taxRate. But, when compared to the other verbose statements, the verbose statements are immediately simpler to understand.

Sterling chip Camden

My clients are all software developers, so the source code is the deliverable. I always keep a copy and tell them so. If they start having problems with "my code", the first thing I do is a diff.

Sterling chip Camden

MOVE CAPITAL TO SAN-JUAN. GO TO KANSAS-CITY. ALTER MENTAL-STATE TO STONED.

Sterling chip Camden

... though I used a compiled language, so variable name length didn't matter. OTOH, the compiler only allowed up to six characters per name (five for common variables, and with all the link overlays needed to fit into 32KB almost everything was in common).

apotheon

> Label.Caption
> Textbox.text

Those are terrible names. Rename at least one of them. You don't need Hungarian notation to make up for the stupidity of those names -- you just need to use better names.

> And finally, it can greatly help figure out scope. I know gbExecute is a global boolean, I don't need to see where its declared.

There are some languages that fail in the clarity department when it comes to scope. Of course, most of the time you're doing something wrong if you have global variables (most of the time) -- but even so, Hungarian isn't really a good way to do that. You want to name it "execute" and indicate it's global? Use a word, so everyone can understand you: global_execute.

Then again, maybe not, because you may refactor so it doesn't need to be global any longer -- and then what do you do? Well, then you either have to change a crapton of variable names or just live with the fact the name isn't accurate any longer (or leave it global, but that's not a great thing to do when it doesn't have to be global).

If you can't figure out it's boolean-only from the name of the variable and its use, you're doing something wrong. Rename it, or let it contain non-boolean values that evaluate properly in boolean context.

Tony Hopkinson

in some languages, Hungarian notation is a symptom... So not exactly. Interestingly, FxCop, which will throw a fit at Hungarian notation, doesn't do it for Windows controls, and I must confess I still use it for them.

Slayer_

It has its places. Control naming, for example. When reusing counter variables, it can be handy when reusing someone else's code. It can be very useful in languages that do not have variable types, such as VBScript, to help both the programmer remember and any future readers know what type each variable is.

Sterling chip Camden

... should not have their hands in my code. COBOL makes the same mistake in programming languages that many UIs make: trying to bend the computer to outdated metaphors rather than creating new metaphors that better represent what you're trying to accomplish. Programming languages should not be rigidly forced into a human language model. Programming is part human language, part mathematical functions, and part something different from either of those domains.

Tony Hopkinson

Employee-Federal-Income-Tax-Due rounded = Employee-Total-Salary - Employee-Net-Income

or maybe

Employee-Federal-Income-Tax-Due rounded = Employee-Total-Salary-Employee-Net-Income

even

Employee-Federal-Income-Tax-Due rounded = Employee-Total-Salary-Employee-Net-Income

Perfectly clear.... If there's a total salary, what other sorts of salary are there? Rule one of language design: names should not contain operators, ever. Aside from crippling the eyes, designing the parser is a twat; it will have more bugs than the poor fool writing the input. COBOL is meant to be very good at what it does best; this is not it, though. Neither verbose nor terse = descriptive power...

apotheon

> Nothing at all cryptic in there.

Maybe not cryptic, per se -- but certainly a tremendous strain on the eyes.

> Everything is clear and concise, even if verbose.

I'm not sure you understand the word "concise".

> To me, there is no such thing as overkill when it comes to variable naming convention

So . . . you completely disagree with the entire article. Right?

Jaqui

[quote]With the speeds of computers nowadays, and with storage being so cheap, hardware considerations are the least of the problems. [/quote] and that love of throwing hardware at the problem INCREASES the energy consumption. That INCREASES the pollution. throwing hardware at the problem is making you a mass murderer, trying to kill ALL life on the planet.

adornoe

and realized that, being cute was not healthy for my career nor for the business that I was working at. There was a time that I wrote: Subtract THIS from THAT giving OTHER. When the compiler was updated, THIS became a COBOL keyword and the compiler spit it out the next time the program needed modification.

Slayer_

I didn't give them names; those were default control names. Frequently, though, they are named something like lblFirstName and txtFirstName -- the prefix will let you know if you need to use .caption or .text.

Second, globals are a necessity in VBA and VB6. There are no constructors to conveniently pass values around, and visual objects load themselves seemingly randomly. For example, if you call unload on a visual form that is not actually loaded yet, it will load it, run all the initialize and activate events, then run the queryclose and terminate events.

Typing the word "global" is a pain; it's much easier to type in "gb", then hit Ctrl+Space and have the system show me all the global booleans available to me.

And, again with the non-boolean values that evaluate to boolean, that doesn't work properly: 1=false, "1"=true, -1=true, "-1"=true. The notation needs to exist because the language sucks.

Finally, frequently in text editors and such, there will be an "Edit" menu. In there -- now this is where this gets tricky -- there will be either "Replace..." or "Find and Replace...". Depending on your choice, this will allow you to rename a variable whose datatype has changed.

apotheon

That just means that those languages should go in the bin along with Hungarian notation. Actually, real (aka "apps") Hungarian notation is not nearly so bad. It's Microsoft-style (aka "system") Hungarian notation that's the big problem. The modern apps Hungarian notation has let some of the same style as system Hungarian notation seep into it, so it's not completely free of the Lovecraftian horror of system Hungarian notation, but its basic premise -- that the notation indicates the purpose of the variable rather than its data type -- is actually potentially beneficial. Then again . . . if you really want to include those purpose hints, just use a damned word instead of a cryptic note attached to the beginning of your variable name. If you want to remind yourself that the purpose of a given variable is to store an unsanitized name string from user input, call it unsanitized_name, not usName like some kind of bureaucratic IRS jackass getting off on watching people struggle with their tax forms.

Sterling chip Camden

... is when I need several variables for the same datum. In an old C++ ActiveX context, I recall having:

edtName - control variable for edit control
strName - string for its text
bstrName - same string converted to a BSTR for some COM reason
lpszName - same string converted to a null-terminated string for a DLL call

But what this reveals to me is not so much the usefulness of Hungarian notation as the stupidity of the type system and the inflexibility of the language to provide these conversions as needed without me having to manipulate all these variables.

adornoe

> "anyone, fresh off the street" ... should not have their hands in my code.

That's a wrong angle for the argument. Taking the statement literally is not how one should approach the issue. It's not a matter that anybody off the street will be able to get access to a shop's code or system. The statement was meant as a way to highlight the fact that just about anybody who needed to understand what the code was doing could do so without having to spend days or weeks getting into the code before finally understanding it.

I would be more concerned with the techies out there who can read any and all code, no matter what the language, and who could then alter that code to cause damage to the application and to its users. That happens a lot with the more current programming languages, and thus we end up with viruses and hacks. So, the issue or problem is not one of "easier understanding" where "just about anybody" can become disruptive; it's about the desire to cause harm or destruction. That can happen with any language and any system. Are there any "safe" languages out there?

> COBOL makes the same mistake in programming languages that many UIs make: trying to bend the computer to outdated metaphors rather than creating new metaphors that better represent what you're trying to accomplish.

What the heck is that about "metaphors"? In computing, you either do something or you don't, you either do it right or you don't, and you either use one tool or another. The language of preference doesn't matter as long as it can perform the application's functions according to specifications, and efficiently, and cost-effectively.

> Programming languages should not be rigidly forced into a human language model.

That makes no sense. Programming languages are designed with the understanding that it's "humans" that will have to understand and use them.

> Programming is part human language, part mathematical functions, and part something different from either of those domains.

All of those are things that directly relate to how "humans" understand the world and even programming languages. No matter what the language capabilities, whether for business use, or for scientific/business, or for entertainment, the language is designed and developed with the understanding that it's a "human being" that is going to have to understand it and learn it and use it.

The most important element to be considered when designing anything, whether it be a computer or a TV or a computer language or an automobile or anything that "humans" are going to interact with, is the "human element". If humans CANNOT use or understand something, then it's of no use; if something turns out to be overly and unnecessarily complicated, then its days are numbered; and if something turns out to be just another way of "doing the same thing", then it was a product whose time had already passed.

Give me the right tool, and make it simple. Things don't need to be complicated if they can be kept simple. Most people are not going to understand the mathematics behind a black hole, but an algorithm that explains black holes is not what most programmers have to worry about. For that, there are more complicated tools and languages. KISS where possible.

adornoe

Look, the examples I gave are concise and syntactically correct, depending upon how the definitions for the variables were made. Your examples don't make sense because they're just basically "move" statements or "equating" statements.

Besides, by the naming convention that you're apparently using, the real-world logic would seem to be failing. Thus, Employee-Federal-Income-Tax-Due rounded = Employee-Total-Salary - Employee-Net-Income looks illogical, because taxes due is not the same as whatever the employee makes in salary after subtracting his net income. My example is more correct, syntactically, and more correct with its business logic.

However, my example was just meant for demonstration purposes about how COBOL looks with its self-documenting code. Business-logic-wise, even my example would be slightly incorrect, because there are many times when taxes aren't due on the entire amount of earnings, because there are deductions that could be made for benefits, such as social security and medicare and pensions, etc. Again, it was for demonstration purposes about the language, and it wasn't meant as an example of business logic.

> Rule one of language design names should not contain operators

There are no "operators" within the variable names I used, not as far as COBOL is concerned.

adornoe

in the things I said, so why are you having a problem understanding what I did say?

> Maybe not cryptic, per se -- but certainly a tremendous strain on the eyes.

Nope, not cryptic at all, "per se" or otherwise; any fool that can read English would immediately be able to understand what the COBOL statement was accomplishing. To me and to anybody that cares about good coding habits, and also cares about ease of maintenance, worrying about eye strain is the least of the problems. BTW, when I was in the COBOL world, I never heard that problem about "eye strain" mentioned. You are the first one ever that has made that silly accusation about easy and non-cryptic code.

> I'm not sure you understand the word "concise".

I'm certain you don't know the meaning of "concise". Concise: expressing much in few words. Concise, as in not needing any further explanation or definition, and not needing to go to the documentation for elaboration of a statement.

> So . . . you completely disagree with the entire article. Right?

You are the one not understanding the article, because the article made arguments for both long and descriptive names, and for short ones when it didn't really matter that much. However, when it comes to long and descriptive names, my preference is for "always" using long and descriptive names for variables. A coder's life, and that of those that come after, are made a lot easier when the guesswork is removed from the variable names.

I have no doubt that I have a lot more experience in the IT field than you have, and when it comes to documentation, and keeping things as simple as possible, I've learned a lot of lessons along the way, and one of the biggest lessons is self-documenting programs. Too bad we've moved away from that with the more cryptic languages and the more cryptic coding habits of many programmers. Though I no longer do COBOL, the reasons for 'self-documenting' and 'ease of use' and 'ease of maintenance' are something that should always be taken into consideration. Programming doesn't have to be complicated. COBOL was/is very robust, without making things overly complicated.

apotheon

Please tell me you're kidding.

Sterling chip Camden

I used to use "class" a lot in languages that later added OOP, for example.

Sterling chip Camden

Yeah, those are built-in property names within those components of the .NET Framework (and IIRC also in VB6 and prior). Other than the fact that they aren't consistent (s/b Text in both cases, IMO), they aren't ambiguous because they're specific to an instance.

Sterling chip Camden

... and I think that in the case being batted around here, Microsoft made a mistake to make control variables simple members of the form's class. They should probably have been members of a container/collection of some sort, so the distinction between those and other variables would be clear.

Tony Hopkinson

as far as a language that needs Hungarian needs swapping out for something with more natural built-in descriptive power. Still not sure about controls, though:

tbSurname -- terse Hungarian
TextBoxSurname -- wordy Hungarian
Surname -- is that the control or a local variable?

The real point of attack has to be scope. If we can enforce separation of concerns in the presentation layer, we don't need a local variable, the purposeful name is no longer ambiguous, and Hungarian control names go in the bin where they belong.

Tony Hopkinson

a prefix that demonstrates the intent of the code, at the time it was written.....

apotheon

Sterling's style has been to ignore everything you say that isn't relevant or civil. That's why he responds with one-liners to what you say -- because the majority of what you say is a straw man (intentional or otherwise), uncivil, or both.

adornoe

and you're going to tell me to tone it down?!?! Why don't you try to follow your own advice? And, there were no strawman arguments in any of what I had to say. I just took your words and those of "Chip's", and retorted accordingly. It seems that you're the one with the strawman argument with your accusations. And, hey, at least "Chip" was a lot more gentlemanly in his discussion style. You could learn a lot from him.

apotheon

Your argumentation style, consisting largely of denigrating others and misconstruing what they say to attack a straw man, is less than convincing. Sterling's patience with you is astounding.

Sterling chip Camden

I agree, but it also shouldn't be targeted for Barney and Friends. If you don't insist on a minimum competency for programmers, then their inability to understand the code will not be your biggest problem.

adornoe

"You misunderstood me" No I didn't. I used your every word to create a response back to you. I analyzed every word and every sentence before I issued my response, and, if you'll notice, I broke down your whole post and retorted to every piece of it. I try to be thorough every time I can and when I have the time. "I wasn't taking it literally." That's exactly what it looked like. "I meant that not any old programmer should have their hands in my code." Did you mean "any OLD programmer" or "just ANY programmer"? Because it sounds like you are attacking the older generation of programmers. So, according to you, are the older programmers not capable because they're suffering from Alzheimer's or dementia? Or, what the heck is your problem? BTW, if you're not the owner of the company to which the code belongs, it's not "your code". Any company has to assume that the next programmer to look at a program is not necessarily the first one that wrote it. "They need to be of a certain caliber, or they'll mess it up." Garbage! Look, programming does not require a PhD or an IQ of 175 or better. Most programmers are of average intelligence, and, though you may feel pride in your particular skillset and knowledge, there are many millions just like you out there. You're not that special; except to your mom, of course. "Coding to the lowest common denominator doesn't help, either." Coding is not about making things complicated and then taking pride in your masterpiece of complexity. The best programs and best applications are all coded with "simplicity" in mind, because, if it needs to be revisited, for enhancements or maintenance and for debugging, then the application had better not be a monster in complexity. COBOL used to have features in it with which, if one wished, he could create quite complicated programs. I actually took pride in creating complicated masterpieces in my early programming career, with the use of the "alter" instruction.
Then, I grew up and realized that programming was not about creating complicated masterpieces; it was about taking complicated programs and complicated systems, and making them seem very simple. Notice that I'm not pointing to COBOL as a tool to simplify matters. Any programming language can be used to create a complicated mess, or it can be used to create a simple and straightforward solution. The simple and straightforward is always preferable, because it shouldn't require an Einstein to figure out what you've done.

Sterling chip Camden

I wasn't taking it literally. I meant that not any old programmer should have their hands in my code. They need to be of a certain caliber, or they'll mess it up. Coding to the lowest common denominator doesn't help, either.

adornoe

whatever case you tried to make. Don't be silly. If this had been a court battle, you would've lost your case, so it's always better to defer to a better "defender" for your case. If the "case" needed a good defense, you would've been fired from the case from the beginning.

adornoe

Oh, one more thing: You should learn to quit while you're behind, or you risk getting further behind. "The hyphen and the ASCII character used for "minus" in common programming syntax are the same thing." Engage your brain for a change... Do the following have the same contextual meaning for the "hyphen"? "Employee-Salary" VERSUS "Employee - Salary" "On the standard English language QWERTY keyboard used most commonly around the world, it's the character produced by the key between the = key and the 0 key. The context of where you type it does not change the fact that it's ASCII code 45." That was actually so stupid that I shouldn't have to respond to the asininity. Look, silly, a key on the keyboard doesn't necessarily equate to its use in a programming language. In everyday English or in everyday written language, it certainly has the conventional contextual meaning for the written word. But, the written word is not what programming languages use for programming. Programmers might use the same character set, but, with specialized meanings dependent upon the language specifications. So, do the "*" and "(" and ")" and "+" and "/" and "\" and "{" and "}" characters mean the same in programming languages as they do in the written word? Get a clue, and then start your career again, because, you lack a lot of common sense right now. "Even if they were different characters that just looked the same," You need to start reading for comprehension. I never said they were different characters. I said they had "different meanings" or different uses, dependent upon the context. A hyphen is the same as the minus symbol, but, they take on completely different meanings dependent on the use. Why argue against common sense? Your argument is specious, at best. "the problem that Tony brought up: it's difficult for humans to differentiate between them in writing, thus potentially leading to code maintenance problems."
I spent a lot of years doing COBOL and doing a lot of other programming languages. When a person first learns a programming language, the first thing they learn is the "basics", and one of the basics is the meaning of certain characters within the language. If there is someone who is not capable of learning at least the basics, then that person doesn't belong in IT altogether. However, with COBOL, when it comes to its format, there weren't too many new rules to learn, because, it resembled the written word, and if one wasn't familiar with the written word, then one would also fail at writing COBOL. The meaning of the hyphen vs the minus symbol are the same in COBOL as in the everyday written English language. That actually kept the special symbols to a minimum and learning COBOL was a breeze, and who knows, perhaps even you would've learned it easily. "but of course you won't admit that's a potential problem, because to you," It would be a potential problem to those with IQs of below 50. Most other people, with common sense, would never have a problem differentiating on the meaning contextually. I never in my many years of doing COBOL heard the complaints from anybody, that you and Tony seem to be having. Perhaps you should speak for yourself, because, I doubt that Tony wants to pursue your line of argument. "COBOL can do no wrong." COBOL couldn't do it all, but, for its purposes, no other language has ever matched it. I've moved on from COBOL, but not because it was incapable of doing what other languages are now being tasked to do. I moved on because, the marketplace moved on to other languages. If I had my druthers, I would make COBOL the most required skill around. And, hey, speak about what you know, because, when it comes to COBOL, you are very ignorant.

apotheon

The hyphen and the ASCII character used for "minus" in common programming syntax are the same thing. On the standard English language QWERTY keyboard used most commonly around the world, it's the character produced by the key between the = key and the 0 key. The context of where you type it does not change the fact that it's ASCII code 45. Even if they were different characters that just looked the same, though, the same problem would occur -- the problem that Tony brought up: it's difficult for humans to differentiate between them in writing, thus potentially leading to code maintenance problems. . . . but of course you won't admit that's a potential problem, because to you, COBOL can do no wrong.
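The claim about the character itself is easy to check; a two-line Python sketch confirms that the hyphen used in names and the minus used in arithmetic are one and the same ASCII code point:

```python
# Hyphen and minus are the same ASCII character, code 45 (0x2D);
# only the surrounding context gives it either role.
print(ord("-"))       # 45
print("-" == "\x2d")  # True
```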

apotheon

I love a challenge. Let me know when you have one to offer. Nothing to see here. Move along.

adornoe

Apparently, you're mistaking the word "challenge" for "trolling". You apparently cannot stand being challenged on your points or assertions, and will quickly accuse someone of trolling when you don't have an effective way to answer the challenge. Trolling is an overused word on the internet, and you, if you're a professional, should know better than to just use the term without thinking. Hopefully, you're learning something from this back-and-forth. ;)

apotheon

> sophistry: A deliberately invalid argument displaying ingenuity in reasoning in the hope of deceiving someone. I know the meaning of the term -- just as you clearly know the technique. Stop trolling me. Also . . . proofread before posting a comment that tries to correct someone else's spelling, grammar, et cetera. That's just some friendly advice (more than you deserve).

adornoe

"sophistry: A deliberately invalid argument displaying ingenuity in reasoning in the hope of deceiving someone." If you're going to use a word, you must also learn the meaning of it, and the appropriate usage of it, and the appropriate conditions or time for its usage. So, taking the meaning of the word, what was "deliberately inaccurate" in any of the arguments I made regarding COBOL? You tried as best you could, and you never had a good retort to anything I did say. And, I don't have to twist my arguments in order to defend COBOL, since COBOL had and still has the capability for people to create "self documenting" code when it affords programmers the ability to create code that reads like English. No need for me to "display ingenuity" when the argument is made for me by the language itself. Most of what I did was to repeat the arguments for why the language had and still has a place among programming languages. When it comes to the part of the definition about "deceiving someone", again, what part of what I said regarding COBOL is an attempt to deceive? In every post and counterpost to you, I retorted against your every point, and I didn't have to get creative in order to battle against your ignorance about COBOL. You took umbrage at my mentioning of COBOL in "your" discussion, but, you were never able to effectively answer my points regarding how COBOL had already covered your main points in "your article", and very likely, way before you were even born. Your attitude is pretty immature, and you should leave the writing of articles to those that can handle different opinions like adults. And, hey, I don't need for you to "be drawn" back into the argument. If I feel that a discussion could benefit from my input, then I'll do so, and if you don't approve of my input, you don't need to get so offended and defensive; those are very immature behaviors.

apotheon

Sorry -- I'm not letting your sophistry draw me back into this ego feeding of yours.

adornoe

"You have ceased being amusing." That might have been your problem all along, your thinking that I was trying to be amusing. "You may have the "win" in your own mind if you like. You can stop declaring yourself the winner now, and let readers judge your words for themselves." I don't need the win. I just wanted to inform you about how COBOL had solved the issue of clarity for naming of variables a long time ago. You're the one that could not just accept the fact and just had to go on the attack. And, yeah, the readers can judge for themselves, but, if they don't have any knowledge or real experience with COBOL, then they're not qualified to judge.

apotheon

You may have the "win" in your own mind if you like. You can stop declaring yourself the winner now, and let readers judge your words for themselves.

adornoe

"I can't be arsed any more." That sounds so "professional" of you. ;) "I think I've posted enough for anyone with two neurons in close proximity and peripheral vision to figure it out." Then, that might be your problem. You have become used to dealing with people with your same capacity and you don't know how to express yourself more clearly. Dealing with me, you need to upgrade yourself to a much higher level. ;) And, what the heck is that about "peripheral vision"? In case you didn't know, let me be the first to inform you that having peripheral vision has nothing to do with coding standards, or programming languages, or communicating effectively with the written word or the spoken language. ;)

adornoe

"no good retorts" Yep! That's exactly what I said. "You must not have been reading very closely." Actually, it's you that hasn't been reading closely, and if you had, you were either not reading for comprehension, or the material which I presented to you was over your head. That's usually been the case when I have a "technical" discussion with a newbie.

Tony Hopkinson

I can't be arsed any more. I think I've posted enough for anyone with two neurons in close proximity and peripheral vision to figure it out.

apotheon

You must not have been reading very closely.

adornoe

and, if I feel strongly about an issue, I'll get strongly engaged, especially when I see so much ignorance coming from the opposite side. And, hey, Tony was disagreeable, but not as harshly as the direction you took. You may still not like my points about COBOL, but, you still have not had any good retorts against the language. And, I don't mind the number of keystrokes when I feel that being thorough can dispel any misunderstandings. ;)

adornoe

what they're talking about. Eventually I found the right way to communicate with you. "I'll try and remember next time you post something I disagree with." Discussions can be a lot more civil and go a lot more smoothly when people don't get ridiculous or use exaggerated cases and don't take belligerent attitudes. ;) "Course, haven't heard you admit that the way to write good code is to simply use COBOL yet, is 'erm somewhat misleading." What you still fail to understand is that my mentioning of COBOL was to simply point to how the language had already addressed and answered the points about meaningful and understandable naming of variables or any other programming element. People don't have to code in COBOL in order to create an easier to understand program. But, with COBOL, since even the operators and other language elements made for an easier construction of programs, people didn't have to be constantly looking up or referring back to other code in order to understand the "immediate" meaning of a variable or literal name in a chunk of code.

Tony Hopkinson

I'll try and remember next time you post something I disagree with. Course, haven't heard you admit that the way to write good code is to simply use COBOL yet, is 'erm somewhat misleading...

apotheon

You could have saved a lot of keystrokes there by just saying "No, I'm not playing," and it would have been clearer. Sometimes, verbosity is just obfuscation. It would have been less intentionally insulting than your closing, sarcastic question, too -- which is kind of amusingly hypocritical of you, given some comments you made in response to Tony.

adornoe

At least this wasn't a political argument, where you should never partake in a debate, because you are one of the most ignorant people to ever open up his mouth about anything political. Now, when it comes to tech issues, you can be right and you can be wrong, it all depends upon how much knowledge one has on the topic. When it comes to politics, YOU are always wrong and ignorant. Now, how much do you know regarding COBOL, which is what I've been "arguing" about here?

adornoe

and, though I still have a couple of gripes with some of the sample lines of code you included in your last post, I don't think they're worth pursuing. For example, getting OOpy with the ++ or the += operators takes away from "readability" where "even a non-techie" could understand the code. But, since it's not mandatory and people can still use the "old" coding style for statement/sentence construction for "readability", I don't have a problem with the changes. However, for the changes that take away the "English-like" construction, as in the use of ++ or +=, I would prefer a comment to clearly indicate, for the non-programmers, what that particular chunk of code is doing. Other than that, as long as most of the code is still "readable" and easy to understand, I don't have a gripe with the addition of the OOP elements to COBOL. However, to reiterate, I appreciate that you are being a lot more reasonable and civil with your comments. You didn't get ridiculous and you didn't get defensive, and you didn't get offensive, and that's the kind of conversation I prefer having.

AnsuGisalas

Word to the wise, in my experience user:Adornoe isn't much of a one for reason, not in the taking, and not in the giving. And not in the following of any kind of discussion ethics.

Tony Hopkinson

Yes it does, not because of the language though. The two worst examples of poor practice I've ever seen are in two of the languages I use to write the most comprehensible code I can. Poor coders can write crap code in any language. Tell them to use Cobol and they'll go from writing crap code in X to crap code in Cobol. Even if a language was amended so you couldn't have one-character variable names, it still doesn't mean they are good names. Even if you made it so you could only use pre-defined names from some sort of data dictionary, they could still be poor names. Some languages facilitate writing more comprehensible code using long names. Some do it through scoping. Some do it through annotation. Some do it through defines and pragmas. In general, language choice, if a developer gets to make it, which is rare, is based on its semantic descriptive power: functional, procedural, OO, and to a large extent how comfortable the developer is with those types and the most recent flavour they used. Whether after that choice they write readable code is all down to whether they know the value of doing so.

NumberOfEmployees++;
NumberOfEmployees += 1;
NumberOfEmployees = NumberOfEmployees + 1;
NumberOfEmployees := NumberOfEmployees + 1;
Inc(NumberOfEmployees);

No one of these is any more or less comprehensible than the others. Swap NumberOfEmployees for N, though, and you start to have problems... Now you'll see no argument from me that someone who cut their teeth on Cobol is likely to naturally be more verbose than, say, someone who started with C; a pascal bloke like myself (I use scoping as much as I do naming) would probably sit somewhere in between those two 'extremes'. Nothing to do with language and everything to do with realising at some point you or someone else will have to change this code, and that's a lot easier if you can read it! No sarcasm, no wit, no hyperbole. Do you get it now? Sheesh. This is a tech site; people come here to learn.
I am not stopping until it's clearly understood that COBOL does not imply good code, which is the point you started out with. You do yourself and your audience a disservice with such a foolish prejudice. Have you considered that, seeing as you don't use a lot of scoping, some of the short names you find so ambiguous are not within their scope? EMPLOYEE-RATE-PER-HOUR vs the HourlyRate property of an employee class?
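Tony's list of equivalent increment statements translates directly into a runnable sketch. Python has no ++ operator, so this sketch uses the remaining forms; the point survives the translation: the syntax varies, but comprehensibility tracks the name.

```python
# Descriptive name: every form reads as "one more employee".
number_of_employees = 40
number_of_employees += 1                       # augmented assignment
number_of_employees = number_of_employees + 1  # explicit form

# Mechanically identical code with a one-letter name loses the story.
n = 40
n += 1
n = n + 1

print(number_of_employees, n)  # 42 42
```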

adornoe

"I'm ridiculous?" Absolutely! You're being ridiculous and silly. And, the use of exaggerations is no way to counter the points by others. Your exaggerated example of what could be done with COBOL is better representative of what happens in most other languages, day in and day out. Besides, your exaggerated example wouldn't work in COBOL, because, syntactically, the compiler would spit it out. It's quite apparent that you're another one of those that butts into a conversation, thinking that he's got something to contribute, but ends up making a fool of himself by demonstrating that he knows absolutely nothing about the subject at hand. "at least I'm not contradicting myself" You don't have to contradict yourself in order to still make a fool of yourself. "Your original post was use cobol to address naming issues!" That's right, and you're still not addressing how COBOL was meant to be used for programmers to construct code that could be read as easily as if one was reading English-like sentences. All that you're doing is making ridiculous examples and making ridiculous statements and proving that you're not educated enough on the matter to even be partaking in the conversation. "I call bollocks because you can write ridiculous code in Cobol," You can write ridiculous code in any language, and you can even write ridiculous-sounding sentences in everyday languages, such as English. What the heck does that prove? "you agree that you can," Yeah, that's true, people can get ridiculous and stupid in their programming, but, in every COBOL shop that I ever worked at, and especially when I was programming manager and head of IT in other shops, those kinds of people didn't last too long in their jobs. "but you shouldn't and then call my intelligence into question?" I don't question your intelligence. I question your rationality.
Nobody should get ridiculous and stupid when it comes to trying to make their points by using exaggerated examples, especially when those examples, if they were tried in real world situations, would get people fired on the spot. "Then we get onto when is a hyphen a hyphen or a minus." Why even make a big deal with what has been accepted and standard practice for hundreds of years and perhaps thousands? Hyphen and minus, they look the same, but they mean different things, depending on the context. In their use within COBOL, the language is emulating everyday situations with the recognition of accepted and standard language notation. "I say use - as minus and make - illegal in names, use say _ instead, you tell me _ isn't -. Well of course it isn't, that was the point!" Nonsense!! Why not set the same arbitrary rule for standard, written language? Look, in case you haven't understood by now, COBOL was intended to emulate English-like statement and sentence construction. You aren't anybody of much importance to have a say in how the most widely used language, ever, should have to change, just because you are too simple-minded to understand the benefits of a language that is easy to understand and easy to write with. "Not me who's being stupid here." It's quite stupid to insist that, just because a language is verbose, it doesn't have a place or importance in the computing world. Millions of people around the world recognized the benefits of the language, and that's the reason that it became the most widely used language ever. Remember that there were plenty of other language choices for programming at the same time COBOL was becoming the most widely used. So, why did it take over as the number one preferred language? The reasons for that would still be over your head. You're being hard-headed and short-sighted and too simple-minded to recognize the reasons. So, just don't even bother carrying on with this conversation and go ahead and do your cryptic language coding.
The world isn't going to care that you don't like COBOL. "Stupid is insisting you weren't wrong," There is nothing at all wrong with anything I've said. You may not like or agree with what I said, but then, you're quite irrelevant. "when it's apparent to everybody involved that you were, including you from your reply." Perhaps if you expressed yourself in a COBOL-like fashion, you'd make more sense. Right now, you're being very cryptic and very incoherent. "Even standards and conventions aren't necessarily the answer, I've seen some outright dumb ones imposed on developers." If you're in the IT field, then you must be one of the most stupid out there if you actually believe in that sentence above. Standards are always preferred, and if you are a proponent of spaghetti-like code and incomprehensible programming, then, you don't need standards. Standards are ALWAYS THE ANSWER in order to create a coherent application, that performs according to expectations, and that would be a lot easier to maintain in the future. It sounds to me like you're not much better than a freshman programmer who hasn't built experience beyond what he has learned in school. "Must use hungarian notation. There should only be one return in a function. Annotate every line of code. Pseudo code lead comment for every method." Whatever! But, irrelevant to the discussion at hand. Look, that list is very basic, and something that most people learn in the first day or first week of programming classes. So, it's not very impressive that you could list some very basic standards or rules of programming. Also, could you learn to express yourself in standard and grammatically correct English? Even when you're just listing things, the way you express yourself should sound "professional". ;)
"Aside from one lovely period in my career, I've done nothing but maintain / enhance other people's code," It's a good thing that you're "just" performing maintenance and enhancement, because, from the sounds of it, you wouldn't be good at original ideas and original coding/programming. In fact, it sounds like you're eons from becoming a "developer". "if there's an example of bad practice I haven't seen, I'd really rather not." There is a lot you haven't seen and a lot you haven't learned. But, from the sounds of it, you are very likely one of those that creates very bad examples of programming. Your attempts at creating exaggerated and ridiculous examples of COBOL didn't help your case at all. One last thing: COBOL is a tool, and just like any other tool, it can be used correctly and it can be used incorrectly. COBOL compilers check for syntax, and the compiler doesn't care how stupid one wants to be in naming variables or any other elements in a program. Just like a hammer, you can hit the nail on the head, or you can miss or hit the nail sideways, and you'll end up with a mess for a construction project. People who continuously miss or bend the nail won't be in construction for long, and it's likewise with COBOL or any other language.

Tony Hopkinson

Your original post was use cobol to address naming issues! I call bollocks because you can write ridiculous code in Cobol, you agree that you can, but you shouldn't, and then call my intelligence into question? Then we get onto when is a hyphen a hyphen or a minus. I say use - as minus and make - illegal in names, use say _ instead; you tell me _ isn't -. Well of course it isn't, that was the point! Not me who's being stupid here. Stupid is insisting you weren't wrong, when it's apparent to everybody involved that you were, including you from your reply. Even standards and conventions aren't necessarily the answer; I've seen some outright dumb ones imposed on developers:

Must use hungarian notation.
There should only be one return in a function.
Annotate every line of code.
Pseudo code lead comment for every method.

Aside from one lovely period in my career, I've done nothing but maintain / enhance other people's code; if there's an example of bad practice I haven't seen, I'd really rather not. :(

adornoe

"Some cobol based on your example..." Resorting to stupidity and ridiculousness is no way to win an argument, and you're presenting yourself as being too childish. "E-F-I-T-D r = E-T-S * E-F-I-T-R Might need some annotation...." As programming manager, I would fire your butt on the spot. Being stupid and ridiculous should get your butt fired from any job. "No such thing as self documenting language." COBOL allows a programmer to be as stupid or as intelligent as he wishes to be. A manager in a COBOL shop will not be as forgiving, and your programming career in that shop would be over immediately. When it comes to COBOL, it's the expectation that someone will code for readability while using the underlying structure of the language. You can be stupid or you can be smart, but, by being stupid you will also be gone. You can get as ridiculous in any programming language, and you will still be gone. "There are some that lend themselves better to self documentation, but if you are aiming at code comprehension, it's all down to the audience." Not really, dude. The audience for a COBOL program is, expectedly, those in the programming department and those that would care to determine what a program is doing at a high level without having to understand every nook and cranny of the language. The English-like construction for the code is more easily accomplished through COBOL than with any other language. That is why it was designed and that is why it still exists today. COBOL was designed to allow for verbosity, and that verbosity allowed for long names. Nothing wrong with that statement. "The more readable your code the wider it is." And, that's quite irrelevant. It's still code that gets translated to machine language, and as far as the machine is concerned, it doesn't care what the language for coding was. Can you understand that? "If I was using cobol, I'd probably stick to meaningful names as well, but that's to counteract its weakness, which is scoping."
If you understand how to construct a COBOL program, and you code with simple to understand names, then you wouldn't have to worry your little heart about "scoping". Scoping is good to have, and actually, OOP COBOL has it now. Go do some research and you'll start getting rid of your ignorance. "When I was doing machine code or good old BASIC, even C, I relied on annotation to make my code comprehensible, not always successfully of course, because it doubles the maintenance burden." Nice, but, irrelevant. We're talking about easy comprehension of program code. I don't care that you coded in C or machine language, just make the damn program easy to understand and life will be easier for you and those that follow. COBOL was designed to allow for the lengthier names so that the programs would be a lot easier to understand and enhance and debug. Assembler or machine languages should never enter into a discussion about "readable" English-like coding. "As for hyphen and minus, one can only assume you've not done much with parsers, because that's a totally unnecessary level of complexity that added nothing to comprehensibility, underscore would have done just as well, as would camel case." Hyphen and underscore are two different characters, and when it comes to the "English-like" construction of a name for a variable or for another element, the hyphen is much preferable, because the underscore is not part of everyday English, while the hyphen is. When is the last time, besides in programming manuals, that you saw people using underscores in their writing? But, again, you don't have to use the hyphen or the underscore to construct nice English-like names; you can always use the "camel" format for variable name construction. It's a matter of preference. However, your issue is, again, irrelevant when it comes to how one should go about creating readable code. "I'm fanatical about comprehensibility, I consider annotation a failure in any language where you could do without it."
You've never been in the same boat I was in when managing a large project, where people could create many different names for the same instance of a variable or field. Naming conventions, with long, self-defining names, so that everybody could immediately know what was being referenced, were always the preferred method for a large staff. But, they were all expected to use the same name for the same referenced item or variable. That's called setting a standard. Hope you know what that means. "Something to bear in mind: syntax and semantics are constraints as much as they are enablers. It's not verbosity or terseness that we need to avoid, it's ambiguity." Another, quite irrelevant point. Ambiguity is overcome when standards are set and followed. The syntax in COBOL is simple, and perhaps that's the part that some people hate, because, when something is made simple, then they believe that it doesn't require "real" programmers to code in the language. Verbosity is not a problem when someone uses practical naming conventions, and a lot of common sense, and doesn't try to do ridiculous code such as you started with at the beginning of your post above. Using a worst case scenario is not how you win arguments, because a worst case scenario could be demonstrated with any language. Try harder next time.

Tony Hopkinson

E-F-I-T-D = E-T-S * E-F-I-T-R might need some annotation.... There's no such thing as a self-documenting language. There are some that lend themselves better to self-documentation, but if you are aiming at code comprehension, it's all down to the audience. The more readable your code, the wider that audience is. If I was using COBOL, I'd probably stick to meaningful names as well, but that's to counteract its weakness, which is scoping.

When I was doing machine code or good old BASIC, even C, I relied on annotation to make my code comprehensible, not always successfully of course, because it doubles the maintenance burden. As for hyphen and minus, one can only assume you've not done much with parsers, because that's a totally unnecessary level of complexity that added nothing to comprehensibility; underscore would have done just as well, as would camel case. I'm fanatical about comprehensibility; I consider annotation a failure in any language where you could do without it.

Something to bear in mind: syntax and semantics are constraints as much as they are enablers. It's not verbosity or terseness that we need to avoid, it's ambiguity.

adornoe

even if they "look" alike. It's the contextual use that matters. In some languages, you can have == and = having different meanings and performing different functions. So, is the == more equal than the = symbol, or is it the same, or does it perform differently depending on what the compiler/interpreter "understand" them to be in the "context" of their use? In some languages, "(" and ")" can have different meanings, depending upon how they're being used. With COBOL the hyphen and the 'minus' look the same, but, they perform different functions. The hyphen is used in everyday English, and people immediately understand it as being a hyphen because of the context of the statement. In COBOL, a hyphen is used the same way, but only to "join" a series of words to create a single name from those words. Thus Employee-Federal-Tax-Due is the same as Employee Federal Tax Due, but as a single word/name for the compilers use to denote a single variable. Employee - Federal - Tax - Due would create a mess and broken business logic. That would be the "minus" sign gone amuck. Is that simple enough for you to understand?

adornoe

Hopefully, you are capable of understanding the "difference". It's the contextual use of the character that matters. The hyphen is used to "concatenate" or join words, and sometimes to separate words. When the hyphen is used as a mathematical operator, it loses its meaning as a hyphen and becomes a mathematical symbol. The hyphen in COBOL has a completely different meaning from the mathematical operator.

> Your code makes sense to you, because you are a COBOL guy and that's the way you write code. It looked horrible to me because I'm not, and I don't.

The code would make sense to anyone that has even a rudimentary understanding of the language, and it also has a lot of meaning to anyone that is able to read English. That is what COBOL was intended to be: a language where the code could be easily understood, and as easy as reading English. If you knew how to read English, you already understood most of what was happening in the program code. Of course, a little bit of common sense goes a bit further towards understanding what the code does, and so does understanding the overall scope of the project and/or program. The first time I was introduced to COBOL, without having taken even a single lesson in it, I and my fellow students were asked to try to "decipher" what the English-like procedures were trying to accomplish, and most of us got the gist of the application before we even started our first lesson. That's what COBOL was about, and that's why I was sold on the language. Like I said, I've moved on from COBOL, but it was and still is a great idea, and one that should never go away.

> Of course my C# examples might make you squint as well, not to mention other users of the language.

That would be the case for any cryptic language, even if I made it a point to understand the language thoroughly.
Programmers build bad habits, and the worst one is writing cryptic code that includes cryptic variable names and cryptic procedures/functions/sub-routines/methods, with the other being lack of documentation within the code and outside the code (external documentation).

> The point is your point, COBOL equals readable/comprehensible because you use verbose hyphenated names, is, well, 'erm, iffy.

There is nothing "iffy" about wanting applications that are easy to understand and easy to maintain and that don't require a lengthy learning period for anyone in the future that has to work with the application. All of that saves a lot of money, and staff is still the biggest expense in any IT shop.

> All the original article said was that meaningful variable names weren't always necessary.

I got that; but the article also mentioned that good naming conventions should be used wherever possible, for the sake of understanding. What I did was merely to point out that COBOL had already accomplished those functions a long time ago. You could be as wordy as you wanted while being as cryptic as you could get away with.

> Changing i in a loop to j or lp or IndexInCustomerList won't change the code in terms of how it's compiled, or the intent as it's perceived by a programmer.

Yet, a name that conveys an immediate meaning to the "reader" is much preferable to a cryptic name, even if it's intended for "localized" use. My preference for "self-documenting" code is to always use meaningful names, even if it doesn't matter to the overall application or to the compiler or to the runtime execution.

> COBOL is irrelevant to the entire discussion, as is any other language without some short limit on the length of names in it.

COBOL was/is my example of a self-documenting language. That was part of the discussion, and I put in my 2 cents' worth, which has turned out to be a trillion-dollar issue in the discussion.
The "author" became defensive when he tried to thrash the language as being too verbose and too out of date. Why not have just accepted COBOL as being at the "extreme" of what naming conventions are about, and about what self-documenting code should look like.

apotheon

Are you really unable to understand that Tony Hopkinson was referring to the idea that COBOL uses the same character in variable names and as a subtraction operator? Tell me you're only playing stupid. Please.

Tony Hopkinson

Your code makes sense to you, because you are a COBOL guy and that's the way you write code. It looked horrible to me because I'm not, and I don't. Of course my C# examples might make you squint as well, not to mention other users of the language.

The point is your point, COBOL equals readable/comprehensible because you use verbose hyphenated names, is, well, 'erm, iffy. All the original article said was that meaningful variable names weren't always necessary. Changing i in a loop to j or lp or IndexInCustomerList won't change the code in terms of how it's compiled, or the intent as it's perceived by a programmer. COBOL is irrelevant to the entire discussion, as is any other language without some short limit on the length of names in it.

apotheon

I made a huge mistake there. I missed the fact it was you that responded, and not adornoe. If I had seen it was you, I almost certainly would have gotten the joke. Mea culpa, mea culpa -- mea maxima culpa. You are the winner, and I am the loser, here. > Sorry. I couldn't down vote myself. It seems to me like you probably should have downvoted me, actually (and maybe adornoe, but that's another matter).

adornoe

> I really wasn't planning on getting into a wall-of-text contest . . .

Yet, here you are, being defensive and offensive. Next time, try something that you understand better and that you know you can't be challenged on. Perhaps something like the name of your first pet, because there's no way that I could challenge you on that, whether you were lying or not. I would take your word for granted. In the current discussion, you're easily challengeable. Look, COBOL may be a very old language, but, for its intended purposes, it was never broken, and, even today, it can perform as originally intended, but with many improvements, such as OOP and support for the latest in technology.

> Age has nothing to do with it.

Yet, you mentioned that I was still stuck in the 1970s because I was "defending" COBOL. Why not try to be consistent?

> Bad design has everything to do with it.

You can take any "well-designed" language and turn it into crap. It's all dependent on the user. But you are not keeping up with the times. COBOL has been upgraded to do many of the same things that "more modern" languages do, and with much of the same kind of "structure", such as OOP programming. It's still "wordy", but very capable. Perhaps you should try it and then come back with a better understanding. I was able to design and code entire applications using COBOL a lot faster than using any of those other languages. Although I have since moved on, if I had to, and it was available, I would still opt to use COBOL over any other language for "business" applications.

> And yet, you're willing to claim COBOL is better than newer languages that allow people to more quickly develop software, and more easily maintain it, because COBOL enforces an order of magnitude more verbosity.

Why the "yet"? I'm not contradicting myself. You're the one reading something into what I said that isn't there.
And, yes, for business applications, if the language was capable of doing the job, I would still prefer COBOL to any other "modern" language. When coding in the language, I never had to backtrack in order to understand what a variable name meant, whereas, with the more cryptic and abbreviated names which are found in applications written with "current" or newer languages, people WILL forget what a variable name meant and what it was used for, sometimes even within the same session of writing or modifying the code. When it comes to pure business applications, COBOL was my "go to" language. I have moved on, but if a shop had the options, I would still prefer COBOL for new development.

> There was a time when COBOL was the language to use for such purposes.

I still don't know of a single "business" application that couldn't benefit from the structure that COBOL afforded, including its "self-documenting" capabilities.

> That time is past -- because its design did not provide for an ability to keep up with the changing needs of business.

Because you say so does not make it so. You're just repeating what you've heard, and what you heard is not always the truth. When it comes to business needs, I haven't noticed where businesses are doing anything different from what they used to. A lot more, yes, but different? No!

> The only place where COBOL is appropriate is where it is already in place, debugged within an inch of its life, and running stably. When you need to write something new, write it in a better-designed language.

Bunch of nonsense! For business applications, I still have not met a language that could approach COBOL for the ease of learning and ease of use and ease of maintenance and ease of self-documenting code. As a compiler, the code generated performed as well as or better than that generated by "modern" languages. COBOL was always the easiest language to work with. Even after so many years, it's still the easiest language to learn and understand.
No other language, modern or old, can match it.

> Sure, fifty years ago. "Was" is the operative word here. The end of its usability was built into its DNA.

Like I mentioned earlier, you haven't been keeping up. COBOL received four major upgrades from its original; it got updated in 1968, in 1974 and 1985, and its latest upgrades include OOP programming capabilities, and it's as capable as any other language you can mention. It's even supported in the .NET infrastructure. I never met a business application that couldn't be written in COBOL. That still remains a fact, even in today's more demanding applications.

> So what? It's Turing complete. By definition, that means it can be used to write a business application.

Why reinvent the wheel? If something works, and does its job as well as the competition, then why reinvent the tool?

> The same could be said for BASIC or VBScript. That doesn't mean it's a good idea.

BASIC has been upgraded, and so has the foundation for VBScript. There are similarities between the current version of Visual Basic and C#, and yet C# is called the "more modern" language, but, as far as capabilities, there isn't much difference between the two that matters so much for business applications. In fact, there's an application which I started some 10 years ago, and which I abandoned for a while; I had started it using Visual Basic 6, and I've since revived the application, and will finish it in Visual Basic 6 and VBScript (classic). It's a web application using ASP3 with parts of it using ASP.NET, and the application has some very robust features with a database that could probably grow to tremendous proportions (using PostgreSQL), and when I test it against all the different browsers, it performs as well as anyone can expect, and perhaps as well as with any other language. I'll be upgrading it to VB.NET and/or C#/ASP.NET.
When it comes to testing under the different browsers, Opera was actually the fastest, much faster in fact, but that's another matter. Programming has become a lot more expensive and a lot messier in today's IT departments. That's still a fact, and one that shouldn't be so easily dismissed. Only a person who doesn't care about saving money and doing things efficiently would be as dismissive as you seem to be.

> This has nothing to do with whether people are using COBOL.

Yet, people are still using COBOL, and if it were so easy to convert from COBOL to an easier and more robust "newer" language, people would've done it a long time ago. Instead of converting, many corporations opted to maintain and even write new programs using COBOL. The conversion to newer tools was cost-prohibitive, and if the newer languages were actually easier to understand and easier to code with, then I'm pretty sure that there wouldn't still be so much COBOL in existence.

> If it's not one thing, it's gotta be the other for you. You can't lose.

You're still demonstrating that characteristic.

> You're the guy who decided to say something is both concise and verbose.

COBOL can be concise and verbose at the same time. One sentence can accomplish in one statement the same function that would take several statements in another language. That makes it concise. COBOL can be verbose in the names people are expected to use for their variables, literals, procedures, etc. But that verbosity easily pays for itself in the later stages, when the application needs to be debugged and enhanced and maintained.

> Maybe you understand each of them in isolation, but are just incapable of understanding the conflicting meanings when juxtaposed.

Actually, you're the one that can't understand the meaning of concise and verbose in the context of this discussion and in the context of COBOL. Go back to my paragraph above where I'm trying to clear things up a bit for you.
> I don't know what you're failing to understand

It's not me "failing to understand". It's you being hardheaded and hating to be challenged on the points that you made in your "precious" article. How dare I challenge someone who wrote the article?!?!

> but, unless you're just trolling,

So, to win the argument you have to resort to calling someone a "troll"? Aren't you capable of better debating tactics? Calling someone a troll is the same as admitting that, perhaps, you don't like being challenged.

> you clearly don't understand something when you say "Everything is clear and concise, even if verbose."

It's you with the inability to understand "concise" and "verbose" in the context in which they're being used in the discussion and in the context of COBOL as a language.

> I just said to my girlfriend "If someone said 'Everything is clear and concise, even if verbose,' what would you think?" She said "I'd think he doesn't know what he's talking about."

Oh, wow, dude! Do you really believe that you can win the discussion with that kind of fallacious and irrelevant line of argument? Does your girlfriend understand the whole discussion and the context of the discussion? I could use your same example by asking my dog the same question, and if he said "woof, woof", I could take that as affirmation of my position and then bring that into the discussion as proof that it's you that doesn't know what he's talking about. Give me a break, guy. You should know better than that.

> I didn't tell her anything about the context,

Obviously not, and even if she understood what you were talking about, and if she understood what I was trying to say, I wouldn't believe you after the conversation. But I'm pretty sure that the conversation between you and your girlfriend never occurred. You're just trying to score cheap points.

> your worshipful devotion to the false idol of COBOL,

That's actually a pretty silly way to try to win an argument.
Do you need to make silly statements in order to try to win the argument? Btw, I have stated, many times, that I've moved on from COBOL and I actually haven't touched COBOL for more than 10 years. But COBOL is still around, and, from my experiences, it's still a better language for clarity and for ease of use and for self-documenting code. Have I said that often enough?

> or your overweening arrogance ("I, for certain, understand what programming is all about"? Really?).

Statements of fact have nothing to do with arrogance. You may not like my attitude about it, but a fact is a fact. So, would you like for me to repeat what I've said? But, you might call me arrogant after that. ;)

> I just read your words to her, out of context, and asked what she thought. She's pretty sure whoever said that is full of it. Gee -- I wonder why.

I just spoke to my dog about your line of argument, and he said "woof, woof". Can you blame him for feeling that way? Look, try to be a bit less ridiculous and you might actually start to sound a bit more credible. Try a different tactic next time. Design and programming need to be done with long-term expectations that staff will change, that enhancements will be needed, and that things can go wrong. When a program or system is well-documented, and when the programs themselves are written in "self-documenting" fashion, the problems in the future will not seem so daunting.

> I agree with every word of that.

You mean, I didn't come across as arrogant?

> I do not agree that COBOL is the answer

If you would care to examine every word I've used in this discussion, I never said that COBOL is "the answer". But, for what your original premise was about, I proposed that COBOL had already solved "the problem". COBOL is a tool, just like most other languages, and, for what it was supposed to do, with an "English-like" format, it was and is the best around.
However, for my purposes, when it comes to easy coding and easy maintenance and easy understanding, I would still prefer COBOL over any other language.

> with its cluttered syntax,

Actually, what you consider cluttered, many considered clear. It's all dependent upon the understanding of the language, and you don't have any understanding of it, even if you took some lessons in school.

> overloaded non-alphabetical character usage,

Believe it or not, you weren't required to be so verbose in naming conventions, and you could actually use the same naming conventions as people use with other languages, but then you'd be defeating the purpose of "concise and verbose" and "English-like", which were and are the trademarks of the language. I could just as easily have written "employeeFederalTaxDue" to replace my original example of "Employee-Federal-Tax-Due". Both would have been clear as to their meanings and intentions.

> and frankly doltish semantics.

It's very easy to attack what you don't understand. And there's a lot you don't understand.

> It is a horrid language,

Clearly, you never had to manage a large programming project. When it comes to large projects, clarity is of the utmost necessity.

> in large part because of the impositions it makes on the developer's ability to organize the structure of a program cleanly and understandably in composable units.

Like I said, you're not keeping up with developments. COBOL is no longer structured as it was 20 or 30 or 40 or 60 years ago. With OOP, COBOL is very much a modern language, and you can write a program which pretty much resembles the OOP programming that you could find in other "modern" languages. Divisions and sections might still be used, but they're quite irrelevant when it comes to the real structure. Look it up. With an open mind, you might be able to understand the example here: Why Object Orientation for COBOL?
http://www.c-sharpcorner.com/UploadFile/rickmalek/Art02-OOIntro12052005040650AM/Art02-OOIntro.aspx

Looky here: Exception Handling in Visual COBOL.NET http://www.c-sharpcorner.com/UploadFile/RSM50/2787/

Now, there is still a lot of "COBOL" structure in there, but it's a much more modern language, capable of doing things that people would expect from other languages such as C# and Java and others.

> Then, you tackled a problem for which you are not well-equipped.

Well, you haven't proved me wrong yet.

> It's interesting that you've arrived at this opinion by virtue of the fact that I dislike COBOL. Come back when you can construct an argument.

Actually, the lack of a coherent argument lies on your side. Your initial line of argument was about using clear naming conventions, and using shorter (and cryptic) names when it didn't matter. All that I did was to mention that COBOL had already answered the "problem" a long time ago, and you took umbrage at my mentioning that "four letter" word in your IT universe. So, it would seem to me that it's you that's being childish and emotional about the issue, and it's you that can't construct a coherent defense against my argument or an effective attack against COBOL.

> It's also kind of amusing that you're telling me I'm not able to defend my points when you said that the article's author didn't understand it.

There are times when people do bite off more than they can chew. Even I have initiated discussions, only to find out that there are others with a better knowledge base about the subject, and then I have to defer to their "superior" knowledge. I'm not afraid to admit when I'm wrong, but, on the subject of COBOL, I know it a lot better than you. First, I didn't understand "concise", then I didn't understand "verbose", and now I don't understand "diminishing returns". But, hey, at least you're consistent in dismissing everything that anyone puts forward.
> What do you call a false dichotomy when it consists of three parts, and not two?

I would say that you don't know how to count, because there were three assertions, or three accusations, in which you accuse others of doing what you do. So, what do you call someone who accuses others of doing what he or she does? I call that someone who is either a liar, or someone who is not comfortable in his own skin and wants to share his own shortcomings with others. I hope that wasn't too deep for you.

> You must fail to understand either "concise" or "verbose", because they are conflicting terms.

In the everyday definition of the terms, I would agree. But we're not talking about the everyday definition. We're talking about how COBOL could be concise in the statement constructions, and verbose in the English formulations of those statements. I know that the "paradox" might be beyond your comprehension, but, if you knew anything about project lifecycles and design and coding standards, you would be able to see how the wording is not contradictory. Concise means being able to express many things in a few words, even if the words are lengthy. Verbose means using many words to express very little. When it comes to COBOL, a statement can be concise when it can accomplish many things within that same statement. COBOL might be "verbose" when it comes to lengthy variable names, but, when it comes to the number of actual elements within a statement, it's not much different from other languages. So, again, in the everyday world, you might be correct with the definitions, but when it comes to programming, you're not understanding the meaning of concise and verbose.

> You probably also fail to understand how there may be a point of diminishing returns when discussing adding more verbosity, until your computer starts creaking under the strain of the sheer size of your source files.

In the current IT world, it's much preferable to cut the costs of staff than to cut the costs of storage and RAM.
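The "concise yet verbose" claim being argued here is easier to settle with an example than with dueling definitions. A small sketch (hypothetical names, in Python rather than COBOL): statement-level conciseness and identifier-level verbosity vary independently, so a single statement can do several things while still using long, descriptive names.

```python
# Verbose names, concise logic: one statement scales and sums an
# entire list, yet every identifier is self-describing.
employee_taxable_salaries = [42_000.00, 55_500.00, 61_250.00]
federal_tax_rate = 0.20
total_federal_tax_due = sum(
    salary * federal_tax_rate for salary in employee_taxable_salaries
)

# Terse names, sprawling logic: the same computation spread over
# several statements, with nothing for a maintainer to hold on to.
s = [42_000.00, 55_500.00, 61_250.00]
r = 0.20
t = 0.0
for x in s:
    t += x * r

# Identical results either way; only the readability differs.
assert total_federal_tax_due == t
```

Which of the two a maintainer would rather debug is the whole "concise and verbose at the same time" argument in miniature.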
RAM and storage and even the computers are much cheaper than personnel. I would much rather reduce the costs of programming than save a few pennies on storage and/or RAM, and as far as programming is concerned, with an easy-to-learn and easy-to-code and easy-to-maintain structure, COBOL can save a lot more than can be saved through hardware. So, your "diminishing returns" argument holds absolutely no water. Things are exactly the opposite of what you believe or think. But, you sound like you're some novice to the IT world if you actually believe that part about savings in the storage and RAM departments. Even I have 2 terabytes of storage on my main PC, and 16 gigabytes of RAM, and 6 cores on the CPU, and that's a lot more power and storage and RAM than all the mainframes that I ever worked on in my whole IT career combined. Storage and hardware are very cheap, and personnel costs are still many times the costs of the hardware. At one time, way back in the 70s, people did actually have to concern themselves with saving those bytes, but we're in a different world now, so, to me, it sounds like you're the one in that old world of technology from way back in the 70s. Try a different angle for your arguments; the current ones aren't working and are really lacking in credibility.

> Look, I'm basically suggesting that you don't understand conciseness or verbosity because it's more polite than the alternative

You've already been down that road, and I've destroyed every single one of those arguments before. So, try a new approach, and one that doesn't involve lying or trying to spin your way around my assertions that your arguments are very lacking in credibility.

> assuming that you're a liar and a troll,

Cute! But no cigar. You want to call me a liar and a troll, but you're going to pull back because... you don't want to go there? It's the same as me saying that I was going to call you stupid, but I won't, because I shouldn't go there.
Look, the accusation is made when the thought is brought out in the open. So, you're in actuality calling me a liar and a troll. Your logic stinks and your methods smell. Calling someone a troll is the equivalent of showing frustration and begging for the argument to end, because you're running out of intelligent and coherent counterarguments.

> engaging in petty acts of conscious evil.

You've already lost the argument when your only recourse is to bring out the insults. If you're going to do battle, and the weapons are words and knowledge and wisely formed remarks, insults aren't necessary. If you must resort to insults, the least you could do is accompany them with some useful facts so that you won't reveal yourself as just being defensive and offensive.

> The same goes for my suggestion that you aren't familiar with the notion of diminishing returns,

I've already destroyed that fallacious argument of yours, and it's you that doesn't really understand the diminishing returns from cryptic and difficult-to-understand programming. You very likely need some lessons on IT management before you can talk about diminishing returns in IT.

> because if you are familiar with it, you're either too stupid to grasp the fact that heaping on too much verbosity can make it much more difficult to understand the entire program because you're getting lost in the paragraph-length details or just engaged in malicious attempts to screw with other people.

You have things completely backwards. There is no such thing as making things complicated by making things easier to understand. I've already mentioned how I've worked with many different programming languages, and when it comes to the number of characters occupied by the COBOL code versus other languages, then, you're right, COBOL requires more storage for the written lines of code.
However, when it comes to the diminishing returns that you like to talk about, the higher cost of maintenance that comes with the more cryptic languages will far outpace the higher cost of storage for the COBOL code. If a program is immediately easier to understand and debug and modify than one that is cryptic and "short", then the savings from the easier maintenance will far outweigh the measly savings from the shorter and cryptic code. You need to keep in mind that, in any IT department, personnel costs are always much higher than the hardware costs, and those personnel costs don't go away, while the cost of hardware is very cheap in comparison, even if upgrades to the hardware might be needed once in a while. Your line of argument suggests that you've never been in any position of responsibility in an IT shop. You sound like a simple coder who only knows how to think about the bits and the bytes, and doesn't know how to look at the big picture.

> Take it as a compliment when I try to give you the easy way out.

You sound too simple for me. You sound like a junior programmer and sometimes like a beginner. You're too low-level to understand that there's much more to this than the simple coding of programs. An IT department is much more than that, and that's why I can see the advantages of a "wordy" language over a cryptic language. I don't need any lectures from anybody with the limited experience that you seem to have, and if there's anybody here needing an easy way out, it would obviously be you. But, since you still don't understand how to look at a project from a high-level viewpoint, I ain't about to let you off the hook, especially not when you pretend to know better than others just because you "wrote the article". Look, when you can't defend your points from a strictly knowledgeable standpoint, it's better to admit that perhaps there are others with a lot more experience than you, and that you could perhaps learn something from them.
That still holds true, and you need to acknowledge those facts.

> I'd like to introduce you to a couple of people who I know for a fact are more knowledgeable than me about many aspects of the skillset we call "programming".

Quite irrelevant, and not a winning point. Why not keep the argument between you and me and the rest of the posters? Are you feeling so insecure that you need to recruit the help of others who "might be" more skilled in programming than you? It's very easy to find programmers who understand programming better than you, because, from what you've demonstrated thus far, you still have a long way to go before you can call yourself a "skilled" programmer.

> Scan this thread for the usernames Sterling "Chip" Camden and Justin James.

I've already read their posts and I've answered a couple of them, and they actually sound more knowledgeable and experienced than you, but then, that's not so difficult to do.

> There are things I know that neither of them knows, but if I was going to hire someone as a programmer out of the two of them and me, all else being equal, I would pick one of them over me every time.

Well, duh! Anybody that has seen their posts and articles would make the intelligent decision to choose either of them over you. But don't get me wrong. If I ever were to need a programmer, I might even consider you for a position, because I believe that you could learn and might benefit from the knowledge of others. But you'd have to be open-minded.

> Every time, no hesitation.

I agree. I would choose them over you anytime.

> There are cases where I'd choose me over them,

But then, you'd be making a big mistake. If you want things done right, never choose anyone with the experience level that you have, when there are hundreds of thousands of people out there who could perform better.

> but they are few and far between.

I've often heard that "coders are a dime a dozen", and I believe that to be true.
Developers are harder to come by, and good developers, who understand the IT world thoroughly, are oftentimes worth the prime compensation that that level of expertise demands and commands. I was at your level a long time ago, and I know exactly why you are so defensive. Perhaps with experience and a bit more wisdom, you'll become a good and level-headed developer some day.
> You do not measure up.
Not to you, because you don't have the capacity to measure anybody. Even if you have years in the IT world, you still have the newbie mentality. Look, I've already mentioned that I've directed projects for major corporations, I've directed IT departments, I've taught analysis and computer classes, and I spent many years as a consultant on many different projects. I'm working for myself now, and I've already mentioned a project which, if I can get it to work as I want, could rival Google and Bing and all the news and information websites out there. I could actually do demos of the system right now, but it would have to be on a private network. Someone with your level of expertise could not even dream about doing what I'm trying to do.
> You are some bloviating windbag
Results. Results are what matter. Insults have never gotten me anywhere, and I'm pretty sure you won't get anywhere with them either. Try a different tactic, because you can never win this argument with insults.
> full of his own "expertise"
I don't like to flaunt it, but when pressed, I'll let you know about my expertise. But it seems that, as far as you are concerned, my points are going in one ear and out the other.
> whose entire argument boils down to "Verbose is better,"
I've been down that argument path with you already, and I shouldn't have to restate my positions; if you're interested in what I said before, go back and reread my posts, including this one.
> "I'm older than you so I'm better," and "I don't know what the hell I'm talking about, but I know COBOL syntax backwards and forwards."
Yep! I'm older, and much wiser than you, and that should be abundantly clear from my posts versus yours. But knowledge can be gained without age. When I started in the IT field, it took me barely a few months before I was a senior programmer and the lead in a staff of 9 programmers. I didn't act like the "lead" programmer, because I respected the rest of the staff, who had more experience with the company's applications, while I had the superior programming skills. A year later, I became programming manager, but I remained "one of them". I'm even ashamed that I'm engaging in an argument such as this, but I'm doing it because there are too many people out there who claim to know more than they actually do, with you being a prime example. BTW, my first programming language was Assembler, and I learned COBOL later, when I saw its capabilities in other shops.
> I should probably add Tony Hopkinson to the list of developers better than me (in the skills sense, at least) along with Camden and James, but I don't know enough about him to be certain of that.
I don't have any argument about that. I agree that they seem to be more knowledgeable than you. You should turn this argument over to those superior skills, because you're not an effective defender of your original premise.
> Now, here's some homework for you, crusty old padawan: ask Sterling and Justin what they think of COBOL.
I've already posted on some of Justin's discussions regarding COBOL, and he wasn't as argumentative and defensive as you have been. Justin is more mature in his presentation and in his discussions and defenses of his arguments. You could learn a lot from him. However, I doubt that either one of them has the experience level that I have.
Yeah, they may know a thing or two more than I do about some specific technologies and software, but, applications-wise and systems-wise, I'm pretty sure that I have a lot more accomplishments under my belt. And I wouldn't mind having Justin working for me sometime in the future if my project does take off on the internet. The other guys, well, I haven't seen enough of their writings to make a judgment about them. But in your case, I'm pretty sure that I'd start you at the bottom. ;)
> Just let me know in advance. I plan to wear earplugs to protect my virgin ears from the invective these respected, skilled, professional developers might share, if they feel the spirit move them when discussing the subject of COBOL. Y'know, the worst thing about Java is that it's the "new" COBOL.
How those people feel about COBOL is irrelevant, since they probably never had to use COBOL and they probably never will. It would be like asking a jet pilot how it feels to fly in outer space when that pilot has never been an astronaut and never been to outer space. It's the same with me, where I couldn't tell you what it feels like to program in a language that I've never used, yet have heard some bad things about. My arguments would be irrelevant, because I never had first-hand experience with whatever that language was. So, are you learning anything yet? Being a blogger, or the author of the article, does not mean that you know more about the subject matter than those who read it or might opine on the subject. That is still a fact. Nothing has changed.
> I never said otherwise.
Oh, yes you did! Even if you didn't use the exact words, you implied it when you said "I wrote the article", to imply that you knew the subject matter well, and better than me.
> This is a straw man.
A strawman is a false argument, and my argument, that you believe you have the superior knowledge on the subject matter, is still true; therefore, there is no strawman argument involved.
You need to get a better defense. I've already stated that I don't do COBOL anymore, and I've also stated that I have learned and used many other languages, including some of the more modern ones, such as C# and Java (and JavaScript). There is nothing wrong with learning and using the latest tools of the trade, and those languages I mentioned are still very widely used, but, because you're just arguing for argument's sake, you'll find something wrong with my statement above.
> So . . . you have a strong history of betting on blub languages (and JavaScript).
Actually, that's a pretty stupid statement. When COBOL was the "it" language, people went with what was in vogue and with what served to provide the solution, and COBOL was the number one language for many years, and it's still very widely used. If the number one tool, instead of COBOL, had been C or Assembler, then I would've been using them at the time, but they weren't the most used at the time, and most programmers used COBOL because it was the "tool of the trade". But there is a very good reason for it becoming the best tool at the time, and that, again, is the ease of understanding, the ease of learning, the ease of debugging, the ease of maintenance, and the self-documenting aspect of the language. No other language, then or now, can match COBOL's characteristics. Are you learning anything yet, or are you still stuck in that "newbie" mentality? Just because the years change, it doesn't mean that things have to be done differently. That is still a fact. Learn from it.
> I never said it does mean that.
Yes you did. You said that I was still stuck in the 70s because I defended its use for the times, and even for today's computing environments.
> On the other hand, the fact that "old" is not necessarily "bad" does not mean that "old" is necessarily "good", either.
You sound so immature. Are you even thinking about what you're writing before you write it?
Take your statement another way and see if it makes sense: on the other hand, the fact that "new or current" is not necessarily "bad" does not mean that "new or current" is necessarily "good", either. Try to be a better thinker, because your logic stinks. If COBOL, or even Visual Basic, can perform better than, or as well as, the other "current" languages, then why not use them? That is another one of my "factual" statements, and nothing you've said up to now has changed or invalidated that fact.
> If they could, I would.
How would you know, if you've never tried them or used them? The fact is that you shouldn't talk about what you don't understand. COBOL remains the language with the most lines of code worldwide for the most applications. You need to start facing the reality that it's not going away any time soon, if ever. That fact is worth recognizing and remembering.
> Oh, I'm completely aware that, as things currently stand, COBOL is not going away.
Then it might be worth your while to be better informed about the language, so that you won't sound so clueless in a discussion about it.
> Popularity does not equate to quality,
In the case of COBOL, the quality was always there, and the compiled code performed as well as code written in other languages, and sometimes even faster. People did not develop as many bad habits in programming as they do now with the more "modern" languages. But that's something that's beyond your "newbie" mentality.
> though, and I was just surprised to discover that someone able to operate a modern computer with an Internet connection and a Web browser capable of rendering TechRepublic would still think COBOL is a good thing.
Is that your way of trying to win an argument? If you can't beat them, insult them? The fact is that, when it comes to learning and using any language, modern or older, I'm pretty sure that I would do a lot better than you, and it wouldn't be just in the learning and using of the language.
When it comes to theory, and the development of languages, I was involved in that too, and I even wrote my own programming language at one time. I didn't have the time to pursue it, but it would've been in the same category as Ada, which is another "old time" language, but one which is a lot closer to how the "modern" languages are constructed. I even had a name for my language: I was calling it "SIS", for "simplified instruction set". The way I saw it at the time, programming was something that most people should be able to do, hence the "simplified" in the name. And even today, I don't believe that programming needs to be complicated. One more thing you don't realize is that COBOL was, and is, as capable as any of the current and newer programming languages, and there is nothing that you can do with Java or C# or C++ that I couldn't do with COBOL, as long as it was, or is, a business application. But it seems you're too simple-minded and too narrow-minded to understand that the internet and browsers and websites (like TechRepublic) are just newer ways of doing the things that I used to do way back when COBOL was in its heyday. I was involved in creating the first online banking system, working over phone lines, before the words "internet" or "web" were even invented. Looky here: PRONTO: Bank on Your Atari http://www.atarimagazines.com/v1n6/pronto.html You need to learn a big lesson, that being that whatever you're doing today, and whatever you're using today, wasn't invented in today's world. It all started before you even heard of a computer. So stop with your stupidity and start looking at the bigger picture and the history that brought about today's internet and today's programming languages and today's applications. You yourself might get some benefit from understanding what COBOL is about before carrying on and attacking that which you're unfamiliar with.
You should follow that advice, because you still sound very ignorant when it comes to COBOL and the benefits that it offered in its heyday, and even in today's world of computing.
> You make unwarranted presumptions
There's no presumption involved when I'm looking at what is very clear in front of me, that being that you were the one with the attacks against something about which you have very little or no knowledge.
> (beyond that incredibly stupid gaffe about me not understanding the article).
What is the ultimate in stupidity is for someone to tackle a subject on which he's a virtual newbie and basically ignorant. You bit off more than you can chew, and you keep putting your foot in your mouth. Stop while you're behind, or you risk getting further behind.
> Would it surprise you to learn that I actually learned a little COBOL years ago, still have a COBOL book (I think so, anyway -- I haven't seen it in months, maybe a few years), and have written some COBOL code --
That's the same as saying that you went to a seminar, or attended a lecture, and became an expert on the subject after you left that seminar or lecture. That's pretty stupid. Hearing about something, or dabbling in something, does not equate to enough experience to be able to discuss the subject intelligently. Would you feel comfortable with me piloting a plane if I told you that I took introductory lessons on how to fly one? If you did, then you would be a moron. Likewise, being a COBOL programmer with heavy experience is not the same as taking a couple of lessons and perhaps having the manuals around in a closet somewhere. Get a clue and grow up.
> or that my significant other took COBOL classes in college and wrote a lot of COBOL code (and, quite sanely, decided it sucked and moved on)?
You keep going back to the experience of others to make your points.
Look, your girlfriend's experience or lack thereof, and Justin's experience or lack thereof, does not qualify you to discuss or dismiss the experience of others. Talk about what you know about, and you won't be making so many mistakes.
> Of course it would -- because you cannot conceive of someone disagreeing with your worshipful regard for the King of Programming Languages because of actual, you know, reasons
You are so childish. Are you sure that someone at TechRepublic hired you to write for them? Or are you an "invited guest writer" meant to represent the ignorant and "newbie" generation? I'm sure the COBOL world will feel a lot better now that you've been gracious enough to admit that it might be "necessary". Okay, so, is the COBOL world out there free to go about their "business" or not? I imagine they're still waiting on your superior knowledge and wisdom before they can carry on. ;)
> Of course it's sometimes necessary
Yeah! The COBOL world can rest easy now. The master has spoken.
> -- when maintaining all that legacy code,
It's not just existing code, you ninny! New code and new applications are still being written in COBOL. Stop being so ignorant. There is a lot you don't know. Perhaps you need to resign from that "tech writing" position and allow someone with some real knowledge to take over.
> or in the very rare case where it needs to be translated to another language (usually, it should be mostly left alone until the entire system is replaced, hardware included, because the software is so thoroughly real-world tested and debugged that its stability cannot be matched by a bunch of Java blub programmers trying to phase out the COBOL blub programmers).
I keep hoping that you'll come up with some intelligent remark that I can agree with, but you keep disappointing me. Look, it's been tried, and where conversions have been successful, they've been expensive and prohibitively time consuming.
Converting is not an easy proposition, no matter what the original language and no matter what the destination language. Conversions are expensive and time consuming, no matter how much somebody hates the language of origin. Besides, that old saying that goes "if it ain't broke, don't fix it" still applies to COBOL. If it still performs the job better than, or equal to, what the replacement would do, then there is no problem, and no fix is required. The bigger question is: why should a company spend money and time and effort to convert to a more "modern" language just to please a bunch of programmers who don't like COBOL or the "old way" of doing things? That's a "high level" way of thinking in business and in application analysis that is still beyond the scope of a newbie such as you. If the IT world had a grading system for people in the tech world, and that grading system had levels between 1 and 100, you would be somewhere between 5 and 6. It's quite obvious that you have a lot to learn. Languages come and go, but COBOL is still ticking after more than 60 years. And it's still very widely used. No other "modern" language will ever match that record. That is still irrefutable, no matter how much you dislike the fact.
> Legacy code and legacy hardware is why it's "still ticking", and the reason modern programming languages won't exceed its tenure is that the world is changing at an ever-accelerating pace, which means that new languages supplant the old when it is discovered that the old designs are not sufficiently good to support the new work that needs to be done before they can build up the same quantity of legacy code in the world. It's not because COBOL is the bestest programming language EVAR.
That is so IGNORANT! In the IT world, there will always be new tools, but just because something is new is not a reason to supplant the old, merely because the "new" is newer and different.
Look, if a tool works, and the newer tool is not going to give you any real new advantages, then there really is no reason for the new. Many programmers still believe that C++ didn't really answer any problems that couldn't be taken care of with the "old" C. Java and C# and Python don't really offer any advantages over C++ or C. It's mostly about preferences with programming languages, and not really about advantages. What can be done with C++ and with C# can still be done with C, and, if a business application is involved, it can be done with COBOL.
> If there's an exception to the idea that no other current language will exceed COBOL's tenure,
Don't forget, it's not just "tenure". It's the extent of use. COBOL still has more lines of code in existence than any other language, and it's still adding new lines of code; it's not just being "maintained". Are you able to understand that?
> it'll almost certainly be a flavor of LISP (which is older than COBOL and, given Greenspun's Tenth Rule, may actually have worked its way into more code around the world than COBOL -- in fact, a lot of COBOL may contain ad-hoc, informally specified, buggy implementations of half of Common Lisp).
Lisp and COBOL were designed for different functions in the software arena. Lisp was the first language intended for "AI" research, and it also had a background in the mathematics arena. COBOL was written for business, and it served that function very well, although I used it for other purposes unrelated to business. In fact, I wrote a simple IOCS (input/output control system) in COBOL, where all the functions of data and database access could be performed through calls to my COBOL IOCS system. I would liken my IOCS to what ODBC can do currently for different databases. COBOL compilers have been modernized, but the "verbosity" sin that you hate so much is still the biggest selling point for the language. That's another indisputable fact. Guess you could say I'm a "factual" type of guy.
;)
> I don't hate verbosity. I dislike excessive verbosity, which is the core of COBOL's design.
Believe it or not, COBOL itself, as a language, did not "impose" verbosity on the coder or the application. It was expected of the programmer to make his code readable, understandable, and easy to maintain. After all, when somebody works for a company, it's expected that whoever wrote the original code won't be the same person who will forevermore be responsible for maintaining and enhancing that code or program. That's where the value of COBOL came in, but still, it was more about the expectation of verbosity than a built-in requirement in the language for verbosity. You could write COMPUTE C = A + B, and it would be valid COBOL, but you'd pretty soon be out of a job in a COBOL shop. So the "sins" you attribute to COBOL were not of COBOL's making; they came from the mantra that COBOL code should be written as if it were written in English. Verbosity is not, again, something that's in the language, and COBOL is no different from any other language in that you could create cryptic statements that look like those in the "modern" languages. I could write COMPUTE C = C + B, with meaningless one-letter names, and it would be valid, but it would violate the "expectations" of readability and simplicity, where even a non-IT person could understand the code. Are you still not understanding the reasons why COBOL was "invented"? For now, it's still the better language for ease of use, for ease of coding, for ease of understanding, for ease of maintenance, for ease of learning, and for its self-documenting characteristics.
> So you say.
Being dismissive is not a way to refute an argument. The fact is that there is still no other language that can match COBOL in the characteristics which I mentioned above. If you can prove otherwise, make the argument, but dismissing the points is not a way to prove your side of it.
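The claim above, that readable naming is a shop convention rather than something the compiler enforces, holds in most languages, not just COBOL. A minimal Python sketch (all names hypothetical, chosen only for illustration) of the same calculation written both ways:

```python
# Two versions of the same computation. Both are valid; only the
# descriptive one documents itself. All names here are hypothetical.

# Cryptic: legal, but the reader must hunt elsewhere for meaning.
def f(a, b, c):
    return a + b - c

# Descriptive: reads like the business rule it implements.
def net_accounts_receivable(invoiced_total, late_fees, payments_received):
    return invoiced_total + late_fees - payments_received

# Both produce the same result; only readability differs.
print(f(100, 5, 40))                        # 65
print(net_accounts_receivable(100, 5, 40))  # 65
```

No interpreter or compiler rejects the first version; only a code review does.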
You're the one who threw out the issue about self-defining variable names, and I was merely pointing out how COBOL had effectively answered that challenge a long time ago, and way before you had even heard of computers. That is another undeniable fact.
> You seem to have missed half the article, which talked about stuff like avoiding excessive verbosity.
No, dude, I read your whole article, and I even wrote about that point of yours in some of my other posts. Now, when it comes to making code easy to understand, there is no such thing as verbosity. COBOL doesn't care if you use long or short names, as long as your variables and other language elements are easy to understand without having to look through other documentation, or backtrack in the code base, to find the meaning of a named instance of an element. People make the accusation of "verbosity" in order to justify using a different or personally preferred language. I saw that a lot when C was becoming a popular language, and the complaint I heard most often to justify moving from COBOL to C was that "COBOL was too verbose". I asked some of those programmers to show me what their code would look like if they were to write a sample chunk of code that I had written in COBOL, and they invariably came up with much shorter variable names plus one or two more statements for something that took me one line of COBOL code. And I even took their variable names and made my line of code look a lot shorter than their C code. It's a matter of preferences, and, like I said, it's not COBOL that insists on verbosity; it's the expectation in a COBOL shop to create code that reads like English.
> It is not the case that if six words are better than one, six hundred are necessarily better still.
If it takes six words to make matters clearer and easier in the long run than the use of one, then I'll always choose the clearer and easier path.
I don't care how short, or how quickly, I could have written the original code. You have to remember that a program, or a chunk of code, is not going to remain static for the duration of an application. If that were the case, as soon as a program were written, the programmer would become very expendable.
> Why not have just accepted the fact that, perhaps COBOL did have its advantages
Furthermore, whatever advantages it had in the past are still true today. ;)
> Oh, sure, it had some advantages at one time.
I'll have to say it again: whatever advantages it had in the past are still true today. ;)
> It has been superseded, however. Why can't you just admit to yourself that things change?
Things change, but are they always for the better? Why is it that, when it comes to programming languages, people are always touting the benefits of the "newest kid on the block" language, and not bothering to mention that perhaps the "newest kid on the block" wasn't even needed? Why is Python or Ruby or Java or C++ or C# needed? What problem are they trying to fix? And, if any of the older languages had a problem that could be overcome by the "new kid on the block" language, why couldn't those languages have been modified to "fix" the problem? COBOL has been modified to fix whatever was broken with it, and so have other languages, so why do we need to create new solutions when the older solutions can do the job just fine? Like I keep saying, you need to start thinking at a higher level, or you'll remain a "newbie" all your life. Also, remember, there are people out here who might be a lot better informed than you are. That should be the truest statement I've made here so far, and readily apparent. You don't measure up against most people who have been doing programming for at least a year.
> I don't really know for sure whether you're better or worse informed than me,
As far as I'm concerned, you are not even close to the knowledge base in my background.
But hey, you could get there; with your way of thinking, though, it would take at least 20 years or more before you could get anywhere close to my level.
> but you clearly are not better at thinking things through
You don't have any idea what it means to think things through. You make simple-minded and short-sighted statements, and, just because "you wrote the article", you believe that you are more authoritative than whoever might happen upon your article. That's foolish and incredibly ignorant, and with that kind of thinking, I wouldn't expect you to know what "thinking things through" means.
> and some of the people better informed than me are right here in this discussion thread; they're obviously much better informed than you.
You keep trying to defer to the "superior" experience of others in order to try to make your argument. That would be an argument similar to what a presidential candidate with no experience would make when trying to get elected, the argument being that he doesn't need to have experience himself, because he could hire people with the real experience who could handle matters for him once he delegates government functions to those assistants. Stop being so foolish and stand on your own. If you can't defend your own points, then don't even make them. Any time you defer to others with "superior knowledge or superior experience", you sound very insecure. Look, one thing I learned while in the Marines is that you never let the enemy see or feel your fear. Don't let your adversaries know how weak you really are, or they will destroy you by using your own weaknesses against you. And, believe me, you really have exposed yourself as being very weak. Hopefully, for your own sake, you'll learn. I can learn from you guys, and you guys can learn from us. That still remains true. I could be in the field 200 or 500 years, but I'll be the first to admit that I don't know it all and I could learn from others. You could try the same.
> I'll be happy to learn from you the moment you say something worth learning.
I could be a guru in all the programming languages in the world, with heavy experience in all of them, but when somebody is as hard-headed as you seem to be, of course you would say something as stupid as "there is nothing worth learning from you". That's actually pretty asinine.
> "Get off my damned lawn, I fertilized it with COBOL!" isn't worth learning.
Actually, I would prefer: "hey, you, ignorant TechRepublic blogger, get off your high horse and off Al Gore's internet, and learn a few tricks before you start lecturing others about things you know nothing about". Sometimes it's better to keep your mouth shut and be thought a fool than to open it and remove all doubt. ;) Look, it's always better to be well-informed, and to have done research on a topic, before you try to lecture others on the matter. BTW, for your education (from Wikipedia): http://en.wikipedia.org/wiki/Cobol

Lack of structurability: In his 1975 letter to an editor titled "How do we tell truths that might hurt?", which was critical of several programming languages contemporaneous with COBOL, computer scientist and Turing Award recipient Edsger Dijkstra remarked that "The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense."[7] In his dissenting response to Dijkstra's article and the above "offensive statement", computer scientist Howard E. Tompkins defended structured COBOL: "COBOL programs with convoluted control flow indeed tend to 'cripple the mind'", but this was because "there are too many such business application programs written by programmers that have never had the benefit of structured COBOL taught well...".[8] Additionally, the introduction of OO-COBOL has added support for object-oriented code as well as user-defined functions and user-defined data types to COBOL's repertoire.
Verbose syntax: COBOL 85 was not fully compatible with earlier versions, resulting in the "cesarean birth of COBOL 85". Joseph T. Brophy, CIO of Travelers Insurance, spearheaded an effort to inform COBOL users of the heavy reprogramming costs of implementing the new standard. As a result, the ANSI COBOL Committee received more than 3,200 letters from the public, mostly negative, requiring the committee to make changes. On the other hand, conversion to COBOL 85 was thought to increase productivity in future years, thus justifying the conversion costs.[9] COBOL syntax has often been criticized for its verbosity. However, proponents are quick to note that this was an intentional part of the language design and is considered by many to be one of COBOL's strengths. One of the design goals of COBOL was for its code to be readable and understandable to non-programmers such as managers, supervisors, and users. This is why COBOL has a very English-like syntax and structural elements, including nouns, verbs, clauses, sentences, sections, and divisions. Consequently, COBOL is considered by at least one source to be "the most readable, understandable and self-documenting programming language in use today. [...] Not only does this readability generally assist the maintenance process but the older a program gets the more valuable this readability becomes."[10] On the other hand, the mere ability to read and understand a few lines of COBOL code does not grant to an executive or end user the experience and knowledge needed to design, build, and maintain large software systems.[citation needed]

Other defenses: Additionally, traditional COBOL is a simple language with a limited scope of function (with no pointers, no user-defined types, and no user-defined functions), encouraging a straightforward coding style.
This has made it well-suited to its primary domain of business computing, where the program complexity lies in the business rules that need to be encoded rather than in sophisticated algorithms or data structures. And because the standard does not belong to any particular vendor, programs written in COBOL are highly portable. The language can be used on a wide variety of hardware platforms and operating systems. And its rigid hierarchical structure restricts the definition of external references to the Environment Division, which simplifies platform changes.[10]

adornoe

The problem being that the "author" of the article probably bit off more than he could chew. It's not the first time, nor the last, that somebody delved into a topic for which he's not entirely prepared. The author may have had a good point or two to make, but being so defensive and offensive toward the opinions of others can lead people to believe that the author was merely trying to create fodder for a discussion, and wasn't serious about the treatment of the points he was probably attempting to make.

pgit

My bad, no doubt; a bad attempt at bad humor. I found the original comment to that effect quite humorous in an odd way. I had to assume the fellow knew you wrote the article, so there had to be some kind of cosmic loop-disconnect or other metaphysical intervention in his mind in order for authorship and comprehension to be divorced. Semantics, I guess. It's one thing to tell an author "you don't know what you're talking about", or point out where you believe he is mistaken, but to tell an author "you obviously didn't understand a word" of his own article... mind bending. I admit my sense of humor is obtuse, and often unappreciated amidst energetic debate. For instance, I bust a gut thinking about some of the alternative names I've come up with every time I pass a local grocery store named "P&C" ("dump and ogle... s#!t and stare... urinate and cogitate... leak and peek..."). Sorry. I couldn't down-vote myself.

apotheon

I understood every word. You appear to have ignored every word except four or five of them where I said that self-documenting code is good, though.

apotheon

> Look, COBOL may be a very old language, but, for its intended purposes, it was never broken, and, even today, it can perform as originally intended, but with many improvements, such as OOP and with support for the latest in technology.

Age has nothing to do with it. Bad design has everything to do with it.

> I was able to design and code entire applications using COBOL a lot faster than using any of those other languages.

. . . and yet, you're willing to claim COBOL is better than newer languages that allow people to more quickly develop software, and more easily maintain it, because COBOL enforces an order of magnitude more verbosity.

> when it comes to pure business applications, COBOL was my "go to" language.

There was a time when COBOL was the language to use for such purposes. That time is past -- because its design did not provide for an ability to keep up with the changing needs of business. The only place where COBOL is appropriate is where it is already in place, debugged within an inch of its life, and running stably. When you need to write something new, write it in a better designed language.

> COBOL was always the easiest language to work with.

Sure, fifty years ago. "Was" is the operative word here. The end of its usability was built into its DNA.

> I never met a business application that couldn't be written in COBOL.

So what? It's Turing complete. By definition, that means it can be used to write a business application. The same could be said for BASIC or VBScript. That doesn't mean it's a good idea.

> Programming has become a lot more expensive and a lot more messy in today's IT departments.

This has nothing to do with whether people are using COBOL.

> If it's not one thing, it's gotta be the other for you. You can't lose.

You're the guy who decided to say something is both concise and verbose. Maybe you understand each of them in isolation, but are just incapable of understanding the conflicting meanings when juxtaposed. I don't know what you're failing to understand but, unless you're just trolling, you clearly don't understand something when you say "Everything is clear and concise, even if verbose." I just said to my girlfriend "If someone said 'Everything is clear and concise, even if verbose,' what would you think?" She said "I'd think he doesn't know what he's talking about." I didn't tell her anything about the context, your worshipful devotion to the false idol of COBOL, or your overweening arrogance ("I, for certain, understand what programming is all about"? Really?). I just read your words to her, out of context, and asked what she thought. She's pretty sure whoever said that is full of it. Gee -- I wonder why.

> Design and programming need to be done with long-term expectations that, staff will change, and that enhancements will be needed, and that things can go wrong. When a program or system is well-documented, and when the programs themselves are written in "self-documenting" fashion, the problems in the future will not seem so daunting.

I agree with every word of that. I do not agree that COBOL is the answer -- with its cluttered syntax, overloaded non-alphabetical character usage, and frankly doltish semantics. It is a horrid language, in large part because of the impositions it makes on the developer's ability to organize the structure of a program cleanly and understandably in composable units.

> Then, you tackled a problem for which you are not well-equipped.

It's interesting that you've arrived at this opinion by virtue of the fact that I dislike COBOL. Come back when you can construct an argument. It's also kind of amusing that you're telling me I'm not able to defend my points when you said that the article's author didn't understand it.

> First, I didn't understand "concise", then I didn't understand "verbose", and now, I don't understand "diminishing returns".

What do you call a false dichotomy when it consists of three parts, and not two? You must fail to understand either "concise" or "verbose", because they are conflicting terms. You probably also fail to understand how there may be a point of diminishing returns when discussing adding more verbosity until your computer starts creaking under the strain of the sheer size of your source files. Look, I'm basically suggesting that you don't understand conciseness or verbosity because it's more polite than the alternative -- assuming that you're a liar and a troll, engaging in petty acts of conscious evil. The same goes for my suggestion that you aren't familiar with the notion of diminishing returns, because if you are familiar with it, you're either too stupid to grasp the fact that heaping on too much verbosity can make it much more difficult to understand the entire program because you're getting lost in the paragraph-length details, or just engaged in malicious attempts to screw with other people. Take it as a compliment when I try to give you the easy way out.

> Look, when you can't defend your points from a strictly knowledgeable standpoint, it's better to admit that perhaps there are others with a lot more experience than you have, and you could perhaps learn something from them.

I'd like to introduce you to a couple of people who I know for a fact are more knowledgeable than me about many aspects of the skillset we call "programming". Scan this thread for the usernames Sterling "Chip" Camden and Justin James. There are things I know that neither of them knows, but if I was going to hire someone as a programmer out of the two of them and me, all else being equal, I would pick one of them over me every time. Every time, no hesitation. There are cases where I'd choose me over them, but they are few and far between. You do not measure up. You are some bloviating windbag full of his own "expertise" whose entire argument boils down to "Verbose is better," "I'm older than you so I'm better," and "I don't know what the hell I'm talking about, but I know COBOL syntax backwards and forwards." I should probably add Tony Hopkinson to the list of developers better than me (in the skills sense, at least) along with Camden and James, but I don't know enough about him to be certain of that. Now, here's some homework for you, crusty old padawan: ask Sterling and Justin what they think of COBOL. Just let me know in advance. I plan to wear earplugs to protect my virgin ears from the invective these respected, skilled, professional developers might share, if they feel the spirit move them when discussing the subject of COBOL. Y'know, the worst thing about Java is that it's the "new" COBOL.

> Being a blogger or the author of the article, does not mean that you know more about the subject matter than those that read or might opine on the subject.

I never said otherwise. This is a straw man.

> I've already stated that I don't do COBOL anymore, and I've also stated that I have learned and used many other languages, including some of the more modern ones, such as C# and Java (& Javascript).

So . . . you have a strong history of betting on blub languages (and JavaScript).

> Just because the years change, it doesn't mean that, things have to be done differently.

I never said it does mean that. On the other hand, the fact that "old" is not necessarily "bad" does not mean that "old" is necessarily "good", either.

> If COBOL, or even Visual Basic, can perform better or as well as the other "current" languages, then why not use them?

If they could, I would.

> COBOL remains the language with the most lines of code worldwide for the most applications. You need to start facing the reality that, it's not going away any time soon, if ever.

Oh, I'm completely aware that, as things currently stand, COBOL is not going away. Popularity does not equate to quality, though, and I was just surprised to discover that someone able to operate a modern computer with an Internet connection and a Web browser capable of rendering TechRepublic would still think COBOL is a good thing.

> You yourself might get some benefit from understanding what COBOL is about before carrying on and attacking that which you're unfamiliar with.

You make unwarranted presumptions (beyond that incredibly stupid gaffe about me not understanding the article). Would it surprise you to learn that I actually learned a little COBOL years ago, still have a COBOL book (I think so, anyway -- I haven't seen it in months, maybe a few years), and have written some COBOL code -- or that my significant other took COBOL classes in college and wrote a lot of COBOL code (and, quite sanely, decided it sucked and moved on)? Of course it would -- because you cannot conceive of someone disagreeing with your worshipful regard for the King of Programming Languages because of actual, you know, reasons.

> I'm sure the COBOL world will feel a lot better now that you've been gracious enough to admit that it might be "necessary".

Of course it's sometimes necessary -- when maintaining all that legacy code, or in the very rare case where it needs to be translated to another language (usually, it should be mostly left alone until the entire system is replaced, hardware included, because the software is so thoroughly real-world tested and debugged that its stability cannot be matched by a bunch of Java blub programmers trying to phase out the COBOL blub programmers).

> Languages come and go, but COBOL is still ticking, after more than 60 years. And, it's still very widely used. No other "modern" language will ever match that record.

Legacy code and legacy hardware are why it's "still ticking", and the reason modern programming languages won't exceed its tenure is that the world is changing at an ever-accelerating pace, which means that new languages supplant the old when it is discovered that the old designs are not sufficiently good to support the new work that needs to be done, before they can build up the same quantity of legacy code in the world. It's not because COBOL is the bestest programming language EVAR. If there's an exception to the idea that no other current language will exceed COBOL's tenure, it'll almost certainly be a flavor of LISP (which is older than COBOL and, given Greenspun's Tenth Rule, may actually have worked its way into more code around the world than COBOL -- in fact, a lot of COBOL may contain ad-hoc, informally specified, buggy implementations of half of Common Lisp).

> COBOL compilers have been modernized, but the "verbosity" sin that you hate so much is still the biggest selling point for the language.

I don't hate verbosity. I dislike excessive verbosity, which is the core of COBOL's design.

> But, for now, it's still the better language for ease of use, for ease of coding, for ease of understanding, for ease of maintenance, for ease of learning, and for its self-documenting characteristics.

So you say.

> You're the one that threw out the issue about self-defining variable names, and I was just merely pointing out how COBOL had effectively answered that challenge a long time ago, and way before you had even heard of computers.

You seem to have missed half the article, which talked about stuff like avoiding excessive verbosity. It is not the case that if six words are better than one, six hundred are necessarily better still.

> Why not have just accepted the fact that, perhaps COBOL did have its advantages

Oh, sure, it had some advantages at one time. It has been superseded, however. Why can't you just admit to yourself that things change?

> Also, remember, there are people out here, who might be a lot better informed than you are.

I don't really know for sure whether you're better or worse informed than me, but you clearly are not better at thinking things through -- and some of the people better informed than me are right here in this discussion thread; they're obviously much better informed than you.

> I can learn from you guys, and you guys can learn from us.

I'll be happy to learn from you the moment you say something worth learning. "Get off my damned lawn, I fertilized it with COBOL!" isn't worth learning.
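The "diminishing returns" point being argued above can be made concrete. This is a minimal Python sketch (the variable names and dollar figure are invented for illustration, not taken from either commenter) contrasting a too-terse name, a descriptive name, and a name that has gone well past the point of diminishing returns:

```python
# Three namings for the same quantity; names and values are hypothetical.

# Too terse: the reader has to guess what "ar" stands for.
ar = 1500.00

# Descriptive: the name documents itself without a comment.
accounts_receivable_total = 1500.00

# Past the point of diminishing returns: the name restates a whole
# sentence, and every expression that uses it becomes harder to scan.
total_of_all_outstanding_accounts_receivable_balances_in_dollars = 1500.00

# Context matters, too: a short name is fine when its scope is one line.
squares = [n * n for n in range(5)]  # "n" is perfectly clear here
print(squares)
```

The middle option is usually the sweet spot: long enough to remove guesswork, short enough that the expressions using it still read cleanly.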

pgit

Well, we now enter the realm of metaphysics... Is it possible you were channeling someone else's knowledge, like some Delphic Oracle? Or were you speaking (writing) in tongues? I'm just trying to come to grips with how you could have written the article yet not comprehended it. I thought only congressional lobbyists were capable of such tasks.

adornoe

> COBOL? Really?

Yeah! COBOL. Really!!! Look, COBOL may be a very old language, but, for its intended purposes, it was never broken, and, even today, it can perform as originally intended, but with many improvements, such as OOP and with support for the latest in technology.

> You are the first person I've ever met who does not recognize COBOL as the horrid blight that it is.

As a person that learned and used many different languages, including the "lower level" and more cryptic ones, such as C, and ALGOL, and Java, and Fortran, and NEAT (assembler for NCR computers), and COMPASS (assembler for Control Data computers), and TAL (Tandem's assembler-like Transaction Application Language), and even Assembler (for IBM computers), I can attest to the fact that I was able to design and code entire applications using COBOL a lot faster than using any of those other languages. Those other languages have their advantages, depending on the application or system, but, when it comes to pure business applications, COBOL was my "go to" language. And, when it came to maintenance, "my" COBOL programs were always easy to debug and enhance and maintain. When it comes to "blight", you apparently haven't had to manage or direct an application or a programming department. When it comes to simplicity for debugging and enhancements and maintenance, COBOL was always the easiest language to work with. The blight exists in the present day, where there are so many different languages, and not too many work on any one of them for a long time, and what those people leave behind as far as code is concerned leads to a myriad of problems. Programming has become a lot more expensive and a lot more messy in today's IT departments.

> Maybe you do understand "concise",

Don't be afraid to say "for certain". I for sure understand what concise means. And I, for certain, understand what programming is all about, and I, for certain, understand the benefits and limitations of COBOL. I never met a business application that couldn't be written in COBOL. Even the on-line applications were able to be written using COBOL, and I was involved in some of those, and no doubt you, at one time or another, when you did any kind of banking or credit/debit card transaction, including using ATMs, were using some of the code which I wrote using COBOL. In fact, back in the early 1980s, a bank I worked at was the first one to design and implement a home banking system, with COBOL as the main language.

> then, but if so you probably don't understand "verbose".

If it's not one thing, it's gotta be the other for you. You can't lose. Look, I'm being verbose right now, in trying to explain the reasons why COBOL performed so well, and the reasons that it still has a use in the present day. Verbosity is for clarity and for ease of debugging and for ease of maintenance and for ease of enhancements, and for ease of understanding across a whole IT department. Being able to write a statement quicker, with a more "cryptic" language, has its advantages, but the disadvantages come later, when those programs have to be maintained. The frame of mind that a person has while creating his "masterpiece" of a program is not the same as he'll have when he has to revisit that code, and, for certain, that is not going to be the same frame of mind that another coder will have when it's somebody else that has to do the maintenance of that program or system. Design and programming need to be done with long-term expectations that staff will change, and that enhancements will be needed, and that things can go wrong. When a program or system is well-documented, and when the programs themselves are written in "self-documenting" fashion, the problems in the future will not seem so daunting.

> You are the one not understanding the article
> What?! I wrote the article.

Then, you tackled a problem for which you are not well-equipped. It's not the first time that a person tackles an issue and then finds that he's not well-qualified to defend his points.

> A coder's life, and that of those that come after, are made a lot easier when the guess work is removed from the variable names.
> You seem unfamiliar with the concept of a point of diminishing returns.

You're all over the place, aren't you? First, I didn't understand "concise", then I didn't understand "verbose", and now, I don't understand "diminishing returns". Look, when you can't defend your points from a strictly knowledgeable standpoint, it's better to admit that perhaps there are others with a lot more experience than you have, and you could perhaps learn something from them. I'm not afraid to admit that I don't know it all, and in fact, I'm open to learning from others who do know certain matters that I'm not familiar with, and oftentimes, I've even learned from those with less experience than I have. Being a blogger or the author of the article does not mean that you know more about the subject matter than those that read or might opine on the subject.

> I have no doubt that I have a lot more experience in the IT field than you have

Yeah, that's true, and I'm admitting that I'm older than most in this forum with that statement.

> It looks like all of your experience must have involved repeating the year 1970 over and over again.

So, if you can't beat them, insult them? Is that your technique for winning an argument? I've already stated that I don't do COBOL anymore, and I've also stated that I have learned and used many other languages, including some of the more modern ones, such as C# and Java (& Javascript). Also, you need to use your head a bit more. Just because the years change, it doesn't mean that things have to be done differently. If something works, and does so efficiently and inexpensively, then why change it? If COBOL, or even Visual Basic, can perform better or as well as the other "current" languages, then why not use them? COBOL remains the language with the most lines of code worldwide for the most applications. You need to start facing the reality that it's not going away any time soon, if ever.

> COBOL is the blub language -- worse than VB that way.

Look, quit while you're behind, or you'll get further behind. That statement of yours above is so silly, and has no meaning, and is insignificant in the real world, where there exists a lot more lines of COBOL code than for any other language. Reality is a bitch that can't be overcome.

> I have no problem with people knowing COBOL,

You yourself might get some benefit from understanding what COBOL is about before carrying on and attacking that which you're unfamiliar with.

> or even using it when necessary,

I'm sure the COBOL world will feel a lot better now that you've been gracious enough to admit that it might be "necessary". ;)

> but acting like it somehow represents the pinnacle of programming language design

Languages come and go, but COBOL is still ticking, after more than 60 years. And, it's still very widely used. No other "modern" language will ever match that record. Somebody is always dreaming up new ways of doing the same thing. COBOL compilers have been modernized, but the "verbosity" sin that you hate so much is still the biggest selling point for the language. If C# or any other language could be as easy to understand and code for and maintain as COBOL, I would never need to defend COBOL. But, for now, it's still the better language for ease of use, for ease of coding, for ease of understanding, for ease of maintenance, for ease of learning, and for its self-documenting characteristics.

> is a sign of either insanity or terrible ignorance (or perhaps trolling).

It seems to me that you're the one launching your insane attacks against the points I'm making, and you're the one demonstrating ignorance about why someone would prefer COBOL for its self-documenting characteristics. You're the one that threw out the issue about self-defining variable names, and I was just merely pointing out how COBOL had effectively answered that challenge a long time ago, and way before you had even heard of computers. Being so defensive, and so offensive with your attacks, is a way of demonstrating that you're losing the argument. Why not have just accepted the fact that perhaps COBOL did have its advantages, and perhaps some of those advantages needed to be adopted by the more current crop of "modern" languages, even while keeping the constructs of those languages. Also, remember, there are people out here who might be a lot better informed than you are. It's a two way street; I can learn from you guys, and you guys can learn from us.

apotheon

You are the first person I've ever met who does not recognize COBOL as the horrid blight that it is. Maybe you do understand "concise", then, but if so you probably don't understand "verbose".

> You are the one not understanding the article

What?! I wrote the article.

> A coder's life, and that of those that come after, are made a lot easier when the guess work is removed from the variable names.

You seem unfamiliar with the concept of a point of diminishing returns.

> I have no doubt that I have a lot more experience in the IT field than you have

It looks like all of your experience must have involved repeating the year 1970 over and over again. COBOL is the blub language -- worse than VB that way. I have no problem with people knowing COBOL, or even using it when necessary, but acting like it somehow represents the pinnacle of programming language design is a sign of either insanity or terrible ignorance (or perhaps trolling).

apotheon

That's about the last reason anyone should care about it. The biggest problem is easily the simple fact that it wastes computing resources that could be put to better use. Use a lighter-weight text editor, and if you really, really feel a burning need to eat up the rest of those clock cycles you aren't using, devote them to folding@home. Pollution? Bah. The biggest polluters in the world are the regulation-exempt governments trying to regulate our pollution output. Anyone who gives a crap about pollution is either focusing on getting government sized down or a damned idiot.

Jaqui

The excessive use of the "hardware is cheap" development model is causing more and more hardware requirements for simple things like editing a text file. And more hardware does consume more energy, which creates more pollution. So it is a development model that needs to be gotten rid of.