Software Development

What are your top five programming languages?


Here's a little exercise I'd like all of you who do any programming to try. First, list your top five programming languages, in terms of your expertise (how well you know the language, not necessarily how long you've used it). Here are mine:

  1. Synergy/DE
  2. C/C++
  3. Ruby
  4. C#
  5. PHP

Next, list your top five languages, in terms of how much you've used each one over the last year. Mine again:

  1. C/C++
  2. Synergy/DE
  3. PHP
  4. Delphi
  5. C#

Finally, list your top five languages, in terms of what you'd like to be using more (regardless of actual available opportunities). Mine are:

  1. Ruby
  2. ECMAScript
  3. Lisp (any flavor)
  4. Haskell
  5. Synergy/DE (version 9)

As you can see, there's a bit of a disconnect between the languages I'm using most of the time and those I'd like to be using. This represents a gradual shift in the industry, I think. We're now in the post-Java era, when it's assumed that any 2-bit language will provide at least some support for objects, automatic resource recovery, and safely wrapped pointers. Languages that lead the pack in this era are multi-paradigmatic: they make object-orientation a matter of pragmatism rather than orthodoxy, and they enable a functional programming style to some degree. The very best also facilitate dynamic programming -- which, in a nutshell, means any equivalent of eval.
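
For anyone who hasn't played with that style, here's a minimal Ruby sketch of what I mean by "any equivalent of eval" -- code assembled and evaluated at runtime, and methods defined on the fly. The names are invented purely for illustration:

    greeting = "Hello"
    # Build code as a string, then evaluate it in the current binding.
    code = %q{ "#{greeting}, dynamic world!".upcase }
    puts eval(code, binding)                  # => HELLO, DYNAMIC WORLD!

    # define_method is the same idea in another guise: behavior defined
    # at runtime instead of being spelled out ahead of time.
    class Report
      def initialize(title, author)
        @title, @author = title, author
      end
      # Reader methods generated at runtime rather than written by hand.
      [:title, :author].each do |field|
        define_method(field) { instance_variable_get("@#{field}") }
      end
    end

    puts Report.new("Top five languages", "Chip").title   # => Top five languages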

Being the old guy that I am, I can remember the era before Java (in fact it's the majority of my career!), which I'll call the post-Pascal era. At that time, any decent programming language provided facilities for structured programming. Object-orientation was still largely the stomping ground of theorists rather than application developers. BEGIN-END and CASE statements were all the rage, not to mention good looping constructs. External functions were used sparingly for utility routines. Although many programmers realized that a more radical use of functions might revolutionize programming, available memory, stack space, or plain old institutional lag usually prevented such ideas from being implemented.

I can't help but wonder what will come next. Lisp has certainly demonstrated the staying power of functional programming, but might there not be an even more excellent paradigm in the future? Only time will tell.

About

Chip Camden has been programming since 1978, and he's still not done. An independent consultant since 1991, Chip specializes in software development tools, languages, and migration to new technology. Besides writing for TechRepublic's IT Consultant b...

171 comments
drednot57

1. Python 2. C++ 3. C 4. Basic 5. SQL. Cut my teeth on Basic, moved to C, then C++, and now favor Python. Use SQL when necessary.

jslarochelle

Used most: 1) Java 2) C/C++ 3) Ruby 4) Delphi 5) CMD.EXE batch file scripts. What I would like to use: 1) Groovy 2) Java 3) JRuby/Ruby 4) Delphi 5) C/C++. Java will probably remain near the top of my list simply because of annotations and its incredible ecosystem. Groovy integrates best with Java and will support annotations (1.1 does), so this is why it ends up first on my wish list. JRuby has the best IDE support so far in Netbeans and integrates fairly well with Java, so I will probably use it. I'm already working on a project for an automated testing tool based on JRuby. I was just fooling around with Netbeans and I ended up with part of the application being written, so I might as well keep it. JS

Absolutely

Expertise: Java, SQL, html/xml, C#, Ruby. Use in the past year: C#, SQL, html/xml, Java, Ruby. What I'd like to be doing: Perl, C/C++, Ruby, SQL, OCaml. I'm fairly happy with how these line up, although I'd like to learn Perl, based on the little bit I've seen of it.

techrepublic

Perl 6 and ECMAScript will eat them all. Really.

Jaqui

1) C/C++ 2) PHP 3) SQL 4) Perl 5) XML/XSLT. In the last year I've only used PHP and SQL. In the future, I want to be using: 1) C 2) Assembler. I have something I'm putting together for a personal project, long term, that will meet my interests for the future.

apotheon

If by ECMAScript you mean "ECMAScript 4", I might agree [i]to some extent[/i], but not entirely. Ruby 2.0 will be excellent, some older languages (like Smalltalk) are already excellent, and a bunch of functional languages kick serious butt but get marginalized because of the general popular attachment to imperative languages. On top of that, Perl 5 and ECMAScript 4 are unlikely to supplant languages like C, OCaml, Objective C, C++, Haskell, Pascal, and so on, where lower-level development and incredibly tight performance needs apply. I am looking forward to Perl 6 and ECMAScript 4, though.

Tony Hopkinson

I don't count SQL and PHP as languages. I've got more PHP than C++, and SQL is probably equal top with it in terms of use.

psiegmun

How come COBOL isn't listed?

apotheon

"[i]I don't count SQL and PHP as languages.[/i]" They're languages. SQL is a query language. PHP is an overblown templating language. I hesitate to call them [b]programming[/b] language, of course -- especially SQL -- and, even when I admit to PHP fitting some definition of "programming language" (such as "turing complete"), it's only as a [b]bad[/b] programming language. edit: matched parentheses

Jaqui

SQL is a language, but it's not really a programming one. PHP is a scripting language, like Perl, Java, Python ... and is often forgotten in a listing of languages for programming. Most people would also question XML/XSLT. :D

Sterling chip Camden

... that this thread didn't bring the flaming Pythonistas out of the woodwork. Oh well. Hey, thanks to everyone who contributed to this conversation -- most comments yet on any post on this blog!

jslarochelle

I was tempted to mention the Java IDE "factor" but I decided to leave this out of the equation because I did not want to get into the "IDEs are for sissies" argument. However, I think you are correct about this. Eclipse, for example, handles most of the scaffolding most of the time. Of course, once this code is generated it has to be maintained, so the "IDE factor" is also not a free lunch. A couple of comments about the state of JRuby and Groovy: 1) Groovy 1.x and JRuby 1.1rc1 both compile to byte code (.class), so both will be able to take advantage of JVM "just in time" optimization. Of course, because of the dynamic aspects of both languages, not everything can be compiled ahead of time (eval, for example). 2) Netbeans 6.0 is a really nice IDE for Ruby. I use it for my JRuby project and I am quite impressed by it. It is also a nice demonstration of the improvements to Swing in Java 1.6 (the GUI is quite responsive compared to past versions of Swing). Both the JRuby and Groovy teams have done an excellent job. In the past, the performance of Groovy has always been among the best for JVM-based interpreters. I expect the compiled version to be even better. The current release candidate for JRuby has improved a lot and I expect the final 1.1 release to actually outperform the Ruby C interpreter for some tasks. JS
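
To make the JRuby integration point a bit more concrete, here is a tiny sketch of the kind of thing described above -- a plain Java collection driven from Ruby code. It assumes a JRuby interpreter, and the strings are invented for illustration:

    # Runs under JRuby: Java classes become directly callable from Ruby.
    require 'java'

    list = java.util.ArrayList.new
    list.add("scripted from Ruby")
    puts list.get(0)        # => scripted from Ruby
    puts list.size          # => 1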

apotheon

"[i]What I am saying is that there is no silver bullet.[/i]" That's not what you were saying before -- even if somewhere in the back of your mind that's what you [b]meant[/b] to say. What you were saying is that Java's scaffolding and noun addiction made it readable, and the lack thereof in more concise, expressive languages made them effectively unreadable by comparison. I'll buy the notion that this is not what you meant, though. . . . and I agree: there's no silver bullet. Ultimately, I think Java's greatest contribution will be the JVM, which will be used as a platform for languages like JRuby, Jython, Rhino, Groovy, and even some kind of Perl 6 port at some point (most likely), along with a slew of others. It's entirely possible that Groovy will be the big winner on the JVM platform, since it combines some of the positives of Ruby with some of the positives of Java, and throws away a lot of the unnecessary rules-lawyering that is baked into both the Java community and the Java language at this point. "[i]I do not think that languages like Ruby (I didn't have time to look at OCaml) are a free lunch.[/i]" Oh, they're not, of course. As things currently stand, Ruby is much slower than Java for long-running processes, for instance -- and because of the incredibly dynamic nature of the language (duck typing and all), it probably can't benefit nearly as much from an optimizing runtime the way Java can. That's one positive benefit of static type systems, at least: certain types of optimization are much easier to have you tools do automatically for you. "[i]I think that if for a project you get x% bug prevention or detection at 'build time' in Java after N(j) hours of work you will have to spend a similar amount of time N(r) doing the same work in Ruby and N(r) will be very close to N(j). The way you do this will be different but the amount of time spend will be similar I think.[/i]" I think that writing good test suites in Ruby buys you more than static type systems in Java, in terms of ensuring relatively bug-free code. I also think that the greater conciseness of code in Ruby contributes to its readability while still providing a syntactic form similar enough in its superficialities to Java that Java programmers will not find it entirely foreign, and can adapt quickly. Language-to-language, Ruby is simply more readable than Java, assuming similar levels of familiarity. On the other hand, the [b]real[/b] benefit to static typing and extra scaffolding in Java for readability is not in the staticness and verboseness itself (which mostly just puts a bunch of ugly substructure between the programmer and the real code for readability purposes) -- it's in the fact that these language characteristics make it easier to write IDEs that work with Java. As a result, there are a lot of great IDEs out there for the language, and not quite so much for Ruby. Ruby isn't a language that needs an IDE -- but when you factor in what IDEs can do for you, even if the Java doesn't really get more readable, it probably becomes about as manageable as the Ruby code. By the way, part of the reason I think Java can close the gap is that it's possible to write good test suites in Java, too. Of course, most Java coders [b]don't[/b], so in practice Java probably fares worse on average than Ruby -- but that's not the language's fault. "[i]I did not find the examples that you have given me to illustrate Java's verbosity very convincing. 
I have already explained why I do not think that the public static main 'Hello world' example is a convincing argument (it is impressive and this is why you see this example everywhere when the advantages of Ruby -- or a similar language -- are discussed) so I will not explain that again.[/i]" Actually, the reason you see it everywhere is not because it's "impressive". What would be more impressive is an actual comparison of big software projects, side-by-side, complete with code line counts, invested development time, relative bugginess, and so on. Ruby would probably win on all counts, since while the Java code was being finished in its first complete iteration the Ruby guys could hunt down hard-to-find bugs, produce greater test suite coverage, and so on. No, the real reason you see examples like "hello world" and Fibonacci generators is that they don't take up nineteen pages of code, so they're very portable and can easily be displayed. "[i]Instead I will comment on the other example that is often given. Something like: instanceOfClassX.getMemberY().doSomethingWithIt(..); Where memberY is often a collection. This, also, is not a good example of Java's verbosity. It is just an example of bad programming (implementation details being exposed through the interface of the class). Most of the time this type of operation should be encapsulated inside ClassX and then the chain of calls would collapse to: instanceOfClassX.doSomething();[/i]" Okay. Now show where you defined that class and that method and all the plumbing for both. You're under-representing the verbosity of the solution. "[i]Looking at the Java code on my projects I find that most of the code looks very much like Ruby code (and the reverse is also true): objects sending messages to other instances of classes using the familiar instanceName.methodName(..) syntax. It is true that Ruby has a number of features that are more economical in terms of typing but you can get those without totally losing the advantages of the more verbose static types. Groovy, for example, includes closures and a number of other Ruby "verbosity antidotes" but you can use interfaces and static types when appropriate (and it often is for me).[/i]" That's because Groovy is an attempt to make Java more like Ruby, by the way. "[i]Yes, using a Java (Groovy) interface is more verbose than using dynamic types but the interface is a formal specification of the API and is much better than a document because part of its implementation is checked by the compiler and so it is not useless verbosity. The same is true of other features like annotations, Enum and static parameter types. I don't use annotations because I like being verbose. I use them because they allow me to check high level design constraints at compile time and thus help me enforce those constraints when other programmers modify the code. Doing the same thing in pure Ruby would actually require more work.[/i]" I like having options. As such, I like some of what Groovy supplies. Unfortunately, Groovy fails at the one thing that most draws me to Ruby: making programming fun (for me). Ruby, interestingly enough, was primarily designed for exactly one thing -- enjoyment of the programmer. Everything else was secondary. There are cases where that shouldn't be the primary purpose of the language, of course, but a lot of the time it's very easy to get by with that at the top of your list of criteria.
While Groovy introduced Java to a lot of Ruby's benefits, to some degree at least, it also (from Ruby's perspective) imposed a lot of Java's "programming is serious business" feel. If you don't detect that feel to Java, you probably haven't spent enough time with a more-fun language. Python, by the way, probably has enough of the bureaucratic feel to keep Java programmers comfortable, and enough of the fun factor to be a good one to learn for that contrast with Java. "[i]Again, don't get me wrong I like Ruby and I am using it everyday. I am working on an internal project using JRuby (1.0.2 under Netbeans 6.0) and so far this is going well. The reason I am using JRuby is because I want to use the interpreter (eval) to support loading different test scripts at runtime (the project is an automated testing tool for our suite of Java applications). JRuby is really nice for java programmers because it adds many Ruby extensions to the Java library classes (Collection, etc...) and those Java libraries can be used. However, for large commercial project I prefer Java.[/i]" For "enterprisey" software, running these thirty-year processes handling great masses of requests without ever restarting the runtime, Java is the better option than the current implementation of Ruby (but we don't know what the future will bring) -- if for no other reason than the efficacy of the optimizing runtime which, as I pointed out, is the sort of thing doesn't work as easily with very dynamic languages. "[i]I might actually start using Groovy on large commercial project because it is more 'Java like'. I will have to experiment with it and run some benchmarks.[/i]" Let me know what you find. From the benchmarks I've seen, Groovy outperforms Ruby at least some of the time, and underperforms noticeably compared with Java most of the time -- but Groovy is also a very young language implementation. It may get better. What most interests me about Groovy performance is how (and whether) it makes use of the JVM's ability to support bytecode optimizing runtimes for long-running processes. If so, Groovy doesn't necessarily [b]need[/b] to get much faster to completely supplant Java, I think. "[i]I suspect that we will never agree on this but it is nice to be able to talk about it.[/i]" Well -- we obviously disagree on the relative value of more-than-necessary verbosity for readability. Otherwise, however, I'm not sure we've really run into anything on which we've established distinct disagreement.
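
To put the test-suites-over-type-declarations point in concrete terms, here's a rough Ruby sketch -- the class and method names are invented, and it's only meant to show the shape of the idea, not a definitive pattern:

    # Duck typing: no interface, no declared types -- anything that
    # responds to :price works as a line item.
    class Invoice
      def initialize(lines)
        @lines = lines
      end

      def total
        @lines.inject(0) { |sum, line| sum + line.price }
      end
    end

    # The behavior is pinned down by a test instead of a type system.
    require 'test/unit'

    class InvoiceTest < Test::Unit::TestCase
      LineItem = Struct.new(:price)

      def test_total_sums_line_prices
        invoice = Invoice.new([LineItem.new(3), LineItem.new(4)])
        assert_equal 7, invoice.total
      end
    end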

jslarochelle

What I am saying is that there is no silver bullet. I do not think that languages like Ruby (I didn't have time to look at OCaml) are a free lunch. I think that if for a project you get x% bug prevention or detection at "build time" in Java after N(j) hours of work, you will have to spend a similar amount of time N(r) doing the same work in Ruby, and N(r) will be very close to N(j). The way you do this will be different but the amount of time spent will be similar, I think. I did not find the examples that you have given me to illustrate Java's verbosity very convincing. I have already explained why I do not think that the public static main "Hello world" example is a convincing argument (it is impressive and this is why you see this example everywhere when the advantages of Ruby -- or a similar language -- are discussed) so I will not explain that again. Instead I will comment on the other example that is often given. Something like: instanceOfClassX.getMemberY().doSomethingWithIt(..); Where memberY is often a collection. This, also, is not a good example of Java's verbosity. It is just an example of bad programming (implementation details being exposed through the interface of the class). Most of the time this type of operation should be encapsulated inside ClassX and then the chain of calls would collapse to: instanceOfClassX.doSomething(); Looking at the Java code on my projects I find that most of the code looks very much like Ruby code (and the reverse is also true): objects sending messages to other instances of classes using the familiar instanceName.methodName(..) syntax. It is true that Ruby has a number of features that are more economical in terms of typing but you can get those without totally losing the advantages of the more verbose static types. Groovy, for example, includes closures and a number of other Ruby "verbosity antidotes" but you can use interfaces and static types when appropriate (and it often is for me). Yes, using a Java (Groovy) interface is more verbose than using dynamic types but the interface is a formal specification of the API and is much better than a document because part of its implementation is checked by the compiler and so it is not useless verbosity. The same is true of other features like annotations, Enum and static parameter types. I don't use annotations because I like being verbose. I use them because they allow me to check high level design constraints at compile time and thus help me enforce those constraints when other programmers modify the code. Doing the same thing in pure Ruby would actually require more work. Again, don't get me wrong: I like Ruby and I am using it every day. I am working on an internal project using JRuby (1.0.2 under Netbeans 6.0) and so far this is going well. The reason I am using JRuby is because I want to use the interpreter (eval) to support loading different test scripts at runtime (the project is an automated testing tool for our suite of Java applications). JRuby is really nice for Java programmers because it adds many Ruby extensions to the Java library classes (Collection, etc...) and those Java libraries can be used. However, for large commercial projects I prefer Java. I might actually start using Groovy on large commercial projects because it is more "Java like". I will have to experiment with it and run some benchmarks. I suspect that we will never agree on this but it is nice to be able to talk about it. JS

Sterling chip Camden

...but to me all that verbosity makes it more difficult to really see what's going on, even though it all contains somewhat relevant information. Concise code, when written elegantly, conveys everything you need to know in as few terms as possible. It's like if you're trying to find your way out of a building, but none of the doors have an "EXIT" sign. Instead, the sign reads "public static humanPassageWay(humanCollection passersBy) throws doorLockedException" and there are similar signs on every other fixture in the building.

apotheon

You're right, jslarochelle. Metaprogramming is not a panacea. On the other hand, it solves a lot more than type declarations -- which actually don't solve [b]anything[/b], as proven by statically typed languages with modern type inference systems such as OCaml. Metaprogramming can also solve a lot more than rote scaffolding production, which does nothing but make explicit what should be handled by the compiler and/or interpreter. What am I missing? What's the magical benefit of unnecessary verbosity that escapes me?
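
As a small, hypothetical illustration of what replacing rote scaffolding with metaprogramming can look like in Ruby (attr_checked below is an invented helper, not a library method):

    # A generated accessor plus a runtime check, instead of hand-written
    # getter/setter boilerplate.
    module AttrChecked
      def attr_checked(name, &check)
        define_method(name) { instance_variable_get("@#{name}") }
        define_method("#{name}=") do |value|
          raise ArgumentError, "bad #{name}: #{value.inspect}" unless check.call(value)
          instance_variable_set("@#{name}", value)
        end
      end
    end

    class Account
      extend AttrChecked
      attr_checked(:balance) { |v| v.is_a?(Numeric) && v >= 0 }
    end

    account = Account.new
    account.balance = 100     # fine
    # account.balance = -5    # would raise ArgumentError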

jslarochelle

I agree that metaprogramming can probably help with the validation, and this is one place where the performance hit of generating code and other such use of dynamic programming is acceptable. However, this technique has its limitations and I don't think one would be able to fully compensate for the help provided by compilers in languages like Java or C#. Don't forget that I use Ruby and that I enjoy some of the type-saving it provides; however, I don't think that those new languages are "silver bullets". Programmers will still have to work a lot to produce good quality code. They will work differently, and I think it becomes a matter of taste. For my taste I think that eventually a language like Groovy (more like Java) will be a better compromise. JS

Sterling chip Camden

... in real-life applications I get awfully tired of typing -- and I mean that in both senses of the word. Yes, nanny languages like Java and C# can stop you from shooting yourself in the foot, but it comes at the price of having to take five different safeties off your pistol before you can even fire it. I've been doing a lot of work in C# and the .NET Framework lately, and the next time I have to put three different casts in the same statement I'm going to scream. As far as quality assurance goes, I think a little metaprogramming can go a long way.

jslarochelle

Writing a message to the console is a frequent example when discussing verbosity. However, it is not representative of the code you will write in a real application. In a real application the scaffolding for the main() method will represent a very small percentage of the total lines of code. Most applications have GUIs and are made up of classes that use a syntax that is very similar in most languages (Ruby, Java, C++). If you leave aside code generation and keep the code readable, I think the productivity gains of a language like Ruby will not be that important for large applications. The lowering of the number of deliverable lines of code will come at the cost of fewer compiler checks, and you will need more unit tests to get the same level of advance error prevention. If you include the unit tests, the line count might actually be very similar. One approach might be to consider that the unit tests are optional and you might get lucky. However, this would not be a good strategy for an industrial grade application. I get very good productivity gains with Ruby on small scripts to parse files and other such tasks. However, I have been working on a desktop application (an automated test tool for our Java application suite) for a few weeks and I don't see any earth-shaking productivity gain. It sure is fun working in a new language, but the gains I get from Ruby so far are in the same order as using AOP in Java, for example. Closures, mixins and operator overloading give me a nice productivity boost, but this is partly cancelled by having to do more unit tests and more defensive programming. This is why I have more hope for Groovy, because you can use it more like Java. JS
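
For what it's worth, here is the kind of defensive check I mean -- a runtime assertion in Ruby standing in for something a Java compiler would have verified at compile time (the method and names are invented for illustration):

    # A runtime guard in place of a compile-time type check.
    def run_test_script(script)
      unless script.respond_to?(:call)
        raise ArgumentError, "expected something callable, got #{script.class}"
      end
      script.call
    end

    run_test_script(lambda { puts "smoke test passed" })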

apotheon

The more I have to deal with others' code -- or my own from six months ago, for that matter -- the more I appreciate languages that aren't [b]too verbose[/b], unlike Java, COBOL, and VB. Java devotees who are loyal subjects of the [url=http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom-of-nouns.html][b]Kingdom of Nouns[/b][/url] have a tendency to pretend that the succinct clarity of languages like Perl, Python, and Ruby is some kind of curse that makes their source unreadable. I find quite the opposite to be true. I rather suspect that the supposed clarity granted by Java's brand of verbosity is, in fact, nothing more than familiarity and confirmation bias. A trivial example follows. Java: [i]public class HelloWorld { public static void main (String[] args) { System.out.println("Goodbye, cruel world!"); } }[/i] Ruby: [i]puts "Goodbye, cruel world!"[/i] I, for one, find the latter more clear, and easier to follow. The problem with verbose languages, for the most part, is that the verbosity they impose tends to have little or nothing to do with what the programmer is actually trying to accomplish in any kind of beneficial manner, but instead consists of the sort of lengthy rambling necessary to convince the interpreter and/or compiler that when the programmer says to do something, (s)he actually means it. I, for one, would prefer that the interpreter and/or compiler trust me just a little so I can get on with the business of actually getting things done.

jslarochelle

Having other people shooting me in the foot is not something I look forward to. Especially if we are talking about a team. In a team context I prefer to have less ammo available for any potential sniper. JS

alaniane

and read someone else's code the more I like languages being verbose. When I first had to write code in VB.net (technically, I've never written code in actual VB unless you count legacy Basic code like GW-Basic or QuickBasic), I hated how verbose everything was. However, after having to go in and debug code written by others, I'm starting to like its verbosity.

Sterling chip Camden

I have programmed both COBOL and Java, and I think they're both too verbose, but in different ways. COBOL is verbose in terms of data definition and the syntax of procedure division statements. Java is verbose in terms of the class hierarchy required just to get off the ground. As Rodrigo Barreto de Oliveira says, 'The guys who came up with "public static void main" were probably kidding; the problem is that most people didn't get that it was a joke.' What I like about Ruby is that even though everything is part of a class hierarchy, you don't always have to spell that out. Plus, classes are much more mutable. Granted, that allows you to shoot yourself in the foot more easily, but I prefer a language that empowers over one that lectures.
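
Here's a trivial Ruby sketch of the kind of mutability I'm talking about -- reopening a core class at runtime. It's also exactly the loaded pistol mentioned above, so take it as an illustration rather than a recommendation:

    # Reopening String: the class stays open for modification at runtime.
    class String
      def shout
        upcase + "!"
      end
    end

    puts "hello".shout    # => HELLO!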

Absolutely

... to have an opinion of it, but I enjoy working with Java, as compilation-required languages go. I'm asking for Santa to change all programming languages into project-dependent subsets of the operating systems of the various computer architectures, but I'm expecting a lump of coal.

jslarochelle

I tried to restrain myself and not get involved with the Java-bashing thread, but I just can't. First, I don't think that Java's verbosity is as bad as COBOL's. This is obvious to anyone that has programmed in both languages. Second, I especially don't agree about the "worst-practices programming" part. I think that anyone who has not been living on another planet knows that Java has actually done a lot to elevate programming standards by promoting a lot of good programming practices. What Java adds in verbosity is actually needed to communicate through code if you work in a large team. I think this verbosity actually replaces a lot of comments and is better than comments. Interfaces are an example of that. For example, you don't need interfaces in Ruby. This would be a strength if you could use interfaces in Ruby when you decide that you need them (Groovy allows this). But not having them at all is, to me, a weakness. Interfaces are great because they let you specify an API in a very formal manner, and after you have done that the compiler will start working for you and automatically perform some of the checks that you would have to do using unit tests in Ruby. Interfaces have been a key element in most of the successful modules I have worked on. Now don't get me wrong: I love Ruby and I will use it for some projects (I use it almost every day for all kinds of tasks) in the form of JRuby. Java is far from perfect, but it has its strong points (annotations are one of them) and I certainly don't think that it has been a nuisance. The good news for Java platform lovers is the coming of Groovy, which incorporates some of the ideas of languages like Ruby (closures, dynamic programming, ...) and interfaces very easily with Java (this does not solve the problem with multiple JVMs, but that is a separate problem and is not the point of the discussion). JS
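
As a rough sketch of how one might approximate the interface idea in plain Ruby -- this is invented illustration code, not a standard mechanism, and it only checks at runtime or in a unit test rather than at compile time:

    # A module that documents a contract and lets a test verify it.
    module Persistable
      REQUIRED_METHODS = [:save, :load]

      # Call from a unit test to approximate what a Java compiler checks.
      def self.implemented_by?(klass)
        REQUIRED_METHODS.all? { |m| klass.method_defined?(m) }
      end
    end

    class Document
      def save; end
      def load; end
    end

    Persistable.implemented_by?(Document)   # => true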

Sterling chip Camden

...was also a feature of Fortran. I always hated that, and it is the one thing that makes me prejudiced against Python -- though I suppose I'll need to jump into that language at some point to see what all the aeronautics are about.

jslarochelle

...and turn a line of COBOL code into a comment. This happened to me and the bug was not easy to find (flipping through hundreds of punch cards). Since then I have avoided languages that use tabs and spaces for anything other than formatting (Python is one of them). JS

alaniane

in C or C++. Leave off one and you get a whole bunch of compile errors. Of course, the worst syntax error I've dealt with involved accidentally typing a capital "O" instead of a 0 in assembly. It's amazing how much they look alike especially when you have a lot of bitmasks.

Sterling chip Camden

I always liked the way that if you forgot the period at the end of one statement, you'd get 1245 subsequent compilation errors, all of which were caused by the missing period -- and they would mask all of the real bugs that followed. So, you'd fix the bug, submit the compile (which took anywhere from 10 minutes to 4 days, depending on how close we were to final project deadlines), and get back the compiler output (printed, of course) showing that you had also forgotten another period three lines further down.

Sterling chip Camden

that Java gave OOP a bad name by requiring complex class design, just like COBOL gave structured programming a bad name by requiring ponderous PERFORMs (ponderous in either size or depth, depending on your design).

Jaqui

My one experience with it was long ago, without ever learning it I was tasked with debugging a cobol app. with the sources printed on fan-fold paper on a dot matrix printer. the stack of paper was 9 inches thick. nope, not going to go there again if I have any choice in the matter. [ just for those who would ask otherwise, it took me 30 minutes reading the code to find all 4 bugs that were stopping the app from building. ]

Tony Hopkinson

Batch-oriented models are something I rarely get to work with nowadays. I did one a few months back; it's still a handy way of addressing problems on occasion, though databases have pretty much killed it outside of legacy mainframe systems. Pity, really, because that sort of thing, for stuff like ledgers, was clean, efficient and very low maintenance.

apotheon

Java is the new COBOL. If you really want to torture yourself with overly-verbose, ugly boilerplate-laden, bloated, tedious, worst-practices programming, the modern paradigm demands Java fill that role -- not COBOL. VB is about as effective on that score as Java, maybe even more so (in terms of how awful it is), but not as widely accepted (thanks to the powerful Microsoft gravity well preventing it from striking out into new territory). C# is much like Java in that respect, too, but with some very slightly improved details in its design here and there, and some tremendous issues breaking out of the Microsoft gravity well similar to those of VB. So . . . I guess if you want the agonies of COBOL, you should be using Java instead.

alaniane

of Cobol that I like. Maybe, it's the Assembly language part of me, but I like the fact that Cobol separates the code and data divisions. For writing batches it's not a bad language, but I quit trying to learn it when it came to Object-oriented Cobol. I guess if I had to learn for a job, it wouldn't be so bad, but I'll stick to Assembler for hobbies.

Sterling chip Camden

...would have been in my top five for expertise -- back in 1983. That's when I moved on to more powerful languages and never looked back. Even DIBOL-11 and Fortran-77 were more nimble than COBOL. When I learned C in 1984, it was as if the veil was removed from before my eyes -- and nowadays even that language seems obtuse.

Tony Hopkinson

No one who's doing it or has done a lot of it has posted? No one wanting to do it is perfectly understandable.

apotheon

I've always liked his essays and transcribed presentations. Full of truthiness and verisimilitude.

apotheon

I've been known to differentiate "general purpose" programming languages from others -- but those others tend to consist of things like Brainf*ck and PL/SQL (SQL made Turing complete by adding proprietary extensions to it), which are [b]not[/b] particularly "general purpose" even though they're Turing complete. Thus, the word "programming" is still relevant for differentiating between limited DSLs and, well, programming languages. As for scripting -- I'll just quote Larry Wall: A script is what you give the actors. A program is what you give the audience.

Absolutely

[i]apotheon - 12/31/07 It's easy to be a majority -- and thus to have a higher chance of "winning" -- when the opposition just throws in the towel.[/i] ...or doesn't perceive the existence of a contest.

Sterling chip Camden

... but I also agree that the word "programming" has become too blurred in usage. Perhaps "general purpose language" or some similar term should be applied to a language that is not only Turing complete, but also reasonably capable of programming nearly any application -- distinguishing that class of language from languages that are specifically designed for a more limited domain, like query languages and scripting languages. Of course, now I have a problem with the term "scripting languages", because many so-called scripting languages are actually some of the best general purpose languages. I'd want to limit "scripting languages" to languages that are designed for scripting the behavior of a specific class of application.

apotheon

It's easy to be a majority -- and thus to have a higher chance of "winning" -- when the opposition just throws in the towel.

Absolutely

... than I Absolutely must, to achieve a particular goal! I'm just not accustomed to the term "brain cycles."

Sterling chip Camden

Brain cycles are definitely a very important factor to consider, not only for the initial authorship of a working piece of code, but even more importantly for the ability to maintain and enhance that code. OTOH, when a given operation is performed many thousands of times over a short interval, or when your code will be used as a building block for much more complex and processor-intensive (or time-sensitive) operations, then you have to trade a few extra brain cycles for precious CPU cycles and use a language like C or Assembler to tune it down tight. But that is the exception.

alaniane

and I would be in agreement with you. However, the number of job listings that I have seen that call for VBA programmers indicates that many out there are calling macro languages programming languages. Since they have already gained acceptance as programming languages by many, the argument becomes relevant. Personally, I agree with you in that macro languages are not true programming languages and that writing macros is not the same as programming. I can also agree that strictly speaking SQL is a query language and that we write queries in SQL, not program in SQL. However, when it comes to word definitions, the majority tend to win out, and from what I've seen out there, the majority tend to lump macro languages, SQL and full-fledged programming languages together.

apotheon

I wouldn't call the macro definition syntax for a given application a "programming language". As such, your statement that, given calling application macro definition syntaxes "programming languages", we should also call SQL a "programming language", is irrelevant to me -- since from where I'm sitting an application macro definition syntax is not a programming language.

Absolutely

I guess I implied that, though, when I referred to the time needed to learn enough SQL to do CRUD with it. [i]RE: Apotheon That's what I'm guessing he's talking about and I'm in agreement with his statement if he is talking about "processing" time in reference to brain-cycles and not CPU-cycles when it comes to compiled languages.[/i] Actually, I think it's generally the case that SQL, for a given operation on data, is faster than doing the same operation with something other than SQL. Although there are exceptions, that is what they are, not a contradiction of what I just said is generally the case. [i]I'm just limiting my agreement to within a designated set of parameters. 1) SQL has been optimized for database use and could be said to be "less processor-intensive" when compared to other fully interpreted languages with regard to CRUD database operations.[/i] That's exactly what I meant. [i]2) It takes far less time to write optimized SQL queries for database operations (CRUD) than to write the equivalent with regard to processor performance than it would be to write the code using a compiled language.[/i] I hadn't thought of that exactly, but it's a fairly direct corollary of what I did say, about the relatively short time required for a programmer to learn SQL; it stands to reason that what on can quickly learn to do, one can [u]probably[/u] also then do quickly. [i]3) How processor-intensive a particular SQL query is depends on how the database system implements their SQL interpreter (Although most modern databases should have fairly optimized SQL interpreters, I'm not willing to bet the farm that [not] all of the databases out there do).[/i] I, Absolutely, will not bet against your farm bet.

alaniane

programming language should be qualified with an accompanying adjective instead of just using "programming language" in and of itself when referring to different languages. I have no objection to calling SQL just a query language; however, the following quote from this round of the discussion: "SQL is a misnomer. "SQS" or "SQV" would be more accurate -- structured query syntax, or structured query vocabulary." implies that it is not even a language in and of itself. One problem is that the terms "programming" and "programming language" have taken on far broader meanings than just systems and application development. It's not a matter of elevating SQL; it's a matter of not denigrating it. If "programming" can refer to writing macros using a macro language, then why is it inappropriate to apply the term to writing queries in SQL? If most people accepted the term "optimization" as applying to querying as well as to programming, then referring to writing SQL queries as querying would not be a problem. So, we could distinguish programming as applying to either systems or application development, macroing as applying to writing macros for applications, and querying as writing SQL queries.

alaniane

he's talking about and I'm in agreement with his statement if he is talking about "processing" time in reference to brain-cycles and not CPU-cycles when it comes to compiled languages. I'm just limiting my agreement to within a designated set of parameters. 1) SQL has been optimized for database use and could be said to be "less processor-intensive" when compared to other fully interpreted languages with regard to CRUD database operations. 2) It takes far less time to write optimized SQL queries for database operations (CRUD) than it would take to write code of equivalent processor performance using a compiled language. 3) How processor-intensive a particular SQL query is depends on how the database system implements its SQL interpreter (although most modern databases should have fairly optimized SQL interpreters, I'm not willing to bet the farm that all of the databases out there do).

apotheon

I'm pretty sure Absolutely didn't mean to compare SQL with compiled languages like C at all -- that he was referring to languages that are interpreted to varying degrees (Java's bytecode interpreter in the JRE, post-compilation of bytecode; Perl's post-compile phase interpretation of parse trees; interpreted PHP; et cetera). On the other hand . . . the value of programmer time is worth considering as well. That fact is implicit in the choice of certain interpreted languages like Perl and Ruby over compiled languages like C and Pascal, or even over bytecode-compiled languages like Java. Thus, while Absolutely probably didn't mean to directly address a comparison between C and SQL, one could reasonably assume an underlying premise that SQL's "processing" (as measured in braincycles rather than CPU cycles) is more efficient than C's for most instances of performing database queries as part of Absolutely's argument. . . . and I tend to agree with that. That's why I write a lot of stuff in Perl or Ruby instead of C, after all.

apotheon

Names also affect our ability to communicate effectively when differentiating between things. Conflating two things by applying the same name to them when, in fact, the name is more properly applied only to one of them, erodes the ability to communicate effectively. Also . . . the power of names to affect perceptions can cause people to make absurd decisions about what tool is the right tool to use for a job. When performing database queries, obviously a query language is the choice. When performing application programming tasks, one benefits from differentiating between query languages and programming languages. Sure, you might give SQL an inflated appearance of "importance" by calling it a programming language, but you also might end up prompting people to do work with SQL that is more properly done with C or Perl.

alaniane

that, although I find that human nature tends to evaluate things based on perception. So, naming can affect how important something is perceived to be by individuals. Of course, I shouldn't complain, since it gives me work to do correcting their mistakes.

alaniane

you're basically stating that SQL is better optimized to perform CRUD for databases than other general-purpose interpreted languages. I would agree with that. Of course, as for processor-intensive, it would be database-implementation dependent and would only apply to fully interpreted languages. Compiled languages like C/C++ and assembled languages like Assembly could be optimized so as to be less processor-intensive for performing the same operations as a SQL query (after all, it's possible that the SQL interpreter for the database was written in C, and it definitely has been compiled to machine language, which has a one-to-one correspondence with Assembly). But, if I understand your argument correctly, it would take more development time to implement the optimized algorithms. I would agree with that if that is what you meant by "processing" time for compiled languages.

Absolutely

...to have any given effect on any given data set, in this case comparing the time required for SQL vs. other languages you've mentioned to do the same CRUD. If his choice of words is clearer to you, please refer to apo[b]th[/b]eon's paraphrase [or 'interpretation,' LOL!] of what I said about [u]how much[/u] interpreting occurs in different languages: 2. [i]I think Absolutely's point isn't that interpreted languages are faster than some other class of languages (compiled, for instance). Instead, it's that SQL is faster (for some definition of "faster") than (most) interpreted Turing-complete programming languages. In other words, he wasn't differentiating between interpreted languages and non-interpreted languages -- he was differentiating between SQL and other interpreted languages. At least, that's what I got from it.[/i] That was my intent, absolutely.

apotheon

I don't think the answer to overlooking the need to optimize database queries is to start referring to SQL as a programming language in the same breath as assembly or Haskell. It is, instead, to simply drive into the minds of knuckleheaded CompSci bachelor's degree graduates the fact that one should address the bottleneck if one wants more efficient execution. If the bottleneck is in the DB queries, deal with that. If it's in the application controller code, deal with that instead.

alaniane

but I'm still curious as to what his definition of processor-intensive is. If he's referring to the fact that SQL has been optimized and that it does not have the added overhead of supporting other applications then I would agree with his statement. However, the term processor-intensive has another meaning for me probably because of my previous experience with Assembly.

alaniane

Actually, that's what I consider SQL to be. It's basically a macro-type language for database systems. Where I differ is that to me, the terms "programming" and "programming languages" have acquired a much broader interpretation than just Turing-complete languages. The terms themselves, while they may have originally applied to just Turing-complete languages, have a broader application today, especially "programming." One problem I see with limiting the definitions is that programmers have a tendency to view as inconsequential any languages they don't consider to be full-fledged languages. Therefore, you see programmers concentrate on optimizing their frontend code and completely skip the backend queries. After all, SQL is not a "real" language, so it's unnecessary to optimize it. If you take the view that C/C++ or VB is the application code language for a project and SQL is the database language for a project, then you're liable to optimize both ends.

apotheon

One could consider SQL as nothing more than an application macro language, where the DBMS is the "application".

apotheon

1. I think you're missing an important distinction in this discussion. I don't think anyone disputes the fact that SQL is "a language" -- only that it's a proper "programming language". In other words, nobody's disputing the "language" part. Some of us, however, dispute the applicability of the "programming" part. 2. I think Absolutely's point isn't that interpreted languages are faster than some other class of languages (compiled, for instance). Instead, it's that SQL is faster (for some definition of "faster") than (most) interpreted Turing-complete programming languages. In other words, he wasn't differentiating between interpreted languages and non-interpreted languages -- he was differentiating between SQL and [b]other[/b] interpreted languages. At least, that's what I got from it.

Absolutely

[i]and SQL delivers because it does not need to be compiled, pre-compiled, interpreted, etc., [b]with the processing overhead[/b] of "real" languages. It sticks to what it's good at, and in turn is the very best at it.[/i] I knew that was not as clear as it might have been, but decided not to bother trying to construct the perfect sentence at that time. Also I was curious how you would interpret it as written. In fact, I know that only bit strings can be executed without any interpreting. It might have been clearer if I had phrased it "SQL delivers because it does not require as much processing to perform the instructions sent to it," but I didn't really expect you to bother with that detail, because my essential point, that SQL's advantage is the very same simplicity that makes its status as a full-fledged "language" subject to debate, is not dependent on whether SQL is "not interpreted" or just "interpreted more quickly."

alaniane

I agree that SQL doesn't meet your definition of a language; however, SQL does have to be interpreted. The interpreter is built into the database system. The only language (pseudo or real) that does not have to be compiled or interpreted is machine language. That's why it's important to optimize SQL queries if performance is essential.

Absolutely

[i]I do agree that if you limit the definition of a programming language to the ability to develop an application then SQL would not be a programming language since you cannot develop an application by just using SQL.[/i] In my own words, for any set of data, a language in which I could do CRUD to that data, [b]and[/b] program a convenient UI for a non-programmer, is a "complete" language. As far as using indexes, yes, that's a bit subtler, but each language has its subtleties, too. The way I see it, the [b]best[/b] thing about SQL is that it pretty much sticks to CRUD; storage capacities and the prices of them being what they are, simplistic operations on data need to be done very quickly, and SQL delivers [b]because[/b] it does not need to be compiled, pre-compiled, interpreted, etc., with the processing overhead of "real" languages. It sticks to what it's good at, and in turn is the very best at it.

alaniane

So, if you define a programming language as having to be Turing Complete then SQL is not a programming language. However, I think the term programming has come to have a much broader application than just using a Turing complete language. Whether I like the broader application or not doesn't really matter. Personally, I don't consider just writing macros in an application programming per se; however, I've seen the term "programming" applied to VBA.

Tony Hopkinson

SQL is probably the most easily appreciated. C in Windows but no API knowledge, HTML and JavaScript but no webserver knowledge, C# without .NET -- all have similar issues. SQL is probably the most deceptively simple. Personally, I think SQL should be learnt in conjunction with database theory; far too many go down the how-to tutorial route, which doesn't, and in real terms can't, cover when to.

alaniane

is on the definition of a programming language. "It doesn't change the fact that anybody able to write good C or C++ could learn SQL in a week of focused study, or a few weeks of after-hours study." Any good programmer should be able to learn the syntax of a new language within a couple of weeks. It's not that difficult even to learn the syntax of Assembly (at least for the 80x86 processors). However, just because a language may be easier to learn does not make it less of a language. It's easy to learn how to read in Spanish when compared to English or Chinese; however, that doesn't make Spanish any less of a language than English or Chinese. I do agree that if you limit the definition of a programming language to the ability to develop an application then SQL would not be a programming language, since you cannot develop an application by just using SQL. However, where I disagree is the assumption that programmers can learn to use SQL efficiently within a couple of weeks. They can learn the syntax, but it takes time for them to learn how to effectively write queries. Something as simple as SELECT Sales FROM tblRevenue WHERE YEAR(SaleDate) = 2007 becomes inefficient if you have indexes on tblRevenue. The function YEAR() in the WHERE clause prevents the query from using an index in some if not all databases. In the early 1990s when I only programmed in C/C++ and a little bit of Assembly, I would have agreed with you 100%. When I first started learning SQL, I figured that it was a cinch to learn; however, after burning myself quite a few times over the years, I have a new respect for what SQL can do.

Absolutely

[i]Real programmers leverage the advantages of each of their tools to produce a good product. [/i] I agree: http://techrepublic.com.com/5208-6230-0.html?forumID=102&threadID=247965&messageID=2388878 "Where databases are involved, programming effectively means employing SQL because of the processing overhead ..." [i]When you query a database and bring back over a million rows when you could have brought back only 10 then your a hack and not a real programmer. I don't care how fast C, C++, or even Assembly can process that million rows, your program is inefficient especially if you're programming in a Client/Server environment.[/i] Agreed again. And, whether those 10 million rows are retrieved from a database using SQL or from an array or many arrays in C or C++ (I don't know Assembly, and I don't talk about things I don't know), sloppy coding is sloppy. It doesn't change the fact that anybody able to write good C or C++ could learn SQL in a week of focused study, or a few weeks of after-hours study. Which makes your and Tony's experiences with programmers who use it badly all the more atrocious, but still doesn't make SQL a programming language.

alaniane

whether SQL should be considered or not. If you consider a language to be a programming language if it's a "full-fledged" language that allows interactive I/O then SQL is not a programming language. SQL is definitely not a structured programming language like C or VB, but then Assembly isn't either. You can impose structured programming constructs upon Assembly, but it's not built-in. If you consider a programming language one that can be compiled into native machine language then you've eliminated a whole bunch of languages including Java and C# since neither are compiled directly to ml.

alaniane

need someone to come behind them and clean up the mess they made in the database. Real programmers leverage the advantages of each of their tools to produce a good product. When you query a database and bring back over a million rows when you could have brought back only 10, then you're a hack and not a real programmer. I don't care how fast C, C++, or even Assembly can process that million rows; your program is inefficient, especially if you're programming in a client/server environment.

Absolutely

One 3"x5" note card holds everything there is to know about SQL and leaves the other side for markup languages.

alaniane

and has nothing to do with T-SQL or PL/SQL. CASE WHEN THEN ELSE END is a SQL construct and subqueries (whether correlated or not) are also SQL constructs. I could also add that the WHERE and HAVING clauses act as "if" structures.

Absolutely

Outside of rudimentary CRUD, you're increasingly into areas that are specific to proprietary implementations of what's called generic SQL, which have names with more than 3 letters -- i.e., a different thing.

alaniane

SELECT CASE WHEN tot > 100 THEN 0 WHEN tot BETWEEN 51 AND 100 THEN 10 ELSE 15 END AS ShippingCharge FROM tblInvoice is an example. As for for/while logic, SQL implements it as a correlated sub-query. That's one reason it is a good idea not to nest too many correlated sub-queries in a program; it follows the same O(N) principles with nested queries. Also, another way of performing calculations that would require loops in other languages is using GROUP BY and HAVING clauses. However, SQL is not a structured language like C or C++. It doesn't follow the sequence, selection, and iteration constructs like structured languages do.

Absolutely

What SQL does not have is the more sophisticated logic, like 'for' loops and case/switch. Without those, its math capabilities are limited to the four arithmetic operations, which I admit it does, but that's not enough to call it a programming language on par with C, Java, or even VB. It's still too simplistic to call anything more than a specialized vocabulary and syntax, which should be part of every programmer's repertoire -- a small part.

alaniane

calculations on the fly. You can use SELECT ((4*3)+2)*7 AS Val and it will compute the calculation and display it as a column. In fact for database calculations SQL is far more efficient than C/C++ or VB. The problem is that most programmers are not efficient in SQL. They may know the basics, but they have not learned how to proficiently use SQL. I tune queries all the time that have been written by programmers who knew C or VB, but obviously did not know what they were doing in SQL. As for this statement: "If the Order Total is not already calculated and stored in your Database, SQL alone will not calculate if from the quantities in the fields 'Unit Price' and 'Quantity.'" Storing a calculated result for order total would be considered a violation of 3N for database design. Calculated totals are stored in the database not because SQL cannot perform the calculations, but because sometimes denormalizing a database leads to better performance. For database calculations I prefer SQL over any of the other languages out there. I wouldn't use SQL for graphics or interactive I/O, but then that's not what SQL was developed for.

Tony Hopkinson

most feel a stored proc design is too proprietary. The most general failure I see from programmers who do SQL is that most can't go further than a select statement. The number of times you see them suck an entire table onto a client, test for a condition on each record, and then update makes you want to scream.

Absolutely

You can make a case that SQL has all the agility with numbers and text that it needs, or that I was imprecise with the word 'calculation' and the phrase 'text processing.' But for now I stand by my previous statements; SQL users can apply numerical [i]conditions[/i] to instructions to [u]C[/u]reate, [u]R[/u]ead, [u]U[/u]pdate, & [u]D[/u]elete, but it cannot perform calculations or processing of what it retrieves. If the Order Total is not already calculated and stored in your Database, SQL alone will not calculate it from the quantities in the fields 'Unit Price' and 'Quantity.' Such derived quantities are generally available because they are computed on input, but the computations are not performed in SQV&S.

alaniane

is that SQL programming is different from programming in VB or C/C++. Many programmers fall into the trap of using cursors to perform tasks in SQL because they are similar to while or for clauses in other languages. However, in most cases it is far better to use either temp tables or table vars to accomplish the task than a cursor. Also, sub-queries, especially correlated sub-queries, can be a difficult concept to understand, and if you use a correlated sub-query at the wrong time it will cause a performance drag.

alaniane

could not be written using SQL: SELECT SALES * TAX_RATE AS TAX FROM tblInvoice WHERE InvoiceDate > '12/1/2007'. But since I can write this SQL query, it does do numerical calculations. This is plain SQL and not PL/SQL or T-SQL. Also, you can process text in SQL with functions like LTRIM, RTRIM, CHARINDEX, LEFT, LEN, LOWER, PATINDEX. SQL has both text processing functions and numerical calculation functions. T-SQL and PL/SQL also have iteration and selection constructs. SQL has a selection construct in the CASE construct. It's far more powerful than simple CRUD functionality.

apotheon

"[i]'SQV&S' could never catch on without corporate backing.[/i]" How about "SQueVAS"? Just thinkin'. "[i]So-called 'SQL' does crud to database tables[/i]" It also does CRUD (create, read, update, delete) to database tables. Sorry. The pun-bug bit me.

Absolutely

Although I agree with your basic statement, I disagree with the conclusion. Where databases are involved, programming effectively means employing SQL because of the processing overhead -- and because of the programmer's learning overhead, which brings me to my disagreement with you. SQL is a misnomer. "SQS" or "SQV" would be more accurate -- structured query syntax, or structured query vocabulary. "SQV&S" could never catch on without corporate backing. So-called "SQL" does crud to database tables, according to a [i]few[/i] criteria, nothing more. When those criteria can be identified and crud to a database is all that needs to be done, SQL is the best way to do it. Because it does so little, it is a trivial thing for a programmer to add it to his repertoire, but is not a [i]bona fide[/i] programming language. It does no numerical calculations and it does no processing of text. I did include it on my own lists, out of habit, but I should replace it with BASH on all my lists, and probably moving BASH up from where SQL placed, especially on the third list. I wouldn't take special efforts to program without SQL, or without a mouse, but that does not make either a "programming language."

Jaqui

that doesn't surprise me; after all, they don't understand that the dbms is far more secure. Even in a compiled program the business logic can be gotten at, especially with the free hex editors out there. Try breaking the security in any quality database system -- it's not going to be anywhere near as easy as opening an executable in a hex editor.

Tony Hopkinson

down the SP route. For some reason implementing business logic in the DBMS is seen as more constraining than doing so in some programming language.

Jaqui

with stored queries in the database engine itself, those stored queries are effectively programming in SQL. A perfectly formed query to get only the explicit data required, as a stored query, is far less resource hungry than any query compiled on the fly by an application, making the time taken to write the queries and store them a very good investment for any business.

Tony Hopkinson

I've done SQL alongside programming in C, Fortran, VB, Delphi..... even PHP, which I agree is f'ing horrible.

Jaqui

based on execution environment, nothing to do with language syntax at all :p

apotheon

"[i]PHP is a scripting language, like Perl, Java, Python[/i]" Why would you insult Perl, Java, and Python by saying PHP is "like" them? Why would you insult Perl and Python by including Java in such a list?

Tony Hopkinson

me question my sanity. Transforms and styling: great. Do not program with it, though. I've seen a few try; it's not worth the trouble.