
10 development technologies that refuse to die

Even as new tech drives IT forward, certain systems and languages keep the past alive. Here are 10 technologies that will be with us for a while.

The world of software development has a strange irony to it. On the one hand, technology is changing so quickly that developers are forced to constantly learn new tricks to stay current. On the other hand, existing projects and code are so hard to replace that systems can stay in maintenance mode for decades, slowly being rewritten piece by piece but never actually replaced. These 10 technologies are ones that software developers will be using for a long, long time, even if some are past their heyday.

1: COBOL

COBOL is all over the place and probably always will be. There are millions of lines of COBOL code out there powering banks, insurance companies, and other mission-critical systems that handle massive amounts of data. Many of these systems will be in service for decades, if not centuries, without replacement.

2: VBA

A lot of systems that use VBA, VBScript, or VB6 (all related technologies) are outdated. But VBA is still the macro language for Microsoft Office, and plenty of people depend upon it to do their jobs. As miserable as it is to work in VBA (it has collections, but no built-in way to check whether a value already exists in one), it will be around for quite a while unless Microsoft somehow comes up with a suitable alternative.

3: .NET WinForms

When Microsoft came out with .NET, developers used WinForms to make Windows applications with it. WinForms was a thin veneer on top of the Win32 API, and for VB6 and MFC developers, it felt very comfortable. For better or worse, Microsoft is replacing WinForms with XAML-based frameworks: first Silverlight and WPF, and now Metro. All the same, the fast rise of .NET meant that tons of WinForms applications were built -- and they will be maintained for a long time, just like the VB6 applications out there.

4: Flash

A few short years ago, it was impossible to even imagine a Web without Flash. It was everywhere. While Flash still is everywhere, HTML5 threatens to push it out of its spot for rich Web development. Even so, there will be existing Flash work out there for ages, and it will be maintained and extended. HTML5 still can't replace Flash for some things, either.

5: C

Until fairly recently, C was enjoying a graceful, slow drift away from actual application development, relegated to hardware-driver and operating-system work. And then the iPhone (and later, the iPad) arrived, causing a massive surge in the use of Objective-C, which is a superset of C. Now, thousands upon thousands of developers have learned C in the last few years and used it to write hundreds of thousands of cutting-edge applications. Talk about a comeback! The popularity of iOS will ensure that C is used for application development for some time to come.

6: FORTRAN

If languages were people, FORTRAN would be regarded as COBOL's fuddy-duddy spinster aunt. But like COBOL, FORTRAN was the language of choice for certain industries and sectors, a pile of code got written in it, and replacing that code is basically impossible. Where COBOL runs the banks, FORTRAN runs things like weather prediction.

7: SQL

SQL is a strange case. On the one hand, databases that use SQL are still all over the place, and SQL is often the only way to work with them. So it is no surprise that the SQL language itself is out there in spades. What is a surprise is how many developers are still writing a lot of SQL code. With all the various database abstraction systems out there, such as the ORMs (Hibernate, Entity Framework, etc.) and other systems (Active Records, LINQ), why in the world does anyone actually write SQL into their applications? It should be the (very rare) exception, not the norm, yet many developers find a need to write SQL. Even if everyone stopped writing SQL by hand tomorrow, though, systems would be automatically generating it anyway.
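To make that concrete, here is a minimal sketch of an ORM generating SQL, assuming Python with SQLAlchemy (a library the article doesn't name; Hibernate, Entity Framework, and LINQ work on the same principle). The table and query are hypothetical examples:

    # Minimal sketch: ORM-generated vs. hand-written SQL.
    # Assumes Python + SQLAlchemy; the Customer model is hypothetical.
    from sqlalchemy import Column, Integer, String, create_engine, select, text
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class Customer(Base):
        __tablename__ = "customers"
        id = Column(Integer, primary_key=True)
        name = Column(String)

    engine = create_engine("sqlite://")  # throwaway in-memory database
    Base.metadata.create_all(engine)

    # The ORM builds the SQL statement for you...
    query = select(Customer).where(Customer.name == "Acme")
    print(query)
    # SELECT customers.id, customers.name
    # FROM customers
    # WHERE customers.name = :name_1

    # ...which is equivalent to the SQL a developer would write by hand:
    with Session(engine) as session:
        rows = session.execute(
            text("SELECT id, name FROM customers WHERE name = :name"),
            {"name": "Acme"},
        ).all()

Either way, SQL is what reaches the database in the end, which is exactly why the language isn't going anywhere.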

8: ASP.NET WebForms

When ASP.NET was first released, WebForms had the unenviable task of trying to make Web development feel as familiar as possible to traditional desktop application developers. To make it even more of a challenge, it carried over and extended many of the technologies from Classic ASP while completely changing the overall model. WebForms clearly suffered from serving too many masters, and less than 10 years later, Microsoft was pushing ASP.NET MVC's streamlined model in its place. As with WinForms, WebForms' similarity to previous systems led to rapid adoption, so WebForms Web applications will be around for quite some time.

9: Java

Java is nowhere near a decline. It is still a strong, vibrant ecosystem. But if and when the day comes that people start referring to it as "legacy," it will still have many, many years left. It is no surprise that industry observers often call Java "the modern-day COBOL." It has a combination of traits (like running on *nix servers and mainframes) that makes it attractive to the same industries COBOL appeals to. Java has made impressive inroads into those areas, and even if the flashier uses of Java (like Web development) go away, it will still hold a prime spot in the world of Big Iron.

10: HTML

It is hard to believe, but at one point, the Web was little more than a way of posting documents online so that you could easily access one document from another. About 20 years later, HTML is a wildly popular development system that has enabled an unimaginable revolution in how computers are used. And the funny thing is, up until HTML5, it was never deliberately designed to fill the role it found itself filling. It is hard to imagine a future computing world without HTML (or one of its descendants).

Other persistent technologies?

What languages and systems do you think will be hanging around for the foreseeable future? Share your predictions with fellow TechRepublic members.

About

Justin James is the Lead Architect for Conigent.

312 comments
Jessica159

Well, I was searching for PHP programmers and found this blog, which is pretty awesome and has some useful topics for me. I'd like to say keep it up!

baltazor1

Although I never really liked them, it still is a very smooth way to introduce children to programming, especially now that they are introduced to computers at an early age.

coleade09

I don't know how you would classify VFP, but I think it should have made the top 10. It has been around for quite a while, and even some systems today are still running on FoxPro for DOS.

todd_dsm

Some of these languages are the DNA of the system... C isn't going anywhere; it's the language of drivers. Unless you want a lull in conversation between your HDD and anything else, this one's going to be around forever. Java will pass once mega-corps realize the world has turned to cloud computing and Python, but they will have to abandon years of hard work with Java. It's hard enough for them to agree on the color of an icon, let alone absorb that the rest of the world is changing direction. Cornflower blue, anyone? SQL, ditto.

Python, (a little) PHP, HTML5, and noSQL are the wave of the future. But, like steering the Titanic, companies need not only the inclination ($) but the time. It takes longer to steer big ships.

Here's "the list" of programming languages: http://www.tiobe.com/index.php/content/paperinfo/tpci/index.html By clicking on the names you can get some pretty charts, e.g. Java vs. Python vs. Ruby. You can get a sense of who's comin' and goin'. Fun piece, JJ.

willis0966

I remember writing a word processor in FORTRAN (of all things!). It worked better than the text editor everyone was using at the time... Once you learn to become "fluent" in a language, it feels comfortable. That's why some languages will never die. They might be past their prime but are so entrenched in so many installations that it would be almost impossible to replace them. They'll be around for a long, long time...

cwayneu

Yes, those DEC developers were way ahead of their time. DEC's OSes (RSX and VMS) were the most secure, stable, flexible, and elegant OSes I had ever seen. Their interrupt handling, memory management, security daemon, and ASTs were simply a work of art. Maybe by the end of this decade, Windows might be up to par with where DEC was back in the mid-'80s. ;-)

Tony Hopkinson

And DEC kit itself was always rock solid. Strangely enough, the boys holding on to Fortran at my last place of work were also holding on to VMS and a pile of DEC boxes, mainly Alphas. Did a nice two-year refresher there, while picking up LAMP and, unfortunately, VB6.

cwayneu

Well Justin, you certainly struck a few nerves here. Most of these technologies will likely hang around in some form until something like the HAL 9000 gets smart enough to take over and just do what we are thinking. ;-) And I agree with Mark Miller's assessment of evolution, that it's not always the best that survive. If that were true (not to start an OS war, but), DEC would still be a computer company and VMS would still be an OS player. I've worked with them all (CDC, Prime, HP, IBM, Apollo, DEC, Windows, Apple, Unix), and I remember Digital Research's CP/M. I found VMS to be second to none, in my humble opinion. I miss those "good old days"... ;-)

smokinsteve

[quote]If languages were people, FORTRAN would be regarded as COBOL's fuddy-duddy spinster aunt.[/quote] You've got to be kidding me! It's the other way around, isn't it? Fortran was always the sexy one, allowing geeks and other brainiacs to do hardcore tech- and science-oriented number crunching on cool, cutting-edge mean machines like CDCs. COBOL, on the other hand, was that crappy, verbose, supposedly English-like POS designed to let suits make IBM clunkers do boring stuff like work out people's gas bills. And nothing much has changed. If you want to mathematically model climate change or nuclear detonations on a supercomputer, Fortran's the only way you're going to do it. Fuddy-duddy my eye!

BlueCollarCritic

[quote]With all the various database abstraction systems out there, such as the ORMs (Hibernate, Entity Framework, etc.) and other systems (Active Records, LINQ), why in the world does anyone actually write SQL into their applications? It should be the (very rare) exception, not the norm, yet many developers find a need to write SQL. Even if everyone stopped writing SQL by hand tomorrow, though, systems would be automatically generating it anyway.[/quote] Are you kidding me? I expect this kind of thing from the script kiddies and younger developers who are inexperienced, but not from adults with experience. The answer is YOU. Specifically, your lack of understanding of relational databases and how to get to and work with that data efficiently/effectively. If all you ever work with is tiny data sources (a few hundred gigabytes), then those "one interface queries all" tools will work for you. However, when it comes to dealing with the important stuff that is measured in terabytes or more, the dumbed-down interfaces to SQL coding that let you skip writing, and even understanding, how to query the database fall flat on their face, and that's because the one-size-fits-all method doesn't work for all sizes.

Let's flip the idea and see how it sounds from the other direction: "A List of 11 Development Technologies That Refuse to Die." #11: Programming: With all the wonderful drag-and-drop software tools available, why does anyone still bother to write, or even piece together, existing code? If the programmers would stop writing code and use only these dumbed-down tools, then the problem would take care of itself and writing software code could finally be put out of its misery.

Sounds dumb, doesn't it? Same goes for this "No SQL" mindset. It's this kind of attitude, that "I don't need no stinking SQL," that results in very poorly written applications that are unable to work with large data sets. What's worse is it's often the simplest mistakes that cause the worst headaches, and all because some developer didn't want to be bothered with learning the SQL language. On the upside, this means ensured work for us DB people for many years to come.
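To make that performance point concrete, here is a minimal sketch of the classic "N+1" failure mode, assuming Python with SQLAlchemy and hypothetical customers/orders tables (any ORM that lazy-loads related rows can fall into the same trap):

    # The "N+1" pattern: innocent-looking abstraction-layer code issues one
    # query per row instead of a single join. Tables are hypothetical.
    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")  # throwaway in-memory database
    with engine.begin() as conn:
        conn.execute(text("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"))
        conn.execute(text("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER)"))
        conn.execute(text("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')"))
        conn.execute(text("INSERT INTO orders VALUES (1, 1), (2, 2), (3, 1)"))

    with engine.connect() as conn:
        # Naive, ORM-shaped access: one query for the orders, then one more
        # query per order for its customer -- N+1 round trips in total.
        orders = conn.execute(text("SELECT id, customer_id FROM orders")).all()
        for o in orders:
            conn.execute(text("SELECT name FROM customers WHERE id = :c"),
                         {"c": o.customer_id})

        # Hand-written SQL: the same answer in a single round trip, letting
        # the database's join planner do its job.
        rows = conn.execute(text(
            "SELECT o.id, c.name FROM orders o "
            "JOIN customers c ON c.id = o.customer_id"
        )).all()

On three rows the difference is invisible; on data measured in terabytes, it is the difference between one query and millions.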

beck.joycem

Evolution is the word. Some species (or spoken languages, or IT tools of any sort) survive almost unchanged when they are already well adapted to their environment and that environment doesn't change significantly. Others change in response to their environment; others become extinct. The point is that extinction happens when there is no environment in which the species can continue to thrive. So, as evidenced by the vast variety of responses here, there are environments in which Fortran continues, almost unchanged, to thrive after decades, but there are also species that have only been around a few years that no longer have a place - a few struggle on, but they're being supplanted by better-adapted species. Take Fortran out of its mostly scientific home and it gasps, flounders, and dies. But it won't become extinct unless no one finds it the best tool for the job they have to do - now, without spending three months learning a new language that doesn't suit their scientific brain nearly as well as Fortran. The same analogy can be drawn with domestic lighting. Candles and oil lamps are still in use, as are many of the generations of electric lighting invented since. Some - probably incandescent light bulbs - will eventually become extinct, but there will always be a box of candles on my shelf for when we get power cuts.

Suresh Mukhi

My daughter's first programming language was HTML, which she learned in school.

Mark Miller

C at the top of the list, C++ way down at about 9%, with Objective-C nipping at its heels? Haskell about ready to break into the top 20? I wouldn't have expected to see this at all. Exciting.

apotheon

Your list of "wave of the future" things is shockingly limited, and noSQL isn't "the wave of the future" -- it's just another tool that will probably become very popular for particular purposes. Relational databases will continue to be quite useful for various purposes, and as "noSQL" is defined these days it won't be enough to replace SQL for those purposes. PHP is actually a technology looking for an excuse to die, I think. It just hasn't found that excuse yet. It survives not because it's an excellent critter for its evolutionary niche; it survives because, against all odds, nothing better has appeared in that niche and gained enough traction for people to notice it yet -- though some very-nearly-there options do exist, and they're enough better-designed languages that a lot of people choose them over PHP despite some of the specific niche-fitting benefits of PHP. You present Python as "The Answer" for certain types of programming, as if there are no competitors. It's just a visible example for a particular niche. Others exist as well; it's not the PHP of its niche, the sole tool that truly satisfies the needs of the niche. It's just one of several.

apotheon

Languages used primarily for application development are particularly vulnerable to fading away entirely if they become unpopular, because there will be few entrenched codebases (thanks to application turnover) mandating the training of new users of the language -- which means the only thing likely to keep them "alive" will be old programmers set in their ways. This is the kind of problem facing Pascal, now that Borland has crapped out, and it'll be a problem for VB if Microsoft ever ceases to prop it up. People being "fluent" in a language isn't quite enough in the long term.

apotheon

You must be the perennial optimist.

Mark Miller

Dave Cutler, who had worked on VMS at DEC, was the chief developer on Windows NT, which became the basis for Windows 2000 and all subsequent versions. As I recall reading at the time NT came out, he modeled it on VMS's design, but from what I've read since then, Bill Gates got his hands on it... The original product idea for NT was that it would directly compete with Unix, as a command-line OS. In the end, MS decided against that and gave it a GUI, but put in the "cmd.exe" app to allow command-line control of the system.

Sterling chip Camden

What they got wrong was the price/performance ratio and how to treat their OEMs. Low-cost iron running Unix, Xenix, or even DOS undercut their market by providing five-fold performance at half the price or less.

smokinsteve

[quote]. . . it's not always the best that survive. If that were true, . . . DEC would still be a computer company and VMS would still be an OS player.[/quote] It's terribly sad that DEC are no longer around; their approach to computer architecture and instruction sets was absolutely inspired. Their engineers were geniuses! I loved the way that on a PDP-11, the hardware wasn't microcoded to increment the program counter when fetching an instruction. The increment happened as a matter of course because the instruction was fetched using the autoincrement general addressing mode on the general-purpose register that acted as the program counter. Very elegant! There were no special cases with the PDP (apart from the JSR instruction); everything on the machine (memory, registers, devices) could be accessed and operated on directly using any addressing mode. Brilliant! What a shock it was to later encounter the Intel 8080 and its primitive architecture and instruction set and have to be forever moving stuff in and out of registers to get any work done. Clunk, clunk, clunk, clunk. [quote]I miss those "good old days"...[/quote] Yes, indeed. Well, of course, we were all young back then. Sigh.

Mark Miller

Your message title reminded me of this little gem from 1961: an IBM 7094 singing "Daisy Bell" with voice synthesis, and its own accompaniment. From a little research I did on this, supposedly this demo was the inspiration for Kubrick to have HAL sing this song in "2001" as it was put to sleep. http://www.youtube.com/watch?v=41U78QP8nBk Edit: Meant to add, as we probably all know, the letters "H-A-L" are the letters that come before "I-B-M".

Sterling chip Camden

Compared to COBOL, Fortran is pure bliss. I wrote quite a bit of process-control software in F77 on RSX-11M, and that was awesome! Compared to more modern languages, though (even C, but especially Ruby, etc.), Fortran is pretty old-fashioned.

Tony Hopkinson

Many have interpreted "no SQL" as meaning you don't need to know anything about databases. Sort of like you don't need to know arithmetic if you have a calculator... This has only been exacerbated by the fact that many developers, when writing SQL, are incredibly bad at it. It's just yet another instance of viewing a productivity multiplier, which was meant to get more out of your best people, as a way of getting as much out of cheaper, less competent ones.

Mark Miller

Alan Kay has talked about the meme of evolution from time to time over the past few years. He addresses it because people tend to use a common sense notion of it as if evolution means "better" or "best," due to the notions of "survival of the fittest" and adaptation. He said that evolution should not connote "better" or "best" in qualitative analysis. What evolution is really about is "fit-ness," the ability of a system to survive and propagate itself, given its environment. He's used a metaphor that I don't think is quite up to snuff. He said that, "The most perfect being that's ever been born could get stepped on by an elephant." That's an aspect of evolution. Happenstance interactions that can make or break a system where the "gradient of challenge," as I'd call it, is so great, and hits so fast, that the system, no matter its potential, doesn't have an adequate response at a stage where it is vulnerable. So even though it's what might be regarded as preferable, it fails, but a less preferable system thrives, because it's more "fit" to its environment. And when you look at that, it's wise to consider what the environment actually is. I mean to emphasize this in our discussion of technology, and who's using it. The problem I have with Kay's analogy is the idea of "the most perfect being." I don't even know what that would be, and he didn't define it. I can see that what he was trying to do was set up a contrast between a system that's in his mind preferable vs. one that's so-so, and how "what's preferable" doesn't really mean a thing in evolutionary terms. In the scenario he discussed, the "elephant" survives just fine. The "most perfect being" does not. Not to say it always works this way in evolution. Just to say that it can, and that we should not mistake "survival" for "best."

apotheon

That's a fair assessment of the process of technological evolution.

Mark Miller

My understanding is that HTML is not a programming language. It's a page description language. Its features are page layout and font selection. Maybe HTML5 could be described as a programming language, though. I haven't looked at it. Several years ago I would've had what I thought was a good definition of programming, but now I'm not so sure. The "easy answer," which isn't necessarily that precise, is that programming describes a process. Now the best I can come up with is that programming describes relationships between computational entities that probably fit Turing's definition of what computing is.

apotheon

The TIOBE rankings use simplistic metrics, and it's difficult to identify exactly how its metrics end up defining "popular". The ranking can have its uses, but like the subject line for this comment says: take it with a grain of salt.

todd_dsm

I don't have all day to sit and blog about these things; this is only a highlight reel... Since I don't perform all the PHP downloads in order to bolster support for it, we can assume millions of other programmers are doing that. Some of the biggest open source projects on the web are based on PHP (Facebook, Yahoo, Wikipedia, WordPress, Digg), so you can make some assumptions about its future. Is it a great language - not really, but others seem to like it for simple tasks, like blogging. Blogging, wikis, and the social media apps themselves are huge, and therefore so is the supporting technology.

In reference to Python, there are benefits to being the last one to the table. The iPhone and Android were the last smartphones to market, for example, and they forced (or are forcing) companies like Palm and BlackBerry to make fundamental changes or go out of business. Python was not a big language right off, so developers of the language had time to work annoyances [i]out of the language[/i]. If you haven't used it yet, I recommend you do. If you know Python you can throw out utility languages like Perl, Bash, PHP, C, et al., and use just one language to accomplish zillions of tasks.

As I am only a DB novice, I can only say of nosql that it's just fine, but if I'm wrong, you should call these guys and tell them their design is flawed: https://developers.google.com/appengine/docs/python/datastore/ Since Python and its datastore are part of Google's long-term strategy for world domination, they will be happy to know your opinion up front; it will save them years of embarrassment. But your comments will also save YouTube, some of Yahoo's more recent apps, Google Groups, Gmail, and Google Maps, JP Morgan, Industrial Light & Magic, Walt Disney Feature Animation, the Blender 3D modeling program, the Los Alamos National Laboratory (LANL) Theoretical Physics Division, NASA, SchoolTool, and the CIA. http://wiki.python.org/moin/OrganizationsUsingPython They are all just sitting there, waiting for the phone to ring, so you can confirm their worst fears - that their designs are wrong.

As far as Python being "The Answer", which is a little dramatic, I'd refer to the above reference and take away not the number of projects using Python but the diversity of applications: web, reporting, analysis, machine learning, GUIs, compiled apps, filesystem scripting, etc. Is it the answer? Maybe. How many other languages can (nearly) replace so many others? And, in your words, you can "Write it once. Compile it once. Distribute to Android, Windows, *nix, or Mac". Does anything else do that? I would hardly use Java to perform system administration tasks that I would normally have used Bash or Perl to accomplish.

Listen, I'm not an evangelist for any of these languages. Personally, I don't think Ruby gets the props it deserves most of the time. But using old-school input mechanisms, like your eyes and ears, is helpful in spotting trends. The difficult part about spotting trends is that after you see and hear information, you then have to process it and make some sense of it all; those who can do this are called "analysts". And maybe TIOBE rankings are simplistic. Do you have a better reference? Or do you suggest we use no metrics at all during analysis?

Most people's problem with these new kids on the scene isn't the new kids - it's usually the person's inability to research and make personal changes, like letting go of old preferences. In short - don't shoot the messenger. It's not bad news, it's only [i]news[/i].

Sterling chip Camden

Probably the single greatest reason why Synergy/DE still exists as a language is because of the longevity of business applications. In vertical market software, applications don't turn over, they evolve. They're too big to rewrite from scratch, unless you can go several years without a product (or developing in parallel). Many Synergy/DE applications began their lives 20-40 years ago. Now they're on the web or the Windows desktop -- but a lot of the original code still remains.

Mark Miller

[i]There were no special cases with the PDP (apart from the JSR instruction); everything on the machine (memory, registers, devices) could be accessed and operated on directly using any addressing mode. Brilliant![/i] Reminds me of what it was like to program a Motorola 68000-series architecture. I learned assembly programming on it, and it made a lot of sense. Recently, as an exercise in learning about operating systems, I've gone way back to the MOS 6502. It reminds me of your description of the Intel architecture. The documentation says there are 3 registers, but to me they act like one big segmented register. In order to use most addressing modes you *have* to use the Accumulator. In order to address "zero page" memory (an 8-bit addressing mode) indirectly (i.e., dereference a pointer) you *have* to include the Y register in the address calculation. In order to address with 16 bits indirectly you *have* to include the X register. Yuck.

apotheon

While that comparison of COBOL and Fortran may not look very technically accurate, I wonder if it might be socially accurate -- like Java partisans making similarly derogatory references to Lisp and Smalltalk (and C, for that matter).

Mark Miller

In the IT business this is expressed by management's long-standing desire to "make the box work as specified," not understanding one whit that how one goes about that matters, that this requires rational thought around a model of "how the box responds to our plan." The dominant idea is, "Just make it work." If they need to spend a few million dollars more to make that happen, so be it, the thinking goes. Add more magic boxes to the system...

Sterling chip Camden

...if Kay was thinking of Smalltalk and C++ as the perfect person and the elephant, respectively.

AnsuGisalas

The thing is, evolution is not ever about individuals. The fitness it addresses is purely stochastic. Individual variation IS a factor, a driving force for evolution, but in the context of evolution the impact of individuals is relevant only as part of the conglomerate. This is also why evolution will only produce perfection as a fluke. It's entirely incremental, never working towards a finite endpoint, but rather in a general set of directions that can be summed up as "away from non-viability".

Slayer_

Formatting and stuff is supposed to be done through CSS.

apotheon

It's all really simplistic. Apparently, the primary criteria are "foo programming" mentions in search results over a range of sources and mentions of programming language names in job postings and resumes. This is from memory; I might be misremembering some detail. The overall story, though, is about like that, as I recall.

Mark Miller

...and they've confirmed what I've heard about which languages are popular, and which are not, at a point in time. Even this time, Java was near the top, which is what I'd expect. The thing was, all the other ones on the list were not where I expected them to be. So that was interesting. Plus, I think they claim their survey is international. So something could be happening halfway around the world that has nothing to do with what's happening here in the U.S. Re: the reliability of the rankings, I get what you mean. I looked for its description of how it did the survey, and all it said was that it looked at "the number of developers in each language," Google search terms, etc. I kind of wondered how they got the "number of developers in each language." I'm pretty sure they never contacted any of us to ask what we use. So I'm not sure how they got that number. Google search terms did seem tricky to me. I've searched on languages I'm not using, but I imagine the typical behavior is the opposite of that.

apotheon

QUOTE: I don't have all day to sit and blog about these things; this is only a highlight reel...

Maybe so, but the way you presented the list made it sound like Python *itself* was "the wave of the future", and not just that it was an example of its niche.

QUOTE: If you haven't used it yet, I recommend you do. If you know Python you can throw out utility languages like Perl, Bash, PHP, C, et al., and use just one language to accomplish zillions of tasks.

I'm familiar with Python, and it is certainly flexible and capable within its constraints. Of course, in that regard, it is not particularly different from Perl and Ruby, both of which I prefer over Python. There are cases where I would *not* use it to replace sh (bash is a blight on the face of the computing world, but sh has its uses) or C, because it is not as well suited to some tasks where sh and C are the go-to languages. I also wouldn't use it to replace Perl, but only because for cases where it would do roughly as well as Perl, I prefer Perl.

QUOTE: As I am only a DB novice, I can only say of nosql that it's just fine, but if I'm wrong, you should call these guys and tell them their design is flawed:

I snipped the rest of your ridiculous, reductio ad absurdum sarcasm. I'm not saying nosql is a bad idea, and you'd know that if you bothered to read what I said rather than just looking for excuses to attack people who don't use the newest technology for friggin' everything just because it's new. In fact, at least some (if not all) of the examples of nosql users you mention are using nosql databases as high-performance front ends with more traditional SQL relational databases on the back end to provide the benefits that nosql does not provide. As such, and as I already said, "Relational databases will continue to be quite useful for various purposes, and as 'noSQL' is defined these days it won't be enough to replace SQL for those purposes."

QUOTE: As far as Python being "The Answer", which is a little dramatic, I'd refer to the above reference and take away not the number of projects using Python but the diversity of applications: web, reporting, analysis, machine learning, GUIs, compiled apps, filesystem scripting, etc. Is it the answer? Maybe. How many other languages can (nearly) replace so many others? And, in your words, you can "Write it once. Compile it once. Distribute to Android, Windows, *nix, or Mac".

"One True Language" zealots always end up sounding silly.

QUOTE: Does anything else do that?

Yes.

QUOTE: I would hardly use Java to perform system administration tasks that I would normally have used Bash or Perl to accomplish.

Nor would I. Java is, for my taste at least, a miserable language. Why are you acting like I'm its champion?

QUOTE: Listen, I'm not an evangelist for any of these languages.

Really? You come off like a Python evangelist, with no reservations.

QUOTE: And maybe TIOBE rankings are simplistic. Do you have a better reference? Or do you suggest we use no metrics at all during analysis?

I suggest taking TIOBE rankings with a grain of salt, as I said. Don't put words in my mouth and try to refute what I didn't say as though it refutes what I did say, or you'll end up looking silly.

QUOTE: Most people's problem with these new kids on the scene isn't the new kids - it's usually the person's inability to research and make personal changes, like letting go of old preferences.

Right. Now you're saying I'm some kinda change-averse luddite. I have no problem with a "new kid" where it's actually a good idea to use it, but that requires actually understanding the technology well enough to know where it's best used, which means *not* using nosql, as currently descriptively defined, for *everything*, because there are places where that's a bad idea -- and in many cases where nosql is a good idea, it's a good idea to use it *with* more traditional SQL relational DBMSes, rather than as a replacement for them.

QUOTE: In short - don't shoot the messenger. It's not bad news, it's only news.

I'm not shooting the messenger. I'm just pointing out the fact that there's a lot of information missing from the message, though it is being presented as the whole story -- as if nosql being widely used for specific purposes means it should be universally used for all purposes, and as if Python is some kind of magical panacea that obsolesces C.

apotheon

I suspect that there is not enough volume of such business applications for the language to thrive if Synergex itself evaporated or dropped support for it, even if there was an independent open source implementation of the thing. This is basically the situation facing Pascal now; it survives mostly through inertia alone, now that Borland (the last big champion of a Pascal derivative) is no longer propping up the language family. As long as the corporate champion is there, though, it's a thriving (if relatively small, by comparison with things like C especially) user/developer community.

apotheon

It occurs to me that, keeping in mind what most geek-coders think of MBAs and middle management bureaucrats, what you said could be taken as a grave insult to people who prefer COBOL over Fortran. I'm *sure* you didn't mean it that way -- I think, maybe.

Sterling chip Camden

I remember, for instance, in Data General's version of COBOL on AOS, it had a "SCREEN SECTION" (yes, shouting was mandatory). In that section, you could define fields and their location on the screen in the same way as you would format a report. Cursor movement and editing shortcuts were all built in.

Mark Miller

...I could see where it was well suited to generating printed financial reports. It was practically a page description language in its own right, with some database and arithmetic logic included for good measure. The problem I saw with it is it seemed like that was the *only* thing it was good at. In the course I took on it, we had a section where we focused on application development, back before GUIs were popular, learning to format text on the screen and take user input. The thing is, I could do the same thing in Basic just as easily. I didn't see Cobol offering anything special in that regard. I understand that this niche of financial reporting has a lot of applicability in the business world. I just couldn't imagine using it for anything I wanted to do with computers.

Sterling chip Camden

COBOL makes a lot more sense than Fortran to those who think in business terms instead of scientific or math terms. It's just that business terms can be very limiting when you want to do anything else.

Mark Miller

I highly doubt if I were to change work from one IT operation to the next that it would deviate much from my description. Not only have I seen consistency in where I've worked, but I've heard plenty of other stories from software engineers to confirm what I've said. I generalized and exaggerated somewhat for effect in my last two sentences. Since a few people got on the subject of mainframes, I thought I'd throw that in, since I've read accounts of this actually happening. "Big iron" operations often go for the "few million dollar" and "add more boxes" solution, from what I hear, rather than a cheaper (say, $10,000) solution that involves hiring a contractor to create a less complex solution to the same problem.

Tony Hopkinson

Is this the first specification, or the last one plus six months, illuminated by 20/20 hindsight? To me it's just another trip down a corporate's interpretation of the word quality, which we all know is "good enough". The real problem isn't the attitude; it's that their overmastering desire for cheaper all too often obscures the fact that what they are getting isn't good enough.

Mark Miller

...when they talk about this, they're convinced the "elephant" was Sun and Java, not to mention the internet/web. The commercial Smalltalk community in the '90s was split between those who wanted it to follow the web, making it more of a server platform, and those who thought of it strictly as a desktop platform. The desktop platform-ers won the argument, and ST's prominence declined as a result. The reason they look to Java as the culprit is that many in the ST community moved to Java as the internet blasted off. This is one reason we've seen some of the developments that have taken place in the Java community, trying to bring new, advanced features to the language, and some of the frameworks and alternative languages that make the Java scene worth looking at sometimes. They brought the ST development culture with them into the Java community. While the ideas are good efforts, I personally don't think they look as nice in Java as they would in ST.

As for whether Kay would've applied the analogy to ST and C++: he might've applied it more strongly 12 years ago, but not today. He always regarded Smalltalk as a laboratory that would allow somebody to create something better in it. So I don't think he would ever regard it as equivalent to "the perfect person," because a "perfect" person wouldn't need to develop anything beyond where they were. I remember him complaining in his '97 OOPSLA keynote about Apple choosing C++ over Smalltalk for its "Pink" project in the early '90s. Today he doesn't regard any language as "the perfect person." He's called Lisp and Smalltalk "obsolete," though "better than a lot of what's out there now." I think this general point ties in well with his analogy. We got to where we are through an evolutionary process, and well... just look at it...

It was more of a metaphor to use in an argument against a popular notion of what the concept of evolution means. If he was making an analogy to anything in the technology world, it might be that good ideas can and do get "killed" in an evolutionary process, so let's not romanticize it. It does create "fit-ness," but other than that it's rather dumb about it, and we can do better than that. His suggested approach is to take some ideas from science (how scientists think, and perhaps a model here and there), apply them to computing, and allow researchers to follow their curiosity wherever it leads.

apotheon

Java's another possibility for the elephant, I guess. Of course, I think "elephant" might be a bit generous for either of them.

apotheon

The interactions in complex systems are, naturally, quite . . . complex. That would be why they're called complex systems, I imagine. That aside, though, the fundamental unit of evolutionary "behavior" (for lack of a better, conceptually simple term) is still the gene, and not the individual or the species, and that was my only point in responding on the subject of evolution here.

AnsuGisalas

The whole system involves a set of carriers, a set of scripts - in both paradigmatic and syntagmatic relations - and a set of environments. Now, the environments are combinations of global, local, and individual effects. A massive climate change is global, a flood is local, and catching the eye of a lion is individual. Bad luck plays a big role in evolution, and bad luck can happen to the best of them. Evolution as observed is the effect of the scripts, as expressed in their distributions among the carriers, and as filtered through the environments, counting only from birth until final procreation.

apotheon

Evolution isn't about individuals, and it isn't about species, either. It's about genes -- individual genes. It is merely the logical, impersonal outcome of the presence of genes under the circumstances within which those genes find themselves.

apotheon

It might -- if you're being generous.

apotheon

I tried commenting about evolution and genetics, and TR threw away my comment.