1. it's too much work to bother
2. it's likely to be broken by a later TR update
Here's to talking to you again, hale and hearty, in a couple of days. . . .
The point is that extinction happens when there is no environment in which the species can continue to thrive. So, as the vast variety of responses shows, there are environments in which Fortran continues, almost unchanged, to thrive after decades, while there are also species that have only been around a few years and no longer have a place - a few struggle on, but they're being supplanted by better-adapted species. Take Fortran out of its mostly scientific home and it gasps, flounders and dies. But it won't become extinct unless no-one finds it the best tool for the job they have to do - right now, without spending three months learning a new language that doesn't suit their scientific brain nearly as well as Fortran does.
The same analogy can be drawn in domestic lighting. Candles and oil lamps are still in use, as are many of the generations of electric lighting invented since. Some - probably incandescent lightbulbs - will eventually become extinct, but there will always be a box of candles on my shelf for when we get power cuts.
The problem I have with Kay's analogy is the idea of "the most perfect being." I don't even know what that would be, and he didn't define it. I can see that what he was trying to do was set up a contrast between a system that's in his mind preferable vs. one that's so-so, and how "what's preferable" doesn't really mean a thing in evolutionary terms. In the scenario he discussed, the "elephant" survives just fine. The "most perfect being" does not. Not to say it always works this way in evolution. Just to say that it can, and that we should not mistake "survival" for "best."
Individual variation IS a factor, a driving force for evolution, but in the context of evolution the impact of individuals is relevant only as part of the conglomerate.
This is also why evolution will only produce perfection as a fluke. It's entirely incremental, never working towards a fixed endpoint, but rather in a general set of directions that can be summed up as "away from non-viability".
Now, the environments are combinations of global, local and individual effects. A massive climate change is global, a flood is local, and catching the eye of a lion is individual. Bad luck plays a big role in evolution, and bad luck can happen to the best of them.
Evolution as observed is the effects of the scripts, as expressed in their distributions among the carriers, and as filtered through the environments, counting only from birth until final procreation.
That aside, though, the fundamental unit of evolutionary "behavior" (for lack of a better, conceptually simple term) is still the gene, and not the individual or the species, and that was my only point in responding on the subject of evolution here.
Re. whether Kay would've applied the analogy to ST and C++
He might've applied it more strongly 12 years ago, but not today. He always regarded Smalltalk as a laboratory that would allow somebody to create something better in it. So I don't think he would ever regard it as equivalent to "the perfect person," because a "perfect" person wouldn't need to develop anything beyond where they were. I remember him complaining in his '97 OOPSLA keynote about Apple choosing C++ over Smalltalk for its "Pink" project in the early '90s. Today he doesn't regard any language as "the perfect person." He's called Lisp and Smalltalk "obsolete," though still "better than a lot of what's out there now." I think this general point ties in well with his analogy. We got to where we are through an evolutionary process, and well...just look at it...
It was more of a metaphor to use in an argument against a popular notion of what the concept of evolution means. If he was making an analogy to anything in the technology world, it might be that good ideas can and do get "killed" in an evolutionary process, so let's not romanticize it. It does create "fitness," but other than that it's rather dumb about it, and we can do better. His suggested approach is to take some ideas from science (how scientists think, and perhaps a model here and there), apply them to computing, and allow researchers to follow their curiosity wherever it leads.
Are you kidding me? I expect this kind of thing from script kiddies and younger developers who are inexperienced, but not from adults with experience.
The answer is YOU. Specifically, your lack of understanding of relational databases and how to get to and work with that data efficiently and effectively. If all you ever work with is tiny data sources (a few hundred gigabytes), then those "one interface queries all" tools will work for you. However, when it comes to dealing with the important stuff that is measured in terabytes or more, the dumbed-down interfaces to SQL coding that let you skip writing and even understanding how to query the database fall flat on their face, and that's because the one-size-fits-all method doesn't work for all sizes.
Let's flip the idea and see how it sounds from the other direction: A LIST OF 11 Development Technologies That Refuse To Die
#11: Programming: With all the wonderful drag-and-drop software tools available, why does anyone still bother to write or even piece together existing code? If programmers would stop writing code and use only these dumbed-down tools, the problem would take care of itself, and writing software code could finally be put out of its misery.
Sounds dumb, doesn't it? The same goes for this "No SQL" mindset. It's this kind of attitude, that "I don't need no stinking SQL," that results in very poorly written applications that are unable to work with large data sets. What's worse is that it's often the simplest mistakes that cause the worst headaches, and all because some developer didn't want to be bothered with learning the SQL language.
On the upside, this means assured work for us DB people for many years to come.
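To make the point concrete, here's a rough sketch using Python's built-in sqlite3 (the table and column names are made up, and the real pain only shows up at real scale):

```python
# A toy illustration with a hypothetical "orders" table: the
# "I don't need no stinking SQL" pattern drags every row into the
# application and filters there; knowing SQL keeps the filter and
# the aggregate in the database, where the index can do its job.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, "west" if i % 2 else "east", i * 1.5) for i in range(10_000)])
conn.execute("CREATE INDEX idx_region ON orders (region)")

# The no-SQL-needed approach: fetch everything, filter in code.
# Harmless at 10,000 rows; fatal at terabytes.
rows = conn.execute("SELECT * FROM orders").fetchall()
west_total = sum(total for _id, region, total in rows if region == "west")

# Letting the database do the work instead.
(west_total_sql,) = conn.execute(
    "SELECT SUM(total) FROM orders WHERE region = ?", ("west",)).fetchone()

assert abs(west_total - west_total_sql) < 1e-6
```

Same answer both ways; only one of them still works once the table no longer fits in memory.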
Sort of like you don't need to know arithmetic if you have a calculator...
This has only been exacerbated by the fact that many developers, when writing SQL, are incredibly bad at it.
It's just yet another instance of viewing a productivity multiplier, which was meant to get more out of your best people, as a way of getting as much out of cheaper, less competent ones.
To me it's just another trip down a corporation's interpretation of the word "quality," which we all know is "good enough". The real problem isn't the attitude; it's that their overmastering desire for cheaper all too often obscures the fact that what they're getting isn't good enough.
I generalized and exaggerated somewhat for effect in my last two sentences. Since a few people got on the subject of mainframes, I thought I'd throw that in, since I've read accounts of this actually happening. "Big iron" operations often go for the "few million dollar" and "add more boxes" solution, from what I hear, rather than a cheaper (say, $10,000) solution that involves hiring a contractor to create a less complex solution to the same problem.
If languages were people, FORTRAN would be regarded as COBOL's fuddy-duddy spinster aunt.
You've got to be kidding me! It's the other way around, isn't it?
Fortran was always the sexy one, allowing geeks and other brainiacs to do hardcore tech- and science-oriented number crunching on cool, cutting-edge mean machines like CDCs. COBOL, on the other hand, was that crappy, verbose, supposedly English-like POS designed to let suits make IBM clunkers do boring stuff like work out people's gas bills.
And nothing much has changed. If you want to mathematically model climate change or nuclear detonations on a supercomputer, Fortran's the only way you're going to do it.
Fuddy-duddy my eye!
I understand that this niche of financial reporting has a lot of applicability in the business world. I just couldn't imagine using it for anything I wanted to do with computers.
And I agree with Mark Miller's assessment of evolution, that it's not always the best that survive. If that were true (not to start an OS war, but), DEC would still be a computer company and VMS would still be an OS player. I've worked with them all (CDC, Prime, HP, IBM, Apollo, DEC, Windows, Apple, Unix), and I remember Digital Research's CP/M. I found VMS to be second to none, in my humble opinion. I miss those "good old days"...
Edit: Meant to add, as we probably all know, the letters "H-A-L" are the letters that come just before "I-B-M".
. . . it's not always the best that survive. If that were true, . . . DEC would still be a computer company and VMS would still be an OS player.
It's terribly sad that DEC are no longer around; their approach to computer architecture and instruction sets was absolutely inspired. Their engineers were geniuses!
I loved the way that on a PDP-11, the hardware wasn't microcoded to increment the program counter when fetching an instruction. The increment happened as a matter of course because the instruction was fetched using the autoincrement general addressing mode on the general-purpose register that acted as the program counter. Very elegant! There were no special cases with the PDP-11 (apart from the JSR instruction); everything on the machine (memory, registers, devices) could be accessed and operated on directly using any addressing mode. Brilliant!
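For the youngsters, here's a toy sketch in Python of what made that so elegant. It's not real PDP-11 behavior (real memory is byte-addressed and the PC steps by 2), just the shape of the idea:

```python
# Toy model: instruction fetch is nothing more than the "(Rn)+"
# autoincrement addressing mode applied to R7, the program counter.
memory = [0o012700, 0o000005, 0o000777]   # made-up word-addressed "memory"

registers = [0] * 8   # R0..R7; R7 doubles as the PC
PC = 7

def autoincrement(reg):
    """'(Rn)+': use the register's contents as an address, then bump it."""
    word = memory[registers[reg]]
    registers[reg] += 1
    return word

def fetch():
    # No dedicated PC-increment hardware; fetching *is* '(R7)+'.
    return autoincrement(PC)

while registers[PC] < len(memory):
    print(oct(fetch()))
```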
What a shock it was to later encounter the Intel 8080 and its primitive architecture and instruction set and have to be forever moving stuff in and out of registers to get any work done. Clunk, clunk, clunk, clunk.
I miss those "good old days"...
Yes, indeed. Well, of course, we were all young back then. Sigh.
Reminds me of what it was like to program a Motorola 68000-series architecture. I learned assembly programming on it, and it made a lot of sense. Recently, as an exercise in learning about operating systems, I've gone way back to the MOS 6502. It reminds me of your description of the Intel architecture. The documentation says there are 3 registers, but to me they act like one big segmented register. In order to use most addressing modes you *have* to use the Accumulator. In order to address "zero page" memory (an 8-bit addressing mode) indirectly (i.e. dereference a pointer) you *have* to include the Y register in the address calculation. In order to address with 16 bits indirectly you *have* to include the X register. Yuck.
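For anyone who never had the pleasure, here's roughly how those two indirect modes work, sketched in Python. The addresses are made up, and this is my reading of the modes, not a cycle-accurate model:

```python
# The 6502's two indirect addressing modes. Either way the 16-bit
# pointer has to live in zero page, and you're forced to involve
# Y or X respectively.

def indirect_indexed(mem, zp, y):
    """(zp),Y: read a 16-bit pointer from zero page, then add Y."""
    lo = mem[zp]
    hi = mem[(zp + 1) & 0xFF]      # pointer fetch wraps within zero page
    return ((hi << 8) | lo) + y

def indexed_indirect(mem, zp, x):
    """(zp,X): add X to the zero-page address first, then read the pointer."""
    base = (zp + x) & 0xFF
    lo = mem[base]
    hi = mem[(base + 1) & 0xFF]
    return (hi << 8) | lo

mem = {0x10: 0x00, 0x11: 0x80, 0x14: 0x34, 0x15: 0x12}
print(hex(indirect_indexed(mem, 0x10, 5)))   # 0x8005
print(hex(indexed_indirect(mem, 0x10, 4)))   # 0x1234
```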
The original product idea for NT was that it would directly compete with Unix as a command-line OS. In the end, MS decided against that and gave it a GUI, but put in the "cmd.exe" app to allow command-line control of the system.
People being "fluent" in a language isn't quite enough in the long term.
C isn't going anywhere; it's the language of drivers. Unless you want a lull in the conversation between your HDD and everything else, this one's going to be around forever.
Java will pass once the Mega-Corps realize the world has turned to cloud computing and Python. But they will have to abandon years of hard work with Java. It's hard enough for them to agree on the color of an icon, let alone absorbing that the rest of the world is changing direction. Cornflower blue, anyone?
Python, (a little) PHP, HTML5 and NoSQL are the wave of the future. But, like steering the Titanic, companies not only need the inclination ($) but the time. It takes longer to steer big ships.
Here's "the list" of programming languages:
By clicking on their names you can get some pretty charts, e.g. Java vs. Python vs. Ruby. You can get a sense of who's comin' and goin'.
Fun piece, JJ.
PHP is actually a technology looking for an excuse to die, I think. It just hasn't found that excuse yet. It survives not because it's an excellent critter for its evolutionary niche; it survives because, against all odds, nothing better has appeared in that niche and gained enough traction for people to notice it yet -- though some very-nearly-there options do exist, and they're sufficiently better-designed languages that a lot of people choose them over PHP despite some of PHP's specific niche-fitting benefits.
You present Python as "The Answer" for certain types of programming, as if there are no competitors. It's just a visible example for a particular niche. Others exist as well; it's not the PHP of its niche, the sole tool that truly satisfies the needs of the niche. It's just one of several.
Since I don't perform all the PHP downloads myself in order to bolster support for it, we can assume millions of other programmers are doing so. Some of the biggest open source projects on the web are based on PHP (Facebook, Yahoo, Wikipedia, WordPress, Digg), so you can make some assumptions about its future. Is it a great language? Not really, but others seem to like it for simple tasks, like blogging.
Blogging, wikis, and the social media apps themselves are huge, and therefore so is the supporting technology.
In reference to Python, there are benefits to being the last one to the table. The iPhone and Android were the last smartphones to market, for example, and they forced (or are forcing) companies like Palm and BlackBerry to make fundamental changes or go out of business.
Python was not a big language right off, so the developers of the language had time to work the annoyances out of it. If you haven't used it yet, I recommend you do. If you know Python you can throw out utility languages like Perl, Bash, PHP, C, et al., and use just one language to accomplish zillions of tasks.
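A quick made-up example of what I mean - the kind of chore you'd otherwise cobble together from find/grep in Bash:

```python
# Roughly `find /var/log -name '*.log' -size +1M`, but in plain Python,
# and it runs unchanged on Windows, *nix, or Mac. The path is just an
# example.
import os

for dirpath, _dirnames, filenames in os.walk("/var/log"):
    for name in filenames:
        path = os.path.join(dirpath, name)
        if name.endswith(".log") and os.path.getsize(path) > 1_000_000:
            print(path)
```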
As I am only a DB novice, I can only say of NoSQL that it's just fine, but if I'm wrong, you should call these guys and tell them their design is flawed:
Since Python and its datastore are part of Google's long-term strategy for world domination, they will be happy to know your opinion up front; it will save them years of embarrassment. But your comments will also save YouTube, some of Yahoo's more recent apps, Google Groups, Gmail, and Google Maps, JP Morgan, Industrial Light & Magic, Walt Disney Feature Animation, Blender (the 3D modeling program), the Los Alamos National Laboratory (LANL) Theoretical Physics Division, NASA, SchoolTool, and the CIA.
They are all just sitting there, waiting for the phone to ring, so you can confirm their worst fears - that their designs are wrong.
As far as Python being "The Answer", which is a little dramatic, I'd refer to the above reference and take away not the number of projects using Python but the diversity of applications: web, reporting, analysis, machine learning, GUIs, compiled apps, filesystem scripting, etc. Is it the answer? Maybe. How many other languages can (nearly) replace so many others? And, in your words, you can "Write it once. Compile it once. Distribute to Android, Windows, *nix, or Mac".
Does anything else do that?
I would hardly use Java to perform system administration tasks that I would have normally used Bash or Perl to accomplish.
Listen, I'm not an evangelist for any of these languages. Personally I don't think Ruby gets the props it deserves most of the time. But using old-school input mechanisms, like your eyes and ears, is helpful in spotting trends. The difficult part about spotting trends is that after you see and hear information, you then have to process it and make some sense of it all; those who can do this are called "analysts".
And maybe TIOBE rankings are simplistic. Do you have a better reference? Or do you suggest we use no metrics at all during analysis?
Most people's problem with these new kids on the scene isn't the new kids - it's usually the person's inability to research and make personal changes, like letting go of old preferences.
In short - don't shoot the messenger. It's not bad news, it's only news.