
10 dying IT skills

One of the challenges of working in the IT field is staying on top of emerging technologies - while letting go of those that are becoming obsolete. This Global Knowledge article lists 10 areas that are fading into obscurity.

There are some things in life, like good manners, that never go out of style. And there are other things, like clothing styles, that fall in and out of fashion. But when an IT skill falls out of favor, it rarely ever comes back. Here's our list of 10 dying IT skills. If any of these skills is your main expertise, perhaps it's time to think about updating your skill set.

Note: This article is based on a Global Knowledge white paper by Linda Leung.

1: Asynchronous Transfer Mode

ATM was popular in the late 90s, particularly among carriers, as the answer to overworked frame relay for wide-area networking. It was considered more scalable than frame relay and offered inherent QoS support. It was also marketed as a LAN platform, but that was its weakness. According to Wikipedia, ATM failed to gain wide acceptance in the LAN, where IP makes more sense for unifying voice and data on the network. Wikipedia notes that ATM will continue to be deployed by carriers that have committed to existing ATM deployments, but the technology is increasingly challenged by the speed and traffic-shaping requirements of converged voice and data networks. A growing number of carriers are now using Multi-Protocol Label Switching (MPLS), which integrates the label-switching capabilities of ATM with the packet orientation of IP. IT skills researcher Foote Partners listed ATM in its IT Skills and Certification Pay Index as a non-certified IT skill that decreased in value in the last six months of 2008.
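To make the label-switching idea concrete, here is a minimal sketch in Python of the lookup an MPLS label-switching router performs: forwarding is driven by a short fixed-length label rather than a longest-prefix IP match. The table contents and names are invented for illustration.

    # Toy label-forwarding table: incoming label -> (outgoing label, next hop).
    # All values here are invented for illustration.
    lfib = {
        100: (200, "edge-router-a"),
        101: (201, "edge-router-b"),
    }

    def label_switch(label, payload):
        """Swap the label and forward; the IP payload is never inspected."""
        out_label, next_hop = lfib[label]
        return out_label, next_hop, payload

    print(label_switch(100, b"ip-packet-bytes"))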

2: Novell NetWare

Novell's network operating system was the de facto standard for LANs in the 1990s, running on more than 70% of enterprise networks. But Novell failed to compete with the marketing might of Microsoft. Novell tried to put up a good fight by acquiring WordPerfect to compete with Microsoft Office, but that move failed to ignite the market, and Novell eventually sold WordPerfect to Corel in 1996. Novell certifications, such as Certified Novell Engineer, Master Certified Novell Engineer, Novell Certified Directory Engineer, and Certified Novell Administrator, were once hot in the industry. But now, they are featured in Foote Partners' list of skills that decreased in value in 2008. Hiring managers want Windows Server and Linux skills instead.

3: Visual J++

Skills pay for Microsoft's version of Java declined 37.5% last year, according to the Foote Partners study. The life of J++, which shipped with Microsoft Visual Studio 6.0, was not a smooth one. Although Sun Microsystems licensed Java to Microsoft to develop J++, Microsoft failed to implement some features of the official Java standard while adding extensions of its own. Sun sued Microsoft for licensing violations in a legal wrangle that lasted three years. Microsoft eventually replaced J++ with Microsoft .NET.

4: Wireless Application Protocol

Yes, people were able to browse the Internet in the late 90s, before Apple's iPhone. Web site operators would rewrite their content in WAP's Wireless Markup Language (WML), enabling users to access Web services such as email, stock quotes, and news headlines on their cell phones and PDAs. WAP was not well received at the beginning because WAP sites were slow and lacked the richness of the Web. WAP also saw uneven uptake worldwide because of differing wireless regulations and standards. WAP has since evolved and is a feature of Multimedia Messaging Service, but there is now a new generation of competing mobile Web browsers, including Opera Mobile and the iPhone's Safari browser.
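For readers who never saw one, a WAP "page" was a deck of WML cards served with its own MIME type, text/vnd.wap.wml. A minimal sketch in Python of what a site operator produced (the deck content is invented; the doctype is the standard WML 1.1 declaration):

    # A minimal WML 1.1 deck; phones fetched it like a Web page,
    # but the server had to send Content-Type: text/vnd.wap.wml.
    WML_DECK = """<?xml version="1.0"?>
    <!DOCTYPE wml PUBLIC "-//WAPFORUM//DTD WML 1.1//EN"
      "http://www.wapforum.org/DTD/wml_1.1.xml">
    <wml>
      <card id="news" title="Headlines">
        <p>Markets up. Full story on the desktop site.</p>
      </card>
    </wml>"""

    def wap_response():
        # Returned by whatever CGI or servlet front end the site used.
        return {"Content-Type": "text/vnd.wap.wml"}, WML_DECK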

5: ColdFusion

ColdFusion users rave that this Web programming language is easy to use and quick to jump into, but as many other independent software tool makers have found, it's hard to compete with products backed by expensive marketing campaigns from Microsoft and others. The language was originally released in 1995 by Allaire, which was acquired by Macromedia (which was itself purchased by Adobe). Today, it has been overshadowed by Microsoft .NET, Java, PHP, and the framework of the moment: open source Ruby on Rails. A quick search of the Indeed.com job aggregator site returned 11,045 jobs seeking PHP skills, compared to 2,027 ColdFusion jobs. Even Ruby on Rails, which is a much newer technology - and which received a major boost when Apple packaged it with OS X v10.5 in 2007 - returned 1,550 job openings on Indeed.com.

6: RAD/extreme programming

Back in the late 90s and early 2000s, the rapid application development (RAD) and extreme programming (XP) development philosophies resulted in quicker and more flexible programming that embraced the ever-changing needs of customers during the development process. In XP, developers adapted to changing requirements at any point during the project life rather than attempting to define all requirements at the beginning. In RAD, developers embraced interactive use of structured techniques and prototyping to define users' requirements. The result was accelerated software development. Although these skills were consistently the highest paying in Foote Partners' survey since 1999, they began to lose ground in 2003 due to the proliferation of offshore outsourcing of applications development.
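One concrete XP practice was test-first development: write a small failing test, then just enough code to make it pass, and let the accumulated tests absorb requirement changes. A minimal sketch with Python's standard unittest module (the fare-calculation example is invented):

    import unittest

    def fare(miles):
        # Just enough code to satisfy the current tests - the XP way.
        return 2.00 + 0.50 * miles

    class FareTest(unittest.TestCase):
        # Written before fare() existed; the tests drive the implementation
        # and act as a safety net when requirements change mid-project.
        def test_base_fare(self):
            self.assertEqual(fare(0), 2.00)

        def test_per_mile_rate(self):
            self.assertEqual(fare(4), 4.00)

    if __name__ == "__main__":
        unittest.main()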

7: Siebel

Siebel is one skill that makes a recurring appearance in Foote Partners' list of skills that have lost their luster. Siebel was synonymous with customer relationship management in the late 90s and early 2000s, and the company dominated the market with a 45% share in 2002. Founded by Thomas Siebel, a former Oracle executive with no love lost for his past employer, Siebel competed aggressively with Oracle until 2006, when it was ultimately acquired by the database giant. Siebel's complex and expensive CRM software required experts to install and manage. That model lost out to the new breed of software-as-a-service (SaaS) packages from companies such as Salesforce.com, which deliver comparable software over the Web. According to ITJobsWatch.com, Siebel experts command an average salary of GBP52,684 ($78,564), but that's a slide from GBP55,122 a year ago. Siebel is ranked 319 in the job research site's list of jobs in demand, compared to 310 in 2008.

8: SNA

The introduction of IP and other Internet networking technologies into enterprises in the 1990s signaled the demise of IBM's proprietary Systems Network Architecture. According to Wikipedia, the protocol is still used extensively in banks and other financial transaction networks, so SNA skills continue to appear in job ads. But permanent positions seeking SNA skills are few and far between. ITJobsWatch.com noted that there were three openings for permanent jobs between February and April, compared to 43 during the same period last year. Meanwhile, companies such as HP offer consultants with experience in SNA and other legacy skills, such as OpenVMS and Tru64 UNIX, for short-term assignments.

9: HTML

We're not suggesting the Internet is dead, but with the proliferation of easy-to-use WYSIWYG HTML editors enabling non-techies to set up blogs and Web pages, Web site development is no longer a black art. Sure, there's still a need for professional Web developers, but a good grasp of HTML isn't the only skill required of a Web developer. Professional developers often have expertise in Java, AJAX, C++, and .NET, among other languages and technologies. HTML as a skill lost more than 40% of its value between 2001 and 2003, according to Foote Partners.

10: COBOL

Is it dead or alive? This 50-year-old programming language often appears in lists of dying IT skills. But it also appears in as many articles about organizations with legacy applications written in COBOL that are having a hard time finding workers with COBOL skills. IBM cites statistics that 70% of the world's business data is still being processed by COBOL applications. But how many of these applications will remain in COBOL for the long term? Even IBM is pushing its customers to "build bridges" and use service-oriented architecture to "transform legacy applications and make them part of a fast and flexible IT architecture."


About the author

Linda Leung is a senior IT journalist with 20 years' experience editing and writing news and features for online and print. She has extensive experience creating and launching news Web sites, including, most recently, independent communities for customers of Cisco Systems and Microsoft.

84 comments
ron

Woo hoo! My Banyan Vines certification is still safe!

The 'G-Man.'

All these 'geeks' who write on the web about random tech stuff. I think this will die out over the next few years and only the good ones will survive!

AvijitCRM

Siebel is still rocking and will be rocking at least for the coming 5 years... Avijit

mario.aguirre

I think, at least here in South America, that one skill that once was widely used (and some apps are still running on it, mainly on counters) was programming in Clipper!!! COBOL is still alive and kicking when you swim in AS/400 and iSeries mainframes, but definitely, Clipper has almost reached its extinction...

pohsibkcir

Excellent column Ms Leung ... I would suggest, however, that the one dying IT skill left off your list is the IT Department itself. Shortly after the .com bust, early in the first decade of the 21st century, technical schools started churning out IT professionals like candy, and most of the training they received was centered on currently popular technologies alone, or with only a summary of technologies outside those felt to be the most necessary. The job shops were flooded with a new generation of IT specialists, whose learning institutions have turned an indispensable, skill-set-driven industry into an industry of interchangeable and disposable parts. The hourly wage for a qualified IT pro has gone from $50 plus to an average of $35 an hour or less. When the need for legacy support goes away, so will the need for IT professionals. Like most technical support, IT issues will be outsourced to someone in a different country, who will input the customer's issues into their workstation and repeat whatever solutions present themselves on the computer monitor in front of them. Loved the column, Linda ... Thank you.

darpoke

With the general atmosphere of 'if it ain't broken, don't fix it' that is prevalent here (I assume from the current incumbent COBOL engineers), I can't help feeling that everyone's missing a trick. The fact is that many things that aren't 'broken' can still be improved upon. Surely that's one of the prime defining factors of our dynamic industry? When I was at college I used to work in a supermarket, which is one of the best places to find the true bottom-line approach to spending. Our department would lose a member of staff (out of about 8 or 9 people, total), and the remaining people would pull together out of necessity to make up the shortfall. The result? That person would never be replaced. This view may reduce spending in the short term - it's hard to argue with those numbers. But what about lost productivity or revenue? You can't expect that 8 people can do the same amount of work as 9 people and achieve the same level of quality control or attention to detail. Conservation of energy applies to staff output too. The numbers don't show this, however, which is how it gets conveniently ignored for so long. If record companies can legally pursue damages for perceived losses from downloading, which blatantly ignores the fact that most filesharers would never have paid them in the first place, then why is it so widely accepted that COBOL, for example, is the best way to do something 'because it works'? That attitude would kill most technical manufacturing companies. Since when was good enough good enough?

sjdorst

It seems to be a nearly lost art. Debugging cables to allow communication between 2 serial devices with hardware handshaking has gone the way of the dodo, at least for us. We still use serial printers, but we have them connected through a networked DigiOne. We still have one serial application that we can eliminate once we find a Windows COM port redirector that works well.
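For the curious, the handshaking settings that used to be debugged with a breakout box now live in a few constructor arguments. A minimal sketch using the third-party pyserial package; the port name and line settings are assumptions for illustration:

    import serial  # third-party pyserial package

    # Port name and line settings are illustrative; match them to the device.
    port = serial.Serial(
        "COM1",                        # e.g. "/dev/ttyUSB0" on Unix
        baudrate=9600,
        bytesize=serial.EIGHTBITS,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,
        rtscts=True,                   # hardware handshaking (RTS/CTS)
        timeout=5,
    )
    port.write(b"test line\r\n")
    port.close()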

jkrichen

As your comment on HTML indicates, it is important to note that HTML still lives, but it lives through its supporting technology and the availability of tools that foster the initial creation of web pages. HTML by itself will not highlight a resume. However, any web developer who is not capable of understanding and modifying HTML when the need arises lacks the skill, in my opinion, to be considered a professional web developer. Jack

Techtoo

Disagree on HTML being a dying skill. Assembly languages should be in the list.

Murfski-19971052791951115876031193613182

At the risk of sounding like a dinosaur, I can remember working hard to learn how to handle the CP/M command language, and then doing batch files in DOS. We've come a long way since then, but I sometimes still miss it.

sukhen

When I started my career in IT in 1983, I used to hear that COBOL was dying. When I left IT in 2003, I was hearing the same thing. Now, about halfway through 2009, I am still hearing the same thing about COBOL. So many languages and software professionals have come and gone, but not COBOL. Now I am confident that COBOL will never go.

mikifinaz1

The year two thousand rolled around and COBOL and FORTRAN programmers could name their price. I worked at a bank and most people don't realize how close we came to major disruptions in many areas as these old programmers scrambled to check over all that old code for date issues. We had a lead coder that was almost eighty and he ran crash courses in COBOL so that we could scan over all the code working as his eyes to spot the problems. So, never say never, because some day it might come back to bite you in the ass. Like having to work with Novell or J++ in some rare case.

Justin James

People have been talking about replacing COBOL applications for 20 years now, and they will still be talking about it 50 years from now. While it certainly isn't a "hot skill", I would much rather be an experienced COBOL developer right now than an experienced .Net or Java developer, in terms of job prospects. While COBOL itself is not a difficult language to learn, the systems written in it are massive; it takes years to really integrate into many projects maintaining/extending COBOL programs, and once you are integrated, you are very valuable. And why won't COBOL go away? Simply put, it is easier to train COBOL programmers than it is to rewrite a million-line application. By the time the rewrite is done, tested, and functionally equivalent, it's been 10 years and $20 million, and the requirements have changed so much that the application needs 2 years' worth of work to be caught up, and the system it was implemented in is now obsolete anyway. Sure, lots of folks are building out other features in other systems and "bridging" to the legacy apps, but the fact is, those legacy apps are here forever. They are the nuclear waste of IT. J.Ja

chako11

is it .. best of luck ..

Tony Hopkinson

good enough, (which it is, by the way). COBOL was specifically designed to do what they want on the hardware they use with the operational models they use. The improvement for a tech switch is really hard to quantify, and extremely dependent on a huge range of factors. The cost, though, is all too easy to figure out, and the risks are hideous. Even if everything went extraordinarily well, that's replacing your hardware, infrastructure, and business operations, retraining or replacing people, and a complete redesign and reprogramming. I'd say the big boys would still be looking at attempting to recoup millions from a productivity improvement that is more of an aesthetic than a reality. If you are up for buying that, perhaps you'd be interested in this bridge. :p When was good enough good enough? Always, in business, even in tech companies. They don't come up with new stuff for a laugh.

b4real

Serial communications are still the lifeblood of many industrial automation and technology solutions. Unfortunately many new systems don't have RS-232 ports anymore.

manasseh

I thought about mentioning Serial Cabling but didn't bother until I saw your post. In November I got a customer (referral from a friend of a friend - previous IT consultant all of a sudden went to Europe, either a secret agent or just plain nuts, I'm not sure which) with a server crash. Turns out they had Netware 4 - haven't seen that in a LONG time - no Netware disks, no system documentation, etc. I managed to rebuild the system with a Netware 5 CD I still had from a customer that went out of business and I guessed at the serial printer settings and managed to get it all working. Then just a few weeks ago they called with a printer problem. I suspected the old cabling (silver satin end-to-end through the ceiling with connectors falling apart!) was the problem. Turns out it was a bad auto-cutter. Replaced the printer, told them how bad the cabling was and within hours after I left another printer stopped working. I came back with my good ol' serial breakout boxes, installed a proper wall jack, patch cable, etc. and everything is OK again. I am sure I am in a VERY small minority that had the proper tools & knowledge to fix that cable.

manasseh

"Professional developers often have expertise in Java, AJAX, C++, and .NET, among other programming languages" It is pretty hard to write a web page (I am talking about WRITING, not DESIGNING) without knowledge of HTML. The difference now is that many, probably most, web programmers are writing in another language (ColdFusion, PHP, .NET, Java, etc.) and using that language to create the HTML. To program an AJAX application you need to know JavaScript for the communication but the end results are normally in HTML output by the JavaScript - again, you can't do that if you don't know HTML. Since it is nearly impossible to do any web page programming without knowing HTML, it simply doesn't need to be mentioned in job requirements. Assembly languages are a different story. There will always be a need for assembly language programmers, but the particular assembly language needed changes over time. I learned Univac 1100 assembler in college and I wrote a little bit of 8086 assembler in the early days of PCs - if I had an assembly language project now it would almost definitely be in a new language - probably more different than those 2 than the differences between ColdFusion & PHP.

Turin73

Assembly will end up like COBOL: it's going ... oh no it's not. It is in the manufacturing plants, steel factories, etc. It will be there forever.

b4real

Not really dying - but not really gone either.

MPG187

Are you talking about DOS commands? Sometimes I still use them. Lots of tasks in Windows can be done in the command prompt, like making accounts and managing files, because repetitive things can be done using batch files. Just today I uploaded a file on my desktop to my FTP site, and it was easier to type in "FTP" and "lcd Desktop" and "put filename" than to browse to my desktop folder.
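The same upload can be scripted end to end; a minimal sketch using Python's standard ftplib (host, credentials, and filename are placeholders):

    from ftplib import FTP

    # Host, credentials, and filename are placeholders.
    with FTP("ftp.example.com") as ftp:
        ftp.login("user", "password")
        with open("report.txt", "rb") as f:
            ftp.storbinary("STOR report.txt", f)  # the scripted "put"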

rmerchberger

The company I work for still utilizes dot-matrix printers for multipart forms. Twice in the last year I was called upon to write batch scripts (in XP) to send printer codes to the printer to accomplish a particular task. I was the only person available who 1) could remember basic printer codes and 2) knew how to write batch scripts with I/O redirection. And... if this "trusted computing" crap comes to fruition, I dare say those of us who still dabble in the "black arts" may become very popular again. I keep several old computers around just for that reason; to make sure I have hardware that the gubbermint didn't have a hand in designing their backdoors into. I still use my Tandy Model 200 often, and I still know how to use OS-9... not the Macintosh one, the original _MicroWare_ one! It's like I always say: "If it's stupid, but works, it ain't stupid!" Laterz! Roger "Merch" Merchberger
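The batch trick being described, redirecting raw control codes to the printer port, looks like this in Python; the port name is illustrative, and ESC @ is the standard Epson ESC/P initialize command:

    # Send a printer reset (ESC @, Epson ESC/P) followed by text,
    # roughly what the old "echo codes > LPT1" batch redirection did.
    with open("LPT1", "wb") as printer:   # port name is illustrative
        printer.write(b"\x1b@")           # ESC @ = initialize printer
        printer.write(b"Hello, multipart form\r\n\f")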

adornoe

COBOL is IMMORTAL. I've been gone from the COBOL world for about 9 years and I'm amazed at how it just keeps on ticking. It's the Timex of the IT world. It will never die. For more than 20 years, people have been predicting its demise and, lo and behold, some 20 years later, it's still a dominant programming language. I remember people also predicting, some 10 years ago, that C and C++ would overtake COBOL in use and importance and that COBOL would eventually be replaced by those languages. Some 10 years later, it is C and C++ that are facing extinction and COBOL is still ticking along. In fact, the companies who continue to develop COBOL and its OOP versions will still be with us some 10 or more years from now, laughing at those who continue to predict the demise of COBOL. In fact, COBOL was a great idea when it was born some 50+ years ago, and it will still be a great idea some 20 years from now. If anything, it should never be allowed to die. No other language can match its features for programming.

Tony Hopkinson

CDate rolls over 2015, I know of at least one place where that's going to bite an arse, because I was told not to bother fixing it in 99 for Y2K. Apparently that code will be gone by then. :(
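For readers who missed the Y2K era, the cheap fix being alluded to is two-digit-year windowing around a pivot; a minimal sketch (the pivot value is an arbitrary assumption):

    def expand_year(two_digit, pivot=30):
        """Classic windowing: 00-29 -> 2000s, 30-99 -> 1900s.

        Cheap to retrofit, but it only postpones the rollover - here to
        2030 - which is exactly the kind of deferred bite described above.
        """
        return 2000 + two_digit if two_digit < pivot else 1900 + two_digit

    assert expand_year(15) == 2015
    assert expand_year(99) == 1999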

Peleg

I did a bit of COBOL in the '80s. I was a Fortran/Assembly real-time programmer in between jobs and got a temporary position doing COBOL for the old FSLIC. I literally learned the language well enough on the job in about two weeks that I was praised by the client manager for my good work. Now, I'm no genius -- I've worked with such people and I'm not that good. So, I'd have to agree that one of COBOL's strengths is that it is easy to learn. But I think COBOL's real strength, and perhaps one reason it manages to hold on against all odds, is that it is a really good language for solving the problems it was designed to solve. It seemed, at the time, the best tool for the job. I think that business data processing requires a little bit of very accurate arithmetic, a good bit of string manipulation, and a lot of I/O. I think that describes COBOL as I understand it. While no one seemed to notice it at the time, it also allowed the creation of complex nested data structures that I always had to struggle to create in Fortran and Assembly, and that really impressed me. The only mainstream language that I recall supporting such data structures at the time was Pascal, but check me on that. C was still coming up at that time, if I remember. I came away with the distinct impression that for a lot of what I did in Fortran, COBOL would have been a better language for the task. I entertained the idea of using COBOL for some of the crunching that I had to do in my real-time work and only using Fortran and Assembly for those time-critical and device-interface tasks. But I never told anyone because I knew all the Fortran/Assembly language programmers I worked with would have had me locked up in a rubber room for such heresy. What I learned from that experience is to never fall in love with a language, but to acquire as large a bag of tricks as I can and to pull out the trick that best suits the task at hand.
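That "very accurate arithmetic" point is easy to demonstrate: COBOL's fixed-point decimal fields (e.g. PIC 9(7)V99) avoid binary floating-point rounding, something most newer languages need a library for. A short sketch using Python's standard decimal module:

    from decimal import Decimal

    # Binary floats drift on money values; fixed-point decimals do not.
    print(0.10 + 0.20)                        # 0.30000000000000004
    print(Decimal("0.10") + Decimal("0.20"))  # 0.30

    # Roughly what a COBOL PIC 9(7)V99 field guarantees for free:
    balance = Decimal("1000.00") + Decimal("0.10") * 12
    print(balance)                            # 1001.20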

major.malfunction

I ran an IT dept for 10 years at a company that based its software (2 million plus lines of code) on COBOL. Every day was a struggle to go to work, simply because all I did was fight with the COBOL dinosaurs who not only were stuck in 1980, they actually thought that EVERYONE else was using the wrong programming languages. The "problem" with COBOL is that the people that have been using it for a long time are entrenched in old-school thinking and stuck in old ways of solving today's problems. Imagine getting into arguments with these fossils who think that a 9600 baud modem is faster than using a VPN over the Internet. And why use SSH when telnet will do the same thing... even in the clear over the Internet! Furthermore, when your COBOL programmers are DYING at a faster rate than you can replace them, it's time to get a newer language that you can actually hire people for! I saw more COBOL programmers die from natural causes or go into retirement than I saw get hired in the last 5 years! The youngest programmer was probably 40!! And the other problem is that once you have 2 million lines of code you have been using for 20 years, NOBODY in that flock knows how to do it newer or better. Basically, their whole careers have been spent fixing or changing a few lines of existing code for customer requests or bug fixes. So now if they need to rewrite something from the ground up, 99% of them have no idea how to do it, since it has always been there already for them. The biggest cry is always "That will take like 2 years to switch to another language!" And then 2 years later, everyone is still whining about the current system and I'm saying "You know, we could have been in beta by now!" Things change and move on, especially in technology. It does NOT matter if it still works. If that was the case, we'd all still be using water wheels or steam power. Sure they worked and were reliable, but guess what, we found easier and quicker ways. The whole thing is about future support and paths to it. If all my programmers are in their 40s and 50s and I can't find anybody to hire... what's my 5 and 10 year plan? Teaching new programmers COBOL doesn't cut it either. The kids fresh out of college just don't want to learn it, and the ones that do are just desperate for any work they can find. Meaning, they suck as programmers to begin with and nobody will hire them for the stuff they supposedly know. So I just hired a below-average programmer that doesn't even know the stuff he went to school for, but now I will teach him a legacy language because I'm so desperate? If COBOL was relevant in today's world and had a future, why isn't it being taught? And while you are at it, find me ONE high school kid with aspirations to be a COBOL programmer someday! ROFL!

ganyssa

It may not be the current hot platform, but we have over 100 COBOL developers on the payroll. As a large insurance company, the mainframe is going nowhere - in fact, we recently purchased a new one. We build pretty front end interfaces and keep feeding the dragon.

darpoke

When you put it like that, the risks (which are critical - this is core business data) simply outweigh any perceived benefit. I suppose it will stay that way until there is sufficient motivation to migrate to another format. Now I think about it - it's a trivial example - but the firmware in our router here in the office is from 2006. Any computer software would be patched monthly, if not once or twice weekly - but the core networking functions of a router simply haven't changed in the past four years. No convincing reason to rewrite what works. I stand corrected. Cheers for that! :-)

sjdorst

Wow - come to think of it, it's been so long since I did any breakout - 3, maybe 4 jobs ago, I'm not sure if I can FIND my personal serial tools! I'll have to look...

melekali

...for whom you work. There are many small jobs for which knowledge of HTML is directly relevant and useful (such as a very small non-profit which has asked for my expertise). Your points are well taken, though. Yet with the proliferation of software that makes creating pages easy for one who might not know any HTML, certain niches are disappearing.

Deadly Ernest

knowledge or skill. Thus we have so many badly done pages and sites - because they don't know the HTML, they don't go in and clean it up.

Tony Hopkinson

They are cheap, robust, and easy to maintain. I'd love to see some poor fool do the volume of printing I've seen at places with a Lexmark or some such. Three boxes of music-rule paper per night. :p You'd have to pop in twice a night, fit a new printhead, and reprint the last 100 pages....

gsmith

I think the name says it all. To me the determining factor is if the application is "real time/one at a time" user interface (Object Oriented) or a massive "nightly batch/pass" type process. So many things are the latter that COBOL is ideally suited for that it will probably always be used for the latter. You can't afford to be rewriting that stuff every couple years when Microsoft decides to come out with a new version of .NET and drops support for the old one.

Ocie3

We are still using steam, mostly to generate electricity -- in factories as well as in public utilities -- all over the world, and burning coal or natural gas (methane) in the boilers!! The heat from nuclear reactors that are used in electric power generation produces steam to run the generators. Steam still turns the turbines of many of the world's ocean-going ships, too, although most of the others are powered by diesel engines. For that matter, diesel engines are also used to power electric generators, but mostly for "relatively small" amounts of electricity, such as for a household or a farmstead, or as emergency backups or for arc welding. :-) Maybe COBOL in the future IT realm will be like steam in today's economies. But we know that the burning of coal and methane for the massive production of anything must come to a halt, or this planet will become increasingly overheated. :-(

Baruch Atta

"... ran an IT dept for 10 years..." my ass. You don't even know how to spell, so why should we belive you about your past? And "ROFL"? Now, you have told us who you really are. Go back to High School. You don't even mention any other language or technology, all you do is crab crab crab. No solutions, just problems. You are a problem and complain guy, and I'll bet you have been fired from 10 jobs in the last five years.

Justin James

You are right about the issues finding COBOL developers. Fewer and fewer colleges are teaching it, due to the "COBOL is dead" misperception. And you are right, kids out of college don't want to walk into a COBOL environment and learn it. Finally, it is a good point that many COBOL developers have been so isolated in the mainframe bubble that they still think it is 1982 or something. That being said, I expect a major uptick in COBOL salaries at some point in the future as the problem of them dying faster than they are made (as you point out) reaches a critical stage. If I were a young developer who did not care if the app or language was flashy or trendy, I'd learn COBOL. In a few years, it will be a great way to make big bucks with mucho job security. J.Ja

Tony Hopkinson

Rewrite, rework, reengineer and refactor are my mantras, got to pick the right battlefields if you want to get anywhere though. No dollars and cents now = no can do.

b4real

Of serial tools for a free download. Some of the tools work with generic RS-232 ports also.

oldbaritone

"I have so much code to write that I cannot keep up." Or maybe it's just a matter of supply and demand - and the demand FAR exceeds the supply for Assembly programmers and good COBOL programmers alike. And I made a career as a consultant by writing interfaces between "legacy systems" (i.e. mainframes) and PC/networked systems. And the funniest part is that when we started the interface project, the legacy system was only going to be running "for a short time, maybe 6 months, a year at most, because we're in the process of implementing a new (brand-name) system..." That was 12 years ago. The legacy system is COBOL/RPG. And the "one-size-fits-all" brand-name complete business systems didn't fit the company when they started implementing it. It was customizable, but estimates ran at around 1/2 million lines of customized code and tens of millions of dollars - to do what the company was ALREADY doing in COBOL. So they scrapped the change, bought a new mainframe, and the COBOL program is still running today. And they're always looking for COBOL programmers. Maybe COBOL lives in Jurassic Park?

darpoke

but it sounds like you've been getting paid to spin your wheels a hell of a lot in the last 4 decades. Sure, Assembly & COBOL may have done the job, but could it have been done more efficiently? "I have so much code to write that I cannot keep up." Surely this implies horrendous inefficiency? I'm not having a go; perhaps there were specific reasons restricting your choice of language/environment. It just strikes me as sticking to one's guns for the sake of it. Correct me, please, if this is wrong.

tomcarneal

COBOL applications have been written for the last fifty years and many are still running. I personally have been writing Assembler (a bigger dinosaur than COBOL) exclusively for the last 40 years and have only missed one week of pay the entire time. I have so much code to write that I cannot keep up. COBOL is the same way. In fact, if it were not for the many reasons systems were replaced for Y2K (cannot find or understand source, etc.), there would be much more out there today. Folks, this is not a game; everything is bottom line for most companies (that actually pay people to write and maintain systems). COBOL is really not that difficult. Just wait until you have to crawl through the systems that are out there. No way do you really want to rewrite them unless they fail to do the job for you.

adornoe

I think the name says it all. To me the determining factor is if the application is "real time/one at a time" user interface (Object Oriented) or a massive "nightly batch/pass" type process. Actually, COBOL, as implemented by the mainframe makers and mini-computer designers, was capable of doing the work for on-line transaction processing as well as huge batch jobs. In my old career as a Tandem project manager/analyst/designer/programmer, we developed full systems which handled multi-threaded transactions. All development was in COBOL. In the back end, we designed and coded for batch processing using the same language, COBOL. Tandem was doing requester/server processing many years before it became popular on PCs. The PCs of today still don't even come close to what we were doing with Tandem and COBOL. BTW, the Tandem machines were non-stop machines with heavy multi-threading and massively parallel computing and mirrored disks. Those capabilities are still around in the HP (which owns what's left of Tandem) and IBM worlds. IOW, COBOL was and is capable of handling just about anything that the lower-level and newer languages can. It's just dependent upon the implementation and the support built for it.

oldbaritone

The only certain FACT is CHANGE. The planet's average temperature has been changing since time began. I'm old. I was taught in grade school that the next global ice age was on the way, and the world as we know it would end. Now my kids are being taught that global warming is fast bringing us to crisis, and the world as we know it will end - and it's all because we drive automobiles. Our nearby university is advertised as a leader in emerging alternative energy technology - yet they refuse to buy "scrap steam" from the nearby (about 1 mile) utility's electric plant. Instead, they run their own coal-fired physical plant to generate their own steam. It's not about "being green" - obviously. The utility has tried many times, but the university likes to villainize big corporations. Protesters carry signs condemning coal-fired electric plants. So the utility has no market for useful energy, and the university keeps burning MORE coal (unnecessarily) and belching out CO2 while telling us the world will end because of greenhouse gasses. Reduce - Reuse - Recycle? Only if it matches your political agenda. And oddly enough, I learned COBOL at that University, back in the 70's, on an IBM System/370 mainframe. ;-)

adornoe

Actually, my reply to your post was also deleted. So, if you think that it was from a request from me, then you are mistaken. Your post and mine are both kaput. But, take heart. My reply to your post wasn't all bad. It was lengthy and it was a line-by-line rebuttal to your post. Though it wasn't all complimentary and it wasn't all nice, I did end it on a good note. Here's how I ended my reply to your post: If I've been insensitive and insulting to you, then I apologize. My main problem is with your positions. Personally, I don't know you. And you don't know me. But, I will not back down from a discussion where I believe that the truth is on my side.

Duke E Love

Gotta love free speech!! My bet is that someone cried to mommy.

adornoe

Do you have to try to be a condescending prick or does it come natural? It comes natural when I encounter stupidity or idiocy. BTW, did you somehow relate to my statement when I said: Next time you get involved in an adult discussion, please allow the adults around you to do the posting.

Deadly Ernest

1. The MM-GW supporters point to info from a few weather stations collected over the last 200-250 years. Many of those same collection points are coastal stations, and ships' logs with weather entries in the same locations over four hundred years fairly match them for the period the MM-GW supporters point to, but the older entries show high temperatures for those areas which are higher than they are today. Since the instrumentation for most of both periods was the same, it's safe to accept they agree where they agree and the older data is also accurate, thus disproving the point the MM-GW people are trying to make. 2. The maritime weather information is from thousands of ships over hundreds of years, including tens of thousands of entries from around the world, with many being noted on the same day at different times - such entries were made every few hours. 3. The weather models being used by the MM-GW people are the same ones they've been using for just over twenty years, and they state they can predict ten and twenty years into the future because it's trends - yet the predictions from ten and twenty years ago do NOT match what we have today, trends or otherwise. Thus disproving the reliability of the models. Trends frequently get stuffed up by unusual events too. Every major volcanic eruption causes significant disruptions in the overall weather patterns for many years. One such eruption in Indonesia in 1815 caused freezing frosts in Europe in the summer of 1816 and affected world weather patterns for many decades. In fact, it's the recovery from the major drop in temperature this caused that most MM-GW supporters see as the cause of global warming, when it's just the earth returning to balance as the dust cloud gradually settles back to earth. 4. Global warming has been happening for thousands of years and there's tons of scientific evidence to support that. What the top experts in global climate studies don't know is exactly why it's happening and all the factors that affect it. They do know of some, like solar radiation, solar flares, volcanic activity, and major ocean currents, but are still working on identifying the other factors. Compared to the volcanic activity alone, man is a minor issue, about on par with the naturally created bush fires each year. ............ Anyway, I think this sub-thread has wandered too far off course.

adornoe

To claim, however, that the entire theory is so much bunk based on the fact that New York City had some cold weather? How can I put this so you'll understand it? You are guilty of the same errors committed by the global warming junk scientists. While I did mention that NY City did have record low temperatures in June, you only chose to use that bit of data to make your point. However, you conveniently decided to disregard the fact where I mentioned that during the last 11 years, the planet as a whole has been cooling. You cannot pick and choose in order to make a point. When you do that, your point is immediately invalid. What's more apparent is that, you sound completely unaware of the arguments being made in the last 20 years that have proven that "global warming" science is nothing but junk science which has been debunked very nicely by true science. Nothing illustrates better how much the junk scientists hate the truth than when they are challenged on their science by true scientists who are not tied to an agenda. Whenever Al Gore or even a "global warming scientist" is challenged to a debate regarding the science, the "global warming" "scientists" ALWAYS reject any debate and follow their rejection immediately by stating that the debates are over and "the matter is settled". Those junk scientists know quite well that they could never win a debate on the true science of "global warming" or "global cooling". Those people are cowards and charlatans.

adornoe

So the earth is not warming up? The evidence is not there. The evidence used by the global warming alarmists is either made up or a result of data that was "picked or chosen". I can prove that last year was the warmest ever on record if all that I did was select for my studies the days when the temperatures were 95 and over, and only from the warm areas of the planet in the summer where snow is never seen. I could also prove that last year was the coldest ever if I only selected temperatures in the winter and where those temperatures never rose above 70 degrees. So, buddy, it's not about words; it's about the true scientific method and the truth. Words by themselves are not what this discussion is about. It's about the real science and not about politicized science or agendized science or consensus science. The funny thing about words, you can use them to rationalize anything. Most smart and informed people know enough to know when they're being had or lied to. It sounds like you are the type that would fall for any kind of rationalization. Furthermore, no one can rationalize a scientific hoax. The global warming scaremongers can use false science and false information to scare people into believing that something needs to be done to combat the lie, but, in the end, the scientific facts will stand on their own and no amount of rationalization can make a lie a fact. It is the mental equivalent of sticking your fingers in your ears and yelling "LALALALALALALALA". Next time you get involved in an adult discussion, please allow the adults around you to do the posting.

Tony Hopkinson

and if you collected the same measurements for the same places at regular intervals in time and distance, you can come up with an average across that geographical area. How good an extrapolation to a global average you can trend is dependent on the size and quality of the sample. Otherwise you are in the 'science' of wearing odd socks to avoid having a car crash.

darpoke

- I certainly don't want to get dragged down into another 'it is!'... "it isn't!" argument about climate change, which is how all such discussions ultimately end, but I would like to point summat out. With regards to the reliability of weather forecasting in the long term versus the short term - what is actually being predicted are trends. You can't expect them to tell you if it's going to rain on the third Tuesday of next month, because the statistics involved in meteorology don't work that way. They can, however, identify a long-term period of heating or cooling a little more definitively. It's like market predictions. They can tell you how many will buy X and how many Y - but not what each individual will purchase. It's not a magic 8-ball.

Tony Hopkinson

But the number of data points collected from either source is far too low; throw in early imprecision about where and when they took the measurement and it's a classic three blind men describing an elephant job. It's just a bunch of people massaging some statistics so the answer they want comes out. That's the real problem with the science on either side: no objectivity. Even if the scientist in question keeps an open mind, they'd have to be dumber than the hole in a cow's arse not to realise where the next funding cheque was coming from...

Deadly Ernest

the people claiming MAN MADE global warming are looking only at a small set of weather station data that only really started being gathered a bit over 200 years ago. Sadly for them, maritime records from ships' logs going back over four hundred years show that the average temperature today is lower than what it was 300 years ago. Other data sets from ice core taps and ocean bed taps indicate the planet has been warming and cooling for many millions of years, and a major warming period started just over 10,000 years ago and we're still in it. All the above is due to the interaction of the sun, the volcanoes, the earth's core, and other natural weather effects - not a man-made item in the lot. If the people pushing the man-made global warming theory claim future disaster because their models say so, why can't the models predict the weather for next week or next year, but only ten or twenty years down the track? Answer: because the models are garbage and don't really work. Some have been pushing this garbage for nearly thirty years, and today we are nowhere near the points they claimed would be reached ten or twenty or thirty years ago - which shows how faulty their models are. The whole issue is just way too complex to model well at the moment.

darpoke

- the part about the scientific method, that is. Like Tony said, though, it's a little redundant when attempting to theorise about something so esoteric as our global impact as a species. It's slightly difficult to design a suitable control group for, to say the least. To claim, however, that the entire theory is so much bunk based on the fact that New York City had some cold weather? How can I put this so you'll understand it? I know: FAIL. Epic, of course.

Duke E Love

The funny thing about words, you can use them to rationalize anything. It is the mental equivalent of sticking your fingers in your ears and yelling "LALALALALALALALA".

Tony Hopkinson

Global warming / climate change is nothing but a political issue. Traditional scientific methods are reductionist, ie isolate a small number of factors, establish a control, and then take empirical measurements. How can you do that when the planet is your laboratory, and chaos theory rules it? We can make simple models and introduce change, and show things to look for, see how those simple models react, but they don't allow us to predict anything but the most extreme conditions back out in the real world, because we were using a simple model. Look up the butterfly effect ffs. Human impact on climate change can be neither proven nor disproven. To do it by the traditional scientific method we would need a spare planet and an epoch, and all that would tell us is that with no humans there would be no human-caused impact, which is sort of obvious, isn't it? If you haven't read it, get James Gleick's Chaos (a history of chaos theory). If you have, read it again; you obviously didn't get it. All you are asking me to do is rate Dick Cheney as more credible than Al Gore; you need to do a lot better than that.

adornoe

Any scientist that says different has a different personal agenda. There's no real science in pro or anti global warming, not a speck... You are completely wrong! First off, if the evidence is flawed to begin with, there is no reason to postulate a theory. At that point alone, the global warming theory becomes junk science because the evidence is the main criteria for a theory. If the evidence is flawed or lacking, then the theory falls apart even before it begins. Then, if a theory is also dependent upon evidence that is massaged before it is put into analysis or into a computer model, then whatever theory arises from that junk evidence also becomes junk. Thus, you end up with junk science. Ever hear of GIGO? Also, if a theory is dependent upon a computer model for predictions and analysis, and that computer model has been designed to just prove a theory while dismissing the possibility that the theory might be "disproved", then the model itself is junk. If the evidence that is input to the models is picked and chosen only to prove a theory, then the model will only spit out junk results. If the models are designed to "dismiss" data that could prove damaging to a theory, then the model is junk from its inception. The most important points in pure science state that if any evidence at all disproves the theory, then the theory holds no validity. Since the global warming theory fails miserably against the data collected, then the theory should never have been postulated at all. In each and every step of the scientific methods, the "global warming theory" fails miserably. But, in true science, if even one step fails to explain away any evidence to the contrary, then the theory is junk and the proponents of the theory are nothing but shysters if they continue to insist that the theory holds any kind of water at all. I could continue forever, but there is one major and very important point to be made for true science. If a theory is proposed, then the proponents of the theory are the ones charged with proving that the theory has any validity. If others find fault with the theory and those faults are major, then the proponents of the theory have to backtrack and explain why the theory fails in those areas. However, even if minor faults are found in the theory, the proponents of the theory need to examine the evidence and explain why the theory fails there too. With global warming, the faults are major and the faults are minor and the faults are too numerous. When the evidence is so huge against global warming, and the instances of the negative evidence are so many, then it is up to any "true" and "un-agendized" scientist to completely dismiss the theory. When the evidence for global warming is completely destroyed by a cooling trend which has lasted for the last 11 years, then the "theory" is completely junk. A minor piece of evidence (some would call it huge) against global warming is the fact that in NY City, this past June is the coolest on record since 1958, as indicated by records from the National Weather Center. Lastly (for now), when the major proponents and supporters of global warming theory are politicians and the media, then the theory doesn't even belong in the realm of true science. When a theory is so flawed in so many areas of the scientific method and so unprovable, then it is in fact junk science. No true scientist without an agenda would even think about getting involved with the junk science that is "global warming". 
BTW, did you notice that the proponents of global warming have backed away from calling it "global warming" and are now proponents of "climate change"? Of course, no one can deny that climate change does indeed occur. But, they never admitted that their original "global warming theory" was nothing but junk science. That kind of people are nothing but shysters and cowards. The theory should never have been proposed at all. And since the scientific evidence against "global warming" is so abundant, the theory has, for all practical and scientific purposes, been proved wrong. There is no need to prove "pro" or "anti" global warming. The theory should never have existed. If a theory cannot be proved because the evidence against it is so abundant, then there is no need for any scientist to prove it wrong. The evidence is sufficient by itself to invalidate the theory. Your "pro"/"anti" assertions are not part of the scientific method, and that's not the way science works.

Tony Hopkinson

for the proof that we aren't, well, not unless you want to go blue and die.... Any scientist that says different has a different personal agenda. There's no real science in pro or anti global warming, not a speck... You can tell that by the way everybody goes around 'proving' the opposition is wrong, never that they are right.

adornoe

But we know that the burning of coal and methane for the massive production of anything must come to a halt, or this planet will become increasingly overheated. There are certain technologies and certain advancements in human knowledge that don't age, don't become obsolete, and will not disappear just because there is a new way of doing the same tasks. The wheel, as an example, was invented thousands of years ago. To this date, it is still the most often used tool for transportation around. COBOL, though old by today's software standards, is still very useful and will be so for many years to come. It may be around 100 years from now. It's like the old saying: "if it ain't broke, don't fix it". Why replace something which is still very useful and very powerful and not prone to the "latest is the greatest" mentality that abounds in the computer fields? Furthermore, your analogies in regards to the use of methane and coal are very misplaced and very wrongheaded. First off, when a problem doesn't exist, why invent or devise a solution? That is what "global warming" is about. Global warming is not science, and neither are the solutions being devised to repair the fabricated problem. Neither you nor any real scientist who is not involved in a personal agenda can prove that global warming is being exacerbated by the human burning of fossil fuels. In fact, the temperature records for the last 10 years indicate that we are in a cooling trend. That's the opposite of your "overheating planet" proposition. Stop swallowing the Al Gore Kool-Aid and get with the real truth and the real science.

Tony Hopkinson

Adding it to the quiver, is something I've considered a few times. Once all the ructions in the finance industry have settled out a bit, I'm going to give it a serious consider, especially seeing as I'm too old to learn this new stuff....