Software Development

Project failures may soon carry legal risks for programmers

Is Waste Management's lawsuit against SAP a harbinger of legal trouble for programmers who work on projects that end in disaster? Justin James explains why he thinks it is and offers advice about what programmers can do.

Project disasters come with the territory in IT. Some observers (including me) pessimistically believe that the majority of IT projects fail to meet their original goals. All too often, "descope and declare victory" is the only possible way to end a project. Part of the problem is that, regardless of how badly the project goes, the customer does not have much recourse.

All of this changed last week.

Waste Management's SAP lawsuit

Waste Management has initiated a $100 million lawsuit against SAP for a failed ERP implementation. While the size of the Waste Management SAP project is much larger than the typical programming project, the problems it faced were not unusual. From FBI computer upgrades to Windows Vista, "behind schedule and over budget" is par for the course. Programming projects are just as problematic as large integrations, and they are often a key component of those floundering integrations.

I think this case is the canary in the coal mine. Regardless of whether Waste Management wins the case, if the courts allow it to go to trial, programmers are in for a rough ride. Why? If the case is tossed out -- particularly on a standard "no warranty implied" contract clause in the software End User License Agreement (EULA) -- it means that these pieces of boilerplate legalese provide "cover" for our failures to deliver on a salesperson's promises. If the case makes it to trial, it means that any failed project is probably fair game for a lawsuit.

Are "programming malpractice" cases around the corner?

Many doctors have to fend off baseless malpractice lawsuits from people hoping to make quick cash from a settlement, since these cases can be so expensive to defend. My fear is that we will soon see this type of environment in IT.

Suing for "programming malpractice" would be like suing an artist because your portrait isn't realistic or flattering or impressive enough for your taste. I hate to sound forgiving of the high failure rate in IT, but programming is not a cut-and-dried practice at this point. It's not like designing a light switch or a stepladder, where it is quite clear if the fault lies with the user, the designer, or the manufacturer.

Even when the project fully meets the specs, it rarely meets the user's needs, since it is so hard to write a perfect project specification and requirements document. In other words, even the most perfect project could be wide open to these kinds of lawsuits.

Even worse, "programming malpractice" suits could drive out what little innovation is left in IT. Programmers will not be willing to write new software unless their company has the deep pockets and slick lawyers to protect them. Open source projects will collapse, since the lack of incorporation will make the individual contributors legally responsible.

What's a programmer to do?

I'm not a lawyer, so I obviously cannot offer any legal advice -- I'll just share what I plan to do. I will continue to make sure that I put forth my best efforts; I'll also be diligent about showing the customer that I am working in good faith. This is part and parcel of any relationship, but it's especially important to me now.

If you are an independent contractor or consultant, run your own company, or could otherwise be held personally liable in the case of a lawsuit, you may want to consult a lawyer to find out what your options are. And, if at all possible, make sure that your salespeople don't make wild promises.

At the end of the day, there is no need to panic. But the age of legal consequences for project failure may be upon us. Given the failure rate of IT projects, I fear that far too many developers are wide open to legal troubles.

J.Ja

About

Justin James is the Lead Architect for Conigent.

173 comments
apotheon

"[i]you really can eat on less than $250/hr, even with only 50% of your time booked and with the taxes ... but most of those fees are going in the pocket of the companies consultants work for[/i]" You're right -- an independent consultant can eat on less than $250/hr. Just keep in mind it takes about $250/hr. to even begin to edge into the "well-paid" income brackets, because after all the expenses and taxes and so on, that's equivalent to about $83k/yr for a salaried employee. Expenses vary from one field of expertise to another, but remember that an independent Oracle consultant would have to pay for his own licenses on different versions of the software (usually via some kind of subscription package) to keep up to date on the technologies (s)he has to support. $250/hr. is overpriced for a lot of the consultancies, though -- places that have so-so consultants and pay them about a third of that while pocketing the rest (as you say). What's especially sad is that many such consultancies charge upwards of $500/hr. for the same service. "[i]anyway, simple fact is Oracle has always been overpriced and over-rated ... which brings us right back to the lame brains who buy it[/i]" No kidding.

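To make the arithmetic above concrete, here is a minimal sketch in Python. The only figures taken from the comment are the $250/hr rate, the 50% utilization, and the roughly $83k/yr salaried-equivalent claim; the ~2,000 working hours per year is an assumption, and the script simply derives what share of gross billings would have to go to expenses, extra taxes, self-funded benefits, and unbillable overhead for those numbers to line up.

    # Back-of-the-envelope sketch of the rate comparison above.
    hourly_rate = 250.0                 # billed rate, $/hr (from the comment)
    billable_hours = 2000 * 0.50        # assume ~2,000 working hours/yr, only 50% booked
    gross_billings = hourly_rate * billable_hours   # $250,000/yr gross

    salaried_equivalent = 83_000        # the comment's claimed equivalent salary

    # Share of gross that would have to go to business expenses, extra
    # self-employment taxes, self-funded benefits, and unbillable overhead
    # for the two figures to match:
    overhead_share = 1 - salaried_equivalent / gross_billings

    print(f"Gross billings: ${gross_billings:,.0f}/yr")
    print(f"Implied overhead share: {overhead_share:.0%}")   # roughly two-thirds of gross
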
Matthew Yurksaitis

Project failures are often not attributable to a single root cause; there is plenty of fodder to go around in most typical IT projects. Most projects do not spend the time in planning to analyze and define requirements (functional, non-functional, and customer-based) and to verify that the requirements meet the customer's needs before starting to develop against them. Why? Simple: time and money. With all the effort put into standard methodologies and practices, the decision still comes down to time and money. Taking more time up front in the incubation (conception) and planning phases to verify that the drafted requirements actually meet the customer's needs would help reduce legal liability and project failures due to schedule and cost overruns, but it won't eliminate them entirely, because of the unknowns which hit every project, in IT or any other market segment: technical issues or constraints due to previously undisclosed information, environmental and staffing issues, etc. These are common to all projects and are often some of the key factors cited as causes of project failures. In short, until customers truly work collaboratively and with full disclosure with project teams, this issue is one of joint responsibility which will need to be addressed on a case-by-case basis and not by industry-wide legal action.

Mark Miller

Just from your description the way I would interpret this is that if anything the company making the software is liable for what their programmers produce. Now, if you extend this to independent developers I can see what you're talking about, because they're their own business. I think it would be best to wait for the court to act, though, before coming to conclusions, because they may agree with you. I think this issue has to do with breach of contract. Just a guess. From working for companies that produce custom software for major firms (though nothing on the scale of ERP), an understanding I gained is that the producer needs to treat the requirements and design documents like contracts. They are agreed upon, approved, etc., and they are legally binding. If there are any amendments to those documents GET THEM IN WRITING!!!! Do NOT make requirements/design changes over the phone with no corresponding documentation. And another thing, make the definitions very clear. A problem I've seen with requirements documents is someone was sloppy and left vague holes in it you could drive a truck through, and got it approved. Big mistake. What I've found in the past, though I don't know if this would fly now, is e-mails will suffice for documenting amendments to approved documents. It's kind of messy, but it seems to flow well with the producer/customer relationship. It never got to the point of going to court over a dispute in my experience, but it really helped with a client that was all over the place. We could point to what they said (documented) and say, "This is what you said you wanted, and it is what we provided to you." I think so long as that is the standard, and the producer does a good faith effort to implement what was agreed upon they should be in safe territory. That's my assumption. If that's not the case, I don't know what to say. It throws the whole process into turmoil. Another aspect that could be related to the WM/SAP case is scheduling. If SAP promised in their agreement to have the system completely installed by a date certain, with no provisions for renegotiating deliverables or the time of installation, and they failed to deliver on time then I imagine WM would have a case. It would've been up to SAP to provide themselves with some wiggle room. Anyway, the part where you talk about how programmers can be liable for what [i]salespeople[/i] negotiate sent chills down my spine. I recoiled in horror (just kidding, but not much). If I had a say in it I would NEVER let a salesperson negotiate a project with a client alone. I don't care if the company is liable or not. Someone senior from the engineering team needs to go along as a sanity check, because the motivation of the salesperson is to close the sale. Period. They'll promise the moon to do it, too. It's the programmers who are left with the consequences in terms of trying to meet the schedule (HA!) that was negotiated. I've seen and heard about situations where the salespeople are negotiating by themselves and it is one horror story after the next. Salespeople, unless they have a software dev. background (some do), have no idea what they're doing beyond customer relations. I'm not knocking it. It's a valuable skill, one which programmers find challenging to master, but customer relations in and of itself is not enough.

wade

Buy a $1,000,000+ umbrella insurance policy for self and/or business and forget about it.

ernestm

It's a ridiculous strawman to say that programmers will be the targets here, because God knows it's not their responsibility. Individual workmen don't get sued when a building collapses. Instead, the planners who used substandard materials, who told the workmen to cut corners, who ordered less testing of the product or none at all - they are the ones who will be held responsible. And that's good; they should be. In few other industries can companies plan to do crap work. I've seen a lot of failed projects, and not a single one of them was the fault of any programmer. They were the fault of sales guys that set up a deal that would never pay for the labor required; they were the fault of corrupt, lazy, or incompetent management that valued any number of things above product quality; they were the fault of project managers trying to curry favor at the cost of realistic planning. Frankly, I feel personally offended by the core idea behind this "scare" headline. I went into management because I saw lots of good programmers saddled with failed projects through no fault of their own, but through the fault of the incompetent sons of bitches that were being "paid the big bucks." I challenge you to come up with any scenario related to any big-company SAP implementation where *any* substantive problem is the fault of any programmer.

JimTheGeordie

Some years ago I attended a very good series of seminars on legal issues for contractors, hosted by the Australian Computer Society. The lawyer conducting the seminars pointed out that the nature of the industry is such that you can become vulnerable to attack even when you have done nothing wrong. One example he gave: you carry out a programming project for a client using a (licensed) product which contains material which your licensor has licensed from a third party. If the licensor's agreement with the third party breaks down, the third party may insist that you (and your client) desist from using his material. It is often impossible to check the ramifications of such situations beforehand. When asked what one could do to protect oneself, the lawyer replied "Can you trust your wife ?". The implication was, of course, that no one sues anyone who has not got any money or assets.

MadestroITSolutions

Sounds like the right time to open PII (Programmer's Insurance, Inc.), lol....

cg221

This is nothing new. The liability already exists - for paid work or products, anyway. Any time you agree to do something or supply something, you make a contract, whether it's in writing or just agreed in a phone call or meeting. If you fail to deliver what you promised, or it's deficient or unsuitable, you normally have to give the money back, accept a reduced fee or pay damages. Sometimes you'll get sued. You can usually write a contract to limit your liability to no more than the amount you're being paid, but not deny liability altogether. All this is just centuries-old common law and SAP getting sued by a client isn't going to change it one iota. They're not the first IT company to be sued and they won't be the last. Nor is it going to lead to people suing the makers of free software. If you're not getting paid for something there's no enforceable contract. What IT contractors and consultants can, and do, do to protect themselves from financial disaster if a project fails is to insure against the liability. This type of insurance is commonplace in the industry and is often included automatically in contracts arranged through an agency.

Tig2

The programmer is generally not the fault point. Let's look at the PM, the constantly changing business requirements, lack of senior-level buy-in, re-prioritization, and disaster as some of the fault points to consider. Bad PM? Some teams can get past that, some teams can't. And it often takes way too long to get a bad PM booted and a better PM engaged. And then the team is going backward in time in many respects, because the new PM is walking into a crisis and still needs to build a rapport with the team. Changing requirements? I hate to accept change orders once I have sign-off to go forward. Fact is that they happen. About all a good PM can do is implement strict controls and document everything. Hint: change analysis should be a part of this. I could go on and on, but you see my point. I was with a company doing an SAP implementation. Did the sales guy over-promise and under-deliver? The answer is "possibly". The truth is that the company simply did not have the right players engaged during the initial analysis and the people who WERE engaged kept going forward instead of asking the tough questions. That was compounded by no one having any real-world experience with SAP and no one taking into consideration what the impact would be on other, current projects. Final analysis? EVERYONE failed. Incidentally, didn't you steal "de-scope and declare victory" from me??? :D

$$$$$$$$$$

So, suppose I'm head of the procurement department of Big Oil, and I need some computer crap. Why does Big Silicon think I want to talk to a lying shyster instead of one of the heads of the department that will be producing what I'm buying? Isn't the custom of having a "Sales Department" just derived from the debit/credit division of accounting? Is it really a skill set that's judged important by the clients? I really, really doubt it.

adam.howard500

And be sure to include the premiums for such insurance in the price you quote for the client. If the client balks at that, offer them another contract that states they don't have to pay a portion of said insurance premiums, but they have to agree to waive any and all claims, etc. Let them then choose which way they want it. Higher quote with some nice safe insurance or lower price but fewer options.

$$$$$$$$$$

But if the Sales Department takes heat, it will spread through the company like you're a block of copper.

Justin James

Yup, this is a very dangerous situation. Another similar one is the GPL. Statically linking to a GPL'ed library, or copy/pasting some GPL'ed code into your project, automatically makes the entire item GPL'ed. So all it takes is one programmer on your staff to look up how to do something on the Web, see someone else's code posted, copy/paste it, and BAM, you are now wide open to be sued. Licensing is indeed a very tricky situation. J.Ja

apotheon

"[i]When asked what one could do to protect oneself, the lawyer replied 'Can you trust your wife ?'. The implication was, of course, that no one sues anyone who has not got any money or assets.[/i]" There's another implication: that your wife, by way of analogy, is similar to the software, in that there comes a time when you just have to trust that things will go well -- but sometimes, they won't. It's part of life. If you distrust your wife, you should cut her loose and get a replacement. If you refuse to ever trust anyone, though, you'll never find a wife you can keep around.

Justin James

... but it felt too much like selling "volcano insurance" at this point. Maybe in a few years, a few more lawsuits like this, and it will be a very attractive idea. The problem is, I suspect that the moment you sell insurance against it, that means that you are *guaranteeing* that people will sue, because now they know you can pay up. :( On that note, I wonder if banning medical malpractice insurance would almost entirely eliminate malpractice suits? J.Ja

Justin James

"This is nothing new. The liability already exists - for paid work or products, anyway. Any time you agree to do something or supply something, you make a contract, whether it's in writing or just agreed in a phone call or meeting. If you fail to deliver what you promised, or it's deficient or unsuitable, you normally have to give the money back, accept a reduced fee or pay damages. Sometimes you'll get sued. You can usually write a contract to limit your liability to no more than the amount you're being paid, but not deny liability altogether." The major issue at hand is that the nature of IT projects makes it extremely difficult to precisely specify deliverables. To write that contract and have it not be able to fall into "he said/she said", you need to follow a strict Waterfall methodology (or something similar). I think that is one major reason why so many large companies and nearly every government and military project uses Waterfall or a variation. It's the only way to truly be able to look back at the end of a project and determine if the contract was fulfilled or not. Now, every project contract I've read (a few dozen, obviously not a super large body of experience) included a dozen and one fairly standard clauses, usually with the phrase "express or implied" involved. That's why, to me, this case is landmark; it is allowing for the possibility that those clauses are not sufficient armor against a lawsuit. Every one of these contracts also had arbitration clauses, which deny the right to sue. So that's another "boilerplate" item that this lawsuit seems to tear down. "What IT contractors and consultants can, and do, do to protect themselves from financial disaster if a project fails is to insure against the liability. This type of insurance is commonplace in the industry and is often included automatically in contracts arranged through an agency." I don't think this is nearly as common in IT as it *should* be. I have worked for some companies that had it, and some that didn't. J.Ja

apotheon

You make good points quite well with that response. I agree with your take on the matter, as presented in your comment, 100%. The liability insurance point is an important one that I don't think anyone else in this discussion has brought up, too (though I haven't read every single post yet, so I'm not sure about that).

NickNielsen

The truth is that [u]the company simply did not have the right players engaged during the initial analysis[/u] and the people who WERE engaged kept going forward instead of asking the tough questions. (my emphasis) Justin addresses this in some of his posts in this thread, but I don't think many people realize just how important it is to ask the end users what their requirements are. In every failed project I've been involved in (thankfully a relatively small number!), the project failed because nobody, and I mean nobody, ever asked the intended user what their requirements were. Supervisors and managers (even work flow analysts!) had their input, but the end user was never consulted. The final result in each case actually made doing the work more difficult. On the other hand, the best software project I ever saw, I saw as the end user. One of the developers spent a week going through the work flow, seeing the current system in operation and actually using it. The result of that project was probably the best software application I've ever used.

Justin James

"Incidentally, didn't you steal "de-scope and declare victory" from me???" As a matter of fact, I did. And darned proud of it! Ever since you first said it, it has crept into my life at least once a week. It's scary, but those few words describe far too much of my life too accurately for my taste. PS - the subject line is sarcastic, in light of the article itself. :) I agree, PMs, BAs, salespeople, the customers... everyone is involved, and everyone shares some responsibility. But customers so infrequently own up and say, "hey, we didn't know our own needs until we went through this integration, half of this failure is ours." Instead, their people get defensive and tell their bosses that the evil vendor is messing up. And this is true an awful lot of the time, too. Another huge concern is that many programmers are their own PMs, BAs, etc. Sometimes this works, particularly on smaller projects. And some methodologies, especially those in the Agile vein of thinking, encourage it. But far fewer programmers are really good at wearing those hats than are trying to wear them. And, in many cases, a product is being developed with no direct customers, and then later cut to fit each customer. In these cases, the programmers have a LOT of input into the project, more than they usually would. Project failure is definitely almost never the sole fault of the programmers. Indeed, failure is almost guaranteed by many organizations' institutional cultures, like signing a contract without consulting the tech folks to see if it is possible to fulfill the obligations. I'm really not saying that it is the fault of programmers, but projects that involve programming are definitely high risk! J.Ja

mattohare

He was required to get standard business liability insurance in order to get contracts for them. He split the premium between the next four contracts. Mind, he had already under-bid his competitors by more than the premium cost.

apotheon

"[i]On that note, I wonder if banning medical malpractice insurance would almost entirely eliminate malpractice suits?[/i]" There's too much legal precedent set in a world where malpractice insurance exists. The lawsuits would be reduced in number, and in the size of monetary rewards, but it would still be far too easy to win some money for a frivolous claim.

Sterling chip Camden

... reduce the awards given by judges if they knew that they were putting the MD out of business, except when the malpractice was egregious.

cg221

Yes, but lots of things are difficult to estimate. Defence and construction projects, for example. Nevertheless contractors have to stick out their necks and name a price, because businesses won't usually write a blank cheque. So the contractor takes a business risk. Whether he uses a "waterfall" model or simply plucks a number out of the air he can still get it wrong, and then he ends up negotiating compensation, or going to arbitration, or to court. A court is just the ultimate arbitrator. If this case goes to court maybe the court will say that some types of boilerplate denials of liability have no effect. If it does, that will be because that is already the law, and that they've never been worth the paper they're written on. Denying all liability is probably already ineffective in most jurisdictions, but limiting it to the total value of the contract and denying consequential damages is probably sound. So if SAP gets sued, I don't think it will cause any upheaval in the IT industry. If there is an upheaval, it may be a beneficial one, clarifying what standard of performance is acceptable in law, and if we're lucky, removing some of that endless bumf in contracts and licence agreements.

apotheon

"[i]To write that contract and have it not be able to fall into 'he said/she said', you need to follow a strict Waterfall methodology (or something similar).[/i]" I disagree. Get everybody to sign off on a contract that specifies an agile methodology for contract fulfillment, where at every stage of an agile methodology for development both parties sign off on the incremental deliverable before proceeding. This results in a much better set of protections for both parties involved, ultimately, than a Waterfall-style contract fulfillment methodology attached to a Waterfall-style development methodology. "[i]That's why, to me, this case is landmark; it is allowing for the possibility that those clauses are not sufficient armor against a lawsuit. Every one of these contracts also had arbitration clauses, which deny the right to sue. So that's another 'boilerplate' item that this lawsuit seems to tear down.[/i]" It's still not entirely clear to me that what has been litigated here is anything but a standard, reasonable breach of contract suit, actually. "[i]I don't think this is nearly as common in IT as it *should* be. I have worked for some companies that had it, and some that didn't.[/i]" I'm opposed to an insurance-overrun industry as much as I am to a regulation-overrun industry -- because they're almost identical circumstances for the industry. Insurance is just a different form of mediocrity-enhancing regulation.

Justin James

I find that asking the user is important. Even better is asking the user "why?" "Why?" gets to the *heart* of the requirement. After all, if the current process is so great, why are they talking to the IT department to begin with? For example, the user gives a requirement of, "I need to see this column in the output." The BA/PM/programmer says, "why?" The user responds with, "because I need to scan this column looking for all of the records with that value, like I do now with the current system." The BA/PM/programmer then responds with, "no, you don't need to see that column, you need a search function for that column!" At which point you have a happy user. The whole point of writing software most of the time is to accomplish one of two goals:

* Automate the routine stuff, freeing manpower for where human intervention is required and guaranteeing that the work is done consistently
* Leverage the data storage/search/retrieval capabilities of the computer to replace paper

That, by and large, is *it*. Most projects get limited to accomplishing only a section of the second point; no one agrees enough on what the process truly is, so they forget about trying to automate any of it, and the application is basically just a data browser with extremely loose constraints on input, so the users can continue to do things their way. And because they gave up on trying to *improve* anything, they merely replace the paper & file cabinet with a DB backend. All because no one thought to ask "why?" and got caught up in endless arguments over requirements without understanding the basic assumptions. J.Ja
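
To put that hypothetical exchange in code: the stated requirement ("show me the column so I can scan it") and the real requirement ("find the records with that value") are two different features. A minimal Python sketch, with invented record and field names:

    # Invented sample data, purely for illustration.
    records = [
        {"id": 1, "status": "open"},
        {"id": 2, "status": "closed"},
        {"id": 3, "status": "open"},
    ]

    # What the user asked for: render the column so they can scan it by eye,
    # just as they do with the current system.
    def show_status_column(rows):
        for row in rows:
            print(f"{row['id']:>4}  {row['status']}")

    # What the user actually needs: a search on that column.
    def find_by_status(rows, wanted):
        return [row for row in rows if row["status"] == wanted]

    show_status_column(records)
    print(find_by_status(records, "open"))   # the two "open" records, no eyeballing required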

Tig2

On the other hand - nah. You can use it. It describes way too much of everybody's life these days. I have my own opinions of Agile. I am currently watching an "Agile" project spin totally out of control. The PM is clueless - not really his fault, he came into the project at about the halfway point, so he is still playing catch-up. Add to that the fact that he has never used the Agile methodology. The architect is the only person who really KNOWS the system - reasonable, because he designed it in the first place - but the programmers are all in another country and have been cheerfully writing code without his input. Oh, and without a requirements doc or a detailed design doc. Since when can you develop a detailed design when you don't know what the requirements are? Programmers should be programmers. Not PMs, BAs, or any other thing. I look to my programmers to tell me how far business has wedged their heads up their collectives. Once I get the straight skinny from the people who make it all happen, I can advise business. That shouldn't be the programmers job. In fact, I think that it is a requirement that there be a layer between business and the programmer. I am VERY careful to insulate my programmers from business. It dramatically reduces the number of ad hoc change orders when I do. To me, this boils down to proper analysis on the front end, critically looking at the proposed end result, engaging the right people earlier, and shifting away from the "blame" culture that seems to be so prevalent. It means recognizing a high-risk project from the beginning and implementing a roadmap of "sanity checks" to ensure that the project has any hope of success. And it means being willing to truly research a solution before you begin - a step frequently overlooked. Had Waste Management truly done their due diligence on the front end, it is possible that they would never have considered SAP a winning vendor. Had they insisted on tighter controls, they would have had fewer problems. They own just as much of the blame for this failure as SAP does. But assigning blame doesn't fix a thing.

Jaqui

would be to actually code to the applicable ISO standards for the project. While not every project has a specific ISO standard, the category of the software most likely does. The benefit is that both sides have a benchmark for minimum acceptable functionality that they can't change. Long term, the industry benefits, because it makes more people aware that there are standards for applications based on intended use. This causes less fuzzy thinking when project specs are generated. Edit to add: here are the IT-related published standards from the ISO: http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_tc_browse.htm?commid=45020&published=on

Jaqui

from reading the descriptions of agile development in here, it sounds a lot like what has been called a "Flat Management" structure in effect, which is where whichever team member has the right skills to lead that section of the project is the team leader in practice. [On the paper trail, it's still the leader designated by the company.] This method allows the team to adjust to the changing requirements with minimal disturbance.

Justin James

I thank both of you for clearing up those ideas of Agile I had. It is clear to me now that a lot of people are claiming "Agile" who shouldn't be. What you both described makes sense, and agrees with a lot of what I've read in some other places as well. It certainly sounds better than the "anarchy" (Chip's word) that a lot of authors describe. Like many of the other items brought up, I've been around some "Agile" environments that made me queasy. It is clear to me that Agile requires a way of thinking and determination to follow it through that most businesses lack, and when they try to use it without being 100% willing, they botch it, just like they call something "Six Sigma" just because they re-examined the process and took out a step or two in it... J.Ja

Sterling chip Camden

Though it can easily transition to it, just as a failed waterfall methodology can, too. Agile is about admitting what waterfall never admits: requirements have to change, because nobody can really nail them down from the beginning. You do that in conversation with the user, and you prevent scope creep by reasoned agreement.

apotheon

"[i]My understanding was that Agile cut out the BA's and PM's by and large, and did away with things like 'signoff' and 'milestones'. That's why I doubted that Agile would work well like you described. Then again, I suspect that many people who promote Agile simply have a beef with the PMP system, or generally refuse to define anything on paper, so they rename 'code cowboy' to be 'Agile' and piggyback on that movement.[/i]" There's a lot of that going on ("code cowboys" calling themselves "Agile"), but that's not what "Agile Software Development" is really about. There are PMs in that there is always someone that leads the project, but not some separate managerial type who knows little to nothing about the actual coding going on and separates the programmers from the clients. The PMP system itself is pretty much a definite no-no in Agile methodologies. That doesn't mean there's no structure or process in an Agile methodology -- just that it's not the same bureaucratic, every cog in its place process. Of course, there are some Agile "principles" that don't perfectly fit with the process I sketched out -- but what I described is about 90% Agile and about 2% Waterfall. "[i]Most people that I read and hear promoting Agile seem to not work in this way. Instead, it sounds like they want to essentially allow the programmer & customer to work together, with iterations being virtually unplanned.[/i]" There's always [b]some[/b] planning for an iteration, or else:

1. the programmers aren't really working with the clients
2. there's no actual methodology, "Agile" or otherwise

"[i]In a nutshell, a formal embrace of what used to be called 'scope creep'.[/i]" In some respects, that is what's going on with Agile -- except that it's only "scope creep" in the sense with which we're all familiar if nothing is taken out. The iterative development methods collected under the name "Agile" are meant to allow a new direction at each stage -- which involves throwing out what it turns out isn't going to work or isn't a good idea, and adding in what it turns out is actually needed. Doing it this way means that at each iteration you're both affirming that the developers are doing what they're supposed to be doing [b]and[/b] ensuring that the developers don't just keep getting saddled with additional work as more feature requests are thrown into the mix. One of the key components of this is ensuring that the client sees how the software develops, rather than just sitting far outside the process dictating more features and changing specs without knowing what's actually going on. . . . and since each iteration ends with a working and deliverable (if unfinished) piece of software, the client gets to find out first-hand how bad some of its ideas were to begin with. "You got what you wanted. Now that you have it, you can say you don't want it, and we'll do something different for the next iteration." "[i]I've known that certain stock clauses, particularly those that appear in employment contracts, aren't very sturdy. It makes me wonder what the point is, other than to intimidate the less legally savvy.[/i]" That [b]is[/b] the point.
"[i]I believe that the law (including legal contracts) should be self-evident and be "lexical closures" in that you do not need to know any law or information outside of what you are reading to understand its meaning fully and accurately, provided you are fluent in the language that it is written in and have access to a basic dictionary.[/i]" Actually, what you're describing sounds less like a closure and more like a completely self-contained scope. Lexical scoping means that the scope of the current routine plus the scopes of all the routines that lexically enclose it apply -- whereas with dynamic scoping, name lookups follow whatever happens to be bound in the runtime call chain. A lexical closure explicitly imports something from its parent scope, and becomes a closure only because that parent scope has finished while the closure's scope is still active. In fact, now that I think about it, it seems like lexical scope (complete with closures) is what our legal system is -- and precedent is the parent scope that ends up being included in the local scope of a lexical closure.

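For readers who have not run into the term, here is a minimal Python sketch of a lexical closure, loosely mapped onto the precedent analogy above (the function and variable names are invented for illustration):

    def establish_precedent(parent_rule):
        # 'parent_rule' is bound in this enclosing (parent) scope.
        def rule_on(case):
            # The inner function refers to 'parent_rule' from the scope that
            # lexically encloses it; that captured binding is what makes
            # rule_on a closure once establish_precedent() has returned.
            return f"{case}: decided by applying {parent_rule}"
        return rule_on

    ruling = establish_precedent("the parent court's holding")
    # The parent scope has finished executing, but its binding still applies:
    print(ruling("the case at hand"))
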
Justin James

"I tend to guess you've picked up misconceptions, because I don't see any reason that shouldn't work with an agile methodology." My understanding was that Agile cut out the BA's and PM's by and large, and did away with things like "signoff" and "milestones". That's why I doubted that Agile would work well like you described. Then again, I suspect that many people who promote Agile simply have a beef with the PMP system, or generally refuse to define anything on paper, so they rename "code cowboy" to be "Agile" and piggyback on that movement. "The iterations are divided up that way for a number of reasons, including allowing for changes in development direction at each stage, and setting very clear goals for completion of the next with specific plans for what functionality will be completed." Most people that I read and hear promoting Agile seem to not work in this way. Instead, it sounds like they want to essentially allow the programmer & customer to work together, with iterations being virtually unplanned. In a nutshell, a formal embrace of what used to be called "scope creep". I think a VERY important note is that most of these folks align with the "Web 2.0" crowd. Maybe that's a part of it. They love this idea of going from idea to deployment in a day or two, and the "endless beta". And they call this "Agile". That's where a lot of my impressions of it come from. "It looks like SAP deserves the legal wringer..." I could not agree more. SAP is worse than Oracle, from what I can tell. "... and I haven't really seen any evidence to suggest this will turn into a free-for-all of frivolous lawsuits against programmers." Yup, we're definitely still in the speculation stages. I hope that this is a fluke, not a sign of things to come. "Anyway, I've known for a long time that arbitration and zero liability clauses aren't airtight by any stretch." I've known that certain stock clauses, particularly those that appear in employment contracts, aren't very sturdy. It makes me wonder what the point is, other than to intimidate the less legally savvy. Which bothers me. After all, if I sign a contract that says "XYZ", why should it secretly really mean "XY only, Z is for suckers who don't have a degree in law". Which is an extraordinarily dangerous situation in my mind, for many reasons; I believe that the law (including legal contracts) should be self-evident and be "lexical closures" in that you do not need to know any law or information outside of what you are reading to understand its meaning fully and accurately, provided you are fluent in the language that it is written in and have access to a basic dictionary. J.Ja

apotheon

"[i]This is quite contrary to the Agile methods *as I understand them*. There are a lot of permutations of Agile out there, and many misconceptions which I may have picked up along the way. But I believe that the only way that would work with Agile would be to simply bill per-hour.[/i]" I tend to guess you've picked up misconceptions, because I don't see any reason that shouldn't work with an agile methodology. The payment would be by the iteration, rather than the hour, most likely. Agile methodologies tend to run in many rapid iterations rather than a couple of major project arcs, with a working (if only partial) deliverable at each stage. The iterations are divided up that way for a number of reasons, including allowing for changes in development direction at each stage, and setting very clear goals for completion of the next with specific plans for what functionality will be completed. This kind of approach tends to eliminate a lot of the problems with longer development stages, such as goals that are effectively impossible to define up-front and mismatches between desired functionality and what's delivered. "[i]Every contract I've read for a project essentially says, 'regardless of what the salesperson said, here's what you get, and we can't sue each other, and at best a failure results in non-payment'. That's why I am really shocked and concerned that this might reach a court.[/i]" I dunno . . . I think an occasional lawsuit isn't necessarily the end of the world. It looks like SAP deserves the legal wringer -- and I haven't really seen any evidence to suggest this will turn into a free-for-all of frivolous lawsuits against programmers. Anyway, I've known for a long time that arbitration and zero liability clauses aren't airtight by any stretch.

Justin James

"Get everybody to sign off on a contract that specifies an agile methodology for contract fulfillment, where at every stage of an agile methodology for development both parties sign off on the incremental deliverable before proceeding." This is quite contrary to the Agile methods *as I understand them*. There are a lot of permutations of Agile out there, and many misconceptions which I may have picked up along the way. But I believe that the only way that would work with Agile would be to simply bill per-hour. That being said, it is clear that the Waterfall project plans so favored by government, military, and big companies have had their share of expensive failures (from today: Census Bureau going to hand-data collection: http://blogs.zdnet.com/service-oriented/?p=1082&tag=nl.e540)... I just believe that Waterfall provides the risk management that these environments prefer. I won't even say "risk mitigation" because these projects seem to fail all of the time. But Waterfall explicitly spreads the blame around enough so that everyone can point the finger at everyone else, but not enough to feel cheated. It's the CYA attitude once again. :) "It's still not entirely clear to me that what has been litigated here is anything but a standard, reasonable breach of contract suit, actually." Same here. But... and it's a big "but", my assumption would be that the vendor (SAP) had all of the usual anti-lawsuit "armor" in the contract (especially an arbitration clause), and that's a concern. If the armor is not bulletproof, it opens the whole industry up. Every contract I've read for a project essentially says, "regardless of what the salesperson said, here's what you get, and we can't sue each other, and at best a failure results in non-payment". That's why I am really shocked and concerned that this might reach a court. "I'm opposed to an insurance-overrun industry as much as I am to a regulation-overrun industry -- because they're almost identical circumstances for the industry. Insurance is just a different form of mediocrity-enhancing regulation." I agree, but the math on these kinds of lawsuits stinks, bad. The insurance, as parasitic as it is, becomes a necessity. Businesses are risk averse, and insurance is a hedge against risk. Few business people will want to roll the lawsuit dice and risk hitting snake eyes. J.Ja

Sterling chip Camden

Insurance is a bleed on revenue, and it encourages risk avoidance in order to keep your premiums down. Neither of these is good for innovation.

Justin James

"One of the first things I realized in my Intro to Programming class was that code is reusable!" When I was taking a course in EdScheme (my third language learned), I discovered that there was a command to load a library from disk; with the way the course was structured, each project built upon previous ones. So instead of copy/pasting the stuff, I'd load it off the disk. The teacher assigned me negative marks. Her principle was that because the other code was not visible in the current project, it was harder to work with. My stance was that this way, if I found a bug in the library code, I only had one spot to fix it in. That teacher taught me a lot, but this was a good case of me ignoring her... J.Ja

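The same idea in modern terms, as a minimal Python sketch (the module and function names are invented): keep the shared routine in one place and import it, so a bug fix lands in exactly one spot.

    # Imagine this function living in one shared module (say, textutils.py)
    # rather than being copy/pasted into each new assignment.
    def format_page(lines, width=72):
        """Shared page-formatting routine; fix a bug here once and every caller gets it."""
        return [line[:width].ljust(width) for line in lines]

    # A later project would simply reuse it:
    #   from textutils import format_page
    # instead of carrying around its own slowly diverging copy.
    print(format_page(["hello", "world"], width=10))
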
NickNielsen

Nor is Pascal, C, C++, Ada, Assembler or any other language. If you're lucky, you will be allowed to choose the language that best fits the needs of the current project. During my short time as a programmer, I discovered that most programmers are not lucky. One of the first things I realized in my Intro to Programming class was that code is [u]reusable[/u]! If my instructor said "Great job!" on a page formatting or data retrieval routine, why should I reinvent that wheel? Just run the cards through the duplicator, making the necessary changes, and whoomp, there it is. I don't think some of my classmates ever quite grasped that.

NickNielsen

But at the very least, the first COBOL program. ;)

Sterling chip Camden

COBOL was my second language too, right after BASIC. After that hopeless start, I was saved by Algol, assembly, and C. To be fair, COBOL (if done properly [is that possible?]) can be used to learn structured programming. I have written programs in COBOL of which I could be proud. The trouble is, it's much easier to write copy/paste programs of thousands of lines of COBOL mantra than it is to structure a tight solution with good code reuse. On a lighter note, the guy who sat next to me at the time, Keith Smith (hey Keith, are you still alive?) entertained himself by using clever variable and procedure names so he could write gems like "PERFORM SEXUAL-FAVORS UNTIL EXHAUSTED".

Justin James

COBOL was the second language I learned; it was on an NCR system, running some UNIX variant. :) You're right though... COBOL isn't the *cause* or even an *enabler*, but it is definitely a *symptom*, in my opinion. An odd note: a great many of the programmers I've met that I really respected cut their teeth on COBOL/mainframe programming. I have a feeling that either there used to be some good things happening on those projects, or that the skills and traits needed to become a "COBOL survivor" are the same skills and traits that I generally respect in a programmer. Either theory works for me. :) J.Ja

apotheon

So . . . it seems you're saying that when COBOL was invented, someone should have asked (with horror in his voice) "WHY‽"

NickNielsen

[i]...no one thought to ask "why?" and got caught up in endless arguments over requirements without understanding the basic assumptions.[/i] That's probably the root cause for every failed software project since the invention of COBOL.

Justin James

"Had Waste Management truly done their due diligence on the front end, it is possible that they would never have considered SAP a winning vendor." "Possible" should be "definite". Even Gartner will tell you that the failure rate of SAP projects is quite high. J.Ja

apotheon

"[i]I look to my programmers to tell me how far business has wedged their heads up their collectives. Once I get the straight skinny from the people who make it all happen, I can advise business. That shouldn't be the programmers job. In fact, I think that it is a requirement that there be a layer between business and the programmer. I am VERY careful to insulate my programmers from business.[/i]" This is only really necessary because of the current state of corporate culture -- which is to a significant degree a function of the current state of corporate law. That's one reason I prefer to stay the hell away from major corporations in my work. "[i]shifting away from the 'blame' culture that seems to be so prevalent[/i]" . . . thanks ultimately to the state of corporate law, for the most part. There naturally arises a blame culture in bureaucracy, which is in turn a result of institutionalized liability limitation. "[i]But assigning blame doesn't fix a thing.[/i]" That's true -- but fixing something isn't what they're trying to accomplish by assigning blame. Fixing things isn't really the goal in corporate culture.

apotheon

"[i]These 'assembly line' shops are *not* about being a 'good developer', they are about fitting into a machine guided by a few senior developers and/or architects. They plan the code out practically to the function level or even the pseudocode of functions level, and the programmers merely implement that plan. In those situations, it's the ability to fit into that machine with minimal adaptation required that decides your level of 'fitness' much more than your ability to program well.[/i]" I see. Well, in that case, all I can say is "Who gives a crap?" Maybe middle management does. Great. I'm happy. That means they'll be less competitive, and I (or someone like me) can swoop in and grab a hunk of the market. "[i]I was trying to contrast it to the unfortunate reality that a lot of software barely manages to get written in. The things that make self-taught people really good developers on average makes them a poor fit for these assembly line environments.[/i]" I thought you were trying to make a case for the importance of a college education if you want to be a programmer that knows how to program to (useful) standards. "[i]A furniture factory can't make use of the chair legs I hand carve, but a local furniture company that specializes in artistic pieces can. Factories value sameness over quality, since sameness *is* the exact quality that factories strive for.[/i]" Apples and oranges. In the programming world, what that factory does is reuse code -- which is what I'm suggesting would be a great way to do away with bureaucratic assembly-line programming practice. "[i]I think the reason why Calc in particular is pushed so hard in college for that education is due to the historic linkages between CS departments and math departments.[/i]" I [b]know[/b] that's the immediate reason. There are other reasons, too -- like the fact that math department blowhards who do administrative work and make their graduate students discover stuff for them rather than pursuing interesting research themselves all immediately think "calculus" the moment you say "math", and ignore the other "higher" maths entirely. "[i]but few of these things, outside of your recommendations of geometry and symbolic logic are available as courses to make mandatory in college.[/i]" What's wrong with geometry and symbolic logic? I've done both, and it took considerably less time than eighteen credits of calculus and calc prep. "[i]'If you implement ODF faithfully, you're standards-compliant, regardless of any additional features.' Not if your features create data which needs to be stored in the document.[/i]" Yes . . . even if the features create data stored in the document! Software [b]supports[/b] standards, documents [b]adhere to them[/b]. If you use a feature that makes a document unreadable to another fully standards-compliant application, it's the document that's noncompliant -- not the application! As long as the application meets all the requirements of the standard, any additional features have no effect on the application being standards-compliant. After all, adding a plugin to the GIMP so that it can embed messages in images steganographically doesn't make it PNG noncompliant, nor does the fact that it can edit GIF files. Creating a GIF file with the GIMP, on the other hand, creates an image that [b]is[/b] PNG noncompliant. What you're saying is akin to saying that, just because the GIMP can work with layers and save image formats with layers, it's not bitmap compliant. 
In fact, you're basically saying that MS Word is RTF noncompliant just because it can edit OOXML files! What matters for RTF compliance is whether it supports the entire RTF standard in this case -- and, in the case of the document, what matters is whether the document makes use of any features that are not RTF compliant themselves. "[i]Let's pretend, for example, that ODF didn't support italic text. But you want your app to be able to do italic text. So now, your choice is to extend ODF in a non-standard way to have an italics marker, or to not have the feature.[/i]" If you include the italic feature in OpenOffice.org, despite ODF not supporting it (in your hypothetical example), that doesn't mean OO.o is ODF standards-noncompliant. It means that, in any document where you use the italics feature, [b]the document[/b] is noncompliant. . . . or are you trying to tell me that Vim is XHTML 1.0 Strict standards-noncompliant, and I'm violating standards by using it for web development, just because I could write HTML 3.0 with it? I guess there's [b]no such thing[/b] as a standards-compliant web design application, then.

Justin James

". . . which has little to do with being a good developer, and a lot to do with having learned certain things about bureaucracy, and certain buzzwords and passphrases that help you scam your way through the bureaucracy as well as the rest of the corporate politicians." Which is exactly the point I'm trying (rather poorly, judging by the confusion) to convey. These "assembly line" shops are *not* about being a "good developer", they are about fitting into a machine guided by a few senior developers and/or architects. They plan the code out practically to the function level or even the pseudocode of functions level, and the programmers merely implement that plan. In those situations, it's the ability to fit into that machine with minimal adaptation required that decides your level of "fitness" much more than your ability to program well. "I thought we were talking about what makes a good programmer -- not the knowledge needed to correctly perform the secret handshakes of corporate politics." We were; I was trying to contrast it to the unfortunate reality that a lot of software barely manages to get written in. The things that make self-taught people really good developers on average makes them a poor fit for these assembly line environments. A furniture factory can't make use of the chair legs I hand carve, but a local furniture company that specializes in artistic pieces can. Factories value sameness over quality, since sameness *is* the exact quality that factories strive for. "Calculus is a far less effective means of gaining that background in "pure logical thinking" than many other approaches that would, as a bonus, provide a whole lot of other knowledge that is a lot more relevant to computer science, programming, and so on, than the bits of calculus that aren't just "pure logical thinking". That's my point -- which you seem to have just blown right past. ... Why subject someone bound for a programming career to calculus when equal time spent playing Go would be so much more useful to that person? ... You can get the same thing from a combination of high school geometry and Symbolic Logic 101." I think the reason why Calc in particular is pushed so hard in college for that education is due to the historic linkages between CS departments and math departments. I've said elsewhere (and here) many times that Calc is not the only, and certainly not the best way of getting that experience. Indeed, simply implementing some really common library routines in Lisp or Scheme (I did it in EdScheme) is a much better way to learn it. I love chess and sudoku. I'm currently fascinated by a word game on my cell phone called "Bookworm"; I highly recommend it. but few of these things, outside of your recommendations of geometry and symbolic logic are available as courses to make mandatory in college. "If you implement ODF faithfully, you're standards-compliant, regardless of any additional features." Not if your features create data which needs to be stored in the document. Let's pretend, for example, that ODF didn't support italic text. But you want your app to be able to do italic text. So now, your choice is to extend ODF in a non-standard way to have an italics marker, or to not have the feature. ODF is probably a really bad example; while I've not read the spec (all 700+ pages of it), I am sure that it includes hooks for stuff like this. But I hope it makes my meaning a little bit more clear.
Standards have a tendency to put a limitation on what can be done, or they provide a very massive system for allowing the implementer to extend the standard dynamically, in a way that any other implementer of the standard can use your extension with zero additional coding. Lisp works like this; indeed, that is the whole point of Lisp. I feel like Smalltalk might work like this. But very few standards work like this. J.Ja

apotheon

"[i]If I walked into a shop with a standardized way of doing things, and they handed me a code standard book that said, 'use Hungarian Notation' I would have been baffled. And they would be scratching their heads wondering if I really had been programming for well over 15 years like I claim I have.[/i]" This is not a matter of failing to find critical information interesting enough to learn. It's a matter of not having learned the specifics of Hungarian Notation yet. . . . and it's worth noting that most places that supposedly use Hungarian Notation don't even know what they're doing -- don't understand Hungarian Notation at all. For instance, real Hungarian Notation doesn't involve "int" being part of the name of a variable -- that's not the kind of "type" intended by Simonyi when he invented the style. "[i]There's also a huge range in how people self-teach. Some people might go to MIT's Web site, find their courses that they offer online, and essentially give themselves an MIT Comp Sci degree on their own. Other people will simply start in the hobbyist/enthusiast category and realize one day that they know enough to do it professionally, and go for it. And some people will deliberately go through the same steps as a hobbyist/enthusiast, but with the explict goal of starting a career.[/i]" . . . all of which is fine. Each approach tends to prepare someone for different sorts of work than the other approaches. As long as you stick to the work you can reasonably do effectively, you're golden. "[i]But I would never fit in at a place like IBM or Microsoft[/i]" . . . which has little to do with being a good developer, and a lot to do with having learned certain things about bureaucracy, and certain buzzwords and passphrases that help you scam your way through the bureaucracy as well as the rest of the corporate politicians. "[i]There is simply too much common terminology and knowledge missing.[/i]" I thought we were talking about what makes a good programmer -- not the knowledge needed to correctly perform the secret handshakes of corporate politics. "[i]I feel that Caluclus does provide a background in pure logical thinking, which is really helpful to a programmer.[/i]" Calculus is a [b]far less effective[/b] means of gaining that background in "pure logical thinking" than many other approaches that would, as a bonus, provide a whole lot of other knowledge that is a lot more relevant to computer science, programming, and so on, than the bits of calculus that aren't just "pure logical thinking". That's my point -- which you seem to have just blown right past. Yes, you can learn logical thinking from calculus, but its ineffectiveness in this regard is pretty easily demonstrated by the really abysmal logical thinking skills of the vast majority of people with nontrivial calculus educational background. Forcing eighteen credits of calculus on CompSci students just to expose them to logic is a bit like forcing eighteen credits of anatomy on civil engineers just to expose them to concepts like leverage and structural design. "[i]Heck, if they play crossword puzzles or chess 3 hours a day, it will be just as, if not more so, useful than taking Calc.[/i]" My point exactly. Why subject someone bound for a programming career to calculus when equal time spent playing Go would be so much more useful to that person? 
"[i]Personally, I am grateful for the Calculus, because it taught me to formally express the steps I took in my head, and taught me how to debug a problem when the results on paper were not what I expected in my head.[/i]" You can get the same thing from a combination of high school geometry and Symbolic Logic 101. "[i]See above. The point was, if they told me to use it, I would not have known what they meant, even though I use it.[/i]" See what I said above -- the point is that secret handshakes were not the focus of any of my points. Being a good developer is what I addressed. "[i]I think where the classroom education is helpful is when they actually teach and show how the standards and how to follow them.[/i]" . . . except that most classes teach things incorrectly when it comes to standards, and often permanently damage students who aren't sufficiently resilient to overcome the handicap of trusting the knowledge imparted by dolts with tenure. "[i]Let's say that I'm working on a word processing app, using the ODF format as my default format. I can't add a feature that ODF does not support, for example.[/i]" Sure, you can. If you implement ODF faithfully, you're standards-compliant, regardless of any additional features.

Jaqui

With the code re-use of libraries, we are effectively at the assembly line stage. When there was not a huge collection of pre-written code available for re-use, it was the manual labour stage, done by artisans. What is needed, to help the industry gain strength and consumer trust, is for the current assembly line to be staffed by artisans who are permitted to make improvements to the process. My comments about the standards for basic security, networking security, and application-"class" functionality being a "common ground" were about giving us the numbers of people skilled enough to see where and how areas of the assembly line can be improved, yet with a solid enough understanding of the critical security issues to avoid creating zero-day exploits when implementing a change.

Justin James

"I don't see what's wrong with only learning what's interesting or needed. The problem is if the autodidact doesn't find critical knowledge interesting." This was the point I was trying to convey with my Hungarian Notation example. If I walked into a shop with a standardized way of doing things, and they handed me a code standard book that said, "use Hungarian Notation" I would have been baffled. And they would be scratching their heads wondering if I really had been programming for well over 15 years like I claim I have. :) ". . . and if it's not needed, then you don't need it. What's the problem?" The problem is, an organization at the point where the programming is basically "engineering" requires an awful lot of shared experiences and common knowledge. Most of that is not needed outside such an environment. It makes entering an environment like that extraordinarily difficult. Of course, many (probably most) self-taught programmers have no desire to work in an environment like that, either. There's also a huge range in how people self-teach. Some people might go to MIT's Web site, find their courses that they offer online, and essentially give themselves an MIT Comp Sci degree on their own. Other people will simply start in the hobbyist/enthusiast category and realize one day that they know enough to do it professionally, and go for it. And some people will deliberately go through the same steps as a hobbyist/enthusiast, but with the explict goal of starting a career. Personally, I started down the first track in high school, and ended up straddling the second two during college. The result is, it took me many years to learn some really basic things, while I ended up learning a lot of really advanced things. But I would never fit in at a place like IBM or Microsoft, at least not without either going through the MIT online courses (or an equivalent program), or re-enrolling in college and getting a CS degree. There is simply too much common terminology and knowledge missing. "It might help understand my position if you remind yourself that I'm of the opinion developers don't need to know calculus unless their work involves calculus." I agree, but I feel that Caluclus does provide a background in pure logical thinking, which is really helpful to a programmer. As long as they get it somewhere, somehow, I'm happy. Heck, if they play crossword puzzles or chess 3 hours a day, it will be just as, if not more so, useful than taking Calc. Personally, I am grateful for the Calculus, because it taught me to formally express the steps I took in my head, and taught me how to debug a problem when the results on paper were not what I expected in my head. "Hungarian notation is a local style standard, and something like that should never be regarded as a general-purpose development standard." See above. :) The point was, if they told me to use it, I would not have known what they meant, even though I use it. It's the little things like that which make working in a standardized shop tough for the self-taught. "As such, they are often advocates for standards where they make sense. I have been a member of two standards committees myself, because I knew from hard experience in both cases that standards were sorely needed." Yup, that's why I joined the HTML 5 working group. I got tired of complaining about it (even the one time I did) and started doing something about it. 
"But when you find out in practice that it's impossible to code a universal solution to a problem because the standard is inconsistent, then you KNOW WHY you need a standard for XYZ." I think where the classroom education is helpful is when they actually teach and show how the standards and how to follow them. For example, don't just say, "write standard HTML", but do a project where you need to write a basic HTML parser, and then start throwing the garbage that too many HTML authors crank out. I think that would get the message across quite nicely. :) I know that a lot of folks, self-taught or not, simply are not aware of the existence of many standards. I've had too many encounters where I showed someone the RFC or W3C spec or whatever, and they were baffled by what they were looking at. I got interested in specs around 2000, I was writing my own e-commerce system in Perl, and I wanted to use HTTP status 403 to handle logging in, since it was standard, did not require cookies, already implemented in browsers, provided for encryption of login information, and allowed a "remember me" system in the browser, securely. I had two choices, to dynamically edit .htaccess from my code, or have the code re-implement the status 403 authentication system. I chose the latter (to avoid dependence on Apache), and since "no libraries" was a project spec I gave myself (no, I am not into self-mutilation), I had to implement it (and the MD5 hashing algorithm) by hand from scratch (no code copying either was another project spec). At the end of the day, I learned the HTTP spec inside and out, in ways that continue to benefit me. Since then, I followed the path of reading specs and doing my best to adhere to them. But at the same time, there is a lot that I'm not aware of. I didn't know that the ISO had code practice specs until Jaqui posted the link. Now, the ACM has recommendations on a standardized education in various IT related disciplines (http://www.acm.org/education/curricula-recommendations). The question is, do the acknowledged benefits of standards apply to education? I say "yes". What you are both seeing with education is the same thing I see in computing standards. Standards give us a common ground to work on. But they also limit us at the same time. Let's say that I'm working on a word processing app, using the ODF format as my default format. I can't add a feature that ODF does not support, for example. So the standards limit innovation in certain ways. At the same time, because I don't have to spend time figuring out how I am going to store my documents, or writing document converters, since I'm using ODF, I have more time and energy to innovate. It's the exact same principle. A standardized formal education provides the recipient with the common knowledge and experiences that many environments demand, and that self-teaching might not provide. At the same time, the "one size fits all" approach disregards the individual talents and interests. Someone could be a real DB wiz, for example, but drop out before they took a DB course because they hated linked lists and such. Right now, we are in a kind of 1700's level of technology. We can do lots of things, but it has to be done by hand. So we set up workshops with a bunch of highly skilled workers who have been put through a standard training system. What we really need is the equivalent of the steam engine, to allow one person to do the work of 20. And I'd love there to be a programming John Henry, heralding the end of the "monolithic code era". 
But until that day is here, I fear that we are going to be stuck in a world where most developers are churning out bland code on boring projects -- projects that require a fairly standardized set of "best practices", knowledge, and experience, and that are probably fair game for regulation, licensing, and the like. To quote myself, "bleh". :) J.Ja
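For reference, the Digest calculation mentioned above boils down to three MD5 hashes. A rough sketch (RFC 2617 without the optional qop/cnonce fields; every value below is an invented example, not code from that project):

    import hashlib

    def md5_hex(text: str) -> str:
        return hashlib.md5(text.encode("utf-8")).hexdigest()

    def digest_response(username, realm, password, method, uri, nonce):
        ha1 = md5_hex(f"{username}:{realm}:{password}")  # credentials
        ha2 = md5_hex(f"{method}:{uri}")                 # request line
        return md5_hex(f"{ha1}:{nonce}:{ha2}")           # value the client sends back

    # The server recomputes the same value from its copy of the credentials and
    # compares it to the "response=" parameter the browser sends in its
    # Authorization header after the 401 challenge.
    print(digest_response("jja", "shop", "secret", "GET", "/cart.pl", "abc123"))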

Jaqui

Even as we strive for the use of standards, I'll bet we all know that there is a lot of room for improvement in the standards themselves. It's odd how many project leaders, and those involved at the project-specs stage, have no clue that there are a minimum of two standards for every project: the security aspect and the networking protocols. Both have a standard in the ISO list, and BOTH will apply to 99.995% of projects. It doesn't matter what programming language is used, what OS it's for, or what tools are used; the networking and security standards will apply to anything but a stand-alone application that will run in an embedded-type environment.

Sterling chip Camden

... I have to agree with apotheon. Persons who are self-taught are not under the illusion that they have learned all they need to know. As such, they are often advocates for standards where they make sense. I have been a member of two standards committees myself, because I knew from hard experience in both cases that standards were sorely needed. If some professor had told me that "you need standards for XYZ," I would have written that in my little notebook and answered correctly on the exam. But when you find out in practice that it's impossible to code a universal solution to a problem because the standard is inconsistent, then you KNOW WHY you need a standard for XYZ.

apotheon

"[i]The problem is, self taught people tend to only learn what is interesting to them or needed immediately.[/i]" I don't see what's wrong with only learning what's interesting or needed. The problem is if the autodidact doesn't find critical knowledge interesting. "[i]Boring things that are not needed for the task at hand tend to be ignored, unless someone is basically not just 'self-teaching' but following a formal education program on their own.[/i]" . . . and if it's not needed, then you don't need it. What's the problem? It might help understand my position if you remind yourself that I'm of the opinion developers don't need to know calculus unless their work involves calculus. There are other topics far more relevant that get ignored as a result in the typical CompSci course of study, and self-taught programmers make exactly those trade-offs with the CompSci graduates: while the grad learns a bunch of crap that may never be relevant to his development work, the autodidact learns things that are more relevant, and chooses the direction of his career by pursuing knowledge in areas that interest him. "[i]I am a great example. I understood multithreaded programming and was writing multithreaded programs before I knew that the way I wrote variable names was called 'Hungarian notation'. No way could I fit in as a cog in a formal, standardized programming machinery.[/i]" Hungarian notation is a local style standard, and something like that should never be regarded as a general-purpose development standard.

Justin James

"In my experience, self-taught programmers are more concerned with standards than University Java-mill detritus like the average daycoder." I agree (which sounds self contradictory). The "puppy mills" that churn out "shake 'n bake" coders don't provide any information about standards, or even create awareness of them. Meanwhile, self-taught people are (by definition) concerned with learning and doing things right, and almost always eventually discover standards, and even try to adhere to them. The problem is, self taught people tend to only learn what is interesting to them or needed immediately. Boring things that are not needed for the task at hand tend to be ignored, unless someone is basically not just "self-teaching" but following a formal education program on their own. That's why I said: "Until the vast majority of programmers go through a standardized education like engineers do, there can be no widespread usage of standards." It applies to the folks who went through "puppy mills" too. They have the "standardized" part down fine, they just didn't get anything approaching "engineering". The self-taught people have the "engineering", but all too often lack the "formalized" stuff that teaches the boring, rote knowledge. I am a great example. I understood multithreaded programming and was writing multithreaded programs before I knew that the way I wrote variable names was called "Hungarian notation". No way could I fit in as a cog in a formal, standardized programming machinery. :) J.Ja J.Ja

apotheon

In my experience, self-taught programmers are more concerned with standards than University Java-mill detritus like the average daycoder.

Justin James

A huge contributor to the situation is the number of self-taught people in the industry. Until the vast majority of programmers go through a standardized education like engineers do, there can be no widespread usage of standards. On top of that, it is quite popular to portray programming as magic, for a variety of reasons. J.Ja

Jaqui

When you look, there are security specs for applications, security specs for operating systems, specs for what functionality is required of an office suite app, and specs for what needs to be addressed in a DBMS. They don't cover every possibility, or the latest language or trend, but the base app-type specs are a good foundation for project specs to build on. If they were followed more, it would help improve recognition of the fact that they exist, and of the fact that there is a science and some engineering behind software development -- it isn't just magic.

Justin James

There are a few problems here. The first problem is that too many aspects of programming have not reached the level of "this is engineering" needed to be good candidates for ISO (or similar) standardization. The next problem is that programming is still evolving much more rapidly than standards bodies can keep up with. I joined the HTML 5 working group a month or two back, and let me tell you, they are going insane just trying to keep up with the current "state of the art", let alone "setting the pace" for future development. The third problem is that standards tend to limit possibilities and innovation... for example, a word processor that uses the ODF spec cannot implement a feature that the ODF spec does not account for. I think that the point at which the ISO specs can encompass everything (or enough of everything) that we do is about the point at which they will be adopted, and that will also be around the time that programmers start to be licensed. J.Ja
