IT Employment

Do software developers bear moral responsibility for tech features that may violate human rights?


In his ZDNet blog this week, Charles Cooper interviews Grady Booch about the morality behind software development. Booch, co-creator of the Unified Modeling Language, talks about the moral questions developers must confront over how governments may later deploy their finished work.

Booch takes particular issue with governments' use of software that might violate human rights, such as the right to privacy.

He cites an example:

"London's installing more video cameras per square mile on the street than anybody else. All right, not a lot of software there. But what happens when they couple that with facial recognition software so I can actually track individuals as they go through the city?"

He compares the conundrum modern software developers face with the issues scientists faced in the '40s and '50s in regard to nuclear power. His point: "I may have the ability to do these things, but should I do these things?"

Another example he uses:

"Let's say I'm working on some bit of software that enables sort of a social networking kind of thing that enables connectivity among people and there's potential for the exposure of lots of information. Well, do I then add a particular feature realizing it may have a coolness factor? But at the same time I may just have found a way that pedophiles can get into this network more easily."

He states that it is an incredible privilege and responsibility to be a software developer because they "collectively and literally change the world." But even if they aren't responsible for how their software is ultimately used, should they bear an ethical responsibility for it? It's an interesting argument. What do you think?

About

Toni Bowers is Managing Editor of TechRepublic and is the award-winning blogger of the Career Management blog. She has edited newsletters, books, and websites pertaining to software, IT careers, and IT management issues.

48 comments
gmedina

When a company or anyone develops software with a good purpose, sometimes they don't see the dark force in it. It is like giving a knife to someone you don't know: he may use it to pass the butter or to make a hole in a person's belly.

MikeGall

"London's installing more video cameras per square mile on the street than anybody else. All right, not a lot of software there. But what happens when they couple that with facial recognition software so I can actually track individuals as they go through the city?" Hmm, I'm not that interesting. If the government is hiring someone to analyze my activity, there is probably a good reason for it. As for the pedophile issue, the same could be said for phones, or even public transportation. Technology itself isn't evil, but can be used for evil. A gun can be used for bad purposes, or it could be used by your wife to fend off an attacker.

cvicd9

Yes, there should be a moral obligation regarding the other uses of software. The problem is who will assign it. The individual, or some other association? UHMMMM

BrokenEagle

There is a difference between developing something for an immoral purpose and developing something that someone else uses for an immoral purpose. I am only responsible for my actions, not for how someone perverts something I made.

ByteBin-20472379147970077837000261110898

This is like holding the gun maker responsible for a person's murder. It's not the tool at fault, but HOW the people use that tool. As with anything, if something is used contrary to the manufacturer's intent, then it's the fault of the person using it. Well, in most cases, that is. In this litigious society, however, people will sue for anything. So whether it's good or not, if you're gonna get sued, you're gonna get sued.

I learned in school that when developing software, the first thing you should do is sit down and decide why you need to write the software, what features there will be, and what benefits and disadvantages there are to each feature. If a feature seems more disadvantageous, then one should leave it out. If there are more disadvantages to writing the software than advantages, then don't write it. IOW, when writing a program, think about what good it would do if it existed. And realize that no matter what you do, be it writing software, or anything at all, SOMEone somewhere is not going to like it. The old saying goes: You can't please everyone. At least if you please the majority, then you may have done something worthwhile.

aaronjsmith21

You know, I have said this before, and I will say it again: WHY DOES IT MATTER!!!! If you are doing something you shouldn't, then you will worry about this, but you shouldn't have been doing that in the first place. If someone wants to follow my every move, let them. I guess you just need to be better at hiding, or have better tools. As some have said in these postings, you can't control what happens to your work. Also, why should someone else get paid for something you can develop yourself? I wish this subject would not come up again!!!!!

SnoopDougEDoug

Everyone bears responsibility for their actions, be it in the work they produce, the actions (and inactions) they take during their daily life, what they buy, who they associate with, and so on. Software folks are no different. I will pretty much do anything that is not immoral, unethical, illegal, or fattening (although I'm loosey-goosey on the latter). If I willingly participate in an activity that I know is illegal, immoral, or unethical, then I should bear the consequences. I tell my teenager that everyone will eventually find out what you do, so be prepared. If it is something to be ashamed of, you probably should not do it. BTW, I have no major issue with cameras on every light pole, as long as access to them is available to everyone. If my tax dollars are paying for something, I want access to it. I also understand that despite blanketing downtown London with cameras, it is no safer than before.

jasocher

What you're asking is for the developer to be responsible for someone else's misuse. If we tried to take into account all the ways in which something could be misused (which is next to impossible), nothing would ever get done. We would have no internet, no email, no phones, etc. Not to mention that morals will vary from person to person anyhow - so who's to say what's good or bad in a moral sense? Pondering this concept is a complete waste of time.

wrlang

Pretty arrogant stuff. Software developers create tools, just like Sears Craftsman. A tool doesn't change the world until someone uses it to change the world. The examples aren't even good ones. The real question is what the tool is supposed to be used for, not what the tool could be used for. Simple answer too. A hammer can be used to kill someone, but does the maker of the hammer have a moral responsibility for someone using the tool for the wrong purpose? Answer is no. If the tool's purpose is to engage in immoral activities, then the answer is yes. If a developer created software to help pedophiles find legitimate youngsters to prey on and allowed pedophiles to avoid law enforcement sting operations, then the answer is yes, that developer is morally responsible and should be jailed.

Solution is also simple. If a developer has a real and immense fear of what the tool may be used for, they shouldn't develop it, or they should build in things that would block immoral use. For example, the cool new software feature checks the feds' pedophile database, and if the name/address in the database matches the name on the user account info, then they can't use the feature. Of course there are ways around this blocker, but at least the developer has tried to prevent immoral use.
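
A minimal sketch of the kind of gate described above, in Python. The registry contents, field names, and matching rule are all hypothetical, invented only to illustrate blocking a feature when account details match a watchlist:

import sqlite3

# Hypothetical demo registry: a real system would query an official database,
# not build one in memory.
def build_demo_registry() -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE registry (name TEXT, address TEXT)")
    conn.execute("INSERT INTO registry VALUES ('john roe', '99 nowhere lane')")
    return conn

def feature_allowed(conn: sqlite3.Connection, name: str, address: str) -> bool:
    """Disable the 'cool new feature' when the account details match a registry entry."""
    row = conn.execute(
        "SELECT 1 FROM registry WHERE name = ? AND address = ?",
        (name.strip().lower(), address.strip().lower()),
    ).fetchone()
    return row is None  # no match: the feature stays enabled

conn = build_demo_registry()
print(feature_allowed(conn, "Jane Doe", "12 Example Street"))  # True
print(feature_allowed(conn, "John Roe", "99 Nowhere Lane"))    # False

As the comment itself notes, a name/address match is trivially evaded, so a gate like this is a gesture of due diligence rather than a guarantee.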

geoallen

I work in the healthcare services industry. My work is rife with the potential for privacy violations. I work diligently to do everything possible to protect the data with which I work. Those who work for me are diligent as well. But that protection only goes so far when others desire to distribute that data. We can only control what is within our own reach... we cannot control what others might do with our work, to some extent. I began my career in IT many years ago programming missile guidance systems for the military. It was my job and I did it the best that I knew how, so that the target was struck exactly, telling myself that collateral damage was less likely if I did, and I prayed and hoped that those who controlled the use of these weapons were smart enough not to use their power.

bchirgwin

Depends. A lot of non-computer things can be used in this manner as well. A gun, for instance. Should gun manufacturers have the same responsibility? Guns would not exist if they needed to be held to such a standard. Even the federal Do Not Call list can be used to make calls (why a company would want to contact someone who doesn't want to be contacted is beyond me, but I am on the list and still receive calls). Even a vehicle can be used in acts against human rights (terrorism). Take the issue of cameras. Just because it is cameras watching the public, this is an issue; if it were actual policemen watching the public, it would not be. Why is there a difference? Is it because it is cheaper to use cameras instead of hundreds of officers?

john

Clearly there are multiple ways of looking at this issue. One way is the difference between custom software and packaged software. If you write packaged software, things are a *little* clearer. For example, as other posters have commented, you could specify in the license agreement how the software can be used (since you hold the copyright). However, in the case of custom software, where you are a consultant/contractor/hired gun, you do not hold the copyright. Sure, you could refuse to put in a feature that sounds dangerous to you, but you may really have no idea how your customer will use your software in the future. Yes, you could ask, but realistically, your customer might not even know at the time. The best test, IMHO, is to ask yourself whether you would mind the software being used against you. If you are not worried, then there is no problem. If I wrote a text editor I would not be worried that someone could write a letter in my name using it. Sure, that could happen, but my software changes nothing in this equation. Look within; the answers are there.

dogknees

While I usually hate the term "slippery slope", it does apply here. If we are not to develop software that "could" be used in a way that breaches human rights, we'd pretty much have to stop development of all software. After all, someone writing a ransom note might use MS Word. It's an old argument and I'm firmly in the camp that says it's not the responsibility of the inventor/developer to decide on appropriate usage of an invention. Unless it's the sort of thing that can only be used in a harmful fashion, that is. For example, developing a bomb that can only destroy an entire planet, and won't operate on anything smaller, is probably inappropriate, and the inventor would bear considerable responsibility. Though even then, if you're in the camp that says it would be preferable to destroy the Earth rather than allow your country to be invaded, you might well disagree.

Tony Hopkinson

If you write a piece of software to be put to an immoral use (bearing in mind morality is wholly subjective anyway), or that you know will be put to use in an immoral fashion, then you have abdicated your responsibility. If someone uses your work in a fashion you or others consider immoral, then that someone is responsible. Pattern recognition is the first step towards AI. Yes, it can track political opponents; it can also track 'political' opponents who've just committed or attempted to commit a terrorist atrocity. You can't have the potential of one without the other. How about defibrillators, are they immoral?

Jaqui

When we put those "feeping creatures" into the software [web-based or otherwise], we should, morally, make sure that we do everything we can to have the software be secure [vis a vis OpenBSD's security audit of their code base: the longest-running OS with the fewest exploits, 10+ years and only one exploit in their default config]. If we don't do everything we can to make it secure, then we SHOULD be legally liable for damages in the event of illegal access. I know that will piss a lot of people off, since it is 100% opposite of all EULAs, and is even opposite of the GNU GPL. But I say: if you are not willing to accept responsibility for your code, then don't release it until you are. An EULA is expressly designed to screw the end user over.

carlheydman_jr

Are gun manufacturers responsible for misuse of their product? At the same time, the tobacco industry was made partially responsible for theirs. It might come down to, "If I don't do it, someone else will, and they'll get paid for it." Should we include long documents of lawyer-speak to absolve programmers and distributors of ethical responsibility? Should we place destruction code to activate if the software is used for a purpose we judge unethical, and who will decide the ethicalness of the use?

Deadly Ernest

that's who'll assign the culpability, but after the fact.

Tony Hopkinson

It's either a given or it's not. Imposing your own moral standards on someone else is futile, counterproductive and demeaning to all concerned. Morality and ethics are subjective and personal, or they are institutionalised dogma fit for the ignorant and those who manipulate them. Only those who don't practice the morals they say they accept are immoral.

ralph.wahrlich

Any tool or procedure can be misused, or cause problems at certain times (e.g. automobiles; guns and knives; seat belts; CCTVs...). I suggest the key tests for releasing any tool would be:
- whether the primary, intended use is for 'good' or 'evil' (keeping the argument simple! :-)
- whether people will in general use the tool for beneficial purposes, after due consideration of human nature and the current state of society.
Without meaning to sound blindly idealistic, I regard something as 'beneficial' if it maximises *everyone's* ability to live their lives in a positive way (in the way they would wish to live), and progresses the overall wellbeing of society. That should be the acid test of whether a tool is released. :-)

Tony Hopkinson

Are you saying you shouldn't do it unless you get paid?

aaronjsmith21

Well, what if the purpose for which the bomb was created was not to destroy Earth? Even though it could be used for that, the purpose is for something like a giant asteroid, or something to that effect. I guess everything has its ups and downs. I guess the fact of the matter is that someone can take something that was created and intended for a wrong use and turn it into something useful and beneficially good.

Deadly Ernest

happening since the creation of public places - the only differences have been in how you do the tracking, from one person following, to a dozen people using radio to keep them boxed in, to watching with binoculars from a high vantage point.

Ed Woychowsky

I agree with you; there can be a world of difference between the developer's intended use for software and its actual use. Take, for example, BELL, a program used by Continental Can Corporation to identify employees who were about to be vested for a pension so that they could be fired, and Sam Spade, a Windows network security tool. BELL was designed from the ground up for immoral and illegal purposes, while Sam Spade has a legitimate purpose. In the case of BELL the programmer knew from the get-go that the program was intended for nefarious purposes, but didn't have the courage to say no and probably get fired. In today's job market that kind of courage is rare; in fact, I only know of one instance where a programmer quit instead of writing a program to rank employees who contributed to the United Way by percentage of their salary. In his opinion there was no legitimate use for that kind of information.

joseph.toro

Consider: IBM and the Holocaust http://www.ibmandtheholocaust.com/ This question is not new. Leaders will always use any new technology for their own agenda. As creators of technology, we can not be responsible for the actions of anyone else, especially world leaders.

highlander718

Of course they are MORALLY responsible. Likewise, the gun maker is MORALLY responsible. Many people can live with this responsibility. As long as it is not LEGAL responsibility... many do not care.

Nodisalsi

There is no authority on Ethics and Moral Values - oh yeah, you might come up with suggestions, but if you think about it carefully enough you'll realise that only the *individual* can really decide. And even when they are comfortable taking on a job, that individual cannot be aware of all the possible abuses and function creep from their work.

.Sherwood

I believe the moral responsibility lies in the intent of the development. If the intent is to do something illegal or maleficent toward unsuspecting or innocent people, then yes, the developer should be held to account. If the intent was to develop a system that is beneficent, but it is then used by someone else in a maleficent manner, then no, there is no culpability on the part of the developer. The argument that "if I didn't do it, then someone else would have" is just plain off-base. If it is wrong it is wrong, regardless of how many people are willing to do it.

Deadly Ernest

Gun manufacturers sell guns by telling people about their power and killing capabilities. The tobacco companies weren't done in for selling tobacco, but for telling people there was no harm in it when they knew there was a danger in the use of the product. A better comparison for the tobacco situation would be the use of opium, heroin, and morphine at the end of the 1800s and early 1900s. These drugs were initially promoted as being only beneficial; morphine and heroin were originally promoted as having no addictive properties. When this was shown to be untrue, the companies selling the products stopped saying they weren't addictive and started advising people that they were addictive. The tobacco companies found out in the 1950s that tobacco was addictive and had minor amounts of harmful toxins in it; they didn't tell the public, but spent the next 50 years publicising that their product was NOT harmful in any way and not addictive. And they got done for lying to the public. People invent many products, and almost all can be used in a bad manner - cars provide transport, but can also be used to kill; knives are needed to cut steak, but can kill people; the same with any invention. The immorality comes with the immoral usage and user, not the inventor. However, if the invention has only immoral uses, then you can raise this argument - some items along this line are bioweapons that target humans only.

tuomo

If you don't take the responsibility, you are not able to work in any secure environment. It may hurt (you), but in the long run it is the only thing to do and, trust me, it will pay back. Over 30 years in the (computer) business I have some war stories to tell - internal and external ethics problems, sometimes tough - but now I have a lot of people trusting me, and for a reason. Not all employers like it, but always ask for what purpose the system/software will be used, and make your own decision about how you would feel being on the other end! A warning: this may get you fired! It never happened to me, but I have seen it happen.

ranthony2

Ethics? Who brought up ethics? Ethics and morality are not the same, since ethics looks inward and derives from one's group. Yes, of course you are morally responsible, just as you are responsible for everything you do. The implied question, the one you probably meant, is the important one: are you culpable? And that depends on YOUR intention, not the intention of the user. Example: you build a really nice hotrod, but somebody steals it and kills a pedestrian; obviously, the driver is culpable and you are not. What's that you say? The thief had an easy time stealing the car because you left the keys in the ignition? No, that won't make you culpable either, not even if you put a "steal me" sign on the car: you still would not be morally complicit in the pedestrian's death -- BUT you, or a court, might decide you should share some responsibility for the consequences of that death.

Another example: you provide no outside lighting on your home, surround the house with dense shrubbery, and put a sign on the front lawn that says "Unguarded money inside"; further, you go on vacation, leaving the front door unlocked, and you don't stop delivery of the newspaper or the mail, so it becomes obvious that nobody is home. A burglar comes and steals everything you own. You did not CAUSE the theft, but are you responsible for it? Damn right you are! -- and you are also very stupid. But are you culpable in that theft? No, you are not guilty, since you did not commit the theft nor cause it. Yes, your stupidity or carelessness made it easy for the thief, but you have no moral obligation to act reasonably or intelligently to protect your property.

aaronjsmith21

I am just spouting: if I can make a useful program and want credit for it, but it turns out after I made it that it could do some bad stuff, should I destroy it? I would try to adjust it to make it not do the bad thing, but in the end it is the end user that should be morally and legally responsible for what it does. For example: torrents!!!! The basis of the creation is great - peer-to-peer software to reduce server loads - although people have used this technology for illegal and immoral purposes. And talking about getting paid, most Linux developers don't receive anything for what they make. But crackers/hackers somewhat favor Linux as their tool of choice. The person developing the software is not going to stop just because that can happen. And if he doesn't do it, someone will.

Tony Hopkinson

all the tech has done is change the quantity of potential monitoring.

Jaqui

Can't argue your point. I think that there should be accountability for exploits that expose confidential data though.

john

Do you know the BELL programmer? Such a simple report could have many uses, and an entry-level programmer would bang it out. Perhaps he was told that the report would be used to notify the employees that they COULD retire (averting potential lay-offs). Perhaps it was a new programmer who didn't think things through before embarking on his first assignment.

C_Tharp

If someone developed code that could deliver patches and apply them so that the user is not inconvenienced or required to take action, but they knew that the same code could be used to do damage (i.e. a virus, worm, etc.), should they proceed because of the benefit, or seek another solution because that is the ethical thing to do?
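
A minimal sketch of the dual-use mechanism described here, in Python. The patch bytes, digest, and file name are invented for illustration; the point is that the silent-delivery code is the same whether the payload is a fix or a worm, and only the origin check separates the two:

import hashlib

# Hypothetical digest: in practice it would be published by the vendor through
# a channel the update mechanism itself cannot tamper with.
TRUSTED_SHA256 = hashlib.sha256(b"vendor-signed patch contents").hexdigest()

def apply_patch_silently(patch: bytes, target_path: str) -> bool:
    """Apply a downloaded patch without user interaction, but only if it matches
    the digest the vendor published out of band."""
    if hashlib.sha256(patch).hexdigest() != TRUSTED_SHA256:
        return False                      # unknown origin: refuse to apply
    with open(target_path, "wb") as out:  # stand-in for the real install step
        out.write(patch)
    return True

print(apply_patch_silently(b"vendor-signed patch contents", "component.bin"))  # True
print(apply_patch_silently(b"malicious payload", "component.bin"))             # False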

MikeGall

"Yes, your stupidity or carelessness made it easy for the thief, but you have no moral obligation to act reasonably or intelligently to protect your property." Hard for the courts to decide was is reasonible paranoia. Small town versus large town could make a difference. What a guy from small town Oklamhoma is required, to know that when they move to inner city New York he needs an alarm system, the club, and a pistol to protect his car? Leaving a small child in the middle of a mall, would be negligent, but you wouldn't be responsible for the pedophiles actions, just for whatever punishment negligience deserves (smack up top the head should still be allowed :)

Tony Hopkinson

for said pedestrian's death if I choose to be. I can be found culpable; another person could feel differently than I. No external body can assign me the responsibility; that's something I take or I don't. That's what I meant. I do have a moral obligation to act reasonably and intelligently to protect others from my property. I decided that; whether you or any other bugger agrees is of no moral or ethical consequence whatsoever. Any legal consequences are simply an immoral and unethical imposition of an external behavioural framework, according to my standards.

Tony Hopkinson

Pretty much my position as well. I have it on authority that torrents are evil; someone from the RIAA said so. Who could disagree with that bastion of morality?

Deadly Ernest

privacy in a public place, and the CCTV surveillance only occurs in public places - we aren't talking about videoing people in their private homes. There are some security cameras in workplaces, but they belong to the employer, i.e. the owner of the premises, and they should have a right to know what's going on in their premises (they are legally responsible for what happens there), within reason. I say within reason as I've heard some US government buildings have security cameras in the toilets to catch people slipping weapons out of concealed bags, etc.

Tony Hopkinson

Tracked the 7/7 guys. It found and pretty much nailed the last lot. If you want to hide from CCTV you can, but you can't avoid looking like you are hiding to human observers. Should we not have had the power to do that? No. Should the power be used to, say, expose a political opponent in an affair? Again, no. Anything can be abused, so your choices are do nothing or take the risk, and at that point the answer is in fact very easy. What definitely needs to happen is that if an agency does abuse it, then they must pay big style. None of this early retirement drivel, or case-cancelled-because-they-won't-get-a-fair-trial sh*t: serious time at Her Majesty's pleasure, coupled with seizure of assets and being barred from holding office again. This is not really a tech issue. Anyone remember the supergun affair, the UK version of Irangate? No computers involved in that, just some DTI forms the government attempted to suppress as evidence, claiming damage to national security, when all they did was damage the government's reputation, and coincidentally would have put four innocent men in jail. Punishment: none.....

C_Tharp

Tech monitoring allows recording, which has more permanence, is easier to duplicate, and is easier to distribute than earlier methods of surveillance. Even the smallest thing can exist forever. As standards and mores change, the action that was recorded may be judged by the new standard. That may work to your benefit or to your detriment. Can you say "paranoia"? Everyone has something to hide. If the potential for bad is too great, should the potential for good be abandoned? There is no easy answer.

Deadly Ernest

in it, but a well written and well tested piece is extremely unlikely to have any such exploitable code in it. Every piece of exploitable code I've seen the actual code for (which I admit is not many) has been poorly written, and investigation showed that it hadn't been well tested either.

Jaqui

that could violate human rights are the features that would allow the legal authorities to track someone, and because they are new, they most likely have exploits in them that would allow others to gain information they shouldn't have access to. [Hand me a court order for information from a database about a specific individual: no problem, you have access legally.] Though a US court order is useless on me; I'm in Canada and would not respect a US court order, as the US courts do NOT have jurisdiction here.

Deadly Ernest

and now back online with most of the job done, but not yet finished. Also on broadband too. Exploitable code in a program is, and should be, ALWAYS accountable, as it's bad work and unprofessional - it comes from insufficient care and testing. This is especially so when the exploit is known about before you start on the latest version and you DON'T fix it.

Ed Woychowsky

It has always been my practice to try to get into the customer's head when getting a new assignment. Although time consuming, knowing the hows and whys does prevent misunderstandings when it comes time to deliver. I admit they could lie, but in that case the developer is probably the same person that maintains the program in production and, depending on geography, should have noticed something was amiss. Unless of course they were in the first round of firings.