
Do software developers bear moral responsibility for tech features that may violate human rights?

In his ZDNet blog this week, Charles Cooper interviews Grady Booch about the morality behind software development. Booch, a co-creator of the Unified Modeling Language, talks about how developers must confront questions of morality regarding how governments later deploy their finished work.

Booch takes issue in particular with the government's usage of software that might violate human rights, such as the right to privacy.

He cites an example:

"London's installing more video cameras per square mile on the street than anybody else. All right, not a lot of software there. But what happens when they couple that with facial recognition software so I can actually track individuals as they go through the city?"

He compares the conundrum modern software developers face with the issues scientists faced in the '40s and '50s in regard to nuclear power. His point: "I may have the ability to do these things, but should I do these things?"

Another example he uses:

"Let's say I'm working on some bit of software that enables sort of a social networking kind of thing that enables connectivity among people and there's potential for the exposure of lots of information. Well, do I then add a particular feature realizing it may have a coolness factor? But at the same time I may just have found a way that pedophiles can get into this network more easily."

He states that being a software developer is an incredible privilege and responsibility, because developers "collectively and literally change the world." But even if they aren't responsible for how their software is ultimately used, should they bear an ethical responsibility for it? It's an interesting question. What do you think?


Toni Bowers is Managing Editor of TechRepublic and is the award-winning blogger of the Career Management blog. She has edited newsletters, books, and web sites pertaining to software, IT career, and IT management issues.
