Software Issues 3: The Ethics of Programming
Writers, filmmakers, and scientists have all been guilty of ignoring the long-term ethical implications of their respective crafts until it was too late. Does software programming have any ethical ramifications? Aren't we just practicing a neutral craft? We're not ministers, therapists, policemen, politicians, or lawyers who have to deal with ethical dilemmas. Certainly the writers of poorly researched news, the producers of violent films, and the atomic physicists all felt the same way at one time.
Software that we write has ramifications in the real world. If it didn't, it wouldn't be very useful. Thus, it has the potential to sweep across the world faster than a deadly manmade virus or to affect society every bit as much as genetic manipulation. Maybe we can't see how right now, but in the future our code will have ever-greater potential for harm or good.
Of course, there's the issue of hacking. That's clearly a crime. Or is it that clear? Isn't hacking acceptable for our government in the event of national security? What about for other governments? Cases of life-and-death emergency? Tracking down deadbeat parents? Screening the genetic profile of job candidates?
Where is the line drawn? Who decides?
Do programmers have responsibility for how their code is used? What if a programmer writes code to pry into confidential information or copy-protected material? Does he bear responsibility along with the person who used the program? What about a programmer who knowingly or unknowingly writes code to "fix the books"? Should he be liable? What about software sabotage? Software warfare? Do programmers have a moral responsibility if they write software that the CIA uses to disrupt communications or in other ways destabilize a foreign country? Given the damage such software could potentially do, isn't that software team potentially more morally bankrupt than the Manhattan Project team? What about the team writing software for sophisticated weaponry? Do they have blood on their hands? How about writing a Web site so that a hate group can spread its message? What about writing the software that powers porn sites? Do developers have any control over how their software is used?
Clearly, one could go on and on. We're already faced with tough ethical questions every day. One thing is certain: In the future, the ethical problems will get even tougher and more critical. We as programmers must start by understanding that we do have ethical responsibilities and we do have the choice to ignore or assume those responsibilities.
Of course, most of us aren't involved in creating child porn sites, doing accounting for the mob, or subverting foreign governments. Do we really have to worry about any ethical issues? The answer is that we can't avoid it. Some of these issues are as fundamental as professional responsibility and integrity.
One example is the common dilemma of what to do when we're directed to implement architectures or code that are plainly sloppy or bad. Do we have any professional imperative mandating the quality level of our craft? Or are we simply hired laborers who stick the brick where we're directed, knowing that the arch will fall and the bridge will collapse? Are we willing to write anything, no matter how shoddy, because that's what we're getting paid to do?
What if doctors had this attitude? What if they agreed to do any surgery as long as the patient demands it and pays well? Remove your left ventricle? Add a third ear? Okay, you're the boss.
In one episode of the old Bob Newhart show, Bob has just published a book on building bookcases. He asks his handyman, George, to build a new bookcase for the bookcase book. He gives George some plans he made, using the directions in his book. George glances at the plans and attempts to offer some suggestions. Bob cuts him off, tersely telling George that he's expected to just build the bookcase according to the detailed diagram. George finally gives up in frustration and goes to work.
At the end of the show, Bob comes in and admires the fruition of his plans. He gloats and brags to George about how well his plans worked out. He goes to his desk, picks up the copy of his bookcase how-to book, and goes to put it on the shelf. The book doesn't fit. All the shelves are too small.
George has the dignity and grace to merely show a satisfied smile, collect his tools, and leave without saying a word.
Certainly this represents an ethical dilemma that all of us in software development can relate to. Many of us have worked for a manager or project planner who has labored over the specifications and is reluctant, or even hostile, when a developer suggests changes. To agree that the plan is imperfect is an admission of fallibility. And plans are never perfect, so we end up with software that meets the plan perfectly but doesn't hold any books.
How hard should the programmer push back, in that case? Should he dig in his heels, refusing to sign his name to shoddy work even at the cost of his job? Should he simply write the code as specified, rationalizing that it's not his job to make changes? Should he ignore the planner and do it correctly anyway? Or, like George, should he make some attempt to offer suggestions but back off if the planner is unreceptive?
And certainly this dilemma also applies to the software planner. What do you do when the client demands software architectures that you know are flawed? Do you lose customers over issues of craftsmanship or give them what they want? If you do give them what they want and it turns out to be poor quality, doesn't that do you more harm than good? Won't the client ultimately blame you anyway, and won't the failure of the project hurt the reputation of your company in the long run?
The catch is that the planner or programmer must also be sensitive to the possibility that he or she could be wrong. A programmer who has his own way of doing things and is not flexible will end up fighting over personal preferences rather than substantive issues of quality, reputation, and responsibility.
There are no absolute answers to these questions, just as there are no absolutes in any ethical choice. The double whammy is that the absence of answers in no way reduces our responsibility to consider these questions and to act on them responsibly.
About the Author
Tyson Gill is the director of information technology at Alitum, Inc., in San Diego, California. He also teaches Visual Basic and Microsoft .NET programming at the University of California, San Diego. He is well known for his influential presentations on design, architecture, planning, and coding. Tyson is the author of Visual Basic 6: Error Coding and Layering (Prentice Hall, 2000, ISBN 0-13-017227-8) and the upcoming title Planning Smarter: Creating Blueprint-Quality Software Specifications (Prentice Hall, 2001).