
Do no harm #33

Open · Widdershin opened this issue Feb 27, 2018 · 14 comments

Comments

@Widdershin
Owner

@chbarts makes a good point over on Lobste.rs: https://lobste.rs/s/yojsj3/programmer_s_oath

Point zero is arguable. Is it never permissible to write code that hurts others? Well, that’s the same question as whether self-defense is ever justified, or whether the existence of weapons is ever justified, or whether the existence of a military is ever justified. Not being a pacifist, I cannot agree to it as it stands, and being a pragmatist, I suggest you punt the issue as being outside of what this oath should try to make all programmers everywhere agree on.

@mo-g
Collaborator

mo-g commented Feb 27, 2018

I feel strongly in favour of the 0th Tenet.

Not everyone will want to sign this oath regardless of what it does or doesn't contain, and it is impossible to make a universally compatible version of the oath without rendering it meaningless.

Not to commit reductio ad Hitlerum (but I'm going to commit reductio ad Hitlerum): is the Hippocratic oath meaningless because some of those who are medically trained go on to develop VX, or to research the effects of polonium ingestion for weapons use? By the same argument, one could claim the point about privacy is arguable because of terrorism, or because companies need targeted advertising to survive, or for many other justifications.

One could argue that a purely technical, rather than moral, oath is appropriate:

Always test your code.

Commit to respecting style guides.

Start every project by creating a repository in a code management tool.

But that wouldn't meet the stated goals of the Oath.

@chobeat

chobeat commented Feb 27, 2018

There are already plenty of technical oaths. A moral one is indeed required and desirable. Justifying violence and oppression, or privacy violations, just because "there's a need for it" is a rather weak argument. We lived for thousands and thousands of years without targeted advertising, and I'm quite sure we will be able to get rid of it in a matter of a few decades (in the worst-case scenario; hopefully sooner). It's not a historical constant, and we shouldn't accept it as a fact of life.

@benwr

benwr commented Feb 28, 2018

I will not sign an oath that includes the 0th tenet or a tenet that is very similar to it.

To truly accept the 0th tenet, one must do at least one of two things:

  1. Believe that there is a clear and morally important separation between action and inaction, or
  2. Accept that they must spend every moment, every resource, every thought, trying to optimally reduce harm and exploitation.

I do not believe 1 (and I think that those who do are wrong, though I also expect this to be very common), and I cannot agree to 2.

@benwr

benwr commented Feb 28, 2018

I would be much more amenable to @Widdershin's suggestion from lobste.rs:

I will only undertake honest and moral work. I will stand firm against any requirement that causes unnecessary harm.

@mo-g
Collaborator

mo-g commented Feb 28, 2018

@benwr

I would be much more amenable to Widdershin's suggestion from lobste.rs:

I will only undertake honest and moral work. I will stand firm against any requirement that causes unnecessary harm.

That's... literally the current text of the 0th Tenet?


Edit: Just to clarify to everyone, I'm an idiot and I can't read.

@Widdershin
Owner Author

Not quite:

- I will only undertake honest and moral work. I will stand firm against any requirement that exploits or harms people.
+ I will only undertake honest and moral work. I will stand firm against any requirement that causes unnecessary harm.

@benwr

benwr commented Feb 28, 2018

I will stand firm against any requirement that exploits or harms people.

vs

I will stand firm against any requirement that causes unnecessary harm.

"unnecessary" is very important to me from this sentence

@Widdershin
Owner Author

Widdershin commented Feb 28, 2018

Also, it has been posited that this discussion centers around pacifism.

I was interested to learn last night that even Gandhi viewed violence/harm as being sometimes necessary.

http://www.mkgandhi.org/nonviolence/phil8.htm

@mo-g
Collaborator

mo-g commented Feb 28, 2018

Derp. This is why I don't pull all-nighters any more. My eyes stop working around 23:00. Sorry about that. (Edit: I think this was literally a case of 'change blindness'. Fascinating.)

I'd not oppose the change from 'harms people' to 'causes unnecessary harm', but I'm strongly in favour of retaining the 'no exploitation' clause. I can understand the difference between 'any harm' and 'unnecessary harm', and I had a fascinating conversation about the 'thou shalt not kill' commandment with someone I showed this to earlier. But: what would be a valid scenario in which to exploit someone?

@Widdershin
Owner Author

Widdershin commented Feb 28, 2018

I dropped the 'no exploitation' clause in my proposed change because I felt it falls under unnecessary harm, but I'm happy to make it explicit.

@mo-g
Collaborator

mo-g commented Feb 28, 2018

It seems a valid thing to make explicit, given the rise of 'exploitation' as a business model among start-ups. So, my proposed amendment to tenet 0 is:

- I will only undertake honest and moral work. I will stand firm against any requirement that exploits or harms people.
+ I will only undertake honest and moral work. I will stand firm against any requirement that exploits people, or causes unnecessary harm.

@mo-g
Collaborator

mo-g commented Feb 28, 2018

@Widdershin

Also, it has been posited that this discussion centers around pacifism.

I think that's accurate. But in @benwr's support, the argument holds true before we reach a "Godwin's Law" scenario of actual violence: the concept of 'necessary harm' also underpins industrial action, whistleblowing, and many other 'socialist' and 'pacifist' actions.

Edit: A further example of 'necessary harm' that's perhaps a little more relevant to the oath's target audience: the case where a security researcher publicly discloses an unfixed exploit to force an uncooperative developer to patch their code, after the developer refused to do so, or delayed doing so, following a responsible private disclosure.

@zyphlar

zyphlar commented Feb 28, 2018

To go full Nuremberg/Milgram: it's been repeatedly demonstrated that lots of people can be convinced that harming people is necessary. If you want this oath to dissuade participants in, say, a genocide, or more commonly a manipulative ad platform, then you're going to have to give it teeth. Words get twisted by society to fulfill its ends.

I think the effect of the Hippocratic Oath was largely to make doctors neutral. It makes a point of not using knives, healing both freedman and slave (presumably also friend and enemy), and keeping clients' behaviors secret (even if you'd otherwise be tempted to, say, report them to authorities). In situations of injustice, the Hippocratic Oath remains neutral. It effectively communicates the value that life and health are higher priorities than human or religious conceptions of justice, conflict, or even law. I think doctors see this neutrality as protecting them from the messiness of taking sides in conflicts, à la Red Cross vehicles in a warzone. They value mercy over justice. Presumably if an American doctor ran across a Nazi and a Jew, and the Jew stabbed the Nazi in self-defense, the doctor would heal the Nazi, thus allowing them to continue their fight. (There are stories like this in Afghanistan of American doctors saving the lives of terrorists who had just attacked the very base they're being treated at: this part of the oath has very real and significant consequences.) Neutrality also means that others who are not neutral may use you for their own non-neutral ends. (The doctors are housed at the American base, not in the Iraqi hospital. It's just a coincidence that soldiers often leave wounded Iraqis behind when they retreat... right?)

I am not sure that this sort of neutrality is a valuable thing for programmers to swear to. I think in these morally complex times, where your GPS code and chip may literally be used in both guided missiles and hiking gear, it deserves carefully worded consideration. It may sound social-justicey, but if "remaining neutral in situations of injustice is taking the side of the oppressor," then the oath should reject the neutral aspects of the Hippocratic Oath and be explicitly pro-justice (as hinted at in the pro-morality preamble). Spell out what that means. Think: you are an engineer at Twitter. A Jewish user posts a message calling a self-identified Nazi user an asshole. The Nazi user responds by quote-tweeting the Jewish user for all his followers to see. What does this oath say you should consider doing? When your code becomes the ground on which and the weapon with which people fight their battles, and you purport to use your code morally and justly, can your code afford to remain neutral? One of the apparent aims of this document is to discourage programmers from working for companies that would use their skills towards immoral ends: if we were doctors, it sounds like we'd have to be conscientious objectors to military service. That has to be emphasized.

And to demonstrate how "exploit" can have exactly the same problems: that exact word is used in capitalism to mean "to utilize for profit", even though, practically speaking, it does create people who become exploited. That is its own Marx rabbit hole, but again, clarifying exactly how much exploitation you think is moral is a useful exercise; otherwise the clause is pretty meaningless. I'm personally okay with this oath if it implies that the only ethical code is Stallmanesque, but it's going to exclude big chunks of programmers, including myself.

(And yeah, I think waaaay more programmers need to take a college ethics course, because these questions have not been fully answered yet. I have ideas for what oath I'd swear to, but I don't have suggested changes. These are more questions for the maintainer. This document is basically a statement of normative ethics, which immediately raises the question of how far down any particular philosophical branch we want to go. Maybe a reading of Applied Engineering Ethics is required. For example, it states that "The paramount value recognized by engineers is the safety and welfare of the public", which seems appropriately Hippocratic without neglecting the duty to care for all citizens. But if your society considers some humans to be vermin, it's easy to see how this value could be twisted to justify extermination as contributing to "the safety and welfare of the [non-dehumanized] public." Inclusive and explicit language like "all humans" or "all life" seems needed. But again we're back at the question of whether or not a weapon counts as contributing towards "safety." Maybe you're choosing to remain neutral on that and just emphasize that "self-defense" weapons should operate in a manner safe for their users. But online ads that mine bitcoins in users' browsers are also safe for their operators. I think the oath I'd swear to would be quite pacifist and utilitarian: "welfare of all living things." The ACM Code seems good, but verbose, and still limited to "people" and allowing "justified harm," whatever that is supposed to mean.)

If this oath doesn't change the status quo, is it even necessary? What about the status quo do you feel needs changing? Are existing engineering oaths not sufficient?

@vassudanagunta
Contributor

@zyphlar I ❤️'d your comment, but that doesn't mean I think the Oath should get more specific. See this other comment of mine.
