What I’m Saying When I Say The Things That I Say

I said this, in reference to the conversation about drones and “just war” theory:

Moral arguments are generally pretty intuitive, unless what you’re really trying to do is justify power or violence.

And since I have a bunch of smarty-pants friends who make interesting counter-points, I feel the need to elaborate.

Intuitive probably isn’t a great word to use. It’s a cousin of “common sense” in that both ideas try to make some set of assumptions or base of knowledge universal when they’re actually particular. I’m not sure what the Webster’s definition of intuitive is, and anyway I think it’s obnoxious when people cite definitions. By intuitive I basically mean “agreeable” or “easy to accomplish.” When I say a software program is intuitive, I mean that I can make it work without too much difficulty.

Here’s an example of what I’d consider an “intuitive” moral argument. It’s really easy for me to make the argument to other law-abiding American citizens that it should be illegal for us to kill each other. So long as both of us have a basic capacity to practice empathy, that’s an agreeable position for everyone involved.

What’s “non-intuitive” is me trying to explain to an innocent Pakistani teenager why it’s morally acceptable for my country to kill them. If our positions were reversed, I wouldn’t find the pro-drone argument readily agreeable. And those who are trying to argue for drone strikes that will inevitably cause “collateral damage” are going to need to invoke some convoluted reasoning meant, ultimately, to paper over the fact that an unfair balance of power is in the picture.


5 comments

  1. Steve Elliott

    I’ll reposition my counter to your original post w/r/t the two examples you put forth here. My point wasn’t that a drone strike isn’t more morally complicated than your run-of-the-mill murder (it is). My point was that invoking the idea of moral intuition is dangerous because it’s intellectually baseless. It assumes that others’ esoteric moral judgments will be roughly aligned with my own. Once we accept that assumption, it seems to me that we’re perilously close to accepting moral relativism; because, in practice, our moral compasses aren’t all aligned to the same true north. They’re shaped by our cultures/upbringings/etc. (It’s funny to note the paradox here: the assumption of intuitive moral absolutes causes us to unwittingly slide into moral relativism.)

    At the same time, of course nearly everyone would agree that murdering a countryman is wrong, but even that story can take different forms when we move past relying on intuition. The question of legality adds another murky layer here, but to keep it straightforward: a deontologist (read: Christian) might argue that murder should be illegal because it is an evil act in and of itself. A consequentialist might argue that murder should be illegal because if it were not, the cost to society would be catastrophic.

    It sounds to me like you don’t believe drone strikes are morally justified; I think they may be, but I don’t know. I feel that the moral judgment relies heavily on the quality of the government’s intelligence in any given situation. In any case, I don’t see the fact that the morality of drone strikes is not easily intuited as a feather in the anti-drone argument’s cap. Often, and especially in matters of war, the most morally ambiguous issues are also the most urgent ones.

    • JaredHillaryRuark

      I’m not trying at all to attach the notion of moral intuition to a set of deontological moral absolutes. Let me delve a little bit deeper into the moral assumptions/reasoning behind the examples I gave. A moral argument is one that addresses all parties involved in a decision. If a decision in no way affects someone other than you, then it doesn’t need moral justification. Ethical conflicts are always social, in other words. This is where I draw a distinction between a moral argument and a justification for the exercise of power or violence. The latter does not take into account whether or not the terms of the moral argument are agreeable (or even reasonable) to all parties involved in the decision. It is not reasonable to expect an innocent to accept their fate as “collateral damage” because no one I know of would accept those terms if their positions were reversed. I’m saying, then, that Just War theorizing is very rarely moral. It’s power dressing up in the disguise of morality, hoping no one notices.

      To the deontological/relativism piece: So far as I can tell, there isn’t any grounding at all for ethical positions outside of shared experiences or the imagined approximations of shared experience (empathy). I guess you would call that relativism, although I think it’s a relativism grounded in concrete experience and social consensus. The assumption I’m making is that a majority of people are capable of listening and aren’t complete sociopaths. Unfortunately that seems like pie-in-the-sky stuff a lot of the time, but I don’t know what else you’re going to hang your hat on, morally speaking.

  2. Steve Elliott

    Full disclosure: I’m a pretty cold-hearted consequentialist, at least in the abstract. A drone strike is undoubtedly a massive exercise of power and violence, but in my mind that fact has little to do with its (im)morality. Although I see your point about social ethics, I don’t believe that a proper moral argument (“one that addresses all parties involved in a decision”) is necessary for a correct moral act; I’d contend every act is ultimately performed at the level of the individual, regardless of the input/dialogue that led up to the decision. (I’d also contend that every act or nonact has a moral dimension… so I would not draw a distinction between a moral argument and a justification for the exercise of power or violence.) In my view, the correct moral act is the one that produces the greatest good, regardless of the nature of the act. (I’ll demur on defining the greatest good… that’s a whole other rabbit hole.) To grossly oversimplify: if a drone strike saves more lives than it takes, it is justified. That said, it will almost certainly be impossible for anyone (including Obama) to be certain that any drone strike has ever been justified by that litmus test.

    You’re right that there is no grounding for the vast majority of people’s ethical positions, and there might not be any for mine either; maybe my moral philosophizing is just window dressing for a base intuition. “A relativism grounded in concrete experience and social consensus” is indeed the best we can hope for in the real world. I’m just not a very practical philosopher.

    • JaredHillaryRuark

      Sorry, I just now got a chance to give this a read. (I had to work on some stuff and also watch the entire first season of House of Cards.)

      I don’t think I have any big issues with consequentialist ethical reasoning, but I think it gets murky pretty quickly, especially when applied to large-scale acts. How do you define, or even try to measure, “the greatest possible good,” for instance? As you point out, there’s probably no way to know whether a drone strike passes muster from the standpoint of saving more lives than it takes. I’d also point out that whenever those discussions start happening, some lives tend to be weighted a lot more heavily than others.

      I’m personally skeptical as to whether a state power knowingly killing innocents does much at all to squelch terrorism in the long run. The argument that actions like drone strikes are exactly what causes terrorism seems at the very least plausible to me. I have thoughts about why we prefer the costs of war (financial and human) to those of a relative peace (peace is costly, too!), but I’ll probably save those for another time.

      Anyhow, I think the problem with defending actions according to fuzzy or unprovable consequentialist notions rather than more concrete and widely acceptable moral reasoning is that it ultimately equates might with right. People are going to see through a moral justification that’s really just a cover for power, and eventually you’ll have no moral legitimacy at all.
