The ultimate ethical dilemma


Samson Corwell


I don't know about consensus in general, but it seems to exist in the screenwriting world. John Truby, a famous script doctor, says that when you create a character, you should treat a psychological flaw as harmful to the character himself and a moral flaw as harmful to others.

That seems to be the generally accepted attitude on the paths of the creative writing wilderness I have been traveling over the last few years.

When you think about it, that's the way Rand did her fictional characters.

If only life imitated fiction...

Michael



This is about the initiation of force. A reason could warrant a button-presser killing someone close to him, say, to mercifully kill his dying wife. But there is none to kill a stranger. And he is complicit if he presses either.

Not saving a loved one if given the opportunity would be inhuman.
You're rationalizing. If it were about saving a life, the choice would decide who lives, and inaction would save both. Frankly, it's insane to think that because a thing or a person commits to killing several, you should kill fewer so that it does not kill at all.

If you choose a button you are more contemptuous than the maker of the machine. And if the random victim of your decision learned of it before his death, or if the victim's loved ones sought revenge, you should not be surprised that either would kill you in return.


What? If you don't press any button they both die. So you can save one, or not. Are you going to save a loved one or another person at random?

You can save them because you've had the rules of the scenario explained to you. You didn't choose to be in it, but you are... so make the choice. The loved one, the random person, or both?


If it does not pertain to reality, it's irrational. If it's irrational, it's immoral.

Only to imagine forcing some poor sod into this cruel dilemma, let alone the person who's about to die indiscriminately -

is an inessential mind-game with no purpose. It also undermines a rational morality by its inherent nihilism.

Or don't you think words can be immoral/irrational?

Mikee got it right - IF there were a loved one involved (which there's not). I'd save her, then go and kill the guy who put me and her into that nightmare.

Irrationality implies immorality? If anything, I'd argue that the set of everything that is immoral is a subset of everything that is irrational. I know Objectivism operates on a principle of rational self-interest, but you've got to be kidding me when it comes to this. How in any way whatsoever is posing this kind of question, or even simply pondering it, immoral or nihilistic? If you can put your explanation into a two-column proof, then it might make a whole lot more sense. I'm not trying to sound rude or anything, but this is just so...strange to me.

If reality is one's only and final arbiter, then every act or statement should be examined for its rationality/morality.

I have (let's say) twenty irrational/immoral thoughts a day (do I have to mention that this has nothing to do with conventional, duty-bound, or prudish 'morality'?), all of which are variations on a theme:

What would happen if...? What should and would I do if...?

Each imagined 'scenario' has imagined consequences, all of which I hope I introspect honestly: rational or irrational?

Many will not pass the test, and though reluctantly sometimes, I discard them.

Briefly holding that immoral image in my mind doesn't make ME immoral - it's the premise of my irrational thought that is immoral. Not I - unless I lazily condone it, and ultimately carry it out.

In the same way, I thought the premises contained in your opening question were immoral/irrational - but they have no bearing on what your morality is, or how rational you are.


Since force is involved one merely tries to save the greatest value. Sans force one tries to save the greatest value too. Rand once posited somewhere, apropos that--I don't remember where--if a spouse and child were involved and one loved both but only one could be saved, she'd have to choose the spouse. Now remember, Rand wasn't big on maternalism to say the least. (Barbara Branden said there wasn't anything maternal about her.)

Immorality only enters this picture, respecting one being forced to choose, if the greater value is sacrificed to the lesser. This might happen if you are worried about what other people might think about your choice, so you choose based on appearances. Such would be a Peter Keating social metaphysics-ing his way through life. The "Wet Nurse" in Atlas Shrugged was implicitly rejecting that. It was his moral saving grace in the tragedy that took his life.

--Brant

There is an implicit and unjustified assumption in the problem: that the agency saying "push this button or that" is itself a moral agency, and not merely a beneficent or even amoral power passing along information - one that cannot or should not act on it itself, for it has no values of its own involved.

I question, too, whether this question represents "the ultimate ethical dilemma," for that has not been established or addressed.


You can save them because you've had the rules of the scenario explained to you. You didn't choose to be in it, but you are... so make the choice. The loved one, the random person, or both?

Hardly. The subject is being compelled to act for the would-be murderer. Instead of the buttons (or machine, or whatever it is) doing the killing, it is allowing him to do the killing. But he may still decide not to be a party to the decision, despite being put into the situation against his will.


Somehow you think that my will would be tied to that of the guys holding my family for ransom. But I don't control them and therefore I have nothing to "wash my hands" of. Besides, you're not considering alternatives. You didn't ask if I would try to save them nor if I would risk my life for theirs.


Samson,

Come to think of it, morality is just a code, not a specific code.

So as long as there is an aware conceptual consciousness, I suppose morality is possible in all situations for it. Just not a rational morality for situations where no rational choice is possible.

So, to alter my comment: "When rational morality ain't possible, it ain't relevant."

btw - Writing that suddenly sent echoes along dark recesses in the alleys of my mind. There are contexts, then there are contexts, and even contexts where that can apply.

I'll have to think about reality, too.

That sucker doesn't like being not relevant...

:smile:

Michael

Michael, The "code" of a rational morality, when all is said and done, is for its practitioner to be able to live a lifetime without guilt and constant fear.

For instance, the 'trolley problem', or any like it, is intended to demonstrate that no man can be perfect. But philosophical skeptics such as Dawkins are not happy with that fact alone - they also want to rub it in, and portray a world in which a random and arbitrary Fate, or evil men, rule us. That's his 'reality', by which we either sacrifice our values, or have them taken from us.

Heads you lose; tails you lose. Whatever you choose, somebody dies. Fear or guilt, or both.

That we are imperfect to the skeptic reveals his premises: those of an ex-mystical intrinsicist, who in his disillusioned and bitter heart is still secretly waiting for Divine Revelation... and to find instant Perfection conferred upon him. Or who finally says: "Screw 'em all, if I can't be moral, then why should anyone?!"

I think we are what we are, not perfect and not imperfect, but constantly moving forward in that direction, with only our reason to guide us.


Tony,

My point is the following:

2 + 2 = 5

Is that math or not math?

It's actually both.

If math means doing stuff with numbers, 2 + 2 = 5 is perfect math.

If math means correct equations, 2 + 2 = 5 is not math at all.

You can use this same reasoning about meanings with morality.

Morality is a category of ideas (whether true or false). And morality is an evaluation of those ideas (with the true ones being morality and the false ones not).

Same word. Two different meanings. People are not always clear about this.

If you want to use both meanings in the same article or in the same discussion, I think it is best to use different words or phrases for the different meanings, or at least qualify them carefully and clearly. That way people will know what you are talking about and there is no danger of false attributions sneaking in from in between the cracks--whether intentional or not.

I, personally, strive for clarity in my stylistic discipline on using the English language, so I try to take my own advice. And I correct myself when I mess up, as I consider double-meaning vagueness to be an error--or writing weakness--and entirely fixable. That is, unless I want to write misleading propaganda or something like that (maybe for a villain in a work of fiction).

As always, my reasoning method is to try to identify something correctly, then evaluate it. Not evaluate it first, then try to identify it.

Michael


If you choose a button you are more contemptuous than the maker of the machine.

Bryce,

So you believe that there is a morally good value you can use as a moral standard in the choice between bad and bad? One that makes a person more bad--more contemptuous--than identifiable evil if he does not adhere to it?

And this good is universal to all such situations?

That's a tall order.

Where no real choice is available except being a sacrificial animal in some capacity, I say what a person does depends on his subjective values at that moment, his emotional state of mind, especially his fear level, etc.--and nobody can blame him morally for any of the choices he makes. Rand also has a quote to that effect somewhere.

In fact, in one respect, her view is the exact opposite of yours (and even goes against what I just said, albeit she said it, too, somewhere--I'll have to dig to find it).

Here is a quote from "The Ethics of Emergencies" (in The Virtue of Selfishness):

The proper method of judging when or whether one should help another person is by reference to one's own rational self-interest and one's own hierarchy of values: the time, money or effort one gives or the risk one takes should be proportionate to the value of the person in relation to one's own happiness.

To illustrate this on the altruists' favorite example: the issue of saving a drowning person. If the person to be saved is a stranger, it is morally proper to save him only when the danger to one's own life is minimal; when the danger is great, it would be immoral to attempt it: only a lack of self-esteem could permit one to value one's life no higher than that of any random stranger. (And, conversely, if one is drowning, one cannot expect a stranger to risk his life for one's sake, remembering that one's life cannot be as valuable to him as his own.)

If the person to be saved is not a stranger, then the risk one should be willing to take is greater in proportion to the greatness of that person's value to oneself. If it is the man or woman one loves, then one can be willing to give one's own life to save him or her—for the selfish reason that life without the loved person could be unbearable.

Conversely, if a man is able to swim and to save his drowning wife, but becomes panicky, gives in to an unjustified, irrational fear and lets her drown, then spends his life in loneliness and misery—one would not call him "selfish"; one would condemn him morally for his treason to himself and to his own values, that is: his failure to fight for the preservation of a value crucial to his own happiness. Remember that values are that which one acts to gain and/or keep, and that one's own happiness has to be achieved by one's own effort. Since one's own happiness is the moral purpose of one's life, the man who fails to achieve it because of his own default, because of his failure to fight for it, is morally guilty.

The virtue involved in helping those one loves is not "selflessness" or "sacrifice," but integrity. Integrity is loyalty to one's convictions and values; it is the policy of acting in accordance with one's values, of expressing, upholding and translating them into practical reality. If a man professes to love a woman, yet his actions are indifferent, inimical or damaging to her, it is his lack of integrity that makes him immoral.

But then again, John Galt said he would kill himself if the bad guys found out about him and Dagny and they tortured her to get him to comply. (As a great example of fictional irony, she did not kill herself, but instead shot a guard in cold blood when the bad guys were torturing him.)

If I understand what you are saying, when a person is presented with a choice between doing something rational and watching his loved one die, or doing something irrational to remove the threat to the life of his loved one, if he chooses the life of his loved one over exercising rationality at that moment, you would find him "more contemptuous than the maker of the [death] machine," i.e., you would condemn him as worse than evil.

I sincerely don't understand your thinking.

In fact, if someone put Ayn Rand in the situation of the opening problem, I don't see her hesitating at all to preserve Frank's life.

Michael


Since force is involved one merely tries to save the greatest value. Sans force one tries to save the greatest value too. Rand once posited somewhere, apropos that--I don't remember where--if a spouse and child were involved and one loved both but only one could be saved, she'd have to choose the spouse. Now remember, Rand wasn't big on maternalism to say the least. (Barbara Branden said there wasn't anything maternal about her.)

Interesting. There is a story about a Roman matron held for ransom or something, who was given the Sophie's choice of saving her brother, or her husband or child (I forget which, maybe both). She chose the brother, saying: I can get another husband and/or child, but I cannot get another brother.


Michael, I think I explained myself well - and if anything, I usually over-write and over-explain.

I've had many decades of making allowances for others' moralities - and ultimately, also getting pushed around by those moralities; I've identified and evaluated - the morality as well as the individuals holding it. Here, in one corner of the cyber-universe, is one place (with its identity on the mast-head) where other people can jolly well make allowances for MY morality, ask questions, or go and read about rational selfishness, if they are interested.


If I understand what you are saying, when a person is presented with a choice between doing something rational and watching his loved one die, or doing something irrational to remove the threat to the life of his loved one, if he chooses the life of his loved one over exercising rationality at that moment, you would find him "more contemptuous than the maker of the [death] machine," i.e., you would condemn him as worse than evil.

Yes, I would, because a man who wants sympathy for committing 'evil' is worse than a man who doesn't. And I'm not talking about someone who's asking for forgiveness for prior questionable actions. This man wants - to bastardize the term - to have his cake and eat it too. He wants not to feel the guilt of his atrocities, yet will not commit to refraining from them if he is under compulsion.

I sincerely don't understand your thinking.

In fact, if someone put Ayn Rand in the situation of the opening problem, I don't see her hesitating at all to preserve Frank's life.

Michael

This is about compelling the subject to murder and the timer is the clincher. The altruist, seemingly left with no other options, believes that to press either button is compassionate and the timer a coward's excuse to not make an ethical decision (to press one of the two buttons). But I think a rational person would see the timer as an altruist's justification for making murder compassionate.


You are presented with two buttons. I..

Blah, blah, blah.... Hardly the ultimate ethical dilemma. In point of fact, this is 50 years old or more, as hundreds - thousands - of young men (mostly men, AFAIK) were placed in missile silos with launch codes.

The ultimate ethical dilemma: You are married and fall in love with someone else.

Here on this board is an astounding essay about the false challenge of metaphysically impossible scenarios. Look for Stuart Hayashi's "Argument from Arbitrary Metaphysics" on OL here:

http://www.objectivistliving.com/forums/index.php?showtopic=4957


You are presented with two buttons. If you press the red button, someone close to you dies. If you push the blue button, a random person dies. If you press neither within a certain amount of time both die. What to do?

My reaction might be to let the timer run out, because I cannot knowingly condemn someone to death by my actions. If someone did opt to press the red button, I would not blame them, but I might not call it ethical.

John Galt had a solution. He said if the bad guys ever took Dagny and tortured her to make him play their game he would jump out a window.

So there is your answer. If confronted with that choice jump out a window. Make sure the window is high enough.

Ba'al Chatzaf


If I understand what you are saying, when a person is presented with a choice between doing something rational and watching his loved one die, or doing something irrational to remove the threat to the life of his loved one, if he chooses the life of his loved one over exercising rationality at that moment, you would find him "more contemptuous than the maker of the [death] machine," i.e., you would condemn him as worse than evil.

Yes. I would because a man who wants sympathy for committing 'evil' is worse than a man who doesn’t...

Bryce,

In other words, when you judge the morality of a person's action, you want to judge his feelings first, then his action? By your own words, you are making what the person wishes others to think about him a fundamental part of your equation.

That's a premise that needs checking.

You went on about guilt, compassion, etc. These were not the issues on the table.

It looks to me like you are using the evaluate first, then try to identify approach. In your post, you even mentioned a target you hate that was not on the table ("the altruist").

To step outside your frame, here's a question. What if a person makes the choice you don't approve of and doesn't give a crap about what others think about him? And he doesn't give a crap what you think about him? And he seeks nothing in the way of guilt, compassion, sympathy, or any of the rest of the issues you added to the problem?

Worse than evil, also?

Michael


John Galt had a solution. He said if the bad guys ever took Dagny and tortured her to make him play their game he would jump out a window.

So there is your answer. If confronted with that choice jump out a window. Make sure the window is high enough.

Bob,

Or you could shoot a guard in cold blood to get at your loved one.

(Not nearly as dangerous as jumping out a window...)

:)

Michael


To step outside your frame, here's a question. What if a person makes the choice you don't approve of and doesn't give a crap about what others think about him? And he doesn't give a crap what you think about him? And he seeks nothing in the way of guilt, compassion, sympathy, or any of the rest of the issues you added to the problem?

Worse than evil, also?

Michael

Unless he's trying to evade responsibility, if he would contentiously shrug off the anger of the family of the random someone he murdered, I would wonder why he had that John Galt-like demeanor. I think I'd be taken aback by it and question him down to see if he could understand the problem of his decision. Because if he could I would demand restitution instead of killing him. Though I don't believe a man so rationally self-interested would comply with the machine. But if he didn't let me question him down, I would put aside my wondering to kill him.

But now that I'm wondering, I wonder if you or anyone else who would press the blue button would accept being murdered so that the subject's close friend would not have to die.

If John Galt or Ayn Rand had made the decision you suggest, either would have been mistaken to do so.

In other words, when you judge the morality of a person's action, you want to judge his feelings first, then his action? By your own words, you are including what the person wishes others to think about him is a fundamental part of your equation.

That's a premise that needs checking.

That's not a premise that needs checking. If I had the inkling I was about to be murdered for my murderer's feelings toward someone I don't know, I wouldn't wait until after he took the action of killing me to kill him - not just because I physically couldn't, being dead, but because his intent is what counts. If, as he placed his finger on the button to push it down, he were suddenly paralyzed and unable to kill me, I would hold him as accountable as his twin in the next death-machine room who wasn't paralyzed and who did murder my twin.


That we are imperfect to the skeptic reveals his premises: those of an ex-mystical intrinsicist, who in his disillusioned and bitter heart is still secretly waiting for Divine Revelation...

Ex-mystical intrinsicist? What the hell is that?
Worse than evil, also?
See Darkseid.

Here on this board is an astounding essay about the false challenge of metaphysically impossible scenarios. Look for Stuart Hayashi's "Argument from Arbitrary Metaphysics" on OL here:

http://www.objectivistliving.com/forums/index.php?showtopic=4957

I've read that [the link] and quite frankly I cannot agree.

But now that I'm wondering, I wonder if you or anyone else who would press the blue button would accept being murdered so that the subject's close friend would not have to die.

Bryce,

I never said I would push anything. I never even hinted that I would. That doesn't mean I would or I wouldn't. It just means you decided I would out of your own biases, not out of anything I wrote.

You misidentified.

This is a very simple case of making an assumption on first seeing something, i.e., instantly evaluating it, then presenting that in the place where correct identification should be.

This--the epistemological method you have used so far in your posts--is the premise I suggest you check. I call the correct method "cognitive before normative" and the error "normative before cognitive." (This is based on the idea that it's very difficult--actually impossible without blind luck--to evaluate something correctly if you don't identify it correctly.)

But it's your mind, your method and your decision.

Getting on to the second part of your statement above, you wonder what people would accept or not. Here you have hit on why the OP's situation is not a question about morality. There is no correct answer. There's not even a good one.

At best, it's a question about personal values under intense far-fetched stress, not about universal moral principles. At the normal level, it's not a situation any of us can expect to encounter for real in our lifetimes, so this thing is play--not morality--for most all of us. Something closer to a crossword puzzle or a riddle than a true ethical dilemma. It's a game and nothing more.

It's great stuff, too, for people to get pissed at each other online without worrying about the consequences of actually fighting. Call it an interactive virtual surreality show.

:smile:

Michael



You misunderstood (misidentified?) me but I wrote it in a way that could be and was misunderstood.

But now that I'm wondering, I wonder if you or anyone else who would press the blue button would accept being murdered so that the subject's close friend would not have to die.

I've read all of your posts so I know you didn't write it. If I had meant it as you thought I did I would have written, "...you who would press the blue button or anyone else..."

And I'm not pissed off. Then again, you haven't written that I was.

