Monday, 9 December 2013

Handout on utilitarianism (for A Level Religious Studies etc.)


(THIS WILL BE A HEYTHROP PHILOSOPHY POSTER, FREE TO SCHOOLS)

Utilitarianism

Jeremy Bentham

Jeremy Bentham (1748-1832) is the father of utilitarianism. His follower John Stuart Mill famously summed up the theory like this:

" . . . actions are right in proportion as they tend to promote happiness, wrong as they tend to produce the reverse of happiness. By happiness is intended pleasure, and the absence of pain; by unhappiness, pain, and the privation of pleasure."

Utilitarianism is a form of consequentialism – it says that only the consequences of an act are morally relevant.

Bentham says that the right thing to do in any given situation is to act so as to produce the happiest outcome – which, according to Bentham, is the outcome that produces the most pleasure and the least pain.

Bentham himself developed a “felicific calculus” into which factors such as the intensity and duration of pains and pleasures could be fed in order to calculate the right course of action.

A simple example of such a utilitarian calculation: should I steal that child’s sweets? Doing so might give me the pleasure of eating them. But it would deprive the child of the same pleasure and cause her considerable unhappiness to boot. On balance, stealing the sweets will cause less happiness than not stealing them. So the right thing to do, on this simple utilitarian calculation, is not to steal the sweets.
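One rough way to picture such a calculation is as a signed tally of the pleasures and pains each option produces, weighted (very crudely) for factors like intensity and duration. The sketch below, in toy Python, is purely illustrative - it is not Bentham’s own method, and the scores are invented for the sweets example:

# A toy felicific-calculus tally for the sweet-stealing example.
# Positive scores stand for pleasures, negative scores for pains,
# each weighted very roughly by intensity and duration.
# All the numbers are invented for illustration.

def total_happiness(effects):
    """Sum the signed happiness scores of an action's effects."""
    return sum(score for _, score in effects)

steal = [
    ("my pleasure in eating the sweets", +2),
    ("the child's lost pleasure", -2),
    ("the child's distress", -4),
]

dont_steal = [
    ("the child's pleasure in eating her own sweets", +2),
]

for name, effects in [("steal", steal), ("don't steal", dont_steal)]:
    print(f"{name}: {total_happiness(effects):+d}")

# Prints:
#   steal: -4
#   don't steal: +2
# On this toy reckoning, not stealing produces the greater happiness.

Bentham’s own calculus lists further factors too, such as the certainty, fecundity and extent of a pleasure or pain; the sketch above simply compresses everything into a single invented score per effect.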


The happy-drug counter-example

One glaring problem with the simpler forms of utilitarianism is that they seem prone to an obvious sort of counterexample. What if we could make everyone feel wonderfully happy by constantly injecting them with a happy-drug? Would that be the right thing to do, morally speaking?
No. Turning everyone into blissed-out drug zombies would be wrong. Making people “feel good” may be of some moral importance. But it’s not of overriding importance.

John Stuart Mill
 
Higher and lower pleasures

One way in which a utilitarian might respond to this sort of counterexample is to distinguish between higher and lower pleasures. J.S. Mill does precisely this. An intense, drug-induced reverie may be agreeable. But it produces a pleasure of a very shallow sort compared to, say, the pleasures of the intellect - which, according to Mill, include the appreciation of poetry and philosophical debate. Doping people up to the eyeballs may induce an intense sort of pleasure, but it deprives them of the opportunity to enjoy higher, more important pleasures. Which is why it would be the wrong thing to do.

So, unlike Bentham, Mill holds that pleasures differ not just quantitatively but qualitatively as well.

This distinction between higher and lower pleasures may get the utilitarian off the hook so far as the “happy-drug” objection goes, but it strikes many as objectionably elitist and paternalistic. Is the pleasure of engaging in philosophical debate or listening to Mozart really superior to that of filling one’s belly with chocolate ice-cream? Aren’t such distinctions mere snobbery?
Mill thought not. He argues that only those who have experienced both the higher and lower pleasures are in any position to judge which are best, and those who have had the luxury of experiencing both tend to prefer the higher.
But is this true? Actually, many of those in a position to enjoy both kinds of pleasure like to be seen to enjoy the higher while secretly over-indulging their taste for the lower.

Transplant case

Another classic counterexample to utilitarianism is the transplant case. Suppose you’re the doctor in charge of six patients. The first has a minor medical condition easily cured. The others have failing organs and will soon die without transplants. No replacement organs are available. But then you discover that the first patient can provide perfect donor organs. So you can murder the first patient to save the rest. Or you can cure the first and watch five die. What is the right thing to do?
A simple utilitarian calculation suggests you should kill one patient to save the rest. After all, that will result in five happy patients and only one set of grieving relatives, rather than one happy patient and five sets of grieving relatives. Yet the killing of one patient to save the rest strikes most of us as very wrong indeed.

Act and Rule Utilitarianism

Some utilitarians attempt to deal with this kind of case by distinguishing between act and rule utilitarianism.

Act utilitarianism – each action should be judged solely on its ability to produce the greatest happiness.

Rule utilitarianism - we should follow those rules that will produce the greatest happiness.

A rule utilitarian might say that “Do not kill the innocent” and “Do not punish the innocent” are rules that increase happiness overall. So we should always follow these rules, even on those rare occasions (such as the transplant case) when following them does reduce happiness.

Mill’s Rule Utilitarianism

J.S. Mill suggests that we should be rule utilitarians except where we face a dilemma generated by two rules. Then we should appeal directly to the principle of utility itself.

For example: “do not steal” and “do not allow people to starve” are rules that will generally produce greater happiness. But where I can feed a starving person only by stealing food for them, I must break one or other of these two rules. Under these circumstances, I must then revert to act utilitarianism and judge which action will produce the happiest outcome.
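Carrying on with the same toy illustration (again, a sketch of the idea rather than Mill’s own formulation, with invented scores), Mill’s procedure can be pictured as: obey the rules where you can, and fall back on a direct act-utilitarian comparison only when every available option breaks some rule:

# A toy sketch of Mill's procedure for the stealing-to-feed case.
# Each option pairs its effects (signed happiness scores) with the
# set of rules it would break. All numbers are invented.

def total_happiness(effects):
    return sum(score for _, score in effects)

def choose(options):
    """Follow the rules where possible; if every option breaks a rule
    (a genuine dilemma), revert to act utilitarianism and pick the
    option with the happiest outcome."""
    permitted = {name: effects for name, (effects, broken) in options.items() if not broken}
    candidates = permitted or {name: effects for name, (effects, _) in options.items()}
    return max(candidates, key=lambda name: total_happiness(candidates[name]))

options = {
    "steal food to feed the starving person": (
        [("the starving person is fed", +10), ("the shopkeeper's loss", -2)],
        {"do not steal"},
    ),
    "leave the person to starve": (
        [("the starving person starves", -10)],
        {"do not allow people to starve"},
    ),
}

print(choose(options))
# Prints: steal food to feed the starving person
# Both options break a rule, so the choice falls back on a direct
# comparison of outcomes (+8 versus -10).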

So Mill and Bentham differ in that:

1. Bentham is an act utilitarian, whereas Mill favours a form of rule utilitarianism.
2. Bentham does not distinguish between higher and lower pleasures, whereas Mill does.

A criticism of rule utilitarianism

Why should I follow the rule even in a situation where the result is less happiness? It seems ridiculous to insist that I should tell the truth to the serial killer who demands to know where my children are hiding, even if telling the truth does in general lead to increased happiness. Indeed, it would surely be wrong for me to tell the truth under such circumstances. But it seems that is not something the rule utilitarian can allow (or can Mill deal with it?).

DO THE FOLLOWING AS A SEPARATE TEXT BOX?

Nozick’s Experience Machine

Here’s one last apparent counter-example to utilitarianism from the contemporary philosopher Robert Nozick. Suppose a machine is built that can replicate any experience. Plug yourself in and it will stimulate your brain in just the way it would be stimulated if you were, say, climbing Mount Everest or walking on the Moon. The experiences this machine generates are indistinguishable from those you would get if you were experiencing the real thing.
For those of us who want to experience exotic and intense pleasures, this machine offers a fantastic opportunity. Notice it can even induce higher pleasures - the pleasure gained from engaging in a philosophical debate or listening to a Beethoven symphony need be no less intense for being experienced within a virtual world.
Many of us would be keen to try out this machine. But what of the offer to immerse yourself permanently in such a pleasure-inducing world?
Most of us would refuse. Someone who has climbed Everest in virtual reality has not really climbed Everest. And someone who has enjoyed a month-long affair with the computer-generated Lara Croft has not really made any sort of meaningful connection with another human being.
The truth is we don’t just want to “feel happy”. Most of us also want to lead lives that are authentic. Someone who (like Truman in The Truman Show) had unwittingly lived out their whole life within a carefully controlled environment might subjectively feel content and fulfilled. But were they to be told on their deathbed that it had all been a carefully staged illusion - that there had been no real relationships, that their “achievements” had all been carefully managed - then they might well feel that theirs was, after all, a life sadly wasted.
Again, it seems that “feeling good” is not, ultimately, what’s most important to most of us. Nor, it seems, is arranging things to maximize the feeling of happiness always morally the right thing to do. Secretly plugging everyone into a deceptive, Matrix-like pleasure-inducing virtual world would surely be very wrong indeed.
