
Would you kill for £3?

  • Post Type / Young Humanists International
  • Date / 5 May 2005

By Tom Stafford

In the 1960s psychologist Stanley Milgram tested a cross-section of ordinary Americans to see if they’d administer potentially lethal electric shocks to a mild-mannered little man sitting in an electric chair. The findings stunned the world.

Yale University professor Stanley Milgram’s 1960s experiments were perhaps the most important ever performed in psychology. He was interested in ‘the dilemma of obedience’ – in how ordinary people could be induced to abandon their moral instincts by malevolent authority. While Milgram was specifically motivated by a desire to understand the Nazis, his findings may just as easily explain our complacency about the injustices of the global economy.

The participants in Milgram’s tests were recruited via a newspaper advertisement for ‘an experiment on learning and memory’ that promised $4.50 for one hour’s work. In the waiting room of Yale’s psychology department they met, on separate occasions, another ‘volunteer’ (actually an actor) – a small, friendly, middle-aged man with glasses. Then the stern-looking experimenter would arrive and ‘randomly’ choose the actor to be the ‘learner’ and the real volunteer to be the ‘teacher’. The experimenter would tell the teacher that the experiment concerned the use of punishment on memory; electric shocks would be delivered to the learner every time he answered a question incorrectly.

The teacher was shown the electric shock apparatus: a generator with 30 switches labelled with voltages ranging from 15 to 450 volts. Each switch also had a written rating: the most innocuous voltage had the assessment ‘slight shock’; towards the other end of the scale there was the caution ‘danger: severe shock’; the final two switches were labelled ‘XXX’.

The experimenter and the teacher would strap the learner into the electric chair, which was partitioned from the main room. The experimenter would stand while the teacher sat in the main room by the shock generator. A row of lights indicated the learner’s responses to the test questions.

The teacher would be told to increase the voltage every time the learner answered incorrectly. The learner had a script that involved him getting questions wrong and performing set responses as the teacher moved up the voltage scale. At 75 volts the learner would begin to grunt with pain. At 120 he would start to shout that the shocks were becoming painful. At 150 he would cry out that he had had enough of the experiment. His protestations would turn to agonised screams at 270 volts. At 300 he would shout in desperation that he would no longer provide answers (the experimenter would inform the teacher that no answer was a wrong answer). Beyond 315 volts the learner was silent.

Shocking results
The question Milgram sought to answer was very simple. What proportion of normal people would continue administering shocks up to the full lethal voltage? What proportion would act as if to kill an innocent person for no better reason than $4.50 and the say-so of a psychology professor? There was no compulsion on the participants to continue. They were not being coerced in any way except verbally. If they questioned the experimenter he would say that he accepted full responsibility for the experiment. If questioned further he simply said: ‘You must go on.’

Before he released his results, Milgram asked a group of psychiatrists what proportion they thought would administer lethal dosages. What did these ‘experts in people’ think? They thought that only one person in a thousand – a ‘psychotic minority’ of 0.125 per cent – would deliver lethal shocks. The real proportion was 65 per cent.

The moral of Milgram’s research is clear: we must beware evil systems more than we must beware evil men. We all contain the capacity to perform evil acts, and will disregard our moral instincts if put in situations that capitalise upon our normal human weaknesses.

To investigate how different factors influence people’s behaviour Milgram implemented a number of variations to his experiment. He showed how important the proximity of the victim was to denial of responsibility; ‘only’ half as many people (still 30 per cent) would administer seemingly lethal shocks if the victim was in the same room. Another variation showed how being part of a group allowed even greater denial of responsibility; when the volunteer was part of a team of three with two additional actors primed to obey the experimenter until the bitter end, obedience was 93 per cent. (If the confederates refused to obey only 10 per cent of volunteers delivered the maximum shock.)

Any normal person in the experiment would have had doubts, but Milgram showed that people usually put such reservations aside if others conform. The ‘dissenters’ in Milgram’s experiment allowed the volunteers to realise that their doubts were legitimate. When people connect their doubts they begin to realise that they are right to worry and wrong to remain silent. This is why, in an age when an increasingly atomised society is fed by an increasingly concentrated media, forming ordinary, community-level connections may be one of the most radical things you can do.

The importance of dissent
Professor Charlan Nemeth, of the University of California at Berkeley, has researched the effect of dissent on group decisions for 25 years. ‘Dissent,’ she says, ‘even when wrong, stimulates the kinds of thinking that leads to better and more creative solutions. While people dislike the dissenter, and will give him/her no credit for the influence on their thinking, they are more likely to read more information on all sides of the issue.

‘They will use more strategies in solving problems and they come up with better solutions. When you have no dissent, there is a tendency to disregard opposing information, rush to judgment and to assume unanimity even when it does not exist.’

People who appear to ignore dissent have been found to adopt minority opinions when asked for their views privately, later or in a different form. One experiment showing this asked groups to judge the colour of blue- and green-hued slides. Each group of six volunteers contained two plants who announced that they saw some of the blue slides as green. During the experiment there was a small but significant effect caused by this ‘dissenting’ minority; a small number of people were influenced to announce that they too saw some blues as green. But the most interesting effect was found after the end of the main experiment.

Afterwards, participants were asked in isolation to look at a continuous colour scale and judge where blue turned into green. Sure enough, all participants – even those who appeared not to have been influenced during the experiment – were more likely to judge borderline cases as green. The minority had altered people’s perception, even if it hadn’t immediately altered their behaviour.

Most psychologists interpret this kind of effect within the framework proposed by Serge Moscovici. Moscovici proposed that while majorities tend to influence people by compliance – immediate, public conformity – minorities tend to influence people by conversion – slow-acting changes in their private thinking. This influence of minority opinions may be so subtle as to affect people without them even realising it.

Subjects who were exposed to disobedience in Milgram’s studies usually reported that they were not affected by the behaviour of the ‘rebels’. They claimed they would have stopped administering shocks anyway. The results tell a different story: compliance with the experimenter’s orders was 83 percentage points higher when the other people involved were obedient (93 per cent, against 10 per cent when the confederates refused).

It looks like the task of the dissenting minority will always be a thankless one. Although it influences other people, it is seldom credited for doing so. We’ll never know, for example, the extent to which the dedication of anti-war activists fundamentally altered the plans for the current Iraq campaign.

Conformity, on the other hand, is the dark side of human sociability. Just as it’s natural for us to love, to share, to give support and to look to others for support, so it is also all too natural to take our lead from the majority, to act as others act, to remain silent when others remain silent. Research like Milgram’s demonstrates just how powerful conformity can be. But the same research also contains seeds of hope: when conformity is the norm, the power rests with dissenting voices. So the moral is clear: although it can feel hopeless to be in the minority, you can have a powerful effect. But you’ll never be thanked for it.

Tom Stafford was, as at 22/05/03, a final-year psychology PhD student at Sheffield University.

Culled from http://www.theecologist.org/archive_article.html?a….
