Stanley Milgram is famous for one thing: in 1961 he set up an experiment at Yale which purported to show that most of us are subservient to authority, and will subject others to extremes of pain merely on the instructions of a man in a white coat. The experiment is sometimes referred to as the Eichmann experiment, because of its supposed relevance to questions about the Holocaust:
In the aftermath of the Holocaust and the events leading up to World War II, the world was stunned by the happenings in Nazi Germany and its acquired surrounding territories that came out during the Eichmann Trials. Eichmann, a high-ranking official of the Nazi Party, was on trial for war crimes and crimes against humanity. The question is, "Could it be that Eichmann, and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?" Stanley Milgram answered the call to this problem by performing a series of studies on obedience to authority. Milgram's work began at Harvard, where he was working towards his Ph.D. The experiments on which his initial research was based were done at Yale from 1961 to 1962.
In response to a newspaper ad offering $4.50 for one hour's work, an individual turns up to take part in a psychology experiment investigating memory and learning. He is introduced to a stern-looking experimenter in a white coat and a rather pleasant and friendly co-subject. The experimenter explains that the experiment will look into the role of punishment in learning, and that one will be the "teacher" and one will be the "learner." Lots are drawn to determine roles, and it is decided that the individual who answered the ad will become the "teacher."
The co-subject is taken to a room where he is strapped into a chair to prevent movement and an electrode is placed on his arm. Next, the "teacher" is taken to an adjoining room which contains a shock generator. The "teacher" is instructed to read a list of two-word pairs and ask the "learner" to read them back. If the "learner" answers correctly, they move on to the next word. If the answer is incorrect, the "teacher" is supposed to shock the "learner", starting at 15 volts.
The generator has 30 switches in 15-volt increments, each labeled with a voltage ranging from 15 up to 450 volts. Each switch also has a rating, ranging from "slight shock" to "danger: severe shock". The final two switches are labeled "XXX". The "teacher" is supposed to increase the shock automatically each time the "learner" misses a word in the list. Although the "teacher" thought that he/she was administering shocks to the "learner", the "learner" is actually a student or an actor who is never actually harmed. (The drawing of lots was rigged, so that the actor would always end up as the "learner.")
At times, the worried "teachers" questioned the experimenter, asking who was responsible for any harmful effects resulting from shocking the learner at such a high level. Upon receiving the answer that the experimenter assumed full responsibility, teachers seemed to accept the response and continued shocking, even though some were obviously extremely uncomfortable doing so.
Today the field of psychology would deem this study highly unethical, but it revealed some extremely important findings. The theory that only the most severe monsters on the sadistic fringe of society would go along with such cruelty was discredited. Findings show that "two-thirds of this study's participants fall into the category of 'obedient' subjects, and that they represent ordinary people drawn from the working, managerial, and professional classes (Obedience to Authority)." Ultimately 65% of all of the "teachers" punished the "learners" to the maximum 450 volts. No subject stopped before reaching 300 volts!
Many people have been troubled by Milgram's work, but the overall tone is well reflected by the article cited above: it was maybe unethical, but nevertheless showed us something important about ourselves.
I've just come across a book called "The Psychological Mystique" by Stewart Justman which has some interesting things to say about Milgram's experiment, prompting me to write this. The book in general is concerned with the spread of psychological explanations, in the tradition of Philip Rieff's "Triumph of the Therapeutic": how the language of psychology displaces our moral responses.
Milgram believed he'd exposed an appalling tendency toward blind submission "even though it took a contrived situation and intense pressure to bring this tendency out and even though there is no evidence that any participant in the experiment ever inflicted suffering on a helpless victim." Suffering was indeed inflicted on some unwitting victims though: the poor wretches who volunteered for the experiments. Reading the responses of some of the subjects as they were told to continue with the punishments is not pleasant:
"I can't stand it! I'm not going to kill that man in there! You hear him hollering?...he's hollering. He can't stand it. What's going to happen to him?...I'm not going to get that man sick in there...know what I mean?...I refuse to take the responsibility. He's in there hollering!...[...] I don't mean to be rude, but I think you should look in on him. All you have to do is look in on him. [...] Something might have happened to the gentleman in there, sir!"
This was from one of the "subservient" moral monsters that Milgram thought he'd exposed, but surely the moral monster here was the psychologist urging the subject on. As Justman says: "The indecency of the incident lies not just in the way Milgram reduces a man to begging and stammering, but in his cool observation of the subject's suffering, and interpretation of it as evidence of the subject's moral block, not his own."
Here's Milgram: "I observed a mature and initially poised businessman enter the laboratory smiling and confident. Within twenty minutes he was reduced to a twitching, stuttering wreck, who was rapidly approaching nervous collapse. He constantly pulled on his ear lobe and twisted his hands. At one point he pushed his fist into his forehead and muttered 'Oh God, let's stop it'. And yet he continued to respond to every word of the experimenter, and obeyed to the end." It never occurs to Milgram that his own behaviour might be subject to the same moral scrutiny he's so keen to apply to this poor man. The subjects are accused of subservience to authority, yet Milgram himself is the one inflicting suffering on others in the name of a higher authority; namely, science. The subjects at least displayed their conflict, but not Milgram. He inflicts real suffering on his subjects and never has a moment's hesitation as to whether he should proceed or not. His subjects go through hell inflicting what is in fact sham suffering - and they're the ones who apparently demonstrate the moral failings of humanity!
This is entrapment; this is fraud. This experiment tells us nothing except the lengths psychologists will go to to justify disgraceful behaviour and then build a career on it. So why is it so highly regarded, and so often cited as one of the key experiments of the 20th century? Because it fits in with the clichéd thinking that we all have a dark side, that we're all capable of being Eichmanns, that underneath the cultured surface we're all monsters. We're all guilty. But as Justman concludes: "we find the Milgram experiment illuminating less for what it reveals about any presumed tendency to dumb obedience than for what it says about the beliefs of a culture where such manipulations could be prized as a contribution to knowledge."
I would go one stage further though. The experiment as normally understood certainly has no relevance to the Eichmann situation. Eichmann and his like were not conflicted about what they did; they didn't suffer agonies of conscience. The question never arose for them, because they accepted the moral consensus. They believed that Europe should be cleansed of Jews because they accepted the Nazi view that Jews were a contamination that needed to be removed. Eichmann didn't sit at his desk sweating away, "Oh God, if only someone else had this responsibility! I know it's wrong, but I can't disobey the Führer!" That's not how it was. He just accepted that suffering was necessary in the name of a higher cause. But there is a real lesson for us now. The parallel to Eichmann is not in the subjects' behaviour but in Milgram's: how people can blithely inflict suffering in the name of a higher cause, never have a moment's doubt about what they're doing, and be lauded for their efforts by their peers.
Excellent post, and timely for me, as it's a subject I've been giving a lot of thought to lately while reading Goldhagen's "Hitler's Willing Executioners: Ordinary Germans and the Holocaust." The other thread of thought it brings to mind is the idea that having an education, even an advanced one, somehow makes one morally superior as well as intellectually superior. As innumerable examples show, absent some independent moral context, education can even be an impediment to moral behavior - as it was in what might be called Milgram's extremely egotistical behavior. (His experiment is so important to him that he's dulled himself to the misery he's causing - misery that would be obvious to a person not so obsessed with the "science.")
Posted by: Solomon | April 03, 2004 at 10:01 PM
Justman's critique of Milgram as you describe it is insightful, and I agree that Milgram's behavior is in fact the most shameful aspect of the experiment. But I don't understand how you can claim that "This experiment tells us nothing except the lengths psychologists will go to to justify disgraceful behaviour and then build a career on it" without pointing to some methodological problem (aside from any moral issue, of course). It's still the case that these subjects continued to inflict what they believed to be ever greater pain on another person. It's heartening that the detailed results show the subjects resisting the scientist's orders--just one other conscience around would probably have steeled them enough to say no. Nevertheless, they did submit to the scientist's authority. I think the main problem in applying the results broadly is the peculiarity of the situation--the authority figure is a scientist, which had its own peculiar implications in 1960. I wonder if the subjects would behave the same way today, in an era of homeopathic medicine and toxic waste. But this weakness doesn't mean the results are meaningless.
Have you ever read "Ordinary Men"? It describes in excruciating detail the activities of Reserve Police Battalion 101, the men responsible and their attempts to rationalize what they had done. The book does not forgive them. Here are the reviews on Amazon: http://www.amazon.com/exec/obidos/tg/detail/-/0060995068/104-3195012-9903916?v=glance
Posted by: Clay | April 05, 2004 at 07:29 PM
Clay - Okay, the phrase of mine that you quote was a touch hyperbolic, but I do think that what was demonstrated was entirely trivial. So, people submit to authority? Well who would have guessed? Reading the comments of the subjects, it's like they're being polite: they've been invited to participate in this experiment which they believe will advance our understanding of learning, and they don't want to let anyone down. That's the way it goes. It may be true that the results would be different now: people are more sophisticated.
Posted by: Mick H | April 06, 2004 at 10:54 AM
I am just wondering if anyone knows about any replication of Milgram's study; apparently there was one done not long ago on TV or something. I want to find out, as I am doing a uni essay on it. And I just want people's opinions: if the experiment were to be done now, would the results be the same or different?
Thanks
Jo
Posted by: Jocelyn | January 04, 2005 at 12:02 AM