When Good People Do Evil
45 years ago, Stanley Milgram’s
classic experiments showed that, under orders, decent human beings will do anything.
Imagine that you have responded to
an advertisement in the New Haven newspaper seeking subjects for a study of
memory. A researcher whose serious demeanor and laboratory coat convey
scientific importance greets you and another applicant at your arrival at a
Yale laboratory in Linsly-Chittenden Hall. You are here to help science find
ways to improve people’s learning and memory through the use of punishment. The
researcher tells you why this work may have important consequences. The task is
straightforward: one of you will be the “teacher” who gives the “learner” a set
of word pairings to memorize. During the test, the teacher will give each key
word, and the learner must respond with the correct association. When the
learner is right, the teacher gives a verbal reward, such as “Good” or “That’s
right.” When the learner is wrong, the teacher is to press a lever on an
impressive-looking apparatus that delivers an immediate shock to punish the error.
The shock generator has 30 switches,
starting from a low level of 15 volts and increasing by 15 volts at each higher
level. The experimenter tells you that every time the learner makes a mistake,
you have to press the next switch. The control panel shows both the voltage of
each switch and a description. The tenth level (150 volts) is “Strong Shock”;
the 17th level (255 volts) is “Intense Shock”; the 25th level (375 volts) is “Danger,
Severe Shock.” At the 29th and 30th levels (435 and 450 volts) the control
panel is marked simply with an ominous XXX: the pornography of ultimate pain.
You and another volunteer draw
straws to see who will play each role; you are to be the teacher, and the other
volunteer will be the learner. He is a mild-mannered, middle-aged man whom you
help escort to the next chamber. “Okay, now we are going to set up the learner
so he can get some punishment,” the experimenter tells you both. The learner’s
arms are strapped down and an electrode is attached to his right wrist. The
generator in the next room will deliver the shocks. The two of you communicate
over an intercom, with the experimenter standing next to you. You get a sample
shock of 45 volts—the third level, a slight tingly pain—so you have
a sense of what the shock levels mean. The researcher then signals you to begin.
Initially, your pupil does well, but
soon he begins making errors, and you start pressing the shock switches. He
complains that the shocks are starting to hurt. You look at the experimenter,
who nods to continue. As the shock levels increase in intensity, so do the
learner’s protests: he says he does not think he wants to continue. You hesitate
and question whether you should go on. But the experimenter insists that you
have no choice.
In 1949, seated next to me in senior
class at James Monroe High School in the Bronx, New York, was my classmate,
Stanley Milgram. We were both skinny kids, full of ambition and a desire to
make something of ourselves, so that we might escape life in the confines of
our ghetto experience. Stanley was the little smart one whom we went to for
authoritative answers. I was the tall popular one, the smiling guy other kids
would go to for social advice.
I had just returned to Monroe High
from a horrible year at North Hollywood High School, where I had been shunned
and friendless (because, as I later learned, there was a rumor circulating that
I was from a New York Sicilian Mafia family). Back at Monroe, I would be chosen
“Jimmy Monroe”—most popular boy in Monroe High School’s senior class.
Stanley and I once discussed how that transformation could happen. We agreed
that I had not changed; the situation was what mattered.
Situational psychology is the study
of the human response to features of our social environment, the external
behavioral context, above all to the other people around us. Stanley Milgram and
I, budding situationists in 1949, both went on to become academic social
psychologists. We met again at Yale in 1960 as beginning assistant professors—him
starting out at Yale, me at NYU. Some of Milgram’s new research was conducted
in a modified laboratory that I had fabricated a few years earlier as a
graduate student—in the basement of Linsly-Chittenden, the building where
we taught Introductory Psychology courses. That is where Milgram was to conduct
his classic and controversial experiments on blind obedience to authority.
Milgram’s interest in the problem of
obedience came from deep personal concerns about how readily the Nazis had
obediently killed Jews during the Holocaust. His laboratory paradigm, he wrote
years later, “gave scientific expression to a more general concern about
authority, a concern forced upon members of my generation, in particular upon
Jews such as myself, by the atrocities of World War II.”
As Milgram described it, he hit upon
the concept for his experiment while musing about a study in which one of his
professors, Solomon Asch, had tested how far subjects would conform to the
judgment of a group. Asch had put each subject in a group of coached
confederates and asked every member, one by one, to compare a set of lines in
order of length. When the confederates all started giving the same obviously
false answers, 70 percent of the subjects agreed with them at least some of the
time.
Milgram wondered whether there was a
way to craft a conformity experiment that would be “more humanly significant”
than judgments about line length. He wrote later: “I wondered whether groups
could pressure a person into performing an act whose human import was more
readily apparent; perhaps behaving aggressively toward another person, say by administering
increasingly severe shocks to him. But to study the group effect … you’d
have to know how the subject performed without any group pressure. At that
instant, my thought shifted, zeroing in on this experimental control. Just how
far would a person go under the experimenter’s orders?”
How far up the scale do you predict
that you would go under those orders? Put yourself back in the basement with
the fake shock apparatus and the other “volunteer”—actually the
experimenter’s confederate, who always plays the learner because the “drawing”
is rigged—strapped down in the next room. As the shocks proceed, the
learner begins complaining about his heart condition. You dissent, but the
experimenter still insists that you continue. The learner makes errors galore.
You plead with your pupil to concentrate; you don’t want to hurt him. But your
concerns and motivational messages are to no avail. He gets the answers wrong
again and again. As the shocks intensify, he shouts out, “I can’t stand the
pain, let me out of here!” Then he says to the experimenter, “You have no right
to keep me here!” Another level up, he screams, “I absolutely refuse to answer
any more! You can’t hold me here! My heart’s bothering me!”
Obviously you want nothing more to
do with this experiment. You tell the experimenter that you refuse to continue.
You are not the kind of person who harms other people in this way. You want
out. But the experimenter continues to insist that you go on. He reminds you of
the contract, of your agreement to participate fully. Moreover, he claims
responsibility for the consequences of your shocking actions. After you press
the 300-volt switch, you read the next key word, but the learner doesn’t answer.
“He’s not responding,” you tell the experimenter. You want him to go into the
other room and check on the learner to see if he is all right. The experimenter
is impassive; he is not going to check on the learner. Instead he tells you, “If
the learner doesn’t answer in a reasonable time, about five seconds, consider
it wrong,” since errors of omission must be punished in the same way as errors
of commission—that is a rule.
As you continue up to even more
dangerous shock levels, there is no sound coming from your pupil’s shock
chamber. He may be unconscious or worse. You are truly disturbed and want to
quit, but nothing you say works to get your exit from this unexpectedly
distressing situation. You are told to follow the rules and keep posing the
test items and shocking the errors.
Now try to imagine fully what your
participation as the teacher would be. If you actually go all the way to the
last of the shock levels, the experimenter will insist that you repeat that XXX
switch two more times. I am sure you are saying, “No way would I ever go all
the way!” Obviously, you would have dissented, then disobeyed and just walked
out. You would never sell out your morality. Right?
Milgram once described his shock
experiment to a group of 40 psychiatrists and asked them to estimate the
percentage of American citizens who would go to each of the 30 levels in the
experiment. On average, they predicted that less than 1 percent would go all
the way to the end, that only sadists would engage in such sadistic behavior,
and that most people would drop out at the tenth level of 150 volts. They could
not have been more wrong.
In Milgram’s experiment, two of
every three (65 percent) of the volunteers went all the way up to the maximum
shock level of 450 volts. The vast majority of people shocked the victim over
and over again despite his increasingly desperate pleas to stop. Most
participants dissented from time to time and said they did not want to go on,
but the researcher would prod them to continue.
Over the course of a year, Milgram
carried out 19 different experiments, each one a different variation of the
basic paradigm. In each of these studies he varied one social psychological
variable and observed its impact. In one study, he added women; in others he
varied the physical proximity or remoteness of either the experimenter-teacher
link or the teacher-learner link; had peers rebel or obey before the teacher
had the chance to begin; and more.
In one set of experiments, Milgram
wanted to show that his results were not due to the authority power of Yale
University. So he transplanted his laboratory to a run-down office building in
downtown Bridgeport, Connecticut, and repeated the experiment as a project
ostensibly of a private research firm with no connection to Yale. It made
hardly any difference; the participants fell under the same spell of authority.
The data clearly revealed the
extreme pliability of human nature: depending on the situation, almost everyone
could be totally obedient or almost everyone could resist authority pressures.
Milgram was able to demonstrate that compliance rates could soar to over 90
percent of people continuing to the 450-volt maximum or be reduced to less than
10 percent—by introducing just one crucial variable into the compliance paradigm.
Want maximum obedience? Make the
subject a member of a “teaching team,” in which the job of pulling the shock
lever to punish the victim is given to another person (a confederate), while
the subject assists with other parts of the procedure. Want resistance to
authority pressures? Provide social models—peers who rebel. Participants
also refused to deliver the shocks if the learner said he wanted to be shocked;
that’s masochistic, and they are not sadists. They were also reluctant to give
high levels of shock when the experimenter filled in as the learner. They were
more likely to shock when the learner was remote than in proximity.
In each of the other variations, with
this diverse range of ordinary American citizens, of widely varying ages and
occupations and of both genders, it was possible to elicit low, medium, or high
levels of compliant obedience with a flick of the situational switch. Milgram’s
large sample—a thousand ordinary citizens from varied backgrounds—makes
the results of his obedience studies among the most generalizable in all the
social sciences. His classic study has been replicated and extended by many
other researchers in many countries.
Recently, Thomas Blass of the
University of Maryland-Baltimore County analyzed the rates of obedience
in eight studies conducted in the United States and nine replications in
European, African, and Asian countries. He found comparably high levels of
compliance in all. The 61 percent mean obedience rate found in the U.S. was
matched by the 66 percent rate found across all the other national samples. The
degree of obedience was not affected by the timing of the studies, which ranged
from 1963 to 1985.
Other studies based on Milgram’s have shown how powerful the obedience effect can be when legitimate authorities
exercise their power within their power domains. In one study, most college
students administered shocks to whimpering puppies when required to do so by a
professor. In another, all but one of 22 nurses flouted their hospital’s
procedure by obeying a phone order from an unknown doctor to administer an
excessive amount of a drug (actually a placebo); that solitary disobedient
nurse should have been given a raise and a hero’s medal. In still another, a
group of 20 high school students joined a history teacher’s supposed
authoritarian political movement, and within a week had expelled their fellows
from class and recruited nearly 200 others from around the school to the cause.
Now we ask the question that must be
posed of all such research: what is its external validity, what are real-world
parallels to the laboratory demonstration of authority power?
In 1963, the social philosopher
Hannah Arendt published what was to become a classic of our times, Eichmann
in Jerusalem: A Report on the Banality of Evil. She provides a detailed analysis
of the war crimes trial of Adolf Eichmann, the Nazi figure who personally
arranged for the murder of millions of Jews. Eichmann’s defense of his actions
was similar to the testimony of other Nazi leaders: “I was only following
orders.” What is most striking in Arendt’s account of Eichmann is all the ways
in which he seemed absolutely ordinary: half a dozen psychiatrists had
certified him as “normal.” Arendt’s famous conclusion: “The trouble with
Eichmann was precisely that so many were like him, and that the many were neither
perverted nor sadistic, that they were, and still are, terribly and terrifyingly normal.”
Arendt’s phrase “the banality of
evil” continues to resonate because genocide has been unleashed around the
world and torture and terrorism continue to be common features of our global
landscape. A few years ago, the sociologist and Brazil expert Martha Huggins,
the Greek psychologist and torture expert Mika Haritos-Fatouros, and I
interviewed several dozen torturers. These men did their daily dirty deeds for
years in Brazil as policemen, sanctioned by the government to get confessions
by torturing “subversive” enemies of the state.
The systematic torture by men of
their fellow men and women represents one of the darkest sides of human nature.
Surely, my colleagues and I reasoned, here was a place where dispositional evil
would be manifest. The torturers shared a common enemy: men, women, and
children who, though citizens of their state, even neighbors, were declared by “the
System” to be threats to the country’s national security—as socialists
and Communists. Some had to be eliminated efficiently, while others, who might
hold secret information, had to be made to yield it up by torture, confess and
then be killed.
Torture always involves a personal
relationship; it is essential for the torturer to understand what kind of
torture to employ, what intensity of torture to use on a certain person at a
certain time. Wrong kind or too little—no confession. Too much—the
victim dies before confessing. In either case, the torturer fails to deliver
the goods and incurs the wrath of the senior officers. Learning to determine
the right kind and degree of torture that yields up the desired information
elicits abounding rewards and flowing praise from one’s superiors. It took time
and emerging insights into human weaknesses for these torturers to become adept
at their craft.
What kind of men could do such
deeds? Did they need to rely on sadistic impulses and a history of sociopathic
life experiences to rip and tear the flesh of fellow beings day in and day out
for years on end?
We found that sadists are selected
out of the training process by trainers because they are not controllable. They
get off on the pleasure of inflicting pain, and thus do not sustain the focus
on the goal of extracting confessions. From all the evidence we could muster,
torturers were not unusual or deviant in any way prior to practicing their new
roles, nor were there any persisting deviant tendencies or pathologies among
any of them in the years following their work as torturers and executioners.
Their transformation was entirely explainable as being the consequence of a
number of situational and systemic factors, such as the training they were
given to play this new role; their group camaraderie; acceptance of the
national security ideology; and their learned belief in socialists and
Communists as enemies of their state.
Amazingly, the transformation of
these men into violence workers is comparable to the transformation of young
Palestinians into suicide bombers intent on killing innocent Israeli civilians.
In a recent study, the forensic psychiatrist Marc Sageman found evidence of the
normalcy of 400 al-Qaeda members. Three-quarters came from the upper or middle
class. Ninety percent came from caring, intact families. Two-thirds had gone to
college; two-thirds were married; and most had children and jobs in science and
engineering. In many ways, Sageman concludes, “these are the best and brightest
of their society.”
Israeli psychologist Ariel Merari,
who has studied this phenomenon extensively for many years, outlines the common
steps on the path to these explosive deaths. First, senior members of an
extremist group identify young people who, based on their declarations at a
public rally against Israel or their support of some Islamic cause or
Palestinian action, appear to have an intense patriotic fervor. Next, they are
invited to discuss how seriously they love their country and hate Israel. They
are asked to commit to being trained. Those who do then become part of a small
secret cell of three to five youths. From their elders, they learn bomb making,
disguise, and selecting and timing targets. Finally, they make public their
private commitment by making a videotape, declaring themselves to be “the
living martyr” for Islam. The recruits are also told the Big Lie: their
relatives will be entitled to a high place in Heaven, and they themselves will
earn a place beside Allah. Of course, the rhetoric of dehumanization serves to
deny the humanity and innocence of their victims.
The die is cast; their minds have
been carefully prepared to do what is ordinarily unthinkable. In these
systematic ways a host of normal, angry young men and women become transformed
into true believers. The suicide, the murder, of any young person is a gash in
the fabric of the human family that we elders from every nation must unite to
prevent. To encourage the sacrifice of youth for the sake of advancing the
ideologies of the old must be considered a form of evil that transcends local
politics and expedient strategies.
Our final extension of the social
psychology of evil from artificial laboratory experiments to real-world
contexts comes to us from the jungles of Guyana. There, on November 28, 1978,
an American religious leader persuaded more than 900 of his followers to commit
mass suicide. In the ultimate test of blind obedience to authority, many of
them killed their children on his command.
Jim Jones, the pastor of Peoples
Temple congregations in San Francisco and Los Angeles, had set out to create a
socialist utopia in Guyana. But over time Jones was transformed from the
caring, spiritual “father” of a large Protestant congregation into an Angel of
Death. He instituted extended forced labor, armed guards, semistarvation diets,
and daily punishments amounting to torture for the slightest breach of any of
his many rules. Concerned relatives convinced a congressman and media crew to
inspect the compound. But Jones arranged for them to be murdered as they left.
He then gathered almost all the members at the compound and gave a long speech
in which he exhorted them to take their lives by drinking cyanide-laced Kool-Aid.
Jones was surely an egomaniac; he
had all of his speeches and proclamations, even his torture sessions,
tape-recorded—including his final suicide harangue. In it Jones distorts,
lies, pleads, makes false analogies, appeals to ideology and to transcendent
future life, and outright insists that his orders be followed, all while his
staff is efficiently distributing deadly poison to the hundreds gathered around
him. Some excerpts from that last hour convey a sense of the death-dealing
tactics he used to induce total obedience to an authority gone mad:
Please get us some medication. It’s
simple. It’s simple. There’s no convulsions with it. [Of course there are,
especially for the children.] … Don’t be afraid to die. You’ll see, there’ll
be a few people land[ing] out here. They’ll torture some of our children here.
They’ll torture our people. They’ll torture our seniors. We cannot have this. … Please, can we hasten? Can we hasten with that medication? … We’ve
lived—we’ve lived as no other people lived and loved. We’ve had as much
of this world as you’re gonna get. Let’s just be done with it. (Applause.). … Who wants to go with their child has a right to go with their child. I think
it’s humane. … Lay down your life with dignity. Don’t lay down with tears
and agony. There’s nothing to death. … It’s just stepping over to another
plane. Don’t be this way. Stop this hysterics. … Look, children, it’s just
something to put you to rest. Oh, God. (Children crying.). … Mother,
Mother, Mother, Mother, Mother, please. Mother, please, please, please. Don’t—don’t
do this. Don’t do this. Lay down your life with your child.
And they did, and they died for “Dad.”
A fitting conclusion comes from
psychologist Mahzarin Banaji: “What social psychology has given to an
understanding of human nature is the discovery that forces larger than
ourselves determine our mental life and our actions—chief among these
forces [is] the power of the social situation.”
The most dramatic instances of
directed behavior change and “mind control” are not the consequence of exotic
forms of influence such as hypnosis, psychotropic drugs, or “brainwashing.”
They are, rather, the systematic manipulation of the most mundane aspects of
human nature over time in confining settings. Motives and needs that ordinarily
serve us well can lead us astray when they are aroused, amplified, or
manipulated by situational forces that we fail to recognize as potent. This is
why evil is so pervasive. Its temptation is just a small turn away, a slight
detour on the path of life, a blur in our sideview mirror, leading to disaster.