Obedience to Authority

LEARNING OBJECTIVES

  • Describe the early psychological studies on obedience and summarize what the results of those experiments tell us.
  • Explain when people are likely to obey the demands of people in authority and when they are more likely to withstand the pressure to obey.

The study of when and why people obey the commands or instructions of someone in authority has been dominated by the most famous set of social psychology experiments ever conducted—those of Stanley Milgram (previously discussed in Chapters 1 and 4). Milgram’s experiments are so well-known, in fact, that the social psychologist Lee Ross said that they “have become part of our society’s shared intellectual legacy—that small body of historical incidents, biblical parables, and classic literature that serious thinkers feel free to draw on when they debate about human nature or contemplate human history” (L. Ross, 1988, p. 101).

The Setup of the Milgram Experiments

Milgram’s research on obedience began as an investigation of conformity. Milgram was interested in whether the kinds of pressures observed in Asch’s conformity experiment were powerful enough to lead people to do something far more significant than report an incorrect line length. He wondered what would happen if he asked participants to deliver electric shocks whenever a subject performing a task responded incorrectly. In reality, the “subject” was the experimenter’s confederate. Would participants conform to the example set by other obedient participants, even when doing so involved hurting another human being?

This is an interesting question, but Milgram never pursued it. The reason is that he first needed to obtain data from a control group to determine participants’ willingness to deliver electric shocks in the first place, when there was no one to model compliance (Evans, 1980). And that’s when he got his surprising result—one that radically changed his research agenda. A large percentage of participants were willing to do something they thought was hurting another person, even when there was no group of other participants leading the way.

Recall from Chapter 1 the basic procedure of Milgram’s experiments. After responding to a newspaper ad, participants showed up for an experiment on learning. The setup was rigged so that the participant was always assigned to the role of the “teacher” and the confederate to the role of the “learner.” The teacher’s job was to administer an electric shock every time the learner—a genial, middle-aged man who was strapped into a chair with his arm attached to a fake shock generator—made a mistake by reporting the wrong word from a list of word pairs (such as glove/book, grill/detergent, anvil/pope). Teachers were briefly strapped to the chair themselves and given a 45-volt shock so they would know the shocks were painful. The teacher started off by delivering 15 volts after the learner’s first mistake. With each subsequent mistake, the teacher had to increase the shock by 15 volts. As the mistakes accumulated, participants found themselves required to deliver 255, 300, and 330 volts of electricity—all the way up to 450 volts. (In reality, no electric shock was delivered to the learner.) If a participant expressed reservations or tried to terminate the experiment, the experimenter would respond with a carefully scripted set of responses: “Please continue,” “The experiment requires that you continue,” “It is absolutely essential that you continue,” and “You have no other choice; you must go on.”

The great surprise in these studies was how many participants continued to obey the experimenter’s orders and deliver the maximum level of shock to the confederate. In the remote-feedback version of the experiment, in which the learner was in an adjoining room and could not be heard except when he vigorously pounded on the wall after a shock of 300 volts, 65 percent of the participants continued the learning experiment and delivered the maximum shock of 450 volts. In the voice-feedback version, the participants could hear the learner’s increasingly desperate pleas—including screaming that he had a heart condition—until finally, and ominously, he became silent. Despite the many cues that the learner was suffering, 62.5 percent of the participants delivered the maximum shock (Milgram, 1965, 1974).

THE MILGRAM EXPERIMENT Participants were led to believe that the shock generator had 30 levels of shock, ranging from “slight shock” to “danger: severe shock” to “XXX.” (A) A participant being given a sample shock of 45 volts (this was the only real shock in the experiment). (B) A participant standing up to ask the experimenter if he could stop the experiment.

Opposing Forces

Milgram’s participants were caught in an agonizing conflict. On the one hand were forces compelling them to complete the experiment and continue delivering shocks (Reeder, Monroe, & Pryor, 2008). Among these forces was a sense of fair play: They had agreed to serve as participants, they had already received payment for doing so, and they felt they now had to fulfill their part of the bargain. Some were probably also motivated by the reason they had agreed to participate in the first place: to advance science and the understanding of human behavior. Normative social influence was also likely at play—in this case, the desire to avoid the disapproval of the experimenter or anyone else associated with the study. Closely related to this concern was the very human desire to avoid “making a scene” and upsetting others (Goffman, 1966; R. S. Miller, 1996).

On the other hand, several powerful forces compelled participants to want to terminate the experiment. Foremost among these was the moral imperative to stop the suffering of the learner (Burger, Girgis, & Manning, 2011). Participants may have felt a specific desire not to hurt the genial man they had met earlier as well as a more abstract reluctance to hurt others. Some were also probably concerned about what would happen if something went wrong. “What if he dies or is permanently injured?” “Will there be a lawsuit?” Still others may have wondered about the prospect of having to walk out with the learner after everything was over and the resulting embarrassment they might feel or possible retaliation from the learner.

Understanding these opposing forces leads to a better understanding of why participants responded the way they did and why the whole experience was so stressful. How might the rate of obedience change if the strength of these opposing forces were modified (Blass, 2000, 2004; A. G. Miller, 1986)? Milgram tried to answer this question through a comprehensive series of studies in which he conducted informative variations on his original experiments (see the A Closer Look box later in this section).

Would You Have Obeyed?

Nobody anticipated the high rates of obedience Milgram observed. A group of psychiatrists predicted that fewer than 1 percent of all participants—a pathological fringe—would continue until they delivered the maximum amount of shock. This failure of prediction is matched by an equally noteworthy failure of after-the-fact insight: The vast majority of people believe, even after hearing the basic results and all the study variations, that they themselves would never deliver very high levels of shock. Thus, although Milgram’s experimental variations shed light on when and why people engage in such surprising behavior, they don’t provide a fully satisfying explanation, or else we would be more likely to accept that we ourselves might obey in the same situation. As Lee Ross put it, the experiments do not pass a critical “empathy test” (L. Ross, 1988). They don’t lead us to empathize fully with the obedient participants and take seriously the possibility that we would also obey to the end—as most participants did. A truly satisfying explanation might not convince us that we would surely obey, but it should at least convince us that we might act that way.

Milgram’s work is often mentioned in discussions of how people sometimes obey the directives of malevolent government officials and engage in sadistic torture or commit hideous crimes against humanity, such as those witnessed during the Holocaust in Nazi Germany and more recent massacres in Bosnia, Cambodia, Rwanda, and Darfur. Explanations of such incomprehensible cruelties vary along an “exceptionalist-normalist” continuum. The exceptionalist thesis is that such crimes are perpetrated only by “exceptional” people—that is, exceptionally sadistic, desperate, or ethnocentric people. On this account, many Germans were virulent anti-Semites, the Serbs harbored long-standing hatred and resentment against the Bosnians, and the Rwandan Hutus had a score to settle with the Tutsis. The normalist thesis, in contrast, is that most people are capable of such destructive obedience and that, given the right circumstances, almost anyone would commit such acts.

Milgram’s research, of course, is typically taken to support the normalist view. Milgram himself certainly took this position. In 1979, on the CBS TV show 60 Minutes, journalist Morley Safer asked Milgram whether he thought something like the Holocaust could happen in the United States. Milgram offered this opinion:

I would say, on the basis of having observed a thousand people in the experiment and having my own intuition shaped and informed by these experiments, that if a system of death camps were set up in the United States of the sort we had seen in Nazi Germany, one would be able to find sufficient personnel for those camps in any medium sized American town. (Quoted in Blass, 1999, p. 955)

Let’s take a closer look.

THEY TRIED BUT FAILED

One reason people think they would never behave like the average participant in Milgram’s studies is that they misunderstand exactly how the average participant behaved (L. Ross, 1988). People conjure up images of participants casually going along with the experimenter’s commands, increasing the shock level from trial to trial, and being relatively inattentive to the learner’s situation. Indeed, Milgram’s experiments have often been described as demonstrations of “blind” obedience.

But that’s not what happened. Participants didn’t blindly obey. Nearly all tried to disobey in one form or another, but they weren’t particularly good at it. Nearly everyone called the experimenter’s attention to the learner’s suffering in an implicit plea to stop the proceedings. Many stated explicitly that they refused to continue (but nonetheless went on with the experiment). Some got out of their chair in defiance, only to sit back down moments later. As Ross pointed out, “the Milgram experiments have less to say about ‘destructive obedience’ than about ineffective and indecisive disobedience” (L. Ross, 1988, p. 103).

This distinction is critical. Most of us have had the experience of having good intentions but not being able to translate those intentions into effective action. For instance, maybe you’ve wanted to speak up more forcefully and effectively against racist, sexist, or homophobic remarks, but you were too slow to respond or the words didn’t come out as forthrightly as you intended (Woodzicka & LaFrance, 2001). Or maybe you’ve wanted to reach out to someone who was being ignored at a party, but you were distracted by your own social needs. Most of us can relate to being good-hearted but ineffective, but most of us can’t relate to being uncaring.

A chilling parallel to the behavior of Milgram’s participants is the behavior of some of the German soldiers called on to execute Polish Jews during World War II (Browning, 1992). Members of German Reserve Police Battalion 101 were mostly men who hoped to avoid the inevitable violence of the war by volunteering for police duty in Hamburg. After the invasion of Poland, however, they were reassigned to serve as military police in occupied territory. Most of their duties consisted of routine police work. But on July 13, 1942, the men were roused from their barracks before dawn and taken to the outskirts of the village of Józefów, where they were given gruesome orders: to round up all the Jewish men, women, and children from the village, send all able-bodied young men to a work camp, and shoot the rest.

Most were shocked and repelled by their orders. Many resisted. But their resistance, like that of Milgram’s participants, was feeble. Some kept busy with petty errands or moved to the back of the battalion, hoping to avoid being called on. Others took part in the roundup but then refrained from shooting if no one was watching. Still others fired but missed intentionally. What they didn’t do was state assertively that they wouldn’t participate or that what they were being asked to do was wrong. They tried to find an easy way to disobey, but there was no easy way—and so they obeyed. (Of course, many of the acts of genocide during the Holocaust were perpetrated by individuals who, unlike many of the soldiers in Reserve Police Battalion 101, fully embraced what they were doing.)

In the case of Milgram’s experiments, participants had trouble halting the proceedings partly because the experimenter wasn’t playing by the normal rules of social life. The participants offered reasons for stopping the experiments, but the experimenter largely ignored those reasons, making minimally responsive statements such as “The experiment requires that you continue.” Participants were confused and uncertain about how to act. As noted in the earlier discussion of conformity, people tend not to act decisively when they lack a solid grasp of the events happening around them. What should you do when told to deliver electric shocks to “teach” someone who’s no longer trying to learn anything, at the insistence of an authority figure who seems unconcerned about the learner’s predicament? How do you respond when events have stopped making sense?

These questions have important implications for those real-world instances of destructive obedience with which we should be most concerned. Many of the most hideous episodes of genocide, for example, have occurred right after large-scale social upheaval. Without reliable norms of appropriate behavior, people may lack the confidence necessary to take decisive action to stop such atrocities.

RELEASE FROM RESPONSIBILITY

The inability of Milgram’s participants to stop the experiment meant they were trapped in a situation of terrible conflict and stress. Although they knew that what was happening should not continue, they didn’t know how to bring it to an end. They were therefore desperate for anything that would reduce their stress. Fortunately for the participants (but unfortunately for the learner, if he really had been receiving electric shock), the experimenter provided something to reduce their stress by taking responsibility for what was happening. When participants asked, as many did, “Who is responsible for what happens here?” the experimenter responded, “I am responsible.” Participants seized on this assertion as a justification for their actions, and the stress they were experiencing was significantly reduced.

Of course, the cover, or “out,” the experimenter provided worked only because participants viewed the person taking responsibility as a legitimate authority. People generally don’t let just anyone take responsibility and then assume that everything is okay. Suppose you’re approached by a strange character on campus who says, “Quick, help me set fire to the administration building; I’ll take full responsibility.” You certainly would refuse to pitch in. In Milgram’s experiments, however, participants believed they could legitimately transfer responsibility to the experimenter because he was a representative of science. In nearly all the variations, the experimenter was affiliated with Yale University, a respected institution (although obedience was still high when the experimenter operated out of a storefront in downtown Bridgeport, Connecticut). These aspects of the situation made it easier for participants to reduce their own stress over what was happening by assuming that the experimenter knew better and was ultimately responsible for what was happening.

The cover provided by authorities has implications for some of history’s worst acts of destructive obedience. Among Nazis in Germany, Hutus in Rwanda, and Boko Haram in Nigeria, demands to obey were issued by authority figures who either explicitly took responsibility for the group’s actions or whose position supported an assumption of responsibility. And such claims of responsibility have nearly always been bolstered by some overarching belief system. Whether based on nationalism, religious ideology, or ethnic identity, every example of organized aggression has been draped in a seemingly legitimizing ideology that seeks to present otherwise hideous actions in a way that makes them seem morally appropriate (Staub, 1989; Zajonc, 2002).

STEP-BY-STEP INVOLVEMENT

It’s also important to remember that Milgram’s participants didn’t deliver 450 volts of electric shock right away. Instead, each participant first administered only 15 volts to the learner. Who wouldn’t do that? That’s feedback, not punishment. Then 30 volts. No problem there either. Then 45, 60, 75—each step a small one. Once participants started down this path, though, it was hard to stop, and they administered more and more shock. Indeed, the increments were so small that if a certain level of shock seemed like too much, why wouldn’t the previous level also have been too much (S. J. Gilbert, 1981)?

LEGITIMIZING THE EXPERIMENT To see how participants would react if the experiment were not conducted at Yale and its authority seemed less legitimate, Milgram had them report to (A) a fictitious business called Research Associates of Bridgeport, located above a storefront in downtown Bridgeport, and (B) a seedy office. Obedience rates declined somewhat but remained high even under these conditions.

The step-by-step nature of participants’ obedience is a powerful reason why so many administered as much electric shock as they did. Most of us have had the experience of gradually getting in over our heads in this way. We may tell a “little white lie”—but one that sets in motion a cascade of events that requires more and more deception. (Many a TV sitcom plot rests on this very sequence.) Our behavior often creates its own momentum, and it’s hard to know in advance where that behavior will lead. Milgram’s participants can certainly be forgiven for not foreseeing how everything would unfold. Would any of us have seen it any more clearly?

The parallels between this element of Milgram’s procedure and what happened in Nazi Germany are striking. German citizens weren’t asked, when the Nazis first seized power, to assist with or condone the deportation of Jews, Roma, gay people, and communists to the death camps. Instead, the rights of these groups were gradually stripped away. Certain business practices were restricted, then travel constraints were imposed, and then citizenship was narrowed; only later were people loaded into boxcars and sent to the death camps. Of course, the step-by-step process in Nazi Germany is no excuse for the atrocities committed, but the Nazis would doubtless have had a much harder time getting so many people to comply if they had started with the last step.

Would Milgram Get the Same Results Today?

Milgram’s studies were done in the early 1960s. But that was then; this is now. If you conducted Milgram’s experiments today, would you get the same results? Some argue that today’s more intense media coverage of such events as domestic spying by the U.S. National Security Agency, the influence of powerful lobbyists on elected officials, and constant cries of “fake news” by former president Trump and his supporters have made people less trusting of authority and thus less likely to obey instructions to harm another individual. Perhaps, but that’s a difficult idea to test because ethical concerns make it impossible to replicate Milgram’s experiments today. All psychological research must now be approved by an institutional review board (IRB) whose responsibility is to make sure any proposed research wouldn’t cause undue stress to the participants or harm them in any way (see Chapter 2). Few, if any, IRBs would approve a direct replication of Milgram’s experiments.

Jerry Burger, however, did the next best thing by conducting a near-replication of Milgram’s basic experiment to investigate whether people’s tendency to obey authority has changed since Milgram’s time (Burger, 2009; Burger, Girgis, & Manning, 2011). Burger identified a critical moment in the original proceedings when disobedience was most likely: right after the participant had (supposedly) delivered 150 volts of electric shock and the learner protested and demanded to be released. It was something of a now-or-never moment: Of Milgram’s participants who didn’t stop at this point, four out of five never stopped at all.

Burger saw an opportunity. It would be ethically unacceptable to put people through the stress of deciding between disobeying the experimenter and administering 300 or 400 volts of electricity. But the procedure isn’t so stressful—and is thus more ethically acceptable—up to the 150-volt level. Until that point, Milgram’s learner hadn’t protested, so the pain caused by the shocks (the participants would presume) couldn’t be that bad. Burger therefore sought and received permission from the IRB at his university to replicate Milgram’s basic experiment up to that point only.

A CLOSER LOOK

Opposing Forces in the Milgram Experiments

Milgram knew that many of his participants were experiencing tremendous conflict. On the one hand, they were worried about the learner, but on the other hand, they felt compelled to obey because they didn’t want to disappoint the experimenter, obstruct his scientific inquiry, or create a scene. In a series of variations on his main paradigm, Milgram altered elements of the experimental setup to examine the impact of intensifying or reducing these two opposing forces.

FOR CRITICAL THINKING

  1. What real-world implications might the “tuning in the learner” studies have for, say, military engagements or online interaction?
  2. Can you think of other variations of either “tuning in the learner” or “tuning out the experimenter” that Milgram could have tried to influence participants’ willingness to obey?

The results of Burger’s near-replication were essentially the same as those obtained by Milgram himself. In Burger’s study, 70 percent of the participants were willing to administer the next level of shock (165 volts) after hearing the learner’s protest. This compares with 82 percent of Milgram’s participants—not a statistically significant difference. Men and women were equally likely to continue past the critical 150-volt level. Today, people seem to react to pressure to obey the same way they did more than 50 years ago (see also Doliński et al., 2017).
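For readers curious how two obedience rates like these are compared statistically, the counts can be arranged in a 2 × 2 table (continued vs. stopped, for each study) and submitted to a significance test. The short Python sketch below uses Fisher’s exact test; the sample sizes of 40 participants per study are illustrative assumptions for this example, not figures reported in this section.

# A minimal sketch, assuming 40 participants per study (an illustrative
# figure, not one reported in the text), of how two obedience rates can
# be compared for statistical significance.
from scipy.stats import fisher_exact

burger = (28, 40)   # ~70 percent continued past 150 volts in Burger (2009)
milgram = (33, 40)  # ~82 percent continued in Milgram's comparable condition

# Rows: study; columns: continued vs. stopped at the critical moment
table = [
    [burger[0], burger[1] - burger[0]],
    [milgram[0], milgram[1] - milgram[0]],
]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
# With these assumed counts, p comes out well above the conventional .05
# threshold, consistent with the text's conclusion that the difference
# between the two obedience rates is not statistically significant.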

REVISITING MILGRAM (A) In Burger’s 2009 near-replication of the original Milgram experiments from the 1960s, participants faced the same conflict over whether to administer increasing levels of shock (up to 165 volts) to the learner or to call a halt to the learner’s suffering by refusing to continue. (B) Burger used the same type of bogus shock generator used by Milgram.

Resisting Social Influence

People don’t always conform, comply, and obey. They sometimes engage in heartening, even heroic, acts of independence—refusing to go along with misguided peers, defying the illegitimate demands of a commanding officer, or blowing the whistle on unethical business practices. What enables people to hold their ground, follow their conscience, and resist being influenced by others?

The pressure to give in to others can be offset by the tendency to resist attempts to restrict freedom of action or thought. According to reactance theory, people experience an unpleasant state of arousal when they believe their free will is threatened, and they often act to reduce this discomfort by reasserting their prerogatives (Brehm, 1966). If your parents say you mustn’t dye your hair, get a tattoo, or hang out with a particular group of friends, does your desire to do so diminish or increase? Reactance theory predicts that the moment you feel your freedom is being taken away, it becomes more precious, and your desire to maintain it increases.

Once motivated to resist, what factors might increase someone’s ability to stand firm? One important variable is practice. In Milgram’s obedience studies, many participants wanted to disobey and even tried to do so, but they weren’t very good at it (Milgram, 1963, 1974). Maybe if they had been trained to disobey when the situation called for it, they would have done a better job. There is evidence that the Christians who tried to save Jews during the Holocaust tended to be people who had a history of helping others, either as part of their job or as volunteers. Those who helped the most often didn’t have any higher regard for their Jewish neighbors than those who helped less; they were simply more practiced in reaching out and providing aid.

Another way to increase the ability to resist social influence is to have an ally. In Asch’s conformity experiment, having just one additional person who departed from the majority was enough to reduce conformity rather dramatically (Asch, 1956). Indeed, the most important lesson of Asch’s research is just how difficult it can be to be the lone holdout. People also need to be wary of potentially slippery slopes (recall the earlier discussion of the foot-in-the-door compliance technique). The stepwise procedure in Milgram’s experiments may have played an important role in the surprising levels of obedience observed in those studies. It’s often easier to resist influence from the start, rather than give in and hope to put a stop to things later on. As the Catholic Church teaches, “Avoid the near occasion of sin.”

RESISTING SOCIAL INFLUENCE When the Me Too movement erupted, norms and awareness surrounding sexual harassment shifted, and many people who had previously stayed silent about efforts to cover up wrongdoing suddenly refused to do so. Many people took a stand in the face of considerable pressure from their harassers and employers not to step forward. Lindsey Boylan is just one example, reporting a pattern of sexual harassment on the part of New York governor Andrew Cuomo during the time she worked at the state’s economic development agency.

It’s important to keep in mind, too, that many social influence attempts are based on appeals to emotion, as our earlier discussion of the effect of mood on compliance makes clear. A particularly effective strategy for dealing with emotion-based approaches is simply to put off a response. If there is a “first law” of emotional experience, it is that emotions fade and moods change. Therefore, the compulsion to give in because you are caught up in a particular emotion can be diminished simply by waiting to respond. After the initial feelings dissipate, you can then decide whether to comply with a request on the merits of the idea, not on the basis of a bad mood or an intense emotional state. You might feel a strong desire to please when someone asks for a donation, for example, and then later on question whether there are other causes that are more deserving of your money or would put it to better use. Waiting a bit before deciding can put you in a better frame of mind to weigh the relative merits of different causes.

LOOKING BACK

As Milgram’s experiments exemplify, many factors contribute to people’s willingness to obey leaders who demand unethical behavior. But when circumstances lead the individual to attend closely to the victim, obedience decreases substantially. When circumstances lead the individual to be relatively inattentive to the person in authority, obedience is reduced even further, suggesting that making it easier for participants to disobey is more effective than increasing their desire to disobey. Several elements of the situation may make obedience easier to understand: A person’s attempts to disobey are often blocked; the person in authority often takes responsibility for what happens; and once the obedience begins, there is typically no obvious stopping point.

Glossary

reactance theory
The idea that people reassert their prerogatives in response to the unpleasant state of arousal they experience when they believe their freedoms are threatened.