As World War II ended and the Holocaust proved shockingly real rather than the rumor or propaganda some had believed it to be, people wondered how it could have happened. Why, along the way, did more of the Germans involved in the genocide not object? The same question was asked of Stalin's Russia, Mao's China and Pol Pot's Cambodia.
In 1961, psychologist Stanley Milgram set out to answer that, undertaking a series of now infamous experiments on obedience and reprehensible behavior. About two-thirds of Milgram's nearly 800 study subjects, pressed by an authoritative experimenter, were willing to administer increasingly powerful electric shocks to an unseen stranger despite cries of agony and pleas to stop.
The dark side of humanity seemed easy to draw out: have a government say it was the right thing to do, and put the people most inclined to defer to government in charge of any programs that others might detest. Milgram did just that, and his subjects fell into two categories: obedient or disobedient. This makes sense. People inclined to conservative or libertarian views are less likely to be in government at all, and more inclined to rebel against government intrusions, so it is more difficult to get them to commit immoral acts.
These were not CIA operatives specially trained to extract information from terrorists; they were ordinary people hurting other ordinary people, much like the people who carried out the infamous genocides of the 20th century.
After examining the experiences of more than 100 of Milgram's participants, Matthew Hollander, a graduate student in sociology at the University of Wisconsin-Madison, sees more nuance in their performances - and maybe a way to prevent real-world occurrences of authority overriding ethical judgment.
"Milgram claimed to have found sort of a dark side to human nature that people were not quite as attuned to," says Matthew Hollander, a graduate student in sociology at the University of Wisconsin-Madison. "His study participants were much more likely to obey than he expected, and that was an understandably uncomfortable result.
"The majority did cave, and follow the experimenter's orders. But a good number of people resisted, and I've found particular ways they did that, including ways of resisting that they share with the people who ultimately complied."
Hollander's analysis of audio recordings of the experiments led him to identify six practices employed against the repeated insistence of Milgram's authority figure. Study subjects resorted to silence and hesitation, to groaning and sighing that displayed the effort it took to comply, and to uncomfortable laughter. They also found more explicit ways to express their discomfort and disagreement: stalling by talking to the recipient of the shocks, and addressing their concerns to the experimenter. Most assertively, they resorted to what Hollander calls the "stop try."
"Before examining these recordings, I was imagining some really aggressive ways of stopping the experiment -- trying to open the door where the 'learner' is locked in, yelling at the experimenter, trying to leave," Hollander says. "What I found was there are many ways to try to stop the experiment, but they're less aggressive."
Most often, stop tries involved some variation on, "I can't do this anymore," or "I won't do this anymore," and were employed by 98 percent of the disobedient Milgram subjects studied by Hollander. That's compared to fewer than 20 percent of the obedient subjects.
Interestingly, all six of the resistive practices were put to use by both obedient and disobedient participants.
"There are differences between those two groups in how and how often they use those six practices," says Hollander, whose work is supported by the National Science Foundation. "It appears that the disobedient participants resist earlier, and resist in a more diverse way. They make use of more of the six practices than the obedient participants."
Therein lies a possible application of Hollander's new take on Milgram's results.
"What this shows is that even those who were ultimately compliant or obedient had practices for resisting the invocation of the experimenter's authority," says Douglas Maynard, a UW-Madison sociology professor who leads the Garfinkel Laboratory for Ethnomethodology and Conversation Analysis. "It wasn't like they automatically caved in. They really worked to counter what was coming at them. It wasn't a blind kind of obedience."
If people could be trained to tap practices for resistance like those outlined in Hollander's analysis, they may be better equipped to stand up to an illegal, unethical or inappropriate order from a superior - or a terrorist hijacking a plane.
"It doesn't have to be the Nazis or torture at Abu Ghraib prison in Iraq or in the CIA interrogations described in the recent U.S. Senate report," he says. "Think of the pilot and copilot in a plane experiencing an emergency or a school principal telling a teacher to discipline a student, and the difference it could make if the subordinate could be respectfully, effectively resistive and even disobedient when ethically necessary or for purposes of social justice."
Published in the British Journal of Social Psychology.