The idea was to study the psychology of imprisonment to see what happens when you put good people in a dehumanizing place. But within a matter of hours, what had been intended as a controlled experiment in human behavior took on a disturbing life of its own. After a prisoner rebellion on the second day of the experiment, the guards began using increasingly degrading forms of punishment, and the prisoners became more and more passive. Each group rapidly took on the behaviors associated with their role, not because of any particular internal predisposition or instructions from the experimenters, but rather because the situation itself so powerfully called for the two groups to assume their new identities. Interestingly, even the experimenters were so caught up in the drama that they lost their objectivity, terminating the out-of-control study only when an outside observer stepped in, reminding them of their duty to treat the participants humanely and ethically. The experiment, scheduled to last two weeks, ended abruptly after six days.
As we have come to understand the psychology of evil, we have realized that such transformations of human character are not as rare as we would like to believe. Historical inquiry and behavioral science have demonstrated the “banality of evil”; that is, under certain conditions and social pressures, ordinary people can commit acts that would otherwise be unthinkable.
In addition to the Stanford Prison Experiment, studies conducted in the 1960s by Stanley Milgram at Yale University also revealed the banality of evil. The Milgram experiments asked participants to play the role of a “teacher” responsible for administering electric shocks to a “learner” whenever the learner answered a test question incorrectly. Participants did not know that the learner was working with the experimenters and never actually received any shocks. As the learner’s errors mounted, the teachers were instructed to increase the voltage of the shocks—even when the learner began screaming, pleading for the shocks to stop, and eventually stopped responding altogether. Pressed by the experimenters—serious-looking men in lab coats who said they would assume responsibility for the consequences—most participants did not stop administering shocks until they reached 300 volts or above—already in the lethal range. The majority of teachers delivered the maximum shock of 450 volts.
We all like to think that the line between good and evil is impermeable—that people who do terrible things, such as commit murder, treason, or kidnapping, are on the evil side of this line, and the rest of us could never cross it. But the Stanford Prison Experiment and the Milgram studies revealed the permeability of that line. Some people are on the good side only because situations have never coerced or seduced them to cross over.
This is true not only for perpetrators of torture and other horrible acts, but for people who commit a more common kind of wrong—the wrong of taking no action when action is called for. Whether we consider Nazi Germany or Abu Ghraib prison, there were many people who observed what was happening and said nothing. At Abu Ghraib, one photo shows two soldiers smiling before a pyramid of naked prisoners while a dozen other soldiers stand around watching passively. If you observe such abuses and don’t say, “This is wrong! Stop it!” you give tacit approval to continue. You are part of the silent majority that makes evil deeds more acceptable.
In the Stanford Prison Experiment, for instance, there were “good guards” who kept the prison running. Even on the shifts when the worst abuses occurred, the good guards never mistreated the prisoners themselves, but not once over the six days did they confront the other guards and say, “What are you doing? We get paid the same money without knocking ourselves out.” Or, “Hey, remember those are college students, not prisoners.” No good guard ever intervened to stop the activities of the bad guards. No good guard ever arrived a minute late, left a minute early, or publicly complained. In a sense, then, it was the good guards who allowed such abuses to happen. The situation dictated their inaction, and their inaction facilitated evil.
But because evil is so fascinating, we have focused obsessively on analyzing evildoers. Perhaps because of the tragic experiences of the Second World War, we have neglected to consider the flip side of the banality of evil: Is it also possible that heroic acts are something that anyone can perform, given the right mind-set and conditions? Could there also be a “banality of heroism”?