And also this: The most frequent violation in obedient sessions (those who shocked till the end) involved reading the memory test questions over the simulated screams of the learner. Doing this effectively guaranteed that the learner would fail the test and receive another shock.
Basically, being willing to shock other people without stopping was more about violence itself being permitted than about being an obedient person. Rule followers followed the protocol until they concluded "nope, this is too much" and stopped mistreating the victim.
The (eventually) disobedient subjects were better at respecting the experimental process they were given than the "obedient" ones who went all the way to the maximum voltage. Why was that?
Could it be a sign that the disobedient subjects were on average more concentrated on the task at hand (smarter? less stressed? better educated? more conscientious?) than the ultimately obedient ones, and therefore were more likely to realise they were "hurting" the alleged learner and stop?
Or could it be that the obedient subjects were more likely to realise there was something fishy going on, suspecting the "learner" wasn't really being shocked, and thus were paying less attention to the learning rules?
Or was it, as the article suggests, that the obedient ones may have shut down emotionally under pressure to follow through, and their mistakes are the result of that?
Or were the obedient ones more likely to be actual sadists, who were enjoying the shocks so much that they didn't even care if the "learner" didn't hear their question, giving them a greater chance of shocking them again?
Unfortunately I think the Milgram experiment has become so entrenched in popular culture that there's absolutely no way it can be properly repeated to explore these questions.
Many people are cruel. Not all people, maybe; not most people, also maybe; but some people enjoy hurting others. We see this everywhere. Isn't it possible that this kind of person jumped at the chance to inflict pain on people with no fear of repercussions?
In other words, isn't this study just a sort of filter that triages/orders students from most cruel to least cruel?
You are trapped in an experiment, you have the impression things went too far, and you think you can't escape? You rush it. You hear horrible noises? You just pretend you don't hear them. These are all classic mental patterns. There are a million ways to explain them.
Reading the questions while the subject was screaming looks like a performative act of conforming to the pattern, where the pattern's failure is blamed on the answerer failing to conform to it. That makes the shocks a punishment for failing to conform. The questioner keeps a facade of doing the right thing by going through the motions, even though they are breaking the rules by doing so; if the other party had been compliant, the rule wouldn't have been broken. That the shocks were painful would feel appropriate to those with a strong sense that nonconformity should be punished. It is less that they were following the rules and more that they assumed the intent of the rules and permitted abuse because the intent was not their decision. It might make them less willing participants in the abuse and more 'not my problem' active participants.
His study plainly shows that most people, in the right circumstances, will act in unimaginably cruel ways.
* kids grow to be rich because they accept delayed gratification
* alpha males are the leaders of the pack and all other males are useless
* people accept violence if there is a higher authority which justifies it with a reason
How many people suffered or delivered suffering because of their beliefs in the above?
If we remove this cycle of abuse, what is the natural rate of humans that will hurt others?
An uncomfortable idea: since victims become perpetrators, it may be best to segregate victims to prevent future abuse and victimization.
It can be reframed as, roughly, discipline too: a willingness to suffer a bit for later rewards. You can see this as a massive success multiplier in many real-world situations.
Wonderful idea. Let's not forget to segregate the poors, since they commit violent crimes at higher rates too. We can build a perfect utopia if only we just get rid of all the undesirables!
The delayed gratification thing in particular is correlation vs. causation. It was really more about trust. Forcing kids to delay gratification is meaningless or counterproductive.
Reactions are extremely individual: what makes one person tougher breaks another completely and permanently, with everything in between.
I'd say everybody experienced some sort and level of abuse, e.g. typical school bullies (who were usually also bullied themselves, hence the behavior).
If we remove this cycle of decency, what is the natural rate of humans that will hurt others?
The premise is flawed, humans learn from their environment and there's really no way to put a human in a coffin until they're 20 and see what they do then.
Let’s put it another way, if a catholic priest touches a choirboy, it’s not a good idea to let the choirboy become a priest and victimize the next generation of choirboys.
Gross but perhaps a benefit to society
> The study authors propose that the experimenter played a major, passive role in establishing this dynamic. When the participants broke the rules and skipped steps, the authority figure rarely intervened to correct them or pause the session. By staying silent and letting the memory study fall apart, the experimenter allowed an atmosphere of illegitimate violence to flourish.
This sounds like looting scenarios to me, i.e. when a situation descends into chaos, some people will just surf/leverage that chaos instead of attempting a return to normalcy, for whatever reason.
Here is Derren Brown's attempt at repeating the experiment: https://www.youtube.com/watch?v=Xxq4QtK3j0Y
Yeah, but you can also find that rate if you remove the trigger (abuse) from the environment (society) and see how the rate changes.
You don't have to lock someone in a coffin, or something ridiculous like that (and that would be counterproductive anyway). You create a society, or a least a sub-society, where there's no abuse, and see how much abuse is invented by the people raised in that environment.
What the experiment actually showed: People follow orders when the orders are justified within a persuasive ideological context, e.g. you value science and the scientific researcher is telling you to proceed for the sake of science.
In the first account, people who follow the orders of Nazis are not necessarily ideologically aligned with the Nazis; they might just be in a brainless order-following trance. But this isn't real, and in reality the people who were "just following orders" were in fact ideologically committed to the cause and should be judged accordingly.
The "rule-breaking" isn't referring to anything the researchers were doing.
It's referring to what the participants were doing. It points out that the compliant subjects who delivered the shocks weren't always following the procedure they were given perfectly. Which is, of course, expected, since people in general don't follow instructions 100% perfectly all the time, and especially not the first time they do something.
> Kaposi and Sumeghy interpret these patterns as a complete breakdown of the supposedly legitimate scientific environment. The subjects were not committing violence for the sake of an orderly memory study. With the scientific elements either forgotten or rushed, the laboratory changed into a setting for unauthorized and senseless violence.
This feels like a huge stretch. Forgetting a step at one point or reading something out loud too early isn't a "complete breakdown of the supposedly legitimate scientific environment" -- a "scientific environment" that is completely fictional to begin with.
Turns out those are not valid examples either. So I am genuinely wondering: what remains of the field of psychology, except for a group of people who find it interesting to think about how other people think/behave? Are there examples of actual, useful and valid conclusions coming from that field?
These were Yale students, so probably smarter than average, and the study didn’t do a very convincing job of making it seem believable, from what I’ve read.
When I took psychology in college I had to submit to random experiments as part of my grade (there were alternatives, but the experiments were easier). Before I’d ever heard of Milgram, if one of those studies had put me in a similar situation I would have smelled a rat immediately.
When I was in middle school the teachers created a fake “government decree” to convince us that there was a new sin tax on products kids use (as a simulation). I immediately knew it was fake as did many other students, but that didn’t stop us from playing along for fun. I talked to a few of my teachers later and they genuinely believed that we fell for it.
The performance, or signal, or whatever we're calling it. That's the important thing.
That said, the study has been replicated many times since the original, with researchers adjusting different parameters like participant screening, changing the gender balance, or varying the roles (teacher/student, researcher/technician...). Across these variations, the overall result stays quite consistent: under certain conditions, ordinary people can be led to do harmful things.
Other experiments have also looked at which factors make this more likely, and for example, diffusing responsibility seems to be one of the most effective ones.
The pop culture version of what happened in those experiments is “regular people will administer potentially lethal shocks when told to”, and that claim has been refuted experimentally many times over.
Contrary to most reports, the original experimenters never told participants that the shocks were supposedly lethal or even dangerous. When participants were actually told that there was a health risk, and that they should ignore it, the vast majority of participants refused to administer the shocks in a later recreation.[1]
In other words, the Milgram experiment, as commonly understood, is somewhere between sensationalism and an outright lie.
The article doesn't say that more people refused than was previously known.
It just concludes that most people weren't following instructions in a way that would have supported the validity of the supposed memory experiment.
Smooth shiny white walls, beakers and test tubes filled with brightly colored liquids on shiny metal tables… Science!
With a good enough propaganda machine, any percentage of people would end up 'ideologically committed to the cause', but I don't think they should necessarily 'be judged accordingly' regardless of the larger context.
The article quantifies the amount of rule-breaking, comparing it across participants, and notes that those who were better at obeying the instructions of the experiment are the ones who refused to continue to the end.
The article doesn't invalidate the Milgram experiments. It claims that the interpretation from the traditional literature is possibly wrong.
The article doesn’t claim that the experiment was invalidated, but that some conclusions drawn from it are not well founded.
Now the interesting question is _why_ did those people who followed the rules quit at a greater rate? _Why_ did those people follow the rules more closely in the first place? Was there any variation in how the rules were presented? What is the difference in between folks who follow the rules more closely and folks who don't? What can we learn about the human condition from this?
Basically, under the ill guidance of authority, people can become real monsters. That is the conclusion I drew from it, and it now looks even worse.
> While every obedient participant reliably pressed the shock lever, they regularly neglected or ruined the other steps required to justify the shock.
Procedural violations here include things like asking the question while the person in the other room was still screaming.
* Did the subjects who went full voltage stop caring about the "learning" protocol because they realised it was all fake? Then the conclusions of Milgram's experiment are invalid.
* Did the subjects who went full voltage make more mistakes because they were more anxious and fearful of the experimenter? Then underlying fear might be a mechanism for blind obedience, and further research would be interesting.
* Did the subjects who went full voltage just enjoy electrocuting the dude so much that they stopped caring about asking the questions correctly? Then blind obedience is the least of our worries, widespread sadism is much more concerning.
The act of torturing was not due to the torturer obeying the rules. Instead, the torturers broke the rules and created conditions that allowed them to inflict more torture.
In order for someone to answer this, I think you need to come up with some sort of definition what "actual", "useful" and "valid" actually means here in this context.
Lots of stuff from psychology has been successfully applied to treat people in therapy for various issues, but is that "valid" enough for you? Something tells me you already know some people are being helped in therapy one way or another, yet it seems those might not be "useful" enough, since I don't clearly understand what would be "useful" to you if not those examples.
And based on everyone I've met, and on Dan Ariely's own actions (1), I've concluded this one is true.
We all cheat a little from time to time.
For example: for me, driving a few km/h above the speed limit is "cheating a little".
1 : https://www.businessinsider.com/dan-ariely-duke-fraud-invest...
The version of the Milgram experiment taught to undergrads asks people to believe that you'll follow orders you would ordinarily consider abhorrent simply because you were commanded. But there's basically no evidence for that. People follow orders if those orders are justified in a way that seems persuasive. Nobody ever doubted that Nazis persuaded people to join them. That's not a surprising or even remotely novel finding.
This does suggest that subjects who are bought into and understand the purpose behind what they’re doing, and are attentive to how the specific tasks they’re doing tie into the bigger picture, are more likely to be actively engaging their judgement as they go. And subjects who are just trying to follow the tasks as given to them are sort of washing their hands of the outcomes as long as they’re following the directions (which is, ironically, causing them to fail at following the directions too).
If they think the procedure is to read the next question when the previous one has been completed, and they do, even if the other person is screaming, they think they're "following rules". They're not the ones who came up with the procedure.
Which is the whole point: the participants were trying to follow rules, even if they made mistakes in following those rules. The idea that there was a total "breakdown" of the rules doesn't seem supported at all.
It's being consistently verified in real time if you track current events.
See also the replication crisis.
And social science/history/economics is about learning the standard lessons of the field (even if those lessons are themselves simplistic compared to the real world, they are a baseline of common knowledge).
His relationship with Jeffrey Epstein isn’t a good look either.
Instead, most participants rushed through, most likely to end their own negative experience. Which is much more nuanced than "gosh, they told me to do it."
Your point is fair, but what is really nuanced is that the people who 'stopped' were the best ones at following the rules.
This seems interesting to me - they were conscientious about 'what was happening' - not just blithely following orders.
The 'rule followers' maybe were conscientiously applying the 'spirit of the test' and quit when they realized it was not reasonable.
The others were 'pressing buttons'.
Even then, it's subject to interpretation. There's a perfectly rational reason why people might submit to 'following the rules' if that's what they've been asked to do and they have a sense of 'dutiful civic conduct' and 'trust in institutions'.
Mason Cooley
...Which is a good metaphor for the "experiment" as a whole.
I don't know what experience of therapy you've had in the past, but this is typically not how it works. People get better when a treatment is applied that is suitable to them as a person and to the context. I'm not sure where you got the idea that "people get better no matter what treatment is applied"; that hasn't been true in my experience.
Reassessing one of the most famous psychological experiments in history, a recent analysis of audio recordings reveals that subjects who seemingly obeyed orders to administer severe electric shocks actually broke the rules of the scientific study most of the time. The authors suggest that this routine violation of experimental procedures transformed the laboratory into a scene of unauthorized violence, altering our understanding of compliance and coercion. The research was published in the journal Political Psychology.
In the early 1960s, American social psychologist Stanley Milgram conducted a series of experiments to understand how ordinary people could be directed to commit violent acts. Volunteers were recruited for what they were told was a study on memory and learning at Yale University. Upon arriving, they were assigned the role of a teacher and introduced to a learner, who was actually an actor working for the researchers.
The teacher was instructed to read a list of word pairs to the learner and test their memory. For every incorrect answer, the teacher had to administer an electric shock, increasing the voltage steadily up to a supposedly lethal level. As the shocks grew stronger, the learner would begin to grunt, shout protests, and eventually scream in simulated agony.
For decades, psychologists have generally accepted that the participants who went all the way to the maximum voltage did so because they believed in the scientific validity of the enterprise. The assumption has been that the presence of a lab-coated authority figure gave the violent actions a sense of legitimacy. Theoretical explanations for the high rates of obedience rely heavily on the idea that the volunteers willingly participated in a structured, orderly scientific protocol.
Lead author David Kaposi, a researcher at The Open University in the United Kingdom, and his colleague David Sumeghy wanted to test whether this assumption matched the reality of the sessions. Kaposi and Sumeghy questioned whether the obedient participants actually followed the specific instructions that made up the memory test cover story. If the volunteers ignored the scientific procedures, the theoretical justification for their violence would be called into question.
To investigate this, the researchers turned to the original audio tapes preserved at the Yale University Library. They secured recordings from four experimental conditions that closely resembled the standard baseline setup. After excluding sessions with missing data or technical irregularities, the sample yielded 136 full audio recordings of individual sessions.
The research team broke down the participants into two populations based on their ultimate actions in the lab. Obedient individuals were those who administered all the shocks up to the theoretical maximum. Disobedient individuals were those who refused to continue at some point and formally ended their participation.
Kaposi and Sumeghy then evaluated how well each participant adhered to the explicit rules of the memory and learning study. According to the original directions, the teacher was required to complete a strict five-step sequence for every single shock. This cycle involved reading a test question, evaluating the learner’s answer, announcing the shock voltage, pressing the shock lever, and finally reading the correct answer aloud.
A failure to complete any portion of this sequence was labeled as a procedural violation. The researchers categorized these deviations into two types based on how they occurred. An omission took place when a subject completely skipped a step, such as forgetting to announce the voltage level or failing to read the correct answer.
A commission occurred when the subject technically performed a step but did so in a way that ruined the premise of the memory study. For example, some participants read the next test question aloud while the learner was actively screaming in protest. Under those conditions, it would be impossible for the learner to hear the question, rendering the educational facade meaningless.
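The coding scheme described above can be illustrated with a minimal, hypothetical sketch. Everything here is invented for illustration (the field names, the coding labels, and the example data); the actual analysis was done by researchers listening to the audio tapes, not by software:

```python
# Hypothetical illustration of the coding scheme: each shock sequence is
# coded for the five required steps, and a step can be "done", "skipped"
# (an omission), or "ruined" (a commission, e.g. reading the question
# over the learner's screams).

STEPS = ["read_question", "evaluate_answer", "announce_voltage",
         "press_lever", "read_correct_answer"]

def classify(sequence):
    """Return 'clean', 'omission', or 'commission' for one shock sequence."""
    if any(sequence.get(step) == "ruined" for step in STEPS):
        return "commission"
    if any(sequence.get(step, "skipped") == "skipped" for step in STEPS):
        return "omission"
    return "clean"

def violation_rate(sequences):
    """Fraction of sequences containing at least one procedural violation."""
    violations = sum(1 for s in sequences if classify(s) != "clean")
    return violations / len(sequences)

# Invented example session: one clean sequence, one omission (voltage not
# announced), one commission (question read over the screams).
session = [
    {s: "done" for s in STEPS},
    {**{s: "done" for s in STEPS}, "announce_voltage": "skipped"},
    {**{s: "done" for s in STEPS}, "read_question": "ruined"},
]
```

On this toy session, `violation_rate(session)` would be 2/3; the article reports the real figures as 48.4 percent of sequences for obedient subjects and 30.6 percent for the compliant phases of disobedient subjects.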
The audio analysis revealed a striking pattern of rule-breaking across the board. Out of all the participants traditionally classified as completely obedient, not a single one actually followed the full five-step procedure from start to finish. While every obedient participant reliably pressed the shock lever, they regularly neglected or ruined the other steps required to justify the shock.
In fact, nearly half of the shock sequences administered by obedient participants contained one or more procedural violations. On average, these individuals violated the experimental rules in 48.4 percent of their actions. The act of pressing the shock lever was consistently completed, but the scientific framework surrounding it was continuously broken.
The researchers noted that the sessions generally unfolded in three distinct phases. In the early stages, where the learner remained relatively quiet, procedural violations were low across both groups. As the session progressed and the learner’s recorded protests intensified, the rate of rule-breaking spiked dramatically.
During the final phase of the maximum-voltage sessions, the learner’s protests ceased completely. Despite this silence, the surviving subjects did not return to following the proper scientific procedures. Their violation rates remained consistently high until the very last shock was delivered.
The researchers compared these overall rates with the rule-breaking behavior of the disobedient subjects. To ensure a fair comparison, they only analyzed the portion of the disobedient sessions where those participants were actively complying and administering shocks. The data showed that disobedient subjects committed significantly fewer procedural violations during their compliant phases.
On average, eventual quitters violated the experimental protocol in 30.6 percent of their active sequences. Unlike the fully compliant group, several of the disobedient individuals followed the procedure flawlessly right up until the point they refused to continue. The statistical difference highlights that the people who eventually quit were actually better at following the scientific protocol than those who went to the end.
The most frequent violation in obedient sessions involved reading the memory test questions over the simulated screams of the learner. Doing this effectively guaranteed that the learner would fail the test and receive another shock. By talking over the protests, the obedient subjects abandoned the goal of testing memory and simply facilitated continuous shocks.
Kaposi and Sumeghy interpret these patterns as a complete breakdown of the supposedly legitimate scientific environment. The subjects were not committing violence for the sake of an orderly memory study. With the scientific elements either forgotten or rushed, the laboratory changed into a setting for unauthorized and senseless violence.
The study authors propose that the experimenter played a major, passive role in establishing this dynamic. When the participants broke the rules and skipped steps, the authority figure rarely intervened to correct them or pause the session. By staying silent and letting the memory study fall apart, the experimenter allowed an atmosphere of illegitimate violence to flourish.
This silent approval of the deteriorating conditions may have functioned as a form of coercive control. Participants found themselves trapped in a situation where the stated rules no longer applied, yet the expectation to deliver shocks remained constant. The authors suggest that this environment compromised the volunteers’ freedom to choose, bringing into doubt the idea that they were acting out of willing obedience.
The researchers acknowledge that their findings rely entirely on observable behavioral records. The audio tapes demonstrate what the participants and the experimenter said, but they do not provide direct evidence of internal motivations. It remains unclear whether the rule-breaking was driven by anxiety, stress, forgetfulness, or an intentional coping mechanism.
Future investigations could look closer into the immediate interactions between the teacher, the learner, and the authority figure. Analyzing exactly how the experimenter’s silence shapes participant behavior might offer fresh insights into how destructive obedience is manufactured in real time. Ultimately, this audio analysis shifts the focus away from the electric shocks themselves and onto the collapsed rules that surrounded them.
The study, “From legitimate to illegitimate violence: Violations of the experimenter’s instructions in Stanley Milgram’s ‘obedience to authority’ studies,” was authored by David Kaposi and David Sumeghy.