On Torture, Ticking Bombs & Sam Harris

I’m sure most people reading this will be aware that Sam Harris has received quite a lot of pushback for his views on torture – see here & here for recent examples (plus read the comments), and see here and here for Harris’s and Richard Dawkins’s responses to the often vitriolic nature of the pushback.

I don’t particularly want to get into the debate here – though if other people wish to do so in the comments below, no problem – but I thought it might be interesting to flag up some data I’ve collected via this activity at my Philosophy Experiments web site:

Should You Kill The Fat Man?

This activity has been completed by more than 100,000 people, and it includes a “ticking bomb” torture scenario that people are asked to judge. (If you haven’t completed the activity, I suggest you do so now before reading any further). I’m not going to detail the scenario here – it’s fairly standard – but what the responses show is that given this particular setup a large majority of people think that torture should be used (75% say “Yes”, 25% say “No”). This is the case for males and females, and across different countries.

Here’s a link to the data (which also details the precise form of the scenario).

A few points:

1. The only significant difference in how people respond is between males and females. More males than females think torture is justified given this particular setup (77% to 71%). Given the size of the sample, this difference will almost certainly be statistically significant, though I’ve not actually run a chi-squared test (a quick sketch of how one might is included after this list).

2. I’m fully aware that people will consider my ticking bomb scenario – and maybe all ticking bomb scenarios – to be unrealistic. My view is that this criticism misses the force of the “ticking bomb” thought experiment. I think it is best understood as a “wedge” that attempts to show that whether torture is ruled out on any particular occasion is an empirical question rather than a matter of principle.

3. It’s almost certainly the case that nothing follows about the morality of torture from the fact that the practice is endorsed by a large majority of people in some particular (hypothetical) circumstance – not least because it’s easy to think of examples where a large majority of people endorse some practice we’d consider to be immoral (think slavery, for instance).

4. However, this data does show that most people do not find the sorts of views espoused by Sam Harris to be particularly counterintuitive.
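As a footnote to point 1, here’s a minimal sketch of the sort of chi-squared test I have in mind. The overall sample size (just over 100,000) and the two percentages (77% and 71%) come from the data above; the male/female split used below is a hypothetical figure, purely for illustration, not the actual breakdown.

```python
# A minimal sketch of the chi-squared test mentioned in point 1.
# NOTE: the 60,000 / 40,000 male/female split is hypothetical; the post
# only reports the overall sample (>100,000) and the two percentages.
from scipy.stats import chi2_contingency

males, females = 60_000, 40_000          # assumed split, for illustration only
male_yes = round(0.77 * males)           # 77% of males answer "Yes" (from the data)
female_yes = round(0.71 * females)       # 71% of females answer "Yes" (from the data)

# 2x2 contingency table: rows = sex, columns = [Yes, No]
table = [
    [male_yes, males - male_yes],
    [female_yes, females - female_yes],
]

chi2, p, dof, _expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```

With a sample of this size, even a six-point gap between the groups yields a vanishingly small p-value, which is why I’m comfortable asserting significance without having run the test on the actual counts.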


16 Comments.

  1. “My view is that this criticism misses the force of the “ticking bomb” thought experiment. I think it is best understood as a “wedge” that attempts to show that whether torture is ruled out on any particular occasion is an empirical question rather than a matter of principle.”

    These two sentences fairly demolish most of the criticism of SH on torture that I’ve seen.

  2. I’d just like to say that the ticking bomb scenario takes place in a universe where torture is 75% certain to be the only way to obtain the information.

    That’s not the universe I live in.

    My moral judgment may be different in that universe. (I’d have answered differently if it said “Someone told me torture is 75% certain..”)

  3. Dennis Sceviour

    A point was raised that the ticking bomb example is unrealistic. Is there an actual case where someone was tortured and the benefits were convincingly established?

  4. @Dave – No, that’s not right.

    The scenario has it that it is estimated that there is a 75% chance that torturing the fat man will result in him giving up the required information. There is no certainty attached to that aspect.

    You’re right that it states that torture is the only way to gain the required information.

  5. @Jeremy
    True, that’s the wording.

    My reading was that it was a 60% to 90% chance – i.e. it was certain that it was much more likely to work than not.

    Your point seems to be that, since it was only an estimate, I could feel free to ignore it on the grounds that the estimator could be lying?

  6. @David – I think the significant point is that the estimate might be wrong (rather than that there is any deliberate intention to mislead).

    In other words, it’s intended to make the scenario a little more true to life in that it’s sensitive to the fact that one can’t know with any certainty what the chances are that torture will work.

    But, generally speaking, as I said in the OP, I think the criticism that a “ticking bomb” scenario is unrealistic misses the force of the thought experiment.

  7. Hey

    “Your moral consistency score is 100% (higher is better). Well done. This score suggests that you are admirably consistent in the way you view morality. In fact, none of the people who have completed this activity demonstrate greater moral consistency in their responses than you manage. But don’t feel too pleased with yourself. Most people don’t think about morality very clearly!”

    Great!!!

    Sort of on the same subject, here is a scenario and question regarding Harris on torture. The US grows more and more fearful of Iran regarding its nuclear programme. The US eventually feels that the only option is to strike Iran’s nuclear reactors which would potentially kill millions of people. Does anyone think Harris, within his logic, would support the kidnapping and torture of US diplomats by the Iranian secret service to determine which civil nuclear reactors they are planning to strike and thereby save countless lives?

  8. Left-wing radicals often forget how much distance there is between *their* moral intuitions, and the moral intuitions of ordinary people.

  9. My moral intuitions have always tended towards some kind of consequentialism. I’m not categorically opposed to torture or even killing in principle — there are always exceptional cases. So, just to check the state of my intuitions I started doing the linked survey.

    Unfortunately, I am not confident that this survey is free of confounds. The Fat Man case has no internal validity – it potentially tracks the wrong kinds of intuitions. After all, I don’t want fatness to play any role in my choice. Moreover, it is at best a distraction for it to be there: the point of the thought experiment is to test our intuitions about the doctrine of double effect, not to determine the relative merits of bigotry.

    This makes no difference to how I respond to the first “Fat Man” case – I won’t push him, even to save the five – but it does affect how I would like to answer the saboteur case. As it is currently stated, I would still choose not to push him. However, I would respond differently if it were phrased like this:

    Marty Bakerman is on a footbridge above the train tracks. He can see that the train approaching the bridge is out of control, and that it is going to hit five people who are stuck on the track just past the bridge. The only way to stop the train is to destroy the remote control which controls the speed of the train. The remote is being held by a railway worker who is on a catwalk overhead. The only way to destroy the remote control is by loosening the bolts of the catwalk. Marty knows with absolute certainty that the railway worker is responsible for the failure of the train’s brakes: upset by train fare increases, he sabotaged the brakes with the intention of causing an accident. Marty can loosen the bolts of the catwalk, which will kill only the saboteur but save the five people already on the track; or he can allow the train to continue on its way, which will mean that the five will die.

    It would be interesting to see if this alteration would have any effect on how others would respond.

  10. Hi Ben

    I understand why you suspect confounding variables might mess up the fat man intuitions, statistics, etc., but I don’t think it’s actually happening to any great extent (though I can’t be sure).

    See the discussion here (including comments, if you can face it).

    http://blog.talkingphilosophy.com/?p=1677

    Your variation on the theme is interesting. One day I might get around to re-programming some of these activities to incorporate the variations that people suggest. It’s not massively time-consuming; it’s just that there are always other things to do!

  11. Jeremy;

    Reading this thought experiment, I remember for some reason the Milgram experiment. Basically, this experiment tested the effect of authority on people’s willingness to administer electric shocks to a subject (an actor) in another room. In this real experiment, most people administered nearly deadly shocks because that was what they were asked to do by an authority figure.

    What happened to the moral intuition of those people in those circumstances? And did the fact that the majority of people administered lethal shocks make it morally right?

    My point is: taking into account that these moral intuitions are revealed in specific circumstances, which certainly have an effect on the outcome, how valuable are “moral intuitions” for determining ethics? Having said this, I am still trying to figure out if this thought is relevant to the article and post. Sorry if it is not, but I still feel it has some value.

  12. Juan – I certainly don’t think the fact that a large number of people think that some act x is right makes that act right (that sort of position is naive moral relativism).

    But one of the interesting things about Milgram’s experiment is that many of the people administering the “shocks” felt incredibly distressed doing so, which suggests that at least some of their intuitions were contrary to their behavior.

    Moreover, in his book The Social Animal, the social psychologist Elliot Aronson reports that when you ask people whether or not they would themselves deliver shocks in the Milgram situation, almost universally people say that they would not. The reality, of course, is that probably they would – certainly that’s what the evidence from the many repetitions of Milgram’s experiments suggests – but nevertheless the fact that people clearly think the authority figure should not be obeyed suggests that maybe what’s going on in that situation is that people’s behavior becomes detached from their moral sense.

    I think the point you’re making is relevant here in that it’s a reminder that plenty of non-rational, and indeed, irrational, stuff can come into play when one is making decisions in extremis.

  13. Jeremy;

    “I think the point you’re making is relevant here in that it’s a reminder that plenty of non-rational, and indeed, irrational, stuff can come into play when one is making decisions in extremis.”

    Thank you for the clarification, because that was exactly what I was trying to say, and how that relates to the thought experiments. And the question we can raise is: do thought experiments designed to put people “in extremis” reveal our human nature or our human moral intuitions?

    Following the observation that people would not give shocks if not compelled by authority, we can cite the relationship of humans to murder. Studies indicate very clearly that for the majority of the population it is not easy to kill. For example, studies of WWII indicate that only ~10% of soldiers actually engaged in killing the enemy. There is a significant emotional barrier to destroying the life of another human being, and doing so can have long-lasting emotional effects on the killer.
    However, all this can change with training. Proper training desensitizes humans to the act of killing and in many cases avoids PTSD. Currently, when appropriately trained, soldiers engage fully in combat.

    My point is that a fundamental moral intuition – do not do harm to another human being – can be heavily conditioned by the environment. And my question is: do we, and how do we, take this into account in our ethics? And in our thought experiments?

  14. I’m unsure of what we are to conclude from the experiment, because it involves testing “intuitions” in cases far beyond the bounds of their application.

    The scenario is essentially a very long conditional. (1) If it is true that the man placed the bomb 24 hours ago, and (2) if the bomb will kill a million people if it explodes, and (3) if bomb experts could defuse it if they find it, and (4) if one cannot convince or trick the man, and (5) if torture has a 75% chance of success, and (6) if there is no other way to solve the problem, then torture might be an appropriate response.

    The problem is that – with the possible exception of (3) – simply none of these could be determined to be true with any degree of reliability in the kind of real-world situations in which human intuitions develop. Thus, even if the conditional itself might be true, it couldn’t be applied to any real-world problem where our intuitions are relevant.

    Consider that the fat man told you he placed a bomb under the mistaken impression that doing so (telling you) would enable his escape. You begin to torture him and he then asserts that he made the whole thing up. What then, given that you could only have begun to torture him on the basis that it was true that he placed the bomb…?

  15. @greg/juan – The other angle that SH brings into the discussion is “collateral damage” and how our moral intuitions don’t appear to square with the objective results. In talking about bombing a factory, say, we might view the inadvertent slaughter or maiming of some innocent bystander as justified, or at least ameliorated, by the fact that we didn’t intend to do anything of the sort. However, so far as the bystander is concerned, the result is the same. If we really are concerned with “harm”, rather than “greater good” or something similar, we should either view torture as we view collateral damage, or view collateral damage as we view torture.

    That was my takeaway, at least. I don’t read SH as attempting to justify torture, but rather as pointing out why our moral intuitions about the methods don’t square with the results.

  16. Ok, this test/questionnaire on torture is fun. - pingback on August 16, 2012 at 2:51 pm
