I don’t know whether this fallacy has a name of its own – I’m sure that Mike LaBossiere can tell us if it does – but how often have you seen somebody argue along the following lines?
P1. X believes A, B, and C.
P2. Y and Z (and others) believe A, B, C, and D.
C. (Therefore) X believes D.
What then happens is that X is criticised for believing D, even though D may be a proposition that X has never argued for, expressly relied upon, or even affirmed. In some cases, D may be some horrible proposition that would suggest X is of bad character if X actually believes it. In other cases, it may merely be something absurd, clearly false, or highly controversial.
As it stands, the argument that X believes D is straightforwardly invalid. It is no more valid if it takes the following variant form:
P1. X believes A, B, and C.
P2. Y and Z (and others) believe A, B, and C because they believe D.
C1. (Therefore) X believes A, B, and C because X believes D.
C. (Therefore) X believes D.
With the two premises reversed, arguments like this resemble the classic (and straightforwardly fallacious) form:
P1. Some Fs are Gs.
P2. a is an F.
C. (Therefore) a is a G.
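One way to see the invalidity is to exhibit a counterexample model: a situation in which both premises of the first argument form hold while the conclusion fails. Here is a minimal sketch, modelling belief sets in Python (the particular believers and propositions are invented purely for illustration):

```python
# A toy counterexample model for the first argument form: each person's
# beliefs are a set of propositions. Both premises come out true,
# yet the conclusion comes out false.
beliefs = {
    "X": {"A", "B", "C"},        # X believes A, B, and C -- but not D
    "Y": {"A", "B", "C", "D"},   # Y believes A, B, C, and D
    "Z": {"A", "B", "C", "D"},   # Z believes A, B, C, and D
}

# P1: X believes A, B, and C.
p1 = {"A", "B", "C"} <= beliefs["X"]

# P2: Y and Z (and others) believe A, B, C, and D.
p2 = all({"A", "B", "C", "D"} <= beliefs[person] for person in ("Y", "Z"))

# C: (Therefore) X believes D?
conclusion = "D" in beliefs["X"]

print(p1, p2, conclusion)  # True True False
```

Since such a model is possible, the premises cannot guarantee the conclusion. The same construction defeats the variant form: nothing stops X from holding A, B, and C on some basis other than D.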
Perhaps, however, there is something more going on in the minds of people who use arguments such as those I’ve identified.
Perhaps, on a particular occasion, they think that A,B,C without D is somehow an incoherent package of beliefs, and so they attribute to X what they see as the more coherent A,B,C,D.
Or perhaps they are reasoning inductively from a sociological observation that most people who believe A,B,C also believe D, so X probably believes D. Or maybe, related to the previous paragraph, they think that you could only, rationally, come to believe A,B,C on the basis of first believing D. Or the idea might be that believing D, which is widespread, causes a widespread bias in favour of people believing A,B,C (though D is highly controversial, or clearly false, or some such thing, once it’s explicitly identified).
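The inductive reading can be made concrete with some invented survey figures (purely hypothetical, for illustration): even a strong correlation between believing A,B,C and believing D licenses only a probabilistic, defeasible conclusion about X.

```python
# Invented figures, purely for illustration of the inductive inference.
abc_believers = 200      # people surveyed who believe A, B, and C
also_believe_d = 180     # of those, how many also believe D

# The inductive move: estimate P(D | A,B,C) from the sample.
p_d_given_abc = also_believe_d / abc_believers
print(f"P(D | A,B,C) = {p_d_given_abc:.2f}")  # 0.90

# Even so, 20 people in this sample believe A,B,C without D --
# X may well be one of them, so the conclusion remains defeasible.
print(abc_believers - also_believe_d)  # 20
```

The point is not that such reasoning is worthless, only that it yields "X probably believes D", never "X believes D", and the probability itself depends on sociological claims that need independent support.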
Although it’s always open to someone to put these sorts of arguments, they are obviously going to be tricky in any particular case. Reasons have to be given as to why D produces a bias; why D might be widely (perhaps subconsciously?) believed even though it is clearly false, or absurd, or whatever, once identified; why the position A,B,C, without D, is incoherent; why there is no other basis for thinking A,B,C; and/or whatever else might be required to make out the particular argument. You need to be careful before you move too quickly to saddle somebody with the absurd or clearly false or highly controversial or just plain horrible proposition D.
That said, the temptation to move quickly and incautiously down this path seems to be a strong one. Often we have enough background beliefs of our own (“Surely no one could possibly believe A,B,C unless they first believe D!”) that we find it very natural to draw the final inference intuitively and almost unconsciously. I know that I sometimes feel this temptation, and I’m sure I’ve often succumbed to it. I don’t think there’s a lot of point in castigating people for it, or even in apologising when caught doing it.
Hasty reasoning of this kind, leaving out steps, and failing to recognise just how difficult and inconclusive such arguments tend to be, is all too tempting. It’s lazy. It cuts corners. It can lead to you paying insufficient attention to what an opponent is really saying. In the extreme, it might encourage you to demonise an opponent (X surely “must” believe the horrible proposition D!) without a good basis. But it is not the sort of thing done only by irrational or ill-willed people.
My proposal is not so much that we go around castigating this way of thinking, which is almost ubiquitous. I don’t want to give real examples of it (and, as mentioned above, I could almost certainly find cases where I’ve done it, too). Rather, it’s something that we might be more aware of and careful about, given all that I’ve said, especially as it provides a route to misunderstanding and even demonising opponents. And in some cases our opponents are right there, taking part in the discussion with us, so we can simply ask them: “Are you relying on proposition D?”
All in all, attributing beliefs to opponents needs to be done with great care if they have not expressly relied on or otherwise asserted those particular beliefs. Speculating about what your opponents really think (but are not saying) may not be the worst of intellectual crimes, and it may be very tempting. Sometimes these speculations might even be relevant and useful (say your opponent claims to be relying on “nice”, attractive, good-for-their-public-image premises E and F, but you have independent reason to think they are really reasoning from discredited proposition D).
As always, nuance is important, but if we want to be fair, make progress, and avoid flame wars, let’s at least be careful about the kinds of reasoning I’ve discussed. At their worst, they are obviously fallacious. Even at their best, they are highly uncertain and need a lot of work before they can be employed cogently.