This episode is part of the Plato’s Academy Centre course on the Socratic Method. In this lesson, we will look at how cognitive therapists spot common cognitive distortions, and how these compare to well-known logical fallacies.
In the previous lesson, we looked at how cognitive therapists use Socratic Questioning today to help clients evaluate their beliefs in terms of evidence and helpfulness. They also help clients examine the "logic" of their beliefs. This is a slightly trickier approach, although in some respects it is closer to what Socrates and other philosophers were doing.
We're talking about informal logic here, i.e., whether our thinking is broadly rational and consistent or not. One of the most direct questions we might ask ourselves is therefore simply: Is that belief logical? People often find that hard to answer, though, without having some examples of ways in which their reasoning might be flawed.
Logical Contradictions
First of all, like Socrates, we might look for contradictions. Our thinking is definitely irrational if it is based upon a set of beliefs that contradict one another. One of the most basic principles of logic, indeed, is the Law of Noncontradiction: a statement and its negation cannot both be true. So when two of our beliefs conflict with one another, at least one of them must be false. They cannot both be true, that is, although, unless one is simply the denial of the other, they might both turn out to be false.
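Logicians often write the law symbolically as ¬(P ∧ ¬P): for any proposition P, it is never the case that both P and not-P are true. Putting it that starkly makes clear why finding a genuine contradiction guarantees that something in our thinking has gone wrong, even if it doesn't yet tell us which belief is the mistaken one.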
Spotting contradictory beliefs can require effort and rigorous honesty, but it can also be very powerful, because most (though not all) of us feel a strong urge to change our thinking when forced to admit that our own beliefs are in conflict with one another.
Cognitive Distortions
Beck and other cognitive therapists have found it helpful to teach their clients the names of typical "cognitive distortions", colloquially known as "thinking errors". Many studies show that certain cognitive distortions or biases become more common when people are depressed, anxious, or very angry. The cognitive therapist David Burns, for instance, lists about ten common "thinking errors" in his bestselling book Feeling Good. Other therapists use slightly different lists, but they generally have a lot in common. Some basic examples are:
Overgeneralization, or making sweeping statements that go well beyond the known facts.
Example: "Nobody likes me" versus "Some people don't like me."

Catastrophizing, or exaggerating how severe a threat is likely to be.
Example: "What if my wife leaves me? I won't be able to cope!" versus "My wife probably won't leave me, and even if she did it might be really bad but not the end of the world; I would survive and carry on."

Discounting, or trivializing information that should cause us to change our behaviour.
Example: "Pete said he likes me but he's just being nice" versus "Pete said he likes me, and for all I know he's telling the truth, so I shouldn't dismiss that as if it doesn't count."

Mind-reading, or assuming what other people think without checking – a problem especially common in severe social anxiety.
Example: "Everyone at work thinks I'm an idiot and I don't deserve my job" versus "I don't know what people think until I ask them, so I should find good ways to get feedback from my colleagues."
There are many more cognitive biases and distortions. As we'll see, they often resemble what philosophers call informal logical fallacies. For example, the cognitive distortion psychologists call "overgeneralization" is basically the same as the informal fallacy philosophers know as a "faulty generalization".
Informal Fallacies
Logical fallacies are arguments that are generally understood to be illogical – they're "wrong moves" in reasoning. Often people simply use them by mistake. Rhetoricians throughout the ages, though, have also used them quite deliberately to manipulate their audiences by "cheating" in an argument. Today, as you'll notice, some of these fallacies are extremely common in the media and in online discussions.
Ad Hominem fallacy. For instance, a very common informal fallacy is traditionally known as the argumentum ad hominem. This involves criticizing a person in order to discredit what they're saying. Politicians do this an awful lot, and now it's increasingly common on social media. For example, "This scientific claim can't be true because the people saying it are [conservatives / liberals] and everything they say is a lie!" Of course, nobody lies all the time, and if we're going to think for ourselves we need to learn to judge statements on their own merits rather than leaping to conclusions based on our political prejudices, and so on. Socrates, you may notice, is careful to avoid attacking others. In fact, he may do the opposite, and praise the character of those whose beliefs he is nevertheless questioning.
Straw Man fallacy. You'll also find many examples of the Straw Man fallacy online. This consists in falsely attributing an easily refutable position to someone in order to discredit them. It's a straw man they're attacking when they do that, a fake opponent that they've manufactured to make an easy target, not the real person. For example, someone might say "Cognitive therapists believe that all emotional problems can be solved by reasoning and that's clearly not true." That, however, is not a claim that any real cognitive therapist has ever made – it's just a caricature of their theory. Attacking a straw man is a way of cheating in a debate, to make it look as if you've refuted your opponent when really you're just refuting something they never said.
Causal fallacies. Scientists are trained, from the outset, to avoid these fallacies. When people who are not educated in research methods quote scientific studies, though, they often fall into errors of reasoning about causation. Causal fallacies take several forms. One of the most common is the post hoc ergo propter hoc fallacy, or "after this, therefore because of this." For instance, someone might drop dead after taking a specific type of medicine. You'll often find people arguing "He dropped dead right after he took those pills, therefore they killed him!" That does not prove a causal connection. (For all we know, he may have been dying anyway and didn't take enough of the medicine to cure himself.) A more general error of the same kind is known in science as the fallacy of confusing correlation and causation. Every researcher knows to avoid this methodological mistake. Many studies measure statistical correlations. It might be easy to show, for instance, that severity of depression is highly correlated with frequency of negative thoughts. Nevertheless, it does not logically follow from this correlation that negative thoughts actually cause depression – it could be the other way around.
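To make that last point concrete, here is a minimal sketch in Python. It uses simulated, made-up numbers rather than real clinical data, and the variable names (negative-thought frequency, depression severity) are purely illustrative. The point is simply that two opposite causal stories can produce the same strong correlation:

```python
# A minimal illustration (simulated data, not real clinical measurements)
# of why correlation alone cannot tell us the direction of causation.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Scenario A: negative thoughts drive depression severity.
neg_thoughts_a = rng.normal(size=n)
depression_a = 0.8 * neg_thoughts_a + rng.normal(scale=0.6, size=n)

# Scenario B: depression severity drives negative thoughts.
depression_b = rng.normal(size=n)
neg_thoughts_b = 0.8 * depression_b + rng.normal(scale=0.6, size=n)

corr_a = np.corrcoef(neg_thoughts_a, depression_a)[0, 1]
corr_b = np.corrcoef(neg_thoughts_b, depression_b)[0, 1]

# Both scenarios yield roughly the same correlation (about 0.8),
# so the correlation by itself cannot reveal which way causation runs.
print(f"A (thoughts -> depression): r = {corr_a:.2f}")
print(f"B (depression -> thoughts): r = {corr_b:.2f}")
```

Both scenarios print a correlation of roughly 0.8, so looking at the correlation on its own cannot tell us whether negative thoughts drive low mood or low mood drives negative thoughts.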
These are just a few examples of logical fallacies. There are many more. Learning to spot them helps protect us against bad reasoning or "sophistry", which is, sadly, very common in politics and the media, and extremely prevalent in debates on social media.
If you're interested in discussing this online, you can join our Substack Note thread about "How can we best define wisdom?" and subscribe to the Plato’s Academy Centre on Substack.