Chapter 11: Logical Fallacies
Note: This is an extremely long chapter with detailed examples and demonstrations, so we strongly recommend making use of the linked Table of Contents in the interactive web version of this text in order to navigate and find the information you seek.
Taken with kind permission from the book Why Brilliant People Believe Nonsense by J. Steve Miller and Cherie K. Miller.
Objectives
Upon completion of this chapter, readers will be able to
- Define and recognize common logical fallacies.
- Analyze and identify fallacious reasoning.
- Avoid logical fallacies in their own work.
- Avoid committing the “Fallacy Fallacy.”
Logic - Common Fallacies
The dull mind, once arriving at an inference that flatters the desire, is rarely able to retain the impression that the notion from which the inference started was purely problematic. - George Eliot, in Silas Marner
Brilliant people believe nonsense [because] they fall for common fallacies.
Even the brightest among us fall for logical fallacies. As a result, we should be ever vigilant to keep our critical guard up, looking for fallacious reasoning in lectures, reading, viewing, and especially in our own writing. None of us are immune to falling for fallacies.
Until doctors come up with an inoculation against fallacies, I suppose the next best thing is to thoroughly acquaint ourselves with the most common fallacies. I chose the following fallacies by comparing a dozen or so university sites that list what they consider the most common fallacies that trip up students.
Snoozer Alert!
Sorry, but this chapter doesn't contain fascinating stories and intriguing intellectual puzzles. But please resist the temptation to skim ahead to the following section. To think critically, we simply must familiarize ourselves with logical fallacies. Otherwise, we're fair game for all sorts of nonsense. Think of it like math. While the formulas themselves might be boring, we learn them in the hope of using them for something practical later. You'll assuredly find many of the fallacies below used in conversations and articles.
Think of logical fallacies as the grammar you must master to learn a foreign language. Before you can use a language practically (like writing a note to that ravishing foreign exchange student in her native language), you simply must learn the vocabulary and grammar. Similarly, logical fallacies are a part of the vocabulary of logical thinking. I'll try to make understanding them as painless as possible.
Learn these well. Reflect upon them. Look for them in the media. Familiarizing yourself with errant reasoning goes a long way toward helping you to write, reason, speak, and listen with more critical precision.
Tip: If some of my definitions and examples don't sufficiently clarify, look up the fallacy in Wikipedia or other sources for alternate explanations.
Below this list of fallacies, I'll give you a bit of practice by asking you to connect a fallacy with an errant argument. Finally, I'll give a few tips on checking your own argumentation (particularly in writing and speeches) for fallacies.
Twenty-Seven Common Fallacies
Ad Hominem
Translated into English: "against the person," aka "damning the source," the "genetic fallacy," "poisoning the well," related to "tu quoque" (you, too!). Defined as attacking the person (e.g., "he can't be trusted," "she's a moron") rather than the argument.
I don't believe anything he says because he's a biased political liberal.
Yet, shouldn't we assess his arguments based upon his evidence and argumentation, rather than solely because of his political label?
Caution: Sometimes a person has indeed been shown to be untrustworthy. Cautioning readers that he has been repeatedly caught in flagrant lies isn't an ad hominem fallacy. Noting a person's lack of integrity can be valid, if his argument requires us to trust him.
Tip: If the person's character is either irrelevant to the argument or unknown, focus on the facts and arguments.
Affirming the Consequent
AKA "converse error" or "fallacy of the converse." This is a formal fallacy (the form of the argument isn't valid) that assumes if the argument is valid going one direction, it's also valid when run the opposite direction.
Premise 1: If I get the flu, I'll be nauseated.
Premise 2: I'm nauseated.
Conclusion: Therefore, I have the flu.
This is invalid because while it may be true that if you get the flu, you'll get nauseated, the converse isn't always true. You can be nauseated and yet not have the flu. Perhaps you have a hangover or are pregnant.
Tip: If you see an argument in the following form, it's affirming the consequent:
Premise 1: If P, then Q
Premise 2: Q
Conclusion: P
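If you like to check such things for yourself, here is a minimal, purely illustrative sketch in Python that hunts for a counterexample to this form: a true/false assignment to P and Q that makes both premises true while the conclusion is false. Finding even one such assignment shows the form is invalid.

```python
from itertools import product

# Affirming the consequent: "If P, then Q. Q. Therefore P."
# The form is invalid if some truth assignment makes both premises
# true while the conclusion is false.
for P, Q in product([True, False], repeat=2):
    premise1 = (not P) or Q   # the conditional "If P, then Q"
    premise2 = Q
    conclusion = P
    if premise1 and premise2 and not conclusion:
        print(f"Counterexample: P={P}, Q={Q}")
# Prints: Counterexample: P=False, Q=True
# i.e., nauseated (Q) without having the flu (P).
```

Run the same search on the valid form "If P, then Q; P; therefore Q" (modus ponens) and no counterexample turns up, which is exactly the difference between a valid form and a fallacious one.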
Appealing to Extremes
Taking an assertion to an extreme, even though the person making the original claim may never have intended that extreme.
Avid health advocates blow out their knees by their 50s by running marathons. Therefore, don't prioritize regular exercise.
But not all avid health advocates run long distances as their primary exercise. It's an extreme statement.
Argument from Authority
AKA "argumentum ab auctoritate," "appeal to authority." Claiming that a position is true because an authority says it's true.
Even when the referenced authority is a true authority in the field, arguments should ultimately be based upon facts and reasoning rather than quoting authorities. Also beware of people quoting false authorities, like football stars or models selling insurance or technology.
We know global warming is true because a number of great scientists assure us it's true.
Caution: Sometimes citing authorities can be a valid part of an argument. For example, if a hefty percentage of respected scientists who specialize in a related field are all warning us about the dangers of global warming, this in itself provides evidence that global warming is at the very least a viable theory that needs to be seriously considered. Alternately, if no respectable scientists took global warming seriously, then this would surely be a strike against it, even though ultimately, we're looking for hard evidence rather than numbers of testimonies.
Tip: Ask yourself,
- Are these truly experts in the field I'm discussing? Would some view them as either biased or holding to fringe views on the subject?
- Have I explained clearly how I'm using these authorities as evidence, within the larger scope of my argument?
- Would it be relevant to explain the evidence that led the authorities to come to their position on the subject?
- Am I using their testimonies as helpful resources, quoting them as part of a larger argument, or quoting them as a slam dunk argument to make my case? Make sure you're not saying something like: Dr. Authority believes x, so we should believe x as well.
Argument from Ignorance
AKA "appeal to ignorance," "argumentum ad ignorantium," related to "non-testable hypothesis." Assuming that a claim is true because it has not been or cannot be proven false (or vice versa, assuming that a claim is false because it has not been or cannot be proven true.)"
Nobody can prove that my client was at the scene of the crime, therefore he's innocent.
Of course, he may in fact be guilty. We may simply lack sufficient evidence that he was there.
Caution: While some would say "absence of evidence is not evidence of absence," this isn't true in every case. For example, if I walk outside and see no evidence of rainfall (no puddles, the streets aren't wet), I'm justified in taking this as evidence that it hasn't rained recently. In this case, the absence of evidence for rain is indeed evidence for the absence of rain.
Band Wagon
AKA "ad populum fallacy," "appeal to widespread belief," "appeal to the majority," "appeal to the people." If a large number of people believe it, it must be true. It appeals to our desire to fit in.
Most people use Microsoft products, so they must be the best.
Everybody I know smokes cigarettes, so it can't be that bad.
Caution: Some people naturally despise majority opinion and relish holding contrarian positions. Those who disagree with opinions held by a majority of intelligent people should at least make sure they understand the reasons informed people give to justify their beliefs.
Tip: Remember that popular opinion is often wrong, and what's cool today may seem foolish tomorrow. In fact, it's often those who stand against the crowd who change the world. As Apple, Inc. put it in their famous slogan: "Think different."
Begging (Evading) the Question
AKA "circular argument," or "petitio principii," translated "assuming the initial point." The conclusion is assumed in a premise. This typically isn't as obvious as it first sounds.
The Writing Center at the University of North Carolina gives a good example.
Active euthanasia is morally acceptable. It is a decent, ethical thing to help another human being escape suffering through death.
At first read, it may seem pretty straightforward. But let's examine it as a premise and conclusion:
Premise: It is a decent, ethical thing to help another human being escape suffering through death.
Conclusion: Active euthanasia is morally acceptable.
Look closely at these two sentences and you'll discover that they actually do nothing more than state the same thing twice; the conclusion merely dresses up the premise in different words. "Decent, ethical" in the premise is worded "morally acceptable" in the conclusion. "To help another human being escape suffering through death" in the premise becomes "active euthanasia" in the conclusion.
Thus, the argument doesn't tell us much, if anything, about why euthanasia is morally acceptable. It leaves us asking the implied question over again, "But why is it acceptable?", showing that the premise and conclusion merely begged (i.e., evaded) the question.
Tip: Typically, rewriting the argument in the form of premises and a conclusion reveals when a question is being begged. Do you agree with the premises? Are there gaps in the line of argument? Does the conclusion say nothing more than the premises already stated?
Bifurcation
AKA "false dichotomy," "black-or-white fallacy," the "either-or fallacy," related to a "false dilemma." The argument makes it appear that there are only two possible answers, but there are actually more. Consider the bandit’s command, “Your money or your life!”
There is a third option, of course: you could fight the bandit. That option may be unappealing, but there is usually at least a third option.
Tip: Ask yourself, are there really two and only two options? If not, are any of the other options viable? Have all other options been sufficiently ruled out?
Dogmatism
Not even considering an opponent's argument, because of overconfidence in one's own position.
Statement: Mercedes makes the best car ever.
Retort: But according to Consumer Reports....
Dogmatic Defense: I don't care what those studies say; I know! Mercedes is the best.
Emotional Appeals
An appeal to emotion that is irrelevant (or largely irrelevant) to the argument.
The death penalty can't be right. Have you seen a person die in an electric chair?
Caution: Emotion can often be a legitimate part of an argument.
Look at these poor birds dying from an oil spill. This demonstrates one reason we should take great precautions to avoid such mishaps.
Equivocation
Related to "semantics," "playing with words." Using the same word with more than one meaning, thereby invalidating the argument.
Of all the animals, only man is rational. No woman is a man. Therefore, no woman is rational.
In the first instance, "man" means "mankind," whereas in the second instance, "man" means "the male gender." This change in meaning invalidates the argument.
Tip: Look carefully at the argument's important words. Are they used in a consistent way, or do they shift meanings?
Fallacy of Exclusion
Focusing on one group's behavior as if the behavior is exclusive to that group.
Watch those women drivers. They're always thinking of something other than their driving.
But are male drivers any better? Shouldn't this statement be based on psychological studies and statistics of accidents rather than personal observations of one sex?
False Dilemma
AKA "false dichotomy," "either/or," "black/white," "excluded middle." A form of bifurcation, this fallacy allows for only two extreme positions, although a legitimate middle ground might be arguable. Sometimes they paint one side as so extreme that nobody could ever agree with it.
"Either you support House Stark, or you’re a traitor to the realm."
Why this is a false dilemma:
This statement presents only two options—supporting House Stark or being a traitor—when in reality, there are many other possible stances:
- You might support House Targaryen, House Lannister, or remain neutral like House Tyrell.
- You might oppose all noble houses entirely (like the Brotherhood Without Banners).
- You might care more about the White Walkers than who rules Westeros.
This oversimplification ignores the complexity of the political landscape in Game of Thrones and falsely limits the choices to just two, creating a false dilemma.
Tip: When only two extreme alternatives are given, look for middle ground.
Faulty Analogy
AKA "weak analogy." Comparing two similar things to make a point, but the analogy breaks down because of one or more significant dissimilarities. Let’s look at another Game of Thrones example.
"Daenerys Targaryen is like Abraham Lincoln because they both freed people from slavery."
Why this is a faulty analogy:
At first glance, this might seem reasonable—they both opposed slavery. However, the comparison falls apart under scrutiny:
- Different contexts: Lincoln was a real historical figure operating in a 19th-century political and legal framework, while Daenerys is a fictional character in a fantasy world who uses dragons and war.
- Different motivations and methods: Lincoln worked through political and legal channels to preserve the Union and end slavery. Daenerys often used violence, fire, and fear to achieve her goals.
- Different outcomes: Lincoln is remembered for fostering democracy and unification, while Daenerys’s trajectory ends in tyranny and mass destruction.
The analogy oversimplifies complex figures and falsely equates their actions and legacies, making it a faulty analogy.
Tip: Are the two things being compared truly alike in all relevant respects?
Glittering Generality
AKA "Weasel Words." Using words in such a broad way that almost everyone resonates with them in the same way, thus lending credence to the argument. Thus, those who argue that their position is really about "freedom," "love," "human rights," etc., can gain a following, even though the words may mean different things to different people, or are being used in such a vague way as to be essentially meaningless. Consider the example, below.
"A just society always stands up for the most vulnerable among us."
Why this is a glittering generality:
- Emotionally powerful: Words like "just society," "stands up," and "most vulnerable" appeal to moral ideals and compassion.
- Vague and undefined: It doesn't specify who the "most vulnerable" are (children? the elderly? the poor? refugees?) or what "standing up" entails.
- Assumes universal agreement: It relies on shared positive associations without offering concrete definitions or solutions.
This type of language is persuasive because it sounds ethical and noble, but it lacks clarity or specificity, which is the hallmark of a glittering generality.
Hasty Generalization
Related to "non-representative sample," "fallacy of insufficient statistics," "fallacy of insufficient sample," "fallacy of the lonely fact," "leaping to a conclusion," "hasty induction," "secundum quid” (converse accident). A conclusion was reached via inadequate evidence, such as when a sample cited was inadequate (e.g., atypical or too small) to warrant a generalized conclusion.
Most Hollywood stars have terrible marriages. Just read the tabloids.
Their conclusion may or may not be true, but reading tabloids is no way to decide the issue. News sources by their very nature select what's "newsy." Since a nasty divorce is more newsy than a stable marriage, the former gets the press, giving the impression that most Hollywood stars can't hold a marriage together.
I'll never fly again. I read about too many accidents and hijackings.
Again, you don't hear about the thousands of flights with no incidents. Thus, you're judging from the news you hear, which is both an atypical and small sampling. The National Safety Council calculated the odds of dying in a motor vehicle accident as one chance in 98 over a lifetime. The odds of dying in air travel (including private flights) were one chance in 7,178, making a lifetime of driving roughly seventy times more likely to kill you than flying.
Tip: Notice the sample size and where it's drawn from. Is it adequate to warrant the conclusion? Is the conclusion stated in terms that are too general and sweeping?
Inconsistency
AKA "non contradiction." The argument contradicts itself.
Only statements that can be justified with scientific experiments can be believed.
Yet, this statement itself can't be justified by scientific experiments.
Our brains developed, not to think logically, but for survival in an agrarian society. Therefore, we can't trust our reasoning.
This statement uses logical reasoning, although it's claiming logical reasoning is not to be trusted.
Moral Equivalency
Arguing incorrectly that two moral issues are sufficiently similar to warrant the same treatment. It often compares lesser misdeeds to major atrocities.
Killing in war is legalized murder.
In some instances, this may be true. But in all instances?
Our local police act like Nazis—they have no respect for my human right to drive my car like I want.
I don’t know why his political opponents complain about his legal troubles with regard to human trafficking; after all, their candidate was accused of shoplifting when he was thirteen.
Non Sequitur
Translated: "it does not follow." A general category that includes "hasty generalization," "slippery slope," "affirming the consequent," "missing the point," etc.) The conclusion does not follow from the premises.
Patrick always smiled at me and was so respectful. He couldn't have burned down the gym.
Is there some absolute law of nature that states that respectful, smiling people never burn down gyms? While Patrick's character in relation to you can be a relevant piece of evidence to be considered, it's a non sequitur to say that it proves he could have never burned down a gym.
Non sequiturs can be really absurd. Consider this next, very obvious one.
"Darth Vader is really good at using the Force, so he’d probably be great at baking cookies."
Why this is a non sequitur:
- The conclusion (he’d be great at baking cookies) does not logically follow from the premise (he’s good at using the Force).
- There’s no clear connection between Force abilities—like telekinesis, mind control, or lightsaber combat—and baking skills.
- This jump in logic makes the argument absurd and illogical, which is exactly what defines a non sequitur.
Tips:
- Forget the conclusion for a moment. Looking solely at the premises, ask yourself what can be concluded from the premises.
- Now look at your conclusion. Ask yourself what kind and amount of evidence you'd need to support this conclusion. Do the premises provide that kind of evidence?
- Is your conclusion too extreme? Would it be closer to the truth if it weren't overstated?
Failing Occam's Razor
Occam’s Razor refers to the idea that one should prefer a simpler explanation (or hypothesis) to a more convoluted or complicated one. The idea is attributed to William of Ockham, the 14th-century English philosopher and theologian, not because he came up with it but because he used it so often.
Here is an example.
Your best friend Ralph flunked Calculus. Possible reasons:
- If we were to run a psychological profile of both Ralph and his professor, we might find that they have diametrically opposed learning styles, thus making communication extremely difficult.
- Aliens kept Ralph up all night before both the midterm and final exam, questioning him and keeping him from adequate rest and preparation.
- Ralph admitted to never doing his homework and seldom attending lectures.
Occam's Razor would prefer the third, simpler and more obvious explanation.
Warning: Occam's Razor doesn't decide all cases, since many explanations that end up being proven over time are indeed more complicated than their disproven counterparts. Typically, when choosing between competing scientific theories, the best fit with the observable data wins over simplicity. So it's wise to treat Occam's Razor as a rule of thumb rather than a hard and fast rule.
Post Hoc Ergo Propter Hoc
Translated "after this, therefore because of this." Often shortened to "post hoc," also called "faulty causality," "faulty cause," "false cause," or "correlation vs. causation"). Correlation and causation are confused in that one event follows another and the former is falsely assumed to be the cause of the latter.
Ever since his trip to India, Alfred's been sick. Obviously, he caught something in India that our doctors can't diagnose.
Tips:
- When one event is claimed as the cause of another, look for other possible causes. In the above example, perhaps Alfred caught something the day he arrived back home, or already had an illness before going to India, but never developed symptoms until he returned.
- Give evidence beyond "this happened after that," to support your claim. For example, you might discover that Alfred consulted with seven American diagnostic specialists, who all agreed that it was a malady they'd never before seen. This would lend credence to the "he caught it in India" theory.
Red Herring
Deflecting an argument by chasing a rabbit (an irrelevant topic). The name "red herring" was originally used in fox hunting, when a herring (a type of fish) was dragged across a trail to throw the dogs off the scent of the fox. This fallacy is often called "whataboutism" because it often starts with "Oh yeah? Well, what about . . ." as a response.
After Harry's wife caught him gambling away his paycheck and asked for an explanation, he responded, "At least with gambling I have a chance to get my money back. What about your weekly purchase of clothes that ends up in a bag for Goodwill? And why isn't your recent raise helping us to pay our debts?"
Harry's arguments deflect from the immediate issue: he gambled away his paycheck.
Sure, the mercury found in seafood is often unsafe, but fishermen have to make a living like everyone else.
Tip: If you're not sure, write the argument out as a line of argument. This typically shows clearly where the argument got off track.
Reductionism
AKA "oversimplifying," "sloganeering." Reducing large, complex problems to one or a few simplistic causes or solutions.
The problem with our economy can be reduced to two words: trade imbalance.
What about other relevant issues, such as the drain of a huge national debt?
Tip: Ask yourself, "What other factors may contribute to this problem, or be a part of the solution?"
Slippery Slope
AKA "snowball argument," "domino theory," "absurd extrapolation," "thin edge of the wedge," "camel's nose." Arguing that one change or event will inevitably lead to another, eventually landing them at a place they never wanted to go.
If we allow more restrictions on purchasing guns, this will be followed by further restrictions and eventually the government will confiscate all our guns.
We can’t allow same-sex marriage! Next thing you know, people will want to marry toasters and chairs!
Caution: Slippery slopes do exist. The question is, just how slippery is the slope? Is it slippery enough to make the slide to the bottom inevitable?
Tip: Look closely at your argument for each link in the chain of consequences. Is there adequate evidence to conclude that each progression is either inevitable or fairly certain? Are there abundant historical precedents that back up the claim? Are there historical precedents that provide contrary evidence?
Stacking the Deck
AKA "cherry picking." Listing the arguments (or evidence) that support one's claim while ignoring the ones that don't.
Capitalism inevitably leads to a violent revolution by the proletariat. Here are fifty examples from history.
Tip: Ask yourself, "Are there counterexamples that the arguer is ignoring, or are they simply pulling out examples that support their theory?
Straw Man
Presents a weak form of an opposing argument, then knocks it down to claim victory.
Jack emailed his professor that he missed class due to a bad case of the flu and that he would bring a doctor's note. The next day, the professor announced in class that he would not excuse Jack's absence because his excuse was that he didn't feel like coming (not mentioning the flu or the note). Since the professor put Jack's argument in such a weak form, he was arguing against a straw man rather than Jack's actual defense.
Tip: Do you know the strongest arguments of your opponents? If so, are those the arguments you're arguing against? If not, you're not answering your opponents' actual position; you're knocking down a straw man.
Sweeping Generalization
AKA dicto simpliciter. Assumes that what is true of the whole will also be true of the part, or that what is true in most instances will be true in all instances.
"First-year college students are all irresponsible and just want to party instead of study."
This is a sweeping generalization because it unfairly assumes that all first-year students behave the same way, ignoring the many who are focused, disciplined, and academically driven.
Tip: Particularly when arguers use all-inclusive words like "all," "always," "never," "nobody," or "everybody," ask yourself if the premises and/or conclusions should have been presented in less stark terms. Do you know first-year college students who are responsible and study a lot? If so, not all first-year college students are irresponsible and just want to party instead of study.
Action Points
A Checklist for Spotting Your Own Fallacies
Ask these questions before turning in a paper, making a speech, or arguing with friends.
- How would your opponents respond to your argument? What parts would they likely attack? Have you actually read the strongest arguments of your opponents and considered their side? Is there a way to strengthen your weak arguments?
- How would your argument look as a syllogism or line of argument? Do you have adequate evidence for your premises? Does your conclusion flow logically from your premises?
- Is your conclusion presented with the degree of certitude that's warranted by the evidence? (Be especially cautious if you use all-encompassing words like "always," "never," "everyone," etc.)
- Are there certain types of fallacies that you often fall for? (Consider how professors responded to your earlier papers or speeches and how your friends respond to your arguments.)
Logic - How to Do it Wrong
Anyone who denies the law of non-contradiction should be beaten and burned until he admits that to be beaten is not the same as not to be beaten, and to be burned is not the same as not to be burned. - Avicenna
Brilliant people believe nonsense [because] they contradict, leave out valid options, and knock down straw men.
Those Who Question Logic
To the mind that's yet to be "enhanced" by some strains of modern thought, the above quote probably comes across as amusing, but useless. After all, who would deny something as basic as the law of non-contradiction or the basic laws of logic? If saying "My roommate annoys me" is no different than saying "My roommate doesn't annoy me," then how can we ever say anything meaningful? Moreover, the very act of denying non-contradiction assumes the law to be true.
Yet, some argue that our brains, like our opposable thumbs and other body parts, evolved not to perfect our logic, but to optimize our survival. According to these thinkers, when early man moved up in the world from hunting and gathering to settled life on the African Delta, survival of the fittest favored those who learned to cooperate to grow crops, raise families, and breed domestic animals. Thus, our brains evolved to foster domesticity rather than to think through logically rigorous legal or scientific or philosophical arguments.
(Digression: Surely it's equally plausible, even when reflecting upon recent history, that evolution should favor brains that are ruthless and conniving, employing a logic that's better suited to achieving selfish ends than to seeking truth. When dispassionately objective intellectuals taught ideas that displeased Stalin, he removed them from the gene pool by the thousands. Thus, a large portion of 20th-century man, under such rulers as Lenin, Stalin, Mao, Hitler, and Pol Pot, survived by suppressing their creativity and independent thought and perfecting a "don't piss off the morons in charge" type of thinking. In my mind, it would be difficult to prove that long ago, living in small communities on the Delta, brilliant misfits would have survived any better.)
Thus, following this naturalistic line of argument, our brains developed primarily for primitive survival, not to reflect accurately on the great scientific theories of cosmology or macroeconomics or to develop rigorous rules of logic. Those who walked about the early Delta with their minds distracted by such matters were almost certainly eliminated from the gene pool by animals higher up on the food chain.
Rather than being equipped for higher level thinking, according to this theory, we find our brains uniquely suited to think in ways that enhance our self-confidence, enable us to compete, socialize, and convince the opposite sex to mate with us.
As a result, today's brains should resonate more with Glamour Magazine, Playboy, and Sports Illustrated than with Physics Today or Philosophy Now. In its favor, this theory successfully predicts the type and quality of magazines available for purchase at service station check-out counters. Such academics as psychologist Susan Blackmore and philosopher Alex Rosenberg similarly argue that our brains, in their present state of evolution, deceive us in many ways and can't be trusted. Why then should we trust in the ability of our empirical investigations or logical argumentation to help us find truth?

Without recounting the intricate details, I should also mention that eighteenth-century philosopher David Hume argued, with breathtaking influence on modern thought, that taking empiricism to its logical conclusion leads to skepticism concerning any certain knowledge. His works, and many who built upon his foundation, have led some contemporary intellectuals to a thoroughgoing despair of finding truth through science or logic or any other means. This is all to say that if you read widely, you'll run across many who teach that all truth is relative and a search for truth is futile.

Rather than set forth a defense of our ability to find truth, or at the very least of our ability to weed through nonsense in order to get closer to the truth, I'll just note that I've never found a thoroughgoing skeptic who lives consistently with his skepticism.
As soon as he opens his mouth or wields his pen, he begins making statements that depend upon the very laws of logic he denies. When Blackmore argues that our minds deceive us and can't be trusted, why does she go on to write the next chapter? If she really believes what she wrote, she can't trust her reasoning. If I believe what she wrote, I can't trust in either the accuracy of her writings or my ability to interpret them. So why keep reading? After a professor teaches his students that we can't know truth, no sooner has he left the classroom and met his department chair than he engages her in an argument, based upon the facts and logic he denies in class, about his deplorable salary. And he certainly won't be satisfied if his boss responds that the argument is pointless because all truth is relative.
In the end, whether you claim to be a thoroughgoing skeptic or a believer in our ability to find truth, logic would seem useful, at least in arguing for a raise. So since this isn't a book on epistemology, let's proceed as if logic is indeed useful, and try to sharpen our ability to use it.
The Syllogism* as a Useful Starting Point
Syllogism - a type of argument that begins with two or more premises and draws a conclusion.
Increasingly, I find myself putting complex, convoluted, or long-winded arguments into the form of syllogisms in order to evaluate them. The value of this process was demonstrated to me at a recent philosophical conference. I was astonished to hear a philosopher attack a 450-page book by reducing the author's line of argument to a simple, three-line syllogism. If the philosopher succeeded, then no matter how many studies the author quoted, no matter how much data he accumulated, no matter how many more pages he wrote, if his line of argument was illogical, his conclusion wasn't warranted.
Here's the classic example of a simple, correctly formulated logical syllogism:
Premise 1: All men are mortal.
Premise 2: Socrates is a man.
Therefore: Socrates is mortal.
The beauty of a correctly formulated syllogism is that if we agree with the premises, then we must agree with the conclusion. Do you agree that all men are mortal? Do you agree that Socrates is a man? If so, then you must believe that Socrates is mortal. It's a logically airtight argument.
To evaluate someone's argument, try to put it in a syllogistic format and focus on two questions:
- Do you agree with the premises? (Are they either intuitively obvious or well-supported by evidence?)
- Does the conclusion logically follow from the premises?
Of course, arguments can get quite complicated, requiring complicated syllogisms to replicate them in logical form. If you're interested in exploring the more complex forms, study deductive logic. But I find that basic syllogisms suffice to evaluate the vast majority of meaningful arguments, even when evaluating chapters or entire books.
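If it helps to see why a correctly formulated syllogism leaves you no escape, here is a minimal, purely illustrative sketch in Python that models the Socrates syllogism with sets. The names and memberships below are illustrative assumptions, not data: Premise 1 says the set of men is contained in the set of mortals, Premise 2 puts Socrates in the set of men, and the conclusion then cannot fail.

```python
# A minimal sketch of the Socrates syllogism, modeling categories as sets.
# The names and memberships below are illustrative assumptions, not data.
men = {"Socrates", "Plato", "Aristotle"}
mortals = men | {"Xanthippe"}          # Premise 1: all men are mortal (men is a subset of mortals)

premise1 = men.issubset(mortals)       # All men are mortal.
premise2 = "Socrates" in men           # Socrates is a man.
conclusion = "Socrates" in mortals     # Therefore, Socrates is mortal.

# If both premises hold, the conclusion cannot fail: membership in a
# subset guarantees membership in the containing set.
assert not (premise1 and premise2) or conclusion
print(premise1, premise2, conclusion)  # True True True
```

The point isn't the code; it's the structure. Once you grant the premises, the conclusion is already contained in them, which is exactly what it means for the argument to be logically airtight.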
Let's Analyze an Argument!
Let's start with an argument proposed by a bright person and analyze it. Here are a couple of formulations of an argument put forth by Richard Dawkins, a popular science writer who once taught at Oxford University.
In his book, The God Delusion, Dawkins seeks to establish atheism, primarily by attacking theism. But he does present one positive argument for atheism, which he claims demonstrates that there is almost certainly no God. Dawkins believes the argument is devastating to theism— "an unrebuttable refutation." It makes for a good argument to examine, since Dawkins states it in a few sentences rather than arguing it extensively.
Here's how he puts it:
"...any creative intelligence, of sufficient complexity to design anything, comes into existence only as the end product of an extended process of gradual evolution. Creative intelligences, being evolved, necessarily arrive late in the universe, and therefore cannot be responsible for designing it."
Later in the book, he puts it this way:
"The whole argument turns on the familiar question 'Who made God?', which most thinking people discover for themselves. A designer God cannot be used to explain organized complexity because any God capable of designing anything would have to be complex enough to demand the same kind of explanation in his own right. God presents an infinite regress from which he cannot help us to escape."
Think!
Before reading any further, try your own hand at responding to Dawkins. He says that he has "yet to hear a convincing answer" to his argument. Do you think it's irrefutable? If the argument seems rather muddled to you, start by reading one sentence at a time and asking yourself, "Do I agree or disagree with this statement, and why?" Perhaps trying to put it in syllogistic format would help, or trying to express it as a line of argument. (Caution: Try not to let your personal worldview interfere with your reasoning. The question I'm asking is not "Is there a God?" but rather "Is Dawkins' argument irrefutable?")
Using a Line of Argument* and Syllogism to Clear Muddy Waters
Line of Argument - a simplified form of a long or convoluted argument, summarized as a series of sentences.
If I understand Dawkins correctly, here's his line of argument:
There are only two possible ways that God's existence could be accounted for:
- He was created by another being. But that explanation doesn't really help because then we have to ask, "Who made that designer, and the one who made him?" which leads to an infinite regress of questions which we can never fully answer.
- He slowly evolved through time. But if He evolved, He would not have developed His incredible intelligence and power until the end of a long process of evolution. Yet, in order to create the universe, He needed this intelligence and power at the beginning. Thus, He couldn't have created the universe. Besides, what are the odds that such a complex being could evolve through purely naturalistic causes?
Dawkins thus concludes that since both of these scenarios are highly unlikely, it's highly unlikely that God exists.
Put in a syllogism, it might read like this:
Premise 1: If God exists, he must have come into existence by either being created by another being or evolving slowly through time.
Premise 2: It's highly unlikely that God came into existence by either being created by another being or evolving slowly through time.
Conclusion: It's highly unlikely that God exists.
Think!
Does laying it out as a line of argument and as a syllogism help? Do you think I did it accurately? Now think through the line of argument and syllogism. Do you agree with each of the premises? (Is it sound?*) Did Dawkins argue correctly from these premises? (Is it valid?*)
Sound Syllogism - the premises are true and the form of the argument is valid.
Valid Syllogism - the form of the argument is correct, whether or not the premises or conclusion are true.
As we continue with this chapter, we'll introduce some logical fallacies and apply them to both Dawkins' argument and the introductory discussion.
Fallacy #1: Bifurcation
Dawkins' argument seems to be a good example of a fallacy called bifurcation, whereby the argument assumes that only two (note the prefix "bi", meaning "two") possibilities exist, whereas there are actually more. This fallacy is particularly pernicious because it seems to contain an element of sleight of hand. If it is presented by a person we respect or agree with, we tend to assume that his premises represent all possibilities and we focus on the validity of the argument rather than the accuracy of the premises.
So here's how Dawkins' argument appears to be guilty of bifurcation.
He assumes that there are two and only two possible explanations for the proposed existence of God:
- He was either created by another being, or
- He evolved by natural means slowly over time.
To justify limiting the explanations for God's existence to these two options, Dawkins should have eliminated a third, seemingly viable option: that God could have simply existed from eternity past. After all, until well into the 20th century, the majority of scientists saw no problem in believing that matter existed from eternity past. Why then could God not have existed from eternity past? Is there evidence (either empirical or logical) that if God exists, He could not have existed from eternity past (or, alternately, could not exist outside of time and space)? If there is such evidence, then Dawkins should forward it. Otherwise, his premises are misleading and inaccurate in that they unnecessarily ignore this option.
To put it another way, Dawkins claims that there are two and only two ways the existence of God could be explained. By explaining those two away, he claims to have explained away the existence of God. Yet, he's ignored (or deflected his readers from) a third possibility which he needs to explain away as well: that God existed from eternity past. By overlooking this third option, his argument fails, falling to the fallacy of bifurcation.
Other Examples of Bifurcation
The Atlanta Falcons' loss to the New England Patriots was due to either inept play or poor coaching.
But aren't there more options than two? Perhaps they lost primarily because of a brilliant strategy by the opposing coaching staff, or the Patriots quarterback was on a roll, or the injury to the Falcon running back caused the Falcons to resort to "Plan B" rather than "Plan A," or any number of other possibilities that the armchair critic needs to rule out.
What a despicable child! He obviously either inherited bad genes or has inept parents.
What are some other possible contributing factors to the child's behavior? Perhaps he's sick or tired or teething.
Fallacy #2: The Straw Man
I'm dealing in this chapter with arguments that are very common. Familiarize yourself with them and you'll begin to see them everywhere—in articles, news broadcasts, Facebook discussions— everywhere!
The Straw Man fallacy presents a weak form of an opposing argument so that it's easy to destroy it and declare victory. The writer or speaker never actually attacks the opponent's arguments. Instead, he avoids the opponent's arguments by "knocking down a straw man."
Dawkins seems to have erected and knocked down a straw man in the argument we considered above. In brief, he argued that it's very unlikely that an evolved or created God exists. But the vast majority of theistic theologians and philosophers of the Western world would likely agree with this statement. In fact, I don't believe I've ever met a theist who believes in a created or evolved God. So arguing against this kind of a God says nothing about the existence of the eternal God that most of Dawkins' opponents believe in.
Thus, Dawkins has set up an irrelevant straw man (or in this instance, a Straw God), and tried to disprove His existence. If successful, he merely succeeds in knocking down a position that his opponents never held. The philosophers and theologians he's attacking overwhelmingly define God as one who existed from eternity past (or exists outside time and space). Dawkins should have attacked the position held by those he attacks.
Michael Ruse, Professor of Philosophy at Florida State University, himself an atheist, criticizes Dawkins' argument in part for this very reason. He concludes: "...I want to extend to Christians the courtesy of arguing against what they actually believe, rather than begin and end with the polemical parody of what Dawkins calls 'the God delusion.'"
Another Example of Arguing against a Straw Man
A friend remarks to you: "The last three winters have been colder than average. So much for the theory of global warming!"
Your friend assumes that global warming advocates argue in this manner: "If temperatures are truly rising, every year and every geographical location should show increased warmth." But nobody argues this. It's arguing against a straw man. Global warming advocates actually argue that over long periods of time the average temperature is increasing. Those who argue against global warming should argue against this rather than a straw man. In fact, this is why global warming is now usually referred to as climate change, which describes the changing climate more accurately.
Summary
The arguments we've examined in this chapter were put forth by bright people with topnotch education credentials—often PhDs holding prestigious positions. If they are subject to falling for logical fallacies, how much more the rest of us?
Why do brilliant people believe nonsense? Because they fail to sufficiently check their beliefs against logical fallacies. How can we guard ourselves from similar errors in thinking?
How to Spot Logical Fallacies and Keep from Using Them in Our Own Communications
Take time to think through arguments that are important to you.
Most don't. In fact, they barely even pay attention. Philosopher and scientist Francis Bacon once wrote: "Some books should be tasted, some devoured, but only a few should be chewed and digested thoroughly." For those few books, articles, or lectures worth digesting, if the argumentation is complicated or unclear, I often summarize it with a line of argument, sometimes chapter by chapter. It takes a bit of time, but it keeps me from ending the book in a mental fog.
Don't be intimidated by credentials and claims.
Surely this is, in part, why people take nonsense promoted by well-credentialed people at face value. Never listen to anyone without engaging your critical thinking.
Beware of the tendency to uncritically accept the arguments of those you agree with, or arguments that have an agreeable conclusion.
Professor H. Allen Orr, in the New York Review of Books, reflected on Dawkins' argument and his way of arguing. According to Orr:
"Indeed he suffers from several problems when attempting to reason philosophically. The most obvious is that he has a preordained set of conclusions at which he's determined to arrive. Consequently, Dawkins uses any argument, however feeble, that seems to get him there and the merit of various arguments appears judged largely by where they lead."
Ask yourself, "Are there facts or personal experiences that don't fit with either the premises or the conclusion?"
One might hear someone say, "Online classes are ineffective because students always get distracted and don't learn anything." And you might ask yourself, "Are there facts or personal experiences that don't fit with either the premises or the conclusion?" Perhaps a friend might respond, "Actually, I took an online calculus class last semester and found it more effective than in-person lectures because I could pause and rewatch difficult parts. I got an A in the class." Or, you might cite the meta-analysis by Dr. Barbara Means that found that online learners performed as well as or better than face-to-face learners.
A student's personal experience doesn't fit the premise that online classes always lead to distraction, nor the conclusion that they are ineffective. The research doesn’t fit the premise, either. Asking this question helps reveal exceptions that challenge overgeneralized or unsupported arguments.
Put it in a syllogism (or line of argument) and ask yourself two questions:
- Are the premises supported by sufficient evidence?
- Does the conclusion follow logically from the premises?
(Ask, is the data complete and accurate? Is the reasoning from that data clear and accurate?)
Have others look at the argument.
Learn from Hewlett-Packard's practice of running an idea by the person next to you. If the idea is important to you, discuss it with others. We all think a bit differently and it's very likely that others will see aspects of the issue that you don't see.
For example, Einstein once observed that scientists are typically poor philosophers. Whether he's right or not, psychologists do find people typically having strong and weak areas of reasoning. If a scientist is trying to reason philosophically, he might be wise to run his arguments by a philosopher. It's often wise to run important arguments by people who think differently from you.
See how others in the field respond.
Dawkins' argument is philosophical, and the field of philosophy has a rich history of arguments concerning the existence of God. It would seem unlikely, though not impossible, that an expert in animal behavior (Dawkins) would dream up a slam dunk argument that never occurred to any great philosophical thinker from Plato to Immanuel Kant to Bertrand Russell. If Dawkins' argument were truly original and significant, I'd expect a loud chorus of respected philosophers to be hailing this argument's arrival.
Yet, the responses I've seen by philosophers and academics have been underwhelming at best. Philosopher William Craig went so far as to declare it "the worst atheistic argument in the history of Western thought." Academic biologist H. Allen Orr noted that the argument was "shredded by reviewers." For example, some attack the argument by noting that an explanation doesn't typically require an explanation of the explanation (responding to Dawkins' contention that theists must forward an explanation as to where God came from). In other words, if we were to visit the dark side of the moon and find an advanced, but long-abandoned (at least a century old, deduced from its state of natural aging) mining operation, where all the inscriptions were in a non-human language, wouldn't we be justified in positing that alien intelligences were behind it, even if we had no idea how the aliens came to be or where they were from? And it's not just theistic philosophers who find Dawkins' argument lacking.
Atheist Michael Ruse attacks Dawkins' argument in this way:
"Like every first-year undergraduate in philosophy, Dawkins thinks he can put to rest the causal argument for God's existence. If God caused the world, then what caused God? Of course, the great philosophers, Anselm and Aquinas particularly, are way ahead of him here. They know that the only way to stop the regression is by making God something that needs no cause. He must be a necessary being. This means that God is not part of the regular causal chain but in some sense orthogonal to it. He is what keeps the whole business going, past, present and future, and is the explanation of why there is something rather than nothing."
Surely such rejoinders are legitimate challenges that Dawkins should respond to. Had he run his argument by some philosophers prior to publishing, perhaps he could have responded to their objections.
Think Differently (Creative Thinking)
One of philosopher Immanuel Kant's most valuable contributions to practical human thought was his insight that we don't experience things entirely as they are. While some people insist that seeing is believing, we all know that seeing can also be deceiving. For example, Kant notes that we don't see objects directly. Rather, we're a step removed in that we see reflections of objects on our retinas. We take another step back from real objects when our brains bring our own interpreting mechanisms to those objects, such as "quality" or "cause and effect."
Modern psychology confirms and extends Kant's insight. We don't "see" the reflections on our retinas in the same way. While you may see a green object on your retina, I may see it as brown, since I'm color-blind to certain greens. And we're well aware of common optical illusions and misperceptions. That's why eye-witness testimony is often contradictory, even when the witnesses are honest. Often, what we see shouldn't be believed.
You've probably seen illustrations such as this, where our minds fool us. How many "F"s do you see in this passage?
FINISHED FILES ARE THE RESULT OF YEARS OF SCIENTIFIC STUDY COMBINED WITH THE EXPERIENCE OF YEARS.
Most people see only three. That's all I saw the first two times I read it. Actually, there are six. (Look slowly at each letter and count again, perhaps starting at the end.) This is similar to the problem drivers have spotting motorcycles on streets where they are rare. We're watching for cars and trucks and may not see the motorcycles at all.
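If you'd rather not squint letter by letter, a one-line check settles the count. Here is a minimal, purely illustrative sketch in Python; the only data it uses is the passage printed above.

```python
# Count the letter "F" in the passage from the illusion above.
passage = ("FINISHED FILES ARE THE RESULT OF YEARS OF SCIENTIFIC "
           "STUDY COMBINED WITH THE EXPERIENCE OF YEARS.")
print(passage.count("F"))  # prints 6
```

The machine counts characters; we read whole words, and the "F" in each "OF" (which we pronounce with a "v" sound) tends to slip right past us.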
Are the horizontal lines below curved or straight? Use a ruler or straight edge to see.
Fallacies such as bifurcation, like a good magician or an illusion, play on our brains' tendencies to see certain things incorrectly or to be distracted from crucial details. How can creativity help us to overcome distractions and wrong directions in order to innovate productively?
Broaden your range of input.
Who would you prefer to edit your writing?
- A dyslexic person, who struggles to read well?
- Slow readers?
- A top academic who teaches grammar and literature?
- A person so proficient at reading that she can polish off an entire novel in an evening?
Intuitively, most authors seem to seek out only the last two types, and I agree that their input has a place. After all, shouldn't avid readers and top grammarians have valuable input?
But I'm increasingly seeking editorial input from a wider range of people. While fast readers may excel at telling you if your story is interesting and flows well, the slow reader may be better for thinking through your line of argument, spotting places that need more documentation, or helping you with the rhythm produced by combinations of long and short sentences. Literature professors tend to love clever analogies and brilliant descriptions, whereas the average reader may see these as distractions from the story line. That's why I like input from both.
Academics have a high tolerance for detailed argumentation and theory. While I'll get their input on this book, I can't quite trust their verdict if they tell me it's interesting. If I'm writing, not primarily for professors, but for their students and the broader public, I treasure input from those who aren't naturally interested in my subject matter. I'm blessed with dyslexic twins and love their input. That's one reason I use lots of white space, bullet points, and illustrations. Dyslexics cringe when they see a page full of unbroken words. I've found that if I can hold the attention of struggling readers, I'm more likely to captivate a broad range of readers, and in the end delight academics as well.
At times, ignore the current theory that drives your research, and allow non-experts to offer ideas; or just throw a bunch of stuff against the wall to see what sticks.
Sometimes our theories and methods keep us from trying potentially fruitful experiments. Since we seldom recognize that the ruling theory may have deflected us onto a side road, it sometimes helps to toss it and try something new.
Isn't this the way inventor Thomas Edison often proceeded? I still picture him in his later years, stopping beside the road to sample plants that might be used as a substitute for the rubber used to make tires, a material he feared could run short in a future war.
- A thirteen-year-old, Jack Andraka, took an intense interest in trying to cure pancreatic cancer after it killed a family friend. Being new to the field, he took a different direction from the standard research, resulting in his inventing a simple, cheap test to detect pancreatic cancer early, when it can be successfully treated.
- Don Valencia, a cellular biologist who developed tests to diagnose autoimmune diseases, had worked on isolating molecules in human cells without destroying them. It occurred to him that this technique might work for making a concentrated extract of coffee that could capture its flavor more successfully than other extracts. He experimented with it in his kitchen, trying out different flavors on his neighbors. Once perfected, he took it to Starbucks. They eventually hired him and used the technology to expand their product line to coffee ice cream and bottled beverages.
Employ higher levels of reasoning.
Bloom's Taxonomy (most refer to the "revised" taxonomy) distinguishes different types of thinking, suggesting ways for us to move past rote memory. Unfortunately, many students seldom seem to move past merely identifying and memorizing the important parts (what might be on the test) of texts and lectures.
Bloom's Revised Taxonomy (image credit: Vanderbilt University Center for Teaching, cft.vanderbilt.edu)
Yet, to succeed in real life, we must go further than recognition or rote memorization (Level 1). We need to develop the skills of comprehending (Level 2), applying (Level 3), analyzing (Level 4), synthesizing (Level 5), and evaluating (Level 6). Search "Bloom's Taxonomy" in Google and you'll find many lists of specific characteristics of each level of thinking. Referring to such lists when working through an issue can suggest new ways to approach it.
For example, in our discussion of Richard Dawkins' argument, I first stated it (Level 1) and several times put it in my own words to try to clarify it (Level 2). We skipped application but analyzed it (Level 4) by putting it in a line of argument and syllogism, so that we could identify and examine the premises. We did a bit of synthesis (Level 5) when we brought in outside ideas of how theists conceive of the eternal existence of God, and how other thinkers have responded to the argument. Finally, evaluation (Level 6) came to play when we noted that there seems to be an element of smoke and mirrors involved in the fallacy of bifurcation.
So if you're evaluating an argument or a proposal, consider running it through Bloom's Taxonomy to expand your ways of looking at the issue. Note how several levels involve creativity.
Logic - Recognizing Fallacies
Read not to contradict and confute; nor to believe and take for granted; nor to find talk and discourse, but to weigh and consider. - Francis Bacon
Brilliant people believe nonsense [because] they either fail to recognize fallacies or misapply the ones they know.
Warning
Learning fallacies can be fatal to your argumentation and detrimental to your relationships. For these reasons, I teach logical fallacies with a great deal of hesitation. It's a bit like selling firearms to a person with no training in how to use them. I'd hate to be known as one who arms Internet trolls*.
Troll - A participant in social media who delights in haughtily slamming other people's positions before fully understanding either their position or the context of the discussion.
So before I present a large list of fallacies, I'll acquaint you with a particularly pernicious type of fallacious reasoning that's running rampant on the Internet, but which is strangely absent from lists of fallacies. I call it "The Fallacy Fallacy."
The Fallacy Fallacy: Debunking Debunking
I often read comments on blog posts, articles, or Facebook discussions that accuse the writer of committing a specific logical fallacy and thus declare the argument thoroughly debunked, typically with an air of arrogant finality. While the debunker may feel quite smug, intelligent participants consider him quite sophomoric*. In reality, he's typically failed to even remotely understand the argument, much less apply the fallacy in a way that's relevant to the discussion.
Sophomoric - A statement that is immature and poorly informed but is spoken with overconfidence and conceit. The word is a composite of two Greek words meaning "wise" and "fool."
Surely this fallacy deserves a proper name and should be listed with other fallacies. Thus I'll define "The Fallacy Fallacy" as "improperly connecting a fallacy with an argument, so that the argument is errantly presumed to be debunked."
Don't be a troll. Here are a few ways people misapply fallacies, thus committing "The Fallacy Fallacy":
They misunderstand the fallacy.
"YOU'RE ALWAYS ARGUING WITH JAMIE, WHICH IS OBVIOUSLY AD HOMINEM." (Trolls delight in using all caps, confusing louder with smarter.) If the person was actually arguing against Jamie's arguments, rather than putting Jamie down as a person, then the arguments weren't ad hominem at all.
They fail to appreciate nuance. (They understand the fallacy, but apply it errantly.)
Someone quotes Albert Einstein to bolster his argument. "THAT'S AN APPEAL TO AUTHORITY!" shouts the troll. But citing authorities isn't always fallacious. If a person cites Einstein concerning a question of relativity theory, then Einstein is a legitimate authority, and quoting him can be a legitimate part of an argument, although it's typically not a slam dunk in itself. While factual claims should ultimately be argued on the basis of the evidence, citing authorities can often help to substantiate that evidence.
They assume a thorough debunking when there's typically more to the argument.
While trolls are celebrating their "brilliant" comments with a victory dance and a handful of Skittles, their opponents are often typing a clarification that makes the trolls' comments irrelevant. We simply must take the time to thoroughly understand the arguments we're evaluating.
Making Arguments More Fruitful
For those who sincerely want to learn from one another by hashing out issues, consider this: Trolls "flame" opponents by either calling them morons or presenting their arguments dogmatically, as if they have crushed their opponents. If you're concerned about the truth, seek more to understand than to demonstrate your brilliance. To accomplish this, suggest rather than slam; express tentativeness rather than dogmatic finality; ask questions rather than accuse.
Does it in any way weaken a counterargument to word it in a cautious, humble manner, such as: "At first glance your argument appears to be an unwarranted appeal to authority. Are you really saying that your position is correct solely because Einstein believes it as well?"
In this way, the opponent is more likely to respond in a reasonable manner and you save face in case you took the comment out of context or otherwise misunderstood it.
Benjamin Franklin on Fruitful Argumentation
Franklin was one of the most influential people in American history. He learned a lesson early in life which he considered of such significance that he discussed it at some length in his autobiography. He describes learning Socratic argumentation, which he delighted to use in humiliating his opponents. (As an annoying ass during this phase, which lasted a few years, he was a predecessor to the modern-day Internet troll.)
But over time, he realized that this method failed to either persuade others or to help him learn from them. Rather, it disgusted people. So he changed his method of argumentation. In Franklin's own words, he discovered the value of
"never using, when I advanced anything that may possibly be disputed, the words certainly, undoubtedly, or any other that give the air of positiveness [meaning "dogmatism"] to an opinion; but rather say, I conceive or apprehend a thing to be so and so; it appears to me, or I should think it so or so, for such and such reasons; or I imagine it to be so; or it is so, if I am not mistaken. This habit, I believe, has been of great advantage to me when I have had occasion to inculcate my opinions, and persuade men into measures that I have been from time to time engaged in promoting."
As a result, Franklin became a skilled negotiator and persuader, allowing him to help start America's library system, organize firefighters, run a successful printing business, improve our postal service, negotiate with the French to aid us in the Revolutionary War, and assist in finalizing and adopting the Declaration of Independence, just to name a few of an astonishing array of accomplishments.
Some Helpful Ways to Organize Fallacies
The plethora of known fallacies can be quite unwieldy, so let's first of all look at some helpful ways of classifying them. In this way, when you sense an argument is invalid but can't remember the name of the specific fallacy, at least you might be able to identify the category in order to better evaluate or research it.
(Example: "That sounds like a fallacy of definition.") Although no single categorization scheme has become standard, you'll find some of the categories (such as "formal" and "informal") used widely.
Aristotle
Aristotle was perhaps the first to categorize logical fallacies in his De Sophisticis Elenchis (Sophistical Refutations). He lists 13 fallacies under two categories: Verbal (those depending on language) and Material (those not depending on language). In modern times, those building on Aristotle's two divisions often add a third: Logical or Formal—fallacies that violate the formal rules of the syllogism.
Philosopher J. L. Mackie
Mackie divided fallacies into:
Fallacies in a Strict Sense
Invalid forms of deductive reasoning; the conclusion doesn't logically follow from the premises.
Formal Fallacies - The argument is invalid because of its form. Example: affirming the consequent. If there are too many cooks, there's chaos in the kitchen. There's chaos in the kitchen; therefore there are too many cooks. (If p then q; q; therefore p.) The short sketch just after these two definitions shows why this form fails.
Informal Fallacies - The argument fails for reasons other than its form. (Example: using vague or ambiguous terms.)
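To see concretely why affirming the consequent is invalid, here is a minimal sketch (my own illustration, not from the original text) that brute-forces the truth table for "if p then q; q; therefore p" and looks for a case where both premises are true but the conclusion is false:

```python
# Illustration only (not from the chapter): check every truth-value
# assignment for "If p then q. q. Therefore p."
from itertools import product

for p, q in product([True, False], repeat=2):
    premises_hold = (not p or q) and q   # "if p then q" is true, and q is true
    if premises_hold and not p:          # premises true, conclusion (p) false
        print(f"Counterexample: p={p}, q={q}")

# Output: Counterexample: p=False, q=True -- the kitchen can be chaotic (q)
# for some reason other than too many cooks (p), so the form is invalid.
```

One counterexample is enough: a valid form must guarantee the conclusion whenever the premises hold.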
Fallacies in Non-Deductive Reasoning and in Observation
Errors in inductively reasoning from evidence to a conclusion or hypothesis.
Induction and Confirmation - Example: post hoc ergo propter hoc - the fact that event "b" followed event "a" doesn't by itself prove that event "a" caused event "b".
Analogy - A weak analogy, one that has few or trivial points of resemblance, may have no evidential value at all.
Statistics - Example: If students from City High School outperform students from County High School on standardized tests, this doesn't necessarily imply that City High School has better teachers. Perhaps administrators skew the scores, or one district has more high-risk students. Or perhaps one school, like many private schools, is allowed to reject students who require extra academic support while the other cannot. Such a difference in student-body makeup may push test scores higher at one school even though its academic offerings, instructors, and students are no better.
Probability - Example: Although the probability of flipping a fair coin five times and getting heads every time is low, that doesn't mean that after getting heads four times in a row you're unlikely to get heads on the next flip. Each flip is independent, so the odds are still 50/50 (see the short calculation after this group of examples).
Observation - Example: Often what we observe is skewed by what we want or expect to observe. If you have ever thought you were about to take a sip of Coca Cola but instead got a sip of Dr. Pepper or sweet tea, you know it tasted terrible (or very strange) because of your expectations. Expectations can skew observation or experience.
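To make the arithmetic behind that coin-flip example concrete (this worked calculation is my own addition, not the chapter's):

$$P(\text{five heads in a row}) = \left(\tfrac{1}{2}\right)^{5} = \tfrac{1}{32} \approx 0.03, \qquad P(\text{heads on flip 5} \mid \text{heads on flips 1 through 4}) = \tfrac{1}{2}.$$

The 1/32 figure describes the whole five-flip sequence before any flips occur; once four heads have already happened, only the final flip remains in question, and its probability is unchanged.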
Fallacies in Discourse
The argument fails because of some reason other than invalid deductive reasoning or arguing from evidence.
Inconsistency - You can't have it both ways.
"Petitio Principii" - Including your conclusion in your premises (aka begging the question or arguing in a circle).
A Priori Fallacies - Bringing to the argument unfounded preconceptions that influence the conclusion.
"Ignoratio Elenchi" - Missing the point: An argument concerning something that was never meant, in the context of the argument, to be proven.
Fallacies of Interrogation - Demanding a narrow and specific answer to questions that demand broader answers. Example: "Answer yes or no: Have you stopped beating your wife?"
Fallacies in Explanation and Definition - Example: using the same word in two different ways in an argument, thus invalidating the argument.
Historian David Hackett Fischer
In Fischer's instructive and delightful book, Historians' Fallacies, he discusses 112 fallacies under 11 categories. Note that these apply far beyond professional historians. Whenever we blog about an event, summarize our family vacation on Facebook, or write that first high school paper on "What I Did for My Summer Vacation," we're telling history, and risk committing these fallacies. Here are Fischer's categories:
Question-Framing - Historians begin their research by asking one or more questions. If these questions are vague or ill-conceived, they will yield the wrong answers. Example: asking a complex question and expecting a simple answer.
Factual Verification - Failure to rigorously employ the best methods for verifying historical data.
Factual Significance - Historians can't report every fact from a period of history; they must be selective. If they select based on the wrong criteria, their conclusions will likely be wrong as well.
Generalization - Improper statistical reasoning from historical data. Example: Drawing a general conclusion from an insufficient sampling of data.
Narration - Historians gather threads of historical data and weave them into stories. Yet, "nothing but the facts" is often at odds with great storytelling, which assigns feelings and even time sequences that may not be warranted by the historical data.
Causation - Example: The reductive fallacy reduces a complex historical cause to a simplistic one.
Motivation - Historians often assign motives without sufficient evidence; for example, assuming that a Roman Emperor thinks, reacts, and is motivated by the same things that motivate a middle-aged academic historian at Berkeley.
Composition - Historians tend to study and write about groups, or individuals as part of groups, whether social, religious, national, ideological, or economic groups, cliques, or castes. One fallacy of composition is assuming that the character of one member is shared by the rest of the group.
False Analogy - Example: People often reason from a partial analogy to declare there's an exact correspondence; but in reality, analogies are seldom exactly parallel.
Semantical Distortion - Problems with unclear or imprecise prose. For example, the failure to clarify definitions of terms.
Substantive Distraction - The argument shifts the reader's attention to issues that are irrelevant to the discussion.
While categorization schemes are helpful for getting an overview of types of fallacies, none seem to be without their downsides. For example, some fallacies seem to fit snugly into multiple categories.
A Great Big List of Fallacies
In my first Appendix, I list a great number of fallacies. I don't recommend trying to memorize them. Rather, familiarize yourself with each of them so that in the future, when you run across an argument that doesn't sound quite right, you can return to the list to search for a fallacy that might apply. If you're reading this for a class, your teacher or professor may single out certain fallacies that they deem the most important or the most frequently abused in literature and the media.
Conclusion
There are many ways to go wrong in our arguments. Some are a bit technical. But by familiarizing ourselves with fallacies, learning to apply them correctly, and discussing disagreements in a civil and humble manner, we can learn from each other and mutually come closer to the truth.
Attribution
This chapter is revised from the first edition of Open Technical Communication, Chapter 5.13: “Logic – Common Fallacies” by Steve and Cherie Miller, Chapter 5.14: “Logic – How to Do It Wrong” by Steve and Cherie Miller, and Chapter 5.15: “Logic – Recognizing Fallacies” by Steve and Cherie Miller, which are all openly available under a Creative Commons Attribution license.
Chapters 5.13, 5.14, and 5.15 of the first edition of Open TC were revised with permission from Steve and Cherie Miller’s Why Brilliant People Believe Nonsense, Chapter 10: “They Contradict, Leave out Valid Options, and Knock Down Straw Men,” Chapter 11: “They Fall for Other Common Fallacies,” and Chapter 12: “They Either Fail to Recognize Fallacies, or Misapply the Ones They Know” by Steve and Cherie Miller.
AI Assistance Notice
Some parts of this chapter were brainstormed, drafted, and/or revised in conversation with ChatGPT 4o and Google Gemini 2.5 Flash. All AI-generated content was reviewed and revised as needed by a human author.