This is one I’ve had saved as a draft in my “to do” list since June 7th. Conveniently, I just finished Psych 304: Effective Thinking this semester. The motivation and content of this essay remain essentially the same, but my hope is that my writing will now be clearer.
Matt Dillahunty often says, “I want to believe as many true things and as few false things as possible.”
How often do we actually try to falsify our beliefs? How often do we check to see if the things we believe are actually true? Do we have the strength of character to admit when we’re wrong? Are we intellectually honest enough to change our beliefs when they are proven false by empirical facts?
“A trial seeks to ascertain the truth of the matters in issue between the parties…”. Key among the steps in a trial, the steps in ascertaining the truth, are the direct examination and the cross-examination. Direct examination is the questioning of a witness by the lawyer who called them to the stand; in other words, it is done to bolster one’s case. Carrying this analogy over to our thought processes, direct examination is akin to confirmation bias: choosing to see only the facts and presentations that already agree with our positions. The problem is that this alone would lead to a miscarriage of justice; the truth cannot be discovered this way. In order to test the veracity of the testimony, the opposing legal team is allowed to question the witness through cross-examination. According to the American Bar Association, “the purpose of cross-examination is to test the credibility of statements made during direct examination” and “the attorney might try to … impeach the witness or the evidence”.
If we really want to know that our beliefs are true, we should put them on trial and fearlessly so. If our beliefs are true, then they have nothing to fear from critical examination. Only those with something to lose should fear reasonable inquiry.
Facione & Gittens outline four basic tests every argument must pass if its claim is to be accepted as true or very probably true:
- Truthfulness of the Premises
- Logical Strength
- Relevance
- Non-Circularity
An argument must pass all four of these tests to be accepted. Untrue premises defeat the very foundation of an argument. The move from a figurative A to B must be logically sound, a demonstrable “if, then” connection. An irrelevant argument does nothing to build support for a claim, even if the argument itself is true; quoting Shakespeare in a discussion about physics, for example. Finally, an argument must not be circular: “it makes no sense to give a reason as the basis for one’s claim and then to use that claim as the basis for one’s reason” (Facione & Gittens, 2016, p. 142).
A critical thinker must be more concerned with what is true than with what they want to be true. In that vein, here are some common mistakes, flaws in reasoning, and fallacies to be aware of, and to test one’s own arguments and beliefs against.
1. Hypothesis Testing
“No amount of experimentation can ever prove me right; a single experiment can prove me wrong.” -attributed to Albert Einstein
In the strictest sense, “proof” that something is true is limited to mathematics and formal logic. In every other domain, what we actually do is seek to disprove a null hypothesis. All experiments, and even the legal analogy above, work on the premise of two mutually exclusive hypotheses: we disprove the null in order to accept that the alternative hypothesis is probably true. A guilty verdict is the result of disproving innocence. Accepting a significant result is the result of disproving the null. Logically, we don’t prove the truth; rather, we disprove the falsehoods around it and arrive at the truth by process of elimination.
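The logic of disproving the null can be sketched with a hypothetical coin-flip example (the counts here are invented purely for illustration): assume the null hypothesis that the coin is fair, observe 62 heads in 100 flips, and compute how probable data at least that extreme would be *if the null were true*.

```python
from math import comb

def binomial_tail_p(n: int, k: int) -> float:
    """One-sided p-value: probability of k or more heads in n fair flips.

    This is the chance of seeing data at least this extreme
    under the null hypothesis of a fair coin (p = 0.5).
    """
    total = sum(comb(n, i) for i in range(k, n + 1))
    return total / 2 ** n

# Null hypothesis: the coin is fair.
# Alternative hypothesis: the coin is biased toward heads.
# Hypothetical observation: 62 heads in 100 flips.
p_value = binomial_tail_p(100, 62)

# We never "prove" the coin is biased; we only show the observed data
# would be improbable under the null, and reject the null accordingly.
print(f"p = {p_value:.4f}")
print("reject the null" if p_value < 0.05 else "fail to reject the null")
```

Note that a small p-value does not prove the alternative directly; it only makes the null untenable, exactly the process-of-elimination logic described above.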
Correctly identifying falsifiable null and alternative hypotheses can be difficult. It may even require a series of hypothesis tests, because initial results can be broadly consistent with multiple alternative hypotheses. But something true cannot be disproven; facts are necessarily true by definition. Therefore, we should “question with boldness”, because the truth not only can withstand it, it cannot otherwise be made to shine.
2. Motivated Reasoning
Nonscientists often use an erroneous definition of science which claims it must be “observable and repeatable”. By that definition, crossing the street would qualify as science! Actual scientists describe the scientific method as testing predictions with falsifiable hypotheses, as discussed above. The false definition of science is the result of “motivated reasoning”, which is not actually reasoning at all, but rationalizing.
The ideologue does not engage in critical thinking; it is anathema to them. Rather than follow the evidence to a conclusion (empirical or inductive reasoning), the ideologue begins with the conclusion and selectively chooses only the evidence that supports their point of view (deductive). When confronted with contradictory facts, the ideologue will rationalize and try to bend reality to fit their preconceptions. This is intellectually dishonest and cannot lead to the truth. Because of this, the ideologue is a danger to the search for truth. Look at any closed society and you will find attempts to control information. There is nothing an ideologue fears more than reason and contradictory facts.
On page 276, Facione and Gittens write, “Ideological belief structures are socially normative – those who disagree are seen as outsiders, ignorant, mistaken, abnormal, or dangerous”. Moreover, they note on page 278 that, “…John Locke described the habit of talking and listening only to those who agreed with the beliefs and opinions one already holds as a basic type of flawed thinking”.
Ideological arguments rarely, if ever, stand up to the four basic tests of a sound argument. They are often not testable or falsifiable; it becomes a feedback loop of confirmation bias. It is also an extremely comfortable trap to fall into, because we like being told we’re correct and we like people who are similar to us. It’s something we all have to monitor ourselves for carefully if we’re going to be honest. We must be more interested in what is true than in what we want to be true. We have to stay open to cross-examining our own beliefs.
3. Logical Fallacies
Here are the logical fallacies I see most frequently when people debate on social media. All of these fail the test of relevance.
- The Straw Man. This is the practice of erecting a false representation of an opponent’s argument and demolishing it, rather than what they actually say. It’s dishonest in the representation of one’s opponent, often to the point of caricature. The effect is arguing with a defenseless “straw man” that cannot fight back, but which is also not the real opponent. It can look good and maybe feel clever, but it also runs the risk of making the person doing it look foolish if they’re discovered. Being “hit back” by an actual opponent whose actual arguments you’ve failed to dismantle can be embarrassing.
- Shifting the Burden of Proof. Also called an appeal to ignorance. This one is very simple and continues the trial analogy above. Much like a person should be thought innocent until proven guilty, an idea or a claim should not be accepted until there is supporting evidence. Lack of evidence against a claim cannot be considered evidence for it. There is no shame in saying “I don’t know”. Suspension of judgment is far wiser than accepting a claim that could be false. It is unreasonable to accuse somebody of theft and demand proof of their innocence; rather, the accusation itself must be proven. Otherwise any person who stayed home playing video games with no alibi could unjustly be called guilty! The onus always lies with the claimant. This is indisputable. An old Latin proverb says, “Quod gratis asseritur, gratis negatur”: that which is freely asserted can be freely dismissed.
- Misuse of Authority. The false assumption that if a notable person makes a claim, it must be true: citing a physicist on matters of biology, a religious leader on matters of science, or a politician on business affairs, and vice versa for all three. There are twelve criteria an expert must meet to be considered a trustworthy source, and they must meet them all [3, p.123]. After all, you would never allow anyone to testify in court on your behalf who wasn’t, among other things:
- Learned in topic X
- Experienced in topic X
- Speaking about topic X
- Up-to-date about topic X
- Capable of explaining the basis of their claim about topic X
- Free of conflicts of interest
- Acting in accord with our interests
- Informed about the specifics of the case at hand
- Mentally stable.
If it really is “better to remain silent and be thought a fool than to speak and remove all doubt”, then prudent behavior in our trial analogy is exercising the right to remain silent rather than incriminating ourselves by opining beyond the scope of our knowledge. It’s okay to not know everything! There’s no shame in it. Better to suspend judgment on a topic than to proclaim something false.
Putting our beliefs on trial and subjecting them to cross-examination is really the only way to know if they’re true. Do honest research. Read opposing viewpoints and give them fair-minded consideration. Analyze both sides with reason rather than emotion and see which one withstands the scrutiny. Be willing to change your mind about something if you’re wrong. If we limit ourselves to sources of confirmation bias and only deal with straw men of opposing views, we’re not being honest. That’s choosing to live in a construct of reality. That’s choosing to believe a delusion instead of the truth.
I choose to embrace reality. It’s a beautiful and wonderful thing.
- Think Critically, 3rd ed. by Peter Facione and Carol Ann Gittens
- Thomas Jefferson, in a letter to Peter Carr. 10 August 1787.
- The Logic of Scientific Discovery, by Sir Karl Popper, PhD. 1934.
- Attributed to both Twain and Lincoln, most likely sourced from Proverbs 17:28