My last essay verged on being pedestrian, if you’ll pardon the pun, and wasn’t really congruent with my new vision for this blog. I’ll still talk about gadgets and design from time to time because they’re passions of mine, but I couldn’t help feeling like I’d immediately regressed after declaring my intent to evolve my writing. This post is an attempt to get back on track.
“Dual-process theories have dominated social and cognitive psychology since the 1970’s (Wason and Evans, 1975)” (Varga & Hamburger, 2014). That may be an unspeakably ugly citation, but it’s also the fastest way to establish rough origins for the psychological study of modal thinking. Dual-process theory came to dominate psychology on the strength of overwhelming experimental evidence from researchers like Daniel Kahneman.
Kahneman was born in Tel Aviv, and his psychology career began in the Israel Defense Forces before he earned his PhD at Berkeley in 1961. He has been published more than 20 times in peer-reviewed journals (peer review is the vetting process in which other researchers try to falsify the work), and in 2002 he won the Nobel Memorial Prize in Economic Sciences for his work on the psychology behind economic theory. In 2007 the American Psychological Association gave him its Lifetime Achievement Award. When it comes to heuristics and dual-process theory, Dr. Kahneman’s research is about as solid a foundation as we could hope for to establish the reality of dual-process thinking.
So what is it?
Basically, humans engage in two types of thinking. Dr. Kahneman’s terminology has become the de facto way of discussing it in psychology. “‘System 1,’ he explained, is the brain’s fast, automatic, intuitive approach. ‘System 2,’ he said, refers to the mind’s slower, analytical mode, where reason dominates” (Walsh, 2014).
If we’re aware of this, we can analyze our thought processes and more effectively self-diagnose whether we’re being reasonable. Fears of particular things (e.g., spiders, or our child crossing a busy street) are generally rooted in Type 1, emotional reactions. Type 2 thinking may be able to allay those fears when we remember that we’re much larger than a spider and can roll up some newspaper to deal with it, or that our child is now old enough to navigate the crossing safely. Are we acting reflexively, out of fear, or have we taken the time to look at the situation objectively? Heuristics have a way of clouding our reason, blinding us to certain facts, and rarely give us the whole picture.
This isn’t to say that heuristics are always wrong or that critical thinking will always lead us to a different, or even reasonable, conclusion. Gut instincts can be right, and motivated reasoning can produce rationalizations for false conclusions. However, by being both aware of our potential faults in logic and willing to address them, we increase our chances of arriving at the truth of a matter.
So what is a heuristic? It’s a simple rule or mental shortcut that we use to get through day-to-day life without overanalyzing everything. Can you imagine having to stop and critically analyze every choice throughout the day? We could never get anything done! A simple decision between two snacks or which shirt to wear would become a torturous debate. When a poor choice carries a higher cost, like buying a car or picking a shirt for a job interview, we rightly eschew Type 1 thinking. Likewise, in fields like science and mathematics, where answers are definitely right or wrong, we cannot simply trust a gut instinct that 3 times 9 is 54, or that the earth is flat because we can’t see its curvature. Leaping to conclusions is often foolish. Heuristics have their time and place, and we all use them.
This is far from a complete list, but here are some common heuristics we often use:
- Anchoring heuristic. This is a cognitive bias in which an initial value, the anchor, becomes the reference point we adjust our estimates around, even when it is completely arbitrary. Imagine a jar full of pennies and a guessing game for how many are in the container. If the proctor of the game plants a base number in the guesser’s mind by asking whether there are more or fewer than X, the guesser will anchor their answer as an adjustment around X. Two wildly disparate anchors such as 900 or 3,000 will yield guesses closer to those numbers than guesses made independently.
- Authority heuristic. Also called the trust heuristic, this is the tendency to believe a perceived authority figure. We do this because most of the time, it works! E.g., parents warning a child not to touch something hot, or a teacher showing us a math problem. The problem is that it can also lead to the appeal-to-authority fallacy I’ve written about before. E.g., assuming a Senator speaks knowledgeably on science simply because they sit on the relevant committee. Always validate your sources!
- Availability heuristic. This is a mental shortcut that biases us toward the most recent or most easily recalled examples. The recalled example may have nothing to do with the reality of the current situation, or it may match it exactly; either way, we’ll convince ourselves that our memory applies. For immediate issues, it generally works in favor of our safety and welfare: “Don’t touch that hot pan!” For larger, more complicated issues, it’s important to remember that we’re not really thinking very hard or analyzing objectively, and should probably expend more effort on problem solving.
- Illusion of control heuristic. This is the tendency of people to overestimate their ability to control or influence events. It “fosters a sense of power rather than powerlessness” (Mathieu, 2012). It is thought to influence gambling behavior and belief in the paranormal (Vyse, 1997, pp. 129–130). It’s generally considered to be a positive illusion, though I would imagine that might change in hindsight if all your money is left at the casino.
- Representativeness heuristic. Similar to availability in that both rely on comparison, this mental shortcut judges things by their resemblance to a stereotype. It’s assuming, for example, that a Birkenstocks-wearing Dave Matthews Band listener must also smoke marijuana. They might not smoke pot and could simply have terrible fashion sense.
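The anchoring effect described above is often modeled as insufficient adjustment: we start at the anchor and move toward our own estimate, but not far enough. Here’s a minimal sketch of that idea in Python; the `anchored_guess` function, the 0.6 adjustment weight, and the 1,800-penny independent estimate are all illustrative assumptions of mine, not figures from the research.

```python
def anchored_guess(anchor, independent_estimate, adjustment=0.6):
    """Toy model of anchoring: start at the anchor and adjust
    only part of the way toward an independent estimate.
    The 0.6 weight is an illustrative assumption."""
    return anchor + adjustment * (independent_estimate - anchor)

independent = 1800  # what a guesser might say with no anchor (assumed)

low_anchor_guess = anchored_guess(900, independent)    # pulled down toward 900
high_anchor_guess = anchored_guess(3000, independent)  # pulled up toward 3,000

print(low_anchor_guess, high_anchor_guess)  # 1440.0 2280.0
```

Both guesses land between the anchor and the independent estimate, which is the signature of the effect: the same jar of pennies draws a lower guess after a 900 anchor than after a 3,000 anchor.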
These are just some of the heuristics I’m able to recall without my psych textbooks handy, but there are many more. We use them all the time to navigate our daily lives while expending relatively little mental energy. Heuristics are a fantastically useful framework through which to view a circumstance and make snap judgments, and these mental shortcuts allow us to be dramatically more productive in the real world. But they are not examples of critical thinking. They are biases, and often emotionally rooted. They “generally get us where we need to go – and quickly – but at the cost of occasionally sending us off course” (Gilovich & Savitsky, 1996).
Type 2 thinking, or critical thinking, is the objective analysis and evaluation of an issue in order to form a judgment. Objective analysis requires us to view both sides of an issue and examine the arguments and evidence. It requires us to deal honestly in relevant facts, and be open to the possibility that we may be wrong. For larger, more complicated issues, critical thinking is the only way to reliably arrive at the truth.
This does not mean that critical thinkers are more intelligent or ethical than reflexive thinkers. We are all capable of and engage in both types of thinking. Both types of thinking are necessary in life. The key is recognizing which type of thought we’re applying to an issue, and determining if it is the most appropriate for the circumstance.
- Gilovich, T., & Savitsky, K. (1996). Like goes with like: The role of representativeness in erroneous and pseudo-scientific beliefs. Skeptical Inquirer, 20(2), 34–40. doi:10.1017/CBO9780511808098.036
- Mathieu, I. (2012, Aug 8). The illusion of control. Psychology Today. Retrieved from https://www.psychologytoday.com/blog/emotional-sobriety/201208/the-illusion-control
- Varga, A. L., & Hamburger, K. (2014). Beyond type 1 vs. type 2 processing: The tri-dimensional way. Frontiers in Psychology, 5.
- Vyse, S. A. (1997). Believing in magic: The psychology of superstition. Oxford University Press.
- Walsh, C. (2014, Feb 5). Layers of choice. Harvard Gazette. Retrieved from http://news.harvard.edu/gazette/story/2014/02/layers-of-choice/
- Wason, P. C., & Evans, J. St. B. T. (1975). Dual processes in reasoning? Cognition, 3, 141–154.