Our minds are wired to select and interpret evidence
supporting the hypothesis “I'm OK”. A variety of mechanisms (conscious,
unconscious, and social) direct our attention to ignore the bad and
highlight the good, increasing our hope and reducing our anxiety.
We work hard to retain the belief that “I'm OK” even when faced with significant
losses. Self-justification is deeply ingrained in each of us. Mental schemas make it easier for us to
perceive information that supports what we already know or believe.
Unfortunately, we often get it wrong.
Our thinking is the result of our own perception, judgment,
experience, and bias. Our brain distorts reality to increase our
self-esteem through self-justification. People readily perceive themselves as the origins of good effects and
only reluctantly as the origins of ill effects. We present a one-sided argument to ourselves.
Confirmation bias is the strong human tendency to dismiss or distort evidence
contrary to our beliefs and to readily seek out evidence that supports our views.
Humility reduces our need for
self-justification and allows us to admit to and learn from our mistakes. It can
help us overcome many of these distortions.
People suffering from depression often reverse
this bias and interpret evidence to support their fears that they are not worthy.
During times of stress, overload, or threat, we often resort to a
form of thinking, called primal thinking, that incorporates many of these
fallacies. For an accurate appraisal it is important to reassess the situation
using effortful, valid, thoughtful, and accurate analysis that properly allows for the
complexities we face. Employ critical thinking and work to understand what is.
Styles of Distorted Thinking
In addition to the logical fallacies
that can misrepresent or misuse evidence, here is a list, with short descriptions, of several common forms of distorted thinking.
Filtering (selectivity): This is a failure to consider all the evidence in a
balanced and objective assessment. We go where our attention is, and our
attention is inherently limited. Selectivity is a failure to consider a neutral, or
balanced, point of view. It can have two basic forms. The first is
considering only the negative details and magnifying them while filtering out
all the positive aspects of a situation. The second is taking the positive
details and magnifying them while filtering out all the negative aspects of a
situation. In any case, evidence that supports your bias is selected, favored, or
weighted more heavily than evidence contrary to your bias. Find the realistic balance between the optimistic and pessimistic
points of view. Seek out, carefully consider, and assimilate all the available evidence.
Overgeneralization: It is incorrect to arrive at a general conclusion
based on a single incident or piece of evidence. This is a common example of the
fallacy of basing a conclusion on unrepresentative
evidence. Consider a broad range of representative evidence before drawing a
conclusion. Consider systematic evidence, and dismiss anecdotal evidence.
The parable of the
blind men and the elephant
illustrates the dangers in generalizing from unrepresentative evidence. What
each person experienced was a true portion of the elephant, but taken
individually each sample was unrepresentative of the entire elephant. Each blind
man extended the evidence gathered only from his limited
point of view to incorrectly conclude he understood
the whole of the elephant. Each sample can be accurately interpreted only when all the samples are
integrated to create a representative whole.
Polarized Thinking (false choice, dichotomy, primal thinking,
false dilemma, black and white thinking): This is the fallacy of thinking that things are
either black or white, good or bad, all or nothing. This fallacy can lead to
rigid and harmful rules based on primal thinking when
it is efficient to compress complex information into simplistic categories for
rapid decision making during times of stress, conflict, or threat. Polarized
thinking can also lead to unhelpful forms of perfectionism. The reality often lies in
the sizeable middle ground between these extreme poles. Recognize and
reject the false dichotomy. The words “either / or” are a reliable signal
alerting us to a false dichotomy. Find other alternatives that provide a constructive
solution. Dialogue is a powerful tool for moving beyond
a false dichotomy. A clever Zen master teaches his students to reject a false dichotomy
and go beyond polarized thinking with the following challenge. He places a cup
of tea before the student, then says “If you drink that cup of tea, I will beat
you with a stick, and if you don't drink that cup of tea I will beat you with a stick.”
The student has to reject the false dichotomy, recognize options other
than the two presented, and create other alternatives, such as offering the tea
to the instructor, or asking his advice, to avoid punishment.
Many phenomena are intrinsically dual. Consider the well-known
Rubin vase / profile illusion. Do you see a
vase or two human profiles looking at each other eye to eye? An optical illusion—demonstrating
a surprising feature or limitation of our visual perception system—causes us to see either
the vase or the faces at any one time. This is determined by perceiving either the black
as the foreground and the white as the background, or vice versa, at any instant.
This perception easily flips as our attention shifts and
we see the other image. We cannot see both at once and we can voluntarily see
either one at a given time. What we see is an image that can be perceived as
either at any particular instant. Arguing for vase vs. face misses the point;
the image is intrinsically both. Focusing on the false dichotomy of face or vase
distracts us from understanding the intrinsic duality of face and vase. Quantum
physics elegantly describes how light is both wave and particle. Asking if
Barack Obama is black or white, if you are liberal or conservative, Republican
or Democrat, with us or against us, scientific or religious, can obscure a more nuanced reality.
Everyday language includes many subtle false dichotomies. Asking “do the ends justify the
means” focuses on a false choice between these ends and those
means. It dismisses the important possibilities of achieving important goals by
other, less destructive means. Asking “whose fault is this” encourages us to choose a
single person to blame. Justifying actions by saying “I had no choice” falsely
dismisses the many alternatives that were not imagined and not chosen. Asking if
a particular behavior results from “nature or nurture” distracts us from
recognizing that most behavior results from a combination of both. Concluding
“you get what you pay for” dismisses the possibility of market inefficiencies or breakthroughs in product
design, manufacturing techniques, or discovering new value and new types
of value in unusual places.
False dichotomies are harmful because they distract us from the many
alternatives that could provide creative solutions or help us constructively
resolve conflict. Consider the distinction between
the false dichotomy of “black or white” and the accurate dichotomy of “black or
non-black”. Non-black includes a vast range of colors spanning shades of gray,
the colors of the rainbow, and the infinite shades of colors in between. Yet all
of these rich and varied possibilities are dismissed when we accept the false
dichotomy of “black or white”. The red rose, green grass, blue sky, and
golden sunshine all disappear when we focus narrowly on “black or white” rather
than “black or non-black”.
False dichotomies confuse complements with opposites. The complement
of black is
non-black, which includes a wide range of colors. The opposite of black is anti-black,
which is the single color we call white.
Using the phrase “I think of this somewhat differently . . .” can create a
useful transition when you are confronted with a false dilemma or a question based
on false assumptions. It creates space for introducing an alternative viewpoint
and moving the conversation in a more constructive direction.
Mind Reading: You conclude, incorrectly and without considering other
alternatives or testing your assumptions, that you understand how another person
is thinking and what their reasons and motives are for taking a particular
action. This is an example of the Fundamental Attribution Error where you
incorrectly attribute an action or intent to an agent. One example of this is
drawing a negative conclusion in the absence of supporting information. Focusing
only on evidence that supports a negative position, while neglecting to consider
alternative positive explanations, is the fallacy of not considering
representative evidence. It is false to conclude that “he must hate me because he
didn't say 'hi' to me.” There are many plausible explanations for why he
neglected to say “hi”.
Personalization (Egocentric bias, self-reference): This is the fallacy
of incorrectly thinking that everything people say or do is a reaction to you.
It is an egocentric viewpoint where you attribute personal meaning to everything
that happens. Face it, you are not really that important nor
influential. This point-of-view often causes the predator to view himself as the
true victim; their cause is just and is not to be thwarted. It also often
results in a set of self-centered rules.
Attribution Errors: It is a fallacy to
believe you can correctly know a person's intent for behaving as they do. Their
actions may or may not be deliberate. The person may not even be aware of what
they are doing. Their actions may or may not be directed at you. Their actions
may have unintended consequences or may result from an accident or chance. We
judge others based on behavior and we judge ourselves based on intent. It is
difficult to determine cause when only effect can be observed.
This error is so common and so misleading it has been named the
Fundamental Attribution Error (FAE).
Intentional Stance: A class of attribution errors based on the
belief that outcomes only result from an agent's intent, and that bad things are the result of intentional evil. One example is
attributing natural disasters such as drought, floods, and hurricanes to the
revenge of supernatural forces. Personal examples, such as attributing the
difficulties faced by the Nazis to the “diabolical Jew”, quickly provide a basis
for distrust and hate. Intent cannot be reliably inferred from observed behavior alone.
Pattern Discernment: We may think we see a
pattern that isn't there;
the outcomes are simply the result of random events.
Or we can think we recognize a pattern that is different from what we actually
see. We may also fail to recognize a pattern that is present.
Catastrophizing: You anticipate an unreasonable disaster based on a
small problem. Every scrap of bad news turns into an inevitable tragedy. It is
the error of using a personal, pervasive, and permanent
explanatory style despite contrary evidence. This
is another example of the more general fallacy of basing a conclusion on
unrepresentative evidence. Consider a broader range of representative
evidence before drawing a conclusion. Strike a realistic balance between
optimistic and pessimistic views. Skip the histrionics.
Control fallacies: It is a fallacy to mistake what you
can change for what you cannot change. Do not
underestimate the degree of control you have for your own actions. You are not
helpless, powerless, nor perpetually a victim. Examine the alternatives you have
for taking action and taking responsibility for your life. Also do not overestimate
your responsibility for the happiness and pain of others. Be realistic in
evaluating the power and influence you do and do not
have over yourself and others.
Fallacy of Fairness: Your sense of justice may not be shared widely
and is certainly not shared universally. The world may not be fair, or at least
it may not always work according to what you feel is fair. Examine your own
sense of justice and continue to reconcile it with what happens in the world.
The principle of empathy is a good basis for justice.
Anger is the emotion that urges us to act on our sense
of justice. Choose your battles carefully to make the most constructive use of
your limited time, energy, and other resources. Don't harbor resentment at every
injustice you perceive, and examine your feelings of self-righteousness. Gather
evidence to make an informed decision.
Outward Causes: We are biased to think that we ourselves are basically
all right, and that our difficulties are therefore caused by outward causes. This
leads us quickly to blame others for our difficulties. It also opens the door to
hating others because we blame them for our difficulties.
This fallacy describes an inappropriate external
locus of control.
Blaming: Do not be quick to hold others responsible for your pain. Do
not blame yourself unjustifiably for the failures of others. Consider a broad
range of representative evidence, including the likelihood that there are many
causes contributing to each outcome, before drawing a conclusion. See
disproportionate responsibility, below.
Disproportionate Responsibility: (Single causes) Generally many causes contribute to
each result, outcome, event, or incident. For example, the causes contributing
to an automobile accident can include: design of the automobile, manufacture of
the automobile, maintenance of the automobile, design of the road system,
weather conditions, driver training, driver preparation, driver attention,
choice of vehicle, choice of route, choice of time and schedule, passenger behavior, pedestrians, obstacles,
traffic signals, other cars and drivers on the road, and other factors. Be
objective when assessing blame or taking credit. Divide the responsibility for
the good or the bad result proportionally among the contributors, based
on how their actions or inactions affected the result. Perhaps you deserve some
of the credit or must take some of the blame, but it is unlikely you or they are solely
responsible. Don't make the mistake of polarized thinking when assessing
responsibility. Don't attribute undue blame to a single cause.
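The advice above can be made concrete with a small sketch. The factors and weights below are invented for illustration (they are assumptions, not data about any real accident); the point is only that responsibility divides proportionally across many contributors and always sums to the whole:

```python
# Hypothetical contribution weights for an accident; the specific
# factors and numbers are invented for illustration only.
contributions = {
    "driver attention": 5,
    "weather": 2,
    "road design": 2,
    "vehicle maintenance": 1,
}
total = sum(contributions.values())

# Divide responsibility in proportion to each contribution.
shares = {cause: weight / total for cause, weight in contributions.items()}
for cause, share in shares.items():
    print(f"{cause}: {share:.0%}")

# The shares always sum to 100%; no single cause carries all the blame.
assert abs(sum(shares.values()) - 1.0) < 1e-9
```

Here even the largest contributor carries only half the responsibility, which is exactly what polarized, single-cause blaming overlooks.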
Should (counterfactual thinking, imperatives): Don't get angry every
time someone does not act according to
your ideal. The word “should” is a plea to behave according to a particular (often implicit) set of
values and beliefs. Examine those beliefs, and decide
if they really do apply to the person or situation that is irritating you. What
is the evidence? What can you
change and what can't you change? It is unreasonable to expect that others
will act according to your ideal vision of their behavior or role, especially
when your preferences are unstated. See the fallacy of change, below.
Fallacy of Change: It is unrealistic to believe you can change
people's nature, personality, deeply ingrained habits, or strongly held beliefs.
Be realistic about what you can change and what you
cannot. Do not depend unrealistically on others for your own well being.
Ignorance: Choosing to ignore or
dismiss relevant information, choosing a narrow worldview; refusing to
inquire, examine, study, and learn; rejecting alternative viewpoints before
examining or considering them; ignoring or denying
evidence; choosing to stay unaware; and holding desperately to your limited beliefs are
all ways to choose ignorance over
wisdom and more carefully considered
evidence. Blind faith, forgetfulness, and lack of
introspection are also forms of ignorance. When coupled with your attachment to
an idea, belief, someone, or something,
ignorance can surface as pretension,
deception, shamelessness, lack of rigor, inconsideration and disrespect of
others, and distraction.
With nearly three million Wikipedia articles to study, millions of books to read,
more than six billion people to meet, and new discoveries being made every day,
no one can know it all.
We are all ignorant. In addition, we sometimes
choose to ignore readily apparent information that contradicts our beliefs.
Avoid this form of deliberate ignore-ance. Stay curious.
Emotional Reasoning: We decide with both our heart and our head.
Continue to improve your emotional competency and ensure a healthy and
constructive balance of both passion and reason. Identify and verify the
assumptions that are being made. Carefully consider the
evidence before deciding. Exercise impulse control while enjoying the
constructive passions of life.
Being Right (denial): Dogmatically holding onto an opinion or
defending an action can be a destructive result of stubborn
pride. Denial is a failure to acknowledge evidence. Even if you believe you are right, decide if you would rather be
right or be happy. Don't waste time pursuing the fallacy of change
described above. Examine your
sense of justice and the assumptions you are making. Gather
evidence to make an informed decision, but even if
you are right, it may not be a battle worth fighting. How is this working for you?
Cognitive Dissonance: The tension that arises when our
actions are inconsistent with our thoughts. A tense and uncomfortable contradiction
exists unless your actions support your thoughts and
beliefs. To close the gap and relieve
this tension humans often revise their thoughts to support their actions. People who
cannot stop smoking convince themselves that smoking is good. They highlight the
relaxation, autonomy, sophistication, weight control, and maturity symbolized by smoking. They
certainly don't emphasize the health risks, expense, and filth created by the
habit they cannot escape. Irrevocable bad decisions are similarly defended. People who bought the
wrong car, lost money in the stock market, went on a disappointing vacation, or got a bad haircut spontaneously
invent clever defenses for the actions they are now stuck with. What is
remarkable is how strongly we believe these self-justifying stories when we make them up ourselves.
Confabulation: Manufacturing a plausible story to account
for surprising events or behavior. People often unknowingly fill gaps in memory with fabrications
that they believe to be true. They confuse imagination with
memory, or they confuse true memories with false memories. Often people can’t seem to stop
themselves from making up explanations after the fact for whatever it was they
just did for unconscious reasons.
Optimism: Believing that all is good and everything will turn out fine
provides the important benefits of encouraging us to persist toward our goals and overcome obstacles. However, unchecked optimism can
easily detach us from the cold, harsh truths of reality. Examine the
evidence, think critically, allow for skepticism,
consider a variety of viewpoints, come to a balanced
conclusion, and act responsibly.
Heaven's Reward Fallacy: Don't expect every sacrifice you make to be
rewarded. Don't play the martyr. Sometimes life is fair, but too often it is
not. No one is coming to save you. You are responsible for your own life, well being, and happiness.
Exercise your autonomy and take action because you
want to, not because you believe you will mysteriously be rewarded.
Just World Theory: The mistaken belief that good things happen to good people
and bad things happen to bad people. This is sometimes used as an excuse to
blame the victim; “he got what he deserves.”
Asch Effect: People often change their opinions to agree with the
majority, despite the presence of clear contrary evidence. Experiments conducted
by Solomon Asch demonstrated the effects of group pressure on the modification
and distortion of individual judgment. Experimental subjects often modified their judgment
or estimate of an observation to conform with the majority opinion of a group.
In-group Bias: The tendency to attribute positive motives to in-group members
(especially yourself) and negative motives to out-group members (especially
those regarded as “the enemy”).
Global Labeling: This is the fallacy of overgeneralization, combined
with an unrepresentative stereotype. Suspend
judgment until you have an
opportunity to meet and understand a person as an individual. Do not generalize
one or two qualities into a negative judgment about a person or group. The
symbol is not the person.
Stereotypes: Human memory is organized into
schema which are clusters of knowledge or a general conceptual framework
that provides expectations about events, objects, people, and situations in
life. For example, if you are asked to describe a bird, you are likely to recall
some description (prototype) based on a blend of common specific bird species,
or you will recall a specific bird you are familiar with. This
attribute of memory leads us to rely on stereotypes.
These are simplified and standardized conceptions or images held in common by
members of a group. While stereotypes are an essential feature of human memory,
they can cause problems when the attributes associated with the group are
incorrectly extended to an individual. For example, a common stereotype of a
bird includes the ability to fly, however extending that stereotype to a penguin
leads to an incorrect conclusion.
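The penguin example can be written as a default that an individual overrides. This minimal Python sketch (the class names are just illustrations) shows why extending a group stereotype to every member fails:

```python
# A schema supplies default expectations about members of a category.
class Bird:
    can_fly = True  # the common "bird" stereotype

class Robin(Bird):
    pass  # the default happens to fit this individual

class Penguin(Bird):
    can_fly = False  # the individual overrides the stereotype

print(Robin().can_fly)    # True
print(Penguin().can_fly)  # False; applying the default here would be wrong
```

The stereotype is useful as a default, but a correct conclusion about any individual requires checking whether that individual overrides it.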
Magical Thinking: Believing that the laws of physics,
economics, or the laws of
cause and effect, don't apply to you. Believing in miracles or believing that
wishful thinking or sheer will alone can cause the outcome you are hoping for
are examples of magical thinking, as are appeals to paranormal or supernatural
phenomena. Don't let optimism exceed the bounds of reality. Hope is not a strategy.
Accepting Repetition as Evidence: Sometimes a person will
simply repeat their opinion when asked to provide evidence to justify an assertion or belief they
have expressed. They may repeat their position emphatically, engage in various
dominance displays, highlight various
power symbols, show impatience, or assert
their positional power as they simply repeat their opinion. A variation of this
fallacy is to claim “everyone knows . . . is true” as the evidence. But
repetition is not evidence, and it should not be accepted as evidence.
Assumptions, Opinions, Rumors become fact: It is easy for assumptions,
opinion, or rumors to be accepted as fact. This can happen if these ideas or
stories seem reasonable on the surface, or they support your views or interests,
if they advance some hoped-for outcome, or they are
expressed by someone in authority or someone you trust, if the stories are fun
to tell, or if others you know
also share these beliefs. The incorrect assumption, opinion, and rumor that
the earth is the center of the universe went unchallenged by millions of people
for perhaps thousands of years. Other rumors and unchallenged assumptions can be
even more destructive. When you hear a rumor, take the time to
identify and examine the source, and get independent confirmation of it
before passing it on. Don't accept myths, legends, and other speculations and
fiction as fact.
Reification: is a fallacy of ambiguity.
It is the error of treating an abstract construct as if it represents a concrete event or physical entity.
For example, a particular painting is a
specific, real, physical entity, but “art” is an abstract concept with
inherently arbitrary and fuzzy boundaries. Arguing that a particular painting is
or is not “art” explores the boundaries of the abstraction, but doesn't tell us
anything about the painting. Because our brain creates mental symbols
for abstractions as readily as it does for real objects, we are easily
fooled into believing that our particular concept of “art”, “truth”, “beauty”, “good”,
“democracy”, “justice”, or
“government” is real, well defined, widely shared, and correct. A related error
is to treat a non-living abstraction as if it has intent or judgment. Stating
that “The government has decided . . .” falsely attributes intent and
responsibility to an abstraction. Remember
that abstractions are nothing more than arbitrarily defined, ephemeral, imprecise
mental constructs. It may help
to think of abstractions like a rainbow. A rainbow is a beautiful emergent phenomenon
created in our minds as the result of seeing sunlight refracted through thousands of rain drops. But the rainbow is not real and everyone sees it slightly differently
depending on their particular viewpoint. Abstractions are as elusive as the legendary
pot of gold at the end of a rainbow. Don't get too attached to them. Operational
definitions can help reduce the ambiguity inherent in the abstractions we use.
Sunk Cost Fallacy: Because sunk costs are already spent and cannot be
recovered, it is irrational to consider the value of sunk costs when considering
alternative actions. Future actions cannot reverse past losses. Economics and business decision-making recognize sunk costs as the
costs that have already been incurred and which can never be recovered to any significant
degree. Economic theory proposes that a rational actor does not let sunk costs
influence a decision because past costs cannot be recovered in any case. This is
also called the bygones principle; let bygones be bygones. This recognizes that
you cannot change the past. The fallacy of sunk costs is to consider sunk costs
when making a decision. Sound business decisions are based on a forward-looking view,
ignoring sunk costs. Unfortunately human beings continue to value a past
investment of money, effort or some intangible quality (e.g., “credibility” or
“face”) independent of the investment's probability of paying future dividends.
The irrelevance of sunk costs is a well-known principle of business and
economics, but common behavior often falls into this fallacy of trying to undo the
past. For example, revenge is an attempt to recover
the sunk costs that represent some past and irrevocable harm or loss. We
falsely reason “I have too much invested to quit now” when it is rational to
look only at the future prospects of the activity. Arguing
that “we must continue to fight to honor those who have already died” is another
tragic but appealing fallacy of sunk costs.
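A small numeric sketch may help; the dollar amounts are invented for illustration. Because a sunk cost shifts every option by the same constant, it cannot change which option is best:

```python
# Invented figures: a project has already consumed $50,000 (sunk).
# Finishing costs another $30,000 and returns $40,000; abandoning
# costs and returns nothing further.
sunk = 50_000
future = {
    "finish": -30_000 + 40_000,  # net future cash flow: +10,000
    "abandon": 0,
}

# Rational, forward-looking choice: compare future cash flows only.
best = max(future, key=future.get)
print(best)  # finish

# Folding the sunk cost in shifts every option equally, so the
# ranking (and therefore the right decision) is unchanged.
with_sunk = {option: value - sunk for option, value in future.items()}
assert max(with_sunk, key=with_sunk.get) == best
```

The fallacy enters when we treat the $50,000 as something a future choice could somehow redeem; arithmetically, it cancels out of every comparison.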
Suggestive Context (perception set): Sometimes the context in which
information is presented is so familiar, or so compelling, that we quickly
perceive evidence or draw conclusions without sufficient checking. We then hold
firmly to these incorrect conclusions. Here are some examples to try yourself: A
bat and a ball cost $1.10 in total. The bat costs $1 more than the ball. How
much does the ball cost? Write down your answer. Double-check your answer.
(The correct answer is 5 cents; the intuitive answer of 10 cents would make
the bat and ball total $1.20.) For a second example, look at the following text:
FINISHED FILES ARE THE
RESULT OF YEARS OF SCIENTIFIC
STUDY COMBINED WITH THE
EXPERIENCE OF YEARS
How many times does the letter ‘F’ appear in the sentence above? Count each occurrence only
once. Write down your answer. (The correct answer is six; many people count
only three because they skip over the word “OF”.)
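Both puzzles can be checked mechanically. The sketch below redoes the bat-and-ball arithmetic and counts the letters directly, showing how far the quick, suggested answers drift from the careful ones:

```python
# Puzzle 1: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting gives 2 * ball + 1.00 = 1.10, so ball = 0.05.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
assert abs((bat + ball) - 1.10) < 1e-9
print(f"ball = ${ball:.2f}")  # $0.05, not the intuitive $0.10

# Puzzle 2: count every 'F', including the ones in "OF" that
# readers tend to skip.
sentence = ("FINISHED FILES ARE THE RESULT OF YEARS OF SCIENTIFIC "
            "STUDY COMBINED WITH THE EXPERIENCE OF YEARS")
print(sentence.count("F"))  # 6
```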
Mere Exposure Effect: People prefer objects they have been previously exposed
to, even if that exposure was so brief they do not recall it. Feelings
apparently come first. Affect—our
feeling about something—precedes and strongly influences our cognitive judgments
about what we like and don't like. Quite often a statement such as: “I decided
in favor of X” is no more than an after-the-fact justification—a
confabulation—for the vague feeling
that: “I liked X.” Most of the time information collected about alternatives
serves us less for making a decision than for justifying it afterward.
Advertisers exploit this effect when they get you to prefer their product simply
because you have seen it first or more often.
The “Seven Sins” Of Memory: Although we tend to think of our memories
as retaining a perfect record of our experiences, human memory distorts in these
seven ways, documented by psychologist Daniel Schacter:
- Transience: Memories fade over time.
- Absent-mindedness: Lapses of attention cause us to forget
- Blocking: When conflicting demands are placed on our memory, they
may interfere with each other and block recall. The word may make it to the tip
of your tongue but no further.
- Misattribution: Memories are retrieved, but they are associated
with the wrong time, place, or person.
- Suggestibility: Memory is distorted to agree with a suggested
result. See “suggestive context” above.
- Bias: Memory is distorted by our own attitudes, beliefs, emotions,
point-of-view, or experiences.
- Persistence: Sometimes unwanted memories cannot be put out of mind.
The Ego Defense Mechanisms: These distortions help us avoid accepting
evidence that challenges our self-image as a good and worthy person or that
challenge our strongly held stereotypes. Perhaps they act to reduce
anxiety, but because they are distortions, they are
not helpful in the longer term.
- Denial: arguing against an anxiety-provoking stimulus by stating it
doesn't exist. Refusing to perceive the more unpleasant aspects of external reality.
- Displacement: taking out impulses on a less threatening target. The
mind redirects emotion from a ‘dangerous’ object to a ‘safe’ object.
- Intellectualization: avoiding unacceptable emotions by focusing on
their intellectual aspects. Concentrating on the intellectual components of
the situation to distance yourself from the anxiety-provoking emotions
associated with these situations.
- Projection: moving unacceptable impulses in yourself onto someone
else. Attributing to others your own unacceptable or unwanted thoughts or emotions.
- Rationalization: supplying a logical or rational reason as opposed
to the real reason. Constructing a logical justification for a decision that
was originally arrived at through a different mental process.
- Reaction formation: taking the opposite belief because the true
belief causes anxiety.
- Regression: returning to a previous stage of development. Reverting
to an earlier stage of development in the face of unacceptable impulses.
- Repression: pulling thoughts into the unconscious and preventing
painful or dangerous thoughts from entering consciousness.
- Sublimation: acting out unacceptable impulses in a socially acceptable way.
- Humor: Refocusing attention on the somewhat comical side of the
situation to relieve negative tension; similar to comic relief.
- “You're entitled to your own opinions, but you're not entitled to your own
facts.” ~ Senator Daniel Patrick Moynihan
- “The human understanding when it has once adopted an opinion (either as
being the received opinion or as being agreeable to itself) draws all things
else to support and agree with itself.” ~ Francis Bacon
- “A lie can get halfway around the world before the truth can even get its
boots on.” ~ commonly attributed to Mark Twain
- “Ignorance is a choice.” ~ Neriah Lothamer
- “The greatest of faults, I should say, is to be conscious of none.” ~ Thomas Carlyle
- “‘I have done that,’ says my memory, ‘I cannot have done that,’ says my
pride, and remains inexorable. Eventually—memory yields.” ~ Friedrich Nietzsche
- “The saddest lies are the ones we tell ourselves.” ~ Lucille Clifton
- “Our mental limitations prevent us from accepting our mental
limitations.” ~ Robert A. Burton
- “Wisdom is ‘seeing through the illusion’” ~ McKee & Barber
A Mind of its Own: How Your Brain Distorts and Deceives, by Cordelia Fine
Vital Lies, Simple Truths: The Psychology of Self-Deception, by Daniel Goleman
Prisoners of Hate: The Cognitive Basis of Anger, Hostility, and Violence,
by Aaron T. Beck
Decision making and behavioral biases, Wikipedia entry.
Asch conformity experiments, Wikipedia entry.
Influence: The Psychology of Persuasion, by Robert B. Cialdini
Destructive Emotions: A Scientific Dialogue with the Dalai Lama, by Daniel Goleman
Mistakes Were Made (But Not by Me),
by Carol Tavris and Elliot Aronson
On Being Certain: Believing You Are Right Even When You're Not, by Robert A. Burton
Greenwald, A. G. (1980). The totalitarian ego: Fabrication and revision of personal history. American
Psychologist, 35, 603-618.
An Encyclopedia of Claims,
Frauds, and Hoaxes of the Occult and Supernatural, by James Randi
Psychology of Intelligence Analysis: Biases in Perception of Cause and Effect,
Chapter 11, Center for the Study of Intelligence, Central
Intelligence Agency, 1999
Maps of Bounded Rationality: A Perspective on Intuitive Judgment and Choice,
Nobel Prize Lecture, December 8, 2002 by Daniel Kahneman
Feeling and Thinking, Preferences Need No Inferences, by R. B. Zajonc, University of Michigan,
American Psychologist, February, 1980.
The Man with One Red Shoe, a movie farce starring a young Tom
Hanks, based on distorted thinking.