Hindsight bias, also known as the knew-it-all-along phenomenon or creeping determinism, refers to the common tendency for people to perceive events that have already occurred as having been more predictable than they actually were before the events took place. As a result, people often believe, after an event has occurred, that they would have predicted, or perhaps even would have known with a high degree of certainty, what the outcome of the event would have been, before the event occurred. Hindsight bias may cause distortions of our memories of what we knew and/or believed before an event occurred, and is a significant source of overconfidence regarding our ability to predict the outcomes of future events. Examples of hindsight bias can be seen in the writings of historians describing outcomes of battles, physicians recalling clinical trials, and in judicial systems as individuals attribute responsibility on the basis of the supposed predictability of accidents.
- 1 History
- 2 Factors
- 3 Effects
- 4 Consequences
- 5 Visual hindsight bias
- 6 Attempts to decrease
- 7 Mental illness
- 8 Examples
- 9 See also
- 10 References
- 11 Further reading
The hindsight bias, although not yet named, was not a new concept when it emerged in psychological research in the 1970s. It had been indirectly described numerous times by historians, philosophers, and physicians. In 1973, Baruch Fischhoff attended a seminar at which Paul E. Meehl observed that clinicians often overestimate their ability to have foreseen the outcome of a particular case, claiming to have known it all along. Fischhoff, a psychology graduate student at the time, saw an opportunity to explain these observations through psychological research.
In the early seventies, investigation of heuristics and biases was a large area of study in psychology, led by Amos Tversky and Daniel Kahneman. Two heuristics identified by Tversky and Kahneman were of immediate importance in the development of the hindsight bias: the availability heuristic and the representativeness heuristic. In an elaboration of these heuristics, Beyth and Fischhoff devised the first experiment directly testing the hindsight bias. They asked participants to judge the likelihood of several outcomes of US president Richard Nixon's upcoming visits to Beijing (then romanized as Peking) and Moscow. Some time after Nixon's return, participants were asked to recall (or reconstruct) the probabilities they had assigned to each possible outcome, and their recalled probabilities were greater, or overestimated, for events that had actually occurred. This study is frequently referred to in definitions of the hindsight bias, and the title of the paper, "I knew it would happen," may have contributed to the hindsight bias being used interchangeably with the phrase "knew-it-all-along phenomenon."
In 1975, Fischhoff developed another method for investigating the hindsight bias, at the time referred to as the "creeping determinism hypothesis". In this method, participants are given a short story with four possible outcomes, told that one of them is true, and then asked to assign a likelihood to each outcome. Participants frequently assign a higher likelihood to whichever outcome they have been told is true. Remaining relatively unmodified, this method is still used in psychological and behavioural experiments investigating aspects of the hindsight bias. Having evolved from the heuristics of Tversky and Kahneman into the creeping determinism hypothesis and finally into the hindsight bias as we now know it, the concept has many practical applications and remains at the forefront of research. Recent studies of the hindsight bias have investigated the effect of age on the bias, how hindsight may impact interference and confusion, and how it may affect banking and investment strategies.
Outcome valence and intensity
Hindsight bias occurs more often when the outcome of an event is negative rather than positive, consistent with the more general tendency for people to pay more attention to negative outcomes than to positive ones. In addition, hindsight bias is affected by the severity of the negative outcome. In malpractice suits, the more severe the negative outcome, the more dramatic the jurors' hindsight bias. In a perfectly objective case, the verdict would be based on the physician's standard of care rather than the outcome of the treatment; however, studies show that cases ending in severe negative outcomes (such as death) produce higher levels of hindsight bias. For example, in 1996, LaBine proposed a scenario in which a psychiatric patient told a therapist that he was contemplating harming another individual. The therapist did not warn that individual of the possible danger. Participants were each given one of three possible outcomes: the threatened individual received no injuries, minor injuries, or serious injuries. Participants were then asked to determine whether the doctor should be considered negligent. Participants in the "serious injuries" condition not only were more likely to rate the therapist as negligent but also rated the attack as more foreseeable. Participants in the no-injuries and minor-injuries conditions were more likely to see the therapist's actions as reasonable.
The role of surprise can help explain the malleability of hindsight bias. Surprise influences how the mind reconstructs pre-outcome predictions in three ways:
- Surprise is a direct metacognitive heuristic to estimate the distance between outcome and prediction.
- Surprise triggers a deliberate sense-making process.
- Surprise biases this process by enhancing the retrieval of surprise-congruent information and expectancy-based hypothesis testing.
Pezzo's sense-making model accommodates two contradictory responses to a surprising outcome. The result can be a lesser hindsight bias, or even a reversed effect in which the individual believes the outcome was not a possibility at all; alternatively, the surprise can magnify the hindsight bias. The sense-making process is triggered by an initial surprise. If the sense-making process does not complete, and the sensory information is not detected or coded, the outcome continues to be experienced as a surprise and the hindsight bias is gradually reduced. When the sense-making process is absent altogether, reversed hindsight bias arises: with no sense-making attempt, no supporting thoughts about the surprise remain, leading to the sensation that the outcome was never a possibility.
Along with the emotion of surprise, personality traits affect hindsight bias. A new integrative lens model attempts to account for both bias and accuracy in human inferences as a function of individual personality traits. This model treats accurate personality judgments and hindsight effects as by-products of knowledge updating.
During the study, three processes showed potential to explain the occurrence of hindsight effects in personality judgments:
- Changes in an individual's cue perceptions
- Changes in the use of more valid cues
- Changes in the consistency with which an individual applies cue knowledge
After two studies, it was clear that hindsight effects existed for each of the "Big Five" personality dimensions. Evidence was found that both the use of more valid cues and changes in cue perceptions, but not changes in the consistency with which cue knowledge is applied, account for the hindsight effects. In both studies, participants were presented with target pictures and asked to judge each target's levels of the Big Five.
It is more difficult to test for hindsight bias in children than in adults, because the verbal methods used in experiments on adults are too complex for children to understand, let alone to measure bias. Some experimental procedures have therefore been created around visual identification to test children in a way they can grasp. These methods start by presenting a blurry image that becomes clearer over time. In some conditions the subjects know what the final object is, and in others they do not. When the subject knows what the image will resolve into, they are asked to estimate the amount of time other participants of similar age will take to guess what the object is. Due to hindsight bias, the estimated times are often much lower than the actual times, because the participants use their own knowledge while making the estimate.
Such studies show that hindsight bias affects children as well as adults. Hindsight bias in adults and in children shares a core cognitive constraint: a tendency to be biased by one's current knowledge when attempting to recall or reason about a more naïve cognitive state, regardless of whether that more naïve state is one's own earlier state or someone else's. Children have a theory of mind, the capacity to reason about mental states, and hindsight bias is a fundamental problem in such cognitive perspective-taking. A review of the developmental literature on hindsight bias and related limitations found that some of children's limitations in theory of mind may stem from the same core component as hindsight bias, pointing to shared underlying mechanisms. A developmental approach is therefore necessary for a comprehensive understanding of the nature of hindsight bias in social cognition.
Hindsight bias also operates in auditory perception. To test the effects of auditory distortion on hindsight bias, four experiments were completed. Experiment one used plain words, with low-pass filters applied to reduce the amplitude of consonant sounds, thereby degrading the words. In the naïve-identification task, participants heard a warning tone before each degraded word. In the hindsight estimation task, a warning tone was presented before the clear word, followed by the degraded version of the word. Experiment two added explicit warnings about the hindsight bias: it followed the same procedure as experiment one, but participants were informed of the bias and asked not to commit the same error. Experiment three used full sentences of degraded words rather than individual words. Experiment four used less-degraded words, making them easier for participants to understand and identify.
Using these different techniques offers a range of detection conditions and evaluates the ecological validity of the effect. In each experiment, hindsight estimates exceeded the naïve-identification rates. That is, knowing the identities of words caused people to overestimate others' naïve ability to identify moderately to highly degraded spoken versions of those words. People who know the outcome of an event tend to overestimate their own prior knowledge, or others' naïve knowledge, of the event. As a result, speakers tend to overestimate the clarity of their message, while listeners tend to overestimate their understanding of ambiguous messages. This miscommunication stems from hindsight bias, which creates a feeling of inevitability. Overall, this auditory hindsight bias occurs despite people's efforts to avoid it.
To understand how a person can so easily change the foundation of knowledge and belief for events after receiving new information, three cognitive models of hindsight bias have been reviewed. The three models are SARA (Selective Activation and Reconstructive Anchoring), RAFT (reconstruction after feedback with take the best) and CMT (causal model theory). SARA and RAFT focus on distortions or changes in a memory process, while CMT focuses on probability judgments of hindsight bias.
The SARA model, created by Rüdiger Pohl and associates, explains hindsight bias for descriptive information in memory and hypothetical situations. SARA assumes that people have a set of images to draw their memories from, and that they suffer from the hindsight bias due to selective activation or biased sampling of that set of images. In essence, people remember only a small, selective portion of information, and when asked to recall it later, use that biased image set to support their opinions about the situation. The set of images is originally processed in the brain when first experienced. When remembered, this image set reactivates, and the mind can edit and alter the memory; when new and correct information is presented, this leads one to believe that the new information, when remembered later, is the person's original memory. Because of this reactivation in the brain, a more permanent memory trace can be created. The new information acts as a memory anchor, causing retrieval impairment.
The RAFT model explains hindsight bias as comparison of objects using knowledge-based probabilities, to which interpretations are then applied. When given two choices, a person recalls information on both topics and makes assumptions based on how reasonable they find the information. Consider someone comparing the sizes of two cities. If they know one city well (e.g. because it has a popular sporting team, or through personal history) and know much less about the other, their mental cues for the better-known city increase. They then "take the best" option in assessing their own probabilities; for example, they recognize a city from its sports team and therefore assume it has the larger population. "Take the best" refers to the cue viewed as most valid, which becomes support for the person's interpretation. RAFT is a by-product of adaptive learning: feedback information updates a person's knowledge base. This can leave a person unable to retrieve the initial information, since the original cue has been replaced by one they thought more fitting. The "best" cue has been replaced, and the person remembers only the answer that now seems most likely, believing that they thought it was the best option all along.
Both SARA and RAFT descriptions include a memory trace impairment or cognitive distortion that is caused by feedback of information and reconstruction of memory.
CMT is a non-formal theory based on work by many researchers to create a collaborative process model for hindsight bias that involves event outcomes. People try to make sense of an event that has not turned out how they expected by creating causal reasoning for the starting event conditions. This can give that person the idea that the event outcome was inevitable and there was nothing that could take place to prevent it from happening. CMT can be caused by a discrepancy between a person's expectation of the event and the reality of an outcome. They consciously want to make sense of what has happened and selectively retrieve memory that supports the current outcome. The causal attribution can be motivated by wanting to feel more positive about the outcome, and possibly themselves.
Are people lying, or are they tricking themselves into believing that they knew the right answer? These models suggest that memory distortion and personal bias, rather than deliberate deception, are at work.
Hindsight bias has similarities to other memory distortions, such as misinformation effect and false autobiographical memory. Misinformation effect occurs after an event is witnessed; new information received after the fact influences how the person remembers the event, and can be called post-event misinformation. This is an important issue with eyewitness testimony. False autobiographical memory takes place when suggestions or additional outside information is provided to distort and change memory of events; this can also lead to false memory syndrome. At times this can lead to creation of new memories that are completely false and have not taken place.
All three of these memory distortions involve a three-stage procedure. The details of each procedure differ, but all three can result in psychological manipulation and alteration of memory. Stage one differs among the three paradigms, although all involve an event: an event that has taken place (misinformation effect), an event that has not taken place (false autobiographical memory), or a judgment made by a person about an event that must later be remembered (hindsight bias). Stage two consists of further information received by the person after the event has taken place. The new information in hindsight bias is correct and presented openly, while the extra information in the other two distortions is wrong and presented in an indirect, possibly manipulative way. The third stage consists of recalling the initial information. The person must recall the original information in hindsight bias and the misinformation effect, while a person with a false autobiographical memory is expected to remember the incorrect information as a true memory.
Calvillo (2013) tested whether there is a relationship between the amount of time participants were given to respond and their level of bias when recalling their initial judgments. The results showed that there is such a relationship: the hindsight bias index was greater among participants asked to respond rapidly than among those allowed more time to respond.
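In memory designs like this, the size of the bias is often summarized by a simple index: the average shift of recalled probability estimates toward the known outcome. The following sketch is illustrative only; the function name and the specific averaging rule are assumptions, not the exact procedure used in the study.

```python
def hindsight_bias_index(original, recalled, occurred):
    """Mean signed shift of recalled estimates toward the known outcomes.

    original: probabilities assigned before the outcome was known
    recalled: the participant's later recollection of those estimates
    occurred: True if the outcome happened (an upward shift counts as bias),
              False if it did not (a downward shift counts as bias)
    """
    shifts = []
    for orig, rec, occ in zip(original, recalled, occurred):
        shift = rec - orig                       # positive = recalled higher
        shifts.append(shift if occ else -shift)  # sign toward the outcome
    return sum(shifts) / len(shifts)

# Three predictions recalled after learning that outcomes 1 and 2
# occurred but outcome 3 did not; all recalled estimates drift
# toward the known outcomes, so the index is positive.
idx = hindsight_bias_index(
    original=[0.40, 0.30, 0.50],
    recalled=[0.55, 0.35, 0.40],
    occurred=[True, True, False],
)  # idx is about 0.10
```

An unbiased rememberer (recalled equal to original) would score zero; larger positive values indicate stronger hindsight bias.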
Distortions of autobiographical memory produced by hindsight bias have also been used to study changes in students' beliefs about paranormal phenomena after taking a university-level skepticism course. In a study by Kane (2010), students in Kane's skepticism class rated their level of belief in a variety of paranormal phenomena at both the beginning and the end of the course. At the end of the course, they also rated what they remembered their level of belief to have been at the beginning. The critical finding was that students not only reduced their average level of belief in paranormal phenomena by the end of the course, but also misremembered the level of belief they had held at the beginning, recalling a much lower level than had really been the case. This latter finding reflects the operation of hindsight bias.
To create a false autobiographical memory, the person must believe a memory that is not real. To seem real, the information must be shaped by their own personal judgments: since there is no real episode to remember, the constructed memory must be consistent with that person's knowledge base. Hindsight bias and the misinformation effect, by contrast, recall a specific time and event; this is called an episodic memory process. These two distortions use memory-based mechanisms that involve an altered memory trace. Hippocampus activation takes place when an episodic memory is recalled; the memory then becomes available for alteration by new information. The person believes that the remembered information is the original memory trace, not an altered one. Because the new memory is made from accurate information, the person has little motivation to admit they were originally wrong by recalling the original memory. This can lead to motivated forgetting.
Following a negative outcome, people do not want to accept responsibility. Instead of accepting their role in the event, they might view themselves as caught up in an unforeseeable situation, and therefore not the culprits (defensive processing), or view the situation as inevitable, so that nothing could have been done to prevent it (retroactive pessimism). Defensive processing involves less hindsight bias, as it plays ignorant of the event. Retroactive pessimism makes use of hindsight bias after a negative, unwanted outcome. Events in life can be hard to control or predict, and it is no surprise that people want to view themselves in a more positive light and avoid responsibility for situations they could have altered. This leads to hindsight bias in the form of retroactive pessimism, which inhibits upward counterfactual thinking and instead interprets the outcome as succumbing to an inevitable fate.
This memory inhibition, which prevents a person from recalling what really happened, may lead to a failure to accept mistakes, and may therefore leave someone unable to learn and grow so as to avoid repeating them. Hindsight bias can also lead to overconfidence in decisions without consideration of other options. Such people see themselves as remembering correctly, even though they are merely forgetting that they were wrong. Avoiding responsibility in this way is common. The examples discussed below show the regularity and severity of hindsight bias in society.
Hindsight bias has both positive and negative consequences. The bias also plays a role in decision-making within the medical field.
One positive consequence of hindsight bias is an increase in confidence and performance, as long as the bias distortion is reasonable and does not create overconfidence. Another is that self-assurance in one's knowledge and decision-making, even when a decision turns out poorly, can benefit others, allowing them to experience new things or to learn from those who made the poor decisions.
One negative consequence is that hindsight bias decreases rational thinking when a person experiences strong emotions. Another is interference with the ability to learn from experience, as a person cannot look back on past decisions and learn from mistakes. A third is decreased sensitivity toward a victim on the part of the person who caused the wrongdoing: the person demoralizes the victim and does not allow for a correction of behaviors and actions.
Hindsight bias may lead to overconfidence and malpractice among doctors. Hindsight bias and overconfidence are often attributed to the number of years of experience a doctor has. After a procedure, doctors may have a "knew it the whole time" attitude, when in reality they may not have actually known it. In an effort to avoid hindsight bias, doctors may use computer-based decision support systems that help them diagnose and treat patients correctly and accurately.
Visual hindsight bias
Hindsight bias has also been found to affect judgments regarding the perception of visual stimuli, an effect referred to as the "I saw it all along" phenomenon. This effect has been demonstrated experimentally by presenting participants with initially very blurry images of celebrities. Participants then viewed the images as they resolved to full clarity (Phase 1). Following Phase 1, participants predicted the level of blur at which a peer would be able to accurately identify each celebrity. Now that the identity of the celebrity in each image was known, participants significantly overestimated the ease with which others would be able to identify the celebrities when the images were blurry.
The phenomenon of visual hindsight bias has important implications for a form of malpractice litigation that occurs in the field of radiology. Typically, in these cases, a radiologist is charged with having failed to detect the presence of an abnormality that was actually present in a radiology image. During litigation, a different radiologist – who now knows that the image contains an abnormality – is asked to judge how likely it would be for a naive radiologist to have detected the abnormality during the initial reading of the image. This kind of judgment directly parallels the judgments made in hindsight bias studies. Consistent with the hindsight bias literature, it has been found that abnormalities are, in fact, more easily detected in hindsight than foresight. In the absence of controls for hindsight bias, testifying radiologists may overestimate the ease with which the abnormality would have been detected in foresight.
Attempts to decrease
Research suggests that people still exhibit the hindsight bias even when they are aware of it or intend to eradicate it. There is no way to eliminate hindsight bias entirely, only ways to reduce it, such as considering alternative explanations or opening one's mind to different perspectives. In auditory communication, a speaker can try to provide more clarity in their delivery, and a listener can seek greater clarification.
The only demonstrated way to decrease hindsight bias in testing is to have the participant think about how alternative hypotheses could be correct. As a result, the participant doubts the correct hypothesis and reports that he or she would not have chosen it.
Given that researchers' attempts to eliminate hindsight bias entirely have failed, some believe a combination of motivational and automatic processes operates in cognitive reconstruction. Incentives prompt participants to expend more effort to recover even weak memory traces. This idea supports the causal model theory and the use of sense-making to understand event outcomes.
Schizophrenia is an example of a disorder that directly affects hindsight bias. Individuals with schizophrenia are more strongly affected by the hindsight bias than individuals from the general public.
The hindsight bias effect is a paradigm demonstrating how recently acquired knowledge influences the recollection of past information. In individuals with schizophrenia, recently acquired knowledge has an unusually strong influence on the recollection of previously learned information. New information, combined with rejection of past memories, can disconfirm behavior and delusional belief, as is typically found in patients with schizophrenia. This can cause faulty memory, leading to hindsight thinking and a belief in knowing something they do not. Delusion-prone individuals with schizophrenia can falsely jump to conclusions, and jumping to conclusions can lead to hindsight, which strongly influences delusional conviction. In numerous studies, cognitive functional deficits in individuals with schizophrenia have been shown to impair their ability to represent and maintain contextual processing.
Post-traumatic stress disorder
Post-traumatic stress disorder (PTSD) involves re-experiencing and avoidance of trauma-related stressors, emotions, and memories from a past event or events that has a dramatic cognitive impact on an individual. PTSD can be attributed to functional impairment of the prefrontal cortex (PFC). Dysfunctions in the cognitive processing of context, and other abnormalities from which PTSD patients suffer, can affect hindsight thinking, as when combat soldiers perceive that they could have altered the outcomes of events in war. The PFC and dopamine systems can be responsible for impaired cognitive control over the processing of contextual information. The PFC is involved in controlling the thought, characteristic of hindsight bias, that something was bound to happen when it evidently was not. Impairment in certain brain regions can also affect the thought processes of an individual who engages in hindsight thinking.
Cognitive flashbacks and other associated features of a traumatic event can trigger severe stress and negative emotions such as unpardonable guilt. For example, studies have examined trauma-related guilt in war veterans with chronic PTSD. Although research is limited, the data suggest that hindsight bias affects war veterans' perception of their own wrongdoing, in terms of guilt and responsibility for traumatic events of war. They blame themselves and, in hindsight, perceive that they could have prevented what happened.
Health care system
Accidents are prone to happen in any human undertaking, but accidents within the healthcare system seem more salient and severe because of their profound effect on the lives of those involved, sometimes resulting in the death of a patient. In the healthcare system, there are a number of methods by which cases in which accidents occurred are reviewed by others who already know the outcome: morbidity and mortality conferences, autopsies, case analysis, medical malpractice claims analysis, staff interviews, and even patient observation. Hindsight bias has been shown to cause difficulties in measuring errors in these cases. Many of these errors are considered preventable after the fact, clearly indicating the presence and importance of hindsight bias in this field. There are two sides to the debate over how such case reviews should best be approached: the error elimination strategy and the safety management strategy. The error elimination strategy aims to find the cause of errors, relying heavily on hindsight (and is therefore more subject to hindsight bias). The safety management strategy relies less on hindsight (and is less subject to hindsight bias), identifying possible constraints during the decision-making process of the case; however, it is not immune to error.
Hindsight bias results in defendants being held to a higher standard in court. The defense is particularly susceptible to these effects, since its actions are the ones being scrutinized by the jury. Due to the hindsight bias, defendants are judged as having been capable of preventing the bad outcome. Although the effect is much stronger for defendants, hindsight bias also affects plaintiffs: in cases involving an assumption of risk, it may lead jurors to perceive the event as riskier because of the poor outcome, and hence to feel that the plaintiff should have exercised greater caution. Both of these effects can be minimized if attorneys put the jury in a position of foresight rather than hindsight through the use of language and timelines. Looking at a situation after the fact, judges and juries are likely to mistakenly view negative events as more foreseeable than they really were at the time. Encouraging people to explicitly think about counterfactuals has proved an effective means of reducing hindsight bias: people become less attached to the actual outcome and more open to considering alternative lines of reasoning prior to the event. Judges in fraudulent transfer litigation have also been found subject to the hindsight bias, resulting in an unfair advantage for the plaintiff and showing that jurors are not the only ones sensitive to these effects in the courtroom.
Since hindsight leads people to focus on information consistent with what happened, while inconsistent information is ignored or regarded as less relevant, it is likely reflected in representations of the past as well. In a study of Wikipedia articles, the latest article versions before an event (foresight versions) were compared with two hindsight versions: the first to appear online after the event took place, and another from eight weeks later. To be able to investigate various types of events, including disasters (e.g., the Fukushima nuclear disaster) for which foresight articles do not exist, the authors used articles about the structures that suffered damage (e.g., the article about the Fukushima nuclear power plant). Analyzing the extent to which the articles were suggestive of a particular event, they found that only articles about disasters were much more suggestive of the event in hindsight than in foresight, indicating hindsight bias; for the remaining event categories, Wikipedia articles showed no hindsight bias. In an attempt to compare individuals' and Wikipedia's hindsight bias more directly, another study concluded that Wikipedia articles are less susceptible to hindsight bias than individuals' representations.
- "'I Knew It All Along…Didn't I?' – Understanding Hindsight Bias". APS Research News. Association for Psychological Science. Retrieved 29 January 2019.
- Fischhoff, B. (1975). "Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty". Journal of Experimental Psychology: Human Perception and Performance. 1 (3): 288–299. doi:10.1037/0096-1523.1.3.288.
- Roese, N. J.; Vohs, K. D. (2012). "Hindsight bias". Perspectives on Psychological Science. 7 (5): 411–426. doi:10.1177/1745691612454303. PMID 26168501.
- Hoffrage, Ulrich; Pohl, Rüdiger (2003). "Research on hindsight bias: A rich past, a productive present, and a challenging future". Memory. 11 (4–5): 329–335. doi:10.1080/09658210344000080. PMID 14562866.
- Boyd, Drew. "Innovators: Beware the Hindsight Bias". Psychology Today. Sussex Publishers. Retrieved 29 January 2019.
- Blank, H.; Nestler, S.; von Collani, G.; Fischer, V. (2008). "How many hindsight biases are there?". Cognition. 106 (3): 1408–1440. doi:10.1016/j.cognition.2007.07.007. PMID 17764669.
- Arkes, H.; Faust, D.; Guilmette, T. J.; Hart, K. (1988). "Eliminating the hindsight bias". Journal of Applied Psychology. 73 (2): 305–307. doi:10.1037/0021-9010.73.2.305.
- Fischhoff, B (2007). "An early history of hindsight research". Social Cognition. 25: 10–13. CiteSeerX 10.1.1.365.6826. doi:10.1521/soco.2007.25.1.10.
- Tversky, A.; Kahneman, D. (1973). "Availability: A heuristic for judging frequency and probability". Cognitive Psychology. 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9.
- Fischhoff, Baruch; Beyth, Ruth (1975). "I knew it would happen: Remembered probabilities of once—future things". Organizational Behavior and Human Performance. 13: 1–16. doi:10.1016/0030-5073(75)90002-1.
- Bernstein, D.M.; Erdfelder, E.; Meltzoff, A. N.; Peria, W.; Loftus, G. R. (2011). "Hindsight bias from 3 to 95 years of age". Journal of Experimental Psychology: Learning, Memory, and Cognition. 37 (2): 378–391. doi:10.1037/a0021971. PMC 3084020. PMID 21299327.
- Marks, A. Z.; Arkes, H. R. (2010). "The effects of mental contamination on the hindsight bias: Source confusion determines success in disregarding knowledge". Journal of Behavioral Decision Making. 23 (2): 131–160. doi:10.1002/bdm.632.
- Biais, Bruno; Weber, Martin (2009). "Hindsight Bias, Risk Perception, and Investment Performance" (PDF). Management Science. 55 (6): 1018–1029. doi:10.1287/mnsc.1090.1000.
- Schkade, D.; Kilbourne, L. (1991). "Expectation-Outcome Consistency and Hindsight Bias". Organizational Behavior and Human Decision Processes. 49: 105–123. doi:10.1016/0749-5978(91)90044-T.
- Fiske, S. (1980). "Attention and weight in person perception: The impact of negative and extreme behavior". Journal of Personality and Social Psychology. 38 (6): 889–906. doi:10.1037/0022-3514.38.6.889.
- Harley, E. M. (2007). "Hindsight bias in legal decision making". Social Cognition. 25 (1): 48–63. doi:10.1521/soco.2007.25.1.48.
- Müller, Patrick A.; Stahlberg, Dagmar (2007). "The Role of Surprise in Hindsight Bias: A Metacognitive Model of Reduced and Reversed Hindsight Bias" (PDF). Social Cognition. 25: 165–184. doi:10.1521/soco.2007.25.1.165.
- Nestler, Steffen; Egloff, Boris; Küfner, Albrecht C. P.; Back, Mitja D. (2012). "An integrative lens model approach to bias and accuracy in human inferences: Hindsight effects and knowledge updating in personality judgments". Journal of Personality and Social Psychology. 103 (4): 689–717. doi:10.1037/a0029461. PMID 22844973.
- Bernstein, D. M.; Atance, C.; Loftus, G. R.; Meltzoff, A. (April 2004). "We saw it all along: Visual hindsight bias in children and adults". Psychological Science. 15 (4): 264–267. doi:10.1111/j.0963-7214.2004.00663.x. PMC 3640979. PMID 15043645.
- Birch, Susan A. J.; Bernstein, Daniel M. (2007). "What Can Children Tell Us About Hindsight Bias: A Fundamental Constraint on Perspective–Taking?". Social Cognition. 25: 98–113. doi:10.1521/soco.2007.25.1.98.
- Bernstein, Daniel M.; Wilson, Alexander Maurice; Pernat, Nicole L. M.; Meilleur, Louise R. (2012). "Auditory hindsight bias" (PDF). Psychonomic Bulletin & Review. 19 (4): 588–593. doi:10.3758/s13423-012-0268-0. PMID 22648656.
- Blank, H.; Nestler, S. (2007). "Cognitive Process Models of Hindsight Bias". Social Cognition. 25 (1): 132–147. doi:10.1521/soco.2007.25.1.132.
- Pohl, R. F.; Eisenhauer, M.; Hardt, O. (2003). "SARA: A Cognitive Process Model to Simulate the Anchoring Effect and Hindsight Bias". Memory. 11 (4–5): 337–356. doi:10.1080/09658210244000487. PMID 14562867.
- Loftus, E. F. (1991). "Made in Memory: Distortions in Recollection After Misleading Information". The Psychology of Learning and Motivation, 25, 187–215. New York: Academic Press.
- Hertwig, R.; Fanselow, C.; Hoffrage, U. (2003). "Hindsight Bias: How Knowledge and Heuristics Affect Our Reconstruction of the Past". Memory. 11 (4–5): 357–377. doi:10.1080/09658210244000595. hdl:11858/00-001M-0000-0025-8C80-E. PMID 14562868.
- Nestler, S.; Blank, H.; von Collani, G. (2008). "A Causal Model Theory of Creeping Determinism". Social Psychology. 39 (3): 182–188. doi:10.1027/1864-9335.39.3.182.
- Mazzoni, G.; Vannucci, M. (2007). "Hindsight bias, the misinformation effect, and false autobiographical memories". Social Cognition. 25 (1): 203–220. doi:10.1521/soco.2007.25.1.203.
- Calvillo, Dustin P. (2013). "Rapid recollection of foresight judgments increases hindsight bias in a memory design". Journal of Experimental Psychology: Learning, Memory, and Cognition. 39 (3): 959–964. doi:10.1037/a0028579. PMID 22582966.
- Kane, M.; Core, T.J.; Hunt, R.R. (2010). "Bias versus bias: Harnessing hindsight to reveal paranormal belief change beyond demand characteristics". Psychonomic Bulletin & Review. 17 (2): 206–212. doi:10.3758/PBR.17.2.206. PMID 20382921.
- Kane, M.J. (2010). "Can people's minds be changed? How can we know?". Skeptic. 16 (1): 28–31.
- Nadel, L., Hupbach, A., Hardt, O., & Gomez, R. (2008). "Episodic Memory: Reconsolidation". Dere, E., Easton, A., Nadel, L., & Huston, J. P. (Eds.), Handbook of Episodic Memory (pp. 43–56). The Netherlands: Elsevier.
- Pezzo, M.; Pezzo, S. (2007). "Making Sense of Failure: A Motivated Model of Hindsight Bias". Social Cognition. 25 (1): 147–165. CiteSeerX 10.1.1.319.1999. doi:10.1521/soco.2007.25.1.147.
- Tykocinski, O. E.; Steinberg, N. (2005). "Coping with disappointing outcomes: Retroactive pessimism and motivated inhibition of counterfactuals". Journal of Experimental Social Psychology. 41 (5): 551–558. doi:10.1016/j.jesp.2004.12.001.
- Louie, T. A.; Rajan, M. N.; Sibley, R. E. (2007). "Tackling the Monday-morning quarterback: Applications of hindsight bias in decision-making settings". Social Cognition. 25 (1): 32–47. doi:10.1521/soco.2007.25.1.32.
- Arkes, H. R. (2013). "The consequences of the hindsight bias in medical decision making". Current Directions in Psychological Science. 22 (5): 356–360. doi:10.1177/0963721413489988.
- Blank, H.; Musch, J.; Pohl, R. F. (2007). "Hindsight Bias: On Being Wise After the Event". Social Cognition. 25 (1): 1–9. doi:10.1521/soco.2007.25.1.1.
- Harley, Erin M.; Carlsen, Keri A.; Loftus, Geoffrey R. (2004). "The 'Saw-It-All-Along' Effect: Demonstrations of Visual Hindsight Bias" (PDF). Journal of Experimental Psychology: Learning, Memory, and Cognition. 30 (5): 960–968. doi:10.1037/0278-7393.30.5.960. PMID 15355129.
- Berlin, L. (2000). "Malpractice issues in Radiology: Hindsight bias". American Journal of Roentgenology. 175 (3): 597–601. doi:10.2214/ajr.175.3.1750597. PMID 10954437.
- Muhm, J.; Miller, W.; Fontana, R; Sanderson, D.; Uhlenhopp, M. (1983). "Lung cancer detected during a screening program using four-month chest radiographs". Radiology. 148 (3): 609–615. doi:10.1148/radiology.148.3.6308709. PMID 6308709.
- Pohl, R.; Hell, W. (1996). "No Reduction in Hindsight Bias After Complete Information and Repeated Testing". Organizational Behavior and Human Decision Processes. 67 (1): 49–58. doi:10.1006/obhd.1996.0064.
- Hell, Wolfgang; Gigerenzer, Gerd; Gauggel, Siegfried; Mall, Maria; Müller, Michael (1988). "Hindsight bias: An interaction of automatic and motivational factors?". Memory & Cognition. 16 (6): 533–538. doi:10.3758/BF03197054. PMID 3193884.
- Woodward, T. S.; Moritz, S.; Arnold, M. M.; Cuttler, C.; Whitman, J. C.; Lindsay, S. (2006). "Increased hindsight bias in schizophrenia". Neuropsychology. 20 (4): 462–467. CiteSeerX 10.1.1.708.6018. doi:10.1037/0894-4105.20.4.462. PMID 16846264.
- Freeman, D; Pugh, K; Garety, PA (2008). "Jumping to conclusions and paranoid ideation in the general population". Schizophrenia Research. 102 (1–3): 254–260. doi:10.1016/j.schres.2008.03.020. PMID 18442898.
- Holmes, Avram J.; MacDonald, Angus; Carter, Cameron S.; Barch, Deanna M.; Andrew Stenger, V.; Cohen, Jonathan D. (2005). "Prefrontal functioning during context processing in schizophrenia and major depression: An event-related fMRI study". Schizophrenia Research. 76 (2–3): 199–206. doi:10.1016/j.schres.2005.01.021. PMID 15949653.
- Brewin, C.; Dalgleish, T.; Joseph, S. (1996). "A dual representation theory of posttraumatic stress disorder". Psychological Review. 103 (4): 670–686. doi:10.1037/0033-295x.103.4.670. PMID 8888651.
- Richert, K. A.; Carrion, V. G.; Karchemskiy, A.; Reiss, A. L. (2006). "Regional differences of the prefrontal cortex in pediatric PTSD: an MRI study". Depression and Anxiety. 23 (1): 17–25. doi:10.1002/da.20131. PMID 16247760.
- Braver, Todd S.; Barch, Deanna M.; Keys, Beth A.; Carter, Cameron S.; Cohen, Jonathan D.; Kaye, Jeffrey A.; Janowsky, Jeri S.; Taylor, Stephan F.; Yesavage, Jerome A.; Mumenthaler, Martin S.; Jagust, William J.; Reed, Bruce R. (2001). "Context processing in older adults: Evidence for a theory relating cognitive control to neurobiology in healthy aging". Journal of Experimental Psychology: General. 130 (4): 746–763. CiteSeerX 10.1.1.599.6420. doi:10.1037/0096-3445.130.4.746.
- Beckham, Jean C.; Feldman, Michelle E.; Kirby, Angela C. (1998). "Atrocities Exposure in Vietnam Combat Veterans with Chronic Posttraumatic Stress Disorder: Relationship to Combat Exposure, Symptom Severity, Guilt, and Interpersonal Violence". Journal of Traumatic Stress. 11 (4): 777–785. doi:10.1023/a:1024453618638. PMID 9870228.
- Hurwitz, B., & Sheikh, A. (2009). Healthcare Errors and Patient Safety. Hoboken, NJ: Blackwell Publishing.
- Starr, V. H., & McCormick, M. (2001). Jury Selection (Third Edition). Aspen Law and Business
- Oeberst, A.; Goeckenjan, I. (2016). "When being wise after the event results in injustice: Evidence for hindsight bias in judges' negligence assessments". Psychology, Public Policy, and Law. 22 (3): 271–279. doi:10.1037/law0000091.
- Peterson, R. L. (2007). Inside the Investor's Brain: the power of mind over money. Hoboken, NJ: Wiley Publishing.
- Simkovic, M., & Kaminetzky, B. (2010). "Leverage Buyout Bankruptcies, the Problem of Hindsight Bias, and the Credit Default Swap Solution". Seton Hall Public Research Paper: August 29, 2010.
- Nestler, Steffen; Blank, Hartmut; von Collani, Gernot (2008). "Hindsight bias doesn't always come easy: Causal models, cognitive effort, and creeping determinism". Journal of Experimental Psychology: Learning, Memory, and Cognition. 34 (5): 1043–1054. doi:10.1037/0278-7393.34.5.1043. ISSN 1939-1285.
- Oeberst, Aileen; von der Beck, Ina; D. Back, Mitja; Cress, Ulrike; Nestler, Steffen (17 April 2017). "Biases in the production and reception of collective knowledge: the case of hindsight bias in Wikipedia". Psychological Research. 82 (5): 1010–1026. doi:10.1007/s00426-017-0865-7. ISSN 0340-0727. PMID 28417198.
- Oeberst, Aileen; von der Beck, Ina; Cress, Ulrike; Nestler, Steffen (20 March 2019). "Wikipedia outperforms individuals when it comes to hindsight bias". Psychological Research. doi:10.1007/s00426-019-01165-7. ISSN 0340-0727. PMID 30895365.
- Excerpt from: David G. Myers, Exploring Social Psychology. New York: McGraw-Hill, 1994, pp. 15–19. (More discussion of Paul Lazarsfeld's experimental questions.)
- Ken Fisher, Forecasting (Macro and Micro) and Future Concepts, on Market Analysis (4/7/06)
- Iraq War Naysayers May Have Hindsight Bias. Shankar Vedantam. The Washington Post.
- Why Hindsight Can Damage Foresight. Paul Goodwin. Foresight: The International Journal of Applied Forecasting, Spring 2010.
- Social Cognition (2007) Vol. 25, Special Issue: The Hindsight Bias