Curse of knowledge
The curse of knowledge is a cognitive bias that occurs when an individual, communicating with others, unknowingly assumes that the others have the background needed to understand. Some authors also call this bias the curse of expertise, although that term is also used to refer to various other phenomena.
For example, in a classroom setting, teachers have difficulty teaching novices because they cannot put themselves in the position of the student. A brilliant professor might no longer remember the difficulties that a young student encounters when learning a new subject. This curse of knowledge also explains the danger behind thinking about student learning based on what appears best to faculty members, as opposed to what has been verified with students.
History of concept
The term "curse of knowledge" was coined in a 1989 Journal of Political Economy article by economists Colin Camerer, George Loewenstein, and Martin Weber. The aim of their research was to counter the "conventional assumptions in such (economic) analyses of asymmetric information in that better-informed agents can accurately anticipate the judgement of less-informed agents".
Such research drew on Baruch Fischhoff's 1975 work on hindsight bias, a cognitive bias in which knowing the outcome of an event makes it seem more predictable than it actually was. Fischhoff's research revealed that participants did not know that their outcome knowledge affected their responses, and that even when they did know, they could not ignore or defeat the effects of the bias. Study participants could not accurately reconstruct their previous, less knowledgeable states of mind, which relates directly to the curse of knowledge. Fischhoff theorized that this poor reconstruction occurred because the participant was "anchored in the hindsightful state of mind created by receipt of knowledge". This receipt of knowledge returns to the idea of the curse proposed by Camerer, Loewenstein, and Weber: a knowledgeable person cannot accurately reconstruct what a person without the knowledge, be it themselves or someone else, would think or how they would act. In his paper, Fischhoff questions our failure to empathize with ourselves in less knowledgeable states, and notes that how well people manage to reconstruct the perceptions of less-informed others is a crucial question for historians and for "all human understanding".
This research led Camerer, Loewenstein, and Weber to focus on the economic implications of the concept and to ask whether the curse harms the allocation of resources in an economic setting. The idea that better-informed parties may suffer losses in a deal or exchange was seen as important to bring into economic theory. Most theoretical analyses of situations in which one party knows less than the other had focused on how the less-informed party tries to learn more in order to minimize the information asymmetry. These analyses, however, assume that better-informed parties can optimally exploit their informational advantage when, in fact, they cannot: people fail to set aside their additional information even when doing so would benefit them in a bargaining situation.
For example, suppose two people are bargaining over dividing money or provisions. One party may know the size of the amount being divided while the other does not. To fully exploit their advantage, the informed party should make the same offer regardless of the amount to be divided, so as not to reveal their private information. In practice, however, informed parties offer more when the amount to be divided is larger: they are unable to ignore their better information, even when they should.
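The bargaining pattern above can be sketched numerically. Note that the amounts and the one-third offer rule below are invented for illustration and are not taken from the original experiments:

```python
# Hypothetical numbers illustrating the bargaining result described above.
pie_sizes = [30, 60, 90]  # amounts to be divided; known only to the informed party

# To fully exploit the advantage, the informed party would make one offer
# that does not vary with the pie, revealing nothing about its size.
pooling_offer = 15

# What informed parties actually do: offer more when the pie is larger,
# unintentionally leaking their private information.
cursed_offers = [size // 3 for size in pie_sizes]

print("pooling offer:", pooling_offer)  # the same for every pie
print("cursed offers:", cursed_offers)  # the offers track the pie size
```

Because the cursed offers vary with the pie, an attentive uninformed party can infer the size of the pie from the offer itself, which is exactly what the informed party should want to prevent.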
A 1990 experiment by Elizabeth Newton, then a Stanford University graduate student, illustrated the curse of knowledge with a simple task. One group of subjects "tapped" out well-known songs with their fingers, while another group tried to name the melodies. When the "tappers" were asked to predict how many of the tapped songs listeners would recognize, they consistently overestimated: they predicted that listeners would identify about half of the songs, but listeners identified only about 2.5% of them. The tappers were so familiar with what they were tapping that they assumed listeners would easily recognize the tune.
A study by Susan Birch and Paul Bloom involving Yale University undergraduate students used the curse of knowledge to explain how people's ability to reason about another person's actions is compromised by knowing the outcome of an event. A participant's perception of the plausibility of an event also mediated the extent of the bias: when the event was less plausible, knowledge was less of a "curse" than when a potential explanation existed for how the other person might act. However, a later replication study found that this effect was not reliably reproducible across seven experiments with large sample sizes, and that its true effect size was less than half that originally reported. The authors suggest that "the influence of plausibility on the curse of knowledge in adults appears to be small enough that its impact on real-life perspective-taking may need to be reevaluated."
Other researchers have linked the curse of knowledge bias with false-belief reasoning in both children and adults, as well as theory of mind development difficulties in children.
Related to this finding is the phenomenon experienced by players of charades: the actor may find it frustratingly hard to believe that his or her teammates keep failing to guess the secret phrase, known only to the actor, conveyed by pantomime.
In their article, Camerer, Loewenstein, and Weber note that the setting closest in structure to their market experiments is underwriting, a task in which well-informed experts price goods that are sold to a less-informed public. Investment bankers value securities, experts taste cheese, store buyers observe jewelry being modeled, and theater owners see movies before they are released. They then sell those goods to a less-informed public. If they suffer from the curse of knowledge, high-quality goods will be overpriced and low-quality goods underpriced relative to optimal, profit-maximizing prices; prices will reflect characteristics (e.g., quality) that are unobservable to uninformed buyers ("you get what you pay for").
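This mispricing prediction can be shown with a toy calculation. All of the numbers and the blending weight below are invented for illustration and do not come from the article:

```python
# Invented qualities for three goods; only the expert seller observes them.
qualities = [20, 50, 80]

# Uninformed buyers can only infer the average quality, so the
# profit-maximizing price is the same for every good.
optimal_price = sum(qualities) / len(qualities)  # 50.0

def cursed_price(quality, weight=0.5):
    """A 'cursed' expert lets the price partially reflect quality
    that buyers cannot observe (the weight is an illustrative assumption)."""
    return (1 - weight) * optimal_price + weight * quality

for q in qualities:
    print(f"quality {q}: optimal {optimal_price}, cursed {cursed_price(q)}")
# The high-quality good (80) is overpriced at 65.0, and the
# low-quality good (20) underpriced at 35.0, relative to 50.0.
```

The cursed prices leak exactly the characteristic (quality) that uninformed buyers cannot observe, which is the "you get what you pay for" pattern described above.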
The curse of knowledge has a paradoxical effect in these settings. By making better-informed agents think that their knowledge is shared by others, the curse helps alleviate the inefficiencies that result from information asymmetries (a better-informed party having an advantage in a bargaining situation), bringing outcomes closer to those of complete information. In such settings, the curse on individuals may actually improve social welfare.
Economists Camerer, Loewenstein, and Weber first applied the curse of knowledge phenomenon to economics, in order to explain why and how the assumption that better-informed agents can accurately anticipate the judgments of lesser-informed agents is not inherently true. They also sought to support the finding that sales agents who are better informed about their products may, in fact, be at a disadvantage against other, less-informed agents when selling their products. The reason is said to be that better-informed agents fail to ignore the privileged knowledge that they possess and are thus "cursed" and unable to sell their products at a value that more naïve agents would deem acceptable.
It has also been suggested that the curse of knowledge contributes to the difficulty of teaching. It implies that it could be ineffective, or even harmful, to reason about how students view and learn material from the teacher's own perspective, as opposed to what has been verified with students. The teacher already has the knowledge they are trying to impart, but the way that knowledge is presented may not be the best for those who do not yet possess it.
The curse of knowledge can show up in writing, where pronouns or acronyms that are clear to the writer may mean nothing to the reader. The reader, in turn, may feign understanding to avoid appearing ignorant.
It can also show up in computer programming, where a programmer fails to write understandable code, or to comment it, because its purpose seems obvious at the time of writing. A few months later, even the author may have no idea why the code exists.
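As a hypothetical sketch (the function and its magic numbers are invented for illustration), here is the same check written with and without the author's hidden context:

```python
# What the cursed programmer writes: obvious to them today, opaque later.
def eligible(a, s):
    return a >= 18 and s >= 650

# The same logic with the writer's hidden context made explicit.
LEGAL_ADULT_AGE = 18    # applicant must be an adult to sign the contract
MIN_CREDIT_SCORE = 650  # assumed risk threshold for this lender

def is_eligible(age, credit_score):
    """Return True if an applicant is old enough and creditworthy."""
    return age >= LEGAL_ADULT_AGE and credit_score >= MIN_CREDIT_SCORE

# Both versions behave identically; only the second survives the author's
# loss of context months later.
assert eligible(30, 700) == is_eligible(30, 700)
```

The first version is not wrong, and to its author at the moment of writing it reads perfectly clearly; the bias lies in assuming a future reader (including the author's future self) shares the context that makes 18 and 650 meaningful.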
Another example is a to-do list: entries that made sense when written can be cryptic when read later, because the context that made them meaningful has been lost.
The concept was first popularized by Chip and Dan Heath in the book Made to Stick.
See also
- Adaptive bias – Idea that the human brain has evolved to reason adaptively, rather than truthfully or even rationally
- Adverse selection
- The curse of expertise
- Dunning–Kruger effect – Cognitive bias in which people with low ability at a task overestimate their ability
- Einstellung effect
- Empathy gap
- False consensus effect – Attributional type of cognitive bias
- Naive realism
- Zone of proximal development
References
- Kennedy, Jane (1995). "Debiasing the Curse of Knowledge in Audit Judgment". The Accounting Review. 70 (2): 249–273. JSTOR 248305.
- Hinds, Pamela J. (1999). "The curse of expertise: The effects of expertise and debiasing methods on prediction of novice performance". Journal of Experimental Psychology: Applied. 5 (2): 205–221. doi:10.1037/1076-898X.5.2.205.
- Wieman, Carl (2007). "The 'Curse of Knowledge', or Why Intuition About Teaching Often Fails" (PDF). APS News. 16 (10). Archived from the original on 2016-04-10.
- Froyd, Jeff; Layne, Jean (2008). "Faculty development strategies for overcoming the "curse of knowledge"". 2008 38th Annual Frontiers in Education Conference. doi:10.1109/FIE.2008.4720529. ISBN 978-1-4244-1969-2.
- Camerer, Colin; Loewenstein, George; Weber, Martin (1989). "The Curse of Knowledge in Economic Settings: An Experimental Analysis" (PDF). Journal of Political Economy. 97 (5): 1232–1254. CiteSeerX 10.1.1.475.3740. doi:10.1086/261651. Archived from the original on 2015-03-06.
- Fischhoff, Baruch (1975). "Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty". Journal of Experimental Psychology: Human Perception and Performance. 1 (3): 288–299. doi:10.1037/0096-1523.1.3.288. Reprinted: Fischhoff, Baruch (2003). "Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty". Qual Saf Health Care. 12 (4): 304–11. doi:10.1136/qhc.12.4.304. PMC 1743746. PMID 12897366.
- Myerson, Roger B. (1986). "Negotiation in Games: A Theoretical Overview". In Uncertainty, Information, and Communication: Essays in Honor of Kenneth J. Arrow, vol. 3, edited by Walter P. Heller, Ross M. Starr, and David A. Starrett. New York: Cambridge University Press.
- Forsythe, Robert; Kennan, John; Sopher, Barry (1991). "An Experimental Analysis of Strikes in Bargaining Games with One-Sided Private Information" (PDF). The American Economic Review. 81 (1): 253–278. JSTOR 2006799. Archived from the original on 2016-05-08.
- Banks, Jeff; Camerer, Colin F.; Porter, David (1988). "Experimental Tests of Nash Refinements in Signaling Games". Working paper. Philadelphia: University of Pennsylvania, Department of Decision Sciences.
- Heath, Chip; Heath, Dan (Dec 2006). "The Curse of Knowledge". Harvard Business Review. Retrieved 26 April 2016.
- Newton, Elizabeth Louise (1990). The rocky road from actions to intentions. PhD dissertation, Stanford University.
- Birch, S. A. J.; Bloom, P. (2007). "The Curse of Knowledge in Reasoning About False Beliefs" (PDF). Psychological Science. 18 (5): 382–386. CiteSeerX 10.1.1.583.5677. doi:10.1111/j.1467-9280.2007.01909.x. PMID 17576275. Archived from the original on 2016-05-07.
- Ryskin, Rachel A.; Brown-Schmidt, Sarah (25 March 2014). "Do Adults Show a Curse of Knowledge in False-Belief Reasoning? A Robust Estimate of the True Effect Size". PLOS ONE. 9 (3): e92406. Bibcode:2014PLoSO...992406R. doi:10.1371/journal.pone.0092406. PMC 3965426. PMID 24667826.
- Birch, Susan A. J.; Bernstein, Daniel M. (2007). "What Can Children Tell Us About Hindsight Bias: A Fundamental Constraint on Perspective-Taking?" (PDF). Social Cognition. 25 (1): 98–113. CiteSeerX 10.1.1.321.4788. doi:10.1521/soco.2007.25.1.98. Archived from the original on 2016-05-07.