Talk:Curse of knowledge


Concept Introduction

The first paragraph on the page does a poor job of clearly introducing the concept. I recommend changing the first paragraph to "The curse of knowledge is a cognitive bias that occurs when, in predicting others’ forecasts or behaviors, individuals are unable to ignore knowledge they have that others do not have, or when they are unable to disregard information already processed." KieraMolloy18 (talk) 01:20, 26 April 2016 (UTC)

I think that is a good edit!
Kmdoiron (talk) 04:03, 27 April 2016 (UTC)
Also, I'm not really sure why the vast majority of this conversation shows up on the edit-source page but no longer appears on the rendered talk page since I last edited earlier in the week...
Kmdoiron (talk) 04:11, 27 April 2016 (UTC)
I don't understand why a lot of this is not showing up on the preview page. If anyone is looking for more detail on the suggested changes, please click "edit" and read through the notes on this section. KieraMolloy18 (talk) 13:30, 29 April 2016 (UTC)


History of concept

While it is important to highlight the evolution of the concept, this section is very scattered and does not clearly outline the important points about the history of the curse of knowledge. Much of the section should be moved to "implications" because it has nothing to do with history. The following paragraphs are my suggestions for streamlining the main points of the original section; I have also removed sentences that are repetitive or confusing, given the section structure, and explained the evolution of the concept in relation to the researchers.

The term "curse of knowledge" was coined in a 1989 Journal of Political Economy article by economists Colin Camerer, George Loewenstein, and Martin Weber. The aim of their research was to counter the "conventional assumptions in such (economic) analyses of asymmetric information in that better-informed agents can accurately anticipate the judgement of less-informed agents."[1]
Such research drew from Baruch Fischhoff's 1975 work on hindsight bias, a cognitive bias whereby knowing the outcome of an event makes it seem more predictable than it actually was.[5] Research conducted by Fischhoff revealed that participants did not know that their outcome knowledge affected their responses, and that, even if they did know, they still could not ignore or defeat the effects of the bias.[5] Study participants could not accurately reconstruct their previous, less knowledgeable states, which directly relates to the curse of knowledge. Fischhoff theorized that this poor reconstruction occurred because the participant was "anchored in the hindsightful state of mind created by receipt of knowledge."[5] This receipt of knowledge returns to the idea of the curse proposed by Camerer, Loewenstein, and Weber: a knowledgeable person cannot accurately reconstruct what a person without the knowledge, be it themselves or someone else, would think, or how they would act. In his paper, Fischhoff questions this failure to empathize with ourselves in less knowledgeable states, and notes that how well people manage to reconstruct the perceptions of less-informed others is a crucial question for historians and "all human understanding."[5]
This research led economists Camerer, Loewenstein, and Weber to focus on the economic implications of the concept and to question whether the curse harms the allocation of resources in an economic setting.[1]

KieraMolloy18 (talk) 02:24, 26 April 2016 (UTC)

Could add information about the research that Camerer, Loewenstein, and Weber did, such as the methods, hypotheses, and results of the specific original experiments. Maybe add some years to make the history section more credibly historical. The same goes for the Fischhoff results: explaining his methods would further the reader's background/understanding of how the effect is tested.

References

  1. ^ Froyd, Jeff; Layne, Jean (2008). "Faculty Development Strategies for Overcoming the "Curse of Knowledge"". Retrieved 26 April 2016.

Experimental evidence

I would like to expand on the second paragraph of this section, since the tapping experiment is a classic example of curse of knowledge, and can be easily understood by readers.

A 1990 experiment by a Stanford graduate student, Elizabeth Newton, illustrated the curse of knowledge in the results of a simple task. One group of subjects "tapped" out well-known songs with their fingers, while another group tried to name the melodies. When the "tappers" were asked to predict how many of the tapped songs listeners would recognize, they consistently overestimated: tappers predicted that listeners would identify about half of the songs, but listeners succeeded only about 2.5 percent of the time. The curse of knowledge is demonstrated here: the "tappers" were so familiar with what they were tapping that they assumed listeners would easily recognize the tune.[1]

The Birch and Bloom experiment (2003) should be moved below the tapping experiment and reworded to be more easily followed by readers.

A study by Susan Birch and Paul Bloom in 2003 used the curse of knowledge concept to explain the idea that people's ability to reason about another person's actions is compromised by knowledge of the outcome of an event. The participant's perception of the plausibility of an event also mediated the extent of the bias. If the event was less plausible, knowledge was not as much of a "curse" as when there was a potential explanation for the way the other person could act.[6] In addition, and more recently, researchers have linked the curse of knowledge bias with false-belief reasoning in both children and adults, as well as with theory-of-mind development difficulties in children. KieraMolloy18 (talk) 02:43, 26 April 2016 (UTC)

References

  1. ^ Heath, Chip; Heath, Dan (Dec 2006). "The Curse of Knowledge". Harvard Business Review. Retrieved 26 April 2016.

Integration of Implications and Applications

The information in these sections seems to address the same questions, such as how the curse of knowledge applies to real-life situations. I suggest combining these sections and removing any information that simply restates previously mentioned material.

For example, I would like to remove the sentence, "Economists Camerer, Loewenstein, and Weber first applied the curse of knowledge phenomenon to economics, in order to explain why and how the assumption that better informed agents can accurately anticipate the judgments of lesser informed agents is not inherently true." KieraMolloy18 (talk) 21:03, 18 April 2016 (UTC)


The implication paragraphs are also worded in a convoluted way and should be rephrased to be more succinct and clear. I also agree that the two sections can be combined. You could also add how the curse of knowledge is prevalent in fields other than education and economics.


I would also consider adding a picture, as the page seems rather "unattractive" to the eye. Simple pictures/diagrams that explain the phenomenon could aid the reader's understanding.

Robin Hogarth and Coining of Expression

According to Steven Pinker, The Sense of Style, 322 n. 3: "The term 'curse of knowledge' was coined by Robin Hogarth and popularized by Camerer, Lowenstein, & Weber, 1989." — Preceding unsigned comment added by 68.174.70.78 (talk) 17:39, 2 June 2018 (UTC)

The authors themselves make this point: "This term was suggested by Robin Hogarth." (12333 n 1) [Colin Camerer, George Loewenstein, and Martin Weber, "The Curse of Knowledge in Economic Settings: An Experimental Analysis", The Journal of Political Economy, Vol. 97, No. 5 (Oct. 1989), pp. 1232-1254] — Preceding unsigned comment added by 68.174.70.78 (talk) 19:05, 2 June 2018 (UTC)

Newton and tapping

The article says:

A 1990 experiment by a Stanford graduate student, Elizabeth Newton, illustrated the curse of knowledge in the results of a simple task

Just where is this experiment written up? We're not told. However, a little web-searching provides mentions of and references to:

Elizabeth Newton. Overconfidence in the communication of intent: Heard and unheard melodies. PhD diss., Stanford University, 1990.

I can't find this. However, I can find

Elizabeth Louise Newton. The rocky road from actions to intentions. PhD diss, Stanford University, 1990.

-- it's here. And yes, it really has an entire chapter devoted to tapping.

This page of this book references both of these. What's going on here? More.coffy (talk) 08:54, 5 December 2018 (UTC)

Merger proposal

I propose to merge the article on the curse of expertise into this article, or better, to replace it with a redirect to this article. I think that the former article needs much improvement anyway. For instance, it nowhere unambiguously lays out exactly what topic(s) it's intended to be about. Furthermore, at least one of its references—the 1999 article by Hinds—is about precisely the same topic as this article.—PaulTanenbaum (talk) 17:29, 14 February 2020 (UTC)

Not really my area of expertise, but it sounds like a reasonable improvement/simplification. How about «the curse of expert knowledge»? 08:48, 15 February 2020 (UTC)Timpo (talk)
What are your reasons for the merge? Why do you think both articles are talking about the same topic? On a cursory examination they appear to be different, but I could be convinced. --Danielklein (talk) 00:02, 16 December 2020 (UTC)

No MurrayScience (talk) 01:33, 13 December 2020 (UTC)

Support the proposal as initially proposed. My reading of the two articles is that they are describing synonyms for the same concept; in either case, the ideas are so closely related that they are best discussed in one place. There is useful complementary content on the two pages, and hence readers would be better served by consolidation on one page. Klbrain (talk) 14:55, 6 March 2021 (UTC)

An overlooked part of the Dunning–Kruger effect

Basically the same phenomenon is the widely overlooked inverse part of the Dunning–Kruger effect: not only do people with low competence ("beginners") overestimate their own competence, but people with very high competence ("experts") in turn underestimate theirs. Think of a subject you know really well, and you may also remember being surprised at what basic knowledge about that subject average people or laypeople completely lack. As an experienced Wikipedian, for example, you may be inclined to assume that average people have an understanding of or familiarity with basic Wikipedia concepts, principles, and fundamental facts about how Wikipedia works – which you take for granted after having gotten used to them for years – even though they generally (or at least very frequently) don't. --Florian Blaschke (talk) 21:12, 26 February 2024 (UTC)