Cherry picking, suppressing evidence, or the fallacy of incomplete evidence is the act of pointing to individual cases or data that seem to confirm a particular position while ignoring a significant portion of related cases or data that may contradict it. It is a kind of fallacy of selective attention, the most common example of which is confirmation bias. Cherry picking may be committed intentionally or unintentionally, and it is a major problem in public debate.
The term is based on the perceived process of harvesting fruit, such as cherries. The picker would be expected to select only the ripest and healthiest fruits. An observer who sees only the selected fruit may thus wrongly conclude that most, or even all, of the tree's fruit is in equally good condition. This can also give a false impression of the quality of the fruit, since the selection is not a representative sample.
Cherry picking has a negative connotation as it directly suppresses evidence that could lead to a more complete picture.
A concept sometimes confused with cherry picking is the idea of gathering only the fruit that is easy to harvest, while ignoring other fruit that is higher up on the tree and thus more difficult to obtain (see low-hanging fruit).
Cherry picking can be found in many logical fallacies. For example, the "fallacy of anecdotal evidence" tends to overlook large amounts of data in favor of data known personally, "selective use of evidence" rejects material unfavorable to an argument, while a false dichotomy picks only two options when more are available. Cherry picking can also refer to the selection of data or data sets so that a study or survey will give desired, predictable results, which may be misleading or even completely contrary to reality.
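The statistical effect described above can be made concrete with a small sketch. The numbers below are hypothetical, not drawn from any study mentioned in this article; the point is only that reporting a favorable subset of results can reverse the impression given by the full data set.

```python
# Hypothetical illustration of cherry-picked data (invented numbers):
# seven trial outcomes, where negative values count against the claim.
results = [-2.0, -1.5, -0.5, 0.2, 1.8, 2.0, 2.4]

full_mean = sum(results) / len(results)

# A cherry-picker reports only the trials that support a positive effect,
# discarding every unfavorable outcome.
favorable = [r for r in results if r > 0]
picked_mean = sum(favorable) / len(favorable)

print(f"all trials:     mean = {full_mean:+.2f}")
print(f"favorable only: mean = {picked_mean:+.2f}")
```

Run on the full data, the mean is close to zero (about +0.34), while the cherry-picked subset suggests a strong effect (+1.60) — the facts reported are individually true, but the omission is what misleads.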
Choosing to make selective choices among competing evidence, so as to emphasize those results that support a given position, while ignoring or dismissing any findings that do not support it, is a practice known as "cherry picking" and is a hallmark of poor science or pseudo-science.
— Richard Somerville, testimony before the US House of Representatives Committee on Energy and Commerce Subcommittee on Energy and Power, March 8, 2011
Rigorous science looks at all the evidence (rather than cherry picking only favorable evidence), controls for variables so as to identify what is actually working, uses blinded observations so as to minimize the effects of bias, and uses internally consistent logic.
— Steven Novella, "A Skeptic In Oz", April 26, 2011
In a 2002 study, researchers "reviewed 31 antidepressant efficacy trials to identify the primary exclusion criteria used in determining eligibility for participation. Their findings suggest that patients in current antidepressant trials represent only a minority of patients treated in routine clinical practice for depression. Excluding potential clinical trial subjects with certain profiles means that the ability to generalize the results of antidepressant efficacy trials lacks empirical support, according to the authors."
In argumentation, the practice of "quote mining" is a form of cherry picking in which the debater selectively picks some quotes supporting a position (or exaggerating an opposing position) while ignoring those that moderate the original quote or put it into a different context. Cherry picking in debates is a serious problem because the selected facts themselves are true but are stripped of the context they need. Because verification cannot be done live and often comes late, cherry-picked facts or quotes tend to stick in the public mainstream and, even when later corrected, lead to widespread misrepresentation of the groups targeted.
A one-sided argument (also known as card stacking, stacking the deck, ignoring the counterevidence, slanting, and suppressed evidence) is an informal fallacy that occurs when only the reasons supporting a proposition are supplied, while all reasons opposing it are omitted.
Peter Suber has written: "The one-sidedness fallacy does not make an argument invalid. It may not even make the argument unsound. The fallacy consists in persuading readers, and perhaps ourselves, that we have said enough to tilt the scale of evidence and therefore enough to justify a judgment. If we have been one-sided, though, then we haven't yet said enough to justify a judgment. The arguments on the other side may be stronger than our own. We won't know until we examine them. So the one-sidedness fallacy doesn't mean that your premises are false or irrelevant, only that they are incomplete."
"With rational messages, you need to decide if you want to use a one-sided argument or a two-sided argument. A one-sided argument only presents the pro side of the argument, while a two-sided argument presents both sides. Which one you use will depend on which one meets your needs and the type of audience. Generally, one-sided arguments are better with audiences already favorable to your message. Two-sided arguments are best with audiences who are opposed to your argument, are better educated or have already been exposed to counter arguments."
Card stacking is a propaganda technique that seeks to manipulate audience perception of an issue by emphasizing one side and repressing another. Such emphasis may be achieved through media bias or the use of one-sided testimonials, or by simply censoring the voices of critics. The technique is commonly used in persuasive speeches by political candidates to discredit their opponents and to make themselves seem more worthy.
The term originates from the magician's gimmick of "stacking the deck", which involves presenting a deck of cards that appears to have been randomly shuffled but which is, in fact, 'stacked' in a specific order. The magician knows the order and is able to control the outcome of the trick. In poker, cards can be stacked so that certain hands are dealt to certain players.
The phenomenon can be applied to any subject and has wide applications. Whenever a broad spectrum of information exists, appearances can be rigged by highlighting some facts and ignoring others. Card stacking can be a tool of advocacy groups or of those groups with specific agendas. For example, an enlistment poster might focus upon an impressive picture, with words such as "travel" and "adventure", while placing the words, "enlist for two to four years" at the bottom in a smaller and less noticeable point size.
- The Internet Encyclopedia of Philosophy, "Fallacies", Bradley Dowden (2010)
- Cherry Picking
- Klass, Gary (c. 2008). "Just Plain Data Analysis: Common Statistical Fallacies in Analyses of Social Indicator Data" (PDF). Department of Politics and Government, Illinois State University. statlit.org. Archived from the original (PDF) on March 25, 2014. Retrieved March 25, 2014.
- Goldacre, Ben (2008). Bad Science. HarperCollins Publishers. pp. 97–99. ISBN 978-0-00-728319-4.
- "Devious deception in displaying data: Cherry picking", Science or Not, April 3, 2012, retrieved 16 February 2015
- Novella, Steven (26 April 2011). "A Skeptic In Oz". Science-Based Medicine. Retrieved 16 February 2015.
- "Typical Depression Patients Excluded from Drug Trials; exclusion criteria: is it "cherry picking"?". The Brown University Psychopharmacology Update. Wiley Periodicals. 13 (5): 1–3. May 2002. ISSN 1068-5308. Based on the studies:
- Posternak, MA; Zimmerman, M; Keitner, GI; Miller, IW (February 2002). "A reevaluation of the exclusion criteria used in antidepressant efficacy trials". The American Journal of Psychiatry. 159 (2): 191–200. doi:10.1176/appi.ajp.159.2.191. PMID 11823258.
- Zimmerman, M; Mattia, JI; Posternak, MA (March 2002). "Are subjects in pharmacological treatment trials of depression representative of patients in routine clinical practice?". The American Journal of Psychiatry. 159 (3): 469–73. doi:10.1176/appi.ajp.159.3.469. PMID 11870014.
- "One-Sidedness - The Fallacy Files". Retrieved 14 October 2014.
- Peter Suber. "The One-Sidedness Fallacy". Retrieved 25 September 2012.
- The Fine Art of Propaganda: A Study of Father Coughlin's Speeches. Institute for Propaganda Analysis. Harcourt Brace and Company. 1939. pp. 95–101. Retrieved November 24, 2010.
- C. S. Kim, John (1993). The art of creative critical thinking. University Press of America. pp. 317–318. Retrieved November 24, 2010.
- Ruchlis, Hyman; Sandra Oddo (1990). Clear thinking: a practical introduction. Prometheus Books. pp. 195–196. Retrieved November 24, 2010.
- James, Walene (1995). Immunization: the reality behind the myth, Volume 3. Greenwood Publishing Group. pp. 193–194. Retrieved November 24, 2010.
- Shabo, Magedah (2008). Techniques of Propaganda and Persuasion. Prestwick House Inc. pp. 24–29. Retrieved November 24, 2010.
- The One-Sidedness Fallacy - Peter Suber, Philosophy Department, Earlham College
- Developing a Promotional Strategy - Michigan State University, Extension Bulletin E-1939
- A characterization of the one-sidedness fallacy within the framework of the cognitive distortions, Paul Franceschi, 2009