Misinformation is false or inaccurate information. Examples of misinformation include false rumors, insults and pranks, while examples of more deliberate disinformation include malicious content such as hoaxes, spearphishing and propaganda. News parody or satire may also become misinformation if it is taken seriously by the unwary and spread as if it were true. The terms "misinformation" and "disinformation" have been associated with the neologism "fake news," defined by some scholars as "fabricated information that mimics news media content in form but not in organizational process or intent."
The history of misinformation, along with that of disinformation and propaganda, is tied up with the history of mass communication itself. Early examples cited in a 2017 article by Robert Darnton are the insults and smears spread among political rivals in Imperial and Renaissance Italy in the form of "pasquinades," anonymous and witty verses named for the Pasquino piazza and "talking statue" in Rome, and in pre-revolutionary France as "canards," or printed broadsides that sometimes included an engraving to help convince readers to take their wild tales seriously.
The spread in Europe and North America of Johannes Gutenberg's mechanized printing press increased the opportunities to spread English-language misinformation. In 1835, the New York Sun published the first large-scale news hoax, known as the "Great Moon Hoax," a series of six articles claiming to describe life on the Moon, "complete with illustrations of humanoid bat-creatures and bearded blue unicorns." The fast pace and sometimes strife-filled work of mass-producing news broadsheets also led to copies rife with careless factual errors, such as the Chicago Tribune's infamous 1948 headline "Dewey Defeats Truman."
In the so-called Information Age, social networking sites have become a notable vector for the spread of misinformation, "fake news" and propaganda. These sites provide users with the capabilities to spread information quickly to other users without requiring the permission of a gatekeeper such as an editor, who might otherwise require confirmation of its truth before allowing its publication. Journalists today are criticized for helping to spread false information on these platforms, but research such as that from Starbird et al. and Arif et al. shows they also play a role in curbing the spread of misinformation on social media through debunking and denying false rumors.
According to Anne Mintz, editor of Web of Deception: Misinformation on the Internet, the best way to determine whether information is factual is to use common sense. Mintz advises readers to check whether the information makes sense and whether the founders or reporters of the websites spreading it are biased or have an agenda. Professional journalists and researchers cross-check such information against other sites (particularly verified sources such as news channels), since information reviewed by multiple people and heavily researched tends to provide more concrete details.
Martin Libicki, author of Conquest in Cyberspace: National Security and Information Warfare, noted that working with misinformation requires readers to strike a balance between what is correct and what is incorrect: they cannot be gullible, but neither should they be so paranoid as to assume that all information is wrong. Even readers who strike this balance may mistake an error for the truth or disregard factual information as incorrect. According to Libicki, readers' prior beliefs or opinions also affect how they interpret new information: readers who already believe something to be true are more likely to accept information that supports those prior beliefs. This phenomenon can lead even readers who are otherwise skilled at evaluating credible sources and facts to believe misinformation.
Misinformation is spread for numerous reasons, some of which are not the result of an attempt to deceive but of carelessness, cognitive bias and/or social and work pressures. The next sections discuss the role of social media dynamics, the lack of internet gatekeepers, bad information from media sources, and competition in news and media.
Social media and misinformation
Contemporary social media platforms offer a rich ground for the spread of misinformation. Combatting its spread is difficult for two reasons: the profusion of information sources, and the generation of "echo chambers." The profusion of information sources makes the reader's task of weighing the reliability of information more challenging, heightened by the untrustworthy social signals that accompany such information. The inclination of people to follow or support like-minded individuals leads to the formation of echo chambers and filter bubbles. With no differing information to counter the untruths, and with general agreement within isolated social clusters, some writers argue the outcome is a dearth, or even the absence, of a collective reality.
Lack of Internet gatekeepers
Because of the decentralized nature and structure of the Internet, writers can easily publish content without being required to subject it to peer review, prove their qualifications, or provide backup documentation. Whereas a book found in a library generally has been reviewed and edited by a second person, Internet sources cannot be assumed to be vetted by anyone other than their authors. They may be produced and posted as soon as the writing is finished.
Bad information from media sources
An example of bad information from media sources that led to the spread of misinformation occurred in November 2005, when Chris Hansen of Dateline NBC claimed that law enforcement officials estimate 50,000 predators are online at any given moment. Afterwards, the U.S. attorney general at the time, Alberto Gonzales, repeated the claim, citing Dateline's estimate. However, the number Hansen used in his reporting had no backing. Hansen said he received the figure from Dateline expert Ken Lanning, but Lanning admitted that he made up the number 50,000 because there was no solid data on the subject. According to Lanning, he chose 50,000 because it sounds like a real number, neither too big nor too small, and he referred to it as a "Goldilocks number." Reporter Carl Bialik has noted that the number 50,000 is often used in the media to estimate quantities when reporters are unsure of the exact data.
Competition in news and media
Because news organizations and websites hotly compete for viewers, there is a need for great efficiency in releasing stories to the public. News media companies broadcast stories 24 hours a day, and break the latest news in hopes of taking audience share from their competitors. News is also produced at a pace that does not always allow for fact-checking, or for all of the facts to be collected or released to the media at one time, letting readers or viewers insert their own opinions, and possibly leading to the spread of misinformation.
Misinformation can affect all aspects of life. When eavesdropping on a conversation, one can gather facts that may not always be true, or the listener may hear the message incorrectly and spread the information to others. On the Internet, one can read content that is presented as factual but that may not have been checked or may be erroneous. In the news, companies may emphasize the speed at which they receive and relay information but may not always get the facts right.
In regard to politics, some view a misinformed citizen as worse than an uninformed one. Misinformed citizens can state their beliefs and opinions with confidence and in turn affect elections and policies. This type of misinformation arises when speakers are not upfront and straightforward. When information is presented as vague, ambiguous, sarcastic, or partial, receivers are forced to piece it together and assume what is correct.
Websites have been created to help people discern fact from fiction. For example, the site FactCheck.org has a mission to fact-check the media, especially politicians' speeches and stories going viral on the internet. The site also includes a forum where people can openly ask questions about information they are not sure is true, in the media and on the internet. Other sites, such as Wikipedia and Snopes.com, are also important resources for verifying information.
Some scholars and activists are pioneering a movement to eliminate mis- and disinformation and information pollution in the digital world. The theory they are developing, "information environmentalism," has become a curriculum in some universities and colleges.
- List of common misconceptions
- Character assassination
- Defamation (also known as "slander")
- Counter Misinformation Team
- Junk science
- Flat earth
- Social engineering (in political science and cybercrime)
- Information environmentalism
- "Definition of misinformation". Merriam-Webster Dictionary Online. Retrieved 2019-02-24.
- Lazer, David M. J.; Baum, Matthew A.; Benkler, Yochai; Berinsky, Adam J.; Greenhill, Kelly M.; Menczer, Filippo; Metzger, Miriam J.; Nyhan, Brendan; Pennycook, Gordon; Rothschild, David; Schudson, Michael; Sloman, Steven A.; Sunstein, Cass R.; Thorson, Emily A.; Watts, Duncan J.; Zittrain, Jonathan L. (2018). "The science of fake news". Science. 359 (6380): 1094–1096. doi:10.1126/science.aao2998. PMID 29590025.
- "A short guide to the history of 'fake news' and disinformation". International Center for Journalists. Retrieved 2019-02-24.
- "The True History of Fake News". The New York Review of Books. Retrieved 2019-02-24.
- Vosoughi, Soroush; Roy, Deb; Aral, Sinan (2018-03-09). "The spread of true and false news online". Science. 359 (6380): 1146–1151. doi:10.1126/science.aap9559.
- Tucker, Joshua A.; Guess, Andrew; Barbera, Pablo; Vaccari, Cristian; Siegel, Alexandra; Sanovich, Sergey; Stukal, Denis; Nyhan, Brendan. "Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature". Hewlett Foundation White Paper.
- Starbird, Kate; Dailey, Dharma; Mohamed, Owla; Lee, Gina; Spiro, Emma (2018). "Engage Early, Correct More: How Journalists Participate in False Rumors Online during Crisis Events". Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). Retrieved 2019-02-24.
- Arif, Ahmer; Robinson, John; Stanek, Stephanie; Fichet, Elodie; Townsend, Paul; Worku, Zena; Starbird, Kate (2017). "A Closer Look at the Self-Correcting Crowd: Examining Corrections in Online Rumors" (PDF). Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '17): 155–169. doi:10.1145/2998181.2998294. ISBN 9781450343350. Retrieved 25 February 2019.
- Mintz, Anne. "The Misinformation Superhighway?". PBS. Retrieved 26 February 2013.
- "Towards automated real-time detection of misinformation on Twitter - IEEE Conference Publication". ieeexplore.ieee.org. Retrieved 2018-10-16.
- Libicki, Martin (2007). Conquest in Cyberspace: National Security and Information Warfare. New York: Cambridge University Press. pp. 51–55. ISBN 9780521871600.
- Messerole, Chris. "How misinformation spreads on social media—And what to do about it". Brookings Institute. Retrieved 24 February 2019.
- Benkler, Y. (2017). "Study: Breitbart-led rightwing media ecosystem altered broader media agenda". Retrieved 8 June 2018.
- Stapleton, Paul (2003). "Assessing the quality and bias of web-based sources: implications for academic writing". Journal of English for Academic Purposes. 2 (3): 229–245. doi:10.1016/S1475-1585(03)00026-2. Retrieved March 21, 2013.
- Gladstone, Brooke (2012). The Influencing Machine. New York: W. W. Norton & Company. pp. 49–51. ISBN 978-0393342468.
- Croteau; et al. "Media Technology" (PDF): 285–321. Retrieved March 21, 2013.
- Barker, David (2002). Rushed to Judgment: Talk Radio, Persuasion, and American Political Behavior. New York: Columbia University Press. pp. 106–109.
- "Our Mission". www.factcheck.org. Retrieved 2016-03-31.
- "Ask FactCheck". www.factcheck.org. Retrieved 2016-03-31.
- "Info-Environmentalism: An Introduction". Retrieved 2018-09-28.
- "Information Environmentalism". Digital Learning and Inquiry (DLINQ). 2017-12-21. Retrieved 2018-09-28.
- Bakir, V. & McStay, A. (2017). Fake News and The Economy of Emotions: Problems, causes, solutions. Digital Journalism, 1–22. https://doi.org/10.1080/21670811.2017.1345645
- Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives, 31(2), 211–236. https://doi.org/10.1257/jep.31.2.211
- Baillargeon, Normand (4 January 2008). A short course in intellectual self-defense. Seven Stories Press. ISBN 978-1-58322-765-7. Retrieved 22 June 2011.
- Christopher Murphy (2005). Competitive Intelligence: Gathering, Analysing And Putting It to Work. Gower Publishing, Ltd. pp. 186–189. ISBN 0-566-08537-2. — a case study of misinformation arising from simple error
- Jürg Strässler (1982). Idioms in English: A Pragmatic Analysis. Gunter Narr Verlag. pp. 43–44. ISBN 3-87808-971-6.
- Christopher Cerf, Victor Navasky (1984). The Experts Speak: The Definitive Compendium of Authoritative Misinformation. Pantheon Books.