
Filter bubbles result from personalized searches, in which a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click behavior, and search history.[1][2][3] As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles.[4] The choices made by these algorithms are not transparent. Prime examples include Google Personalized Search results and Facebook's personalized news stream. The bubble effect may have negative implications for civic discourse, according to Pariser, but contrasting views regard the effect as minimal[5] and addressable.[6] The surprising results of the 2016 U.S. presidential election have been associated with the influence of social media platforms such as Twitter and Facebook,[7][8] calling into question the effects of the filter bubble phenomenon on user exposure to fake news and echo chambers,[9] spurring new interest in the term,[10] and raising concern that the phenomenon may harm democracy.[10][11][12]

[Technology such as social media] lets you go off with like-minded people, so you're not mixing and sharing and understanding other points of view ... It's super important. It's turned out to be more of a problem than I, or many others, would have expected.

— Bill Gates, 2017, in Quartz[13]

Concept

 
Social media, seeking to please users, can shunt information toward what they guess their users will like hearing, inadvertently isolating users into their own filter bubbles, according to Pariser.

The term was coined by internet activist Eli Pariser in his book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. He related an example in which one user searched Google for "BP" and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill, noting that the two search results pages were "strikingly different".[14][15][16][5]

Pariser defined his concept of a filter bubble in more formal terms as "that personal ecosystem of information that's been catered by these algorithms".[14] Other terms have been used to describe this phenomenon, including "ideological frames"[15] and "a figurative sphere surrounding you as you search the Internet".[17] A user's past search history is built up over time as they indicate interest in topics by "clicking links, viewing friends, putting movies in your queue, reading news stories", and so forth.[17] An Internet firm then uses this information to target advertising to the user, or to make certain kinds of information appear more prominently in search results pages.[17]

Pariser's idea of the filter bubble was popularized after the TED talk he gave in May 2011, in which he gave examples of how filter bubbles work and where they can be seen.[18] According to Pariser, these bubbles are created by algorithms that use 57 different signals to determine search results, including "[the] computer being used", "where you're sitting", "the browser doing the surfing", and more. In an attempt to demonstrate the filter bubble, Pariser asked two friends to search the word "Egypt" on Google and send him the results. The two received completely different search results pages: one focused on the political tensions in the country at the time, the other on vacation advertisements.
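
The actual signals and weights Google uses are not public, so the general mechanism Pariser describes can only be illustrated with a toy sketch. In the following Python example everything is a hypothetical assumption made for illustration — the UserSignals fields, the personalization_score function, and the click-history weights — not Google's algorithm:

```python
from dataclasses import dataclass, field

@dataclass
class UserSignals:
    """A hypothetical subset of the kinds of signals Pariser describes."""
    location: str                 # "where you're sitting"
    browser: str                  # "the browser doing the surfing"
    click_history: dict = field(default_factory=dict)  # topic -> past clicks

def personalization_score(result_topics, user):
    """Score a result higher the more its topics match the user's past clicks."""
    return sum(user.click_history.get(topic, 0) for topic in result_topics)

def personalize(results, user):
    """Re-rank the same generic results differently for each user."""
    return sorted(results, key=lambda r: personalization_score(r["topics"], user),
                  reverse=True)

# Two users issue the same query ("Egypt") but see different orderings.
results = [
    {"title": "Political tensions in Egypt", "topics": ["politics", "egypt"]},
    {"title": "Egypt vacation deals",        "topics": ["travel", "egypt"]},
]
politics_reader = UserSignals("US", "Firefox", {"politics": 12})
traveler        = UserSignals("US", "Chrome",  {"travel": 9})
print([r["title"] for r in personalize(results, politics_reader)])
print([r["title"] for r in personalize(results, traveler)])
```

Running the sketch prints a politics-first ordering for one user and a travel-first ordering for the other, mirroring Pariser's "Egypt" anecdote.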

In The Filter Bubble, Pariser warns that a potential downside to filtered searching is that it "closes us off to new ideas, subjects, and important information"[19] and "creates the impression that our narrow self-interest is all that exists".[15] In his view, filter bubbles are potentially harmful to both individuals and society. He criticized Google and Facebook for offering users "too much candy, and not enough carrots".[20] He warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook.[20] According to Pariser, the detrimental effects of filter bubbles include harm to the general society by possibly "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation".[15] He wrote:

A world constructed from the familiar is a world in which there's nothing to learn ... [since there is] invisible autopropaganda, indoctrinating us with our own ideas.

— Eli Pariser in The Economist, 2011[21]

A filter bubble has been described as exacerbating a phenomenon that has been called splinternet or cyberbalkanization,[22] which happens when the Internet becomes divided into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views; the term cyberbalkanization was coined in 1996.[23][24][25]

Although his speech did not employ the adjective "filter", President Obama's farewell address identified a similar concept to filter bubbles as a "threat to [Americans'] democracy", i.e., the "retreat into our own bubbles, ...especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions... And increasingly we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there."[26]

Reactions

There are conflicting reports about the extent to which personalized filtering happens and whether such activity is beneficial or harmful. Writing in Slate, analyst Jacob Weisberg conducted a small, non-scientific experiment to test Pariser's theory: five associates with different ideological backgrounds ran exactly the same searches, and the results were nearly identical across four different queries, suggesting that a filter bubble was not in effect. This led him to conclude that the notion of all people "feeding at the trough of a Daily Me" was overblown.[15] A scientific study from Wharton that analyzed personalized recommendations also found that these filters can actually create commonality, not fragmentation, in online music taste.[27] Consumers apparently use the filters to expand their taste, not limit it.[27] Book reviewer Paul Boutin performed a similar experiment among people with differing search histories and, like Weisberg, found nearly identical search results.[5] Harvard law professor Jonathan Zittrain disputed the extent to which personalization filters distort Google search results, saying that "the effects of search personalization have been light".[15] Further, users can shut off personalization features on Google if they choose,[28] for example by deleting their web history.[5] A spokesperson for Google suggested that algorithms were added to Google search engines to deliberately "limit personalization and promote variety".[15]

While algorithms do limit political diversity, some of the filter bubble is the result of user choice.[29] A study by data scientists at Facebook found that for every four Facebook friends who share a user's ideology, the user has one friend with contrasting views.[30][31] No matter what Facebook's News Feed algorithm is, people are simply more likely to befriend or follow people who share similar beliefs.[30] The algorithm ranks stories based on a user's history, which reduces "politically cross-cutting content by 5 percent for conservatives and 8 percent for liberals".[30] However, even when people are given the option to click on a link offering contrasting views, they still default to their most-viewed sources.[30] "[U]ser choice decreases the likelihood of clicking on a cross-cutting link by 17 percent for conservatives and 6 percent for liberals."[30]
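
Facebook's actual ranking model is not public; the toy simulation below only illustrates the qualitative claim that ranking a feed by a user's history reduces exposure to cross-cutting content. The fixed alignment boost, the story labels, and the feed data are assumptions made for illustration:

```python
import random

def engagement_rank(stories, user_lean, history_boost=0.3):
    """Toy feed ranking: stories aligned with the user's inferred lean get a
    fixed score boost. The boost size is an illustrative assumption."""
    def score(story):
        aligned = story["lean"] == user_lean
        return random.random() + (history_boost if aligned else 0.0)
    return sorted(stories, key=score, reverse=True)

def cross_cutting_share(stories, user_lean, top_k=10):
    """Fraction of the top of the feed that opposes the user's lean."""
    top = stories[:top_k]
    return sum(1 for s in top if s["lean"] != user_lean) / len(top)

random.seed(0)
feed = [{"id": i, "lean": random.choice(["left", "right"])} for i in range(100)]
before = cross_cutting_share(feed, "left")
after = cross_cutting_share(engagement_rank(feed, "left"), "left")
print(f"cross-cutting share: {before:.0%} unranked -> {after:.0%} ranked")
```

Even a small per-story boost skews the top of the feed toward aligned content, which is the effect the study quantifies.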

There are reports that Google and other sites maintain vast amounts of information which might enable them to further personalize a user's Internet experience if they chose to do so. One account suggested that Google can keep track of users' past histories even if they do not have a personal Google account or are not logged into one.[5] One report stated that Google has collected "10 years worth" of information amassed from varying sources, such as Gmail, Google Maps, and other services besides its search engine,[16] although a contrary report held that trying to personalize the Internet for each user was technically challenging for an Internet firm to achieve despite the huge amounts of available web data. Analyst Doug Gross of CNN suggested that filtered searching seemed to be more helpful for consumers than for citizens: it would help a consumer searching for "pizza" find local delivery options, appropriately filtering out distant pizza stores.[16] There is agreement that outlets such as The Washington Post, The New York Times, and others are pushing efforts toward creating personalized information engines, with the aim of tailoring search results to those that users are likely to like or agree with.[15]

Several designers have developed tools to counteract the effects of filter bubbles.[32] Swiss radio station SRF voted the word Filterblase (the German translation of filter bubble) word of the year 2016.[33]

Countermeasures

By individuals

Users can take action to burst through their filter bubbles. Some make a conscious effort to evaluate what information they are exposing themselves to, thinking critically about whether they are engaging with a broad range of content.[34] Suggested steps to "re-engineer your internet diet" include assembling a personal "research team" of smart, insightful media figures who consume and produce valid, credible work, and examining one's own reading history to cut sources that are unverifiable or weak. On this view, users should change the psychology of how they approach media, acknowledging the biases they bring, rather than relying on technology to erase those biases. Technology can also be used to combat filter bubbles.[35] Chris Glushko, the VP of Marketing at IAB, advocates using fact-checking sites like Snopes.com to identify fake news.[36]

Websites such as allsides.com[37] and hifromtheotherside.com[38] aim to expose readers to different perspectives through diverse content. Some browser plug-ins are designed to help users step out of their filter bubbles and become aware of their personal perspectives by showing them content that contradicts their beliefs and opinions. For instance, Escape Your Bubble asks users to indicate the political party they want to be more informed about.[39] The plug-in then suggests articles from well-established sources relating to that party, encouraging users to become more educated about it.[39] In addition to plug-ins, there are apps created with the mission of encouraging users to open up their echo chambers. Read Across the Aisle is a news app that reveals whether or not users are reading from diverse news sources spanning multiple perspectives.[40] Each source is color-coded according to the political leaning of its articles.[40] When users read news from only one perspective, the app communicates that to them and encourages exploring sources with opposing viewpoints.[40] Although apps and plug-ins are tools humans can use, Eli Pariser stated that "certainly, there is some individual responsibility here to really seek out new sources and people who aren't like you".[29]

Since web-based advertising can further the effect of filter bubbles by exposing users to more of the same content, users can block much advertising by deleting their search history, turning off targeted ads, and downloading browser extensions.[41][42] Extensions such as Escape Your Bubble[43] for Google Chrome aim to help curate content and prevent users from being exposed only to biased information, while Mozilla Firefox extensions such as Lightbeam[44] and Self-Destructing Cookies[45] enable users to visualize how their data is being tracked and let them remove some of the tracking cookies. Some use anonymous search engines such as YaCy, DuckDuckGo,[46] StartPage,[47] and Disconnect[48] to prevent companies from gathering their web-search data. The Swiss daily Neue Zürcher Zeitung is beta-testing a personalized news engine app which uses machine learning to guess what content a user is interested in, while "always including an element of surprise"; the idea is to mix in stories a user is unlikely to have followed in the past.[49]
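
NZZ has not published details of its engine; the sketch below illustrates only the general idea of reserving part of a personalized stream for out-of-profile stories. The 20 percent quota, the topic labels, and the recommend function are hypothetical assumptions, not NZZ's actual parameters:

```python
import random

def recommend(stories, interests, n=25, surprise_quota=0.2):
    """Fill most of the stream from inferred interests, but reserve a fixed
    share for out-of-profile stories (the "element of surprise").
    The 20% quota is an illustrative assumption."""
    matching = [s for s in stories if s["topic"] in interests]
    surprising = [s for s in stories if s["topic"] not in interests]
    n_surprise = max(1, int(n * surprise_quota))
    picks = random.sample(matching, min(n - n_surprise, len(matching)))
    picks += random.sample(surprising, min(n_surprise, len(surprising)))
    random.shuffle(picks)  # avoid clustering the surprises at the end
    return picks

random.seed(1)
stories = [{"id": i, "topic": t}
           for i, t in enumerate(["politics", "economy", "sports", "culture"] * 20)]
stream = recommend(stories, interests={"politics", "economy"})
print(sum(s["topic"] in {"sports", "culture"} for s in stream), "surprise stories")
```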

The European Union is taking measures to lessen the impact of the filter bubble. The European Parliament is sponsoring inquiries into how filter bubbles affect people's ability to access diverse news.[50] Additionally, it introduced a program aimed at educating citizens about social media.[51] In the U.S., a CSCW panel suggested the use of news aggregator apps to broaden media consumers' news intake. News aggregator apps scan current news articles and direct readers to different viewpoints on a given topic. Users can also use a diversity-aware news balancer which visually shows whether they are leaning left or right in their news reading, indicating a right lean with a larger red bar and a left lean with a larger blue bar. A study evaluating this news balancer found "a small but noticeable change in reading behavior, toward more balanced exposure, among users seeing the feedback, as compared to a control group".[52]
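
As a rough illustration of how such a balancer's feedback could be computed, the text-mode sketch below tallies the lean of read articles and draws a proportional bar. The lean labels and the rendering are hypothetical, not the actual widget from the study:

```python
def balance_bar(read_articles, width=20):
    """Render a text version of a left/right reading-balance indicator.
    Each article carries a hypothetical 'lean' label; the bar grows toward
    the side the reader favors."""
    left = sum(1 for a in read_articles if a["lean"] == "left")
    right = sum(1 for a in read_articles if a["lean"] == "right")
    total = left + right or 1
    left_cells = round(width * left / total)
    return "L " + "#" * left_cells + "-" * (width - left_cells) + " R"

history = [{"lean": "left"}] * 7 + [{"lean": "right"}] * 3
print(balance_bar(history))  # L ##############------ R
```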

By media companies

In light of recent concerns about information filtering on social media, Facebook acknowledged the presence of filter bubbles and has taken strides toward removing them.[53] In January 2017, Facebook removed personalization from its Trending Topics list in response to problems with some users not seeing highly talked-about events there.[54] Facebook's strategy is also to revamp the Related Articles feature it implemented in 2013, which posted related news stories after the user read a shared article; the revamped feature would instead show articles from different perspectives on the same topic. Facebook is also attempting a vetting process whereby only articles from reputable sources are shown. Along with the founder of Craigslist and a few others, Facebook has invested $14 million in efforts "to increase trust in journalism around the world, and to better inform the public conversation".[55] The idea is that even if people are only reading posts shared by their friends, at least those posts will be credible.

Ethical implications

As the popularity of cloud services increases, personalized algorithms used to construct filter bubbles are expected to become more widespread.[56] Scholars have begun considering the effect of filter bubbles on the users of social media from an ethical standpoint, particularly concerning the areas of personal freedom, security, and information bias.[57]

Filter bubbles in popular social media and personalized search sites can determine the particular content seen by users, often without their direct consent or cognizance,[56] due to the algorithms used to curate that content. Critics of the use of filter bubbles speculate that individuals may lose autonomy over their own social media experience and have their identities socially constructed as a result of the pervasiveness of filter bubbles.[56]

Technologists, social media engineers, and computer specialists have also examined the prevalence of filter bubbles.[58] Mark Zuckerberg, founder of Facebook, and Eli Pariser, author of The Filter Bubble, have expressed concerns regarding the risks of privacy and information polarization.[59][60] The information of users of personalized search engines and social media platforms is not private, though some people believe it should be.[59] The concern over privacy has resulted in a debate as to whether or not it is moral for information technologists to take users' online activity and manipulate future exposure to related information.[60]

Since the content seen by individual social media users is influenced by algorithms that produce filter bubbles, users of social media platforms are more susceptible to confirmation bias,[61] and may be exposed to biased, misleading information.[62] Social sorting and other unintentional discriminatory practices are also anticipated as a result of personalized filtering.[63]

See also

References

  1. ^ Bozdag, Engin (23 June 2013). "Bias in algorithmic filtering and personalization". Ethics and Information Technology. 15 (3): 209–227. 
  2. ^ Web bug (slang)
  3. ^ Website visitor tracking
  4. ^ "Are Filter-bubbles Shrinking Our Minds?". The Huffington Post. 
  5. ^ a b c d e Boutin, Paul (May 20, 2011). "Your Results May Vary: Will the information superhighway turn into a cul-de-sac because of automated filters?". The Wall Street Journal. Retrieved August 15, 2011. By tracking individual Web browsers with cookies, Google has been able to personalize results even for users who don't create a personal Google account or are not logged into one. ... 
  6. ^ Zhang, Yuan Cao; Ó Séaghdha, Diarmuid; Quercia, Daniele; Jambor, Tamas (February 2012). "Auralist: Introducing Serendipity into Music Recommendation" (PDF). ACM WSDM. 
  7. ^ "The author of The Filter Bubble on how fake news is eroding trust in journalism". The Verge. 2016-11-16. Retrieved 2017-04-19. 
  8. ^ Baer, Drake. "The ‘Filter Bubble’ Explains Why Trump Won and You Didn’t See It Coming". Science of Us. Retrieved 2017-04-19. 
  9. ^ DiFranzo, Dominic; Gloria-Garcia, Kristine (2017-04-01). "Filter Bubbles and Fake News". XRDS. 23 (3): 32–35. ISSN 1528-4972. doi:10.1145/3055153. 
  10. ^ a b Jasper Jackson (8 January 2017). "Eli Pariser: activist whose filter bubble warnings presaged Trump and Brexit: Upworthy chief warned about dangers of the internet’s echo chambers five years before 2016’s votes". The Guardian. Retrieved March 3, 2017. ...“If you only see posts from folks who are like you, you’re going to be surprised when someone very unlike you wins the presidency,” Pariser tells the Guardian.... 
  11. ^ El-Bermawy, Mostafa M. (November 18, 2016). "Your Filter Bubble is Destroying Democracy". Wired. Retrieved March 3, 2017. ...The global village that was once the internet ... digital islands of isolation that are drifting further apart each day ... your experience online grows increasingly personalized ... 
  12. ^ Drake Baer (November 9, 2016). "The ‘Filter Bubble’ Explains Why Trump Won and You Didn’t See It Coming". New York Magazine. Retrieved March 3, 2017. ...Trump’s victory is blindsiding ... because, as media scholars understand it, we increasingly live in a “filter bubble”: The information we take in is so personalized that we’re blind to other perspectives.... 
  13. ^ Kevin J. Delaney (February 21, 2017). "Filter bubbles are a serious problem with news, says Bill Gates". Quartz. Retrieved March 3, 2017. ...Gates is one of a growing number of technology leaders wrestling with the issue of filter bubbles, ... 
  14. ^ a b Parramore, Lynn (October 10, 2010). "The Filter Bubble". The Atlantic. Retrieved April 20, 2011. Since Dec. 4, 2009, Google has been personalized for everyone. So when I had two friends this spring Google "BP," one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill.... 
  15. ^ a b c d e f g h Weisberg, Jacob (June 10, 2011). "Bubble Trouble: Is Web personalization turning us into solipsistic twits?". Slate. Retrieved August 15, 2011. 
  16. ^ a b c Gross, Doug (May 19, 2011). "What the Internet is hiding from you". CNN. Retrieved August 15, 2011. I had friends Google BP when the oil spill was happening. These are two women who were quite similar in a lot of ways. One got a lot of results about the environmental consequences of what was happening and the spill. The other one just got investment information and nothing about the spill at all. 
  17. ^ a b c Lazar, Shira (June 1, 2011). "Algorithms and the Filter Bubble Ruining Your Online Experience?". Huffington Post. Retrieved August 15, 2011. a filter bubble is the figurative sphere surrounding you as you search the Internet. 
  18. ^ "Beware online 'filter bubbles". 
  19. ^ "First Monday: What's on tap this month on TV and in movies and books: The Filter Bubble by Eli Pariser". USA Today. 2011. Retrieved April 20, 2011. Pariser explains that feeding us only what is familiar and comfortable to us closes us off to new ideas, subjects and important information. 
  20. ^ a b Bosker, Bianca (March 7, 2011). "Facebook, Google Giving Us Information Junk Food, Eli Pariser Warns". Huffington Post. Retrieved April 20, 2011. When it comes to content, Google and Facebook are offering us too much candy, and not enough carrots. 
  21. ^ "Invisible sieve: Hidden, specially for you". The Economist. 30 June 2011. Retrieved June 27, 2011. Mr Pariser’s book provides a survey of the internet’s evolution towards personalisation, examines how presenting information alters the way in which it is perceived and concludes with prescriptions for bursting the filter bubble that surrounds each user. 
  22. ^ Note: the term cyber-balkanization (sometimes with a hyphen) is a hybrid of cyber, relating to the Internet, and Balkanization, referring to that region of Europe that was historically subdivided by languages, religions and cultures; the term was coined in a paper by MIT researchers Van Alstyne and Brynjolfsson.
  23. ^ "Cyberbalkanization" (PDF). 
  24. ^ Van Alstyne, Marshall; Brynjolfsson, Erik (November 1996). "Could the Internet Balkanize Science?". Science. 274 (5292). doi:10.1126/science.274.5292.1479. 
  25. ^ Alex Pham and Jon Healey, Tribune Newspapers: Los Angeles Times (September 24, 2005). "Systems hope to tell you what you'd like: 'Preference engines' guide users through the flood of content". Chicago Tribune. Retrieved December 4, 2015. ...if recommenders were perfect, I can have the option of talking to only people who are just like me....Cyber-balkanization, as Brynjolfsson coined the scenario, is not an inevitable effect of recommendation tools.... 
  26. ^ Obama, Barack (10 January 2017). President Obama’s Farewell Address (Speech). Washington, D.C. Retrieved 24 January 2017. 
  27. ^ a b Hosanagar, Kartik; Fleder, Daniel; Lee, Dokyun; Buja, Andreas (December 2013). "Will the Global Village Fracture into Tribes: Recommender Systems and their Effects on Consumers". Management Science, Forthcoming. SSRN 1321962 . 
  28. ^ Ludwig, Amber. "Google Personalization on Your Search Results Plus How to Turn it Off". NGNG. Retrieved August 15, 2011. Google customizing search results is an automatic feature, but you can shut this feature off. 
  29. ^ a b "5 Questions with Eli Pariser, Author of ‘The Filter Bubble’". Time. ISSN 0040-781X. Retrieved 2017-05-24. 
  30. ^ a b c d e Bleiberg, Joshua; West, Darrell M. (2017-05-24). "Political polarization on Facebook". Brookings Institution. Retrieved 2017-05-24. 
  31. ^ Bakshy, Eytan; Messing, Solomon; Adamic, Lada (2015-05-07). "Exposure to ideologically diverse news and opinion on Facebook". Science: aaa1160. ISSN 0036-8075. PMID 25953820. doi:10.1126/science.aaa1160. 
  32. ^ "How do we break filter bubble and design for democracy?". March 3, 2017. Retrieved March 3, 2017. 
  33. ^ "«Filterblase» ist das Wort des Jahres 2016". December 7, 2016. Retrieved December 27, 2016. 
  34. ^ "Are we stuck in filter bubbles? Here are five potential paths out". Nieman Lab. 
  35. ^ Ritholtz, Barry. "Try Breaking Your Media Filter Bubble". Bloomberg. Retrieved 22 May 2017. 
  36. ^ Glushko, Chris. "Pop the Personalization Filter Bubbles and Preserve Online Diversity". Marketing Land. Retrieved 22 May 2017. 
  37. ^ "Allsides". allsides.com. 
  38. ^ "Hi From the Other Side". 
  39. ^ a b "Be More Accepting of Others - EscapeYourBubble". www.escapeyourbubble.com. Retrieved 2017-05-24. 
  40. ^ a b c "A news app aims to burst filter bubbles by nudging readers toward a more “balanced” media diet". Nieman Lab. Retrieved 2017-05-24. 
  41. ^ "uBlock Origin - An efficient blocker for Chromium and Firefox. Fast and lean.". 
  42. ^ "Privacy Badger". 
  43. ^ "Who do you want to know better?". Escape Your Bubble. 
  44. ^ "Shine a Light on Who’s Watching You". Lightbeam. 
  45. ^ "Self-destructing cookies". Add-ons. 
  46. ^ "Duck Duck Go". 
  47. ^ "Start Page". 
  48. ^ "Disconnect Search". 
  49. ^ Mădălina Ciobanu (3 March 2017). "NZZ is developing an app that gives readers personalised news without creating a filter bubble: The app uses machine learning to give readers a stream of 25 stories they might be interested in based on their preferences, but 'always including an element of surprise'". Journalism.co.uk. Retrieved March 3, 2017. ... if, based on their consumption history, someone has not expressed an interest in sports, their stream will include news about big, important stories related to sports,... 
  50. ^ Catalina Albeanu (17 November 2016). "Bursting the filter bubble after the US election: Is the media doomed to fail? At an event in Brussels this week, media and politicians discussed echo chambers on social media and the fight against fake news". Journalism.co.uk. Retrieved March 3, 2017. ... EU referendum in the UK on a panel at the "Politicians in a communication storm" event... On top of the filter bubble, partisan Facebook pages also served up a diet heavy in fake news.... 
  51. ^ "European Commission". 
  52. ^ Resnick, Paul; Garrett, Kelly R.; Kriplean, Travis; Munson, Sean A.; Stroud, Natalie J. (23–27 February 2017). "Bursting Your Filter Bubble: Strategies for Promoting Diverse Exposure". CSCW '13. doi:10.1145/2441955.2441981. Retrieved 22 May 2017. 
  53. ^ Vanian, Jonathan (25 April 2017). "Facebook is Testing This New Feature to Fight 'Filter Bubbles'". Fortune. Retrieved 22 May 2017. 
  54. ^ Sydell, Laura (25 January 2017). "Facebook Tweaks its 'Trending Topics' Algorithm to Better Reflect Real News". KQED Public Media. NPR. 
  55. ^ Vanian, Jonathan (25 April 2017). "Facebook is Testing This New Feature to Fight 'Filter Bubbles'". Fortune. Retrieved 22 May 2017. 
  56. ^ a b c Bozdag, Engin; Timmerman, Job. "Values in the filter bubble Ethics of Personalization Algorithms in Cloud Computing" (PDF). Research Gate. Retrieved 6 March 2017. 
  57. ^ Al-Rodhan, Nayef. "The Many Ethical Implications of Emerging Technologies". Scientific American. Retrieved 6 March 2017. 
  58. ^ "The Filter Bubble Raises Important Issues - You Just Need to Filter Them Out For Yourself". Rainforest Action Network. Retrieved 6 March 2017. 
  59. ^ a b Sterling, Greg. "Mark Zuckerberg’s manifesto: How Facebook will connect the world, beat fake news and pop the filter bubble". Marketing Land. Retrieved 6 March 2017. 
  60. ^ a b Morozov, Evgeny. "Your Own Facts". New York Times. Retrieved 6 March 2017. 
  61. ^ El-Bermawy, Mostafa. "Your Filter Bubble is Destroying Democracy". Wired. Retrieved 6 March 2017. 
  62. ^ "How to Burst the "Filter Bubble" that Protects Us from Opposing Views". MIT Technology Review. Retrieved 6 March 2017. 
  63. ^ Borgesius, Frederik; Trilling, Damian; Möller, Judith; Bodó, Balázs; de Vreese, Claes; Helberger, Natali. "Should we worry about filter bubbles?". Internet Policy Review. Retrieved 6 March 2017. 

Further reading

External links