BitChute

BitChute is a video hosting service launched by Ray Vahey in January 2017.[1] It is known for accommodating far-right individuals and conspiracy theorists, and for hosting hate speech.[a][b] Some creators who use BitChute have been banned from YouTube; others crosspost to both platforms, or reserve more extreme content for BitChute.[2][15] BitChute claims to use peer-to-peer WebTorrent technology for video distribution,[1] though this claim has been disputed.[16][17]

BitChute
A tilted black square with white text reading "BIT", followed by red text reading "CHUTE"
Type of site: Online video platform
Available in: English
Country of origin: United Kingdom
Created by: Ray Vahey
Parent: Bit Chute Limited
URL: bitchute.com
Registration: Optional
Launched: January 2017
Current status: Active

History

 
Vahey interviewed in 2018

Bit Chute Limited, BitChute's corporate identity, was registered by Ray Vahey in January 2017 in Newbury, England.[3][18][19] At the time of the site's launch, Vahey described BitChute as an alternative to mainstream platforms; he believed these platforms had demonstrated "increased levels of censorship" over the previous few years by banning and demonetising users (barring them from receiving advertising revenue), and "tweaking algorithms to send certain content into obscurity".[1]

In November 2018, BitChute was banned from PayPal.[20][21] PayPal also banned Alex Jones, the Proud Boys, Tommy Robinson, and several anti-fascist groups and users at the same time.[20] In 2019, crowdfunding website Indiegogo also banned BitChute.[14] BitChute has also been banned from using Patreon and Stripe.[16]

In January 2019, BitChute announced in a post on Gab that they would move their domains over to Epik, a small domain registrar known for accepting the registration of websites that host far-right content.[11][22]

In March 2020, a new provision to Germany's Network Enforcement Act required social media companies to report instances of hate speech on their platforms to authorities. However, online news platform Coda reported that while the law applies to platforms including YouTube, Facebook, and Twitter, BitChute is one of the platforms not affected by the provision.[23] In early August 2020, Twitter began blocking posts linking to the site, later showing a warning to users who clicked on the links.[24][14]

As of January 2021, the Community Security Trust was in the process of reporting BitChute to Ofcom, after the charity discovered Holocaust denial and Holocaust glorification content on the website, alongside other content it considered harmful, such as conspiracy theories related to COVID-19.[25] The trust's blog stated this would be an important test case for Ofcom's new role regulating social media in the United Kingdom, especially concerning extremism and hateful content.[26] Also in January, BitChute added "incitement to hatred" to its list of prohibited content, using the definition from the United Kingdom's Audiovisual Media Services Regulations 2020,[27] though Bellingcat wrote the following month that "racist slurs, Nazi imagery and calls for violence against Jews remained common in video comment sections."[14]

Content

Since launching, BitChute has accommodated far-right groups and individuals.[a] The Southern Poverty Law Center wrote in 2019 that the site hosts "hate-fueled material", the Anti-Defamation League wrote in 2020 that "BitChute has become a hotbed for violent, conspiratorial and hate-filled video propaganda, and a recruiting ground for extremists", and Bellingcat wrote in 2021 that the platform was "rife with racism and hate speech".[11][12][14] According to a 2020 report from anti-extremism group Hope not Hate, BitChute "actively promotes" content which was removed from other platforms as hate speech. Hope not Hate also documented videos hosted on BitChute supporting or produced by terrorist groups, including ISIS and the neo-Nazi groups National Action and Atomwaffen Division.[7][13] A June 2020 report from British Jewish group Community Security Trust said that some terrorist videos had been on the site for over a year, and that BitChute only removes this content when forced to.[8][28] An academic analysis published in July 2020, using a data set gathered over five months in 2019, found that BitChute had more hate speech than Gab, but less than 4chan. It found that only a small group of channels on the network had any meaningful engagement, almost all of which pushed conspiracy- and hate-laden content. Like the research from Hope not Hate, this analysis found content promoting the Atomwaffen Division posted to BitChute, including a recruitment video.[15]

BitChute's founder Ray Vahey has described BitChute as "politically neutral".[20] Hope not Hate wrote in their 2020 report that "in actuality, the company chooses to almost exclusively promote content and producers that engage in hate speech and harmful misinformation" and that the "vile and dangerous content that abounds on BitChute is a result of deliberate decisions on the part of their founder and team".[13] Bellingcat reported in 2021 that Vahey used the platform's Twitter account to promote antisemitic conspiracy theories, COVID-19 misinformation, and QAnon content.[14]

BitChute is part of a group of "alt-tech" websites that position themselves as less strictly moderated alternatives to mainstream social media platforms like YouTube, Facebook, and Twitter.[6][10] Deen Freelon and colleagues writing in Science characterised BitChute as among the alt-tech sites that are "dedicated to right-wing communities", and listed the site along with 4chan, 8chan, Parler, and Gab. They noted there are also more ideologically neutral alt-tech platforms, such as Discord and Telegram.[10] Joe Mulhall of the UK anti-racism group Hope not Hate has categorised BitChute among the "bespoke platforms" for the far right, which he defines as platforms created by people who themselves have "far-right leanings". He distinguishes these from "co-opted platforms" such as DLive and Telegram, which were adopted by the far right due to minimal moderation but not specifically created for their use.[14]

Some creators who have been banned from YouTube or had their channels demonetised subsequently migrated to BitChute.[2] The far-right conspiracy theory channel InfoWars migrated to BitChute after being banned by YouTube in 2018.[3] Other creators maintain a presence on YouTube and on BitChute, and some post more extreme content on BitChute while using YouTube for less extreme material.[15] Prominent far-right and alt-right video creators who have cross-posted to both YouTube and BitChute include Lauren Southern, Stefan Molyneux, Millennial Woes, Computing Forever, and Paul Joseph Watson.[2][4][29]

The platform also hosts misinformation related to the COVID-19 pandemic.[7] The conspiracy theory video Plandemic has been viewed on BitChute millions of times after having been removed from other platforms for spreading medically harmful misinformation.[30][31][13]

Model

BitChute does not rely on advertising, and users can send payments to video creators directly.[29] Since its launch, the site has promoted its use of the peer-to-peer technology WebTorrent as a means to decentralise hosting and reduce costs.[1] BitChute allows creators to monetise the videos they publish on the platform by linking to fundraising websites including SubscribeStar, PayPal, and cryptocurrency processors. Although PayPal has banned BitChute itself from using its service, BitChute still links to PayPal pages for creators who choose to use them.[14]

At launch, the site claimed it was using peer-to-peer WebTorrent technology.[1] However, a November 2019 report by Fredrick Brennan, published in The Daily Dot, failed to find any evidence of peer-to-peer data transfer in BitChute's videos.[16] All videos Brennan downloaded came directly from BitChute's servers, with no part of the videos received from peers. According to Brennan, magnet links on the site do not work. Brennan challenged BitChute's use of the word "delist" to describe deplatforming users, saying that the wording is misleading in that it makes BitChute seem falsely similar to BitTorrent (where a site maintains one "list" of content, but independent trackers may be created as well), when in reality BitChute is just deleting a user's videos from the BitChute site.[16] According to Ars Technica in April 2021, the option to host videos using WebTorrent on BitChute "appears to have been deprecated".[17]
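Brennan's test turns on what a magnet link is supposed to provide: rather than pointing at a server, a magnet URI embeds a BitTorrent infohash that peers use to locate the content independently of any single host, which is why non-functional magnet links undercut a claim of peer-to-peer distribution. As a minimal illustration of what such a link encodes (the URI below is a hypothetical example, not a real BitChute link), a short Python sketch:

```python
from urllib.parse import urlparse, parse_qs

def parse_magnet(uri):
    """Extract the BitTorrent infohash and display name from a magnet URI."""
    qs = parse_qs(urlparse(uri).query)
    xt = qs.get("xt", [""])[0]  # "exact topic", e.g. "urn:btih:<40-hex infohash>"
    infohash = xt.split(":")[-1] if xt.startswith("urn:btih:") else None
    name = qs.get("dn", [None])[0]  # optional display name
    return infohash, name

# Hypothetical magnet URI of the kind a WebTorrent-backed site would publish
uri = ("magnet:?xt=urn:btih:0123456789abcdef0123456789abcdef01234567"
       "&dn=example-video.mp4")
print(parse_magnet(uri))
# → ('0123456789abcdef0123456789abcdef01234567', 'example-video.mp4')
```

A client that receives only the infohash can, in principle, fetch the video from any peer seeding it; Brennan's finding that all video data came from BitChute's own servers is what made the site's use of torrent terminology misleading.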

Notes

  1. ^ a b Known for accommodating far-right individuals and conspiracy theorists[2][3][4][5][6][7][8][9]
  2. ^ Known for hosting hateful material[10][7][11][12][13][14]

References

  1. ^ a b c d e Maxwell, Andy (29 January 2017). "BitChute is a BitTorrent-Powered YouTube Alternative". TorrentFreak. Archived from the original on 9 December 2017. Retrieved 10 December 2017.
  2. ^ a b c d Daro, Ishmael N.; Lytvynenko, Jane (18 April 2018). "Right-Wing YouTubers Think It's Only A Matter Of Time Before They Get Kicked Off The Site". BuzzFeed News. Archived from the original on 5 July 2018. Retrieved 4 May 2019.
  3. ^ a b c Schroeder, Audra (2 November 2018). "Far-right conspiracy vloggers have a new home". The Daily Dot. Archived from the original on 4 May 2019. Retrieved 4 May 2019.
  4. ^ a b Tani, Maxwell (22 September 2017). "'There's no one for right-wingers to pick a fight with': The far right is struggling to sustain interest in its social media platforms". Business Insider. Archived from the original on 8 December 2017. Retrieved 10 December 2017.
  5. ^ Robertson, Adi (9 October 2017). "Two months ago, the internet tried to banish Nazis. No one knows if it worked". The Verge. Archived from the original on 4 April 2018. Retrieved 24 May 2019.
  6. ^ a b Livni, Ephrat (12 May 2019). "Twitter, Facebook, and Insta bans send the alt-right to Gab and Telegram". Quartz. Archived from the original on 24 May 2019. Retrieved 24 May 2019. The far right have plenty of places to go when they are no longer welcome on mainstream platforms—like Parler, Minds, MeWe, and BitChute, among others.
  7. ^ a b c d Dearden, Lizzie (22 July 2020). "Inside the UK-based site that has become the far right's YouTube". The Independent. Archived from the original on 5 August 2020. Retrieved 14 August 2020.
  8. ^ a b Doward, Jamie; Townsend, Mark (28 June 2020). "The UK social media platform where neo-Nazis can view terror atrocities". The Guardian. ISSN 0029-7712. Archived from the original on 13 August 2020. Retrieved 14 August 2020.
  9. ^ Tighe, Mark; Galvin, Joe (31 January 2021). "Facebook acts as conspiracy theories on Covid in Ireland go viral". The Times. Retrieved 18 May 2021.
  10. ^ a b c Freelon, Deen; Marwick, Alice; Kreiss, Daniel (4 September 2020). "False equivalencies: Online activism from left to right". Science. 369 (6508): 1197–1201. Bibcode:2020Sci...369.1197F. doi:10.1126/science.abb2428. ISSN 0036-8075. PMID 32883863.
  11. ^ a b c Hayden, Michael Edison (11 January 2019). "A Problem of Epik Proportions". Southern Poverty Law Center. Archived from the original on 12 January 2019. Retrieved 12 January 2019.
  12. ^ a b "BitChute: A Hotbed of Hate". Anti-Defamation League. 31 August 2020. Archived from the original on 4 September 2020. Retrieved 4 September 2020.
  13. ^ a b c d Davis, Gregory (20 July 2020). "Bitchute: Platforming Hate and Terror in the UK" (PDF). Hope not Hate. Archived from the original on 8 August 2020. Retrieved 18 August 2020.
  14. ^ a b c d e f g h Andrews, Frank; Pym, Ambrose (24 February 2021). "The Websites Sustaining Britain's Far-Right Influencers". Bellingcat. Retrieved 25 February 2021.
  15. ^ a b c Trujillo, Milo; Gruppi, Maurício; Buntain, Cody; Horne, Benjamin D. (13 July 2020). "What is BitChute? Characterizing the "Free Speech" Alternative to YouTube" (PDF). Proceedings of the 31st ACM Conference on Hypertext and Social Media. Association for Computing Machinery: 139–140. arXiv:2004.01984. doi:10.1145/3372923.3404833. S2CID 220434725.
  16. ^ a b c d Brennan, Fredrick (27 November 2019). "Bitchute claims to be a decentralized platform—that's not true". The Daily Dot. Archived from the original on 28 November 2019. Retrieved 28 November 2019.
  17. ^ a b De Chant, Tim (29 April 2021). "Conspiracy theorist said death threats were "jokes"—but jury didn't buy it". Ars Technica. Retrieved 18 May 2021.
  18. ^ "Bit Chute Limited — Overview". Companies House. Archived from the original on 9 December 2019. Retrieved 14 August 2020.
  19. ^ "BitChute — Terms & Conditions". BitChute. Archived from the original on 15 August 2020. Retrieved 14 August 2020.
  20. ^ a b c Blake, Andrew (14 November 2018). "BitChute, YouTube alternative, cries foul over apparent punt from PayPal". The Washington Times. Archived from the original on 27 November 2018. Retrieved 28 November 2018.
  21. ^ Newton, Casey (15 November 2018). "Facebook has a growing morale problem". The Verge. Archived from the original on 4 May 2019. Retrieved 4 May 2019.
  22. ^ Martineau, Paris (6 November 2018). "How Right-Wing Social Media Site Gab Got Back Online". Wired. ISSN 1059-1028. Archived from the original on 2 May 2019. Retrieved 4 May 2019.
  23. ^ Butini, Cecilia (2 March 2020). "Germany to force social media companies to report hate speech to police". Coda. Archived from the original on 8 August 2020. Retrieved 14 August 2020.
  24. ^ P, Jamie (7 August 2020). "Bitchute Blocked by Twitter? Here's Why". Tech Times. Archived from the original on 15 December 2020. Retrieved 14 August 2020.
  25. ^ Hamilton, Fiona. "'Hateful' BitChute video site is first test for Ofcom". The Times. ISSN 0140-0460. Retrieved 2 February 2021.
  26. ^ "BitChute - A Very British Problem". Community Security Trust. 28 January 2021. Retrieved 2 February 2021.
  27. ^ "Community Guidelines". BitChute. Retrieved 25 February 2021.
  28. ^ Zonshine, Idan (15 June 2020). "New UK report exposes massive online network of far-right antisemitism". The Jerusalem Post. Archived from the original on 18 August 2020. Retrieved 19 August 2020.
  29. ^ a b Alexander, Julia (7 March 2018). "Controversial YouTubers head to alternative platforms in wake of 'purge'". Polygon. Archived from the original on 4 May 2019. Retrieved 4 May 2019.
  30. ^ Lytvynenko, Jane (1 June 2020). "After The "Plandemic" Video Went Viral In The US, It Was Exported To The Rest Of The World". BuzzFeed News. Archived from the original on 7 August 2020. Retrieved 19 August 2020.
  31. ^ Bellemare, Andrea; Nicholson, Katie; Ho, Jason (21 May 2020). "How a debunked COVID-19 video kept spreading after Facebook and YouTube took it down". Canadian Broadcasting Corporation. Archived from the original on 20 August 2020. Retrieved 19 August 2020.