
An example of deepfake technology: actress Amy Adams in the original (left) is modified to have the face of actor Nicolas Cage (right)

Deepfakes (a portmanteau of "deep learning" and "fake"[1]) are media that take a person in an existing image or video and replace them with someone else's likeness using artificial neural networks.[2] They often combine and superimpose existing media onto source media using machine learning techniques known as autoencoders and generative adversarial networks (GANs).[3][4] Deepfakes have garnered widespread attention for their uses in celebrity pornographic videos, revenge porn, fake news, hoaxes, and financial fraud.[5][6][7][8] This has elicited responses from both industry and government to detect and limit their use.[9][10]

History

The development of deepfakes has taken place to a large extent in two settings: research at academic institutions and development by amateurs in online communities.[11][12] More recently it has also been adopted by industry.[13]

Academic research

Academic research related to deepfakes lies predominantly within the field of computer vision, a subfield of computer science.[11] An early landmark project was the Video Rewrite program, published in 1997, which modified existing video footage of a person speaking to depict that person mouthing the words contained in a different audio track.[14] It was the first system to fully automate this kind of facial reanimation, and it did so using machine learning techniques to make connections between the sounds produced by a video's subject and the shape of the subject's face.[14]

Contemporary academic projects have focused on creating more realistic videos and on improving techniques.[15][16] The “Synthesizing Obama” program, published in 2017, modifies video footage of former president Barack Obama to depict him mouthing the words contained in a separate audio track.[15] The project lists as a main research contribution its photorealistic technique for synthesizing mouth shapes from audio.[15] The Face2Face program, published in 2016, modifies video footage of a person's face to depict them mimicking the facial expressions of another person in real time.[16] The project lists as a main research contribution the first method for re-enacting facial expressions in real time using a camera that does not capture depth, making it possible for the technique to be performed using common consumer cameras.[16]

In August 2018, researchers at the University of California, Berkeley published a paper introducing a fake dancing app that can create the impression of masterful dancing ability using AI.[17][18] This project expands the application of deepfakes to the entire body; previous works focused on the head or parts of the face.[17]

Amateur development

The term deepfakes originated around the end of 2017 from a Reddit user named "deepfakes".[2] He, as well as others in the Reddit community r/deepfakes, shared deepfakes they created; many videos involved celebrities' faces swapped onto the bodies of actresses in pornographic videos,[2] while non-pornographic content included many videos with actor Nicolas Cage's face swapped into various movies.[19] In December 2017, Samantha Cole published an article about r/deepfakes in Vice that drew the first mainstream attention to deepfakes being shared in online communities.[20] Six weeks later, Cole wrote in a follow-up article about the large increase in AI-assisted fake pornography.[2] In February 2018, r/deepfakes was banned by Reddit for sharing involuntary pornography.[21] Other websites, including the social media platform Twitter and the pornography site Pornhub, have also banned the use of deepfakes for involuntary pornography.[22] However, some websites, such as 4chan and 8chan, have not banned deepfake content.[23] Other online communities remain, including Reddit communities that do not share pornography, such as r/SFWdeepfakes (short for "safe for work deepfakes"), in which members share deepfakes depicting celebrities, politicians, and others in non-pornographic scenarios.[24] Other online communities continue to share pornographic deepfakes on platforms that have not banned them.[23]

Commercial development

In January 2018, a proprietary desktop application called FakeApp was launched.[25] The app allows users to easily create and share videos in which faces have been swapped.[26] As of 2019, FakeApp has been superseded by open-source alternatives such as Faceswap and the command line-based DeepFaceLab.[27][28]

Larger companies are also starting to use deepfakes.[13] The mobile app giant Momo created the application Zao, which allows users to superimpose their face onto television and movie clips using a single picture.[13] The Japanese AI company DataGrid made a full-body deepfake that can create a person from scratch,[29] which the company intends to use for fashion and apparel.

Techniques

Deepfakes rely on a type of neural network called an autoencoder.[4][30] These consist of an encoder, which reduces an image to a lower-dimensional latent space, and a decoder, which reconstructs the image from the latent representation. Deepfakes utilize this architecture by having a universal encoder which encodes a person into the latent space.[31] The latent representation captures key features of the person's face and body posture. This can then be decoded with a model trained specifically for the target.[4] This means the target's detailed information will be superimposed on the underlying facial and body features of the original video, represented in the latent space.[4]
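The shared-encoder, per-identity-decoder arrangement described above can be sketched in a few lines. This is a structural illustration only: the "networks" here are untrained random linear maps, whereas a real system trains deep convolutional encoders and decoders on thousands of frames of each person.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only): flattened 8x8 "face" frames,
# compressed to a 16-dimensional latent space.
IMG_DIM, LATENT_DIM = 64, 16

# One universal encoder shared by both identities...
W_enc = rng.normal(size=(LATENT_DIM, IMG_DIM)) / np.sqrt(IMG_DIM)

# ...but a separate decoder per identity (random matrices standing
# in for trained networks).
W_dec_a = rng.normal(size=(IMG_DIM, LATENT_DIM)) / np.sqrt(LATENT_DIM)
W_dec_b = rng.normal(size=(IMG_DIM, LATENT_DIM)) / np.sqrt(LATENT_DIM)

def encode(img):
    """Map a frame into the shared latent space (pose/expression features)."""
    return W_enc @ img

def decode(latent, W_dec):
    """Reconstruct a frame in a specific identity from the latent code."""
    return W_dec @ latent

def face_swap(frame_of_a):
    """Encode person A's frame, then decode it with person B's decoder,
    so B's appearance is superimposed on A's pose and expression."""
    return decode(encode(frame_of_a), W_dec_b)

frame = rng.normal(size=IMG_DIM)   # a frame showing person A
swapped = face_swap(frame)
print(swapped.shape)               # (64,)
```

The key design point is that only the decoder is identity-specific: because both decoders are trained against the same latent space, a latent code produced from person A can be decoded as person B.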

A popular upgrade to this architecture attaches a generative adversarial network to the decoder.[31] A GAN trains a generator, in this case the decoder, and a discriminator in an adversarial relationship.[31] The generator creates new images from the latent representation of the source material, while the discriminator attempts to determine whether the image is generated.[31] This causes the generator to create images that mimic reality extremely well, as any defects would be caught by the discriminator.[32] Both algorithms improve constantly in a zero-sum game.[31] This makes deepfakes difficult to combat, as they are constantly evolving; any time a defect is identified, it can be corrected.[32]
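The adversarial loop can be sketched with a toy one-dimensional GAN (illustrative only; real deepfake pipelines use deep convolutional networks). The generator is a linear map of noise, the discriminator a logistic regression, and each side takes gradient steps against the other.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy setup: "real" samples come from N(4, 1); the generator maps
# noise z through a linear function g(z) = a*z + b.
a, b = 1.0, 0.0   # generator parameters
w, c = 0.1, 0.0   # discriminator parameters (logistic regression)
lr = 0.05

for step in range(500):
    real = rng.normal(4.0, 1.0)
    z = rng.normal()
    fake = a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * ((1 - d_real) * real - d_fake * fake)
    c += lr * ((1 - d_real) - d_fake)

    # Generator step: push D(fake) toward 1, i.e. fool the discriminator.
    d_fake = sigmoid(w * fake + c)
    grad_fake = (1 - d_fake) * w   # d/d(fake) of log D(fake)
    a += lr * grad_fake * z
    b += lr * grad_fake

# The generator's mean output b drifts toward the real data's mean
# as the two models push against each other.
```

Each iteration is one round of the zero-sum game from the text: the discriminator sharpens its real/fake boundary, and the generator moves its output to erase whatever cue the discriminator just exploited.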

Applications

Pornography

Around 96% of deepfakes on the internet feature pornography created without the consent of the people, often female celebrities, whose likeness is used.[33] Deepfake pornography prominently surfaced on the Internet in 2017, particularly on Reddit.[34] The first one to capture attention was the Daisy Ridley deepfake, which was featured in several articles.[34] Other prominent pornographic deepfakes targeted celebrities such as Gal Gadot, Emma Watson, Maisie Williams, Taylor Swift and Scarlett Johansson.[34][35][36][37] As of October 2019, most of the targeted deepfake subjects on the internet were British and American actresses.[33] However, around a quarter of the subjects are South Korean, the majority of them K-pop stars.[33]

In June 2019, a downloadable Windows and Linux application called DeepNude was released which used neural networks, specifically generative adversarial networks, to remove clothing from images of women. The app had both a paid and an unpaid version, the paid version costing $50.[38][39] On June 27, the creators removed the application and refunded consumers.[40]

Politics

Deepfakes have been used to misrepresent well-known politicians in videos. In separate videos, the face of the Argentine President Mauricio Macri has been replaced by the face of Adolf Hitler, and Angela Merkel's face has been replaced with Donald Trump's.[41][42] In April 2018, Jordan Peele collaborated with Buzzfeed to create a deepfake of Barack Obama with Peele's voice; it served as a public service announcement to increase awareness of deepfakes.[43] In January 2019, Fox television affiliate KCPQ aired a deepfake of Trump during his Oval Office address, mocking his appearance and skin color.[44]

In May 2019, speaker of the United States House of Representatives Nancy Pelosi was the subject of two viral videos, one of which had the speed slowed down to 75 percent,[45] and another which edited together parts of her speech at a news conference for the Fox News segment Lou Dobbs Tonight. Both videos were intended to make Pelosi appear as though she was slurring her speech.[46] President Donald Trump shared the latter video on Twitter, captioning the video "'PELOSI STAMMERS THROUGH NEWS CONFERENCE'".[47] These videos were featured by many major news outlets, which brought deepfakes to the attention of the United States House Intelligence Committee.[48][49]

Acting

There has been speculation about deepfakes being used for creating digital actors for future films. Digitally constructed/altered humans have already been used in films before, and deepfakes could contribute new developments in the near future.[50] Amateur deepfake technology has already been used to insert faces into existing films, such as the insertion of Harrison Ford's young face onto Han Solo's face in Solo: A Star Wars Story,[51] and techniques similar to those used by deepfakes were used for the acting of Princess Leia in Rogue One.[52]

Concerns

Degradation of women

It has been claimed that the original purpose of the technology is to “control and humiliate women.”[53]

Fraud

Audio deepfakes have been used as part of social engineering scams, fooling people into thinking they are receiving instructions from a trusted individual.[54] In 2019, a U.K.-based energy firm’s CEO was scammed over the phone when he was ordered to transfer €220,000 into a Hungarian bank account by an individual who used audio deepfake technology to impersonate the voice of the firm's parent company's chief executive.[55] The perpetrator reportedly called three times and requested a second payment but was turned down when the CEO realized the phone number of the caller was Austrian and that the money was not being reimbursed as he was told it would be.[55]

Effects on credibility and authenticity

The presence of deepfakes makes classifying videos as satirical or genuine increasingly difficult.[41] AI researcher Alex Champandard has said people should know how fast things can be corrupted with deepfake technology, and that the problem is not a technical one, but rather one to be solved by trust in information and journalism.[41] The primary concern is that humanity could enter an age in which it can no longer be determined whether a medium's content corresponds to the truth.[41]

Similarly, computer science associate professor Hao Li of the University of Southern California states that deepfakes created for malicious use, such as fake news, will be even more harmful if nothing is done to spread awareness of deepfake technology.[56] In October 2019, Li predicted that genuine videos and deepfakes could become indistinguishable in as little as half a year, due to rapid advancement in artificial intelligence and computer graphics.[56]

Responses

Detection

Most of the academic research surrounding deepfakes seeks to detect the videos.[57] The most popular technique is to use algorithms similar to the ones used to build the deepfake to detect them.[57] By recognizing patterns in how deepfakes are created, the algorithm is able to pick up subtle inconsistencies.[57] Researchers have developed automatic systems that examine videos for errors such as irregular blinking patterns or lighting.[11] This technique has also been criticized for creating a "moving goalpost": any time the detection algorithms improve, so do the deepfakes.[57] The Deepfake Detection Challenge, hosted by a coalition of leading tech companies, hopes to accelerate the technology for identifying manipulated content.[9]
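As an illustration of this style of detector, a hypothetical blink-rate check might flag videos whose subjects blink far less often than humans typically do. The threshold, signal format, and function names below are assumptions for the sketch, not parameters of any published system.

```python
def count_blinks(openness, threshold=0.2):
    """Count blinks in a per-frame eye-openness signal (0 = closed,
    1 = fully open) as transitions into the closed state."""
    blinks, closed = 0, False
    for value in openness:
        if value < threshold and not closed:
            blinks += 1
            closed = True
        elif value >= threshold:
            closed = False
    return blinks

def looks_fake(openness, fps=30, min_blinks_per_min=5):
    """Flag a clip whose blink rate is implausibly low for a human
    (people blink roughly 15-20 times per minute)."""
    minutes = len(openness) / fps / 60
    return count_blinks(openness) / minutes < min_blinks_per_min

# One minute of video in which the eyes never close:
suspicious = [1.0] * (30 * 60)
print(looks_fake(suspicious))   # True
```

Heuristics of this kind illustrate the "moving goalpost" problem noted above: once a cue such as missing blinks is published, generators can be trained to reproduce it, and the detector must move on to a new cue.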

Other techniques use blockchain to verify the source of the media.[58] Videos would have to be verified through the ledger before being shown on social media platforms.[58] With this technology, only videos from trusted sources would be approved, decreasing the spread of possibly harmful deepfake media.[58]
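A minimal sketch of such a provenance scheme (an assumed design, not any specific product) replaces the distributed ledger with an in-memory set of SHA-256 hashes: trusted sources register a video's fingerprint, and a platform approves only exact matches, so any tampering changes the hash and fails verification.

```python
import hashlib

def fingerprint(video_bytes: bytes) -> str:
    """Content fingerprint: the SHA-256 hash of the raw video bytes."""
    return hashlib.sha256(video_bytes).hexdigest()

# Stands in for an append-only blockchain ledger of trusted hashes.
ledger = set()

def register(video_bytes: bytes) -> None:
    """A trusted source publishes its video's hash to the ledger."""
    ledger.add(fingerprint(video_bytes))

def is_verified(video_bytes: bytes) -> bool:
    """A platform approves only videos whose hash appears on the ledger."""
    return fingerprint(video_bytes) in ledger

original = b"raw frames of an authentic interview"
register(original)
print(is_verified(original))                 # True
print(is_verified(original + b" tampered"))  # False
```

The design verifies integrity and provenance but says nothing about content: a deepfake registered by a source the ledger trusts would still pass, so the scheme depends entirely on which sources are admitted.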

Celebrities

Scarlett Johansson, a frequent subject of deepfake porn, spoke publicly about the subject to The Washington Post in December 2018.[59] In a prepared statement, she expressed that despite concerns, she would not attempt to remove any of her deepfakes, due to her belief that they do not affect her public image and that differing laws across countries and the nature of internet culture make any attempt to remove the deepfakes "a lost cause".[59] While celebrities like herself are protected by their fame, she believes that deepfakes pose a grave threat to women of lesser prominence who could have their reputations damaged by depiction in involuntary deepfake pornography or revenge porn.[59]

Internet reaction

In February 2018, Pornhub said that it would ban deepfake videos on its website, considering them "non-consensual content" that violates its terms of service.[60] It had also previously stated to Mashable that it would take down content flagged as deepfakes.[61] Writers from Motherboard and BuzzFeed News reported that searching "deepfakes" on Pornhub still returned multiple recent deepfake videos.[60]

In the same month, representatives from Twitter stated that they would suspend accounts suspected of posting non-consensual deepfake content.[22] GIF hosting site Gfycat and chat service Discord are also planning to ban deepfake content from their platforms.[62] On Reddit, the r/deepfakes subreddit was banned on February 7, 2018, for violating the site's policy against "involuntary pornography".[63][64][65][66][67] In September 2018, Google added "involuntary synthetic pornographic imagery" to its ban list, allowing anyone to request the blocking of results showing their fake nudes.[68]

Facebook has previously stated that it would not remove deepfakes from its platforms.[69] Such videos will instead be flagged as fake by third parties and then deprioritized in users' feeds.[70] This response was prompted in June 2019 after a deepfake featuring a 2016 video of Mark Zuckerberg circulated on Facebook and Instagram.[69]

Legal response

In the United States, there have been some responses to the problems posed by deepfakes. In 2018, the Malicious Deep Fake Prohibition Act was introduced to the US Senate,[71] and in 2019 the DEEPFAKES Accountability Act was introduced in the House of Representatives.[10] Several states have also introduced legislation regarding deepfakes, including Virginia,[72] Texas, California, and New York.[73] On October 3, 2019, California governor Gavin Newsom signed into law Assembly Bills No. 602 and No. 730.[74][75] Assembly Bill No. 602 provides individuals targeted by sexually explicit deepfake content made without their consent with a cause of action against the content’s creator.[74] Assembly Bill No. 730 prohibits the distribution of malicious deepfake audio or visual media targeting a candidate running for public office within 60 days of their election.[75]

In the United Kingdom, producers of deepfake material can be prosecuted for harassment, but there are calls to make deepfake a specific crime;[76] in the United States, where charges as varied as identity theft, cyberstalking, and revenge porn have been pursued, the notion of a more comprehensive statute has also been discussed.[68]

In popular culture

"Picaper" by Jack Wodhams

The mid-December 1986 issue of Analog magazine published the novelette "Picaper" by Jack Wodhams. Its plot revolves around digitally enhanced or digitally generated videos produced by skilled hackers serving unscrupulous lawyers and political figures.[77]

Jack Wodhams calls such fabricated videos picaper or mimepic—image animation. To Wodhams, pornography is not the major danger of this technology. The sobering conclusion is that "the old idea that pictures do not lie is going to have to undergo drastic revision".[77]

A Philosophical Investigation

In the 1992 techno-thriller A Philosophical Investigation by Philip Kerr, "Wittgenstein", the main character and a serial killer, makes use of both software similar to deepfakes and a virtual reality suit to have sex with an avatar of Isadora "Jake" Jakowicz, the female police lieutenant assigned to catch him.[78]

The Capture

Deepfake technology is part of the plot of the 2019 BBC One drama The Capture. The series follows British ex-soldier Shaun Emery, who is accused of assaulting and abducting his barrister. Expertly doctored CCTV footage is used to set him up and mislead the police investigating him.[79][80]

References

  1. ^ Brandon, John (16 February 2018). "Terrifying high-tech porn: Creepy 'deepfake' videos are on the rise". Fox News. Retrieved 20 February 2018.
  2. ^ a b c d Cole, Samantha (24 January 2018). "We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now". Vice. Retrieved 4 May 2019.
  3. ^ Schwartz, Oscar (12 November 2018). "You thought fake news was bad? Deep fakes are where truth goes to die". The Guardian. Retrieved 14 November 2018.
  4. ^ a b c d PhD, Sven Charleer (17 May 2019). "Family fun with deepfakes. Or how I got my wife onto the Tonight Show". Medium. Retrieved 8 November 2019.
  5. ^ "What Are Deepfakes & Why the Future of Porn is Terrifying". Highsnobiety. 20 February 2018. Retrieved 20 February 2018.
  6. ^ "Experts fear face swapping tech could start an international showdown". The Outline. Retrieved 28 February 2018.
  7. ^ Roose, Kevin (4 March 2018). "Here Come the Fake Videos, Too". The New York Times. ISSN 0362-4331. Retrieved 24 March 2018.
  8. ^ "Adversarial Learning of Deepfakes in Accounting" (PDF). Arxiv.org. Retrieved 9 October 2019.
  9. ^ a b "Join the Deepfake Detection Challenge (DFDC)". deepfakedetectionchallenge.ai. Retrieved 8 November 2019.
  10. ^ a b Clarke, Yvette D. (28 June 2019). "H.R.3230 - 116th Congress (2019-2020): Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2019". www.congress.gov. Retrieved 16 October 2019.
  11. ^ a b c Harwell, Drew (12 June 2019). "Top AI researchers race to detect 'deepfake' videos: 'We are outgunned'". The Washington Post. Retrieved 8 November 2019.
  12. ^ Sanchez, Julian (8 February 2018). "Thanks to AI, the future of 'fake news' is being pioneered in homemade porn". NBC News. Retrieved 8 November 2019.
  13. ^ a b c Porter, Jon (2 September 2019). "Another convincing deepfake app goes viral prompting immediate privacy backlash". The Verge. Retrieved 8 November 2019.
  14. ^ a b Bregler, Christoph; Covell, Michele; Slaney, Malcolm (1997). "Video Rewrite: Driving Visual Speech with Audio". Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques. 24: 353–360 – via ACM Digital Library.
  15. ^ a b c Suwajanakorn, Supasorn; Seitz, Steven M.; Kemelmacher-Shlizerman, Ira (July 2017). "Synthesizing Obama: Learning Lip Sync from Audio". ACM Trans. Graph. 36.4: 95:1–95:13 – via ACM Digital Library.
  16. ^ a b c Thies, Justus; Zollhöfer, Michael; Stamminger, Marc; Theobalt, Christian; Nießner, Matthias (June 2016). "Face2Face: Real-Time Face Capture and Reenactment of RGB Videos". 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE: 2387–2395. doi:10.1109/CVPR.2016.262. ISBN 9781467388511.
  17. ^ a b Farquhar, Peter (27 August 2018). "An AI program will soon be here to help your deepfake dancing – just don't call it deepfake". Business Insider Australia. Retrieved 27 August 2018.
  18. ^ "Deepfakes for dancing: you can now use AI to fake those dance moves you always wanted". The Verge. Retrieved 27 August 2018.
  19. ^ Haysom, Sam (31 January 2018). "People Are Using Face-Swapping Tech to Add Nicolas Cage to Random Movies and What Is 2018". Mashable. Retrieved 4 April 2019.
  20. ^ Cole, Samantha (11 December 2017). "AI-Assisted Fake Porn Is Here and We're All Fucked". Vice. Retrieved 19 December 2018.
  21. ^ Kharpal, Arjun (8 February 2018). "Reddit, Pornhub ban videos that use A.I. to superimpose a person's face over an X-rated actor". CNBC. Retrieved 20 February 2018.
  22. ^ a b Cole, Samantha (6 February 2018). "Twitter Is the Latest Platform to Ban AI-Generated Porn". Vice. Retrieved 8 November 2019.
  23. ^ a b Hathaway, Jay (8 February 2018). "Here's where 'deepfakes,' the new fake celebrity porn, went after the Reddit ban". The Daily Dot. Retrieved 22 December 2018.
  24. ^ "r/SFWdeepfakes". Reddit. Retrieved 12 December 2018.
  25. ^ "What is a Deepfake and How Are They Made?". Online Tech Tips. 23 May 2019. Retrieved 8 November 2019.
  26. ^ Robertson, Adi (11 February 2018). "I'm using AI to face-swap Elon Musk and Jeff Bezos, and I'm really bad at it". The Verge. Retrieved 8 November 2019.
  27. ^ "Faceswap is the leading free and Open Source multi-platform Deepfakes software". 15 October 2019 – via WordPress.
  28. ^ "DeepFaceLab is a tool that utilizes machine learning to replace faces in videos. Includes prebuilt ready to work standalone Windows 7,8,10 binary (look readme.md).: iperov/DeepFaceLab". 19 June 2019 – via GitHub.
  29. ^ Pangburn, D. J. (21 September 2019). "You've been warned: Full body deepfakes are the next step in AI-based human mimicry". Fast Company. Retrieved 8 November 2019.
  30. ^ Zucconi, Alan (14 March 2018). "Understanding the Technology Behind DeepFakes". Alan Zucconi. Retrieved 8 November 2019.
  31. ^ a b c d e Kan, C. E. (10 December 2018). "What The Heck Are VAE-GANs?". Medium. Retrieved 8 November 2019.
  32. ^ a b "These New Tricks Can Outsmart Deepfake Videos—for Now". Wired. ISSN 1059-1028. Retrieved 9 November 2019.
  33. ^ a b c Dickson, E. J.; Dickson, E. J. (7 October 2019). "Deepfake Porn Is Still a Threat, Particularly for K-Pop Stars". Rolling Stone. Retrieved 9 November 2019.
  34. ^ a b c Roettgers, Janko (21 February 2018). "Porn Producers Offer to Help Hollywood Take Down Deepfake Videos". Variety. Retrieved 28 February 2018.
  35. ^ Goggin, Benjamin. "From porn to 'Game of Thrones': How deepfakes and realistic-looking fake videos hit it big". Business Insider. Retrieved 9 November 2019.
  36. ^ Lee, Dave (3 February 2018). "'Fake porn' has serious consequences". Retrieved 9 November 2019.
  37. ^ Cole, Samantha (19 June 2018). "Gfycat's AI Solution for Fighting Deepfakes Isn't Working". Vice. Retrieved 9 November 2019.
  38. ^ Cole, Samantha; Maiberg, Emanuel; Koebler, Jason (26 June 2019). "This Horrifying App Undresses a Photo of Any Woman with a Single Click". Vice. Retrieved 2 July 2019.
  39. ^ Cox, Joseph (9 July 2019). "GitHub Removed Open Source Versions of DeepNude". Vice Media.
  40. ^ "pic.twitter.com/8uJKBQTZ0o". 27 June 2019.
  41. ^ a b c d "Wenn Merkel plötzlich Trumps Gesicht trägt: die gefährliche Manipulation von Bildern und Videos". az Aargauer Zeitung. 3 February 2018.
  42. ^ Patrick Gensing. "Deepfakes: Auf dem Weg in eine alternative Realität?".
  43. ^ Romano, Aja (18 April 2018). "Jordan Peele's simulated Obama PSA is a double-edged warning against fake news". Vox. Retrieved 10 September 2018.
  44. ^ Swenson, Kyle (11 January 2019). "A Seattle TV station aired doctored footage of Trump's Oval Office speech. The employee has been fired". The Washington Post. Retrieved 11 January 2019.
  45. ^ "Faked Pelosi videos, slowed to make her appear drunk, spread across social media". Washington Post. Retrieved 1 July 2019.
  46. ^ Rimer, Sara (11 September 2019). "Q&A: LAW's Danielle Citron Warns That Deepfake Videos Could Undermine the 2020 Election". BUToday. Boston University. Retrieved 11 September 2019.
  47. ^ Novak, Matt. "Bullshit Viral Videos of Nancy Pelosi Show Fake Content Doesn't Have to Be a Deepfake". Gizmodo. Retrieved 1 July 2019.
  48. ^ CNN, Donie O'Sullivan. "Congress to investigate deepfakes as doctored Pelosi video causes stir". CNN. Retrieved 9 November 2019.
  49. ^ "'Deepfakes' called new election threat, with no easy fix". AP NEWS. 13 June 2019. Retrieved 10 November 2019.
  50. ^ Kemp, Luke (8 July 2019). "In the age of deepfakes, could virtual actors put humans out of business?". The Guardian. ISSN 0261-3077. Retrieved 20 October 2019.
  51. ^ Radulovic, Petrana (17 October 2018). "Harrison Ford is the star of Solo: A Star Wars Story thanks to deepfake technology". Polygon. Retrieved 20 October 2019.
  52. ^ Winick, Erin. "How acting as Carrie Fisher's puppet made a career for Rogue One's Princess Leia". MIT Technology Review. Retrieved 20 October 2019.
  53. ^ Mahdawi, Arwa (29 June 2019). "An app using AI to 'undress' women offers a terrifying glimpse into the future | Arwa Mahdawi". The Guardian. ISSN 0261-3077. Retrieved 9 November 2019.
  54. ^ Statt, Nick (5 September 2019). "Thieves are now using AI deepfakes to trick companies into sending them money". Retrieved 13 September 2019.
  55. ^ a b Damiani, Jesse. "A Voice Deepfake Was Used To Scam A CEO Out Of $243,000". Forbes. Retrieved 9 November 2019.
  56. ^ a b "Perfect Deepfake Tech Could Arrive Sooner Than Expected". www.wbur.org. Retrieved 9 November 2019.
  57. ^ a b c d Manke, Kara (18 June 2019). "Researchers use facial quirks to unmask 'deepfakes'". Berkeley News. Retrieved 9 November 2019.
  58. ^ a b c "The Blockchain Solution to Our Deepfake Problems". Wired. ISSN 1059-1028. Retrieved 9 November 2019.
  59. ^ a b c "Scarlett Johansson on fake AI-generated sex videos: 'Nothing can stop someone from cutting and pasting my image'". The Washington Post. 31 December 2018. Retrieved 19 June 2019.
  60. ^ a b Cole, Samantha (6 February 2018). "Pornhub Is Banning AI-Generated Fake Porn Videos, Says They're Nonconsensual". Vice. Retrieved 9 November 2019.
  61. ^ Gilmer, Damon Beres and Marcus. "A guide to 'deepfakes,' the internet's latest moral crisis". Mashable. Retrieved 9 November 2019.
  62. ^ Ghoshal, Abhimanyu (7 February 2018). "Twitter, Pornhub and other platforms ban AI-generated celebrity porn". The Next Web. Retrieved 9 November 2019.
  63. ^ Böhm, Markus (7 February 2018). ""Deepfakes": Firmen gehen gegen gefälschte Promi-Pornos vor". Spiegel Online. Retrieved 9 November 2019.
  64. ^ barbara.wimmer. "Deepfakes: Reddit löscht Forum für künstlich generierte Fake-Pornos". futurezone.at (in German). Retrieved 9 November 2019.
  65. ^ online, heise. "Deepfakes: Auch Reddit verbannt Fake-Porn". heise online (in German). Retrieved 9 November 2019.
  66. ^ "Reddit verbannt Deepfake-Pornos - derStandard.de". DER STANDARD (in German). Retrieved 9 November 2019.
  67. ^ Robertson, Adi (7 February 2018). "Reddit bans 'deepfakes' AI porn communities". The Verge. Retrieved 9 November 2019.
  68. ^ a b Harrell, Drew. "Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'". The Washington Post. Retrieved 1 January 2019.
  69. ^ a b "Facebook has promised to leave up a deepfake video of Mark Zuckerberg". MIT Technology Review. Retrieved 9 November 2019.
  70. ^ Cole, Samantha (11 June 2019). "This Deepfake of Mark Zuckerberg Tests Facebook's Fake Video Policies". Vice. Retrieved 9 November 2019.
  71. ^ Sasse, Ben (21 December 2018). "S.3805 - 115th Congress (2017-2018): Malicious Deep Fake Prohibition Act of 2018". www.congress.gov. Retrieved 16 October 2019.
  72. ^ "'Deepfake' revenge porn is now illegal in Virginia". TechCrunch. Retrieved 16 October 2019.
  73. ^ Brown, Nina Iacono (15 July 2019). "Congress Wants to Solve Deepfakes by 2020. That Should Worry Us". Slate Magazine. Retrieved 16 October 2019.
  74. ^ a b "Bill Text - AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action". leginfo.legislature.ca.gov. Retrieved 9 November 2019.
  75. ^ a b "Bill Text - AB-730 Elections: deceptive audio or visual media". leginfo.legislature.ca.gov. Retrieved 9 November 2019.
  76. ^ "Call for upskirting bill to include 'deepfake' pornography ban". The Guardian.
  77. ^ a b "Picaper". Internet Speculative Fiction Database. Retrieved 9 July 2019.
  78. ^ Philip Kerr, A Philosophical Investigation, ISBN 978-0143117537
  79. ^ Bernal, Natasha (8 October 2019). "The disturbing truth behind The Capture and real life deepfakes". The Telegraph. Retrieved 24 October 2019.
  80. ^ Crawley, Peter (5 September 2019). "The Capture: A BBC thriller of surveillance, distortion and duplicity". The Irish Times. Retrieved 24 October 2019.

External links