Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a for-profit San Francisco-based artificial intelligence research laboratory. GPT-3's full version has a capacity of 175 billion machine learning parameters. Introduced in May 2020 and in beta testing as of July 2020, GPT-3 is part of a trend in natural language processing (NLP) toward pre-trained language representations. Before GPT-3's release, the largest language model was Microsoft's Turing NLG, introduced in February 2020 with a capacity of 17 billion parameters, less than a tenth of GPT-3's.
Initial release: June 11, 2020 (beta)
Type: Autoregressive Transformer language model
License: Code unavailable; accessible only through a paywalled API
The quality of the text generated by GPT-3 is so high that it is difficult to distinguish from text written by a human, which has both benefits and risks. Thirty-one OpenAI researchers and engineers presented the original May 28, 2020 paper introducing GPT-3. In their paper, they warned of GPT-3's potential dangers and called for research to mitigate risk (p. 34). David Chalmers, an Australian philosopher, described GPT-3 as "one of the most interesting and important AI systems ever produced."
Microsoft announced on September 22, 2020 that it had licensed "exclusive" use of GPT-3; others can still use the public API to receive output, but only Microsoft has control of the source code.
According to The Economist, improved algorithms, powerful computers, and an increase in digitized data have fueled a revolution in machine learning, with new techniques in the 2010s resulting in "rapid improvements in tasks" including manipulating language. Software models are trained to learn by using thousands or millions of examples in a "structure ... loosely based on the neural architecture of the brain". One architecture used in natural language processing (NLP) is a neural network based on a deep learning model that was first introduced in 2017—the Transformer. GPT-n models are based on this Transformer-based deep learning neural network architecture. There are a number of NLP systems capable of processing, mining, organizing, connecting, contrasting, understanding and generating answers to questions.
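The Transformer's central operation is attention, in which each token's representation is updated as a weighted mixture of the others. The following is a minimal illustrative sketch of scaled dot-product attention in pure Python; it is a toy for exposition only, not OpenAI's implementation, and the tiny vectors at the end are made-up example data.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention over lists of vectors.

    Each query is compared against every key; the resulting softmax
    weights mix the value vectors into one output vector per query.
    """
    d = len(keys[0])
    outputs = []
    for q in queries:
        # Similarity of this query to each key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        # Weighted sum of the value vectors.
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        outputs.append(out)
    return outputs

# One query attending over two key/value pairs (toy numbers).
print(attention([[1.0, 0.0]],
                [[1.0, 0.0], [0.0, 1.0]],
                [[1.0, 2.0], [3.0, 4.0]]))
```

A full Transformer stacks many such attention layers (with learned projections, multiple heads, and feed-forward sublayers), but the weighting-and-mixing step above is the core idea.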
On June 11, 2018, OpenAI researchers and engineers posted their original paper on generative language models, artificial intelligence systems that could be pre-trained with an enormous and diverse corpus of text, in a process they called generative pre-training (GP). The authors described how language understanding performance in natural language processing (NLP) was improved in GPT-n through "generative pre-training of a language model on a diverse corpus of unlabeled text, followed by discriminative fine-tuning on each specific task." This eliminated the need for human supervision and for time-intensive hand-labeling.
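The training signal in generative pre-training is next-token prediction on unlabeled text: no human labels are needed because the text itself supplies the targets. The toy sketch below illustrates only that idea with a bigram frequency model; GPT-n models are instead huge Transformer networks trained by gradient descent, and the two-sentence corpus here is invented example data.

```python
from collections import Counter, defaultdict

def pretrain(corpus):
    """'Pre-train' a toy bigram model on unlabeled text: for each word,
    count which words follow it. The text itself provides the targets,
    so no hand-labeling is required."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for cur, nxt in zip(words, words[1:]):
            counts[cur][nxt] += 1
    return counts

def predict_next(model, word):
    # One autoregressive step: most likely next token given the current one.
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

corpus = [
    "the cat sat on the mat",
    "the cat ate the fish",
]
model = pretrain(corpus)
print(predict_next(model, "the"))  # prints "cat"
```

Repeating the prediction step, feeding each output back in as input, is what "autoregressive" means; GPT-3 does the same at vastly larger scale over token probabilities produced by a neural network.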
In February 2020, Microsoft introduced its Turing Natural Language Generation (T-NLG), which was then the "largest language model ever published at 17 billion parameters." It performed better than any other language model at a variety of tasks, including summarizing texts and answering questions.
A May 28, 2020 arXiv preprint by a group of 31 engineers and researchers at OpenAI[a] described the development of GPT-3, a third-generation "state-of-the-art language model". The team increased the capacity of GPT-3 by over two orders of magnitude from that of its predecessor, GPT-2, making GPT-3 the largest non-sparse language model to date. GPT-3's higher number of parameters grants it a higher level of accuracy relative to previous versions with smaller capacity. GPT-3's capacity is ten times larger than that of Microsoft's Turing NLG.
Sixty percent of the weighted pre-training dataset for GPT-3 comes from a filtered version of Common Crawl consisting of 410 billion byte-pair-encoded tokens (p. 9). Other sources are 19 billion tokens from WebText2 representing 22% of the weighted total, 12 billion tokens from Books1 representing 8%, 55 billion tokens from Books2 representing 8%, and 3 billion tokens from Wikipedia representing 3% (p. 9). GPT-3 was trained on hundreds of billions of words and is capable of coding in CSS, JSX, Python, and other languages. Because GPT-3's training data was all-encompassing, it does not require further training for distinct language tasks.
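Byte-pair encoding (BPE), the tokenization used to count those training tokens, builds a vocabulary by repeatedly merging the most frequent adjacent pair of symbols. The sketch below shows that core merge loop on a classic toy string; GPT-3's actual tokenizer is the far larger byte-level BPE inherited from GPT-2, with a learned merge table, so this is an illustration of the principle only.

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count adjacent symbol pairs across the token sequence.
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0] if pairs else None

def bpe(text, num_merges):
    """Greedy byte-pair encoding: repeatedly merge the most frequent
    adjacent pair of symbols into a single new symbol."""
    tokens = list(text)
    for _ in range(num_merges):
        pair = most_frequent_pair(tokens)
        if pair is None:
            break
        merged = []
        i = 0
        while i < len(tokens):
            # Merge this occurrence of the chosen pair, left to right.
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
                merged.append(tokens[i] + tokens[i + 1])
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens

print(bpe("aaabdaaabac", 1))  # the pair "aa" is merged first
```

Frequent character sequences thus become single tokens, which is why 410 billion tokens can represent considerably more raw text.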
On June 11, 2020, OpenAI announced that users could request access to its user-friendly GPT-3 API—a "machine learning toolset"—to help OpenAI "explore the strengths and limits" of this new technology. The invitation described how this API had a general-purpose "text in, text out" interface that can complete almost "any English language task", instead of the usual single use-case. According to one user, who had access to a private early release of the OpenAI GPT-3 API, GPT-3 was "eerily good" at writing "amazingly coherent text" with only a few simple prompts.
Because GPT-3 can "generate news articles which human evaluators have difficulty distinguishing from articles written by humans," GPT-3 has the "potential to advance both the beneficial and harmful applications of language models" (p. 34). In their May 28, 2020 paper, the researchers described in detail the potential "harmful effects of GPT-3", which include "misinformation, spam, phishing, abuse of legal and governmental processes, fraudulent academic essay writing and social engineering pretexting". The authors draw attention to these dangers to call for research on risk mitigation (p. 34).
In his July 29, 2020, review in The New York Times, Farhad Manjoo said that GPT-3—which can generate computer code and poetry, as well as prose—is not just "amazing", "spooky", and "humbling", but also "more than a little terrifying".
Daily Nous presented a series of articles by nine philosophers on GPT-3. Australian philosopher David Chalmers described GPT-3 as "one of the most interesting and important AI systems ever produced".
An article in Towards Data Science stated that GPT-3 was trained on hundreds of billions of words and is capable of coding in CSS, JSX, Python, and other languages.
The National Law Review said that GPT-3 is an "impressive step in the larger process", with OpenAI and others finding "useful applications for all of this power" while continuing to "work toward a more general intelligence".
An article in the MIT Technology Review, co-written by deep learning critic Gary Marcus, stated that GPT-3's "comprehension of the world is often seriously off, which means you can never really trust what it says." According to the authors, GPT-3 models relationships between words without having an understanding of the meaning behind each word.
- GPT-3 has been used by Andrew Mayne for AI Writer, which allows people to correspond with historical figures via email.
- GPT-3 has been implemented by Jason Rohrer in a retro-themed chatbot project named Project December, which is accessible online and allows users to converse with several AIs using GPT-3 technology.
- GPT-3 was used by The Guardian to write an article about AI being harmless to human beings. It was fed some ideas and produced eight different essays, which were ultimately merged into one article.
- GPT-3 is used in AI Dungeon, which generates text-based adventure games.
- Brown, Tom B.; Mann, Benjamin; Ryder, Nick; Subbiah, Melanie; Kaplan, Jared; Dhariwal, Prafulla; Neelakantan, Arvind; Shyam, Pranav; Sastry, Girish; Askell, Amanda; Agarwal, Sandhini; Herbert-Voss, Ariel; Krueger, Gretchen; Henighan, Tom; Child, Rewon; Ramesh, Aditya; Ziegler, Daniel M.; Wu, Jeffrey; Winter, Clemens; Hesse, Christopher; Chen, Mark; Sigler, Eric; Litwin, Mateusz; Gray, Scott; Chess, Benjamin; Clark, Jack; Berner, Christopher; McCandlish, Sam; Radford, Alec; Sutskever, Ilya; Amodei, Dario (July 22, 2020). "Language Models are Few-Shot Learners". arXiv:2005.14165.
- Shead, Sam (July 23, 2020). "Why everyone is talking about the A.I. text generator released by an Elon Musk-backed lab". CNBC. Retrieved July 31, 2020. Four preprints were released between May 28 and July 22, 2020.
- Bussler, Frederik (July 21, 2020). "Will GPT-3 Kill Coding?". Towards Data Science. Retrieved August 1, 2020.
- Sagar, Ram (June 3, 2020). "OpenAI Releases GPT-3, The Largest Model So Far". Analytics India Magazine. Retrieved July 31, 2020.
- Chalmers, David (July 30, 2020). Weinberg, Justin (ed.). "GPT-3 and General Intelligence". Daily Nous. Philosophers On GPT-3 (updated with replies by GPT-3). Retrieved August 4, 2020.
- Hao, Karen (September 23, 2020). "OpenAI is giving Microsoft exclusive access to its GPT-3 language model". MIT Technology Review. Retrieved September 25, 2020.
The companies say OpenAI will continue to offer its public-facing API, which allows chosen users to send text to GPT-3 or OpenAI’s other models and receive its output. Only Microsoft, however, will have access to GPT-3’s underlying code, allowing it to embed, repurpose, and modify the model as it pleases.
- "An understanding of AI's limitations is starting to sink in". The Economist. June 11, 2020. ISSN 0013-0613. Retrieved July 31, 2020.
- Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit, Jakob; Jones, Llion; Gomez, Aidan N.; Kaiser, Lukasz; Polosukhin, Illia (June 12, 2017). "Attention Is All You Need". arXiv:1706.03762 [cs.CL].
- "Natural Language Processing". Retrieved July 31, 2020.
- Radford, Alec; Narasimhan, Karthik; Salimans, Tim; Sutskever, Ilya (June 11, 2018). "Improving Language Understanding by Generative Pre-Training" (PDF). p. 12. Retrieved July 31, 2020.
- Sterling, Bruce (February 13, 2020). "Web Semantics: Microsoft Project Turing introduces Turing Natural Language Generation (T-NLG)". Wired. ISSN 1059-1028. Retrieved July 31, 2020.
- "Language Models are Unsupervised Multitask Learners" (PDF). Retrieved December 4, 2019.
GPT-2 is a 1.5B parameter Transformer.
- Ray, Tiernan (June 1, 2020). "OpenAI's gigantic GPT-3 hints at the limits of language models for AI". ZDNet. Retrieved July 31, 2020.
- "OpenAI API". OpenAI. June 11, 2020.
- "TechCrunch – Startup and Technology News". TechCrunch. June 11, 2020. Retrieved July 31, 2020.
If you’ve ever wanted to try out OpenAI’s vaunted machine learning toolset, it just got a lot easier. The company has released an API that lets developers call its AI tools in on “virtually any English language task.”
- Arram (July 9, 2020). "GPT-3: An AI that's eerily good at writing almost anything". Arram Sabeti. Retrieved July 31, 2020.
- Manjoo, Farhad (July 29, 2020). "How Do You Know a Human Wrote This?". The New York Times. ISSN 0362-4331. Retrieved August 4, 2020.
- Weinberg, Justin, ed. (July 30, 2020). "Philosophers On GPT-3 (updated with replies by GPT-3)". Daily Nous. Retrieved July 31, 2020.
- Simonite, Tom (July 22, 2020). "Did a Person Write This Headline, or a Machine?". Wired. ISSN 1059-1028. Retrieved July 31, 2020.
- Claypoole, Theodore (July 30, 2020). "New AI Tool GPT-3 Ascends to New Peaks, But Proves How Far We Still Need to Travel". The National Law Review. Retrieved August 4, 2020.
- Marcus, Gary (December 1, 2018). "The deepest problem with deep learning". Medium. Retrieved September 29, 2020.
- Marcus, Gary; Davis, Ernest (August 22, 2020). "GPT-3, Bloviator: OpenAI's language generator has no idea what it's talking about". MIT Technology Review. Retrieved August 23, 2020.
- GPT-3 (September 8, 2020). "A robot wrote this entire article. Are you scared yet, human? | GPT-3". The Guardian. ISSN 0261-3077. Retrieved September 15, 2020.