A generative pre-trained transformer (GPT) is a type of large language model (LLM) and a prominent framework for generative artificial intelligence. It...
50 KB (4,444 words) - 06:46, 9 November 2024
OpenAI o1 (redirect from O1 (generative pre-trained transformer))
OpenAI o1 is a generative pre-trained transformer. A preview of o1 was released by OpenAI on September 12, 2024. o1 spends time "thinking" before it answers...
11 KB (1,109 words) - 15:04, 25 October 2024
ChatGPT (redirect from Chat Generative Pre-trained Transformer)
globally. ChatGPT is built on OpenAI's proprietary series of generative pre-trained transformer (GPT) models, and is fine-tuned for conversational applications...
199 KB (17,246 words) - 11:30, 7 November 2024
GPT-2 (redirect from Generative Pre-trained Transformer 2)
Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. GPT-2 was pre-trained...
44 KB (3,264 words) - 11:35, 29 October 2024
GPT-3 (redirect from Generative Pre-trained Transformer 3)
Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only transformer...
54 KB (4,915 words) - 00:40, 2 October 2024
GPT-4 (redirect from Generative Pre-trained Transformer 4)
Generative Pre-trained Transformer 4 (GPT-4) is a multimodal large language model created by OpenAI, and the fourth in its series of GPT foundation models...
62 KB (6,004 words) - 04:24, 8 November 2024
GPT-1 (category Generative pre-trained transformers)
Generative Pre-trained Transformer 1 (GPT-1) was the first of OpenAI's large language models following Google's invention of the transformer architecture...
32 KB (1,064 words) - 15:45, 8 May 2024
of pre-trained systems, such as generative pre-trained transformers (GPTs) and BERT (Bidirectional Encoder Representations from Transformers). For many...
99 KB (12,358 words) - 08:46, 1 November 2024
GPT-4o (category Generative pre-trained transformers)
GPT-4o ("o" for "omni") is a multilingual, multimodal generative pre-trained transformer developed by OpenAI and released in May 2024. GPT-4o is free,...
17 KB (1,782 words) - 18:32, 3 November 2024
advancements in generative models compared to older long short-term memory (LSTM) models, leading to the first generative pre-trained transformer (GPT), known as...
140 KB (12,184 words) - 21:14, 7 November 2024
can also analyze images. Claude models are generative pre-trained transformers. They have been pre-trained to predict the next word in large amounts of...
13 KB (1,269 words) - 22:47, 5 November 2024
Generative pre-trained transformer, a type of artificial intelligence language model; ChatGPT, a chatbot developed by OpenAI, based on generative pre-trained...
1,006 bytes (133 words) - 08:50, 17 October 2024
GPT-J (category Generative pre-trained transformers)
developed by EleutherAI in 2021. As the name suggests, it is a generative pre-trained transformer model designed to produce human-like text that continues from...
11 KB (982 words) - 01:29, 8 November 2024
Generative Pre-trained Transformer 4Chan (GPT-4chan) is a controversial AI model that was developed and deployed by YouTuber and AI researcher Yannic Kilcher...
9 KB (1,124 words) - 04:52, 23 September 2024
DALL-E (category Generative pre-trained transformers)
Initiative. The first generative pre-trained transformer (GPT) model was developed by OpenAI in 2018, using the Transformer architecture. The first...
52 KB (3,970 words) - 15:37, 18 October 2024
called generative pre-trained transformers (GPT), such as GPT-4o, to generate text. GPT models are large language models that are pre-trained to predict...
31 KB (3,212 words) - 23:26, 23 October 2024
long stretches of contiguous text. Generative Pre-trained Transformer 2 ("GPT-2") is an unsupervised transformer language model and the successor to...
194 KB (16,843 words) - 00:48, 8 November 2024
IBM Watsonx (category Generative pre-trained transformers)
Watsonx is IBM's cloud-based commercial generative AI and scientific data platform. It offers a studio, data store, and governance toolkit. It supports...
7 KB (596 words) - 22:39, 24 October 2024
OpenAI Codex, which is a modified, production version of the Generative Pre-trained Transformer 3 (GPT-3), a language model using deep learning to produce...
17 KB (1,674 words) - 10:07, 28 October 2024
IBM Granite (category Generative artificial intelligence)
data and generative AI platform Watsonx along with other models, IBM opened the source code of some code models. Granite models are trained on datasets...
7 KB (481 words) - 22:53, 24 October 2024
as multimodal Generative AI. The paper's title is a reference to the song "All You Need Is Love" by the Beatles. The name "Transformer" was picked because...
8 KB (2,713 words) - 11:31, 23 October 2024
Artificial intelligence (section Generative AI)
meaning), transformers (a deep learning architecture using an attention mechanism), and others. In 2019, generative pre-trained transformer (or "GPT")...
267 KB (26,753 words) - 07:05, 9 November 2024
popular Internet spaces without mention of the full theory. Generative pre-trained transformers (GPTs) are a class of large language models (LLMs) that employ...
33 KB (3,041 words) - 01:07, 8 November 2024
BLOOM (language model) (category Generative pre-trained transformers)
176-billion-parameter transformer-based autoregressive large language model (LLM). The model, as well as the code base and the data used to train it, are distributed...
4 KB (500 words) - 17:28, 8 September 2024
example, generative pre-trained transformers (GPT), which use the transformer architecture, have become common to build sophisticated chatbots. The "pre-training"...
71 KB (6,775 words) - 06:57, 9 November 2024
OpenAI Codex (category Generative pre-trained transformers)
Research Access Program. Based on GPT-3, a neural network trained on text, Codex was additionally trained on 159 gigabytes of Python code from 54 million GitHub...
13 KB (1,306 words) - 01:56, 1 April 2024
(2022-10-01). "GPTQ: Accurate Post-Training Quantization for Generative Pre-trained Transformers". arXiv:2210.17323 [cs.LG]. Dettmers, Tim; Svirschevski,...
159 KB (13,490 words) - 07:06, 5 November 2024
Microsoft Copilot (category Generative pre-trained transformers)
Microsoft Copilot is a generative artificial intelligence chatbot developed by Microsoft. Based on the GPT-4 series of large language models, it was launched...
58 KB (5,317 words) - 14:04, 4 November 2024
Internet, medicine, and artificial intelligence, in particular generative pre-trained transformers. In economics, it is theorized that initial adoption of a...
12 KB (823 words) - 02:38, 14 July 2024
model; Deep linguistic processing; Factored language model; Generative pre-trained transformer; Katz's back-off model; Language technology; Statistical model...
14 KB (2,212 words) - 19:23, 6 November 2024