A generative pre-trained transformer (GPT) is a type of large language model (LLM) and a prominent framework for generative artificial intelligence. It...
50 KB (4,440 words) - 21:00, 16 December 2024
ChatGPT (redirect from Chat Generative Pre-trained Transformer)
globally. ChatGPT is built on OpenAI's proprietary series of generative pre-trained transformer (GPT) models and is fine-tuned for conversational applications...
202 KB (17,426 words) - 03:52, 22 December 2024
OpenAI o1 (redirect from O1 (generative pre-trained transformer))
OpenAI o1 is a generative pre-trained transformer (GPT). A preview of o1 was released by OpenAI on September 12, 2024. o1 spends time "thinking" before...
12 KB (1,218 words) - 21:28, 22 December 2024
GPT-4 (redirect from Generative Pre-trained Transformer 4)
Generative Pre-trained Transformer 4 (GPT-4) is a multimodal large language model created by OpenAI and the fourth in its series of GPT foundation...
63 KB (6,125 words) - 00:58, 22 December 2024
GPT-3 (redirect from Generative Pre-trained Transformer 3)
Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only transformer...
54 KB (4,915 words) - 02:50, 8 December 2024
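The "decoder-only" design mentioned above comes down to a causal mask in self-attention: each token may attend only to itself and earlier tokens. Below is a minimal numpy sketch of that mechanism; the function name and toy shapes are illustrative assumptions, not details of GPT-3 itself.

```python
import numpy as np

def causal_attention(q, k, v):
    """Scaled dot-product attention with a causal mask: each position may
    attend only to itself and earlier positions, the defining property of
    decoder-only models such as the GPT series."""
    t, d = q.shape
    scores = q @ k.T / np.sqrt(d)                                # (t, t) similarities
    scores[np.triu(np.ones((t, t), dtype=bool), 1)] = -np.inf   # hide the future
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)               # row-wise softmax
    return weights @ v                                           # (t, d) mixed values

# Toy usage: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
print(causal_attention(x, x, x).shape)  # (4, 8)
```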
GPT-2 (redirect from Generative Pre-trained Transformer 2)
Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in its foundational series of GPT models. GPT-2 was pre-trained...
44 KB (3,264 words) - 19:57, 6 December 2024
GPT-1 (category Generative pre-trained transformers)
Generative Pre-trained Transformer 1 (GPT-1) was the first of OpenAI's large language models following Google's invention of the transformer architecture...
32 KB (1,064 words) - 19:57, 6 December 2024
of pre-trained systems, such as generative pre-trained transformers (GPTs) and BERT (bidirectional encoder representations from transformers). For many...
100 KB (12,428 words) - 14:36, 20 December 2024
compared to previous versions. Claude models are generative pre-trained transformers. They have been pre-trained to predict the next word in large amounts of...
14 KB (1,372 words) - 19:51, 21 December 2024
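"Pre-trained to predict the next word" refers to the standard next-token objective: at every position, the model is scored on how much probability it assigns to the token that actually follows. A minimal numpy sketch of that cross-entropy loss, with assumed toy shapes and nothing specific to Claude, follows.

```python
import numpy as np

def next_token_loss(logits, tokens):
    """Average cross-entropy for next-token prediction: the logits at each
    position are scored against the token that actually comes next."""
    preds, targets = logits[:-1], tokens[1:]             # shift by one position
    shifted = preds - preds.max(axis=-1, keepdims=True)  # stable log-softmax
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

# Toy usage: a 5-token sequence over a 10-word vocabulary.
rng = np.random.default_rng(0)
logits = rng.normal(size=(5, 10))      # one logit vector per position
tokens = rng.integers(0, 10, size=5)   # the observed token ids
print(next_token_loss(logits, tokens))
```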
advancements in generative models compared to older long short-term memory (LSTM) models, leading to the first generative pre-trained transformer (GPT), known as...
142 KB (12,348 words) - 05:22, 21 December 2024
GPT-4o (category Generative pre-trained transformers)
GPT-4o ("o" for "omni") is a multilingual, multimodal generative pre-trained transformer developed by OpenAI and released in May 2024. GPT-4o is free,...
18 KB (1,815 words) - 20:21, 22 December 2024
Generative pre-trained transformer, a type of artificial intelligence language model; ChatGPT, a chatbot developed by OpenAI, based on generative pre-trained...
1 KB (181 words) - 00:17, 21 December 2024
GPT-J (category Generative pre-trained transformers)
developed by EleutherAI in 2021. As the name suggests, it is a generative pre-trained transformer model designed to produce human-like text that continues from...
11 KB (982 words) - 19:58, 6 December 2024
Generative Pre-trained Transformer 4chan (GPT-4chan) is a controversial AI model that was developed and deployed by YouTuber and AI researcher Yannic Kilcher...
9 KB (1,124 words) - 06:05, 14 November 2024
DALL-E (category Generative pre-trained transformers)
Initiative. The first generative pre-trained transformer (GPT) model was developed by OpenAI in 2018, using the transformer architecture. The first...
52 KB (3,976 words) - 15:48, 22 December 2024
IBM Watsonx (category Generative pre-trained transformers)
Watsonx is IBM's cloud-based commercial platform for generative AI and scientific data. It offers a studio, data store, and governance toolkit. It supports...
8 KB (628 words) - 11:02, 14 December 2024
OpenAI Codex, which is a modified, production version of Generative Pre-trained Transformer 3 (GPT-3), a language model that uses deep learning to produce...
17 KB (1,674 words) - 10:07, 28 October 2024
of vectors using self-supervised learning. It uses the encoder-only transformer architecture. It is notable for its dramatic improvement over previous...
30 KB (3,383 words) - 16:53, 20 December 2024
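Mapping text "into vectors", as encoder-only models such as BERT do, typically ends with pooling the per-token outputs into one fixed-size embedding; mean pooling is one common convention (BERT also exposes the special [CLS] token's vector). The sketch below assumes toy dimensions and is not BERT's own pipeline.

```python
import numpy as np

def embed_text(token_vectors):
    """Collapse an encoder's per-token output vectors into one fixed-size
    text embedding by mean pooling over the sequence dimension."""
    return token_vectors.mean(axis=0)

# Toy usage: two "sentences" of 6 tokens with 16-dimensional encoder outputs,
# compared by cosine similarity of their pooled embeddings.
rng = np.random.default_rng(0)
a, b = rng.normal(size=(6, 16)), rng.normal(size=(6, 16))
va, vb = embed_text(a), embed_text(b)
print(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))
```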
long stretches of contiguous text. Generative Pre-trained Transformer 2 ("GPT-2") is an unsupervised transformer language model and the successor to...
201 KB (17,519 words) - 07:58, 22 December 2024
as multimodal generative AI. The paper's title is a reference to the song "All You Need Is Love" by the Beatles. The name "Transformer" was picked because...
13 KB (3,664 words) - 16:17, 17 December 2024
language models called generative pre-trained transformers (GPT). They are based on a deep learning architecture called the transformer, which contains artificial...
62 KB (5,889 words) - 12:51, 20 December 2024
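For the transformer architecture these snippets keep referencing, a single simplified block is sketched below: pre-norm masked self-attention plus a two-layer feed-forward network, each with a residual connection. All weight shapes are illustrative assumptions, not the dimensions of any released GPT model.

```python
import numpy as np

def layer_norm(h, eps=1e-5):
    """Normalize each token's features to zero mean and unit variance."""
    return (h - h.mean(-1, keepdims=True)) / (h.std(-1, keepdims=True) + eps)

def transformer_block(x, w_q, w_k, w_v, w_o, w_up, w_down):
    """One simplified pre-norm transformer block: masked self-attention and a
    two-layer feed-forward network, each wrapped in a residual connection."""
    t, d = x.shape
    h = layer_norm(x)
    q, k, v = h @ w_q, h @ w_k, h @ w_v
    scores = q @ k.T / np.sqrt(d)
    scores[np.triu(np.ones((t, t), dtype=bool), 1)] = -np.inf  # causal mask
    a = np.exp(scores - scores.max(-1, keepdims=True))
    a /= a.sum(-1, keepdims=True)
    x = x + (a @ v) @ w_o                     # attention sub-layer + residual
    h = layer_norm(x)
    x = x + np.maximum(h @ w_up, 0) @ w_down  # ReLU feed-forward + residual
    return x

# Toy usage: 4 tokens, 8-dimensional embeddings, 4x-wide hidden layer.
rng = np.random.default_rng(0)
d = 8
shapes = [(d, d)] * 4 + [(d, 4 * d), (4 * d, d)]
params = [rng.normal(size=s) * 0.1 for s in shapes]
x = rng.normal(size=(4, d))
print(transformer_block(x, *params).shape)  # (4, 8)
```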
IBM Granite (category Generative artificial intelligence)
data and generative AI platform Watsonx along with other models, IBM open-sourced some of its code models. Granite models are trained on datasets...
7 KB (499 words) - 04:37, 19 December 2024
Artificial intelligence (section Generative AI)
meaning), transformers (a deep learning architecture using an attention mechanism), and others. In 2019, generative pre-trained transformer (or "GPT")...
271 KB (27,121 words) - 22:40, 20 December 2024
BLOOM (language model) (category Generative pre-trained transformers)
176-billion-parameter transformer-based autoregressive large language model (LLM). The model, as well as the code base and the data used to train it, are distributed...
4 KB (502 words) - 15:47, 9 December 2024
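"Autoregressive" means tokens are produced one at a time, each conditioned on everything generated so far. The sketch below shows that sampling loop in numpy; step_fn is a hypothetical stand-in for a real model, mapping the token ids so far to next-token logits.

```python
import numpy as np

def generate(step_fn, prompt, n_new, vocab_size, seed=0):
    """Autoregressive sampling: repeatedly feed the sequence so far to the
    model (here any function from token ids to next-token logits) and append
    one token drawn from the resulting distribution."""
    rng = np.random.default_rng(seed)
    tokens = list(prompt)
    for _ in range(n_new):
        logits = step_fn(tokens)
        probs = np.exp(logits - logits.max())   # softmax over the vocabulary
        probs /= probs.sum()
        tokens.append(int(rng.choice(vocab_size, p=probs)))
    return tokens

# Toy usage: a dummy "model" that strongly prefers (last token + 1) mod 10.
dummy_model = lambda ts: 5.0 * np.eye(10)[(ts[-1] + 1) % 10]
print(generate(dummy_model, [3], 6, vocab_size=10))  # e.g. [3, 4, 5, 6, ...]
```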
Gemini, formerly known as Bard, is a generative artificial intelligence chatbot developed by Google. Based on the large language model (LLM) of the same...
127 KB (9,108 words) - 19:52, 22 December 2024