
Generative pre-trained transformer - Wikipedia
A generative pre-trained transformer (GPT) is a type of large language model (LLM) [1][2][3] that is widely used in generative AI chatbots.[4][5] GPTs are based on a deep learning architecture called …
Introduction to Generative Pre-trained Transformer (GPT)
Dec 12, 2025 · Generative Pre-trained Transformer (GPT) is a large language model that can understand and produce human-like text. It works by learning patterns, meanings and relationships …
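To make "learning patterns" concrete, below is a minimal sketch (not taken from the page above) of the self-supervised pre-training objective a GPT-style model is trained with: predict each next token from the tokens that precede it. The `model` and `token_ids` names are placeholder assumptions for any PyTorch-style causal language model and a batch of token ids.

```python
import torch
import torch.nn.functional as F

# Toy illustration of the self-supervised pre-training objective:
# the model learns by predicting each next token from the tokens before it.
# `model` is assumed to map token ids of shape (batch, seq_len)
# to logits of shape (batch, seq_len, vocab_size).

def next_token_loss(model, token_ids):
    inputs = token_ids[:, :-1]      # tokens the model sees
    targets = token_ids[:, 1:]      # the "next token" at every position
    logits = model(inputs)          # (batch, seq_len - 1, vocab_size)
    return F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.reshape(-1),
    )
```

Minimizing this loss over large text corpora is what "pre-training" refers to; the patterns, meanings, and relationships mentioned above emerge from that single next-token objective.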
What is GPT (generative pre-trained transformer)? | IBM
Generative pre-trained transformers (GPTs) are a family of advanced neural networks designed for natural language processing (NLP) tasks. These large language models (LLMs) are based on …
What is GPT AI? - Generative Pre-Trained Transformers Explained
Generative Pre-trained Transformers, commonly known as GPT, are a family of neural network models that use the transformer architecture and are a key advancement in artificial intelligence (AI) …
This review provides a detailed overview of the Generative Pre-trained Transformer, including its architecture, working process, training procedures, enabling technologies, and its impact on various …
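As a rough illustration of the transformer architecture these snippets refer to, the sketch below implements single-head causal self-attention in NumPy. The projection matrices and shapes are illustrative assumptions; real GPT models stack many multi-head attention layers together with feed-forward blocks, normalization, and learned embeddings.

```python
import numpy as np

# Minimal sketch of causal scaled dot-product self-attention,
# the core operation of the decoder-only transformer.

def causal_self_attention(x, Wq, Wk, Wv):
    # x: (seq_len, d_model); Wq, Wk, Wv: (d_model, d_head) projections
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])            # pairwise similarities
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -np.inf                              # causal mask: no looking ahead
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over earlier positions
    return weights @ v                                  # weighted mix of value vectors
```

The causal mask is the "decoder-only" part: each position may attend only to itself and earlier positions, which is what lets the same network be used to generate text one token at a time.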
What is GPT AI (Generative Pre-trained Transformer)?
May 19, 2025 · GPT AI, short for Generative Pre-trained Transformer, is a breakthrough in the field of artificial intelligence that has transformed how machines understand and generate human language.
What is GPT (Generative Pre-Trained Transformers)? - H2O
GPT, short for Generative Pre-Trained Transformers, is an advanced open-source language model that utilizes transformer architectures to …
What even is a generative pre-trained transformer (GPT)?
Sep 30, 2024 · Exploring the basics of Generative Pre-Trained Transformers (GPT), this article breaks down how they work, their core mechanisms, and how smaller versions can be trained for specific …
Mastering Generative Pre-Trained Transformer Systems
Jun 4, 2025 · This article explains Generative Pre-Trained Transformers (GPTs), detailing their capabilities in writing, answering, summarizing, and conversing. It explores the underlying …
GPT-3 - Wikipedia
Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only [2] transformer deep neural network, which …
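GPT-3's weights are available only through OpenAI's API, but its openly released predecessor GPT-2 is likewise a decoder-only transformer. Below is a minimal sketch of generating text from GPT-2 using the Hugging Face transformers library; the library choice, prompt, and generation settings are assumptions of this example, not something the snippet above specifies.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the openly available GPT-2 checkpoint (a decoder-only transformer).
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Encode a prompt and generate a greedy continuation, one token at a time.
inputs = tokenizer("The transformer architecture", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Generation simply applies the pre-trained next-token predictor repeatedly: each new token is appended to the context and fed back in, which is how the same decoder-only model is used by GPT-2, GPT-3, and the chatbots built on their successors.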