Generative Pre-trained Transformer
Definition
A family of autoregressive language models developed by OpenAI that use the transformer architecture to generate human-like text. GPT models are pre-trained on large text corpora using a next-token prediction objective, then fine-tuned for specific applications.
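The core idea of next-token prediction and autoregressive generation can be illustrated with a toy sketch. The example below uses simple bigram counts as a stand-in for the transformer; GPT models learn far richer statistics, but the generation loop — predict a token, append it, feed the extended sequence back in — is the same in spirit. All names here (`predict_next`, `generate`, the tiny corpus) are illustrative, not part of any OpenAI API.

```python
from collections import Counter, defaultdict

# Toy corpus; a real GPT is pre-trained on trillions of tokens.
corpus = "the cat sat on the mat the cat ate".split()

# Count bigram transitions as a crude stand-in for a learned model
# of P(next token | context).
transitions = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    transitions[cur][nxt] += 1

def predict_next(token):
    """Greedy next-token prediction: the most frequent successor."""
    return transitions[token].most_common(1)[0][0]

def generate(start, n):
    """Autoregressive generation: each predicted token is appended
    and used as context for the next prediction."""
    out = [start]
    for _ in range(n):
        out.append(predict_next(out[-1]))
    return " ".join(out)
```

Here the context is only the previous token; a GPT model conditions on the entire preceding sequence via self-attention, which is what makes its generations coherent over long spans.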