Generative Pre-trained Transformer


Generative Pre-trained Transformer (GPT) is a type of natural language processing (NLP) model that uses deep learning to generate text. It is based on the Transformer architecture, introduced by Google researchers in the 2017 paper "Attention Is All You Need". GPT models are trained on large datasets of text and can generate new text that resembles the training data. They can be applied to a variety of tasks, such as summarization, question answering, and machine translation.

— ChatGPT
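
To make the generation step concrete, here is a minimal sketch using the Hugging Face `transformers` library and the public `gpt2` checkpoint (both are illustrative assumptions, not part of this page): it loads a pre-trained GPT model and samples a continuation of a prompt.

```python
# Minimal sketch of GPT-style text generation, assuming the
# Hugging Face `transformers` library and the public "gpt2" checkpoint.
from transformers import pipeline

# Load a small pre-trained GPT model; the pipeline handles
# tokenization, decoding, and sampling internally.
generator = pipeline("text-generation", model="gpt2")

# Sample a continuation of the prompt; with do_sample=True the
# output resembles, but does not copy, the training data.
result = generator(
    "The Transformer architecture was introduced in 2017",
    max_new_tokens=40,
    do_sample=True,
)
print(result[0]["generated_text"])
```

The same pipeline pattern extends to the tasks listed above (for example, `pipeline("summarization")` or `pipeline("translation_en_to_fr")`), each backed by an appropriately fine-tuned checkpoint.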

