Revision history of "Generative Pre-trained Transformer"


Legend: (cur) = difference with latest revision, (prev) = difference with preceding revision, m = minor edit.

  • curprev 10:06, 19 January 2023 Benkoo2 talk contribs 622 bytes +105
  • curprev 10:05, 19 January 2023 Benkoo2 talk contribs 517 bytes +517 Created page with "{{Blockquote |text= Generative Pretrained Transformer (GPT) is a type of natural language processing (NLP) model that uses deep learning to generate text. It is based on the Transformer architecture, which was first introduced in 2017 by Google researchers. GPT models are trained on large datasets of text and can be used to generate new text that is similar to the training data. GPT models can be used for a variety of tasks, such as summarization, question answering, and..."