GPT is short for Generative Pre-trained Transformer, a language model created by Alec Radford and published in 2018 by OpenAI, the artificial intelligence research laboratory co-founded by Elon Musk. It uses a generative model of language (trained to predict the next token in a sequence) and is able to acquire knowledge of the world and process long-range dependencies by pre-training on diverse sets of written material with long stretches of contiguous text.
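The "generative model of language" mentioned above boils down to next-token prediction: the model assigns a probability to every possible next token and is trained to minimize the negative log-likelihood of the token that actually follows. A minimal sketch of that objective, using toy, purely illustrative probabilities rather than a real model:

```python
import math

# Toy next-token distribution for the context "the cat sat on the".
# All tokens and probabilities here are illustrative assumptions,
# not output from an actual GPT model.
next_token_probs = {"mat": 0.7, "dog": 0.2, "moon": 0.1}

def nll(target: str) -> float:
    """Negative log-likelihood of the target next token --
    the quantity a generative language model is trained to minimize."""
    return -math.log(next_token_probs[target])

# The loss is small when the true next token is one the model
# considers likely, and large when it is not.
loss_likely = nll("mat")     # -log(0.7), roughly 0.357
loss_unlikely = nll("moon")  # -log(0.1), roughly 2.303
print(round(loss_likely, 3), round(loss_unlikely, 3))
```

Summed over long stretches of contiguous text, this same objective is what pre-training optimizes; no labels are needed, which is why the training is described as unsupervised.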
GPT-2 (Generative Pre-trained Transformer 2) was announced in February 2019. It is an unsupervised transformer language model trained on eight million documents, totaling 40 GB of text drawn from articles shared on Reddit.