OpenAI Built the Generative Pre-Trained Transformer (GPT) Model
Generative Pre-Trained Transformer (GPT) is an NLP model for generative language modeling. Its core idea is that unsupervised generative pre-training on a large unlabeled text corpus can be combined with supervised fine-tuning to give much stronger results than supervised training alone. GPT from OpenAI showed promising results and paved the way for the later GPT-2 and GPT-3 models. GPT used roughly 117 million parameters, a very large number at the time. In essence, GPT is pre-training on a huge unlabeled corpus followed by task-specific fine-tuning.
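The sketch below is a minimal, hypothetical PyTorch illustration of that two-stage recipe, not OpenAI's actual code or architecture: a tiny causal transformer is first trained on next-token prediction over raw text, then the same trunk is reused with a small classification head for supervised fine-tuning. All sizes, names, and the toy data are assumptions made for illustration.

```python
# Toy illustration of GPT's recipe: unsupervised next-token pre-training,
# then supervised fine-tuning with a small task head (hypothetical model).
import torch
import torch.nn as nn

VOCAB, DIM, CTX = 1000, 64, 32  # toy sizes, chosen only for illustration

class TinyGPT(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(DIM, VOCAB)   # used during pre-training
        self.cls_head = nn.Linear(DIM, 2)      # added for fine-tuning

    def forward(self, tokens):
        # Causal mask so each position only attends to earlier tokens,
        # mimicking a decoder-only transformer.
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        return self.blocks(self.embed(tokens), mask=mask)

model = TinyGPT()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Stage 1: unsupervised pre-training - predict the next token on raw text.
tokens = torch.randint(0, VOCAB, (8, CTX))    # stand-in for an unlabeled corpus
hidden = model(tokens[:, :-1])
lm_loss = loss_fn(model.lm_head(hidden).reshape(-1, VOCAB),
                  tokens[:, 1:].reshape(-1))
lm_loss.backward(); opt.step(); opt.zero_grad()

# Stage 2: supervised fine-tuning - reuse the pre-trained trunk for labels.
labels = torch.randint(0, 2, (8,))            # stand-in for task labels
hidden = model(tokens)
cls_loss = loss_fn(model.cls_head(hidden[:, -1]), labels)
cls_loss.backward(); opt.step(); opt.zero_grad()
```

The point of the sketch is only the workflow: the same weights that learn to predict the next word in stage 1 are the starting point for the supervised task in stage 2, which is what made GPT's results notable.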
June 2018