GPT-2 Was Released With 1.5 Billion Parameters


Generative Pre-trained Transformer 2 (GPT-2) is a transformer-based machine learning model for automatic text generation. Combining natural language processing with deep learning, it can perform a range of text tasks such as question answering, summarization, and translation. The model has 1.5 billion parameters. It generates convincing short passages but tends to lose coherence over longer stretches of text. Given a priming statement (a prompt), it generates the rest on its own.
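As a rough sketch of the prompt-driven workflow described above, the snippet below loads GPT-2 through the Hugging Face transformers library (an assumption; the entry itself names no tooling) and continues a short priming statement:

```python
# A minimal sketch of prompt-based generation with GPT-2, assuming the
# Hugging Face `transformers` library is installed (pip install transformers).
from transformers import pipeline

# "gpt2" is the small public checkpoint; "gpt2-xl" corresponds to the
# full 1.5-billion-parameter model discussed above.
generator = pipeline("text-generation", model="gpt2")

# Give the model a priming statement (the prompt); it handles the rest.
prompt = "Artificial intelligence will change the way we write because"
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(result[0]["generated_text"])
```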

February 2019