GPT-2

A direct scale-up of GPT-1 with 1.5 billion parameters, trained on a dataset of 8 million web pages. Known for generating coherent text that can be difficult to distinguish from human writing, though its output often became repetitive over longer passages.
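As an illustration of the generation behavior described above, here is a minimal sketch using the Hugging Face transformers library (an assumption; the original does not name a toolkit). The "gpt2-xl" identifier is Hugging Face's name for the 1.5-billion-parameter checkpoint, and the no-repeat n-gram constraint is one common way to curb the repetition noted above.

```python
# Minimal sketch: text generation with GPT-2, assuming the Hugging Face
# transformers library. "gpt2-xl" is the 1.5B-parameter checkpoint there.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2-xl")

# Sampling plus a no-repeat n-gram constraint helps mitigate the
# repetitive output the model is known for.
outputs = generator(
    "The history of language models",
    max_new_tokens=50,
    do_sample=True,
    top_k=50,
    no_repeat_ngram_size=3,
)
print(outputs[0]["generated_text"])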

Conversation, Summarization, Analysis
Provider
OpenAI
Release Date
November 5, 2019
Size
Medium
Parameters
1.5B
