-
The effect of various text generation methods on the outputs of GPT-2
When generating text using the GPT-2 Large model, we found that both the method of generation and the text prompt used have a statistically significant effect on the output produced. In four out of six trials, we found that the Nucleus Sampling method proposed by Holtzman et al. (2020)…
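For readers who want the mechanics behind the method being tested: nucleus (top-p) sampling keeps only the smallest set of tokens whose cumulative probability reaches p, renormalizes over that set, and samples from it. A minimal sketch of the idea, assuming a plain NumPy probability vector; the function name and toy distribution are illustrative, not the post's code:

```python
import numpy as np

def nucleus_sample(probs, p=0.8, rng=None):
    """Sample a token id from `probs` with nucleus (top-p) sampling:
    keep the smallest set of tokens whose cumulative probability
    reaches p, renormalize, and sample from that set."""
    rng = rng or np.random.default_rng()
    order = np.argsort(probs)[::-1]               # token ids, most probable first
    cumulative = np.cumsum(probs[order])
    cutoff = int(np.searchsorted(cumulative, p)) + 1  # size of the nucleus
    nucleus = order[:cutoff]
    nucleus_probs = probs[nucleus] / probs[nucleus].sum()
    return rng.choice(nucleus, p=nucleus_probs)

# Toy distribution over a 5-token vocabulary (illustrative values).
probs = np.array([0.5, 0.25, 0.15, 0.07, 0.03])
print(nucleus_sample(probs, p=0.8))  # nucleus is tokens 0, 1, 2 (mass ~0.9)
```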
-
A Crash Course in Generating and Measuring Neural Text with GPT-2
A full crash course in neural text generation. We review various methods of generating text with GPT-2 (the “little brother” of GPT-3), including Beam Search, Top-K and Top-P sampling. We also review some key metrics of generated text, including perplexity and repetition, and try to get a more intuitive sense of these measures.
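As a taste of what the crash course covers, here is roughly how these decoding strategies are typically invoked with the Hugging Face transformers library; the prompt and parameter values below are illustrative, not necessarily those used in the post:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-large")
model = GPT2LMHeadModel.from_pretrained("gpt2-large")

inputs = tokenizer("The meaning of life is", return_tensors="pt")
pad_id = tokenizer.eos_token_id  # GPT-2 has no pad token; reuse EOS

# Beam search: deterministic, keeps the num_beams highest-scoring continuations.
beams = model.generate(**inputs, max_length=50, num_beams=5,
                       early_stopping=True, pad_token_id=pad_id)

# Top-K sampling: sample the next token from the k most probable candidates.
top_k = model.generate(**inputs, max_length=50, do_sample=True,
                       top_k=50, pad_token_id=pad_id)

# Top-P (nucleus) sampling: sample from the smallest token set whose
# cumulative probability exceeds p (top_k=0 disables top-k filtering).
top_p = model.generate(**inputs, max_length=50, do_sample=True,
                       top_p=0.92, top_k=0, pad_token_id=pad_id)

print(tokenizer.decode(top_p[0], skip_special_tokens=True))
```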