The unreasonable effectiveness of recipe generation with the GPT-2 sample model

The release of the OpenAI GPT-2 sample language model from the paper Language Models are Unsupervised Multitask Learners (see also Better Language Models and Their Implications) shows great promise for what is to come. The paper describes how the training data was collected by following outbound links from Reddit, which got me thinking about what types of content the model has seen. I have experimented with triggering recipe generation from the model by using “recipe” and similar conditioning texts....
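
The idea is simply to feed the model a short conditioning prompt and sample a continuation. As a minimal sketch of the same technique using the Hugging Face transformers library and its "gpt2" checkpoint (an assumption on my part; the post used OpenAI's released sample code, and the prompt and sampling parameters here are illustrative):

```python
# Sketch: conditional sampling from the small GPT-2 model via the
# Hugging Face transformers library (not the original OpenAI sample code).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# A recipe-flavored conditioning text nudges the model toward recipe output.
prompt = "Recipe for chocolate chip cookies:\n"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample (rather than greedy-decode) so repeated runs give varied recipes.
output = model.generate(
    input_ids,
    max_length=200,
    do_sample=True,
    top_k=40,          # truncated sampling, as in the GPT-2 release samples
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```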

February 27, 2019 · Peter Krantz

Nurse Christmas Havin's Love, or Using Deep Learning for Romantic Novel Titles

Torch is a scientific computing framework with wide support for machine learning algorithms. Andrej Karpathy has an excellent blog post explaining recurrent neural networks (RNNs) and character-level models. With his sample Torch code, it is easy to get started training your own RNN on text from a specific domain. The model learns to predict the next character in a sequence and can then be used to generate text, character by character, that resembles the original training data....
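
Karpathy's char-rnn is written in Lua Torch; as an illustrative analogue (my own sketch, not the code from the post), here is a minimal character-level RNN in PyTorch that trains on a toy corpus and then samples text one character at a time:

```python
# Sketch: a tiny character-level RNN language model in PyTorch.
import torch
import torch.nn as nn

text = "the quick brown fox jumps over the lazy dog. " * 50  # toy corpus
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}

class CharRNN(nn.Module):
    def __init__(self, vocab, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h  # logits over next characters

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=3e-3)
loss_fn = nn.CrossEntropyLoss()

data = torch.tensor([stoi[c] for c in text])
seq_len = 40
for step in range(200):  # a few steps, just to show the training loop
    i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
    x = data[i:i + seq_len].unsqueeze(0)       # input sequence
    y = data[i + 1:i + seq_len + 1].unsqueeze(0)  # same sequence shifted by one
    logits, _ = model(x)
    loss = loss_fn(logits.view(-1, len(chars)), y.view(-1))
    opt.zero_grad(); loss.backward(); opt.step()

# Generate text character by character by sampling from the softmax.
x, h = data[:1].unsqueeze(0), None
out = [itos[x.item()]]
for _ in range(100):
    logits, h = model(x, h)
    probs = torch.softmax(logits[0, -1], dim=0)
    x = torch.multinomial(probs, 1).unsqueeze(0)
    out.append(itos[x.item()])
print("".join(out))
```

On real training data (romance novel titles, say) the same loop would run for many more steps over a much larger corpus, but the prediction target is identical: the next character given everything seen so far.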

December 23, 2015 · Peter Krantz