The unreasonable effectiveness of recipe generation with the GPT-2 sample model

The release of the OpenAI GPT-2 sample language model from the paper Language Models are Unsupervised Multitask Learners (also see Better Language Models and Their Implications) gives a glimpse of what is to come. The paper describes how the training data was collected by following outbound links from Reddit, which got me thinking about what types of content the model has seen. I have experimented with triggering recipe generation from the model by using “recipe” and similar conditioning texts.
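
To give an idea of what conditional generation looks like in practice, here is a minimal sketch. It uses the Hugging Face transformers port of the released GPT-2 weights rather than OpenAI's original sample scripts, and the prompt and sampling settings below are only illustrative, not the exact ones from my experiments.

```python
# A rough illustration of conditioning GPT-2 on a recipe-flavoured prompt.
# Uses the Hugging Face transformers port of the released GPT-2 weights,
# not OpenAI's original sample scripts; prompt and settings are examples.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # the small released model
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Recipe for a classic apple pie\n\nIngredients:\n"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

output = model.generate(
    input_ids,
    do_sample=True,          # sample instead of greedy decoding
    max_length=200,
    temperature=0.9,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```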

Continue reading

Christmas carols and death metal lyrics in TensorFlow

Image of saints singing.

After the previous experiment with a character-based recurrent neural network (RNN) for romantic novel titles, I wanted to learn more about word-level RNNs. I was happy to find that Sung Kim has made it easy to explore word-level RNNs with TensorFlow in this repository. The training text is 50/50 Christmas carols and death metal lyrics.
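
As a standalone sketch of what a word-level language model involves (this is not the code from that repository; "lyrics.txt" is just a placeholder name for the combined carol/metal training text), something like the following works:

```python
# A minimal, standalone word-level RNN language model (not the code from
# the repository above). "lyrics.txt" is a placeholder for the combined
# carol/metal training text.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

words = open("lyrics.txt", encoding="utf-8").read().lower().split()
vocab = sorted(set(words))
word_to_id = {w: i for i, w in enumerate(vocab)}
ids = np.array([word_to_id[w] for w in words])

seq_len = 10  # predict the next word from the previous 10 words
X = np.array([ids[i:i + seq_len] for i in range(len(ids) - seq_len)])
y = ids[seq_len:]

model = tf.keras.Sequential([
    layers.Embedding(len(vocab), 128),
    layers.LSTM(256),
    layers.Dense(len(vocab), activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=64, epochs=20)

# Generate a handful of words by repeatedly sampling the predicted
# next-word distribution and feeding the result back in.
generated = ids[:seq_len].tolist()
for _ in range(20):
    probs = model.predict(np.array([generated[-seq_len:]]), verbose=0)[0]
    probs = probs.astype("float64")
    probs /= probs.sum()  # guard against floating-point drift in the softmax
    generated.append(int(np.random.choice(len(vocab), p=probs)))
print(" ".join(vocab[i] for i in generated))
```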

Output examples are below (I added line breaks and bolded the lines that seemed like reasonable song names :-). The resulting texts seem choppier than those from the character-level RNN I tried previously, but still fun! Continue reading

Nurse Christmas Havin’s Love or Using Deep Learning for Romantic Novel Titles

A couple kissing behind a handheld fan.

Torch is a scientific computing framework with wide support for machine learning algorithms. Andrej Karpathy has an excellent blog post explaining recurrent neural networks (RNNs) and character-level models. With his sample code for Torch, it is very easy to get started creating your own RNN from text in a specific domain. The model learns to predict the next character in a sequence and can then be used to generate text, character by character, that looks similar to the original training data. Continue reading
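
The generation step is a simple loop: feed the model the text so far, get a probability distribution over the next character, sample one character, append it, and repeat. A minimal sketch of the temperature-scaled sampling step in Python/NumPy (rather than Torch; probs stands in for the softmax output of a trained next-character model):

```python
import numpy as np

def sample_next_char(probs, temperature=1.0):
    """Sample an index from a next-character distribution.

    Lower temperatures give safer, more repetitive text; higher
    temperatures give more surprising (and more garbled) output.
    """
    probs = np.asarray(probs, dtype="float64")
    logits = np.log(probs + 1e-8) / temperature
    scaled = np.exp(logits)
    scaled /= scaled.sum()
    return int(np.random.choice(len(scaled), p=scaled))
```

Calling this repeatedly, each time re-encoding the generated text and asking the model for a new distribution, produces the character-by-character samples described above.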