Haiku Generation with the GPT-2 Language Model

There’s something so unpretentious about haiku that I enjoy. And even though they are not meant to rhyme, the syllable structure just rolls off your tongue.

For my foray into using machine learning to generate art, I took OpenAI’s GPT-2 language model and fine-tuned it on a haiku dataset. In the dataset, I tokenized seasonal (kigo), nature (shizen), and contrast (kireji) nouns and verbs so they could be used to randomly augment the dataset while preserving the 5-7-5 syllable structure. This resulted in just over 13,200 haiku poem structures that can be viably augmented this way.
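The augmentation idea can be sketched in a few lines of Python. This is a simplified illustration, not the actual pipeline: the word banks, the `<KIGO>`/`<SHIZEN>` placeholder names, and the crude vowel-group syllable counter are all my stand-ins here, and a real implementation would use a pronunciation dictionary for syllable counts.

```python
import re
import random

# Hypothetical word banks; the real kigo/shizen token lists are much larger.
KIGO = ["autumn", "blossom", "snowfall"]
SHIZEN = ["river", "mountain", "moonlight"]

def count_syllables(word):
    """Rough heuristic: drop a silent trailing 'e', count vowel groups."""
    w = re.sub(r"e$", "", word.lower())
    return max(1, len(re.findall(r"[aeiouy]+", w)))

def line_syllables(line):
    return sum(count_syllables(w) for w in re.findall(r"[a-zA-Z']+", line))

def augment(template, rng=random):
    """Fill <KIGO>/<SHIZEN> slots, keeping only fills that preserve 5-7-5."""
    targets = (5, 7, 5)
    filled = []
    for line, target in zip(template.strip().split("\n"), targets):
        for _ in range(100):  # retry random fills until the syllable budget fits
            candidate = re.sub(r"<KIGO>", lambda m: rng.choice(KIGO), line)
            candidate = re.sub(r"<SHIZEN>", lambda m: rng.choice(SHIZEN), candidate)
            if line_syllables(candidate) == target:
                filled.append(candidate)
                break
        else:
            return None  # no valid fill found for this line
    return "\n".join(filled)

poem = augment("<KIGO> wind at dusk\n<SHIZEN> under the old bridge\n<KIGO> fading slow")
```

Each augmented template yields a fresh 5-7-5 training example, which is how a modest set of structures can be expanded into a much larger fine-tuning corpus.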


The plan is to build a website where people can generate their own machine-learning haiku. Coming soon!