
Show HN: Python module to easily generate text using a pretrained char-rnn - minimaxir
https://github.com/minimaxir/textgenrnn
======
neoncontrails
Nice to see a preexisting model ship with the code. It has always surprised me how
difficult it is to find and compare trained RNNs, given how tiny a typical
char-RNN model is and how long they usually take to train. I hope it soon
becomes viable to load different models on demand, perhaps models trained
on the same corpus but differing in their parameter choices, so that one
could glimpse the qualitative impact of, say, window size on the model's
output. (I'm especially curious what the output looks like at n > 1000: does it
degrade? Show diminishing returns? Do longer windows help RNNs stay on topic
for longer than a sentence or two? I'd really like to know the answer, but CUDA
runs out of memory on my machine somewhere around n = 256. So I'm hoping more
projects like yours take off :)
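A char-RNN conditions on a learned hidden state rather than a literal character window, so this isn't the same mechanism, but a toy character n-gram sampler is a quick, self-contained way to build intuition for what the window size n does: tiny windows drift off topic almost immediately, while large windows collapse toward memorizing the training text. This is just an illustrative sketch, not anything from textgenrnn itself.

```python
import random

def train_char_ngram(text, n):
    """Map each length-n context in `text` to the list of characters
    observed immediately after it."""
    model = {}
    for i in range(len(text) - n):
        ctx = text[i:i + n]
        model.setdefault(ctx, []).append(text[i + n])
    return model

def generate(model, seed, length, n):
    """Sample up to `length` characters, always conditioning on only the
    last n characters of the output (the 'window')."""
    out = seed
    for _ in range(length):
        ctx = out[-n:]
        choices = model.get(ctx)
        if not choices:
            break  # unseen context: nothing to continue with
        out += random.choice(choices)
    return out

corpus = "the quick brown fox jumps over the lazy dog"

# n = 1: each character is chosen from everything that ever followed
# that single character, so the sample wanders incoherently.
small = generate(train_char_ngram(corpus, 1), "t", 30, 1)

# n = 8: every 8-char context in this corpus is unique, so generation
# is forced to reproduce the training text verbatim.
large = generate(train_char_ngram(corpus, 8), corpus[:8], 100, 8)
```

The interesting regime for an RNN is presumably somewhere between these two failure modes, which is exactly why side-by-side pretrained models at different window sizes would be so useful to compare.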

