It's definitely better than the original using Markov chains. It fits this use case very well, and in my opinion only this use case.

GPT-2 is still very random and quite stupid.

You start it with your love for your girlfriend as context, and two paragraphs later she becomes a cam girl into hardcore anal. You start with religion, and you get "Muslims must be exterminated". You start with software, and you get a description of non-existent hardware with instructions for setting up a VPN in the middle. You start with news, and you can read that China supports the Islamic State.

That's cool because it has more context than Markov chains, which usually have only three words of context, but there's still a long way to go before I trust anything generated by this kind of algorithm.
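
For comparison, here's roughly what a word-level Markov chain with a 3-word context looks like, a minimal sketch in Python (the corpus file and all names are made up for illustration, not from any real project):

    import random
    from collections import defaultdict

    CONTEXT = 3  # the ~3-word window mentioned above

    def build_model(words):
        # Map each 3-word context to the words observed to follow it.
        model = defaultdict(list)
        for i in range(len(words) - CONTEXT):
            model[tuple(words[i:i + CONTEXT])].append(words[i + CONTEXT])
        return model

    def generate(model, seed, length=50):
        # Slide the 3-word window, sampling an observed successor each step.
        context, out = tuple(seed), list(seed)
        for _ in range(length):
            successors = model.get(context)
            if not successors:  # dead end: no observed continuation
                break
            word = random.choice(successors)
            out.append(word)
            context = context[1:] + (word,)  # drop oldest word, append newest
        return " ".join(out)

    words = open("corpus.txt").read().split()  # hypothetical corpus file
    model = build_model(words)
    print(generate(model, words[:3]))

Everything outside that sliding window is invisible to the model, which is why Markov output drifts mid-sentence, whereas GPT-2 attends over up to 1024 tokens, hence the longer-range coherence, even when the content itself is nonsense.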

This stuff is pretty much indistinguishable from the real thing...

https://www.reddit.com/r/SubSimulatorGPT2/comments/f1pypf/so...
