Hacker News

As someone who works on a Python library devoted solely to making AI text generation accessible to ordinary people (https://github.com/minimaxir/aitextgen ), I think the headline is misleading.

Although the article focuses on the release of GPT-Neo, even GPT-2, released in 2019, was good at generating text; it just spat out a lot of garbage that required curation, which GPT-3/GPT-Neo still requires, albeit with a better signal-to-noise ratio. Most GPT-3 demos on social media are survivorship bias. (In fact, OpenAI's rules for the GPT-3 API strongly encourage curating such output.)
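To make the curation point concrete: since any GPT-style model emits a mix of good and bad samples, the usual workflow is to over-generate and filter. A minimal sketch below, where `score` is a hypothetical stand-in heuristic (not from any library); in practice you would rank by perplexity under a second model or curate by hand:

```python
# Over-generate, then keep only the top-scoring candidates.
# score() is a stand-in heuristic: it penalizes degenerate repetition,
# a common failure mode in raw GPT-2 output.

def score(text: str) -> float:
    words = text.split()
    if not words:
        return 0.0
    # Ratio of unique words: low values indicate repetitive garbage.
    return len(set(words)) / len(words)

def curate(candidates: list[str], keep: int = 3) -> list[str]:
    """Return the `keep` highest-scoring candidates, best first."""
    return sorted(candidates, key=score, reverse=True)[:keep]

samples = [
    "the the the the the the the the",
    "A quick brown fox jumps over the lazy dog.",
    "Text generation quality varies wildly between samples.",
    "spam spam spam spam spam",
]
print(curate(samples, keep=2))
```

The point is the over-generate-then-rank shape, not the particular heuristic; swapping in a real scorer leaves the structure unchanged.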

GPT-Neo, meanwhile, is such a big model that it requires a bit of data engineering work to get it running and generating text (see the README: https://github.com/EleutherAI/gpt-neo ), and it's currently unclear whether it's as good as GPT-3, even when comparing models apples-to-apples (i.e. the 2.7B GPT-Neo against the "ada" GPT-3 via OpenAI's API).

That said, Hugging Face is adding support for GPT-Neo to Transformers (https://github.com/huggingface/transformers/pull/10848 ), which will make playing with the model easier, and I'll add support to aitextgen if it pans out.
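For reference, once that PR lands, loading GPT-Neo through Transformers should look roughly like the standard `pipeline` workflow. A sketch under that assumption; the hub model id `EleutherAI/gpt-neo-2.7B` is my guess at the eventual name, and the actual call is left commented out because the checkpoint is a multi-gigabyte download:

```python
# Hypothetical usage assuming the Transformers PR is merged; the
# model id below is an assumption, not confirmed by the PR itself.
MODEL_ID = "EleutherAI/gpt-neo-2.7B"

def build_generator(model_id: str = MODEL_ID):
    # Imported lazily so merely defining this sketch doesn't pull in
    # transformers or trigger the (large) model download.
    from transformers import pipeline
    return pipeline("text-generation", model=model_id)

# Example call (commented out to avoid the multi-GB download):
# generator = build_generator()
# generator("The meaning of life is", max_length=30)
```

This is the same `pipeline("text-generation", ...)` interface Transformers already exposes for GPT-2, which is why library-level support matters: it erases the data engineering work the GPT-Neo README currently requires.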




Totally off topic: can you fix the pip3 installer for aitextgen? I just filed an issue on the GitHub issue tracker.


I don't know when it started (it's been years since I wrote anything in Python), but over the last couple of years I've been seeing way, way more of these generic Python environment/configuration errors around the internet that are hard to diagnose and debug. Has something happened with Python's configuration and dependency management?


It's more of a Homebrew issue, which is likely what the OP hit. (Incidentally, I just hit a similar-but-unrelated Homebrew Python environment issue.)



