
Actually, it is not true. Hilarious

The author compares two different tokenizers: Facebook's NLLB and GPT-2. Where did the title come from?
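
For anyone who wants to reproduce that comparison, here's a rough sketch using the Hugging Face tokenizers. The NLLB checkpoint name and the sample sentences are my own assumptions, not something taken from the article:

    # Sketch: compare token counts from an NLLB tokenizer and the GPT-2 tokenizer.
    # The checkpoint "facebook/nllb-200-distilled-600M" is one public NLLB model;
    # the article may have used a different one.
    from transformers import AutoTokenizer

    nllb = AutoTokenizer.from_pretrained("facebook/nllb-200-distilled-600M")
    gpt2 = AutoTokenizer.from_pretrained("gpt2")

    # Illustrative sample sentences (my assumption), one English, one Ukrainian.
    samples = {
        "en": "The quick brown fox jumps over the lazy dog.",
        "uk": "Швидка руда лисиця стрибає через ледачого пса.",
    }

    for lang, text in samples.items():
        n_nllb = len(nllb(text, add_special_tokens=False)["input_ids"])
        n_gpt2 = len(gpt2(text)["input_ids"])
        print(f"{lang}: NLLB={n_nllb} tokens, GPT-2={n_gpt2} tokens")

NLLB was trained as a multilingual translation model, so its tokenizer tends to split non-English text far less aggressively than GPT-2's English-heavy BPE.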

Another point is that OpenAI changed the tokenizer for its chat models. Link: https://github.com/openai/openai-cookbook/blob/main/examples...

English is now less heavily favored in terms of token usage, and other languages are much better balanced. E.g. Ukrainian now takes only about twice as many tokens as English, whereas before it took roughly six times more.
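
You can see the shift with tiktoken. A minimal sketch (the sample sentences are my own, and the exact counts will vary with the text you pick):

    # Sketch: compare the older GPT-2/GPT-3 encoding with cl100k_base,
    # the encoding used by the chat models (gpt-3.5-turbo, gpt-4).
    import tiktoken

    old_enc = tiktoken.get_encoding("gpt2")         # GPT-2 / original GPT-3
    new_enc = tiktoken.get_encoding("cl100k_base")  # chat models

    # Illustrative sample sentences (my assumption), one English, one Ukrainian.
    samples = {
        "en": "The quick brown fox jumps over the lazy dog.",
        "uk": "Швидка руда лисиця стрибає через ледачого пса.",
    }

    for lang, text in samples.items():
        print(f"{lang}: gpt2={len(old_enc.encode(text))} tokens, "
              f"cl100k_base={len(new_enc.encode(text))} tokens")

On Cyrillic text the gpt2 encoding often falls back to near byte-level splitting, which is where the large old ratios came from; cl100k_base's bigger vocabulary narrows that gap considerably.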


