
>I think there will be open source models at GPT-4 level that can run on consumer GPUs within a year or two.

There are indeed already open source models rivaling GPT-3.5, but GPT-4 is an order of magnitude better.

The sentiment that GPT-4 is going to be surpassed by open source models soon is something I only notice on HN. Makes me suspect people here haven't really tried the actual GPT-4, but have instead used the various scammy services like Bing that claim to run GPT-4 under the hood when they clearly are not.

Makes me suspect you don't follow the HN user base very closely.


You're 100% right, and I'm sorry you're getting downvoted; in solidarity I will eat downvotes with you.

HN is funny right now because LLMs are all over the front page constantly, but there's a lot of HN "I am an expert because I read comment sections" type behavior. So many not-even-wrong comments that start from "I know LLaMA is local and C++ is a programming language and I know llama.cpp is on GitHub and software improves and I've heard of Mistral."
