I understand the content did not meet your needs, and you have every right to voice your disappointment, but I don't agree with calling it "low effort SEO". I have no personal interest in promoting any of these models. I shared the article believing it might be helpful for people who are exploring alternatives to GPT (as I was a week ago).
> I have no personal interest in promoting any of these models.
Could you please add direct links to the home page or repo of each of the projects in the article? Readers expect to be able to click on the name of a project to see the project first-hand. The lack of these links combined with the "contact us" sales pitch at the bottom is part of what makes this article look spammy and seems to contradict your claim of "no personal interest".
And then I'm talking with corporate managers who are searching for alternatives and are simply lost with this leaderboard: they have no clue what these scores mean for them, whether they can use the models commercially, or what the use cases for each of them are. Different people, different needs. Thanks for your opinion, though.
One can run a LLaMA 13B finetune on a 2020 laptop with 16 GB of RAM, at (for me) ~230 ms/token, and hook it up to DuckDB with off-the-shelf open source UIs (roughly the kind of setup sketched below).
But it is kind of "stupid" compared to GPT-3.5 unless you make the jump to Falcon 40B or LLaMA 30B/65B. And (right now) you need a pretty beefy PC for those models.
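For concreteness, here's a minimal sketch of that 13B + DuckDB setup, using llama-cpp-python with a quantized GGUF model to turn a question into SQL that DuckDB then runs. The model path, table schema, and prompt are placeholder assumptions, not my exact configuration.

```python
# Minimal sketch: a local quantized LLaMA-13B finetune (via llama-cpp-python)
# generates DuckDB SQL for a natural-language question, and DuckDB executes it.
# Model file, parquet file, schema, and prompt are placeholders.
import duckdb
from llama_cpp import Llama

# A 4-bit GGUF of a 13B model fits comfortably in 16 GB of RAM.
llm = Llama(model_path="models/llama-13b-finetune.Q4_K_M.gguf", n_ctx=2048)

con = duckdb.connect()
con.execute("CREATE TABLE sales AS SELECT * FROM 'sales.parquet'")

question = "What were total sales per region in 2022?"
prompt = (
    "You write DuckDB SQL. Table: sales(region TEXT, year INT, amount DOUBLE).\n"
    f"Question: {question}\nSQL:"
)

# Greedy decoding, short completion, stop at the first blank line.
out = llm(prompt, max_tokens=128, temperature=0.0, stop=["\n\n"])
sql = out["choices"][0]["text"].strip()

print(sql)
print(con.execute(sql).fetchall())
```

The open source UIs mentioned above basically wrap this loop with a chat window and schema introspection; the point is only that nothing here needs a GPU or an API key.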