I don't think the point here should be the cost, but the fact that you are sending everything you write to OpenAI to train their models on your information. The option of a local model allows you to preserve the privacy of what you write.
I like that.
Assuming for the moment that they aren't saying that with their fingers crossed behind their backs, it doesn't change the fact that they store the inputs they receive and swear they'll protect them (paraphrasing from the Content section of the above link). Even if it's not fed back into the LLM, the fact that they store the inputs anywhere for any period of time is a huge privacy risk -- after all, a breach is a matter of "when", not "if".
Reddit has the worst search functionality. Drives me mad. I use Google to search for Reddit content. If Reddit search improves, Google will lose even more of its value.
By the time I visit HN, all posts are already upvoted for me. Thanks to all the voters. So I hardly feel the need to vote myself. I don't know how posts with zero votes get off the ground, but much appreciation to whoever reads them and upvotes.