cyprx's comments | Hacker News

It won't work when every single PRD now has the word "extensible". I think overcomplexity often comes from requirements/business use cases first.


Meanwhile the CTOs plan to apply AI to their production codebases :)


Wow, every Rust topic has an uncountable number of comments; it's indeed a successful language.


I tried Grok 3 with Think and it got it right too, with pretty good thinking.


I don't have access to Think, but I tried Grok 3 regular, and it was hilarious, one of the longest answers I've ever seen.

Just giving the headings, without any of the long text between each one where it realizes it doesn't work, I get:

    Solution
        [... paragraphs of text omitted each time]
    Issue and Revision
    Revised Solution
    Final Solution
    Correct Sequence
    Final Working Solution
    Corrected Final Solution
    Final Correct Solution
    Successful Solution
    Final answer
    Correct Final Sequence
    Final Correct Solution
    Correct Solution
    Final Working Solution
    Correct Solution
    Final Answer
    Final Answer
Each time it's so confident that it's worked out the issue, and now, finally, it has the correct, final, working solution. Then it blows it again.

I'm surprised I didn't start seeing heading titles such as "Working solution-FINAL (3) revised updated ACTUAL-FINAL (2)"


Totally agree. A team full of 10x engineers would just introduce new tech every month; if not, they would get bored pretty fast and leave the team anyway. Most of the time normal engineers are more suitable, especially during the maintenance phase, where there are many boring/repetitive tasks.


I had been using Cursor for a month until one day my house had no internet, and then I realized I had started forgetting how to write code properly.


I had the exact same experience. Pretty sure this happens in most cases; people just don't realize it.


Just get a Mac Studio with 512GB RAM and run a local model when the internet is down.


Which local model would you recommend that comes close to Cursor in response quality? I have tried DeepSeek, Mistral, and a few others. None comes close to the quality of Cursor. I keep coming back to it.


Possibly useful comment on local models, perhaps also fitting on machines with less RAM:

https://news.ycombinator.com/item?id=43340989
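For anyone who actually wants to try the local route, here's a minimal sketch of talking to a locally served model from a script. It assumes Ollama (or any other OpenAI-compatible local server) is already running on localhost:11434 and that some coding model such as qwen2.5-coder has been pulled; the model name and prompt are just placeholders, not recommendations.

    # Minimal offline sketch: query a local model through an
    # OpenAI-compatible endpoint (assumes Ollama is serving on localhost).
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # local server, no internet required
        api_key="unused",                      # local servers typically ignore the key
    )

    resp = client.chat.completions.create(
        model="qwen2.5-coder",  # example name; use whatever model you have pulled
        messages=[{"role": "user",
                   "content": "Explain what this does: def f(x): return x * x"}],
    )
    print(resp.choices[0].message.content)

None of this matches Cursor's quality, as noted above, but it keeps working when the internet doesn't.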


[flagged]


A $10k backup plan? That makes sense. No wonder you used a throwaway.


[flagged]


The hardware isn't free. Someone asks a question, and your answer is "who cares" about $10k of hardware hanging around as a subpar backup?


See, the thing is, I never wanted to comment on the cost or feasibility of the hardware at all. What I was commenting on was that any backup plan is expected to be subpar by its very nature, and if not, it should be instantly promoted. If you'll notice, that was 100% of what I said. I was adding to the pile of "this plan is stupid". Cursor has an actual value proposition.

Of course, then you disrespected me with a rude ad hominem and got a rude response back. Ignoring the point and attacking the person is a concession.

For the record, I and many others use throwaways every single thread. This isn't and shouldn't be Reddit.


You're right, I shouldn't have said the throwaway bit, sorry. However, you're ignoring the context of the conversation, which is a $10k piece of hardware. I don't know what you expected to add to the conversation by saying "who cares?" when someone asks for advice, in context or even in isolation.


Wrong. The user is asking for recommended models (for offline use); they're not saying "yes, in fact I will burn $10,000 on a computer", not at all lol.


Back up your $20-a-month subscription with a $2000 Mac Studio for those days when your internet is down.

Peak HN.


Lol he suggested a $10k Mac Studio

But you can at least resell that $10k Mac Studio, theoretically.


Trying to do that with a 32 GB M1 laptop, and it's hard to get even 1000 euros for it in the Netherlands, whereas the refurbished price is double that.


Even more absurd is that a Mac Studio with 512 GB RAM costs around $9.5k.


> Peak HN.

But, alas, not a single upvote.


Maybe this "backup" solution.. developed into commodity hardware as an affordable open source solution that keeps the model and code locally and private at all times is the actual solution we need.

Let's say a cluster of Raspberry Pis / low-powered devices producing results as good as Claude 3.7 Sonnet. Would it be completely infeasible to create a custom model that is trained on your own code base and might not be a fully fledged LLM, but provides similar features to Cursor?

Have we all gone bonkers sending our code to third parties? The code is the thing you want to keep secret unless you're working on an open source project.


$2000? You wish!


Lol, not sure where I got the 2k from. Brain fart, but I'll let it stand :D


Can one run Cursor with local LLMs only?


... to make *completely* sure that they forgot how to program?


But then I’d be using a Mac, and that would slow my development down and be generally miserable.


lol


Me too. I completely forgot the standard library and basic syntax of my daily language. Wow. I went back to VS Code and only use Cursor to ask the AI model questions.

