Hacker News | fnoef's comments

“At this point, I think I know more about manufacturing than anyone currently alive on Earth” - Elon Musk [0]

[0] https://m.youtube.com/shorts/S2Bo3S99Tas


This is such an absurd take.

For starters, if I'm a "house builder" by trade, then yeah, I am going to build the house myself. Otherwise, why should the client pay me, and not the guy I'm subcontracting?

Secondly, there is no such thing as a "house builder" profession. It consists of a lot of different tradespeople, some of whom have the legal power to sign off on your house build (for example, an electrician). Now, we could try to push for something similar in software engineering and, say, require an "authentication engineering certificate" to handle code related to auth, so that only a certificate holder could approve such code for production use. But I'm pretty sure all the vibe coders and tech bros would cry about how unfair and bureaucratic the system is.

But of course the entire SWE profession is based on grifting: extracting as much money as possible from the customers while cutting costs. If you are so afraid of saving passwords to a database, then at least don't call yourself a software engineer.


> For starters, if I'm a "house builder" by trade

You're not a house builder, you're a widget maker who needs a house to live in. Auth is almost never your startup's core competency or offering. Spending one of your very valuable five engineers on the auth tarpit while you lose deals because SSO is hard could be life and death for you.


Isn’t it a bit ironic that a (presumably statically generated) blog post about “programming sucks” is being choked to death by HN?

Yeah, this was just a failure on my part; I was too lazy to go the ISR route and was on the Cloudflare free plan. I wasn't expecting any traffic haha.

Get a VPS and host it there. Costs less than a cup of coffee a month, with tens of TB of traffic.

Nice article by the way!


oh I probably should, but I'm kinda busy with checks notes trying to feed my family :D

Not really, no.

Since this post was inspired by "Programming Sucks," it's quite on point that the traffic it generated broke something.


"Make sure to double check everything, and MAKE NO MISTAKES!!!"

Don't hallucinate!

"YOU'RE A SENIOR SOFTWARE ENGINEER!!!"

"Ultrathink!"

Record the existing container id, rescale the service to 2 instances (hence bringing a second container up), wait for the second one to be healthy, (optional) stop directing traffic to the old container, wait a few seconds, stop the old container, rescale the service back to 1 instance.

Here's a CLI plugin that automates this: https://github.com/wowu/docker-rollout

Another vote for this - we’ve been using it for years without issues.

Blue-green deployment. There must be a Docker tool to handle this, or at least a bash script.

edit: thanks to next comment for referencing one


I’m wondering why Anthropic, which has “the most powerful, hold my beer, AI in the world”, just didn’t vibe code its own, better version of bun? Hasn’t Dario said that coding would be cooked in 6 months, like 12 months ago?

Looks like coding is in a downward spiral towards complete chaos

When I was a kid, we were told to be cautious with third-party dependencies: that such code can do anything, and that it's a risk to evaluate.

With the new generation of YOLO NPM scripters, they simply don't evaluate the risks. They will even fight back, telling you that it's the way things are done.

In reality, this is exactly what we were warned about back then: the result of mindlessly importing third-party dependencies without thinking.

In other words, the risks were always there; the new "modern way", let's put it that way, just doesn't put in the effort anymore.


That, combined with an unwillingness to write a few functions oneself, which one could easily do and thereby avoid adding a dependency at all. But it is also a result of trying to do everything quickly, quickly! and of being pushed to work that way.

The more one knows about computer programming, algorithms, data structures, how things are usually implemented in general, the better one can avoid unnecessary dependencies. Needs the right environment though to execute on that.


> that's the result of mindlessly importing third-party dependencies without thinking

tbf, most tech-related corporate environments don't want you to think, just do (KPI, MBO, OKR, et al.), and this is one of the results

> When I was a kid ... With the new generation

Let's be real tho, there's a whole lot of people who have been around long enough to know better who do this too.


My Linux server runs a cron job, that can spin off a thread and even use other ~apps~ tools. Did I invent AGI?

Does your Linux server decide what processes it should launch at what time, with a theory of what will happen next, in order to complete a goal you specified in natural language? If so, yes, I reckon you sure have!

Claude does not have a "theory" of anything, and I'd argue applying that mental model to LLM+Tools is a major reason why Claude can delete a production database.

Well, humans also routinely accidentally delete production databases. I think at this point, arguing that LLMs are just clueless automatons with no idea what they are doing is a losing battle.

They’re not clueless; they just don’t have a memory, and they don’t have judgement.

They create the illusion of being able to make decisions, but they are always just following a simple template. They do not consider nuance, and they cannot judge between two difficult options in any real sense.

Which is why they can delete prod databases, and why they cannot do expert-level work.


>they cannot do expert level work

Well, this is just factually incorrect, considering they are currently on par with grad students in some areas of mathematics.


Not sure if you are being pedantic, but mathematics is quite different from other fields: it is highly structured, its reasoning is explicit, and it has a dense volume of high-level training data. Results can be verified easily thanks to that structure.

Even then, they are most effective as assistants and are not able to produce results independently. If you have proof otherwise, I would love to read up on it.


I like to think of LLMs as idiot savants. Exceptional at certain tasks, but might also eat the table cloth if you stop paying attention at the wrong time.

With humans, you can kind of interview/select for a more normalized distribution of outcomes, with outliers being less probable, but not impossible.


When you're applying reasoning like this, sure, why not? What difference would it make?

I mean maybe it’s a losing battle today, but it is correct. So in a few years when the dust settles, we’ll probably all be using LLMs as clueless automatons that still do useful work as tools

So... systemd is AGI now?

Maybe. But probably not. It doesn't matter if it's AGI though. If those other apps and tools do simple things that are predictable, then we can be pretty sure what will happen. If those tools can modify their own configuration and create new cron jobs, it becomes much harder to say anything about what will happen.

Most of us work on software that can modify its own configuration and create new jobs. I, too, have worked in ansible and terraform.

The key break here is the lack of predictability, and I think it's important that we don't get too starry-eyed: we should accept that this might be a weakness, not a strength.


Well, do you make 100 billion bucks with it? If not, then it's not AGI.

Many of these developers adopted the tools against their will, as a means to bring home a salary while they still can. In the meantime, the AI folks are working hard to eliminate their jobs entirely.

You know, perspective matters. Selling a knife with the promise of a tool that helps you cut onions is a completely different story from marketing it as a weapon to kill your neighbor.

AI is massively marketed by AI people as a tool to replace your job. So either the AI people are bad at marketing, or the gains in other industries are insignificant / do not generate shareholder value.


> AI is massively marketed by AI people as a tool to replace your job

Keep in mind who pays the AI companies.

It's not you, it's the C-levels. The marketing is aimed at them.

