Hacker News | groundtruthdev's comments

Would you feel comfortable flying on an airplane where the programmers don’t care about secure code, correctness, or the ability to reason about and optimize algorithms—where “good enough” is the philosophy? Most people intuitively say no, because in safety-critical and large-scale systems, engineering rigor isn’t optional. Software may look intangible, but when it runs aircraft, banking systems, or global platforms, the same discipline applies.

The “Facebook/YouTube codebases are a mess so code quality doesn’t matter” line is also misleading. Those companies absolutely hire—and pay very well—engineers who obsess over security, performance, and algorithmic efficiency, because at that scale engineering quality directly translates to uptime, cost, and trust.

Yes, the visible product layers move fast and can look messy. But underneath are extremely disciplined infrastructure, security, and reliability teams. You don’t run global systems on vibe-coded foundations. People who genuinely believe correctness and efficiency don’t matter wouldn’t last long in the parts of those organizations that actually keep the lights on.


Do you think the people writing the code that operates aircraft care about code quality? After the Boeing incident, I don't.


Fair point, and that's exactly why Airbus has been eating Boeing's lunch. When engineering culture takes a back seat to cost, schedule, and optics, outcomes diverge fast. In safety-critical systems, rigor isn't optional; it's the competitive advantage.


I find it difficult to believe software is Airbus' competitive edge. First, their software for aircrew bidding is an absolute and utter disaster. Date filtering has been broken for nearly a year despite multiple releases being pushed. Date management is like THE KEY functionality of aircrew bidding. I also use their flight plan software, and it's like they never bothered to ask a pilot how they use a flight plan in flight.

I think Airbus is riding the coattails of solid engineering done in the 80s, continuing to iterate that platform, vs Boeing trying to iterate on a hardware platform from the 60s. Airbus benefited significantly from 20 years of engineering and technological progress. Since the original design of the A320, changes have been incremental: slightly different engines, addition of GPS/GNS, CPDLC, CRT to LCD screens. Meanwhile Boeing has attempted to take a steam gauge design from the 60s and retrofit decades of technology improvements and, critically, they added new engines that significantly altered the aerodynamics of the aircraft.


Which Boeing incident? The 737 Max was a correct implementation of bad requirements -- there's no indication of a code quality problem here. Starliner definitely had more indications of code issues, but was not an aircraft.


This defense is missing the point. Yes, humans aren't remote-driving the cars, and yes, most miles are autonomous. But the relevant question isn't how often a human intervenes; it's how many humans must be continuously available for the system to function at all. Even if interventions are rare, Waymo still needs operators on shift, fully alert, low-latency, and trained for local conditions, and that cost exists whether they're doing something or not.

Capacity planning is driven by correlated failures, not averages: blackouts, construction, special events, and weather can cause many vehicles to request help at once, and we've already seen queues form. That means the human layer is sized for worst-case concurrency, not "99.99% of miles."

So no, it's not "just guys in the Philippines driving cars," but it's also not "so infrequent you wouldn't believe." It's a highly autonomous system with a permanent human ops shadow, and the fact that this work is offshored strongly suggests that shadow is economically material. Miles are autonomous. Ops are not.
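The sizing point can be made concrete with a toy simulation. Every number here is made up for illustration (fleet size, per-vehicle intervention rate, correlated-event probability), not anything from Waymo; the only claim is that staffing is set by the tail of the concurrency distribution, not its mean.

```python
import random

random.seed(0)

FLEET = 500           # hypothetical fleet size
P_INDEP = 0.01        # assumed chance a given vehicle needs help in a given hour
P_EVENT = 0.02        # assumed chance of a correlated event (blackout, weather, etc.)
EVENT_FRACTION = 0.2  # assumed fraction of the fleet affected when one hits
HOURS = 10_000

def requests_this_hour() -> int:
    """Concurrent help requests: independent failures plus occasional correlated bursts."""
    n = sum(random.random() < P_INDEP for _ in range(FLEET))
    if random.random() < P_EVENT:
        n += int(FLEET * EVENT_FRACTION)
    return n

samples = sorted(requests_this_hour() for _ in range(HOURS))
avg = sum(samples) / HOURS
p999 = samples[int(0.999 * HOURS)]  # staff for the bad hours, not the average ones

print(f"average concurrent requests: {avg:.1f}")
print(f"99.9th-percentile staffing needed: {p999}")
```

With these toy numbers the average hour needs only a handful of operators, but covering 99.9% of hours requires roughly an order of magnitude more, because the rare correlated events dominate the tail. That gap is the "permanent human ops shadow": you pay for the peak capacity even when almost nobody is intervening.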


In this hypothetical world where AI reliably generates software, large and small software providers alike are out of luck. Companies will go straight to LLMs or open-source models, fine-tune them for their needs, and run them on in-house hardware as costs fall, spreading expenses across departments. Even LLM providers won't be safe. Brand, lock-in, and incumbent status won't save anyone. The advantage goes to whoever can integrate, customize, and scale internally. "Hypothetical" is the key word.


What are the other consequences of unlimited cheap reliable quality software? It's hard to think about but feels more important than just SaaS companies going bankrupt.


Sounds like a great opportunity for my company! Who can I hire to help me figure out how to do this stuff?


Short answer: no. In a world where AI reliably generates software, companies will bypass SaaS vendors, even the large ones, and go straight to the best LLM providers for tailored solutions. Brand, lock-in, and capital won't save traditional vendors.


I'm not talking about the saas companies. I'm talking about the meta/google/apple/openAI/anthropic etc. They can clone any decent business overnight now, cheaper and faster than you can.

That's the point: they win, we all lose.


Only if you and everyone else decide to use theirs

Work on alternatives, be indie too


If AI really makes software cheap and fast, the future isn't generic SaaS clones competing in hours. Companies will just generate their own hyper-custom internal versions: Salesforce clones tailored to their exact workflows. Brand and lock-in won't save vendors; internal control and cost savings will.


To the article’s author: what is the timeline for removing human engineers from your own organization?


I imagine it's even easier to remove the CEO/Executive staff. Actually, why have anyone there at all? Surely this company can LLM their way to having no staff whatsoever!


Yeah, extraordinary claims need some internal consistency before external evangelism. I’d expect the same from other companies whose CEOs make these kinds of claims, like Nvidia and Anthropic.

