fmbb's comments | Hacker News

Fwiw, and since you received several comments about it, your first comment did not come off to everyone as making excuses. It was pretty clear you were trying to turn people's attention to the real problem.

There was also no fatalistic tone about the system being too powerful to change. Just clear sharing of observations IMO.

It is not unusual to receive this reaction (being blamed for fatalism and making excuses) from observations like these, I have noticed.


I suspect a lot of people commenting in this thread have never been on one of these ships or have any idea of the typical state of maintenance, or how inaccessible the tech compartments are when the vessel is underway. This isn't exactly a server room environment. When vessels are new (in the first five years or so) and under their first owners they are usually tip-top. Then, after the first sale, the rot sets in, and unless there is a major overhaul you will see a lot of issues like these; usually they do not have such terrible consequences. Vessels tend to last for 25 years or so (barring mishaps), and by then the number of repairs will be in the hundreds and the vessel will have changed hands a couple of times.

Passenger carrying vessels are better, but even there you can come across some pretty weird stuff.

https://eu.usatoday.com/story/travel/cruises/2025/08/27/msc-...

And that one was only three years old, go figure.


System 7 had built in tools to read and write DOS disks: https://en.wikipedia.org/wiki/Apple_File_Exchange

I distinctly remember how it was the bare minimum. You'd mount a disk or open a plain text file, and there'd be a lot of strange characters that weren't decoded properly.

And that's why we all had to buy a copy of MacLinkPlus!

50 is not even close.

Those banners often list up to 3000 "partners".


The government can bail Bitcoin owners out by buying a lot of Bitcoin and holding it, or even burning the wallets.

Fun and easy to reskin as well!

Depends on what the datacenters are used for.

AI has no utility.

Almonds make marzipan.


AI has way more utility than you are claiming and less utility than Sam Altman and the market would like us to believe. It’s okay to have a nuanced take.

"AI has no utility" is a pretty wild claim to make in the tail end of 2025.

Still surprised to see so many treat this as such a hot take. Is there hype? Absolutely. Is there value being delivered? Also absolutely.

Whenever I see someone say AI has no utility, I'm happy that I don't have to waste time in an argument against someone out of touch with reality.

I'm more unhappy than happy, as there are plenty of points about the very real bad side of AI that are hurt by such delusional and/or disingenuous arguments.

That is, the topic is not one where I have already picked a side that I'd like to win by any means necessary. It's one where I think there are legitimate tradeoffs, and I want the strongest arguments on both sides to be heard so we get the best possible policies in the end.


I agree, but you can't win against religious people. Better spend your time talking to the rest of us.

The article made a really interesting and coherent argument, for example. That's the kind of discourse around the topic I'd like to see.


If AI is so useful, we should have a ton of data showing an increase in productivity across many fields. GDP should be skyrocketing. We haven’t seen any of this, and every time a study is conducted, it’s determined that AI is modestly useful at best.

I don't need data to know that I use it every day and offload lots of tasks to it.

It's like the smooth brains who still post "lol 6 fingers"

Well, I don't like marzipan, so both are useless? Or maybe different people find uses/utility in different things; what is trash for one person can be life-saving for another, or just "better than not having it" (like you and marzipan, it seems).

OK, in that case you don't need to pick on water in particular. If it has no utility at all, then literally any resource use is too much, so why bother insisting that water in particular is a problem? It's pretty energy-intensive, for example.

Marzipan is fun, but useful?

AI is at least as useful as marzipan.


Activated almonds create Funko Pops. I'd still take the data centers over the Funko Pop-buying basedboys that almonds produce.

AI has no utility _for you_ because you live in this bubble where you are so rabidly against it you will never allow yourself to acknowledge it has non-zero utility.

> It's amazing how far we've regressed in efficency.

I don’t think we have. This is always what efficiency leads to, higher resource consumption. The phenomenon was described already in the 1800s: https://en.wikipedia.org/wiki/Jevons_paradox

JS and the web have seen performance improvements. They lead to more ads being served and more code being released faster to users.


> This is always what efficiency leads to, higher resource consumption.

That's not the same thing. If you make batteries more efficient then people build more devices that run on batteries and then you need more batteries than ever. But you also get a bunch of new devices you didn't used to have.

When computers get more efficient, miserly corporations cut staff or hire less competent programmers and force their customers to waste the efficiency gains on that, because they don't have enough competition or the customers are locked in by a network effect. The difference is whether you actually get something for it instead of having it taken from you so somebody else can cheap out.


Both of you are right.

Without any regulations companies will create software that costs more to the users, but saves pennies to the company.

So, we have regressed in efficiency.

They are not mutually exclusive but one follows from the other.


Your framing is correct

It's company vs. user, not regression vs. efficiency.


There's a lot of advanced home insulation out there. In theory, buying expensive insulation is much better than making energy cheaper, because you only pay for the insulation once, and then save money indefinitely from it. But most people don't re-insulate their homes (or try to find cold-spots, seal leaks, etc). Despite the fact that we have more efficient insulation, it hasn't driven up demand for insulation. Why is this? The idea that efficiency == increased demand is a logical idea. But humans aren't logical.

We have more efficient hardware, so we should be seeing hardware everywhere. But actually we all use the same amount of hardware we did 20 years ago. We all have a desktop, a laptop, a smartphone, a modem, hell even a computer watch, like we did 20 years ago. But they're more efficient now.

Where we do see more hardware now, is in pre-existing appliances, like fridges, TVs. And why is there more hardware? Sometimes it's just a remote control ("turn off TV"). But more often, the increase in adoption follows a specific motive: like figuring out that they could sell ads or subscriptions through it. And the hardware itself is not what's making the ads work: it's the software, that collects the information, feeds it to companies over networks, lets them data-mine it and sell it continuously. Both of these are just serving a human interest to make more money through the creative use of surveillance and marketing. And honestly, most of this could've been done with hardware and software 20 years ago. But because it's in vogue now, we see more of the hardware and software now.

We are comforted by coming up with an explanation that makes logical sense, like the paradox. But the paradox applies most when it coincides with an unrelated human interest. What motivates us is not A+B=C, but a combination of different calculations that sometimes involve A and B, and incidentally add up to C.


> This is always what efficiency leads to, higher resource consumption. The phenomenon was described already in the 1800s: https://en.wikipedia.org/wiki/Jevons_paradox

Completely wrong and irrelevant analogy!

I see where you went sideways: you confused trigger with consequence. Here the efficiency for the very same application got very, very, incredibly, galactically worse. Not better. The premise of the linked article is that the same application gets more efficient, and then comes the increased use of the affected resource. Here the same application went to complete shit efficiency-wise, and it had no effect on memory manufacture and prices; WhatsApp is not that significant in computing.

Probably a better analogy: when technological (and tightly related economic) advances raise the availability of resources (here memory and CPU), things go dumb. If anything, the generalized Parkinson's law (generalized from time to any resource) is relevant here: increasing available resources beyond what is reasonable leads to waste, bad-quality outcomes, and overcomplication.


The resource is compute, flops, instructions, cpu seconds, bogomips. And RAM.

The application is "business logic".

The engine is JS. The more efficient JS engines get, the more compute and memory JS will use to deliver business logic in the universe.


A reasonable comment, unfairly downvoted.

That said, I do firmly agree with the parent: there is choice involved here, engineering decisions.

The Microsoft world is particularly bloated, as they insist on shoehorning in unwanted anti-features into their OS. Much more efficient operating systems (and ways of building a chat client) exist.

Jevons' paradox may describe a general tendency, but it's no excuse for awful software.


Oh it’s not an excuse for anything. It is just an observation about our economic system.

> It is an old language, and there are a few features that could probably be left out like the OOP-related features, and some libraries in the ecosystem over-complicate things like in Haskell.

If one can stand a language that is just a little bit older, there is always Standard ML. It is like OCaml, but perfect!


I love Standard ML (I'm currently writing a compiler for it), but it's far from perfect. There are weird special cases in the language definition, like the ad-hoc overloading between int and real, or the annoying value restriction. Records also feel half-baked, with no support for updating fields or for partially specified records where the full set of fields is not known.
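
To make the record complaint concrete, here's roughly the functional update OCaml has and Standard ML lacks (a tiny sketch; the point type and field names are invented purely for illustration):

    (* A hypothetical record type, purely for illustration. *)
    type point = { x : int; y : int; label : string }

    let origin = { x = 0; y = 0; label = "origin" }

    (* Functional update: copy [origin] but replace one field.
       Standard ML has no such syntax; Successor ML adds one. *)
    let shifted = { origin with x = 10 }

    let () =
      Printf.printf "%s: (%d, %d)\n" shifted.label shifted.x shifted.y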


> with no support for updating fields

While it's not yet standard, nearly all Standard ML implementations support what has become known as "Successor ML" [0]. A large subset of Successor ML is common to SML/NJ, MLton, Poly/ML, MLKit and other implementations.

That includes record update syntax, binary literals, and more expressive patterns, among other fixes for deficiencies in Standard ML.

For me the two big remaining issues are:

1) There's only limited Unicode support in both the core language and the standard library. This is a big issue for many real-world programs, including, these days, the compilers for which SML is otherwise a wonderful language.

2) The module system is a "shadow language" [1] which mirrors parts of SML but has less expressiveness: modules cannot be treated as first-class values in the program. Also, if you define infix operators in a module, their fixity isn't exported along with the function type. (A little annoyance that gets me every time I am inclined to write Haskell-style code with lots of operators. Though maybe that's just another hint from the universe that I shouldn't write code like that.) Of course, the fix for that would be a fundamentally different language, not a revised SML.

[0] http://mlton.org/SuccessorML

[1] https://gbracha.blogspot.com/2014/09/a-domain-of-shadows.htm...


As I was saying, Successor ML is like OCaml, but perfect!


I've written a fair amount of SML (though years ago) and I can't remember the value restriction ever being an issue.

I certainly agree that SML isn't really a production language, though.


I love SML, and came to OCaml a bit begrudgingly initially, just for a larger ecosystem. But I think the object system gets an unjustly bad rap. The OCaml object system is dope. It is rarely needed and not the best feature for most use cases, but when you want structurally-typed records, or need open recursion (or need to represent JS objects for the amazing `Js_of_ocaml`), they are a perfect fit.
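
For a rough idea of what that structural typing looks like in practice, here's a small sketch (the shape objects and method names are made up for illustration):

    (* Structural typing: [describe] accepts any object that has an
       [area : float] method, no class declarations needed. *)
    let describe shape = Printf.printf "area = %.2f\n" shape#area

    let circle r = object
      method area = 3.14159 *. r *. r
    end

    let square side = object
      method area = side *. side
      method side = side   (* extra methods are fine *)
    end

    let () =
      describe (circle 1.0);
      describe (square 2.0)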


I far preferred Standard ML when I learned both back in the day, but it was OCaml that won the headspace it seemed.

Ironically probably because it had the "O"bjects in it, "which was the style of the time"... something that has since dropped off the trendiness charts.


Should go directly to htmx 4.1, so we can finally have xhtmx 1.0


Depends if it’s possible.


Python is slow due to design decisions in the language. For example, operator dispatch is slow without some kind of static analysis, but static analysis is hindered by how dynamic the language is.


It's hard to make Python run fast when it pervasively uses duck typing. It makes types only resolvable at runtime. JIT is the only thing that can work here at the moment, but I think that needs to make very similar assumptions to a branch predictor, plus it needs to identify lexical regions (is that what they're called?). People here have criticised PyPy, but I've forgotten why.

