lynguist's comments | Hacker News

I agree completely with all the points made in the article and would double down on the critique of the philosophy of biology as a discipline and of the science itself: namely, that it has historically evolved as a mere system of typologies, and that these human-made categories from a couple hundred years ago have become so settled and so sacrosanct that biology reads more like a superstition than a real science.

We see the exact same thing in discussions of what a species is, where the strict, traditional trees completely disregard the reality of horizontal gene transfer and the like.

The models are quite wrong, even as deliberate reductions.

There is this one famous article that shows how traditional biology would go about analyzing a transistor radio: it would simply label its assumed components!

Here is the discussion: https://news.ycombinator.com/item?id=31697757


Touch controls make any stove I know erratic. A drip of water and it turns on power mode by itself and melts lids. A bit of oil and it randomly turns itself on. A piece of wet cloth does the same. Sometimes even nothing at all triggers it.


Honestly, I wouldn’t mind. I would continue to live my life. I would ride public transportation to go to work, I would go to the library if I wanted to check something out, and I would just be spontaneous instead of preplanning everything online or being influenced by opinions online.


> I would ride public transportation

Even simple things like this would be difficult. For starters, ticketing tends to run over 5G on buses and over fixed networks for permanent fare gates, like those at a railway station.

Bus arrivals screens and boards at stations are also fed by the network.

I guess if we take a generous interpretation of the OP's question though, those would still be able to function assuming they work over a private network instead of the internet.


No it's true.

Apple has hardware accelerated compressed swapping.

Windows has compressed swapping.

And Linux is a mess. You have to manually configure a non-resizable compressed zram device, or use uncompressed swap on a non-resizable swap partition.
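
For what it's worth, here is a minimal sketch of the kind of manual setup I mean (my own example, run as root; the zstd algorithm and the 8G size are arbitrary choices, not recommendations):

    # Sketch of manual zram swap setup on Linux. Assumes the zram module is
    # available and zram0 is unused; size and algorithm are just examples.
    import subprocess

    subprocess.run(["modprobe", "zram"], check=True)

    # The compression algorithm must be chosen before the size is set.
    with open("/sys/block/zram0/comp_algorithm", "w") as f:
        f.write("zstd")

    # disksize is fixed at creation time: this is the "non-resizable" part.
    with open("/sys/block/zram0/disksize", "w") as f:
        f.write("8G")

    subprocess.run(["mkswap", "/dev/zram0"], check=True)
    subprocess.run(["swapon", "--priority", "100", "/dev/zram0"], check=True)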


In the mid 90s, professional video game programmers typically used a 1920x1080 display, just to have a larger code canvas and to display sharper text.

From the 90s on, 1600x1200, 1920x1080, and 2048x1536 were resolutions one could find on professional displays.

From the 2010s on, resolutions increased tremendously and 3840x2160 became the norm for consumer and professional displays.

When working with code you essentially work with text. You just want a big canvas and crisp text, thus high resolution.


I guess. I think the important thing is getting the program in your head, not on the screen. If the code is too complicated to hold it all in your mind then more columns of crisp text will not save you.


But that means the next Air will be upgrade-worthy for those who want an Air and a memory uplift from a 24 GB max to a 32 GB max!


16 GB or 24 GB for inference doesn’t cut it for large models…

Mac hardware offers up to 128 GB shared RAM.


It is in a weird middle ground. It is much worse RAM and much slower hardware (a factor of 4 or so, IIRC), and only an option if you need more than 24 GB but less than 200 GB. Also, only if you think that 9000€ is pocket change but 20000€ is cost-prohibitive. Also, you need all that power but no server, no ECC, ... And you only ever want to do inference, not training or tuning.

If you want a Mac anyway, sure. But if you don't care, this seems like a very, very specific Venn diagram.


It's not much worse RAM, though. The RTX 4090 has a memory bandwidth of 1050 GB/s; the M2 Ultra is 800 GB/s. And you can get a Mac Studio with an Ultra and 128 GB of RAM for $3K or less. It's great for 70-150B models.

You're correct that it's only good for inference, but most people running local LLMs only do inference.
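
As a rough sanity check (my own back-of-the-envelope numbers, not measurements): single-stream decoding is largely memory-bandwidth bound, so an upper bound on tokens per second is roughly bandwidth divided by the bytes of weights read per token.

    # Back-of-the-envelope ceiling on decode speed, assuming generation is
    # memory-bandwidth bound and every weight is read once per token.
    # Ignores compute, KV cache and overhead, so real throughput is lower.
    def max_tokens_per_second(bandwidth_gb_s, params_billions, bytes_per_param):
        model_size_gb = params_billions * bytes_per_param  # weights in GB
        return bandwidth_gb_s / model_size_gb

    # A 70B model quantized to ~4 bits is roughly 0.5 bytes per parameter.
    print(max_tokens_per_second(800, 70, 0.5))   # M2 Ultra class: ~23 tok/s ceiling
    print(max_tokens_per_second(1050, 70, 0.5))  # RTX 4090 class: ~30 tok/s, if the model fit in 24 GB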


That Mac Studio configuration costs 5,800 euros (minimum). Where do you get it for 3,000 USD?


You don't need an M2 for this, an M1 will do just fine. Nor do you need the one with maxed-out SSD, which jacks up the price considerably. Finding a brand new M1 Ultra for $4K or less right now is pretty easy. When I got mine, about a year ago, $3K was the best deal I could find.


So we are rather “large” beings and not small ones.

Are these orders of magnitude each scaled by a factor of 10 from one to the next?


Powers of Ten is the classic presentation: https://youtu.be/0fKBhvDjuy0


I haven't seen this in many years; it really is a spectacular way of making you feel the vastness of the universe and the differences in scale.


Yes


I find the “a la mode” vs “au jus” discussion right under the daemon one very interesting!

I wasn’t familiar with either of these expressions, but I looked them up: “a la mode” is an American culinary expression meaning “served with ice cream”, and “au jus” is also an American culinary expression, meaning “gravy” or “broth”. Now, even though they are both derived from French prepositional phrases with à (roughly meaning “with”), that origin stopped mattering once they were borrowed into English.

“A la mode” became a new adverbial expression meaning just that: “served with ice cream”. You can have pie a la mode = pie served with ice cream, but obviously not *pie with a la mode = pie with served with ice cream.

And “au jus” became a noun expression meaning “broth” or “gravy”. And you must say sandwich with au jus = sandwich with gravy and can’t say *sandwich au jus = sandwich gravy.

What is extremely interesting here is that this bothers prescriptivists who want language to be the way they feel it is supposed to be, the author of that webpage among them.


I immediately ordered my daemon to cook me some pilipili au jus de cuniculus.

Also, I think I will risk opening my eyes now.


You can say “sandwich au jus”, but it refers to the sandwich with the gravy, not to the gravy itself.


Yeah, I was hanging out with someone recently who kept using "au jus" like "sauce", i.e. "you could make that with an au jus", "ooh yeah that would be so good with an au jus on the side!" or similar ...


If you dig into this person’s posting history, and if you have read HN regularly for a couple of years, you will notice that it is actually this very user who deliberately uses the zero-prefixed five-digit year numbers, and who goes out of their way to include year numbers in their posts to make people ask this question.


Hmm, I checked the last two pages of their history and this is the only comment with a year, so it can't be that out of their way.


I'm puzzled about what sorts of discussions of historical coinage policies lynguist is used to reading that don't mention specific years.

