andrewmcwatters's comments | Hacker News

People unfamiliar with Linux at a documentation level assume that because Linux is Linux, it must be pretty well documented, but in reality, just building the thing and creating an init is an extremely poorly documented process for such mature software.

You’re not missing anything. It’s amazing Linux makes any progress at all, because the highest-touch parts of the damn thing are basically completely undocumented.

And when docs do exist, they're out of date: written by some random maintainer describing a process no longer used, or written by a third party who is obviously wrong or superfluous and has no idea what they're talking about.

Edit: Oh, it’s a cultural issue, too. Almost everything revolving around Linux documentation is also an amateur shitshow. systemd, that init system (and so much more) that everyone uses? How do you build it and integrate it into a new image?

I don’t know. They don’t either. It’s assumed you’re already using it from a major distribution. There’s no documentation for it.
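For concreteness, here's the part nobody writes down. Once you've built a kernel (make defconfig && make), all it actually needs is something to run as PID 1. A minimal init is a page of C; this is a sketch under my own assumptions about mounts and paths, not any documented recipe:

    /* minimal_init.c: a hypothetical PID 1 sketch, not an official recipe. */
    #include <sys/mount.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        /* The kernel hands PID 1 a nearly empty world; mount the
           pseudo-filesystems most userspace expects. */
        mount("proc", "/proc", "proc", 0, NULL);
        mount("sysfs", "/sys", "sysfs", 0, NULL);
        mount("devtmpfs", "/dev", "devtmpfs", 0, NULL);

        /* Hand off the real work to a shell (or your service manager). */
        if (fork() == 0) {
            execl("/bin/sh", "sh", (char *)NULL);
            _exit(127);
        }

        /* PID 1 must reap orphaned children forever, or zombies pile up. */
        for (;;)
            wait(NULL);
    }

Statically link it (gcc -static), pack it into a cpio initramfs as /init, and you can boot-test the whole thing with qemu-system-x86_64 -kernel arch/x86/boot/bzImage -initrd init.cpio. As far as I can tell, none of this is spelled out end-to-end anywhere in tree.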


docs.kernel.org is generated from in-tree READMEs, docs, and type/struct/function definitions, making it a lot easier to read and browse documentation that would previously have required grepping the source code to find.

I realize the site also hosts some fairly out-of-date articles; there is room for improvement. Those hand-written articles start with an author and timestamp, so they're easy to filter.
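For reference, the extracted parts come from the kernel-doc comment format (plus the rST files under Documentation/), which Sphinx pulls into the rendered pages. A made-up example of the style it parses; frob_widget here is hypothetical:

    /**
     * frob_widget() - Frobnicate a widget. (hypothetical example)
     * @w: widget to frobnicate
     * @level: intensity, 0 to 10
     *
     * Context: May sleep.
     * Return: 0 on success or a negative errno on failure.
     */
    int frob_widget(struct widget *w, int level);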


A small but significant detail that irritates me: one used to be able to search applications faster through the dedicated Applications overlay, but now this behavior appears to just be a shortcut to Spotlight, which suffers from incredibly poor index planning.

In the past, when Spotlight was too slow to show me my most used applications by the first few letters, I'd bail and use Applications.

Now I'd have to use Finder, but opening that up would be slow enough that I'd almost need a desktop shortcut.

So, in essence, I have to hack around the most common functionality of using an application on an operating system, which is finding the damn thing. And this is supposed to be the most polished operating system on the market?

Apple frequently appears to be asleep at the wheel.


Yeah, I used to have a hot corner set up so that I could fling my mouse towards the upper left and then type the first letter or two of the app name, just like in GNOME.

Now that causes the screen to freeze for half a second (possibly my fault: I have 'reduce animations' switched on, but it seems to freeze the screen for the duration of the animation that would previously have played), and then the colour wheel spins for a couple of seconds, and then it might finally respond to my keyboard input... but even then, it fails to find the app maybe 20% of the time. This is on a roughly one-year-old M4 MacBook Pro with 36 GB of RAM.

So for the past month I've been training myself to alt+tab round to the Finder window and navigate to the Applications folder from there.

I've never been much of a macOS fan, but this is shockingly poor: less of a papercut, more a wedge of smouldering bamboo shoved under my fingernails.


On the other side of the fence, I enjoy the new Spotlight-for-Applications that opens when I hit the touch bar key (I still have an M1) for the old Launchpad. It seems to sort programs by frequency, so it knows that I open Ghostty far more often than Ghostery, and typing "Gh" will bring me to Ghostty instead of Ghostery. In the old Launchpad, applications were always presented alphabetically when you began typing, so Ghostery was always selected instead of Ghostty. Before, I had to type "gh", right arrow, enter; now I just hit "gh", enter.

Tahoe's new Spotlight refresh includes an application specific option (open spotlight then arrow/cursor to the right or press cmd+1), and it will only match on applications, which is indeed very fast compared to a full blown Spotlight search...

except it doesn't match Apple's built-in applications like Calendar or Screenshot.app, which makes it useless to me, since I don't mentally separate Apple apps from third-party ones when trying to find or search for apps.


I concede that this is the state of the art in secure deployments, but I’m from a different age where people remoted into colocated hardware, or at least managed their VPSs without destroying them every update.

As a result, I think developers are forgetting filesystem cleanliness, because if you end up destroying an entire instance, well, it's clean, isn't it?

It also results in people not knowing how to do basic sysadmin work, because everything becomes devops.

The bigger problem I have with this is that the logical conclusion is to use “distroless” operating system images with vmlinuz, an init, and the minimal set of binaries and filesystem structure you need for your specific deployment, and rarely do I see anyone actually doing this.

Instead, people are using a hodgepodge of containers with significant management overhead that actually just sit on, like, Ubuntu or something. Maybe Alpine. Or whatever Amazon distribution is used on EC2 now. Or of course, like in this article, Fedora CoreOS.

One day, I will work with people who have a network issue and don’t know how to look up ports in use. Maybe that’s already the case, and I don’t know it.


> The bigger problem I have with this is that the logical conclusion is to use “distroless” operating system images with vmlinuz, an init, and the minimal set of binaries and filesystem structure you need for your specific deployment, and rarely do I see anyone actually doing this.

In the few jobs I’ve had over 20 years, this is common in the embedded space, usually using Yocto. Really powerful, really obnoxious toolchain.


What you describe is from the "pets" era of server deployment, and we are now deep into the "cattle" era. Train yourself on destroying and redeploying, and building observability into the stack from the outset, rather than managing a server through ssh. Every shop you go to professionally is going to work like this. Eventually, Linux desktops will work like this also, especially with all the work going into systemd to support movable home directories, immutable OS images with modular updates, and so forth.

> What you describe is from the "pets" era of server deployment, and we are now deep into the "cattle" era.

You still need to be able to work with individual servers. Saying "they're cattle, not pets" is just being a lazy sysadmin.


I don't think this viewpoint is very pragmatic. "Pet" and "cattle" approaches solve different scales of problems. Shops should be adaptable to using either for the right job.

I already do this professionally, and when something is broken, we collectively as an industry have no idea why, except to roll back to a previous deployment, because we have no time for system introspection, nor do we really want to spend engineering hours figuring it out. Just nuke it.

The bigger joke is everyone behaves like they have a ranch for all this cattle infrastructure.

In reality, the largest clients by revenue in the world have a PetSmart. And frankly, many of them a fishbowl.


Flashbacks of gaming on an XP-era HP Pavilion with graphics so bad that water didn’t even render in Halo 1 for PC flood my mind.

Countless kids played Morrowind on below-spec family computers all across America.


My GeForce2 MX 200/400 with an Athlon and 256 MB of RAM began to become useless in ~2002/2003 with the new DX9 games.

Doom 3? Missing textures. Half-Life 2? Maybe at 640x480. F.E.A.R.? More like L.A.U.G.H. Times changed so fast (and on top of that, shitty console ports) that my PC didn't achieve great numbers at home until 2009, with a new machine.

Although it was in that era that I began to play games like Angband, NetHack and the like, which opened up an amazing libre/indie world I still enjoy today.

And, yes, I replayed Deus Ex because it had tons of secrets and it ran on a potato. Perfectly playable at 800x600 at max settings.


That sucks. I’m not a big fan of Tailwind, but at least it helps non-designers make somewhat decent user interfaces.

It’s hard to run a software business.


What’s your point? Say everything you just said again, but with software engineering and Indians, instead of manufacturing and the Chinese, or textiles and Vietnam and Pakistan.

There’s no reason American cars need to exist either, they basically all perform worse dollar-for-dollar, feature-for-feature, than foreign cars.

In fact, let’s offshore everything. There’s no reason not to use Filipinos for McDonald’s and In-n-Out drive-thru speakers.

Let’s all adopt Chinese tang ping ("lying flat"). Lie down and die. Treat every effort of labor as replaceable and void of respect.

If China and India wanted to wage effortless war with the US, all they would have to do is stop exporting goods and labor to us.


Please read my comment again. This time, consider that our laws and regulations are not laws of physics or axioms of mathematics, and can therefore be changed. The comment will make more sense in that light.


The only thing I've never understood about the HPV vaccination is that for some reason after a certain age as an adult in the United States, no primary care provider appears to recommend you get it in addition to your regular vaccination schedule.

Is the idea that you're married and have a single partner and the risk factor has dropped below a certain percentage of the population where there's little reason to recommend getting it if the likelihood is that you've already acquired HPV in your lifetime thus far?

Every other vaccination appears to be straightforward, besides HPV, and I don't know why. I've also never heard a clear answer from a physician.

Is it just that our vaccination schedules are out of date in the United States? This seems to be the most likely culprit to me.


Here are the CDC's most recent recommendations (from 2019): https://www.cdc.gov/mmwr/volumes/68/wr/mm6832a3.htm

The justification for 27-45-year-olds heavily references a meeting. Based on time, author, and title, I think either https://stacks.cdc.gov/view/cdc/78082/cdc_78082_DS1.pdf or https://pmc.ncbi.nlm.nih.gov/articles/PMC10395540/ should be a fair summary of the meeting (I hope...).

I don't really have time to read it all, but the basic idea is as you said: the cost-benefit ratio is off. Basically, expanding from something like the current case to vaccinating up to 45-year-olds would avert an extra 21k cases of cancer (compared to the base case of 1.4 million), so about an extra 1.5% of cases averted, while the direct vaccination costs are estimated to increase from 44 billion to 57 billion (+29%).

The current guidance says "do not recommend" plus "consult your doctor". You should read that as "blanket vaccination as public policy is cost inefficient in that age range" not "you as a 45 year old should not get the vaccine categorically".
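Back-of-the-envelope from those figures (my arithmetic, not the committee's):

    21,000 / 1,400,000          ≈ 1.5% additional cases averted
    ($57B - $44B) / $44B        ≈ 29% additional cost
    $13,000,000,000 / 21,000    ≈ $619,000 per additional case averted

That cost per additional case averted is roughly why this lands as "consult your doctor" rather than a blanket recommendation.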


It wasn't tested in those over 45, thus it is not approved over 45. Doesn't stop off-label use, but means it's not going to be on any schedules.


Interesting, it looks like you can use `global myvar` now, as compared to `myvar` implicit globals, say from back in 5.1, or `local myvar`.

It’s worth noting that `global` is a reserved keyword now, so environments that had a `global()` function for escaping environments will now need to rename their helper function.
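If I'm reading the change right, the three forms look something like this (my sketch of the syntax, not the official manual; double-check against the 5.5 docs):

    -- 5.1 style: plain assignment alone creates an implicit global
    myvar = 1

    -- 5.5 style: an explicit global declaration (new reserved word)
    global myvar
    myvar = 2

    -- locals are unchanged
    local myvar = 3

    -- and this old helper pattern now fails to parse, since `global`
    -- is a reserved word:
    -- local function global(name) return _ENV[name] end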


But... why? Globals are just variables that reside in the wrapping environment table, which also contains the C functions. If a closure is an onion of table lookups going outward from the function context, local -> function scope -> global scope, isn't the global scope simply the last lookup before a not-found variable resolves to nil?


Module exports with side effects, and setting environments doesn’t guarantee global access.


There's a lot of ecosystem behind it that makes moving off of Node.js sensible for specific workloads, in a way that isn't as easily done in Rust.

So it works for those types of employers and employees who need more performance than Node.js, but can't use C for practical reasons, or can't use Rust because specific libraries aren't as readily available and supported by comparison.


I don't care what the rest of you people in auth do; I work in auth too. Please stop making signing in to anything take 5 steps.

1. First I get redirected to a special sign-in page.

2. Then I sign in with my email only.

3. Then it finally asks me for a password, even for services that would never reasonably use SSO or have any other post-email step.

4. Then I get redirected again to enter 2fa.

5. Then these websites ask if I want to create a passkey. No, I never want to create a passkey, and you keep asking me anyway.

6. Then, and only then, do I get to finally go back to using the service I wanted, and by then, you've lost whatever my `?originalUrl=` was, and I have to find it again.

No, don't send me a magic link. Because then I have to go do 4 more steps with Gmail or another mailbox provider and now signing in has become 10 or more steps.

No, don't tell me getting rid of passwords will help most of the population, and then force all of us to do the above, and blatantly lie to us that it's better.

Stop it. Get some help.


I still find myself stuck on step 0: finding the fucking log-in button, which is for some reason tiny/looks disabled/not easily discernible as a button.


If you created a passkey, it would be one step.

