To me, a type checker not supporting plugins is a plus. It's a major source of frustration that some Python projects only typecheck if you have the right plugins installed.
Python packages that do a lot of metaprogramming can only be properly type checked if you replicate that metaprogramming at the type level. E.g., if dataclasses were not part of the standard library, they would need a plugin to be handled correctly.
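A minimal sketch of that point (class names are illustrative): type checkers special-case `@dataclass` and synthesize the generated `__init__` signature at the type level, while a hand-rolled decorator doing the same runtime metaprogramming is opaque to them without a plugin.

```python
from dataclasses import dataclass

# Checkers special-case @dataclass: they know the synthesized
# __init__ signature even though it only exists at runtime.
@dataclass
class Point:
    x: int
    y: int

p = Point(1, 2)   # checker sees (x: int, y: int) -> None

# A hand-rolled equivalent performs the same metaprogramming at
# runtime, but without special-casing (or a plugin) a checker
# can't see the injected __init__ and would reject MyPoint(1, 2).
def make_init(cls):
    fields = list(cls.__annotations__)

    def __init__(self, *args):
        for name, value in zip(fields, args):
            setattr(self, name, value)

    cls.__init__ = __init__
    return cls

@make_init
class MyPoint:
    x: int
    y: int

q = MyPoint(1, 2)  # fine at runtime, opaque at the type level
```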
Yes, this is it. You can usually find some discounted or free way to get most books, but I had classes where you literally submitted homework through the same system you accessed the book through, and it was like $150.
From what I gather this has basically derailed CI for the morning for the majority of places out there. Only workaround is pinning build-time dependencies, which only pip and uv seem to let you do well. Poetry is SOL / heavily cache-dependent as to whether it works.
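For pip, one hedged sketch of that workaround (file name is illustrative): pip is reported to honor the `PIP_CONSTRAINT` environment variable inside the isolated build environment, so a constraints file can pin the `[build-system]` requirements from `pyproject.toml` too.

```shell
# build-constraints.txt (example pins -- adjust to your project):
#   setuptools==68.2.2
#   wheel==0.41.2
# PIP_CONSTRAINT is read in the isolated build env as well, so the
# build-time dependencies get pinned, not just the runtime ones:
PIP_CONSTRAINT="$PWD/build-constraints.txt" pip install .
```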
A full rewrite, instead of finding a way to migrate gradually, already feels like a lot of risk. Asking your core dev team to pick up an entirely new skill set in Rust on top of that would probably cause people to flake from the project.
Totally agree something like Rust would be good in a vacuum, but existing contributors and ecosystem would present problems. Having tooling built in the same ecosystem as the end product makes it way easier to contribute.
"Pandas alternative" kind of undersells it -- it's drastically faster and supposedly has a much more intuitive interface. It keeps raising the ceiling on how long you can do things entirely in-memory and postpone the move to something like Spark.
The installation experience and first five minutes of any product are hugely make-or-break. I keep my resume in LaTeX via Overleaf, but probably wouldn’t bother with it if I had to get LaTeX running locally, which has always seemed fairly complex to me (though I’m admittedly no LaTeX expert and may be entirely wrong).
This surprises me. On most platforms it’s just a package download and install. On Mac, it’s MacTeX. On Linux, it’s whatever your distro calls texlive via the package manager. On Windows, it’s MiKTeX. That’s not exactly complex or requiring any sort of LaTeX expertise. Linux can be the one that requires the most thinking if there isn’t one package that pulls in everything you need, but I can’t remember it being more than a couple minutes of effort last time I did it on Ubuntu or Fedora.
The difficulty is getting multiple collaborators to install and pin the same packages, where everyone might be using a different platform/distro.
Example: I might commit a change that compiles perfectly fine with my version of amsmath, but it conflicts with the version of amsmath in the style guide of some UC Berkeley department/lab.
It requires choices and knowing what to install, and if things don’t work, troubleshooting the install can be difficult. For a first-time task of “install LaTeX”, it’s not the easiest, especially for newer users. I’ve done it half a dozen times and I’m still not quite sure I got it right on my Mac, at least not right away.
I wasn’t aware of a brew package; I will definitely check that out. I have always been using the texlive installer for macOS (MacTeX), which is very easy to use, although the install instructions can be a bit long, and it’s important to read them when Apple breaks things.
Is 10 gigs really that much nowadays? I have to think that if you're frequenting HN you're likely to have at least a terabyte in storage on your personal computer?
It’s not about the HN visitor… it’s about the collaborator or grad student who might be on an entry level computer with 8GB of RAM and 256 GB of storage. The entire system needs to be easy for them to install and maintain. And even if I have 1TB of storage, if I could avoid an extra 10GB of space in my backups, I’d appreciate it.
The fact that you don't seem to realize that downloading 10GB of stuff just to edit/generate PDF documents is completely bonkers just shows how out of touch LaTeX aficionados are.
As far as I'm concerned, the outputs are pretty good, but until someone makes no-nonsense software that can produce them efficiently, it might as well not exist at all.
Fair enough. I just have a Nix Flake to handle this stuff for me now so I just do `nix build`, but obviously that's getting into territory that is super geeky.
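For the curious, a minimal flake along those lines might look like this (the file name, system, and TeX Live scheme are assumptions, not a drop-in config):

```nix
{
  # Sketch only: assumes a main.tex next to the flake and x86_64-linux.
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }:
    let
      pkgs = nixpkgs.legacyPackages.x86_64-linux;
      # Combine a small TeX Live scheme instead of the full multi-GB install.
      tex = pkgs.texlive.combine { inherit (pkgs.texlive) scheme-small; };
    in {
      packages.x86_64-linux.default = pkgs.runCommand "resume"
        { buildInputs = [ tex ]; } ''
          mkdir -p $out
          pdflatex -output-directory $out ${./main.tex}
        '';
    };
}
```

Then `nix build` drops the PDF under `./result/` without a system-wide TeX install.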
“AI” (as absurdly broad a term as it is) has legitimate use cases. It isn’t JUST hype. However, because of the buzz, it’s being shoehorned into so many places it just isn’t the proper fit for, and it’s hard to figure out where it does fit.
However, there ARE plenty of areas it IS the right fit for. Lots of “fuzzy” systems that would struggle to be rule-based and generalizable benefit hugely from LLMs and other fuzzy / intentionally broadly scoped tooling.
Source — I work at a “chat with your data” startup, and our product just categorically wouldn’t be worthwhile if the above weren’t true :)