Hacker News new | past | comments | ask | show | jobs | submit | jdsalaro's comments login

> He trained for hundreds of hours to fly the capsule manually and the docking process is excruciatingly slow

_COME ON TARS!_


> CFI

They're referring to Control Flow Integrity [1]

[1] https://en.m.wikipedia.org/wiki/Control-flow_integrity


> Why would I want/need this?

Always. Go is overly opinionated about where modules and binaries are stored; I don't like that, and I've blown my local development environment to pieces because of it (looking at you, gRPC, yikes).

But also, imagine that you, like me, need to test Python, Java+Kotlin+Gradle and NodeJS+Angular projects. Do you really want to install _all that_ natively, just for a couple of merge reviews? And even then, do you _really_ want to install all that natively? The answer is always, IMHO, a resounding and clear no.

> It was a Visual Basic 6 program, so I just took two half days going through every EXE & DLL related to Windows and VB, eventually finding the difference. Tedious but not rocket science. Is it to avoid these cases?

For example, but also much worse: as mentioned in the OP, it's to prevent the very real possibility of crippling your OS's language runtimes, and to stay productive.


IMHO the solution to the problem of devs in the same company having different environments is not asdf and its competitors like Mise, but things like Nix/Guix and Docker.

At work, because everyone uses a Mac, we ended up using Kandji to achieve the same thing: everyone has the same tools and environments. But that only makes sense if you already have to use such tooling for security audits and the like.

If I had a small company myself I would probably set everything up with Guix, as I really like the way it works, more than Nix (though only because I prefer Lisp config files and because Guix doesn't suffer from any polemics like the flake soap opera).


I just use small VMs. I use Vim as my code editor and just ssh into them. No difference, latency-wise. If I were on Linux, I'd use LXC containers.


If you had trouble with gRPC and Go combo, you might find the way it is integrated in C# a breath of fresh air.

https://learn.microsoft.com/en-us/aspnet/core/grpc/basics?vi...


there's something I don't get, why do you have globally installed tools that asdf can manage at the same time that you have asdf installed?


So that in any directory you can type `<command name>`? asdf Node installs aren't like Python virtual environments; they're centrally installed, and one Node version (and its packages) is shared across all directories that want it + global tools.


> So that in any directory you can type `<command name>`?

Any tool installed via asdf is available in any directory, as long as you access that directory from a shell spawned with a .profile, .bashrc, or similar startup file that contains your asdf configuration.
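For reference, the shell side of that is just one line sourced from your startup file; a rough sketch for a git-based asdf install under bash (paths may differ for brew installs):

```shell
# ~/.bashrc (or ~/.bash_profile): put asdf and its shims on PATH
. "$HOME/.asdf/asdf.sh"
```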

> asdf Node installs aren't like Python virtual environments

Correct, nor should they be.

> they're centrally installed and one Node version (and its packages) is shared across all diretories

Sure they are, and that's by design. You're conflating a runtime manager with a package manager. Venvs are _not_ runtime managers; the moment you need another Python version, you're done for. asdf.vm is _not_ a package manager; the moment you want package isolation on top of an asdf install is the moment you install pipenv, Poetry, or PDM, or use Python venvs for that.
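To make the split concrete, here's a minimal sketch. All `asdf local python 3.12.4` does is write a one-line `.tool-versions` file (the version number is illustrative); package isolation is then layered on top with whatever venv tool you prefer:

```shell
# runtime manager's job: pin which Python this directory resolves to.
# `asdf local python 3.12.4` effectively writes this file:
printf 'python 3.12.4\n' > .tool-versions

# package isolation is a separate layer on top of that runtime:
#   python -m venv .venv && . .venv/bin/activate && pip install <pkg>
```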

> that want it + global tools.

Which is achieved as I've shown below. Still, there was no reason in your use case to modify or play with globally installed tools besides asdf, through which you can then define global runtime versions, and those global versions will hold your global tools, usable anywhere.


OP here. Although I hoped I'd picked a relatable example, it seems it wasn't as relatable as I expected.

> Is it just me that never even wants to get to the problems that asdf attempts to solve?

You aren't alone; the scenario isn't ideal. However, brew's Python installation on macOS, like Debian's and Ubuntu's, is _extremely_ brittle. You are one cask, formula, or apt package away from needing a weekend-long spelunking session, or a full-blown system re-install if you have deadlines.

pyenv, which is what I used in the years before and after Python 2 was deprecated and projects slowly started migrating to newer Python versions, is a pain to set up and maintain.

> That example in the article of managing multiple python 2.7 versions sounds like a horror story.

It is a horror story, but a very common one.

Have you tried to install and maintain Java, Kotlin and Gradle for a given project when your machine is not primarily a Java/Kotlin/Gradle box? That is a real nightmare; not so much with asdf.vm.
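For the record, with asdf that whole toolchain pins per project through a single `.tool-versions` file; a sketch with illustrative version strings (each tool needs its asdf plugin installed, and exact version names depend on the plugin):

```
java temurin-21.0.2+13.0.LTS
kotlin 1.9.23
gradle 8.7
```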


You shouldn't need asdf to work with JVM stuff. I would suggest learning how to use SDKMAN: https://sdkman.io/

It will manage the JDK for you. Usage is basically this:

   # Install a JDK, that version is now default
   sdk install java <version>
   # Another one, it asks if you want to change the default
   sdk install java <another-version>
   # List available and installed versions
   sdk list java
   # Change which one you're using in this shell
   sdk use java <version>
That's all.

You can also manage Gradle/Maven installations with SDKMAN, but that's usually not necessary, because most JVM projects include a "wrapper" script which downloads the needed Maven/Gradle version for you.

This works regardless of whether your project also needs Kotlin/Groovy etc. as those are just managed by Gradle/Maven (the only exception I can think of is if you use Kotlin Multiplatform as that will depend on the platform dependencies as well).

So once you know SDKMAN, you can manage any JVM-based project with just this:

    sdk use java <jdk-version-used-by-project>
    ./gradlew build # or ./mvnw package
If you need to do anything else, you should complain to the project authors as this is really all you should need!


> If you need to do anything else, you should complain to the project authors as this is really all you should need!

Sure. But you might need Node for some front-end build tool, or a language server for SQL. Then you can use two version managers, or just asdf.


> brew's Python installation on MacOS as are Debian's and Ubuntu's are _extremely_ brittle

I've been using Python installed via Homebrew and haven't run into any issues. Homebrew lets you install a specific Python version like python@3.11, and using venvs avoids most of the issues (I think you can't install packages outside of a venv in Python 3.12 or higher).
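A sketch of that workflow; `python3` here stands in for whichever pinned interpreter Homebrew installed (e.g. `$(brew --prefix python@3.11)/bin/python3.11`):

```shell
# a per-project venv keeps packages out of the brew prefix
# (newer brew Pythons refuse global pip installs, per PEP 668)
python3 -m venv .venv
. .venv/bin/activate        # or call .venv/bin/python directly
python -m pip --version     # pip now resolves inside .venv
```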


My experience with Poetry has been mixed the last couple of times I've tried it; it attempts to do way too many things but often fails to do them properly. Determining dependencies, and proper packaging and upload to PyPI, are the ones that come to mind.

asdf.vm together with pipenv is my go-to for Python environment management.


pip-tools[0] is all most things need IMO. It’s a great balance of simplicity and utility.

[0] https://pip-tools.readthedocs.io/en/stable/
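The whole pip-tools loop is tiny; a sketch assuming `pip install pip-tools` has been run (`requests` is just an example dependency, and the compile/sync steps are shown as comments):

```shell
printf 'requests\n' > requirements.in   # loose, human-edited spec
# pip-compile requirements.in           # emits a fully pinned requirements.txt
# pip-sync requirements.txt             # makes the venv match it exactly
```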

As for Poetry, it is constantly improving and has gotten very popular. It shouldn't be dismissed, especially for larger projects, since its dependency management is so much better than pipenv's. This is a good primer: https://diegoquintanav.github.io/poetry-primer.html


pyenv + pip-tools is all one needs. Supposedly uv is gunning to be a drop-in replacement for both. I think there's a good chance uv pulls it off and becomes the de facto tool for this use case.

I think it's fair to see the appeal in Poetry, but ultimately the maintainers have created a culture that feels a bit too non-collaborative toward outside ideas, or use cases beyond what they more narrowly envisage. That said, my perspective may just be tainted by multiple poor experiences interacting with the maintainers.


> pyenv + pip tools

then you'd also need rbenv, nvm, etc.

and pyenv can implode in marvelous ways.


I can't speak to rbenv or nvm, but IMO it's better to use well-known and canonical tools for each, rather than a more unknown mega-tool that wraps or replaces them.

pyenv isn’t perfect, and isn’t what I’d use for prod images, but for dev setup it’s relatively bulletproof and less issue-prone than any alternative I’ve seen.


Been using pyenv daily for years now. In what way could it implode? It's worked great so far.


This makes me itchy; pulling the whole internet onto your development machine without looking sounds like a very bad idea. It's the equivalent of an IDE's "do you trust this project", but on steroids.


Just curious, have you tried using asdf.vm with pipenv? I've never needed anything else and have yet to have any problems.

A couple of moons ago I used Poetry, but gave up on it because it was so heavy and unfortunately would bug out often.


Not tried pipenv, I'm a bit tied by what the company uses.

I did try using something similar to asdf (can't remember the name; I think it changed), but it still didn't really solve the problem of OS dependencies and things needing to be compiled, or the problems arising from me not running the same OS the application would run on. A Dockerfile solves that: my system is a carbon copy of the prod environment.


> A dockerfile solves that, my system is a carbon copy of the prod environment.

Yeah, that definitely can't be beat; if at all, probably only on comfort.

> Not tried pipenv

I've been meaning to put a tutorial with my workflow out there since forever; if I had it, I'd point you to it.

I recommend you give it a try if you get the chance, you might like it.


Not OP but I've been using asdf and Poetry together and have been pretty happy with my setup. What issues were you experiencing?


> I recently started using https://github.com/prefix-dev/pixi for Python projects

Why is it based on the Conda ecosystem? Do you happen to know?

I assume it's for portability, but that sounds heavy.


For as much improvement as there has been in what can be distributed via PyPI, there are still some domains with gnarlier dependencies than wheels can happily handle alone, and you either need to reach for the system package manager (and lose the ability to really control the dependency environment because of that mismatch), or take advantage of the Conda ecosystem.

My org does a lot of work combining machine learning with oceanographic and climate modeling, which are both domains with deep dependency chains that don't always mesh well, especially as our researchers mix in R and other languages at the same time. The Conda ecosystem helps us a ton with that, but there are issues that `conda` and `mamba` don't help us out with.

Pixi takes a swing at some of the things the Conda ecosystem hasn't been great at (at least without a lot of manual custom ceremony) that Cargo, Poetry, Pipenv, PDM, and other dependency and workflow management tools have shown can be done: lock files, cross-platform dependency management, task running, and defining multiple related environments.

What's really cool is that, when you have a mix of projects, Pixi can work almost entirely PyPI-native out of a `pyproject.toml` (other than installing Python from Conda-Forge), so you can mix and match environments but stay with the same tool. https://prefix.dev/blog/using_python_projects_with_pixi docs: https://pixi.sh/latest/advanced/pyproject_toml/


It certainly looks interesting! I'm still not sure if "It's npx for everything else" is good marketing :P

> can activate project tooling upon cd’ing into a project folder

This can probably be replicated with zsh hooks: https://zsh.sourceforge.io/Doc/Release/Functions.html#Hook-F...
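A rough sketch of that approach for ~/.zshrc; the per-project filename `.local-tools` is hypothetical:

```shell
# run a hook function every time the working directory changes
autoload -Uz add-zsh-hook
_load_project_tools() {
  # source a per-project file, if the new directory has one
  [[ -f .local-tools ]] && source ./.local-tools
}
add-zsh-hook chpwd _load_project_tools
```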

