dgrove's comments | Hacker News

Looks like the XMPP adapter hasn't been open sourced so it can't be built

https://github.com/SAMA-Communications/sama-server/tree/main... https://github.com/SAMA-Communications/xmpp-adapter

Also I don't see anything about E2EE support


There are open PRs in both the server and the client; I'd guess the first release is right around the corner.


scp assumes you have a login on the computers you're trying to share data from. Wormhole allows sharing with others without giving them login access to the machine.
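For anyone who hasn't tried it, the flow looks roughly like this (a sketch; the filename and the code phrase below are made up, and the exact output wording may differ by version):

    # sender: no account or login needed on either end
    $ wormhole send report.tar.gz
    Wormhole code is: 7-crossover-clockwork

    # receiver, on any machine, types in the code the sender reads out
    $ wormhole receive 7-crossover-clockwork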


Right. Also you may have to reconfigure some firewalls to use scp.

Typically, a firewall allows outbound connections without needing an explicit entry for the protocol, and in the case of magic wormhole, both sides make outbound connections. So it passes right through.

If you've got security-minded folk managing that sort of thing for you, it's possible that magic wormhole will upset them for this reason. More for policy/compliance reasons than actual security ones.


Both problems can be worked around by having a third, general-purpose host where both source/destination hosts can scp to/from. Not quite as straightforward because you have to copy twice and do it from both sides, but it has the benefit of not having to install bespoke software.
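For example, with a reachable relay host (hostnames and paths here are placeholders):

    # from the source machine
    scp ./data.tar.gz user@relay.example.com:/tmp/

    # later, from the destination machine
    scp user@relay.example.com:/tmp/data.tar.gz .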


> Both problems can be worked around by having a third, general-purpose host where both source/destination hosts can scp to/from.

Yup, that's what I do, with that 3rd computer having a fixed IP. Conveniently, that computer can also keep a copy of the file(s).

Linux/BSDs/OS X (which is kind of a Unix too) all come stock with scp* and I don't really use Windows, so I'm a happy camper.


I think you could use an ssh tunnel between the intermediary and the destination such that the scp connection from the source makes it all the way through in one go, rather than leaving files on the intermediary. You'd be forwarding to the ssh port via ssh, so it would be a confusing bit of sshception.

If I tried to come up with the actual commands for this, I'm sure I'd burn a whole afternoon fiddling with it.
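If memory serves, recent OpenSSH makes this less painful than a hand-rolled tunnel via ProxyJump, so the file never sits on the intermediary. A rough sketch (hostnames are placeholders; older scp builds may need -o ProxyJump=... instead of -J):

    # copy from the source straight to the destination, hopping through the intermediary
    scp -J user@intermediary.example.com ./data.tar.gz user@destination.example.com:~/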


This either requires the destination to accept inbound connections, or you'd need a permanent SSH tunnel, both of which you'd probably want to avoid.


The lack of package signing and reproducible builds leaves a lot to be desired


Red Hat (pre-RHEL) solved package signing around 1999 with RPM 2.0/3.0, using PGP and later replacing it with GPG. Debian solved it around 2003, also using GPG.

https://dan.drydog.com/rpm-signing-howto.html

https://www.cryptnet.net/fdp/crypto/strong_distro.html
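The verification side is a one-liner once the vendor's public key is imported; a minimal sketch with a placeholder key file and package name:

    # import the vendor's public key, then check the package signature
    rpm --import RPM-GPG-KEY-vendor
    rpm --checksig some-package-1.0-1.x86_64.rpm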

With truly reproducible builds, it's possible to introduce distributed caching of artifacts and selective probabilistic rebuilds from source to attest/verify integrity in a distributed manner.
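The attestation part can be as simple as independent builders comparing digests of what they each produced; a toy sketch (the artifact path is made up):

    # each independent builder publishes the hash of the artifact it built
    sha256sum out/foo-1.2.3.tar.gz
    # if the digests match across builders, the cached artifact can be reused;
    # a mismatch triggers a rebuild from source and an investigation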


Their APKs aren't hosted at stable URLs either, so you can't even write a script that can reliably create a reproducible build.


In general:

1. There should be an easy-to-use API, CLI, library, and data (sqlite db or whatever) to query package metadata efficiently (see the sketch after this list).

2. The mythological purity of rolling releases building against edge versions, without dependency constraints or stable versioning, causes problems in the real world(tm). There are many cases where past versions are needed. Example: ffmpeg is buggy as hell and has to be managed very carefully. Another example: binutils, gcc, mpfr, mpc, and toolchain friends have to be built together with compatible versions. Further example: don't compile anything with Clang/LLVM 14+ unless you want all of your code to break because some genius decided to break the world out of ideological perfectionism. MacPorts, Homebrew, nix, and Arch are just some of those guilty of this sin.
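On point 1, even something as small as a shipped SQLite database would go a long way; a hypothetical sketch (the database file and schema here are invented for illustration):

    # hypothetical packages.db with a packages(name, version) table
    sqlite3 packages.db "SELECT name, version FROM packages WHERE name = 'ffmpeg' ORDER BY version;"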


Packages are signed in exactly the same way Debian packages are signed, i.e. the package files themselves are not signed, but the index file that lists them is.
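Concretely, the Debian-style chain of trust looks something like this when checked by hand (standard repository file names; treat this as a sketch rather than exactly what apt does):

    # verify the signed index: InRelease is a clearsigned file
    gpg --verify InRelease
    # InRelease lists the SHA256 of Packages, which in turn lists the SHA256 of each .deb
    sha256sum Packages
    sha256sum some-package_1.0_amd64.deb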


Because a single hot key for signing on a random build server has never fucked anyone before?

https://www.techtarget.com/whatis/feature/SolarWinds-hack-ex...


Please move those goalposts farther. I can still see them.


Both the package and the index are signed, actually. That's why it still works when installing APK files directly.


I'm confused as to why package signing isn't standard for all package managers.

Didn't PyPI just remove signing too?


PyPI did indeed, but it's a fairly interesting case. It was removed because the implementation was ineffective.

More information on that here: https://blog.yossarian.net/2023/05/21/PGP-signatures-on-PyPI...

There was a lot of talk about why this didn't go the other way: keeping signing, but making the practice meaningful. I forget the details about that.


Still true; it doesn't make the news, I suppose.


HPKP doesn't have a ton of adoption and only works in browsers, so this does nothing for curl, wget, or pip.
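Pinning for CLI tools has to be set up by hand rather than via HPKP headers; for instance, curl can pin a key per invocation (the URL and hash below are placeholders):

    # abort the transfer unless the server's public key hashes to the pinned value
    curl --pinnedpubkey 'sha256//PLACEHOLDERBASE64HASH=' https://example.com/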


Sure, so you add it as part of the P2MS script, but that doesn't solve the issue of every re-key costing money.


Is this because you have a time-locked transaction to move the funds from the P2MS to a P2PKH? How does that work when it's "re-keyed"? Wouldn't each re-key move to a new P2MS and have a transaction fee associated with it, as well as a second transaction for the HTLC?


The only problem with LN is that it actively requires an internet connection, whereas Bitcoin could be done offline, similar to a card imprinter.


Bitcoin's entire premise is around the movement of unspents. If you always use the same wallet for every transaction, it is pretty easy to track who sent the money. If you instead always send your money to a newly derived wallet from your HD (Hierarchical Deterministic) root, it becomes much more difficult to track which money was being spent on something and which money is part of a person's total value.


3rd-party browsers are not; they have their own sync infrastructure. This mostly affects builds of Chromium that are not built directly by Google: https://groups.google.com/a/chromium.org/g/embedder-dev/c/NX...


"Stocks only go up"


In the long run, yes, because companies are money generating machines.


Yep, such is the nature of inflation.


I mean, historically, stocks have gone up in real terms (discounting inflation).


Or QE.

