TacticalCoder's comments | Hacker News

Wait... Most of my shell scripts have zero unused variables: I prefer to comment them out if I may need them later on.

Why do you disable SC2034?

I don't think avoiding unused variables prevents me from doing things in my scripts!?

I understand if it's a preference, but SC2034 is basically one of my biggest timesavers: in my case an unused variable is typically a bug. Except, maybe, the ANSI coloring variables at the top of a script.
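For what it's worth, a minimal sketch of the kind of bug SC2034 catches (variable names invented), plus the per-line directive for the deliberate exceptions:

```shell
#!/bin/bash
# A variable assigned but never read: SC2034 flags it, hinting the
# logic below silently ignores it.
max_retries=5               # SC2034: max_retries appears unused
for attempt in 1 2 3; do    # bug: the loop hard-codes 3 instead of using max_retries
  echo "attempt $attempt"
done

# For deliberate exceptions (e.g. ANSI colors kept for later), annotate
# the single line instead of disabling SC2034 for the whole script:
# shellcheck disable=SC2034
RED='\033[0;31m'
```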


I've got many like these I copied from various people over the years.

One I came up with and that I use all the time:

    alias wl='wc -l'
I use it so much I sometimes forget it's not stock.
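For anyone trying it inside a script rather than an interactive shell, a small sketch: non-interactive bash needs `expand_aliases` turned on (interactive shells expand aliases by default).

```shell
#!/bin/bash
# Aliases are off by default in non-interactive bash.
shopt -s expand_aliases
alias wl='wc -l'

printf 'one\ntwo\nthree\n' | wl
```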

That's a nice one.

One thing I do is configure my keyboard so that "modifier+{ijkl}" mimics the inverted-T arrow-key cluster, so there's never a need for me to reach for the arrow keys. And {ijkl} makes more sense than vi's {hjkl}: it's faster, more logical, and means less finger travel. The nice thing is that since I do this at the keyboard level, it works in every single app. "modifier" in my case is an easily reachable key in a natural hand position on which my left thumb is always resting, but YMMV.

I set that up years ago and it works in every app: it's gorgeous. Heck, I'm using it while editing this very message for example.

And of course it composes with SHIFT too: it's basically arrow keys, except at the fingers' natural positions.
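For the curious: on Linux, one way to do this "at the keyboard level" is keyd; a sketch (the modifier key choice here is mine, not necessarily the parent's exact setup):

```ini
[ids]
*

[main]
# hold capslock as the "modifier"
capslock = layer(nav)

[nav]
# inverted-T on ijkl; composes with shift for selection
i = up
k = down
j = left
l = right
```

Because keyd remaps at the evdev level, the layer works identically in every application, terminal or GUI.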


That is the spirit! A friend recommended I buy a Bambu P2S: there are parts I want to print, and I don't want to model them and then send them off to be printed, nor bother my friend all the time. Funnily enough, I've got failing magnets too: they're for an alarm system on the doors/windows and they don't hold well anymore after all these years. Then there's my car's radar-detection device (fully legal), which doesn't fit nicely in the phone holder I use for it: I want it at a specific angle (both inclined and facing towards me a bit). So I'll model those parts and just print them. There are a few things like that where I keep thinking: "If I had a 3D printer, I'd just print a part".

Most importantly: I've got an 11 y/o and I think it's cool for the kid to see how it works.

Already watched a few vids. Doesn't look too hard for simple things.


I have fond memories of visiting a university in the early 90s on a demo day and there was a (physical) sphere in a Cornell box:

https://en.wikipedia.org/wiki/Cornell_box

And next to it was a super beefy computer doing a 3D rendering of a similar scene.

35+ years later I've got "many spheres in a Cornell box" rendering in my browser, love it : )


The guys deterministically bootstrapping a simple compiler from a few hundred bytes, which then deterministically compiles a more powerful compiler, and so on, are on to something.

In the end we need fully deterministic, 100% verifiable chains: from the tiny bootstrapped beginning to the final thing.

There are people working on these things. Both, in a way, "bottom-up" (bootstrapping a tiny compiler from a few hundred bytes) and "top-down" (a distro like Debian having 93% of all its packages fully reproducible).
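A toy illustration of what "reproducible" demands: building the same input twice must yield bit-identical output, which is why classic sources of nondeterminism like embedded timestamps have to go (here via gzip's `-n` flag):

```shell
#!/bin/sh
# Build the "same artifact" twice; any timestamp, path, or ordering
# leaking into the output would make the two builds differ.
printf 'hello\n' > src.txt
gzip -nc src.txt > build1.gz   # -n omits the original name and timestamp
gzip -nc src.txt > build2.gz
cmp -s build1.gz build2.gz && echo "bit-identical: reproducible"
```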

While most people are happy saying "there's nothing wrong with piping curl to bash", there are others who do understand what trusting trust is.

As a sidenote: although not a kernel backdoor, Jia Tan's XZ backdoor, riding on that Rube Goldberg systemd arrangement ("we modify your sshd because we're systemd, so now sshd's attack surface is immensely bigger"), was a wake-up call.

And, sadly and scarily, that's only the one we know about.

I think we'll see many more of these cascading supply-chain attacks. I also think that, in the end, more people are going to realize that there are better ways to design, build, and ship software.


> If you don't what's the point of checking only the install script?

The .tar.gz can be checksummed and saved (to be sure later on that you install the same .tar.gz, and to be sure it's still got the same checksum). Something piped to Bash in one go, not so much. Once you've captured the .tar.gz, you can both reproduce the exploit if there is one (it's too late for the exploit to hide: you've got the .tar.gz, and you may already have saved it to an append-only system, for example) and verify the checksum of the .tar.gz with other people.

The point of doing all these verifications is not only to avoid getting an exploit: it's also to be able to reproduce an exploit if there is one.
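A sketch of that save-then-verify flow (file names hypothetical; in real use the tarball comes from `curl -o`, not `printf`):

```shell
#!/bin/sh
set -eu
TARBALL=tool-1.0.tar.gz
printf 'stand-in for the downloaded archive\n' > "$TARBALL"

# Record the checksum at download time, so the exact same artifact can
# be re-verified later, or compared with what other people received.
sha256sum "$TARBALL" > "$TARBALL.sha256"

# Later, before installing: refuse to proceed if the archive changed.
sha256sum -c "$TARBALL.sha256"
```

None of this is possible with `curl | bash`: the bytes that ran are gone the moment the pipe closes.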

There's a reason, say, packages in Debian are nearly all both reproducible and signed.

And there's a reason they're not shipped with piping to bash.

Other projects do offer an install script that downloads a file and verifies its checksum. That's the case of the Clojure installer, for example: it verifies the .jar. Now I know what you're going to say: "but the .jar could be backdoored if the site got hacked, for both the checksum in the script and the .jar could have been modified". Yes. But it's also signed with GPG. And I do religiously verify that the "file inside the script" has a valid signature when it has one. And if suddenly the signing key changed, that rings alarm bells.

Why settle for the lowest common denominator security-wise? Because Anthropic (I pay for my subscription, btw) sets a very bad example and relies entirely on the security of its website, and pipes to Bash? This is high-level suckage. A company should know better: it should sign the files it ships and not encourage lame practices.

Once again: all these projects that suck security-wise are systematically built on the shoulders of giants (like Debian) who know what they're doing and who take security seriously.

This "malware exists so piping to bash is cromulent" mindset really needs to die. That mentality is the reason we get major security exploits daily.


> And I do religiously verify that the "file inside the script" does have a valid signature when it has one.

If you want to go down this route, there is no need to reinvent the wheel. You can add custom repositories to apt (and friends): you only need to do this once and verify the repo key, and then you get that automatic verification and installation infrastructure. Of course, not every project offers one.
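For reference, that one-time setup looks roughly like this (the URL and names are placeholders; modern apt wants the key pinned per-repository via `signed-by` rather than added to a global keyring):

```shell
# Fetch and pin the repository's signing key (placeholder URL).
curl -fsSL https://example.com/repo/key.asc \
  | sudo gpg --dearmor -o /usr/share/keyrings/example-archive-keyring.gpg

# Reference exactly that key for exactly that repo.
echo "deb [signed-by=/usr/share/keyrings/example-archive-keyring.gpg] https://example.com/repo stable main" \
  | sudo tee /etc/apt/sources.list.d/example.list

sudo apt update && sudo apt install example-tool
```

After that, every upgrade of the package is signature-checked automatically, with no install script involved.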


Europe here. I disagree. Many SMEs are totally happy with Google Workspace and Canva, as GP mentioned. I know people using just that. And they don't understand why there are people suffering from the Microsoft-Stockholm syndrome.

The market may not yet be 365-sized but, as GP mentioned, it's there.

And there are young people arriving at an age to open a business who have never used a Windows computer in their entire lives. To them, Microsoft is the company that makes the virus-infested, slow computers full of ads they see at their grandparents' house. That cohort ain't buying Windows, buying Office, or using Azure.


And burntsushi is one of us: he's regularly here on HN. Big thanks to him. As soon as rg came out I was building it on Linux. Now it ships stock with Debian (since Bookworm? Don't remember): thanks, thanks and more thanks.

Big thanks to him indeed (and for other projects in Rust space as well).

// really hoping OpenAI won't now force him to work on some crappy Codex stuff if he stays there / at Astral.


> Next I'm going to set it loose on 263 GB database of every stock quote and options trade in the past 4 years.

Options quotes alone for US equities (or things that trade as such, like ADSs/ADRs) represent 40 Gbit per second during options trading hours. There are more than 60 million trades (not quotes, only trades) per day. As the stock market is open approx 250 days per year (a bit more), that's more than 60 billion actual options trades in 4 years. If we're talking about quotes for options, you can add several orders of magnitude to these numbers.

And I only mentioned options. How do you store "every stock quote and options trade in the past 4 years" in 263 GB!?


> And I only mentioned options. How do you store "every stock quote and options trade in the past 4 years" in 263 GB!?

I think this would be pretty straightforward for Parquet with ZSTD compression and some smart ordering/partitioning strategies.


I see, I said "stock quote" when I meant "minute aggregates". You are correct that that data set is much larger, and at ~1.5 TB a year [0] I did not download 6 TB of data onto my laptop. Every settled trade, options or stocks, isn't that big.

[0] https://massive.com/docs/flat-files/stocks/quotes

