we had a much larger disruption to logistics just a few years ago if you recall =) and hiring really took off then. so i don't think this argument really holds any water
Money was free then, and hiring took off in software, which people were relying on a lot more during the pandemic. All of my friends who are performers or work in events were wrecked financially. The industry still hasn’t really recovered.
Not that I disagree with your assessment, but in the spirit of HN pedantry: Google had a very significant breach where Gmail was a primary target, and that was “only” 16 years ago, in mid-2009. So bad that it has its own Wikipedia page: https://en.wikipedia.org/wiki/Operation_Aurora
The DNS filter setting on the FortiGate analyzes the DoH traffic and strips out the ECH parameters sent by the DNS server in the DoH response. If the client does not receive those parameters, it cannot encrypt the inner SNI, so it will send it in clear text.
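A minimal sketch of the client-side behavior being exploited (function and parameter names are hypothetical, not FortiGate or browser APIs): if the HTTPS/SVCB record in the DoH response carries no "ech" SvcParam, the client has no ECHConfig to encrypt with, so it falls back to a cleartext SNI.

```python
# Sketch, assuming a parsed HTTPS (type 65) record as a dict of SvcParams.
# The "ech" SvcParam (SvcParamKey 5) carries the ECHConfigList.

def choose_sni_mode(svc_params):
    """Return how the client will send the SNI, given the record it received."""
    if svc_params.get("ech"):
        return "encrypted"  # inner SNI goes inside the encrypted ClientHello
    return "cleartext"      # no ECH config -> classic cleartext SNI fallback

# A middlebox that strips the "ech" key from the response flips the outcome:
upstream = {"alpn": "h2", "ech": b"\x00\x41..."}  # what the DoH server sent
filtered = {k: v for k, v in upstream.items() if k != "ech"}  # post-strip

# choose_sni_mode(upstream) -> "encrypted"
# choose_sni_mode(filtered) -> "cleartext"
```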
So basically they mess with the ECH config in the DoH response and trigger fallback behavior in the clients. I don't think any browsers refuse to fall back yet, but I don't think this loophole is gonna last.
I'm surprised that works. Doesn't TLS1.3 do the thing where it crosschecks (a hash of) the setup parameters after key-agreement to protect against exactly this kind of downgrade attack?
(My phone screen is too small to look through the RFCs right now.)
I think what you're describing is TLS1.3 Finished verification, and that happens during the actual handshake, after the DoH response. Basically this works because ECH is fairly new and there's no HSTS-style "always use ECH for this site" configuration yet. And ofc this only works if you've configured the FortiGate as your DNS (corp network) or if it's doing MITM (though I'd expect browsers to verify the cert fingerprint for DoH connections as well).
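A toy illustration of why the Finished check doesn't catch this (hashes and keys below are fake placeholder values, not the real TLS 1.3 key schedule): the transcript hash only covers handshake messages actually exchanged. If the DoH response was filtered before the handshake started, the client sends a plain ClientHello, both sides hash the same ECH-free transcript, and the Finished MACs verify fine.

```python
import hashlib
import hmac

def transcript_hash(messages):
    """Hash the concatenated handshake messages, as TLS 1.3 does conceptually."""
    h = hashlib.sha256()
    for m in messages:
        h.update(m)
    return h.digest()

# Both endpoints observe the identical, already-downgraded handshake:
client_view = [b"ClientHello(sni=example.com, no ECH)", b"ServerHello"]
server_view = [b"ClientHello(sni=example.com, no ECH)", b"ServerHello"]

# In real TLS 1.3 this key comes from the key schedule; a fake value suffices
# to show the point: same transcript -> same MAC -> verification succeeds.
finished_key = b"toy-finished-key"
client_mac = hmac.new(finished_key, transcript_hash(client_view), hashlib.sha256).digest()
server_mac = hmac.new(finished_key, transcript_hash(server_view), hashlib.sha256).digest()

assert client_mac == server_mac  # the downgrade is invisible to the Finished check
```

The integrity check protects what's in the transcript; the ECH strip happens before the transcript exists.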
This is literally impossible. What your corp fw likely does is MITM based on the outer SNI, because your IT department installed the company CA in every client's trust store. So unless you do that at a national level, your only other option is to block ECH entirely.
Edit: actually totally possible, but you need to build a quantum computer with sufficient qubits first =)
Won't make a bit of difference, because everything is in a sort of container (not Docker) anyway. Unless you're suggesting those libraries be distributed as a base image to every possible Borg machine your app can run on, which is an obvious non-starter.
Well, if you just create a brand new account with no/random connections you'd be pretty disappointed… that’s how they get ya. It’s totally fine as a sort of virtual rolodex, maybe some content marketing; mediocre as a job board, although all job boards seem to have turned into a total lemon market, so there’s that.
At the time this was written, a powerful backend server only had like 4 cores. Linux had only started adopting SMP around that same year. Also, CPU caches were tiny.
Serving less than 1k qps per core is pretty underwhelming today; at such a high core count you'd likely hit OS limitations way before you're bound by hardware.
Linux had been doing SMP for about 5 years by that point.
But you're right, OS resource limitations (file handles, PIDs, etc.) would be the real pain for you. One problem after another.
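The file-handle ceiling is usually the first of those limits a high-connection-count box hits; a quick sketch using Python's stdlib `resource` module (Unix-only) to inspect and raise it:

```python
# Each TCP connection consumes one file descriptor, so serving huge
# concurrent client counts on one box means raising the soft limit well
# beyond the common default of 1024. Raising soft up to hard needs no
# privileges; raising hard does.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"fd limit: soft={soft} hard={hard}")

# Bump the soft limit to the hard ceiling for this process.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
assert resource.getrlimit(resource.RLIMIT_NOFILE)[0] == hard
```

Past the fd limit you start trading kernel tunables (somaxconn, ephemeral port range, conntrack tables) one at a time, which is the "one problem after another" treadmill.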
Now, the real question is do you want to spend your engineering time on that? A small cluster running erlang is probably better than a tiny number of finely tuned race-car boxen.
My recollection is fuzzy, but I remember having to recompile 2.4-ish kernels to enable SMP back in the day, which took hours... And I think it was buggy too.
Totally agree on many smaller boxes vs one bigger box, especially for the proxying use case.
Nothing about the current JS ecosystem screams good architecture; it’s hacks on hacks and a culture of totally ignoring everything outside of your own little bubble. Reminds me of the early-2000s JavaBeans scene.
Reminds me of early 2000s JS scene, in fact. Anything described as "DHTML" was a horrid hack. Glad to see Lemmings still works, though, at least in Firefox dev edition. https://www.elizium.nu/scripts/lemmings/
Unqualified broadscale hate making no assertions anyone can look at and check? This post is again such wicked nasty Dark Side energy, fueled by such empty vacuous nothing.
It was an incredible era that brought amazing aliveness to the internet. Most of it was pretty simple & direct, and would be more intelligible to folks today than today would be intelligible & legible to folks from then.
This behavior is bad & deserves censure. The hacker spirit is free to be critical, but in ways that expand comprehension & understanding. To classify a whole era as a "horrid hack" is injurious to the hacker spirit.