
I remember a handful of insanely specific things from this age that I've since verified with my parents - I generally don't seem to misremember or fabricate memories. That being said, there are a _lot_ of things that I require prompting via photo or video to remember, and probably 90% of my life from that time is unrecoverable. I vividly recall random moments from family functions and some intense nightmares, but the family dog, for instance, is a complete black hole. Weird stuff.

Primary school is still knocking around but I think I (and many others) suppress it to retain our sanity :-P


This is perhaps my favorite Stack Overflow answer of all time. I don't remember when I last saw such an approachable explanation of something so critical yet complicated.

> One of the canonical approaches, graph coloring, was first proposed in 1981.

This is about as far as my professor took this topic in class ~13 years ago. Nevertheless the slides that he used to illustrate how the graph coloring problem applied to register allocation stick with me to this day as one of the most elegant applications of CS I've ever seen (which are admittedly few as I'm not nearly as studied as I ought to be).
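For anyone who never saw slides like those: the trick is to build an interference graph (an edge between any two values that are live at the same time) and then k-colour it, where k is the number of machine registers. Here's a toy Python sketch of the Chaitin-style simplify/select loop - the interference graph is a made-up example, and real allocators layer coalescing, spill costs, rematerialization, etc. on top:

    K = 2  # number of machine registers
    # Hypothetical interference graph: an edge means "live simultaneously".
    graph = {
        "a": {"b", "c"},
        "b": {"a", "c"},
        "c": {"a", "b"},
        "d": set(),
    }

    def allocate(graph, k):
        g = {v: set(ns) for v, ns in graph.items()}
        stack, spilled = [], []
        while g:
            # Simplify: a node with < k neighbours can always be coloured.
            v = next((u for u in g if len(g[u]) < k), None)
            if v is None:
                # Nothing trivially colourable: pick a spill candidate.
                v = max(g, key=lambda u: len(g[u]))
                spilled.append(v)
            else:
                stack.append(v)
            for n in g.pop(v):
                g[n].discard(v)
        # Select: colour in reverse removal order, lowest free register first.
        colour = {}
        for v in reversed(stack):
            used = {colour[n] for n in graph[v] if n in colour}
            colour[v] = min(c for c in range(k) if c not in used)
        return colour, spilled

With K=2 the a-b-c triangle can't be coloured, so one of the three values gets spilled to memory - exactly the graph-theory-meets-codegen moment that makes the application so elegant.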

> Code generation is a surprisingly challenging and underappreciated aspect of compiler implementation, and quite a lot can happen under the hood even after a compiler’s IR optimization pipeline has finished. Register allocation is one of those things, and like many compilers topics, entire books could be written on it alone.

Our class final project targeted a register-based VM [1] for this exact reason. I also found writing MIPS assembly simpler than Y86 [2] because of the larger number of registers at my disposal (among the other obvious differences).

[1] https://www.lua.org/doc/jucs05.pdf

[2] https://esolangs.org/wiki/Y86


> Stack Overflow

This is on 'Programming Language Design and Implementation Stack Exchange'[0] -- I'm not pointing this out to tell you that you're wrong -- I think the various 'Stacks Exchange' often have better, more thoughtful answers than the average Stack Overflow question.

[0] really rolls off the tongue


For more theoretical and pure CS stuff, yes they do, since some point between ~2012 and 2015 when SO got overrun by web developers and people grinding coding challenges. And of course there's the mushrooming number of specialist SE network sites [https://stackexchange.com/sites] like PLD, Theoretical CS, Code Review, Computational Science, DBA, SWE, Cross Validated, etc., plus SO in several non-English languages [https://stackoverflow.com/help/non-english-questions].

Even the standards of behavior between different tags on SO itself vary greatly (in terms of how well-written a question is, whether it has an MCVE, whether the OP checked it wasn't a dupe and searched existing Q&A, etc.).

If you want to chronicle the descent, look at Meta posts about "give me teh codez"-type questions [https://meta.stackoverflow.com/search?q=%22give+me+teh+codez...].


Stack Exchanges are some of my favourite things to idly read.

Short, snappy, nice LaTeX and so on.

And then sometimes you have just sublimely creative people answering; e.g. Ron Maimon's answers on physics SE are a goldmine


Most StackExchange websites can also be read offline, since Kiwix regularly archives them:

https://library.kiwix.org/#lang=eng&category=stack_exchange


This is actually also the main thing I have ChatGPT write for me, e.g. I can have it dial the mathematics to exactly the level I can be bothered to tolerate (sometimes life is too short for symbols)

Amen to that. It’s amazing (to me) the difference in cultures of the different exchanges. I see some of the most awesome answers and explanations on aviation SE. Whereas I rarely try to post (answer or question) to SO anymore, because it feels like navigating the DMV.

Unfortunately, just like with parser generators and more powerful parsing algorithms (in particular bottom-up LR and such), the practice proved very different from the theory. Linear scan and variants of it have become the norm for register allocation.
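For reference, the core of Poletto & Sarkar's linear scan fits in a page: sort live intervals by start point, keep an active set sorted by end point, and when you run out of registers spill whichever interval ends furthest away. A rough Python sketch - the intervals are made up, and real implementations split live ranges, handle lifetime holes, weight spill costs, etc.:

    K = 2  # number of machine registers
    # Hypothetical live intervals: (name, start, end).
    intervals = [("a", 0, 4), ("b", 1, 3), ("c", 2, 6), ("d", 5, 7)]

    def linear_scan(intervals, k):
        active, free = [], list(range(k))
        assignment, spills = {}, []
        for name, start, end in sorted(intervals, key=lambda iv: iv[1]):
            # Expire intervals that ended before this one starts.
            for iv in [iv for iv in active if iv[2] <= start]:
                active.remove(iv)
                free.append(assignment[iv[0]])
            if free:
                assignment[name] = free.pop()
                active.append((name, start, end))
            elif active and active[-1][2] > end:
                # Steal the register of the active interval ending last.
                victim = active.pop()
                spills.append(victim[0])
                assignment[name] = assignment.pop(victim[0])
                active.append((name, start, end))
            else:
                spills.append(name)  # current interval ends last: spill it
            active.sort(key=lambda iv: iv[2])  # keep sorted by end point
        return assignment, spills

It touches each interval once and gives up optimality for speed, which is exactly the trade JIT compilers want.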

GCC uses a region-based allocator with graph coloring, based to some extent on Callahan/Koblenz's work I think: https://gcc.gnu.org/git/?p=gcc.git;a=blob;f=gcc/ira.cc

Note that in the GCC context, LRA means Local Register Allocator, not linear scan: https://gcc.gnu.org/git/?p=gcc.git;a=blob;f=gcc/lra.cc

(There was much more talk recently of GCC's LRA than IRA because completing the reload-to-LRA transition threatened the removal of some targets that still depended on the old reload pass.)


I've had a lot of success using chordal graph allocators. They provide plenty of extra dimensions of 'relaxation' to tune them, they're incremental (so they allow pinning), and they decay nicely when their constraints are violated. Because of their incremental nature & "niceness" of decay, they can be forced into a nice hierarchical form ("middle out" on the loops). The main driving algorithm (maximum cardinality search) is a little harebrained; but if you just relax and write up the code, you'll find it is surprisingly short & robust, and highly amenable to unit testing.

Spilling/filling is a bit exciting, since chordal coloring doesn't provide a lot of direction, but I've found that pressure heuristics fill in the gap nicely. The whole thing relies on having a robust interference graph — which more than kind of sucks — but we don't get into compilers unless we've weaponized our bit-set data structures in the first place.
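For the curious, "harebrained but short" is about right - maximum cardinality search is just "repeatedly visit the vertex with the most already-visited neighbours". A minimal sketch, assuming a toy dict-of-sets graph representation (a real allocator would run this over the interference graph):

    def mcs_order(graph):
        # graph: dict mapping vertex -> set of neighbouring vertices
        weight = {v: 0 for v in graph}
        order = []
        while weight:
            v = max(weight, key=weight.get)  # most visited neighbours wins
            order.append(v)
            del weight[v]
            for n in graph[v]:
                if n in weight:
                    weight[n] += 1
        return order

Greedy-colouring the vertices in this order is optimal when the graph is chordal, and that's why the approach decays so gracefully: on a non-chordal graph the same code still runs, you just lose the optimality guarantee.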


Is this true though? Last time I worked on a compiler (admittedly quite a few years ago), Briggs was the bare minimum; our compiler in particular used an improvement over Callahan's hierarchical register allocation (the basic idea of which is that you should prioritize allocation in innermost loops over a better "global" graph coloring, since spilling once in the inner loop costs way more than spilling several registers in the linear/setup part of the code).

I would expect that only compilers for immature languages (that don't care about optimization) use naive RA.


Or JIT compilers, where compilation time is an important factor -- e.g. V8 uses variations on linear scan.

At least GCC appears to use a graph coloring algorithm. LLVM seems to have moved from linear scan to a custom algorithm in version 3; I have no idea what they're using nowadays.

LLVM now uses something they call the "Greedy Register Allocator". As far as I can tell, it's a variation on the linear allocator with some heuristics. Here's a presentation: https://www.youtube.com/watch?v=hf8kD-eAaxg

Compilers are of course one of the purest applications of theoretical computer science.

Just that none of the parsing methods I learned are actually used commonly in most real compilers.

Oh, it's from Alexis King. No wonder it's written so well.

This is pretty great satire because it's a (presumably) working language that's chock-full of ridiculous design decisions that also happen to be in some of the most popular ones that people use every day. You can probably guess which is the chief target of its ire (though there are funnies from a few others peppered in there too).

A Reddit-tier post would be an image macro or Twitter screenshot. Alternatively maybe I find it funny because I've had to listen to lazy developers defend some of these types of decisions with a straight face.


Based on the following I think they also meant _how_ the code is running:

> The other group (increasingly large) just wants to `git push` and be done with it, and they're willing to spend a lot of (usually their employer's) money to have that experience. They don't want to have to understand DNS, linux, or anything else beyond whatever framework they are using.

I'm a "full full-stack" developer because I understand what happens when you type an address into the address bar and hit Enter - the DNS request that returns a CNAME record to object storage, how it returns an SPA, the subsequent XHR requests laden with and cookies and other goodies, the three reverse proxies they have to flow through to get to before they get to one of several containers running on a fleet of VMs, the environment variable being injected by the k8s control plane from a Secret that tells the app where the Postgres instance is, the security groups that allow tcp/5432 from the node server to that instance, et cetera ad infinitum. I'm not hooking debuggers up to V8 to examine optimizations or tweaking container runtimes but I can speak intelligently to and debug every major part of a modern web app stack because I feel strongly that it's my job to be able to do so (and because I've worked places where if I didn't develop that knowledge then nobody would have).

I can attest that this type of thinking is becoming increasingly rare as our industry continues to specialize. These considerations are now often handled by "DevOps Engineers" who crank out infra and seldom write code outside of Python and bash glue scripts (which is the antithesis of what DevOps is supposed to be, but I digress). I find this unfortunate because it results in teams throwing stuff over the wall to each other, which only compounds the hand-wringing when things go wrong. Perhaps this is some weird psychopathology of mine but I sleep much better at night knowing that if I'm on the hook for something I can fix it once it's out in the wild, not just when I'm writing features and debugging it locally.


> I can attest that this type of thinking is becoming increasingly rare as our industry continues to specialize.

This (and a few similar upthread comments) sum the problem up really concisely and nicely: pervasive, cross-stack understanding of how things actually work and why A in layer 3 has a ripple effect on B in layer 9 has become increasingly rare, and those who do know it are the true unicorns in the modern world.

A big part of the problem is the lack of succession / continuity at the university level. I have been closely working with very bright, fresh graduates/interns (data science, AI/ML, software engineering – a wide selection of very different specialisations) in the last few years, and I have even hired a few of them because they were that good.

Talking to them has given me interesting insights into what and how universities teach today. My own conclusion is that the reputable universities teach very well, but what they teach is highly compartmentalised and typically there is little to no intersection across areas of study (unless the prospective student hits the jackpot and enrolls in elective studies that go across the areas of knowledge). For example, students who study game programming (yes, it is a thing) do not get taught the CPU architectures or low-level programming in assembly; they have no idea what a pointer is. Freshly graduated software engineers have no idea what a netmask is and how it helps in reading a routing table; they do not know what a route is, either.
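To make the netmask example concrete: the netmask just says how many leading bits of an address identify the network, and reading a routing table is longest-prefix matching over those networks. A minimal Python sketch, with a made-up two-entry routing table:

    import ipaddress

    # Hypothetical routing table: network -> next hop.
    routes = {
        ipaddress.ip_network("0.0.0.0/0"): "via 192.168.1.1 (default gateway)",
        ipaddress.ip_network("192.168.1.0/24"): "dev eth0 (directly connected)",
    }

    def lookup(dest):
        addr = ipaddress.ip_address(dest)
        # The most specific (longest-prefix) matching route wins.
        best = max((n for n in routes if addr in n), key=lambda n: n.prefixlen)
        return routes[best]

    print(lookup("192.168.1.42"))  # dev eth0 (directly connected)
    print(lookup("8.8.8.8"))       # via 192.168.1.1 (default gateway)

That is the whole concept the graduates in question were missing.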

So modern ways of teaching are one problem. The second (and I think a big one) is that computing hardware has become heavily commoditised and appliance-like, in general. Yes, there are a select few who still assemble their own racks of PC servers at home or tinker with Raspberry Pi and other trinkets, but it is no longer an en masse experience. Gone are the days when signing up with an ISP also required building your own network at home. That had the important side effect of imparting cross-stack knowledge, which today can only be gained by deliberately taking up a dedicated uni course.

With all of that disappearing into oblivion, the worrying question I have is: who is going to support all this «low level» stuff 20 years from now, without a clear plan for the cross-stack knowledge to be passed down from the current (and last?) generation of unicorns?

So those who drum up the flexibility of k8s and the like miss one important aspect: without cross-stack knowledge succession, k8s is a risk for any mid- to large-sized organisation, because it relies heavily on the unicorns and rockstar DevOps engineers who are few and far between. It is much easier to palm the infrastructure off to a cloud platform, where supporting it becomes someone else's headache whenever there is a problem. And the cloud infrastructure usually just works.


> For example, students who study game programming (yes, it is a thing) do not get taught the CPU architectures or low-level programming in assembly; they have no idea what a pointer is. Freshly graduated software engineers have no idea what a netmask is and how it helps in reading a routing table; they do not know what a route is, either.

> So modern ways of teaching are one problem.

IME school is for academic discovery and learning theory. 90% of what I actually do on the job comes from self-directed learning. From what I gather this is the case for lots of other fields too. That being said I've now had multiple people tell me that they graduated with CS degrees without having to write anything except Python so now I'm starting to question what's actually being taught in modern CS curricula. How can one claim to have a B.Sc. in our field without understanding how a microprocessor works? If it's in deference to more practical coursework like software design and such then maybe it's a good thing...


> […] self-directed learning.

And this is whom I ended up hiring – young engineers with curious minds, who are willing to self-learn and are continuously engaged in the self-learning process. I also continuously suggest interesting, promising, and relevant new things to take a look into, and they seem to be very happy to go away, pick the subject of study apart, and, if they find it useful, incorporate it into their daily work. We have also made a deal with each other that they can ask me absolutely any question, and I will explain and/or give them further directions of where to go next. So far, such an approach has worked very well – they get to learn arcane (it is arcane today, anyway) stuff from me, they get full autonomy, they learn how to make their own informed decisions, and I get a chance to share and disseminate the vast knowledge I have accumulated over the years.

> How can one claim to have a B.Sc. in our field without understanding how […]

Because of how universities are run today. A modern uni is a commercial enterprise, with its own CEO, COO, C<whatever other letter>O. They rely on revenue streams (a previously unheard-of concept for a university), they rely on financial forecasts, and, most important of all, they have to turn profits. So a modern university is basically a slot machine – the outcomes it yields depend entirely on how much cash one is willing to feed it. And, because of that, there is no incentive to teach across areas of study, as doing so does not yield higher profits or is a net negative.


Maybe in the US. Any self-titled engineer in Europe with no knowledge of CPUs, registers, stacks, concurrency, process management, scheduling, O-notation, dynamic systems, EE, and a bigass chunk of math from linear to abstract algebra would be insta-bashed on the spot and get no degree at all.

Here in Spain, at the most basic uni, you end up almost able to write a Minix clone from scratch for some easy CPU (RISC-V, maybe) with all the knowledge you get.

I am no engineer (trade/voc arts, just a sysadmin) and I can write a small CHIP-8 emulator at least...


I am not based in the US, and I currently work for one of the top 100 universities of the world (the lower 50 part, though).

Don't link to this guy's site. He has a serious personal problem with every reader of HN (including the vast majority he's never met and knows nothing about) and serves an NSFW image to anybody that has this site in the Referer request header.

I find it insane that everybody is seemingly okay with the fact that websites get to know what a user was browsing before landing on their site. And that there's no straightforward way to disable this behavior without the use of extensions.

Highly recommend installing https://addons.mozilla.org/en-US/firefox/addon/smart-referer


Hmm, the addon description says it "has been largely superseded by better browser defaults"

Huh, no idea why it would say that; the only sane default is for the referrer not to be leaked to websites at all.
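To be fair, the defaults did get meaningfully better: Chrome and Firefox now default to strict-origin-when-cross-origin, so cross-site navigations leak only the bare origin (e.g. https://news.ycombinator.com/) rather than the full URL. A site can also opt out entirely by sending a Referrer-Policy header; here's a minimal sketch using Python's standard library (a hypothetical toy server, not production code):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class NoReferrerHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            # Ask browsers to omit the Referer header on all outbound links.
            self.send_header("Referrer-Policy", "no-referrer")
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b'<a href="https://example.com/">outbound</a>')

    HTTPServer(("localhost", 8000), NoReferrerHandler).serve_forever()

But that's the sending site's choice, which is exactly the problem: the user still needs an extension to enforce it everywhere.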

Doesn't seem NSFW to me at all. It's a skin-textured egg with hairs in an egg cup.

https://cdn.jwz.org/images/2024/hn.png


Huh, looks like he changed the image in 2024 (based on url). Before today, I'd have said the jwz image was goatse.

It was always the hairy egg/scrotum.

This is a piece of Internet lore I'm not familiar with! Any good summary or article or anything you'd recommend?

There's nothing much to it. Jamie Zawinski just doesn't think highly of Venture Capital etc. He thinks that the first tech boom ruined San Francisco and he thinks that Paul Graham is terrible. That's pretty much it.

Considering that I've spent many a night at Bootie SF (back when it was at DNA Lounge) on too many drugs or alcohol to respond to on-call issues, it's entirely possible that he had more power over tech than it had over his life.

Like many people in San Francisco, he is an intelligent guy with great will to power and eccentric views.


Why do you need an article? Just find a link to some page on http://jwz.org somewhere on news.ycombinator.com, and click it. Note that you will get a (really quite tame) NSFW image. Said image will explain jwz's feelings regarding Hacker News.

Presumably they are looking for more than the 2 lines in the image that don't explain anything about why.

The lines do explain why, from the horse's mouth. I don't know what value-add you'd expect from an "article" - an interview with jwz?

https://www.jwz.org/blog/2011/11/watch-a-vc-use-my-name-to-s... doesn't mention HN directly, but will give you a sense of his POV (copy-and-paste the link or you'll get the behavior described up-thread)

Alright, maybe "article" was a poor word choice? Thanks for an example of of what happens when you link to their site from here, I was just curious to read into the drama, it sounds like an interesting story from a pocket of the internet I'm not familiar with.

When I said "article," I guess I meant that I don't want to watch a youtube video detailing the drama or whatever, I was hoping to just read about it. Either way, thanks for the link and the explanation further down the thread.


Coders at Work has an interview with a pretty comprehensive backstory of his time in the tech industry. If you want to get more of his vibe, he's also in the documentary Code Rush.

Honestly it's refreshing to see someone in tech stand up to the VC-only vision of what tech can be. I still enjoy HN, but I got a good laugh at the image that popped up and agreed he has a point.

He seems to have a pretty good grasp of the average commenter here, I don't know what your problem is with him exercising his free speech rights on his own website to deal with people he feels that way about.

Sounds like you're the one with the personal problem. Let people read and link to what they want, and mind your business.

Get a better browser that doesn't leak the site you came from all over the place.

Sounds like a sensible and principled fellow.

Considering that Netscape used to have “about:jwz” as an Easter egg URL and he was influential in open sourcing Netscape…

Didn't any unknown about: string just lead to somebody's wwwhome with simple URL rewriting? At least, with some versions?

No, there was a pretty small but interesting list of people in the source code.

https://www.jwz.org/doc/about-jwz.html


Yeah, I remembered that after, had to open another browser window to confirm it because otherwise I just got the cached one, and then edited.

You can hate jwz as much as you want but the fact remains that probably 95% of Y Combinator-funded startups couldn't exist without making extensive use of the GPL, BSD, Apache, and similarly licensed software produced by a lot of open source curmudgeons and greybeards.

Many of whom share his same opinion on VCs and late stage capitalism.


> You can hate jwz as much as you want

I don't hate him and didn't write anything of the sort. I've never met him. He might as well be the tooth fairy or the Easter bunny as far as I'm concerned.

> the fact remains that probably 95% of Y Combinator-funded startups couldn't exist without making extensive use of the GPL, BSD, Apache, and similarly licensed software produced by a lot of open source curmudgeons and greybeards. Many of whom share his same opinion on VCs and late stage capitalism.

I've been using FOSS for personal use since I was a child. I'm a literal card-carrying supporter of the FSF. I'm more passionate about it than I am about shitting on random people on the internet. If he has a problem with VCs then he can spam them with pictures of genitalia and ad-hominem attacks, not me, who's just some random dude who _isn't_ worth millions of dollars from a software business exit (along with thousands of others on this site).


It's a shitty move, I'll grant you, but it's also clearly a form of protest, and acts of protest are by design intended to make the public uncomfortable in some way. As far as this kind of thing goes, I personally find it mildly amusing, mildly offensive, and mildly clever. I think the internet is more interesting for people like jwz doing things like this. Not necessarily "trolling," because this--for all its crude imagery and sardonic language--is a very clever way to continuously raise awareness of a larger issue he has taken with society. I was just motivated to read a pair of blog posts from 13 years ago, ones that required additional effort to access, and considered jwz's perspective on VCs and a hyper-aggressive work culture...for anyone who's ever attempted to publish any kind of content? That's kind of impressive.

Not that I would seek to deny your valid annoyance and offense. It's a non-consensual scrotum in a teacup. I personally would have gone a different direction, I get why that would raise someone's hackles.


> You can hate jwz as much as you want but the fact remains that probably 95% of Y Combinator-funded startups couldn't exist

wait, this would be a bad thing?


But without six different AI-enabled VSCode forks to choose from, how would I ever write software?



Oh wow, love it, a testicle in a teacup, dude's just made my day!

> Oh wow, love it, a testicle in a teacup, dude's just made my day!

It's an eggcup.


Well now we're just splitting pubes.

No, that's the science project being worked on at CERN, the Large Hardon Collider

He's not wrong per se, just really hostile to the idea.

Unexpectedly appropriate with the price of eggs these days

$2.50 for 10 eggs? Not getting that reference.

> Don't tell me what to do.

Breathe.


As I understand it this has already been done for several systems. The first time I saw such a device it was one of those bootleg "100 Games In A Joystick" things that Ben Heck tore apart because it contained a glop-top "NES on a chip".

> FPGAs are pretty expensive I imagine compared to some 40 year old cpu design and 1kb of ram.

The MiSTer Pi set me back $180 and is perhaps my favorite purchase of 2024. It'll run almost any system made before the year 2000 and the gap is rapidly closing for lesser-appreciated consoles like the Saturn and Jaguar. This represents a tremendous value; it's hard to argue with that kind of money for an entire century of gaming. I heartily recommend picking one up.


Tangentially related: I have a vague memory of reading somewhere that the PCM sampling frequency for CD Audio was decided between Sony and Philips by way of a surf contest. Their respective frequencies of choice were such that they could fit some arbitrarily-long piece of classical music onto a single disc, so they decided to battle it out on the waves (the conference where they were hashing this out was in Hawaii). Philips supposedly won, so we got 44.1 kHz.

I just did a cursory web search to find this anecdote and was unsuccessful. Did I make this up whole cloth or is it just buried someplace? Or was I bamboozled at a young age by some random forumite on a now-defunct site?

*EDIT: This comprehensive account [1] seems to confirm that the story is completely apocryphal.

[1] https://www.dutchaudioclassics.nl/The-six-meetings-Philips-S...


Sony won, not Philips. It seems that the rationale is like so: a higher rate would not be compatible with both NTSC and PAL VCRs, and a lower rate would shrink the transition band for the antialiasing filter (or alternatively, reduce usable audio bandwidth to an extent that encroaches on the generally accepted limit of human hearing). Although the latter hardly seems relevant when the alternatives being debated were so close (Philips' 44056 Hz, for example)!

https://en.wikipedia.org/wiki/44,100_Hz#Origin
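The arithmetic behind it is neat: early PCM adaptors stored digital audio on video cassettes, three 16-bit samples per usable video line, and 44.1 kHz is the rate that falls out of both NTSC and PAL timings:

    ntsc = 3 * 245 * 60  # 3 samples/line x 245 usable lines/field x 60 fields/s
    pal  = 3 * 294 * 50  # 3 samples/line x 294 usable lines/field x 50 fields/s
    assert ntsc == pal == 44100

Philips' competing 44056 Hz is the same math applied to colour NTSC's 59.94 fields/s.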


> a lower rate would shrink the transition band for the antialiasing filter (or alternatively, reduce usable audio bandwidth to an extent that encroaches on the generally accepted limit of human hearing)

I've seen people smarter than me argue that the ideal sampling rate is actually somewhere around 64 kHz because it would allow for a gentler anti-aliasing filter with fewer phase artifacts.


Why couldn't they make use of a sampling rate with five samples per frame (which would exactly give 88.2kHz by the way)?

Because doubling the sample rate would halve the playing time of a CD, or require twice the density, while not bringing any audible increase in quality for the human ear.

> noodles

Replace them with a heartier starch like lentils or farro. As a bonus you also end up with better micro- and macronutrients than you'd get from enriched egg noodles or the like.

My current meal prep stew comprises olive oil, pork, leeks, garlic, carrots, mushrooms, lentils, onions, tomato paste, and kale. For seasoning I'll throw in red pepper flakes and Italian seasoning, along with some balsamic vinegar if I don't get the depth of flavor I want. A little gorgonzola or other blue cheese on the side really sets it off.


While there are guidelines in the form of an unclassified, gov't-hosted desk reference [1], ultimate discretion seems to be left to adjudicators. There are public archives [2] that talk about such decisions made in tricky cases of criminal activity, financial malfeasance, potential blackmail, etc. I've seen at least one where somebody admitted to doing hard drugs while holding a clearance and actually got upgraded to a TS (though this was ~10 years ago and I can't find it via search engine).

[1] https://www.dhra.mil/portals/52/documents/perserec/adr_versi...

[2] https://doha.ogc.osd.mil/Industrial-Security-Program/Industr...


Musk's clearance was already somewhat restricted because of it, which limited the things he could see at SpaceX:

https://www.cnbc.com/2019/03/07/elon-musk-asks-pentagon-for-...

After he ran a $1-million-a-day lottery only for registered Republicans (to bypass pay-for-registration laws) in a swing state during the election... I'm guessing any leftovers of that will be lifted.


More recent sources say lawyers advised him not to try for a higher clearance because of the drug use: https://www.thedailybeast.com/elon-musk-denied-access-to-spa...

They run their own DNS infra so that when you point your zone's NS records at their servers they can decide what to resolve to. If you have protection set on a specific record then it resolves to a fleet of nginx servers with a bunch of special sauce that does the reverse proxying that allows for WAF, caching, anti-DDoS, etc. It's entirely feasible for them to exempt specific requests like this one since they aren't "protect[ing] the whole DNS" so much as using it to facilitate control of the entire HTTP request/response.
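You can see the effect from outside: a proxied record resolves to the provider's edge fleet, and the origin's real address never appears in public DNS. A quick Python sketch (the hostnames are placeholders, not real records):

    import socket

    # A DNS-only record returns the origin's address; a proxied record
    # returns the anycast addresses of the edge/nginx fleet instead.
    for host in ("dns-only.example.com", "proxied.example.com"):
        try:
            infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
            print(host, sorted({info[4][0] for info in infos}))
        except socket.gaierror:
            print(host, "does not resolve (placeholder name)")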
