Hacker News | haberman's comments

> Thankfully, there is the esm-integration proposal, which is already implemented in bundlers today and which we are actively implementing in Firefox.

From the code sample, it looks like this proposal also lets you load WASM code synchronously. If so, that would address one issue I've run into when trying to replace JS code with WASM: the ability to load and run code synchronously, during page load. Currently WASM code can only be loaded async.


This is not strictly true; there are synchronous APIs for compiling Wasm (`new WebAssembly.Module()` and `new WebAssembly.Instance()`) and you can directly embed the bytecode in your source file using a typed array or base64-encoded string. Of course, this is not as pleasant as simply importing a module :)

> Not everyone in history thought that 12-TET was an acceptable compromise. Johann Sebastian Bach thought we should use other tuning systems

This is presented as fact, but as I understand it there is no conclusive evidence for what Bach intended wrt temperament. There is a theory that the title page of the Well-Tempered Clavier encodes Bach’s preference in the calligraphic squiggles, but this is a recent theory and speculative. I don’t believe there are any direct statements by Bach as to his intention.


TL;DR: when a user writes to /proc/self/mem, the kernel bypasses the MMU and hardware address translation, opting to emulate it in software (including emulated page faults!), which allows it to disregard any memory protection that is currently set up in the page tables.


It doesn't exactly bypass it: the access still goes through virtual memory and the page tables. It's just that the kernel maintains one big linear mapping of RAM that is writable.

Thank You.


> So many of our foundational institutions – hiring, journalism, law, public discourse – are built on the assumption that reputation is hard to build and hard to destroy. That every action can be traced to an individual, and that bad behavior can be held accountable. That the internet, which we all rely on to communicate and learn about the world and about each other, can be relied on as a source of collective social truth. [...] The rise of untraceable, autonomous, and now malicious AI agents on the internet threatens this entire system.

I disagree. While AI certainly acts as a force multiplier, all of these dynamics were already in play.

It was already possible to make an anonymous (or not-so-anonymous) account that circulated personal attacks and innuendo, to make hyperbolic accusations and inflated claims of harm.

It's especially ironic that the paragraph above talks about how it's good when "bad behavior can be held accountable." The AI could argue that this is exactly what it's doing, holding Shambaugh's "bad behavior" accountable. It is precisely this impulse -- the desire to punish bad behavior by means of public accusation -- that the AI was indulging or emulating when it wrote its blog post.

What if the blog post had been written by a human rather than an AI? Would that make it justified? I think the answer is no. The problem here is not the AI authorship, but the actual conduct, which is an attempt to drag a person's reputation through mudslinging, mind-reading, impugning someone's motive and character, etc. in a manner that was dramatically disproportionate to the perceived offense.


Lately I'm seeing more and more value in writing down expectations explicitly, especially when people's implicit assumptions about those expectations diverge.

The linked gist seems to mostly be describing a misalignment between the expectations of the project owners and its users. I don't know the context, but it seems to have been written in frustration. It does articulate a set of expectations, but it is written in a defensive and exasperated tone.

If I found myself in a situation like that today, I would write a CONTRIBUTING.md file in the project root that describes my expectations (e.g. PRs are / are not welcome, decisions about the project are made in X fashion, etc.) in a dispassionate way. If users expressed expectations that were misaligned with my intentions, I would simply point them to CONTRIBUTING.md and close off the discussion. I would try to take this step long before I had reached the level of frustration that is expressed in the gist.

I don't say this to criticize the linked post; I've only recently come to this understanding. But it seems like a healthier approach than to let frustration and resentment grow over time.


Agreed, TFA is a good example of how to write down expectations explicitly.

But as far as dinging Hickey for the fact that he eventually needed to write bluntly? I'm not feeling that at all. Some folks feel that open-source teams owe them free work. No amount of explanation will change many of those folks' minds. They understand the arguments. They just don't agree.


> he eventually needed to write bluntly

Is there a history of that here? Were there earlier clear statements of expectations (like CONTRIBUTING.md) that expressed the same expectations, but in a straightforward way, that people just willfully disregarded?

I don't mean to "ding" anybody, I mostly just felt bad that things had gotten to the point where the author was so frustrated. I completely agree that project owners have the right to set whatever terms they want, and should not suffer grief for standing by those terms.


I don't remember the exact situation, but I think this relates to this:

Clojure core was sent a set of patches that were supposed to improve the performance of immutable data structures, but they were provided without much consideration of the bigger picture, or were over-optimized for a specific use case.

There's a Reddit thread which provides a bit more detail so excuse me if I got some of it wrong: https://www.reddit.com/r/Clojure/comments/a01hu2/the_current...

*Edit* - actually this is a better summary: https://old.reddit.com/r/Clojure/comments/a0pjq9/rich_hickey...


Dissatisfaction n. 3 is the essence of the problem: "Because Clojure is a language and other people's jobs and lives depend on it, the project no longer feels like someone's personal project which invites a more democratic contribution process". This is a common, and modern, feeling that the more users a certain thing has, the more the creators/maintainers have a duty to treat it as a "commons or public infrastructure" and give the users a vote on how the thing is to be managed and developed. This is, of course, utter horsesh*t.


> Is there a history of that here?

I have been maintaining not-super-successful open source projects, and I've had to deal with entitled jerks. Every. Single. Time. I am totally convinced that any successful open source project sees a lot more of that.

> Were there earlier clear statements of expectations (like CONTRIBUTING.md) that expressed the same expectations, but in a straightforward way, that people just willfully disregarded?

IMO it's not needed. I don't have to clearly state expectations: I open source my code, and you're entitled to exactly what the licence says. The CONTRIBUTING.md is more a kind of documentation, trying to avoid having to repeat the same thing for each contribution. But I don't think anyone would write "we commit to providing free support and free work when someone asks for it" in there :-).


Someone once said: Abuse and expectations erode a culture of cooperation.

I am currently seeing this in real time at $work. A flagship product has been placed onto the platform we're building, and the entire sales/marketing/project culture is not adjusting at all. People are pushy, abusive, communicate badly, and escalate everything to the C-level. As a result, we in Platform Engineering are now channeling our inner old-school sysadmins: putting up support processes, tickets, rules, and expectations, and everything else can go die in a ditch.

Everyone suffers now, but we need to do this to manage our own sanity.

And to me at least, it feels like this is happening with a lot of OSS infrastructure projects. People are getting really pushy and pissy about something they need from these projects. I'd rather talk to my boss about setting up a PR for something we need (and I'm decently successful with those), but other people are just very angry that OSS projects don't fulfill their very niche needs.

And then you get into this area of anger and frustration, putting up boundaries that are harmful but necessary to the maintainers.

Even just "sending them to the CONTRIBUTING.md": at work, we send out dozens of reminders per week about the documentation and how to work with us effectively, to just a few people. This is not something I would do in my free time for even a single day, and so far the pain-curbing salary is also looking slim.


Furthermore, writing down the contract calmly, as part of a plan, can avoid having to bang it out in frustration and leaving a bad taste.


> I don't say this to criticize the linked post

What you have written is obviously a criticism of the linked post.


If I'm criticizing the linked post, then I'm also criticizing myself, because I could easily imagine having written it.


I think some might get the impression that you're complaining about Hickey's tone. Perhaps your emotional terms "frustration," "defensive," and "exasperated" are the reason.


I don't see anything wrong with the way he expressed himself, and I think his point is totally legitimate. I mostly just felt bad that he experienced so much grief about it, on account of a gift he was offering to the world.


"So much grief." It sounds like you're trying to interpret Hickey's emotions. How would you check whether your interpretation is accurate?


I don't know if you're a native English speaker, so apologies if this isn't appropriate. But the word 'grief' has more than one vernacular meaning.

"Giving someone grief" means giving someone a hard time.

So "he experienced so much grief" can just mean that people criticised him. It doesn't necessarily express anything about Rich Hickey's state of mind.


More concretely, I think the magic lies in these two properties:

1. Conservation of mass: the amount of C code you put in will be pretty close to the amount of machine code you get out. Aside from the preprocessor, which is very obviously expanding macros, there are almost no features of C that will take a small amount of code and expand it into a large amount of output. This makes some things annoyingly verbose to write in C (e.g. string manipulation), but that annoyance reflects a true fact of machine code, which is that it cannot handle strings very easily.

2. Conservation of energy: the only work performed is the work you put into your program. There is no "supervisor" performing work on the side (garbage collection, stack checking, context switching) on your behalf. From a practical perspective, this means that the machine code produced by a C compiler is standalone and can be called from any runtime without needing a special environment to be set up. This is what makes C such a good language for implementing garbage collection, stack checking, context switching, etc.

There are some exceptions to both of these principles. Auto-vectorizing compilers can produce large amounts of output from small amounts of input. Some C compilers do support stack checking (e.g. `-fstack-check`). Some implementations of C will perform garbage collection (e.g. Boehm, Fil-C). For dynamically linked executables, the PLT stubs will perform hash table lookups the first time you call a function. The point is that C makes it very possible to avoid all of these things, which has made it a great technology for programming close to the machine.

Some languages excel at one but not the other. Byte-code oriented languages generally do well at (1): for example, Java .class files are usually pretty lean, as the byte-code semantics are pretty close to the Java language. Go is also pretty good at (1). Languages like C++ or Rust are generally good at (2), but have much larger binaries on average than C thanks to generics, exceptions/panics, and other features. C is one of the few languages I've seen that does both (1) and (2) well.


Nicely put!

Haven't seen C's allure quite explained that way.


The rate of college attendance has increased dramatically in the last 250 years, and especially in the last 75.

In 1789 there were 1,000 enrolled college students total, in a country of 2.8M. In 2025, it is 19M students in a country of 340M. https://educationalpolicy.org/wp-content/uploads/2025/11/251...

In 1950, 5.5% of adults ages 25-34 had completed a 4 year college degree. In 2018, it was 39%. https://www.highereddatastories.com/2019/08/changes-in-educa...

With attendance increasing at this rate (not to mention the exploding costs of tuition), it seems possible that the methods need to change as well.


So now we have a lot more people who can teach and mark exams.


I’ll repeat what I said at that time: one of the benefits of the new design is that it’s less vulnerable to the whims of the optimizer: https://news.ycombinator.com/item?id=43322451

If getting the optimal code relies on a pile of heuristics going in your favor, you're more vulnerable to the possibility that someday the heuristics will go the other way. Tail duplication is what we want in this case, but it's possible that a future version of the compiler could decide that it's not desirable because of the increased code size.

With the new design, the Python interpreter can express the desired shape of the machine code more directly, leaving it less vulnerable to the whims of the optimizer.


Yeah, I believe that statement and it seems to hold true for MSVC as well. Thanks for your work inspiring all of this btw!


A long time ago I read that CadQuery has a fundamentally more powerful geometry kernel than OpenSCAD, so I dropped any attempt to try OpenSCAD.

Years later, I never actually got the hang of CadQuery, and I'm wondering if it was a mistake to write off OpenSCAD.

I am pretty new to CAD, so I don't actually know when I would run into OpenSCAD's limitations.


The notable limitations for OpenSCAD are:

- functional programming model --- some folks find the lack of traditional mutable variables limiting

- output is an STL, or a DXF using polylines

- native objects are spheres, cylinders, cubes, with functions for hull and Minkowski, so filleting and other traditional CAD operations can be difficult


> Hotspot is the choice for high performance programs. Approaching its performance even with C++ requires a dedicated team of experts.

It's very surprising to hear you say this, as it's so contrary to my experience.

From the smallest programs (Computer Language Benchmarks Game) to pretty big programs (web browsers), from low-level programs (OS kernels) to high-level programs (GUI Applications), from short-lived programs (command-line utilities) to long-lived programs (database servers), it's hard to think of a single segment where even average Java programs will out-perform average C, C++, or Rust programs.

I hadn't heard of QuestDB before, but it sounds like it's written in zero-GC Java using manual memory management. That's pretty unusual for Java, and would require a team of experts to pull off, I'd think. It also sounds like it drops to C++ and Rust for performance-critical tasks.


It's a statement of my experience in the performance achieved in practice by real developers who lack dedicated language support teams. And even the ones who enjoy dedicated language support teams. I could point to gRPC. gRPC-Java is slapping gRPC-C++ sideways. Why is that? Because when a codebase is increasingly complex, the C-style lifetime management becomes too difficult for developers to ponder, and they revert to relying on the slower features of the language platform, like reference counting smart pointers.

I think hybrid implementations, where a project enjoys the beneficial aspects of the language runtime at large but delegates small, critical functions to other languages, make sense. That keeps the C, C++, or Rust stuff contained to boundaries that are ponderable and doesn't let those language platforms dictate the overall architecture of the program.


If gRPC overhead is critical to your system, you've probably already lost the plot on performance in your overall architecture.

You make a fair point about smart pointers, and median "modern C++" practices with STL data structures are unimpressive performance-wise compared to tuned custom data structures, but I can't imagine that idiomatic Java with GC overhead on top is any better.

