
The problems that beset the Javascript ecosystem today are the same problems that beset the Unix ecosystem, back in the 90s when there still was one of those. TC39 plays the role now that OSF did then, standardizing good ideas and seeing them rolled out. That's why Promise is core now. But that process takes a long time and solutions from the "rough consensus and running code" period stick around, which is why instanceof Promise isn't enough of a test for things whose provenance you don't control.
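
(A sketch of the problem: for `await`, anything with a .then method counts as a promise, regardless of its constructor, so an identity check misses promises from other realms and from userland libraries like Bluebird. The names below are made up for illustration.)

    // Duck-typing a "thenable", roughly what robust checks do instead:
    function isThenable(value) {
      return value != null &&
        (typeof value === "object" || typeof value === "function") &&
        typeof value.then === "function";
    }

    // A hand-rolled thenable from the rough-consensus-and-running-code era:
    const legacy = { then(resolve) { resolve(42); } };
    console.log(legacy instanceof Promise); // false
    console.log(isThenable(legacy));        // true
    (async () => console.log(await legacy))(); // 42 - await unwraps it anyway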

Of course, such a situation can't last forever. If the idea is good enough, eventually someone will come along and, as Linux did to Unix, kill the parent and hollow out its corpse for a puppet, leaving the vestiges of the former ecosystem to carve out whatever insignificant niche they can. Now the major locus of incompatibility in the "Unix" world is in the differences between various distributions, and what of that isn't solved by distro packagers will be finally put to rest when systemd-packaged ships in 2024 amid a flurry of hot takes about the dangers of monoculture.

Bringing it back at last to the subject at hand, Deno appears to be trying to become the Linux of Javascript, through the innovative method of abandoning the concept of "package" entirely and just running code straight from wherever on the Internet it happens to live today. As a former-life devotee of Stack Overflow, I of course applaud this plan, and wish them all the luck they're certainly going to need.

The impetus behind "lol javascript trash amirite" channer takes today is exactly that behind the UNIX-Haters Handbook of yore. I have a printed copy of that, and it's still a fun occasional read. But those who enjoy "javascript trash lol" may do well to remember the Handbook authors' stated goal of burying worse-is-better Unix in favor of the even then senescent right-thing also-rans they favored, and to reflect on how well that played out for them.




And your example is why we have the "lol javascript trash amirite" chorus, because as you've noted these problems were solved decades ago. Yet for some reason, the JS and npm ecosystems always seem to have some dependency dustup once or twice a year.


Yes, that's largely my point. I'm not sure why it is surprising to see an ecosystem, twenty-five or so years younger than the one I compared it to, have the same problems as that one did twenty-five years or so ago.


In one of Robert "Uncle Bob" Martin's presentations you may find the answer. The number of developers duplicates each 5 years. That means that at any point in time, half of all developers have less than 5 years of experience. Add to that the fact that inexperienced developers are learning from other inexperienced developers, and you have the answer to why we repeat the same mistakes again and again.

I guess it is a matter of time before that reality changes; we will not keep doubling the number of developers indefinitely, and experience and good practices will accumulate.

Taking into account the circumstances, we are not doing so badly.


> The number of developers duplicates each 5 years

You probably mean "double" here, but the bottom line is that there is zero data to back up that claim.

He literally made up that number out of thin air to make his talk look more important.


Let's say it's 10 years, or make it 15 years, for the sake of the argument.

How does that change his original argument?


It should be fairly simple to look up people describing themselves as developers in the census data I think?


Does the census actually track that? I just did the questionnaire last night online and it didn't ask me anything about my occupation.

Or did you mean something other than the US Census (e.g. GitHub or Stack Overflow or LinkedIn profiles)?


The long form asks about your line of work. Most people get the short form.


No, I meant the US census (or whatever national census), I didn’t actually check if they asked that since it seemed like such a basic thing :/ sorry.


Not as easy as you might think: "developer" isn't particular to software, software developers go by lots of other near-equivalent titles, the set of titles in use changes over time, and many of them aren't unique to software, either.

OTOH, historical BLS data is easy to look up.


Do you have the source? Sounds like an interesting talk.


I found it! :) It has a lot of content and insights.

"Uncle" Bob Martin - "The Future of Programming"

https://www.youtube.com/watch?v=ecIWPzGEbFc


That's not the source, it's the claim.

There is zero evidence for his claim that the number of developers doubles every five years.


Off the top of my head, coding boot camps


Pardon me if I've misunderstood you, but I find this line of reasoning, which excuses modern Javascript's mistakes on the basis of its being a young language, to be spurious. We don't need to engineer new languages that recreate the mistakes of previous ones or, even worse, commit entirely new sins of their own. It's not as if no one saw the problems of the Node/JS ecosystem, or the problems of untyped languages, coming from a distance. Still, Node.js was created anyway. I would argue that it, along with many of its kindred technologies, has been a net deficit to the web ecosystem.


Okay, then, argue it.


That line of reasoning suggests progress isn't being made and we are just reliving the past.


There are multiple reasons for this failure mode, only some of them subject to social learning.

Part of the problem is a learning process, and indeed, I think the Javascript world should have learned some lessons - a lot of the mess was predictable, and predicted. Maybe next time.

But part of the problem is that we pick winners through competition. If we had a functional magic 8-ball, we'd know which [ecosystem/language/distro/OS/anything else] to back and save all the time, money and effort wasted on marketplace sorting. But unless you prefer a command economy, this is how something wins. "We" "picked" Linux this way, and it took a while.


It's also not a surprise to see a similar process of stabilization play out at a higher layer of the stack, as it previously did at a lower one. Neither is it cause for regret; this is how lasting foundations get built, especially in so young a field of endeavor as ours. "History doesn't repeat itself, but it often rhymes."


25 years is roughly one generation. A new generation grows up, has no memory of the old problems?

Same with Covid: SARS was roughly 20 years ago, and people forgot it happened.


It ain't surprising, but rather just disappointing, that an ecosystem can't or won't learn from the trials and tribulations of other ecosystems.

EDIT: also, Node's more than a decade old at this point, so it is at least a little bit surprising that the ecosystem is still experiencing these sorts of issues.


Is it really though? Node is infamous for attracting large groups of people with notoriously misguided engineering practices whose egos far surpass their experience and knowledge.

I've been stuck using it for about 4 years and it makes me literally hate computers and programming. Everything is so outrageously bad and wrapped in smarmy self congratulating bullshit. It's just so staggeringly terrible...

So these kinds of catastrophes every few months, for bullshit reasons, seem kind of obvious and expected, don't they?


NIH Syndrome is a double-edged sword that persists regardless of innovations.


This analogy doesn't hold up at all.

The UHH is a fun read, yes, but the biggest real-world problem with the Unix Wars was cross-compatibility. Your Sun code didn't run on Irix didn't run on BSD and god help you if a customer wanted Xenix. OK, you can draw some parallel here between React vs. Vue vs. Zeit vs. whatever.

But there was also the possibility, for non-software businesses, to pick a platform and stick to it. You run Sun, buy Sun machines, etc. That it was "Unix" didn't matter except to the software business selling you stuff, or what kind of timelines your in-house developers gave.

There is no equivalent in the JS world. If you pick React, you're not getting hurt because Vue and React are incompatible, you're getting hurt because the React shit breaks and churns. Every JavaScript community and subcommunity has the same problem, they keep punching themselves in the face, for reasons entirely unrelated to what their "competitors" are doing. Part of this is because the substrate itself is not good at all (way worse than Unix), part is community norms, and part is the piles of VC money that caused people to hop jobs and start greenfield projects every three months for 10 years rather than face any consequences of technical decisions.

Whatever eventually hollows out the mess of JS tech will be whatever figures out how to offer a stable developer experience across multiple years without ossifying. (And it can't also happen until the free money is gone, which maybe has finally come.)


"Pick React and stick to it" is the exact parallel to your "pick Sun and stick to it". Were you not there to see how often SunOS and Solaris updates broke things, too? But those updates were largely optional, and so are these. If you prefer React 15's class-based component model, you can pin the version and stick with it. You won't have access to new capabilities that rely on React 16 et cetera, but that's a tradeoff you can choose to make if it's worth your while to do so. You can go the other way if you want, too. The same holds true for other frameworks, if you use a framework at all. (You probably should, but if you can make a go of it starting from the Lions Book, then hey, have a blast.)

I agree that VC money is ultimately poison to the ecosystem and the industry, but that's a larger problem, and I could even argue that it's one which wouldn't affect JS at all if JS weren't fundamentally a good tool.

(To your edit: granted, and React, maybe, and imo ideally, plus TypeScript, looks best situated to be on top when the whole thing shakes out, which I agree may be very soon. The framework-a-week style of a lot of JS devs does indeed seem hard to sustain outside an environment with ample free money floating around to waste, and React is both easy for an experienced dev to start with and supported by a strong ecosystem. Yes, led by Facebook, which I hate, but if we're going to end up with one de facto standard for the next ten years or so, TS/React looks less bad than all the other players at hand right now.)


> React is both easy for an experienced dev to start with and supported by a strong ecosystem.

I wouldn't say getting started with ReactJS is easy (or that it's properly supported). Each team that uses React within the same company follows a different philosophy (reflected in the design), and sometimes these flavors shift over time within the same team. We're back to singular "wizards" who dictate how software is to be built while everyone else tinkers. It's a few steps from custom JS frameworks.


    The UHH is a fun read, yes, but the biggest real-world
    problem with the Unix Wars was cross-compatibility. 
    Your Sun code didn't run on Irix didn't run on BSD 
    and god help you if a customer wanted Xenix. 
    OK, you can draw some parallel here between 
    React vs. Vue vs. Zeit vs. whatever.
    
    But

You made your point, proved yourself wrong, and then went ahead ignoring the fact that you proved yourself wrong.


>The UHH is a fun read, yes, but the biggest real-world problem with the Unix Wars was cross-compatibility. Your Sun code didn't run on Irix didn't run on BSD and god help you if a customer wanted Xenix. OK, you can draw some parallel here between React vs. Vue vs. Zeit vs. whatever

POSIX is a set of IEEE standards that have been around in one form or another since the 80s, maybe JavaScript could follow Unix's path there.


The existence of such a standard doesn't automatically guarantee compliance. There are plenty of APIs outside the scope of POSIX, plenty of places where POSIX has badly underspecified behavior, and even then, the compliance test suite doesn't test all of the rules, so you still get tons of incompatibilities.

POSIX was, for the most part, not a major success. The sheer dominance of Linux monoculture makes that easy to forget, though.


Of course it doesn't guarantee compliance, but like all standards it makes interop possible in a predictable way, e.g. some tcsh scripts run fine under bash, but that's not by design. The inability or unwillingness of concerned parties to adopt the standard is a separate problem. This is why "posixly" is an adverb with meaning here.


This is slightly off-tangent, but as someone who has written production software on the front-end (small part of what I do/have done) in:

Vanilla -> jQuery -> Angular.js -> Angular 2+ -> React (before Redux existed) -> modern React -> Vue (and hobby apps in Svelte + a bunch of random stuff: Mithril, Hyperapp, etc.)

I have something to say on the topic of:

> "If you pick React, you're not getting hurt because Vue and React are incompatible, you're getting hurt because the React shit breaks and churns."

I find the fact that front-end has a fragmented ecosystem due to different frameworks completely absurd. We have Webcomponents, which are framework-agnostic and will run in vanilla JS/HTML and nobody bothers to use them.

Most frameworks support compiling components to Webcomponents out-of-the-box (React excepted, big surprise).

https://angular.io/guide/elements

https://cli.vuejs.org/guide/build-targets.html#web-component

https://svelte.dev/docs#Custom_element_API

If you are the author of a major UI component (or a library of components), why would you purposefully choose to restrict your package to your framework's ecosystem? The amount of work it takes to publish a component that works in a static index.html page, with your UI component loaded through a <script> tag, is trivial for most frameworks.
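
(For illustration, a minimal hand-written Webcomponent, a sketch with a made-up tag name, that a plain index.html or any framework can consume:)

    // fancy-greeting.js - a framework-agnostic custom element (sketch)
    class FancyGreeting extends HTMLElement {
      static get observedAttributes() { return ["name"]; }
      connectedCallback() { this.render(); }
      attributeChangedCallback() { this.render(); }
      render() {
        this.textContent = "Hello, " + (this.getAttribute("name") || "world") + "!";
      }
    }
    customElements.define("fancy-greeting", FancyGreeting);

    // Usage in a static page, no framework anywhere:
    //   <script type="module" src="./fancy-greeting.js"></script>
    //   <fancy-greeting name="HN"></fancy-greeting>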

I can't tell people how to live their lives, and not to be a choosy beggar, but if you build great tooling, don't you want as many people to be able to use it as possible?

Frameworks don't have to be a limiting factor, we have a spec for agnostic UI components that are interoperable, just nobody bothers to use them and it's infuriating.

You shouldn't have to hope that the person who built the best "Component for X" did it in your framework of choice (which will probably not be around in 2-3 years anyway, or will have changed so much it doesn't run anymore unless updated).

---

Footnote: The Ionic team built a framework for the singular purpose of making framework-agnostic UI elements that work with everything, and it's actually pretty cool. It's primarily used for design systems in larger organizations and cross-framework components. They list Apple, Microsoft, and Amazon as some of the people using it in production:

https://stenciljs.com/


No one uses them because SSR is either non-existent or clunky with them.

Ignoring a common use case when inventing something is a good way to get your shit ignored in turn. Which is what happened.


Web components aren't really there yet. They will be two or three years from now. Some time between now and then, I expect React will gain the ability to compile down to them, which shouldn't be too hard since web components are pretty much what happens when the React model gets pulled into core.


You can compile React to Webcomponents with community tooling; the core framework just doesn't support them:

https://github.com/adobe/react-webcomponent

By "aren't really there yet", what do you mean? If you mean in a sense of public adoption and awareness, totally agree.

If you mean that they don't work properly, heartily disagree. They function just as well as custom components in any framework, without the problem of being vendor-locked.

You may not be able to dig into the internals of the component as deeply as you would a custom-built one in your framework of choice, but that's largely the same as using any pre-built UI component: you get access to whatever API the author decides to surface for interacting with it.

A properly built Webcomponent is generally indistinguishable from any other pre-built UI component consumed in any other framework (Ionic built a multi-million dollar business off of this alone, plus a purpose-built framework for it).


Very unlikely. Web components and React are trying to solve different problems, and the React team has repeatedly said this isn't going to happen.


> nobody bothers to use them

Here's the sad but unavoidable truth: the main purpose of Javascript currently is to keep Javascript developers employed.


Spoken like someone who's never seen what people perpetrate in, say, Java.


> Deno appears to be trying to become the Linux of Javascript

Deno always sounded more like "the Plan 9 of Javascript" to me, to be honest. It seems to be better (yay for built-in TypeScript support! Though I have my reservations about the permission management, but that's another discussion), but perhaps not better enough (at least just yet) to significantly gain traction.


The permissions management is a little tricky to think about at first, but once you get the hang of it, I think it's actually quite nice. Setting strict permissions on CLI tools helps to ensure that the CLI isn't doing anything nefarious when you're not looking (like sending telemetry data). Since this CLI has --allow-run, I can also have it execute a bin/server script that _does_ have network and read/write permissions, but only in the current app directory.


The problem I saw was how quickly you need to open up the permissions floodgates. I saw them live-demo a simple http server, and to do something as basic as that you need to open up full file system and network access. So if you’re doing anything like setting up a server (i.e. one of the core things one does when using a server-side scripting language), you’re back to square 1.


Ah never mind, I see they now have finer grained scopes. That should help.
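
(A sketch of what the scoped flags look like around 1.0; the std version and API are as of this writing, so worth double-checking:)

    // server.ts - a minimal Deno HTTP server
    import { serve } from "https://deno.land/std@0.50.0/http/server.ts";

    const s = serve({ port: 8000 });
    for await (const req of s) {
      req.respond({ body: "hello\n" });
    }

    // Grant network access for that one address only, nothing else:
    //   deno run --allow-net=0.0.0.0:8000 server.ts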


Deno was always Typescript-first fwiw


I have doubts about how this could possibly work. The idea is that you pull a .ts file directly, right? Then your local TS-in-Deno compiles that to extract typedefs for intellisense etc. and to emit the JS. What happens when it was created for a different version of TypeScript than the one you're running? Or if it was created targeting different flags than what you're using? This will cause lots of problems:

I'm running my project with TS 3.6. A library upgrades to 3.7 and adds optional chaining operators. Now my package is broken (sketch below). In Node land, you compile the TS down to a common target before distributing, so you don't have this problem.

Similarly, I'm using 3.8 and a package upgrades to 3.9 and starts using some new built-in types that aren't present in my TS. Now my package is broken. Previously you'd export a .d.ts targeting a specific version and again not have this problem.

Or, I want to upgrade to 3.9 but it adds some validations that cause my dependencies to not typecheck, now what?

Or, I’m using strictNullChecks. Dependent package isn’t. Trying to extract types now throws.
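
(A sketch of the first failure mode, with a hypothetical library file: TS 3.7 syntax that a TS 3.6 consumer can't even parse.)

    // lib.ts - hypothetical library source, fine under TS 3.7+
    export function getPort(config?: { server?: { port?: number } }): number {
      // `?.` and `??` are new syntax in TS 3.7. A TS 3.6 consumer importing
      // this .ts file directly hits a parse error, whereas a precompiled
      // .js + .d.ts pair would have downleveled the syntax away.
      return config?.server?.port ?? 8080;
    }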

I've brought all of these (and many other concerns) up with the Deno folks on numerous occasions and never gotten an answer more concrete than "we'll figure out what to do here eventually". Now 1.0 is coming, and I'm not sure they've solved any of these problems.


> I'm running my project with TS 3.6. A library upgrades to 3.7 and adds optional chaining operators. Now my package is broken.

Isn't this similar to not upgrading node and using an updated version of an npm package that calls a new function added to the standard library? All npm packages have a minimum node version, and similarly all deno code has a minimum deno version. Both use lockfiles to ensure your dependencies don't update unexpectedly.

> Or, I’m using strictNullChecks. Dependent package isn’t.

This definitely sounds like a potential problem. Because Deno enables all strict checks by default, hopefully library authors will refrain from disabling them.


Node updates much less frequently than TS, so even if it was a problem before, it’s more of a problem now.



Rephrase: people use new TS features much more often than they use new Node features.


That might be true in general, but I seem to run into problems with the two with about equal frequency. One of the recent ones I ran into with node was stable array sort.


Yes, npm package maintainers spend a lot of time on node version compatibility. Here is a quote from prettier on their recent v2 release:

> The main focus should be dropping support for unsupported Node.js versions.

https://github.com/prettier/prettier/issues/6888


On the other hand, trying to set up a TypeScript monorepo with shared/dependent projects is a huge pain, since everything needs to be transpiled to intermediate JS, which severely limits or breaks tooling.

Even TS project references make assumptions about the contents of package.json (such as the entry file), and the VS Code compiler service preloads types from @types/ better than types from your own referenced projects, which sadly ties TS to that particular ecosystem.

Language version compatibility is a good point, but perhaps TSC could respect the compiler version and flags of each package's tsconfig.json, and ensure compatibility for minor versions of the language?

Since I enjoy working in TS I'm willing to wait it out as well, the pros far outweigh the cons. Now that GitHub/MS acquired NPM, I have hopes that it will pave the way to make TS a first-class citizen, though I don't know if Deno will be part of the solution or not.


> TSC could respect the compiler version and flags of each package's tsconfig.json

That’s the problem - there is no tsconfig.json. You’re only importing a single URI.


I see. While I don't know the details, it seems it would promote the use of "entry/barrel" files once again.


> running code straight from wherever on the Internet it happens to live today.

This, exactly this. Young me thought this was the point of the whole thingy we call the Internet.

And exactly that is what I like about QML from Qt. Just point to a file and that's it.


Go tried it; it went over like a lead balloon. Theory: lead balloons don't fly anywhere.


How is it a lead balloon? Go got super popular in the period before /vendor and dep (later modules). Yes, people wanted and got versions too, but the URL part stayed. ISTM they had a Pareto-optimal 20% piece of the puzzle solved and bought themselves time to solve the other 80% years later.


Go still identifies packages by URL. The recent modules feature just added the equivalent of lockfiles like npm, yarn, cargo, etc. It also added some unrelated goodies like being able to work outside of $GOPATH.


> Deno appears to be trying to become the Linux of Javascript, through the innovative method of abandoning the concept of "package" entirely and just running code straight from wherever on the Internet it happens to live today.

I really like Deno for this reason. Importing modules via URL is such a good idea, and apparently it even works in modern browsers with `<script type="module">`. We finally have a "one true way" to manage packages in JavaScript, no matter where it's being executed, without a centralized package repository to boot.
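
(A sketch with a hypothetical URL; the same import line works in a browser module script and in Deno:)

    // main.js - loaded via <script type="module" src="main.js">
    // `greet` and the URL are made up; any server of ES modules will do.
    import { greet } from "https://example.com/lib/mod.js";

    greet("world");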


Then again, this broke a package that, by its very nature, isn't running in production. And the problem was solved within three hours.

So I'm not sure how much everything-used-to-be-great-nostalgia is justified here.


Someone rolls out code where a serious bug fell through QA cracks, and appears to be breaking a mission-critical path. Your biggest client is on the phone screaming FIX IT NOW. Three hours is an eternity.


Screaming "FIX IT NOW" because bootstrapping a new React app isn't working? Who, what, when, where?!


You roll back one version. Problem is fixed in thirty seconds.


Let's add: it appears to be breaking a mission-critical path that also slipped through the cracks in QA. Mistakes happen; run CI/CD before it reaches the mission-critical path.


My development environment is my production environment.


F


I think you missed my point, so let me clarify: if your job is to develop software, then your computer is your production environment. It's where you run your production - your development. This is hopefully separate from where your customers run your software.


Only as much as it ever is. That's why I'm making fun of it.


I remember the beginning of React (before Webpack), when server-side compilation looked fine and the magic worked with just <script>react.js</script> in the browser. It looked like a new era in which HTML was fixed. But no, we have 15 standards now. It was over for me when I found a 3-line Webpack module with a 20-line README description. We have 1000 modules, and 1000 weak points to go with them. React has 1000x overhead.

Any package and package manager has pain points:

- no standards, API connection issues (different programming styles and connection overhead)

- minor version issues (just like this 1-hour 0-day bug)

- major SDK issues (iOS deprecating OpenGL)

- source package differences (Ubuntu/CentOS/QubesOS each need different magic to use the same packages)

- overhead by default everywhere, which produces multiple issues



