I had a similar jarring experience moving from C# dev to web apps.
I knew I wanted to use building blocks available via npm.
I went to download npm, and found the way you get npm is via installing node.js
Node.js and the package manager are inextricably tied together. Node.js provides a local JS runtime environment for, e.g., dev-time packages to execute in, and it's also the API for interacting with the filesystem. Somewhere down there, yes, Node.js is a web server, but it's really 'the' JavaScript execution environment to build local packages ("plugins"?) against.
The whole ecosystem is built up on javascript. If you're running grunt or gulp or webpack, it's all javascript.
When you install packages globally they're put in a global folder.
There are exceptions (I'm looking at you, Cypress) that ship native binaries instead of js, but by and large it's js, and it's stored locally. There are some pretty simple conventions (e.g. the 'node_modules' folder and the hidden '.bin' folder inside it).
When running npm it feels like passing a one-liner javascript command into node's javascript repl/environment.
The .bin commands are ambiently available.
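Those conventions can be sketched in a few shell commands (the directory and tool name here are made up for illustration):

```shell
# Sketch of the node_modules/.bin convention. "demo" and "mytool"
# are illustrative names, not real packages.
mkdir -p demo/node_modules/.bin

# A package's CLI entry point is just an executable script that npm
# symlinks (or copies) into .bin at install time:
printf '#!/bin/sh\necho "hello from local tool"\n' > demo/node_modules/.bin/mytool
chmod +x demo/node_modules/.bin/mytool

# npm run (and npx) prepend this folder to PATH, which is why the
# commands feel "ambiently available" inside package scripts:
PATH="$PWD/demo/node_modules/.bin:$PATH" mytool
```

This is all the "magic" there is: local, per-project executables found via a well-known folder on PATH.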
It makes a lot of sense that it's grown organically... why there haven't been efforts to separate the constituent parts, I don't know. At some point, maintaining the 'working thing', with all its complexity, is easier than following a more rigorous, provable set of tools.
I think it's a culture thing. Probably the same reason most problems are fixed by deleting node_modules and re-running `npm i`.
NPM was born out of Bundler for Ruby. Node doesn't require you to use npm, but they are bundled together now. There are other popular package managers, like Yarn.
Hmm, slightly disappointed. I thought I'd make a mashup video game short with generated art.
The output resolution is locked at 512x512.
The "target style images" seem to be locked to that handful that come with the application.
The brush materials don't include anything man-made.
I really like this project, especially when going through the docs.
Nearly all the abstractions are very clean in their approach, the only outlier being the hack of using `#each` `/each` comments to indicate looping. I suppose the HTML specs don't have anything for this yet, so one has to adapt in some way.
I like the docs because, having come from a different dev environment, I see web/html as a vast, sprawling landscape of surface area to learn, littered with defunct, old, broken, and janky throwaway bits everywhere. The documentation implicitly points out the core path: what's needed to understand how to bind and work with data in a web page.
Bravo!
I'm convinced this clarity comes from you distilling down what exactly _are_ the core pieces, and only implementing them first.
some rambling questions:
I wonder how this (synergyjs) works for a webapp type situation? Would one extract .js scripts as modules to reuse custom components? or is that trying to bring this into a more complex setup than is the sweet spot for this lib? e.g. is this better suited for a page at a time?
Also, maybe I'm dense, but does this approach lend itself to single page apps? My understanding is it shines when you have single pages with some logic behind them. If you need to jump to another page, the "passed data" would be encoded in the url, yes?
(I seem to recall the html 5 standards were looking to add something (was it 'proxy'?) that was something like frames, but involved web components, and maybe even passing of data through slots... is that a thing?)
RE: Would one extract .js scripts as modules to reuse custom components? Yes, you can certainly do that. The define function supports either template element nodes or strings, so you can work directly in an HTML document (the web will be getting HTML imports before too long) or do it all in JS and publish to NPM.
Hi mjgoeke, an update RE the "hack of using #each", synergy is now using <template each="item in items"> since v2 (https://synergyjs.org/repeated-blocks). Much less hacky, and has also saved a few more bytes from the overall package size.
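In template form that reads something like this (only the `each="item in items"` attribute is confirmed above; the surrounding list markup and the `{{ item }}` interpolation are my assumptions from the linked docs):

```html
<!-- Sketch of the v2 repeated-block syntax. The <ul>/<li> wrapper and
     the {{ item }} interpolation are assumptions; each="item in items"
     is from the comment above. -->
<ul>
  <template each="item in items">
    <li>{{ item }}</li>
  </template>
</ul>
```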
Thank you, much appreciated. Yes, there are proposals for native templating, but I suspect it will take some time to agree on such things. I opted for comments in the end mainly to support multiple top-level nodes, but they also prove helpful for place-finding when inspecting the DOM.
I gotta say, my heart sank when I saw the headline. I totally thought ddg finally needed income and was running generic non-targeted ads at the top of their page!
Something is off. The matched colors each show an RGB value, but it's the same value even when the colors are obviously different.
I took a screenshot of it and pulled it up in an image editor to make sure.
> Many readers also fail to see how asymmetrical any debate on this topic is. Whatever I say at this point, no matter how scientifically careful, appears to convey an interest in establishing the truth of racial differences (which I do not have and have criticized in others). Does it matter that Stephen J. Gould’s The Mismeasure of Man was debunked long ago, or that James Flynn now acknowledges that his eponymous effect cannot account for the race-IQ data? No, it doesn’t. This is a moral panic and a no-win situation (and Klein and my other “critics” know that). I did not have Charles Murray on my podcast because I was interested in intelligence differences across races. I had him on in an attempt to correct what I perceived to be a terrible injustice done to an honest scholar. Having attempted that, for better or worse, I will now move on to other topics.
He later, at the request of his audience, did a 2 hour talk with Klein. This is all (the quote above, the links) included in the page linked in parent.
You can draw your own conclusions.