
I don't know about others but I tire of these comments chiding the "out of control" front-end ecosystem. The situation has already changed years ago. We aren't going to get off of your lawn while you continue to beat a dead horse. The web is built on open standards and there are billions of pages and apps out there. To expect web development to be a perfect monoculture is a failure of your imagination and ability to adapt rather than a failure of the ecosystem.



I get the Javascript fatigue fatigue part, but this comment breaks the HN guideline against name-calling in arguments, and even crosses into incivility:

> We aren't going to get off of your lawn while you continue to beat a dead horse.

> a failure of your imagination and ability to adapt

Please don't do this in HN comments regardless of how tired you are.


Fair enough, though I don't see any name calling here. I will try to be more civil in future comments.


Not the OP, but I expect the web to be a thing where documents (i.e. mainly text) don't have any issues rendering on my 5-year-old phone or on my 8-year-old laptop (both of which work very well, still, and which I don't plan to replace anytime soon).

The recent web practices (I'm talking mostly about the last 2-3 years, since more and more people started copying Twitter after it launched its "every website needs to be a JS-app" thingie) have practically destroyed how most websites are displayed on the 2 devices I own. Sites like Hacker News or Wikipedia, which still work perfectly fine, are few and far between. I sincerely deplore this.


A web limited to textual content is a pretty quaint and uninspiring vision, IMO.

The web started that way because static documents are relatively easy to represent, but the future (and present, for that matter) is rich experiences that can be distributed in the same way.

But there are many steps left to get there, so either buckle up and help build that future, or get used to an ever-shrinking ecosystem of the purely textual Web.

I'm personally very excited by the convergence of mobile and web development, PWAs, improved functionality for layouts, self-hosted app architectures (like Tent and Sandstorm), a language-agnostic runtime via WebAssembly, better serving performance via HTTP/2, low-level multimedia APIs, encapsulation of rich UX via Web Components, and so on and so on.

Sure, it's bewildering right now, but in the future, this will all be knit together in a cohesive way.


My opinion happens to differ. Textual content is incredibly rich, and it is likely the target representation for all content in the future. Consider: we are literally training models to accurately caption things today.

Why? Because language is a hell of an abstraction. More, it is approachable and can often be bounded. Consider: I am not even remotely skilled in art. However, I can adequately describe a picture, at least for my purposes, by saying who is in it and what the event is. Try doing the same without that often "quaint and uninspiring" textual language.

Do I pine for what we had in the 90s? Not necessarily. But I do grow weary of folks reinventing frameworks that merely further complicate the very abstractions they created and that now require these frameworks.


> Sure, it's bewildering right now, but in the future, this will all be knit together in a cohesive way.

You could have said those same words literally a decade ago, when we were all struggling to figure out all this Rails and MVC stuff and how automated ORMs worked on top of MySQL. Seven or eight paradigm shifts later, we're all still confused, and I see no cohesive knitting.


Do you want to bet on that last one?


Sure, but in the meantime the web is a much less accessible place for screen readers, underpowered mobile devices, and those without good internet connections.


> Sure, it's bewildering right now, but in the future, this will all be knit together in a cohesive way.

You'll get over that optimism when 20 years pass and it's a worse mess than it is now; that's what's going to happen.


The fact is that 99% of what's done on the web is (mainly textual) document management/display. Facebook (the website responsible for React!) is 99% document display.

The truth is that "rich experiences" are mostly made by enhancing the basic experience of reading rich documents. If you want an inspiring vision of a document-centered future, look at IndieWeb.org. And/or roca-style.org. It's a vision of how the web can be knit together into a cohesive whole based on URLs, rather than segmented into a bunch of unaddressable silos.


About self-hosted app architectures: Cloudron is really good in terms of stability. I just got a newsletter from them saying they are hitting 1.0 soon.


You could say the same thing about flying cars. A car with four wheels is a pretty quaint and uninspiring vision, IMO. However it happens to be infinitely more practical.


It is what has worked for almost 5 thousand years, so I am going to go ahead and say it is not quaint, it's not uninspiring, and when you say "in the future it will all be knit together in a cohesive way" I laugh heartily. It never has been, and it won't be.


And I would love if my 2001 Honda Accord was compatible with Tesla's autopilot, but I understand it is not a realistic expectation.

I'm not sure why you'd expect the web to be (a) mostly text and (b) able to render easily on obsolete devices.

The web is becoming a robust application delivery platform. That is so, so awesome. Most people do not want to be stuck with shitty-looking, text-only websites. Moving the platform forward necessitates that it will use more resources. Increased resource availability and consumption over time is fairly consistent across most aspects of consumer computing.


The point isn't that the web should be mostly text, but that it shouldn't be composed of layers and layers of unnecessary fiddle-faddle that add nothing useful to the end-user's experience.

If you can convey what you're trying to convey with a JS-less (or even just JS-lite) 2-page website, then don't build a monolithic, scroll-hijacking, background-parallaxing fireworks show of a website, tied together with the most tenuous of JavaScript libraries.

I'm all for the web as an application delivery platform, but not every website, or application, needs so much bulk.


That's not a problem with the ecosystem though. That's a problem with bad developers. The same is true for any technology and programming language.


It may be robust, but the user experience still sucks. JS-heavy websites are unresponsive and turn my 3-year-old MBP into a vacuum cleaner. Facebook is the best example. I can barely load the site without the fans spinning. Firefox can't handle it at all. It is unusable for me.


That is crazy. I use Facebook on a 5-year-old MBP with no problems. I have Adblock Plus and uBlock, but even without them, my computer can handle Facebook just fine.


uBlock Origin is the only ad blocker you need. Adblock Plus does the same thing but is less efficient and lets some ads through by default, and the original uBlock is abandoned.

Don't run two ad blockers; they will just use more resources for no benefit.


Something is wrong with your MBP.


It isn't just him. Facebook slowed down for whatever reason on my PC too over the last half-year.

4.5 GHz 4670K, 16 GB of tight-timings RAM.


Might be more Firefox than your MacBook or the web. I've noticed (while developing an extension) that Firefox feels noticeably more sluggish than Chrome. Safari somehow feels even faster than Chrome, but I'm too tied to Chrome's extension ecosystem to switch.


I have the same experience with Firefox on OSX. I noticed that it isn't an issue with the dev version of Firefox with multiprocess turned on.


Firefox is definitely partly to blame. Chrome does work better and Safari is probably the best of the lot.


Safari is definitely faster than Chrome on Mac.


It also uses significantly less power.


Alas, I dream of the day when Safari adopts WebExtensions.


I'm using Chrome, but I tend to agree. A particularly painful area of Facebook is Buy/Sell pages. It slows to a crawl if you scroll through too many listings, even on my 6-core X99 system.


This experience is not unique to the web, from what I've seen. Apps that aren't well made can easily drop frames on a 3 year old iPhone.


Ubuntu-on-Chromebook user reporting. I can't complain much about the latest Chrome, even with my 2 GB of RAM and much less powerful processor.


> Increased resources availability and consumption over time is fairly consistent across most aspects consumer computing.

Worth remembering this is the case for people interested in tech. The local library still runs Vista on a 10-year-old system. My parents will use Android 2.x until the phone does not turn on anymore. Bandwidth upgrades don't apply to many people living outside of towns. Etc. We reached the point long ago where an average person shouldn't need more bandwidth and power just to look up information online.

And BTW, you can have beautiful text-only websites. These 2 properties are not related.


> The web is becoming a robust application delivery platform.

This was our mistake. I don't want garbage shoved in my face. I want to read. And that's it.


Amen! React might be the best tool for building hybrid mobile or even cross-platform Electron apps, but the truth is, it sucks balls for web development.

Bundling the entire website inside a single JS file that needs to be re-downloaded every time you add a div or change a single CSS class is stupid, sorry.

Your website doesn't need to be pure text. It can be beautiful, modern, and responsive. And it doesn't need much JS for that.

The world has become mobile-first, not JS-centric. Pushing SPAs everywhere is just wrong.


React is great for web development (using Next.js) based on my recent experience.

Citing only one side of any architectural trade-off isn't particularly interesting either. The other side is that we can now easily build sites using Universal JS (server-side rendering combined with an SPA).
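
For the unfamiliar, here's a minimal sketch of what a Universal JS page can look like in Next.js; the /api/posts endpoint and its shape are made up for illustration:

    // pages/index.js: a minimal Universal JS sketch (hypothetical API).
    import React from 'react'
    import fetch from 'isomorphic-unfetch'

    export default class Home extends React.Component {
      // Runs on the server for the initial request, then on the client
      // during SPA-style navigation; the same code covers both cases.
      static async getInitialProps () {
        const res = await fetch('https://example.com/api/posts')
        return { posts: await res.json() }
      }

      render () {
        return (
          <ul>
            {this.props.posts.map(p => <li key={p.id}>{p.title}</li>)}
          </ul>
        )
      }
    }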

Delivering a website with seemingly instantaneous repaints even on flaky internet connections is just a superior end-user experience.

Just click through the primary nav of a Universal JS site vs an old-school one, and it feels like we've been putting up with the equivalent of the old 300 ms tap delay, but for all websites and website links.

Not engineering away that latency penalty will come to look foolish for web properties that want to remain competitive.

Users will become increasingly accustomed to very responsive sites and the latency penalty that is currently normalised will become more glaring and unacceptable.


What has been your experience with server-side rendering? We are very concerned about SEO. Have you seen any impact of using Next.js on SEO performance?


We have a number of Next sites in development at the moment, but none in production (soon!).

SEO shouldn't be a problem, especially as the initial page is serverside rendered.

The only slight complexity is in returning sitemap.xml, I believe, which requires a little bit more configuration currently. If you search the GitHub repo for 'SEO' you should find some tickets (open and/or closed) that discuss this.


I'm with you, but it's just part of the growing pains. The web is successful because it's easy to layer businesses on top of each other... I can have a Tumblr with a Shopify backend and Google AdWords on top. Apps are walled gardens, so they can only enforce one business model at a time. That can make things nice and tidy, but it walls you out of the messy, compositional business models of the web.

Because business models are composed on the web, it's just harder to settle on unified design standards. It takes time for everyone to agree on a separation of responsibilities on the page. This is compounded by the sheer newness of the business models. My webcomic about fancy cheeses has a segment marketing a line of tractors now to industrial buyers at a conference in Wisconsin this week? OK. That's an opportunity I probably wouldn't have had selling Java apps.


Well, what's unique about HN and Wikipedia? They're largely non-monetized. If BuzzFeed can make more money off of a flashy website, it's hard to argue with.


HN doesn't do much, but Wikipedia does a decent amount of stuff with JS on its website, despite people taking it as an example of "the web as documents".

Meanwhile, I don't know how FB or Twitter would be nice user experiences if you operated on a "paginate 20 tweets at a time" philosophy.


I don't think it's either/or. You can make a "flashy" site (in the sense of BuzzFeed) without it being burdened by huge amounts of JS. Likewise, monetized sites frequently still aren't like Wikipedia or HN.


Ten years ago MapQuest was state of the art, and you'd click "next tile" over and over to traverse a map. Then Google Maps showed up with its voodoo ajax and infinite scroll and ushered in the modern web app era. Sure, some folks overdo it, but I'm not going back.


That is a vision that is incompatible with the reality of computing from the past 10 years.


I would love HN to provide an easy way to see where the reply thread to this first comment ends. Some JS might enable that. Basic HTML/CSS, I think not.


It's that [-] button next to the username above each comment. It collapses all the cascading comments. It is enabled via JS. If you disable JS, then it disappears.
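
For anyone curious, the mechanics are simple. A rough sketch; HN's real markup differs, and the data-depth attribute here is made up:

    // Rough sketch of a [-] toggle: hide every following comment that
    // is nested more deeply than the one being collapsed.
    // Assumes each comment row carries a hypothetical data-depth attribute.
    function collapseThread (comment) {
      var depth = Number(comment.dataset.depth)
      var next = comment.nextElementSibling
      while (next && Number(next.dataset.depth) > depth) {
        next.style.display = 'none'
        next = next.nextElementSibling
      }
    }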


I suspect you could hack a similar collapsing thread UI together with styled radio buttons and no JS, if you really wanted to prove a point.


Thanks, I didn't see it


But I seldom hear people saying the old web pages can't be shown in today's browsers.


Huh? Who is talking about monoculture? OP certainly isn't, and I don't think anyone else is seriously suggesting such a thing.

It's not unreasonable to expect that the ecosystem doesn't operate under an ever-expanding set of wrappers, frameworks, layers, and models, and it's certainly not unreasonable to expect that our tools don't suck.

The open standards you talk about ARE already a part of the ecosystem, and well-established: HTML, HTTP, DOM, and ECMAScript. The new JS library of the week is not a part of that.


First, let's be clear here: React was first open sourced in 2013, after being used extensively at Facebook before that. Sure, it's no Win32 API, but it's not exactly the new kid on the block.

But the point of the web already having these standards is exactly the point - these frameworks are built on top of and contribute to the underlying universal standards, so what's the problem?


There's this cargo cult complaint where people talk about 'layers and layers of bad abstractions' while rarely offering a superior solution or describing exactly which abstractions are bad and why they are bad. Mostly just hemming and hawing about things being "bad" and "overcomplicated".

It's basically the same instinct as complaining about code written by a previous dev: because you didn't write it, you weren't there to weigh the very real tradeoffs that always have to be made when actually implementing something, so it's easy to just complain about the previous person's implementation.

I never know which abstraction people are complaining about. Is it HTTP? HTML? The DOM? JavaScript? Wrapping libs like jQuery? View frameworks like React? Routing frameworks? Complete frameworks like Angular? And what's the solution? Get rid of all of those things? Use different UI abstractions? Get rid of JavaScript? (Oh wait, you can already.) The truth is, it's just easier to complain than it is to implement, so that's what people do.


Completely agree here. I think many people realize this and have built awesome tooling to be able to manage different ecosystems and environments.

One that I've been particularly keen on over the last year is GraphQL. If you're interested in simplifying frontend development while giving it more power and control to dictate the data it needs for each component/view, you should check out GraphQL. I know what you're thinking... "Ugh, another dev tool to learn, etc. etc." But it's definitely one that's here to stay, and a bigger idea than even REST itself. The OPEN STANDARD of it is KEY.

The idea is that you have a type system at the API level and one endpoint. Querying works using the standard GraphQL language, so it's cross-API compatible: you don't need to learn a whole new API convention at each new place or project you work on, whereas with REST they're probably all built in different ways. And GraphQL works across all platforms (i.e. no more proprietary SDKs). All you need are clients like Relay or Apollo, and that's it.
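
To make that concrete, here's a minimal sketch of querying a GraphQL endpoint straight from the browser; the schema (user, posts) is hypothetical:

    // One endpoint, one standard query language, no proprietary SDK.
    const query = `
      query {
        user(id: "42") {
          name
          posts { title }
        }
      }
    `

    fetch('/graphql', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ query })
    })
      .then(res => res.json())
      .then(({ data }) => console.log(data.user.posts))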

I've actually been working on a platform called Scaphold.io (https://scaphold.io) to help people learn more about GraphQL. It gives you a cleanly packaged server and manages your data for you, so you can feel the benefits of GraphQL right off the bat. Because of the use of GraphQL, it comes with the best of both worlds of Parse and Firebase: awesome data modeling tools, real-time capabilities, and of course it's built on an open standard.

It truly is the future of APIs, and it works perfectly with React, and with any other framework that comes out down the road for that matter. Facebook's tooling is really neat because it's built from first principles as well as from their experience running one of the largest distributed systems in the world. So they definitely know a thing or two about building developer tools that are cross-platform compatible and developer-friendly.


GraphQL existed a decade ago; it was called OData, and it wasn't as revolutionary as everyone thought it would be at the time.


OData, which was essentially LINQ over URL, was a nice (but incomplete) idea with a terrible realization. We used it in two projects in the past, and we backed away from it in both cases:

- OData never got wider adoption outside (and even inside) MS. GraphQL is already adopted on every major platform.

- OData had a shitton of documentation and format changes with every version, backward compatibility not included. GraphQL has a clear specification which has been pretty stable since its first publication.

- OData queries had no validation. You could send almost any query, only to get a runtime response that your framework/library doesn't support that functionality, even if it was developed by Microsoft itself. By contrast, all major GraphQL server implementations have full feature support.

- Thanks to the introspection API, GraphQL has a self-descriptive metadata model, which is great for tooling and compile-time query verification (see the sketch after this list). Also, versioning, schema, and documentation are all included in GraphQL from the start.

- And finally, GraphQL focuses on exposing the capabilities of your service in a well-defined format understood by the frontend, not on exposing a subset of your database to the user, like OData did.
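
To illustrate the introspection point, a sketch: it's the same POST-a-query mechanism as any other GraphQL request, just asking the server to describe itself.

    // Every spec-compliant GraphQL server answers this; it's what
    // tools like GraphiQL and compile-time query validators build on.
    const introspectionQuery = `
      query {
        __schema {
          types { name kind }
        }
      }
    `
    // POST { query: introspectionQuery } to the endpoint and you get
    // back every type the API exposes.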


Monocultures develop because one tool is far better than all the others.

Javascript isn't a monoculture because all the tools are bad.


Monocultures develop because of corporate backing. The web being open and unclaimed is what leads to the great diversity. As someone who does both server and JavaScript development, the tools are not bad. Often they solve real problems well.


That's silly; if anything, it's the other way around. Web tools have a ridiculous degree of corporate backing, especially with the countless libraries from Facebook and Google (React, Angular, Go, Dart, Flow, Immutable.js, etc.), and even Microsoft with TypeScript and others. Meanwhile, Make and friends don't have any corporate backing.


None of them can lay claim to the web though, in contrast with other UI platforms like Windows, iOS, etc.


Plus IE and Chrome, FWIW.


The web is pretty much corporate. All the major tools are built by major corporations, all the browsers are developed by major corporations (with the exception of Firefox, but Mozilla gets most of its money from major corporations), all the major web players are major corporations, and all the major web monetization is run by corporations.

The web's no longer an open platform. Yeah you can put up a blog on your own server using only FOSS. But if you want to drive traffic to it, you're using a huge corporate ad network or some SEO company. If you want to scale it, you're using a huge corporate cloud. If you want to make money on it, you're using a huge corporate ad network.


Firefox started out as Netscape, so you can trace it back to a big company.


Ah, and so what good monoculture can you direct us to?


Rails? Ruby isn't a monoculture, but Rails does seem pretty dominant.


If you like the Rails monoculture, you may like Ember as a front-end framework. Most of the choices are made for you, and on the whole they're very good default choices. If you choose Ember, you get: a CLI tool (ember-cli), a templating/rendering engine (Glimmer), a router, a persistence layer (ember-data), a unit testing tool (qunit), a server-side renderer (fastboot), etc. If you want to swap one of these things out you can, but the defaults have great support. There have been some bumps along the way, but everything fits together and works.


By the way, the OP I responded to said "front-end", not "JavaScript". How does your attitude jibe with WebAssembly, which will allow you to compile C++ or Rust for the browser?


I've been struggling to understand how WebAssembly will change things on the web. Will you be able to compile an arbitrary C/C++ program to run in a browser? Will syscalls be emulated? What about the filesystem?

It looks like to get such a thing working we would need an OS running on top of JS. But maybe I'm missing something.


There is clearly some sort of mapping available to a JavaScript FFI for doing I/O, as both the Unreal engine and Unity have been ported to WebAssembly. If you can run a video game engine in the browser you can run anything. I believe there are proposals in place as well to add opcodes to directly interface with browser capabilities rather than going through a JS FFI.
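
The mapping looks roughly like this; a sketch, with demo.wasm, log_int, and the exported main() all made-up names:

    // The wasm module gets its "syscalls" as imported JS functions;
    // anything it can't do itself (I/O, DOM, network) goes through these.
    const imports = {
      env: {
        log_int: function (n) { console.log('wasm says:', n) }
      }
    }

    fetch('demo.wasm')
      .then(res => res.arrayBuffer())
      .then(bytes => WebAssembly.instantiate(bytes, imports))
      .then(({ instance }) => instance.exports.main()) // assumes an exported main()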


> If you can run a video game engine in the browser you can run anything

no


enlighten us


ok, assuming your request is not sarcastic:

- operating systems

- hardware drivers

- a networking stack

- trading algorithms

- fighter jet firmware

- ultra low latency DSP

- processing data from LHC collisions or imaging satellites.

- etc

- etc

- etc

Just because WebGL enables fast GPU processing (by basically writing shader code in a small subset of C) doesn't mean the web platform can be used for everything.


The implication was applications. But even then, everything on your list is emphatically not true:

- operating systems

Ironically, Linux was one of the first things ever compiled to run in the browser, using Emscripten.

- hardware drivers

Linux doesn't run without hardware drivers. In this case, the hardware drivers were wrapping JS FFIs.

- a networking stack

Yes you can.

- trading algorithms

Why not?

- fighter jet firmware

It's not a fighter jet, but the SpaceX Dragon 2 space capsule's UI is built using Chromium.

- ultra low latency DSP

Why not?

- processing data from LHC collisions or imaging satellites.

Why not?


You can emulate a network stack and hardware drivers, but your "network stack" can't directly send packets outside of HTTP/Websockets/WebRTC/etc. and your "hardware drivers" just emulate hardware that doesn't actually exist.

Trading algorithms, ultra-low-latency DSP, and "processing data from LHC collisions or imaging satellites" are, I think, references to performance limitations. WebAssembly requires a fair number of runtime checks (though some can be palmed off on the hardware if you're clever), and it has a number of other quirks that hurt performance, like the fact that most languages end up needing two stacks when compiled for wasm.

The issue is even clearer when you move to processing large data sets because WebAssembly only allows a 32-bit address space at the moment. Add to that the lack of concurrency or native SIMD, and it's pretty clear it is way too early to dance on the graves of native executables.


Runtime checks don't stop you from doing heavy data processing or having microsecond-level response times. It seems like your objections fall into two categories. One is caused by it running in ring 3, which can be solved by correcting "anything" to "any user program". The other is moderate performance limitations that don't stop it from running full video game engines. Those may not be ideal, but they won't stop you from running anything. Video games are among the most demanding things you can do, in most respects. A single core of today is better than the multiple cores of a few years ago, and nobody claimed those machines couldn't run all programs.


The memory limits will definitely prevent you from running some real-world programs. Programs that don't fold well into a single-threaded event loop are also a problem at the moment (including Unreal/Unity games that use background threads).

Also, unless your definition of "can run a game" includes "can run a game at 3 fps", I'm pretty skeptical that the entire repertoire of games for those engines can make the jump to WebAssembly today.


Concurrency and SIMD support are in the works.


I'm aware, but "If you can run a video game engine in the browser you can run anything" isn't exactly hedged on the future, is it?


Wish I had more time for the multitude of your generic "why not?"s, but I don't :(

Any language (high- or low-level) that allows you to write and execute deterministic assembly or machine code can be used to implement whatever you want. Whether wasm ever reaches this type of broad, security-sensitive wild west remains to be seen, but my money is on "never".

Wasm will certainly expand into additional areas, but they will all be security-gated and will not have unfettered, raw access to all hardware (even with some additional user-grokkable permission model), because it has to run on the web: "W"asm.


That's entirely addressed by "The implication was applications."

Nobody was trying to make a claim about web assembly drivers. Just applications.


A tool called Emscripten provides a runtime environment. You can include things from that, like a virtual filesystem. That said, a lot of WebAssembly code doesn't have a dependency on file APIs. For example, you can check out the web-dsp project: https://github.com/shamadee/web-dsp
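
A rough sketch of how that virtual filesystem looks from the JS side; file names are made up, and it assumes a module built with emcc so that the FS object is in scope:

    // Emscripten's in-memory filesystem (MEMFS): compiled C code makes
    // ordinary fopen/fread calls, which resolve against files you can
    // create and read from JS.
    FS.writeFile('/input.txt', 'hello from JS')
    // ... run the compiled code, which can fopen("/input.txt", "r")
    // and write its results to /output.txt ...
    var out = FS.readFile('/output.txt', { encoding: 'utf8' })
    console.log(out)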

If you're looking for an intro, I wrote a series of posts on WebAssembly (there are 5 links at the bottom of this intro): https://hacks.mozilla.org/2017/02/a-cartoon-intro-to-webasse...


> It looks like to get such a thing working we would need an OS running on top of JS. But maybe I'm missing something.

You're not missing anything, that's exactly what's happening. WebGL/VR, WebCL, WebRTC, Filesystem API... the browser is becoming a simple feature-rich client OS for the cloud, and WebAssembly is the machine language.


Javascript fatigue fatigue.


It won't change the fact that a lightweight scripting language is being coerced into running increasingly heavy apps. Defending it comes off as Stockholm syndrome, really.


What does "lightweight scripting language" mean? So much hand waving in this thread. ES6 is an advanced programming language rivaling most others and JS VMs are some of the most optimized runtimes in existence.


https://en.m.wikipedia.org/wiki/Scripting_language

Arguing over grammar and pointing to adoption rates just illustrates the rest.


> To expect web development to be a perfect monoculture is a failure

Except it's already a monoculture of one language (JS) and one display algorithm (the DOM), and everything is just compensating for that.

The "runtime" is not general enough.


WebAssembly will remedy that.


People wouldn't complain about it so much if it didn't suck.


> I don't know about others but I tire of these comments chiding the "out of control" front-end ecosystem

Old people also get tired of all these "apps" whining for your attention; they want "just a phone". Young people grew up with it. Note to self: I'm getting old.

Old people were used to buying a development toolset (compiler + tools + IDE), usually including a physical book, with which they could create software. Young people are used to assessing open source projects and piecing a project together on top of many freely available libraries/frameworks.


I'd be careful of making such generalized statements, whippersnapper... some of us senile old devs actually pieced together some "wild and free" C libs (I think we used duct tape and hairpins, but I'm losing my memory in my advanced age) and even used dynamic loading (DLLs, of course) back in the day.


>Old people also get tired by all these "apps" wining for your attention; they want "just a phone". Young people grew up with it.

https://www.youtube.com/watch?v=z-194bOCJnE



