Hacker News
Ask HN: What are the cons of Google Polymer?
87 points by LukeFitzpatrick on May 8, 2016 | 77 comments



Disclaimer: I led development of gaming.youtube.com. We chose Polymer at the time for non-technical reasons.

Some downsides: Performance is not great. Lack of a coherent data architecture (e.g. something like Redux). State changes are difficult to manage (as opposed to React) due to a more limited programming model - e.g. template ifs and repeats don't know to reevaluate in many cases. This makes it harder to integrate with third-party JS libraries. Inferior tooling vs. React.

Philosophically Polymer is more about the DOM and React is more JS-focused. Some of this is personal preference.


I found you don't really need Redux if you follow the "DOM is your API" paradigm. Components/events/changes make a lot more sense after that.


>the "DOM is your API" paradigm

What does that mean, exactly? Do you have any resources you could link to?


Not OP and I don't have any resources for you, but I spent a while playing around with Polymer in the very early (0.3) stages, and was one of the first internal Google customers for them.

I found that once you have web components, there's nothing stopping you from representing your entire webapp's UI in terms of semantic elements only. For example, an HN comment thread could look something like this:

  <hn-thread 
     title="Ask HN: What are the cons of Google Polymer?"
     votes="47"
     comments="46"
     posted="20160509T12:12:00">
    <hn-comment author="sshumaker" posted="20160509T14:12:00">
      Disclaimer: I led development of gaming.youtube.com...
      <hn-comment author="justjco" posted="20160509T14:40:00">
        I found you don't really need Redux if you follow the "DOM is your API" paradigm....
        <hn-comment ...></hn-comment>
      </hn-comment>
      <hn-comment author="torgoguys" posted="20160509T14:56:00">
       You answered the question asked, but I'm also curious as to what you liked about it for that project.
      </hn-comment>
    </hn-comment>
  </hn-thread>
All of the details about how to present a <hn-thread> or <hn-comment> are encapsulated in the definition of the custom element, and you only specify the data that changes between each one. Similarly, you could define methods on <hn-comment> that pop up a comment box when you hit "reply", for example, and this also drops out of the light DOM tree.

When you do it this way, your DOM becomes basically isomorphic to what you would write in your Redux state or send in a JSON response. You could, in fact, write a trivial helper to convert between JSON and DOM, but there isn't really a need when you could just send HTML down the wire.
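A trivial helper of the kind described might look like this - a plain-JS sketch, where the JSON shape and the `commentToHtml` name are illustrative assumptions, not an actual API:

```javascript
// Hypothetical sketch: serialize a JSON comment tree into the nested
// <hn-comment> markup shown above. Field names are assumptions.
function commentToHtml(c) {
  // Minimal escaping for text content and attribute values.
  const esc = s => String(s)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/"/g, '&quot;');
  const kids = (c.children || []).map(commentToHtml).join('');
  return `<hn-comment author="${esc(c.author)}" posted="${esc(c.posted)}">` +
    esc(c.text) + kids + '</hn-comment>';
}
```

Going the other way (DOM back to JSON) is just as mechanical, which is what makes the two representations effectively isomorphic.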


This sounds quite a lot like React components. Cool to see different technologies agree on the benefits of componentized UI!

That said, I believe Redux is not so much about representing the state tree as it is about making its mutations centralized and predictable (the same actions, replayed, will always produce the same state). While a componentized tree often helps ensure this, it is less the case when the state shape itself doesn't neatly match the rendered component tree. This is the use case where Redux, in my opinion, offers some options.


Yeah, it is quite a lot like React components. Polymer and React actually came out around the same time (early 2013), but Polymer spent forever in development and only recently reached 1.0, so React got a lot of the developer momentum.

If you're doing webcomponents right (i.e. adhering to the Law of Demeter, not handing out references to internal shadow/shady-DOM nodes, defining an API for possible child components), then you naturally get centralized & predictable mutations as well. The methods & child nodes of a top-level component become the API by which events can mutate the page, and then any state changes propagate down to the leaves of the component tree only via well-defined APIs.
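As a rough illustration of that pattern (plain JS, no Polymer; all names here are hypothetical), "mutations only via well-defined APIs" boils down to something like:

```javascript
// Hypothetical sketch: events flow up to a root component, which
// performs the mutation and pushes changes back down through each
// child's public API only - never by reaching into internals.
class Comment {
  constructor() {
    this.collapsed = false;
    this.children = [];
  }
  setCollapsed(v) {                 // the only way in: a defined API
    this.collapsed = v;
    this.children.forEach(c => c.setCollapsed(v));
  }
}

class ThreadRoot {
  constructor(top) { this.top = top; }
  onCollapseAll() {                 // centralized mutation point
    this.top.setCollapsed(true);
  }
}

const child = new Comment();
const top = new Comment();
top.children.push(child);
new ThreadRoot(top).onCollapseAll();
// child.collapsed is now true: the change propagated only via the API
```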


I really can't wait till more people figure this out. Using html for documents again is going to be very powerful and freeing for those who adopt it.


I see now, many thanks for the informative reply!


> > the "DOM is your API" paradigm

> What does that mean, exactly? Do you have any resources you could link to?

Check out the "Thinking in Polymer" talk from Polymer Summit 2015: https://www.youtube.com/watch?v=ZDjiUmx51y8

Specifically talks about "the DOM is your API" at 0:49 --> https://youtu.be/ZDjiUmx51y8?t=50

disclaimer: technical writer for polymer


You answered the question asked, but I'm also curious as to what you liked about it for that project.


Pluses:

It was vastly preferable to server-side template rendering (how YouTube is traditionally built). :)

Generally well-built components with support for material design patterns.

Polymer team was responsive to issues and quick to address feedback.

Plus a bunch of non-technical reasons like developer mindshare within YouTube etc.


You can use Redux with Polymer projects. I've created an element for that: https://github.com/lastmjs/polymer-redux-store


>e.g. template ifs and repeats don't know to reevaluate in many cases.

Examples?


They don't re-evaluate on nested path changes - e.g. a template bound to item.bar.isActive won't update if isActive changes underneath it. You can work around this in practice, but it often requires significant architectural changes and requires your regular JS code to be Polymer-aware.

This wasn't true in Polymer 0.6 but required Object.observe which is SLOW.

React's render everything and diff approach avoids this issue.
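A plain-JS sketch of the underlying problem: observation that compares only the top-level reference cannot see a nested mutation, which is why Polymer 1.x makes you notify explicitly via its path API (e.g. this.set('item.bar.isActive', true)).

```javascript
// Illustrative only: a reference comparison misses nested mutation.
function shallowChanged(prev, next) {
  return prev !== next;             // compares references, not contents
}

const item = { bar: { isActive: false } };
const before = item;

item.bar.isActive = true;           // nested mutation, same reference
shallowChanged(before, item);       // false: the change is invisible

// The change is only visible if you clone on write (React-style),
// or notify through an explicit path API (Polymer-style).
shallowChanged(before, Object.assign({}, item));  // true
```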


I agree this can be a pain and it does require significant architectural changes, but needing "nested path changes" is also often a code smell calling for better component encapsulation. The DOM/polymer already supports nesting components.


This seems to be a common assumption in the JS world. Even many polymer elements seem to focus on providing stunts for JS procedural style programming rather than DOM first programming. Do you have any good resources on refactoring JS code to be more DOM centric? I'm thinking tools like d3.


There's a bit of hassle required to bind conditionals to object paths IIRC. Also CSS classes bound to variables don't work well if you're also using custom-style elements, which is virtually inevitable.


It pretty much requires you to use the babel/crisper/vulcanize toolchain, which in my limited experience doesn't do nearly as good a job of minifying JavaScript as other tools (because it can't do symbol renaming). I haven't found a way to use browserify with web components, for example, so no ES6 modules.

Polymer is made of magic. It uses magic cutting-edge browser features where available, and where they aren't, it fakes them using magic. Some of its APIs, like Polymer.dom(x), are completely magic. When it works, it works really well. When it doesn't, you are going to waste so much time trying to find out why.

It also assumes that everything you're doing is through Polymer, so you won't get much mileage out of, say, jQuery. While it admits the existence of other JavaScript libraries, it tends to blank them at parties.

If I were to do my current project again... I'd probably still choose Polymer, although I'd take another look at the other Javascript UI toolkits. Web components are so, so nice for designing UIs --- I can finally use actual computer science techniques like abstraction and modularity for building UIs! --- and it's got the best consistent look and feel that I've ever seen in a Javascript UI toolkit.


Here's an experiment for ES6 and Polymer via Browserify + Babel: https://www.npmjs.com/package/poly-next


>which in my limited experience doesn't do nearly as good a job of minifying JavaScript as other tools (because it can't do symbol renaming).

It's possible to extract the JS code from the vulcanized HTML file into a separate JS file using the crisper command, then run the uglifyjs command with appropriate minification arguments.

Example:

   vulcanize --inline-css --inline-scripts ./index.html | crisper -h ./build/index.html -j ./build/script.js
   cd ./build
   uglifyjs -m -c warnings=false -o ./script.js --source-map=./script.js.map ./script.js


You can totally use ES6 modules, you just have to split the web component JavaScript and HTML into separate files. That's what we did for Home Assistant, which we documented here: https://github.com/home-assistant/home-assistant-polymer#bui...


We use it pretty extensively at Cloudstitch and like it on the whole.

A few commenters mentioned poor browser support, but we haven't experienced any problems other than IE<=10.

In many ways Polymer is just a shim for the WebComponents spec with data binding added in, along with a standard library of web components. The resulting framework-style is basically plain-JS/HTML with a la carte use of Web Components where appropriate. Unlike [my perception of] React or Angular, you can use Polymer a bit without going all in. (Ironically the one thing you can't currently do is mix Polymer Elements from Source A with Polymer Elements from Source B at runtime if they have common, but separately hosted, dependencies).

WebComponents feel a bit like Java Swing, in that making HelloWorld is high overhead, but once you've got a nice toolbox of components going, you can pull them out and use them flexibly. This is not unlike React components, except Polymer/WebComponents use an HTML-centric definition format while React uses a JS-centric definition format.


>Unlike [my perception of] React or Angular, you can use Polymer a bit without going all in.

For what it's worth, you can integrate React in your existing Backbone/Ember/Angular/etc. app one component at a time. At my previous company, we did this over the course of a year while shipping new features, thanks to the interactivity React enabled.

Ryan Florence gave a talk about integrating React into an existing app: https://www.youtube.com/watch?v=BF58ZJ1ZQxY


This is even easier with Polymer or web components, since you can use them in any template or JSX as simply as you would regular HTML tags.


Wait a minute, you mean you had problems on IE 10? That's a widely used browser.


I only recall problems on 9, actually, but I wanted to be careful not to oversell after checking the compat matrix.


If you are using the shady DOM (web components lite, rather than the low-performance full polyfill), the library requires you to do DOM manipulation through the Polymer local DOM API. [0]

That means without shimming other libraries, polymer is incompatible with any other libraries you might use to manipulate the DOM. For example, you can't easily mix angular, react, or ember templates with some polymer elements, because (e.g.,) angular's ng-if directive doesn't use polymer.dom to inject created nodes.

This is currently my biggest beef with polymer. Someday, webcomponents will be great, principally due to composability and portability. Just plug the one component you need in to your extant work. But for now, that promise is not quite realized.

[0]: https://www.polymer-project.org/1.0/docs/devguide/local-dom....


I'm on the Polymer team. Shady DOM is probably the biggest beef that we have with the project, but it was a very necessary tradeoff given that the Shadow DOM polyfill was just too slow.

Polymer can seamlessly switch between Shadow DOM and Shady DOM (though your own elements need to be tested under both). You can still use the Shadow DOM polyfill if compatibility is a higher concern than raw performance.

Safari and Chrome will have native Shadow DOM v1 implementations by the end of the year, and quite possibly Firefox. If we have enough resources, we may decide to make an IE/Edge-specific Shadow DOM polyfill that'll be much faster because it'll patch DOM prototypes rather than wrap DOM nodes. Wrapping is necessary because Safari doesn't properly let you patch Node prototypes. If this happens, we'll have fast and compatible DOM APIs.


Yes, I'm definitely looking forward to the Real Shadow DOM being browser-supported. I was unable to use the full webcomponents polyfill in my own application, as it introduced gamebreakers with contenteditable-based rich text editors [0]. Putting polymer anywhere on my page meant my rich text editor (quilljs fwiw) broke on non-chrome browsers, even though I wasn't doing anything polymeresque with the editor. It was just the polyfill interacting poorly with the browser's range implementation. In my case (my application also heavily leaned on angular for routing and page-level templating), I couldn't mix-and-match angular and polymer and also maintain cross-browser support of the text editor.

But I still am _super_ excited about the future of the webcomponents world. I mean, if nothing else, having well-namespaced element ids (if I have two editor components on the page, I can't just grab "#dateinput") will be great. I love the idea of directly attaching properties to DOM nodes, encapsulating templates, and projecting content into viewports. It's seriously rad. I am 100% on board with the notion; it's just the polyfill in the meantime that's problematic.

[0]: https://github.com/webcomponents/webcomponentsjs/issues/212


I've never used it, because the official demo site was incredibly slow. If Google, with all its resources, can't even make their own library feel responsive, then I'll spend my time hitting my head against a different platform.


I'm assuming you mean https://elements.polymer-project.org/. It seems quite fast to me.


What browser are you using?


I noticed my latest version desktop Safari freezing up on their docs site.


The problem here is that you're using Safari.


it's all webkit, no?


Slow as hell in Firefox, because Firefox doesn't natively support web components, so a JavaScript polyfill is used.

Also it is over-complicated for simple websites. You'll end up in bower-npm-grunt-etc-etc hell very quickly.


> Firefox doesn't natively support web components

FWIW, that's because there hasn't been a reasonably stable, agreed upon collection of specs for us to implement. Custom Element only reached multi-vendor agreement a few months ago (https://annevankesteren.nl/2016/02/custom-elements-no-longer...).

Polymer is a useful library, but it only represents a single vendor's vision, not any sort of standardized specification.


Fair enough of a point. Web components have a lot of potential, and it's good to hear Mozilla et al. are slowly helping build consensus on custom element instantiation! Polymer is one opinionated take on web components which feels a bit heavy-handed for my taste; I'm experimenting with x-tag from MSFT lately. Much lighter weight. The nice thing is that once these specs get finalized, it'll be much easier to mix and match whatever components you like (if they follow a DOM-centric approach).

One thing that's still frustrating is the lack of HTML imports (<link rel="import">) in Firefox. Any word on whether that's ever going to change? HTTP/2 changes the performance constraints quite a bit, so the previous decision not to support native HTML imports means you have to use vulcanize and you can't dynamically mix and match sources.


> the lack of html rel import in Firefox. Any word if that's ever going to change?

Right now we have no intention of shipping HTML Imports; it's easy to polyfill, and we suspect that implementing ES2015 Modules and the module loader spec will change how we look at the problem of reusable components. To that end, support for a restricted subset of <script type="module"> should land in Firefox Nightly builds tomorrow (https://bugzil.la/1240072), so we're making significant progress on that front.


> it's easy to polyfill,

It may be easy from an internal implementation point of view, but from a usage point of view I disagree, unfortunately. It disallows static HTML/CSS-only web components, which strongly goes against the grain of HTML's intent of providing markup-only documents. Granted, it's a minor use case these days, but it is important to the underlying philosophy of HTML, and it strongly limits the feasibility of building a static, HTML-only _and_ modular website.

Let's say a client's JS is disabled (via NoScript, which addons.mozilla.org reports as having 2 million users). In that case it'd completely break HTML-only web components for 2 million-plus users, even components used primarily for CSS modularity. Unless I'm missing something about how ES2015 modules will work...? This seems like a major lack of support for any notion of custom components, and particularly for user preferences on how they view the web, which is probably why this seems (to me) such an odd stance for the Firefox developers to take. I'll be following the ES2015 modules work to figure out if I'm missing something!

Here's an example use case: tailoring static HTML-based documentation with use-case-based theming. Depending on the client's origin and/or browser type, it'd be trivial to redirect HTTP html-import requests to serve various themes/clients. While technically possible without native HTML imports, it'd require either JS support or dynamic rewriting of HTML files (essentially run-time vulcanization on the server) rather than a simple HTTP redirect to a modular component.

> To that end, support for a restricted subset of <script type="module"> should land in Firefox Nightly

It'll be interesting to see how ES2015 Modules and module loader spec eases the day-to-day usage of web-components. Even with polyfills Firefox seems to be difficult to coax into loading custom components -- vulcanizing the html is the only method I've found to be reliable. I'll test drive the <script type="module"> and file any bugs that I might find, thanks for the heads up! The script module should support x-tag library pretty well.


Are you using the full shadow DOM polyfill or the Shady DOM shim? Shady DOM is _significantly_ faster at the cost of full spec compliance.


For those who (like me) have never heard of it, here's a link: https://www.polymer-project.org


What is it though? The page does not say...


The examples seemed rather clear to me of what it is in general, but I didn't look at the details.


For the moment, it doesn't really have great browser support, and there's less intent to implement it than some other features of the OWP. Whereas service workers at least had Mozilla pushing along with them, Google seems to be "going it alone" a bit more on Polymer and having a tougher time building a coalition for it.


This is not true.

All of the browsers are on board to ship Web Components and have been actively engaged in the standards discussions.

Safari just shipped Shadow DOM v1 in Safari Technology Preview. https://webkit.org/blog/6017/introducing-safari-technology-p...

Edge are actively working on Shadow DOM and Custom Elements support https://blogs.windows.com/msedgedev/2015/07/15/microsoft-edg...

Firefox is working on Shadow DOM and Custom Element support as well.

The odds are good that other browsers will begin landing these features in their stable releases in 2016.


Hmm, I've found that the polyfills work fine, if you need to target IE11 and latest Firefox/Chrome. Even old Chrome versions work well with the polyfill.


The shadow DOM works in IE, all the spec?


The shadow DOM polyfill works fine in IE. There are a few problems with IE though; the main one I've encountered is that dom-repeat doesn't work inside a table.


Firefox? Safari iOs?


Yes to Firefox. I don't test Safari.


For a public site is important...


You are kind of forced to use Bower.

It lacks older-browser support.

Mixing it with other virtual-DOM libraries like React is not easy or straightforward.


What's the definition of "older browser" here? As far as I know, polymer also uses a polyfill [1] to provide the web components functionality to older browsers.

[1] : https://www.polymer-project.org/1.0/resources/compatibility....


No support for IE9 even with polyfills. Some people can get away with not supporting IE9, but sadly I am not one of those people.


Yeah, but last time I checked it fails: for example there isn't a way to polyfill shadow DOM...


Polymer 1.0 doesn't use shadow DOM (unless you ask it to). Instead it uses this thing called 'shady DOM', which is roughly equivalent but requires DOM access to go through Polymer's APIs.

See here for the details:

https://www.polymer-project.org/1.0/articles/shadydom.html


The difference between the webcomponents.js polyfill and webcomponents-lite.js is precisely that the former includes the shadow dom polyfill.


I recently used Polymer for a large project. Performance is ok on chrome with full web-components, but unsurprisingly the full webcomponent polyfill is very slow elsewhere. Stick to the ShadyDom polyfill if possible.

Vulcanize is pretty cool, though we had a hard time getting it to work right in a gulp workflow.


Can you share this large project?


If you want to compare with react for example or angular it lacks a mobile framework.


Well, only partly true. Because all paper/iron elements adapt seamlessly on mobile, there is no real need for a mobile framework.


I am talking about React Native and Ionic, which let you build mobile apps with transferable skills.


True, but if you are looking for offline adaptations of your webapp, there is a really cool technology called service workers that works as an offline cache for your whole app.


polymer-native is currently under active development and available on GitHub.


Excluding the performance issue, which largely seems to be confined to instances where polyfills are required, the cons don't seem so bad. The tooling will improve as the library and community mature, and other problems are being addressed already by the looks of it.

I've got a project in the pipeline for which I was planning to use Cycle.js, but I've always had a niggling interest in Polymer.

What I'd like to know is whether Polymer offers any significant pros, because after consuming some of the docs, what's on offer seems to be a better reimagining of ASP.NET WebForms. What are the benefits?


An issue we faced is with different versions: each new major version changed so much that we had to rewrite some things. I will wait for a more stable release.


I have been using Polymer since 0.5 and it used to be like this, but since the first major version (1.0) it has not required me to rewrite anything.


I've used Polymer.dart for a browser game consisting mainly of simple windows. From that I can say the most important downside is the poor performance on mobile (even with shady DOM). Despite that, I'd probably still choose Polymer if I needed a browser framework, because web components' encapsulation feels great from a developer perspective.


Don't forget to take a look at http://x-tag.github.io especially if you're looking for alternatives to Polymer's take on web components. I haven't used either extensively yet, but x-tag seems more bare-metal.


More http requests (or vulcanize, maybe...)


With HTTP/2, everything can be multiplexed over a single connection.


Yeah, you only need a single http request thanks to vulcanization.


And from a SEO point of view?


It lacks a routing system


Not anymore, with the carbon-route elements they built a pretty awesome routing system. https://elements.polymer-project.org/elements/carbon-route


It's not supposed to have one. Polymer is simply a shim and a collection of web components; it isn't a fully-fledged application framework itself. You'd still use something like Flux, Redux or Cycle (with a router) on top of Polymer. Polymer is intended to be used essentially for views only - the same role React plays in the Redux + React stack.
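A minimal sketch of that split (plain JS, no libraries; the reducer and action names are made up): the store owns state and centralizes mutations, while a Polymer element would just subscribe and copy state into its bound properties.

```javascript
// Minimal Redux-style store; illustrative only, not the Redux library.
function createStore(reducer, state) {
  const listeners = [];
  return {
    getState: () => state,
    dispatch(action) {
      state = reducer(state, action);   // centralized, predictable mutation
      listeners.forEach(l => l(state));
    },
    subscribe(l) { listeners.push(l); }
  };
}

const store = createStore(
  (s, a) => (a.type === 'UPVOTE' ? { votes: s.votes + 1 } : s),
  { votes: 0 }
);

// Inside a Polymer element you might wire it up roughly like:
//   store.subscribe(state => this.set('votes', state.votes));
store.dispatch({ type: 'UPVOTE' });
```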



