
70% Repetition in Style Sheets: Data on How We Fail at CSS Optimization - gmays
https://meiert.com/en/blog/70-percent-css-repetition/
======
franciscop
As someone who has gone to great lengths to improve this by creating a smaller
bootstrap [1], I have to say that at the end of the day it will probably not
matter. CSS is hardly the bottleneck, and the same time spent optimizing it
would probably be better spent:

- Optimizing your images.

- Concatenating, minifying, and gzipping the JS and CSS.

- Optimizing JavaScript.

Likely in order; mileage might vary. If those 3 are already in place, then it
_might_ make some difference to optimize the CSS.

First measure what you actually want to improve (it is _not_ page size per
se, it is loading time). Then make sure that you optimize that and not something
else.

[1] [https://picnicss.com/](https://picnicss.com/)

~~~
nym
I'd throw in "Use CDNs as much as possible" there.

~~~
Y7ZCQtNo39
I just don't feel comfortable using them in certain situations, like SaaS web
apps. What if they go down, for any reason? Since I depended on the CDN
for a core framework (say, React), now my entire site isn't going to load.
Cue customer e-mails.

That being said, for less mission-critical projects, they do have their place.

~~~
Jach
CDNs aren't an either-or thing. It's common to fall back to a copy on your
server if the CDN version doesn't load.

CDNs involve a big set of tradeoffs you have to weigh; they don't make
sense for every circumstance. I don't think I'd ever put "Use CDNs as
much as possible" on an optimization advice list.
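A common form of the fallback mentioned above (the CDN URL and local path here are made up) checks for the library's global right after the CDN tag; a minimal sketch:

```html
<!-- Try the CDN first. -->
<script src="https://cdn.example.com/react/15.4.2/react.min.js"></script>
<script>
  // If the CDN failed, the global is missing; load the local copy instead.
  window.React || document.write(
    '<script src="/vendor/react.min.js"><\/script>');
</script>
```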

~~~
nym
Yeah, keep the dynamic stuff on your webserver, static stuff on the CDN. Part
of the reason I say "as much as possible" is that I've seen significant
conversion improvements from enabling a CDN.

------
venning
I find problems with both the author's methodology and hypothesis.

Counting unique declarations doesn't really tell me anything. CSS isn't simply
a declarative language. CSS rules are _ordered_. Sometimes (and oftentimes
within webapps) you need to repeat a declaration because you're using it to
overwrite an earlier declaration but only when the second selector is valid.
For example, it's very common to have lots of `display` declarations that turn
on or off elements by overwriting earlier rules. Because of this, you're going
to have lots of "repeat" declarations, but they're not repeated in their
_location_ within the order. With complex pages, you're going to wind up
twisting yourself in knots trying to keep everything logically ordered if you
also want to minimize declarations.
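A minimal sketch of that pattern (class names invented): by the article's counting, the second and third `display` declarations below are "repetition", but each exists at a specific point in the cascade order:

```css
.modal          { display: none; }  /* hidden by default */
.is-open .modal { display: block; } /* overrides the earlier rule when open */
.tooltip        { display: none; }  /* another "duplicate" of display: none */
```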

Additionally, from my naive understanding of browser engines, the performance
cost of multiple rules applying to an element is greater than the cost of
multiple declarations within a rule. Why minimize them?

But my largest concern is the mental overhead of trying to simplify rules in
code. Co-locating declarations affecting a logical unit is extremely helpful.
The tool we have to accomplish this is called a selector. Minimizing
declarations increases selector count, which kind of defeats their purpose.
Pre-processors can't solve this for you either, due to CSS rules being
ordered. If all you're building is a static page then, sure, I can see
minimizing declarations being some kind of goal. But CSS gets used for a lot
more than static pages.

As other commenters have noted, if the author's concern is overly-large CSS
files, gzip's compression alleviates a lot of that, at least when the repeated
declarations are in the same file (a reason not to treat HTTP/2 as a silver
bullet and to keep concatenating, if that's a concern).

EDITS: Added a couple sentences.

~~~
designer023
I agree completely. This reordering/grouping of properties seems like it would
make the CSS a nightmare to manage and scatter related properties all over the
stylesheets. It would be like grouping the same-coloured/shaped parts of a car
together because they are the same colour and shape, instead of grouping them
by what they are: it might reduce the number of times you need to write the
word "red", but when you need to update how a part of the car looks, it would
be a mess!

------
positivecomment
gzip takes care of the repetition, and I don't think CSS parsing has any
meaningful impact when you consider the whole freak show of resources and
scripts a typical page loads (I have no data on this, though, just intuition).

Maintainable CSS is way, way more important. Writing complex CSS is very
easy, but debugging it can be a nightmare.

~~~
HugoDaniel
If it is repeated, it is harder to maintain. If you have worked on a big
project, you know it is way harder when you have to change things in a few
different places instead of just one, and it will be much harder to maintain
consistency across your styles.

Can you give a counter-example where repeating CSS _improves_ maintainability?

~~~
positivecomment
When you have a base style sheet that you want to customize? It can be a
framework, a base style sheet for your main application with sections
overwriting it, or custom style sheets for some tenants of your application,
or whatever.

Many component-based designs also prefer repeating styles for each component
because of maintainability but I guess they wouldn't be counted in the article
as they use generated root names.

My point is, _not repeating_ styles should not be the target, though it may of
course be a side effect.

------
kazinator
If you condense everything that can be condensed, the resulting style sheet
could be harder to adapt to changing requirements.

One example of this is when some styles are "accidentally" the same for two
elements. If you condense them into one rule, then you lose the independent
control.

_"Hey look, 97% of the users always turn up bass and treble together and
turn down the midrange; we can condense the three-band equalizer panel down
to one knob!"_
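A hypothetical sketch of that "accidental sharing" trap (selectors invented): condensing the two rules saves bytes today but couples the elements tomorrow:

```css
/* Before: independently controllable, though currently identical. */
.price-tag { color: #c00; font-weight: bold; }
.error-msg { color: #c00; font-weight: bold; }

/* After condensing: fewer declarations, but restyling prices now means
   splitting the rule back apart so errors stay red. */
.price-tag, .error-msg { color: #c00; font-weight: bold; }
```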

~~~
moosingin3space
This is why I personally like the Vue/React way of scoped CSS: it promotes
loading a simple baseline CSS (like Skeleton) for the full page and
specializing it at the individual component layer. It's easy to refactor when
requirements change (if a style is the same across components, move it into
the global CSS; if not, move it into the component).

~~~
jorblumesea
Shout-out to React's styled-components: [https://github.com/styled-components/styled-components](https://github.com/styled-components/styled-components)

Very nice to work with.

------
dmitriid
...in which some guy is surprised that a system with 0 modularity, 1 flat
namespace, 0 encapsulation, and cascading rules leads to repetition.

~~~
TheAceOfHearts
CSS supports XML namespaces [0], although I've never seen anyone use them.

Shadow DOM supports style isolation [1], although I don't think I've ever
seen anyone use it outside of toy projects.

There's also a newly proposed CSS containment property [2], but it's only
supported in Chrome.

For simpler websites where you skip build tools entirely, I don't think it's a
problem. For larger applications that use a build tool, css-modules is a
good alternative [3].

[0] [https://developer.mozilla.org/en-US/docs/Web/CSS/@namespace](https://developer.mozilla.org/en-US/docs/Web/CSS/@namespace)

[1] [https://webkit.org/blog/4096/introducing-shadow-dom-api/](https://webkit.org/blog/4096/introducing-shadow-dom-api/)

[2] [https://developers.google.com/web/updates/2016/06/css-containment](https://developers.google.com/web/updates/2016/06/css-containment)

[3] [https://github.com/css-modules/css-modules](https://github.com/css-modules/css-modules)
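For reference, the Shadow DOM isolation in [1] looks roughly like this (the element name is invented); the inner style neither leaks out to the page nor is overridden by page-level rules:

```html
<my-widget></my-widget>
<script>
  customElements.define('my-widget', class extends HTMLElement {
    connectedCallback() {
      const root = this.attachShadow({ mode: 'open' });
      // This p rule applies only inside the shadow root.
      root.innerHTML = '<style>p { color: red; }</style><p>Scoped</p>';
    }
  });
</script>
```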

------
tyingq
Particularly hard to fix for a site that's been around a while. It's difficult
to identify dead CSS because you would need something like 100% DOM test
coverage.

Perhaps that "dead" style is only used for that error div that isn't visible
unless some rare condition exists.

~~~
eberkund
PurifyCSS does that:
[https://github.com/purifycss/purifycss](https://github.com/purifycss/purifycss)

The only issue I have come across is inserting elements with CSS classes via
JavaScript. A workaround is importing the CSS in the JS module that uses it
(via webpack); PurifyCSS also has an option to manually exclude certain
selectors.

~~~
tyingq
As you mention, it does, if you somehow click every combination of buttons,
invoke every error, corner case, etc. It can't know about HTML that isn't
there but could be.

------
specialist
Unnecessary duplicate "artwork" has always been a problem for digital media.

Example from the dark ages: naive (or lazy) CAD drafters would often draw
overlapping lines, which then caused the pen plotter to draw that many strokes
on the mylar, which would then have too much ink, botching the output. Enough
of a problem that 3rd party tools popped up to dedupe plot files.

---

Opera was onto something with server-side rendering for their Mini browser.
They should sell that as a service.

In fact, someone needs to take it even further:

Render to some page description language (maybe PDF), and then automagically
rehydrate into minimal, inferred HTML+CSS+images. Kinda like how tabula and
others scrape PDFs to reconstitute tabular information.

One benefit is throwing away all the box model layout cruft. True WYSIWYG.

Another, even bigger, benefit is sanitizing content, like removing web bug
tracking images.

---

I'm currently test driving the Mercury Reader extension for Chrome. Ignoring
all styling also sounds like a terrific idea. I may prefer this strategy.
[https://postlight.com/](https://postlight.com/)

------
leeoniya
[https://github.com/rtsao/styletron](https://github.com/rtsao/styletron)
generates a pretty thorough de-dupe of CSS rules

------
robocat
Example of low repetition from his page:

view-source:[https://meiert.com/setup/default.css](https://meiert.com/setup/default.css)

If this is worth doing, a tool that transforms the source would make more
sense IMO.

------
dangayle
I use Tachyons for 99% of my styles, writing minimal amounts of highly
specific styles. My productivity has gone way up and I no longer have to worry
about CSS pretty much at all.

------
pspeter3
I think the best thing I got from this article was the discovery of CSS Stats.
[http://cssstats.com/](http://cssstats.com/)

------
trebor
As part of our build process, not only do we minify our stylesheets (which
makes an attempt to unify nearby selectors, if similar), but we also
precompress them. And our library routes asset requests through a proxy script
that adds cache-control immutable, so only the first request downloads the
asset.

Repetition is simpler than the complexity of managing huge projects that are
constantly changing.
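The commenter's setup is a proxy script, but the same precompress-plus-immutable idea can be sketched in plain nginx config (the path is assumed, and filenames are presumed fingerprinted by the build):

```nginx
location /assets/ {
    # Fingerprinted filenames make it safe to cache "forever".
    add_header Cache-Control "public, max-age=31536000, immutable";
    # Serve the .gz files produced at build time instead of compressing
    # on the fly (requires ngx_http_gzip_static_module).
    gzip_static on;
}
```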

------
unabst
> excess of repetition is the definition of bad code

No. The lack of abstraction tools is the definition of a lower-level language.

You know it's a lower-level language when, as a developer, you feel the need
for a higher one. I submit Less and Sass as evidence.

Honestly, this alone making it into standard CSS would have me taking a week
off, knowing how much time I'd save in the future:

.c {.b .a}

... where a, b, and c are all classes.
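For what it's worth, Sass's `@extend` gets close to this today; a sketch using the commenter's class names:

```scss
.a { margin: 0; }
.b { color: red; }

// .c picks up everything .a and .b declare, resolved at compile time.
.c {
  @extend .a;
  @extend .b;
}
```

The compiled output merges the selectors (`.a, .c { ... }` and `.b, .c { ... }`), so any repetition lives only in the generated CSS.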

------
k__
Seems like CSS-in-JS has solved this problem already.

Libs like styletron consolidate duplicated style declarations.

------
jimmaswell
In my experience, CSS would be quite a bit more efficient if it supported
inheritance. When I want multiple similar declarations, I have to either
copy/paste them all and slightly edit them, or apply multiple definitions to
the targets.

------
roesel
A few plots would go a long way in this post.

