CSS Nesting Module (w3.org)
182 points by bpierre on Sept 1, 2021 | 71 comments



It’s not apparent from the document posted, but this is actually almost six years old [0] and has been implemented as a PostCSS plugin for just as long [1]. It was adopted by the CSS Working Group a couple of years ago [2]. So this is very well-established and you’ve been able to use this syntax for many years. But it’s good to see it moving forward, and hopefully browsers will implement it soon.

[0] https://tabatkins.github.io/specs/css-nesting/

[1] https://github.com/csstools/postcss-nesting

[2] https://github.com/w3c/csswg-drafts/pull/2878


This is good historical context but missing one bit and another deserves clarification.

Nesting semantics have been part of SASS/SCSS for the better part of a decade before the proposal, and nesting is now pretty much standard in CSS pre-/post-processing.

It’s been available to use all that time, but AFAIK still requires a build tool. It’s only now becoming a tentative possibility in userland.


Yes, sorry, I was assuming knowledge of Sass etc.

When I first saw this link I thought “Hang on a sec., this says ‘First Public Working Draft’, but hasn’t this spec. been around for ages?”

I figured people might make the mistake that this was something new that the W3C were only just getting around to rather than something that has been cooking for a long time.

Didn’t mean to imply that the spec. itself sprang up out of nowhere – it was definitely based on the Sass work that came before it, and you have been able to use this syntax with Sass for a very long time!



This looks awesome. I'm so glad they re-used the & syntax popular in many compile-to-CSS languages. Between this and CSS variables, I might not even need SASS in my toolchain anymore, which would be amazing.

@nest looks extremely cool too. An easy way to keep rules affecting one selector logically grouped with it, even if the selector relies on a small thing about its parent elements.


Was thinking about SCSS/LESS too, but note that this is not valid:

    .foo {
        color: blue;
        .bar {
            color: red;
        }
    }

The ampersand is required, unlike in LESS and SASS. I don't yet understand why, because requiring the ampersand makes copy-pasting difficult, as you need to add or remove seemingly redundant ampersands to move rules in and out of nesting.
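
For reference, the form the draft does accept is the same rule with the ampersand added (a sketch based on the spec's examples; `& .bar` matches `.bar` descendants of `.foo`):

    .foo {
        color: blue;

        & .bar {
            color: red;
        }
    }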


It says so on the page: it's required because otherwise parsing would need unbounded lookahead, since the start of a declaration can be indistinguishable from the start of a selector.

>Nesting style rules naively inside of other style rules is, unfortunately, ambiguous—the syntax of a selector overlaps with the syntax of a declaration, so an implementation requires unbounded lookahead to tell whether a given bit of text is a declaration or the start of a style rule.

>For example, if a parser starts by seeing color:hover ..., it can’t tell whether that’s the color property (being set to an invalid value...) or a selector for a <color> element. It can’t even rely on looking for valid properties to tell the difference; this would cause parsing to depend on which properties the implementation supported, and could change over time.

>Requiring directly-nested style rules to use nest-prefixed selectors works around this problem—an & can never be part of a declaration, so the parser can immediately tell it’s going to be parsing a selector, and thus a nested style rule.

>Some non-browser implementations of nested rules do not impose this requirement. It is, in most cases, eventually possible to tell properties and selectors apart, but doing so requires unbounded lookahead in the parser; that is, the parser might have to hold onto an unknown amount of content before it can tell which way it’s supposed to be interpreting it. CSS to date requires only a small, known amount of lookahead in its parsing, which allows for more efficient parsing algorithms, so unbounded lookahead is generally considered unacceptable among browser implementations of CSS.
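
To restate the quoted ambiguity as code (a contrived sketch, not from the spec): without a required `&`, both rules below begin with the same tokens, and the parser cannot classify them until it reaches the deciding `;` or `{`:

    .foo {
        color:hover;          /* a declaration: 'color' set to an invalid value... */
        color:hover > a {     /* ...or a selector: hovered <color> elements' links */
            text-decoration: none;
        }
    }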


Seems like a premature optimization, at the cost of the user.

I don't care how fast it runs if it doesn't do what I want.

EDIT: yeah, you probably are right ... I know too little about parsers. Maybe there exists a formal proof that you can't make a fast parser for that syntax without unbounded lookahead.

Maybe they should even have created CSS from the start as a two-part system: a convenient syntax for humans which, once written, compiles to a StyleAssembly comfortable for the browser.


It still does what you want! This is an issue of syntax, not semantics.

Avoiding unbounded lookahead is not "premature" optimisation - if introduced, it would never be possible to remove due to the algorithmic complexity involved.


It's living in the browser and this feature is mostly useful in complex CSS. Most definitely not a premature optimization.


hope all your visitors on old hardware feel the same way


Those were my reactions, too! It's the one thing that I want the most whenever I have to jump out of the CSS pre-processor world and write regular CSS.

It is worth noting that SASS already lets you refer to parent elements in the way they show with the `@nest` rule (apologies if I misunderstood what you said and you already know this). I do prefer the more verbose `@nest` syntax though, as the intention is clearer.


I'm sure there's a reason here that I'm not seeing, but it feels like it's being different for the sake of being different to me-- requiring `&` even in the simple case where you're combining them with the descendant combinator (plain space), but then also requiring `@nest` in any situation where `&` is not at the start of each selector.

SASS takes the opposite approach-- there's an implicit `&` at the beginning if you don't specify it elsewhere, and it can appear anywhere (i.e. the example `:not(&)` does not have to have anything like the `@nest` syntax in SASS).
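
The contrast, sketched side by side (the `@nest` form follows the draft's examples; the Sass form is the implicit-`&` behaviour described above):

    /* CSS Nesting draft: '&' is not first, so '@nest' is required */
    .foo {
        @nest :not(&) {
            color: blue;
        }
    }

    /* Sass: the same thing nests directly, no prefix needed */
    .foo {
        :not(&) {
            color: blue;
        }
    }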


There's an explanation in section 2, in the green box that says "Why can’t everything be directly nested?"


I agree! I may use Sass if I were to write functions to generate grids, or element permutations based on a color palette, but at this point I don’t feel it’s all that needed.


I fail to understand people's infatuation with CSS nesting.

It is nice not to write too much to get some pseudo elements or selectors working, but that's the extent of it.

Tying your CSS to the markup is a recipe for misery and pain. Over all these years, I've never seen it not turn out to be a nightmare.


I feel the problem is that some CSS has to be tied to markup and some of it doesn't.

There is a bunch of CSS features and techniques that only work when you enforce certain parent-child relationships between CSS rules.

The simplest example is 'position: absolute', which requires some parent node to have 'position: relative' to be useful. But there is more to it: both flexbox and grid require certain properties to be applied to parent and child nodes at the same time.

Nesting is one way to enforce those relationships and make sure that they are co-located in the stylesheet code.

Something like that is, in my view, good usage of the nesting:

    .parent {
        display: flex;

        & > .child {
            flex: 1 1 auto;
        }
    }

    .child {
        color: red;
    }

I specifically added an example that changes the color for class `child`; I think this should not be included in the nested rule, because this part of the styling is not affected by the parent in any meaningful way.


I agree. I find this approach works very well. The component's own styles are just related to how it looks: color, images, fonts, etc. The parent's styles tell a child where to go in the layout. In general I find if I maintain this division, my css remains sane and I can reuse a component in just about any scenario.

In your example, if .child had flex defined internally, then that becomes a bad time in my experience.


Totally agree with you. Nesting selectors increases coupling with the DOM. I did this a lot and I was not productive. At first it seems to reduce code, but then every change in the template required a change in the CSS.

What helped was to start building utility classes that I can reuse.

As the other commenter said, nesting selectors has only a few good uses and should be used carefully.


I started using Tailwind and couldn't agree more. Everything nowadays seems to be deeply nested and closely tied to a component. Most large React apps have a strict 1 component <-> 1 CSS file relationship. Try deciphering how your styles are being inherited to a component through a maze of nested components and their independent CSS files: nightmare.


Nice, nesting seems to be so much more understandable for typical developers. It will be nice not to need SASS or LESS preprocessors just for this.


While as part of SASS nesting is a marvelous feature I am an absolute fan of, I can’t help but see this as yet another step in the direction of making new and varied browsers even more impossible to implement. I rather like the “unixy” route of having SASS handle the complexity of understanding nesting and having a simpler browser consume plain flat CSS.


That ship has long sailed. Nesting is inconsequential at this point.


Nesting is useful, but not as useful as you think.

It’s great for keeping your pseudo-selectors close to your selectors. It’s great for faking scoping, by wrapping a whole file in a single class name labelling your component.

But nesting also increases specificity, and it can spread the name of a selector token across multiple places within the source file, making debugging much harder.

If you use nesting as an aid to writing more CSS faster, you’re using it wrong. If you use it as an aid to constructing what would be a neat CSS file without nesting, then you’re doing it right.
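
A sketch of the specificity point, assuming the nested rule resolves roughly like its Sass equivalent `.parent .child`:

    /* flat: specificity (0,1,0) */
    .child {
        color: red;
    }

    /* nested: behaves like '.parent .child', specificity (0,2,0),
       so it beats the flat rule regardless of source order */
    .parent {
        & .child {
            color: blue;
        }
    }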


I read the page's reasoning for why everything can't be directly nested, but I still don't get it. Why is an "unbounded lookahead" an issue?


The amount of lookahead necessary dictates what algorithms you can use for parsing. If you go from 1-token lookahead to unbounded lookahead, you might force a lot of people to rewrite their parsers or make major changes (because they would have to use different algorithms). This would mean that a lot of CSS parsers would simply not adopt the new syntax, at least not in a timely fashion.

If you keep 1-token lookahead, existing CSS parsers can just drop this in.


Keeping a tiny bounded lookahead is essential for the extremely fast and memory-efficient parsing that browsers want for CSS. Sass, less, &c. don't really have to care about that, and they have the benefit of also being able to completely trust that the user won't give pathological inputs - unlike browsers on the web. Also, I think the standards bodies want to minimize their changes in general, to keep the door open for unexpected changes later.


I'm no expert in parsing, but my gut tells me there's no way this can actually be a performance concern as much as an implementation complexity concern.


Right... you don't want to force everyone to make significant changes to their parsers. If you are writing a C or C++ parser, you know up-front that there is lots of funny business in the language syntax and can plan your parser accordingly. If you are writing a parser for something a bit more sane, like Java or CSS, then you can choose a much simpler architecture for your parser.

Part of this means that if, say, your language is LL(1) or something like that, you will want to keep future versions of the language LL(1). This can put you in a bit of a tight spot sometimes, when you're making backwards-compatible changes.


The term "unbounded" seems a bit strange to me too. At worst you only need to look ahead until the next ; or { to figure out whether it's a property or a nested selector.

I guess that's unbounded because the parser can't know how much data it will have to buffer, but in practice we're talking like a few dozen bytes most of the time.
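
The "few dozen bytes" intuition fails in principle because a selector list can grow arbitrarily long before the deciding token arrives. A contrived sketch: without a leading `&`, the rule below could still be a declaration for a property named `a` (commas are legal in values) right up until the `{`:

    .menu {
        a:hover, abbr:hover, address:hover, article:hover,
        aside:hover, b:hover, bdi:hover, blockquote:hover {
            text-decoration: underline;
        }
    }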


You could be parsing invalid syntax (forgot a } somewhere), and now your parser tries to nest EVERYTHING until it finds that matching curly brace or errors out at EOF.


IIRC there are easily reproducible pathological cases. The author of Sass said it was one of the mistakes he made when creating Sass: he didn't require & for nesting everywhere.

Unfortunately, on mobile I can't quickly find the relevant links.


I don’t mind the & being required, I just don’t like the fact that there are 2 syntax options depending on whether & is at the start or not.

If the parsing problems are that difficult, I’d rather the @nest be always required.


The lookahead isn’t to the next bit of CSS syntax, it’s looking ahead to the next potential selector match in the DOM. If that can’t be resolved, you can’t unblock the CSS resolution to un-suspend the next blocked portion of the initial render


It’s the same reason backtracking is a performance hit in any scenario. CSS is blocking, and if a selector needs to wait for every network resolution and JS parse/execution to resolve, it’s essentially as expensive as your worst-case scenario.


This looks familiar. Pre-CSS, 1993 familiar, in fact: http://1997.webhistory.org/www.lists/www-talk.1993q4/0264.ht... https://eager.io/blog/the-languages-which-almost-were-css/

Pei Y. Wei (wei@sting.berkeley.edu) of ViolaWWW graphical browser:

    (BODY fontSize=normal
          BGColor=white
          FGColor=black
      (H1   fontSize=largest
            BGColor=red
            FGColor=white)
    )


This seems nice but we'll still likely need preprocessors to aggregate multiple files. The import feature of CSS results in additional, serial requests, which are very not good.


This is where standards and bundlers are converging. Move as much syntax to standards as possible to relieve compilers of the duty, use bundlers to do static analysis of dependencies and build optimal packages for the network waterfall.


sometimes your preprocessor is

    cat


or just minify + cat


unless you send different files each request, pretty sure browser caching is still a thing and imports with `preload` links are just fine?


We really need Web Bundles so we don't have to merge files, but can instead create a bundle with all the individual files in it that populates the network cache correctly.


> The import feature of CSS results in additional, serial requests, which are very not good.

I'm not sure about that. HTTP2 server push seems to fix this: https://en.wikipedia.org/wiki/HTTP/2_Server_Push



The networking overhead of serial requests would be reduced by multiplexing in HTTP/2, i.e., multiple HTTP requests/responses can reuse the same TCP connection in parallel...


I don't think that helps much in this case (or not any more than HTTP/1.1 keep-alive did).


Multiplexing is completely different from keep-alive. With HTTP/2 a web browser requires only one TCP connection per website, as communication happens in parallel; approaches like concatenating several CSS files into one have no use with HTTP/2 anymore! HTTP/1.1 keep-alive allows reusing the same TCP connection for sequential requests/responses only.


Oh I know, I just don't think it helps here, as waiting on the TCP connection is not the hold-up in this situation.


I don't think server push is a thing anymore, but preload hints (<link rel=preload>) might work just as well for this use case.


Between this and SASS starting to add awkward rules like deprecating "/" for division and stuff, it's finally time for me to start pushing internally for PostCSS + plugins.

There's even one for mixins! I might have been out of that space for too long.


Sass didn't deprecate `/` for division for their own giggles - it's because `/` is increasingly being used in plain CSS values (the most prominent example I can think of right now is CSS grid[1]), so it became increasingly hard to distinguish raw values from Sass expressions.

1: https://developer.mozilla.org/en-US/docs/Web/CSS/grid-column


Another case is to delimit alpha in color functions: rgb(255 0 0 / 50%) is valid for full red at 50% opacity.
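
Both uses of `/` side by side, combining the grid example from the sibling comment with the alpha one here:

    .banner {
        grid-column: 1 / 3;           /* '/' separating grid line numbers */
        color: rgb(255 0 0 / 50%);    /* '/' delimiting the alpha channel */
    }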


Suppose one of your favorite technical pen brands was suddenly only available in squishy ball form.

Would you care about the very legitimate factory processes that made them do it? Or would you instead buy one that didn't?

I'm an end user for SASS and what they are doing for their own reasons is making me want to walk away.


WTF. Why is w3.org blocking tor exit nodes? That's a recent and worrying development!


Hell YES. I've been using SCSS for this reason alone: just to have a CSS codebase that didn't need repetition, where nesting provided more documentation by default (given that there's a hierarchy in place).


I used SASS in a recent project essentially just for this feature, so it seems like the CSS WG is slowly chipping away at convenience features that some people have been using for quite some time.


Good!

Now, CSS scopes back in the menu when?


There's work on CSS scoping going on as well.

https://drafts.csswg.org/css-scoping-2/

Yet another thing Miriam Suzanne has been working on is a proposal for CSS Layers. I think it's a way of declaring what styles should "win" when specificity is equal instead of "last one loaded wins" or using the very crude `!important` tool. This could be particularly helpful if stylesheets for 3rd party components are loaded dynamically or on platforms where the style author doesn't have absolute control over stylesheet order.

https://drafts.csswg.org/css-cascade-5/
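
A sketch of what that looks like in the cascade-5 draft (the `@layer` syntax may still change): the layer order declared up front decides which rule wins, rather than source order of the rules themselves:

    @layer framework, site;

    @layer framework {
        button { color: blue; }
    }

    @layer site {
        button { color: rebeccapurple; }  /* wins: 'site' is the later layer */
    }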

All of these will take some time to get right and then some patience for older browsers to die off (or decent fallback strategies, I think all these would be hard to polyfill).


I would love to see this implemented within the style attribute in HTML. It would create a nice middle ground between totally separate CSS files and the new trend of designing with pseudo-CSS class names in tools like Tailwind. It keeps your CSS close to the semantic HTML without drowning it in style and class attributes. So you could do something like this (clearly pointless) example:

  <div style="
    > h1 {
      font-weight: bold;
    }
  ">
    <h1>title</h1>
  </div>

Just add a way to inherit from a css class within the style attribute and you have a sort of inline sass.


Inline styles defeat a very large part of browser optimizations, so it is very unlikely something like this would be added.


Even worse, any modern CSP would forbid this anyway.


The very last example in 2.1 ("__bar.foo { color: red; }") should instead be "foo.__bar { color: red; }" if I understand it correctly.


No, I think the example is correct. `__bar` in this case is interpreted as an element name (a custom HTML element), not a class.


Good. But nesting could and should have been added to CSS a decade ago or more, and it is an abject failure of the working group that it wasn’t, despite the obvious, overwhelming demand from authors.


Considering it works just fine with a build step it's not that urgent.

CSS Custom Properties, on the other hand, I'd argue were much more important to drive through quickly as a build step can't modify them at runtime.


"Abject failure" is more than harsh. Not many people have the expertise to sit on these committees and the passion to drive through new changes. It takes a while.


In 2012, the www-style mailing list (where the CSS WG organised and discussed before moving to GitHub) was receiving up to 1400 messages a month: https://lists.w3.org/Archives/Public/www-style/ . There was active participation from professional, full-time standards experts employed by Google, Apple, Mozilla, Opera, Microsoft and more organisations. The author of this nesting spec first proposed it in 2011: https://lists.w3.org/Archives/Public/www-style/2011Jun/0022....

And this was hardly the first such discussion. You can find requests going back for years before even that point, with people requesting nesting/hierarchical rules and being shot down.

So no, there was no lack of expertise or passion. So why has it taken this long? A failure of leadership? A failure of process? Perhaps the few, pigheaded opponents of an obviously desired and useful feature were able to sabotage progress by making it impossible to achieve consensus. I don't know. But I refuse to let them off the hook because they finally got around to delivering something in 2021 that the web should have had in 2001.


> So no, there was no lack of expertise or passion.

Passion to make proposals != passion to drive through changes. Making the proposal is the first step. If you're in it for the long haul, you make revisions and get consensus.

Software engineers, largely speaking, love to design things, build them, and move on to the next project instead of dealing with maintenance. Standards committees, largely speaking, are designed to get consensus first and figure out what the issues are with a proposal before implementing it. This kind of "eat your vegetables" way of working drives off a lot of people. And of the remaining engineers who are patient enough to drive something through committee, most of them are off busy doing other things.

You might have a taste of what this is like if you have ever worked at a company that did design docs before implementation. Like, if you're proposing a change to the system, and you write up a short document and get a couple other engineers assigned to review it. Have you ever had more than a couple engineers assigned, like five or ten? All looking at it with critical eyes? Now imagine that they work at different companies.

> Perhaps the few, pigheaded opponents of an obviously desired and useful feature were able to sabotage progress by making it impossible to achieve consensus.

Jeezus, that's a great example of the kind of attitude that makes this so painful in the first place. I want to print this comment out on paper and mail it to the next person who complains about slow standards committees.

You're speculating about how people's personality flaws are sabotaging the process. Well, guess what? You're not the only one making shitty comments like that. People who make committees work get a lot of disrespect from random strangers on the internet.

Maybe someday you'll sit on a committee, but you shouldn't have to do that in order to have an ounce of empathy for how standards committees work.


Dude, this spec was ten years between ideation and a FPWD, for a feature with proven demand and prior art. Sympathy for committee members would be easier if they gave any sign of recognising that this constituted a failure to make timely progress and fulfil the working group's chartered purpose: advancing CSS to simplify web authoring.

But standards committees display the same depressing insularity and hostility to outside criticism as most institutions, preferring to censure its tone instead of reflecting on its truth. In reality, it wouldn't matter how diplomatically it was conveyed, it wouldn't trigger any kind of self-reflection, just the exact same special-pleading about their task's unique difficulty.


The various working groups are plagued by various problems.

SVG is carried forward by a few determined people against all odds (see e.g. this tale by Amelia Bellamy-Royds https://codepen.io/AmeliaBR/post/me-and-svg)

IIRC the CSS working group is just slow for various reasons (from disinterest to lack of people to drive things forward).

Some parts of working groups and standards committees have been completely taken over by Google that just pushes its own agenda.

And so on.


It's definitely way overdue. I hope this lands soon, with full support in modern browsers.



