I wrote Helium CSS a few years ago, https://github.com/geuis/helium-css.
There are several problems with the approach MinCSS takes that are addressed in how Helium was constructed.
1) Server-side parsing is brittle. Using regexes for this is a no-no. Helium uses document.querySelector in the browser to genuinely test whether a selector matches or not. It even includes Sizzle for older-browser support.
2) Different CSS is used to target different browsers, or at least different capabilities. Running this server-side doesn't tell you which browser some perfectly valid CSS is meant for.
3) Removing unmatched CSS rules by default is a no-no. Again, tools like this can be clever, but they're never smart. It takes the engineer's eye to say "This is something I actually need to keep" or "Yup, remove this."
4) Helium works in the browser. This lets you easily run it across multiple browsers that you need to support.
You could make a headless browser that uses Helium, but you're going to be limited to WebKit (PhantomJS). It's a thought I've had, but I never really thought it was worth the trouble of building such a limiting system.
I encourage anyone who needs a versatile tool to try Helium. Feel free to contribute patches and stuff.
E.g., what about Ajax calls that result in a new div showing?
or a responsive CSS file with media queries depending upon the size of the screen?
You feed in all URLs at once:
`urls = ["/", "/sign_up/", "/about/",]`
It also pre-processes JS files and tries to match CSS selectors there as well. That would work for jQuery and Zepto!
Edit: Okay, that won't work for `addClass`! :)
And PHP etc. is too dynamic.
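In miniature, the feed-everything-at-once idea looks like this: gather class usage across all the pages first, and only then decide what's unused. A toy stdlib sketch (mincss itself does full selector matching and page fetching; the regex on the CSS here is exactly the brittle shortcut criticized elsewhere in this thread, fine only for illustration):

```python
# Collect every class used across all pages, then flag class
# selectors no page uses. Toy version; helper names are invented.
from html.parser import HTMLParser
import re

class ClassCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.classes = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "class" and value:
                self.classes.update(value.split())

def unused_class_selectors(css, pages):
    used = set()
    for page in pages:
        collector = ClassCollector()
        collector.feed(page)
        used |= collector.classes
    declared = set(re.findall(r"\.([\w-]+)", css))
    return declared - used

css = ".nav { color: red } .promo-banner { display: none }"
pages = ['<div class="nav">home</div>', '<p class="nav intro">about</p>']
print(unused_class_selectors(css, pages))  # {'promo-banner'}
```

The key point is the union across pages: a selector only counts as unused if no page in the whole set matches it.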
Personally, I think the best option is a program that records you using the site. As a dev, I could use my site for, say, 30 minutes, touching each point of key functionality. Then at the end, review which sections of CSS were never used, and pick and choose what to discard based on that (i.e. I can then manually check for fringe cases I might know of).
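The record-then-review workflow could be sketched like this (all names invented for illustration, not any existing tool's API):

```python
# Each time the recorder samples the page -- on load, after an Ajax
# update, after opening a popup -- it notes which selectors matched.
# At the end, a human reviews the never-matched ones.
class CoverageRecorder:
    def __init__(self, all_selectors):
        self.all_selectors = set(all_selectors)
        self.seen = set()

    def sample(self, matched_selectors):
        """Record selectors that matched at this moment of the session."""
        self.seen.update(matched_selectors)

    def report(self):
        """Candidates for removal -- a human still makes the final call."""
        return sorted(self.all_selectors - self.seen)

rec = CoverageRecorder([".nav", ".modal", ".legacy-footer"])
rec.sample({".nav"})            # initial page load
rec.sample({".nav", ".modal"})  # user opened a dialog
print(rec.report())  # ['.legacy-footer']
```

Because the recorder samples after user interactions, not just on page load, it catches the Ajax/DHTML cases that a one-shot static scan misses.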
Additionally, my version can also inline the page images as base-64 encoded blocks.
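A minimal sketch of the image-inlining step; the commenter's actual implementation may differ, and a real version would map file extensions to proper MIME types rather than guessing from the suffix:

```python
# Replace url(image.png) references in CSS with base-64 data URIs.
import base64
import re
import tempfile
from pathlib import Path

def inline_images(css, root):
    """Inline local image references as data URIs (toy version)."""
    def replace(match):
        path = Path(root) / match.group(1)
        data = base64.b64encode(path.read_bytes()).decode("ascii")
        ext = path.suffix.lstrip(".")
        return f"url(data:image/{ext};base64,{data})"
    return re.sub(r"url\(([^)]+\.(?:png|gif|jpe?g))\)", replace, css)

# Demo with a throwaway file standing in for a real image:
root = tempfile.mkdtemp()
(Path(root) / "dot.png").write_bytes(b"\x89PNG...")
print(inline_images("a { background: url(dot.png) }", root))
```

Inlining trades extra requests for a larger stylesheet, so it pays off most for many small images.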
Check this out for a use case where I apply mincss and gain a massive performance boost: http://www.peterbe.com/plog/mincss-in-action
Here is a link to my latest version of it:
At this point it has become less simple and more versatile.
Having a proper tool to do this is great news!
The other issue is that all too often user events trigger the adding, removing, or refactoring of selectors.
How does this handle these scenarios?
That said, I am sure that static sites/non-JS intensive sites might be able to use this.
I've been meaning to port it to node and run it with a headless browser. Contributions are always welcome.
How does it work?
- Open up the first page of the site you want to check and press the "Scan" button in the "CSS Coverage" Firebug tab.
- In case of a rich (Ajax or DHTML) site, open up as many divs/popups/tabs in the page as possible and press "Scan" again.
- Visit other pages of your site and press "Scan" again.
Each time you press "Scan", the CSS files included in the current page are shown with the number of times each rule has been found applied across the pages you've scanned so far.
I installed it.
Managed to get 100kB of CSS down to about 2kB for the single-page site I was using it on.
Like the author here, I didn't get into exploring how to make this work effectively on a large site with a number of dynamic pages - the only thing I added for this type of approach was to provide a whitelist.
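The whitelist idea might look like this: selectors matching any whitelist pattern survive even if the scan never saw them (e.g. classes that only JavaScript adds). A sketch with invented names:

```python
# Filter the "unused" list through glob-style whitelist patterns.
import fnmatch

def apply_whitelist(unused, whitelist):
    """Drop selectors that match any whitelist pattern."""
    return [s for s in unused
            if not any(fnmatch.fnmatch(s, pat) for pat in whitelist)]

unused = [".js-toggle", ".modal-open", ".old-banner"]
print(apply_whitelist(unused, [".js-*", ".modal-*"]))  # ['.old-banner']
```

Glob patterns keep the whitelist short: one `.js-*` entry protects every JavaScript-only hook class at once.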
Would be nice if you could browse through the proxy for a bit and have it record the overall usage of CSS for your site.
Ex: a line of CSS that sets the text color of li tags to something, but every li tag would already be that color because of a CSS rule on body.
I'm guessing it'd be hard to build something useful and correct, but I think it'd be an interesting extension of this idea.
You can feed mincss a bunch of different URLs and/or different .html files and it'll work out the CSS in that. (Note: even if you feed it the raw HTML as a string, you need to give it a URL.)
Or, you do one page at a time. However, that's slow and will only be an option if you have serious caching. See
It also supports lists of HTML and CSS files to search through. It can perform login, find duplicates, list used and unused rules, and count how many times each rule is matched in the markup.
Requires Node, can be installed with npm.
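Duplicate-finding, at its simplest, is just counting normalized selectors. A toy sketch (the regex split is the brittle shortcut; a real tool would use a proper CSS parser):

```python
# Count how often each selector heads a rule; report the repeats.
from collections import Counter
import re

def duplicate_selectors(css):
    """Selectors that appear more than once in the stylesheet."""
    selectors = [s.strip() for s in re.findall(r"([^{}]+)\{", css)]
    counts = Counter(selectors)
    return sorted(s for s, n in counts.items() if n > 1)

css = ".nav { color: red } .nav { color: blue } .footer { margin: 0 }"
print(duplicate_selectors(css))  # ['.nav']
```

A real implementation would also normalize whitespace inside comma-separated selector groups before comparing.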
1. Start the extension scan
2. Scan all external CSS currently in use (page load)
3. Scan all inline CSS currently in use (page load)
4. Developer clicks a "Finish and Report" button:
a) Warning for inline CSS that could be moved externally
b) Warning for external CSS that is never used
c) Export (per CSS file, or combined and minified) CSS files, with unused CSS removed.
Very possible; some CSS rules are only used on one or two pages.
Here's an example of using it on just one page at a time: http://www.peterbe.com/plog/mincss-in-action