Hacker News
CSS Stats (cssstats.com)
142 points by sethbannon on Nov 24, 2014 | 32 comments



hey CSS Stats - let me know if we can sponsor you. chris at maxcdn com - can get you an account right now.


This is awesome in a "now I want to spend ages refactoring my CSS to make my numbers look awesome" kind of way.


If you'd like to collect similar stats for your own code base, check out parker[1] and stylestats[2]. At my company, we aggregate certain numbers from both tools and send them to Graphite to monitor over time, as inspired by this talk from GitHub[3][4].

[1] https://github.com/katiefenn/parker

[2] https://github.com/t32k/stylestats

[3] https://speakerdeck.com/bleikamp/sass-at-github?slide=68

[4] https://vimeo.com/86700007
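For anyone wanting to replicate that pipeline: Graphite's plaintext protocol is simple enough to script directly, one `name value timestamp` line per datapoint over TCP port 2003. A rough Python sketch (the host and metric names are made up; point it at your own carbon-cache):

```python
import socket
import time

def format_metric(name, value, timestamp):
    # Graphite's plaintext protocol: one "name value timestamp" line.
    return f"{name} {value} {timestamp}\n"

def send_metric(name, value, host="graphite.example.com", port=2003):
    # Hypothetical host; carbon-cache listens on 2003 by default.
    line = format_metric(name, value, int(time.time()))
    with socket.create_connection((host, port)) as sock:
        sock.sendall(line.encode("ascii"))
```

Feeding it numbers parsed out of parker or stylestats output on each CI run is all the "monitor over time" part takes.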


I get only blank pages returned. Assuming it has buckled under the onslaught of HN visitors, what does it show when it works normally?


Something like this: https://www.dropbox.com/s/vatqtaur20f35yi/Screenshot%202014-...

I found their github[1] and ran it locally. Worked perfectly.

[1]https://github.com/mrmrs/cssstats


Cool, thanks. Those are some pretty big numbers. What kind of a site is it?

It is really interesting to see the variance, btw. Twitter, Kickstarter and Bootstrap running at thousands of rules, Google and Wikipedia at under 200. And a whole bunch in the 500-1000 range, such as Facebook, Amazon and Mozilla. I'm actually surprised at the relatively low rule-count for Facebook and Amazon as both those sites always gave me the impression (perhaps mistakenly) of having more CSS rules than I could read in a lifetime. Conversely, GitHub and Stack Overflow have more rules than I would have expected.

I guess that just goes to show that CSS complexity and visual clutter do not necessarily correlate.


Pretty sure the Facebook one is just checking their logged-out page, which is much more straightforward than the logged-in view. Logged-in I count ~144kb of CSS, which still isn't bad, but isn't as mind-bogglingly low.

Amazon is definitely impressive though!

(As are Stripe, Apple, and Airbnb, in basically that order.)


That site specifically was a porn site. Our 3.0 upgrade of the site involved a lot of "get this working, NEXT!" so the CSS is kind of a mess. I've got a nice stat page to show me just how much of a mess now though!


It looks like we have a bug and that we're not grabbing all of Amazon's CSS - just the first referenced stylesheet. Opening an issue on GitHub. Thanks for the heads up.


This was happening to me too. I could see the output from node that said it figured out there were multiple stylesheets, but it didn't seem to care, and just went about its business showing the first one.
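The fix amounts to collecting every `<link rel="stylesheet">` before fetching, not just the first. The project itself is Node + PhantomJS, so purely as an illustration, here's what that collection step looks like with Python's stdlib parser:

```python
from html.parser import HTMLParser

class StylesheetFinder(HTMLParser):
    """Collect every <link rel="stylesheet"> href, not just the first."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and "stylesheet" in (a.get("rel") or "") and a.get("href"):
            self.hrefs.append(a["href"])

finder = StylesheetFinder()
finder.feed('<link rel="stylesheet" href="a.css"><link rel="stylesheet" href="b.css">')
# finder.hrefs now holds both stylesheet URLs, ready to fetch and concatenate.
```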


Kudos to the devs; it really was as simple as downloading the zip from GitHub and issuing an install && start :) Pulled in phantomjs, etc. and just worked out of the gate :)

It still has some issues with style detection (as noticed in other comments), but once that gets fixed, I'll be sure to check back :) Nice project!


Apologies for the downtime. Should be back up now.


Still not loading here.


It is. Thanks!


down again, 504 bad gateway


same here


What would this tool be useful for?

It almost seems like an onboarding tool for third-party CSS, but you can't construct or derive style rules from any of the presented formats, just raw colors/sizes.

The charts and metrics all seem superfluous.


The specificity graph is quite meaningful. For example, compare the specificity graphs for Yahoo (http://cssstats.com/stats?url=http%3A%2F%2Fyahoo.com#specifi...) and Amazon (http://cssstats.com/stats?url=http%3A%2F%2Famazon.com#specif...) - selected for both being public-facing sites of similar size.

Because CSS rules are applied by specificity (http://specificity.keegan.st/) first and source order second, any property overwrites later in the document require equal or greater specificity than rules in the middle. A chart with a mountain in the middle today will probably develop more mountains on the right over time - it's a leading indicator of poor maintainability. Pairing this tool with one that tells you the line number of offenders would help in identifying areas in need of refactor.
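To make the ordering rule concrete: specificity is a three-part count (IDs, then classes/attributes/pseudo-classes, then type selectors/pseudo-elements) compared left to right. A rough regex-based sketch in Python, which ignores corner cases like `:not()` and escaped characters:

```python
import re

def specificity(selector):
    """Return (ids, classes, elements) for a single CSS selector.

    Tuples compare left to right, which matches how browsers rank
    them: (1, 0, 0) beats (0, 9, 9). Naive sketch only.
    """
    ids = len(re.findall(r'#[\w-]+', selector))
    # Classes, [attr] selectors, and single-colon pseudo-classes.
    classes = len(re.findall(r'\.[\w-]+|\[[^\]]*\]|(?<!:):(?!:)[\w-]+', selector))
    # Type selectors (at start or after a combinator) and ::pseudo-elements.
    elements = len(re.findall(r'(?:^|[\s>+~])[a-zA-Z][\w-]*|::[\w-]+', selector))
    return (ids, classes, elements)
```

So `#nav .item a` scores (1, 1, 1) and can only be overridden later in the file by something of equal or greater weight - which is exactly how the right side of the graph grows mountains.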

Total vs Unique declarations reveals opportunities for reusable OOCSS-type classes (which itself can be difficult to do in a stylesheet full of specificity mountains).
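The total-vs-unique ratio is also cheap to compute yourself. A naive Python sketch that normalizes whitespace and counts repeated property:value pairs (it handles `@media` wrappers by matching innermost blocks, but not comments or strings containing braces):

```python
import re

def declaration_stats(css):
    """Count total vs. unique declarations in a stylesheet.

    A big gap between the two numbers suggests declarations that
    could be collapsed into shared, reusable classes.
    """
    decls = []
    for block in re.findall(r'\{([^{}]*)\}', css):
        for decl in block.split(';'):
            decl = decl.strip()
            if decl:
                prop, _, value = decl.partition(':')
                decls.append(f"{prop.strip()}:{value.strip()}")
    return len(decls), len(set(decls))
```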

The colors aren't terribly useful without a histogram and some of the other stats do seem a bit superfluous, but they may just need someone more experienced than I to draw the right conclusions.


Amazon might not be the best example - loading http://cssstats.com/stats?url=http%3A%2F%2Famazon.com#specif... multiple times has generated multiple, very different graphs. I wonder if the CSSStats bot has been thrown into more than one A/B experiment.


It's not currently grabbing all of the stylesheets - we are working on a fix for this at the moment.


Doesn't support gzip-compressed files? I'm getting back garbage from my production CSS file.
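That's consistent with a fetcher that reads the raw response body without checking `Content-Encoding`: a gzipped stylesheet then comes out as binary noise. Illustrative Python only (the project is Node, and the function names here are made up):

```python
import gzip
import urllib.request

def maybe_inflate(body, content_encoding):
    # Inflate only when the server says it gzipped the body.
    if content_encoding == "gzip":
        return gzip.decompress(body)
    return body

def fetch_css(url):
    # Advertise gzip support, then undo it before treating the bytes as CSS.
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        body = maybe_inflate(resp.read(), resp.headers.get("Content-Encoding"))
    return body.decode("utf-8", errors="replace")
```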


Please feel free to open an issue and let us know specifically what is going wrong.

http://github.com/mrmrs/cssstats/issues

Thanks!


Is there a browser extension version of this so that it can be run on pages that require authentication?


We don't have one yet - but that's a great idea. We'd like to support basic auth somehow.


I had no idea Facebook's CSS was so minimal (less than a kilobyte!). Bravo, FB.



Now I'm seeing 49kB. Strange. It's not cookies either because it's 49 in an Incognito window as well.


A/B testing I would assume.


Definitely will try to improve my styles to reduce unique definitions.


The site loads very fast. That's impressive.


Why not just host on aws?


It is hosted on AWS! I am just a dev ops n00b. Working out the kinks at the moment.



