1. Sorting by the median. The mean is not very informative about the quality of a source: most sources mostly provide low-scored content, with occasional hits that drive the mean up. The median fixes this problem.
2. Cutting off at 10 submissions. An arbitrary minimum to exclude pure luck from the results.
In the end, this ranking excludes websites like github.com and youtube.com, but it features some lesser-known sources.
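For concreteness, here is a minimal sketch of how a ranking like that could be computed. Everything in it is assumed: the (domain, score) pairs, the sample values, and the variable names; plug in whatever dump or API you actually pulled the submissions from.

    from collections import defaultdict
    from statistics import median

    MIN_SUBMISSIONS = 10  # the arbitrary cutoff mentioned above

    # Hypothetical input: one (domain, score) pair per submission.
    submissions = [
        ("example.com", 3),
        ("example.com", 250),
        ("example.com", 4),
        # ...
    ]

    # Group scores by domain.
    scores = defaultdict(list)
    for domain, score in submissions:
        scores[domain].append(score)

    # Keep domains with enough submissions, rank by median score.
    ranking = sorted(
        ((domain, median(s)) for domain, s in scores.items()
         if len(s) >= MIN_SUBMISSIONS),
        key=lambda pair: pair[1],
        reverse=True,
    )

    for domain, med in ranking[:25]:
        print(f"{domain}: median score {med}")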
What problem does the median fix? Many of the top sites in this list are fairly niche; some don't even really exist any more (e.g., adgrok.com being a business that sold to Twitter in 2011). Undoubtedly, the median is a better metric than the mean when the goal is to remove outliers... but given the way HN works, I'm not sure that need is relevant here. github.com and nytimes.com are absent from this list because a lot of their links get submitted... but I bet a lot more HN users can recall 5 great submissions in the past week from either domain than they can from chris-granger.com, even among fans of Light Table and Eve.
That said, I would be interested in the mean, just to see how different the two lists might be.
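As a rough illustration of how the two lists could diverge (entirely made-up numbers): a domain with mostly low scores and a couple of huge hits beats a consistently decent one on the mean, but loses badly on the median.

    from statistics import mean, median

    # Made-up score distributions, just to show the effect.
    big_site = [2, 3, 3, 4, 5, 5, 6, 8, 950, 1200]          # mostly low, two huge hits
    niche_site = [40, 55, 60, 62, 70, 75, 80, 85, 90, 120]  # consistently decent

    print(mean(big_site), median(big_site))      # 218.6  5.0
    print(mean(niche_site), median(niche_site))  # 73.7   72.5

By mean the big site wins comfortably; by median it drops well below the niche one, which is exactly the behavior the original ranking is built around.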