Hacker News

Is this a mostly US problem?

Non-US sites mostly don't seem to load 8 different JS files from different domains, each of which then loads another layer of JS files, repeating for 3 to 8 iterations.
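The chain-loading described above can be sketched as follows. This is a purely illustrative simulation (the domains and layer structure are hypothetical, and the DOM is mocked so it runs under Node): each tracker script, once it executes, injects the scripts of the next layer.

```javascript
// Record every script "load" so we can count the total.
const loaded = [];

// Minimal mock of the DOM APIs a tracker would use in a real page.
const document = {
  createElement: () => ({}),
  head: { appendChild(script) { loaded.push(script.src); } },
};

// Hypothetical tracker domains, one array per layer of the chain.
const layers = [
  ["https://tracker-a.example/t.js", "https://tracker-b.example/t.js"],
  ["https://ads-c.example/pixel.js"],
  ["https://cdn-d.example/fingerprint.js"],
];

function injectLayer(depth) {
  if (depth >= layers.length) return;
  for (const src of layers[depth]) {
    const s = document.createElement("script");
    s.src = src;
    document.head.appendChild(s);
    // In a real page, the fetched script would itself trigger the
    // next layer when it executes; simulated synchronously here.
    injectLayer(depth + 1);
  }
}

injectLayer(0);
console.log(loaded.length); // total script loads across all layers
```

Because every first-layer script pulls in the whole chain below it, the number of requests grows multiplicatively with depth, which is why a handful of embedded trackers can balloon into dozens of network requests.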

I think it's a global trend. If you can afford cheap data collection then why not do it?

For example, Ghostery indicates 8-11 trackers blocked on major Hungarian news portals. I'd guess that's close to the average. Check a DailyMail article ;)
