When I visit non-US sites, they mostly don't seem to load 8 different JS files from different domains, each of which then loads another layer of JS files, repeating for 3 to 8 iterations.
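If you want to eyeball that cascade yourself, here's a minimal sketch (not part of the original comment; the function names and the crude regex second hop are just illustrative assumptions) that lists the third-party hosts serving scripts on a page and then peeks one level deeper into what those scripts pull in:

```python
# Rough sketch: enumerate third-party <script src> hosts on a page, then take one
# more hop by grepping each fetched script for absolute .js URLs. This only
# approximates the "scripts loading scripts" chain described above; real loaders
# often inject tags at runtime, which a static fetch like this won't see.
import re
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup


def third_party_script_hosts(page_url: str) -> set[str]:
    """Hosts (other than the page's own) that serve <script src=...> on page_url."""
    page_host = urlparse(page_url).netloc
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    hosts = set()
    for tag in soup.find_all("script", src=True):
        host = urlparse(tag["src"]).netloc
        if host and host != page_host:
            hosts.add(host)
    return hosts


def second_level_hosts(script_url: str) -> set[str]:
    """Very rough second hop: hosts of absolute .js URLs referenced inside a script."""
    body = requests.get(script_url, timeout=10).text
    return {urlparse(u).netloc for u in re.findall(r'https?://[^\s"\']+\.js', body)}
```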
I think it's a global trend. If you can afford cheap data collection then why not do it?
For example, Ghostery indicates 8-11 trackers blocked on major Hungarian news portals. I'd guess that's close to the average. Check a DailyMail article ;)