Created this because I hated having to slow down my websites by adding the gigantic Google Tag Manager and Analytics libraries, combined with their randomly slow CDN, just to track some basic page views.
Both of Google's recommended methods of adding Analytics to a website load Google's JS library asynchronously[1]. Including it has no meaningful impact on a site's time to first paint or interactivity.
There's a good argument that loading 70Kb of JS you're not using is a bad idea, and if that's the case then your smaller script is an improvement. But there's also a good argument that Google Analytics is powerful and useful, and maybe you should be using the features those 70Kb give you to improve your site in ways the user would notice. I guess it's down to your specific use case.
Really good point and I agree. This was intriguing at first glance because who doesn't want less JS on their site?! Of course! However, missing out on other events being tracked, advanced demographic information(?), and what else?? That's a huge miss - especially if you work with Google Ads and have your Analytics linked. Your ad campaigns won't be as effective and could potentially cost more money to save someone an unknown amount of loading time.
My Firefox never caches anything. Going back and forth in history always reloads the page. That's why I keep many tabs open: to avoid reloading and waiting.
about:cache says:
Number of entries: 418
Maximum storage size: 30720 KiB
Storage in use: 12698 KiB
Storage disk location: none, only stored in memory
Nothing is ever really cached anymore, meaningfully at least.
We're in an age where websites happily serve up 8MB JavaScript bundles to display their static homepage to a mobile browser that has maybe a couple hundred MB allocated to caching things. Sure, it may be there if you immediately load another page on that site (and the devs aren't using a bundler that serves a different 8MB bundle next time).
But if they leave and come back, they'll get the same full download experience.
For what it's worth, I run A/B tests at work where I consistently see large caching effects. For example, when we compare an experimental version of something to the currently deployed version, it's critical that we cache-bust the control group. And changes that improve caching generally make things significantly better.
Caching effects are much smaller on Safari with its per-domain cache, but even there if something broke caching for us it would be a high priority to fix.
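To make the cache-busting idea concrete, here's a minimal sketch (the function name and the `cb` parameter are illustrative, not anything from the parent comment's actual setup): the control group's asset URL gets a unique token appended so any cached copy is bypassed, while the treatment group is left untouched.

```javascript
// Hypothetical sketch: force the control group to re-download an asset by
// appending a unique version token, so stale cached copies can't skew the
// comparison against the experimental build.
function cacheBust(url, token) {
  // Preserve any existing query string; "cb" is an arbitrary parameter name.
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + 'cb=' + encodeURIComponent(token);
}

// Control group gets a fresh token per experiment run.
var controlUrl = cacheBust('/static/app.js', 'exp-42');
```

The token just needs to change whenever you want a guaranteed fresh fetch; a per-experiment ID or a content hash both work.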
It depends. Browsers that care about user privacy and tracking are moving to first-party-isolated caching. So GA will be cached for a user on your site only if the user has visited _your_ site before.
You're getting downvoted, probably because your comment is whataboutery at best. Your point is only relevant if you assume someone with a 6Mb homepage is starting by replacing their Analytics code, a use case that will probably never occur.
Small piece of history: I used to work on Analytics and they would give out swag. One piece was a T-shirt with the entire tracking snippet on it. It was a great conversation starter.
Sad to see that Tag Manager and the current tracking JS are big enough to force website owners to compromise.
It's cool that it also works with SPA frameworks. Thanks for sharing.
In other surveillance news, I've been adding Matomo[1] for web analytics to my site for the first time. Used to be called Piwik? It's free/self-hostable web analytics, which is more than enough for small projects and landing pages.
I've been pleasantly surprised with how it works so far, but I had to write some code to trigger it in the right part of my react-router initialization.
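For anyone in the same spot, the gist of what I did looks roughly like this. Matomo's tracker consumes commands pushed onto a global `_paq` array, and `setCustomUrl`/`trackPageView` are part of its documented JS API; the `trackRouteChange` function name and where exactly you hook it into react-router are my own choices, not anything official.

```javascript
// Sketch: report SPA route changes to Matomo via its _paq command queue.
// In the browser, Matomo's loader script picks these commands up; the
// fallback empty array just keeps this snippet runnable outside a browser.
var _paq = (typeof window !== 'undefined')
  ? (window.
_paq = window._paq || [])
  : [];

function trackRouteChange(pathname) {
  _paq.push(['setCustomUrl', pathname]);  // tell Matomo the new URL
  _paq.push(['trackPageView']);           // record a page view for it
}
```

With react-router you'd call `trackRouteChange` from a history listener (the exact hook depends on your router version), so each client-side navigation registers as a page view.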
For the surveillance conscious I can also recommend using Fathom (https://usefathom.com). I’ve been using it for a few weeks now and have only good things to say.
I haven't tried Matomo/Piwik, but from the looks of it, it has a pretty heavy feature set. If it works for you, I don't see any reason to switch. But if you prefer something that's intentionally low on features and "minimal", Fathom is pretty cool.
While you're at it, why not be less invasive and neuter some of the data you send across? Does Google need to know about screen colordepth, language settings and text encodings, or fine-grained viewport dimensions, or is that stuff just bits of entropy to fingerprint users? Hardcode it!
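As a sketch of what "hardcode it" could mean here: these parameter names (`sd`, `sr`, `vp`, `ul`, `de`) are from GA's Measurement Protocol, but the fixed values and the merge helper below are just my illustration, not the snippet's actual code.

```javascript
// Sketch: overwrite the fingerprint-prone fields with fixed, common values
// before sending, so they stop contributing entropy. Parameter names are
// Measurement Protocol ones; the chosen constants are arbitrary examples.
var NEUTERED_FIELDS = {
  sd: '24-bit',     // screen color depth
  sr: '1920x1080',  // screen resolution
  vp: '1920x1080',  // viewport size
  ul: 'en-us',      // user language
  de: 'UTF-8'       // document encoding
};

function withNeuteredFields(payload) {
  // Shallow-merge the hardcoded values over whatever was collected.
  var out = {};
  for (var k in payload) out[k] = payload[k];
  for (var k2 in NEUTERED_FIELDS) out[k2] = NEUTERED_FIELDS[k2];
  return out;
}
```

The page-view counts stay accurate; only the device-description fields become uniform across your visitors.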
The snippet seems to circumvent the Google Analytics Opt-out Browser Add-on which many sites mention as a way to disable GA tracking in their privacy policy. So if you use the snippet, don't tell your users that they can opt out with the add-on.
Are you sure it circumvents it? My uBlock Origin is still properly blocking the API call to google analytics. But since the Add-on is from Google itself it could work in different ways indeed.
Could you make it a tiny bit smaller by aliasing `window` and `document` and `localStorage` to remove the duplicated calls? `var w=window;var d=document;var l=localStorage;`
Interestingly, copying/pasting the code into https://www.minifier.org/ saves 69 bytes, but https://jscompress.com/ shows an error: `Invalid regular expression: /+/: Nothing to repeat (line: 3, col: 48)`
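That error is about a bare `+` in a regex literal: `+` is a quantifier, so with nothing before it the pattern is invalid, and escaping it is the usual fix. A quick demonstration (not the snippet's actual line 3, which I haven't inspected):

```javascript
// new RegExp('+') is the same as /+/ and is rejected, because + needs
// something to repeat. Escaping gives a literal plus sign instead.
var threw = false;
try {
  new RegExp('+');               // invalid: "Nothing to repeat"
} catch (e) {
  threw = true;                  // SyntaxError caught
}

var escaped = new RegExp('\\+'); // same as /\+/ — matches a literal "+"
var matches = escaped.test('a+b');
```

So the difference between the two minifiers is likely just how strictly their parsers treat that regex, and escaping the `+` in the source should satisfy both.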
Would you expect Google's API to change often? My biggest concern with something like this is their /collect endpoint changing and analytics data stops being tracked.
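One way to contain that risk is to keep the endpoint and hit format in a single place so an API change is a one-line fix. This is a sketch of the kind of Measurement Protocol v1 pageview such snippets send (the parameter names `v`, `tid`, `cid`, `t`, `dp` are from that protocol; `UA-XXXXX-Y` is a placeholder, and the function name is mine):

```javascript
// Sketch: build a Measurement Protocol v1 pageview hit for GA's /collect
// endpoint. Centralizing the URL and params makes an endpoint change a
// small, local fix instead of a scattered one.
var COLLECT_URL = 'https://www.google-analytics.com/collect';

function pageviewPayload(trackingId, clientId, path) {
  var params = {
    v: '1',           // protocol version
    tid: trackingId,  // property ID, e.g. 'UA-XXXXX-Y'
    cid: clientId,    // anonymous client ID
    t: 'pageview',    // hit type
    dp: path          // document path
  };
  return Object.keys(params)
    .map(function (k) { return k + '=' + encodeURIComponent(params[k]); })
    .join('&');
}

// In a browser you might then fire it with a beacon, e.g.:
// navigator.sendBeacon(COLLECT_URL, pageviewPayload('UA-XXXXX-Y', cid, location.pathname));
```

It wouldn't save you from a silent breakage, though; for that you'd still want to periodically check that hits are arriving in the Analytics UI.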
Will add an option for that as soon as I'm home. Also got many feature requests for exposing an event function. Both of these things shouldn't add any significant size to the snippet.
You can also pretty reasonably remove "colorDepth:true"; I can't imagine you have many indexed-color or RGB565 displays visiting your site these days.
Isn't the tag manager and analytics library likely to be cached, thus using zero bytes on average versus the non-zero number of bytes for this library?
That only means it needs to check for a new version every 15 minutes (e.g. by file hash or modification date), not that it has to re-download it when nothing has changed. I haven't looked into the cache settings though.
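The mechanism being described is HTTP conditional revalidation: if the cached copy carries a validator like an ETag, the 15-minute check costs a tiny 304 response rather than a full download. A minimal sketch of the server-side comparison (the function and return shape are illustrative):

```javascript
// Sketch of the revalidation decision: the browser sends
// "If-None-Match: <cachedEtag>"; the server compares it against the current
// ETag and answers 304 Not Modified (reuse cache) or 200 with a new body.
function revalidate(cachedEtag, serverEtag) {
  return cachedEtag === serverEtag
    ? { status: 304, body: null }            // cache is still good
    : { status: 200, body: 'new container' }; // changed, send fresh copy
}
```

So even at a 15-minute `max-age`, the recurring bandwidth cost is mostly request/response headers unless the container actually changed.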
GTM used to load async, but they wanted to support A/B testing. If you aren't using A/B testing, you'd get perfectly good performance with the older GTM snippet; it will just continue to use bandwidth at those 15-minute intervals. Even then it probably only checks the hash to see if the container changed.
This seems to me like one real problem that is pretty trivial to fix, leading to a lot of over-tuning toward some kind of ideal, and then being stuck if you need to support anything else.
When I see the hoops people have to jump through to get basic privacy from Google, it makes me happy that my company's legal department prohibits us from using Google products for HIPAA reasons.
So I think it's a totally legit way to use GA, since you don't always have the ability to add libraries. For example, if you want to do GA tracking server-side.