The funny thing for me is how much more intuitive this is when you run an actual server, or vps, and how fundamental it is in that context ("Oh crap, bunch of views, they are slowing this laptop down a bunch!") while being so much more involved here.
I'm hesitant to generalize past myself, but I really wish someone made me learn the basics on a web server in my own home, where I could reboot it by pulling out a power cord, and transfer files with a usb stick. I would have been so much more able to make thought out trade-offs or attempt grander things.
It's straightforward and free (at the margin) to throw up a password-protected site on a server, but daunting in the SaaS world. Same story for having a bash script update some page with a gnuplot chart from data obtained via curl/wget, vs. CORS, an API, a charting library, JSON ...
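For what it's worth, the bash-plus-gnuplot page update really is only a few lines. A rough sketch, with made-up file names, and the curl step faked so it runs offline:

```shell
#!/bin/sh
# The "one server does it all" approach: obtain some data, plot it,
# and rebuild a static page around the chart. In real use the printf
# below would be something like:
#   curl -s https://example.com/data.csv > data.csv
set -eu

# 1. Obtain the data (faked here so the sketch runs offline)
printf '1 10\n2 30\n3 20\n' > data.csv

# 2. Render a PNG chart, if gnuplot is installed
if command -v gnuplot >/dev/null 2>&1; then
  gnuplot -e "set terminal png; set output 'chart.png'; plot 'data.csv' with lines title 'visits'"
fi

# 3. Regenerate the page around the chart
cat > index.html <<'EOF'
<html><body>
<h1>Traffic</h1>
<img src="chart.png" alt="visits over time">
</body></html>
EOF
```

Run it from cron every few minutes and you have a self-updating chart with no API, no CORS, and no charting library.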
I shudder to think of how much time I wasted, as I was just learning anything web, on solutions that were way too big and complex for my goals at the time: from Docker, to Lambdas, to AWS orbit-based nuclear-powered whatever, just to run a clock widget or whatnot. I'm certainly glad I learned those things, and they are scalable and safer and all those good things, but they're hard to put into context without knowing how one stupid server doing it all by itself would do it.
This is all a drawn out way of saying that the cheapest lightsail vps is like 3 bucks a month, and you can implement all the fanciest netlify features right on it, and if you never have done that, you really should. If anything when you do move to netlify, you will really appreciate it on a different level.
> lightsail vps is like 3 bucks a month, and you can implement all the fanciest netlify features right on it, and if you never have done that, you really should
Can you expand what Netlify features you're talking about or provide any guides/links that would steer a newbie in the right direction? I appreciate the "back to basics" attitude, but don't know where to even get started...
^^ This isn't tax, legal, or sysadmin advice. SanchoPanda doesn't offer tax, legal, or sysadmin advice. Please consult your tax, legal, or sysadmin advisor before making any tax, legal, or web infrastructure related decisions.
I'm curious how a cron job would be used in place of Lambda functions here? I'd have expected CGI to be the equivalent instead.
Do note that I'm not encouraging people to use CGI for their serverless needs whatsoever, but if we're going back to basics, then CGI is the closest thing I can think of to serverless.
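To make the comparison concrete, here's a hedged sketch of what a CGI "function" looks like: the web server (e.g. Apache with mod_cgi) puts request details in environment variables, runs the script once per request, and streams its stdout back as the response. The handler below is hypothetical:

```shell
#!/bin/sh
# A CGI script reads the request from environment variables like
# QUERY_STRING and writes an HTTP response (headers, blank line,
# body) to stdout -- conceptually the same per-request lifecycle
# as a serverless function, minus the managed scaling.
cgi_handler() {
  printf 'Content-Type: text/plain\r\n\r\n'
  printf 'Hello from CGI\n'
  printf 'Query string was: %s\n' "${QUERY_STRING:-<empty>}"
}

# Simulate one request the way the web server would invoke it:
QUERY_STRING='page=/about' cgi_handler > response.txt
```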
Seems like a big miss on their part. I love Netlify and I use them for my personal site. When I got their newsletter email announcing this, the first thought that entered my head was: "Finally! I can take Google Analytics off my site. About time a serious contender took on GA."
Unfortunately this costs $108/year more than GA. And while there are a couple of GA alternatives out there, having the financial backing of someone like Netlify makes a big difference. I don't want to think about analytics; I just want to know roughly how much traffic my site is getting, without having to worry about an open source project dying in the next year and having to find yet another replacement.
GA is nice for client-side stuff, but Adblock has made the numbers completely unreliable, especially for tech-focused content. Instead I prefer to just use server-side logs. At my tiny scale it's easy to just assume each IP is a unique visitor, and people are free to block GA without affecting my metrics.
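For the curious, "each IP is a unique visitor" is a one-liner against a combined-format access log. A minimal sketch with a fabricated log file:

```shell
#!/bin/sh
# Fabricated three-line access log (two distinct client IPs):
cat > access.log <<'EOF'
203.0.113.5 - - [09/Jul/2019:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla"
203.0.113.5 - - [09/Jul/2019:10:01:00 +0000] "GET /about HTTP/1.1" 200 256 "-" "Mozilla"
198.51.100.7 - - [09/Jul/2019:10:02:00 +0000] "GET / HTTP/1.1" 200 512 "-" "curl/7.64.1"
EOF

# "Unique visitors" = distinct values in field 1 (the client IP)
uniques=$(awk '{ print $1 }' access.log | sort -u | wc -l | tr -d ' ')
echo "unique visitors: $uniques"   # unique visitors: 2
```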
Honestly I just wrote a small set of functions to log everything to Google Analytics via its measurement protocol[0]. I know I said I didn't use it client-side, but the dashboard is still top-notch for exploring analytics data and there's a ton of processing it does for free - IP to location is one.
I also stream all logs into text files so I can process them later too.
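For reference, a Measurement Protocol v1 hit is just a form-encoded POST to www.google-analytics.com/collect. A sketch that only assembles the payload; the property and client IDs are placeholders, and the actual send is left commented out so it runs offline:

```shell
#!/bin/sh
# Placeholders -- substitute your real GA property ID and a stable
# anonymous client ID of your choosing.
TID="UA-XXXXX-Y"
CID="555"
PAGE="/some/page"

# One "pageview" hit in Measurement Protocol v1 form encoding:
BODY="v=1&tid=${TID}&cid=${CID}&t=pageview&dp=${PAGE}"
echo "$BODY"

# To actually record the hit (not run here):
# curl -s -d "$BODY" https://www.google-analytics.com/collect
```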
Try https://goaccess.io/. It works very well for us with an haproxy cluster and rsyslog for central logging of all web and API requests. We run GoAccess in real-time mode, processing 100,000 requests per day with a local IP-to-city database.
I've always heard GoAccess recommended as the go-to solution for server-side log analysis. Though I don't have personal experience with it, it does look amazing.
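For anyone wanting to try it, the basic invocation is short. Assembled as a string here, since goaccess and the log path may not exist on your machine; the path is a placeholder and the flags shown (`--log-format`, `-o`, `--real-time-html`) are standard GoAccess options:

```shell
#!/bin/sh
# Placeholder -- point this at your real access log.
LOG=/var/log/nginx/access.log

# One-shot HTML report from a combined-format log:
REPORT_CMD="goaccess $LOG --log-format=COMBINED -o report.html"

# The live-updating dashboard mode mentioned above adds one flag:
LIVE_CMD="$REPORT_CMD --real-time-html"

echo "$LIVE_CMD"
```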
This makes a ton of sense for Netlify, and I hope they keep improving on it. If this existed 5 months ago I might not have rolled my own analytics using Netlify Functions and Google Spreadsheets^. As it is, $9 per site and an inability to log client-side routing make it not worth switching from my own, super customizable event-based solution (I would have to come up with a new API schema for calling URLs to log specific events/goals).
This is excellent. The first thing I did was remove GA. It's also incredible that they backfill data.
Yes, it costs money. But this is exactly what I want, and Netlify now has a significant incentive not to do anything shady with the data (and by not using cookies or JavaScript, they hamstring any potential ability to sell the data anyway).
The funny thing about this is that tons of developers are using Netlify for SPAs, and therefore only that first request will be caught by their edge nodes; all subsequent page views will be missed.
(i work there) let's just say this has not escaped our attention, and we even discussed holding off launching until we had an easy way to log an incremental pageview from the client side. ultimately we decided to ship first and iterate later. limiting the scope was definitely helpful running up to the launch.
Should be noted that Netlify provides a lot of tools for JAMstack sites, like their Netlify CMS (which you can self-host if you're determined) and Netlify Functions.
If you need the SPA framework for productivity reasons, then you could use something like Gridsome to have it generate a static site.
I was thinking the same. Realistically, I believe the only way you could do that on the client would be JS. But if they added that, it would change all their feel good marketing about no JS...
Yes! I have a static personal page, hosted via GitHub Pages, where I just want barebones information about how many people visit the page, from which country, etc. I get maybe 10 visits per week. I probably won't pay $9 for this.
That landing page does not have a single screenshot of what you get for $9 per month. Do you pay the $9 and then find out what the analytics page you're getting is going to look like, and if it's suitable for your website?
I'd purchase this, not because I'm likely to look at it often, but because Netlify provides me a great deal of value that I don't even pay for. It may be more than the cost of Google Analytics, but they've earned my money I think.
I've been hoping that somebody would do this for quite some time. I previously experimented with using CloudFront logs in GoAccess for a while on my personal blog in order to avoid invasive JavaScript-based tracking, but it was too much trouble to maintain.
I really like the idea of the CDN provider offering this as a service that requires no effort and provides a nice interactive dashboard.
The pricing on this is unfortunately too much for me to justify using it for my personal site, but I'd happily use it for anything more substantive.
> I built Logflare for Cloudflare. A good example of what you’re talking about I think. https://logflare.app
Total aside, but I really love the feel of your site. Very polished and I love the way the tool is presented (not to mention that it seems like a useful service to boot!).
good catch! i believe this was a misstatement and have requested we change this wording. for the time being what’s on the docs is what we have https://www.netlify.com/docs/analytics/
not quite the same - that statement is more accurate than saying "sessions". "sessions" has a contextual meaning in analytics and our marketing copy misused the term.
Might be using browser local storage APIs? Hah. I imagine you could also do some kind of web socket/long poll and try to fingerprint the connection a little so there’s high likelihood it’s the same session when it reconnects between page loads.
For hobby sites, I guess the owner might be fine with grepping logs or installing webalizer anyway. Those who really need this are developers at companies where the boss thinks they need it. I've tried to push for Statcounter just to stay away from Google, but bossman eventually wanted Google. This is yet another tool in the belt for being able to work without Google, and for a company (even a small business), 9 euros a month is very little.
> Those who really need this are developers at companies where the boss thinks they need it.
I think that hits the nail on the head. For most usages of GA, the product they are selling is pretty graphs that management are convinced they need; any actual actionable insight they can give is secondary to that, and I'd bet used by only a small fraction of GA users.
I'm curious about comparing my Netlify Analytics numbers to my Google Analytics numbers. I'm seeing a lot more users and page views on Netlify Analytics. I know that GA can be blocked client side, but does GA exclude bot/crawler traffic from stats as well? Or can I assume that the difference between GA and Netlify Analytics numbers are due to adblock / noscript / etc?
i can't speak to GA but just dropping by to point to the docs https://www.netlify.com/docs/analytics/ for details on the numbers. it’s IP based, and some filtering is done as described
“The Pageviews and Top pages charts include only responses with Content-Type: text/html and a status code of 200, 201, or 304. We filter the data by status code this way so that we don’t count errors or double count redirects. This also applies to the Pageviews total for your site.
Unique visitors counts different IP addresses engaging with your site within a single day. If someone loads pages of your site on multiple different days, they will be counted as a unique visitor for each day. The Unique visitors total for your site is a sum of the daily numbers.”
Their marketing on this is brilliant. They're capitalizing on the growing adblock/jsblock trend to point out that server side analytics are more accurate.
To be fair, back in the day when you had a single web server, analytics was easy. I too miss the days of being able to do a 'tail -f web.log' and watch as your page hit Slashdot...