It's cool that Google is already starting to copy features from Duck Duck Go ;-) Next stop: promising not to keep a log of our searches, or no longer assuming I want to be "signed in" on Search just because I am on Reader?
In fairness, implementing it on Google's scale is not really the same "feature", and since Duck Duck Go isn't a self-contained search engine, you can still question how end-to-end secure its final implementation is.
Why would it be any different at Google's scale? I mean, presumably the solution is the same install-an-SSL-plugin-and-point-it-at-a-cert fare you'd go through for any web server. Am I missing something here?
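For a single box, the recipe really is about that simple. A minimal sketch using Python's stdlib (the cert and key filenames are placeholders; in practice they'd come from your CA):

```python
# Minimal sketch: serving a site over TLS with Python's stdlib.
# "cert.pem" and "key.pem" are assumed to already exist.
import http.server
import ssl

def make_https_server(certfile, keyfile, port=4443):
    server = http.server.HTTPServer(("localhost", port),
                                    http.server.SimpleHTTPRequestHandler)
    # Load the cert, wrap the listening socket -- that's the whole "plugin".
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    server.socket = ctx.wrap_socket(server.socket, server_side=True)
    return server

# make_https_server("cert.pem", "key.pem").serve_forever()
```

The point of the sketch is that the per-server change is tiny; the question downthread is whether that stays true across thousands of servers.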
"Superlinearly" is a new term to me, but the cost doesn't have to grow superlinearly to matter. It does grow, and Google is huge. The cost, in terms of labor, focus bandwidth, and money, is large at their scale.
The cost of TLS is linear in every way I can think of. The TLS handshake is basically a constant overhead (in both space and time), and encrypting the actual data is an O(n) transform (again in both space and time).
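That claim can be written as a one-line cost model: a fixed handshake term plus a per-byte term. The constants below are made-up illustrative numbers, not measurements.

```python
# Toy model of the linearity claim: total TLS cost is a fixed handshake
# overhead plus an O(n) per-byte transform. Constants are assumptions.
HANDSHAKE_COST = 1000   # fixed cost units per connection (illustrative)
PER_BYTE_COST = 0.01    # cost units per byte encrypted (illustrative)

def tls_cost(payload_bytes):
    return HANDSHAKE_COST + PER_BYTE_COST * payload_bytes

# Doubling the payload grows cost by strictly less than 2x, because the
# handshake term is constant; the variable part is exactly linear.
small = tls_cost(100_000)
large = tls_cost(200_000)
assert large < 2 * small
```

In other words: each extra byte costs the same as the last, which is what "linear in every way" means here.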
Sure, but talking about it being expensive "on a large scale" is a bit odd, since revenue and profits also grow with scale. It seems to imply greater-than-linear growth in costs.
Google has a completely different back-end architecture than a dinky website, and the costs come from the sophistication and complexity of the systems required at that scale.
Google has multiple data centers, dedicated caching and web clusters, complex internal network routing and load-balancing, etc. Then there's stuff like JavaScript and image hosting optimizations that you've got to make sure work without triggering SSL warnings across thousands of servers. Enabling SSL support for something like Google search could potentially touch tens of thousands of systems. In short, maybe it's as simple as flipping a bit to enable SSL on the web farm, but I doubt it. There's probably a massive amount of internal engineering and organization behind the change, and that cost is what you're neglecting.
To give you perspective, it's trivial to enable SSL on a single-box website; it can be done in an afternoon. When we turned it on at Justin.tv, it took a few days (let's call it a week) of initial work, and ongoing maintenance costs to make sure that SSL sessions weren't breaking as we made improvements across the site. The costs go up as you get bigger and more complicated.
If it takes 4 hours to get SSL working for a site with 5,000 users, then linear growth to a site with 250,000,000 users (a 50,000x increase) is 200,000 hours. IMO, Google can get SSL working in less than 200,000 hours.
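The extrapolation above is a single multiplication; spelled out with the figures from the comment:

```python
# Linear-scaling arithmetic from the comment above.
hours_small = 4                 # effort for the small site
users_small = 5_000
users_google = 250_000_000

scale = users_google / users_small    # how many times more users
hours_linear = hours_small * scale    # naive linear extrapolation of effort
```

`scale` comes out to 50,000 and `hours_linear` to 200,000, which is the upper bound the commenter is arguing Google easily beats.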
spideroak.com has always been an all-SSL site, and there are a few additional annoyances:
- You need a cert for example.com, and maybe a wildcard cert for *.example.com, and for any additional levels of subdomains such as *.{x,y,z,etc}.example.com.
- Even modern browsers sometimes silently abort downloads over SSL, with no indication to the user that the download failed (despite the server sending a Content-Length header). That's the really frustrating one.
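The first annoyance follows from how certificate wildcards work: a `*` matches exactly one DNS label, so `*.example.com` covers neither `example.com` nor `a.b.example.com`, and each subdomain level needs its own cert. A simplified model of that matching rule (real validation, per RFC 6125, is stricter, e.g. the wildcard must be the leftmost label):

```python
# Simplified wildcard-certificate hostname matching: "*" stands in for
# exactly one DNS label, never zero and never more than one.
def wildcard_matches(pattern, hostname):
    p_labels = pattern.split(".")
    h_labels = hostname.split(".")
    if len(p_labels) != len(h_labels):
        return False
    return all(p == "*" or p == h for p, h in zip(p_labels, h_labels))

assert wildcard_matches("*.example.com", "www.example.com")
assert not wildcard_matches("*.example.com", "example.com")      # too few labels
assert not wildcard_matches("*.example.com", "a.b.example.com")  # too many labels
```

This is why an all-SSL site with nested subdomains ends up juggling several certificates instead of one.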
Google's log of searches is an amazingly useful feature. Sometimes I google for some obscure library, find it, go to lunch, and then completely forget what it was called or what search terms I used to find it. With the search history, this is not a problem.
I also use Google search for package tracking, and it's nice to get the numbers out of the search history instead of the merchant's site.
I know everyone wants to think that Google is collecting this information so that they know who to come for first when they take over the world... but it's also possible that the feature exists simply because it's useful.