
So AMP does two major things:

(1) It prescribes a stripped-down version of HTML and uses a JS loader to render fast and to load as many resources as possible asynchronously

(2) It caches the website in the Google CDN and delivers it via HTTP/2

Applying best practices for structuring web applications and optimizing for the critical rendering path achieves (1), too. The AMP loader is just an opinionated way of doing this for simple pages. Everyone can get similar results without AMP by following web performance best practices like [1]:

- Reducing the critical resources needed

- Reducing the critical bytes which must be transferred

- Loading JS, CSS and HTML templates asynchronously

- Rendering the page progressively

- Minifying and concatenating CSS and JS, and optimizing images

And there is lots of good tooling for this (e.g. postcss, processhtml, cssmin, UglifyJS, imagemin, critical, gulp-rev-all, ...)
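The loading-related items above mostly come down to a few lines of markup. A minimal sketch (file names are placeholders; the preload trick is the widely used "loadCSS" pattern):

```html
<!-- Inline the critical above-the-fold CSS, e.g. extracted with `critical` -->
<style>/* critical rules here */</style>

<!-- Load the remaining CSS asynchronously instead of render-blocking -->
<link rel="preload" href="main.css" as="style" onload="this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="main.css"></noscript>

<!-- Parse the HTML without blocking on JS; the script runs after parsing -->
<script src="app.min.js" defer></script>
```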

(2) is not only harder; it is what limits the broad applicability of AMP. Cached data in the Google CDN cannot be invalidated: neither in the CDN itself nor in ISP interception caches, corporate proxies or the browser cache [2]. The effective consequence is that you can only use AMP when the content is mostly static. This is the case for news websites and other publications that are only changed by human editors. It completely breaks down when you try to build a dynamic site, for example a social network or a shop.

That is why our startup Baqend [3] takes a different route. We say that developers are clever enough to use existing tooling to achieve (1): an efficient rendering and loading experience. And for (2) we add a caching scheme that also employs CDNs for delivery but keeps data consistent. This is made possible by a simple process:

1. When a browser connects to a Baqend-based website, it loads a Bloom filter containing all potentially stale cached URLs.

2. Every stale URL is requested using HTTP revalidation to refresh stale copies and update caches.

3. When an update operation changes a resource (e.g. an image, JSON object or even a complex query result), its URL is instantly invalidated in the CDN [4] and marked as stale in the Bloom filter.

4. When loading a resource, a statistical estimate of its expected TTL (cache lifetime) is made. Whenever that prediction fails, the Bloom filter and the automatic CDN invalidation compensate for the difference between the estimated and the real lifetime.
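The first three steps can be sketched in a few lines of JavaScript. This is an illustration under assumed names (`BloomFilter`, `cacheMode`), not Baqend's actual implementation; the property it relies on is that Bloom filters have no false negatives, so a negative answer proves the cached copy is fresh:

```javascript
// Tiny Bloom filter: the server marks invalidated URLs in it, and the client
// revalidates a URL only if the filter reports it as potentially stale.
class BloomFilter {
  constructor(bits = 1024, hashes = 3) {
    this.size = bits;
    this.hashes = hashes;
    this.bitset = new Uint8Array(bits);
  }
  // FNV-1a string hash, salted with the hash function index.
  _hash(key, seed) {
    let h = (2166136261 ^ seed) >>> 0;
    for (let i = 0; i < key.length; i++) {
      h ^= key.charCodeAt(i);
      h = Math.imul(h, 16777619);
    }
    return (h >>> 0) % this.size;
  }
  add(key) {
    for (let i = 0; i < this.hashes; i++) this.bitset[this._hash(key, i)] = 1;
  }
  mightContain(key) {
    for (let i = 0; i < this.hashes; i++)
      if (!this.bitset[this._hash(key, i)]) return false;
    return true;
  }
}

// Client-side decision for step 2: revalidate, or serve straight from cache.
function cacheMode(url, staleFilter) {
  return staleFilter.mightContain(url) ? 'revalidate' : 'use-cache';
}

// Step 3: after a write, the server invalidates the URL in the CDN and
// marks it as stale in the Bloom filter shipped to clients.
const stale = new BloomFilter();
stale.add('/feed.json');
console.log(cacheMode('/feed.json', stale)); // 'revalidate'
console.log(cacheMode('/logo.png', stale));  // almost certainly 'use-cache'
```

In practice the server side maintains a counting Bloom filter so URLs can be removed again once they expire, and only the flat bit array is shipped to clients (see [5] for the details).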

Using this scheme (developed at the University of Hamburg in cooperation with Baqend), every kind of dynamic data can be treated as cacheable data. The approach is not limited to data that seldom changes; it even works for write-heavy resources with rich consistency guarantees (Δ-Atomicity, Read-Your-Writes, Monotonic Reads, Monotonic Writes, Causal Consistency) [5].

Of course I'm biased, but if you'd like to see AMP-like acceleration coupled with fresh cached data, plus the tooling, layout and frameworks of your choice, have a look at our Backend-as-a-Service.

[1] https://developers.google.com/web/fundamentals/performance/c....

[2] https://github.com/ampproject/amphtml/issues/1901

[3] http://www.baqend.com/

[4] https://www.fastly.com/blog/building-fast-and-reliable-purgi...

[5] http://www.slideshare.net/felixgessert/talk-cache-sketches-u...




AMP also forces a precalculated layout of the page and disables bad CSS selectors. You can take a page, strip out all the JS cruft, and put all the resources on a CDN, and still end up with mobile layout adding 500 milliseconds to the initial page render, or more if relayouts are triggered during render.

The majority of regular web developers aren't even aware of layout costs.



