
When the site was originally designed, there were only 4 or 5 image sizes: one for the feed, one for the product page, one for related products, one for thumbnails, and so on.

As the design changed, extra sizes got added to the image processing, and as with most early-stage start-ups, it worked, so there was no need to fix it.

We eventually ended up with 2.5M products, all with preprocessed images in the sizes accumulated over our design history. We wanted more flexibility with design, but also knew that many of our images had a very low likelihood of ever being accessed (fashion items have single runs and are never remade in future seasons), so a big batch reprocess didn't seem appropriate. It would also mean storing several different copies of each image in our S3 bucket, even for products we knew would likely never be seen again.

A more attractive solution (at least to us) was a hybrid approach: resize on demand, then cache the result for a long time. This way, we only do the processing for images that need it. Functionally it's almost identical to large-scale batch processing, but the process is demand-led.
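The cache-on-miss pattern described above can be sketched as follows. This is a minimal illustration, not the commenter's actual code: `get_image`, the cache location, and the `resize` stub (which stands in for a real image library such as Pillow) are all hypothetical.

```python
import hashlib
from pathlib import Path

CACHE_DIR = Path("/tmp/img_cache")  # hypothetical cache location

def resize(data: bytes, width: int) -> bytes:
    # Placeholder for a real image-library call; here we just tag the
    # bytes so the caching logic itself is observable.
    return b"resized-%d:" % width + data

def get_image(product_id: str, width: int, fetch_original) -> bytes:
    """Return the resized image, generating and caching it on first request."""
    key = hashlib.sha256(f"{product_id}:{width}".encode()).hexdigest()
    cached = CACHE_DIR / key
    if cached.exists():                      # cache hit: no processing
        return cached.read_bytes()
    out = resize(fetch_original(product_id), width)  # cache miss: resize once
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    cached.write_bytes(out)                  # store for subsequent requests
    return out
```

In production this would typically sit behind a CDN with a long `Cache-Control` header, so most requests never even reach the resizer; only the first request for a given product/size pair pays the processing cost.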




But the nice thing is using a storage solution where you don't pay for what you don't use.



