TinyJPG – compress JPEG files with a balance between quality and file size (tinyjpg.com)
133 points by davidbarker on Nov 22, 2014 | 30 comments



If you want something that's not tied to a web service, jpeg-recompress (https://github.com/danielgtaylor/jpeg-archive) is great, and it's configurable based on the quality you want.
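
A typical invocation looks something like this (a rough sketch from memory; check the repo's README for the exact flags):

    jpeg-recompress --quality medium --method ssim input.jpg output.jpg

It repeatedly re-encodes at different quality settings until the perceptual target (SSIM here) for the chosen preset is met.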


Keep in mind that the best way to actually improve perceptual quality with JPEG is to increase the resolution and decrease (sometimes dramatically) the 'quality' slider: http://users.wfu.edu/matthews/misc/graphics/ResVsComp/JpgRes...
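
The idea is easy to try yourself with ImageMagick (filenames and numbers here are just placeholders for illustration):

    # Strategy A: downscale, then compress at high 'quality'
    convert photo.png -resize 50% -quality 90 half-res-q90.jpg
    # Strategy B: keep full resolution, crank 'quality' down
    convert photo.png -quality 30 full-res-q30.jpg
    # Compare file sizes, then eyeball both at the same display size
    ls -l half-res-q90.jpg full-res-q30.jpg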


I agree with this based on my fuzzy impressions of my own past experience, but I never did any careful, explicit tests. It’s too bad the original source image used for the comparison in the document you linked is really crappy quality with terrible artifacts, though, and the resizing routine is fairly unimpressive bicubic scaling (or, in the first example, nearest neighbor). That makes the comparison pretty useless, IMO.

I’d love to see a similar comparison starting with a much better source image, but I don’t have time to do the tests myself right now.

I’d also love to see a comparison of (a) a high-quality image downscaled using a very high-quality resizing algorithm and then JPEG-compressed versus (b) the image JPEG-compressed first to match the file size of (a) and then resized in-browser by various browsers’ resizing routines. There was a pretty big difference between the ways browsers resized images the last time I checked, a few years ago.


I don't think that article made strong arguments.

Using nearest-neighbour resampling on the low-res image is an absolute joke. He didn't even look at an objective quality measurement (PSNR). The human visual system is very sensitive to edges, and the high-res image has more pronounced blocking artefacts. Downsampling a high-res image is also an unnecessary load on the end user; the 8x8 block transform was chosen for good reason.
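
For reference, PSNR on 8-bit images is

    PSNR = 10 * log10(255^2 / MSE)

in decibels, where MSE is the mean squared error between the original and the compressed pixels; higher means closer to the original.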

I'm not necessarily saying the low-res version is superior, but I disagree that this ad hoc method is the 'best' way (compared to optimising the coding).


PSNR is not a very good test of human perception, though, and optimizing for PSNR has had unfortunate consequences for image resizing algorithms, IMO.

Whether downscaling a high-res image is an excessive load on the end user depends a lot on the end user. From what I understand (but I’m not an expert, so please correct me if this is wrong), bandwidth is the main bottleneck for both I/O latency and CPU use, not image rescaling. I’m guessing even mobile phones of the last few generations don’t break a sweat when downscaling images (can’t they use the GPU for this?). As always, it would be a good idea to actually test CPU use, latency, and battery drain from rendering images at different sizes and JPEG quality levels on the target client device.


Other than in pathological examples, PSNR is pretty useful for gauging the quality of photographs, and it is very common in the literature. Problems aside, it would have been nice to see some rate-distortion curves.

I agree eyeballing the results is just as important, but I don't believe everyone should adopt this method because one dude thinks it looks better. Personally, I dislike the blocking artefacts around the neck and badge of the high-res image, even if some details are sharper.


Good job! The compression is very good (close to mozjpeg).

Even though lossy recompression theoretically is suboptimal, I think tools like that are still great in practice.

Lots of casual users don't pay much attention to the quality setting, but with JPEG, file size grows roughly exponentially with the quality setting, so getting the quality right makes a huge difference.

And many graphics tools still use libjpeg with default settings, which don't include even basic file-size optimizations.
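
For example, just enabling Huffman-table optimization and progressive encoding in libjpeg's cjpeg already shrinks files with no pixel-level loss (a rough sketch; check your cjpeg man page):

    cjpeg -quality 85 -optimize -progressive -outfile out.jpg in.ppm

Both flags only change how the DCT coefficients are entropy-coded and ordered, so the decoded image is identical to what -quality 85 alone would produce.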

Just getting a properly optimized JPEG encoder with a reasonable quality setting would be a huge improvement for many users.


I have used tinypng so many times and am completely satisfied with their service!


Me too. Since they have an API, it would be really nice if someone could make a WordPress plugin.
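
The API is pretty simple to call, too; something like this, from memory (YOUR_API_KEY is a placeholder, and the exact endpoint and response details are in their developer docs):

    curl -i https://api.tinypng.com/shrink \
         --user api:YOUR_API_KEY \
         --data-binary @input.png
    # the URL of the compressed image comes back in the Location header

A plugin would mostly just be glue around that.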


Kraken Image Optimizer (https://kraken.io) has a WordPress plugin, in case you're interested. It additionally optimizes GIF images.


That is an awesome plugin. Thank you for posting this; I wouldn't have known about it otherwise. I'm working on an extremely image-heavy site and this is perfect. It even lets you choose between lossy and lossless compression.



This sure beats opening the GIMP all the time just to compress a JPG, and teaching our content writers how to use the command line to compress images with a CLI utility. Always did like TinyPNG, so I'm glad to see this too.


I use IrfanView on Windows most of the time. It's small, so it starts up pretty fast, and changing the quality is a few short clicks away (most of the time the options in the save dialog are enough). I don't have to upload my images to somebody I don't know, it works offline, and I even managed to teach it to my mother (she uses the batch tool now). So it can't be that hard.


Hide the CLI utility by running it at the end of the 'deploy' phase.
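
Something like this as the final deploy step, say (paths are placeholders; jpeg-recompress is the tool mentioned upthread):

    # run after the site is built, before upload
    for f in build/images/*.jpg; do
        jpeg-recompress --quality medium "$f" "$f.tmp" && mv "$f.tmp" "$f"
    done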


This is fine for some uses, but if you have a large amount of content it is easier to configure the web server to compress the images with Google PageSpeed. You can turn it off with a switch added to the URL. This means you don't have to care about the images you upload, as the content will be served optimized, and you can focus on your content rather than forever compressing images manually.

Importantly, PageSpeed can use 4:2:2 chroma subsampling, which can work very well for reducing file sizes. You can also set it to serve even smaller WebP images to browsers that can render them.
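
On Apache that comes down to a few directives, roughly like this (a sketch from memory; check the mod_pagespeed documentation for your version):

    ModPagespeed on
    ModPagespeedEnableFilters recompress_jpeg,convert_jpeg_to_webp
    ModPagespeedJpegRecompressionQuality 75

Appending ?ModPagespeed=off to a URL then serves the unoptimized original, which is handy for comparisons.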


Gotta say I really don't like all these sites that re-encode images (even more so with videos) and claim that "we can reduce filesize with, like, no quality loss using our Super Duper Compression Magic!" Because no, it's not magic: you're just re-encoding images, and possibly doing lossless optimizations on top of that (though who knows, they might even think the re-encoding is all the "optimization" an image needs). Just look at this quick comparison I made; you can clearly see the JPG compression artifacts in the TinyJPG "optimized" version (which turned my optimized 1.27 MB original JPG into a crummy-looking 431 KB JPG): http://screenshotcomparison.com/comparison/101286 (this is a crop from the original image).

Rather than letting a service re-encode your images, use something that optimizes them in an actually lossless fashion (optipng, mozjpegtran, or any service that makes use of these), and if you want to squeeze them to an even smaller size in a lossy fashion, save your JPGs at a lower quality yourself or quantize your PNGs in a controlled fashion (pngquant does a pretty great job of that). The latter is especially important, because haphazard color reduction can lead to completely awful-looking results with higher-res PNGs that have lots of colors. Here's a comparison for that too: http://screenshotcomparison.com/comparison/101289
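
For reference, the kinds of commands I mean (the quality range is just an example):

    mozjpegtran -copy none -outfile out.jpg in.jpg          # lossless JPEG optimization
    optipng -o2 image.png                                   # lossless PNG optimization
    pngquant --quality=65-80 --output small.png image.png   # controlled lossy PNG quantization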

Bottom line: feel free to apply lossless optimizations to your heart's content, but beyond that you're better off saving to a lower JPG quality yourself to begin with (you'll get better results that way too, since you're doing a single lossy encode instead of two) or quantizing your PNGs yourself, provided you actually care about the quality of your images.

EDIT: Figured I could post some more images.

1. Original source image: http://blisswater.info/images/tiny/original.png (4987 KB)
2. JPG quality 90 encode: http://blisswater.info/images/tiny/encoded-q90.jpg (1033 KB), encoded with ImageMagick using: convert original.png -quality 90 encoded-q90.jpg
3. Optimized JPG Q90: http://blisswater.info/images/tiny/encoded-q90-optimized.jpg (978 KB), lossless optimization with: mozjpegtran -copy none -outfile encoded-q90-optimized.jpg encoded-q90.jpg
4. TinyJPG result with original.png as source: http://blisswater.info/images/tiny/encoded-tinyjpg.jpg (1581 KB)
5. TinyJPG result with encoded-q90.jpg as source: http://blisswater.info/images/tiny/encoded-q90-tinyjpg.jpg (485 KB)

As you can see by comparing 2 and 5, there is very noticeable quality loss as a result of the TinyJPG re-encode. What's even more interesting is that if you want to avoid the double conversion that comes with uploading JPGs, TinyJPG will actually quantize your PNGs first (like their TinyPNG service does), as can be seen in 4. This is something I was not expecting, and I find it rather baffling, as it quite noticeably alters the source image on its own even before any JPG compression.


Great comments. As co-creator of TinyPNG and TinyJPG, I mostly agree with you; re-encoding images is not ideal. But it works for many people, because a (too high quality) JPEG file is what they have.

There are always going to be compression artefacts when compressing. The point is that there is a fine line where they are unnoticeable to the casual observer but still result in huge savings in file size. This fine line can indeed be found manually with a good JPEG encoder and a lot of time; many people have neither.

We made TinyJPG/TinyPNG for people who want to use images on websites (or in apps) that are of high enough quality for casual observers while having very small file sizes. The artefacts in your first comparison image won't be seen by most people from normal viewing distances.

So it can mean a world of difference for people viewing your site on a mobile or otherwise poor internet connection. It can be the difference between a hero image that takes 4 seconds to load and one that takes 1 second.

If you care about the highest possible quality, with no sacrifices at all no matter the file size, then TinyJPG is not for you.

>> TinyJPG will actually quantize your PNGs first (like their TinyPNG service does)

It's because you uploaded a PNG image – you got a PNG image back as well, but gave it a "jpg" extension yourself. It isn't actually a JPEG. TinyPNG and TinyJPG share the same back-end service: both accept PNG and JPEG, and currently they won't convert between the formats.


I think it's unfair to use as a comparison an image which is outside of the design scope for JPEG. Quoting from http://info.eps.surrey.ac.uk/FAQ/standards.html :

"JPEG is designed for compressing either full-colour (24 bit) or grey-scale digital images of "natural" (real-world) scenes.

It works well on photographs, naturalistic artwork, and similar material; not so well on lettering, simple cartoons, or black-and-white line drawings (files come out very large)."


It doesn't matter, though; a photograph of a low-contrast scene with a few blocky subjects would produce the same result.


Is this different from jpegmini?


You mean apart from the fact that you can't install it locally to reduce latency and save the many megabytes you were hoping to save by uploading the optimized files later?


Anyone know if it's idempotent? What happens if you run the compression ten times consecutively on one image?


(Just answered it myself; definitely not idempotent, though on the example images I threw at it, it seemed to converge rather than multiplying artificats and devolving to noise...)
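
If you want to reproduce that convergence check locally, the repeated re-encoding is easy to script; here with ImageMagick standing in for the web service (which you'd call via its API instead):

    cp original.jpg pass0.jpg
    for i in $(seq 1 10); do
        convert "pass$((i-1)).jpg" -quality 85 "pass$i.jpg"
    done
    ls -l pass*.jpg                                    # watch the sizes stabilise
    cmp -s pass9.jpg pass10.jpg && echo "fixed point"  # identical bytes = converged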


"Artificats" makes it sound like errors in the the compression algorithm make felines appear in the image.

Come to think of it, given how many cat pictures there are on the Internet, I wonder if there's a place for a compression algorithm tuned for cats (which could make artificats a reality)…


As you've found out, it's not. We think you get the best results by compressing only once, with a source image of sufficient quality. This applies to both TinyPNG and TinyJPG.


Not impressed. Equipped with ImageMagick and ImageOptim, I'm able to achieve similar results. Exporting the example-original.jpg at 55% quality in Pixelmator and then optimizing with ImageOptim, my JPG is smaller and not visibly worse to the naked eye.


I got 75% savings on about 15 files. I'm rather impressed.


Optimize them again, you will save 75% more!


Now try with ImageOptim, a free and open source tool. :)




