For a multitude of reasons, ranging from user conversion rates to search engine ranking, it is important for web sites to load fast. The first rule of thumb for fast-loading pages is to keep the amount of data users need to transfer small. The smaller the web page file and its dependencies are, the faster it will load. Saving bandwidth is always good, both from the service provider's perspective, who saves in data center costs, and from the end user's perspective, in particular for those users on slow mobile connections.
On an average site, most of the transferred data consists of image files. For example, loading the front page of the Finnish broadcasting company yle.fi transfers a total of 1.4 MB of data, of which 0.9 MB is image files. As another example, the front page of the Finnish Defence Forces site totals 365 KB of data, of which 284 KB is due to images.
Therefore, a quick and easy way to make your site load faster is to optimize the images. Linux users can easily do this using the programs optipng and jpegoptim, which are available in most Linux distributions. After installing the programs, open a console and go to the directory where your web site is, e.g. cd /var/www/. Then run the following commands to automatically search for all PNG and JPEG files in the directory and its subdirectories, and optimize them in place:
find . -iname '*.png' -print0 | xargs -0 optipng -o7 -preserve
find . -iname '*.jpg' -print0 | \
  xargs -0 jpegoptim --max=90 --strip-all --preserve --totals
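To see how much an optimization pass actually saved, you can measure the total image payload before and after running the commands. A minimal sketch, assuming GNU find and du; the helper name total_image_size is ours, not a standard tool:

```shell
# Sum the on-disk size (in bytes) of all PNG and JPEG files under a
# directory. Run once before and once after optimizing and compare.
total_image_size() {
    find "$1" \( -iname '*.png' -o -iname '*.jpg' \) -print0 \
        | du -cb --files0-from=- | tail -n1 | cut -f1
}

total_image_size .   # prints the total size in bytes
```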
By default, both optipng and jpegoptim work losslessly, meaning that the visible image quality does not decrease even though the file sizes do. In addition, jpegoptim has options for lossy compression, where the visible image quality is reduced a little, but the loss is usually very small while the file size savings are big. In our example above we use the option --max=90, which re-compresses at quality level 90 any JPEG images currently above that level. This is pretty safe, since quality level 90 is still very high and no website visitor is likely to spot any defects in the image. If more compression is needed, it is better to experiment, e.g. using GIMP and its Save for web feature with live preview.

In the jpegoptim command, the option --strip-all will strip all metadata, including Exif data, from the images. For websites JPEG metadata is most likely not needed, so it is a pretty safe decision to strip it.

If a server administrator wants to automatically optimize all images on a server (like we do on Seravo's servers), create a script with the contents:
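Before committing to a quality cap, you can let jpegoptim simulate the run: with the -n (--noaction) flag it only reports the projected savings and leaves the files untouched. A small sketch along the lines of the command above (guarded so it is a no-op where jpegoptim is not installed):

```shell
# Preview what --max=90 would save without modifying any files;
# -n / --noaction makes jpegoptim only report the projected results.
if command -v jpegoptim >/dev/null; then
    find . -iname '*.jpg' -print0 | xargs -0 -r jpegoptim -n --max=90 --totals
fi
```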
#!/bin/bash
echo `date` >> /root/optipng.log
find /var/www/ -mtime -2 -iname '*.png' -print0 | \
  xargs -0 optipng -o7 -log /root/optipng.log -preserve
echo `date` >> /root/jpegoptim.log
find /var/www/ -mtime -2 -iname '*.jpg' -print0 | \
  xargs -0 jpegoptim --max=90 --preserve --totals >> /root/jpegoptim.log
Then make this into a cronjob by adding a line to cron like:
# m h dom mon dow  command
0 1 * * * /root/optimize-images.sh
With these in place, the script will run every night at 01:00, find all PNG and JPEG images that have been created or modified during the last two days, and optimize them in place while preserving all other file attributes. Exif and other metadata is not stripped, as in this scenario we cannot be sure that all images stored in the website folders are meant only for web viewing.
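One way to install the job is to append the entry to root's crontab from the command line. A sketch, not from the original article; it assumes the script above was saved as /root/optimize-images.sh:

```shell
# Install the nightly cron job (sketch; the script path comes from the
# article's example, and existing crontab entries are preserved).
script=/root/optimize-images.sh
if command -v crontab >/dev/null && [ -f "$script" ]; then
    chmod +x "$script"
    ( crontab -l 2>/dev/null; echo "0 1 * * * $script" ) | crontab -
fi
```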
Update 2013-12-04: As of jpegoptim version 1.3.0 a new option --all-progressive is available. Using it is recommended, as the JPEG images will then load progressively and thus provide a better visual experience for the end user during page load.
If the server has a publicly uploadable folder, it might be a good idea to downscale images for web use, as 8-megapixel images uploaded straight from a digital camera are way too big for any reasonable web use. To automatically scale all images in a directory so that the longer edge is at most 1000 pixels, run mogrify (part of ImageMagick):
mogrify -resize "1000x1000>" *.jpg
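Note that the mogrify call above only touches the current directory. For an upload folder with subdirectories, it can be combined with find; a sketch assuming ImageMagick's mogrify, with an illustrative uploads path:

```shell
# Recursively downscale JPEGs so the longer edge is at most 1000 px.
# The ">" suffix means "only shrink, never enlarge".
# (/var/www/uploads is a placeholder path for this sketch.)
if command -v mogrify >/dev/null; then
    find /var/www/uploads -iname '*.jpg' -print0 \
        | xargs -0 -r mogrify -resize '1000x1000>'
fi
```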
Other programs also exist, such as pngcrush and jpegtran, but in our tests they were inferior to optipng and jpegoptim. There still seems to be some room for improvement in these tools, though: for example, the Yahoo Smush.it service can do a better job, in particular on small PNG files. Even more compression for PNG files can be achieved using the TinyPNG service, but it works in a lossy way and thus images might get visible defects. In most cases, the optipng and jpegoptim commands above will do a great job.
One question remains, however: why don't all image editors have excellent optimization functions built in in the first place?
CSS sprites, embedded images and glyph fonts
If you have multiple small images, you can combine them into a CSS sprite with GraphicsMagick using this command:
gm convert +append *.png sprite.png
Now sprite.png will contain your PNG images combined side by side (+append joins them horizontally; use -append instead to stack them vertically). The resulting file size will be smaller than the sum of the small images, and it will also load faster because the browser needs to make only one HTTP round-trip to fetch the file. See the article at A List Apart on what CSS markup to use with CSS sprites.
If all a site has is a few small images, they could even be embedded directly into the HTML code using the data URI scheme.
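A minimal sketch of producing such a data URI with the base64 tool from GNU coreutils; the placeholder file created here stands in for a real small PNG:

```shell
# Create a tiny placeholder file standing in for a real icon (illustrative;
# a real site would of course use an actual optimized PNG).
printf 'not-a-real-png' > icon.png

# Inline the file as a data URI so no separate HTTP request is needed.
# -w0 disables line wrapping in the base64 output.
icon_uri="data:image/png;base64,$(base64 -w0 icon.png)"
echo "<img src=\"$icon_uri\" alt=\"icon\">"
```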
Another option is to use glyph fonts like Font Awesome. They have the added benefit of scaling to any size, as font files are vector graphics. If the graphic designer is clever, the whole site could be made using only CSS tricks and custom fonts, with perhaps only two or three actual images, and thus load blazing fast. That we leave up as an exercise to our readers.