Octopress Update

Octopress is a highly customizable blog generator. Posts are written and stored in Markdown, and Octopress makes it easy to publish the Jekyll-generated HTML to S3 or GitHub. If you use Octopress, I recommend following ‘Making Octopress Fast’ by Eric Wendelin. The guide helps you set up a GZipped website on Amazon S3. It worked like a charm, but I lost two important things: previewing and quick deploys.


After following the guide, all HTML, CSS and JS files are stored as GZip-9 files. They are unreadable unless you add a ‘Content-Encoding: gzip’ header to each file and enable a deflate mechanism on your local webserver. This also renders the WEBrick preview server that ships with Jekyll useless.
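For reference, setting that header at upload time with s3cmd looks roughly like this (the bucket name is illustrative):

```
# upload a pre-gzipped file with the Content-Encoding header set
s3cmd put --acl-public --mime-type='text/html' \
    --add-header='Content-Encoding: gzip' \
    index.html s3://my-blog-bucket/index.html
```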

Solution: preview from the public folder and add a second directory for the compressed files.

I added a reference to a compressed directory in the Rakefile. Then make sure that all directives for minifying and combining read from #{public_dir}, and that all zipping is delayed until after the :tocompressed task is invoked.

desc "Copy public contents to compressed folder"
task :tocompressed do
  puts "\n## Copying #{public_dir} to #{compressed_dir}"
  cp_r "#{public_dir}/.", "#{compressed_dir}"
  cd "#{compressed_dir}"
end

Adding this extra directory to the process leaves a public folder that can still be previewed, while deployment is done from compressed_dir.

Iterative deploys

Here is the problem I was experiencing: deploys became long and dreadful after adding GZip to the deployment process. The s3cmd tool allows incremental uploading, but since I started GZipping the files, s3cmd seemed to just upload everything. At first I thought this might be because I was adding the ‘Content-Encoding’ header to the files I was deploying; then I wondered if I could get around it with the --skip-existing parameter to the s3cmd command.

Solution: the problem was caused by GZip embedding a timestamp inside each compressed file, so identical content never produced identical files. This was solved by adding the -n parameter to the gzip command.

desc "GZip HTML"
task :gzip_html do
  puts "## GZipping HTML"
  system 'find compressed/ -type f -name \*.html -exec gzip -9 -n {} \;'
  Dir['compressed/**/*.html.gz'].each do |f|
    test(?f, f) and File.rename(f, f.gsub(/\.html\.gz/, '.html'))
  end
end
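The effect of -n is easy to verify: with it, two gzips of identical content made at different times are byte-for-byte identical, which is exactly what lets s3cmd skip unchanged files. A quick demonstration (scratch file names are illustrative):

```shell
# two copies of the same content, with different modification times
printf 'hello world\n' > a.txt
cp a.txt b.txt
touch -t 202001010000 b.txt

# -n omits the embedded filename and timestamp, making output deterministic
gzip -9 -n -c a.txt > a.gz
gzip -9 -n -c b.txt > b.gz

cmp a.gz b.gz && echo "identical"
```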

I still find myself evaluating the Octopress environment, but it seems highly customizable, so what’s not to love?

Final Week for Google’s RSS Reader

Google will kill its Reader on the first of July, as mentioned here. I have been trying alternatives since the announcement last March. If you haven’t made up your mind, now is the time to check out this huge list of Reader alternatives. I’m still waiting for the Digg reader to appear, but after reading Macdrifter’s Feedly review I’m pretty certain that I’ve at least found a promising alternative.

Don’t Worry About China Making Cheaper Products

A friend in Shanghai pointed me to this story today: the fastest computer in the world:

The Tianhe-2 was built by the National University of Defense Technology in China. It will be deployed at the National Supercomputer Center in Guangzhou – two years ahead of schedule.

Amazing. From a t-shirt economy to railroads, highways and planes; and just look at the rate at which the Chinese are registering patents. One phrase comes to mind: “You should not worry about China making cheaper products, you should worry about China making better products”.

Beautify Terminal With Oh My ZSH


Get a marvelous terminal in just a few steps. If you are already using ZSH as your shell, you might like to try this; if you are using bash, I’d recommend giving it a go and seeing what you like best.

I installed ZSH using brew:

brew install zsh


Then I installed Robby Russell’s Oh My ZSH using:

curl -L https://github.com/robbyrussell/oh-my-zsh/raw/master/tools/install.sh | sh

Now edit your profile like so:

vim ~/.zshrc

And you can set up your theme. I’ve chosen ‘af-magic’ for my terminal.
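In ~/.zshrc the theme is a single variable (af-magic ships with Oh My ZSH):

```
# ~/.zshrc: any theme name from ~/.oh-my-zsh/themes works here
ZSH_THEME="af-magic"
```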

The Week We Ordered a Cow

A new habit. Instead of going to the supermarket, Suna and I decided we wanted to buy fresh meat directly from a farmer. We had three reasons for this:

  • Horse meat scandal: over the last year, European supermarkets sold horse meat to customers intending to buy beef. It seems like supermarkets have no idea, or do not care, where their food comes from. The farmer we located has his cows graze 10 km from our house.
  • Price: this one is a bit hard to calculate, but I tried. I took the price of 1 kg of ground beef from the supermarket and compared that to the stock we had. It turns out the price of our meat is about 20% less than the supermarket’s. It’s a rough calculation, but I’m pretty sure it’s generally cheaper. We did have to portion the meat ourselves and purchase a freezer to store it in.
  • Changing eating habits: after half a decade of restaurant food, the last year was full of making dishes in our own kitchen. Now that we are cooking every day, we found there are a few dishes we can make well, and that we tend to vary our vegetables but not our meats. Our farmer delivered various cuts, from soup bone to round steak, which forces us to cook a diverse meal every day.

I put the story above on Facebook and got mixed replies. Some responded with repulsion; others got curious and wanted to try ordering for themselves. It seemed to me that my friends who grew up on a farm were least appalled by the idea of sourcing your own meat. Perhaps some don’t realize that eating meat means killing an animal. It’s a sad thing, but it’s true. Yesterday, Suna and I watched cows graze while we took a walk; we agreed that at least our cow was able to walk freely outside.

From Raw Materials

I made my own ketchup from a recipe found online (Dutch). The people who tasted the sauce seemed to prefer the homemade version.

Making ketchup seems like a small thing, but we’re starting to enjoy going through the process of making food from its raw materials. I’m thinking about what to make next: bread? Sake?

Put That Cloud Outside the US

Some marketing advice from Trevor Pott for non-US cloud companies:

To effect change we are left with a boycott in everything but name. It means that non-US Western businesses need to start using “not subject to US law” as a marketing point. We need cloud providers and software vendors that don’t have a US presence, no US data centers, no US employees – no legal attack surface in that nation of any kind. Perhaps most critical of all, we need a non-American credit-card company.

JPG, PNG and WebP – Crunching Images for Performance

The other day we were scratching our heads about an e-commerce website with terrible load times and found out that 75% of the content was images… and those images weren’t optimized. We went through the process of optimizing them, and in the process stumbled on an exciting new format: WebP.

Optimizing JPG and PNG

First the website. Most images were product images, logos and the like. These had been created by the site admin, who used Photoshop before uploading them to the admin tool. She asked me why we needed to optimize before uploading.

The original bitmap of an image is far too large to use online; two common encodings are PNG and JPG. PNG is lossless – it compresses with DEFLATE, the same algorithm gzip uses, so the decompressed image looks exactly the same as before compression. JPG is lossy: the compression algorithm discards data by averaging neighboring pixels, which makes the image a lot smaller. I recommended she use JPG unless she needs transparency, which JPG doesn’t support.

However, PNG and JPG image files are often needlessly large. This is due to extra data inside the file, such as comments, metadata or unused palette entries, as well as inefficient encoding: a weak DEFLATE compressor in the case of PNG, and unoptimized Huffman tables in the case of JPG.

To optimize the user’s experience, PNG images should be crushed, which can be done with free tools like pngcrush. For convenience, I introduced all the staff to ImageOptim – which applies a number of crushers to optimize an image – and asked everyone to optimize their images before uploading. The same day, we saw a 300 kB reduction on the homepage, hooray! But a week later we realized that manual optimization wasn’t sufficient; large images were once again popping up left and right, and it was hard getting everyone to optimize by hand. So we built a simple cron job, executed daily, to take care of the process.

The commands we put in a bash script ($1 is the path given to the script):

find "$1" -name '*.jpg' -print0 | xargs -0 jpegoptim --strip-all -f
find "$1" -name '*.png' -print0 | xargs -0 optipng -o7
find "$1" -name '*.jpg' -exec /home/joop/bin/jpegrescan -s {} {} \;
find "$1" -name '*.jpg' -exec jpegtran -copy none -optimize -progressive -outfile {} {} \;
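The daily execution is just a cron entry pointing at that script; a sketch, assuming the commands above are saved as /home/joop/bin/crush-images.sh and the uploads live in /var/www/shop/images (both paths illustrative):

```
# /etc/cron.d/crush-images: crush newly uploaded images every night at 03:00
0 3 * * * www-data /home/joop/bin/crush-images.sh /var/www/shop/images
```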

I let that run over the weekend and the results were impressive:

folder   size      reduction
before   5.17 GB   –
after    4.64 GB   10%

A 10% difference with just a few lines of code! I will now look at most online properties with this script in mind. Use it to optimize your own web property’s image folders!

WebP and zopfli

After some digging on compression techniques, we stumbled on the ‘new’ image format called WebP.

WebP is a new image format developed by Google that provides lossless and lossy compression for images on the web. WebP lossless images are 26% smaller in size compared to PNGs. In approach, WebP images are a hybrid of PNG and JPG: the best of both worlds.

I ran a test:

Picture I took at 경복궁. Left: original (6.2 MB). Right: WebP (0.5 MB)

scenario   size     % of original
Original   6.2 MB   100%
JPG        1.3 MB   21%
WebP       0.5 MB   8%

Google wasn’t lying with their 26%; in my test the WebP came out at 8% of the original, versus 21% for the JPG (helped by a lossy quality setting of 80%). I would like to set this up for more websites, but unfortunately WebP isn’t supported by many browsers yet. In fact, only Chrome was able to show me my image on my computer! So browser support is definitely a problem; however, a fallback method to JPG could be made to support all browsers.
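That fallback could be sketched at the webserver level; for example in nginx, serving a .webp sibling only when the browser advertises support (untested config; it assumes each image has a pre-generated .webp file next to it):

```
# http block: choose a suffix based on the Accept request header
map $http_accept $webp_suffix {
    default        "";
    "~image/webp"  ".webp";
}

# server block: try image.jpg.webp first, then the original
location ~* \.(jpe?g|png)$ {
    add_header Vary Accept;
    try_files $uri$webp_suffix $uri =404;
}
```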

Another discovery in compression: the WebP compression technique led Google developers to a side project called zopfli. It’s supposed to compress files further than gzip or 7-Zip; interestingly enough, the output is supported by all browsers that support deflate, including IE6.

I wonder if zopfli could be used to recompress PNG files – its DEFLATE output slots straight into the PNG format – so that better compression rates could be achieved on browsers other than Google Chrome. It would probably not be as efficient as WebP, but at least browser support is guaranteed with minimal resources.

Mod_pagespeed on Nginx

mod_pagespeed is an open source module that optimizes your site for speed. It was already available for the Apache webserver, and now the module is also available for nginx. We had already deployed mod_pagespeed on some Apache2 production environments, so I installed the nginx version over the weekend to see how it compares.

I installed it using the instructions at github/pagespeed.
Note: I rebuilt using the instructions and nginx ended up in a different directory, so I had to clean up and relocate some paths, including the init.d scripts.

Then I edited the nginx configuration:

vi /usr/local/nginx/sites-available/default 

With the following config:

server {
    listen 8080;
    server_name lab.joop.in;
    root /var/www/lab.joop.in;
    index index.html index.htm index.php;

    location ~ "\.pagespeed\.([a-z]\.)?[a-z]{2}\.[^.]{10}\.[^.]+" { }
    location ~ "^/ngx_pagespeed_static/" { }
    location ~ "^/ngx_pagespeed_beacon$" { }
}


And I tweaked the options a bit:

vi /usr/local/nginx/conf.d/ngx_pagespeed.conf

pagespeed on;
pagespeed ImageRecompressionQuality 80;

pagespeed EnableFilters combine_css,rewrite_css,sprite_images,combine_javascript,rewrite_images,inline_images,recompress_images,resize_images,collapse_whitespace,remove_comments,extend_cache,combine_heads,move_css_above_scripts,make_google_analytics_async,convert_png_to_jpeg,insert_image_dimensions,rewrite_javascript;

# needs to exist and be writable by nginx
pagespeed FileCachePath /var/ngx_pagespeed_cache;

Then restart nginx; when you open the page you will notice an

X-Page-Speed: <version>

header in the HTTP response.

With mod_pagespeed, I was able to inline images into the page content to reduce requests. Besides that, I was quickly able to combine all CSS files into one and defer the JS execution on the page.

Website 1: A snappy magento website (homepage)

version          load time   first byte   start render   DOM elements   requests
no pagespeed     2.551s      0.111s       1.065s         2175           55
with pagespeed   2.363s      0.130s       –              2120           51

Website 2: A slow and bulky WordPress blog (homepage)

version          load time   first byte   start render   DOM elements   requests
no pagespeed     11.854s     0.107s       1.251s         10312          139
with pagespeed   11.277s     0.127s       1.190s         10313          134

The outcome was moderate; even though I used a lot of (experimental) filters, they seemed to reduce loading times only a little. Besides the moderate results, I’m not too excited about solving problems at the webserver level; problems are best fixed at their origin – like writing code that’s fast to begin with. However, taking over chores like inline image generation and automatic sprite creation is useful. My biggest problem at the moment is that I can’t seem to get Varnish to play nicely with it: it keeps caching the version of the site that isn’t optimized by mod_pagespeed. We see the same problem on our Apache servers, though. In general, this module will work nicely as a quick speed injection for our smaller nginx projects, which don’t get the speed attention they deserve.


From Davidbcalhoun.com, on the trade-offs of inlining images as data URIs:

  • base64 encoding makes file sizes roughly 33% larger than their original binary representations, which means more data down the wire (this might be exceptionally painful on mobile networks)
  • data URIs aren’t supported on IE6 or IE7
  • base64-encoded data may take longer to process than binary data (anyone want to do a study on this?) (again, this might be exceptionally painful for mobile devices, which have more limited CPU and memory)
  • side note: CSS background-images seem to actually be faster than img tags