Octopress Update

Octopress is a highly customizable blog post generator. Posts are written and stored in Markdown, and Octopress makes it easy to distribute the Jekyll-generated HTML through S3 or GitHub. If you use Octopress, I recommend following ‘Making Octopress Fast’ by Eric Wendelin. The guide will help you set up a GZipped website on Amazon S3. It worked like a charm, but I lost two important things: previewing and quick deploys.

Previewing

After following the guide, all HTML, CSS, and JS files are stored as GZip-9 files. They are unreadable unless you add a ‘Content-Encoding: gzip’ header to each file and enable a deflate mechanism on your local web server. This also renders the WEBrick preview server that ships with Jekyll useless.

Solution: Preview from the public folder and add a second directory for compression.

I’ve added a reference to a compressed directory. Make sure that all directives for minifying and combining read from #{public_dir} and that all zipping is delayed until after the tocompressed task is invoked.

desc "Copying public contents to compressed folder"
task :tocompressed do
   puts "## Copying to compressed directory"
   puts "\n## copying #{public_dir} to #{compressed_dir}"
          cp_r "#{public_dir}/.", "#{compressed_dir}"
                 cd "#{compressed_dir}"
    end

Adding this extra directory to the process leaves a public folder that can be previewed, while deployment is done from compressed_dir.
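
For reference, compressed_dir is just a variable defined near the top of the Rakefile, next to the existing public_dir; a minimal sketch (the directory name is my own choice):

public_dir     = "public"      # existing Octopress setting
compressed_dir = "compressed"  # staging area for the gzipped copy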

Iterative deploys

Here is the problem I was experiencing: deploys became long and dreadful after adding GZip to the deployment process. The s3cmd tool allows incremental uploading, but since I started GZipping the files, s3cmd seemed to just upload everything. At first I thought this might be because of the ‘Content-Encoding’ headers I was adding to the files. Then I wondered if I could get around it with the ‘--skip-existing’ parameter to the s3cmd command.

Solution: The problem was caused by gzip embedding a timestamp inside each compressed file, which makes every build produce byte-different files and defeats s3cmd’s change detection. Adding the -n parameter to the gzip command solved it.

desc "GZip HTML"
task :gzip_html do
puts "##GZipping HTML"
  system 'find compressed/ -type f -name \*.html -exec gzip -9 -n {} \;'
     Dir['**/*.html.gz'].each do |f| test(?f, f) and File.rename(f, f.gsub(/\.html\.gz/, '.html'))
        end
     end
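
For context, the deploy step is then an s3cmd sync from the compressed directory; roughly like this sketch (the bucket name is a placeholder):

# Sync only changed files, publicly readable, with the gzip header attached
s3cmd sync --acl-public --add-header='Content-Encoding: gzip' compressed/ s3://my-bucket/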

I still find myself evaluating the Octopress environment, but it seems highly customizable, so what’s not to love?

Final Week for Google’s RSS Reader

Google will kill its Reader on the first of July, as mentioned here. I have been trying alternatives since the announcement last March. If you haven’t made up your mind, now is the time to check out this huge list of Reader alternatives. I’m still waiting for the Digg reader to appear, but after reading Macdrifter’s Feedly review I’m pretty certain that, at the least, I’ve found a promising alternative.

Don’t Worry About China Making Cheaper Products

A friend in Shanghai pointed me to this story today, about the fastest computer in the world:

The Tianhe-2 was built by the National University of Defense Technology in China. It will be deployed at the National Supercomputer Center in Guangzhou – two years ahead of schedule.

Amazing. From a t-shirt economy to railroads, highways, and planes; and just look at the rate at which the Chinese are registering patents. One phrase comes to mind: “You should not worry about China making cheaper products, you should worry about China making better products.”

Beautify Terminal With Oh My ZSH


Get a marvelous terminal in just a few steps. If you are already using ZSH as your shell, you might like to try this. If you are using bash, I’d recommend giving it a go and seeing what you like best.

I installed ZSH using brew:

brew install zsh

zsh
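
If you want this zsh as your default login shell instead of launching it by hand, something like the following should work (the path assumes Homebrew’s default /usr/local prefix):

echo /usr/local/bin/zsh | sudo tee -a /etc/shells   # whitelist the new shell
chsh -s /usr/local/bin/zsh                          # make it the login shell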

Then I installed Robby Russell’s Oh My ZSH using:

curl -L https://github.com/robbyrussell/oh-my-zsh/raw/master/tools/install.sh | sh

Now edit your profile like so:

vim .zshrc

And you can set up your theme. I’ve chosen ‘af-magic’ for my terminal.
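
Picking the theme is a one-line change in .zshrc:

ZSH_THEME="af-magic"   # any theme name from oh-my-zsh's themes folder works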

The Week We Ordered a Cow

A new habit: instead of going to the supermarket, Suna and I decided we wanted to buy fresh meat from a farmer. We had three reasons for this:

  • Horse meat scandal: Over the last year, European supermarkets sold horse meat to customers intending to buy beef. It seems like supermarkets have no idea, or do not care, where their food comes from. The farmer we found has his cows graze 10 km from our house.
  • Price: This one is a bit hard to calculate, but I tried. I took the price of 1 kg of ground beef from the supermarket and compared that to the batch we had. Turns out, the price of our meat is about 20% less than the supermarket’s. It’s a rough calculation, but I’m pretty sure it’s cheaper overall, even though we had to portion the meat ourselves and buy a freezer to store it in.
  • Changing eating habits: After half a decade of restaurant food, the last year was full of cooking in our own kitchen. Now that we cook every day, we’ve found there are a few dishes we can make well: we tend to vary our vegetables, but not our meats. Our farmer delivered various cuts, from soup bone to round steak, which forces us to cook a diverse meal every day.

I put the story above on Facebook and got mixed replies. Some responded with repulsion; others got curious and wanted to try ordering for themselves. It seemed to me that the friends who grew up on a farm were the least appalled by the idea of sourcing your own meat. Perhaps some don’t realize that eating meat means killing an animal. It’s a sad thing, but it’s true. Yesterday, Suna and I watched cows graze while we took a walk, and we agreed that at least our cow was able to walk freely outside.

From Raw Materials

I made my own ketchup with a recipe found online (in Dutch). The people who tasted the sauce seem to prefer the homemade version.

Making ketchup seems like a small thing, but we’re starting to enjoy going through the process of making food from its raw materials. I’m thinking about what to make next: bread, sake?

Put That Cloud Outside the US

Some marketing advice by Trevor Pott for non-US cloud companies:

To effect change we are left with a boycott in everything but name. It means that non-US Western businesses need to start using “not subject to US law” as a marketing point. We need cloud providers and software vendors that don’t have a US presence, no US data centers, no US employees – no legal attack surface in that nation of any kind. Perhaps most critical of all, we need a non-American credit-card company.

JPG, PNG, and WebP – Crunching Images for Performance

The other day we were scratching our heads over an e-commerce website with terrible load times and found out that 75% of the content was images… and those images weren’t optimized. We went through the process of optimizing them and, along the way, stumbled on an exciting new format: WebP.

Optimizing JPG and PNG

First, the website. Most images were product images, logos, and the like. These had been created by the site admin, who used Photoshop before uploading them to the admin tool. She asked me why we needed to optimize before uploading.

The original bitmap of an image is in fact too large to use online; two common encodings are PNG and JPG. PNG is lossless (it is DEFLATE-compressed, like gzip), meaning the decompressed image looks exactly the same as before compression. JPG is lossy: the compression algorithm discards data by approximating neighboring pixels, making the image a lot smaller. I recommended she use JPG, unless she needs transparency, which JPG doesn’t support.
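
To make the tradeoff concrete: re-encoding a PNG as JPG is a one-liner with, for example, ImageMagick (quality 85 is an arbitrary illustration value):

# Lossy re-encode: much smaller file, slight quality loss, transparency dropped
convert product.png -quality 85 product.jpg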

However, PNG and JPG files are often needlessly large. For PNG this is due to extra data inside the file, such as comments or unused palette entries, plus inefficient DEFLATE compression; for JPG it is mostly metadata and unoptimized Huffman tables.

To optimize the user’s experience, PNG images should be optimized, which can be done using free tools like pngcrush. For convenience, I introduced all the staff to ImageOptim – which applies a number of crushers to optimize an image – and asked them to optimize images before uploading. The same day, we saw a 300 kB reduction on the homepage, hooray! But a week later we realized that manual optimization wasn’t sufficient; large images were once again popping up left and right, and it was hard getting everyone to optimize by hand. So we built a simple cron job, executed daily, to take care of the process:

The commands went into a bash script ($1 = the path given to the script):

# Strip metadata from JPGs and recompress in place
find "$1" -name '*.jpg' -print0 | xargs -0 jpegoptim --strip-all -f
# Recompress PNGs with optipng's most aggressive setting
find "$1" -name '*.png' -print0 | xargs -0 optipng -o7
# Squeeze JPGs further, then optimize Huffman tables and make them progressive
find "$1" -name '*.jpg' -exec /home/joop/bin/jpegrescan -s {} {} \;
find "$1" -name '*.jpg' -exec jpegtran -copy none -optimize -progressive -outfile {} {} \;
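
The daily run is plain cron; a sketch of what the entry could look like (script name, paths, user, and schedule are all assumptions):

# /etc/cron.d/optimize-images: run the optimizer nightly over the uploads folder
0 3 * * * www-data /home/joop/bin/optimize-images.sh /var/www/shop/uploads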

I let that run over the weekend and the results were impressive:

| folder | size    | reduction |
|--------|---------|-----------|
| before | 5.17 GB |           |
| after  | 4.64 GB | 10%       |

A 10% difference with just a few lines of code! I will now look at most online properties with this script in mind. Use it to optimize your web property’s image folder!

WebP and zopfli

After some digging into compression techniques, we stumbled on the ‘new’ image format called WebP.

WebP is a new image format developed by Google that provides lossless and lossy compression for images on the web. WebP lossless images are 26% smaller than PNGs. In approach, WebP images are a hybrid of PNG and JPG: the best of both worlds.

I ran a test:
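
The conversion itself is a one-liner with Google’s cwebp encoder; a sketch (quality 80 matches the lossy setup mentioned below):

# Encode the source image to WebP at 80% quality
cwebp -q 80 original.jpg -o original.webp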

Picture I took at 경복궁. Left: original (6.2 MB). Right: WebP (0.5 MB).

| Scenario | image size | % of original |
|----------|------------|---------------|
| Original | 6.2 MB     |               |
| JPG      | 1.3 MB     | 21%           |
| WebP     | 0.5 MB     | 8%            |

Google wasn’t lying with their 26%; in my test I was able to reduce the image to 21% (due to the lossy setup at 80% quality). I would like to set this up for more websites, but unfortunately WebP isn’t supported by many browsers yet. In fact, only Chrome was able to show me my image on my computer! So browser support is definitely a problem, though a fallback to JPG could be built to support all browsers.
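
One way such a fallback could be sketched in markup is to offer the WebP first and let other browsers take the JPG (a hypothetical snippet, not something we deployed):

<picture>
  <source srcset="photo.webp" type="image/webp"> <!-- WebP-capable browsers use this -->
  <img src="photo.jpg" alt="photo">              <!-- everyone else gets the JPG -->
</picture>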

Another discovery in compression: the WebP work led Google developers to a side project called zopfli. It compresses files further than gzip or 7-Zip; interestingly enough, because its output is a valid deflate stream, it is supported by all browsers that support deflate, including IE6.
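
Trying it is straightforward once the zopfli binary is built (the iteration count is a tuning knob: more iterations, denser output, slower compression):

# Produces style.css.gz as a standard gzip stream, just packed harder
zopfli --i1000 style.css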

I wonder if zopfli could be used to compress JPG and PNG files so that marvelous compression rates could be achieved on browsers other than Google Chrome. It would probably not be as efficient as WebP, but at least we could guarantee browser support with minimal resources.
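
For PNG, at least, this looks feasible: the zopfli project also ships a zopflipng tool that recompresses a PNG’s deflate streams losslessly (JPG doesn’t use deflate, so it wouldn’t apply there). A sketch:

# Lossless: same pixels, smaller file, readable by every PNG decoder
zopflipng input.png output.png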