File ordering inside an archive changes the compression ratio

I saw "Why are tar.xz files 15x smaller when using Python's tar library compared to macOS tar?" on Hacker News Daily. The author asks why archives produced with Python's tarfile come out 15x smaller than ones produced with tar; the files are all JSON, compressed into XZ format:

I'm compressing ~1.3 GB folders each filled with 1440 JSON files and find that there's a 15-fold difference between using the tar command on macOS or Raspbian 10 (Buster) and using Python's built-in tarfile library.

Seeing 1440 files, your intuition should say one file per minute for a whole day (60 × 24 = 1440)...

The next day he found the cause: after installing GNU Tar and adding the --sort='name' option, the output size was about the same as Python's tarfile:

Ok, I think I found the issue: BSD tar and GNU tar without any sort options put the files in the archive in an undefined order.

After installing GNU tar on my Mac with:

brew install gnu-tar

And then tarring the same folder, but with the --sort option:

gtar --sort='name' -cJf zsh-archive-sorted.tar.xz /Users/user/Desktop/temp/tar/2021-03-11

I get a .tar.xz archive of 1.5 MB, equal to the archive created by the Python library.

The underlying reason is that filename similarity correlates with content similarity (since these are all sensor data), so sorting by filename before compressing puts similar JSON files next to each other, and xz can exploit that to push the compression ratio way up:

My JSON files contain measurements from hundreds of sensors. Every minute I read out all sensors, but only a few of these sensors have a different value from minute to minute.

By sorting the files by name (which has the creation unixtime at the beginning of it), two subsequent files have very little different characters between them. Apparently this is very favourable for the compression efficiency.

When you hit a similar scenario, treat this as one more tuning knob and test whether the output gets a lot smaller...
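
For reference, here is a minimal Python sketch of the same trick (paths are placeholders): add the files in sorted filename order so similar content ends up adjacent in the archive. As far as I know, since Python 3.7 tarfile itself recurses into directories in sorted order, which is presumably why the author's Python archives were small to begin with.

import os
import tarfile

src = "2021-03-11"  # placeholder: directory of per-minute JSON files

# Equivalent of gtar --sort='name': add entries in sorted filename order.
with tarfile.open("archive-sorted.tar.xz", "w:xz") as tar:
    for name in sorted(os.listdir(src)):
        tar.add(os.path.join(src, name))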

Adding compression for favicon in nginx...

Saw this via "Compressed favicons are 70% smaller but 75% of them are served uncompressed": they found that about 73.5% of websites serve favicon.ico without compression:

The HTTP Archive dataset of favicons from 4 million websites crawled from desktop devices on May 2019 shows that 73,5 % of all favicons are offered without any compression with an average file size of 10,5 kiB, 21,5 % are offered with Gzip compression at an average file size of 4 kiB, and 5 % offer Brotli compression at an average file size of 3 kiB.

I hadn't added it on my own site either... after adding the gzip settings, the transfer size of favicon.ico dropped from 4.2KB to 1.2KB.

I use nginx; on Ubuntu, gzip is already enabled by default in nginx's nginx.conf, so it only takes a few extra settings to tell it to also handle ico files.

The way to do it is to put this in /etc/nginx/conf.d/gzip.conf:

gzip_comp_level 9;
gzip_types image/vnd.microsoft.icon image/x-icon;
gzip_vary on;

There are two settings beyond what the article mentions: gzip_comp_level is raised to 9 (the default is 1), and when gzip is used the response should say so in Vary, to avoid caches serving the wrong variant.
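
A quick way to check the result (my own sketch, with a placeholder URL): request favicon.ico with Accept-Encoding and look at the response headers. urllib does not decompress automatically, so the byte count below is the actual transfer size.

import urllib.request

req = urllib.request.Request(
    "https://example.com/favicon.ico",  # placeholder: your own site
    headers={"Accept-Encoding": "gzip"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.headers.get("Content-Encoding"))   # expect "gzip"
    print(resp.headers.get("Vary"))               # expect "Accept-Encoding"
    print(len(resp.read()), "bytes transferred")  # compressed size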

Lossy image compression with PNG...

PNG is a lossless image format, but we can still modify pixels (lossily) so that the PNG compresses better. The Mac OS X application I saw today in "PNG can be a lossy format" does exactly that.
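
This is not the app's actual algorithm, just a rough Pillow sketch of the general idea (filenames are placeholders, and it needs a recent Pillow): fewer distinct colors means more repetitive pixel data, which PNG's DEFLATE stage compresses much better.

from PIL import Image  # third-party: pip install Pillow

img = Image.open("input.png").convert("RGBA")
# The lossy step: cut the image down to a small palette. FASTOCTREE is
# one of the quantizers that accepts RGBA input.
lossy = img.quantize(colors=64, method=Image.Quantize.FASTOCTREE)
lossy.save("output.png", optimize=True)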

Although it's an application, the author still explains which algorithms are used and where each comes from. Two of them are:

At the end of the article, the author is quite fed up with GIF... XD

GIF has antiquated compression and it's a complete waste of bandwidth. Even lossy GIF is worse than lossless optimized PNG.

Also, JPEG/WebP are still smaller, but JPEG has many variants, and browser and operating system support is still a big obstacle:

Whether lossy PNG gives better results than JPEG depends on the image. JPEG often gives smaller files, except when image has sharp edges (e.g. text) or any transparency (which JPEG does not support at all).

Optimized lossy PNG is still a bit larger than lossy JPEG-XR/WebP/JPEG-2K, but unlike these formats it's supported by all browsers and operating systems without any fuss or hacks.

Finally, I noticed lossypng is written in Go, and the code isn't long; it looks like fun to play with... (maybe package it as a port?)

Google releases a zlib/deflate-compatible compressor, another 5% smaller...

Google released a zlib/deflate-compatible compressor under the Apache License, Version 2.0: "Compress Data More Densely with Zopfli".

Being zlib/deflate compatible means no existing browser needs to change, and the project page describes it like this:

Zopfli Compression Algorithm is a new zlib (gzip, deflate) compatible compressor. This compressor takes more time (~100x slower), but compresses around 5% better than zlib and better than any other zlib-compatible compressor we have found.

It's about 100x slower than existing zlib-compatible compressors (XDDD), but for static content it helps a lot, since you compress once and serve many times.
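
A small sketch of the compress-once idea, assuming the third-party py-zopfli bindings (pip install zopfli) and a placeholder filename; the key point is that the output is ordinary gzip, so every existing decoder reads it unchanged.

import gzip
import zopfli.gzip  # third-party py-zopfli bindings, an assumption here

data = open("app.js", "rb").read()       # placeholder static asset
compressed = zopfli.gzip.compress(data)  # ~100x slower, ~5% smaller

# Standard gzip can decode it, so browsers need no changes:
assert gzip.decompress(compressed) == data
open("app.js.gz", "wb").write(compressed)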

Compression ratio of xz (LZMA)

I used to back up BBS data with gzip, encrypt it with openssl, and upload it to Amazon S3; the files were around 1GB. I tried bzip2 at one point, which got them down to about 900MB, but the extra compression time wasn't worth the space saved...

While testing the 7z format a while ago, I discovered that xz's compression ratio is frighteningly good... Compression takes longer, of course, but it gets below 500MB, which cuts the cost of pushing to S3 by a lot...

Here are the compression results:

xz -1 and xz -2 are both very fast, about the same as gzip -9 and bzip2 -9. Barring surprises (like software patents), this should be the way forward...
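
If you want to run a similar comparison yourself, here is a quick standard-library sketch (the filename is a placeholder; the presets roughly correspond to the command-line levels above):

import bz2
import gzip
import lzma

data = open("backup.tar", "rb").read()  # placeholder archive
print("gzip -9 :", len(gzip.compress(data, compresslevel=9)))
print("bzip2 -9:", len(bz2.compress(data, compresslevel=9)))
print("xz -9   :", len(lzma.compress(data, preset=9)))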