The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be lossless or lossy: in the first case only redundant data is removed, so the decompressed content and its quality are identical to the original; in the second case some data is discarded, so the quality is lower after decompression. Different compression algorithms are more effective for different kinds of data. Compressing and decompressing data usually takes considerable processing time, so the server performing the operation must have adequate resources to process the data quickly enough. A simple example of how information can be compressed is run-length encoding: instead of storing every individual 1 and 0 of a binary sequence, you store how many consecutive positions hold a 1 and how many hold a 0.
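The run-length idea described above can be sketched in a few lines of Python (a minimal illustration; the function names are ours, not part of any library):

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical characters into (symbol, count) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, count) pairs back into the original string."""
    return "".join(bit * count for bit, count in runs)

data = "1111100000001111"
encoded = rle_encode(data)
print(encoded)  # [('1', 5), ('0', 7), ('1', 4)]
assert rle_decode(encoded) == data
```

Storing three (symbol, count) pairs instead of sixteen characters is exactly the saving the example in the text describes; the gain grows with longer runs of identical bits.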

Data Compression in Website Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is considerably faster than most other algorithms, particularly when compressing and decompressing non-binary data such as web content. LZ4 can even decompress data faster than the data can be read from a hard drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backups of all the content stored in the website hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the web servers where your content is kept.
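For illustration, this is how LZ4 compression is typically enabled and inspected on a ZFS dataset (the pool and dataset names below are hypothetical examples, and the exact procedure depends on the platform):

```shell
# Enable LZ4 compression on a dataset ("pool/web" is an example name)
zfs set compression=lz4 pool/web

# Verify the property and check the compression ratio ZFS has achieved
zfs get compression pool/web
zfs get compressratio pool/web
```

The `compressratio` property reports how much smaller the stored data is than the original, which is where the space savings for both content and backups come from.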