The term data compression refers to reducing the number of bits needed to store or transmit data. Compression can be done with or without the loss of information: lossless compression removes only redundant data, so when the data is decompressed it is restored exactly and its quality is unchanged, while lossy compression also discards information deemed unneeded, so the quality of the decompressed data is lower. Different compression algorithms suit different kinds of data. Compressing and decompressing data usually takes a lot of processing time, so the server performing the operation needs sufficient resources to process your data quickly enough. A simple example of compression is run-length encoding: instead of storing each individual 1 and 0 in a binary sequence, you store how many consecutive positions hold a 1 and how many hold a 0.
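The run-length idea described above can be sketched in a few lines of Python. This is an illustrative toy, not the algorithm any particular hosting platform uses; the function names are my own:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Store each run of identical bits as a (bit, count) pair
    instead of the literal sequence of 1s and 0s."""
    runs = []
    prev, count = bits[0], 0
    for b in bits:
        if b == prev:
            count += 1
        else:
            runs.append((prev, count))
            prev, count = b, 1
    runs.append((prev, count))
    return runs


def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Rebuild the original bit string from the (bit, count) pairs."""
    return "".join(bit * count for bit, count in runs)


data = "0000000011111111110000"
encoded = rle_encode(data)
print(encoded)                      # [('0', 8), ('1', 10), ('0', 4)]
assert rle_decode(encoded) == data  # lossless: exact reconstruction
```

Because nothing is discarded, this is lossless: decoding always reproduces the input bit for bit, which is the property a file-system-level compressor must have.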

Data Compression in Shared Web Hosting

The compression algorithm we employ on the cloud hosting platform where your new shared web hosting account will be created is called LZ4, and it is built into the advanced ZFS file system that powers the platform. LZ4 is superior to the algorithms other file systems use: its compression ratio is considerably higher and it processes data much faster. The speed is most noticeable during decompression, which happens even faster than data can be read from a hard drive, so LZ4 improves the performance of every website hosted on a server that uses it. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to make multiple daily backups of the entire content of all accounts and keep them for thirty days. Not only do the backup copies take up less space, but generating them does not slow the servers down, as often happens with other file systems.
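The essential property behind all of this is lossless, transparent compression: data shrinks on disk but decompresses byte for byte identical. A minimal sketch of that round trip, using Python's standard zlib module purely as a stand-in (the platform described above uses LZ4 inside ZFS; LZ4 itself is available in Python only via the third-party lz4 package):

```python
import zlib

# Repetitive text, typical of the HTML/CSS a web host serves,
# compresses well with any LZ77-family algorithm (zlib here; ZFS uses LZ4).
original = b"<div class='row'></div>\n" * 1000

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

assert restored == original  # lossless: exact round trip
ratio = len(original) / len(compressed)
print(f"{len(original)} -> {len(compressed)} bytes (ratio {ratio:.1f}x)")
```

A backup made from such compressed data occupies far less space than the raw content, which is the trade-off the paragraph above describes: a little CPU time in exchange for smaller, faster backups.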