Data compression is the process of encoding data using fewer bits than the original representation. Compressed content requires less disk space than the original, so more of it can be stored in the same amount of space. There are various compression algorithms that work in different ways. Many of them remove only redundant bits, so no quality is lost once the data is uncompressed (lossless compression); others discard less important bits, so uncompressing the data at a later time yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes system resources, in particular CPU processing time, so any web hosting platform that compresses data in real time needs sufficient processing power to support the feature. A simple example of compression is substituting a binary sequence such as 111111 with 6x1, i.e. recording how many consecutive 1s or 0s occur instead of storing the sequence itself.
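The 111111 → 6x1 idea above is known as run-length encoding. A minimal sketch in Python (the function names are illustrative, not from any particular library) might look like this:

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Run-length encode a string of bits: '111111' -> [(6, '1')]."""
    if not bits:
        return []
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            # The run ended; record its length and symbol.
            runs.append((count, prev))
            count = 1
    runs.append((count, bits[-1]))  # record the final run
    return runs


def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Reverse the encoding: [(6, '1')] -> '111111'."""
    return "".join(symbol * length for length, symbol in runs)


print(rle_encode("111111"))   # [(6, '1')]
print(rle_decode(rle_encode("1110001")))  # '1110001'
```

Because decoding reproduces the input exactly, this is a lossless scheme; it pays off only when the data actually contains long runs of repeated symbols.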
Data Compression in Hosting
The compression algorithm employed by the ZFS file system, which runs on our cloud web hosting platform, is called LZ4. It can improve the performance of any website hosted in an account with us: not only does it compress data more effectively than the algorithms used by other file systems, it also uncompresses data faster than a hard disk can read it. This comes at the cost of CPU processing time, which is not a problem for our platform, as it uses clusters of powerful servers working together. Another advantage of LZ4 is that it allows us to create backups much faster and with less disk space, so we keep several daily backups of your files and databases, and generating them does not affect the performance of the servers. This way, we can always recover any content that you may have deleted by accident.
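For readers curious how LZ4 is switched on in ZFS, the sketch below shows the standard ZFS administration commands. The dataset name `tank/www` is a hypothetical example, and these commands require a system with ZFS installed and appropriate privileges:

```shell
# Check which compression algorithm is currently active on a dataset
# ("tank/www" is a placeholder dataset name).
zfs get compression tank/www

# Enable LZ4 compression; ZFS compresses new writes transparently.
zfs set compression=lz4 tank/www

# Inspect the compression ratio achieved so far.
zfs get compressratio tank/www
```

Compression applies only to data written after the property is set; existing blocks stay in whatever form they were written.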