Data Compression
Learn what data compression is and see how it can affect your websites and the experience of your site visitors.
Data compression is the reduction of the number of bits that need to be stored or transmitted, and it is very important in the web hosting field because information kept on hard drives is often compressed so as to take up less space. There are many different compression algorithms and their efficiency depends on the content. Some of them remove only redundant bits, so no data is lost (lossless compression), while others discard bits considered unnecessary, which leads to lower quality once the data is uncompressed (lossy compression). Compressing and uncompressing data takes a considerable amount of processing time, so a hosting server has to be powerful enough to do it on the fly. A simple illustration of how binary code can be compressed is to "remember" that there are five consecutive 1s, for instance, instead of storing all five 1s.
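The idea of "remembering" a run of identical bits instead of storing each one is known as run-length encoding. The short Python sketch below is only an illustration of that idea, not of any algorithm used by a hosting platform; the function names are our own.

```python
# A minimal sketch of run-length encoding: instead of storing five
# consecutive 1s bit by bit, store the bit once together with a count.
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Compress a string of '0'/'1' characters into (bit, run_length) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Restore the original bit string - nothing is lost, so this is lossless."""
    return "".join(bit * count for bit, count in runs)

original = "1111100011"
encoded = rle_encode(original)
print(encoded)                          # [('1', 5), ('0', 3), ('1', 2)]
print(rle_decode(encoded) == original)  # True
```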
-
Data Compression in Shared Web Hosting
The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. The latter is considerably faster than other widely used algorithms, particularly for compressing and uncompressing non-binary data, i.e. web content. LZ4 even uncompresses data faster than it can be read from a hard disk drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data very well and does so quickly, we are able to generate several daily backup copies of all the content kept in the shared web hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, the backup generation does not affect the performance of the web hosting servers where your content is stored.
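To get a feel for how well LZ4 shrinks typical web content, here is a small sketch that assumes the third-party "lz4" Python package (installed with pip install lz4). It simply compresses a block of repetitive HTML and compares the sizes; the exact ratio depends on the data, and this is an illustration only, not how the hosting platform itself is configured.

```python
import lz4.frame

# Repetitive markup, similar in spirit to typical web content.
html = b"<div class='post'><p>Hello, world!</p></div>\n" * 1000

compressed = lz4.frame.compress(html)
restored = lz4.frame.decompress(compressed)

print(len(html), "bytes uncompressed")
print(len(compressed), "bytes compressed")
print(restored == html)  # True - LZ4 is lossless
```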
-
Data Compression in Semi-dedicated Servers
The semi-dedicated server plans that we provide are created on a powerful cloud platform which runs on the ZFS file system. ZFS uses a compression algorithm called LZ4, which outperforms other algorithms in terms of speed and compression ratio when it comes to processing website content. This is especially true when data is uncompressed, as LZ4 does that faster than the uncompressed data could be read from a hard disk drive, so sites running on a platform where LZ4 is used will load quicker. We are able to take full advantage of this feature, even though it requires a considerable amount of CPU processing time, because our platform uses a large number of powerful servers working together and we never create accounts on a single machine the way many companies do. There is an additional benefit of using LZ4: since it compresses data very well and does that extremely fast, we can also make multiple daily backup copies of all accounts without affecting the performance of the servers and keep them for a whole month. This way, you will always be able to restore any content that you delete by mistake.
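The claim that decompression adds little overhead can be illustrated with a rough timing sketch, again assuming the third-party "lz4" Python package. The absolute numbers depend entirely on the machine it runs on; the point is only that LZ4 decompression is typically much faster than compression, which is why serving compressed content costs so little at read time.

```python
import time
import lz4.frame

# A few megabytes of repetitive site content to compress and uncompress.
data = b"<li><a href='/post/42'>Example blog post title</a></li>\n" * 200_000

start = time.perf_counter()
compressed = lz4.frame.compress(data)
compress_time = time.perf_counter() - start

start = time.perf_counter()
lz4.frame.decompress(compressed)
decompress_time = time.perf_counter() - start

print(f"compression ratio : {len(data) / len(compressed):.1f}x")
print(f"compress time     : {compress_time * 1000:.1f} ms")
print(f"decompress time   : {decompress_time * 1000:.1f} ms")
```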