Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data occupies considerably less disk space than the original, so more content can be kept in the same amount of space. There are many different compression algorithms that work in different ways. With some of them, only redundant bits are removed, which means that when the information is decompressed there is no loss of quality. Others discard bits they deem unnecessary, so decompressing the data later results in lower quality compared with the original. Compressing and decompressing content consumes a significant amount of system resources, in particular CPU processing time, so any web hosting platform that employs real-time compression needs to have adequate processing power to support that feature. One example of how information can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there should be instead of storing the entire sequence.
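The "6x1" example above is the idea behind run-length encoding. Here is a minimal sketch in Python; the function names `rle_encode` and `rle_decode` are illustrative, not from any library:

```python
def rle_encode(bits: str) -> str:
    """Collapse each run of repeated characters into a 'countxchar' pair."""
    out = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical characters.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        out.append(f"{j - i}x{bits[i]}")
        i = j
    return " ".join(out)

def rle_decode(encoded: str) -> str:
    """Expand each 'countxchar' pair back into the original run."""
    pairs = (p.split("x") for p in encoded.split(" "))
    return "".join(char * int(count) for count, char in pairs)

print(rle_encode("111111"))            # 6x1
print(rle_encode("0001111011"))        # 3x0 4x1 1x0 2x1
print(rle_decode("3x0 4x1 1x0 2x1"))   # 0001111011
```

Because only the run lengths are stored and nothing is discarded, decoding reproduces the input exactly, making this a simple example of lossless compression.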