In her compression theory class, Clare asked her 50 students to submit their assignment essays digitally.
Once she received all of them, she decided to compress them to save space on her server. After compression, she noticed that different files had different compression ratios.
Being the lady of efficiency that she is, she decided to write a compression program that compresses every file in the archive by the same nontrivial ratio. Will she succeed?
Consider what would happen if there were a compression algorithm that could always shrink any arbitrary file, even by just a single bit.
Let us say that we create all possible files containing $b$ bits and try to archive them. There would be $2^b$ such files in that archive.
But there are only $2^0 + 2^1 + \dots + 2^{b-1} = 2^b - 1$ files of length $\le b - 1$. By the pigeonhole principle, at least two distinct files must compress to the same output, so we are destroying information. This is something we cannot allow, since the decompressor needs the entire information available to reconstruct the original.
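As a quick sanity check of the counting argument, here is a small Python sketch (not part of the original problem; the function name `count_files` is just illustrative) that compares, for small $b$, the number of $b$-bit files with the number of strictly shorter files:

```python
def count_files(b):
    """Count bit strings of length exactly b versus length at most b - 1."""
    exact = 2 ** b                            # files of exactly b bits
    shorter = sum(2 ** k for k in range(b))   # lengths 0 .. b-1: 2**b - 1 in total
    return exact, shorter

# For every b there is one more b-bit file than there are strictly shorter files,
# so a compressor that always shrinks its input must map two inputs to one output.
for b in range(1, 9):
    exact, shorter = count_files(b)
    assert shorter == exact - 1
    print(f"b = {b}: {exact} files of length b, only {shorter} shorter files")
```

However large $b$ gets, the shorter files are always one too few, which is exactly the pigeonhole collision described above.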