
Originally Posted by Hcodec
Of course, that was one of the first laws I studied. So I found a way to transform the elements of set A into a subset of highly compressible, lower-entropy numbers, and the inverse takes fewer steps (signal bits) to reconstruct than the original size.

Let's take a set of nine random numbers, all unique so as not to waste time with a Huffman tree: {8,1,3,4,6,2,7,9,5}. As the integer 813462795, that is 30 bits in binary, and its entropy is 28.65982114 bits. After a 4-step transform the number becomes (0,0,1,2,5,6,7,3,5), or (1,2,5,6,7,3,5), but since I have not found a way to make the sets variable length without losing integrity, I'll add padding to make the set 8 digits: (0,1,2,5,6,7,3,5). That is 21 bits, plus 2 signal bits, plus 2.33 bits for padding a 0, for 25.33 bits total, or 2.814777778 bits per number.

I would like to explain the transform, which also makes a great encryption, but we should probably move this out of Off Topic. I am not a programmer; this was a simple hand-cipher compression problem.
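For anyone who wants to check the counting by machine, here is a minimal Python sketch. It only verifies the arithmetic quoted above (the 2 signal bits and ~2.33 bits of padding are taken as given figures); the 4-step transform itself is not implemented, since it is not described in this post.

# The nine unique digits from the example, concatenated into one integer.
digits = [8, 1, 3, 4, 6, 2, 7, 9, 5]
value = int("".join(str(d) for d in digits))  # 813462795

# Width of the concatenated value in plain binary: 30 bits, as stated above.
print(value.bit_length())  # 30

# Quoted cost of the transformed, padded set (0,1,2,5,6,7,3,5):
# 21 bits for the digits + 2 signal bits + ~2.33 bits for the padding zero.
signal_bits = 2
padding_bits = 2.33
total_bits = 21 + signal_bits + padding_bits
print(total_bits)                # 25.33
print(total_bits / len(digits))  # ~2.81 bits per original number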