I'm no math whiz. When reading about entropy, I understand that it has something to do with the information content of data.
High entropy means more randomness; low entropy means more structure/patterns, right?
Then I heard about Kolmogorov complexity, Shannon entropy, etc.
I also watched a video explaining entropy, but I wasn't quite sure whether I understood it: is there a mathematical way of finding the entropy of a chunk of information?
Say I have a file of 256 bytes. The data is machine opcodes, so it has some patterns; it looks random, but it isn't truly random.
Is it possible to compute an entropy value for these bytes (2048 bits), such that the data still contains all its information but the entropy value tells how much it could have been compressed?
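If I've understood the articles correctly, the formula people usually point to is Shannon's entropy of the byte-value frequencies (this is just my reading of it, so I may have it wrong):

$$H = -\sum_{i=0}^{255} p_i \log_2 p_i$$

where $p_i$ is the fraction of the bytes equal to value $i$, and $H$ comes out in bits per byte. Then $256 \cdot H / 8$ bytes would be a rough lower bound on the compressed size, ignoring any patterns that span multiple bytes.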
Is that the right formula, and is it the one called Shannon entropy? (Since I'm no math whiz, I was wondering if anyone here could give me a hint or point me in the right direction, or tell me if no such formula exists.)
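To make the question concrete, here is a minimal sketch of what I mean in Python (my own attempt, assuming byte-level frequencies are the right thing to count; the file name opcodes.bin is made up):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte, estimated from byte-value frequencies."""
    counts = Counter(data)          # how often each byte value 0..255 occurs
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical 256-byte file of opcodes.
with open("opcodes.bin", "rb") as f:
    data = f.read()

h = shannon_entropy(data)
print(f"entropy: {h:.3f} bits/byte")
print(f"rough compressed-size bound: {h * len(data) / 8:.0f} bytes")
```

If the idea is right, the result should land between 0 (all bytes identical) and 8 (uniformly random bytes), and for opcodes I'd expect something in between.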
Thanks.