
Thread: Great to be here...... a newbie.

  1. #1
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post

Great to be here...... a newbie.

    Great to be here.

Long story short...... if possible.
I became interested in data compression while trying to help a brilliant friend, Kelly D. Crawford, Ph.D., overcome his abuse of alcohol, thinking that if he took me on as a student he could hang on long enough to get help. He was a programmer for many well-known companies. We tackled data compression for 8 long years. My job was to come up with out-of-the-box ideas; he would code. I started from knowing nothing. Just as we were making a breakthrough, he passed from liver problems.

That was over a year ago. Looking over some of my notes, I saw many things that I still think are possibilities, but I am not a programmer, aside from taking BASIC back in 1980.

I came here to learn and perhaps connect with a programmer who would like to take a look at some of these out-of-the-box ideas for random data compression..... pseudo-random data, in particular.

    Jon
    Last edited by Hcodec; 13th July 2020 at 18:31. Reason: spelling

  2. #2
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
    Welcome, welcome!

    Sorry to hear your story.

    Random data compression is an everyday topic in the encode.su forum. Look under the https://encode.su/forums/19-Random-Compression subforum.
    What do you mean by pseudo random data exactly? And especially why would you like to compress such data?

    For me:
    All random data is pseudo random since all data were created by some process.

  3. #3
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
    Thanks Gotty,

It was quite the journey! Living in another country, I was only able to do so much. I am trying to figure out what to do with what I learned.

What do you mean by pseudo random data exactly? I am referring to Kolmogorov complexity, or entropy, when talking about the complexity of a bit stream. We were on a quest night and day for eight years to come up with a way to compress random data. It was quite the learning experience for me. I invented many concepts for the first time only to discover that others had made the same discoveries many years before: variable-length coding, 3D Cartesian point encryption, fractal geometry, and many other off-the-wall ideas in search of a way to compress random data, all of them invented by others years earlier.

I came up with one idea that I have never seen before or since, which showed the most promise, and that is what led me here. I came up with a way to change the number and place value of digits subjectively, in a pseudo-random permutation order, that allows for an easy inverse: a way to take a random stream of any length with high entropy and change it to very low entropy. It is a new pseudo-random generator that allows for compression. I hope I can explain it more.

Yes! I agree all data is pseudo-random unless the source is based on some generator that defies quantification, like radiation noise (hardware random number generators).
    Last edited by Hcodec; 14th July 2020 at 06:28.

  4. #4
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
    Quote Originally Posted by Hcodec View Post
I came up with a way to change the number and place value of digits subjectively, in a pseudo-random permutation order, that allows for an easy inverse: a way to take a random stream of any length with high entropy and change it to very low entropy.
Are you familiar with the pigeonhole principle and the counting argument? #1. #2

  5. #5
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
Of course, one of the first laws I studied. So I found a way to transform the elements of set A into a subset of highly compressible numbers of lower entropy, where the inverse takes fewer steps (signal bits) to reconstruct than the original size. Let's take a set of nine random numbers, unique so as not to waste time with a Huffman tree: {8,1,3,4,6,2,7,9,5}. 813462795 in binary is 30 bits; the entropy is 28.65982114 bits. After a 4-step transform the number becomes (0,0,1,2,5,6,7,3,5), or (1,2,5,6,7,3,5), but since I have not found a way to make the sets variable-length without losing integrity, I'll add padding to make the set 8 digits: (0,1,2,5,6,7,3,5), which is 21 bits, plus 2 signal bits, plus 2.33 bits for padding a 0, or 25.33 bits total, which is 2.814777778 bits per number. I would like to explain the transform, which is also a great encryption, but I should probably move this out of Off Topic. I am not a programmer; this was a simple hand-cipher compression problem.
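(A quick Python check of the bit counts quoted above; it only does the arithmetic and does not implement the 4-step transform itself.)

```python
import math

# Figures from the example above.
original    = "813462795"   # the nine unique digits
transformed = "01256735"    # the padded 8-digit result claimed above

plain_binary_bits = int(original).bit_length()        # 30 bits to write 813462795 in binary
any_9_digit_bits  = 9 * math.log2(10)                 # ~29.90 bits for an arbitrary 9-digit string
transformed_bits  = int(transformed).bit_length()     # 21 bits (the leading zero is dropped)

claimed_total = transformed_bits + 2 + 2.33           # + 2 signal bits + ~2.33 padding bits = ~25.33
print(plain_binary_bits, round(any_9_digit_bits, 2),
      round(claimed_total, 2), round(claimed_total / 9, 3))
```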

  6. #6
    Member
    Join Date
    Sep 2018
    Location
    Philippines
    Posts
    121
    Thanks
    31
    Thanked 2 Times in 2 Posts
    Quote Originally Posted by Hcodec View Post
Of course, one of the first laws I studied. So I found a way to transform the elements of set A into a subset of highly compressible numbers of lower entropy, where the inverse takes fewer steps (signal bits) to reconstruct than the original size. Let's take a set of nine random numbers, unique so as not to waste time with a Huffman tree: {8,1,3,4,6,2,7,9,5}. 813462795 in binary is 30 bits; the entropy is 28.65982114 bits. After a 4-step transform the number becomes (0,0,1,2,5,6,7,3,5), or (1,2,5,6,7,3,5), but since I have not found a way to make the sets variable-length without losing integrity, I'll add padding to make the set 8 digits: (0,1,2,5,6,7,3,5), which is 21 bits, plus 2 signal bits, plus 2.33 bits for padding a 0, or 25.33 bits total, which is 2.814777778 bits per number. I would like to explain the transform, which is also a great encryption, but I should probably move this out of Off Topic. I am not a programmer; this was a simple hand-cipher compression problem.
Is your algorithm "recursive" or "perpetual" compression, i.e., can you apply the same algorithm to the output again and again and still achieve compaction?
    Last edited by compgt; 14th July 2020 at 12:27.

  7. Thanks:

    Gotty (16th July 2020)

  8. #7
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
    @compgt

I've never tried, but I would think it would eventually hit a wall because of the signal bits that have to be stored for the decode scheme. Good question, thanks.... I'll try it out.

  9. #8
    Member
    Join Date
    Sep 2018
    Location
    Philippines
    Posts
    121
    Thanks
    31
    Thanked 2 Times in 2 Posts
    Quote Originally Posted by Hcodec View Post
    @compgt

I've never tried, but I would think it would eventually hit a wall because of the signal bits that have to be stored for the decode scheme. Good question, thanks.... I'll try it out.
    Yes, eventually your recursive function might stop compressing and start expanding.

In 8 years of thinking about data compression, I believe you should have tried coding your ideas, to see once and for all whether your algorithm compresses. I had 2 years of intensive random-data-compression thinking (2006-2007), and I still think I already solved it, but without a decoder there's no proof of that. We must not be afraid of actually coding our compression ideas, because it makes us face the truth of whether our algorithm works or not.

But for the most part, experts and academics state that random data compression is *not possible*, by the simplest of arguments, the pigeonhole principle, and other clever mathematics.
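(For reference, the pigeonhole/counting argument in a few lines of Python; the choice of n = 16 is arbitrary, just for illustration.)

```python
# For any length n there are 2**n possible n-bit inputs, but only
# 2**0 + 2**1 + ... + 2**(n-1) = 2**n - 1 strictly shorter bit strings,
# so no lossless scheme can make every n-bit input shorter.
n = 16
inputs = 2 ** n
shorter_outputs = 2 ** n - 1
print(inputs, shorter_outputs, inputs > shorter_outputs)   # always True
```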

  10. #9
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
What I meant was I never tried a recursive scheme. I've held on to this compression scheme for eight years; a few months ago I figured out what it does. If you mean a decoder as in, once compressed, can you restore the file, stream, or bits to their original size losslessly, then sure, I did that 8 years ago. It is a simple bijective inverse.

I'm more nervous about this as a stream or block cipher, even though RSA and elliptic curves have saturated the learning centers.

  11. #10
    Member
    Join Date
    Sep 2018
    Location
    Philippines
    Posts
    121
    Thanks
    31
    Thanked 2 Times in 2 Posts
    Quote Originally Posted by Hcodec View Post
What I meant was I never tried a recursive scheme. I've held on to this compression scheme for eight years; a few months ago I figured out what it does. If you mean a decoder as in, once compressed, can you restore the file, stream, or bits to their original size losslessly, then sure, I did that 8 years ago. It is a simple bijective inverse.

I'm more nervous about this as a stream or block cipher, even though RSA and elliptic curves have saturated the learning centers.
I don't know why, but people here ask for a compressor/decompressor outright. If you think it is fine to divulge your compression algorithm (which you say is a random data compressor, or a new pseudo-random number generator), I think you will find eager listeners here.

  12. #11
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
    Good, I'll move our talk over to the main forum.

  13. #12
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
This topic should be moved to "Random Compression".

  14. #13
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
    Quote Originally Posted by Hcodec View Post
Of course, one of the first laws I studied.
    OK. Good. But your example contradicts it. Let's count then.

    Quote Originally Posted by Hcodec View Post
Let's take a set of nine random numbers, unique so as not to waste time with a Huffman tree: {8,1,3,4,6,2,7,9,5}. 813462795 in binary is 30 bits; the entropy is 28.65982114 bits.
I think it's 9*log2(10) = 29.89735 bits. How did you get your result?
    If it is a random number it can be anything between "000000000" and "999999999", right? Emphasis on: "anything". How many numbers are there? 1'000'000'000. Can you compress any of them to less?

    Quote Originally Posted by Hcodec View Post
After a 4-step transform the number becomes (0,0,1,2,5,6,7,3,5), or (1,2,5,6,7,3,5)
Hohoo, wait-wait. If you have 7 digits left (which covers only one hundredth of the original 9-digit range), there are around 100 original 9-digit numbers that end up being the same 7-digit number after the transformation. So you simply cannot reverse the process and find out which was the original 9-digit number for a specific 7-digit number.
Try transforming all the 9-digit numbers, enumerate all the results, and tell us how many "(1,2,5,6,7,3,5)"s you get.
    See the problem?
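(The same counting for this specific example, as a small Python sketch; it only compares the sizes of the two spaces and does not involve the transform.)

```python
# All 9-digit inputs vs. all 7-digit outputs.
inputs  = 10 ** 9   # "000000000" .. "999999999"
outputs = 10 ** 7   # every possible 7-digit result

# Any mapping from the larger set into the smaller one must collide:
print(inputs / outputs)   # 100.0 -- about 100 distinct inputs per output,
                          # so a 7-digit result alone cannot be inverted
```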

  15. #14
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
    @Gotty

You will understand once you see the transform. I understand what your concerns are, and I am glad that you wrote them. This method has to stand up to the strictest and harshest criticism, and if it does not, then it will go the way of all snake-oil products. I certainly understand that many have come before me claiming the impossible, so I do not expect anyone to give this the time of day, and I am thankful for your comments. I reserve the right to fail at this horribly. However, this transform has many applications aside from what I am trying to accomplish with it.

The transform can take a block of length N or a bit stream of any length and, depending on the key you use, outputs pseudo-random permutations mapped directly from your bit stream or block, in base N.

    I made this as an unbreakable encryption, but realized it has other uses.

Here is a simple example, and then I will post links to a video showing how it can be used for compression.

    The key in this example is {1,0} in blue. The (1,1) is a starting seed.

    1x
    0
    1 0
    1

Step 1, above (bits in black), is how you build your permutation from your original seed {1,1}: you look at your seed and then find the corresponding position, or index, in your key. Here we are looking for a 1 from the seed; we see that there is a 1 at position 0 (index 0, i.e., the n-1 position), so we place an x in the key and a 0 beside the position of the seed.

Step 2, below. Next we are looking for the next 1 in the seed, matched to the first 1 in the key, not counting the x's that have already been marked off. We see that it is in position N-1, or position 1, so our next permutation is complete. Our first was {1,1} and our next subset is (0,1).

    1x
    0
    1x0
    1 1

Step 3, below, is to move the key for a recursive loop, but we use the seed (0,1), the subset of the seed {1,1}. Now we are matching the first 0 in black to the first position of the 0 in blue, and we see it is at the second index, or position 2, but since we are using n-1 it is position 1.

    1x1
    0 0x
    1x0 1
    1 1

In step 4, below, we are looking for the position in blue of the black 1, and we see it is at the first index, or position 1, which translates to 0. We now have our next permutation, which is a subset of our set A, which we called a seed, but from here on out it will be a bit stream or a number stream. We do the same as in the previous steps and move our key over. Please note I am using the word key in this example, but when we start using this as a compression scheme our key will change in order and length, and this will allow us to filter what we want our output to be, so we do not produce all permutations, just a select set that we can manipulate easily to compress. We now repeat the steps as before for our next iteration or permutation. Please note all permutations are built from our original seed/random stream. In this case our next permutation is

    1x1x1x
    0 0x0x
    1x0 1 1
    1 1 0 1

Sorry this took so much time and space, but I wanted to make sure you understood how to do a very, very simple permutation before moving on to compression, because compression uses this scheme.

    Here is the video link for the process above.

  16. #15
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
    Thank you for sharing your ideas, and for the video link.

    Quote Originally Posted by Hcodec View Post
Please note I am using the word key in this example, but when we start using this as a compression scheme our key will change in order and length, and this will allow us to filter what we want our output to be, so we do not produce all permutations, just a select set that we can manipulate easily to compress.
If you change the key during compression (based on the input?), how would you know during decompression which key was used for compression? Can you decompress the result?
In your earlier post you mentioned signal bits and padding. Looking at the compressed result, how do you know which is a signal bit, which is padding, and which is data?

  17. #16
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
Here is a video using the 9 numbers, where the entropy is changed by using the transform. It uses the same transform as the example above. Please note the toggle bit is needed to mark whether the stream starts with an odd or even number. The stream is also changed from high to low entropy. In the next video I will show how the decode works; it is a simple process, and perhaps you have already figured it out. If you reverse the transform of (0,0,1,2,5,6,7,3,5) you end up with the original (8,1,3,4,6,2,7,9,5). Because of the key (3,4,5,6,7,8,9,0,1,2), any number that starts even will end in a 0 in 4 or fewer steps, and any number that starts odd will end in a 1 in fewer than 4 steps.

Using another method it is possible to change the entropy even more, to the point where you can use a Huffman code.

You do not change the key (however, you can use various keys, multi-key encryption, but that is another topic); you always use the same key. The key, though, becomes part of the plaintext, or in this case part of the stream you are compressing. The new permutation is very subjective. Depending on the key, you can produce all permutations of length n, counting in base n, or filter the results so the permutations only progress through specific numbers. In this case, if the number is odd the permutation output will always start with (9,7,5,3,1); if it is even, with (8,6,4,2,0). Then you subtract a 0 or subtract a 1; inversely, you add a 1 or a 0.

    Video showing transform of number from high entropy to low.

  18. #17
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
Do you understand the decode, or do you want me to post the video? It is a simple process.

  19. #18
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
    Quote Originally Posted by Hcodec View Post
    our key will change in order and length
    Quote Originally Posted by Hcodec View Post
    You do not change the key
    Now which one?
    Please explain the key selection process. How do you choose a key?

    Quote Originally Posted by Hcodec View Post
The key, though, becomes part of the plaintext, or in this case part of the stream you are compressing.
When the key is stored together with the compressed output, then key + compressed output is larger than the input, isn't it? In your example above the input is 9 digits, the key is 10 digits, and the encoded result is 9 digits. I can't see how this is compression.

  20. #19
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
    Quote Originally Posted by Hcodec View Post
Do you understand the decode, or do you want me to post the video? It is a simple process.
I don't even understand the compressing part.
Yes, please explain how decompression works. For the decompression I expect a less than 9-digit input altogether. If the information you need for the decompression is more than 9 digits, then...

  21. #20
    Member
    Join Date
    Sep 2018
    Location
    Philippines
    Posts
    121
    Thanks
    31
    Thanked 2 Times in 2 Posts
Maybe it's just very small compression gains. He designed it initially as a cipher.

I had a few ideas before, but with only a little compression there was no need to implement a decompressor.

  22. #21
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
For instance, this random number set {2,5,7,2,6,3,3,4,1}, in six steps, using the same key as in all my examples above, maps to (0,0,0,0,0,0,0,0,0). You make a new file containing your transformed data of a certain block size that also contains 3 bits as a signal to loop the nine 0's recursively; using the key (3,4,5,6,7,8,9,0,1,2) it will restore the set (0,0,0,0,0,0,0,0,0) to (2,5,7,2,6,3,3,4,1), but you have to understand how to use this scheme inversely.

  23. #22
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
As an example: if you want to compress a list of n=5 super permutations, it might not compress very well.... here are the numbers.

    12345123415234125341235412314523142531423514231542 312453124351243152431254312 13452134251342153421354213245132415324135241325413 21453214352143251432154321

    but if you run it through the transform first you will be surprised..... here is a video showing the results.

It is very easy to reconstruct, or inverse, using the transformed file of mostly 1's and the key. The same goes for other pseudo-random files, once the correct key is found.
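(For reference, a rough order-0 entropy estimate of that digit string in Python; it looks only at raw symbol frequencies and says nothing about the transform.)

```python
import math
from collections import Counter

# Digit string copied from the post above (whitespace removed).
s = ("12345123415234125341235412314523142531423514231542"
     "312453124351243152431254312"
     "13452134251342153421354213245132415324135241325413"
     "21453214352143251432154321")

counts = Counter(s)
n = len(s)
# Order-0 (per-symbol) Shannon entropy in bits per digit.
h = -sum(c / n * math.log2(c / n) for c in counts.values())
print(n, round(h, 3), round(n * h, 1))
# With digits 1..5 roughly uniform this is close to log2(5) ~ 2.32 bits/digit,
# so an order-0 coder alone gives little gain over plain storage.
```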

  24. #23
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
    Quote Originally Posted by Hcodec View Post
    in six steps
It was four before; now it's six. It is really unclear to me what you are doing. You are losing me completely.

    Quote Originally Posted by Hcodec View Post
    once the correct key is found.
    What do you mean by that?
    I need a clear answer please: how do you select or construct the key?

    Quote Originally Posted by Hcodec View Post
    using the transformed file of mostly 1's and the key.
So you need the key to decompress, right?
I need a clear answer please: what exactly is needed for decompression, data + key? How many digits exactly are needed for decompression in your examples?

    Quote Originally Posted by Hcodec View Post
    but you have to understand how to use this scheme inversely.
    OK. So please show it.

  25. #24
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
In the next video I'll show you how to reconstruct the super permutation from the file of mostly 1's and the key. The key I'll use is (1,2,3,4,5). Stick with this; it is a method that you will not find anywhere, it is new, but it has many applications. For example, once you identify the sequences in, for instance, n=5 super permutations, you can forecast n=6,7,8,9,10... What about forecasting the Mersenne Twister or other supposedly secure RNGs, hashes, PRNGs, CSPRNGs, PRP generators?

  26. #25
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
Here is the inverse video of the compressed super permutation set of n=5. It is important to understand how this inverse works, not only because it allows a lossless restore, but because it allows you to spot sequences very easily when you use it to forecast. As an example, I was looking at a site that said they used an RNG but did not state which one or how accurate it was. They later changed it right after I ran my test. It was not because of my test that they made the change; someone must have run a NIST or other test. Taking just 30 numbers by hand, I saw a huge flaw by running them through this transform. It looked fine at a glance. I picked a key and ran their numbers recursively through the transform 5 times. I saw that on the 5th loop the transform showed a non-random output set. I inverted 5 loops and was able to predict exactly which number would occur next. That said, all I can do at the moment is treat this as a hand-cipher transform. If it could be made into a program to show hundreds of permutations or outputs, it would be very interesting. I do not know the limits of this transform. I'll show you in the next post how to spot errors in simple random sets of fewer than 20 digits that look random but become obvious when you run them through this transform.

Please do not think I am so conceited as to think this is a holy grail. I don't, but used in a developer's toolkit it could prove valuable. I think it should be researched more as a transform, to find its limits and potential. Right now the only people who know about this are the people who have read this thread.

    I'll show more in the next post if anyone is interested?
    Last edited by Hcodec; 16th July 2020 at 18:23.

  27. #26
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
    Quote Originally Posted by Hcodec View Post
    here is a video showing the results.
    It says it's a private video.

  28. #27
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
    Sorry, open to public.

  29. #28
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
From your decoding demo, it looks like you need the key for decoding. So actually you can only reconstruct the original numerical sequence by having the transformed sequence + the key (its length and content?).
I suspect that you need some auxiliary information as well, like the number of passes you applied to the original input. Am I right?

Did you try implementing your algorithm as software? You would then have a really useful way of understanding the strengths and the limitations of your transformation.
As I understood from your posts, you tried to apply a manually selected key to a manually selected numerical sequence (right?), and you "have the feeling" that it would work with any input, and so this transformation would always be able to produce a highly compressible sequence. From my experience (and compgt may also agree), random data is really "thick": if you find a way to transform one input into a promisingly compressible sequence with one method, then other sequences will fail miserably.
Please try implementing your method and run it with all numerical sequences of length 9, for example. Then you will see that some numerical sequences confirm your feeling, but many of them will not become a nicely compressible sequence. Please do try.
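(A minimal sketch of such an exhaustive test in Python; encode/decode are placeholders, here just the identity, to be replaced by the actual transform, and length 4 is used instead of 9 to keep the run small.)

```python
from itertools import product

def encode(seq):
    # Placeholder: substitute the real transform here.
    return seq

def decode(code):
    # Placeholder: substitute the real inverse here.
    return code

LENGTH = 4                      # use 9 for the full test; 4 keeps the run fast
shorter, total = 0, 0
for seq in product("0123456789", repeat=LENGTH):
    code = encode(seq)
    assert decode(code) == seq, "transform is not lossless for %s" % (seq,)
    total += 1
    if len(code) < len(seq):    # count outputs that actually became shorter
        shorter += 1

print(shorter, "of", total, "inputs became shorter")
# The pigeonhole principle guarantees this can never be all of them.
```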

  30. #29
    Member
    Join Date
    Jul 2020
    Location
    Guatemala
    Posts
    26
    Thanks
    2
    Thanked 1 Time in 1 Post
Like I said, I have been working with this for 8 years. Here is a video that proves it will produce all permutations, not just some. By filtering your outputs so they do not include all permutations, a simple signal bit will cost fewer bits than you gain from compression, making sure your compressed file is always smaller.

  31. #30
    Member Gotty's Avatar
    Join Date
    Oct 2017
    Location
    Switzerland
    Posts
    554
    Thanks
    356
    Thanked 356 Times in 193 Posts
A simple signal bit would halve the number of possible permutations, right?
