
Thread: Random neural network weights for paq8

  1. #1
    Member
    Join Date
    Mar 2011
    Location
    USA
    Posts
    226
    Thanks
    108
    Thanked 106 Times in 65 Posts

    Random neural network weights for paq8

    Hi,

    I made a small modification to paq8l which seems to slightly improve its compression rate. When constructing the mixer I initialized the neural network weights to pseudorandom values (with a constant seed) instead of a fixed value.

    Empirical testing seems to indicate that using random weights consistently outperforms paq8l. However, the improvement is very small (usually around 0.001 bits per byte of the original file). Since the change has no significant computational cost, I thought it would be worth posting about in this forum. Newer versions of paq8 based on paq8l might benefit from this change.
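A minimal sketch of this kind of change (the function and PRNG here are illustrative, not paq8l's actual code): the mixer weights are drawn from a deterministic PRNG with a constant seed instead of all being set to one value. The seed must be a compile-time constant so the decoder's model reproduces the exact same weights.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical sketch: fill n mixer weights with small pseudorandom
// values from a fixed seed. Any deterministic PRNG works; xorshift32
// is used here only because it is tiny and self-contained.
std::vector<int> init_random_weights(int n, uint32_t seed = 12345) {
    std::vector<int> w(n);
    uint32_t state = seed;  // constant seed keeps compression reproducible
    for (int i = 0; i < n; ++i) {
        state ^= state << 13;  // xorshift32 step
        state ^= state >> 17;
        state ^= state << 5;
        // map to a small range around zero; the range is a guess,
        // the right magnitude depends on the mixer's fixed-point scale
        w[i] = static_cast<int>(state % 201) - 100;  // in [-100, 100]
    }
    return w;
}
```

The same seed always yields the same weights, so the only behavioral change versus stock paq8l is the initial weight values themselves.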

    Here are some results on the Calgary corpus files (showing cross entropy rate):
    Code:
    File    paq8l           random weights  difference
    bib     1.4969681203    1.4956374574    0.001330663
    book1   2.0057252441    2.0055120836    0.0002131605
    book2   1.5953075026    1.5950521764    0.0002553262
    geo     3.4345631905    3.4326454519    0.0019177386
    news    1.9057344947    1.9053439502    0.0003905445
    obj1    2.7735791883    2.7624077079    0.0111714805
    obj2    1.4549899071    1.4544100564    0.0005798507
    paper1  1.9654265344    1.9622643897    0.0031621447
    paper2  1.9904600822    1.9884160096    0.0020440726
    pic     0.3508760218    0.3506340518    0.00024197
    progc   1.9157351214    1.9121361326    0.0035989887
    progl   1.1831317743    1.1815620494    0.0015697249
    progp   1.1507982053    1.1487394746    0.0020587307
    trans   0.9916905984    0.9899215665    0.0017690319

  2. #2
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 779 Times in 486 Posts
    That's strange behavior for a neural network with no hidden layer. What about using the same initial value for all the weights, but larger or smaller than the value currently used?

  3. #3
    Member
    Join Date
    Mar 2011
    Location
    USA
    Posts
    226
    Thanks
    108
    Thanked 106 Times in 65 Posts
    You are right: tuning the constant weight value can outperform the random weights. I tried several different constants, and the best one I found was 50. Here are the results:

    Code:
    File    paq8l           random weights  weights set to 50
    bib     1.4969681203    1.4956374574    1.4955052293
    book1   2.0057252441    2.0055120836    2.0054855675
    book2   1.5953075026    1.5950521764    1.5950143703
    geo     3.4345631905    3.4326454519    3.4326780358
    news    1.9057344947    1.9053439502    1.9052838422
    obj1    2.7735791883    2.7624077079    2.7621463411
    obj2    1.4549899071    1.4544100564    1.454136402
    paper1  1.9654265344    1.9622643897    1.9619003796
    paper2  1.9904600822    1.9884160096    1.9881997083
    pic     0.3508760218    0.3506340518    0.350658461
    progc   1.9157351214    1.9121361326    1.9115109309
    progl   1.1831317743    1.1815620494    1.1811791383
    progp   1.1507982053    1.1487394746    1.14812396
    trans   0.9916905984    0.9899215665    0.9897331132
    average 1.7296418561    1.7274773256    1.7272539628
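The winning variant above is even simpler to sketch (again with an illustrative function name; 50 is in whatever fixed-point units the mixer uses internally, and the best constant likely depends on how the mixer inputs are scaled):

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch: initialize all n mixer weights to one tuned
// constant instead of the value stock paq8l uses. The constant 50
// is the empirically best value reported in this thread.
std::vector<int> init_constant_weights(int n, int value = 50) {
    return std::vector<int>(n, value);
}
```

Since every weight is identical, this keeps the mixer fully deterministic with no PRNG at all, which may explain why a well-tuned constant can match or beat the random initialization.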

