
Thread: Channel Capacity

#1: Member (Sydney)

    Hello,
Is there any way to calculate the channel capacity from a given transition probability matrix, as follows:

#2: Matt Mahoney (Expert, Melbourne, Florida, USA)

    Sorry, not enough information.

#3: Member (Sydney)

Hello Matt,
What other information do you need?

#4: Matt Mahoney (Expert, Melbourne, Florida, USA)

Channel capacity depends on bandwidth and signal-to-noise ratio: http://en.wikipedia.org/wiki/Channel_capacity

I assume the matrix describes error probabilities with a ternary input and binary output? The input entropy in that case is at most log(3)/log(2) bits per symbol. The question is then how much of this information you can recover from the binary output stream.
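
For reference, that bound is just the entropy of a uniform 3-symbol alphabet: log(3)/log(2) = log2(3) ~ 1.585 bits per symbol, so even a noiseless ternary-input channel carries at most about 1.585 bits per use.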

#5: Cyan (Member, France)

It seems to be a 3x3 matrix with the diagonal removed.

In that case, it's a ternary input/output channel, and it seems there is enough information to work out the capacity with a direct O(1) calculation.

It looks a lot like homework to me...

#6: Member (Sydney)

Hello guys,
Thanks for your help. I think this example might make it clearer for you. Could you please help me understand how to do this? What are r and Log Y? What are X and Y in this case? And how do I find P(x) and P(y)?

Thanks

#7: Matt Mahoney (Expert, Melbourne, Florida, USA)

Quote Originally Posted by Cyan:
    It seems to be a 3x3 matrix, with the diagonal removed.
I don't think so, because the columns each add to 1. I think it is a transition matrix with a 3-symbol input alphabet and a 2-symbol output alphabet. If the input symbols are (A, B, C) and the output symbols are (0, 1), then it says, for example, that if the input is A, the output is 0 with probability 0.7, and so on.

In any case, it's not a symmetric channel. The question is probably how much information you can transmit per symbol. You probably only want to use A and B to mean 0 and 1, because these have lower error probabilities than C, and probably B more often than A for the same reason. So you would choose p(X = A) a little less than 0.5. To get the optimal value, I suppose you would take the derivative of I(X; Y) = H(Y) - H(Y|X) with respect to p(X = A), set it to 0 to find the maximum, solve for p(X = A), and then plug that value back into I(X; Y). (A numerical sketch of this follows below.)

To answer your other questions: X and Y are the channel input and output; P(x) means the probability that X = x, and P(y) the probability that Y = y. "Log Y" doesn't make any sense to me unless it means log(|Y|) = log(2) = 1 (using log base 2), or log(3) in the symmetric example. H(r) is the entropy of one row, which is the same for all rows only if the channel is symmetric.
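
A minimal Python sketch of that maximization, assuming the transition probabilities quoted later in this thread (P(Y=0|X) = 0.7, 0.2, 0.4 for inputs A, B, C); it grid-searches the input simplex instead of solving dI/dp = 0 analytically:

Code:
    import numpy as np

    # Transition matrix P(Y=y | X=x): rows are inputs A, B, C; columns are
    # outputs 0 and 1. These entries are the probabilities quoted in this
    # thread; substitute your own matrix if it differs.
    P = np.array([[0.7, 0.3],
                  [0.2, 0.8],
                  [0.4, 0.6]])

    def entropy(p):
        # Shannon entropy in bits, skipping zero entries.
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def mutual_information(px):
        # I(X; Y) = H(Y) - H(Y|X) for input distribution px over (A, B, C).
        py = px @ P                      # output distribution
        h_y_given_x = sum(px[i] * entropy(P[i]) for i in range(3))
        return entropy(py) - h_y_given_x

    # Grid search over the probability simplex; a 0.005 step is fine here.
    best_px, best_i = None, -1.0
    step = 0.005
    for a in np.arange(0.0, 1.0 + step, step):
        for b in np.arange(0.0, 1.0 - a + step, step):
            px = np.array([a, b, max(0.0, 1.0 - a - b)])
            i = mutual_information(px)
            if i > best_i:
                best_px, best_i = px, i

    print("capacity-achieving P(X) ~", best_px)
    print("capacity ~ %.4f bits per symbol" % best_i)

On these numbers the search puts essentially no weight on C and slightly less than half on A, matching the intuition above.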

#8: Member (Sydney)

Hi Matt, thanks a lot, and sorry for the delay. Yes, this is not a symmetric channel; I guess it is called a binary erasure channel (BEC). But how exactly can we calculate I(X; Y) for this channel?

#9: Matt Mahoney (Expert, Melbourne, Florida, USA)

I(X; Y) = H(Y) - H(Y|X). H(Y) is the entropy of the channel output assuming you don't know the channel input; H(Y|X) is the entropy assuming you do. For an error-free channel, H(Y|X) would be 0, because if you know the input then you know the output.

Suppose the input X has the uniform distribution (1/3, 1/3, 1/3). Then the probability of Y = 0 is (0.7 + 0.2 + 0.4)/3 = 1.3/3 = 0.433, so H(Y) = -0.433 log2 0.433 - 0.567 log2 0.567 = 0.987.

To calculate H(Y|X), calculate the entropy of each of the 3 columns and sum them weighted by their probabilities (1/3 each). For example, the entropy of the first column is -0.7 log2 0.7 - 0.3 log2 0.3.

You can do better if X has a non-uniform distribution, as I mentioned earlier. To find it, take the derivatives of I(X; Y) with respect to P(X = A) and P(X = B), set them to 0, and solve to find the maximum.
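
A small Python check of these numbers, using the same 0.7/0.2/0.4 column values:

Code:
    from math import log2

    def H(*p):
        # Entropy in bits of a distribution given as its probabilities.
        return -sum(x * log2(x) for x in p if x > 0)

    py0 = (0.7 + 0.2 + 0.4) / 3                  # P(Y = 0) under uniform input
    H_Y = H(py0, 1 - py0)                        # ~ 0.987
    H_Y_given_X = (H(0.7, 0.3) + H(0.2, 0.8) + H(0.4, 0.6)) / 3  # ~ 0.858
    print(H_Y - H_Y_given_X)                     # I(X; Y) ~ 0.129 bits/symbol

The last value agrees with the 0.1291 bits reported at the end of the thread.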

#10: Member (Sydney)

Hi Matt, thanks a lot for your response. This channel is known as a binary erasure channel (BEC). Is the calculation the same as for a BSC?
Thanks

#11: Matt Mahoney (Expert, Melbourne, Florida, USA)

A binary symmetric channel is easier to calculate. If the error probability is p, then H(Y) = 1 and H(Y|X) = H(p) = -p log p - (1-p) log (1-p), so the capacity is I(X; Y) = 1 - H(p).
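
For a concrete number (p = 0.1 here is an assumed example value, not one from the thread):

Code:
    from math import log2

    p = 0.1                                      # example crossover probability
    Hp = -p * log2(p) - (1 - p) * log2(1 - p)    # H(p) ~ 0.469
    print(1 - Hp)                                # capacity ~ 0.531 bits/symbol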

#12: Member (Sydney)

Hi Matt, thanks again. I have written a small program to calculate this, and here is the result. Could you please tell me whether it is right or not?
Thanks
P(x) = (1/3, 1/3, 1/3)
sum of P(x) = 1
sum of P(y) ~ 1
H(X,Y) = 2.4430 bits
then I(X;Y) ~ 0.1291 bits

