## Channel Capacity

1. Hello,
Is there any way to calculate the channel capacity given a transition probability matrix as follows:

2. Sorry, not enough information.

3. Hello Matt,
What other information do you need?

4. Channel capacity depends on bandwidth and signal to noise ratio. http://en.wikipedia.org/wiki/Channel_capacity

I assume the matrix describes error probabilities with a ternary input and binary output? The input entropy in that case is at most log(3)/log(2) ≈ 1.585 bits per symbol. The question is then how much of this information you can recover from the binary output stream.

5. It seems to be a 3x3 matrix, with the diagonal removed.

In this case, it's a ternary input/output.
And there seems to be enough information to estimate the achievable rate, assuming a direct O(1) compression.

It looks a lot like homework to me...

6. Hello guys,
Thanks for your help. I think this example might make it clearer for you. Could you please help me understand how to do this? What are r and log Y? What are X and Y in this case? How do I find P(x) and P(y)? Thanks

7. Originally Posted by Cyan:
"It seems to be a 3x3 matrix, with the diagonal removed."
I don't think so because the columns each add to 1. I think it is a transition matrix with 3 symbol alphabet for input and 2 symbol alphabet for output. If your input symbols are (A,B,C) and the output symbols are (0,1), then it says, e.g. if the input is A then the output is 0 with probability 0.7 and so on.

In any case, it's not a symmetric channel. The question is probably how much information can you transmit for each symbol. You probably only want to use A and B to mean 0 and 1 because these have lower error probabilities than C, and probably B more often than A for the same reason. So you would choose p(X = A) a little less than 0.5. To get the optimal value, I suppose you would calculate the derivative of I(X; Y) = H(Y) - H(Y|X) with respect to p(X = A) and set it to 0 to find the maximum and solve for p(X = A). Then you plug that value back into I(X; Y).

To answer your other questions, X and Y are the channel input and output (random variables), and P(x) means the probability that X = x. Log Y doesn't make any sense to me unless it means log(|Y|) = log(2) = 1 (using log base 2), or log(3) in the symmetric example. H(r) is the entropy of one row, which is the same for all rows only if the channel is symmetric.

8. Hi Matt, thanks a lot, and sorry for the delay. Yes, this is not a symmetric channel. I guess this is called a binary erasure channel (BEC). But how exactly can we calculate I(X; Y) for this channel?

9. I(X; Y) = H(Y) - H(Y|X). H(Y) is the entropy of the channel output assuming you don't know the channel input. H(Y|X) is the entropy assuming you do know. For an error free channel, this would be 0 because if you know the input, then you know the output.

Suppose the input X has the uniform distribution (1/3, 1/3, 1/3). Then the output probability of Y = 0 is (0.7 + 0.2 + 0.4)/3 = 1.3/3 ≈ 0.433. Then H(Y) = -0.433 log2 0.433 - 0.567 log2 0.567 ≈ 0.987.

To calculate H(Y|X), calculate the entropy of each of the 3 columns and sum them weighted by their probabilities (1/3 each). For example, the entropy of the first column is -0.7 log2 0.7 - 0.3 log2 0.3 ≈ 0.881.
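The two steps above can be sketched in a few lines of Python. The matrix itself never made it into the thread, so the values below are an assumption reconstructed from the numbers quoted in this post (P(Y=0) given inputs A, B, C of 0.7, 0.2, 0.4), written with inputs as rows:

```python
import math

# Assumed transition matrix, reconstructed from the numbers in this thread:
# rows = inputs A, B, C; columns = outputs 0, 1. Each row sums to 1.
Q = [[0.7, 0.3],
     [0.2, 0.8],
     [0.4, 0.6]]

def entropy(dist):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def mutual_information(px, Q):
    """I(X; Y) = H(Y) - H(Y|X) for input distribution px over channel Q."""
    ny = len(Q[0])
    # output distribution P(Y = y) = sum_x P(X = x) P(Y = y | X = x)
    py = [sum(px[x] * Q[x][y] for x in range(len(Q))) for y in range(ny)]
    h_y = entropy(py)
    # H(Y|X): entropy of each row of Q, weighted by the input probabilities
    h_y_given_x = sum(px[x] * entropy(Q[x]) for x in range(len(Q)))
    return h_y - h_y_given_x

px_uniform = [1/3, 1/3, 1/3]
print(mutual_information(px_uniform, Q))  # ≈ 0.1291 bits
```

With the uniform input this gives I(X; Y) ≈ 0.129 bits per symbol, which matches the result reported later in the thread.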

You can do better if X has a non-uniform distribution, as I mentioned earlier. To find it, take the derivatives of I(X; Y) with respect to P(X = A) and P(X = B) (with P(X = C) = 1 - P(X = A) - P(X = B)), set them to 0, and solve for the maximum.
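Instead of solving the derivative conditions by hand, a standard numerical route for this maximization is the Blahut-Arimoto algorithm, whose multiplicative update converges to the capacity-achieving input distribution for any discrete memoryless channel. A minimal sketch, using the same assumed matrix as above:

```python
import math

def blahut_arimoto(Q, iters=2000):
    """Maximize I(X;Y) over the input distribution for channel Q
    (rows = inputs, columns = outputs) via Blahut-Arimoto updates.
    Returns (optimal input distribution, capacity in bits)."""
    n, m = len(Q), len(Q[0])
    p = [1.0 / n] * n
    for _ in range(iters):
        # output marginal q(y) under the current input distribution
        q = [sum(p[x] * Q[x][y] for x in range(n)) for y in range(m)]
        # c[x] = exp(D(Q[x] || q)), the per-input divergence factor
        c = [math.exp(sum(Q[x][y] * math.log(Q[x][y] / q[y])
                          for y in range(m) if Q[x][y] > 0))
             for x in range(n)]
        z = sum(p[x] * c[x] for x in range(n))
        p = [p[x] * c[x] / z for x in range(n)]
    return p, math.log2(z)  # at convergence, log2(z) is the capacity

Q = [[0.7, 0.3], [0.2, 0.8], [0.4, 0.6]]  # assumed matrix, as above
p_opt, capacity = blahut_arimoto(Q)
print(p_opt, capacity)
```

On this matrix the optimum puts essentially zero weight on the noisiest input C and splits the rest almost evenly between A and B, giving a capacity of about 0.19 bits per symbol — noticeably better than the ≈ 0.129 bits of the uniform input.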

10. Hi Matt, thanks a lot for your response. This channel is known as a binary erasure channel. Is it calculated the same way as for a BSC?
Thanks

11. A binary symmetric channel is easier to calculate. If the error probability is p, then H(Y) = 1 (with a uniform input) and H(Y|X) = H(p) = -p log p - (1-p) log (1-p), so the capacity is C = 1 - H(p).
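That closed form is two lines of code (the function names here are just illustrative):

```python
import math

def binary_entropy(p):
    """H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0: noiseless, one full bit per use
print(bsc_capacity(0.5))  # 0.0: output is independent of the input
```

Note the capacity is symmetric in p: a channel that flips every bit (p = 1) is as good as a perfect one, since you can just invert the output.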

12. Hi Matt, thanks again. I have written a small program to calculate that, and here is the result. Could you please tell me whether it is right or not?
Thanks
P(x) = (1/3, 1/3, 1/3)
then P(X) = 1
P(y) ≈ 1
H(X|Y) = 2.4430 bit
then I(x;y) ≈ 0.1291 bit
