Given a random variable $X$, consider $n$ samples $X_1, X_2, \ldots, X_n$ and the sample mean, written as a weighted sum
$$\bar{X} = \sum_{i=1}^{n} w_i X_i .$$
The weights can be described as $w_i = \frac{1}{n}$ for every $i$.
The samples $X_1, \ldots, X_n$ are independent, $E[X_i] = \mu$, and $\mathrm{Var}(X_i) = \sigma^2$.
Thus $E[\bar{X}] = \mu$ and $\mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n}$.
If $n \to \infty$, then $\mathrm{Var}(\bar{X}) \to 0$, and $\bar{X} \to \mu$.
We want to prove the following inequality (Chebyshev's inequality):
$$P\big(|X - \mu| \ge \varepsilon\big) \le \frac{\sigma^2}{\varepsilon^2} .$$
Let's start with the definition of variance for a continuous random variable:
$$\sigma^2 = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\, dx .$$
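If the inequality in question is Chebyshev's bound $P(|X - \mu| \ge \varepsilon) \le \sigma^2/\varepsilon^2$, a quick Monte Carlo sanity check can be sketched in Python (not part of the original notes; the uniform distribution and $\varepsilon = 0.4$ are illustrative choices):

```python
import random

# Monte Carlo check of Chebyshev's inequality for X ~ Uniform(0, 1):
# here mu = 1/2 and sigma^2 = 1/12, so P(|X - mu| >= eps) <= sigma^2 / eps^2.
random.seed(0)
mu, var = 0.5, 1.0 / 12.0
eps = 0.4
trials = 100_000
hits = sum(abs(random.random() - mu) >= eps for _ in range(trials))
empirical = hits / trials          # estimate of P(|X - mu| >= eps)
bound = var / eps**2               # Chebyshev's upper bound
print(empirical, bound)
```

The true tail probability here is $0.2$, comfortably below the bound of about $0.52$; Chebyshev is valid for any distribution, so the bound is often loose.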
Consider the previous example:
Example: Simple coin toss
Examining the expected values, we get $E[X_i] = p$ for each toss, and therefore $E[\bar{X}] = p$.
MATLAB code for this experiment is

clear
p = rand(1);
N = 1000;
S = zeros(1, N + 1);
X_bar = zeros(1, N + 1);
for n = 1:N
    X_n = rand(1) < p;           % one coin toss: 1 with probability p
    S(n + 1) = S(n) + X_n;       % running sum of tosses
    X_bar(n + 1) = S(n + 1) / n; % running average after n tosses
end
X_bar will converge to the value p, since p is a random number drawn once at the start of the script and then held fixed throughout the experiment.
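The same experiment can be sketched in Python (an illustrative port with the same structure as the MATLAB loop, not part of the original notes):

```python
import random

# Law of large numbers: the running average of Bernoulli(p) coin tosses
# converges to the (randomly drawn) bias p as the number of tosses grows.
random.seed(1)
p = random.random()            # the coin's bias, drawn once at the start
N = 100_000
s = 0
for n in range(1, N + 1):
    s += random.random() < p   # one toss: 1 with probability p
x_bar = s / N                  # sample mean after N tosses
print(p, x_bar)
```

After 100,000 tosses the sample mean typically lands within a few thousandths of p.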
For a majority decoding algorithm, if the majority of the $n$ transmitted identical digits are received correctly, then the received digit is considered correctly decoded. Let $N$ be the number of errors in the transmission of the $n$ transmitted identical digits, and let $p$ be the probability that each of the $n$ bits can be decoded correctly on its own. Assume that the errors in each of the $n$ positions are independent of each other.
So if the transmitted bits are 000, then for the received bits to be 010, the probability is
$$p \cdot (1 - p) \cdot p = p^2 (1 - p),$$
since the first and third bits are received correctly and the second bit is flipped.
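This per-pattern calculation can be checked by enumerating all eight possible received patterns (a Python sketch; the value of $p$ below is illustrative, not from the notes):

```python
from itertools import product

# Transmitted pattern is 000; each bit is received correctly with
# probability p. A received '0' is a correct bit, a '1' is a flipped bit.
p = 0.9   # illustrative per-bit probability (not a value from the notes)

def pattern_prob(received, p):
    correct = received.count("0")
    flipped = received.count("1")
    return p**correct * (1 - p)**flipped

print(pattern_prob("010", p))   # p^2 * (1 - p)
# Sanity check: probabilities over all eight received patterns sum to 1.
total = sum(pattern_prob("".join(b), p) for b in product("01", repeat=3))
print(total)
```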
(a) If $n = 5$ identical bits are transmitted, what is the probability that one transmitted bit using the majority decoding algorithm is decoded correctly?
There are $n = 5$ bits in total, so if at least 3 of them are changed, the bit cannot be decoded correctly.
Then the probability that exactly $k$ of the 5 bits are changed is
$$P(N = k) = \binom{5}{k} (1 - p)^k \, p^{5 - k},$$
which is the binomial random variable formula / distribution.
So the probability of decoding correctly (at most 2 errors, i.e. at least 3 bits unchanged) is
$$P(N \le 2) = \sum_{k=0}^{2} \binom{5}{k} (1 - p)^k \, p^{5 - k}.$$
Plugging in the given value of $p$ yields the numerical answer.
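The sum is easy to evaluate with a short Python helper (equivalently summing over the number of correctly received bits; $p = 0.8$ is an assumed illustrative value, not necessarily the one given in the problem):

```python
from math import comb

def p_majority(n, p):
    # Probability that a strict majority of n identical bits (n odd)
    # are received correctly, i.e. the transmitted bit decodes correctly.
    need = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need, n + 1))

# Example with n = 5 and an assumed p = 0.8 (illustrative value only):
print(p_majority(5, 0.8))   # about 0.942
```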
(b) If we only want to use 3 identical bits in the majority decoding algorithm, what is the minimum per-bit probability $p$ required to have better performance compared to (a)?
(c) If we use 7 identical bits, repeat (b).
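Parts (b) and (c) can be attacked numerically by comparing majority-decoding success probabilities across repetition counts. A Python sketch (the part-(a) value p_a = 0.8 is a hypothetical stand-in, since the original value is not shown here):

```python
from math import comb

def p_majority(n, p):
    # probability that a strict majority of the n identical bits arrive correctly
    need = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need, n + 1))

# Part (b), numerically: smallest per-bit probability q (on a 1e-4 grid)
# for which 3-bit majority decoding matches or beats the 5-bit scheme.
# p_a = 0.8 is an assumed example value, not the value from the notes.
p_a = 0.8
target = p_majority(5, p_a)
q_min = next(i / 10_000 for i in range(10_001) if p_majority(3, i / 10_000) >= target)
print(q_min)
# For part (c), repeat the search with n = 7; the threshold comes out
# lower, since more repetitions help at any fixed q above 1/2.
```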