ELEC 321

Tutorial 3

Updated 2017-09-29

Expected Value and Variance

Given a random variable $X$ taking values $x_1, x_2, \dots, x_n$, its expected value $E[X]$ is a weighted average of those values, and its variance measures the spread around $E[X]$.

The weights can be described as

$p_i = P(X = x_i), \quad i = 1, \dots, n,$

so

$\sum_{i=1}^{n} p_i = 1$

and

$E[X] = \sum_{i=1}^{n} x_i p_i, \qquad \mathrm{Var}(X) = E\big[(X - E[X])^2\big] = E[X^2] - (E[X])^2.$
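These weighted sums translate directly into MATLAB; the values and weights below are made up purely for illustration:

x = [1 2 3 4];           % possible values x_i (illustrative)
p = [0.1 0.2 0.3 0.4];   % weights p_i = P(X = x_i); they sum to 1

EX   = sum(x .* p);      % E[X] as a weighted average
EX2  = sum(x.^2 .* p);   % E[X^2]
VarX = EX2 - EX^2;       % Var(X) = E[X^2] - (E[X])^2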

Consider $X_1, X_2, \dots, X_n$, which

are independent, each with $E[X_i] = \mu$ and $\mathrm{Var}(X_i) = \sigma^2$, and define the sample mean

$\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i.$

Thus $E[\bar{X}_n] = \mu$ and $\mathrm{Var}(\bar{X}_n) = \sigma^2 / n$.

If $n \to \infty$, then $\mathrm{Var}(\bar{X}_n) \to 0$, and $\bar{X}_n$ concentrates around $\mu$.
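A short simulation makes the $\sigma^2 / n$ scaling visible; the parameter values here are arbitrary choices for illustration:

mu = 2; sigma = 3;                   % illustrative mean and standard deviation
n = 50; trials = 10000;              % n samples per mean, repeated many times
X = mu + sigma * randn(trials, n);   % each row: n independent N(mu, sigma^2) draws
X_bar = mean(X, 2);                  % one sample mean per row
var(X_bar)                           % close to sigma^2 / n = 0.18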

Chebyshev Inequality

We want to prove the following inequality: for any $\epsilon > 0$,

$P(|X - \mu| \geq \epsilon) \leq \frac{\sigma^2}{\epsilon^2}.$

Proof:

Let's start with the definition of variance for a continuous random variable with density $f$:

$\sigma^2 = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx \geq \int_{|x - \mu| \geq \epsilon} (x - \mu)^2 f(x)\,dx \geq \epsilon^2 \int_{|x - \mu| \geq \epsilon} f(x)\,dx = \epsilon^2\, P(|X - \mu| \geq \epsilon).$

The first inequality holds because the integrand is non-negative, and the second because $(x - \mu)^2 \geq \epsilon^2$ on the region of integration. Dividing both sides by $\epsilon^2$ gives the result.

Consider the previous example: applying the inequality to the sample mean $\bar{X}_n$ gives

$P(|\bar{X}_n - \mu| \geq \epsilon) \leq \frac{\sigma^2}{n \epsilon^2} \to 0 \quad \text{as } n \to \infty,$

so the sample mean converges in probability to $\mu$.
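The bound can also be checked numerically. The sketch below uses arbitrary illustrative parameters and compares the empirical tail probability of the sample mean with the Chebyshev bound:

mu = 0; sigma = 1; n = 100; epsilon = 0.2; trials = 10000;
X_bar = mean(mu + sigma * randn(trials, n), 2);   % 10000 sample means of n draws each
empirical = mean(abs(X_bar - mu) >= epsilon)      % empirical tail probability (about 0.05)
bound = sigma^2 / (n * epsilon^2)                 % Chebyshev bound: 0.25, much looser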

Example: Simple coin toss

Let $X_n = 1$ if the $n$-th toss lands heads (with probability $p$) and $X_n = 0$ otherwise. Examining the expected values we get

$E[X_n] = p, \qquad \mathrm{Var}(X_n) = p(1 - p).$

So the running average $\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i$ should converge to $p$.

The MATLAB code for this experiment is

clear
p = rand(1);                   % unknown "true" probability of heads

N = 1000;
S = zeros(1, N+1);             % running sum of outcomes, S(1) = 0
X_bar = zeros(1, N+1);         % running sample mean
for n = 1:N
	X_n = rand(1) < p;             % Bernoulli(p) coin toss
	S(n + 1) = S(n) + X_n;         % update the running sum
	X_bar(n + 1) = S(n + 1) / n;   % sample mean after n tosses
end

Over time, X_bar converges to p, as the law of large numbers predicts; p itself is drawn at random at the start, but it stays fixed for the entire experiment.
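To visualize the convergence, one can plot the running mean together with a reference line at p (this plotting snippet is an addition, not part of the original listing):

plot(0:N, X_bar)                 % running sample mean after each toss
hold on
plot([0 N], [p p], '--')         % reference line at the true probability p
hold off
xlabel('n'); ylabel('running mean X\_bar')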

Problem Set A

A.12

For a majority decoding algorithm, if a majority of the $n$ transmitted identical digits are received correctly, then the received digit is considered correctly decoded. Let $X$ be the number of errors in the transmission of the $n$ identical digits, and let $p$ be the probability that each of the $n$ bits is received correctly on its own. Assume that the errors in the $n$ positions are independent of each other.

So if the transmitted bits are 000, the probability that the received bits are 010 is

$p \cdot (1 - p) \cdot p = p^2 (1 - p).$

(a) If $n = 5$, what is the probability that one transmitted bit is decoded correctly using the majority decoding algorithm?

There are 5 bits in total, so if at least 3 of them are changed, the bit cannot be decoded correctly.

Then we can say the probability that exactly $k$ of the 5 bits are changed follows the binomial distribution:

$P(X = k) = \binom{5}{k} (1 - p)^k \, p^{5 - k}, \qquad k = 0, 1, \dots, 5.$

So the probability of decoding correctly (at most 2 errors, i.e., at least 3 bits unchanged) is

$P(X \leq 2) = \sum_{k=0}^{2} \binom{5}{k} (1 - p)^k \, p^{5 - k}.$

Plugging in the given value of $p$ yields the numerical answer.
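This sum is easy to evaluate in MATLAB for any given $p$; the value $p = 0.9$ below is only an illustrative placeholder, not the one from the problem sheet:

p = 0.9;    % per-bit probability of correct reception (illustrative)
n = 5;      % number of identical bits transmitted
k = 0:2;    % up to 2 errors still decode correctly by majority
P_correct = sum(arrayfun(@(j) nchoosek(n, j), k) .* (1-p).^k .* p.^(n-k))

With the Statistics Toolbox available, binocdf(2, n, 1-p) gives the same result.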

(b) If we only want to use 3 identical bits in the majority decoding algorithm, what is the minimum $p$ required to have a better performance compared to (a)?

(c) If we use 7 identical bits, repeat (b).