# Expectation

The expectation is an important concept in probability theory. Let's motivate it through an example from gambling, where we are interested in the question: "**Does gambling pay off?**"

In casinos we have roulette: a wheel with numbered pockets spins with a ball, and when the wheel stops, the ball lands on one of the numbers.

The wheel has 38 numbers: 0, 00, and 1 through 36.

One type of bet a person can place is on an exact number, and the standard payoff for such a bet is 35:1. That means if someone bets Rs. 1 on a number and the ball lands on that number, the person gets Rs. 35.

The question of interest is: "Does gambling pay off?"

That is, if you were to play roulette a large number of times, would you eventually be winning money or losing money?

Almost everyone has heard that the odds are in favor of the house (the casino), meaning the casino wins in the long run. Let's try to understand the basis of this statement.

To answer this question, we are interested in the profit in the long run, and each game can be thought of as a Bernoulli trial: every time I play, there are two outcomes, success or failure, i.e. either the ball lands on the number I have chosen or it does not. If I succeed, my profit is Rs. 34, because I gave Rs. 1 and got back Rs. 35; if I don't succeed, my loss is Rs. 1.

Let's denote by **X** the random variable that gives the profit at my end (not the casino's end). The sample space is mapped to one of two possible values: **-1** or **34**.

The question is: “If I play this game 1000 times, then how much can I expect to win on average?”

The probability of the random variable taking the value 34 is '**1/38**', since there are 38 possible outcomes and we are assuming a fair roulette wheel, so the ball is equally likely to land on each number. If that happens, my profit is Rs. 34. The chance of losing Rs. 1 can then be easily computed by subtracting the probability of success from 1, giving **pₓ(-1) = 37/38**.

Now, by the frequency-based definition of probability, the fraction of games won approaches the probability of winning:

pₓ(34) ≈ (# wins) / (# games)

But we already know the probability of winning, i.e. **pₓ(34) = 1/38**, and we have the '# games' as 1000, so:

# wins ≈ (1/38) × 1000 ≈ 26

So, we can say that out of these 1000 games we are going to win about 26 times, which means in 26 games we win Rs. 34 and in the remaining 974 games we lose Rs. 1. The total gain is therefore 26 × 34 − 974 × 1 = 884 − 974 = **−90**, i.e. an average of about −Rs. 0.09 per game.

So, in the long run, the profit (from the customer's perspective) is always going to be negative. This is how casinos are designed: the payoff values are based on these probability computations, and in the long run the casinos are always going to make money.

And from the casino's end, it is very unlikely that one person plays 1000 times, but since these are all Bernoulli trials, person 1 placing a bet is the same as person 2 placing a bet (the trials are independent), and at the end of those 1000 trials the casino will end up making money.

The value that we have computed (**−90** over 1000 games, i.e. about −0.09 per game) is called the expected gain, or the expected value of the random variable profit, and we compute it using the probability distribution of the random variable.

We can denote the expected value of the random variable using the notation **E[X]**.

And this is the computation that we have done:

E[X] ≈ (26 × 34 + 974 × (−1)) / 1000

There are 26 wins (out of a total of 1000 games), each giving us Rs. 34, and 974 losses, each losing us Rs. 1, and we have taken the average.

Let's re-arrange the terms a bit:

E[X] ≈ 34 × (26/1000) + (−1) × (974/1000) ≈ 34 × pₓ(34) + (−1) × pₓ(−1)

So, we have now computed the expected value of profit as the sum of all the values the random variable can take, each multiplied by its corresponding probability (the probability of the random variable taking that value):

E[X] = Σₓ x · pₓ(x)

In the above case, since there are 2 possible outcomes, we are taking an average in which the two quantities are weighted by their probabilities instead of being given equal weights.
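The weighted-sum computation for the roulette bet can be sketched in code. This is a minimal sketch; the helper function name `expectation` is my own, and note that the exact expectation (≈ −0.079 per game) differs slightly from the figure obtained by rounding the win count to 26:

```python
# Expected profit per roulette spin as a probability-weighted sum.
# Setup from the text: 38 equally likely pockets, profit 34 on a win, loss 1 otherwise.
def expectation(values, probs):
    """E[X] = sum over outcomes of value * probability."""
    assert abs(sum(probs) - 1) < 1e-12, "probabilities must sum to 1"
    return sum(v * p for v, p in zip(values, probs))

p_win = 1 / 38
e_profit = expectation([34, -1], [p_win, 1 - p_win])
print(e_profit)         # about -0.079, i.e. an average loss per game
print(1000 * e_profit)  # about -79 over 1000 games
```
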

Let’s take one more example:

The setup: the person pays Rs. 6000 upfront as a premium, and if his car gets stolen, the insurance company pays him Rs. 200000.

And we are interested in the expected gain of the insurance company at the end of 1 year.

We define the gain/profit of the insurance company as the random variable, and it can take two values: if the car does not get stolen, the insurance company keeps whatever premium the customer has paid; if the car is stolen, the company has to pay back Rs. 200000, which means its net loss is (Rs. 200000 − Rs. 6000), i.e. Rs. 194000.

This is again like a Bernoulli trial, as there are two possible outcomes. The probability of the car being stolen is 2%, so the distribution is pₓ(6000) = 0.98 and pₓ(−194000) = 0.02.

So, the expected gain of the insurance company is 0.98 × 6000 + 0.02 × (−194000) = Rs. 2000. This is the long-term gain: if we repeat the experiment a large number of times, this is the average value we expect the random variable to take. Note that it need not be a value the random variable can actually take; for example, 2000 is not a possible value of this random variable, but it is its expected value over many repetitions.
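A quick sanity check of that arithmetic (the 2% theft probability is the one used in this example):

```python
# Sanity check of the insurance company's expected gain per policy.
# Numbers from the text: premium Rs. 6000, payout Rs. 200000,
# 2% chance of the car being stolen.
p_stolen = 0.02
premium, payout = 6000, 200000

gain_not_stolen = premium          # the company simply keeps the premium
gain_stolen = premium - payout     # pays out Rs. 200000, net loss Rs. 194000

expected_gain = (1 - p_stolen) * gain_not_stolen + p_stolen * gain_stolen
print(expected_gain)  # close to 2000
```
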

Let’s twist this problem a bit:

Let's call the premium Rs. x. The random variable then takes the value 'x' if the car does not get stolen and the value −(200000 − x) if the car gets stolen.

We are given that the probability of the car being stolen is now 10%, so the distribution is pₓ(x) = 0.9 and pₓ(−(200000 − x)) = 0.1.

So, earlier, when the probability of the car being stolen was 2%, the premium amount was Rs. 6000; now that the probability of the car being stolen has changed to 10%, the premium also changes significantly.
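We can sketch solving for the premium x. Since E[gain] = 0.9·x + 0.1·(x − 200000) = x − 20000, a target expected gain G needs x = G + 0.1 × 200000. Note the target of Rs. 2000 (carried over from the 2% case) is my assumption for illustration; the text only says the premium changes significantly:

```python
# With P(stolen) = 0.10, the expected gain as a function of premium x is
#   E = 0.9*x + 0.1*(x - 200000) = x - 20000,
# so a target expected gain G requires premium x = G + 0.1 * 200000.
# The target of Rs. 2000 (the gain from the 2% case) is an assumption here.
p_stolen = 0.10
payout = 200000
target_gain = 2000

premium = target_gain + p_stolen * payout
print(premium)  # 22000, up to float rounding

# verify against the defining equation
e_gain = (1 - p_stolen) * premium + p_stolen * (premium - payout)
print(e_gain)   # recovers the target gain
```
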

Expectation plays a very important role across domains. It is the long-term gain, or the long-term average over many repetitions of the experiment, of the values that the random variable takes.

## Properties of expectation

1. The linearity of expectation: let's say '**Y**' is a linear function of the random variable '**X**', i.e. **Y = aX + b**.

Now, using the formula for expectation, we have:

E[Y] = Σₓ (ax + b) · pₓ(x) = a Σₓ x · pₓ(x) + b Σₓ pₓ(x)

The last sum, Σₓ pₓ(x), is the sum of the probabilities of all the values that the random variable can take, which means it is going to be 1.

Substituting the values for the relevant parts, we have the expectation of Y as:

E[Y] = a · E[X] + b

**So, if two random variables are linked linearly, then their expectation formula/values would also be linked linearly.**
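A small numerical check of this linearity, on a made-up discrete distribution (the values, probabilities, and coefficients below are arbitrary examples):

```python
# Numerical check of E[aX + b] = a*E[X] + b on a made-up discrete distribution.
values = [-1, 0, 2, 5]
probs = [0.1, 0.2, 0.4, 0.3]
a, b = 3, 7

e_x = sum(v * p for v, p in zip(values, probs))
e_y_direct = sum((a * v + b) * p for v, p in zip(values, probs))

print(e_y_direct)   # computed directly from the distribution of Y
print(a * e_x + b)  # computed via linearity; the two agree
```
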

2. If we have a set of random variables **X₁, X₂, X₃, ……, Xₙ**, then the expected value of the sum of these random variables is the same as the sum of their expected values: E[X₁ + X₂ + … + Xₙ] = E[X₁] + E[X₂] + … + E[Xₙ]. The proof of this property is discussed in detail in another article.
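A quick numerical check of this additivity property, using two fair dice as an assumed example (not from the text):

```python
import itertools

# Check E[X1 + X2] = E[X1] + E[X2] by enumerating two fair dice.
# (The property holds even without independence; dice just make an easy check.)
faces = range(1, 7)
e_single = sum(faces) / 6                      # E of one die = 3.5

pairs = list(itertools.product(faces, faces))  # 36 equally likely outcomes
e_sum = sum(x1 + x2 for x1, x2 in pairs) / len(pairs)

print(e_sum, 2 * e_single)  # both are 7.0
```
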

3. Let’s see the relationship between expectation and mean value of a sample/population:

Say the sample space consists of ‘**n**’ students and there is a random variable denoted by ‘**W**’ which maps the students to weights

We are interested in the expected value of this random variable. For simplicity, let's assume every student has a unique weight; that means the random variable can take '**n**' values, and each of these '**n**' values is equally likely because each student is equally likely. Therefore the probability of the random variable taking any particular weight is '**1/n**'.

The formula for expectation is:

E[W] = Σᵢ wᵢ · p(wᵢ)

We can plug in the probability value, which is '**1/n**' for every weight, and we get:

E[W] = (1/n) Σᵢ wᵢ

which is exactly the average of the weights.

So, **whenever we have equally likely outcomes, the expected value is the same as the mean value(center of gravity)**.
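A quick sketch of that equivalence, with hypothetical weights (the numbers below are made up):

```python
# With n equally likely outcomes (probability 1/n each), the expectation
# reduces to the ordinary average. The weights below are hypothetical.
weights = [52, 61, 48, 70, 55]  # weights in kg
n = len(weights)

e_w = sum(w * (1 / n) for w in weights)  # expectation: sum of w * p(w)
mean_w = sum(weights) / n                # plain average

print(e_w, mean_w)  # the same value
```
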

When the outcomes are not equally likely, the center-of-gravity interpretation of the expectation still holds. Let's take the same example discussed in the previous article, where the random variable is the position of the first patient with a matching blood group.

This random variable can take infinitely many values (1, 2, 3, …), and the expected value of this distribution is given as:

E[X] = Σₖ k · (0.91)ᵏ⁻¹ · 0.09, summing over k = 1, 2, 3, …

So, here we are multiplying each value by its probability; for example, '1' is multiplied by '0.09', which is the probability that the very first patient has a matching blood group (the probability of the random variable taking the value 1), and so on. The values are not all equally likely in this case.

Factoring out the 0.09, the remaining series 1 + 2(0.91) + 3(0.91)² + … is a special series known as an arithmetico-geometric series, and the sum of such an infinite series is given by:

S∞ = a/(1 − r) + d·r/(1 − r)²

where:

'**a**' is the starting value, which is 1

'**d**' is the difference between successive terms, which is 1 (2 − 1 = 1)

'**r**' is the common ratio of the geometric part, which in this case is 0.91 and can also be written as (1 − p)

Working this out gives an expected value of 1/0.09 ≈ **11.11**, and that 11.11 mark is indeed the center of gravity of the distribution: the probability mass on its two sides balances out.
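We can verify numerically that the series sums to about 11.11 by taking a large partial sum (p = 0.09 is the matching-blood-group probability used here):

```python
# Partial-sum check that sum_{k>=1} k * (1-p)^(k-1) * p converges to 1/p.
# p = 0.09 is the matching-blood-group probability from the text.
p = 0.09
e_partial = sum(k * (1 - p) ** (k - 1) * p for k in range(1, 2001))

print(e_partial)  # approximately 11.11
print(1 / p)      # the closed-form value
```
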

The expectation is also called the center of gravity, just as the mean value is. The expectation and the mean value are the same when we have a uniform distribution (where each value is equally likely); if the distribution is not uniform, then the expected value is a weighted sum where the weights are the probabilities of the values the random variable can take.

References: PadhAI