Expected value. Discrete random variables. Mathematical definition of a discrete random variable

2. Basics of probability theory

Expected value

Consider a random variable with numerical values. It is often useful to associate a number with this function: its “mean value”, also called the “average value” or “index of central tendency”. For a number of reasons, some of which will become clear later, the mathematical expectation is usually used as this “average value”.

Definition 3. The mathematical expectation of a random variable X is the number

M(X) = X(ω1)·P(ω1) + X(ω2)·P(ω2) + … + X(ωN)·P(ωN),   (4)

where ω1, ω2, …, ωN are the elementary events; i.e. the mathematical expectation of a random variable is the weighted sum of its values with weights equal to the probabilities of the corresponding elementary events.

Example 6. Let us calculate the mathematical expectation of the number that appears on the top face of a die. It follows directly from Definition 3 that

M(X) = 1·(1/6) + 2·(1/6) + 3·(1/6) + 4·(1/6) + 5·(1/6) + 6·(1/6) = 21/6 = 3.5.
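As a quick cross-check of Example 6, here is a minimal sketch in Python that evaluates the same probability-weighted sum:

```python
# Expectation of a fair die, computed directly from Definition 3
# as a sum of value * probability over the elementary events.
faces = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6          # each face has probability 1/6

m = sum(x * p for x, p in zip(faces, probs))
print(m)  # 3.5
```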

Statement 2. Let the random variable X take the values x1, x2, …, xm. Then the equality

M(X) = x1·P(X = x1) + x2·P(X = x2) + … + xm·P(X = xm)   (5)

holds, i.e. the mathematical expectation of a random variable is the weighted sum of its values with weights equal to the probabilities that the random variable takes those values.

Unlike (4), where the summation runs directly over elementary events, here it runs over the values of the random variable; a random event such as {X = xj} may consist of several elementary events.

Sometimes relation (5) is taken as the definition of the mathematical expectation. However, using Definition 3 it is easier, as shown below, to establish the properties of the mathematical expectation needed for building probabilistic models of real phenomena than it is using relation (5).

To prove relation (5), we group in (4) the terms with identical values of the random variable:

M(X) = Σ_{j=1..m} Σ_{ω: X(ω)=xj} X(ω)·P(ω) = Σ_{j=1..m} Σ_{ω: X(ω)=xj} xj·P(ω).

Since the constant factor can be taken out of the summation sign,

M(X) = Σ_{j=1..m} xj · Σ_{ω: X(ω)=xj} P(ω).

By the definition of the probability of an event,

Σ_{ω: X(ω)=xj} P(ω) = P(X = xj).

Using the last two relations we obtain the required result:

M(X) = Σ_{j=1..m} xj·P(X = xj), which is (5).

The concept of mathematical expectation in probability and statistics corresponds to the concept of the center of gravity in mechanics. Place masses P(X = x1), P(X = x2), …, P(X = xm) at the points x1, x2, …, xm on the number axis. Then equality (5) shows that the center of gravity of this system of material points coincides with the mathematical expectation, which demonstrates how natural Definition 3 is.

Statement 3. Let X be a random variable, M(X) its mathematical expectation, and a some number. Then

1) M(a) = a; 2) M(X − M(X)) = 0; 3) M[(X − a)²] = M[(X − M(X))²] + (a − M(X))².

To prove this, first consider a random variable that is constant, i.e. the function maps the space of elementary events to the single point a. Since the constant factor can be taken out of the summation sign,

M(a) = Σ_{ω} a·P(ω) = a · Σ_{ω} P(ω) = a·1 = a.

If each term of a sum splits into two terms, then the whole sum splits into two sums, the first made up of the first terms and the second of the second terms. Therefore the mathematical expectation of the sum of two random variables X + Y, defined on the same space of elementary events, is equal to the sum of the mathematical expectations M(X) and M(Y) of these random variables:

M(X+Y) = M(X) + M(Y).

Therefore M(X − M(X)) = M(X) − M(M(X)). As shown above, M(M(X)) = M(X). Hence M(X − M(X)) = M(X) − M(X) = 0.

Since (X − a)² = ((X − M(X)) + (M(X) − a))² = (X − M(X))² + 2(X − M(X))(M(X) − a) + (M(X) − a)², we have M[(X − a)²] = M[(X − M(X))²] + M[2(X − M(X))(M(X) − a)] + M[(M(X) − a)²]. Let us simplify the last equality. As shown at the beginning of the proof of Statement 3, the mathematical expectation of a constant is the constant itself, and therefore M[(M(X) − a)²] = (M(X) − a)². Since the constant factor can be taken out of the summation sign, M[2(X − M(X))(M(X) − a)] = 2(M(X) − a)·M(X − M(X)). The right-hand side of this equality is 0 because, as shown above, M(X − M(X)) = 0. Hence M[(X − a)²] = M[(X − M(X))²] + (a − M(X))², which is what needed to be proved.

From the above it follows that M[(X − a)²], regarded as a function of a, attains its minimum, equal to M[(X − M(X))²], at a = M(X), since the second term in equality 3) is always non-negative and equals 0 only for this value of a.
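The minimizing property in item 3) can be illustrated numerically. The three-point distribution below is an assumed example (not taken from the text); the sketch scans values of a and confirms that M[(X − a)²] is smallest at a = M(X):

```python
# Illustration of Statement 3(3): M[(X - a)^2] is minimized at a = M(X).
import numpy as np

x = np.array([1.0, 2.0, 5.0])      # assumed values
p = np.array([0.2, 0.5, 0.3])      # assumed probabilities (sum to 1)
mx = float(np.sum(x * p))          # M(X)

a_grid = np.linspace(0.0, 6.0, 601)
msd = np.array([np.sum((x - a) ** 2 * p) for a in a_grid])
print(mx, a_grid[msd.argmin()])    # the minimizing a agrees with M(X) up to the grid step
```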

Statement 4. Let the random variable X take the values x1, x2, …, xm, and let f be some function of a numerical argument. Then

M[f(X)] = f(x1)·P(X = x1) + f(x2)·P(X = x2) + … + f(xm)·P(X = xm).

To prove this, group on the right-hand side of equality (4), which defines the mathematical expectation, the terms with identical values of the random variable:

M[f(X)] = Σ_{j=1..m} Σ_{ω: X(ω)=xj} f(xj)·P(ω).

Using the fact that the constant factor can be taken out of the summation sign, and the definition of the probability of a random event (2), we obtain

M[f(X)] = Σ_{j=1..m} f(xj)·P(X = xj).

Q.E.D.

Statement 5. Let X and Y be random variables defined on the same space of elementary events, and let a and b be numbers. Then M(aX + bY) = aM(X) + bM(Y).

Using the definition of the mathematical expectation and the properties of the summation sign, we obtain the chain of equalities

M(aX + bY) = Σ_{ω} (aX(ω) + bY(ω))·P(ω) = a·Σ_{ω} X(ω)·P(ω) + b·Σ_{ω} Y(ω)·P(ω) = aM(X) + bM(Y).

The required result is proved.

The above shows how the mathematical expectation behaves under a change of reference point and of unit of measurement (the transformation Y = aX + b), as well as under functions of random variables. These results are constantly used in technical and economic analysis, in assessing the financial and economic activity of an enterprise, when converting from one currency to another in foreign-trade calculations, in normative and technical documentation, and so on. They allow the same calculation formulas to be used for different scale and shift parameters.



6.1.2 Properties of mathematical expectation

1. The mathematical expectation of a constant is equal to the constant itself: M(C) = C.

2. The constant factor can be taken out of the sign of the mathematical expectation: M(CX) = C·M(X).

3. The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations: M(XY) = M(X)·M(Y).

This property is true for an arbitrary number of random variables.

4. The mathematical expectation of the sum of two random variables is equal to the sum of the mathematical expectations of the terms: M(X + Y) = M(X) + M(Y).

This property is also true for an arbitrary number of random variables.

Example: M(X) = 5, M(Y) = 2. Find the mathematical expectation of the random variable Z = 2X + 3Y, applying the properties of the mathematical expectation.

Solution: M(Z) = M(2X + 3Y) = M(2X) + M(3Y) = 2M(X) + 3M(Y) = 2·5 + 3·2 = 16. Here we used the facts that:

1) the mathematical expectation of a sum is equal to the sum of the mathematical expectations;

2) the constant factor can be taken out of the sign of the mathematical expectation.
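A short sketch confirming the answer by direct summation. The problem does not specify the individual distributions of X and Y, so the two-point laws below are assumptions chosen only to give M(X) = 5 and M(Y) = 2, with X and Y taken independent:

```python
# Check that M(2X + 3Y) = 2*M(X) + 3*M(Y) = 16 for assumed distributions.
from itertools import product

x_dist = {4: 0.5, 6: 0.5}   # assumed: M(X) = 5
y_dist = {1: 0.5, 3: 0.5}   # assumed: M(Y) = 2

mz = sum((2 * x + 3 * y) * px * py
         for (x, px), (y, py) in product(x_dist.items(), y_dist.items()))
print(mz)  # 16.0
```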

Let n independent trials be performed, in each of which event A occurs with probability p. Then the following theorem holds:

Theorem. The mathematical expectation M(X) of the number of occurrences of event A in n independent trials is equal to the product of the number of trials and the probability of occurrence of the event in each trial: M(X) = np.

6.1.3 Dispersion of a discrete random variable

The mathematical expectation alone cannot fully characterize a random variable. In addition to the mathematical expectation, we need a quantity that characterizes how far the values of the random variable deviate from the mathematical expectation.

This deviation is the difference between the random variable and its mathematical expectation. However, the mathematical expectation of the deviation itself is zero: some possible deviations are positive, others negative, and they cancel each other out.

The dispersion (scattering) of a discrete random variable is the mathematical expectation of the squared deviation of the random variable from its mathematical expectation: D(X) = M[(X − M(X))²].

In practice, this way of calculating the variance is inconvenient, because it leads to cumbersome calculations when the random variable has many values.

Therefore, another method is used.

Theorem. The variance is equal to the difference between the mathematical expectation of the square of the random variable X and the square of its mathematical expectation: D(X) = M(X²) − [M(X)]².

Proof. Taking into account that the mathematical expectation M(X), and hence also M²(X), is a constant, we can write:

D(X) = M[(X − M(X))²] = M[X² − 2X·M(X) + M²(X)] = M(X²) − 2M(X)·M(X) + M²(X) = M(X²) − M²(X).

Example. Find the variance of a discrete random variable given by the distribution law

X: x1, x2, x3, x4
P: 0.2, 0.3, 0.1, 0.4

Solution: D(X) = M(X²) − [M(X)]².

6.1.4 Dispersion properties

1. The variance of a constant is zero: D(C) = 0.

2. The constant factor can be taken out of the dispersion sign by squaring it: D(CX) = C²·D(X).

3. The variance of the sum of two independent random variables is equal to the sum of the variances of these variables: D(X + Y) = D(X) + D(Y).

4. The variance of the difference of two independent random variables is equal to the sum of the variances of these variables: D(X − Y) = D(X) + D(Y).

Theorem. The variance of the number of occurrences of event A in n independent trials, in each of which the probability p of occurrence of the event is constant, is equal to the product of the number of trials by the probabilities of occurrence and non-occurrence of the event in each trial: D(X) = npq, where q = 1 − p.

Example: Find the variance of DSV X - the number of occurrences of event A in 2 independent trials, if the probability of the occurrence of the event in these trials is the same and it is known that M(X) = 1.2.

Let's apply the theorem from section 6.1.2:

M(X) = np

M(X) = 1.2; n = 2. Let's find p:

1.2 = 2·p

p = 1.2/2 = 0.6

q = 1 − p = 1 − 0.6 = 0.4

Let's find the variance using the formula D(X) = npq:

D(X) = 2·0.6·0.4 = 0.48
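The same answer can be checked by enumerating the binomial distribution for n = 2, p = 0.6, a minimal sketch:

```python
# Verify M(X) = np = 1.2 and D(X) = npq = 0.48 by direct enumeration.
from math import comb

n, p = 2, 0.6
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

m = sum(k * pk for k, pk in pmf.items())
d = sum(k**2 * pk for k, pk in pmf.items()) - m**2
print(m, d)  # 1.2 0.48 (up to floating-point rounding)
```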

6.1.5 Standard deviation of a discrete random variable

The standard deviation of a random variable X is the square root of its variance:

σ(X) = √D(X).   (25)

Theorem. The standard deviation of the sum of a finite number of mutually independent random variables is equal to the square root of the sum of the squares of the standard deviations of these variables: σ(X1 + … + Xn) = √(σ²(X1) + … + σ²(Xn)).

6.1.6 Mode and median of a discrete random variable

The mode Mo of a DSV is the most probable value of the random variable (i.e. the value that has the highest probability).

The median Me of a DSV is the value of the random variable that divides the distribution series in half. If the number of values of the random variable is even, the median is found as the arithmetic mean of the two middle values.

Example: Find the mode and median of the DSV X:

X: x1, x2, x3, x4
p: 0.2, 0.3, 0.1, 0.4

Me = 5.5
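The x-values of this series did not survive in the source, so the sketch below uses hypothetical values 4, 5, 6, 7 (chosen only to be consistent with the stated Me = 5.5) to show how the mode and median rules are applied:

```python
# Mode = value with the highest probability; median = mean of the two middle
# values when the number of values is even. The x-values are hypothetical.
x = [4, 5, 6, 7]              # assumed values
p = [0.2, 0.3, 0.1, 0.4]      # probabilities from the example

mode = x[p.index(max(p))]     # value with the largest probability
median = (x[1] + x[2]) / 2    # mean of the two middle values
print(mode, median)           # 7 5.5
```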

Procedure

1. Familiarize yourself with the theoretical part of this work (lectures, textbook).

2. Complete the task according to your own version.

3. Make a report on the work.

4. Defend your work.

2. Purpose of the work.

3. Work progress.

4. Solution of your own variant.


6.4 Task options for independent work

Option #1

1. Find the mathematical expectation, dispersion, standard deviation, mode and median of the DSV X, given by the distribution law.

X: x1, x2, x3, x4
P: 0.1, 0.6, 0.2, 0.1

2. Find the mathematical expectation of the random variable Z if the mathematical expectations of X and Y are known: M(X)=6, M(Y)=4, Z=5X+3Y.

3. Find the variance of DSV X - the number of occurrences of event A in two independent trials, if the probabilities of occurrence of events in these trials are the same and it is known that M (X) = 1.

4. A list of possible values of a discrete random variable X is given: x1 = 1, x2 = 2, x3 = 5; the mathematical expectations of this variable and of its square, M(X) and M(X²), are also known. Find the probabilities p1, p2, p3 corresponding to the possible values x1, x2, x3 and write down the distribution law of the DSV.

Option No. 2

X: x1, x2, x3, x4
P: 0.3, 0.1, 0.2, 0.4

2. Find the mathematical expectation of the random variable Z if the mathematical expectations of X and Y are known: M(X)=5, M(Y)=8, Z=6X+2Y.

3. Find the variance of DSV X - the number of occurrences of event A in three independent trials, if the probabilities of occurrence of events in these trials are the same and it is known that M (X) = 0.9.

4. A list of possible values of a discrete random variable X is given: x1 = 1, x2 = 2, x3 = 4, x4 = 10; the mathematical expectations of this variable and of its square, M(X) and M(X²), are also known. Find the probabilities p1, p2, p3, p4 corresponding to the possible values x1, x2, x3, x4 and write down the distribution law of the DSV.

Option #3

1. Find the mathematical expectation, dispersion and standard deviation of DSV X, given by the distribution law.

X: x1, x2, x3, x4
P: 0.5, 0.1, 0.2, 0.3

2. Find the mathematical expectation of the random variable Z if the mathematical expectations of X and Y are known: M(X)=3, M(Y)=4, Z=4X+2Y.

3. Find the variance of the DSV X, the number of occurrences of event A in four independent trials, if the probabilities of occurrence of the event in these trials are the same and it is known that M(X) = 1.2.

The concept of mathematical expectation can be illustrated by the throwing of a die. With each throw the number of points rolled is recorded; the possible values are the natural numbers from 1 to 6.

After a certain number of throws, a simple calculation gives the arithmetic mean of the points rolled.

Like the occurrence of any particular value in this range, the arithmetic mean is itself random.

What happens if the number of throws is increased many times? With a large number of throws the arithmetic mean of the points approaches a specific number, which in probability theory is called the mathematical expectation.

So, by the mathematical expectation we mean the average value of a random variable. It can also be presented as a weighted sum of the possible values.

This concept has several synonyms:

  • mean value;
  • average value;
  • indicator of central tendency;
  • first moment.

In other words, it is nothing more than a number around which the values ​​of a random variable are distributed.

In different fields of human activity the approaches to understanding the mathematical expectation differ somewhat.

It can be considered as:

  • the average benefit obtained from a decision, when such a decision is considered from the point of view of the theory of large numbers;
  • the possible amount of winnings or losses (in gambling theory), calculated on average per bet; in gambling slang this is the “player's edge” (positive for the player) or the “house edge” (negative for the player);
  • the percentage of profit received from winnings.

The mathematical expectation does not exist for every random variable: it is absent for those for which the corresponding sum or integral diverges.

Properties of mathematical expectation

Like any statistical parameter, the mathematical expectation has a number of basic properties:

  • the mathematical expectation of a constant is equal to that constant;
  • a constant factor can be taken out of the sign of the mathematical expectation;
  • the mathematical expectation of a sum of random variables equals the sum of their mathematical expectations;
  • for independent random variables, the mathematical expectation of a product equals the product of the mathematical expectations.


Basic formulas for mathematical expectation

The mathematical expectation can be calculated both for discrete random variables (formula 1) and for continuous ones (formula 2):

  1. M(X) = Σ_{i=1..n} xi·pi, where the xi are the values of the random variable and the pi are their probabilities;
  2. M(X) = ∫_{−∞}^{+∞} x·f(x) dx, where f(x) is the given probability density.
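A minimal sketch of both formulas, assuming SciPy is available. The distributions are illustrative assumptions (a three-point discrete law and an exponential density with rate 2), not examples from the text:

```python
# Discrete expectation as a sum, continuous expectation as an integral.
import numpy as np
from scipy.integrate import quad

# formula 1: M(X) = sum of x_i * p_i
x = np.array([1.0, 2.0, 4.0])
p = np.array([0.5, 0.3, 0.2])
m_discrete = float(np.sum(x * p))          # 1.9

# formula 2: M(X) = integral of x * f(x) dx, with f(x) = 2*exp(-2x) on [0, inf)
lam = 2.0
m_continuous, _ = quad(lambda t: t * lam * np.exp(-lam * t), 0, np.inf)

print(m_discrete, m_continuous)            # 1.9 and ~0.5 (= 1/lambda)
```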

Examples of calculating mathematical expectation

Example A.

Let us find the average height of the dwarfs in the fairy tale about Snow White. It is known that each of the 7 dwarfs had a certain height: 1.25; 0.98; 1.05; 0.71; 0.56; 0.95 and 0.81 m.

The calculation algorithm is quite simple:

  • find the sum of all values of the height indicator (the random variable):
    1.25 + 0.98 + 1.05 + 0.71 + 0.56 + 0.95 + 0.81 = 6.31;
  • divide the resulting sum by the number of dwarfs:
    6.31 : 7 ≈ 0.90.

Thus, the average height of the dwarfs in the fairy tale is 0.90 m, i.e. 90 cm. In other words, this is the mathematical expectation of the dwarfs' height.

Working formula for the discrete case: for a random variable with values 4, 6, 10 and probabilities 0.2, 0.3, 0.5, M(X) = 4·0.2 + 6·0.3 + 10·0.5 = 7.6.
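Both calculations above can be re-checked with a few lines:

```python
# Average height of the 7 dwarfs and the discrete working formula.
heights = [1.25, 0.98, 1.05, 0.71, 0.56, 0.95, 0.81]
print(round(sum(heights) / len(heights), 2))       # 0.9 (metres)

values, probs = [4, 6, 10], [0.2, 0.3, 0.5]
print(sum(v * q for v, q in zip(values, probs)))   # 7.6
```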

Practical applications of mathematical expectation

The calculation of the mathematical expectation is used in various fields of practical activity, first of all in the commercial sphere. After all, Huygens introduced this indicator in connection with determining the chances that can be favorable, or, on the contrary, unfavorable, for some event.

This parameter is widely used to assess risks, especially when it comes to financial investments.
Thus, in business, the calculation of the mathematical expectation serves as a method of risk assessment when setting prices.

This indicator can also be used to calculate the effectiveness of certain measures, for example, labor protection. Thanks to it, you can calculate the probability of an event occurring.

Another area of application of this parameter is management. It can also be used in product quality control: for example, using the mathematical expectation one can estimate the likely number of defective parts produced.

The mathematical expectation is also indispensable in the statistical processing of results obtained in scientific research. It allows one to assess the probability of a desired or undesired outcome of an experiment or study, depending on the degree to which the goal is achieved; reaching the goal is associated with gain and benefit, and failing to reach it with loss.

Using mathematical expectation in Forex

This statistical parameter can be used in practice when trading on the foreign exchange market. It helps to analyze the success of trading operations: an increase in the expectation value indicates an increase in their success.

It is also important to remember that the mathematical expectation should not be considered as the only statistical parameter used to analyze a trader’s performance. The use of several statistical parameters along with the average value increases the accuracy of the analysis significantly.

This parameter has proven itself well in monitoring trading accounts: it allows a quick assessment of the work carried out on a deposit account. In cases where the trader's activity is successful and he avoids losses, it is not recommended to rely exclusively on the mathematical expectation: risks are then not taken into account, which reduces the effectiveness of the analysis.

Conducted studies of traders’ tactics indicate that:

  • the most effective tactics are those based on random entry;
  • the least effective are those based on structured entries.

In achieving positive results, no less important are:

  • money management tactics;
  • exit strategies.

Using an indicator such as the mathematical expectation, you can predict the profit or loss per dollar invested. It is known that this indicator, calculated over all games offered in a casino, is in favor of the house; this is what allows the casino to make money. Over a long series of games, the likelihood that a client will lose money increases significantly.

Games played by professional players are limited to short periods of time, which increases the likelihood of winning and reduces the risk of losing. The same pattern is observed when performing investment operations.

An investor can earn a significant amount by having positive expectations and making a large number of transactions in a short period of time.

The expectation can be thought of as the difference between the probability of a profit (PW) multiplied by the average profit (AW) and the probability of a loss (PL) multiplied by the average loss (AL): E = PW·AW − PL·AL.

As an example, consider the following: position – 12.5 thousand dollars, portfolio – 100 thousand dollars, deposit risk – 1%. Transactions are profitable in 40% of cases with an average profit of 20%; in case of loss, the average loss is 5%. Calculating the mathematical expectation for the transaction gives 12,500·(0.40·0.20 − 0.60·0.05) = $625.
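A minimal sketch reproducing that figure from the stated inputs (position $12,500, 40% wins averaging 20%, 60% losses averaging 5%):

```python
# Trade expectancy: position * (P(win)*avg_win - P(loss)*avg_loss).
position = 12_500
p_win, avg_win = 0.40, 0.20
p_loss, avg_loss = 0.60, 0.05

expectancy = position * (p_win * avg_win - p_loss * avg_loss)
print(expectancy)  # 625.0
```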

Basic numerical characteristics of discrete and continuous random variables: mathematical expectation, dispersion and standard deviation. Their properties and examples.

The distribution law (distribution function and distribution series or probability density) completely describes the behavior of a random variable. But in a number of problems, it is enough to know some numerical characteristics of the value under study (for example, its average value and possible deviation from it) in order to answer the question posed. Let's consider the main numerical characteristics of discrete random variables.

Definition 7.1. The mathematical expectation of a discrete random variable is the sum of the products of its possible values and the corresponding probabilities:

M(X) = x1·p1 + x2·p2 + … + xn·pn.   (7.1)

If the number of possible values of the random variable is infinite, then M(X) = x1·p1 + x2·p2 + …, provided the resulting series converges absolutely.

Note 1. The mathematical expectation is sometimes called the weighted average, since it is approximately equal to the arithmetic mean of the observed values of the random variable over a large number of experiments.

Note 2. From the definition of mathematical expectation it follows that its value is no less than the smallest possible value of a random variable and no more than the largest.

Note 3. The mathematical expectation of a discrete random variable is a non-random (constant) quantity. We will see later that the same is true for continuous random variables.

Example 1. Find the mathematical expectation of a random variable X, the number of standard parts among three selected from a batch of 10 parts that contains 2 defective ones. Let us construct the distribution series of X. From the problem conditions it follows that X can take the values 1, 2, 3, with P(X = 1) = 1/15, P(X = 2) = 7/15, P(X = 3) = 7/15. Then M(X) = 1·(1/15) + 2·(7/15) + 3·(7/15) = 36/15 = 2.4.
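The probabilities and the expectation in Example 1 can be re-derived with a short sketch (hypergeometric counting over 8 standard and 2 defective parts):

```python
# X = number of standard parts among 3 drawn from 10 (8 standard, 2 defective).
from math import comb

total, standard, drawn = 10, 8, 3
pmf = {k: comb(standard, k) * comb(total - standard, drawn - k) / comb(total, drawn)
       for k in range(1, drawn + 1)}       # X can be 1, 2 or 3

m = sum(k * pk for k, pk in pmf.items())
print(pmf)   # {1: 0.0666..., 2: 0.4666..., 3: 0.4666...}, i.e. 1/15, 7/15, 7/15
print(m)     # 2.4
```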

Example 2. Determine the mathematical expectation of a random variable X, the number of coin tosses up to the first appearance of heads (the coat of arms). This quantity can take an infinite number of values (the set of possible values is the set of natural numbers). Its distribution series has the form:

X: 1, 2, …, n, …
p: 0.5, (0.5)², …, (0.5)^n, …

Then M(X) = 1·0.5 + 2·(0.5)² + … + n·(0.5)^n + … = 2 (when calculating, the formula Σ_{n≥1} n·x^n = x/(1 − x)², valid for |x| < 1, is used with x = 0.5).
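A quick numerical check of the series in Example 2 (the tail beyond a few hundred terms is negligible):

```python
# M(X) = sum over n of n * (0.5)^n, which converges to 2.
partial = sum(n * 0.5**n for n in range(1, 200))
print(partial)  # ~2.0
```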

Properties of mathematical expectation.

1) The mathematical expectation of a constant is equal to the constant itself:

M(C) = C.   (7.2)

Proof. If we consider C as a discrete random variable taking the single value C with probability p = 1, then M(C) = C·1 = C.

2) The constant factor can be taken out of the sign of the mathematical expectation:

M(CX) = C·M(X).   (7.3)

Proof. If the random variable X is given by a distribution series with values x1, x2, …, xn and probabilities p1, p2, …, pn, then CX takes the values Cx1, Cx2, …, Cxn with the same probabilities, and

M(CX) = Cx1·p1 + Cx2·p2 + … + Cxn·pn = C(x1·p1 + x2·p2 + … + xn·pn) = C·M(X).

Definition 7.2. Two random variables are called independent if the distribution law of one of them does not depend on which values the other has taken. Otherwise the random variables are dependent.

Definition 7.3. The product of independent random variables X and Y is the random variable XY whose possible values are the products of all possible values of X by all possible values of Y, the corresponding probabilities being the products of the probabilities of the factors.

3) The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations:

M(XY) = M(X)M(Y). (7.4)

Proof. To simplify the calculations, we restrict ourselves to the case when X and Y each take only two possible values: x1, x2 with probabilities p1, p2, and y1, y2 with probabilities q1, q2.

Hence, M(XY) = x1y1·p1q1 + x2y1·p2q1 + x1y2·p1q2 + x2y2·p2q2 = y1q1(x1p1 + x2p2) + y2q2(x1p1 + x2p2) = (y1q1 + y2q2)(x1p1 + x2p2) = M(X)·M(Y).

Note 1. You can similarly prove this property for a larger number of possible values ​​of the factors.

Note 2. Property 3 is true for the product of any number of independent random variables, which is proven by mathematical induction.

Definition 7.4. The sum of random variables X and Y is the random variable X + Y whose possible values are the sums of each possible value of X with each possible value of Y; the probabilities of such sums are equal to the products of the probabilities of the terms (for dependent random variables, to the products of the probability of one term by the conditional probability of the other).

4) The mathematical expectation of the sum of two random variables (dependent or independent) is equal to the sum of the mathematical expectations of the terms:

M (X+Y) = M (X) + M (Y). (7.5)

Proof.

Let us again consider the random variables defined by the distribution series given in the proof of property 3. Then the possible values of X + Y are x1 + y1, x1 + y2, x2 + y1, x2 + y2. Denote their probabilities by p11, p12, p21 and p22 respectively. We find

M(X + Y) = (x1 + y1)·p11 + (x1 + y2)·p12 + (x2 + y1)·p21 + (x2 + y2)·p22 =

= x1(p11 + p12) + x2(p21 + p22) + y1(p11 + p21) + y2(p12 + p22).

Let us prove that p11 + p12 = p1. Indeed, the event that X + Y takes the value x1 + y1 or x1 + y2, whose probability is p11 + p12, coincides with the event X = x1 (whose probability is p1). It is proved similarly that p21 + p22 = p2, p11 + p21 = q1 and p12 + p22 = q2. Hence

M(X + Y) = x1p1 + x2p2 + y1q1 + y2q2 = M(X) + M(Y).

Comment. From property 4 it follows that the mathematical expectation of the sum of any number of random variables is equal to the sum of the mathematical expectations of the terms.

Example. Find the mathematical expectation of the sum of the number of points obtained when throwing five dice.

Let us find the mathematical expectation of the number of points rolled when throwing one die:

M(X1) = (1 + 2 + 3 + 4 + 5 + 6)·(1/6) = 3.5.

The same number is the mathematical expectation of the number of points rolled on any of the dice. Therefore, by property 4, M(X) = 5·3.5 = 17.5.

Dispersion.

In order to have an idea of the behavior of a random variable, it is not enough to know only its mathematical expectation. Consider two random variables X and Y, given by distribution series of the form

X: 49, 50, 51
p: 0.1, 0.8, 0.1

Y: 0, 100
p: 0.5, 0.5

We find M(X) = 49·0.1 + 50·0.8 + 51·0.1 = 50 and M(Y) = 0·0.5 + 100·0.5 = 50. As you can see, the mathematical expectations of the two variables are equal, but while M(X) describes the behavior of X well, being its most probable value (and the remaining values differ little from 50), the values of Y are far removed from M(Y). Therefore, along with the mathematical expectation, it is desirable to know how much the values of the random variable deviate from it. Dispersion is used to characterize this.

Definition 7.5. The dispersion (scattering) of a random variable is the mathematical expectation of the square of its deviation from its mathematical expectation:

D(X) = M[(X − M(X))²].   (7.6)

Let us find the variance of the random variable X (the number of standard parts among those selected) in Example 1 of this lecture. We calculate the squared deviation of each possible value from the mathematical expectation:

(1 − 2.4)² = 1.96; (2 − 2.4)² = 0.16; (3 − 2.4)² = 0.36. Hence

D(X) = 1.96·(1/15) + 0.16·(7/15) + 0.36·(7/15) = 5.6/15 ≈ 0.373.
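A short check of this number, a sketch using the distribution found in Example 1:

```python
# D(X) for X with values 1, 2, 3 and probabilities 1/15, 7/15, 7/15.
x = [1, 2, 3]
p = [1 / 15, 7 / 15, 7 / 15]
m = sum(xi * pi for xi, pi in zip(x, p))            # 2.4

d = sum((xi - m) ** 2 * pi for xi, pi in zip(x, p))
print(m, d)  # 2.4 0.3733...
```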

Note 1. In determining dispersion, it is not the deviation from the mean itself that is assessed, but its square. This is done so that deviations of different signs do not cancel each other out.

Note 2. From the definition of dispersion it follows that this quantity takes only non-negative values.

Note 3. There is a formula for calculating variance that is more convenient for calculations, the validity of which is proven in the following theorem:

Theorem 7.1. D(X) = M(X²) − M²(X).   (7.7)

Proof.

Using the fact that M(X) is a constant, and the properties of the mathematical expectation, we transform formula (7.6) as follows:

D(X) = M[(X − M(X))²] = M[X² − 2X·M(X) + M²(X)] = M(X²) − 2M(X)·M(X) + M²(X) =

= M(X²) − 2M²(X) + M²(X) = M(X²) − M²(X), which is what needed to be proved.

Example. Let us calculate the variances of the random variables X and Y discussed at the beginning of this section. D(X) = (49²·0.1 + 50²·0.8 + 51²·0.1) − 50² = 2500.2 − 2500 = 0.2.

D(Y) = (0²·0.5 + 100²·0.5) − 50² = 5000 − 2500 = 2500. So, the variance of the second random variable is several thousand times greater than the variance of the first. Thus, even without knowing the distribution laws of these variables, from the known variances we can state that X deviates little from its mathematical expectation, while for Y this deviation is quite significant.

Properties of dispersion.

1) The variance of a constant C is equal to zero:

D(C) = 0.   (7.8)

Proof. D(C) = M((C − M(C))²) = M((C − C)²) = M(0) = 0.

2) The constant factor can be taken out of the dispersion sign by squaring it:

D(CX) = C² D(X). (7.9)

Proof. D(CX) = M((CX − M(CX))²) = M((CX − C·M(X))²) = M(C²·(X − M(X))²) = C²·D(X).

3) The variance of the sum of two independent random variables is equal to the sum of their variances:

D(X+Y) = D(X) + D(Y). (7.10)

Proof. D(X + Y) = M((X + Y)²) − (M(X) + M(Y))² = M(X² + 2XY + Y²) − (M(X) + M(Y))² = M(X²) + 2M(X)M(Y) + M(Y²) − M²(X) − 2M(X)M(Y) − M²(Y) = (M(X²) − M²(X)) + (M(Y²) − M²(Y)) = D(X) + D(Y) (here M(XY) = M(X)·M(Y) by the independence of X and Y).

Corollary 1. The variance of the sum of several mutually independent random variables is equal to the sum of their variances.

Corollary 2. The variance of the sum of a constant and a random variable is equal to the variance of the random variable.

4) The variance of the difference between two independent random variables is equal to the sum of their variances:

D(X-Y) = D(X) + D(Y). (7.11)

Proof. D(X − Y) = D(X) + D(−Y) = D(X) + (−1)²·D(Y) = D(X) + D(Y).

The variance gives the average value of the squared deviation of a random variable from the mean; to evaluate the deviation itself, a quantity called the standard deviation is used.

Definition 7.6. The standard deviation σ of a random variable X is the square root of its variance:

σ(X) = √D(X).

Example. In the previous example, the standard deviations of X and Y are, respectively, σ(X) = √0.2 ≈ 0.45 and σ(Y) = √2500 = 50.
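A minimal sketch computing the variances and standard deviations of both distributions via Theorem 7.1 (D = M(X²) − M²(X)):

```python
from math import sqrt

def var_and_sigma(values, probs):
    # D(X) = M(X^2) - M(X)^2, sigma = sqrt(D)
    m = sum(v * q for v, q in zip(values, probs))
    d = sum(v**2 * q for v, q in zip(values, probs)) - m**2
    return d, sqrt(d)

print(var_and_sigma([49, 50, 51], [0.1, 0.8, 0.1]))  # (~0.2, ~0.45)
print(var_and_sigma([0, 100], [0.5, 0.5]))           # (2500.0, 50.0)
```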

The most complete description of a random variable is its distribution law. However, it is not always known, and in such cases one has to be content with less information. Such information may include: the range of variation of the random variable, its largest (smallest) value, and some other characteristics that describe the random variable in a summary way. All such quantities are called numerical characteristics of the random variable. Usually they are non-random numbers that characterize the random variable in one way or another. The main purpose of numerical characteristics is to express, in concise form, the most essential features of a particular distribution.

The simplest numerical characteristic of a random variable X is its expected value (mathematical expectation):

M(X) = x1·p1 + x2·p2 + … + xn·pn.   (1.3.1)

Here x1, x2, …, xn are the possible values of the random variable X, and p1, p2, …, pn are their probabilities.

Example 1. Find the mathematical expectation of a random variable with the distribution law

X: 2, 3, 5
p: 0.3, 0.1, 0.6

Solution. M(X) = 2·0.3 + 3·0.1 + 5·0.6 = 3.9.

Example 2. Find the mathematical expectation of the number of occurrences of an event A in one trial, if the probability of this event is p.

Solution. If X is the number of occurrences of the event A in one trial, then obviously the distribution law of X has the form

X: 0, 1
p: 1 − p, p

Then M(X) = 0·(1 − p) + 1·p = p.

So: the mathematical expectation of the number of occurrences of an event in one trial is equal to its probability.

Probabilistic meaning of mathematical expectation

Suppose n trials are performed, in which the random variable X took the value x1 m1 times, the value x2 m2 times, …, the value xk mk times. Then the sum of all values obtained in the n trials is

x1·m1 + x2·m2 + … + xk·mk.

The arithmetic mean of all values taken by the random variable is

x̄ = (x1·m1 + x2·m2 + … + xk·mk)/n = x1·(m1/n) + x2·(m2/n) + … + xk·(mk/n).

The quantities mi/n are the relative frequencies of occurrence of the values xi (i = 1, …, k). If n is large enough (n → ∞), these frequencies are approximately equal to the probabilities: mi/n ≈ pi. But then

x̄ ≈ x1·p1 + x2·p2 + … + xk·pk = M(X).

Thus, the mathematical expectation is approximately equal to the arithmetic mean of the observed values of the random variable (the more accurately, the larger the number of trials). This is the probabilistic meaning of the mathematical expectation.
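The probabilistic meaning can be illustrated by simulation, a sketch using a fair die (M(X) = 3.5):

```python
# Sample means of die rolls approach the expectation as the number of trials grows.
import random

random.seed(0)
for n in (100, 10_000, 1_000_000):
    mean = sum(random.randint(1, 6) for _ in range(n)) / n
    print(n, mean)   # the means tend towards 3.5
```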

Properties of mathematical expectation

1. The mathematical expectation of a constant is equal to the constant itself.

M(C)=C×1=C.

2. The constant factor can be taken out of the mathematical expectation sign

M(CX)=C×M(X).

Proof. Let the distribution law of X be given by the values x1, x2, …, xn with probabilities p1, p2, …, pn. Then the random variable CX takes the values Cx1, Cx2, …, Cxn with the same probabilities, i.e. the distribution law of CX consists of the values Cx1, Cx2, …, Cxn with probabilities p1, p2, …, pn, and

M(CX) = Cx1·p1 + Cx2·p2 + … + Cxn·pn = C(x1·p1 + x2·p2 + … + xn·pn) = C·M(X).

3. The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations:

M(XY)=M(X)×M(Y).

This statement is given without proof (the proof is based on the definition of mathematical expectation).

Corollary. The mathematical expectation of the product of several mutually independent random variables is equal to the product of their mathematical expectations.

In particular, for three independent random variables

M(XYZ)=M(X)×M(Y)×M(Z).

Example. Find the mathematical expectation of the product of the numbers of points that appear when throwing two dice.

Solution. Let Xi be the number of points on the i-th die. It can take the values 1, 2, …, 6, each with probability 1/6. Then

M(Xi) = 1·(1/6) + 2·(1/6) + … + 6·(1/6) = (1 + 2 + … + 6)/6 = 21/6 = 3.5.

Let X = X1·X2. Then

M(X) = M(X1)·M(X2) = 3.5·3.5 = 12.25.
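The same value follows from enumerating all 36 equally likely outcomes of the two dice, a one-line check:

```python
# M(X1 * X2) over all 36 outcomes of two independent dice.
m_product = sum(i * j for i in range(1, 7) for j in range(1, 7)) / 36
print(m_product)  # 12.25
```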

4. The mathematical expectation of the sum of two random variables (independent or dependent) is equal to the sum of the mathematical expectations of the terms:

M(X+Y)=M(X)+M(Y).

This property is generalized to the case of an arbitrary number of terms.

Example. Three shots are fired, with probabilities of hitting the target p1 = 0.4, p2 = 0.3 and p3 = 0.6. Find the expected value of the total number of hits.

Solution. Let Xi be the number of hits on the i-th shot. Then

M(Xi) = 1·pi + 0·(1 − pi) = pi.

Thus,

M(X1 + X2 + X3) = M(X1) + M(X2) + M(X3) = 0.4 + 0.3 + 0.6 = 1.3.
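The answer can also be checked by enumerating every hit/miss pattern of the three shots and weighting the total number of hits by its probability, a minimal sketch:

```python
# M(X1 + X2 + X3) for independent shots with hit probabilities 0.4, 0.3, 0.6.
from itertools import product

ps = [0.4, 0.3, 0.6]
m = 0.0
for hits in product([0, 1], repeat=3):          # every hit/miss pattern
    prob = 1.0
    for h, p in zip(hits, ps):
        prob *= p if h else (1 - p)
    m += sum(hits) * prob
print(m)  # 1.3 (up to floating-point rounding)
```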