This page contains the basic Rules for the Mean, Variance, Covariance, and Correlation of random variables. This summary can be extremely helpful if you do not work regularly in statistics or are a new student. The proofs of these Rules can be purchased for a nominal fee from the Order page.
The proofs are often required as problems or test questions in a non-calculus first course in statistics. The Rules and their proofs are part of a Statistical Review, which runs approximately 27 pages in 10-point type. It is a handy review for someone who has been away from statistics for a while but suddenly finds an article using these Rules. Students will find it helpful as well. If you have MathType, you may edit the file.
FORMULAS AND RULES FOR EXPECTATIONS OF RANDOM VARIABLES
Formulas and Rules for the Mean of the Random Variable X
Formulas for the Mean
E(X) = μ = x_1 p_1 + x_2 p_2 + … + x_n p_n = Σ x_i p_i
where p_i is the probability of the occurrence of the value x_i.
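As a quick illustration (not part of the Review itself), here is a short Python sketch applying this formula to a fair six-sided die; the values and probabilities are just an assumed example.

# Mean of a fair six-sided die using E(X) = sum of x_i * p_i
xs = [1, 2, 3, 4, 5, 6]       # values of X
ps = [1.0 / 6] * 6            # equal probabilities for a fair die
mean = sum(x * p for x, p in zip(xs, ps))
print(mean)   # 3.5 (up to floating-point rounding)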
Rules for the Mean
Rule 1.
The expectation of a constant, c, is the constant.
E(c) = c
Rule 2.
Adding a constant value, c, to each term increases the mean, or expected value, by the constant.
E(X+c) = E(X)+c
Rule 3.
Multiplying a random variable by a constant value, c, multiplies the expected value or mean by that constant.
E(cX) = cE(X)
Rule 4.
The expected value or mean of the sum of two random variables is the sum of the means. This is also known as the additive law of expectation.
E(X+Y) = E(X)+E(Y)
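All four Rules are easy to verify numerically. The following Python sketch checks them on a small made-up distribution; the values, probabilities, and the choice Y = X² are assumptions for illustration only.

xs = [0, 1, 2]          # assumed example values of X
ps = [0.2, 0.5, 0.3]    # their probabilities

def E(vals):
    # Expected value of a variable taking vals[i] with probability ps[i].
    return sum(v * p for v, p in zip(vals, ps))

c = 4.0
ys = [x ** 2 for x in xs]   # Y = X^2, defined on the same outcomes as X

assert abs(E([c] * len(xs)) - c) < 1e-12                    # Rule 1: E(c) = c
assert abs(E([x + c for x in xs]) - (E(xs) + c)) < 1e-12    # Rule 2: E(X+c) = E(X)+c
assert abs(E([c * x for x in xs]) - c * E(xs)) < 1e-12      # Rule 3: E(cX) = cE(X)
assert abs(E([x + y for x, y in zip(xs, ys)])
           - (E(xs) + E(ys))) < 1e-12                       # Rule 4: E(X+Y) = E(X)+E(Y)

Note that Rule 4 passes even though Y = X² is completely dependent on X; the additive law of expectation does not require independence.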
Formulas and Rules for the Variance, Covariance and Standard Deviation of Random Variables
Formulas for the Variance
Var(X) = σ² = E[(X - μ)²]
or
Var(X) = σ² = Σ (x_i - μ)² p_i
or
Var(X) = σ² = E(X²) - [E(X)]²
Formulas for the Standard Deviation
SD(X) = σ = √Var(X)
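The two forms of the variance formula give the same answer, and the standard deviation is just the square root. A minimal Python check, using the same assumed distribution as above:

import math

xs = [0, 1, 2]          # assumed example values of X
ps = [0.2, 0.5, 0.3]    # their probabilities

def E(vals):
    return sum(v * p for v, p in zip(vals, ps))

mu = E(xs)
var_def = E([(x - mu) ** 2 for x in xs])        # E[(X - mu)^2]
var_alt = E([x ** 2 for x in xs]) - mu ** 2     # E(X^2) - [E(X)]^2
assert abs(var_def - var_alt) < 1e-12
sd = math.sqrt(var_def)                         # standard deviation
print(var_def, sd)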
Formulas for the Covariance
Cov(X,Y) = σ_XY = E[(X - μ_X)(Y - μ_Y)]
or
Cov(X,Y) = σ_XY = Σ Σ (x_i - μ_X)(y_j - μ_Y) p_ij
where p_ij is the joint probability of x_i and y_j,
or
Cov(X,Y) = σ_XY = E(XY) - E(X)E(Y)
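The defining formula and the shortcut E(XY) - E(X)E(Y) agree, which a short Python sketch can confirm; the joint probabilities below are a hypothetical example.

joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}   # assumed joint pmf
E_xy = sum(x * y * p for (x, y), p in joint.items())
E_x = sum(x * p for (x, y), p in joint.items())
E_y = sum(y * p for (x, y), p in joint.items())
cov_def = sum((x - E_x) * (y - E_y) * p for (x, y), p in joint.items())
cov_alt = E_xy - E_x * E_y
assert abs(cov_def - cov_alt) < 1e-12
print(cov_alt)   # 0.11 here: 0.6 - 0.7 * 0.7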
Rules for the Variance
Rule 1.
The variance of a constant is zero.
Var(c) = 0
Rule 2.
Adding a constant value, c, to a random variable does not change the variance, because the expectation (mean) increases by the same amount.
Var(X+c) = Var(X)
Rule 3.
Multiplying a random variable by a constant, c, multiplies the variance by the square of the constant.
Var(cX) = c²Var(X)
Rule 4.
The variance of the sum of two or more random variables is equal to the sum of their variances only when the random variables are independent. For two random variables,
Var(X+Y) = Var(X)+Var(Y)+2Cov(X,Y)
and in terms of the sigma notation
σ²_(X+Y) = σ²_X + σ²_Y + 2σ_XY
When two random variables are independent, Cov(X,Y) = 0, so that
Var(X+Y) = Var(X)+Var(Y)
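The general relationship, including the covariance term, can be checked numerically; the joint distribution below is the same assumed example as before.

joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}   # assumed joint pmf

def E(f):
    # Expectation of f(X, Y) under the joint pmf.
    return sum(f(x, y) * p for (x, y), p in joint.items())

var_x = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_y = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
cov_xy = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2
assert abs(var_sum - (var_x + var_y + 2 * cov_xy)) < 1e-12   # Rule 4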
Rules for the Covariance
Rule 1.
The covariance of two constants, c and k, is zero.
Cov(c,k) = 0
Rule 2.
The covariance of two independent random variables is zero.
Cov(X,Y) = 0 when X and Y are independent
Rule 3.
The covariance is commutative, as is evident from the definition.
Cov(X,Y) = Cov(Y,X)
Rule 4.
The covariance of a random variable with a constant is zero.
Cov(X,c) = 0
Rule 5.
Adding a constant to either or both random variables does not change their covariance.
Cov(X+c, Y+k) = Cov(X,Y)
Rule 6.
Multiplying a random variable by a constant multiplies the covariance by that constant.
Cov(cX, kY) = ckCov(X,Y)
Rule 7.
The additive law of covariance holds that the covariance of a random variable with a sum of random variables is just the sum of the covariances with each of the random variables.
Cov(X, Y+Z) = Cov(X,Y)+Cov(X,Z)
Rule 8.
The covariance of a variable with itself is the variance of the random variable. By definition,
Cov(X,X) = E[(X - μ_X)(X - μ_X)] = E[(X - μ_X)²] = Var(X)
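A few of these Rules can be spot-checked in Python; again the joint probabilities are an assumed example, and the cov helper simply implements the shortcut formula above.

joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}   # assumed joint pmf

def E(f):
    return sum(f(x, y) * p for (x, y), p in joint.items())

def cov(f, g):
    # Cov(f, g) = E(fg) - E(f)E(g)
    return E(lambda x, y: f(x, y) * g(x, y)) - E(f) * E(g)

X = lambda x, y: x
Y = lambda x, y: y
c, k = 3.0, -2.0

assert abs(cov(lambda x, y: x + c, lambda x, y: y + k) - cov(X, Y)) < 1e-12  # Rule 5
assert abs(cov(lambda x, y: c * x, Y) - c * cov(X, Y)) < 1e-12               # Rule 6
assert abs(cov(X, X) - (E(lambda x, y: x * x) - E(X) ** 2)) < 1e-12          # Rule 8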
Formulas and Rules for the Correlation Coefficient of Random Variables
Formula for the Correlation Coefficient
ρ(X,Y) = ρ_XY = Cov(X,Y)/(σ_X σ_Y)
Rules for the Correlation Coefficient
Rule 1.
Adding a constant to either random variable does not change the correlation coefficient between them.
ρ(X+a, Y+c) = ρ(X,Y)
Rule 2.
Multiplying a random variable by a positive constant does not change the correlation coefficient; multiplying by a negative constant reverses its sign. For two random variables
Z = a+bX and W = c+dY, where a, b, c, and d are constants,
ρ(Z,W) = ρ(X,Y) when bd > 0, and ρ(Z,W) = -ρ(X,Y) when bd < 0.
Because the square root of the variance is always positive, the correlation coefficient can be negative only when the covariance is negative. This leads to
Rule 3.
The correlation coefficient is always at least -1 and no more than +1.
-1 ≤ ρ(X,Y) ≤ +1
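These Rules can be verified together in Python. The sketch below uses the same assumed joint distribution; Z and W are arbitrary illustrative transformations with b = 2 > 0 and d = -3 < 0.

import math

joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}   # assumed joint pmf

def E(f):
    return sum(f(x, y) * p for (x, y), p in joint.items())

def corr(f, g):
    # rho = Cov(f, g) / (sd(f) * sd(g))
    cov_fg = E(lambda x, y: f(x, y) * g(x, y)) - E(f) * E(g)
    sd_f = math.sqrt(E(lambda x, y: f(x, y) ** 2) - E(f) ** 2)
    sd_g = math.sqrt(E(lambda x, y: g(x, y) ** 2) - E(g) ** 2)
    return cov_fg / (sd_f * sd_g)

X = lambda x, y: x
Y = lambda x, y: y
rho = corr(X, Y)

assert -1.0 <= rho <= 1.0                # Rule 3: bounded by -1 and +1
Z = lambda x, y: 1 + 2 * x               # Z = a + bX with b = 2 > 0
W = lambda x, y: 5 - 3 * y               # W = c + dY with d = -3 < 0
assert abs(corr(Z, Y) - rho) < 1e-12     # Rules 1-2: unchanged when b > 0
assert abs(corr(Z, W) + rho) < 1e-12     # sign flips when bd < 0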
Formulas and Rules for the Sample Mean, Variance, Covariance and Standard Deviation, and Correlation Coefficient of Random Variables
Rules for Sampling Statistics
Rule 1.
The sample mean, x̄, is computed by
x̄ = (Σ x_i)/n
Rule 2.
The sample variance is
s² = Σ (x_i - x̄)²/(n - 1)
or
s² = [Σ x_i² - (Σ x_i)²/n]/(n - 1)
The sample standard deviation, s, is
s = √(s²)
or
s = √[Σ (x_i - x̄)²/(n - 1)]
Rule 3.
The sample correlation coefficient, r, is computed in the same way as the population correlation coefficient, with sample statistics in place of the population values:
r = s_XY/(s_X s_Y)
where s_XY = Σ (x_i - x̄)(y_i - ȳ)/(n - 1) is the sample covariance.
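Putting the sample formulas together, here is a short Python sketch that computes all of these statistics on a small made-up data set.

import math

x = [2.0, 4.0, 4.0, 5.0, 7.0]   # made-up sample data
y = [1.0, 3.0, 2.0, 5.0, 6.0]
n = len(x)

xbar = sum(x) / n                                    # sample mean (Rule 1)
ybar = sum(y) / n
s2x = sum((xi - xbar) ** 2 for xi in x) / (n - 1)    # sample variance (Rule 2)
sx = math.sqrt(s2x)                                  # sample standard deviation
s2y = sum((yi - ybar) ** 2 for yi in y) / (n - 1)
sxy = sum((xi - xbar) * (yi - ybar)
          for xi, yi in zip(x, y)) / (n - 1)         # sample covariance
r = sxy / (sx * math.sqrt(s2y))                      # sample correlation (Rule 3)
print(xbar, s2x, sx, r)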