Hi all, this blog is an English-language archive of my PhD experience at Imperial College London, mainly logging my research and working process, as well as some visual records.

Showing posts with label Probability Theory.

Tuesday, 23 October 2007

M2S1 PROBABILITY AND STATISTICS II - Imperial College

Recommended Texts
G. R. Grimmett and D. R. Stirzaker, Probability and Random Processes (2nd/3rd edition). [Very useful for the probability material of the course].
W. Feller, An Introduction to Probability Theory and Its Applications, Vols 1 and 2. [A classical reference text].
G. Casella and R. L. Berger, Statistical Inference. [A very useful text, which covers statistical ideas as well as probability material].
There are many such introductory texts in the Mathematics library. Other books relating to specific parts of the course will be recommended when relevant.
Also, there will be a course WWW page accessible from http://stats.ma.ic.ac.uk/ayoung. It will contain links to course handouts, exercises and solutions.
Professor A. Young (room 529, email alastair.young@imperial.ac.uk)

Thursday, 2 August 2007

Moment-generating function

In probability theory and statistics, the moment-generating function of a random variable X is

M_X(t)=E\left(e^{tX}\right), \quad t \in \mathbb{R},

wherever this expectation exists. The moment-generating function generates the moments of the probability distribution.
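As a quick numerical illustration: for a standard normal X the MGF is known in closed form as e^{t^2/2}, so a Monte Carlo estimate of E(e^{tX}) should reproduce it. A minimal Python sketch (the seed and sample size below are arbitrary choices):

import numpy as np

rng = np.random.default_rng(42)          # arbitrary seed
t = 0.5
X = rng.standard_normal(1_000_000)       # draws from a standard normal
mc_estimate = np.exp(t * X).mean()       # Monte Carlo estimate of E(e^{tX})
closed_form = np.exp(t**2 / 2)           # known MGF of N(0, 1)
print(mc_estimate, closed_form)          # the two should agree to a few decimals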

For vector-valued random variables X with real components, the moment-generating function is given by

 M_X(\mathbf{t}) = E\left( e^{\langle \mathbf{t}, \mathbf{X}\rangle}\right)

where \mathbf{t} is a vector of real numbers and \langle \mathbf{t}, \mathbf{X}\rangle is the dot product.
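The vector case can be checked numerically as well, using the standard closed form \exp(\mathbf{t}^\top \mu + \frac{1}{2}\mathbf{t}^\top \Sigma \mathbf{t}) for the MGF of a multivariate normal N(\mu, \Sigma); the particular \mu, \Sigma, \mathbf{t} and seed below are arbitrary choices for illustration:

import numpy as np

rng = np.random.default_rng(0)                 # arbitrary seed
mu = np.array([1.0, -0.5])                     # arbitrary mean vector
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])                 # arbitrary covariance matrix
t = np.array([0.2, 0.1])                       # arbitrary evaluation point

X = rng.multivariate_normal(mu, Sigma, size=1_000_000)
mc = np.exp(X @ t).mean()                      # Monte Carlo estimate of E(e^{<t, X>})
closed = np.exp(t @ mu + 0.5 * t @ Sigma @ t)  # standard multivariate-normal MGF
print(mc, closed)                              # should agree closely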

Provided the moment-generating function exists in an interval around t = 0, the nth moment is given by

E\left(X^n\right)=M_X^{(n)}(0)=\left.\frac{\mathrm{d}^n M_X(t)}{\mathrm{d}t^n}\right|_{t=0}.
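For example, the Exponential(\lambda) distribution has the standard MGF M_X(t) = \lambda/(\lambda - t) for t < \lambda, and repeated differentiation at t = 0 should recover the known moments E(X^n) = n!/\lambda^n. A small sympy sketch:

import sympy as sp

t = sp.symbols('t', real=True)
lam = sp.symbols('lambda', positive=True)
M = lam / (lam - t)                              # MGF of Exponential(lambda), valid for t < lambda
moments = [sp.diff(M, t, n).subs(t, 0) for n in range(1, 5)]
print(moments)                                   # [1/lambda, 2/lambda**2, 6/lambda**3, 24/lambda**4]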

If X has a continuous probability density function f(x), then the moment-generating function is given by

M_X(t) = \int_{-\infty}^\infty e^{tx} f(x)\,\mathrm{d}x
 = \int_{-\infty}^\infty \left( 1+ tx + \frac{t^2x^2}{2!} + \cdots\right) f(x)\,\mathrm{d}x
 = 1 + tm_1 + \frac{t^2m_2}{2!} +\cdots,

where m_i is the ith moment. M_X(-t) is just the two-sided Laplace transform of f(x).
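As a sketch of this integral at work, take the Exponential(\lambda) density \lambda e^{-\lambda x} on (0, \infty), whose MGF is \lambda/(\lambda - t) for t < \lambda; sympy can carry out the integration, and expanding the result in t recovers the moment series above (conds='none' suppresses the convergence conditions, which hold when t < \lambda):

import sympy as sp

t, x = sp.symbols('t x', real=True)
lam = sp.symbols('lambda', positive=True)
pdf = lam * sp.exp(-lam * x)                         # Exponential(lambda) density on (0, oo)
M = sp.integrate(sp.exp(t * x) * pdf, (x, 0, sp.oo), conds='none')
print(sp.simplify(M))                                # lambda/(lambda - t)
print(sp.series(M, t, 0, 4))                         # 1 + t/lambda + t**2/lambda**2 + ...,
                                                     # i.e. 1 + m1*t + m2*t**2/2! + ...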

Regardless of whether the probability distribution is continuous or not, the moment-generating function is given by the Riemann-Stieltjes integral

M_X(t) = \int_{-\infty}^\infty e^{tx}\,dF(x)

where F is the cumulative distribution function.
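When X is discrete, the Stieltjes integral reduces to a sum over the probability mass function. A toy sketch for a fair six-sided die:

import numpy as np

faces = np.arange(1, 7)                  # outcomes 1..6, each with probability 1/6

def mgf_die(t):
    # M_X(t) = sum_x e^{tx} P(X = x), the discrete form of the Stieltjes integral
    return np.mean(np.exp(t * faces))

print(mgf_die(0.0))   # 1.0, since M_X(0) = E(e^0) = 1 for any distribution
print(mgf_die(0.1))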

If X_1, X_2, \ldots, X_n is a sequence of independent (and not necessarily identically distributed) random variables, and

S_n = \sum_{i=1}^n a_i X_i,

where the a_i are constants, then the probability density function of S_n is the convolution of the probability density functions of the a_i X_i, and the moment-generating function of S_n is given by

M_{S_n}(t)=M_{X_1}(a_1t)M_{X_2}(a_2t)\cdots M_{X_n}(a_nt).
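A quick symbolic check of this factorisation, assuming the standard normal MGF M(s) = e^{s^2/2}: if X_1 and X_2 are independent N(0, 1), then S = a_1 X_1 + a_2 X_2 is N(0, a_1^2 + a_2^2), and its MGF should equal M(a_1 t) M(a_2 t):

import sympy as sp

t, a1, a2 = sp.symbols('t a1 a2', real=True)
M = lambda s: sp.exp(s**2 / 2)              # MGF of a standard normal, a known fact
lhs = sp.exp((a1**2 + a2**2) * t**2 / 2)    # MGF of S ~ N(0, a1^2 + a2^2)
rhs = M(a1 * t) * M(a2 * t)                 # product formula M_{X_1}(a1 t) * M_{X_2}(a2 t)
print(sp.simplify(lhs - rhs))               # 0, confirming the identity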


Related to the moment-generating function are a number of other transforms that are common in probability theory, including the characteristic function and the probability-generating function.

The cumulant-generating function is the logarithm of the moment-generating function.
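As an illustration, differentiating the cumulant-generating function of a normal N(\mu, \sigma^2) (whose MGF e^{\mu t + \sigma^2 t^2/2} is standard) at t = 0 yields the familiar cumulants: the mean, the variance, and zero for all higher orders. A short sympy sketch:

import sympy as sp

t = sp.symbols('t', real=True)
mu = sp.symbols('mu', real=True)
sigma = sp.symbols('sigma', positive=True)
M = sp.exp(mu * t + sigma**2 * t**2 / 2)    # MGF of N(mu, sigma^2), a standard fact
K = sp.log(M)                               # cumulant-generating function
cumulants = [sp.simplify(sp.diff(K, t, n).subs(t, 0)) for n in range(1, 5)]
print(cumulants)                            # [mu, sigma**2, 0, 0]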