Difference between theoretical moment and sample moment
OK, So I'm reading the following definitions
$E(X^k)$ is the kth (theoretical) moment of the distribution (about the origin), for k = 1,2,...
$M_k=\frac{1}{n}\sum_{i=1}^nX^k_i$ is the kth sample moment, for k = 1,2,...
My questions are:
How is the sample moment different from the theoretical moment? Isn't $E(X^k)=\frac{1}{n}\sum_{i=1}^nX^k_i$ ?
What is the connection between sample moment and theoretical moment?
2 Answers
$\begingroup$If you sample a probability distribution $n$ times, the values you draw define a new probability distribution, called the sample (or empirical) distribution. The theoretical moments of the sample distribution are the sample moments. In various senses, the sample distribution converges to the original distribution as $n \to \infty$, and the same is true of the moments.
The two are not the same. For example, the original distribution might be a (fair) coin flip. If you sample a coin flip $n$ times - that is, if you flip $n$ independent coins - the sample distribution is the distribution of heads and tails you see, which need not be an even 50/50 split, and in fact can't be if $n$ is odd.
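A small simulation (a hypothetical sketch, not from the answer itself) makes the coin-flip point concrete: encoding heads as 1 and tails as 0, the theoretical first moment is $E(X) = 0.5$, but the first sample moment is a head-count divided by $n$, so for odd $n$ it cannot equal 0.5.

```python
import random

random.seed(1)

# Illustration: flip a fair coin n times, with X_i = 1 for heads, 0 for tails.
# The theoretical first moment is E(X) = 0.5, but the first sample moment
# M_1 = (1/n) * sum(X_i) varies from sample to sample, and for odd n it is
# an integer divided by an odd number, so it can never equal 0.5 exactly.
def first_sample_moment(n):
    flips = [random.randint(0, 1) for _ in range(n)]
    return sum(flips) / n

for n in (11, 101, 10001):
    print(n, first_sample_moment(n))
```

As $n$ grows, the printed sample moments cluster around 0.5, even though no single odd-$n$ sample moment ever equals it.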
$\endgroup$ 2 $\begingroup$Besides everything already mentioned in previous comments/answers, probably the most important property of the sample moment is that it converges to the theoretical moment (in probability, or almost surely when slightly stricter conditions are met). Namely, by the (weak) law of large numbers (assuming that $E|X^k| < \infty$), you have that
$$
\frac{1}{n}\sum_{i=1}^n X_i^k \xrightarrow{p} EX^k, \quad n \to \infty,
$$
where the latter, $EX^k$, is a constant that characterizes the population distribution. Accordingly, there is an estimation method called the "Method of Moments" that relies on this property to estimate parameters of the population distribution.
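To illustrate the Method of Moments (a hypothetical sketch, not part of the original answer), take an Exponential distribution with rate $\lambda$, whose theoretical first moment is $E(X) = 1/\lambda$. Matching it to the first sample moment $M_1$ gives the estimator $\hat{\lambda} = 1/M_1$:

```python
import random

random.seed(0)

# Method-of-moments sketch for an Exponential(rate) distribution:
# the theoretical first moment is E(X) = 1/rate, so setting
# M_1 = 1/rate and solving gives the estimator rate_hat = 1 / M_1.
true_rate = 2.0
n = 100_000
sample = [random.expovariate(true_rate) for _ in range(n)]

m1 = sum(sample) / n   # first sample moment M_1
rate_hat = 1.0 / m1    # method-of-moments estimate of the rate

print(rate_hat)
```

By the law of large numbers, $M_1 \to E(X)$ as $n \to \infty$, so `rate_hat` converges to the true rate; with $n = 100{,}000$ draws it lands close to 2.0.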