Recall the strong law of large numbers.
Theorem [SLLN]. Let $X_1, X_2, \dots$ be a sequence of i.i.d. random variables with finite mean $\mu$ and variance $\sigma^2$. Then the sequence of sample means $\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$ converges almost surely to $\mu$. In other words, almost surely
$$\bar{X}_n \longrightarrow \mu. \tag{1}$$
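A quick simulation illustrates the theorem. This sketch is not from the text: the choice of an Exponential(1) distribution (mean $\mu = 1$) and the sample sizes are assumptions made for illustration.

```python
import numpy as np

# Sketch (illustrative assumptions): i.i.d. Exponential(1) draws, so mu = 1.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)

# Running sample means Xbar_n for n = 1, ..., 100000.
running_means = np.cumsum(x) / np.arange(1, x.size + 1)

print(running_means[99])   # n = 100: still noticeably noisy
print(running_means[-1])   # n = 100000: close to mu = 1
```

The running mean drifts toward $\mu = 1$ as $n$ grows, as the SLLN predicts.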
Under a signal-noise interpretation, the SLLN tells us that the signal (the mean) dominates asymptotically. This result is sharpened by the central limit theorem.
Theorem [CLT]. Let $X_1, X_2, \dots$ be a sequence of i.i.d. random variables with finite mean $\mu$ and variance $\sigma^2$. Then the sequence of properly normalized sample means converges in distribution to the standard Gaussian:
$$\sqrt{n}\,\frac{\bar{X}_n - \mu}{\sigma} \overset{d}{\longrightarrow} \mathcal{N}(0, 1). \tag{2}$$
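The theorem can be checked empirically. In this sketch (my own illustrative choices, not from the text), we draw many independent sample means of Uniform(0, 1) variables, for which $\mu = 1/2$ and $\sigma^2 = 1/12$, and confirm that the normalized means have approximately standard-normal moments.

```python
import numpy as np

# Sketch (illustrative assumptions): Uniform(0,1), mu = 1/2, sigma^2 = 1/12.
rng = np.random.default_rng(1)
n, trials = 1000, 20_000
mu, sigma = 0.5, (1 / 12) ** 0.5

samples = rng.uniform(size=(trials, n))
# Normalized sample means sqrt(n) * (Xbar_n - mu) / sigma.
z = np.sqrt(n) * (samples.mean(axis=1) - mu) / sigma

print(z.mean(), z.std())  # approximately 0 and 1
```

A histogram of `z` would likewise show the familiar bell curve.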
The signal-noise interpretation is that amplifying the signal by $\sqrt{n}$ and dividing by the noise (the standard deviation $\sigma$) reveals the Gaussian. We now elucidate the normalization $\sqrt{n}/\sigma$ in (2). We begin by asking for a quantitative, pre-asymptotic version of the SLLN that quantifies the decay of the error $\bar{X}_n - \mu$. This is the goal of Chebyshev's inequality. Since the variance of $\bar{X}_n$ is $\sigma^2/n$, we compute
$$\mathbb{P}\left(\left|\bar{X}_n - \mu\right| \geq \varepsilon\right) \leq \frac{\operatorname{Var}(\bar{X}_n)}{\varepsilon^2} = \frac{\sigma^2}{n\,\varepsilon^2}.$$
Taking the change of variables $\varepsilon = t\,\sigma/\sqrt{n}$,
$$\mathbb{P}\left(\left|\bar{X}_n - \mu\right| \geq t\,\frac{\sigma}{\sqrt{n}}\right) \leq \frac{1}{t^2},$$
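The bound above is easy to check numerically. In this sketch (the Gaussian sampling distribution and the values of $t$ are my own illustrative assumptions), the empirical tail probability stays below $1/t^2$ for each threshold.

```python
import numpy as np

# Sketch (illustrative assumptions): i.i.d. N(0, 1), so mu = 0, sigma = 1.
rng = np.random.default_rng(2)
n, trials = 500, 50_000
mu, sigma = 0.0, 1.0

xbar = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)

# Empirical P(|Xbar_n - mu| >= t * sigma / sqrt(n)) vs Chebyshev's bound 1/t^2.
for t in (2.0, 3.0, 4.0):
    empirical = np.mean(np.abs(xbar - mu) >= t * sigma / np.sqrt(n))
    print(t, empirical, 1 / t**2)  # empirical never exceeds 1/t^2
```

For Gaussian data the bound is quite loose (e.g. the true tail at $t = 2$ is about $0.046$, well under $1/4$), which is expected: Chebyshev holds for any finite-variance distribution.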
which pins down the natural unit for the tolerance $\varepsilon$ as the scaled standard deviation $\sigma/\sqrt{n}$, since the right-hand side no longer depends artificially on the sample size $n$. Notice the appearance of $\sqrt{n}/\sigma$ in both the CLT and Chebyshev's inequality. Both cases point out that amplifying $\bar{X}_n - \mu$ by $\sqrt{n}/\sigma$ stabilizes the error. In particular, the decay rate of $\left|\bar{X}_n - \mu\right|$ is of order $1/\sqrt{n}$.
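The stabilization can be seen directly in simulation. This sketch (the Gaussian data and the particular sample sizes are illustrative assumptions) shows the typical error shrinking like $1/\sqrt{n}$, while the rescaled error $\sqrt{n}\,|\bar{X}_n - \mu|/\sigma$ stays roughly constant across sample sizes.

```python
import numpy as np

# Sketch (illustrative assumptions): i.i.d. N(0, 1), so mu = 0, sigma = 1.
rng = np.random.default_rng(3)
trials, sigma = 20_000, 1.0

scaled = []
for n in (100, 400, 1600):
    xbar = rng.normal(0.0, sigma, size=(trials, n)).mean(axis=1)
    err = np.abs(xbar).mean()        # typical error, decays like 1/sqrt(n)
    scaled.append(np.sqrt(n) / sigma * err)
    print(n, err, scaled[-1])        # last column is roughly constant
```

The rescaled column hovers near $\mathbb{E}|Z| = \sqrt{2/\pi} \approx 0.798$ for a standard normal $Z$, independent of $n$.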