Power Law

In statistics, a power law is a functional relationship between two quantities, where a relative change in one quantity results in a proportional relative change in the other quantity, independent of the initial size of those quantities: one quantity varies as a power of another.

Distribution
The probability density function of a power law has the form:


 * $$p(x) \propto x^{-\alpha}$$

with $$\alpha > 1$$.
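For the density to be normalizable, the support must be bounded away from zero, say $$x \geq x_{\min} > 0$$. Under that assumption, samples can be drawn by inverse-transform sampling; the sketch below (the value $$\alpha = 2.5$$ and $$x_{\min} = 1$$ are illustrative choices, not anything fixed by the text) demonstrates the method:

```python
import math
import random

def sample_power_law(alpha, x_min=1.0):
    # Inverse-transform sampling: for p(x) ∝ x^{-alpha} on [x_min, ∞)
    # with alpha > 1, the CDF is F(x) = 1 - (x / x_min)^{1 - alpha},
    # so inverting F at u ~ Uniform(0, 1) gives
    # x = x_min * (1 - u)^{-1 / (alpha - 1)}.
    u = random.random()
    return x_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

random.seed(0)
samples = [sample_power_law(alpha=2.5) for _ in range(100_000)]

# Every sample lies above x_min, and the heavy tail produces occasional
# very large values even though most samples stay near x_min.
print(min(samples) >= 1.0, max(samples) > 10.0)
```

The heavy tail is the practical signature of the distribution: with $$\alpha = 2.5$$, the probability of exceeding $$10 x_{\min}$$ is $$10^{-1.5} \approx 3\%$$, so large events are rare but far from negligible.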

Scale-free property
One attribute of power laws is their scale invariance. Given a relation $$f(x) = ax^{-k}$$, scaling the argument $$x$$ by a constant factor $$c$$ causes only a proportionate scaling of the function itself. That is,


 * $$f(c x) = a(c x)^{-k} = c^{-k} f(x) \propto f(x).\!$$

In other words, scaling the argument by a constant $$c$$ simply multiplies the original power-law relation by the constant $$c^{-k}$$. It follows that all power laws with a particular scaling exponent are equivalent up to constant factors, since each is simply a scaled version of the others.
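The scale-invariance identity can be checked numerically; the sketch below uses illustrative values $$a = 2$$, $$k = 1.5$$, $$c = 3$$ (none of which are specified in the text) and verifies $$f(cx) = c^{-k} f(x)$$ for several arguments:

```python
def f(x, a=2.0, k=1.5):
    # A power-law relation f(x) = a * x^{-k}; a and k are
    # illustrative values chosen for the check.
    return a * x ** (-k)

# Scaling the argument by c multiplies the output by the constant
# c^{-k}, independent of x — the scale-free property.
c, k = 3.0, 1.5
for x in (0.5, 1.0, 10.0):
    assert abs(f(c * x) - c ** (-k) * f(x)) < 1e-12

print("f(c*x) == c^{-k} * f(x) for every x checked")
```

Because the ratio $$f(cx)/f(x) = c^{-k}$$ does not depend on $$x$$, a power law plotted on log-log axes is a straight line of slope $$-k$$, which is how the property is usually recognized in data.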

Self-organized criticality
In self-organized critical systems, power laws emerge at the critical point between phases. In neural networks, for example, the distribution of neuronal avalanche sizes follows a power law.
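When a power law is suspected in empirical data such as avalanche sizes, the exponent is commonly estimated by maximum likelihood. The sketch below applies the continuous MLE $$\hat\alpha = 1 + n \big/ \sum_i \ln(x_i / x_{\min})$$ to synthetic data (a stand-in, since no real avalanche measurements appear in the text), drawn here with a known exponent of 2 so the estimate can be sanity-checked:

```python
import math
import random

def estimate_alpha(samples, x_min=1.0):
    # Continuous maximum-likelihood estimator for the power-law
    # exponent: alpha_hat = 1 + n / sum(ln(x_i / x_min)),
    # using only the tail of the data at or above x_min.
    tail = [x for x in samples if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic stand-in for avalanche sizes: inverse-transform samples
# from p(x) ∝ x^{-2} on [1, ∞), so the true exponent is 2.
random.seed(1)
data = [(1.0 - random.random()) ** -1.0 for _ in range(200_000)]

alpha_hat = estimate_alpha(data)
print(abs(alpha_hat - 2.0) < 0.05)  # estimate is close to the true exponent
```

Fitting a straight line to a log-log histogram is a common alternative but is known to be more biased than the MLE, which is why the likelihood-based estimator is sketched here.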