Kolmogorov entropy, also known as metric entropy, Kolmogorov-Sinai entropy, or KS entropy, is defined as follows. Divide phase space into $D$-dimensional hypercubes of content $\epsilon^D$. Let $P_{i_0\cdots i_n}$ be the probability that a trajectory is in hypercube $i_0$ at $t=0$, $i_1$ at $t=T$, $i_2$ at $t=2T$, etc. Then define

K_n = -\sum_{i_0,\ldots,i_n} P_{i_0\cdots i_n} \ln P_{i_0\cdots i_n},   (1)

where $K_{n+1}-K_n$ is the information needed to predict which hypercube $i_{n+1}$ the trajectory will be in at $(n+1)T$ given trajectories up to $nT$.
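As a concrete illustration of (1), the block entropies $K_n$ can be estimated empirically by coarse-graining a trajectory into cells and counting how often each length-$(n+1)$ cell sequence occurs. The following is a minimal Python sketch of that idea, using the logistic map $x \mapsto 4x(1-x)$ partitioned into two cells as a stand-in for a phase-space trajectory; the names block_entropy and traj, the map, and the partition are illustrative choices, not part of the definition above.

    import numpy as np
    from collections import Counter

    def block_entropy(symbols, block_len):
        """Estimate K_n = -sum P ln P over all observed blocks of length
        block_len, where P is the empirical probability of a sequence of
        cell indices i_0, ..., i_n visited at successive sampling times."""
        blocks = [tuple(symbols[i:i + block_len])
                  for i in range(len(symbols) - block_len + 1)]
        counts = Counter(blocks)
        total = sum(counts.values())
        probs = np.array([c / total for c in counts.values()])
        return -np.sum(probs * np.log(probs))

    # Coarse-grain a trajectory of the logistic map x -> 4x(1-x) into two
    # cells: symbol 0 for x < 1/2, symbol 1 for x >= 1/2.
    rng = np.random.default_rng(0)
    x = rng.uniform(0.1, 0.9)
    traj = []
    for _ in range(100_000):
        x = 4.0 * x * (1.0 - x)
        traj.append(0 if x < 0.5 else 1)

    for n in range(1, 6):
        print(n, block_entropy(traj, n))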
The Kolmogorov entropy is then defined by

K \equiv h_K \equiv \lim_{T\to 0} \, \lim_{\epsilon\to 0^+} \, \lim_{N\to\infty} \frac{1}{NT} \sum_{n=0}^{N-1} \left(K_{n+1}-K_n\right).   (2)
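The limits in (2) cannot be taken literally in a numerical experiment, so a common finite approximation estimates $h_K$ from the growth of the block entropies, $h_K \approx (K_{n+1}-K_n)/T$ at fixed $\epsilon$ and $T$. A hedged sketch of that estimate, reusing the illustrative block_entropy helper and traj sequence from the previous example (with the sampling interval T taken as 1):

    def entropy_rate(symbols, n, T=1.0):
        """Finite-n approximation to (2): the extra information per sampling
        interval T needed to predict the next cell, (K_{n+1} - K_n) / T."""
        return (block_entropy(symbols, n + 1) - block_entropy(symbols, n)) / T

    # For the binary-partitioned logistic map at r = 4, the estimates should
    # stay near ln 2 ~ 0.693 until finite-sample effects set in.
    for n in range(1, 8):
        print(n, entropy_rate(traj, n))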
The Kolmogorov entropy is related to the Lyapunov characteristic exponents $\sigma_i$ by

h_K = \int_P \sum_{\sigma_i > 0} \sigma_i \, d\mu,   (3)

where the integral is taken over the phase space $P$ with respect to the invariant measure $\mu$.
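For a one-dimensional map the sum in (3) contains at most one term, so $h_K$ can be estimated by averaging $\ln|f'(x)|$ along an orbit and keeping the result only when it is positive. A minimal sketch for the logistic map $f(x) = r x (1-x)$; the function name lyapunov_logistic and the parameter choices are illustrative.

    import numpy as np

    def lyapunov_logistic(r, n_iter=100_000, n_discard=1_000, x0=0.3):
        """Average ln|f'(x)| along an orbit of f(x) = r x (1 - x), where
        f'(x) = r (1 - 2x).  For a 1-D map this is the single Lyapunov
        exponent sigma; by (3), h_K = sigma if sigma > 0 and h_K = 0 otherwise."""
        x = x0
        for _ in range(n_discard):        # let transients die out
            x = r * x * (1.0 - x)
        acc = 0.0
        for _ in range(n_iter):
            x = r * x * (1.0 - x)
            acc += np.log(abs(r * (1.0 - 2.0 * x)))
        return acc / n_iter

    print(lyapunov_logistic(4.0))   # ~ ln 2 ~ 0.693 > 0: chaotic
    print(lyapunov_logistic(3.2))   # negative: stable period-2 orbit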
The Kolmogorov entropy is 0 for nonchaotic motion and positive for chaotic motion.
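The sketches above show this directly: at $r = 4$ the logistic map's Lyapunov exponent is approximately $\ln 2 > 0$, so both the block-entropy estimate and relation (3) give $h_K \approx \ln 2 > 0$ (chaotic motion), while at $r = 3.2$ the exponent is negative, the orbit is periodic, and $h_K = 0$.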