Entropy rates will be considered as a tool
for quantitative characterization of dynamic
processes evolving in time.
Let $\{y(t)\}$ be a time series, i.e.,
a series of measurements done on a system in consecutive
instants of time $t = 1, 2, \dots$. The time series
can be considered as a realization of a
stochastic process $\{Y(t)\}$, characterized
by the joint probability distribution function
$p(y_1, y_2, \dots, y_n) =
\Pr\{Y(1) = y_1, Y(2) = y_2, \dots, Y(n) = y_n\}$.
The entropy rate of $\{Y(t)\}$ is defined as [1]:
\begin{equation}
h = \lim_{n \to \infty} \frac{1}{n} H(Y_1, Y_2, \dots, Y_n),
\end{equation}
where $H(Y_1, Y_2, \dots, Y_n)$ is the entropy of
the joint distribution $p(y_1, y_2, \dots, y_n)$:
\begin{equation}
H(Y_1, Y_2, \dots, Y_n) =
- \sum_{y_1, \dots, y_n} p(y_1, y_2, \dots, y_n)
\log p(y_1, y_2, \dots, y_n).
\end{equation}
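For a symbolic (discretized) series, the limit in equation (1) can be illustrated by a naive plug-in estimate built from block entropies; a minimal sketch, where the fair-coin test sequence and the function name are our illustrative choices, not part of the text:

```python
import numpy as np
from collections import Counter

def block_entropy(seq, n):
    """Plug-in (histogram) estimate of the joint entropy H(Y_1,...,Y_n),
    in bits, computed from overlapping length-n blocks of a symbolic sequence."""
    blocks = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    total = sum(blocks.values())
    p = np.array(list(blocks.values())) / total
    return -np.sum(p * np.log2(p))

# Fair-coin sequence: the true entropy rate is 1 bit per symbol,
# so H_n / n should stay close to 1 as the block length n grows.
coin = np.random.default_rng(0).integers(0, 2, size=20_000)
for n in (1, 2, 4):
    print(n, block_entropy(coin, n) / n)
```

In practice the plug-in estimate is biased downward once the number of possible blocks becomes comparable to the data length, which is one face of the data-requirement problem discussed below.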
Alternatively, the time series can be
considered as a projection of a trajectory
of a dynamical system, evolving in some measurable
state space.
As a definition of the entropy rate of a dynamical system,
known as the Kolmogorov-Sinai entropy
(KSE) [2, 3, 4],
we can consider equation (1); however, the
variables $Y(t)$ should be understood as $m$-dimensional
variables, according to the dimensionality of the
dynamical system [5].
If the dynamical system evolves in a continuous
measure space, then any entropy depends on the partition
chosen to discretize the space, and the KSE is defined as
a supremum over all finite partitions
[2, 3, 4].
The KSE is a metric (measure-theoretic) invariant, suitable for classification of dynamical systems or their states, and is related to the sum of the system's positive Lyapunov exponents (LE) according to the theorem of Pesin [6].
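Pesin's identity can be checked numerically on a toy example; a minimal sketch, assuming the fully chaotic logistic map $x \mapsto 4x(1-x)$ (our choice, not from the text), whose single positive LE, and hence whose KSE, equals $\ln 2$:

```python
import numpy as np

def lyapunov_logistic(r=4.0, x0=0.3, n=100_000, burn=1_000):
    """Time average of log|f'(x)| along an orbit of the logistic map
    f(x) = r x (1 - x); for r = 4 this converges to ln 2."""
    x = x0
    for _ in range(burn):                         # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += np.log(abs(r * (1.0 - 2.0 * x)))   # log |f'(x)|
        x = r * x * (1.0 - x)
    return acc / n

print(lyapunov_logistic())  # close to ln 2 ~ 0.693
```

Since the map has only one positive exponent, this time average is also the Pesin estimate of its KSE (in nats per iteration).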
A number of algorithms
(see, e.g., [7, 8, 9, 10]
and references therein)
have been proposed for
estimation of the KSE from time series.
The reliability of these estimates, however,
is
limited [11]
by the available amount of data, by finite-precision
measurements, and by the noise always present in experimental data.
No general approach to estimating
the entropy rates
of stochastic processes has been established,
except for simple cases
such as finite-state
Markov chains [1].
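For a finite-state Markov chain the entropy rate has the closed form $h = -\sum_i \pi_i \sum_j P_{ij} \log P_{ij}$, where $\pi$ is the stationary distribution [1]; a minimal sketch, with an arbitrary two-state transition matrix of our own choosing:

```python
import numpy as np

# Transition matrix of a two-state Markov chain (illustrative example).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Entropy rate: stationary average of the row entropies of P (in bits).
h = -np.sum(pi[:, None] * P * np.log2(P))
print(h)
```

For this matrix $\pi = (0.8, 0.2)$ and the rate evaluates to about 0.57 bits per step.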
However,
if $\{Y(t)\}$ is a zero-mean stationary Gaussian process
with spectral density function
$\Phi(\omega)$, its
entropy rate $h$,
apart from a constant term,
can be expressed using $\Phi(\omega)$
as [12, 13, 14]:
\begin{equation}
h = \frac{1}{4\pi} \int_{-\pi}^{\pi} \log \Phi(\omega) \, d\omega.
\end{equation}
The dynamics of a stationary Gaussian process
is fully described by its spectrum; therefore
the connection (3) between the entropy rate
of such a process and its spectral density
is understandable. The estimation of the entropy rate
of a Gaussian process reduces to the estimation
of its spectrum.
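A sketch of this reduction, with an AR(1) test signal and Welch's spectral estimator as our illustrative choices: up to an additive constant, the entropy rate is the average of the log of the estimated spectral density. By the Kolmogorov-Szegő theorem, an AR(1) process driven by unit-variance noise has the same value as unit-variance white noise itself, which the script can be used to check.

```python
import numpy as np
from scipy.signal import lfilter, welch

rng = np.random.default_rng(1)
noise = rng.standard_normal(2**16)
x = lfilter([1.0], [1.0, -0.5], noise)      # AR(1): x_t = 0.5 x_{t-1} + noise

def entropy_rate_spectral(y, nperseg=1024):
    """Entropy rate of a Gaussian process, up to an additive constant:
    average of log of the one-sided Welch spectral density estimate."""
    f, Pyy = welch(y, nperseg=nperseg)
    return 0.5 * np.mean(np.log(Pyy))       # ~ (1/4pi) * int log Phi dw

# The two values agree (the AR(1) filter does not change the Szego integral).
print(entropy_rate_spectral(x), entropy_rate_spectral(noise))
```

The estimator-dependent constants (spectral normalization, the bias of the log of an averaged periodogram) are the same for both signals, so they cancel in the comparison.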
If a studied time series was generated by
a nonlinear, possibly chaotic, dynamical system,
its description in terms of a spectral density
is not sufficient. Indeed,
realizations of isospectral Gaussian
processes are used in the surrogate-data based tests
in order to discern nonlinear (possibly chaotic)
processes from colored noises [15, 16].
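The construction of such isospectral realizations can be sketched with the standard phase-randomization (FT) surrogate: keep the Fourier amplitudes of the data and randomize the phases. The function name and the random-walk test signal are our choices, not from the text.

```python
import numpy as np

def ft_surrogate(x, rng=None):
    """Phase-randomized surrogate: same Fourier amplitudes (hence the same
    periodogram) as x, with phases drawn uniformly at random."""
    rng = np.random.default_rng(rng)
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                      # keep the DC component real
    if len(x) % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist component real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

x = np.cumsum(np.random.default_rng(2).standard_normal(1024))
s = ft_surrogate(x, rng=3)
# The surrogate shares the Fourier amplitudes of x:
print(np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x))))
```

Since a stationary Gaussian process is fully determined by its spectrum, any nonlinear statistic that differs significantly between the data and an ensemble of such surrogates indicates structure beyond a colored noise.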
On the other hand, there are results indicating that
some characteristic properties of nonlinear
dynamical systems may be ``projected'' into
their ``linear properties'', i.e., into spectra,
or equivalently,
into autocorrelation functions:
Sigeti [17] has demonstrated that
there may be a relation between the sum of
positive Lyapunov exponents
(KSE)
of a chaotic dynamical system
and the decay coefficient characterizing the
exponential decay at high frequencies of spectra
estimated from time series generated by the dynamical
system. The asymptotic decay of the autocorrelation
functions of such time series is governed
by the second eigenvalue of the Perron-Frobenius
operator of the dynamical system [18, 19].
Lipton & Dabke [20] have also investigated the
asymptotic decay of spectra in relation to properties
of the underlying dynamical systems.