Entropy analysis of stochastic processes at finite resolution
|Article code|Publication year|English article|Persian translation|Word count|
|21785|2005|8-page PDF|available to order|2,680 words|
Publisher: Elsevier - Science Direct
Journal : Physica A: Statistical Mechanics and its Applications, Volume 357, Issue 1, 1 November 2005, Pages 71–78
The time evolution of complex systems can usually be described through stochastic processes. These processes are measured at finite resolution, which necessarily reduces them to finite sequences of real numbers. In order to relate these data sets to realizations of the original stochastic processes (indeed, to any functions), it is necessary to choose an interpolation space (for example, the space of band-limited functions). Clearly, this choice is crucial if the intent is to approximate the original processes optimally inside the interval of measurement. Here, we argue that discrete wavelets are suitable to this end. The wavelet approximations of stochastic processes allow us to define an entropy measure for the order–disorder balance of evolution regimes of complex systems, where order is understood as confinement of energy in simple local modes. We calculate exact results for fractional Brownian motion (fBm), with application to the Kolmogorov K41 theory of fully developed turbulence.
Many physical systems investigated at present have a complex evolution in time. Frequently, the main information we can obtain on their dynamics comes from time series of noisy appearance. These series are samples, at finite resolution, of underlying stochastic processes, whose properties are better investigated through a multi-resolution approach, because the realizations of these stochastic processes are everywhere-singular functions. This is, for example, the case of 1/f^α noises from self-organized systems. Thus, singularities are common, and should be interpreted as details that influence the function's variation at all scales. In this paper we propose a method for entropy analysis of arbitrary complex time series that takes advantage of this theoretical standpoint. Moreover, we account for the fact that measurements are made at finite resolution, considering the consequences of sampling, which is not fully accomplished in previous approaches. In general, other formulations are strongly influenced by information-theoretic arguments applied to the analysis of chaotic behavior. We see difficulties in two main aspects. First, the notion of complexity employed in these formulations is based on entropy rates, such as the Kolmogorov–Sinai entropy, which measures the degree to which information on initial conditions is lost as the system evolves. This gives a scale of complexity ranging from zero (non-chaotic deterministic case) to infinity (stochastic case); on this scale, finite values quantify the complexity of deterministic chaos. As a consequence, the problem of defining a proper complexity estimator for stochastic processes is replaced by the statistical investigation of chaotic deterministic dynamical systems. Second, these formulations do not consider the relationship between scaling and disorder. Our approach is tailored to face these difficulties.
Here, we begin by assuming that the stochastic processes are supported on the real axis, and measured at a discrete set of points with a sampling interval τ, during a time T. The resolution of the measurement is N = T/τ, and for convenience we set T = 1 and τ = 1/N = 2^{-J} (we will work in these units). Such a measurement results in a loss of information, which depends on two factors: the resolution, and the interpolation space onto which the stochastic processes will be projected. The idea is that the sampled values alone say almost nothing about a function; much more information is conveyed by the hypothesis on how the function varies between the sampled values. This regularity hypothesis, usually implicit when we draw smooth curves between data points, is an essential element of the theory. Without it, no information found in the discrete and finite data sets can be attributed to the underlying model, which is assumed to hold on the continuous support. Now, in the time-scale (time-frequency) domain, it makes sense to search for a representation (i) that is minimally affected by the end singularities created by truncation, (ii) that provides an interpolation based on a multi-scale approximation of the actual singularities, and (iii) that treats the resolution N as a direct experimental parameter. The first and second requirements are the most crucial, and establish the way in which the energy (the squared L²-norm) is assumed to be distributed inside the interval of measurement, so that this distribution corresponds to the singularity structure of the process, seen at finite resolution. As a supplementary condition, it is best if the algorithmic complexity grows only linearly with the length of the time series. These requirements are met if we project the stochastic processes onto a discrete wavelet space.
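The projection described above can be sketched with the simplest discrete wavelet, the Haar basis. The helper below (`haar_dwt_energies`, a name chosen here for illustration) decomposes a length-2^J sample vector and collects the detail energy at each dyadic scale; this excerpt does not specify which wavelet family the paper actually uses, so this is a minimal sketch, not the authors' implementation.

```python
def haar_dwt_energies(samples):
    """Per-scale detail energies of a length-2^J signal (T = 1, tau = 2^-J).

    Returns [E_1, ..., E_J], finest scale first. A sketch using the
    orthonormal Haar wavelet; the paper's wavelet choice may differ.
    """
    approx = list(samples)
    energies = []
    s2 = 2.0 ** 0.5
    while len(approx) > 1:
        half = len(approx) // 2
        # One Haar analysis step: orthonormal averages and differences.
        avg = [(approx[2 * i] + approx[2 * i + 1]) / s2 for i in range(half)]
        det = [(approx[2 * i] - approx[2 * i + 1]) / s2 for i in range(half)]
        energies.append(sum(d * d for d in det))  # energy at this scale
        approx = avg
    return energies
```

Because the Haar transform is orthonormal, the detail energies plus the squared final approximation coefficient recover the total energy of the samples (Parseval), matching the text's reading of energy as the squared L²-norm.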
Conclusion
In conclusion, we propose that the entropy measure is suitable for estimating the complexity of time series by revealing the degree to which detail modes are excited in a process. These detail modes are easily represented as localized oscillations, or wavelets, which in turn are very effective mathematical instruments for time-scale analysis. Considering phenomena such as fluid or scalar turbulence, characterized by fluctuating cascades from high inertial scales to low dissipative scales, or plasma turbulence, in which some scales are most important for the energy transfer, the proposed entropy seems a natural measure of the intrinsic order–disorder balance. In such phenomena, many structures (vortices, convective cells) appear at intermediate scales, and should be detected as excitations of intermediate-scale modes, lowering the entropy as compared with noises that are completely determined by microscopic dissipation (whose characteristics are indicated in Fig. 1). For example, if a theory with the same correlations as the Kolmogorov K41 theory is valid, implying H = 1/3, the time series of velocity increments would lead to the entropy S(1/3, ∞) = 4.27…, a significantly small number compared to the case in Fig. 1. What we learn from the fBm analysis presented here suggests that the measurement of the entropy is a strong method of characterization. Furthermore, it is interesting to obtain experimental curves of entropy for stochastic processes depending on parameters, or in situations where the entropy may change with time. This method has given results (which will be published soon) in three situations: in the analysis of lung sounds, in the analysis of heart-beat series, and in the analysis of global positioning system errors caused by equatorial spread F in the ionospheric plasma.
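The paper's exact entropy S(H, N) is not reproduced in this excerpt. As a hedged illustration of the idea, the sketch below computes a standard Shannon wavelet entropy over normalized per-scale energies, together with a toy fBm-like scaling E_j ∝ 2^{-j(2H+1)} for the per-scale energies; both function names and that scaling convention are assumptions made here for illustration, not the authors' definitions.

```python
import math

def wavelet_entropy(energies):
    """Shannon entropy (bits) of the normalized per-scale energy distribution.

    A standard 'wavelet entropy'; whether it coincides with the paper's
    S(H, N) is an assumption of this sketch.
    """
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log2(p) for p in probs)

def fbm_scale_energies(H, J):
    """Toy fBm-like per-scale detail energies for Hurst exponent H.

    Assumes E_j ~ 2^{-j(2H+1)} across J dyadic scales; conventions for
    this exponent vary in the literature.
    """
    return [2.0 ** (-j * (2.0 * H + 1.0)) for j in range(1, J + 1)]
```

With this convention, a larger Hurst exponent concentrates energy in the coarsest modes and lowers the entropy, consistent with the text's reading of order as confinement of energy in a few simple local modes.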
We observe that such entropy measurements could give important information on the non-equilibrium approach to self-organized regimes, as well as on self-organized regimes made unstable, for example, by diffusion.