A vast variety of approaches to extracting quantitative features from the EEG signal have been introduced over more than 70 years of electroencephalography. As for any signal, it seems promising to elaborate a mathematical model of the EEG. However, mathematical models (Nunez 1995; Freeman 1992) and physiological findings linking the EEG to the electrical activities of single nerve cells (Eckhorn et al. 1988; Gray et al. 1989) remain problematic, and no single model of EEG dynamics has yet achieved the goal of integrating the wide variety of properties of the observed EEG and of single-cell activities (Wright & Liley 1995). Successful attempts have been limited to autoregressive modelling of short EEG segments (for a review see Kaipio & Karjalainen 1997). Further significant progress in this direction can hardly be expected, because EEG dynamics depends on brain activities underlying many types of information processing, driven by constantly renewed internal and external information; stationary dynamic equations therefore cannot adequately describe the EEG signal.
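To make the short-segment autoregressive approach concrete, the sketch below fits an AR model to a brief synthetic "segment" by solving the Yule--Walker equations. This is a minimal illustration of the general technique, not the procedure of the cited works; the sampling rate, model order, and test signal are assumptions chosen for the example.

```python
import numpy as np

def ar_fit_yule_walker(x, order):
    """Fit AR(p) coefficients to a 1-D signal via the Yule-Walker equations."""
    x = x - x.mean()
    n = len(x)
    # Biased autocovariance estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Toeplitz autocovariance matrix R, with R[i, j] = r[|i - j|]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])        # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:])     # innovation (residual) variance
    return a, sigma2

# Toy example: a 2 s "segment" sampled at 128 Hz, a 10 Hz rhythm plus noise
fs = 128
t = np.arange(2 * fs) / fs
segment = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
coefs, noise_var = ar_fit_yule_walker(segment, order=6)
print(coefs, noise_var)
```

Because the model is assumed stationary only within the segment, this kind of fit is meaningful precisely for the short EEG stretches mentioned above.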
The application of non-linear dynamics (deterministic chaos) methods to the description of the EEG has been relatively successful (Jansen 1991; Roeschke et al. 1997; Pritchard & Duke 1992). This theory operates with ensembles of trajectories of dynamical systems rather than with a single trajectory, and uses a probabilistic approach to describe the observed system. However, the methods of non-linear dynamics rest on the hypothesis that the brain's electrical activity can be described by stationary dynamic models, a hypothesis that is unrealistic in many cases.
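For flavour, here is a minimal sketch of one widely used quantity from this toolbox, the correlation sum of a delay-embedded time series (the basis of correlation-dimension estimates such as those used in EEG studies of this kind). The embedding dimension, delay, and test signal are purely illustrative assumptions.

```python
import numpy as np

def correlation_sum(x, dim=5, tau=4, radii=None):
    """Correlation sum C(r) for a delay-embedded scalar time series.

    C(r) is the fraction of pairs of embedded points closer than r;
    the slope of log C(r) versus log r estimates the correlation dimension.
    """
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    # All pairwise distances (fine for short series; O(n^2) memory)
    d = np.sqrt(((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1))
    pairs = d[np.triu_indices(n, k=1)]
    if radii is None:
        radii = np.logspace(np.log10(pairs[pairs > 0].min()),
                            np.log10(pairs.max()), 20)
    return radii, np.array([(pairs < r).mean() for r in radii])

# Example: white noise as a placeholder for an EEG segment. For noise the
# slope approaches the embedding dimension; a low-dimensional deterministic
# signal would give a smaller value. For brevity the slope is fitted over
# the whole range rather than a chosen scaling region.
x = np.random.randn(1200)
r, C = correlation_sum(x, dim=5, tau=4)
mask = C > 0
slope = np.polyfit(np.log(r[mask]), np.log(C[mask]), 1)[0]
print("correlation-dimension estimate:", slope)
```

Note that such estimates implicitly assume a stationary underlying system, which is exactly the assumption questioned above.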
One way or another, all approaches to the description of the EEG use probabilistic concepts. Statistical approaches therefore remain, to date, the most feasible and theoretically satisfactory methodology for the quantitative analysis of the EEG signal.
Early in the history of electroencephalography, the demand for quantitative estimation of the EEG signal raised the reasonable question of its statistical nature. Norbert Wiener proposed treating the EEG as a stochastic signal, by analogy with the output characteristics of any complex system (Wiener 1961). At that stage of EEG exploration it was thought that the main laws of the dynamics of the total EEG signal could be studied through its probabilistic-statistical estimates, irrespective of the real biophysical origin of cortical electrical processes (Lopes da Silva 1981). As a result, a considerable body of work appeared concerning the stochastic properties of the EEG signal. The main conclusion was that the EEG can indeed be described by basic stochastic concepts (in other words, by probability distributions), but only over rather short realizations, usually no longer than 10--20 s, because the EEG turned out to be an extremely non-stationary process. The variability of the power of the main spectral EEG components across successive short-term (5--10 s) segments, for example, ranged up to 50--100% (Oken & Chiappa 1988). It became clear that routine statistical characteristics could be computed for the EEG only after its prior segmentation into relatively stationary intervals, which in turn required techniques for detecting the boundaries between stationary segments in the EEG signal. The first positive findings along this line not only pointed the way to more correct estimation of the EEG's statistical properties but, more importantly, laid the foundation for a fundamentally novel understanding of the EEG temporal structure as a piecewise stationary process (Bodenstein & Praetorius 1977).
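The scale of this non-stationarity is easy to demonstrate numerically. The sketch below computes alpha-band power over successive 5 s segments and reports the relative range of the segment powers; the frequency band, sampling rate, and synthetic signal (a rhythm with slowly drifting amplitude) are assumptions chosen only to illustrate the kind of variability described above, not a reanalysis of the cited data.

```python
import numpy as np

def band_power(segment, fs, band):
    """Power of `segment` within a frequency band, via the periodogram."""
    freqs = np.fft.rfftfreq(len(segment), d=1 / fs)
    psd = np.abs(np.fft.rfft(segment)) ** 2 / len(segment)
    lo, hi = band
    return psd[(freqs >= lo) & (freqs < hi)].sum()

def segment_power_variability(x, fs, seg_sec=5, band=(8, 12)):
    """Band power over successive non-overlapping segments, plus its relative range."""
    seg_len = int(seg_sec * fs)
    n_seg = len(x) // seg_len
    powers = np.array([band_power(x[i * seg_len:(i + 1) * seg_len], fs, band)
                       for i in range(n_seg)])
    spread = (powers.max() - powers.min()) / powers.mean()  # relative range
    return powers, spread

# Toy example: 60 s of a 10 Hz rhythm whose amplitude drifts slowly, plus noise
fs = 128
t = np.arange(60 * fs) / fs
amp = 1 + 0.5 * np.sin(2 * np.pi * t / 30)
x = amp * np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
powers, spread = segment_power_variability(x, fs)
print(powers.round(1), f"relative range: {spread:.0%}")
```

Fixed-length windows like these cut across the true quasi-stationary segments, which is why adaptive detection of segment boundaries, the change-point approach developed below, is needed.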