Approximate time needed to complete the tutorial: 60 min.
The detrended fluctuation analysis (DFA) algorithm is a scaling analysis method used to estimate long-range temporal correlations of power-law form (Peng et al., 1995; Hardstone et al., 2012). In other words, if a sequence of events has a non-random temporal structure with slowly decaying autocorrelations, DFA can quantify how slowly these correlations decay, as indexed by the DFA power-law exponent. Typical classes of correlations are reflected in DFA exponents α of:
Uncorrelated sequence: α ~ 0.5
Anti-correlated sequence: 0 < α < 0.5
Long-range temporal correlations: 0.5 < α < 1
Strong correlations that are not of a power-law form: α > 1
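To make the procedure concrete, the core of the algorithm can be sketched in a few lines of Matlab. This is an illustrative, simplified sketch only; NBT's implementation adds logarithmic binning, overlapping windows, and other refinements:

```matlab
x = randn(10000, 1);                  % uncorrelated test signal
y = cumsum(x - mean(x));              % signal profile (cumulative sum)
winSizes = round(logspace(1, 3, 15)); % window sizes, 10 to 1000 samples
F = zeros(size(winSizes));
for i = 1:numel(winSizes)
    n = winSizes(i);
    nWin = floor(length(y) / n);      % number of non-overlapping windows
    rms_w = zeros(nWin, 1);
    for w = 1:nWin
        seg = y((w-1)*n+1 : w*n);
        t = (1:n)';
        p = polyfit(t, seg, 1);       % linear detrending per window
        rms_w(w) = sqrt(mean((seg - polyval(p, t)).^2));
    end
    F(i) = mean(rms_w);               % fluctuation function F(n)
end
coeff = polyfit(log10(winSizes), log10(F), 1);
alpha = coeff(1);                     % slope of log F vs. log n; ~0.5 for white noise
```

The DFA exponent is the slope of the fluctuation function in double-logarithmic coordinates.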
The interest in long-range temporal correlations (LRTC) analysis is rooted in empirical as well as modeling results indicating that the dynamics of self-organizing multi-unit systems are characterized by fluctuations that tend to appear in a specific temporal ordering, as indexed by power-law decaying (i.e., very slowly decaying) correlations. That a system is “self-organizing” means that, except for some fairly slow stochastic drive, most of the fluctuations are endogenous, as opposed to being determined by strong external inputs. The physical background for the phenomenon of temporal autocorrelations that persist over many time scales is that activity makes changes to the internal structure of these systems, and this structure in turn biases when and where future activity can flow (Poil et al., 2012). You may think of it as water flowing down a mountain slope, carving a network of rivers that is ever changing but also sufficiently stable that it strongly influences where water can flow from day to day, month to month, and year to year. In other words, the dynamical spatial structures in complex self-organizing systems are the substrate of the memory that ensures a degree of correlation in activity over time.
The interpretation of DFA crucially depends on the pre-processing of the data you analyze. In our case, the analysis of human brain waves (e.g., measured using electroencephalography (EEG) or magnetoencephalography (MEG)), the preprocessing consists of extracting the amplitude envelope of distinct oscillations, but it could equally well be, e.g., the phase-locking of oscillations in different brain areas as computed in a sliding window. In the latter case, DFA applied to the time series of synchronization values would provide information about the possibly non-random temporal order of synchronization between the two brain regions (Montez et al., 2006).
In this tutorial, you will explore the idea of ongoing neuronal oscillations holding a memory of their own dynamics on fairly long time scales, i.e., several orders of magnitude longer than the period of the oscillatory process. Moreover, we will investigate the influence of signal-to-noise ratio on the estimated correlations in simulated data.
Here we show how to perform detrended fluctuation analysis in Matlab using NBT.
In this test case we assume that we want to compute the DFA in the alpha frequency band (8–13 Hz), with a sampling frequency of 250 Hz. If you do not have any data, you can simply generate a signal using random numbers (in the Matlab command window):
Signal = randn(300*250,23);
This will generate a 5-minute (i.e., 300-second) signal with 23 channels.
NBT has several functions to help remove artifacts; we will not go into detail here. We would, however, like to emphasize that large-amplitude transient artifacts strongly influence the temporal structure of the signal and are therefore better cut away. Importantly, the time series should be stitched together after cutting the artifact segments out. Also note that signals in NBT have time along the 1st dimension and channels along the 2nd dimension of the signal matrix (see the example above).
First, we use nbt_GetAmplitudeEnvelope to filter the signal and extract the amplitude envelope; the function definition is [Signal, SignalInfo] = nbt_GetAmplitudeEnvelope(Signal, SignalInfo, hp, lp, filter_order).
SignalInfo = nbt_Info; % this initializes an Info object
SignalInfo.converted_sample_frequency = 250; % this sets the sampling frequency to 250 Hz
AmplitudeEnvelope = nbt_GetAmplitudeEnvelope(Signal, SignalInfo, 8, 13, 2/8); % extract the amplitude envelope using an FIR filter and the Hilbert transform
The function nbt_Info defines an NBT info object, which contains information about your signal. Note the last parameter, 2/8. This is the filter order (in seconds), which we set such that at least two cycles of the lowest passband frequency (8 Hz) are covered by the filter window.
Hereafter, it is easy to calculate the DFA exponents. From the function definition [DFAobject,DFA_exp] = nbt_doDFA(Signal, InfoObject, FitInterval, CalcInterval, DFA_Overlap, DFA_Plot, ChannelToPlot, res_logbin);, we see that we can find the DFA exponent and plot the fluctuation function by:
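For example (the FitInterval of [1 20] s and CalcInterval of [0.8 25] s below are illustrative choices, not values prescribed by NBT):

```matlab
% Compute DFA on the amplitude envelope and plot the fluctuation function of channel 1
[DFAobject, DFA_exp] = nbt_doDFA(AmplitudeEnvelope, SignalInfo, [1 20], [0.8 25], 0.5, 1, 1, 10);
```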
The first two parameters, Signal and SignalInfo, are the signal and the associated information about it (see above). The next two parameters, FitInterval and CalcInterval, determine over which time range the power law is fitted and over which range the fluctuation function is calculated. DFA_Overlap specifies how much the windows overlap (see above), in this case 50%. The plotting parameters DFA_Plot and ChannelToPlot should be self-explanatory. The last parameter, res_logbin, is the resolution of the logarithmic binning, which is 10 by default.
After the calculations are finished you can find the DFA values for all channels in the MarkerValues field.
Let us test that DFA yields a power-law exponent of α ~ 0.5 for random data. We define a “sampling frequency” of 300 Hz in order to interpret the time scales in units of seconds, and generate “20 min of random activity”.
Fs = 300;
Signal = randn(20*60*Fs,1);
Before computing the DFA fluctuation function 'DFA_y' and how it scales with increasing window sizes, we have to define a time range of interest. DFA is not accurate for very short windows, nor for very long windows, because of statistical variability. As a rule of thumb, you should only estimate correlations up to time scales of 10% of the amount of data available.
Note, the computation of 'nbt_doDFA.m' may take tens of seconds to complete!
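A possible call for this test (the intervals are illustrative; with 20 min of data, the rule of thumb above suggests fitting no further than ~120 s):

```matlab
SignalInfo = nbt_Info;                        % info object for the random signal
SignalInfo.converted_sample_frequency = Fs;   % 300 Hz, defined above
% Fit up to 120 s (10% of 20 min); calculate somewhat beyond the fit range
[DFAobject, DFA_exp] = nbt_doDFA(Signal, SignalInfo, [1 120], [0.5 150], 0.5, 1, 1, 10);
```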
Long-range temporal correlations imply that there are modulations of the fluctuations not only on the short time scales that tend to catch the attention of our visual system, but also on long time scales. Low-frequency fluctuations, however, are usually removed at data acquisition using a high-pass filter of around 0.1–1 Hz. Besides, neuronal systems typically do not generate much low-frequency activity compared to the amount of low-frequency noise in the laboratory. Therefore, we cannot use the “raw” data from the acquisition for LRTC analysis and need to pre-process the data before searching for correlations. Since we are interested in distinct oscillations, we use a bandpass filter and the Hilbert transform to extract the amplitude envelope of the narrow-band oscillations. The difference between analyzing modulations of the original signal versus the amplitude envelope of a filtered signal is explained in the figure below.
The frequency content of broadband signals and of the amplitude envelope of narrow-frequency-band neural oscillations. A, A signal from a single MEG channel (0.1–100 Hz) is shown at two time scales. Note from the upper trace that the high-amplitude 10-Hz oscillations are riding on slow fluctuations (~1 Hz). B, The slow fluctuations are unlikely to be of neural origin, as may be inferred from the power spectra showing the spectral density of the entire 20-min signal in A (thick line) and reference data from the same channel (thin line, see Materials and Methods). The neural signals dominate above but not below 1 Hz, with prominent peaks at 10 and 20 Hz (see arrows). C, The signal shown in A has been filtered at 10 Hz with a Morlet wavelet (passband 6.7–13.3 Hz). The thin lines are the real part and the amplitude envelopes (thick lines) are the modulus of the wavelet transform. D, The power spectrum of the amplitude envelope of the signals at 10 Hz exhibits a 1/f-type of power spectrum (circles) with α = 0.58 in the range from 0.01–1 Hz, thereby indicating that the fluctuations of the amplitude envelope of these oscillations are correlated at time scales of 1–100 s. On the contrary, the 10-Hz amplitude envelope of the reference data gave rise to a white-noise spectrum characteristic of a temporally uncorrelated process (dots). From (Linkenkaer-Hansen et al., 2004).
One should, however, always be cautious that the pre-processing does not introduce unwanted properties into the data. For example, band-pass filtering is based on a weighted sum of samples within a window that is a few oscillation cycles wide; therefore, even a sequence of random numbers is expected to have temporal autocorrelations in its amplitude fluctuations on certain (short) time scales after filtering (!).
To see this, we narrow-band filter the random data at 8–13 Hz (the so-called “alpha-frequency band”) and compute the DFA on the amplitude envelope of the oscillations (i.e., 'abs(hilbert(Data_filtered))'). First we design a so-called FIR (finite impulse response) filter, i.e., a filter that integrates only samples in a limited time window, namely the number of samples indicated by the “order” of the filter. In the filter script ('nbt_filter_fir.m'), the filter order is given in units of seconds (to prevent an influence of sampling frequency). The filter order should be chosen to include at least two cycles of the lowest frequency in the passband, which is 0.25 s for an 8-Hz oscillation:
highpass = 8; % Highpass corner frequency.
lowpass = 13; % Lowpass corner frequency.
fir_order = 0.25; % Filter order in units of seconds.
[Data_filtered] = nbt_filter_fir(Signal, highpass, lowpass, Fs, fir_order);
To see that the filter worked as expected, plot the 'Data_filtered':
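For example, with a plain Matlab plot (the time axis uses the sampling frequency Fs defined above):

```matlab
t = (1:length(Data_filtered)) / Fs;   % time axis in seconds
figure; plot(t, Data_filtered);
xlabel('Time (s)'); ylabel('Amplitude (a.u.)');
```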
If you zoom in on a shorter time scale you will notice that the signal is oscillatory.
The amplitude envelope of the oscillations (see the figure above) may be determined by taking the absolute value of the so-called Hilbert transform:
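In Matlab:

```matlab
AmplitudeEnvelope = abs(hilbert(Data_filtered));   % modulus of the analytic signal
figure; plot((1:length(Data_filtered))/Fs, [Data_filtered, AmplitudeEnvelope]);
xlabel('Time (s)');   % zoom in to see the envelope tracing the oscillation peaks
```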
You should now see oscillations that fluctuate erratically in amplitude!
As you can see, the DFA fluctuation function no longer increases linearly with increasing window size, which means that the correlations do not decay according to a simple power law with just one exponent for all time scales. In order to find the time scales that are sufficiently long that autocorrelations are not affected by the integration of the filter, we have to find the time range where fitting the power law gives a good fit and a DFA exponent α ~ 0.5.
Re-compute the DFA, fitting first to the short-range time scales, e.g.:
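For example (the fit and calculation intervals below are illustrative choices; SignalInfo and the envelope are re-created here so the snippet is self-contained):

```matlab
SignalInfo = nbt_Info;
SignalInfo.converted_sample_frequency = Fs;
AmplitudeEnvelope = abs(hilbert(Data_filtered));
% First fit the short-range time scales...
[DFAobject, DFA_exp] = nbt_doDFA(AmplitudeEnvelope, SignalInfo, [0.1 1], [0.1 5], 0.5, 1, 1, 10);
% ...and then the longer time scales
[DFAobject, DFA_exp] = nbt_doDFA(AmplitudeEnvelope, SignalInfo, [3 120], [1 150], 0.5, 1, 1, 10);
```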
From these two DFA plots, you can see that at time scales from 100 ms to 1 s, the amplitude fluctuations of the 10-Hz filtered data are strongly correlated. In fact, the exponent is larger than 1, indicating that the correlations are not of a power-law form (which requires 0.5 < α < 1). In the time range of 3–120 s, however, correlations are absent, as evidenced by α ~ 0.5. Depending on the properties of the bandpass filter, one may be able to achieve uncorrelated fluctuations after filtering at shorter time scales than 3 s. Nevertheless, the conclusion from the above is that, for this specific 8–13-Hz filter, it is safe to interpret correlations on time scales longer than 3 s as originating from the original signal, and not as introduced by the filter.
You have just studied the influence of filtering on autocorrelations. Let's turn to real MEG data and quantify the decay of long-range temporal correlations (LRTC) in the alpha-frequency band. Download the sample MEG signal here.
Use the fitting range that you have just identified as having a proper null-hypothesis, i.e., when quantifying correlations in real data, you wish to test the time scales that do not contain correlations induced by the filters.
Long-range temporal correlations require a non-uniformity of fluctuations over time, or a so-called “patchy” time structure.
To see this, compare the time series of the filtered noise and the filtered MEG brain signal using the 'plot' command (you may want to write a 'whos' command to see your workspace variables):
>> … % you should know how to code these commands by now...
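One possible sketch (it assumes the MEG data are in a variable 'RSM' with sampling frequency Fs; adapt the names to your workspace):

```matlab
whos                                                   % inspect workspace variables
[RSM_filtered] = nbt_filter_fir(RSM, 8, 13, Fs, 0.25); % same 8-13 Hz filter as before
figure;
subplot(2,1,1); plot(abs(hilbert(Data_filtered))); title('Filtered noise envelope');
subplot(2,1,2); plot(abs(hilbert(RSM_filtered)));  title('Filtered MEG envelope');
```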
It would be unfortunate if LRTC were already present in the laboratory noise. Check this by analyzing the empty-room recording 'RMS_noise' in the same way you just analyzed the 'RSM' signal.
>> … % you should know how to code these commands by now...
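A possible sketch (the variable name 'RMS_noise', the sampling frequency Fs, and the fitting intervals are assumptions to adapt):

```matlab
[Noise_filtered] = nbt_filter_fir(RMS_noise, 8, 13, Fs, 0.25);
NoiseEnvelope = abs(hilbert(Noise_filtered));
SignalInfo = nbt_Info; SignalInfo.converted_sample_frequency = Fs;
[DFAobject, DFA_exp] = nbt_doDFA(NoiseEnvelope, SignalInfo, [3 120], [1 150], 0.5, 1, 1, 10);
```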
Narrow-band filtered MEG noise fortunately seems to have close to no long-range temporal autocorrelations. This raises another issue, however: If the signal-to-noise ratio is poor, can we then reliably quantify the temporal correlations of the underlying physiological process?
A crude way to elucidate the problem is to “artificially decrease the signal-to-noise ratio” of your neuronal data by adding an amplified version of the empty-room recording on top of the neuronal data and seeing whether it influences the estimated LRTC (make sure you understand the data input!):
>> … % you may be able to do this by now, otherwise ask...
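One possible sketch (the gain factor and variable names are assumptions; adapt them to your data):

```matlab
gain = 5;                               % arbitrary amplification of the noise
RSM_noisy = RSM + gain * RMS_noise;     % assumes equal length and channel count
[Noisy_filtered] = nbt_filter_fir(RSM_noisy, 8, 13, Fs, 0.25);
NoisyEnvelope = abs(hilbert(Noisy_filtered));
SignalInfo = nbt_Info; SignalInfo.converted_sample_frequency = Fs;
[DFAobject, DFA_exp] = nbt_doDFA(NoisyEnvelope, SignalInfo, [3 120], [1 150], 0.5, 1, 1, 10);
```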
What happened to the DFA-exponent compared to the original filtered RSM data without added noise?
This is a problem, because different conditions, animals, or subject groups may have different signal-to-noise ratios and, therefore, a change in LRTC could potentially reflect different oscillation amplitudes rather than a true effect on LRTC. This influence of signal-to-noise ratio on biomarker values, however, is a generic one that also applies to coherence, cross-frequency phase locking, graph analyses, etc.