Autocorrelation


The correlation between a time series and a delayed version of itself.
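As a rough sketch of this definition, the sample autocorrelation at a given lag can be computed as the covariance between the series and its lagged copy, normalised by the variance. The helper below (the name `autocorr` is ours, not a standard API) demonstrates this on a simulated first-order autoregressive series, where adjacent values should be strongly positively correlated:

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag (lag >= 1)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    # Covariance between the series and its lagged copy,
    # normalised by the lag-0 autocovariance (the variance).
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(0)
n = 500
e = rng.standard_normal(n)

# AR(1) process: each value is 0.8 times the previous one plus noise,
# so the lag-1 autocorrelation should come out near 0.8.
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + e[t]

print(autocorr(x, 1))  # typically close to 0.8
print(autocorr(e, 1))  # near zero: the noise itself is uncorrelated
```

For the white-noise input `e`, the same function returns a value near zero, matching the definitions below.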

Time series data: A time series is a series of data points indexed in time order. Understanding time series data is crucial in studying autocorrelation.
Autocorrelation: Autocorrelation is a statistical measurement that measures the degree of similarity between a time series and its lagged version.
Cross-correlation: Cross-correlation is a statistical measurement that determines the similarity between two different time series as a function of the displacement of one relative to the other.
White noise: White noise is a random signal whose power is spread equally across all frequencies. Its autocorrelation is zero at every nonzero lag.
Stationary time series: A stationary time series is one that has constant mean, variance, and autocorrelation over time.
Non-stationary time series: A non-stationary time series is one whose mean and/or variance change over time.
Correlogram: A correlogram is a plot of the autocorrelation function (ACF) and partial autocorrelation function (PACF) versus time lags.
Autoregressive (AR) models: An autoregressive (AR) model is a linear regression model that makes use of lagged values of the response variable as predictors.
Moving average (MA) models: A moving average (MA) model is a regression model that makes use of lagged residuals as predictors.
Autoregressive integrated moving average (ARIMA) models: An autoregressive integrated moving average (ARIMA) model combines the AR and MA models with an integrated component that removes non-stationarity.
Seasonal ARIMA models: Seasonal ARIMA models are used to model and forecast seasonal data.
Vector Autoregression (VAR): Vector Autoregression (VAR) models are multivariate time series models that simultaneously model multiple time series.
Time series decomposition: Time series decomposition involves decomposing a time series into its trend, seasonal, and residual components.
Fourier analysis: Fourier analysis is a method used to transform time-domain data into frequency-domain data, allowing us to understand the frequency structure of time series data.
Wavelet analysis: Wavelet analysis is a method used to analyze time series data at different scales over time.
Positive autocorrelation: It occurs when the values of a time series are positively correlated with their lagged values, meaning that values that occur close in time tend to be similar to each other.
Negative autocorrelation: It occurs when the values of a time series are correlated with their lagged values in a negative manner, meaning that values that occur close in time are dissimilar to each other.
Zero autocorrelation: It occurs when the values of a time series are not correlated with their lagged values, meaning there is no linear relationship between the series and its past, as in white noise.
Partial autocorrelation: It is a measure of the relationship between two values of a time series at a given lag, controlling for the effects of all the intermediate lags between them.
Lag dependence: It occurs when the current value of a time series depends on a previous value at a certain lag, indicating a non-random structure in the data.
Seasonal autocorrelation: It occurs when a time series displays a pattern that repeats itself over a fixed period of time, such as a day, week, month or year.
Spatial autocorrelation: It occurs when the values of a spatial or geographical variable are correlated with their neighboring values, indicating that there is a spatial pattern in the data.
Model-based autocorrelation: It is the autocorrelation that is present in the residuals of a time series model, indicating that the model is not capturing all the information in the data.
"Autocorrelation, sometimes known as serial correlation in the discrete time case, is the correlation of a signal with a delayed copy of itself as a function of delay."
"The analysis of autocorrelation is a mathematical tool for finding repeating patterns, such as the presence of a periodic signal obscured by noise, or identifying the missing fundamental frequency in a signal implied by its harmonic frequencies."
"It is often used in signal processing for analyzing functions or series of values, such as time domain signals."
"Different fields of study define autocorrelation differently, and not all of these definitions are equivalent."
"In some fields, the term is used interchangeably with autocovariance."
"Unit root processes, trend-stationary processes, autoregressive processes, and moving average processes are specific forms of processes with autocorrelation."
"Informally, it is the similarity between observations of a random variable as a function of the time lag between them."
"Identifying the missing fundamental frequency in a signal implied by its harmonic frequencies."
"Autocorrelation is a mathematical tool for finding repeating patterns."
"The analysis of autocorrelation can help reveal the presence of a periodic signal obscured by noise."
"The similarity between observations of a random variable."
"The analysis of autocorrelation is a mathematical tool for finding repeating patterns, such as the presence of a periodic signal obscured by noise."
"Identifying the missing fundamental frequency in a signal implied by its harmonic frequencies."
"Signal processing."
"In some fields, the term is used interchangeably with autocovariance."
"It is often used in signal processing for analyzing functions or series of values, such as time domain signals."
"Unit root processes, trend-stationary processes, autoregressive processes, and moving average processes."
"Autocorrelation, sometimes known as serial correlation in the discrete time case, is the correlation of a signal with a delayed copy of itself as a function of delay."
"Repeating patterns, such as the presence of a periodic signal obscured by noise."
"Unit root processes, trend-stationary processes, autoregressive processes, and moving average processes."