Long Memory in the Volatility of Indian Financial Market: An Empirical Analysis Based on Indian Data
©2014
Textbook
100 Pages
Summary
This book examines the long memory characteristics in the volatility of the Indian stock market, the Indian exchange rates and the Indian banking sector. It also reviews the chain of approaches used to estimate the long memory parameter. The long memory characteristics of financial time series are widely studied and have implications for various theories in economics and finance. The most important financial implication relates to the violation of the weak form of market efficiency, which encourages traders, investors and portfolio managers to develop models for making predictions and to construct and implement speculative trading and investment strategies. In an efficient market, the price of an asset should follow a random walk process in which a price change is unaffected by its lagged price changes and has no memory.
Excerpt
Table Of Contents
List of Tables
Descriptive Statistics of Stock Returns ... 28
Results of Local Whittle test for estimation of parameter d. ... 36
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for S&P CNX Nifty under Gaussian distribution ... 38
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX 100 under Gaussian distribution ... 39
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX 500 under Gaussian distribution ... 40
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX Nifty Junior under Gaussian distribution ... 41
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX Midcap under Gaussian distribution ... 42
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX Smallcap under Gaussian distribution ... 43
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for S&P CNX Nifty under Student-t distribution ... 44
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX 100 under Student-t distribution ... 45
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX 500 under Student-t distribution ... 46
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX Nifty Junior under Student-t distribution ... 47
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX Midcap under Student-t distribution ... 48
Maximum likelihood parameter estimates of GARCH, IGARCH and FIGARCH
models for CNX Smallcap under Student-t distribution ... 49
Standard deviation and mean absolute error for all the tests for varying sample size. ... 55
Descriptive Statistics of INR exchange rates ... 59
Results of Aggregate Variance method test for estimation of Hurst exponent H. ... 66
Results of GPH test for estimation of Hurst exponent H. ... 67
Descriptive statistics of stock returns ... 70
The Hurst exponent for the sub-sample periods for various volatility proxies... 72
Ranking the market efficiency during the sub-periods and the overall period based on
the estimated Hurst exponents ... 73
Estimation results and diagnostics for the pre-crisis period ... 75
Estimation results and diagnostics for the crisis period ... 76
Estimation results and diagnostics for the post-crisis period ... 77
Estimation results and diagnostics for the whole period ... 78
Estimation results for the FIGARCH model ... 80
Estimation results for the FIAPARCH model ... 81
List of Figures
Time plots of returns and prices data series for all the indices. ... 31
ACF (Autocorrelation function) plots for all the indices. Horizontal dashed lines represent ±1.96(1/T)^0.5 ... 34
Plots of the mean value and 95% confidence band (2.5% and 97.5% quantiles) of the
estimated Hurst exponents for the Gaussian white noise and the fractional Gaussian
noise (for H = 0.4, 0.5 and 0.6) for aggregated variance method (AVM) and Geweke
and Porter-Hudak (GPH) approaches. ... 53
Comparison of the estimated power-law exponents for AVM and GPH approaches with ideal (H_in = H_out) values. ... 57
Price and Return plots for all Indian exchange rates. ... 62
ACF (Autocorrelation function) plots for all the exchange rates. Horizontal dashed lines represent ±1.96(1/T)^0.5. ... 64
Time plots of returns and prices data series ... 71
Chapter 1: Introduction
1.1. Volatility and long memory
Volatility is considered to be an important ingredient of quantitative finance, and a plethora of literature exists on estimating, modeling and forecasting volatility. If we want to
estimate daily volatility using daily closing prices, the most widely used estimators are the
demeaned squared daily returns and the demeaned absolute daily returns. But these estimates
of volatility are very noisy, inefficient and biased in nature. Another way of estimating
volatility more precisely is to use intraday high frequency data. However, in many cases, high
frequency data is not available at all, or sometimes it is available only over shorter intervals.
High frequency data is generally very expensive and requires considerable computational
resources for analysis. High frequency data also suffers from market microstructure issues
which makes volatility estimation using high frequency data highly complex. Several studies
have highlighted the importance of volatility estimators that utilize the opening, high, low
and closing prices of an asset because they give rise to much more efficient estimates of
volatility compared to volatility estimated using conventional return data. The opening, high,
low and closing prices are also readily available for most of the tradable assets and indices in
financial markets and potentially contain more information for estimating volatility when
compared to the close-to-close return data that is conventionally used.
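As a rough illustration of the two conventional estimators mentioned above, the following sketch (a minimal example using simulated closing prices rather than real market data) computes the demeaned squared and demeaned absolute daily return proxies for volatility:

```python
import numpy as np

def volatility_proxies(close):
    """Compute the two classic daily volatility proxies from closing prices:
    demeaned squared returns and demeaned absolute returns."""
    close = np.asarray(close, dtype=float)
    r = np.diff(np.log(close))          # daily log returns
    r_demeaned = r - r.mean()           # remove the sample mean
    squared = r_demeaned ** 2           # squared-return proxy
    absolute = np.abs(r_demeaned)       # absolute-return proxy
    return squared, absolute

# Usage with simulated prices (a real series would come from market data)
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(500)))
sq, ab = volatility_proxies(prices)
print(sq.mean(), ab.mean())
```

As the text notes, both proxies are noisy estimates of the latent daily volatility; they are the standard inputs to the long memory tests discussed later.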
The Indian stock market has grown rapidly and significantly in the last decade and provides
enormous opportunities to investors seeking high returns. Now, the Indian equity markets
have global presence and have attracted the attention of global investors. Hence, it is essential
to study the behavior of the volatility of returns from the Indian stock market. Long memory
in the volatility of stock returns is one of the important stylized facts to consider. This has
triggered the interests of researchers, market participants, practitioners and regulators to
unravel the behavior of the equity markets. Being an emerging market, the Indian markets are
characterized by high volatility, thin trading, high returns and various frictions. High
volatility in the returns, in one sense, highlights the vulnerability of the markets and hence the
respective economy.
The fluctuation in exchange rates can significantly impact the returns of an asset in foreign
currency. The Indian exchange rate market has grown significantly in the last decade and
provides enormous opportunities to investors. Like in other emerging markets, foreign
investors also face higher risk when investing in India. So, to earn higher returns, it is
essential to analyze the behavior of the volatility of Indian exchange rates relative to liquid
currencies like US dollar, GBP, Euro and Japanese yen to develop meaningful investment
and trading strategies and to mitigate the associated risks. The Indian banking sector has also
experienced significant growth in the last decade and has become an important investment
target, by providing enormous investment opportunities to investors and portfolio managers.
Hence, it is essential to study the behavior of the volatility of returns from the Indian banking
sector.
Volatility is an important input to various finance applications which includes portfolio
selection and allocation, derivatives pricing, futures hedging, risk management, implementing
trading strategies, asset pricing and asset allocation. It is to be noted that volatility of a market
is not directly observable and hence the literature on volatility is devoted to procedures to
extract volatility from the observable data. This has resulted in the development of various
volatility estimators and models to measure and forecast volatility.
Market inefficiency refers to the fact that the market does not react immediately as new
information flows in but responds to it gradually over a period of time. The violation of
efficient market hypothesis supports the presence of persistence (long memory) or anti-
persistence (mean reversion) in the stock market. In economics and finance, whether or not
asset prices display long-range dependence is still an important area to explore in research
because of its importance for capital market theories. The analysis related to the long memory
property is realized through the estimation of the power-law scaling exponent or the Hurst
exponent. The term power-law scaling exponent has its origin in physics (Hurst, 1951) but
also finds application in financial markets (Mandelbrot, 1971, 1997). The subject of detecting
long memory in a given time series was first studied by Hurst (1951), an English hydrologist,
who proposed the concept of the Hurst exponent based on Einstein's contributions to the
Brownian motion in physics to deal with the obstacles related to the reservoir control near the
Nile river dam. Scaling exponents have characteristics that reflect facts having a bearing on
market efficiency. The presence of long memory in the evolution of asset prices describes the
higher-order correlation structure in the series and supports the possibility of predicting its
behaviour in a market setting.
The study of long-range dependence in financial time-series has a long history and has
remained an active topic of research in economics and finance (Mandelbrot (1971), Greene
and Fielitz (1977), Cutland et al. (1995), Baillie et al. (1996)). The analysis related to the long
memory property can be realized through the estimation of the fractional integration
parameter or the Hurst exponent. It has been observed that the squared return, the absolute
return and the logarithm of the squared return of financial assets or exchange rates exhibit
serial correlations that show hyperbolic decay similar to those of an I(d) process (Taylor
(1986)). This persistence has a major impact on the future volatility of the stock markets,
exchange rates and banking sector under the influence of shocks.
Asymmetric long memory in the volatility of stock returns is one of the important areas to
explore in research. The asymmetric response of volatility to news indicates that the falling
prices result in greater volatility than rising prices of the same magnitude. This phenomenon
is known as leverage effect (Black, 1976 and Nelson, 1991). After the pioneering work of
Black (1976), many studies have tried to examine the story behind the asymmetry of
volatility. The literature talks about two possible explanations for the asymmetry in volatility,
the first being operating and financial leverage effect (Black, 1976) and the second being the
volatility feedback effect (Bekaert and Wu, 2000). The EGARCH model (Nelson, 1991),
GJR-GARCH model (Glosten, Jagannathan and Runkle, 1993) and APARCH model (Ding,
Granger and Engle, 1993) are popular asymmetric GARCH class models.
A variety of measures of long-range dependence is used in the finance literature. In the time domain, long memory of a financial time series is related to a hyperbolic decay of
the autocovariance function. On the other hand, in the frequency domain, the presence of
long memory in the financial time series is highlighted by a spectral density function that
approaches infinity near the zero frequency; in other words, such series display power at low
frequencies (Lo, 1991; Di Sario et al., 2009). Such developments in the literature have led
researchers to develop stochastic models that can capture long memory characteristics of the
financial time series, such as the fractionally-integrated I(d) time series models introduced to
economics and finance by Granger (1980), Granger and Joyeux (1980), and Hosking
(1981). The commonly used measure of long memory is the Hurst exponent (H), or self-similarity parameter, a dimensionless quantity for which diverse estimation methodologies exist. The concept of the Hurst exponent finds its applications in many research fields
including the field of financial studies due to the ground-breaking work of Mandelbrot (1963,
1997) and Peters (1991, 1994). The Hurst exponent lies in the range 0 ≤ H ≤ 1. If the Hurst
exponent is 0.5 then the process is said to follow a random walk. When the Hurst exponent is
greater than 0.5, it suggests positive long-range autocorrelation in the return series or
persistence in the stock price series. On the other hand, when the Hurst exponent is smaller
than 0.5, it suggests the presence of negative autocorrelation in the returns or mean reversion
in the stock price series. The second measure, d, is the fractional integration parameter, which
can be estimated from fitting an ARFIMA(p,d,q) model on volatility series or by applying
Fractionally Integrated GARCH class models on the log-differenced series. Fractional integration theory highlights that the fractional difference parameter need not be an integer value (for example, 0 or 1) but can take a fractional value (see Baillie, 1996). The fractional differencing parameter indicates the order of integration of the financial time series.
A fractionally integrated time series differs from both stationary and unit-root processes in that fractionally integrated processes are persistent (i.e., they reflect
long memory) and mean reverting. The fractionally integrated parameter is given as d ∈ (0, 0.5). When d > 0.5 the time series is considered to be non-stationary in nature, and when d ∈ (−0.5, 0) the time series is considered to be anti-persistent in nature. In this book, we have
highlighted the popular approaches in the asset pricing literature to examine the long memory
in the volatility.
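To make the role of the fractional differencing parameter concrete, the sketch below (an illustration based on the standard binomial expansion of the operator (1 − L)^d, not a procedure taken from this book) computes the weights of the fractional difference filter and shows their slow, hyperbolic decay, the hallmark of long memory:

```python
import numpy as np

def frac_diff_weights(d, n_weights):
    """Weights of the fractional difference operator (1 - L)^d from its
    binomial expansion: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n_weights)
    w[0] = 1.0
    for k in range(1, n_weights):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

w = frac_diff_weights(0.4, 500)
# For 0 < d < 0.5 the weight magnitudes decay hyperbolically (roughly
# like k^(-1-d)), so distant observations keep a small but
# non-negligible influence -- unlike the geometric decay of ARMA filters.
print(w[1], w[2], w[3])  # approximately -0.4, -0.12, -0.064
```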
The long memory characteristics of the financial time series are widely studied and have
implications for various economics and finance theories. The most important financial
implication is related to the violation of the weak-form of market efficiency which
encourages the traders, investors and portfolio managers to develop models for making
predictions and to construct and implement speculative trading and investment strategies. In
an efficient market, the price of an asset should follow a random walk process in which the
price change is unaffected by its lagged price changes and has no memory. Random walks in
stock prices and exchange rates present important challenges to market participants and
analysts. If the random walk model holds, then prediction by analysts is no better than astrology. Traditional random walk tests (based on studies before the 1980s) of asset returns
were primarily based on the serial correlation of price changes. The pioneering work of
Kendall and Hill (1953) and Fama (1965) present strong and voluminous evidence in favour
of the random walk hypothesis which support the weak-form efficient market hypothesis.
Past studies have provided evidence in support of the hypothesis that nominal exchange rate
series follow random walks. See for instance, Bachelier (1900), Cootner (1964), Samuelson
(1965), Malkiel and Fama (1970), Giddy and Dufey (1975), Roll (1979), Meese and
Singleton (1982), Adler and Lehmann (1983), Darby (1983), Hsieh (1988) and Baillie and
Bollerslev (1989). However, Huizinga (1987) and Grilli and Kaminsky (1991) find evidence
against the random walk hypothesis for exchange rates. It is to be noted that pricing
derivative securities with random walk process or Brownian motion process may not be
appropriate if the true underlying stochastic process exhibits long memory property. Hence, it
is important to consider the long memory property while exploring the characteristics of
financial derivatives, which has implications for derivative market participants, risk managers and asset allocation decision makers.
1.2. Structure of the book
The book comprises six chapters. The outlines of the chapters are as follows:
Chapter 1 highlights importance of volatility in various finance applications. It also provides
the background information about market efficiency and long memory property in the
financial time series.
Chapter 2 highlights the development in the literature related to the long memory property of
the financial time series and estimation of long memory parameters.
In Chapter 3, we provide various approaches to estimate the long memory parameter in both
the time and the frequency domain.
In Chapter 4, we examine the long memory characteristics in the volatility of the Indian stock
market. We apply both the semi-parametric method (the Local Whittle (LW) estimator) and
the parametric method (GARCH class models) to accomplish our goal. Modeling long memory in the volatility of the Indian stock market has been a neglected area of research; very few studies in the literature focus on the long memory property of volatility in the Indian stock market.
In Chapter 5, we examine the long memory characteristics in the volatility of the Indian
exchange rates relative to US dollar, GBP, Euro and Japanese yen. We apply the aggregated
variance method (AVM) and the Geweke and Porter-Hudak (GPH) (1983) analysis (a semi-
parametric technique) to accomplish our goals. Modeling long memory in the volatility of the Indian exchange rates has been a neglected area of research; very few studies in the literature focus on the long memory property of volatility in the Indian
exchange rates. Hence, our study can be considered as a contribution on this topic which
involves the analysis of the main proxies of volatility. On the other hand, this chapter also
investigates the accuracy of the AVM and the GPH approaches by means of Monte Carlo
simulation experiments.
In Chapter 6, we examine the asymmetric long memory characteristics in the volatility of the
Indian banking sector and in particular, the CNX Bank Nifty index. We apply the Detrended
Fluctuation Analysis (DFA) technique and the GARCH family of models (namely, GARCH
(1,1), EGARCH (1,1), GJR-GARCH (1,1), FIGARCH (1,d,1) and FIAPARCH (1,d,1)) to
accomplish our goal. Modeling asymmetric long memory in the volatility of the Indian stock
market has been a neglected area of research; very few studies in the literature focus on the long memory property of volatility in the Indian stock market.
Hence, our study can be considered as a contribution on this topic which involves the analysis
of the main proxies of volatility and the asymmetric behavior of the conditional volatility.
Chapter 2: Literature Review
2.1. Long range dependence in the financial time series
The study of long-range dependence in financial time series has a long history and has
remained an active topic of research in economics and finance. See, for instance, Mandelbrot
(1971), Greene and Fielitz (1977) and Cutland, Kopp, and Willinger (1995). Mandelbrot
(1972) finds that the R/S analysis shows superior properties over autocorrelation and variance
analysis (because it can work with distributions with infinite variance) and spectral analysis
(because it can detect non-periodic cycles). Greene and Fielitz (1977) utilize the Hurst
rescaled-range (R/S) method and provide evidence in support of long memory in the daily
stock return series. The development of the log periodogram regression estimator by Geweke and Porter-Hudak (1983), based on the order of integration parameter d in the ARFIMA model of Granger and Joyeux (1980) and Hosking (1981), triggered the literature on fractionally integrated models. Diebold and Rudebusch (1989) explore the long
memory characteristics of the US real GNP data. Lo (1991) finds that the classical R/S test used by Mandelbrot and by Greene and Fielitz suffers from a drawback in that it is unable to
distinguish between long memory and short range dependence. Lo (1991) proposes a
modified test of the R/S statistic which can distinguish between short term dependence and
long memory and finds that daily stock returns do not show long-range dependence
properties. Cheung and Lai (1995) analyze data from Austria, Italy, Japan and Spain and
detect long memory in these markets. In addition, this finding was invariant to the choice of
estimation methods employed. In particular, results from both the modified `rescaled range'
and the spectral regression method, which was used to model an ARFIMA process indicated
the presence of long memory dynamics in the data. Willinger, Taqqu, and Teverovsky (1999)
empirically find that Lo's modified R/S test, which leads to acceptance of the null hypothesis of no long-range dependence for CRSP (Center for Research in Security Prices) data, is less conclusive than it appears. This is because of the conservative nature of the test statistic in
rejecting the null hypothesis of no long-range dependence, by attributing what is found in the
data to short-term dependence instead. Peters (1991) uses the R/S approach to study the long
memory characteristics of daily exchange rates data of US dollars, Japanese yen, British
pounds, Euros and Singapore dollars, and finds evidence that support the presence of long
memory properties in exchange rates. Baillie, Chung, and Tieslau (1996) investigate the long-
range dependence properties in inflation time series and find positive results. Corazza and
Malliaris (2002) carry out a study on foreign currency markets and find evidence of long
memory. They also find that the Hurst exponent does not remain fixed but changes dynamically
with time. In addition, they provide evidence that foreign currency returns follow either a
fractional Brownian motion or a Pareto-Levy stable distribution. Cajueiro and Tabak (2004)
use the rolling sample approach to calculate Hurst exponents over the period October 1992 to
October 1996 and provide evidence of long-range dependence in Asian markets. Carbone,
Castelli, and Stanley (2004) propose the detrending moving average (DMA) algorithm to
estimate the Hurst exponent, which does not require any assumption regarding the underlying
stochastic process or the probability distribution function of the random variable. Matteo,
Aste, and Dacorogna (2005) study the scaling properties of daily foreign exchange rates,
stock market indices and fixed income instruments by using the generalized Hurst exponent
approach and find that the scaling exponents can be used to differentiate markets in their
stage of development. Cajueiro and Tabak (2005) study the possible sources of long-range
dependence in returns of Brazilian stocks and find that firm specific variables can partially
explain the long-range dependence measures, such as the Hurst exponent. Souza, Tabak, and
Cajueiro (2008) study the evolution of long memory over time in returns and volatilities of
British pound futures contracts by using the classic R/S approach, the detrended fluctuation
analysis (DFA) approach and the generalized Hurst exponent (GHE) approach and find a
change in the long memory characteristics of the British pound around the time of the
European financial crisis. Serletis and Rosenberg (2009) use the detrending moving average
(DMA) approach to calculate the Hurst exponent and find evidence in support of anti-
persistence (mean reversion) in the US stock market. They also estimate the local Hurst
exponent (on non-overlapping windows of 50 observations) to examine the evolution of
efficiency characteristics of index returns over time. Kristoufek (2010) re-examines the
results of Serletis and Rosenberg (2009) and finds that there are no signs of anti-persistence
in the US stock market.
After the Autoregressive Conditional Heteroskedasticity (ARCH) model and the Generalized
ARCH (GARCH) model were introduced by Engle (1982) and Bollerslev (1986)
respectively, numerous extensions of ARCH models have been proposed in the literature, by
specifying the conditional mean and conditional variance equations, which are potentially
helpful in forecasting the future volatility of stock prices. Engle and Bollerslev (1986)
propose the Integrated GARCH (IGARCH) model to capture the impact of a shock on the
future volatility over an infinite horizon. However, these GARCH and IGARCH models are
not able to capture the long memory property of volatility satisfactorily. To deal with this
shortcoming, Baillie et al. (1996) propose the fractionally integrated GARCH (FIGARCH)
model to allow for fractional orders I(d) of integration, where 0 < d < 1. This model
estimates an intermediate process between GARCH and IGARCH. They apply the
FIGARCH model to examine the persistence in Deutschmark - U.S. dollar exchange rates
volatility. Vilasuso (2002) obtains the exchange rate volatility forecast by using FIGARCH
model and finds that the FIGARCH model produces significantly better volatility forecasts
(for 1-day and 10-days ahead) compared to GARCH and IGARCH. Kang and Yoon (2006)
investigate the asymmetric long memory features in the volatility of Asian stock markets.
Cheong, Nor and Isa (2007) investigate the asymmetry and long memory volatility behavior
of the Malaysian Stock Exchange daily data from 1991 to 2006, considering the financial crisis through various sub-periods (pre-crisis, crisis and post-crisis), and find mixed results.
Granger and Ding (1995) utilize the Geweke and Porter-Hudak (1983) test to examine the
presence of long-memory in absolute returns of the S&P 500 Index. The estimation of the
long memory parameter d in the volatility series as per the Geweke and Porter-Hudak test
involves an ordinary linear regression of the log periodogram of a volatility series (with the
proxy being the absolute return or the squared return) with the log frequency as the
explanatory variable. Lobato and Velasco (2000) apply a two-step semi-parametric estimator
to obtain the long-memory parameter of stock market volatility and trading volume. They
conduct their analysis in the frequency domain which involves tapering the data. Assaf and
Cavalcante (2005) use the modified rescaled range (R/S) statistic of Lo (1991), the rescaled
variance measure of Giraitis et al. (2000), the semi-parametric estimator proposed by Robinson (1995) and the Fractionally Integrated Generalized Autoregressive Conditional Heteroskedasticity (FIGARCH) model of Baillie et al. (1996) to estimate the fractional parameter d
for the Brazilian stock market. Kilic (2004) makes use of both parametric and nonparametric
methods to examine the long memory characteristics in the volatility of the Istanbul Stock
Exchange National 100 Index.
Gu and Zhou (2007) apply Detrended Fluctuation Analysis (DFA), R/S analysis and modified
R/S analysis to study the long memory property of the volatility of 500 stocks traded on the
Shanghai Stock Exchange (SHSE) and Shenzhen Stock Exchange (SZSE) and find strong
evidence in support of long memory in the volatility of the 500 stocks. Dionisio et al. (2007)
analyze the behavior of volatility for various international stock market indices in the context
of non-stationarity and prefer the FIGARCH model over the GARCH and the IGARCH
models for capturing the behavior of volatility. Bentes et al. (2008) use the FIGARCH model
and entropy measures to study the long memory property of the volatility time series for S&P
500, NASDAQ 100 and Stoxx 50 indices to compare US and European Markets and find that
both perspectives show nonlinear dynamics in the volatility time series. Oh et al. (2008) study
the long-term memory in the 1-minute KOSPI market index and the exchange rates of six currencies relative to the US dollar (5-minute exchange rate data are used for the Euro, UK GBP,
Japanese Yen, Singapore SGD, Switzerland CHF and Australia AUD) using DFA and the
FIGARCH model. Their findings are supportive of long memory in the volatility series which
can be attributed to the volatility clustering observed in the series. Di Sario et al. (2008)
utilize approaches based on wavelets and aggregate series to test for long memory in the
volatility of the Istanbul Stock Exchange National 100 Index. They make use of absolute
returns, squared returns and log squared returns as proxies of volatility and find that all
volatility series display long memory property. Kang et al. (2010) utilize two semi-parametric
tests (the Geweke and Porter-Hudak (GPH) test and the Local Whittle (LW) test) and the
FIGARCH model to examine the long memory property in the volatility of the Chinese stock
market and find evidence of long memory features in the volatility time series and suggest
that the assumption of non-normality provides better specifications regarding the long
memory volatility processes. Fleming and Kirby (2011) apply fractional-integrated time
series models on realized volatility and trading volume of 20 firms to investigate the joint
dynamics of the trading volume of stocks and their volatility and find a strong degree of
correlation between the innovations to volume and volatility. They suggest that trading
volume can be used to obtain more precise estimates of daily volatility for cases in which
high-frequency returns are unavailable.
Chapter 3: Long memory tests
3.1. Long memory in a financial time series
Both time domain and frequency domain measures are available to detect the presence of
long memory in the time series. In the time domain, a hyperbolically decaying
autocovariance function characterizes the presence of long memory. Suppose x_t is a stationary process and γ(τ) is its autocovariance function at lag τ. Then the asymptotic property of the autocovariance function is given as:

γ(τ) ~ C_γ |τ|^(2H−2) as |τ| → ∞,   (1)

where C_γ > 0 and H ∈ (0, 1) is the long memory parameter, called the Hurst exponent.
In the frequency domain, the long memory is present when the spectral density function
approaches infinity at low frequencies. Suppose f(λ) is the spectral density function. The series x_t is said to exhibit long memory if

f(λ) ~ C_f |λ|^(1−2H) as λ → 0,   (2)

where C_f > 0 and H ∈ (0, 1).
There exist various approaches to test the long memory property of the time series. In this
chapter, we briefly explain the most popular approaches which include R/S analysis,
modified R/S analysis, detrending moving average analysis, generalized Hurst exponent
approach, Lo's modified R/S analysis, detrended fluctuation analysis, Local Whittle
approach, exact Local Whittle approach and discrete wavelet transform approach.
3.2. R/S analysis
Mandelbrot and Wallis (1969) propose the R/S analysis based on Hurst (1951), which helps
in the estimation of the self-similarity parameter and the long-range dependence parameter H
in the time series. The procedure for using the R/S analysis is as follows:
First divide the time series (of returns) of length L into d subseries Z_{i,m} of length n. For each subseries m = 1, ..., d:
1. Find the mean (E_m) and the standard deviation (S_m).
2. Normalize the data of the subseries by subtracting the sample mean: Y_{i,m} = Z_{i,m} − E_m for i = 1, ..., n.
3. Find the cumulative time series x_{i,m} = Σ_{j=1}^{i} Y_{j,m} for i = 1, ..., n.
4. Find the range R_m = max{x_{1,m}, ..., x_{n,m}} − min{x_{1,m}, ..., x_{n,m}}.
5. Rescale the range: R_m/S_m.
6. Calculate the mean value of the rescaled range over all subseries of length n: (R/S)_n = (1/d) Σ_{m=1}^{d} (R_m/S_m).
The R/S statistic asymptotically follows the relation:

(R/S)_n ~ c n^H

The value of the Hurst exponent H can be estimated by running an ordinary least squares (OLS) linear regression over a sample of increasing time horizons:

log (R/S)_n = log c + H log n   (3)
Note that H = 0.5 for white noise. When the process is persistent (i.e., has long memory),
then H > 0.5 and for an anti-persistent process (i.e., with mean reversion), H < 0.5.
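The six steps above can be sketched in code as follows (a minimal illustration on simulated Gaussian white noise; the window sizes and the simple non-overlapping partition are arbitrary choices, and the raw R/S statistic is known to be biased upward in small samples):

```python
import numpy as np

def rs_statistic(x):
    """Rescaled range R/S of one subseries (steps 1-5)."""
    y = x - x.mean()              # step 2: demean
    z = np.cumsum(y)              # step 3: cumulative series
    r = z.max() - z.min()         # step 4: range
    s = x.std(ddof=0)             # step 1: standard deviation
    return r / s

def hurst_rs(x, window_sizes):
    """Estimate H as the OLS slope of log mean(R/S)_n on log n (eq. 3)."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        d = len(x) // n           # number of non-overlapping subseries
        rs = [rs_statistic(x[m * n:(m + 1) * n]) for m in range(d)]
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs)))   # step 6: mean rescaled range
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(1)
white = rng.standard_normal(10_000)
# For white noise the estimate sits near 0.5, typically slightly above
# because of the small-sample bias addressed in the next section.
print(hurst_rs(white, [16, 32, 64, 128, 256]))
```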
3.3. Modified R/S analysis (R/S-AL)
In small samples, the R/S analysis can show a significant deviation of the estimates of the
Hurst exponent from 0.5 even for the Gaussian white noise. To overcome this problem, Annis
and Lloyd (1976) and Peters (1994) introduce a new formulation to improve the performance
of R/S analysis for small n.
$$E(R/S)_n = \begin{cases} \dfrac{n - \frac{1}{2}}{n}\, \dfrac{\Gamma\!\left(\frac{n-1}{2}\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{n}{2}\right)} \displaystyle\sum_{i=1}^{n-1} \sqrt{\dfrac{n-i}{i}}, & n \le 340 \\[2ex] \dfrac{n - \frac{1}{2}}{n}\, \dfrac{1}{\sqrt{n\pi/2}} \displaystyle\sum_{i=1}^{n-1} \sqrt{\dfrac{n-i}{i}}, & n > 340 \end{cases} \qquad (4)$$
The Hurst exponent is calculated as 0.5 plus the slope of $(R/S)_n - E(R/S)_n$. The resulting statistic is known as R/S-AL.
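The correction term in equation (4) can be sketched as follows; the function name is ours, and the gamma-function branch for n ≤ 340 is evaluated via log-gamma for numerical stability:

```python
import numpy as np
from math import lgamma, pi, sqrt, exp

def expected_rs(n):
    """Anis-Lloyd/Peters expected (R/S)_n for Gaussian white noise (eq. (4))."""
    i = np.arange(1, n)
    tail = np.sum(np.sqrt((n - i) / i))
    if n <= 340:
        # Gamma((n-1)/2) / (sqrt(pi) * Gamma(n/2)), via log-gamma
        front = exp(lgamma((n - 1) / 2.0) - lgamma(n / 2.0)) / sqrt(pi)
    else:
        # asymptotic form 1 / sqrt(n * pi / 2) for large n
        front = 1.0 / sqrt(n * pi / 2.0)
    return (n - 0.5) / n * front * tail
```

The two branches agree closely around n = 340, since $\Gamma\!\left(\frac{n-1}{2}\right)/\Gamma\!\left(\frac{n}{2}\right) \approx \sqrt{2/n}$ for large n.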
3.4. Detrending moving average analysis (DMA)
In order to calculate the Hurst exponent, the detrending moving average (DMA) approach is used. Suppose $x_t$ is a financial time series with t = 1, ..., N. The nth order moving average of $x_t$ is given by:

$$\tilde{x}_{n,t} = \frac{1}{n} \sum_{k=0}^{n-1} x_{t-k}$$

In finding $\tilde{x}_{n,t}$, the last point of the time window of size n is taken as the reference point. The series $x_t$ is detrended by subtracting $\tilde{x}_{n,t}$, and the standard deviation of $x_t$ about the moving average $\tilde{x}_{n,t}$ is computed as follows:

$$\sigma_{DMA} = \sqrt{\frac{1}{N-n} \sum_{t=n}^{N} \left(x_t - \tilde{x}_{n,t}\right)^2} \qquad (5)$$

$\sigma_{DMA}$ is computed for different values of the moving average window n over the interval (n, N). The Hurst exponent is computed as the slope of a log-log plot between $\sigma_{DMA}$ and n. Arianos and Carbone (2007) show the power law behavior:

$$\sigma_{DMA} \sim n^H \qquad (6)$$

where H is the Hurst exponent and 0 < H < 1.

Equation (6) can also be written as:

$$\log(\sigma_{DMA}) \sim H \log(n)$$

This linear relationship between $\sigma_{DMA}$ and n on a log-log plot supports the presence of power law (fractal) scaling, which indicates that there is self-similarity in the series. This means the fluctuations over small time scales are related to fluctuations over larger time scales. In particular, the Hurst exponent can be used to identify the long memory properties of the time series.
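The backward moving-average detrending of equations (5)-(6) can be sketched as follows; the function name and window grid are illustrative choices of ours:

```python
import numpy as np

def dma_hurst(x, windows=(4, 8, 16, 32, 64)):
    """DMA estimate of H: slope of log sigma_DMA(n) vs log n (equation (6))."""
    x = np.asarray(x, dtype=float)
    log_n, log_sigma = [], []
    for n in windows:
        # Backward moving average: the last point of each window of
        # size n is the reference point, as in the text.
        ma = np.convolve(x, np.ones(n) / n, mode="valid")  # ma[i] = mean(x[i:i+n])
        resid = x[n - 1:] - ma                             # detrended series
        sigma = np.sqrt(np.mean(resid ** 2))               # equation (5)
        log_n.append(np.log(n))
        log_sigma.append(np.log(sigma))
    slope, _ = np.polyfit(log_n, log_sigma, 1)
    return slope
```

Note that the method is applied to the integrated (price-like) series: for a random walk the estimate should be near H = 0.5.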
3.5. Generalized Hurst exponent
Di Matteo and Aste (2002) propose the generalized Hurst exponent (GHE) approach for
financial time series which is based on the scaling of qth order moments of the distribution of
the series. The GHE is a generalization of the approach proposed by Hurst (1951). Suppose $x_t$ is the time series of logarithmic exchange rates. The qth order moment of the distribution of the increments (with t = v, 2v, ..., T) of the time series $x_t$ is given by:

$$K_q(\tau) = \frac{\left\langle |x_{t+\tau} - x_t|^q \right\rangle}{\left\langle |x_t|^q \right\rangle} \qquad (7)$$

where v is the time resolution, which is kept constant (here v = 1 day), and $\tau$ is the time interval, which varies between v and $\tau_{max}$.

The generalized Hurst exponent H(q) is defined from the scaling behaviour of $K_q(\tau)$ (Barabási & Vicsek, 1991), which can be assumed to follow the relation

$$K_q(\tau) \sim \left(\frac{\tau}{v}\right)^{q H(q)} \qquad (8)$$
For q = 1, the generalized Hurst exponent approach yields the Hurst exponent (H(1)) which
describes the scaling behaviour of the absolute values of increments and is similar to the
Hurst exponent obtained from the R/S analysis. The scaling exponent for q = 2 is associated
with the scaling of the autocorrelation function and is related to the spectral density function
(Flandrin, 1989).
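The scaling relation (8) implies that the slope of $\log K_q(\tau)$ against $\log \tau$ equals $q H(q)$, which suggests the following minimal sketch (function name and $\tau_{max}$ are our own choices; the denominator of equation (7) does not depend on $\tau$ and therefore does not affect the estimated slope):

```python
import numpy as np

def generalized_hurst(x, q=1, tau_max=19):
    """H(q) from the scaling K_q(tau) ~ tau^(q H(q)) (equation (8))."""
    x = np.asarray(x, dtype=float)
    taus = np.arange(1, tau_max + 1)
    # q-th order moment of the tau-increments (numerator of equation (7))
    k_q = [np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus]
    # slope of log K_q(tau) against log tau equals q * H(q)
    slope, _ = np.polyfit(np.log(taus), np.log(k_q), 1)
    return slope / q
```

For a random walk both H(1) and H(2) should be close to 0.5; H(q) varying with q signals multifractality.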
3.6. Lo's modified R/S analysis
Lo (1989) proposes a modified R/S statistic which can be applied to distinguish between
long-range and short-range dependence in a series. Suppose $y_t$ is the log difference time series with t = 1, ..., N; then the range R is defined as:

$$R = \max_{1 \le k \le N} \sum_{t=1}^{k} (y_t - \bar{y}) - \min_{1 \le k \le N} \sum_{t=1}^{k} (y_t - \bar{y}) \qquad (9)$$
where $\bar{y}$ is the sample estimator of the population mean. The range is usually rescaled by the sample standard deviation to find the R/S statistic. Lo (1989) finds that the distributional properties of the rescaled range are affected by the presence of short-range dependence. Therefore, he proposes a modified R/S statistic $Q_n$ whose statistical behaviour is invariant over a general class of short memory processes but deviates for long memory processes. The modified R/S statistic $Q_n$ is defined as:

$$Q_n = \frac{1}{\hat{\sigma}_n(q)} \left[ \max_{1 \le k \le n} \sum_{t=1}^{k} (y_t - \bar{y}) - \min_{1 \le k \le n} \sum_{t=1}^{k} (y_t - \bar{y}) \right] \qquad (10)$$

where $\hat{\sigma}_n^2(q)$ is estimated by

$$\hat{\sigma}_n^2(q) = \hat{\sigma}_y^2 + 2 \sum_{j=1}^{q} \omega_j(q)\, \hat{\gamma}_j, \qquad \omega_j(q) = 1 - \frac{j}{q+1}, \quad q < n,$$

where $\hat{\sigma}_y^2$ and $\hat{\gamma}_j$ are the sample variance and autocovariance estimators of y, and $\omega_j(q)$ are the weights suggested by Newey and West (1986).
Under the null hypothesis of no long-term memory, the statistic

$$V_n(q) = \frac{Q_n}{\sqrt{n}}$$

converges in distribution to the range of a Brownian bridge. $V_n(q)$ captures the dependence of the modified R/S statistic on the truncation lag q. We compute $V_n(q)$ for several different values of q to check the sensitivity of the statistic to the lag length. The null hypothesis is accepted at the 99% confidence level if $V_n(q)$ is contained in the interval (0.721, 2.098).
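Equation (10) and the Newey-West denominator can be sketched as follows (a minimal illustration; the function name is ours):

```python
import numpy as np

def lo_modified_rs(y, q):
    """Lo's modified R/S statistic Q_n and V_n(q) = Q_n / sqrt(n)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    dev = y - y.mean()
    cum = np.cumsum(dev)
    rng_ = cum.max() - cum.min()                 # range in equation (10)
    # Newey-West long-run variance with q autocovariance lags
    var = np.dot(dev, dev) / n                   # sample variance
    for j in range(1, q + 1):
        gamma_j = np.dot(dev[j:], dev[:-j]) / n  # lag-j autocovariance
        var += 2.0 * (1.0 - j / (q + 1.0)) * gamma_j
    q_n = rng_ / np.sqrt(var)
    return q_n, q_n / np.sqrt(n)
```

In practice one computes $V_n(q)$ for several lag lengths q and checks whether each value falls inside the (0.721, 2.098) acceptance interval.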
3.7. Detrended fluctuation analysis (DFA)
Peng et al. (1994) propose Detrended Fluctuation Analysis (DFA) to examine the long-range dependence property of a time series. Suppose x(t) is the integrated financial time series of logarithmic returns, i.e. x(t) = ln(P(t)) with t = 1, ..., N. In this method, the integrated time series is divided into blocks of the same length n. The ordinary least squares method is used to estimate the trend in each block; the fitted ordinary least squares line in each block is denoted $x_n(t)$. The trend is removed from the series by subtracting $x_n(t)$ from the integrated series x(t) in each block. This procedure is applied to each block, and the fluctuation magnitude is defined as

$$F(n) = \sqrt{\frac{1}{N} \sum_{t=1}^{N} \left[ x(t) - x_n(t) \right]^2} \qquad (11)$$
This step is repeated for every scale n, and to estimate the Hurst exponent, the following scaling relationship is defined:

$$F(n) \sim n^H \qquad (12)$$

where H is the Hurst exponent and 0 < H < 1.

Equation (12) can also be written as:

$$\log F(n) \sim H \log(n)$$

This linear relationship between F(n) and n on a log-log plot supports the presence of power law (fractal) scaling, which indicates that there is self-similarity in the series. This means the fluctuations over small time scales are related to fluctuations over larger time scales. The Hurst exponent can be used to identify the long memory properties of the time series.
We consider two extensions of Detrended Fluctuation Analysis (DFA) to deal with any short-range dependence in the series:

1) Apply DFA to shuffled data: Divide the series into non-overlapping blocks of 5 observations. Shuffle the data in each block (a permutation of the data within each block) and apply the DFA approach. The goal of shuffling the data series in each block is to destroy any structure of autocorrelation within these blocks (Cajueiro and Tabak, 2004, 2005).
2) Apply DFA to aggregated data: Here also divide the series into non-overlapping blocks of 5 observations. Take the average of each block and apply the DFA approach to calculate the Hurst exponent (Cajueiro and Tabak, 2005).
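The block-wise detrending of equations (11)-(12) can be sketched as follows (a minimal version with our own function name and window grid; it operates directly on the integrated series x(t)):

```python
import numpy as np

def dfa_hurst(x, windows=(16, 32, 64, 128, 256)):
    """DFA estimate of H: slope of log F(n) vs log n (equation (12))."""
    x = np.asarray(x, dtype=float)
    log_n, log_f = [], []
    for n in windows:
        t = np.arange(n)
        sq, n_blocks = 0.0, len(x) // n
        for b in range(n_blocks):
            block = x[b * n:(b + 1) * n]
            trend = np.polyval(np.polyfit(t, block, 1), t)  # OLS line x_n(t)
            sq += np.sum((block - trend) ** 2)              # detrended squares
        f_n = np.sqrt(sq / (n_blocks * n))                  # equation (11)
        log_n.append(np.log(n))
        log_f.append(np.log(f_n))
    slope, _ = np.polyfit(log_n, log_f, 1)
    return slope
```

The shuffled-data and aggregated-data extensions amount to preprocessing the input series (block-wise permutation or block averaging) before calling the same estimator.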
3.8. Local Whittle method
Robinson (1995) introduces the Local Whittle (LW) estimator, which assumes a behavior of the spectral density $f(\lambda)$ close to the origin, i.e. as $\lambda \to 0$:

$$f(\lambda) \sim G\, |\lambda|^{1-2H} \quad \text{as } \lambda \to 0 \qquad (13)$$

The computation of the Local Whittle estimator based on the periodogram involves an additional parameter m, which is less than N/2 and is assumed to satisfy the condition:

$$\frac{1}{m} + \frac{m}{N} \to 0 \quad \text{as } N \to \infty$$

For a spectral density satisfying equation (13), the Whittle log-likelihood function is given as:

$$L(G, H) = \frac{1}{m} \sum_{j=1}^{m} \left[ \log\!\left(G \lambda_j^{1-2H}\right) + \frac{I(\lambda_j)}{G \lambda_j^{1-2H}} \right] \qquad (14)$$

where $\lambda_j = 2\pi j / N$, $I(\lambda_j)$ is the periodogram of the series and G is a constant. The estimate of G is given as:

$$\hat{G}(H) = \frac{1}{m} \sum_{j=1}^{m} \lambda_j^{2H-1} I(\lambda_j)$$

Replacing G in equation (14) by its estimate $\hat{G}(H)$ yields the concentrated objective function:

$$L(H) = \log \hat{G}(H) - (2H - 1)\, \frac{1}{m} \sum_{j=1}^{m} \log \lambda_j \qquad (15)$$

The value $\hat{H}$ which minimizes L(H) converges in probability to the true value H as $N \to \infty$. Robinson (1995) shows that

$$\sqrt{m}\,\bigl(\hat{H} - H\bigr) \xrightarrow{d} N(0,\, 0.25) \qquad (16)$$

A major issue in the use of the Local Whittle test is the choice of the bandwidth parameter m.
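The concentrated objective (15) can be minimized by a simple grid search over H, as in the following sketch (the function name, grid resolution and the bandwidth used in the test are our own illustrative choices, not prescriptions from the text):

```python
import numpy as np

def local_whittle(x, m):
    """Grid-search minimizer of the concentrated likelihood L(H), eq. (15)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n               # lambda_j
    # periodogram at the first m Fourier frequencies
    i_lam = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    mean_log_lam = np.mean(np.log(lam))
    best_h, best_obj = 0.5, np.inf
    for h in np.linspace(0.01, 0.99, 981):                    # grid over H
        g_hat = np.mean(lam ** (2.0 * h - 1.0) * i_lam)       # G-hat(H)
        obj = np.log(g_hat) - (2.0 * h - 1.0) * mean_log_lam  # equation (15)
        if obj < best_obj:
            best_h, best_obj = h, obj
    return best_h
```

The bandwidth m governs the bias-variance trade-off: too few frequencies inflate the variance, while too many contaminate the estimate with short-run dynamics.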
3.9. Exact Local Whittle test
Phillips and Shimotsu (2004) find that if the value of d lies in the non-stationary region (i.e. beyond 3/4), the Local Whittle (LW) estimator becomes awkward to use, as the asymptotic theory associated with the LW estimator is discontinuous at d = 3/4 and d = 1 because of non-normal limit theory. It is to be noted that H = d + 0.5. In addition, for d > 1, the LW estimator becomes inconsistent. Shimotsu and Phillips (2005) propose an Exact Local Whittle (ELW) estimator and find that it is consistent and has the same N(0, 1/4) limit distribution for all values of d when the optimization covers an interval of width less than 9/2. The ELW is based on the minimization of the objective function:

$$Q_m(G, d) = \frac{1}{m} \sum_{j=1}^{m} \left[ \log\!\left(G \lambda_j^{-2d}\right) + \frac{1}{G}\, I_{\Delta^d x}(\lambda_j) \right] \qquad (17)$$

where $I_{\Delta^d x}(\lambda_j)$ is the periodogram of the fractionally differenced series $\Delta^d x_t$.
Shimotsu and Phillips (2005) propose to estimate d and G by minimizing $Q_m(G, d)$, such that

$$(\hat{G}, \hat{d}) = \arg\min_{G \in (0, \infty),\; d \in [\Delta_1, \Delta_2]} Q_m(G, d) \qquad (18)$$

where $\Delta_1$ and $\Delta_2$ are the lower and upper bounds for d such that $-\infty < \Delta_1 < \Delta_2 < \infty$. Concentrating $Q_m(G, d)$ with respect to G, Shimotsu and Phillips (2005) find that $\hat{d}$ satisfies

$$\hat{d} = \arg\min_{d \in [\Delta_1, \Delta_2]} R(d) \qquad (19)$$

where

$$R(d) = \log \hat{G}(d) - 2d\, \frac{1}{m} \sum_{j=1}^{m} \log \lambda_j \qquad \text{and} \qquad \hat{G}(d) = \frac{1}{m} \sum_{j=1}^{m} I_{\Delta^d x}(\lambda_j)$$
Shimotsu and Phillips (2005) propose the asymptotic relationship, which is given as:

$$\sqrt{m}\,\bigl(\hat{d} - d_0\bigr) \xrightarrow{d} N\!\left(0,\, \frac{1}{4}\right) \qquad (20)$$

provided the bandwidth m satisfies, as $N \to \infty$,

$$\frac{1}{m} + \frac{m^{1+2\beta} (\log m)^2}{N^{2\beta}} + \frac{\log N}{m^{\gamma}} \to 0 \quad \text{for some } \gamma > 0 \qquad (21)$$

where $\beta$ is the degree of approximation of the spectral density of $x_t$ by $G \lambda^{-2d}$ in the neighborhood of the origin.