                      INFORMATION ENTROPY AND EFFICIENT MARKET HYPOTHESIS
                                         DANIEL TRAIAN PELE1, ANA-MARIA ȚEPUȘ2
                                                                  Abstract
This study aims to demonstrate that the extreme values of the returns distribution are mostly associated with particular periods of stock
market inefficiency, when the level of uncertainty reaches a local minimum. We propose an estimator of uncertainty based on the
entropy of the probability density function of returns. The relationship between the level of uncertainty of a stock market and the extreme
values of the returns distribution is illustrated through a binary logistic regression model estimated for the main indexes of four stock
markets from Central and Eastern Europe.


JEL codes: G01 - Financial Crises, G14 - Information and Market Efficiency; Event Studies, G15 - International Financial Markets.
Keywords: entropy, entropy of a function, efficient market hypothesis.
1. Introduction
    Modeling the capital markets is closely linked to the efficient market hypothesis, a concept found in conjunction with the
rationality of investor behavior. Since Bachelier's groundbreaking study (1900), most of the twentieth-century academic research in
finance has been built around the paradigm of efficient markets.
    From the classical definition of Fama (1970) to the more recent developments of Timmerman and Granger (2004), the Efficient Market
Hypothesis (EMH) is inseparable from the concept of information and from the mechanism by which a certain set of information is
incorporated into the trading price of a financial asset. Numerous studies have investigated how the efficient market hypothesis, as a
theoretical model, is observed at the level of empirical reality.
    Stiglitz (1981) shows that the hypothesis of an efficient market, where "prices fully reflect available information" (Fama, 1970), is
not consistent with the notion of Pareto optimum. Moreover, the information reflected in stock prices is only the information that can
be obtained at no cost. Summers (1986) argues that the statistical tests commonly used for testing the Efficient Market Hypothesis have
relatively low power. A basic statistical principle says that if we cannot reject the null hypothesis, we cannot automatically accept it as a
valid hypothesis. From this point of view, there are forms of inefficiency that cannot be discriminated by the usual statistical tests, and
we cannot draw from these tests a conclusion about the validity of the Efficient Market Hypothesis.
    Malkiel (2003) discusses the efficiency of capital markets through the criticisms that have been made over time. He clearly
describes the random walk hypothesis: the logic of this hypothesis is that if information is immediately reflected in stock prices,
then tomorrow's price change will reflect only tomorrow's information and will be independent of today's price change. As the
information is unpredictable, price changes must be unpredictable and random.
    The link between a certain measure of complexity and the Efficient Market Hypothesis is quite clear: if we assume an efficient
market (in weak form), then the stock price follows a random walk model, i.e. the time series of returns is a white noise process.
    In terms of quantitative measures, such a white noise process has the highest level of complexity; on the contrary, if the Efficient
Market Hypothesis is not met, then the price is no longer a random walk process and consequently, the level of complexity of the
market will be lower. For instance, if the price is a purely deterministic process, completely predictable, then a minimum level of
complexity is achieved; if the price is a purely random process, completely unpredictable, then we are dealing with the maximum
level of complexity.
    Risso (2008) uses entropy as a measure of complexity to investigate the hypothesis that stock market crashes are most closely
associated with periods of low entropy. The relationship between stock market crashes and informational efficiency is as follows: if
the market is inefficient, so the information is not instantaneously reflected in prices, then local trends appear in the evolution of the
price. But once the information is incorporated into the stock price, the investors' reaction may lead to a significant collapse of the
stock price.
    Our study aims to demonstrate that both significant declines and significant increases in trading prices can be explained by
lower levels of stock market complexity, in relation to the Efficient Market Hypothesis.
    The paper is organized as follows: Section 2 describes the concept of entropy as a measure of complexity and proposes a
method of estimating entropy using the probability density function of returns; Section 3 describes the theoretical model used to
investigate the relationship between the level of complexity of a stock market and the occurrence of extreme values of returns;
Section 4 presents the results of the estimated model for the stock markets of Romania, Bulgaria, Hungary and the Czech Republic;
and the last section concludes.
2. Information Entropy as a measure of complexity
    Entropy is both a measure of the uncertainty and of the complexity of a system, with numerous applications in physics (the second
principle of thermodynamics), in information theory, in biology (DNA sequence complexity), in medicine, and in economics (complexity
of a system).

1
  Ph.D., Lecturer, Department of Statistics and Econometrics, University of Economics, Bucharest,
Email: danpele@ase.ro.
2
  Ph.D. Candidate, Department of Money, University of Economics, Bucharest,
Email: anamaria.tepus@yahoo.com .


   If $X$ is a discrete random variable with probability distribution
$$X: \begin{pmatrix} x_1 & \dots & x_n \\ p_1 & \dots & p_n \end{pmatrix},$$
where $p_i = P(X = x_i)$, $0 \le p_i \le 1$ and $\sum_i p_i = 1$, then the Shannon information entropy is defined as follows:
$$H(X) = -\sum_i p_i \log_2 p_i. \qquad [1]$$
   For the uniform distribution the Shannon entropy reaches its maximum, $H(X) = -\sum_i (1/n)\log_2(1/n) = \log_2 n$, while the
minimum value is attained for a degenerate distribution such as
$$X: \begin{pmatrix} x_1 & \dots & x_n \\ 1 & \dots & 0 \end{pmatrix}, \quad H(X) = 0.$$
   In other words, high levels of entropy correspond to situations with high uncertainty, while low levels of entropy are associated
with situations with lower uncertainty.
The relation between entropy and capital markets is straightforward (Risso, 2008).
   Let $r_t = p_t - p_{t-1} = \log P_t - \log P_{t-1}$ be the logreturn of an asset and let
$$s_t = \begin{cases} 1, & r_t \ge 0 \\ 0, & r_t < 0 \end{cases}$$
be a random variable associated to the "bull-bear" states.
   Then, for a certain period of time, one can define the information entropy of the 0-1 sequence:
$H = -p\log_2 p - (1 - p)\log_2(1 - p)$, where $p = P(s_t = 1) = 1 - P(s_t = 0)$.
   In other words, we can define the entropy as a measure of the complexity of the stock market by transforming the time series of
logreturns into a sequence of 0s and 1s. From our point of view, the methodology should be extended by taking into account the
continuous nature of the returns distribution, and in the following we propose such an extension of this methodology.
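   As an illustration, the short Python sketch below (our own illustration with hypothetical variable names, not the authors' code) converts
a window of logreturns into the 0-1 "bull-bear" sequence and computes its Shannon entropy, in the spirit of Risso (2008).

    import numpy as np

    def shannon_entropy_binary(returns):
        """Shannon entropy of the 0/1 (bull/bear) sequence built from logreturns."""
        s = (np.asarray(returns) >= 0).astype(int)   # s_t = 1 if r_t >= 0, else 0
        p = s.mean()                                  # p = P(s_t = 1)
        if p in (0.0, 1.0):                           # degenerate sequence -> H = 0
            return 0.0
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    # Example: entropy on a rolling window of 60 synthetic daily logreturns
    rng = np.random.default_rng(0)
    r = rng.normal(0.0, 0.01, 500)
    window = 60
    H = [shannon_entropy_binary(r[t - window:t]) for t in range(window, len(r) + 1)]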
2.1. Entropy of a function
      Unlike the case of a discrete random variable, the entropy of a continuous random variable is difficult to define.
    If $X$ is a continuous random variable with probability density function $f(x)$, then we can define, by analogy with the Shannon
information entropy, the differential entropy:
$$H(f) = -\int_A f(x)\log_2 f(x)\,dx, \qquad [2]$$
    where $A$ is the support set of $X$.
    A naive estimator of differential entropy could be used to quantify the complexity level of a stock market, and the results for the BET
Index of the Bucharest Stock Exchange (Pele, 2011) show a significant correlation between the values of this estimator and the probability
of extreme negative values of the daily logreturns distribution.
    Unfortunately, differential entropy does not have all the properties of Shannon entropy: it can take negative values and, in addition,
it is not invariant to linear transformations of variables. However, we can define the entropy of a function that satisfies certain
properties, through a transformation called quantization. We present the essential elements of this transformation, as they appear in
Lorentz (2009).
    Definition 1.
    Let $f: I = [a, b] \to \mathbb{R}$ be a continuous real-valued function, let $n \in \mathbb{N}^*$ and define $x_i = a + (i + 1/2)h$, for $i = 0, \dots, n-1$,
where $h = (b - a)/n$. The sampled function of $f$ is $S_n(f)(i) = f(x_i)$, for $i = 0, \dots, n-1$.
    The quantization process refers to creating a simple function that approximates the original function. Let $q > 0$ be a quantum. Then the
following function defines a quantization of $f$: $Q_q(f)(x) = (i + 1/2)q$, if $f(x) \in [iq, (i+1)q)$.
    Definition 2.
    Let $f$ be measurable and essentially bounded on the interval $[a, b]$ and let $q > 0$. Let $I_i = [iq, (i+1)q)$ and $B_i = f^{-1}(I_i)$. Then
the entropy of $f$ at quantum $q$ is $H_q(f) = -\sum_i \mu(B_i)\log_2(\mu(B_i))$, where $\mu$ is the Lebesgue measure.
    Following these definitions, one can compute the entropy of any continuous function defined on a compact interval.
    If $f(x) = x$ on $[0, 1]$, then for a fixed quantum $q = 1/n$, $H_q(f) = \log_2 n$, the maximum value of the entropy.
      The following theorem (Lorentz, 2009) provides a conceptual framework for defining an estimator of entropy of a function.
      Theorem
    Let $f$ be continuous on $[a, b]$ and let $1/n$ be the sampling space. Let $S_n(f)$ be the sampled function and $Q_q S_n$ a quantization of $S_n$ with
quantum $q > 0$. Denote $c_n(i) = \mathrm{card}\{(i + 1/2)q \in Q_q S_n\}$ (the number of values $(i + 1/2)q$ in $Q_q S_n$) and let $p_n(i)$ be the probability of
value $i$:
$$p_n(i) = \frac{c_n(i)}{\sum_j c_n(j)} = \frac{c_n(i)}{n}.$$
    Then
$$\lim_{h \to 0}\Big(-\sum_i p_n(i)\log_2 p_n(i)\Big) = H_q(f). \qquad [3]$$
   The above theorem assures us that regardless of quantization and sampling, we obtain a consistent estimator of the entropy of a
function.
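   To make the sampling-quantization procedure concrete, the following Python sketch (our own illustration, assuming the midpoint
sampling of Definition 1 and the quantum $q = 1/n$) estimates $H_q(f)$ for $f(x) = x$ on $[0, 1]$ and recovers the maximal value $\log_2 n$.

    import numpy as np

    def entropy_of_function(f, a, b, n):
        """Estimate H_q(f) by midpoint sampling and quantization with quantum q = 1/n."""
        h = (b - a) / n
        x = a + (np.arange(n) + 0.5) * h            # midpoint sampling grid x_i
        values = f(x)                                # sampled function S_n(f)
        q = 1.0 / n                                  # quantum
        bins = np.floor(values / q).astype(int)      # index i such that f(x) in [iq, (i+1)q)
        counts = np.bincount(bins)
        p = counts[counts > 0] / n                   # p_n(i) = c_n(i) / n
        return -np.sum(p * np.log2(p))

    print(entropy_of_function(lambda x: x, 0.0, 1.0, 128))   # approximately log2(128) = 7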
   2.2. Entropy of the probability density function of returns
   This conceptual framework can be used in order to define the entropy of the probability density function (pdf) of returns.
   Let $f$ be a continuous real-valued function such that $f(x) \ge 0$ and $\int_{-\infty}^{\infty} f(x)\,dx = 1$. Then the hypotheses of the Lorentz theorem are
fulfilled and we can compute $H_q(f)$.
    In reality, the analytical expression of the probability density function is unknown, so we estimate the density using a
nonparametric approach, such as Kernel Density Estimation (KDE).
    Thus, the KDE estimator of the pdf is
$$\hat{f}(x) = \frac{1}{nh}\sum_{i=1}^{n} K\!\left(\frac{x - x_i}{h}\right). \qquad [4]$$
    Here $K$ is a kernel function with the following properties: $K(x) \ge 0$ for all $x \in \mathbb{R}$, $K(-x) = K(x)$ for all $x \in \mathbb{R}$, $\int_{\mathbb{R}} K(x)\,dx = 1$ and
$\int_{\mathbb{R}} xK(x)\,dx = 0$.
    The parameter $h$ is a scale parameter whose choice determines the quality of the estimate (it is also known as the smoothing parameter
or bandwidth).
    Basically, our methodology involves the following steps to estimate the entropy of the probability density function of returns:
    - Let $r_t$ be the time series of logreturns for a time period $T$, and let $f(x)$ be its probability density function.
    - For $n = 2^k$, estimate the pdf with Kernel Density Estimation, obtaining the values $f(x_i)$, for $i = 0, \dots, n-1$.
    - Let $S_n$ be the sampled function, $S_n(f)(i) = f(x_i)$, for $i = 0, \dots, n-1$.
    - Let $q = 2^{-k}$ be a quantum; then define $Q_q S_n(f)(j) = (i + 1/2)q$, if $f(x_j) \in [iq, (i+1)q)$.
    - Compute the probabilities $p_n(i) = \frac{c_n(i)}{\sum_j c_n(j)} = \frac{c_n(i)}{n} = \frac{\mathrm{card}\{f(x_j) \in [iq, (i+1)q)\}}{n}$.
    - Estimate the entropy of the probability density function as $\hat{H}_q(f) = -\sum_i p_n(i)\log_2 p_n(i)$. [5]
    This estimator of the entropy of the probability density function of returns will be called, in what follows, PDF Entropy.

    Notes
    Actually, we have chosen $h = \frac{x_{\max} - x_{\min}}{n}$ and $x_i = x_{\min} + (i + 1/2)h$, for $i = 0, \dots, n-1$.
    For computational reasons, we have used $n = 2^k = 2^7 = 128$, and the KDE was done using a Gaussian kernel:
$K(x) = \exp(-x^2/2)/\sqrt{2\pi}$.
    There are many distributions for which the probability density function is not necessarily bounded (the Chi-square distribution is one
such example); moreover, even when the pdf is bounded, its range differs from distribution to distribution. To ensure comparability
between the estimation results in various markets, we standardized the values estimated by KDE:
$$f(x_i) \leftarrow \frac{f(x_i) - \min f}{\max f - \min f},$$
where $\max f$ and $\min f$ are the extreme estimated values of the probability density function.

    One can define a normalized estimator of the entropy of the probability density function of returns (Normalized PDF Entropy):
$$\hat{H}N_q(f) = \frac{-\sum_i p_n(i)\log_2 p_n(i)}{\log_2 n}. \qquad [6]$$
    This will ensure comparability among different markets; in fact, this estimator of complexity will be used below to illustrate the
relationship between the degree of complexity of the market and the likelihood of extreme events.
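    A minimal Python sketch of the whole procedure is given below; it is our own illustration rather than the authors' code, and it assumes
a Gaussian-kernel KDE evaluated on $n = 2^7 = 128$ midpoints, min-max standardization of the estimated density, quantum $q = 2^{-7}$, and
normalization by $\log_2 n$ as in [6].

    import numpy as np
    from scipy.stats import gaussian_kde

    def normalized_pdf_entropy(returns, k=7):
        """Normalized PDF Entropy of a series of logreturns (sketch of eqs. [5]-[6])."""
        n = 2 ** k
        q = 2.0 ** (-k)
        r = np.asarray(returns, dtype=float)
        # Kernel density estimate on a midpoint grid spanning the range of the returns
        h = (r.max() - r.min()) / n
        grid = r.min() + (np.arange(n) + 0.5) * h
        f = gaussian_kde(r)(grid)
        # Min-max standardization so that the estimated density values lie in [0, 1]
        f = (f - f.min()) / (f.max() - f.min())
        # Quantization: count density values falling into each interval [iq, (i+1)q)
        bins = np.minimum(np.floor(f / q).astype(int), n - 1)
        counts = np.bincount(bins, minlength=n)
        p = counts[counts > 0] / n
        return -np.sum(p * np.log2(p)) / np.log2(n)

    # Example on 60 synthetic daily logreturns
    rng = np.random.default_rng(1)
    print(normalized_pdf_entropy(rng.normal(0.0, 0.01, 60)))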
3. The theoretical model of entropy as a predictor of extreme values of returns distribution
    Entropy can be regarded as a measure of the informational efficiency of a stock market; if the market is weak-form efficient, then the
price follows a random walk process, so the bull and bear market states are equally probable. In terms of information entropy, market
efficiency is equivalent to the situation of maximum entropy, maximum complexity or maximum uncertainty.
    Conversely, when the price exhibits a predominant trend (upwards or downwards), the level of certainty is high, and such periods
are described by lower values of entropy.
    Our working hypothesis is that the likelihood of both tails of the returns distribution can be explained by lower values of entropy.
To verify this hypothesis we estimate the following logistic regression model:
$$P(Y_t^* = 1) = \frac{\exp(\beta_0 + \beta_1 H_t)}{1 + \exp(\beta_0 + \beta_1 H_t)}. \qquad [7]$$
    In the above equation:
    - $Y_t^* = 1$ when the return falls in one of the 1% tails of the distribution, i.e. $Y_t^* = \{1 \mid r_t \le r_t^-,\ P(r_t \le r_t^-) = 0.01\} \cup \{1 \mid r_t \ge r_t^+,\ P(r_t \ge r_t^+) = 0.01\}$ (lower and upper tail);
    - $H_t$ is the market information entropy at time $t$, quantified by the Normalized Shannon Entropy and by the Normalized PDF Entropy.
    The Normalized Shannon Entropy was estimated using the methodology of Risso (2008).
    Thus, for a certain time period $T$, let $s_t = \begin{cases} 1, & r_t \ge 0 \\ 0, & r_t < 0 \end{cases}$.
    Using a rolling window $\delta \le T$, one can compute the probability $p = P(s_t = 1)$.
    Then the Normalized Shannon Entropy for the entire interval of length $T$ is
$$H = -[\,p\log_2 p + (1 - p)\log_2(1 - p)\,]/\log_2(n). \qquad [8]$$
    Also, we have estimated the Normalized PDF Entropy for several time intervals $T$; moreover, since the time series of daily returns is
very noisy, the model was estimated using returns calculated on a local time window $\delta$: $r_t = p_{t+\delta} - p_{t-\delta}$.

    As we need to discriminate among several models, we use a performance indicator of the logistic regression model.
In general, such an indicator is defined by comparing the likelihood function of the estimated model with the likelihood function of the
model from which the exogenous variable is removed.
    One can define a pseudo-$R^2$ as a measure of the model's performance (Nagelkerke, 1991):
$$R^2 = 1 - \exp\{-2[\log L(M) - \log L(0)]/n\},$$
where $L(M)$ and $L(0)$ are the likelihood functions of the model with and without the exogenous variable.
    Rewriting the expression as $-\log(1 - R^2) = 2[\log L(M) - \log L(0)]/n$, it can be interpreted as the surplus of information due to the
explanatory variable. Unfortunately, $R^2$ will never reach 1, not even for a perfect model, so the following adjustment is made
(Nagelkerke, 1991):
$$R^2_{adj} = R^2/[1 - \exp(2\log L(0)/n)].$$
    We choose the model that best describes the correlation between reality and the theoretical hypothesis using as criterion the
maximization of $R^2_{adj}$.
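    As an illustration of how model [7] and the adjusted pseudo-$R^2$ can be computed, a hedged Python sketch using statsmodels is shown
below; the variable names are hypothetical, and the entropy series $H_t$ and the tail indicator $Y_t^*$ are assumed to have been constructed
as described above.

    import numpy as np
    import statsmodels.api as sm

    def fit_entropy_logit(H, y):
        """Fit P(Y=1) = exp(b0 + b1*H) / (1 + exp(b0 + b1*H)) and return the model
        together with the Nagelkerke adjusted pseudo-R^2."""
        X = sm.add_constant(np.asarray(H, dtype=float))
        model = sm.Logit(np.asarray(y), X).fit(disp=0)
        n = len(y)
        llf, llnull = model.llf, model.llnull            # log L(M) and log L(0)
        r2 = 1 - np.exp(-2 * (llf - llnull) / n)         # pseudo-R^2
        r2_adj = r2 / (1 - np.exp(2 * llnull / n))       # Nagelkerke adjustment
        return model, r2_adj

    # y[t] = 1 for returns in the 1% upper or lower tail, H[t] = entropy at time t:
    # model, r2_adj = fit_entropy_logit(H, y)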



4. Results
   We estimated the Normalized Shannon Entropy and the Normalized PDF Entropy using various rolling windows and various
time periods, for the main indexes of the stock markets of four countries of Central and Eastern Europe (Romania, Bulgaria, Hungary, and
the Czech Republic).
   There are several studies in the literature dealing with the market efficiency of those countries, and the results are quite
contradictory.
   Emerson et al. (1997) found varying levels of efficiency and varying speeds of movement towards efficiency within a sample of
four shares selected from the Sofia Stock Exchange.

    Rockinger and Urga (2000), in a study covering the period 1993-1999, argue that the Hungarian market always satisfies weak
efficiency, while for the Czech market they document convergence toward efficiency. Hájek (2007), in a study covering the period 1995-
2005, concludes that the weak form of the EMH cannot be validated on the Czech stock market, since daily price changes of both
individual stocks and indices are systematically linearly dependent; the level of efficiency, however, increases for weekly or monthly
returns.
    Dragotă and Mitrică (2004) and Dragotă et al. (2009) reached different conclusions regarding the efficiency of the Romanian stock
market: while in the first paper the Efficient Market Hypothesis is rejected, the second paper reveals a significant movement toward
efficiency.
    Our analysis shows that there are different degrees of efficiency among the four stock markets investigated, and this result can be
explained using information entropy as a measure of stock market uncertainty.
4.1. Estimation results for BET Index of Bucharest Stock Exchange
    To estimate the model, we have used the daily logreturns of the BET Index for the period 19 September 1997 - 15 March 2011 (3364
daily observations).
Table 1: Adjusted Pseudo-R² for the two estimators of complexity for the BET Index

                  Normalized Shannon Entropy                      Normalized PDF Entropy
          δ=1      δ=2      δ=3      δ=4      δ=5        δ=1      δ=2      δ=3      δ=4      δ=5
T=60      0.012    0.008    0.015    0.016    0.003      0.056    0.044    0.026    0.021    0.008
T=100     0.017    0.014    0.024    0.023    0.002      0.043    0.023    0.010    0.009    0.009
T=150     0.011    0.006    0.010    0.015    0.002      0.023    0.014    0.002    0.006    0.016
T=200     0.011    0.007    0.010    0.015    0.004      0.009    0.001    0.000    0.000    0.006
T=240     0.005    0.004    0.007    0.010    0.004      0.003    0.000    0.000    0.000    0.001
    As the $R^2_{adj}$ values show, the best results are obtained using the Normalized PDF Entropy as explanatory variable, with $T = 60$ and
$\delta = 1$.
    The fact that the best results for the BET Index were obtained for $T = 60$ suggests that the Romanian stock market has no long
temporal memory, the local temporal context being the most relevant.
In addition, estimating market complexity using the entropy of the probability density function of returns provides better results than the
classical Shannon entropy.
Graph 1: Daily logreturns of BET Index.        Graph 2: Normalized PDF Entropy ($T = 60$, $\delta = 1$).
   The results of the optimum model, for the Normalized PDF Entropy with $T = 60$ and $\delta = 1$, are presented below.
Table 2: Estimation results of the logistic regression for the BET Index

Parameter    Estimate     Standard Error    Wald Chi-Square    Pr > ChiSq
β0           10.9884      2.2109            24.702             <0.0001
β1           -15.7776     2.4034            43.0948            <0.0001

Observations    3304       Pseudo-R² (adj.)                                     0.0558
-2 Log L        701.381    Chi-Square (Hosmer-Lemeshow Goodness of Fit Test)    13.2112
Pseudo-R²       0.0112     Pr > ChiSq                                           0.1048

    Analyzing the estimation results, we can see that the entropy adversely affects the likelihood of extreme values of daily
returns. Thus, if the entropy increases by 0.1 units (e.g. from 0.8 to 0.9), then the odds of occurrence of extreme values of BET returns
drop by around 80%.
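    As a check of this interpretation, using the estimate $\beta_1 = -15.7776$ from Table 2, an increase of the entropy by $\Delta H = 0.1$ multiplies
the odds of an extreme return by
$$\exp(\beta_1 \Delta H) = \exp(-15.7776 \times 0.1) \approx 0.21,$$
i.e. the odds fall by roughly 79%, consistent with the statement above.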
Table 3: Normalized PDF Entropy of the BET Index (tails and body of the returns distribution)

Descriptive Statistics    N       Mean      Median    Mode      Std. Deviation    Minimum
Y*t = 1                   78      0.9167    0.9234    0.7417    0.0486            0.7417
Y*t = 0                   3226    0.9449    0.9528    0.9054    0.0355            0.7624

    Moreover, the average entropy is significantly lower in the days corresponding to the extreme values of the returns distribution than in
the other days.
4.2. Estimation results for SOFIX Index of Sofia Stock Exchange
   To estimate the model, we have used the daily logreturns of the SOFIX Index for the period 26 November 2001 - 4 January 2011
(2239 daily observations).
Table 4: Adjusted Pseudo-R² for the two estimators of complexity for the SOFIX Index

                  Normalized Shannon Entropy                      Normalized PDF Entropy
          δ=1      δ=2      δ=3      δ=4      δ=5        δ=1      δ=2      δ=3      δ=4      δ=5
T=60      0.018    0.019    0.004    0.001    0.002      0.077    0.082    0.171    0.078    0.089
T=100     0.021    0.040    0.031    0.015    0.004      0.115    0.087    0.145    0.069    0.040
T=150     0.028    0.039    0.047    0.056    0.045      0.089    0.064    0.178    0.079    0.058
T=200     0.070    0.091    0.161    0.144    0.102      0.141    0.041    0.162    0.090    0.061
T=240     0.092    0.100    0.110    0.087    0.088      0.098    0.020    0.132    0.107    0.093
      As can be seen from the $R^2_{adj}$ values of the estimated logistic regression models, the best performance is offered by the Normalized
Shannon Entropy estimator for $T = 150$ and $\delta = 3$, as well as by the Normalized PDF Entropy estimator for $T = 60$ and $\delta = 3$. The results are
substantially similar, but based on the results of the Hosmer-Lemeshow test we chose the model with $T = 60$ and $\delta = 3$, since the other
model presents a lack of fit.
    For the SOFIX Index as well, estimating market complexity using the entropy of the probability density function of returns provides
better results than the classical Shannon entropy.

Graph 3: Daily logreturns of SOFIX Index.        Graph 4: Normalized PDF Entropy ($T = 60$, $\delta = 3$).


      The results of the optimum model, with Normalized PDF Entropy as exogenous,                                                                                                                                                                                                    T  60 and   3 , are presented below.
                              Table 5: Estimation results of logistic regression for SOFIX Index
    Parameter       Estimate      Standard Error    Wald Chi-Square    Pr > ChiSq
    β0              27.5833       3.7150            55.1294            <0.0001
    β1              -33.7489      4.0649            68.9309            <0.0001

    Observations    2179          Pseudo-R2 adj                                           0.1710
    -2 Log L        349.314       Chi-Square (Hosmer-Lemeshow Goodness-of-Fit Test)       8.5538
    Pseudo-R2       0.0296        Pr > ChiSq                                              0.3813
    Analyzing the estimation results, we can see that entropy negatively affects the likelihood of extreme values of daily
returns. Thus, if entropy increases by 0.1 units (e.g. from 0.8 to 0.9), the odds of occurrence of extreme values of SOFIX returns
drop by around 97%.
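    This odds effect follows directly from the slope estimate in Table 5, since a change of 0.1 in entropy multiplies the odds by exp(0.1·β1). A minimal worked check of the arithmetic (the coefficient is the one reported above):

```python
import math

beta_1 = -33.7489                               # slope of normalized PDF entropy (Table 5)

odds_multiplier = math.exp(0.1 * beta_1)        # exp(-3.37) ~ 0.034
pct_change = (odds_multiplier - 1.0) * 100.0    # ~ -97%

print(f"odds multiplier: {odds_multiplier:.4f} ({pct_change:.1f}% change in the odds)")
```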
                   Table 6: Normalized PDF Entropy of SOFIX Index (tails and body of returns distribution)
    Descriptive Statistics    N       Mean      Median    Mode      Std. Deviation    Minimum    Maximum
    Yt* = 1                   42      0.9095    0.9132    0.8423    0.0341            0.8423     0.9759
    Yt* = 0                   2137    0.9527    0.9587    0.8948    0.0284            0.8404     0.9954
   Moreover, the average entropy is significantly lower for the days with extreme values of returns (0.90) than for the other days (0.95).
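   For readers who wish to reproduce this kind of estimation, a minimal sketch in Python with statsmodels is given below. The input series are simulated stand-ins rather than the actual SOFIX data, and the extreme-value indicator used here (returns more than δ sample standard deviations from the mean) is only an illustrative assumption standing in for the paper's definition of Yt*.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Simulated stand-ins for the real inputs: a daily log-return series and its
# rolling normalized PDF entropy (in the paper both come from market data and
# from the entropy estimator described earlier).
n = 2500
logret = pd.Series(rng.normal(0.0, 0.01, n), name="logret")
entropy = pd.Series(np.clip(rng.normal(0.94, 0.03, n), 0.80, 1.00), name="entropy")

# Illustrative extreme-value indicator: returns more than delta sample standard
# deviations from the mean (an assumption used only for this sketch).
delta = 3
z = (logret - logret.mean()) / logret.std()
extreme = (z.abs() > delta).astype(int)

# Binary logistic regression of the extreme-value indicator on entropy.
model = sm.Logit(extreme, sm.add_constant(entropy)).fit(disp=0)
print(model.summary())

# Mean entropy on extreme-return days vs. the remaining days (cf. Table 6).
df = pd.DataFrame({"extreme": extreme, "entropy": entropy})
print(df.groupby("extreme")["entropy"].agg(["count", "mean", "std"]))
```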
4.3. Estimation results for BUX Index of Budapest Stock Exchange
   To estimate the logistic regression model, we used the daily logreturns of the BUX Index for the period 1 April 1997 – 4 January
2011 (3,442 daily observations).
Table 7: Adjusted Pseudo-R2 for the two estimators of complexity for BUX Index
                 Normalized Shannon Entropy                        Normalized PDF Entropy
              δ=1      δ=2      δ=3      δ=4      δ=5        δ=1      δ=2      δ=3      δ=4      δ=5
    T=60      0.001    0.001    0.001    0.001    0.019      0.169    0.167    0.127    0.187    0.193
    T=100     0.000    0.000    0.002    0.005    0.024      0.183    0.185    0.156    0.205    0.190
    T=150     0.000    0.001    0.000    0.003    0.018      0.183    0.189    0.190    0.237    0.227
    T=200     0.005    0.004    0.006    0.000    0.004      0.138    0.142    0.156    0.198    0.184
    T=240     0.016    0.016    0.018    0.003    0.000      0.167    0.154    0.137    0.163    0.172
   As can be seen from the values of the adjusted Pseudo-R2 of the estimated logistic regression models, the best performance is obtained with the Normalized PDF Entropy estimator for T = 150 and δ = 4.
    For the BUX Index as well, estimating market complexity through the entropy of the probability density function of returns provides better results
than the classical Shannon entropy.
Graph 5: Daily logreturns of BUX Index                    Graph 6: Normalized PDF Entropy (T = 150, δ = 4)


   The results of the optimum model, with Normalized PDF Entropy as the explanatory variable, T = 150 and δ = 4, are presented below.
                              Table 8: Estimation results of logistic regression for BUX Index
    Parameter       Estimate      Standard Error    Wald Chi-Square    Pr > ChiSq
    β0              28.9171       2.9095            98.7797            <0.0001
    β1              -35.8376      3.2715            120.0030           <0.0001

    Observations    3292          Pseudo-R2 adj                                           0.1710
    -2 Log L        492.411       Chi-Square (Hosmer-Lemeshow Goodness-of-Fit Test)       11.1252
    Pseudo-R2       0.0413        Pr > ChiSq                                              0.1947

    Analyzing the estimation results, we can see that entropy negatively affects the likelihood of extreme values of daily
returns of the BUX Index. Thus, if entropy increases by 0.1 units (e.g. from 0.8 to 0.9), the odds of occurrence of extreme values of
BUX Index returns drop by around 98%.
                   Table 9: Normalized PDF Entropy of BUX Index (tails and body of returns distribution)
    Descriptive Statistics    N       Mean      Median    Mode      Std. Deviation    Minimum    Maximum
    Yt* = 1                   64      0.8846    0.882     0.8114    0.0405            0.8114     0.9668
    Yt* = 0                   3228    0.9446    0.9505    0.8632    0.0344            0.8172     0.9906


    For the BUX Index, the average entropy is significantly higher in the body of the returns distribution (0.94) than in the upper and lower
tails (0.88).
4.4. Estimation results for PX Index of Prague Stock Exchange
    To estimate the binary logistic regression model, we used the daily logreturns of the PX Index for the period 7 September 1993 –
5 January 2011 (4,184 daily observations).
Table 10: Adjusted Pseudo-R2 for the two estimators of complexity for PX Index
                 Normalized Shannon Entropy                        Normalized PDF Entropy
              δ=1      δ=2      δ=3      δ=4      δ=5        δ=1      δ=2      δ=3      δ=4      δ=5
    T=60      0.001    0.000    0.001    0.002    0.000      0.033    0.081    0.067    0.039    0.099
    T=100     0.000    0.004    0.002    0.002    0.000      0.071    0.141    0.115    0.050    0.111
    T=150     0.022    0.016    0.011    0.015    0.016      0.063    0.173    0.132    0.077    0.125
    T=200     0.009    0.005    0.000    0.003    0.002      0.086    0.141    0.161    0.075    0.147
    T=240     0.007    0.003    0.000    0.000    0.000      0.126    0.157    0.131    0.075    0.118
       As can be seen from the values of the adjusted Pseudo-R2 of the estimated logistic regression models, the best performance is obtained with the Normalized PDF
Entropy estimator for T = 150 and δ = 2, closely followed by the specification with T = 200 and δ = 3. The results are
substantially similar, but based on the results of the Hosmer-Lemeshow test we chose the model with T = 200 and δ = 3, since the
other model presents a lack of fit.
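   Since the Hosmer-Lemeshow statistic is the criterion used to discard the competing specification, a generic sketch of how this statistic can be computed from a fitted model is shown below (standard decile-of-risk version; this is not necessarily the exact routine behind the values reported in the tables).

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y, p_hat, g=10):
    """Hosmer-Lemeshow goodness-of-fit statistic with g groups.

    y     : array of 0/1 observed outcomes
    p_hat : fitted probabilities from the logistic regression
    Returns the chi-square statistic and its p-value (g - 2 degrees of freedom);
    a small p-value indicates lack of fit.
    """
    y, p_hat = np.asarray(y, float), np.asarray(p_hat, float)
    order = np.argsort(p_hat)
    y, p_hat = y[order], p_hat[order]
    groups = np.array_split(np.arange(len(y)), g)   # deciles of predicted risk

    stat = 0.0
    for idx in groups:
        obs = y[idx].sum()            # observed extreme-return days in the group
        exp = p_hat[idx].sum()        # expected number under the model
        n_g = len(idx)
        pi_bar = exp / n_g
        stat += (obs - exp) ** 2 / (n_g * pi_bar * (1.0 - pi_bar))

    return stat, chi2.sf(stat, df=g - 2)
```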
   For the PX Index as well, estimating market complexity through the entropy of the probability density function of returns provides better results
than the classical Shannon entropy.
Graph 7: Daily logreturns of PX Index                    Graph 8: Normalized PDF Entropy (T = 200, δ = 3)

   The results of the optimum model, with Normalized PDF Entropy as the explanatory variable, T = 200 and δ = 3, are presented
below.
                              Table 11: Estimation results of logistic regression for PX Index
    Parameter       Estimate      Standard Error    Wald Chi-Square    Pr > ChiSq
    β0              18.3822       1.8960            93.9932            <0.0001
    β1              -24.6228      2.1505            131.0977           <0.0001

    Observations    3983          Pseudo-R2 adj                                           0.1611
    -2 Log L        653.879       Chi-Square (Hosmer-Lemeshow Goodness-of-Fit Test)       11.3840
    Pseudo-R2       0.0283        Pr > ChiSq                                              0.1809
    Analyzing the estimation results, we can see a negative correlation between entropy and the likelihood of extreme values of
daily returns of the PX Index. Thus, if entropy increases by 0.1 units (e.g. from 0.8 to 0.9), the odds of occurrence of extreme values
of PX Index returns drop by around 92%.
                   Table 12: Normalized PDF Entropy of PX Index (tails and body of returns distribution)
    Descriptive Statistics    N       Mean      Median    Mode      Std. Deviation    Minimum    Maximum
    Yt* = 1                   78      0.9060    0.9037    0.8161    0.0455            0.8161     0.9833
    Yt* = 0                   3905    0.9352    0.9402    0.8937    0.0320            0.8048     0.9899

    For the PX Index, the average entropy is significantly higher in the body of the returns distribution (0.93) than in the upper and lower
tails (0.90).
5.Conclusions
    The informational efficiency of stock markets has been an intensely debated topic in recent years, especially in the context of the current
economic and financial crisis. From an information-theoretic point of view, capital market efficiency may be associated with a high
degree of uncertainty. Indeed, if a stock market is efficient, meaning that information is transmitted instantly and is completely
incorporated in trading prices, it is virtually impossible to anticipate the future evolution of prices or returns. This translates into a high
degree of uncertainty, specific to the random walk behavior of stock prices.
    As the random walk process is the most complex in terms of predictability, it is natural to use entropy as a measure of uncertainty in
relation to market efficiency.
    The main conclusion of the study is that periods characterized by a sharp drop in entropy, when the market complexity level
reaches a local minimum and uncertainty is low, are associated with the occurrence of extreme values of returns.
    In this study we analyzed the relationship between complexity and the predictability of stock market crashes, using the entropy of the
probability density function of returns as a measure of complexity. The results of the estimated models show that this complexity
estimator performs better than the classical Shannon entropy.
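    To illustrate how such a rolling measure can be computed in practice, the sketch below estimates the density of the last T returns with a Gaussian kernel and takes a normalized entropy of the discretized density. This is only an illustration of the rolling-window idea under simplifying assumptions: the discretization and the normalization by log(n_grid) are placeholders, not the paper's Normalized PDF Entropy, which is based on the entropy of a function (Lorentz, 2009).

```python
import numpy as np
from scipy.stats import gaussian_kde

def rolling_density_entropy(returns, T=60, n_grid=512):
    """Rolling entropy of a kernel estimate of the return density.

    For each day, the density of the last T log-returns is estimated with a
    Gaussian kernel, discretized on a grid, and its Shannon entropy is divided
    by log(n_grid) so that the result lies in [0, 1].  The discretization and
    this normalization are placeholders for the paper's estimator.
    """
    returns = np.asarray(returns, dtype=float)
    out = np.full(returns.shape, np.nan)
    for t in range(T, len(returns) + 1):
        window = returns[t - T:t]
        kde = gaussian_kde(window)
        grid = np.linspace(window.min(), window.max(), n_grid)
        p = kde(grid)
        p = p / p.sum()                        # discrete probabilities on the grid
        h = -np.sum(p * np.log(p + 1e-12))     # Shannon entropy of the discretization
        out[t - 1] = h / np.log(n_grid)        # normalized to [0, 1]
    return out
```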
    The entropy of the probability density function of returns can be used in two ways: as a tool to measure the degree of efficiency of a single
market, and as a tool to compare two or more stock markets in terms of efficiency.
    As a tool for measuring the degree of stock market efficiency, we proposed an indicator that can detect the level of efficiency at a given
moment, taking into account the local temporal context. Unfortunately, this indicator has a number of disadvantages, among which
is the difficulty of deriving its theoretical properties, since the probability density function does not take values in a standardized
interval for every distribution. Another major disadvantage is its strong dependence on the local temporal context, which could be
alleviated by estimating the indicator from intraday data. On the other hand, as a tool for comparison, the Normalized PDF Entropy may
be useful in evaluating the relative efficiency of two markets by comparing the sensitivity of the chance of occurrence of extreme
values of the returns distribution to a change in entropy.
    According to the Efficient Market Hypothesis, a growing level of market uncertainty should be reflected in an adverse change in the
chances of occurrence of extreme values of returns.
    The lower the magnitude of the expected change in the odds resulting from a one-unit increase in entropy, the lower the degree of market efficiency.
    From this point of view, the Romanian stock market is the least efficient among the capital markets analyzed in this study.
According to the estimated model, if entropy increases by 0.1 units (e.g. from 0.8 to 0.9), the chance of occurrence of extreme
values of BET returns drops by around 80%.
   For the Bulgarian market this expected change is approximately 97%, for the Hungarian market it is about 98%, and for the Prague
Stock Exchange index it is about 92%.
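   These figures follow directly from the slope estimates, since the expected change in the odds for a 0.1 increase in entropy is exp(0.1·β1) − 1. A minimal sketch of this calculation using the slopes reported in Tables 5, 8 and 11 is given below; the BET slope, reported earlier in the paper, would be added in the same way, and small differences from the rounded percentages quoted above stem from rounding.

```python
import math

# Slope estimates of the Normalized PDF Entropy (Tables 5, 8 and 11);
# the BET coefficient from the earlier section would be added in the same way.
beta_1 = {"SOFIX": -33.7489, "BUX": -35.8376, "PX": -24.6228}

for index, b in beta_1.items():
    # Expected percentage change in the odds of an extreme return
    # for a 0.1-unit increase in normalized entropy.
    change = (math.exp(0.1 * b) - 1.0) * 100.0
    print(f"{index}: {change:.1f}% change in the odds of an extreme daily return")
```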
   The explanations for these differences between the countries surveyed can be found in the existing inequalities in terms of their
economic and capital market development.
Acknowledgments
    This paper was co-financed from the European Social Fund through the Sectoral Operational Programme Human Resources
Development 2007-2013, project number POSDRU/89/1.5/S/59184, "Performance and excellence in postdoctoral economic research
in Romania".
6.References
    Bachelier, L. (1900), "Théorie de la Spéculation", Annales Scientifiques de l'École Normale Supérieure, III-17, pp. 21-86. (English translation in Cootner, P. (ed.) (1964), The Random Character of Stock Market Prices, Massachusetts Institute of Technology, pp. 17-78.)
    Bouzebda, S., Elhattab, I. (2009), "Uniform in bandwidth consistency of the kernel-type estimator of the Shannon's entropy", Comptes Rendus Mathematique, Vol. 348, Issues 5-6, March 2010, pp. 317-321.
    Dragotă, V., Stoian, A., Pele, D. T., Mitrică, E., Bensafta, M. (2009), "The Development of the Romanian Capital Market: Evidences on Information Efficiency", Romanian Journal of Economic Forecasting, Vol. 10(2), pp. 147-160.
    Dragotă, V., Mitrică, E. (2004), "Emergent capital market's efficiency: The case of Romania", European Journal of Operational Research, 155, pp. 353-360.
    Emerson, R., Stephen, H., Zalewska-Mitura, A. (1997), "Evolving Market Efficiency with an Application to Some Bulgarian Shares", Economics of Planning, Vol. 30, Issue 2, pp. 75-90.
    Fama, E. (1970), "Efficient Capital Markets: A Review of Theory and Empirical Work", Journal of Finance, 25(2), pp. 383-417.
    Hájek, J. (2007), "Czech Capital Market Weak-Form Efficiency, Selected Issues", Prague Economic Papers, University of Economics, Prague, Vol. 2007(4), pp. 303-318.
    Lorentz, R. (2009), "On the entropy of a function", Journal of Approximation Theory, 158(2), pp. 145-150.
    Malkiel, B. (2003), "The Efficient Market Hypothesis and Its Critics", Journal of Economic Perspectives, Vol. 17, No. 1, Winter 2003, pp. 59-82.
    Mateev, M. (2004), "CAPM Anomalies and the Efficiency of Stock Markets in Transition: Evidence from Bulgaria", South-Eastern Europe Journal of Economics, 1 (2004), pp. 35-58.
    Nagelkerke, N. J. D. (1991), "A note on a general definition of the coefficient of determination", Biometrika, 78(3), pp. 691-692.
    Parzen, E. (1962), "On Estimation of a Probability Density Function and Mode", The Annals of Mathematical Statistics, 33, pp. 1065-1076.
    Pele, D. T. (2011), "Information entropy and occurrence of extreme negative returns", Journal of Applied Quantitative Methods, forthcoming.
    Risso, A. (2008), "The Informational Efficiency and the Financial Crashes", Research in International Business and Finance, Vol. 22, pp. 396-408.
    Rockinger, M., Urga, G. (2000), "The Evolution of Stock Markets in Transition Economies", Journal of Comparative Economics, Vol. 28, Issue 3, September 2000, pp. 456-472.
    Sain, S. (1994), Adaptive Kernel Density Estimation, Ph.D. Thesis, Houston, Texas.
    Saporta, G. (1990), Probabilités, Analyse des Données et Statistique, Ed. Technip.
    Scott, D. W. (1992), Multivariate Density Estimation: Theory, Practice and Visualisation, New York, John Wiley.
    Silverman, B. W. (1986), Density Estimation for Statistics and Data Analysis, London, Chapman and Hall.
    Stiglitz, J. (1981), "The Allocation Role of the Stock Market: Pareto Optimality and Competition", The Journal of Finance, Vol. 36, No. 2, Papers and Proceedings of the Thirty-Ninth Annual Meeting, American Finance Association, Denver, September 5-7, 1980 (May 1981), pp. 235-251.
    Summers, L. (1986), "Does the Stock Market Rationally Reflect Fundamental Values?", The Journal of Finance, 41(3), pp. 591-601.
    Timmermann, A., Granger, C. (2004), "Efficient market hypothesis and forecasting", International Journal of Forecasting, 20(1), pp. 15-27.
    *** Website of Sofia Stock Exchange: www.bse-sofia.bg
    *** Website of Bucharest Stock Exchange: www.bvb.ro
    *** Website of Budapest Stock Exchange: www.bse.hu
    *** Website of Prague Stock Exchange: www.pse.cz

				