There was a time when India was discussed as a land of snake charmers, black magic and epidemics, but the revolutionary Indian growth story changed everything. The Indian economy at its height compelled the world to change its view of India. Of the several factors that changed the face of modern India, we are going to discuss the most prominent of them: our share market. India's earlier reform measures gave the country two of the most sought-after, world-class brands, the SENSEX and the NIFTY. The remarkable figures posted by our market turned heads towards India, and India became one of the most favoured destinations for investment. Here we deal with the ups and downs in the share market over the last two years, i.e. since 2006. Our share market has gone through many phases in these two years. We saw investors overjoyed at 21K, and we saw them in despair when the market crashed. We saw how the market rewarded undervalued shares and how overvalued shares fell, demonstrating the saying that everything which rises more than expected has to fall. To analyse the saga of the Indian share market, two indices were available to follow: the BSE Sensex and the NSE Nifty. Though the NSE Nifty is the more advanced option and has left the BSE Sensex behind, the BSE Sensex is still regarded as the barometer of our economy, and that is why it has been followed here. It was not possible to track every daily figure of the Sensex over the last two years; instead, the performance of the Sensex is analysed with the help of data and graphs collected from various sources, along with some of the most talked-about movements of the Sensex, starting with the secondary-market summary of each year, first 2006 and then 2007.

In the secondary market, the uptrend continued in 2006-07, with the BSE Sensex closing above 14,000 (at 14,015) for the first time on January 3, 2007. After a somewhat dull first half, conditions on the bourses turned buoyant during the later part of the year, with large inflows from Foreign Institutional Investors (FIIs) and larger participation by domestic investors. During 2006, on a point-to-point basis, the Sensex rose by 46.7%. The BSE Sensex (top 30 stocks), which stood at 9,398 at end-December 2005 and 10,399 at end-May 2006, dropped to 8,929 on June 14, 2006, recovered soon thereafter, and rose steadily to 13,787 by end-December 2006. This gain was higher than those in most emerging markets of Asia, e.g. South Korea, Thailand, Malaysia and Taiwan, and was the second highest among emerging markets. Liquidity, which serves as fuel for the price discovery process, is one of the main criteria an investor considers while investing in the stock market. Market forces of demand and supply determine the price of any security at any point of time; impact cost quantifies the impact of a small change in those forces on prices. The higher the liquidity, the lower the impact cost.

During December 2005, the biggest demerger in Indian corporate history, between the Ambani brothers, paved the way to 9,000, and the Sensex entered 2006 above the 9,000 mark. On February 10, 2006 we saw two roaring figures: both the Sensex and Sachin Tendulkar crossed the 10,000 mark. The Sensex then surged past 11,000 points on the 21st of March. After falling by 307 points on April 12, 2006 on account of heavy selling by FIIs in both the cash and futures markets and a move by stock exchanges to raise margins on share transactions by about 250 basis points, the 131-year-old BSE crossed yet another milestone on Thursday, April 20, 2006. Everything seemed to be going fine; perhaps it was the lull before the storm. The May crash saw the Sensex shed as much as 14% of its market capitalisation in just one month.

Although the indices showed some intermittent fluctuations, reflecting changes in market sentiment, they maintained their northbound trend during the year. The buoyant conditions on the Indian bourses were aided by, among other things, India posting relatively higher GDP growth and the strength of the Indian rupee on the back of larger capital inflows. The BSE Sensex (top 30 stocks) echoed a trend similar to the NSE Nifty, and both peaked in January 2008. However, the BSE and NSE indices declined subsequently, reflecting concerns about global developments. The BSE Sensex yielded a compounded return of 36.5 per cent per year between 2003 and 2007; in terms of a simple average, the BSE Sensex has given an annual return of more than 40 per cent during the last three years.

After touching the 14K mark on December 5, 2006, the market came under pressure in early 2007. FIIs pressed substantial sales over those days, in contrast to an intermittent surge in inflows in February 2007; as a result, the Sensex, which closed at 14,091 on January 31, closed at 12,938 on February 28. The market continued to reel under selling pressure on March 5, 2007, taking its cue from weak global markets and heavy FII sales, and after a fall of over 400 points all the indices were in the red. On April 24, the Sensex again crossed the 14K mark and was trading at 14,150.18, having gained 221.85 points or 1.59%; the mid-cap and small-cap indices were rather subdued, and the Sensex finally closed at 13,872. May and June ended with month-end figures of 14,544 and 14,651 respectively. The benchmark BSE 30-share Sensitive Index (Sensex) breached the 15,000 mark for the first time intra-day on Friday, July 6, 2007, reaching a record high of 15,007.22 before closing at 14,964.12; despite weak global cues, Indian stocks were in great demand, especially auto, pharma, IT and metals stocks. The Sensex later experienced its second-biggest ever fall, closing down 951.03 points, or 6.03%, at 14,809.49 on March 17, 2008, after the Federal Reserve cut its discount rate at an emergency meeting and JPMorgan Chase agreed to buy Bear Stearns for USD 2 a share. Back in 2007, when FIIs were pumping money into the stock market as net buyers of equity worth crores, the Sensex was moving up, up and up on a weekly basis. Many thought that FIIs were playing blind in the Indian stock market. But when FIIs turned net sellers of equity and started booking profits, backed by a massive sell-off of shares in global markets, the Sensex had to go down. As expected, the Sensex plunged by 600 points in early trading on August 16, and most shares were down by 4 to 5 per cent. But very soon the Sensex moved past those gloomy days, and the stock markets rallied sharply on Wednesday, September 19, 2007.

After scaling new heights of 20,000+, the Sensex entered 2008 painting a rosy picture. Trade pundits, brokers and even investors predicted new highs for the year, and they felt their predictions coming true when the Sensex touched the 21,000 mark on January 8. But the rosy picture soon turned gloomy. The skyrocketing Sensex suddenly started heading south and saw the biggest absolute fall in its history, shedding 2,062 points intra-day. It closed at 17,605.35, down 1,408.35 points or 7.4 per cent, having fallen to a low of 16,951.50. The fall was triggered by weakness in global markets, but the impact of the global rout was biggest in India. The market tumbled on account of a broad-based sell-off that emerged in global equity markets, as fears over the solvency of major Western banks rattled stocks in Asia and Europe. After the worst January for Indian equities in the last 20 years, February turned out to be a flat month, with the BSE Sensex down 0.4%. India finished the month as the second-worst emerging market. The underperformance can partly be attributed to the fact that Indian markets had outperformed global markets in the last two months of 2007, so we were seeing the lagged impact of that outperformance. In the shorter term, developments in the US economy and US markets continued to dominate investor sentiment globally, and volatility moved up sharply across most markets. The Bombay Stock Exchange (BSE) Sensex fell 4.44 per cent on Monday, March 31, the last day of the financial quarter, to end the March quarter down 22.9 per cent, its biggest quarterly fall since the June 1992 quarter, as reports of rising inflation and a global economic slowdown dampened market sentiment. Financial stocks led the Sensex slide along with IT. According to market analysts, IT stocks fell on worries about the health of the US economy, since Indian IT firms depend on US clients for a major share of their revenues.
Reasons for the present slowdown (Q1, FY 08-09): The first month of financial year 08-09 proved to be a good one for investors, with the month ending on a positive note. The BSE Sensex gained 10.5% to close at 17,287 points. A combination of firming global markets and technical factors like short covering were the main reasons for the up-move. Though inflation touched a high of 7.57%, against 6.68% in March 2008, as a result of which the RBI hiked the CRR by 50 bps to 8%, the emergence of retail investors was also seen, a fact reinforced by the strong movement in the mid-cap and small-cap indices, which rose 16% and 18% respectively. April, however, was the last month to close positive; since then nobody has seen a stable Sensex. Sometimes it surged by 600+ points, only to plunge by some 800-odd points the very next day, and this story is still continuing. Every prediction, every forecast has failed. The Sensex is dancing to the music of lifetime-high inflation rates, historic crude prices, tightening RBI policies, weak industrial production data, political uncertainties and, of course, the sentiments of domestic investors as well as FIIs. The only relief came in the form of a weakening Indian rupee, which brightened prospects for the IT sector, and most recently the UPA winning the vote of confidence. Presently the index is revolving around the 14,000 level, and no one knows what comes next. The 30-share BSE Sensex fell 117.89 points, or 0.67%, to 17,373.01 on Tuesday, 6 May 2008; the key benchmark indices ended lower as investors resorted to profit booking amid a lack of positive triggers in the market. In May, Indian inflation stood at 8.2%. The market declined sharply as a hike in fuel prices of about 10%, announced by the Union government on Wednesday, 4 June 2008, raised the possibility of a surge in inflation to double-digit levels. The BSE Sensex declined 843.39 points, or 5.14%, to 15,572.18 in the week ended 6 June 2008.
The S&P CNX Nifty fell 242.3 points, or 4.97%, to 4,627.80 in the same week. On 6 June 2008, the local benchmark indices underperformed their global peers, hit by rumours that the Reserve Bank of India (RBI) might hike the cash reserve ratio (CRR) or interest rates later in the day to tame runaway inflation; the 30-share BSE Sensex declined 197.54 points, or 1.25%, to settle at 15,572.18. An 800+ point surge was experienced in the market on the day following the UPA winning the vote of confidence, but the very next day the market could not maintain the momentum, and since then it has been in the doldrums. Presently, we have seen the market plunging after the RBI announced further hikes in the repo rate as well as the CRR, both raised to 9%. The serial blasts at Ahmedabad and Bangalore have added to the worries and reinforced the negative sentiment. Above all, there is no positive trigger in sight that could dilute the flow of negative news.

Three sets of assumptions imply an efficient capital market:

- An efficient market requires a large number of competing, profit-maximizing participants, each analyzing and valuing securities independently of the others.
- New information regarding securities comes to the market in a random fashion, and the timing of one announcement is generally independent of others.
- The competing investors attempt to adjust security prices rapidly to reflect the effect of new information. Although the price adjustment may be imperfect, it is unbiased.

This means that sometimes the market will over-adjust or under-adjust, but an investor cannot predict which will occur at any given time. If we believe that the efficient market hypothesis is a valid proposition, then current asset prices should reflect all generally available information. The efficient market hypothesis implies that, since market prices reflect all available information, including information about the future, the only difference between the prices Pt and Pt+1 arises from events that we cannot possibly predict, i.e. random events. Hence, in an efficient market, stock prices can be statistically tested for the random walk hypothesis. The early survey of Fama (1965) concluded that the stock market was efficient. Fama (1965) analyzed the distribution of a large data set and showed that the empirical evidence seems to confirm the random walk hypothesis: a series of price changes has no memory. The main theoretical explanation behind this observation is the efficient market hypothesis. The EMH received a lot of empirical support in the academic literature during the seventies and eighties. This line of thought has always been received with a lot of skepticism in the professional community, which led to the use of charts and technical-analysis rules as trading strategies in markets. Professionals have always claimed that classical statistical tests are mainly linear and therefore unable to capture the complex patterns that price changes exhibit. Time series forecasting is an important research area in several domains. Traditionally, forecasting research and practice have been dominated by statistical methods. As we get to know more about the dynamic nature of financial markets, the weakness of traditional methods becomes apparent.
In the last few years, research has focused on understanding the nature of financial markets before applying methods of forecasting in domains including stock markets, financial indices, bonds, currencies and varying types of investments.
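The random walk idea above can be illustrated with a short simulation. This is only a sketch on synthetic data, not a test of any particular market: under the random walk hypothesis, successive price changes should be serially uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a random-walk log-price series: p_t = p_{t-1} + e_t.
returns = rng.normal(loc=0.0, scale=0.01, size=10_000)
prices = 100 * np.exp(np.cumsum(returns))

# Under the random walk hypothesis, successive price changes are
# serially uncorrelated, so this estimate should sit near zero.
lag1_corr = np.corrcoef(returns[:-1], returns[1:])[0, 1]
print(f"lag-1 autocorrelation of returns: {lag1_corr:+.4f}")
```

A series with memory would show this autocorrelation (or correlations at longer lags) departing systematically from zero, which is precisely what the long memory tests discussed below look for.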

Besides heteroskedasticity and long-range dependence, a long memory process has certain other unique properties. Mandelbrot and Wallis (1969) and Mandelbrot (1972) showed that a long-range dependence process can manifest itself as a highly non-Gaussian time series with large skewness and kurtosis, and can carry non-periodic cycles. A long memory process can also allow conditional heteroskedasticity (Fung et al. 1994), which could be the explanation of the non-periodic cycles. A long memory model therefore seems more flexible than an ARCH model in terms of capturing irregular behavior. I have tried to test the long memory of the Indian stock market using the CNX NIFTY as the market proxy. For this purpose I have chosen two tests which are considered robust for detecting the long memory component of markets. These two models can complement each other and allow a comparison of the robustness of the results.

Experts on financial markets and economists have undertaken a good amount of research work on both the random walk and efficient market hypotheses. However, long memory models are relatively new to applied economics. Though their origin dates back to Mandelbrot's (1969) work, it was not until the 1980s that researchers began to apply rescaled range analysis, one of the tools of long memory theory, to financial markets and macroeconomic prices. In 1991, Lo modified the classical R/S method. Based on Beran (1994, pp. 41-66), a stationary process with long memory has the following qualitative features:

- Certain persistence exists. In some periods the observations tend to stay at high levels; in other periods, they tend to stay at low levels.
- During short time periods, there seem to be periodic cycles. However, looking through the whole process, no apparent periodic cycles can be identified.
- Overall, the process looks stationary.

The primary objective of this study is to investigate whether price behavior in the Indian stock market can be characterized by long memory models. It is not hard to find evidence to argue that price series with a random appearance might follow non-linear dynamics, but it is difficult to say what kind of non-linear dynamics. Another commonly used stochastic model, ARCH, and its variants share similar symptoms with long memory models, such as non-normality and heteroscedasticity, but they have totally different generating mechanisms and implications. A time series with the ARCH property typically has two components, a conditional mean and a conditional variance function; the non-linearity of the series comes from the non-linearity of the conditional variance. An ARCH model that fits the data well can improve the prediction of the variances of prices but not the price itself (Bera and Higgins 1995). A long memory model is a single mean equation (system) with a flexible structure; it represents short as well as long memory simultaneously. The available literature covers both developed and emerging markets, though major work has been undertaken for developed markets like the US. The unavailability of reliable data may be one of the important reasons why very few studies have been undertaken in emerging markets. For the Indian financial market, very few studies have tested the long memory component. Some important work has been done by Barman and Madhusoodan (1993), Thomas (1995) and Basu & Morey (1998) covering the Indian market. Basu & Morey used the Variance Ratio Test methodology devised by Lo and MacKinlay to test the long memory of the Indian stock market. The current proposal is to study the Indian stock market with respect to its long memory using stock market returns during the last decade, during which the stock market has gone through the liberalization process and has also, on a few occasions, been subject to extreme volatility for many reasons.
The study attempts to examine the efficient market hypothesis under Indian conditions by implementing two important techniques that are robust to time-varying volatility. The study is based on the idea that variance keeps changing over time; hence a test like the Variance Ratio test helps to test the random walk theory in stock prices while remaining robust to such changes. The potential presence of stochastic long memory in financial asset returns has been an important subject of both theoretical and empirical research. If assets display long memory, or long-term dependence, they exhibit significant autocorrelation between observations widely separated in time. Since the series realizations are not independent over time, realizations from the remote past can help predict future returns. Persistence in share returns has a special claim on the attention of investors, because any predictable trend in returns should be readily exploitable by an appropriate strategy by market participants. During the last decade, the Indian financial system has been subjected to substantial reforms with far-reaching consequences. This reform process has helped bring a dramatic improvement in the transparency of financial markets, including the stock market. The regulatory changes that have taken place during the last decade of financial sector reforms lead us to believe that financial markets have become more efficient with respect to the price discovery mechanism and have helped the market grow exponentially. The country has also experienced a mild contagion effect from financial crises in international markets and successfully sailed through the period of the Asian crisis without significantly jeopardizing the interests of the domestic economy. There have been significant changes in the regulations for the smooth and efficient functioning of the capital market in the country.
During the last decade we have seen a cleansing of the stock market system by market regulators, and the emergence of the National Stock Exchange of India has greatly helped the system achieve its present level of transparency and efficiency. The market has undergone substantial change due to the introduction of hedging products like futures and options. Risk management systems have been changing to keep pace with the changing scenario. Other reforms, in the form of deregulation of interest rates, tax reforms, banking sector reforms, reforms in the external sector, etc., have also helped market participants value assets according to their intrinsic values. Liquidity has greatly increased as the market's reach has extended to far-away villages, bringing investors together. The development of a large order book in the stock market has made the pricing of stocks more accurate and efficient, and has also brought down the bid/ask spread, benefiting the investor community as a whole. International investors' access to the domestic market has also helped in increasing liquidity. All this has helped in better dissemination of information and hence has possibly increased the level of efficiency in asset prices. The level of such efficiency in prices needs to be tested with the various models that exist in the literature. The earlier work done on the Indian market (Basu and Morey (1998), Barman and Madhusoodan (1993), Thomas (1995)) has a major component from the pre-reforms period. The reform process has led to a regime shift, and hence it is necessary to test the market with market data from after the introduction of financial sector reforms.

Helms et al. (1984) applied rescaled range analysis to detect the existence of long memory in the futures prices of the soybean complex. With a Hurst exponent in the range of 0.5 to 1 indicating long memory, the authors found that the Hurst exponent ranged from 0.558 to 0.71 for daily prices of two futures contracts in 1976 and from 0.581 to 0.67 for intra-day prices of five contracts in 1977-78. Fung and Lo's (1993) long memory study analyzed the prices of two interest rate futures markets, Eurodollars and Treasury Bills; the results from the classical R/S analysis and Lo's (1991) modified R/S analysis provide no evidence of long memory and support the weak-form EMH. Peters (1994) notes that most financial markets are not Gaussian in nature and tend to have sharper peaks and fat tails, a phenomenon well known in practice. One of the key observations made by Peters (1994) is that most financial markets have a long memory: what happens today affects the future forever. One strand of my motivation comes from Peters' (1994) Fractal Market Hypothesis. Long memory analyses have been conducted for stock prices (Greene and Fielitz (1977), Aydogan and Booth (1988), Lo (1991), Cheung, Lai and Lai (1993), Cheung and Lai (1995), Barkoulas and Baum (1997)), spot and futures currency rates (Booth, Kaen and Koveos (1982a), Cheung (1993a), Cheung and Lai (1993), Bhar (1994), Fang, Lai and Lai (1994), Barkoulas, Labys and Onochie (1997a)), gold prices (Booth, Kaen and Koveos (1982b), Cheung and Lai (1993)), international spot commodity prices (Barkoulas, Labys and Onochie (1997b)), commodity and stock index futures (Helms, Kaen and Koveos (1984), Barkoulas, Labys and Onochie (1997a)), inflation rates (Scacciavillani (1994), Hassler and Wolters (1995)), and spot and forward metal prices (Fraser and MacDonald (1992)). Fung et al. (1994) considered intraday stock index futures and tested for long memory using variance ratio, R/S and ARFIMA models.
Their analyses concluded that no long memory exists in the data. Across the literature the results are mixed, but all the authors agree that the identification of long memory is significant in at least two senses: (a) the time span and strength of long memory are an important input for investment decisions regarding investment horizons and the composition of portfolios; and (b) the prediction of price movements can be improved. Against this background, it has become important to test for the existence of long memory in the Indian market, using stock market data for the last decade, during which substantial regulatory changes have taken place, market practices have changed dramatically, and various hedging products have been introduced to improve risk management.

The procedures for collecting and transforming data affect any serious statistical modeling. The daily closing values of the NIFTY index for the period from July 1990 to November 2001 are considered for the study. From July 1990 to October 1995, the NIFTY values used here are the simulated values maintained by IISL (a subsidiary of NSEIL which looks after NSEIL's index products). From November 1995 to November 2001, the actual closing values of the NIFTY have been taken for the purpose of the study. The index consists of underlying stocks whose closing prices determine the closing values of the NIFTY. But using daily prices often encounters one problem: limits on daily price changes, based on the closing price of the previous day. The series are therefore truncated, and that might distort non-linear modeling. Owing to the earlier provision of price bands, the stocks many times hit a circuit breaker, and hence the series gets distorted. What we have today is not price bands on the individual stocks, but a band on the index. However, the analysis of daily price data is necessary to understand any findings. Returns have been calculated for various time lags, namely 1 day, 14 days, 30 days, 90 days, 180 days, 270 days, 360 days, 720 days and 1800 days, to understand to what extent the long memory process exists, if it exists at all.
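A sketch of how such multi-horizon log returns might be computed from a series of closing values; the `close` array here is synthetic, standing in for the NIFTY closing prices, which are not reproduced in this document:

```python
import numpy as np

def lagged_log_returns(close, lag):
    """k-period log return: r_t(k) = ln(P_t) - ln(P_{t-k})."""
    log_p = np.log(np.asarray(close, dtype=float))
    return log_p[lag:] - log_p[:-lag]

# Illustrative synthetic closing series (the study itself used NIFTY closes).
rng = np.random.default_rng(0)
close = 1000 * np.exp(np.cumsum(rng.normal(0.0005, 0.015, 3000)))

for lag in (1, 14, 30, 90, 180, 270, 360, 720, 1800):
    r = lagged_log_returns(close, lag)
    print(f"{lag:4d}-day returns: n={r.size}")
```

Note that the longer horizons leave far fewer observations, which is one reason results at the 720- and 1800-day lags should be interpreted with caution.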

Sharpe (1970) notes that "normal distributions assign very little likelihood to the occurrence of really extreme values. But such values occur quite often." Turner and Weigel (1990) (as quoted in Peters, 1996), in an extensive study of volatility using S&P index returns from 1928 through 1990, found that "daily return distributions for the Dow Jones and S&P are negatively skewed and contain a larger frequency of returns around the mean interspaced with infrequent very large and very small returns as compared to a normal distribution." While analyzing the CNX NIFTY returns data for the period from July 1990 to November 2001, it was found that the tails were a bit fatter and, more significantly, the peak around the mean was higher than predicted by the normal distribution. This can be seen from the graph below. The daily returns were normalized so that they have a mean of zero and a standard deviation of one. The most common explanation of the fat tails is that information shows up in infrequent clumps rather than in a smooth and continuous fashion. We can also see that, at both tails, there are incidences of three- and four-sigma events. The daily returns for the NIFTY are negatively skewed and contain a large frequency of returns around the mean. The following table summarizes the findings for NIFTY:
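The normalization and the skewness/kurtosis diagnostics described above can be sketched as follows. The samples are synthetic (a Student's t draw standing in for fat-tailed returns), not the NIFTY data, so only the qualitative contrast with the Gaussian benchmark is meaningful:

```python
import numpy as np

def normalize(returns):
    """Standardize returns to mean 0 and standard deviation 1."""
    r = np.asarray(returns, dtype=float)
    return (r - r.mean()) / r.std()

def skewness(z):
    """Third moment of standardized data (normal => about 0)."""
    return float(np.mean(z ** 3))

def excess_kurtosis(z):
    """Fourth moment of standardized data minus 3 (normal => about 0)."""
    return float(np.mean(z ** 4) - 3.0)

rng = np.random.default_rng(1)
fat = normalize(rng.standard_t(df=6, size=50_000))   # fat-tailed sample
gauss = normalize(rng.standard_normal(50_000))       # Gaussian benchmark

print(f"t(6):   skew={skewness(fat):+.2f}  excess kurtosis={excess_kurtosis(fat):.2f}")
print(f"normal: skew={skewness(gauss):+.2f}  excess kurtosis={excess_kurtosis(gauss):.2f}")
```

A clearly positive excess kurtosis for the fat-tailed sample, against a value near zero for the Gaussian one, is the same signature the study reports for the NIFTY returns.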

If asset prices display long memory, or long-term dependence, they exhibit significant autocorrelation between observations widely separated in time. This implies that what has happened not only in the recent past but also long ago has a bearing on today's market prices, and hence an autocorrelation exists between these observations. Today's risk containment policies in the Indian securities market are built on the basis of historical price behavior. Since the series realizations are not independent over time, realizations from the remote past can help us predict future movements in asset prices. Persistence in share returns has a special claim on the attention of investors, because any predictable trend in returns should be readily exploitable by an appropriate strategy. A number of studies have tested the long memory hypothesis for stock market returns. Peters (1989) used Hurst's rescaled range (R/S) analysis to measure non-periodic cycles in financial data. He concluded that capital market prices do not reflect information immediately, as the efficient market hypothesis assumes, but rather follow a biased random walk that reflects persistence. Using the rescaled range (R/S) method, Greene and Fielitz (1977) also reported evidence of persistence in daily U.S. stock returns series. Barkoulas and Baum used a spectral regression test to examine the long memory of US stocks and found that only a few stocks have long memory. Related research into stock market overreaction has also uncovered evidence that a measure of predictability can be identified ex post in stock returns. Specifically, shares of companies which have performed well in the past subsequently perform less well, while shares which have performed badly in the past usually improve their performance (MacDonald and Power, 1992). However, according to some authors, the classical R/S test is biased toward finding long-term memory too frequently.
Stock market returns may follow biased time paths that standard statistical tests cannot distinguish from random behavior. Rescaled range analysis can be used to detect long-term, non-periodic cycles in stock market returns. If this technique is not applied correctly, however, it can be influenced by short-term biases, leading to the erroneous conclusion that the stock market has long-term memory. Lo (1991) developed a modified R/S method, which addresses some of the drawbacks of the classical R/S method. Using this variant of R/S analysis, Lo finds no evidence to support the presence of long memory in U.S. stock returns. Applying Lo's test, which does not rely on standard regression techniques and is robust to short-term dependence, provides statistical support for the hypothesis that stock market returns follow a random walk (Ambrose, 1993). Using both the modified R/S method and the spectral regression method, Cheung and Lai (1995) find no evidence of persistence in several international stock returns series. Crato (1994) reports similar evidence for the stock returns series of the G-7 countries using exact maximum likelihood estimation. Wright (1999) used an ARFIMA model to test for long memory in emerging markets, including India, and concluded that emerging markets appear to have considerable serial correlation, which stands in contrast to the results for developed markets like the US, where there is little evidence of any serial correlation in stock returns. The primary focus of these studies has been the stochastic long memory behavior of stock returns in major capital markets. In contrast, the long memory behavior in smaller markets has received little attention. Contrary to findings for major capital markets, Barkoulas, Baum and Avalos, in a working paper, found significant and robust evidence of positive long-term persistence in the Greek stock market.

Today, we see an overwhelming response to emerging markets from investors across the world. These markets have provided diversification opportunities to international investors. It must be noted that such markets are very likely to exhibit characteristics different from those observed in developed capital markets, as the market micro-structure is different in emerging markets vis-à-vis developed ones. Biases due to market thinness and nonsynchronous trading should be expected to be more severe in the case of emerging markets. In the case of the Indian stock market, we have seen irregularities in price behavior on many occasions, caused not by true market conditions but by manipulations carried out by a group of greedy market participants to throw the market out of gear. Another important factor that needs to be considered is that these emerging markets have been going through many regulatory changes to improve their efficiency, and they cannot be fully compared with developed, established markets. In this study, I look for evidence of long memory in the Indian capital market. I have used returns data from the National Stock Exchange of India Ltd. to check for the persistence of long memory in daily returns. The Nifty data consists of actual closing values from November 1995 to November 2001, while the data for the period before November 1995 is simulated NIFTY data constructed using the closing prices of the constituent stocks, which might have been obtained from other stock exchanges before NSEIL started operation in November 1994. This dataset was constructed by NSEIL while building the NIFTY, and the same is available with IISL. Two statistical methods have been used for the test: (a) the Variance Ratio Test and (b) the Hurst Exponent (R/S Analysis). These two tests have been selected to find out whether both give the same result, or whether the results differ even when the same dataset is used.

The variance ratio test has been used to test for permanent as well as temporary memory components. Barman & Madhusoodan (1993) carried out this test for the Indian market to look for long memory. The Variance Ratio Test, popularized by Cochrane (1988) and used by MacDonald and Power (1992) among others, can be expressed as follows:

Z(k) = (1/k) * [ Var(Xt(k)) / Var(Xt) ] ……..(1)

where Xt denotes a one-period return, obtained as the first difference of the natural logarithm of the share price, and Xt(k) denotes the k-period return, calculated using the kth difference of the log share price. The possibilities are as follows:

- If the price series follows a random walk, this ratio should equal unity.
- If the series is stationary, the ratio will tend to zero.
- If share prices exhibit mean reversion, Z(k) should lie between zero and one.
- Values of Z(k) above one indicate that a current increase in the value of the share price will be reinforced in the future by further positive increases.
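As an illustration, a minimal Python sketch of the variance ratio in equation (1), using overlapping k-period log returns; the function name and the simulated random-walk series are illustrative assumptions, not part of the study:

```python
import numpy as np

def variance_ratio(prices, k):
    """Variance ratio Z(k) of equation (1): the variance of the k-period
    log return divided by k times the variance of the one-period log return."""
    log_p = np.log(np.asarray(prices, dtype=float))
    x1 = np.diff(log_p)           # one-period returns Xt
    xk = log_p[k:] - log_p[:-k]   # overlapping k-period returns Xt(k)
    return xk.var(ddof=1) / (k * x1.var(ddof=1))

# A pure random walk should give Z(k) close to unity:
rng = np.random.default_rng(0)
walk = np.exp(np.cumsum(rng.normal(0.0, 0.01, 5000)))
print(variance_ratio(walk, 10))
```

For a stationary series the ratio falls toward zero as k grows, while values above one would signal trend reinforcement, matching the cases listed above.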

H, the Hurst exponent, is produced by rescaled range analysis, or R/S analysis, which was established by the hydrologist H E Hurst in 1951, further developed by B Mandelbrot in the 1960s and 1970s, and applied to economic price analysis by Booth et al (1982), Helms et al (1982), Peters (1989) and others in the 1980s. For a given time series, the Hurst exponent measures long-term, non-periodic dependence and indicates the average duration the dependence may last. Standard autocorrelation tests detect long-term dependency in stock market prices only if the dependent behavior is periodic and the periodicity is consistent over time. Fundamental historical changes, however, alter the period of cycles. Mandelbrot (1972) proposes a statistic to measure the degree of long-term dependency, in particular "non-periodic cycles". The rescaled range, or R/S statistic, is formed by measuring the range between the maximum and minimum distances that the cumulative sum of a stochastic random variable has strayed from its mean and then dividing this by its standard deviation. An unusually small R/S measure would be consistent with mean reversion, for instance, while an unusually large one would be consistent with return persistence. To construct this statistic, consider a sample of returns X1, X2, ….., Xn and let X̄n denote the sample mean.

Qn = (1/σn) [ max(1≤k≤n) Σj=1..k (Xj − X̄n) − min(1≤k≤n) Σj=1..k (Xj − X̄n) ] …….(2)

In his original work, Mandelbrot suggested using the sample standard deviation estimator for the scaling factor σn. I have used this as my second technique to judge persistence in stock market returns. Peters (1994) has discussed this method in a simpler and neater way. Take a series of data X1, X2, ….., Xn, and let X̄ denote the sample mean and σn the standard deviation. The rescaled range is calculated by first rescaling or "normalizing" the data by subtracting the sample mean:

Zr = Xr − X̄, r = 1, 2, ……, n ……… (3)

The resulting series, Z, now has a mean of zero. The next step creates a cumulative time series Y:

Y1 = Z1; Yr = Yr−1 + Zr, r = 2, 3, ……, n ……….(4)

Note that by definition the last value of Y (Y n) will always be zero because Z has a mean of zero.

The adjusted range, R n is the maximum minus the minimum value of the Y r:

R n = max (Y 1 ,………, Y n ) - min (Y 1 ,………, Y n ) ……..(5)

The subscript n for Rn signifies that this is the adjusted range for X1, X2, ….., Xn. Because Y has been adjusted to a mean of zero, the maximum value of Y will always be greater than or equal to zero, and the minimum will always be less than or equal to zero. Hence, the adjusted range Rn will always be non-negative. However, this equation applies only to time series in Brownian motion, that is, series with mean zero and variance equal to one. To apply it to any time series (like stock returns), we need to generalize the equation. Hurst found that the following was a more general form of the equation:

R/S = c * n^H ……….(6)

The R/S value is referred to as the rescaled range because it has mean zero and is expressed in terms of the local standard deviation. In general, the R/S value scales as we increase the time increment n by a "power-law" value equal to H, generally called the Hurst exponent. The procedure used for the calculations is listed below (Peters, 1994; pp. 62-63).

1. Begin with a time series of length M. Convert this into a time series of length N = M − 1 of logarithmic ratios: Ni = log(Mi+1 / Mi), i = 1, 2, 3, ……, (M − 1).

2. Divide this time period into A contiguous sub-periods of length n, such that A*n = N. Label each sub-period Ia, with a = 1, 2, 3, …, A. Each element in Ia is labeled Nk,a, such that k = 1, 2, 3, …, n. For each Ia of length n, the average value is defined as: ea = (1/n) * Σk=1..n Nk,a, where ea is the average value of the Nk,a contained in sub-period Ia of length n.

3. The time series of accumulated departures Xk,a from the mean value for each sub-period Ia is defined as: Xk,a = Σi=1..k (Ni,a − ea), where k = 1, 2, 3, …, n.

4. The range is defined as the maximum minus the minimum value of Xk,a within each sub-period Ia: RIa = max(Xk,a) − min(Xk,a), where 1 ≤ k ≤ n.

5. The sample standard deviation is calculated for each sub-period Ia: SIa = sqrt( (1/n) * Σk=1..n (Nk,a − ea)² ).

6. Each range RIa is now normalized by dividing it by the corresponding SIa. Therefore, the rescaled range for each sub-period Ia is equal to RIa / SIa. From step 2, we had A contiguous sub-periods of length n. Therefore, the average R/S value for length n is defined as: (R/S)n = (1/A) * Σa=1..A (RIa / SIa).

7. The length n is increased to the next higher value for which (M − 1)/n is an integer. We use values of n that include the beginning and ending points of the time series, and steps 1 through 6 are repeated until n = (M − 1)/2.
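The seven steps above can be sketched in Python as follows. This is a minimal illustration, not the study's actual code: the function names, the choice of lags, and the simulated random-walk series are my own assumptions. For simplicity the sketch estimates H as the slope of log(R/S) against log(n) over a few fixed lags rather than over every n that divides (M − 1).

```python
import numpy as np

def rescaled_range(returns, n):
    """Average R/S statistic for sub-period length n (steps 2-6)."""
    A = len(returns) // n                      # number of contiguous sub-periods
    rs_values = []
    for a in range(A):
        sub = returns[a * n:(a + 1) * n]
        e_a = sub.mean()                       # step 2: sub-period mean
        x = np.cumsum(sub - e_a)               # step 3: accumulated departures
        r = x.max() - x.min()                  # step 4: range
        s = sub.std()                          # step 5: standard deviation
        if s > 0:
            rs_values.append(r / s)            # step 6: rescale
    return np.mean(rs_values)

def hurst_exponent(prices, lags=(10, 20, 50, 100, 250)):
    """Regress log(R/S) on log(n); by equation (6) the slope is H."""
    returns = np.diff(np.log(np.asarray(prices, dtype=float)))  # step 1
    log_n = np.log(list(lags))
    log_rs = np.log([rescaled_range(returns, n) for n in lags])
    h, _ = np.polyfit(log_n, log_rs, 1)
    return h

rng = np.random.default_rng(1)
walk = np.exp(np.cumsum(rng.normal(0.0, 0.01, 5000)))
# For a pure random walk the estimate should be near 0.5
# (classical R/S has a small upward bias in short samples):
print(hurst_exponent(walk))
```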

4.5(f) Hurst's Empirical Law:

Hurst (1951) also gave a formula for estimating the value of H from a single R/S value (as quoted in Peters, 1996):

H = log (R/S) / log (n/2) …….(7)

where n = the number of observations. This equation assumes that the constant c of the above equation is equal to 0.5. Feder (1988) shows that the empirical law tends to overstate H when it is greater than 0.70 and understate it when it is less than or equal to 0.40. However, for short data sets, where regression is not possible, the empirical law can be used as a reasonable estimate. It is clear that H is a parameter that relates the mean R/S values for subsamples of equal length of the series to the number of observations within each subsample. H is always greater than 0. The method discussed above becomes clearer by looking at the calculations done for the NIFTY data. We can use daily NIFTY closing data for the last decade (i.e. 1990 to 2001), calculate daily logarithmic returns, and use the same data to estimate H for 8 time periods (with N = 2, 30, 90, 180, 270, 360, 720 & 1800), since plotting a regression line of log(R/S) against log(N) is very difficult here.
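Equation (7) is a one-line computation; the sketch below plugs in a hypothetical R/S value purely for illustration (the numbers are not from the NIFTY data):

```python
import math

def hurst_empirical(rs, n):
    """Hurst's empirical law, equation (7): a single-point estimate of H,
    which assumes the constant c in R/S = c * n^H equals 0.5."""
    return math.log(rs) / math.log(n / 2)

# Hypothetical example: an R/S value of 15 observed at n = 250
print(hurst_empirical(15.0, 250))  # ≈ 0.561
```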

There are 3 distinct classifications for the Hurst exponent (H):

1. H = 0.5

H equal to 0.5 denotes a random series; the process is white noise.

2. 0<=H < 0.5

This type of system is anti-persistent or mean reverting. That means if the system has been up in the previous period, it is likely to be down in the next period. The strength of the anti-persistent behavior depends on how close H is to 0.

3. 0.5 < H <1.0

Here we have a persistent or trend-reinforcing series; a long memory structure exists. That means, if the series has been up (down) in the last period, the chances are that it will continue to be positive (negative) in the next period. Trends are apparent. The strength of the trend-reinforcing behavior, or persistence, increases as H approaches 1. The closer H is to 0.5, the noisier the series will be and the less defined the trend. Persistent series are fractional Brownian motion, or biased random walks. The strength of the bias depends on how far H is above 0.50. The greatest advantage of R/S analysis is that the measure is independent of the distribution assumption for a given series. The robustness of the results remains unaffected regardless of whether the distribution is normal or non-normal. The dependence the Hurst exponent captures is the nonlinear relationship inherent in the structure of the series (Peters (1991)).
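The three regimes above can be captured in a small helper. The tolerance eps is my own assumption, since an estimated H will in practice never be exactly 0.5:

```python
def classify_hurst(h, eps=0.01):
    """Map an estimated Hurst exponent to the three regimes described above.
    eps is an assumed tolerance band around 0.5 for calling a series random."""
    if abs(h - 0.5) <= eps:
        return "random (white noise)"
    if h < 0.5:
        return "anti-persistent (mean reverting)"
    return "persistent (trend reinforcing)"

print(classify_hurst(0.62))  # prints "persistent (trend reinforcing)"
```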

None of the values for the time lags is equal to 0.50, indicating that the Indian stock market cannot be said to follow a random walk in so far as daily returns are concerned when we use NIFTY as the market proxy. This shows that there is a definite possibility of persistence in the NIFTY returns, but the values for time lags above 360 days are very close to 0.50, leading us to believe that there is enough noise in the series and the trend is not perfectly established. However, for shorter periods (up to a 9-month time lag), the values are reasonably higher than 0.5, indicating a definite possibility of persistence.

After going through all the analysis of the stock market over the last two years, we can say that the market touched its peak at 21,000 but then crashed badly. It is now hovering around the 14,000-16,000 mark. Though the Sensex is a barometer, and after seeing such fluctuations one could be afraid of investing, we can still say that people can play safe by investing in blue-chips and undervalued shares. We can summarize 2007 as a year which redefined the resistance levels of the Sensex. Strong economic data, heavy inflows of funds from FIIs towards the close of the previous calendar year, and a decent to highly encouraging surge in the earnings of top-notch companies all pointed to a rosy 2007. The rupee's rise against the US dollar, the regulator's decision to restrict investments made through participatory notes, rising crude oil prices, the sub-prime mortgage woes in the US, concerns over a slowing US economy and the Left parties' opposition to the Indo-US nuclear pact did halt the market's progress at times. But the inherent strength of the Indian economy, fairly buoyant results quarter after quarter, the various sops and subsidies announced by the government, and sustained efforts made by the market regulator to keep investor confidence in the system alive kept the momentum going. Presently, the hide and seek being played by crude prices, inflation and the RBI is affecting our market to a great extent. Adding to the worries are the global slowdown, political instability, serial bomb blasts, negative public sentiment, and so on. So even after such downturns, we can be hopeful for a positive market.

The normality tests on the daily NIFTY returns for the last decade indicate the need to explore the application of non-linear modeling techniques in the capital market. But we come to see that the results from the persistence tests are split.
The variance ratio test clearly implies that there does not exist any short-term or long memory in the market returns as given by the NIFTY returns data for more than 10 years, and it shows a clear pattern of mean reversion. However, the R/S analysis does give indications of long-term memory for all time lags, though for higher time lags of more than two years the long memory comes with noise, as the values are close to 0.5. In either case, the analysis shows that the movement of stock prices does not follow a random walk. However, a more rigorous analysis needs to be performed, perhaps by using Lo's modified R/S analysis. Also, for a foolproof analysis, the data used should cover a period longer than just one decade. Studies need to be undertaken on individual stocks to understand whether the component stocks of the proxy also follow the same pattern, and, in case they do not follow the trend given by either the variance ratio test or the R/S analysis for NIFTY, what the possible explanation could be. These studies need to be conducted to understand whether the present risk containment system applicable in capital market margining is in line with the long memory of the capital market. If there is not enough proof to support long memory, then we can discard the path of using historical volatility as a margining principle and go for something else.
