New Wave of Chaotic Market Movement

The presence of noise in stock market trading shows the inadequacy of the linear testing models that led to the random walk model and EMT. Noise theory shows that the information-processing properties of public capital markets are so bluntly powerful that fundamental information about underlying business values is crowded out by extraneous information, or noise. There is a feedback system in which individuals overreact to information or withhold action in the face of information.

Feedback processes are the hallmarks of a nonlinear system. They indicate a nonproportional relationship between a cause and its effect (e.g., between news and price changes). This insight of noise theory has not been recognized for its full power. The distinction between linear and nonlinear is fundamental to an understanding of stock market behavior and how investors and managers should think about markets and market prices.

Linearity means proportionality: A change in one variable produces a proportionate change in another specified variable. What makes the CAPM linear, for example, is its assertion that the expected risk premium of a stock varies in direct proportion to β.
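That proportionality is easy to see in code. The sketch below (a hypothetical `expected_return` helper with illustrative numbers, not figures from the text) shows that doubling β exactly doubles the expected risk premium:

```python
def expected_return(beta, risk_free=0.03, market_return=0.08):
    """CAPM: expected return = risk-free rate + beta * market risk premium."""
    risk_premium = market_return - risk_free  # illustrative market premium of 5%
    return risk_free + beta * risk_premium

# Linearity in action: the premium over the risk-free rate
# scales in direct proportion to beta.
low = expected_return(0.5)   # 0.03 + 0.5 * 0.05 = 0.055
high = expected_return(1.0)  # 0.03 + 1.0 * 0.05 = 0.08
```

A stock with twice the β earns exactly twice the expected premium over the risk-free rate; that straight-line relationship is what makes the model linear.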

EMT is linear in two ways. First, the statistical models underlying the weak form are simple linear regression analyses; correlation coefficients are statements about how variables are related on a straight-line basis over time. In other words, the time series of data is tested for correlation by fitting a straight line to the data and then calculating the correlation coefficient.
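Such a weak-form test can be sketched as follows: pair each period's return with the next period's return, fit a straight line through those pairs, and report the correlation coefficient. The `lag1_correlation` helper below is a hypothetical illustration in pure Python, not the original researchers' code:

```python
import random

def lag1_correlation(series):
    """Pearson correlation between consecutive observations --
    equivalent to fitting a straight line through (x_t, x_{t+1}) pairs."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(0)
returns = [random.gauss(0, 1) for _ in range(5000)]
r = lag1_correlation(returns)  # near zero for independent returns
```

For independent (random-walk-like) returns the coefficient hovers near zero; that near-zero result is precisely what early weak-form studies reported. Note the limitation the text goes on to describe: this test can only detect straight-line dependence.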

Second, the semistrong form of EMT is linear because it defines a proportional relationship between information changes and price changes. In particular, the semistrong form says that information is swiftly incorporated into prices without bias. In other words, there is a proportional relationship between information changes about business values and resulting price changes in the financial asset (stocks) representing those businesses.

In contrast, and a bit simplistically, nonlinearity means the absence of proportionality: Changes in one variable produce changes in another, but disproportionately rather than proportionally. To take a prosaic example, the 1-gram straw that breaks the 1-ton camel’s back is nonlinear because the cause is utterly disproportionate to the effect.

Volatile stock prices and roaring or crashing markets are often attributable to an incremental bit of information piled on top of cumulated bits of information. If one company announces that its earnings aren’t going to be as strong as people had hoped, its stock price may take a haircut, but the market overall may not blush. But as the weeks go by and a few more companies in that sector say the same thing, Wall Street gets rattled. The shares of all the stocks in that sector can suddenly get punished, and the pounding can spread across the market as a whole. At some point, the creepy Wall Street saying that there is never only one cockroach starts to resonate.

The fact that the market may react slowly or may overreact to bits of new information is of course what noise theory teaches and explains. The distinction between nonlinear and linear systems goes well beyond noise theory, however, because noise theory itself is constrained by the efficiency paradigm. Nonlinear dynamics and chaos theory break from that context and imply a fundamentally different understanding of public capital market phenomena with a broader perspective on investor and market behavior.

There is no a priori reason to believe that public capital markets are linear systems rather than nonlinear systems. Therefore, one of the first questions that must be considered in understanding such markets is whether they follow linear or nonlinear processes. More sophisticated techniques than were available when the random walk model was first developed are now used to investigate precisely that question.

One reason such techniques were unavailable in the 1960s, 1970s, and even early 1980s was the need for powerful computer systems that not only could process data more swiftly but also could go beyond the simplified mathematical models of straight lines and investigate the curvatures of multidimensional data streams. Armed with such resources, researchers now start with the consensus view that empirical research shows that a random walk describes stock prices fairly well, subject to some anomalies. Then they dig deeper.

One tool for the digging actually dates to the early part of the twentieth century. It was developed by the hydrologist H. E. Hurst when he was working on the Nile River Dam project.2 Hurst had to develop reservoir discharge policies to maintain reservoir water levels in the light of rainfall patterns.

To understand how the reservoir system worked, Hurst would record its water level each day at noon and calculate the range (essentially the difference between the highest and lowest levels relative to the average level). If the range grew as the square root of the number of observations recorded—the rate a purely random process would produce—one could conclude that the reservoir system was a random one. Otherwise, it was nonrandom and exhibited some pattern; knowing which was the case would enable the hydrologist to set the reservoir’s discharge policies.
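Hurst's statistic, known as the rescaled range (R/S), can be sketched for a single window of observations: accumulate the deviations of each level from the window's mean, take the spread between the highest and lowest accumulated values, and divide by the standard deviation. A minimal sketch (the function name is ours, not Hurst's):

```python
def rescaled_range(levels):
    """Hurst's rescaled range R/S for one window of observations."""
    n = len(levels)
    mean = sum(levels) / n
    # accumulate departures from the mean level
    cum, dev = 0.0, []
    for x in levels:
        cum += x - mean
        dev.append(cum)
    r = max(dev) - min(dev)  # range of the cumulative deviations
    s = (sum((x - mean) ** 2 for x in levels) / n) ** 0.5  # standard deviation
    return r / s

rs = rescaled_range([1, 2, 3, 4])  # ≈ 1.789 for this toy series
```

Dividing the range by the standard deviation is what makes the statistic comparable across windows of different sizes; how R/S grows with window size is the question the H exponent answers next.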

Hurst developed a simple tool called the H exponent to determine whether the range increased as would a random process or whether it exhibited a more patterned behavior. Skipping the mathematical details, if a system’s H equals .50, then the system behaves according to a random walk. The probability that any particular move will follow any other move is 50-50 and thus completely up to chance.

If H is less than .50, the system is mean reverting. That means that if the system has moved up for a number of observations, it is more likely to move down over the next number of observations, and vice versa. Conversely, if H is greater than .50, the system is correlative or persistent: if the system has moved up for a number of observations, it is more likely to continue to move up over the next number of observations, and vice versa. If H is .60, for example, the probability that a positive move will follow a positive move is 60%.
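The estimation itself can be sketched with the classical rescaled-range method: compute the average R/S at several window sizes and take the slope of log(R/S) against log(window size) as the H estimate. Everything below (window sizes, series lengths, the AR(1) persistence parameter) is illustrative, and small-sample R/S estimates are known to run slightly above .50 even for random data:

```python
import math
import random

def hurst_exponent(series, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate H as the slope of log(R/S) against log(window size)."""
    xs, ys = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(series) - n + 1, n):
            window = series[start:start + n]
            mean = sum(window) / n
            cum, dev = 0.0, []
            for v in window:              # cumulative deviations from the mean
                cum += v - mean
                dev.append(cum)
            r = max(dev) - min(dev)       # range of cumulative deviations
            s = (sum((v - mean) ** 2 for v in window) / n) ** 0.5
            if s > 0:
                rs_values.append(r / s)   # rescaled range for this window
        xs.append(math.log(n))
        ys.append(math.log(sum(rs_values) / len(rs_values)))
    # least-squares slope of log(R/S) on log(n) is the H estimate
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

random.seed(1)
iid = [random.gauss(0, 1) for _ in range(4096)]  # uncorrelated moves
x, persistent = 0.0, []
for _ in range(4096):                            # positively correlated moves
    x = 0.9 * x + random.gauss(0, 1)
    persistent.append(x)

h_iid = hurst_exponent(iid)          # roughly .50 (biased a bit high in small samples)
h_pers = hurst_exponent(persistent)  # noticeably above .50
```

The uncorrelated series lands near .50, while the persistent series scores distinctly higher, which is the pattern Peters later looked for in stock index data.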

H may change over time. For example, H may be in the .70s over some period and then drop to near .50 and subsequently increase again. The number of observations (or time periods) over which H is sustained at other than .50 (before returning to near .50) is a measure of the average cycle length of the system.

In the case where H exceeds .50 for a sustained period, the length of that period is a measure of the system’s memory—the extent to which past events influence present and future events. In the context of investment analysis, it measures the period over which an investor can use information to his or her advantage.

During the 1990s, some market analysts figured out that the H exponent can also be applied to markets to determine whether they, too, are random. One of them even published his results: Edgar Peters, a money manager in Boston, applied it to the Standard & Poor’s 500 Index (the S&P 500), using monthly data over the 38-year period from January 1950 through July 1988.

Peters found that H was well above .50 for average periods of approximately four years, indicating a strong persistent element in the S&P 500 rather than a random process. Beyond average periods of four years, however, H was not significantly different from .50 (it was .52 ± .02).

So Peters concluded that the S&P 500 begins to lose memory of events after four years. The S&P 500 thus is not random, and events today continue to affect price changes for up to an average of four years.