Tuesday 3 July 2012

HFT, flash crashes, true liquidity and time redefinition


 

One of my thesis students, motivated by the quantitative analysis of market data, stumbled upon the subject of the so-called "flash crash" of May 6, 2010 and the related subject of high-frequency trading (HFT). The Dow Jones plunging by nearly 1,000 points, with roughly USD 1 trillion in market value evaporating in a matter of minutes, only to recover most of it quite quickly? That's exciting! Well, at least it is intriguing...
So I was intrigued myself. In fact, it was good news for some microstructure authors: an event like that to analyze! Market microstructure, a fashionable subject in the mid-90s that lost its hype some years later, had found a genuinely exciting event with which to question the workings of financial markets once again. At the end of the 90s, the story was thought to be over, given the liberalisation of exchanges and the possibility of running a market order-book exchange on a server in your garage (like a certain Apple garage story).
But no, no, no... that was not the end of it. Today we talk about "exchange colocation": traders want to be physically very close to the exchange servers again because, they say, they trade down to the... nanosecond.

Let's go back to our 2010 flash crash. For a practitioner's visual research study of the flash crash with some nice graphs (not for colorblind people), you can take a look at Nanex, which produces graphs at 500 ms intervals. Nanex, an intraday market data provider, is one of the firms mentioned in the SEC report as having "...hypothesized that these delays are due to a manipulative practice called “quote stuffing” in which high volumes of quotes are purposely sent to exchanges in order to create data delays that would afford the firm sending these quotes a trading advantage" (a crude message-rate indicator of this practice is sketched after the quote below). Based on its own analysis, the SEC denies that this was the fundamental cause, but concludes:
Nevertheless, as discussed in the Executive Summary, the events of May 6 clearly demonstrate the importance of data in today’s world of fully-automated trading strategies and systems. The SEC staff will therefore be working closely with the market centers to help ensure the integrity and reliability of their data processes, especially those that involve the publication of trades and quotes to the consolidated tape. In addition, the SEC staff will be working with the market centers in exploring their members’ trading practices to identify any unintentional or potentially abusive or manipulative conduct that may cause such system delays that inhibit the ability of market participants to engage in a fair and orderly process of price discovery.
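To make the quote-stuffing hypothesis a little more tangible, here is a minimal sketch of the kind of crude indicator one could compute from the tape: flag the seconds in which quote traffic dwarfs trade traffic. The event format, the thresholds and the one-second bucketing are my own illustrative assumptions, not anything taken from Nanex or from the SEC report.

```python
from collections import Counter

def quote_burst_flags(events, ratio_threshold=50, min_quotes=1000):
    """Flag seconds where quote traffic dwarfs trade traffic.

    `events` is an iterable of (timestamp_seconds, kind) pairs with kind
    in {"quote", "trade"}. Thresholds are purely illustrative: a burst of
    quotes with few or no trades in the same second drives the ratio up.
    """
    quotes, trades = Counter(), Counter()
    for ts, kind in events:
        bucket = int(ts)
        if kind == "quote":
            quotes[bucket] += 1
        elif kind == "trade":
            trades[bucket] += 1
    flagged = []
    for bucket, q in quotes.items():
        t = trades.get(bucket, 0)
        if q >= min_quotes and q / max(t, 1) >= ratio_threshold:
            flagged.append((bucket, q, t))
    return sorted(flagged)
```

Such a quote-to-trade ratio says nothing about intent, of course; it only highlights where the message load, and hence the risk of data delays, is concentrated.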
Many debates revolve around: (1) who to blame, (2) what safeguards to put in place, (3) what limits to define (with microstructure indicators used as thresholds), or (4) how to come up with a rule, based on a formula, that tries to encompass the drivers used by the HFT algorithms. First of all, writing rules to handcuff algorithms would be an endless story. Second, there is still little emphasis on whether HFT itself is needed at all. The SEC itself acknowledges the data issue and its importance, and whatever the analysis concludes, the situation is quite analogous to a car driving at 200 mph in the dark, without lights or brakes. And instead of asking ourselves "is there any value in driving at 200 mph?", we seem to be saying "well, maybe we should fit xenon lights and ban all cars from driving on new-moon nights". Where are all the studies of the 90s debating the ins and outs of call versus continuous markets?

Another comment in the SEC report states that:
At that time, the number of volatility pauses, also known as Liquidity Replenishment Points (“LRPs”), triggered on the New York Stock Exchange (“NYSE”) in individual equities listed and traded on that exchange began to substantially increase above average levels.
We must be very careful here: if someone is manipulating price behaviour and integrates the triggers of these volatility pauses into his or her (statistical) algorithm, there is probably something to exploit. It is always very dangerous to be extremely liberal about what markets can do, and then to impose ad hoc limits or sanctions based on indicators that may subsequently become part of those very traders' strategies.
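To make the concern concrete, here is a minimal sketch under a deliberately simplified assumption: suppose a pause fires whenever the price strays more than a fixed percentage from a reference price (the real LRP mechanics are more involved than this). Any algorithm aware of such a deterministic rule can watch its remaining headroom in real time, which is exactly the danger described above.

```python
def headroom_to_pause(last_price, reference_price, band_pct=0.01):
    """Distance (in price units) before a simplified 'volatility pause' fires.

    Assumes the pause triggers when price strays more than `band_pct`
    from `reference_price`. Purely illustrative, not the actual NYSE
    rule: the point is only that a published deterministic threshold
    is itself an input any strategy can monitor.
    """
    upper = reference_price * (1 + band_pct)
    lower = reference_price * (1 - band_pct)
    return min(upper - last_price, last_price - lower)
```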

Also, it is quite 101 in financial microstructure that liquidity is a three-dimensional problem: price, volume and time. To be perfectly liquid is to be able to sell any quantity immediately without any price impact. This, of course, is not what we find in the real world, which is why there is a trade-off between these three dimensions. It is also why coming up with a single measure of liquidity is not easy. Amihud & Mendelson have produced a lot of research on this subject since the second half of the 80s. I like to cite their 1987 paper because, even though it is based on time intervals far larger than milliseconds, its conclusion is straightforward: "We conclude that the trading mechanism has a significant effect on stock price behaviour". That probably started the hype around microstructure in the 90s. Strikingly, more than 20 years later, the SEC report mentions that:
This large fundamental trader chose to execute this sell program via an automated execution algorithm (“Sell Algorithm”) that was programmed to feed orders into the June 2010 E-Mini market to target an execution rate set to 9% of the trading volume calculated over the previous minute, but without regard to price or time.
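Translated into code, such participation-rate logic is disarmingly simple, which is precisely the point: it sells faster whenever volume accelerates, with no price or time brake. Below is a minimal sketch of the behaviour described in the quote; the function name, the per-minute scheduling and the cap on the remaining quantity are my own assumptions.

```python
def next_slice(volume_previous_minute, remaining_to_sell, participation_rate=0.09):
    """Quantity to sell over the coming minute.

    Follows the report's description: target a fixed share (9%) of the
    volume traded in the previous minute, capped by what is left to
    sell, with no price or time condition. When selling pressure itself
    inflates volume, the slice grows too.
    """
    return min(participation_rate * volume_previous_minute, remaining_to_sell)
```

Nothing in this rule looks at the price level or at how long the programme has been running, which is exactly what "without regard to price or time" means in practice.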
Another interesting element: as long as time was defined as clock time, we were inclined to look at price- and volume-related measures only. Easley, López de Prado and O'Hara (2011) bring the interesting idea that in HFT, "...trade time rather than clock time is the relevant metric to use in sampling the information set. Trade time can be measured by volume increments,...". This is very interesting. First, on theoretical microstructure grounds, it brings a modern view of what the time dimension represents; even better, it provides a more direct interconnection with the volume dimension. Second, it reminds me of the application of "wavelets" from physics to finance for time-frequency analysis; there is certainly something to capture about the various frequency components. Third, philosophically, we might have a concern related to the "information set". Microstructure is the study of the mechanisms allowing investors to exchange securities or assets based on (1) their willingness to trade for operational reasons, and (2) their information set, whatever that may be. Of course, the underlying idea is that if we better understand price formation, then we can probably identify which properties lead to better "information release" through the trading mechanism. There are therefore a number of motivations behind microstructure, mostly "liquidity" and "price discovery". Now, if we focus on price discovery, we might ask ourselves what the philosophical meaning is of having a trading timeline that is a thousand (if not a million) times faster than the flow of fundamental information available to the market (and, cynically, than the capacity of systems and regulators to follow). Of course, a perfect continuum allows one to trade right after receiving a new piece of information. But in the meantime, it might generate a lot of "noise" or "liquidity traps".
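To see concretely what "trade time measured by volume increments" means, compare it with clock time: instead of sampling the tape every N seconds, emit an observation every time a fixed amount of volume has traded. Here is a minimal sketch of such a volume clock; the bar structure and the bucket size are illustrative assumptions, not the authors' exact procedure.

```python
def volume_bars(trades, bucket_volume):
    """Resample a trade stream on a volume clock.

    `trades` is an iterable of (timestamp, price, size) tuples; a new bar
    is emitted every time roughly `bucket_volume` units have traded, so
    busy periods yield many bars and quiet periods few. Field names and
    the decision not to split a trade across two bars are my own choices.
    """
    bars, bar = [], None
    for ts, price, size in trades:
        if bar is None:
            bar = {"start": ts, "open": price, "high": price,
                   "low": price, "close": price, "volume": 0.0}
        bar["high"] = max(bar["high"], price)
        bar["low"] = min(bar["low"], price)
        bar["close"] = price
        bar["volume"] += size
        if bar["volume"] >= bucket_volume:
            bar["end"] = ts
            bars.append(bar)
            bar = None
    return bars
```

The appeal is that activity, not the wall clock, drives the sampling: a frantic minute during the flash crash yields many observations, a quiet lunchtime hour only a few.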

And finally, the literature remains scarce on the definition of HFT and the description of its properties. A lot of the press picks up and "re-tweets" the conclusion of the TABB Group report of January 2011 that HFT accounts for 77% of transactions in the UK. But as far as I was told, a transaction involves at least two counterparties, and the real purpose of the transaction is defined by a third party who "gave the mandate". So, without this additional information, we can only conclude that HFT, through algorithmic trading, is the technical means of trading. That is why "blaming HFT" per se misses the point: everything depends on what you want to do with it. Yes, if I have information, I want to trade as soon as possible, which is good for price discovery and market efficiency; so if HFT allows me to trade as quickly as possible and dilute my market impact, I am happy with it. But on the other side of the transaction, I might have a market maker who uses HFT to be in-and-out within seconds, and that is a totally different story, even though we may be talking about the same transaction. And a long-only fund has no need to stay in a trade for less than a minute. Golub (2011), referenced below, devotes some reflections to the definition and characterisation of HFT.

In conclusion, the provocative tone of this article aims at asking the right questions before chasing answers. We tend to reuse terms and "re-tweet" stories, but they deserve more precise wording if we really want to make progress on the subject.

References
  • Amihud, Y. & H. Mendelson (1987), "Trading Mechanisms and Stock Returns: An Empirical Investigation", Journal of Finance, Vol. XLII, No. 3, July 1987, pp. 533.
  • CFTC and SEC (2010), "Findings Regarding the Market Events of May 6, 2010", Report of the Staffs of the CFTC and SEC to the Joint Advisory Committee on Emerging Regulatory Issues, September 30, 2010, 104 pages.
  • Easley, D., M. López de Prado & M. O'Hara (2012), "Flow Toxicity and Volatility in a High-Frequency World", Review of Financial Studies, Vol. 25, No. 5, pp. 1457-1493, 2012.
  • Easley, D., M. López de Prado & M. O'Hara (2011), "The Microstructure of the ‘Flash Crash’: Flow Toxicity, Liquidity Crashes and the Probability of Informed Trading", The Journal of Portfolio Management, Vol. 37, No. 2, pp. 118-128, Winter 2011.
  • Golub, A. (2011), "Overview of High Frequency Trading", Working Paper, part of a FP7 EU project, 50 pages.
  • Sornette, D. & S. Von der Becke (2011), "Crashes and High Frequency Trading", Swiss Finance Institute Research Paper No. 11-63, August 2011.
