Download English ISI Article No. 13173
Persian Translation of the Article Title

Low-latency trading

English Title
Low-latency trading
Article code: 13173
Publication year: 2013
Length: 34 pages (PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Journal of Financial Markets, Volume 16, Issue 4, November 2013, Pages 646–679

Persian Translation of Keywords
High-frequency trading, limit order markets, order placement strategies, liquidity, market quality
English Keywords
High-frequency trading, Limit order markets, NASDAQ, Order placement strategies, Liquidity, Market quality
Article Preview: Low-latency trading

English Abstract

We define low-latency activity as strategies that respond to market events in the millisecond environment, the hallmark of proprietary trading by high-frequency traders, though it could include other algorithmic activity as well. We propose a new measure of low-latency activity to investigate the impact of high-frequency trading on the market environment. Our measure is highly correlated with NASDAQ-constructed estimates of high-frequency trading, but it can be computed from widely-available message data. We use this measure to study how low-latency activity affects market quality both during normal market conditions and during a period of declining prices and heightened economic uncertainty. Our analysis suggests that increased low-latency activity improves traditional market quality measures—decreasing spreads, increasing displayed depth in the limit order book, and lowering short-term volatility. Our findings suggest that given the current market structure for U.S. equities, increased low-latency activity need not work to the detriment of long-term investors.

English Introduction

Our financial environment is characterized by an ever increasing pace of both information gathering and the actions prompted by this information. Speed in absolute terms is important to traders due to the inherent fundamental volatility of financial securities. Relative speed, in the sense of being faster than other traders, is also very important because it can create profit opportunities by enabling a prompt response to news or market activity. This latter consideration appears to drive an arms race where traders employ cutting-edge technology and locate computers in close proximity to the trading venue in order to reduce the latency of their orders and gain an advantage. As a result, today’s markets experience intense activity in the “millisecond environment,” where computer algorithms respond to each other at a pace 100 times faster than it would take for a human trader to blink.

While there are many definitions for the term “latency,” we view it as the time it takes to learn about an event (e.g., a change in the bid), generate a response, and have the exchange act on the response. Exchanges have been investing heavily in upgrading their systems to reduce the time it takes to send information to customers, as well as to accept and handle customers’ orders. They have also begun to offer traders the ability to co-locate the traders’ computer systems in close proximity to theirs, thereby reducing transmission times to under a millisecond (a thousandth of a second). As traders have also invested in the technology to process information faster, the entire event/analysis/action cycle has been reduced for some traders to a couple of milliseconds.

The beneficiaries from this massive investment in technology appear to be a new breed of high-frequency traders who implement low-latency strategies, which we define as strategies that respond to market events in the millisecond environment. These traders now generate most message activity in financial markets and according to some accounts also take part in the majority of the trades. While it appears that intermediated trading is on the rise [with these low-latency traders serving as the intermediaries, e.g., Menkveld (in this issue)], it is unclear whether intense low-latency activity harms or helps the market.

Our goal in this paper is to examine the influence of these low-latency traders on certain dimensions of market quality. More specifically, we would like to know how their combined activity affects attributes such as bid-ask spreads, the total price impact of trades, depth in the limit order book, and the short-term volatility of stocks. To investigate these questions, we utilize publicly-available NASDAQ order-level data that are identical to those supplied to subscribers and provide real-time information about orders and executions on NASDAQ. Each entry (submission, cancellation, or execution) is time-stamped to the millisecond, and hence these data provide a very detailed view of NASDAQ activity.

We begin by providing a discussion of the players in this new millisecond environment: proprietary and agency algorithms. We document periodicities in the time-series of market activity, which we attribute to agency algorithms. We also look at the speed at which some traders respond to market events—the hallmark of proprietary trading by high-frequency trading firms—and find that the fastest traders have an effective latency of 2–3 ms during our sample period.
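To make the notion of an "effective latency" concrete, the sketch below shows one simplified way to estimate response times from millisecond-stamped messages: measure the delay between a quote improvement and the earliest subsequent execution at the improved price. The message fields, the simplified event types, and the pairing rule are illustrative assumptions, not the paper's exact procedure.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical, simplified message layout; real order-level feeds carry many more fields.
@dataclass
class Message:
    ts_ms: int      # millisecond timestamp
    kind: str       # "improve_bid" or "execute_bid" (simplified event types)
    price: float

def response_delays(messages: List[Message]) -> List[int]:
    """Delay (ms) from each bid improvement to the first execution at that price."""
    delays: List[int] = []
    pending: Optional[Message] = None   # last bid improvement not yet hit
    for m in sorted(messages, key=lambda x: x.ts_ms):
        if m.kind == "improve_bid":
            pending = m
        elif m.kind == "execute_bid" and pending is not None and m.price == pending.price:
            delays.append(m.ts_ms - pending.ts_ms)
            pending = None
    return delays

if __name__ == "__main__":
    feed = [
        Message(1000, "improve_bid", 25.01),
        Message(1003, "execute_bid", 25.01),   # 3 ms response
        Message(2000, "improve_bid", 25.02),
        Message(2002, "execute_bid", 25.02),   # 2 ms response
    ]
    print(response_delays(feed))  # [3, 2]
```

Under this kind of tabulation, a cluster of very short delays (a few milliseconds) would indicate traders whose event/analysis/action cycle operates in the millisecond environment.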
We propose a new measure of low-latency activity based on “strategic runs” of linked messages that describe dynamic order placement strategies. While our measure might reflect some activity originating from agency algorithms, our restriction to long strategic runs makes it more likely that the measure predominantly captures the activity of high-frequency traders, and we believe that it is highly correlated with their presence in the market. As such, we view this measure as a proxy for the activity of high-frequency traders. An advantage of our measure is that it can be constructed from publicly-available data, and therefore does not rely on specialty datasets that may be limited in scale and scope. (A simplified illustration of this run construction appears at the end of this introduction.)

We show that our measure is highly correlated with aggregate trading by high-frequency trading firms in the 120-stock NASDAQ HFT dataset studied in Brogaard (2012), Brogaard, Hendershott, and Riordan (2012), and Carrion (in this issue). To assess robustness, we attempt to exclude agency algorithms from our measure, and find that our conclusions are unchanged. However, due to the manner in which the measure is constructed, there is no certainty that it only captures high-frequency trading.

We use our measure to examine how the intensity of low-latency activity affects several market quality measures. We find that an increase in low-latency activity reduces quoted spreads and the total price impact of trades, increases depth in the limit order book, and lowers short-term volatility. Our results suggest that the increased activity of low-latency traders is beneficial to traditional benchmarks of market quality in the current U.S. equity market structure, one that is characterized by both high fragmentation and wide usage of agency and proprietary algorithms.

We use a variety of econometric specifications to examine the robustness of our conclusions. Furthermore, we employ two distinct sample periods to investigate whether the impact of low-latency trading on market quality differs between normal periods and those associated with declining prices and heightened uncertainty. Over October 2007, our first sample period, stock prices were relatively flat or slightly increasing. Over our second sample period, June 2008, stock prices declined (the NASDAQ index was down 8% in that month) and uncertainty was high following the fire sale of Bear Stearns. We find that higher low-latency activity enhances market quality in both periods.

Our paper relates to small but growing strands in the empirical literature on speed in financial markets and high-frequency trading (which is a subset of algorithmic trading comprised of proprietary algorithms that require low latency). With regard to speed, Hendershott and Moulton (2011) and Riordan and Storkenmaier (2012) examine market-wide changes in technology that reduce the latency of information transmission and execution, but reach conflicting conclusions as to the impact of such changes on market quality. There are several papers on algorithmic trading that characterize the trading environment on the Deutsche Boerse (Prix et al., 2007; Groth, 2009; Gsell, 2009; Gsell and Gomber, 2009; Hendershott and Riordan, 2013), the interdealer foreign exchange market (Chaboud, Hjalmarsson, Vega, and Chiquoine, 2013), and the U.S. equity market (Hendershott, Jones, and Menkveld, 2011). A smaller set of papers focuses on high-frequency trading. Kirilenko, Kyle, Samadi, and Tuzun (2011) look at high-frequency traders in the futures market during the flash crash episode.
Brogaard (2012) seeks to characterize high-frequency trading on NASDAQ and BATS, while Brogaard, Hendershott, and Riordan (2012) study the impact of high-frequency trading on price discovery in U.S. equities. Three other papers also appear in this special issue on high-frequency trading. Menkveld (in this issue) is a case study of a particular high-frequency trader who acts as a market maker on Chi-X and Euronext. Carrion (in this issue) uses the NASDAQ HFT dataset to examine the sources of profitability of high-frequency trading firms, how they carry out their strategies, and their impact on market efficiency. Hagströmer and Norden (in this issue) use special data from NASDAQ OMX Stockholm to separately characterize the strategies of “market making” and “opportunistic” high-frequency trading firms.

The rest of this paper proceeds as follows. The next section describes our sample and data. Section 3 provides an introductory discussion of the millisecond environment with some evidence on the activity of proprietary and agency algorithms. Section 4 describes our measure of low-latency activity. In Section 5 we estimate the impact of our measure on diverse measures of market quality. In Section 6 we discuss related papers and place our findings within the context of the literature, and Section 7 concludes.
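The “strategic runs” construction referenced above can be illustrated with a minimal sketch: a cancellation followed within a short window by a resubmission on the same side and of the same size is linked into the same run, and only long runs are kept as a proxy for low-latency activity. The 100 ms window, the matching rule, and the length cutoff below are illustrative assumptions, not the paper's exact definition.

```python
from dataclasses import dataclass
from typing import Dict, List

LINK_WINDOW_MS = 100   # assumed linking window between a cancel and a resubmission

@dataclass
class Msg:
    ts_ms: int
    kind: str       # "submit" or "cancel"
    side: str       # "buy" or "sell"
    size: int
    order_id: int

def strategic_runs(msgs: List[Msg], min_msgs: int = 10) -> List[List[Msg]]:
    """Link cancel/resubmit chains into runs; keep only runs with many messages."""
    runs: List[List[Msg]] = []           # every run ever started
    live: Dict[int, List[Msg]] = {}      # order_id of a resting order -> its run
    dangling: List[List[Msg]] = []       # runs whose latest message is a cancellation
    for m in sorted(msgs, key=lambda x: x.ts_ms):
        if m.kind == "submit":
            for run in dangling:         # try to continue a recently cancelled run
                last = run[-1]
                if (last.side == m.side and last.size == m.size
                        and m.ts_ms - last.ts_ms <= LINK_WINDOW_MS):
                    run.append(m)
                    live[m.order_id] = run
                    dangling.remove(run)
                    break
            else:                        # no match: this submission starts a new run
                new_run = [m]
                runs.append(new_run)
                live[m.order_id] = new_run
        elif m.kind == "cancel" and m.order_id in live:
            run = live.pop(m.order_id)
            run.append(m)
            dangling.append(run)
    return [r for r in runs if len(r) >= min_msgs]

if __name__ == "__main__":
    # A toy chain: submit -> cancel -> resubmit -> cancel -> ... (same side and size)
    feed = []
    for i in range(6):
        feed.append(Msg(ts_ms=10 * i, kind="submit", side="buy", size=100, order_id=i))
        feed.append(Msg(ts_ms=10 * i + 5, kind="cancel", side="buy", size=100, order_id=i))
    print(len(strategic_runs(feed, min_msgs=10)))  # 1 run of 12 linked messages
```

The key design idea is that rapid cancel-and-replace chains leave a traceable footprint in message data even without trader identities, so the intensity of long runs can stand in for high-frequency trading activity.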

English Conclusion

Our paper makes two significant contributions. First, we develop a measure of low-latency activity using publicly-available data that can be used as a proxy for high-frequency trading. Second, we study the impact that low-latency activity has on several dimensions of market quality both during normal market conditions and during a period of declining prices and heightened economic uncertainty. Our conclusion is that in the current market structure for equities, increased low-latency activity improves traditional yardsticks of market quality such as liquidity and short-term volatility. Of particular importance is our finding that at times of falling prices and anxiety in the market, the nature of the millisecond environment and the positive influence of low-latency activity on market quality remains. However, we cannot rule out the possibility of a sudden and severe market condition in which high-frequency traders contribute to a market failure. The experience of the “flash crash” in May of 2010 demonstrates that such fragility is certainly possible when a few big players step aside and nobody remains to post limit orders. While our results suggest that market quality has improved, we believe it is as yet an unresolved question whether low-latency trading increases the episodic fragility of markets, and we hope that future research will shed light on this issue.

The millisecond environment we describe—with its clock-time periodicities, trading that responds to market events over millisecond horizons, and algorithms that “play” with one another—constitutes a fundamental change from the manner in which stock markets operated even a few years ago. Still, the economic issues associated with latency in financial markets are not new, and the private advantage of relative speed as well as concerns over the impact of fast traders on prices were noted well before the advent of our current millisecond environment. The early advocates of electronic markets generally envisioned arrangements wherein all traders would enjoy equal access (e.g., Mendelson and Peake, 1979). We believe that it is important to recognize that guaranteeing equal access to market data when the market is both continuous and fragmented (as presently in the U.S.) may be physically impossible.

The first impediment to equal access is the geographical dispersion of traders (Gode and Sunder, 2000). Our evidence on the speed of execution against improved quotes suggests that some players are responding within 2–3 ms, which is faster than it would take for information to travel from New York to Chicago and back (1,440 miles) even at the speed of light (about 8 ms). While co-location could be viewed as the ultimate equalizer of dispersed traders, it inevitably leads to the impossibility of achieving equal access in fragmented markets. Since the same stock is traded on multiple trading venues, a co-located computer near the servers of exchange A would be at a disadvantage in responding to market events in the same securities on exchange B compared to computers co-located with exchange B. Unless markets change from continuous to periodic, some traders will always have lower latency than others. It is of special significance, therefore, that our findings suggest that increased low-latency activity need not invariably work to the detriment of long-term investors in the post-Reg NMS market structure for U.S. equities.
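As a rough check on the round-trip figure cited above, with all values rounded for illustration:

$$
t_{\text{round trip}} \approx \frac{1{,}440 \text{ miles} \times 1.609 \text{ km/mile}}{299{,}792 \text{ km/s}} \approx \frac{2{,}317 \text{ km}}{299{,}792 \text{ km/s}} \approx 7.7 \text{ ms},
$$

consistent with the "about 8 ms" quoted in the text and several times longer than the 2–3 ms responses observed in the data.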