By Louis Lovas
For the first quarter of this year, equity markets soared on a sugar high; market indices regularly hit new highs, as exhibited by the S&P 500’s (CME:SP.C) 12% rise since the end of last year. The debate rages over how long this will continue. Numerous factors could make this year different from the past three, ranging from the continuation of central bank easing policy to improved economic conditions. How do we know this? It’s all in the data, as the major economic indicators and market indices are tracked, scrutinized and compared against past results.
Yet the undercurrent of the equity market’s exuberance is a continued downward trend in volumes and in trader-loving volatility. NYSE’s volume composite index (MVOLNYE) has been on a slow slide reaching all the way back into last year, down nearly 10% year-over-year, and the VIX has hit a six-year low. Again, how do we know this? It’s all in the data, or more specifically, the analysis of the data over time.
For the professional trader, volumes are a reflection of money flows; achieving margins hinges on total volume and a sprinkle of volatility, all while maintaining an accurate audit trail of trading activity. With the crush of compliance from regulatory actions cascading out of Dodd-Frank, the Consolidated Audit Trail (CAT) and, in the wake of Knight Capital’s mishap, the SEC’s proposed Regulation Systems Compliance and Integrity (RegSCI), we live under a cloud of market uncertainty and regulatory oversight. It is a new normal, a fait accompli that is shaping the future and forcing firms to elevate their game. And how do we know this? It’s all in the data.
The new normal may represent a deluge of market data, but it also makes it imperative that firms recognize the data’s intrinsic value to the bottom line. Sluggish reactions to dynamic markets lead to missteps in business decisions that can unknowingly result in risk-laden exposure.
Disruptive power of innovation
Amid the cacophony surrounding algorithmic trading unfolds the story of Complex Event Processing (CEP), a new breed of technology and a tool for understanding data.
CEP is a story of the disruptive power of innovation and a natural segue into understanding data, specifically the temporal analysis of time-series data. It excels at exacting data consistency from trades, quotes, order books, executions and even news and social sentiment, which can instill the trader confidence needed to ensure profit and minimize risk.
With so many liquidity sources, a consistent and uniform data model across fragmented markets enables effective analysis for trade-model design, statistical pattern analysis and understanding order book dynamics. This spans real-time, historical and contextual content; practically speaking, it is hard to separate them. It starts with an understanding of what a time series is.
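As a rough illustration of what such a uniform model might look like (the field names and helper below are hypothetical, not taken from any particular CEP platform or feed handler), each venue’s ticks can be normalized into a single venue-neutral record:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class EventType(Enum):
    TRADE = "trade"
    QUOTE = "quote"


@dataclass(frozen=True)
class MarketEvent:
    """A venue-neutral tick: every feed handler maps its native format onto this shape."""
    timestamp: datetime          # exchange timestamp, normalized to UTC
    symbol: str                  # instrument identifier
    venue: str                   # originating market center
    event_type: EventType
    price: float                 # trade price, or quote midpoint for quotes
    size: int                    # shares or contracts
    bid: Optional[float] = None  # populated for quotes only
    ask: Optional[float] = None


def normalize_quote(symbol: str, venue: str, bid: float, ask: float,
                    bid_size: int, ts: datetime) -> MarketEvent:
    """Translate one (hypothetical) venue's quote fields into the uniform model."""
    return MarketEvent(timestamp=ts.astimezone(timezone.utc), symbol=symbol,
                       venue=venue, event_type=EventType.QUOTE,
                       price=(bid + ask) / 2, size=bid_size, bid=bid, ask=ask)
```

Once every source is mapped onto the same shape, downstream analytics see one consistent stream regardless of which market center produced the tick.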
In techie-speak, a time series refers to data that has an associated time sequence, a natural ordering to its content, such as rates, prices, curves, dividend schedules, index compositions and so on. Time-series data is often of very high velocity; the UTP Quote Data Feed (UQDF) provides continuous time-stamped quotations from 13 U.S. market centers, representing hundreds of terabytes annually. The data’s temporal ordering allows for distinct analysis, revealing unique observations and patterns and the possibility of predicting future values. Time series are often called data streams, representing infinite sequences (i.e., computation that does not assume the data has an end), or simply real-time data, such as intra-day trades. CEP is a temporally sensitive programming paradigm designed for calculating and extracting meaningful statistics that are unique to and dependent on the data’s temporal nature. This includes not just the notion of duration and windows of time, but also temporal matching logic of a fuzzy nature, such as matching trade prices to the nearest or prevailing quote.
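To make “prevailing quote” matching concrete, here is a minimal sketch, not tied to any particular CEP engine, that joins each trade to the most recent quote at or before its timestamp (an as-of join) using pandas; the symbol, prices and 500 ms tolerance are illustrative assumptions:

```python
import pandas as pd

# Hypothetical sample data: the timestamp is the key, temporal order is the contract.
quotes = pd.DataFrame({
    "ts": pd.to_datetime(["09:30:00.100", "09:30:00.250", "09:30:00.400"]),
    "symbol": ["IBM", "IBM", "IBM"],
    "bid": [201.10, 201.12, 201.15],
    "ask": [201.14, 201.16, 201.19],
})
trades = pd.DataFrame({
    "ts": pd.to_datetime(["09:30:00.180", "09:30:00.420"]),
    "symbol": ["IBM", "IBM"],
    "price": [201.13, 201.18],
    "size": [300, 500],
})

# Match each trade to the prevailing (most recent prior) quote per symbol,
# tolerating at most a 500 ms gap -- the fuzzy temporal join described above.
matched = pd.merge_asof(
    trades.sort_values("ts"),
    quotes.sort_values("ts"),
    on="ts", by="symbol",
    direction="backward",
    tolerance=pd.Timedelta("500ms"),
)

# With the prevailing quote attached, per-trade statistics such as the
# effective spread against the midpoint follow directly.
matched["mid"] = (matched["bid"] + matched["ask"]) / 2
print(matched[["ts", "price", "bid", "ask", "mid"]])
```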
Consider the scenario where there is a need to understand historic price volatility in order to determine accurate statistical thresholds for future price movements. It is not simply a matter of detecting price spikes but of discerning when they occur, for how long, and when a high (or low) threshold is crossed. It is CEP’s intrinsic sense of time that makes it uniquely suited to analyzing time series and achieving data consistency, the foundation for accurate trade decisions. Consistency is also about eliminating anomalous and spurious conditions, bad ticks if you will. But the trick is telling a bad tick from a good one. Historical precedent, ranging from the last millisecond to the previous year, provides the benchmark for the norm and the means to recognize deviations. CEP’s analytical effectiveness is relative to the depth of the data set: the further back you look, the more confidence you can have going forward. Of course, this assumes that the future behaves like the past. This is the basis for back-testing algorithmic trading models.
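As a simplified sketch of that kind of check (the 500-tick window and four-sigma threshold are illustrative assumptions, and production bad-tick filters are considerably more involved), a rolling band derived from historical returns can flag suspect ticks and threshold crossings:

```python
import pandas as pd


def flag_price_spikes(prices: pd.Series,
                      window: int = 500,
                      n_sigmas: float = 4.0) -> pd.DataFrame:
    """Flag ticks whose return falls outside the historically observed band.

    prices   : tick-level price series indexed by timestamp
    window   : number of prior ticks that define "the norm"
    n_sigmas : how far outside the historical band counts as anomalous
    """
    returns = prices.pct_change()

    # Historical precedent: rolling mean and std over the prior `window` ticks.
    # shift(1) ensures each tick is judged only against data that preceded it.
    mu = returns.rolling(window).mean().shift(1)
    sigma = returns.rolling(window).std().shift(1)

    out = pd.DataFrame({"price": prices, "return": returns})
    out["upper"] = mu + n_sigmas * sigma
    out["lower"] = mu - n_sigmas * sigma

    # A flagged tick is either a bad tick or a genuine spike; when it occurred
    # and for how long falls out of the timestamps of consecutive flags.
    out["suspect"] = (returns > out["upper"]) | (returns < out["lower"])
    return out
```

The deeper the history fed into such a check, the tighter and more trustworthy the band, which is the same reason longer data sets give back-tests more statistical weight.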
Data can be an ally for back-testing, simulation, valuation, compliance, benchmarking and numerous other business-critical decisions. It is the fodder for understanding the global economy and the markets. The natural temporal ordering of time-series data lends itself to analysis distinct from that of any other data and has given rise to an entire field of study and discourse. For understanding complex event processing, it’s all in the data.