The High-Frequency Trading Arms Race: Frequent Batch Auctions as a Market Design Response[1][2] is a must-read. The tl;dr: a continuous mechanism (sequential trade processing) only looks efficient in time space; it is actually inefficient in volume space, because most of the volume trades at stale prices. The proposed solution is discrete batch auctions (batch trade processing).
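To make "batch trade processing" concrete, here's a toy sketch of one batch-clearing round: orders accumulate during the interval, then all crossing volume clears at a single uniform price. This is my own illustration (the midpoint-of-the-marginal-pair pricing rule is just one common convention), not the paper's implementation.

```python
def clear_batch(bids, asks):
    """bids, asks: lists of (price, qty) collected during the batch interval.
    Match while the best bid >= best ask; everything clears at one uniform
    price (here, the midpoint of the marginal bid/ask pair).
    Returns (clearing_price, volume), or (None, 0) if the book doesn't cross."""
    bids = sorted(bids, key=lambda o: -o[0])  # highest bid first
    asks = sorted(asks, key=lambda o: o[0])   # lowest ask first
    i = j = 0
    bid_rem = bids[0][1] if bids else 0
    ask_rem = asks[0][1] if asks else 0
    volume = 0
    last_bid = last_ask = None
    while i < len(bids) and j < len(asks) and bids[i][0] >= asks[j][0]:
        traded = min(bid_rem, ask_rem)
        volume += traded
        last_bid, last_ask = bids[i][0], asks[j][0]
        bid_rem -= traded
        ask_rem -= traded
        if bid_rem == 0:
            i += 1
            bid_rem = bids[i][1] if i < len(bids) else 0
        if ask_rem == 0:
            j += 1
            ask_rem = asks[j][1] if j < len(asks) else 0
    if volume == 0:
        return None, 0
    return (last_bid + last_ask) / 2, volume
```

The key property: within a batch, arrival order doesn't matter, so a speed advantage measured in microseconds buys nothing inside the interval.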
This is really interesting, but doesn't it presume that there's only one trading venue for each instrument, and that instruments aren't correlated?
You can make time discrete for a single auction, but the time between auctions is still continuous unless everything trades on a single exchange, which can't realistically happen even if we wanted it to (and we probably don't).
As I understand it, much of the most dramatic HFT activity already targets multi-venue trades or cross-instrument correlations.
Decentralized assets[1] (technically possible today, but not yet used for any high-volume instrument) have a single trading venue: the blockchain(s) on which the assets live. If multiple chains are involved[2][3], each individual trade is still atomic, but trade ordering becomes more complex: some trades would take longer than others to settle, and failed trades would also be publicly visible.
It's clear they see the issue, since the examples in their slide deck involve correlation between different securities and between the same security trading on different exchanges. What I don't see is how batch auctions get around the distributed-systems problem; we're still talking about an eventually consistent system, right? There must still be opportunities for sniping, right?
(I'm not smart enough to know the answer, which is why I'm asking).
1. slides: http://faculty.chicagobooth.edu/eric.budish/research/HFT-Fre...
2. paper: http://faculty.chicagobooth.edu/eric.budish/research/HFT-Fre...