Why would you want to place the server in the center? If you have a computer at both ends, the information from each exchange reaches the other exchange at the same time an instruction from a computer in the center would. So you can have the computers at the ends make exactly the same decision at approximately the same time, using the same data (or even newer local data).
> Why would you want to place the server in the center?
Because for arbitrage you need to know the price in both areas before you can react. If it's instantaneous at the co-located site and 5ms to get to your other machine, you need to wait 5ms to act and then add about the same time back on to send your order to the further location.
If your machine is in the middle, then you only need to wait 2.5ms to act and then add about 2.5ms to send your orders.
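For concreteness, here is a toy calculation of that arithmetic (a sketch only; the 5ms figure is the illustrative number from above, and the function names are made up):

```python
# Toy arithmetic for the two placements, using the illustrative 5 ms one-way
# delay between exchange A and exchange B from the comment above.

ONE_WAY_AB_MS = 5.0              # exchange A <-> exchange B
HALF_WAY_MS = ONE_WAY_AB_MS / 2  # midpoint to either exchange

def end_machine_reaction_ms():
    """Machine co-located at A reacting to an event at B and ordering at B."""
    learn_about_event = ONE_WAY_AB_MS   # wait for B's data to reach A
    send_order_back = ONE_WAY_AB_MS     # send the order back to B
    return learn_about_event + send_order_back

def middle_machine_reaction_ms():
    """Machine halfway between the exchanges, 2.5 ms from each."""
    learn_about_event = HALF_WAY_MS     # data from either exchange
    send_order = HALF_WAY_MS            # order to either exchange
    return learn_about_event + send_order

print("end machine:   ", end_machine_reaction_ms(), "ms")     # 10.0
print("middle machine:", middle_machine_reaction_ms(), "ms")  # 5.0
```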
This type of arbitrage is much easier if all orders originate from one machine (as opposed to each co-located machine sending its piece of the stat-arb order).
If you have two machines sending their own parts of the arb order, then you have to sync them if one fails to get its side of the order, which means more latency.
EDIT: in case I wasn't clear, the benefit is having one machine do the arb instead of two machines sending simultaneous orders. This way the two machines don't need to sync up to determine whether one leg was done and the other was hung. All the information is in the same program on the same machine, so there is no distributed state to reconcile.
No. The parent suggests having two machines (one at each end), but each trading order would be issued by only one machine (not both). So there is no synchronization issue.
Example: machine A instantaneously becomes aware of some event at exchange A. It sends an order to remote exchange B (which takes 5ms) and at the same time sends a matching order to exchange A. Total time = 5ms, which is the same as having a machine in the middle with a latency of 2 x 2.5 = 5ms. [Edit #1: I edited this sentence to reflect the fact that the algo sends the matching order immediately to the local exchange.]
So I still don't understand why there is any advantage in having a computer in the middle.
Edit #2: you wrote an edit ("This way the two machines don't need to sync up to determine whether one leg was done and the other was hung. All the information is in the same program on the same machine, so there is no distributed state to reconcile.") which I think finally clarifies the advantage of having a computer in the middle: you don't run the risk of the two computers at each end executing orders that conflict with each other.
You do: that's the signal that's sent over in 5ms. You then combine it with the local price from 5ms ago, and you can construct the exact same instruction the computer in the center would have come up with 2.5ms ago, based on the prices in both locations 5ms ago.
Both sides can in this way make decisions identical to each other's, at the same time, and identical to those that would have been made by a computer in the center.
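A minimal sketch of that idea, assuming both ends deploy the same deterministic decision function and keep a short buffer of their own recent local prices (the names, prices, and the toy threshold are hypothetical, not a real strategy):

```python
# Each end machine runs the same pure function on the remote price it just
# received (5 ms old) and its own local price from 5 ms ago, so both ends
# reconstruct the decision a machine in the middle would have made.

def arb_decision(price_a, price_b, threshold=0.5):
    """Identical function deployed at both ends (and conceptually in the middle)."""
    spread = price_a - price_b
    if spread > threshold:
        return ("SELL A", "BUY B")
    if spread < -threshold:
        return ("BUY A", "SELL B")
    return None

# Machine at exchange A: the feed from B arrives 5 ms late, so pair it with
# the local A price recorded 5 ms ago (kept in a small history buffer).
local_a_5ms_ago = 100.7
remote_b_just_arrived = 100.0   # B's price as of 5 ms ago

# Machine at exchange B does the mirror image with its own buffers; because
# the inputs and the function are identical, the decisions are identical.
print(arb_decision(local_a_5ms_ago, remote_b_just_arrived))  # ('SELL A', 'BUY B')
```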
You can't wait to hear back from the other exchange and see that the leg is available. "Pure" arb teams in HFT typically quote on both products of a pair (e.g. cash vs futures FX), at favorable prices (e.g. I'll quote a bid at X on A if I can aggress at >= X / the implied price of X on B), and turn around and hit the other product as soon as they get filled. Sometimes they'll miss, so they're really running stat-arb strategies, but that's the closest you'll get to pure arb in this space.
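Roughly, the quoting rule described above might look like the following sketch (product names, prices, fees, and the conversion factor are all made up; real systems are far more involved):

```python
# Quote a bid on product A only while it is at or below the price implied by
# product B; on a fill, immediately hit B. If B has moved away by then, the
# "pure" arb degrades into a stat-arb miss on that round.

def implied_price_on_a(best_bid_b, conversion=1.0, fees=0.02):
    """What a unit of A is worth if we can immediately sell the equivalent on B."""
    return best_bid_b * conversion - fees

def maybe_quote_bid_on_a(best_bid_b, my_bid_x):
    # Only rest a bid at X on A if we could aggress at >= X on B right now.
    return my_bid_x if my_bid_x <= implied_price_on_a(best_bid_b) else None

def on_fill_at_a(filled_price, best_bid_b):
    # Got filled on A: turn around and hit B, if the second leg is still there.
    if implied_price_on_a(best_bid_b) >= filled_price:
        return "SELL B at market"
    return "missed the second leg"

print(maybe_quote_bid_on_a(best_bid_b=100.10, my_bid_x=100.05))  # 100.05
print(on_fill_at_a(filled_price=100.05, best_bid_b=100.10))      # SELL B at market
```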
"a patent was filed in the U.S. by the Massachusetts Institute Of Technology, titled “System and method for relativistic statistical securities trading”,"
We should all be thankful to that proud institution for taking decisive action to ensure another wasteful technology will never see the light of day.
There is some chance I'm misreading your sarcasm here, but if you are willing to accept that arbitrageurs provide a useful function in the markets by keeping prices equal for roughly equivalent instruments traded on geographically diverse trading centers, then what's wrong with arbitrageurs competing with one another to provide that function?
The issue is that resources are being spent not to create value, but merely to ensure that HFT #1 captures it rather than HFT #2. It's a lot like wearing a suit or going to college - it doesn't create value, but it ensures that applicant #1 gets a job over applicant #2.
Unlike normal market competition, this one actually is a zero sum game.
The faster different markets come to agree on a price for similar instruments, the more efficient the markets are. The profit opportunity for the fastest arbitrageur drives competing arbitrageurs to compete. Ergo arbitrageurs are working with all of their skill and ability to make sure that markets are as efficient as possible.
Now you have a global marketplace where investors have a pretty good idea that their local markets have reasonable prices. The system works for investors and arbitrageurs (the good ones, anyways) alike. What's wrong with that?
And as a thought experiment, ask yourself which of the following would have had more of an effect on your life (assuming you are an engineer/CS type and work as such). Suppose you had failed your humanities classes. Would your life have been significantly different than if you had simply forgotten what you learned? How about if you had cheated and passed? Mine certainly would have been.
In principle, wearing a suit also has a small positive effect. It arouses some women. But in general it's mostly wasteful signalling.
The sheepskin effect doesn't directly measure the effect of having a degree vs having exactly the same experience/education but no degree. This is because the person who gets the degree after 4 years might on average have worked a lot harder, and learned a lot more than the person who studies for 4 years but doesn't get the degree.
Price equality on a speed-of-light-limited timescale is of relatively small economic value to the world. They are basically equalizing prices faster than you could ever learn about them.
The service of arbitrage is useful, but there is a bit of a winner-take-all dynamic here and a race to the bottom (meaning the highest-cost way of accomplishing the service). No one may care whether prices are equalized in 1ms vs 200ms, but the party that can do it in 1ms is going to take all the gain from the spread.
(I don't really blame the arbitrageurs; the markets should be running batched sealed-bid auctions or other mechanisms to equalize participants, and avoid diverting massive amounts of funds to floating balloons between trading centres.)
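As a rough illustration of the batched sealed-bid idea: collect orders over a fixed interval and clear them at one price, so being microseconds faster inside the batch confers no advantage. This is not any exchange's actual matching rule, and the clearing-price choice here is just one simple convention:

```python
# Minimal batch-auction sketch: uniform clearing at the marginal matched pair.

def clear_batch(bids, asks):
    """bids/asks: lists of (price, qty). Returns (clearing_price, volume)."""
    bids = sorted(bids, key=lambda o: -o[0])   # best (highest) bids first
    asks = sorted(asks, key=lambda o: o[0])    # best (lowest) asks first
    volume, price = 0, None
    while bids and asks and bids[0][0] >= asks[0][0]:
        traded = min(bids[0][1], asks[0][1])
        volume += traded
        price = (bids[0][0] + asks[0][0]) / 2  # midpoint of the marginal pair
        bids[0] = (bids[0][0], bids[0][1] - traded)
        asks[0] = (asks[0][0], asks[0][1] - traded)
        if bids[0][1] == 0: bids.pop(0)
        if asks[0][1] == 0: asks.pop(0)
    return price, volume

print(clear_batch(bids=[(101, 50), (100, 30)], asks=[(99, 40), (100, 60)]))
```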
> there is a bit of a winner-take-all dynamic here and a race to the bottom (meaning the highest-cost way of accomplishing the service)
It's true that the fastest arbitrageur takes most of the profit, but when we say the cost is going up (i.e. the "arms race"), that means a bigger chunk of the arb profits (which are basically fixed, regardless of cost) are going to network providers, NIC vendors, FPGA vendors, and so forth. The costs to investors are coming down as the chances of buying a mispriced instrument become smaller and smaller.
None of this is simply "arbitrage" as HFTers want the public to believe. The major brokers are perfectly capable of buying and selling at the best prices across any non-darknet market with even rudimentary order routers. What's happening is HFTers are gaining informational advantages and then exploiting them in microseconds. Based on realtime orders, they'll infer that someone wants to buy share abc at price x, deviation y, and then bet on that happening. HFTers can even infer which brokerage firms are placing the orders based on technical data including lag times and data-sharing partnerships.
Should we stop making videogames too, because playing a game does nothing for the world?
Should we stop making music? If all musicians picked up trash on the side of the road, we would have a cleaner world. We could just replay 90s music forever.
If you want to argue in favor of a centralized system of allocating brain power from each according to his ability and to each according to his need, fair enough. Let's try it and see how it works out.
MIT made about 70 million last year from its patents [1]. I'd guess that patent is no barrier to this technology being adopted if there's enough practical value in it.
Microwave works line-of-sight, and weather patterns easily mess it up and drop the link.
That's why you usually also run your microwave data over a fiber link as backup. Of course, when microwave works, it gives you a decent improvement in latency, at the cost of bandwidth.
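A crude sketch of that primary/backup arrangement, with hypothetical link objects and made-up latencies:

```python
# Send the small, latency-critical messages over microwave while it is up,
# and fall back to the slower but weather-proof fiber path when it drops.

import time

class Link:
    def __init__(self, name, latency_ms):
        self.name, self.latency_ms, self.up = name, latency_ms, True

    def send(self, payload: bytes) -> float:
        # Payload content is ignored in this sketch; we only model the delay.
        if not self.up:
            raise ConnectionError(f"{self.name} link is down")
        time.sleep(self.latency_ms / 1000)   # simulate propagation delay
        return self.latency_ms

microwave = Link("microwave", latency_ms=4.0)   # faster, low bandwidth, weather-sensitive
fiber     = Link("fiber",     latency_ms=6.5)   # slower, reliable, high bandwidth

def send_order(payload: bytes) -> str:
    try:
        return f"sent via microwave in {microwave.send(payload)} ms"
    except ConnectionError:
        return f"sent via fiber in {fiber.send(payload)} ms"

print(send_order(b"BUY 100 XYZ"))
microwave.up = False             # e.g. heavy rain fade drops the link
print(send_order(b"BUY 100 XYZ"))
```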