
Support Board


Date/Time: Sun, 19 May 2024 15:10:18 +0000



Post From: Problem with market depth disappearing

[2015-03-31 05:04:04]
i960 - Posts: 360
Whether or not updates arrive with a level of 0, the code has to be resilient to them. Regardless of what IB is sending (and they make *no* guarantee they won't send 0.0), it should be ready for anything. Also, I'm not talking about spread contracts; I'm talking about the spread between bid and ask, and about missing levels that exist because there are legitimately *no* bids or asks there.
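As a minimal sketch of the kind of defensiveness meant here: an update handler that tolerates a 0.0 price or 0 size at any position, treating a zero-size update as clearing that level rather than storing garbage. The names (`MAX_NUM_DOM_LEVELS`, `DomLevel`, `ApplyUpdate`) are illustrative, not the actual Sierra Chart structures.

```cpp
#include <array>
#include <cassert>

// Illustrative constants/types; the real code's structures differ.
constexpr int MAX_NUM_DOM_LEVELS = 10;

struct DomLevel {
    double Price = 0.0;
    int    Size  = 0;
};

// Apply an UPDATE at `position`, tolerating a 0.0 price or 0 size from the
// feed. A zero-size update is treated as clearing the level, and an
// out-of-range position is ignored rather than trusted.
void ApplyUpdate(std::array<DomLevel, MAX_NUM_DOM_LEVELS>& book,
                 int position, double price, int size)
{
    if (position < 0 || position >= MAX_NUM_DOM_LEVELS)
        return;  // never trust the feed's index blindly

    if (size <= 0) {
        // Legitimately empty level: clear it instead of corrupting it.
        book[position] = DomLevel{};
        return;
    }
    book[position].Price = price;
    book[position].Size  = size;
}
```

The point is only that the handler stays correct no matter what the feed sends; it never assumes a 0 is impossible.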

The insert portion of the code is assuming a contiguous arrival of data. It should not; it should assume it's working with sparse data, because event ordering is never guaranteed. There's a lot of premature optimization going on here with no actual measurement of the ratio of inserts to deletes to updates. The vast majority of market depth messages for futures contracts are most likely updates; inserts and deletes are probably a very small percentage of the actual messages, so if the for loop has to check up to MAX_NUM_DOM_LEVELS in order to be *correct*, it should just do that. I'd bet that if the code were profiled, very little time would be spent in the insert or delete cases at all.

Like I said, if you want to make that loop processing more efficient, simply use a separate tracking variable for the used size rather than iterating all the way up to MAX_NUM_DOM_LEVELS on deletes; checking for a price or volume equal to 0 is not robust. But, also as I said, the micro-efficiency improvement might not be worth the additional complexity if it lends itself to incorrectness or brittleness.

I'm curious to see what the TempDomStructure.Volume change actually does in the real world, but I suspect a 0-price, 0-size message for a given position might still corrupt elements after that position, because the insert loop will break earlier than it should.
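The failure mode being suspected can be reproduced in isolation. The sketch below is a guess at the pattern, not the actual code: an insert loop that treats the first 0-price level as the end of the book. With sparse data, a legitimately empty level mid-book stops the shift early, so every level after the hole is left in a stale position.

```cpp
#include <array>
#include <cassert>

constexpr int MAX_NUM_DOM_LEVELS = 10;
struct DomLevel { double Price = 0.0; int Size = 0; };
using Book = std::array<DomLevel, MAX_NUM_DOM_LEVELS>;

// Suspected buggy pattern: treat the first 0-price level as the end of
// the used range, so the shift never reaches elements past a hole.
void InsertBreakingEarly(Book& book, int position, double price, int size)
{
    int end = 0;
    while (end < MAX_NUM_DOM_LEVELS - 1 && book[end].Price != 0.0)
        ++end;  // stops at the hole, not at the true last used level
    for (int i = end; i > position; --i)
        book[i] = book[i - 1];
    book[position] = {price, size};
}
```

Given a book with a level at index 0, an empty hole at index 1, and another level at index 2, an insert at position 0 shifts only the level before the hole; the level at index 2 is never moved, even though a correct insert would push it to index 3.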