Support Board


Date/Time: Wed, 05 Mar 2025 18:21:57 +0000



Post From: Orderbook data

[2022-01-09 04:58:29]
User5044343 - Posts: 68
I've been collecting orderbook data (records 122, 140, 141) for F.US.EPH22 (E-mini futures, March 2022 contract).

I'm investigating a particular millisecond, 1641572372525.


The issue I found is that one batch of 140 records, with a timestamp of 1641572372525, contains over 800 MARKET_DEPTH_DELETE_LEVEL enums (quantity 0, on the ask side). That's over 800 different price points that go to 0 on the orderbook.

Obviously that isn't possible, so I must be looking at this the wrong way.

In the next batch of orderbook updates, with the same timestamp, the same price levels get the MARKET_DEPTH_INSERT_UPDATE_LEVEL enum.

My questions are as follows:

Should I be grouping the MARKET_DEPTH_DELETE_LEVEL and MARKET_DEPTH_INSERT_UPDATE_LEVEL messages by timestamp instead of by the batches they arrive in?

If not by timestamp, what is the proper way to group orderbook deletes/updates?
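For reference, here is a minimal Python sketch of what I mean by "grouping by timestamp": all updates sharing a timestamp are applied as one atomic transaction, so the mass deletes followed by re-inserts net out to a book refresh rather than an empty book. The field names (ts, cmd, price, qty) and the sample messages are made up for illustration, not the actual record 140 layout.

```python
from itertools import groupby

# Hypothetical message layout; the real record 140 fields differ.
messages = [
    {"ts": 1641572372525, "cmd": "MARKET_DEPTH_DELETE_LEVEL", "price": 4670.00, "qty": 0},
    {"ts": 1641572372525, "cmd": "MARKET_DEPTH_DELETE_LEVEL", "price": 4670.25, "qty": 0},
    {"ts": 1641572372525, "cmd": "MARKET_DEPTH_INSERT_UPDATE_LEVEL", "price": 4670.25, "qty": 12},
]

def apply_batch(book, batch):
    """Apply one timestamp's worth of updates as a single atomic transaction."""
    for msg in batch:
        if msg["cmd"] == "MARKET_DEPTH_DELETE_LEVEL":
            book.pop(msg["price"], None)      # level removed from the book
        else:
            book[msg["price"]] = msg["qty"]   # level inserted or updated
    return book

book = {}  # ask side only, price -> quantity
for ts, batch in groupby(messages, key=lambda m: m["ts"]):
    book = apply_batch(book, list(batch))

# Only the state after the whole millisecond is meaningful;
# the intermediate all-deleted state never really "existed".
print(book)  # {4670.25: 12}
```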
Date Time Of Last Edit: 2022-01-09 17:55:26