Support Board
Date/Time: Wed, 27 Nov 2024 13:38:27 +0000
Post From: Timing problems
[2014-04-08 17:35:50]
Hendrixon - Posts: 130
Just look at the picture. On a 1 Tick chart like this, the milliseconds (top study) should be scattered all over the place, essentially random... but they aren't. They just march in order from 0 to 999. With a 2000 ms chart update interval (as in the picture) it takes about two and a half minutes for the milliseconds to go from 0 to 999. See the bottom study showing seconds, cycling from 0 to 59 normally, and note that you can fit almost 3 MINUTES into the time it takes the "milliseconds" to complete one cycle. Sorry I can't explain this better, because I can't make any sense of why SC behaves like that. It's too weird.

For a local seconds timestamp use this:

    int& IndexFollow = sc.PersistVars->i1;
    if (IndexFollow < sc.Index)
    {
        SYSTEMTIME systime;
        GetLocalTime(&systime);
        IndexFollow = sc.Index;
        Second[IndexFollow] = systime.wSecond;
    }

For milliseconds use this:

    int& IndexFollow = sc.PersistVars->i1;
    if (IndexFollow < sc.Index)
    {
        SYSTEMTIME systime;
        GetLocalTime(&systime);
        IndexFollow = sc.Index;
        Millisecond[IndexFollow] = systime.wMilliseconds;
    }

Remarks:

1. Don't use sc.CurrentSystemDateTime, because it is wrong: it rounds the seconds instead of truncating them, so I don't trust it for anything. For example, you report tick timestamps like this:
   00:00:05.450 is rounded to 00:00:05 (five)
   00:00:05.550 is rounded to 00:00:06 (yes, SIX)
   Those are real milliseconds from the feed, not your counter.

2. Use sc.OnExternalDataImmediateStudyCall = true;

3. Make sure your local clock is spot on, because even with Meinberg's NTP client you can still be +/-150 ms off real time, depending on your system's quartz frequency drift.

Date Time Of Last Edit: 2014-04-08 17:37:49
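For reference, a minimal sketch of how the two snippets above might be assembled into one complete ACSIL study. The function name, graph name, and subgraph names here are only placeholders I chose for illustration, not anything from Sierra Chart:

    // Minimal sketch: stamp each new bar with local wall-clock second and
    // millisecond, following the snippets in the post above.
    #include "sierrachart.h"

    SCDLLName("Local Timestamp Example")

    SCSFExport scsf_LocalTimestampExample(SCStudyInterfaceRef sc)
    {
        SCSubgraphRef Second      = sc.Subgraph[0];
        SCSubgraphRef Millisecond = sc.Subgraph[1];

        if (sc.SetDefaults)
        {
            sc.GraphName = "Local Timestamp (Second / Millisecond)";
            sc.AutoLoop = 1;  // one call per bar index

            // Recalculate immediately on new incoming data (remark 2 above).
            sc.OnExternalDataImmediateStudyCall = true;

            Second.Name      = "Local Second";
            Second.DrawStyle = DRAWSTYLE_LINE;
            Millisecond.Name      = "Local Millisecond";
            Millisecond.DrawStyle = DRAWSTYLE_LINE;
            return;
        }

        // Remember the last bar index that was stamped, so each new bar is
        // stamped exactly once, with the local time of its first update.
        int& IndexFollow = sc.PersistVars->i1;

        if (IndexFollow < sc.Index)
        {
            SYSTEMTIME systime;
            GetLocalTime(&systime);  // local wall-clock time from Windows

            IndexFollow = sc.Index;
            Second[IndexFollow]      = systime.wSecond;
            Millisecond[IndexFollow] = systime.wMilliseconds;
        }
    }

Note the IndexFollow guard is what ties the timestamp to the first update of each bar; without it, every chart update would overwrite the stamp of the current bar.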