Support Board
Date/Time: Mon, 25 Nov 2024 07:19:29 +0000
[Programming Help] - ACSIL sc.TickSize Returns 0*
View Count: 428
[2024-03-21 03:37:56] |
Gradient - Posts: 89 |
Hi, I've created a custom study that is triggered by the "Start Scan" process. The study uses sc.TickSize to normalize values. However, sc.TickSize is not returning the tick size for any of the symbols. I haven't made any changes to the symbol settings. They are still the defaults. I've tested this method on different chart types to see if it was isolated to a specific type but the issue persists. I've attached a screenshot of the message log of the issue. Can someone advise how to resolve this? Thanks. |
Tick Size Error.png - Attached On 2024-03-21 03:36:57 UTC - Size: 7.75 KB |
[2024-03-21 04:21:39] |
Sierra_Chart Engineering - Posts: 17172 |
Very unlikely there is a problem with this. We think the issue is that you are not converting the value properly to a string.
Sierra Chart Support - Engineering Level Your definitive source for support. Other responses are from users. Try to keep your questions brief and to the point. Be aware of support policy: https://www.sierrachart.com/index.php?l=PostingInformation.php#GeneralInformation For the most reliable, advanced, and zero cost futures order routing, use the Teton service: Sierra Chart Teton Futures Order Routing |
[2024-03-21 04:58:01] |
Gradient - Posts: 89 |
I don't understand. Per the documentation, sc.TickSize is a float value: ACSIL Interface Members - Variables and Arrays: sc.TickSize. There's no need to convert to a string to normalize float values. Given that I'm iterating over a watchlist containing multiple symbols to compute metrics for each symbol, it is necessary to normalize values by each symbol's respective tick size. I saw that there was a prior issue with sc.TickSize here: ACSIL: sc.SetDefaults and sc.TickSize. Why should sc.TickSize be converted to a string? How is it possible for the return value to be 0* for each symbol? This is similar to the issue in the thread referenced above. Was that issue ever resolved? |
[2024-03-21 05:02:50] |
Gradient - Posts: 89 |
For clarity, I have a watchlist attached to a chart containing multiple symbols. The chart contains a study. When I select "Start Scan" from the Chart Menu, this triggers the study to be computed across every symbol in the watchlist. Given that different symbols have different minimum fluctuations, to compute metrics it is necessary to normalize values by the minimum fluctuation (i.e., TickSize). At no point is any string conversion being done. Intuitively, I should be able to select "Start Scan" from the Chart Menu and sc.TickSize should update as the chart changes for each symbol in the watchlist. However, sc.TickSize is always 0*. |
[2024-03-21 15:39:59] |
John - SC Support - Posts: 36238 |
The reference to the String conversion is for how you are outputting the information to the log. Make sure you are doing this properly. In other words, our belief is that you are not writing the information properly to the Message Log, and that the Tick Size is being properly picked up. |
[2024-03-21 16:41:05] |
Sierra_Chart Engineering - Posts: 17172 |
The Tick Size is definitely not zero at all. It is set correctly.
[2024-03-21 19:09:39] |
Gradient - Posts: 89 |
Okay, I understand the rationale for the string formatting comment. I did, however, implement the string conversion per the documentation, as I've done several times before with no issues. To further investigate the issue I structured an experiment. My goal is to read in data from a study, compare it to some level, and divide by sc.TickSize. I'm dividing by sc.TickSize to normalize the values, as this is being run via "Start Scan" from the Chart Menu.

Experiment: I computed a metric (i.e. the simple deviation of the current High from the value of a study) 1) using sc.TickSize and 2) without using sc.TickSize. I then stored both values in Subgraphs. I also stored the High used to compute the metric in a subgraph, as well as the reference study output and sc.TickSize. See below:

//reference from study stored in sc.Subgraph[0]

//No Tick Size
sc.Subgraph[5][sc.Index]=abs((sc.Subgraph[0][sc.Index-1]-sc.High[sc.Index-1]));

//With Tick Size
sc.Subgraph[6][sc.Index]=abs((sc.Subgraph[0][sc.Index-1]-sc.High[sc.Index-1]))/sc.TickSize;

//High
sc.Subgraph[7][sc.Index]=sc.High[sc.Index];

//Store sc.TickSize
sc.Subgraph[8][sc.Index]=sc.TickSize

//Graphs

//Reference Study
sc.Subgraph[0].Name="Reference Study";
sc.Subgraph[0].DrawStyle=DRAWSTYLE_LINE;
sc.Subgraph[0].PrimaryColor = RGB(8, 25, 128);
sc.Subgraph[0].LineWidth = 4;
sc.Subgraph[0].DrawZeros = false;
sc.Subgraph[0].DisplayNameValueInDataLine=true;

//Current High
sc.Subgraph[7].Name="Current Chart Value";
sc.Subgraph[7].DrawStyle=DRAWSTYLE_LINE;
sc.Subgraph[7].PrimaryColor = RGB(8, 25, 128);
sc.Subgraph[7].LineWidth = 4;
sc.Subgraph[7].DrawZeros = false;
sc.Subgraph[7].DisplayNameValueInDataLine=true;

//No Tick Size
sc.Subgraph[5].Name="No Tick Size";
sc.Subgraph[5].DrawStyle=DRAWSTYLE_LINE;
sc.Subgraph[5].PrimaryColor = RGB(128, 25, 128);
sc.Subgraph[5].LineWidth = 4;
sc.Subgraph[5].DrawZeros = false;
sc.Subgraph[5].DisplayNameValueInDataLine=true;

//With Tick Size
sc.Subgraph[6].Name="With Tick Size";
sc.Subgraph[6].DrawStyle=DRAWSTYLE_LINE;
sc.Subgraph[6].PrimaryColor =RGB(18, 255, 128);
sc.Subgraph[6].LineWidth = 4;
sc.Subgraph[6].DrawZeros = false;
sc.Subgraph[6].DisplayNameValueInDataLine=true;

//Actual Tick Size
sc.Subgraph[8].Name="Actual Tick Size Value";
sc.Subgraph[8].DrawStyle=DRAWSTYLE_LINE;
sc.Subgraph[8].PrimaryColor =RGB(18, 55, 18);
sc.Subgraph[8].LineWidth = 4;
sc.Subgraph[8].DrawZeros = false;
sc.Subgraph[8].DisplayNameValueInDataLine=true;

I've attached a screenshot of the output from each of the subgraphs. TickSize is still 0 when simply storing the value to a subgraph. Also, the sc.High[sc.Index] of the current chart is 0, as are each of the subgraph outputs. How do you advise to correct this? |
Tick Size Error 2.png - Attached On 2024-03-21 19:08:31 UTC - Size: 19.22 KB |
[2024-03-21 21:54:09] |
Sierra_Chart Engineering - Posts: 17172 |
At this point we are marking the thread as Programming Help.
[2024-03-23 18:48:58] |
ForgivingComputers.com - Posts: 960 |
It is not clear if you are building this correctly, since we are not seeing the entire cpp file. There is no sc.SetDefaults section, and the sequence of lines is incorrect. I don't know if this is your problem, as I don't think it would compile, but this line is missing the semicolon: sc.Subgraph[8][sc.Index]=sc.TickSize
|
[2024-03-29 03:24:32] |
Gradient - Posts: 89 |
I believe I found the issue. I created a simple study just to output the sc.TickSize value (see below):

#include "sierrachart.h"

using namespace std;

SCDLLName("ShowTickSize")

void GetTickSize(SCStudyInterfaceRef sc)
{
    //store tick size in subgraph
    sc.Subgraph[0][sc.Index]=sc.TickSize;
}

SCSFExport scsf_c1(SCStudyInterfaceRef sc)
{
    //Inputs
    if (sc.SetDefaults)
    {
        //set graph defaults
        sc.GraphName= "ShowTickSize";

        //Reference Study
        sc.Subgraph[0].Name="Tick Size";
        sc.Subgraph[0].DrawStyle=DRAWSTYLE_LINE;
        sc.Subgraph[0].PrimaryColor = RGB(8, 25, 128);
        sc.Subgraph[0].LineWidth = 4;
        sc.Subgraph[0].DrawZeros = false;
        sc.Subgraph[0].DisplayNameValueInDataLine=true;
    }

    //display entries and exits on main graph instead of subgraph
    sc.GraphRegion=1;

    GetTickSize(sc);

    return;
};

When running this study, as in my primary study, I noticed that sc.TickSize is sometimes 0 and sometimes returns the actual tick size. I checked the message logs and received an error stating that there was a CPU exception, and the study made the platform unstable until removing it and restarting the platform. I believe this explains why sc.TickSize was 0 in the prior posts. I've attached examples of running this simple study via a scan, which show that sometimes the return value is correct and sometimes it is 0. Ideally, removing the study and restarting the platform should correct this per the message logs. |
TickSize Exception.png - Attached On 2024-03-29 03:20:35 UTC - Size: 20.28 KB | TickSize Working.png - Attached On 2024-03-29 03:22:29 UTC - Size: 63.34 KB | TickSize Not Working.png - Attached On 2024-03-29 03:22:36 UTC - Size: 65.52 KB |
[2024-03-29 03:41:10] |
Gradient - Posts: 89 |
After removing the studies and restarting the platform, I'm still experiencing the same issue. CPU exceptions can be caused by division by zero. In the original study I am dividing by sc.TickSize, which is sometimes zero as I've shown above. This is likely why there is a CPU exception, but it doesn't explain why sc.TickSize is correct sometimes and zero other times. |
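Whatever the root cause, a defensive guard avoids the crash path: skip the division whenever the tick size has not been populated. Below is a self-contained C++ sketch of such a guard (the function name and return convention are illustrative assumptions, not code from the thread):

```cpp
#include <cmath>

// Return the deviation between a reference value and the high, in ticks.
// When tickSize is zero or negative (i.e. not yet populated), return 0
// instead of dividing, so a scan pass with a bad tick size cannot
// trigger a divide-by-zero.
double SafeDeviationInTicks(double reference, double high, double tickSize)
{
    if (tickSize <= 0.0)
        return 0.0; // tick size not populated yet; skip this pass
    return std::fabs(reference - high) / tickSize;
}
```

One caveat: under IEEE 754 floating-point arithmetic, dividing a nonzero float by 0.0 normally yields infinity rather than a hardware exception (exceptions occur only if floating-point traps are unmasked, or with integer division), so the guard is good practice either way but may not be the whole story behind the logged CPU exception.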
[2024-03-29 16:39:31] |
Gradient - Posts: 89 |
UPDATE: I believe that I corrected this issue by just creating a new study. I implemented the same logic in a new DLL and tested it and it appears to be working properly. |