Support Board


Post From: ACSIL sc.TickSize Returns 0*

[2024-03-21 04:58:01]
Gradient - Posts: 89
I don't understand.

Per the documentation, sc.TickSize is a float value: ACSIL Interface Members - Variables and Arrays: sc.TickSize


There's no need to convert to a string to normalize float values.

Since I'm iterating over a watchlist of multiple symbols to compute metrics for each one, I need to normalize the values by each symbol's respective tick size.
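For reference, here is a minimal sketch of the kind of per-symbol normalization I mean. The study and subgraph names are placeholders, not my actual code; the point is that sc.TickSize is read as a float in the main study code path, with a guard in case it is 0:

#include "sierrachart.h"

SCDLLName("TickSize Normalization Sketch")

SCSFExport scsf_TickSizeNormalization(SCStudyInterfaceRef sc)
{
	if (sc.SetDefaults)
	{
		sc.GraphName = "Tick Size Normalization Sketch";
		sc.Subgraph[0].Name = "Normalized Range";
		sc.AutoLoop = 1;
		return; // Per the thread below, sc.TickSize may not be set here yet
	}

	// Guard: if the tick size has not been populated, skip rather than divide by 0.
	if (sc.TickSize == 0.0f)
		return;

	// Normalize the current bar's high-low range into ticks.
	sc.Subgraph[0][sc.Index] = (sc.High[sc.Index] - sc.Low[sc.Index]) / sc.TickSize;
}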

I saw that there was a prior issue with sc.TickSize here: ACSIL: sc.SetDefaults and sc.TickSize

Why should sc.TickSize be converted to a string?

How is it possible for the return value to be 0* for each symbol? This is similar to the issue in the thread referenced above. Was that issue ever resolved?
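To illustrate what I'm checking, here is a minimal diagnostic sketch (names are placeholders) that logs sc.TickSize from both code paths. If the prior thread's explanation still applies, the sc.SetDefaults path is where a 0 would be expected, since the symbol settings have not been applied at that point:

#include "sierrachart.h"

SCDLLName("TickSize Diagnostic Sketch")

SCSFExport scsf_TickSizeDiagnostic(SCStudyInterfaceRef sc)
{
	SCString Message;

	if (sc.SetDefaults)
	{
		sc.GraphName = "TickSize Diagnostic Sketch";
		sc.AutoLoop = 1;

		// Expected to log 0 if sc.TickSize is not yet populated here.
		Message.Format("SetDefaults: TickSize for %s = %f", sc.Symbol.GetChars(), sc.TickSize);
		sc.AddMessageToLog(Message, 0);
		return;
	}

	// Logged once per full recalculation; TickSize should be nonzero here.
	if (sc.Index == 0)
	{
		Message.Format("Main: TickSize for %s = %f", sc.Symbol.GetChars(), sc.TickSize);
		sc.AddMessageToLog(Message, 0);
	}
}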