Support Board

[Programming Help] - Automated Strategy/Study - Processing Each Tick Considerations

View Count: 1485

[2018-06-13 21:40:18]
User281933 - Posts: 4
I am trying to design an automated strategy so that it will not start to lag in busy markets (when the ticks are coming in very fast). I am trying to understand/map out the logical sequence of events (and related considerations/limitations) that happen in Sierra Chart for a strategy during a busy market (e.g. ES at news time). I have looked into the documentation, but I still can't get it straight in my head how best to design the strategy to cope with fast moving markets.

I would be grateful if anyone can contribute to my understanding of the best methods to mitigate lag in the strategy. If the information is available in the documentation, or has been addressed previously, then I apologise, but would appreciate a pointer to the appropriate location.

Requirements, Assumptions and Context
Assume that the Strategy must be executed on each tick (e.g. it’s a short term strategy).

Assume that there are a few indicators that need to be updated and also some calculations to do on level II data.

Assume that there is no lag in the datafeed, other than the natural lag/jitter from the Exchange to the computer location through the internet (due to distance considerations).

Assume that the computer has sufficient CPU cores, memory and disk speed for the majority of calculations - we are concentrating on what happens when the market gets busy and the ticks are starting to come in too fast to allow the relevant calculations to be carried out for each tick.

Conceptually, each time a new tick comes in, the strategy performs a typical number of calculations. A simple assumption is that each of these strategy loops takes a certain (average) amount of time to perform. That means that there is a certain number of ticks that can be calculated each second, and if the ticks start to come in faster than that, then something has to give - which one of the scenarios below happens, or does something else happen?

Scenario 1: The application diligently calls the strategy for each tick that comes in, and the strategy diligently performs the same (average) set of calculations, so the strategy starts to lag. Any orders submitted by the strategy while in a state of lag will likely not be applicable.

Scenario 2: The application diligently calls the strategy for each tick that comes in, but the strategy is designed for this, and performs some sort of quick test on each tick to check how long it's been since the last tick (or something similar) - if it's too short a time, then the strategy performs a subset of calculations - thus (hopefully) not falling behind.

Scenario 3: The application knows if the strategy is falling behind and skips data, or signals somewhere (where the strategy can check) that there is a processing lag.

I'm not looking for code or anything, just the logical progression or options available so I can design appropriately.

Regards,

Kieran
[2018-06-13 23:03:51]
Sierra Chart Engineering - Posts: 104368
Is this a spreadsheet automated trading system or does it use ACSIL? We recommend using ACSIL.

In the case of ACSIL, read this section:
Working with ACSIL Arrays and Understanding Looping: Update Study Function Calls
Sierra Chart Support - Engineering Level

Your definitive source for support. Other responses are from users. Try to keep your questions brief and to the point. Be aware of support policy:
https://www.sierrachart.com/index.php?l=PostingInformation.php#GeneralInformation

For the most reliable, advanced, and zero cost futures order routing, *change* to the Teton service:
Sierra Chart Teton Futures Order Routing
[2018-06-14 08:22:48]
User281933 - Posts: 4
Thanks for the information.

Yes it uses ACSIL.

So if I'm understanding this correctly, in the Update Study Function Calls section, it sounds like if the chart update interval is set to, say, 200ms, and multiple trade market data updates are received within that time frame, then the Study can step through from the Prior Array Size to the Current Array Size and perform calculations for each piece of market data. Is this correct?
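
For illustration only (this is not from the thread or the documentation page itself), a minimal ACSIL sketch of that manual looping pattern might look like the following, assuming sc.UpdateStartIndex and sc.ArraySize as the loop bounds; the study name and everything inside the loop are placeholders:

#include "sierrachart.h"

SCDLLName("Per-Update Looping Sketch")

SCSFExport scsf_PerUpdateLoopingSketch(SCStudyInterfaceRef sc)
{
    if (sc.SetDefaults)
    {
        sc.GraphName = "Per-Update Looping Sketch";
        sc.AutoLoop = 0;  // manual looping: the study controls the index range itself
        return;
    }

    // sc.UpdateStartIndex is the first bar index that is new or has changed
    // since the previous call into this function; sc.ArraySize is the
    // current number of bars in the chart.
    for (int Index = sc.UpdateStartIndex; Index < sc.ArraySize; ++Index)
    {
        // Per-element strategy calculations would go here.
    }
}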

If the above is correct, the question then becomes what happens if the Study cannot step/loop through all of the captured market data (performing calculations for each value) before it's next called in 200ms (due to a lack of processing power)? It sounds like the Study might start to lag behind until the market data rate drops enough for it to catch up...

If correct, the above may not happen very often, but is there any recommended way to perform a relatively quick check to see whether the time between market data events is below a certain threshold? That would allow a subset of calculations to be performed, rather than the full set, in order to prevent the Study from potentially going into a lagging state.
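
There is no official throttling mechanism discussed in this thread, but as a purely illustrative sketch of the "quick check" idea: remember the wall-clock time of the last full calculation pass in a persistent variable and only do the expensive work when enough time has passed. The 100 ms threshold, the persistent-variable key and the split between cheap and expensive work are arbitrary choices here, and the helpers used (sc.GetPersistentDouble, sc.CurrentSystemDateTime, SECONDS_PER_DAY) are assumed to behave as described in the ACSIL documentation:

#include "sierrachart.h"

SCDLLName("Throttled Calculation Sketch")

SCSFExport scsf_ThrottledCalculationSketch(SCStudyInterfaceRef sc)
{
    if (sc.SetDefaults)
    {
        sc.GraphName = "Throttled Calculation Sketch";
        sc.AutoLoop = 0;
        return;
    }

    // Wall-clock time (in days, SCDateTime units) of the last full
    // calculation pass, kept across calls in a persistent variable.
    double& r_LastFullCalcTime = sc.GetPersistentDouble(1);

    const double NowInDays = sc.CurrentSystemDateTime.GetAsDouble();
    const double ElapsedSeconds = (NowInDays - r_LastFullCalcTime) * SECONDS_PER_DAY;

    // Arbitrary 100 ms minimum interval between full calculation passes.
    const double MinSecondsBetweenFullCalcs = 0.1;

    // Cheap work that must always run (e.g. order management checks).
    for (int Index = sc.UpdateStartIndex; Index < sc.ArraySize; ++Index)
    {
        // ...
    }

    // Expensive work (indicators, level II statistics) only runs when
    // enough time has passed since the last full pass.
    if (ElapsedSeconds >= MinSecondsBetweenFullCalcs)
    {
        // ...
        r_LastFullCalcTime = NowInDays;
    }
}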

From what I can understand, the question is still valid for the case where sc.OnExternalDataImmediateStudyCall is set to TRUE.

Hopefully the above questions make sense...

Regards,

Kieran
Date Time Of Last Edit: 2018-06-14 08:24:05
[2018-06-14 08:48:13]
Flipper - Posts: 65
200ms is a very long time for SC. What studies are you running that take longer than that to calculate?
[2018-06-14 09:33:30]
User281933 - Posts: 4
Hi Flipper,

The 200ms could be any number - say 50ms.

I'm new to Sierra - I've previously been with Ninja, and have some programming experience.

I'm just trying to shine a light into the dark corners where the market data starts to come in too fast to allow a Study to be executed for each market data event that has been received since the last Update Study Function Call.

If 10 market data events come in within the 50ms window, then each of the 10 market data values can be looped through (and associated strategy calculations performed) before the next Update Study Function call happens (50 ms later).

So what happens if 50 events come in the 50ms window? What happens if 300 come in - eventually we get to the stage that there are too many market data events within the update interval - does the Study start to lag further and further behind the actual market price due to a lack of processing power to keep up with all of the computations that have to be done for each data value (assuming that the same calculations are performed for each market value)?

It's not about passing judgement (good or bad) on Sierra - it's about me trying to understand the designed behavior of the software in edge cases where the market gets very busy, but the Study is one that (ideally) requires all market data points to be evaluated...

Apologies - I understand what I'm trying to ask in my head - putting it down on (virtual) paper seems to be the difficult aspect of things :-(

Regards,

Kieran
[2018-06-14 10:27:17]
Flipper - Posts: 65
Welcome to the proper side. I was once there too, with NT. SC is nothing like Ninja.

To check how long it takes each study to calculate, you can open the Chart Studies window (F6); just after the name of each study loaded on your chart you can see how long it takes to calculate. Most are sub 10 ms.

SC is very high performance, nothing like NT. Test it. Even try a fast replay like 60X. I'd be surprised if you can get it to lag.
[2018-06-15 17:33:07]
Sierra Chart Engineering - Posts: 104368
then the Study can step through from the Prior Array Size to the Current Array Size and perform calculations for each piece of market data. Is this correct?
The page we linked to covers this subject in its entirety. You just need to read the documentation. We will not say anything here about this. Reference:

Working with ACSIL Arrays and Understanding Looping


If the above is correct, the question then becomes what happens if the Study cannot step/loop through all of the captured market data (performing calculations for each value) before it's next called in 200ms (due to a lack of processing power)? It sounds like the Study might start to lag behind until the market data rate drops enough for it to catch up...

The study function is not going to get called again until it finishes the calculations and the next chart update interval occurs. This is never a problem. If a chart update needs to be skipped because of the time to calculate a study, it will be skipped.

Also, never use sc.OnExternalDataImmediateStudyCall. We should remove that.
Date Time Of Last Edit: 2018-06-15 17:48:29
[2018-06-18 20:22:50]
User109179 - Posts: 5
To access individual trades and bid/ask updates in between calls into the study function, use the following functions:

sc.GetTimeAndSales

Is there a similar function to GetTimeAndSales providing access to all the bid/ask updates for the entire depth? It appears to provide only the level 1 updates.

c_ACSILDepthBars
This looks like it wraps the level 2 updates and provides some stats on each bar, but doesn't give access to all the individual updates.
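
For reference on the level 1 side, a rough sketch of walking the Time and Sales records with sc.GetTimeAndSales might look like the following, assuming the c_SCTimeAndSalesArray / s_TimeAndSales interface as described in the ACSIL documentation; the use of a persistent sequence number to skip records already handled on an earlier call is an illustrative choice, not something stated in this thread:

#include "sierrachart.h"

SCDLLName("Time And Sales Walk Sketch")

SCSFExport scsf_TimeAndSalesWalkSketch(SCStudyInterfaceRef sc)
{
    if (sc.SetDefaults)
    {
        sc.GraphName = "Time And Sales Walk Sketch";
        sc.AutoLoop = 0;
        return;
    }

    // Sequence number of the last record already processed, kept across
    // calls so each Time and Sales record is only handled once.
    int& r_LastProcessedSequence = sc.GetPersistentInt(1);

    c_SCTimeAndSalesArray TimeSales;
    sc.GetTimeAndSales(TimeSales);

    for (int Index = 0; Index < TimeSales.Size(); ++Index)
    {
        const s_TimeAndSales& Record = TimeSales[Index];

        if (static_cast<int>(Record.Sequence) <= r_LastProcessedSequence)
            continue;  // already seen on an earlier call into the study function

        // Record.Type distinguishes trades from bid/ask quote updates.
        // Per-record calculations would use Record.Price, Record.Volume,
        // Record.Bid, Record.Ask, Record.DateTime and so on.

        r_LastProcessedSequence = static_cast<int>(Record.Sequence);
    }
}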
[2018-06-19 00:02:37]
Sierra Chart Engineering - Posts: 104368
No, there is no similar function for market depth data.
