Support Board


Date/Time: Thu, 25 Apr 2024 10:05:58 +0000



Market Depth Historical Graph - Memory Error

View Count: 1056

[2021-03-06 12:01:59]
User907968 - Posts: 802
Hi,

Recently I have been seeing this error message more frequently, mainly affecting ES & NQ:


Failed to add to the depth data for ESH21-CME, possibly as a result of running out of memory. The depth data accounted for 1.63 gigabytes of memory. Depth data will be unavailable until the chart can be reloaded. | 2021-03-06 05:22:19.559

What I have done already:
Read this - Working With Charts: Out of Memory Condition
Read this - Market Depth Historical Graph: Market Depth Historical Graph Not Displaying
Reduced days to load to 1
Reduced number of days to load depth to 1

Sometimes the depth data will disappear completely until the chart is reloaded.

My charts are 1 second (or less) so I can understand that near the end of the trading day there is potentially a substantial amount of depth data associated with the chart bars.

That being said, I have separate instances specifically for running these charts, and the total memory used by any individual instance is typically less than 1 GB.
There is no shortage of system memory available (>20 GB), so although it would be easy for me to increase the system memory, it doesn't seem as though that would actually be a solution.
[2021-03-06 15:07:15]
nosast - Posts: 290
I have the exact same issue. Nearly every day the depth just stops and I need to reload the chart. Same log message, and also plenty of memory available for SC to allocate.

Presumably this happens more often at the moment due to the increased volatility, and SC is running into some allocation limit. I would be more than fine with SC using all available memory if it is needed, and would happily upgrade the system RAM.

I also noticed that Bookmap, running on another machine, has recently been using a lot of memory for nearly the same NQ chart (around 7-8 GB).
Date Time Of Last Edit: 2021-03-06 15:07:41
[2021-03-06 17:23:32]
Sierra Chart Engineering - Posts: 104368
There is no shortage of system memory available (>20 GB), so although it would be easy for me to increase the system memory, it doesn't seem as though that would actually be a solution.
The Windows operating system is failing to grant the memory request.
Sierra Chart Support - Engineering Level

Your definitive source for support. Other responses are from users. Try to keep your questions brief and to the point. Be aware of support policy:
https://www.sierrachart.com/index.php?l=PostingInformation.php#GeneralInformation

For the most reliable, advanced, and zero cost futures order routing, *change* to the Teton service:
Sierra Chart Teton Futures Order Routing
[2021-03-06 18:44:52]
nosast - Posts: 290
And how could there be a fix or workaround? I'm not sure there is anything we could set at the OS level, and other programs are able to request more memory with no problem.

I guess the issue is per chart, as I don't have the same problem when opening many more charts, and SC then uses a lot more memory according to Task Manager.

As a workaround, would it be possible to, e.g., store the depth data on the filesystem instead of in RAM when one chart gets over 1 GB or so?
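To illustrate what I mean, something along these lines: records stay in RAM until a byte budget is exceeded, then get appended to a file instead. This is purely a sketch; the class name, file path, and threshold are made up, not anything SC actually does.

```cpp
#include <cstddef>
#include <fstream>
#include <string>
#include <utility>
#include <vector>

// Illustrative spill-to-disk buffer: keeps records in RAM up to a byte
// budget, then appends the overflow to a file on the filesystem.
class SpillBuffer {
public:
    SpillBuffer(std::string path, std::size_t ram_limit_bytes)
        : path_(std::move(path)), ram_limit_(ram_limit_bytes) {}

    void append(const std::string& record) {
        if (ram_bytes_ + record.size() <= ram_limit_) {
            ram_.push_back(record);          // still within the RAM budget
            ram_bytes_ += record.size();
        } else {
            // over budget: append to the spill file instead of RAM
            std::ofstream out(path_, std::ios::app | std::ios::binary);
            out << record << '\n';
            ++spilled_;
        }
    }

    std::size_t in_ram() const { return ram_.size(); }
    std::size_t spilled() const { return spilled_; }

private:
    std::string path_;
    std::size_t ram_limit_;
    std::size_t ram_bytes_ = 0;
    std::size_t spilled_ = 0;
    std::vector<std::string> ram_;
};
```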
[2021-03-06 18:58:40]
Sierra Chart Engineering - Posts: 104368
Not sure what the solution would be. Certainly system configuration changes must have some effect. Try increasing system memory and try disabling the system paging file.

Maybe there is something we can do with Windows API memory management functions for managing memory for the process, but this is not something we can allocate time to now.
Date Time Of Last Edit: 2021-03-06 18:59:53
[2021-03-06 19:38:36]
User907968 - Posts: 802
try disabling the system paging file
This made no difference for me.

Try increasing system memory
Can do, certainly, memory is inexpensive in the grand scheme of things, but I'll eat my hat if it makes any real difference.

Maybe there is something we can do with Windows API memory management functions for managing memory for the process, but this is not something we can allocate time to now.
Ok fair enough, but surely we can't be the only users affected by this.
[2021-03-07 16:02:05]
Sierra Chart Engineering - Posts: 104368
Adding more memory may or may not make a difference; it is impossible for us to know. Only Microsoft could answer that question definitively.

It might actually make a difference, because the problem could be that the operating system is not able to allocate a large enough contiguous block of memory, given the memory state of your system at the time the memory is requested.

So if you do have a lot more memory available, it might actually make a difference.
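The distinction matters: a single request for N bytes needs N contiguous bytes of free address space, whereas the same total split into chunks only needs many smaller free regions, so fragmentation hurts it far less. A minimal sketch of the two cases (sizes and function names here are illustrative, not Sierra Chart's actual ones):

```cpp
#include <cstddef>
#include <memory>
#include <new>
#include <vector>

// One contiguous request: fails if no single free region of address
// space is large enough, regardless of total free memory.
bool allocate_one_block(std::size_t total_bytes) {
    std::unique_ptr<char[]> p(new (std::nothrow) char[total_bytes]);
    return p != nullptr;
}

// Same total split into fixed-size chunks: each request only needs a
// chunk-sized free region, so a fragmented address space can still
// satisfy it where the single big block would fail.
bool allocate_in_chunks(std::size_t total_bytes, std::size_t chunk_bytes) {
    std::vector<std::unique_ptr<char[]>> chunks;
    for (std::size_t done = 0; done < total_bytes; done += chunk_bytes) {
        std::unique_ptr<char[]> c(new (std::nothrow) char[chunk_bytes]);
        if (!c)
            return false;
        chunks.push_back(std::move(c));
    }
    return true;  // all chunks are freed when 'chunks' goes out of scope
}
```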
Date Time Of Last Edit: 2021-03-07 16:02:59
[2021-03-07 16:22:58]
Mr.Arsov - Posts: 16
I've run into the same issue on ES/NQ when I try to load more than 2-3 days of depth data, and it's the exact same error with 1.63 GB. I've noticed it happens on faster/lower timeframes. For example, if I try to load 5 days of depth on a 1-second chart, I can't, but 5 days on a 10-minute chart is no problem.

One thing that helps to load more data is filtering the size of liquidity, so the historical market depth shows only liquidity above 10%-15% (for example) of the maximum visible orders.
I'm running SC on Linux Mint 19.3 through Wine if that makes any difference.
Edit: I'm just giving my input if it's of any help to SC support. I personally don't mind it since I don't need more than a few hours of depth data.
Date Time Of Last Edit: 2021-03-07 16:24:35
[2021-03-10 12:36:40]
User907968 - Posts: 802
Adding more memory made no difference; even with >55 GB free and a single chart open, there seems to be a hard limit of ~1.6 GB for depth data.

operating system is not able to allocate a large enough continuous block of memory
Certainly a consideration, but I can allocate a large memory block fine using 'new', 'malloc', or 'std::vector', for example.
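A quick check along the lines I mean (just a sketch; the function name is made up, and the size to probe with would be the ~1.63 GB figure from the log message):

```cpp
#include <cstddef>
#include <new>
#include <vector>

// Probe whether the process can obtain one contiguous block of the
// given size via std::vector (which guarantees contiguous storage).
bool can_allocate(std::size_t bytes) {
    try {
        std::vector<char> block(bytes);  // allocates and zero-fills
        (void)block;
        return true;                     // freed on scope exit
    } catch (const std::bad_alloc&) {
        return false;
    }
}
```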

edit:
As a workaround, I am currently switching to RTH hours only just after the open to reduce the amount of depth data loaded. So far this has meant that the depth data doesn't randomly disappear from the screen when the 'memory limit' is reached.
Date Time Of Last Edit: 2021-03-10 13:50:00
[2021-03-10 14:03:50]
Sierra Chart Engineering - Posts: 104368
OK, we probably therefore have to change how we manage memory and use multiple smaller blocks. Not sure how soon we can get to this.
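The multiple-smaller-blocks idea can be sketched roughly as follows: a byte store that grows in fixed-size chunks, so no single allocation ever needs one huge contiguous region. The class name and chunk size are made up for illustration; this is not Sierra Chart's actual design.

```cpp
#include <cstddef>
#include <cstdint>
#include <memory>
#include <vector>

// Growable byte store backed by fixed-size chunks instead of one
// contiguous buffer. Indexing maps a logical offset to (chunk, offset).
class ChunkedStore {
public:
    explicit ChunkedStore(std::size_t chunk_size = 64 * 1024 * 1024)
        : chunk_size_(chunk_size) {}

    void push_back(std::uint8_t byte) {
        if (size_ % chunk_size_ == 0)  // current chunk full (or none yet)
            chunks_.push_back(std::make_unique<std::uint8_t[]>(chunk_size_));
        chunks_[size_ / chunk_size_][size_ % chunk_size_] = byte;
        ++size_;
    }

    std::uint8_t operator[](std::size_t i) const {
        return chunks_[i / chunk_size_][i % chunk_size_];
    }

    std::size_t size() const { return size_; }

private:
    std::size_t chunk_size_;
    std::size_t size_ = 0;
    std::vector<std::unique_ptr<std::uint8_t[]>> chunks_;
};
```

The trade-off is one extra division per access in exchange for never requesting more than one chunk of contiguous address space at a time.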
