I'm running an embedded StreamInsight (SI) instance that has approximately 100 running queries, some of which have a data frequency of 25 Hz, so quite a large volume of data is passing through the system.
The problem is that whenever the application's memory usage (which I'm monitoring with Sysinternals Process Explorer) reaches around 1.4 GB, one of my high-frequency observers throws a virtual-alloc memory exception and processing in the system terminates.
The 64-bit Windows Server 2012 machine that the instance runs on has 64 GB of installed RAM. How do I make all of this memory available to my SI instance?
First, it will be available unless you are running a 32-bit process - and hitting a wall at around 1.4 GB is a classic symptom of one. Make sure your process's platform target is set to either "Any CPU" or "x64" (and, if you use "Any CPU" in newer versions of Visual Studio, that "Prefer 32-bit" is unchecked). I believe the default for a console application is x86 only.
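As a sketch, the platform target lives in the project file (or under Project Properties → Build in Visual Studio). A .csproj configured for 64-bit would contain something like:

```xml
<!-- Sketch: inside the relevant PropertyGroup of your .csproj -->
<PropertyGroup>
  <!-- "AnyCPU" runs 64-bit on a 64-bit OS; "x64" forces it -->
  <PlatformTarget>x64</PlatformTarget>
  <!-- If you stay on AnyCPU, make sure Prefer32Bit is off (VS 2012+) -->
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
```

At run time you can double-check which mode you actually got by reading `Environment.Is64BitProcess`.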
Second, make sure that you aren't running under the debugger.
From there, check the code for your observer and make sure that it's cleaning up after itself: dispose all disposable objects and release references to events once you're done with them, so the garbage collector can reclaim that memory. Ideally, also capture a stack trace when the exception gets thrown - that'll help you determine where it's happening.
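As a rough sketch of those two points (the `LoggingSink` type and its log file are hypothetical, not part of your code), an observer that disposes what it owns and logs the full exception - including the stack trace - in `OnError` might look like this:

```csharp
using System;
using System.IO;

// Hypothetical observer attached to a StreamInsight query output.
// The two habits it illustrates: dispose everything you own, and
// log ex.ToString() in OnError so you capture the stack trace at
// the point where the allocation failure surfaces.
public sealed class LoggingSink<T> : IObserver<T>, IDisposable
{
    private readonly StreamWriter _writer = new StreamWriter("sink.log");

    public void OnNext(T value)
    {
        // Process the event here; avoid caching references to old
        // events, or the GC can never reclaim them and the process's
        // memory footprint grows without bound.
    }

    public void OnError(Exception ex)
    {
        // ex.ToString() includes the full stack trace - exactly what
        // you need to locate where the memory exception originates.
        _writer.WriteLine(ex.ToString());
        Dispose();
    }

    public void OnCompleted()
    {
        Dispose();
    }

    public void Dispose()
    {
        _writer.Dispose();
    }
}
```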
DevBiker (aka J Sawyer)
Microsoft MVP - SQL Server (StreamInsight)
If I answered your question, please mark as answer.
If my post was helpful, please mark as helpful.