Concurrent read-only access to SDF files and SSCE_M_OUTOFMEMORY

  • Question

  • Greetings...

    I've put together a web API that reads data from several different SDF files.  I say this in the full knowledge that SQL Server Compact 4.0 SP1 is not really robust enough for a web environment.  We'll be moving to SQL Server in the future, but for the time being I have this issue to deal with.

    First off, the SDF files are only ever opened read-only.  Concurrent access to the same files works most of the time, but there are cases where SQL CE throws a SqlCeException.  The details in the exception are...


    with a single error in the Errors collection carrying the message "There is not enough memory on the device running SQL Server Compact to complete this operation."

    Once one of these is thrown, just about any subsequent attempt to read these files fails in the same way until I recycle the app pool.

    The connection string settings for accessing the files are...

                    SqlCeConnectionStringBuilder builder = new SqlCeConnectionStringBuilder();
                    builder.DataSource = <Path to SDF file>;
                    builder.MaxDatabaseSize = 4091;
                    builder.FileMode = "Read Only";
                    builder.TempFilePath = System.IO.Path.GetTempPath();
                    using ( var connection = new SqlCeConnection( builder.ConnectionString ) )
                    { .... }

    The exception is thrown when opening the connection and sometimes when executing the reader.

    One final observation: another part of this API also reads SDF files using almost identical connection settings.  I do not have proof, but there must be concurrent accesses to the same files in that case as well, yet I have never seen this exception thrown there, even when the first component is throwing them right and left.

    The API does use a lot of memory, but it does not seem to struggle anywhere except here.

    My questions...

    1. Is there any way to know which level of memory management is encountering the problem?  I.e. C++, C-RTL, HeapAlloc, Win32, ... ?
    2. I have seen some comments that the MaxDatabaseSize setting is only used to allocate memory when there are concurrent accesses to the same file, and that when that occurs it must be large enough to hold the entire database in memory.  Is this correct?
    3. If this is correct, does the fact that all of my SDF connections are read-only change that behavior?
      The reason I ask is that I have set MaxDatabaseSize to 20 for files that are significantly larger than that, expecting it to fail.  It does not fail in my case.
    4. In general, can anyone explain this behavior?

    Thanks in advance for the help.

    Regards, Ken

    Monday, September 23, 2019 3:33 PM

All replies

  • >> there is another part of this API that also reads SDF files using almost identical connection settings. 

    What are the differences? It is important that access to a particular file uses the exact same connection string.

    Have you tried without read only mode?

    Please mark as answer if this was it. Visit my SQL Server Compact blog

    Tuesday, October 1, 2019 5:34 AM
  • Thanks for the follow-up...

    I think I misled you with the "almost identical" connection strings.  It is more accurate to say that whenever an SDF file is accessed concurrently, exactly the same connection string is used.  The two components of the API that I was referring to never open the same files.  Within a component, different threads can access the same SDF file at the same time, and when this occurs both threads use the same connection string.
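    To make "exactly the same connection string" concrete, one way to enforce it is to build the string once per file path and hand every thread the cached copy.  The sketch below is purely illustrative (the `SdfConnectionStrings` class and `For` method are hypothetical names, not from this thread); the keyword form of the string mirrors the builder properties used above.

    ```csharp
    using System;
    using System.Collections.Concurrent;

    // Illustrative sketch only: cache one connection string per SDF path so that
    // every thread opening a given file uses a byte-for-byte identical string.
    static class SdfConnectionStrings
    {
        static readonly ConcurrentDictionary<string, string> cache =
            new ConcurrentDictionary<string, string>(StringComparer.OrdinalIgnoreCase);

        public static string For(string sdfPath)
        {
            // GetOrAdd builds the string once per path; later callers get the cached copy.
            return cache.GetOrAdd(sdfPath, path =>
                $"Data Source={path};Max Database Size=4091;Mode=Read Only;" +
                $"Temp Path={System.IO.Path.GetTempPath()}");
        }
    }

    class Demo
    {
        static void Main()
        {
            string a = SdfConnectionStrings.For(@"C:\data\example.sdf");
            string b = SdfConnectionStrings.For(@"C:\data\example.sdf");
            // Same cached instance, so the two strings cannot differ.
            Console.WriteLine(ReferenceEquals(a, b));
        }
    }
    ```

    With this in place, two threads opening `new SqlCeConnection(SdfConnectionStrings.For(path))` for the same file are guaranteed to present identical strings to SQL CE.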

    I have not tried dropping the read-only mode.  Why do you think that could help?  I would imagine that the memory demands for concurrent read/write access to an SDF file would be greater than for read-only access.

    Tuesday, October 1, 2019 6:17 PM