How to set the properties 'DefaultMaxBufferSize' and 'DefaultMaxBufferRows'

  • Question

  • In this document: http://technet.microsoft.com/en-us/library/cc966529.aspx 

    "For example, if your Estimated Row Size is 500 bytes per record, 500 bytes/record* 10,000 records, is approximately 4.8 MB which is less than DefaultMaxBufferSize of 10 MB, but greater than a MinBufferSize of 64 KB. In this scenario, SSIS will make adjustments to size the buffer as closely as possible to 4.8 MB."

    So why don't we simply set DefaultMaxBufferRows to the maximum value of 100M, so that we always get the best performance? Am I missing something here? (A sketch of the sizing rule follows this post.)

     

    Sunday, December 5, 2010 12:10 PM
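    A minimal sketch of the sizing rule the quoted article describes, in Python. This is not the actual engine code: the 10 MB DefaultBufferSize and 64 KB MinBufferSize figures come from the article, while the 100 MB MaxBufferSize ceiling is an assumption about the SSIS versions of that era.

        # Rough sketch of the buffer-sizing rule described in the quoted article.
        # Not the actual SSIS engine code; constants below follow the article
        # (10 MB DefaultBufferSize, 64 KB MinBufferSize) plus an assumed 100 MB
        # hard ceiling (MaxBufferSize) for the SSIS versions of that era.

        DEFAULT_BUFFER_SIZE = 10 * 1024 * 1024   # DefaultBufferSize, 10 MB
        MIN_BUFFER_SIZE     = 64 * 1024          # MinBufferSize, 64 KB
        MAX_BUFFER_SIZE     = 100 * 1024 * 1024  # assumed hard upper limit

        def effective_buffer(estimated_row_size, default_buffer_max_rows,
                             default_buffer_size=DEFAULT_BUFFER_SIZE):
            """Return the (rows_per_buffer, buffer_bytes) the engine would aim for."""
            requested = estimated_row_size * default_buffer_max_rows
            if requested <= default_buffer_size:
                # The row count is the limiting factor; the buffer shrinks toward
                # the requested size (but never below MinBufferSize).
                buffer_bytes = max(requested, MIN_BUFFER_SIZE)
                rows = default_buffer_max_rows
            else:
                # DefaultBufferSize is the limiting factor; the row count is reduced.
                buffer_bytes = min(default_buffer_size, MAX_BUFFER_SIZE)
                rows = buffer_bytes // estimated_row_size
            return rows, buffer_bytes

        # The article's example: 500 bytes/row * 10,000 rows ~= 4.8 MB < 10 MB.
        print(effective_buffer(500, 10_000))        # -> (10000, 5000000)

        # Raising the row limit alone does not give a huge buffer by itself:
        # the buffer is still capped by DefaultBufferSize.
        print(effective_buffer(500, 100_000_000))   # -> (20971, 10485760)

    The second call shows why a very high row limit changes little on its own: the buffer is still capped by DefaultBufferSize, which is the point taken up in the replies below.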

All replies

  • Be aware you may have more than one buffer running through the pipeline at any given time, so you may end up consuming all your memory very quickly. Plus, if you have packages executing simultaneously, that will exacerbate the problem (a back-of-the-envelope estimate of this is sketched after this post).

    Also, if you have a big buffer being filled from a slow source, it may take a long time to fill the buffer before it moves through the pipeline, so any slow downstream components lose the performance advantage of being fed rows continually while the source catches up.

    Cheers, James


    James Beresford @ www.bimonkey.com
    SSIS / MSBI Consultant in Sydney, Australia
    SSIS ETL Execution Control and Management Framework @ SSIS ETL Framework on Codeplex
    Sunday, December 5, 2010 11:12 PM
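    A back-of-the-envelope sketch of the memory point above. The "buffers in flight per data flow" figure is an illustrative assumption; the real number depends on the execution trees and transforms in the package, and SSIS does not report it this simply.

        # Rough memory estimate for the point above: several buffers can be in
        # flight per data flow (one set per execution tree, plus copies held by
        # asynchronous/blocking transforms), and parallel packages multiply that
        # again. The figures below are illustrative assumptions, not values
        # reported by SSIS.

        def estimated_buffer_memory(buffer_bytes, buffers_in_flight_per_flow,
                                    concurrent_data_flows):
            """Very rough upper bound on pipeline buffer memory."""
            return buffer_bytes * buffers_in_flight_per_flow * concurrent_data_flows

        MB = 1024 * 1024

        # 10 MB buffers, ~5 buffers in flight per data flow, 1 package: ~50 MB.
        print(estimated_buffer_memory(10 * MB, 5, 1) / MB)    # 50.0

        # Raise the buffer to 100 MB and run 4 packages at once: ~2 GB of buffers.
        print(estimated_buffer_memory(100 * MB, 5, 4) / MB)   # 2000.0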
  • Hi BI Monkey, thanks for your reply. I'm still not sure what I should do. If I simply set DefaultMaxBufferRows to the maximum value of 100M, what is the drawback? According to that document, SSIS will adjust the buffer size as closely as possible to the actual value.

    Monday, December 6, 2010 4:19 PM
  • Note that you want to make changes to both DefaultBufferMaxRows and DefaultBufferSize. It is the combination of these two properties together that helps control your data flow. As James stated, it is sometimes advisable to limit the size of the buffers, as large buffers can put increased pressure on your system. In addition, some components use a private buffer to enact changes to the public buffer of the data flow, which may be non-optimal.

    The main thing to remember here is that you need to test which settings are correct for your environment. Note that these settings may well differ depending on the load the system is under at any given point in time, so it may even be optimal to use different settings depending on the day of month, time of day, etc. Test, test, test... (a sketch of a simple test grid follows this post).

    * Just as a side note, large buffer sizes are often optimal. But I've had systems that absolutely depended on very small buffer sizes in order to complete in a timely fashion.


    Please mark answered posts. Thanks for your time.
    Monday, December 6, 2010 8:02 PM
    Moderator
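    A sketch of the "test, test, test" advice as a simple parameter grid. The candidate values are illustrative assumptions, not recommendations, and run_and_time is a hypothetical placeholder for however you execute and time the package in your environment.

        import itertools

        # Sketch of a simple test grid for the advice above. Candidate values
        # are illustrative assumptions; run_and_time() is a hypothetical
        # placeholder for however you execute and time the package.

        MB = 1024 * 1024
        ESTIMATED_ROW_SIZE = 500  # bytes, from the example in the original question

        candidate_rows  = [10_000, 50_000, 100_000]
        candidate_sizes = [10 * MB, 25 * MB, 50 * MB]

        def run_and_time(max_rows, buffer_size):
            # Hypothetical: apply the two properties to the data flow task,
            # execute the package, and return elapsed seconds.
            raise NotImplementedError

        for max_rows, buffer_size in itertools.product(candidate_rows, candidate_sizes):
            # Skip combinations where the row limit can never be reached,
            # because buffer_size / row_size caps the rows per buffer first
            # (those runs would duplicate a smaller row setting).
            if max_rows * ESTIMATED_ROW_SIZE > buffer_size:
                continue
            print(f"testing DefaultBufferMaxRows={max_rows}, "
                  f"DefaultBufferSize={buffer_size // MB} MB")
            # elapsed = run_and_time(max_rows, buffer_size)

    Combinations where the row limit could never bind are skipped, since the buffer size would cap the rows per buffer first and the run would behave like a smaller row setting.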