What is the best way to send large data from WCF service to Client

  • Question

  • Hi,

    What is the best way to send large data (500 MB per client) from SQL Server to the client machine via a WCF service with NetTcpBinding and without RM (reliable messaging)?

               

    1. Use a DataSet with buffering. The problem I found with this approach is memory utilization: the per-client buffer becomes very large, so the service cannot handle many clients (OutOfMemoryException).

    2. Can we use a DataReader with streaming/chunking? (An example would help a lot; see the sketch after this list.)

    3. Use a DataSet with streaming.

    4. Use a DataSet with streaming and chunking (how do we achieve chunking over NetTcpBinding?).

    5. Any other solution?
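
    Something like this is what I have in mind for option 2 (a rough sketch only; IReportService, the Reports table, and the connection string are placeholders, not our real names): the service returns a Stream, and with TransferMode.Streamed on the NetTcpBinding, WCF pulls bytes as it sends instead of buffering the whole 500 MB.

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;
    using System.ServiceModel;

    [ServiceContract]
    public interface IReportService
    {
        [OperationContract]
        Stream GetReportData(string reportId);
    }

    public class ReportService : IReportService
    {
        public Stream GetReportData(string reportId)
        {
            // Spool the reader's bytes to a temp file so the connection can be
            // closed quickly; WCF then streams the FileStream to the client.
            SqlConnection conn = new SqlConnection("..."); // connection string elided
            conn.Open();
            try
            {
                SqlCommand cmd = new SqlCommand(
                    "SELECT Payload FROM Reports WHERE Id = @id", conn);
                cmd.Parameters.AddWithValue("@id", reportId);
                string temp = Path.GetTempFileName();
                using (SqlDataReader reader =
                           cmd.ExecuteReader(CommandBehavior.SequentialAccess))
                using (FileStream spool = File.Create(temp))
                {
                    byte[] buf = new byte[64 * 1024];
                    while (reader.Read())
                    {
                        long fieldOffset = 0;
                        long n;
                        // GetBytes reads the varbinary column incrementally,
                        // so no row is ever held in memory in full.
                        while ((n = reader.GetBytes(0, fieldOffset, buf, 0, buf.Length)) > 0)
                        {
                            spool.Write(buf, 0, (int)n);
                            fieldOffset += n;
                        }
                    }
                }
                return File.OpenRead(temp);
            }
            finally
            {
                conn.Close();
            }
        }
    }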

     

    Dileep Agarwal

    Friday, April 21, 2006 6:40 PM

Answers

  • Dileep,

    I cannot give you code samples, but I would suggest thinking about the following:

    1) You could do the first stage of processing on the data while it is still in SQL Server, using C# on the server, and thereby reduce the data that needs to be sent. You clearly do not need 500 MB of data on the client for final reporting values, since that would produce many reams of paper. So look at the processing that currently happens on the client and try to move most of it to the server. Moving 500 MB of data over WCF will take quite a while, even with streaming.

    2) If you cannot move processing off the client for some reason, consider using a cursor into the database and operating over client/server ODBC/OLE DB connections from the client. The client/server SQL systems are very stable and can let you walk through the data with limited buffering required on either end.

    3) If you need random access to the data, and it has to happen on the client side, and it has to happen using WCF, then streaming is probably the most efficient. But even a bulk copy of the data from the server-side SQL Server instance to another instance running on the client machine might be a better way to go; a rough sketch follows.
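
    To illustrate the bulk-copy idea in (3), here is a hedged sketch (the ReportRows table and all other names are made up; it assumes a SQL Server instance such as MSDE/Express on the client with a matching table). SqlBulkCopy pulls rows from a DataReader as it writes, so neither side ever materializes a DataSet:

    using System.Data.SqlClient;

    public static class ReportCopier
    {
        public static void CopyReportRows(string serverConnStr, string clientConnStr)
        {
            using (SqlConnection source = new SqlConnection(serverConnStr))
            using (SqlConnection target = new SqlConnection(clientConnStr))
            {
                source.Open();
                target.Open();
                SqlCommand cmd = new SqlCommand("SELECT * FROM ReportRows", source);
                using (SqlDataReader reader = cmd.ExecuteReader())
                using (SqlBulkCopy bulk = new SqlBulkCopy(target))
                {
                    bulk.DestinationTableName = "ReportRows";
                    bulk.BatchSize = 10000;      // commit in batches to bound memory
                    bulk.WriteToServer(reader);  // streams rows source -> target
                }
            }
        }
    }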

    Friday, April 21, 2006 8:09 PM

All replies

  • Considering that the maximum usable virtual memory allocation for a 32-bit process is 2 GB, 0.5 GB is a very considerable amount of data to send.  The first thing I'm going to say is, "Don't do that".  ;>  If this is a 500 MB dataset, is there no way of altering the query to reduce the size?

    If not, then I would say that streaming would be the best way of doing it, so you don't have to hold such a massive buffer. (A minimal setup is sketched below.)
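
    Roughly, turning streaming on looks like this (a sketch only; it assumes a service and contract like the IReportService example earlier in the thread). With TransferMode.Streamed only a small window is buffered at a time, so raising maxReceivedMessageSize does not mean allocating that much memory per client:

    using System;
    using System.ServiceModel;

    public static class ReportHost
    {
        public static void Main()
        {
            NetTcpBinding binding = new NetTcpBinding(SecurityMode.None);
            binding.TransferMode = TransferMode.Streamed;         // no full-message buffer
            binding.MaxReceivedMessageSize = 1024L * 1024 * 1024; // allow up to 1 GB
            binding.ReceiveTimeout = TimeSpan.FromMinutes(30);    // large transfers are slow

            ServiceHost host = new ServiceHost(typeof(ReportService));
            host.AddServiceEndpoint(typeof(IReportService), binding,
                "net.tcp://localhost:9000/reports");
            host.Open();
            Console.WriteLine("Service running; press Enter to stop.");
            Console.ReadLine();
            host.Close();
        }
    }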

    Thanks,

    Scott

    Friday, April 21, 2006 7:09 PM
  • Hi Scott,

    Thanks for the reply. In our present application we have a requirement to transfer large reporting data to the client's printer (although this takes time).

    But the question is: if I use streaming, then while fetching the data from the database I have to store it on the server in some object or DataSet (before converting it to a memory stream and transferring it to the client), and that object/DataSet will itself occupy a large amount of memory.

    How can I perform this streaming task without storing the data in an object or DataSet?

    Is there any other way to fetch the data from SQL Server and send it to the client?

    Can I use a DataReader to fetch data from SQL Server instead of a DataSet? (I sketch below what I imagine this could look like.)

    Right now I do not have much of an idea about this, and your input (a helpful link or some code) will help me get to a solution.
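
    For illustration, is a custom Stream over a DataReader, roughly like the following, the right direction? (An untested sketch; it assumes the first column is a string, and all names are invented.)

    using System;
    using System.Data;
    using System.IO;
    using System.Text;

    // A read-only Stream that pulls rows from a DataReader on demand, so the
    // full result set is never held on the server.
    public class DataReaderStream : Stream
    {
        private readonly IDataReader _reader;
        private byte[] _row = new byte[0]; // bytes of the row being drained
        private int _pos;

        public DataReaderStream(IDataReader reader) { _reader = reader; }

        public override int Read(byte[] buffer, int offset, int count)
        {
            // Refill from the next row once the current one is exhausted.
            while (_pos == _row.Length)
            {
                if (!_reader.Read()) return 0; // end of result set
                // Assumes column 0 is a string; adapt serialization as needed.
                _row = Encoding.UTF8.GetBytes(_reader.GetString(0) + "\r\n");
                _pos = 0;
            }
            int n = Math.Min(count, _row.Length - _pos);
            Array.Copy(_row, _pos, buffer, offset, n);
            _pos += n;
            return n;
        }

        public override void Close() { _reader.Close(); base.Close(); }

        public override bool CanRead  { get { return true; } }
        public override bool CanSeek  { get { return false; } }
        public override bool CanWrite { get { return false; } }
        public override void Flush() { }
        public override long Length { get { throw new NotSupportedException(); } }
        public override long Position
        {
            get { throw new NotSupportedException(); }
            set { throw new NotSupportedException(); }
        }
        public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
        public override void SetLength(long value) { throw new NotSupportedException(); }
        public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
    }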

    Dileep Agarwal

     

    Friday, April 21, 2006 7:28 PM
  • Hi All,

    Thanks for the replies from all of you; they will help me a lot. We are looking into reducing the data size.

    I have a few questions regarding streaming with chunking and bulk copy:

    1. Can we abort a chunking operation midway, from either the server or the client side?

    2. How do I implement a bulk copy operation (any helpful link)?

    3. Can I consider the following solution? Data is fetched from SQL Server into a file, and with compression, streaming, and chunking we transfer that file to the client side and use it to generate the reports. (A sketch of this idea follows.)

    Or

    Would bulk copy or some other SQL technique be better in this scenario?
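
    To make question 3 concrete, here is the rough shape I am imagining (all names are invented; this is only a sketch): the server spools the query result to a file and compresses it with GZip, and the client pulls fixed-size chunks, so an abort is simply either side stopping.

    using System;
    using System.IO;
    using System.IO.Compression;
    using System.ServiceModel;

    [ServiceContract]
    public interface IChunkedReport
    {
        // Server spools the query result to a file, compresses it with
        // GZipStream, and returns the compressed length.
        [OperationContract]
        long PrepareReport(string reportId);

        // Client pulls one chunk at a time; either side can abort by simply
        // stopping (server returns an empty chunk, client stops calling).
        [OperationContract]
        byte[] ReadChunk(string reportId, long offset, int count);
    }

    public static class ChunkClient
    {
        public static void Download(IChunkedReport proxy, string id, string targetPath)
        {
            long total = proxy.PrepareReport(id);
            using (FileStream gz = File.Create(targetPath + ".gz"))
            {
                long offset = 0;
                while (offset < total)
                {
                    byte[] chunk = proxy.ReadChunk(id, offset, 64 * 1024);
                    if (chunk.Length == 0) break; // server-side abort or EOF
                    gz.Write(chunk, 0, chunk.Length);
                    offset += chunk.Length;
                }
            }

            // Decompress locally before handing the file to the report engine.
            using (FileStream gz = File.OpenRead(targetPath + ".gz"))
            using (GZipStream unzip = new GZipStream(gz, CompressionMode.Decompress))
            using (FileStream plain = File.Create(targetPath))
            {
                byte[] buf = new byte[64 * 1024];
                int n;
                while ((n = unzip.Read(buf, 0, buf.Length)) > 0)
                    plain.Write(buf, 0, n);
            }
        }
    }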

    Thanks,

    Dileep Agarwal

    Monday, April 24, 2006 9:38 PM
  • http://wcf-chuncking.sourceforge.net/
    Here you can find a good solution for sending large data. There are three bindings, based on NetTcpBinding, NetNamedPipeBinding, and an HTTP binding.
    They implement all the features of those bindings but additionally apply chunking.
    Tuesday, February 5, 2008 7:25 AM