How to transfer a DataTable or DataSet using WCF?

    Question

  • How to transfer a DataTable or DataSet using WCF?

    Is it possible to send a DataTable over a TCP channel using WCF?

    For larger amounts of data, what is the best way to send it over the network: XML or a DataTable?

     

    Tuesday, August 28, 2007 12:22 PM

Answers

  •  

    1) See documentation: http://msdn2.microsoft.com/en-us/library/aa347876.aspx

     

    2) Yes, it should be possible. Any type that's serializable by one of WCF's serializers (the default DataContractSerializer or the XmlSerializer) can be sent over WCF. DataSet and DataTable are serializable with both serializers.

     

    Having said that:

    - I seem to remember hearing something about possible bugs in this area... if you run into issues, please open a Connect bug, or contact Product Support if the issue is blocking you.

    - Using DataSets in particular is not recommended, especially if you care at all about interop. Also, using typed DataSets will force you either to use the XmlSerializer on the client (usually meaning lower performance, etc.) or to share the typed DataSet type between client and server (which violates SOA principles and may not always be possible).

    - Consider using the newer .NET 3.5 technologies (LINQ to SQL, LINQ to Entities); they both generate types easily usable in WCF. Also, especially if 3.5 is not an option, consider transferring the data into your own data contract types before sending. This will ensure interoperability and give you much more control over what goes on the wire.
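    As a minimal sketch of that last point (all type, interface, and member names here are invented for illustration, not taken from any real service), a hand-written data contract used in place of a DataSet might look like:

```csharp
using System;
using System.Runtime.Serialization;
using System.ServiceModel;

// Illustrative only: a hand-rolled data contract replacing a DataSet.
[DataContract]
public class AlarmRecord
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public DateTime RaisedAt { get; set; }

    [DataMember]
    public string Message { get; set; }
}

[ServiceContract]
public interface IAlarmService
{
    // Returns plain data contract objects instead of a DataSet, so a
    // non-.NET client just sees an ordinary serializable array.
    [OperationContract]
    AlarmRecord[] GetAlarms(DateTime from, DateTime to);
}
```

    This keeps the wire format under your control: only the annotated members are serialized, in a shape any SOAP client can consume.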

     

    The actual transfer mechanism (HTTP, TCP, named pipes, etc...) doesn't matter. In fact, this is one of the main benefits of using WCF: you can create your contract without worrying about the transport mechanism.

     

    3) It's very hard to give general advice without knowing your exact scenario (typical data size and "chattiness"? transferring changes only, or entire datasets? WCF-to-WCF, or interop?) and without knowing what you want to optimize for (speed? size on the wire? interop usability? coding efficiency?). Take a look at the LINQ technologies as I said before; they seem to provide a nice balance between all these factors. Use the binary XML encoding if going WCF-to-WCF.
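    For reference, netTcpBinding already gives you the binary encoding by default; a sketch of composing it by hand (e.g. to get binary encoding over HTTP, which is WCF-to-WCF only and not interoperable) could look like this:

```csharp
using System.ServiceModel.Channels;

class BindingFactory
{
    // Compose a custom binding by hand: message encoder first,
    // transport element last. This pairs the binary XML encoder
    // with an HTTP transport (WCF-to-WCF only; not interoperable).
    public static Binding CreateBinaryHttpBinding()
    {
        return new CustomBinding(
            new BinaryMessageEncodingBindingElement(),
            new HttpTransportBindingElement());
    }
}
```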
    Tuesday, August 28, 2007 9:15 PM
  • 1) http://www.codeplex.com/WCFvsRemoting

    2) I don't think so; most of the custom serialization support is at the DataSet level.

    3) Neither, and definitely not DataSets as data contracts; these are very inefficient for distributed computing, as they are heavy objects which you are going to destroy and recreate. Create a data contract for the message and a data contract for your data objects, use a binary serialization method, and
    let WCF take care of how the message looks on the wire. Note that in some cases packing the message can be useful, e.g. flag fields. In a lot of cases it's best to convert from business domain objects (optimized for OO) to messages (optimized for how they are sent on the wire, e.g. big, infrequent messages) and convert them back again. That way you don't have message/serialization logic corrupting your business layer, and vice versa.
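    A rough sketch of that separation (every name here is invented for illustration): one contract for the data objects, one for the message that carries them, with WCF still deciding the wire representation.

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

// Data contract: the payload objects themselves.
[DataContract]
public class CustomerData
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public string Name { get; set; }
}

// Message contract: the shape of the message. A header can carry
// out-of-band information such as a packing flag, while WCF still
// controls how the whole message is encoded on the wire.
[MessageContract]
public class CustomerBatchMessage
{
    [MessageHeader]
    public bool Packed { get; set; }

    [MessageBodyMember]
    public CustomerData[] Customers { get; set; }
}
```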

     

     

    Regards,

     

    Ben

    Tuesday, August 28, 2007 2:35 PM

All replies

  •  suyog.dabhole wrote:

    How to transfer a DataTable or DataSet using WCF?

    Is it possible to send a DataTable over a TCP channel using WCF?

    For larger amounts of data, what is the best way to send it over the network: XML or a DataTable?

    Kindly provide an answer as soon as possible.

    Tuesday, August 28, 2007 12:30 PM
  • Hi all,

    I have built a WCF service that returns a DataSet to a client application. I understand the fact that interop is somewhat broken by using DataSets...

    One part of my service is defined as follows:

     

    [OperationContract(Action = "http://Maillefer.Nomos.Types/IAlarmHistory/ReadHistory", ReplyAction = "http://Maillefer.Nomos.Types/IAlarmHistory/ReadHistoryResponse")]

    System.Data.DataSet ReadHistory(DateTime from, DateTime To, string lan);

     

    This service calls a data layer stored procedure which returns the requested data. Then, as you can see above, the service returns a DataSet, which will be bound to a DataGridView as a data source.

    According to what you say, DataSets should be avoided; I know that they are heavy. So should I change my service to return an XmlDocument instead?

    What do you mean in your post by defining a data contract and a message contract?

     

    Thanks for the clarification.

    Regards,

    Serge

    Tuesday, October 30, 2007 10:50 AM
  • There is a high-performance alternative.
    It transfers everything bidirectionally (remoting/sync/async, PC and Pocket PC, Windows, Linux, ...).
     
    Wednesday, November 07, 2007 4:23 AM
  •  

    Hi Serge,

     

    Sending an XML document will not help, as it's the creating of the DataSet, serializing it to XML, and destroying it that makes it inefficient. DataSets were designed to be loaded from a DB and kept around, but in n-tier scenarios we destroy them after sending; this is very inefficient and contrary to what they were designed for. With schema changes disabled it's not too bad on the wire. Other remoting products don't help with this.

    That said, if there are few users and performance is not critical, go for it. Reports are often better handled like this. It's just not a good case for greenfield solutions.

     

     

    Regards,

     

    Ben

    Sunday, November 11, 2007 7:03 AM
  • What else could it be, Ben? I would rather do it right now instead of facing trouble later.

    So anyway, my data layer will store the result of the stored procedure in a DataSet, but if I am not returning the DataSet on the wire, what else could it be? An array?

    Will an array really be more powerful than a DataSet, if you consider that you need to build the array correctly?

    In my case, the client application should easily be able to recover the set of rows and the column names if it wants to process the returned data correctly.

    So could my array simply contain a DataRowCollection and a DataColumnCollection if I need to have it processed by anything, or are those collections .NET-specific?

     

    Thanks for the comments,

    Serge

    Sunday, November 11, 2007 2:35 PM
  •  

    Hi Serge,  

    "

    What else could it be, Ben? I would rather do it right now instead of facing trouble later.

    So anyway, my data layer will store the result of the stored procedure in a DataSet, but if I am not returning the DataSet on the wire, what else could it be? An array?"

     

    Arrays or List<T> (which gets automatically converted to an array and back to a list at the client) are good.
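    As a sketch of that (the service and type names here are invented, loosely echoing the ReadHistory example earlier in the thread):

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

// Invented data contract standing in for one "row" of the result.
[DataContract]
public class AlarmEntry
{
    [DataMember]
    public DateTime When { get; set; }

    [DataMember]
    public string Text { get; set; }
}

[ServiceContract]
public interface IAlarmHistory
{
    // On the wire this is just an array of AlarmEntry; a generated
    // client proxy can expose it as AlarmEntry[] or List<AlarmEntry>
    // depending on the collection type chosen at proxy generation.
    [OperationContract]
    List<AlarmEntry> ReadHistory(DateTime from, DateTime to);
}
```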

     

     

    "Does array will be really more powerfull that dataset if you consider that you need to build the array correctly ?"

     

    Arrays of objects are not as powerful (i.e. functional) as DataSets, but you are only sending data to the client; the client is the one acting on it. These days you can bind arrays to grids, etc. The thing is, they are lightweight; they do not have the overhead.

     

    "For my case client application should easily be able to recover a set of rows and columnd name if they want to process retrun data correctly.

    So does my array could easily contains a DataRowCollection and DataColumnCollection if I need to get it process by anything or those collection are .NEt specific ?"

     

    This is the real gotcha: you can't send DataRows. If you don't use DataSets, you need your own objects. This means you need to do your data access yourself (or use an O/R mapper like Wilson OR Mapper or NHibernate), though with 3.5 you can use DLINQ. If you have a larger system I'd even recommend the DTO pattern (Martin Fowler's interpretation) for the actual data you send to the clients, i.e. do not send the data objects (or your business objects) you get from the DB; rearrange and flatten them so you can get more data into a call, and so each client-server call matches a use case. This avoids round trips.
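    A hypothetical illustration of that flattening (every type and member below is invented): instead of sending the entities as loaded from the DB, copy just what the use case needs into one DTO so a single call carries everything.

```csharp
using System.Runtime.Serialization;

// Invented entity types, standing in for what the data layer loads.
public class Customer { public string Name; }
public class Order { public int Id; public Customer Customer; public decimal Total; }

// The flattened DTO actually sent to the client: one object per use
// case, so one call returns everything and round trips are avoided.
[DataContract]
public class OrderSummaryDto
{
    [DataMember]
    public int OrderId { get; set; }

    [DataMember]
    public string CustomerName { get; set; }   // flattened from Order.Customer

    [DataMember]
    public decimal Total { get; set; }

    // Server-side mapping from the entity graph to the flat DTO.
    public static OrderSummaryDto From(Order o)
    {
        return new OrderSummaryDto
        {
            OrderId = o.Id,
            CustomerName = o.Customer.Name,
            Total = o.Total
        };
    }
}
```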

     

    For existing code (or new 2-tier code) I'm happy to use DataSets, as they are quick and easy to develop, but you are paying for that with a slower system and more difficult maintenance (try using an int key and sending 3 or 4 related table updates/inserts to the mid tier). I use DataSets for most smaller 2-tier apps, as the maintenance cost is not important; however, for 3+ tiers I don't use them any more. (I don't like stored procs either, except for reports; they're just another layer to manage for little gain, and the extra work is better spent on a messaging layer. They can be OK for mid-sized 2-tier systems to create a pseudo business tier, but not for n-tier CRUD.)

     

    Regards,

     

    Ben

    Monday, November 12, 2007 3:07 AM
  • Hmmm, thanks Ben,

    I have something interesting that I would like to extend here...

    You said:

    "(I don't like stored procs either, except for reports; they're just another layer to manage for little gain, and the extra work is better spent on a messaging layer. They can be OK for mid-sized 2-tier systems to create a pseudo business tier, but not for n-tier CRUD.)"

     

    For our development I requested the help of a Microsoft architect for our n-tier application, and he went straight for a data layer with stored procs...

    So I guess everyone has their own idea about this... in a similar way to colors: everyone will have their own opinion of a colorful painting.

    For me, I like those stored procs because they are really easy to manage and easy to maintain...

    Why are you so against that?

     

    Serge

     

    Monday, November 12, 2007 7:44 PM
  • Can the guy spamming the ads cut it out? You can use Remoting if you want... Performance has very little to do with message size and wire speed; a lot of the cost is serialization, security, and interop requirements with third-party systems, e.g. SOAP or XML. And with binary serialization, every time one client wanted an upgrade, every app needed a new copy of the DLL. It was a nightmare; we (and J2EE) have been there... 7 years ago.

     

    On Stored Procs...

     

     

    I was hot on stored procs 10 years ago, and in larger 2-tier development they create a pseudo middle tier, e.g. you can generate sequences efficiently, allowing you to do some things better; and dynamic queries were much slower then. In those days I went to a site which was a disaster because everything was dynamic and some basic queries were taking forever in production.

     

    These days, with n-tier apps and precompiled queries, that is no longer relevant. Times change; some people don't, and apply a tried-and-true model to newer technology. That said (I hope I didn't offend anyone), some large organisations have a solid DBA team that maintains and manages the stored procs, i.e. not the devs; in that case the value they provide for large databases in terms of custom indexes, locking, and tuning is high.

     

    I'm not totally against stored procs, just against them in n-tier apps for CRUD and non-report functionality. Why?

    - Higher SQL skill required (well, for non-CRUD stored procs that need error levels, etc.)

    - Hard to debug, except for CRUD

    - For CRUD (where the query is the same) they add no value: you can change the query in code (or, in my case, in a resource file) and just hit F5. With a stored proc you have to change the stored proc parameters and change the code that calls it. And if the argument is that it's easy to change logic in the stored proc: the logic shouldn't be there; that's the role of code, not the DB.

    - Another layer to deploy... This is the clincher: you already have a middle tier.

    - Stored procs eventually contain logic, but that is not the role of the data layer (e.g. which items are fetched and how). Then, to find a bug or understand some code, I have to go to the stored proc instead of just doing a find-reference and having it in front of me.

    - A lot of people dislike stored procs because they make you DB-dependent, but I don't care about this.

     

    Basically, what value do they give? Again, for smaller (2-tier) apps, fine; but when you get larger apps with 300+ stored procs, versioning them (the new app uses the old proc, an older app with a patch uses the new one) can be a nightmare.

     

    It's horses for courses, and many people apply a catch-all solution... and we all make mistakes. I know I have made many (including DataSets + stored procs in a high-transaction middle-tier remoting scenario in 2001).

     

    Regards,

     

    Ben

     

    Tuesday, November 20, 2007 9:37 AM
  • Thanks for your comment, Ben.

    I think it also depends a lot on the people who are going to use your application. If it has been clearly defined with your end user that queries cannot be modified except by recompiling part of the code, then it's OK.

    In my case, for instance, I distribute a supervision application worldwide, where the end users have only very basic knowledge of computers, or of the data they collect. It means that the data retrieved from the data store could change every 2 weeks if they have no clear idea... or a new request occurs... it can be any event that requires changing the data retrieved.

    Using stored procs in that way offers me more agility in the places affected by the changes; I made it so that only the stored proc changes and all is done, without recompiling my business logic layer...

    With this in place I have built an n-tier architecture based on SOA... And I would say that it is open enough, in that way, for crazy customer demands.

     

    Thanks,

    Serge

     

    Tuesday, November 20, 2007 8:22 PM
  • Importing data is fine; an SQL skillset is easier to get. There are many reasons for using stored procs, though a lot of people use the wrong ones, e.g. skillset, rather than performance or maintainability (you can put SQL in resource files).

     

    SOA is really good for these different demands.

     

    Regards,

     

    Ben

    Wednesday, November 21, 2007 2:46 AM
  •  

    Hi,

     

      I am declaring a DataTable and one generic list as data contracts in my application, but I am facing a problem while serializing the objects, as the XmlSerializer does not support generic lists. My client app takes the XmlSerializer as the default serializer, so I am not able to get the generic list in the proxy class. Please reply. 

     

    Regards,

    Amit

    Monday, July 28, 2008 7:05 AM
  • We can send/receive a DataTable from a WCF service, but remember the following points:

    1. Each table that is being passed/returned must have a table name, i.e. the table.TableName property must not be blank.

    2. Every table that is being passed/returned from a WCF service must have a schema.
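    A minimal sketch of both points before returning the table (the table and column names are illustrative):

```csharp
using System.Data;

class HistoryBuilder
{
    public static DataTable BuildAlarmTable()
    {
        // Point 1: a non-blank TableName, so the table can be serialized.
        DataTable table = new DataTable("AlarmHistory");

        // Point 2: an explicit schema (typed columns), so the receiving
        // side can reconstruct the table's structure.
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Message", typeof(string));

        table.Rows.Add(1, "Overheat");
        return table;
    }
}
```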

    Thanks

    Aditya Pathak

    • Proposed as answer by mahdi87_gh Tuesday, June 07, 2011 7:27 AM
    Monday, December 27, 2010 8:38 AM