Can you speed up the below function

  • Question

  • Hello there,

    the function below is called thousands of times per minute on our online production servers, and each serialized DataSet is around 300 KB.

    Is there any way to make it faster?

    We are running this code in a web service (.NET 4.5) that transforms a DataSet into an XML response consumed by third parties.

    Thank you in advance.

      Public Shared Function GetXML(DS As DataSet) As XmlElement
          ' Serialize the DataSet to a string, then re-parse it into an XmlDocument.
          Dim X As New XmlDocument, S As New IO.StringWriter
          DS.WriteXml(S)
          X.LoadXml(S.ToString)
          Return X.DocumentElement
      End Function

    Thursday, August 16, 2018 5:20 PM

All replies

  • Are you constrained to a DataSet, or is a DataSet being used because it was a simple solution?

    Please remember to mark the replies as answers if they help, and unmark them if they provide no help; this will help others who are looking for solutions to the same or a similar problem. Contact via my Twitter (Karen Payne) or Facebook (Karen Payne) via my MSDN profile, but I will not answer coding questions on either.
    VB Forums - moderator

    Thursday, August 16, 2018 5:52 PM
    Moderator
  • Try the following function instead:

    Public Shared Function GetXML2(ds As DataSet) As XmlElement
       Dim x As New XmlDocument
       ' Write directly into the document through an XPathNavigator,
       ' avoiding the intermediate string and the LoadXml re-parse.
       Using w = x.CreateNavigator().AppendChild()
          ds.WriteXml(w, XmlWriteMode.IgnoreSchema)
       End Using
       Return x.DocumentElement
    End Function
    

    Thursday, August 16, 2018 6:21 PM
  • we are running the below code in Web Service (.net V4.5) that transforms data set into XML Response to be consumed by third parties.

    Really? High-performance web solutions and SOA solutions don't use DataSet or DataTable. You don't see a WCF service sending and receiving XML-serialized objects as data contracts using DataSets, and you don't see RESTful solutions like OData or ASP.NET Web API sending and receiving data using a DataSet.


    Friday, August 17, 2018 1:28 AM
  • I am constrained, since the data source comes from a legacy back-end developed with .NET 2.0 that is too huge to convert. So I have a strict requirement, and since we are handling thousands of requests per minute, every piece of optimization at our end is useful.
    Monday, August 20, 2018 8:42 AM
  • I am constrained, since the data source comes from a legacy back-end developed with .NET 2.0 that is too huge to convert. So I have a strict requirement, and since we are handling thousands of requests per minute, every piece of optimization at our end is useful.

    The DataSet was a bad choice, and you are stuck with it until the solution is converted to technology better suited to speed, such as the DTO pattern: a collection of lightweight objects serialized to XML or JSON.

    You even have System.IO involved, and you want speed? It's not happening.

    https://www.codeguru.com/vb/gen/vb_misc/oop/article.php/c7063/Data-Transfer-Object-Pattern-Goes-VBNET.htm

    https://www.codeproject.com/Articles/1050468/Data-Transfer-Object-Design-Pattern-in-Csharp
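    As a rough sketch of the DTO approach in VB.NET (the `OrderDto` type and its property names are illustrative assumptions, not taken from the original solution):

    ```vb
    Imports System.Collections.Generic
    Imports System.IO
    Imports System.Xml.Serialization

    ' Hypothetical DTO standing in for one row of the legacy data; names are illustrative.
    Public Class OrderDto
        Public Property Id As Integer
        Public Property Customer As String
    End Class

    Public Module DtoSerializer
        ' Serialize a list of DTOs straight to XML, with no DataSet overhead.
        Public Function ToXml(orders As List(Of OrderDto)) As String
            Dim ser As New XmlSerializer(GetType(List(Of OrderDto)))
            Using sw As New StringWriter()
                ser.Serialize(sw, orders)
                Return sw.ToString()
            End Using
        End Using
        End Function
    End Module
    ```

    The XmlSerializer instance for a given type can be created once and reused, which matters at thousands of calls per minute.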

    Monday, August 20, 2018 10:07 AM
  • Just something else to consider...

    Does the DataSet change for every web service call? If many calls to the service receive the same copy of the data, consider caching the XmlDocument(s) after creating them, so that subsequent calls can simply return the cached copies.
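    A minimal sketch of that caching idea, assuming some key (for example a version stamp from the back-end, which is a hypothetical detail here) identifies the current data:

    ```vb
    Imports System.Collections.Concurrent
    Imports System.Data
    Imports System.Xml

    Public Class XmlCache
        ' Cache keyed by an assumed version stamp identifying the current data.
        Private Shared ReadOnly Cache As New ConcurrentDictionary(Of String, XmlElement)

        Public Shared Function GetXmlCached(key As String, ds As DataSet) As XmlElement
            ' Build the XML only on a cache miss; later calls reuse the stored element.
            Return Cache.GetOrAdd(key,
                Function(k)
                    Dim x As New XmlDocument
                    Using w = x.CreateNavigator().AppendChild()
                        ds.WriteXml(w, XmlWriteMode.IgnoreSchema)
                    End Using
                    Return x.DocumentElement
                End Function)
        End Function
    End Class
    ```

    You would also need an eviction policy (or a bounded cache) if the key space grows, otherwise the cache holds every 300 KB document forever.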


    Reed Kimble - "When you do things right, people won't be sure you've done anything at all"

    Monday, August 20, 2018 10:41 AM
    Moderator