need some direction

  • Question

  • Hi All,

    I'm overseeing a project that uses ASP.NET up front for data capture, with a VB.NET client where the data is manually manipulated and then returned to SQL Server and back to the ASP.NET client.

    Have any of you heard of an instance where IIS caching was used to relieve performance problems? Is this a common occurrence? Isn't it a sign that the application itself is poorly designed?

    Background to my question: the development team has developed one major object, about 2MB in size, which is sent back and forth between the client (VB.NET) and the server (SQL Server), and this has created performance problems in the application.

    To cure the problem, the development team has recommended installing IIS, whereby they would analyze the gigantic object and cache the data from it that doesn't need to be sent back and forth, thereby minimizing the object itself.

    This does not actually address the underlying problem (that the application is poorly designed), but is it a good way of getting rid of the symptom? Or should we be looking at the main problem, basically the application itself?

    Any ideas are much appreciated.

    Christopher

    Monday, January 9, 2006 5:56 PM

Answers

  • Hi Christopher,

    In my opinion, a single 2MB object that travels back and forth frequently hardly seems like an acceptable approach.

    I would recommend a deep examination of the design and major "refactoring" to remedy the situation, rather than trying to plug the hole in the dam with a band-aid, so to speak.

    Arnon

    P.S.

    You said: "To cure the problem, the development team has recommended installing IIS, whereby they would analyze the gigantic object and cache the data from it that doesn't need to be sent back and forth, thereby minimizing the object itself."

    While I know there's an esoteric way to run ASP.NET on Apache (see http://weblogs.asp.net/israelio/archive/2005/09/11/424852.aspx), don't you already have IIS installed?

    Monday, January 9, 2006 7:14 PM
  • Hi,

    I guess there is some fundamental problem with the design if you are required to pass objects of 2MB in size. Even if you plug the hole now, it is certain to resurface later. Usually, when you have this much data to pass, a reference (an identifier or pointer) is passed instead, and the data is retrieved as and when required. Another approach is to load the data incrementally from the repository and show the results as and when required. Either way, I guess this needs refactoring to fix.
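    The suggestion above, passing an identifier rather than the full object and fetching rows on demand, can be sketched as follows. This is a minimal Python illustration; the class and method names are invented for the example, and a real implementation would sit over the SQL Server connection:

```python
class OrderRepository:
    """Stands in for the SQL Server back end: holds the full data set."""
    def __init__(self, rows):
        self._rows = rows  # e.g. the ~2MB of data, keyed by row id

    def fetch(self, row_id):
        # Only the requested row crosses the client/server boundary.
        return self._rows[row_id]


class LazyHandle:
    """Client-side stand-in: carries only row ids, loads rows on demand."""
    def __init__(self, repo, row_ids):
        self._repo = repo
        self.row_ids = row_ids
        self._cache = {}

    def row(self, row_id):
        if row_id not in self._cache:  # fetch each row at most once
            self._cache[row_id] = self._repo.fetch(row_id)
        return self._cache[row_id]


repo = OrderRepository({1: "alpha", 2: "beta"})
handle = LazyHandle(repo, [1, 2])
print(handle.row(1))  # only row 1 was transferred, not the whole set
```

    The point is that the object handed to the client is tiny (just ids), and data moves only when a row is actually needed.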

    regards

    Projyal

     

    Tuesday, January 10, 2006 12:20 PM

All replies

  • It would help to know what object this is. Is it a custom object developed by your team, or a .NET object like a DataSet?

    In either case, the design should respect logical and physical component boundaries when transferring data.

    If it is a .NET DataSet (which I would not recommend anyway), then a simple fix would be to get only the changes (using the GetChanges method) and pass only the changed data back and forth.
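    The delta idea behind `DataSet.GetChanges` can be shown in a language-neutral Python sketch. The function name and row layout here are illustrative, not a real API; in .NET you would call `GetChanges()` on the DataSet itself:

```python
def get_changes(original, current):
    """Return only the rows that were added or modified, keyed by row id.

    Mimics the spirit of DataSet.GetChanges(): the caller sends this
    small delta across the wire instead of the full data set.
    """
    delta = {}
    for row_id, row in current.items():
        if row_id not in original or original[row_id] != row:
            delta[row_id] = row
    return delta


original = {1: "alpha", 2: "beta", 3: "gamma"}
current = {1: "alpha", 2: "BETA", 3: "gamma", 4: "delta"}

print(get_changes(original, current))  # {2: 'BETA', 4: 'delta'}
```

    If only a few rows change per round trip, the payload shrinks from the whole 2MB to just those rows.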

    If it is a custom (serializable) object, then make sure that only data that absolutely must cross application boundaries does so. If part of the data is common between different user sessions, split that out and put it in a shared cache. If part of the data is constant (does not change), split that out and cache it. Think about using a database cache if your web server needs to support a large number of users and the size of the cached data is large.
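    The splitting idea, keeping constant or session-shared data in a cache and transferring only the per-session part, could be sketched like this (Python, invented names; in practice this would be ASP.NET's Cache object or a database-backed cache):

```python
# Shared cache for data that is constant or common across user sessions.
shared_cache = {}


def cache_constant(key, value):
    """Register a constant piece of data once; never ship it again."""
    shared_cache.setdefault(key, value)


def build_payload(session_data):
    """Only per-session data travels; constant parts travel as keys."""
    return {"session": session_data,
            "constant_keys": list(shared_cache)}


def resolve(payload):
    """Receiver reassembles the full object from payload + cache."""
    full = dict(payload["session"])
    for key in payload["constant_keys"]:
        full[key] = shared_cache[key]
    return full


cache_constant("country_codes", ["SE", "DK", "NO"])
payload = build_payload({"order_id": 42})
print(resolve(payload))
```

    The payload carries only the session data plus short cache keys, while the bulky constant data stays on the server.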

    Hope this helps.

    Regards

    Anand

     

    Tuesday, January 10, 2006 6:32 AM
  • Hi,

    2MB is far too large to move back and forth in the application.

    But more details on the requirements that led to this design would help in suggesting alternatives.

    What I can presume from what you have explained is: your ASP.NET UI captures user-entered data, which is sent to the VB.NET application (middle tier) that manipulates it; the data is then sent to SQL Server, and SQL Server sends it back to the web page.

    What I fail to understand is why you can't send that data to the UI from your middle tier itself. Why do you need to send it from SQL Server?

    If you are also facing problems sending data from your middle tier to the UI, then RPC-style calls would probably help to some extent by avoiding a complete page load.
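    The RPC idea above, where the page asks the server for just the result it needs instead of reloading everything, can be sketched minimally (Python, hypothetical handler and data; in 2006-era ASP.NET this would typically be a script callback or web-service call):

```python
import json

# Hypothetical store of results produced by the middle tier.
MANIPULATED_RESULTS = {"order_42": {"status": "processed", "total": 99.5}}


def rpc_get_result(order_key):
    """Return only the small result record as JSON, not the whole page
    or the whole 2MB object."""
    return json.dumps(MANIPULATED_RESULTS[order_key])


print(rpc_get_result("order_42"))
```

    The browser then patches the returned record into the page rather than round-tripping the full object.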

    I hope I have understood your problem correctly, and I hope this helps :)

    Regards,

    Anjana

     

     

     

    Friday, January 20, 2006 5:09 PM