Many asynchronous calls vs single call to the API

  • Question

  • User-425894179 posted

    We are developing a REST API which, among other clients, is going to be consumed by an HTML5 frontend via JavaScript. The application is for use within the organization and usually has about 300 users, but we want it to scale well up to 1000 users or so.

    Normally, connections to the API will be made within the LAN, so connection quality and latency will be good, although occasional use over the Internet via 3G/4G, where connections could be slower and have more lag, is not excluded.

    The two options we have considered are:

    1. The frontend will make several simultaneous asynchronous calls to the API to load the various components of the interface.

      • Pros: Simplicity.
      • Cons: More connections to the server.
    2. The frontend controller will make a single call to the API, passing as parameters which objects need to be fetched.

      • Pros: Only one connection to the server, although the server will make several connections to the database.
      • Cons: Requires mechanisms in both frontend and API. It complicates the design.

    Further explanation: there will be different resources, e.g. .../Product, .../Locations, etc. These resources can be fetched on their own, but there will also be an abstract resource .../screen?Product&Locations that fetches both in one call.
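
    To make the two options concrete, here is a minimal sketch of what each could look like from the JavaScript/TypeScript side, using the resource names above; the base URL, response shapes and function names are illustrative assumptions only.

    ```typescript
    // Option 1: several independent asynchronous calls, one per resource.
    async function loadScreenWithSeparateCalls(baseUrl: string) {
      const [products, locations] = await Promise.all([
        fetch(`${baseUrl}/Product`).then(r => r.json()),
        fetch(`${baseUrl}/Locations`).then(r => r.json()),
      ]);
      return { products, locations };
    }

    // Option 2: a single call to the abstract resource that names the
    // objects to be fetched, e.g. /screen?Product&Locations.
    async function loadScreenWithSingleCall(baseUrl: string) {
      const response = await fetch(`${baseUrl}/screen?Product&Locations`);
      return response.json(); // assumed to contain both products and locations
    }
    ```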

    What do you think is the best approach?

    Thanks in advance.

    Monday, December 28, 2015 8:47 AM

Answers

  • User-1946294156 posted

    One thing you will need to take into consideration is the size of the data you plan on sending. If it gets large, you might see latency when the page loads.

    I always say that simpler is better. However, you will need to make sure your servers can handle the request load. Most standard servers should be able to handle it (remember, not everyone will be hitting the server at the same time). This is easy to verify with a simple test: set up a client that load-tests the server.
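
    A minimal sketch of such a load-testing client could look like the following; the endpoint URL and the request count are placeholder assumptions, and a dedicated load-testing tool would normally be used instead.

    ```typescript
    // Fire `concurrency` simultaneous requests at one endpoint and report timings.
    async function simpleLoadTest(url: string, concurrency: number): Promise<void> {
      const timings = await Promise.all(
        Array.from({ length: concurrency }, async () => {
          const start = Date.now();
          const response = await fetch(url);
          await response.text(); // drain the body so the timing covers the full transfer
          return Date.now() - start;
        }),
      );
      const average = timings.reduce((sum, t) => sum + t, 0) / timings.length;
      console.log(
        `${concurrency} concurrent requests: average ${average.toFixed(0)} ms, max ${Math.max(...timings)} ms`,
      );
    }

    // For example, simulate 25 users requesting the Product resource at once.
    simpleLoadTest("https://api.example.local/Product", 25).catch(console.error);
    ```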

    • Marked as answer by Anonymous Thursday, October 7, 2021 12:00 AM
    Monday, December 28, 2015 12:44 PM
  • User585649674 posted

    Go for the first option.

    1) The services are atomic, so they can be reused and the code stays maintainable.

    2) Most browsers can make around five or six requests to the same host in parallel, and with JavaScript you can update individual areas of your page as each response arrives (see the sketch after this list).

    3) For example: with the second option, if one request takes 3 seconds to process on the server side and the concurrent request limit is 25, the 26th user has to wait 3 seconds before their request even starts and only receives a response after 6 seconds. With the first option, all 26 users get something back within the first second, and by the fourth second all 26 would have the page loaded completely. From the user's perspective the site appears to perform better. If you have used Facebook, imagine it fetching all your friends, then your friends' friends and their posts, then the posts where your friends are tagged and the posts your friends commented on, all in a single request. Imagine how long that page would take to load, considering there are billions of profiles.
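
    A rough sketch of that per-area loading, assuming hypothetical element ids and endpoint paths, could look like this:

    ```typescript
    // Each area of the page is requested and rendered independently, so a slow
    // resource does not block the others. Element ids, endpoint paths and the
    // plain-text rendering are illustrative assumptions only.
    function renderSection(elementId: string, url: string): Promise<void> {
      return fetch(url)
        .then(response => response.json())
        .then(data => {
          const target = document.getElementById(elementId);
          if (target) {
            target.textContent = JSON.stringify(data);
          }
        })
        .catch(error => console.error(`Failed to load ${url}`, error));
    }

    // Each call resolves on its own schedule; the user sees content as it arrives.
    renderSection("product-panel", "/api/Product");
    renderSection("locations-panel", "/api/Locations");
    ```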

    • Marked as answer by Anonymous Thursday, October 7, 2021 12:00 AM
    Tuesday, December 29, 2015 4:37 AM
