Improve Performance of Access Web App on SharePoint 2013 On-Premises Site

  • General discussion

  • Hi All,

    I have a SharePoint 2013 on-premises site on which we have an Access web app. The app has around 10 tables, with approximately 5 to 10 columns in each table and approximately 1,000 to 4,000 entries in each table. However, even with such a small amount of data, the performance is very bad.

    Note: We have On Insert as well as On Update data macros running on almost all of the tables.

    Is there a way we can improve the performance of the Access web app?

    Thanks in Advance, 


    Monali

    Thursday, April 21, 2016 12:46 PM

All replies

  • Hi Monali,

    >>However with such small amount of data also the performance is very bad.

    Could you tell us how bad the performance of your Access web app is? How long does an action take to finish? I suggest you measure how long each action takes, and share the related macro logic with us.
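    If this is an on-premises farm and you can reach the SQL Server instance that hosts the app database, one rough way to see where the time goes is to look at the slowest statements the server has recorded. A minimal sketch, assuming you have permission to query the dynamic management views (nothing below is specific to your app):

        -- Show the 20 statements with the highest total elapsed time.
        -- Times in sys.dm_exec_query_stats are in microseconds, so divide
        -- by 1000 to get milliseconds.
        SELECT TOP (20)
               qs.execution_count,
               qs.total_elapsed_time / 1000 AS total_elapsed_ms,
               (qs.total_elapsed_time / qs.execution_count) / 1000 AS avg_elapsed_ms,
               SUBSTRING(st.text, 1, 200) AS statement_start
        FROM   sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER BY qs.total_elapsed_time DESC;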

    Best Regards,

    Eddward




    Friday, April 22, 2016 1:24 AM
  • Hi Eddward,

    Thanks for replying. With so many macros it is difficult to share the code, but I can describe the architecture of the system:

    We have a SharePoint list that is connected to the Access app. A particular column from this SharePoint list is used in all the tables in the Access app as a lookup field. We have a one-to-many relationship between this SharePoint list and all of the Access tables.

    The List view page (the home page) has a subview and a Related Items control that show the relevant data on the page. It takes around 12 seconds to switch from one item to another on the list view of the home page.

    Do I need to do some Database changes to improve performance?

    Note: There is no macro on page load.

    Thanks & Regards,


    Monali

    Friday, April 22, 2016 5:59 AM
  • If the macros fire when you update one record, then performance should be ok.

    However, if you are updating many records and a macro fires for each row, then things get rather slow.

    You also mention you are using 2013, but perhaps this is a 2010 app. In that case the tables are SharePoint lists, and as they grow beyond 5,000 rows they tend to slow down. Make sure you have indexes on any filter or criteria columns you use in those macros.

    I pushed on-premises SharePoint tables to 80,000 rows, but was VERY careful in doing updates.

    If you are already using a 2013 web application, then the tables are already in SQL Server, so performance should be much better. You could even, in theory, do direct SQL updates on those rows (but not from the web application); see the sketch below.
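    For example, on an on-premises farm you could in theory connect to the app's database with Management Studio and run a set-based update directly. This is a minimal sketch only (the table and column names below are made up, and I would only do this with a backup in hand):

        -- Hypothetical direct, set-based update run against the app's
        -- SQL Server database, not from the web application itself.
        UPDATE dbo.Orders
        SET    Status = 'Closed'
        WHERE  OrderDate < '2016-01-01';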

    Regards,

    Albert D. Kallal (Access MVP)

    Edmonton, Alberta Canada

    Friday, April 22, 2016 8:03 PM
  • Can't you just download it from SharePoint, run your processes on your Windows machine, and re-load it back to SharePoint?  Anything close to the metal will run much faster than any kind of web-based setup.


    Saturday, April 23, 2016 6:57 PM
    If we are talking about multiple related tables, then such a download, and especially the upload, takes time.

    If they are SharePoint tables, they are in fact replicated to the desktop. So even if you could run a query 100% server side, when you connect with the desktop all the changed rows have to sync down.

    And if you run a query on the local tables, then a sync back up to the server has to occur (so you don't eliminate the transfer issue).

    However, in this case it is not clear whether this is a 2013 web app (SQL tables) or a 2010 web application (SharePoint lists + sync).

    So we don't know what kind of tables we are dealing with. However, if they are SharePoint lists, then uploading even 4,000 rows takes significant time.

    Regards,

    Albert D. Kallal (Access MVP)

    Edmonton, Alberta Canada

    Saturday, April 23, 2016 8:18 PM
  • Hello Albert,

    Thank you for your reply. This is a 2013 web app (SQL tables) and not a 2010 web application. Macros generally run when a record is inserted or updated.

    Is there anything I can do to boost the performance of the app?

    Thanks & Regards,

    Monali


    Monday, April 25, 2016 1:14 PM
    OK, so we determined that you ARE running SQL Server tables behind this application.

    The only area that I would check, and double-check, is indexing.

    Make sure that any column used in a For Each data macro filter or lookup criteria has an index.
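    Indexes for a web app are normally managed from the Access design surface, but in the on-premises case, where you can see the app database in SQL Server, you can also check that the expected indexes exist. A minimal sketch of such an index in T-SQL (the table and column names are made up):

        -- Hypothetical index on a column that a data macro filters on.
        CREATE NONCLUSTERED INDEX IX_Orders_CategoryID
            ON dbo.Orders (CategoryID);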

    As noted, such row counts should be easily handled by SQL Server; the only unknown is how much work/updating the table triggers do.

    The other issue is that a database design which has to update lots of rows is likely less than ideal (especially since we cannot execute an update query, but have to use data macros).

    So, say, in place of grabbing all rows in a product category and saving a new category total, you ONLY update the final category total (by adding or subtracting the amount for each row update). I suspect you are already doing this, but such minor design changes can produce dramatic increases in performance.

    (So you add or subtract ONE value in place of totaling up a whole bunch of rows.) This means you are storing total values, but the engine-level triggers (data macros) keep those values updated at all times; the sketch below shows the idea.
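    Data macros are built in the Access designer, so there is no plain-text code to paste here, but the same idea expressed as a SQL Server trigger looks roughly like the sketch below (the table and column names are made up). The point is that each insert adjusts the stored total instead of re-summing every row in the category:

        -- Hypothetical running-total maintenance: when order lines are
        -- inserted, add their amounts to the stored category total
        -- instead of re-totaling the whole category.
        CREATE TRIGGER trg_OrderLines_Insert
        ON dbo.OrderLines
        AFTER INSERT
        AS
        BEGIN
            SET NOCOUNT ON;

            UPDATE ct
            SET    ct.TotalAmount = ct.TotalAmount + i.AmountSum
            FROM   dbo.CategoryTotals AS ct
            JOIN  (SELECT CategoryID, SUM(Amount) AS AmountSum
                   FROM   inserted
                   GROUP BY CategoryID) AS i
                   ON i.CategoryID = ct.CategoryID;
        END;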

    I mean, updating a row and having a trigger run is not going to be slow UNLESS you are updating lots of rows. Updating lots of rows will be slow, especially if, as you noted, each row update fires a trigger (data macro).

    Not knowing more details here, it is hard to suggest design changes to speed things up, but #1 on the list is to ensure that any column searched or filtered by those data macros has an index.

    The bottom line is that you don't have a high-speed row-update ability in an Access web app, and tossed on top of this issue are triggers that run for each row updated.

    So adopting a design in which you don't have to update lots of rows is likely the solution, if possible.

    Regards,

    Albert D. Kallal (Access MVP)

    Edmonton, Alberta Canada

    Monday, April 25, 2016 4:28 PM