DAO: Is it possible to Update without DAO first fetching?

  • Question

• I am using VS C++ DAO code to access an Oracle database via ODBC. The actual database is on a remote server, so connections go through the Oracle Listener.

    I want to update every field in the records in a recordset (except the key field).  So the code in essence is: (I have omitted all the error checking).

    RecordSet->MoveFirst();

    while (!RecordSet->EOF)
    {
        RecordSet->Edit();
        <Set all the fields except the key field>;
        RecordSet->Update( DAO::dbUpdateRegular, false );
        RecordSet->MoveNext();
    }

    According to the online documentation all of the Move() methods cause the values for the fields in the new record to be fetched from the database. Since the actual database is on a remote system I would like to avoid that for performance reasons.  Is that possible?

    Thanks.

    Saturday, June 18, 2016 10:49 PM

Answers

  • You should see the best performance improvement if you replace the While loop with a set-based UPDATE SQL statement. Search for "RBAR" ("Row-By-Agonizing-Row").

    Matthias Kläy, Kläy Computing AG

     
    • Marked as answer by smj_w Monday, June 20, 2016 12:45 AM
    Sunday, June 19, 2016 9:57 AM

All replies

  • > the values for the fields in the new record

    There is no new record in this code. There are only existing records. All existing records will have to be fetched, assuming you need to update all records in your While loop.

    It is rare that all fields in all rows need to be updated. What is your real-world scenario?


    -Tom. Microsoft Access MVP

    Sunday, June 19, 2016 1:31 AM
• Consider executing the SQL statement via CurrentDb.Execute strSQL, dbFailOnError.
    • Edited by aushknotes Sunday, June 19, 2016 4:09 AM
    Sunday, June 19, 2016 3:52 AM
• Thank you for your replies. Taking the second one first: I tried that, but it was MUCH slower than using Edit() ... Update() - by an order of magnitude.

    The real world scenario is that this is a data collection app where data is being stored after being received from telemetry devices in the field. The table has more fields than are SELECTed by the QueryDef that creates the RecordSet. This RecordSet is only concerned with those fields in the table being updated.   I.e.,

The Select statement is "SELECT <Update Field 1>, <Update Field 2>, etc. FROM <Table> WHERE Key >= <low key> AND Key <= <high key> ORDER BY Key;"

    QueryDef = Database->CreateQueryDef( "", <Select Statement> );
    RecordSet = QueryDef->OpenRecordset( DAO::dbOpenDynaset, 0, DAO::dbOptimistic );

On my test system each pass through the loop takes a little over 3 msecs, so even in the best scenario the app can barely handle 300 updates a second. I have been tasked with improving the performance and am wondering whether we can stick with DAO. (The code has been around a while.)

    Years ago this MS article:

     http://msdn.microsoft.com/en-us/library/ff965871

 was VERY influential in making the decision to go with DAO. But I have not seen the article updated since then for more recent versions of the OS and Access. So as a side question, I wonder what the current recommended way is for working with databases from native C++ code. Using DAO is very convenient because I can work with MS Access databases and Oracle databases with the same code.

BTW - I turned on ODBC tracing. I cannot believe how many separate ODBC calls are needed!

    Sunday, June 19, 2016 5:46 AM
  • Thanks - Interesting reading about RBAR!  So something like:

    EXEC SQL "DELETE FROM <MergeTable>;"

    Loop INSERTing new records into <MergeTable>; (Using DAO AddNew() ... Update()).

    EXEC SQL "MERGE INTO <TableToBeUpdated> USING <MergeTable> etc....;"

    I'll try it and see what happens.

    Sunday, June 19, 2016 7:49 PM
• That's great! Using the above method I went from 16,500 updates per minute to over 24,000! Plus it easily copes with the situation where non-consecutive records are being updated. Hopefully that will meet the project needs (for now).
    Monday, June 20, 2016 12:45 AM