Question about OracleBulkCopy ?!?

  • Question

  • I have very huge files, and I need to insert all rows from these files into the database.

    I am using two methods:

    The first is: I take the first 40000 rows and insert them into the db using OracleBulkCopy, then the next 40000 rows, and so on.

    With this approach I have no problems. All rows are inserted into the db successfully.

    The second is to insert all rows into the db at once. With this approach the app gives me "Unhandled Exception: Out of memory exception". Is that normal?

    I am asking because I suspect that dividing the files into parts may have been unnecessary. A sketch of the chunked method follows below.

    Thanks in advance.
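    For reference, this is roughly what the first (chunked) method looks like. A minimal sketch only; the connection string, table name, column layout, and comma-separated file format are all assumptions for illustration:

        // Sketch of the chunked approach: read the file 40000 rows at a
        // time and flush each chunk with OracleBulkCopy, so only one chunk
        // is ever held in memory. Table/column/file names are assumptions.
        using System;
        using System.Data;
        using System.IO;
        using Oracle.DataAccess.Client;

        class ChunkedLoader
        {
            static void Main()
            {
                const int chunkSize = 40000;

                var table = new DataTable("MY_TABLE");      // assumed table name
                table.Columns.Add("COL1", typeof(string));  // assumed columns
                table.Columns.Add("COL2", typeof(string));

                using (var file = new StreamReader("huge_file.csv"))  // assumed file
                using (var bulk = new OracleBulkCopy("User Id=scott;Password=tiger;Data Source=orcl"))
                {
                    bulk.DestinationTableName = "MY_TABLE";

                    string line;
                    while ((line = file.ReadLine()) != null)
                    {
                        string[] fields = line.Split(',');
                        table.Rows.Add(fields[0], fields[1]);

                        if (table.Rows.Count == chunkSize)
                        {
                            bulk.WriteToServer(table);  // flush this chunk
                            table.Clear();              // free the rows
                        }
                    }

                    if (table.Rows.Count > 0)
                        bulk.WriteToServer(table);      // flush the remainder
                }
            }
        }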


    Assassin's Creed: Nothing is True, Everything is Permitted.


    Friday, March 23, 2012 8:17 AM

Answers

  • What is the file size? If a file is larger than 100 MB, it is advisable to process the files one by one, and I believe you are doing that in your first approach. What I do not understand is why you are partitioning the rows (40000 at a time) manually before sending them to the DB. OracleBulkCopy has a BatchSize property, and setting it to a number (e.g. 40000) lets the partitioning be taken care of internally; see the sketch after the link below.

    More is here,

    http://docs.oracle.com/html/E10927_01/OracleBulkCopyClass.htm#CHDCDEIA
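
    For example, a minimal sketch of that suggestion; the connection string, table, and columns here are assumptions:

        // Sketch: one WriteToServer call, letting BatchSize drive the
        // batching instead of slicing the rows by hand.
        var allRows = new DataTable("MY_TABLE");  // assumed: already filled
        // ... columns added and all file rows loaded into allRows here ...

        using (var bulk = new OracleBulkCopy("User Id=scott;Password=tiger;Data Source=orcl"))
        {
            bulk.DestinationTableName = "MY_TABLE";
            bulk.BatchSize = 40000;       // rows sent to the server per round trip
            bulk.BulkCopyTimeout = 600;   // seconds; large loads may need more
            bulk.WriteToServer(allRows);  // one call; the provider batches internally
        }

    One caveat: BatchSize only controls how many rows go to the server per round trip; the DataTable passed to WriteToServer still has to fit in client memory, so loading the entire file at once can still fail.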

    For your second approach, an "Out of memory exception" is normal for large file processing in a 32-bit process: no matter how much RAM the machine has, a 32-bit user-mode process is limited to roughly 2 GB of address space by default. So when the C# app tries to load the whole large file at once, you get this exception. Running as a 64-bit process on a 64-bit OS removes that ceiling, since the 64-bit address space is vastly larger (though not unlimited).

    More is here,

    http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778(v=vs.85).aspx
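
    As a quick check, the app can report whether it is actually running as a 64-bit process (these Environment properties need .NET 4.0 or later):

        // Prints whether the current process and the OS are 64-bit.
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);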


    Lingaraj Mishra

    Saturday, March 24, 2012 6:19 AM

All replies

  • hi,

    this has nothing to do with ADO.NET itself; you would probably get better answers in an Oracle forum.

    But from what I remember, it is never a bad idea to split these bulk inserts across several files, assuming that your second way is also a bulk insert. If it is not a bulk insert, then it is normal that you get memory problems, because every entry would be logged individually, and avoiding that is exactly the point of a bulk insert.


    Regards, Nico

    Friday, March 23, 2012 9:17 AM
  • :)) Yeees??? OH no......

    I changed my app: I deleted all the unnecessary code and instead wrote BatchSize = 4000;

    But it is giving me an exception:

    ORA-00604: error occurred at recursive SQL level 1

    ORA-00054: resource busy and acquire with NOWAIT specified or timeout expired.

    I then dropped and recreated the table, but now it is again giving me the Out of Memory exception.
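
    One guess about the ORA-00054: a previous failed run may have left an open transaction holding a lock on the table, so the next bulk copy could not acquire it. A sketch that always releases the bulk copy (and its connection), even when the load throws; the connection string and names are assumptions:

        // Sketch: dispose the OracleBulkCopy even on failure, so a crashed
        // run does not leave the destination table locked (ORA-00054).
        var rows = new DataTable("MY_TABLE");  // assumed: the rows being loaded

        try
        {
            using (var bulk = new OracleBulkCopy("User Id=scott;Password=tiger;Data Source=orcl"))
            {
                bulk.DestinationTableName = "MY_TABLE";
                bulk.BatchSize = 4000;
                bulk.WriteToServer(rows);
            }
        }
        catch (OracleException ex)
        {
            Console.WriteLine("Bulk load failed: " + ex.Message);
            throw;  // rethrow after logging; the using block has released the lock
        }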


    Ferhad Jabiyev


    Saturday, March 24, 2012 8:14 AM