"Provider=Microsoft.Jet.OLEDB.4.0" "Record is too large" RRS feed

  • Question

  • In C#, I am trying to use OLEDB and SCHEMA.INI to read a fixed-length text file, and I have hit the wall on record size.

    • Is there another OLEDB provider that allows a larger record size and keeps the code the same?
    • Second choice: if no improved provider exists, what would be the best way to read a fixed-length text file and stick the fields into a data table for a bulk copy to SQL Server? Please point me to an example if one exists.

    Thanks in advance for any help here.

    Doug

    Tuesday, November 24, 2015 9:08 PM

Answers

  • Here is an answer (in code) I came up with by replacing OLEDB with a StreamReader instead:

    // OLEDB "Record is too large" failure fix using StreamReader
    using (StreamReader sr = new StreamReader(string.Format(@"{0}\{1}.txt", sourcedatapath, destinationTableName)))
    {
    	while (sr.Peek() >= 0)
    	{
    		// read a line at a time
    		string line = sr.ReadLine();
    		// then parse and cram fields into data table columns
    		DataRow row = dataTableTxtFile.NewRow();
    		foreach (DataColumn column in dataTableTxtFile.Columns)
    		{
    			string columnName = column.ColumnName;
    			// come up with your row layout for the text file
    			int zeropos = int.Parse(columnStartLength[columnName].Split('|')[0]);
    			int length = int.Parse(columnStartLength[columnName].Split('|')[1]);
    			string value = line.Substring(zeropos, length);
    			row[columnName] = value;
    		}
    		dataTableTxtFile.Rows.Add(row);
    	}
    }
    
    It works for me with some extra column definition, but this gives the general idea of how to do it in code.
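
    From there, the filled DataTable can go straight to SQL Server with the SqlBulkCopy class. A minimal sketch of that last step (connectionString is a placeholder for your own SQL Server connection string; the other names match the code above):

    // hand the filled DataTable to SqlBulkCopy (requires System.Data.SqlClient;
    // connectionString is a placeholder, the other names match the code above)
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
    {
    	bulkCopy.DestinationTableName = destinationTableName;
    	bulkCopy.WriteToServer(dataTableTxtFile);
    }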

    • Marked as answer by Doug Mann Tuesday, December 8, 2015 4:27 PM
    Tuesday, December 8, 2015 4:27 PM

All replies

  • Hello Doug,

    >>Is there another provider that allows a larger record size for OLEDB and keep code the same?

    You could try the newer Microsoft.ACE.OLEDB.12.0 provider, which you can download from the site below:

    http://www.microsoft.com/en-us/download/details.aspx?id=13255
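
    If you keep the SCHEMA.INI approach, only the connection string needs to change. A minimal sketch, assuming the folder C:\Data holds your text file plus a SCHEMA.INI that declares Format=FixedLength and the column widths (the folder and file names are illustrative):

    // open the fixed-length text file through the ACE provider
    // (requires System.Data and System.Data.OleDb; C:\Data and MyFile.txt are
    // illustrative, and SCHEMA.INI must describe the fixed-length layout)
    string connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data;" +
                     "Extended Properties=\"text;HDR=No\";";
    using (OleDbConnection conn = new OleDbConnection(connStr))
    using (OleDbDataAdapter adapter = new OleDbDataAdapter("SELECT * FROM [MyFile.txt]", conn))
    {
    	DataTable table = new DataTable();
    	adapter.Fill(table);   // Fill opens and closes the connection itself
    }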

    >>Second choice, if no improved provider exists then what would be the best way to read a fixed length text file and stick the fields into a data table for a bulk copy to SQL Server?

    From your description, it seems you are trying to import data into SQL Server; for this, a suggested approach is to use the bcp.exe tool:

    https://msdn.microsoft.com/en-us/library/ms162802.aspx

    It is fast and easy to use.
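
    A typical invocation might look like this (the server, database, table, and file names are placeholders, and the non-XML format file would describe your fixed-length layout):

    bcp MyDb.dbo.MyTable in C:\Data\MyFile.txt -f C:\Data\MyFile.fmt -S myserver -T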

    Regards.



    Wednesday, November 25, 2015 2:02 AM
    Moderator
  • Thanks Fred,

    I tried the new OLEDB provider and it complained similarly that "This operation will fail because the text file you are about to import contains more than 255 columns".

    Regarding the bcp.exe tool, I was more interested in a C# solution so I can maintain the code, which does other things besides the bulk copy. I am thinking along the lines of perhaps the StreamReader or StringReader class, but I'm not sure whether these approaches have given way to something newer/better.

    Wednesday, November 25, 2015 2:01 PM
  • >>I tried the new OLEDB provider and it complained similarly that "This operation will fail because the text file you are about to import contains more than 255 columns".

    So does your data file have more than 255 fields? If so, this is a by-design limitation.

    >>Regarding the bcp.exe tool, I was more interested in a C# solution to maintain the code which does other things beside a bulk copy.

    I recommended this approach because bcp.exe is written in native code; it should be faster than the .NET code.

     



    Thursday, November 26, 2015 1:42 AM
    Moderator
  • If you want to stick to mostly .NET code, one option would be to use Transact-SQL and the BULK INSERT command with the System.Data.SqlClient library.

    https://msdn.microsoft.com/en-us/library/ms188365.aspx
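
    A minimal sketch of that option (connectionString, the table, and the file paths are placeholders; note that the file paths must be visible to the SQL Server service, not just to the client):

    // run a T-SQL BULK INSERT through System.Data.SqlClient
    // (connectionString, table name, and file paths are placeholders)
    string sql = @"BULK INSERT dbo.MyTable FROM 'C:\Data\MyFile.txt' " +
                 @"WITH (FORMATFILE = 'C:\Data\MyFile.fmt')";
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
    	conn.Open();
    	cmd.ExecuteNonQuery();
    }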

    Another option, which would be slower, would be to use the TextFieldParser Class and export to a DataTable. Then use the DataTable as the source for the SqlBulkCopy Class.
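
    A rough sketch of that route (TextFieldParser lives in the Microsoft.VisualBasic assembly, namespace Microsoft.VisualBasic.FileIO; the field widths, column names, and table name below are illustrative):

    // parse the fixed-width file with TextFieldParser, fill a DataTable,
    // then push it to SQL Server with SqlBulkCopy (requires references to
    // Microsoft.VisualBasic and System.Data; all names and widths are illustrative)
    using (TextFieldParser parser = new TextFieldParser(@"C:\Data\MyFile.txt"))
    {
    	parser.TextFieldType = FieldType.FixedWidth;
    	parser.SetFieldWidths(10, 5, 20);   // one width per field, in record order
    	DataTable table = new DataTable();
    	table.Columns.Add("Col1");
    	table.Columns.Add("Col2");
    	table.Columns.Add("Col3");
    	while (!parser.EndOfData)
    	{
    		table.Rows.Add(parser.ReadFields());
    	}
    	using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
    	{
    		bulkCopy.DestinationTableName = "dbo.MyTable";
    		bulkCopy.WriteToServer(table);
    	}
    }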

    I'm afraid that due to the number of columns in the table you will have to rule out the OLEDB or ODBC Text providers.


    Paul ~~~~ Microsoft MVP (Visual Basic)

    Friday, November 27, 2015 6:31 PM