getting java.io.IOException: The stream is closed when retrieving image field from DB with JDBC type 4 driver

  • Question

  • Hi, 

    I am trying to port a proprietary application, previously installed on Oracle, to SQL Server. The application is written in Java; I just had to replace the Oracle JDBC driver with Microsoft's to make it work with SQL Server. Everything works fine except for a piece of code dealing with retrieval of data from columns of type image.

    I keep getting this exception

    java.io.IOException: The stream is closed.
            at com.microsoft.sqlserver.jdbc.BaseInputStream.checkClosed(SimpleInputStream.java:93)
            at com.microsoft.sqlserver.jdbc.SimpleInputStream.read(SimpleInputStream.java:272)

    the line of code retrieving the data is

    InputStream inputStream=contentResultSet.getBinaryStream(1);

    I tried with the jTDS driver from http://jtds.sourceforge.net and I did not have the problem.

    Has anyone seen this before? Is there any way around it?

    thanks

    Wednesday, May 23, 2012 12:23 PM

Answers

  • Yes, the Microsoft driver follows the JDBC spec with respect to getBinaryStream =>

    Note: All the data in the returned stream must be read prior to getting the value of any other column. The next call to a getter method implicitly closes the stream. Also, a stream may return 0 when the method InputStream.available is called whether there is data available or not.

    So getBinaryStream is designed to return a true stream of bytes.  We return a true stream that is NOT buffered in memory; it is read right off the wire from the server response.  With our implementation you cannot close the contentDataResultSet until you have fetched all the bytes.  I suspect the other driver buffers the entire TDS response internally in memory, which is why it works there (in other words, they don't implement a true stream).

    So you can do something like this if you want a detachable inputstream:

    InputStream inputstream = contentDataResultSet.getBlob(1).getBinaryStream();

    This works because getBlob in our implementation reads all the bytes into a byte array, then getBinaryStream returns a stream over the cached data.
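    The buffering that getBlob performs can be sketched in plain Java. In the sketch below, the names (`BlobBufferSketch`, `bufferFully`) and the in-memory streams standing in for the driver's wire stream are illustrative only, not part of the Microsoft driver's API; the point is the pattern of draining everything into a byte array first, then returning a detached stream over the cached copy.

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Illustrative sketch of what getBlob(...).getBinaryStream() does for you:
    // drain the live stream into a byte array, then hand back a stream over
    // that cached copy, which stays usable after the ResultSet is closed.
    public class BlobBufferSketch {

        // Read every byte from the live stream into memory, then return a
        // detached stream that no longer depends on the source.
        static InputStream bufferFully(InputStream live) throws IOException {
            ByteArrayOutputStream cache = new ByteArrayOutputStream();
            byte[] chunk = new byte[8192];
            int n;
            while ((n = live.read(chunk)) != -1) {
                cache.write(chunk, 0, n);
            }
            return new ByteArrayInputStream(cache.toByteArray());
        }

        public static void main(String[] args) throws IOException {
            // Simulate the driver's on-the-wire stream with in-memory bytes.
            byte[] image = {10, 20, 30, 40};
            InputStream wire = new ByteArrayInputStream(image);

            InputStream detached = bufferFully(wire);
            wire.close(); // closing the source no longer matters

            for (byte b : image) {
                if (detached.read() != b) throw new AssertionError("mismatch");
            }
            System.out.println("detached stream readable after source closed");
        }
    }
    ```

    The trade-off is that the entire value lives in memory at once, which is fine for small images.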


    Matt

    • Proposed as answer by Matt Neerincx [MSFT] Wednesday, May 23, 2012 4:21 PM
    • Marked as answer by kc_az Thursday, May 24, 2012 7:15 AM
    Wednesday, May 23, 2012 4:21 PM

All replies

  • I have not seen this before; it should work, it is pretty simple.  Is the back-end column type IMAGE?  Also, which Microsoft JDBC driver are you using?

    Also, I suspect the stream could get closed if you moved to the next row in contentResultSet; we don't cache the stream.


    Matt

    Wednesday, May 23, 2012 2:15 PM
  • I have not seen this before; it should work, it is pretty simple.  Is the back-end column type IMAGE?  Also, which Microsoft JDBC driver are you using?

    Also, I suspect the stream could get closed if you moved to the next row in contentResultSet; we don't cache the stream.


    Matt

    Yes, the back-end column type is indeed IMAGE, and I am using Microsoft JDBC Driver 4.0.

    Here's the complete code retrieving the column from the DB:

    private InputStream getDBDataInputStream(int id) throws SQLException, IOException {
        // retrieve_content_statement:
        // select contentData from data_content_tab where content_id=?
        PreparedStatement selectContentDataStatement =
                DBManager.getConnectionFromPool().prepareStatement(retrieve_content_statement);
        ResultSet contentDataResultSet = DBManager.executeQuery(selectContentDataStatement, id);
        if (!contentDataResultSet.next())
            throw new IOException("Missing data for content " + id + "!");
        InputStream inputstream = contentDataResultSet.getBinaryStream(1);
        DBManager.close(selectContentDataStatement, contentDataResultSet);
        return inputstream;
    }


    When you say the stream is not cached, this means we should read the whole byte content from the stream before we close the result set or move to another record, correct?

    thanks again


    • Edited by kc_az Wednesday, May 23, 2012 3:56 PM
    Wednesday, May 23, 2012 3:53 PM
  • (See the marked answer above.)

    Wednesday, May 23, 2012 4:21 PM
  • Note the one downside here: if the IMAGE is really large, this is going to cause a huge memory allocation.  Another way to accomplish the same task is to not close the ResultSet, but instead fetch all the data before you close it; this gives true streaming, where the entire IMAGE is never loaded into a byte array.  If your application uses small IMAGE data this will not be a problem, but if you are dealing with really large IMAGE data (100 MB, for example), correctly streaming the data will be a big performance win.
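    The streaming alternative can be sketched like this. The names (`StreamBeforeCloseSketch`, `copyBeforeClose`) are illustrative, and in-memory streams stand in for `getBinaryStream(1)` and the destination file or socket; the pattern is to copy in fixed-size chunks while the ResultSet is still open, and only close it once the copy loop finishes.

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    // Illustrative sketch of true streaming: drain the driver's stream to the
    // destination in small chunks before the ResultSet is closed, so the whole
    // IMAGE value is never resident in memory at once.
    public class StreamBeforeCloseSketch {

        static long copyBeforeClose(InputStream fromDriver, OutputStream dest) throws IOException {
            byte[] chunk = new byte[8192]; // only 8 KB resident at a time
            long total = 0;
            int n;
            while ((n = fromDriver.read(chunk)) != -1) {
                dest.write(chunk, 0, n);
                total += n;
            }
            return total; // only after this loop is it safe to close the ResultSet
        }

        public static void main(String[] args) throws IOException {
            byte[] image = new byte[100_000]; // pretend this is a large IMAGE column
            InputStream driverStream = new ByteArrayInputStream(image);
            ByteArrayOutputStream dest = new ByteArrayOutputStream();

            long copied = copyBeforeClose(driverStream, dest);
            if (copied != image.length) throw new AssertionError("short copy");
            System.out.println("streamed " + copied + " bytes in 8 KB chunks");
        }
    }
    ```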


    Matt

    Wednesday, May 23, 2012 4:24 PM