So, I am attempting to use ADO.NET to stream binary file data stored in an image column in a SQL Server Compact database.

To do this, I wrote a DataReaderStream class that takes a data reader, opened for sequential access, and represents it as a stream, redirecting calls to Read(...) on the stream to IDataReader.GetBytes(...).
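For context, here is a minimal sketch of what such a wrapper can look like. This is illustrative only (error handling and argument checks omitted), not my full implementation:

```csharp
using System;
using System.Data;
using System.IO;

// A read-only Stream over an IDataReader opened with
// CommandBehavior.SequentialAccess, forwarding Read(...) to GetBytes(...).
public class DataReaderStream : Stream
{
    private readonly IDataReader m_dr; // reader positioned on the row to stream
    private readonly int m_column;     // ordinal of the blob column
    private long m_fieldOffSet;        // running count of bytes already read

    public DataReaderStream(IDataReader dr, int column)
    {
        m_dr = dr;
        m_column = column;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        // GetBytes wants the absolute offset into the field, so track it here.
        long bytesRead = m_dr.GetBytes(m_column, m_fieldOffSet, buffer, offset, count);
        m_fieldOffSet += bytesRead;
        return (int)bytesRead;
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => m_fieldOffSet;
        set => throw new NotSupportedException();
    }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}
```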

One "odd" aspect of IDataReader.GetBytes(...), compared to the Stream class, is that GetBytes requires the client to increment an offset and pass it in every time it's called. It does this even though access is sequential, and you cannot read "backwards" in the data reader's stream.

The SqlCeDataReader implementation of IDataReader enforces this by incrementing an internal counter that tracks the total number of bytes it has returned. If you pass in a number either less than or greater than that value, the method will throw an InvalidOperationException.
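In other words, the usual chunked-read pattern over GetBytes looks something like this (the column ordinal and chunk size here are illustrative):

```csharp
using System.Data;
using System.IO;

static class BlobReader
{
    // Copy a blob from column 0 of the current row into a stream. The caller
    // must pass the running byte offset back into GetBytes on every call,
    // even though access is strictly sequential.
    public static void CopyBlob(IDataReader reader, Stream output)
    {
        long fieldOffset = 0;
        var chunk = new byte[4096];
        long bytesRead;
        while ((bytesRead = reader.GetBytes(0, fieldOffset, chunk, 0, chunk.Length)) > 0)
        {
            output.Write(chunk, 0, (int)bytesRead);
            fieldOffset += bytesRead; // SqlCeDataReader checks this against its internal counter
        }
    }
}
```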

The problem with this, however, is that there is a bug in the SqlCeDataReader implementation that causes it to set the internal counter to the wrong value. This results in subsequent calls to Read on my stream throwing exceptions when they shouldn't.

I found some information about the bug on this MSDN thread.

I was able to come up with a disgusting, horribly hacky workaround that basically uses reflection to update the field in the class to the correct value.

The code looks like this:

    public override int Read(byte[] buffer, int offset, int count)
    {
        // Passing a null buffer to GetBytes returns the total length of the field.
        m_length = m_length ?? m_dr.GetBytes(0, 0, null, offset, count);

        if (m_fieldOffSet < m_length)
        {
            var bytesRead = m_dr.GetBytes(0, m_fieldOffSet, buffer, offset, count);
            m_fieldOffSet += bytesRead;

            if (m_dr is SqlCeDataReader)
            {
                //BEGIN HACK
                //This is a horrible HACK.
                m_field = m_field ?? typeof(SqlCeDataReader).GetField("sequentialUnitsRead", BindingFlags.NonPublic | BindingFlags.Instance);
                var length = (long)m_field.GetValue(m_dr);
                if (length != m_fieldOffSet)
                {
                    // Overwrite the reader's broken internal counter with the real offset.
                    m_field.SetValue(m_dr, m_fieldOffSet);
                }
                //END HACK
            }

            return (int)bytesRead;
        }
        else
        {
            return 0;
        }
    }

For obvious reasons, I would rather not use this.

However, I don't want to buffer the entire contents of the blob in memory either.

Does anyone know of a way I can get streaming data out of a SQL Server Compact database without having to resort to such horrible code?

I contacted Microsoft (through the SQL Compact Blog) and they confirmed the bug, and suggested I use OLEDB as a workaround. So, I'll try that and see if it works for me.

Actually, I decided to fix the problem by just not storing blobs in the database to begin with.

This removes the problem (I can stream the data from a file), and also fixes some issues I would have run into with SQL Compact's 4 GB size limit.
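A sketch of the file-based approach: store only a file name in the database and keep the bytes on disk, so FileStream does the streaming and SqlCeDataReader is never involved. The table and column names here are hypothetical:

```csharp
using System.Data;
using System.IO;

static class BlobStore
{
    // Look up the stored file name for a record and open the file for
    // streaming. Assumes a hypothetical Files(Id, FileName) table.
    public static Stream OpenBlob(IDbConnection conn, int id, string blobDirectory)
    {
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = "SELECT FileName FROM Files WHERE Id = @id";
            var p = cmd.CreateParameter();
            p.ParameterName = "@id";
            p.Value = id;
            cmd.Parameters.Add(p);

            var fileName = (string)cmd.ExecuteScalar();
            return File.OpenRead(Path.Combine(blobDirectory, fileName));
        }
    }
}
```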