// TODO: Make the initial allocation size configurable?
Ideally a configured value of 0 (or -1) would mean: use `columnSize * cbElement`.
This would help to work around a bug in the Oracle BI ODBC driver's `SQLGetData` implementation.
The driver expects to receive `cbAllocated` instead of `cbAvailable` as the 5th parameter (`BufferLength`) of the `SQLGetData` call. Even worse: it keeps returning `SQL_SUCCESS_WITH_INFO` with a single NUL byte when it has no more data, until that parameter is at least as big as `cbData`.
As a result, pyodbc keeps growing the buffer until all RAM has been allocated, and then segfaults.
The only sane way to work around this without breaking compatibility is to ensure we never get a `SQL_SUCCESS_WITH_INFO` return, which means making the initial buffer as big as the maximum column size.
defransen changed the title from "Please make the initial allocation size in ReadVarolumn configurable" to "Please make the initial allocation size in ReadVarColumn configurable" on Jun 20, 2022.
I don't have an Oracle install to test against. Would a flag that caused pyodbc to pass `cbAllocated` actually fix the issue? That is, would the driver return `SQL_SUCCESS` once it thought the buffer was big enough?
Also, is there a bug report open somewhere for the driver that we can link to and follow?
... as the comment suggests:

pyodbc/src/getdata.cpp, line 95 in a4b0b75:

> Ideally a configured value of 0 (or -1) would mean: use `columnSize * cbElement`.
> This would help to work around a bug in the Oracle BI ODBC drivers SQLGetData implementation.
> The driver expects to get `cbAllocated` instead of `cbAvailable` as the 5th parameter in the `SQLGetData` call. And even worse: it continues to return `SQL_SUCCESS_WITH_INFO` with a single NULL-byte when it has no more data until the 5th parameter is as big as `cbData`.
> The result is that pyodbc seg-faults after all RAM has been allocated for the buffer.
> The only sane way to work around this, without breaking compatibility, is to ensure to never get a `SQL_SUCCESS_WITH_INFO` return. Which means make the initial buffer as big as the maximum column size.