BigQuery queries have a limited response size [1], so syncs may fail when a large response is generated.

The python-bigquery-sqlalchemy [2] library supports passing a `destination` query parameter, so the fix probably involves adding a new setting (e.g. `destination_table`) and passing it to the SQLAlchemy URL construction in `tap-bigquery/tap_bigquery/connector.py` (lines 45 to 47 in 0e37a0f). The string in question is a fully qualified table, e.g. `different-project.different-dataset.table`.
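For illustration, here is a minimal sketch of what that URL construction could look like. It is not the actual connector code: the `project_id` and `dataset_id` setting names are assumptions for the example, and only `destination_table` corresponds to the setting proposed above; `destination` is the python-bigquery-sqlalchemy connection string parameter it would map to.

```python
# Minimal sketch, not the actual tap-bigquery connector code.
# Assumes hypothetical `project_id` and `dataset_id` settings plus the
# proposed `destination_table` setting.
from sqlalchemy.engine import URL


def build_sqlalchemy_url(config: dict) -> URL:
    """Build a BigQuery SQLAlchemy URL, optionally routing query results
    to a permanent destination table via the `destination` parameter."""
    query = {}
    destination = config.get("destination_table")
    if destination:
        # Fully qualified table, e.g. "different-project.different-dataset.table"
        query["destination"] = destination
    return URL.create(
        drivername="bigquery",
        host=config["project_id"],          # bigquery://<project>/<dataset>
        database=config.get("dataset_id"),
        query=query,
    )


# str(url) -> "bigquery://my-project/my_dataset?destination=
#              different-project.different-dataset.table"
url = build_sqlalchemy_url(
    {
        "project_id": "my-project",
        "dataset_id": "my_dataset",
        "destination_table": "different-project.different-dataset.table",
    }
)
```

With a permanent destination table configured, query results are written to that table rather than to a size-limited anonymous table, which is what the writing-results documentation linked in footnote 1 describes.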
Hello @edgarrmondragon, I will happily work on this, but I am a bit lost and do not really understand the issue: how would `destination` solve this? Also tagging @pnadolny13 to understand why it was decided to go with SQLAlchemy instead of using the standard `google.cloud` BigQuery client.
Footnotes:

1. https://cloud.google.com/bigquery/docs/writing-results
2. https://github.com/googleapis/python-bigquery-sqlalchemy?tab=readme-ov-file#connection-string-parameters