It seems that the Python SDK for Databricks allows uploading files.
Research whether it is possible to load files into tables the way we do with BigQuery, where a local file can be copied into a table without any staging.
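For reference, this is the BigQuery pattern we are trying to match: the client library streams a local file directly into a load job, with no intermediate stage. A minimal sketch using google-cloud-bigquery (table and file names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    autodetect=True,
)

# The local file is pushed straight into the load job -- no external stage needed.
with open("data.parquet", "rb") as f:
    load_job = client.load_table_from_file(f, "my_dataset.my_table", job_config=job_config)
load_job.result()  # wait for the load to finish
```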
If that does not work, research how to use Volumes on Databricks: copy the files there and use COPY INTO to move them into a table.
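If direct loading is not supported, the fallback could look roughly like this: upload the file into a Unity Catalog Volume with the SDK's files API, then run COPY INTO through SQL statement execution. A rough sketch assuming the `databricks-sdk` package, with placeholder catalog/schema/volume/table names and warehouse id:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from env vars or the notebook context

# 1. Upload the local file into a Unity Catalog Volume (placeholder path).
volume_path = "/Volumes/my_catalog/my_schema/my_volume/data.parquet"
with open("data.parquet", "rb") as f:
    w.files.upload(volume_path, f, overwrite=True)

# 2. COPY INTO the target table from the Volume via SQL statement execution.
w.statement_execution.execute_statement(
    warehouse_id="<warehouse-id>",
    statement=f"""
        COPY INTO my_catalog.my_schema.my_table
        FROM '{volume_path}'
        FILEFORMAT = PARQUET
    """,
)
```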
If authentication is not configured, fall back to default credentials (i.e. if they are present on serverless compute). Take a look at how `CredentialsWithDefault` is used: most implementations check whether default credentials are present in `def on_partial(self) -> None:`, but in your case you should do it in `on_resolve`, when all fields holding credentials are empty.
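A hypothetical sketch of that fallback is below. The base-class names and hooks follow the issue's description, the import paths and field names are assumptions and may differ from the actual dlt interfaces:

```python
# Import paths are assumptions; adjust to the actual dlt version.
from dlt.common.configuration.specs import CredentialsWithDefault
from dlt.destinations.impl.databricks.configuration import DatabricksCredentials
from databricks.sdk import WorkspaceClient


class DatabricksCredentialsWithDefault(DatabricksCredentials, CredentialsWithDefault):
    def on_resolve(self) -> None:
        # Fall back to defaults only when every credential field is empty.
        if not any([self.access_token, self.client_id, self.client_secret]):
            try:
                # WorkspaceClient() resolves ambient auth when running on
                # Databricks compute (e.g. a serverless notebook).
                self._set_default_credentials(WorkspaceClient())
            except Exception:
                pass  # no defaults available; resolution fails as usual
```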
The ideal scenario, when running in a notebook, is that we can load a source (e.g. `rest_api`) without any additional configuration, staging, or authorization, like we are able to do with duckdb.
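Concretely, the target notebook experience would be something like the following (assumes the default-credentials fallback above works; the REST API config is just an illustrative placeholder):

```python
import dlt
from dlt.sources.rest_api import rest_api_source

# No explicit Databricks credentials or staging configured: the destination
# should pick up defaults from the notebook runtime, just like duckdb needs
# nothing at all.
source = rest_api_source({
    "client": {"base_url": "https://pokeapi.co/api/v2/"},
    "resources": ["pokemon"],
})

pipeline = dlt.pipeline(destination="databricks", dataset_name="rest_api_data")
pipeline.run(source)
```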