Support AWS S3 Simple Authentication (Access/Secret Key) #410
Comments
Thanks, @spiegela. It sounds like the table is already registered in the catalog you're using. Out of curiosity (not that I particularly think it will make a difference), have you tried using qbeast-spark?
Hi Aaron!

```sql
CREATE TABLE my_table (id INT, name STRING)
USING qbeast
LOCATION 's3a://spieg-qbeast/qbeast-table'
OPTIONS ('columnsToIndex'='id,name')
```

I had to change the type to STRING and add the LOCATION and OPTIONS settings.

```sql
SELECT * FROM delta.`s3a://spieg-qbeast/qbeast-table`
```

However, this syntax only works if you are using the DeltaCatalog; for some reason it doesn't work with our catalog, and we have to investigate why (the same applies to the qbeast.`path` syntax). By the way, be careful: the correct configuration for adding packages is:
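The packages configuration mentioned above is truncated in the scraped thread. As a hedged sketch only (artifact coordinates, versions, and the extension/catalog class names are assumptions taken from the qbeast-spark README conventions; verify them against the version matching your Spark/Scala build), packages are typically added with `--packages` rather than `--jars`, so that transitive dependencies are resolved:

```shell
# Hypothetical sketch: launching spark-shell with qbeast-spark resolved
# from Maven Central (coordinates and versions are assumptions).
spark-shell \
  --packages io.qbeast:qbeast-spark_2.12:0.6.0 \
  --conf spark.sql.extensions=io.qbeast.spark.internal.QbeastSparkSessionExtension \
  --conf spark.sql.catalog.spark_catalog=io.qbeast.spark.internal.sources.catalog.QbeastCatalog
```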
I've created issue #412 to track the missing support for the delta.`path` and qbeast.`path` syntax in QbeastCatalog.
Thanks @cugni. I've figured it out now: I had an obsolete jar in my include path.
What went wrong?
When creating a new table with qbeast-spark on an S3 bucket configured with Access Key/Secret Key credentials, Spark incorrectly reports that the table already exists.

How to reproduce?
1. Code that triggered the bug, or steps to reproduce:
Configure Spark to use S3 with simple credentials and Qbeast:
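The configuration snippet was lost in the scrape. A hedged sketch of what such a setup usually looks like (the `fs.s3a.*` property names are the standard Hadoop S3A ones; the extension/catalog class names are assumptions based on the qbeast-spark README; credential values are placeholders):

```shell
# Hypothetical sketch: S3A with static Access Key/Secret Key credentials
# plus the Qbeast session extension and catalog (class names assumed).
spark-shell \
  --conf spark.hadoop.fs.s3a.access.key=<ACCESS_KEY> \
  --conf spark.hadoop.fs.s3a.secret.key=<SECRET_KEY> \
  --conf spark.hadoop.fs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider \
  --conf spark.sql.extensions=io.qbeast.spark.internal.QbeastSparkSessionExtension \
  --conf spark.sql.catalog.spark_catalog=io.qbeast.spark.internal.sources.catalog.QbeastCatalog
```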
Attempt to create the table:
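The reproduction statement itself is elided in the scrape; based on the CREATE TABLE shown earlier in this thread, it was along these lines:

```sql
-- Statement reported to fail with "table already exists"
-- (bucket/path taken from the comments above).
CREATE TABLE my_table (id INT, name STRING)
USING qbeast
LOCATION 's3a://spieg-qbeast/qbeast-table'
OPTIONS ('columnsToIndex'='id,name')
```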
2. Branch and commit id:
0.6.0
3. Spark version:
3.5.1
4. Hadoop version:
3.3.4
5. How are you running Spark?
Local computer. Also reproduced in Qbeast cloud.
6. Stack trace: