
Allow Avro type with no fields #255

Open
Fryuni opened this issue Feb 28, 2020 · 1 comment
Fryuni commented Feb 28, 2020

We have one record field that can be of different types depending on what the message is about. All of them are mapped correctly to BigQuery except this one:

{
  "type": "record",
  "name": "NothingChanged",
  "fields": []
}

which results in this error:

java.lang.IllegalArgumentException: The RECORD field must have at least one sub-field
	at com.google.cloud.bigquery.Field$Builder.setType(Field.java:149)
	at com.google.cloud.bigquery.Field.newBuilder(Field.java:285)
	at com.wepay.kafka.connect.bigquery.convert.BigQuerySchemaConverter.convertStruct(BigQuerySchemaConverter.java:175)
	at com.wepay.kafka.connect.bigquery.convert.BigQuerySchemaConverter.convertField(BigQuerySchemaConverter.java:127)
	at com.wepay.kafka.connect.bigquery.convert.BigQuerySchemaConverter.convertStruct(BigQuerySchemaConverter.java:169)
	at com.wepay.kafka.connect.bigquery.convert.BigQuerySchemaConverter.convertField(BigQuerySchemaConverter.java:127)
	at com.wepay.kafka.connect.bigquery.convert.BigQuerySchemaConverter.convertSchema(BigQuerySchemaConverter.java:109)
	at com.wepay.kafka.connect.bigquery.convert.BigQuerySchemaConverter.convertSchema(BigQuerySchemaConverter.java:43)
	at com.wepay.kafka.connect.bigquery.SchemaManager.constructTableInfo(SchemaManager.java:68)
	at com.wepay.kafka.connect.bigquery.SchemaManager.createTable(SchemaManager.java:49)
	at com.wepay.kafka.connect.bigquery.BigQuerySinkConnector.ensureExistingTables(BigQuerySinkConnector.java:117)
	at com.wepay.kafka.connect.bigquery.BigQuerySinkConnector.ensureExistingTables(BigQuerySinkConnector.java:140)
	at com.wepay.kafka.connect.bigquery.BigQuerySinkConnector.start(BigQuerySinkConnector.java:159)
	at org.apache.kafka.connect.runtime.WorkerConnector.doStart(WorkerConnector.java:110)
	at org.apache.kafka.connect.runtime.WorkerConnector.start(WorkerConnector.java:135)
	at org.apache.kafka.connect.runtime.WorkerConnector.transitionTo(WorkerConnector.java:195)
	at org.apache.kafka.connect.runtime.Worker.startConnector(Worker.java:257)
	at org.apache.kafka.connect.runtime.distributed.DistributedHerder.startConnector(DistributedHerder.java:1183)
	at org.apache.kafka.connect.runtime.distributed.DistributedHerder.access$1400(DistributedHerder.java:125)
	at org.apache.kafka.connect.runtime.distributed.DistributedHerder$15.call(DistributedHerder.java:1199)
	at org.apache.kafka.connect.runtime.distributed.DistributedHerder$15.call(DistributedHerder.java:1195)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Could record types with zero fields simply be ignored when generating the schema for BigQuery?
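The proposed behaviour could be sketched as a filter applied while walking the struct's fields during schema conversion. The model below is purely illustrative (a minimal stand-in for the Kafka Connect `Schema`/`Field` types; names like `Node` and `convert` are hypothetical and not part of the connector's API), showing a nested struct with no sub-fields being dropped instead of triggering the `IllegalArgumentException`:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical, minimal stand-in for a Connect-style schema tree;
// type and method names here are illustrative, not the connector's real API.
public class SkipEmptyStructs {
    enum Type { STRUCT, STRING, INT64 }

    record Node(String name, Type type, List<Node> fields) {}

    // Convert a struct's children to column names, silently dropping any
    // nested STRUCT with zero sub-fields -- the case that currently fails
    // with "The RECORD field must have at least one sub-field".
    static List<String> convert(Node struct) {
        List<String> columns = new ArrayList<>();
        for (Node field : struct.fields()) {
            if (field.type() == Type.STRUCT && field.fields().isEmpty()) {
                continue; // skip instead of building an invalid RECORD field
            }
            columns.add(field.name());
        }
        return columns;
    }

    public static void main(String[] args) {
        Node empty = new Node("NothingChanged", Type.STRUCT, List.of());
        Node root = new Node("root", Type.STRUCT,
                List.of(new Node("id", Type.STRING, List.of()),
                        empty,
                        new Node("count", Type.INT64, List.of())));
        System.out.println(convert(root)); // prints [id, count]
    }
}
```

In the real converter this check would presumably live in `BigQuerySchemaConverter.convertStruct`, before `Field.newBuilder` is called for the sub-field, so the empty record never reaches the BigQuery client's validation.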

@anderseriksson
Contributor

We have a similar situation with a struct without fields. It seems version 1.6.5 (with some extra commits) of the connector is actually ignoring this field in BigQuery...
