
[ISSUE] Issue with databricks_cluster resource. Can't specify the boot disk size and can't set 0 as number of local SSD disks on GCP #4439

Open
micheledaddetta-databricks opened this issue Jan 27, 2025 · 2 comments
Labels
plugin framework This issue will be resolved when we migrate towards using the plugin framework.


Configuration

resource "databricks_cluster" "all_purpose" {
  provider = databricks.workspace

  cluster_name = "All purpose cluster"

  spark_version           = "14.3.x-scala2.12" //data.databricks_spark_version.latest.id
  runtime_engine          = "STANDARD" #PHOTON
  node_type_id            = "n2d-highmem-8"
  driver_node_type_id     = "n2d-highmem-8"
  autotermination_minutes = 30
  enable_elastic_disk     = false
  num_workers             = 3

  custom_tags = {
    "x-databricks-nextgen-cluster" : "true"
  }
  spark_conf = {}

  gcp_attributes {
    google_service_account = var.cluster_service_account
    boot_disk_size         = 50
    local_ssd_count        = 0
    availability           = "ON_DEMAND_GCP"
  }
}

Expected Behavior

The cluster is created with no local SSDs and a 50 GB boot disk.

The plan is expected to contain the following:

# module.ws-assets.databricks_cluster.all_purpose will be created
  + resource "databricks_cluster" "all_purpose" {
      ...
      + gcp_attributes {
          + availability           = "ON_DEMAND_GCP"
          + boot_disk_size         = 50
          + google_service_account = "[email protected]"
          + local_ssd_count        = 0
        }
    }

Actual Behavior

The cluster is created with a 30 GB boot disk and 2 local SSDs (the default values).

The plan omits local_ssd_count entirely:

# module.ws-assets.databricks_cluster.all_purpose will be created
  + resource "databricks_cluster" "all_purpose" {
      ...
      + gcp_attributes {
          + availability           = "ON_DEMAND_GCP"
          + boot_disk_size         = 50
          + google_service_account = "[email protected]"
        }
    }
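
A likely root cause (an assumption, not confirmed in this thread): providers built on the legacy SDKv2 typically serialize request structs with Go's encoding/json omitempty tags, which cannot distinguish an explicit 0 from an unset field, so local_ssd_count = 0 never reaches the API and the backend default applies. A minimal sketch with a hypothetical struct (field names are illustrative, not the provider's actual types):

package main

import (
	"encoding/json"
	"fmt"
)

// Hypothetical struct standing in for the provider's GCP attributes
// request payload; the types here are illustrative only.
type GcpAttributes struct {
	Availability  string `json:"availability,omitempty"`
	BootDiskSize  int    `json:"boot_disk_size,omitempty"`
	LocalSSDCount int    `json:"local_ssd_count,omitempty"`
}

func main() {
	attrs := GcpAttributes{
		Availability:  "ON_DEMAND_GCP",
		BootDiskSize:  50,
		LocalSSDCount: 0, // explicit zero: omitempty drops it
	}
	out, _ := json.Marshal(attrs)
	// Prints {"availability":"ON_DEMAND_GCP","boot_disk_size":50}:
	// local_ssd_count is absent from the request, so the backend
	// default (2 local SSDs) wins.
	fmt.Println(string(out))
}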

Steps to Reproduce

  1. Copy the provided configuration
  2. Execute terraform apply

Terraform and provider versions

Terraform v1.10.0
Databricks provider v1.64.0

Is it a regression?

N/A

Debug Output

tf-debug.log

Important Factoids

Would you like to implement a fix?

I'm available to help implement the fix.

alexott (Contributor) commented on Jan 27, 2025

@micheledaddetta-databricks it's a known problem, but it's not so easy to solve (see #4395)
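
For context on the plugin framework label: the newer terraform-plugin-framework models attribute values as nullable types, so an explicit 0 is distinguishable from an omitted attribute. A minimal sketch of that distinction (illustrative, not the provider's actual code):

package main

import (
	"fmt"

	"github.com/hashicorp/terraform-plugin-framework/types"
)

func main() {
	unset := types.Int64Null()  // attribute omitted from the config
	zero := types.Int64Value(0) // attribute explicitly set to 0

	fmt.Println(unset.IsNull()) // true:  safe to leave out of the API call
	fmt.Println(zero.IsNull())  // false: must be sent to the API as 0
}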

alexott changed the title from "[ISSUE] Issue with databricks_XXX resource" to "[ISSUE] Issue with databricks_cluster resource. Can't specify 0 as number of local SSD disks on GCP" on Jan 27, 2025
micheledaddetta-databricks (Author) commented

@alexott is the boot disk size also part of the fix? As a result of the apply, the disk is created with the Databricks image default size rather than the custom size.

micheledaddetta-databricks changed the title from "[ISSUE] Issue with databricks_cluster resource. Can't specify 0 as number of local SSD disks on GCP" to "[ISSUE] Issue with databricks_cluster resource. Can't specify the boot disk size and can't set 0 as number of local SSD disks on GCP" on Feb 5, 2025
alexott added the label "plugin framework" (This issue will be resolved when we migrate towards using the plugin framework.) on Feb 16, 2025