Merge branch 'main' into hshahconsulting-add-tags-support
alexott authored Jan 2, 2025
2 parents 746a706 + bc6518a commit 5ce4f60
Showing 66 changed files with 148,295 additions and 6,935 deletions.
20 changes: 20 additions & 0 deletions .gitattributes
Original file line number Diff line number Diff line change
@@ -1,20 +1,40 @@
internal/service/apps_tf/legacy_model.go linguist-generated=true
internal/service/apps_tf/model.go linguist-generated=true
internal/service/billing_tf/legacy_model.go linguist-generated=true
internal/service/billing_tf/model.go linguist-generated=true
internal/service/catalog_tf/legacy_model.go linguist-generated=true
internal/service/catalog_tf/model.go linguist-generated=true
internal/service/cleanrooms_tf/legacy_model.go linguist-generated=true
internal/service/cleanrooms_tf/model.go linguist-generated=true
internal/service/compute_tf/legacy_model.go linguist-generated=true
internal/service/compute_tf/model.go linguist-generated=true
internal/service/dashboards_tf/legacy_model.go linguist-generated=true
internal/service/dashboards_tf/model.go linguist-generated=true
internal/service/files_tf/legacy_model.go linguist-generated=true
internal/service/files_tf/model.go linguist-generated=true
internal/service/iam_tf/legacy_model.go linguist-generated=true
internal/service/iam_tf/model.go linguist-generated=true
internal/service/jobs_tf/legacy_model.go linguist-generated=true
internal/service/jobs_tf/model.go linguist-generated=true
internal/service/marketplace_tf/legacy_model.go linguist-generated=true
internal/service/marketplace_tf/model.go linguist-generated=true
internal/service/ml_tf/legacy_model.go linguist-generated=true
internal/service/ml_tf/model.go linguist-generated=true
internal/service/oauth2_tf/legacy_model.go linguist-generated=true
internal/service/oauth2_tf/model.go linguist-generated=true
internal/service/pipelines_tf/legacy_model.go linguist-generated=true
internal/service/pipelines_tf/model.go linguist-generated=true
internal/service/provisioning_tf/legacy_model.go linguist-generated=true
internal/service/provisioning_tf/model.go linguist-generated=true
internal/service/serving_tf/legacy_model.go linguist-generated=true
internal/service/serving_tf/model.go linguist-generated=true
internal/service/settings_tf/legacy_model.go linguist-generated=true
internal/service/settings_tf/model.go linguist-generated=true
internal/service/sharing_tf/legacy_model.go linguist-generated=true
internal/service/sharing_tf/model.go linguist-generated=true
internal/service/sql_tf/legacy_model.go linguist-generated=true
internal/service/sql_tf/model.go linguist-generated=true
internal/service/vectorsearch_tf/legacy_model.go linguist-generated=true
internal/service/vectorsearch_tf/model.go linguist-generated=true
internal/service/workspace_tf/legacy_model.go linguist-generated=true
internal/service/workspace_tf/model.go linguist-generated=true
20 changes: 20 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,25 @@
# Version changelog

## [Release] Release v1.62.1

### Bug Fixes

* Reflect backend updates in state for databricks_app ([#4337](https://github.com/databricks/terraform-provider-databricks/pull/4337)).


### Documentation

* Update `databricks_workspace_conf` documentation ([#4334](https://github.com/databricks/terraform-provider-databricks/pull/4334)).
* apply `make fmt-docs` to all docs ([#4344](https://github.com/databricks/terraform-provider-databricks/pull/4344)).


### Internal Changes

* Generate both SdkV2-compatible and Plugin Framework-compatible structures ([#4332](https://github.com/databricks/terraform-provider-databricks/pull/4332)).
* Mark TestAccServicePrincipalResourceOnAzure test as flaky ([#4333](https://github.com/databricks/terraform-provider-databricks/pull/4333)).
* Retry on 504 when calling the permission API ([#4355](https://github.com/databricks/terraform-provider-databricks/pull/4355)).


## [Release] Release v1.62.0

### New Features and Improvements
9 changes: 6 additions & 3 deletions CONTRIBUTING.md
@@ -120,7 +120,10 @@ We are migrating the resource from SDKv2 to Plugin Framework provider and hence

### Adding a new resource
1. Check if the directory for this particular resource exists under `internal/providers/pluginfw/products`; if not, create it, e.g. `cluster`, `volume`, etc. Please note: resources and data sources are organized under the same package for that service.
- 2. Create a file with `resource_resource-name.go` and write the CRUD methods and schema for that resource. For reference, please take a look at existing resources, e.g. `resource_quality_monitor.go`. Make sure to set the user agent in all the CRUD methods. In `Metadata()`, if the resource is to be used as default, use the method `GetDatabricksProductionName()`, else use `GetDatabricksStagingName()`, which suffixes the name with `_pluginframework`.
+ 2. Create a file with `resource_resource-name.go` and write the CRUD methods and schema for that resource. For reference, please take a look at existing resources, e.g. `resource_app.go`.
+    - Make sure to set the user agent in all the CRUD methods.
+    - In `Metadata()`, use the method `GetDatabricksProductionName()`.
+    - In the `Schema()` method, import the appropriate struct from the `internal/service/{package}_tf` package and use the `ResourceStructToSchema` method to convert the struct to a schema. Use the struct that does not have the `_SdkV2` suffix.
3. Create a file with `resource_resource-name_acc_test.go` and add integration tests here.
4. Create a file with `resource_resource-name_test.go` and add unit tests here. Note: please make sure to abstract specific methods of the resource so they are unit-test friendly, rather than testing internals of the terraform-plugin-framework library. You can compare the diagnostics; for example, see `data_cluster_test.go`.
5. Add the resource under `internal/providers/pluginfw/pluginfw.go` in `Resources()` method. Please update the list so that it stays in alphabetically sorted order.
@@ -139,9 +142,9 @@ There must not be any behaviour change or schema change when migrating a resource
- Please make sure there are no breaking differences due to changes in schema by running: `make diff-schema`.
- Integration tests shouldn't require any major changes.
- By default, `ResourceStructToSchema` will convert a `types.List` field to a `ListAttribute` or `ListNestedAttribute`. For resources or data sources migrated from the SDKv2, `ListNestedBlock` must be used for such fields. To do this, call `cs.ConfigureAsSdkV2Compatible()` in the `ResourceStructToSchema` callback:
+ By default, `ResourceStructToSchema` will convert a `types.List` field to a `ListAttribute` or `ListNestedAttribute`. For resources or data sources migrated from the SDKv2, `ListNestedBlock` must be used for such fields. To do this, use the `_SdkV2` variant from the `internal/service/{package}_tf` package when defining the resource schema and when interacting with the plan, config and state. Additionally, in the `Schema()` method, call `cs.ConfigureAsSdkV2Compatible()` in the `ResourceStructToSchema` callback:
```go
- resp.Schema = tfschema.ResourceStructToSchema(ctx, Resource{}, func(cs tfschema.CustomizableSchema) tfschema.CustomizableSchema {
+ resp.Schema = tfschema.ResourceStructToSchema(ctx, Resource_SdkV2{}, func(cs tfschema.CustomizableSchema) tfschema.CustomizableSchema {
cs.ConfigureAsSdkV2Compatible()
// Add any additional configuration here
return cs
})
```
22 changes: 20 additions & 2 deletions common/retry.go
@@ -2,9 +2,11 @@ package common

import (
"context"
-	"log"
+	"errors"
"regexp"

"github.com/databricks/databricks-sdk-go/apierr"
"github.com/databricks/databricks-sdk-go/logger"
"github.com/databricks/databricks-sdk-go/retries"
)

@@ -15,11 +17,27 @@ func RetryOnTimeout[T any](ctx context.Context, f func(context.Context) (*T, err
msg := err.Error()
isTimeout := timeoutRegex.MatchString(msg)
if isTimeout {
-			log.Printf("[DEBUG] Retrying due to timeout: %s", msg)
+			logger.Debugf(ctx, "Retrying due to timeout: %s", msg)
}
return isTimeout
}))
return r.Run(ctx, func(ctx context.Context) (*T, error) {
return f(ctx)
})
}

// RetryOn504 calls the given function until it either succeeds or
// returns an error that is different from [apierr.ErrDeadlineExceeded],
// with no overall timeout.
func RetryOn504[T any](ctx context.Context, f func(context.Context) (*T, error)) (*T, error) {
r := retries.New[T](retries.WithTimeout(-1), retries.WithRetryFunc(func(err error) bool {
if !errors.Is(err, apierr.ErrDeadlineExceeded) {
return false
}
logger.Debugf(ctx, "Retrying on error 504")
return true
}))
return r.Run(ctx, func(ctx context.Context) (*T, error) {
return f(ctx)
})
}
85 changes: 85 additions & 0 deletions common/retry_test.go
@@ -5,6 +5,7 @@ import (
"errors"
"testing"

"github.com/databricks/databricks-sdk-go/apierr"
"github.com/databricks/databricks-sdk-go/experimental/mocks"
"github.com/databricks/databricks-sdk-go/service/workspace"
"github.com/stretchr/testify/assert"
@@ -47,3 +48,87 @@ func TestRetryOnTimeout_NonRetriableError(t *testing.T) {
})
assert.ErrorIs(t, err, expected)
}

func TestRetryOn504_noError(t *testing.T) {
wantErr := error(nil)
wantRes := (*workspace.ObjectInfo)(nil)
wantCalls := 1

w := mocks.NewMockWorkspaceClient(t)
api := w.GetMockWorkspaceAPI().EXPECT()
api.GetStatusByPath(mock.Anything, mock.Anything).Return(wantRes, wantErr)

gotCalls := 0
gotRes, gotErr := RetryOn504(context.Background(), func(ctx context.Context) (*workspace.ObjectInfo, error) {
gotCalls += 1
return w.WorkspaceClient.Workspace.GetStatusByPath(ctx, "path")
})

assert.ErrorIs(t, gotErr, wantErr)
assert.Equal(t, gotRes, wantRes)
assert.Equal(t, gotCalls, wantCalls)
}

func TestRetryOn504_errorNot504(t *testing.T) {
wantErr := errors.New("test error")
wantRes := (*workspace.ObjectInfo)(nil)
wantCalls := 1

w := mocks.NewMockWorkspaceClient(t)
api := w.GetMockWorkspaceAPI().EXPECT()
api.GetStatusByPath(mock.Anything, mock.Anything).Return(wantRes, wantErr)

gotCalls := 0
gotRes, gotErr := RetryOn504(context.Background(), func(ctx context.Context) (*workspace.ObjectInfo, error) {
gotCalls += 1
return w.WorkspaceClient.Workspace.GetStatusByPath(ctx, "path")
})

assert.ErrorIs(t, gotErr, wantErr)
assert.Equal(t, gotRes, wantRes)
assert.Equal(t, gotCalls, wantCalls)
}

func TestRetryOn504_error504ThenFail(t *testing.T) {
wantErr := errors.New("test error")
wantRes := (*workspace.ObjectInfo)(nil)
wantCalls := 2

w := mocks.NewMockWorkspaceClient(t)
api := w.GetMockWorkspaceAPI().EXPECT()
call := api.GetStatusByPath(mock.Anything, mock.Anything).Return(nil, apierr.ErrDeadlineExceeded)
call.Repeatability = 1
api.GetStatusByPath(mock.Anything, mock.Anything).Return(wantRes, wantErr)

gotCalls := 0
gotRes, gotErr := RetryOn504(context.Background(), func(ctx context.Context) (*workspace.ObjectInfo, error) {
gotCalls++
return w.WorkspaceClient.Workspace.GetStatusByPath(ctx, "path")
})

assert.ErrorIs(t, gotErr, wantErr)
assert.Equal(t, gotRes, wantRes)
assert.Equal(t, gotCalls, wantCalls)
}

func TestRetryOn504_error504ThenSuccess(t *testing.T) {
wantErr := error(nil)
wantRes := &workspace.ObjectInfo{}
wantCalls := 2

w := mocks.NewMockWorkspaceClient(t)
api := w.GetMockWorkspaceAPI().EXPECT()
call := api.GetStatusByPath(mock.Anything, mock.Anything).Return(nil, apierr.ErrDeadlineExceeded)
call.Repeatability = 1
api.GetStatusByPath(mock.Anything, mock.Anything).Return(wantRes, wantErr)

gotCalls := 0
gotRes, gotErr := RetryOn504(context.Background(), func(ctx context.Context) (*workspace.ObjectInfo, error) {
gotCalls++
return w.WorkspaceClient.Workspace.GetStatusByPath(ctx, "path")
})

assert.ErrorIs(t, gotErr, wantErr)
assert.Equal(t, gotRes, wantRes)
assert.Equal(t, gotCalls, wantCalls)
}
2 changes: 1 addition & 1 deletion common/version.go
@@ -3,7 +3,7 @@ package common
import "context"

var (
-	version = "1.62.0"
+	version = "1.62.1"
// ResourceName is resource name without databricks_ prefix
ResourceName contextKey = 1
// Provider is the current instance of provider
2 changes: 1 addition & 1 deletion docs/data-sources/app.md
@@ -13,7 +13,7 @@ This data source allows you to fetch information about a Databricks App.

```hcl
data "databricks_app" "this" {
  name = "my-custom-app"
}
```

2 changes: 1 addition & 1 deletion docs/data-sources/mws_network_connectivity_configs.md
@@ -33,7 +33,7 @@ provider "databricks" {
}
data "databricks_mws_network_connectivity_configs" "this" {
  region = "us-east-1"
}
output "filtered" {
2 changes: 1 addition & 1 deletion docs/data-sources/serving_endpoints.md
@@ -14,7 +14,7 @@ data "databricks_serving_endpoints" "all" {
}
resource "databricks_permissions" "ml_serving_usage" {
  for_each            = databricks_serving_endpoints.all.endpoints
serving_endpoint_id = each.value.id
access_control {
35 changes: 18 additions & 17 deletions docs/resources/app.md
@@ -11,28 +11,29 @@ subcategory: "Apps"

```hcl
resource "databricks_app" "this" {
  name        = "my-custom-app"
  description = "My app"
  resources = [{
    name = "sql-warehouse"
    sql_warehouse = {
      id         = "e9ca293f79a74b5c"
      permission = "CAN_MANAGE"
    }
    },
    {
      name = "serving-endpoint"
      serving_endpoint = {
        name       = "databricks-meta-llama-3-1-70b-instruct"
        permission = "CAN_MANAGE"
      }
    },
    {
      name = "job"
      job = {
        id         = "1234"
        permission = "CAN_MANAGE"
      }
  }]
}
```
14 changes: 7 additions & 7 deletions docs/resources/custom_app_integration.md
@@ -11,13 +11,13 @@ This resource allows you to enable [custom OAuth applications](https://docs.data

```hcl
resource "databricks_custom_app_integration" "this" {
  name          = "custom_integration_name"
  redirect_urls = ["https://example.com"]
  scopes        = ["all-apis"]
  token_access_policy {
    access_token_ttl_in_minutes  = 15
    refresh_token_ttl_in_minutes = 30
  }
}
```

4 changes: 2 additions & 2 deletions docs/resources/grant.md
@@ -50,8 +50,8 @@ See [databricks_grants Catalog grants](grants.md#catalog-grants) for the list of

```hcl
resource "databricks_catalog" "sandbox" {
  name    = "sandbox"
  comment = "this catalog is managed by terraform"
properties = {
purpose = "testing"
}
}
```
1 change: 1 addition & 0 deletions docs/resources/job.md
@@ -108,6 +108,7 @@ The resource supports the following arguments:
* `health` - (Optional) An optional block that specifies the health conditions for the job [documented below](#health-configuration-block).
* `tags` - (Optional) An optional map of the tags associated with the job. See [tags Configuration Map](#tags-configuration-map)
* `budget_policy_id` - (Optional) The ID of the user-specified budget policy to use for this job. If not specified, a default budget policy may be applied when creating or modifying the job.
* `edit_mode` - (Optional) If `"UI_LOCKED"`, the user interface for the job will be locked. If `"EDITABLE"` (the default), the user interface will be editable.

### task Configuration Block

4 changes: 2 additions & 2 deletions docs/resources/lakehouse_monitor.md
@@ -37,8 +37,8 @@ resource "databricks_sql_table" "myTestTable" {
data_source_format = "DELTA"
column {
    name = "timestamp"
    type = "int"
}
}
4 changes: 2 additions & 2 deletions docs/resources/quality_monitor.md
@@ -38,8 +38,8 @@ resource "databricks_sql_table" "myTestTable" {
data_source_format = "DELTA"
column {
    name = "timestamp"
    type = "int"
}
}
2 changes: 1 addition & 1 deletion docs/resources/workspace_conf.md
@@ -15,7 +15,7 @@ Manages workspace configuration for expert usage. Currently, more than one insta
Allows specification of custom configuration properties for expert usage:

- `enableIpAccessLists` - enables the use of [databricks_ip_access_list](ip_access_list.md) resources
- - `maxTokenLifetimeDays` - (string) Maximum token lifetime of new tokens in days, as an integer. If zero, new tokens are permitted to have no lifetime limit. Negative numbers are unsupported. **WARNING:** This limit only applies to new tokens, so there may be tokens with lifetimes longer than this value, including unlimited lifetime. Such tokens may have been created before the current maximum token lifetime was set.
+ - `maxTokenLifetimeDays` - (string) Maximum token lifetime of new tokens in days, as an integer. This value can range from 1 day to 730 days (2 years). If not specified, the maximum lifetime of new tokens is 730 days. **WARNING:** This limit only applies to new tokens, so there may be tokens with lifetimes longer than this value, including unlimited lifetime. Such tokens may have been created before the current maximum token lifetime was set.
- `enableTokensConfig` - (boolean) Enable or disable personal access tokens for this workspace.
- `enableDeprecatedClusterNamedInitScripts` - (boolean) Enable or disable [legacy cluster-named init scripts](https://docs.databricks.com/clusters/init-scripts.html#disable-legacy-cluster-named-init-scripts-for-a-workspace) for this workspace.
- `enableDeprecatedGlobalInitScripts` - (boolean) Enable or disable [legacy global init scripts](https://docs.databricks.com/clusters/init-scripts.html#migrate-legacy-scripts) for this workspace.