Merge pull request #283 from dbt-labs/TFPCLD-9
b-per authored Aug 19, 2024
2 parents 3e2fa34 + 099da93 commit d5c8d9e
Showing 36 changed files with 2,449 additions and 117 deletions.
64 changes: 33 additions & 31 deletions .goreleaser.yml
@@ -4,41 +4,43 @@ before:
   hooks:
     # this is just an example and not a requirement for provider building/publishing
     - go mod tidy
+    - go generate ./...

 builds:
-  - env:
-      # goreleaser does not work with CGO, it could also complicate
-      # usage by users in CI/CD systems like Terraform Cloud where
-      # they are unable to install libraries.
-      - CGO_ENABLED=0
-    mod_timestamp: '{{ .CommitTimestamp }}'
-    flags:
-      - -trimpath
-    ldflags:
-      - '-s -w -X main.version={{.Version}} -X main.commit={{.Commit}} -X "github.com/dbt-labs/terraform-provider-dbtcloud/pkg/dbt_cloud.versionString={{.Env.VERSION}}"'
-    goos:
-      - freebsd
-      - windows
-      - linux
-      - darwin
-    goarch:
-      - amd64
-      - '386'
-      - arm
-      - arm64
-    ignore:
-      - goos: darwin
-        goarch: '386'
-    binary: '{{ .ProjectName }}_v{{ .Version }}'
+  - env:
+      # goreleaser does not work with CGO, it could also complicate
+      # usage by users in CI/CD systems like Terraform Cloud where
+      # they are unable to install libraries.
+      - CGO_ENABLED=0
+    mod_timestamp: "{{ .CommitTimestamp }}"
+    flags:
+      - -trimpath
+    ldflags:
+      - '-s -w -X main.version={{.Version}} -X main.commit={{.Commit}} -X "github.com/dbt-labs/terraform-provider-dbtcloud/pkg/dbt_cloud.versionString={{.Env.VERSION}}"'
+    goos:
+      - freebsd
+      - windows
+      - linux
+      - darwin
+    goarch:
+      - amd64
+      - "386"
+      - arm
+      - arm64
+    ignore:
+      - goos: darwin
+        goarch: "386"
+    binary: "{{ .ProjectName }}_v{{ .Version }}"
 archives:
-  - format: zip
-    name_template: '{{ .ProjectName }}_{{ .Version }}_{{ .Os }}_{{ .Arch }}'
+  - format: zip
+    name_template: "{{ .ProjectName }}_{{ .Version }}_{{ .Os }}_{{ .Arch }}"
 checksum:
-  name_template: '{{ .ProjectName }}_{{ .Version }}_SHA256SUMS'
+  name_template: "{{ .ProjectName }}_{{ .Version }}_SHA256SUMS"
   algorithm: sha256
 signs:
   - artifacts: checksum
     args:
       # if you are using this in a GitHub action or some other automated pipeline, you
       # need to pass the batch flag to indicate its not interactive.
       - "--batch"
       - "--local-user"
@@ -48,7 +50,7 @@ signs:
       - "--detach-sign"
       - "${artifact}"
 # release:
 #   If you want to manually examine the release before its live, uncomment this line:
 #   draft: true
 changelog:
-  skip: true
+  disable: true
14 changes: 13 additions & 1 deletion CHANGELOG.md
@@ -2,12 +2,24 @@

All notable changes to this project will be documented in this file.

- ## [Unreleased](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.3.10...HEAD)
+ ## [Unreleased](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.3.11...HEAD)

# [0.3.11](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.3.9...v0.3.11)

### Changes

- [#267](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/267) Support for global connections
  - `dbtcloud_environment` now accepts a `connection_id` to link the environment to the connection. This is the new recommended way to link connections to environments, instead of linking the connection to the project with `dbtcloud_project_connection`.
  - The `dbtcloud_project_connection` resource still works today and, when used, doesn't require setting a `connection_id` in the `dbtcloud_environment` resource (i.e., any current config/module should continue working), but the resource is flagged as deprecated and will be removed in a future version of the provider.
  - For now, people can continue using the project-scoped connection resources `dbtcloud_connection`, `dbtcloud_bigquery_connection` and `dbtcloud_fabric_connection` for creating and updating global connections. The `project_id` parameter in those connections still needs to be a valid project ID, but it doesn't mean the connection is restricted to that project: project-scoped connections created from Terraform are automatically converted to global connections.
  - A new resource `dbtcloud_global_connection` has been created and currently supports Snowflake and BigQuery connections. In the coming weeks, support for all the Data Warehouses will be added to this resource.
  - When a data warehouse is supported in `dbtcloud_global_connection`, we recommend using this new resource instead of the legacy project-scoped connection resources. Those resources will be deprecated in a future version of the provider.
- [#278](https://github.com/dbt-labs/terraform-provider-dbtcloud/pull/278) Deprecate `state` attribute in the resources and datasources that use it. It will be removed in the next major version of the provider. This attribute is used for soft-delete and isn't intended to be configured in the scope of the provider.
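As a sketch of the migration described in #267 (resource and project names here are illustrative, not taken from this changelog), the new pattern links the environment directly to a global connection instead of going through `dbtcloud_project_connection`:

```terraform
# Hypothetical names; sketch of the recommended global-connection pattern
resource "dbtcloud_global_connection" "snowflake" {
  name = "My Snowflake connection"
  snowflake = {
    account   = "my-snowflake-account"
    database  = "MY_DATABASE"
    warehouse = "MY_WAREHOUSE"
  }
}

# Link the environment to the connection via connection_id,
# instead of using the deprecated dbtcloud_project_connection
resource "dbtcloud_environment" "prod" {
  project_id    = dbtcloud_project.my_project.id
  name          = "Prod"
  type          = "deployment"
  connection_id = dbtcloud_global_connection.snowflake.id
}
```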

### Fix

- [#281](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/281) Fix the datasource `dbtcloud_environments` where the environment IDs were not being saved

# [0.3.10](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.3.9...v0.3.10)

### Changes
1 change: 1 addition & 0 deletions docs/data-sources/environment.md
@@ -22,6 +22,7 @@ Retrieve data for a single environment

### Read-Only

- `connection_id` (Number) A connection ID (used with Global Connections)
- `credentials_id` (Number) The project ID to which the environment belongs
- `custom_branch` (String) The custom branch name to use
- `dbt_version` (String) Version number of dbt to use in this environment.
1 change: 1 addition & 0 deletions docs/data-sources/environments.md
@@ -28,6 +28,7 @@ Retrieve data for multiple environments

Read-Only:

- `connection_id` (Number) A connection ID (used with Global Connections)
- `credentials_id` (Number) Credential ID to create the environment with. A credential is not required for development environments but is required for deployment environments
- `custom_branch` (String) The custom branch name to use
- `dbt_version` (String) Version number of dbt to use in this environment.
2 changes: 1 addition & 1 deletion docs/data-sources/privatelink_endpoint.md
@@ -43,5 +43,5 @@ data "dbtcloud_privatelink_endpoint" "test_with_name_and_url" {

- `cidr_range` (String) The CIDR range of the PrivateLink Endpoint
- `id` (String) The internal ID of the PrivateLink Endpoint
- - `state` (Number) PrivatelinkEndpoint state should be 1 = active, as 2 = deleted
+ - `state` (Number, Deprecated) PrivatelinkEndpoint state should be 1 = active, as 2 = deleted
- `type` (String) Type of the PrivateLink Endpoint
2 changes: 1 addition & 1 deletion docs/data-sources/project.md
@@ -43,4 +43,4 @@ data "dbtcloud_project" "test_project" {
- `freshness_job_id` (Number) ID of Job for source freshness
- `id` (String) The ID of this resource.
- `repository_id` (Number) ID of the repository associated with the project
- - `state` (Number) Project state should be 1 = active, as 2 = deleted
+ - `state` (Number, Deprecated) Project state should be 1 = active, as 2 = deleted
3 changes: 3 additions & 0 deletions docs/resources/bigquery_connection.md
@@ -3,13 +3,16 @@ page_title: "dbtcloud_bigquery_connection Resource - dbtcloud"
subcategory: ""
description: |-
Resource to create BigQuery connections in dbt Cloud. Can be set to use OAuth for developers.
  ~> This resource is going to be deprecated in the future, please use the dbtcloud_global_connection resource instead to create BigQuery connections.
---

# dbtcloud_bigquery_connection (Resource)


Resource to create BigQuery connections in dbt Cloud. Can be set to use OAuth for developers.

~> This resource is going to be deprecated in the future, please use the `dbtcloud_global_connection` resource instead to create BigQuery connections.
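As a sketch of the recommended replacement (all values below are placeholders), the same BigQuery connection can be defined with the new resource:

```terraform
# Sketch only: placeholder values, see the dbtcloud_global_connection docs
resource "dbtcloud_global_connection" "bigquery" {
  name = "My BigQuery connection"
  bigquery = {
    gcp_project_id              = "my-gcp-project-id"
    timeout_seconds             = 1000
    private_key_id              = "my-private-key-id"
    private_key                 = "MY_PRIVATE_KEY"
    client_email                = "my_client_email"
    client_id                   = "my_client_id"
    auth_uri                    = "my_auth_uri"
    token_uri                   = "my_token_uri"
    auth_provider_x509_cert_url = "my_auth_provider_x509_cert_url"
    client_x509_cert_url        = "my_client_x509_cert_url"
  }
}
```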

## Example Usage

```terraform
18 changes: 17 additions & 1 deletion docs/resources/environment.md
@@ -2,13 +2,21 @@
page_title: "dbtcloud_environment Resource - dbtcloud"
subcategory: ""
description: |-
Resource to manage dbt Cloud environments for the different dbt Cloud projects.
In a given dbt Cloud project, one development environment can be defined and as many deployment environments as needed can be created.
~> In August 2024, dbt Cloud released the "global connection" feature, allowing connections to be defined at the account level and reused across environments and projects.
  In this version of the provider, connection_id is an optional field, but it is recommended to start setting it in your projects. In future versions, this field will become mandatory.
---

# dbtcloud_environment (Resource)


Resource to manage dbt Cloud environments for the different dbt Cloud projects.

In a given dbt Cloud project, one development environment can be defined and as many deployment environments as needed can be created.

~> In August 2024, dbt Cloud released the "global connection" feature, allowing connections to be defined at the account level and reused across environments and projects.
In this version of the provider, `connection_id` is an optional field, but it is recommended to start setting it in your projects. In future versions, this field will become mandatory.

## Example Usage

@@ -20,6 +28,7 @@ resource "dbtcloud_environment" "ci_environment" {
project_id = dbtcloud_project.dbt_project.id
type = "deployment"
credential_id = dbtcloud_snowflake_credential.ci_credential.credential_id
connection_id = dbtcloud_global_connection.my_global_connection.id
}
// we can also set a deployment environment as being the production one
@@ -30,6 +39,7 @@ resource "dbtcloud_environment" "prod_environment" {
type = "deployment"
credential_id = dbtcloud_snowflake_credential.prod_credential.credential_id
deployment_type = "production"
connection_id = dbtcloud_connection.my_legacy_connection.connection_id
}
// Creating a development environment
@@ -38,6 +48,7 @@ resource "dbtcloud_environment" "dev_environment" {
name = "Dev"
project_id = dbtcloud_project.dbt_project.id
type = "development"
connection_id = dbtcloud_global_connection.my_other_global_connection.id
}
```

@@ -53,6 +64,11 @@ resource "dbtcloud_environment" "dev_environment" {

### Optional

- `connection_id` (Number) The ID of the connection to use (can be the `id` of a `dbtcloud_global_connection` or the `connection_id` of a legacy connection).
  - At the moment, it is optional and the environment will use the connection set in `dbtcloud_project_connection` if `connection_id` is not set in this resource
  - In future versions this field will become required, so it is recommended to set it from now on
  - When configuring this field, it must be set for all the environments of the project
  - To avoid Terraform state issues when using this field, either remove the `dbtcloud_project_connection` resource from the project, or make sure that its `connection_id` matches the `connection_id` of the project's development environment
- `credential_id` (Number) Credential ID to create the environment with. A credential is not required for development environments but is required for deployment environments
- `custom_branch` (String) Which custom branch to use in this environment
- `deployment_type` (String) The type of environment. Only valid for environments of type 'deployment' and for now can only be 'production', 'staging' or left empty for generic environments
2 changes: 1 addition & 1 deletion docs/resources/extended_attributes.md
@@ -51,7 +51,7 @@ resource "dbtcloud_environment" "issue_depl" {

### Optional

- - `state` (Number) Extended Attributes state (1 is active, 2 is inactive)
+ - `state` (Number, Deprecated) Extended Attributes state (1 is active, 2 is inactive)

### Read-Only

127 changes: 127 additions & 0 deletions docs/resources/global_connection.md
@@ -0,0 +1,127 @@
---
page_title: "dbtcloud_global_connection Resource - dbtcloud"
subcategory: ""
description: |-
This resource can be used to create global connections as introduced in dbt Cloud in August 2024.
Those connections are not linked to a project and can be linked to environments from different projects by using the connection_id field in the dbtcloud_environment resource.
  For now, only BigQuery and Snowflake connections are supported; the other Data Warehouses can continue using the existing resources dbtcloud_connection and dbtcloud_fabric_connection,
  but all Data Warehouses will soon be supported under this resource and the other resources will be deprecated in the future.
---

# dbtcloud_global_connection (Resource)


This resource can be used to create global connections as introduced in dbt Cloud in August 2024.

Those connections are not linked to a project and can be linked to environments from different projects by using the `connection_id` field in the `dbtcloud_environment` resource.

For now, only BigQuery and Snowflake connections are supported; the other Data Warehouses can continue using the existing resources `dbtcloud_connection` and `dbtcloud_fabric_connection`,
but all Data Warehouses will soon be supported under this resource and the other resources will be deprecated in the future.

## Example Usage

```terraform
resource "dbtcloud_global_connection" "snowflake" {
name = "My Snowflake connection"
// we can set Privatelink if needed
private_link_endpoint_id = data.dbtcloud_privatelink_endpoint.my_private_link.id
snowflake = {
account = "my-snowflake-account"
database = "MY_DATABASE"
warehouse = "MY_WAREHOUSE"
client_session_keep_alive = false
allow_sso = true
oauth_client_id = "yourclientid"
oauth_client_secret = "yourclientsecret"
}
}
resource "dbtcloud_global_connection" "bigquery" {
name = "My BigQuery connection"
bigquery = {
gcp_project_id = "my-gcp-project-id"
timeout_seconds = 1000
private_key_id = "my-private-key-id"
private_key = "ABCDEFGHIJKL"
client_email = "my_client_email"
client_id = "my_client_id"
auth_uri = "my_auth_uri"
token_uri = "my_token_uri"
auth_provider_x509_cert_url = "my_auth_provider_x509_cert_url"
client_x509_cert_url = "my_client_x509_cert_url"
application_id = "oauth_application_id"
application_secret = "oauth_secret_id"
}
}
```
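A connection defined above can then be attached to an environment via `connection_id` (the project, credential, and environment names below are illustrative, not part of this resource's schema):

```terraform
# Hypothetical environment linking the Snowflake global connection
resource "dbtcloud_environment" "prod" {
  project_id      = dbtcloud_project.my_project.id
  name            = "Prod"
  type            = "deployment"
  deployment_type = "production"
  credential_id   = dbtcloud_snowflake_credential.prod.credential_id
  connection_id   = dbtcloud_global_connection.snowflake.id
}
```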

<!-- schema generated by tfplugindocs -->
## Schema

### Required

- `name` (String) Connection name

### Optional

- `bigquery` (Attributes) (see [below for nested schema](#nestedatt--bigquery))
- `private_link_endpoint_id` (String) Private Link Endpoint ID. This ID can be found using the `privatelink_endpoint` data source
- `snowflake` (Attributes) Snowflake connection configuration (see [below for nested schema](#nestedatt--snowflake))

### Read-Only

- `adapter_version` (String) Version of the adapter
- `id` (Number) Connection Identifier
- `is_ssh_tunnel_enabled` (Boolean) Whether the connection can use an SSH tunnel
- `oauth_configuration_id` (Number)

<a id="nestedatt--bigquery"></a>
### Nested Schema for `bigquery`

Required:

- `auth_provider_x509_cert_url` (String) Auth Provider X509 Cert URL for the Service Account
- `auth_uri` (String) Auth URI for the Service Account
- `client_email` (String) Service Account email
- `client_id` (String) Client ID of the Service Account
- `client_x509_cert_url` (String) Client X509 Cert URL for the Service Account
- `gcp_project_id` (String) The GCP project ID to use for the connection
- `private_key` (String, Sensitive) Private Key for the Service Account
- `private_key_id` (String) Private Key ID for the Service Account
- `token_uri` (String) Token URI for the Service Account

Optional:

- `application_id` (String, Sensitive) OAuth Client ID
- `application_secret` (String, Sensitive) OAuth Client Secret
- `dataproc_cluster_name` (String) Dataproc cluster name for PySpark workloads
- `dataproc_region` (String) Google Cloud region for PySpark workloads on Dataproc
- `execution_project` (String) Project to bill for query execution
- `gcs_bucket` (String) URI for a Google Cloud Storage bucket to host Python code executed via Dataproc
- `impersonate_service_account` (String) Service Account to impersonate when running queries
- `job_creation_timeout_seconds` (Number) Maximum timeout for the job creation step
- `job_retry_deadline_seconds` (Number) Total number of seconds to wait while retrying the same query
- `location` (String) Location to create new Datasets in
- `maximum_bytes_billed` (Number) Max number of bytes that can be billed for a given BigQuery query
- `priority` (String) The priority with which to execute BigQuery queries (batch or interactive)
- `retries` (Number) Number of retries for queries
- `scopes` (Set of String) OAuth scopes for the BigQuery connection
- `timeout_seconds` (Number) Timeout in seconds for queries


<a id="nestedatt--snowflake"></a>
### Nested Schema for `snowflake`

Required:

- `account` (String) The Snowflake account name
- `database` (String) The default database for the connection
- `warehouse` (String) The default Snowflake Warehouse to use for the connection

Optional:

- `allow_sso` (Boolean) Whether to allow Snowflake OAuth for the connection. If true, the `oauth_client_id` and `oauth_client_secret` fields must be set
- `client_session_keep_alive` (Boolean) If true, the snowflake client will keep connections for longer than the default 4 hours. This is helpful when particularly long-running queries are executing (> 4 hours)
- `oauth_client_id` (String, Sensitive) OAuth Client ID. Required to allow OAuth between dbt Cloud and Snowflake
- `oauth_client_secret` (String, Sensitive) OAuth Client Secret. Required to allow OAuth between dbt Cloud and Snowflake
- `role` (String) The Snowflake role to use when running queries on the connection
3 changes: 3 additions & 0 deletions examples/resources/dbtcloud_environment/resource.tf
@@ -5,6 +5,7 @@ resource "dbtcloud_environment" "ci_environment" {
project_id = dbtcloud_project.dbt_project.id
type = "deployment"
credential_id = dbtcloud_snowflake_credential.ci_credential.credential_id
connection_id = dbtcloud_global_connection.my_global_connection.id
}

// we can also set a deployment environment as being the production one
@@ -15,6 +16,7 @@ resource "dbtcloud_environment" "prod_environment" {
type = "deployment"
credential_id = dbtcloud_snowflake_credential.prod_credential.credential_id
deployment_type = "production"
connection_id = dbtcloud_connection.my_legacy_connection.connection_id
}

// Creating a development environment
@@ -23,4 +25,5 @@ resource "dbtcloud_environment" "dev_environment" {
name = "Dev"
project_id = dbtcloud_project.dbt_project.id
type = "development"
connection_id = dbtcloud_global_connection.my_other_global_connection.id
}