Update docs and changelog
b-per committed Aug 16, 2024
1 parent c6f6984 commit 099da93
Showing 7 changed files with 125 additions and 50 deletions.
14 changes: 13 additions & 1 deletion CHANGELOG.md
@@ -2,12 +2,24 @@

All notable changes to this project will be documented in this file.

## [Unreleased](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.3.10...HEAD)
## [Unreleased](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.3.11...HEAD)

# [0.3.11](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.3.9...v0.3.11)

### Changes

- [#267](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/267) Support for global connections
- `dbtcloud_environment` now accepts a `connection_id` to link the environment to the connection. This is the new recommended way to link connections to environments instead of linking the connection to the project with `dbtcloud_project_connection`
- The `dbtcloud_project_connection` resource still works today and, when used, doesn't require setting a `connection_id` in the `dbtcloud_environment` resource (i.e., any current config/module should continue working), but the resource is flagged as deprecated and will be removed in a future version of the provider
- For now, people can continue using the project-scoped connection resources `dbtcloud_connection`, `dbtcloud_bigquery_connection` and `dbtcloud_fabric_connection` for creating and updating global connections. The `project_id` parameter in those connections still needs to be a valid project ID, but it doesn't mean the connection is restricted to that project. The project-scoped connections created from Terraform are automatically converted to global connections
- A new resource `dbtcloud_global_connection` has been created and currently supports Snowflake and BigQuery connections. In the coming weeks, support for all the Data Warehouses will be added to this resource
- When a data warehouse is supported in `dbtcloud_global_connection`, we recommend using this new resource instead of the legacy project-scoped connection resources. Those resources will be deprecated in a future version of the provider.
- [#278](https://github.com/dbt-labs/terraform-provider-dbtcloud/pull/278) Deprecate `state` attribute in the resources and datasources that use it. It will be removed in the next major version of the provider. This attribute is used for soft-delete and isn't intended to be configured in the scope of the provider.
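The newly recommended wiring described above can be sketched as follows. This is illustrative only: the resource names and attribute values are placeholders, and the `dbtcloud_project` resource is assumed to be defined elsewhere in the configuration.

```terraform
# Sketch: linking an environment directly to a global connection
# via connection_id, instead of dbtcloud_project_connection.
# All names and values below are placeholders.
resource "dbtcloud_global_connection" "snowflake" {
  name = "My Snowflake connection"
  snowflake = {
    account   = "my-snowflake-account"
    database  = "MY_DATABASE"
    warehouse = "MY_WAREHOUSE"
  }
}

resource "dbtcloud_environment" "prod" {
  project_id  = dbtcloud_project.my_project.id
  name        = "Prod"
  type        = "deployment"
  dbt_version = "1.8.0-latest"

  # New recommended approach: link the connection on the environment
  connection_id = dbtcloud_global_connection.snowflake.id
}
```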

### Fix

- [#281](https://github.com/dbt-labs/terraform-provider-dbtcloud/issues/281) Fix the datasource `dbtcloud_environments` where the environment IDs were not being saved
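For reference, a minimal use of the fixed datasource might look like the sketch below. The exact shape of the output attributes (`environments`, `environment_id`) is an assumption based on the provider documentation rather than something confirmed by this changelog entry.

```terraform
# Sketch: listing environment IDs via the dbtcloud_environments
# datasource (the attribute names are assumptions).
data "dbtcloud_environments" "all" {}

output "environment_ids" {
  value = [for env in data.dbtcloud_environments.all.environments : env.environment_id]
}
```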

# [0.3.10](https://github.com/dbt-labs/terraform-provider-dbtcloud/compare/v0.3.9...v0.3.10)

### Changes
2 changes: 1 addition & 1 deletion docs/data-sources/privatelink_endpoint.md
@@ -43,5 +43,5 @@ data "dbtcloud_privatelink_endpoint" "test_with_name_and_url" {

- `cidr_range` (String) The CIDR range of the PrivateLink Endpoint
- `id` (String) The internal ID of the PrivateLink Endpoint
- `state` (Number) PrivateLink Endpoint state: 1 = active, 2 = deleted
- `state` (Number, Deprecated) PrivateLink Endpoint state: 1 = active, 2 = deleted
- `type` (String) Type of the PrivateLink Endpoint
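The datasource's `id` is typically fed into a connection resource. A sketch, with placeholder names and values:

```terraform
# Sketch: looking up a PrivateLink endpoint and attaching it to a
# global connection. Names and values are illustrative.
data "dbtcloud_privatelink_endpoint" "my_endpoint" {
  name = "My PrivateLink Endpoint"
}

resource "dbtcloud_global_connection" "snowflake" {
  name                     = "Connection via PrivateLink"
  private_link_endpoint_id = data.dbtcloud_privatelink_endpoint.my_endpoint.id
  snowflake = {
    account   = "my-snowflake-account"
    database  = "MY_DATABASE"
    warehouse = "MY_WAREHOUSE"
  }
}
```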
2 changes: 1 addition & 1 deletion docs/data-sources/project.md
@@ -43,4 +43,4 @@ data "dbtcloud_project" "test_project" {
- `freshness_job_id` (Number) ID of Job for source freshness
- `id` (String) The ID of this resource.
- `repository_id` (Number) ID of the repository associated with the project
- `state` (Number) Project state: 1 = active, 2 = deleted
- `state` (Number, Deprecated) Project state: 1 = active, 2 = deleted
2 changes: 1 addition & 1 deletion docs/resources/extended_attributes.md
@@ -51,7 +51,7 @@ resource "dbtcloud_environment" "issue_depl" {

### Optional

- `state` (Number) Extended Attributes state (1 is active, 2 is inactive)
- `state` (Number, Deprecated) Extended Attributes state (1 is active, 2 is inactive)

### Read-Only

129 changes: 86 additions & 43 deletions docs/resources/global_connection.md
@@ -2,83 +2,126 @@
page_title: "dbtcloud_global_connection Resource - dbtcloud"
subcategory: ""
description: |-
This resource can be used to create global connections as introduced in dbt Cloud in August 2024.
Those connections are not linked to a project and can be linked to environments from different projects by using the connection_id field in the dbtcloud_environment resource.
For now, only BigQuery and Snowflake connections are supported; the other Data Warehouses can continue using the existing resources dbtcloud_connection and dbtcloud_fabric_connection,
but all Data Warehouses will soon be supported under this resource and the legacy resources will be deprecated in the future.
---

# dbtcloud_global_connection (Resource)

This resource can be used to create global connections as introduced in dbt Cloud in August 2024.

Those connections are not linked to a project and can be linked to environments from different projects by using the `connection_id` field in the `dbtcloud_environment` resource.

For now, only BigQuery and Snowflake connections are supported; the other Data Warehouses can continue using the existing resources `dbtcloud_connection` and `dbtcloud_fabric_connection`,
but all Data Warehouses will soon be supported under this resource and the legacy resources will be deprecated in the future.

## Example Usage

```terraform
resource "dbtcloud_global_connection" "snowflake" {
  name = "My Snowflake connection"
  // we can set Privatelink if needed
  private_link_endpoint_id = data.dbtcloud_privatelink_endpoint.my_private_link.id
  snowflake = {
    account                   = "my-snowflake-account"
    database                  = "MY_DATABASE"
    warehouse                 = "MY_WAREHOUSE"
    client_session_keep_alive = false
    allow_sso                 = true
    oauth_client_id           = "yourclientid"
    oauth_client_secret       = "yourclientsecret"
  }
}

resource "dbtcloud_global_connection" "bigquery" {
  name = "My BigQuery connection"
  bigquery = {
    gcp_project_id              = "my-gcp-project-id"
    timeout_seconds             = 1000
    private_key_id              = "my-private-key-id"
    private_key                 = "ABCDEFGHIJKL"
    client_email                = "my_client_email"
    client_id                   = "my_client_id"
    auth_uri                    = "my_auth_uri"
    token_uri                   = "my_token_uri"
    auth_provider_x509_cert_url = "my_auth_provider_x509_cert_url"
    client_x509_cert_url        = "my_client_x509_cert_url"
    application_id              = "oauth_application_id"
    application_secret          = "oauth_secret_id"
  }
}
```

<!-- schema generated by tfplugindocs -->
## Schema

### Required

- `name` (String)
- `name` (String) Connection name

### Optional

- `bigquery` (Attributes) (see [below for nested schema](#nestedatt--bigquery))
- `oauth_configuration_id` (Number)
- `private_link_endpoint_id` (Number)
- `snowflake` (Attributes) (see [below for nested schema](#nestedatt--snowflake))
- `private_link_endpoint_id` (String) Private Link Endpoint ID. This ID can be found using the `privatelink_endpoint` data source
- `snowflake` (Attributes) Snowflake connection configuration (see [below for nested schema](#nestedatt--snowflake))

### Read-Only

- `adapter_version` (String)
- `id` (Number) The ID of this resource.
- `is_ssh_tunnel_enabled` (Boolean)
- `adapter_version` (String) Version of the adapter
- `id` (Number) Connection Identifier
- `is_ssh_tunnel_enabled` (Boolean) Whether the connection can use an SSH tunnel
- `oauth_configuration_id` (Number)

<a id="nestedatt--bigquery"></a>
### Nested Schema for `bigquery`

Required:

- `application_id` (String) OAuth Client ID
- `application_secret` (String) OAuth Client Secret
- `auth_provider_x509_cert_url` (String)
- `auth_uri` (String)
- `client_email` (String)
- `client_id` (String)
- `client_x509_cert_url` (String)
- `gcp_project_id` (String)
- `private_key` (String)
- `private_key_id` (String)
- `timeout_seconds` (Number)
- `token_uri` (String)
- `auth_provider_x509_cert_url` (String) Auth Provider X509 Cert URL for the Service Account
- `auth_uri` (String) Auth URI for the Service Account
- `client_email` (String) Service Account email
- `client_id` (String) Client ID of the Service Account
- `client_x509_cert_url` (String) Client X509 Cert URL for the Service Account
- `gcp_project_id` (String) The GCP project ID to use for the connection
- `private_key` (String, Sensitive) Private Key for the Service Account
- `private_key_id` (String) Private Key ID for the Service Account
- `token_uri` (String) Token URI for the Service Account

Optional:

- `dataproc_cluster_name` (String)
- `dataproc_region` (String)
- `execution_project` (String)
- `gcs_bucket` (String)
- `impersonate_service_account` (String)
- `job_creation_timeout_seconds` (Number)
- `job_retry_deadline_seconds` (Number)
- `location` (String)
- `maximum_bytes_billed` (Number)
- `priority` (String)
- `retries` (Number)
- `scopes` (Set of String)
- `application_id` (String, Sensitive) OAuth Client ID
- `application_secret` (String, Sensitive) OAuth Client Secret
- `dataproc_cluster_name` (String) Dataproc cluster name for PySpark workloads
- `dataproc_region` (String) Google Cloud region for PySpark workloads on Dataproc
- `execution_project` (String) Project to bill for query execution
- `gcs_bucket` (String) URI for a Google Cloud Storage bucket to host Python code executed via Dataproc
- `impersonate_service_account` (String) Service Account to impersonate when running queries
- `job_creation_timeout_seconds` (Number) Maximum timeout for the job creation step
- `job_retry_deadline_seconds` (Number) Total number of seconds to wait while retrying the same query
- `location` (String) Location to create new Datasets in
- `maximum_bytes_billed` (Number) Max number of bytes that can be billed for a given BigQuery query
- `priority` (String) The priority with which to execute BigQuery queries (batch or interactive)
- `retries` (Number) Number of retries for queries
- `scopes` (Set of String) OAuth scopes for the BigQuery connection
- `timeout_seconds` (Number) Timeout in seconds for queries
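As a sketch, the optional query-tuning attributes above can be combined with the required Service Account fields like this. All values are placeholders, not working credentials:

```terraform
# Sketch: a BigQuery global connection using some of the optional
# tuning attributes. All values below are placeholders.
resource "dbtcloud_global_connection" "bigquery_tuned" {
  name = "BigQuery with query tuning"
  bigquery = {
    gcp_project_id              = "my-gcp-project-id"
    private_key_id              = "my-private-key-id"
    private_key                 = "my-private-key"
    client_email                = "my_client_email"
    client_id                   = "my_client_id"
    auth_uri                    = "my_auth_uri"
    token_uri                   = "my_token_uri"
    auth_provider_x509_cert_url = "my_auth_provider_x509_cert_url"
    client_x509_cert_url        = "my_client_x509_cert_url"

    # Optional tuning (placeholder values)
    execution_project    = "my-billing-project"
    location             = "EU"
    priority             = "interactive"
    retries              = 3
    timeout_seconds      = 600
    maximum_bytes_billed = 1000000000
  }
}
```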


<a id="nestedatt--snowflake"></a>
### Nested Schema for `snowflake`

Required:

- `account` (String)
- `database` (String)
- `warehouse` (String)
- `account` (String) The Snowflake account name
- `database` (String) The default database for the connection
- `warehouse` (String) The default Snowflake Warehouse to use for the connection

Optional:

- `allow_sso` (Boolean)
- `client_session_keep_alive` (Boolean)
- `oauth_client_id` (String, Sensitive)
- `oauth_client_secret` (String, Sensitive)
- `role` (String)
- `allow_sso` (Boolean) Whether to allow Snowflake OAuth for the connection. If true, the `oauth_client_id` and `oauth_client_secret` fields must be set
- `client_session_keep_alive` (Boolean) If true, the Snowflake client will keep connections open for longer than the default 4 hours. This is helpful when particularly long-running queries are executing (> 4 hours)
- `oauth_client_id` (String, Sensitive) OAuth Client ID. Required to allow OAuth between dbt Cloud and Snowflake
- `oauth_client_secret` (String, Sensitive) OAuth Client Secret. Required to allow OAuth between dbt Cloud and Snowflake
- `role` (String) The Snowflake role to use when running queries on the connection
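A minimal Snowflake connection without OAuth, for contrast with the OAuth-enabled example above. The values are placeholders, and the comment about defaults is an assumption rather than something stated in this schema:

```terraform
# Sketch: a Snowflake global connection without OAuth.
# All values are placeholders.
resource "dbtcloud_global_connection" "snowflake_basic" {
  name = "Snowflake without OAuth"
  snowflake = {
    account   = "my-snowflake-account"
    database  = "ANALYTICS"
    warehouse = "TRANSFORMING"
    role      = "DBT_ROLE"
    # OAuth fields omitted; allow_sso is assumed to default to disabled
  }
}
```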
26 changes: 23 additions & 3 deletions terraform_resources.d2
@@ -2,7 +2,7 @@
*.*.style.font-size: 22

title: |md
# Terraform resources (v0.3.6)
# Terraform resources (v0.3.11)
| {near: top-center}

direction: right
@@ -11,6 +11,13 @@ direction: right
license_map
partial_license_map

project_connection: {
style: {
fill: "#C5C6C7"
stroke: grey
}
}

privatelink_endpoint: {tooltip: Datasource only}
group: {tooltip: Group permissions as well}
group_partial_permissions
@@ -60,14 +67,27 @@ webhook -- job: triggered by {
stroke-dash: 3
}
}
environment -- global_connection
environment -- conns
global_connection -- privatelink_endpoint

environment -- env_creds
project -- project_connection
project_connection -- conns
conns -- privatelink_endpoint
project -- project_repository
project_repository -- repository
environment -- environment_variable
environment -- extended_attributes

project -- project_connection {
style: {
stroke: "#C5C6C7"
}
}
project_connection -- conns {
style: {
stroke: "#C5C6C7"
}
}

(job -- *)[*].style.stroke: green
(* -- job)[*].style.stroke: green
Binary file modified terraform_resources.png
