Merge pull request #22 from fishtown-analytics/020/end-to-end-testing
Readier for v0.2.0
jtcohen6 authored May 7, 2020
2 parents d7c69cb + 0aacf80 commit 740f5f1
Showing 6 changed files with 25 additions and 15 deletions.
16 changes: 11 additions & 5 deletions README.md
@@ -1,9 +1,11 @@
### External tables in dbt
# External sources in dbt

* Source config extension for metadata about external file structure
* Adapter macros to create external tables and refresh external table partitions
* Snowflake-specific macros to create, backfill, and refresh snowpipes

## Syntax

```bash
# iterate through all source nodes, create if missing + refresh if appropriate
$ dbt run-operation stage_external_sources
@@ -15,7 +17,7 @@ $ dbt run-operation stage_external_sources --vars 'ext_full_refresh: true'
![sample docs](etc/sample_docs.png)

The macros assume that you have already created an external stage (Snowflake)
or external schema (Spectrum), and that you have permissions to select from it
or external schema (Redshift/Spectrum), and that you have permissions to select from it
and create tables in it.
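
The prerequisite objects mentioned above might be created along these lines (a hedged sketch; the stage name, Glue database, bucket, and IAM role are placeholders, not part of this repo):

```sql
-- Snowflake: an external stage that a source's external location can point at
create stage raw.snowplow.snowplow_stage
  url = 's3://my-bucket/snowplow/'    -- placeholder bucket
  file_format = (type = json);

-- Redshift: an external schema backed by the AWS Glue data catalog
create external schema spectrum_schema
from data catalog
database 'spectrum_db'                -- placeholder Glue database
iam_role 'arn:aws:iam::123456789012:role/my-spectrum-role';
```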

The `stage_external_sources` macro accepts a similar node selection syntax to
@@ -36,7 +38,7 @@ $ dbt source stage-external --full-refresh
$ dbt source stage-external --select snowplow.event logs
```

### Spec
## Spec

```yml
version: 2
@@ -107,10 +109,14 @@ sources:
...
```
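
The spec body is elided in this diff; a minimal illustrative config might look as follows (a sketch only, assuming the `external` property keys used by this package for Snowflake; the source and stage names are placeholders):

```yml
version: 2

sources:
  - name: snowplow
    tables:
      - name: event
        external:
          location: "@raw.snowplow.snowplow_stage"   # hypothetical stage
          file_format: "( type = json )"
```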

See [`sample_sources`](sample_sources) for full valid YML config that establishes Snowplow events
## Resources

* [`sample_sources`](sample_sources) for full valid YML config that establishes Snowplow events
as a dbt source and stage-ready external table in Snowflake and Spectrum.
* [`sample_analysis`](sample_analysis) for a "dry run" version of the DDL/DML that
`stage_external_sources` will run as an operation.

### Supported databases
## Supported databases

* Redshift (Spectrum)
* Snowflake
1 change: 0 additions & 1 deletion macros/external/create_snowpipe.sql
@@ -71,7 +71,6 @@

{% if auto_ingest is true %}

{{ dbt_utils.log_info('PASS') }}
{% do return([]) %}

{% else %}
2 changes: 0 additions & 2 deletions macros/external/refresh_external_table.sql
@@ -72,7 +72,6 @@

{% else %}

{{ dbt_utils.log_info('PASS') }}
{% do return([]) %}

{% endif %}
@@ -99,7 +98,6 @@

{% else %}

{{ dbt_utils.log_info('PASS') }}
{% do return([]) %}

{% endif %}
4 changes: 3 additions & 1 deletion macros/external/stage_external_sources.sql
@@ -112,7 +112,9 @@

{% set run_queue = get_external_build_plan(node) %}

{% do exit_transaction() %}
{% do dbt_utils.log_info(loop_label ~ ' SKIP') if run_queue == [] %}

{% do dbt_external_tables.exit_transaction() %}

{% for q in run_queue %}

6 changes: 0 additions & 6 deletions macros/helpers/common.sql
@@ -30,9 +30,3 @@
{{return(ddl)}}

{% endmacro %}

{% macro exit_transaction() %}

{% do run_query('begin; commit;') %}

{% endmacro %}
11 changes: 11 additions & 0 deletions macros/helpers/redshift/transaction.sql
@@ -0,0 +1,11 @@
{% macro exit_transaction() %}
{{ return(adapter_macro('dbt_external_tables.exit_transaction')) }}
{% endmacro %}

{% macro default__exit_transaction() %}
{# noop #}
{% endmacro %}

{% macro redshift__exit_transaction() %}
{% do run_query('begin; commit;') %}
{% endmacro %}
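
The dispatch above likely exists because Redshift cannot run external-table DDL inside an open transaction block, while dbt may have one open when the operation runs: the Redshift implementation closes any open transaction with `begin; commit;`, and every other adapter is a no-op. A caller invokes the namespaced macro and lets dispatch pick the right implementation (a hypothetical caller, not from this commit):

```sql
{% macro stage_my_table(ddl) %}
    {# ends any open transaction on Redshift; no-op on other adapters #}
    {% do dbt_external_tables.exit_transaction() %}
    {# `ddl` is a hypothetical DDL string passed in by the caller #}
    {% do run_query(ddl) %}
{% endmacro %}
```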
